I recently attended Storage Field Day 9, viewing presentations from a range of storage vendors. Many of the vendors focused their presentations on their system architecture and the perceived advantages, or USPs, of their approach. Each vendor was clearly passionate and believed their way was the best. What was clear was that whilst common themes ran through the designs, there were often aspects of a design that were at odds with what other vendors were proposing. An example of one of these design debates is commodity v custom hardware. The pro-commodity advocates argue it gives the best price point and flexibility, whereas those pushing custom hardware argue it is built for a purpose and therefore gives the greatest performance and resilience.
Another theme that became clear was that any product is a compromise; there is no such thing as the perfect product. Every architectural decision will be at odds with an opposing factor, e.g. cost v quality, resilience v speed etc.
With vendors all lining up to tell us why their architecture is correct, whilst also at times being willing to tell us why another vendor's design decisions are incorrect through FUD, how can we find what truly is the best system for our business? We will start the discussion by looking at what may seem like an unrelated discipline: sport.
Styles make fights
Before many boxing matches both competitors engage in 'trash-talk', not only ridiculing the other boxer but also focusing on why their style or method of doing things is better. "My jab is too strong and my movement will be too quick" or "I am too strong, I will be too much for him". But as any bookie will tell you, a perceived advantage on paper does not always add up; upsets happen. What appears a superior method on paper may, in practical terms, not be up to the job. Only by putting the boxers to the test is it possible to see whose method is superior.
In my unsuccessful pursuit to master the perfect golf swing I have been to the driving range and witnessed guys with the most awful technique I have ever seen: swinging wildly, off balance, with no alignment, but then consistently hitting perfect shots. The take-home lesson is again that what appears a poor method can, when put to practical use, produce great results.
What’s this got to do with storage?
Does this mean that architecture is unimportant and can be ignored? Short answer: no. Clearly you need to find an architecture that can deliver on your demands in terms of performance, availability and features. But the take-home message is that you cannot look at architecture alone; what looks like a stunning concept may deliver little practical benefit to performance.
So how can customers cut through all this 'my system's better than your system'? Only with the transparency and clarity that come with stats. When you buy a car you can easily see all the key stats relating to performance, economy, size and weight. This gives customers clarity and allows them to make informed buying decisions. If car A does 10MPG more than car B and your key buying criterion is efficiency, then you have a clear decision and choose car A. To an extent, who cares how the engine was designed? Even if car B's manufacturer still says their architecture is better, the figures have given the transparency to show that car A is more efficient and allowed an informed buying decision.
The closest we have to an agreed standard at the moment is the SPC-1 and SPC-2 tests, which measure maximum transactions and throughput respectively. The tests have been criticised for focusing on maximum flat-out performance, with vendors submitting unrealistic configurations to maximise results. Another criticism is that the current SPC-1 tests do not allow dedupe, which effectively stops vendors with an always-on dedupe design from taking part. However, today they are the closest thing to a standard benchmark we have.
The way forward to establishing an industry standard benchmark is for the industry to regulate itself and come up with a standard testing methodology. Getting the vendors to agree on a set of tests that is realistic and suitable to all parties may be more difficult than searching for the pot of gold at the end of the rainbow. However, it would encourage competition: those with strong products would have nothing to fear, and vendors would know during the design phase the kind of test their systems would be put through.
Any design is a compromise; the best product is the one that suits your organisation in terms of the benefits you have defined. To aid that decision and find the perfect fit, clear industry standard performance statistics are required. Until this happens, the only alternative is to shortlist systems and run a proof of concept to find the system that is truly best for your environment.
Let me know if you think vendors will ever agree on a standard testing method, and what stats you would like to see.
Disclosure to the disclosure: This part is boring, you probably don’t want to read it. Oh really, you’re still here? Go on then: my flights, accommodation, food etc were paid for by Tech Field Day, but I was under no obligation to write about the events. My time at the event was not paid for.