VMware and performance, a strange combination

Most consultants who are deep into virtualization, and VMware in particular, know that VMware has a special policy on publishing performance data. The EULA does not allow you to publish your own performance results unless VMware has had the chance to review your test plan and configuration. Not many other IT products have this kind of policy, and people often don’t understand it when they hear about it. VMware states that it is very difficult to do good performance testing because a virtual environment works quite differently from a physical one; some of the tools you use in a physical environment don’t behave as they should in a virtual environment, and the resulting performance figures can therefore be unreliable. With this policy, VMware wants to prevent reports with incorrect results, positive or negative, from being published.

This policy wasn’t a big deal for some time, because everybody was convinced that VMware ESX was a superior hypervisor to any other on the market. But the competition hasn’t been sleeping, their products have improved, and as they improve, more and more performance reports are being released. These reports, mostly drafted by “independent” authors, compare performance between a number of hypervisors, including VMware ESX. Now, this is strange. Did they not have to agree to the EULA on the second screen when installing VMware ESX? Or did they first have their results checked by VMware, as they are supposed to? Looking at the results in their reports, I doubt it. And I have yet to find a section in one of these reports showing that VMware has seen and approved the findings. The big question this raises is: “What are you going to do, VMware?”

VMware has a large group of followers who would love to report their own performance findings, but they know about the policy and cannot risk the relationship with VMware that they need to do their normal daytime job. The word “followers” is chosen deliberately, because this group lives and breathes VMware, and they genuinely feel hurt when a report based on bad testing is released and makes VMware look bad. What hurts even more is that the authors who publish these poorly tested reports seem to get away with it, which leaves the followers empty-handed. They can’t speak up for VMware, and VMware doesn’t speak up for its own. In the end, these followers are the ones who take “the beating” when customers wave these reports in front of their noses.

It is a virtue that VMware tries to ensure only clean, well-tested reports are published, but sticking to this turns out to be impossible. In the Netherlands we have quite some experience with being the best boy in the class, especially in the European Council, and we learned that taking on that role eventually doesn’t give you the reward you were looking for. Trying to keep everything clean is good most of the time, but when you turn into a doormat, that is a sign you have to change. VMware has two options here.

First, they can ask the authors who have already published to run their tests by VMware, then evaluate and publish the results together. If an author refuses to cooperate, ask them to withdraw the report.

The second option would be to remove the policy from the EULA. Since there are so many reports out there already, why not allow your followers to publish their findings as well?

Note: I published this post before VMware reacted to the reports this blog post is based on. See this reaction from VMware: Why There’s Still a Benchmarking Clause in Our EULA. That VMware post does make it look like they are actively searching for badly conducted performance reports. I hope VMware is able to get the reports rectified or withdrawn, because just posting a reaction is not enough, in my opinion. Google will find the report, but the customer will most likely not search for VMware’s reaction to it.

3 thoughts on “VMware and performance, a strange combination”

  1. I’m rather afraid that VMware are both saint and sinner in this regard. They publish some excellent guidelines on how to do benchmarking in a virtual environment, but then throw all that good work away by wording their EULA in such a negative and indefensible way.

    “VMware reserve the right to refuse publication of any benchmark”

    To have the right to veto a poorly conceived testing methodology is quite acceptable, but if they continue to reserve the right to require a tester not to publish just because VMware don’t like the results or the conclusions drawn from them, then I’m rather afraid that the mud slinging will continue.

  2. I agree that the VMware EULA only contributes to the problem, much like Simon writes above.

    Yes, Microsoft is using their more ‘open’ EULA to encourage their prospective customers to benchmark Hyper-V against ESX.

    In this type of competitive market environment, Microsoft is going to use any advantage it can find to gain entry into this space.

    VMware will always have the taint of “something to hide” with their currently worded EULA.

  3. Gabe, as you’ve pointed out, the EULA doesn’t prevent you from publishing benchmarks. It simply requires you to contact VMware in advance.

    If you, as a “follower” have some independent and credible test results you’d like to share with the world, why not shoot VMware a mail and ask for permission to publish?

Comments are closed.