We all love a good survey. I especially like those little survey factoids they have in USA Today that provide useful information like how many gallons of water got flushed during commercials on “American Idol.”
I often wonder how useful survey data is. To the uninitiated, a survey seems like cold, hard data—"just the facts, ma'am." The truth is that survey data is open to as much interpretation as anything.
The famous economist John Kenneth Galbraith was the man who coined the phrase "conventional wisdom." People forget, however, that he used the term derisively. He noted that humans often associate truth with convenience and self-interest; we are more likely to believe a version of the truth that spares us effort or discomfort.
The dirty secret is that survey data does not contain truth—it only contains a version of the truth. In other words, it's opinion.
So when an analyst company publishes survey results, it also publishes its opinion of what the results mean. Other interested parties such as vendors or product champions within an enterprise will add their spin. The problem is that survey data becomes the conventional wisdom simply because we have been taught that figures never lie.
For the uninitiated, most IT analyst groups have traditionally fallen into one of two camps. The first is the qualitative research model. This model attempts to take some smart, knowledgeable people—hopefully with a background in IT—and let them speak with lots of users. From this interaction the analysts form an opinion and publish it in the form of a research note.
The second camp is the quantitative model, which relies heavily on survey work to show trends in behavior. But numbers do not speak for themselves. So analysis, otherwise known as opinion, is then applied. The important point here, folks, is that both models still ultimately represent opinion.
As a practitioner of the qualitative approach to research, I still love to look at survey data because all data points provide more insight into a subject. There is, however, no perfect model. I am always interested in how one analyst views the data versus how I would view it. This is a practice not unlike playing devil's advocate, and one I highly recommend users try when presented with survey results produced by analysts and used by vendors for their own purposes.
I was reviewing a survey recently published by Gartner on database management systems spending and deployment characteristics. It caught my eye not only because the database market is my particular area of interest, but also because I was fascinated by the analysis of what the numbers meant, so much so that I had a conversation with Colleen Graham, one of the authors of the report.
Graham works in Gartner's Dataquest division, which is the decidedly quantitative side of Gartner. She also works on the annual market share survey for the database market. As a professional conductor of surveys, Ms. Graham commiserated with me that she is often surprised at how the numbers she produces are used—and, in some cases, spun beyond recognition.
In the report, 704 respondents were interviewed about their intentions. One part of the survey asked users which databases they planned to deploy in the future, which ones were already installed, and which they had no plans to deploy. I found the results and the analysis to be very interesting.
Essentially every respondent answered the same questions for each of six database platforms plus a seventh “other” category. The written analysis emphasized the surprise that 19 percent of respondents indicated plans to deploy more DB2 compared with only 13 percent for Oracle.
When I looked at the numbers, I came away with a different take. Earlier in the report, one response showed that 14 percent of respondents planned to deploy more on z/OS. The analysis then pointed out that this coincided with the 14 percent growth in DB2 on z/OS—which means that of the 19 percent planning to deploy more DB2, only 5 percent were planning to do it on something other than IBM's proprietary platform. How do you think Oracle could spin that?
Sometimes what is not emphasized is as interesting as what is. For example, every database platform showed double-digit increases in plans for deployment. That indicates either a strong level of loyalty and satisfaction or of vendor lock-in, although it's difficult to tell which from looking at the numbers.
To me, the percentage of those with no plans to deploy was of real interest. Paradoxically, IBM, which had the largest percentage of "plans to deploy," also had 47 percent of respondents stating they have no plans to deploy, versus only 26 percent for Oracle and 19 percent for SQL Server. So which should we emphasize, growth or avoidance? Which number would IBM emphasize?
So you see the inherent problem with survey data. A little emphasis here, a little there, makes all the difference in the world. Then, of course, there is the problem of what we count. Some might remember the famous photo of President Truman holding a newspaper with the headline "Dewey Defeats Truman!"
Indeed, all the leading polls were calling for a Truman defeat. As it turned out, the polling was conducted by telephoning likely voters and asking their opinion. The flaw, of course, was that in 1948 many voters did not own phones, and those voters tended to vote Democratic.
When the database market share survey comes out in May, firms like Gartner and IDC will be counting license revenue and maintenance spending to determine the size of the market. I asked Graham whether she felt that was an accurate way to look at the market, considering that lower-cost but more heavily used products, as well as new products, would be at a distinct disadvantage.
She conceded that was the case. But speaking as someone who has done her share of surveys, she noted that you have to agree on something to count—and, more important, you have to be able to count it.
So there you have it—conventional wisdom served up on a survey platter. Whatever makes you uncomfortable, or would take too much effort to interpret, feel free to ignore.
Charles Garry is an independent industry analyst based in Simsbury, Conn. He is a former vice president with META Group's Technology Research Services. He can be reached at cegarry@yahoo.com.
Check out eWEEK.com for the latest database news, reviews and analysis.