OLAP Paradoxes

  • January 01 2003, 1:00am EST

As a DM Review reader, you might have been one of the 2,236 people in 49 countries who participated in The OLAP Survey 2 last summer. Far more people participated in this survey than in the first, allowing for a much more thorough analysis. Many surprises and paradoxes emerged from the data, a few of which are covered in this article.

Slightly more than 60 percent of the sample had not even considered buying online analytical processing (OLAP) yet, though the fact that they participated in the survey suggests that they had some interest in the subject. Another 10 percent had considered buying, but had not yet done so, while the remaining 30 percent had purchased and deployed OLAP. This latter group answered dozens of additional questions concerning their purchase and usage of OLAP.

Figure 1: OLAP Purchase Rates by Total Organization Revenue (The numbers in parentheses indicate the number of respondents for each segment.)

OLAP Usage Drops at the Top

Predictably, larger organizations were more likely to have deployed OLAP applications than smaller companies. However, the trend seems to flatten out at revenues greater than $500 million; and, curiously, the largest organizations with revenues greater than $15 billion had less OLAP usage than those with revenues of $1 billion to $15 billion.

There may be two reasons for this unexpected drop at the top:

  • Government employees are likely to put themselves in this top band, and there is less OLAP usage in government than in other sectors.
  • While most large organizations use OLAP somewhere, it seems that its usage is often in pockets rather than being deployed on an enterprise scale; therefore, respondents to this survey may not even be aware of OLAP applications in other parts of their huge, sprawling organizations. In smaller organizations, there may be more unified systems, and people probably have a better idea of projects elsewhere in the business.

Bad News for Startups

Turning to those who had considered or purchased OLAP, we asked what factors influenced their choice of products to evaluate, and found some clear differences between those who had already bought and those who were merely considering. However, there is bad news for new market entrants: The marketing techniques they would have to use were not reported to be particularly influential. Only approximately one-quarter of the respondents said they were influenced by press articles or case studies, while seminars, trade shows and conferences were even less influential. This presents a real dilemma for young companies with smart new ideas. Not only is launching new products very expensive, but potential buyers say they are not heavily influenced by conventional marketing. Conversely, if buyers are largely influenced by previous experience, how will they ever discover anything new?

Influence | Bought (685) | Considered (230)
Previous experience | 50.1% | 33.0%
Internet research | 44.5% | 50.4%
Industry analyst research | 38.2% | 22.6%
Consultant advice | 38.1% | 24.3%
Product already used elsewhere in the organization | 33.1% | 12.6%
Press articles and case studies | 25.0% | 27.4%
Attending face-to-face seminars | 21.8% | 18.3%
Trade shows, conferences and exhibitions | 21.6% | 20.0%
Word of mouth | 20.1% | 22.6%
Advertisements | 19.9% | 27.4%
Bundled with another application | 16.2% | 13.9%
Vendor mail shots and marketing material | 9.1% | 10.0%
Webinars | 8.9% | 9.1%
Other | 4.1% | 6.1%
Average number of influences cited | 3.51 | 2.98

Figure 2: How Did You Compile the List of Products to Evaluate?

The other paradox is that respondents placed functionality and ease of use at the top of their buying criteria, with nearly twice as many votes as for price or performance. The success being enjoyed by low-priced OLAP products and the hard bargaining that goes on in most large contracts suggest that price is actually more important than respondents would like to think. However, although price was not reported to be an important factor when first selecting a product, costs were the biggest deterrent to wider deployment later. Therefore, if you intend to deploy widely, give costs a high priority from the beginning – before selecting the product.

1. Functionality 39.3%
2. Ease of use for end users 34.6%
3. Integrates with other products already in use 27.3%
4. Price 20.3%
5. Performance 18.2%
6. Ability to support large numbers of concurrent users 17.0%
7. Bundled with another product 16.7%
8. Corporate standard 16.6%
9. Ease of use for application builders 16.3%
10. Product reputation 12.5%
11. Large data-handling capacity 9.9%
12. Vendor relationship/reputation 8.7%
13. Web architecture 8.1%
14. Completed proof of concept faster than others 7.8%
15. Server platform support 6.9%
16. Chosen vendor did a better sales job 3.8%
17. Other 2.4%

Figure 3: Reasons Given for Choosing OLAP Products

Performance comes a lowly fifth in the buying criteria; however, this turns out to be the biggest product-related problem later. More than twice as many people complain of poor query performance as of missing functionality. This suggests that most products have adequate functionality for real-world deployments, so buyers should put less emphasis on evaluating a long features list and more on performance testing before buying.

Measuring Success

We were very keen to calibrate the success of projects in a consistent way that could be applied to every aspect of the process, and it made sense to do this in business rather than technical terms. Therefore, OLAP users were asked how well their projects had met their original goals and to what extent each of eight separate business benefits had been realized. These included soft benefits such as improved reporting, as well as harder benefits such as provably increased revenues or headcount savings. The seven-point scale went from a top scoring "proven and quantified" down to a negative scoring "got worse or more expensive." Not surprisingly, the soft benefits were much more likely to have been realized than the hard benefits; nearly 70 percent of respondents said that improved reporting was a proven benefit while only 19.3 percent said that they had proven a saving in information systems (IS) headcount.

We used the composite benefit index as the key measure of project success and soon found a real surprise: those organizations that had undertaken a competitive evaluation not only had a higher overall benefit score than those who had skipped this step, but every one of the eight individual benefits scored higher too. They were also more likely to achieve their project goals.
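The survey's exact formula for the composite benefit index is not published; a plausible minimal sketch, assuming the index is simply the average of the eight per-benefit ratings on the seven-point scale described above, would look like this (the scale coding and the sample ratings are hypothetical):

```python
# Hypothetical sketch of a composite benefit index. Assumption: each of
# the eight business benefits is rated on the seven-point scale, coded
# here from 6 ("proven and quantified") down to -1 ("got worse or more
# expensive"), and the index is the plain average of the eight ratings.

def benefit_index(ratings):
    """Average of the eight per-benefit ratings for one project."""
    if len(ratings) != 8:
        raise ValueError("expected one rating per benefit (8 total)")
    return sum(ratings) / len(ratings)

# One respondent's hypothetical ratings for the eight benefits:
project = [6, 5, 4, 4, 3, 2, 1, 0]
print(benefit_index(project))  # 3.125
```

Averaging per-benefit scores this way lets soft benefits (e.g. improved reporting) and hard benefits (e.g. proven IS headcount savings) be compared on one scale, which is how the article is able to say that competitive evaluators scored higher on every individual benefit as well as overall.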

Figure 4: Goals and Benefits Achieved vs. Selection Method

This improved success rate may be because this process does lead to a more suitable product being selected, or it may simply be that projects that skip this step also omit other important steps, thus jeopardizing the overall project. It may also be that the process of evaluating multiple products is itself educational and leads to better projects regardless of the product finally selected.

It is a confirmation that it's not necessarily a good idea to simply use add-on or bundled modules from incumbent vendors. In particular, don't just assume that your ERP, CRM or database vendor's add-on analytical products are a risk-free way of extracting the best value from your data – if you evaluate other options, you may well do better.

Implementation Spend and Success Rates

Most (83 percent) organizations used in-house resources for at least part of their OLAP implementations, though almost as many also employed external consultants. More than two-thirds of projects were in-house led, mainly by IT. Of the external resources, specialist BI/OLAP consulting firms are the most likely to lead projects; and these also had by far the highest benefit score of all, 4.5.

Because so many sites used in-house resources, the median external spend was only approximately $18,000. However, there was some evidence that external consulting spend of at least $10,000 increased the benefits score.

Web Anomalies

I reserved the biggest surprise for last. Among the many areas surveyed was the Web deployment rate, both at the time of the survey and projected for 12 months later. The respondents to the first OLAP survey had reported high and increasing Web deployment rates. Even though we had not expected the new survey, with data captured approximately 18 months later, to show the full projected increase, we did expect some increase. However, to our surprise, the Web deployment rates reported this time were actually significantly lower than those reported in the first survey.

The first reaction was to suspect a statistical glitch; but there were no other strange results from other questions, so this seemed an unlikely explanation. The next theory was that the reduced rate reflected the more international sample this year, with 50 percent (rather than 72 percent) of respondents from North American organizations, which were more likely to have Web-deployed OLAP applications. This might have played a small part in reducing the total, but even the North American sample showed a significant drop in Web deployment this time. For example, in the 2000/1 sample, 51 percent of North American users said that at least 50 percent of their seats were Web-deployed; in the summer of 2002, only 38 percent said the same. There was also a small drop in Web deployment rates among international users, from 41 to 37 percent.

The drop in the North American figure is too large to be just a statistical anomaly, so we had to consider two other possibilities:

  • A genuine return from Web deployment to client/server, perhaps because of disappointment with the costs, functionality or performance of Web OLAP products. There is anecdotal evidence that this does sometimes happen, but it is hard to believe that it accounts for the full extent of the fall.
  • Perhaps people were overestimating their OLAP deployment rates in 2000/1 because they were carried away by the dot-com hype of the time. This time, they were probably more realistic; therefore, the lower figures might simply reflect reality rather than a genuine sharp decline in Web deployments. This is probably the largest contributor to the fall.

Extranet deployment rates were also down, rather than showing the fast growth predicted. There may indeed be a fall here, given the extra security concerns after September 11 and the tougher economic times.
It's easy to make assumptions and projections based on personal experience, but there's no substitute for real data from the real world. Many results from this year's survey confirmed what I already suspected, but others were a genuine surprise. There's no doubt that learning from other people's experience can save you from problems in your own applications.
