Here are quick-hit best practices from Deloitte analytics and visualization expert Dave Steier.
Establish What Users Will Do With Results
Just as cockpits could not be designed without understanding what pilots need in order to fly an airplane, analytic interfaces should be driven by an understanding of what users will do with the results. Frame the discussion on uses around role-based design, with sensory cues directing action on only the most critical pieces of information.
Let the Users Lead
User-centric analytics follows the approach of other user-centric designs: Start from user needs and work backward to the interface that supports those needs, and ultimately to the analytics that will drive that interface. Even when users cannot specify in advance what they really want, it is critical to involve them early and often as analytic interfaces are designed. Users are likely to feel about interfaces the same way Supreme Court Justice Potter Stewart described obscenity: they can't define it, but they know it when they see it. Users are even better gauges of bad interfaces; if enough users believe an interface is unsatisfactory, the designer is well-advised to accept their judgment.
Talk to the Users
If you are contemplating giving users the ability to set analytics modeling parameters, determine whether they want to set those parameters and whether they know how to do so, or at least give them sensible default values. Users can help identify early wins the designer may not have thought of, and might provide useful introductions to other potential users and their communities. A user who feels a sense of ownership in interface design can become an advocate for the technology, respected by other users. Users of different abilities may point out accessibility considerations, such as how and when color is used, so color-blind users get the same information from the intensity of the display.
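The accessibility point above amounts to redundant encoding: drive both hue and intensity from the same value, so the ordering survives even when hue differences are invisible to the viewer. A minimal sketch (the value range and color ramp here are invented for illustration):

```python
def encode(value: float, vmin: float, vmax: float) -> tuple:
    """Map a value to an RGB color that varies in both hue and luminance.

    The ramp runs green -> red as the value rises. Because green is
    perceptually brighter than red, the luminance of the result also
    falls monotonically with the value, so a color-blind viewer still
    sees a consistent light-to-dark ordering.
    """
    t = (value - vmin) / (vmax - vmin)   # normalize to [0, 1]
    level = int(255 * t)
    return (level, 255 - level, 0)       # hue shifts green -> red

def luminance(rgb: tuple) -> float:
    """Standard luma approximation for an RGB triple."""
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

low, high = encode(0, 0, 100), encode(100, 0, 100)
print(low, high)  # low values are green (brighter), high values red (darker)
```

Because luminance decreases monotonically along this ramp, the display carries the same ordering information through intensity alone.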
A Picture is Worth a Thousand Numbers
Because of our human ability to understand relationships quickly based on size, position and other spatial attributes, the eye can summarize what might otherwise require thousands of numbers to convey. As an example, Figure 1 represents an analytics interface at a large consumer products company. It shows the effectiveness of trade promotion investments in distributors of products offered by the company. Each dot represents a distributor, with the horizontal axis showing the amount of the investment in rebates offered through that distributor and the vertical axis showing the profit or loss from that investment. The vast majority of investments are small, with correspondingly small profits or losses.
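A scatter plot like the one described in Figure 1 can be sketched in a few lines, assuming a Python stack with matplotlib; the distributor figures below are invented for illustration:

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt

# Hypothetical (investment, profit_or_loss) pairs, one per distributor:
# mostly small investments with small returns, plus a few large outliers.
distributors = [(10_000, 500), (12_000, -300), (9_000, 200),
                (250_000, 40_000), (300_000, -60_000)]

investments = [d[0] for d in distributors]
returns = [d[1] for d in distributors]

fig, ax = plt.subplots()
ax.scatter(investments, returns)        # one dot per distributor
ax.axhline(0, linewidth=0.5)            # break-even line
ax.set_xlabel("Trade promotion investment ($)")
ax.set_ylabel("Profit or loss from investment ($)")
fig.savefig("figure1_sketch.png")
```

The eye immediately picks out the cluster of small investments near the origin and the few large bets whose returns, positive or negative, dominate the picture.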
From Analytics to Action
An analytics interface may be visually appealing, but if it doesn't stimulate action, it's not going to be very effective. Good interfaces provide the context to let the user know when action might be required. Consider the treemap displayed in Figure 2, drawn from a health care organization that operates a number of hospitals. Each rectangle represents emergency room visits to one hospital; the larger the rectangle, the higher the number of ER visits. The color represents the number of visits relative to the forecasted number of ER visits. Red means the actual number of visits was higher than forecast, while green means the actual number of visits was lower than forecast. In both red and green cases, there is a problem with the forecasts.
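The color rule behind Figure 2's treemap can be stated in a few lines of code; the hospital names and visit counts below are invented for illustration:

```python
# Rectangle area is driven by visit volume; color by actual vs. forecast.
hospitals = {
    "Hospital A": {"actual": 1200, "forecast": 1000},
    "Hospital B": {"actual": 800,  "forecast": 950},
    "Hospital C": {"actual": 500,  "forecast": 500},
}

def visit_color(actual: int, forecast: int) -> str:
    """Red when visits exceeded forecast, green when below, neutral on target."""
    if actual > forecast:
        return "red"
    if actual < forecast:
        return "green"
    return "gray"

for name, v in hospitals.items():
    area = v["actual"]  # rectangle area scales with visit volume
    print(f"{name}: area={area}, color={visit_color(v['actual'], v['forecast'])}")
```

Either color prompts the same action: investigate why the forecast missed, in whichever direction.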
Don't Automate Everything
Automatic clustering saves weeks of work, but it may not be important for the marketing manager to know the details. What he or she needs to know in order to decide what to do next is the size of the outlier investments and returns. The manager could surely have inferred this with a spreadsheet and enough time, but this display communicates the findings at a glance. It is easy to see the number of outliers in each cluster, how the clusters relate to each other and the magnitude of the problem or opportunity.
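The article doesn't say how the outliers were identified; one simple sketch, using only the standard library, flags returns whose z-score exceeds a threshold (the data and threshold here are invented for illustration):

```python
import statistics

def find_outliers(values, threshold=1.5):
    """Return the values whose z-score magnitude exceeds the threshold.

    Note: with small samples the maximum attainable z-score is bounded,
    so the threshold here is deliberately modest.
    """
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    return [v for v in values if abs(v - mean) / stdev > threshold]

# Hypothetical returns: mostly small profits and losses, two large outliers.
returns = [120, -80, 50, 30, -40, 60, -20, 10, 45_000, -60_000]
print(find_outliers(returns))  # the two large bets stand out
```

Whatever the detection method, the point stands: surface the handful of outliers that demand a decision, and leave the clustering machinery behind the scenes.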
Apply Principles for Good Visual Design
Displays of related information are horizontally and vertically aligned so the eye can see patterns across related variables, without unintended alignments that suggest misleading or irrelevant comparisons. Color serves to highlight exceptions, not to enliven a dull dashboard. Analytic results are not presented to 10 decimal places when the user does not need such precision to make a decision. The displays have a high "data-ink" ratio, following Yale professor Edward Tufte's principles for designing statistical graphics. Good interfaces avoid 3-D effects and ornate gauge designs when simple numbers, charts and graphs will do.
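Two of these principles, decision-relevant precision and color reserved for exceptions, can be captured in a small formatting helper; the KPI values, target and 5% tolerance below are invented for illustration:

```python
def format_kpi(value: float, target: float, tolerance: float = 0.05):
    """Round to decision-relevant precision and flag only exceptions.

    Returns (display_string, highlight). highlight is True only when the
    value misses its target by more than the tolerance -- color is then
    used to call out the exception, not to decorate the dashboard.
    """
    display = f"{value:,.1f}"  # one decimal place, not ten
    off_target = abs(value - target) > tolerance * abs(target)
    return display, off_target

print(format_kpi(1023.44871, target=1000))  # within 5% of target: no highlight
print(format_kpi(1180.0, target=1000))      # >5% over target: highlight it
```

The dashboard then colors only the second value, and the user's eye goes straight to the exception.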
For More on Visualizations
This slide show is based on an extended article by David Steier, available on Information-Management.com, which also covers visualization how-to guides, trends and news. Images used with permission from the author and ThinkStock.