8:58 a.m. CST - Seats filling up and, thanks to coffee (and is that Lady Gaga on the PA?), everyone's eyes opening up ahead of the opening presentation by Bruno Aziza, former BI director at Microsoft. Just spotted him chatting w/ Eric Siegel, Ph.D. and president of Prediction Impact, the consultancy that leads the PAW conferences here and in D.C., London, Germany, Boston, and ones already held in Toronto and San Francisco. Check out what Eric had to say ahead of their Toronto event in this "10 Minutes ..." Q&A w/ Information-Management.com.
9:15 a.m. - Aziza, now at SiSense, starts with a partial clip from the classic SNL fake TV ad for "Bad Idea Jeans" ... Delving into why BI is broken: an estimated $7 billion is spent on business intelligence every year, but the adoption rate is about 30 percent -- even less at some enterprises, where as few as one out of 10 end users harness business analytics. It's "pretty depressing." Maybe, like Mel Brooks' Moses in "History of the World, Part 1," some old rules need to be cut out of the conversation.
Most companies not equipped to handle "small data," much less big data. "When we think about our space, it's a very hot space to be in right now." That's creating a lot of fear over big data: that it's only available to large enterprises, that you need a data scientist ... but it's a "fairly simple problem." Example of a U.K. railroad operator asking how to check the quality of the tracks. Currently, workers physically walk along the tracks doing spot checks -- a never-ending, expensive and not entirely safe project. So the CIO proposes mounting cameras under trains, taking pictures every three seconds. But, Aziza said, you'd need a massive database to store, index and analyze them.
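A quick back-of-envelope sketch of why that camera scheme gets big fast. The three-second interval is from the talk; the image size, fleet size and operating hours below are purely illustrative assumptions, not figures Aziza gave:

```python
# Rough data-volume estimate for the U.K. rail camera example.
# Only the 3-second photo interval comes from the talk; the rest
# (image size, fleet size, operating hours) are assumed for illustration.
IMAGE_MB = 2.0          # assumed size of one under-train photo
SECONDS_BETWEEN = 3     # one photo every three seconds (from the talk)
HOURS_PER_DAY = 18      # assumed daily operating hours per train
TRAINS = 100            # assumed fleet size

photos_per_day = TRAINS * HOURS_PER_DAY * 3600 // SECONDS_BETWEEN
gb_per_day = photos_per_day * IMAGE_MB / 1024

print(f"{photos_per_day:,} photos/day, ~{gb_per_day:,.0f} GB/day")
```

Even with these modest assumptions, that's millions of photos and multiple terabytes a week -- the "massive database" problem in miniature.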
Aziza: "The data piece is one small sliver of the problem. You have to think of what service this data is creating."
On expressing "big data" to most enterprises/executives: Nobody cares about data volumes and speed; they only care about the final number. You must think about your data as a service (not to be entirely confused with DaaS). You need applications mature enough for users to drill down to the one number they're looking for. The ability to map that semantic layer and create a direct service is where there will be "wins," Aziza says. If you can't explain the value of your data to ... your end users, you should probably reassess your approach, he says.
With frequently low returns on streams of data, Aziza stresses to just "get the data." Storage can be expensive, but in-memory capabilities are growing (enabling far more access across the network). Audience responding with a wild range of believed storage costs -- from $20,000 to $80 ... Aziza pegs it at $30 for 1 terabyte of disk storage today, down from $14 million for 1 terabyte in 1980. That's a drastic reduction, but also over, in computing terms, a long time frame. "Incredible opportunity for us to say to people, 'Don't discriminate' ... store everything." (Take Bruno's LinkedIn poll on what makes a data scientist here.) RAM on commodity hardware alone definitely faces storage limitations, but meld it w/ a data warehouse system for crunching and deeper storage where need be. Two other pillars Aziza says are crucial: 1) address the scarcity of human resources for large data volumes via an as-a-service or competition approach, like at Kaggle ... seek and combine resources because "you don't know where the best ideas are going to come from"; and, 2) the importance of marketing by folks dealing with/developing data solutions ... think about how people make decisions, and step outside your own bias toward particular approaches to data.
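Just to put Aziza's two price points side by side (both figures are his, from the talk):

```python
# Aziza's storage-cost figures: $14 million per TB in 1980 vs. ~$30 today.
cost_1980 = 14_000_000   # dollars per terabyte, 1980 (from the talk)
cost_today = 30          # dollars per terabyte today (from the talk)

reduction_factor = cost_1980 / cost_today
print(f"Roughly a {reduction_factor:,.0f}x drop in cost per terabyte")
```

That works out to a cost drop on the order of hundreds of thousands of times -- which is the arithmetic behind "don't discriminate ... store everything."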
10:45 a.m. - James Taylor, self-described "decision management guy," from Decision Management Solutions starts his talk on profitable decisions through analytics: "what decision you're trying to improve." ("You can't just deliver a bunch of math.") Taylor dives right in --
Analytics team, IT and business: all three have to collaborate. Quick framework:
1) Be clear what decisions you're improving. Begin with the decisions in mind. In analytics, that means: risk (credit, insurance, supply chain); fraud detection; and customer-centered (maximizing interactions and potential ... fast-moving and real-time). Use your business questions to define decisions.
2) How do the decisions impact the business (good or bad)? Target decision-making on KPIs. Link KPIs to your definition of what is to be improved or isn't working. From here you can put analytics in context: what processes will be improved; what events trigger the decisions; and who, internally, cares about analytics. "See where this decision making is going to fit."
3) Decompose decisions to understand them. Taylor says: "It's not typically enough to know that it exists to be able to improve it." Dig into what is required to make decisions: guidelines, expertise, regulations, existing system logic, external reference data, etc. Example of an insurance company: used a decision model to find five risk areas in life insurance policies, and the exercise showed that no matter how predictive the model was, it didn't change the underlying underwriter governance. Wouldn't have known that w/out the decision model in place, so they were able to address the workaround instead of just layering analytics on top.
4) Still have to deploy analytics back to the processes that need them. Taylor says, remember that operational systems and analytic systems are often pointed in separate directions. So it has to go back based on agility, analytics embedding and adaptive capabilities.
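Taylor's four steps could be sketched as a simple project checklist; the field names and the insurance-flavored example values below are hypothetical, not from the talk:

```python
# Minimal sketch of Taylor's four-step decision framework as a checklist.
# All names and example values are illustrative, not from the talk.
from dataclasses import dataclass, field

@dataclass
class DecisionProject:
    decision: str                                   # 1) the decision being improved
    kpis: list = field(default_factory=list)        # 2) business KPIs it impacts
    inputs: list = field(default_factory=list)      # 3) decomposed decision inputs
    deployment: str = ""                            # 4) where analytics flow back to

project = DecisionProject(
    decision="Approve or refer a life insurance application",
    kpis=["loss ratio", "time to decision"],
    inputs=["underwriting guidelines", "regulations", "external reference data"],
    deployment="embedded scoring call in the policy-admin workflow",
)

# Per the framework, a project isn't done until all four pieces are filled in --
# including step 4, deploying back to the operational process.
ready = all([project.decision, project.kpis, project.inputs, project.deployment])
print("ready to deploy:", ready)
```

The point of the last check mirrors Taylor's warning: a model with no deployment path back into the operational system is "just a bunch of math."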