They say good things come in small packages, but they can also come from smaller organizations. I enjoy being able to interact with small local businesses that offer great quality products. I head out every few days to the local baker to buy my loaf of bread, and get a real kick out of seeing the place inundated with customers from near and far after the best loaf in town. The look on someone’s face when he or she arrives in the early afternoon only to realize everything has sold out is bittersweet for the owner: sorry that a customer will leave empty-handed, but satisfied that demand is so high the store never has anything left over.

The flurry of activity that goes on behind the counter earlier in the day is as exciting to watch as the final product is to eat: delivery trucks unloading flour and other ingredients, bakers preparing the dough and pastries, the phones ringing with orders for the day, and the delivery guy loading the truck for the daily run to delis and top-end restaurants around town. The owner openly admits there are days when he has no insight into anything that happens outside his small world. With technology limited to just email, feedback comes back to the store through direct customer engagement in person or over the phone, and sometimes through the driver.

Based on these winning ingredients (pardon the pun) for running a decent bakery, the owners have since opened four additional stores around town, extending the product range to include coffee and other café-style food, and adding a larger dedicated baking facility that now delivers to some supermarkets. In relative terms, it’s the degree of growth larger competitors can only dream of. You know you’re doing well when commercial bakeries start replicating your products and then use their muscle to squeeze you out of your assigned supermarket shelf space. For smaller organizations that have exhausted the know-how and gut instinct that took them this far, the question on their minds now is “What’s next?”

This scenario is a case in point of how smaller, more nimble companies with the right product mix and reputation can force larger corporations to react. And when products are becoming increasingly homogenized and competition is tough, smaller operators must innovate everywhere, whether through product development, marketing initiatives, streamlining the supply chain or elsewhere. Innovation assumes some element of failure as part of the discovery process, and when resources are limited you don’t really want to be trying new things too often. One of the most powerful tools to help mitigate this risk is analytics: ultimately, meaningful reports providing powerful insights to help make informed decisions. That’s fine if the resources are readily available; however, these are resources that smaller organizations often lack, for the following reasons:

  1. Poor IT architecture: The technology deployed is usually fragmented, with no business intelligence architecture to build on; in effect, these organizations are starting from nothing.
  2. No data governance: With no architecture in place, organizations don’t have a complete picture of the data available to them, and even when they do, they’re not sure what to do with it.
  3. Isolated reporting capability: Isolated reports lead to inconsistent definitions of measures and dimensions, as well as poor governance.
  4. Time constraints: Employees of smaller organizations typically take on more responsibilities than their counterparts at larger corporations; that is certainly the case in the bakery example. When identified subject matter experts spend time away from core operations, it creates a gap that needs resource planning to mitigate. Also, solutions are needed in weeks, maybe months, but never in years, as can be the case in some larger organizations.
  5. Budget: Mid-tier organizations simply don’t have the same access to funding that larger organizations do. Traditional BI methodologies would cost far too much for a mid-tier organization to implement, yielding a smaller return on the investment.

Given these constraints, I recommend alternative approaches to technology and methodology to meet business needs sooner. From a technology perspective, I refer to cloud-based BI offerings (meaning BI software-as-a-service). From an implementation perspective, agile BI implementation methodologies can be leveraged thanks to tightly integrated BI SaaS architecture and the powerful underlying technology stacks these services run on.

For small to mid-tier organizations, the cloud promises much in the way of reducing the total cost of ownership of IT infrastructure with improved accessibility: accessibility not only from a hardware and software perspective (i.e., virtual servers running on the latest platforms can be deployed in seconds), but also to storage and processing power that scales on demand. With flexible licensing models, organizations simply pay for what they use, based on both storage and processing demand. Plenty of offerings exist from major vendors, complete with cloud-based data warehouse and application server options. Amazon has also entered the fray; its recent release of Redshift is tightly integrated with the other components of the Amazon Web Services offering. With such a rapidly changing landscape, selecting a strategic cloud service provider requires research, especially if the intention is to deploy a BI solution with complex data and processing requirements.
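
To make that pay-for-what-you-use idea concrete, here is a minimal sketch in Python using the boto3 library (my assumption; any cloud SDK would do) that provisions a small Redshift cluster on demand and resizes it later when processing demand grows. The cluster name, node type and credentials are placeholders.

    import boto3  # AWS SDK for Python; assumes credentials are already configured

    redshift = boto3.client("redshift", region_name="us-east-1")

    # Provision a small data warehouse cluster on demand; no hardware to buy.
    redshift.create_cluster(
        ClusterIdentifier="bakery-bi",             # placeholder name
        NodeType="dc2.large",
        NumberOfNodes=2,
        MasterUsername="admin",
        MasterUserPassword="Example-Passw0rd",     # placeholder; use a secrets store in practice
        DBName="analytics",
    )

    # When storage or processing demand grows, scale out rather than re-architect.
    redshift.modify_cluster(ClusterIdentifier="bakery-bi", NumberOfNodes=4)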

As it stands today, most cloud offerings don’t really offer BI as a service out of the box, so architecture still plays a key role even though the cloud removes many technical complexities. A decent understanding of the available tools is needed to establish a robust but flexible BI architecture. Configuration and some development are still required to turn those components into a solution that is ultimately a fit-for-purpose BI service ready for consumption by your stakeholders.

Since cloud architecture is designed for rapid deployment, how do we go about deploying BI services rapidly, too? A change in paradigm is required to ensure that BI services deployed in the future can exploit that dynamic underlying technical architecture, which means traditional BI methodologies might not be appropriate.

Discussing methodology in detail would require another article in itself, but for our purposes, traditional BI methodology can be broken down into requirements gathering (for usage and data needs), data modeling, ETL processes and, finally, reporting and dashboard creation. We’re all familiar with the pain points:

  • Untimely project delivery
  • Complexity, because data is frequently moved between systems and then processed
  • Multiple points of failure, due to dependence on so many source systems and processes
  • High implementation costs
  • Limited benefit to the stakeholder, because by the time the solution is delivered, requirements have moved on, with knock-on impacts on change management

For small to mid-tier organizations, the above approach is difficult to even consider, and following it is fraught with risk.
So to help some of our customers, we’ve adopted the following new ideas to deploy capability sooner and exploit the cloud:

1. Acquire all data. As the business is unsure of the exact information it needs now or in the future, acquire it all and consume as required. This ensures that you don’t gather “slices” of data and prevents information silos from developing. Many users have the habit of extracting the same data from the same source system for slightly different purposes, which ultimately leads to data governance issues.

2. Acquire data in near real time. This avoids the unnecessary complication of batch loads and decreases data latency.
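
One simple way to do this (a sketch only, assuming a SQL source, the psycopg2 driver and hypothetical table and column names) is to poll the source frequently for rows changed since a stored high-water mark, rather than running a single overnight batch:

    import time
    import psycopg2  # assumed Python driver; the pattern applies to any SQL source

    source = psycopg2.connect("dbname=pos_system")        # placeholder connections
    target = psycopg2.connect("dbname=cloud_repository")

    last_seen = "1970-01-01 00:00:00"  # high-water mark; persist between runs in practice

    while True:
        with source.cursor() as src, target.cursor() as tgt:
            # Pull only the rows changed since the last poll: small, frequent
            # increments instead of one large overnight batch.
            src.execute(
                "SELECT order_id, store_id, amount, updated_at "
                "FROM sales_orders WHERE updated_at > %s ORDER BY updated_at",
                (last_seen,),
            )
            for order_id, store_id, amount, updated_at in src.fetchall():
                tgt.execute(
                    "INSERT INTO raw.sales_orders (order_id, store_id, amount, updated_at) "
                    "VALUES (%s, %s, %s, %s)",
                    (order_id, store_id, amount, updated_at),
                )
                last_seen = str(updated_at)
            target.commit()
        time.sleep(60)  # poll every minute; tune to the latency the business needs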

3. Acquire data as a system of record. Do not transform the data on acquisition; the idea is to mirror the source. There are several reasons for this (a brief sketch follows the list):

  • This aids any information audit purposes.
  • It reduces the load on the source system.
  • This provides increased flexibility when dealing with time series requirements.
  • It ensures that data is acquired as quickly as possible from source.
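
As a brief sketch of what mirroring the source can look like (hypothetical names, generic SQL run through the assumed psycopg2 driver), the raw table below has exactly the same columns as the source order table; the only additions are audit columns recording when and from where each row was acquired, and nothing is transformed.

    import psycopg2  # assumed driver for the cloud repository

    repo = psycopg2.connect("dbname=cloud_repository")  # placeholder connection

    with repo.cursor() as cur:
        # A one-for-one copy of the source table's structure; the audit columns
        # are the only extras, so the repository can serve as a system of record.
        cur.execute("""
            CREATE TABLE raw.sales_orders (
                order_id      BIGINT,
                store_id      INTEGER,
                amount        NUMERIC(10, 2),
                updated_at    TIMESTAMP,
                load_ts       TIMESTAMP DEFAULT CURRENT_TIMESTAMP,  -- when the row arrived
                source_system VARCHAR(32) DEFAULT 'pos_system'      -- where it came from
            )
        """)
    repo.commit()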

4. Move and store data once. Data is physically moved from the source system to the cloud-based repository once, as it is acquired. From that point, objects become virtual and data doesn’t need to be physically summarized and stored again. The result is reduced complexity, fewer points of failure, lower infrastructure cost and shorter project delivery times.
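
A sketch of what a virtual object might look like (again with hypothetical names): instead of physically summarizing the raw data into a second table, a view is defined over it, so the data is stored once and the summary is computed on demand.

    import psycopg2  # assumed driver for the cloud repository

    repo = psycopg2.connect("dbname=cloud_repository")

    with repo.cursor() as cur:
        # The daily sales summary is a view over the raw data; nothing is copied
        # or stored a second time, and there is one less load process to fail.
        cur.execute("""
            CREATE VIEW bi.daily_store_sales AS
            SELECT store_id,
                   CAST(updated_at AS DATE) AS sales_date,
                   SUM(amount)              AS total_sales,
                   COUNT(*)                 AS order_count
            FROM raw.sales_orders
            GROUP BY store_id, CAST(updated_at AS DATE)
        """)
    repo.commit()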

5. The traditional ETL approach must change. The ETL used in data warehouses today typically only moves and/or summarizes data. In this approach, data isn’t summarized and stored, so traditional ETL becomes irrelevant.

6. No traditional data modeling. Data models change frequently, and traditional modeling methods just aren’t responsive enough to business needs due to their latency in development and deployment. We instead have a new capability that we’ll call a “data mashup,” where data sets come together and disband regularly based on demand.
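
As a loose illustration of a data mashup (hypothetical tables again), two raw data sets are pulled together on demand to answer a one-off question, and nothing is persisted afterwards, so the combination disbands as easily as it formed:

    import psycopg2  # assumed driver for the cloud repository

    repo = psycopg2.connect("dbname=cloud_repository")

    with repo.cursor() as cur:
        # Join raw sales and raw weather observations on the fly to ask whether
        # rainy days dent pastry sales; no upfront model, nothing stored.
        cur.execute("""
            SELECT s.store_id,
                   CAST(s.updated_at AS DATE) AS sales_date,
                   SUM(s.amount)              AS total_sales
            FROM raw.sales_orders s
            JOIN raw.weather_daily w
              ON CAST(s.updated_at AS DATE) = w.obs_date
            WHERE w.rainfall_mm > 10
            GROUP BY s.store_id, CAST(s.updated_at AS DATE)
            ORDER BY sales_date
        """)
        for store_id, sales_date, total_sales in cur.fetchall():
            print(store_id, sales_date, total_sales)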

7. Bring the processing to the data rather than data to the processing. Considering that we have taken as much data as possible, it should never leave the repository.  Processing should occur within the repository, and cloud solutions are capable of offering this. This becomes paramount when dealing with real-time data and virtual objects.
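
To make the point concrete, a sketch of processing going to the data (same hypothetical raw table): the repository scans and aggregates all of the raw rows itself, and only a handful of result rows ever travel back to the client.

    import psycopg2  # assumed driver for the cloud repository

    repo = psycopg2.connect("dbname=cloud_repository")

    with repo.cursor() as cur:
        # The heavy lifting (scanning and aggregating every raw row) happens
        # inside the repository; the client receives only the top five stores.
        cur.execute("""
            SELECT store_id, SUM(amount) AS total_sales
            FROM raw.sales_orders
            GROUP BY store_id
            ORDER BY total_sales DESC
            LIMIT 5
        """)
        top_stores = cur.fetchall()

    print(top_stores)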

Cloud-based technology can be used to change the way BI solutions are delivered, getting results faster with fewer resources. Removing typical points of failure from traditional methods, thanks to the cloud, helps us achieve this. Fundamentally, though, a decent solution architecture still needs to be defined, with its guiding principles adhered to by those deploying the capability. Combined, this gives smaller, dynamic organizations access to powerful tools that were previously out of reach; tools that guys like my local baker now need to add to their list of ingredients to stay competitive.
