Editor's note: This article is the second part of a series. Read Part 1 here, Part 3 here and Part 4 here.

Organizations are constantly exploring ways to maintain a competitive edge in today's rapidly changing landscape. The business value of real-time analytics lies in the enhanced ability to react to these changes and make quick, informed decisions. Information is made available in real time so that management can recognize changes and patterns and make the adjustments necessary to stay ahead of the competition. This article examines how real-time analytics and business intelligence can be leveraged to provide these capabilities, which can be applied in practically any industry.

Information analytics, or business intelligence, has been the core around which organizations build data-related activities. While there are multitudes of initiatives and projects around data, the fundamental purpose of managing and storing it remains the same: analysis. This has spawned numerous related functions that serve to promote or enhance information management and the architecture around it.

BI as a function has evolved and matured over the years, and some organizations have established centers of excellence around the practice. This awareness of the uses and capabilities of BI has created needs such as real-time and near real-time data. These needs are met by capabilities such as:

  • Real-time analytics,
  • Data virtualization, and
  • Virtual dashboards.

Real-time or virtual data as a concept has existed for some time, but the technology under the hood has improved. The technology that transmits data from a transactional system to users in real time or near real time has advanced by leaps and bounds in recent years. The technology and the process involved are now streamlined, automated and repeatable.
Products available in the marketplace deliver this functionality with minimal effort and programming. Most of these products are driven by a graphical user interface and are tightly integrated with data movement technologies. This reduces the need for extensive tool evaluations and requests for proposal, which can be time-consuming and tedious. The factors to consider when evaluating a specific tool will be determined by the requirements, the environment, the platform and the architecture. Enterprise solutions that use massively parallel processing technologies have made real-time data practical as well as feasible.

A holistic view of business intelligence provides a perspective of the function that traverses multiple lifecycle stages and processes, ranging from business objectives and requirements analysis to the design and integration of information to analytics. Along the way, a few related processes and functions have taken shape, with information quality and master data management being the most prominent. Each provides a means of ensuring that the information being provided for decision-making is accurate and meaningful. A logical sequence of events that constitute BI may follow a progression like this:

  • Requirements analysis, source-to-target analysis,
  • Conceptual, logical and physical modeling,
  • Architecture and target state design,
  • Source integration, information quality, master data,
  • Analytics and decision support framework.               

It might not always be possible to implement these disciplines in their entirety relative to BI, even less so with a real-time solution. A practical approach to achieving data quality and MDM is a phased one, with the success of each phase lending further credibility to the program. The reality is that these disciplines lend greater maturity and effectiveness to the BI process.
Most initiatives embed the quality and master data components along the path outlined above, and most if not all projects infuse quality processes somewhere along the line while implementing BI. Quality checks and cleansing routines are often embedded in the integration routines; that is the quickest means of achieving the quality objectives of a BI program, and a simple sketch of the idea follows below. The stress here on information quality is deliberate: the goal is ultimately to present upper management and executives with accurate information upon which to base business decisions. These decisions are crucial in bringing about change and growth, so it is critical that the information provided is of the highest quality.
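
As a rough illustration of what such embedded checks might look like, the sketch below applies a few basic validation and cleansing rules to incoming records during integration. The field names and rules are hypothetical and not drawn from any specific tool or source system.

```python
# Minimal sketch of quality checks embedded in an integration routine.
# Field names (trade_id, security_id, quantity, price, trade_date) are hypothetical.
from datetime import datetime

REQUIRED_FIELDS = ("trade_id", "security_id", "quantity", "price", "trade_date")

def cleanse(record: dict) -> dict | None:
    """Return a cleansed record, or None if it fails basic quality checks."""
    # Completeness: reject records missing mandatory fields.
    if any(record.get(f) in (None, "") for f in REQUIRED_FIELDS):
        return None

    # Standardization: trim identifiers and normalize the date format.
    record["trade_id"] = str(record["trade_id"]).strip()
    record["security_id"] = str(record["security_id"]).strip().upper()
    record["trade_date"] = datetime.strptime(
        str(record["trade_date"]).strip(), "%Y-%m-%d"
    ).date()

    # Validity: quantities and prices must be positive numbers.
    record["quantity"] = float(record["quantity"])
    record["price"] = float(record["price"])
    if record["quantity"] <= 0 or record["price"] <= 0:
        return None

    return record

def integrate(raw_records):
    """Split a feed into clean rows to load and rejects to report on."""
    clean, rejects = [], []
    for rec in raw_records:
        try:
            cleansed = cleanse(dict(rec))
        except (ValueError, TypeError):
            cleansed = None              # malformed dates or numbers become rejects
        (clean if cleansed else rejects).append(cleansed or rec)
    return clean, rejects
```

In practice these rules would live inside the integration tool itself, with rejects routed to a quality dashboard rather than a simple list.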

Business Use Case

A hedge fund manager would like access to investment and trade activity information in real time or near real time in order to analyze positions, holdings and activity and to examine the risks taken so as to influence investment decisions. The metrics to be tracked include performance analysis, calculation of risk, return and ratios, single analysis versus benchmarking and peer group comparison, calculation of daily/weekly/monthly returns, annualized averages, drawdown analysis and volatility analysis; a worked example of a few of these calculations follows. The information required includes tracking a broad range of investments such as stocks, bonds, currencies and commodities, accessed through cash investments or derivatives.
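
To make a few of these metrics concrete, the sketch below computes daily returns, an annualized average return and maximum drawdown from a series of portfolio values. It is an illustrative calculation only; the sample values are invented.

```python
# Illustrative calculation of daily returns, annualized return and max drawdown.
# The portfolio values at the bottom are made-up sample data.

def daily_returns(values):
    """Simple period-over-period returns from a series of portfolio values."""
    return [(v1 - v0) / v0 for v0, v1 in zip(values, values[1:])]

def annualized_return(returns, periods_per_year=252):
    """Geometric annualization of a series of periodic returns."""
    growth = 1.0
    for r in returns:
        growth *= (1.0 + r)
    return growth ** (periods_per_year / len(returns)) - 1.0

def max_drawdown(values):
    """Largest peak-to-trough decline, as a (negative) fraction of the peak."""
    peak, worst = values[0], 0.0
    for v in values:
        peak = max(peak, v)
        worst = min(worst, (v - peak) / peak)
    return worst

if __name__ == "__main__":
    nav = [100.0, 101.5, 100.8, 102.3, 99.7, 101.1]   # hypothetical daily NAVs
    rets = daily_returns(nav)
    print("daily returns:    ", [round(r, 4) for r in rets])
    print("annualized return:", round(annualized_return(rets), 4))
    print("max drawdown:     ", round(max_drawdown(nav), 4))
```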

The hedge fund manager needs to analyze this information on a real-time basis as well as on a daily, weekly and monthly basis. The master data around securities, products, accounts and pricing information is also required. This information is stored in product, account and securities master data stores. In short, there is a need to integrate data on prices of securities (and possibly even economic and other financial data) with accounting/transactional logs of securities bought and sold.

The analysis of requirements and sources determines the frequency of trade information being posted. This information includes updates, posting of new trades and any adjustments to existing information. The information needs to be sourced from the transactional trading systems and tied to various pieces of qualitative information in order to satisfy the analytical needs.

At this point, emphasis needs to be placed on the analysis to be performed in order to determine and tie together the various pieces of information required. From a longer-term perspective, further detailed analysis of each source will be needed to understand the state of the information it contains. That information might not be in good shape, or it may arrive at different frequencies, with some data stored daily and other data monthly. Scrubbing, cleansing and gap analysis may be required to bring the data into a reasonable state. Unfortunately, the time required to resolve quality issues is a drawback that real-time analytics has to live with.

For more traditional analytical needs, the analysis can be further decomposed into the following steps:

  • Business requirements analysis,
  • Source system analysis,
  • Gap analysis, and
  • Target state determination.

The trade information is captured in proprietary trading systems and needs to be delivered to the analytical environment in real time. One of the technologies that can be used to achieve this is messaging; a message-based, service-oriented architecture is an approach that can be adopted to achieve the objective, as sketched below.
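
As a rough sketch of this message-driven pattern, the example below uses an in-process queue as a stand-in for a real message bus (JMS, AMQP or similar). The topic, payload fields and store are hypothetical, and a production consumer would subscribe to enterprise messaging middleware instead.

```python
# Sketch of message-based capture of trade events.
# queue.Queue stands in for a real message broker; in practice the consumer
# would subscribe to a topic on an enterprise messaging system.
import json
import queue
import threading

trade_bus = queue.Queue()            # stand-in for a "trades" topic/queue

def publish_trade(trade: dict) -> None:
    """Trading-system side: publish a trade event as a JSON message."""
    trade_bus.put(json.dumps(trade))

def consume_trades(store: list, stop: threading.Event) -> None:
    """Analytics side: pull trade messages and load them into the
    real-time instance (represented here by a simple list)."""
    while not stop.is_set():
        try:
            message = trade_bus.get(timeout=0.5)
        except queue.Empty:
            continue
        store.append(json.loads(message))   # in reality: insert into the real-time store
        trade_bus.task_done()

if __name__ == "__main__":
    realtime_store, stop = [], threading.Event()
    worker = threading.Thread(target=consume_trades, args=(realtime_store, stop))
    worker.start()

    # Hypothetical trade events posted by the trading system.
    publish_trade({"trade_id": "T1", "security_id": "ABC", "quantity": 100, "price": 25.40})
    publish_trade({"trade_id": "T2", "security_id": "XYZ", "quantity": -50, "price": 19.95})

    trade_bus.join()                 # wait until both messages are consumed
    stop.set()
    worker.join()
    print(f"{len(realtime_store)} trades loaded into the real-time instance")
```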
A thorough examination of the requirements will determine the most appropriate type of design. The analytical objectives that call for real-time and near real-time analysis drive the design and the choice of the data model; more traditional, historical analytical needs require a different type of model design.

Real-time analytical needs require that the loading of the data be an extremely efficient process. This calls for a data model or design that is more normalized, similar or close to third normal form; a minimal illustration follows. The design might call for a separate instance to be created to facilitate the fast and efficient loading of data from transactional systems. Several of these approaches can be used in conjunction with one another. The design choices might not conform to popular or traditional design methodologies, but the designer needs to get creative and innovate based on the need.
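
One way to picture such a normalized, load-friendly structure is the sketch below, which models trades, securities and accounts as separate entities related by keys. The attributes shown are hypothetical and deliberately minimal, not a prescribed schema.

```python
# Sketch of a near-third-normal-form structure for fast trade loading.
# Each entity holds its own attributes once; trades reference securities and
# accounts only by key. Attribute names are hypothetical.
from dataclasses import dataclass
from datetime import date

@dataclass
class Security:
    security_id: str       # natural or surrogate key
    symbol: str
    asset_class: str       # stock, bond, currency, commodity, derivative

@dataclass
class Account:
    account_id: str
    fund_id: str
    base_currency: str

@dataclass
class Trade:
    trade_id: str
    security_id: str       # reference to Security
    account_id: str        # reference to Account
    trade_date: date
    quantity: float
    price: float
```

Because each trade row carries only keys and transactional facts, inserts stay narrow and fast, which is the point of the more normalized design.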

Real-time to near real-time analysis requires a separate instance that stores the information, sized according to how long that information needs to be retained during the day. The service populates information several times daily; each feed can either be held for the length of the day or pushed to an operational store as it arrives. Data from the real-time instance can be moved to the operational store at the end of each day, and a data warehouse can satisfy longer-term historical needs. Service calls can be made to each master data store for qualitative information to enrich the information from the warehouse. This is the information that upper management receives for decision-making; a simplified sketch of the flow follows.
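
In the sketch below the stores are represented as in-memory collections and the master data lookup as a plain function call. All names and the sample master data are illustrative only.

```python
# Sketch of the end-of-day flow: real-time instance -> operational store,
# with master data lookups enriching records before they reach the warehouse.
# The stores are plain lists/dicts here purely for illustration.

security_master = {                      # hypothetical securities master data
    "ABC": {"symbol": "ABC", "asset_class": "stock"},
    "XYZ": {"symbol": "XYZ", "asset_class": "bond"},
}

def enrich(trade: dict) -> dict:
    """Service-call stand-in: attach qualitative master data to a trade."""
    reference = security_master.get(trade["security_id"], {})
    return {**trade, **reference}

def end_of_day_rollover(realtime_store: list, operational_store: list, warehouse: list) -> None:
    """Move the day's trades out of the real-time instance, enrich them and
    persist them for operational and longer-term historical analysis."""
    while realtime_store:
        trade = realtime_store.pop(0)
        enriched = enrich(trade)
        operational_store.append(enriched)
        warehouse.append(enriched)       # in practice, a separate historical load
```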
