The Role of Data Connectivity in Business Intelligence

Published March 4, 2004, 1:00 a.m. EST

Increasingly, companies of all sizes are turning to business intelligence (BI) software to enable faster and better-informed decision making. As more users demand business intelligence tools to access and analyze enterprise data, the attributes of the underlying IT infrastructure become critical. One frequently overlooked portion of this infrastructure is data connectivity: the middleware that connects business intelligence applications to the underlying databases.

Data connectivity components, such as ODBC and JDBC drivers and ADO.NET providers, can have a significant impact on applications that rely on databases to support the storage, access and management of information, especially business intelligence solutions. The sophistication and quality of the data connectivity architecture affects the performance, scalability and reliability of business intelligence applications. In fact, one of the greatest impediments to application performance and scalability is the bottleneck between the application and the underlying database caused by flawed or sub-par data connectivity.
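
To make the driver's role concrete, the short Java sketch below uses JDBC, one of the standards mentioned above, to run a simple report query. The connection URL, credentials and table are placeholders for illustration; any JDBC-compliant driver could sit behind these same calls, and its efficiency bounds how quickly the rows come back.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class ReportQuery {
        public static void main(String[] args) throws Exception {
            // The JDBC URL selects the driver; everything below it is standard API.
            // URL, credentials and table name are placeholders for illustration.
            String url = "jdbc:postgresql://dbhost:5432/sales";
            try (Connection conn = DriverManager.getConnection(url, "report_user", "secret");
                 Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery(
                         "SELECT region, SUM(amount) FROM orders GROUP BY region")) {
                // Every row returned here has passed through the driver,
                // so driver efficiency bounds the application's throughput.
                while (rs.next()) {
                    System.out.println(rs.getString(1) + ": " + rs.getBigDecimal(2));
                }
            }
        }
    }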

As developers at packaged software vendors and in corporate IT departments build and deploy business intelligence solutions, it is important to examine the role that data connectivity plays and the options available in the market today.

The Importance of Data Connectivity

Business intelligence users require reliable and fast access to data that may reside in any number of databases across the enterprise. Since real-time business intelligence systems are designed to provide an accurate analysis and report of what is occurring at any given point in time, stable data connectivity components are imperative. Without reliable data access, the system runs a risk of downtime that could have a significant impact on the course of business, inhibiting decision-makers’ ability to see changes in the business as they are occurring, when there is still time to take corrective action.

Organizations that want to utilize a real-time business intelligence system often need access to multiple corporate data stores in order to analyze the information. For example, using a query and reporting tool to analyze data from manufacturing, sales and financial systems can require access to a multitude of databases. This diversity translates into the need for a flexible, high-performance data connectivity layer composed of components capable of supporting a wide range of databases, database versions, connectivity standards and platforms.
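
The value of a standards-based layer is easy to see in code. In the illustrative Java/JDBC sketch below, the same reporting logic runs against two different database systems simply by changing the connection URL; only the driver on the classpath differs. Both URLs, the credentials and the table are hypothetical.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class MultiSourceReport {
        // The same standard JDBC calls work against any database for which a
        // driver exists. Both URLs below are placeholders for illustration.
        private static final String[] SOURCES = {
            "jdbc:oracle:thin:@erphost:1521:mfg",               // manufacturing system
            "jdbc:sqlserver://crmhost:1433;databaseName=sales"  // sales system
        };

        public static void main(String[] args) throws Exception {
            for (String url : SOURCES) {
                try (Connection conn = DriverManager.getConnection(url, "bi_user", "secret");
                     Statement stmt = conn.createStatement();
                     ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM open_orders")) {
                    if (rs.next()) {
                        System.out.println(url + " -> open orders: " + rs.getLong(1));
                    }
                }
            }
        }
    }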

Business intelligence systems often require the use of a tool for data extraction, transformation and loading (ETL). ETL tools enable the combination of data from diverse systems into a common format for reporting or storage. Like query and reporting engines, ETL tools are highly dependent on fast, reliable and scalable data connectivity.
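
As a rough sketch of the pattern, the illustrative Java/JDBC fragment below extracts rows from an operational source, applies a simple transformation, and batch-loads the result into a warehouse table. The URLs, tables and currency conversion are hypothetical, and a real ETL tool would add staging, error handling and restartability.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class MiniEtl {
        public static void main(String[] args) throws Exception {
            // Source and target URLs, credentials and tables are placeholders.
            try (Connection src = DriverManager.getConnection(
                         "jdbc:mysql://ops:3306/orders", "etl", "secret");
                 Connection dst = DriverManager.getConnection(
                         "jdbc:postgresql://dw:5432/warehouse", "etl", "secret")) {
                dst.setAutoCommit(false); // commit the load as one unit
                try (Statement extract = src.createStatement();
                     ResultSet rs = extract.executeQuery(
                             "SELECT id, amount_local, currency FROM daily_orders");
                     PreparedStatement load = dst.prepareStatement(
                             "INSERT INTO fact_orders (id, amount_usd) VALUES (?, ?)")) {
                    while (rs.next()) {
                        // Transform: normalize every amount to a common currency.
                        double usd = toUsd(rs.getDouble("amount_local"), rs.getString("currency"));
                        load.setLong(1, rs.getLong("id"));
                        load.setDouble(2, usd);
                        load.addBatch(); // batched inserts cut round trips to the target
                    }
                    load.executeBatch();
                    dst.commit();
                }
            }
        }

        // Hypothetical conversion; a real system would look up current rates.
        private static double toUsd(double amount, String currency) {
            return "USD".equals(currency) ? amount : amount * 1.1;
        }
    }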

Since all database communications must pass through the connectivity component, bugs or flaws in the data access architecture or implementation cause performance and support problems. In addition, given the query-intensive nature of real-time business intelligence applications and their users' reliance on these solutions to return consistent and accurate results, high-performance data connectivity is essential to building an efficient system. With this in mind, developers should look for robust and scalable data connectivity components that ensure consistent behavior and performance while leveraging their existing investment in IT architecture.
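
How the connectivity layer is used matters as much as the driver itself in query-intensive workloads. The sketch below shows two standard JDBC techniques for cutting per-query overhead: preparing a statement once and reusing it, and raising the fetch size so rows travel in larger batches. The connection details are again placeholders.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    public class RepeatedQueries {
        public static void main(String[] args) throws Exception {
            String[] regions = {"NORTH", "SOUTH", "EAST", "WEST"};
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:postgresql://dbhost:5432/sales", "report_user", "secret");
                 // Prepare once, execute many times: the database can reuse the
                 // plan and the driver avoids re-parsing the SQL on every call.
                 PreparedStatement ps = conn.prepareStatement(
                         "SELECT SUM(amount) FROM orders WHERE region = ?")) {
                ps.setFetchSize(500); // fetch rows in larger batches to cut round trips
                for (String region : regions) {
                    ps.setString(1, region);
                    try (ResultSet rs = ps.executeQuery()) {
                        if (rs.next()) {
                            System.out.println(region + ": " + rs.getBigDecimal(1));
                        }
                    }
                }
            }
        }
    }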

Selecting a Data Connectivity Solution

While data connectivity is traditionally a technical decision, its implementation and architecture have significant business implications. When evaluating data connectivity components, organizations have four broad categories to consider:

  • Vendor-provided: Often seen as the most cost-effective alternative for application development, these components come bundled with the purchase of database engines. However, vendor-provided solutions have significant limitations that can affect project success and development and support costs. Because these vendors are focused on selling databases, their data connectivity components are developed and tested for a specific version of one database or family of databases, which can mean increased costs, suboptimal features and poor performance in a multiplatform environment.
  • Native: Native components are proprietary, client library-based interfaces developed in the absence of industry connectivity standards. The native approach is becoming obsolete, as it fails to meet the needs of today's dynamic marketplace and its changing user and budget requirements. Most organizations instead adopt a standards-based approach, which offers significant interoperability advantages.
  • Open source: Distributed through open source projects, these components are developed and tested by communities of developers who are not directly paid for their efforts. While these drivers and providers may appear at first glance to be a bargain (they are often available for little or no up-front cost), they frequently carry serious limitations in software quality, performance, scalability and technical support, along with hidden costs.
  • Third-party, independent: When performance, scalability, portability and quality are important, software developers find third-party data connectivity components are the best choice. Buyers should choose components from vendors with a comprehensive product line, a long track record of delivering proven solutions and a strong leadership position in the data connectivity market.

Great data connectivity depends on deep knowledge of industry standards as well as the underlying database engines and protocols used to communicate with them. Developers and development managers should seek data connectivity components offered by vendors with strong relationships with all database vendors and significant ties to major standards committees. It is also important to make sure the vendor employs engineers, testers, customer support professionals and developers with expertise in data connectivity.

An increasingly important area of focus for today's enterprise, business intelligence applications promise clear advantages by reducing costs through the management and distribution of corporate information in real time. As companies analyze business intelligence data and the complexity of these transactions rises, data connectivity will continue to have a dramatic effect on application performance, scalability, portability and reliability.

Poor data connectivity choices directly impact the bottom line through increased development costs, slower deployments and missed revenue opportunities. Therefore, application developers and IT managers should carefully evaluate their connectivity choices as early as possible in a business intelligence development effort in order to reap the most value from their data sources.
