It’s become fashionable in the current technological environment to announce that everything is new again, reinvented or reconfigured for an outsourced implementation. Software as a service, storage as a service, platform as a service, infrastructure as a service, analytics as a service…indeed, every XaaS you can think of has been proclaimed as the next big thing to come down the pike.  (My favorite, introduced on April Fools’ Day, was analyst as a service.  You do the math.)
The proclaimed advantages are obvious and, in many cases, real: faster time to implementation and results; platform elasticity that enables rapid scaling according to need; the ability to shift costs from capital expenditures to operating expenses, always pleasing to CFOs; and guaranteed performance to spare.
In this environment, data warehousing as a service has begun to flourish. Allowing a trusted provider of managed or hosted services to take over the “heavy lifting,” i.e., physical construction of the data center, network connections between the client and the center, and provision of the analytical database and software, is an appealing solution.  Business intelligence professionals must weigh these benefits against the potential drawbacks before deciding how best to proceed (or whether to proceed) in implementing a cloud-based data analytics and BI strategy.  The good news, however, is that companies have successfully designed, implemented and benefitted from outsourced data analytics for more than a decade, long before the acronyms and buzzwords were in place; the cloud component has been easily appended, and the means to provide the needed security are already in place.

What’s Already Been Done


Businesses have been running and benefitting from DaaS-type infrastructures for years. For example, a global telecom provider has, for 12 years, been outsourcing its call records for analysis and insight; the company needs millions of records rapidly processed and investigated to determine which call plans are working, which aren’t and which customers are most at risk of churn.  The company’s volume of records has consistently increased over time, requiring the outsourcing firm to add computing and storage capacity on an as-needed basis and to handle the growth in a manner invisible to its client.
On a more modest scale, a restaurant chain with more than 150 locations needed to process information from each site and determine which offers were driving additional revenue to its bottom line.  It needed variable, on-demand access to compiled data that would help it discover trends and correlations based on fact, rather than on the gut feelings of its managers.
In both cases, the companies outsourced their information to a third party, which did the work for them.  The vendor was able to perform the work in a manner consistent with the federal government’s newly released working definition of cloud computing in a recent request for information (RFI):  “Cloud computing is a pay-per-use model for enabling available, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.”
The government, however, does not address security in its list of key characteristics, leaving it to individual vendors to answer questions such as, “Describe your handling of data isolation, data recovery and handling/security of data at rest and in transit,” “Can you guarantee that data will remain within the continental United States, both in transit and at rest? If so, how?” and “Please explain how you provide physical security in a shared tenant environment.”  By not including security at its very base, we believe the government may unintentionally be underemphasizing its critical importance, especially given the ongoing debate about data security in the cloud.
That debate has raised multiple questions, ranging from who controls the data and where it resides to who has access to it, whether it is co-located with data from other firms, and more.  The concern rests, again, with CFOs and others who must put their names to any number of governance, risk management and compliance (GRC) documents required by laws and standards such as Sarbanes-Oxley, HIPAA, the Payment Card Industry’s Data Security Standard (PCI DSS), Basel II and more.
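To make those questions about data at rest concrete, here is a minimal sketch of how a client might encrypt records before they ever leave the corporate firewall, so that whatever a DaaS vendor stores is unreadable without the client-held key. It assumes Python with the third-party cryptography package; the record contents and key handling are purely illustrative, not drawn from any vendor’s actual practice.

    from cryptography.fernet import Fernet  # third-party package: pip install cryptography

    # Generate a symmetric key and keep it inside the corporate firewall.
    # (In practice the key would live in a key-management system, not in source code.)
    key = Fernet.generate_key()
    cipher = Fernet(key)

    # Encrypt a record before it is transmitted to, or stored by, the DaaS vendor.
    record = b"customer_id=1017,plan=unlimited,minutes=412"  # hypothetical call record
    encrypted = cipher.encrypt(record)

    # Only the client, holding the key, can recover the plaintext.
    assert cipher.decrypt(encrypted) == record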

The Trusted Cloud


I propose a solution for implementing outsourced data analytics:  I believe such work may be accomplished through the establishment of a trusted cloud, evolving beyond the government’s definition of a private cloud, in which the cloud infrastructure is owned or leased by a single organization and is operated solely for that organization.  Market dynamics, which we have observed first-hand for years, demonstrate to us that the private cloud may not go far enough in meeting the real-world demands that potential customers increasingly place on their cloud services vendors.
The government’s private cloud definition, for example, does not specifically address questions raised in a recent teleconference by Forrester Research analyst Chenxi Wang.  She said companies considering cloud-based outsourcing must ask a full range of questions, including those about the security of data, both at rest and in flight, as well as how they will verify compliance procedures and what happens when service-level agreements are not met.
Again, the data’s security must be paramount in any cloud-based scenario.  It’s one thing to have your MP3 files stored somewhere in the cloud; you know you can quickly recover them when you need to, and you don’t particularly care where that “somewhere” is.  But companies dealing with corporate-critical information can’t afford such a laissez-faire attitude; for them, the private cloud may provide the reliability and comfort needed to implement DaaS or another cloud-based service.
Specifically, building the trusted cloud may begin with choosing a vendor partner to handle the technical side.  The chosen company can host your data and analyze it at its site, or you can allow it to access the data behind your corporate firewall.  The firm is also responsible for hardware deployment and upkeep.  Most importantly, the firm is chosen as the preferred vendor because the end user may have an already-established relationship with it, may know a colleague who is using the firm to provide similar services, or may perceive value because the vendor has a presence in the community where the company is based.  As such, there’s a level of trust with that type of vendor that another, even larger, vendor may not engender; in some cases, as a Gartner analyst recently implied, sheer size could work against the largest vendors, especially if customers’ data is located at multiple, undisclosed locations.
DaaS can easily address these concerns.  Vendors can host the data themselves, at their site; companies requiring the analytic processes obviously place a level of trust in the vendor by allowing it to access the data initially, and transmitting it offsite via a secured network does not infringe on that trust.  Alternatively, a vertical market-specific partner, a managed service provider focused on a particular industry, may be able to host the data and run the analytics, becoming a DaaS vendor in its own right after licensing the software from the developer.  Finally, companies that want DaaS services performed on their information, yet do not want the information leaving their location, can simply have a server installed onsite and allow the vendor to access it via a secure private link, permitting the analytical work to be done onsite and maintaining control while still gaining the benefits of outsourcing.
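As one illustration of that last arrangement, the sketch below shows how a vendor might reach an onsite analytical database over an encrypted link and pull back only aggregated results, leaving the raw data behind the client’s firewall. It assumes a PostgreSQL-compatible warehouse and the psycopg2 driver; the host, table, column and credential names are hypothetical and not taken from the examples above.

    import psycopg2  # third-party PostgreSQL driver: pip install psycopg2-binary

    # Connect over TLS to a server sitting behind the client's firewall.
    # sslmode="verify-full" enforces encryption and verifies the server's identity.
    conn = psycopg2.connect(
        host="analytics.client.example",       # hypothetical onsite server
        dbname="warehouse",
        user="daas_vendor",
        password="example-password",           # in practice, pulled from a secrets store
        sslmode="verify-full",
        sslrootcert="/etc/ssl/client-ca.pem",  # CA certificate supplied by the client
    )

    # Run the analysis where the data lives; only summary results leave the site.
    with conn, conn.cursor() as cur:
        cur.execute(
            "SELECT offer_code, SUM(revenue) FROM transactions GROUP BY offer_code"
        )
        for offer_code, total_revenue in cur.fetchall():
            print(offer_code, total_revenue)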
There are, of course, several caveats in all of this, not least of which is that the entire definition of cloud computing and cloud services seems to be evolving by the day; the government’s own RFI says, “Cloud computing is still an evolving paradigm. Its definitions, use cases, underlying technologies, issues, risks, and benefits will be refined in a spirited debate by the public and private sectors. These definitions, attributes and characteristics will evolve and change over time.”
Spirited debates notwithstanding, companies requiring the benefits of a here-and-now DaaS implementation should review a full list of questions, including:

  • How much data do I need analyzed?  (Typically, clients have environments, e.g., data warehouses, holding less than 100TB of data.)
  • How long do I want to outsource this capability, when do I want to bring it in-house, and how does that compare with my need for immediate answers?
  • How quickly will I begin receiving those answers?
  • How much money will I save over the course of a contract by having someone else do the work, as opposed to hiring more staff and buying more equipment?
  • What has the vendor done to tangibly demonstrate that they can be trusted?  Have they put a guaranteed SLA contract into writing?  Have they shown me exactly where my data will be stored?  Will it be held on the same machine as data from other firms or will I have a dedicated server?

Finally, one additional point:  implementing DaaS through a trusted cloud will not make administration entirely hands-off.  Decisions will still have to be made, such as expanding or contracting capacity depending on the amount of data being analyzed.  Moreover, for any DaaS service to succeed, a little upfront planning is still needed, and it can go a long way.  Given the importance of the data, however, working with a trusted partner should be an easy step to accept.  DaaS implementations have been delivering quantifiable results for years; having them delivered via the trusted cloud may enable you to take advantage of its benefits while moving beyond where many believe cloud computing is today.
