9 key rules for transitioning a data and analytics strategy to the cloud


Cloud services focused on data and analytics technology have matured considerably in the last few years.

Now, with varied cloud service options for storage and compute, enterprises have better control over cost. With flexible pricing models available, a growing number of enterprises are adopting the cloud to modernize their data and analytics architecture.

Cloud transformation is a good opportunity to build a strong data foundation supported by key capabilities such as real-time processing, higher performance, scalability, reliability, AI integration and ease of delivery. However, there are several questions to ask before starting the data warehouse cloud transformation journey:

  • Which cloud platform should I consider?
  • Should it be IaaS or PaaS or SaaS-based?
  • Which cloud services provider should I consider?
  • Should the approach be to re-host, re-platform, re-engineer, or something else?

Further confusion is created by the proliferation of cloud services and options available across all cloud platforms, including those from Microsoft, Amazon and Google.

The following nine key rules should be considered when transforming an on-premises data and analytics environment to the cloud.

Rule 1: Understand the current state of data and analytics functions

Perform a comprehensive analysis of the existing data warehouse to understand the functionality it delivers, along with its pros and cons. The output of this analysis is documentation of the existing data environment.

This documentation provides visibility into current-state complexity and supports a confident transformation project plan. It also helps identify commonality in data, challenges and design patterns, enabling an optimized, consolidated cloud transformation.

Since many larger data environments have been built up over 20-plus years, automating this discovery and assessment process is a must.
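One building block of automated discovery is querying the warehouse's own system catalog to inventory tables, views and other objects. The sketch below is a minimal, hypothetical example that uses SQLite's `sqlite_master` as a stand-in for a real warehouse's `information_schema`; the query would differ per database platform.

```python
import sqlite3

def inventory_objects(conn, catalog_query):
    """Run a metadata query and return each catalog row as a dict."""
    cur = conn.execute(catalog_query)
    cols = [d[0] for d in cur.description]
    return [dict(zip(cols, row)) for row in cur.fetchall()]

# Demo: an in-memory SQLite database standing in for a legacy warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER, amount REAL)")
conn.execute("CREATE VIEW big_sales AS SELECT * FROM sales WHERE amount > 100")

objects = inventory_objects(
    conn, "SELECT type, name FROM sqlite_master ORDER BY name"
)
for obj in objects:
    print(obj["type"], obj["name"])
```

In practice the same pattern is pointed at `information_schema.tables`, ETL repositories and report catalogs, and the resulting inventory feeds the assessment documentation.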

Rule 2: Understand business stakeholder pain points and their needs

Plan sessions with the business team to understand their current challenges and what is critical to them. This helps ensure that key business needs are delivered as part of the cloud transformation. As you talk to the business units, it is important to present cloud capabilities and use cases relevant to the business.

Rule 3: Choose the cloud platform and its services

Choosing a cloud platform is generally driven at the enterprise level, but choosing the cloud services is driven by the functionality the data teams require.

The important task is listing the functionality in the existing environment and mapping it to services on Azure, AWS or GCP. This is also an opportunity to bring in cloud services that improve existing processes, such as automating manual data uploads, refreshing data more frequently and adding cognitive services.
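The mapping exercise can be captured in a simple capability-to-service matrix. The sketch below is illustrative only: the service pairings are common examples, not recommendations, and should be validated against each vendor's current catalog.

```python
# Illustrative capability-to-service map; validate against current
# vendor catalogs before relying on it.
SERVICE_MAP = {
    "batch ETL":      {"azure": "Data Factory", "aws": "Glue",       "gcp": "Dataflow"},
    "data warehouse": {"azure": "Synapse",      "aws": "Redshift",   "gcp": "BigQuery"},
    "visualization":  {"azure": "Power BI",     "aws": "QuickSight", "gcp": "Looker"},
}

def map_capabilities(required, platform):
    """Return the candidate service on one platform for each required capability."""
    return {cap: SERVICE_MAP[cap][platform] for cap in required}

print(map_capabilities(["batch ETL", "data warehouse"], "aws"))
```

Because the map is keyed by capability rather than by vendor, it also makes multi-cloud combinations explicit: each capability can be sourced from a different platform.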

You should also be open to multi-cloud options, which involve running a combination of services across clouds, such as data on AWS, visualization on Azure and machine learning on GCP.

Rule 4: Define capacity and usage control mechanisms

Capturing the utilization of the existing infrastructure will help in defining the baseline capacity required on the cloud.

Unlike an on-premises development environment, every ETL test or report run in the cloud incurs cost. It is essential to define "which services are to be used, by whom and until when," and to set up toolsets for automated shutdown and alert generation for development environments.
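The shutdown policy itself can be expressed as a small, testable rule, separate from the cloud-specific stop call. The sketch below assumes a hypothetical tagging scheme (`env=dev`) and working hours; the actual stop action (for example, stopping a VM or pausing a warehouse) is platform-specific and is only indicated by a print here.

```python
from datetime import time

# Hypothetical policy: dev resources may run only on weekdays, 07:00-19:00.
WORK_START, WORK_END = time(7, 0), time(19, 0)

def should_stop(tags, weekday, now):
    """Return True if a dev-tagged resource is outside its allowed window."""
    if tags.get("env") != "dev":
        return False                 # never auto-stop non-dev resources
    if weekday >= 5:                 # 5 = Saturday, 6 = Sunday
        return True
    return not (WORK_START <= now < WORK_END)

# The real stop call (e.g. a cloud SDK's stop-instance API) plugs in
# where this sketch prints.
resources = {"etl-dev": {"env": "dev"}, "dw-prod": {"env": "prod"}}
for name, tags in resources.items():
    if should_stop(tags, weekday=5, now=time(22, 0)):
        print(f"stopping {name}")
```

Keeping the policy as plain code makes it easy to unit test before wiring it to a scheduler and alerting.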

Rule 5: The right time to consolidate, standardize, optimize and diversify

Choosing the right application function for lift and shift can be a quick way to experience and benchmark the difference between cloud and on-premises approaches. Standardizing technology platforms, such as ETL products or database platforms, is also a good option to consider.

Re-engineering to bring in newer capabilities, such as quicker delivery of information or machine learning, is a good way to deliver a better experience to users.

Rule 6: Define an evolution journey

Define a cloud roadmap that grows incrementally. To start, you can plan to re-host from on-premises to an IaaS model, then move the compute to PaaS, and then the data store. Defining an evolutionary approach, especially for larger environments, ensures the transformation carries less risk.

Rule 7: Execute a foundation solution

Build a foundation solution in a six- to 12-week timeframe, bringing together all the cloud components in the architecture. The prerequisites for the foundation solution are determining the key functions, the user group and the best use cases. The processes and tools set up as part of the foundation solution should support the transformation work that follows.

Rule 8: Set up a factory model and utilities for the transformation

At a basic level, the cloud transformation should ensure that the functionality delivered by the existing data applications is made available. Such a transformation can be run in a factory model, since what needs to be delivered and the process to follow (from the foundation solution) are well defined.

Automation of the conversion process, such as converting on-premises ETL jobs or reports to the cloud platform, should be built; this applies to scenarios where a one-to-one conversion is desired.
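For one-to-one conversions, a large part of the work is mechanical dialect translation. The sketch below shows the idea with two illustrative rewrite rules (legacy `NVL` and `SYSDATE` mapped to ANSI equivalents); a production converter would use a proper SQL parser rather than regular expressions.

```python
import re

# Illustrative one-to-one rewrite rules for moving legacy SQL toward a
# cloud warehouse dialect. A real converter needs a full SQL parser.
REWRITE_RULES = [
    (re.compile(r"\bNVL\s*\(", re.IGNORECASE), "COALESCE("),
    (re.compile(r"\bSYSDATE\b", re.IGNORECASE), "CURRENT_TIMESTAMP"),
]

def convert_sql(sql):
    """Apply each rewrite rule in order and return the converted statement."""
    for pattern, replacement in REWRITE_RULES:
        sql = pattern.sub(replacement, sql)
    return sql

legacy = "SELECT NVL(amount, 0), SYSDATE FROM sales"
print(convert_sql(legacy))
# -> SELECT COALESCE(amount, 0), CURRENT_TIMESTAMP FROM sales
```

In a factory model, a rule table like this is extended object by object, and each converted artifact is validated against the source system's output before cutover.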

Rule 9: The right time to bring in governance processes

Pay attention to the right time to put together a business glossary, metadata management, data governance and DevOps processes. Creating an inventory of the objects in the cloud and implementing processes for better management, such as data governance and DevOps, will bring a lot of value to the new environment.

The objective is to look at all nine key aspects and ensure you are prepared beforehand for a successful cloud transformation journey.
