5 trends that will drive data strategy, blockchain, and persistent memory practices

Last year was a huge year for technology and data strategy. The role of the chief data officer is now firmly established and will become even more prevalent in 2020, especially in financial services. We'll also see data literacy take center stage, while the term 'big data' takes a back seat.

Solutions that leverage blockchain in conjunction with IoT and AI will cut supply chain costs and boost customer experience through seamless logistics. In 2020, persistent memory will also ramp up, but adoption will be slow.

Here are some top technology and data analytics predictions for 2020.

Data literacy will take on the buzz of agile methodologies

Agile has advanced from the world of software development to a widespread project management approach, touted as the way to cope with continuous change. One of the tenets of agile is breaking down organizational silos, yet in many organizations innovation is (or should be) highly technical and data-driven, and teams lack the common language needed to communicate across those silos.

In 2020, data literacy will emerge as a mission-critical capability for organizations looking to innovate continuously around the growing volumes of data they collect. Tools will emerge that empower everyone across the organization to collaborate on the innovations they support. Data literacy will be seen as the facilitator of this dream state: non-technical employees will be able to describe their proposals to data scientists and understand the barriers to their ideas' success.

Big data takes a back seat, and enterprise data strategy takes center stage

In 2020, the term 'big data' will drift away as companies mature beyond the buzzword. Instead, they will frame their data analytics efforts in use-case-specific terms.

For example, instead of saying “we do big data,” they will say “we’re working with customer demographics, credit card statements, transactions and point of sale data, online and mobile transfers and payments, and credit bureau data to discover similarities to define tens of thousands of micro-segmentations in the customer base. We then build ‘next product to purchase’ models that increase sales and customer retention.”
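To make that concrete, here is a minimal sketch of such a workflow in Python with scikit-learn. The file name, column names, cluster count, and model choice are all illustrative assumptions rather than a reference implementation:

```python
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from sklearn.ensemble import GradientBoostingClassifier

# Assumed input: one row per customer, with demographic and transaction
# features plus a historical "next_product" label. All names are hypothetical.
customers = pd.read_csv("customer_features.csv")
features = ["age", "income", "avg_monthly_spend", "txn_count"]

# Step 1: discover micro-segments by clustering similar customers.
# Real programs use tens of thousands of segments; a small k keeps this readable.
scaled = StandardScaler().fit_transform(customers[features])
kmeans = KMeans(n_clusters=50, n_init=10, random_state=0)
customers["segment"] = kmeans.fit_predict(scaled)

# Step 2: train a "next product to purchase" model on the segment-enriched table.
model = GradientBoostingClassifier().fit(
    customers[features + ["segment"]], customers["next_product"]
)
```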

The rise of the CDO in financial services

Appointments of CDOs have risen dramatically in the last two years, most notably at financial services institutions (FSIs). In 2020, CDOs will be more prevalent in FSIs than in other industries as those firms formalize and commit to implementing an office of the CDO. This is broadly driven by FSI companies' acceptance and broader understanding of the role; the ability to demonstrate impact and value; continuing and growing budgets; and increasingly positive engagement across the C-suite.

The CDO role in FSI firms is transforming, moving from its original technical roots to encompass a broad agenda that spans data management, analytics, data science, ethics and digital transformation. More importantly, CDOs are using their high profile and pivotal role to act as change agents for the business, moving from a focus on risk mitigation toward business impact and value realization.

While the CDO's primary responsibilities have centered on regulatory compliance and operationalizing regulatory mandates, leading FSI companies are now using CDOs as enablers of business insight, strategy, and innovation, for example by developing value-adding data services built on the new foundational processes and policies.

Firms will gain measurable benefits from blockchain in conjunction with IoT and AI in logistics use cases

Today, supply chain transactions are coordinated across multiple parties and spread across multiple geographies and legal jurisdictions. Some of these parties will be organizations, some will be individuals, and some may be automated, network-connected devices. Solutions that leverage blockchain in conjunction with IoT, and potentially AI, will cut supply chain costs and boost CX through seamless logistics.

AI is a learning system: it looks at the behavior of a component over time across the ecosystem, correlates it with other data points (e.g., weather or market data), then makes prescriptive recommendations or automates decisions. Smart contracts built into the blockchain give the ability to execute code autonomously, which is ideal in an ecosystem where many parties interact but have limited relationships (and therefore limited trust) with one another.
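As an illustration of the AI half of that loop, the sketch below correlates a component's sensor history with an external signal and emits a prescriptive recommendation. The file names, column names, and the 0.7 threshold are hypothetical assumptions, not a production design:

```python
import pandas as pd

# Assumed inputs: per-shipment IoT telemetry joined with external weather data.
telemetry = pd.read_csv("shipment_telemetry.csv")  # hypothetical file
weather = pd.read_csv("route_weather.csv")         # hypothetical file
data = telemetry.merge(weather, on=["route_id", "timestamp"])

# Learn from behavior over time: correlate component vibration with weather.
risk = data.groupby("route_id").apply(
    lambda g: g["vibration"].corr(g["storm_index"])
)

# Prescriptive step: recommend rerouting where the learned correlation is strong.
for route_id, corr in risk.items():
    if pd.notna(corr) and corr > 0.7:
        print(f"Recommend rerouting {route_id}: weather-linked vibration risk")
```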

However, the utopian vision of enterprises across industries migrating their processes onto shared blockchains, and interacting in a trusted ecosystem, still seems far-fetched. Blockchain projects that are not narrow in focus continually run into scaling problems. Businesses often start with a desire for blockchain rather than with the business problem, which is frequently solvable without distributed ledger technology.

Hype around persistent memory will ramp up but adoption will remain low

Persistent memory (PM) has the potential to address the growing gap between storage and memory technologies. Specifically, system memory such as DRAM may not be big enough to process and analyze all data in real time, while storage such as solid-state drives, although cheaper, can't compete on performance.

In 2020, we'll see a ramp-up in awareness building and marketing around emerging PM technologies such as Intel Optane and ReRAM (resistive random-access memory). As a result, data-intensive enterprises will investigate and experiment with PM to decide whether it offers a significant shift in data management, processing, and economics.

In 2020, however, most initiatives will focus on specialist workloads that require ultra-low response times, such as IoT and high-frequency trading applications.

Two factors will contribute to this. First, there are economic considerations. PM chips such as Optane are slower and cheaper than DRAM, but dramatically faster and costlier than SSDs. Organizations must weigh the cost of using PM for data-centric workloads to determine whether it is economically viable, as the rough sketch below illustrates.
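A back-of-the-envelope comparison makes the trade-off visible. All figures below are rough, assumed 2020-era values chosen for illustration, not vendor pricing:

```python
# Illustrative tiers: assumed cost per GB and typical access latency.
tiers = {
    "DRAM":              {"usd_per_gb": 7.00, "latency_ns": 100},
    "Persistent memory": {"usd_per_gb": 4.00, "latency_ns": 350},
    "NVMe SSD":          {"usd_per_gb": 0.20, "latency_ns": 10_000},
}

capacity_gb = 1_024  # e.g., sizing a 1 TB working set

for name, t in tiers.items():
    cost = t["usd_per_gb"] * capacity_gb
    print(f"{name:>17}: ${cost:>8,.0f} for {capacity_gb} GB, "
          f"~{t['latency_ns']:,} ns per access")
```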

Second, there are architectural considerations: organizations may need to re-evaluate application designs to assess which data, and in which scenarios, it makes sense to move into PM, as in the sketch below.
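Concretely, a common app-direct programming model for PM is a memory-mapped file on a DAX-enabled filesystem, so loads and stores bypass the page cache. Below is a minimal sketch in Python; the mount point and record layout are assumptions, and production code would typically use a dedicated library such as PMDK rather than raw mmap:

```python
import mmap
import os
import struct

PM_PATH = "/mnt/pmem/counters.bin"  # assumed DAX-mounted PM device
SIZE = 4096

# Create/extend the backing file on the persistent-memory filesystem.
fd = os.open(PM_PATH, os.O_RDWR | os.O_CREAT, 0o600)
os.ftruncate(fd, SIZE)

# Map it into the address space; on DAX, stores reach persistent media directly.
with mmap.mmap(fd, SIZE) as pm:
    count = struct.unpack_from("<Q", pm, 0)[0]  # read a persisted 64-bit counter
    struct.pack_into("<Q", pm, 0, count + 1)    # update it in place
    pm.flush()  # flush for durability (PM libraries handle this more precisely)
os.close(fd)
```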
