The market for high performance computing is huge and growing. Estimates of the total HPC market range from $21 billion to $29.4 billion.
The cloud category is the fastest growing category in the market at 14%-18% CAGR, and yet it represents an extremely small percentage of the market. Some analysts suggest less than 20% of current HPC workloads are delivered via the cloud and for good reason.
Many HPC workloads are not ready to run on today's cloud architectures, and most public cloud HPC offerings effectively support only HPC workloads without demanding communications and I/O requirements. And yet, in the near future, one can see how cloud computing has the potential to disrupt the HPC market in the same way it has disrupted the enterprise software market.
Today, HPC growth, in general, is restricted by architectural complexity, availability of trained personnel and budgetary issues. The HPC discussion typically focusses on technical computing such as large scientific and engineering problems, often regarded as the domain of academia, government agencies and large multinationals.
Historically, this made up 75% of the HPC market (IDC, 2015). However, big data applications and workflows are driving faster growth in the rest of the market: high performance business computing.
This part of the market has not suffered from austerity measures in the government and academic sectors; it has been embracing the cloud and expanding down into the medium-size and SME segments over the past five years. It is these SMEs that will ultimately lead the disruption of the HPC market.
HPC capacity has typically not been viewed as a viable option except by those with substantial budgets or in-house expertise. As such, there is a significant unmet need for cost-effective, easy-to-use HPC capacity for SMEs.
Similarly, larger organisations with dedicated HPC infrastructure often face deadline pressure that pushes demand beyond peak capacity. Due to the set-up times and capital expenditure required, these organisations find it difficult to burst capacity for overflow ("surge") workloads.
Public cloud providers including Microsoft and Google are already eating away at the lower end of the HPC market, providing on-demand solutions for loosely coupled workloads such as 3D image rendering, while specialist HPC cloud providers like Penguin Computing, Univa, and Sabalcore are moving up the HPC value chain.
A key inflection point is coming in the HPC market. Public clouds are beginning to incorporate lower latency interconnects, larger memory, enhanced security, heterogeneous resources and more cost-effective choices on operating environments. At the same time, ISVs are architecting their software for deployment in the cloud to exploit the new business, grow the market and gain competitive advantage. Democratising HPC is key for cloud adoption.
If cloud service providers and ISVs make HPC in the cloud easy to use, it will transform the HPC market.
(About the author: Professor Theo Lynn is a professor of digital business at Dublin City University and is the principal investigator at the Irish Centre for Cloud Computing and Commerce. He is a researcher on the European Union Horizon 2020 project, CloudLightning, developing an architecture for self-organizing, self-managing heterogeneous clouds.)