Cloud is the new norm for managed compute, data and software services. One way or another, most organizations are moving away from on-premises IT and into the cloud. But while some reports suggest cloud adoption is accelerating, the nuance is that a centralized “mega cloud” will be supplemented with many distributed “mini clouds” closer to the edge that can better serve changing cloud workloads.

Why aren’t central clouds enough? Most web traffic today is generated by central clouds and delivered through content delivery networks (CDNs) all over the world. The role of the CDN is to cache content closer to users. Without CDNs, the internet simply wouldn’t work.

While we’ve all heard about Netflix’s Amazon deployment, many people remain unaware that Netflix has clusters deployed in 250 locations worldwide that serve content directly to viewers. Without them, the Netflix user experience would be unbearable.

Also important is that cloud workloads are changing. Enterprise applications, transactional processing and analytics, and the Internet of Things (IoT) are all very different from traditional Google and Facebook web traffic. With these new cloud workloads:

  • Traffic is bi-directional, with sensor data, updated documents, transactions, and more flowing upstream.
  • Unlike cached CDN content, data persists for years and is backed up in multiple locations.
  • There is more structured data: telemetry, messages, data streams and transactions.
  • There is more complex computation, such as image recognition, anomaly detection, or analyzing and summarizing raw data to reduce upstream bandwidth.

All this means we need a new type of edge, one that is capable of processing data in real time rather than simply caching it. And for that processing to be effective, this edge requires ample context: richer historical data and a wider “perspective” (the number of users or devices it serves).
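As a concrete illustration of processing rather than caching, here is a minimal sketch, in Python, of the kind of aggregation an edge node might perform: it takes a window of raw sensor readings, computes a compact summary, flags outliers, and upstreams only the result. The function names, field names, and thresholds are hypothetical, not taken from any particular product.

    from statistics import mean, pstdev

    # Hypothetical edge-side aggregation: reduce a window of raw readings from one
    # device to a small summary record, so only summaries and outliers go upstream.
    def summarize_window(device_id, readings, deviation_limit=5.0):
        mu = mean(readings)
        # Flag readings far from the window mean (fixed threshold for simplicity).
        anomalies = [r for r in readings if abs(r - mu) > deviation_limit]
        return {
            "device": device_id,
            "count": len(readings),            # raw volume stays at the edge
            "mean": round(mu, 3),
            "stdev": round(pstdev(readings), 3),
            "anomalies": anomalies,            # only outliers travel upstream
        }

    # Thousands of raw readings collapse into one record a few dozen bytes long.
    print(summarize_window("sensor-42", [20.1, 20.3, 19.8, 35.2, 20.0]))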

Clouds will become slower, while the edge speeds up

A significant trend is that many cloud providers are moving farther away from metro areas to cut costs and better handle power and cooling challenges. The problem is that the added distance creates higher latencies. And because long-distance links are expensive, network bandwidth to the cloud cannot grow at the same rate as cloud users and services.

The edge can help combat this latency problem. Last-mile carrier networking is speeding up. Google drove a gigabit to the home at $70 per month, and other carriers, like AT&T, are following suit. 5G wireless is around the corner. Soon, every device will be able to run gigabit traffic. An average city with one million users and devices will require 1,000 terabits of aggregate bandwidth to those devices, while the upstream links to the cloud can carry several orders of magnitude less.
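A back-of-envelope check of those figures, assuming round numbers of one million devices at one gigabit each and a roughly 1,000x reduction from edge processing (both assumptions, not measurements):

    # Rough arithmetic behind the bandwidth figures above (assumed round numbers).
    devices = 1_000_000
    per_device_bps = 1e9                          # 1 Gbit/s at the last mile

    edge_tbps = devices * per_device_bps / 1e12   # aggregate edge bandwidth in Tbit/s
    print(f"last-mile aggregate: {edge_tbps:,.0f} Tbit/s")    # -> 1,000 Tbit/s

    # If edge processing reduces data ~1,000x before upstreaming (assumed ratio),
    # the long-haul links to the central cloud carry only about 1 Tbit/s.
    print(f"upstream to cloud:   {edge_tbps / 1000:,.0f} Tbit/s")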

But for many applications, practical constraints mean data needs to be analyzed at the edge. For instance, cities are beginning to adopt new traffic lights with built-in video cameras and sensors. There is no way all this data will go upstream to the cloud for analysis. It has to be stored and analyzed closer to the edge, both because of bandwidth constraints and because immediate alerting requires low latency.

So where is the edge? Peter Levine’s talk on the “end of cloud” is about this workload migration from the central cloud to the edge. Levine suggests much of the computation will happen on the device itself, which may send some data back to the cloud.

Many devices will be disposable or constrained in power and footprint. Do we really want to store historical data on those? Just remember the last time you lost a phone and had to track down contacts and chat histories. Others say the edge is their company’s on-prem enterprise deployment. But the right location for edge mini-clouds seems to be on the network backbone and in the metro area, where economies of scale can be balanced against latency and bandwidth considerations.

Certainly our cars and homes will be much smarter, but if we want accurate models drawn from them, artificial intelligence and machine learning must aggregate statistics from many independent device instances and experiments. Therefore, the area covered by the edge cannot be too small: a hazard reported by one car can be interpreted in several different ways, but many cars reporting a hazard at the same location or under the same conditions is far more significant and far less open to misinterpretation, because there is more data from which to determine what is really going on.

What this all means is the edge needs to be a smaller version of the cloud rather than a variation of on-prem IT. It should support multi-tenancy to accommodate multiple users and apps. It should also have enough compute, storage and networking capacity to handle peak requirements and new workloads. The edge must deliver abstract services that are easy to consume. It should automatically back up data or move workloads to near-edge locations or central clouds, and multiple locations should be managed as one logical entity.

The right location for the edge is within carrier metro networks, where bandwidth is effectively unlimited. Since data in fiber travels roughly 100 miles per millisecond, the added latency will be unnoticeable for most applications, while the experience will be far better than reaching back to a central cloud.
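A quick sketch of the latency math, assuming roughly 100 miles per millisecond of fiber propagation and ignoring routing and queuing overhead (both simplifications, and the distances are illustrative):

    # Round-trip propagation delay at ~100 miles per millisecond (assumed figure).
    def round_trip_ms(miles, miles_per_ms=100):
        return 2 * miles / miles_per_ms

    print(round_trip_ms(30))      # metro edge ~30 miles away   -> ~0.6 ms
    print(round_trip_ms(1000))    # central cloud ~1,000 miles  -> ~20 ms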

Edge solutions should also focus on much greater density and efficiency. There won’t always be room for thousands of servers running inefficient stacks, so we will see solutions that make better use of hardware: CPUs, GPUs, flash, and more.

Where does that leave us? The cloud is the future. On-premises IT likely will be relegated to maintaining tough-to-move legacy infrastructure and applications, while the cloud will become more distributed, with the edge gaining significant importance. Cloud providers will certainly look to capitalize on the edge, but it is a different challenge, and one that opens up opportunities for agile companies that can create dense, fast, mini versions of a modern cloud.
