Communicating Cloud and Big Data Standards

Published August 7, 2012, 9:11am EDT

Like any other industry, telecommunications has niche needs for evolving technology. Industry vendors also provide tech services and solutions, and share a particular interest in the development of public on-demand service clouds and data tools that scan social media channels for customers experiencing outages. One industry group, the nonprofit TM Forum, represents about 900 communications providers and buyers, ranging from Cisco, NTT and SAS to Ohio State University and telecoms in Croatia and Egypt.

Prompted by members a few years ago, the group dove deeper into the concerns and possibilities of emerging information management capabilities. Recently, members sandboxed data disaster recovery scenarios and announced a joint enterprise cloud standards effort between the TM Forum’s Enterprise Cloud Leadership Council and another trade group, the Open Data Center Alliance. TM Forum Chief Operating Officer Aileen Smith, a leader on that cloud council, recently discussed particular big data applications for the communications industry and how the groups don’t plan to step on any toes with their cloud standards work.

What is the background for the TM Forum’s engagement with the overall cloud marketplace?

About three years ago, we were approached by some members who wanted us to help deliver solutions and methodologies on this incredibly hot topic: telecommunications service providers looking for business models for cloud services of all kinds. They’ve always been invested in huge computing capabilities, and they have a lot of core competencies that apply directly to cloud computing, things like security, customer knowledge, customer experience and managing SLAs on an individual basis.

So what were some of your first steps?

One of the things that the TM Forum does really well is, in addition to defining requirements and the technical standards that meet those requirements, we also have a very important program called the Catalyst program. It’s effectively a live demonstration, kind of a sandbox capability, where a group of buyers comes together and defines a certain problem statement. In a very short time frame, about six months, our members … came through on a challenge to some of their vendors to work together. These were … vendors like Oracle, or a smaller cloud provider with only a handful of employees, and we generally have systems integrators involved, too, like IBM or HP. In that time with the Catalyst program, that group of vendor companies works together [with our members] to provide a valuable, live demonstration of the solution that the communications buyers had sought out. Some findings from our cloud Catalyst at our Dublin Catalyst event … involved members discussing with providers what happens when they put all their data in a third-party data center in the cloud, and then that center becomes involved in some kind of natural disaster. One of the findings was that it’s really possible to do some predictive modeling of some of those kinds of disaster events, but today that isn’t typically written into a standard cloud operation.

In June the TM Forum announced it would be working with the Open Data Center Alliance on a set of cloud standards. If I can play devil’s advocate, isn’t there a concern that you’ll just be adding to an existing pool of established or upcoming cloud standards?

It’s hugely important to all of our members because it’s not just a question of overlapping or confusing standards; it’s also because our members belong to these other industry organizations as well. It’s in nobody’s best interest to put out conflicting [sets of standards]. We have an active group that monitors what is coming out from the other standards organizations, and it’s always possible to not be aware of something another group may be working on. But we are actively plugged into organizations like the [Open Data Center Alliance] and the Cloud Security Alliance. Obviously we have to focus on where there is potential overlap. We focus on our own areas of expertise; for example, we have a long-standing competency in the area of SLA management. SLA management for cloud-based services provides a particular use case for agreements, but it’s certainly not an attempt to reinvent the wheel.

For example, to make sure who’s doing what and why, last fall we wrote a joint white paper with the DMTF [Distributed Management Task Force], basically looking at the landscape of what is already on hand and where there is risk of overlap. It has to be taken seriously because it’s not good for the industry if overlapping standards emerge.

What’s the time frame for your standards results?

Effectively we’re working with them on a few deliverables that will come out between now and our Orlando conference [in December]. Most important is probably the technical white paper, a combination of requirements, business models and case studies, which will be out in September.

That Orlando conference also has “big data” pretty prominently placed on the agenda. What are the specific opportunities and challenges of big data for communications data and networking providers?

One of the critical things for communications organizations is the topic of customer experience management. The emerging best practice in that field looks at a lot of information from the different social networking tools that are out there. In the past, you managed your customer experience in kind of a technical way. You could see what was happening in your network. For example, you’d have a fault come in from a particular geographical location. So, I could infer, at this point in time, that in this location there would be other customers calling in who are annoyed with me. Of course, that’s a valuable feed of information for a communications provider to have. However, say it’s nighttime and everyone’s asleep, or the customer isn’t impacted by the outage; then you may be overreacting, and you don’t understand the full impact on all of the customers.

The emerging trend in customer experience is to look at big feeds of data, constantly scanning the social networks, looking for information from my customers: What are they saying? Is it positive or negative? And then how should we best address it? We ran a Catalyst [sandboxing exercise] previously on this at a Dublin [conference], and it does make for really interesting results. There can be this academic approach to big data, just saying, ‘Oh, we’ve got lots of data.’ The useful approach is what you’re going to do with it.
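The approach Smith describes, scanning social feeds for customer sentiment and correlating it with a known outage's region and time window, can be sketched in a few lines. This is purely an illustrative toy, not anything from the TM Forum's work: the keyword lists, the `Post` structure and both function names are hypothetical, and a real system would use a proper sentiment model and streaming feeds rather than keyword matching over a list.

```python
import re
from dataclasses import dataclass
from datetime import datetime

# Hypothetical keyword lists for crude sentiment scoring.
NEGATIVE_WORDS = {"down", "outage", "slow", "angry", "annoyed", "terrible"}
POSITIVE_WORDS = {"fixed", "restored", "great", "thanks", "working"}

@dataclass
class Post:
    author: str
    text: str
    timestamp: datetime
    region: str

def classify(post: Post) -> str:
    """Crude keyword-based sentiment: 'negative', 'positive' or 'neutral'."""
    words = set(re.findall(r"[a-z]+", post.text.lower()))
    neg = len(words & NEGATIVE_WORDS)
    pos = len(words & POSITIVE_WORDS)
    if neg > pos:
        return "negative"
    if pos > neg:
        return "positive"
    return "neutral"

def impacted_customers(posts, outage_region, start, end):
    """Negative posts from the outage region during the outage window --
    a rough proxy for which customers actually noticed the outage."""
    return [
        p for p in posts
        if p.region == outage_region
        and start <= p.timestamp <= end
        and classify(p) == "negative"
    ]
```

The point of the second filter is exactly Smith's caveat: a fault seen in the network at 3am may generate no negative posts at all, so the social feed tempers the purely technical view of impact.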
