Deep learning’s preeminence in the enterprise today is significant for two reasons: it represents the most advanced expression of machine learning’s capabilities, and its progressive learning prowess has made it virtually synonymous with Artificial Intelligence.
Deep learning is at the core of the most intricate AI capabilities including speech recognition, image and video recognition, speech generation, and aspects of robotics. It’s unparalleled at swiftly analyzing data at scale in keeping with the variety characteristic of big data.
When one considers the massive influx of unstructured data besieging the enterprise, the ascending interest in AI, and the pivotal context deep learning provides to nearly any use case, it’s clear 2018 is the year this technology’s utility will finally supersede that of classic machine learning.
“Traditional machine learning is more like statistics,” indico CEO Tom Wilde reflected. “It’s powerful, but it has limitations. Deep learning is the breakthrough. The way neural networks work has made the ability for the computer to solve problems so much more robust.”
Noisy, Unstructured Data
Horizontally, deep learning has empowered contemporary computing environments with the deluge of unstructured (and semi-structured) data that is principally manifest in three forms.
“Text, image and speech are the three segments that have been utterly reinvented by deep learning,” Wilde remarked.
Other unstructured data common to the enterprise includes transaction data and the myriad data types that support it in verticals like finance, telecommunications, insurance, and others. In this instance, deep learning enables “understanding customer behavior through transactions,” according to Razorthink CEO Gary Oliver, which leads to micro-segmentation for an array of competitive advantages. Whether impacting these traditional use cases or more sophisticated AI ones involving conversational interfaces (for customers and employees), deep learning provides numerous advantages unmatched by classic machine learning.
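To make the micro-segmentation idea concrete, here is a deliberately simplistic sketch: clustering customers on two hand-made transaction features. A deep model would learn far richer representations from raw transactions; plain k-means on toy features (all data and feature names below are hypothetical) merely illustrates the segmentation step.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy features per customer: [avg transaction value, monthly frequency]
customers = np.vstack([
    rng.normal([20, 30], 3, size=(50, 2)),   # frequent small spenders
    rng.normal([500, 2], 30, size=(50, 2)),  # rare big-ticket buyers
])

def kmeans(X, k, iters=20):
    """Bare-bones k-means: assign points to nearest center, recenter."""
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centers[j]
            for j in range(k)
        ])
    return labels, centers

labels, centers = kmeans(customers, k=2)
print(np.round(centers))  # two segments with distinct spend profiles
```

In practice the input to the clustering stage would be learned embeddings of transaction histories rather than two handcrafted columns.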
Its feature detection identifies points of relevance between data and business problems, it encompasses many more variables than traditional machine learning does, and it parses through data at scale more rapidly than classic machine learning can.
“Deep learning is much better at understanding the context of positive and negative sentiment in ways that you wouldn’t think to train it on,” Wilde revealed. Furthermore, the variation in the unstructured content for deep learning analytics is frequently seen in singular use cases, such as those involving the Internet of Things.
“In IoT you’ve got video, you’ve got voice and all of that data coming in,” Biotricity CEO Waqaas Al-Siddiq said. “And then remote patient monitoring is bringing in the clinical data. The voice and the video data is providing context, and the clinical data is actually providing the change in someone’s biology.” The imaging data in healthcare is a prime use case for deep learning’s propensity to derive signal from what Wilde termed “noisy data”; in 2018 that ability will be directed ever more towards speech.
Regardless of the reason for its deployment, deep learning’s primary value proposition is the peerless context of its analysis, which users can redeem for business value.
“It delivers two things,” Al-Siddiq acknowledged. “One is context and the other is increased throughput and better understanding. That’s the key: how you are able to add context.”
In text analytics and speech recognition applications, deep learning pairs with Natural Language Processing to provide a contextualized meaning not otherwise possible. The former facilitates “a deeper understanding of the context of natural language, as opposed to just the verbatim understanding of things like sentiment,” Wilde explained.
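A toy illustration of that contextual advantage follows. The lexicon, negator list, and scoring rules below are hypothetical stand-ins: a bag-of-words scorer misses negation, while even a minimal context-aware scorer, approximating what a neural sequence model learns, handles it.

```python
# Hypothetical word-polarity lexicon and negation words (illustrative only).
LEXICON = {"good": 1.0, "great": 1.0, "bad": -1.0, "terrible": -1.0}
NEGATORS = {"not", "never", "hardly"}

def bow_sentiment(text):
    """Context-free baseline: sums word scores, ignoring word order."""
    return sum(LEXICON.get(w, 0.0) for w in text.lower().split())

def contextual_sentiment(text):
    """Flips a word's polarity when a negator immediately precedes it."""
    score, negate = 0.0, False
    for w in text.lower().split():
        if w in NEGATORS:
            negate = True
            continue
        s = LEXICON.get(w, 0.0)
        score += -s if negate else s
        negate = False
    return score

print(bow_sentiment("the service was not good"))        # 1.0 -- wrong sign
print(contextual_sentiment("the service was not good")) # -1.0
```

A deep sequence model generalizes this idea beyond hand-written rules, learning such contextual effects directly from data.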
Use cases for this contextualized understanding are found in intelligent interactions with verbal interfaces such as Apple’s Siri. Al-Siddiq mentioned that several pilot programs have emerged to combine deep learning with healthcare objectives such as “medical adherence with an Amazon Alexa integration.” Other use cases involve what Oliver termed “optical character recognition”, in which organizations can “scan a document or form in, and [the system] automatically understands all the context on the form by using deep learning and processor training.”
The predominant motif for deep learning’s enterprise utility in the coming year centers around its deployment for expediting and enhancing judgment calls.
“What it’s really going to be doing is being a decision support tool,” Al-Siddiq commented.
Deep learning’s decision support capacity functions at the nexus of its ability to rapidly analyze unwieldy unstructured and semi-structured data and facilitate contextualized understanding. According to Oliver, there is growing market interest in the desire to annex deep learning into existing workflows for decision support: “The intelligence that deep learning brings provides the insight in the predictions and prescriptions that you couldn’t do without it. In many cases, customers have existing systems, workflows and business processes where they want to plug that in directly.”
The contextualized understanding, micro-segmentation, and granular analytics underpin a host of novel use cases for decision support—attesting to deep learning’s maturity.
Wilde mentioned deep learning’s role in transforming unstructured data into blockchain’s structured data, and in leveraging its analytics for aerial data to discern “if the parking lot at the mall is full or empty. Hedge funds are trying to use that to see if retailers are doing well or not.”
Al-Siddiq mentioned a future in which deep learning aids remote patient monitoring for a physician support tool analyzing biometric data “against all of the history of and knowledge of patients and doctors in the system.”
Oliver referenced deep learning’s acceleration of repetitive tasks such as automotive and medical insurance claims involving image data, in which “AI can do 98 percent of a claim because of deep learning.”
Decreased Inhibitors, Broadening Adoption
With increasing cloud options, the influx of GPUs, and the automation of data science techniques, deep learning’s conventional inhibitors are fast disappearing. Contributing to this trend is transfer learning, which enables organizations to leverage this technology with what Wilde called “one thousandth of the training data” otherwise required.
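The mechanics behind that data efficiency can be sketched minimally: a “pretrained” feature extractor is frozen and only a small linear head is trained on a handful of labeled examples. Everything below (the random frozen weights, the toy labels) is a hypothetical stand-in for a real pretrained network, not anyone’s production pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Frozen weights standing in for layers learned on a large corpus.
W_pretrained = rng.standard_normal((16, 4))

def features(x):
    """Frozen feature extractor: its weights are never updated."""
    return np.tanh(x @ W_pretrained)

# A tiny labeled dataset for the new task (toy labels for illustration).
X = rng.standard_normal((20, 16))
y = (X[:, 0] > 0).astype(float)

# Train only the head's 4 weights with plain gradient descent
# on the logistic loss; the backbone stays fixed.
w = np.zeros(4)
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-features(X) @ w))   # sigmoid head
    w -= 0.5 * features(X).T @ (p - y) / len(y)  # logistic-loss gradient

acc = np.mean((1.0 / (1.0 + np.exp(-features(X) @ w)) > 0.5) == y)
print(f"training accuracy with a frozen backbone: {acc:.2f}")
```

Because only four parameters are fit rather than millions, far fewer labeled examples are needed—the same principle, at toy scale, behind the dramatic reduction Wilde describes.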
All of these distinctions have pushed deep learning beyond machine learning to the forefront of AI’s resurgence; the foregoing use cases and others should cement its place there.