According to MetaScale, combining Hadoop, a framework for the distributed processing of large data sets across clusters of computers, with Kognitio's in-memory analytics will let companies perform Big Data analytics faster, more easily and more cost-effectively.
Delivered via private cloud or on-premises implementations, MetaScale said its approach, which it calls "Big Data as a Service," can replace legacy mainframe or batch processing. MetaScale is also touting the technology's breadth of functionality.
“Even if they already have an enterprise data warehouse, the MetaScale/Kognitio partnership will allow them to rapidly deploy and conduct additional analytic projects, without imposing a greater burden on their existing systems and on a scale they once thought impossible,” said a MetaScale spokesperson.
"MetaScale understands how to make Hadoop practical," said Steve Millard, CEO of Kognitio. Millard noted that the Kognitio Analytical Platform, working as an analytical accelerator to Hadoop, is capable of producing answers to data-intensive queries up to 80 percent faster than with traditional systems. This, he said, can reduce the "time-to-insight" by hours.
"The agile implementation of business analytics is increasingly important to companies. By blending Hadoop's processing power with Kognitio's analytic ability and delivering it through the cloud, MetaScale will enable companies to obtain the insights they require, without bogging down their existing infrastructure," said Phil Shelley, CEO of MetaScale.
This story first appeared on the Insurance Networking News web site.