November 30, 2012 - Amazon Web Services (AWS) has previewed a new enterprise data warehouse service called Amazon Redshift, which the company says will scale to petabytes of data on demand. The company separately launched new hardware configurations for in-memory analytic processing.
Amazon says pricing for Redshift begins at $1,000 per terabyte per year, far less than the cost of buying a comparable solution. The service is compatible with common SQL query tools, and a customer can launch an instance remotely from the AWS Management Console. Amazon says the turnkey service lets users avoid the high capital and fixed costs of building and operating a data warehouse. All components are monitored and redundantly backed up by Amazon, the company says.
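For illustration, because Redshift presents a PostgreSQL-compatible SQL interface, a standard client library should be able to query a running cluster. The sketch below uses Python's psycopg2 driver; the endpoint, database name and credentials are placeholders, not values from Amazon's announcement.

import psycopg2

# Hypothetical cluster endpoint and credentials; in practice these come
# from the AWS Management Console after a cluster has been launched.
conn = psycopg2.connect(
    host="examplecluster.abc123xyz.us-east-1.redshift.amazonaws.com",
    port=5439,  # Redshift's default port
    dbname="dev",
    user="admin",
    password="example-password",
)

cur = conn.cursor()
# Ordinary SQL runs against the warehouse much as it would against PostgreSQL.
cur.execute("SELECT COUNT(*) FROM sales WHERE sale_date >= '2012-01-01'")
print(cur.fetchone())
cur.close()
conn.close()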
Large-scale, long-term data warehouse projects can be inefficient assets to own and operate, and the service route for enterprise data warehouses follows an outsourcing trend seen across other components of information technology. In this case, Amazon says the system, which uses columnar data storage, compression and a high-performance network, has outperformed standard relational data warehouses in internal tests, though it warns that performance will vary depending on requirements.
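To see why a columnar layout suits analytic workloads, consider the toy Python sketch below. It is a conceptual illustration only, not a depiction of Redshift's internals: an aggregate over one column only needs to touch that column's values, and storing similar values contiguously also compresses well.

# Toy illustration of row-oriented vs. column-oriented scans.
rows = [
    {"order_id": i, "region": "us-east", "amount": float(i) * 1.5}
    for i in range(100_000)
]

# Row-oriented: every field of every row is visited to sum one column.
row_total = sum(r["amount"] for r in rows)

# Column-oriented: the 'amount' values sit together, so the same
# aggregate reads roughly one-third of the data in this example.
columns = {
    "order_id": [r["order_id"] for r in rows],
    "region": [r["region"] for r in rows],
    "amount": [r["amount"] for r in rows],
}
col_total = sum(columns["amount"])

assert row_total == col_total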
Separately, Amazon has begun offering Elastic Compute Cloud (EC2) hardware configurations as a hosted service with in-memory storage for high-performance analytics. One EC2 configuration comes with 20GB of solid-state storage; a disk-based configuration with 117GB of RAM and up to 48TB of disk is also available.
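A configuration like this would be launched the same way as any other EC2 instance. The sketch below uses the boto library of the period; the AMI ID, key pair and instance type name are assumptions for illustration rather than details from the announcement.

import boto.ec2

# Connect to a region and request one of the storage-heavy instances.
conn = boto.ec2.connect_to_region("us-east-1")

reservation = conn.run_instances(
    "ami-00000000",               # placeholder AMI ID
    instance_type="hs1.8xlarge",  # assumed name for the high-storage type
    key_name="my-keypair",        # placeholder key pair
    min_count=1,
    max_count=1,
)
print(reservation.instances[0].id)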
Another product called Data Pipeline will route stored information across tiers of Amazon storage including S3.
The news came at Amazon's re: Invent conference in Las Vegas.