More and more organizations need to move large unstructured data sets across the world quickly and easily as part of their global media workflows or for big data analytics using Hadoop. But many are finding that traditional transfer methods often fail under the weight of today’s large data volumes and distributed networks.

Conventional methods for moving files over the Internet, like FTP and HTTP, are still the default means of transferring data, but they are highly inefficient at moving large files over high-latency, high-bandwidth networks. And as organizations with big data analytics initiatives look to new, shared storage options, one critical step is often forgotten: how will they move that data?
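The inefficiency comes largely from TCP itself, which FTP and HTTP both ride on: a single connection's throughput is capped at roughly the window size divided by the round-trip time, regardless of how much bandwidth the link offers. The sketch below illustrates that ceiling; the 64 KB window (the classic default without window scaling) and the 100 ms transcontinental RTT are assumed values for illustration, not figures from this article.

```python
# Rough upper bound on single-connection TCP throughput:
#   throughput <= window_size / round_trip_time
def max_tcp_throughput_mbps(window_bytes: float, rtt_seconds: float) -> float:
    """Best-case throughput of one TCP connection, in megabits per second."""
    return (window_bytes * 8) / rtt_seconds / 1_000_000

# On a 1 Gbps link with 100 ms RTT, a 64 KB window caps the transfer
# at about 5 Mbps -- well under 1% of the available bandwidth.
print(max_tcp_throughput_mbps(64 * 1024, 0.100))
```

Longer distances mean higher RTT and an even lower ceiling, which is why transfers that work fine inside a data center stall when the same data has to cross the world.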
