Enterprise Applications: Hadoop World 2011: 15 Key Takeaways
NEW YORK CITY: Big Data is in the house, and Hadoop is one of the key technologies making it happen. Hadoop is a framework for running applications on large clusters of commodity hardware; the framework transparently provides applications with both reliability and data motion. Hadoop implements a computational paradigm called map/reduce, in which an application is divided into many small fragments of work, each of which may be executed or re-executed on any node in the cluster. It also provides a distributed file system that stores data on the compute nodes, delivering very high aggregate bandwidth across the cluster. Both map/reduce and the distributed file system are designed so that node failures are handled automatically by the framework.

The emergence of Big Data and the opportunity it represents was a key message coming out of the Hadoop World conference here. On Nov. 7, Cloudera announced that it had secured $40 million in new venture capital funding, and the next day at Hadoop World, Accel Partners announced a $100 million fund to invest in Big Data companies. At the show, Ping Li, a partner at Accel, announced the Accel Big Data Fund, calling it "incredibly important given the explosion of Big Data." The new initiative's goal is to fund transformative early-stage and growth companies throughout the Big Data ecosystem, from next-generation storage and data management platforms to a wide range of revolutionary software applications and services, such as data analytics, business intelligence, collaboration, mobile, and vertical applications.
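To make the map/reduce paradigm described above concrete, here is a minimal single-process word-count sketch in Python. The function names and structure are purely illustrative and are not Hadoop's actual API; in a real Hadoop job, each fragment would be processed on a separate cluster node and the framework would shuffle the mapped pairs to the reducers.

```python
from collections import defaultdict

def map_phase(fragment):
    # Map step: emit a (word, 1) pair for every word in one fragment of input.
    # In Hadoop, many of these run in parallel, one per input split.
    return [(word, 1) for word in fragment.split()]

def reduce_phase(pairs):
    # Reduce step: sum the counts for each word across all mapped pairs.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

# Fragments standing in for input splits distributed across cluster nodes.
fragments = ["big data hadoop", "hadoop cluster", "big cluster"]
mapped = [pair for frag in fragments for pair in map_phase(frag)]
result = reduce_phase(mapped)
# result == {"big": 2, "data": 1, "hadoop": 2, "cluster": 2}
```

Because each map call depends only on its own fragment, any fragment can be re-executed on a different node if one fails, which is the property that lets the framework handle node failures automatically.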