10 Big Data Trends From the GigaOM Structure Data Conference

 
 
By Eric Lundquist  |  Posted 2013-04-01

4. The Internet of Lots of Things

This gets to the essence of big data which, aside from being pretentiously given the initial capitals of a proper noun, can really mean anything or nothing. The scale of the technology infrastructures developed by the likes of Google, Facebook and Amazon Web Services represents a new model of computing that extends from hardware through software to new applications. Hardware engineers have to reconsider what their infrastructure looks like when it is built on vast arrays of disposable hardware, software systems are built around interconnecting in-house applications with outside services, and the apps themselves may be unknown until late in the development process. The business potential of combining and analyzing customer sentiment, demographic patterns and weather trends won't become apparent until those combinations actually take place. Making the transition from a technology environment of scarcity to one of overwhelming capacity may be the most difficult shift for today's chief information officers.

5. The Emerging Platform

Hadoop is "damn hard to use," said Todd Papaioannou, the founder of Continuuity and former big data engineer at Yahoo, where he was in charge of 45,000 Hadoop servers within Yahoo's 400,000-node private cloud. As he told the GigaOM attendees, "Hadoop is hard—let's make no bones about it. It's damn hard to use. It's low-level infrastructure software, and most people out there are not used to using low-level infrastructure software." That is both the promise and the problem of Hadoop and big data. Hadoop grew out of Google's work on frameworks for data-intensive, distributed applications, while big data remains more concept than product. In between the two sits a need for programming tools, infrastructure management and enterprise-level security and compliance: in short, all the elements required to build scalable, flexible and secure computing infrastructures. This Internet-style model of computing is revolutionary, but it is also new. The platform is still emerging, and executives need to recognize that, as with all emerging platforms, many of its needs remain unmet.
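Papaioannou's point about low-level infrastructure is easier to appreciate with a concrete illustration. Below is a minimal sketch of the canonical MapReduce word count written against the standard Hadoop Java API (assuming Hadoop 2.x client libraries are on the classpath): even the simplest aggregation requires hand-written mapper and reducer classes, Hadoop-specific writable types and explicit job wiring before a single byte is processed.

```java
// Minimal sketch of the classic Hadoop MapReduce word count (assumes Hadoop 2.x
// client libraries). Shown only to illustrate how much boilerplate the raw,
// low-level API demands for a trivial aggregation.
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Mapper: emits (word, 1) for every token in its input split.
  public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Reducer (also used as combiner): sums the counts for each word.
  public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // HDFS input directory
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // HDFS output directory
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

Higher-level tools such as Hive and Pig exist precisely to hide this kind of boilerplate, and filling that gap is where much of the "emerging platform" activity is concentrated.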

6. Making the Big Shift

While Hadapt, Alteryx and other startups in the Hadoop orbit were touting their paradigm-busting, chasm-hopping, shark-jumping companies, not all was quiet on Wall Street. Oracle missed on its financial results and sent shivers through the established tech industry. Tibco, a company somewhat synonymous with enterprise infrastructure, missed big. And Dell saw its best-laid plans to go private tossed into a competing-bids game that may not include its namesake founder. So, as I sat at Chelsea Piers for the GigaOM conference, it was impossible not to think that a big shift, from up-front, long-deployment, high-service-cost enterprise software to rapid, outside-in big data running on inexpensive, disposable hardware, will upend the tech industry.

7. Discerning the Signal Versus Noise

The signal-versus-noise theme was best outlined by CIA CTO Ira Hunt. Current thinking around big data and business intelligence tends to rest on a very simplistic model: acquire lots of data in lots of formats from lots of sources, apply some business intelligence and, voila, out comes your answer. As Hunt pointed out, the volume of information available, from social networks to sensors, keeps growing, and manipulating that data has become an art in itself. In my opinion, the number of people who could be described as data analysis artists is very small, and executives who think a big data dive is all they need to reform their business are mistaken.
