Hadoop Gets a Lot More Channel Friendly
Posted on Wednesday Jun 13th 2012 by Michael Vizard.
Filed under: Cloud Computing
Separate moves by VMware and Pentaho make Hadoop more accessible to the channel
While the amount of interest in Big Data at the moment may far exceed the actual number of IT projects being launched using the technology platform, it’s clear that technologies such as Hadoop are creating a significant new opportunity for the channel.
Solution providers looking to take advantage of those opportunities should take note of two significant developments this week. The first is a new Project Serengeti effort led by VMware that makes it possible to configure and provision a Hadoop cluster in less than 10 minutes.
The second is a move by Dell to extend its portfolio of Hadoop offerings to include analytics and data integration from Pentaho, which Dell is bundling with servers that are configured to run the Hadoop distribution managed by Cloudera.
According to Eddie White, executive vice president of business development at Pentaho, the company already has a close working relationship with Cloudera, so its alliance with Dell essentially represents the creation of a new Hadoop ecosystem. There is no shortage of Hadoop ecosystems these days, so solution providers have multiple options. The real challenge they face is figuring out when all the interest in Hadoop will turn into actual project opportunities and, once those projects are launched, finding the Hadoop expertise they will need to be successful.
In addition, it can be hard to find the right person within an organization to sponsor such a project. Some database administrators are still hostile toward Hadoop because it’s not based on SQL. At the same time, a lot of companies have invested in data warehousing applications that may very well be rendered obsolete by Hadoop. There’s also a relative shortage of Hadoop applications, though that void is expected to be filled rapidly. There is, however, also a shortage of data scientists with the requisite expertise to actually make use of a Big Data application.
The successful Hadoop solution provider ideally needs to find a business executive who is savvy enough about the potential value of Big Data to make the investment. More often than not, that executive is sitting in a marketing department that is frustrated with its limited ability to analyze the massive amounts of data being generated by social networks. Alternatively, that executive could be in charge of aggregating a massive amount of unstructured machine-to-machine (M2M) data. In both instances, existing technologies can’t cost-effectively address the challenge.
White says the Hadoop market opportunity has already reached $1 billion and shows no immediate signs of abating in terms of its growth potential. Given the open source nature of the technologies involved, a huge portion of those dollars is being generated by IT services. The good news is that there are companies such as Cloudera that provide services around Hadoop. So the fastest way to tap into the opportunity may be to resell those base-level services as a first step toward gaining Hadoop competency.
Hadoop is one of those technologies where right now there is a lot of mystery and, as the saying goes, wherever there is mystery there is profit.