Making the Most of Big Data Opportunities
Filed under: Cloud Computing
Need for simpler ways to deploy Hadoop creates opportunities for the channel
While there’s a lot of talk these days about moving everything into the cloud, there’s just as much talk about Big Data. What’s ironic about that is that it’s prohibitively expensive to transfer massive amounts of data in and out of the cloud. Because of that, a lot of IT organizations are starting to make decisions about where they will host Big Data platforms such as the open source Apache Hadoop framework. Given the need to access that data on a fairly regular basis, it turns out that many of them are opting to deploy those systems on premises rather than in the cloud.
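To put some rough numbers behind the "prohibitively expensive" claim, here is a back-of-envelope sketch. The link speed and per-GB egress price are illustrative assumptions, not quotes from any specific provider:

```python
# Back-of-envelope estimate of what it takes to move a big dataset
# in or out of a public cloud. All rates here are assumptions.

def transfer_days(dataset_tb: float, link_mbps: float) -> float:
    """Days needed to move `dataset_tb` terabytes over a link of `link_mbps` megabits/sec."""
    bits = dataset_tb * 1e12 * 8          # terabytes -> bits
    seconds = bits / (link_mbps * 1e6)    # megabits/sec -> bits/sec
    return seconds / 86400

def egress_cost(dataset_tb: float, usd_per_gb: float) -> float:
    """Rough egress charge for pulling the dataset back out of the cloud."""
    return dataset_tb * 1000 * usd_per_gb

# A 100 TB dataset over a fully saturated, dedicated 1 Gbps link:
print(f"{transfer_days(100, 1000):.1f} days")   # about 9 days of transfer time
# Pulling it back out at an assumed $0.10/GB egress rate:
print(f"${egress_cost(100, 0.10):,.0f}")        # $10,000 per round trip
```

At roughly nine days and five figures per round trip, it’s easy to see why organizations that touch their data often prefer to keep Hadoop close to where the data lives.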
In fact, we’re starting to see increasing recognition of Hadoop as an application development platform. Rather than ship data all across the network, IT organizations are looking to bring applications to a private, on-premises cloud where the data most often resides.
But before any of that can happen, large numbers of them are looking for help setting up Hadoop systems because, oddly enough, setting up a Hadoop cluster can be more complicated than actually running one. One of the first vendors to realize the opportunity this creates is Hewlett-Packard, which this week at its Discover 2012 conference unveiled an HP AppSystem for Apache Hadoop appliance that simplifies and speeds the deployment of Hadoop.
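To give a sense of why setup is the hard part: even a minimal Hadoop deployment in this era means hand-editing several XML configuration files on every node (plus formatting the NameNode and wiring up passwordless SSH) before a single job can run. A small illustrative fragment, with a hypothetical hostname, showing the kind of per-cluster plumbing an appliance pre-bakes:

```xml
<!-- core-site.xml: tells every node where to find the HDFS NameNode.
     The hostname and port here are placeholders for illustration. -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://namenode.example.com:9000</value>
  </property>
</configuration>

<!-- hdfs-site.xml: how many copies of each block HDFS keeps. -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>3</value>
  </property>
</configuration>
```

Multiply that across dozens of nodes and dozens of tunable properties, and the appeal of an appliance that ships pre-configured becomes obvious.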
Interest levels in Hadoop are skyrocketing, but one of the bigger issues holding back adoption is that most organizations don’t have a lot of expertise. That obviously creates a lot of opportunity for solution providers in the channel. With the release of the HP AppSystem for Apache Hadoop appliance, HP is trying to lower the bar for solution providers to get into the Hadoop business.
According to Duncan Campbell, vice president of worldwide marketing for converged infrastructure in the HP enterprise group, one of the other benefits of this approach is that those same appliances can optionally be configured to run the latest release of Vertica, the analytics database that HP acquired last year.
Ultimately, HP is trying to combine Hadoop, Vertica and its most recent Autonomy acquisition into a suite of offerings that address a variety of Big Data opportunities. In fact, Campbell says HP research shows that 98 percent of the organizations the company surveyed admitted they can’t deliver the right information at the right time. That’s a serious indictment of IT that from a business perspective creates a major conversation opener for solution providers, especially as more organizations start to treat data as an asset that needs to be maximized.
That’s obviously hard to accomplish without some underlying framework to manage large amounts of data. The good news is that frameworks for managing all that data are rapidly proliferating. The better news, at least for the channel, is that few organizations have the expertise needed to use them, which creates a vacuum that solution providers should be able to readily fill as fast as they can come up to speed on all the nuances of Big Data themselves.