Report: Hadoop market to reach US$21bn by 2018

The market for hardware, software and services supporting Apache Hadoop deployments will grow at an average annual rate of nearly 55% between now and 2018, according to a newly published report by Transparency Market Research, an Albany, New York-based research and consulting firm.

 

The analysts said the Hadoop market would grow from its 2012 size of US$1.5bn to about $21bn in 2018, driven by “exponential” growth in the amount of unstructured data generated by private and public organizations.

 

Hadoop is the most popular open-source framework for processing large volumes of data in parallel across clusters of commodity servers. It was created in 2005 by an engineer at Yahoo. The framework uses the MapReduce programming model, which originated at Google.
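
To make the model concrete, below is a minimal word-count sketch written against Hadoop's Java MapReduce API: the map phase emits a count of one for each word in its slice of the input, and the reduce phase sums those counts per word. The class names and input/output paths are illustrative assumptions, not code from the report.

```java
// Illustrative Hadoop MapReduce word count (a sketch, not from the report).
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Map phase: emit (word, 1) for every word in this task's input split.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Reduce phase: sum the counts emitted for each distinct word.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Job job = Job.getInstance(new Configuration(), "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // input directory in HDFS
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // output directory in HDFS
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

A job like this is typically packaged as a JAR and submitted with the "hadoop jar" command; the framework then schedules the map and reduce tasks across the cluster's commodity servers.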

 

As organizations increasingly want to make use of all the data they have accumulated in their data centers, Hadoop is gaining momentum because it offers a relatively inexpensive way to crunch through massive data sets quickly.

 

However, there is a shortage of professionals skilled enough to help companies realize the value of the framework. For this reason, about half of the revenue from the Hadoop market in 2012 came from services, according to Transparency. Hardware had the second-largest market share.

 

While the analysts expect services to continue to drive the biggest portion of the market through 2018, they say software's share will grow rapidly and will overtake hardware in terms of revenue by 2017. Transparency attributes this to continuous technological change in the software market.

 

North America has so far seen the highest rate of Hadoop adoption, but Europe is considered an emerging Hadoop market, and leading players such as Cloudera and Hortonworks (both of which offer Hadoop distributions and related commercial products and services) have been focusing on the region.

 

Cloudera currently has the largest revenue share in the Hadoop market, the analysts said. However, Hortonworks, MapR and Greenplum are expected to become powerful competitors over the long term.

 

The government sector is the largest consumer of Hadoop-related products and services, according to the report. It is followed by the banking, financial services and insurance vertical.

 

Those are followed by healthcare and life sciences, and then retail. The telecommunications sector is currently in the initial stages of Hadoop implementation and is expected to become a significant user around 2014.
