Report: Hadoop market to reach US$21bn by 2018

The market for hardware, software and services supporting Apache Hadoop deployments will grow at an average annual rate of nearly 55% between now and 2018, according to a newly published report by Transparency Market Research, an Albany, New York-based research and consulting firm.

The analysts said the Hadoop market would grow from its 2012 size of US$1.5bn to about $21bn in 2018, driven by “exponential” growth in the amount of unstructured data generated by private and public organizations.

Hadoop is the most popular open-source framework for processing large volumes of data in parallel across clusters of commodity servers. It was created in 2005 by an engineer at Yahoo. The framework uses a programming model called MapReduce, which originated at Google.
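To make the MapReduce model concrete: Hadoop jobs are typically written in Java as a map function and a reduce function. The sketch below is a minimal word-count job in the spirit of Hadoop's standard tutorial example, written against the org.apache.hadoop.mapreduce API. The map step emits a count of 1 for every word in its input split, the framework groups those counts by word, and the reduce step sums them. The class names and command-line input/output paths are illustrative, not something taken from the report.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

    // Map phase: for every word in the input split, emit the pair (word, 1).
    public static class TokenizerMapper
            extends Mapper<Object, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        public void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(word, ONE);
            }
        }
    }

    // Reduce phase: sum all the counts emitted for each word.
    public static class IntSumReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {
        private final IntWritable result = new IntWritable();

        @Override
        public void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            result.set(sum);
            context.write(key, result);
        }
    }

    // Driver: configures the job and points it at the input and output paths
    // passed on the command line.
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class); // local pre-aggregation on each mapper
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

Packaged into a jar, such a job would be submitted with something like "hadoop jar wordcount.jar WordCount /input /output", with the framework distributing the map and reduce tasks across the cluster's nodes.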

As organizations increasingly want to make use of all the data they have accumulated in their data centers, Hadoop is gaining momentum because it offers a relatively inexpensive way to crunch through massive data sets quickly.

However, there is a shortage of professionals skilled enough to help companies realize the value of the framework. For this reason, about half of the revenue from the Hadoop market in 2012 came from services, according to Transparency. Hardware had the second-largest market share.

While the analysts expect services to continue to account for the biggest portion of the market through 2018, they say software's share will grow rapidly and will overtake hardware in terms of revenue by 2017. Transparency attributes this to continuous technological change in the software market.

North America has so far seen the highest rate of adoption of Hadoop solutions, but Europe is considered an emerging Hadoop market, and leading players such as Cloudera and Hortonworks (both of which offer Hadoop distributions and commercial products and services built around them) have been focusing on the region.

Cloudera currently has the largest revenue share in the Hadoop market, the analysts said. However, Hortonworks, MapR and Greenplum are expected to become powerful players over the long term.

The government sector is the largest consumer of Hadoop-related products and services, according to the report. It is followed by the banking, financial services and insurance vertical.

That vertical is in turn followed by healthcare and life sciences, and then retail. The telecommunications sector is currently in the initial stages of Hadoop implementation and is expected to become a significant user around 2014.

