Hadoop-based MapR targets European enterprises

California-based software company MapR, which this year saw its approach to Hadoop adopted by Google for its Compute Engine, is kicking off operations in Europe.

It opened offices in London and Germany this week, placing new employees on the ground to target sales in the region and to build on the relationships it has already formed with partners such as EMC and Cisco.

MapR was formed about three years ago, and one of its co-founders came from Google. Unlike distributions that build on Hadoop purely with open-source code, MapR charges for its platform, promising an enterprise-grade product.

MapR VP of Marketing Jack Norris said the company spent two of its three years in “stealth mode”, working on innovations to its code to give its Hadoop-based Big Data analytics software an edge.

Norris said MapR has tried to make Hadoop easier to use, with customizable dashboards, self-healing, snapshots and fully automated failover.

“This is what is really taking Hadoop to the next level,” Norris said.

Apache Hadoop is an open-source software framework for data-intensive distributed applications. It implements the map/reduce programming model and provides a distributed file system that stores data directly on the compute nodes.

It was first designed to support distributed processing for Yahoo’s Nutch search engine project. Today Hadoop is used by Facebook, eBay and Google, among others.
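To make the map/reduce model concrete: a job is expressed as a map step that emits key-value pairs and a reduce step that aggregates the values for each key, and Hadoop runs both steps in parallel across the cluster. The sketch below is an illustrative word-count example written in Python in the style of a pair of Hadoop Streaming scripts; the file names mapper.py and reducer.py are assumptions made for this sketch, not something described in the article.

    # mapper.py - illustrative Hadoop Streaming-style mapper (file name assumed for this sketch)
    import sys

    for line in sys.stdin:
        for word in line.strip().split():
            # emit one (word, 1) pair per word; Hadoop sorts pairs by key before the reduce step
            print(f"{word}\t1")

    # reducer.py - illustrative Hadoop Streaming-style reducer (file name assumed for this sketch)
    import sys

    current_word, count = None, 0
    for line in sys.stdin:
        # input arrives sorted by key, one "word<TAB>count" pair per line
        word, value = line.rstrip("\n").split("\t", 1)
        if word == current_word:
            count += int(value)
        else:
            if current_word is not None:
                print(f"{current_word}\t{count}")  # total for the previous word
            current_word, count = word, int(value)
    if current_word is not None:
        print(f"{current_word}\t{count}")

Run against files held in the distributed file system, two small scripts like these execute in parallel on the nodes where the data already lives, which is what lets the framework scale to very large data sets.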

More interestingly, Norris said, MapR’s recent growth has come from the financial services industry, major retailers, manufacturers and government agencies.

“These companies are all looking at leveraging new data sources, part of it unstructured, which can range from clickstream data to social media to sensor data,” Norris said.

“In manufacturing this could be information from industrial equipment to increase uptime. MapR allows for self-healing, which means you can automatically recover and do rolling upgrades. It involves a real-time response to business operations and business situations; it is not about analysing what happened last quarter,” Norris said.

Norris said people will be willing to pay for MapR’s advanced features, which make it more applicable to a broad set of mission-critical applications.

“We also have some services we provide, and a network of partners who can connect with our customers,” Norris said.

“What we do is take the open-source software of Hadoop and combine it with our own innovations, while still including the open-source components. Things like HBase, Hive and ZooKeeper are part of the MapR distribution. It is basically a big data platform that you can support a variety of operations on top of.”

Norris said that recently the company has seen customers building their own applications and algorithms on top of the MapR platform.