

Intel launches own Hadoop distribution


Intel Corp. has announced its own distribution for Apache Hadoop, the collection of open-source software for performing data analytics on clustered hardware.


The flagship product is called “Intel Manager for Apache Hadoop”, which the company says it has built “from the silicon up” for high performance and security. The distribution provides full encryption with support for the Intel AES New Instructions (AES-NI) in Intel Xeon processors.
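AES-NI is exposed to software as an ordinary CPU feature flag. As a hedged illustration (not part of Intel's distribution, and Linux-specific), a host can be checked for hardware AES support by parsing `/proc/cpuinfo`:

```python
def has_aes_ni(cpuinfo_text: str) -> bool:
    """Return True if the 'aes' CPU flag appears in /proc/cpuinfo text (Linux)."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            # The flags line lists one token per CPU feature; AES-NI is "aes".
            return "aes" in line.split()
    return False

# Typical usage on a Linux host:
# with open("/proc/cpuinfo") as f:
#     print(has_aes_ni(f.read()))
```

Software such as an encrypting file system can use a check like this to fall back to slower software AES when the instructions are absent.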


Boyd Davis, VP and general manager of Intel's Datacenter Software division, said people and machines were producing vast amounts of information that could be used to enrich our lives in many ways through analytics. "Intel is committed to contributing its enhancements made to use all of the computing horsepower available to the open-source community to provide the industry with a better foundation from which it can push the limits of innovation and realize the transformational opportunity of big data."


While there are multiple solutions for using clustered commodity hardware to perform analytics on large data sets, the Hadoop framework has emerged as the most popular one. Companies that provide popular distributions of the open-source software include Cloudera, MapR and Hortonworks.
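The programming model that made Hadoop popular is MapReduce. As a rough sketch of the idea (in plain Python rather than Hadoop's Java API), the canonical word-count job maps each line to (word, 1) pairs and then reduces the pairs by summing per key:

```python
from collections import defaultdict

def map_phase(lines):
    # Mapper: emit a (word, 1) pair for every word, as a Hadoop mapper would.
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def reduce_phase(pairs):
    # Reducer: sum the counts for each distinct word.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)
```

In a real cluster the framework shards the input across nodes, shuffles the mapper output so each reducer sees all pairs for its keys, and handles node failures; the simplicity of this model is a large part of Hadoop's appeal.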


Intel's silicon-based encryption support for the Hadoop Distributed File System means companies can analyze data sets securely without compromising performance, according to Intel. Analytics performance is further improved by optimizations the chipmaker has made for the networking and I/O technologies in its Xeon platform.


Intel has also put some thought into simplifying deployment, configuration and monitoring of analytics clusters, helping system administrators roll out new applications. This is done through the Intel Active Tuner for Apache Hadoop, which automatically configures the infrastructure for optimal performance, Intel says.
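Intel has not published how Active Tuner works, but as a purely hypothetical sketch of what automatic cluster configuration involves, a tuner might derive per-node task concurrency from the hardware specs (the function name and heuristics below are illustrative, not Intel's):

```python
def suggest_task_slots(cores: int, ram_gb: int, ram_per_task_gb: int = 2) -> int:
    """Pick a concurrent-task count bounded by both CPU and memory (toy heuristic)."""
    by_cpu = max(1, cores - 1)                  # reserve a core for the DataNode/OS
    by_mem = max(1, ram_gb // ram_per_task_gb)  # how many tasks fit in RAM
    return min(by_cpu, by_mem)                  # the tighter constraint wins
```

A real tuner would also weigh workload characteristics, disk and network throughput, and JVM heap sizing, but the principle is the same: translate measured hardware capacity into configuration values instead of leaving administrators to guess.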


The chipmaker has partnered with a number of vendors to integrate the software into future infrastructure solutions and to work on deploying it in public and private cloud environments. The partners include Cisco, Cray, Dell, Hadapt, Infosys, SAP, SAS, Savvis, Red Hat and many more.


Intel's announcement came on the same day data storage giant EMC unveiled its own Hadoop distribution, called “Pivotal HD”, which natively integrates the vendor's Greenplum database technology with the Hadoop framework.

Related images

  • An Intel Xeon processor die

