HP Discover: HP tightens embrace of Hadoop

HP is making efforts to expand its reach into the market for “Big Data” analytics (extracting useful business intelligence out of large and often chaotic mixes of data), using its hardware, software and services capabilities.

Much of this play involves helping customers leverage Apache Hadoop, an open-source distributed-computing framework that clusters many simple commodity servers to process massive workloads. It has become one of the most popular platforms in the quickly growing space of unstructured-data analytics.
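
For readers unfamiliar with the programming model Hadoop popularized, the sketch below is a minimal word-count job written for Hadoop Streaming, which lets the map and reduce steps be plain scripts that read stdin and write stdout. It is purely illustrative and not tied to any HP or Vertica product; the file name and the cluster invocation shown in the comments are assumptions.

    #!/usr/bin/env python
    # wordcount.py - minimal Hadoop Streaming word-count sketch (illustrative only).
    # Test locally:  cat input.txt | python wordcount.py map | sort | python wordcount.py reduce
    # On a cluster (jar name and paths are hypothetical):
    #   hadoop jar hadoop-streaming.jar -input /data/text -output /data/counts \
    #     -mapper "wordcount.py map" -reducer "wordcount.py reduce" -file wordcount.py
    import sys

    def mapper():
        # Emit one "word<TAB>1" line per word; Hadoop shuffles and sorts by key.
        for line in sys.stdin:
            for word in line.split():
                print("%s\t1" % word.lower())

    def reducer():
        # Input arrives grouped by key, so a running total per word is enough.
        current, total = None, 0
        for line in sys.stdin:
            word, count = line.rstrip("\n").split("\t", 1)
            if word != current:
                if current is not None:
                    print("%s\t%d" % (current, total))
                current, total = word, 0
            total += int(count)
        if current is not None:
            print("%s\t%d" % (current, total))

    if __name__ == "__main__":
        mapper() if sys.argv[1:] == ["map"] else reducer()

Packaged offerings such as HP's AppSystem and the Cloudera, Hortonworks and MapR distributions supply the cluster that runs jobs like this; they do not change the underlying model.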

The company announced on Monday a hardware architecture designed to run Hadoop and unveiled consulting services to help companies use the framework. It also announced that Vertica – a company it bought in March 2011 – will integrate the latest release of its analytics platform (Vertica 6) with Hadoop.

Vertica 6 – also launched on Monday – can ingest data from a variety of sources, of which Hadoop is now one, HP says. This ability to work with many different interfaces comes from its FlexStore architecture, according to HP.

In a statement, the company said this architecture enables integration or federation with Hadoop, Autonomy (another recent analytics acquisition) or “any other structured, unstructured or semi-structured data source.”

In addition to Hadoop, the latest version of Vertica adds native support for parallel execution of R (a programming language developed specifically for data analytics), and better support for implementations on cloud infrastructure or for Software-as-a-Service delivery.

HP made all of the announcements at its annual Discover conference in Las Vegas.

The hardware architecture for Hadoop unveiled on Monday is called HP AppSystem for Apache Hadoop. It consists of an 18-node cluster of HP’s latest ProLiant Gen8 DL380 servers and HP networking gear, all managed through the HP Insight Cluster Management software.

The vendor can deliver the system as a “turnkey appliance” for customers that do not have the resources or the desire to design and build a Hadoop system in-house.

HP has developed reference architectures, tools and whitepapers to support the top Hadoop distribution vendors: Cloudera, Hortonworks and MapR. These companies provide commercial products and services associated with Hadoop besides distributing Hadoop itself.

Cloudera, one of the three, also partners with a range of major HP competitors, including Acer, Dell, NetApp, SGI, Oracle and Supermicro. Hortonworks’ partner list includes NetApp, SeaMicro and Teradata, as well as Vertica, among others.

MapR counts EMC, Cisco, IBM, Teradata and VMware among its partners, in addition to many more.

Autonomy, whose founder and former CEO Mike Lynch was let go in May, is also now more tightly integrated with Hadoop. HP can now embed the unit’s Intelligent Data Operating Layer (IDOL) 10 engine in each server node in a Hadoop cluster.

Rafiq Mohammadi, CEO of Autonomy Promote, an Autonomy unit, said: “IDOL 10 is now part of the Hadoop network. Enterprises can drop an IDOL instance right into a Hadoop node.”

IDOL is the “heart” of Autonomy’s infrastructure software, the company says. The secret sauce is the company’s proprietary structure for storing data, optimized for fast processing and data retrieval.

Close coupling with IDOL 10 gives users access to hundreds of its functions, including automatic categorization, clustering, hyperlinking and more.

Lynch's departure was announced on the same day HP announced its Q2 fiscal 2012 earnings and a new restructuring plan that included laying off or sending into early retirement about 27,000 of its staff around the world.

Related images

  • Rafiq Mohammadi, CEO of Autonomy Promote, HP (left), and Colin Mahony, VP and general manager, HP Vertica
