

SGI crunches big data loads in-memory


High-performance computer maker SGI has created a big data number cruncher for its partner SAP - a single-node appliance that can process multiple terabytes of information in-memory within the HANA system.

The appliance was unveiled at the Sapphire Now conference in Orlando, Florida, on June 3, 2014.

It uses a scale-up, single-node architecture with coherent shared memory, which SGI claims has not been achieved before.

This means that enterprise resource planning (ERP), data warehousing and other processor-intensive applications can run on a single, in-memory system that can handle loads of up to 6TB.

This avoids the limits and complexity of clustered appliances, SGI said.

It means that data centers can perform analytics, transactions and processes in real time.

As a consequence, it is possible for banks, manufacturers and retailers to make faster and smarter decisions, according to SGI.
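At a high level, the in-memory analytics described here amounts to scanning column data held in RAM rather than reading it from disk. A minimal, hypothetical sketch of the idea (the table and function names are illustrative, not SAP HANA code):

```python
# Minimal sketch of in-memory columnar analytics (hypothetical, not SAP HANA code).
# Columns are kept as contiguous in-memory arrays, so an aggregation is a single scan.
from array import array

# Hypothetical sales table, stored column-wise in RAM.
region = ["EU", "US", "EU", "APAC", "US"]
revenue = array("d", [120.0, 340.5, 99.9, 210.0, 55.5])

def total_revenue_by_region(regions, revenues):
    """Aggregate revenue per region with one pass over the columns."""
    totals = {}
    for r, v in zip(regions, revenues):
        totals[r] = totals.get(r, 0.0) + v
    return totals

print(total_revenue_by_region(region, revenue))
```

Keeping whole columns resident in memory is what lets such a system answer aggregate queries without the disk I/O round trips of a conventional warehouse - the basic premise behind in-memory stores like HANA.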

President and CEO of SGI Jorge Titinger said: “They’re striving for the performance of single node systems beyond the current threshold.”

The first release of the appliance will contain an eight-socket UV system using Intel Xeon E7 8890 v2 processors coupled with 6TB of shared memory.

This is designed to scale to 32 sockets and 24TB of shared memory as a single-node system.

Expansion is achieved by adding extra sockets, SGI claimed, so the complexity of clusters is eliminated.

This means that performance increases consistently and there’s no need to redistribute data or balance I/O.
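The redistribution problem SGI is alluding to can be seen in how scale-out clusters typically place data: a row's owner is derived from a hash of its key and the node count, so adding a node changes ownership and forces rows to move. A hypothetical sketch (the hashing scheme is illustrative, not SAP's actual partitioning):

```python
# Hypothetical illustration of why scale-out clusters must redistribute data
# when a node is added: key ownership depends on the node count.
import zlib

def owner(key: str, num_nodes: int) -> int:
    """Hash partitioning: map a row key to the node that stores it."""
    return zlib.crc32(key.encode()) % num_nodes

keys = [f"row{i}" for i in range(1000)]
moved = sum(owner(k, 4) != owner(k, 5) for k in keys)
print(f"{moved} of {len(keys)} rows change owner when growing from 4 to 5 nodes")
```

In a scale-up system the picture is simpler: adding sockets enlarges a single coherent address space, so existing data stays where it is.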

Shared-memory communication is handled by SGI NUMAlink 7 interconnects.

To protect in-memory data against power loss, data and log files will be stored on disk using dual NetApp E2700 RAID arrays.

“Our customers can get the full power of HANA now,” Titinger said.




