Welcome to The Machine

Where have you been, we ask, as HPE’s re-imagined system comes a little closer to reality

“This isn’t about producing six Machines as supercomputers, this is about producing the next generation of infrastructure - we would imagine every city on the planet having multiple data centers using this technology to run everyone’s lives.”

Since its announcement in 2014, news on HPE’s ambitious The Machine has been slim, with the project’s lead announcing his retirement, memristors being removed from the immediate roadmap and many suggesting that the whole thing was simply an elaborate marketing push.

But now, as HPE gears up to release its first prototype, Martin Sadler tells DCD that the company still hopes to change the world with The Machine.

It’s alright we know where you’ve been

The Machine

Source: HPE

“The world is generating ever greater amounts of data that we are not very good at extracting value from,” said Sadler, an HPE VP and the director of security and manageability at HP Labs.

“There’s a huge amount of data that we are collecting as we instrument the globe more and more that large numbers of companies really can’t get to, or if they can get to it, they don’t know what to do with it, and they certainly can’t process it fast enough or cheaply enough for them to get some real value back.

“How do we get more of this data to more of our customers in a way that they can consume it usefully?”

For HPE, the solution to this question lies in flipping the traditional processor-centric architecture found in existing computers.

“Today’s architectures are built with a processor at the center, around which everything else is peripheral, including memory and storage. You’re limited by what this one processor can do. Now with data centers, we really need to put the data at the center of the architecture,” Sadler said.

“When you think of this massive pool of data, whether you think of it as being in memory or as being in storage, it’s just a lot of it, and then you want to have a massive number of processors that all hang off that same information pool.

“It’s not like today’s cluster solutions where each little processor owns a bit of memory, and it controls what that bit of memory does. You want to flip the architecture to being memory-driven rather than processor-driven.”

To do that, one has to think “how am I going to shift all of this information back and forth between different processors and different bits of memory?” he said.

“And that leads you to think about what you will do on the fabric itself - the changes you need to make to the underlying fabric, how we move the bits around very, very efficiently. And that leads into us using a lot more photonics in the underlying architecture: the more we can get as light on fibers, as opposed to electrons on wires, the better.”

The Machine Prototype

Source: HPE

You’ve been in the pipeline, filling in time

This, Sadler insists, is a reason to get excited. “I think it is a big deal for the computing industry because every time we start locking things down and standardizing everything, innovation disappears.

“So the moment you change the game - and we think this changes the game - you allow an awful lot of innovation to happen and history has shown in the IT industry that those rapid stages of innovation are what drives massive amounts of growth for the industry as a whole.”

He continued: “I think we would see loads of companies doing very well out of a memory driven ecosystem for a class of servers or data centers, or however we might want to position this in future… I think whole loads of companies will pile in around all aspects of the stack to take advantage of being able to do stuff differently.”

This is what The Machine aims to achieve - and HPE’s hopes for it are even more ambitious.

In a marketing blitz this summer, where HPE teamed up with Star Trek Beyond to turn much of the movie into product placement for the company (when Jeff Bezos wasn’t making a cameo), The Machine was front and center.

In a promotional video, a group of Starfleet Academy cadets walk across an alien world. “At the beginning of the 21st Century, the Earth needed to find a way to keep up with the data from over 30 billion connected devices,” an instructor tells them.

“So a bold group of scientists and researchers in Silicon Valley had a breakthrough they called The Machine. It changed the basic architecture of computing, putting a massive pool of memory at the center of everything, and by doing so, it changed the world. It has been part of every new technology for the past 250 years-”

“-Everything?” a student cuts in.

“Everything,” the instructor replies, disappearing as a transporter teleports him away, presumably powered by technology from The Machine.

But The Machine may end up just like Star Trek - science fiction that does not fully translate into reality.

What did you dream?

“At some stage we may or may not actually launch something that is close to our lab prototypes as a product in its own right. But those decisions are not yet made as to when we would do that,” Sadler said.

“Our intent is still to shape some kind of product that looks quite close to the kind of prototype that we’re building, but it’s y’know… we may decide not to do that, that it’s just better to take some of the individual pieces of technology out into other product lines because we can have just as much effect with them there.

In fact, there may even be multiple machines - “we’ve talked a lot about that internally, so we certainly see that range from something that is dealing with many, many users - so incredibly multitenanted, which might be a city context where there isn’t just one administration, and many, many separate companies need to use this information. Dealing with that multitenanted world is very different from if you’re doing weather modeling, or some governmental uses at that HPC end of the spectrum. And very different again if you think of this as close to the edge.

“You could see one a little bit more geared for the telecoms industry. This is not the point where this will be one product.”

So The Machine may end up releasing as The Machine. Or it may end up releasing as various versions of The Machine. Or it may not release at all, with parts of The Machine finding their way into other products.

Sadler defends this confusion over whether The Machine is a real product or simply an R&D project and marketing exercise: “This began life as a research project in our labs, and we are still continuing with it as a research project, so it has a life as the place where we will be taking our infrastructure research over the next few years, whatever happens.

“That’s why you get mixed answers back from people, because it’s kind of both.”

Any problems - the lack of clarity, the absence of a clear product lineup - are apparently just the result of The Machine relying so heavily on emerging technologies.

Memristors, once destined for The Machine 

Source: HPE

It’s alright we told you what to dream

The Machine was originally meant to come packed with memristors, but as they did not develop as fast as HPE expected, this year’s prototype will instead sport DRAM. The prototype is expected to come with 320 terabytes of memory and more than 2,500 CPU cores - well short of the original outline for The Machine, which envisioned as much as 160 petabytes of data.

Should The Machine continue as a product, “we’ll move very rapidly from tens to hundreds of terabytes, and then you’re into systems that will maybe be using petabytes of main memory pretty shortly afterwards.”

The hope is to add memristors, or a similar product, into The Machine before any final version releases. “Non-volatile memory isn’t quite there yet, so we may be a year or two ahead of the curve when we talk about real products, because the real products you’d want to produce here would be built with non-volatile memory.”

CPUs also remain a challenge, peskily created with the prevailing computer architecture in mind. “CPUs are designed to be the center of the universe with everything hanging off them, now every processor becomes a peripheral in this world.

“We need a slightly different emphasis on functionality with the chips, so we would see some of that changing [after The Machine is released]. That’s a challenge in how we use the current chipsets to make this work well.”

The operating system was another issue that the team had to address: “When we started we had the idea that we would do our own operating system - if you’re going to change the world, go for it, change the world. We very rapidly came to the conclusion that that’s probably a little too hard, and developers are not going to find it very easy to adopt everything changing under them. 

“I think we felt that the hardware changes are going to be revolutionary enough in what we are doing, so we have really looked to do the minimal amount of changes we need to make to Linux to keep it as a very familiar development platform and operations platform.”

He added: “We probably thought we would have done more there, but I think that was a little bit blue sky when we started. That was probably us just getting ahead of ourselves a little bit, and the need to come back and be a little more realistic and get stuff out.”

You dreamed of a big star

The reason for announcing and developing The Machine so early, when some of the tech is not there to support it, is to remain ahead of the curve, Sadler said. “If you wait for everything, you’re usually too late. You can’t wait for everything to get going, because you never get going. I think we did it at the right time. I think if we did it five years ago, it would have been a great science project, but that’s all it would have been, y’know?

“I think if we had waited another five years, I think we would have found someone else had done it before us. I think we got our timing right.”

In fact, Sadler believes that The Machine was conceived at the perfect time, at the inflection point where a product such as this became possible as a sustainable business. 

“Lots of doing this is not new science. It’s not like we didn’t know some of the ideas, back in even the ‘70s. It’s just that nobody could actually build them back then. This is the point in history in which we have the engineering capability to build these kinds of systems.

“Maybe we could have built them 10 years ago, there are university projects doing things, trying different kinds of architectures. The way we have done this is in a way where it could become a product, so this is not a one-off university design to build an artificial brain or something like that where everything is just hard-wired differently and then you can’t really replicate it easily. We are doing this in a way where, in principle, something very close to this could come off a production line.”

He added that “if we hadn’t built one now, I think we would have expected competitors to be building them pretty soon.” In the future, he believes that “everyone will come around to this putting memory at the center of things, it’s an obvious idea for a huge range of applications. If Hewlett Packard Enterprise hadn’t done it, I’m sure any number of other players would have started doing something very similar.”

So welcome to the machine

With its creator Martin Fink set to retire in October, some have cast doubt over the project’s future. “I think Martin set a very clear direction for this, one that everyone bought into, and I think everyone is still on that mission. So I don’t think the mission has changed in the slightest in terms of what we’re aiming to get done,” Sadler asserts.

“As we do each little practical step along the way, I’m sure it will be different not having Martin around because he had a lot of influence on those specific steps. But we have a lot of very capable people in this program, and I think people would still see Martin as, even from the outside, as being a little bit of the shining light about what it was we were trying to do, and why we’re going in a particular direction. He is very interested in us getting to launch by the end of the year.

“This is not him retiring and out of the picture. He really is super interested in every aspect of this.”

The prototype’s launch will mark a defining moment in the story of The Machine, with the world’s reception of the product helping decide its future. “At that point, it might change from some of Martin’s original ideas in where we go, because we suddenly find that actually everyone’s really excited to apply it in a particularly different way.

“We very much want to shape where it goes in line with the feedback we get, this isn’t us revealing it and saying ‘guys we’re going to go away again for a while and do it all in secret’, this is something we do want to do as far as we can in the open!”
