Technology Breakthroughs: The Power of Memory-Driven Computing

Hewlett Packard Enterprise (HPE) has unveiled the world's largest single-memory computer. The prototype contains 160 terabytes (TB) of memory, enough to simultaneously work with the data held in every book in the Library of Congress five times over, or approximately 160 million books. It has never been possible to hold and manipulate whole data sets of this size in a single-memory system. The prototype is the latest milestone in HPE's research project, The Machine, which aims to deliver a new paradigm called Memory-Driven Computing: an architecture custom-built for the big data era.
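
As a rough sanity check on the book comparison (my own arithmetic, not an HPE figure; it assumes decimal units, 1 TB = 10**12 bytes), the claim implies about 1 MB of text per book, a plausible average for a plain-text volume:

    # What the 160-million-book comparison implies per book
    # (assumption: decimal units, 1 TB = 10**12 bytes; not an HPE figure).
    TB = 10**12
    prototype_memory = 160 * TB        # the 160 TB prototype
    books = 160_000_000                # "approximately 160 million books"
    print(prototype_memory / books)    # -> 1,000,000 bytes, i.e. roughly 1 MB per book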

Based on the current prototype, HPE expects the architecture to scale easily to an exabyte-scale single-memory system and, beyond that, to a nearly limitless pool of memory: 4,096 yottabytes. For context, that is 250,000 times the size of the entire digital universe today.
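
Reading that comparison backward (again my own back-of-envelope arithmetic, with the size of the digital universe as an assumption rather than an HPE figure), it implies a digital universe of roughly 16 zettabytes, in line with widely cited estimates from around 2017:

    # The 250,000x comparison implies a digital universe of roughly 16 ZB
    # (assumption: decimal units; the ~16 ZB estimate is not from HPE).
    YB = 10**24
    ZB = 10**21
    memory_pool = 4096 * YB            # the 4,096-yottabyte pool
    print(memory_pool / 250_000 / ZB)  # -> ~16.4 ZB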

With that amount of memory, it will be possible to work simultaneously with every digital health record of every person on Earth, every piece of data from Facebook, every trip taken by Google's autonomous vehicles, and every data set from space exploration, getting to answers and uncovering new opportunities at unprecedented speed.


Karan Sorensen