How HPE Designed The Machine to Handle Challenging Big Data Projects
After 14 years of research and development, Hewlett Packard Enterprise has unveiled The Machine, which it describes as a supercomputer with 160TB of main memory. More precisely, it is a memory-driven high-performance computer: it makes memory, not processor chips, the workhorse of the computing architecture, according to HPE. HPE designed The Machine to take on the challenging computing tasks demanded by a big data-dominated world. The Machine has enough memory and processing power to crunch the volume of data contained in 160 million books, which should let it handle data-intensive tasks such as health record analysis, genomics, weather forecasting, oil and gas exploration and quantum mechanical calculations. Read on to learn more about The Machine’s capabilities.
Understanding the ‘Data Dilemma’
To understand The Machine, one must first understand what HPE calls the “Data Dilemma”: processing speed isn’t accelerating quickly enough to keep pace with the volume of data being created, HPE says. Today’s technology relies on a basic chip architecture that is 60 years old, and computers use 90 percent of their resources just moving data between memory and storage.
What Is Memory-Driven Computing?
The Machine uses memory-driven computing, which gives every processor in a system access to the system’s entire pool of memory, rather than dedicating memory to particular processors or applications. According to HPE, today’s approach creates inefficiencies and throttles data analysis; memory-driven computing aims to produce much faster, more capable machines.
It’s Designed to Handle the Biggest of Big Data Analytics Chores
HPE believes today’s computing environment isn’t ready for Big Data. Memory is too far from the processing power, data is too big and machines are not powerful enough. The Machine was designed to be the world’s biggest single-memory system, reliably holding and processing massive amounts of information in as little time as possible.
The Machine Has 160TB of Shared Memory
This Is a Linux-Based Technology
What The Machine’s Future Might Look Like
Looking to the Community for Help
HPE has said in numerous postings about The Machine that it wants—and perhaps even needs—the help of the broader computer science community. So The Machine is entirely open source and available for the community to tweak and improve as time goes on. It’s unclear, however, how many organizations are working behind the scenes on The Machine.
Security and Detection Could Be Improved
HPE says the security community could benefit greatly from The Machine, noting the growing number of ways hackers can target companies and the number of internet of things devices coming online. Because it can analyze and cut through data more quickly, The Machine should do a better job of detecting problems, HPE says.
Some More Ways The Machine Can Help
HPE described other ways in which The Machine could benefit industries. The Machine could one day analyze far more information in real time to limit delays in air travel, conserve fuel and help pilots circumvent bad weather. In health care, The Machine’s ability to crunch so much data could help doctors diagnose and treat problems in a fraction of the time.
Major Questions Remain
While HPE believes The Machine could profoundly impact the world, it is still a prototype and the project is in its infancy. It’s unknown how much The Machine will cost, when it will launch or whether any companies have signed on as customers. In any case, The Machine is not a mass-market product: typically, a few such ultra-high-performance machines are sold to U.S. government agencies, university research centers and enterprises with deep pockets and special needs.