
Hewlett Packard Enterprise (HPE) has presented a prototype of “The Machine”, the computer of the future it announced in June 2014. It is a single-memory computer, meaning that its 160 TB (terabytes) of memory, more than any current server offers, does away with the division between RAM and mass storage. However, it doesn’t yet use memristors, the technology under development that should best combine the benefits of RAM and mass storage.
The Machine project was started to go beyond current computer architectures, which have been refined over the last few decades but increasingly show their limits. In the next decade, data management needs will be so great that a new type of computer will be required, and the division between RAM and mass storage is one of the foundations that HPE’s project aims to make obsolete.
The Machine is the application of the concept of Memory-Driven Computing, an architecture that puts memory, not the processor, at its center. A single pool of non-volatile memory, which retains data once written yet is as fast as current computers’ RAM, allows data to be handled much more efficiently, greatly reducing processing time.
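What this means for software is easiest to see in code. The Machine’s actual programming model isn’t described here, so the following is only a minimal C sketch of what byte-addressable non-volatile memory enables, assuming a persistent region exposed as a memory-mappable file; the path /mnt/pmem/records and the record layout are hypothetical, and the file is assumed to exist and be large enough.

```c
/* Minimal sketch: treating storage as memory on a byte-addressable
 * non-volatile region. Assumes a persistent-memory device exposed as
 * a mappable file; /mnt/pmem/records is a hypothetical path. */
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>
#include <unistd.h>

struct record {
    long id;
    char name[56];
};

int main(void) {
    int fd = open("/mnt/pmem/records", O_RDWR);
    if (fd < 0) { perror("open"); return 1; }

    /* Map the region directly into the address space: no read()/write()
     * calls, no serialization layer between "RAM" and "mass storage". */
    struct record *r = mmap(NULL, sizeof *r, PROT_READ | PROT_WRITE,
                            MAP_SHARED, fd, 0);
    if (r == MAP_FAILED) { perror("mmap"); close(fd); return 1; }

    /* Ordinary stores persist: the data structure *is* the stored form. */
    r->id = 42;
    strcpy(r->name, "memory-driven");
    msync(r, sizeof *r, MS_SYNC);   /* flush to the persistence domain */

    munmap(r, sizeof *r);
    close(fd);
    return 0;
}
```

On real persistent memory the flush step matters: msync() is the portable call, while dedicated libraries (PMDK’s libpmem, for instance) use CPU cache-flush instructions instead. Either way the point is the same: the in-memory data structure is also the durable form, with no serialization step in between.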
HPE has been developing memristor technology, a type of non-volatile memory that allows both very fast reading and very fast writing, but it has suffered delays. For the moment, The Machine prototypes use another type of memory: modules called NVDIMMs that literally put DRAM and flash memory together on the same module.
NVDIMMs are a good solution for current systems, and HPE has in fact brought them to market in some of its business systems, but they are an adaptation of existing technologies. For The Machine they are a temporary solution, and it’s unclear when it will be possible to use memristors, a technology better suited to next-generation computers.
None of this means the other components of future computers are being neglected: another leap forward comes from using optical/photonic connections to overcome the limits of electrical ones. In essence, The Machine transmits data among its components through tiny optical fibers, using light to move data much faster while consuming much less energy.
HPE and Cavium, which is developing The Machine together with HPE, have made very optimistic statements about the possibilities of this type of architecture. The Big Data prospects are especially interesting, with the possibility of building a new generation of supercomputers whose computing speed and power far exceed today’s. When costs come down, it should also become possible to build personal computers based on this architecture.
The problem, as always, is the timeline. The availability of memristors will strongly affect the schedule of a project that was supposed to be ready by 2018. There is now talk of five more years, but those are still estimates, so it’s really hard to say when The Machine might go on sale.