In the 1990s the vast majority of computers needed nothing more than a simple fan that fit in the palm of a hand to cool their CPUs. We still cool most of our computers this way in the 21st century; the fans have just become larger and more efficient.
The problem with hardware is that electrical resistance acts as a sort of “friction,” turning part of the power a chip draws into heat. As hardware gets more powerful, it needs to dissipate more of this heat. Many high-end PC enthusiasts have built computers by hand or from bare-bones kits that include water cooling, a more efficient method of removing heat from multiple pieces of hardware. But what if we had a system with cooling that functioned a lot like our own vascular systems? Would that be more efficient?
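The relationship between resistive heating and cooling can be put in rough numbers. The sketch below uses two textbook formulas, Joule heating (P = I²R) and the steady-state temperature rise across a cooler's thermal resistance (T = T_ambient + P × θ); the wattages and θ values are hypothetical examples, not measurements of any particular CPU or cooler.

```python
# Illustrative sketch: resistive ("Joule") heating and the temperature
# a cooler must hold back. All numbers below are hypothetical examples.

def joule_heat_watts(current_a: float, resistance_ohm: float) -> float:
    """Power dissipated as heat in a resistive element: P = I^2 * R."""
    return current_a ** 2 * resistance_ohm

def junction_temp_c(power_w: float, theta_c_per_w: float,
                    ambient_c: float = 25.0) -> float:
    """Steady-state chip temperature given the cooler's thermal
    resistance theta (degrees C per watt): T = T_ambient + P * theta."""
    return ambient_c + power_w * theta_c_per_w

# A 100 W chip under an assumed air cooler (~0.5 C/W) vs. an assumed
# water loop (~0.2 C/W):
print(junction_temp_c(100, 0.5))  # air:   75.0 C
print(junction_temp_c(100, 0.2))  # water: 45.0 C
```

The point of the comparison: the same 100 W of heat lands the chip at a very different temperature depending on how efficiently the cooler moves heat away, which is why more capable hardware demands better cooling.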
The Traditional Cooling Model Is Failing Us
For most modern computers, a fan cooling the CPU is sufficient. That’s not because air cooling is especially efficient, but because everything we do on that computer can be accomplished by the CPU it currently has, and that CPU can be cooled by air. Overclocking enthusiasts opt for water cooling not because they are pushing the limits of the hardware, per se, but because they eventually push the chip to speeds it can sustain only if the cooling method is more efficient.
Ultimately, what we end up with is hardware that is “dumbed down” to work with traditional air cooling. Water cooling isn’t extraordinarily popular because you always risk leaking fluid into the system and frying the computer whenever you perform maintenance. For this reason, it’s difficult to sell the idea of water cooling to the masses. Instead, we compromise by adding more fans to pre-built systems and leaving it at that (many CPU coolers now ship with two or more fans).
What we need is a new cooling method that is simple to maintain and allows us to accelerate the process by which we can develop hardware. It’s either that, or we start coming up with magical CPUs that don’t give off a whole lot of heat. Both are possible, but a circulatory system for computers would probably be the best solution.
Circulatory Cooling vs. Water Cooling
In 2013 IBM showcased a prototype computer that runs on “blood.” The concept is rather simple: use one liquid to both distribute power to the system and remove its heat. The electrolyte is conductive, making it possible to route power wherever it’s needed. At the same time, it carries heat away from the chips to a pump and heat exchanger that cool the liquid again. It sounds like a glorified water cooler until we add one more (very important) detail: the liquid flows through channels that branch, like capillaries, into ever-smaller areas of the chip.
The human body is efficient at maintaining its temperature not only because of its sweat glands, but also because blood reaches virtually everywhere. When you get a cut, no matter where it is on your body, blood finds its way to the surface, because millions upon millions of capillaries are scattered throughout your tissues. And considering that the human brain accounts for roughly 20 percent of the body’s resting metabolic rate, it produces a lot of heat for a single organ; the blood flowing through the brain helps carry that heat away.
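Why a circulating liquid carries heat so much better than blown air comes down to one formula: the heat a fluid stream removes is Q = ṁ × c × ΔT (mass flow times specific heat times temperature rise). The sketch below compares water and air at the same mass flow; the flow rate and temperature rise are illustrative assumptions, while the specific-heat constants are standard textbook values.

```python
# Rough sketch of heat removal by a circulating fluid:
# Q = mass_flow * specific_heat * delta_T.
# The flow rate and delta_T below are illustrative assumptions.

def heat_carried_watts(mass_flow_kg_s: float,
                       specific_heat_j_kg_k: float,
                       delta_t_k: float) -> float:
    """Heat removed per second by a fluid stream: Q = m_dot * c_p * dT."""
    return mass_flow_kg_s * specific_heat_j_kg_k * delta_t_k

WATER_CP = 4186.0  # J/(kg*K), specific heat of liquid water
AIR_CP = 1005.0    # J/(kg*K), specific heat of air

# 0.01 kg/s of fluid warming by 5 K as it passes over the hardware:
print(heat_carried_watts(0.01, WATER_CP, 5))  # water: ~209 W
print(heat_carried_watts(0.01, AIR_CP, 5))    # air:   ~50 W
```

Kilogram for kilogram, water soaks up roughly four times the heat of air, and because it is far denser, a thin capillary of it can do the work of a large volume of airflow, which is the intuition behind both blood and IBM’s electrolyte.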
Doing this in a computer could spark innovation that no longer has to work around traditional cooling practices. By filling a computer with capillaries, we could deliver power to and cool small clusters of transistors, allowing us to make denser CPUs. This construction could also help us pioneer a new age in robotics. Instead of measuring a CPU by how many tasks it completes in one second, it might be more fitting to measure how many tasks it can churn through on one gram of fluid.
Do you think that circulatory cooling has a place in the modern day home or business computer? Tell us what you think in a comment!