Reading an article in the Globe and Mail this morning about using a “solar wall” on the side of a building as a means to reduce energy consumption, I was struck by this statement:
Think of the biggest energy cost for a typical Canadian business – one running multiple computers and keeping the lights on at all hours. It’s electricity, right? Wrong.
“People think of electricity first but actually, in Canada, indoor heating is the largest use of energy,” says Victoria Hollick, vice-president of renewable energy firm Conserval Engineering in Toronto. “Given that [we use] heat seven months of the year, that represents a tremendous use of energy.”
Made me think. Computing equipment needs electricity to operate, but it throws off (as anyone sitting near a server rack knows) a lot of heat, which in turn demands more electricity for cooling. And in the winter, for those seven months, buildings need to be heated. This creates an interesting little system.
What if the energy the computing equipment dissipates as heat were reclaimed as energy for another use? That is, rather like a steam engine or a nuclear plant, what if the waste heat were used to warm water or some other working fluid that was then either (a) directed to heat the premises in the winter or (b) used to generate some of the power required to cool the premises in the summer? On a smaller, more closed scale, perhaps the power recovered this way could be fed back into the supply for the computing equipment itself, creating a slowly decaying closed loop.
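The hunch is easy to sanity-check with back-of-envelope arithmetic. Here's a quick sketch; every figure in it (rack power draw, office size, heating demand per square metre) is an assumption chosen for illustration, not a measurement or a sourced number:

```python
# Back-of-envelope: how much heat does a server rack throw off over a
# Canadian heating season, and how does that compare to a small office's
# heating demand? All constants below are illustrative assumptions.

RACK_POWER_KW = 10.0     # assumed steady draw of one loaded server rack
HOURS_PER_MONTH = 730    # average hours in a month
HEATING_MONTHS = 7       # heating-season length cited in the article

# Essentially all the electricity a rack consumes ends up as heat.
heat_per_season_kwh = RACK_POWER_KW * HOURS_PER_MONTH * HEATING_MONTHS

# Assumed heating demand: a 500 m^2 office at a rough rule-of-thumb
# 100 kWh per square metre per year.
OFFICE_AREA_M2 = 500
heating_demand_kwh = OFFICE_AREA_M2 * 100

fraction_covered = heat_per_season_kwh / heating_demand_kwh
print(f"Rack heat over the season: {heat_per_season_kwh:,.0f} kWh")
print(f"Fraction of heating demand covered: {fraction_covered:.0%}")
```

Under these (made-up) numbers, a single rack's waste heat is on the same order as the whole office's seasonal heating demand, which is why data-centre heat reuse is plausible at all; the hard part in practice is that the heat is low-grade and needs to be moved to where it's wanted.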
It wouldn’t eliminate the need for electrical power, obviously, but maybe it would reduce the overall need in the short run and expand the imagination in the long run.
Of course, I’m sure others have thought about and implemented this already. But, hey, it struck me now.