Maybe this is a stupid question, but if Apple wants to reduce CPU heat and increase battery life, why don't they attach a thermocouple to the heatsink to convert the wasted heat into electricity?
Well, how about a "steam" engine implementation? Use the CPU heat to heat a working fluid that powers turbines to generate electricity?
Using a thermoelectric material would be a great idea, but tubgirl's got it: they aren't very good at converting heat to electricity yet. The first use would probably be in automobiles, where the temperature difference is much higher.
I think Apple would only consider an inexpensive and compact solid-state way to recover wasted power, which is why I'd see them interested in a thermocouple.
Yeah, a huge problem would be the fact that computers shouldn't get above about 150°F, and steam doesn't form until 212°F.
Yes, but I mentioned "steam" to convey a concept. The working fluid would obviously have to be something that undergoes a phase change at the temperatures and pressures a CPU, GPU, and core logic would produce. Or it could simply be a single-phase system (gas only).
Quote:
Heat -> electricity is probably never going to be efficient because of that whole 'thermodynamics' mumbo jumbo about 'entropy'
Yeah, heat engines are only 15 to 20% efficient. But I wouldn't knock anything that can increase battery life; it all helps in the end. It's the cost and weight that will determine whether it's viable.
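As a sanity check on those numbers (the temperatures here are my own illustrative assumptions, not from the thread): even an ideal Carnot engine running between a ~70°C CPU heatsink and ~25°C room air can't exceed about 13% efficiency, and that's a theoretical ceiling no real device reaches.

```python
# Carnot efficiency limit for a heat engine between a hot CPU and room air.
# Temperatures below are illustrative assumptions, converted to kelvin.
T_hot = 70 + 273.15    # ~70 C heatsink surface
T_cold = 25 + 273.15   # ~25 C ambient air

eta_carnot = 1 - T_cold / T_hot
print(f"Carnot limit: {eta_carnot:.1%}")  # roughly 13%, the absolute best case
```

Real thermoelectric devices get only a small fraction of even that limit, which is why the recovered power is so small in practice.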
I reckon that instead of using standard transistors, we should use LED transistors and convert the light back into electricity and let the thing power itself.
Why not just use plastic solar cells? These work in the infrared range anyway, so they could absorb heat. They're fairly easy to produce, unlike traditional solar cells, and although they don't have the best return, they probably wouldn't hurt.
Yes, solar cells can be significantly more effective, but even a hot CPU (~70°C) doesn't radiate very much energy. (Maybe someone clever could do the math?)
And then you have to address the issue of where to place the cells inside the case.
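Taking up the "do the math" invitation, here's a back-of-the-envelope Stefan-Boltzmann estimate (the die area and emissivity are my guesses, not measured values). It shows why capturing the radiation is hopeless: a 70°C chip surface radiates only a fraction of a watt, while the chip dissipates tens of watts through conduction into the heatsink.

```python
# Net thermal radiation from a hot CPU surface (Stefan-Boltzmann law).
# Area and emissivity are illustrative assumptions.
SIGMA = 5.670e-8         # Stefan-Boltzmann constant, W m^-2 K^-4
area = 4e-4              # ~2 cm x 2 cm package top, in m^2
emissivity = 0.9         # typical for a non-shiny surface
T_cpu = 70 + 273.15      # K
T_ambient = 25 + 273.15  # K

power = emissivity * SIGMA * area * (T_cpu**4 - T_ambient**4)
print(f"Radiated power: {power * 1000:.0f} mW")  # on the order of 100 mW
```

So even a perfect infrared cell wrapped around the chip would harvest roughly a tenth of a watt at best.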
Size?
You'd probably be better off spending that weight and volume on a bigger battery.
Key quote: "today's standard thermopiles are 0.2% to 0.8% efficient for temperature differences of five to 20 deg. C"
Also: http://www.wirelessnewsfactor.com/perl/story/13874.html
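Plugging those thermopile figures into this scenario (the 30 W CPU dissipation is my assumption, for illustration): even at the optimistic end of the quoted range, the recovered power is tiny.

```python
# Power recovered by a thermopile at the quoted 0.2%-0.8% efficiency,
# assuming a laptop CPU dissipating ~30 W (an illustrative figure).
cpu_power_w = 30.0
for eff in (0.002, 0.008):
    recovered_mw = cpu_power_w * eff * 1000
    print(f"{eff:.1%} efficient -> {recovered_mw:.0f} mW recovered")
```

That's 60 to 240 mW, a rounding error next to what a bigger battery of the same weight would provide.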