Possible solution for heat problems in notebooks
I was just reading this thing on Apple's news page about the Virginia Tech cluster. There's a line in there:
" If you've ever sat with a TiBook in your lap, you understand that there is a further significant issue. As hot as a G4 runs, a G5 runs hotter."
I was thinking about that notebook issue, and it struck me that the main reason for poor battery life in laptops is not that the batteries are crap (though of course they could be better) but that processors are very inefficient in their power consumption. And how exactly do we notice that?
Why, just as we do with lightbulbs: HEAT. I believe a lightbulb only converts about 5 percent of its wattage into light; the rest is converted into heat. Now, in processors it seems to me to be much the same. The efficiency is probably higher than 5%, but I doubt it would exceed 40% or so... I dunno.
The thing is, there are devices that convert light into electricity and devices that convert kinetic energy into electricity; we're trying all these things to power our portable devices. Yet in a notebook there already is a power source: heat-wise, the processor and even the battery itself are power sources. Isn't there a way to convert perhaps half or even a quarter of that heat back into electricity? You could have it charge a backup battery, or the running battery itself (I don't know if that works well), and you'd take away a percentage of the heat by converting it back into electricity.
Thus making your device far more efficient, instead of trying to find all those high-tech workarounds...
Am I babbling here?
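To put some rough numbers on the idea, here is a back-of-envelope sketch; the CPU and system wattages are hypothetical, and the recovery fractions are the "quarter or half" hoped for above:

```python
# Back-of-envelope: how much power would heat recovery give back?
# All figures here are assumptions for illustration, not measured values.
cpu_power_w = 20.0      # hypothetical mobile CPU dissipation (watts)
system_power_w = 45.0   # hypothetical total laptop draw (watts)

for recovery in (0.25, 0.50):  # "a quarter or even half" of the heat
    recovered_w = cpu_power_w * recovery
    share = recovered_w / system_power_w
    print(f"{recovery:.0%} recovery -> {recovered_w:.1f} W back, "
          f"{share:.0%} of total system draw")
```

Even at the optimistic 50% figure, the recovered power would be a modest slice of what the whole machine draws, which is the question the comments below turn on.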
" If you've ever sat with a TiBook in your lap, you understand that there is a further significant issue. As hot as a G4 runs, a G5 runs hotter."
I was thinking about that notebook thing and thought the main reason for poor battery life in laptops is not that the batteries are crap (though of course they could be better) but it's the fact that processors are very inefficient in power consumption. And how exactly do we notice that?
Why just as we do with lightbulbs, HEAT. I believe a lightbulb only converts about 5 percent of it's wattage into light, the rest is converted into heat. Now, in processors it seems to me to be much the same, the efficiency is probably higher than 5% but I doubt it would exceed 40% or so.... I dunno.
The thing is, there are devices that convert light into electricity, devices that convert kinetic energy into electricity, we're trying all these things to power our portable devices. Yet in a notebook, there already is a powersourse, the processor and even the battary itself are powersources heatwise. Isn't there a way to convert perhaps half or even a quarter of that heat back into electricity ? ? ? you could have it charge a backup battery or the running battery itself (dunno if that works well) and you'll take away a percentage of the heat by converting it back into electricity.
Thus making your device far more efficient in stead of trying to find all them hight tech work arounds....
Am I babbling here ?
Comments
In reality, converting heat back into electricity takes more room than a laptop can provide. Unless they find an alloy that will naturally do this, it won't happen for a long time.
make a laptop/thermos hybrid!
huh? huh?
how 'bout them apples!?
You mean something like this:
Technology Research News
The question is, can you fit them into laptops?
aye
but 18% efficiency wouldn't really be worth it...
glad to see there is development going on though. :-)
makes me feel sane again
Originally posted by neumac
You mean something like this:
Technology Research News
The question is, can you fit them into laptops?
Note that they get 18% efficiency with an input heat source of 200-300 °C. Processors start to fail considerably below that. We would have to get the processors glowing cherry-red before these devices would be practical.
~ufo~ has raised a good question, but the problem remains insoluble at this point. Direct thermoelectric conversion is too inefficient even with these new breakthroughs. Other methods to effect a conversion of the heat back to electricity involve too much bulk and mechanical complexity. At the moment, the cheapest, simplest alternatives are better batteries and asbestos underwear, unfortunately.
Originally posted by ~ufo~
Am I babbling here?
No.
I have a sneaky suspicion that a device that can convert heat into electrical energy will be a MEMS device, not a semiconductor one. Essentially your regular old fluid-driven power plant (coal, nuclear, gas, whatever), shrunk down to nano or micro size.
Originally posted by THT
No.
I have a sneaky suspicion that a device that can convert heat into electrical energy will be a MEMS device, not a semiconductor one. Essentially your regular old fluid-driven power plant (coal, nuclear, gas, whatever), shrunk down to nano or micro size.
All of which will fit into a 1 inch form factor and will at best only give you a 20% increase in battery life at a large cost? I don't think so. Apple would ship a better battery before such a device.
In this case let's say the chip runs at about 360 K and the cool end of the converter at about 300 K. This means the maximum (Carnot) efficiency would be 1 - 300/360, about 17%. That would add roughly 25 minutes to a system that otherwise ran four hours. In the real world the efficiency, and hence the benefit, would be much less. Hardly worth the effort.
All such devices are based on a temperature difference. If you don't somehow remove heat from the cold end, it will warm up and the device will stop functioning.
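A quick sketch of that arithmetic, assuming the temperatures above and a (hypothetical) 50% CPU share of total system power:

```python
# Carnot limit and runtime gain for the figures in the comment above.
# The CPU's share of total system power is an assumption for illustration.
t_hot_k = 360.0    # chip temperature (K), from the comment
t_cold_k = 300.0   # cool end of the converter (K), from the comment

carnot = 1.0 - t_cold_k / t_hot_k   # ~16.7%, the theoretical ceiling
cpu_share = 0.5                     # assumed fraction of system power used by the CPU
recovered = carnot * cpu_share      # fraction of total draw recovered

base_runtime_min = 4 * 60
new_runtime_min = base_runtime_min / (1.0 - recovered)
print(f"Carnot limit: {carnot:.1%}")
print(f"Extra runtime: {new_runtime_min - base_runtime_min:.0f} minutes")
```

Under these assumptions the gain comes out to roughly 20-25 minutes on a four-hour charge, and that is the theoretical best case before any real-world losses.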
Originally posted by Yevgeny
All of which will fit into a 1 inch form factor and will at best only give you a 20% increase in battery life at a large cost? I don't think so. Apple would ship a better battery before such a device.
Well, yeah, in the near term. But down the line, when such a device could be in production and could fit in a 1 inch form factor, and when laptops are using 60-watt CPUs, such a thing would be really handy.
Originally posted by neutrino23
All such devices are based on a temperature difference. If you don't somehow remove heat from the cold end, it will warm up and the device will stop functioning.
Would a MEMS scale device be more efficient?
Originally posted by THT
Would a MEMS scale device be more efficient?
No.
The Carnot limit, 1 - Tcold/Thot (the ~17% worked out above), is the maximum theoretical efficiency for any heat engine, no matter what scale it is built at. Any real machine would be hard pushed to come anywhere near it, and in practice, the lower the temperature difference available, the more difficult it is to approach the theoretical limit.
michael
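To make that last point concrete, a small sketch of how the Carnot ceiling shrinks as the hot side cools toward a 300 K cold end (illustrative temperatures only):

```python
# Carnot ceiling vs. hot-side temperature, with a 300 K cold end.
# Illustrative temperatures; real devices fall well short of these numbers.
t_cold_k = 300.0
for t_hot_k in (400, 380, 360, 340, 320):
    carnot = 1.0 - t_cold_k / t_hot_k
    print(f"hot side {t_hot_k} K -> at most {carnot:.1%} of the heat is recoverable")
```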
Originally posted by ~ufo~
aye
but 18% efficiency wouldn't really be worth it...
glad to see there is development going on though. :-)
makes me feel sane again
Another way of doing this would be to make batteries that are AC powered. Each time the current reverses, the battery would be charged a little, prolonging its life. The power would still eventually run out, since it takes more energy to charge a battery than you get back when discharging it...
A true project for mad scientists.
.:BoeManE:.
Originally posted by mmicist
This is a maximum theoretical efficiency for any heat engine. Any real machine would be hard pushed to come anywhere near it, and in practice, the lower the temperature difference available, the more difficult it is to approach the theoretical limit.
Yes, you guys are right; I should have taken some time to think about it. I'm still questioning it, however, since the physics at MEMS scales operates a little differently than it does at large scales...
Anyways, I would take that 15%.
It would be better to put the effort into developing new technologies that use less power to start with. OLED displays will use less power than LCDs. New generations of hard drives seem to use less power. Newer CPUs might use less power. Better power management strategies will cut down average power use.
If small improvements to all of these methods cut power requirements by ten percent, that is probably a much bigger benefit than you could get by trying to harvest energy from a hot CPU.
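To put that comparison in numbers, a quick sketch using the thread's own figures (the CPU's 50% share of system power is an assumption on my part):

```python
# Compare a 10% cut in total power draw against heat recovery at the Carnot limit.
# The CPU's share of system power is an assumption; other figures are from the thread.
base_runtime_min = 4 * 60

# Option A: shave 10% off total draw (OLED display, efficient drives, power management).
runtime_a = base_runtime_min / 0.90

# Option B: recover CPU heat at the ~17% Carnot ceiling (360 K hot, 300 K cold),
# assuming the CPU accounts for half of the system draw.
recovered = (1.0 - 300.0 / 360.0) * 0.5
runtime_b = base_runtime_min / (1.0 - recovered)

print(f"10% power cut: +{runtime_a - base_runtime_min:.0f} min")
print(f"heat recovery: +{runtime_b - base_runtime_min:.0f} min (theoretical best case)")
```

Under these assumptions the simple 10% power cut already beats the heat-recovery scheme at its theoretical best, which is the point being made above.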