I have a Dual 2GHz G5, which I am beginning to use as a real digital hub (it answers my telephone, I run my stereo through it, etc.). As a result, I have taken to setting it never to sleep. My question is, how much energy does this use?
Comments
Originally posted by JBL
Ovolab Phlink. I actually just got it, so I can't give you much of a review yet, but it looks really cool.
Awesome, I could really use that.
Originally posted by chych
Hm, you'll probably need to go to your hardware store and get a power meter to figure out how much power your computer uses.
Posted over at MacNN:
An average desktop PC consumes about 100 watts of power (even though its power supply can, on occasion, draw several hundred watts).
Electricity is sold by the kilowatt-hour (kWh) - that is, 1,000 watts for one hour. A PC would consume 2.4 kWh (24 h × 100 W) per day if left powered on.
I just checked my power bill. In North Carolina I'm paying $0.09 per kilowatt-hour.
2.4 × $0.09 ≈ 22 cents per day, or about $6.50 per month.
I think that may be a bit high. I've had four or five PCs running 24/7 and didn't notice a $20+ increase in my electric bill.
If you 'sleep' your computer for an entire month, it probably wouldn't cost 25 cents for the electricity.
Tell your significant other that the blow dryer is off limits. That sucker draws 1,500 watts of electricity.
If saving money on your power bill is the goal, ignore light bulbs, computers, and audio/video electronics. None of those things have a big impact on the monthly bill. Instead, limit your use of heat-generating electrical devices: oven, dryer, etc.
See that 20,000-watt figure for electric heat? That's why insulation pays for itself so quickly.
Those little 1,200-watt portable space heaters will add about $75/month to your power bill if run continuously.
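The arithmetic in that MacNN post generalizes to any always-on device; here's a quick sketch (the $0.09/kWh rate is just the North Carolina figure quoted above, and a 30-day month is assumed):

```python
def monthly_cost(watts, rate_per_kwh=0.09, hours_per_month=24 * 30):
    """Electricity cost for a device drawing `watts` continuously all month."""
    kwh = watts / 1000 * hours_per_month  # watts -> kilowatt-hours
    return kwh * rate_per_kwh

print(round(monthly_cost(100), 2))   # 100 W desktop PC -> 6.48
print(round(monthly_cost(1200), 2))  # 1200 W space heater -> 77.76
```

That reproduces both the "~$6/month for a PC" and "~$75/month for a space heater" numbers in the post.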
Originally posted by wmf
The Power Mac G5 is rated at around 650W; in practice it may use only 300-400W.
The power supply in the G5 is only rated at 450W, so I don't know where you got the 650W number. The average draw is not going to be 300-400W. I know people who were running 4 to 5 hard drives, an optical drive, and the processor on the old G4 power supplies (rated at 340W), so simply running 1 or 2 hard drives and the G5 will not come anywhere close to your 400W figure. I really wonder where you got this info from.
Big Mac
kupan, if this were Slashdot, I'd rate your post highly informative.
I can't take all the credit; it was something I nabbed from over at MacNN when someone asked about this there.
Originally posted by kupan787
The power supply in the G5 is only rated at 450W, so I don't know where you got the 650W number.
Apple's site says 6.5A at 100-125V.
Originally posted by wmf
Apple's site says 6.5A at 100-125V.
Is that really how you figure this out (6.5 times 100 = 650W)? So if I use the G5 in another country (with a different voltage, say 200-240V AC), then the power supply becomes 1250W?
I got my 450W figure from http://mac-pro.com/Merchant2/merchan...tegory_Code=ps
Replacement Power Supply for PowerMac G5 1.6Ghz #661-2903
Fully Tested. New, Extended 6 Month Warranty.
Used in PowerMac G5 1.6Ghz Only. Apple Part Number 661-2903 450 Watt
So this is an Apple power supply, from a G5, and it is listed as a 450W power supply. I have a feeling that the power supplies in the G5 only go up to 450W, in which case the average draw is probably around 100W (maybe a bit more, but I doubt it).
The G5's max is around 7.5A, so in North America (120V) the power consumption is 900VA max.
1VA is approx. 0.6 watts (assuming a power factor of about 0.6), so the max power consumption in watts is about 540W.
I too am really curious about the numbers in Asia and Europe.
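Leonis's arithmetic can be sketched as follows. Note that the 0.6 power factor is the poster's assumption; supplies with power-factor correction run much closer to 1, which would put real power nearer the VA figure:

```python
def real_power(volts, amps, power_factor=0.6):
    # Apparent power (VA) = volts * amps; real power (W) = VA * power factor
    return volts * amps * power_factor

print(120 * 7.5)             # 900.0 VA apparent on a 120 V line
print(real_power(120, 7.5))  # 540.0 W real, at the assumed 0.6 power factor
```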
Originally posted by Leonis
VA is actually the product of voltage × amps.
The G5's max is around 7.5A, so in North America (120V) the power consumption is 900VA max.
1VA is approx. 0.6 watts (assuming a power factor of about 0.6), so the max power consumption in watts is about 540W.
I too am really curious about the numbers in Asia and Europe.
Well, the G5 you buy in the US will have the same power supply as the one in Asia. But my power supply suddenly can't handle more just because the voltage there is higher, right? I'll admit I'm on shaky ground here, as I don't know much about how power works in various places, but it doesn't seem like a power supply rated at 450W should suddenly be able to power more on the same unit just because I move somewhere else.
From what I understand, the amp rating should refer to the highest sustained current this power supply will draw from an AC line (at a specific voltage) without tripping a fuse or overload-protection circuit and/or causing power distortion, give or take a small tolerance. When a computer, or just about any electrical device, starts up, a brief power surge is created that can greatly exceed the device's maximum power rating.
But back to the issue at hand: there is almost no chance this guy is going to be doing a max draw 24/7 with his machine. I seriously doubt that the G5, system controller, 2 hard drives, and an optical drive in use 24/7 would be using the power supply's entire capacity. Average usage varies with the number of drives in the machine, among other things, but for servers I have seen that an average machine will draw roughly 50% of the power supply rating. As for maximum power use, a well-designed system generally will not come close to the maximum draw.
So let's assume the power supply is 450W and that on average this machine uses 225W, running 24/7. Plugging into my equation (0.225 kW × 24 h × 30 days × $0.09/kWh), we get roughly $14.50/month. But I really think we are overshooting, and that the average would be less (servers generally have more drives, and their drives are accessed more frequently).
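On the voltage question earlier in the thread: the intuition that moving countries doesn't change what the supply can power is right. The 450W is the output rating; it stays the same everywhere, and only the input current changes with line voltage. A rough sketch (the 75% efficiency figure here is an illustrative assumption, not an Apple spec):

```python
def line_current(output_watts, line_volts, efficiency=0.75):
    # Input power = output power / efficiency; line current = input power / voltage
    return output_watts / efficiency / line_volts

print(round(line_current(450, 120), 2))  # ~5.0 A drawn from a 120 V US line
print(round(line_current(450, 240), 2))  # ~2.5 A drawn from a 240 V European line
```

Same wattage in and out either way; doubling the voltage just halves the amps, which is why the amp rating on the label is quoted at a specific voltage range.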