Pc power usage?
my uncle made me plug my pc into a smart outlet that tracks power usage so he can see if my pc increases his bill or not, how much power will my pc actually use on average? should i be worried?
- 4 months ago
I feel the only way it would make a noticeable difference in the electric bill is if your PC is drawing over 1,000-1,500 watts, and it also depends on how long you use it each day.
- 5 months ago
All you have to do is take off the cover and look at the power supply. It should have the wattage on a label (keep in mind that's the maximum rating, not what the PC actually draws). My girlfriend's older Dell OptiPlex SFF uses a standard 255-watt power supply.
- RichardLv 65 months ago
without knowing specs, this question is difficult to answer
- m8xpayneLv 75 months ago
Depends on the specs. I have a bunch of those Kill A Watt meters too, and you can track the usage yourself once you figure out how to work the meter.
I'd say something with a Ryzen 5 or Core i5 along with a card like the RTX 2060 would use around 300 W while gaming. Typically when you're gaming, other parts like the CPU or disk drives aren't running at 100%. I'd suggest doing some research on how much power these gaming computers actually use. Product reviews of processors and graphics cards are a good place to start. Once you have that figured out, you can either do the math yourself or use an online calculator.
Having a PSU with a higher efficiency rating will reduce electrical usage a little, since a more efficient PSU wastes less power. Older or cheap power supplies are roughly 80% to 85% efficient, maybe more or less. In North America, an 80 Plus Platinum unit is 90% to 92% efficient, unless the power supply is overloaded and overheating. PSU efficiency is something you can factor into the overall cost of running a PC.
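To see what efficiency means at the wall, the outlet draw is the component load divided by the PSU's efficiency (a quick sketch; the 300 W load and the exact efficiency figures are just illustrative):

```python
# Wall draw = DC load / PSU efficiency.
# A less efficient PSU pulls more from the outlet for the same load.
def wall_draw(load_watts, efficiency):
    return load_watts / efficiency

print(wall_draw(300, 0.82))  # cheap/old PSU, ~82% efficient -> ~366 W
print(wall_draw(300, 0.91))  # 80 Plus Platinum, ~91%        -> ~330 W
```

So the same 300 W of components costs you roughly 36 W less at the wall with the better PSU.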
If your PC does consume 300 W and your electricity costs 12 cents per kWh, then it costs 3.6 cents to run your PC every hour. If you play games on your PC an average of 3 hours a day, your cost would be about $3.24 a month.
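That arithmetic is easy to script so you can plug in your own numbers (a minimal sketch; the 300 W draw, $0.12/kWh rate, and 3 hours/day are just the example figures from above):

```python
# Estimate electricity cost of running a PC.
def pc_cost(watts, rate_per_kwh, hours_per_day, days=30):
    kwh_per_hour = watts / 1000            # 300 W -> 0.3 kWh each hour
    hourly = kwh_per_hour * rate_per_kwh   # cost per hour of use
    monthly = hourly * hours_per_day * days
    return hourly, monthly

hourly, monthly = pc_cost(300, 0.12, 3)
print(f"{hourly * 100:.1f} cents/hour, ${monthly:.2f}/month")
# -> 3.6 cents/hour, $3.24/month
```

Swap in your own wattage and local rate to see what your rig actually costs.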
If you have a card that eats wattage like a Vega 64, Radeon VII, or R9 390X, or something like an RTX 2080 Ti or GTX 1080 Ti, then you can expect a higher power consumption figure.
Over the past 10-12 years, Intel, Nvidia, and AMD have been working hard on making their products consume as little power as possible at idle. I have an older Phenom II X4 955, which came out in 2009, and it consumes around 120 W even at idle. It won't help you much if you have an older HEDT Intel CPU like a Core i7-950 or 3930K, or an 8-core AMD FX processor that uses a lot of power. If you have an old rig, its lack of low-power states is working against you.
People get too caught up in the power figures. During the winter it evens out, since the PC gives off heat, which means the heaters won't run as much. But this also depends on how much power your PC consumes.
If your uncle wants to save money on his power bill, he needs to look at the insulation in the house, the windows, and how efficient the heating and AC are. Poor insulation or bad windows are costly during the winter. How long is the oven or the clothes dryer on? How many people are using hot water? Is anyone in the house hibernating in front of a space heater or running an A/C unit? Portable A/C units are terribly inefficient. It's things like this that put the screws to the power bill. Power bills are always higher during the winter, and not having the house sealed up properly plays a big role in a high bill.
- TStoddenLv 75 months ago
The power usage generally depends on the system specs & usage.
My laptop's power supply is 150 W... So, let's go with a WORST CASE SCENARIO: fully active and 100% busy 24/7 (like a high-demand remote server) for a month (30 days). That's 0.15 kW x 24 x 30 = 108 kWh for the month... At the national average of $0.14 per kWh, that would place the monthly cost at $15.12/mo. If you're living in Hawaii, which has the highest rates in the US at ~$0.33 per kWh... that would come out to $35.64/mo.
HOWEVER, that isn't going to be the average use case. Computers try to be power efficient and will gear down when idling. Taking my laptop processor's specs, it pulls 45 W under full load... but drops to 35 W under lighter loads (-10 W there). The graphics processor (GTX 1050) pulls 75 W under full load, when it's needed for gaming and more 3D-intensive stuff. When doing more casual stuff, it's usually idle or powered down, dropping to about 5 W (-70 W there).
So in a more idle state, it's using around 80 W below peak usage (taking it down to 70 W, or 0.070 kWh per hour)... Powering off the screen drops another 20 W of usage. Putting the system to sleep when NOT in use drops power consumption WAY down to only 10 W, as it's just keeping power to the RAM (to hold its contents) while the processor is powered down to run minimal programming like power management settings (maintaining the idle timer) and waiting for a "wake up" signal (like hitting any key on the keyboard, instead of just the power button).
Here's HOW my computer's power settings are configured (while plugged in)...
* The Screen shuts off after 15 minutes of inactivity
* System goes to sleep after 2 hours
* System goes into hibernation after 3 hours
Hibernation effectively turns the computer off completely (dropping it to 0.5 W), but it dumps the RAM to the storage drive and sets a special boot flag so it can pick up where it left off.
To put this in a more realistic outlook...
* 2 hrs of high-end gaming (@ 0.150 kW, or 0.300 kWh used)
* 2 hrs of casual use (@ 0.070 kW, or 0.140 kWh used)
* 2 hrs idling, screen off / laptop closed (@ 0.050 kW, or 0.100 kWh used)
* 1 hr sleep (@ 0.010 kW, or 0.010 kWh used)
* 17 hrs hibernating w/ full battery (@ 0.0005 kW, or ~0.009 kWh used)
That places the daily power usage at ~0.559 kWh, which comes out to ~16.77 kWh per month. The power cost would be ~$2.35/mo. on average (up to ~$5.53/mo. in Hawaii), which is about 15.5% (between 1/6th and 1/7th) of the worst case scenario.
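The daily breakdown above can be tallied the same way (a sketch using the per-state wattages assumed in my list, so you can swap in your own profile):

```python
# Daily energy use from a usage profile: (hours, kW draw) per state.
profile = [
    (2,  0.150),   # high-end gaming
    (2,  0.070),   # casual use
    (2,  0.050),   # idle, screen off
    (1,  0.010),   # sleep
    (17, 0.0005),  # hibernating
]

daily_kwh = sum(hours * kw for hours, kw in profile)
monthly_kwh = daily_kwh * 30
print(f"{daily_kwh:.3f} kWh/day, {monthly_kwh:.2f} kWh/month")
print(f"${monthly_kwh * 0.14:.2f}/mo at $0.14/kWh, "
      f"${monthly_kwh * 0.33:.2f}/mo at Hawaii rates")
```

Edit the `profile` list to match your own hours and wattages and the totals update themselves.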
If you use your computer more than me, the power consumption (and cost) will be higher, but this should give you an idea. If your computer is a desktop, peak power consumption will be considerably higher (around 500 - 1,000 W) as well... So use this more as a guideline than anything.
Considering that your uncle is potentially being a tightwad with the electrical bill (as he wants to monitor your computer's usage), you could try to negate it by replacing bulbs around the house. Replacing incandescent bulbs with CFLs (the "spiral bulbs") or LEDs will net a significant savings: a 60 W incandescent swapped for a CFL drops to 15 W (saving 0.045 kW), and an LED only uses 10 W (saving 0.050 kW) by comparison. Assuming 5 hours of usage per day, that's 0.225 - 0.250 kWh per bulb per day, so if I can replace around 3 incandescent bulbs... I can negate my laptop's power usage. If I were replacing CFLs with LEDs, I'd have to replace about 23 bulbs here to net the same result.
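The bulb-swap math checks out like this (a sketch; the bulb wattages, 5 hours/day, and my laptop's ~0.559 kWh/day are the assumptions from this answer):

```python
import math

# How many bulb swaps offset the laptop's daily energy use?
hours_per_day = 5
laptop_kwh_per_day = 0.559

def daily_saving_kwh(old_w, new_w, hours=hours_per_day):
    # Energy saved per day by one bulb swap, in kWh.
    return (old_w - new_w) / 1000 * hours

incand_to_led = daily_saving_kwh(60, 10)  # 0.250 kWh/day per bulb
cfl_to_led = daily_saving_kwh(15, 10)     # 0.025 kWh/day per bulb

print(math.ceil(laptop_kwh_per_day / incand_to_led))  # -> 3 bulbs
print(math.ceil(laptop_kwh_per_day / cfl_to_led))     # -> 23 bulbs
```

The big savings come from retiring incandescents; going from CFL to LED saves only 5 W per bulb, which is why it takes so many more swaps.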
Please note that an average US household uses around 900 - 1,000 kWh (0.9 - 1.0 MWh) per month, so my laptop would account for roughly 1.7 - 1.9% of the total power bill. If you have a high-end gaming desktop, it MIGHT account for ~5% of the total bill, just to put things in perspective for you.
Hope this sheds some light on the subject.
- 5 months ago
unless it's a s*** hot gaming PC it shouldn't draw more than 150 watts or so
- PLv 75 months ago
Maybe at the worst you'll have to pay him $10 a month in electricity. You can cut that down quite a bit by turning the PC off when you're not using it. You may want to learn how to read and calculate the power usage yourself, since your uncle may try to exaggerate it. People like him who obsess over electric usage are prone to that.
- StarryskyLv 75 months ago
Depends on the "PC" model. A laptop uses very little, about as much as a 100 W light bulb. In 10 hours of use, that's 1 kWh. In most places, that's about 25 cents, maybe a little more.
Most printers use about the same, but only for a very short period; a laser printer uses a bit more. For a desktop box with a fan and a monitor, maybe 3 times as much power is used.
- DCM5150Lv 75 months ago
That depends on how much of a d*** your uncle is. Yes, your PC uses power. But how much depends on how much you use the computer and what for. Intense gaming, video editing, or Netflix streaming will use a lot more power than internet browsing or editing documents.
As a starting point, you're likely using in the neighborhood of $5/month, depending on your cost of electricity (which varies widely from place to place; if you're in a high-rate area, or you're a very heavy user, that could triple).
I did have a PC that, without my realizing it, was drawing a lot of power all the time for no reason. By coincidence, I used one of those meters and found out it was costing me in the neighborhood of $50/mo. I quickly fixed the issue.
- spacemissingLv 75 months ago
Anything that consumes power will increase the amount charged each month,
but a computer uses a lot less energy than some other things do.
Connect the refrigerator to an outlet like that and see what happens!