Thank you, Brian!!  A Pi may be a project down the road.

On 2021-07-23 11:16, Brian Cluff via PLUG-discuss wrote:
A power supply's rating is the maximum output it is capable of
delivering. Computer power supplies are going to be oversized (if the
computer was built right); otherwise they wouldn't last very long and
would run very hot.  A computer's power usage, especially a modern
one's, is going to vary wildly from one second to the next based on
its load and what's connected to it.  If your system is just sitting
there, on, doing nothing, it will likely be under 100 watts,
especially if the monitor is off, asleep, or non-existent.  Servers
will tend to draw more, because they have a lot more fans, hard
drives, and power profiles that don't allow them to throttle as much.

Even if you do have a system that only uses 50 watts normally, I still
recommend getting something low power like a Raspberry Pi to serve
your house.  Even if you have to buy the Pi and the existing computer
is free, the Pi will quickly pay for itself, and after that it's
almost free to run... and a lot quieter.  You also don't have to pay
for your air conditioner to cool off the room that your higher-power
computer heated up, which is a very real cost that hasn't really been
mentioned yet.
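
To put rough numbers on the payback idea, here's a back-of-the-envelope
sketch in Python.  The Pi price ($50), the two wattages (5 W vs. 60 W),
and the electric rate ($0.1085/kWh, taken from later in this thread) are
assumptions for illustration, not measurements:

    # Rough payback estimate for replacing an always-on PC with a Pi.
    # All of the inputs below are assumptions for illustration.
    PI_COST_USD = 50.0      # assumed purchase price of the Pi
    PI_WATTS = 5.0          # assumed average draw of the Pi
    PC_WATTS = 60.0         # assumed average draw of the old PC
    RATE_PER_KWH = 0.1085   # electric rate quoted later in this thread

    HOURS_PER_MONTH = 24 * 30

    def monthly_cost(watts):
        """Cost of running a constant load for a 30-day month."""
        kwh = watts * HOURS_PER_MONTH / 1000.0
        return kwh * RATE_PER_KWH

    savings = monthly_cost(PC_WATTS) - monthly_cost(PI_WATTS)
    print(f"Monthly savings: ${savings:.2f}")
    print(f"Payback in about {PI_COST_USD / savings:.0f} months")

With those assumed numbers the savings come to about $4.30 a month, so
the Pi pays for itself in roughly a year.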

I had to argue with an electrician about power supply sizes when I
built a computer lab with custom-built computers with massively
oversized power supplies.  He went around adding up all the wattage
ratings of the power supplies and decided that my 30-computer lab
would require a minimum of 15 circuits in order to not pop breakers.
I never could convince him that I was right and that the breakers
wouldn't pop, but he finally did what I asked him to do, which was to
add 4 circuits, and we never had any problems with them.
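
For anyone curious how far apart the two estimates land, here's a rough
sketch of the circuit math.  The per-machine wattages below are
illustrative guesses chosen to reproduce the numbers in the story, not
the lab's real figures, and the 1,440 W per circuit assumes a 15 A /
120 V circuit loaded to the usual 80% continuous limit:

    import math

    # Circuits needed: nameplate PSU ratings vs. realistic actual draw.
    # Wattages are guesses chosen to match the numbers in the story.
    MACHINES = 30
    RATED_WATTS = 700.0              # assumed oversized PSU nameplate
    ACTUAL_WATTS = 150.0             # assumed realistic draw under load
    CIRCUIT_WATTS = 15 * 120 * 0.8   # 15 A @ 120 V at 80% continuous load

    for label, watts in (("sized by rating", RATED_WATTS),
                         ("sized by actual draw", ACTUAL_WATTS)):
        circuits = math.ceil(MACHINES * watts / CIRCUIT_WATTS)
        print(f"{label}: {circuits} circuits")

Summing nameplate ratings gives 15 circuits; sizing by what the
machines actually draw gives 4, which is why the lab ran fine.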

Brian Cluff

On 7/23/21 10:22 AM, Keith Smith via PLUG-discuss wrote:

Based on what we have been discussing, I assume my 400 watt power supply may be drawing much less power based on actual usage. Therefore my computer might only be using 60 watts... making the cost lower.

Your thoughts.



On 2021-07-22 21:39, Mike Bushroe via PLUG-discuss wrote:
I usually use a mental rule of thumb that every watt of 24/7/365
power consumption costs about $1 per year.  (One watt running all year
is 8.76 kWh, which at roughly 11 cents per kWh comes to about 95
cents.)  Obviously this is starting to fail as electric rates keep
going up.  So to first order of magnitude, a 100 watt server would
cost around $100 a year, but if the server were using the whole 400
watts it would cost more like $400 a year.

If my home web server is drawing 100 watts continuously, that means
100 watts * 30 days * 24 hours, or 72,000 watt-hours (72 kWh) a month.

I'm thinking 72 kWh * $0.1085/kWh = $7.81 a month.
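
Here's that same arithmetic as a quick Python sketch, using the 100 W
draw and the $0.1085/kWh rate from the message above (substitute
whatever your utility actually charges):

    # Monthly and yearly cost of a constant electrical load.
    WATTS = 100.0           # continuous draw from the message above
    RATE_PER_KWH = 0.1085   # electric rate from the message above

    kwh_per_month = WATTS * 24 * 30 / 1000.0    # 72 kWh
    kwh_per_year = WATTS * 24 * 365 / 1000.0    # 876 kWh

    print(f"Monthly: {kwh_per_month:.0f} kWh -> "
          f"${kwh_per_month * RATE_PER_KWH:.2f}")
    print(f"Yearly:  {kwh_per_year:.0f} kWh -> "
          f"${kwh_per_year * RATE_PER_KWH:.2f}")

The yearly figure comes out near $95 for 100 watts, which also lines up
with Mike's $1-per-watt-per-year rule of thumb.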

               KINDNESS

is most VALUABLE when it is GIVEN AWAY for

                   FREE
---------------------------------------------------
PLUG-discuss mailing list - [email protected]
To subscribe, unsubscribe, or to change your mail settings:
https://lists.phxlinux.org/mailman/listinfo/plug-discuss