Turning off PCs: does it really save anything?

DH
Posted By
Dan_Heller
Oct 23, 2003
Views
370
Replies
6
Status
Closed
This may not feel like a Photoshop question, but it does relate to color calibration on monitors, and I can’t find a more appropriate forum. Apologies in advance.

The general question is: does turning computers off at night REALLY save significant energy or help the equipment last longer? Here’s my experience, followed by a question…

I’ve moved my photo business to be almost exclusively digital now, hence PS. Obviously, color calibration on my monitor is imperative. One quirk of monitors is that the longer they stay powered, the better and longer they hold their color calibration. If I turn off the monitor, it not only takes a long time to return to its previous settings, but recalibration is required more often. Anecdotal evidence suggests that power cycling also wears the monitor, reducing its lifespan. The last time I got a new high-end monitor, I took that advice and kept the entire computer on all the time (a 1 GHz Pentium PC with all the other bells and whistles).

Sure enough, the monitor has not only lasted a good 12 months, it has also never required recalibration. I use it just about every day, except for 4-5 trips a year (three weeks each), which are the only times the monitor and computer are off.

What I’ve noticed is that leaving everything on has made no significant difference to my energy consumption (as a percentage of my entire household energy bill), and my computer equipment (at least my monitor) doesn’t suffer one bit. Add the convenience of just walking up to my computer at any time, and it gets me wondering: does it really matter anymore to turn off computers? I know it used to back in the ’80s, when components were more… primitive(?).

This is MY experience, but I have no idea whether it holds across the broader spectrum of systems, users, and environments. Any data on this?
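To put a rough number on the energy side (everything below is an assumption for illustration, not a measurement; plug in your own wattage and rate):

# Back-of-the-envelope cost of leaving a PC + CRT running overnight.
# A 2003-era tower plus a 19" CRT might idle somewhere around 150-250 W;
# the draw, hours, and electricity rate below are all assumed.
idle_watts        = 200    # assumed combined idle draw of PC + monitor
hours_off_per_day = 14     # assumed hours you'd otherwise power down
rate_per_kwh      = 0.10   # assumed electricity rate, $/kWh

kwh_per_month  = idle_watts / 1000 * hours_off_per_day * 30
cost_per_month = kwh_per_month * rate_per_kwh
print(f"~{kwh_per_month:.0f} kWh/month, roughly ${cost_per_month:.2f}/month")
# With these numbers: ~84 kWh and about $8.40 a month -- real money,
# but plausibly a small slice of a whole-household bill.

Whether that reads as “significant” obviously depends on the rest of the bill.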


KN
Ken_Nielsen_
Oct 23, 2003
I’m assuming you are not talking about an LCD display. I turn my CRT monitor off because it attracts dust and who knows what else, inside and out; it’s a cleaning thing for me. The monitor is a high-end Radius PrecisionView that is now old enough to toss into the big green dumpster, but it still works fine and holds its color and brilliance true to calibration, so I hate to send it to the recycle bin.

I use the new Cinema Displays, and one will replace the old Radius some day soon. Point being: if you turn it on and off each day, it will be out of date before it wears out.
RK
Rob_Keijzer
Oct 23, 2003
I have thought about this also. I usually base my judgement of monitor aging on visible CRT deflection changes rather than colour shift.
(When the high-voltage unit degrades, electrons emitted from the gun have less momentum and are thus more susceptible to the “steering power” of the deflection unit.)
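In textbook terms (a standard sketch, not specific to any one tube; here V_a is the anode voltage, B the deflection field, and L the beam path length), the beam speed and the magnetic-deflection displacement go roughly as

v = \sqrt{\frac{2 e V_a}{m}}, \qquad d \propto \frac{B L}{\sqrt{V_a}}

so if the high-voltage unit sags and V_a drops, the same deflection current pushes the beam farther, and the picture geometry drifts.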

The high-voltage unit ages because the capacitors (“condensers”) in it drift from their rated values and often start to leak their charge.
This process takes place only while the power is on.

I agree that it takes a while for a CRT monitor to come back on calibration after power-on, but when I’m away I prefer to switch the monitor off too.

I work at a broadcast control room with CRT monitors (Barcos) powered on 24 hours a day, 365 days a year. They lose accuracy after approximately two years of that. My PC monitor (a Philips 109S) has been kicking for over 4 years now, and I can’t see any change yet (unless I am degrading myself, which can never be 100 percent ruled out).
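To put rough numbers on the accumulated tube time (the daily-use figure for the Philips is my own assumption, not a measurement):

# Rough comparison of accumulated power-on hours.
# The Barcos run around the clock; 10 h/day for the Philips is an
# assumed desktop usage pattern -- adjust to your own habits.
barco_hours   = 24 * 365 * 2    # ~17,520 h at the two-year drift point
philips_hours = 10 * 365 * 4    # ~14,600 h over four years of desk use
print(barco_hours, philips_hours)

So two years of 24/365 duty packs in more tube hours than four years at a desk, though not by as wide a margin as the calendar suggests.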

Rob
H
Ho
Oct 24, 2003
Rob_Keijzer wrote: “The high-voltage unit ages because the capacitors in it drift from their rated values and often start to leak their charge. This process takes place only while the power is on.”

I assume that when a monitor is on standby these aging processes are reduced, if not eliminated outright, since the high-voltage circuits are switched off. Does anyone know for sure?
JT
Joe Thibodeau
Oct 24, 2003
I suspect that standby simply stops the flow of electrons from the emitter (cathode) but doesn’t disable the high-voltage charge on the plate. The way to tell is whether there is a noticeable change in the energy in front of the tube when it’s put in standby. Ever notice that when you turn off a CRT there is a crackling electrostatic charge sitting close to the screen, such that when you put your hand nearby you become the grounding rod for all the excess charge? If the high-voltage supply were switched off during standby, that same effect would take place. At least in my mind. Since I never contracted with a company that manufactured monitors, I am only guessing.

Joe

RK
Rob_Keijzer
Oct 24, 2003
Ho,

That is correct. But I always switch it off completely when I’m not "at it" for a long time.
Rob
WD
Walter Donavan
Oct 26, 2003
In South Florida, I believe it pays to turn electronic equipment off when not using it. Lightning and power surges take their toll.


Walter Donavan, Fort Lauderdale
Author of "Revelation: The Seven Stages
of the Journey Back to God"
www.revelation7stages.com
www.1stbooks/bookview/15479

