LCD vs. CRT Monitors
In the not-too-distant future, cathode-ray tube (CRT) monitors and televisions will be relegated to 1980s science fiction movies, but today, CRT monitors are still kicking around in many organizations. A CRT monitor creates its display by firing an electron beam at the phosphor-coated inside of a glass vacuum tube, and it is a high-voltage device. By contrast, a liquid crystal display (LCD) monitor shines fluorescent light through liquid crystals and glass. A typical 17” LCD monitor uses 35 watts, whereas a 17” CRT monitor uses 100 watts, so the LCD display cuts electricity use by 65 percent compared with the CRT display. How much is that worth? You can use the following formula to determine annual cost:
(watts x hours used / 1000) x cost per kWh = Total Cost
Let’s apply the formula to a CRT monitor that runs 8 hours a day, 5 days a week, 52 weeks a year (2080 hours per year), at an average electricity cost of 12 cents per kWh:
(100 watts x 2080 hours / 1000) x $0.12 per kWh = $24.96 per year
Now, let’s apply the formula to an LCD display with the same usage and cost figures:
(35 watts x 2080 hours / 1000) x $0.12 per kWh = $8.74 per year
Using an LCD display at the national average cost of electricity would save $16.22 per year in electric costs over the use of a CRT monitor. Additionally, because LCD displays give off very little heat, they can reduce air conditioning costs by as much as 50 percent in most areas, making the total savings somewhat higher. Clearly, savings in electricity alone might not convince a CFO to replace everyone’s CRT monitors, but this formula is a good example of how you can compute electricity costs for IT.
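If you want to scale this calculation across a fleet of monitors, the formula translates directly into a few lines of Python. The function name and constants below are illustrative only; they simply reproduce the CRT and LCD figures worked out above.

# Annual electricity cost: (watts x hours used / 1000) x cost per kWh
def annual_energy_cost(watts, hours_per_year, cost_per_kwh):
    return (watts * hours_per_year / 1000) * cost_per_kwh

HOURS_PER_YEAR = 8 * 5 * 52   # 8 hours/day, 5 days/week, 52 weeks = 2080 hours
COST_PER_KWH = 0.12           # average cost of electricity: 12 cents per kWh

crt_cost = annual_energy_cost(100, HOURS_PER_YEAR, COST_PER_KWH)  # $24.96
lcd_cost = annual_energy_cost(35, HOURS_PER_YEAR, COST_PER_KWH)   # $8.74

print(f"CRT: ${crt_cost:.2f} per year")
print(f"LCD: ${lcd_cost:.2f} per year")
print(f"Savings per monitor: ${crt_cost - lcd_cost:.2f} per year")  # $16.22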
U4 is committed to the wise use of our resources; therefore, all new monitor purchases will be LCD displays.