I have done contract work at many places, and a common theme exists: save pennies. Take them from the hardware, the workstations, the servers, the lunch breaks, the travel, and the vacation. But so many people get so caught up in saving “Pennies!” that they forget about the “Dollars!” The real issue is that much of management does not know how to quantify work or recognize the value of the work performed. For that matter, the same goes for relational confidence. I could go on about that for quite a while.
The real deal with machine performance in IT is that it lets an employee get the job done quicker. We live in a new age where developers require 8 GB of RAM minimum, and 16 GB for you Microsoft, Java, or ColdFusion folks out there, especially if you are running Eclipse, Visual Studio, Photoshop, Premiere, or Final Cut. Really, this is not even a question. The time a machine with 2 to 4 GB of RAM spends paging data between the hard disk and RAM completely outweighs the work, and the attention span, that could have been retained on a machine equipped with the proper resources.
Have you heard the stories where a single computer takes 15 minutes to start up and 15 to shut down? Okay, imagine that over 5 work days a week and 48 weeks a year (factoring in vacation and time off): that equates to 120 hours of lost time per year. Now multiply that by an entry-level salary of $40k a year, or let’s just say $20 an hour for the heck of it. That’s $2,400 lost, from the most conservative point of view.
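If you want to check that math, or swap in your own numbers, here is a quick back-of-the-envelope sketch. The figures are the same conservative assumptions used above, not measurements.

```python
# Rough sketch of the lost-time math above; the numbers are the
# article's conservative assumptions -- plug in your own.
minutes_lost_per_day = 30      # 15 min to boot + 15 min to shut down
work_days_per_week = 5
work_weeks_per_year = 48       # allows for vacation and time off
hourly_rate = 20.0             # roughly a $40k/year entry-level salary

hours_lost_per_year = (minutes_lost_per_day * work_days_per_week
                       * work_weeks_per_year) / 60
annual_cost = hours_lost_per_year * hourly_rate

print(f"Hours lost per year: {hours_lost_per_year:.0f}")  # 120
print(f"Cost of lost time:   ${annual_cost:,.0f}")        # $2,400
```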
Want to use a different hourly rate or salary? Be my guest. You tell me how much computer you could purchase for $1,000. Now tell me how often you wait for something to load on your computer, and how conservative my figures actually are. This happens at the server level too. It happens all over the place. That is the ability to quantify. Let’s think about the dollars, not the pennies.
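To put the dollars-not-pennies point in concrete terms, here is one more illustrative sketch: a break-even check on a $1,000 machine against the $2,400 a year of lost time from above. The prices are just the examples used in this post.

```python
# Hypothetical break-even on a hardware upgrade, using the figures
# from this post: a $1,000 machine vs. $2,400/year of recovered time.
upgrade_cost = 1000.0
annual_savings = 2400.0

payback_months = upgrade_cost / annual_savings * 12
print(f"Upgrade pays for itself in about {payback_months:.0f} months")  # ~5 months
```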