Thought I'd put this up for those interested to kick around.
Last year I began to feel somewhat hardware constrained. My PC is an HP Media Center machine from mid-2005.
I had added a low-end graphics card (ATI HyperMemory type) to HP's standard configuration. So, I had a 2.4 GHz
AMD 3800+ single-core CPU, 200 GB of HD storage, 1 GB of RAM, and a starter graphics card.
This was initially adequate, but recently it was feeling sluggish and performance constrained, especially on some games.
So, I began to wonder: do I try to save up a lot more for a whole new system, or put some money into upgrading what
is now obsolete technology?
I went ahead and opted for upgrading my current system. Here's what I added: removed the old RAM and installed 2
GB of fast OCZ memory (PC3200), a new power supply (Corsair 450W, 80+ certified), a new graphics card (full-featured ATI
DirectX 9.0c card with 512 MB), and a 300 GB external HD (no spare internal bay).
So, am I happy with this decision? Yes, it's worked out great. The additional memory makes it possible to play
games without Windows swapping out to the pagefile. The games I play are more responsive now, and with the full-featured
graphics card, newer games such as EE3 play well and everything can be run at higher graphics-quality settings.
Moreover, since XP still has the largest installed base among home PCs, many new software titles will remain XP compatible
for a while yet. I'm thinking at least through all of next year.
So, I believe I can now wait until sometime next year to take another look at a new system, when I'll have more
bang-for-the-buck options to consider. A super-fast dual-core system will be cheaper next year, as will entry-level
quad-core machines.
So, what do you think? Are you sticking with your current technology? As you may have noticed, the decision for me
switched from an OS-centric one (when do I need Vista?) to a hardware-centric one (how do I get adequate hardware
performance?). That's because I realized that, for me, Vista isn't required --- yet.