Kajiimagi
<Aristocrat╭ರ_•́>
I've been at least partially building PCs since the mid-'90s. All were Intel until AMD released the first 1 GHz processor. Since then it's been mix and match. The last 3-4 systems have been AMD (this one is ALL AMD), usually paired with an Nvidia GPU. Never had issues with AMD; their video drivers used to be dog shit, but that's way past.

I appreciate this insight. I'll admit I don't know all the terms you mentioned, nor all the impacts they might have on what I'm doing or how I'll upgrade the machine over time. But I think some context on my end might be helpful as well. I'm not going to be using this new computer - or expecting it to be able to handle - bleeding-edge graphics or games. I'm looking to get something that's about 8/10ths of the max and viable for 4-6 years.
It does make sense to me to be able to swap items out over time (RAM, SSDs, GPU, power supply, etc.), but the less I have to do that the better, and only being able to use some builder's proprietary hardware seems like a negative to me.
Lanx & @Fucker mentioned the struggles of Intel. So, does that theoretically (realistically?) impact my ability to upgrade over the next few years? Since 1998, when I got into computers, I've never had an AMD system. It's a dumb question perhaps, but are they reliable? I consider reliability above all else (cars, guns, tech) when making decisions to buy. They must be, right? Or else they wouldn't still be around.
Yes, I do over-analyze just about all of my decisions. I'm working with a therapist on it, lol.
For upgrade path, AMD all the way. AM5 is a new CPU socket and should last for the next few years. Hell, there are still AM4 CPUs being released. My experience with Intel is that when they release new CPUs, it's a whole new motherboard.
My $.02 anyhow.
EDIT: Shit didn't realize
