Desktop Computers

Brahma

Obi-Bro Kenobi-X
11,914
41,875
Uh...you have an open chassis and wonder how it doesn't get rid of heat? How does this even happen? As I said before, if your PC is causing thermal problems in your room, maybe upgrade your room?

I sometimes have 5 computers on in my home office, and they don't impact the ambient temp in any meaningful way. I have a small-assed HO at about 450 sq-ft. Are you people computing in closets FFS?

Please explain to me how a chassis gets rid of heat. I'm serious. Is there something new out there I don't know about?

The heat from the PC is being dissipated via the liquid cooling/radiator/fans. The purpose of any chassis is obviously to mount components on top of and look damn cool. The key word is dissipated. INTO MY ROOM. Hence my room heats up.
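For a rough sense of scale, here's a back-of-the-envelope sketch (all numbers are illustrative assumptions, not from this thread): a 400 W gaming load dumps 400 joules per second into the room. Real rooms shed heat through walls and HVAC, so the actual rise is far smaller than the sealed-room figure, but the heat itself has nowhere else to go.

```python
# Back-of-the-envelope: temperature rise of sealed room air from a PC's
# waste heat. Assumed numbers (illustrative): 400 W load, ~30 m^3 of air
# (roughly a 10x12 ft room). Real rooms leak heat through walls/HVAC.

AIR_DENSITY = 1.2          # kg/m^3 at room temperature
AIR_SPECIFIC_HEAT = 1005   # J/(kg*K)

def temp_rise_per_hour(watts: float, room_volume_m3: float) -> float:
    """Degrees C per hour the room air would gain with zero heat loss."""
    air_mass_kg = AIR_DENSITY * room_volume_m3
    joules_per_hour = watts * 3600          # 1 W = 1 J/s
    return joules_per_hour / (air_mass_kg * AIR_SPECIFIC_HEAT)

print(f"{temp_rise_per_hour(400, 30):.1f} C/hour if perfectly sealed")
```

The point stands either way: every watt the PSU draws ends up as heat in the room; whether you notice depends on how fast the room sheds it.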
 

mkopec

<Gold Donor>
25,384
37,453
Yes, but heat dissipation in a case also works on negative/positive air pressure; an open case kind of defeats that purpose, so your components heat up more. In any case, if your computer is literally heating up the ambient temperature inside a, say, 10x12 or bigger room, you're doing something wrong.
 

Crone

Bronze Baronet of the Realm
9,707
3,210
If you are looking for an ultra-portable monitor, are there any options? Other than a monitor carry thing? Kinda wanna build an ITX case for traveling.
 

Big Phoenix

Pronouns: zie/zhem/zer
<Gold Donor>
44,529
92,935
No one ever said that. Besides, if you are in a room where a fucking PC can upset the thermal balance, upgrade your living conditions.
Dunno, buying a new AC for my house and upgrading my insulation is a pretty big investment.
 

Mist

Eeyore Enthusiast
<Gold Donor>
30,362
22,118
Maybe you guys are stupid and don't understand the performance per watt differential between Nvidia and AMD so here you go:

View attachment 208277
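For readers who can't see the attachment: performance per watt is just delivered performance divided by board power. A minimal sketch with made-up placeholder numbers (not the chart's actual data):

```python
# Sketch of the "performance per watt" metric the chart compares.
# The fps/wattage pairs below are hypothetical placeholders.

def perf_per_watt(avg_fps: float, board_power_w: float) -> float:
    """Frames per second delivered per watt of GPU board power."""
    return avg_fps / board_power_w

cards = {
    "GPU A": (100, 160),   # (avg fps, board power in watts) - made up
    "GPU B": (100, 225),
}
for name, (fps, watts) in cards.items():
    print(f"{name}: {perf_per_watt(fps, watts):.2f} fps/W")
```

Two cards hitting the same frame rate can still differ a lot in how much heat they dump while doing it.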
 

mkopec

<Gold Donor>
25,384
37,453
Maybe you guys are stupid and don't understand the performance per watt differential between Nvidia and AMD so here you go:

View attachment 208277

So who buys AMD cards anyway? Who are you preaching at? AMD has been shit since like the 7950 series, and even then they were second-rate cards. The only reason anyone had AMD cards is that they were the price/performance leaders for low-end cards, like the 5850, 6850, etc., when you didn't have the $400-$600 to spend on the Nvidias. (Now up to $1,000, because no competition.)

I'm fucking cheering for them to make something good so Nvidia actually has some competition, which is good for us, the consumers, in the long run.
 
  • 2 Solidarity
Reactions: 2 users

Argarth

On the verandah
1,206
1,045
Haven't seen any evidence that we'll get any real competition for Nvidia's top-end cards until the generation AFTER Navi anyway.

No reason now to buy AMD for VRR, Radeon VII was just a re-tooled Vega leaf-blower, and the high-end Navi cards due next year appear to still be positioned around 2080 performance at best (speculation). I still hope Navi turns out to be a great release, and with the console deals, makes them a bunch more money to re-invest.

This is a decent short summary: AMD's next gen GPU architecture 2020-21. I would LOVE to see a blockbuster, truly bleeding-edge GPU from AMD one day.
 
Last edited:

Whidon

Blackwing Lair Raider
1,880
2,906
I bought an AMD card for my current main comp in 2017. I turned down an offer to replace my RX 580 with a Vega 64 for 280 bux because it's just been that disappointing to me. Today, I play games almost exclusively on my PS4 Pro, despite it being, in theory, a detuned version of my desktop's GPU paired with a super crappy CPU.

When I first started playing Overwatch I got random "black screen" bugs and all sorts of other issues that made the game unplayable as a competitive multiplayer game. AMD driver updates fixed this, yet the issues have never 100% gone away. I still can't get rid of the issue where frame rate and loading get far worse after the computer's been on for a day or so, especially after each new load of a game. It can drop from 120-150 on the highest settings in OW all the way down to 40-100, a hugely fluctuating range. I got an SSD to play on and upgraded to 16GB of RAM, and neither solved the issue.

It's not strictly OW either; I've noticed more issues playing games on this card than on any card from Nvidia I have ever owned. Sure, 6 TFLOPS of power for only $200 in 2016 sounded amazing on paper, but in reality it feels like you often can't actually unleash that theoretical power. I almost broke down and bought an RTX 2060 recently because I know it would have far fewer "kinks" than I've had playing games on the RX 580.

That being said, I hate the idea of buying something from Nvidia. As Mist points out in her chart, they're deliberately giving us low-power versions of their tech right now because AMD is so far behind. Nvidia should be giving us a $200 mid-range card that uses 180-220W like AMD is doing.
 

Folanlron

Trakanon Raider
2,218
639
I bought an AMD card for my current main comp in 2017. I turned down an offer to replace my RX 580 with a Vega 64 for 280 bux because it's just been that disappointing to me. Today, I play games almost exclusively on my PS4 Pro, despite it being, in theory, a detuned version of my desktop's GPU paired with a super crappy CPU.

When I first started playing Overwatch I got random "black screen" bugs and all sorts of other issues that made the game unplayable as a competitive multiplayer game. AMD driver updates fixed this, yet the issues have never 100% gone away. I still can't get rid of the issue where frame rate and loading get far worse after the computer's been on for a day or so, especially after each new load of a game. It can drop from 120-150 on the highest settings in OW all the way down to 40-100, a hugely fluctuating range. I got an SSD to play on and upgraded to 16GB of RAM, and neither solved the issue.

It's not strictly OW either; I've noticed more issues playing games on this card than on any card from Nvidia I have ever owned. Sure, 6 TFLOPS of power for only $200 in 2016 sounded amazing on paper, but in reality it feels like you often can't actually unleash that theoretical power. I almost broke down and bought an RTX 2060 recently because I know it would have far fewer "kinks" than I've had playing games on the RX 580.

That being said, I hate the idea of buying something from Nvidia. As Mist points out in her chart, they're deliberately giving us low-power versions of their tech right now because AMD is so far behind. Nvidia should be giving us a $200 mid-range card that uses 180-220W like AMD is doing.

Want to see what a 180-220W Nvidia card will do? Go test-run a laptop and play on "battery" mode.
 

Fucker

Log Wizard
11,518
26,006
Maybe you guys are stupid and don't understand the performance per watt differential between Nvidia and AMD so here you go:

View attachment 208277

We aren't stupid. Performance per watt doesn't mean jack and shit if you don't live in a fucking shoebox without modern amenities like air conditioning and windows and stuff. You keep bringing it up like it is some valid metric for people who own a house built after 1500. It isn't.

I fully understand it as a metric when you have a building full of CPUs or GPUs...but I'd wager that's none of us in terms of home use.

As I said before, if your PC is heating up your room enough to make life difficult for you, upgrade your fucking room or game on an Atari 2600.
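One hedged way to put a number on "doesn't mean jack" for home use: the electricity cost of a card drawing an extra 100 W. Both the rate and the hours below are assumptions for illustration, not figures from this thread.

```python
# Yearly electricity cost of an extra 100 W of GPU draw, at an assumed
# $0.12/kWh and 4 hours of gaming a day (both illustrative numbers).

def yearly_cost(extra_watts: float, hours_per_day: float, usd_per_kwh: float) -> float:
    """Extra dollars per year from the added draw."""
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * usd_per_kwh

print(f"${yearly_cost(100, 4, 0.12):.2f} per year")
```

At those assumptions it comes to under $20 a year, which is roughly the argument being made: for a single home box, the wattage gap is noise unless the room can't shed the heat.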
 

Lanx

<Prior Amod>
60,521
132,421
We aren't stupid. Performance per watt doesn't mean jack and shit if you don't live in a fucking shoebox without modern amenities like air conditioning and windows and stuff. You keep bringing it up like it is some valid metric for people who own a house built after 1500. It isn't.

I fully understand it as a metric when you have a building full of CPUs or GPUs...but I'd wager that's none of us in terms of home use.

As I said before, if your PC is heating up your room enough to make life difficult for you, upgrade your fucking room or game on an Atari 2600.
yea, be an internet meme and game in the basement
 

a_skeleton_05

<Banned>
13,843
34,508
Well for me it is.

Gotcha. Yeah, that's just something that's largely unavoidable, AMD or not. My system turns my room into a furnace in the summer without things in the room itself to deal with it. While AMD is, for the most part, going to make that a bigger issue, it's not going to be a large enough difference to really be noticeable. The bigger issues with the extra heat generation all happen inside the case, on the board itself, and in the production/development process.
 

Big Phoenix

Pronouns: zie/zhem/zer
<Gold Donor>
44,529
92,935
R7 3700X: 3.6/4.4 65W $329

R7 3800X: 3.9/4.5 105W $399

R9 3900X: 3.8/4.6 105W $499

7/7 release.
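Quick arithmetic on that list (core counts from the same announcement): dollars per core, which shows why the 3900X undercuts the 3800X on value despite the higher sticker price.

```python
# Dollars per core for the announced Ryzen 3000 parts.
parts = [
    ("R7 3700X", 8, 329),    # (name, cores, USD MSRP)
    ("R7 3800X", 8, 399),
    ("R9 3900X", 12, 499),
]
for name, cores, usd in parts:
    print(f"{name}: ${usd / cores:.2f} per core")
```

The 12-core 3900X works out to roughly the same per-core price as the 3700X, with the 3800X paying a premium for its higher clocks and TDP headroom.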
 
  • 1 Like
Reactions: 1 user

ver_21

Molten Core Raider
975
-361
tl;dr Ryzen 3rd gen has performance parity with, or a slight advantage over, Intel's best gaming CPUs at half the power and a lower MSRP.

Per AMD/Lisa Su Keynote @ Computex:

Server:
* 50 EPYC cloud instances in production right now
* 1.5 exaflop supercomputer w/ advanced EPYC and Instinct
* EPYC in Microsoft Azure
* ROME 2x to 4x faster than Intel's Cascade Lake, launching Q3 2019

GPU:
* Radeon confirmed in Google Stadia
* NAVI: confirmed next gen PlayStation w/ semi-custom Navi and Zen 2; all-new "Radeon DNA" (RDNA) architecture--not GCN; PCI Express 4.0; new optimized compute unit design; new cache hierarchy; 1.25x faster than Vega, 1.5x more efficient than Vega...

Navi is the Radeon RX 5000 family: RX 5700 10% faster than the RTX 2070 in a Strange Brigade demo, early version...July release...more at E3 June 10.

CPU:
* greater Microsoft partnership with AMD
* greater Asus partnership with AMD
* X570 mobos, 30 designs from Asus alone
* greater Acer partnership with AMD (Acer President Jerry Kao totally dissed Intel and Nvidia on stage)
* AM4, PCIe 4.0
* core w/ doubled floating point, doubled cache, +15% IPC (instructions per clock) uplift

Ryzen 7 3700X: 8 cores 16 threads, 3.6GHz base, 4.4GHz boost, 65 watts, 36MB cache, 15-18% gains over 2700X, beats Intel 9700K by 30% on Cinebench R20...$329
Ryzen 7 3800X: 8 cores 16 threads, 3.9 base, 4.5 boost, 105 watts, 36MB cache, 30% improvement over 2700X, matches Intel 9900K in PUBG demo...$399
9900K w/ 2080 Ti versus Ryzen 7 3800X w/ Navi RX 5700: 3DMark bandwidth test, AMD system wins 25 fps to 15 fps...

Ryzen 9 3900X: 12 cores 24 threads, 3.8GHz base 4.6GHz boost, 70MB cache, 105 watts, versus 9920X in blender--3900X 18% faster...$499

CPUs available July 7.
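A sanity-check sketch of how those claims compose: single-thread gain is roughly (1 + IPC uplift) x clock ratio. Using the quoted +15% IPC and a 4.3GHz (2700X) to 4.4GHz (3700X) boost bump (the clock pairing is my assumption for illustration), you land right in the quoted 15-18% range.

```python
# Rough single-thread uplift model: (1 + IPC gain) * (new clock / old clock).

def relative_perf(ipc_gain: float, old_ghz: float, new_ghz: float) -> float:
    """Combined speedup vs the old part, as a multiplier."""
    return (1 + ipc_gain) * (new_ghz / old_ghz)

# +15% IPC, 4.3 GHz -> 4.4 GHz boost (2700X -> 3700X, illustrative)
uplift = relative_perf(0.15, 4.3, 4.4)
print(f"{(uplift - 1) * 100:.0f}% faster single-thread")
```

This ignores memory latency, boost residency, and workload mix, so treat it as a plausibility check on the keynote numbers rather than a prediction.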
 
Last edited: