New Nvidia GeForce RTX 2080 cards released

a_skeleton_05

<Banned>
13,843
34,508
I dunno there's a place called the internet, it's got some places to figure stuff out.
Final Fantasy XV Benchmark Performance Analysis

You're going to have to excuse me for not going to read an entire analysis of a little-used benchmarking system (from a completely different source than the graphs you posted, meaning it's fucking useless as a direct comparison as the methodologies & setup are different) instead of just waiting for a trustworthy version of some basic fucking FPS numbers that everyone relies on.

I think I figured out this internet thing. This might be helpful for you
 
  • 2Like
Reactions: 1 users

fanaskin

Well known agitator
<Silver Donator>
55,850
137,944
You're going to have to excuse me for not going to read an entire analysis of a little-used benchmarking system (from a completely different source than the graphs you posted, meaning it's fucking useless as a direct comparison as the methodologies & setup are different) instead of just waiting for a trustworthy version of some basic fucking FPS numbers that everyone relies on.

I think I figured out this internet thing. This might be helpful for you

i dunno, saying you're lazy doesn't compute directly to "you're not smart"
 
  • 1Smuggly
  • 1Like
  • 1Worf
Reactions: 3 users

fanaskin

Well known agitator
<Silver Donator>
55,850
137,944
4 stages of dom
1) I demand information
2) Here is information
3) You're not as smart as you think you are!
4) Picard response for good measure
 
  • 2Salty
  • 1Like
Reactions: 2 users

a_skeleton_05

<Banned>
13,843
34,508
i dunno, saying you're lazy doesn't compute directly to "you're not smart"

I didn't say you weren't smart, I said you weren't that bright. Some measure of it would clue you in that looking at the numerical results of a benchmark that uses many different factors to come to its result, without knowing exactly what those factors were (hint: anything other than just the GPU used), is useless outside of vague comparisons.

Here's what the makers of the benchmark have to say about it:

Points of Caution

  • The scores shown here are the average values calculated for the various GPUs. The resulting aggregate scores may be affected by the specifications for the CPUs, memory and graphics drivers etc. used in individual tests.
  • These scores are for reference purposes only and do not constitute a guarantee of performance when running the release version of the game.

Do you understand now? They are unreliable numbers without knowing all of the variables.
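To illustrate the point about aggregate scores, here's a minimal sketch (with invented weights; the real scoring formula is not published) of how a composite benchmark score can differ for the exact same GPU on two different host systems:

```python
# Hypothetical composite score: a benchmark like FFXV's folds CPU, memory and
# driver effects into one number. The weights below are invented for
# illustration only; the actual formula is not public.
def composite_score(gpu_fps, cpu_factor, mem_factor):
    # The final score is not a pure GPU measurement.
    return gpu_fps * cpu_factor * mem_factor

# Same GPU, two different host systems:
score_a = composite_score(gpu_fps=100, cpu_factor=1.00, mem_factor=1.00)
score_b = composite_score(gpu_fps=100, cpu_factor=0.85, mem_factor=0.95)

print(score_a)  # 100.0
print(score_b)  # 80.75
```

Same card, ~20% score gap, purely from the rest of the system — which is why averaged scores from unknown configs can't be compared directly.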
 

a_skeleton_05

<Banned>
13,843
34,508
From the dude that handles Afterburner re: the new OC stuff

RTSS 6.7.0 beta 1
I shared my impressions of NVIDIA Scanner technology from a software developer's point of view. Now I'd like to post my impressions from an end user and overclocker POV.

I was never a real fan of automated overclocking, because reliability was always the weakest spot of overclocking automation. NVIDIA Scanner is not a revolution like many newsmakers are calling it, simply because different forms of automatic overclocking have existed in both NVIDIA and AMD drivers for a couple of decades, if not more (for example, NVIDIA has had it built in since the CoolBits era, and AMD had it in Overdrive). However, it was more of a toy and a marketing thing, ignored by serious overclockers, because everybody was used to the fact that traditionally it crashed much more than it actually worked. Various third-party tools also tried to implement their own solutions for automating the overclocking process (the best of them being the excellent ATITool by my good old friend w1zzard), but reliability of the result was the key problem there too.

So I was skeptical about the new NVIDIA Scanner too and had serious doubts about including it in MSI Afterburner. However, I changed my mind after trying it in action on my own system with an MSI RTX 2080 Ventus card. Yes, it is not a revolution, but it is an evolution of this technology for sure.

During approximately 2 weeks of development, I ran a few hundred automatic overclocking detection sessions. None of them resulted in a system crash during overclocking detection. None of them resulted in wrongly detecting abnormally high clocks as stable. The worst thing I observed during automatic overclocking detection was a GPU hang recovered during the scanning process, and the scanner was always able to continue scanning after recovering the GPU at the software level and lower the clocks until finding a stable result. In all cases it detected a repeatable GPU overclock of approximately +170MHz on my system, resulting in the GPU clock floating in the 2050-2100MHz range in 3D applications after applying that overclock.

Even in the worst case (i.e. a potential system crash during overclocking detection) the Scanner API contains recovery mechanisms, meaning that you may simply click "Scan" one more time after rebooting the system and it will continue scanning from the point before the crash. But I simply couldn't make it crash to test that case, so I emulated it by killing the OC scanner process during automatic overclocking detection. So the embedded NVIDIA workload and test algorithms used inside the Scanner API look really promising to me. It will be interesting to read the impressions of the rest of the overclockers and RTX 2080 card owners who try NVIDIA Scanner in action in the coming days/weeks.
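The scan process described above can be sketched roughly like this; the step size, offset limit, and stability test are all invented stand-ins for whatever the real Scanner API does internally with its embedded workload:

```python
# Simplified sketch of an automated overclock scan in the spirit of the
# post above: raise the clock offset stepwise, run a stability test at each
# step, and stop at the first failure, keeping the highest offset that passed.
def scan_for_stable_offset(is_stable, start=0, step=15, limit=300):
    best = start
    offset = start
    while offset + step <= limit:
        offset += step
        if is_stable(offset):
            best = offset          # highest offset that passed so far
        else:
            break                  # first failure ends the scan
    return best

# Pretend this card is stable up to a +170 MHz offset:
result = scan_for_stable_offset(lambda mhz: mhz <= 170)
print(result)  # 165 -- last 15 MHz step below the stability limit
```

A real scanner also has to recover a hung GPU and checkpoint its progress so a crash doesn't restart the scan from zero, which is the part the post singles out as the actual improvement.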
 

Utnayan

I Love Utnayan he’s awesome
<Gold Donor>
16,290
12,054
4 stages of dom
1) I demand information
2) Here is information
3) You're not as smart as you think you are!
4) Picard response for good measure

Four rages of Fanaskin

1) Fanboy a proven unethical company taking advantage of a current lack of competition.

2) Throw up some useless benchmark bullshit which shows... nothing.

3) Continues to not understand, or purposefully trolls, a thread (doing a good job of trolling, honestly) that unequivocally shows that Nvidia is not only a piece of shit company, but that the 2000 series is a bust, with unethical protections in place, and extended, to keep real information from reaching consumers.

4) You should Picard yourSELF.
 
  • 3Like
Reactions: 2 users

Erronius

Macho Ma'am
<Gold Donor>
16,460
42,368
I dunno there's a place called the internet, it's got some places to figure stuff out.
Final Fantasy XV Benchmark Performance Analysis

[reaction gif]
 
  • 1Like
Reactions: 1 user

fanaskin

Well known agitator
<Silver Donator>
55,850
137,944
Four rages of Fanaskin

1) Fanboy a proven unethical company taking advantage of a current lack of competition.

this isn't really true either; it's just that everyone who posts in this thread is extremely jaded for some reason, like the dregs like being indignant.
 

a_skeleton_05

<Banned>
13,843
34,508
this isn't really true either; it's just that everyone who posts in this thread is extremely jaded for some reason, like the dregs like being indignant.

Wanting reliable benchmarks for video cards that cost upwards of $1200 isn't being jaded, it's being a smart consumer. You're being called a fanboy because of the way you're getting incredibly defensive about anything negative or cautionary being said about these products.

If you want to rely on random bullshit on the internet to make your purchasing decisions, nobody will stop you. But if you're going to peddle that shit as something other people should pay attention to, then don't be surprised if someone calls it out. But please, if you see some benchmarks that let us know how many rods to the hogshead the 2080 Ti gets: let us know.

I'm tired of video cards costing the price of the rest of a computer

The good news is that GPUs are getting to the point where lower-end cards can power the majority of games at the full potential of the monitors most people use. 1080p@60 takes very little power, and 1440p@100-144 / 4K@60 will be very doable for cheap with the generation after Turing. Stuff like ray tracing won't really work out at that low end, but that's not a big deal. But yeah... it's a painful pill to swallow for the higher-end enthusiast.
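A rough back-of-the-envelope for the resolution/refresh claim above; pixels per second is only a crude proxy for GPU load (shading cost per pixel varies wildly), but it shows the scale of the gap between those targets:

```python
# Crude pixel-throughput comparison: how much more raw pixel output do
# 1440p@144 and 4K@60 demand relative to a 1080p@60 baseline?
def pixels_per_second(width, height, hz):
    return width * height * hz

base = pixels_per_second(1920, 1080, 60)   # 1080p @ 60 Hz baseline
print(pixels_per_second(2560, 1440, 144) / base)  # ~4.27x the pixel rate
print(pixels_per_second(3840, 2160, 60) / base)   # 4.0x the pixel rate
```

So a card that comfortably drives 1080p@60 needs roughly four times the raw throughput headroom for the high-end targets, which is consistent with the low end catching up first.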
 
  • 1Like
Reactions: 1 user

fanaskin

Well known agitator
<Silver Donator>
55,850
137,944
Wanting reliable benchmarks for video cards that cost upwards of $1200 isn't being jaded, it's being a smart consumer.


then wait for the NDA to drop, i was just posting something as it came in and you get mad literally for no reason. you think i'm hiding something from you?
 

Punko

Macho Ma'am
<Gold Donor>
7,912
12,564
I'm running every game at 4K, 30 fps, on a GeForce 1080. Can't imagine upgrading for the first couple of years; going from 30 to 60 fps is worth very little to me.

A 1080 Ti can already run GTAV at just over 60 fps average, with everything maxed.
 
  • 3Like
  • 1Picard
Reactions: 3 users

fanaskin

Well known agitator
<Silver Donator>
55,850
137,944
this card looks like an early adopter cycle; if you already own a 1080 Ti I would wait for 7nm Nvidia. I'm suspicious there's already a Ti model ready and they want to get to 7nm quickly because of AMD.
 

a_skeleton_05

<Banned>
13,843
34,508
then wait for the NDA to drop, i was just posting something as it came in and you get mad literally for no reason. you think i'm hiding something from you?

"Rather useless without knowing exactly how FFXV's benchmark applies score weights" qualifies as mad to you? You have some pretty thin skin, dude. If you thought I was attacking you for posting it: I wasn't. I was attacking the validity of the results. The industry is full of useless garbage cluttering things up, especially shortly before a product release, when everyone is scrambling for clicks and will post anything. Add that to actual tech websites doing shoddy benchmarking and you get a large amount of misinformation, and it needs to be called out, or we just get more situations like the Ryzen release.


I'm running every game at 4K, 30 fps, on a GeForce 1080. Can't imagine upgrading for the first couple of years; going from 30 to 60 fps is worth very little to me.

A 1080 Ti can already run GTAV at just over 60 fps average, with everything maxed.

It's been my experience that unless you've spent at least a little time playing at high refresh rates (above 60), you don't really know what you're missing. By that I mean that once you breach that barrier it's hard to go back, but you don't enjoy the games any more than you did when you only knew 60 or even 30. If you're enjoying your experience, then that's all that matters.

I'm personally skipping this generation, as my 1080 Ti is good enough for now and I'd rather upgrade when I buy a new monitor; it will probably be a high-refresh 4K panel and I'll need the extra power then.
 

fanaskin

Well known agitator
<Silver Donator>
55,850
137,944
"Rather useless without knowing exactly how FFXV's benchmark applies score weights" qualifies as mad to you?

yeah cause you can just look up the score weights

anyways, it looks like 35ish% faster in generic rasterization, looks like a good card all said and done
 
  • 1Like
  • 1Picard
Reactions: 1 users

a_skeleton_05

<Banned>
13,843
34,508
yeah cause you can just look up the score weights

anyways, it looks like 35ish% faster in generic rasterization, looks like a good card all said and done

The page you linked to is only useful if the systems behind the benchmarks you linked to previously were the exact same systems. It's not just about the GPU.

You simply cannot compare numbers derived from two different testing setups without them being unreliable. To understand the results of the benchmark, you would need to reverse engineer them to figure out exactly what went into the score and the weights applied to each of those elements. That's also impossible, because the benchmark results are the average across all the systems that have been testing the cards. That means you're comparing benchmarks from GPUs with unfinished drivers, run on an assortment of unknown system configs, against a single system's results. At most, the benchmarks are useful for a broad, averaged comparison to come to an estimated % increase, and that's assuming there's no fuckery going on behind the scenes with beta drivers skewing results. I'm done trying to explain this; I feel like I'd need to break out flash cards or something.
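The "estimated % increase" caveat can be made concrete with a toy calculation; every score below is invented for illustration, and the spread across hypothetical host systems is exactly why only a ballpark figure survives:

```python
# Averaged scores from mixed, unknown system configs only support a rough
# percentage estimate, not a precise comparison. All numbers are invented.
old_card_scores = [5200, 5400, 4900, 5100]   # old GPU, assorted unknown systems
new_card_scores = [7000, 7300, 6600, 6900]   # new GPU, different unknown systems

old_avg = sum(old_card_scores) / len(old_card_scores)
new_avg = sum(new_card_scores) / len(new_card_scores)
uplift = (new_avg / old_avg - 1) * 100

print(f"estimated uplift: ~{uplift:.0f}%")  # a ballpark, not a measurement
```

The per-system variance in each list is larger than the precision the single averaged number implies, which is the whole objection to treating such scores as direct FPS comparisons.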

And yes, it does look to be a good card if the early information holds up (price concerns aside).
 
  • 3Like
Reactions: 2 users

Brahma

Obi-Bro Kenobi-X
11,916
41,922
this isn't really true either; it's just that everyone who posts in this thread is extremely jaded for some reason, like the dregs like being indignant.

Because Nvidia is pulling a bunch of shady crap since AMD presents no threat to them anytime soon.

The funniest one was the head guy at Nvidia saying their next-gen cards were waaaaay out from release. Bam. Next quarter the cards were announced and soon to be released. Ensuring people snagged up the 10xx cards. That was bullshit.

How about that GeForce Partner Program they tried to pull?

Fucking over sites with crazy NDAs so they control the narrative.

People are jaded and fed up with their shenanigans.
 
  • 3Like
  • 3Solidarity
  • 1Salty
Reactions: 6 users

Springbok

Karen
<Gold Donor>
9,010
12,550
The 2080 is the worst deal I think I've ever seen in a new GPU: marginally better performance than a 1080 Ti (and worse in some games) for $200 more. Line up, boys, we've got a hot deal comin'!
 
  • 3Like
Reactions: 2 users