NVidia GeForce RTX 30x0 cards

Denamian

Night Janitor
<Nazi Janitors>
7,184
18,963
Ashcktually...

Gen 1 and 2 Ryzen/Epyc were made in the US, in NY, on GlobalFoundries' 14/12nm process. It's still used to produce the I/O dies that link the chiplets in the latest Ryzen/Epyc CPUs.

That's Fab 8 in Malta, pretty close by. That contract ends in 2024. They were originally planning to do 7nm there, but those plans were scrapped as it was going to be too expensive.
 

Big Phoenix

Pronouns: zie/zhem/zer
<Gold Donor>
44,628
93,230
Oh, I didn't even know that. I think GF did make a mistake taking over the IBM fabs and development team. They thought after sinking billions of dollars they would get a 7nm process out of it, but GF has been steadily losing money.

Would be great if a third foundry could create a stable 7nm process after TSMC and Samsung, but it looks like not even Intel with all their cash can do it.
Problem is there's only so much talent to go around as this is the bleeding edge of tech. Even if you have the talent, you can still end up constrained in other ways, such as production; iirc there's literally only one company in the world (ASML) that makes the machines used for cutting-edge nodes.

That's Fab 8 in Malta, pretty close by. That contract ends in 2024. They were originally planning to do 7nm there, but those plans were scrapped as it was going to be too expensive.
I believe GF is trying to go public. AMD is the majority of their business, so without them they're going to be screwed.
 

Regime

LOADING, PLEASE WAIT...
<Aristocrat╭ರ_•́>
16,421
37,796
 

Neranja

<Bronze Donator>
2,605
4,143
Problem is there's only so much talent to go around as this is the bleeding edge of tech.
I think this is literally Intel's problem right now. Not only in the manufacturing process, but also in the design and development. They fucked up basically everything except x86 processors in the last decade, and even their performance leadership there resulted in security problems like Spectre/Meltdown.

I mean, Intel even wanted to eat Nvidia's lunch in the GPU HPC market by bringing their x86 cores to a massively parallel design with Xeon Phi (Larrabee/Knights Landing). That didn't work out well.
 

Mist

Eeyore Enthusiast
<Gold Donor>
30,410
22,196
I think this is literally Intel's problem right now. Not only in the manufacturing process, but also in the design and development. They fucked up basically everything except x86 processors in the last decade, and even their performance leadership there resulted in security problems like Spectre/Meltdown.

I mean, Intel even wanted to eat Nvidia's lunch in the GPU HPC market by bringing their x86 cores to a massively parallel design with Xeon Phi (Larrabee/Knights Landing). That didn't work out well.
The new, actually old, CEO of Intel is pulling engineers out of retirement just to unfuck the company from what I read.
 

spronk

FPS noob
22,599
25,652
I've got four (modern) laptops lying around and was considering setting them all up in their own room for more or less 24/7 Ethereum mining. How feasible is it to do this with laptops vs. desktops, even if the laptops are recent/powerful? Can I expect reasonable returns? I'm hoping to pull in $8-10 a day combined between all four of them and consider that to be "reasonable returns" personally.

I'd guess you'd spend more on regular grid electricity than you'd actually turn in profit. You need either stripped-down PC racks or free power (solar, geothermal, etc.) to make it decent. A laptop isn't gonna be great since mining is essentially brute-force hashing (SHA-256 for Bitcoin, Ethash for Ethereum) and GPUs have orders of magnitude more ALUs (arithmetic logic units) than CPUs. Custom ASICs are even better, which is where the farms of racks running mining ops in China off solar come in. Chinese miners account for something like 65% of mining compute power for Bitcoin/Ethereum/etc.
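Rough back-of-the-envelope math on that, as a quick Python sketch. Every figure here (hashrate, power draw, payout per MH/s, electricity price) is a made-up assumption for illustration, not a number from this thread; it just shows how fast grid power eats into laptop mining returns:

```python
# Rough Ethereum-mining profitability check. Every number below is an
# illustrative assumption, not a measurement from this thread.
def daily_profit(hashrate_mhs, power_watts, usd_per_mhs_day, usd_per_kwh):
    """Estimate net USD per day for one machine."""
    revenue = hashrate_mhs * usd_per_mhs_day        # payout scales ~linearly with hashrate
    energy_kwh = power_watts * 24 / 1000            # watts -> kWh over a day
    cost = energy_kwh * usd_per_kwh
    return revenue - cost

# Hypothetical laptop GPU vs. a desktop 3080, both on $0.13/kWh grid power:
laptop  = daily_profit(hashrate_mhs=20, power_watts=120, usd_per_mhs_day=0.05, usd_per_kwh=0.13)
desktop = daily_profit(hashrate_mhs=95, power_watts=230, usd_per_mhs_day=0.05, usd_per_kwh=0.13)
print(f"laptop:  ${laptop:.2f}/day")   # ~$0.63/day
print(f"desktop: ${desktop:.2f}/day")  # ~$4.03/day
```

With assumptions like those, four laptops land around $2.50/day combined, nowhere near an $8-10/day target, before even counting wear on laptop cooling.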

 

Denamian

Night Janitor
<Nazi Janitors>
7,184
18,963
The new, actually old, CEO of Intel is pulling engineers out of retirement just to unfuck the company from what I read.

Yeah, they're trying to bring back some people with actual expertise in what they do instead of having the company run by people who just want to make their stonk go brrrrrrr.

I believe GF is trying to go public. AMD is the majority of their business, so without them they're going to be screwed.

Yeah, they're going to need to secure a new cash cow soon. Most of their fabs are old and can't make anything beyond 12nm. I could see them going for a partnership with a car manufacturer. Some of them have been getting fucked by the shortages, and I could see Chevy or Ford making "the brains of the car are manufactured in the US" a selling point.
 

Janx

<Silver Donator>
6,301
16,933
3080 PC is so close, but it's not scheduled until Monday =/ One good thing about living near Memphis is FedEx packages usually get delivered super early.
 

Pescador

Trakanon Raider
234
239
Need some feedback on two orders I placed. I'm trying to decide which to keep. The first build is arriving this weekend and the second one will ship sometime in April. I currently have 2x 60Hz 1080p monitors but will probably upgrade to 1440, maybe 120/144Hz. Don't currently care about 4k.

Build 1 - $1,464:
  1. 10th Gen Intel Core i7 10700F (8-Core, 16MB Cache, 2.9GHz to 4.8GHz w/Turbo Boost Max 3.0)
  2. 16GB Dual Channel DDR4 XMP at 3200MHz
  3. Dark Side of the Moon chassis with High-Performance CPU Liquid Cooling and 550W Power Supply
  4. 512GB M.2 PCIe NVMe SSD
  5. NVIDIA(R) GeForce RTX(TM) 3060 Ti 8GB GDDR6
Build 2 - $1,968:
  1. 11th Gen Intel Core i7 11700F (8-Core, 16MB Cache, 2.5GHz to 4.9GHz w/Intel Turbo Boost Max)
  2. 16GB Dual Channel DDR4 XMP at 3200MHz
  3. Dark Side of the Moon chassis with High-Performance CPU Liquid Cooling and 1000W Power Supply
  4. 512GB M.2 PCIe NVMe SSD
  5. NVIDIA GeForce RTX 3080 10GB GDDR6X
Basically, is it worth $500 to go from 3060 Ti --> 3080, 550W --> 1000W, and 10700F --> 11700F for 1440p gaming? I would have preferred an AMD processor, but they didn't have any available that seemed like a "deal" the way these did (I'm using that term loosely since prices are absurd for anything these days...)

I suppose I would set it up to mine eth during the day while I'm working.

One reason I'm asking is because I've read that Dell uses custom cards with worse cooling, and the 3080 in particular can have throttling issues due to the airflow of the case.
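On the 550W vs 1000W part, here's a quick sanity check that sums ballpark board-power figures. The wattages are typical published numbers / assumptions, not Dell's specs, but they show why the 3080 config ships with the bigger supply:

```python
# Rough PSU headroom check for the two configs. Component wattages are
# ballpark assumptions, not Dell's actual numbers.
def psu_check(name, parts_watts, psu_watts, headroom=0.80):
    """Flag a build if estimated peak load exceeds ~80% of PSU capacity."""
    load = sum(parts_watts.values())
    budget = psu_watts * headroom
    status = "OK" if load <= budget else "TIGHT"
    print(f"{name}: ~{load}W load vs {psu_watts}W PSU -> {status}")

psu_check("Build 1 (3060 Ti)", {"gpu": 200, "cpu": 150, "rest": 75}, psu_watts=550)   # OK, barely
psu_check("Build 2 (3080)",    {"gpu": 320, "cpu": 175, "rest": 75}, psu_watts=1000)  # plenty of headroom
psu_check("3080 on the 550W unit (hypothetical)",
          {"gpu": 320, "cpu": 175, "rest": 75}, psu_watts=550)                        # TIGHT
```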
 

Janx

<Silver Donator>
6,301
16,933
Need some feedback on two orders I placed. I'm trying to decide which to keep. The first build is arriving this weekend and the second one will ship sometime in April. I currently have 2x 60Hz 1080p monitors but will probably upgrade to 1440, maybe 120/144Hz. Don't currently care about 4k.

Build 1 - $1,464:
  1. 10th Gen Intel Core i7 10700F (8-Core, 16MB Cache, 2.9GHz to 4.8GHz w/Turbo Boost Max 3.0)
  2. 16GB Dual Channel DDR4 XMP at 3200MHz
  3. Dark Side of the Moon chassis with High-Performance CPU Liquid Cooling and 550W Power Supply
  4. 512GB M.2 PCIe NVMe SSD
  5. NVIDIA(R) GeForce RTX(TM) 3060 Ti 8GB GDDR6
Build 2 - $1,968:
  1. 11th Gen Intel Core i7 11700F (8-Core, 16MB Cache, 2.5GHz to 4.9GHz w/Intel Turbo Boost Max)
  2. 16GB Dual Channel DDR4 XMP at 3200MHz
  3. Dark Side of the Moon chassis with High-Performance CPU Liquid Cooling and 1000W Power Supply
  4. 512GB M.2 PCIe NVMe SSD
  5. NVIDIA GeForce RTX 3080 10GB GDDR6X
Basically, is it worth $500 to go from 3060 Ti --> 3080, 550W --> 1000W, and 10700F --> 11700F for 1440p gaming? I would have preferred an AMD processor, but they didn't have any available that seemed like a "deal" the way these did (I'm using that term loosely since prices are absurd for anything these days...)

I suppose I would set it up to mine eth during the day while I'm working.

One reason I'm asking is because I've read that Dell uses custom cards with worse cooling, and the 3080 in particular can have throttling issues due to the airflow of the case.
Pretty sure the 3080 absolutely crushes the 3060. The 3060/3070 series weren't even on my radar this gen due to the 3080's performance. Guess it depends on what you're doing. VR? Might want to stick with the 3080, or if you upgrade to high-refresh-rate 1440p monitors.
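For a rough value comparison, here's a $-per-frame sketch using the two build totals and assumed 1440p average framerates. The FPS numbers are illustrative guesses, not benchmark results:

```python
# Quick cost-per-frame comparison at 1440p. The FPS figures are rough
# assumptions for illustration, not benchmarks.
builds = {
    "Build 1 (3060 Ti)": {"price": 1464, "avg_fps_1440p": 100},  # assumed FPS
    "Build 2 (3080)":    {"price": 1968, "avg_fps_1440p": 140},  # assumed FPS
}
for name, spec in builds.items():
    dollars_per_frame = spec["price"] / spec["avg_fps_1440p"]
    print(f"{name}: ${dollars_per_frame:.2f} per average 1440p frame")
# With these assumptions both builds land around $14-15 per frame, so the extra
# $500 mostly buys headroom for high-refresh 1440p, ultrawide, or VR.
```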
 

Springbok

Karen
<Gold Donor>
9,021
12,587
Need some feedback on two orders I placed. I'm trying to decide which to keep. The first build is arriving this weekend and the second one will ship sometime in April. I currently have 2x 60Hz 1080p monitors but will probably upgrade to 1440, maybe 120/144Hz. Don't currently care about 4k.

Build 1 - $1,464:
  1. 10th Gen Intel Core i7 10700F (8-Core, 16MB Cache, 2.9GHz to 4.8GHz w/Turbo Boost Max 3.0)
  2. 16GB Dual Channel DDR4 XMP at 3200MHz
  3. Dark Side of the Moon chassis with High-Performance CPU Liquid Cooling and 550W Power Supply
  4. 512GB M.2 PCIe NVMe SSD
  5. NVIDIA(R) GeForce RTX(TM) 3060 Ti 8GB GDDR6
Build 2 - $1,968:
  1. 11th Gen Intel Core i7 11700F (8-Core, 16MB Cache, 2.5GHz to 4.9GHz w/Intel Turbo Boost Max)
  2. 16GB Dual Channel DDR4 XMP at 3200MHz
  3. Dark Side of the Moon chassis with High-Performance CPU Liquid Cooling and 1000W Power Supply
  4. 512GB M.2 PCIe NVMe SSD
  5. NVIDIA GeForce RTX 3080 10GB GDDR6X
Basically, is it worth $500 to go from 3060 Ti --> 3080, 550W --> 1000W, and 10700F --> 11700F for 1440p gaming? I would have preferred an AMD processor, but they didn't have any available that seemed like a "deal" the way these did (I'm using that term loosely since prices are absurd for anything these days...)

I suppose I would set it up to mine eth during the day while I'm working.

One reason I'm asking is because I've read that Dell uses custom cards with worse cooling, and the 3080 in particular can have throttling issues due to the airflow of the case.
#2 by some distance. Feck, you could probably sell the 3080 alone, even as a custom card, for $1,400.
 

Xexx

Vyemm Raider
7,429
1,624
Need some feedback on two orders I placed. I'm trying to decide which to keep. The first build is arriving this weekend and the second one will ship sometime in April. I currently have 2x 60Hz 1080p monitors but will probably upgrade to 1440, maybe 120/144Hz. Don't currently care about 4k.

Build 1 - $1,464:
  1. 10th Gen Intel Core i7 10700F (8-Core, 16MB Cache, 2.9GHz to 4.8GHz w/Turbo Boost Max 3.0)
  2. 16GB Dual Channel DDR4 XMP at 3200MHz
  3. Dark Side of the Moon chassis with High-Performance CPU Liquid Cooling and 550W Power Supply
  4. 512GB M.2 PCIe NVMe SSD
  5. NVIDIA(R) GeForce RTX(TM) 3060 Ti 8GB GDDR6
Build 2 - $1,968:
  1. 11th Gen Intel Core i7 11700F (8-Core, 16MB Cache, 2.5GHz to 4.9GHz w/Intel Turbo Boost Max)
  2. 16GB Dual Channel DDR4 XMP at 3200MHz
  3. Dark Side of the Moon chassis with High-Performance CPU Liquid Cooling and 1000W Power Supply
  4. 512GB M.2 PCIe NVMe SSD
  5. NVIDIA GeForce RTX 3080 10GB GDDR6X
Basically, is it worth $500 to go from 3060 Ti --> 3080, 550W --> 1000W, and 10700F --> 11700F for 1440p gaming? I would have preferred an AMD processor, but they didn't have any available that seemed like a "deal" the way these did (I'm using that term loosely since prices are absurd for anything these days...)

I suppose I would set it up to mine eth during the day while I'm working.

One reason I'm asking is because I've read that Dell uses custom cards with worse cooling, and the 3080 in particular can have throttling issues due to the airflow of the case.


Keep the 3080 - no-brainer - and grab a 34 inch and a 27 inch monitor :D For a 3080, get a 3440x1440.

Also, you can keep the card, whose aftermarket value makes up 80% of your build cost, and just change the case.