Aurora R11, RTX 3080, constantly hitting power/voltage limit

Unsolved

31 Posts • January 21st, 2021 16:00

Hello, I monitored 20 minutes of gaming in Cyberpunk using MSI Afterburner and noticed that the GPU is constantly hitting its power and voltage limits. I don't have any OC; everything is stock. Max GPU temp was 75 °C. I also can't notice any degraded performance while gaming; it's running pretty well, but I'm curious. I've got a 1000 W PSU, so there's enough power. Any thoughts?

[Screenshot: MSI Afterburner monitoring graphs]
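For anyone who wants to cross-check what Afterburner reports, the same readings are exposed by NVIDIA's NVML. A minimal sketch, assuming the nvidia-ml-py bindings (`pip install nvidia-ml-py`) and that the 3080 is GPU 0:

```python
# Read the live power draw, the enforced power limit, and the GPU
# temperature straight from the NVIDIA driver via NVML.
from pynvml import (
    nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetPowerUsage, nvmlDeviceGetEnforcedPowerLimit,
    nvmlDeviceGetTemperature, NVML_TEMPERATURE_GPU,
)

nvmlInit()
try:
    gpu = nvmlDeviceGetHandleByIndex(0)                    # first GPU in the system
    draw_w = nvmlDeviceGetPowerUsage(gpu) / 1000           # NVML reports milliwatts
    limit_w = nvmlDeviceGetEnforcedPowerLimit(gpu) / 1000
    temp_c = nvmlDeviceGetTemperature(gpu, NVML_TEMPERATURE_GPU)
    print(f"power: {draw_w:.0f} W of {limit_w:.0f} W limit, temp: {temp_c} °C")
finally:
    nvmlShutdown()
```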

 

2 Intern • 395 Posts • January 21st, 2021 18:00

Since you're not overheating and not having problems like freezes or shutdowns, I wouldn't worry; just enjoy the game.

Cyberpunk is a new and demanding game, even on the best of the new video cards. In an intensive game you are going to hit those limits. As long as nothing tries to exceed them, there is no problem.

8 Wizard • 17K Posts • January 21st, 2021 22:00

Like @Doghouse Reilly says, if it ain't broken, don't fix it.

Why do you think 75 °C is the max temp for an RTX 3080?

It's on the high side, but not the max. You're good, I think.

6 Professor • 6K Posts • January 21st, 2021 22:00

I would be more concerned about the PCI-X version 4.0 slot running at x8 instead of x16. It's really a shame for a gaming machine that they could not provide a chipset solution that runs at the proper x16 speed. In effect this slot runs only as fast as a PCI-X version 3.0 x16 slot, negating the benefits PCI-X version 4.0 brings to the table.

 

I just noticed you've got an R11; I thought you had an R10 Ryzen. My bad.

31 Posts • January 22nd, 2021 03:00

No, I meant my card was hitting 75 °C, not that the RTX 3080 is limited to that. According to Afterburner, the temp limit is set to 83 °C.

31 Posts • January 22nd, 2021 03:00

Actually, I was experiencing freezes, but only playing Cyberpunk. I did all sorts of tests/benchmarks and couldn't reproduce the problem. Then I saw this thread and did the same: DDU + installing an older driver version, and yesterday evening it didn't crash. I also got some suggestions from Dell to tweak some of Cyberpunk's settings... I'll keep my fingers crossed.

Anyway, I was just curious why that happens; I'm not worried...

31 Posts • January 22nd, 2021 03:00

Well, the R11 also has this limitation: a PCIe 3.0 slot that could work at x16 but currently runs at x8, according to GPU-Z and the official manual. I don't know why, but I agree it's quite disappointing.

[Screenshot: GPU-Z bus interface reading]
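If you'd rather confirm the link width without GPU-Z, NVML exposes that too. Another minimal sketch under the same nvidia-ml-py assumption; note that the card drops to a narrower/slower link at idle to save power, so check under load before blaming the slot:

```python
# Compare the current PCIe link (generation and width) against the
# maximum the card supports; x8 here matches the GPU-Z reading.
from pynvml import (
    nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetCurrPcieLinkGeneration, nvmlDeviceGetCurrPcieLinkWidth,
    nvmlDeviceGetMaxPcieLinkGeneration, nvmlDeviceGetMaxPcieLinkWidth,
)

nvmlInit()
try:
    gpu = nvmlDeviceGetHandleByIndex(0)
    print(f"current link: PCIe {nvmlDeviceGetCurrPcieLinkGeneration(gpu)} "
          f"x{nvmlDeviceGetCurrPcieLinkWidth(gpu)}")
    print(f"card maximum: PCIe {nvmlDeviceGetMaxPcieLinkGeneration(gpu)} "
          f"x{nvmlDeviceGetMaxPcieLinkWidth(gpu)}")
finally:
    nvmlShutdown()
```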

 

Community Manager • 54.3K Posts • January 22nd, 2021 07:00

Just an FYI, this data was always present in the online Aurora R11 documentation.

Aurora R11 Setup and Specifications PDF

[Screenshots: excerpts from the Setup and Specifications document]




Aurora R11 Service Manual PDF

[Screenshots: excerpts from the Service Manual]

6 Professor • 6K Posts • January 22nd, 2021 12:00

Yes, I knew it before I purchased the machine.

But that does not take away the point that it's rather disappointing for a gaming machine that costs $3,000 or more to have a chipset configuration that is not capable of running at least the main PCI-X slot at x16.

I can understand dropping down to x8 with dual cards, but a single card?

It's going to be a performance hit with PCI 3.0 to use x8 instead of x16. With PCI 4.0 it will not be, since that has the same bandwidth as PCI 3.0 x16, and that is still enough for these cards.

Since these machines use the latest, most powerful cards, and mine came with PCI 4.0, it is disappointing to see this in that price range.

Now I am assuming this is a chipset limitation, but I guess it could be for another reason, because that document does not explain why. (Which is also disappointing.)

9 Legend • 47K Posts • January 22nd, 2021 14:00

@Vanadiel 

PCI-E 3.0 x8 is the same bandwidth as PCI-E 2.1 x16.

No idea why you reference PCI-X; that is a very old bus that has not been used in more than 10 years.

Version  Introduced  Line code   Transfer rate  x1          x2          x4          x8          x16
1.0      2003        8b/10b      2.5 GT/s       0.250 GB/s  0.500 GB/s  1.000 GB/s  2.000 GB/s   4.000 GB/s
2.1      2009        8b/10b      5.0 GT/s       0.500 GB/s  1.000 GB/s  2.000 GB/s  4.000 GB/s   8.000 GB/s
3.0      2010        128b/130b   8.0 GT/s       0.985 GB/s  1.969 GB/s  3.938 GB/s  7.877 GB/s  15.754 GB/s
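Those throughput figures follow directly from lanes × transfer rate × encoding efficiency ÷ 8 bits per byte. A quick sketch to verify them (the function name is mine, just for illustration):

```python
# Per-direction PCIe throughput: lanes * transfer rate (GT/s)
# * encoding efficiency / 8 bits per byte -> GB/s.
def pcie_throughput_gbps(lanes, gt_per_s, encoding):
    payload_bits, total_bits = encoding  # (8, 10) for 8b/10b, (128, 130) for 128b/130b
    return lanes * gt_per_s * (payload_bits / total_bits) / 8

print(pcie_throughput_gbps(16, 5.0, (8, 10)))     # PCI-E 2.1 x16 -> 8.0 GB/s
print(pcie_throughput_gbps(8, 8.0, (128, 130)))   # PCI-E 3.0 x8  -> ~7.88 GB/s
print(pcie_throughput_gbps(16, 8.0, (128, 130)))  # PCI-E 3.0 x16 -> ~15.75 GB/s
```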

https://www.youtube.com/watch?v=XJuj16gRoBI

PCIe x16 has 164 pins (82 per side) and is 89 mm long. Regardless of the size or number of pins, the key notch is always at the eleventh pin.

3.3 V is now required for all PCI slots.

5 V is no longer supported.

 

PCI-X is NOT PCI-E.

PCI-X is very old and no longer used.

8 Wizard • 17K Posts • January 22nd, 2021 14:00

But that does not take away the point that it's rather disappointing for a gaming machine that costs $3,000 or more to have a chipset configuration that is not capable of running at least the main PCI-X slot at x16.

I can understand dropping down to x8 with dual cards, but a single card?

It's going to be a performance hit with PCI 3.0 to use x8 instead of x16. With PCI 4.0 it will not be, since that has the same bandwidth as PCI 3.0 x16, and that is still enough for these cards.

Since these machines use the latest, most powerful cards, and mine came with PCI 4.0, it is disappointing to see this in that price range.

Now I am assuming this is a chipset limitation, but I guess it could be for another reason, because that document does not explain why. (Which is also disappointing.)

=============================

When the Area-51 was still being sold, I used to explain it like this:

https://www.dell.com/community/Alienware-Desktops/Aurora-R11-resizable-bar-with-the-30xx-Series-and-10th-Gen-CPU/m-p/7787735/highlight/true#M38796

Since the Aurora R5 ... it's always been a step down with x8. But if the Area-51 is not coming back, I think these new Nvidia cards could use the extra interface bandwidth.

Sure, if it is actually PCIe 4.0 running at x8 now, that is a little better. Still, it's hard to argue against wider bandwidth for a single slot.

I've not been invited to any Alienware desktop betas lately, but if I were, I would Zonk it.

Right, x8 is fine for SLI (even if it's a BIOS option, DIP switch, jumper, etc.). Thing is, SLI is slowly being deprecated. I think 90% of users just want the latest Nvidia GPU (on a nice, properly cooled board) and have it running at full speed. Go ahead and throw in the Resizable BAR thing while you are at it.
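As an aside, whether Resizable BAR is active can be read from software too: without it the BAR1 aperture is typically only 256 MiB, while with it enabled the aperture approaches the full VRAM size. A rough check, again assuming nvidia-ml-py (this heuristic is mine, not from the post):

```python
# Inspect the BAR1 aperture: ~256 MiB total suggests Resizable BAR is off;
# a total close to the full VRAM size suggests it is on.
from pynvml import (
    nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetBAR1MemoryInfo,
)

nvmlInit()
try:
    gpu = nvmlDeviceGetHandleByIndex(0)
    bar1 = nvmlDeviceGetBAR1MemoryInfo(gpu)
    print(f"BAR1 aperture: {bar1.bar1Total / 2**20:.0f} MiB")
finally:
    nvmlShutdown()
```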

 

6 Professor • 6K Posts • January 22nd, 2021 14:00

Well, you know what I was talking about, and in the end that's all that matters. It's still a shame that a $3,000 gaming machine cannot run a single PCI-E slot at x16 under revision 3.0 or 4.0 of the PCI-E standard and has to revert to running it at x8.

It's unfortunate that when they named these two different standards, they chose names that are so similar.

 

PCI-X, short for Peripheral Component Interconnect eXtended

PCI Express, short for Peripheral Component Interconnect Express

 

6 Professor • 6K Posts • January 22nd, 2021 17:00

To be honest, they should put this limitation on the web page where you order the machines, instead of mentioning it casually in the documentation. It should be in the specs on the ordering page.

 

On the one hand they tell you "for optimal graphics performance, use a PCI-Express x16 slot for connecting the graphics card", and right below that, under a "note", they casually mention it only works at x8.

Well, it's not really a PCI-E x16 slot when it only works as a PCI-E x8 slot, is it now?

Just call it what it is: a PCI-E x8 slot. You will never get PCI-E x16 out of it unless they somehow remove that limitation.

 

31 Posts • January 22nd, 2021 17:00

I agree with @Vanadiel. Considering that PCIe 3.0 was released in 2010, it's simply ridiculous that after 10 years these machines are running at x8. I can't really imagine any incompatibility issue that takes 10+ years to address.

Btw, I found a nice video comparing the performance of an RTX 3080 on PCIe 3.0/4.0 at x8/x16; check it out. It actually seems x8 is already a bottleneck for this card; I wonder how a 3090 would perform, then. But my biggest concern is the future, since I won't be able to upgrade my GPU unless Dell releases an update to support x16.

https://www.youtube.com/watch?v=iq8Vv-WqlIE

January 29th, 2021 06:00

"On the one hand they tell you 'for optimal graphics performance, use a PCI-Express x16 slot for connecting the graphics card'"

- They knew what they were doing; it's pretty disgusting and disappointing.

 

"Just call it what it is: a PCI-E x8 slot. You will never get PCI-E x16 out of it unless they somehow remove that limitation."

- How simple would that be, and is it something that could be done through drivers or some type of update? Is there any way to bypass the restriction? Whether it voids the warranty or not is no concern of mine.

I really want to return mine at this point; going to check into that. I even ordered the Corsair fans and AIO cooler that other users had good results with. The bottom line seems to be that you're stuck with the ridiculous 320-watt power limit and old tech for the PCIe slot.
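For what it's worth, the driver reports whether that 320 W limit has any headroom at all: if the minimum and maximum enforceable limits coincide, no tool can raise it. A minimal sketch, assuming the nvidia-ml-py bindings:

```python
# Report the adjustable power-limit range the vBIOS/driver allows.
# If min == max, the limit is locked and no tool can raise it.
from pynvml import (
    nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetPowerManagementLimitConstraints,
)

nvmlInit()
try:
    gpu = nvmlDeviceGetHandleByIndex(0)
    min_mw, max_mw = nvmlDeviceGetPowerManagementLimitConstraints(gpu)
    print(f"allowed power limit: {min_mw / 1000:.0f} W to {max_mw / 1000:.0f} W")
finally:
    nvmlShutdown()
```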

Money was never the issue here; it was availability of the GPU I wanted. Well, I did have a Dell rebate in my pocket and a nice chunk of Dell Rewards that were close to expiring... so all that made me jump on the purchase. But certainly the main factor was not being able to find a 3080/90 anywhere on the planet.

January 29th, 2021 06:00

I did not know this. Is this something an update can fix, or is it hardware? It's bogus to me that they don't allow any play on your power limit; the card is not being used anywhere close to its potential. Some results I've seen are what I would consider an acceptable amount of performance loss... however, in my case the loss is substantial. I've been looking into anything I can do to remove these unnecessary restrictions but haven't found anything. I'm actually going to look into the return policy when I get home.
