November 28th, 2012 08:00

Questions about an Alienware Aurora R4 ALX

Hello, I just got a deal on an Alienware Aurora R4 ALX from Denmark. Before I wire the money, I asked support some questions about the hardware in the computer and learned that its motherboard doesn't have PCI Express 3.0. I expect this computer to last at least two years before I even think about buying another, but without PCI Express 3.0 I'm concerned that maybe this is a bad time to buy an Alienware. They have a great deal right now and I don't want to miss it, but at the same time I don't want to buy a PC that isn't future-proof.

The PC will be used for the newest games and programs: Photoshop CS6, Far Cry 3, Battlefield 3, and whatever interests me over the next couple of years.

148 Posts

November 29th, 2012 08:00

Tesla, the video card does have 3GB onboard ;-)

www.evga.com/.../Product.aspx

EVGA GeForce GTX 660 Ti FTW+ 3GB w/Backplate ;-)

The fact remains that X79 with PCI Express 2.0 / 2.1 is a huge fail!

8 Wizard • 17K Posts

November 29th, 2012 12:00

... this model supports 2.0 and 2.1 PCI express. The chipset it has is Intel X79.

 
Interesting. According to Dell manuals ...
 
Aurora R3 = Intel P67 chipset (for Sandy Bridge), PCIe x16 2.0
Others in the series = H67, P67, Z68
 
Aurora R4 = Intel X79 chipset, PCIe x16 2.0
While the X79 is better than the P67, that's mainly because the P67 was rather lame compared to the X58. I too thought Socket 2011 was PCIe 3.0. So, who is right?
 
Hmm ... looks like PCIe 3.0 isn't native on X79, so it takes another chip to add more lanes and speed?
 
EDIT
 
OK, I'm confused. I think it's because:
(1) PCIe x16 v2.0 has the same bandwidth as
(2) PCIe x8 v3.0
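For what it's worth, the back-of-the-envelope math behind that equivalence can be sketched in a few lines of Python. The link rates and encoding overheads are the published PCIe figures; the helper function name is just mine:

```python
# Per-lane, one-direction PCIe bandwidth.
# Gen2: 5 GT/s with 8b/10b encoding (80% efficient).
# Gen3: 8 GT/s with 128b/130b encoding (~98.5% efficient).
def lane_bandwidth_gbs(gen):
    """Usable bandwidth of one PCIe lane in GB/s (one direction)."""
    if gen == 2:
        return 5.0 * (8 / 10) / 8       # = 0.5 GB/s per lane
    if gen == 3:
        return 8.0 * (128 / 130) / 8    # ~= 0.985 GB/s per lane
    raise ValueError("only Gen2 and Gen3 modeled here")

print(f"x16 Gen2: {16 * lane_bandwidth_gbs(2):.2f} GB/s")  # 8.00 GB/s
print(f"x8  Gen3: {8 * lane_bandwidth_gbs(3):.2f} GB/s")   # 7.88 GB/s
```

So an x8 Gen3 link really does land within about 1.5% of an x16 Gen2 link, which is why boards can advertise "PCIe 3.0" without actually offering more bandwidth than x16 Gen2.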
 

8 Wizard • 17K Posts

November 29th, 2012 12:00

Tesla, the video card does have 3GB onboard ;-)

www.evga.com/.../Product.aspx

EVGA GeForce GTX 660 Ti FTW+ 3GB w/Backplate ;-)

The fact remains that X79 with PCI Express 2.0 / 2.1 is a huge fail!

 
Sorry ... cool card ... load up those high-res textures.
 
Still, GPU-Z seemed to be having a problem reading that card. If you press the ? (Render Test) button, can you get DirectX Support and Texture Fillrate to fill in? Also, get it to show @ x16 2.0 and not 1.1 (which is way wrong).
 

148 Posts

November 29th, 2012 13:00

Interestingly enough, after a quick look on Newegg:

Of the forty X79 motherboards available, thirty-eight of them are PCI Express 3.0 (two of which are even Micro ATX form factor), and only two are limited to PCI Express 2.0 / 2.1.

IMO, I seriously doubt this had anything to do with Intel ;-)

8 Wizard • 17K Posts

November 29th, 2012 13:00

The fact remains that X79 with PCI Express 2.0 / 2.1 is a huge fail!

 
Seems like an Intel FAIL if you ask me. It still requires more investigation, but thanks for bringing this to my attention. Intel is definitely holding back (this, and USB 3.0, for example). This will likely play a part in my next-gen purchase or custom build. Who doesn't want a little future-proofing?
 
On the other hand, yes ... I think it would take a dual-GPU card (in a single slot) to max out the bandwidth of PCIe v2.0, but single GPUs get better every year, so who knows when they might start to get bottlenecked.

8 Wizard • 17K Posts

November 29th, 2012 14:00

More info:

http://en.wikipedia.org/wiki/List_of_Intel_chipsets#Core_i_Series_chipsets

Notice X79 is missing from this list:

http://en.wikipedia.org/wiki/LGA_1155#Ivy_Bridge_chipsets

Notice the chipset model numbers are going backwards. Maybe there is a new "X Series" Socket-2011 based chipset on the horizon?

And this note:

3 For PCIe 3.0 capability, the Ivy Bridge CPU must have the relevant PCIe 3.0 controller built in. Some Ivy Bridge CPUs only have a PCIe 2.0 controller built in.

 

 

 

8 Wizard • 17K Posts

November 29th, 2012 14:00

Interestingly enough, after a quick look on Newegg:

Of the forty X79 motherboards available, thirty-eight of them are PCI Express 3.0 (two of which are even Micro ATX form factor), and only two are limited to PCI Express 2.0 / 2.1.

IMO, I seriously doubt this had anything to do with Intel ;-)

 
Yes, Newegg is pretty good at getting specs right, but remember:
1. Even if a board only supports PCIe x8 v3.0, they can print "PCIe v3.0 support" ... but that's actually no faster than PCIe x16 v2.0 (same max bandwidth).
2. Some Newegg boards might have the additional (non-Intel-chipset) circuitry to enable full PCIe x16 v3.0. That will likely only be on high-end, expensive boards.
 
Notice that on NONE of the official Intel block-diagram graphics is PCIe v3.0 mentioned.
 
I added more info in the post above. Someone more familiar with this new tech will have to clarify.

148 Posts

November 29th, 2012 17:00

FYI, those 38 motherboards range from 1 PCIe x16 slot up to as many as 7 PCIe x16 slots.

So my point was more that it would have been nice to incorporate, especially knowing that all the new AMD and Nvidia cards were going to be PCIe 3.0.

I guess we will have to wait and see what the Aurora R5 has in store for us when it arrives.

8 Wizard • 17K Posts

November 29th, 2012 19:00

Dear saucefar,

 

GPU-Z is not always able to detect the PCIe 3.0 feature. When I read your question, I entered my Aurora R4 BIOS at system startup by pressing F2, then went to Advanced > Integrated Devices > PCIe Gen3 > Enabled. So, to answer your question: the Aurora R4 motherboard supports PCIe 3.0, and you can enable or disable it in the BIOS. I have a GTX 680 SLI. Hope my answer helps you.

 
Really good info ... thanks.

148 Posts

November 30th, 2012 08:00

Awesome info +10

Thank you.

6 Posts

November 30th, 2012 08:00

I have discovered why GPU-Z and Nvidia Inspector persisted in showing the PCIe as 2.0 rather than 3.0. The problem is not the Alienware Aurora R4 X79 motherboard - which is fully PCIe 3.0 compliant - but Nvidia forcing its 600 series Kepler graphics cards to work under PCIe 2.0 when coupled with a Sandy Bridge E CPU (i7 3820, 3930K, 3960X).

To resolve that issue Nvidia released a small fix that every 600 series card user can download from here: http://www.techpowerup.com/downloads/2148/NVIDIA_GeForce_Kepler_PCIe_3.0_mode-enabling_patch_for_Sandy_Bridge-E_systems.html

After running this fix, shutting down the system and then starting it again, GPU-Z and Nvidia Inspector will correctly detect your Nvidia 600-series card as PCIe 3.0. I have tested the fix and report the results here, running my GTX 680 SLI on an Aurora R4 with an i7 3930K.

This is the screen before the fix: http://i48.tinypic.com/ddg6tc.jpg  where Nvidia Inspector correctly shows a PCIe 3.0 interface (Alienware R4 X79 mobo) but running at 'only' 2.0 (Nvidia forcing 2.0).

This is the screen after the fix http://i48.tinypic.com/5zl55u.jpg where Nvidia Inspector shows both PCIe 3.0 interface and graphics card running at 3.0.

Same screen with GPU-Z after the fix, showing full PCIe 3.0 graphics card activity: http://i49.tinypic.com/30hphsg.jpg

So, to fully clarify your doubts: the Aurora R4 fully supports PCIe 3.0. You only have to enable it from the R4 BIOS (Advanced > Integrated Devices > PCIe Gen3 > Enabled) and then apply the Nvidia fix if you have a GeForce 600 series graphics card. If you go for an ATI 7000 series you don't need any fix; just enable PCIe Gen3 support in the Aurora R4 BIOS. Hope this helps.
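As a side note for anyone checking this on a Linux box instead of with GPU-Z: `lspci -vv` prints a `LnkSta:` line per device with the negotiated link speed and width, which map directly to a PCIe generation (2.5 GT/s = Gen1, 5 GT/s = Gen2, 8 GT/s = Gen3). A minimal sketch of that mapping; the sample line below is illustrative of the typical lspci format, not taken from this thread:

```python
import re

# Map the negotiated link speed reported by lspci to a PCIe generation.
SPEED_TO_GEN = {2.5: 1, 5.0: 2, 8.0: 3}

def parse_lnksta(line):
    """Return (pcie_generation, lane_width) from an lspci 'LnkSta:' line."""
    m = re.search(r"Speed (\d+(?:\.\d+)?)\s*GT/s,\s*Width x(\d+)", line)
    if not m:
        return None
    return SPEED_TO_GEN.get(float(m.group(1))), int(m.group(2))

# Illustrative sample in the format lspci typically prints:
sample = "LnkSta: Speed 8GT/s, Width x16, TrErr- Train- SlotClk+"
print(parse_lnksta(sample))  # (3, 16)
```

One caveat: many GPUs drop to a slower link speed at idle to save power, so check the link state under load (which is exactly what GPU-Z's render-test button is for).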

8 Wizard • 17K Posts

November 30th, 2012 12:00

Hashley,

More great info. Thanks for taking the time to post it.

So Dell's implementation of X79 does provide at least enough PCIe lanes to support one video card at full PCIe x16 v3.0 bandwidth. Those screenshots are golden.

What's confusing to me is that the official Intel block diagrams (see my post above) show only PCIe v2.0 ... your thoughts?

148 Posts

December 1st, 2012 11:00

The Nvidia fix worked :-) Thank you.

And now I apologize for my comment about Dell being the fail here. It was actually Nvidia that imposed this on GTX 600 cards running with SB-E CPUs.


8 Wizard • 17K Posts

December 1st, 2012 21:00

Hashley,

Thanks for the complete explanation. I posted a message at another, broader forum and got basically the same info (but not as detailed as yours).

I'm looking to build or buy an Intel Performance Premium Level P2 system with a 6-core processor in the 3rd quarter of 2013 (my Aurora will then be 3 years old). I'm thinking it will be Socket 2011, because I don't see 6-core on other future desktop roadmaps. Either an i7-3930K, an i7-3970K, or a similar future chip (non-Extreme).

http://www.brightsideofnews.com/news...n-q3-2013.aspx

http://www.tomshardware.com/news/Ivy...obo,16588.html

I'll ask you what I asked them ...

Around November 2013, will I be able to get a 6-core Intel (non-Extreme) "K" processor on some chipset other than X79 that is Intel-certified for PCIe v3.0?

Will the i7-3930K and i7-3970K only ever run on X79? Near the end of next year, these chips should be cheaper.

They seemed to think that a 4-core Haswell will beat a 6-core SB in most everything and never be far behind in anything, and that the extra 2 cores / 4 threads and quad-channel memory aren't enough to make a difference. What do you think?

I'd rather not build a Xeon workstation.

 

 

6 Posts

December 2nd, 2012 10:00


Tesla1856,

You will be able to get a 6-core K CPU on a non-X79 chipset by Q3 2013, because the Ivy Bridge E solutions will be out by then and they will come with a new chipset other than X79. Consider that even if you buy a current X79 chipset board, you will keep future compatibility with both next-gen Sandy Bridge E CPUs and Ivy Bridge E processors.

So you have two options: 1) go for an X79 chipset now if you decide to replace your current system soon. If you get a Sandy Bridge E 3970K + X79 in Q1 2013, you will be able to drop an Ivy Bridge E 49xxK into that same X79 board in November 2013. 2) Wait until November 2013 and go directly for an Ivy Bridge E processor on a new motherboard with a chipset other than X79.

In both cases you will have PCIe 3.0 if you get a combination of compliant CPU + GPU + mobo. Since both Sandy Bridge E and Ivy Bridge E will be PCIe 3.0 ready on the CPU and mobo side, your only concern will be the graphics card: GeForce 600 series or later, or Radeon 7000 series or later.

And yes, the i7-3930K and i7-3970K will run on the new chipset as well. You may choose to run them on a current X79 chipset or on the future one. They will run PCIe 3.0 if the three requirements mentioned above are fulfilled (CPU + GPU + mobo). The Alienware X79 mobo already provides full PCIe 3.0 x16 bandwidth for SLI/CrossFire (see my GPU-Z bus interface readout test here: http://www39.zippyshare.com/v/99200174/file.html).

As for a 4-core Haswell: yes, a quad will beat a 6-core SB-E in most cases, but it largely depends on the use you're planning. If you're likely to use your new rig for encoding, compiling, or very demanding video games, go for a 6-core SB-E/IVB-E. If you're likely to use your new system for limited multi-threading but want greater speed and power efficiency, between mainstream and premium performance, you'd do better with a 4-core Haswell. The extra 2 cores / 4 threads + quad-channel memory really make a difference in server-style workloads, i.e. not in mainstream scenarios.

Consider that current SB-E chips are Xeon-like/server-like CPUs, so they have all the strength to handle demanding applications, enthusiast gaming too, especially Nvidia Surround or ATI Eyefinity setups. Most games nowadays don't use more than 4 cores, but things may change quickly in the future, so I would pave my way with an SB-E/IVB-E from next year on. Save your money on X processors/Xeons and go instead for the 'intelligent' K parts: they have large caches, they can feed the same high memory bandwidth as X/Xeon processors, and they overclock very well.

If you are a demanding gamer and want to play surround, go for ATI's high-end solutions. Nvidia graphics cards are really outstanding, the drivers are very well optimized, 3D Vision is a blast ... but they can't compete with ATI cards at very high resolutions (2560x1600 or 5760x1080). I have personally seen high-end Eyefinity on multiple Samsung S27A950D 120Hz displays and it looks awesome; the frame rate stays top-notch compared with Nvidia Surround setups. Or you may want to go for a high-quality IPS panel if you don't care about 120Hz or surround.

May I suggest a very good liquid cooling solution, too. Liquid cooling for GPUs is not that common, but it is the rule for high-end CPUs if you are a hardcore gamer.

Alienware Aurora desktops offer great performance and very good options, but ... trust me ... there are better choices around. If you are a demanding gamer who wants the best, look no further than Digital Storm: have a look here http://www.digitalstormonline.com/compblackops.asp whether you decide to buy your new rig in a couple of months or in a year. They always have the best gaming rigs around: you can choose among the Sub-Zero LCS cooling system, Cryo cooling for both CPU and GPU, FrostChill cooling, and so on. Their customizable options are right now way better than Alienware's: 2133MHz DDR3 16/32GB quad-channel memory, full-ATX mobos (while Alienware is limited to a micro-ATX with few slots), bigger PSUs (1200/1300W vs Alienware's 'standard' 850W), Quad-SLI / dual-CrossFire setups (Alienware stops at SLI/CF), higher 'stock OC' CPUs (Alienware's 1st- and 2nd-level OC steps pale in comparison), and very high-quality SSDs (may I recommend a high-performance SSD for your next rig).
