Graphic card manufacturer and model number: Sapphire Radeon RX 480 8GB GDDR5 PCI Express 3.0 x16
Operating System: Windows 10 Pro
Power supply manufacturer: stock Dell 460W PSU
Compatible: Yes, does work
Original card with system that worked: EVGA GTX 680
Monitor: LG 34UC87C 3440 x 1440 60Hz ultrawide curved LED monitor using HDMI
I upgraded from an old HP 1680 x 1050 monitor to the LG listed above last month, which is wonderful for many non-gaming reasons. I wasn't intending to upgrade my GPU as well, but the price point of the RX 480 was just right. I knew going in that 1440p gaming was going to be rough on this card, but at least it lets me enjoy higher settings on modern games at ultrawide 1080p. For a better 1440p experience, you'll need to spend considerably more, e.g. HanoverB's GTX 1080 upgrade.
The RX 480 is shorter than my old GTX 680 and slightly easier to install since the power plugs aren't under the front cage. It also only needs a single 6-pin power connector vs. the two 6-pins the GTX 680 needed. My XPS 8500 is mostly stock, as you can see. I had added a second optical drive (Blu-ray) from my previous computer and a 1 TB Samsung SSD. There was a stock WiFi PCI card, if I remember correctly, that failed in the past and is now long gone.
I ran DDU in safe mode before the physical installation and got some error in the middle of it. I proceeded to install anyway. I used the latest AMD drivers, so I should be good as far as the PCI-E power draw issue the previous drivers suffered from.
Using HDMI, my refresh rate is capped at 50Hz at 3440 x 1440. I'm guessing switching to DisplayPort will remedy that; I had the same issue on the GTX 680. I get 60Hz at 2560 x 1080. I don't know how to look at temps, so please advise and I'll try it out if there's interest.
As far as performance, I briefly checked a few of the games I have installed. I have a few more, but they're older and already ran 50/60fps at max settings (depending on resolution) on the GTX 680, so those weren't interesting.
The Witcher 3: 2560 x 1080, 55-60 fps at the high graphics and post-processing presets with HairWorks off. Switching to 3440 x 1440 takes me to 30 fps, which is playable in this game, and you can squeeze out more by bumping down settings. I only tried presets since I don't really know which settings have the most performance impact. If I were to replay this game, I'd go with 2560 x 1080.
Fallout 4: Doesn't have native 21:9 support, so I only tried 16:9. It crashes when I try fullscreen 2560 x 1440; I'm not sure if there are workarounds. At 1920 x 1080, max settings, it runs 60 fps. I didn't really do much combat in this test though, so I expect it to dip some.
Dragon Age Inquisition: Runs quite well. 2560 x 1080 at the high preset runs 55-60fps during combat, and a pretty solid 60fps during exploration. 3440 x 1440 at the high preset gives me 45-50fps during combat.
Tomb Raider (2013): I only ran the benchmark. 3440 x 1440 at ultimate preset gives min/max/avg fps as 38.0/50.4/46.4 (48.4/50.0/50.0 on ultra). Lowering the resolution to 2560 x 1080 gives me 60.0/60.0/60.0 on ultimate.
I'll see if I can shop for a Displayport cable tonight and see how close to 60fps I can actually get.
Yes, HDMI limits you to 50Hz at 3440 x 1440; use DisplayPort to get 60Hz.
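For the curious, you can sanity-check that cap with a rough pixel-clock calculation. This is only a back-of-envelope sketch: the timing totals below assume CVT reduced blanking, and the real numbers come from the monitor's EDID, which may differ.

```python
# Rough pixel-clock check for 3440x1440 over HDMI vs. DisplayPort.
# Htotal/Vtotal assume CVT reduced-blanking timings (an assumption;
# the monitor's EDID defines the actual mode).

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock in MHz for a given total timing and refresh rate."""
    return h_total * v_total * refresh_hz / 1e6

H_TOTAL, V_TOTAL = 3600, 1481  # 3440x1440 plus assumed CVT-RB blanking

for hz in (50, 60):
    clk = pixel_clock_mhz(H_TOTAL, V_TOTAL, hz)
    print(f"3440x1440 @ {hz}Hz needs a ~{clk:.0f} MHz pixel clock")

# HDMI 1.4's TMDS clock tops out at 340 MHz, and many monitors expose
# an even lower maximum on their HDMI input, which is why 50Hz ends up
# being the ceiling. DisplayPort 1.2 (~17.28 Gbps over four HBR2 lanes)
# has plenty of headroom for 3440x1440 @ 60Hz.
```

So 60Hz at this resolution sits right at the top of what HDMI 1.4-era inputs handle, while DisplayPort takes it comfortably.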
Nice GPU upgrade and great monitor. I really like ultrawide monitors not only for gaming but for productivity in general, vs. running 2 monitors.
You can definitely play TW3 at 30fps at 3440 x 1440; I would think the game would look better at that native resolution than at 2560 x 1080.
Try tweaking settings a little. This is a good guide on how to optimize your settings to get the best fps, like dropping foliage visibility settings (+5 to 19 fps difference) or changing ambient occlusion settings (+4 to 5 fps difference). I found it very helpful:
Here is a YouTube video on how to use MSI Afterburner to show stats on screen.
If you want to do a screengrab just use this tab and set a key.
I couldn't find a DisplayPort cable locally, so I'm waiting on an Amazon order. I haven't tried TW3 again since I got into playing DA:I. I'll try tweaking settings later.
MSI Afterburner didn't give me GPU temp monitoring for some reason, so I ended up using HWInfo to drive RivaTuner's OSD. Idle temps are pretty much what's been reported in reviews: I see high 30s C at idle. Playing DA:I cutscenes, which run at 30 fps and don't fully stress the GPU, it sits around 70 C. During gameplay at sustained 100% GPU usage, it's around 80 C. I'm not really sure how this compares to my old EVGA GTX 680 as I never monitored it, but anecdotally this RX 480, with a really basic cooler (see teardowns), seems to warm up the room more. However, it is also summer and getting warmer and warmer, so I don't know.