
April 20th, 2013 08:00

U3014: problems getting full performance through HDMI

I just got my U3014 delivered, but am having a lot of problems getting it to run.

My initial test was with my MSI GT780DXR (Nvidia GeForce GTX 570M), a Windows 7 64-bit system. The laptop only has VGA and HDMI 1.4 out, but HDMI 1.4 should be able to do 2560x1600 @ 32 bit. The Nvidia driver, however, keeps complaining that the U3014 can't do that; the highest I could get was 1920x1080. After some custom configuration in the Nvidia settings, I got it running at 2560x1600 @ 16 bit / 60 Hz, but on such a big screen 16-bit color does not look great (to say the least).

So I connected it to my desktop system, also Windows 7 64-bit (Nvidia GeForce GTX 560), and it immediately switched to 2560x1600 @ 32 bit (60 Hz). So I know the monitor works (and looks great).

A third trial involved an HP laptop with Ubuntu and a DisplayPort. It too only got to 1920x1080.

I'm starting to suspect that the HDMI 1.4 implementation in the U3014 is not capable of doing 2560x1600 @ 32 bit / 60 Hz, as the HDMI 1.4 spec says it should.
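That suspicion can be sanity-checked with a little arithmetic. The sketch below compares the pixel clock this mode needs under two VESA timing standards against the 340 MHz TMDS clock ceiling of HDMI 1.3/1.4; the frame totals are the approximate published CVT and CVT-RB values for 2560x1600 @ 60 Hz, not numbers taken from this monitor's EDID:

```python
# Required pixel clock for 2560x1600 @ 60 Hz under two timing standards,
# versus the 340 MHz TMDS clock maximum of HDMI 1.3/1.4.
def pixel_clock_mhz(h_total, v_total, refresh_hz):
    # Pixel clock = total pixels per frame (including blanking) * refresh rate.
    return h_total * v_total * refresh_hz / 1e6

# Standard CVT blanking (the default many drivers generate):
cvt_standard = pixel_clock_mhz(3504, 1658, 60)   # ~348.6 MHz
# CVT Reduced Blanking (CVT-RB), which trims the blanking intervals:
cvt_rb = pixel_clock_mhz(2720, 1646, 60)         # ~268.6 MHz

HDMI_14_MAX_MHZ = 340
print(f"CVT:    {cvt_standard:.1f} MHz, fits HDMI 1.4: {cvt_standard <= HDMI_14_MAX_MHZ}")
print(f"CVT-RB: {cvt_rb:.1f} MHz, fits HDMI 1.4: {cvt_rb <= HDMI_14_MAX_MHZ}")
```

So the mode only fits within HDMI 1.4's clock budget with reduced blanking, which is why the custom-resolution workarounds discussed later in this thread all use CVT-RB timings.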


September 13th, 2013 10:00

Dunno. But you could also try 48 Hz instead of 60. It was quite acceptable.


November 1st, 2013 13:00

I have a Dell Latitude notebook with Win 7 and integrated Intel HD Graphics 4000, and I had no luck creating a custom 2560x1600 setting using any of the steps suggested here. However, I did find another page that explains how to create a 2560x1440 @ 55 Hz / 32-bit setting using "CVT-RB" (Reduced Blanking) as the Timing Standard.

Here's the link. It also explains how to accomplish this with Linux, AMD, and Nvidia GeForce graphics cards with Optimus.

http://www.notebookcheck.net/2560x1440-or-2560x1600-via-HDMI.92840.0.html
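The timing math behind that 2560x1440 @ 55 Hz suggestion can be sketched as below. This is a simplified approximation of the CVT Reduced Blanking rules (fixed 160-pixel horizontal blanking, vertical blanking sized to the 460 µs minimum), and the 225 MHz cap is an assumption about the HD Graphics 4000's HDMI TMDS clock limit, not a figure from this thread:

```python
import math

def cvt_rb_mode(h_active, v_active, refresh_hz):
    """Approximate CVT Reduced Blanking timings (simplified sketch):
    horizontal blanking is fixed at 160 pixels, and vertical blanking
    grows until it covers the 460 us minimum CVT-RB requires."""
    h_total = h_active + 160
    v_total = v_active
    # Find the smallest v_total whose blanking lines span >= 460 us.
    while v_total - v_active < math.ceil(460e-6 * refresh_hz * v_total):
        v_total += 1
    clock_mhz = h_total * v_total * refresh_hz / 1e6
    return h_total, v_total, clock_mhz

# The 2560x1440 @ 55 Hz mode the linked article suggests:
h, v, clk = cvt_rb_mode(2560, 1440, 55)
# 225 MHz is an assumed HDMI TMDS limit for this Intel generation:
print(f"{h}x{v} total, {clk:.1f} MHz -> under 225 MHz: {clk <= 225}")
```

Dropping from 60 Hz to 55 Hz pulls the pixel clock just under that assumed cap, which would explain why 55 Hz works on this hardware while 60 Hz does not.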


May 9th, 2016 02:00

It's an old thread, I know, but I just ran into this problem.

I want to use the HDMI socket of the U3014 monitor (the other sockets are occupied by cables from other computers). My graphics card (Nvidia GTX 950) has an HDMI 2.0 socket. I bought an HDMI 2.0 cable, and it doesn't work.

With Gilahacker's hacks I get a 2560x1600 resolution, but of bad quality.

Why doesn't HDMI work out of the box? Dell specified this model with HDMI 1.4, which definitely must be able to do 2560x1600!

ftp.dell.com/.../dell-u3014_User's%20Guide_en-us.pdf says: "Depending on your graphics card, connections using HDMI may only support up to 2560 x 1600".
Ok, fine. But it doesn't!

www.in.tum.de/.../Dell_UltraSharp_U3014.pdf says "Allows HDMI connection at full panel native resolution" as a benefit against the older U3011 model.
Ok, fine. But it doesn't!

How can Dell-Chris M write that the "highest resolutions HDMI can do is 1920x1080 or 1920x1200"? That isn't written anywhere else?!

I am speechless ...

Is there a possibility that this will be fixed somehow?

Community Manager

May 9th, 2016 07:00

Notice the wording in the manual: "may only support up to 2560 x 1600". That is not stating that you will get 2560 x 1600. It is stating that you could get any resolution up to and including 2560 x 1600, depending on your graphics card and its driver.

A user posted:
I've used a standard HDMI cable. Then I entered the Nvidia tool and created a custom resolution (because the monitor does not list what it can do over HDMI). Not all settings work, but what does work is:
- 2560x1600, 16 bit, 60Hz
- 2560x1600, 32 bit, 24Hz
- 2560x1600, 32 bit, 36Hz (MacBook Pro Retina)
I'm not seeing any flickering @ 24Hz, so I'm leaving it at that for now. According to the HDMI 1.3 specification, it should be able to do 24- or 30-bit color at 60Hz, and 1.4 even more, but I can't configure that. I need to take that up with Nvidia.
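One note on the 16-bit vs. 32-bit combinations above: Windows "32-bit" color is 24 bits of RGB plus 8 unused bits in the framebuffer, and a normal (non-Deep-Color) HDMI link carries 24 bits per pixel either way, so it is the refresh rate, not the Windows bit-depth setting, that drives the required TMDS clock; that 16-bit happens to validate at 60 Hz while 32-bit does not is presumably a driver-validation quirk rather than a link limit. A rough estimate for the listed rates, assuming the same reduced-blanking frame (2720x1646 total) at each one:

```python
# Approximate pixel clock at each refresh rate the user tried, assuming
# the 2560x1600 CVT-RB frame total of 2720x1646 for all of them.
for hz in (24, 36, 60):
    clock_mhz = 2720 * 1646 * hz / 1e6
    print(f"2560x1600 @ {hz} Hz -> ~{clock_mhz:.1f} MHz")
```

The lower refresh rates need only a fraction of the 60 Hz clock, which is consistent with them working even on links (or drivers) that refuse the full-rate mode.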
