Nvidia 560TI and HDMI

February 1st, 2013 08:00

Just curious about something. My nephew and I both have Aurora R4 computers with GTX 560 Ti graphics cards and Dell 2320L 23-inch monitors. He tried changing the DVI cable to HDMI and the picture was terrible. No matter how the monitor and card are adjusted, it still doesn't look good. He tried two different HDMI cables with the same results, gave up, and put the DVI cable back in, and the picture is great again.

Thinking he must have a bad graphics card, I tried the same thing when I got home. I get the exact same thing on my computer: with an HDMI cable the graphics are really bad, with DVI they're great. Both our resolutions are set to 1080p, 1920x1080. He did try the native 1920x1080 setting and that didn't help.

This is not a big problem, but I was just wondering why it happens. I thought DVI and HDMI were the same except that HDMI carries sound where DVI doesn't. Both cards are the OEM version, so I don't know if that could enter into it.

1 Rookie

 • 

445 Posts

February 4th, 2013 19:00

I tried the HDMI-to-DVI adapter with the same HDMI cable, and you are right: it works fine. The refresh rate is 59Hz with that setup.

Seems we learn something every day.

I am just going to leave it on the DVI connection for now.

I plan on upgrading the graphics card to a name-brand 660 Ti card in a couple of months, so I will see how things work out then.

1 Rookie

 • 

445 Posts

February 1st, 2013 19:00

Since there has been no response, I guess that's normal for this setup. I will just use DVI in the future.

8 Wizard

 • 

17K Posts

February 2nd, 2013 15:00

I use DVI for computer LCDs. The next best choice would be DisplayPort, and VGA is the last choice.

I only use HDMI when connecting to actual TVs and home theater gear.

1 Rookie

 • 

445 Posts

February 2nd, 2013 15:00

Thanks for the advice. I talked to a friend who has an AMD card and a Samsung monitor, and he ran into the same problem we did, so I guess it's not only Nvidia cards and Dell monitors. When he called Samsung support, they told him that HDMI cables don't like 1080p 1920x1080 resolution. That doesn't sound right, but no matter; I will use DVI on my computer and HDMI for my TV.

Thanks again.

8 Wizard

 • 

17K Posts

February 2nd, 2013 16:00

When he called Samsung support, they told him that HDMI cables don't like 1080p 1920x1080 resolution. That doesn't sound right,

No, it doesn't, because that is the resolution of most 1080p 16:9 HDTVs. However, in AMD Catalyst (and, IIRC, the Nvidia Control Panel as well) there is a special place to set up HDTVs.

Stick to the rule I suggested and you will have fewer problems.

1 Rookie

 • 

445 Posts

February 2nd, 2013 17:00

That's what I thought too. I will stick to what I know works best.

Community Manager

 • 

54.9K Posts

February 4th, 2013 11:00

lynne4270,

Below are the specs for the Nvidia GeForce GTX 560 Ti.

Two DVI-I DL 1.0 =
Dual-Link Maximum resolution over digital port (single GPU and SLI mode): 2560×1600×32bpp at 60Hz
Single-Link Maximum resolution over digital port (single GPU and SLI mode): 1920×1200×32bpp at 60Hz

DisplayPort = Maximum resolution: 2560×1600×32bpp at 60Hz

HDMI 1.4a  = Maximum resolution: 1920×1080×32bpp at 60Hz

That being said, going over HDMI to both the ST2320L and the SR2320L should have looked fine at 1920×1080×32bpp at 60Hz. I am wondering if there is a tweak needed in the Nvidia Control Panel...
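For reference, those per-connector ceilings come down to each link's pixel-clock budget. Here is a rough sketch of the arithmetic (assuming roughly 20% blanking overhead on top of the active pixels; real CVT/CEA timings vary per mode, so treat the numbers as ballpark):

```python
# Rough pixel-clock estimate for a display mode.
# The 20% blanking overhead is an assumption for illustration;
# real timings (CVT, CVT-RB, CEA-861) differ per mode.
BLANKING_OVERHEAD = 1.20

def approx_pixel_clock_mhz(width, height, refresh_hz):
    """Approximate pixel clock in MHz: active pixels * refresh * overhead."""
    return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

modes = [
    (1920, 1080, 60, "1080p (HDMI 1.4a max listed above)"),
    (1920, 1200, 60, "single-link DVI max listed above"),
    (2560, 1600, 60, "dual-link DVI / DisplayPort max listed above"),
]
for w, h, hz, label in modes:
    print(f"{label}: {w}x{h}@{hz}Hz ~ {approx_pixel_clock_mhz(w, h, hz):.0f} MHz")

# Single-link DVI is capped at a 165 MHz pixel clock; dual-link doubles
# the TMDS pairs. 1920x1200@60 only fits under 165 MHz with reduced
# blanking (CVT-RB is about 154 MHz), which is why it is the single-link
# ceiling, while 2560x1600@60 needs dual-link.
```

The takeaway: 1920×1080 at 60Hz fits comfortably within any of these links, so raw bandwidth should not be what makes the HDMI picture look bad.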

1 Rookie

 • 

445 Posts

February 4th, 2013 13:00

Thanks, Chris, for digging into that for me. I will take a look at the Nvidia Control Panel and see if I can find a setting that may have to be tweaked. I know when I plug the monitor in, it auto-detects and tells me it's HDMI, but I never thought to look in the Nvidia Control Panel. I will look in there tonight and see if I can find anything.

Thanks again.

2.4K Posts

February 4th, 2013 17:00


If you Google this issue you will see all kinds of posts about it with all types of systems and monitors, everything from Apple to HP.

Some people will take the HDMI cable and put a DVI adapter on it, and all of a sudden the picture is fine again. HDMI to HDMI = bad picture; HDMI to DVI = good picture. So people assume it's an issue with their system/monitor and HDMI. One Apple forum said that Apple released an update to fix it. /shrug

It's an issue I've never run into myself. My HDMI, DisplayPort, and DVI all work... knock on wood. I would suggest checking the refresh rate when playing with it. If it drops from, let's say, 60Hz to 30Hz, then it will make the picture blurry, especially when viewing text.
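For anyone who wants to check that without digging through dialogs, here is a minimal sketch (Windows-only, Python via ctypes; the truncated DEVMODEW below declares only the fields read here, which EnumDisplaySettingsW accepts because it honors dmSize):

```python
# Minimal sketch: query the current display mode on Windows.
import ctypes

ENUM_CURRENT_SETTINGS = -1  # wingdi.h: query the mode currently in use

class DEVMODEW(ctypes.Structure):
    _fields_ = [
        ("dmDeviceName", ctypes.c_wchar * 32),
        ("dmSpecVersion", ctypes.c_ushort),
        ("dmDriverVersion", ctypes.c_ushort),
        ("dmSize", ctypes.c_ushort),
        ("dmDriverExtra", ctypes.c_ushort),
        ("dmFields", ctypes.c_ulong),
        # display form of the anonymous union (POINTL + two DWORDs)
        ("dmPositionX", ctypes.c_long),
        ("dmPositionY", ctypes.c_long),
        ("dmDisplayOrientation", ctypes.c_ulong),
        ("dmDisplayFixedOutput", ctypes.c_ulong),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", ctypes.c_wchar * 32),
        ("dmLogPixels", ctypes.c_ushort),
        ("dmBitsPerPel", ctypes.c_ulong),
        ("dmPelsWidth", ctypes.c_ulong),
        ("dmPelsHeight", ctypes.c_ulong),
        ("dmDisplayFlags", ctypes.c_ulong),
        ("dmDisplayFrequency", ctypes.c_ulong),
    ]

dm = DEVMODEW()
dm.dmSize = ctypes.sizeof(DEVMODEW)  # tell the API how much we declared
if ctypes.windll.user32.EnumDisplaySettingsW(None, ENUM_CURRENT_SETTINGS,
                                             ctypes.byref(dm)):
    print(f"{dm.dmPelsWidth}x{dm.dmPelsHeight} "
          f"{dm.dmBitsPerPel}bpp @ {dm.dmDisplayFrequency}Hz")
```

If HDMI reports 30Hz or an interlaced mode where DVI reports 59/60Hz, that alone would explain the blur, especially on text.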
