The 2560x1440@144Hz resolution would require a pixel clock of 586 MHz. The GeForce GTX 770's max pixel clock is 540 MHz. You'll need a GTX 980 class GPU, which has a higher pixel clock, in order to support 2560x1440@144Hz.
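For anyone curious where the 586 MHz figure comes from: the pixel clock is total pixels per frame (active resolution plus blanking intervals) times the refresh rate. Here's a rough sketch in Python. The blanking values below are assumed CVT-reduced-blanking-style estimates, not the monitor's exact timings, but they land close to the quoted numbers:

```python
def pixel_clock_mhz(h_active, v_active, refresh_hz, h_blank=160, v_blank=56):
    # Total pixels per scanned frame include horizontal and vertical
    # blanking. h_blank/v_blank here are assumed CVT-RB-style values,
    # not the S2716DG's actual EDID timings.
    h_total = h_active + h_blank
    v_total = v_active + v_blank
    return h_total * v_total * refresh_hz / 1e6

print(round(pixel_clock_mhz(2560, 1440, 144)))  # ~586 MHz
print(round(pixel_clock_mhz(1920, 1080, 144)))  # mid-300s MHz
```

That also shows why 1920x1080@144Hz should, on paper, fit within a 540 MHz limit.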
My brother has a 1440p 144Hz monitor with a GTX 970 and he is able to select 144Hz as an option. So how does that make sense?
I asked on a different forum and a guy there replied that he could enable 144Hz on his GTX 780, so that is probably the minimum requirement.
Again this "makes sense", but it leaves a pretty big question: running the monitor at 1920x1080@144Hz should work based on the above, correct? The pixel clock at that resolution should be in the mid 300s (MHz), which a 770 clearly beats, and the same goes for even lower resolutions. So why doesn't it work?
Exactly! If the pixel clock were the deciding factor in this, lowering the resolution should work. I do not buy, for a second, that a 980 is required to achieve 1440p 144Hz. Numerous people with cards below a 980 are able to do it. So what gives???
Hmmm... I too am only able to set to 120Hz with my new S2716DG.
I'm connected via DisplayPort using the included cable to my Nvidia GTX 660. I'm planning to upgrade the video card once the new 10-series cards drop at the end of the month, so I don't know if it is a limitation of this card.
No it doesn't! It seems like the monitor is scaling any signal up to native resolution, whether it's 800x600, 1024x768, 1920x1080 or anything else.
So if your GPU can't handle 2560x1440 @ 144Hz, it is not possible to run this monitor at 144Hz, no matter what resolution. Here's the proof: http://imgur.com/Mq9Ioyx
Seems to be a limitation of the card. I just got a GTX 1080 and the monitors now run at 144Hz; I used to have a GTX 660.