hill10003
April 5th, 2012 22:00
Hello Elijah.
I tried both your suggestions, and it appears I got contradictory results.
First, I tried to use the "bad" monitor without the good one on the original computer. It still did not work. I have an ATI Radeon HD 3400 Series display adapter on that computer. (I uninstalled and re-installed the driver, but that did not solve the problem.)
I then tried the "bad" monitor on a second computer ... and it did work. The brightness of the screen briefly flickered once, but then it worked fine. I have an NVIDIA GeForce 8400 GS display adapter on the second computer.
I'll have to admit that I cannot make heads or tails of these results. Can you? lol
Steve
hill10003
April 17th, 2012 20:00
Hello Elijah.
I tried the "bad" monitor on the NVIDIA GeForce 8400 GS card for several days, and it works just fine. Interestingly, the "good" monitor that I left on the OptiPlex 780 has now ALSO started to blink and go black. There's some kind of 'degradation' effect going on with the OptiPlex 780.
I am using a DVI port on the OptiPlex 780, and a VGA port on the other 'test' computer. I am using different cables on the two computers ... but only because the ports are different.
I think I have a port or cable issue of some kind. What do you think I should try next?
Thanks.
hill10003
April 18th, 2012 11:00
Elijah,
I need to make a modification to my previous post.
The port I'm using on my S199WFP monitor is a DVI port. However, I'm connecting the monitor to a "Dell DisplayPort" (whatever that is) on my OptiPlex. I'm using a DP2DVI cable connector (made by Molex) to do it. Is this information relevant? Should I be looking for a single cable that goes from a DVI port to a "Dell DisplayPort"? Or is it possible that my Dell monitor simply doesn't support the Dell DisplayPort?
Thanks.
hill10003
April 19th, 2012 04:00
Near the base of the OptiPlex 780 (on the back), there are two DisplayPorts. I presume there is a card of some sort that's been added to the computer. On each monitor, I attach a DVI cable. Then, I attach a (smaller) DP2DVI cable connector. Then I plug each cable into a DisplayPort.
Steve
hill10003
April 20th, 2012 09:00
Elijah,
Maybe my next step should be to open the OptiPlex and see what "card" I have that is giving me two DisplayPorts. Should I just use a DVI splitter cable on one DVI port? Or will I lose some sort of functionality if I do that?
hill10003
April 23rd, 2012 19:00
Elijah,
Excellent call-out, my man. The DisplayPort (as well as a female DVI port) was covered by a plastic cover, so I had not seen them. I plugged a monitor into the DisplayPort and received the following message:
This computer has both a display port monitor attached and a card installed in PCI-Express slot 1. This configuration is not supported. Please remove the PCI-Express card or disconnect the display port monitor.
I disconnected the DisplayPort monitor and connected a monitor to the female DVI port. That didn't work either. I received the following error message:
This computer has an add-in graphics card, but the monitor cable is plugged into the integrated video connector.
My add-in graphics card is an ATI Radeon HD 3400 Series display adapter with two (2) DisplayPorts. Presumably, it is mounted in my PCI-Express slot 1.
In a way, I'm back to where I started. Perhaps my best bet is to buy a low-profile display adapter with two female DVI ports (if one exists). That way, I (a) dump the DP2DVI connector cables and (b) replace a display adapter with potentially bad DisplayPorts.
How does this look?:
http://www.amazon.com/EN210-SILENT-DI-1GD3-V2/dp/B004I8W4VI/ref=lh_ni_t
hill10003
April 24th, 2012 13:00
How do I do THAT? I don't even know what a power rating is! lol
Steve