Devices: OptiPlex 9020 Micro (Windows 7) => DisplayPort-to-HDMI adapter => 43" Vizio 4K HDMI TV
Issue: periodic flickering. The PC does seem to recognize the 4K resolution.
Question: Is it unreasonable to expect this PC to work with a 4K display (mainly text and scientific plots)? If it is reasonable, has anyone successfully used a 4K monitor with the 9020 Micro? Any tests or corrections are appreciated. Thank you.
Dell makes no such claims about non-interlaced, full-screen 4K video on the 9020 Micro. If you have a GPU with 4 to 12 GB of video RAM, that's another story. Intel graphics is not an EVGA 12G-P4-3990-KR G-SYNC GeForce GTX TITAN Z, and it is priced accordingly.
@Speedstep: thanks for the insight; much appreciated. Your comments prompted me to ask myself the question: how much video RAM is needed? The PC is currently driving a 1600x900 monitor:
* 64 MB Dedicated Video Memory
* 0 MB System Video Memory
* 1632 MB Shared System Memory
Available Graphics Memory Total = 1696 MB. It would be nice if there were a utility to show what fraction of the shared system memory the graphics adapter is actually using. Maybe if this allocation were increased, it would improve performance at 4K? To my surprise, this PC cannot be configured with more than 8GB: http://www.crucial.com/usa/en/compatible-upgrade-for/Dell/optiplex-9020-micro
Interestingly enough, this article indicates that GPU RAM bandwidth is a bigger factor than the quantity of RAM:
"the quantity of RAM that the graphics card employs doesn't have a direct impact on game performance, but it can have an indirect impact. Graphics card RAM will only negatively affect performance if there isn't enough to handle what a specific game title requires. The point is that all other factors being equal, a graphics card with 2GB (2,048 MB) should perform exactly the same as a graphics card with 512MB as long as the game's graphics memory requirements are below 512MB. If the game's settings and resolution require more than 512MB of graphics card RAM, the 512MB card will demonstrate a performance penalty compared to the 2GB card."
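As a rough sanity check on that quote, the raw framebuffer footprint at different resolutions can be estimated with simple arithmetic. This is a back-of-envelope sketch: 4 bytes per pixel (32-bit color) and double buffering are assumptions, and real driver overhead will add to these numbers.

```python
# Rough framebuffer size estimate: width * height * bytes-per-pixel,
# times the number of buffers the driver keeps (double buffering assumed).
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=2):
    return width * height * bytes_per_pixel * buffers / 2**20

current = framebuffer_mb(1600, 900)   # the monitor in use today (~11 MB)
uhd = framebuffer_mb(3840, 2160)      # a UHD "4K" TV (~63 MB)

print(f"1600x900 double-buffered:  {current:.1f} MB")
print(f"3840x2160 double-buffered: {uhd:.1f} MB")
```

Notably, a plain double-buffered desktop at UHD already roughly fills the 64 MB dedicated allocation reported above, which would explain why Windows dips into shared system memory at higher resolutions.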
I am not gaming; I run less demanding applications (text, plots), and I am now left wondering what the maximum practical resolution for this PC is. Much is written in the Intel forums; however, I prefer field experience. I am hoping it will drive a http://www.lg.com/us/monitors/lg-29UM65-P-ultrawide-monitor when I return from vacation and get back to the office.
The Intel site indicates 4K is supported:
"Intel® HD Graphics P4600 offers an integrated ring bus technology that connects all CPU components (computational cores, L3 cache, graphics, and system agent with the memory controller) to enable an optimized approach of communicating with the system memory via the fast L3 cache. Intel® HD Graphics P4600 processor-based graphics will support Microsoft DirectX* 11, OpenGL* 4.0, OpenCL* 1.2, and DirectCompute* 5.0 standards. In addition to improvements in both 2-D and 3-D graphics performance, Intel® Quick Sync Video 2.0 technology almost doubles the H.264 transcoding speed of its predecessors and now supports up to 4K display resolutions."
Maybe Dell will take notice of this question and start positioning products and communications to make this easier for users. Intel indicates limitations with HDMI, and there will soon be many 4K HDMI devices. Apparently Intel advises against using HDMI adapters; however, I don't think that really serves users. As I understand it, all HDMI 2.0 cables can support 4K. In short, Dell should expect users to adopt 4K HDMI displays and position its marketing accordingly. Users are not inclined to spend another $1K-$2K just for a 4K monitor with DisplayPort connectors when a 4K TV is available.
Aside: after watching an NVIDIA video, I am wondering if they performed CFD to design their cooling.
The OptiPlex Micro does not come with the Xeon E3-1200's HD graphics. Intel® HD Graphics P4600 is based on the Intel® Xeon® Processor E3-1200 family.
The amount of RAM is not as significant as the width of the memory interface (64-bit, 128-bit, 256-bit, 384-bit, 512-bit, 640-bit), the number of shaders (20, 40, 80, 400, 800, 2468, ...), and the number of ROPs (2, 4, 8, 16, 32). They all use the PCI-E bus. GDDR3 is fast, but GDDR5 is better.
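To illustrate why interface width matters more than capacity: peak memory bandwidth scales with bus width times effective transfer rate, not with how many gigabytes are on the card. A sketch with illustrative numbers (the specific clocks chosen here are assumptions for the example, not specs of any particular card):

```python
# Peak memory bandwidth = (bus width in bytes) * effective transfer rate.
# DDR-type memory transfers twice per clock; GDDR5 effectively four times.
def bandwidth_gbs(bus_bits, clock_mhz, transfers_per_clock):
    return bus_bits / 8 * clock_mhz * transfers_per_clock * 1e6 / 1e9

low_end = bandwidth_gbs(64, 900, 2)    # 64-bit DDR3 @ 900 MHz -> ~14 GB/s
mid_range = bandwidth_gbs(256, 1250, 4)  # 256-bit GDDR5 @ 1250 MHz -> ~160 GB/s

print(f"64-bit DDR3:   {low_end:.1f} GB/s")
print(f"256-bit GDDR5: {mid_range:.1f} GB/s")
```

Both hypothetical cards could carry 1 GB of VRAM, yet the wide GDDR5 interface moves data an order of magnitude faster, which is what actually limits high-resolution performance.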
So the assertion that it supports, or should support, 4K is not correct. 4K, as defined by Digital Cinema Initiatives, is 4096 x 2160. This should not be confused with UHD-1 TV video, which has a resolution of 3840 x 2160.
The expected maximum for the Intel Core series is 2560 x 1600.
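To put those resolutions in perspective, a quick pixel-count comparison (a sketch; the 2560x1600 ceiling is the claim above, which I have not independently verified against Intel's specs):

```python
# Pixel counts for the resolutions discussed in this thread.
resolutions = {
    "DCI 4K": (4096, 2160),
    "UHD-1": (3840, 2160),
    "2560x1600": (2560, 1600),
    "1600x900": (1600, 900),
}
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels")

# UHD-1 pushes roughly twice the pixels of the claimed 2560x1600 ceiling.
print(f"ratio vs. 2560x1600: {3840 * 2160 / (2560 * 1600):.2f}x")
```

So even UHD-1, the smaller of the two "4K" formats, asks the integrated GPU to fill about double its claimed maximum, every frame.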
If there is a 9020 Micro with Intel Iris Pro graphics, it might be able to eke out 4K graphics, but most dedicated GPUs with 4 to 12 GB of video RAM will still be much faster.
Basic Core i3/i5 Intel HD Graphics has 6 shaders.
Intel P4600 Graphics has 20 shaders (more than 300 percent more than base Intel HD Graphics).
I think the 4th-gen Core i7 series has P4600-comparable video; this is on the $1000+ 9020, however.
Intel Iris Pro Graphics has 48 shaders and 128 MB of dedicated GPU RAM not shared with the OS.
These parts are much faster than Core i3/i5 Intel video, but even they do not support 4K video.
| | AMD Radeon HD 6450 | AMD Radeon HD 5450 |
| --- | --- | --- |
| Memory Clock | 900MHz DDR3 | 800MHz DDR3 |
| Memory Bus Width | 64-bit | 64-bit |
| VRAM | 1GB / 512MB | 1GB / 512MB |