Community Manager

April 27th, 2005 16:00

Projector Resolution: Native vs. Maximum

Projector Resolution: Native vs. Maximum
Evan Powell, April 6, 2005

If you are like a lot of folks getting into projectors for the first time, you may be confused by the fact that every projector has TWO separate specifications for resolution: native and maximum. Why two different specs, and what is the difference between them?

It is actually pretty simple. Every projector that uses microdisplays, whether they are LCD panels, or DLP or LCOS chips, has a fixed array of pixels on those microdisplays. That fixed array of pixels is known as the native resolution of the projector. So native resolution is the actual, true, physical resolution of the projector. The projector will never be able to display more actual pixels than it has on those panels or chips.

So then what is maximum resolution? Well, that number has nothing to do with the projector's physical display. Instead it has to do with signal formats. Computer and video signals come in a wide variety of resolution formats. And every projector is programmed to recognize many of those different signals. Maximum resolution is the highest resolution signal that the projector has been programmed to process and display.

Converting non-native signal formats to native resolution. When a projector gets a signal that does not match its native resolution, it must convert that signal to the format of its native resolution in order to display it properly. This conversion process is commonly referred to as scaling.

So for example, let's assume you have a projector with a native resolution of 1280x720 that is capable of displaying an HDTV 1080i signal. That means that the projector's physical pixel matrix is 1280 pixels wide by 720 pixels in height. However, each frame of video in an HDTV 1080i signal contains 1920x1080 pixels, which is a lot more than the projector has on its physical display. So in order to display the 1080i signal the projector must compress it into a 1280x720 format. It can do this because it has been programmed to do the compression from 1920x1080 to 1280x720. Furthermore, if 1920x1080 is the highest resolution that your projector has been programmed to recognize and compress into its native display, then 1920x1080 is known as the maximum resolution of that projector.
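The arithmetic behind this example is easy to check for yourself. The sketch below is purely illustrative (projectors do the conversion in hardware, not like this); it just shows how many source pixels must be squeezed into each display pixel, and how much raw pixel data is discarded per frame:

```python
# Hypothetical illustration: compressing a 1920x1080 signal
# into a projector's native 1280x720 pixel array.

native_w, native_h = 1280, 720    # projector's physical pixel matrix
signal_w, signal_h = 1920, 1080   # incoming HDTV 1080i frame

ratio_w = signal_w / native_w     # source pixels per display pixel, horizontally
ratio_h = signal_h / native_h     # and vertically

discarded = signal_w * signal_h - native_w * native_h

print(f"{ratio_w} x {ratio_h} source pixels per display pixel")
print(f"pixel data discarded per frame: {discarded:,}")
```

In other words, every 1.5 source pixels in each direction must be merged into one display pixel, and over a million pixels of each frame cannot be shown directly.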

Sometimes the incoming signal format is smaller than the native resolution of the display. For example, let's assume you have a native XGA resolution projector, and you are displaying a standard NTSC television signal. In this case, your projector has a native 1024x768 pixel array. But a regular NTSC television signal is only 640x480 pixels. So the projector must "scale" or expand that television signal up from 640x480 pixels to 1024x768 pixels in order to display the image full frame.
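A minimal sketch of how such an expansion can work, using crude nearest-neighbor sampling on a toy "image" (real scaling engines use far more sophisticated interpolation, so treat this as a conceptual stand-in only):

```python
def upscale_nearest(pixels, src_w, src_h, dst_w, dst_h):
    """Expand a row-major pixel list by sampling the nearest source pixel."""
    out = []
    for y in range(dst_h):
        src_y = y * src_h // dst_h          # nearest source row
        for x in range(dst_w):
            src_x = x * src_w // dst_w      # nearest source column
            out.append(pixels[src_y * src_w + src_x])
    return out

# A tiny 2x2 "image" expanded to 4x4: each source pixel simply
# gets repeated, so the picture fills the frame but gains no new detail.
tiny = ['A', 'B',
        'C', 'D']
big = upscale_nearest(tiny, 2, 2, 4, 4)
print(big)
```

Note what the expansion does: each of the four original pixels now occupies a 2x2 block. The image is bigger, but it contains exactly the same information it started with.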

Conversely, in our previous example, we talked about converting a larger 1920x1080 signal into a smaller 1280x720 display. Technically speaking, that is known as compression since you are compressing the signal to fit the display. However, it is less common to use the term compression these days. In typical usage, the term scaling refers to any conversion of a data or video signal to a projector's native display format, whether it is being scaled up (expanded), or scaled down (compressed).

No matter what, you always lose something in scaling. The conversion of a non-native signal to a native resolution is a process of approximation. In essence, what the projector is doing is estimating what the pixel information would have been if the signal had been created in the projector's native resolution to begin with. In doing this, the projector cannot add new information to the original signal. The best it can hope to do is make a very close approximation and not lose much picture detail in the process. So contrary to what you might think, when an XGA projector scales a television signal up from 640x480 to 1024x768, it does not add detail or sharpness. In actuality the picture will usually look a bit softer than it would if the projector had a native 640x480 display and avoided the scaling to begin with.
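The irreversibility of that loss is easy to see in a toy example, again using crude nearest-neighbor sampling as a stand-in for a real scaling engine. Once fine detail has been dropped in a downscale, no amount of upscaling can bring it back:

```python
def scale_nearest(row, dst_len):
    """Resample a 1-D row of pixels to dst_len samples, nearest-neighbor."""
    src_len = len(row)
    return [row[i * src_len // dst_len] for i in range(dst_len)]

fine = [0, 255, 0, 255, 0, 255]     # six pixels of fine alternating detail
small = scale_nearest(fine, 3)      # compress to three pixels
back = scale_nearest(small, 6)      # expand back to six pixels

print(small)   # the alternation is gone
print(back)    # and it cannot be recovered
```

The compressed row comes out as flat gray-less `[0, 0, 0]`: the alternating detail fell between the sample points. Expanding it back to six pixels just repeats those values; the original pattern is gone for good.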

Overall, the scaling engines have gotten to be very accurate with video these days. Quite often a scaled video image looks just about as clear and crisp as it would if displayed in its own native format. However, this is not as true of computer data signals. So let's focus on data projection for a moment. It is common for a native XGA resolution projector (1024x768) to have, say, SXGA (1280x1024) listed as its maximum resolution. All that means is that you can feed a 1280x1024 computer signal into the projector. However, when the projector compresses this signal into its native 1024x768 display, the picture will be somewhat fuzzier (sometimes a lot fuzzier) than it would be if the signal was native XGA to begin with.
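One reason data images suffer more than video when compressed: text is full of single-pixel-wide strokes, and the ratio between SXGA and XGA is not a whole number, so those strokes land between display pixels and must be blended across neighbors. A hedged back-of-the-envelope calculation (illustrative arithmetic only):

```python
# Compressing an SXGA (1280x1024) data signal into a native XGA
# (1024x768) display: the scale ratios are fractional.
sxga_w, sxga_h = 1280, 1024
xga_w, xga_h = 1024, 768

ratio_w = sxga_w / xga_w    # every 5 source columns -> 4 display columns
ratio_h = sxga_h / xga_h    # every 4 source rows -> 3 display rows

print(f"horizontal ratio: {ratio_w}, vertical ratio: {ratio_h:.3f}")
# A one-pixel text stroke in the source becomes 1/ratio of a display
# pixel (0.8 wide, 0.75 tall), so the scaler must smear it across
# neighboring pixels -- hence the fuzziness.
```

With video this blending is largely invisible, but with crisp black-on-white text it shows up immediately as soft or uneven character edges.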

So if you are going to give a data presentation using the Internet, text documents, or financial spreadsheets, you will want the sharpest image possible. In this situation, you will be better off ignoring the projector's maximum resolution spec—the fact that the projector can accept a higher resolution signal does not mean you will get a sharper image. Instead, make sure to set your computer's output resolution to match the native resolution of the projector; if you have a native XGA projector, then set your computer to output 1024x768. This will eliminate any scaling or compression, and give you the cleanest possible data image that your projector is capable of.


DELL-Chris M
#IWork4Dell


August 14th, 2005 21:00

Evan, I'm trying to decide between the Dell 1100MP and the 2300MP. Sounds like since normal TV is 640x480, the native resolution of the 1100MP will be more than enough for normal TV. Correct?

Since the 2300MP is almost twice the cost of the 1100MP and the only feature differences I see are native resolution (800x600 vs 1024x768) and maximum resolution (1400x1050 vs 1600x1200), I'm wondering if the difference in spec is worth the extra $. 

I plan on using the projector for home theater driven by my cable box.  Room is about 15x15.   Box supports HDTV.  Don't watch that many DVD's but would want high quality when I do.

It is difficult to know what is right since I don't know of a venue where I can let my eyes decide.  Any advice will be appreciated. 

 

Community Manager


August 15th, 2005 02:00

riverbank,

Evan wrote that document. Contact Evan here:
http://www.projectorcentral.com/maximum_resolution.htm

February 27th, 2006 20:00

I have a 5100 on the way. I also have an Inspiron 6100 with 1600x1200 resolution capabilities. I use my laptop for online games that sometimes require up to 4 windows to be open at once, which is only possible with 1600x1200 resolution. With the 5100, can I achieve that without much distortion? Can it handle the video from the laptop?

Thanks

Steve
