Help with Apple 20-inch Cinema at 1600x1200


bmac
March 22nd 06, 11:34 PM
I recently had to rebuild my PC with Windows XP Home edition. This is
the third time I've rebuilt the PC so I'm getting pretty good at it.
The first two times were due to viruses, and this time was because I
had a network problem. I originally had a 17" Viewsonic tube but
upgraded to an Apple Cinema 20-inch monitor in 2005. The PC is an HP
Pavilion a450n with a 3 GHz P4 CPU and an nVidia GeForce 5200 8x AGP
video card. So here's my problem. I used to run at 1600x1200, but now I
can't. I don't remember having to do anything special to get that
resolution. Now I can only get 1680x1050. What gives?

The Apple web site says that this is the native resolution, so I'd think
I'm just imagining things, but my home-made wallpapers from my digital
photos are all 1600x1200, made specifically for the screen. Now they don't
fit 'one-up' and look funny when stretched.

So how did I run at 1600x1200? Just like the previous times, when I
rebuilt PC I installed all HP recommended upgrades, all Windows patches
(including SP2), and the latest nVidia driver (ForceWare Release 80
Version: 84.21). The PC reports the monitor as a plug-and-play monitor
at 60Hz and the nVidia software reports the monitor capable of
1680x1050.

Help!

bmac

Bob Davis
March 23rd 06, 04:42 PM
"bmac" wrote:

> So how did I run at 1600x1200? Just like the previous times, when I
> rebuilt PC I installed all HP recommended upgrades, all Windows patches
> (including SP2), and the latest nVidia driver (ForceWare Release 80
> Version: 84.21). The PC reports the monitor as a plug-and-play monitor
> at 60Hz and the nVidia software reports the monitor capable of
> 1680x1050.

This is a widescreen monitor, right? Setting the resolution to 1680x1050
matches the screen's physical proportions, but 1600x1200 does not, and
using the latter will stretch images to fit the screen. I would think you
would prefer the former setting.

Benjamin Gawert
March 23rd 06, 09:50 PM
bmac wrote:

> I recently had to rebuild my PC with Windows XP Home edition. This is
> the third time I've rebuilt the PC so I'm getting pretty good at it.
> The first two times were due to viruses, and this time was because I
> had a network problem. I originally had a 17" Viewsonic tube but
> upgraded to an Apple Cinema 20-inch monitor in 2005. The PC is an HP
> Pavilion a450n with a 3 GHz P4 CPU and an nVidia GeForce 5200 8x AGP
> video card. So here's my problem. I used to run at 1600x1200, but now I
> can't. I don't remember having to do anything special to get that
> resolution. Now I can only get 1680x1050. What gives?

The Apple 20" Cinema display doesn't support 1600x1200. It supports only
resolutions up to 1680x1050 (which is the native resolution).

> The Apple web site says that this is the native resolution so I'd think
> I'm just imagining things, but my home-made wallpapers from my digital
> photos are all 1600x1200, made specifically for the screen. Now they don't
> fit 'one-up' and look funny when stretched.
>
> So how did I run at 1600x1200?

You didn't, because you can't run a TFT at a resolution higher than its
native one (1600x1200 needs 1200 lines, more than the panel's 1050), at
least not without the use of virtual desktops (that scroll around on the
screen). The Apple Cinema 20" supports 1680x1050. Period. It does not
run at 1600x1200.

Besides that, 1600x1200 is a 4:3 resolution while 1680x1050 is a
widescreen resolution (16:10). 1600x1200 on a widescreen display means
everything would be stretched and deformed.
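The mismatch described above is easy to check numerically. A minimal sketch (Python; the resolutions are the ones from this thread, the helper name is my own):

```python
# Compare the two aspect ratios discussed in this thread.
from fractions import Fraction

def aspect(width, height):
    """Return the aspect ratio as an exact reduced fraction."""
    return Fraction(width, height)

ratio_4_3 = aspect(1600, 1200)    # the old CRT-style resolution
ratio_16_10 = aspect(1680, 1050)  # the Cinema Display's native ratio

# Horizontal stretch factor if a 4:3 image is forced to fill a 16:10 panel
stretch = ratio_16_10 / ratio_4_3

print(ratio_4_3)       # 4/3
print(ratio_16_10)     # 8/5
print(float(stretch))  # 1.2, i.e. everything appears 20% wider
```

So a 1600x1200 wallpaper stretched across the 1680x1050 panel comes out 20% too wide, which matches the "look funny when stretched" complaint.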

Your problem sounds clearly like a misconfiguration of Windows regarding
wallpapers (if you choose "stretched" you will have a stretched wallpaper).

> Just like the previous times, when I
> rebuilt PC I installed all HP recommended upgrades, all Windows patches
> (including SP2), and the latest nVidia driver (ForceWare Release 80
> Version: 84.21). The PC reports the monitor as a plug-and-play monitor
> at 60Hz and the nVidia software reports the monitor capable of
> 1680x1050.

Sure, since this is what the monitor supports.

> Help!

With what? Running a 1680x1050 display at a higher resolution than it
supports? Besides being impossible, it also makes no sense, because
even if you could run the display at 1600x1200 (which you can't!)
everything would just be stretched.

Benjamin

bmac
March 24th 06, 02:53 AM
I'm running it now at 1680x1050 but I swear I used 1600x1200. I
certainly understand that I was (probably) skewing the display, but I
guess I got used to it. So I'll run the native resolution.

At work I have two HP 2035 displays sitting side by side, giving me a
3200x1200 desktop. It's really cool, and I was using multiples of my
1600x1200 home-made wallpaper. I didn't notice any distortion, but
I probably wasn't looking. Thanks!

Benjamin Gawert
March 24th 06, 06:37 AM
bmac wrote:

> I'm running it now at 1680x1050 but I swear I used 1600x1200.

You didn't, simply because it is not possible. You can't display more
pixels on an LCD than it has, period.

Maybe you were running one of the other widescreen resolutions like
1600x1000 or 1600x1024, but these usually are not available with the
Apple Cinema 20".

> I
> certainly understand that I was (probably) skewing the display, but I
> guess I got used to it. So I'll run the native resolution.

Which also provides the best image quality.

> At work I have two HP 2035 displays sitting side by side, giving me a
> 3200x1200 desktop. It's really cool, and I was using multiples of my
> 1600x1200 home-made wallpaper. I didn't notice any distortion, but
> I probably wasn't looking. Thanks!

You probably mixed up the resolution you remember with the one at your workplace.

Benjamin