LCD / Video Card advice request
Taavi Burns
taavi-LbuTpDkqzNzXI80/IeQp7B2eb7JE58TQ at public.gmane.org
Tue Apr 27 03:07:58 UTC 2004
On Mon, Apr 26, 2004 at 07:40:17PM -0400, Peter King wrote:
> It seems as though an LCD monitor with a DVI input, paired with a video
> card having a DVI output, produces better-quality results than analogue
> connections. (Correct me if I'm wrong.)
Aye, that'd be the point. ;)
> Many LCD monitors -- certainly those capable of 1280x1024 or 1600x1200
> resolution, which is good for my purposes -- have DVI inputs.
>
> --> Question [1]: Any recommendations for DVI-enabled LCD monitors
> at higher resolutions? Good, bad, horror stories?
I have an 18" LCD monitor at work that displays at 1280x1024. It seems
to work just fine, and has both VGA and DVI inputs.
> --> Question [2]: Any recommendations among the various ATI cards? Or
> should I be looking at some other manufacturer for
> this? Again, the good/bad/horrific stories are all
> useful.
The general wisdom that I'm aware of is that Matrox cards produce
crisper output than ATI cards, which in turn produce crisper output
than NVidia cards. (There was an article with oscilloscope captures
backing this up, but I sure can't find it now... so take that for
what it's worth.)
Now, that's probably far less relevant on an LCD, since its pixels
have crisp physical boundaries rather than being defined by the
analogue signal.
That being said, there are two things you should do to ensure
a crisp image on any LCD:
1) Run it at its native resolution (see the config sketch after
   this list)
2) "Retrain" the monitor to the specific video source
As far as I can tell, my monitor at work samples the analogue video
signal at regular intervals; if you just toss a random signal at it,
it will do its best to guess where the pixels lie with respect to the
vsync and hsync signals, but it's not perfect. I noticed 'buzzing' of
pixels at high-contrast boundaries (like non-antialiased black text on
a white background). The monitor's OSD has an option to "auto-tune"
the display, which made the screen look funky for a few seconds and
then left everything CRYSTAL clear. I have to redo it when switching
between the desktop's Matrox card and the laptop's ATI chipset. It's
perfectly usable without, but that extra level of crispness is but a
menu option away. DVI shouldn't suffer from this problem, though,
being digital and all. ;)
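If you want to give the auto-tune something hard to chew on, a
one-pixel black/white checkerboard is about the worst case for the
sampling clock and phase, so it makes a good pattern to have on
screen while retraining. Here's a quick Python sketch (the filename
and resolution are just placeholders) that writes one out as a PPM
you can show with any image viewer, e.g. ImageMagick's
"display -window root checker.ppm":

  #!/usr/bin/env python
  # Write a one-pixel black/white checkerboard as a binary PPM, to be
  # shown unscaled while running the monitor's auto-tune so it has
  # hard pixel edges to lock its clock and phase onto.
  WIDTH, HEIGHT = 1280, 1024   # set to your panel's native resolution

  with open("checker.ppm", "wb") as f:
      f.write(b"P6\n%d %d\n255\n" % (WIDTH, HEIGHT))
      for y in range(HEIGHT):
          row = bytearray()
          for x in range(WIDTH):
              v = 255 if (x + y) % 2 == 0 else 0
              row.extend((v, v, v))   # R, G, B: pure white or black
          f.write(row)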
--
taa
I am suggesting we examine the damage to children
whose childhoods might not even be called strict.
--Karen Walant
/*eof*/