Standard User John_ON
(knowledge is power) Mon 06-Feb-12 11:57:25

When is DVI-I not DVI-I ?


 
Hi Folks,

My old graphics card is an ATI Radeon 2600 with two DVI sockets. I'm using it to drive two VGA monitors with DVI-VGA adapters.

It's getting a bit old so, in my ignorance, I recently tried to upgrade it to this one. I soon learned that DVI-D doesn't provide any VGA output on the DVI socket.

All credit to Ebuyer, they exchanged it for this one and, down in the specs, it says DVI-I (dual link). I fitted that one this morning and there's no VGA output from the DVI socket on this one either.

Again, all credit to Ebuyer, they are collecting this one and gave me the option of ordering a different card or having a refund. As I now haven't got a clue how to tell whether any particular card will drive two VGA monitors (with or without DVI-VGA adapters), I had no option but to go for the refund.

So, does anyone have any first-hand experience of an AMD graphics card, of similar performance to the ones I've tried, that they know provides two VGA outputs?

Thanks,

John.

________________________________________________
RouterStats 6.7 and RouterStats-Lite: line monitoring tools for the DG834(G) and others.

Website: vwlowen.co.uk
ISP:Be*

Edited by John_ON (Mon 06-Feb-12 12:01:19)

Standard User iand
(experienced) Mon 06-Feb-12 18:42:32

Re: When is DVI-I not DVI-I ?


[re: John_ON]
 
If the connector on the graphics card is DVI-I then you get both digital and analogue signals:

http://pinouts.ru/Video/dvi_pinout.shtml

I use an HD5770, which works with a converter so I can use a VGA monitor...
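
Very roughly, and purely as an illustration (Python; a simplified summary of the pinout page above, not the full pin table), the analogue contacts are what a passive DVI-to-VGA adapter depends on:

# Illustrative sketch: which analogue contacts (C1-C5) each DVI variant
# carries, per the pinout reference above. A passive DVI-to-VGA adapter
# just re-routes these contacts, so it can only work if the source
# connector actually has them.

ANALOGUE_PINS = {
    "DVI-I": {"C1", "C2", "C3", "C4", "C5"},  # R, G, B, H-sync, analogue ground
    "DVI-A": {"C1", "C2", "C3", "C4", "C5"},  # analogue-only variant
    "DVI-D": set(),                           # digital only: no analogue contacts
}

def passive_vga_adapter_works(connector: str) -> bool:
    """True if a passive DVI-to-VGA adapter can pick up an analogue signal."""
    return {"C1", "C2", "C3", "C4"} <= ANALOGUE_PINS.get(connector, set())

for variant in ("DVI-I", "DVI-A", "DVI-D"):
    status = "VGA via passive adapter OK" if passive_vga_adapter_works(variant) else "digital only"
    print(variant, "->", status)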

IanD
Standard User John_ON
(knowledge is power) Mon 06-Feb-12 20:07:09

Re: When is DVI-I not DVI-I ?


[re: iand]
 
My monitors are VGA-only, unfortunately. If AMD stuck to the proper DVI standard, the female DVI sockets they fit to their DVI-D cards wouldn't accept a male DVI-to-VGA adapter, because the adapter would be trying to use 'holes' that shouldn't be there.

It wouldn't solve my present issue but at least I'd have known instantly and for certain that their cards don't output a VGA signal on the DVI socket. Even their own website just says "DVI" without specifying -D or -I.

These are just "mid-range" cards, not some super-gaming things, and VGA monitors are perfectly adequate for them. If AMD are trying to quietly drop DVI-I, I think they'll find their card sales dropping off instead of people rushing out to replace perfectly adequate monitors.

John.

________________________________________________
RouterStats 6.7 and RouterStats-Lite: line monitoring tools for the DG834(G) and others.

Website: vwlowen.co.uk
ISP:Be*



Standard User mixt
(experienced) Mon 06-Feb-12 20:23:30

Re: When is DVI-I not DVI-I ?


[re: John_ON]
 
As I understand it, VGA is an analogue signalling system, while DVI is (no surprise here!) digital. The original DVI standard allowed the analogue signals to be carried alongside the digital ones, on dedicated pins, so that, as you say, a DVI-to-VGA adapter can feed a monitor that only supports VGA. The analogue-only variant is classed DVI-A; a connector carrying both digital and analogue is DVI-I, and digital-only is DVI-D.

Source: http://en.wikipedia.org/wiki/Digital_Visual_Interface

So it would seem AMD are now producing cards that don't output these analogue signals any more, meaning you can't use VGA adapters with them. Time to buy a new monitor (you know they manufacture stuff with a lifetime of about 2 years now, right? Within that time it breaks or becomes outdated anyway).

If I were you, I would look for a graphics card that has both sockets on the back, one of them being an actual VGA connector. That way, at least you know it supports VGA and will work with a VGA monitor.
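
To put that buying rule in one place, here's a rough sketch (Python, just illustrative; the port lists are made-up examples, and it ignores DisplayPort and active adapters) of how many VGA monitors a card could drive, on the assumption that only a native VGA port or a genuine DVI-I port carries the analogue signal:

# Rough sketch: given the outputs listed in a card's spec, how many VGA
# monitors could it drive, directly or via passive DVI-to-VGA adapters?
# Assumption: only a native VGA port or a genuine DVI-I port carries the
# analogue signal; DVI-D and HDMI do not. (DisplayPort with an active
# adapter is ignored here for simplicity.)

VGA_CAPABLE = {"VGA", "DVI-I"}

def max_vga_monitors(ports):
    return sum(1 for port in ports if port in VGA_CAPABLE)

# Hypothetical spec listings, not real product data:
old_card = ["DVI-I", "DVI-I"]        # e.g. the old Radeon 2600 layout
new_card = ["VGA", "DVI-D", "HDMI"]  # a common newer layout
print(max_vga_monitors(old_card))    # 2 - both VGA monitors OK
print(max_vga_monitors(new_card))    # 1 - only one VGA monitor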

EDIT: I suspect you already know all of the above, just clarifying anyway.

Now on <aaisp.net> (21CN+IPv6)
Previous ISPs: Virgin Media (50Mb/Cable), Be* Un Limited, ZeN
Is Linux routing your internet connection?
Need to make BIND geo-aware?

Edited by mixt (Mon 06-Feb-12 20:29:06)

Standard User John_ON
(knowledge is power) Mon 06-Feb-12 20:53:07

Re: When is DVI-I not DVI-I ?


[re: mixt]
 
The difficulty is that I'm running dual monitors, but I suppose I could just buy one - although it is nice when they're "matched". The new cards that I've tried have a VGA socket and a DVI (-?) socket. Either of the VGA monitors works fine (naturally) in the VGA socket, but neither will work in the DVI socket with an adapter.

As the link provided by the previous poster shows, a female DVI socket that doesn't carry the VGA signal shouldn't have the holes where the VGA connections would be. AMD are, in effect, fitting DVI-I sockets but only wiring them internally as DVI-D. So much for "standards" - or perhaps AMD are above that sort of thing!

What really annoys/concerns me, though, is that it now appears to be a lottery as to whether a card provides a VGA output, as the major manufacturer seems 'coy' about stating either -I or -D.
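
For what it's worth, once a card is actually fitted the driver itself reports the distinction: on a Linux/X11 machine, for example, xrandr names the ports "DVI-I-1", "DVI-D-0" and so on. That's no help before you buy (and it assumes a Linux box with xrandr, which may well not apply), but as a sketch of reading it back:

# Sketch only: list the connector names reported by the driver on Linux/X11.
# xrandr names outputs e.g. "DVI-I-1", "DVI-D-0", "VGA-0", "HDMI-0", so a
# DVI-I versus DVI-D port shows up directly in the name.
import subprocess

def list_connectors():
    out = subprocess.run(["xrandr"], capture_output=True, text=True, check=True).stdout
    names = []
    for line in out.splitlines():
        parts = line.split()
        if len(parts) >= 2 and parts[1] in ("connected", "disconnected"):
            names.append(parts[0])
    return names

for name in list_connectors():
    print(name)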

John.

________________________________________________
RouterStats 6.7 and RouterStats-Lite: line monitoring tools for the DG834(G) and others.

Website: vwlowen.co.uk
ISP:Be*
Standard User 12eason
(eat-sleep-adslguide) Wed 08-Feb-12 15:44:34

Re: When is DVI-I not DVI-I ?


[re: John_ON]
 
I have an Nvidia card that supports two analogue outputs, but none of the AMD cards I've got supports two. Like you, I've bought cards (and a motherboard) on the basis of their being advertised as dual-link DVI-I, only for them to turn out to be dual-link DVI-D. The best bet with ATI is to find one with DisplayPort, as you can buy VGA adapters for that (active converters, since DisplayPort doesn't carry an analogue signal).

Edited by 12eason (Wed 08-Feb-12 16:00:28)
