All the information can be found here:
http://members.cox.net/minoten
Email me if you're interested in being part of the group buy; you can find my contact info on my site.
Also, there are two threads with talk of the kits:
http://www.lumenlab.com/forums/index.php?showtopic=3138
http://www.diybuildergroup.com/showthread.php?threadid=319 (look at the last 5 pages or so)
Thank you.
I'm amazed that there hasn't been any interest in this. This is the ultimate LCD right now in projector building! I can't wait to see how they work.
-Jim
Too much $$$ for most people to spend when you can get a 15" LCD monitor at Staples for $200
Gordon
Hahah, I won't even try to explain how it has three times the pixels and offers DVI, HDCP, PiP...
There are now 14 people involved, more than enough for this to happen.
I'm not trying to knock it, it is a great panel; it's just a little too expensive for most of us 🙁
I'd love to have one, but I'd need to sell my kidneys to get it.
Gordon
Yeah, bit too much $$$ for me at the moment. What's the time frame on this?
Also, minoten, those are my extension pics of the FFC. I wasn't planning on it, but if you want, I'll put up some better-looking pics. I took those in a hurry and didn't bother to light them very well.
Hey TJH,
I threw that LCD thing together today. I should have given you credit but I totally forgot; I will add it now, my apologies.
If you could take some brighter pictures, that would be awesome. I had to run your pictures through a Photoshop filter to brighten them, but they are still very dim. I would definitely appreciate it; you're the only person I know with a stripped LCD besides gazzaden.
What do you think of my summary of the LCD panel stripping? Is it accurate? That whole guide was definitely not official, just what I figured out from reading and looking at pictures.
The time frame... well, I would like to collect the test kit money first (there are currently 14 people, so about $58/person for us to get a test kit). Once ordered, the kit would probably take 2 weeks to be programmed and delivered. Once I have it, I estimate it will take 2 weeks to fully test (I need to buy some electronic components to test some of this stuff, and I will probably need to buy the Mouser FFC like you did).
At that point I would put in the order for the rest of the kits. I'm going to go ahead and say that more people will be interested once I post some high-res pictures of the panel in action, so we can probably add those who want to join at that point and get the panels even cheaper. That puts us at roughly 2.5-3 weeks to receive the test kit, and 4-5 weeks until I collect the money for the actual group buy. These dates aren't set in stone, though; it's just what I would like to do.
Edit: sorry if my grammar sucks, I get worked up when I write about this stuff and I just spew words 🙂
No worries about the credit. I just took a quick look and realized how bad the pics were; figured you might want some better-looking ones.
I just got into all this not too long ago and have only stripped the Kogi L4AX, but your guide sounds feasible. Each manufacturer tends to do things in very different ways, so from what I have gathered, each screen takes a bit of improvisation. I had a few pesky screws on this dated LCD (1993 build date) that forced me to actually cut away the plastic around them, but for a newer model like this that shouldn't be an issue; just don't strip the screws by using the wrong screwdriver 😉
As far as the pics go, I'm gonna grab some sleep, but I'll post some better ones later today.
A note:
According to the link in the diybuildergroup forum, anything over 1600x1200 requires two DVI connections. At least, that's how I read it:
To be DVI compliant, the graphics hardware must support a minimum of 25.175MHz, which is the signal frequency required to support 640x480 pixel resolution at 60Hz. Today's DVI 1.0 spec specifies a maximum single-channel bandwidth of 165MHz. This is good enough to support a 1600x1200 display in most cases, including CRTs refreshing at 60Hz.
Note that some LCDs may require less stringent blanking interval timing. A blanking interval is the time it takes for the display to start displaying the next frame or field. CRTs often require fairly long blanking intervals, so the electron gun can realign to the start of the refresh cycle. LCD flat panels often require smaller blanking intervals, but not all LCD blanking intervals are the same. The standard requires a minimum spec of 5% blanking interval for LCD displays, but if the blanking interval is longer, somewhat higher bandwidth may be required. You can theoretically "get away" with a clock rate of around 142MHz, if you assume a 5% blanking interval -- but hardware that will only support 142MHz for 1600x1200 is skating on thin ice.
DVI is based on the TMDS (transition minimized differential signaling) electrical protocol. TMDS is a differential signaling scheme that encodes pixel data and ships it over a serial link that typically consists of three to six data channel pairs plus a clock channel pair. Note, however, that DVI can support alternative media, such as fiberoptic transmission. (For more gory details on the DVI standard, you can actually download the DVI 1.0 spec from the DDWG web site.)
If you move to two DVI channels, then the standard allows for even higher bandwidth. Dual-link DVI graphics cards today (such as NVIDIA's Quadro FX series) can support up to 330MHz, which can easily handle 8 bits per pixel up to 2048x1536. A card with two DVI channels contains two TMDS transmitters and two DVI connectors. These two connectors can be used to connect two different digital displays, allowing the user to have a dual-display system. Alternatively, they can be used to connect to a single display device that requires a lot of signal bandwidth. Note that 2048x1536 (aka "QXGA") will only require 240MHz of signal bandwidth in typical cases, but that's still more than the 165MHz maximum bandwidth that one DVI channel can deliver. If you want to use an ultra-high-resolution display, like ViewSonic's VP2290b, then you'll need graphics hardware with two DVI transmitters.
Does this mean we need to have some funky dual-link workstation card to use this? Would a 6800GT with two DVI ports do? Perhaps a pair of 6x00 SLI cards?
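For reference, the arithmetic behind those bandwidth figures is just pixel clock = h_total x v_total x refresh rate, where the totals include the blanking intervals. A minimal sketch in Python (the timing totals below are VESA DMT/CVT figures from memory, so treat them as assumptions to verify):

```python
# Back-of-the-envelope DVI bandwidth check. The timing totals below are
# the published VESA figures as I understand them; double-check them
# before relying on this.

SINGLE_LINK_LIMIT_MHZ = 165.0  # DVI 1.0 single-link maximum

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock in MHz for a mode with the given *total* timings."""
    return h_total * v_total * refresh_hz / 1e6

modes = [
    # (name, h_total, v_total, refresh) -- totals include blanking
    ("1600x1200 @ 60Hz, standard blanking", 2160, 1250, 60),
    ("1920x1200 @ 60Hz, standard blanking", 2592, 1245, 60),
    ("1920x1200 @ 60Hz, reduced blanking",  2080, 1235, 60),
]

for name, ht, vt, hz in modes:
    clk = pixel_clock_mhz(ht, vt, hz)
    ok = "fits" if clk <= SINGLE_LINK_LIMIT_MHZ else "does NOT fit"
    print(f"{name}: {clk:.1f} MHz, {ok} in a single link")
```

With these numbers, 1600x1200 lands at exactly 162MHz, standard-blanking 1920x1200 blows past the cap at ~194MHz, and reduced-blanking 1920x1200 squeaks in at ~154MHz.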
If you look on my website, it says that it is a single-link DVI-D interface. I explained how it all worked on the DBG forums...
You can actually output at 1920x1200@60Hz with a single-link DVI, but you must use the "reduced blanking" feature (which our controller says it supports).
A single-link DVI tops out at 1600x1200 without tweaks, but by reducing blanking you can achieve higher resolutions, and if you also lower the refresh rate you can even push it up to WQUXGA.
Read here:
http://www20.graphics.tomshardware.com/graphic/20041129/tft_connection-05.htm
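A quick sanity check of that claim (my own arithmetic, not from the controller's datasheet): at 60Hz a 165MHz single link can move at most 165e6 / 60 total pixels per frame, blanking included, so you can compute how much blanking overhead each resolution can afford:

```python
# How much blanking overhead does the 165MHz single-link budget allow?
LINK_MHZ = 165.0
REFRESH = 60

def blanking_budget(h_active, v_active):
    """Return max total pixels/frame and the blanking overhead allowed."""
    max_total = LINK_MHZ * 1e6 / REFRESH   # pixels per frame, incl. blanking
    active = h_active * v_active
    return max_total, (max_total - active) / active

for name, w, h in [("UXGA 1600x1200", 1600, 1200),
                   ("WUXGA 1920x1200", 1920, 1200)]:
    max_total, overhead = blanking_budget(w, h)
    print(f"{name}: active {w*h/1e6:.2f}M px, "
          f"budget {max_total/1e6:.2f}M px/frame, "
          f"up to {overhead:.0%} blanking overhead allowed")

# UXGA:  1.92M active vs 2.75M budget -> ~43% overhead allowed (easy).
# WUXGA: 2.30M active vs 2.75M budget -> ~19% overhead allowed, which is
# why standard blanking (~40% overhead) fails but reduced blanking
# (~12% overhead) squeezes in.
```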
OK, here are the new pics (each about 100KB):
FFC Top
FFC Top #2
FFC Bottom
And if you want some older ones before it was extended:
Original Connection
FFC Alone
And the connector
Hope these help. Enjoy! 🙂
Thank you for calming my acute FUD, but then I read some more, and it still appears that the spec might have problems w/ WUXGA.
The result of our DVI compliance test is positive across the board, with all six cards reaching DVI compliance. However, while the three ATI-based cards provided by ABIT and ATI turned in exemplary results, MSI's NVIDIA-based cards are only able to reach DVI compliance in UXGA at a reduced frequency of 141MHz and using a reduced blanking interval. This greatly limits the NVIDIA cards' "upward mobility", since they don't have enough reserves for TFT displays with higher native resolutions than UXGA (1600x1200). The MSI NX6800 card only reached compliance at 162MHz when a separate TMDS transmitter chip was used. Judging by these results, it seems that ATI's integrated TMDS transmitter is superior to NVIDIA's implementation. Yet the MSI cards' eye diagrams displayed a turbulent distribution of the data even when the SiL 164 TMDS transmitter was used. This, in turn, limits the maximum usable cable length, especially when cheaper cables are used.
Only two of the NV cards among the six in the THG review appear to reduce blanking in their test. The ones that do bring the clock down from 162MHz to around 141MHz for UXGA. 141 * (1920/1600) = 169MHz, which is beyond the spec. The theoretical 5% blanking figures at the beginning would give about 125MHz for UXGA and 145.1MHz for WUXGA, but we don't know if video cards are capable of attaining those.
We definitely need to figure out how blanking works - does it only get used when it's necessary? Is the testing company turning it on/off? Does it have further bandwidth-reducing capacity? Is the ability to reduce blanking inherent within the DVI spec, and merely disabled in some of the cards because their signal is good enough to work at full blanking?
I realize that a 60Hz image isn't nearly as much of a problem on an LCD compared to a CRT, but it's still a bit of a goal. (And, come to think of it, isn't it a requirement for 1080i HD? [edit: Even though it's displayed progressively, 1080i probably uses acceptable bandwidth, assuming reduced blanking, because of its 1080 vertical lines vs. 1200.])
Also, it's recommended to keep the cable as short as possible, as longer cables seem to drastically increase noise.
This slashdot post makes it appear hit-and-miss:
Yes, that is true: on paper one DVI-D link caps out at 165MHz, which apparently specs out to 1600x1200. However, it _is_ possible to get a single DVI-D link to drive 19x12: witness that people have gotten the Sony 23", Apple 23", and Samsung 24" driven digitally by off-the-shelf cards that only supply single-link DVI (as you say, no card on the market provides dual-link DVI-D).
To wit, the ATI 9700 Pro has been verified to work even though it shouldn't. I'm currently installing XFree86 4.3.0 to try to drive an ATI 9000 Pro at that resolution. If it doesn't work, Dell gets the P232W back. The Apple 23-inch is driven over single-link DVI (well, ADC, which is DVI-D plus other cables), and that is the official way to do it.
This is contrary to what both ATI's marketing AND specs say. The common claim is that the blanking delay, which is there as a holdover from analog days, can be used to push the additional 3600+ pixels per redraw to the monitor. Alternately, you can feed it less than 60Hz.
Reminds you of the old joke: the American asks the Frenchman about his new theory, "Yes, but does it work in practice?", while the Frenchman asks about the American's prototype, "Well, does yours work in theory?"
DVI distortion looks horrific, this:
[image link broken]
Compared to this:
[image link broken]
I really want to see this thing succeed; I'm just concerned that people will spend their $600, get their panels, and then not be able to use them on their HTPCs.
Scenario: if
A) blanking is a yes/no feature, and cards are shipped with or without the capability,
B) blanking is an on/off feature, i.e. either you have it to the prescribed degree or you don't have it at all, and
C) the DVI spec max frequency of 165MHz is a hard limit,
then we'll be unable to use these @ 60Hz on any cards that can't blank significantly better than the two in that test.
Edit 1: This would only mean that you couldn't use them at full res @ 60Hz; you could probably use them at lower refresh rates.
Edit 2: The numbers do work out for a 1080i signal, assuming the same efficiency of reduced blanking in reducing bandwidth: (141)(1080/1200)(1920/1600) = 152MHz.
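Those proportional estimates, spelled out (a rough model only: it assumes the blanking overhead scales with the active resolution, which real timings don't quite do):

```python
# Rough scaling model used in the posts above: the needed clock scales
# with the number of active pixels, blanking overhead held constant.
def scale_clock(base_mhz, base_res, new_res):
    (bw, bh), (nw, nh) = base_res, new_res
    return base_mhz * (nw / bw) * (nh / bh)

uxga_rb = 141.0  # MHz, the reduced-blanking UXGA figure from the THG test
print(scale_clock(uxga_rb, (1600, 1200), (1920, 1200)))  # ~169 MHz, over the 165 cap
print(scale_clock(uxga_rb, (1600, 1200), (1920, 1080)))  # ~152 MHz, the 1080-line case
```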
Seems like all these issues really have nothing to do with the LCD panel/controller, but they are good points for building an HTPC.
All I can tell you is that it is single-link DVI-D and it says it supports WUXGA with reduced blanking. I will be testing it with 1080i/720p and various computer resolutions over the DVI-D input when I receive the test kit, and those results will of course be posted.
I don't buy NVIDIA anymore, so I'm glad to say my ATI card will work fine at 162MHz, not that crappy 142.
Note that this controller uses the Genesis GM1601 "ULTRA-RELIABLE" DVI interface... so I'd imagine it's pretty good 🙂
I got the impression from the THG article that 142MHz was bad, for some reason. For our application, why should it be? 142 indicates that reduced blanking is being used for UXGA; 162+ means that it's not being used.
If there are no degrees to how reduced blanking is applied, and it's a non-optional design feature of the board (and 165MHz is a hard cap for the DVI hardware), and you're already running @ 162 at UXGA, then you definitely won't be able to use this at WUXGA @ 60Hz. More importantly for you, it would mean that you won't be able to use even true 1080p.
While this stuff IS really HTPC stuff, I doubt that many people have invested the insane amount of money a large 1920x1200 panel costs (if they exist, and I assume they do) with the intent of hooking it up to an HTPC.
My impression was that the NVIDIA cards could not be cranked past 142MHz or they would be out of spec; they can't transfer as much data as the one that cranked up to 162MHz (165MHz is the DVI single-link limit). To reduce the clock you can reduce blanking, which is what they did to get 1600x1200 working at 142MHz. What they are saying is that with no reduced blanking, running at 160-165MHz, the NVIDIA card was out of spec; to get it in spec they had to reduce the blanking until it was at 142MHz.
This is not a good thing: to run WUXGA at 142MHz you would have to reduce the blanking a lot or even lower the refresh rate, while someone whose card can hold 162MHz just needs a slight blanking adjustment.
That was my interpretation; you can always look up the Apple Cinema Displays for discussion of 1920x1200 computer displays.
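To put a number on that (a sketch, using the reduced-blanking 1920x1200 totals from earlier as assumed values): a card capped at 142MHz falls just short of WUXGA at 60Hz even with reduced blanking.

```python
# If a card really tops out at 142MHz, how fast can it refresh WUXGA?
CARD_MHZ = 142.0
H_TOTAL, V_TOTAL = 2080, 1235   # 1920x1200 with reduced blanking (assumed CVT-RB totals)
max_refresh = CARD_MHZ * 1e6 / (H_TOTAL * V_TOTAL)
print(f"max refresh at WUXGA: {max_refresh:.1f} Hz")  # ~55.3 Hz, short of 60
```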
minoten said: My impression was that the NVIDIA cards could not be cranked past 142MHz or they would be out of spec; they can't transfer as much data as the one that cranked up to 162MHz (165MHz is the DVI single-link limit). To reduce the clock you can reduce blanking, which is what they did to get 1600x1200 working at 142MHz. What they are saying is that with no reduced blanking, running at 160-165MHz, the NVIDIA card was out of spec; to get it in spec they had to reduce the blanking until it was at 142MHz.
This is not a good thing: to run WUXGA at 142MHz you would have to reduce the blanking a lot or even lower the refresh rate, while someone whose card can hold 162MHz just needs a slight blanking adjustment.
That was my interpretation; you can always look up the Apple Cinema Displays for discussion of 1920x1200 computer displays.
It depends on how blanking is applied: are we sure that it is a feature that the video card 1) HAS inherently, if it supports the DVI spec, and 2) can throttle to different degrees?
Regarding the Apple Cinema, they do an end-run around the issue by using a dual-link architecture (they have two chips driving the cord).
...is a feature that the video card 1) HAS inherently, if it supports the DVI spec, and 2) can throttle to different degrees
1) From what I've read, reduced blanking can be turned on and off.
2) I am not sure about this; I would assume there are programs to control it.
I think the panel will support reduced blanking fine; the problem lies in the user's video card...
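For what it's worth, on Windows a tool like PowerStrip can set custom timings, and on Linux/XFree86 you can force a reduced-blanking mode with a custom modeline. Here's a sketch that prints the commonly published CVT-RB modeline for 1920x1200@60 (the timing numbers are assumptions to verify against a CVT calculator, not from our controller's docs):

```python
# Print an X11 modeline for 1920x1200 @ 60Hz with reduced blanking,
# using the CVT-RB timings as I've seen them published (assumed values).
h = (1920, 1968, 2000, 2080)   # active, sync start, sync end, total
v = (1200, 1203, 1209, 1235)
clock = 154.00                  # MHz; note it's under the 165MHz single-link limit
print(f'Modeline "1920x1200R" {clock:.2f} '
      f'{h[0]} {h[1]} {h[2]} {h[3]} {v[0]} {v[1]} {v[2]} {v[3]} '
      f'+HSync -VSync')
```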
Google's bounty:
The capability to reduce blanking is probably tied to a set of VESA standards; which one, I cannot tell. It's accessible through a special advanced setting (it even makes you agree to an "Advanced EULA" 🙂) in the Forceware drivers. I can't find documentation for the Catalyst drivers.
A Ti4200 and a 9700 have been confirmed working at 1920x1200@60Hz.
The stated conjecture was that anything from the Radeon 7500 / GeForce 4 generation and up has the capability.
The thing that was throwing me off about the THG article was that I wasn't sure what settings were constrained, and what settings were constraining.
</anxiety>
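If the VESA standard in question is CVT (Coordinated Video Timings) and its reduced-blanking variant, which seems most likely, the arithmetic is roughly this (a simplified sketch of the formula as I understand it; the constants are assumptions taken from the spec, so double-check them):

```python
# Sketch of the CVT reduced-blanking timing arithmetic (simplified).
import math

def cvt_rb(h_active, v_active, refresh_hz=60):
    H_BLANK = 160          # RB fixes horizontal blanking at 160 pixels
    MIN_VBLANK_US = 460.0  # RB requires >= 460 microseconds of vertical blanking
    MIN_VBLANK_LINES = 15  # front porch + sync + back porch minimum

    h_total = h_active + H_BLANK
    # Estimate the line period, then find how many lines 460us spans.
    frame_us = 1e6 / refresh_hz
    line_us = (frame_us - MIN_VBLANK_US) / v_active
    v_blank = max(int(MIN_VBLANK_US / line_us) + 1, MIN_VBLANK_LINES)
    v_total = v_active + v_blank
    clock_hz = h_total * v_total * refresh_hz
    # CVT rounds the pixel clock down to a 0.25MHz multiple.
    clock_mhz = math.floor(clock_hz / 250_000) * 0.25
    return h_total, v_total, clock_mhz

ht, vt, clk = cvt_rb(1920, 1200)
print(ht, vt, clk)  # 2080 1235 154.0 -- under the 165MHz single-link cap
```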