Yeah, I'm stumped.
I guess you should peruse ToastyX's forums.
Hm, ToastyX said it's not possible, which is what I thought initially. Maybe I am missing something?
https://www.monitortests.com/forum/...Hz-pixel-clock-is-not-showin-up-for-selection
I actually have that same monitor. I can run some tests on it as well if need be, although I'm not sure the thin client attached to it can do much more than 2560x1600. But I do have a LaCie electron blue IV, which I believe has a higher max scan rate (140 kHz) than the FW900. So let me know if you guys can think of any high-resolution tests I should try.
Today I tried it with my LaCie 22 blue IV, which tops out at 140 kHz, and I was able to select WAY higher resolutions, stuff like 2560x1920@70Hz (495 MHz pixel clock) or 2720x1530@86Hz (520 MHz pixel clock), and they were all stable.

But I still eventually hit a point where Windows wasn't showing some resolutions; for instance, I couldn't select 2880x2160@60 on the Sunix, even though I could over VGA.
Wait, you got 2560x1920 on the Lacie?!? How? Was this via VGA or using the Sunix adapter?
Thank you so much! I thought 2560x1600 was going to be the last CRT resolution I was going to use, but it looks like the LaCie has some extra life in it with the right card and adapter.

Both.
When on VGA, I have to use ToastyX's pixel clock patcher on my AMD card, since the resolution is over the 400 MHz pixel clock limit for VGA connections. DP 1.2 doesn't have that limitation, so I can run that resolution with no hacking on the Sunix.
From what jka is saying above, I think the Sunix might be your only option for Nvidia if you want to go for the silly high pixel clocks.
But you can also always create an interlaced version of the resolution if you just want to try it out and stay below 400 MHz.
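For anyone who wants to sanity-check those numbers, here's a rough Python sketch. The 1.30/1.10 blanking overheads are my own ballpark for GTF-style CRT timings, not the exact figures the driver generates:

```python
# Rough pixel clock estimate for a CRT mode, assuming GTF-like blanking
# overheads (~30% horizontal, ~10% vertical). Real timings differ slightly.

H_BLANK = 1.30  # htotal / hactive, ballpark for GTF-style CRT timings
V_BLANK = 1.10  # vtotal / vactive

def pixel_clock_mhz(width, height, refresh_hz, interlaced=False):
    """Approximate pixel clock in MHz; interlacing halves it."""
    clock = width * H_BLANK * height * V_BLANK * refresh_hz / 1e6
    return clock / 2 if interlaced else clock

print(pixel_clock_mhz(2560, 1920, 70))        # ~492 -> close to the reported 495 MHz
print(pixel_clock_mhz(2560, 1920, 70, True))  # ~246 -> fits under the 400 MHz VGA limit
print(pixel_clock_mhz(2880, 2160, 60))        # ~534 -> still below DP 1.2's ~720 MHz
```

The ~720 MHz ceiling comes from DP 1.2 HBR2: 4 lanes x 5.4 Gbps x 0.8 (8b/10b coding) is about 17.28 Gbps, or roughly 720 Mpixels/s at 24-bit colour. So 2880x2160@60 refusing to show up looks like a driver/EDID quirk rather than a bandwidth limit.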
Yes, something like that. I don't remember now, but it was one of these steps: 48, 52 or 55.
I only perform the 9300K adjustment on my monitors because that's the only temperature I use, so I simply skip the others. I personally dislike the yellow whites of 6500K and lower.
But when I measure full white now, I am getting 105 cd/m². If I bump contrast from 90 to 100 I still get 105, which means it has probably reached the default ABL cutoff for the total sum of electrons emitted. Or is that a power or heat limit applied to the cathodes? It takes up to a minute to stabilize, so it is probably something analog. I was also wondering whether it is instead tied to the anode, where you are limited by a maximum voltage in kV and would be producing more electrons than that amount of positive charge can attract. I think I have had ABL fail on me many times, and it resulted in one of two things:
1) monitor shut down
2) the picture became super blurry and stretched (by ~300%) until it faded away. In 2D animation terms it was like applying a ZoomIn and a FadeOut at the same time.
And I think number 2 could look exactly the same as having too many electrons in the tube without them being able to escape through the top anode. Again, I am no electrician, so these are very wild theories on my part.
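For what it's worth, the usual description of ABL is that it caps the average beam current, which tracks total light output (luminance times lit area) rather than peak luminance. A toy calculation along those lines; the ~0.15 m² screen area and the linear phosphor assumption are mine, purely for illustration:

```python
# Toy ABL model: total light output ~ luminance x lit screen area,
# assuming a linear phosphor response. ~0.15 m^2 is a rough figure for
# an FW900-class tube; both assumptions are illustrative only.

SCREEN_AREA_M2 = 0.15

def beam_load(luminance_cd_m2, lit_fraction):
    """Relative total light output (arbitrary units)."""
    return luminance_cd_m2 * SCREEN_AREA_M2 * lit_fraction

print(beam_load(105, 1.0))  # 15.75: full-field white, pinned at the ABL cap
print(beam_load(125, 0.1))  # 1.88: a 10% window is nowhere near the cap
```

That would explain why bumping contrast from 90 to 100 changes nothing on full white (the limiter is already clamping) while smaller bright patches can go well past 105 cd/m².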
My FW900's G2 is at 149, I think, and it's indeed possible that that is too low. I will try to up it a notch and re-do the WPB procedure. I have set the G2 by looking at the "2 bars" pattern by spacediver, where I set G2 so that the right side (7 IRE, I think) is only very barely visible. Perhaps it should be a little more than barely visible, because if I look at it now, after WPB and LUT adjustment, it's quite visible with the room lights on. I don't want to use the full greyscale pattern because its right side is too bright and basically lights up the whole tube. But maybe I should anyway; loading exactly that pattern to find G2 is an actual instruction in WinDAS.
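If anyone wants to recreate that kind of pattern without digging up the original file, here's a rough stand-in using Pillow. The layout and the exact 7 IRE level are my assumptions from memory, not spacediver's actual file:

```python
# Rough stand-in for a G2 "two bars" pattern: black field with a reference
# black bar and a near-black (~7 IRE) bar. Layout and levels are assumed,
# not spacediver's original. Requires Pillow.
from PIL import Image

W, H = 1920, 1200  # FW900 native-ish resolution

def ire_to_level(ire):
    """Map IRE to an 8-bit full-range level (0 IRE = 0, 100 IRE = 255)."""
    return round(ire / 100 * 255)

img = Image.new("L", (W, H), color=0)                    # full black field
bar_w, bar_h = W // 4, H // 2
left = Image.new("L", (bar_w, bar_h), ire_to_level(0))   # reference black bar
right = Image.new("L", (bar_w, bar_h), ire_to_level(7))  # ~7 IRE bar
img.paste(left, (W // 8, H // 4))
img.paste(right, (W - W // 8 - bar_w, H // 4))
img.save("g2_two_bars.png")
```

The idea is then to set G2 so the 7 IRE bar is only just distinguishable from the black field in a dark room.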
UPDATE: Contrary to some things I said above about ABL, I just tested upping my OSD brightness level and I was able to get up to 125 cd/m² on full white. Now, how to get that with 31% brightness, hmm...
Hi guys. I've been out of the loop. I was wondering if anyone can tell me what the best HDMI/DP adapter to buy for the FW900 currently is. There are quite a few products mentioned in this thread. Could anyone help please?
EDIT: is this the best adapter everyone has been talking about, the Sunix DPU3000?
https://www.amazon.com/Sunix-DisplayPort-miniDP-DP-Cable-DPU3000-D3/dp/B00JARYTVK
Yes, it is the best.
The DPU3000-D2 is for video cards with Mini DisplayPort.
The DPU3000-D3 is for video cards with full-size DisplayPort.
It seems that for full stability the USB Type-A to Micro-USB cable is required for additional power, and I don't know if it is included.
On the Sunix website the USB cable is shown as included, but better to wait for confirmation from other users.
It does not include the USB cable. It does seem more stable, since some cards don't provide enough power, but you may be able to get by without it.

The D3 version I received included the USB cable, and the D2 should as well. That's why I said earlier that I wondered whether the ones on sale on Amazon weren't refurbished (which could mean problems and/or missing parts).
Okay, so the USB cable: does that need to be plugged into a phone charger or something, or is it supposed to be plugged into a computer's USB port?
I ordered the D3 last night. Will test it out in a few days.
I bought the Sunix DPU3000-D3 from Amazon about a week ago. The package includes, besides the splitter itself, the miniDP-to-DP cable and the micro-USB cable, both 10 cm long.
I have a GTX980 running on Windows 7 x64 and two 21" CRTs, the Dell P1130 and Sony GDM-F520.
I use CRU, the NVIDIA pixel clock patcher and the GeForce control panel to set up custom resolutions.
If I only connect the included DP cable, the LED on the Sunix splitter barely lights up, but when I also connect the USB cable the LED intensity increases.
THE GOOD NEWS: On both CRTs the image quality and responsiveness are as good as, if not better than, the integrated RAMDAC. I can see no ghosting, text doubling or shadows around icons; the image is just as sharp and colourful, while input lag is virtually non-existent. While playing a fast-paced shooter like Call of Duty 1, I have the feeling it is even faster than the 980's RAMDAC, but that could also be my imagination. Definitely as good!
THE BAD NEWS: No matter what I do, I can't get the splitter to be stable above 1920x1440.
I can do 85Hz on the Dell and 90Hz on the Sony at that resolution and everything is perfect. As soon as I try 2048x1536, the monitors lose sync and go to standby about 50% of the time. It doesn't matter what refresh rate I try, from 60 to 80Hz (85Hz on the F520): the signal is lost either when I restart Windows, when I change resolutions and return to 2048x1536, or simply when doing nothing and waiting long enough. I do find it weird that 85Hz seems more stable than 60Hz. I tried all the timing modes, GTF, DMT, CVT, even manually adjusting polarities... same thing.
What is stranger is that if I try higher resolutions, up to the point where I saturate the horizontal bandwidth of the monitors, they no longer lose sync and shut down as they did at 2048x1536. Instead I see some rippling, wavy artefacts, or the image is displaced in such a way that, for instance, the right side goes to the far left of the monitor while the left part of the image is moved to the middle. It doesn't stay like that for long, and soon I get rapid black-screen flickering on the top part of the image. I didn't bother much with resolutions above 2048x1536, as the text becomes very small, blurry and hard to read.
I tried a different DP cable, which did not light up the LED and required the included USB cable to power the splitter. The results were the same.
The included USB cable doesn't provide any benefit; everything works the same with or without it, but I do recommend keeping it plugged in to be safe, as the LED lights up brighter with it.
I've also tried powering the splitter with a different USB cable, from either a motherboard USB port or a 5.3V 2.0A USB phone charger plugged into the wall. Same behaviour.
Quite disappointed, actually.
Yes, I tried increments of 1Hz from 60 to 85 at resolutions very close to 2048x1536, like changing x to 2040 or 2060 while increasing y to about 1600, etc.
For a 4:3 picture the monitors shut down from about 1530 to 1600 vertical lines, no matter the refresh rate. Past 1600 the splitter is "stable", but I get the artefacts and symptoms mentioned earlier.
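That's a telling detail, because the failing band doesn't line up with a bandwidth limit at all. A quick comparison, using the same kind of rough blanking estimate as earlier in the thread:

```python
# Compare approximate pixel clocks of a mode the splitter handles fine
# (1920x1440@85) with one where it drops sync (2048x1536@60), using a
# rough GTF-like blanking product of ~1.44 (total vs active pixels).

BLANK = 1.44  # combined horizontal x vertical blanking overhead, ballpark

def approx_clock_mhz(w, h, hz):
    return w * h * hz * BLANK / 1e6

print(approx_clock_mhz(1920, 1440, 85))  # ~338 MHz: stable on this splitter
print(approx_clock_mhz(2048, 1536, 60))  # ~272 MHz: drops sync anyway
```

The failing mode needs less bandwidth than a mode that works, so a raw pixel clock ceiling can't explain it; whatever chokes seems tied to the line count itself.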
I don't have a Linux PC available to test. Sorry
How about a live USB stick of Ubuntu? That's what I did, lol.

I never used Ubuntu before, but I think I can give it a shot when I have some free time again. Does CRU work in Ubuntu? I use it to generate the INF monitor driver file with only the resolutions I'm interested in.
I'm a bit dubious about a responsiveness improvement, but it is absolutely possible that the image quality is better with the adapter than with the internal DAC of a 980. Nvidia analog outputs have a reputation for mediocre quality.
Interesting. One particular thing that I did not report is that the Dell P1130 acts differently from the F520 when I install the INF monitor driver created with CRU. While on the integrated DAC there's no difference between the two, with the Sunix splitter the Dell doesn't register the 2048x1536 resolution in either the Windows 7 display resolutions list or the NVIDIA control panel; I have to add it manually in the custom resolutions section of NVCP. With the F520, that resolution is registered immediately, just like when it's connected to the integrated DAC.

Interesting, my Sunix could do 1920x1440@90Hz no problem. Likewise with 2560x1920@70Hz, as well as other crazy resolutions.
One thing the Sunix does is pass the EDID from the monitor to the GPU. On Radeon cards, it seems I get a hard limit on max pixel clock because of that. I'm not sure if EDID data has an influence on Nvidia's behavior, but I'm just throwing that out there.
Also make sure Nvidia is reporting the full 5.4 Gbps link rate for DP 1.2 connections. On my AMD card you can view that somewhere in the driver control panel.
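On the EDID point: a monitor's advertised limits live in the display range limits descriptor of the EDID base block, and that's the obvious field for a driver to derive a pixel clock cap from. A quick parser sketch, assuming you've dumped the EDID to a binary file (CRU can export one, or on Linux it's under /sys/class/drm/*/edid):

```python
# Extract the Display Range Limits descriptor (tag 0xFD) from a 128-byte
# EDID base block. "edid.bin" is a dump you made yourself (see above).

def range_limits(edid: bytes):
    for off in (54, 72, 90, 108):  # the four 18-byte descriptor slots
        d = edid[off:off + 18]
        if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFD:
            return {
                "v_rate_hz": (d[5], d[6]),
                "h_rate_khz": (d[7], d[8]),
                "max_pixel_clock_mhz": d[9] * 10,
            }
    return None

with open("edid.bin", "rb") as f:
    print(range_limits(f.read(128)))
```

If a driver honors that max-pixel-clock byte, you'd get exactly the kind of hard cap described above.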
Honestly, I doubt it's the OS. I initially suspected the DP cable, but after trying a different one that does not power the splitter, I can safely say the instability is the splitter's fault.
Edit: On Windows on AMD, we can't add interlaced resolutions past a certain point, but it works fine in Ubuntu, so it could still be a driver issue.
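For the Ubuntu test: CRU itself is Windows-only, but you don't need it there, since modes can be injected at runtime with xrandr. A sketch that prints the commands for a 2048x1536@60 test mode; the timing numbers are approximately what `cvt 2048 1536 60` produces (regenerate locally to be exact), and "DP-1" is an assumed output name, so check yours with `xrandr -q`:

```python
# Print xrandr commands for a 2048x1536@60 test mode. Timings are roughly
# the CVT values from `cvt 2048 1536 60`; "DP-1" is an assumed output name.

name = "2048x1536_60"
clock = 267.25  # pixel clock, MHz
timings = (2048, 2208, 2424, 2800, 1536, 1539, 1543, 1592)

print(f"xrandr --newmode {name} {clock} {' '.join(map(str, timings))} -hsync +vsync")
print(f"xrandr --addmode DP-1 {name}")
print(f"xrandr --output DP-1 --mode {name}")
```

If I remember the modeline conventions right, xrandr also accepts an `Interlace` flag at the end of `--newmode` (with roughly half the pixel clock for the same frame timings), which would let you try the interlaced variants mentioned above.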
I’m having trouble getting the Sunix to give me proper 1920x1200. The monitor shows up as “Synaptics Inc”; I installed the FW900 driver on top of that. Every time I add a custom resolution using Nvidia’s control panel, it keeps showing me a very distorted version of 1920x1200. Can anyone provide a step-by-step guide on how to properly get the Sunix running on DisplayPort?
Edit: Never mind, I’ve set it to “GTF” mode and it seems to be working okay now. Is this the correct mode for a CRT?
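GTF is generally the right choice for a CRT; DMT and CVT are aimed more at fixed-pixel displays, and CVT's reduced blanking in particular won't work on a CRT because the beam needs the long retrace intervals. For the curious, this is roughly what the driver computes in GTF mode; a simplified sketch of the VESA GTF formula with default parameters, so results can differ slightly from the driver's:

```python
# Simplified VESA GTF timing calculation (default parameters). Margins
# and the secondary curve are omitted, so expect small deviations from
# what the driver or the `gtf` utility produces.

CELL = 8            # character cell granularity, pixels
MIN_PORCH = 1       # lines
V_SYNC_BP_US = 550  # minimum vertical sync + back porch, microseconds
C_PRIME, M_PRIME = 30.0, 300.0  # from GTF defaults C=40, M=600, K=128, J=20

def gtf(w, h, refresh_hz):
    h_period_us = (1e6 / refresh_hz - V_SYNC_BP_US) / (h + MIN_PORCH)
    v_total = h + round(V_SYNC_BP_US / h_period_us) + MIN_PORCH
    duty = C_PRIME - M_PRIME * h_period_us / 1000  # ideal blanking duty cycle, %
    h_blank = round(w * duty / (100 - duty) / (2 * CELL)) * 2 * CELL
    h_total = w + h_blank
    return h_total, v_total, h_total / h_period_us, 1000 / h_period_us

h_total, v_total, clock_mhz, hscan_khz = gtf(1920, 1200, 85)
print(f"total {h_total}x{v_total}, ~{clock_mhz:.0f} MHz, hscan ~{hscan_khz:.1f} kHz")
```

Plugging in 2560x1920@70 gives roughly 497 MHz, which matches the ~495 MHz figure reported earlier in the thread.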
UPDATE:
I borrowed a GTX 1070 and the results are identical to my 980.
I even tried a VGA-to-5BNC cable (same outcome), but I had the added inconvenience of CRU resolutions not working this time, since with this cable the only information transmitted is RGBHV, with no EDID data. I had to manually create ALL resolutions in the NVIDIA custom resolution panel, so I recommend using a standard VGA cable with all pins active.
Again, resolutions at or close to 2048x1536 make the monitors go into standby very often.
I played around with the highest resolution at which my F520 produces a somewhat sharp image at 60Hz, namely 2533x1900.
The monitor doesn't shut down, but I get the rippling/waving and displaced-image symptoms.
Will further lower the resolution in small increments to see if this behavior continues.