24" Widescreen CRT (FW900) From Ebay arrived, comments.

fagoatse

n00b
Joined
Jan 4, 2016
Messages
12
When I owned my FW900, and hell, for my Artisan and F520, I never messed with any of the geometry setup procedures as it's more often than not unnecessary. The only time I did the full geometry setup was when I wrote that guide a while back on doing the geometry for all of you. :D

But anyways - the only thing a monitor typically needs to have done is a touch-up on the dynamic convergence (and even then most monitors don't need it unless you're just really picky) and a recal of the white balance. Geometry, landing, etc., shouldn't really need to be changed unless you're an ultra perfectionist. For 99% of people out there, I'd say it isn't worth it and you're probably going to ruin it more than fix it.

To keep things in perspective, deflection yokes on flat CRTs are literally round pegs steering electron beams onto a square face. It won't be perfect. Also, these monitors have tighter deflection angles to keep the depth of the monitor down. The best geometry I've ever seen is from a CRT projector, and that's because the tubes are deeper than they are wide/long. My 9-inch PVM also has damn near perfect geometry, probably because of its smaller 70 degree deflection angle (the 14 inch monitor has a 90 degree angle and has the same depth as the 9 inch).


My 2070sb has 4 additional geometry variables to play with and I've gotten it nearly perfect on my 2 units. No ovals at all. It's a shame that Sony didn't expose such settings in the OSD. Does WinDAS allow for more adjustments in this area? A friend of mine has bought an FW900 that seems to be fine except for absolutely unfixable geometry.
 

jbltecnicspro

Supreme [H]ardness
Joined
Aug 18, 2006
Messages
7,866
My 2070sb has 4 additional geometry variables to play with and I've gotten it nearly perfect on my 2 units. No ovals at all. It's a shame that Sony didn't expose such settings in the OSD. Does WinDAS allow for more adjustments in this area? A friend of mine has bought an FW900 that seems to be fine except for absolutely unfixable geometry.

WinDAS does open up a couple more geometry controls, but it all depends on the monitor model itself. For the FW900, it allows additional control of C-BOW, C-BOW-BAL, S-BOW and S-BOW-BAL. These adjust the bowing on various portions of the screen. S-Bow does the top and bottom third of the screen, and I think C-Bow does the top and bottom sixth of the screen? Hard to explain in a quick reply like this, I'm sure someone else can chime in. Basically, their adjustment allows the vertical edges of the screen to be very straight when adjusted correctly.
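To picture what these bow controls do, imagine each scanline being nudged horizontally by a small polynomial in its vertical position: a quadratic term bends the vertical edges into a "C", a cubic term into an "S". A toy model (illustrative only, not Sony's actual correction waveforms):

```python
# Toy model of bow corrections: each scanline gets shifted horizontally
# by a small polynomial in its vertical position. A C-shaped bow is
# roughly a quadratic term, an S-shaped bow a cubic one. These formulas
# illustrate the idea, not the FW900's real deflection waveforms.

def h_shift(y_norm, c_bow=0.0, s_bow=0.0):
    """Horizontal shift of the scanline at y_norm in [-1, 1]
    (top = -1, center = 0, bottom = +1)."""
    return c_bow * (y_norm ** 2) + s_bow * (y_norm ** 3)

# With only C-bow, top and bottom edges shift the same way (a "C"):
print(h_shift(-1.0, c_bow=2.0), h_shift(1.0, c_bow=2.0))   # 2.0 2.0
# With only S-bow, they shift in opposite directions (an "S"):
print(h_shift(-1.0, s_bow=2.0), h_shift(1.0, s_bow=2.0))   # -2.0 2.0
```

The BAL variants would then correspond to biasing the curve so the left and right edges aren't corrected by the same amount.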

EDIT: There are also a couple of pots on the deflection yoke that adjust the horizontal trapezoid. Why this was put on a pot instead of being exposed in WinDAS? Who knows...
 

jbltecnicspro

Supreme [H]ardness
Joined
Aug 18, 2006
Messages
7,866
safety regarding the health of the tube. In particular, the health of the cathode and the phosphor.

Yeppers! Overdriving the phosphor is a sure-fire (hur, hur!) way to permanently burn your screen. Likewise, running that cathode too hot shortens the life of your monitor as well... :D
 

Blutrache

n00b
Joined
Jan 28, 2016
Messages
47
My 2070sb has 4 additional geometry variables
May I ask what values you are referring to? The only geometry problem with my 2070sb is with the linearity of the upper part of the display, which is slightly stretched compared to the rest of the screen. I do have another Diamondtron (2060u) that has almost spot-on geometry.
KhjMDK6.jpg
 

spacediver

2[H]4U
Joined
Mar 14, 2013
Messages
2,681
First off, TV line count is related to the horizontal resolution, and your question suggests that you think that TV lines means vertical resolution.

I don't know if 1400 has been properly tested, and I'm not sure what standards are used to measure tv lines.

The US government did a decent evaluation of the FW900 in 2001. The most relevant test is probably the contrast modulation, although they didn't do extensive testing here (see page 76 in this document). They did do an addressability test at 1920 x 1200 and confirmed that the pixels were properly rendered, but I don't know enough about the test pattern and the interpretation of its appearance to say what this means in terms of how clearly each pixel was resolved. They basically fed the monitor a pattern that was a rectangle with two diagonals, and reported that

"All perimeter lines were confirmed to be visible with no irregular jaggies on diagonals."

With CRTs, resolution isn't an all or none affair. As the limits of the aperture grille and electron optics are approached, the ability to display sharp black and white lines is going to be challenged in a gradual fashion. Contrast modulation is a measure of how much contrast there is between adjacent black and white lines. The higher the contrast, the more sharply resolved are the lines. If the resolution is poor, then there will be blur from the white lines bleeding into the lines that are meant to be black, and contrast will suffer.

The most straightforward way to characterize a display's resolving capability is to measure its contrast modulation across a number of different resolutions (i.e. black and white alternating lines at different densities), and you get a function (similar to the modulation transfer function of a camera). Say you choose your contrast cutoff at 0.4 (anything below this you consider not well resolved), then you can find which spatial frequency (or resolution) can be resolved with a contrast modulation of 0.4. The rub here is that you need to define your cutoff in order to intelligently talk about this sort of thing.
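In code, contrast modulation is just Michelson contrast over the peaks and troughs of the measured line pattern. A minimal sketch, using made-up luminance numbers rather than real FW900 measurements:

```python
# Contrast modulation (Michelson contrast) from a measured luminance
# profile across an alternating black/white line pattern. The luminance
# values below are illustrative, not real FW900 measurements.

def contrast_modulation(luminances):
    """CM = (Lmax - Lmin) / (Lmax + Lmin), in [0, 1]."""
    lmax, lmin = max(luminances), min(luminances)
    return (lmax - lmin) / (lmax + lmin)

# Hypothetical peak/trough luminances (cd/m^2) at two line densities:
coarse = [98.0, 4.0, 97.5, 4.2]   # well-resolved lines
fine = [60.0, 35.0, 58.0, 36.0]   # blur bleeding into the dark lines

print(contrast_modulation(coarse))  # high CM: above a 0.4 cutoff
print(contrast_modulation(fine))    # low CM: below a 0.4 cutoff
```

Sweep the line density, plot CM against spatial frequency, and you have the MTF-like function described above.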

There are other techniques for characterizing these functions. If you see how the display responds to a square wave signal (so the signal has a sharp edge), you can get the edge spread function, which is related to the line spread function (which itself is basically the impulse response of the system). If the system is linear, then you can infer its response to any conceivable input (including its contrast modulation to varying spatial frequencies).

For an excellent illustration of how useful impulse responses are, see 3:22 in this video - basically, if you know how a (linear) system responds to an impulse (in the spatial domain, this would be an infinitely small point, or in our case, since we're interested in horizontal resolution, an infinitely thin line), you can infer how the system responds to any input, since any input can be considered a linear combination of impulses. Since we can't really present an infinitely thin line, we can present something that has an edge, and use some math to derive the impulse response.
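The ESF-to-LSF-to-MTF chain can be sketched numerically; the blurred edge below is synthetic (a tanh profile), not real monitor data:

```python
# Sketch: derive a line spread function (LSF) from an edge spread function
# (ESF), then get the MTF as the magnitude of the LSF's Fourier transform.
import numpy as np

x = np.linspace(-1.0, 1.0, 512)        # position across the edge (arbitrary units)
esf = 0.5 * (1 + np.tanh(x / 0.05))    # synthetic blurred black-to-white edge

lsf = np.gradient(esf, x)              # LSF = spatial derivative of the ESF
lsf /= lsf.sum()                       # normalize so that MTF(0) = 1

mtf = np.abs(np.fft.rfft(lsf))         # contrast transfer vs. spatial frequency
freqs = np.fft.rfftfreq(len(lsf), d=x[1] - x[0])  # the frequency axis

# The MTF starts at 1 (coarse detail passes untouched) and falls toward 0
# as spatial frequency increases (fine lines lose contrast):
print(mtf[0], mtf[-1])
```

This only predicts the contrast modulation at every frequency if the system is linear, which, as noted below, a CRT arguably isn't.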

Thing is, I don't think a CRT is linear, particularly because of things like veiling glare. So I'd trust measurements directly at different resolutions.

I have equipment to make these measurements, but it's a tedious and time consuming project.

Also see my post here.

Finally, the concept of the raster addressability ratio is relevant (see here).

TLDR, it's unclear, and probably best to measure yourself. You can feed some test patterns in, and judge with a loupe if the lines are well resolved.

If you're interested in vertical resolution testing, the same principles apply, except that the resolution will not be limited by grille pitch.
 

Mumble

Weaksauce
Joined
Dec 11, 2015
Messages
106
Yes sorry, we know it will fully resolve 1920x1200, but will it fully resolve 2304x1440?
 

fagoatse

n00b
Joined
Jan 4, 2016
Messages
12
Yes sorry, we know it will fully resolve 1920x1200, but will it fully resolve 2304x1440?
No. My eyes are not easily deceived and it really is a step down from 1200p. Perhaps it'd be slightly better if I lowered the refresh rate to 72Hz or so, but I've been using DSR instead (4K -> 1200p + the softness of a CRT looks extremely good).
 

15sunrises

n00b
Joined
Dec 14, 2017
Messages
6
Hi all,

Unfortunately it looks like there's no more use for my prized FW900 in my studio. :( It has no color issues or scratches on the anti-glare coating, some marks on the bezel, but nothing that impacts operation, still looks great considering the age. Can anybody give me an idea of what kind of price I could be looking to get for it?
 
Joined
Feb 26, 2018
Messages
1
I'm trying to do dynamic convergence calibration on a Sun GDM 5410 and a Dell D1626HT - is there a reason the 'Dcnv' option is always greyed out in the WinDAS menu?

Sorry for only being tangentially related to FW900 talk, but this seems to be the only place on the internet for answers. Most of the links in the guides for dynamic convergence in WinDAS are long since dead.
 

Strat_84

Limp Gawd
Joined
Jul 16, 2016
Messages
481
The answer is easy: the GDM-5410 lacks the electronics related to dynamic convergence (the S board). So that setting is disabled in Windas. ;)
 

douchyhat

n00b
Joined
Jan 8, 2017
Messages
1
Other than obvious size/weight differences, is there any reason to get a Sony XBR 910/960/970 over an FW900 for PC gaming? Could get an FW900 for somewhere over €50 if I'm not too late. XBR I would have to look for (or get from eBay... Overseas shipping *shudder*)
 

Strat_84

Limp Gawd
Joined
Jul 16, 2016
Messages
481
If the Sony XBR you're talking about is a TV, it must be limited to much lower resolutions/refresh rates than the FW900. Interesting for watching movies, but certainly not suited for computer use.


By the way, is there anything new about the Delock displayport -> VGA adapter ?
 

Enhanced Interrogator

[H]ard|Gawd
Joined
Mar 23, 2013
Messages
1,288
Other than obvious size/weight differences, is there any reason to get a Sony XBR 910/960/970 over an FW900 for PC gaming?

XBR, you're limited to 1080i/540p@60Hz, 900i@70Hz, etc.

I am curious if it's possible to make a frankenstein XBR with some of the electronics from a CRT PC monitor to make it run out-of-spec resolutions

By the way, is there anything new about the Delock displayport -> VGA adapter ?

Is there any reason to care about that now that the Sunix adapter seems to have a higher pixel clock? Price maybe?
 
Joined
Oct 16, 2016
Messages
639
Other than obvious size/weight differences, is there any reason to get a Sony XBR 910/960/970 over an FW900 for PC gaming? Could get an FW900 for somewhere over €50 if I'm not too late. XBR I would have to look for (or get from eBay... Overseas shipping *shudder*)
No. You'd cap out at 1080i/540p at 60 Hz, and probably not even fully resolved at that.

GDMs in general are like what PVM/BVMs are to their consumer TVs, but an order of magnitude more so because of their sharpness, general lack of bloom with changing brightness in parts of the scene (my 24" WEGA SDTV has a big problem with this in the horizontal axis) as long as the circuitry's in proper working order, and of course, the possible resolutions/refresh rates. Only downside's that a typical GDM can't sync below 31 kHz horizontal, so 15/24 kHz consoles, arcade boards and retro computers are right out without a scandoubler (XRGB, OSSC, that sorta thing).

I'd just get the FW900 for your €50 if you don't mind sitting a bit closer. Just make sure it doesn't do what mine did and start flashing colors at full brightness from a cold start, which goes away when warmed up (that's how I found it at the craigslist seller's house), because that means something in the HV section's about to give up the ghost if not serviced. Still need to fix mine before a certain parent gets overzealous with junking my old computer tech to clear up space in the house, too...
 

Strat_84

Limp Gawd
Joined
Jul 16, 2016
Messages
481
Is there any reason to care about that now that the Sunix adapter seems to have a higher pixel clock? Price maybe?
Of course price! The Sunix (or Delock equivalent, still not tested) is at least 5 times more expensive than the Delock 62967, while the extra pixel clock is totally unnecessary for regular use on a 21" (usually 1600x1200 up to 100 Hz).

On top of that, there is still no feedback about the picture quality. There is no flaw obvious enough as far as I saw, but there may be a sharpness decrease compared to an AMD native analog output (which could be either because the ones on sale on Amazon were refurbished junk, or because of the device itself). And I still haven't checked mine because the mess with AMD drivers forces me either to move half of the room back and forth, or to risk ruining a Windows install without an improvement ... :hungover:
 
Joined
Dec 13, 2006
Messages
612
Ok my XP laptop died on me and now I need a USB to TTL cable and am also wondering if WinDAS will work on Windows 10. Any input/recommendations?
 

Derupter

Limp Gawd
Joined
Jun 25, 2016
Messages
227
Of course price! The Sunix (or Delock equivalent, still not tested) is at least 5 times more expensive than the Delock 62967, while the extra pixel clock is totally unnecessary for regular use on a 21" (usually 1600x1200 up to 100 Hz).

On top of that, there is still no feedback about the picture quality. There is no flaw obvious enough as far as I saw, but there may be a sharpness decrease compared to an AMD native analog output (which could be either because the ones on sale on Amazon were refurbished junk, or because of the device itself). And I still haven't checked mine because the mess with AMD drivers forces me either to move half of the room back and forth, or to risk ruining a Windows install without an improvement ... :hungover:

The latest news from Delock is from the end of January; they said they have no solutions at the moment, so I asked them to try to make an adapter like this http://www.delock.com/produkte/G_65653/merkmale.html but with the ANX9847 chip.
That is the best solution to solve the problems with cable, connector and solder quality.
The last reply was that they will come back to me if they decide to make an adapter like that, but I don't know if or when they will, and I have not insisted too much after the discovery of the Synaptics chip.
About the picture quality, well, the ANX9847 is officially a 270 MHz 24-bit DAC. I asked Delock and Synaptics for information about the VMM2322 chipset, but for now no reply.
If the VMM2322 specifications are lower than those of the ANX9847, there is the possibility that the DAC precision is lower, but it's all theory.
I asked for information even about a new chip from Realtek, the RTD2169U, used in some new USB-C to VGA adapters.
The great advantage of the VMM2322 is that it has four HBR2 input lanes, so its digital input limit is 720 MHz 24-bit, and if the DAC holds up, well, you have seen the results.
If I'm not mistaken danny_discus has both the Delock 62967 and Sunix adapters, so he can say if there are differences in sharpness, quality, etc.
 
Last edited:

jka

Weaksauce
Joined
Feb 16, 2017
Messages
108
Ok my XP laptop died on me and now I need a USB to TTL cable and am also wondering if WinDAS will work on Windows 10. Any input/recommendations?

1.

Works fine on Windows 8 64-bit for me. From a faint memory:

At some point you will need to manually register MSFLXGRD.OCX, which is a 32-bit ActiveX control and is a little tricky to install.

Copy the MSFLXGRD.OCX file into the C:\Windows\SysWOW64 folder, open cmd and type these 2 commands:

cd C:\Windows\SysWOW64
regsvr32 MSFLXGRD.OCX

There are also multiple versions (or rather "builds", because it's not an app you can install) of WinDAS floating around. I don't remember where I got mine (probably from somewhere in this thread), but only that particular one worked for me. I don't have a webserver to upload it, so shoot me a PM with an email or give me a link to some good upload service and I can share it.

2.

Regarding the cable, I bought a Silicon Labs CP2102 (without cables, for 4 dollars). I had zero issues with it, no data loss or anything. Here is a random shop (not where I bought it, I bought locally in person):

http://www.surplusgizmos.com/Silicon-Labs-CP2102-USB-to-Serial-TTL-Module_p_2476.html

You will need 4 wires, and I would HIGHLY recommend that at least one end of the cable has one big connector holding all 4 links together, otherwise it's a pain to connect it through the FW900's tiny backdoor. I had to make my own cable anyway, but if you have other options in your shops, I would really recommend not buying 4 separate cables, one per wire.
 

Strat_84

Limp Gawd
Joined
Jul 16, 2016
Messages
481
The latest news from Delock is from the end of January; they said they have no solutions at the moment, so I asked them to try to make an adapter like this http://www.delock.com/produkte/G_65653/merkmale.html but with the ANX9847 chip.
That is the best solution to solve the problems with cable, connector and solder quality.
The last reply was that they will come back to me if they decide to make an adapter like that, but I don't know if or when they will, and I have not insisted too much after the discovery of the Synaptics chip.
About the picture quality, well, the ANX9847 is officially a 270 MHz 24-bit DAC. I asked Delock and Synaptics for information about the VMM2322 chipset, but for now no reply.
If the VMM2322 specifications are lower than those of the ANX9847, there is the possibility that the DAC precision is lower, but it's all theory.
I asked for information even about a new chip from Realtek, the RTD2169U, used in some new USB-C to VGA adapters.
The great advantage of the VMM2322 is that it has four HBR2 input lanes, so its digital input limit is 720 MHz 24-bit, and if the DAC holds up, well, you have seen the results.
If I'm not mistaken danny_discus has both the Delock 62967 and Sunix adapters, so he can say if there are differences in sharpness, quality, etc.
Thanks for the update!

Well, "no solution at the moment", when the obvious solution is to stop using their crap flashy cables/plugs, means they probably dropped the ball ... :/
 

jka

Weaksauce
Joined
Feb 16, 2017
Messages
108
Quick question: I am planning on updating my GPU, ideally to a watercooled Titan X (Maxwell), and then getting more for SLI in the coming years. But SLI is something that I would rather not do since it has its own set of problems. So do you think that there is a realistic chance for an adapter in the future that can drive newer GPUs at full 400 MHz without(!) any compromises in quality/speed/anything? Or should I just bite the bullet and start building my way towards a high-end 2014 retro rig? I don't need the new GPU any time soon as I don't particularly enjoy the newer console ports they dress as PC games, so I wouldn't mind waiting a reasonable amount of time.
 

DooLocsta

[H]ard|Gawd
Joined
Jan 26, 2005
Messages
1,737
Quick question: I am planning on updating my GPU, ideally to a watercooled Titan X (Maxwell), and then getting more for SLI in the coming years. But SLI is something that I would rather not do since it has its own set of problems. So do you think that there is a realistic chance for an adapter in the future that can drive newer GPUs at full 400 MHz without(!) any compromises in quality/speed/anything? Or should I just bite the bullet and start building my way towards a high-end 2014 retro rig? I don't need the new GPU any time soon as I don't particularly enjoy the newer console ports they dress as PC games, so I wouldn't mind waiting a reasonable amount of time.

I am kind of stuck in the same boat, well not the Titan X but I just can't let my 980Ti go until I can get a reliable adapter.
 

Enhanced Interrogator

[H]ard|Gawd
Joined
Mar 23, 2013
Messages
1,288
So do you think that there is a realistic chance for an adapter in the future that can drive newer GPUs at full 400 MHz without(!) any compromises in quality/speed/anything?

I am kind of stuck in the same boat, well not the Titan X but I just can't let my 980Ti go until I can get a reliable adapter.

The Sunix adapter we've been talking about for the past 10 pages will hit 550 MHz+. We're still not sure about the exact limit, but it blows everything else out of the water as far as pixel clock is concerned.

It uses a Synaptics splitter chip with an integrated DAC. So any adapter with that chip should work.

I haven't been using it full time because I'm still using a Radeon 380x, which has VGA out, but I used it for a couple of days and didn't have any major issues. Color seemed identical as far as I could tell.
 

jka

Weaksauce
Joined
Feb 16, 2017
Messages
108
The Sunix adapter we've been talking about for the past 10 pages will hit 550 MHz+. We're still not sure about the exact limit, but it blows everything else out of the water as far as pixel clock is concerned.

It uses a Synaptics splitter chip with an integrated DAC. So any adapter with that chip should work.

I haven't been using it full time because I'm still using a Radeon 380x, which has VGA out, but I used it for a couple of days and didn't have any major issues. Color seemed identical as far as I could tell.

Thank you man, I will make a note about Sunix and search for it when I have more time to read up on it (stuck with doing WPB right now, it's going sloooow). Speaking of RAMDACs, there is apparently an option to overclock them with a CRU update (didn't believe it at first), but I tried it some time ago and it didn't do anything (as I kinda expected). Is there a trick/step that I am missing here? I have a 400 MHz RAMDAC on the card and would really benefit from more since I keep hitting that limit often at lower refresh rates (3200x1536@58Hz, where it should go up to 75Hz without the pixel clock limit). Not sure how much the FW900 can handle though, won't that be the real bottleneck?
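As a sanity check on that 400 MHz ceiling: the pixel clock is just total pixels per frame (active plus blanking) times the refresh rate. The blanking fractions below (~25% horizontal, ~4% vertical) are rough CRT-style assumptions, not the exact timings CRU generates:

```python
# Rough pixel clock estimate: htotal * vtotal * refresh.
# The blanking fractions are approximate CRT-style guesses,
# not the exact CVT timings a tool like CRU would produce.

def pixel_clock_mhz(h_active, v_active, refresh_hz,
                    h_blank_frac=0.25, v_blank_frac=0.04):
    h_total = h_active * (1 + h_blank_frac)
    v_total = v_active * (1 + v_blank_frac)
    return h_total * v_total * refresh_hz / 1e6

print(pixel_clock_mhz(3200, 1536, 58))  # ~370 MHz: already near a 400 MHz DAC
print(pixel_clock_mhz(3200, 1536, 75))  # ~479 MHz: needs more than 400 MHz
```

So 3200x1536 sits just under the limit at 58 Hz and well past it at 75 Hz, which matches the behavior described above.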
 

Enhanced Interrogator

[H]ard|Gawd
Joined
Mar 23, 2013
Messages
1,288
ToastyX makes a separate utility called the "pixel clock patcher". He has separate ones for AMD and Nvidia. After you run that, you should be able to see the 400+ MHz pixel clock modes you've created.
 

jka

Weaksauce
Joined
Feb 16, 2017
Messages
108
ToastyX makes a separate utility called the "pixel clock patcher". He has separate ones for AMD and Nvidia. After you run that, you should be able to see the 400+ MHz pixel clock modes you've created.

Yes, that's the one I tried. I didn't see the resolutions in LIST ALL MODES later. Where do you apply them, Nvidia Control Panel? How much beyond 400 can I push it with that tool?
 

Enhanced Interrogator

[H]ard|Gawd
Joined
Mar 23, 2013
Messages
1,288
Mine just showed up in Windows display settings. I created them in CRU. I never tested the upper limit, but it was definitely close to 600 if not beyond.

If it's still not showing up in Windows settings, maybe try this QuickRes utility, at the bottom of this page:

https://www.ultimarc.com/download_old.html

I switch resolutions so much for games and video that I just have this program set to launch at startup. It stays in the system tray. It's also the only program I've been able to use that will show sub-480p resolutions, for when I'm hooking up my PC to my standard definition CRTs (for emulators and other low-res games like Sonic Mania).
 

jka

Weaksauce
Joined
Feb 16, 2017
Messages
108
Also, after like 15 passes of WPB, I am getting circa 100 cd/m2 on full white. Is that in line with what you guys have?

I have been writing down all the values for each step on paper and have gotten to a point where I can no longer change the values compared to a previous pass because they are now "perfect".

But.... there was a step that asked to verify a luminance of 115 but I only got up to 98 in that particular test. Any ideas?

During the step that asks to wait for the luminance to stabilize it showed 115 though. Also not sure what "stabilize" means, because the Y number always changes in a range of about 0.1. Just to be sure it stabilized I even let it sit for an hour at that step (probably stupid but I wanted to rule it out) and it was still changing within that 0.1 range.

BTW I skipped all the 6500K, 5000K and sRGB steps because I don't think they are needed if I won't use them.
 

jka

Weaksauce
Joined
Feb 16, 2017
Messages
108
Mine just showed up in Windows display settings. I created them in CRU. I never tested the upper limit, but it was definitely close to 600 if not beyond.

If it's still not showing up in Windows settings, maybe try this QuickRes utility, at the bottom of this page:

https://www.ultimarc.com/download_old.html

I switch resolutions so much for games and video that I just have this program set to launch at startup. It stays in the system tray. It's also the only program I've been able to use that will show sub-480p resolutions, for when I'm hooking up my PC to my standard definiton CRT's (for emulators and other low-res games like Sonic Mania).

I tried it again just now and I can't see the resolutions. I only tried to push it to 420 MHz. The patcher says everything is patched. The QuickRes utility only opens the native Windows "Screen Resolution" settings area and it is still not there. Isn't there anything else that I might be missing? What OS/GPU do you have?
 

jka

Weaksauce
Joined
Feb 16, 2017
Messages
108
After a restart it does show that long list, but none of the entries is a detailed resolution made via CRU. It doesn't show any of my detailed resolutions, not even those that are well under 400 MHz. It just shows all the "standard" resolutions.

Now if I open restart64.exe from CRU, the QuickRes utility stops showing any resolutions, not even the standard ones. It just shows "Display Properties" there.
 

jka

Weaksauce
Joined
Feb 16, 2017
Messages
108
uTGv3Ea.png


<= This is how it looks when I open it. It already seems to be at "Display".
 

Aktan

Weaksauce
Joined
Dec 31, 2017
Messages
122
Of course price! The Sunix (or Delock equivalent, still not tested) is at least 5 times more expensive than the Delock 62967, while the extra pixel clock is totally unnecessary for regular use on a 21" (usually 1600x1200 up to 100 Hz).

On top of that, there is still no feedback about the picture quality. There is no flaw obvious enough as far as I saw, but there may be a sharpness decrease compared to an AMD native analog output (which could be either because the ones on sale on Amazon were refurbished junk, or because of the device itself). And I still haven't checked mine because the mess with AMD drivers forces me either to move half of the room back and forth, or to risk ruining a Windows install without an improvement ... :hungover:

Whoa, I'm a bit late in the thread now, but after a few weeks (a month?) of using 2 SUNIX adapters on both my CRTs, I'm very happy to report I have zero problems after I set it to a refresh rate it likes (or maybe the GPU likes). I have been playing games and using the computer pretty much every day and it looks and feels fine. No difference from the native AMD RAMDAC that I can see.
 

Enhanced Interrogator

[H]ard|Gawd
Joined
Mar 23, 2013
Messages
1,288
the extra pixel clock is totally unnecessary for regular use on a 21" (usually 1600x1200 up to 100 Hz).

I can give you a quick real-world example where the extra pixel clock comes in handy: 4K@96i (580 MHz pixel clock).

That is a really good setting to view 4K Blu-rays on an FW900 or other monitor. You don't have to downscale, it's in perfect cadence with 24fps movies, and the flicker and combing are minimized by the high refresh rate.

Yeah, I know the FW900 can't fully resolve 4K horizontally, but it can get close on the vertical axis, which definitely helps, and should look better than scaling to a lower resolution.

Whoa, I'm a bit late in the thread now, but after a few weeks (a month?) of using 2 SUNIX adapters on both my CRTs, I'm very happy to report I have zero problems after I set it to a refresh rate it likes (or maybe the GPU likes). I have been playing games and using the computer pretty much every day and it looks and feels fine. No difference from the native AMD RAMDAC that I can see.

Good to hear. We still need to hear from an Nvidia user though, to see if there are any odd bugs/incompatibilities.
 

jbltecnicspro

Supreme [H]ardness
Joined
Aug 18, 2006
Messages
7,866
Also, after like 15 passes of WPB, I am getting circa 100 cd/m2 on full white. Is that in line with what you guys have?

I have been writing down all the values for each step on paper and have gotten to a point where I can no longer change the values compared to a previous pass because they are now "perfect".

But.... there was a step that asked to verify a luminance of 115 but I only got up to 98 in that particular test. Any ideas?

During the step that asks to wait for the luminance to stabilize it showed 115 though. Also not sure what "stabilize" means, because the Y number always changes in a range of about 0.1. Just to be sure it stabilized I even let it sit for an hour at that step (probably stupid but I wanted to rule it out) and it was still changing within that 0.1 range.

BTW I skipped all the 6500K, 5000K and sRGB steps because I don't think they are needed if I won't use them.


100 cd/m2 on maxed-out contrast? That sounds about right to me.
 