I'm not sure what you're referring to but you can't even enable BFI without disabling VRR first.
I mentioned at the top of the previous page that John from Digital Foundry found that BFI + VRR worked on his CX's latest firmware at least from the Xbox Series X, but only when Dolby Vision was enabled:
https://hardforum.com/threads/lg-48cx.1991077/page-197#post-1045180420
There was no clarification as to whether this could also work on a PC, whether with Dolby Vision-enabled software or via some CRU tweaking or the like.
I let this thread stew a little bit, but on my return I'm kind of disappointed that the thread as a whole seems to have doubled down on HDR and isn't even willing to experiment to see if this BFI + VRR thing even works, let alone whether it's useful.
I find it very interesting that, with high-refresh monitors, BFI + VRR is seen as a sort of "holy grail", and I figured there'd be more of those types of people here considering OLED's crazy-fast response time. But it seems I was mistaken, and the crowd here is a third niche between the cinema buff's "BFI good, HDR good, high framerate bad" and the high-refresh gamer's "BFI good, high framerate good, HDR unnecessary".
This is making me think that most OLED users actually fall into the more 60fps-to-100fps gaming category rather than the 100fps+ category. As I'm typing this, I'm coming to the conclusion that this audience is exactly as I said: they value high framerate up to a point, but put more focus on static image quality such as resolution and HDR, and will very much sacrifice top-end framerates to do so (one only has to note the kinds of GPUs most people here are using... and I very much don't have interest in such high-end GPUs). And it's that 60fps-to-100fps range where any sort of BFI + VRR combination is simply not going to be as useful.
Basically, I made the mistake of figuring that the high-frame-rate gamer would be the main PC OLED customer - it turns out it's the high-graphics-settings gamer instead. And while I'm not really a high-frame-rate gamer, I'm even less of a high-graphics-settings gamer. Really I'm just in the "I want to be able to use a screen in the dark without it either blinding me or looking like crap, without sacrificing the motion resolution I've become used to on a CRT" camp, and I've been there for over a decade now.
I want to make one thing very clear though - I'm not married to the idea of BFI and will happily ditch it if the screen has a high enough refresh rate to compensate, but "only" 120Hz ain't gonna cut it since that's already what I've been doing on my CRT for ages.
Also while typing this, I'm reminded that the flicker on a CRT is less noticeable than BFI on at least an LED-backlit LCD of a comparable flicker rate, presumably because a CRT's light output is more akin to a sine wave while LEDs output much more of a square wave. And IIRC the flicker on CRTs is reduced as you decrease brightness and/or contrast (as I do in dark environments), since that simply reduces the amplitude of the waveform; on LED-backlit LCDs, by contrast, the flicker gets worse as you decrease brightness, because backlight strobing reduces brightness by increasing the "off" time.
For reference, I have no idea how a flicker-free LED backlight would behave with backlight strobing, but with OLED I would imagine it would behave partly like a CRT, in that lowering brightness simply reduces the amplitude, and partly like an LED-backlit LCD, in that the strobing itself is a hard square-wave on/off rather than a smoother CRT-like sine wave.
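To put rough numbers on that, here's a crude sketch in Python (my own toy model, nothing measured from any display) of the two dimming strategies. Average light output is roughly amplitude times duty cycle, so you can reach the same brightness either by lowering the amplitude (CRT-style) or by shortening the "on" time (strobed-backlight-style), and the second way is what makes the flicker harsher:

# Toy model: average brightness over one flicker cycle approximated as
# amplitude * duty_cycle (crude linear model; real perception is
# nonlinear, so treat these numbers as illustrative only).

def average_light(amplitude: float, duty_cycle: float) -> float:
    return amplitude * duty_cycle

# Two different ways to dim to the same average brightness:
crt_like    = average_light(amplitude=0.5, duty_cycle=0.3)   # lower the pulse height
strobe_like = average_light(amplitude=1.0, duty_cycle=0.15)  # shorten the "on" time

print(crt_like, strobe_like)  # 0.15 0.15 -> same average brightness,
# but the strobe-like case is fully dark 85% of the time, hence the
# harsher perceived flicker.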
This is making me think that most OLED users actually fall into the more 60fps-to-100fps gaming category rather than the 100fps+ category
RTings measured lag to be 11ms at 4K with VRR. I have an Acer X27, and RTings measured that monitor to have 14ms of lag at 4K with VRR. Using both my CX and X27 side by side, the CX feels vastly more responsive than my X27 - more than a 3ms difference would account for. OLED's instant response time probably has something to do with it, but yeah, the CX feels plenty responsive, and only the 0.1% of super-hardcore competitive gamers will find it "not responsive enough".
I tried just BFI by itself when I first got the TV. It is indeed smooth, but the brightness drop is too severe, and most of us here don't like maxing out the brightness (OLED Light) anyway.
On top of that, if I remember correctly, it had a dreadful PWM-like flicker.
I am a high-refresh-rate evangelist, but the way I see it, I am using a display with the best image quality that I have ever seen. If I wanted to mar that image quality, I would pick up a 240Hz or better display for competitive games that require that kind of clarity.
One thing to keep in mind is that due to OLED response times, it seems faster than the 120Hz refresh rate would imply.
Well I guess my hunch from back then about my CX feeling far more responsive than my X27 wasn't just placebo.
I guess it also goes to show that BFI isn't needed as much on an OLED as it is on an LCD, since the image is already so clean; the only limiting factor is the 120Hz refresh rate itself.
Because of the TV's fast response time, low frame rate content can appear to stutter since each frame is held on for longer. If it bothers you, motion interpolation can help.
The LG C1 can completely remove 24p judder though, which is a different side effect from stutter. It removes it entirely when you enable Cinema Screen mode, which multiplies the 24fps content by 5 to match the 120Hz panel refresh rate exactly.
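For anyone wondering why 120Hz specifically kills 24p judder, the arithmetic is just integer multiples; a throwaway sketch (illustrative only, obviously not the TV's firmware):

# A refresh rate shows every source frame for the same duration only if
# it's an integer multiple of the source frame rate; otherwise frames get
# uneven repeat counts, which reads as judder.

def judder_check(source_fps: int, refresh_hz: int) -> str:
    if refresh_hz % source_fps == 0:
        n = refresh_hz // source_fps
        return f"{source_fps}fps on {refresh_hz}Hz: each frame shown {n}x, judder-free"
    return f"{source_fps}fps on {refresh_hz}Hz: uneven repeats (e.g. 3:2 pulldown), judder"

print(judder_check(24, 120))  # each frame shown 5x, judder-free
print(judder_check(24, 60))   # uneven repeats, judder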
Even instant pixel response (0ms) can still show lots of motion blur due to sample-and-hold.
Your eyes are always moving when you track moving objects on a screen. Sample-and-hold means each frame is displayed statically until the next refresh. Your eyes are in a different position at the beginning of a refresh than at the end of it, which causes the frame to be blurred across your retinas.
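A quick back-of-the-envelope sketch of how big that smear is, with my own illustrative numbers, assuming blur is roughly on-screen speed times the time each frame is held:

# Sample-and-hold motion blur: while the eye tracks a moving object, each
# statically-held frame smears across the retina by roughly
# speed * persistence.

def blur_width_px(speed_px_per_s: float, refresh_hz: float,
                  duty_cycle: float = 1.0) -> float:
    """Approximate perceived blur in pixels.

    duty_cycle = fraction of the refresh the frame is actually lit
    (1.0 = pure sample-and-hold, lower = BFI/strobing).
    """
    persistence_s = duty_cycle / refresh_hz
    return speed_px_per_s * persistence_s

speed = 1920  # a pan crossing a 4K-wide screen in about 2 seconds
print(blur_width_px(speed, 120))                  # ~16 px of smear at 120Hz
print(blur_width_px(speed, 120, duty_cycle=0.5))  # ~8 px with 50% BFI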
<...>At 120Hz the flicker shouldn't be noticeable; I even used BFI on High, which is supposed to be the worst for flickering, and couldn't really notice it at 120Hz.
BFI + VRR would've sweetened the deal even more.
I am a high-refresh-rate evangelist, but the way I see it, I am using a display with the best image quality that I have ever seen. If I wanted to mar that image quality, I would pick up a 240Hz or better display for competitive games that require that kind of clarity.
One thing to keep in mind is that due to OLED response times, it seems faster than the 120Hz refresh rate would imply.
Command Execution Time = Current Server Time – (Packet Latency + Client View Interpolation)
Put into English, this means that once you pull the trigger and this information packet gets sent to the server, the server then goes back from the current server time (the time the trigger packet was received) by your ping plus your interpolation time. Only then is it determined whether the client hit the shot or not.
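If it helps, here's a toy Python sketch of that rewind; all of the names and numbers are hypothetical stand-ins, not actual Source engine code:

# Toy illustration of the quoted formula (times in whole milliseconds to
# keep the arithmetic exact):
#   command_execution_time = current_server_time
#                            - (packet_latency + client_view_interpolation)

def command_execution_time(server_time_ms: int, latency_ms: int,
                           view_interp_ms: int) -> int:
    return server_time_ms - (latency_ms + view_interp_ms)

# Hypothetical server-side history of a target's position, one snapshot
# per ~15.6ms tick (64 tick).
position_history = {938: 10.0, 953: 10.5, 969: 11.0, 984: 11.5, 1000: 12.0}

t = command_execution_time(server_time_ms=1000, latency_ms=30, view_interp_ms=20)
snapshot = max(ts for ts in position_history if ts <= t)
print(t, position_history[snapshot])
# 950 10.0 -> the shot is judged against where the target was ~50ms ago,
# not where it is "now" on the server.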
Imagine you want to reach the boxes on top mid on Mirage while an AWP is perched in window. You jump out of the cover of the wall, fly, and land safe behind the boxes. The moment you land, the AWP shot goes off and you somehow die, ending up right on the edge of the boxes, a good half meter away from where you stood on your screen. In the German scene you would have just been "interped", even though "being lag compensated" might be the more accurate term (I acknowledge that it's way more clunky and less easy to complain about).
As the peeking CT moves into the gap of the double doors, his lag-compensated hitbox and model are still behind the door, giving the Terrorist no real chance to respond. However, it is imperative in this scenario for the peeking player to actually hit (and in most cases kill) his opponent in the time it takes the server to compute all executed commands and the appropriate lag compensation. Of course, the showcased example was taken with a ping of 150ms, which is unrealistically high for most people, artificially lengthening that window.
Should any of you reading this have the good fortune to play on LAN one day, keep in mind that peeker's advantage is solely dependent on lag compensation, a big part of which is made up by a player's ping. With a typical LAN connection ping of 3-7ms, peeker's advantage is practically non-existent. Together with many other factors, this is one of the reasons why CS:GO has to be played differently in certain aspects on LAN than on the internet.
Note: In an example where two players shoot each other and both shots are hits, the game may behave differently. In some games, e.g. CSGO, if the first shot arriving at the server kills the target, any later-arriving shots from the now-dead player are ignored. In this case there cannot be any "mutual kills", where both players shoot within 1 tick and both die. In Overwatch, mutual kills are possible. There is a tradeoff here.
- If you use the CSGO model, people with better latency have a significant advantage, and it may seem like "Oh I shot that guy before I died, but he didn't die!" in some cases. You may even hear your gun go "bang" before you die, and still not do any damage.
- If you use the current Overwatch model, tiny differences in reaction time matter less. For example, if the server tick rate is 64 and Player A shoots 15ms faster than Player B, but they both do so within the same 15.6ms tick, they will both die.
- If lag compensation is overtuned, it will result in "I shot behind the target and still hit him"
- If it is undertuned, it results in "I need to lead the target to hit them".
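To make the two models concrete, here's a small hypothetical sketch of resolving two lethal shots that land within the same tick (purely illustrative, not actual CSGO or Overwatch code):

# Two ways a server can resolve lethal shots arriving within one tick.

shots = [  # (arrival offset within the tick in ms, shooter, target)
    (3, "A", "B"),
    (11, "B", "A"),
]

def first_come_model(shots):
    """CSGO-style: process in arrival order; the dead can't shoot."""
    dead = set()
    for _, shooter, target in sorted(shots):
        if shooter not in dead:
            dead.add(target)
    return dead

def batched_model(shots):
    """Overwatch-style: everything within the tick lands simultaneously."""
    return {target for _, shooter, target in shots}

print(first_come_model(shots))  # {'B'} - A's earlier packet wins outright
print(batched_model(shots))     # {'A', 'B'} - a mutual kill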
I would love simultaneous BFI + VRR, but I don't have time to experiment to try to get it working. I'm hoping someone else does though.
When streaming music to the C1 with Apple AirPlay or whatever it's called, is there a way to stop the TV from going into the firework screensaver? I ordered a service remote...
<...>
I feel like it fatigues your (especially my) eyes even if you supposedly aren't consciously "seeing" it. People claim to "not really notice it", etc., as if it is borderline. While you might try to be oblivious to it, I don't think your eyes and brain are.
Like you, I am probably going to prioritize HDR now, because after doing those AutoHDR tweaks on my CX and playing around with it for over 2 weeks, games just look absolutely FANTASTIC with AutoHDR. So given the choice between BFI and AutoHDR, it's going to be AutoHDR. I've been having a blast playing Fallout 4 with some fan-made DLC mods and AutoHDR. Sure, maybe the DTM ON + Low Black Level tweak isn't the most "accurate" way of doing it compared to CRU or Special K, but when we are literally taking a game that never had any HDR to begin with and shoehorning HDR into it through AutoHDR, should we REALLY be debating "accuracy"? I say just use whatever looks best to you, and for me these tweaks get the picture 90% of the way there, close enough that I don't feel the need to go messing with CRU/Special K for the remaining 10%.
Hi there,
Can you confirm or deny the following: I changed from an Nvidia 3060 Ti to an AMD 6600 XT. For me, text sharpness seems WAY better than with the Nvidia card. I am using 4K@120Hz.
Thanks
Marco
Text sharpness is going to be affected by 3 things:
1. RGB/4:4:4 chroma. Make sure you have set the AMD card to output that format in the Radeon software; it might be outputting YCbCr 4:2:2 or 4:2:0 (see the data-rate sketch after this list).
2. PC mode on the HDMI input you are using. Changing GPUs means the input was reset from PC mode back to a generic HDMI input, so you will have to go back and relabel it as a PC input.
3. ClearType settings.
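On point 1, the reason a card may quietly fall back from RGB/4:4:4 is bandwidth. A rough data-rate calculation (active pixels only, ignoring blanking intervals and link encoding overhead, so real link requirements are higher):

# Rough active-pixel data rates for 3840x2160 @ 120Hz at 10 bits/component.

BITS_PER_PIXEL = {
    "RGB / YCbCr 4:4:4": 30,  # 3 full-resolution components
    "YCbCr 4:2:2":       20,  # chroma halved horizontally
    "YCbCr 4:2:0":       15,  # chroma halved both ways
}

w, h, hz = 3840, 2160, 120
for fmt, bpp in BITS_PER_PIXEL.items():
    gbps = w * h * hz * bpp / 1e9
    print(f"{fmt}: {gbps:.1f} Gbit/s")
# 4:4:4 at 10-bit needs ~29.9 Gbit/s of pixel data alone - far beyond an
# HDMI 2.0 link (~14.4 Gbit/s effective) - which is why drivers drop to
# 4:2:0 when they don't negotiate a full HDMI 2.1 connection, and why
# text gets fuzzy when they do.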
Display info (green button) says ycbcr444 10b 4L10
Changed from HDMI 1 to PC: no noticeable change.
ClearType: even menu elements not affected by ClearType seem way sharper.
Will change back to the 3060 Ti tomorrow to check for the AB/BA effect.
I don't know if this helps your scenario or not, but I use the "turn off the screen" feature, which turns the OLED emitters off. You can set that turn-off-the-screen command icon in the quick menu so it's only 2 clicks to activate with the remote (I set mine to the bottom-most icon on the quick menu), or you can enable voice commands and then hold the mic button and say "turn off the screen". I wish there was a way to set it to one of the colored buttons so you could just hit one button, but otherwise it works well enough. Clicking any button on the remote wakes up the emitters instantly. I usually hit the right side of the navigation wheel personally.
https://www.reddit.com/r/OLED/comments/j0mia1/quick_tip_for_a_fast_way_to_turn_off_the_screen/
While the emitters are off everything is still running, including sound. This works great for pausing games or movies and going AFK/out of the room for a while, for example. I sometimes cast Tidal HD to my Nvidia Shield in my living room from my tablet using the "turn off the screen" (emitters) feature. That allows me to control the playlists, find other material, pause, skip, etc. from my tablet with the TV emitters off when I'm not watching TV. You can do the same with YouTube material that is more about people talking than viewing anything; I do that sometimes when cooking in my kitchen, which is adjacent to my living room TV. You can probably cast or AirPlay to the TV's webOS itself similarly. Some receivers also do AirPlay/Tidal etc. directly on the receiver.
I mean the opposite...to keep the screen on, like a mad man, so it shows the album art and text of song playing. The screensaver turns on before one song finishes. I haven't used the service remote since it smells like someone dropped it in a vat of perfume, somehow. I am guessing the only solution is to void the warranty and disable the power/screen saving stuff?
That 42" OLED will be the default gaming monitor. Hard to beat it with anything but insane refresh rate and hardware G-SYNC.
I'm waiting for it! Hopefully I can hold out till it's sub-$1k.
The OLEDs are effectively hardware G-SYNC, just not made by NVIDIA. They need LG's custom controller to drive the individual pixels with G-SYNC-certified VRR, unlike LCD monitors from manufacturers who make half-baked VRR firmware for existing controllers. NVIDIA's hardware G-SYNC module just standardizes them all.
Anyone update to firmware 4.30.35 or 4.30.40 yet? Any negative side effects?
Yes, as of this morning, and no, not that I've noticed yet. LG has been really good about not breaking previously working features on these, so I went ahead and took the plunge.
FYI for anyone interested, for multi-monitor setups and window management etc.: DisplayFusion is 50% off right now as a Black Friday deal.
Nice, but I'm using the free Microsoft PowerToys, which includes FancyZones. It seems to work seamlessly lately, even with Windows 11. Any advantages to using DisplayFusion?
I use DisplayFusion Pro, which has a window-position save function where you can save window positions to one or more named profiles.
I also set my regularly used apps to DisplayFusion window-moving functions so they all have their own "home" position. I set each of those to its own hotkey in DisplayFusion and link each hotkey to its own Stream Deck button with a little icon for each app.
I cobbled together some DisplayFusion user scripts from different functions so that each time I hit an app's button it does the following (the decision flow is sketched after this list):
--checks to see if the app is open or not; if not, it opens it and moves it to its set position.
--if it's open, it checks to see if it's minimized.
--if it's open and not minimized, it minimizes it.
--if it's open and minimized, it restores it to the set position.
... so I can hit the same button a few times and get the particular app window back to its home position, or minimize/restore it, basically.
... I also have a window-position profile button on the Stream Deck set to the DisplayFusion window-position profile hotkey, so I can simply shuffle them all back at once without having to use the scripted function buttons.
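The decision flow of that scripted function, mocked up in Python against a tiny fake window manager. DisplayFusion's actual scripted functions are written in C# against its own API, so this only shows the logic, not real DisplayFusion code:

# Simulated launch/restore/minimize toggle, as described in the list above.

windows = {}  # app name -> {"minimized": bool, "pos": tuple}

def toggle_app(app: str, home_pos: tuple) -> str:
    win = windows.get(app)
    if win is None:                          # not open: launch and place it
        windows[app] = {"minimized": False, "pos": home_pos}
        return "launched and moved to home position"
    if win["minimized"]:                     # open but minimized: restore it
        win["minimized"] = False
        win["pos"] = home_pos
        return "restored to home position"
    win["minimized"] = True                  # open and visible: minimize it
    return "minimized"

for _ in range(3):
    print(toggle_app("notepad", (0, 0, 800, 600)))
# launched and moved to home position / minimized / restored to home position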
I just posted a bunch of things I do with it for regular system use outside of OBS/streaming, about 3 replies ago: any action or function you can trigger, toggles, multi-actions/functions.
I integrate it with the DisplayFusion multi-monitor/window-management app via hotkeys for a lot of useful functions.
https://www.displayfusion.com/Features/Functions/
https://www.displayfusion.com/ScriptedFunctions/
There is also a growing library of useful Stream Deck plugins you can use directly in the Stream Deck app:
https://apps.elgato.com/plugins
The big draw is tactile hardware buttons with customizable graphics that can change state - for example when you toggle which sound device you are using on a multi-press, or mute/unmute the mic - but there are a ton of window-management and other things you can do with it. I use it to launch and place every app in my 3-monitor setup, restore everything to a saved window-position profile, etc. I launch (and place) practically everything with it now. I have button icons to place the active window at any of a number of set locations and screen ratios on my 3 screens on the fly, so I never have to drag or resize windows. I also launch Steam Big Picture or the desktop version with a toggle button, have a terminate-and-relaunch-Steam button, and launch other game stores/libraries. I pop apps or Windows system panels up and minimize/restore toggle-cycle them, etc. There is a ton you can trigger with these buttons, and the Elgato Stream Deck library already has a lot of stuff. They are also useful as a toolbox for things like Photoshop or GIMP or other graphics/video editors, streaming apps, etc. There are a lot of app integration plugins already on the plugin site I linked.
The OLEDs are effectively hardware G-SYNC, just not made by NVIDIA. They need LG's custom controller to drive the individual pixels with G-SYNC-certified VRR, unlike LCD monitors from manufacturers who make half-baked VRR firmware for existing controllers. NVIDIA's hardware G-SYNC module just standardizes them all.
The Nvidia module is the most effective one, though. One thing is for sure: the Nvidia hardware module starts synchronizing frames at 1fps, while LG starts at 28fps; below that you get the usual V-sync stutter.
One of these innovations is low framerate compensation (LFC), which addresses the framerate dropping below the monitor's VRR range. For example, if the FPS drops below a monitor's 30Hz floor, LFC will increase the monitor's refresh rate by a consistent ratio. So, if the game is at 25 FPS, LFC will set the refresh rate to 50Hz, and that will still prevent the gamer from being affected by screen tearing.
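That quoted example generalizes to picking an integer multiple of the frame rate that lands inside the panel's VRR window; here's a minimal sketch of the idea, illustrative only and not any vendor's actual algorithm:

# Low framerate compensation: when fps falls below the VRR floor, refresh
# at an integer multiple of the frame rate so each frame is repeated a
# whole number of times and the panel stays inside its VRR window.

def lfc_refresh(fps: float, vrr_min: float, vrr_max: float) -> float:
    if fps >= vrr_min:
        return fps                    # inside the range: track fps directly
    multiple = 2
    while fps * multiple < vrr_min:
        multiple += 1
    refresh = fps * multiple
    assert refresh <= vrr_max, "no integer multiple fits the VRR window"
    return refresh

print(lfc_refresh(25, 30, 120))  # 50.0 -> the quoted 25fps example (x2)
print(lfc_refresh(10, 30, 120))  # 30.0 -> 10fps shown x3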