[Sticky] Monitor/Display Information Thread + G-SYNC/FreeSync + Lightboost/ULMB  

Jester
(@jester)
Member Moderator

There is a lot of misinformation and a general lack of knowledge going around regarding monitors and displays. Let this thread serve as a guide.

Display performance is objective. There are clear-cut targets for almost every picture quality metric. Color accuracy, color temperature, and gamma all have very specific targets, and higher resolution and refresh rate (with no ceiling) are objectively better, although some operating systems and applications don't scale well at very high resolutions.

So first, let's define these performance metrics and specify what the ideal targets are.

Color Gamut, Volume, and Depth

Gamut and Volume are related, Depth is another metric.

Color Gamut: Also referred to as color space. The range of colors a display can produce. See the visual below comparing three common color spaces: Rec. 709 aka sRGB (what most content, including nearly all video games, is rendered in), DCI-P3 (the digital cinema standard), and Rec. 2020 (the Ultra HD standard). More is not necessarily better. What you want is for your display to have 100% coverage of the color space that the content is made for, whether it's a game or a movie. That is 100%, no more and no less. Too little coverage means washed-out colors; too much means oversaturation. Ideally, a display would have different modes, each covering 100% of one of the color gamuts it supports, no more and no less.

[Image: CIE chromaticity diagram comparing the Rec. 709/sRGB, DCI-P3, and Rec. 2020 color spaces]

Color Volume: This refers to how many colors a screen can display across a range of brightnesses, without ignoring the darker and lighter shades of each color. The higher the better, with 100% being the highest value. Color volume is measured relative to a target color space, like those in the graph above.

Color Depth: Also known as bit depth. This is the number of bits used to indicate the color of a single pixel. More is better, as more bits mean more tones per channel for every pixel, leading to many more possible color tones (but NOT a wider range of colors; that is gamut/color space). In other words, more bit depth means smoother gradients, as illustrated below.

[Image: gradient comparison showing banding at lower bit depths and smooth gradation at higher bit depths]

8-bit is the current standard. Only HDR movies/TV might use 10-bit, and nothing goes beyond that largely due to interface bandwidth limitations (HDMI and DisplayPort). Read more about color depth here. The image below is one of the most important parts of that article.

[Image: chart from the linked color depth article]

Play any game and look up at the sky when it's a single color. Notice how much banding there is, how bad the gradient is. Higher color depth would solve that. That is one of our biggest limitations/shortcomings in digital graphics right now, especially for entertainment since 8-bit is far too little.
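To put numbers on it, here is a quick sketch (assuming the usual three RGB channels) of how many tones per channel and total colors each bit depth allows:

# Tones per channel and total colors at common bit depths (per-channel bits, three RGB channels)
for bits in (6, 8, 10, 12):
    tones_per_channel = 2 ** bits          # e.g. 2^8 = 256
    total_colors = tones_per_channel ** 3  # R x G x B combinations
    print(f"{bits}-bit: {tones_per_channel} tones per channel, {total_colors:,} total colors")
# 8-bit  -> 256 tones per channel,   16,777,216 total colors
# 10-bit -> 1,024 tones per channel, 1,073,741,824 total colors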

Static vs Dynamic Contrast Ratios, Color Accuracy, Color Temperature, Grayscale/White Balance, Gamma, and Brightness

Let's kill many birds with one stone here.

Contrast Ratio: The ratio between the luminance of the brightest white and the darkest black a screen can produce. This is one of the most important performance metrics for picture quality. Higher = better, with no ceiling. Static contrast refers to the contrast ratio measured at a single, fixed brightness setting, so this is the most important contrast measurement for LCD displays. Dynamic contrast is the contrast ratio measured with a varying brightness level. The problem here is that LCDs are lit by a backlight (in this day and age, that means LEDs mounted either on the sides or behind the panel). Higher end LCD (aka LED) TVs and now monitors have a feature called "local dimming" which adjusts the backlight level based on the content on screen, e.g. it dims in dark scenes and brightens in bright scenes (HDR gives finer control and range of this). As you can imagine, this works poorly on LCD screens with only a few compartmentalized backlight "dimming zones," since it just leads to an excessively dim or excessively bright picture, depending on the content.

The band-aid workaround for this is increasing the number of dimming zones. The highest I know of is around 620 (maybe something has more now), but either way, a 1920 x 1080 TV has 2,073,600 pixels, so ideally it needs 2,073,600 dimming zones, while a 4K TV needs 4x that: 8,294,400.

Don't roll your eyes. OLED displays have an OLED element for every pixel, so yes, 4K OLED TVs effectively have 8,294,400 "dimming zones." Each pixel dims as on-screen content darkens, all the way to turning completely off for true blacks. This means OLED mathematically has a contrast ratio of infinity (peak brightness divided by a black level of zero), and this contrast is always "dynamic" in a sense, since each pixel's luminance tracks the content rather than sitting at a static level.
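As a quick sketch of the math (the luminance figures below are made up purely for illustration):

def contrast_ratio(white_nits: float, black_nits: float) -> float:
    """Contrast ratio = peak white luminance / black level luminance (both in cd/m2)."""
    if black_nits == 0:
        return float("inf")  # OLED: pixels switch fully off, so the ratio is effectively infinite
    return white_nits / black_nits

print(contrast_ratio(350.0, 0.35))  # illustrative IPS-like panel -> 1000.0 (i.e. 1,000:1)
print(contrast_ratio(300.0, 0.10))  # illustrative VA-like panel  -> 3000.0 (i.e. 3,000:1)
print(contrast_ratio(150.0, 0.0))   # OLED with true black        -> inf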

For LCD, unless you're dealing with a screen with full array backlighting (backlighting mounted behind the panel) and a relatively high number of dimming zones (384 or more), ignore the outrageous dynamic contrast figures like 1,000,000:1. Here is a table of typical static contrast per panel type.

Panel Type | Common Usage | Typical Static Contrast Ratio
TN | Monitors | 900:1 - 1,000:1 for modern TN (used to be much lower)
IPS | Monitors and low end TVs | 1,000:1 - 1,300:1 for modern IPS (used to be much lower)
VA | High end TVs and some monitors | 5,000:1 - 7,000:1 for high end TVs, 2,500:1 - 3,000:1 for monitors

Color Accuracy: Measured in delta E, a standard set by the International Commission on Illumination (CIE). The underlying formula measures color difference and has only grown more complex over the years, but the takeaway for calibration is simple: the lower the value (down to 0), the more accurate the color. Values over 3 start to look noticeably inaccurate.
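For a rough sketch of what delta E actually computes, the original and simplest version (CIE76) is just the Euclidean distance between two colors in CIELAB space; the later revisions (CIE94, CIEDE2000) add perceptual weighting but keep the same lower-is-better interpretation:

import math

def delta_e_cie76(lab1, lab2):
    """CIE76 delta E: Euclidean distance between two CIELAB colors (L*, a*, b*)."""
    return math.sqrt(sum((c1 - c2) ** 2 for c1, c2 in zip(lab1, lab2)))

# Hypothetical calibration measurement: intended gray vs. what the panel actually showed
target   = (50.0, 0.0, 0.0)   # intended L*a*b* value
measured = (50.5, 1.2, -0.8)  # colorimeter reading (made-up numbers)
print(round(delta_e_cie76(target, measured), 2))  # ~1.53, comfortably under the ~3.0 visibility threshold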

Color Temperature: A characteristic of visible light, measured in Kelvin. It affects the overall color tone of the light emitted by the display. Higher values = cooler (more blue), lower values = warmer (more orange/red). "Low Blue Light" modes just severely lower the color temperature. Ideally, the color temperature should be perfectly neutral so as to not tint the image. The neutral value, and thus the target, is 6500 K (neutral white).

Grayscale/White Balance: Measured in delta E, this is like color accuracy but only for the grayscale (whites, grays, blacks). Again, the ideal value is 0; lower is better.

Gamma: How the display maps input signal level to output luminance across the grayscale, i.e. the brightness of the grayscale. The ideal value for accurate viewing is 2.2, so try to get as close to that as possible for all shades of gray.
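A minimal sketch of what a 2.2 gamma target means, using a pure power-law curve with signal and luminance normalized to 0-1:

def signal_to_luminance(signal: float, gamma: float = 2.2) -> float:
    """Relative luminance produced by a normalized input signal under a pure power-law gamma."""
    return signal ** gamma

# A 50% input signal does NOT produce 50% luminance on a 2.2 gamma display:
print(round(signal_to_luminance(0.50), 3))  # ~0.218 -> about 22% of peak luminance
print(round(signal_to_luminance(0.25), 3))  # ~0.047 -> dark shades are compressed even further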

Brightness: This is just the overall brightness, or backlight level (on LCD), of the display. The brightness setting impacts performance in all other areas, so it is quite important. LCD displays are designed to be run at around 120 cd/m2, but the ideal brightness depends on your environment, namely the ambient light of the room. The display's brightness shouldn't overpower the room, or it will fatigue your eyes. Higher brightness also worsens contrast by raising black levels, ESPECIALLY on IPS panels. My recommendation is to adjust room lighting according to the display's ideal ~120 cd/m2 target, NOT the other way around. This means no ambient lighting and only a bias light, which is a 6500 K white light behind the monitor with even distribution; it improves perceived contrast and black levels (seen in the video at the bottom of this post).

Brightness is less of a factor for OLED since its brightness range is far more limited anyhow, and since it has perfect blacks thus infinite contrast at any brightness setting.

So these are the important metrics and the target for each of them. Accuracy is achieved via hardware calibration, using a colorimeter or spectrophotometer. Rarely are displays pre-calibrated, and high end TVs have far better factory calibration than monitors. Here is what a detailed calibration report looks like.

Few websites go this far when reviewing displays. For TVs, rtings.com is one of the only ones. For monitors, check out prad.de, tftcentral.co.uk, and pcmonitors.info

Now we will look at advanced technologies like high refresh rate, variable refresh rate (G-SYNC, FreeSync, AdaptiveSync), and blur reduction via strobing (Lightboost/ULMB and others). My Acer Predator XB270HU offers both NVIDIA G-SYNC and ULMB. G-SYNC on this monitor operates between 30-144 Hz, and ULMB is available at 85 Hz, 100 Hz, and 120 Hz. I've done extensive testing of each of these.

High Refresh Rate

First and foremost I should discuss the advantages of high refresh rate (more than 60 Hz). The default refresh rate (so to speak) of LCD displays is 60 Hz. That's the standard, and that's what nearly every LCD display on the market runs at, whether it's a TV or a monitor, and whether it uses a TN, IPS, or VA panel.

Fake High Refresh Rate: Any TV that's listed as 240 Hz or above is telling you a blatant, stupid lie. Most TVs listed as 120 Hz are lying just as much. Almost every TV is 60 Hz. Only a minuscule number can actually display 120 Hz, and all of those do so at 1080p (1920 x 1080). Any TV running at 4K (3840 x 2160) is also running at 60 Hz. Here is a list of true 120 Hz TVs.

The question is, is there any truth to these claims? If they're really 60 Hz TVs, then why do manufacturers claim 120 Hz or 240 Hz? All it really means is that those TVs use motion interpolation to convert content to 120 Hz or 240 Hz internally, while the TV itself can still only output 60 Hz. Such interpolation can have positive or negative effects depending on what refresh rate it's converting to and on the model itself. Both 120 Hz and 240 Hz interpolation should be beneficial for watching TV or movies, because those are shot at 24 FPS, and 24 FPS doesn't divide evenly into 60 Hz but does divide evenly into 120 Hz and 240 Hz. Thus, interpolation to 120/240 Hz can potentially deliver judder-free movie/TV playback.
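A quick sketch of the judder arithmetic behind that last point (why 24 FPS content fits cleanly into 120 Hz and 240 Hz but not 60 Hz):

def refreshes_per_frame(refresh_hz: int, content_fps: int) -> float:
    """How many display refreshes each content frame should occupy."""
    return refresh_hz / content_fps

print(refreshes_per_frame(60, 24))   # 2.5  -> uneven 3:2 cadence, visible judder
print(refreshes_per_frame(120, 24))  # 5.0  -> every frame held exactly 5 refreshes, no judder
print(refreshes_per_frame(240, 24))  # 10.0 -> also perfectly even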

Benefits of Higher Refresh Rate

  • Allows the monitor/TV to display higher frame rates. A 60 Hz display can only physically display 60 FPS; a 120 Hz display can display 120 FPS. There used to be a lot of confusion over this topic, but these days it's not so bad. Console gamers still like to say "the human eye can't see more than 60 FPS!" Some even say we can't see over 30 FPS. That's all uninformed nonsense. In reality the human eye can easily detect differences up to around 150 FPS; it's around this area where it becomes harder to tell, and this is where some people fare better than others (some may easily see the difference between 144 FPS and 200 FPS, for example).
  • Reduced eye strain.
  • Lower Response Times - Response time refers to pixel response time. Higher = more motion blurring. Higher refresh rates lead to lower response times, thus less motion blur. See the image below.
TFTCentral wrote:
[Image: TFTCentral motion blur comparison of the ASUS ROG Swift PG279Q at 60 Hz, 100 Hz, and 144 Hz]

The image above is from TFTCentral's review of the ASUS ROG Swift PG279Q. It clearly demonstrates the reduced motion blur from higher refresh rate, which is because it has a much lower response time at the higher refresh rates. They measured an average of 8.5 ms pixel response time at 60 Hz, 6.5 ms response time at 100 Hz, and 5.2 ms response time at 144 Hz.
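For context, here is a quick sketch comparing each refresh interval (1000 ms divided by the refresh rate) against those measured averages:

measured_response_ms = {60: 8.5, 100: 6.5, 144: 5.2}  # TFTCentral's PG279Q averages quoted above

for hz, response in measured_response_ms.items():
    interval_ms = 1000 / hz  # how long each frame is held on screen
    print(f"{hz} Hz: refresh interval {interval_ms:.2f} ms, measured average response {response} ms")
# 60 Hz:  16.67 ms interval vs 8.5 ms response
# 100 Hz: 10.00 ms interval vs 6.5 ms response
# 144 Hz:  6.94 ms interval vs 5.2 ms response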

The advantage doesn't stop at 144 Hz. Below is an example from PCMonitors who tested the ASUS ROG Swift PG248Q. Look at the 60 Hz vs 100 Hz vs 144 Hz vs 180 Hz comparison. 180 Hz produces a noticeably clearer image than 144 Hz.

PCMonitors wrote:
[Image: PCMonitors motion blur comparison of the ASUS ROG Swift PG248Q at 60 Hz, 100 Hz, 144 Hz, and 180 Hz]

What this translates to in practice is that moving content on the screen appears less blurry, especially when your eyes are tracking something in motion.

Disadvantages of High Refresh Rate

  • Taxes your graphics card a bit more. Not that this is a big deal at all; higher refresh rates are a universally good thing.
NVIDIA G-SYNC / AMD FreeSync

These two technologies are the most talked about, so I'll go over them before discussing blur reduction. G-SYNC and FreeSync do the same thing; they're just implemented differently. AMD FreeSync is actually VESA AdaptiveSync taken by AMD and given a new name. Then there is FreeSync 2, which is FreeSync with tightened standards such as mandatory HDR support, mandatory LFC (Low Framerate Compensation), and more. But first, what are G-SYNC and FreeSync? The answer lies in V-Sync, a technology that has been around forever. It was invented to remove screen tearing.

V-Sync (Vertical Synchronization) - Forces a game/application's frame rate to match the display's refresh rate. Alternatively it can be set to 1/2, 1/3, or 1/4 of the refresh rate. All of this is done to remove screen tearing, an annoying issue pictured here. Screen tearing occurs when a game's frame rate does not match the display's refresh rate, and it's more common at lower frame rates. V-Sync does remove tearing, but it has several flaws, which are listed below.

Flaws of V-Sync

  • Stutter - If the frame rate can't match the refresh rate, then the game will switch to V-Sync at half refresh rate instead of full refresh rate (e.g. 30 FPS instead of 60 FPS, which is how most console games operate). But the transition between V-Sync full refresh rate and V-Sync half refresh rate creates stutter. Stutter is demonstrated here.
  • Input Lag - V-Sync increases input lag, which can sometimes be felt. Triple buffered V-Sync, which should always be used if available when you're going to use V-Sync, minimizes the additional input lag compared to double buffering (which consoles use, though I'm not 100% sure if the latest consoles are double buffered). Read this if you want to learn more about buffering. At most it adds one frame of lag (16.67 ms at 60 Hz, 8.33 ms at 120 Hz); see the quick sketch after this list.
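To put numbers on the behavior above, a quick sketch of the frame rates a double-buffered V-Sync setup snaps to at each divisor, and what one frame of lag works out to at each refresh rate:

for refresh_hz in (60, 120, 144):
    frame_time_ms = 1000 / refresh_hz
    vsync_steps = [refresh_hz // d for d in (1, 2, 3, 4)]  # full, 1/2, 1/3, 1/4 refresh rate
    print(f"{refresh_hz} Hz: V-Sync steps {vsync_steps} FPS, one frame of lag = {frame_time_ms:.2f} ms")
# 60 Hz:  [60, 30, 20, 15] FPS,  one frame = 16.67 ms
# 120 Hz: [120, 60, 40, 30] FPS, one frame = 8.33 ms
# 144 Hz: [144, 72, 48, 36] FPS, one frame = 6.94 ms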

NVIDIA invented Adaptive V-Sync, which simply disables V-Sync whenever the game can't maintain the target frame rate. This solves the stuttering issue, but obviously reintroduces screen tearing whenever V-Sync is disabled.

The solution to V-Sync's flaws? That's another thing NVIDIA invented: G-SYNC. We really have a lot to thank NVIDIA for, as they either invented or pushed forward some of the most game changing graphics technology in history. G-SYNC is one example, as is unified shader processing (introduced on PC with the GeForce 8 series graphics cards / G80 chip). LightBoost is another; not an original invention, but if it weren't for LightBoost we wouldn't have a bunch of blur reduction monitors today. NVIDIA also develops PhysX (acquired from Ageia), by far the most advanced game physics engine. But I digress.

NVIDIA G-SYNC - Hardware implementation of variable refresh rate technology. Instead of forcing a game's frame rate to match the refresh rate (or a fraction of it), which is what V-Sync does, G-SYNC works in reverse by syncing the display's refresh rate to the game's frame rate. This removes all screen tearing without any of the downsides of V-Sync. Input lag? Virtually none is added. The G-SYNC hardware module also serves as the monitor's scaler, and it is a very fast scaler with perhaps the lowest input lag ever. The downside is that it has zero resolution scaling capability, so running non-native resolutions via DisplayPort on a G-SYNC monitor falls back to GPU scaling, which looks awful.

Stuttering? None, and G-SYNC actually reduces microstuttering, leading to a smoother game experience in every conceivable way. This test demonstrates microstuttering, which is basically a less severe form of stuttering. Note that G-SYNC obviously only works with modern NVIDIA graphics cards, and only over DisplayPort.

Here is a G-SYNC simulation, to give you an idea of how good it is.

G-SYNC always works right up to the monitor's maximum refresh rate, and down to about 30 Hz. Below 30 FPS, G-SYNC switches to frame doubling (each frame shown twice, aka G-SYNC @ 1/2 refresh rate) down to about 15 FPS. Under 15 FPS, it switches to frame tripling (aka G-SYNC @ 1/3 refresh rate) until around 10 FPS (because 10 FPS x 3 = 30 Hz, and G-SYNC doesn't work below this).
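A minimal sketch of that frame multiplication logic (the thresholds are approximate, as above; the idea is just to pick the smallest multiplier that keeps the panel at or above its ~30 Hz floor):

def refresh_for_fps(fps, min_hz=30, max_hz=144):
    """Return (multiplier, refresh rate) so each frame is redrawn enough times to stay in the VRR range."""
    multiplier = 1
    while fps * multiplier < min_hz:
        multiplier += 1  # frame doubling, tripling, and so on
    return multiplier, min(fps * multiplier, max_hz)

for fps in (90, 45, 25, 12):
    print(fps, "FPS ->", refresh_for_fps(fps))
# 90 FPS -> (1, 90)   45 FPS -> (1, 45)   25 FPS -> (2, 50)   12 FPS -> (3, 36)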

A newer version of G-SYNC is on the horizon, dubbed "G-SYNC HDR." The only known difference is the inclusion of HDR support.

Note about using G-SYNC: When enabling G-SYNC, you should just use the default settings that come with it. These defaults include setting V-Sync to "Force On." The reason for this is that it keeps your frame rate from going past your maximum refresh rate; if that were to happen, G-SYNC disengages and screen tearing can return. Keeping V-Sync on Force On means V-Sync kicks in when your frame rate reaches the refresh rate ceiling. A real life example: on my XB270HU at 144 Hz, using these settings means G-SYNC is active as long as the refresh rate is between around 30 and 143 Hz (10-143 FPS), but at 144 FPS/144 Hz V-Sync takes over and effectively limits my frame rate.

AMD FreeSync - Free implementation of variable refresh rate technology, stemming directly from VESA AdaptiveSync. It does the same thing as G-SYNC, but because of the different implementation (it does not require one specific proprietary scaler, although the scaler is still involved in the process, just not to the extent of G-SYNC) there are some differences between G-SYNC and FreeSync monitors that use the same panel. FreeSync has almost no requirements to meet, so many FreeSync monitors have a terrible implementation, including non-functional Low Framerate Compensation (LFC). G-SYNC drives up monitor cost by $150-200; FreeSync does not. FreeSync is supported by AMD and Intel graphics processors.

FreeSync's refresh rate range is more limited than G-SYNC's, but LFC can make that negligible... if it works. However, at lower frame rates (toward the bottom of the FreeSync range), FreeSync adds more lag than G-SYNC and can feel less smooth, since the G-SYNC module has its own onboard memory and handles frame repetition itself, so the GPU never has to resend frames with G-SYNC.

It is, however, possible to have resolution scaling on the monitor side with FreeSync running over DisplayPort. It all depends on the scaler.

AMD FreeSync 2 - While FreeSync had very few requirements, FreeSync 2, its successor, has tight standards. While the variable refresh rate range doesn't have to be as enormous as that of G-SYNC, it is still more tightly regulated, and most of all LFC (Low Framerate Compensation) is mandatory, so the effective frame rate range for FreeSync 2 goes all the way down to values as low as 10-15 FPS. It also mandates HDR support, intelligent dynamic HDR on/off switching, and more. FreeSync 2 is thus better than G-SYNC, and should be roughly equal to "G-SYNC HDR."

Advantages of G-SYNC/FreeSync

  • Removes all screen tearing.
  • Removes microstuttering.
  • Does not introduce stutter like V-Sync does.
  • Does not introduce input lag like V-Sync does.

Disadvantages

  • G-SYNC significantly adds to the cost of the monitor, as does FreeSync 2 (FreeSync does not).
  • G-SYNC cannot be used at the same time as NVIDIA ULMB or LightBoost.
  • Since the refresh rate varies, response time varies with it. Remember the PG279Q comparison pictured/discussed above, which averages 8.5 ms at 60 Hz, 6.5 ms at 100 Hz, and 5.2 ms at 144 Hz? This means motion blur varies as frame rate varies. There is noticeably more motion blur at 60 FPS than at 135 FPS, for example. So we're sacrificing a bit of motion clarity for the removal of screen tearing.
Blur Reduction (NVIDIA Lightboost / ULMB)

I briefly touched on this before. Pretty much every blur reduction technology, from NVIDIA LightBoost (which popularized it) to NVIDIA ULMB (Ultra Low Motion Blur), BenQ Blur Reduction, Eizo Turbo240, and others, works essentially the same way. First, you need to understand that LCD displays are backlit; the screen is illuminated by lights (LEDs in this day and age) installed either at the edges of the panel (every computer monitor I know of) or behind the panel (the best TVs, and this is the best method).

These blur reduction technologies remove essentially all perceivable motion blur by strobing the backlight: turning it off while waiting for pixel transitions, and flashing it only on fully refreshed frames. This replaces the sample-and-hold behavior that non-strobing monitors use. No more holding a frame on screen while waiting for the next one, which is what causes most motion blur; instead, the backlight is shut off in between frames. As you may have guessed, this can cause visible flicker, like PWM dimming, which can cause headaches or worse, but a high enough refresh rate makes the flicker unnoticeable.
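A rough sketch of why this works, using the usual persistence approximation (perceived blur while your eyes track motion is roughly tracking speed multiplied by how long each frame stays lit); the tracking speed here is made up for illustration:

def blur_width_px(tracking_speed_px_per_s: float, persistence_s: float) -> float:
    """Approximate smear width: how far the eye moves while a single frame remains visible."""
    return tracking_speed_px_per_s * persistence_s

speed = 960.0  # illustrative tracking speed in pixels per second

print(blur_width_px(speed, 1 / 60))   # sample-and-hold at 60 Hz:  ~16 px of smear
print(blur_width_px(speed, 1 / 144))  # sample-and-hold at 144 Hz: ~6.7 px
print(blur_width_px(speed, 0.001))    # 1 ms backlight strobe:     ~1 px, CRT-like clarity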

The image used above when demonstrating high refresh rates on the ASUS ROG Swift PG248Q is the best example, so we'll just repeat it.

PCMonitors wrote:
[Image: the PCMonitors PG248Q motion blur comparison repeated from above]

That 180 Hz photo is the cleanest non-strobing motion image I've ever seen. And even ULMB @ 85 Hz beats it. ULMB @ 100 Hz and 120 Hz destroy it!

Blur reduction can't be used at the same time as variable refresh rate, except via a hack on at least one LightBoost monitor. This is a shame, because the ideal monitor would combine variable refresh rate (VESA AdaptiveSync implemented like G-SYNC, with a custom scaler with built-in memory used like the G-SYNC module, except with resolution scaling) with an open standard, GPU independent blur reduction implementation that can run at the same time.

To implement such a thing, the manufacturer would have to let the user set a constant brightness target so that the backlight can be adjusted depending on refresh rate (since faster strobing = higher brightness). So, such a monitor would have a VRR + strobing ("StrobeSync"?) mode in which you define the brightness target for the screen, preferably in 10 cd/m2 intervals (100 cd/m2, 110, 120, 130, 140, 150, etc.). The maximum brightness offered in this "StrobeSync" mode would be a value attainable by the monitor at any refresh rate with blur reduction enabled. The "StrobeSync" range would ideally run from the maximum refresh rate down to 50 Hz, but the user should be able to raise the minimum refresh rate for StrobeSync mode however they please (in 1 Hz intervals). E.g., on a 144 Hz monitor, the user should be able to select a "StrobeSync" range anywhere from 50-144 Hz up to 120-144 Hz (no need to go above 120). Below 50 Hz, the strobing should switch off but VRR should remain.

There should ideally be two StrobeSync modes: one with zero interpolation and 1:1 strobing throughout its entire range, for the few flicker insensitive people out there who can handle 50-60 Hz single strobing. The other mode would use motion interpolation and different strobing modes depending on the refresh rate. Here's what I'd like, using a 50-144 Hz "StrobeSync" range as an example:

  • 50 - 100 Hz = Motion interpolation would be used to double the refresh rate (like any "120 Hz" TV), and double strobing would be used at the converted refresh rate (with a user defined strobe length for each strobe; by default a long and a short strobe would be used, like Eizo Turbo240). However, this interpolated refresh rate range should be user defined. I would make it 50-100 Hz; someone less sensitive to flicker than me might make it 50-80 Hz.
  • Over 100 Hz = Interpolation free 1:1 single strobing, with the user able to adjust strobe length still.

Then at maximum refresh rate (in this case 144 Hz) the user can enable V-Sync, a frame rate limiter, Fast Sync, or nothing to take over.

As for brightness? Remember, the user would define a target brightness for this "StrobeSync." I would set it to 120 cd/m2 or even 110 cd/m2, so the brightness of the screen would adjust itself depending on refresh rate to maintain the user defined brightness. No part of a monitor knows what brightness it is outputting, so this would have to be a hard coded feature from the monitor manufacturer. But, worth it!
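Here is a sketch of the math such a hypothetical "StrobeSync" mode would have to do: to hold a constant perceived brightness, the strobe pulse has to get shorter as the refresh rate climbs, since average luminance is roughly the full-on panel luminance times the duty cycle (pulse width times refresh rate). The 400 cd/m2 full-on figure is made up for illustration:

def pulse_width_ms(target_nits: float, panel_full_on_nits: float, refresh_hz: float) -> float:
    """Strobe length per refresh needed so the average brightness hits the user's target."""
    duty_cycle = target_nits / panel_full_on_nits  # fraction of each refresh the backlight stays lit
    return (duty_cycle / refresh_hz) * 1000        # per-refresh on-time, converted to milliseconds

for hz in (50, 100, 120, 144):
    print(hz, "Hz ->", round(pulse_width_ms(120, 400, hz), 2), "ms strobe")
# 50 Hz -> 6.0 ms, 100 Hz -> 3.0 ms, 120 Hz -> 2.5 ms, 144 Hz -> 2.08 ms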

Such perfection will likely never be developed.

Blur Reduction Pros

  • Removes all perceivable motion blur. Creates CRT-like motion clarity at 120 Hz with V-Sync enabled (or 144 Hz + V-Sync for BenQ).

Blur Reduction Cons

  • Flicker, which is more obvious the lower the refresh rate. I don't think anyone can SEE the flicker when strobing at 120 Hz, but eye fatigue can still occur after lengthy exposure. At 100 Hz I can see flicker on the desktop but not in games, though eye fatigue sets in well under an hour and headaches hit me at around an hour of exposure. Below 100 Hz, forget about it, since the flicker is blatant everywhere.
  • Reduced maximum brightness. Should only be used in dim/dark rooms. Eizo's Turbo240 is the only one I know of that can still achieve very high brightness (over 250 cd/m2, too much for every scenario except having natural light illuminating the monitor). BenQ's blur reduction still allows the monitor to reach around 190 cd/m2, which is bright (dark rooms should stick to around 120 cd/m2 for reference). 120 Hz ULMB displays can still achieve around 120 cd/m2 but not much higher. LightBoost gets brighter than ULMB. The reason ULMB is so dim is that it doesn't apply extra voltage to the backlight to compensate for the brightness lost to strobing. Also, higher refresh rate = faster strobing = less brightness loss, hence Eizo Turbo240 getting so bright (240 Hz strobing plus an overvolted backlight).
  • Strobe crosstalk. A type of ghosting that appears with blur reduction. It's worse at the bottom of the screen on most models. Easily visible if V-Sync is disabled. I also still see lots of it even with V-Sync enabled and ULMB enabled at 85 Hz and 100 Hz on my Acer XB270HU, and plenty of it at 120 Hz toward the bottom of the screen but it's less intrusive than motion blur.
  • Building on my last point, most will only find single strobe blur reduction usable at 120 Hz or above with V-Sync enabled. Good luck maintaining 120 FPS V-Sync in modern games! Some games look better than others with ULMB 100 Hz, so your mileage may vary there. Note this is on an IPS monitor. 100 Hz single strobe blur reduction might yield nice results on TN, I can't say. 120 Hz ULMB with V-Sync 1/2 refresh rate still looks bad, worse than no strobing. From my experience only Eizo Turbo240 is effective at lower refresh rates/frame rates.
  • Reduced contrast and color accuracy. It's only awful on LightBoost monitors though. The effect on BenQ, ULMB, and Turbo240 displays is minimal. Turbo240 has less of an impact on these than ULMB.
  • Can increase input lag slightly. Seems to be negligible for ULMB and BenQ blur reduction.

Below is a listing of common blur reduction technologies. Almost all of them only work over DisplayPort.

  • NVIDIA LightBoost: Perhaps the first blur reduction implementation on computer monitors? It's the first time I heard of it. LightBoost has since been replaced by NVIDIA's own ULMB, since LightBoost reduces maximum brightness far more, and it degrades contrast and color quality far more. Works at no more than 120 Hz.
  • NVIDIA ULMB (Ultra Low Motion Blur): A fitting name indeed. Like LightBoost, but less impact on peak luminance (although it still GREATLY limits it), and insignificant effect on contrast and color quality. Many ULMB monitors have adjustable pulse width, so ULMB is generally customizable unlike other blur reduction modes! Lower pulse width = lower brightness but less blur, and the opposite is true. ULMB functions at 85 Hz, 100 Hz, and 120 Hz, but not 144 Hz sadly.
    • ULMB 85 Hz - The backlight strobes at 85 Hz (once per refresh interval). As a result there is an incredible amount of flicker so it's not recommended. Instant headache or epilepsy. It also has an insane amount of crosstalk. Terrible.
    • ULMB 100 Hz - The backlight strobes at 100 Hz (once per refresh interval). As a result there is some flicker on the desktop (might not be immediately noticeable but fatiguing after a while), but not in games to my eyes. Some games show more crosstalk than others; the majority I've tested have an unacceptable amount of crosstalk, others have enough to make it difficult to choose over no ULMB (crosstalk vs motion blur). A few actually look better with 100 Hz ULMB. There is some overshoot/inverse ghosting with my XB270HU and 100 Hz ULMB.
    • ULMB 120 Hz - The backlight strobes at 120 Hz (once per refresh interval). As a result, flickering isn't really seen but it can still be felt after some amount of time (depends on the person). Crosstalk, at least when V-Sync is enabled, is present toward the bottom of the screen, resulting in imperfect motion clarity but it's certainly not as bad as the motion blur received when not using ULMB.
    • Pulse Width: Most ULMB monitors let you adjust the pulse width, which adjusts the strobe length. Lower setting = shorter strobe length = significantly lower brightness. In theory lower pulse width might lower motion blur somewhat, but I don't see it at 120 Hz. It looks exactly the same to me at 10 vs 100, except way darker. Less than 65 yields too dark of a picture for any purpose I find, with my XB270HU and ULMB at 120 Hz. I suppose 120 Hz strobing is fast enough to not benefit from very short strobe intervals. But at 100 Hz, lowering the pulse width seems to improve motion clarity slightly and somewhat alleviate strobe crosstalk.

  • Eizo Turbo240: Seen on the Eizo Foris FG2421 monitor, a legend of a monitor and the first gaming oriented VA monitor ever. Turbo240 utilizes 240 Hz interpolation (internally converts the refresh rate to 240 Hz) and then applies backlight strobing at 240 Hz intervals, so it has the least obvious flicker (the faster the strobing, the less visible the flicker). Turbo240 has the least impact on maximum brightness (the FG2421 can still output over 250 cd/m2 with Turbo240 enabled) and it functions as low as 60 Hz, but no matter the setting it interpolates to 240 Hz. I thought the interpolation would be a bad thing, but it doesn't seem to be. I don't feel the extra lag, although some superhuman Counter-Strike players and the like would. Motion clarity is superior to ULMB on my Acer Predator XB270HU, with less ghosting. Even Turbo240 at 60 Hz looks surprisingly good, better than not using it.
  • BenQ Blur Reduction: Considered the best by those who have tried all of the above. Can achieve much more max brightness than LightBoost and ULMB, but much less than Turbo240. Works at 144 Hz unlike all the others, even works down to 60 Hz but I wouldn't expect this to look good. Minimal impact on contrast and colors. At 144 Hz and 120 Hz the backlight strobing is 1:1 with refresh rate.
  • Samsung Blur Reduction: By this I mean the method found on Samsung's CFG70 monitors. The entire backlight doesn't blink at once; it strobes in four separate scanning areas, which removes strobe crosstalk! Definitely the best implementation, except that it has locked brightness for some reason. Single strobes at 100 Hz, 120 Hz, and 144 Hz, double strobes below 100 Hz.

My rule of thumb? Single strobe blur reduction is only ideal at 120 Hz (with V-Sync) or higher. Maybe 110 Hz too but I can't test that with ULMB. On a TN monitor, I'm guessing it'd be beneficial at 100 Hz as well, but not on IPS.

Also, I love Eizo Turbo240. To me it's superior to ULMB. Less flicker obviously due to the 240 Hz strobing, but surprisingly it's smoother. Less ghosting. Turbo240 is actually beneficial at 60 Hz, 80 Hz, and 120 Hz, and doesn't look terrible when frame rate drops below refresh rate. I want more Turbo240 monitors. Or "Turbo288" since 144 Hz is all the rage nowadays.

Here is a brief slow-motion video I captured showing off 100 Hz ULMB backlight strobing on my Acer Predator XB270HU, with reduced pulse width (so shorter strobes). Notice that most of the strobing or blinking is toward the bottom of the screen, which explains why strobe crosstalk is worse toward the bottom.

Also be sure to read this article about display panel technologies (e.g. TN, IPS, VA, OLED).

Posted : 02/01/2016 7:34 pm