> That will help save enormous amounts of power: up to 48 percent on a single charge,
Why does refresh rate have such a large impact on power consumption? I understand that the control electronics are 60x more active at 60 Hz than 1 Hz, but shouldn't the light emission itself be the dominant source of power consumption by far?
I used to be a display architect about 15 years back (for Qualcomm mirasol, et al), so my knowledge of the specifics / numbers is outdated. Sharing what I know.
High pixel density displays have disproportionately higher refresh power: it's not just proportional to the total number of pixels, because the column-line capacitances have to be driven again for writing each row of pixels. This was an important concern as high pixel densities were coming along.
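A back-of-the-envelope sketch of that scaling. All the capacitance, voltage, and resolution numbers below are illustrative assumptions, not figures from any real panel:

```python
# Back-of-envelope: energy to refresh a panel scales with
# rows * (column-line capacitance), and the column-line capacitance
# itself grows with row count, so refresh power rises faster than
# pixel count. All numbers are illustrative assumptions.

def refresh_power(rows, cols, cap_per_cell_f=50e-15, v_swing=5.0, hz=60):
    # Each column line is loaded by every cell along it.
    col_line_cap = rows * cap_per_cell_f
    # Writing one row swings all column lines once: E = C * V^2 per line.
    energy_per_row = cols * col_line_cap * v_swing**2
    # A full frame writes every row; power = energy/frame * refresh rate.
    return rows * energy_per_row * hz

p_1080 = refresh_power(1080, 1920)
p_2160 = refresh_power(2160, 3840)  # 4x the pixels
print(p_2160 / p_1080)  # ~8x the refresh power in this model
```

So in this toy model, doubling the linear resolution quadruples the pixel count but roughly octuples the refresh energy, because the column lines get longer too.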
Displays need fast refreshing not just because pixels would lose charge, but because a refresh itself can be visible or result in flicker. Some pixel technologies require flipping polarity on each refresh, but the drive curves are not exactly symmetric between the polarities, and this asymmetry can also vary across the panel. A fast enough refresh hides the mismatch.
Since you are knowledgeable about this, do you have any idea what happened to Mirasol technology? I was fascinated by those colour e-paper-like displays, and disappointed when plans to manufacture it were shelved. Then I learnt Apple purchased it, but it looks more like a patent-padding purchase than one for tech development, as nothing has come out of it from Apple either. Is it in some way still being developed, or are parts of its research being used in display development?
Being a key technology architect for it (not the core inventor), I know all about it, and then some more!
I cannot however talk publicly about it. :-(
It has been a disappointment for me as well. I had worked on it for nearly eight years. The idea was so interesting: using thin-film interference to create images is akin to shaping Newton's rings into arbitrary images, something which even Newton would not have imagined! The demos and comparisons we showed to various industry leaders, and sometimes publicly, were often instantly compelling. The people/engineers on the team were mostly the best I have ever worked with, and I still maintain a great connection with them. But unfortunately, there were problems (I won't say how much tech, how much people) that were recognized by some but never got addressed in time. And a tech like it does not exist to date.
I do not think anything on it is being developed further.
The earliest of the patents would have expired by now.
Liquavista, Pixtronix, etc., have been alternative display technologies that also ultimately didn't make the desired impact, AFAIK.
Meanwhile, LCDs developed high pixel densities (which led to pressures on mirasol tech too), Plasma got sidelined. EInk displays have since then made good progress, though, in my opinion, are still far from colors and speeds that mirasol had. And of course, OLED, Quantum dots, ...
My fantasy display would be some kind of reflective-mode display that can passively show static images like e-ink, have partial updates like MIP LCD in wearables, response times like modern LCD and AMOLED, and "super-real" contrast/gain.
I.e. actually do wavelength conversion to not just reflect a narrow-pass filtered version of the ambient light, but convert that broad spectrum energy into the desired visuals, so it isn't always inherently dimmer than the environment. I can only imagine this being either:
1. some wild materials science stuff that manages interference
2. some wild materials science stuff that controls multi-photon fluorescence
3. some wild materials science stuff to fuse photoelectric and electroemissive functions in the same panel, i.e. not really passive, but an extremely low-loss active system that double-converts the ambient light and can follow the power curve of the available light
>> My fantasy display would be some kind of reflective-mode display that can passively show static images like e-ink, have partial updates like MIP LCD in wearables, response times like modern LCD and AMOLED, and "super-real" contrast/gain.
What about cost? :-) It is an important factor too outside of the fantasy world, and it can kill new display technologies. The latter often suffer from yield issues (dead pixels, etc.) during early phases of R&D, which can make initial costs even higher compared to already-mature technologies.
>> I.e. actually do wavelength conversion to not just reflect a narrow-pass filtered version of the ambient light, but convert that broad spectrum energy into the desired visuals
Reflecting a filtered version of the ambient light, if done efficiently, makes the display about as bright as other natural/common objects around it. So it should be good enough for most purposes, even in somewhat darker ambient light once the eyes adjust.
It would not, however, be attention-grabbing by being brighter than those surrounding objects. So many users, often used to seeing brighter emissive displays, still do not pick such displays as a preference.
>> I can only imagine this being either:
>> ...
Another way to make it look brighter is to reflect more light towards the users/eyes while capturing it from broader directions. This would compromise viewing angle (unless more fantasy tech is brought in), but I think this in itself takes the display to wow levels.
Well, the reflectivity of color MIP LCD is not very satisfactory. It is barely adequate, even for people like me who are fans. This is both because of the narrow-band RGB filtering and the inherent losses of the polarization-based switching method. Even the "white" state is discarding most polarizations of the ambient light, and then the darker colors are even blocking that.
My fantasy is having the reflectivity be at least as good as good white paper, and with deep contrast too.
It also needs to be brighter in practice than normal objects because, no matter what, it will have to overcome some glare from whatever protective glass and touch sensing layers there are over the actual display.
I think the idea is that in an always-on display mode, most of the screen is black and the rest is dim, so circuitry power budget becomes a much larger fraction of overhead.
I interpreted that bit as E2E system uptime being up by 48%. Sounds more plausible to me, as there'd be fewer video frames that would need to be produced and pushed out.
This is an OLED display, so I don't think the control electronics are actually any less active. (They would be for LCD, which is where most of these low-refresh-rate optimizations make sense.)
The connection between the GPU and the display has been run length encoded (or better) since forever, since that reduces the amount of energy used to send the next frame to the display controller. Maybe by "1Hz" they mean they also only send diffs between frames? That'd be a bigger win than "1Hz" for most use cases.
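As a toy illustration of why sending diffs beats any whole-frame encoding for mostly-static content. The byte layout here (a coordinate pair plus color per changed pixel) is made up for the sketch, not any real link protocol:

```python
# Toy sketch: bytes on the display link for a full frame vs. a diff
# of only the changed pixels. Purely illustrative wire format.

def full_frame_bytes(w, h, bpp=3):
    # Uncompressed frame: width * height * bytes-per-pixel.
    return w * h * bpp

def diff_bytes(changed_pixels, bpp=3, coord_bytes=4):
    # Assume each changed pixel is sent as (x, y) plus its new color.
    return changed_pixels * (coord_bytes + bpp)

w, h = 1920, 1080
full = full_frame_bytes(w, h)   # ~6.2 MB per uncompressed frame
blink = diff_bytes(16 * 24)     # e.g. a 16x24 blinking text cursor
print(full // blink)            # diff is ~2300x smaller here
```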
But, to answer your question, the light emission and computation of the frames (which can be skipped for idle screen regions, regardless of frame rate) should dwarf the transmission cost of sending the frame from the GPU to the panel.
The more I think about this, the less sense it makes. (The next step in my analysis would involve computing the wattage requirements of the CPU, GPU, and light emission, then comparing that to the Wh of the laptop battery plus the advertised battery life.)
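That sanity check is simple arithmetic. Here's a sketch with assumed numbers; the battery capacity, advertised battery life, and panel power are all guesses, not figures from the article:

```python
# Sanity-check sketch: average system power implied by battery size and
# advertised battery life, vs. a guessed display share. All assumptions.

battery_wh = 70.0        # typical laptop battery capacity (assumption)
advertised_hours = 20.0  # advertised battery life (assumption)
avg_power_w = battery_wh / advertised_hours
print(avg_power_w)       # 3.5 W average across the whole system

display_w = 2.0          # guessed panel power at low brightness
print(display_w / avg_power_w)  # display would be ~57% of that budget
```

With numbers in that range, the display really can dominate the idle power budget, which is when the 1Hz mode would matter.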
Really disappointing to only learn this after a decade, but on Linux, changing from 60Hz to 40Hz decreased my power draw by 40% in the hour since reading this comment.
Before OLED (and similar), most displays were lit with LEDs (behind or around the screen, through a diffuser, then through liquid crystals) which was indeed the dominant power draw... like 90% or so!
But the article is about an OLED display, so the pixels themselves are emitting light.
It doesn't. They take extreme use cases, such as watching video at maximum brightness until the battery depletes, where 90% of power consumption is the display. But in realistic use cases, when the CPU is actually doing things, the fraction of power draw consumed by the display is much smaller.
> HKC has announced a new laptop display panel that supports adaptive refresh across a 1 to 60Hz range, including a 1Hz mode for static content. HKC says the panel uses an Oxide (metal-oxide TFT) backplane and its low leakage characteristics to keep the image stable even at 1Hz.
Sorry, might be obvious to some, but is that rate applied to the whole screen or can certain parts be limited to 1Hz whilst others are at a higher rate?
The ability to vary it seems like it would be valuable as there are significant portions of a screen that remain fairly static for longer periods but equally there are sections that would need to change more often and would thus mess with the ability to stick to a low rate if it's a whole screen all-or-nothing scenario.
Anyone who has accidentally snapped the controller off a working LCD can tell you that the pixel capacitance keeps the colours approximately correct for about 10 seconds before it all becomes a murky shadowy mess...
So it makes sense you could cut the refresh time down to a second to save power...
Although one wonders if it's worth it when the backlight uses far more power than the control electronics...
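A toy leakage model shows why an oxide backplane helps here: with the low leakage (i.e. high leakage resistance) that the article attributes to oxide TFTs, the voltage droop over a one-second refresh interval stays tiny. The component values below are pure assumptions:

```python
import math

# Toy model: a pixel's storage capacitor discharging through leakage.
# V(t) = V0 * exp(-t / (R_leak * C)); both values are assumptions.
c = 100e-15     # storage capacitance (assumption)
r_leak = 1e16   # leakage resistance; oxide TFTs leak far less than a-Si
tau = r_leak * c            # RC time constant, seconds
droop = 1 - math.exp(-1.0 / tau)  # fractional voltage loss after 1 s
print(tau)      # ~1000 s
print(droop)    # ~0.1% loss per second: invisible at a 1Hz refresh
```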
Horrid website: forced cookies, invisible adverts (Mamma Mia, anyone?), and that thing where it’s a page of garbage links when you go back. I will never click a PC World URL again.
I'm guessing that for this to work you need to be able to selectively refresh parts of the screen at different rates? a 1Hz refresh rate would be rubbish just to follow the mouse cursor, so at least that part of the screen needs to refresh faster. However, it does make sense for the parts of the screen that are mostly static. Looking at my screen as I type this, the only part that needs a high-refresh rate is the text-box where I'm typing (I can type several keys per second so I wouldn't want a refresh rate of 1 Hz). However, the rest of the screen is not changing at all so a slow refresh is perfectly fine.
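A dirty-tile comparison, like desktop compositors use, could drive that kind of per-region refresh. A hypothetical sketch; the 8-pixel tile size and the list-of-lists frame representation are made up for illustration, not how any real panel controller works:

```python
# Toy model of per-region refresh: find the tiles that changed between
# two frames; only those would need rescanning at the high rate while
# the rest of the screen sits at 1Hz. Hypothetical sketch.

def dirty_rects(prev, curr, tile=8):
    """Compare two frames tile-by-tile; return (x, y) of changed tiles."""
    h = len(prev)
    dirty = []
    for y in range(0, h, tile):
        for x in range(0, len(prev[0]), tile):
            if any(prev[yy][x:x + tile] != curr[yy][x:x + tile]
                   for yy in range(y, min(y + tile, h))):
                dirty.append((x, y))
    return dirty

prev = [[0] * 32 for _ in range(32)]
curr = [row[:] for row in prev]
curr[5][5] = 1  # a single pixel changed, e.g. the text caret
print(dirty_rects(prev, curr))  # [(0, 0)] -- only one 8x8 tile is dirty
```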
Modern software regularly takes like 1 second to load anyways.
200ms is the minimum human reaction time, so adding 100ms would only add like 50% to the REPL user interaction. Something like 10Hz might be quite usable while minimally contributing to lag.
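The arithmetic behind that, with the roughly 200 ms reaction time taken as an assumption:

```python
# Extra latency a low refresh rate adds on top of human reaction time.
reaction_ms = 200.0  # rough minimum human reaction time (assumption)

# Worst case: the update just missed a refresh and waits a full period.
added = {hz: 1000.0 / hz for hz in (60, 10, 1)}
for hz, ms in added.items():
    print(f"{hz}Hz adds up to {ms:.0f}ms "
          f"({ms / reaction_ms:.0%} of reaction time)")
```

At 10Hz the worst-case wait is 100 ms, i.e. about 50% on top of the reaction time, which is where the "quite usable" estimate comes from; at 1Hz it balloons to 500 ms.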
The idea of having a 60Hz screen is nice, but in practice it turns out that display refresh rate is not the bottleneck for most software.
Sure, dropping toward 1Hz could be huge. But the moment you scroll, watch video, or even have subtle UI animations, you're back in higher-refresh territory.
This is just regurgitating the manufacturer's claim. I'll believe it when I see it. Most of a display's energy use goes into driving the OLED/backlight. They're claiming that because their display flickers less, it's 48% more efficient now.
The Apple Watch Series 5 (2019) has a refresh rate down to 1Hz.
M4 iPad Pro lacks always-on display despite OLED panel with variable refresh rate (2024):
https://9to5mac.com/2024/05/09/m4-ipad-pro-always-on-display...
> Source: https://videocardz.com/newz/hkc-reveals-1hz-to-60hz-adaptive... [2025-12-29]
> A 1Hz panel is almost, but not quite, on the level of an e-ink panel, which isn’t the prettiest to look at.
Level of what? Power consumption? Dude, e-ink takes zero power between refreshes.
And e-ink is pretty?
Saving battery is nice, but I'm not leaving Linux for that misery any time soon
Apple already uses similar tech on the phones and watches.
There are also mini-LED laptops for creative work. A few more things to check before buying a new laptop.