• 2 Posts
  • 180 Comments
Joined 1 year ago
Cake day: June 4th, 2023

  • gamescope is a mess inside of Flatpak right now. There's some issue with running Proton inside gamescope inside Flatpak.

    I played Alan Wake 2 using the native Heroic version and native gamescope.

    The gamescope version still needs to be pretty recent. After starting the game, make sure its brightness does not change when you play with the SDR intensity slider. If it does, the game is not using HDR.

    Also, Alan Wake is a pretty bleak looking game in general so you might want to pop some flares to bring some color for testing.
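    For reference, a sketch of what the launch setup can look like with native Heroic and a recent native gamescope. The flags and environment variable are ones recent gamescope/Proton builds use, but names have changed between versions, so treat this as a starting point rather than a guaranteed recipe:

    ```shell
    # Sketch of an HDR launch wrapper for a recent native gamescope build.
    # Flag names vary between gamescope versions; check `gamescope --help`.
    # DXVK_HDR=1 asks Proton/DXVK to expose HDR to the game.
    DXVK_HDR=1 gamescope -W 3840 -H 2160 --hdr-enabled -- %command%
    ```

    `%command%` is Steam's placeholder for the game command; Heroic has an equivalent wrapper-command field in the game settings.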


  • Once again, the format doesn’t work for me when the main topic is about a fad that nobody talks about anymore.

    It worked in South Park for a long time because they had a relevant episode a week or two after it happened. In Futurama, not so much.

    The Bender story was pretty neat though. They could have left out all of the NFT stuff and focused just on the Bender plot and it would have been a significantly better episode.


  • So what’s the big fuggin’ problem here? That Intel won’t use the term “recall”?

    Would you say the same thing about a car?

    “We know the door might fall off but it has not fallen off yet so we are good.”

    The chances of that door hurting someone are low and yet we still replace all of them because it’s the right thing to do.

    These processors might fail any minute and you have no way of knowing. There are people who depend on these for work, and systems running essential services. Even worse, they might fail silently, corrupting something in the process or causing unnecessary debugging effort.

    If I were running those processors in a company I would expect Intel to replace every single one of them at their cost, before they fail or show signs of failing.

    Those things are supposed to be reliable, not a liability.


  • Domi@lemmy.secnd.me to Linux Gaming@lemmy.world, "HDR Confusion" (3 months ago)

    > But why does it end up washing out colors unless I amplify them in kwin? Is just the brightness absolute in nits, but not the color?

    The desktop runs in SDR, and the color space differs between SDR and HDR, meaning you end up with washed-out colors when you display SDR content in HDR mode as is.

    When you increase the slider in KDE, you change the tone mapping, but no tone mapping is perfect, so you might want to leave it at the default 0% and use HDR mode only for HDR content. In KDE, for example, colors are blown out when you set the color intensity to 100%.

    > Why does my screen block the brightness control in HDR mode but not contrast? And why does the contrast increase the brightness of highlights, instead of just split midtones towards brighter and darker shades?

    In SDR, your display is not sent absolute values, meaning you can pick what 100% means; that's your usual brightness slider.

    In HDR, your display is sent absolute values. If the content you’re displaying requests a pixel with 1000 nits your display should display exactly 1000 nits if it can.

    Not sure about the contrast slider, I never really use it.
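    Those absolute values are typically carried by the PQ transfer function (SMPTE ST 2084), which maps luminance in nits directly to a signal code value; that is why the display, not the user, owns brightness in HDR mode. A minimal sketch of the inverse EOTF, with constants from the standard:

    ```python
    # SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in nits -> code value.
    # The signal itself encodes nits, which is why HDR brightness is fixed.
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32

    def pq_encode(nits: float) -> float:
        """Encode an absolute luminance (0..10000 nits) as a PQ code value (0..1)."""
        y = max(nits, 0.0) / 10000.0  # normalize to the 10,000-nit PQ range
        yp = y ** m1
        return ((c1 + c2 * yp) / (1 + c3 * yp)) ** m2

    # SDR reference white (100 nits) lands around 0.51; 1000 nits around 0.75,
    # so most of the code range is spent well below peak brightness.
    ```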

    > Why is truehdr400 supposed to be better in dark rooms than peak1000 mode?

    Because 1000 nits is absurdly bright, almost painful to watch in the dark. I still usually use the 1000 mode and turn on a light in the room to compensate.

    > Why is my average emission capped at 270 nits, that seems ridiculously low even for normal SDR screens as comparison.

    Display technology limitations. OLED screens can only display full brightness over a certain portion of the screen (e.g. 10% for 400 nits and 1% for 1000 nits) before having to dim. That makes HDR mode mostly unusable for desktop usage, since your screen will dim/brighten as you move large white or black areas around.

    OLED screens simply can’t deliver the brightness of other display technologies but their benefits easily make it worth it.
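    The window-size limits above can be sketched as a toy lookup. The thresholds and nit figures here are just the illustrative numbers from this comment (including the 270-nit full-screen average), not a real panel's ABL curve:

    ```python
    # Toy model of OLED automatic brightness limiting (ABL), using the example
    # figures from the comment above; real panels publish their own curves.
    def peak_nits(window_pct: float) -> int:
        """Max sustained brightness for a white window covering window_pct% of the screen."""
        if window_pct <= 1:
            return 1000  # tiny highlights can hit full peak
        if window_pct <= 10:
            return 400   # mid-size bright areas are limited
        return 270       # full-screen average cap mentioned above
    ```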




  • We don’t have many unit tests that test against live APIs, most use mock APIs for testing.

    The only use for this header would be if somebody sees it during development, at which point the deprecation would already be in the documentation, or if you explicitly add a feature to check whether the header is present. I don't see that happening any time soon, since we get mailed about deprecations as well.


  • I don’t really get the purpose of a header like this; who is supposed to check it? It’s not like developers casually check the headers returned by an API every week.

    Write them a mail if you see deprecated functions being used by a certain API key; that’s probably much more likely to reach somebody.

    Also, TIL that the IETF deprecated the X- prefix more than 10 years ago. Seems like that one didn’t pan out.
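    For what it's worth, a sketch of what "a feature to check whether the header is present" could look like. The Deprecation and Sunset response header names come from the IETF's work in this area (Sunset is RFC 8594); the warn-once logic and URLs are purely illustrative:

    ```python
    # Hedged sketch: surface "Deprecation"/"Sunset" response headers instead of
    # hoping a developer happens to spot them in a network tab.
    import warnings

    _warned: set[str] = set()

    def check_deprecation(url: str, headers: dict) -> bool:
        """Return True if the response advertises deprecation; warn once per URL."""
        dep = headers.get("Deprecation")
        sunset = headers.get("Sunset")
        if (dep or sunset) and url not in _warned:
            _warned.add(url)
            warnings.warn(f"{url} is deprecated (Deprecation={dep!r}, Sunset={sunset!r})")
        return bool(dep or sunset)
    ```

    Wired into an HTTP client's response hook, this at least turns the silent header into a log line somebody might read.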