• 0 Posts
  • 24 Comments
Joined 2 years ago
Cake day: July 2nd, 2023

  • Nobara: Has all the gaming features I want on my gaming PC (like Gamescope) and is HTPC-capable. Also, it’s based on Fedora, which I’m familiar with.

    Fedora: I like GNOME, and it’s always fairly up to date and rock solid. Great on my laptop.

    Have considered switching to openSUSE, though. It’s German (as am I), it was the first Linux distro I ever used (on my granddad’s PC, more than a decade ago), and I’ve heard a lot of good things about Tumbleweed.




  • Game dev salaries have increased roughly in line with inflation, though, so development time costs studios about as much as it did 15 years ago, while AAA game prices are only now starting to surpass the $70 mark; games didn’t generally exceed $60 until 2020 (quick math below).

    It’s a wonder they haven’t raised prices any sooner, as much as I’d like them to stay where they were.

    And again: if you don’t like the prices, vote with your wallet. Buy used, buy on sale, or don’t pay at all.
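
    As a rough sanity check of that inflation point, here’s a quick back-of-the-envelope calculation. The cumulative CPI factor is my own approximation (roughly 45% US inflation from 2010 to 2024), not an official figure:

    ```python
    # Rough check: what does a 2010 USD game price look like in today's dollars?
    CPI_FACTOR_2010_TO_2024 = 1.45  # assumed cumulative US CPI factor, approximate

    def adjust_for_inflation(price_2010: float) -> float:
        """Convert a 2010 USD price into approximate 2024 USD."""
        return price_2010 * CPI_FACTOR_2010_TO_2024

    for price in (60, 70):
        print(f"${price} in 2010 is roughly ${adjust_for_inflation(price):.0f} today")
    # -> $60 is roughly $87 today; $70 is roughly $102
    ```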


  • Yeah, I don’t generally disagree, especially if you’re someone who plays games for hundreds of hours instead of dozens.

    But $100 is still a lot of money for a lot of people. I’d have to save up for months for that (I’m a trainee with less than €1,000 per month for rent, food, internet, gas, etc.), so I’d rather wait until I can get games cheaper.


  • Eh, there’s some truth to both. Game development is expensive, and pricing hasn’t kept up with inflation ($60 in 2010 is almost $90 today). But games are also ridiculously expensive at full price, especially in today’s economy, and especially when they’re as badly received as Skull and Bones, while Nintendo games are at least usually pretty decent.

    I’d recommend voting with your wallet and only buying games on sale or used. Just wait a little. (Or pirate them, if you can live with not supporting the developers at all.)


  • No, HDR can’t make your monitor brighter than it is. But it can take full advantage of the brightness and contrast of modern displays in a way SDR cannot. In almost every case HDR looks better than SDR, but brighter and higher-contrast displays benefit the most.

    In more technical terms, SDR content is mastered with a peak brightness of 100 nits in mind. HDR is typically mastered for a peak brightness of 1,000 nits, sometimes 2,000, and for the correspondingly higher contrast (a rough comparison follows below).

    If you don’t watch movies in HDR on a modern TV, you’re not taking full advantage of its capabilities.
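
    To put those mastering targets in perspective, dynamic range is often compared in photographic stops (doublings of luminance). A minimal sketch; the black levels here are illustrative assumptions, not measured values:

    ```python
    import math

    def stops(peak_nits: float, black_nits: float) -> float:
        """Dynamic range in stops: each stop is a doubling of luminance."""
        return math.log2(peak_nits / black_nits)

    # Peak values follow common mastering targets; black levels are assumed.
    print(f"SDR, LCD-ish black:  {stops(100, 0.1):.1f} stops")     # ~10.0
    print(f"HDR, OLED-ish black: {stops(1000, 0.005):.1f} stops")  # ~17.6
    ```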


  • That’s incorrect. While HDR content can generally be assumed to use at least 10-bit colour, 10-bit is not strictly required of either the monitor or the content. The main difference is contrast and brightness: SDR is mastered for a brightness of 100 nits and fairly low contrast, while HDR is mastered for brightnesses of usually 1,000 or even 2,000 nits, since modern displays are brighter and capable of higher contrast, and can thus produce a more lifelike picture from the additional information HDR carries. (The bit-depth difference is illustrated below.)

    Of course, you need a sufficiently bright and/or high-contrast monitor for it to make a difference. An OLED screen or a display with many local-dimming zones produces the best results there. But even a cheap 350-nit TV can look a bit better in HDR.
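
    As for the bit-depth side of that point, the benefit of 10-bit is simply finer tonal steps per channel, which helps avoid banding across HDR’s wider brightness range. A quick illustration:

    ```python
    # Tonal levels per colour channel at a given bit depth (2**bits values).
    for bits in (8, 10, 12):
        print(f"{bits:2d}-bit: {2**bits:5d} levels per channel")
    # 8-bit: 256, 10-bit: 1024, 12-bit: 4096. More levels mean finer
    # gradations, which helps avoid visible banding across the much
    # wider brightness range that HDR covers.
    ```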