I know how to buy xorg users
![](https://lemmy.world/pictrs/image/484dc1bd-5cab-4537-8ece-a5cc28a3b70e.jpeg?format=webp)
Nah, I don't need HDR
HDR is like RGB, sometimes cool if done really well but usually just a useless selling point.
Yeah I'll stick with my monochrome 2 color display for now.
Joke's on you I can't afford an HDR display & also I'm colorblind.
You can still profit from the increase in brightness and contrast! Doesn’t make a good HDR screen any cheaper though…
Was gonna say the same thing. HDR is like FLAC and expensive amps for audiophiles. Maybe we should start calling them visualphiles? 🤷♂️
"FLAC? Mate I destroyed my ears when I was 14 and listening to Linkin Park MP3s grabbed off Kazaa in the cheapest chinese earbuds my allowance could buy, at the highest volume my fake iPod could drive. I cannot hear the subtleties in your FLAC if I tried."
Cheek aside I believe the word would be Videophiles to pair with Audiophiles.
Eh, SDR to HDR is a waaay bigger jump than MP3 to FLAC. Assuming of course you have an actual HDR display, and not one of those "HDR" displays that only have like 400 nits of peak brightness.
HDR? Ah, you mean when videos keep flickering on Wayland!
I will switch when I need a new GPU.
Now that explicit sync has been merged, this will be a thing of the past.
And it was never a thing on AMD GPUs.
videos? everything flickers for me on wayland. X.org is literally the only thing keeping me from switching back to windows right now.
Wayland has started to support Explicit Sync which can fix the behavior of Nvidia's dumpster fire of a driver
For me it's just games, I'm guessing it's an Nvidia GPU? I hope explicit sync helps with that.
You want to win me over? For starters, provide a layer that supports all hooks and features in `xdotool` and `wmctrl`. As I understand it, that's nowhere near present, and maybe even deliberately impossible "for security reasons".
I know about `ydotool` and `dotool`. They're something, but definitely not drop-in replacements.
Unfortunately, I suspect I'll end up being forced onto Wayland at some point because the easy-use distros will switch to it, and I'll just have to get used to moving and resizing my windows manually with the mouse. Over and over. Because that's secure.
I think the Wayland transition will not be without compromises
May I ask why you don't use tiling window managers if you don't like to move windows with the mouse?
I think it's possible to make such a tool for Wayland, but in Wayland stuff like that is completely up to the compositor.
So ask the compositor developers to expose the required shit and you can make such a tool.
> Unfortunately, I suspect I’ll end up being forced onto Wayland at some point because the easy-use distros will switch to it, and I’ll just have to get used to moving and resizing my windows manually with the mouse. Over and over. Because that’s secure.
I think you were being sarcastic but it is more secure. Less convenient though.
I'm not sure if that's what you're looking for, but KDE has nice window rules that can affect all sorts of settings: placement, size, appearance, etc. Lots of options. And you can match them to specific windows or the whole application. I use it for a few things, mostly to place windows on certain screens and in certain sizes.
OK but can you please call NVidiachan? I know you two don't get along but maybe you can ask her for some support?
NVidiachan is busy selling GPUs for AI, but she is also working on adding explicit sync.
If I understand correctly, Nvidia isn't doing anything specific for explicit sync; its driver just doesn't support implicit sync, which is what Wayland currently uses because explicit sync isn't there yet. Explicit sync would work with existing Nvidia drivers.
hahahaha tell that to nvidia users
Smart Nvidia users are ex Nvidia users
Actually, wait until the next DE releases hit the repos; all the Nvidia problems just got solved.
Obviously it won't be all of them but I too am very excited about not having to get lucky with my games flickering or not.
I'm a happy Nvidia user on Wayland. Xorg had a massive bug that forced me to try out Wayland, and it has been really nice and smooth. I was surprised, seeing all the comments. But I might've just gotten lucky.
May 15, Arch users are going to be downloading the NoVideo Wayland Driver.
How will HDR affect the black screen I get every time I've tried to launch KDE under Wayland?
It will make your screen blacker.
But in all seriousness, your display manager might not support Wayland. Try something like SDDM.
You already had me at "144hz on one monitor and 60hz on the other so I can enjoy the nice monitor without having to buy a new secondary one."
Option "AsyncFlipSecondaries" "true"
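For anyone who wants to try that: a minimal sketch of where the option would go, assuming the modesetting driver (the file path and Identifier here are just examples):

```
# e.g. /etc/X11/xorg.conf.d/20-asyncflip.conf (example path)
Section "Device"
    Identifier "Main GPU"
    Driver "modesetting"
    # The option quoted above: lets secondary monitors flip without
    # dragging the fastest monitor down to their refresh rate
    Option "AsyncFlipSecondaries" "true"
EndSection
```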
It's not ready yet.
The protocol for apps/games to make use of it is not yet finalized.
The protocol won't be "finalized" for a long time as new shit is proposed every day.
But, for me and many others, it has had enough protocols to work properly for some years now. Right now I'm using Wayland exclusively with some heavy workloads and 0 issues.
I can play games and watch videos in HDR though
Been watching this drama about HDR for a year now, and still can't be arsed to read up on what it is.
HDR or High Dynamic Range is a way for images/videos/games to take advantage of the increased colour space, brightness and contrast of modern displays. That is, if your medium, your player device/software and your display are HDR capable.
HDR content is usually mastered with a peak brightness of 1000 nits or more in mind, while Standard Dynamic Range (SDR) content is mastered for 80-100 nit screens.
HDR makes stuff look really awesome. It's super good for real.
Until I can switch without any interruption to gaming compatibility, I definitely don't want to, sorry.
HDR is cool and I look forward to getting that full game compatibility and eventually making the switch, but it's just not there yet.
Me, not much of a gamer and not a movie buff and having no issues with the way monitors have been displaying things for the past 25 years: No.
When I could no longer see the migraine-inducing flicker while being irradiated by a particle accelerator shooting a phosphor coated screen in front of my face, I was good to go.
It was exciting when we went from green/amber to color!
HDR is almost useless to me. I'll switch when wayland has proper remote desktop support (lmk if it does but I'm pretty sure it does not)
Seems like there's a bunch of solutions out there:
> As of 2020, there are several projects that use these methods to provide GUI access to remote computers. The compositor Weston provides an RDP backend. GNOME has a remote desktop server that supports VNC. WayVNC is a VNC server that works with compositors, like Sway, based on the wlroots library. Waypipe works with all Wayland compositors and offers almost-transparent application forwarding, like `ssh -X`.
Do these not work for your use case?
MPV playback and games aren’t everything, but it’s a decent start.
KDE on Wayland doesn't even have sticky keys.
Still can't use Barrier/Synergy with it 🤷🏻♂️
Jokes on you I use NVIDIA
Cries
I'm not touching Wayland until it has feature parity with X and gets rid of all the weird bugs like cursor size randomly changing and my jelly windows being blurry as hell until they are done animating
Not sure why you're getting downvoted. I wish I could switch, but only X works reliably.
Have you tried Plasma 6?
Network transparency OR BUST
Sure, let me dust off my fucking SPARCStation and connect up to my fucking NIS server so I can fuck off and login to my Solaris server and run X11
Fucking WHO needs mainframe-oriented network transparency in the 21st century? Leave that shit in 1989 where it belongs.
Does Wine run on Wayland?
Edit: had to look up what the hell HDR is. Seems like a marketing gimmick.
It isn't, it's just that marketing is really bad at displaying what HDR is about.
HDR means each color channel that used 8 bits can now use 10 bits, sometimes more. That means an increase from 256 shades per channel to 1024, allowing a higher range of shades to be displayed in the same picture and avoiding the color banding problem.
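Just for scale, the shade counts work out like this (quick back-of-the-envelope Python, not tied to any particular HDR spec):

```python
# Shades per channel and total RGB combinations for 8-bit vs 10-bit color
for bits in (8, 10):
    shades = 2 ** bits      # 256 for 8-bit, 1024 for 10-bit
    colors = shades ** 3    # three channels: red, green, blue
    print(f"{bits}-bit: {shades} shades per channel, {colors:,} colors total")
# 8-bit:  256 shades per channel, 16,777,216 colors total
# 10-bit: 1024 shades per channel, 1,073,741,824 colors total
```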
That’s just 10-bit color, which is a thing and does reduce banding, but is only a part of the various HDR specs. HDR also includes a significantly increased color space, as well as varying methods (depending on what spec you’re using) of mapping the mastered video to the capabilities of the end user’s display. In addition, to achieve the wider color gamut required, HDR displays need some level of local dimming to increase the contrast by adjusting the brightness of various parts of the image, either through backlight zones in LCD technologies or by adjusting the brightness per pixel in LED technologies like OLED.
Thank you.
I assume HDR has to be explicitly encoded into images (and moving images) to be true HDR, otherwise it's just upsampled? If that's the case, I'm also assuming most media out there is not encoded with HDR, and further, if that's correct, does it really make a difference? I'm assuming upsampling means inferring new values, probably using Gaussian filtering, dithering, or some other method.
Somewhat related: my current screens support 4K, but when watching a 4K video at 60fps side by side on a 4K screen and a 1080p one, I couldn't see any difference. It wouldn't surprise me if it were the same with HDR, but I might be wrong.
I have never seen banding before; the image seems specifically picked to show the effect. I know it's common when converting to fewer than 256 colors, e.g. if you turn images into SVGs for some reason, or GIFs (actual GIFs, not video).
Also dithering exists.
Anyway, it'll surely be standard at some point in the future, but it's very much a small quality improvement and not something one definitely needs.
HDR is not just marketing. And, unlike what other commenters have said, it’s not (primarily) about greater colour bit depth. That’s only a small part of it. The primary appeal is the following:
Old CRTs and early LCDs had very little brightness and very low contrast. Thus, video mastering and colour spaces reflected that. Most non-HDR ("SDR") films and series are mastered with a monitor brightness of 80-100 nits in mind (depending on the exact colour space), so the difference between the darkest and the brightest part of the image can also only be 100 nits. That’s not a lot. Even the cheapest new TVs and monitors exceed that by more than double. And of course, you can make the image brighter and artificially increase the contrast, but that’s the same as upsampling DVDs to HD or HD to 4K.
What HDR set out to do was provide a specification for video mastering that takes advantage of modern display technology. Modern LCDs can get as bright as multiple thousands of nits, and OLEDs have near-infinite contrast ratios. HDR provides a mastering process with usually 1000-2000 nits peak brightness (depending on the standard), and thus also higher contrast (the darkest and brightest parts of the image can be over 1000 nits apart).
Of course, to truly experience HDR, you’ll need a screen that can take advantage of it: OLED TVs, bright LCDs with local dimming zones (to increase the possible contrast), etc. It is possible to profit from HDR even on screens that aren’t as good (my TV is an LCD without local dimming and only 350 nits peak brightness, and it does make a noticeable difference, although not a huge one), but for "real" HDR you’d need something better. My monitor, for example, is VESA DisplayHDR 600 certified, meaning it has a peak brightness of 600 nits plus a number of local dimming zones, and the difference in supported games is night and day. And that’s still not even near the peak of HDR capabilities.
tl;dr: HDR isn’t just marketing or higher colour depth. HDR video is mastered to the increased capabilities of modern displays, while SDR ("normal") content is not.
It’s more akin to the difference between DVD and BluRay. The difference is huge, as long as you have a screen that can take advantage.