Not knowing how anything works, being scared by errors that you don’t know how to get around or deal with, not knowing alternatives for your former favourite apps to do things quickly, wondering if you can get the peripherals you currently own to run?

naah thanks mate, hard pass.
I mean, that’s how you start learning stuff - not knowing how something works
Being scared by errors that you don’t know how to get around or deal with
Isn’t that the case for every OS in existence? When something breaks, you don’t know how to deal with it. Enter google/ddg/whatever
Not knowing alternatives for your former favourite apps to do things quickly
See point 1 - and yet there are Linux apps that let you do things quicker than Windows stuff. I can’t imagine myself at this point having to use frigging photoshop to crop or add a border to an image when you could do that with a `magick -crop`
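For example - a rough sketch, where the file names, geometry and border size are just placeholders:

    # crop an 800x600 region, offset 10px from the left and 20px from the top
    magick input.jpg -crop 800x600+10+20 cropped.jpg
    # add a 10px black border around the image
    magick input.jpg -bordercolor black -border 10 bordered.jpg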
Wondering if you can get the peripherals you currently own to run?
Wasn’t that the whole point of live images? It’s not like they’ll charge you for downloading them. And hardware support is infinitely better today than back in the day. Just look at what the folks at Asahi did - that’s nothing short of incredible
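In case anyone hasn’t done it before, a minimal sketch - the ISO name and /dev/sdX are placeholders, so check the actual device with lsblk first:

    # write the downloaded live image to a USB stick, then boot from it to test your hardware
    sudo dd if=some-distro-live.iso of=/dev/sdX bs=4M status=progress conv=fsync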
My brother in penguin computers, that wasn’t a negative post. I just listed all those things to point out that switching OS can be really hard and that the first time getting into Linux ain’t all sunshine and rainbows. 😉
It really just takes one unfixable problem with something you need to screw up the whole experience. Doesn’t even have to be Linux’s fault; just thinking about god damn printers…
Tossing Gentoo onto an old Pentium III box, typing `emerge world` and coming back four hours later to see if it’s done was awesome.
And no, it wasn’t done compiling KDE yet.
But I definitely wouldn’t want to experiment with Linux on my only PC with no way to look things up if I break networking (or the whole system). Thankfully, this is no longer an issue in the age of smartphones.

I feel like this when supporting Windows servers and navigating Win 11/12 clients at work these days.
Yes, but Windows is normal and therefore all of its myriad problems are just part-and-parcel with using a computer and can be ignored. Linux is not normal, though, so the slightest roadbump is an instant deal-breaker.
There’s also the fact that if you have modern hardware, you’ll find that half the features that you paid for don’t work properly in Linux (or at all). It’s a great OS to keep an old PC alive, though.
That’s less of an issue these days. In the 2000s it was like that, especially since people used all sorts of add-in cards. These days a lot of those cards have merged with the mainboard (networking, sound, USB) or have fallen out of fashion (e.g. TV tuners).
The mainboard stuff is generally well-supported. The days of the Winmodem are over. The big issues these days are special-purpose hardware (which generally doesn’t work with later Windows versions either), laptops, and Nvidia GPUs (which are getting better).

Me, with two Sound Blaster cards (Z, AE-5) and an NVMe PCIe card (2 drives, RAID 0) in my main rig, along with a 10Gb card sitting on my parts cart: hello.
I said what I said because it’s relevant today. I literally had this issue last month with modern hardware, when I couldn’t get HDR working properly in KDE Plasma 6 (colors are washed-out and have no contrast when HDR is on). And features from my GPU are completely missing, like SDR-to-HDR conversion, AI upscaling, and the entire 3D Settings page (the one where you can change settings not available in-game). When I ask people for help with restoring these features/settings, no one has any idea what I’m talking about. So I gave up and went back to Windows.
Ah, the old Nvidia problem. It’s true that Nvidia’s Linux driver isn’t very good (although I don’t think their Windows driver is very good either, it just has more features).
The 3D Settings page is specific to the Nvidia Windows driver. Even an AMD user might’ve been slightly confused (although AMD ships comparable features, just located elsewhere under a different name). This is indeed something the Linux drivers plain don’t have in that form, although I can’t remember the last time I felt a need to really muck around in there.
Admittedly, overriding game rendering behavior might not even always be possible, seeing that DirectX games are run through a translation layer before the GPU gets to do anything.
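For instance, on Steam that layer is usually DXVK running under Proton, and you can watch it sit between the game and the GPU with its overlay. DXVK_HUD is DXVK’s own environment variable; the rest is just standard Steam launch-option syntax:

    # Steam > game Properties > Launch Options
    DXVK_HUD=devinfo,fps %command%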
I wasn’t able to find solid info for AI upscaling even on Windows, mainly because of the terrible name of that feature and because Nvidia offers both “AI Upscaling” and “Nvidia Image Scaling” and I have no idea if those are the same thing. The former seems to be specific to the Nvidia SHIELD.
Unless you’re talking about DLSS, which is supported.
The HDR one is odd but might again be related to the Nvidia driver not being very good. This should improve in the future but they are admittedly trailing behind.
See this is what I mean. You don’t even know what I’m talking about because these features don’t even exist in Linux yet. Thank you for confirming that the 3D Settings page still doesn’t exist. I won’t be switching until it does.
Furthermore, AI upscaling has nothing to do with DLSS or Nvidia Shield. It’s a GPU feature that upscales any video playing on your PC to 4K, whether it be in a video player or your favorite browser of choice. It’s a really neat feature to have (especially for watching older content), and not something I can go without now that I’m used to it. Same goes for SDR-to-HDR conversion. Yes I’m aware that it’s not true HDR, but it’s convincing enough to fool me. YouTube videos look so much better with it on. Whites are whiter and colors really pop. Again, not something I can live without, now that I’m used to it.
It doesn’t matter to me whose fault it is; what matters to me is being able to use the features I paid for, and for that reason alone I’m stuck with Windows. Believe me, I really want to switch and get away from all the privacy-invading telemetry, but I can’t just yet.
This is a case of you having some very specific requirements that can only be met in a certain way, that being Windows in this case. Whether or not a switch makes sense depends on how important those requirements are to you. Seems perfectly reasonable to me.
I personally found the ability to override a game’s rendering settings to only be worth it in very few cases but that’s me. But if you use it a lot then you use it a lot.
As for AI upscaling, my main issue there is that Nvidia chose a name so generic that it’s hard to google. And then they made a second unrelated feature with a very similar name.
There is AI video upscaling for Linux but it probably doesn’t work quite the same way Nvidia’s offering does. That might be a problem or it might not; I admittedly only invested a minute to look it up so I don’t have any details.
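For anyone curious, what usually comes up in that search is mpv with real-time upscaling shaders such as FSRCNNX or Anime4K - whether that counts as equivalent is debatable, it only covers video players rather than the browser, and the shader path below is just an assumed example:

    # assumes the shader file has already been downloaded into ~/.config/mpv/shaders/
    mpv --glsl-shaders="~~/shaders/FSRCNNX_x2_8-0-4-1.glsl" --scale=ewa_lanczossharp video.mkv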
The same applies to SDR-to-HDR. There seems to be something but it probably doesn’t work like what you currently use.
So in the end you’ll have to decide whether you’d be more annoyed by not having those features or by having to use whatever zany shit Microsoft come up with. Not a great decision but that’s life.
I personally might have stuck with Windows longer on my desktop if my 4080 hadn’t turned out to be wonky and Nvidia’s driver hadn’t turned out to be so capricious that I had to spend two months ruling out plausible error causes. That drove me back to AMD, which made the switch easy. But again, that’s me and not you.
“I want everything on my computer to look shiny and fake and rendered and if i cant have that with linux then microsoft will just get to keep raping my data. Because when im watching my YOUTUBERS i dont want to see people i want them to look like filtered upscaled animations.”
^ See this is the other issue I have with Linux users. $20 says you hate 24/30Hz to HFR conversion as well. You probably hate 3D audio too. You’re all the same.
Call it “shiny” and “fake” all you want, but it looks a lot more real to me than the dull, stuttery, pixelated video we’ve all been fooled into believing is superior because Hollywood told generations of people that it was, simply because the technology wasn’t there yet.
Now it’s here, and you people call it “shiny, fake, and rendered” because you’re old and stubborn and unable to accept the fact that technology evolves. Hate to burst your bubble, but it doesn’t actually look the way you’re imagining it does. I wish I could show you in person. You’d become a believer, like everyone else I’ve shown my HTPC setup to.
If the point of video is not to capture a slice of life, then what is the point of video? 1080p SDR @ 24 FPS does not look real to me, but 4K HDR @ 120+ is much closer, even when upscaled to that.
I wonder if he even has the proprietary driver installed using his distro’s package manager, with the right packages chosen for CUDA and Vulkan, or if he just manually installed the proprietary driver via a .deb file and still has the nouveau/reverse-engineered CUDA and Vulkan stack installed 🤔

PTSD of wifi and gpu drivers: [dog war flashback meme]
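Re: the driver question - a quick way to check what’s actually loaded. The lspci/nvidia-smi part is generic; the install line assumes a Debian-style distro, so treat the package name as a placeholder:

    # shows "Kernel driver in use: nvidia" (proprietary) or "nouveau" (reverse-engineered)
    lspci -nnk | grep -i -E -A3 'vga|3d'
    # prints driver and CUDA versions if the proprietary stack is loaded
    nvidia-smi
    # install it the distro way instead of a manual .run/.deb (Debian example)
    sudo apt install nvidia-driver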
Oh to be a young lad experiencing linux for the first time again
For higher throughput in middle age, I recommend Edge-ing every so often, and then going back to a real browser.
Gets super horny
Hol up
I’m doing that rn. Not the first time as I’ve used it before, but this time as a daily driver.
Am enjoying it more than ever almost 20 years in.