So where is the line drawn? What about the teens who want to look up how to do an exercise correctly without getting injured? The people who make these videos are usually very fit (big surprise!)
I have a feeling this is going to be driven by some AI model that’s gonna do more harm than good.
It’s YouTube, there is no line, just a vague squiggle that you can cross without any warning.
Usually it seems they block the main channels while the small ones that don’t know what they’re talking about slip through. Going to get some kids hurt doing this.
From what I understand this is just the recommended feed, so it wouldn’t affect searching for specific stuff or bingeing a channel’s backlog.
And frankly speaking, this should be a default feature. All too often the algorithm thinks “oh, you watched this one video, let me drown you in that shit at the expense of everything else”.
The whole thing meshes well with what we know from child/youth psychology, btw: agency makes all the difference, whether they’re seeking information or (in the current year) doomscrolling it. One tends to involve critical engagement; the other is more like soaking it up by osmosis.
Oh, speaking of YouTube fitness channels, here’s a good one. And another one. Like, especially if you haven’t done anything in a while, just watch this.
From the article:
The platform will still allow 13- to 17-year-olds to view the videos, but its algorithms will not push young users down related content “rabbit holes” afterwards.
Is there any way to get them to stop suggesting right-wing channels to me?
Yes, delete your watch history and use the “do not recommend” option in the video drop-down menus.
Well, no. It was more of a joke. I have a third-party channel blocker installed that I use to block them, but every now and then I get a new one recommended to me. What I find interesting is that I never engage with those types of channels, so why would the algorithm ever suggest them in the first place? In fact, the only political content of any kind I watch is Behind the Bastards, but it never suggests any left-wing content. Kinda odd.
I find this happens any time I engage with anything that people on the right also like watching, like a gun channel or a non-political video from a right-leaning channel. I think the algorithm is just saying “I saw a Republican watch this once, so if you watched it there’s still some chance you’ll engage with this right-wing content.”
I think it pushes it so heavily because it’s a gold mine (to the algorithm) since content by those channels is so heavily consumed.
One personal benefit of RegularCarReviews coming out as gay has been the purge of right wingers from his channel.
Still, if Google can tweak the algo so kids don’t get fitness videos, I should be able to have a toggle to keep right-wing videos out.
Because it’s not God making the choices, it’s an algo. God would know what you want, but an algo needs data: if there’s a popular video that lots of people who watch content you like also watch, then it makes sense for it to see if you’re interested.
It does the same with everything; you just notice the stuff you hate more. Right wingers claim YouTube and Facebook push woke stuff for the exact same reason.
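Purely as an illustration of that “lots of people who watch what you watch also watch this” logic, here’s a toy co-watch sketch in Python. The channel names and histories are made up, and this is obviously nothing like YouTube’s real system, just the simplest version of the heuristic being described:

```python
# Toy sketch of a co-watch recommendation heuristic.
# NOT YouTube's actual algorithm; channels and histories are hypothetical.
from collections import Counter, defaultdict
from itertools import combinations

# Hypothetical watch histories: user -> set of channels they watched.
histories = {
    "alice": {"gun_channel", "car_reviews", "right_wing_news"},
    "bob":   {"gun_channel", "right_wing_news"},
    "carol": {"car_reviews", "cooking"},
    "you":   {"car_reviews", "gun_channel"},
}

# Count how often each pair of channels shows up in the same person's history.
co_watch = Counter()
for watched in histories.values():
    for a, b in combinations(sorted(watched), 2):
        co_watch[(a, b)] += 1
        co_watch[(b, a)] += 1

def recommend(user, top_n=3):
    """Score unseen channels by how often they co-occur with what `user` watched."""
    seen = histories[user]
    scores = defaultdict(int)
    for channel in seen:
        for (a, b), count in co_watch.items():
            if a == channel and b not in seen:
                scores[b] += count
    return sorted(scores.items(), key=lambda kv: -kv[1])[:top_n]

print(recommend("you"))
# [('right_wing_news', 3), ('cooking', 1)]
# "you" never watched right_wing_news, but it gets the top score because
# other people who watched gun_channel/car_reviews also watched it.
```

Even at this toy scale you can see the effect from the earlier comments: a single overlap with a heavily consumed cluster is enough to start pulling that whole cluster into your recommendations.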
Don’t sign in?