Highly Terrible Algorithmic Nightmare
“YouTube never set out to serve users with sexual interests in children — but in the end... its automated system managed to keep them watching with recommendations... disturbingly on point... Users do not need to look for videos of children to end up watching them. The platform can lead them there through a progression of recommendations... So a user who watches erotic videos might be recommended videos of women who become conspicuously younger, and then women who pose provocatively in children’s clothes. Eventually, some users might be presented with videos of girls as young as 5 or 6 wearing bathing suits, or getting dressed or doing a split.”
(The New York Times, “On YouTube’s Digital Playground, an Open Gate for Pedophiles,” June 2019)