Hypervisible
@hypervisible.bsky.social
Every future imagined by a tech company is worse than the previous iteration…or something like that.
6.3k followers · 605 following · 5.1k posts
hypervisible.bsky.social

“The LAION-5B machine learning dataset used by Google, Stable Diffusion, and other major AI products has been removed by the organization that created it after a Stanford study found that it contained 3,226 suspected instances of child sexual abuse material, 1,008 of which were externally validated”

Largest Dataset Powering AI Images Removed After Discovery of Child Sexual Abuse Material

The dataset is a massive part of the AI ecosystem, used by Google and Stable Diffusion. The removal follows discoveries made by Stanford researchers, who found thousands of instances of suspected child sex...

12

davidcarroll.org

Oh, so they _can_ remove data from training sets. How about that.

1
wingedrayeth.bsky.social

Wow...

0
snightshade.com

Fucking idiots. Indiscriminate scraping of data like this is a totally amateur move I’ve only seen done by dumb college students who had to learn their lesson the hard way.

0
themckenziest.gay

Well this is incredibly fucking disturbing

0
elizajoku.bsky.social

So CS has learned only the most basic lessons from the history of the Lena image? Great 😐

0
ralphesq.bsky.social

Google, Yahoo, Facebook, Dropbox, etc. regularly filter these hash values and report users. Stunning that there was no check done here.

1
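[Editor's note: the reply above refers to hash-matching pipelines that large platforms run against known-abuse-image hash lists (e.g., Microsoft's PhotoDNA). A minimal sketch of what such a pre-ingest screen might look like, assuming a hypothetical block list of plain SHA-256 digests; real systems use licensed perceptual-hash databases whose implementations are not public, so this illustrates the workflow, not any actual platform's method.]

```python
import hashlib
from pathlib import Path

# Hypothetical block list of known-bad SHA-256 digests.
# Production systems match perceptual hashes (e.g., PhotoDNA),
# which survive resizing and re-encoding; plain SHA-256 does not.
KNOWN_BAD_HASHES: set[str] = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of(path: Path) -> str:
    """Return the hex SHA-256 digest of a file, read in 1 MiB chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def screen_dataset(image_dir: str) -> list[Path]:
    """Flag any scraped file whose digest appears on the block list."""
    flagged = []
    for path in Path(image_dir).rglob("*"):
        if path.is_file() and sha256_of(path) in KNOWN_BAD_HASHES:
            flagged.append(path)
    return flagged

if __name__ == "__main__":
    # In a real pipeline a match would quarantine the file and
    # trigger a report, not just print a path.
    for hit in screen_dataset("scraped_images/"):
        print(f"match on block list: {hit}")
```

[The point of the reply stands either way: screening scraped data against existing hash lists before training is a routine, well-understood step, which is what makes its apparent absence here notable.]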
takooki.bsky.social

wow guys who would have thought ai would be used for unsavory and evil motives

0

wait wait wait how did an industry filled with amoral grifters, nepo babies, frat boy business school date r*pists and soulless corporate shills allow something like this to happen??

0