So in a just world, Google would be heavily penalized not only for allowing CSAM on their servers, but also for violating their own ToS with a customer?
They weren’t just not allowing it; they immediately blocked the user’s attempt to put it on their servers and banned the user for even trying. That’s as far from allowing it as you can get.
Honestly, the only reason I could guess is that it’s to teach AI to recognise child porn, but if that’s the case, why is Google doing it instead of, like, the FBI?
Who do you think the FBI would contract to do the work anyway? 😬
Maybe not Google, but it would surely be some private company. Our government hardly ever does this kind of work itself; it hires the private sector.
Google wants to be able to recognize and remove it. They don’t want the FBI all up in their business.
So, Google could be allowed to have the tools to collect, store, and process CSAM all over the Web without oversight?
Pretty much anyone else would go straight to jail for attempting that.
Google isn’t the only service checking for CSAM. Microsoft (and likely other file-hosting services) also have methods to do this. That doesn’t mean they host CSAM themselves in order to detect it. I believe their checks use hash values to determine whether a picture has already been flagged as belonging to that category.
This has existed since 2009 and provides good insight into the topic; it’s used for detecting all sorts of flagged image categories:
https://technologycoalition.org/news/the-tech-coalition-empowers-industry-to-combat-online-child-sexual-abuse-with-expanded-photodna-licensing/
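For a rough idea of how the hash check works, here is a minimal sketch in Python. It assumes a plain SHA-256 file hash and a hypothetical known_hashes.txt list of previously flagged hashes; PhotoDNA itself is a proprietary perceptual hash (so slightly altered copies still match), but the compare-against-a-known-list logic has the same shape.

```python
import hashlib
from pathlib import Path

# Hypothetical list of hex digests for already-known flagged images.
# In reality this database is maintained by NCMEC and industry partners,
# and the hashes are perceptual (PhotoDNA), not cryptographic.
KNOWN_HASHES = set(Path("known_hashes.txt").read_text().split())

def file_hash(path: str) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def is_flagged(path: str) -> bool:
    """True if the uploaded file's hash matches a known flagged hash."""
    return file_hash(path) in KNOWN_HASHES

# Example: screen an upload before accepting it.
if is_flagged("upload.jpg"):
    print("Match against known-image list: block the upload and report it.")
else:
    print("No match: accept the upload.")
```

Note that an exact cryptographic hash like SHA-256 only catches byte-identical copies; the point of PhotoDNA’s perceptual hashing is to also catch copies that have been resized or re-encoded, without the service ever needing to store the images themselves.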