Apples finally doing it
by Archonx - August 06, 2021 at 04:42 AM
#13
(August 06, 2021 at 08:03 AM)Amsterdam Wrote:
(August 06, 2021 at 05:23 AM)imgr8ness Wrote:
(August 06, 2021 at 04:44 AM)Archonx Wrote:
(August 06, 2021 at 04:43 AM)Amsterdam Wrote: heard it already and autistic af

What are your thoughts tho? I'm more surprised it's legal. They are pushing it with iOS 15.

Yeah, they can do what they want, being a private company and all, but let's be real: they don't give a fuck about children. This is about using 'child abuse' as the excuse for analyzing your photos, and with that, any remaining privacy you thought you had is all but gone.
Because, by using the detection and prevention of child abuse as the reason for this, anyone who opposes the move (for ANY reason) will be painted as a child abuser. After all, why would you deny something that could help children?
It's the same principle they use with Greta Thunberg. You might disagree with what we're being told about climate change, but expressing any doubt or questions will simply get you labelled as harassing and targeting a young girl. You wouldn't do that, would you?
It's all about psychological manipulation.
Apple COULD have (and probably already has) used this technology, albeit not in an official capacity, and probably hasn't done anything with what it learned from it, but they are so big that if they want to roll this out across all customers, they need the official blessing of regulators.
Fighting child abuse is a surefire way of getting it approved.
And the thing is, an AI will scan through the pictures, and if it detects child-abuse content a human reviewer gets alerted. That could attract plenty of weirdos to that kind of job, or flag perfectly normal pictures as abusive content, so your personal stuff could end up in front of random people at the company.
It doesn't scan through the pictures the way you think it does. The technology is limited to comparing image hashes, which look something like this: b64g0m0r2k2lksxg
So if that "hash" matches a known image hash, it's a strong signal the image is known child pornography. Law enforcement already maintains huge databases of known CP image hashes; all this would do is match your photos' hashes against that database. I know we're not supposed to believe them, but I do believe Apple would lose millions of customers if it abused this image-hashing technology. There is already a Python library that does this kind of matching, newly released in 2020/2021 I think?
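The matching step described above can be sketched in a few lines of standard-library Python. This is a simplification: it uses SHA-256 rather than Apple's NeuralHash (which is a perceptual hash), and the "database" and image bytes here are made up for illustration:

```python
import hashlib

def sha256_of(data: bytes) -> str:
    """Return the hex SHA-256 digest of an image's raw bytes."""
    return hashlib.sha256(data).hexdigest()

def is_flagged(image_bytes: bytes, database: set) -> bool:
    """Flag the image only if its hash already appears in the database."""
    return sha256_of(image_bytes) in database

# Hypothetical database of known-bad hashes (made-up content).
known = b"known bad image bytes"
database = {sha256_of(known)}

print(is_flagged(known, database))                # True: exact match
print(is_flagged(b"my holiday photo", database))  # False: unknown image
```

Note that a cryptographic hash like this only matches byte-identical files; that limitation is exactly why systems like Apple's use perceptual hashes instead.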
#14
Someone said to me, "but yeah... to train Apple's AI, think about how many child p0rn images they're going to have to use" O.o

Not sure if that's true, or if it's only hashing without AI detection, but either way it seems like there are probably a few additional benefits for Apple that aren't for the sake of the kids...
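For what it's worth, the hashing half doesn't need any training images at all: a perceptual hash can be computed straight from pixels. A minimal sketch of the classic "average hash" idea (this is NOT Apple's NeuralHash, which is a neural network; the tiny 16-pixel "images" below are made up):

```python
def average_hash(pixels):
    """pixels: flat list of grayscale values. Each bit records whether
    a pixel is brighter than the image's mean brightness."""
    mean = sum(pixels) / len(pixels)
    return tuple(1 if p > mean else 0 for p in pixels)

def hamming(h1, h2):
    """Number of differing bits between two hashes."""
    return sum(a != b for a, b in zip(h1, h2))

original = [10, 200, 30, 220, 15, 210, 25, 230,
            12, 205, 28, 215, 18, 208, 22, 225]
# The same image, uniformly brightened: a small edit.
brightened = [p + 5 for p in original]

# Brightening shifts the mean by the same amount, so the bit pattern
# (and hence the hash) is unchanged.
print(hamming(average_hash(original), average_hash(brightened)))  # 0
```

The point of a perceptual hash is exactly this: small edits leave the hash unchanged (or nearly so), unlike a cryptographic hash.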
#15
(August 10, 2021 at 03:21 PM)splint Wrote:
(August 06, 2021 at 08:03 AM)Amsterdam Wrote: [nested quotes snipped]
It doesn't scan through the pictures the way you think it does. The technology is limited to comparing image hashes, and all this would do is match your photos' hashes against the database of known CP image hashes.
The thing is, you can edit a photo (it takes like ten seconds) and that changes the hash completely.
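That's true for cryptographic hashes: change one byte and the whole digest changes (the "avalanche effect"). A quick standard-library demonstration with made-up bytes (though note that perceptual hashes like NeuralHash are specifically designed to survive small edits, so the argument doesn't carry over directly):

```python
import hashlib

original = b"some photo bytes"
edited = b"some photo byteZ"  # one byte changed, i.e. a tiny "edit"

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(edited).hexdigest()

print(h1)
print(h2)
print(h1 != h2)  # True: the digests differ completely
```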
#16
(August 06, 2021 at 05:23 AM)imgr8ness Wrote: [nested quotes snipped]
They don't give a fuck about children; this is about using 'child abuse' as the excuse for analyzing your photos, and any remaining privacy you thought you had is all but gone. Fighting child abuse is a surefire way of getting it approved.


I agree with this. It all sounds good until some Karen's nudes get leaked or another Snowden wannabe pops up.
#17
How else would you go after something like that? It only becomes problematic as soon as the search is extended to other criteria, for example to faces.

Do you mean this script?
https://github.com/AsuharietYgvar/AppleNeuralHash2ONNX

How do you get the model files it needs from an Apple computer?
