> Microsoft wanted me to confirm my age and identity, that I was a "real person". So Microsoft somehow reached out to the police department, based on the address information in my Microsoft accounts, with a check of some kind. I had to go to the local police department to verify who I was and my age. The police department told me it was odd; they were just following up on a Microsoft complaint. This happened a few years ago. Microsoft confirmed my identity then. However, the Microsoft account profile photo issue still exists today.
they dont. and microsoft doesnt contact local police. this post is dubious.
if its CSAM related (which is implied via photodna involvement), microsoft does not contact local police. they contact NCMEC (or the appropriate equivalent), who then coordinates the law enforcement response.
if it isnt CSAM, microsoft does not contact local police to aid with support, because that would be ridiculous to coordinate over a billion accounts across tens of thousands of police departments around the world. and police forces would obviously not tolerate acting as microsoft support personnel.
there has to be a substantial amount of missing context, or this story is (partially? fully?) fabricated, or the user is mistaken/wasnt talking to microsoft.
> That sounds like straight up scammer behavior.
Microsoft reached out to the police department, then the person went to the local police department to verify who they were. I don't see how this could be a scam.
> Microsoft's PhotoDNA scanning is not just in OneDrive, but throughout Microsoft's ecosystem. Basically, if you are using your Microsoft account to sign in to Windows 11, PhotoDNA scans your entire computer. This information came directly from Microsoft Support.
This sounds like a horrible privacy violation. Is it true? What do they do if they find a match?
The general consensus I saw from discussions years ago was that scanning of your local files was not something that happened (it would be detectable, and eventually discovered and called out by someone). Doing so would also require shipping the DLL that implements PhotoDNA, which Microsoft does/did not want out in the wild and requires an NDA to use. Secretly exfiltrating your files for scanning would get Microsoft in legal trouble.
The obvious alternative of course, is openly and aggressively getting users to agree to uploading their files to Microsoft’s computers (OneDrive), which are scanned.
However in the age of machine learning, copilot and the like, I would not be surprised if local scans start becoming a thing, since offering classification of objects in photos is a perfectly reasonable thing to offer from Microsoft’s point of view, and of course CSAM detection can come along with that.
the police part makes me really question what is going on here and the validity of this report.
if you get multiple child sexual abuse material (CSAM) matches, the police will be knocking on (down) your door. microsoft isnt going to nicely ask you to go down to the police station. they dont even contact local police, they forward the information to the appropriate national entity (e.g. NCMEC) who coordinates the law enforcement response.
and if it isnt CSAM related, microsoft is not going to be contacting your local police, period.
something isnt adding up here. i suspect this post is ragebait.
The perceptual hashes used for this kind of thing are, necessarily, much more susceptible to collisions than cryptographic hashes - so it's not out of the question at all.
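To make that concrete, here is a toy difference hash ("dHash") in plain Python. This is not PhotoDNA's actual algorithm (which is proprietary and NDA-gated), just a minimal sketch of the same collision-prone property: the hash captures only coarse brightness structure, so two images with entirely different pixel values can share every single bit.

```python
# Toy perceptual hash (dHash-style): one bit per adjacent pixel pair, set
# when a pixel is brighter than its right-hand neighbour. NOT PhotoDNA --
# just a stand-in with the same collision-prone property.

def dhash(img):
    """img: list of rows of grayscale values. Returns the hash as a tuple of bits."""
    return tuple(
        1 if row[x] > row[x + 1] else 0
        for row in img
        for x in range(len(row) - 1)
    )

def hamming(h1, h2):
    """Number of differing bits between two hashes."""
    return sum(a != b for a, b in zip(h1, h2))

# An 8x9 grayscale test pattern (all values even on purpose)...
img_a = [[((r * 9 + x * 7) % 50) * 2 for x in range(9)] for r in range(8)]
# ...and a much darker image: every pixel halved. Halving even values
# preserves every brighter-than comparison, so every hash bit is identical.
img_b = [[v // 2 for v in row] for row in img_a]

print(hamming(dhash(img_a), dhash(img_b)))  # 0 -- a full collision
```

A cryptographic hash like SHA-256 would treat these two images as totally unrelated inputs; the whole design goal of a perceptual hash — matching variants of a photo — is exactly what makes accidental matches on different images possible.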
That's my guess as well. Could be a collision, or it might be that he's in a corpus. Or he's been RATed and is not talking to Microsoft at all. I wasn't aware they required face pics to provide service.
No, TFA says the picture was associated with an old account that got flagged - presumably anything linked to that account, picture included, is now cursed.
TFA also says the police were involved. It seems unlikely MS would call the police just for a flagged account, or that if they did, the police would care.
The whole point of PhotoDNA (CSAM scanner) is that it can detect variations of photos without them being identical and without having CSAM to directly compare it to.
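A toy average hash ("aHash") shows that variation-tolerance — again only a stand-in, since PhotoDNA's real algorithm is proprietary: each bit records whether a pixel is brighter than the image-wide mean, so a uniformly brightened copy hashes identically even though every byte of the image differs.

```python
# Toy average hash (aHash-style): one bit per pixel, set when the pixel is
# brighter than the image-wide mean. A crude stand-in for PhotoDNA's
# robust-hash idea: variants of a photo map to the same (or a nearby) hash.

def ahash(img):
    flat = [v for row in img for v in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if v > mean else 0 for v in flat)

def hamming(h1, h2):
    return sum(a != b for a, b in zip(h1, h2))

# An 8x8 grayscale test pattern...
img = [[(x * y * 3) % 97 for x in range(8)] for y in range(8)]
# ...and a copy brightened by +25 everywhere. Every pixel value changes,
# but each pixel moves in lockstep with the mean, so no bit flips.
brighter = [[v + 25 for v in row] for row in img]

print(hamming(ahash(img), ahash(brighter)))  # 0 -- same hash despite the edit
```

Real perceptual hashes extend this idea to survive resizing, cropping and recompression by hashing a normalized, downscaled version of the image — which is also why near-misses on unrelated images can happen.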
You what now???
Since when does your local police department respond to a "Microsoft complaint?"
Incidentally, how PhotoDNA works is clever and interesting imo, though defeatable if you know how it works: https://www.hackerfactor.com/blog/index.php?%2Farchives%2F93...
I guess it's a hash collision, but that is pretty crazy. Sounds like the plot of a sci-fi dystopia.
Coming soon to every AI enabled product near you
Through A Scanner Darkly, indeed.
Oh well, Philip K Dick enters the chat again. With Solar Lottery this time.
There’s your problem. Don’t create a Microsoft account. Why would you need one anyway? To use Windows? Why? Get Linux or switch to a Mac.
What?
I take it you're not one of the many people who've had a dozen different services over the years get bought up by Microsoft, then forcefully migrated to multiple Microsoft Accounts, and then lose access to all of them?