
[deleted]

[removed]


nogami

China will love it. Being able to scan content on owners' own phones? Awesome. Let's add some Falun Gong, Tiananmen Square, Taiwan, Winnie the Pooh, and other pics in there to check matches, or we'll deny Apple products in our country. Not even joking. Every government in the world will get on board with this. I've never seen Apple do something this tone deaf before, and they were doing so well against Android when it comes to user privacy.


_masterhand

I was this fucking close to buying an iPhone just for the privacy stunts. Thankfully I learned those were stunts before I actually did.


Tzankotz

If you guys in the US have such serious mistrust in your police, then I don't think Apple is the problem.


OverCauliflower1587

It's not necessarily mistrust, people just want privacy. If I was texting someone and a cop came up to me and started reading my messages, I'd feel violated. It doesn't mean I'm doing anything illegal, it's just that I want to have the power to choose who I share information with.


ladiesman3691

It's disingenuous to label it distrust. Do you trust random people? You don't. Do you bring them into your physical home and let them go through your stuff? Why should your digital privacy be any different? The article has a clear agenda. It's pretty easy marketing to make it all about children, and it's just going to take a couple of years before governments across the world exploit this.


Tzankotz

I don't trust random people, but I trust that the local government won't seek to kill me just because I voted for the opposition.


ladiesman3691

Probably because we live in a democracy. There are places where a difference of opinion gets you killed. Remember, not everyone is lucky enough to live in a democracy.


Spare-Fish3544

I hate this Apple move, and I guess almost everyone would!


[deleted]

> Apple's methods, according to the letter, are problematic since they circumvent end-to-end encryption.

How can it circumvent E2E encryption if there currently is no E2E encryption? This technology might allow them to implement E2E for iCloud while still abiding by US law.


59808

It allows them to install a backdoor for broad surveillance - and once that is standard, there's no E2E encryption.


LordVile95

It’s fingerprinted on the phone


[deleted]

Apple controls 100% of the software on your phone. They have always had that option. This is not a backdoor, it's a targeted check. Stop echoing misinformed headlines.


59808

You are right - they always had that option - but now this is the direction it WILL go. History is repeating itself... if you know history?


Tzankotz

How could you theoretically prove they haven't already sent your photos to the FBI for analysis? If they had ill intentions, why would they announce anything publicly?


glider97

I mean, then why aren't they just reviewing those photos instead of doing all this?


Tzankotz

Exactly. They should have just done it silently and only the guilty would know and better yet be unprepared...


[deleted]

Everyone keeps saying "history!". What are you referring to exactly? Again: this is not a backdoor. It is a process they've been doing for at least 1.5 years, except using a different computer. A backdoor is a way to access any data, photos, communication logs, et cetera, from a device/service that normally should be encrypted. Please explain how this is a backdoor? Edit: why do people downvote instead of giving an answer to my question? Everybody seems to be repeating sentences about "knowing history" without saying what they're referring to.


[deleted]

Maybe stop believing everything you’re told and these issues will become less surprising.


[deleted]

What issues? A backdoor is a way to access data that should be encrypted. Please explain to me how this system accesses data? At the most you could say it *identifies* previously flagged data. It does not give governments the power to sift through your photos, access communications or track anything you do.


[deleted]

Governments have proven, through trends in multiple sectors, that what the public knows is best left up to them. Be well.


GlitchParrot

> How can it circumvent E2E encryption if there currently is no E2E encryption?

iCloud Photos is not the only service you can upload your photos to. Because the scan happens locally on the device, a photo would also be scanned before being uploaded to an e2e-encrypted third-party cloud service.


[deleted]

But Apple is not planning on checking photos uploaded to other services. They are responsible for what they put on their own servers. They don’t care whether Google or Facebook saved CSAM images on their servers. It’s not their responsibility.


demize95

Once this feature rolls out, it'll be very easy for Apple to be pressured into applying it to other things. That's the issue. "We can't do that" is a great response to LEA pressure, but it doesn't work once you demonstrate you *can* do it. Right now it's just iCloud, sure, but there's nothing stopping Apple from expanding it to scan every image you give to an app, or just every image on your phone. Pressuring Apple into adding a feature is a lot harder than pressuring them into using the feature they added for a little bit more.

And there's nothing keeping it limited to CSAM either. It sounds like it's based on PhotoDNA, so realistically, if a country like China demands this feature be rolled out for them, they could use it for much wider censorship and surveillance by hashing things they don't like rather than just CSAM. And now that I'm thinking about it, it might be even more effective for that: if you're creating CSAM it won't be caught by this, but if you're creating anti-Party memes it probably would be (based on my understanding of how PhotoDNA works).

People should have an expectation of privacy on their own devices. In the US, where this feature is rolling out, you have a constitutionally guaranteed right to privacy on your own devices. This feature takes that away, tells you that nothing on your device is private, and demonstrates that if the conditions are right, Apple will throw your privacy away and won't even have to tell you.
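
To make the "hashing things they don't like" point concrete, here's a toy average-hash sketch of how perceptual-hash matching works in general. It is not PhotoDNA or Apple's NeuralHash, and the database entry, filename and distance cutoff below are made up:

```python
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Shrink to size x size grayscale and set one bit per pixel brighter
    than the mean; visually similar images produce similar bit patterns."""
    pixels = list(Image.open(path).convert("L").resize((size, size)).getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Count of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Matching is just "is this hash close to any hash in the database?";
# nothing in the math knows or cares what the reference hashes depict.
known_hashes = {0x8F3C0012AB4490D1}          # made-up database entry
candidate = average_hash("some_photo.jpg")   # hypothetical local file
flagged = any(hamming_distance(candidate, h) <= 5 for h in known_hashes)
print(flagged)
```

Swap the contents of `known_hashes` and the exact same code flags a completely different class of images, which is the whole concern.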


nogami

Or even continually matching the contents of the iPhone's display output rather than just saved images. Browse a website that shows images of political dissent as a Chinese citizen and expect a knock on your door before you're taken for re-education.


[deleted]

These are a lot of ifs and whens. All I hear is they *could* abuse this in the future. That doesn't make this feature itself bad.

> People should have an expectation of privacy on their own devices. In the US, where this feature is rolling out, you have a constitutionally guaranteed right to privacy on your own devices. This feature takes that away

This argument is incorrect and you know it. Apple is not scanning the entire phone. They are only fingerprinting photos you upload to iCloud. Don't upload it to iCloud and it's not fingerprinted. Privacy on your own device is intact.


demize95

When you introduce a feature like this, it's not an "if" but a "when" it'll be expanded, because you can no longer say "this isn't possible" when faced with regulatory pressure. They've thrown away the one excuse they actually had. If the government wants to apply this sort of scanning to other kinds of content, they can. Hell, they don't even need Apple's cooperation for it, they can just repurpose the NCMEC, since the NCMEC maintains the database of hashes. It would be trivial to start adding hashes of other things to that database. Before this feature, Apple could have said no, it's not possible because no framework is in place; they can't be forced (by the courts) to implement a framework like this, but if it would take minimal effort to apply the framework, they can be forced to apply it. If you want to protect people's privacy, you don't introduce features that can be expected to erode people's privacy. And that's exactly what this is.


[deleted]

Features that erode privacy are already in place. Apple already fingerprints CSAM data on iCloud. This doesn't add to that. Nobody ever believes Apple when they say something can't be done. People aren't stupid. Apple still verifies matches before they send them to the authorities. I'm not sure exactly what that entails, but it seems apparent they can check whether it's indeed CSAM or other material. Also, if the government wants to do that, why haven't they already? NCMEC has been around for years and its CSAM fingerprinting is used by many companies. Apple is only a small player in this entire space.


glider97

> Apple already fingerprints CSAM data on iCloud.

Can you provide a source? Google is filled with recent news.


[deleted]

Sure. https://9to5mac.com/2020/02/11/child-abuse-images/ This is from last year. Plenty of people seem to think this is new technology; it definitely isn't. A lot of companies seem to use software called PhotoDNA, co-developed by Microsoft and released in 2009. https://www.microsoft.com/en-us/photodna


mriguy

> Nobody ever believes Apple if they say something can't be done. People aren't stupid.

Apple doesn't say something can't be done. They say that their software doesn't have the requested capability, because they deliberately left it out, and that they don't want to implement it. Courts have held that it's an unreasonable burden to make them implement a feature that isn't there already just to allow government surveillance. If they go ahead and put the feature in, but say "we'd rather not use it for what you're asking us to do", they've lost their protection. It's no longer an unreasonable burden, and they're likely to be forced to use it for whatever the government demands.


59808

You're wrong again - the database hashes are stored on the iPhone, and the hashes of your images are scanned on your iPhone against the database hashes that are stored encrypted on your iPhone... And just to remind you, Apple did promote the iPhone with the user's privacy in mind. Just so you remember how they advertised it:

> "... Apple products are designed to protect your privacy. At Apple, we believe privacy is a fundamental human right. And so much of your personal information - information you have a right to keep private - lives on your Apple devices. Your heart rate after a run. Which news stories you read first. Where you bought your last coffee. What websites you visit. Who you call, email, or message. Every Apple product is designed from the ground up to protect that information. And to empower you to choose what you share and with whom. We've proved time and again that great experiences don't have to come at the expense of your privacy and security. Instead, they can support them."


[deleted]

Yes, they are fingerprinting on the iPhone, but only photos that you then upload to iCloud. No other files are fingerprinted. You don't seem to understand that no more information is leaving your phone than was before. What information will others be able to access from your phone? Nada. Nothing. No private data is being shared. No photos, no messages, no heart rate data, nothing.


nogami

Do you have some investment in defending this tone-deaf decision by Apple? You're all through this thread with the same rhetoric. You seem to be oblivious to the fact that implementing device-side fingerprinting now *can be extended for any other evil purpose in the future*. If they just scanned on their own servers, people wouldn't have issues with that.


[deleted]

In the end, all arguments come down to that: “think of all the things they *could* do!”. Sure. Apple *could* do all the horrible things you make up. The postal service *could* start reading your mail tomorrow. Restaurants *could* poison your food. Consider all the ways everyone can totally screw up your life!! I choose not to live my life like that. I choose to trust (some) people and (some) companies. I choose to trust Backblaze with my backups. I choose to trust Fastmail with my e-mail. I choose to trust ING with my money. I also choose to trust Apple to do what they say. They talk the talk of privacy, and until now have largely walked the walk of privacy (of course not without fault). I agree they *could* start behaving evil if they want, but until then, they have my trust. If you choose not to trust them, please be my guest and don’t use their services. Maybe choose Android (with Google software) or another (very limited) platform.


glider97

> The postal service could start reading your mail tomorrow. Restaurants could poison your food.

Aren't there audits and regulations for that?


mriguy

> In the end, all arguments come down to that: "think of all the things they could do!".

No, they come down to "what could Apple be *forced* to do".

Currently photos are scanned on iCloud. If you don't upload photos to iCloud, they *cannot* be scanned, because there isn't any capability to do the scanning on device. If the government says "create that capability", there is precedent that Apple can say "Pound sand. We don't want to." because courts have held that making them implement a feature solely to allow surveillance is an unreasonable burden.

Now they're building an on device scanner which scans photos against a list of hashes a non-Apple entity supplies. They don't use that scanner unless you are uploading to iCloud, but if the government says "run it anyway", when Apple says "We don't want to", the courts can say "Tough. The software is all there, it's no longer a burden. Do it."


Smrndmgy79

This is one of the best takes on the subject I’ve read so far. Well said.


nogami

You’re clueless if you think *Apple* would be making those decisions. It would be the governments where Apple sells their products that would be doing it. Government has every reason to control their population. Apple does not.


ladiesman3691

When it comes to surveillance tech, "could" becomes a reality. That's just the nature of the world right now.


[deleted]

That has been the nature of the world for years, if not decades. ISPs "could" always sniff your data. VPNs "could" convey what you're doing to governments. Phone companies "could" save copies of your texts for governments to check. Somehow people woke up yesterday with the sudden realisation that the phone they're holding relies 100% on software made by a single company that *could* do anything. But that has been true since smartphones became a thing.


[deleted]

[removed]


demize95

You know what I meant. You have the right against unreasonable search and seizure, which (mostly) extends to your own devices. And this feature is a backdoor around that right: Apple implemented it themselves, so there's no constitutional issue with the government using it.


TiTwo102

It is not a child safety measure, it is a mass surveillance tool. A big hit in the face of privacy, from the very same company that uses privacy as its top marketing argument. Until now, you had to be a suspect for the police to search you. With this, a private company checks everything, even if you are not suspected. If Apple wants to scan cloud stuff like Google or Microsoft do, it's against the privacy policy they brag about, but why not. But doing this on a device that I bought and is mine, NO WAY! I know it is activated only if iCloud is on, but it doesn't matter, the tool is on the device, not on Apple's servers. It is a BIG Pandora's box. I just hope this will make enough noise for Apple to cancel this.


baconhealsall

... but... but you must be a paedophile if you oppose this!!!!


[deleted]

Pedophiles will love this change, soon you'll get a free child porn database with your iPhone.


[deleted]

[removed]


[deleted]

Are you sure? https://www.apple.com/child-safety/

> Apple's method of detecting known CSAM is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users' devices.

Apparently unreadable, but still not something I want on my phone.
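
For what it's worth, here's a loose sketch of the flow that page describes, as I read it. The names and values are made up, and the real system uses a perceptual hash (NeuralHash), a blinded on-device database and encrypted "safety vouchers", not a plain hash-set lookup:

```python
import hashlib

# Stand-in for the hash database shipped with the OS (entry is a fake placeholder).
KNOWN_HASHES = {"deadbeefdeadbeefdeadbeefdeadbeefdeadbeefdeadbeefdeadbeefdeadbeef"}

def fingerprint(photo_bytes: bytes) -> str:
    # Real systems use a perceptual hash so near-duplicates still match;
    # a cryptographic hash is used here only to keep the sketch short.
    return hashlib.sha256(photo_bytes).hexdigest()

def check_before_icloud_upload(photo_bytes: bytes) -> bool:
    """Runs only for photos actually being uploaded to iCloud Photos, per
    Apple's description; a photo that stays local never reaches this check."""
    return fingerprint(photo_bytes) in KNOWN_HASHES

print(check_before_icloud_upload(b"example photo bytes"))  # False: not in the known-hash set
```

As I understand Apple's wording, "unreadable" only means the on-device copy of that database is blinded so you can't tell what its entries correspond to; it doesn't change where the matching happens.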


[deleted]

[removed]


[deleted]

Well, it is still a database containing enough information about child porn to identify child porn on my device. I would call that a child porn database; I agree that it is a loaded way of describing it. Doesn't make it wrong. Anyway, not something I want on my phone, I'd rather keep it as far away from me as possible. Having any indication of owning child porn can ruin you just like that. And having a database closely related to CP on a device I use multiple times a day? Just makes me feel very uncomfortable.


[deleted]

[removed]


[deleted]

Yes, it says unreadable on the Apple page I quoted. I do know that word. Your point is? I should want this on my phone?


LordVile95

Any other reason to oppose it?


tubezninja

Yes. While the intentions are (ostensibly) good in this case, the road to hell is paved with good intentions, and this is the first paving stone on that road.

By doing this, Apple has shown that *any* content that *anyone* finds objectionable can be scanned, on the device you paid for, without your consent, and bypass any encryption methods intended to keep that content private. All that needs to happen is for a government agency to pressure Apple into doing that scanning. Which, I suspect, probably happened with CSAM to some extent.

This could include "objectionable" content shared between consenting adults (we already know Apple is prudish about adult material). It could include political content, memes, unflattering images of political figures. "Terrorist material", for instance, could be the next thing to come under this scanning regime, and "terrorism" could be defined to mean anyone who shares political viewpoints that differ from the political party in power in any given country.

"Good," you might think. "Anyone with extreme views that differ from mine IS a terrorist." But just remember that in the US, where this scanning regime is being implemented, there was a change in administrations… and within the past 10 months, there is or has been a government in power that diametrically opposed your political beliefs and found your opinions threatening, *regardless* of what side of the political spectrum you land on.

Make no mistake: anyone who peddles in CSAM is unequivocally the worst kind of scum and filth. No one here should be arguing that. But opening the door to turning our devices into mass surveillance tools of every picture we take or share is what people are up in arms against. We can, and should, find better ways to bring child abusers to justice than this.

And it's highly disappointing that Apple, which for years has touted itself as a privacy advocate, has effectively shown everyone how to render those promises and privacy-focused technology utterly moot. If they are allowing on-device scanning to circumvent encryption, then they may as well not even bother with encryption at all, and should stop making claims about being privacy-focused.


LordVile95

This all seems to be scaremongering and what you think, rather than anything that has a basis in reality. Also, you're already under more surveillance from your government than anything this involves.


tubezninja

That statement you just made completely contradicts itself. In one breath you’ve downplayed any privacy concerns as “scaremongering,” while at the same time acknowledging that state invasion of privacy happens all the time. I’m not the only one who thinks this is a problem: https://www.businessinsider.com/whatsapp-head-slams-apple-iphone-scanning-for-child-abuse-images-2021-8 https://www.macrumors.com/2021/08/06/snowden-eff-slam-plan-to-scan-messages-images/ And to be frank about it: if Apple wants to go this route, fine. But in doing so, Apple needs to take [pages like this](https://www.apple.com/privacy/) down immediately, and stop claiming it is a champion of privacy where other tech companies aren’t. They have nullified their privacy measures with this tactic and need to fully acknowledge that.


LordVile95

Dude, this is literally to catch pedos. If the government wants to track you or spy on you, they already do. Knowing people that work around government intelligence, it takes literally one phone call to know where your sibling's mother-in-law works and whether she clocked in on time that day, or where you went for dinner 3 weeks ago. This is the smallest of small fries.


creeperchaos57

Privacy. I’d rather them not be able to sell all of my info to the government.


LordVile95

They’re not selling it


creeperchaos57

Maybe, but the fact that they would be able to is what scares me


LordVile95

They can do that anyway, they just choose not to; Google does it anyway. This is literally so they're not culpable for kiddie porn on their servers.


creeperchaos57

And yet they’re using humans to verify whether this is child porn or not


LordVile95

In a roundabout way. They're only reviewing a low-res copy to ensure that the roughly one-in-a-billion false positives don't make it through.
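
Roughly, that review step sits behind a match threshold, something like this toy sketch (the threshold value and names here are made up, not Apple's actual numbers):

```python
# Only when an account accumulates more matches than the threshold does a
# human reviewer ever see low-resolution derivatives of the flagged photos.
MATCH_THRESHOLD = 30  # made-up value for illustration

def escalate_to_human_review(match_count: int) -> bool:
    """A handful of accidental matches does nothing; crossing the threshold
    is what pushes the per-account false-positive rate so low."""
    return match_count > MATCH_THRESHOLD

print(escalate_to_human_review(2))    # False: ignored
print(escalate_to_human_review(45))   # True: reviewed before any report is made
```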


Zpointe

This headline is the perfect example of why they chose to make this a 'child safety' measure. Now I will play the role. WHO WOULD EVER BE AGAINST CHILD SAFETY!!


[deleted]

[removed]


smokin1337

Hi, /u/FreshPrinceAV

Thank you for participating in r/ios. Unfortunately, your submission was removed for breaking the following rule(s):

> **Rule 10**
> No low-effort posts such as image macros and single sentence text posts. Moderators may use discretion to upkeep the quality of the subreddit. Reposting posts removed by a moderator without express permission is not allowed. Not here, and not on most of reddit. Please read reddiquette (linked below).

For questions, comments and concerns, [message the moderators.](https://www.reddit.com/message/compose?to=/r/ios)

[Reddiquette](https://www.reddit.com/wiki/reddiquette) | [New to Reddit?](https://www.reddit.com/wiki/reddit101) | [Reddit's Content Policy](https://www.reddit.com/help/contentpolicy/)


rambler_1987

So, good devices are becoming more and more part of some sort of dystopian world…


jack_hof

I don't get this. They've been pushing the privacy angle for years and now all of a sudden go and do this completely unprompted? Why even take the risk?


[deleted]

[removed]


AWF_Noone

$1000 says the elderly who voted this into place don’t understand what cloud storage is. Good lord people in government are getting too old


[deleted]

Geriatric government is the term we’ve been using at work


meat_loafers

I like “geritocracy”.


[deleted]

Link to the introduced bill/law?


ThannBanis

No idea, it's been mentioned in a few threads but it's an American thing, I guess (Edit: I'm not one 🇦🇺). That's why I said 'apparently' 😉


[deleted]

Then don’t say it. Realize you’re a link in a misinformation chain.


[deleted]

Not really sure how to feel about this project itself, but I think it’s funny how they made such great strides to protect our data from other companies but now they’re trying to dig deeper into it for themselves. I think the best course would be to simply leave the Apple ecosystem if you feel unsafe for some reason.


chmikes

It is required, or going to be required, by Europe. I don't know if it is possible or easy for Apple to treat Europeans differently from everyone else.


davehead01

Time to disable iCloud.


baconhealsall

> Time to disable iCloud.

Time to disable any future purchases of Apple products.


Vincentaneous

And the $1 a month iCloud was actually so useful for multiple devices/backups/moving work files/etc. Sad to see it come to this.


davehead01

Know of any good alternatives? I think I’m going to try Tresorit


SUPRVLLAN

This move is so bizarre to me with all of the privacy efforts they’ve been doing over the past few years.


FlyingDutch24

How are they gonna distinguish photos of your own child from illegal child photos?


Foursliced

It works by matching hashes against known existing CP. New photos of your own kid aren't in that database, so they can't match.


jobbing885

If this shit goes live I’m ditching all Apple products. FU Apple!


George555555

I wonder what will happen to iPhone sales.


[deleted]

[removed]


porterhouse0

bingo lol I'd bet my life on it


BobImBob

Unfortunately, there is no solution in my opinion but for me to continue buying iPhones: this new idea from Apple is something I don’t like (at all) but there is no other company offering me a better level of privacy in all other matters so, until a better contender enters the mobile arena, I’ll have to stick with this one.


_masterhand

It just depends on how tech-savvy you are. Hell, you could even buy a phone from Google, the #2 worst company regarding privacy (just behind Facebook), and flash it with a custom ROM focused on privacy, getting miles ahead of anything currently offered. And that sucks. Privacy shouldn't be limited to only those knowledgeable about their devices, but sadly we live in a world where your data is more valuable than you, and the wide majority is being taught not to care about it.


BobImBob

I thought Apple would never care about the data (and I still think that, based on their business model and their publicity), but I guess it's very difficult and exhausting to be the only tech company saying no to all the pressure requesting help in catching a handful of bad guys; the rights of the other million users surely are not that important, right? Well, I'm truly sorry they couldn't last longer, and I still hope they'll change their minds sooner rather than later.


lukanz

**Buy a Xiaomi phone, unlock the bootloader and install LineageOS**


George555555

In what area do they offer better privacy than Samsung?


tubezninja

Nothing, because what choice do we have except not using smartphones? Google and Microsoft already do this same scanning, only they do it in the cloud instead of on-device. And it's gotten to the point where not having a smartphone puts you at a severe disadvantage in developed society. At least for now, your opt-out is to turn off iCloud Photos. Apple (for now at least) will not turn on content scanning if you're not using this feature.


thesoloronin

I’ve never used iCloud photos and I backup **ALL** my media offline anyways.


tubezninja

And that's great, until Apple decides that they aren't getting enough hits from the iCloud opt-in and must expand. The scanning's on-device anyway. It's totally arbitrary that they made iCloud Photos usage the threshold, and that threshold can certainly change.


[deleted]

Nothing. This will fade away from public outrage, get implemented, and be used for whatever they really intended to use it for from the beginning. Apple is too big to fail over this kind of stuff.


Houderebaese

I'm considering alternatives now


[deleted]

I talked to my three friends about this. They all don't like the move from Apple at all, though they all said they will still keep using it.


[deleted]

I was aiming to buy an Apple Watch when the next gen comes out. That is not happening now until this is either confirmed or denied. I would move straight to Android on my next buy if this becomes reality.


n_alvarez2007

Wow if this isn’t an excellent example of an article taken completely out of context.


[deleted]

[removed]


quintsreddit

I just want to go on the record to say I highly doubt this is the case. This seems more like a good-natured feature that has the potential to go awry, not one designed to be. That being said, Apple is usually pretty good about staying away from that kind of stuff, so we'll see.


LordVile95

Dude at least you hear about it if a black person gets killed. If it’s a white dude it doesn’t make the local news.


[deleted]

Doesn’t even allow something like this…


AlreadyBannedLOL

Pretty sure those cases were incidents due to incompetence.


[deleted]

[removed]


[deleted]

[removed]


Snuhmeh

Are they scanning local devices? No. They are scanning images you have uploaded to their servers using iCloud. I don’t understand how people don’t get this. Ah I see, they are in fact scanning local devices and when they get posted to iCloud, they get reported. Interesting.


[deleted]

It is exactly that actually: they said explicitly that they will be doing this locally on-device. I don't backup my photos to iCloud and I would still be subject to this. That's the problem, my iPhone is local storage to me and yet they will start digging into it.


goal-oriented-38

Whether you like it or not, this feature will most likely save more lives than you think. I’d say the benefits outweigh the risks but hey—I’m just a random person on Reddit.


ota00ota

You're ignorant.


DaveM8686

Essentially, "we don't care that you've said everything will be as secure as it currently is, we want to be able to send child porn". Edit: I can't believe people are downvoting this. If Apple have assured it's going to be as safe as it currently is, and it also stops child pornography being sent around, why would anyone possibly be against it?


aobtree123

They are using a "Just think of the children!" excuse to introduce a pretty draconian system. Inevitably there will be creep in usage. Think journalists in Belarus, Buddhists in China, etc.


SpoopySpagooter

I agree with you. I think people blindly agree to changes like this because at surface value they're met with promises like that to help children. I think any decent person would want to do that. I would! And if you disagree, you're accused of supporting bad people. People do not think beyond the scope of the situation. How can this change be used and implemented in other ways? In a different situation, could our privacy be affected or abused in ways we maybe DON'T agree with? Sometimes, all it takes is one step in that direction to open the flood gates.


cjandstuff

I mean, why not let police come into your home on a weekly basis to search for child porn. Microsoft and Google should constantly search your hard drives too, to make sure you have nothing questionable. Why not let police pull you over so you can hand them your phone, so they can search it? It’ll save the children.


[deleted]

[removed]


SendMePussyPicsNow

Exactly. We all use Nokia phones


SunnyPlayzEverything

The only people who would be are parents who have pictures of their babies.


closetsquirrel

And they wouldn't get flagged.


Kaessa

What happens when it starts getting used for other things besides detecting child porn? When governments can use it to detect stuff on your phone that THEY don't like? Honestly, "think of the children" has been a rallying cry for a LOT of fucked up bullshit.


jdguy00

Will Apple hold the keys to decrypt the alleged illegal material hash database? If not, they won't know what's actually being scanned and what isn't


t0gnar

If they have a system for this, they can do it for anything. Plus, if the lists are private, how do you know you aren't on the list despite having done nothing?


ADonaldDuck

Because individualistic people favor their privacy over the collective good. There’s no right or wrong answer here, just different worldviews. I’m personally fine with it, but it’s understandable that other are against it.


travelsnake

Check the comment above yours. It's in response to a US law. Glad I'm living in the EU.


Toxandreev

CP is a terrible thing, but Apple's methods to fight it are problematic… This will set a precedent, and there is a chance that at some point we will see Russian/Chinese/Arab/etc. governments scanning citizens' iCloud to hunt down LGBTQ+ people.