Feedback & Followups
- 🇪🇺 The Digital Markets Act (DMA) & Digital Services Act (DSA) have both moved one step closer to going into effect, having been approved by the European Parliament. The final step is formal approval by the Council of Ministers, which is just a formality, so the laws are expected to come into force before the end of the year — www.imore.com/…
- Social Media News:
- TikTok Claims Strong Data Security, Still Planning to Improve It Further — www.macobserver.com/… & TikTok employees in China can see your data, company admits — www.imore.com/…
- Crypto chaos sees Meta ditch its Novi payment wallet after just 9 months — www.imore.com/…
- There will be lots of legal fallout still to come, but from a security and privacy point of view, the danger of Musk potentially turning Twitter into a kind of digital wild west has passed: Elon Musk tries to walk away from Twitter deal — www.axios.com/…
Deep Dive 1 — 🇺🇸 The Practical Privacy Fallout from the Recent Supreme Court Decision on Abortion Rights
A few weeks ago the law in the US changed dramatically when the highest court in the land reversed itself on abortion rights — for 50 years there were federal constitutional protections for people’s right to assert control over their own pregnancy. That federal right has been removed, so each state gets to set its own rules now. In other words — things just got complicated for many Americans.
Everything is still in flux, but there are legitimate grounds for Americans living in some states to fear any of the following could become a reality soon:
- States could subpoena corporations that hold data on menstrual cycles so as to detect pregnancies and use that data to target people who appear to become pregnant and then cease to be pregnant without giving birth. This data would include the 10-15% of pregnancies that end with a miscarriage.
- States could subpoena corporations that hold search histories for users using abortion-related search terms. This is broader than just traditional search engines, it would include digital assistants that retain logs.
It’s very important to understand that many corporations are strongly incentivised to figure out which of their users are pregnant: expectant parents spend a lot of money, so they’re extremely valuable ad and marketing targets. Because of that strong financial motivation, many cycle tracking apps sell data on their users to data brokers, who in turn sell it to advertisers, marketplaces, and retailers.
It’s also important to understand that many traditional and non-traditional search engines (such as smart assistants) record user searches to flesh out their user profiles (again, for sale) and to improve their own algorithms.
Finally, it’s vital to understand that some network infrastructure providers are also financially motivated to track you and to sell the data they gather to data brokers. In the US it is legal for ISPs & carriers to track your browsing activity and sell it, and the FTC has published a report on what they’re collecting: www.ftc.gov/… Free VPNs that are not run by charitable foundations make their money the same way: by tracking their users and selling the data.
But thankfully, it’s equally important to understand that different companies are driven by different business models. There are companies that make their money by selling hardware and services directly to users, and they’re motivated to protect those users so as to keep them loyal. These companies are as strongly motivated to protect user privacy as others are to exploit it.
So, as always, the key is to follow the money.
So, the practical advice for people living in states where there are reasons to be fearful:
- Only use cycle tracking apps that store data locally, or that use end-to-end encryption to sync data between devices. Apple’s HealthKit is end-to-end encrypted, so the simplest way to be safe on iOS is to use apps that write their data to Health, and only to Health, or to simply use the cycle tracking feature built directly into the Health app (see the sketch just after this list for a sense of what ‘writing only to Health’ looks like from the developer’s side). The iOS app Euki, which claims to store your data only on your device, is also worth considering.
- Never use a smart assistant that retains a search history to search for anything pregnancy-related whatsoever. Siri disconnects search terms from user IDs, so no subpoena can force Apple to hand over information it doesn’t have. To the best of my knowledge, Siri is the only smart assistant that protects privacy like this.
- Always use Private Browsing/Incognito mode for all pregnancy-related searching and browsing.
- Always use a trustworthy VPN such as Private Internet Access (PIA) for all pregnancy-related searching and browsing.
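To make the first bullet a little more concrete, here is a minimal Swift sketch of what ‘write to Health, and only to Health’ can look like from a developer’s point of view: the app asks HealthKit for permission to store menstrual flow samples, then saves them straight into the Health store rather than to its own backend. The function name is made up purely for illustration, and a real app would also need the HealthKit entitlement and usage-description strings configured.

```swift
import HealthKit

// A minimal sketch, assuming an iOS app with HealthKit set up.
// The function name is invented for illustration; it's not from any real app.
func recordCycleEntryInHealthOnly() {
    let healthStore = HKHealthStore()

    // The category type Apple provides for menstrual flow entries.
    guard let menstrualFlowType = HKObjectType.categoryType(forIdentifier: .menstrualFlow) else {
        return
    }

    // Ask the user for permission to read and write this one data type only.
    healthStore.requestAuthorization(toShare: [menstrualFlowType], read: [menstrualFlowType]) { granted, error in
        guard granted, error == nil else { return }

        // Record a single entry for today. The cycle-start metadata key is
        // required by HealthKit for menstrual flow samples.
        let now = Date()
        let sample = HKCategorySample(
            type: menstrualFlowType,
            value: HKCategoryValueMenstrualFlow.medium.rawValue,
            start: now,
            end: now,
            metadata: [HKMetadataKeyMenstrualCycleStart: true]
        )

        // The entry lands in the user's Health store, not on the developer's servers.
        healthStore.save(sample) { success, error in
            print(success ? "Saved cycle entry to Health" : "Save failed: \(String(describing: error))")
        }
    }
}
```

Because the sample is saved through HealthKit, any syncing between devices happens via the Health portion of iCloud rather than the app developer’s own servers, which is exactly the property the advice above is aiming for.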
Opinion by Bart: If that sounds like a lot of effort, you are correct 🙁 It is my informed opinion that this is a direct result of the perfect storm of a total lack of privacy protections and a strong erosion of civil liberties and rights. I cannot envision any short-term fixes to a problem that has developed over decades. All solutions will take time, and all require the electorate to get involved by making their views known to elected officials, regulatory bodies, and those seeking office, and by actually voting in elections at all levels of government. There are no quick fixes, so just get stuck in and do what you can to be a part of the solution.
A personal story from Allison
In 1989, Steve and I very much wanted to have another child. We were delighted when a doctor’s blood test confirmed that I had conceived, but we waited a full 3 months before telling anyone because of the high chance of miscarriage. The day after I announced the happy news to everyone I knew, we had an ultrasound performed.
It turns out that while I had indeed conceived, the cell never divided. My body had built a home for a baby who would never exist.
I had to undergo an abortion on a day I had expected to see my baby.
Under many state laws, aborting a conceived child at 12 weeks would be an illegal medical procedure. I’m not sure what they would do with my body under these laws – would I have to let the non-baby rot inside me?
Even if they did allow the procedure, having people come after me for it because of data they harvested from period-tracking apps to try to find out why a baby was never born is beyond my worst nightmares.
Links
Deep Dive 2 — Apple’s Upcoming Lockdown Mode
WWDC may have been last month, but this week Apple announced a new feature for the upcoming versions of iOS & macOS — Lockdown Mode.
This is a mode that lets users change the balance between security and convenience, sacrificing features for safety. It’s fundamentally different from Apple’s other security and privacy protections — it’s not intended to be widely used; it’s targeted squarely at a small subset of users. Only a few users need to enable Lockdown Mode, but those that do really do need to!
Who are these people? We’ve actually mentioned them a lot over the years: they’re the people who matter enough to very well-funded organisations to be worth targeting with the absolute most sophisticated attack tools out there. In other words, people who’ve come to the wrong kind of attention of a state-level actor (i.e. a government, a government agency, or a really big cybercrime organisation). That includes journalists, campaigners, some kinds of lawyers, some kinds of researchers, industry leaders, and of course politicians, government officers, and military officers.
At this stage, Apple’s OSes are mature enough that the security community and Apple know where the weakest links in the security chain are. Some of those links are technological, and some are human, and Apple is tackling both.
We know that processing data coming from others is inherently dangerous, so several of the protections target exactly that: restrictions on message attachment types, removal of link previews, and web browser limitations (websites are other people’s code running on your device!).
We also know that social engineering has been successfully used to trick users into effectively hacking their own devices by installing configuration profiles or enrolling their devices in a malicious Mobile Device Management (MDM) service, so both of those actions are blocked in Lockdown Mode.
We also know that attackers commonly use addresses/usernames that look similar to legitimate ones to trick people into sharing information they shouldn’t with outsiders, so Lockdown Mode restricts some incoming communication types to people the user has previously contacted. This also doubles as protection from harassment and the torrents of unwanted communications doxing victims are subjected to.
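As a purely illustrative toy (this is not Apple’s implementation; the type and handles below are invented), the rule being described boils down to an allowlist of people the user has previously reached out to:

```swift
// Toy sketch of the "only people you've previously contacted" rule described
// above. NOT Apple's actual implementation, just the idea in miniature.
struct ContactHistory {
    // Handles (emails/phone numbers) the user has reached out to themselves.
    private var previouslyContacted: Set<String> = []

    mutating func recordOutgoingContact(to handle: String) {
        previouslyContacted.insert(handle)
    }

    func shouldAcceptIncomingRequest(from handle: String) -> Bool {
        // Requests from unknown initiators are blocked, which also blunts
        // harassment campaigns built on floods of unsolicited requests.
        previouslyContacted.contains(handle)
    }
}

var history = ContactHistory()
history.recordOutgoingContact(to: "friend@example.com")
print(history.shouldAcceptIncomingRequest(from: "friend@example.com"))   // true
print(history.shouldAcceptIncomingRequest(from: "stranger@example.com")) // false
```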
This is not going to be a static feature; the exact details will evolve over time as the security community in general, and Apple in particular, learn about new threats. But for now, this is what Apple are promising:
- Messages: Most message attachment types other than images are blocked. Some features, like link previews, are disabled.
- Web browsing: Certain complex web technologies, like just-in-time (JIT) JavaScript compilation, are disabled unless the user excludes a trusted site from Lockdown Mode.
- Apple services: Incoming invitations and service requests, including FaceTime calls, are blocked if the user has not previously sent the initiator a call or request.
- Wired connections with a computer or accessory are blocked when iPhone is locked.
- Configuration profiles cannot be installed, and the device cannot enroll into mobile device management (MDM), while Lockdown Mode is turned on.
Also, Apple are putting their money where their mouth is on this — they’re offering up to $2M for bugs that break Lockdown Mode via their bug bounty program!
Links
- Apple’s press release announcing the feature: Apple expands industry-leading commitment to protect users from highly targeted mercenary spyware — www.apple.com/…
- Apple Adds Lockdown Mode to Protect Activists and Government Targets — tidbits.com/…
- Why Lockdown mode from Apple is one of the coolest security ideas ever — arstechnica.com/…
- Apple will pay you $2 million if you can break its new ‘Lockdown Mode’ — www.imore.com/…
❗ Action Alerts
- Google patches “in-the-wild” Chrome zero-day – update now! — nakedsecurity.sophos.com/…
- If you run servers, there are two particularly important bugs you need to patch:
Worthy Warnings
Notable News
- 🇬🇧 The UK government is proposing an amendment to the UK Online Safety Bill that could require companies to implement client-side CSAM scanning like the controversial feature Apple announced and then paused deployment of last year — www.macobserver.com/…
- An important reminder of the dangers of the crypto hype: Harmony blockchain loses nearly $100M due to hacked private keys — nakedsecurity.sophos.com/…
Top Tips
- One to file away for future reference: Startup and Recovery Modes on M1 and M2 Macs – The Eclectic Light Company — eclecticlight.co/…
Interesting Insights
- 2022 CWE Top 25 Most Dangerous Software Weaknesses — cwe.mitre.org/…
- Why Passkeys Will Be Simpler and More Secure Than Passwords — tidbits.com/…
- Defending Ukraine: Microsoft’s Early Lessons from Russia’s Cyberwar — tidbits.com/…
Palate Cleansers
- From Allison: Help NASA Scientists Find Clouds on Mars — www.nasa.gov/…
- From Allison & Bart: Astronomy Picture of the Day: Saturn and the ISS — apod.nasa.gov/…
Legend
When the textual description of a link is part of the link, it is the title of the page being linked to; when the text describing a link is not part of the link, it is a description written by Bart.
| Emoji | Meaning |
|---|---|
| 🎧 | A link to audio content, probably a podcast. |
| ❗ | A call to action. |
| flag | The story is particularly relevant to people living in a specific country, or the organisation the story is about is affiliated with the government of a specific country. |
| 📊 | A link to graphical content, probably a chart, graph, or diagram. |
| 🧯 | A story that has been over-hyped in the media, or, “no need to light your hair on fire” 🙂 |
| 💵 | A link to an article behind a paywall. |
| 📌 | A pinned story, i.e. one to keep an eye on that’s likely to develop into something significant in the future. |
| 🎩 | A tip of the hat to thank a member of the community for bringing the story to our attention. |
QUOTE: “1. States could subpoena corporations that hold data on menstrual cycles so as to detect pregnancies and use that data to target people who appear to become pregnant and then cease to be pregnant without giving birth. This data would include the 10-15% of pregnancies that end with a miscarriage.
2. States could subpoena corporations that hold search histories for users using abortion-related search terms. This is broader than just traditional search engines, it would include digital assistants that retain logs.”
That is fear mongering at its worst, Bart. This was not a “deep dive”; this was a purely political blurb. Sorry to hear of your plight Allison, but I remember being chastised for a less egregious political statement than what I see here in “security bits”.