
#dataprotection


"Also at odds with the G7 statement is Canada’s own proposed border-security bill (C-2), which has been widely condemned by this author and numerous other rights groups for the ways it may open up transborder surveillance by foreign governments into Canada. As written, the bill might actually facilitate further transnational repression.

As my Citizen Lab colleague Kate Robertson noted in a recent analysis, Bill C-2 “contains several areas where proposed powers appear designed to roll out a welcome mat for expanded data-sharing treaties or agreements with the United States, and other foreign law-enforcement authorities.” In light of the authoritarian train wreck unfolding in the U.S., and the prospect of high-risk individuals fleeing that country for Canada, such data-sharing could conceivably become a tool of transnational repression used by our closest neighbour, not to mention other repressive regimes.

Pledges are important and the Canadian-backed G7 statement on countering transnational repression and abuse of spyware is certainly a very welcome one. But for Canada to actually translate those pledges into meaningful laws and policies will require some serious self-reckoning about how our own past and current practices are actually implicated in the very acts we have once again condemned."

theglobeandmail.com/opinion/ar

The Globe and Mail · The G7 condemned transnational repression, but will Canada meet its own commitments? · By Ronald Deibert

"EU law enforcement bodies could be capable of decrypting your private data by 2030.

This is one of the ambitious goals the EU Commission presented in its Roadmap on June 24, 2025. A plan on how the bloc intends to ensure police officers' "lawful and effective" access to citizens' data.

The Roadmap is the first step forward in the ProtectEU strategy, first unveiled in April 2025 – but privacy experts have already begun raising the alarm."

techradar.com/vpn/vpn-privacy-

TechRadar · The EU wants to decrypt your private data by 2030 · By Chiara Castro

According to a report by the Bundesrechnungshof (Germany's Federal Court of Auditors), fewer than 10% of the federal government's 100 data centres meet the #BSI minimum standards, and in times of crisis not even emergency power is guaranteed. The auditors judge the overall security level of the data centres to be "inadequate" and the state of IT security in the federal administration to be "still deficient".

No surprise, unfortunately.

heise.de/news/Bundesrechnungsh

heise online · Even emergency power is missing: poor security standards in the federal government's data centres · By Axel Kannenberg

If you know anyone in the European Union who is seriously considering buying Meta's spyglasses, please remind them that using those privacy nightmares would make them a data controller under the GDPR. They'd be responsible for ensuring full compliance with our privacy laws, which - surprise, surprise - would be impossible.

Please tell them to save themselves and everyone around them the trouble and give the money to any NGO instead.
--
#privacy #DataProtection #GDPR #Meta #BanRayBan

In response to the increasing power of America’s digital surveillance machine, WIRED asked #security and #privacy experts for their advice on hardening personal privacy protections and resisting #surveillance. Here are their recommendations:

wired.com/story/the-wired-guid

WIRED · The WIRED Guide to Protecting Yourself From Government Surveillance · By Andy Greenberg

"Google has been ordered by a court in the U.S. state of California to pay $314 million over charges that it misused Android device users' cellular data when they were idle to passively send information to the company.

The verdict marks an end to a legal class-action complaint that was originally filed in August 2019.

In their lawsuit, the plaintiffs argued that Google's Android operating system leverages users' cellular data to transmit a "variety of information to Google" without their permission, even when their devices are kept in an idle state.

"Although Google could make it so that these transfers happen only when the phones are connected to Wi-Fi, Google instead designed these transfers so they can also take place over a cellular network," they said.

"Google's unauthorized use of their cellular data violates California law and requires Google to compensate Plaintiffs for the value of the cellular data that Google uses for its own benefit without their permission.""

thehackernews.com/2025/07/goog
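The design choice the plaintiffs describe is straightforward to express on Android. Below is a minimal sketch, not Google's code, of how an app can defer passive background uploads until the device is on an unmetered connection (typically Wi-Fi), using the standard androidx.work scheduling APIs; the TelemetryUploadWorker class, the 24-hour interval, and the work name are hypothetical.

```kotlin
import android.content.Context
import androidx.work.Constraints
import androidx.work.ExistingPeriodicWorkPolicy
import androidx.work.NetworkType
import androidx.work.PeriodicWorkRequestBuilder
import androidx.work.WorkManager
import androidx.work.Worker
import androidx.work.WorkerParameters
import java.util.concurrent.TimeUnit

// Hypothetical worker that batches and uploads passive telemetry.
class TelemetryUploadWorker(context: Context, params: WorkerParameters) :
    Worker(context, params) {
    override fun doWork(): Result {
        // ... send queued analytics/telemetry to the backend ...
        return Result.success()
    }
}

fun scheduleTelemetryUploads(context: Context) {
    // Only run while the device is on an unmetered network (typically Wi-Fi),
    // so the upload never consumes the user's cellular data allowance.
    val constraints = Constraints.Builder()
        .setRequiredNetworkType(NetworkType.UNMETERED)
        .build()

    val request = PeriodicWorkRequestBuilder<TelemetryUploadWorker>(24, TimeUnit.HOURS)
        .setConstraints(constraints)
        .build()

    WorkManager.getInstance(context).enqueueUniquePeriodicWork(
        "passive-telemetry-upload",   // hypothetical unique work name
        ExistingPeriodicWorkPolicy.KEEP,
        request
    )
}
```

The complaint's argument is essentially that the equivalent constraint was omitted, so the idle-time transfers also run over metered cellular connections.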

"Billions of people worldwide use private messaging platforms like Signal, WhatsApp, and iMessage to communicate securely. This is possible thanks to end-to-end encryption (E2EE), which ensures that only the sender and the intended recipient(s) can view the contents of a message, with no access possible for any third party, not even the service provider itself. Despite the widespread adoption of E2EE apps, including by government officials, and the role of encryption in safeguarding human rights, encryption, which can be lifesaving, is under attack around the world. These attacks most often come in the form of client-side scanning (CSS), which is already being pushed in the EU, UK, U.S., and Australia.

CSS involves scanning the photos, videos, and messages on an individual’s device against a database of known objectionable material, before the content is then sent onwards via an encrypted messaging platform. Before an individual uploads a file to an encrypted messaging window, it would be converted into a digital fingerprint, or “hash,” and compared against a database of digital fingerprints of prohibited material. Such a database could be housed on a person’s device, or at the server level.

Proponents of CSS argue that it is a privacy-respecting method of checking content in the interests of online safety, but as we explain in this FAQ piece, CSS undermines the privacy and security enabled by E2EE platforms. It is at odds with the principles of necessity and proportionality, and its implementation would erode the trustworthiness of E2EE channels; the most crucial tool we have for communicating securely and privately in a digital ecosystem dominated by trigger-happy surveillance."

accessnow.org/why-client-side-

Access Now · Why client-side scanning is a lose-lose proposition · Client-side scanning (CSS) on encrypted platforms undermines people’s privacy and security by circumventing end-to-end encryption (E2EE).
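To make the matching step described above concrete, here is a minimal sketch of hash-based scanning on the sending device. It uses a plain SHA-256 fingerprint purely to stay self-contained; real CSS proposals generally rely on perceptual hashes so that resized or re-encoded copies still match, and the fingerprint list and file name below are hypothetical.

```kotlin
import java.io.File
import java.security.MessageDigest

// Hypothetical database of fingerprints of known prohibited material.
// In the proposals described above, this could sit on the device or on a server.
val knownFingerprints: Set<String> = setOf(
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"
)

// Compute a fingerprint of the outgoing file. SHA-256 keeps the sketch simple,
// but it only matches bit-identical copies.
fun fingerprint(file: File): String {
    val digest = MessageDigest.getInstance("SHA-256")
    return digest.digest(file.readBytes()).joinToString("") { "%02x".format(it) }
}

// The scan runs before the content reaches the E2EE channel: a match would be
// flagged or reported even though the message itself is end-to-end encrypted.
fun scanBeforeSending(file: File): Boolean = fingerprint(file) in knownFingerprints

fun main() {
    val outgoing = File("photo_to_send.jpg") // hypothetical outgoing attachment
    if (!outgoing.exists()) {
        println("Nothing to scan")
        return
    }
    if (scanBeforeSending(outgoing)) {
        println("Match: content intercepted before encryption")
    } else {
        println("No match: content handed to the encrypted messenger")
    }
}
```

Wherever the fingerprint database lives, the check runs on the plaintext before encryption, which is exactly why the article argues CSS circumvents the guarantees of E2EE.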

Whenever you're building new tech, please seek privacy advice right from the start. Not as a last step before deployment.

Privacy people want to make your tech better! For you, your customers, and everybody else. But they can only do so if you make them part of your project, not just some final to-do on your checklist.
--
#privacy #DataProtection #GDPR #tech

"For context, last week Facebook began showing users a prompt asking them to opt into "cloud processing," TechCrunch reported. Should you consent, this allows Facebook to grab stuff from your camera roll and upload it to Facebook's servers "on a regular basis" so it can generate recaps and "AI restylings" of your photos.

The important detail is that by opting in, Meta is asking you to agree to its AI terms, which state that, "once shared, you agree that Meta will analyze those images, including facial features, using AI." Meta would also gain the right to "retain and use" the information shared with its AI systems.

Your alarm bells should already be ringing. Any data that gets fed into an AI system runs the risk of being coughed up or reproduced in some shape or form. And asking for access to your entire camera roll so Meta's tech can "analyze" your photos is a huge and invasive escalation — it's shameless that Meta's even asking. Apparently, already using everyone's billions of Facebook and Instagram posts made since 2007 wasn't enough for Zuckerberg's tech juggernaut.

Moreover, Meta's AI terms don't make it clear if your unpublished camera roll photos it uses for "cloud processing" are safe from AI training. That's in stark contrast with the terms outlined for apps like Google Photos, the Verge noted, which explicitly state that your personal info won't be used as training data."

futurism.com/meta-sketchy-trai

Futurism · Meta Is Being Incredibly Sketchy About Training Its AI on Your Private Photos · By Frank Landymore