Meta fined €1.2bn for mishandling user information

In the ongoing saga of Facebook wrongdoing, it’s no surprise to me that Meta, the parent company of Facebook, has been fined a record €1.2bn following “revelations that European users’ data is not sufficiently protected from US intelligence agencies when it is transferred across the Atlantic”.

The Guardian

Facebook has been ordered to suspend the transfer of user data from the EU to the US. Worryingly the ruling doesn’t apply to the other platforms in the Meta stable.


Facebook, Cambridge Analytica, and Brexit

This story from Reuters announcing “Facebook parent Meta to settle Cambridge Analytica scandal case for $725 million” has me wondering. Can Facebook be sued for its part in securing a leave win in the Brexit referendum?

It is my understanding that Dominic Cummings and Vote Leave took the information Cambridge Analytica scraped from Facebook, built detailed profiles of users, then targeted them with advertising designed to nudge voters in the 2016 referendum.

For those who think Facebook’s reach and impact is negligible, just another silly social media platform, consider this. 17,410,742 people voted to leave the European Union, compared to the 16,141,241 who voted to remain. That’s a difference of 1,269,501 people. Vote Leave won by nudging the attitudes and opinions of at least, but probably more than, 1.25 million people. The thing is, that’s only 2.82% of the 45 million people estimated to use Facebook in the United Kingdom. I also think it significant that the total electorate, those registered to vote in the referendum, was only 1,501,241 more than the total number of UK Facebook users.

Personally, I think the platform allowed itself to be used by Cummings and Vote Leave to reach and influence enough of the electorate, personally and specifically, to swing the vote their way. Facebook certainly took the money and ran the adverts without worrying about intention or means. I also think the picture is more complicated than simply targeting Facebook users with adverts that confirm an individual’s prejudices and trigger their fears. They also used targeted advertising to convince the apathetic or complacent that leave could never win, that nothing ever changes, so why bother voting at all. Turnout for the referendum was 72.2%, comparatively high when seen against the 2019 General Election at 67.3%. That’s a difference of 4.9%. Significant, in a conspiratorial kind of way, when you realise Vote Leave only won with a majority of 4%.

If you don’t believe me, and why would you, could you, should you, watch Dominic Cummings explain in his own words, “Why Leave Won the Referendum”. He gave this talk at the Ogilvy Nudgestock event in 2017. Nudgestock calls itself a festival of “behavioural science and creativity” that provides “science-led evaluation and optimization of nudge strategies, ideas and campaigns designed to change perception and behaviour”.

All of this sounds to me like psychological warfare, employed against the population of the United Kingdom, for political and economic gain.

Facebook is a data collection machine

A response prompted by a Kari Paul article in The Guardian.

The Guardian

The Study app shows anyone who wants to see that Facebook is not a social network, it’s a data collection machine.

A new Facebook app will allow users to sell the company data on how they use competitors’ apps.

How does Facebook use the data it collects? I think it’s using our data against us. When I first wrote that sentence it came out as, “using it against its users”. I quickly realised that even if you don’t use Facebook, you come into contact with people who do, and Facebook knows something about you through them. When it says it’s connecting people, it really is, it’s mapping the many ways we brush against each other.

Imagine you’re walking along Piccadilly at 3.30 in the afternoon. Someone takes a picture, and posts it at 3.31. Facebook knows something about the person who posted the picture, and the location of everyone captured in the photo. What if Mark Zuckerberg was walking along Piccadilly, and at 3.32 someone spat in his face? The picture taken at 3.30 might show the assailant. It makes everyone in the picture a suspect.

Facebook gets to work cross-referencing various accounts, pulling up the latest facial recognition software. Suddenly the police are at your door, making you account for your actions between 3.15 and 3.45. You were minding your own business, but now you have to prove it, you have to prove somehow you didn’t spit in Mark Zuckerberg’s face. They’re not trying to prove you did it, you’re trying to prove you didn’t.

At this point I can hear a certain section of the population repeating a mantra, throwing it in my direction like some spunk sodden flannel, “nothing to hide, nothing to fear”. That’s not an argument, it’s an accusation. You assume I have something to hide because I don’t want to account for my whereabouts.

Now imagine the world taking a sudden turn towards the authoritarian. What if people below a certain income level aren’t allowed to walk along Piccadilly? The police are at your door, questioning you about the assault on Mark Zuckerberg, but arresting you for being too poor to be on Piccadilly.

Who knows how this technology is being used, or will be used in the future? Facebook aren’t mining data because it’s fun, they’re doing it because it’s worth something. The information they collect can be used for what? Changing your purchasing habits? Shaping what you know about the world? Influencing elections?

Facebook is not a benign force, it’s a privately owned data collection machine.

Now ask yourself: how is it being used?

Facebook ‘unintentionally uploaded’ 1.5 million people’s email addresses

Another insight into the watchers in the tower from Rob Price of Business Insider. Facebook “unintentionally” harvested 1.5 million people’s email contacts.

Price’s report exposes a contradiction from Facebook. They “disclosed to Business Insider that 1.5 million people’s contacts were collected”. This was apparently unintentional, a hangover from an earlier process that automatically uploaded new users’ email contacts.

These contacts were “fed into Facebook’s systems, where they were used to improve Facebook’s ad targeting, build Facebook’s web of social connections, and recommend friends to add”.

If that’s true, what’s the point of Facebook’s assurances that “these contacts were not shared with anyone and we’re deleting them”? The contacts have already been fed into Facebook’s systems. Deleting them makes no difference, the damage is done, the ad targeting has begun.

Facebook’s role in Brexit – and the threat to democracy

Possibly the most explosive TED talk I’ve seen, ever. Carole Cadwalladr breaks down “Facebook’s role in Brexit – and the threat to democracy”.

TED

Carole Cadwalladr’s investigations may be the most important of a generation. Her work has exposed the workings of the tower at the centre of the panopticon, the machine that manipulates democracy.

For those unfamiliar, the panopticon is an idea, a circular prison with cells that have glass walls. Watched from a central tower, compliance is teased from its tenants because they never know when they’re being watched.

Michel Foucault used it as a metaphor highlighting the way power, since the destruction of absolute monarchies, has sought to hide itself from view. If there is no focus for our anger, it’s impossible to remove the cause of our pain.

If we are tenants of the panopticon, Facebook has made themselves the warders, and they’re stressing us into compliance. What I’d like to know is: who pays them? Because whoever pays the warders calls the shots.

The biggest obstacle to finding that out is what we see when we look out of our cells. It’s not the looming black tower at the centre, but our own reflections in the glass.

We need to find ways to get a light into that tower.

Cadwalladr has gone some way to doing that. With the help of whistleblower Christopher Wylie, she was able to expose a small part of the tower’s mechanism, how the various platforms, stairs, landings, and corridors, link.

There are still questions to be answered. Where do the corridors lead, who is behind the various doors of the labyrinthine maze? I have theories, I’m sure Cadwalladr does too.

I just hope she keeps looking, because we all need her answers.

Once she does have more answers, we have to decide what we do with her revelations, because they will be revelations. Keep in mind that the structure we’re all part of is designed to have us stare like Narcissus at our own reflection. Do we have the will to see past our own image, to the structure of the tower, and what’s hidden within?

This will take great effort and the will to see it all. I think we must find both.

Facebook knew of Cambridge Analytica data misuse earlier than reported

Julia Carrie Wong in The Guardian reports that “Facebook employees were aware of concerns about improper data-gathering practices” by Cambridge Analytica months before the Guardian first reported on them in December 2015.

The plot thickens like dehydrated honey on the chin of a beetle.

A private Facebook is a supplement, not a replacement

Brian Feldman in Intelligencer reports on plans laid out by Mark Zuckerberg for a “social network that wasn’t aggressively tracking everything its users did”.

Intelligencer

I agree with the basic tenet of Feldman’s piece. A “private” Facebook doesn’t address the bad features Facebook already has.

For me the notion of a “private” Facebook is a distraction. It’s the same strategy employed by old media for decades. Faux outrage is routinely spewed by populist newspapers trying to distract us from the real issues. They function like a pickpocket pulling our attention, getting us to look at this shiny thing over here, while they steal the Apple Watch from our wrist. But distractions are just that: distractions. Sooner or later we’re going to realise our watch is gone.

The question then becomes, do we care?

So many of us seem wilfully ignorant of the manipulations we are subject to. Perhaps we accept these manipulations because the “truth” is too painful to accept. We all like to believe we have agency. Accepting that we are being manipulated removes that agency. It’s easier to accept that a “private” Facebook will give us back what they took, what we wilfully gave them, than accept we have no power in this dynamic.

I don’t think a “private” Facebook will change anything. Ephemerality doesn’t remove the ethos at the core of Facebook, an ethos that believes because they own the platform they own what we share.

It is easier to accept a shiny promise of a private network than accept that Facebook owns us, and we are but serfs to Lord Zuckerberg’s want.
