Apr 23
I needn’t tell readers of this blog what apparently came as news to a U.S. Senator: Advertising revenues keep the lights on at Facebook and account for Mark Zuckerberg’s $62.2 billion net worth.
Nor need I point out that the more targeted an advertising medium, the more valuable it is to advertisers, and that it is with targeting that Facebook shines.
With every Facebook action, and with every personality test (Which Muppet are you?), users reveal a good deal more about themselves than their fondness for kitten videos. Facebook abounds with opportunities to disclose your age, location, interests, reading choices, product preferences, religion, sexual orientation, political leanings, eating habits, TV and movie favorites, clothing preferences, music choices, favorite activities, travel habits, marital status, and more. That data is compiled, and it is sortable.
So, say your product is ideal for married vegan Trekkies who like reggae, drive a Prius, and own a dog. Facebook lets you select and show your ads only to people fitting that profile. (I’m not making that up.) That much seems to upset a lot of people, though it needn’t. Each Facebook user is a data point among billions. Advertisers aren’t interested in peering into your individual life. They’re interested in not wasting money trying to sell steaks to vegans.
Data-driven targeting benefits users, too. It cuts down the number of irrelevant ads showing up in your feed. (Yes, without it, you’d see even more irrelevant ads.) It lets you enjoy Facebook—and the oodles of content that come with it—without having to shell out. If the thought of your data being amassed creeps you out on general principle, no matter how it’s used, that’s one thing. Otherwise, Facebook data gathering is arguably helpful.
Then came Cambridge Analytica.
The seeds for trouble spilled onto rich soil shortly after academic psychologist and data scientist Aleksandr Kogan obtained a boatload of data from Facebook. He obtained it in accordance with Facebook policy, so that much wasn’t the problem. The problem was that he then turned around and gave the data, which wasn’t his to give, to British political consulting firm Cambridge Analytica.
There’s a reason Facebook is in hot water even though it was Kogan who broke the rules. “Unlike other recent privacy breakdowns,” wrote TIME’s Lisa Eadicicco earlier this month,
“… thieves or hackers did not steal information. [Facebook] actually just handed the data over, then didn’t watch where it went.” [Italics added.]
What puts Facebook in even hotter water is that Cambridge Analytica’s clients didn’t use the data to sell mac and cheese or hand soap, but to promote political causes and candidates—from Brexit, to Ted Cruz, to Donald Trump.
(Time to pause for a disclaimer: This isn’t about Brexit or Trump. It’s about data.)
The way Cambridge Analytica may have applied the data has people upset. The New York Times painted a scary picture:
One recent advertising product on Facebook is the so-called “dark post”: A newsfeed message seen by no one aside from the users being targeted. With the help of Cambridge Analytica, Mr. Trump’s digital team used dark posts to serve different ads to different potential voters, aiming to push the exact right buttons for the exact right people at the exact right times.
Imagine the full capability of this kind of “psychographic” advertising. In future Republican campaigns, a pro-gun voter whose Ocean score ranks him high on neuroticism could see storm clouds and a threat: The Democrat wants to take his guns away. A separate pro-gun voter deemed agreeable and introverted might see an ad emphasizing tradition and community values, a father and son hunting together.
In this election, dark posts were used to try to suppress the African-American vote. According to Bloomberg, the Trump campaign sent ads reminding certain selected black voters of Hillary Clinton’s infamous “super predator” line. It targeted Miami’s Little Haiti neighborhood with messages about the Clinton Foundation’s troubles in Haiti after the 2010 earthquake. Federal Election Commission rules are unclear when it comes to Facebook posts, but even if they do apply and the facts are skewed and the dog whistles loud, the already weakening power of social opprobrium is gone when no one else sees the ad you see—and no one else sees “I’m Donald Trump, and I approved this message.”
(Time for another disclaimer: This isn’t about the Republican Party, either. Examples focus on the GOP because, in the U.S., Cambridge Analytica refuses to work for other parties.)
The fear is less that dark posts might change minds and more that they might push fence-sitting minds to the message-sender’s side. Cambridge Analytica reportedly knows how to identify and push the hot buttons of large numbers of people by sending them tailored messages. If those messages present misleading or even false information, there’s pretty much no one to call them on it, because those likely to object never see them.
This, as reported by Reuters, has not helped ease concerns:
The suspended chief executive of Cambridge Analytica said in a secretly recorded video broadcast on Tuesday that his UK-based political consultancy’s online campaign played a decisive role in U.S. President Donald Trump’s 2016 election victory.
Yet some voices are skeptical.
Vox quite bluntly states, “There’s nearly no evidence these ads could change your voting preferences or behavior.”
To be sure, advertising is oft accused of persuasion power it doesn’t have. And as yet no hard data support the claim that dark posts affected the outcome of the Brexit vote or the U.S. 2016 elections. Consider, for instance, that the first U.S. politician to retain Cambridge Analytica was Ted Cruz. As you may have heard, Cruz didn’t secure the nomination.
For that matter, targeted messaging is nothing new. The only difference is that technology can amass data faster, in greater volume, and in near real-time; has sharpened marketers’ aim; and facilitates matching messages to audiences in a way never before seen.
But it’s equally true that it’s premature to dismiss claims about dark posts’ potential to influence undecideds. It may simply be that the technique is so new that there hasn’t been time to execute valid tests. We can assuredly expect those tests very soon.
On a lighter note
Shall we end on a lighter note? Here are three of my favorite questions put to Mark Zuckerberg by U.S. Senators in last week’s hearing:
Is Twitter the same as what you do? —Senator Lindsey Graham, R, South Carolina
I’m communicating with my friends on Facebook, and indicate that I love a certain kind of chocolate. And, all of a sudden, I start receiving advertisements for chocolate. What if I don’t want to receive those commercial advertisements? —Senator Bill Nelson, D, Florida
How do you sustain a business model in which users don’t pay for your service? —Senator Orrin Hatch, R, Utah (where I live). (Zuckerberg: Senator, we run ads.)
How reassuring it is to know that powerful people who don’t understand Facebook are investigating Facebook on our behalf.