I’ve been following the whole Cambridge Analytica hysteria, like everyone, and the story arc of Marvel's Civil War II kept popping into my head.
Just to make sure, I re-read the entire thing (highly recommend it!) and sure enough: there's a very strong analogy.
Civil War II: a primer (Spoilers!)
Civil War II is the follow-up to the first Civil War, the Marvel comics event of the same name. Back then, the story revolved around accountability; this one is mainly about control. And context.
A new Inhuman named Ulysses is introduced, with the ability to, seemingly, predict the future. He predicts the arrival of Thanos, and later a cosmic destroyer — both of which come true.
(By the way, the Inhumans in general are a fascinating subject, with no small amount of transhumanist metaphors.)
In the confrontation with Thanos, War Machine is killed, and after the fight with the destroyer, Ulysses' next vision shows the Hulk killing most, if not all, of the heroes. That's when everything goes to hell.
(This isn't a new twist, though. The Tony Stark-led Illuminati had already entertained the possibility of the Hulk becoming an enemy, and they shot him into space - which kickstarted the World War Hulk storyline. That didn't end too well.)
Long story short, Bruce Banner - who made a contingency plan for the day he might 'hulk out' - is shot dead by Hawkeye, and the next of Ulysses' visions shows Miles Morales' Spider-Man seemingly killing Steve Rogers.
It's a really great read, particularly if you read all the comics, and not just the main event.
Traditionally, Civil War II is viewed as a metaphor for determinism and fate, which is a fascinating subject in itself. But since this is Starkplug, which focuses on current(ish) technology and cultural/societal changes, I have a different perspective to offer today.
A more... topical one, if you will.
Facebook, Cambridge Analytica, and privacy
I dislike Facebook as much as the next person, but for different reasons. Their business tools (the users of which are the ones actually making them their money) are sub-par, their overall strategy is flaky, and their communications are arrogant. But, most importantly, they're cowards. Facebook bends over backwards any time there's a conflict. (Others do too, but to a much lesser extent.)
There are a lot, and I mean a lot, of misunderstandings about the whole incident. I don't want to go into all of those, especially since there's an incredibly good article on Medium about it.
If you follow the Guardian or the New York Times, or any major news network, you are likely to have noticed that a company called Cambridge Analytica has been in the headlines a lot.
For me, the fascinating thing about the scandal was the lack of a bigger picture, a lack of context. And when we lose the context, we lose the possibility of control.
I often talk about context and control (or the lack thereof) being the root problem.
Having common sense would allow us to step back and look at the larger context of events and information. A basic level of common sense would also let us recognize the control that each participating party, from Facebook to Cambridge Analytica to Donald Trump or pro-Brexit politicians, is fighting for, and fighting with.
Everyone involved, from media outlets to campaign operatives to social media platforms, is at fault here.
The mainstream media, which should be educating the public about the details, became a sensationalist machine grasping at fleeting moments of attention. Even outlets like Ars Technica or The New York Times, which I generally consider trustworthy sources, got onto the hype-train. That's their fault.
What's Facebook's fault? That they didn’t educate people. I’m not trying to assign malice to this fact, just stating it. I do believe that when it comes to technologies that do disrupt a paradigm — and Facebook has certainly done that — the responsibility of educating people is on them. Facebook could’ve and should’ve done a better job at onboarding people.
But they didn't, and on top of that they - being the cowards I mentioned they are - became submissive when the old world (meaning lawmakers, and generally people who aren't capable of or interested in understanding the scope of change they're fighting against) came knocking.
I understand that being a coward makes for better business. Standing up would've made their growth a lot slower. But I cannot and will not condone sacrificing long-term morality for short-term profitability.
Last but not least: what did Cambridge Analytica do wrong? They bought data illegally. That's it, that's all of it.
The base intent isn't evil. (Not necessarily, anyway.) It's not even new: it's no different from any marketing-slash-sales tactic that has been around for a long, long time. The only difference is, with the digital tools at our disposal, we’re vastly more effective in achieving the same goal we’ve been trying to achieve for decades or centuries.
There is nothing, nothing different here from newspaper ads. Or billboards. The aim is the same: get you to do something. (Buy something, think about something in a certain way, etc.) The only difference is that big data, and the tools to make sense of it, made the process much more effective. That's why Facebook is making money: they offer a better ROI on advertising dollars, because they have algorithms that filter out the misses from the advertising process. (Most of them, anyway.)
We online marketers have been doing this for years. We survey, we observe, and we try our hardest to find the trigger (be that a phrase in the ad copy, an image, a video, whatever) that makes you do the thing we want you to do. And before us came a long, long line of the same behavior.
Cambridge Analytica acquired the data in a morally questionable way. How about the FCC's decision to allow ISPs to sell your browsing history though? That data is not only more comprehensive in its ability to predict your behavior, but also a lot less anonymous. There was a lot of uproar around that, but nothing much came of it.
My point is: the data is out there. We need serious safeguards against its misuse - and not just in the US but globally, otherwise it isn't worth a damn - but we won't get there by deleting our Facebook accounts.
And let’s be brutally honest here: if they were able to affect your behavior, it’s not their fault. That’s a job well done; not necessarily something you agree with morally, but a job well done nevertheless.
People have the right notion: change starts with us. We are responsible, individually, for the creation of such data. It's not a binary thing, though, where you can either fully submit or completely opt out. People need to learn about the consequences their actions invoke.
Thus, I'm Team Stark. Again.
(Granted, it wasn’t much of a question, I mean look at the theme of this series. But while I’m happy to be a cheerleader for all things Tony Stark, I’m also very fond of making up my own mind. I was Team Stark during Civil War because Tony had the more reasonable argument. (Both in the comics and in the movie.) And, once again, Tony is the voice of reason in Civil War II.)
What can Civil War II teach us about the Cambridge Analytica incident?
1. Captain Marvel doesn't have context. And neither do the majority of people.
Everybody, it would seem, is on Captain Marvel’s side of the conflict.
Carol Danvers wants to use Ulysses’ power to, in a very Minority Report-fashion, stop events before they happen. Tony Stark wants to examine Ulysses, figure out what makes him tick and — most importantly — figure out how his visions come to be.
As a futurist, and a genius futurist at that, Tony is the topmost authority in predicting what’s going to happen. But precisely because of that he knows that the future isn’t set, it cannot be predicted, and probability isn’t a good enough reason for punishment.
What this means in terms of the Cambridge Analytica scandal: just because we're able to - in some fashion, with a very limited scope - predict what someone else may do, it doesn't mean they will do it. Even with the right trigger, it's a hit-and-miss operation.
Those who think like Captain Marvel accept whatever they read at face value. Carol Danvers doesn't see Ulysses' visions in context. Tony keeps telling her - and everyone else - how the perspective of Ulysses can skew what those visions mean.
2. Emotions. (When it goes viral, it goes to hell.)
A lot is upended in Civil War II, with War Machine dead, She-Hulk in critical condition, and Bruce Banner killed by Hawkeye. War Machine’s death is the emotional catalyst for almost the entire event. Bruce Banner’s death is more interesting from a metaphysical standpoint.
But what really aligns with my current point is Spider-Man. (Miles Morales.)
Ulysses sees him (and, to make matters emotionally worse, everyone else does, too) standing over the impaled body of Steve Rogers, in front of the Capitol building in Washington, DC.
Carol Danvers snaps into action, and although she orders the arriving local police to stand down, it’s only to prevent a spectacle. She arrives soon after with full intention of taking Miles into custody.
Opposing her is Steve Rogers himself (but that's a whole different story), and Iron Man, arriving in a "Captain Marvel-buster" armor. That’s the final fight of the story arc, and it leaves Tony Stark in a… coma? Well, he’s dead, for all intents and purposes. (That’s where we pick up in Ironheart.)
Big data (Facebook, in this context) is Ulysses: a power whose effects we're starting to see, but whose full extent even those who do have context around it are far from comprehending.
I can empathize with the emotional responses. But I can also see the narrative the media and the actors themselves have created.
Those of us on Team Stark, however, argue for scaling the context to the point where emotions become meaningless. While we aren't immune to them, being human and all, we can focus our attention on seeking understanding, not revenge.
3. Control is with those who question and study it, not with those who submit to it.
Ulysses’ power is undeniable, just like Facebook’s and big data’s, but it needs to be examined thoroughly. That’s exactly what Tony Stark is doing in Civil War II.
Unlike Ulysses, who evolved past the reality of the Marvel universe and took his place among the cosmic entities, Facebook and Twitter and the rest of social media are here to stay. They’re here to evolve, alongside us. But they can’t do that, and we can’t do that, if the leaders of our society are unable to comprehend the scope and depth of the change we’re living through.
We need to understand, as a society, what these new “powers” mean and how to use them. We need to find new types of morality, because we cannot cling to the old one that has no answers for today’s problems. It’s a slow process, and we’re already way behind.
The only way to prevent your privacy from being violated is to forgo any and all digital — and indeed, most if not all analog — tools. By that logic we shouldn’t use credit cards, only cash. Never use metro cards, but buy tickets. Close down our bank accounts, and indeed stop using the internet and computers altogether. We’d even have to go off the grid, which we can’t do thanks to the satellites orbiting the planet. And even if you do everything to stay off the grid, it takes a single other person who doesn’t to nullify that effort. If someone really wants to follow you, they can, no matter what you do.
The proverbial toothpaste is out of the tube.
We can’t reasonably argue against our data being used — but we can control, to a great degree, that data.
The people who collected the data from those 50 million users (a comparatively low number, I might add, given the 2+ billion users on Facebook) did so with consent. They did share the data against the rules, nobody’s debating that, but they wouldn’t have had the access to begin with if people had just a little common sense.
A large majority of people only read headlines, if that much. The faintest trace of common sense is considered “too much effort”.
Yes, the amount of information surrounding us is tremendous. But we do have the tools to make processing that information easier and faster.
It’s not that we’re not capable of catching up to technology — it’s that we’re not willing to. Instead, we either blame technology or accept anything coming through it. Both are equally bad. The former leads to a Luddite movement that has no place in this day and age, and is completely impractical in any event; the latter leads to fake news and the radicalization of societies.
(On a fascinating side note, on April 1st everyone seems to be capable of telling truth from fiction, and I just can't understand how that's not the default every other day, too.)
Facebook (and Twitter, and Snapchat, and the rest) should be held responsible for their lack of effort and cowardice — but not regulated into oblivion based on the communications paradigm the lawmakers grew up with. That paradigm is done for. It’s been done for for decades.
This new paradigm, the digital age, needs new legislation, and a new cultural understanding of the capabilities of the human race.
4. Perspective isn't reality - but it can be.
I don't want to delve too deep into metaphysics, but people tend to conflate "facts" with "truth". Objective reality is a bit of a myth, because our reality is shaped by the perspective through which we view the world.
What I mean is that "facts", which are parts of the objective reality surrounding us, can likewise be used for or against our agendas. There's a big difference between data, which is analogous to facts, and information, which is the equivalent of the narratives we create from facts.
Data has no agenda. Facts have no agenda.
But information and narrative come with one. Not necessarily in a malicious, or even conscious, way, but they do nonetheless. Realizing this very simple thing can go a long way in evaluating (or even recognizing) the context around information and narratives.
The problem arises when information is treated as fact; then everything becomes skewed, along an exponential curve. And when this effect reaches a critical mass, it becomes a self-fulfilling cycle.
Big data is also a very, very blunt instrument. I see people worried about their privacy, but nobody is interested in your behavior. They’re interested in patterns. Trends. They’re not interested in individuals, not because they wouldn’t know what to do with information like that, but because they don’t have information like that. At the very best, firms like Cambridge Analytica can only deal with patterns. Large groups. (And even then it's a gamble.)
We’re very, very far from the kind of power Ulysses had, and even his power was vague and confusing for the most part. The direction is certainly pointing his way. That brings with it, of course, a lot of debate about the morality of the use of this power.
But what it doesn’t do is bring a debate about the existence of it. We're past the point of no return on that.
Do you know who Cambridge Analytica (and a host of others, from the Russian hackers to more grey and even white hat operatives) is in this conflict?
They’re Steve Rogers. The Hydra operative, who takes advantage of the conflict to serve his own agenda.
He can do that because all those people who can't be bothered to learn about the nuances of events like the Cambridge Analytica scandal will let whoever sings their tune be in charge. It doesn't matter who that is, or whether or not they're honest.
(For what it's worth, I'm loving his Hydra story. Unlike most people, who tend to romanticize Steve Rogers, I happen to think that a good story is worth tearing up his reputation.)
Like Tony Stark in Civil War II, we need to seek to understand the context in order to gain proper control over our newfound powers. We cannot take it as-is, like Captain Marvel, and exploit the emotional response, the one that’s almost always proved to be wrong in the long term, to support a shorter term agenda. (However noble that may be, or seem to be.)
And we cannot let a Hydra operative manipulate our perception of the world - which is our reality - to the point where it becomes a self-fulfilling prophecy.
The Cambridge Analytica scandal, and many more before it, showed us the inadequacy of our understanding of technology’s impact on culture and society.
But the answer cannot be to hold technological evolution back. The answer should be catching up to it.