The Big Ask: Zero Privacy Culture & Violence

The question I get asked most about Gamechanger, hands down, is always a variation on: “Do you think this future human society you’ve created could really do away with sexual assault?”

The short answer is yes. The medium answer is yes, if we really want it to. The long answer is … well, it’s this super tl;dr essay.

Until recently, I hedged my bets when answering this. Fiction is fiction for a reason. Even if you’re trying to believably extrapolate current trends, everything’s easier in story. So I’d talk about the imperfections of human-built systems, but also mention my heartfelt belief that men can choose not to be violent.

Getting deeper into my novel and its solution, I’d go into two ideas that are key to my optimistic future society. One of those ideas goes by the phrase “total accountability culture.” The other is “mutually assured disclosure.” Because it’s changes in how humans see things—even more so than the technology involved—that bring societal change.

Lately, though, I’ve been wondering if I needed to hedge. If my answer couldn’t just be yeah, we could do this.

This summer I listened to the latest season of Malcolm Gladwell’s podcast Revisionist History. Among other things, he had a three-week conversation with Mo Katibeh from AT&T, about the coming 5G revolution. This was ad content, sprinkled into other stories, very upbeat… it wasn’t hard-hitting journalism, if you get me. Even so, it was fascinating.

Katibeh explained that when we all move to 5G, latency in tomorrow’s digital networks will drop until a network’s response is just barely slower than human thought. Examples of what this will mean for us, he went on, included surgeons in one part of the world operating—safely and routinely—on people in another. The reaction time on remote surgical devices will be that fast.

Katibeh painted a world where everyone’s digital helpers—the apps in their watches and bikes and cars, and in the traffic cams around them—would coordinate to intervene in crashes before they happened. In Gamechanger, the exact same interventions seek to prevent human collisions—events like rape.

The technology is already all but there. All we need is someone to write the code, someone else to train the AIs to recognize everything from grooming and boundary-testing and negging to straight-up attacks… and crucially, for everyone else to agree that it’s time to consider going there.

(Oh, and there’s the small matter of considering whether privacy is less of a right and more of a privilege… and a problematic, #MeToo-enabling privilege at that.)

Every day, more of what we say and do and think and post and buy goes into corporate and public archives, possibly forever. It happens, more or less, with our permission. We sell our information for convenience. And so iTunes knows how many times I’ve rewatched Parade’s End. Google Maps can give you an alibi for my every waking moment since at least 2013.

What might we sell everything for? And what does a culture look like where you have no secrets at all? Does it have to be a horror show?

In Gamechanger, I go for a mix of utopian and creepy. (Aren’t all utopias a little creepy?)

Here’s how it works: By 2101, when the book takes place, all our communications tech is implanted. As a kid you have wearables; then you have an operation when you’re fourteen and you get a microphone set near your trachea, speakers in your ears, and cameras in your eyes. Your uplink lets you switch from an unadjusted view of the world around you to an augmented one, full of info and tags. Or you can go fully virtual, bang back some buy-in drugs, and experience other digital realities. Full-immersion VR, in other words.

In this world, unless you are silently screwing in a darkened room, you are on the record in realtime. In public, views of you are multiplied by as many people and street cameras and mobile lenses as can see you and cross-reference your locator chip. Leaving aside futuristic surgical implants, the above is really just an amplification of something that is already happening.

The second amplification, in the book, is how social media interacts with everyone’s data. If I see you chucking a disposable coffee cup into the ravine, I can send a big thumbs down to the network—a strike. So can everyone else who’s witnessed your anti-social behavior.

(Kidding! Of course there are no disposable coffee cups!)

Once I complain, the system figures out who you are and sends those strikes, if they’re valid, to the social capital arbiter, Cloudsight. If enough other witnesses agree with me, it might impact your economic well-being. Prosocial people—the good and the much-Liked, basically—get discounts on goods and services. Generally speaking, a person who wants a good quality of life and the fewest possible hassles is going to be picking up the coffee cups and boosting their Cloudsight score, rather than littering.
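For the code-curious: here’s a toy Python sketch of what that strike pipeline could look like. Every name and number in it (the witness quorum, the five-percent price nudge) is my invention for illustration, not canon from the book.

```python
# Hypothetical sketch of the book's "strike" pipeline: witnesses file
# strikes, Cloudsight (the fictional social-capital arbiter) validates
# them against a witness quorum, and the resulting score nudges prices.
from dataclasses import dataclass, field


@dataclass
class Strike:
    witness_id: str
    target_id: str
    reason: str


@dataclass
class CloudsightLedger:
    scores: dict = field(default_factory=dict)   # target_id -> social capital
    pending: dict = field(default_factory=dict)  # target_id -> witnesses so far
    quorum: int = 3  # distinct witnesses needed before a strike "lands"

    def file_strike(self, strike: Strike) -> bool:
        """Record a strike; apply it once enough distinct witnesses agree."""
        witnesses = self.pending.setdefault(strike.target_id, set())
        witnesses.add(strike.witness_id)
        if len(witnesses) >= self.quorum:
            self.scores[strike.target_id] = self.scores.get(strike.target_id, 0) - 1
            witnesses.clear()
            return True  # strike validated and applied
        return False

    def price_multiplier(self, target_id: str) -> float:
        """Prosocial folks get discounts; the much-struck pay more."""
        score = self.scores.get(target_id, 0)
        # each point of lost social capital raises prices 5%, capped at half off
        return max(0.5, 1.0 - 0.05 * score)
```

The quorum is the interesting design knob: one grudge-holder can’t tank your beer prices alone, but a bar full of annoyed witnesses can.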

Some of you may know that this, too, is something that’s already being prototyped in the real world. Here’s the article, and this one will get you right in the Orwells.

Get it straight… I am not saying the idea shouldn’t make your hair stand on end. But consider the implications if you’re in a fight. Start with the slightly comical side of things–your entire extended family tuning in from around the world to offer opinions when you’re fighting with your kid about the length of their hair or skirts.

Family scraps aside, imagine getting belligerent in a bar. All the people who want to drink in peace are giving you strikes. As things escalate, you get an alert saying there’s a drone-mounted taser, piloted by a remote conflict-analysis specialist, on its way. A couple live law enforcement giggers are also waiting to see if they’re gonna get greenlighted to haul you off to the drunk tank.

Meanwhile, the censure of the folks in the bar continues to raise the price of your next beer to a point where you’re going to be too broke to drink unless you make nice with the Internet for six months.

Okay, though. Bar brawls. Do we care? Maybe not. So, let’s level up the thought experiment one more time, and fold in smart tech and sexual violence.

Picture being a young adult with freshly minted VR implants.

One of the things you just do, as part of future high school here in 2101, is load up and learn to use a consent assessment app. Greenlight or Enthusiastic runs in the background of your heads-up display, all but forgotten, alongside your virus protection and sports scores.

But one day you meet someone, or several someones, and as things start getting romantic, the app goes from passive into active mode, patiently monitoring your realtime feed for words like Yes and No! and Ow! and More! It’s listening for your safe word, your secret “Call 911” code, sifting the transcript for gaslighting and negging, rating the physiological clues that indicate crying—because of course your mic monitors your pulse and respiration.
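If you squint, the app I’m describing is a little state machine: idle until the mood shifts, then scanning each slice of the feed for cues. Here’s a deliberately crude Python sketch. The phrase lists, pulse threshold, and method names are all mine, and a real version would use trained models rather than keyword matching.

```python
# Toy sketch of the consent-assessment app: passive until things turn
# romantic, then each utterance (plus biometrics) gets triaged into an
# action. All cue lists and thresholds are invented for illustration.
DISTRESS_CUES = {"no", "stop", "ow"}
CONSENT_CUES = {"yes", "more"}


class ConsentMonitor:
    def __init__(self, safe_word: str):
        self.safe_word = safe_word.lower()
        self.active = False

    def set_active(self, romantic_context: bool) -> None:
        """Switch from passive to active monitoring when the mood shifts."""
        self.active = romantic_context

    def assess(self, utterance: str, pulse_bpm: int) -> str:
        """Return the app's next action for one slice of the live feed."""
        if not self.active:
            return "idle"
        words = {w.strip(".,!?").lower() for w in utterance.split()}
        if self.safe_word in words:
            return "summon_support"       # the secret "call 911" code
        if words & DISTRESS_CUES or pulse_bpm > 140:
            return "warn_and_offer_exit"  # the it's-okay-to-leave text
        if words & CONSENT_CUES:
            return "stand_down"
        return "keep_listening"
```

Note the priority order: the safe word trumps everything, distress cues trump enthusiasm, and ambiguity just means the app keeps listening.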

We’re on the verge of being able to nip bike accidents in the bud, remember? Right now, here in consensus reality, we’re using machine learning to teach AI to recognize faces and predict disaster scenarios. These consent apps could be trained using real trial transcripts and attack footage, as well as forced kisses in movies and everything else rape culture has thoughtfully provided over the years. They’ll be capable of parsing a lot of nuance. They’ll have settings to accommodate our preferences and kinks.

In this imaginary romantic encounter, imagine the mood shifting—something starts to go wrong. The app prepares to pop a text into your augmented display, reminding you it’s okay to slow things down or even leave. It’s ready to summon your loved ones, turn on all the lights, and if necessary call for more robust support.

And it’s just as ready to send you a brisk, impartial warning if you’re the one pushing the boundaries of your partners’ comfort zones.

This probably sounds weird and intrusive. The idea of having a future version of Alexa beep during a romantic moment to say “You didn’t check whether they wanted that, and facial analysis indicates they’re having feels…” Sure, that might be a real blast of cold water. I mean, that’s the beef people raise with asking for consent, right? We’re taking the fun out of it? Leaching out the sexy?

Pshaw. What if we took a minute to adjust to the idea that consent was nonnegotiable—and why the fuck have we gone on so long thinking otherwise?—and that tech could intervene on certain branches of the foreplay decision tree? What if–whether there was intent to commit harm or just a failure to read the room–an app-mediated safety alert wasn’t seen as party pooping or cock blocking? What if it was about ensuring sex without fear, for all parties? What would it be like to grow up in that world?

So… that’s the imaginary How in this particular Gamechanger worldbuilding element: no privacy, logistical challenges to being violent, a world where everybody has support from software while they’re learning to do sex with new partners, and most of all widespread societal agreement on consent always, always, goddamn always.

Amazing, right? Except. There is the creepy part. What about the part where people shouldn’t necessarily have every moment of their lives, and especially every mistake or moral slip-up, on video forever, just waiting for someone clever enough to access and exploit it?

Friends, remember that we’re already more than halfway there. Your secrets are already endangered. Unless you take the trouble to police the mics and cameras in all your phones and other gadgets, unless you make your friends lock up their phones in a quiet box when you get together, unless you never selfie, don’t use your GPS, don’t play Pokemon, and you pay cash for everything, I am left to conclude that like me, you’re not terrified of losing your privacy. At most, you’re resigned. Mildly disquieted.

The problem doesn’t have to be the data. The problem is that you and I, the mildly disquieted, are end users. We’re Winston Smith. Consumers with virtually no financial or political power on any kind of grand scale. Caught in capitalism, and unable to participate fully unless we ante up our info.

But what if the answer isn’t locking the barn door? What if it’s cracking the vaults on the people we don’t know anything about—the Big Brothers, the people who do still have the privilege of privacy?

In Gamechanger, the other social adjustment is this: when those future humans talk about total accountability culture, they aren’t just talking about you and me and random average Joes. CEOs and politicians have to come clean too. Backroom deals and private boardroom meetings cease to exist. Everything’s on the public record. Everyone can log on to the room where it happens.

And, indeed, in a world where everyone is wearing surgical implants and they are on all the time, how do you have a secret meeting anyway? (I actually do answer this in the book.)

In Gamechanger our feeds all go into the Haystack, and they mostly sit there in the archive until someone ends up in a “Who said what?” type argument with a loved one and the sifting for the moral high ground begins. Though anyone can find out another person’s entire history, they aren’t necessarily going to want to. Under the mutually assured disclosure provisions, if someone is following your feeds that closely, you get a notification… and a copy of their cradle-to-current transcript.

You wanna know every single thing about that human you’re about to go on a date with? No problem: here are the files. By the way, they’ve received a date-stamped copy of your request and have an option to review your personal history too. They’re starting with whether you audited your last three romantic connections.
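In systems terms, mutually assured disclosure is just a lookup that logs itself and reciprocates. Here’s a minimal Python sketch of that idea, with every class and field name invented for illustration:

```python
# Minimal sketch of "mutually assured disclosure": anyone may pull
# another person's transcript from the Haystack, but the lookup itself
# notifies the subject and hands over the requester's transcript too.
from datetime import datetime, timezone


class Haystack:
    def __init__(self):
        self.transcripts = {}    # person_id -> cradle-to-current transcript
        self.notifications = {}  # person_id -> list of disclosure events

    def store(self, person_id: str, transcript: str) -> None:
        self.transcripts[person_id] = transcript

    def request_history(self, requester: str, subject: str) -> str:
        """Disclosure cuts both ways: the lookup itself goes on the record."""
        self.notifications.setdefault(subject, []).append({
            "when": datetime.now(timezone.utc).isoformat(),  # date-stamped
            "who": requester,
            # the subject automatically receives the requester's own history
            "their_transcript": self.transcripts.get(requester, ""),
        })
        return self.transcripts.get(subject, "")
```

The symmetry is the whole point: there’s no read without a write, so snooping always costs the snoop exactly what it costs the snooped-on.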

The result? At its best, we’d build a technologically mediated large-scale village effect, the same reality experienced by people living in itty bitty towns where everyone knows each other’s business.

Is this still something that makes you uneasy? It totally should. Getting the world to go for it would be really really hard. And is it better to imagine everyone having to drop their drawers to that extent? Sure, maybe.

But is that a harder sell, given what we’ve handed over already, than blithely imagining that the current world’s governments are somehow going to place significant limitations on the ways Facebook and Apple and Amazon and all the other players out there mine and sell the data we’ve already offered up or had stolen? How much interpersonal violence would have to disappear from the earth to make it worth each and every one of us becoming the star of our very own low-rated Truman Show?

It would take a huge change in our thinking. There would have to be no exceptions. You tell me what it would take for it to be worth it.