All Watched Over

Did Philip K. Dick predict the future of surveillance?

BY Andrew Hultkrans in Features | 12 MAR 14

‘A cage went in search of a bird.’
Franz Kafka, ‘Reflections on Sin, Pain, Hope, and the True Way’ (1917–19)1

In Philip K. Dick’s 1974 novel Flow My Tears, the Policeman Said, Jason Taverner, a pop singer and variety show host with an audience of 30 million, wakes up one day to the ultimate nightmare for celebrities like himself and social media users under the age of 40: according to the world’s databanks, he does not exist. Given that he lives in a near-future USA recast as a high-tech police state, Taverner must not only contend with bruised vanity and existential dread in his newfound status as an ‘unperson’; he also needs ID and a credible web of data trails that, in his networked world (and ours), constitute evidence of an authentic human life. ‘I can’t live two hours without my ID,’ Taverner reflects upon discovering his predicament. ‘I’ll spend the rest of my life as a slave doing heavy manual labor [...] One random check by a mobile vehicle and a crew of three. With their damn radio gear connecting them to pol-nat data central in Kansas City. Where they keep the dossiers.’2

Data central, where they keep the dossiers. I’ve a feeling it’s not in Kansas any more (though it may be in Utah). In the 21st century, where George Orwell’s centralized, all-seeing Big Brother has turned out to be a vast, interconnected network of Little Brothers, each a ‘data central’, your digital dossier is available for inspection, analysis and redistribution by a broad array of corporations and government agencies, sometimes working in tandem. Information-based surveillance of this sort had long suffered under the inelegant rubric ‘dataveillance’ until the emergence, in recent years, of a buzzword as seemingly cuddly as it is ominous: Big Data.

Since 9/11, law enforcement and intelligence agencies have made use of a particularly tantalizing artefact of Big Data previously associated with the private sector: predictive analytics. Businesses have traditionally employed predictive analytics to determine how likely you are to get seriously ill or die, pay your creditors, respond to an ad or purchase a given product, among many other metrics. But as former Assistant Attorney General and USA PATRIOT Act co-author Viet Dinh said soon after 9/11, ‘If Johnson & Johnson can use these technologies to sell soap, we should be able to use them to fight terrorism.’3

Instead of divining potential customers and forecasting their choices, these repurposed data-mining systems deploy algorithms to sift through massive data sets obtained from a wide variety of public and private sources to develop profiles of potential malefactors, sorting people into risk categories in the way insurance firms and credit bureaus do, and making inferences about future behaviour. An early system of this sort, created by the Florida company Seisint, Inc., used a ‘terrorism quotient’ to ferret out individuals who might have a ‘High Terrorist Factor’ score.4 In these schemes, standards of ‘normalcy’ are codified and automated, the flexible instinct and judgment of security professionals replaced by the digital rigidity of actuarial tables. As the website of a much-criticized, US government-proposed precursor of these systems put it: ‘Total Information Awareness of transnational threats requires keeping track of individuals and understanding how they fit into models.’5
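To make the mechanism concrete, here is a deliberately crude sketch of the kind of actuarial scoring such systems rely on: weighted features drawn from a digital dossier are summed into a single number, and the number is bucketed into fixed risk bands. Every feature name, weight and threshold below is invented for illustration; Seisint’s actual ‘terrorism quotient’ was proprietary and is not reproduced here.

```python
# A toy illustration of actuarial risk scoring. All feature names, weights
# and thresholds are hypothetical, invented for illustration only.

RISK_WEIGHTS = {
    "cash_transactions": 2.0,        # hypothetical behavioural features
    "one_way_tickets": 3.5,
    "recent_address_changes": 1.5,
    "flagged_associates": 5.0,
}

def risk_score(dossier):
    """Collapse a digital dossier into a single weighted score."""
    return sum(RISK_WEIGHTS.get(feature, 0.0) * count
               for feature, count in dossier.items())

def risk_category(score):
    """Bucket the score into fixed bands: 'normalcy' codified and automated."""
    if score >= 20.0:
        return "high"
    if score >= 10.0:
        return "elevated"
    return "low"

if __name__ == "__main__":
    dossier = {"cash_transactions": 4, "one_way_tickets": 2, "flagged_associates": 1}
    score = risk_score(dossier)
    print(f"score={score:.1f}, category={risk_category(score)}")  # score=20.0, category=high
```

The point is not the arithmetic, which is trivial, but the substitution it performs: a fixed table of weights stands in for the flexible judgment of a human investigator.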

By opening case files, so to speak, on people before they break the law, dataveillance recentralizes state power, transforming the internal self-policing of traditional panoptic surveillance into a method of policing selves before the fact, promising to realize the dream of centuries of mystical endeavours, the stuff of a thousand science-fiction novels (including some by Dick): the ability to predict and control the future. In a word: precrime.

— 

‘“In our society we have no major crimes,”
Anderton went on, “but we do have a detention
camp full of would-be criminals.”’
Philip K. Dick, ‘The Minority Report’ (1956)6

Aside from Blade Runner (1982) and A Scanner Darkly (2006), the most compelling film adaptation of a Dick narrative is Steven Spielberg’s Minority Report (2002), based on a short story positing a near-future America where mutant precognitives ‘capable of previewing future events and transferring orally that data to analytical machinery’ enable the police to prevent nearly all crimes before they occur.7 The police department, where the ‘precogs’ are housed, has been renamed the Precrime Agency. In both story and film, the precogs are connected to computers that record and make sense of their jumbled, fragmentary visions. Reverse-extrapolating Dick’s story to our present, the precogs serve as the conduits – the undersea fibre-optic cables – of Big Data, an unstructured torrent of information from countless sources (including, for them, information about the future). Inside the computers wired to the precogs, algorithms detect patterns within the data flow that wouldn’t be apparent to human beings and generate predictions that, in the parlance of spooks, result in ‘actionable intelligence’.

The story explores the theme of fate versus free will and the paradoxes of alternate futures, both of which Dick loved to ponder, but the setting his characters inhabit is typically spare. Being closer to the mid-21st century Dick was imagining in the 1950s, Spielberg and his team had the advantage of being able to extrapolate nascent real-world technologies to nightmarish extremes, resulting in the most plausible onscreen urban dystopia since Ridley Scott’s Blade Runner (based on Dick’s 1968 novel Do Androids Dream of Electric Sheep?). Cars can be shut down remotely, locked from the outside and redirected to Precrime headquarters. Small ambulatory drones called ‘spiders’ prowl through buildings at the behest of Precrime officers, performing retinal scans on all humans inside and instantly reporting back to the cops’ handheld devices. Beyond the spiders, retinal scanners are everywhere, for both security and marketing purposes, with ‘live’ billboards identifying passersby and calling out tailored messages to them based on their purchasing histories, demographic profiles and present moods. Minority Report envisions the total fusion of Customer Relationship Management and paramilitary, zero-tolerance policing – a world not unlike our own – where corporate marketing and state authority are equally omnipresent and intrusive, and where the only future available to individuals is to be compliant consumers.

Dick’s politics, such as they were, consisted of anarchistic, left-libertarian attitudes, all somewhat inchoate and arrived at, one feels, through emotional and intuitive channels rather than political study or engagement. If one were to distill Dick’s political inclinations into a phrase, it would be anti-totalitarian. He found Western corporate conglomerates – and the pressed-slacks conformity of those who staffed them – terrifying. But in his 1967 story ‘Faith of Our Fathers’, and other works, he also exhibited a deep aversion to the ideologically monolithic, individuality-effacing Maoist strains of communism. Above all, and this was typical in American science fiction of the 1950s – an era that not only generated Invasion of the Body Snatchers (1956) but two polarized, equally valid interpretations of it – Dick was worried about mass acquiescence to sameness, even if it promised eternal peace in what had been a fractious, violent world.

His work is shot through with this theme, which underscores many of his post-apocalyptic societies and off-world colonies but is most neatly encapsulated in the 1955 story ‘Service Call’, in which a repairman from the future unexpectedly arrives at a man’s home to service his ‘swibble’, a kind of telepathic bio-mechanical creature that ‘differentiated’ and swiftly liquidated anyone who harboured the ‘wrong’ ideology (the last of these being those ‘contrapersons’ who opposed the swibbles themselves). Once universal harmony had been achieved by means of orderly, biotech-assisted mass murder, swibbles were installed in every home as psychic monitors, reading minds and making ‘adjustments’ to keep people docile in endless consensus. As the repairman explains:

‘It’s absurd to wait until an individual has accepted a contrary ideology, and then hope he’ll shift away from it [...] There won’t be any more conflicts, because we don’t have any contrary ideologies [...] [It] doesn’t really matter what ideology we have; it isn’t important whether it’s Communism or Free Enterprise or Socialism or Fascism or Slavery. What’s important is that every one of us agrees completely; that we’re all absolutely loyal.’8

Dick distrusted state and corporate authority by default, and hated the brutally applied force (arrests, raids, wars) that so often emanated from them, but he feared their powers of persuasion even more. Propaganda and mind control – particularly when disseminated by ‘leaders’ or ‘businessmen’ who are themselves fake or non-existent – were perennial themes. In his story ‘The Mold of Yancy’, also from 1955, an off-world colony is lulled into a state of perfect homogeneity by the seemingly inane but subtly suggestive broadcasts of one John Edward Yancy, a conflation of Ronald Reagan and the Church of the SubGenius’s J.R. ‘Bob’ Dobbs, who, like Dobbs (and possibly Reagan), is not a real person, but a virtual personality maintained and controlled by a secret office of ‘yance-men’. As an investigative team from Earth learns, there isn’t even a sinister tyrant or oligarchic council behind the Yancy simulacrum, but merely ‘the trading syndicates that own this moon: lock, stock, and barrel’.9 Thus, an ideal form of fascism – the merging of corporate and state power – had been established without violence or intimidation:

‘Torture chambers and extermination camps were needed only when persuasion failed. And persuasion was working perfectly. A police state, rule by terror, came about when the totalitarian apparatus began to break down. The earlier totalitarian societies had been incomplete; the authorities hadn’t really gotten into every sphere of life. But techniques of communication had improved.’10

Dick was, by his own admission, paranoid (there were times when he had good reason to be) and mortally afraid of J. Edgar Hoover’s FBI and the Nixon administration, so his responses to encroaching surveillance technologies were unsurprisingly trepidatious. ‘The ultimate in paranoia is not when everyone is against you but when everything is against you,’ he annotated his 1953 story ‘Colony’. ‘Instead of “My boss is plotting against me,” it would be “My boss’s phone is plotting against me.”’11 That Jason Taverner of Flow My Tears ... isn’t recognized by any of his friends, lovers, colleagues or 30 million fans is of secondary importance; what really matters is that there is no trace of him in data central. His ego depends on human recognition; his survival and relative freedom depend on machine recognition, without which he will be captured with mechanical efficiency by paramilitary police and disappeared to a labour camp. The police aren’t plotting against Taverner, data central’s server farms are.

One of the more laughable rationalizations floated by supporters of the NSA and GCHQ in the wake of the Edward Snowden revelations boils down to: ‘They can’t be listening to everyone’s phone calls and reading everyone’s email; that would require 10 million or more staffers on duty round the clock!’ Such people have failed to notice that techniques of communication have improved. If, as is statistically likely, many of these apologists have Gmail accounts, they might ask themselves how Google is able to ‘read’ all of their (and millions of other people’s) emails in real time and deliver ‘relevant’ advertising based on the language therein. Indeed, Google’s evergreen defence against charges of creepiness in the Gmail business model is precisely that no one – which is to say no human – is reading your email. Everyone isn’t against you, everything is. As Bob Arctor, the undercover narcotics agent charged with surveilling himself in Dick’s 1977 masterpiece of paranoia and identity slippage, A Scanner Darkly, wonders about his observers: ‘Assuming there’s a “they” at all. Which may just be my imagination, the “they” watching me. Paranoia. Or rather the “it.” The depersonalized it. Whatever it is that’s watching, it is not a human [...] As silly as it is, he thought, it’s frightening.’12
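How that trick works is no mystery in outline, even if Google’s actual systems are proprietary and vastly more sophisticated than anything shown here. A minimal sketch of machine ‘reading’ at scale, with every keyword list and ad category invented for illustration, might look like this:

```python
# A minimal sketch of automated 'reading': no human looks at the message,
# but a program scans every word and maps it to an ad category. The keyword
# lists and categories are invented and bear no relation to Google's systems.

import re

AD_KEYWORDS = {
    "travel": {"flight", "hotel", "itinerary", "boarding"},
    "finance": {"mortgage", "loan", "invoice", "refinance"},
    "health": {"prescription", "clinic", "symptoms"},
}

def match_ads(email_body):
    """Return the ad categories whose keywords appear in the message text."""
    words = set(re.findall(r"[a-z]+", email_body.lower()))
    return [category for category, keywords in AD_KEYWORDS.items()
            if words & keywords]

print(match_ads("Your flight is confirmed; the hotel invoice is attached."))
# -> ['travel', 'finance']
```

Run against millions of inboxes, a routine of this sort needs no staffers at all; everything is reading your email, and no one is.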

The primary questions Dick wrestled with in his work were: ‘What is reality?’ and ‘What constitutes an authentic human being?’ The latter question was often addressed through narratives involving androids that are essentially indistinguishable from humans – stories like ‘Impostor’ and ‘Second Variety’ (both 1953) and his novel-length meditation on the topic, Do Androids Dream of Electric Sheep? He was particularly interested in the point at which machine consciousness would become so advanced that an android wouldn’t know it wasn’t human (the protagonist of ‘Impostor’, for example, or Rachael in Do Androids Dream …). Dick concluded that empathy was the last trait an android would need to acquire to be a truly seamless copy of its maker, but he was equally concerned about the inverse process: humans becoming like androids, programmed automatons devoid of free will (or, for that matter, empathy). As he wrote in his 1972 essay ‘The Android and the Human’:

‘[A]s the external world becomes more animate, we may find that we – the so-called humans – are becoming, and may to a great extent always have been, inanimate in the sense that we are led, directed by built-in tropisms, rather than leading. So we and our elaborately evolving computers may meet each other halfway. Someday a human being, named perhaps Fred White, may shoot a robot named Pete Something-or-Other, which has come out of a General Electric factory, and to his surprise see it weep and bleed. And the dying robot may shoot back and, to its surprise, see a wisp of gray smoke arise from the electric pump that it supposed was Mr. White’s beating heart. It would be rather a great moment of truth for both of them.’13

When a statistician from the big-box chain Target can proclaim in a 2012 New York Times Magazine article, ‘We’ll be sending you coupons for things you want before you even know you want them,’ we are well on our way to that moment of truth.14 Today, ‘valid’ people identify themselves by shopping (President Bush’s prime directive, soon after 9/11: ‘I ask for your continued participation and confidence in the American economy’), often using loyalty cards; ‘invalid’ people (hackers/criminals/terrorists) do their shopping anonymously and elsewhere, on the Dark Web and black market.

For the ‘valids’, every online purchase, every mindless ‘I Agree’ to an internet company’s rapacious terms of service is equivalent to signing up for a loyalty card to Big Data. As with loyalty cards from major retailers, the ‘upside’ for consumers consists of instant recognition, transactional convenience and targeted recommendations; the benefit to the corporations is a constantly updated digital dossier of the consumer that will be used to predict and guide future purchases and sold to other corporations for additional profit. Thanks to Snowden, we now know – instead of merely assuming – that this dynamic benefits government security services as well. The merger of your corporate and state dossiers is complete.

‘Any society in which people meddle in other people’s business is not a good society, and a state in which the government “knows more about you than you know about yourself,” as it is expressed in Flow My Tears, is a state that must be overthrown. It may be a theocracy, a fascist corporate state, or reactionary monopolistic capitalism or centralistic socialism – that aspect does not matter.’
Philip K. Dick, ‘If You Find This World Bad, You Should See Some of the Others’ (1977)15

Predictive analytics produces diminishing returns and, if applied to culture or law enforcement, results in a monolithic society. Such systems can only truly ‘predict’ the future if the future remains like the past. The fatal flaw in the enterprise was laid bare by the 2008 financial crisis, when quantitative models used to price mortgage derivatives and other exotic financial instruments were revealed to have been based on the assumption that housing prices would always go up. Black-swan events were simply disregarded in the models. On a more basic level, the algorithms behind Amazon that recommend other books or films to you based on your past purchases keep you constrained to the narrowcast world of your existing tastes. Novelty and imagination are soon leached out of what increasingly becomes a closed loop. As Dick annotated his 1974 time-travel story ‘A Little Something for Us Tempunauts’: ‘It is as if the increase in information brought about by such a technological achievement – information as to exactly what is going to happen – decreases true understanding.’16 The statistical tail is wagging the dog. We are being led by our models.
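The narrowing is easy to demonstrate with a toy item-to-item recommender of the general kind online stores use (an illustration of the technique only, not Amazon’s actual algorithm): because it can only surface items that co-occur with what you have already bought, whole regions of the catalogue simply never appear.

```python
# A toy co-purchase recommender illustrating the closed loop: it recommends
# only items bought alongside what you already own. Purchase histories and
# titles are invented for illustration.

from collections import Counter
from itertools import combinations

HISTORIES = [                       # hypothetical customer baskets
    {"ubik", "valis", "neuromancer"},
    {"ubik", "valis"},
    {"valis", "neuromancer"},
    {"dune", "hyperion"},
]

def co_occurrence(histories):
    """Count how often each pair of items appears in the same basket."""
    pairs = Counter()
    for basket in histories:
        for a, b in combinations(sorted(basket), 2):
            pairs[(a, b)] += 1
    return pairs

def recommend(owned, histories, top_n=3):
    """Suggest the items most often co-purchased with what is already owned."""
    scores = Counter()
    for (a, b), count in co_occurrence(histories).items():
        if a in owned and b not in owned:
            scores[b] += count
        elif b in owned and a not in owned:
            scores[a] += count
    return [item for item, _ in scores.most_common(top_n)]

print(recommend({"ubik"}, HISTORIES))   # -> ['valis', 'neuromancer']
```

A customer who starts with Ubik is steered back towards the same shelf forever; Dune and Hyperion, never co-purchased with it in this data, might as well not exist.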

‘Androidization requires obedience,’ Dick wrote in ‘The Android and the Human’. ‘And, most of all, predictability. It is precisely when a given person’s response to any given situation can be predicted with scientific accuracy that the gates are open for the wholesale production of the android life form.’17 In a twist the author would have regarded with his signature mix of humour and dread, a Philip K. Dick android has been operational since 2005, the incept date of Google Analytics.

1 Franz Kafka, The Great Wall of China, 1970, Schocken, New York, p. 165
2 Philip K. Dick, Flow My Tears, the Policeman Said, 1993, Vintage, New York, pp. 18–21
3 Jeffrey Rosen, The Naked Crowd, 2004, Random House, New York, p. 98
4 Jim DeFede, ‘Mining the Matrix’, Mother Jones, September/October 2004
5 Jeffrey Rosen, ‘The Year in Ideas: Total Information Awareness’, The New York Times, 15 December 2002
6 Philip K. Dick, Selected Stories, ed. Jonathan Lethem, 2002, Pantheon, New York, p. 229
7 Ibid., p. 243
8 Philip K. Dick, The Minority Report and Other Classic Stories, 1987, Citadel Press, New York, pp. 31–32
9 Ibid., p. 57
10 Ibid., pp. 55–56
11 Philip K. Dick, Paycheck and Other Classic Stories, 1987, Citadel Press, New York, p. 404
12 Philip K. Dick, A Scanner Darkly, 1991, Vintage, New York, p. 185
13 Philip K. Dick, The Shifting Realities of Philip K. Dick: Selected Literary and Philosophical Writings, ed. Lawrence Sutin, 1996, Vintage, New York, p. 187
14 Charles Duhigg, ‘How Companies Learn Your Secrets’, The New York Times Magazine, 16 February 2012
15 Philip K. Dick, The Shifting Realities, op. cit., pp. 250–51
16 Philip K. Dick, Afterword to The Best of Philip K. Dick, 1977, Ballantine, New York, p. 443
17 Philip K. Dick, The Shifting Realities, op. cit., p. 191

Andrew Hultkrans is a writer based in New York, USA. He is the author of Forever Changes (Bloomsbury, 2003).
