Monday, October 17, 2016

An excerpt from a 30-page article I am completing, "The Urgency of Orwell's Nineteen Eighty-Four: Surveillance and Ideas of Big Brother"

Conclusion:
The urgency of Orwell’s novel, which has been translated into about 65 languages, rests on his depiction of extreme surveillance coupled with extreme government control.  While during the Cold War many readers took the book as a warning about the advancing Communist movement, over time it has been reframed to focus on government in the U.S.  During the “War on Terror,” and especially after the media leaks exposing worldwide surveillance by the NSA, Nineteen Eighty-Four and the Big Brother metaphor have become a chief reference for fears that mass spying puts the freedom and autonomy of individuals at risk.  While Orwell’s telescreen may not exist, other forms of monitoring seem equally pernicious as government tries to make all things visible to inspection.  The novel reminds us that the demands of official “security” can far surpass what seems legitimate, and that highly developed technological capabilities to watch people must be matched by meaningful regulation as well as by privacy protections.
I propose that Big Brother now exists in parts of the U.S. government, notably the intelligence agencies (such as the FBI and NSA). However, while surveillance and spying are a mass practice, repression and punishment remain selective.  In this sense, it is possible to say that surveillance in America has become worse than in Nineteen Eighty-Four, but the security state’s overall conduct is not as severe as what appears in Oceania.  The situation could deteriorate in the future, since very little accountability exists for government spying and few current efforts restrict the growing trend of enhanced surveillance.  As a leading example, when the U.S. Congress passed the USA Freedom Act in 2015 to reform the NSA, it largely ratified mass surveillance by keeping intact most NSA systems of surveillance. As the idea of Big Brother takes shape on a global level, it is useful to look at the original work of fiction that inspired it to gain insight into the ways that surveillance can become a threat to civil society.

Friday, September 9, 2016

The First Domestic Mass Data Mining Systems

Mass data mining was first developed by the Federal Bureau of Investigation (FBI) during the late 1990s and supervised by its Data Intercept Technology Unit. The first program, whose name remains a secret, debuted in 1996 and was replaced by the programs known as Omnivore in 1997 and Carnivore in 1999. (Carnivore later was renamed DCS-1000.) Carnivore was “packet sniffer” software: it captured the e-mail messages of a specific target from the e-mail traffic that traveled through the network of an Internet service provider (ISP). In 2000, when the program was publicly disclosed, the FBI said it “chews all the data on the network, but it only actually eats the information authorized by the court order.”[i] In another statement of this view, an assistant FBI director told Congress in 2000: “[Carnivore] does NOT search through the contents of every message and collect those that contain certain key words like ‘bomb’ or ‘drugs.’ It selects messages based on criteria expressly set out in the court order, for example, messages transmitted to or from a particular account or to or from a particular user.”[ii] So before 9/11, the FBI said the program did not conduct mass data mining, although it had that capability.
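The selection logic the FBI described can be sketched in a few lines: examine every message flowing past the tap, but retain only those matching criteria set out in a court order, such as a particular account. This is a hypothetical illustration, not the actual Carnivore software; all names and data structures are invented.

```python
# Illustrative sketch of court-order-based message selection, as the FBI
# described Carnivore's operation. All accounts and messages are invented.

COURT_ORDER_CRITERIA = {"target@example.com"}  # accounts named in the order

def select_messages(traffic, criteria):
    """Keep only messages sent to or from an account named in the order;
    everything else is examined but discarded."""
    selected = []
    for message in traffic:
        if message["from"] in criteria or message["to"] in criteria:
            selected.append(message)
    return selected

traffic = [
    {"from": "alice@example.com", "to": "bob@example.com", "body": "lunch?"},
    {"from": "target@example.com", "to": "carol@example.com", "body": "hello"},
]

# Only the second message matches the court-order criteria.
selected = select_messages(traffic, COURT_ORDER_CRITERIA)
```

On this account, the program "chews" all traffic but "eats" only the matching subset; the dispute, as the following paragraphs show, is over what happens when the selection step no longer narrows the take.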

Carnivore became a mass surveillance tool, bypassing the warrant requirement to search records, sometime after the attacks of September 11, 2001. I have tried to identify exactly when this changeover to mass surveillance of phone and internet records occurred, but it is difficult to pinpoint based on the current state of public knowledge. To be sure, FBI mass data-mining efforts are authorized under the USA Patriot Act (2001) and the FBI Guidelines for investigation (2002) issued by the U.S. Justice Department. Carnivore (or DCS-1000) was used until 2005 and replaced by a program known as NarusInsight. The technology writer Declan McCullagh writes that NarusInsight “can record all Internet traffic, including Web browsing--or, optionally, only certain subsets such as all e-mail messages flowing through the network. Interception typically takes place inside an Internet provider's network at the junction point of a router or network switch.” According to Kevin Bankston, a staff attorney at the Electronic Frontier Foundation, "What they're doing is intercepting everyone and then choosing their targets." This approach is called “full-pipe” surveillance, and critics have questioned its legality.[iii]
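The architectural shift Bankston describes can be sketched by reversing the order of the two steps: under "full-pipe" surveillance, everything crossing the tap is recorded first, and targets are chosen afterward by querying the stored archive. This is a hypothetical illustration of the critics' description, not any actual NarusInsight interface; all identifiers are invented.

```python
# Illustrative sketch of "full-pipe" collection as critics describe it:
# record everything first, select targets later. All data is invented.

archive = []  # everything seen at the router/switch junction point

def record_all(packet):
    """Step 1: intercept and store every packet, targeted or not."""
    archive.append(packet)

def query_archive(account):
    """Step 2: choose a target after the fact by searching the archive."""
    return [p for p in archive if account in (p["from"], p["to"])]

for packet in [
    {"from": "dave@example.com", "to": "erin@example.com"},
    {"from": "erin@example.com", "to": "frank@example.com"},
]:
    record_all(packet)

# Only now is a target chosen -- the archive already holds everyone's traffic.
hits = query_archive("erin@example.com")
```

The legal objection turns precisely on this inversion: the filtering that once happened before collection now happens after it, so the interception itself sweeps in everyone.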


[i] E. Judson Jennings, “Carnivore: US Government Surveillance of Internet Transmissions,” Virginia Journal of Law and Technology 6:10 (Summer 2001). See also Talitha Nabbali and Mark Perry, “Going for the Throat: Carnivore in an Echelon World - Part I,” Computer Law and Security Report Vol. 19 no. 6 (2003): 460.

[ii] Oversight Hearings on “Fourth Amendment Issues Raised by the FBI’s ‘Carnivore’ Program,” Before the House Committee on the Judiciary, Subcommittee on the Constitution, 106th Cong. (2000) (statement of Donald M. Kerr, Assistant Director, Federal Bureau of Investigation), July 24, 2000 (quoted in Jennings, “Carnivore”).

[iii] Declan McCullagh, “FBI Turns to Broad New Wiretap Method,” ZDNet, January 30, 2007; Richard Koman, “FBI ‘Full-Pipe’ Surveillance May Be Illegal,” ZDNet, January 31, 2007.

Wednesday, June 8, 2016

What Does Your "Metadata" Say?


In 2005, Washington Post journalist Robert O’Harrow published a popular book on mass data-mining entitled No Place to Hide.  He identified new ways in which industry and government, working both separately and in collaboration, collect vast amounts of personal information on Americans.  The “War on Terror” had accelerated the rise of a data-driven surveillance society. The book received widespread notice. The conservative columnist William Safire wrote in the New York Times: “The computer's ability to collect an infinity of data about individuals -- tracking every movement and purchase, assembling facts and traits in a personal dossier, forgetting nothing -- was in place before 9/11. But among the unremarked casualties of that day was a value that Americans once treasured: personal privacy.” The idea that individuals could retain a sphere that is “nobody's business” had rapidly disintegrated.  A new “big business of everybody's business” had become the order of the day.
            Meanwhile, liberal law professor Geoffrey R. Stone, after reading O’Harrow, raised an existential question: “Once we understand that our every move is being tracked, monitored, recorded and collated, will we retain our essential sense of individual autonomy and personal dignity?” Where do people retreat if there is no place to hide?  There are also serious risks inherent in the construction of new data-based dossiers: data error, stolen data, and unintended public disclosure of data.  Finally, as Stone notes, government may use its data collection capability “to suppress dissent and impose conformity.”  Despite official claims that data-mining promotes security, “history teaches that once government has such information, it will inevitably use it to harass and silence those who question its policies.”
            To be sure, O’Harrow was not the first writer to tackle this important subject matter.  Almost a decade earlier, academic specialists David Lyon and Elia Zureik edited an important book, Computers, Surveillance, and Privacy (1996), in which they had identified the new issue of “dataveillance.” For example, one of the authors in that volume, Colin J. Bennett, wrote:
 Mass dataveillance begins with no a priori knowledge of the individual(s) who may warrant attention. Its aim is to screen groups of people with a view to finding those worth subjecting to personal dataveillance. It is based on a general rather than specific suspicion, but also tries to deter or constrain behavior. All forms of computer matching are mass dataveillance techniques. They all involve the aggregate comparison of different data systems to identify those ‘hits’ that prima facie warrant further investigation.
Today, mass data-mining affects more Americans than ever before.  This is so because electronic records have widely displaced paper records, and electronic communications are now prevalent in many spheres of both our personal and public lives.  More than 90 percent of Americans use cell phones.  The Internet has spread across the landscape, transcending boundaries of race, gender, and class. In 2013, more than 85 percent of the nation’s population regularly went online.  More than half of the entire American adult population uses online social networking sites.  U.S. authorities also ask online service companies for account information on thousands of individuals.  To some extent, the concern that too much data now exists to make sense of it all is relevant.  The common complaint -- “drowning in data but starving for knowledge” -- poses challenges for government data-mining, but the official development of more efficient systems for record matching and sorting promises to keep pace with the explosion of information.
            Of course, not all data-mining is nefarious. It can be an effective tool for scientists and other researchers, who refer to it as “knowledge extraction” and “information harvesting.”  It builds knowledge from large sets of data by identifying patterns; it makes generalizations about future behavior based on past behavior.  Data-mining can be used for “pattern detection” to identify small departures from the norm, or unusual patterns.  As information analyst Joyce Jackson notes, “Data mining allows the automated discovery of implicit patterns and interesting knowledge that’s hiding in large amounts of data.”
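The "pattern detection" described above can be illustrated with a minimal sketch: flag values that depart sharply from the norm of a data set. This is a generic statistical illustration of anomaly detection, not any agency's actual method; the data and threshold are invented.

```python
# Minimal sketch of anomaly detection: flag values far from the norm.
# Data and threshold are invented for illustration.
import statistics

def find_anomalies(values, threshold=3.0):
    """Return values lying more than `threshold` standard deviations from
    the mean -- small departures from the norm go unflagged."""
    mean = statistics.mean(values)
    stdev = statistics.pstdev(values)
    return [v for v in values if abs(v - mean) > threshold * stdev]

# 99 routine daily transactions around $50, plus one large outlier.
daily_spending = [50 + (i % 10) for i in range(99)] + [5000]
anomalies = find_anomalies(daily_spending)  # flags only the outlier
```

The generalization step works the same way in reverse: past patterns define the "norm" against which future behavior is scored.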
            But while data-mining proves useful in some fields, its application to the “War on Terror” is at best dubious. There is no way that patterns discerned from data analysis can predict political violence.  What may appear to be “suspicious” behaviors or patterns are likely anomalies -- oddities or peculiarities with little discernible meaning.  Using anomalies to create a suspect list is deeply flawed.  As Jim Harper of the Cato Institute concludes:
First, terrorist acts and their precursors are too rare in our society for there to be patterns to find. There simply is no nugget of information to mine.
Second, the lack of suitable patterns means that any algorithm used to turn up supposedly suspicious behavior or suspicious people will yield so many false positives as to make it useless. A list of potential terror suspects generated from pattern analysis would not be sufficiently targeted to justify investigating people on the list.
A major study conducted by the National Research Council confirms this analysis.  The report, ironically funded by the U.S. Department of Homeland Security, offers a blistering attack on the effectiveness of data-mining for terrorism discovery. “Automated identification of terrorists through data mining (or any other known methodology) is neither feasible as an objective nor desirable as a goal of technology development efforts," the report found. "Even in well-managed programs, such tools are likely to return significant rates of false positives, especially if the tools are highly automated." A false positive -- that is, erroneously identifying someone as a terrorist suspect -- can have disastrous consequences for individuals.  It can lead to major privacy intrusions, as well as targeted surveillance and harassment in everyday life if security agencies decide to “neutralize” subjects. False positives can lead to individuals “being in trouble with the government” for no legitimate reason.      
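The false-positive problem Harper and the National Research Council describe follows from simple base-rate arithmetic, which a short calculation makes concrete. The numbers here are invented for illustration: suppose 3,000 actual plotters in a population of 300 million, screened by an algorithm that classifies people correctly 99 percent of the time.

```python
# Worked base-rate illustration of the false-positive problem.
# All figures are hypothetical.

population = 300_000_000
actual_threats = 3_000
accuracy = 0.99  # rate of correctly classifying threats and non-threats alike

true_positives = actual_threats * accuracy                  # threats flagged
false_positives = (population - actual_threats) * (1 - accuracy)  # innocents flagged

# Positive predictive value: share of flagged people who are actual threats.
ppv = true_positives / (true_positives + false_positives)
print(f"{false_positives:,.0f} innocents flagged; PPV = {ppv:.4%}")
```

Even with 99 percent accuracy, roughly three million innocent people are flagged and fewer than one in a thousand people on the list is an actual threat, which is the sense in which such a list is "not sufficiently targeted to justify investigating people on it."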
            So government efforts to sort through mass data to discover preparation and planning for terrorism are a waste of resources. By contrast, data-mining is very effective at identifying people and groups involved in dissident politics.  Both the FBI and NSA can sort through billions of records to find patterns of expression critical of government. Once the FBI locates subjects to neutralize, it can use data-mining directed at specific individuals to maximize its intelligence operations.  The National Research Council reports:

"Once an individual is under strong suspicion of participating in some kind of terrorist activity, it is standard practice to examine that individual’s financial dealings, social networks, and comings and goings to identify coconspirators, for direct surveillance, etc. Data mining can expedite much of this by providing such information as (1) the names of individuals who have been in e‑mail and telephone contact with the person of interest in some recent time period, (2) alternate residences, (3) an individual’s financial withdrawals and deposits, (4) people that have had financial dealings with that individual, and (5) recent places of travel."

Wednesday, April 13, 2016

Some Current Writing Projects

I am working on several writing projects:
  • A graphic novel on the history of surveillance to be published in 2017.
  • Forms of monitoring in George Orwell's Nineteen Eighty-Four.
  • Political policing in an Oligarchy.
  • Some thoughts on "authorization."
  • Government surveillance of ethnic groups in the U.S.