Monday, October 17, 2016

An excerpt from a 30-page article I am completing, "The Urgency of Orwell's Nineteen Eighty-Four: Surveillance and Ideas of Big Brother"

Conclusion:
The urgency of Orwell’s novel, which has been translated into about 65 languages, rests on his depiction of extreme surveillance coupled with extreme government control.  While during the Cold War many readers felt the book served as a warning about the advancing Communist movement, it has been reframed over time to focus on government in the U.S.  During the “War on Terror,” and especially after the media leaks exposing worldwide surveillance by the NSA, Nineteen Eighty-Four and the Big Brother metaphor have become a chief reference point for fears that mass spying puts the freedom and autonomy of individuals at risk.  While Orwell’s telescreen may not exist, other forms of monitoring seem equally pernicious as government tries to make all things visible to inspection.  The novel reminds us that the demands of official “security” can far surpass what seems legitimate, and that highly developed technological capabilities to watch people must be matched by meaningful regulation as well as by privacy protections.
I propose that Big Brother now exists in parts of the U.S. government, notably the intelligence agencies (such as the FBI and NSA). However, while surveillance and spying are a mass practice, repression and punishment remain selective.  In this sense, surveillance in America has become worse than in Nineteen Eighty-Four, but the security state’s overall conduct is not as severe as what appears in Oceania.  The situation could deteriorate in the future, since very little accountability exists for government spying and few efforts currently restrict the growing trend of enhanced surveillance.  As a leading example, when the U.S. Congress passed the USA Freedom Act in 2015 to reform the NSA, it largely ratified mass surveillance by keeping intact most NSA systems of surveillance. As the idea of Big Brother takes shape on a global level, it is useful to look at the original work of fiction that inspired it to gain insight into the ways that surveillance can become a threat to civil society.

Friday, September 9, 2016

The First Domestic Mass Data Mining Systems

Mass data mining was first developed by the Federal Bureau of Investigation (FBI) during the late 1990s and supervised by its Data Intercept Technology Unit. The first program, whose name remains a secret, debuted in 1996 and was replaced by the programs known as Omnivore in 1997 and Carnivore in 1999. (Carnivore later was renamed DCS-1000.) Carnivore was “packet sniffer” software: It captured the e-mail messages of a specific target from the e-mail traffic that traveled through the network of an Internet service provider (ISP). In 2000, when the program was publicly disclosed, the FBI said it “chews all the data on the network, but it only actually eats the information authorized by the court order.”[i] In another statement of this view, an assistant FBI Director told Congress in 2000: “[Carnivore] does NOT search through the contents of every message and collect those that contain certain key words like ‘bomb’ or ‘drugs.’ It selects messages based on criteria expressly set out in the court order, for example, messages transmitted to or from a particular account or to or from a particular user.”[ii] So before 9/11, the FBI said the program did not conduct mass data mining, although it had that capability.
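The filtering logic the FBI described can be illustrated with a short sketch. Carnivore's actual code was never released, so everything below -- the function, the message format, and the addresses -- is hypothetical, meant only to show the difference between seeing all traffic and retaining only court-authorized messages.

```python
# Hypothetical illustration of court-order filtering: the sniffer sees every
# message on the wire but retains only those sent to or from an authorized
# target account. Not Carnivore's actual code, which remains unreleased.

def court_order_filter(messages, authorized_accounts):
    """Retain only messages to or from an account named in the court order."""
    return [
        m for m in messages
        if m["from"] in authorized_accounts or m["to"] in authorized_accounts
    ]

# Simulated ISP traffic (all addresses invented).
traffic = [
    {"from": "target@example.com", "to": "friend@example.com"},
    {"from": "alice@example.com", "to": "bob@example.com"},
    {"from": "carol@example.com", "to": "target@example.com"},
]

kept = court_order_filter(traffic, {"target@example.com"})
# Only the two messages involving the target account are retained;
# the alice-to-bob message is seen but discarded.
```

The point of contention, of course, is that the software must "chew" everyone's traffic in order to "eat" only the authorized portion.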

Carnivore became a mass surveillance tool, bypassing the warrant requirement to search records, sometime after the attacks of September 11, 2001. I have tried to identify exactly when this changeover to mass surveillance of phone and internet records occurred, but it is difficult to pinpoint based on the current state of public knowledge. To be sure, FBI mass data-mining efforts are authorized under the USA Patriot Act (2001) and the FBI Guidelines for investigation (2002) issued by the U.S. Justice Department. Carnivore (or DCS-1000) was used until 2005, when it was replaced by a program known as NarusInsight. The technology writer Declan McCullagh writes that NarusInsight “can record all Internet traffic, including Web browsing--or, optionally, only certain subsets such as all e-mail messages flowing through the network. Interception typically takes place inside an Internet provider's network at the junction point of a router or network switch.” According to Kevin Bankston, a staff attorney at the Electronic Frontier Foundation, "What they're doing is intercepting everyone and then choosing their targets." This approach is called “full-pipe” surveillance, and critics have questioned its legality.[iii]


[i] E. Judson Jennings, “Carnivore: US Government Surveillance of Internet Transmissions,” Virginia Journal of Law and Technology 6, no. 10 (Summer 2001). See also Talitha Nabbali and Mark Perry, “Going for the Throat: Carnivore in an Echelon World - Part I,” Computer Law and Security Report 19, no. 6 (2003): 460.

[ii] Oversight Hearings on “Fourth Amendment Issues Raised by the FBI’s ‘Carnivore’ Program,” Before the House Committee on the Judiciary, Subcommittee on the Constitution, 106th Cong. (2000) (statement of Donald M. Kerr, Assistant Director, Federal Bureau of Investigation), July 24, 2000 (quoted in Jennings, “Carnivore”).

[iii] Declan McCullagh, “FBI Turns to Broad New Wiretap Method,” ZDNet, January 30, 2007; Richard Koman, “FBI ‘Full-Pipe’ Surveillance May Be Illegal,” ZDNet, January 31, 2007.

Wednesday, June 8, 2016

What Does Your "Metadata" Say?


In 2005, Washington Post journalist Robert O’Harrow published a popular book on mass data-mining entitled No Place to Hide.  He identified new ways that industry and government, working separately and in collaboration, collect vast amounts of personal information on Americans.  The “War on Terror” had accelerated the growth of a data-driven surveillance society. The book received widespread notice. The conservative columnist William Safire wrote in the New York Times: “The computer's ability to collect an infinity of data about individuals -- tracking every movement and purchase, assembling facts and traits in a personal dossier, forgetting nothing -- was in place before 9/11. But among the unremarked casualties of that day was a value that Americans once treasured: personal privacy.” The idea that individuals could retain a sphere that is “nobody's business” had rapidly disintegrated.  A new “big business of everybody's business” had become the order of the day.
            Meanwhile, liberal law professor Geoffrey R. Stone, after reading O’Harrow, raised an existential question: “Once we understand that our every move is being tracked, monitored, recorded and collated, will we retain our essential sense of individual autonomy and personal dignity?” Where do people retreat if there is no place to hide?  There are also serious risks inherent in the construction of new data-based dossiers: data error, stolen data, and unintended public disclosure.  Finally, as Stone notes, government may use its data collection capability “to suppress dissent and impose conformity.”  Despite official claims that data-mining promotes security, “history teaches that once government has such information, it will inevitably use it to harass and silence those who question its policies.”
            To be sure, O’Harrow was not the first writer to tackle this important subject.  Almost a decade earlier, academic specialists David Lyon and Elia Zureik edited an important book, Computers, Surveillance, and Privacy (1996), which identified the then-new issue of “dataveillance.” For example, one of the authors in that volume, Colin J. Bennett, wrote:
 Mass dataveillance begins with no a priori knowledge of the individual(s) who may warrant attention. Its aim is to screen groups of people with a view to finding those worth subjecting to personal dataveillance. It is based on a general rather than specific suspicion, but also tries to deter or constrain behavior. All forms of computer matching are mass dataveillance techniques. They all involve the aggregate comparison of different data systems to identify those ‘hits’ that prima facie warrant further investigation.
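The "computer matching" Bennett describes can be sketched in a few lines: two unrelated data systems are compared in aggregate, and the overlap becomes the list of 'hits.' The record identifiers below are invented for illustration.

```python
# Minimal sketch of mass dataveillance by computer matching: no prior
# suspect exists; every record in two unrelated data systems is compared,
# and the matches become the 'hits' that warrant further investigation.

welfare_rolls = {"ssn-001", "ssn-002", "ssn-003"}
payroll_records = {"ssn-002", "ssn-004", "ssn-005"}

# Set intersection performs the aggregate comparison in one step.
hits = welfare_rolls & payroll_records
print(sorted(hits))  # ssn-002 appears in both systems
```

Note that the technique starts from the entire population of records, not from any individualized suspicion -- exactly the inversion Bennett identifies.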
Today, mass data-mining affects more Americans than ever before.  This is so because electronic records have widely displaced paper records, and electronic communications are now prevalent in many spheres of both our personal and public lives.  More than 90 percent of Americans use cell phones.  The Internet has spread across the landscape, transcending boundaries of race, gender, and class. In 2013, more than 85 percent of the nation’s population regularly went online.  More than half of the entire American adult population uses online social networking sites.  U.S. authorities also ask online service companies for account information on thousands of individuals.  To some extent, the concern that too much data now exists to make sense of it is relevant.  The common complaint -- “drowning in data but starving for knowledge” -- poses challenges for government data-mining, but the official development of more efficient systems for record matching and sorting promises to keep pace with the explosion of information.
            Of course, not all data-mining is nefarious. It can be an effective tool for scientists and other researchers, who refer to it as “knowledge extraction” and “information harvesting.”  It builds knowledge from large sets of data by identifying patterns; it makes generalizations about future behavior based on past behavior.  Data-mining can be used for “pattern detection” to identify small departures from the norm, or unusual patterns.  As information analyst Joyce Jackson notes, “Data mining allows the automated discovery of implicit patterns and interesting knowledge that’s hiding in large amounts of data.”
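The "pattern detection" idea Jackson describes -- flagging small departures from the norm -- can be illustrated with a simple statistical sketch. This is a generic z-score test on invented numbers, not any agency's actual method.

```python
# Generic anomaly detection: flag values that lie more than a given number
# of standard deviations from the mean of the series. Purely illustrative.
import statistics

def flag_anomalies(values, threshold=2.0):
    """Return the values whose z-score exceeds the threshold."""
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)
    return [v for v in values if abs(v - mean) / stdev > threshold]

daily_logins = [5, 6, 5, 7, 6, 5, 40, 6]  # one unusual spike
print(flag_anomalies(daily_logins))  # the spike of 40 is flagged
```

The technique finds departures from a statistical norm; whether such a departure means anything is exactly the question raised below.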
            But while data-mining proves useful in some fields, its application to the “War on Terror” is at best dubious. There is no way that patterns discerned from data analysis can predict political violence.  What may appear to be “suspicious” behaviors or patterns are likely anomalies: oddities or peculiarities with little discernible meaning.  Using anomalies to create a suspect list is deeply flawed.  As Jim Harper of the Cato Institute concludes:
First, terrorist acts and their precursors are too rare in our society for there to be patterns to find. There simply is no nugget of information to mine.
Second, the lack of suitable patterns means that any algorithm used to turn up supposedly suspicious behavior or suspicious people will yield so many false positives as to make it useless. A list of potential terror suspects generated from pattern analysis would not be sufficiently targeted to justify investigating people on the list.
A major study conducted by the National Research Council confirms this analysis.  The report, ironically funded by the U.S. Department of Homeland Security, offers a blistering attack on the effectiveness of data-mining for terrorism discovery. “Automated identification of terrorists through data mining (or any other known methodology) is neither feasible as an objective nor desirable as a goal of technology development efforts," the report found. "Even in well-managed programs, such tools are likely to return significant rates of false positives, especially if the tools are highly automated." A false positive -- that is, erroneously identifying someone as a terrorist suspect -- can have disastrous consequences for individuals.  It can lead to major privacy intrusions, as well as targeted surveillance and harassment in everyday life if security agencies decide to “neutralize” subjects. False positives can lead to individuals “being in trouble with the government” for no legitimate reason.      
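The false-positive problem that Harper and the National Research Council describe follows from simple base-rate arithmetic. The numbers below are hypothetical and deliberately generous to the algorithm: even at an accuracy no real system approaches, flagged innocents swamp flagged plotters.

```python
# Hypothetical base-rate calculation: 100 actual plotters in a population
# of 300 million, screened by an algorithm assumed to be 99% accurate in
# both directions (a far better rate than any real system achieves).

population = 300_000_000
true_plotters = 100
accuracy = 0.99

true_positives = true_plotters * accuracy                        # ~99 plotters flagged
false_positives = (population - true_plotters) * (1 - accuracy)  # ~3 million innocents flagged

print(int(true_positives), int(false_positives))
# Flagged innocents outnumber flagged plotters by roughly 30,000 to 1,
# so virtually everyone on the generated suspect list is innocent.
```

Lowering the rarity of the target event, not improving the algorithm, is the only way out of this arithmetic, which is why data-mining works for common behaviors but not for terrorism.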
            So government efforts to sort through mass data to discover preparation and planning for terrorism are a waste of resources. By contrast, data-mining is very effective at identifying people and groups involved in dissident politics.  Both the FBI and NSA can sort through billions of records to find patterns of expression critical of government. Once the FBI locates subjects to neutralize, it can use data-mining directed at specific individuals to maximize its intelligence operations.  The National Research Council reports: 

"Once an individual is under strong suspicion of participating in some kind of terrorist activity, it is standard practice to examine that individual’s financial dealings, social networks, and comings and goings to identify coconspirators, for direct surveillance, etc. Data mining can expedite much of this by providing such information as (1) the names of individuals who have been in e‑mail and telephone contact with the person of interest in some recent time period, (2) alternate residences, (3) an individual’s financial withdrawals and deposits, (4) people that have had financial dealings with that individual, and (5) recent places of travel."

Wednesday, April 13, 2016

Some Current Writing Projects

I am working on several writing projects:
  • A graphic novel on the history of surveillance to be published in 2017.
  • Forms of monitoring in George Orwell's Nineteen Eighty-Four.
  • Political policing in an Oligarchy.
  • Some thoughts on "authorization."
  • Government surveillance of ethnic groups in the U.S.

Friday, November 27, 2015

A new book, Facing the Future: Paintings, 2007-2010

I wear several different hats.  Besides being an engaged writer on surveillance, social movements, and civil liberties, I am a visual artist (painter).  A new book featuring my art, completed from 2007 through 2010, has been published by Cyberwit.

Thursday, November 12, 2015

Two Book Chapters are Forthcoming

"A White Man in the Colored Bronx" in Teresa A. Booker, ed., Public Space, Public Policy and Public Understanding of Race and Ethnicity in America: An Interdisciplinary Approach (University of Akron Press, 2016).

"Is Mass Surveillance an Effective Tool in Fighting Terrorism?" in Richard Jackson and Daniela Pisoiu, eds., Contemporary Debates on Terrorism, Vol. 2 (Routledge, 2016).

Sunday, May 31, 2015

I wrote a chapter ("The State Response to Occupy: Surveillance and Suppression") in this new book, What Comes After Occupy? edited by Todd A. Comer

http://www.cambridgescholars.com/what-comes-after-occupy

From the Introduction to the book:

The last two essays described above mined the Internet in order to get a better sense of Occupy’s demographic and organizing shortcomings with the ultimate goal of improving future social movements. Ivan Greenberg, in a complementary fashion, in our final essay on “The State Response to Occupy: Surveillance and Suppression” shows in detail how the federal government and its lackeys also mined social networking in order to obstruct radical politics in multiple U.S. cities. He then argues that “[e]stablishing a thick historical record is vital in order to analyze the movement's strengths and limitations . . . Police and intelligence records can add specificity and historical consciousness about what the movement represented to official power and the threat it posed to remake society.” Greenberg ends by arguing that protestors will need, increasingly, to “occupy surveillance” to ensure an effective politics in the future. Surveillance systems are not coherent, seamless tools of power; we need to locate weaknesses, and find ways to undermine surveillance in tactical ways: “Overall,” he writes, “the new reality of the surveillance society is sobering, and the tenacity of the American state to protect its practices should not be underestimated.” Still, he writes, with the advent of Edward Snowden, a critical movement is beginning to take shape to combat excessive government surveillance of local cities.

Friday, March 20, 2015

New article just published, "From Surveillance to Torture: The Evolution of US Interrogation Practices during the War on Terror"

From Surveillance to Torture: The Evolution of US Interrogation Practices during the War on Terror

Ivan Greenberg
Security Journal, DOI: 10.1057/sj.2015.7
Print and Online Publication Date: 16 Mar 2015

Sunday, December 28, 2014

A new year brings new published writings. Forthcoming publications in 2015.


“From Surveillance to Torture: The Evolution of U.S. Interrogation Practices during the War on Terror,” Security Journal (forthcoming Spring 2015).

“Proletarian Literature from the Bottom Up: Workers and Poetry during the Rise of the CIO,” American Quarterly (forthcoming June 2015).

“Postage and Power: U.S. Nationalism and the 1970s ‘Bicentennial’ and ‘Americana’ Stamp Series,” Journal of Social History (forthcoming Fall 2015).

“The State Response to Occupy: Surveillance and Suppression,” in Todd A. Comer, ed., What Comes After Occupy? The Regional Politics of Resistance (forthcoming 2015).

“A White Man in the Colored Bronx,” in Teresa A. Booker, ed., Race and Urban Communities: An Interdisciplinary Approach (University of Akron Press, forthcoming 2015).

Saturday, July 5, 2014

Regarding FBI Official W. Mark Felt's (aka Deep Throat) Two Unpublished Manuscripts


In my recent book, Surveillance in America: Critical Analysis of the FBI, 1920 to the Present (2012), I devote a whole chapter to top FBI official W. Mark Felt, who we now know acted as Deep Throat during the Watergate crisis of 1972-1974. I detail how Felt led a secret faction at the FBI consisting of several other officials to undermine President Richard Nixon after Watergate and eventually to remove him from power.  The Felt faction worked with Washington Post reporters Bob Woodward and Carl Bernstein to break the Watergate story. However, if Felt had gone to Congress, instead of the press, to expose the Watergate break-in and associated crimes, the exposure would have occurred faster and probably would have led to Nixon losing the 1972 election.  In sum, Democratic candidate George McGovern would have been elected President.  By leaking slowly to the press, Felt helped Nixon win reelection.  
      Beyond that dramatic story, in my book I refer to two unpublished manuscripts written by Felt that have not seen the light of day.  In 1986, Felt submitted to the FBI for prepublication review and approval a book titled An Unexpected Turn of Events.  In 1990, the FBI reviewed a second Felt manuscript entitled Thirteen True Stories About the FBI.  In both cases, the FBI concluded that "no sensitive information" was revealed.

      I have heard through the academic grapevine that scholars have made efforts to locate these unpublished manuscripts and have not been able to find them. These scholars have checked my sources in my Surveillance in America book to see how I know about them. 
      Information referring to these manuscripts is contained in the declassified FBI file on Felt.  I was the first person to request and obtain this FBI file under the Freedom of Information Act (FOIA) after Felt's death.  Soon after I received the Felt file, the FBI decided to put part of it on its own website (www.FBI.gov).  Let me repeat: they put only part of it online.

      I emphasize this fact because the FBI memos referring to Felt's two unpublished manuscripts are not contained in the Felt file that is posted on the FBI website.

      The FBI sent me a much larger file on Felt than they posted online -- about twice as large.  I referred to this fact in a public lecture I gave in the summer of 2012 at the HOPE 9 conference in New York City.  My hour-long talk was videotaped and has been posted on my blog.  Here is a link to the lecture: http://www.youtube.com/watch?v=sQuuaDHiJWs.
       The mystery of where Felt's two unpublished manuscripts are located may never be resolved.  But there should be no doubt that he wrote them and that they went through the prepublication review process at the FBI.
--------------
Update -- A fellow researcher in FBI files tells me the FBI may retain copies of both of Felt's unpublished manuscripts in its files, since they underwent prepublication review.  One strategy is to file FOIA requests with the FBI on both manuscripts to see if they can be obtained through the declassification process.