Michel Foucault, who died in 1984, is widely regarded as one of the most influential philosophers of the late 20th century. He is recognized for his contributions to two major intellectual movements of the period: structuralism and post-structuralism. His academic career was crowned in 1970 with his appointment to a chair at the Collège de France, France's most prestigious research institution. The title of his chair, Professor of the History of Systems of Thought, was created specifically for him, reflecting the distinctive nature of his work, which cut across philosophy, history, and politics.
Foucault’s primary concern was the intersection of power and social change. In particular, he studied how power operated as France shifted from a monarchy to a democracy via the French Revolution. According to him, we tend to oversimplify such transitions by viewing them as an ongoing, natural, and inevitable attainment of freedom and reason. This oversimplification leads to a misunderstanding of how power actually operates in modern societies.
A key concept in Foucault’s work is the panopticon, which he uses to illustrate the tendency of disciplinary societies to subjugate their citizens. He shows that the prisoner of a panopticon is at the receiving end of an asymmetrical form of surveillance: he is seen but does not see; he is an object of information, never a subject in communication.
Role of the panopticon in developing prisons
Foucault employs Bentham’s Panopticon to demonstrate how discipline and systems of power are enforced to create a constant feeling of surveillance. The history of how the panopticon came to be is itself instructive.
Towards the end of the seventeenth century, Europe was struck by a deadly plague. Any town suspected of harboring the disease was put on lockdown, and maintaining discipline became the order of the day. A mechanism was developed under which every family was ordered to stay home. A “syndic” was given authority over each street and charged with surveilling it, locking every door as he went. It was the syndic’s duty to visit his street each day and call out to the inhabitants of each house. The citizens had to appear at their windows; if they did not, he knew something was wrong: they were either weakened by the plague or dead. The syndic would then deliver his report to town officials known as intendants, who would relay the information to the magistrates.
This chain of command is significant because it amounted to a refined surveillance system. The irony is that, in the midst of the chaos of the disease, these towns had unexpectedly developed a model of control. It was this paradigm that went on to inspire later political thinkers, who imagined using it to create a perfectly governed and disciplined society.
This permanent feeling of being “surveilled” is a common feature of disciplined societies of this type.
We can see how this idea developed into an architectural form in the work of the English philosopher Jeremy Bentham, finalized in 1787. He imagined the panopticon as a donut-shaped building. At its center would stand a tower whose windows faced outward, towards the donut’s inner surface. The ring itself would be divided into cells, each extending from the inner to the outer wall. Each cell would have two windows, one at each end: one facing the tower, the other pointing outward. The culminating effect of this arrangement is that the inmates, one to a cell, would each have the sense of being continuously watched. Because of this feeling, they would regulate their own behavior, irrespective of whether or not they were actually being watched.
The Panopticon—the architectural layout of a prison in which guards reside within a central tower and maintain surveillance over all inmates—was used by Foucault to explain disciplinary power. He argued that one of the major effects of the Panopticon was to induce in the inmates “a state of conscious and permanent visibility that assures the automatic functioning of power.” This visibility was not only constant but also unverifiable, in the sense that the inmate could never know whether anyone was in fact watching. Foucault saw Bentham’s Panopticon symbolically as the technological “ideal form” of a power that already existed elsewhere, though never in such a concentrated and articulated blueprint.
In his critique of the prison, Foucault argued that prisons were designed not only to deprive individuals of liberty but also to prepare them for discipline in the industrial age. To begin with, the prison was seen as a means by which to strip a criminal of their liberty.
As a consequence, the deprivation of liberty was considered an egalitarian punishment, and prisons were the means by which it could be enacted. However, there is more to prison than the stripping of liberty alone. It also offers space for “correction” and the moral improvement of the imprisoned criminals. Hence, isolation and solitude became the methods for chastisement and reformation.
In addition to solitude, convicts were assigned prison work, which, in keeping with the idea of correction, was claimed to reform their personalities. Foucault, however, questions the real purpose of forced labor. Nothing of economic benefit was being produced, nor did convicts acquire actual working skills. What they absorbed was how to be immersed in the regimen of production and eventually become part of the apparatus of an industrialized society. In this sense, the convict was no different from any other disciplined agent of the industrial age. Prisoner or worker, what were they but cogs in the belching machines of production?
Panopticon and the modern state
It needs to be noted, however, that the Panopticon as a disciplinary mechanism, unlike an absolutist, monarchical, or sovereign form of power in which power is usually wielded by an individual, automatizes and dis-individualizes power, so that no single individual wields or commands it. Foucault elaborates that, in order to be exercised, the disciplinary mechanism needed an instrument of permanent, exhaustive, omnipresent surveillance, capable of making everything visible so long as it could itself remain invisible.
This form of power manifests neither as a public display nor as an interrogation by the one who holds it. Rather, the subject knows that s/he is under a constant but unverifiable gaze. Like the prisoner in the Panopticon, the subject engraves this gaze within themselves, on their very soul, becoming useful, productive, effective, and thus self-disciplined.
Though Foucault begins his argument with the panopticon and its articulation of disciplinary power over convicts, he adds that disciplinary power is useful in many other areas of life: in hospitals to treat patients, in schools to instruct children, in asylums to confine the insane, in factories to supervise workers, and in workhouses to put beggars and idlers to work.
Panopticon and technology
The rise of the modern security state allows governments and powerful corporations to observe the behaviors and trends of citizens and consumers, the more easily to control them and check transgressive behavior. Very few public places lack some form of security camera or CCTV, and any time one connects to the internet, one’s traffic is monitored by various trackers for reasons of commerce and security. The practice of panopticism has clearly spread beyond institutions, be they the traditional Benthamite prisons and workhouses or, as Foucault suggested, any institution that exercises disciplinary power and conditioning, outward first to the authorities that control those institutions and then further still to the states that control even those authorities.
Under the modern form of panoptic power, people are told not to fear surveillance because it is all being done in the interest of the greater public good. Though security cameras have become ubiquitous in urban areas and everyone is at least vaguely aware that our online presence can be monitored at any time, people are encouraged to pay little attention because if they do nothing wrong, there is no need to fear retaliation. Whereas under classical panopticism subjects are encouraged to take stock of themselves and attempt to portray their best possible selves to the inspector, under the modern panoptic model subjects are induced to act as naturally as possible and not modify their behavior for the inspector’s sake. That self-awareness of the watching eye is taken away and obscured so that those subjected to the panoptic eye are less likely to respond to it.
The panopticon and the pandemic: how tracking became essential
When the pandemic swept across the world, it brought concerns about location-tracking technologies into focus. The highly contagious nature of the disease necessitated that people who had caught it, or who were currently asymptomatic but might become ill later, be placed in quarantine.
One way of keeping track of the relevant people was to monitor the movement trails of their mobile phones. A number of nations, cities, state administrations, and central agencies released apps that did this tracking with user awareness, i.e., the user had to install an app on her smartphone and grant it permissions. For instance, the Aarogya Setu app, sponsored by the Indian government, focused on acquiring proximity data: which people was a person near, and on which days? The purpose was to aid “contact tracing” and thus establish a path of transmission of the novel coronavirus. The app was also criticized for extracting location data and for having ambiguous clauses in its privacy policy. There is, further, the ever-present danger that the anonymity of such data can be easily stripped by linking it to databases that hold the mobile number.
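At its core, proximity-based contact tracing amounts to building a graph of who was near whom and searching it outward from a confirmed case. The following is a minimal sketch of that idea, not of any real app’s implementation; the log entries, identifiers, and function names are all hypothetical.

```python
from collections import defaultdict, deque

# Hypothetical proximity log: each entry records that two pseudonymous
# phone IDs were near each other on a given day.
proximity_log = [
    ("id_A", "id_B", "2020-04-01"),
    ("id_B", "id_C", "2020-04-03"),
    ("id_C", "id_D", "2020-04-05"),
    ("id_E", "id_F", "2020-04-02"),
]

def contacts_of(log, source, max_hops=2):
    """Breadth-first search over the contact graph: everyone reachable
    from `source` within `max_hops` proximity events."""
    graph = defaultdict(set)
    for a, b, _day in log:
        graph[a].add(b)
        graph[b].add(a)
    seen, frontier = {source}, deque([(source, 0)])
    reached = set()
    while frontier:
        node, hops = frontier.popleft()
        if hops == max_hops:
            continue  # do not expand beyond the hop limit
        for nbr in graph[node]:
            if nbr not in seen:
                seen.add(nbr)
                reached.add(nbr)
                frontier.append((nbr, hops + 1))
    return reached

# Direct and second-order contacts of a confirmed case.
print(sorted(contacts_of(proximity_log, "id_A")))  # ['id_B', 'id_C']
```

Even this toy version makes the civil-liberties stakes visible: whoever holds the log can reconstruct anyone’s social graph, not just the transmission path of a virus.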
Similarly, an MIT-based project has been documenting COVID-tracking apps all over the world. Sure enough, a variety of problems plague many of these well-intentioned apps: the extraction of irrelevant personal details, as seen in the plethora of permissions they demand to run; a lack of clarity about the use and retention of the data; and ambiguous privacy rules about what can and cannot be made public. Open-ended privacy-policy clauses such as “will be used for appropriate purposes” or “will be shared by relevant agencies” keep the door open for the data to be shared with law enforcement or tax authorities, and thereby used for coercive purposes it was never collected for.
These apps consequently create substantial privacy vulnerabilities for those being tracked. For instance, an app used by the Karnataka government released the names and addresses of patients along with their location trails, resulting in harassment by neighbors. There have been many similar instances.
In an ethos where even the suspicion of being infected immediately invites social stigma and discrimination—rather than empathy—and sometimes physical violence at the hands of self-proclaimed vigilantes, the release of personal details into the public sphere endangers lives.
There is also a growing awareness of the danger that anonymized data can easily be deanonymized by linking it to databases that hold the mobile number, e.g., Aadhaar or Vahan in India, or as described earlier in the NYT investigation. In some cases, the apps are also hackable or amenable to being copied by fraudsters. Data collected through COVID apps, like any other collected data, has the potential to reach data markets.
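The linking attack is mechanically simple. The toy sketch below, with entirely made-up records and names, shows why hashing a phone number is not real anonymization: because the space of valid phone numbers is small and enumerable, anyone holding a registration database (an Aadhaar- or Vahan-like registry in this example) can recompute the pseudonyms and re-identify the “anonymous” trails.

```python
import hashlib

def pseudonym(phone):
    """'Anonymize' a phone number by hashing it (a common but weak scheme)."""
    return hashlib.sha256(phone.encode()).hexdigest()[:12]

# An 'anonymized' location trail, keyed only by the hashed phone number.
trail = {pseudonym("9876500001"): ["clinic", "market", "home"]}

# A registration database linking phone numbers to identities
# (records and names are hypothetical).
registry = {"9876500001": "A. Kumar", "9876500002": "B. Singh"}

# Hashing every registry entry and matching keys re-identifies the trail.
reidentified = {
    name: trail[pseudonym(phone)]
    for phone, name in registry.items()
    if pseudonym(phone) in trail
}
print(reidentified)  # {'A. Kumar': ['clinic', 'market', 'home']}
```

The defense is not a better hash but not keying released data on a linkable identifier at all.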
Many companies known for their intrusive technologies have jumped on the COVID-tracking bandwagon to provide surveillance tools. For instance, the Israeli company NSO Group, maker of the Pegasus spyware, has offered its tracking technology, named Fleming, for testing in many countries. In addition, the Rome-based company Cy4gate provided Italian authorities free access to its Human Interaction Tracking System (HITS).
The dangers of facial recognition
With continuing technological advancement, facial recognition poses a real danger. There are no laws in India, or in most other parts of the world, that prohibit or even regulate its use, while millions of images are captured every day by publicly installed CCTV cameras.
For instance, in the national capital, the Delhi police have been using a software called Advanced Facial Recognition Software (AFRS) to surveil public gatherings and identify “habitual” protestors. There are reports that the police in Vadodara are keen on using facial recognition technology offered by none other than Clearview AI. More detailed reports on the Indian scenario indicate that facial recognition technology in policing has spread to more areas of the country, such as Chennai, Hyderabad, and Punjab. All of this is happening even though the reported accuracy of the technology is only about 1–2%.
The home ministry has claimed that the Delhi Police identified at least 1,900 alleged rioters through facial scans obtained from CCTV footage, matching them against the driving-license and voter-ID databases (incidentally, Delhi has one of the largest CCTV camera networks of any city globally). Apparently, the Aadhaar database was not used in this instance, though there have been demands to make it accessible to the police.
Many government agencies are keen to use facial recognition technology for various purposes, ranging from voter identification to airport passage, railway stations, and now even schools. With all forms of tracking enabled and working in tandem, the objective is to track every move a citizen makes.
The Chinese government currently holds some of the largest databases of all types on its citizens. In a recent, massive push, it has attempted to acquire DNA data across the country. The world’s most surveilled cities are also in China. These data acquisitions happen through CCTV, drones, biometric readers, and mobile apps. Everyone applying for a mobile phone connection must undergo facial recognition verification; payments made through QR codes are likely to be replaced by facial scans; and Chinese banks have started using “micro-expression” analysis, scrutinizing facial movements for signs of fraud.
What needs to be done?
A guiding slogan for us should be: “minimize our data trails.” Citizens should demand data laws that permit only the specific data extraction strictly needed to provide a given service, with no vague promises of “improving” services; end-user agreements that are short, crisp, and free of jargon, with the clauses asking for consent prominently displayed; and rules that forbid data retention after the event: once an Uber ride is over, for example, the associated data should be deleted. Apps should carry mandatory controls for tuning privacy, should show all the data they have collected, and should make it easy to delete that data without much effort.
Further, we should demand that the bulk collection of public data be made illegal. No one, for example, should have the authority to collect all public images. We should also demand a ban on profiling, on selling collected data to third parties, and on linking databases, particularly public databases with private ones.
We do find many of these demands incorporated into the General Data Protection Regulation (GDPR, May 2018) of the European Union, but that is just a start. There is still plenty of deep research and understanding needed in terms of the implications of these laws.
In India, the Personal Data Protection Bill, 2019, is pending in Parliament, amid concerns that it gives state agencies unrestricted freedom to access anything and thus has the potential to turn India into an “Orwellian state.” The government can at any time access private data on grounds of sovereignty or public order, which has dangerous implications.
It is a frightening situation when the state turns upon its own citizens and “wants to know everything” about them now that digital technologies have made this feasible. Nationalism and “national security” are being reinvented rapidly as the rationale for the deployment of omnipresent state surveillance, driven by the unchecked acquisition of personal data.
There are potentially at least three consequences that arise:
First, state agencies are now equipped with efficient tools and sufficient data to find subversive individuals and deal with them as they see fit. If such individuals can be located, identified, and tracked at all times, there is very little that can save them from the depredations of a rogue state hell-bent on preventing them from doing anything that is a threat to its rule.
The second consequence is that large databases containing diverse kinds of personal information and biometrics can be used to train algorithms and automate the demonization of entire communities simply on the basis of how they look or even what they wear.
And, lastly, this creates a climate of fear that conditions the mass of inhabitants and makes them submissive to authoritarian regimes and leaders.
To conclude, it seems appropriate to end with a quote from one of the most important books written about capitalism in the digital era, Shoshana Zuboff’s “The Age of Surveillance Capitalism,” in which she argues that behavior prediction and modification is the new form of capitalism:
“Surveillance capitalism unilaterally claims human experience as free raw material for translation into behavioral data. Although some of these data are applied to service improvement, the rest are declared as a proprietary behavioral surplus, fed into advanced manufacturing processes known as ‘machine intelligence,’ and fabricated into prediction products that anticipate what you will do now, soon, and later.”