“Over the past decade, the Internet has become an important and ubiquitous feature of daily life in the world. As is often the case, the technology is somewhat of a double-edged sword. Although it may enhance our lives in many ways, as our world becomes an “information society” it also raises new concerns. For much of that information relates to not just things but to people. Information about us is accessed, stored, manipulated, data mined, shared, bought and sold, analyzed, and potentially lost, stolen or misused by countless government, corporate, public and private agencies, often without our knowledge or consent. When we communicate, interact, or even just go shopping—both online and offline—we leave data trails and digital footprints behind us, generating information about our lives and activities as we go” (Buchanan, 2007).
While it is a generally accepted notion that privacy forms one of the basic principles of human rights in any liberal democracy, recent advances in technology have begun to threaten this right. The question therefore becomes: do we choose between privacy and advancements in technology, or do we try to find a balance between these two competing interests? I would argue that we should adopt a positive-sum approach. That is to say, while we cannot stall or resist the advance of technology and its benefits to society at large, we can and should adopt measures to temper these advancements. By adopting this approach, one can find a harmonious relationship between the two.
Before one can delve into the impact of technology on privacy, however, it is important to have a clear understanding of what exactly we mean by the term privacy and its relation to the democratic ideals of a society. As a concept, the notion of privacy is ambiguous at best. Views of what one considers private vary widely: one’s thoughts, control over one’s body, solitude, control over personal information, freedom from surveillance, protection of one’s reputation, and so on. Noted legal scholar Arthur Miller has declared that privacy is “difficult to define because it is exasperatingly vague and evanescent”.
One of the earliest definitions of privacy can be attributed to Samuel Warren and Louis Brandeis in their seminal 1890 article, “The Right to Privacy”. They advanced the view that, at its core, privacy centers on one’s “right to be left alone”. Miller declares that “the basic attribute of an effective right of privacy is the individual’s ability to control the circulation of information relating to him”, while according to Charles Fried, “Privacy is not simply an absence of information about us in the minds of others; rather it is the control we have over information about ourselves”. For the purposes of this paper, we will limit our understanding of privacy to the definition put forward by Prof. Alan Westin: “Privacy is the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others”.
The right to privacy has been described as essential to democratic government. U.S. Supreme Court Justice Louis Brandeis pronounced it “the most comprehensive of rights and the right most valued by civilized men”. Furthermore, privacy is recognized as a fundamental human right. The United Nations Universal Declaration of Human Rights of 1948 states that “No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honor and reputation”. The European Convention on Human Rights of 1950 likewise provides that “everyone has the right to respect for his private and family life, his home and his correspondence”. Thus there exists an international accord on the importance of privacy and the need for its protection within society.
As we go about our daily lives in today’s environment, we are surrounded by technology: smartphones, cars with built-in GPS and computer chips, CCTV cameras along the highways and byways and at the various places we visit, computers, ATMs, and so on. While in some instances the use of personal information by organizations may be direct and deliberate, technology has become so advanced that the collection of private information about anyone may also be an unintentional outcome. For example, credit card records can reveal one’s whereabouts and spending habits, telephone bills provide information about a person’s conversations, and proximity cards used in some university dorms make it possible to track the movements of students.
Nissenbaum (2009) summarises these various “technology-based systems and practices” into three groupings organized around key functional characteristics or capacities as follows:
· monitoring and tracking – to watch over people, to capture information about them, and to follow them through time and space;
· aggregation and analysis – covers the general capacity to store and analyze information; and
· dissemination and publication – remarkably effective capacities to distribute information in endlessly varied configurations.
She further notes that technological advancement poses a foremost danger to privacy because “it enables pervasive surveillance, massive databases, and lightning-speed distribution of information across the globe”.
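To make the aggregation-and-analysis capacity more concrete, the short Python sketch below uses entirely hypothetical data (it is not drawn from Nissenbaum’s text) to show how two individually innocuous data trails, card transactions and building-access logs, can be linked on a shared identifier to reconstruct a person’s daily movements and habits.

```python
# Illustrative only: hypothetical data showing how two separately harmless
# data trails, once linked on a shared identifier, yield a profile of a
# person's movements and spending habits.
from collections import defaultdict

# Hypothetical credit card transactions: (holder_id, timestamp, merchant, amount)
transactions = [
    ("u1001", "2023-03-01T08:15", "Campus Cafe", 4.50),
    ("u1001", "2023-03-01T19:40", "Pharmacy", 23.10),
]

# Hypothetical proximity-card access logs: (holder_id, timestamp, door)
access_logs = [
    ("u1001", "2023-03-01T08:05", "Library entrance"),
    ("u1001", "2023-03-01T22:30", "Dormitory B"),
]

# Aggregation: merge both trails under the same identifier and order them in time.
dossier = defaultdict(list)
for uid, ts, merchant, amount in transactions:
    dossier[uid].append((ts, f"spent ${amount:.2f} at {merchant}"))
for uid, ts, door in access_logs:
    dossier[uid].append((ts, f"passed through {door}"))

for uid, events in dossier.items():
    print(f"Profile for {uid}:")
    for ts, event in sorted(events):  # ISO timestamps sort chronologically
        print(f"  {ts}  {event}")
```

Neither data set is especially revealing on its own; it is the linkage and chronological ordering that produce a detailed picture of the individual.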
Apart from the private sphere, governments across the world have also begun to utilise technology in the form of Identity Management (IDM) systems to assist in the delivery of services to the citizenry. While on the one hand these IDM systems can make e-services more seamless, on the other they can also intensify privacy risks. Some argue that the use of these new IDM systems can lead to the ‘social sorting’ of citizens: the processing of captured personal and demographic data by government agencies in order to classify people and determine who should be targeted for special treatment, scrutiny, eligibility, or exclusion.
It can be seen, therefore, that the increasing use of digital technology allows organizations (both private and public) to create and maintain digital records that span much of one’s life. In fact, according to Prof. Daniel Solove in his book “The Digital Person”:
We are currently confronting the rise of what I refer to as “digital dossiers”…. As businesses and the government increasingly share personal information, digital dossiers about nearly every individual are being assembled. This raises serious concerns. The information gathered about us has become quite extensive, and it is being used in ways that profoundly affect our lives. Yet, we know little about how our personal information is being used, and we lack the power to do much about it.
I will now explore two avenues, namely legislation and the concept of Privacy by Design, that could be used to mitigate the threat technology poses to one’s privacy. Swire and Bermann (2010) highlight four major legislative models of privacy/data protection adopted around the world:
1. Comprehensive laws (European Union) – countries which have adopted this approach usually appoint an official or agency responsible for overseeing enforcement. This official ensures compliance with the law and investigates alleged breaches of the law’s provisions.
2. Sectoral laws (United States) – this model protects personal information by the enactment of laws that specifically address a particular industry sector for example, consumer financial transactions, credit records, medical records, etc.
3. The Co-Regulatory model (Canada, Australia) – under the co-regulatory approach, industry develops enforceable standards for privacy protection, with oversight and enforcement provided by a privacy agency such as an information or privacy commission.
4. The Self-regulatory model (United States, Japan, and Singapore) – companies abide by codes of practice set by a company or group of companies, an industry body, and/or an independent body as a means of protecting data.
One of the cornerstones of most of these legislative models is that individuals have the right to determine to whom they give their personal information and how it is collected, used, shared, and discarded. Additionally, many of these laws align themselves with multinational privacy guidelines, directives, and frameworks, for example the Organization for Economic Cooperation and Development (OECD) Privacy Guidelines (1980), the European Union Directive on Data Protection (1995), and the Asia-Pacific Economic Cooperation (APEC) Privacy Framework (2004).
Apart from legislative measures, another approach to protecting privacy is the adoption of proactive rather than reactive measures. Those who work in IT or security often tell us that we can have either privacy or security, but not both. “In other words enhanced surveillance and security would necessarily come at the expense of privacy or on the other hand, adding privacy controls would detract from system performance”. According to Dr. Cavoukian (Information and Privacy Commissioner of Ontario), rather than adopting this combative paradigm of privacy protection, a more proactive model could be utilized, whereby adding privacy measures to technology need not weaken its functionality but can instead enhance the overall design.
This, in her words, creates a win-win situation. However, in order to achieve this sort of approach, privacy must be proactively built into the system from the outset. She refers to this concept as “Privacy by Design”.
In brief, this concept “seeks to build in privacy – up front, right into the design specifications; into the architecture”; or, as she puts it, to “bake it in”. This can be achieved by embedding what she refers to as “fair information practices” (FIPs) into the design, operation, and management of information management systems. For Dr. Cavoukian, Privacy by Design encompasses many elements in practice:
· Recognition that privacy interests and concerns must be addressed;
· Application of basic principles expressing universal spheres of privacy protection;
· Early mitigation of privacy concerns when developing information technologies and systems, across the entire information life cycle;
· Need for qualified privacy leadership and/or professional input; and
· Adoption and integration of privacy-enhancing technologies (PETs).
The concept of Privacy by Design therefore encourages organizations (both public and private) to ensure that, as new systems (whether electronic or paper-based) designed to collect and process information of a private nature are developed, privacy concerns are identified and addressed from the outset rather than ignored or added on (usually at additional cost) as an afterthought.
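To give a concrete, if simplified, sense of what “baking privacy in” can look like at the design level, the Python sketch below applies two widely used privacy-enhancing techniques, data minimization and pseudonymization with a keyed hash, before a record is ever written to storage. The field names and the e-service scenario are hypothetical and are not drawn from Dr. Cavoukian’s work.

```python
# A minimal sketch of privacy built in at collection time (illustrative only):
# the system keeps no more data than the service needs, and the direct
# identifier is replaced by a keyed pseudonym before the record is stored.
import hashlib
import hmac

# Assumption: in practice this key would be generated securely and managed
# separately from the data it protects.
SECRET_KEY = b"replace-with-a-securely-managed-key"

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier with a keyed, non-reversible pseudonym."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()[:16]

def collect_record(raw: dict) -> dict:
    """Apply data minimization and pseudonymization before storage."""
    needed_fields = {"service_requested", "appointment_date"}  # data minimization
    record = {k: v for k, v in raw.items() if k in needed_fields}
    # The stored record never contains the national ID or other direct identifiers.
    record["citizen_ref"] = pseudonymize(raw["national_id"])
    return record

# Hypothetical submission from a citizen-facing e-service form.
submission = {
    "national_id": "1985-123456",
    "full_name": "Jane Doe",
    "home_address": "12 Example Street",
    "service_requested": "passport renewal",
    "appointment_date": "2023-04-02",
}

print(collect_record(submission))
```

The particular techniques matter less than their placement: the safeguards sit inside the collection function itself, so privacy protection does not depend on a later, and usually costlier, retrofit.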
This paper has sought to highlight the impact advances in modern technology can have on an individual’s right to privacy. While we can all agree that these advances have greatly improved our way of life, we must also acknowledge that they pose a serious threat to the way one’s personal information is used. As it was in the time of Warren and Brandeis, so it is now, and so it may continue to be in time to come. However, through the enactment of specific legislation and the proactive approach of building privacy safeguards into one’s information infrastructure, we can help ensure that one’s right to privacy is maintained.
While some commentators may argue that “privacy is dead” and that the original definition of privacy as the “right to be left alone” may be utopian in nature, I am of the opinion that it is an ideal we must continue to reach for. To give up on it may force us into the reality of George Orwell’s 1984, a reality I believe no one would want to imagine.