Blog Image

International Network Observatory

Global Strategy and Big Data

Big Data Ethics

Ethics Posted on 17 Dec, 2014 14:01

by Andrej Zwitter, University of
Groningen – December 17, 2014

Big Data and associated phenomena, such as social media, have outstripped the average consumer's capacity to judge the effects of his or her actions and their knock-on effects, as Facebook parties and the importance of social media for the Arab Spring vividly demonstrated. We are moving towards a change in how ethics must be conceived: away from individual decisions with specific and knowable outcomes, towards actions by many actors who are often unaware that their actions may have unintended consequences for anyone. Responding to this shift will require a rethinking of ethical choices, of the lack thereof, and of how ethics should guide scientists, governments, and corporate agencies in handling Big Data.

Data versus Traditional Ethics

Since the onset of modern ethics in the late 18th century, we have taken premises such as individual moral responsibility for granted. Today, however, it seems Big Data requires ethics to rethink some of its assumptions, particularly about individual moral agency. Big Data raises some familiar ethical difficulties (such as privacy concerns), which are not new per se. Beyond these, however, the very nature of Big Data has an underestimated impact on the individual's ability to understand its potential and thus to make informed decisions. Examples include, among others, Facebook "likes" sold to marketing companies in order to target specific micro-markets, and sentiment analyses of Twitter feeds used for the political manipulation of groups.

In a hyper-connected era, the concept of power, which is so crucial for ethics and moral responsibility, is becoming more networked. Retaining the individual's agency, i.e. the knowledge and ability to act, is one of the main challenges for the governance of socio-technical epistemic systems. Big Data-induced hyper-networked ethics exacerbates network knock-on effects. In other words, the nature of hyper-networked societies increases and randomizes the collateral damage caused by actions within the network, and thereby the unintended consequences of people's actions.


Just as global warming is the effect of emissions by many individuals and companies, Big Data is the effect of individual actions, sensory data, and other real-world measurements that create a digital image of our reality, i.e. "datafication". Already, the mere absence of knowledge about which data is actually collected, or what it can be used for, puts the "data generator" (e.g. online consumers, cellphone owners, etc.) at an ethical disadvantage with respect to knowledge and free will. The "internet of things" and ambient online intelligence further widen the distance between one actor's knowledge and will and the other actor's sources of information and power, and they strengthen the dependency on services that rely on Big Data. Furthermore, ownership of Big Data leads to a power imbalance between stakeholders, benefitting mostly those corporate agencies and governments with the know-how and equipment needed to generate intelligence and knowledge from data.

In the sphere of education, children, adolescents, and adults still need to be educated about the unintended consequences of their digital footprints (beyond digital literacy). Social science research may have to take this educational gap into account when drawing conclusions about the ethical implications of using anonymized social Big Data, which nonetheless reveals much about groups. In the area of law and politics, political campaign observers, law enforcement, social services, and lawyers will increasingly have to become data forensic investigators, both to utilize Big Data themselves and to recognize its illegal exploitation.

A full open access version of the paper has been published in Big Data & Society.

Intelligence and Ethics

Ethics Posted on 31 Jul, 2014 14:13

Despite negative headlines, there is a genuine effort to address ethical concerns within the Intelligence Community.

A keynote lecture by Professor John Grieve, "Ethics and Intelligence", was an encouraging example of the presence of genuine ethical concern inside the UK Intelligence Community, for a number of reasons.

Firstly, Professor Grieve, as a leading figure in the sector, is an advocate of lifetime command accountability, i.e. he is prepared to answer questions about command decisions he made at any point in his career – provided nobody has shredded the paperwork! This is a principle to admire, and it should be followed by those in leadership positions not only within the police but also in the military and other sectors where decisions with collateral effects are made.

Secondly, Prof. Grieve promotes the duty to learn. Alluding to literary great John Steinbeck's Sea of Cortez and a host of philosophers, he shows how an open mind and a consilient approach to policing will produce the most ethical and effective practice for intelligence.

Thirdly, the consilient approach to Intelligence Led Policing is what innovative networks of academics, industry leaders and professionals are practising. Ethical concerns are at the core of these partnerships: with representatives from diverse backgrounds collaborating, more bases of ethical concern are covered.

Finally, Prof. Grieve brought attention to ethics in the workplace itself. Fairness at work, or FAW, is a major concern of his. Treating employees, i.e. intelligence analysts, contractors and others, with the respect they deserve, and noticing problems that may lead to dissenting behaviour, will ultimately result in fewer whistle-blowers. Although whistle-blowing is an activity which forces transparency in organisations, there are plenty of avoidable cases which add to negative press and a feeling of public distrust of intelligence practices. It is the building of public consent which will allow greater freedom and the most effective practice for the intelligence communities – in the interest of reducing threat, risk and harm for society.

This column was written by Mr. Robert Barrows (Project Administrator at CASI, Liverpool Hope University). Follow his blog at

‘Big data requires a code of conduct for academics, government and industry’

Ethics Posted on 28 Apr, 2014 12:10

Telephone traffic, GPS data, photos, patient information in the healthcare service: these days, we leave a digital footprint wherever we go. Together, these footprints form what is known as ‘big data’. This huge volume of data is prompting important ethical questions. Questions to which we do not yet have satisfactory answers, says Professor of Ethics and International Politics, Dr Andrej Zwitter. He thinks it is now time for ethicists to start formulating these answers. We have to think about what academics, governments and industry should be allowed to do with this collected data, but also how we can teach our children to live in a world surrounded by data. Zwitter has set up an international think-tank to discuss this matter: the International Network Observatory.

‘We store enormous volumes of data’, says Zwitter. ‘Two researchers, Smolan and Erwitt, worked out that we currently store more than five billion gigabytes of data every ten minutes. This is the equivalent of all the data stored from the start of the computer era until 2003. What’s more, this trend is set to continue: it is estimated that by 2015, we will be storing the same volume of data every ten seconds.’

Big data

‘Big data is not simply more of something we’ve been doing all along. Big data is fundamentally different. We used to collect traditional statistical data, small data, for a specific reason, so it was accurate and clean. But collecting big data is a whole new ball game. Companies and analysts try to collect as much data as possible on a certain subject, accumulating mountains of data on anything that is remotely related to their chosen field, including data from social networking sites such as Twitter and Facebook. The data is jumbled and polluted, but it represents reality: what has been created is a digital, reflected reality.’

Always correlations

‘Big data obviously generates a lot of useful functionalities. But the digital reality thus created can also cause countless problems. For example, you will always find correlations and links in large databases. Men who buy nappies buy more beer than average. There is no direct link between these facts; the only common denominator is a baby. This is a fairly innocent example, but large-scale data sets increase your risk of being randomly associated with someone who has committed an atrocious crime, for example, without having the slightest moral responsibility. This can have a very real impact.’
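The "you will always find correlations" point can be illustrated with a short, hypothetical simulation (the dataset and numbers below are illustrative, not from the article): given enough completely random, unrelated features, some pairs will correlate strongly by pure chance.

```python
import numpy as np

# Hypothetical example: 2000 independent random "features" measured
# for 100 people. No feature has any real relationship to any other.
rng = np.random.default_rng(0)
n_people, n_features = 100, 2000
data = rng.normal(size=(n_people, n_features))

# Correlate every feature with every other feature.
corr = np.corrcoef(data, rowvar=False)   # 2000 x 2000 correlation matrix
np.fill_diagonal(corr, 0)                # ignore trivial self-correlations

# Despite total independence, the strongest chance correlation is large.
print(f"strongest spurious correlation: {np.abs(corr).max():.2f}")
```

The wider the dataset, the more feature pairs there are to test, so the largest purely accidental correlation grows – which is exactly why an individual can end up statistically "linked" to something they have nothing to do with.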


‘Predictions based on collected data can also have a disastrous effect. Data analysts use information about groups to predict consumer shopping habits so that shops can organize their purchasing and design their shops accordingly. But this group strategy can have implications for individuals. Imagine living in an area where, as an unemployed person with a certain make of car, you are more than ninety percent more likely to steal from a shop. Should we lock you up as a precaution or send round a social worker? Considered guilty on the basis of a prediction, you may very well find yourself stigmatized.’

Free will

‘All in all, big data is causing a fundamental shift in ethics. It no longer involves an individual action or decision with predictable results or implications, but a situation in which you make subconscious decisions or do things automatically, which then have unexpected or unintended consequences.’


‘So it’s time to start thinking seriously about the ethical implications of the way we are datafying our real lives into big data, and how we want to deal with this. Universities must take the lead in this discussion. The University of Groningen has set up an international think-tank (the International Network Observatory) with Liverpool Hope University, the Austrian Institute for International Politics (OIIP) and the European Centre for Applied Research (ECFAR) to consider these very issues.’

‘I am in favour of a code of conduct for people who work with large volumes of personal and anonymous social data, rather like the oath that doctors have to take. Teaching is another essential aspect: people must be made to understand what big data is, what is stored and how it can be used (and misused). We should start in primary schools. Children have to earn a road safety diploma, so why not a digital living diploma?’

Andrej J. Zwitter (Klagenfurt, 1982) studied law and philosophy at the Karl Franzens University in Graz (Austria). He carried out PhD research into terrorism, international law and the philosophy of law at the Ruhr-Universität Bochum, Germany. Zwitter is Professor of International Relations at the University of Groningen and director of the European Centre for Applied Research.