The researchers point out that children do not automatically develop a critical understanding of the digital technologies they use every day. Education is required. They have talked to eighth graders about their insights into how their data is collected and processed. (Photo: Shutterstock)

Does your ‘private’ data picture meet the beauty standards of society?

COMMENT: Private companies and the public sector collect our data every minute of every day. In a democratic society, we need to teach children to become critically aware and to understand how data processing and digital technologies really work.

Would it be okay for you to pay $4 for a gallon of milk when the customer before you paid only $3?

How would you react if dropping your childhood friend got you high-speed internet – and keeping her got you only a slow connection?

Or if you were denied permission to cross a state border as punishment for voicing opinions against the government's energy policy?

We doubt that you would be happy. Because you live in a democracy. You have political rights, civil liberties and social rights.
Or do you?

Private companies and the public sector both look over your shoulder in the digital world. (Photo: Shutterstock)

Most of us spend a lot of our time in front of a digital screen, and without our noticing, many of the applications we use collect large amounts of data on us: every click we make, what we talk about and how we move around.

In truth, we do not know what happens to the data we feed these applications. But unlike in the physical world, we do not really seem to care.

We need to change that. And critical awareness and understanding of digital technologies need to begin in our public school system.

“Somehow, it is a little scary”

As part of our ongoing research, we have talked to eighth graders to examine their insights into data collection and processing.

“Could anything stop you from downloading an app – for instance, if it wanted access to something?” we asked Malte and Alfred from grade 8.Z in an informal conversation.

Neither of them can think of anything: “What's the worst thing that can happen? A targeted ad? Well ... I’ve never experienced any bad consequences”, Malte laughs.

Isabella and Victor from grade 8.A reflect in another conversation: “There was something with Facebook, and that they save like everything about you. Like birthdays and stuff like that, it's right in a database.”

We ask whether they know what the companies use the data for. “Do they use it?” Isabella says, surprised. Victor thinks he remembers a lawsuit not so long ago: “So ... if you look like someone who plays a lot of soccer, they can show you many ads on football,” he says.

We ask whether they find targeted ads smart or irritating. “I guess it's smart,” Isabella says. “Then they have found one's interests,” she elaborates when we ask about the smart part.

Maja from grade 8.Z is a little skeptical about the smartness: “Somehow, it is a little scary to … just to know that everything is being used and like … almost listened to. And observed. Of course it is a little scary.”

We discuss her smartphone habits and her understanding of the data her favorite apps collect about her. Her skepticism is healthy – because her data really is extensively collected and transformed by algorithms. Not just 'a little' and 'almost'.

We add ingredients to the cake, but do not know the recipe

Roughly, we can compare an algorithm to a developer's recipe. It consists of a variety of ingredients, a mixing ratio and a method.

The result is a cake, baked exactly to the developer's wishes, which is then served to us and continuously adapted with new ingredients.

We contribute our data as ingredients, but we do not know what else the cake consists of or how it was made. The program does not tell us, and we do not learn it in school.
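The recipe analogy can be made concrete with a toy sketch. Every ingredient name and weight below is invented purely for illustration – real systems are vastly more complex, and the point is precisely that their 'recipes' stay hidden:

```python
# A toy 'recipe': the developer fixes the ingredients (features),
# the mixing ratio (weights) and the method (the formula).
# Users supply the data but never see the weights or the formula.

HIDDEN_WEIGHTS = {"clicks_per_day": 0.5, "minutes_watched": 0.3, "shares": 0.2}

def bake_profile_score(user_data):
    """Combine a user's data 'ingredients' according to the hidden recipe."""
    return sum(HIDDEN_WEIGHTS[k] * user_data.get(k, 0) for k in HIDDEN_WEIGHTS)

# The user only ever sees the finished cake – a single opaque number:
score = bake_profile_score({"clicks_per_day": 120, "minutes_watched": 45, "shares": 3})
print(score)
```

The asymmetry the article describes is visible even here: the input dictionary is ours, but the weights and the formula belong to the developer.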

Nevertheless, we happily eat, while others make money on our growing sugar addiction and expanding stomachs. But do we really have any other choice?

Your insurance company is watching you when you eat burgers or drive your car

Predictive algorithms are, as the name suggests, designed to predict actions.

Targeted content – which uses data to predict potential customers and tries to catch our attention at exactly the right time and place, with the highest odds that we behave as intended – has become a well-known example, but it is just one of countless others.

Do you know, for example, why Tom, who lives in an upper-class area, pays 200 dollars less per year for insurance than you do in your traditional middle-class area?

Insurance companies collect information on us from a broad range of data registers – but also on people who look like us – so they can target their prices. They are also looking into collecting data on how quickly we accelerate and how hard we brake when we drive our cars.

And they even take advantage of smartphone data on how we move around during the day. So hurry down to the fitness center and stay away from the fast food chains. 'Your insurance company is watching you'.
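A hedged sketch of how such proxy-based pricing could work in principle. The base price, the surcharges and the category names are all invented for illustration – this is not any real insurer's model:

```python
BASE_PREMIUM = 1000  # dollars per year; an invented figure

# Invented surcharges keyed to proxy data about you – and about
# people who look like you (where you live, how you move around).
SURCHARGES = {
    "zip_code_risk": {"upper_class_area": -100, "middle_class_area": 100},
    "driving_style": {"smooth": 0, "hard_braking": 150},
    "movement_pattern": {"gym_regular": -50, "fast_food_regular": 75},
}

def quote(profile):
    """Price a policy from proxy features, not individual behaviour alone."""
    return BASE_PREMIUM + sum(SURCHARGES[k][v] for k, v in profile.items())

tom = quote({"zip_code_risk": "upper_class_area", "driving_style": "smooth",
             "movement_pattern": "gym_regular"})
you = quote({"zip_code_risk": "middle_class_area", "driving_style": "smooth",
             "movement_pattern": "gym_regular"})
print(you - tom)  # prints 200 – Tom pays less for living in the 'right' area
```

Note that Tom and 'you' behave identically in this sketch; the 200-dollar gap comes entirely from the zip-code proxy.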

Who does the public sector keep an eye on?

And insurance companies are not the only ones watching. The public sector has access to large data sets, which allow for improved public service, crime prevention and control.

But what prejudices are the algorithms designed to maintain? Who become the suspects in criminal cases if predictions come from predictive algorithms based not only on your data but also on data about others who seem to look like you?

For example, in Pittsburgh lives a caring couple with children who volunteer to help other children in their community. Despite this, the couple has been flagged in digital social-service databases for child neglect.

Their 'crime' is that they are poor.

It is even worse if your data looks like a terrorist's. In that case, you had better cross your fingers that the drone designed for targeted killing was programmed with humans in the decision loop before it attacks you.

The digital discriminatory dictatorship of China

The most frightening example of all is China's experiments with a social credit system.

Through the use of citizens' data, China is testing the generation of 'citizen scores'. You had better not express yourself critically and independently, or spend (too) many hours a day playing computer games, if you want a high score – one that gives you access to, for example, visas to other countries or the right to travel by air within China's borders.

It might also be a good idea to reconsider your friendships and cut some of your family ties. Your circles' behavior can affect you too.

Buying diapers, on the other hand, is a good way to increase your score – then you are probably a responsible parent. Or you could, of course, try to 'game' the system and make yourself look like the prototypical good citizen.
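The scoring logic described above can be caricatured in a few lines. All events, point values and thresholds below are invented – the real system's rules are not public, which is exactly the problem:

```python
# Invented point values caricaturing a 'citizen score' (not the real rules).
EVENTS = {
    "buys_diapers": 10,             # 'probably a responsible parent'
    "criticizes_government": -50,   # independent expression is penalized
    "hours_gaming_over_limit": -5,  # per excess hour of computer games
    "low_scoring_friend": -20,      # your circle's behaviour affects you too
}

def citizen_score(history, start=100):
    """Sum up point adjustments for each recorded event."""
    score = start
    for event, count in history.items():
        score += EVENTS[event] * count
    return score

score = citizen_score({"buys_diapers": 1, "criticizes_government": 1,
                       "hours_gaming_over_limit": 3, "low_scoring_friend": 2})
FLIGHT_THRESHOLD = 60  # invented cutoff for the right to travel by air
print(score, score >= FLIGHT_THRESHOLD)
```

Even this caricature shows the troubling feature the article highlights: the `low_scoring_friend` entry means your score depends on other people's behaviour, not just your own.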

But someone might be watching if you try the latter. Recognition technologies save facial images and voices in databases linked to personal ID numbers. Ultimate control and no escape.

So what do we do – if we do not want to sacrifice our democracy for a digital discriminatory dictatorship?

In a democracy, everyone must learn to understand digital technologies

The importance of understanding data and data processing in a democratic society was formulated 50 years ago by computer science professor Peter Naur – a Danish pioneer in the field – when, in 1968, he argued that “understanding the programming of computers should be brought into general education” so that everyone could learn to understand.

Otherwise, he warned, expert programmers would accumulate a power that would threaten our democracy.

Fifty years later, his warning is more relevant than ever – and it is time we took it seriously.

When automated systems driven by black-box algorithms increasingly make decisions without human involvement, it becomes next to impossible for us to understand why, for example, we are rejected when applying for a loan, get fired from our dream job or suddenly face higher insurance costs than others.

'Mouseprint' is not enough

Two American professors point out that we must demand transparency from technology companies if we as citizens are to gain insight into the predictive algorithms that influence our lives. Not just transparency, but meaningful transparency that supports understanding.

The two professors identify eight important categories, including knowing an algorithm's goal and the problem it addresses, what data sets it is trained on – and what data the system excludes – and an easy-to-understand explanation of how it works.

Another American professor likewise argues that everyone should be given the opportunity to understand algorithms: “As AI (artificial intelligence) is applied more broadly, it will be critical to understand how it reaches its conclusions” – and technical explanations and 'mouseprint' are not enough: “It must be explainable to people – including people who are not expert in AI.”

In other words, we should not only have the right to know what data we put into the machines – we also need to know what happens to it inside the machine and what comes out.

Datalogy on the school timetable!

But political regulation and explanations are not enough. In a society that is so permeated with data, we need datalogy – the teaching of data, their nature and their use – on the school timetable.

Access to books is not sufficient to learn to read and analyze texts. Likewise, it is not enough to know the recipes by which programs are designed. Understanding data and digital technologies requires education.

A recent study conducted by Epinion for the Danish Ministry of Education suggests that children and young people feel a high degree of control over the data they share online. But this sense of control covers only the input side – that is, the data they upload and share.

It does not take into account that we rarely know what happens to our data inside the machine, or what the output is.

In addition, the study indicates that digital literacy is mostly framed in terms of individual issues. The socio-critical perspective is missing.

We should use technology in human ways

It is our responsibility to teach children to analyze and interpret the algorithms that affect our lives so that we can act and make choices based on awareness and insights.

To not end up with individual prices in the supermarket. To not end up paying with friendships for proper internet speed. To not end up sacrificing our freedom of speech for a train ride.

Instead, we can choose to use the technologies in positive ways, for the many amazing things they can actually help us with – such as identifying diseases at earlier stages, helping to find solutions to climate change, or freeing us from routine tasks – giving us more time to concentrate on things that require human consciousness.

For instance, caring about each other.

The arguments in this article come from an academic article by the authors, expected to be published in Danish by Aarhus University Press in 2019 under the title ‘The (un)intelligence of machines and (a)morality of algorithms – a democratic perspective on computational thinking and technological understanding as a school subject’.
