Your Digital Shadow
As our society and lives become digital, everything is represented in data. This gives us two versions of the world: the real, physical one and a digital reflection, like a skewed mirror image.
More and more, it’s not the real you but the data version of you that others interact with—whether it’s a bank checking your credit, an algorithm showing you certain search results, or a potential employer “stalking” you online. Our interactions with the world are increasingly shifting from the physical realm to the digital one, where we have less control over what happens or how we’re perceived.
This brings up several ethical and philosophical concerns. What happens when we treat people as data objects? What happens when our digital profiles determine whether we’re accepted or rejected in society? This can fundamentally change how we see people.
The term ‘individual’ implies a single, undivided unit. But in the digital world, we become split into two parts: the physical self grounded in reality, and a digital counterpart, represented by all the data associated with us. This data can influence whether we’re hired, granted access to a service, and so on. In effect, we move from being individuals to ‘dividuals’, existing in multiple versions.
A shadow that follows you
The idea of a digital shadow originally comes from privacy advocates. A shadow carries negative connotations: it follows us, it hides things, and it isn’t necessarily a faithful, three-dimensional representation of who we really are.
This shadow is a virtual representation of you, continuously updated through your actions and interactions as a result of both active and passive data collection: both what we consciously share and what is gathered beyond our control.
This digital representation isn’t final and fixed; rather, it evolves, changes, and follows you continuously. The idea is a variation of what’s known as a digital twin.
Fact
Digital twins
Digital twins are a concept primarily used in industry. They’re essentially digital replicas of real-world systems or processes, with the real and digital versions often linked and sharing data. These digital twins can then be utilised to simulate different situations, monitor performance and conditions, or experiment with new ideas.
Digital twins can be created for anything, whether material or immaterial, whether it’s an object like a crane or a subject such as a bank customer, or even something as vast as the ecosystem of the Amazon rainforest. You’ll explore more about digital twins in the next chapter.
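To make the concept concrete, here is a minimal sketch of a digital twin for the crane example above. The class name, sensor fields, and load limit are all invented for illustration; a real industrial twin would sync far richer telemetry.

```python
from dataclasses import dataclass, field

@dataclass
class CraneTwin:
    """A toy digital twin of a crane: it mirrors the real machine's
    sensor readings and lets us run experiments without touching it."""
    load_kg: float = 0.0
    max_load_kg: float = 10_000.0       # assumed rated capacity
    history: list = field(default_factory=list)

    def sync(self, sensor_reading: dict) -> None:
        # The physical crane pushes its latest sensor data to the twin,
        # keeping the digital copy in step with reality.
        self.load_kg = sensor_reading["load_kg"]
        self.history.append(sensor_reading)

    def simulate_extra_load(self, extra_kg: float) -> bool:
        # Experiment on the twin, not the crane: would this load be safe?
        return (self.load_kg + extra_kg) <= self.max_load_kg

twin = CraneTwin()
twin.sync({"load_kg": 7_500.0})
print(twin.simulate_extra_load(2_000.0))  # True: 9,500 kg is within the limit
print(twin.simulate_extra_load(3_000.0))  # False: 10,500 kg would exceed it
```

The same pattern scales up: whether the twin mirrors a crane, a bank customer, or an ecosystem, the core idea is a linked data model that can be queried and simulated in place of the real thing.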
Should we be afraid of shadows?
When decisions are made that directly or indirectly affect us, they are often based on our digital representations, whether it’s a person or an algorithm making those decisions.
This isn’t entirely a bad thing. Digitalisation has, for example, made tasks like applying for loans or filing taxes much easier, as it automatically pulls in necessary information from various sources, so we don’t need to frantically search through old documents, receipts, and accounts.
These are important advancements, and we can’t simply go back to the old ways. However, we must be aware of and work to minimise the potential downsides.
Challenges include the possibility of data being incomplete or of poor quality. Data is also self-referential: it can only point to other data, and is therefore disconnected from the real world. It is stored in various places and to a certain extent has “a life of its own”, beyond your control. And if decisions are made automatically, without transparency about the basis on which a decision was made, it becomes difficult to verify its accuracy.
Moreover, the processing will never be completely neutral; it will be shaped, consciously or unconsciously, by the values, prejudices, and agendas of those who developed the specific service and those who run it.
If it involves clear-cut information, like income and debt on a tax return, the process is generally straightforward, transparent, and correctable. Not so in cases involving more assessment and interpretation, which may be run by private companies and where you have less insight into, or control over, what’s happening.
If these errors and shortcomings lead to negative consequences such as denial of financial aid or exclusion from an educational program, it impacts your real-world situation. We do have some protection against this, thanks to European laws like the GDPR, which states that everyone has the right not to be subject to a decision based solely on automated processing. However, there are many grey areas in this.
Insight
What do the platforms know about you?
In China, a data-driven social credit system scores citizens on their conformity to the country’s rules. It’s not a big exaggeration to say that we essentially have the same thing in the West, too. But here the scoring is commercially motivated: what is rated is our value as a customer or service provider (think of the rating systems on Etsy or Airbnb). We aren’t judged as citizens according to our national constitution, but on the basis of our commercial value.
The large technology platforms especially want as much detailed, fine-grained data about us as possible, so they can profile us for data-driven marketing. It’s said that it doesn’t take many clicks in your social media feed before the platform knows more about you than your own friends do (and even things you don’t know about yourself, for that matter). There’s a kernel of truth in this: our actions (what we look at, click on, linger over, like, avoid, and so on) reflect who we are. And it all gets recorded.
This is used to display relevant content and advertisements. The problem is that the same profiles can be used to predict political affiliation, sexual identity, and potential diseases.
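At its simplest, such a profile is a tally of recorded behavioural signals. The sketch below is a toy illustration, not any platform’s actual pipeline; the event stream, field names, and topics are all invented.

```python
from collections import Counter

# Hypothetical event stream: each interaction the platform records.
events = [
    {"user": "anna", "action": "click",  "topic": "fitness"},
    {"user": "anna", "action": "like",   "topic": "fitness"},
    {"user": "anna", "action": "click",  "topic": "politics"},
    {"user": "anna", "action": "linger", "topic": "fitness"},
]

def build_profile(events):
    # Every recorded action feeds the profile, whatever its type:
    # clicks, likes, and even passive lingering all count as signals.
    return Counter(e["topic"] for e in events)

profile = build_profile(events)
print(profile.most_common(1))  # [('fitness', 3)]
```

Real systems aggregate thousands of such signals into far richer models, and the same counts that pick an advert can, as noted above, also be used to infer things the user never chose to disclose.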
The data-driven society challenges human autonomy and freedom. What can you do if you find decisions being made about you based on information that’s hard to understand or verify? How can you correct any inaccuracies? Who is held accountable when things go wrong?
This isn’t just a digital transformation, but perhaps something more radical: a metamorphosis.