A world tailored to your tastes, your body, your budget and needs.
This is the promise of data-based personalisation. Looking for a restaurant? Log in to Google, go to Maps, and it will show you not only the nearest pizzerias, but also those most likely to suit you, based on your previous search history, emails and other tracking information collected all over the web. Need an e-book to read during the summer? Amazon will provide a list of “recommended for you” titles, cherry-picking authors similar to those you’ve already shown appreciation for.
Sounds idyllic, on the surface. Sensors, loyalty cards, camera feeds, bank transactions, online browsing history, social media activity, smartphone apps and other sources generate huge amounts of data, which are used to provide tailored services. Fewer choices for us to make; better results. Or, at least, that's how the story is usually told.
But there's a darker side to personalisation: as companies and States become increasingly able to mould the environment in which we move, we, in turn, as consumers and citizens, become more transparent and predictable. In other words, easier to manipulate.
Sometimes this attempt at manipulation is rather obvious, like when a movie title you were looking for on IMDB now seems to chase you around the internet, magically appearing on your Facebook feed. In other cases, however, things are less straightforward.
Think of how, a couple of years ago, online travel agency Orbitz Worldwide experimented with showing Mac users pricier accommodation options than those offered to PC visitors. The practice was revealed by an article in the Wall Street Journal, and was ultimately abandoned after consumers complained about this form of price discrimination.
Or think of how, according to a New York Times investigation, American retailer Target can predict, with good accuracy, customers’ pregnancies and delivery dates based on their shopping history: it can then send them tailored discount coupons for each stage of pregnancy, at a moment when they are at their most vulnerable and purchase-oriented.
Not convinced yet? Just recall the last time you tried to book a flight, and how the price suddenly skyrocketed as the airline was able to recognise, by tracking your online activity, that you had finally made up your mind about buying the ticket.
Disturbing as they might be, these marketing-oriented techniques are relatively harmless compared with the risks of introducing data-based personalisation in other fields.
Personalised medicine, for instance, holds great promise. By sequencing and analysing DNA, doctors could identify the ideal treatment for a specific patient, a much more effective approach than simply relying on generic guidelines. But the technique carries great dangers as well.
Once your DNA has been sequenced and used to treat you, how do you make sure it is not stolen by, shared with, or sold to third parties? Data stored in hospitals are a boon for hackers, as they contain highly sensitive information that can be sold on the black market or used to blackmail patients (think, for instance, of those in treatment for drug addiction, or suffering from sexually transmitted diseases).
Genetic data are even more sensitive than most, as they represent a “map” of sorts of an individual and of his or her strengths and vulnerabilities.
In theory, such data should be anonymised, but many experts are sceptical about the possibility of actually stripping away every last piece of information that could lead to personal identification.
In the future, as genome sequencing becomes cheaper and more widespread, genetic data could be used by insurers to charge higher premiums to those more likely to develop chronic conditions, or by governments to identify subjects at risk of anti-social behaviour (some studies suggest that genetic factors might play a significant role in the propensity to commit sexual crimes).
Not to mention the fact that, by manipulating the genome, the “personalisation” of the body could stretch to giving parents the option to “order” the perfect child, with all the attributes – eyes, skin, hair, etc – of their choice.
The extent of manipulation allowed not only by genome editing, but by precision medicine in general, calls for a widespread ethical debate and for societal consensus around this most sensitive frontier. But much could be said about the personalisation of the physical space as well.
Call it the Internet of Things or the Internet of Everything, as you wish, but the result is the same: the objects around us are becoming smarter and constantly connected, both to the web and to each other. What this means, in practice, is that you will no longer be able to be just an anonymous face in the crowd.
It's not just the Minority Report-style ads calling you by name as you pass by – we'll have those too, and in some respects they are already here. It's every single object potentially monitoring and reacting to your movements.
In the UK in 2013, for instance, a start-up called Renew London installed 12 Wi-Fi-enabled recycling bins across the city that collected location data from passers-by, thereby tracking their movements. That function was not openly communicated at first, and the campaign had to be stopped following public outrage.
Price gouging and news filtering
Enabling a two-way flow of information between you and the outside world could bring benefits: targeted discounts, say, when you are in your favourite store and logged in to the retailer's mobile app.
Similarly, when coming back from work, your smart home could welcome you with your favourite song, while automatically adjusting the lights to better suit your mood, estimated through facial recognition or data coming from a wearable gadget, or a combination of the two.
But the Internet of Things also has its drawbacks: it could be used, for instance, for replicating offline the same kind of price discrimination that we've already seen happening online with flights and Mac users.
Imagine: you're driving around in your connected car and, from the signals you're sending out, the nearest petrol station is able to estimate that you're about to run out of fuel. What could stop it from automatically raising the price of petrol? Uber already does something similar with its surge pricing feature, in case of rain or extraordinary demand.
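The mechanics of such demand-based pricing are simple enough to sketch. The function below is purely illustrative: the thresholds, the linear scaling and the 3x cap are assumptions for the sake of the example, not Uber's actual algorithm or anyone's real pricing policy.

```python
def surge_multiplier(active_requests: int, available_drivers: int) -> float:
    """Illustrative surge multiplier: the price rises as demand outstrips supply."""
    if available_drivers == 0:
        return 3.0  # no supply at all: apply the maximum surge cap
    ratio = active_requests / available_drivers
    if ratio <= 1.0:
        return 1.0  # supply meets demand: normal price
    # Scale linearly with excess demand, capped at 3x the base fare
    return min(1.0 + 0.5 * (ratio - 1.0), 3.0)

base_fare = 10.0
# 40 requests chasing 10 drivers -> ratio 4.0 -> multiplier 2.5
print(base_fare * surge_multiplier(40, 10))  # → 25.0
```

The same shape of logic, a price that is a function of inferred demand, is what would let the hypothetical petrol station above charge more the moment it detects a near-empty tank.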
Another danger is that of surveillance: for all its cleverness, a smart city, where even bins and street lamps constantly monitor their surroundings, opens the doors to every sort of abuse, especially in non-democratic states. Smart televisions could be used to listen to private conversations; smart electricity or gas meters could provide sensitive information about personal habits; RFID tags could be used to track private cars, as is already happening in some Chinese cities.
Finally, much has been said lately about the dangers of personalised information. According to the “filter bubble” theory, Facebook and Google users are almost exclusively exposed to news and articles that confirm their pre-existing opinions, resulting in an increasingly polarised public sphere and less tolerance of alternative points of view. This, in turn, favours the spread of fake news, since even clearly fabricated claims are unlikely to be challenged in one's own feed.
However, a recent report by the Reuters Institute for the Study of Journalism suggests that the use of social media might actually have the opposite effect. According to the report's authors, “Contrary to conventional wisdom (...) social media use is clearly associated with incidental exposure to additional sources of news that people otherwise wouldn’t use — and with more politically diverse news diets.”
Still, they argue, there is no guarantee that the effect will not change with the next tweak to the algorithm. When it comes to news consumption, as with many other fields in which personalisation plays an increasingly significant role, the jury is still out on whether it will ultimately benefit users or harm them.