More and more companies are designing their products to generate predictions from user data. It's convenient when products learn to anticipate our wishes ever more accurately - but it won't always stay that pleasant, warns Shoshana Zuboff in her latest book.
A rose is a rose is a rose, wrote Gertrude Stein in 1913. Things are what they are. And at the beginning of the 20th century, that was indeed the case. A century later, the situation has changed fundamentally, because more and more products are designed to be more than what they appear to be. A rose can also be a declaration of love, a data medium and a spying tool. When we receive a rose, we can read a message in it. The same is true for the company that receives the data transmitted by the rose. Combined with data from many other products, it can learn why we bought the rose, when we are likely to buy the next one - and how the rose industry might persuade us to buy more roses, more frequently, in the future.
US economist Shoshana Zuboff calls this new, rapidly growing segment "prediction products". She defines them as "nearly every product or service that begins with the word 'smart' or 'personalised', every internet-enabled device, every digital assistant". These products channel information from our lives to the respective companies in real time. Google, about two decades ago, was the first company to use this approach to make predictions about the future in general and our behavior in particular. Today, Zuboff says, every major company is eager to turn its products into prediction products. She quotes Ford's CEO, who no longer wants to see his cars primarily as a means of transportation, but as 100 million data boxes around the world that constantly report who is using them, when, where and how.
This creates new layers of data between people, things and companies that can benefit everyone. When smart seeds report their state of maturity and these data are linked to hyperlocal weather forecasts, the use of water and fertilizer can be reduced. If a shoe itself notices that its heel is about to break, it can warn us in advance and suggest a suitable replacement - in the right place and the right size. If companies know our behavior, they can adapt their product range to our needs. This increases convenience and customer satisfaction, because product quality feels substantially improved by this additional data layer.
But there’s a downside, too. Zuboff names it after her latest book: "Surveillance Capitalism". Companies, she argues, use the insights gained from our data and our behavior not primarily to make us happy, but to increase their own profits. The temptation for them is therefore great not to change their products to match our wishes, but to change our wishes so that they match their offerings. This can affect us for better or worse: it can increase or decrease our sugar consumption or our risk of heart attack - and often we don't even notice that we suddenly consume or think differently.
For Zuboff, this amounts to expropriation. With prediction products, personality, behavior and privacy become tradable commodities. Companies collect our data not only for their own use, but sell it on in an almost unregulated market; in the end, no one knows where and how this data will ultimately be used. As an example, she cites a German doll that recorded children's conversations with it - and whose recordings were then resold to the US intelligence agency CIA.
Zuboff compares the behavior of today's companies to that of the Spanish conquistadors after the discovery of America: "Back then Columbus simply declared the islands as the territory of the Spanish monarchy and the pope. The first surveillance capitalists also conquered by declaration: Google began by unilaterally declaring that the world wide web was its to take for its search engine. And we were caught off guard. Like the Caribbean people, we faced something truly unprecedented."
Will the digital natives have to suffer the same fate as the American natives in Columbus's time? It doesn't have to come to that, says Zuboff: "Our societies have tamed the dangerous excesses of raw capitalism before, and we must do it again." If, for example, private data remained private property, every user could decide which institutions to entrust it to, and for which purposes. Then prediction products would no longer be a threat, but a promise.