But as humans we should beware the temptation to allow too much technology into our daily lives.
IBM has been granted a patent for a system that would deliver coffee to a person, by drone, based on their cognitive state. Data from cameras and from sensors on customers’ clothes would pick up details about their sleep cycle, when they went to bed the night before, and their blood pressure. Meanwhile, smart software would gather “contextual” information, rummaging through their electronic diaries to see whether they have a meeting coming up in a few minutes, who it is with and how important it is. An algorithm would analyse the data, work out whether the person needed a caffeine hit and, if so, dispatch a drone.
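The decision pipeline the patent describes could be sketched roughly as follows. Every function name, signal and threshold here is invented for illustration; none comes from IBM's actual filing.

```python
from typing import Optional

# Hypothetical sketch of the patent's decision logic: biometric and
# calendar signals feed a simple score, and past a threshold a drone
# delivery is triggered. All names and numbers are illustrative only.

def needs_caffeine(hours_slept: float, systolic_bp: int,
                   minutes_to_next_meeting: Optional[float],
                   meeting_priority: int) -> bool:
    """Return True if the combined signals suggest the person needs coffee."""
    score = 0.0
    if hours_slept < 6:            # short sleep cycle
        score += 2.0
    if systolic_bp < 110:          # low blood pressure as a drowsiness proxy
        score += 1.0
    if minutes_to_next_meeting is not None and minutes_to_next_meeting < 15:
        score += 0.5 * meeting_priority   # "contextual" diary information
    return score >= 2.5

def maybe_dispatch(customer_id: str, **signals) -> str:
    """Dispatch a (notional) drone when the score clears the threshold."""
    if needs_caffeine(**signals):
        return f"drone dispatched to {customer_id}"
    return "no delivery"

print(maybe_dispatch("alice", hours_slept=5.0, systolic_bp=105,
                     minutes_to_next_meeting=10, meeting_priority=3))
```

A short night, low blood pressure and an imminent high-priority meeting push the score over the threshold; a rested customer with an empty diary gets nothing.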
Technology that susses out human desires is now commonplace. Anyone who has used the online retailer Amazon will have browsed selections of products inspired by their previous purchases. Anyone who has used Netflix will have been pelted with suggestions for films closely matched in cast and genre to those they watched the night before. Spotify serves up playlists of songs it thinks its users will like. The same thing goes on behind the scenes of a dating app or a Twitter or Facebook news feed, as the hosts work out how to keep users engaged and advertisers paying for space.
The machines already know what you want.
Yet in drinks, film, music or love, the best choices can be the accidental ones. Many parents of teenage children have surprised themselves by developing an addiction to trashy TV after wandering into the living room at the wrong time. Those teenagers can as easily find themselves hooked by films from the 70s and 80s in their parents’ VHS collection. The best cup of coffee is sometimes not the one whose caffeination has been most carefully judged, but the one most thoughtfully proffered by a colleague. If technology takes over choices like when to have a drink, much of the joy to be had from being alive will be lost.
There will also be more potential for manipulation. As most people do not understand algorithms, they will not be able to tell whether companies are delivering products with their best interests at heart, or because of more obscure, ingeniously coded commercial interests. The tech takeover may come not when AI subjugates humans against their will, but when it comes to know their will too well.