In August this year, the Royal Academy of Engineering (RAE) published a thought-provoking report on a roundtable meeting at the Academy about a rarely discussed aspect of autonomous systems – their social, legal and ethical ramifications.
The roundtable focussed on two rapidly developing technologies, autonomous road vehicles and smart homes, because these will be most visible to the general public and therefore attract considerable attention and comment.
While it’s not hard to imagine the motoring public’s initial reaction to autonomous road vehicles (“What, turn the driving over to some robot!”) or the associated insurance issues (“Wasn’t my fault, the robot did it!”), the discussion of smart homes was both fascinating and somewhat alarming.
The early days of smart homes are, in fact, already here. Lights can be set to go on and off automatically, fridges can monitor the comings and goings of food items, and ovens can be set to turn on, cook the roast, and turn off. But just around the corner are smarter homes where the fridge not only monitors the consumption of food but orders replacements as needed, and smart sensors not only detect movement but send out alerts if the occupant doesn't get out of bed, or falls and doesn't get up.
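The alerting behaviour described above amounts to a simple set of rules over sensor timestamps. The following minimal sketch illustrates the idea; the thresholds, function name, and alert messages are illustrative assumptions and are not drawn from the RAE report:

```python
from datetime import datetime, timedelta

# Hypothetical thresholds -- a real system would tune these per occupant.
NO_MOVEMENT_ALERT = timedelta(hours=12)      # e.g. occupant hasn't got out of bed
FALL_RECOVERY_WINDOW = timedelta(minutes=5)  # time allowed to get up after a fall

def check_alerts(last_movement, fall_time, now):
    """Return alert messages given sensor timestamps.

    last_movement: when motion was last detected
    fall_time: when a fall was detected, or None if no fall
    now: current time
    """
    alerts = []
    if now - last_movement > NO_MOVEMENT_ALERT:
        alerts.append("no movement detected for over 12 hours")
    if fall_time is not None and now - fall_time > FALL_RECOVERY_WINDOW:
        alerts.append("occupant fell and has not got up")
    return alerts

# Example: a fall ten minutes ago, with no recovery since, triggers an alert.
now = datetime(2009, 9, 1, 8, 0)
print(check_alerts(now - timedelta(minutes=10), now - timedelta(minutes=10), now))
```

Even a toy rule set like this makes the report's concern concrete: a mis-set threshold or a missed sensor reading is exactly the "Big Brother gets it wrong" failure mode discussed below.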
While this functionality can obviously benefit the old and the frail, do people really want Big Brother watching over them 24/7? What if Big Brother gets it wrong?
The RAE report is a brief but fascinating look at the autonomy that will envelop us all in the next few decades, and at what now seem to be the social, legal and ethical contradictions that will accompany it.
Copies of the full RAE report, “Autonomous Systems: Social, Legal and Ethical Issues”, are available for download online. Look under ‘Recent Events’ at: www.raeng.org.uk/autonomoussystems