THE World Science Festival held in Brisbane in early March confirmed that robots, artificial intelligence and machine learning were now part of our lives.
Thousands attending the festival came to watch, touch and play with cute, shiny robots capable of dodging objects, following commands and engaging in smart banter.
If the future has arrived, however, we now have to deal with it.
The World Science Festival was also an important forum as world experts discussed robot morality and ethics and what role we wanted robots to play in the future.
Factory robots have long been programmed with specific tasks – move arm 20 centimetres to the left, grab module, twist to the right, and insert module into PC board. Repeat 300 times each hour.
These machines are no smarter than a kitchen Thermomix.
As the first generation of self-driving cars and battlefield warbots filter into society, scientists are working to develop robots with moral decision-making skills.
Breakthroughs in machine learning – algorithms that roughly mimic the human brain and allow machines to learn things for themselves – have given computers a remarkable ability to recognise speech and identify visual patterns.
The algorithms can be created with a defined set of parameters – the international laws of war, for example. Or they might be influenced by “ethical adapters”, programs that simulate human emotions like guilt and shame.
Robot morality was the subject of one key festival forum, with Professor Rob Sparrow, chief investigator at Monash University’s Centre for Human Bioethics, arguing that robots should only be allowed to operate in accordance with human rules. He argued strongly against the use of autonomous weapons systems.
“I think there are some things we should allow robots to do and other things they should definitely not do,” Prof Sparrow said.
His simple, important view accompanies news that technology now exists to deploy machine-gun-wielding robot soldiers that can use heat and motion detectors to identify potential targets several kilometres away.
A human is still required to fire the weapons, but full autonomy could be close at hand.
The rise of smart machines is gaining momentum, and that, according to Prof Sparrow, could signal a dystopian future.
“I think that if there is any prospect of building a super intelligent computer we should stop doing it,” he said.
“These things could pose a threat to human life as we know it.
“I don’t think we are anywhere near that at the moment.
“But if someone was trying to convince me that we could turn on a machine that is a thousand times as intelligent as you or I, I would be trying to convince them not to throw that switch.”
Queensland Bioethics Centre and John Paul II Centre for Family and Life director Dr Ray Campbell said the moral issue was how we chose to use these inventions.
“It is about the exercise of the virtue of prudence, recognising the various goods involved,” he said.
“We already have many examples of mechanisation where machines have replaced humans in parts of the workforce.
“This has often made the workplace safer. At the same time, new jobs have been created.”
This is undoubtedly the case in the car industry, where robots have replaced many production-line jobs, raised efficiencies and reduced running costs. Robots are also being widely researched and introduced in the education and aged-care fields, with the benefits being widely promoted.
Professor Janet Wiles, a biorobotics researcher at the University of Queensland, said there was already a decade of work on teacher-assistant robots that could enhance classroom results and reduce the workload of teachers.
“It’s not the point of replacing teachers … our teachers are overworked, but there are certain tasks robots could be doing – certain types of marking, for example multiple choice,” Prof Wiles said.
In primary school classroom trials, including those in Queensland, small robots are already used to engage the interest of students and to monitor their work.
“They are used to encourage, to amplify and to keep a child’s attention on task,” Prof Wiles said.
She said robots were being used increasingly in aged care to address what she described as a “demographic time bomb” – the care of Australia’s growing elderly population.
One baby-faced robot, known as Charlie, is being used in research to improve emotional wellbeing for people living with dementia. Weighing about 6.5kg and standing about 39cm tall, Charlie can sing and dance, read the news, tell jokes, make phone calls, and receive text and voice messages, videos and pictures.
Robots like Charlie can remind a person to take their medication, monitor their mood and adjust activities to suit, and keep friends and relatives connected to their loved one from a distance.
Carer robots can also gather information of great assistance to nursing staff, such as how many steps a person has walked in a day, their general activity, and immediate readings of heart rate and blood pressure.
Within hospitals and nursing homes, cleaning robots are already widely used, while other larger, stronger robots are in development to lift, carry and wheel patients.
One of the challenges will be to adapt these types of robots to work at home, where an increasing number of Australia’s elderly are likely to live in the future.
Despite the many benefits, Prof Sparrow warns about de-humanising aged care and handing too much over to robot carers.
“There are lots of value choices being presumed when people say we’ve got no choice but to embrace robot carers,” he said.
“It may be that we have to look at revaluing caring roles so that instead of being lowly paid and undervalued roles, they become seen as important and as well rewarded as an accounts executive or an advertising manager.
“It’s interesting how people are much more willing to see older citizens looked after by robots when they wouldn’t be as keen to see children looked after in the same way. It’s because in our society we see older people as obsolete, as embarrassing.
“We shouldn’t see old age and death as something that must be kept from us, or avoided at all costs, or disguised.
“I think that often religious traditions have a better understanding of the role of relationships and care in making a human existence worthwhile.
“A future from birth to death in which you never spoke to other human beings because you were surrounded by excellent robots, that’s not a good human life, and that is something that religious traditions are more conscious of.”
Dr Campbell said: “In introducing new changes into the workforce one needs to look at more than simple economics.
“It is part of the role of government to look to the common good, and work with industry to protect the dignity of the human person.”
Those weighing the robot ethics debate, and the importance of work and its role in man’s psychological wellbeing, can also draw from St John Paul II in his encyclical Laborem Exercens (On Human Work).
“Work is one of the characteristics that distinguish man from the rest of creatures, whose activity for sustaining their lives cannot be called work,” Pope John Paul wrote.
“Only man is capable of work, and only man works, at the same time by work occupying his existence on earth.
“Thus work bears a particular mark of man and of humanity, the mark of a person operating within a community of persons.”
By Mark Bowling
Mark Bowling is a multimedia journalist with The Catholic Leader.