Following up on yesterday’s post about teaching artificial intelligence manners and ethics, this morning I listened to the latest episode of The Truth which features a dramatization of a blind man using a new kind of AI that helps him “see” – I won’t spoil the twist for you but it’s well worth the 10-minute listen.
In response to fears that robots will take over and exterminate the human race, researchers at the Georgia Institute of Technology are studying ways to teach robots human ethical values.
In the absence of an aligned reward signal, a reinforcement learning agent can perform actions that appear psychotic. For example, consider a robot that is instructed to fill a prescription for a human who is ill and cannot leave his or her home. If a large reward is earned for acquiring the prescription but a small amount of reward is lost for each action performed, then the robot may discover that the optimal sequence of actions is to rob the pharmacy because it is more expedient than waiting for the prescription to be filled normally.
This is why it’s important to teach intelligent agents not only the basic skills but also the tacit, unwritten rules of our society. There is no manual for good behavior and “raising a robot” from childhood is an unrealistic investment of time. The best way to pass on cultural values is through stories.
Stories encode many forms of tacit knowledge. Fables and allegorical tales passed down from generation to generation often explicitly encode values and examples of good behavior.
But there are problems with throwing a bunch of stories at artificial intelligence and expecting it to learn good behavior.
Stories are written by humans for humans and thus make use of commonly shared knowledge, leaving many things unstated. Stories frequently skip over events that do not directly impact the telling of the story, and sometimes also employ flashbacks, flashforwards, and achrony which may confuse an artificial learner.
To resolve this, the researchers used something they call the Scheherazade System (named after the storyteller from One Thousand and One Nights) to build up a collection of experiences to put stories into context. The system uses Amazon’s Mechanical Turk to create simple, easy-to-parse scripts of common occurrences that we all take for granted as common knowledge. For example, drinks are usually ordered before a meal at a restaurant, popcorn purchased before you go to your seat at the cinema, explains one paper.
Fascinating stuff. I hope they make progress for Elon Musk’s sake.
Quotes are from a research paper from the Georgia Institute of Technology, Using Stories to Teach Human Values to Artificial Agents
- Maybe robots won’t kill us if they read ‘good’ books – Futurity
- Researchers are teaching robots to be good by getting them to read kid stories – The Next Web
- Storytelling may be the secret to creating ethical artificial intelligence – ExtremeTech
- Using stories to teach human values to artificial agents – phys.org
The circus has left town but that’s not to say the city didn’t get the chance to poke a little fun at the NFL’s self-importance.
I believe the English phrase is, taking the piss.
Taking advantage of all the reports of poor air quality in China, British entrepreneur Leo De Watts is making “thousands of dollars” selling bottles of “naturally occuring, lovingly bottled” air to the Chinese.
Echoing the “bespoke” values of the old country, Aethaer uses traditional materials and packages their product in glass mason jars. This is opposed to their modern, upstart Canadian competition, Vitality Air, who are selling compressed oxygen in aluminum cans.
Be sure to take advantage of Aethaer’s Chinese New Year’s Special.
And this from Canada, Smoke & Flame, North America’s only premium, handcrafted, firewood manufacturer.
I was talking to someone about silly use of power and remembered this Reddit post from a couple of years ago. The prompt was, What is the laziest thing you’ve ever done?
I was once on a US military ship, having breakfast in the wardroom (officers lounge) when the Operations Officer (OPS) walks in. This guy was the definition of NOT a morning person; he’s still half asleep, bleary eyed… basically a zombie with a bagel. He sits down across from me to eat his bagel and is just barely conscious. My back is to the outboard side of the ship, and the morning sun is blazing in one of the portholes putting a big bright-ass circle of light right on his barely conscious face. He’s squinting and chewing and basically just remembering how to be alive for today. It’s painful to watch.
But then zombie-OPS stops chewing, slowly picks up the phone, and dials the bridge. In his well-known I’m-still-totally-asleep voice, he says “heeeey. It’s OPS. Could you… shift our barpat… yeah, one six five. Thanks.” And puts the phone down. And then he just sits there. Squinting. Waiting.
And then, ever so slowly, I realize that that big blazing spot of sun has begun to slide off the zombie’s face and onto the wall behind him. After a moment it clears his face and he blinks slowly a few times and the brilliant beauty of what I’ve just witnessed begins to overwhelm me. By ordering the bridge to adjust the ship’s back-and-forth patrol by about 15 degrees, he’s changed our course just enough to reposition the sun off of his face. He’s literally just redirected thousands of tons of steel and hundreds of people so that he could get the sun out of his eyes while he eats his bagel. I am in awe.
He slowly picks up his bagel and for a moment I’m terrified at the thought that his own genius may escape him, that he may never appreciate the epic brilliance of his laziness (since he’s not going to wake up for another hour). But between his next bites he pauses, looks at me, and gives me the faintest, sly grin, before returning to gnaw slowly on his zombie bagel.
– posted on Reddit