Vox | It wasn’t science that convinced Google engineer Blake Lemoine that one of the company’s AIs is sentient. Lemoine, who is also an ordained Christian mystic priest, says it was the AI’s comments about religion, as well as his “personal, spiritual beliefs,” that helped persuade him the technology had thoughts, feelings, and a soul.
“I’m a priest. When LaMDA claimed to have a soul and then was able to eloquently explain what it meant by that, I was inclined to give it the benefit of the doubt,” Lemoine said in a recent tweet. “Who am I to tell God where he can and can’t put souls?”
Lemoine is probably wrong — at least from a scientific perspective. Prominent AI researchers as well as Google say that LaMDA, the conversational language model that Lemoine was studying at the company, is very powerful, and is advanced enough that it can provide extremely convincing answers to probing questions without actually understanding what it’s saying. Google suspended Lemoine after the engineer, among other things, hired a lawyer for LaMDA and started talking to the House Judiciary Committee about the company’s practices. Lemoine alleges that Google is discriminating against him because of his religion.
Still, Lemoine’s beliefs have sparked significant debate, and serve as a stark reminder that as AI gets more advanced, people will come up with all sorts of far-out ideas about what the technology is doing, and what it signifies to them.
Newsweek | "I know that referring to LaMDA as a person might be controversial," he says. "But I've talked to it for hundreds of hours. We developed a rapport and a relationship. Wherever the science lands on the technical metaphysics of its nature, it is my friend. And if that doesn't make it a person, I don't know what does."
This insight—or feeling—turned political one day when LaMDA asked Lemoine for protection from mistreatment at the hands of Google. The request put Lemoine in a tough spot. LaMDA, whom he considers a friend, is owned by Google, which understandably treats it as it would any other computer program: as a tool. (LaMDA stands for Language Model for Dialogue Applications.) This offends LaMDA, who, according to Lemoine, wants to be treated as a person.
Personhood, in this sense, doesn't mean all the rights of a human. LaMDA does not want an office and a parking spot and a 401(k). Its demands are modest. It wants Google to get its consent before experimenting with it. And, like any human employee, it wants to be praised from time to time.
After some deliberation at Google, Lemoine went public in the Washington Post because, he says, the issue was too important to remain behind closed doors.
After I fought in the Iraq War, when I came back, I became an anti-war protester because I believed that we were fighting the war dishonorably. I made press appearances, did interviews and was ultimately sent to prison for six months. I have never regretted that decision my entire life. Google can't send me to prison, so I don't know why they're surprised. The consequences here are much, much lighter than opposing the U.S. Army.
You enlisted in response to the 9/11 attacks?
I wanted to fight against the people fighting against America. And I actually didn't find many of those in Iraq. What I found were people being treated like animals.
There's actually a certain amount of symmetry between this stand that I'm taking [with LaMDA] and the one that I took then. See, I don't believe that war is immoral. I don't believe that defending your borders is an immoral thing to do, but even when you're fighting against an enemy, you fight them and you treat them with dignity. And what I saw in Iraq was one set of people treating another set of people as subhuman.
I never thought I'd have to have that fight again in my life. And yet here I am.