(cont'd from yesterday's post)
Yes, there's a lot of room for interpretation of simple data. Just look at the way the Oregon Department of Education thinks math should be taught. Even a math fact like 2+2=4 can be "re-imagined" and "re-interpreted." Their claim that 2+2 may equal 5 is simply not true.
Do all people want the truth when they ask a question? Sure.
But they may still be OK with giving untrue answers to other people if they don't like what would result from giving a truthful answer. Everybody is tempted to give the answers they prefer, true or not, and builders of software are just as human as everybody else. Their ideology, their favorite opinions, get built into their software. See below Gemini AI's image of Nazi soldiers in World War II (hint: no people of color would ever have been Nazi soldiers).
Do AI systems claim to present true information? Not to my knowledge, with one exception: Elon Musk. He says of his AI system, "The goal of Grok is the truth, the whole truth and nothing but the truth. We will never be perfect, but we shall nonetheless strive towards that goal."
Grok has been trained to look for the truth. Of course it won't be perfect. But as far as I can tell, the other systems don't even make truth a goal. That's enough for me. I'm using Grok.