(cont'd from yesterday's post)
Yes, there's a lot of room for interpretation of simple data collected by AI. Just look at the way the Oregon Department of Education thinks math should be taught. Even a simple math fact like 2+2=4 can be re-imagined and re-interpreted.
Do all people really want the truth when they ask a question? Sure.
But many don't think they need to give truthful answers to others . . . if they don't like what would result from giving a truthful answer. Everybody is tempted to answer questions for their own benefit or preference. Builders of software are, in that respect, just as human as everybody else. Their ideology, their view of the world, is built into their software.
You'll notice that makers of AI systems don't make any claims about truth. Except for Elon Musk. He says of his AI system, "The goal of Grok is the truth, the whole truth and nothing but the truth. We will never be perfect, but we shall nonetheless strive towards that goal."
Grok has been trained to look for the truth. Of course it won't be perfect. But as far as I can tell, the other systems don't even make truth a goal. That's enough for me. I'm using Grok.