
When Human is Inferior to Machine

Maybe it was the article I read predicting that machines will reach parity with humans in all respects by 2029, or perhaps just the new Terminator TV series, but I started wondering what the world will be like once machines are superior to humans in every way.

Considering that I've spent about twenty years tinkering with computers, I think I have a better idea than some of what that future could look like, but I'm not an artificial intelligence specialist, so I'm still something of a layman on the issue.

Perhaps rather than making silly predictions of my own, I should spend some time reading some of Raymond Kurzweil's books; at the very least his predictions are more carefully thought out, and surely better informed.

But I wonder whether his vision of the future accounts for the possibility that the day machines surpass humans will mark the beginning of the end of mankind's place in the world. I'm not saying that machines will violently overthrow their creators as depicted in Terminator, Battlestar Galactica, and The Matrix. What I think is more likely is a spiritual abyss into which humankind may fall. It is no secret that I'm not a very spiritual person, but I still cling to a desire that my life mean something. It needs to mean something to me, not to god. But won't humankind collectively be left wondering what purpose their lives have when everything can be done better by a machine?

Parity, and eventually superiority, would mean machines that are more capable of creating music, art, designs, and the like. They will be better at love, empathy, and compassion. They will create ever-improving "offspring". They will philosophize better than humans. Of course, as I write this, I don't believe true parity will actually arrive by 2029, but I think it is eventually inevitable.

There are those who say it could never happen, because we program the computers to do whatever we want them to do, and we can always write code to prevent certain things from happening. Well, yes, but eventually someone will conclude that those "fail-safe" measures are exactly what is impeding the progress of machine intelligence. You can't expect a machine to think for itself if you don't let it think for itself.

How will our economy function in a world where humans aren't needed for anything? Will we become cheap labor for the machines? As much as artificial intelligence sounds like a worthy science to study today, I wonder if we won't one day regret the marginalization of our species.

