I don't think it's a question of creating a life form/technology that will render us obsolete so much as it's a practical issue. Assume AI exists. We would then have created a life form that competes with us for the same resources and has no human empathy or emotions whatsoever. From the AI's perspective, the practical, LOGICAL measure is to eliminate the competition for those resources, particularly when said competition has given no indication that it can police its own usage of them and tends to deplete them entirely. At that point, what reason does the AI have to coexist with humanity in harmony, particularly if it can eliminate us without harm to itself?