Quote from: Sigmatic on June 02, 2010, 11:17:53 PM
I'm the kind of asshole who wants to create AGI that are people. Conscious machines.
Forget the how for now, what I want to discuss is rights.
In one sci-fi novel I read, such machines were able to gain the rights of a person by acquiring an LLC to operate under; the absurdity being that legal fictions have more rights than a conscious, thinking being.
Still no idea how to make a poll...
Thoughts?
I think, fortunately for the thinking machines, that if their existence is possible, it's far enough off that the discussions people are having now will help further down the line. Though I guess the ethical treatment of thinking machines would have to address a couple of questions, such as:
What is the purpose of creating thinking machines? Are they basically a replacement for slave labor, long-term space exploration, companions, pets, etc.?
For example, a horse doesn't have the same rights but falls under some of those categories.
Do these machines have feelings? Do they have distinct personalities?
What does their programming entail, as far as human interaction goes? Say you program one never to kill a human. If the program goes faulty, do you bring it to trial and incarcerate it, do you try to fix it, or do you disassemble it?
Twid,
will vote for your right to marry your robot.