I'll try to cut this down where I can for clarity; if I skip something, let me know:
I think one of the problems is that you're looking at this issue as something that's threatening to God
I don't think there is a god, so I don't look at it as a threat to god. I do look at it as a challenge to Christian beliefs, but I also understand they'll simply retreat to the god of the gaps.
So how about you? What is your definition of sentient?
I'm not sure; this discussion has made me question it! I think self-awareness is key: some sort of self-preservation instinct, reproduction, the capacity to act in one's own interests, emotional response, etc. It's a pretty big question that seems to come down to "I know it when I see it," but my point is: what happens if you can't tell it's being faked?
Do you think robots could go renegade? Like in movies and sci-fi TV shows where robots take on their own personalities, act independently of human control, etc.?
I suppose, sure. I saw that episode where they went to Itchy and Scratchy Land. It was...formative.
Do you consider robots sentient?
Not the kind that build your car or move inventory around a warehouse, but I'm not ruling out that a robot could one day achieve sentience.
If there's a creator, there's no way to dictate how he can or cannot create a universe. God is sometimes described as light.
Just wanted to point out that sentence one and sentence two are not necessarily related. It's an old topic, but I don't want to take us off track.
Are you saying that anything other than not existing after death would mean a "magical afterlife"?
I'm saying any afterlife seems to be squarely in the realm of the magical: fantasy. It's never been even remotely close to demonstrated. Like Mordor.
Do you think it would be wise to manufacture AI to have independent free will?
I'm not sure it's possible NOT to end up with free-ish-thinking AI if you program it to make decisions as close to human ones as possible.
As for the rest, I don't want to assume it has nothing to do with the story, so feel free to expound by all means. I don't remember any pondering on Doc Frank's part as to whether or not he should give the creature full knowledge or free will (which I think was already assumed, as I don't think Doc Frank wanted a slave), or the ability to love (I think just getting its heart to beat was enough).
I'm talking about the book, not the movie with Boris Karloff or Abbott and Costello. In VERY broad strokes, as it's been a while: the only thing Frankenstein did was prove he could re-animate a dead body. The monster just wanted to be accepted, like a human, and because it was so ugly, even the blind man's family rejected him. He decided he wanted a wife, and he threatened Frankenstein into making him one with arguments like "only a monster would re-animate me into a life of isolation; what's wrong with you?" Then he threatened, and eventually started killing, the people important to Frankenstein. It wanted a wife, someone to love and to be with.

This leads us to the subsequent questions, which can absolutely be asked not only of Frankenstein himself, but of any life-creating force. If you were to create a sentient being, what does that creation owe you? Does it owe you allegiance forever, blind obedience, propitiation, tribute? Why? What do you owe it? Do you owe it anything beyond life? Would you forever have the right to beat it severely whenever you felt like it, whenever it displeased you? Should you undertake creating sentience because you want to torture it? These are the questions of "playing god," not "should we do it."
That's a good question. What do you think?
No, it is not immoral to have children.
Jesus figured out what we were looking for, then decided to make foxes look like dogs, meaning that our efforts at all that genetic engineering weren't ACTUALLY working. They just looked like they were working, but were actually responding to divine intervention. Feasible?
Not particularly. No.
Why not?