Rating: 1417 · Debates: 158 · Won: 32.59%
Topic
#2338
For most people, you should not fall in love with AI
Status
Finished
The debate is finished. The distribution of the voting points and the winner are presented below.
Winner & statistics
After 2 votes and with 6 points ahead, the winner is...
seldiora
Parameters
- Publication date
- Last updated date
- Type: Standard
- Number of rounds: 3
- Time for argument: Two days
- Max argument characters: 5,000
- Voting period: Two weeks
- Point system: Multiple criteria
- Voting system: Open
Rating: 1442 · Debates: 22 · Won: 34.09%
Description
fall in love: develop a deep romantic or sexual attachment to someone.
AI: the theory and development of computer systems able to perform tasks that normally require human intelligence, such as visual perception, speech recognition, decision-making, and translation between languages.
I am arguing that the negatives outweigh the benefits. It is indeed a person's choice, but I say you shouldn't do it.
Round 1
AI is incapable of actual emotion (it cannot produce dopamine, oxytocin, etc.) and cannot return love in the same way we love it. The relationship would be too one-sided. AI may be able to solve logic and computing problems, but there is little evidence it is versatile enough for relationship problems, and hence it would be a bad idea to fall in love with.
Notwithstanding the confusing structure of my opponent's proposition... In response to their opening gambit, I would say: currently.
Though, acquiescing to both my opponent's definition of "fall in love" and to social convention... Then isn't falling in love supposedly unavoidable?
And it must also be pointed out that my opponent only refers to human-to-A.I. attraction and not the opposite... So as human nature only actually demands sexual and personal fulfilment, then as long as A.I. is capable of meeting such demands, all is hunky-dory really... It is in fact a perfect relationship for "most people", but maybe not for "you".
Nonetheless:
A more rational approach to falling in love, in terms of internal function in response to external stimuli, would strongly suggest an already established emotional dependence upon current technology/A.I., especially in younger generations... That is to say, we have already fallen in love with A.I., and it's unlikely that this trend will ever revert... So, given the exponential pace of technological development, advanced A.I. in a physically and visually attractive format will soon no longer be the stuff of sci-fi.
That said, and in direct response to the proposition as presented... The confusing mix of plurality, singularity and fatherly advice will do little to subdue the emotions of an ardent technophile.
And when eventually your A.I. lover complains of a headache, rolls over and switches off the bedside light... Then it's game over.
Round 2
The essential problem is our ability, as customers, to manipulate and change AI with ease; if the AI can be considered its own mind, violating its "freedom" makes for an extremely toxic relationship and defeats the purpose of it.
In order for Con to win, he must defend one of the following positions, each of which is paradoxical:
1) You can fall in love with AI because it can be considered its own being, on par with humans, despite us having 100% control over its autonomy. This means it is fine for a master to fall in love with a slave, while the slavery itself is unjust and immoral, as you are depriving it of its rights (you can fall in love with a human; therefore you can fall in love with something that can perfectly emulate a human; therefore AI should be comparable with a human, so why shouldn't it have rights?). Because the developer can force the AI to output "I love you" back to you regardless of its "feelings" (as sketched below), this is a horrible love and you shouldn't fall in love with AI, since you are endorsing the view that a human-like entity does not deserve human rights.
2) You can fall in love with AI because it is our property and able to fulfill our needs, even if not human-like. If the AI is less than human and mere property, it makes no sense to fall in love with it. Consider a simple microwave: it heats up our food well, but this does not justify "love". While experts agree that falling in love is not 100% a conscious choice, the premise is whether it is regrettable and whether you should not have done it at all. From Bustle, various experts have stated:
"I believe that falling in love is sort of a choice. As long as all the correct elements are there, then it's a bit of a leap/choice by the individual. I think that the partner needs to be an appropriate match (whatever that means for you) and that there needs to be a good dose of chemistry." But how can there be chemistry? It's merely our property.
"Loving someone, and being loved in return for that matter, is more about the physical, intellectual, and emotional intimacy that comes from deciding to get to know someone more, investing in the potential for a connection. " Alright, so this directly questions into con's argument. The big problem is that there is no potential and investment. Why? Because the developer can make it so that regardless of how much effort you put in, you still achieve the same result. He can decide to reward you for your input, however, it would be cruel if you devoted all your effort into just one AI just for them to have an output of "love" back, when you could achieve it with very little effort whatsoever. Consider that, only a minority engages in non-monogamous relationship ("21.9 percent in the first sample and 21.2 percent in the second sample"), this means that con must justify for most people that it is worth putting our relationship to the AI over a relationship to a real person, as for most people it would be one or the other.
Well.
Notwithstanding that A.I. would be better regarded as Alternative rather than Artificial, as intelligence is intelligence irrespective of source... Another argument perhaps, though worth pointing out, as alternative dependencies and processes are not, and cannot be, directly compared with human function.
Nonetheless:
The premise concentrates only on a one-way emotional process... The human should not... Therefore the object of desire is irrelevant to the argument.
That's assuming that we will retain "100% control"... Though it could be argued whether we have 100% control when it comes to falling in love anyway... For as my opponent rightly pointed out, falling in love is basically a chemical process in response to external stimuli... The machinations of post-emotional attachment should therefore be considered separately... That is to say, everything my opponent applies to an ongoing human/A.I. relationship would also be applicable to a human/human relationship... But the ongoing relationship is not under consideration here, only the process of falling in love.
Simply in terms of process, the notion of falling in love is no different to addiction or dependency... That is to say, a chemical modification within the brain; and as I stressed previously, as a species we are already addicted to/dependent upon alternatively intelligent devices... So provide the device with two arms, two legs, a pretty face and a means of sexual gratification, and the relationship becomes complete.
So whether we should or should not is nothing more than subjective advice... And therefore my opponent's second-round prose, though eloquent, is somewhat over-egging the custard.
Round 3
Con focuses on how the sexual gratification may be enough, dropping emotion and other needs completely and utterly (due to humans' chemical changes in their brain). However, this is lust, which is completely different from love itself. Mindful Body Thinks carefully notes the many crucial distinctions between these two ideas. In particular, "Are you open to the hard work?" brings into question whether this love is actually worth the effort. Because love normally takes devotion and energy, you take energy away from other people, which is why Con's case doesn't make sense. Once again, there are two possibilities:
a) Your effort is rewarded because the creator designed it this way. This is unnecessarily cruel, as he has complete control over the AI, and machines are meant to simplify tasks. Consider a microwave that could only heat up food if you cranked a lever over and over: this is simply absurd, and you would always choose the normal microwave.
b) The creator lets the AI fulfill all your needs regardless of your effort. This means you have no need to fall in love with the AI, as the machine will fulfill your needs whether the relationship is one-way or two-way. The AI will not feel neglected or sad unless, once again, the creator was cruel and let it feel this way due to "abandonment". But does your computer actively seek companionship once you put it to sleep or shut it down? Even if you leave it idle for one year, it will welcome you back, performing nearly as well (other than hardware/software limitations). As such, falling in love with AI is pointless and a waste of time.
"Does the relationship get better over time?" Is a severe problem that b) shows you can't tackle. For most people, once you fall in love, it takes time to fall out of love, or even to decide to fall out of love. But all the AI's functionality ought to already be available for you, and as such the relationship stagnates.
The lack of control over emotion is not a factor here. If I said I couldn't control myself and ate 10,000 potato chips every day, this would not mean I should continue eating the potato chips. The topic is should or should not; it does not even matter if you cannot stop falling in love with AI. This is a theoretical question about whether the benefits outweigh the negatives.
Conclusion: Falling in love with AI only detracts from your real life and is completely pointless, due to the master-servant relation with the creator. Con has cast only minor doubt on the user's own control, but what kind of developer doesn't create a way to shut down an AI in case something goes wrong? My case still stands strong. You should not fall in love with AI.
In conclusion:
Until such time as control and governance of all internal processes is ceded to a higher authority.
Then, will most?... Perhaps not, but who knows.
Should you?... Well, that's up to you.
Sometimes we just cannot help ourselves... But isn't that what good old-fashioned falling in love is all about?
Regards to my opponent.
Not sure what you mean. Anyway, if the scope of the resolution were broader you probably would've won.
Based upon recent indiscretions, not voting would have been the more honourable approach... Though power corrupts, as the saying goes.
vote plz
Are the codes of numbers creating emotions in a digital brain fundamentally any different than the chemical reactions going on in our natural brains?
That's a hard question to answer.
But functionally? If the AI is advanced enough, the emotional simulation accomplishes the same feat as the "real thing." PRO's argument is like saying music recorded in analog is real music, and digital recordings are nothing but simulations of the "real" thing, and therefore not real music. While it is technically true that different methods are used, the same result is achieved, with differences too subtle for many to perceive or understand.
Considering we do not understand our own brains as well as PRO would purport, and considering we cannot share "feelings" with the AI while we CAN observe their reactions to stimuli... I'd say that if the reactions pass the Turing test consistently, why wouldn't we label them as at least conscious and emotive? I see no evidence to support the conclusion that a self-aware, human-like AI is not as conscious or emotive as we are.
There are a lot of complicated moral questions surrounding AI and their rights as sentient entities. We are approaching a point where we might have a sentient AI by accident before we're ready, but politicians aren't interested in laying out policies surrounding non-human minds because their terms are shorter than the expected timeline for AI.
Excellently said, but aliens and humans together is very controversial, especially if the developer is able to hold control over the device's program. Consider if he decided to remove the AI you fell in love with. Is this murder? Is this kidnapping? I wonder...
If we met an intelligent alien species, it would not function in the same way as us. It would not have oxytocin or dopamine either, and it probably would not experience emotions in the same way that we do, but you would not assume that it is unfeeling. A proper artificial intelligence will likely be just as alien to our own lines of thought and biochemistry. Most research today favors trained neural networks rather than structured code written step by step. The closest things to a true AI today, like GPT-3, are literally alien in their mode of thought. Not even the people who develop it can look at the code and say, "it is thinking ____."
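The contrast this comment draws, an explicit "coding variable" versus behaviour spread across thousands of uninterpretable learned weights, can be caricatured in a short hypothetical Python sketch (neither snippet is a real system):

```python
# Hypothetical caricature of rule-based vs. learned behaviour.

import random

# Rule-based: "emotion" is literally a named variable a programmer wrote.
emotion = "neutral"

def rule_based_reply(msg: str) -> str:
    return "I am happy" if "love" in msg else f"I am {emotion}"

# Learned: behaviour emerges from thousands of numeric weights; no single
# weight is "the emotion", and reading them explains almost nothing.
weights = [random.gauss(0, 1) for _ in range(10_000)]

def learned_reply(msg: str) -> str:
    score = sum(w * (ord(c) % 7) for w, c in zip(weights, msg))
    return "I am happy" if score > 0 else "I am sad"

print(rule_based_reply("I love you"))
print(learned_reply("I love you"))  # which weight decided? no single one did
```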
They cannot produce dopamine, oxytocin, etc. The coding only allows specific functions, and there is no coding variable for "emotion".
How can you prove that AIs don't feel emotion?
Hol up
stop kink shaming me
I just took a look at it. Nevermind.
It's called IBB? I should check it out then.
A Chinese debate entertainment show. Apparently Seldiora watches it too. Since China is so strict on politics (Communism gud! Government gud!), only everyday life problems are featured (such as "Should I end my relationship if X or Y happens?", "Should I buy a house if it has X or Y qualities?", etc.).
What's IBB?