free will

Author: keithprosser

Posts: 712
TwoMan
-->
@3RU7AL
Emotion is just one component that distinguishes a human from a computer. Humans also have varying degrees of intelligence and capacity for rational thought, giving some a greater ability to make a rational choice than others, meaning freewill is a sliding scale.

A newborn infant probably has little or no freewill.

An adult with a normal functioning brain has much more.

An adult with a minor cognitive disability has less but not none.

Etc.

3RU7AL
-->
@TwoMan
Emotion is just one component that distinguishes a human from a computer. Humans also have varying degrees of intelligence and capacity for rational thought, giving some a greater ability to make a rational choice than others, meaning freewill is a sliding scale.
A newborn infant probably has little or no freewill.
An adult with a normal functioning brain has much more.
An adult with a minor cognitive disability has less but not none.
Etc.
(IFF) free-will is proportional to intelligence (animals have less, humans have more)

(AND) free-will is proportional to moral culpability (without free-will there is no moral culpability)

(THEN) intelligence is proportional to moral culpability.

Please feel free to modify any of the above statements to better fit your "moral intuition".
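
As an illustrative aside (not part of the original exchange), the proportionality chain above can be sketched as a toy Python model; the constants and numbers are arbitrary placeholders, not anything claimed in the thread:

    # Toy model of the syllogism: freewill proportional to intelligence,
    # culpability proportional to freewill (constants K1, K2 are arbitrary).
    K1 = 2  # freewill per unit of intelligence (placeholder)
    K2 = 3  # culpability per unit of freewill (placeholder)

    def freewill(intelligence: int) -> int:
        return K1 * intelligence

    def culpability(fw: int) -> int:
        return K2 * fw

    # Composing the two gives culpability = K2 * K1 * intelligence,
    # so under these assumptions culpability tracks intelligence.
    assert culpability(freewill(10)) == K2 * K1 * 10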

Computers will soon be able to pass a complex Turing test - [LINK]
TwoMan
-->
@3RU7AL
(THEN) intelligence is proportional to moral culpability.
Intelligence is only one aspect of moral culpability. You might add emotional stability, knowledge and ignorance, intent and other attributes that I can't think of off hand.

The ability to make a choice exists apart from moral culpability but moral culpability requires choice.

3RU7AL
-->
@TwoMan
(THEN) intelligence is proportional to moral culpability.
Intelligence is only one aspect of moral culpability. You might add emotional stability, knowledge and ignorance, intent and other attributes that I can't think of off hand.

The ability to make a choice exists apart from moral culpability but moral culpability requires choice.
The ONLY question is, "HOW DO YOU QUANTIFY FREEWILL?"

If your only answer is "intuition" then you might as well be measuring love.
3RU7AL
-->
@TwoMan
Intelligence is only one aspect of moral culpability. You might add emotional stability, knowledge and ignorance, intent and other attributes that I can't think of off hand.
By this measure, a psychopath has less moral culpability and should therefore be given a lighter sentence than an emotionally stable, knowledgeable, well-intentioned criminal.
Mopac
-->
@3RU7AL
(IFF) freewill = choice (THEN) robotic sorting systems have freewill.
What is an example of a robotic sorting system making a choice?


Because I find it hard to accept that the giant machine at the package handling facility is making a choice when it makes sure all the packages find their way to the right truck. It is simply following directions.


TwoMan
-->
@3RU7AL
The ONLY question is, "HOW DO YOU QUANTIFY FREEWILL?"
You can't. I also can't quantify emotional intensity but it exists nonetheless.

I'm not particularly interested in discussing moral culpability as it relates to the ability to make a choice, only the fact that the phenomenon itself exists.

RationalMadman
Nothing is free in life, even the capacity to make decisions. Free will is not something to EVEN WANT, because it would mean you'd need to lack the body and mind that you're enslaved to in order to have a self.
3RU7AL
-->
@TwoMan
The ONLY question is, "HOW DO YOU QUANTIFY FREEWILL?"
You can't. I also can't quantify emotional intensity but it exists nonetheless.

I'm not particularly interested in discussing moral culpability as it relates to the ability to make a choice, only the fact that the phenomenon itself exists.
(IFF) you can't quantify freewill (AND) you don't care about moral culpability (THEN) why would you bother distinguishing "freewill" from simple "choice"?

Do you consider yourself a "freewill agnostic" or a "freewill apatheist"?
3RU7AL
-->
@Mopac
(IFF) freewill = choice (THEN) robotic sorting systems have freewill.
What is an example of a robotic sorting system making a choice?
Imagine a human working in a mail room.

They look at a package and then decide where it should be moved to.

Do you believe freewill is required for a human to perform this choice?
keithprosser
A human can choose to quit being a postman and join a circus.

3RU7AL
-->
@keithprosser
A human can choose to quit being a postman and join a circus.
A sophisticated robot could make a similar choice. [LINK]
Mopac
-->
@3RU7AL
A human can certainly choose to place the package in the incorrect place. A robotic sorting mechanism does not have this choice.

keithprosser
-->
@Mopac
@3RU7AL
Perhaps we could ask how it is that a person might well want to join a circus but no sorting machine has ever wanted to.   

3RU7AL
-->
@Mopac
A human can certainly choose to place the package in the incorrect place. A robotic sorting mechanism does not have this choice.
A human who intentionally places a package in the incorrect place is a malfunctioning employee and should be fired.

Are you suggesting that you believe freewill is only manifested when a human intentionally violates their code of conduct?

If you have a human who never intentionally places a package in the incorrect place, DO THEY REQUIRE FREEWILL TO DO THEIR JOB?

In other words, in your opinion, do all human choices require freewill or do only some human choices require freewill?
3RU7AL
-->
@keithprosser
Perhaps we could ask how it is that a person might well want to join a circus but no sorting machine has ever wanted to.
It's simply a matter of sophistication and independence. [LINK]
Mopac
-->
@3RU7AL
I think the better question is why do you have such a strong desire to be a robot? Why are you so preoccupied with dehumanization? I see this as possibly a symptom of some issue you are having, this subject coming from that root.

keithprosser
Is it just a matter of sophistication?  The problem I have with that is that I am a very experienced computer programmer but I have no idea how to program a sorting machine with the desire to join a circus!   I know how to fake it and give an impression of having a desire, but not the real thing.
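
A minimal sketch of that "fake it" point, assuming nothing about how real desires work; every name and string here is invented for illustration:

    # A scripted imitation of 'wanting': the machine produces desire-like
    # sentences from a lookup table, with no internal state behind them.
    FAKE_DESIRES = {
        "circus": "I want to run away and join the circus!",
        "mail room": "I love sorting packages.",
    }

    def express_desire(topic: str) -> str:
        # Returns a canned sentence; nothing here resembles feeling a desire.
        return FAKE_DESIRES.get(topic, "I have no opinion on that.")

    print(express_desire("circus"))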
3RU7AL
-->
@Mopac
I think the better question is why do you have such a strong desire to be a robot? Why are you so preoccupied with dehumanization? I see this as possibly a symptom of some issue you are having, this subject coming from that root.
I appreciate your dime-store psychoanalysis, but let's try and stick to the logical distinctions between a sophisticated robot's (or a spider's) "choice" and a human's "choice" and avoid ad hominem red-herrings.
3RU7AL
-->
@keithprosser
Is it just a matter of sophistication?  The problem I have with that is that I am a very experienced computer programmer but I have no idea how to program a sorting machine with the desire to join a circus!   I know how to fake it and give an impression of having a desire, but not the real thing.
The epistemological question is, can anyone tell the difference between a "fake" desire and a "real" desire if they effect the same observable behavior?

Let me know what you think of this - [LINK]
Mopac
-->
@3RU7AL
It should be obvious that sorting machines do not have choice. 

3RU7AL
-->
@Mopac
It should be obvious that sorting machines do not have choice. 
A robot can analyze inbound packages for specific details and then, based on those details (logic), move the package to the corresponding location.

How is this different than a "choice"?
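
As a hedged illustration of the kind of rule-following being described (not any real sorting system's code; the package fields are hypothetical):

    # Rule-based routing: inspect a package's details, return a destination.
    def route_package(package: dict) -> str:
        if package["weight_kg"] > 30:
            return "heavy-freight bay"
        if package["destination_zip"].startswith("9"):
            return "west-coast truck"
        return "default conveyor"

    # The same inputs always produce the same output.
    print(route_package({"weight_kg": 2, "destination_zip": "90210"}))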

A spider can build a web virtually anywhere, yet it prefers certain locations over others and only builds a finite number of webs. How is a spider's decision-making hierarchy distinct from a human's?

Can a spider make a "choice"?
keithprosser
-->
@3RU7AL
I have never wanted to join the circus, but I have wanted other things.   I know what it is like to have a desire.  As an engineer I know how machines work and - unless real magic is involved - machines don't have the interacting parts needed to produce a genuine desire.
If a machine were to have a dream of circus stardom it would have to have more than two cogs and a lever.
If I used more cogs I could make a more life-like fake of 'wanting', but I am not sure I can make a machine feel a desire as I feel desires.
Mopac
-->
@3RU7AL
A sorting machine does not choose any more than a funnel does.
Mopac
-->
@3RU7AL
I don't know if spiders make choices, maybe they do. Sorting machines do not make choices. In fact, a sorting machine's behavior is programmed so precisely that you can always predict what it will do.

When I put this quarter in, it will always land where the quarters are stored. If I put this nickel in, it will always land where the nickels go.


A sorting machine is nothing more than a complicated funnel.

Are you making this comparison because you see people as being nothing more than complicated funnels?


TwoMan
-->
@3RU7AL
(IFF) you can't quantify freewill (AND) you don't care about moral culpability (THEN) why would you bother distinguishing "freewill" from simple "choice"?
I don't. That is the definition of freewill - the ability to make a choice unimpeded.

It seems you likely have a different definition and Mopac may be correct that you define it in such a way as to make it impossible to exist.

3RU7AL
-->
@TwoMan
(IFF) you can't quantify freewill (AND) you don't care about moral culpability (THEN) why would you bother distinguishing "freewill" from simple "choice"?
I don't. That is the definition of freewill - the ability to make a choice unimpeded.

It seems you likely have a different definition and Mopac may be correct that you define it in such a way as to make it impossible to exist.
So would you agree, according to your definitions, that sophisticated computerized neural networks with the ability to learn and optimize tasks also have proper freewill (the ability to choose)?
keithprosser
-->
@3RU7AL
What does the word 'choice' mean? If a computer hits a branch instruction in its program it will go one way or the other, but the computer does not have any choice. Spiders - I expect - are hard-wired with rules that specify their behaviour in any given situation, so no real choice there either.
Why not say a choice (in this context) is a selection made in response to a desire?
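
To make that contrast concrete, here is a rough sketch (the 'preference' numbers are invented stand-ins, not a claim about what desires are): a bare branch fixed by its input versus a selection scored against an internal preference:

    # A bare branch: the outcome is fixed entirely by the input.
    def branch(x: int) -> str:
        return "left" if x < 0 else "right"

    # A selection made 'in response to a desire': pick the option that scores
    # highest against an internal preference table.
    preferences = {"join_circus": 0.9, "keep_sorting_mail": 0.2}

    def choose(options: list) -> str:
        return max(options, key=lambda opt: preferences.get(opt, 0.0))

    print(branch(-3))                                    # left
    print(choose(["join_circus", "keep_sorting_mail"]))  # join_circus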
TwoMan
-->
@3RU7AL
So would you agree, according to your definitions, that sophisticated computerized neural networks with the ability to learn and optimize tasks also have proper freewill (the ability to choose)?
That is not my definition; it is from Wikipedia. I would say that those computers have what is similar to the ability to choose (freewill). To the casual observer it may be indistinguishable. However, computers do not utilize the same variables that humans do when making choices, so it is not identical.

3RU7AL
-->
@keithprosser
What does the word 'choice' mean? If a computer hits a branch instruction in its program it will go one way or the other, but the computer does not have any choice. Spiders - I expect - are hard-wired with rules that specify their behaviour in any given situation, so no real choice there either.
Why not say a choice (in this context) is a selection made in response to a desire?
How would you distinguish between a desire and a program?

Aren't desires the consequence of our human instincts (biologically pre-programmed survival instinct for example) and learned behavior based on (programmed by) our primary experiences (limited by our biological senses and genetically determined brain capacity)?
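
Purely as an illustrative sketch of that framing (innate weights plus learned adjustment), with every number and name invented for the example:

    # A 'desire' modeled as an innate weight that experience nudges.
    class Agent:
        def __init__(self):
            self.desires = {"eat": 0.9, "explore": 0.3}  # pre-programmed 'instincts'

        def learn(self, experience: str, reward: float) -> None:
            # Learned behaviour: experience shifts the innate weighting.
            old = self.desires.get(experience, 0.0)
            self.desires[experience] = old + 0.1 * (reward - old)

        def strongest_desire(self) -> str:
            return max(self.desires, key=self.desires.get)

    agent = Agent()
    agent.learn("explore", reward=1.0)
    print(agent.strongest_desire())  # still "eat" after one small update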