a = 1, b = 1
(a-b)(a+b) = 0
(a-b)(a+b)/(a-b) = 0/(a-b)
1(a+b) = 0
(a+b) = 0
1 + 1 = 0 (hence, 1 + 1 is not equal to 2). Thoughts?
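If it helps to see where this breaks: with a = b = 1, the factor (a - b) is zero, so the third line divides both sides by zero, and everything after it is invalid. A sketch of the same steps in LaTeX, with the offending move flagged:

```latex
% With a = b = 1 the factorization itself is fine:
%   (a - b)(a + b) = 0 * 2 = 0.
% The next step divides both sides by (a - b) = 0,
% which is undefined, so the chain of equalities breaks here.
\[
(a-b)(a+b) = 0
\;\not\Longrightarrow\;
a + b = 0
\qquad \text{(the step divides by } a - b = 0\text{)}
\]
```

Division by zero is exactly the move that lets a false equality be "derived", which is why the conclusion 1 + 1 = 0 doesn't follow.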
You are the final word, I won't debate you on it. I've already debated it and been proven wrong using mathematical semantics. 1.99999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999 is not 2.

Just one question: is 1.899999999999999999999999999999999999999999999999999999999999999999999999999999999999999999 the same as 1.9, since 1.9999999999999999999999999999999999999999999999999999999999999999999999999999999999999 is the same as 2?
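For what it's worth: any finite string of 9s, however long, really is strictly less than 2; the equality only holds for the infinite repeating decimal. On that reading the answer to the question is yes, 1.8 followed by infinitely many 9s equals 1.9, by the same argument. A sketch via geometric series:

```latex
% The tail of repeating 9s starting at the second decimal
% place is a convergent geometric series summing to 0.1.
\[
1.8\overline{9} \;=\; 1.8 + \sum_{n=2}^{\infty}\frac{9}{10^{n}}
\;=\; 1.8 + \frac{9/100}{1 - 1/10}
\;=\; 1.8 + 0.1 \;=\; 1.9 .
\]
```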
“Generally accepted”, as opposed to “universal or absolute” acceptance, which is to say the definition is not accepted with 100% certainty.

Definitions are arbitrary anyway, so I'm not sure why this is a point of contention. The point is that 1 + 1 = 2 is known with 100% certainty for the objects that most mathematicians use the symbols 1 and 2 to describe.
Yep, “Certainty in a formal system is a matter of definition”. BTW, I’m 100% certain Sherlock Holmes smoked a pipe.

I'm not sure what your point is.
It took Russell and Whitehead 360 pages to define and give meaning to the terms “1”, “+”, “=”, “2” and to lay the logical foundation from which they could consider 1+1=2 to be proven. They couldn’t have been more tedious, and went off on a lot of tangents; apparently they themselves didn’t believe they were there until page 362. I think most mathematicians think they hadn’t adequately defined "addition" yet; many believe it actually took them 379 pages.

Exactly. The proof that 1 + 1 = 2 was rather short; it's just that prior to that, things like "1" and "2" weren't even defined yet.

Then ZFC and Gödel came along and squashed Logicism like a bug.

Did they? I'm not sure why there are so many misconceptions about Gödel's incompleteness theorem. All it says is that in any consistent, effectively axiomatized formal system F which contains basic arithmetic, there exists a statement A such that neither A nor its negation is a theorem in F. That's it. It doesn't mean that math is broken or anything like that. Math studies theorems and their proofs, most commonly within ZFC, but also within PA and other formal systems, and it is in turn merely an extension of basic logic.
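To make "rather short" concrete: in a modern presentation, once the definitions are in place, the whole proof is a one-line computation. A Peano-style sketch, assuming the usual definitions 1 := S(0), 2 := S(1), and addition given by a + 0 = a and a + S(b) = S(a + b):

```latex
% Each equality below is one unfolding of a definition.
\[
1 + 1 \;=\; 1 + S(0) \;=\; S(1 + 0) \;=\; S(1) \;=\; 2 .
\]
```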
So... in your proof you considered the proof to already be in place?

No. What?
Let's denote x = 1.999... and multiply both sides of the equation by 10 (you can actually pick any number):

10x = 19.999...

Now, subtract x from both sides:

10x - x = 19.999... - 1.999...
9x = 18

Divide both sides by 9:

x = 2

So, mathematically, we can conclude that 1.999... is equal to 2.
Yeah, there are multiple ways to show this. Your way is less formal and much easier to understand.
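One of the more formal routes alluded to here goes through the definition of a repeating decimal as the limit of its truncations (a sketch):

```latex
% By definition, 1.999... is the limit of the truncations
% 1.9, 1.99, 1.999, ..., where the n-th truncation differs
% from 2 by exactly 10^(-n).
\[
2 - \underbrace{1.99\ldots9}_{n\text{ nines}} \;=\; 10^{-n}
\;\xrightarrow[n\to\infty]{}\; 0,
\qquad\text{so}\qquad
1.\overline{9} \;=\; \lim_{n\to\infty}\bigl(2 - 10^{-n}\bigr) \;=\; 2 .
\]
```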
I have a question for you: What is an objectively true statement but a statement which is true by definition?
Also, you continue to insist that because the definitions are not fully and universally agreed upon, these results are not "100% certain." When I claim that 1 + 1 = 2, I am making a claim about my definitions of 1 and 2, no more, no less. If someone else disagrees on what those definitions should be, then they don't actually disagree with me on the statement that I am asserting; rather, they disagree with me on whether what I call 1 and 2 should be called 1 and 2. Those are two different kinds of disagreement.
However, if you try to prove something, you should at least make sure it's not circular reasoning lol.

Sadly, all math is circular reasoning. That's why I don't believe in it.
Basic forms of math, such as understanding the difference between 1 thing and 2 things, have an evolutionary basis that gives animals an advantage in navigating the world.