Anyway, I personally have no clue why 0/0 doesn't work, but my guess is that 0 by itself is nothing, so you're basically dividing nothing by nothing. It might also be because dividing by zero is like multiplying by infinity, though I'm not sure the opposite of "infinity" is really "nothing".
Yeah, we're learning Calculus (mostly differential atm) in Year 12, and we had a long discussion about limits. The expression you take the limit of is technically undefined at the given value. E.g. for f(x) = 3x^2 + 7x - 5, the derivative is f'(x) = lim h->0 [f(x+h) - f(x)]/h. At h = 0 that quotient is undefined, as we're dividing by 0 (the answer could be infinity, for all we know, or it could be 9000.00000032), but the point of Calculus (well, at least this part of calculus) is that we find the derivative from what the quotient approaches as h -> 0, even though the value at h = 0 itself technically does not exist. Hope that made some sense, and I may have made a mistake.
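Since this is easy to poke at numerically, here's a minimal sketch (Python, my own illustration, not from the thread) showing the difference quotient for that f closing in on f'(2) = 6*2 + 7 = 19 while h = 0 itself stays undefined:

```python
# Numerical sketch of the difference quotient for f(x) = 3x^2 + 7x - 5.
# The true derivative is f'(x) = 6x + 7, so at x = 2 we expect 19.

def f(x):
    return 3 * x**2 + 7 * x - 5

x = 2.0
for h in [1.0, 0.1, 0.01, 0.001, 1e-6]:
    quotient = (f(x + h) - f(x)) / h  # fine for any h != 0
    print(f"h = {h:<8}  quotient = {quotient}")

# At h = 0 the quotient is literally 0/0, and Python refuses to evaluate it:
try:
    print((f(x + 0) - f(x)) / 0)
except ZeroDivisionError as err:
    print("h = 0 raises:", err)
```

The printed quotients walk toward 19 as h shrinks, which is exactly the "theoretical value at h = 0" the limit captures.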
So that implies 0/0 = 1, which is mathemagically correct. (x/x = 1) Hmmm...
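The catch with x/x = 1 is that other expressions also look like 0/0 at zero but head somewhere else entirely, which is why 0/0 is called indeterminate. A tiny sketch (again my own Python, not from the thread):

```python
# Different expressions that all "become 0/0" at x = 0 approach different
# values, which is why no single definition of 0/0 can be consistent.

for x in [0.1, 0.01, 0.001]:
    print(f"x = {x:<6}  x/x = {x / x}   2x/x = {2 * x / x}   x^2/x = {x**2 / x}")

# x/x -> 1, 2x/x -> 2, x^2/x -> 0 as x -> 0; picking 0/0 = 1 would
# contradict the other two.
```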
Guys, please no "divide by zero" funny pictures on this topic. Somebody gave me a good reason that 0/0 isn't valid. But I can't remember what it was.
Quote from: Hot_Dog on February 23, 2012, 12:37:49 pm
Guys, please no "divide by zero" funny pictures on this topic. Somebody gave me a good reason that 0/0 isn't valid. But I can't remember what it was.

Let's represent 0/0 as something else, namely 0*(1/0). This makes it obvious what the main problem is: division of *any* number by 0, whether real or rational, simply isn't defined in the standard system of reals.

It's not immediately obvious why this is so. 0 doesn't appear to be a special number (besides being the null quantity) until we look at the definition of a field, which the set of real numbers forms when combined with the operations of addition, negation, multiplication, and multiplicative inversion (finding A = B^-1). Basically, 0 is what is called the additive identity, which means that any number from the field can be added to it and you will get the same number back out: A + additive identity = A in any field F. The important part is that, by definition, division by the additive identity is undefined in any non-trivial field. It's just a matter of definition. Things go funky when you allow it and inconsistencies creep in, such as being able to "prove" that 1 = 2 in the real numbers.

TL;DR: 0/0 is undefined because convention says it's undefined, to make things work out properly. You can say 0/0 is whatever you want, but as soon as you do, everything else falls apart.

Also, now that I've written this, I see that adriweb beat me to it with a well-timed quote.
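For anyone curious, the usual version of that 1 = 2 "proof" (reconstructed from memory, not from the poster above) hides the division by zero like this:

1. Let a = b.
2. Multiply both sides by a: a^2 = ab.
3. Subtract b^2: a^2 - b^2 = ab - b^2.
4. Factor: (a + b)(a - b) = b(a - b).
5. Divide both sides by (a - b): a + b = b.
6. Since a = b: 2b = b, so 2 = 1.

Step 5 is the illegal move, since a - b = 0. Allowing that one division quietly breaks the whole field.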
Excellent explanation. Thank you.