
### Why Can't 0 Divided By 0 Be 0?

```
Date: 11/25/2003 at 17:27:49
From: Steven
Subject: Dividing 0 by 0.

Why can't you divide 0 by 0?  I've thought about it, and it seems that
dividing 0 objects into 0 groups will result in 0 groups, which makes
sense.  I've read over your "Why dividing by 0 doesn't work" posts,
but I don't see any real reference to this.

Here's my take on the classic 'proof' that 2 = 1:

a = b
a^2 = ab
a^2 - b^2 = ab - b^2
(a-b)(a+b) = b(a-b)
a+b = b
b+b = b
2b = b
2 = 1

In this situation, if we assume a = b = 0, wouldn't it work out?

```
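As an editorial aside, the algebra in the letter above can be checked numerically. The following is a minimal Python sketch (the variable names and the check itself are illustrative, not part of the original exchange):

```python
a = b = 1  # any pair with a == b exhibits the problem, including a = b = 0

# Every equation in the 'proof' up to the factoring step is genuinely true:
assert a == b
assert a**2 == a * b
assert a**2 - b**2 == a * b - b**2       # both sides equal 0
assert (a - b) * (a + b) == b * (a - b)  # 0 == 0, still true

# The step from (a-b)(a+b) = b(a-b) to a+b = b divides both sides
# by (a - b), which is 0 -- exactly the undefined operation:
division_is_valid = True
try:
    ((a - b) * (a + b)) / (a - b)
except ZeroDivisionError:
    division_is_valid = False

assert not division_is_valid  # the cancellation step is illegal
```

Every equation holds until both sides are divided by a - b, which is zero; that single step is where the "proof" breaks down.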

```
Date: 11/26/2003 at 23:39:03
From: Doctor Ian
Subject: Re: Dividing 0 by 0.

Hi Steven,

>Why can't you divide 0 by 0?

Because division of anything by zero is undefined.  It's undefined,
because any possible number that might be assigned to it would cause
most of arithmetic to fall apart, by allowing false statements (like
the one you 'prove' below) to be proved.

>I've thought about it, and it seems that dividing 0 objects into 0
>groups will result in 0 groups, which makes sense.  I've read over
>your "Why dividing by 0 doesn't work" posts, but I don't see any real
>reference to this.

Because it's irrelevant.  Numbers are used to model various objects
and activities in the world, but you can't draw conclusions about
numbers based on those objects or activities, because numbers aren't
constrained at all by what goes on in the world.

For example, if you take an object in the real world and start halving
it, you eventually get to the point where you've got something
indivisible (e.g., a quark).  Should we then conclude that if we keep
halving a number, eventually we'll get to a number that can't be
halved again?  No, we shouldn't.  Why not?  Because numbers aren't
objects.  They're just concepts that we sometimes use to represent
objects, the way we might use sugar cubes to represent battleships
when planning a naval battle.

>Here's my take on the classic 'proof' that 2 = 1:
>
>a = b
>a^2 = ab
>a^2 - b^2 = ab - b^2
>(a-b)(a+b) = b(a-b)

If a and b are equal, dividing by (a-b) is dividing by zero, and
that's undefined.  So you can't draw any valid conclusions from it.

>a+b = b
>b+b = b
>2b = b
>2 = 1
>
>In this situation, if we assume a = b = 0, wouldn't it work out?

No, because you're still dividing by zero.

Let's put it this way:  You _could_ define 0/0 to be zero, but only at
the cost of then being able to prove things like 2 = 1.  But once you
can prove that 2 = 1, you can prove _anything_, and math becomes
useless.

So what you'd be doing would be sort of like saying, "I'm going to
change the rules of chess, so that a queen can move directly to any
square, even if there are other pieces in the way, and even if that
square isn't in the same rank, file, or diagonal."

Now, you might find that game entertaining, but most people would find
it completely uninteresting, because both kings begin the game in
checkmate.

Similarly, a mathematical system in which it's possible to prove that
2 = 1 might be interesting to you, but it would hold no interest for
mathematicians.

Does this make sense?

- Doctor Ian, The Math Forum
http://mathforum.org/dr.math/
```
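Doctor Ian's point that 0/0 is left undefined, rather than assigned the value 0, is mirrored in most programming languages. A small Python illustration (the helper name `try_divide` is made up for this sketch):

```python
def try_divide(x, y):
    """Return x / y, or the string 'undefined' when the division fails."""
    try:
        return x / y
    except ZeroDivisionError:
        return "undefined"

print(try_divide(0, 0))  # 0/0 is rejected outright, not defined as 0
print(try_divide(0, 5))  # 0 divided by a nonzero number is fine: 0.0
```

Python raises `ZeroDivisionError` for any division by zero, 0/0 included, rather than pick an answer that would make arithmetic inconsistent.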

```
Date: 11/27/2003 at 17:04:55
From: Steven
Subject: Thank you (Dividing 0 by 0.)

OK, I understand perfectly now.  Your explanation was much better than
the one my teacher gave me, which was simply, "You just can't."  Thanks!
```
Associated Topics:
High School Number Theory
