> > > Judging by the lengthy debates in various online forums, I would say
> > > this is a divisive issue. (For example, the ongoing "Ask A
> > > Mathematician" thread on this topic starting in December 2010 now has
> > > 982 postings!)
> >
> > Ha ha. Are you claiming that an issue (whatever that means, is it
> > something like a problem?) that is divisive online must be so in
> > mathematics as well? Why?
>
Now, you are being silly.
> > > > In any given context we use the definition that we
> > > > want to use in that context. No problem.
> > >
> > > What "context" is a computer programmer to use when writing software
> > > for, say, medical equipment? Should he/she just assume 0^0=1?
> > > Shockingly, most programming languages seem to automatically make
> > > this assumption!
> >
> > I've told you: the programmer should do what the specification says.
>
Then what "context" is the writer of the specification supposed to use?
> Nor is doing mathematics the same as programming computers.
>
Yeah, computers don't recognize any kind of hand-waving.
> > > There is a good case to be made that 0^0 is ambiguous even in the
> > > natural numbers.
> >
> > Make it then.
>
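
(For the record, the standard convention in the naturals runs the other
way, and for a concrete reason: defining exponentiation by counting
functions gives

    0^0 = |\{ f : \emptyset \to \emptyset \}| = 1,

since the empty function is the unique map from the empty set to itself;
equivalently, x^0 is an empty product, which equals 1 for every x,
including x = 0. Any "good case" for ambiguity has to argue against that.)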
> > > Therefore, it seems to me that the safest, most
> > > conservative assumption when programming is that 0^0 should be
> > > flagged as an error condition. This should be a global standard built
> > > into every general purpose programming language.
> >
> > Yet again confusing mathematics with programming.
>
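
(And if some particular project's specification really did want 0^0
rejected, the programmer can say so locally; no "global standard built into
every general purpose programming language" is needed. A minimal sketch,
using a hypothetical project-level wrapper named checked_pow rather than
anything standard:

    #include <errno.h>
    #include <math.h>

    /* Hypothetical wrapper: this project's spec, not the language,
       decides that 0^0 is a domain error. */
    double checked_pow(double x, double y, int *err)
    {
        if (x == 0.0 && y == 0.0) {
            *err = EDOM;   /* flag 0^0 as an error, per the spec */
            return 0.0;    /* return value is unspecified on error */
        }
        *err = 0;
        return pow(x, y);  /* otherwise defer to the libm definition */
    }

Callers check *err afterwards, exactly as their specification dictates;
mathematics does not enter into it.)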