The Math Forum

Ask Dr. Math - Questions and Answers from our Archives

Inverses within Semigroups

Date: 05/06/2002 at 14:49:38
From: Clayton
Subject: Inverse operations: addition/subtraction, 
         multiplication/division, exponentiation/logarithm

I have been studying set theory and have come across the definition 
of inverses within semigroups. The book I have been reading is "Set 
Theory and Logic," by Robert R. Stoll.

In his discussion of semigroups he defines inverses, "An element a of 
a semigroup X, <X,*,e>, is invertible iff there exists an element a' 
of X such that a * a' = a' * a = e."

He goes on to discuss the semigroups for addition in which e = 0 and 
multiplication for which e = 1. He does not discuss a semigroup for 
exponentiation. I would like to know the value of e for such a 
semigroup, if it exists. In other words, what CONSTANT value e 
satisfies the equation a^b = e?

I have been exploring complex exponentiation in the hope that e for 
exponentiation may have a complex solution. I have tried solving the 
equation a^b = e, directly by:

         log_a(a^b) = log_a(e)

         b = log_a(e)

If a is in R, however, and is allowed to vary from -oo to +oo, there 
is no sensible solution for b that I can find.


Date: 05/06/2002 at 17:04:22
From: Doctor Peterson
Subject: Re: Inverse operations: addition/subtraction, 
             multiplication/division, exponentiation/logarithm

Hi, Clayton.

You've jumped over the preliminaries, which must never be ignored: 
before you can talk about inverses, you have to look at the 
definitions of a semigroup and of an identity (your "e"). You are 
taking the definition of an inverse as if it were the definition of 
the identity, skipping an important step.

But first, we have to see whether the set of real or complex numbers 
forms a semigroup under exponentiation. I'll take the definition 
given in Eric Weisstein's MathWorld: 


    A mathematical object defined for a set and a binary
    operator in which the multiplication operation is associative.
    No other restrictions are placed on a semigroup; thus a
    semigroup need not have an identity element and its elements
    need not have inverses within the semigroup.

(The fact that you have identified a semigroup as <X,*,e> suggests 
that the existence of an identity is being assumed, making this 
discussion refer to a monoid rather than a mere semigroup; but that 
doesn't affect my discussion.)

So to define a semigroup, we first have to choose the set; since 
you've mentioned complex numbers, I'll assume you intend that to be 
the set we're using. You've specified the operation, exponentiation. 
Now we have to show that it is associative:

    a^(b^c) = (a^b)^c  for all a, b, c

But that is NOT true for exponentiation! Here is a simple 
counterexample:

    2^(3^4) = 2^81

    (2^3)^4 = 2^(3*4) = 2^12

Clearly these are not equal, so we don't have a semigroup. (That's 
probably why your book didn't mention it!)
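The counterexample is easy to check numerically; here is a quick
sketch in Python (my own illustration, not part of the original
exchange):

```python
# Exponentiation is not associative: a^(b^c) differs from (a^b)^c.
left = 2 ** (3 ** 4)     # 2^(3^4) = 2^81
right = (2 ** 3) ** 4    # (2^3)^4 = 2^(3*4) = 2^12
print(left == right)     # False
print(right == 2 ** 12)  # True: the power-of-a-power rule
```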

Now let's pretend this doesn't matter, and take the next step (so you 
can see what to do in other problems). What would the identity be?

The identity would be an element e for which

    a^e = e^a = a  for all a

Let's see what this would mean. First, we want

    a^e = a  for all a

Clearly e=1 works here.

We also want

    e^a = a  for all a

Play with that a while, and you will see that there is no such e. So 
even if we had a semigroup, it would not have an identity, so the 
concept of an inverse would be meaningless.
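If you would like a concrete nudge for that exercise, here is a small
sketch of my own showing why no single e can work:

```python
# No single real e satisfies e**a == a for all a: the case a = 1
# forces e == 1 (because e**1 == e), but that candidate fails for a = 2.
e = 1                # the only value the a = 1 case permits
print(e ** 1 == 1)   # True: the a = 1 case holds
print(e ** 2 == 2)   # False: 1**2 is 1, not 2
```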

As you see, when we choose an arbitrary set and operation, there is 
no guarantee that it will satisfy the definition of even as broad a 
concept as a semigroup. You can never assume anything here.

Incidentally, your subject line suggests that you are confusing the 
concept of inverse in a semigroup with that of an inverse operation, 
such as the logarithm. You are evidently looking for an explanation of 
inverse functions (and the logarithm in particular) in terms of 
inverse elements of a semigroup. The concepts are different, and 
exponentiation does not even give a semigroup at all, as I said, but 
there are some interesting relationships.

Let's first look at the case of addition.

We have seen that there is an (additive) identity, called 0, defined 
by the fact that

    a + 0 = 0 + a = a  for all a

We have seen that the (additive) inverse of an element a, which we 
can write as "-a", is defined by:

    a + -a = -a + a = 0

Now we can define the inverse operation to addition, which we will 
call subtraction, by

    a - b = a + -b

It is no coincidence that we use the same symbol, "-", for both 
inverses; but it's important to distinguish them and realize that we 
could have used different symbols, and that the properties of 
subtraction have to be proved from the combined properties of addition 
and the additive inverse. We are simply defining a new operation here.

What properties does an inverse operation have? Let's combine + and - 
in various ways:

    (a + b) - b = a
    (a - b) + b = a
    a + (b - a) = b
    a - (a + b) = -b

These facts all depend on associativity for their proofs; try proving 
them using the definition above. (You'll find that you also have to 
prove that -(a+b) = -a + -b, and that --a = a, both of which are 
easy, though the former depends on commutativity.)
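The four identities above can be spot-checked mechanically; this
sketch (my own, using integers) defines subtraction exactly as above,
a - b = a + -b:

```python
# Spot-check the four subtraction identities, with a - b := a + (-b).
def sub(a, b):
    return a + (-b)

a, b = 7, 3
print(sub(a + b, b) == a)    # (a + b) - b = a
print(sub(a, b) + b == a)    # (a - b) + b = a
print(a + sub(b, a) == b)    # a + (b - a) = b
print(sub(a, a + b) == -b)   # a - (a + b) = -b
```

Of course a few numeric cases prove nothing; the point is that the
definitions are directly executable.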

Now you can do the same with multiplication; the only difference, 
besides the symbols, is that the number 0 has no inverse, so that 
division is not defined when the divisor is zero:

Multiplicative identity, 1:

    a * 1 = 1 * a = a  for all a

Multiplicative inverse, a' (I'm avoiding exponential notation to keep 
things clear):

    a * a' = a' * a = 1

Inverse operation, division ("/"):

    a / b = a * b'

Properties of the inverse operation:

    (a * b) / b = a
    (a / b) * b = a
    a * (b / a) = b
    a / (a * b) = b'
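These can be checked the same way; the sketch below (my own) uses
exact rationals so that the reciprocal b' = 1/b introduces no
rounding:

```python
from fractions import Fraction

# Spot-check the four division identities, with a / b := a * b'
# where b' = 1/b is the reciprocal (undefined for b == 0).
def div(a, b):
    return a * (Fraction(1) / b)

a, b = Fraction(7), Fraction(3)
print(div(a * b, b) == a)              # (a * b) / b = a
print(div(a, b) * b == a)              # (a / b) * b = a
print(a * div(b, a) == b)              # a * (b / a) = b
print(div(a, a * b) == Fraction(1, b)) # a / (a * b) = b'
```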

Now we want to find an inverse operation to exponentiation. The 
trouble, we saw, is that exponentiation has none of the nice 
properties of addition and multiplication that define semigroups (and 
better things). It is not commutative, and not even associative; 
there isn't an identity, much less an inverse. But we can still do 
some things, though they have to be one-sided. For example, I showed 
you that 1 can be considered the "right-identity":

    a ^ 1 = a  for all a

though 1^a = 1, so it is not also a left-identity, and therefore not 
a true identity. But there is no inverse, so we can't just blindly 
define the inverse operation as exponentiation with the inverse, as 
we defined subtraction and division. A "left-inverse" would require 

    a' ^ a = 1

and a "right-inverse" would require

    a ^ a' = 1

neither of which exists. (Actually, the first a' would have to be 1 
for all a, and the second a' would have to be 0, neither of which is 
very useful as an inverse.)

Now look at the properties we showed for the inverse operation, and 
decide what the inverse operation has to be in order to behave the 
same way. As with the identity, we should not be surprised if we need 
different operations in each case. I'll call the inverse 
operation "v" since that looks like an inverted "^":

    (a ^ b) v b = a
    (a v b) ^ b = a
    a ^ (b v a) = b
    a v (a ^ b) = vb? (undefined - there's no inverse element)

I'll take those one at a time:

    (a ^ b) v b = a

What can we do to a^b to get a back? Well, we can raise it to the 1/b 
power:

    (a^b)^(1/b) = a^(b * 1/b) = a^1 = a

Therefore, our "left-associative inverse operation" is

    c v b = c^(1/b)

Hmmm ... this is in fact exponentiating with the (multiplicative!) 
inverse, just what we would have liked to do if we had had an inverse 
of our own.

Next case:

    (a v b) ^ b = a

What number raised to the b power gives a?

    (a^(1/b))^b = a^(1/b * b) = a^1 = a

so again,

    a v b = a^(1/b)

That's nice: we've defined one inverse that works for both these 
cases with "left-associativity." That inverse is in fact the bth root 
of a, and my notation almost works, except that the b goes before the 
"radical sign" rather than after (pure coincidence).
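For positive reals this "left-associative inverse" behaves as claimed;
here is a numeric sketch of my own (floating-point, so the checks use
a tolerance):

```python
import math

# c v b := c ** (1/b), i.e. the b-th root, undoes exponentiation
# from the left (for positive real a and nonzero b).
def v(c, b):
    return c ** (1 / b)

a, b = 5.0, 3.0
print(math.isclose(v(a ** b, b), a))   # (a ^ b) v b = a
print(math.isclose(v(a, b) ** b, a))   # (a v b) ^ b = a
```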

Will this hold up?

    a ^ (b v a) = b

To what exponent can we raise a to get b? Here we need something 
different, and we finally come to what you were looking for:

    a ^ log_a(b) = b

so we define

    b v a = log_a(b)

How about the last case?

    a v (a ^ b) = "vb"

We don't have an inverse, but let's see what happens if we use the 
log again:

    a v (a ^ b) = log_(a^b) (a) = log(a) / log(a^b) 
       = log(a) / (b log(a)) = 1/b

Wow! Again we find that if we use the multiplicative inverse, our 
expectations are met. Even though 1/b is in no sense the inverse 
element under exponentiation, it happens to fit in here. So we have a 
nice "right-associative inverse" to exponentiation:

    a v b = log_b (a)
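Both right-hand cases can be spot-checked too; this sketch (my own)
uses Python's two-argument logarithm, math.log(a, b) = log_b(a):

```python
import math

# a v b := log_b(a), the "right-associative inverse": it recovers the
# exponent, and applied as a v (a ^ b) it yields 1/b as derived above.
def v(a, b):
    return math.log(a, b)

a, b = 5.0, 3.0
print(math.isclose(a ** v(b, a), b))      # a ^ (b v a) = b
print(math.isclose(v(a, a ** b), 1 / b))  # a v (a ^ b) = 1/b
```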

So you see that exponentiation actually has two different inverse 
operations, depending on which side we look at it from: the root, or 
the logarithm. That shouldn't surprise you.

By the way, I've never studied "non-associative algebras," so the 
ideas here are mostly my own, once I got past the standard concepts 
of "left inverse" and so on. I've never heard of a "left-associative 
inverse," but it seems to make some sense here.

- Doctor Peterson, The Math Forum 

Date: 05/07/2002 at 17:45:28
From: Clayton
Subject: Inverse operations: addition/subtraction, 
         multiplication/division, exponentiation/logarithm

Dr. Peterson,

Thank you very much for your prompt reply. You hit the nail right on 
the head with my issue. It's not so much that I didn't 
understand the properties of logs etc. but rather that I was curious 
as to "why" the inverse operations of exponent and logarithm didn't 
behave in the same way as addition and multiplication.

In other words, set theory, in many cases, adopts a more general or 
abstract definition of something (say a semigroup) to attempt to 
define many specific concepts from algebra or other "higher" order 
maths. I was curious why exponentiation and logarithm didn't seem to 
fit the general definition of inverses for semigroups, which you 
cleared up for me.

I didn't know about some of the interplay between multiplicative 
inverses in the exponents and the operations of logarithm and 
exponentiation that you pointed out. However, I have long wondered 
whether a notation such as "a v b" might be less unwieldy than the 
clunky "log_b (a)" and hence lead to a more intuitive understanding 
of the logarithm and its properties. Most of the time, mathematical 
notation lends itself to the operation at hand, but in the case of 
logarithm I think the notation falls short.

A related question I've had is this (I don't want to belabor the 
point, but you seem interested in my questions): Given the natural numbers 
and the operation of addition, we can construct the negative numbers 
using the concept of an identity element, 0. We can then construct 
the operation of multiplication in terms of iterated addition. We can 
then construct fractions using another identity element, 1. We then 
proceed to construct exponentiation as iterated multiplication. We 
construct logarithm as the inverse of exponentiation as you did in 
your letter to me. 

What then? Can we construct another operation using iterated 
exponentiation? I have never seen such an operation defined, though 
that doesn't mean it doesn't exist. Maybe it would not have 
any use, but then again, the fellow who discovered matrices thought 
they would have no application at all!

I don't want to get too deep, or ask too many questions, but is there 
a relation, that you are aware of, between this hierarchy of iterated 
operations (if it exists) and fractals? Fractals are, in general, 
self-similar in one way or another and are constructed using 
iterations. Likewise, a hierarchy of iterated operations would be 
self-similar in a sense as well. Also, there seems to be a close 
relation between exponentials and fractals. For example, the length of 
a Koch snowflake after n iterations is,

L = 3(4^n)(3^-n)a

where a is the length of one side of the triangle upon which the Koch 
snowflake is constructed. This is an exponential function. Maybe 
iterated exponentiation would have an application here. :)
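The perimeter formula above can be verified against the construction
itself; this sketch (my own) iterates the rule that each step
multiplies the side count by 4 and divides each side length by 3:

```python
# Koch snowflake perimeter after n iterations: L = 3 * (4^n)(3^-n) * a,
# checked against direct iteration of the construction.
def koch_length(n, a=1.0):
    return 3 * (4 ** n) * (3 ** -n) * a

sides, side_len = 3, 1.0   # start from a triangle with unit sides
for n in range(5):
    assert abs(koch_length(n) - sides * side_len) < 1e-12
    sides *= 4             # each side is replaced by 4 segments
    side_len /= 3          # each segment is 1/3 as long
print("formula matches the iterated construction")
```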

Again, thank you for taking the time to answer my questions.


Date: 05/07/2002 at 22:14:43
From: Doctor Peterson
Subject: Re: Inverse operations: addition/subtraction, 
             multiplication/division, exponentiation/logarithm

Hi, Clayton.

Your mention of the awkwardness of logarithm notation brings up a 
point I'd thought of mentioning. Both exponents and logs have an odd, 
asymmetrical notation with one number "off the line," and that 
actually matches up with their asymmetrical behavior. In working with 
my temporary "a v b" notation, I found it actually awkward to use, 
because it's hard to remember which number is the base. Of course, I 
have gotten used to using a^b for exponents (and being very careful 
about parentheses as a result), so I would probably get used to a new 
logarithm notation. My first thought, partly based on the observation 
that my "a v b" has the base come second, would be to write the base 
as a subscript, showing its inverse relationship to exponents:

    a_b = log_b(a)   inverse of   b^a = exp_b(a)

Of course that would be far too easily confused with other uses of 
subscripts. It's fun to think about how we could do this, though.

As for iterated exponentiation, that kind of operation has indeed 
been studied. I haven't found it very interesting, but it has some 
use. Here is a page about it:

   big numbers - Susan Stepney 
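The iterated operation is usually called "tetration"; a minimal
sketch of my own, evaluating right-to-left (the usual convention,
since ^ is not associative):

```python
# Tetration: a ^ a ^ ... ^ a (n copies), right-associated.
def tetrate(a, n):
    result = 1
    for _ in range(n):
        result = a ** result
    return result

print(tetrate(2, 3))  # 2 ** (2 ** 2) = 16
print(tetrate(2, 4))  # 2 ** 16 = 65536
```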

- Doctor Peterson, The Math Forum 
Associated Topics:
College Exponents
College Logic


Ask Dr. Math(TM)
© 1994- The Math Forum at NCTM. All rights reserved.