
Topic: Proof that mixed partials commute.
Replies: 20   Last Post: Nov 22, 2013 11:57 PM

ross.finlayson@gmail.com

Re: Proof that mixed partials commute.
Posted: Nov 22, 2013 11:57 PM

On Tuesday, November 19, 2013 3:59:38 PM UTC-8, Hetware wrote:
> On 11/19/2013 11:17 AM, dullrich@sprynet.com wrote:
> > On Mon, 18 Nov 2013 18:45:53 -0500, Hetware <hattons@speakyeasy.net>
> > wrote:
> >
> >> In _Calculus and Analytic Geometry_ 2nd ed. (1953), Thomas provides:
> >>
> >> Theorem. If the function w = f(x,y) together with the partial
> >> derivatives f_x, f_y, f_xy and f_yx are continuous, then f_xy = f_yx.
> >>
> >> Both Thomas and Anton (1980) provide rather long-winded proofs of this
> >> theorem. These proofs involve geometric arguments, auxiliary
> >> functions, the mean-value theorem, epsilon error variables, a
> >> proliferation of symbols, and a generous helping of obscurity.
> >>
> >> Starting from the definition of partial differentiation, and using the
> >> rules of limits, along with a modest amount of basic algebra, I came up
> >> with this:
> >>
> >> f_x(x,y) = Limit[[f(x+Dx,y)-f(x,y)]/Dx, Dx->0]
> >>
> >> f_yx(x,y) = Limit[[f_x(x,y+Dy)-f_x(x,y)]/Dy, Dy->0]
> >>   = Limit[
> >>       [[f(x+Dx,y+Dy)-f(x,y+Dy)]-[f(x+Dx,y)-f(x,y)]]/(Dy Dx)
> >>     , {Dy->0, Dx->0}]
> >>
> >> f_xy(x,y) = Limit[
> >>       [[f(x+Dx,y+Dy)-f(x+Dx,y)]-[f(x,y+Dy)-f(x,y)]]/(Dx Dy)
> >>     , {Dx->0, Dy->0}]
> >
> > That's very bad notation. It's not one limit, it's the limit of
> > a limit. It should be
> >
> >   Limit(Limit(...)[x->c])[y->c].
> >
> > And now the big question is why
> >
> >   Limit(Limit(...)[x->c])[y->c] = Limit(Limit(...)[y->c])[x->c]
>
> I guess I should have included the intermediate steps. I had intended
> that the order of taking limits should be ambiguous.
>
> >> The only caveat is that the rules for limits, such as /the product of
> >> limits is equal to the limit of the products/, are stated in terms of a
> >> single variable. For example:
> >>
> >> Limit[F(t) G(t), t->c] = Limit[F(t), t->c] Limit[G(t), t->c]
> >>
> >> whereas I am assuming
> >>
> >> Limit[F(x) G(y), {x->c, y->c}] = Limit[F(x), x->c] Limit[G(y), y->c].
> >>
> >> I argue as follows. The statement that x->c as y->c can be formalized by
> >> treating x and y as functions of t such that
> >>
> >> Limit[x(t), t->c] = Limit[y(t), t->c] = c
> >>
> >> |F(x)-F(c)| < epsilon_F ==> |x-c| < delta_F exists
> >>
> >> |G(y)-G(c)| < epsilon_G ==> |y-c| < delta_G exists
> >>
> >> |x(t)-c| < epsilon_x ==> |t-c| < delta_x exists
> >>
> >> |y(t)-c| < epsilon_y ==> |t-c| < delta_y exists
> >>
> >> Now epsilon_F ==> delta_F can be used as delta_F = epsilon_x, which
> >> implies delta_x > |t-c| exists. So
> >>
> >> Limit[F(x(t)), t->c] = Limit[F(x), x->c], etc.
> >>
> >> It follows that
> >>
> >> Limit[F(x(t)) G(y(t)), t->c]
> >>   = Limit[F(x(t)), t->c] Limit[G(y(t)), t->c]
> >>   = Limit[F(x), x->c] Limit[G(y), y->c]
> >>
> >> Am I making sense here?
> >
> > Not as far as I can see.
>
> Suppose that x(t) = at + c and y(t) = bt + c, where a and b are
> arbitrarily chosen real-number constants and at least one is not zero.
> Is it true that
>
> Limit[F(x(t)) G(y(t)), t->c]
>   = Limit[F(x(t)), t->c] Limit[G(y(t)), t->c]
>   = Limit[F(x), x->c] Limit[G(y), y->c]
>
> for all possible a and b, assuming F and G are continuous in a
> neighborhood of {x(c), y(c)}? Is there any case of Limit[F(x), x->c]
> Limit[G(y), y->c] not covered by the set of all pairs {{a, b}}?

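Written out, both mixed partials are iterated limits of the same second
difference; the substance of the quoted theorem is that the continuity
hypotheses make the two orders agree. This is only a standard restatement
of what is already quoted above, with Delta a convenience symbol
introduced here:

\Delta(h,k) = f(x+h,\,y+k) - f(x,\,y+k) - f(x+h,\,y) + f(x,\,y)

f_{yx}(x,y) = \lim_{k \to 0}\,\lim_{h \to 0}\,\frac{\Delta(h,k)}{h\,k},
\qquad
f_{xy}(x,y) = \lim_{h \to 0}\,\lim_{k \to 0}\,\frac{\Delta(h,k)}{h\,k}

Whether these two iterated limits are equal is exactly the "big question"
raised in the quoted reply.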



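As a numerical sanity check of the quoted difference-quotient
definitions, the following sketch compares the symmetric second
difference against the analytic mixed partial. The test function, base
point, and step sizes are illustrative choices, not anything taken from
the thread.

# Numerical sketch: the second difference quotient
#   [f(x+h,y+k) - f(x,y+k) - f(x+h,y) + f(x,y)] / (h*k)
# appears (after regrouping) in both the f_yx and f_xy formulas above.
import math

def f(x, y):
    # arbitrary smooth test function (illustrative only)
    return math.exp(x) * math.sin(y) + x * y**2

def second_difference(x, y, h, k):
    return (f(x + h, y + k) - f(x, y + k)
            - f(x + h, y) + f(x, y)) / (h * k)

x0, y0 = 0.5, 1.0
exact = math.exp(x0) * math.cos(y0) + 2 * y0   # f_xy = f_yx for this f

for h in (1e-2, 1e-3, 1e-4):
    approx = second_difference(x0, y0, h, h)
    print(h, approx, abs(approx - exact))      # error shrinks with h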



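On the linear-path question quoted above, here is a similar sketch,
written with x(t) = a(t - c) + c and y(t) = b(t - c) + c so that
x(t) -> c and y(t) -> c as t -> c, which appears to be the intent.
F, G, c, and the slope pairs (a, b) are illustrative choices only;
with F and G continuous near c, the product along the path approaches
F(c) G(c) for every choice of a and b.

# Sketch of: Limit[F(x(t)) G(y(t)), t->c]
#              = Limit[F(x), x->c] * Limit[G(y), y->c]
# along linear paths, assuming F and G are continuous near c.
import math

def F(x):
    return math.cos(x)          # continuous near c

def G(y):
    return y * y + 1.0          # continuous near c

c = 0.3
target = F(c) * G(c)            # product of the two separate limits

for a, b in [(1.0, 2.0), (-3.0, 0.5), (0.0, 1.0)]:
    for t in (c + 1e-2, c + 1e-4, c + 1e-6):
        x = a * (t - c) + c
        y = b * (t - c) + c
        print(a, b, t, F(x) * G(y) - target)   # difference shrinks as t -> c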
Where are you bringing out the squares? All I see is that establishing
the relative values of the variables, in limits, is a process that
maintains that relative scale, in limits, or in bounds.

(And bounds don't.)

I am expecting that when you mutually analyze the co-differential for x
and y back in the fundamental theorem, it is squares here that later
stand in as the approximations. Then, when you go to exchange the limits
that they share, sure, that the order is immaterial is a conserved
notion, except where, as here for example, it is ranged over the bounds
and a simple enough order of integration is in effect. (Here of perfect
distance.)

Here a line is a circle through some infinitely distant point, where, to
be infinitely distant, that point is, with respect to the line,
infinitely distant from each of its points. Then, the other point might
be closest to another point on the line, as the next definition of
geometry, for a ray from that point to infinity (e.g. as through its
vector, in real numbers). That is simply not an unusual expectation:
that the order of integration, as effective, is on that line, for the
perspective and projection of some actual _ordering_ of the integration
of components that are mutual; an actual ordering itself of the
integration of components here establishes a square. The integration of
components could be of any other and, for a particular action of the
components, is direct to that, with the usual notions of conservation
and symmetry.

So, when you want the variables to share the limit, or the variable in
the one sense to be limited by another, then in the standard framework y
and x are interchangeable in the ordering, as long as their own identity
is maintained. They are not just interchangeable in the ordering: owing
to the natural symmetries of the Lebesgue and Riemann integrals, in the
Cartesian setting the components of integration are as to squares. To
actually estimate that effect, instead of their own vector model of
actions, the effective integration of the order of the components of
integration would supply the effective terms in the relative estimation,
even while maintaining x and y as vector bases of integrable components.
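For what it is worth, the "squares" reading of interchanging the order
can be made concrete with the usual rectangle identity; this is a
textbook statement under the continuity hypotheses of the quoted
theorem, not something established in this thread. Two applications of
the fundamental theorem of calculus give

f(x,y) - f(x,y_0) - f(x_0,y) + f(x_0,y_0)
  = \int_{x_0}^{x} \int_{y_0}^{y} f_{xy}(s,t)\, dt\, ds
  = \int_{y_0}^{y} \int_{x_0}^{x} f_{yx}(s,t)\, ds\, dt ,

so both mixed partials integrate, over every rectangle, to the same
second difference of f, which is one way to see why continuity forces
f_xy = f_yx.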

Here the declaration and notation, while direct in relative x and y in
their limits as a natural monotonic process, work up how the integration
is an integration of the integration components; entire families of
usually regular analytical frameworks of those components would be more
concisely expressed in different analytical components, and, made
concise in their relation, those would add up. So, where a system has
already been constructed to be integrating, integrating over that (which
is refinement) would naturally fall to lines. It's like this: Achilles
and the tortoise are racing from A to B. As usual, they race directly
from A to B. It is not the concern that Achilles could wait the entire
race and win at the last moment. (This could be however far it is.)
This is usually framed as a line, or a racetrack which is a
quadrilateral with semicircular track segments on the sides. Starting
behind, Achilles could be always behind in time, but starting the same
he always beats the tortoise in every race; it is established here, from
their rates of motion and the simple enough rules of motion as their
numeric equations, classically (phys.), that he could start far enough
behind, in any race, and win. That there is a race the tortoise could
win is defined here from the constant-rate progress of Achilles, A.
Simply enough, as A(t) and T(t) are functions of their distance, _only
between and through A and B_, the track is an effective definition of a
circuit.

Then, the time in which the tortoise could win, compared to what
Achilles could save by waiting to start the race, differ in whatever is
A(t), such that A_max(t) = T(t). That is, whatever the race, Achilles
could wait out the entire beginning of the race, T(t) - A(t), or
(T-A)(t). (Rather, B - A: the minimum time in which Achilles can win
the race is the maximum time the tortoise has to win the race.) If the
tortoise gets close enough to the finish line then it could win; and if
it can see Achilles start, it can wait as long as it takes to get going
again; that is the time in which the tortoise can win.


Regards, Ross Finlayson


