The Math Forum




Topic: Failing Linear Algebra:
Replies: 54   Last Post: Jan 10, 2007 12:47 PM

Russell Blackadar

Posts: 586
Registered: 12/12/04
Re: Failing Linear Algebra:
Posted: Apr 29, 2004 10:40 PM

On 29 Apr 2004 20:35:59 GMT, Anonymous wrote:

>Russell:
>

>>Also the salient point of these variables being "free" is that they
>>can be set to *anything* (not just zero) and the system holds.


Another thing I didn't mention is that it's arbitrary which variables
you choose to be free, and which are "basic", as you call them. That
is, you could rewrite the same equations with (say) the terms for x6
in the first column and those for x1 in the 6th column. Then x6 would
be basic and x1 would be free; your results will look different
numerically, but they will turn out to work equally well in the
equations. Probably you shouldn't worry about this complication just
yet; just don't be surprised if you run across it someday.
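
If you ever want to see that concretely, here is a rough sketch in
Python using the SymPy library (my choice of tool -- any CAS would
do); A is just the coefficient matrix of your example further down,
and the particular column reordering is an arbitrary choice:

from sympy import Matrix

# Coefficient matrix of the example system further down (columns = x1..x7)
A = Matrix([[1, 0, 0, 0, 6, 0, 1],
            [0, 1, 0, 0, 0, 4, 0],
            [0, 0, 1, 0, 1, 8, 0],
            [0, 0, 0, 1, 3, 0, 1]])

# With the columns in the order x1..x7, the pivot (basic) columns are:
print(A.rref()[1])   # (0, 1, 2, 3), i.e. x1..x4 basic, x5..x7 free

# Put the x6 column first and the x1 column sixth, then reduce again:
B = A.extract([0, 1, 2, 3], [5, 1, 2, 3, 4, 0, 6])
print(B.rref()[1])   # (0, 1, 3, 4): the x6 column is now a pivot column
                     # and the x1 column is not, so x6 is basic, x1 free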

>
>Right. But, once you get a system into echelon form, one free variable is set
>to one and the others to 0; then the system of basic variables is solved in
>terms of one free being 1 and the rest being 0. Then, the same thing is
>repeated so that each free variable equals 1 once, while holding the others at
>zero. This gives the basis of the system, right?


Basis of the *system*? That is very confused terminology. Drum into
your head the idea that a basis is something that applies to a *vector
space*. And nothing else. (In linear algebra, that is.)

What vector space are you talking about here? Before you can say
something is a basis, you need to say what vector space it is supposed
to be the basis of.

Btw I should really have been more severe with you earlier in my
previous post; we had been talking in the context of the matrix of a
linear transformation, and then suddenly you started throwing in
concepts like "system of equations" and "free variables" etc.; that is
an *application* of the theory of linear transformations but it is not
the only one possible, nor is it even the first one you should think
of, IMHO. Get used to thinking about matrices etc as abstract objects
in their own right, not just some weird technique for solving systems
of equations.

That said, we can go back to looking at this particular application.
The reason why linear transformations and the kernels thereof are
important here is that if some vector solves the matrix equation, then
you can add to it any other vector in the kernel of the transformation
(in other words any other vector that satisfies the homogeneous matrix
equation) and that will also be a solution. Therefore, the subset of
vectors in R^n that are solutions to the equation will in some sense
have the same dimension as the kernel (which *is* a subspace, i.e. is
a vector space in its own right). But the set of solutions is not
itself a subspace, so you will not be able to find a basis for it.
Hint: in your example below, which very important vector is not in the
set of solutions?
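
If you want to check that claim by machine, here is a small SymPy
sketch, with A and b transcribed from your example below; x0 is the
particular solution with every free variable set to 0, and v is one
vector from the kernel:

from sympy import Matrix, zeros

A = Matrix([[1, 0, 0, 0, 6, 0, 1],
            [0, 1, 0, 0, 0, 4, 0],
            [0, 0, 1, 0, 1, 8, 0],
            [0, 0, 0, 1, 3, 0, 1]])
b = Matrix([5, 8, 1, 10])

x0 = Matrix([5, 8, 1, 10, 0, 0, 0])    # particular solution: A*x0 = b
v  = Matrix([-6, 0, -1, -3, 1, 0, 0])  # kernel vector:        A*v  = 0

print(A * x0 == b)           # True
print(A * v == zeros(4, 1))  # True
print(A * (x0 + v) == b)     # True: adding any kernel vector to a solution
                             # gives another solution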

>
>For example:
>
>x1 + 6x5 +x7 = 5
>
>x2 + 4x6 = 8
>
>x3 + x5 + 8x6 = 1
>
>x4 + 3x5 + x7 = 10


You should be working on the homogeneous equation here, I think. Set
the rhs to zero for now. AIUI the goal of this is to get a basis for
the kernel. (OTOH I may be doing your grandmother's linear algebra,
so anybody who wants to correct me feel free to pitch in.)

>
>In this system, 1-4 are basic. 5-7 are free. So, first, the system is solved
>by setting x5 = 1, x6 = 0, and x7 = 0.
>
>span = {(-1, 8, 0, 7, 1, 0, 0)}


I don't know what you mean by this "span =" notation. There are
various ways to denote the idea of span but yours doesn't do the job.
Copy the one that your prof and/or textbook use, and be precise about
it. Anyhow, your vector doesn't work in a span; try multiplying it by
2 and see if it satisfies your system. You need to subtract the
constants out (make it homogeneous), so I think you should have
(-6, 0, -1, -3, 1, 0, 0).
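
In SymPy terms, with A and b again transcribed from your system, the
multiply-by-2 test looks like this:

from sympy import Matrix, zeros

A = Matrix([[1, 0, 0, 0, 6, 0, 1],
            [0, 1, 0, 0, 0, 4, 0],
            [0, 0, 1, 0, 1, 8, 0],
            [0, 0, 0, 1, 3, 0, 1]])
b = Matrix([5, 8, 1, 10])

u = Matrix([-1, 8, 0, 7, 1, 0, 0])    # your vector; A*u = b, so far so good
print(A * (2 * u) == b)               # False: the solution set is not closed
                                      # under scaling, hence no span, no basis

w = Matrix([-6, 0, -1, -3, 1, 0, 0])  # the homogeneous version
print(A * (2 * w) == zeros(4, 1))     # True: the kernel *is* closed under
                                      # scaling, as a subspace must be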

>
>Then, x5 and x7 are set at 0, while x6 = 1:
>
>span = {(5, 4, -7, 10, 0,1,0)}


For the correct homogeneous equation, it is
(0, -4, -8, 0, 0, 1, 0)

>
>Lastly, x7 is set to 1 and x5 & x6 = 0:
>
>span={(4, 8, 1, 9, 0,0,1)}


Similarly,
(-1, 0, 0, -1, 0, 0, 1)

>
>Right? So those three become a basis?
>
>basis = {(-1,8,0,7,1,0,0) , (5,4,-7,10,0,1,0) , (4,8,1,9,0,0,1)}
>
>Right? So, there's the basis?


My vectors are a basis -- of the kernel, not of the solution set
(which has no basis). Yours are not a basis of any space relevant to
the problem, AFAICS.

The solution set is the set of all vectors of form (5,8,1,10,0,0,0)+V
where V is a member of the span of my three vectors.
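
As a cross-check, SymPy's nullspace() should give back exactly these
three vectors (it uses the same set-one-free-variable-to-1
construction), and the general solution can then be verified
symbolically:

from sympy import Matrix, symbols

A = Matrix([[1, 0, 0, 0, 6, 0, 1],
            [0, 1, 0, 0, 0, 4, 0],
            [0, 0, 1, 0, 1, 8, 0],
            [0, 0, 0, 1, 3, 0, 1]])
b = Matrix([5, 8, 1, 10])

v1, v2, v3 = A.nullspace()   # one kernel basis vector per free variable
print(v1.T, v2.T, v3.T)      # (-6,0,-1,-3,1,0,0), (0,-4,-8,0,0,1,0),
                             # (-1,0,0,-1,0,0,1), printed as rows

# Every solution is the particular solution plus something in the kernel:
c1, c2, c3 = symbols('c1 c2 c3')
x0 = Matrix([5, 8, 1, 10, 0, 0, 0])
x = x0 + c1*v1 + c2*v2 + c3*v3
print(A * x == b)            # True for every choice of c1, c2, c3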

>dim = 3, which means the dim (kernel) is three.
> But the original system is in R^7, so the dim of the image must be 4. Am I
>getting this? PLEASE correct any errors. Thanks.


That last bit is right, but note that the image of the linear
transformation is not very important here. It happens to be the set
of all column vectors that you can write on the right hand side and
get a solvable equation with your matrix; but here we already know
what matrix equation we want to solve and aren't interested in any
other ones.
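
The dimension count itself is just the rank-nullity theorem, which is
also easy to check mechanically:

from sympy import Matrix

A = Matrix([[1, 0, 0, 0, 6, 0, 1],
            [0, 1, 0, 0, 0, 4, 0],
            [0, 0, 1, 0, 1, 8, 0],
            [0, 0, 0, 1, 3, 0, 1]])

print(A.rank())                       # 4 = dim of the image (column space)
print(len(A.nullspace()))             # 3 = dim of the kernel
print(A.rank() + len(A.nullspace()))  # 7 = number of columns (rank-nullity)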

>
>Any more insight as to why exactly the dimension of the kernel would be 3 would
>be helpful. Is it because any of three of the variables (the free ones) can be
>the zero vector and the system still holds?


Look at my statement at the top of this post.

With three unconstrained variables that you can set to anything, the
point is that you have a whole 3-dimensional space of vectors that
will "hold" in your homogeneous equation. That's a lot more than what
you say, with the free variables all set to zero (but yes that does
happen to be a solution too). The technique of setting one of them to
1 and the rest to 0, and solving, etc., is a way of getting a *basis*
for the kernel, but that's by no means the only basis that exists for
the kernel.
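
For instance, any invertible recombination of those three vectors is
another perfectly good basis of the same kernel; here is a quick SymPy
sanity check (the particular combinations w1, w2, w3 are arbitrary):

from sympy import Matrix

v1 = Matrix([-6, 0, -1, -3, 1, 0, 0])
v2 = Matrix([0, -4, -8, 0, 0, 1, 0])
v3 = Matrix([-1, 0, 0, -1, 0, 0, 1])

w1, w2, w3 = 2*v1, v1 + v2, v2 - v3    # three different kernel vectors

old = Matrix.hstack(v1, v2, v3)
new = Matrix.hstack(w1, w2, w3)

print(new.rank())                      # 3: the new vectors are independent
print(Matrix.hstack(old, new).rank())  # still 3: they lie in the old span, so
                                       # {w1, w2, w3} is another kernel basis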

>
>>That
>>means there are many, many column vectors satisfying such a
>>homogeneous matrix equation, a whole r-dimensional subspace of them,
>>where r is the number of free variables.

>
>Right. I've got that. The simplest way to look at it is in the following type
>of independence:
>
>x+ y = 10
>
>Y is the only free variable. But, x & y can both be 5. x can be 3 and y can
>be 7, or vice-versa. x can be -100 and y can be 110. x can be 9.999 and y can
>be 0.001, etc, etc, etc. So, is the above system independent?


You tell me. How many equations do you have in how many unknowns?
How many rows will there be in your matrix if you write this as a
matrix equation, and how many columns? If you wrote the homogeneous
version of this matrix equation, you should find that you have a
kernel of dimension 1, so there should be one vector in the basis.

>
>Also, would the basis trick work above? If y is set to 1, then x has to be 9.


No, it's -1 because we should be using the homogeneous equation.

>But if y is set to 0, then x has to be 10. So, is the basis: (9,1) or (10,0)?

You see the trouble your method got you into. Forget trying to get a
basis for your solution set -- it isn't a vector space!

>My guess is that it would be (10,0), which can be reduced to (1,0), right?

Your goal by two weeks from now should be not to have to guess. With
the basis {(-1,1)} for your kernel, you are able to express the
general solution of your equation as (10,0) + c*(-1, 1). And if for
example c=110, you get one of the examples you listed above.
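
The same check goes through in SymPy for this little system, if you
like:

from sympy import Matrix, symbols

A = Matrix([[1, 1]])     # the equation x + y = 10 as a matrix equation
b = Matrix([10])

ns = A.nullspace()
print(len(ns), ns[0].T)  # one basis vector, (-1, 1): the kernel is 1-dimensional

c = symbols('c')
x = Matrix([10, 0]) + c * Matrix([-1, 1])  # general solution (10,0) + c*(-1,1)
print(A * x == b)        # True for every c; c = 110 gives (-100, 110)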

>
>
>

>>
>>Its dimension, I think you mean. I'm not familiar with the term
>>"basic variables" but yes, the dimension of the image is n-r using the
>>terminology I used above.

>
>Yes. We call all of the variables that appear first in any of the equations,
>when the system is in echelon form, "basic variables". The other ones are
>called "free variables". So, in any system in echelon form, the number of
>basic variables equals the number of equations. Is that what you mean by
>"n-r", where r is the number of free variables (dimension of the kernel)? Or,
>is your "r" actually the basic variables and "n-r" the frees?
>

>>Your single most damning confusion, as far as I can tell, is that you
>>seem not able yet to distinguish between a list (or set or space) of
>>vectors, and the n-tuple that describes (or is) one vector.

>
>I don't think I can even figure out what you're saying, so it must be confusing
>me. Could you provide an example please?
>
>Are you trying to say that: {(0,3,4,5,1),(4,3,0,0,1), (1,1,4,9,1)} is a set of
>three vectors, and each one of the 3 vectors is a 5-tuple? I think I've got
>that down.


That's right. Of course, you might find the same set written as
{a,b,c} with a being defined as (0,3,4,5,1), etc. If you weren't
looking carefully, you might think that {a,b,c} is a 3-vector, instead
of what it really is, a set of three vectors.

It troubled me a lot when you talked about (x_1, x_2, ..., x_n) as if
you thought the x's were vectors and the thing you wrote some sort of
vector space. That's why I singled this out as an important confusion
of yours. You may be mostly over it by now, but I think you have to
be a bit vigilant still.

>
>>Take special care to notice when a list is surrounded by {} -- that is
>>a set, usually in this context a set of vectors -- and when it is
>>surrounded by () -- that is (usually, in this context) an n-tuple, and
>>it (usually) refers to a *single* vector.

>
>OK, yup. I *think* I'm getting that.
>

>>E.g. {(1,0,0), (0,1,0), (0,0,1)} is a set of three vectors; it's one
>>of the possible basis sets for the vector space R^3. 1, 0, 0 are the
>>three components of the vector (1,0,0) in R^3.

>
>Right. Got that, thanks. I'm good with bases when they're in that form.
>Otherwise, I'm having trouble, which is why the change of basis formula gets
>me.


Not sure what to say here; are you bothered by abstract notation like
{e_1, e_2, e_3} being a basis for some 3-dimensional vector space? Or
is the problem that you can't seem to grasp bases in R^n other than
the standard one?


