Quote:
Originally Posted by daveT
This Linear Algebra stuff is pretty cool, except for the amount of paper I need to work through any problem. There hasn't been anything earth-shattering to the level I've seen in Calculus, but I like the new way of thinking this course introduces.
Still early in the class and still learning about matrix inversions, triangular matrices, and transposing. If I'm understanding this correctly, I want to solve for A=LU (or PA=LU) to prove that one single point of the intersecting planes (or vectors) is a solution to the matrix. My question is, what is the practical consideration of solving for an invertible matrix? Why does it matter if a neat solution exists or not?
For some reason, I'm reminded of lowest common denominators from counting theory, though I don't quite have a complete connection to this yet.
Kind of going slow at first since I find matrix multiplication a bit difficult. I just break the matrices apart into vectors and do vector multiplications for right now, in the theory that the matrix multiplication will become more intuitive. Is this an okay way to get acclimated to the idea?
Yeah, I think Strang's course tends to start with a lot of matrix factorizations and manipulations, and I find all of that kind of tedious.
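To make the A=LU factorization from the question concrete, here's a minimal sketch in numpy (the function name and the example matrix are my own choices, not anything from the course). It's plain Doolittle elimination with no row exchanges, so it assumes every pivot it meets is nonzero; when a pivot is zero you need the PA=LU version with a permutation.

```python
import numpy as np

def lu_no_pivot(A):
    """Doolittle LU factorization without pivoting (assumes nonzero pivots).

    Records each elimination multiplier in L while reducing A to the
    upper-triangular U, so that L @ U reproduces the original A.
    """
    n = A.shape[0]
    L = np.eye(n)
    U = A.astype(float).copy()
    for k in range(n):
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]   # multiplier used to clear U[i, k]
            U[i, :] -= L[i, k] * U[k, :]  # subtract a multiple of the pivot row
    return L, U

A = np.array([[ 2.0,  1.0, 1.0],
              [ 4.0, -6.0, 0.0],
              [-2.0,  7.0, 2.0]])
L, U = lu_no_pivot(A)
print(np.allclose(L @ U, A))  # True
```

Once you have L and U, solving Ax=b is two cheap triangular solves (forward-substitute with L, back-substitute with U), which is the practical payoff of doing the factorization once.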
Matrix multiplication is just a whole bunch of vector multiplications (dot products), so you're doing fine. And yes, these early factorizations are slow and take a lot of work by hand (until you learn to make a computer do them, but for now you should understand the mechanics of doing it "the long way").
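That "break the matrices apart into vectors" approach is literally the definition: entry (i, j) of the product is the dot product of row i of A with column j of B. A quick sketch in numpy (the helper name is just for illustration):

```python
import numpy as np

def matmul_by_dots(A, B):
    """Build A @ B one entry at a time: C[i, j] = (row i of A) . (column j of B)."""
    m, n = A.shape[0], B.shape[1]
    C = np.empty((m, n))
    for i in range(m):
        for j in range(n):
            C[i, j] = np.dot(A[i, :], B[:, j])
    return C

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])
print(np.allclose(matmul_by_dots(A, B), A @ B))  # True
```

Later you'll see the same product sliced other ways too (columns of B combined, or a sum of outer products), and all of them give the same answer.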
And for the most part, whenever you are asking for the solution to a problem, it's nice to know whether the solution is unique or if there are many. For example, if there are many, you may want to further select a solution that is optimal in some way (e.g., costs the least).
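A tiny numpy example of that "many solutions, pick an optimal one" idea (my own toy system, not from the course): one equation in two unknowns has infinitely many solutions, and `lstsq` picks the one with the smallest Euclidean norm.

```python
import numpy as np

# One equation, two unknowns: x + y = 2 has infinitely many solutions.
A = np.array([[1.0, 1.0]])
b = np.array([2.0])

# Among all solutions, lstsq returns the minimum-norm one --
# one concrete way of "selecting a solution that is optimal in some way".
x, *_ = np.linalg.lstsq(A, b, rcond=None)
print(x)  # [1. 1.]
```

Here "costs the least" happens to mean smallest length; in a real application you'd pick whatever cost function matters and optimize that instead.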