Talk:Linear algebra
From Wikipedia, the free encyclopedia
Misleading statement
The statement from the article:
"Since vectors, as n-tuples, are ordered lists of n components, it is possible to summarize and manipulate data efficiently in this framework. For example, in economics, one can create and use, say, 8-dimensional vectors or 8-tuples to represent the Gross National Product of 8 countries. One can decide to display the GNP of 8 countries for a particular year, where the countries' order is specified, for example, (United States, United Kingdom, France, Germany, Spain, India, Japan, Australia), by using a vector (v1, v2, v3, v4, v5, v6, v7, v8) where each country's GNP is in its respective position."
is misleading and incorrect. There is a big difference between a tuple and a vector. The tuple of GNP values of 8 countries does not behave like a vector. For example, how would it behave under a linear transformation? What are its basis vectors?
It would improve this article if this statement were removed.
206.169.236.122 20:02, 6 February 2007 (UTC)
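For reference, the componentwise operations that make real 8-tuples into the vector space R^8 can be sketched in a few lines of Python. The GNP figures below are made up for illustration, not real data:

```python
# Hypothetical GNP figures (in trillions, invented for illustration) for the
# 8 countries in the order given in the article.
gnp_2006 = [13, 3, 2, 3, 1, 1, 4, 1]
gnp_2007 = [14, 3, 2, 3, 1, 1, 5, 1]

# Vector addition and scalar multiplication are componentwise:
growth = [b - a for a, b in zip(gnp_2006, gnp_2007)]
doubled = [2 * x for x in gnp_2006]

# A linear map R^8 -> R: total GNP is the dot product with (1, ..., 1).
total = sum(gnp_2006)
print(growth, total)
```

Whether such data *should* be modelled as a vector (as the objection above asks, what do basis changes mean here?) is a separate question from whether the operations are defined.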
What else?
So what other stuff has the structure of a linear space but has elements that are not real or complex numbers?
You can have a space composed of, say, all continuous functions, or of polynomials. In the polynomial case, however, P_n is isomorphic to R^(n+1) (P_n being the space of all polynomials of degree at most n).
Veddan (talk) 10:25, 24 March 2008 (UTC)
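The isomorphism mentioned above can be made concrete: sending a polynomial of degree at most n to its coefficient tuple (a_0, ..., a_n) is a linear bijection onto R^(n+1). A minimal Python sketch (the helper names are my own):

```python
# Identify a polynomial a_0 + a_1 x + ... + a_n x^n with its coefficient
# tuple (a_0, a_1, ..., a_n) in R^(n+1).

def poly_to_vec(coeffs, n):
    """Pad a coefficient list up to length n + 1."""
    return coeffs + [0] * (n + 1 - len(coeffs))

def add_polys(p, q):
    """Addition of coefficient vectors = addition of polynomials."""
    return [a + b for a, b in zip(p, q)]

def eval_poly(coeffs, x):
    """Evaluate the polynomial the vector represents."""
    return sum(c * x**k for k, c in enumerate(coeffs))

p = poly_to_vec([1, 2], 2)      # 1 + 2x        -> (1, 2, 0)
q = poly_to_vec([0, 0, 3], 2)   # 3x^2          -> (0, 0, 3)
s = add_polys(p, q)             # 1 + 2x + 3x^2 -> (1, 2, 3)
```

The map respects addition and scalar multiplication, which is all "isomorphic" means here.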
Wrong statement
"However, it has few, if any, applications in the natural sciences and the social sciences, and is rarely used except in esoteric mathematical disciplines."
This is just plain wrong. Linear algebra is used in both the natural and social sciences. Physics and Chemistry are obvious. Biology uses matrices and all that malarkey when looking at coupled ODEs. Social sciences use them in some stats work and in ODEs/PDEs. Anywho, the above statement is misleading and should be removed.--137.205.132.41 10:20, 16 January 2007 (UTC)
Finite fields
In computational number theory you sometimes get people doing linear algebra on matrices made out of integers modulo a prime. Often the prime is 2, but larger ones are also used.
My guess is the elements have to come from a ring, or maybe a field: anyway, something with a group operation on the whole set, a second group operation on the set minus the identity of the first, and a distributive law connecting the two operations.
- Anything to do with finite fields? --Damian Yerrick
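Linear algebra over GF(p) works just like the real case, except that division becomes multiplication by a modular inverse. A hedged sketch (assuming p is prime and the coefficient matrix is invertible mod p; the function name is my own):

```python
# Gaussian elimination over GF(p), the integers mod a prime p.
# Inverses come from Fermat's little theorem: a^(p-2) = a^(-1) mod p.

def solve_mod_p(A, b, p):
    """Solve A x = b over GF(p); assumes A is square and invertible mod p."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]   # augmented matrix
    for col in range(n):
        # find a row with a nonzero pivot in this column and swap it up
        piv = next(r for r in range(col, n) if M[r][col] % p != 0)
        M[col], M[piv] = M[piv], M[col]
        inv = pow(M[col][col], p - 2, p)           # pivot inverse mod p
        M[col] = [v * inv % p for v in M[col]]
        for r in range(n):
            if r != col and M[r][col] % p:
                f = M[r][col]
                M[r] = [(vr - f * vc) % p for vr, vc in zip(M[r], M[col])]
    return [M[r][n] for r in range(n)]

# Over GF(2): x + y = 1, y = 1  =>  x = 0, y = 1
print(solve_mod_p([[1, 1], [0, 1]], [1, 1], 2))
```

The same code handles p = 2 and larger primes alike, which matches the observation above about computational number theory.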
Fields and rings
You can do linear algebra over any field. The analogous structures over rings are called modules. Modules share many of the properties of vector spaces, but certain important basic facts are no longer true (the term "dimension" doesn't make much sense anymore, as bases need not have the same cardinality). --Seb
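A standard illustration (my example, not from the article) of the gap between vector spaces and modules: over a ring, a module may have no basis at all.

```latex
% Over a field, every vector space has a basis. Over a ring this can fail:
% the abelian group \mathbb{Z}/2\mathbb{Z}, viewed as a \mathbb{Z}-module,
% has no basis, since its only nonzero element x = \bar{1} satisfies
\[
  2 \cdot x = \bar{2} = \bar{0},
\]
% so \{x\} is linearly dependent over \mathbb{Z}, and no nonempty
% independent subset exists.
```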
Striking (wrong) example
Quoted from the main page:
- A vector space, as a purely abstract concept about which we prove theorems, is part of abstract algebra, and well integrated into this field. Some striking examples of this are the group of invertible linear maps or matrices,
This is truly a striking example :-)
Toby Bartels and I are going to correct this and I think we're also going to write about linear algebra over a rig (algebra) (this is not a typo!). -- Miguel
Linear algebraists, please help
The derivation of the maximum-likelihood estimator of the covariance matrix of a multivariate normal distribution is perhaps surprisingly subtle and elegant, involving the spectral theorem of linear algebra and the fact that it is sometimes better to view a scalar as the trace of a 1×1 matrix than as a mere scalar. See estimation of covariance matrices. Please help contribute a "linear algebraists' POV" to that article. Michael Hardy 20:20, 10 Sep 2004 (UTC)
- A similar request (this time from a non-mathematician). I plan to introduce a proof from linear algebra into the arbitrage pricing theory article. Firstly, I would like to tighten up the wording such that it is acceptable; secondly, I would like to link the argument to the appropriate linear algebra theorem. Hope that's do-able - and thanks if it is. Basically, this is how the derivation there usually goes (where the generic-vectors below have a financial meaning): "If the fact that (1) a vector is orthogonal to n-1 vectors, implies that (2) it is also orthogonal to an nth vector, then (3) this nth vector can be formed as a linear combination of the other n-1 vectors." Fintor 13:38, 23 October 2006 (UTC)
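For reference, the linear-algebra fact behind that derivation can be stated for a finite-dimensional inner product space; a sketch (my wording, not a quotation from any article):

```latex
% Claim: if every x with \langle x, v_i \rangle = 0 for i = 1, \dots, n-1
% also satisfies \langle x, v_n \rangle = 0, then
% v_n \in \operatorname{span}(v_1, \dots, v_{n-1}).
%
% Sketch: write v_n = u + w with u \in \operatorname{span}(v_1,\dots,v_{n-1})
% and w orthogonal to that span. Then w is orthogonal to each v_i with i < n,
% so by hypothesis \langle w, v_n \rangle = 0. But
\[
  \langle w, v_n \rangle = \langle w, u \rangle + \langle w, w \rangle
                         = \lVert w \rVert^2 ,
\]
% hence w = 0 and v_n = u lies in the span of the other n-1 vectors.
```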
How did Hamilton name vectors?
Quote from the article: "In 1843, William Rowan Hamilton (from whom the term vector stems) discovered the quaternions."
Huh? I didn't find the answer on a quick perusal of the William Rowan Hamilton article either. I didn't see it in quaternions either. It sounds like an interesting story, but what (or where) is the story? Spalding 18:25, Oct 4, 2004 (UTC)
Useful theorems of linear algebra
The statement about definite and semi-definite matrices is not correct as stated: the matrices should be assumed to be symmetric. Moreover, this is slightly off-topic; it is part of bilinear algebra rather than linear algebra.
The statement "A non-zero matrix A with n rows and n columns is non-singular if there exists a matrix B that satisfies AB = BA = I, where I is the identity matrix" is much more a definition than a theorem.
In my opinion, the main non-trivial result of linear algebra says that the dimension of a vector space is well defined. Theorem: if a vector space has two bases, then they have the same cardinality.
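For what it's worth, the finite-dimensional case of that theorem follows from the Steinitz exchange lemma; a sketch (standard material, my phrasing):

```latex
% Steinitz exchange lemma: if \{u_1, \dots, u_m\} is linearly independent
% and \{w_1, \dots, w_n\} spans V, then m \le n (one may exchange m of the
% w_j for the u_i while still spanning V).
%
% If B and B' are two finite bases with |B| = m and |B'| = n, then B is
% independent and B' spans, so m \le n; symmetrically n \le m, hence
\[
  |B| = |B'| .
\]
% The infinite-dimensional case needs a cardinality argument (and, for the
% existence of bases at all, the axiom of choice).
```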
Equivalent statements for square matrices
This is not an elegant section: it feels slightly like a dumping ground for a bunch of facts. Did anyone else have the same feeling? (Yiliu60)
- Yes. Jitse Niesen (talk) 6 July 2005 09:21 (UTC)
Vote for new external link
Here is my site with linear algebra example problems. Someone please put this link in the external links section if you think it's helpful and relevant. Tbsmith
http://www.exampleproblems.com/wiki/index.php/Linear_Algebra
GA promotion
Hi everyone,
I promoted this article, but I do feel that it is a borderline good article, as it is extremely brief for such a large branch of mathematics.
Cedars 07:50, 23 April 2006 (UTC)
- I've delisted it as a good article, because:
- It fails "broad in its coverage" - for such an important branch of mathematics it is too brief
- No examples
- --Salix alba (talk) 10:57, 16 June 2006 (UTC)
Finite dimensions
I assume the part about "systems of linear equations in finite dimensions" was intended to distinguish the subject of linear algebra from, say, functional analysis. However, the distinction lies not in the number of dimensions, but in whether the linear structure is studied as a thing in itself (as opposed to being studied in the context of a topology). In other words, pure vector spaces are the province of linear algebra, while topological vector spaces are the province of functional analysis. Thus, even infinite-dimensional linear phenomena, if studied from a purely algebraic standpoint, are technically part of linear algebra.--Komponisto 21:36, 26 July 2006 (UTC)
Clarify 'over a field'
I think it would be helpful if someone clarified the meaning of "over a field" from the first sentence of the fourth paragraph in the 'Elementary Introduction' section. The sentence reads as follows: 'A vector space is defined over a field, such as the field of real numbers or the field of complex numbers.' -- —The preceding unsigned comment was added by DrEricH (talk • contribs) .
- That's a fair point. I reformulated it so that it does not use the phrase "over a field" anymore. -- Jitse Niesen (talk) 02:12, 16 August 2006 (UTC)
Plagiarism?
The passage:
For small systems, ad hoc methods are sufficient. Larger systems require one to have more systematic methods. The modern day approach can be seen 2,000 years ago in a Chinese text, the Nine Chapters on the Mathematical Art (traditional Chinese: 九章算術; simplified Chinese: 九章算术; pinyin: Jiǔzhāng Suànshù).
Is very similar to:
For small systems, ad hoc methods certainly suffice. Larger systems, however, require more systematic methods. The approach generally used today was beautifully explained 2,000 years ago in a Chinese text, the Nine Chapters on the Mathematical Art (Jiuzhang Suanshu, 九章算術).
Which is taken from Linear Algebra with Applications Third Edition by Otto Bretscher.
To me, it sounded a bit too similar to the original text.
Just something I noticed.
- Thanks for your message. I agree that the similarity is too much to be coincidence. The first two paragraphs in the "History" section were added in a single edit, so they are both suspect. I thus deleted them. -- Jitse Niesen (talk) 12:05, 23 January 2007 (UTC)
Choice about choice
A number of editors of linear algebra/vector space articles are uncomfortable about making statements which rely on the axiom of choice without mentioning it. To some extent, I share their unease (or to put it more light-heartedly, "You say every vector space has a basis? Great! Now give me a well-ordering of the real numbers - it might come in handy..."). On the other hand, these are articles about linear algebra, so it is a pity to constantly distract the reader with digressions into logic and set theory. So we have a choice (fortunately a finite choice!): do we mention choice or not? I know this has been discussed on a few talk pages before, but my recent edit of this article suggests a compromise: footnoting references to choice. I imagine the use of footnotes may polarize opinion, but it might be a sensible way forward in this case, so let me know what you think. And I'll make a few other similar edits to stimulate the discussion :) Geometry guy 21:42, 13 February 2007 (UTC)
(Hmmm... I'm quite proud of that split infinitive.) I've footnoted the choices in Dual space. One obvious question that arises is whether it would be better to have just one choice-related note with all the relevant caveats, or several. I'd be inclined to put them together to avoid repetition. Geometry guy 22:18, 13 February 2007 (UTC)
- seems silly to avoid altogether making statements that require the axiom of choice in linear algebra articles. however, so long as the topic belongs in linear algebra proper, probably good to discuss the finite dimensional case first if possible. it should be explicitly stated when the axiom of choice is needed. i agree that doing so via footnotes, as you've done in the dual space article, is a good idea. (even better if they are expanded a bit). Mct mht 15:17, 14 February 2007 (UTC)
- I actually agree entirely, but wanted to invite opinion. (Also I have a slight preference for avoiding choice when it is not needed. For example, a lot of the claims about dual spaces do hold without choice for duals of vector spaces with bases. For another example, I prefer the statement "does not have a countable basis" to the statement "has an uncountable basis".) Anyway, I'm glad you like the footnotes idea, and would be happy for them to be expanded. Ultimately there might be a place for an article on "choice in linear algebra". Geometry guy 23:22, 14 February 2007 (UTC)
- a separate article that collects relevant linear algebraic results and delineates when the axiom of choice is and is not needed looks like the best solution. let's hope someone will take that up. :-) Mct mht 02:52, 15 February 2007 (UTC)
Some unfavorable impressions
I have to (regretfully) state that for a "top importance" article, this one is remarkably incoherent. Problems are manifold, but just for starters, there is the issue of consistency within the article itself and in wikipedia in general.
- If linear algebra studies systems of linear equations, as the introduction states, then how come Gauss is not even mentioned in the history section?
- Moreover, since according to the article on abstract algebra, linear algebra is its proper part, it would seem circular to state that linear algebra is widely used in both abstract algebra and ... On the other hand, applications of linear algebra to differential equations are not even mentioned (and no, it's not covered by a reference to functional analysis).
- Why is matrix theory not referenced at all? One would hope it's not because we cannot explain the difference between it and linear algebra!
- The History section seems particularly weak. As the article on matrices discusses, they were introduced in ancient times and used throughout the Middle Ages. Of course, Gauss's work at the beginning of the 19th century is very relevant to the development of linear algebra, but so is, for example, Laplace's work before it and Cauchy's after, neither of which is mentioned. Arthur Cayley only introduced notation for determinants and abstract matrices; it is hardly proper to credit him with the invention of linear algebra! In fact, there are two articles [1], [2] on the history of linear algebra in the MacTutor History of Mathematics Archive, which, while not complete, nonetheless make me think that it would be better to scrap the current history section altogether, as simplistic and factually wrong, and rewrite it anew.
- The section Elementary introduction is a weird mixture, an ad hoc explanation of vector spaces (and as someone has already commented above, a tuple is not at all a representative object for linear algebra as a discipline), with matrices, determinants, and the general idea of linearity interspersed.
- All but the very first Useful theorems deal with matrices, would it not be more natural to put them into the article on matrix theory?
And the list goes on, and on, and on. Arcfrk 15:01, 19 March 2007 (UTC)
- Although the commentary here is a bit harsh (and some of these issues are easily fixed, for example by referring to applications of linear algebra in other areas of abstract algebra, for instance using representation theory), I do agree with the substance of the criticisms. This really is one of the most fundamental articles in pure mathematics, and I think we have a real opportunity here to expand and enhance it. Geometry guy 22:24, 19 March 2007 (UTC)
Rewrite?
How about rewriting it? I mean, really. Readable articles on basic mathematics topics shouldn't be _too_ hard for us, should they? I tried turning the intro into grammatically correct English; as for content, however, I came here to learn, and my linear algebra, history thereof, etc. are still weak. Please help! User:x14n 10th-ish Oct. 2007
I'm hopefully gonna spend some time in the next few days reworking some pieces of this article. As is, it really is a complete mess. A couple of thoughts that spring to mind:
-There should be a definition of vector space, some substantial mention of module theory and a couple of comments about why vector spaces and modules are different and why they are the same.
-The example given about the GNP of 8 countries is misleading in its triviality. Vectors are much more than just lists of numbers. Towards the end of explaining what linear algebra is and what a vector space really is, this article should have some well developed heuristic explanations of the concept of "linearity".
-I think the section that just lists important theorems should be trashed. It is completely unenlightening to just list off a bunch of results that all involve technical concepts, none of which have been defined.
It's late for me and these comments might be a little bit vague, but please respond. I'll try to realize some of this stuff when I've had some sleep. Jrdodge 08:31, 11 November 2007 (UTC)
A matrix is invertible if and only if its determinant is nonzero
This is inaccurate: for instance, a matrix over the integers modulo 4 with a determinant of 2 is not invertible.
Was I being too pedantic for a Wikipedia article?
Maybe it should read "A matrix is invertible if and only if its determinant is nonzero (but see Invertible)" ?
I was taught this exact statement in school, and it cost me time and effort when I started trying to work with matrices over rings other than the integers or the reals. I'd rather not see anyone else misled by this implicit assumption. 91.84.221.238 (talk) 02:24, 15 January 2008 (UTC)
- From the structure of the article, I interpreted the assumption of the previous section (that scalars come from a field) as carrying over to the subsequent section ("Some useful theorems"). The next section ("Generalisations and related topics") discusses matrices over other algebraic objects. Myasuda (talk) 02:52, 15 January 2008 (UTC)
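The objection can be checked directly by brute force; a small Python sketch (the matrix below is my own example with determinant 2 over Z/4Z):

```python
from itertools import product

# Over Z/4Z, a matrix with determinant 2 has no inverse, because 2 is not
# a unit mod 4. Verify by trying every 2x2 matrix over Z/4Z.
A = [[2, 1], [0, 1]]   # det = 2 (mod 4), nonzero but not invertible

def matmul_mod(X, Y, m):
    """2x2 matrix product reduced mod m."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) % m for j in range(2)]
            for i in range(2)]

I = [[1, 0], [0, 1]]
inverses = [B for entries in product(range(4), repeat=4)
            for B in [[list(entries[:2]), list(entries[2:])]]
            if matmul_mod(A, B, 4) == I]
print(len(inverses))   # no B with AB = I exists
```

By contrast, a matrix whose determinant is a unit mod 4 (e.g. determinant 1 or 3) does have an inverse, which is exactly the general statement in the Invertible matrix article.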
Chinese linear algebra side note
I was reading my linear algebra book for class (Otto Bretscher's Linear Algebra with Applications, 3rd edition. Upper Saddle River, New Jersey: Pearson Education, 2005) when I came across something interesting on page 8: "When mathematicians in ancient China had to solve a system of simultaneous linear equations such as [...], they took all the numbers involved in this system and arranged them in a rectangular pattern (Fang Cheng in Chinese) as follows: [...]
All the information about this system is conveniently stored in this array of numbers. The entries were represented by counting rods; [...] the equations were then solved in a hands-on fashion, by manipulating the rods." I did some googling and found out that Fang Cheng is Chapter 8 of a book called Nine Chapters on the Mathematical Art, which shows how, almost 2000 years ago, the Chinese had a method similar to Gaussian elimination for solving linear equations, even though they didn't call it linear algebra. I thought it would be an interesting side note to add to the history section. The links about the book are here: Nine Chapters on wiki and Nine Chapters on google books —Preceding unsigned comment added by 128.61.43.160 (talk) 18:05, 12 June 2008 (UTC)
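The fang cheng procedure amounts to row reduction on the augmented array; a short Python sketch, using the well-known first problem of Chapter 8 of the Nine Chapters (three grades of paddy), with exact rational arithmetic:

```python
from fractions import Fraction

# Row-reduce the augmented array, in the spirit of fang cheng.
# Sample system (Nine Chapters, Chapter 8, Problem 1):
#   3x + 2y +  z = 39
#   2x + 3y +  z = 34
#    x + 2y + 3z = 26

def solve(aug):
    """Gauss-Jordan elimination over the rationals on an n x (n+1) array."""
    n = len(aug)
    M = [[Fraction(v) for v in row] for row in aug]
    for col in range(n):
        piv = next(r for r in range(col, n) if M[r][col] != 0)
        M[col], M[piv] = M[piv], M[col]
        M[col] = [v / M[col][col] for v in M[col]]
        for r in range(n):
            if r != col and M[r][col] != 0:
                f = M[r][col]
                M[r] = [vr - f * vc for vr, vc in zip(M[r], M[col])]
    return [M[r][n] for r in range(n)]

x, y, z = solve([[3, 2, 1, 39], [2, 3, 1, 34], [1, 2, 3, 26]])
print(x, y, z)   # 37/4 17/4 11/4
```

The ancient procedure eliminated columns of counting rods rather than symbolic equations, but the arithmetic is the same idea as Gaussian elimination.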