Where did matrices and determinants come from?
By Murray Bourne, 08 Apr 2008
A reader of the Matrices and Determinants chapter in Interactive Mathematics recently wrote to ask where matrices and determinants come from, and why they work.
Matrices are essential for solving large sets of simultaneous equations using a computer. We certainly don't want to use a different letter for each variable in our problem (or lots of subscripts, like a₃₄) because it would slow down the solution process and would be horrible to code. With matrices, we don't have to include any variables at all - just the numbers in front of those variables.
So for example, suppose we are trying to solve this 4x4 system of equations:
3x + 4y + 2z −6w = 5
x − 5y + 7z + 10w = −8
8x + 5y − z + 7w = 8
6x − 4y + 12z + 15w = 4
We only need to give the computer the coefficients, like this:
3   4   2  −6 |  5
1  −5   7  10 | −8
8   5  −1   7 |  8
6  −4  12  15 |  4
The computer just works on the numbers − it doesn't need the letters.
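In a language like Python, "just the numbers" literally becomes an array of coefficients. Here is a minimal sketch of solving the 4×4 system above (the choice of NumPy is mine - the post doesn't name any particular tool):

```python
# Solving the 4x4 system above by handing the computer only the numbers.
import numpy as np

# Coefficient matrix: the numbers in front of x, y, z, w
A = np.array([
    [3,  4,  2,  -6],
    [1, -5,  7,  10],
    [8,  5, -1,   7],
    [6, -4, 12,  15],
], dtype=float)

# Right-hand side constants
b = np.array([5, -8, 8, 4], dtype=float)

# np.linalg.solve does the elimination work for us
solution = np.linalg.solve(A, b)
print(solution)  # the values of x, y, z, w

# Verify: substituting the solution back reproduces the right-hand side
print(np.allclose(A @ solution, b))  # True
```

No letters x, y, z, w appear anywhere in the data - only their coefficients, exactly as described above.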
The Han Chinese and Simultaneous Equations
Here's a problem from a Chinese mathematics book written in 200BC. (Source)
There are three types of corn, of which three bundles of the first, two of the second, and one of the third make 39 measures. Two of the first, three of the second and one of the third make 34 measures. And one of the first, two of the second and three of the third make 26 measures. How many measures of corn are contained in one bundle of each type?
It looks a lot like the kind of problems in textbooks today, doesn't it?
The remarkable thing about this problem is the way that the Chinese writer solved it. First, they set up the numbers involved as follows:
1  2  3
2  3  2
3  1  1
26 34 39
(They are using rows where we would use columns. It doesn't matter.)
The instruction is to...
...multiply the middle column by 3 and subtract the right column as many times as possible; the same is then done, subtracting the right column as many times as possible from 3 times the first column. This gives:
0  0  3
4  5  2
8  1  1
39 24 39
A similar process (multiplying the left column by 5 and subtracting the middle column as many times as possible) eliminates the 4 in the second row:
0  0  3
0  5  2
36 1  1
99 24 39
From this, the left column reads 36z = 99, so we can read off the amount of the 3rd type (99/36 = 11/4), and then substitute back to find the second type (17/4) and the first type of corn (37/4).
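The same elimination can be sketched in Python using exact fractions, so the answers 37/4, 17/4 and 11/4 come out exactly. (The row-based layout and the loop structure here are my own modern rendering, not the Han original, which worked column by column.)

```python
# Gaussian elimination on the corn problem, using exact arithmetic.
from fractions import Fraction

# The three equations as rows (coefficients | measures):
# 3x + 2y + z = 39,  2x + 3y + z = 34,  x + 2y + 3z = 26
rows = [
    [Fraction(3), Fraction(2), Fraction(1), Fraction(39)],
    [Fraction(2), Fraction(3), Fraction(1), Fraction(34)],
    [Fraction(1), Fraction(2), Fraction(3), Fraction(26)],
]

n = 3
# Forward elimination: clear the entries below each pivot
for i in range(n):
    for j in range(i + 1, n):
        factor = rows[j][i] / rows[i][i]
        rows[j] = [a - factor * b for a, b in zip(rows[j], rows[i])]

# Back-substitution: read the unknowns off from the bottom up
x = [Fraction(0)] * n
for i in reversed(range(n)):
    total = sum(rows[i][k] * x[k] for k in range(i + 1, n))
    x[i] = (rows[i][n] - total) / rows[i][i]

print(x)  # [Fraction(37, 4), Fraction(17, 4), Fraction(11, 4)]
```

Using `Fraction` rather than floats keeps the arithmetic exact, which is closer in spirit to the original hand calculation.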
We now call this process Gaussian Elimination after the German mathematician Gauss (1777-1855).
Maybe it should be called Han Elimination.
You can read more interesting history about matrices and determinants from The MacTutor History of Mathematics archive.
See the 10 Comments below.
3 Jun 2008 at 8:24 pm [Comment permalink]
You have actually given some of us a good background in mathematics. With one's good knowledge of the origin of matrices and determinants, one would not feel the pain of solving problems in certain aspects of maths anymore.
Thanks.
3 Jun 2008 at 9:24 pm [Comment permalink]
Thanks for the comment, Peter.
Many people feel great pain solving matrix problems - they are tedious and prone to many errors.
Let's use computers for this hack work and spend the time understanding what it means!
26 Oct 2011 at 11:29 pm [Comment permalink]
Thanks for this post. This gave me a bit of motivation for studying Linear Algebra 🙂
31 Aug 2012 at 5:19 am [Comment permalink]
It is many years since I tried advanced maths, and I never got the hang of why one would use this strange, unintuitive way of representing algebraic expressions. I don't pretend to be talented in maths, although I think I was very badly taught in this area - I suspect the teacher hadn't got much of a clue.
However, the penny may be starting to drop with your explanation.
This is how I am now seeing it:
1 - you can solve a group of simultaneous equations using the coefficients in a specific way.
2 - with a large number of 'dimensions' and/or a large number of equations, the 'workings out' (while still appearing complex and confusing) are vastly reduced in complexity and confusion compared to sticking with basic algebraic 'workings out'.
Thanks for the information. I will keep an eye on this site.
31 Aug 2012 at 8:21 am [Comment permalink]
@Brian. I'm glad you found the article useful. Your 2-point summary is a good one!
18 Jun 2013 at 8:57 am [Comment permalink]
Hello,
I had had this kind of explanation before, but I would like, if possible, to understand what a matrix means in normal three-dimensional space. I mean, I can see that 3x+2=y and 2x+1=y are two lines in a plane, and that solving them means finding where those lines meet, but is there any kind of trick to picture the solutions of a bigger and more complex group of equations? Thank you very much in advance! 🙂
18 Jun 2013 at 3:23 pm [Comment permalink]
@Diego: For a 3x3 system, the solutions are represented by intersecting planes. If there is one unique solution, then the 3 planes represented by the 3 equations in 3 unknowns meet at a point.
For larger systems, it's a bit trickier to visualize.
These pages may help:
MathWarehouse
Wolfram Demonstration (requires the CDF plugin, but well worth it)
CoolMath
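As a quick numeric illustration of the "planes meeting at a point" idea (the three example planes here are made up for the demonstration, using NumPy):

```python
# Three planes (three equations in x, y, z) meeting at a single point.
import numpy as np

# x + y + z = 6,  2x - y + z = 3,  x + 2y - z = 2
A = np.array([[1, 1, 1], [2, -1, 1], [1, 2, -1]], dtype=float)
b = np.array([6, 3, 2], dtype=float)

# For a 3x3 system with one unique solution, this is the single
# point where all three planes intersect
point = np.linalg.solve(A, b)
print(point)  # [1. 2. 3.]
```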
3 Oct 2013 at 7:50 pm [Comment permalink]
Well, apparently, this question took birth in my mind too. I'm simply fascinated by matrices. I heard of a branch of quantum mechanics called "matrix mechanics". Can you please be kind enough to tell me something about this study?
4 Oct 2013 at 8:32 am [Comment permalink]
@Gaurev: This topic is beyond the scope of IntMath, but this search brings up plenty of results.
14 Dec 2013 at 7:29 am [Comment permalink]
You can see the rest of the story on matrix determinants here: http://www.amarketplaceofideas.com/math-derivation-of-matrix-determinant.htm