2011-12-15

Mnemonic for π


All the following poems describe a number.

  • 産医師異国に向こう.産後薬なく産に産婆四郎二郎死産.産婆さんに泣く.ご礼には早よ行くな.
  • Yes, I have a number.
  • How I want a drink, alcoholic of course, after the heavy lectures involving quantum mechanics.
  • Sir, I send a rhyme excelling, In sacred truth and rigid spelling. Numerical sprites elucidate, For me, the lexicon's dull weight. If Nature gain, Not you complain, Tho' Dr. Johnson fulminate.
  • Que j'aime à faire apprendre un nombre utile aux sages! Immortel Archimède, artiste ingénieur, Qui de ton jugement peut priser la valeur? Pour moi, ton problème eut de pareils avantages.
  • Wie, O dies π. Macht ernstlich so vielen viele Müh!  Lernt immerhin, Jünglinge, leichte Verselein.  Wie so zum Beispiel dies dürfte zu merken sein!

The German version even names the number: π. These are all mnemonics for π. The Japanese one uses the sounds of numbers, while the other languages use the number of letters in each word. From
Yes(3), I(1) have(4) a(1) number(6),
you can read off 3.1416 (π rounded to four decimal places).
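The letter-count decoding is easy to check mechanically. A tiny Python sketch (the function name is mine, not from any of the books cited):

```python
# Decode a word-length pi mnemonic: each word's letter count is one digit.
def decode_mnemonic(words):
    return "".join(str(len(w)) for w in words)

digits = decode_mnemonic(["Yes", "I", "have", "a", "number"])
print(digits)  # 31416, i.e. pi rounded to 3.1416
```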

I found these poems in

  • Akihiro Nozaki, A Story of π, Iwanami Shoten (1974), pp. 77-78.

This book also refers to the following:

  • Shin Hitotumatu, Essay for Numbers, Chuuou Kouronsha (1972), p. 109.
  • Shigeo Nakano, The Road to Modern Mathematics, Shinyousya (1973), p. 19.
  • Shuuichirou Yoshioka, The Thousand and One Nights of Mathematics, Seinenshobou (1941), p. 147; Gakuseisha (1959), p. 107.


Geometric Multiplicity: eigenvectors (2)

If the eigenvectors of a matrix A are independent, that is a happy property, because A can then be diagonalized by a matrix S whose column vectors are the eigenvectors of A. For example,



Why is this a happy property of A? Because I can compute powers of A easily.



A^{10} is not a big deal, because Λ is a diagonal matrix and a power of a diagonal matrix is quite simple:
A^{10} = S Λ^{10} S^{-1}
Then why would I want to compute a power of A? For the same reason I want to find eigenvectors: the eigenvectors form a basis in which the matrix acts on each basis vector as a single scalar. I repeat this again. This is the happy point: a matrix becomes a scalar. What can be simpler than a scalar value?
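As a concrete check, here is a minimal numpy sketch of this identity. The 2x2 symmetric matrix is my own illustrative example, not the one from the original post's figure:

```python
import numpy as np

# A symmetric 2x2 example; its eigenvectors are guaranteed independent.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

lam, S = np.linalg.eig(A)       # columns of S are eigenvectors of A
Lam10 = np.diag(lam ** 10)      # powering a diagonal matrix is elementwise
A10 = S @ Lam10 @ np.linalg.inv(S)

# Same result as multiplying A by itself ten times.
assert np.allclose(A10, np.linalg.matrix_power(A, 10))
```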

But this is only possible when the matrix S's columns are independent, because S^{-1} must exist.

Now I come back to my first question: is λ's multiplicity related to the number of eigenvectors? This time I found that this relationship has a name.

  • Geometric multiplicity (GM): the number of independent eigenvectors for an eigenvalue, i.e., the dimension of its eigenspace
  • Algebraic multiplicity (AM): the multiplicity of the eigenvalue as a root of the characteristic polynomial

There is no rigid relationship between them; there is only the inequality 1 <= GM <= AM.

For example, if a 4x4 matrix has an eigenvalue with AM = 3 (so the number of distinct λs is 2), the GM of that eigenvalue is not necessarily 3; it can be 1, 2, or 3.
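This can be verified numerically. A sketch using a Jordan-like 4x4 matrix of my own choosing, with AM = 3 for the eigenvalue 2 but GM = 2:

```python
import numpy as np

def gm(A, lam, tol=1e-9):
    # GM = dim null(A - lam*I) = n - rank(A - lam*I)
    n = A.shape[0]
    return n - np.linalg.matrix_rank(A - lam * np.eye(n), tol=tol)

# Upper-triangular, so the eigenvalues are the diagonal: 2, 2, 2, 5.
# The off-diagonal 1 couples two of the 2's, shrinking the eigenspace.
A = np.array([[2.0, 1.0, 0.0, 0.0],
              [0.0, 2.0, 0.0, 0.0],
              [0.0, 0.0, 2.0, 0.0],
              [0.0, 0.0, 0.0, 5.0]])

print(gm(A, 2.0))  # 2: two independent eigenvectors, although AM = 3
print(gm(A, 5.0))  # 1
```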


By the way, this S is a special matrix called a Hadamard matrix. I wrote a blog entry about how to compute this matrix. This matrix is special: it is symmetric, its columns are mutually orthogonal, and it contains only 1 and -1.
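Assuming the standard Sylvester construction of the 4x4 Hadamard matrix (the post's original figure is not reproduced here), these properties can be checked directly:

```python
import numpy as np

# A 4x4 Hadamard matrix (Sylvester's construction).
H = np.array([[1,  1,  1,  1],
              [1, -1,  1, -1],
              [1,  1, -1, -1],
              [1, -1, -1,  1]])

assert (H == H.T).all()                     # symmetric
assert np.allclose(H @ H.T, 4 * np.eye(4))  # columns orthogonal; H/2 is an orthogonal matrix
```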

The identity matrix is also an example of such a matrix. The eigenvalues of the 4x4 identity matrix are λ = 1, 1, 1, 1, and the standard basis vectors e_1, e_2, e_3, e_4 are four independent eigenvectors (in fact, every nonzero vector is an eigenvector).



It took me a day to realize this, but Marc pointed it out immediately.

Still, I had thought that one λ value corresponds to one eigenvector in general. The number of independent eigenvectors for λ is the dimension of the null space of A - λI, while the eigenvalue's multiplicity comes from the characteristic polynomial. I feel I need to study more to reach a deep understanding of the relationship between them.


Anyway, an interesting thing to me is that one eigenvalue can have multiple independent corresponding eigenvectors.


References:
Gilbert Strang, Introduction to Linear Algebra, 4th Ed.



Geometric Multiplicity: eigenvectors (1)


I had a question regarding the relationship between the multiplicity of an eigenvalue and its eigenvectors.

I am more interested in an eigenvalue's multiplicity than in the value itself, because if an eigenvalue has multiplicity, the number of independent eigenvectors ``could'' decrease. My favorite property of eigen-analysis is that it is a transformation to a simpler basis; here, simpler means a matrix becomes a scalar. I even have trouble understanding a 2x2 matrix, but a scalar gives me no problem; there is nothing simpler than a scalar. Ax = λx means that, on the eigenvector x, the matrix A acts as the scalar λ. What a great simplification!

My question is:
 If λ has multiplicity, are there still multiple independent eigenvectors for that eigenvalue?
My intuition said no. I can compute an eigenvector for a given eigenvalue, but I thought I could not compute multiple independent eigenvectors for one eigenvalue.

For instance, assume a 2x2 matrix that has λ = 1, 1. How many independent eigenvectors are there? One?

Recently I found that this is related to diagonalization using eigenvectors. My intuition was wrong.

For one eigenvalue that has multiplicity, there can be multiple independent eigenvectors.
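The claim can already be checked numerically. A small sketch comparing the 2x2 identity with a shear matrix, both having λ = 1, 1 (my own illustrative matrices):

```python
import numpy as np

def num_independent_eigvecs(A, lam):
    # Dimension of the eigenspace: n - rank(A - lam*I).
    n = A.shape[0]
    return n - np.linalg.matrix_rank(A - lam * np.eye(n))

I2 = np.eye(2)               # eigenvalues 1, 1: two independent eigenvectors
J  = np.array([[1.0, 1.0],
               [0.0, 1.0]])  # eigenvalues 1, 1: only one

print(num_independent_eigvecs(I2, 1.0))  # 2
print(num_independent_eigvecs(J, 1.0))   # 1
```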

I will show an example of one eigenvalue with multiple independent eigenvectors in the next article.