I had the weirdest dream the other day. Simply put, I was laughing in my dream when I found out that some people are luckier than others, because one of their eigenvectors happens to be salted pork. And I haven't even been reading any Boris Vian lately.
Eigenvectors are interesting animals. They occur when you multiply a vector by a matrix: sometimes, you find that the result of the multiplication is a vector that lies along the same line as the original. I.e., only the vector's magnitude (and possibly its sign) changes, not the line it points along. In other words, multiplying this particular vector by the matrix was the same as multiplying the vector by an ordinary number.
Take, for instance, this:
\[\begin{pmatrix}0&1\\1&0\end{pmatrix}\begin{pmatrix}x\\y\end{pmatrix}=\begin{pmatrix}y\\x\end{pmatrix}.\]
The matrix in this equation has two eigenvectors: $(1, 1)$ and $(1, -1)$. For instance,
\[\begin{pmatrix}0&1\\1&0\end{pmatrix}\begin{pmatrix}1\\-1\end{pmatrix}=\begin{pmatrix}-1\\1\end{pmatrix}.\]
I.e., the eigenvalue corresponding to the eigenvector $(1, -1)$ is $-1$: multiplying $(1, -1)$ by the matrix is the same as multiplying $(1, -1)$ by $-1$, which gives $(-1, 1)$.
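If you want to check this numerically, here is a quick sketch in Python using numpy (note that numpy.linalg.eig returns eigenvectors scaled to unit length, so they come out as multiples of $(1, 1)$ and $(1, -1)$):

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# eig returns the eigenvalues and a matrix whose columns are the
# corresponding eigenvectors, normalized to unit length
eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)    # 1 and -1 (the order may vary)
print(eigenvectors)   # columns proportional to (1, 1) and (1, -1)

# multiplying by the matrix == scaling by the eigenvalue
v = np.array([1.0, -1.0])
print(A @ v)          # [-1.  1.], i.e., -1 times v
```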
Function operators also have "eigenvectors": these are called eigenfunctions. To see this, first consider that many functions can be expressed as a (possibly infinite) power series: for instance, $\sin x=x-x^3/3!+x^5/5!-x^7/7!+...$. In other words, we can view these functions as "vectors" in some "infinite-dimensional" vector space whose "dimensions" correspond to $x^0=1$, $x^1=x$, $x^2$, $x^3$, $x^4$, etc. The "vector" corresponding to $\sin x$ would be $(0, 1, 0, -1/3!, 0, 1/5!, 0, -1/7!, ...)$.
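You can play with this representation directly; below is a minimal sketch in Python (the function names and the truncation length N = 20 are my own choices for illustration). It stores the first N power-series coefficients of $\sin x$ in an array and evaluates the truncated series:

```python
import math
import numpy as np

N = 20  # how many "dimensions" of the infinite vector we keep

def sin_coefficients(n):
    """First n power-series coefficients of sin x in the basis 1, x, x^2, ..."""
    c = np.zeros(n)
    for k in range(1, n, 2):  # only odd powers appear
        c[k] = (-1) ** (k // 2) / math.factorial(k)
    return c

def evaluate(coeffs, x):
    """Treat the coefficient vector as a (truncated) power series at x."""
    return sum(c * x ** i for i, c in enumerate(coeffs))

c = sin_coefficients(N)
print(c[:8])             # 0, 1, 0, -1/3!, 0, 1/5!, 0, -1/7!
print(evaluate(c, 1.0))  # ~0.84147..., close to math.sin(1.0)
```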
Now here's a particularly interesting infinite-dimensional matrix:
\[\begin{pmatrix}0&1&0&0&0&\cdots\\0&0&2&0&0&\cdots\\0&0&0&3&0&\cdots\\0&0&0&0&4&\cdots\\0&0&0&0&0&\cdots\\\vdots&\vdots&\vdots&\vdots&\vdots&\ddots\end{pmatrix}\begin{pmatrix}0\\0\\0\\1\\0\\\vdots\end{pmatrix}=\begin{pmatrix}0\\0\\3\\0\\0\\\vdots\end{pmatrix}.\]
Can you see what this matrix does? In this example, the vector $(0, 0, 0, 1, ...)$ represents the function $x^3$. The vector on the right, $(0, 0, 3, 0, ...)$, happens to represent the function $3x^2$. Multiplying the first by the matrix on the left gave us the second. Yes, you guessed it: this particular matrix is none other than the derivative operator.
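In a computer we can only keep finitely many dimensions, but a truncated version of this matrix is easy to build and already shows the pattern (a sketch; note that in a finite basis the highest-degree term is necessarily lost when differentiating):

```python
import numpy as np

N = 6

# D[i, i + 1] = i + 1: exactly the banded matrix above, cut off at size N.
# Acting on a coefficient vector, it implements differentiation.
D = np.zeros((N, N))
for i in range(N - 1):
    D[i, i + 1] = i + 1

x_cubed = np.array([0.0, 0.0, 0.0, 1.0, 0.0, 0.0])  # the "vector" for x^3
print(D @ x_cubed)  # [0. 0. 3. 0. 0. 0.], the "vector" for 3x^2
```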
So what are its eigenvectors? Here is one particular eigenvector of great importance:
\[\begin{pmatrix}0&1&0&0&0&\cdots\\0&0&2&0&0&\cdots\\0&0&0&3&0&\cdots\\0&0&0&0&4&\cdots\\0&0&0&0&0&\cdots\\\vdots&\vdots&\vdots&\vdots&\vdots&\ddots\end{pmatrix}\begin{pmatrix}1\\1\\1/2\\1/6\\1/24\\\vdots\end{pmatrix}=\begin{pmatrix}1\\1\\1/2\\1/6\\1/24\\\vdots\end{pmatrix}.\]
The vector $(1, 1, 1/2, 1/6, 1/24, ...)$, perhaps more recognizable as $(1/0!, 1/1!, 1/2!, 1/3!, 1/4!, ...)$, happens to represent the function $e^x=x^0/0!+x^1/1!+x^2/2!+x^3/3!+...$. That it is an eigenvector of our matrix with eigenvalue 1, i.e., that $e^x$ is an eigenfunction of the derivative operator, is just another way of stating something I learned in high school, namely that $de^x/dx=e^x$.
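Applying the truncated matrix from the previous sketch to the first N coefficients of $e^x$ shows this directly (the only discrepancy is in the last entry, where the truncation chops off the $x^{N-1}$ term):

```python
import math
import numpy as np

N = 8
D = np.zeros((N, N))
for i in range(N - 1):
    D[i, i + 1] = i + 1

# coefficient vector of e^x: (1/0!, 1/1!, 1/2!, ...)
e = np.array([1.0 / math.factorial(k) for k in range(N)])

print(D @ e)  # matches e in every entry...
print(e)      # ...except the last, lost to truncation
```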
Eigenfunctions pop up all over the place in quantum mechanics. That is because one way of stating the central problem of quantum mechanics is as an eigenvalue problem: given a function operator that represents a physical quantity (an observable), we need to find the wave functions that happen to be eigenfunctions of this operator. These will represent the "pure states" of a quantum system (i.e., states that the system may be in right after a measurement of that observable), while a general state will be a superposition, a linear combination of these eigenstates.
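To make this concrete, here is a sketch of the standard textbook example, a particle in a box, with the energy operator (the Hamiltonian) discretized by finite differences; the units ($\hbar = m = 1$), the box length, and the grid size are arbitrary choices of mine. Finding the allowed energies and stationary states is then literally a matrix eigenvalue problem:

```python
import numpy as np

# Particle in a box of length L, on n interior grid points.
# H = -1/2 d^2/dx^2 becomes a symmetric tridiagonal matrix.
n, L = 200, 1.0
dx = L / (n + 1)
main = np.full(n, 1.0 / dx**2)      # -1/2 * (-2/dx^2) on the diagonal
off = np.full(n - 1, -0.5 / dx**2)  # -1/2 * (1/dx^2) off the diagonal
H = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

energies, states = np.linalg.eigh(H)  # eigh, since H is symmetric
print(energies[:4])  # close to (pi^2/2) * [1, 4, 9, 16], the exact values
```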
When you deal with infinite things, you have to be careful: many results generalize from the finite to the infinite, but some do not. The infinite-dimensional vector spaces used here are examples of Hilbert spaces (strictly speaking, a Hilbert space is a complete inner-product space, and it doesn't have to be infinite-dimensional, it just can be), and their theory also plays an important role in quantum mechanics.
One more thing about functions as infinite-dimensional vectors. When you represent a function as a power series (e.g., $\sin x=x-x^3/3!+x^5/5!-x^7/7!+...$), the basis vectors form a countable set. There are other ways to "decompose" a function; one of the best known is the Fourier transform, where you basically represent a function as a combination of sine waves of various frequencies. Since there's a distinct sine wave for every real frequency, in this case the number of basis vectors will be the same as the cardinality of the set of reals, i.e., uncountably infinite. This strongly suggests that the class of functions that can be expressed using Fourier transforms is vastly larger than the class of functions that can be expressed as a power series. Not altogether surprising; after all, power series can only represent analytic functions (at least locally), whereas Fourier transforms can represent functions that are decidedly not analytic, such as a square wave.
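To make the square-wave remark concrete: its Fourier series is $(4/\pi)\sum_{k=0}^{\infty}\sin((2k+1)x)/(2k+1)$, and a partial sum already reproduces the flat tops that no single power series could carry across the jump. A short sketch:

```python
import numpy as np

def square_wave_partial_sum(x, terms):
    """Partial sum of the Fourier series of a unit square wave:
    (4/pi) * sum over odd n of sin(n*x)/n."""
    total = np.zeros_like(x)
    for k in range(terms):
        n = 2 * k + 1
        total += np.sin(n * x) / n
    return 4.0 / np.pi * total

x = np.linspace(0.1, np.pi - 0.1, 5)     # sample points inside (0, pi)
print(square_wave_partial_sum(x, 1000))  # close to 1, the wave's value there
```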
What all this has to do with salted pork, I have no idea. But I do know that I woke up with the words "sózott disznó" (Hungarian for salted pork) on my tongue. Wonder what Dr. Freud would have made of this dream.