# Math

You are currently browsing the archive for the Math category.

## Deriving the Gaussian Distribution from the Stirling Approximation and the Central Limit Theorem

You can find anything on the web.  Earlier today, I noticed that Stirling's approximation and the Gaussian distribution both contain a $\sqrt{2 \pi}$, so I started thinking that maybe you could apply Stirling's approximation to the binomial distribution with $p=1/2$ and large $n$, together with the central limit theorem, to derive the Gaussian distribution.  After making a few calculations it seemed to be working, and I thought, "Hey, maybe someone else has done this."  Sure enough, Jake Hoffman did it and posted it here.  The internet is pretty cool.
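As a quick numerical sanity check (my own sketch, not the derivation in the linked post), you can compare the symmetric binomial pmf for large $n$ against the Gaussian density with matching mean $n/2$ and variance $n/4$:

```python
import math

# Symmetric binomial pmf: P(X = k) for X ~ Binomial(n, 1/2).
def binomial_pmf(n, k):
    return math.comb(n, k) / 2**n

# Gaussian density with the matching mean and variance.
def gaussian_pdf(x, mean, var):
    return math.exp(-(x - mean)**2 / (2 * var)) / math.sqrt(2 * math.pi * var)

n = 1000
k = n // 2  # evaluate at the peak
exact = binomial_pmf(n, k)
approx = gaussian_pdf(k, n / 2, n / 4)
print(exact, approx)  # the two agree to about three significant figures
```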

## Hausdorff dimension

I was reading MarkCC's gripe about the misuse of the word dimension, and it reminded me of the Hausdorff dimension.  Generally speaking, the Hausdorff dimension matches our normal intuitive definition of dimension, i.e. the H-dimension of a smooth curve or line is 1, the H-dimension of a smooth surface is 2, and the H-dimension of the interior of a cube is 3. But for fractals, the H-dimension can be a non-integer. For the Cantor set, the H-dimension is $\log 2 / \log 3 \approx 0.63$. For more information, check out the Wikipedia article:

http://en.wikipedia.org/wiki/Hausdorff_dimension
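As a rough illustration (my own sketch): the box-counting dimension, a numerical stand-in that agrees with the Hausdorff dimension for the Cantor set, can be estimated by generating the intervals of the $k$-th construction stage and counting the boxes of side $3^{-k}$ needed to cover them:

```python
import math

# Intervals remaining at the k-th stage of the Cantor construction.
def cantor_intervals(level):
    intervals = [(0.0, 1.0)]
    for _ in range(level):
        next_level = []
        for a, b in intervals:
            third = (b - a) / 3
            next_level.append((a, a + third))  # keep the left third
            next_level.append((b - third, b))  # keep the right third
        intervals = next_level
    return intervals

level = 8
eps = 3.0 ** -level
n_boxes = len(cantor_intervals(level))  # each interval is one box of side eps
dim = math.log(n_boxes) / math.log(1 / eps)
print(round(dim, 4))  # log 2 / log 3 ≈ 0.6309
```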

## “Matrix identities as derivatives of determinant identities”

Check out Terence Tao's wonderful post "Matrix identities as derivatives of determinant identities".
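The first-order case is easy to check numerically: the derivative of $\det(I + tA)$ at $t = 0$ is $\operatorname{tr}(A)$. A hand-rolled $2 \times 2$ sketch:

```python
# Check numerically that d/dt det(I + tA) at t = 0 equals tr(A),
# using a 2x2 matrix so the determinant can be written by hand.
def det2(m):
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def trace2(m):
    return m[0][0] + m[1][1]

A = [[1.0, 2.0], [3.0, 4.0]]
t = 1e-6
I_plus_tA = [[1 + t * A[0][0], t * A[0][1]],
             [t * A[1][0], 1 + t * A[1][1]]]

numeric_derivative = (det2(I_plus_tA) - 1.0) / t  # det(I) = 1
print(numeric_derivative, trace2(A))  # both close to 5.0
```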

## “Proofs without words”

Thank you to Freakonometrics for pointing me toward the book "Proofs Without Words" by Roger Nelsen.  Might be a nice Christmas present.

## Matroids

Matroids are an abstraction of vector spaces.  Vector spaces have addition and scalar multiplication, and the axioms of vector spaces lead to the concepts of subspaces, rank, and independent sets.  Every vector space gives a matroid, but matroids only retain the properties of independence, rank, and subspaces. The matroid closure of a set plays the same role as the span of a set in a vector space.
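A minimal sketch of the correspondence (the names here are illustrative): take a finite set of vectors as the ground set and declare a subset independent exactly when the vectors are linearly independent; the matroid rank of a subset is then the dimension of its span.

```python
# Vector matroid: ground set of vectors, independence = linear independence.
def rank(vectors):
    """Gaussian elimination on a copy of the vectors; returns the rank."""
    rows = [list(map(float, v)) for v in vectors]
    r = 0
    for col in range(len(rows[0]) if rows else 0):
        pivot = next((i for i in range(r, len(rows))
                      if abs(rows[i][col]) > 1e-9), None)
        if pivot is None:
            continue
        rows[r], rows[pivot] = rows[pivot], rows[r]
        for i in range(len(rows)):
            if i != r and abs(rows[i][col]) > 1e-9:
                f = rows[i][col] / rows[r][col]
                rows[i] = [a - f * b for a, b in zip(rows[i], rows[r])]
        r += 1
    return r

def is_independent(vectors):
    return rank(vectors) == len(vectors)

ground_set = [(1, 0, 0), (0, 1, 0), (1, 1, 0), (0, 0, 1)]
print(is_independent(ground_set[:3]))  # False: e1 + e2 is in the span of e1, e2
print(is_independent([ground_set[0], ground_set[1], ground_set[3]]))  # True
```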

## 100 Most useful Theorems and Ideas in Mathematics

So I have been thinking about which ideas in mathematics I use most often, and I have listed them below.  Please feel free to comment, because I would like to use this list as a starting point for a future list of "The most useful ideas in Mathematics for Scientists and Engineers".

I showed this list to Carl and he said his list would be completely different.  (He choked a bit when he saw primes at #71.)  Hopefully we can post his list as well as others.

1. counting
2. zero
3. integer decimal positional notation 100, 1000, …
4. the four arithmetic operations + – * /
5. fractions
6. decimal notation  0.1, 0.01, …
7. basic propositional logic (Modus ponens, contrapositive, If-then, and, or, nand, …)
8. negative numbers
9. equivalence classes
10. equality & substitution
11. basic algebra – idea of variables, equations, …
12. the idea of probability
13. commutative and associative properties
14. distributive property
15. powers (squared, cubed, …) – compound interest (miracle of)
16. scientific notation 1.3e6 = 1,300,000
17. polynomials
18. first order predicate logic
19. infinity
20. irrational numbers
21. De Morgan's laws
22. statistical independence
23. the notion of a function
24. square root  (cube root, …)
25. inequalities (list of inequalities)
26. rules for exponents (e.g. $a^b a^c = a^{b+c}$ )
27. Cartesian coordinate plane
28. basic set theory
29. random variable
30. probability distribution
31. histogram
32. the mean, expected value & strong law of large numbers
33. the graph of a function
34. standard deviation
35. Pythagorean theorem
36. vectors and vector spaces
37. limits
38. real numbers as limits of fractions, the least upper bound
39. continuity
40. $R^n$, Euclidean Space,  and Hilbert spaces (inner or dot product)
41. derivative
42. correlation
43. central limit theorem, Gaussian distribution, properties of Gaussians
44. integrals
45. chain rule
46. modular arithmetic
47. sine cosine tangent
48. $\pi$ – circumference, area, and volume formulas for circles, rectangles, parallelograms, triangles, spheres, cones, …
49. linear regression
50. Taylor’s theorem
51. the number e and the exponential function
52. Rolle's theorem, Karush–Kuhn–Tucker conditions, derivative is zero at the maximum
53. the notion of linearity
54. Big O notation
55. injective (one-to-one) / surjective (onto) functions
56. imaginary numbers
57. symmetry
58. Euler's formula $e^{ix} = \cos x + i \sin x$ and Euler's identity $e^{i \pi} + 1 = 0$
59. Fourier transform, convolution in time domain is the product in the frequency domain (& vice versa), the FFT
60. fundamental theorem of calculus
61. logarithms
62. matrices
63. conic sections
64. Boolean algebra
65. Cauchy–Schwarz inequality
66. binomial theorem – Pascal’s triangle
67. the determinant
68. ordinary differential equation (ODE)
69. mode (maximum likelihood estimator)
70. cosine law
71. prime numbers
72. linear independence
73. Jacobian
74. fundamental theorem of arithmetic
75. duality – (polyhedron faces & points, geometry lines and points, dual linear program, dual space, …)
76. intermediate value theorem
77. eigenvalues
78. median
79. entropy
80. KL distance
81. binomial distribution
82. Bayes’ theorem
83. $2^{10} \approx 1000$
84. compactness, Heine–Borel theorem
85. metric space, Triangle Inequality
86. projections, best approximation
87. $1/(1-X) = 1 + X + X^2 + \ldots$
88. partial differential equations
90. Riesz representation theorem
91. Fubini’s theorem
92. the ideas of groups, semigroups, monoids, rings, …
93. Singular Value Decomposition
94. numeric integration – trapezoidal rule, Simpson’s rule, …
95. mutual information
96. Plancherel’s theorem
97. matrix condition number
98. integration by parts
99. Euler's method for numerical integration of ODEs (and improved Euler, Runge–Kutta)
100. pigeonhole principle

## Gauss’s continued fraction

Gauss did a lot of math, but sometimes I am surprised when I find something new to me done by Gauss:

$\tan(x) = {1 \over{1/x - {1\over{3/x - {1\over{5/x - \cdots}}}}}}$
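The truncated fraction converges quickly and is easy to evaluate from the bottom up. A small sketch comparing it with `math.tan`:

```python
import math

# Evaluate the continued fraction from the bottom up, truncated at `depth`.
def tan_cf(x, depth=20):
    value = (2 * depth - 1) / x  # innermost partial denominator
    for k in range(depth - 1, 0, -1):
        value = (2 * k - 1) / x - 1 / value
    return 1 / value

print(tan_cf(1.0), math.tan(1.0))  # both ≈ 1.5574
```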

## Topological Sorting

Given a directed acyclic graph (DAG), find an ordering of the nodes that is consistent with the directed edges.  This is called topological sorting, and several algorithms exist.
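One standard approach is Kahn's algorithm: repeatedly remove a node with no remaining incoming edges and append it to the output. A minimal sketch (names illustrative):

```python
from collections import deque

# Kahn's algorithm: repeatedly emit a node whose prerequisites are all done.
def topological_sort(nodes, edges):
    indegree = {n: 0 for n in nodes}
    adjacent = {n: [] for n in nodes}
    for u, v in edges:  # edge (u, v) means "u before v"
        adjacent[u].append(v)
        indegree[v] += 1
    queue = deque(n for n in nodes if indegree[n] == 0)
    order = []
    while queue:
        u = queue.popleft()
        order.append(u)
        for v in adjacent[u]:
            indegree[v] -= 1
            if indegree[v] == 0:
                queue.append(v)
    if len(order) != len(nodes):
        raise ValueError("graph has a cycle; no topological order exists")
    return order

order = topological_sort(["a", "b", "c", "d"],
                         [("a", "b"), ("a", "c"), ("b", "d"), ("c", "d")])
print(order)  # every node appears after all of its prerequisites
```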

## Why are Gaussian Distributions Great?

Gaussian distributions are the most “natural” distributions. They show up everywhere. Here is a list of the properties that make me think that Gaussians are the most natural distributions:

• The sum of several random variables (like dice) tends to be Gaussian. (Central Limit Theorem).
• There are two natural ideas that appear in statistics: the standard deviation and the maximum entropy principle. If you ask, "Among all distributions with mean 0 and standard deviation 1, which has maximum entropy?", the answer is the Gaussian.
• Randomly select a point inside a high dimensional hypersphere. The distribution of any particular coordinate is approximately Gaussian. The same is true for a random point on the surface of the hypersphere.
• Take several samples from a Gaussian Distribution. Compute the Discrete Fourier Transform of the samples. The results have a Gaussian Distribution. I am pretty sure that the Gaussian is the only distribution with this property.
• The eigenfunctions of the Fourier Transform are products of polynomials and Gaussians.
• The solution to the differential equation $y' = -x y$ is a Gaussian. This fact makes computations with Gaussians easier. (Higher derivatives involve Hermite polynomials.)
• I think Gaussians are the only distributions closed under multiplication, convolution, and linear transformations.
• Maximum likelihood estimators to problems involving Gaussians tend to also be the least squares solutions.
• I think all solutions to stochastic differential equations involve Gaussians. (This is mainly a consequence of the Central Limit Theorem.)
• “The normal distribution is the only absolutely continuous distribution all of whose cumulants beyond the first two (i.e. other than the mean and variance) are zero.” – Wikipedia.
• For even $n$, the $n$th central moment of the Gaussian is simply an integer, $(n-1)!!$, multiplied by the standard deviation to the $n$th power.
• Many of the other standard distributions are strongly related to the Gaussian (e.g. binomial, Poisson, chi-squared, Student t, Rayleigh, logistic, log-normal, hypergeometric, …).
• "If X1 and X2 are independent and their sum X1 + X2 is distributed normally, then both X1 and X2 must also be normal." — Wikipedia.
• "The conjugate prior of the mean of a normal distribution is another normal distribution." — Wikipedia.
• When using Gaussians, the math is easier.
• The Erdős–Kac theorem implies that the number of distinct prime factors of a "random" large integer is, after normalization, Gaussian.
• Each component of the velocity of a random molecule in a gas is distributed as a Gaussian, with standard deviation $\sqrt{k\, T / m}$, where $m$ is the mass of the molecule, $T$ is the temperature, and $k$ is Boltzmann's constant.
• “A Gaussian function is the wave function of the ground state of the quantum harmonic oscillator.” — From Wikipedia
• Kalman Filters.
• The Gauss–Markov theorem.
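The first bullet is easy to see by simulation: sums of ten dice already look quite Gaussian, and the sample mean and variance match the theoretical values ($3.5$ and $35/12$ per die):

```python
import random
import statistics

# Sum 10 dice many times; the sums should be approximately Gaussian with
# mean 10 * 3.5 = 35 and variance 10 * 35/12 ≈ 29.17.
random.seed(0)
n_dice, n_trials = 10, 20000
sums = [sum(random.randint(1, 6) for _ in range(n_dice))
        for _ in range(n_trials)]

sample_mean = statistics.mean(sums)
sample_var = statistics.variance(sums)
print(sample_mean, sample_var)
```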

## “Old and New Matrix Algebra Useful for Statistics”

Sometimes I can't remember the formulas for matrix calculus.  Fortunately there are some good references on the web.  One of my favorites is "Old and New Matrix Algebra Useful for Statistics" (Minka, 2000).
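For example, one identity of this kind is $d(\log \det A) = \operatorname{tr}(A^{-1}\, dA)$. A quick finite-difference check on a $2 \times 2$ matrix (my own sketch, not code from the paper):

```python
import math

# Identity: d(log det A) = tr(A^{-1} dA). Perturb a single entry of A
# and compare the finite difference with the matching entry of (A^{-1})^T.
def det2(m):
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def inv2(m):
    d = det2(m)
    return [[ m[1][1] / d, -m[0][1] / d],
            [-m[1][0] / d,  m[0][0] / d]]

A = [[2.0, 1.0], [1.0, 3.0]]
h = 1e-6
A_pert = [row[:] for row in A]
A_pert[0][1] += h  # perturb entry (0, 1)

finite_diff = (math.log(det2(A_pert)) - math.log(det2(A))) / h
expected = inv2(A)[1][0]  # ((A^{-1})^T)[0][1] = (A^{-1})[1][0]
print(finite_diff, expected)  # both ≈ -0.2
```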