Zipf’s Law, ArtEnt Blog Hits

As I look at the hit statistics for the last quarter, I cannot help but wonder how well they fit Zipf’s law (a.k.a. power laws, the Zipf–Mandelbrot law, the discrete Pareto distribution).  Zipf’s law states that the distributions of many ranked things (city populations, country populations, blog hits, word frequencies, the probability distribution of questions asked of Alicebot, Wikipedia hits, terrorist attacks, the response times of famous scientists, …) look like a line when plotted on a log-log diagram.  So here are the numbers for my blog hits and, below that, a plot of log(blog hits) vs log(rank):


“Deep Support Vector Machines for Regression Problems” – 400
Simpson’s paradox and Judea Pearl’s Causal Calculus – 223
Standard Deviation of Sample Median – 220
100 Most useful Theorems and Ideas in Mathematics – 204
Computer Evaluation of the best Historical Chess Players – 181
Notes on “A Few Useful Things to Know about Machine Learning” – 178
Comet ISON, Perihelion, Mars, and the rule of 13.3 – 167
Dropout – What happens when you randomly drop half the features? – 139
The Exact Standard Deviation of the Sample Median – 101
Bengio LeCun Deep Learning Video – 99
Category Theory ? – 92
“Machine Learning Techniques for Stock Prediction” – 89
Approximation of KL distance between mixtures of Gaussians – 75
“A Neuro-evolution Approach to General Atari Game Playing” – 74
The 20 most striking papers, workshops, and presentations from NIPS 2012 – 65
Matlab code and a Tutorial on DIRECT Optimization – 61
About – 51

[Figure: log(blog hits) plotted against log(rank)]

Not too linear.  Hmmm.
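If you want to quantify how far from linear the log-log plot is, here is a minimal sketch in Python (assuming numpy and matplotlib are available) that takes the hit counts listed above, fits an ordinary least-squares line to log(hits) vs log(rank), and reports the fitted exponent and R².  This is just an illustrative back-of-the-envelope check, not a proper maximum-likelihood power-law fit.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hit counts from the list above, already sorted by rank.
hits = np.array([400, 223, 220, 204, 181, 178, 167, 139,
                 101, 99, 92, 89, 75, 74, 65, 61, 51])
rank = np.arange(1, len(hits) + 1)

log_rank, log_hits = np.log(rank), np.log(hits)

# Zipf's law predicts log(hits) ~ a - s*log(rank) for some exponent s,
# so fit a straight line and see how well it explains the data.
slope, intercept = np.polyfit(log_rank, log_hits, 1)
fit = intercept + slope * log_rank
r2 = 1 - np.sum((log_hits - fit) ** 2) / np.sum((log_hits - log_hits.mean()) ** 2)
print(f"fitted exponent: {-slope:.2f}, R^2 on the log-log data: {r2:.3f}")

plt.plot(log_rank, log_hits, "o", label="blog hits")
plt.plot(log_rank, fit, "-", label="least-squares line")
plt.xlabel("log(rank)")
plt.ylabel("log(hits)")
plt.legend()
plt.show()
```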

(Though Zipf’s “law” has been known for a long time, this post is at least partly inspired by Terence Tao’s wonderful post “Benford’s law, Zipf’s law, and the Pareto distribution”.)