Kernel density estimation
In statistics, kernel density estimation (or the Parzen window method, named after Emanuel Parzen) is a non-parametric way of estimating the probability density function of a random variable. As an illustration, given a finite sample drawn from a population, kernel density estimation makes it possible to extrapolate from the sample to the population as a whole.
A histogram can be thought of as a collection of point samples from a kernel density estimate for which the kernel is a uniform box the width of the histogram bin.
Definition
If x1, x2, ..., xn ~ ƒ is an independent and identically distributed (IID) sample drawn from a random variable with density ƒ, then the kernel density approximation of its probability density function is

$\hat{f}_h(x) = \frac{1}{nh} \sum_{i=1}^{n} K\!\left(\frac{x - x_i}{h}\right),$

where K is some kernel and h > 0 is the bandwidth (smoothing parameter). Quite often K is taken to be a standard Gaussian function with mean zero and variance 1:

$K(x) = \frac{1}{\sqrt{2\pi}}\, e^{-\frac{1}{2}x^{2}}.$
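To make the definition concrete, here is a minimal sketch of the estimator in Python with NumPy (an illustration only; the function name kde, the default Gaussian kernel, and the evaluation grid are choices made for this example, not part of the original text):

    import numpy as np

    def kde(x_grid, data, h, kernel=None):
        """Evaluate the kernel density estimate on x_grid.

        data   : 1-D array of observations x_1, ..., x_n
        h      : bandwidth (smoothing parameter), h > 0
        kernel : kernel function K; defaults to the standard Gaussian density
        """
        if kernel is None:
            kernel = lambda u: np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)
        x_grid = np.asarray(x_grid, dtype=float)
        data = np.asarray(data, dtype=float)
        n = data.size
        # (1 / (n*h)) * sum_i K((x - x_i) / h): one scaled kernel per observation
        u = (x_grid[:, None] - data[None, :]) / h
        return kernel(u).sum(axis=1) / (n * h)

    # Example: estimate the density of 100 standard normal draws on a grid
    # sample = np.random.normal(size=100)
    # xs = np.linspace(-4.0, 4.0, 200)
    # f_hat = kde(xs, sample, h=0.4)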
Intuition
Less smooth density estimators, such as the histogram, can also be made asymptotically consistent, but they are often either discontinuous or converge at slower rates than the kernel density estimator. Rather than grouping observations into bins, the kernel density estimator can be thought of as placing a small "bump" at each observation, with the shape of the bumps determined by the kernel function. The estimator is the sum of these bumps and is therefore smooth whenever the kernel is smooth.
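To make the "sum of bumps" picture concrete, the hypothetical helper below (same conventions and Gaussian kernel as the sketch in the Definition section) returns one bump per observation; summing the rows reproduces the kernel density estimate:

    import numpy as np

    def kde_bumps(x_grid, data, h):
        """Return an (n, m) array whose i-th row is the bump placed at observation x_i.

        Summing over the rows gives the kernel density estimate on x_grid.
        """
        x_grid = np.asarray(x_grid, dtype=float)
        data = np.asarray(data, dtype=float)
        n = data.size
        u = (data[:, None] - x_grid[None, :]) / h              # one row per observation
        gaussian = np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)  # standard Gaussian kernel
        return gaussian / (n * h)

    # f_hat = kde_bumps(xs, sample, h=0.4).sum(axis=0)  # the "sum of bumps"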
Properties
Let $J(h) = \mathbb{E}\!\int \bigl(\hat f_h(x) - f(x)\bigr)^{2}\,dx$ be the L2 risk function (mean integrated squared error) for ƒ. Under weak assumptions on ƒ and K,

$J(h) \approx \tfrac{1}{4}\, c_1^{2}\, h^{4} \int \bigl(f''(x)\bigr)^{2}\,dx + \frac{c_2}{n h},$

where $c_1 = \int x^{2} K(x)\,dx$ and $c_2 = \int K(x)^{2}\,dx$. Minimizing this approximate risk function over h shows that the optimal bandwidth is

$h^{*} = \left(\frac{c_2}{c_1^{2}\, c_3\, n}\right)^{1/5},$

where $c_3 = \int \bigl(f''(x)\bigr)^{2}\,dx$. With this optimal choice of bandwidth, the risk function is $J(h^{*}) \approx \frac{c_4}{n^{4/5}}$ for some constant $c_4 > 0$. It can be shown that, under weak assumptions, no non-parametric estimator can converge at a faster rate than the kernel estimator. Note that the $n^{-4/5}$ rate is slower than the typical $n^{-1}$ convergence rate of parametric methods.
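As a worked illustration of the bandwidth formula (not part of the original text): if K is the standard Gaussian kernel and the unknown curvature term $c_3$ is evaluated under a normal reference density with standard deviation $\sigma$, then $c_1 = 1$, $c_2 = 1/(2\sqrt{\pi})$ and $c_3 = 3/(8\sqrt{\pi}\,\sigma^{5})$, so

$h^{*} = \left(\frac{1/(2\sqrt{\pi})}{\bigl(3/(8\sqrt{\pi}\,\sigma^{5})\bigr)\, n}\right)^{1/5} = \left(\frac{4\sigma^{5}}{3n}\right)^{1/5} \approx 1.06\,\sigma\, n^{-1/5},$

which is the familiar normal-reference ("rule of thumb") bandwidth.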
Statistical implementation
- In MATLAB, kernel density estimation is implemented through the ksdensity function.
- In Stata, it is implemented through kdensity; for example, histogram x, kdensity.
- In R, it is implemented through the density function.
- In SAS, proc kde can be used to estimate univariate and bivariate kernel densities.
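Although Python is not in the list above, SciPy provides an analogous routine, scipy.stats.gaussian_kde. The following is a minimal usage sketch (the sample and evaluation grid are arbitrary toy values chosen for this example):

    import numpy as np
    from scipy.stats import gaussian_kde

    sample = np.random.normal(loc=0.0, scale=1.0, size=200)  # toy 1-D data
    kde = gaussian_kde(sample)   # Gaussian kernel with automatic bandwidth selection
    xs = np.linspace(-4.0, 4.0, 101)
    density = kde(xs)            # kernel density estimate evaluated on the grid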
See also
- Kernel (statistics)
- Kernel (mathematics)
- Density estimation
- Kernel smoothing
- Mean-shift algorithm
References
- Parzen E. (1962). On estimation of a probability density function and mode, Ann. Math. Stat. 33, pp. 1065-1076.
- Duda, R. and Hart, P. (1973). Pattern Classification and Scene Analysis. John Wiley & Sons. ISBN 0-471-22361-1.
- Wasserman, L. (2005). All of Statistics: A Concise Course in Statistical Inference, Springer Texts in Statistics.
External links
- Introduction to kernel density estimation
- Free MATLAB m-file for one- and two-dimensional kernel density estimation
- Free online software (calculator) that computes the kernel density estimate for any data series using the following kernels: Gaussian, Epanechnikov, rectangular, triangular, biweight, cosine, and optcosine.
- FIGTree, a fast library for computing kernel density estimates with a Gaussian kernel; MATLAB and C/C++ interfaces are available.