Dirichlet process

From Wikipedia, the free encyclopedia

Let S be a set equipped with a suitable σ-algebra. A Dirichlet process over S is a stochastic process whose sample paths are probability distributions on S. Its finite-dimensional distributions are Dirichlet distributions: if M is a finite measure on S and X is a random distribution drawn from a Dirichlet process, written as

X \sim \mathrm{DP}\left(M\right)

then for any finite measurable partition of S, say \left\{B_i\right\}_{i=1}^{n}, we have that

\left(X\left(B_1\right),...,X\left(B_n\right)\right) \sim \mathrm{Dirichlet}\left(M\left(B_1\right),...,M\left(B_n\right)\right)

The Dirichlet process was formally introduced by Thomas Ferguson[1] in 1973.
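To make the defining property concrete, the following sketch samples the finite-dimensional Dirichlet distribution for one partition, using the standard construction of a Dirichlet sample from normalised Gamma draws. The base measure, the partition, and the function name are illustrative choices, not part of any standard API:

```python
import random

def dirichlet_sample(alphas, rng):
    """Sample from Dirichlet(alphas) via normalised Gamma draws."""
    gammas = [rng.gammavariate(a, 1.0) for a in alphas]
    total = sum(gammas)
    return [g / total for g in gammas]

# Base measure M = 2 * Uniform[0, 1] on S = [0, 1]; partition
# B_1 = [0, 0.5), B_2 = [0.5, 0.8), B_3 = [0.8, 1] (illustrative).
masses = [2 * 0.5, 2 * 0.3, 2 * 0.2]            # (M(B_1), M(B_2), M(B_3))
x = dirichlet_sample(masses, random.Random(0))  # (X(B_1), X(B_2), X(B_3))
```

The three components of `x` are the random masses X assigns to the partition cells; by the definition above they jointly follow Dirichlet(1.0, 0.6, 0.4).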


The Chinese Restaurant Process

The above formal definition of the Dirichlet process does not lend itself to an intuitive understanding of the distribution's properties. A more intuitive approach is to consider a sampling scheme known as the Chinese Restaurant Process. This scheme produces samples from S whose joint distribution equals that obtained by first drawing a distribution on S from a Dirichlet process, and then drawing i.i.d. samples from that distribution.

Suppose that J samples, \left\{\theta_j\right\}_{j=1}^{J}, have already been obtained. According to the Chinese Restaurant Process, the \left(J+1\right)^{\mathrm{th}} sample is drawn from

\theta_{J+1} \sim \frac{1}{M\left(S\right)+J}\left(M + \sum_{j=1}^{J}\delta_{\theta_j}\right)

where \delta_{\theta} is the atomic (point-mass) distribution centred on θ. Two properties are immediately clear from this interpretation:

  1. Even if S is an uncountable set, there is a positive probability that two samples will take exactly the same value. Draws from a Dirichlet process are therefore discrete distributions, with probability one.
  2. The Dirichlet process exhibits a self-reinforcing property; the more often a given value has been sampled in the past, the more likely it is to be sampled again.

The name 'Chinese Restaurant Process' is derived from the following analogy: imagine an infinitely large restaurant containing an infinite number of tables, and able to serve an infinite number of dishes. The restaurant in question operates a somewhat unusual seating policy whereby new diners are seated either at a currently occupied table with probability proportional to the number of guests already seated there, or at an empty table with probability proportional to a constant. Guests who sit at an occupied table must order the same dish as those currently seated, whereas guests allocated a new table are served a dish at random according to the chef's taste. The distribution of dishes after J guests have been served is a sample drawn as described above. The Chinese Restaurant Process is related to the Pólya urn sampling scheme for finite Dirichlet distributions.
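The seating scheme above can be sketched directly as a sampler. The function name and the `base_sample` callback are hypothetical, and the concentration parameter `alpha` plays the role of the total mass M(S):

```python
import random

def chinese_restaurant_process(n, alpha, base_sample, rng=None):
    """Draw n samples from a Dirichlet process via the Chinese
    Restaurant Process. `base_sample` draws one value from the
    normalised base measure; `alpha` is the total mass M(S)."""
    rng = rng or random.Random()
    samples = []
    for j in range(n):
        # New "table" with probability alpha / (alpha + j); otherwise
        # reuse one of the j existing samples chosen uniformly, which
        # weights each past value by its multiplicity.
        if rng.random() < alpha / (alpha + j):
            samples.append(base_sample(rng))
        else:
            samples.append(rng.choice(samples))
    return samples

samples = chinese_restaurant_process(
    100, 1.0, lambda r: r.gauss(0.0, 1.0), random.Random(0))
```

Even with a continuous (Gaussian) base measure, the 100 samples contain repeated values, illustrating the discreteness and self-reinforcement properties above.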

Stick-breaking Construction

A third approach to the Dirichlet process is provided by the so-called stick-breaking construction. Let \left\{\beta'_k\right\}_{k=1}^{\infty} be a sequence of independent random variables with

\beta'_k \sim \mathrm{Beta}\left(1,\alpha\right)

where α = M(S) is the total mass of M, so that M = αM_norm with M_norm a probability measure. Define \left\{\beta_k\right\}_{k=1}^{\infty} according to

\beta_k = \prod_{i=1}^{k-1}\left(1-\beta'_i\right)\cdot\beta'_k

and let \left\{\theta_k\right\}_{k=1}^{\infty} be i.i.d. samples from M_norm. The distribution \sum_{k=1}^{\infty}\beta_k\cdot\delta_{\theta_k} is then a sample from the corresponding Dirichlet process. This method provides an explicit construction of the non-parametric sample, and makes clear that samples from a Dirichlet process are discrete.

The name 'stick-breaking' comes from interpreting the β_k as the lengths of the pieces of a unit-length stick: the remaining stick is repeatedly broken off in fractions drawn from a beta distribution.
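The construction can be sketched as follows; since the infinite sum cannot be computed exactly, the sketch truncates once the remaining stick length falls below a tolerance, which is an approximation. The function name and `base_sample` callback are illustrative:

```python
import random

def stick_breaking(alpha, base_sample, tol=1e-8, rng=None):
    """Truncated stick-breaking draw from a Dirichlet process with
    concentration alpha: returns (weights, atoms). Truncation at
    remaining mass < tol approximates the infinite sum."""
    rng = rng or random.Random()
    weights, atoms = [], []
    remaining = 1.0                       # prod_{i<k} (1 - beta'_i)
    while remaining > tol:
        b = rng.betavariate(1.0, alpha)   # beta'_k ~ Beta(1, alpha)
        weights.append(remaining * b)     # beta_k
        atoms.append(base_sample(rng))    # theta_k ~ M_norm
        remaining *= (1.0 - b)
    return weights, atoms

weights, atoms = stick_breaking(2.0, lambda r: r.random(),
                                rng=random.Random(0))
```

The returned `weights` sum to 1 up to the truncation tolerance, and each atom θ_k carries mass β_k, matching the formula above.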

Applications of the Dirichlet Process

As draws from a Dirichlet process are discrete, an important use is as a prior distribution in infinite mixture models. Here S is the set of parameters of the component distributions. The generative process is as follows: a random distribution is drawn from the Dirichlet process, and then for each data point a parameter value is drawn from that distribution and determines the component distribution for that data point. Because there is no limit to the number of distinct components that may be generated, this kind of model is well suited to cases where the number of mixture components is not well defined in advance; the infinite Gaussian mixture model [2] is a standard example.
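The generative process can be sketched using the Chinese Restaurant Process scheme described earlier, here with Gaussian components. The function name, the standard-normal base measure, and the component standard deviation `sigma` are all illustrative assumptions:

```python
import random

def dp_gaussian_mixture(n, alpha, sigma=0.3, rng=None):
    """Generative sketch of an infinite Gaussian mixture: component
    means are drawn from a Dirichlet process via the Chinese
    Restaurant Process; each data point is then drawn from a Gaussian
    centred on its assigned mean."""
    rng = rng or random.Random()
    means, data = [], []
    for j in range(n):
        if rng.random() < alpha / (alpha + j):
            means.append(rng.gauss(0.0, 1.0))  # new component from base
        else:
            means.append(rng.choice(means))    # reuse existing component
        data.append(rng.gauss(means[-1], sigma))
    return means, data

means, data = dp_gaussian_mixture(200, 1.0, rng=random.Random(1))
```

The number of distinct component means in `means` is random and grows only logarithmically with the number of data points, which is what makes the model usable when the number of components is unknown in advance.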

The infinite nature of these models also lends them to Natural Language Processing applications, where it is often desirable to treat the vocabulary as an infinite, discrete set.

Related Distributions

  • The Pitman–Yor process (also known as the 'two-parameter Poisson–Dirichlet process') is a generalisation of the Dirichlet process.
  • The hierarchical Dirichlet process extends the ordinary Dirichlet process for modelling grouped data.

References

  1. ^ Ferguson, Thomas (1973). "Bayesian analysis of some nonparametric problems". Annals of Statistics 1: 209–230. doi:10.1214/aos/1176342360.
  2. ^ Rasmussen, Carl (2000). "The Infinite Gaussian Mixture Model". Advances in Neural Information Processing Systems (NIPS) 12. 
