Feynman's Technique for Gamma-like Integrals
In mathematical statistics and probability theory, integrals that resemble the gamma function arise from time to time, if not frequently:
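$$\Gamma(z) = \int_0^\infty t^{\,z-1} e^{-t}\, dt, \qquad \operatorname{Re}(z) > 0.$$

To sketch the idea with a stock example (not necessarily the one worked in the post), Feynman's technique of differentiating under the integral sign introduces a parameter $s$ into such an integral and differentiates with respect to it:

$$\int_0^\infty t^{\,z-1} e^{-st}\, dt = \frac{\Gamma(z)}{s^z} \quad\Longrightarrow\quad \frac{\partial}{\partial s} \int_0^\infty t^{\,z-1} e^{-st}\, dt = -\int_0^\infty t^{\,z} e^{-st}\, dt = -\frac{z\,\Gamma(z)}{s^{z+1}},$$

so $\int_0^\infty t^{\,z} e^{-st}\, dt = \Gamma(z+1)/s^{z+1}$ falls out without a fresh integration.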
» continue reading
Negative Version of Hypergeometric Distribution
It is a well-known fact that the number $X$ of i.i.d. Bernoulli trials with success probability $p$ needed to obtain the $r$-th success (for fixed $r$) follows the negative binomial distribution, with probability mass function

$$P(X = x) = \binom{x-1}{r-1}\, p^r (1-p)^{x-r}, \qquad x = r, r+1, \dots$$
» continue reading
Understanding the Restricted Boltzmann Machine
A Boltzmann machine is an unsupervised generative model that learns the probability distribution of a random variable using the Boltzmann distribution. Although it was proposed in 1985, training it was practically infeasible for inputs of nontrivial size. Only in 2002, when the restricted version overcame this intractability, did it become a widely used algorithm.
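For orientation (this is the standard formulation; the post's notation may differ), a Boltzmann machine assigns probabilities through an energy function via the Boltzmann distribution,

$$p(x) = \frac{e^{-E(x)}}{Z}, \qquad Z = \sum_{x'} e^{-E(x')},$$

and the restricted variant constrains the energy to a bipartite form between visible units $v$ and hidden units $h$, with no visible-visible or hidden-hidden connections:

$$E(v, h) = -a^\top v - b^\top h - v^\top W h.$$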
» continue reading
Mutual Information and Channel Capacity
From the definition of entropy, Shannon further defined the information that the transmitted message $X$ and the recovered message $Y$ have in common. Using the founding definitions from this and the previous article, we can develop a somewhat more sophisticated theory.
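In standard notation (the post may use different symbols), this shared information is the mutual information, and the channel capacity is its maximum over input distributions:

$$I(X;Y) = H(X) - H(X \mid Y) = \sum_{x,\,y} p(x,y) \log \frac{p(x,y)}{p(x)\,p(y)}, \qquad C = \max_{p(x)} I(X;Y).$$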
» continue reading
The Shannon Entropy
Information theory, developed by Shannon, is used not only in the mathematical modeling of information transmission but is also actively applied to ecology, machine learning, and related fields. Here I would like to briefly review the founding concept of information theory: the Shannon entropy.
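For concreteness, using the standard definition: the Shannon entropy of a discrete random variable $X$ with probability mass function $p$ is

$$H(X) = -\sum_x p(x) \log p(x),$$

measured in bits when the logarithm is taken base 2.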
» continue reading