Entropy

Date: March 13, 2019

### Defined on the **event** of **one** random variable

* Information - The "surprise" when an event is observed

$$I(x) = -\log p(x)$$

### Defined on **one** random variable

* Entropy - The uncertainty of a random variable (or the mean information of the random variable)

$$H(X) = -\sum_x p(x)\log p(x) = E_x[I(x)]$$

### Defined on **two** distributions of **one** random variable

* Cross Entropy

$$H(p,q) = -\sum_x p(x)\log q(x)$$

* Relative Entropy (KL Divergence)

$$KL(p\|q) = \sum_x p(x) \log \frac{p(x)}{q(x)} = H(p,q) - H(p)$$

* Jensen-Shannon Divergence, where $m = \frac{1}{2}(p+q)$ is the mixture of the two distributions

$$JS(p,q) = \frac{1}{2}\left[KL(p\|m) + KL(q\|m)\right]$$

### Defined on the **event** of **two** random variables

* PMI (Pointwise Mutual Information)

$$PMI(x;y) = \log \frac{p(x,y)}{p(x)p(y)} = I(x) + I(y) - I(x,y)$$

### Defined on **two** random variables

* Mutual Information

$$I(X;Y) = H(X) - H(X|Y)$$

* Conditional Entropy

$$H(Y|X) = E_x\left[H(Y|X=x)\right]$$
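The identities above are easy to check numerically. Below is a minimal sketch that implements each quantity from its definition and verifies $KL(p\|q) = H(p,q) - H(p)$ and $I(X;Y) = H(X) - H(X|Y)$; the distributions `p`, `q`, and `joint` are hypothetical values chosen only for illustration.

```python
import math

# Hypothetical distributions over the same 3 outcomes (illustrative values only)
p = [0.5, 0.3, 0.2]
q = [0.2, 0.5, 0.3]

def entropy(p):
    """H(X) = -sum_x p(x) log p(x), i.e. the mean information E_x[I(x)]."""
    return -sum(px * math.log(px) for px in p if px > 0)

def cross_entropy(p, q):
    """H(p, q) = -sum_x p(x) log q(x)."""
    return -sum(px * math.log(qx) for px, qx in zip(p, q) if px > 0)

def kl(p, q):
    """KL(p||q) = sum_x p(x) log(p(x)/q(x))."""
    return sum(px * math.log(px / qx) for px, qx in zip(p, q) if px > 0)

def js(p, q):
    """JS(p, q) = (1/2)[KL(p||m) + KL(q||m)] with mixture m = (p + q)/2."""
    m = [(px + qx) / 2 for px, qx in zip(p, q)]
    return 0.5 * (kl(p, m) + kl(q, m))

# KL(p||q) = H(p, q) - H(p)
assert abs(kl(p, q) - (cross_entropy(p, q) - entropy(p))) < 1e-12
# JS is symmetric; KL in general is not
assert abs(js(p, q) - js(q, p)) < 1e-12

# A hypothetical joint distribution p(x, y) over 2x2 outcomes
joint = [[0.4, 0.1],
         [0.2, 0.3]]
px = [sum(row) for row in joint]          # marginal p(x)
py = [sum(col) for col in zip(*joint)]    # marginal p(y)

# Mutual information as the expectation of PMI over the joint:
# I(X;Y) = sum_{x,y} p(x,y) log[ p(x,y) / (p(x)p(y)) ]
mi = sum(joint[i][j] * math.log(joint[i][j] / (px[i] * py[j]))
         for i in range(2) for j in range(2))

# Conditional entropy H(X|Y) = E_y[H(X|Y=y)], with p(x|y) = p(x,y)/p(y)
h_x_given_y = -sum(joint[i][j] * math.log(joint[i][j] / py[j])
                   for i in range(2) for j in range(2))

# I(X;Y) = H(X) - H(X|Y)
assert abs(mi - (entropy(px) - h_x_given_y)) < 1e-12
```

Note that all quantities use the natural log here (units of nats); switching `math.log` to `math.log2` gives bits, and every identity still holds because the base only rescales each term.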