Entropy

Date: March 13, 2019

### Defined on the **event** of **one** random variable

* Information - The "surprise" when an event is observed
$$I(x) = -\log p(x)$$

### Defined on **one** random variable

* Entropy - The uncertainty of a random variable (or the mean information of the random variable)
$$H(X)=-\sum_x p(x)\log p(x) = E_x[I(x)]$$

### Defined on **two** distributions of **one** random variable

* Cross Entropy
$$H(p,q)=-\sum_x p(x)\log q(x)$$
* Relative Entropy (KL Divergence)
$$KL(p||q) = \sum_x p(x) \log \frac{p(x)}{q(x)}=H(p,q)-H(p)$$
* Jensen-Shannon Divergence, with $m=\frac{1}{2}(p+q)$
$$JS(p,q)=\frac{1}{2}[KL(p||m)+KL(q||m)]$$

### Defined on the **event** of **two** random variables

* PMI (Pointwise Mutual Information)
$$PMI(x;y)=\log \frac{p(x,y)}{p(x)p(y)}=I(x)+I(y)-I(x,y)$$

### Defined on **two** random variables

* Mutual Information - The expectation of PMI over $p(x,y)$
$$I(X;Y) = H(X) - H(X|Y)$$
* Conditional Entropy
$$H(Y|X)=E_x[H(Y|X=x)]$$
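To make the single-variable and two-distribution quantities concrete, here is a minimal NumPy sketch that computes entropy, cross entropy, KL divergence, and JS divergence for discrete distributions. The helper names (`entropy`, `cross_entropy`, `kl`, `js`) and the example vectors `p` and `q` are illustrative choices, not part of the original post, and natural log is used, so all values are in nats.

```python
import numpy as np

def entropy(p):
    """H(p) = -sum_x p(x) log p(x), in nats (0 log 0 treated as 0)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def cross_entropy(p, q):
    """H(p, q) = -sum_x p(x) log q(x)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return -np.sum(p[mask] * np.log(q[mask]))

def kl(p, q):
    """KL(p || q) = H(p, q) - H(p); nonnegative and not symmetric."""
    return cross_entropy(p, q) - entropy(p)

def js(p, q):
    """JS(p, q) = 0.5 * [KL(p || m) + KL(q || m)] with m = (p + q) / 2."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = (p + q) / 2
    return 0.5 * (kl(p, m) + kl(q, m))

# Two made-up distributions over the same 3 outcomes.
p = [0.5, 0.25, 0.25]
q = [0.4, 0.4, 0.2]
print("H(p)     =", entropy(p))
print("H(p, q)  =", cross_entropy(p, q))   # always >= H(p)
print("KL(p||q) =", kl(p, q))              # = H(p, q) - H(p)
print("JS(p, q) =", js(p, q))              # symmetric, in [0, log 2]
```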
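In the same spirit, a minimal sketch for the two-variable quantities, computed from a small made-up joint table `p_xy` (rows index X, columns index Y). It also checks numerically that mutual information equals the $p(x,y)$-weighted average of PMI.

```python
import numpy as np

# Made-up joint distribution p(x, y); entries sum to 1.
p_xy = np.array([[0.30, 0.10],
                 [0.15, 0.45]])
p_x = p_xy.sum(axis=1)   # marginal p(x)
p_y = p_xy.sum(axis=0)   # marginal p(y)

def H(p):
    """Entropy of a discrete distribution, in nats."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# PMI(x; y) = log p(x, y) / (p(x) p(y)): one number per event pair (x, y).
pmi = np.log(p_xy / np.outer(p_x, p_y))

# Conditional entropies as expectations over the conditioning variable:
# H(Y | X) = E_x[ H(Y | X = x) ] and H(X | Y) = E_y[ H(X | Y = y) ].
H_Y_given_X = sum(p_x[i] * H(p_xy[i, :] / p_x[i]) for i in range(len(p_x)))
H_X_given_Y = sum(p_y[j] * H(p_xy[:, j] / p_y[j]) for j in range(len(p_y)))

# Mutual information I(X; Y) = H(X) - H(X | Y).
I_XY = H(p_x) - H_X_given_Y

print("PMI table:\n", pmi)
print("H(Y|X) =", H_Y_given_X)
print("I(X;Y) =", I_XY)
# I(X; Y) is also the p(x, y)-weighted average of PMI(x; y):
print("E[PMI] =", np.sum(p_xy * pmi))   # matches I_XY
```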