The Variational Inference EM Algorithm

Goal: compute the posterior distribution p(z|x)

Approach: construct q(z) to approximate p(z|x)

The KL divergence measures the discrepancy between two distributions (it is not a true distance, since it is asymmetric); it equals 0 if and only if the two distributions are identical.

KL(q(z)||p(z|x))\geq 0

\begin{align*}&KL(q(z)||p(z|x))\\&=\int q(z) \log\frac{q(z)}{p(z|x)} dz\\&=\int q(z) \log q(z)dz-\int q(z)\log p(z|x) dz\\&=\int q(z) \log q(z)dz-\int q(z)\log \frac{p(z,x)}{p(x)} dz\\&=\int q(z) \log q(z)dz-\int q(z)\log p(z,x) dz + \int q(z)\log p(x)dz\\&=\int q(z) \log q(z)dz-\int q(z)\log p(z,x) dz + \log p(x)\end{align*}

\int q(z) \log q(z)dz-\int q(z)\log p(z,x) dz + \log p(x)\geq 0

\log p(x)\geq \int q(z)\log p(z,x) dz - \int q(z) \log q(z)dz = ELBO

ELBO: Evidence Lower BOund
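As a quick numerical illustration of the identity log p(x) = ELBO + KL (this example is my own addition, with made-up numbers), the Python sketch below uses a hypothetical latent variable z with 3 states and an arbitrary q(z), and confirms that the ELBO never exceeds log p(x):

import numpy as np

p_joint = np.array([0.30, 0.15, 0.05])   # hypothetical p(z, x=x0) for the 3 values of z
p_x = p_joint.sum()                       # p(x=x0), the evidence
p_post = p_joint / p_x                    # exact posterior p(z | x=x0)
q = np.array([0.5, 0.3, 0.2])             # an arbitrary variational distribution q(z)

elbo = np.sum(q * np.log(p_joint)) - np.sum(q * np.log(q))   # E_q[log p(z,x)] - E_q[log q(z)]
kl = np.sum(q * np.log(q / p_post))                          # KL(q(z) || p(z|x))

print(np.log(p_x), elbo + kl)   # equal up to floating point: log p(x) = ELBO + KL
print(elbo <= np.log(p_x))      # True: the ELBO is a lower bound on the evidence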

Goal: maximize the ELBO

Suppose the model has latent variables θ, z, φ and model parameters α, β. To estimate the model parameters together with the distributions of the latent variables, the EM algorithm first computes, in the E-step, the expectation of the complete-data log-likelihood with respect to the conditional (posterior) distribution of the latent variables θ, z, φ, and then, in the M-step, maximizes this expectation to obtain updated model parameters α, β.

The problem is that in the E-step of EM, θ, z, φ are coupled, so their joint conditional distribution, and hence the required expectation, is intractable. This is where "variational inference" comes in: when the latent variables are coupled, we make a variational (mean-field) assumption, namely that every latent variable follows its own independent distribution, which removes the coupling between them. The variational distribution built from these independent factors is then used to approximate the true conditional distribution of the latent variables, and the EM algorithm can go through.

q(\theta,z,\phi)=q(\theta,z,\phi|\gamma ,\mu ,\lambda )=q(\theta|\gamma)\,q(z|\mu)\,q(\phi|\lambda)
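The sketch below (my own illustration; the shapes and numbers are made up, but the parameter names γ, μ, λ match those used later) spells this factorization out: q(θ|γ) and each q(φ_k|λ_k) are Dirichlet distributions, each q(z_n|μ_n) is categorical, and log q(θ,z,φ) is simply the sum of the factor log-densities.

import numpy as np
from scipy.stats import dirichlet

K, V, N = 3, 10, 5                             # topics, vocabulary size, words in the document
gamma = 1.0 + np.random.rand(K)                # variational Dirichlet parameter for theta
lam = 1.0 + np.random.rand(K, V)               # variational Dirichlet parameters for each phi_k
mu = np.random.dirichlet(np.ones(K), size=N)   # mu[n, k] = q(z_n = k)

def log_q(theta, z, phi):
    # log q(theta|gamma) + sum_n log q(z_n|mu_n) + sum_k log q(phi_k|lambda_k)
    out = dirichlet.logpdf(theta, gamma)
    out += sum(np.log(mu[n, z[n]]) for n in range(N))
    out += sum(dirichlet.logpdf(phi[k], lam[k]) for k in range(K))
    return out

theta = np.random.dirichlet(gamma)                              # a point at which to evaluate q
phi = np.array([np.random.dirichlet(lam[k]) for k in range(K)])
z = np.random.randint(K, size=N)
print(log_q(theta, z, phi))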

ELBO=E_{q(z)}[\log p(z,x)]-E_{q(z)}[\log q(z)]

ELBO=E_{q(\theta,z,\phi)}[\log p(\theta,z,\phi,w)]-E_{q(\theta,z,\phi)}[\log q(\theta,z,\phi)]

\begin{align*} & KL(q(\theta,z,\phi)||p(\theta,z,\phi|w,\alpha,\beta))\\&=\int q(\theta,z,\phi) \log \frac {q(\theta,z,\phi)}{p(\theta,z,\phi|w)} d\theta dz d\phi\\&=\int q(\theta,z,\phi) \log q(\theta,z,\phi)d\theta dz d\phi-\int q(\theta,z,\phi) \log p(\theta,z,\phi|w) d\theta dz d\phi\\&=\int q(\theta,z,\phi) \log q(\theta,z,\phi)d\theta dz d\phi-\int q(\theta,z,\phi) \log \frac{p(\theta,z,\phi,w)}{p(w)}d\theta dz d\phi\\&=\int q(\theta,z,\phi) \log q(\theta,z,\phi)d\theta dz d\phi-\int q(\theta,z,\phi) \log p(\theta,z,\phi,w)d\theta dz d\phi+\int q(\theta,z,\phi) \log p(w)d\theta dz d\phi\\&=-(E_{q(\theta,z,\phi)}[\log p(\theta,z,\phi,w)]-E_{q(\theta,z,\phi)} [\log q(\theta,z,\phi)]) + \log p(w)\\&=-ELBO + \log p(w)\end{align*}

Therefore, since log p(w) is a constant with respect to q, maximizing the ELBO is exactly what minimizes the KL divergence.

\begin{align*} & ELBO\\&=E_{q(\theta,z,\phi)}[\log p(\theta|\alpha)]+E_{q(\theta,z,\phi)}[\log p(z|\theta)]+E_{q(\theta,z,\phi)}[\log p(\phi|\beta)]+E_{q(\theta,z,\phi)}[\log p(w|z,\phi)]\\&-E_{q(\theta,z,\phi)}[\log q(\theta|\gamma)]-E_{q(\theta,z,\phi)}[\log q(z|\mu)]-E_{q(\theta,z,\phi)}[\log q(\phi|\lambda )]\end{align*}

\begin{align*}&E_{q(\theta,z,\phi)}[\log p(\theta|\alpha)]\\&=\int q(\theta,z,\phi) \log p(\theta|\alpha) d\theta dz d\phi\\&=\int q(\theta,z,\phi) \log \frac {\Gamma(\sum_{i=1}^K \alpha_i)}{\prod_{i=1}^K \Gamma(\alpha_i)} \prod_{i=1}^K\theta_i^{\alpha_i-1}d\theta dz d\phi\\&=\int q(\theta,z,\phi) \log \Gamma(\sum_{i=1}^K \alpha_i) d\theta dz d\phi\\&-\int q(\theta,z,\phi) \log \prod_{i=1}^K \Gamma(\alpha_i) d\theta dz d\phi\\&+\int q(\theta,z,\phi) \log  \prod_{i=1}^K\theta_i^{\alpha_i-1}d\theta dz d\phi\\&= \log \Gamma(\sum_{i=1}^K \alpha_i) -\log \prod_{i=1}^K \Gamma(\alpha_i)+\int q(\theta) \log  \prod_{i=1}^K\theta_i^{\alpha_i-1}d\theta\\&=\log \Gamma(\sum_{i=1}^K \alpha_i) -\sum_{i=1}^K \log \Gamma(\alpha_i) + E_{q(\theta)}[\sum_{i=1}^K \log \theta_i^{\alpha_i-1}]\\&=\log \Gamma(\sum_{i=1}^K \alpha_i) -\sum_{i=1}^K \log \Gamma(\alpha_i) + \sum_{i=1}^K (\alpha_i-1)E_{q(\theta)}[\log \theta_i]\end{align*}

The exponential family: p(x|\theta)=h(x)exp(\eta (\theta) \cdot T(x)-A(\theta))

A(\theta) is the log-normalizer (log partition function); its role is to make sure the distribution integrates (or sums) to 1.

0=\log 1=\log \int p(x|\theta) dx=\log \int h(x)exp(\eta (\theta) \cdot T(x)-A(\theta)) dx=\log \int h(x)exp(\eta (\theta) \cdot T(x)) dx - A(\theta)

A(\theta)=\log \int h(x)exp(\eta (\theta) \cdot T(x)) dx

\begin{align*}&\frac{d}{d \eta(\theta)} A(\theta)\\&=\frac{\int h(x) exp(\eta(\theta) \cdot T(x)) T(x) dx}{\int h(x)exp(\eta (\theta) \cdot T(x)) dx } \\&=\frac{\int h(x) exp(\eta(\theta) \cdot T(x)) T(x) dx}{exp(A(\theta))} \\&=\int h(x) exp(\eta(\theta) \cdot T(x)-A(\theta)) T(x) dx\\&=\int p(x|\theta) T(x) dx\\ &= E_{p(x|\theta)} [T(x)] \end{align*}

\begin{align*}&\frac{d}{d \eta(\theta_i)} A(\theta)\\&=\frac{\int h(x) exp(\eta(\theta) \cdot T(x)) T(x_i) dx}{\int h(x)exp(\eta (\theta) \cdot T(x)) dx } \\&=\frac{\int h(x) exp(\eta(\theta) \cdot T(x)) T(x_i) dx}{exp(A(\theta))} \\&=\int h(x) exp(\eta(\theta) \cdot T(x)-A(\theta)) T(x_i) dx\\&=\int p(x|\theta) T(x_i) dx\\ &= E_{p(x|\theta)} [T(x_i)] \end{align*}
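The sketch below (an added check, not part of the original derivation) verifies dA/dη = E[T(x)] numerically for one concrete exponential-family member, the Poisson distribution, for which η = log θ, T(x) = x, h(x) = 1/x! and A(η) = e^η:

import numpy as np

theta = 3.5
eta = np.log(theta)
A = lambda e: np.exp(e)          # Poisson log-normalizer as a function of the natural parameter

eps = 1e-6
dA_deta = (A(eta + eps) - A(eta - eps)) / (2 * eps)   # numerical derivative of A at eta

samples = np.random.poisson(theta, size=1_000_000)
print(dA_deta, samples.mean())   # both are approximately E[T(x)] = theta = 3.5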

The Dirichlet distribution is also a member of the exponential family.

Proof:

\begin{align*}&p(\theta|\alpha)=Dir(\theta|\alpha)
\\&=\frac{\Gamma(\sum_{i=1}^K\alpha_i)}{\prod_{i=1}^K \Gamma(\alpha_i)} \prod_{i=1}^K \theta_i^{\alpha_i-1}
\\&=\exp(\log(\frac{\Gamma(\sum_{i=1}^K\alpha_i)}{\prod_{i=1}^K \Gamma(\alpha_i)} \prod_{i=1}^K \theta_i^{\alpha_i-1}))\\&=\exp(\log(\Gamma(\sum_{i=1}^K\alpha_i))-\log(\prod_{i=1}^K \Gamma(\alpha_i))+\log(\prod_{i=1}^K \theta_i^{\alpha_i-1}))\\&=exp(\sum_{i=1}^K (\alpha_i-1) \log \theta_i + (\log(\Gamma(\sum_{i=1}^K\alpha_i))-\log(\prod_{i=1}^K \Gamma(\alpha_i))) )\\&=exp(\sum_{i=1}^K (\alpha_i-1) \log \theta_i - (\sum_{i=1}^K \log \Gamma(\alpha_i)-\log(\Gamma(\sum_{i=1}^K\alpha_i))) )\end{align*}

where

\begin{align*} &h(\theta)=1\\&\eta(\alpha_i)=\alpha_i-1\\&T(\theta_i)=\log \theta_i\\ &\eta(\alpha) \cdot T(\theta)=\sum_{i=1}^K \eta(\alpha_i)T(\theta_i)\\ &A(\alpha)=\sum_{i=1}^K \log \Gamma(\alpha_i)-\log(\Gamma(\sum_{i=1}^K\alpha_i))\end{align*}

This completes the proof.

For the Dirichlet distribution:

\begin{align*}&E_{p(\theta|\alpha)}[\log\theta_i]=E_{p(\theta|\alpha)} [T(\theta_i)] =\frac{d}{d\eta(\alpha_i)}A(\alpha)=\frac{d}{d\alpha_i}A(\alpha)\\&=\frac{d}{d\alpha_i} (\sum_{j=1}^K \log \Gamma(\alpha_j)-\log(\Gamma(\sum_{j=1}^K\alpha_j)))\\&=\frac{d}{d\alpha_i} \log \Gamma(\alpha_i) -\frac{d}{d\alpha_i} \log(\Gamma(\sum_{j=1}^K\alpha_j))\\&=\Psi (\alpha_i) - \Psi(\sum_{j=1}^K\alpha_j)\end{align*}

Here Ψ is the digamma function, i.e., the derivative of the log Γ function.
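A quick Monte Carlo check of this expectation (my own illustration; the α values are arbitrary), using scipy's psi for the digamma function:

import numpy as np
from scipy.special import psi    # the digamma function

alpha = np.array([0.7, 2.0, 3.3])
samples = np.random.dirichlet(alpha, size=500_000)

mc_estimate = np.log(samples).mean(axis=0)     # Monte Carlo estimate of E[log theta_i]
closed_form = psi(alpha) - psi(alpha.sum())    # Psi(alpha_i) - Psi(sum_j alpha_j)
print(mc_estimate)
print(closed_form)                             # agrees with the estimate to ~3 decimals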

\begin{align*}&E_{q(\theta,z,\phi)}[\log p(\theta|\alpha)]
\\&=\log \Gamma(\sum_{i=1}^K \alpha_i) -\sum_{i=1}^K \log \Gamma(\alpha_i) + \sum_{i=1}^K (\alpha_i-1)E_{q(\theta,z,\phi)}[\log \theta_i]
\\&=\log \Gamma(\sum_{i=1}^K \alpha_i) -\sum_{i=1}^K \log \Gamma(\alpha_i) + \sum_{i=1}^K (\alpha_i-1)E_{q(\theta)}[\log \theta_i]
\\&=\log \Gamma(\sum_{i=1}^K \alpha_i) -\sum_{i=1}^K \log \Gamma(\alpha_i) + \sum_{i=1}^K (\alpha_i-1)E_{q(\theta|\gamma)}[\log \theta_i]
\\&=\log \Gamma(\sum_{i=1}^K \alpha_i) -\sum_{i=1}^K \log \Gamma(\alpha_i) + \sum_{i=1}^K (\alpha_i-1) (\Psi (\gamma_i) - \Psi(\sum_{j=1}^K\gamma_j))
\end{align*}
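In code, this term depends only on α and the variational parameter γ. A minimal sketch (the helper name is my own) using scipy's gammaln for log Γ and psi for the digamma function:

import numpy as np
from scipy.special import gammaln, psi

def e_log_p_theta(alpha, gamma):
    # E_q[log p(theta|alpha)] for length-K vectors alpha (prior) and gamma (variational)
    e_log_theta = psi(gamma) - psi(gamma.sum())   # E_{q(theta|gamma)}[log theta_i]
    return gammaln(alpha.sum()) - gammaln(alpha).sum() + ((alpha - 1) * e_log_theta).sum()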

\begin{align*}&E_{q(\theta,z,\phi)}[\log p(z|\theta)]
\\&=E_{q(\theta,z,\phi)}[\log\prod_{i=1}^N \prod_{j=1}^K (\theta_j)^{z_i^j}]
\\&=\sum_{i=1}^N \sum_{j=1}^KE_{q(\theta,z,\phi)}[\log (\theta_j)^{z_i^j}]
\\&=\sum_{i=1}^N \sum_{j=1}^KE_{q(z|\mu)}[z_i^j]E_{q(\theta|\gamma)}[\log (\theta_j)]
\\&=\sum_{i=1}^N \sum_{j=1}^K\mu_{ij}(\Psi (\gamma_j)-\Psi (\sum_{k=1}^K\gamma_k))\end{align*}
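Continuing the same sketch, E_q[log p(z|θ)] needs only μ (an N×K matrix of responsibilities) and γ, reusing psi from above:

def e_log_p_z(mu, gamma):
    # sum_n sum_k mu_{nk} * (Psi(gamma_k) - Psi(sum_j gamma_j))
    e_log_theta = psi(gamma) - psi(gamma.sum())
    return (mu * e_log_theta).sum()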

\begin{align*}&E_{q(\theta,z,\phi)}[\log p(\phi|\beta)]
\\&=\int q(\theta,z,\phi) \log \prod_{k=1}^K (\frac{\Gamma(\sum_{i=1}^V \beta_i)}{\prod_{i=1}^V\Gamma(\beta_i)}\prod_{i=1}^V\phi_{ki}^{\beta_i-1})d\theta dz d\phi
\\&=K\log\Gamma(\sum_{i=1}^V \beta_i)-K\sum_{i=1}^V\log\Gamma(\beta_i) + \sum_{k=1}^KE_{q(\theta,z,\phi)}[\sum_{i=1}^V(\beta_i-1)\log\phi_{ki}]
\\&=K\log\Gamma(\sum_{i=1}^V \beta_i)-K\sum_{i=1}^V\log\Gamma(\beta_i) + \sum_{k=1}^K\sum_{i=1}^V(\beta_i-1)E_{q(\theta,z,\phi)}[\log\phi_{ki}]
\\&=K\log\Gamma(\sum_{i=1}^V \beta_i)-K\sum_{i=1}^V\log\Gamma(\beta_i) + \sum_{k=1}^K\sum_{i=1}^V(\beta_i-1)E_{q(\phi|\lambda)}[\log\phi_{ki}]
\\&=K\log\Gamma(\sum_{i=1}^V \beta_i)-K\sum_{i=1}^V\log\Gamma(\beta_i) + \sum_{k=1}^K\sum_{i=1}^V(\beta_i-1)(\Psi (\lambda_{ki}) - \Psi(\sum_{j=1}^V\lambda_{kj}))
\end{align*}
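The corresponding piece of the sketch for E_q[log p(φ|β)], with β a length-V vector and λ a K×V matrix (same assumed helper style as before):

def e_log_p_phi(beta, lam):
    K = lam.shape[0]
    e_log_phi = psi(lam) - psi(lam.sum(axis=1, keepdims=True))   # E_{q(phi|lambda)}[log phi_{ki}]
    return K * gammaln(beta.sum()) - K * gammaln(beta).sum() + ((beta - 1) * e_log_phi).sum()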

\begin{align*}&E_{q(\theta,z,\phi)}[\log p(w|z,\phi)]
\\&=\sum_{n=1}^N E_{q(\theta,z,\phi)}[\log p(w_n|z_n,\phi)]
\\&=\sum_{n=1}^N\sum_{k=1}^K\sum_{j=1}^V E_{q(\theta,z,\phi)} [\log\phi_{kj}^{z_n^k w_n^j}]
\\&=\sum_{n=1}^N\sum_{k=1}^K\sum_{j=1}^V \mu_{nk} w_n^j E_{q(\phi|\lambda)} [\log\phi_{kj}]
\\&=\sum_{n=1}^N\sum_{k=1}^K\sum_{j=1}^V \mu_{nk} w_n^j(\Psi(\lambda_{kj})-\Psi(\sum_{v=1}^V\lambda_{kv}))
\end{align*}
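And for E_q[log p(w|z,φ)], with the document's words encoded one-hot as an N×V matrix w so that w_n^j selects the observed word (again part of the same illustrative sketch):

def e_log_p_w(w, mu, lam):
    e_log_phi = psi(lam) - psi(lam.sum(axis=1, keepdims=True))   # (K, V)
    # sum_n sum_k sum_j mu_{nk} * w_n^j * E[log phi_{kj}]
    return np.einsum('nk,nj,kj->', mu, w, e_log_phi)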

E_{q(\theta,z,\phi)}[\log q(\theta|\gamma)]=\log \Gamma(\sum_{i=1}^K \gamma_i) -\sum_{i=1}^K \log \Gamma(\gamma_i) + \sum_{i=1}^K (\gamma_i-1) (\Psi (\gamma_i) - \Psi(\sum_{j=1}^K\gamma_j))
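Note that E_q[log q(θ|γ)] has exactly the same form as E_q[log p(θ|α)] with α replaced by γ, so in the sketch the earlier helper can simply be reused:

def e_log_q_theta(gamma):
    return e_log_p_theta(gamma, gamma)   # same closed form with alpha := gamma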

E_{q(\theta,z,\phi)}[\log q(z|\mu)]=\sum_{i=1}^N \sum_{j=1}^K E_{q(\theta,z,\phi)} [\log (\mu_{ij})^{z_i^j}]=\sum_{i=1}^N \sum_{j=1}^K E_{q(\theta,z,\phi)}[z_i^j\log (\mu_{ij})]=\sum_{i=1}^N \sum_{j=1}^K \mu_{ij}\log (\mu_{ij})
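The corresponding one-liner in the sketch, i.e. the negative entropy of the categorical factors q(z|μ):

def e_log_q_z(mu):
    return (mu * np.log(mu)).sum()   # sum_n sum_k mu_{nk} log mu_{nk}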

\begin{align*}&E_{q(\theta,z,\phi)}[\log q(\phi|\lambda )]
\\&=E_{q(\theta,z,\phi)}[\log \prod_{k=1}^K  q(\phi_k|\lambda_k)]
\\&=\sum_{k=1}^KE_{q(\theta,z,\phi)} [\log (\frac{\Gamma(\sum_{i=1}^V \lambda_{ki})}{\prod_{i=1}^V\Gamma(\lambda_{ki})}\prod_{i=1}^V \phi_{ki}^{\lambda_{ki}-1})]
\\&=\sum_{k=1}^K \log\Gamma(\sum_{i=1}^V\lambda_{ki})-\sum_{k=1}^K \sum_{i=1}^V \log \Gamma(\lambda_{ki})+\sum_{k=1}^K \sum_{i=1}^V (\lambda_{ki}-1)E_{q(\phi|\lambda)}[\log \phi_{ki}]
\\&=\sum_{k=1}^K \log\Gamma(\sum_{i=1}^V\lambda_{ki})-\sum_{k=1}^K \sum_{i=1}^V \log \Gamma(\lambda_{ki})+\sum_{k=1}^K \sum_{i=1}^V (\lambda_{ki}-1)(\Psi(\lambda_{ki})-\Psi(\sum_{v=1}^V \lambda_{kv}))
\end{align*}
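Finally, a sketch of E_q[log q(φ|λ)] and the full ELBO assembled from all of the terms derived above (same assumed helpers and parameter shapes as before):

def e_log_q_phi(lam):
    e_log_phi = psi(lam) - psi(lam.sum(axis=1, keepdims=True))
    per_topic = (gammaln(lam.sum(axis=1)) - gammaln(lam).sum(axis=1)
                 + ((lam - 1) * e_log_phi).sum(axis=1))
    return per_topic.sum()

def elbo(w, alpha, beta, gamma, mu, lam):
    return (e_log_p_theta(alpha, gamma) + e_log_p_z(mu, gamma)
            + e_log_p_phi(beta, lam) + e_log_p_w(w, mu, lam)
            - e_log_q_theta(gamma) - e_log_q_z(mu) - e_log_q_phi(lam))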

To be continued!
