This is a personal study note.
Copyright and original reference:
https://www.youtube.com/watch?v=gchgZ2zp8uo&list=PLbhbGI_ppZISMV4tAWHlytBqNq1-lb8bz&index=61
================================================================================
3 research questions in HMM
- Evaluation question
- Decoding question
  - Almost supervised learning: the parameters $$$\pi,a,b$$$ are assumed known
  - Forward probability
  - Backward probability
  - Viterbi decoding algorithm
- Learning question
  - Unsupervised learning for HMM: Baum-Welch algorithm (= EM algorithm)
  - Only the observation sequence X is given
  - Parameters $$$\pi,a,b$$$ are not given
================================================================================
text = crawl_twitter_text(twitter)
text plays the role of the observation sequence X
================================================================================
$$$\pi,a,b$$$
- Parameters in HMM: initial state distribution $$$\pi$$$, transition probabilities $$$a$$$, emission probabilities $$$b$$$
Forward algorithm for evaluation(params=[pi,a,b])
Viterbi algorithm for decoding(params=[pi,a,b])
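As a concrete sketch of the evaluation question, here is a minimal forward algorithm in numpy. The toy parameter values and the function name are my own illustrative assumptions, not from the lecture:

```python
import numpy as np

# Toy HMM: 2 hidden states, 3 observation symbols (illustrative values).
pi = np.array([0.6, 0.4])               # initial state distribution
a = np.array([[0.7, 0.3],
              [0.4, 0.6]])              # a[i, j] = P(z_{t+1}=j | z_t=i)
b = np.array([[0.5, 0.4, 0.1],
              [0.1, 0.3, 0.6]])         # b[i, k] = P(x_t=k | z_t=i)

def forward(X, pi, a, b):
    """Forward algorithm: P(X) = sum over states of alpha at the last step."""
    alpha = pi * b[:, X[0]]             # alpha_1(i) = pi_i * b_i(x_1)
    for x in X[1:]:
        # alpha_{t+1}(j) = (sum_i alpha_t(i) * a_ij) * b_j(x_{t+1})
        alpha = (alpha @ a) * b[:, x]
    return alpha.sum()

X = [0, 1, 2]
p_X = forward(X, pi, a, b)              # probability of observing X under the toy model
```

The recursion sums over all state paths in O(T·N²) time instead of enumerating all N^T paths explicitly.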
================================================================================
pi,a,b = estimate_parameters(info1=latent state sequence Z, info2=observation sequence X)
- If both Z and X are known, the parameters can be estimated directly (supervised case)
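When both Z and X are known, parameter estimation reduces to counting. A minimal sketch; the function name and toy sequences are my own assumptions:

```python
import numpy as np

def estimate_parameters(Z, X, n_states, n_symbols):
    """Estimate pi, a, b by counting when the latent state sequence Z is known."""
    pi = np.zeros(n_states)
    a = np.zeros((n_states, n_states))
    b = np.zeros((n_states, n_symbols))
    pi[Z[0]] = 1.0                        # single sequence: all initial mass on Z[0]
    for t in range(len(Z) - 1):           # transition counts
        a[Z[t], Z[t + 1]] += 1
    for z, x in zip(Z, X):                # emission counts
        b[z, x] += 1
    # Normalize each row into a probability distribution
    # (assumes every state appears at least once, so no row sums to zero).
    a /= a.sum(axis=1, keepdims=True)
    b /= b.sum(axis=1, keepdims=True)
    return pi, a, b

pi, a, b = estimate_parameters(Z=[0, 0, 1, 1], X=[0, 1, 2, 2], n_states=2, n_symbols=3)
```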
================================================================================
Obtaining the latent state information Z is hard in practice.
================================================================================
Real world case
You have only $$$X$$$
You must infer $$$Z, \pi, a, b$$$
================================================================================
hat ($$$\hat{\cdot}$$$): estimated value
$$$\hat{\pi},\hat{a},\hat{b}$$$ = estimate_parameters(information=X)
optimally estimated Z = find the most probable Z by Viterbi decoding(information=[X, $$$\hat{\pi},\hat{a},\hat{b}$$$])
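The Viterbi decoding step can be sketched as follows. The toy parameters below stand in for the estimates $$$\hat{\pi},\hat{a},\hat{b}$$$ and are my own illustrative values:

```python
import numpy as np

pi = np.array([0.6, 0.4])               # stand-ins for the estimated parameters
a = np.array([[0.7, 0.3],
              [0.4, 0.6]])
b = np.array([[0.5, 0.4, 0.1],
              [0.1, 0.3, 0.6]])

def viterbi(X, pi, a, b):
    """Return the most probable state sequence Z for observations X."""
    delta = np.log(pi) + np.log(b[:, X[0]])    # log-space avoids underflow
    backptr = []
    for x in X[1:]:
        scores = delta[:, None] + np.log(a)    # scores[i, j]: best path ending in i, then i -> j
        backptr.append(scores.argmax(axis=0))  # remember the best predecessor of each state
        delta = scores.max(axis=0) + np.log(b[:, x])
    z = [int(delta.argmax())]
    for back in reversed(backptr):             # trace the best path backwards
        z.append(int(back[z[-1]]))
    return z[::-1]

print(viterbi([0, 0, 2, 2], pi, a, b))         # → [0, 0, 1, 1]
```

With these toy parameters, state 0 favors symbol 0 and state 1 favors symbol 2, so the decoded path switches states mid-sequence.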
================================================================================
Alternately fix one of (Z) and ($$$\pi$$$,a,b) and optimize the other:
Step 1:
    fix(Z)
    optimized_params = optimize_estimation($$$\bar{\pi},\bar{a},\bar{b}$$$)
Step 2:
    fix(optimized_params)
    optimized_Z = optimize_estimation(Z)
while not converged:
    Step1()
    Step2()
================================================================================
randomly_initialize(parameters=[$$$\pi$$$,a,b])
Expectation_step:
    # with the current $$$\pi$$$,a,b fixed, estimate the latent states
    prob = get_probability_value($$$P(Z|X,\theta=(\pi,a,b))$$$)
    Z = prob
Maximization_step:
    # with the estimated Z fixed, re-estimate the parameters
    optimized_params = optimize($$$\pi$$$,a,b)
while not converged:
    Expectation_step()
    Maximization_step()
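A minimal Baum-Welch sketch combining the E and M steps for a single observation sequence. The function name, toy data, and iteration count are my own assumptions:

```python
import numpy as np

def baum_welch(X, n_states, n_symbols, n_iter=50, seed=0):
    """Baum-Welch (EM for HMM): only X is given; pi, a, b are learned."""
    rng = np.random.default_rng(seed)
    pi = rng.dirichlet(np.ones(n_states))                  # random initialization
    a = rng.dirichlet(np.ones(n_states), size=n_states)
    b = rng.dirichlet(np.ones(n_symbols), size=n_states)
    T = len(X)
    for _ in range(n_iter):
        # E-step: forward (alpha) and backward (beta) probabilities.
        alpha = np.zeros((T, n_states))
        beta = np.zeros((T, n_states))
        alpha[0] = pi * b[:, X[0]]
        for t in range(1, T):
            alpha[t] = (alpha[t - 1] @ a) * b[:, X[t]]
        beta[-1] = 1.0
        for t in range(T - 2, -1, -1):
            beta[t] = a @ (b[:, X[t + 1]] * beta[t + 1])
        likelihood = alpha[-1].sum()
        gamma = alpha * beta / likelihood                  # gamma[t, i] = P(z_t=i | X)
        xi = np.zeros((n_states, n_states))                # expected transition counts
        for t in range(T - 1):
            # xi_t(i, j) = alpha_t(i) a_ij b_j(x_{t+1}) beta_{t+1}(j) / P(X)
            xi += np.outer(alpha[t], b[:, X[t + 1]] * beta[t + 1]) * a / likelihood
        # M-step: re-estimate the parameters from the expected counts.
        pi = gamma[0]
        a = xi / xi.sum(axis=1, keepdims=True)
        b = np.zeros_like(b)
        for t in range(T):
            b[:, X[t]] += gamma[t]
        b /= b.sum(axis=1, keepdims=True)
    return pi, a, b

X = [0, 1, 0, 1, 2, 2, 0, 1]
pi, a, b = baum_welch(X, n_states=2, n_symbols=3)
```

For longer sequences a production implementation would work with scaled alphas or log probabilities to avoid underflow; this sketch omits that for clarity.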
================================================================================
EM algorithm for HMM
================================================================================