views:

156

answers:

1

What is the difference between the Forward-Backward algorithm on an n-gram model and the Viterbi algorithm on an HMM?

When I reviewed the implementations of these two algorithms, the only difference I found is that the transition probability comes from different probabilistic models.

Is there really a difference between these two algorithms?

+5  A: 

The Forward-Backward algorithm combines the forward pass and the backward pass to obtain the probability of being in each state at a specific time. Doing this for all time steps gives us the sequence of individually most likely states at each time. This is not guaranteed to be a valid sequence, since each state is chosen in isolation, and the transition probability between two consecutively chosen states can be zero, i.e. p(q_i -> q_j) = 0 in the transition model. In other words:

<code language=latex>
    $q_t = \arg\max_{i}\, \gamma_t(i)$, where
    $\gamma_t(i) = \dfrac{\alpha_t(i)\beta_t(i)}{\sum_i{\alpha_t(i)\beta_t(i)}}$
</code>

The Viterbi algorithm, on the other hand, finds the single most likely state sequence given an observation sequence, by maximizing a different optimality criterion:

<code language=latex>
    $Q^* = \arg\max_Q P(Q \mid O, \lambda)$
</code>

I suggest you refer to this well-known paper for a detailed explanation (see Problem #2):

LAWRENCE R. RABINER, A Tutorial on Hidden Markov Models and Selected Applications in Speech Recognition
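To make the distinction concrete, here is a minimal pure-Python sketch of both decoders on a toy two-state HMM. The model parameters and observation sequence below are invented for illustration; they are not from the question:

```python
# Toy 2-state HMM (illustrative numbers only).
pi = [0.6, 0.4]                       # initial state probabilities
A = [[0.7, 0.3], [0.4, 0.6]]          # transition probabilities A[i][j] = p(i -> j)
B = [{'a': 0.9, 'b': 0.1},            # emission probabilities per state
     {'a': 0.2, 'b': 0.8}]

obs = ['a', 'b', 'a']
N, T = 2, len(obs)

# Forward pass: alpha[t][i] = p(o_1..o_t, q_t = i)
alpha = [[0.0] * N for _ in range(T)]
for i in range(N):
    alpha[0][i] = pi[i] * B[i][obs[0]]
for t in range(1, T):
    for j in range(N):
        alpha[t][j] = sum(alpha[t-1][i] * A[i][j] for i in range(N)) * B[j][obs[t]]

# Backward pass: beta[t][i] = p(o_{t+1}..o_T | q_t = i)
beta = [[0.0] * N for _ in range(T)]
for i in range(N):
    beta[T-1][i] = 1.0
for t in range(T-2, -1, -1):
    for i in range(N):
        beta[t][i] = sum(A[i][j] * B[j][obs[t+1]] * beta[t+1][j] for j in range(N))

# Posterior decoding: pick the individually most likely state at each t
# (gamma_t(i) = alpha_t(i) * beta_t(i) / sum_i alpha_t(i) * beta_t(i))
gamma_path = [max(range(N), key=lambda i: alpha[t][i] * beta[t][i])
              for t in range(T)]

# Viterbi: jointly most likely state sequence, with backpointers psi
delta = [pi[i] * B[i][obs[0]] for i in range(N)]
psi = [[0] * N for _ in range(T)]
for t in range(1, T):
    new = [0.0] * N
    for j in range(N):
        best = max(range(N), key=lambda i: delta[i] * A[i][j])
        psi[t][j] = best
        new[j] = delta[best] * A[best][j] * B[j][obs[t]]
    delta = new
viterbi_path = [max(range(N), key=lambda i: delta[i])]
for t in range(T-1, 0, -1):
    viterbi_path.append(psi[t][viterbi_path[-1]])
viterbi_path.reverse()

print('posterior decoding:', gamma_path)
print('viterbi decoding  :', viterbi_path)
```

For this particular toy model both decoders happen to return [0, 1, 0]; they diverge when the per-time posteriors favor states that the transition model connects only weakly or not at all, which is exactly the failure mode of posterior decoding described above.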
Amro