
N-grams


Introduction to probabilistic language modeling

Let's think about the probability of a sentence. By the chain rule of probability, the joint probability decomposes into a product of conditional probabilities:

P(w1, w2, w3, w4)
= P(w1, w2, w3) * P(w4 | w1, w2, w3)
= P(w1, w2) * P(w3 | w1, w2) * P(w4 | w1, w2, w3)
= P(w1) * P(w2 | w1) * P(w3 | w1, w2) * P(w4 | w1, w2, w3)
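
To make the decomposition concrete, here is a minimal Python sketch that multiplies the conditional probabilities together. The sentence and the probability values are made up purely for illustration, not taken from any corpus.

```python
# Chain-rule decomposition: P(w1, w2, w3, w4)
#   = P(w1) * P(w2 | w1) * P(w3 | w1, w2) * P(w4 | w1, w2, w3)
# All probability values below are invented for illustration.

sentence = ["its", "water", "is", "transparent"]

# Hypothetical conditional probabilities, keyed by (word, *history).
conditionals = {
    ("its",): 0.02,                               # P(w1)
    ("water", "its"): 0.10,                       # P(w2 | w1)
    ("is", "its", "water"): 0.30,                 # P(w3 | w1, w2)
    ("transparent", "its", "water", "is"): 0.05,  # P(w4 | w1, w2, w3)
}

prob = 1.0
for i, word in enumerate(sentence):
    history = tuple(sentence[:i])       # all words before position i
    prob *= conditionals[(word, *history)]

print(prob)  # 0.02 * 0.10 * 0.30 * 0.05 = 3.0e-05
```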

 

Markov Assumption

P(w1, w2, ..., wn) ≈ ∏ P(wi | wi-k, ..., wi-1)

That is, the full conditional P(wi | w1, ..., wi-1) is approximated by P(wi | wi-k, ..., wi-1): each word is assumed to depend only on the previous k words.

For example, with k = 1 (a bigram model), each word depends only on the immediately preceding word:

P(w1, w2, w3, w4)
≈ P(w1, w2, w3) * P(w4 | w3)
≈ P(w1, w2) * P(w3 | w2) * P(w4 | w3)
≈ P(w1) * P(w2 | w1) * P(w3 | w2) * P(w4 | w3)

With this assumption each conditional probability depends on only one previous word, so far fewer probabilities have to be estimated; the sketch below gives a rough sense of the saving.
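
This is a back-of-the-envelope count of how many conditional-probability values each model would have to store, assuming a purely illustrative vocabulary size V.

```python
# Back-of-the-envelope count of how many conditional-probability values
# each model would need. V is an assumed, purely illustrative vocabulary size.

V = 10_000  # hypothetical vocabulary size

# Without any assumption, P(w4 | w1, w2, w3) needs a distribution over V
# words for each of the V**3 possible three-word histories: V**4 values.
full_table = V ** 4

# Under the bigram (first-order Markov) assumption, P(w4 | w3) needs a
# distribution over V words for each of the V possible previous words.
bigram_table = V ** 2

print(f"full conditional table:   {full_table:.2e} values")   # 1.00e+16
print(f"bigram conditional table: {bigram_table:.2e} values")  # 1.00e+08
```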

More generally, an N-gram model conditions each word on the previous N-1 words: bigrams, trigrams (3-grams), 4-grams, and so on.
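
As an illustration of how such a model is typically estimated, here is a minimal bigram sketch that derives probabilities from relative counts (maximum likelihood) on a tiny made-up corpus. The corpus, the <s>/</s> boundary markers, and the function names are assumptions for this sketch, and no smoothing is applied, so any unseen bigram gets probability zero.

```python
# A minimal bigram language model under the Markov assumption, estimated by
# maximum likelihood (relative counts) from a tiny made-up corpus.

from collections import Counter

corpus = [
    "<s> i like green eggs </s>",
    "<s> i like ham </s>",
    "<s> sam likes green ham </s>",
]

unigram_counts = Counter()
bigram_counts = Counter()
for line in corpus:
    tokens = line.split()
    unigram_counts.update(tokens[:-1])             # counts of history words
    bigram_counts.update(zip(tokens, tokens[1:]))  # counts of adjacent pairs

def bigram_prob(prev, word):
    """P(word | prev) = count(prev, word) / count(prev)."""
    if unigram_counts[prev] == 0:
        return 0.0
    return bigram_counts[(prev, word)] / unigram_counts[prev]

def sentence_prob(sentence):
    """P(sentence) under the bigram Markov assumption."""
    tokens = ["<s>"] + sentence.split() + ["</s>"]
    prob = 1.0
    for prev, word in zip(tokens, tokens[1:]):
        prob *= bigram_prob(prev, word)
    return prob

print(sentence_prob("i like ham"))        # 2/3 * 1 * 1/2 * 1 = 0.333...
print(sentence_prob("i like green ham"))  # 2/3 * 1 * 1/2 * 1/2 * 1 = 0.166...
```

Real n-gram models add smoothing (for example add-one or Kneser-Ney) precisely because unseen bigrams would otherwise drive the whole sentence probability to zero.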

Reposted from: https://www.cnblogs.com/chuanlong/archive/2013/04/18/3029331.html
