Monte Carlo MCMC: Efficient Inference by Approximate Sampling.
EMNLP-CoNLL '12: Proceedings of the 2012 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning (2012)
Abstract
Conditional random fields and other graphical models have achieved state-of-the-art results in a variety of tasks such as coreference, relation extraction, data integration, and parsing. Increasingly, practitioners are using models with more complex structure---higher tree-width, larger fan-out, more features, and more data---rendering even approximate inference methods such as MCMC inefficient. In this paper we propose an alternative MCMC sampling scheme in which transition probabilities are approximated by sampling from the set of relevant factors. We demonstrate that our method converges more quickly than a traditional MCMC sampler for both marginal and MAP inference. In an author coreference task with over 5 million mentions, we achieve a 13-fold speedup over regular MCMC inference.
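The core idea, approximating a Metropolis-Hastings transition by scoring only a random subset of factors, can be sketched as follows. This is an illustrative reconstruction, not the paper's exact estimator: the function names, the uniform subsampling, and the mean-based scaling are assumptions for the sketch.

```python
import math
import random

def mh_step_subsampled(state, propose, factors, sample_frac=0.2, rng=random):
    """One Metropolis-Hastings step in which the log acceptance ratio is
    estimated from a uniform random subset of the factors instead of all
    of them (a sketch of the approximate-transition idea)."""
    proposal = propose(state, rng)
    k = max(1, int(sample_frac * len(factors)))
    subset = rng.sample(factors, k)
    # Unbiased estimate of the total log-score difference: sum over the
    # sampled factors, scaled up to the full factor count.
    est = (len(factors) / k) * sum(f(proposal) - f(state) for f in subset)
    if est >= 0 or rng.random() < math.exp(est):
        return proposal
    return state
```

In a factor graph for coreference, `factors` would be the factors touching the variables changed by the proposal; subsampling them trades a noisier acceptance decision for a much cheaper per-step cost, which is where the reported speedup comes from.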
Keywords
alternative MCMC sampling scheme, regular MCMC inference, traditional MCMC sampler, MAP inference, approximate inference method, author coreference task, data integration, state-of-the-art result, complex structure, conditional random field, Monte Carlo MCMC, approximate sampling, efficient inference