Phrase
Markov chains
Definition
Markov chain
['mɑ:kɔ:f]
[Statistics] Markov chain (a sequence of random events in which each event is determined by the event immediately preceding it) [also Markoff chain]
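The definition above — a sequence of random events in which each event depends only on the one immediately before it — can be sketched in a few lines of Python. This is a minimal illustration, not from the dictionary itself; the weather states and transition probabilities are hypothetical examples.

```python
import random

# Hypothetical two-state chain: each row gives the probabilities of the
# NEXT state given only the CURRENT state (the Markov property).
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(current, rng):
    """Sample the next state using only the current state's transition row."""
    r = rng.random()
    cumulative = 0.0
    for state, p in TRANSITIONS[current].items():
        cumulative += p
        if r < cumulative:
            return state
    return state  # guard against floating-point rounding

def simulate(start, steps, seed=0):
    """Generate a chain of `steps` transitions starting from `start`."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(steps):
        chain.append(next_state(chain[-1], rng))
    return chain
```

Each call to `next_state` looks only at the current state's row, never at earlier history, which is exactly the property the definition describes.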
This English phrase dictionary contains 525,608 English-Chinese bilingual phrase entries, covering essentially all common phrases and their translations and usage — a useful tool for English learners.
Copyright © 2004-2022 Fjtmw.com All Rights Reserved