Phrase
Markov chains
Definition
Markov chain
['mɑ:kɔ:f]
[Statistics] Markov chain (a sequence of random events in which each event is determined solely by the event immediately preceding it) [also spelled Markoff chain]
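The defining property above — each event depends only on the one immediately before it — can be sketched in a few lines of Python. The weather states and transition probabilities below are purely illustrative assumptions, not part of the dictionary entry.

```python
import random

# Hypothetical transition probabilities (illustrative only): from each
# state, the next state is drawn using these weights.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state):
    """Pick the next state using ONLY the current state (the Markov property)."""
    states, weights = zip(*TRANSITIONS[state])
    return random.choices(states, weights=weights)[0]

def simulate(start, n):
    """Generate a chain of n states starting from `start`."""
    chain = [start]
    for _ in range(n - 1):
        chain.append(step(chain[-1]))
    return chain

print(simulate("sunny", 5))
```

Note that `step` never looks at earlier history — that memorylessness is exactly what makes the sequence a Markov chain.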
The English Phrases Dictionary contains 525,608 English–Chinese bilingual phrase entries, covering essentially all common phrases with their translations and usage — a useful tool for English learners.
Copyright © 2004-2022 Fjtmw.com All Rights Reserved
Beijing ICP Filing No. 2021023879
Last updated: 2024/12/22 13:30:31