Talk:Multiple sequence alignment: Difference between revisions

Great work, thanks for the reactions. Congrat! :) It's now a [[WP:GA|good article]]. [[User:NCurse|NCurse]] <sub> [[User talk:NCurse|work]]</sub> 14:47, 12 September 2006 (UTC)
: Thanks for the review! [[User:Opabinia regalis|Opabinia regalis]] 01:03, 13 September 2006 (UTC)
==some clarifications==
 
The statement
 
<div style="border:1px solid black;padding:5px;">
Because HMMs are probabilistic, they do not produce the same solution every time they are run on the same dataset; thus they cannot be guaranteed to converge to an optimal alignment. HMMs can produce both global and local alignments. Although HMM-based methods have been developed relatively recently, they offer significant improvements in computational speed, especially for sequences that contain overlapping regions.
</div>
 
is incorrect. HMMs are probabilistic in the sense that they are statistical models; however, they are completely deterministic and will produce the same result every time on a given dataset. HMM alignment uses the same dynamic-programming algorithms as local sequence alignment and therefore has no computational speed advantage.
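To illustrate the determinism point: decoding an HMM with the Viterbi algorithm is plain dynamic programming, so repeated runs on the same input always return the same state path. The toy model below (states, transitions, and emissions are all invented for illustration) is only a sketch, not any particular alignment tool.

```python
# Toy illustration: Viterbi decoding of an HMM is deterministic
# dynamic programming -- two runs on the same data give the same path.
def viterbi(obs, states, start_p, trans_p, emit_p):
    # v[t][s] = probability of the best path ending in state s at time t
    v = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    back = [{}]
    for t in range(1, len(obs)):
        v.append({})
        back.append({})
        for s in states:
            prev, p = max(
                ((r, v[t - 1][r] * trans_p[r][s]) for r in states),
                key=lambda x: x[1],
            )
            v[t][s] = p * emit_p[s][obs[t]]
            back[t][s] = prev
    # Trace back the single best state path
    last = max(states, key=lambda s: v[-1][s])
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return path[::-1]

states = ("M", "I")  # toy "match" / "insert" states
start = {"M": 0.6, "I": 0.4}
trans = {"M": {"M": 0.7, "I": 0.3}, "I": {"M": 0.4, "I": 0.6}}
emit = {"M": {"A": 0.9, "C": 0.1}, "I": {"A": 0.2, "C": 0.8}}
obs = "ACCA"
run1 = viterbi(obs, states, start, trans, emit)
run2 = viterbi(obs, states, start, trans, emit)
assert run1 == run2  # identical on every run: no randomness involved
```

The randomness in an HMM lives in the model's probabilities, not in the decoding: given fixed parameters and input, the arg-max at every cell is the same every time.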
 
<div style="border:1px solid black;padding:5px;">One of the most common motif-finding tools, known as MEME, uses expectation maximization and hidden Markov methods to generate motifs that are then used as search tools by its companion MAST in the combined suite MEME/MAST.[19][20]
</div>
 
MEME uses a PSSM (position-specific scoring matrix), which does not contain insertion or deletion probabilities or the other characteristics of a typical sequence HMM.
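For concreteness, a PSSM scores a fixed-width window one column at a time, with no insert or delete states at all. The sketch below uses an invented 3-column DNA motif with made-up log-odds scores; it is not MEME's actual model, just the general PSSM idea.

```python
# Minimal sketch of PSSM scoring: one score column per motif position,
# fixed motif width, no insertions or deletions. Scores are invented.
pssm = [
    {"A": 1.2, "C": -0.5, "G": -0.8, "T": -0.5},
    {"A": -0.7, "C": 1.4, "G": -0.9, "T": -0.6},
    {"A": -0.4, "C": -0.6, "G": 1.1, "T": -0.7},
]

def score_window(window):
    # Score a fixed-width window column by column; unlike a profile
    # HMM, there is no way to skip a column or insert extra residues.
    return sum(col[base] for col, base in zip(pssm, window))

def best_hit(seq):
    # Slide the fixed-width motif along the sequence and keep the
    # best-scoring window (the basic idea behind a MAST-style search).
    w = len(pssm)
    return max(
        ((i, score_window(seq[i : i + w])) for i in range(len(seq) - w + 1)),
        key=lambda x: x[1],
    )

pos, s = best_hit("TTACGA")  # best window starts at index 2 ("ACG")
```

Because every match must span exactly the motif width, a PSSM has no analogue of an HMM's insert and delete states, which is the distinction being made above.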
 
[[User:Gribskov|Gribskov]] 03:55, 20 September 2007 (UTC)