This blog post is part of a series, titled Natural Language Processing (NLP) Fundamentals.

(Disclaimer: I am a developer at Amazon, working on their Alexa technologies, on devices such as the Amazon Echo. As a consequence, there may be hints of Amazon sprinkled throughout this blog series. However, I should note that this blog series is not sponsored by Amazon, and represents my opinions only, not those of the company.)

Do you like graphs? So do I! High five! Let's ease into NLP Fundamentals by talking about Finite State Transducers (FSTs).

Wikipedia describes Finite State Transducers (FSTs) as "a finite state machine with two tapes: an input tape and an output tape." If that makes sense to you, great! Feel free to stop reading. If you were just as confused as I was, then awesome! Read on.

Let's start with a visual, because I like images. Here's an example of a finite state transducer:

I like to think of Finite State Transducers (FSTs) as directed graphs, with a few special properties:

- Edges ("transitions") have input/output labels.
- Sometimes, labels are "empty" (indicated here as a lowercase epsilon, ε).
- Traversing through to the end of an FST implies the generation (or indication) of a set of string relations.
- There is a defined "initial state" (indicated here in green) and one or more "final states" (indicated here in gray).

FSTs are used for a variety of different applications:

- Morphological parsing, i.e., extracting the "properties" of a word (e.g., computers -> computer + +).
- For example, pluralizing words (cat -> cats, dog -> dogs, goose -> geese, etc.).
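To make the "directed graph with input/output labels" idea concrete, here is a minimal sketch of an FST in plain Python. The names (`EPSILON`, `TRANSITIONS`, `transduce`) and the states are my own illustrative choices, not from any FST library: a start state copies each input letter to the output tape, and an epsilon (empty-input) transition into the final state emits an extra "s", pluralizing a regular noun.

```python
# A toy finite state transducer, sketched as a directed graph.
# States are nodes; edges carry "input:output" labels.
# All names here are illustrative assumptions, not a real FST library's API.

EPSILON = ""  # the "empty" label, drawn as a lowercase epsilon in FST diagrams

# transitions: state -> list of (input_label, output_label, next_state)
TRANSITIONS = {
    # copy any lowercase letter from the input tape to the output tape
    "start": [(c, c, "start") for c in "abcdefghijklmnopqrstuvwxyz"]
             # epsilon transition: consume nothing, emit "s", finish
             + [(EPSILON, "s", "final")],
    "final": [],
}
FINAL_STATES = {"final"}

def transduce(word):
    """Walk the FST over `word` (depth-first), collecting output strings.

    Reaching a final state after consuming the whole input yields one
    string relation (input word, output word).
    """
    results = []

    def walk(state, i, output):
        if i == len(word) and state in FINAL_STATES:
            results.append(output)
        for inp, out, nxt in TRANSITIONS[state]:
            if inp == EPSILON:                      # epsilon: consume no input
                walk(nxt, i, output + out)
            elif i < len(word) and word[i] == inp:  # match one input symbol
                walk(nxt, i + 1, output + out)

    walk("start", 0, "")
    return results

print(transduce("cat"))  # -> ['cats']
print(transduce("dog"))  # -> ['dogs']
```

Because every edge relates an input symbol to an output symbol, the same graph can be run "backwards" by swapping the labels on each edge — which is how one FST can both generate a plural (cat -> cats) and parse it back into its morphological properties.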