Natural Language
PSU CS441/541
Lecture 9
November 20, 2000
- Plan For Today
- Natural Language
- Scheduling (deferred)
- Project discussion
- HW3 discussion (eliminated)
- Natural Language
- Why AI?
- obvious: language implies intelligence
- Turing test
- Washoe
- SHRDLU
- ELIZA
- language recognition as puzzle domain requiring reasoning
- Two facets: recognition and synthesis
- state of art in synthesis
- acceptable phoneme-based word pronunciation
- usable
- spoken NL text using tricks
- text generation via standard CS methods
- "sonics" (producing the speech sounds) generally ahead of NL text generation
- NL generation generally ahead of general AI
- state of art in recognition
- much harder than synthesis
- the crummy->good vs. good->crummy problem: recognition must recover a good internal representation from crummy input, while synthesis only has to produce (possibly crummy) output from a good representation
- requires much more integration with semantics
- many axes on which to compare
- for a good time, try 1-800-555-TELL
- DragonDictate
- big sponsor: the phone companies
- levels of recognition
- speech sounds
- digitize
- FFT (voiceprint = time/freq plot)
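
A minimal sketch of the digitize-then-FFT step (Python with numpy; the frame and hop sizes are arbitrary choices, not from the lecture): slice the sampled signal into short overlapping frames and take the magnitude of each frame's FFT, giving the time/frequency "voiceprint" plot.

    import numpy as np

    def voiceprint(samples, frame_len=256, hop=128):
        """Crude spectrogram: rows are time frames, columns are frequency
        bins (magnitude of the FFT of each Hann-windowed frame)."""
        window = np.hanning(frame_len)
        frames = []
        for start in range(0, len(samples) - frame_len + 1, hop):
            frame = samples[start:start + frame_len] * window
            frames.append(np.abs(np.fft.rfft(frame)))   # keep non-negative freqs
        return np.array(frames)

    # Example: one second of a 440 Hz tone sampled at 8 kHz puts its energy
    # in one band, around bin 14 (440 Hz / 31.25 Hz per bin).
    rate = 8000
    t = np.arange(rate) / rate
    spec = voiceprint(np.sin(2 * np.pi * 440 * t))
    print(spec.shape, spec.argmax(axis=1)[:5])
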
- phonemes, phoneme groups
- fit to vocal tract model
- deal with transitions
- words ("segments")
- most words are not pronounced distinctly
("continuous speech")
- force speaker to segment speech (e.g. menu systems)
- try to find breaks bottom up (hard/impossible)
- work back to breaks from syntax (HMMs, works better)
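
A minimal sketch of the HMM idea above (the two-word model and every probability are invented): Viterbi dynamic programming finds the most likely word sequence for a sequence of acoustic observations, and the segmentation falls out of where the best path switches words.

    import math

    def viterbi(obs, states, start_p, trans_p, emit_p):
        """Most likely word (state) sequence for a sequence of acoustic
        observations; log probabilities avoid underflow on long inputs."""
        best = [{s: (math.log(start_p[s] * emit_p[s][obs[0]]), [s]) for s in states}]
        for o in obs[1:]:
            layer = {}
            for s in states:
                score, path = max((best[-1][p][0] + math.log(trans_p[p][s]),
                                   best[-1][p][1]) for p in states)
                layer[s] = (score + math.log(emit_p[s][o]), path + [s])
            best.append(layer)
        return max(best[-1].values())[1]

    # Invented two-word model over two coarse acoustic classes, 'a' and 'b'.
    states = ['hello', 'world']
    start_p = {'hello': 0.6, 'world': 0.4}
    trans_p = {'hello': {'hello': 0.7, 'world': 0.3},
               'world': {'hello': 0.3, 'world': 0.7}}
    emit_p = {'hello': {'a': 0.8, 'b': 0.2},
              'world': {'a': 0.2, 'b': 0.8}}
    print(viterbi(['a', 'a', 'b', 'b'], states, start_p, trans_p, emit_p))
    # ['hello', 'hello', 'world', 'world']: the word boundary falls between
    # frames 2 and 3, where the best path switches state.
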
- syntax ("grammar")
- book is very explicit here
- obtain word meanings from dictionary
- walk nondeterministic ATN grammar to generate parse
- grammar constrained by special rules
- where do dictionaries and magic grammars come from?
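
A toy stand-in for the dictionary-plus-grammar walk above (a plain nondeterministic recursive descent rather than a real ATN; the dictionary, grammar, and parse format are invented): it yields every parse of the input sentence.

    DICT = {'the': 'Det', 'dog': 'N', 'man': 'N', 'bites': 'V', 'sees': 'V'}
    GRAMMAR = {                       # right-hand sides for each phrase type
        'S':  [['NP', 'VP']],
        'NP': [['Det', 'N'], ['N']],
        'VP': [['V', 'NP'], ['V']],
    }

    def parses(symbol, words, i):
        """Yield (tree, next_index) for every way symbol can match words[i:]."""
        if symbol in GRAMMAR:                       # nonterminal: try each rule
            for rhs in GRAMMAR[symbol]:
                for children, j in seq(rhs, words, i):
                    yield (symbol, children), j
        elif i < len(words) and DICT.get(words[i]) == symbol:
            yield (symbol, words[i]), i + 1         # terminal: consume one word

    def seq(symbols, words, i):
        """Match a whole right-hand side, nondeterministically."""
        if not symbols:
            yield [], i
            return
        for tree, j in parses(symbols[0], words, i):
            for rest, k in seq(symbols[1:], words, j):
                yield [tree] + rest, k

    sentence = 'the dog bites the man'.split()
    for tree, end in parses('S', sentence, 0):
        if end == len(sentence):                    # keep full-sentence parses
            print(tree)
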
- semantics ("meaning")
- goal: run special rules over parse to generate FOL
- make sure to match conjunction of all triggered rules
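
Continuing the toy example, a sketch of the semantic step (the rule format, constants, and predicate names are invented, not the book's): rules keyed on phrase shapes turn the parse into first-order terms, and the results of all triggered rules are conjoined.

    def np_meaning(np):
        """Name a constant for the NP's referent, e.g.
        ('NP', [('Det','the'), ('N','dog')]) becomes 'Dog1'."""
        noun = [w for tag, w in np[1] if tag == 'N'][0]
        return noun.capitalize() + '1'

    def sentence_meaning(s):
        """Run the triggered rules and conjoin their results."""
        np1, vp = s[1]
        nps = [np1] + vp[1][1:]                     # subject plus any object
        facts = ['Is%s(%s)' % (w.capitalize(), np_meaning(np))   # noun rules
                 for np in nps for tag, w in np[1] if tag == 'N']
        if len(vp[1]) == 2:                         # transitive-verb rule
            verb = vp[1][0][1]
            facts.append('%s(%s, %s)' % (verb.capitalize(),
                                         np_meaning(np1), np_meaning(vp[1][1])))
        return ' & '.join(facts)

    parse = ('S', [('NP', [('Det', 'the'), ('N', 'dog')]),
                   ('VP', [('V', 'bites'),
                           ('NP', [('Det', 'the'), ('N', 'man')])])])
    print(sentence_meaning(parse))
    # IsDog(Dog1) & IsMan(Man1) & Bites(Dog1, Man1)
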
- pragmatics ("higher meaning")
- narrow choices as far as possible using background knowledge
- e.g.: pronoun referent problem (anaphora resolution)
- discourse analysis (e.g. scene formation)
- plan recognition
- identify agent goals
- find agent plan which would fit agent actions
- give broader semantic analysis (e.g. need money,
get gun)
- also useful elsewhere
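
A toy sketch of the plan-recognition idea (the plan library and action names are invented): report the goals whose plans are consistent with the actions seen so far; observing "need money, get gun" narrows the candidates.

    PLANS = {
        'rob_bank':   ['need_money', 'get_gun', 'go_to_bank', 'demand_money'],
        'go_hunting': ['get_license', 'get_gun', 'drive_to_woods'],
        'buy_food':   ['need_money', 'go_to_store', 'pay'],
    }

    def is_subsequence(observed, plan):
        """True if the observed actions appear in the plan, in order."""
        it = iter(plan)
        return all(any(step == act for step in it) for act in observed)

    def candidate_goals(observed):
        return [goal for goal, plan in PLANS.items()
                if is_subsequence(observed, plan)]

    print(candidate_goals(['need_money']))             # ['rob_bank', 'buy_food']
    print(candidate_goals(['need_money', 'get_gun']))  # ['rob_bank']
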
- Project + HW 3 discussion
- review of negamax
- from board value to move value
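
A minimal negamax sketch (the Board interface with moves/make/undo/value is an assumption, not the project's; a toy Nim game is included so the sketch runs): a move's value is the negated negamax value of the position it leads to.

    def negamax(board, depth):
        """Value of the position for the side to move."""
        if depth == 0 or not board.moves():
            return board.value()
        best = -float('inf')
        for m in board.moves():
            board.make(m)
            best = max(best, -negamax(board, depth - 1))  # opponent's value, negated
            board.undo(m)
        return best

    def best_move(board, depth):
        """From board values to move values: a move is worth the negated
        negamax value of the position it leads to."""
        best_m, best_v = None, -float('inf')
        for m in board.moves():
            board.make(m)
            v = -negamax(board, depth - 1)
            board.undo(m)
            if v > best_v:
                best_m, best_v = m, v
        return best_m, best_v

    class Nim:
        """Toy game: take 1-3 counters from a pile; taking the last one wins."""
        def __init__(self, pile): self.pile = pile
        def moves(self): return [n for n in (1, 2, 3) if n <= self.pile]
        def make(self, m): self.pile -= m
        def undo(self, m): self.pile += m
        def value(self):
            # The side to move facing an empty pile has already lost.
            return -1 if self.pile == 0 else 0

    print(best_move(Nim(5), 6))   # (1, 1): taking 1 leaves a lost pile of 4
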
- general tricks
- transposition tables
- hash to position
- open hash table
- random replacement
- with iterative deepening
- graph-history interaction
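
A sketch of the transposition table just described: a fixed-size array indexed by a hash of the position, random replacement on collision, and a depth check so that entries left by deeper iterative-deepening passes can be trusted. The entry fields are typical but assumed.

    import random

    class TTable:
        """Transposition table sketch: fixed-size array indexed by a hash of
        the position, with random replacement when two positions collide."""
        def __init__(self, size=1 << 20):
            self.size = size
            self.slots = [None] * size          # each slot: (key, depth, value)

        def store(self, key, depth, value):
            i = key % self.size
            old = self.slots[i]
            # Always fill an empty slot or refresh the same position; on a
            # collision with a different position, replace about half the time.
            if old is None or old[0] == key or random.random() < 0.5:
                self.slots[i] = (key, depth, value)

        def probe(self, key, depth):
            """Return a cached value, or None.  Only trust an entry for the
            same position searched at least as deep, e.g. one left behind by
            an earlier iterative-deepening pass."""
            entry = self.slots[key % self.size]
            if entry and entry[0] == key and entry[1] >= depth:
                return entry[2]
            return None
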
- alpha-beta with t-tables
- incremental do-undo!
- side-separated heuristic, differencing
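
One way, as a sketch only, to combine alpha-beta with the table and the do/undo discipline above (board.hash() is assumed; a production table would also record whether a value is an upper or lower bound, which is skipped here by caching only exact values).

    def alphabeta(board, depth, alpha, beta, table):
        """Negamax-form alpha-beta using the TTable and Board (make/undo)
        sketches above."""
        alpha_orig = alpha
        key = board.hash()                      # assumed position hash
        cached = table.probe(key, depth)
        if cached is not None:
            return cached
        if depth == 0 or not board.moves():
            return board.value()
        best = -float('inf')
        for m in board.moves():
            board.make(m)                       # do ...
            v = -alphabeta(board, depth - 1, -beta, -alpha, table)
            board.undo(m)                       # ... and undo; no board copying
            best = max(best, v)
            alpha = max(alpha, v)
            if alpha >= beta:                   # cutoff: the opponent avoids this line
                break
        if alpha_orig < best < beta:            # value is exact, safe to cache
            table.store(key, depth, best)
        return best

    # Usage with the Nim toy from the negamax sketch (a trivial position hash):
    #   Nim.hash = lambda self: self.pile
    #   print(alphabeta(Nim(10), 12, -float('inf'), float('inf'), TTable(1024)))
    #   # 1: pile 10 is a win for the side to move
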
- the rules of LOA
- LOA-specific problems
- connectedness
- move generation
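
A sketch of LOA move generation under the usual rules (a piece moves along a row, column, or diagonal exactly as many squares as there are pieces of either color anywhere on that line; it may jump over friendly pieces but not enemy pieces, and may capture by landing on an enemy piece); the 8x8 list-of-lists board is an assumed representation, not the project's.

    DIRS = [(-1, -1), (-1, 0), (-1, 1), (0, -1), (0, 1), (1, -1), (1, 0), (1, 1)]

    def line_count(board, r, c, dr, dc):
        """Number of pieces on the whole line through (r, c) along (dr, dc)."""
        count = 0
        for sign in (1, -1):
            rr, cc = r, c
            while 0 <= rr < 8 and 0 <= cc < 8:
                if board[rr][cc] is not None:
                    count += 1
                rr, cc = rr + sign * dr, cc + sign * dc
        return count - 1                        # the piece at (r, c) was counted twice

    def moves(board, player):
        """All legal (from, to) pairs for player ('B' or 'W')."""
        result = []
        for r in range(8):
            for c in range(8):
                if board[r][c] != player:
                    continue
                for dr, dc in DIRS:
                    n = line_count(board, r, c, dr, dc)
                    rr, cc, blocked = r, c, False
                    for step in range(1, n + 1):
                        rr, cc = r + step * dr, c + step * dc
                        if not (0 <= rr < 8 and 0 <= cc < 8):
                            blocked = True      # runs off the board
                            break
                        if step < n and board[rr][cc] not in (None, player):
                            blocked = True      # enemy piece blocks the path
                            break
                    if not blocked and board[rr][cc] != player:
                        result.append(((r, c), (rr, cc)))   # empty square or capture
        return result

    # Tiny example: three pieces on row 3 of an otherwise empty board.
    board = [[None] * 8 for _ in range(8)]
    board[3][2], board[3][4], board[3][5] = 'B', 'W', 'B'
    print(moves(board, 'B'))
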
- heuristic evaluation
- material (weighted differently at high vs. low piece counts)
- connectedness (number/size of groups)
- mobility
- coordination
- piece position
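
A sketch of the connectedness term (same assumed board representation as the move-generation sketch; the final scoring formula is an invented placeholder, not the required evaluator): a flood fill counts the player's groups, and fewer, larger groups score better.

    def groups(board, player):
        """Sizes of the player's 8-connected groups."""
        seen, sizes = set(), []
        for r in range(8):
            for c in range(8):
                if board[r][c] == player and (r, c) not in seen:
                    stack, size = [(r, c)], 0
                    seen.add((r, c))
                    while stack:
                        rr, cc = stack.pop()
                        size += 1
                        for dr in (-1, 0, 1):
                            for dc in (-1, 0, 1):
                                nr, nc = rr + dr, cc + dc
                                if (0 <= nr < 8 and 0 <= nc < 8
                                        and board[nr][nc] == player
                                        and (nr, nc) not in seen):
                                    seen.add((nr, nc))
                                    stack.append((nr, nc))
                    sizes.append(size)
        return sizes

    def connectedness(board, player):
        """Crude score: reward the biggest group, penalize the group count."""
        sizes = groups(board, player)
        return max(sizes) - len(sizes) if sizes else 0

    board = [[None] * 8 for _ in range(8)]
    for r, c in [(0, 1), (0, 2), (5, 5)]:
        board[r][c] = 'B'
    print(groups(board, 'B'), connectedness(board, 'B'))   # [2, 1] 0
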