Title:
An introduction to language processing with Perl and Prolog : an outline of theories, implementation, and application with special consideration of English, French, and German
Personal Author:
Nugues, Pierre
Publication Information:
Berlin : Springer, 2006
ISBN:
9783540250319

Call Number:
QA76.9.N38 N83 2006

Summary

The areas of natural language processing and computational linguistics have continued to grow in recent years, driven by the demand to automatically process text and spoken data. With the processing power and techniques now available, research is scaling up from lab prototypes to real-world, proven applications.

This book teaches the principles of natural language processing, first covering linguistic issues such as encoding, entropy, and annotation schemes; defining words, tokens, and parts of speech; and morphology. It then details the language-processing functions involved, including part-of-speech tagging using rules and stochastic techniques; using Prolog to write phrase-structure grammars; parsing techniques and syntactic formalisms; semantics, predicate logic, and lexical semantics; and the analysis of discourse and applications in dialogue systems. The key feature of the book is the author's hands-on approach throughout, with extensive exercises, sample code in Prolog and Perl, and a detailed introduction to Prolog. The reader is supported by a companion website that contains teaching slides, programs, and additional material.

The book is suitable for researchers and students of natural language processing and computational linguistics.


Author Notes

Pierre Nugues' research is focused on natural language processing for advanced user interfaces and spoken dialogue. This includes the design and implementation of conversational agents within a multimodal framework and text visualization. He led the team that designed a navigation agent, Ulysse, that enables a user to navigate in a virtual reality environment using language, and the team that designed the CarSim system that generates animated 3D scenes from written texts.

Pierre Nugues has taught natural language processing and computational linguistics at the following institutions: ISMRA, Caen, France; University of Nottingham, UK; Staffordshire University, UK; FH Konstanz, Germany; Lund University, Sweden; and Ghent University, Belgium.


Table of Contents

1 An Overview of Language Processing p. 1
1.1 Linguistics and Language Processing p. 1
1.2 Applications of Language Processing p. 2
1.3 The Different Domains of Language Processing p. 3
1.4 Phonetics p. 4
1.5 Lexicon and Morphology p. 6
1.6 Syntax p. 8
1.6.1 Syntax as Defined by Noam Chomsky p. 8
1.6.2 Syntax as Relations and Dependencies p. 10
1.7 Semantics p. 11
1.8 Discourse and Dialogue p. 14
1.9 Why Speech and Language Processing Are Difficult p. 14
1.9.1 Ambiguity p. 15
1.9.2 Models and Their Implementation p. 16
1.10 An Example of Language Technology in Action: the Persona Project p. 17
1.10.1 Overview of Persona p. 17
1.10.2 The Persona's Modules p. 18
1.11 Further Reading p. 19
2 Corpus Processing Tools p. 23
2.1 Corpora p. 23
2.1.1 Types of Corpora p. 23
2.1.2 Corpora and Lexicon Building p. 24
2.1.3 Corpora as Knowledge Sources for the Linguist p. 26
2.2 Finite-State Automata p. 27
2.2.1 A Description p. 27
2.2.2 Mathematical Definition of Finite-State Automata p. 28
2.2.3 Finite-State Automata in Prolog p. 29
2.2.4 Deterministic and Nondeterministic Automata p. 30
2.2.5 Building a Deterministic Automaton from a Nondeterministic One p. 31
2.2.6 Searching a String with a Finite-State Automaton p. 31
2.2.7 Operations on Finite-State Automata p. 33
2.3 Regular Expressions p. 35
2.3.1 Repetition Metacharacters p. 36
2.3.2 The Longest Match p. 37
2.3.3 Character Classes p. 38
2.3.4 Nonprintable Symbols or Positions p. 39
2.3.5 Union and Boolean Operators p. 41
2.3.6 Operator Combination and Precedence p. 41
2.4 Programming with Regular Expressions p. 42
2.4.1 Perl p. 42
2.4.2 Matching p. 42
2.4.3 Substitutions p. 43
2.4.4 Translating Characters p. 44
2.4.5 String Operators p. 44
2.4.6 Back References p. 45
2.5 Finding Concordances p. 46
2.5.1 Concordances in Prolog p. 46
2.5.2 Concordances in Perl p. 48
2.6 Approximate String Matching p. 50
2.6.1 Edit Operations p. 50
2.6.2 Minimum Edit Distance p. 51
2.6.3 Searching Edits in Prolog p. 54
2.7 Further Reading p. 55
3 Encoding, Entropy, and Annotation Schemes p. 59
3.1 Encoding Texts p. 59
3.2 Character Sets p. 60
3.2.1 Representing Characters p. 60
3.2.2 Unicode p. 61
3.2.3 The Unicode Encoding Schemes p. 63
3.3 Locales and Word Order p. 66
3.3.1 Presenting Time, Numerical Information, and Ordered Words p. 66
3.3.2 The Unicode Collation Algorithm p. 67
3.4 Markup Languages p. 69
3.4.1 A Brief Background p. 69
3.4.2 An Outline of XML p. 69
3.4.3 Writing a DTD p. 71
3.4.4 Writing an XML Document p. 74
3.4.5 Namespaces p. 75
3.5 Codes and Information Theory p. 76
3.5.1 Entropy p. 76
3.5.2 Huffman Encoding p. 77
3.5.3 Cross Entropy p. 80
3.5.4 Perplexity and Cross Perplexity p. 81
3.6 Entropy and Decision Trees p. 82
3.6.1 Decision Trees p. 82
3.6.2 Inducing Decision Trees Automatically p. 82
3.7 Further Reading p. 84
4 Counting Words p. 87
4.1 Counting Words and Word Sequences p. 87
4.2 Words and Tokens p. 87
4.2.1 What Is a Word? p. 87
4.2.2 Breaking a Text into Words: Tokenization p. 88
4.3 Tokenizing Texts p. 89
4.3.1 Tokenizing Texts in Prolog p. 89
4.3.2 Tokenizing Texts in Perl p. 91
4.4 N-grams p. 92
4.4.1 Some Definitions p. 92
4.4.2 Counting Unigrams in Prolog p. 93
4.4.3 Counting Unigrams with Perl p. 93
4.4.4 Counting Bigrams with Perl p. 95
4.5 Probabilistic Models of a Word Sequence p. 95
4.5.1 The Maximum Likelihood Estimation p. 95
4.5.2 Using ML Estimates with Nineteen Eighty-Four p. 97
4.6 Smoothing N-gram Probabilities p. 99
4.6.1 Sparse Data p. 99
4.6.2 Laplace's Rule p. 100
4.6.3 Good-Turing Estimation p. 101
4.7 Using N-grams of Variable Length p. 102
4.7.1 Linear Interpolation p. 103
4.7.2 Back-off p. 104
4.8 Quality of a Language Model p. 104
4.8.1 Intuitive Presentation p. 104
4.8.2 Entropy Rate p. 105
4.8.3 Cross Entropy p. 105
4.8.4 Perplexity p. 106
4.9 Collocations p. 106
4.9.1 Word Preference Measurements p. 107
4.9.2 Extracting Collocations with Perl p. 108
4.10 Application: Retrieval and Ranking of Documents on the Web p. 109
4.11 Further Reading p. 111
5 Words, Parts of Speech, and Morphology p. 113
5.1 Words p. 113
5.1.1 Parts of Speech p. 113
5.1.2 Features p. 114
5.1.3 Two Significant Parts of Speech: The Noun and the Verb p. 115
5.2 Lexicons p. 117
5.2.1 Encoding a Dictionary p. 119
5.2.2 Building a Trie in Prolog p. 121
5.2.3 Finding a Word in a Trie p. 123
5.3 Morphology p. 123
5.3.1 Morphemes p. 123
5.3.2 Morphs p. 124
5.3.3 Inflection and Derivation p. 125
5.3.4 Language Differences p. 129
5.4 Morphological Parsing p. 130
5.4.1 Two-Level Model of Morphology p. 130
5.4.2 Interpreting the Morphs p. 131
5.4.3 Finite-State Transducers p. 131
5.4.4 Conjugating a French Verb p. 133
5.4.5 Prolog Implementation p. 134
5.4.6 Ambiguity p. 136
5.4.7 Operations on Finite-State Transducers p. 137
5.5 Morphological Rules p. 138
5.5.1 Two-Level Rules p. 138
5.5.2 Rules and Finite-State Transducers p. 139
5.5.3 Rule Composition: An Example with French Irregular Verbs p. 141
5.6 Application Examples p. 142
5.7 Further Reading p. 142
6 Part-of-Speech Tagging Using Rules p. 147
6.1 Resolving Part-of-Speech Ambiguity p. 147
6.1.1 A Manual Method p. 147
6.1.2 Which Method to Use to Automatically Assign Parts of Speech p. 147
6.2 Tagging with Rules p. 149
6.2.1 Brill's Tagger p. 149
6.2.2 Implementation in Prolog p. 151
6.2.3 Deriving Rules Automatically p. 153
6.2.4 Confusion Matrices p. 154
6.3 Unknown Words p. 154
6.4 Standardized Part-of-Speech Tagsets p. 156
6.4.1 Multilingual Part-of-Speech Tags p. 156
6.4.2 Parts of Speech for English p. 158
6.4.3 An Annotation Scheme for Swedish p. 160
6.5 Further Reading p. 162
7 Part-of-Speech Tagging Using Stochastic Techniques p. 163
7.1 The Noisy Channel Model p. 163
7.1.1 Presentation p. 163
7.1.2 The N-gram Approximation p. 164
7.1.3 Tagging a Sentence p. 165
7.1.4 The Viterbi Algorithm: An Intuitive Presentation p. 166
7.2 Markov Models p. 167
7.2.1 Markov Chains p. 167
7.2.2 Hidden Markov Models p. 169
7.2.3 Three Fundamental Algorithms to Solve Problems with HMMs p. 170
7.2.4 The Forward Procedure p. 171
7.2.5 Viterbi Algorithm p. 173
7.2.6 The Backward Procedure p. 174
7.2.7 The Forward-Backward Algorithm p. 175
7.3 Tagging with Decision Trees p. 177
7.4 Unknown Words p. 179
7.5 An Application of the Noisy Channel Model: Spell Checking p. 179
7.6 A Second Application: Language Models for Machine Translation p. 180
7.6.1 Parallel Corpora p. 180
7.6.2 Alignment p. 181
7.6.3 Translation p. 183
7.7 Further Reading p. 184
8 Phrase-Structure Grammars in Prolog p. 185
8.1 Using Prolog to Write Phrase-Structure Grammars p. 185
8.2 Representing Chomsky's Syntactic Formalism in Prolog p. 185
8.2.1 Constituents p. 185
8.2.2 Tree Structures p. 186
8.2.3 Phrase-Structure Rules p. 187
8.2.4 The Definite Clause Grammar (DCG) Notation p. 188
8.3 Parsing with DCGs p. 190
8.3.1 Translating DCGs into Prolog Clauses p. 190
8.3.2 Parsing and Generation p. 192
8.3.3 Left-Recursive Rules p. 193
8.4 Parsing Ambiguity p. 194
8.5 Using Variables p. 196
8.5.1 Gender and Number Agreement p. 196
8.5.2 Obtaining the Syntactic Structure p. 198
8.6 Application: Tokenizing Texts Using DCG Rules p. 200
8.6.1 Word Breaking p. 200
8.6.2 Recognition of Sentence Boundaries p. 201
8.7 Semantic Representation p. 202
8.7.1 λ-Calculus p. 202
8.7.2 Embedding λ-Expressions into DCG Rules p. 203
8.7.3 Semantic Composition of Verbs p. 205
8.8 An Application of Phrase-Structure Grammars and a Worked Example p. 206
8.9 Further Reading p. 210
9 Partial Parsing p. 213
9.1 Is Syntax Necessary? p. 213
9.2 Word Spotting and Template Matching p. 213
9.2.1 ELIZA p. 213
9.2.2 Word Spotting in Prolog p. 214
9.3 Multiword Detection p. 217
9.3.1 Multiwords p. 217
9.3.2 A Standard Multiword Annotation p. 217
9.3.3 Detecting Multiwords with Rules p. 219
9.3.4 The Longest Match p. 219
9.3.5 Running the Program p. 220
9.4 Noun Groups and Verb Groups p. 222
9.4.1 Groups Versus Recursive Phrases p. 223
9.4.2 DCG Rules to Detect Noun Groups p. 223
9.4.3 DCG Rules to Detect Verb Groups p. 225
9.4.4 Running the Rules p. 226
9.5 Group Detection as a Tagging Problem p. 227
9.5.1 Tagging Gaps p. 227
9.5.2 Tagging Words p. 228
9.5.3 Using Symbolic Rules p. 229
9.5.4 Using Statistical Tagging p. 229
9.6 Cascading Partial Parsers p. 230
9.7 Elementary Analysis of Grammatical Functions p. 231
9.7.1 Main Functions p. 231
9.7.2 Extracting Other Groups p. 232
9.8 An Annotation Scheme for Groups in French p. 235
9.9 Application: The FASTUS System p. 237
9.9.1 The Message Understanding Conferences p. 237
9.9.2 The Syntactic Layers of the FASTUS System p. 238
9.9.3 Evaluation of Information Extraction Systems p. 239
9.10 Further Reading p. 240
10 Syntactic Formalisms p. 243
10.1 Introduction p. 243
10.2 Chomsky's Grammar in Syntactic Structures p. 244
10.2.1 Constituency: A Formal Definition p. 244
10.2.2 Transformations p. 246
10.2.3 Transformations and Movements p. 248
10.2.4 Gap Threading p. 248
10.2.5 Gap Threading to Parse Relative Clauses p. 250
10.3 Standardized Phrase Categories for English p. 252
10.4 Unification-Based Grammars p. 254
10.4.1 Features p. 254
10.4.2 Representing Features in Prolog p. 255
10.4.3 A Formalism for Features and Rules p. 257
10.4.4 Features Organization p. 258
10.4.5 Features and Unification p. 260
10.4.6 A Unification Algorithm for Feature Structures p. 261
10.5 Dependency Grammars p. 263
10.5.1 Presentation p. 263
10.5.2 Properties of a Dependency Graph p. 266
10.5.3 Valence p. 268
10.5.4 Dependencies and Functions p. 270
10.6 Further Reading p. 273
11 Parsing Techniques p. 277
11.1 Introduction p. 277
11.2 Bottom-up Parsing p. 278
11.2.1 The Shift-Reduce Algorithm p. 278
11.2.2 Implementing Shift-Reduce Parsing in Prolog p. 279
11.2.3 Differences Between Bottom-up and Top-down Parsing p. 281
11.3 Chart Parsing p. 282
11.3.1 Backtracking and Efficiency p. 282
11.3.2 Structure of a Chart p. 282
11.3.3 The Active Chart p. 283
11.3.4 Modules of an Earley Parser p. 285
11.3.5 The Earley Algorithm in Prolog p. 288
11.3.6 The Earley Parser to Handle Left-Recursive Rules and Empty Symbols p. 293
11.4 Probabilistic Parsing of Context-Free Grammars p. 294
11.5 A Description of PCFGs p. 294
11.5.1 The Bottom-up Chart p. 297
11.5.2 The Cocke-Younger-Kasami Algorithm in Prolog p. 298
11.5.3 Adding Probabilities to the CYK Parser p. 300
11.6 Parser Evaluation p. 301
11.6.1 Constituency-Based Evaluation p. 301
11.6.2 Dependency-Based Evaluation p. 302
11.6.3 Performance of PCFG Parsing p. 302
11.7 Parsing Dependencies p. 303
11.7.1 Dependency Rules p. 304
11.7.2 Extending the Shift-Reduce Algorithm to Parse Dependencies p. 305
11.7.3 Nivre's Parser in Prolog p. 306
11.7.4 Finding Dependencies Using Constraints p. 309
11.7.5 Parsing Dependencies Using Statistical Techniques p. 310
11.8 Further Reading p. 313
12 Semantics and Predicate Logic p. 317
12.1 Introduction p. 317
12.2 Language Meaning and Logic: An Illustrative Example p. 317
12.3 Formal Semantics p. 319
12.4 First-Order Predicate Calculus to Represent the State of Affairs p. 319
12.4.1 Variables and Constants p. 320
12.4.2 Predicates p. 320
12.5 Querying the Universe of Discourse p. 322
12.6 Mapping Phrases onto Logical Formulas p. 322
12.6.1 Representing Nouns and Adjectives p. 323
12.6.2 Representing Noun Groups p. 324
12.6.3 Representing Verbs and Prepositions p. 324
12.7 The Case of Determiners p. 325
12.7.1 Determiners and Logic Quantifiers p. 325
12.7.2 Translating Sentences Using Quantifiers p. 326
12.7.3 A General Representation of Sentences p. 327
12.8 Compositionality to Translate Phrases to Logical Forms p. 329
12.8.1 Translating the Noun Phrase p. 329
12.8.2 Translating the Verb Phrase p. 330
12.9 Augmenting the Database and Answering Questions p. 331
12.9.1 Declarations p. 332
12.9.2 Questions with Existential and Universal Quantifiers p. 332
12.9.3 Prolog and Unknown Predicates p. 334
12.9.4 Other Determiners and Questions p. 335
12.10 Application: The Spoken Language Translator p. 335
12.10.1 Translating Spoken Sentences p. 335
12.10.2 Compositional Semantics p. 336
12.10.3 Semantic Representation Transfer p. 338
12.11 Further Reading p. 340
13 Lexical Semantics p. 343
13.1 Beyond Formal Semantics p. 343
13.1.1 La langue et la parole p. 343
13.1.2 Language and the Structure of the World p. 343
13.2 Lexical Structures p. 344
13.2.1 Some Basic Terms and Concepts p. 344
13.2.2 Ontological Organization p. 344
13.2.3 Lexical Classes and Relations p. 345
13.2.4 Semantic Networks p. 347
13.3 Building a Lexicon p. 347
13.3.1 The Lexicon and Word Senses p. 349
13.3.2 Verb Models p. 350
13.3.3 Definitions p. 351
13.4 An Example of Exhaustive Lexical Organization: WordNet p. 352
13.4.1 Nouns p. 353
13.4.2 Adjectives p. 354
13.4.3 Verbs p. 355
13.5 Automatic Word Sense Disambiguation p. 356
13.5.1 Senses as Tags p. 356
13.5.2 Associating a Word with a Context p. 357
13.5.3 Guessing the Topic p. 357
13.5.4 Naive Bayes p. 358
13.5.5 Using Constraints on Verbs p. 359
13.5.6 Using Dictionary Definitions p. 359
13.5.7 An Unsupervised Algorithm to Tag Senses p. 360
13.5.8 Senses and Languages p. 362
13.6 Case Grammars p. 363
13.6.1 Cases in Latin p. 363
13.6.2 Cases and Thematic Roles p. 364
13.6.3 Parsing with Cases p. 365
13.6.4 Semantic Grammars p. 366
13.7 Extending Case Grammars p. 367
13.7.1 FrameNet p. 367
13.7.2 A Statistical Method to Identify Semantic Roles p. 368
13.8 An Example of Case Grammar Application: EVAR p. 371
13.8.1 EVAR's Ontology and Syntactic Classes p. 371
13.8.2 Cases in EVAR p. 373
13.9 Further Reading p. 373
14 Discourse p. 377
14.1 Introduction p. 377
14.2 Discourse: A Minimalist Definition p. 378
14.2.1 A Description of Discourse p. 378
14.2.2 Discourse Entities p. 378
14.3 References: An Application-Oriented View p. 379
14.3.1 References and Noun Phrases p. 379
14.3.2 Finding Names - Proper Nouns p. 380
14.4 Coreference p. 381
14.4.1 Anaphora p. 381
14.4.2 Solving Coreferences in an Example p. 382
14.4.3 A Standard Coreference Annotation p. 383
14.5 References: A More Formal View p. 384
14.5.1 Generating Discourse Entities: The Existential Quantifier p. 384
14.5.2 Retrieving Discourse Entities: Definite Descriptions p. 385
14.5.3 Generating Discourse Entities: The Universal Quantifier p. 386
14.6 Centering: A Theory on Discourse Structure p. 387
14.7 Solving Coreferences p. 388
14.7.1 A Simplistic Method: Using Syntactic and Semantic Compatibility p. 389
14.7.2 Solving Coreferences with Shallow Grammatical Information p. 390
14.7.3 Salience in a Multimodal Context p. 391
14.7.4 Using a Machine-Learning Technique to Resolve Coreferences p. 391
14.7.5 More Complex Phenomena: Ellipses p. 396
14.8 Discourse and Rhetoric p. 396
14.8.1 Ancient Rhetoric: An Outline p. 397
14.8.2 Rhetorical Structure Theory p. 397
14.8.3 Types of Relations p. 399
14.8.4 Implementing Rhetorical Structure Theory p. 400
14.9 Events and Time p. 401
14.9.1 Events p. 403
14.9.2 Event Types p. 404
14.9.3 Temporal Representation of Events p. 404
14.9.4 Events and Tenses p. 406
14.10 TimeML, an Annotation Scheme for Time and Events p. 407
14.11 Further Reading p. 409
15 Dialogue p. 411
15.1 Introduction p. 411
15.2 Why a Dialogue? p. 411
15.3 Simple Dialogue Systems p. 412
15.3.1 Dialogue Systems Based on Automata p. 412
15.3.2 Dialogue Modeling p. 413
15.4 Speech Acts: A Theory of Language Interaction p. 414
15.5 Speech Acts and Human-Machine Dialogue p. 417
15.5.1 Speech Acts as a Tagging Model p. 417
15.5.2 Speech Acts Tags Used in the SUNDIAL Project p. 418
15.5.3 Dialogue Parsing p. 419
15.5.4 Interpreting Speech Acts p. 421
15.5.5 EVAR: A Dialogue Application Using Speech Acts p. 422
15.6 Taking Beliefs and Intentions into Account p. 423
15.6.1 Representing Mental States p. 425
15.6.2 The STRIPS Planning Algorithm p. 427
15.6.3 Causality p. 429
15.7 Further Reading p. 430
A An Introduction to Prolog p. 433
A.1 A Short Background p. 433
A.2 Basic Features of Prolog p. 434
A.2.1 Facts p. 434
A.2.2 Terms p. 435
A.2.3 Queries p. 437
A.2.4 Logical Variables p. 437
A.2.5 Shared Variables p. 438
A.2.6 Data Types in Prolog p. 439
A.2.7 Rules p. 440
A.3 Running a Program p. 442
A.4 Unification p. 443
A.4.1 Substitution and Instances p. 443
A.4.2 Terms and Unification p. 444
A.4.3 The Herbrand Unification Algorithm p. 445
A.4.4 Example p. 445
A.4.5 The Occurs-Check p. 446
A.5 Resolution p. 447
A.5.1 Modus Ponens p. 447
A.5.2 A Resolution Algorithm p. 447
A.5.3 Derivation Trees and Backtracking p. 448
A.6 Tracing and Debugging p. 450
A.7 Cuts, Negation, and Related Predicates p. 452
A.7.1 Cuts p. 452
A.7.2 Negation p. 453
A.7.3 The once/1 Predicate p. 454
A.8 Lists p. 455
A.9 Some List-Handling Predicates p. 456
A.9.1 The member/2 Predicate p. 456
A.9.2 The append/3 Predicate p. 457
A.9.3 The delete/3 Predicate p. 458
A.9.4 The intersection/3 Predicate p. 458
A.9.5 The reverse/2 Predicate p. 459
A.9.6 The Mode of an Argument p. 459
A.10 Operators and Arithmetic p. 460
A.10.1 Operators p. 460
A.10.2 Arithmetic Operations p. 460
A.10.3 Comparison Operators p. 462
A.10.4 Lists and Arithmetic: The length/2 Predicate p. 463
A.10.5 Lists and Comparison: The quicksort/2 Predicate p. 463
A.11 Some Other Built-in Predicates p. 464
A.11.1 Type Predicates p. 464
A.11.2 Term Manipulation Predicates p. 465
A.12 Handling Run-Time Errors and Exceptions p. 466
A.13 Dynamically Accessing and Updating the Database p. 467
A.13.1 Accessing a Clause: The clause/2 Predicate p. 467
A.13.2 Dynamic and Static Predicates p. 468
A.13.3 Adding a Clause: The asserta/1 and assertz/1 Predicates p. 468
A.13.4 Removing Clauses: The retract/1 and abolish/2 Predicates p. 469
A.13.5 Handling Unknown Predicates p. 470
A.14 All-Solutions Predicates p. 470
A.15 Fundamental Search Algorithms p. 471
A.15.1 Representing the Graph p. 472
A.15.2 Depth-First Search p. 473
A.15.3 Breadth-First Search p. 474
A.15.4 A* Search p. 475
A.16 Input/Output p. 476
A.16.1 Reading and Writing Characters with Edinburgh Prolog p. 476
A.16.2 Reading and Writing Terms with Edinburgh Prolog p. 476
A.16.3 Opening and Closing Files with Edinburgh Prolog p. 477
A.16.4 Reading and Writing Characters with Standard Prolog p. 478
A.16.5 Reading and Writing Terms with Standard Prolog p. 479
A.16.6 Opening and Closing Files with Standard Prolog p. 479
A.16.7 Writing Loops p. 480
A.17 Developing Prolog Programs p. 481
A.17.1 Presentation Style p. 481
A.17.2 Improving Programs p. 482
Index p. 487
References p. 497