Computational Semantics

September/October 2000


Course description

The subject of this course is the assembly of meaning in natural language. In a previous course, we saw that the computational system of language is resource-sensitive: in building up grammatical forms and meanings, each lexical assumption has to be used exactly once, and the way the assumptions are syntactically structured affects well-formedness. Resource-sensitivity has a computationally attractive side: one can develop lean parsing regimes that directly exploit the built-in economy principles of the grammar logic (so-called proof nets, a subject dealt with in a companion course). At the same time, however, it limits the semantic expressivity of the computational system: sometimes we would like to associate a sentence with an interpretation that assembles the meaning parts in a way that is incompatible with the syntax. For example, derivational meanings cannot express multiple binding, and binding (i.e. hypothetical reasoning) is only possible with respect to structurally accessible positions.
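The "used exactly once" discipline can be made concrete with a small sketch (plain Python, not the course's grammar logic; all names here are made up for illustration). Each lexical meaning is treated as a resource that is consumed when used; a derivation that uses an assumption twice, or leaves one unused, fails.

```python
# Illustrative sketch of resource-sensitive meaning assembly:
# each lexical assumption must be consumed exactly once.

def assemble(lexicon, derivation):
    """Run a derivation, handing out each lexical meaning at most once."""
    resources = dict(lexicon)          # the multiset of available assumptions
    def use(word):
        return resources.pop(word)     # a second use of the same word raises KeyError
    result = derivation(use)
    # leftover assumptions make the derivation ill-formed
    assert not resources, "unused lexical assumptions: %r" % list(resources)
    return result

# Toy lexicon: meanings are constants and Python functions.
lexicon = {
    "alice":  "a",
    "sleeps": lambda x: ("sleep", x),
}

# Derivation for "alice sleeps": apply the verb meaning to the subject meaning.
meaning = assemble(lexicon, lambda use: use("sleeps")(use("alice")))
```

A derivation such as `lambda use: use("alice")`, which never consumes `sleeps`, is rejected by the leftover check.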

In the course, we investigate two strategies which natural languages use to overcome the limitations of derivational semantics.

We will investigate the interplay of derivational semantics, lexical meaning and structural reasoning by studying some key phenomena in NL semantics: binding and scope, reflexives, discourse anaphora.



Format

The course has three components:

Your grade for the course is based on your results for the exercises and on the grade for the final paper.


Course readings

For a start, we look at the following materials:



Courseware

To use the courseware, you need an account on the unix machine. There is an on-line form to apply for an account; mention the name of the course and the type of student you are (CKI doctorate or otherwise) when you send in your application. The command grail at the unix prompt starts up the grammar development environment. For documentation, see:

You can also take a look at Bob Carpenter's on-line type-logical parser. This produces derivations in the format used in the TLG book (derivations with implicit structural rules).



Sessions

Week One has two introductory sessions. The course proper starts September 29 and runs on an intensive schedule until the end of the block (October 27). By September 29, you are expected to have made a first exploration of the Carpenter chapter Quantifiers and scope. There is an assignment (deadline: October 6) to test your understanding.

Sept 29, Oct 3. We investigate the 'lexical strategy' for obtaining multiple scope readings: generalized quantifiers (GQs) are assigned different lexical types (plus appropriate meaning programs) to produce the different scopings. There is a set of lab exercises for this part of the course: you will find them in Section 2 of the course notes. Save your fragment as you have it after finishing the exercises for Section 2.
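The lexical strategy can be previewed with a rough Python sketch (not Grail's notation; the term representation and all names are illustrative). Each scoping of "every student read some book" comes from a different meaning program, here a different order of composing the quantifier meanings:

```python
# Illustrative sketch: one scoping per lexical meaning program.
# Meanings are built as nested tuples standing in for logical formulas.

def every(restr):
    return lambda scope: ("forall", "x", ("->", (restr, "x"), scope("x")))

def some(restr):
    return lambda scope: ("exists", "y", ("&", (restr, "y"), scope("y")))

read = lambda y: lambda x: ("read", x, y)   # object first, then subject

# Meaning program 1: object GQ takes narrow scope (surface reading).
surface = every("student")(lambda x: some("book")(lambda y: read(y)(x)))

# Meaning program 2: object GQ takes wide scope (inverse reading).
inverse = some("book")(lambda y: every("student")(lambda x: read(y)(x)))
```

The two programs assemble the same lexical parts, but in different orders, yielding two distinct formulas.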

Oct 6, Oct 10. To obtain the lexical meaning programs for scope-specific type assignments under the lexical strategy, we have allowed ourselves full freedom to reorder and restructure the semantic building blocks (structural rules of Commutativity and Associativity). Is there a more restricted form of structural reasoning that doesn't damage natural language syntax, and that would make it possible to obtain the different scope readings from one single type assignment to GQs?

Oct 13. We discuss the definition of the Scoping Constructor in terms of the basic logical constants (product and the two implications). Lab exercises: Section 3.1 of the course notes. Save your solutions under a new fragment name.
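The intended behaviour of a scoping type can be glossed with a small continuation-style sketch (plain Python, illustrative names; not the official definition from the course notes). A q(A, B, C)-style entry behaves locally as an A-type variable and takes scope over a B-level continuation; a single such entry then yields both readings, depending on the order in which the continuations are applied in the derivation:

```python
# Illustrative sketch of a scope-taking entry: a function from its
# sentence-level continuation to a formula, binding a fresh variable.

def q(quant, var, restr):
    """Single type assignment: locally a variable, scope over a continuation."""
    return lambda scope: (quant, var, (restr, var), scope(var))

every_student = q("forall", "x", "student")
some_book     = q("exists", "y", "book")
read = lambda x, y: ("read", x, y)

# One lexical entry per quantifier; the readings differ only in which
# quantifier is given its scope first.
reading1 = every_student(lambda x: some_book(lambda y: read(x, y)))
reading2 = some_book(lambda y: every_student(lambda x: read(x, y)))
```

Contrast this with the lexical strategy: here the ambiguity is located in the derivation, not in multiple type assignments.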

Oct 17. With the analysis of the Scoping Constructor as a defined/compiled operation, we return to Carpenter Chapter 7, and discuss the analysis of quantification and negation (7.7), embedded quantifiers (7.5) and quantifiers and coordination (7.6).

Oct 20. The generalized binding operation we have been studying is not construction-specific: it has applications to a wide range of phenomena, not just quantifier scope. In this session we illustrate this with a discussion of Extraction and Pied-Piping. Lab exercises are in Section 3.2 of the course notes. Save your solutions for this final set of exercises again under a new fragment name, so that your /fragments directory now contains three fragments.
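That extraction, too, is a case of hypothetical reasoning can be glossed with a last small sketch (plain Python; the relative-pronoun meaning and all names are illustrative, not the course's analysis). The gap is a hypothesis, represented as a bound variable, which is withdrawn when the relative pronoun applies:

```python
# Illustrative sketch of extraction as hypothetical reasoning:
# the gap in "book that alice read _" is a lambda-bound variable.

read = lambda x, y: ("read", x, y)

# "that alice read _": abstract over the gap position.
that_alice_read = lambda gap: read("alice", gap)

# Relative pronoun: intersect the head-noun property with the gap property.
def that(gap_prop):
    return lambda noun: lambda z: ("&", (noun, z), gap_prop(z))

book_that_alice_read = that(that_alice_read)("book")
```

Applied to an individual, the result conjoins the noun meaning with the body of the withdrawn hypothesis, just as in the quantifier case.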

Follow-up course. OTS offers a nice follow-up course (which is open to students who took the Computational Semantics course): Type Theory and the Composition of Meaning. The course looks at meaning assembly in natural language from the perspective of the functional programming language Haskell. The course is presented by Jan van Eijck.


Background reading

This course presupposes knowledge of the material covered in the CKI first year course 'Parsing as deduction'. You can visit the course page for links to electronically available literature and teaching materials. An alternative starting point for non-Dutch participants is the Moortgat-Oehrle course Grammatical resources: logic, structure and control at the ESSLLI99 summer school.
