Simulating human language understanding on a computer is a major challenge. One way to approach it is to represent natural language meanings in logic and to use logical provers to determine what does and does not follow from a text. Which logic is best to use, and how natural language meanings are best represented in it, are far from trivial questions. This thesis focuses on semantic representation in deep parsing. It describes the Delilah parser and generator for Dutch, which computes semantic representations for sentences, discussing several issues and proposing further improvements to the system. A style of logical form is developed that is optimized for inference in two main ways. The first is the implementation of event semantics for verbs and nominalizations, with underlying states for intersective adjectives and their corresponding abstract nouns; this makes many entailments follow straightforwardly. The second is the introduction of Flat Logical Form as an alternative to first-order logic representations. In Flat Logical Form, crucial information on quantification, monotonicity, and embedding is annotated locally on the variables of the formula, where it does not complicate the formula's structure. Both moves make the representations rich in information and at the same time easy to process for automated reasoning. Such automated reasoning with access to detailed semantic information is expected to contribute to the retrieval of free narrative text.
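To illustrate the kind of entailment that event semantics makes straightforward, consider the standard Davidsonian pattern (a textbook illustration rather than an example taken from the thesis): "Mary walks slowly" is represented with an underlying event variable as ∃e [walk(e) ∧ agent(e, mary) ∧ slow(e)], from which "Mary walks", ∃e [walk(e) ∧ agent(e, mary)], follows simply by dropping a conjunct.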