Syntax and Semantics
The goal of this course is to introduce the main computational models
and techniques that underlie the syntax and semantics of programming
languages. As you will discover during this semester and in the
remainder of your studies, the theory that we shall cover in this
course has important applications in, e.g., compiler design, various
design methodologies and notations (for instance, the UML notation you
have met in passing in your Object-Oriented
Analysis and Design course), speech recognition (which is partly
addressed in the course Verbal interaktion i multimodal
kontekst on INF6), text processing, and many facets of program
analysis and implementation. At the end of the course, the students
will be familiar with the basic computational models of Finite State
and Pushdown Automata, with the classes of grammars that generate the
languages recognized by these abstract computational devices, and with
basic operational and axiomatic semantics of programming languages.
We shall try to emphasize the role played by semantics in compiler and
language design, program analysis, and good programming practice. (The
notions of pre- and post-conditions, loop invariants and the like have
their root in a formalism for reasoning about programs called Hoare
logic, which is the foundation of axiomatic semantics.)
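For a first taste of this style of reasoning, here is a small, hedged sketch in
Python (the program and the function name are my own illustration, not material
from the course): the assertions play the roles of a pre-condition, a loop
invariant and a post-condition for a simple summation loop.

    # An illustrative sketch (not from the course material): the assertions
    # document a pre-condition, a loop invariant and a post-condition.
    def sum_up_to(n):
        assert n >= 0                          # pre-condition
        i, total = 0, 0
        while i < n:
            assert total == i * (i + 1) // 2   # loop invariant
            i += 1
            total += i
        assert total == n * (n + 1) // 2       # post-condition
        return total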
Syntax
Programming languages, like all other natural or formal languages,
have a syntax and a semantics. Their syntax
describes the collection of legal programs by means of a set of rules
that allow one to construct new syntactically correct programs from
smaller ones. This set of program formation rules gives a
formal definition of the syntax of a programming
language. Such descriptions are now quite common, and are often given
in BNF
notation.
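As a tiny, hedged illustration (of my own devising, not taken from the
textbook), the following Python fragment recognizes the strings generated by a
two-rule BNF-style grammar for simple expressions; the grammar and all the
names in the code are purely illustrative.

    # A sketch of a recogniser for the toy BNF grammar
    #   <expr> ::= <term> | <term> "+" <expr>
    #   <term> ::= "x"    | "(" <expr> ")"
    # (Grammar and names are illustrative, not from the course material.)
    def matches_expr(s):
        pos = 0

        def expr():
            nonlocal pos
            term()
            if pos < len(s) and s[pos] == "+":
                pos += 1
                expr()

        def term():
            nonlocal pos
            if pos < len(s) and s[pos] == "x":
                pos += 1
            elif pos < len(s) and s[pos] == "(":
                pos += 1
                expr()
                if pos < len(s) and s[pos] == ")":
                    pos += 1
                else:
                    raise ValueError("expected ')'")
            else:
                raise ValueError("expected 'x' or '('")

        try:
            expr()
            return pos == len(s)   # the whole string must be consumed
        except ValueError:
            return False

    # matches_expr("x+(x+x)") is True; matches_expr("x+") is False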
The development of high-level programming languages --- like FORTRAN,
COBOL and LISP --- was one of the major advances in Computer Science
in the 1950s. These languages allowed programmers to specify commands
in mnemonic code and with high-level constructs such as loops, arrays
and procedures. With the development of these languages, and of the
more sophisticated ones that followed, it became important to
understand their expressiveness (that is, what programs can be written
using them) as well as the characteristics of the simplest computing
machines that can translate them into machine language. This brought
about the study of formal languages and the automata that recognize
them. The goal of the first part of the course is to introduce some of
these classes of formal languages, their grammars, and the simplest
machines that recognize them.
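As a concrete foretaste, here is a minimal Python sketch (my own illustration,
not code from the course) of the simplest such machine, a deterministic finite
automaton, encoded as a transition table; this particular automaton accepts
exactly the binary strings containing an even number of 1s.

    # A minimal sketch of a deterministic finite automaton given as a
    # transition table; it accepts the binary strings with an even number
    # of 1s.  (The encoding is an illustrative choice.)
    dfa_delta = {
        ("even", "0"): "even", ("even", "1"): "odd",
        ("odd",  "0"): "odd",  ("odd",  "1"): "even",
    }
    dfa_start, dfa_accepting = "even", {"even"}

    def dfa_accepts(word):
        state = dfa_start
        for symbol in word:
            state = dfa_delta[(state, symbol)]
        return state in dfa_accepting

    # dfa_accepts("1001") is True; dfa_accepts("1011") is False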
Semantics
To quote from the introduction of Matthew
Hennessy's book The Semantics of Programming Languages: an
Elementary Introduction using Structural Operational Semantics
(John Wiley and Sons, New York, N.Y., 1990):
It is not possible to have a true understanding of a programming
language without a mental model of its semantics, i.e., "how the
language works".
Programmers usually develop this mental picture by day-to-day use of
the language via a compiler and an interpreter. This empirically
obtained knowledge of "what programs do" is imprecise, haphazard and
hard, if not impossible, to work with.
In the second part of this course, we shall see that the mental
picture of "how programs should work" can be developed and understood
in a machine independent way by using the tools of the formal
semantics of programming languages. More specifically, we shall
focus on the techniques of Structural Operational Semantics (SOS), and
will have a brief look at Axiomatic Semantics.
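To give an early hint of what an operational description looks like in
practice, here is a hedged Python sketch of a small-step evaluator (one
transition at a time) for the arithmetic expressions of a While-like language;
the tuple encoding of expressions is my own illustrative choice, not the
notation we shall use in the course.

    # A hedged sketch of small-step operational semantics for arithmetic
    # expressions represented as nested tuples, e.g. ("+", ("num", 1), ("num", 2)).
    # The encoding is illustrative only.
    def step(expr):
        # One transition expr -> expr': reduce the left operand first,
        # then the right operand, then apply the operator to two values.
        op, left, right = expr
        if left[0] != "num":
            return (op, step(left), right)
        if right[0] != "num":
            return (op, left, step(right))
        if op == "+":
            return ("num", left[1] + right[1])
        if op == "*":
            return ("num", left[1] * right[1])
        raise ValueError("unknown operator")

    def evaluate(expr):
        # Iterate the one-step relation until a value ("num", n) is reached.
        while expr[0] != "num":
            expr = step(expr)
        return expr[1]

    # evaluate(("+", ("num", 1), ("*", ("num", 2), ("num", 3)))) returns 7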
Textbooks
The main textbook for the syntax part of the course is Introduction to
the Theory of Computation by Michael Sipser. The author
maintains a list of errata
for the current edition of this book. This book is quite expensive,
but you will also use it as your main textbook in the Computability
part of the Computability & Network Technology course.
For the semantics part of the course, I plan to use a textbook that
is available online, namely:
Whenever necessary I shall supply notes and chapters from other
references to supplement the material in these books.
Prerequisites
The course assumes knowledge of basic discrete mathematics, such as
that which you have been using in the course Matematik 2B
Forår 2001 and in Algorithms and Data
Structures. Those of you who would like to refresh your memory,
or who are not confident about your understanding of the basic
mathematics used in this course, are strongly encouraged to read,
e.g., Chapter 0 of Sipser's book and/or Chapter 1 of the book
Elements of the Theory of Computation (2nd edition) by Lewis
and Papadimitriou. As
practice makes perfect, I also recommend that you work out some of the
exercises to be found in those references. Try, for example, exercises
0.2-0.5, 0.7, 0.10-0.11 in Sipser's book and/or exercises 1.1.1-1.1.4,
1.5.1-1.5.2 in the book by Lewis and Papadimitriou.
Note: The above exercises are not really part of this
course. How many you attempt to solve depends only upon your level of
confidence with the mathematics that will be used as the course
progresses.
Course Material and Lecture Plan
The course will consist of a series of 15 lectures, which will take
place at 10:15am on Mondays and Thursdays during term time. The
location of the lectures will be room A4-106. Look at the semester calendar
for updated information on the schedule for this (and all other)
courses.
Our working language for the course
will be English.
Below, I am attempting to give you an a priori description of parts of
the plan of the course. You are invited to take it with a pinch of
salt. The topic of each actual lecture may vary, depending on how fast
the course progresses, and on how receptive you are to the topics of
the course.
Note! There will be no lectures in the period March 6-12!
Syntax
Lecture 1 (Thursday, 6 February 2003):
Introduction to the course; Mathematical Preliminaries:
Strings and languages; A taste of finite automata.
Lecture 2 (Monday, 10 February
2003): Nondeterministic automata and their relationships to
deterministic ones.
Lecture 3 (Thursday, 13 February
2003): Regular expressions and finite automata.
Lecture 4 (Monday, 17 February 2003):
Languages that are not regular.
Lecture 5 (Thursday, 20 February 2003):
Context-free grammars.
Lecture 6 (Monday, 24 February
2003): An introduction to pushdown automata, and their
relationships to context-free grammars.
Lecture 7 (Thursday, 27 February
2003): Pushdown automata, and their relationships to
context-free grammars (continued).
Lecture 8 (Monday, 3 March
2003): Languages that are not context-free.
Semantics
Lecture 9 (Thursday, 13 March
2003): Introduction to the formal semantics of programming
languages.
Lecture 10 (Monday, 17 March
2003): Natural and structural operational semantics for the
language While.
Lecture 11 (Thursday, 20 March
2003): Structural operational semantics for the
language While.
Lecture 12 (Monday, 24 March
2003): Equivalence between programs. Connections between
natural and structural operational semantics for the language
While.
Lecture 13 (Thursday, 27 March
2003): Operational semantics for extensions of the language
While.
Lecture 14 (Monday, 31 March
2003): A taste of program verification - partial and total
correctness of programs.
Lecture 15 (Thursday, 3 April
2003): A taste of a formal system for partial correctness
assertions. Discussion of exam questions.
Exercise Classes and Advice on Modus Operandi
As usual, each lecture will have an associated exercise class. The
exercises mentioned on the web page for a lecture should be solved
during the exercise class that precedes the following lecture. For
example, the exercises listed on the web page for lecture 1 should be
solved before lecture 2. There will be no exercise class preceding the
first lecture.
In general, I shall give you more exercises than you might conceivably
be able to solve during one exercise class. The exercises marked with
a star are those that I consider more important for your
understanding. All the exercises will be "doable", and working them
out will greatly increase your understanding of the topics covered in
the course. The best advice I can give you is to work them all out by
yourselves, and to make sure you understand the solutions if the other
members of your group (or me) gave you the solutions on a golden
plate. Above all, don't give up if you cannot find the key to the
solutions right away. Problem solving is often a matter of mental
stamina as much as creativity.
For further advice on how to learn this material (and, in fact, the
material in any course) I strongly recommend that you look at the
slides for the talk Psychologists'
tips on how and how not to learn by Wilfrid Hodges. In
particular, try to reflect upon the hints he gives, and ask yourselves
how much you practice what he preaches.
Exam
The exam will be individual and oral. Each student will be examined
for at most 20 minutes on one of the topics listed in the exam
prospectus that will be distributed during the last lecture. The topic
will be drawn at random by each student. The student will be
informed of his/her mark (on the standard 13-scale) after the oral
exam. The student can choose to take the exam in either Danish or
English.
Preliminary information on the details of the course exam is now available.
The Java Computability Toolkit
The Java Computability Toolkit (JCT) contains a pair of graphical
environments for creating and simulating NFAs, DFAs, and high level
Turing Machines.
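If you are curious about what such a simulator does behind the scenes, the
following Python sketch (my own illustration, not code from the JCT) simulates
an NFA without epsilon-moves by tracking the set of states it could currently
be in; the example automaton accepts the binary strings whose second-to-last
symbol is a 1.

    # A hedged sketch of NFA simulation: track the set of states the
    # automaton could be in.  This NFA accepts binary strings whose
    # second-to-last symbol is 1.  (Illustrative only, not from the JCT.)
    nfa_delta = {
        ("q0", "0"): {"q0"},
        ("q0", "1"): {"q0", "q1"},
        ("q1", "0"): {"q2"},
        ("q1", "1"): {"q2"},
        # q2 has no outgoing transitions
    }
    nfa_start, nfa_accepting = "q0", {"q2"}

    def nfa_accepts(word):
        current = {nfa_start}
        for symbol in word:
            current = set().union(*(nfa_delta.get((q, symbol), set())
                                    for q in current))
        return bool(current & nfa_accepting)

    # nfa_accepts("0110") is True; nfa_accepts("0101") is False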
For an in-house simulator for Finite Automata, look at the DAT2 project developed in 1999
(and upgraded last year) by some of your colleagues!
For an in-house simulator for Pushdown Automata, look at the DAT2 project developed last year by some of your colleagues!
LETOS -- A Lightweight Execution Tool for Operational Semantics
This is a lightweight tool proposed by Pieter Hartel to aid in the
development of operational semantics. To use letos, an operational
semantics must be expressed in its meta-language, which itself is a
superset of Miranda. The letos compiler is smaller than comparable
tools, yet letos is powerful enough to support publication quality
rendering using LaTeX, fast enough to provide competitive execution
and tracing using Miranda or Haskell, and versatile enough to support
derivation tree browsing using Netscape. Letos has been applied to a
Java Secure Processor, which is a version of the Java Virtual Machine
intended to run on smart cards. Letos has also been used with subsets
of various programming languages.
The latest version of letos is described in detail in a paper, which is included with the program and some sample inputs in a
gzipped tar file.
This page will be actively modified over the spring term
2003, and is currently undergoing heavy restructuring. You are invited
to check it regularly during the spring term. The page is dormant in
the autumn term.