Abstracts of talks at Festival Workshop in Foundations and Computations
Heriot-Watt University, Edinburgh
Sunday 16 July-Tuesday 18 July 2000
http://www.cedar-forest.org/forest/events/festival/workshop1/abstracts.html
http://www.cedar-forest.org/forest/events/festival/workshop1/
ULTRA group: Useful Logics, Types, Rewriting, and Applications
UKII: UK Institute of Informatics, UK
The list of abstracts is as follows:
Laura Crosilla
(Leeds, UK):
A Weak Constructive Set Theory with Inaccessible Sets
See this extended abstract
Gilles Dowek (INRIA-Rocquencourt, FR):
About Folding-Unfolding Cuts
In some theories (e.g. set theory, the Stratified Foundations, ...)
the usual notion of proof is extended with folding and unfolding
steps; a sequence of such steps is called a folding-unfolding cut, and
a proof containing such a cut can be reduced. We show that this
notion of folding-unfolding cut is an instance of the more general
notion of "cut modulo" and that cut elimination modulo for these
theories implies cut elimination with folding-unfolding cuts.
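As a hedged illustration (an example not spelled out in the abstract): in a set theory with comprehension, a proof may unfold $t \in \{y \mid P(y)\}$ into $P(t)$, or fold $P(t)$ back into $t \in \{y \mid P(y)\}$; a fold immediately followed by the corresponding unfold is a folding-unfolding cut, and reducing the proof cancels the two steps.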
Cut elimination for a theory modulo can be proved by constructing a
so-called pre-model. We show that the notion of normalization model,
introduced by Crabbé to prove cut elimination for the Stratified
Foundations, is more or less an instance of the notion of
pre-model. The techniques developed by Crabbé to construct a
normalization model for the Stratified Foundations can be adapted to
construct a pre-model of this theory. This makes it possible to give a
simpler and more modular cut elimination proof for the Stratified
Foundations.
Finally, we discuss the generality of this pre-model construction method.
Jan van Eijck (University of Amsterdam, NL):
Dynamic First Order Logic
Dynamic First Order Logic results from interpreting quantification
over a variable v as change of valuation over the v position,
conjunction as sequential composition, disjunction as
nondeterministic choice, and negation as (negated) test for
continuation. We present a tableau style calculus for DFOL with
explicit (simultaneous) substitution, prove its soundness and
completeness, and point out its relevance for programming with
dynamic first order logic, by interpreting global variable declarations
as dynamic existential quantifiers and local variable declarations
as dynamic existential quantifiers in the immediate scope of
a double negation to block off the dynamic effects.
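For readers new to the dynamic setting, the standard relational semantics (a sketch in the style of dynamic predicate logic; the talk's exact formulation may differ) interprets formulas as relations between valuations $\alpha, \alpha'$:
  $[\![\exists v]\!] = \{(\alpha,\alpha') : \alpha'(w) = \alpha(w) \mbox{ for all } w \neq v\}$
  $[\![\phi ; \psi]\!] = [\![\phi]\!] \circ [\![\psi]\!]$ (sequential composition)
  $[\![\phi \vee \psi]\!] = [\![\phi]\!] \cup [\![\psi]\!]$ (nondeterministic choice)
  $[\![\neg\phi]\!] = \{(\alpha,\alpha) : \mbox{no } \alpha' \mbox{ has } (\alpha,\alpha') \in [\![\phi]\!]\}$ (test)
Under this reading $\neg\neg\exists v$ tests that a witness for $v$ exists without changing the valuation, which is how the double negation blocks off the dynamic effects.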
Massimo Felici (University of Edinburgh, UK):
A tableau system for a fragment of hyperset theory
Non-well-founded phenomena arise in a variety
of ways in informatics. This talk develops a tableau system
for hyperset theory, which is based on non-well-founded set theory.
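A standard example of such a phenomenon (ours, not the abstract's): under Aczel's anti-foundation axiom the equation $x = \{x\}$ has a unique solution, a set $\Omega$ that is its own sole member; streams, circular data structures and bisimulation-based process semantics are modelled along the same lines.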
See this extended abstract.
Jacques Fleuriot (University of Edinburgh, UK):
Automating Newton's calculus in Isabelle
In this talk, I shall give an overview of the construction and use of
the hyperreals in the theorem prover Isabelle within the framework of
higher order logic (HOL).
The theory, which includes infinitesimals and infinite numbers, is
based on the hyperreal number system developed by Abraham Robinson in
his Nonstandard Analysis (NSA). I shall first outline the ultrapower
construction of the hyperreals in Isabelle. This was carried out
strictly through the use of definitions to ensure that the foundations
of NSA in the prover are sound. The mechanization has required
constructing the various number systems leading to the reals, and then
going one step further to define the hyperreals using a mechanized
theory of filters and ultrafilters.
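In outline (the standard ultrapower construction, sketched here for orientation): fix a free ultrafilter $U$ on the naturals; hyperreals are sequences of reals identified when they agree on a set of indices in $U$, i.e. $(a_n) \approx (b_n)$ iff $\{n : a_n = b_n\} \in U$, with operations and order defined pointwise modulo $U$. The class of $(1, 1/2, 1/3, \ldots)$ is then a positive infinitesimal.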
I shall next describe the use of the new types of numbers and new
relations on them to formalize familiar concepts from analysis such
as sequences, limits, continuous functions, derivatives, power
series, and transcendental functions. I will point out the merits
of the nonstandard approach with respect to the practice of
analysis and mechanical theorem proving. In particular, I will
argue that the overall result is an intuitive, yet rigorous,
development of real analysis, with a relatively high degree of
proof automation in many cases.
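For example (standard NSA, not specific to the talk), the derivative admits the limit-free characterization $f'(x) = \mathrm{st}((f(x+\varepsilon) - f(x))/\varepsilon)$ for any nonzero infinitesimal $\varepsilon$, where $\mathrm{st}$ is the standard-part function; proofs about derivatives then proceed by algebra on infinitesimals.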
Therese Hardin (Paris 6 and INRIA-Rocquencourt, FR):
The calculus of contexts of lambda sigma
The notion of context, i.e. a lambda-term with a hole, is rather
informally defined in textbooks on lambda-calculi, where it
allows a kind of operational view of the congruence properties of the
reduction. A more formal treatment of contexts was implicit in the
pioneering work on higher-order unification, which is unification of
contexts. Recently, the notion of context has been used to represent
modules, environments, etc. This talk first recalls the coding of
contexts in lambda-sigma as first-order terms with first-order term
variables representing the holes. Then, some comparisons with other
approaches are made.
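To see why hole filling resists an informal treatment (our illustration): filling the context $C[\;] = \lambda x.[\;]$ with the term $x$ yields $\lambda x.x$, deliberately capturing the bound variable, so hole filling is not capture-avoiding substitution. Coding the hole as a first-order variable in lambda-sigma, where binding is handled by explicit substitutions over de Bruijn indices, turns hole filling into plain first-order replacement.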
Roger Hindley (University of Swansea, UK):
The birth of lambda-calculus and combinatory logic
This talk will be about the early days of the concepts of
function-abstraction and application, leading to lambda-calculus and
combinatory logic. Both these systems achieved very satisfying successes in
their early years before 1940, which were not surpassed until their
application to computer science in the 1970s. The talk will be based on
an article being written with Felice Cardone for a forthcoming history of
modern logic to be edited by D. van Dalen and J. Dawson. It will include a
test which will distinguish between computer scientists who are genuine and
those who are just logicians looking for money.
Yoko Motohama (Torino University, Italy):
Compositional Characterizations of lambda terms using intersection types
See this extended abstract
Cesar Munoz (ICASE, NASA, USA):
Aircraft Trajectory Modeling and Analysis:
A challenge to Formal Methods
The Airborne Information for Lateral Spacing (AILS) program at NASA aims
at giving pilots the information necessary to make independent
approaches to parallel runways with spacing down to 2500 feet in
Instrument Meteorological Conditions. The AILS concept consists of
accurate traffic information visible on the navigation display and an
alerting algorithm which warns
the crew when one of the aircraft involved in a parallel landing is
deviating from its intended flight path. In this talk we present a
formal model of aircraft approaches to parallel runways. Based on this
model, we analyze the alerting algorithm with the objective of
verifying its correctness. The formalization is conducted in the general
verification system PVS.
Alan Mycroft (Cambridge University and AT&T Labs, UK):
Type-Based Decompilation
We describe a system which decompiles (reverse
engineers) C programs from target machine code by type-inference
techniques. This extends recent trends in the converse process of
compiling high-level languages whereby type information is preserved
during compilation. The algorithms remain independent of the
particular architecture by virtue of treating target instructions as
register-transfer specifications. Target code expressed in such RTL
form is then transformed into SSA form (undoing register colouring
etc.); this then generates a set of type constraints. Iteration and
recursion over data structures cause synthesis of appropriate
recursive C structs; this is triggered by and resolves occurs-check
constraint violations. Other constraint violations are resolved by C's
casts and unions. In the limit we use heuristics to select between
equally suitable C code---a good GUI would clearly facilitate its
professional use.
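As a small illustration of the occurs-check point (our sketch, not output of the actual system): a loop that repeatedly loads a register from the address the register itself holds induces a constraint of the form t = ptr(struct { t; ... }), which is resolved by synthesising a recursive struct, e.g.:

  #include <stddef.h>

  /* Hypothetical decompiler output: the recursive constraint
     t = ptr(struct { t; int }) arising from a list traversal is
     resolved as a named recursive struct instead of failing the
     occurs check. */
  struct t1 {
      struct t1 *next;   /* field synthesised from the looping pointer load */
      int        val;    /* field synthesised from an integer load */
  };

  int sum(struct t1 *p)
  {
      int s = 0;
      while (p != NULL) {    /* loop structure recovered from the RTL/SSA form */
          s += p->val;
          p  = p->next;      /* the load whose type constraint is recursive */
      }
      return s;
  }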
Gopalan Nadathur (University of Minnesota, USA):
Correspondences between Classical, Intuitionistic and Uniform
Provability and their Impact on Proof Search
Based on an analysis of the inference rules used, it is
possible to characterize the situations in which classical and
intuitionistic provability coincide. This observation is useful, for
instance, in sanctioning the use of classical principles in
determining intuitionistic derivability for formulas satisfying
certain syntactic restrictions. It is similarly possible to relate
classical and intuitionistic derivability to uniform provability, a
constrained form of intuitionistic provability that embodies a special
notion of goal-directedness. Uniform provability has been employed in
describing abstract logic programming languages and this analysis
permits the richest such languages within the relevant derivation
systems to be identified. Even when the other provability notions
deviate from uniform derivability, it is often possible to reduce them
to uniform provability through the addition of the negation of the
formula to be proved to the assumption set. Further, this enhancement
of the assumptions can be factored into certain derived rules in a
sequent calculus setting. An interesting aspect of these observations
is that they lead directly to a proof procedure for classical logic
that incorporates into a single framework previously proposed
mechanisms for dealing with disjunctive information in assumptions and
for handling hypotheticals.
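A concrete instance (our illustration): $p \vee \neg p$ is classically provable but has no uniform proof, since goal-directedness would require proving one of the disjuncts outright; adding the negation $\neg(p \vee \neg p)$ to the assumptions restores provability, because a contradiction then follows by purely intuitionistic reasoning.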
In this talk we will explore the correspondences alluded to
above between the different derivability notions and we will also
sketch their relevance to proof search.
Catherine Piliere (INRIA, FR):
Guarded Exception Handling: Some results
See this extended abstract
Sarah Rees (Newcastle, UK):
Formal languages applied to group theory
For a group defined abstractly as the set of all products (strings) of
a finite set of symbols, subject to a finite set of equations
holding between products, some questions about the group may be
theoretically undecidable. When they are decidable, an analysis
of their complexity may be of great interest. Examples of questions of
this type
include the word problem (`Is a given product of symbols equal to
the identity element?'), the conjugacy problem (`Are two given
products u and v related by an equation ux = xv for some x?') and the
isomorphism problem (`Are two given groups G and H isomorphic, i.e.
abstractly the same, but differently described?').
The formal language hierarchy provides one way of describing the
complexity of a decision problem. For the word problem, the regular
languages define the complexity for a very natural family of groups,
those with finitely many elements, and the context-free languages a
similarly easily defined family, the virtually free groups. These
facts are well known, but much less is known about groups whose
word problem is defined by language families further up the hierarchy.
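A concrete case (our example): for the free group on one generator, with symbols $a$ and $a^{-1}$, a product equals the identity exactly when it contains as many $a$'s as $a^{-1}$'s; this word problem is context-free but not regular, matching the fact that the group is (virtually) free but not finite.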
Regular languages are also naturally associated with another
problem for groups, that of finding a good normal form, i.e. a set of
strings which represents the group's elements without
too much repetition and displays well the algebra relating the elements.
Groups within the family of automatic groups can be `combed' by regular
languages. A defining geometric property of a combing can also be
expressed in terms of regular languages. Various non-automatic groups
can be combed by languages further up the language hierarchy, such
as the indexed and real-time languages, but less is known about these
more general combings. It is known that groups with regular synchronous
combings have a deterministic context-sensitive word problem, while any group
with a combing has a nondeterministic context-sensitive word problem.
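For instance (our example): $\mathbb{Z}^2$, generated by commuting elements $a$ and $b$, is combed by the regular language of normal forms $a^m b^n$ with $m, n \in \mathbb{Z}$ (negative powers spelled with the inverse letters), and is therefore automatic.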
This talk will survey what is known about the language complexity of the
word problem and of combings, and about connections between these two
problems.
Jonathan Seldin
(Lethbridge, Canada):
Extensional Set Equality in the Calculus of Constructions
I developed a significant part of set
theory in the calculus of constructions. This followed a
suggestion of Huet.
Berardi (personal communication) has informed me that set theory has not been
implemented this way in Coq because contradictions follow from the axiom of
extensionality. In Coq set theory has been implemented with a new type for
sets. None of the consistency results of
\cite{seldin:PTCC} apply to this implementation.
It is hardly surprising that contradictions follow from the axiom of
extensionality, for the equality involved, which is
Leibniz equality, is intensional rather than extensional. The
language of
the calculus of constructions is rich enough to provide predicates that can
distinguish different equivalent propositions.
In ordinary axiomatic set theory, set equality behaves like Leibniz equality in
that equals may always be replaced by equals. This is clearly untrue in the
calculus of constructions with $\mathsf{SetQ}$. However, there is a class of
properties with respect to which equals under $\mathsf{SetQ}$ can be
replaced by
equals. This is the class of \emph{extensional properties}.
The class of extensional properties can be shown to parallel the definition of
well formed formulas in first-order axiomatic set theory.
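For orientation, the standard definitions run roughly as follows (a sketch; the paper's notation may differ): Leibniz equality is $x =_L y := \forall P.\, P x \to P y$, which licenses replacement in every context, while $\mathsf{SetQ}$ is the extensional equality $A \,\mathsf{SetQ}\, B := \forall x. (x \in A \leftrightarrow x \in B)$; a property $Q$ is extensional when $A \,\mathsf{SetQ}\, B$ and $Q A$ imply $Q B$, i.e. exactly when replacement holds for it.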
As pointed out in \cite[Remark 17]{seldin:PTCC}, while we can prove most of the
axioms of constructive set theory, we cannot prove the axioms of power set and
$\in$-induction. Furthermore, we cannot expect to get all of set theory
this way,
because the consistency of this implementation follows from the strong
normalization theorem for the calculus of constructions. If a complete set
theory
satisfying these axioms is desired, then the implementation given here will not
do. But the need for a different (and, presumably, stronger) implementation
does not follow from any problem with the axiom of extensionality.
See this extended abstract for further details.
Eleni Spiliopoulou (Bristol, UK):
The Brisk machine: the next step in the execution
of functional languages
Here is the abstract.
Iain Stewart (University of Leicester, UK):
A programming approach to descriptive complexity theory
Finite model theory is the study of what can and cannot be said
about classes of finite structures using different logics. A substantial
sub-area of finite model theory is descriptive complexity theory;
essentially, the study of the expressibility of logics on classes of finite
structures in relation to computational complexity. Perhaps the seminal
result in descriptive complexity theory is Fagin's Theorem which states that
a problem is in the complexity class NP if, and only if, it can be defined
in existential second-order logic. Most logics studied within descriptive
complexity theory are traditional in the sense that they are extensions of
first-order logic by, for example, inductive operators, second-order
constructs or (sequences of) Lindström quantifiers. However, one can take a
much more computational view and consider program schemes and the (classes
of) finite structures they accept. Program schemes are much closer to the
notion of a high-level program than are logical formulae but are still
amenable to logical analysis. Program schemes were extensively studied in
the seventies and eighties, but their study has since declined somewhat. In this talk, I
shall introduce the rudiments of descriptive complexity theory and explain
how the study of program schemes can yield new finite-model theoretic
results and techniques.
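To make Fagin's Theorem concrete (a standard textbook example, not from the talk): graph 3-colourability is in NP, and correspondingly is defined by the existential second-order sentence $\exists R \exists G \exists B\, [\forall x (Rx \vee Gx \vee Bx) \wedge \forall x \forall y (E(x,y) \to \neg((Rx \wedge Ry) \vee (Gx \wedge Gy) \vee (Bx \wedge By)))]$.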
Maintained by
Fairouz Kamareddine