Position Statement
This workshop will take place June 14-15, 1996, at MIT. Participants
are asked to provide a position statement and to make it available
on the internet. Below, my current position regarding the field of
concurrency can be found.
Concurrency theory will set standards for description and analysis
of software
Concurrency theory and practice have all the properties of a slowly
but steadily progressing field. What was felt to be a problem
yesterday is being addressed today and will be solved tomorrow.
The field of concurrency has the capability of setting the
standards for the description and analysis of all distributed systems
in the future, be it a new leader election algorithm or an airport
control system. The future relationship between specification/analysis
and realisation will be the same as the current relationship
between higher-level programming and assembly programming. Accordingly,
omitting specification and analysis before realisation will generally
be seen as a clear act of unprofessionalism.
How can we achieve this situation sooner? It seems that there is only
one major answer: apply the theory in order to improve its applicability.
The only way to do this is to become alternately interested in an
application domain and in improving concurrency theory.
Important applications are:
Specify and verify (new) distributed algorithms.
Distributed algorithms are hard to get correct. It has been
stated that of all new algorithms presented, at least 50% are
erroneous. By using methods from the field of concurrency we
should get this ratio down.
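As a minimal illustrative sketch (in Python; the process model, the names
and the property checked are illustrative assumptions, not a prescribed
method), the fragment below explores every interleaving of a small
Chang-Roberts style leader-election ring and checks that at most one leader
is ever elected and that every completed run elects the node with the
maximum identity:

  from collections import deque

  def explore(ids):
      """Explore all interleavings of a Chang-Roberts style leader-election
      ring with node identities `ids` (in ring order); check that at most one
      node is ever elected and every terminal state elects the maximum."""
      n = len(ids)
      # A state is (channels, leaders); channels[j] is the FIFO queue of node j.
      init = (tuple((ids[(j - 1) % n],) for j in range(n)), frozenset())
      seen, todo = {init}, deque([init])
      while todo:
          channels, leaders = todo.popleft()
          assert len(leaders) <= 1, "two leaders elected"
          if not any(channels):                      # terminal: no messages left
              assert leaders == {max(ids)}, "wrong or missing leader"
              continue
          for j in range(n):
              if not channels[j]:
                  continue
              m = channels[j][0]                     # deliver oldest message at node j
              succ_channels = list(channels)
              succ_channels[j] = channels[j][1:]
              succ_leaders = leaders
              if m > ids[j]:                         # forward larger identities
                  k = (j + 1) % n
                  succ_channels[k] = succ_channels[k] + (m,)
              elif m == ids[j]:                      # own identity returned: elected
                  succ_leaders = leaders | {m}
              # smaller identities are silently discarded
              succ = (tuple(succ_channels), succ_leaders)
              if succ not in seen:
                  seen.add(succ)
                  todo.append(succ)
      return len(seen)

  print(explore((3, 1, 4, 2)), "states explored; property holds")

Such a brute-force exploration is of course only feasible for small
instances, which is exactly where the problem of scale discussed below
enters.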
Specify and analyse (new) (industrial) products and systems.
From my own experience I must conclude that, in general, no
newly developed industrial product involving some form of
more advanced communication is correct. The situation is
even worse: as an appropriate and sufficiently precise design
is often missing, the technicians involved tend to have different
conceptions of the product. These differences are only somewhat
straightened out during the debugging phase.
But it is often hard to get truly
involved in industrial development. Generally, industrial
engineers consider themselves very capable. They do not need
intruders with new hocus pocus causing delays and telling
all competitors what they are up to. I think it is superfluous
to say that many products never reach the market
because they are `too complex', `the ROM is too small to contain
the necessary software', `suppliers delivered components
with inadequate interfaces', `remaining bugs could not
be located and repaired in time', etc.
Getting involved in standardization committees.
Is there any (international) standard for communication that
unambiguously and understandably
describes the sequence in which messages must be
exchanged? Is there any standard that did not need essential
repairs? If so, for every correct standard there are many that
are incorrect or incomprehensible. By getting involved in
standardization, we may reduce the number of mistakes, increase
the precision of the description and most likely introduce
essential simplifications, reducing code length and increasing
performance. It is very hard work, though, taking many years.
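By way of illustration only (a toy exchange invented for this sketch, not
taken from any existing standard), the fragment below shows how the
permitted order of messages in a small request/acknowledge exchange can be
written down as an explicit automaton, against which a trace can then be
checked mechanically rather than argued over in prose:

  # Toy protocol: the permitted message order as an explicit automaton
  # mapping (state, message) -> next state, instead of descriptive prose.
  PROTOCOL = {
      ("closed",     "connect_req"): "connecting",
      ("connecting", "connect_ack"): "open",
      ("open",       "data"):        "open",
      ("open",       "close_req"):   "closing",
      ("closing",    "close_ack"):   "closed",
  }

  def conforms(trace, state="closed"):
      """Return True iff the message sequence `trace` is permitted and
      leaves the exchange properly closed."""
      for msg in trace:
          if (state, msg) not in PROTOCOL:
              return False                    # message arrives in the wrong order
          state = PROTOCOL[(state, msg)]
      return state == "closed"

  assert conforms(["connect_req", "connect_ack", "data", "close_req", "close_ack"])
  assert not conforms(["connect_req", "data"])  # data before the connection is open

However small, such a table leaves no room for the divergent
interpretations described above.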
Note that getting involved in such applications is hard, and the benefits
for individuals are relatively small.
On the one hand, concurrency theory is not
sufficiently strong to play first fiddle in most fields of
application, so its results are always considered auxiliary. On
the other hand, there is
a clear feeling that proper scientific contributions are theoretical
extensions. Application of the theory is often seen as a second-class
activity, and referees judge accordingly.
What are the problems when faced with concrete applications?
Problem of scale.
Applications can always be made larger, and often (but certainly
not always) they are too large to be analysed appropriately.
The solution to this problem will come from two sides. On the
one hand, techniques from the area of symbolic methods (of which
BDD technology forms only a tiny part) will greatly push back the
technology barriers. On the other hand, the use of specification
techniques during design will reduce the complexity of many systems
while providing improved functionality. Such systems will turn out
to be much easier to analyse.
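The contrast can be caricatured as follows (a deliberately naive Python
sketch; real symbolic methods such as BDDs are far more subtle): the
reachable states of sixteen independent on/off components can be stored one
by one, or described by a single small structure whose size grows only
linearly with the number of components:

  from itertools import product

  N = 16                                       # number of independent on/off components

  # Explicit-state view: every reachable global state is stored separately.
  explicit = set(product((0, 1), repeat=N))    # 2**16 = 65536 states
  print(len(explicit), "states stored explicitly")

  # Caricature of the symbolic view: the same set of states is kept as one
  # small per-component description, whose size grows linearly in N.
  symbolic = [(0, 1)] * N

  def denoted(description):
      """Number of global states a per-component description stands for."""
      count = 1
      for values in description:
          count *= len(values)
      return count

  print(denoted(symbolic), "states described by", len(symbolic), "entries")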
Problem of choice of formalism.
There are many specification and analysis formalisms around. All
have their advantages and disadvantages, but quite often these
hardly matter. There is no consensus
on the appropriateness of a particular formalism for a particular
task. This especially holds for important features such as time,
probability and data. Contrary to the previous point, this will
not be a matter of steady progress, but rather a matter of
formalisms dying out. The winner will not necessarily be the
best.
Problem of tools.
Tools are often buggy, hard to use and not available for the
appropriate computer platform. The semantics of their
functionality is sometimes not obvious. Often they cannot
cope with the scale of the application under consideration;
even if they can handle the problem, they become extremely slow.
Steady progress will improve the situation on all these points,
but I have found problems with tools to be among the most annoying
that I know of. On the other hand, appropriate tools (even a syntax
or static semantics checker) can be extremely useful.
Problem of education.
`There is no use in providing formal specifications as we cannot
read them' is an often-used argument against making things
precise. This argument has two origins. Precise descriptions
are always hard to read; they require substantial effort per
line. Moreover, changing to new languages has always met with
reluctance, but it is in no way impossible. Many formalisms do
lack appropriate reading material, and this should be provided.