Monday, October 12, 2009

IFIP 1.8 workshop on FORMAL METHODS FOR EMBEDDED SYSTEMS

Here is the call for participation for an event organized by the WG. Do take part if you happen to be in Eindhoven for FM week!


========================================
                       CALL FOR PARTICIPATION
========================================
      IFIP 1.8 workshop on FORMAL METHODS FOR EMBEDDED SYSTEMS

               http://www.cse.unsw.edu.au/~rvg/FMES/

           Eindhoven, The Netherlands, November 5, 2009
======================================================================

This IFIP 1.8 workshop is organised as part of the Formal Methods Week,
which takes place in Eindhoven from November 2 until November 6, 2009.
The goal of the workshop is to summarise research from different areas
of formal methods targeted to embedded systems, and to promote the use
of formal methods in different applications and in the engineering
discipline for embedded systems.

PROGRAMME:
----------
 8:30  Registration and coffee
 8:50  Opening

 9:00  Bert van Beek -
         The Compositional Interchange Format: concepts, formal basis,
         and applications
 9:45  Holger Hermanns -
         Synchronous vs. Asynchronous Performance Models of Industrial
         Networks on Chip Designs

 10:30 Coffee break

 11:00 Catuscia Palamidessi -
         Synchronization in the pi-calculus
 11:45 Joost-Pieter Katoen -
         Analysis and Semantics of Extended AADL Models

 12:30 Lunch
 14:00 The End

For abstracts of the talks and further details about the workshop, please see
http://www.cse.unsw.edu.au/~rvg/FMES/.

REGISTRATION
------------
The registration fee for the workshop is 45 euros and covers coffee/tea
and lunch. You also need to register for FMweek, which costs an
additional 35 euros (administration costs). Please register via the
FMweek website: http://www.win.tue.nl/fmweek

WORKSHOP ORGANISERS:
---------------------
Rob van Glabbeek (National ICT Australia)
Ursula Goltz (Technical University Braunschweig, Germany)
Bas Luttik (Technische Universiteit Eindhoven, The Netherlands)
Uwe Nestmann (Technical University Berlin, Germany)

Thursday, September 25, 2008

Vardi Receives The Blaise Pascal Medal in Computer Science For 2008

I just learned that WG1.8 member Moshe Vardi is the recipient of the Blaise Pascal Medal in Computer Science for 2008 of the European Academy of Sciences. This is outstanding news for TCS as a whole and for our WG in particular.

The motivation for the prize reads:

In recognition of his outstanding contributions in several areas of computer science connected by their use of logic as an underlying methodology. His work has had fundamental and lasting impact on automatic verification, logic of knowledge, database theory, and finite-model theory.

Congrats to Moshe!

Tuesday, September 09, 2008

Changes at the helm of WG1.8 and plans for the near future

I apologize for not posting much on this blog of late. Life has been very hectic and the organization of ICALP 2008 took its toll on me, but that's no excuse for the silence.

Chairpersons of IFIP WGs are in service for a three-year period and may apply for reappointment for another three-year stint. My period of service is running out as we speak and, in the light of several other family- and work-related commitments, I think that it is best for the WG if I step down as chair now. The WG needs a more active and visible chairperson. Thanks a lot for your cooperation over the last three years. I enjoyed it a lot.

Wan Fokkink and Anna Ingolfsdottir will also step down from their positions as vice-chair and secretary of the WG. Thanks a lot to them for their sterling service.

The IFIP secretariat will organize an election that will allow us to decide on our new chairperson democratically. We are lucky to have a strong candidate for the role, namely Jos Baeten. Jos is the chairman of the Steering Committee for CONCUR and the Dutch national representative within TC1. If he is elected as chairperson of the WG, he will therefore ensure a seamless integration of our activities with those within the CONCUR conferences and within TC1.

If you would like to run for election, please drop me a line.

We are also planning several new strategic workshops under the auspices of the WG. We will organize a workshop affiliated with FM 2009, which will be held in Eindhoven. The workshop will be organized by Wan Fokkink and Uwe Nestmann, and its preliminary theme is related to the connections between the research carried out within the CONCUR community and the PODC community, with possible emphasis on the development of verification techniques and tools.

We are also thinking about organizing a workshop at CONCUR 2009 in Bologna. We already have some ideas for possible topics for the workshop, but we welcome your input for this and future workshops.

I also encourage you to use the blog for airing your opinions, questions and desires related to the activities of the WG.

I'll sign off now, since Tim Roughgarden is starting his invited talk at TCS 2008.

Monday, July 14, 2008

ICALP 2008

ICALP 2008 is now over. We had nearly 500 people attending the main conference and its affiliated events. It was a lot of work to organize such a large scientific meeting, but Anna, Magnús and I were very happy to see that people enjoyed their visit to Iceland and their stay in Reykjavík.

Here I will limit myself to giving a few remarks on the last three days of the ICALP conference.

On Wednesday, Bruno Courcelle (LaBRI, Université Bordeaux, France) delivered a plenary invited talk entitled Graph Structure and Monadic Second-order Logic: Language Theoretical Aspects. Courcelle is one of the 15 most cited French scientists and the most cited French computer scientist. (See http://www.labri.fr/perso/courcell/PrixCitations2004.doc.) In addition, he has become a member of the very prestigious Institut Universitaire de France (see http://iuf.amue.fr/author/bcourcelle/). Courcelle is one of the grand "old" men in theoretical computer science. In a career spanning about 35 years he has made fundamental contributions to the study of the logical, language-theoretic and algorithmic aspects of the theory of graphs. He is also one of the founders of, and main contributors to, algebraic semantics. In his talk, Bruno described the role that hierarchical decompositions and Monadic Second-order Logic play in the extension of methods and results from classic Formal Language Theory to the description of sets of finite graphs. Bruno is putting the final touches to a monograph on this topic that will be published by Cambridge University Press. (Present state of the work: Part 1, Chapters 1-6; Part 2, Chapters 7-11.)

The scientific programme on Thursday was kicked off by an invited plenary talk by Peter Winkler (Dartmouth, USA). The talk, entitled Optimality and Greed in Dynamic Allocation, described a method for proving optimality in dynamic allocation problems that relies on the assumption that "it's right to be greedy." The method was developed while Peter was working on two problems which arose at Lucent Technologies.

The scientific programme for Thursday reached its climax with the award ceremony, during which the EATCS award and the Gödel prize were handed out. After receiving the EATCS award, Leslie G. Valiant (Harvard, USA) delivered an excellent 30-minute presentation focusing on three topics that interest him right now. The topics are:
  1. Evolution
  2. Intelligence
  3. Human Brain.
In his opinion, all these topics are computer science and are screaming for an understanding based on algorithmic approaches. Leslie described some of his work in this direction.

The Gödel prize talk, delivered by Daniel A. Spielman (Yale, USA), provided an outstanding finale for the day. In his talk, Dan explained the idea of smoothed analysis, how Teng and he thought of it, and what he hopes the future will bring. Shang-Hua Teng (Boston University, USA) gave a personal and heartfelt introduction to the work they did to merit such a prestigious award.

The main event during the final day of ICALP was an invited plenary talk delivered by Javier Esparza (Technische Universität München, Germany). Javier's presentation was entitled Newtonian Program Analysis and presented recent work by his students and him that applies a method for solving systems of equations, developed by Isaac Newton about 300 years ago, to the analysis of computer programs. The talk presented exciting work and was beautifully and enthusiastically delivered. After the talk, Peter Winkler commented to us: "You saved the best for the last!" Javier closed the talk by saying that "Newton did it all 300 years ago, but he never saw Iceland!"
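
For readers who would like to recall the classical scheme that Javier's work generalizes, here is the textbook version of Newton's method (the code and its constants are mine, purely for illustration): it finds a root of a real function by repeatedly following tangent lines, and Newtonian program analysis lifts this iteration to systems of equations derived from programs.

```python
# Textbook Newton iteration: follow the tangent line of f until the
# correction step becomes negligibly small. (Purely illustrative; the
# generalization to program analysis is not shown here.)
def newton(f, df, x0, tol=1e-12, max_iter=50):
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)   # tangent-line correction
        x -= step
        if abs(step) < tol:
            break
    return x

# Root of x^2 - 2 starting from 1.0: converges to sqrt(2) in a few steps.
print(newton(lambda x: x * x - 2, lambda x: 2 * x, 1.0))  # ~1.4142135623730951
```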

ICALP 2009 will be held in Rhodes, Greece. Good luck to the organizers of ICALP 2009!

Saturday, October 06, 2007

Frits Vaandrager's Answers

I have received a contribution from Frits Vaandrager to the discussion arising from the CONCUR workshop. I am very pleased to post his answers here.
Enjoy!


# What is the role/importance of real-time in modelling? Does industry want dense-time or discrete-time models?

When modelling embedded systems and/or protocols, real-time aspects have to be dealt with very frequently. If one prefers not to use a real-time model checker, one can often encode real-time constraints in finite-state model checkers. For instance, in an STTT paper from 2002, Brinksma, Mader and Fehnker analyzed a PLC control schedule using SPIN, and obtained results that were (at that time) competitive with Uppaal. Their trick was to advance the discrete clock variable not step by step, but directly to the time of the next event.
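
To make the trick concrete, here is a minimal sketch on a hypothetical event model of my own (not the PLC schedule from the paper): instead of ticking the clock 1, 2, 3, ..., every transition jumps it straight to the next event time.

```python
# Minimal sketch of the clock-jumping trick: idle stretches that would
# otherwise blow up the state space collapse into a single transition.

def successors(state):
    """State = (clock, frozenset of pending (deadline, action) events)."""
    clock, pending = state
    if not pending:
        return []
    next_time = min(d for d, _ in pending)      # jump, don't tick
    fired = frozenset(e for e in pending if e[0] == next_time)
    return [(next_time, pending - fired)]

def reachable(initial):
    """Plain explicit-state exploration over the time-jumping semantics."""
    seen, stack = set(), [initial]
    while stack:
        s = stack.pop()
        if s not in seen:
            seen.add(s)
            stack.extend(successors(s))
    return seen

# Three widely spaced deadlines: 4 reachable states, instead of the ~1000
# clock values a tick-by-one semantics would enumerate.
init = (0, frozenset({(100, "open_valve"), (450, "stir"), (999, "drain")}))
print(len(reachable(init)))  # -> 4
```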

In his PhD thesis, Rajeev Alur made a good case for dense-time models. In the case of open systems with components that have different clocks, dense time is conceptually the right thing to do. But in practice one often gets away with discrete-time models.

Industry has no preference for dense-time or discrete-time: any approach that solves their problems is fine.


# How does one involve small- and medium-size companies in collaborations with concurrency theoreticians/practitioners? Does "company size" matter?

The maturity of model checkers has increased enormously over the last few years, and as a result it has become much easier to apply them, also for non-experts. I give two examples:

- Allen B. Downey has written "The Little Book of Semaphores", a nice book in which he presents solutions to dozens of concurrency problems, ranging from classical ones (like the barbershop) to tricky and obscure ones (like the sushi bar). This year I asked a class of first-year computer science students with just a few hours of model-checking experience to pick a problem from Downey's book (a different problem for each pair of students) and to model and analyze Downey's solution for it using Uppaal. As a result, my students spotted several mistakes in Downey's book, mistakes which have been confirmed by the author. (A toy sketch of this style of analysis follows after these examples.)

- Matthijs Mekking, a student interested in implementing protocols and with no particular expertise in formal methods, has just completed an MSc thesis project at NLnet Labs on the SHIM6 protocol, a proposed IETF standard for multi-homing. At some point he started to model the protocol using Uppaal and became enthusiastic. He managed to find several nontrivial mistakes in the standard and greatly impressed the protocol designers. His results directly influenced the new version of the standard. (See http://www.ita.cs.ru.nl/publications/papers/fvaan/SHIM6/)
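
As promised above, here is a toy sketch of this style of analysis. It is plain Python rather than Uppaal, and the model is a generic semaphore-guarded critical section of my own, not one of Downey's problems: all interleavings of two threads are explored, and mutual exclusion is asserted in every reachable state.

```python
# Explore every interleaving of two semaphore-using threads and check
# mutual exclusion. (Illustrative only; a real model checker does this
# far more efficiently and expressively.)

def thread_steps(tid):
    """The program of one thread: P(sem); enter CS; leave CS; V(sem).
    Each step maps (sem, in_cs) to a new pair, or None if blocked."""
    def P(s):
        sem, in_cs = s
        return (sem - 1, in_cs) if sem > 0 else None   # blocks on 0
    def enter(s):
        sem, in_cs = s
        return (sem, in_cs | {tid})
    def leave(s):
        sem, in_cs = s
        return (sem, in_cs - {tid})
    def V(s):
        sem, in_cs = s
        return (sem + 1, in_cs)
    return [P, enter, leave, V]

def check(init_sem):
    """Explicit-state exploration; returns the number of states visited."""
    progs = (thread_steps(0), thread_steps(1))
    init = (init_sem, frozenset(), (0, 0))   # (sem, threads in CS, pcs)
    seen, stack = set(), [init]
    while stack:
        state = stack.pop()
        if state in seen:
            continue
        seen.add(state)
        sem, in_cs, pcs = state
        assert len(in_cs) <= 1, f"mutual exclusion violated: {set(in_cs)}"
        for t in (0, 1):
            if pcs[t] < len(progs[t]):
                nxt = progs[t][pcs[t]]((sem, in_cs))
                if nxt is not None:
                    new_pcs = tuple(pc + 1 if i == t else pc
                                    for i, pc in enumerate(pcs))
                    stack.append((nxt[0], frozenset(nxt[1]), new_pcs))
    return len(seen)

print(check(init_sem=1))   # binary semaphore: every interleaving is safe
# check(init_sem=2) trips the assertion: both threads enter the CS at once.
```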

SMEs usually don't have verification specialists in house, so they need some advice on modelling and analysis. But with today's technology it is possible to get results rather quickly, and they can do part of the work themselves. The situation will further improve when MSc and PhD graduates with a background in model checking get jobs at these companies. My group is collaborating with several SMEs, and the verification problems they provide us with are often interesting from an academic perspective.

Unlike SMEs, large companies are able to invest in long-term research.

# Is there any need for stochastic and probabilistic modelling in applications? More pointedly, have you met an example that you could not model because your tool does not support stochastic or probabilistic phenomena?

Yes!! There is a great need for stochastic and probabilistic modelling and analysis techniques, and I would for instance welcome a tool that combines the functionality of Uppaal and PRISM.

The Zeroconf protocol, for instance, is full of probabilistic features that we could not model using Uppaal. If we really want to make an impact in the area of wireless sensor networks, mobile ad-hoc networks, and P2P networks, we need to extend model checkers with probabilities, since the design of these networks can only be properly understood if we take probabilities into account.
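
To make this concrete, here is a back-of-the-envelope sketch of the kind of question a timed but non-probabilistic model cannot answer. The model and the numbers are illustrative and are not taken from an actual Zeroconf case study; only the pool of 65024 IPv4 link-local addresses is Zeroconf's.

```python
# A host picks one of the 65024 link-local addresses uniformly at random
# and probes it n times; assume each probe round fails independently with
# probability p. A collision stays undetected only if the address is
# already in use AND every probe round fails.
def p_undetected_collision(hosts_already_up, n_probes, p_loss,
                           addresses=65024):
    p_collision = hosts_already_up / addresses  # chosen address is taken
    p_all_probes_fail = p_loss ** n_probes      # no round detects it
    return p_collision * p_all_probes_fail

# More probe rounds drive the risk down geometrically:
for n in (1, 2, 3, 4):
    print(n, p_undetected_collision(hosts_already_up=100,
                                    n_probes=n, p_loss=0.1))
```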

# How can we, as a community, foster the building of industrial-strength tools based on sound theories?

Theory for the sake of theory is good and important, but IMHO the concurrency community has too much of it. In order to really push model-checking technology into industry, the performance and functionality of these tools must be further improved by several orders of magnitude. This can only be achieved by a combined and focused effort of a large team of researchers, and it is also a major academic challenge.

In order to test new ideas one needs prototype tools to play with. However, I believe it is not healthy that almost every research group on model checking has its own tool. Only a few groups manage to keep their model-checking tool in the air for more than a decade. Developing an industrial-strength model-checking tool requires a huge effort. I think academic groups have to join forces if they want to build (and maintain!) industrial-strength tools. Uppaal has been highly successful in part due to the continued efforts of teams in Aalborg, Uppsala, Nijmegen, Twente and elsewhere. So why don't the CADP and muCRL teams join forces? Why isn't it possible to establish stronger ties between Uppaal and IF? Why are there so many probabilistic model checkers?

By combining efforts we can be much more effective.

# What has concurrency theory offered industry so far? What are the next steps that the concurrency community should take in order to increase the impact of its research in an industrial setting? And what are future promising application areas for concurrency research?

Concurrency theory's main contributions to industry so far are very powerful modeling languages and concepts, and of course model checking.

A major challenge ahead of us is to combine features. In model checking, people have proposed symmetry reduction, partial-order reduction, CEGAR, etc. What about the combination of all these features? How can we prove that such a combination is sound? Can we combine probabilistic choice, hybrid aspects, real-time, and hierarchy as in statecharts? I believe it will be possible to define such combinations, but I expect we will need theorem provers and proof assistants to help us manage the resulting complexity.

I am aware of applications in other areas, but I believe computer system engineering will remain the most important application area for concurrency theory in the coming decade.

Friday, October 05, 2007

Joost-Pieter Katoen On The Need for Probabilistic and Stochastic Modelling

I am posting this message on behalf of Joost-Pieter Katoen, who sent me his reaction to one of the questions posed to the panel members during our workshop at CONCUR 2007. Enjoy!

I'd like to answer the question on the need for stochastic and probabilistic modeling (and analysis). Some concrete examples of case studies provided by industry for which probabilistic aspects are very important are listed below. The importance of explicitly modeling random effects stands or falls with the kind of property to be established, of course, so I am definitely not claiming that these examples cannot (and should not) be modeled by techniques that do not support random phenomena.

1. Leader election in IEEE 1394: in case of a conflict (two nodes claim to be the leader), the contending nodes send a message ("be my parent") and randomly wait either a short or a long time. What is the optimal policy to resolve the contention the fastest? (This turns out to be a slightly biased coin; a toy version of this analysis is sketched after these examples.)

2. In the Ametist EU project, the German industrial partner Axxom generated schedules for a lacquer production plant. While doing so, they abstracted from many details that the lacquer producer supplies, such as: the average fraction of time a resource is not operational, the fraction of (operational) time the resource can be used because the necessary human support is present, and so forth. In their abstraction they scheduled tasks conservatively, and they were interested in whether they could improve their schedules while reducing the probability of missing the deadline. Clearly, stochastic modeling is needed, and it has indeed been carried out using a stochastic process algebra.

3. Hubert and Holger should be able to say much more about a recent project they are pursuing with a French company on the validation of multiprocessor multi-threaded architectures. I do not know exactly what they are investigating, but they use stochastic process algebras to model!
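
As promised in example 1, here is a toy version of the root-contention analysis. The durations below are invented for illustration (they are not the IEEE 1394 timing constants): each contender flips a coin that comes up "wait short" with probability p; differing flips resolve the contention, equal flips force another round, and short-short rounds are cheaper than long-long ones, which is what pushes the optimal coin away from fairness.

```python
# Expected time to resolve root contention as a function of the coin bias p.
def expected_resolution_time(p, d_short=1.0, d_long=2.0, d_resolve=1.5):
    p_retry_short = p * p              # both wait short: retry after d_short
    p_retry_long = (1 - p) ** 2        # both wait long: retry after d_long
    p_done = 2 * p * (1 - p)           # flips differ: contention resolved
    # Solve E = p_done*d_resolve + p_retry_short*(d_short + E)
    #           + p_retry_long*(d_long + E) for E:
    return (p_done * d_resolve + p_retry_short * d_short
            + p_retry_long * d_long) / p_done

# Scan coin biases: the minimizer lies above 0.5, i.e. a coin biased
# towards the short wait resolves the contention fastest on average.
best_time, best_p = min((expected_resolution_time(k / 1000), k / 1000)
                        for k in range(1, 1000))
print(best_p, best_time)
```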

Finally, let me say (as Moshe is also indicating) that the interest in probabilistic modeling is growing steadily. To give an example, the European Space Agency (ESA) is currently considering using probabilistic modeling and analysis in the context of AADL, an architecture specification language in which an important ingredient is the failure rates of components.

All in all, it is fair to say that there is a quest for probabilistic techniques!

Joost-Pieter Katoen

Monday, September 24, 2007

Luca de Alfaro in the News

At CONCUR 2007, I learned that our colleague and WG1.8 member Luca de Alfaro has recently hit the news for his work on evaluating trust in contributions to Wikipedia. (See Demo: Coloring the text of the Wikipedia according to its trust.) Luca's work is based on the idea of colouring the text of Wikipedia articles according to a computed value of trust. The trust value of each word of a Wikipedia article is computed as a function of the reputation of the original author, as well as the reputations of all authors who subsequently revised the text.
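
As a deliberately simplified illustration of the idea (this is not Luca's actual algorithm; the update rule and the weight are invented), one can think of each word's trust as being nudged towards the reputation of every author who leaves it untouched:

```python
# Toy trust update: a word's trust starts at its original author's
# reputation; every reviser who keeps the word implicitly endorses it,
# pulling its trust towards the reviser's own reputation.
def update_trust(word_trust, reviser_reputation, weight=0.2):
    """One revision step in which the word survives unchanged."""
    return word_trust + weight * (reviser_reputation - word_trust)

# A word by a low-reputation author (trust 0.2) that survives revisions
# by three high-reputation authors gradually gains trust:
trust = 0.2
for reputation in (0.9, 0.8, 0.95):
    trust = update_trust(trust, reputation)
print(round(trust, 3))  # -> 0.536
```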

This work of Luca's was mentioned in the Washington Post (I cannot locate the link anymore), the Jerusalem Post, ACM TechNews and SFGate.com, amongst others.

Luca will also be the PC chair for FOSSACS 2009 in York, but this will give him less media exposure :-)