Tracks (ISoLA)
Automated Verification of Hybrid (Discrete and Continuous) Models for Cyber-Physical System Design (AVHM)
- J Paul Gibson (Telecom Sud Paris, FR)
- Dominique Mery (LORIA, FR)
Cyber-Physical Systems connect the real world to software systems through a network of sensors and actuators in which physical and logical components interact in complex ways. There is a diverse range of application domains, including health, energy, transport, autonomous vehicles and robotics, and many of these involve safety-critical requirements. Such systems are, by definition, characterised by both discrete and continuous components. The development and verification processes must, therefore, incorporate and integrate discrete and continuous models.
The development of techniques and tools to handle the correct design of cyber-physical systems has drawn the attention of many researchers.
Continuous modelling approaches are usually based on a formal mathematical expression of the problem using dense reals and differential equations to model the behaviour of the studied hybrid system. Then, models are simulated in order to check required properties. Discrete modelling approaches rely on formal methods, based on abstraction, model-checking and theorem proving.
There is much ongoing research concerned with how best to combine these approaches in a more coherent and pragmatic fashion, in order to support more rigorous and automated hybrid-design verification.
This track is concerned with improving the integration of discrete and continuous models and verification approaches. The track will focus on workshop-style presentations, with an emphasis on industrial and educational case studies and on tool demonstrations.
Submission:
https://equinocs.springernature.com/service/avhm
Automating Software Re-Engineering (ASRE)
Organizers:
- Serge Demeyer (University of Antwerp, BE),
- Reiner Hähnle (Technical University of Darmstadt, DE),
- Heiko Mantel (Technical University of Darmstadt, DE)
Formal approaches to software analysis and development tend to focus on greenfield scenarios or to look at some piece of given software as a static object. Arguably, dynamic evolution of software is a much more common and relevant issue, and its importance keeps growing.
Key drivers are:
- The advent of innovative execution platforms, including massively parallel and re-configurable hardware
- Emerging norms and regulations demanding software being re-engineered to comply with stricter legislation, new quality requirements, or ethical standards
- The role of software in science (“in silico” now surpasses the more traditional “in vitro” / “in vivo” research methods) and the challenge of maintaining research software
- Novel application scenarios for existing software (blockchain, micro-services, IoT, …), fueled by the digitalization of everything
- The growing importance of simulations as a tool to model, understand, and predict complex, dynamic behavior (climate, infectious diseases, …), specifically the generalization with a feedback loop to obtain a digital twin
Software refactoring, parallelization, and adaptation have become central activities in the value chain: automating them can realize huge gains. Formal approaches to software modeling and analysis are poised to make a substantial contribution, because they are fundamentally concerned with automation and correctness. In this track we invite researchers with an active interest in the automation of software re-engineering: people working on formal foundations, on tools, as well as practitioners of re-engineering. We particularly welcome submissions looking at the re-engineering of simulation software, digital twins, and applications in IoT scenarios.
Bridging gaps between program specification paradigms (Specify This)
Organizers:
- Gidon Ernst (Ludwig-Maximilians-Universität München, DE)
- Paula Herber (University of Münster, DE)
- Marieke Huisman (University of Twente, NL)
- Mattias Ulbrich (Karlsruhe Institute of Technology, DE)
The field of program verification has seen considerable successes in recent years. At the same time, both the variety of properties that can be specified and the collection of approaches that solve such program verification challenges have specialised and diversified considerably. Examples include contract-based deductive specification and verification of functional properties, the specification and verification of temporal properties using model checking, and static analyses for secure information flow properties. While this diversification enables the formal analysis of properties that were not accessible a few years ago, it may leave the impression of isolated solutions to isolated problems.
Since Specify This at ISoLA 2022, several efforts have been made towards bridging the gaps between specification and modeling paradigms. These include coordinated efforts such as a Dagstuhl seminar as well as community-wide case studies in the VerifyThis long-term challenge series. These efforts have been very influential in inspiring active research programs: new theoretical approaches are being developed, and verification tools are gradually embracing more diverse feature sets as part of their specification languages, which in turn unlocks new application areas. In order to maintain momentum and continuity, an upcoming Lorentz seminar is dedicated to discussing further directions on how to increase expressiveness, abstraction, interoperability, and applications.
The proposed Specify This track at the upcoming ISoLA 2024 will be a well-timed opportunity to make such fresh ideas on the integration of specification paradigms concrete:
- ISoLA 2024 is intended as the venue in which contributions to the ongoing collaborative VerifyThis long-term challenge are documented, and it will offer a similar opportunity for polished solutions to the VerifyThis on-site competition (associated with ETAPS 2024).
- Most importantly, we will be able to present and discuss the progress that has been achieved with respect to the perspectives opened up during the Lorentz seminar. There will be new results on approaches that bridge different specification and verification techniques, including dynamic and static analysis, model checking, deductive verification, security analyses, and testing.
Submission:
https://equinocs.springernature.com/service/specifythis
Digital Twin Engineering (DTE)
Organizers:
- John Fitzgerald (Newcastle University, UK)
- Claudio Gomes (Aarhus University, DK)
- Einar Broch Johnsen (University of Oslo, NO)
- Eduard Kamburjan (University of Oslo, NO)
- Martin Leucker (Universität zu Lübeck, DE)
- Jim Woodcock (University of York, UK)
A Digital Twin is a virtual replica of a real-world system (usually called the physical twin) that can inform decision-making in operation and design. Digital twins combine multi-models developed during design processes with models derived by Machine Learning (ML) from data acquired from the physical twin, and with models expressed using Knowledge Representation (KR) techniques for data and process integration. The potential benefits include the ability to perform what-if analysis, reconfigure in response to internal faults or a changing environment, perform preventive maintenance and facilitate visualisation.
Every Digital Twin has its own lifecycle, in which it is designed, developed, used, maintained and operated alongside the physical twin. It interacts tightly with all life phases of the physical twin and plays a crucial role in ensuring the dependable operation of the overall cyber-physical system: it addresses the (potential) consequences of evolving system components and supports the exploration and identification of changes that do not unduly compromise overall dependability. This combination of prediction and response, alongside support for informed decision-making and redesign by humans, requires both data derived from operational settings and models developed in design stages.
While Digital Twins have started to see adoption in, e.g., manufacturing, transport, energy and agriculture, ensuring that Digital Twins themselves are dependable still presents challenges and questions for formal methods, model-based systems engineering and software engineering. These include:
- When is it worthwhile to construct a digital twin, and what value can be gained?
- What are the appropriate architectures to facilitate the sound implementation of digital twins?
- What are the common abstractions for models stemming from ML, KR and formal design, and how can they be used to analyze highly heterogeneous digital twins?
This track will discuss how one can enable the well-founded engineering of dependable digital twins for dependable CPSs. In addition, the plan is to co-locate the track with the general assembly of the INTO-CPS Association and so attract additional participants.
Submission:
https://equinocs.springernature.com/service/dte
Formal methods for DIStributed COmputing in future RAILway systems (DisCoRail)
Organizers:
- Alessandro Fantechi (Università di Firenze, IT)
- Stefania Gnesi (Institute of Information Science and Technologies, IT)
- Anne Haxthausen (Technical University of Denmark, DK)
Innovative train control systems, such as ERTMS-ETCS and its envisioned successors aimed at increasing the degree of autonomy in train driving, rely on algorithms heavily based on communication between geographically distributed computing elements. The high safety and availability standards of railway transportation impose equally high standards on the safety and availability of such systems. The adoption of formal methods for the specification and verification of distributed control algorithms is recognised as crucial in guaranteeing compliance with such high standards.
The previous editions of DisCoRail (2019, 2021, 2022) have discussed this intertwining of formal methods and distributed computing in the design and development of innovative train control systems, with a specific focus on techniques for assessing safety or availability, such as model checking or quantitative evaluation. This year's edition aims at repeating the success of the previous editions by stimulating discussion on these topics between researchers and experts from industry and academia who have addressed these aspects in research and development projects.
Recent years have seen the emergence of a potentially disruptive element in this discussion, namely the projected increasing adoption of Artificial Intelligence techniques, especially in those subsystems aimed at increasing the degree of autonomy in train driving. How formal methods can guarantee the safety, availability, interoperability and cybersecurity of such systems is a challenge that still needs to be addressed. This is another theme that naturally falls within the scope of DisCoRail 2024.
Submission:
https://equinocs.springernature.com/service/discorail2024
Rigorous Engineering of Collective Adaptive Systems (REoCAS)
Organizers of the Track:
- Martin Wirsing (Ludwig-Maximilians-Universität München, DE)
- Rocco De Nicola (IMT Lucca, IT)
- Stefan Jähnichen (Technische Universität Berlin, DE)
- Mirco Tribastone (Scuola IMT Alti Studi Lucca, IT)
Organizers of the Colloquium:
- Mirco Tribastone (Scuola IMT Alti Studi Lucca, IT)
- Stefan Jähnichen (Technische Universität Berlin, DE)
- Martin Wirsing (Ludwig-Maximilians-Universität München, DE)
Today's software systems are becoming increasingly distributed and decentralized and have to adapt autonomously to dynamically changing, open-ended environments. Often the nodes have their own individual properties and objectives; interactions with other nodes or with humans may lead to the emergence of unexpected phenomena. We call such systems collective adaptive systems. Examples of collective adaptive systems include robot swarms, smart cities, autonomous cars, voluntary peer-to-peer clouds, as well as socio-technical systems and the Internet of Things.
It is crucial that unexpected and possibly dangerous situations be avoided within such systems. Hence, there is a strong need for methods, tools and techniques to guarantee that systems behave as expected, ensuring safety, security, integrity, availability, correctness, reliability and resilience, and that these issues are addressed not in isolation but as a whole, at the system level.
This track is a follow-up to the successful tracks on “rigorous engineering” at ISoLA 2014, ISoLA 2016, ISoLA 2018, ISoLA 2020/2021, and ISoLA 2022 – the 2014 edition tackling “autonomic ensembles” and the latter four “collective adaptive systems”. It offers a venue for presenting research on rigorously defined methods for engineering collective adaptive systems that face the challenges outlined above. Topics of interest include (but are not limited to):
- Methods, theories, and machine learning for designing, specifying, and analysing collective adaptive systems
- Techniques for programming and operating collective adaptive systems
- Methods and mechanisms for adaptation and dynamic self-expression
- Trustworthiness, validation and verification of collective adaptive systems
- Security and performance of collective adaptive systems
- Techniques to identify collective adaptive systems and their components
The track will be held jointly with a REoCAS Colloquium in honour of the 70th birthday of Rocco De Nicola, who has significantly contributed to the field of collective adaptive systems through novel approaches for their formal specification, analysis, and verification. The Colloquium will host invited speakers who will reflect upon these developments within the context of Rocco’s much broader legacy in concurrency theory, distributed systems, and formal methods, exploring his more recent contributions to cybersecurity.
Submission REoCAS:
https://equinocs.springernature.com/service/reocas
Submission REoCAS Colloquium:
https://equinocs.springernature.com/service/reocas-colloquium
Research at ISE (RAISE)
Organizers:
- Tiziana Margaria (University of Limerick, IE)
Submission:
https://equinocs.springernature.com/service/raise2
Scalable Verification and Validation of Concurrent and Distributed Systems (ScaVeri)
Organizers:
- Cristina Seceleanu (Mälardalen University, SE)
- Marieke Huisman (University of Twente, NL)
- Stephan Merz (INRIA Nancy & LORIA, FR)
To improve performance, many computerized systems involved in critical infrastructures (smart grids, supply chains, traffic controllers, smart factory automation, etc.) or cyber-physical environments feature both concurrency and distribution. However, this also adds considerable complexity and allows many new kinds of errors to occur. Because of the complexity of concurrent and distributed systems and the large state space of failure conditions, verifying these systems is critically important, yet equally challenging. In the ISoLA 2022 track on Verification and Validation of Concurrent and Distributed Heterogeneous Systems, we saw that substantial progress has been made on this topic in recent years, and impressive results on the verification and validation of concurrent and distributed systems were reported. However, the scalability issues encountered when verifying large and complex concurrent and distributed systems still hinder the application of formal analysis to such systems. In the current track, we wish to explore this area further and to investigate what is needed to make such verification and validation techniques scale to industrial-sized systems.
This track focuses on providing solutions to the scalability issues mentioned above, by inviting submissions that propose models, techniques, and tools for the rigorous and scalable analysis and verification of concurrent and distributed systems, be it via modular verification, abstraction, combinations of formal verification techniques with AI methods, bounded verification, statistical methods, quantum computation, or other approaches.
Submission:
https://equinocs.springernature.com/service/scaveri
Type- and Logic-Based Methods for System Specification, Synthesis and Analysis (TLBM3SA)
Organizers:
- Boris Düdder (University of Copenhagen, DK)
- Fritz Henglein (University of Copenhagen, DK)
- Falk Howar (TU Dortmund, DE)
The workshop is centered around the scientific work of Jakob Rehof on formal models of computation. Its intention is to bring together researchers from theoretical and applied computer science with practitioners, developers, and business stakeholders to discuss R&D topics in this space that can help with the adoption and maturation of formal methods in computer science, as well as with realizing their full potential in future practice.
We are interested in cross-fertilization of ideas in all areas of formal methods, including:
- Type systems for program properties and specifications
- Type-based program analysis
- Type-based program synthesis
- Modal logic and model checking for program verification
- Computational complexity of type checking, inference, inhabitation
- Program analysis & verification
- Programming models, languages, and semantics
- Formal Models in Software Engineering
- Complex system design and modeling
- Enterprise Systems
- Mathematical logic
- Formal Model Applications and Case Studies
Invited Speaker:
- Jakob Rehof, TU Dortmund and Fraunhofer ISST
Submission:
https://equinocs.springernature.com/service/tlbm3sa
X-by-Construction Meets AI
Organizers:
- Maurice H. ter Beek (Institute of Information Science and Technologies, IT)
- Loek Cleophas (Eindhoven University of Technology, NL & Stellenbosch University, ZA)
- Clemens Dubslaff (Eindhoven University of Technology, NL & Centre for Tactile Internet with Human-in-the-loop, DE)
- Ina Schaefer (Karlsruhe Institute of Technology, DE)
Correctness-by-Construction (CbC) sees the development of software (systems) as a step-wise refinement process from specification to code, ideally by CbC design tools that automatically generate error-free software (system) implementations from rigorous and unambiguous requirement specifications. Afterwards, testing only serves to validate the CbC process rather than to find bugs.
A lot of progress has been made on CbC. After a successful track on the combination of CbC with post-hoc verification at ISoLA 2016, at ISoLA 2018 it was time to look further than correctness by investigating a move from CbC to X-by-Construction (XbC), i.e., by also considering non-functional properties. XbC is thus concerned with a step-wise refinement process from specification to code that automatically generates software (system) implementations that by construction satisfy specific non-functional properties (concerning security, dependability, reliability, resource or energy consumption, and the like). In line with the growing attention to fault tolerance and the increasing use of machine-learning techniques in modern software systems, which make it hard to establish guaranteed properties – as witnessed in other tracks at ISoLA 2022 and at AISoLA 2023 – a third track in this series, at ISoLA 2020/2021, focused on XbC in the setting of probabilistic systems and properties. Finally, a fourth track in this series, at ISoLA 2022, focused on the synergies between XbC and Runtime Verification (RV). RV is concerned with monitoring and analysing actual software (and hardware) system behaviour, since achieving correctness at design time is difficult – if not impossible – with the current proliferation of systems with data-driven AI components. A system failure detected by RV may possibly be repaired, but assurances are needed that the corrected system is indeed better than the previous one.
Submission:
https://equinocs.springernature.com/service/xbc-ai
Doctoral Symposium
Organizers:
- Sven Jörges (University of Applied Sciences and Arts, DE)
- Steven Smyth (TU Dortmund University, DE)
- Salim Saay (University of Limerick, IE)
Information:
Please find additional information on the detail page.
Submission:
https://equinocs.springernature.com/service/docsym