Visual counterexample explanation for model checking with Oeritte
Abstract: Despite being one of the most reliable approaches for ensuring system correctness, model checking requires auxiliary tools to be fully usable in practice. In this work, we tackle the issue of its results being hard to interpret and present Oeritte, a tool for automatic visual counterexample explanation for function block diagrams. To learn what went wrong, the user can inspect a parse tree of the violated LTL formula and a table view of the counterexample, in which important variables are highlighted. Then, on the function block diagram of the system under verification, they can view a visualization of the causality relationships between the computed values of interest and the intermediate results or inputs of the diagram. Oeritte thus reduces the effort of debugging formal models and specifications and makes model checking more practical for complex industrial systems.
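To give an intuition for what "highlighting important variables" in a counterexample means, here is a minimal hypothetical sketch (not Oeritte's actual algorithm): for the safety property G(request -> X grant) and a small counterexample trace, it marks the step/variable pairs that cause the violation, analogous in spirit to the highlighted cells in a counterexample table view. The variable names `request` and `grant` and the trace itself are illustrative assumptions.

```python
# Hypothetical illustration: mark the variable values responsible for
# violating G(request -> X grant) in a finite counterexample trace.

trace = [  # each dict is one step of an assumed counterexample
    {"request": False, "grant": False},
    {"request": True,  "grant": False},
    {"request": False, "grant": False},  # grant should have been True here
]

def explain_violation(trace):
    """Return (step, variable) pairs that make G(request -> X grant) fail."""
    causes = []
    for t in range(len(trace) - 1):
        if trace[t]["request"] and not trace[t + 1]["grant"]:
            # Both the triggering request and the missing grant matter,
            # so both would be highlighted for the user.
            causes.append((t, "request"))
            causes.append((t + 1, "grant"))
    return causes

print(explain_violation(trace))  # [(1, 'request'), (2, 'grant')]
```

In the tool itself, such causal relationships are then projected onto the function block diagram rather than printed, linking the violating values to the intermediate results or inputs that produced them.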