Re: Integrating the Formal Methods part of Qualification

2022-09-19 Thread andrew.butterfi...@scss.tcd.ie
Dear Andrew,

> It's great to see a move toward formal verification for RTEMS.

Great to hear about other work in this space as well!

> From our side (TU Dortmund in Germany and the University of Twente in the 
> Netherlands), we recently adopted Frama-C to verify ICPP and MrsP. A 
> potential deadlock is successfully identified, and a remedy is provided. The 
> result will be presented in EMSOFT this year 
> (https://ieeexplore.ieee.org/document/9852753/), and the verification 
> framework is publicly released here: 
> https://github.com/tu-dortmund-ls12-rt/Resource-Synchronization-Protocols-Verification-RTEMS

Thanks for posting this - I have already looked at your paper pre-print - 
I like it a lot.

It complements our work very well - for example, one of the current pieces of 
our on-going work is looking at thread queues. You assume they
are implemented correctly, while we are exploring the use of model-based 
techniques to corroborate this.


> Due to the double-blind review process, I cannot chime in earlier in this 
> thread. Today earlier I have seen your patches regarding the chapter you 
> proposed. I wonder if you could take our contribution into account when you 
> organize the chapter? 
> A preprint can be found 
> https://www.researchgate.net/publication/362599165_Formal_Verification_of_Resource_Synchronization_Protocol_Implementations_A_Case_Study_in_RTEMS.
> (The corresponding ticket will be prepared, and the patch together with a test 
> case will be submitted)

I modelled the draft formal verification chapter on the immediately preceding 
one on the Software Testing Framework.
Like it, it currently lacks much of a background/literature survey - I am 
awaiting feedback on it.
In any case, your work will be added into anything I write about this area.

Best Regards,
  Andrew


Re: Integrating the Formal Methods part of Qualification

2022-09-13 Thread Kuan-Hsun Chen
Dear Andrew,

It's great to see a move toward formal verification for RTEMS.

From our side (TU Dortmund in Germany and the University of Twente in the
Netherlands), we recently adopted Frama-C to verify ICPP and MrsP. A
potential deadlock is successfully identified, and a remedy is provided.
The result will be presented in EMSOFT this year (
https://ieeexplore.ieee.org/document/9852753/), and the verification
framework is publicly released here:
https://github.com/tu-dortmund-ls12-rt/Resource-Synchronization-Protocols-Verification-RTEMS

Due to the double-blind review process, I could not chime in earlier in this
thread. Earlier today I saw your patches regarding the chapter you
proposed. I wonder if you could take our contribution into account when you
organize the chapter?
A preprint can be found here:
https://www.researchgate.net/publication/362599165_Formal_Verification_of_Resource_Synchronization_Protocol_Implementations_A_Case_Study_in_RTEMS
(The corresponding ticket will be prepared, and the patch together with a
test case will be submitted.)

Best,
Kuan-Hsun

Re: Integrating the Formal Methods part of Qualification

2022-07-22 Thread andrew.butterfi...@scss.tcd.ie
Dear RTEMS developers,

thanks for the feedback below - I want to wrap this up and move to the next 
step.

I propose to produce a complete draft of a formal methods section for the 
Software Engineering manual in rtems-docs.
This will add a "Formal Verification" section just after the existing "Test 
Framework" section.

This will allow developers to get a much better idea of what is proposed, and 
for me to get feedback.

For now, the section will state clearly at the start that this is a proposal 
and that the tooling and resources it describes
won't yet be available in RTEMS proper.

Files likely to be modified in rtems-docs:
eng/index.rst
common/refs.bib

I'll add in eng/formal-verification.rst.

I'll then submit patches for review in the usual way.

Regards,
  Andrew



Andrew Butterfield Tel: +353-1-896-2517 Fax: +353-1-677-2204
Lero@TCD, Head of Software Foundations & Verification Research Group
School of Computer Science and Statistics,
Room G.39, O'Reilly Institute, Trinity College, University of Dublin
 http://www.scss.tcd.ie/Andrew.Butterfield/



Re: Integrating the Formal Methods part of Qualification

2022-07-11 Thread andrew.butterfi...@scss.tcd.ie
On 6 Jul 2022, at 20:07, Gedare Bloom  wrote:

On Sun, Jul 3, 2022 at 7:49 PM Chris Johns  wrote:


On 2/7/2022 12:59 am, Andrew Butterfield wrote:

On 1 Jul 2022, at 00:59, Chris Johns <chr...@rtems.org> wrote:


On 28/6/2022 11:09 pm, andrew.butterfi...@scss.tcd.ie wrote:

Dear RTEMS Developers,

While the validation tests from the RTEMS pre-qualification activity are
now merged into the RTEMS master, the work done in investigating and
deploying formal methods techniques is not yet merged.

The activity had two main phases: a planning phase (Nov 2018-Oct 2019)
that explored various formal techniques, followed by an execution phase
(Oct 2019-Nov 2021) that then applied selected techniques to selected
parts of RTEMS. Some discussions occurred with the RTEMS community
via the mailings lists over this time. A short third phase (Nov 2021 - Dec 2021)
then reported on the outcomes. Each phase resulted in a technical report.

The key decision made was to use Promela/SPIN for test generation, and
to apply it to the Chains API, the Event Manager, and the SMP scheduler
thread queues. This involved developing models in the Promela language
and using the SPIN model-checker tool to both check their correctness
and to generate execution scenarios that could be used to generate tests.
Tools were developed using Python 3.8 to support this. Initial documentation
about tools and how to use them was put into the 2nd phase report.

Congratulations for the work and results you and others have managed to achieve.
It is exciting to see this happening with RTEMS and being made public.


We now come to the part where we explore the best way to integrate this
into RTEMS. I am proposing to do this under the guidance of Sebastian Huber.

The first suggested step is adding in new documentation to the RTEMS
Software Engineering manual, as a new Formal Methods chapter. This would
provide some motivation, as well as a "howto".

I assume that I would initially describe the proposed changes using the patch
review process described in the section on Preparing and Sending Patches in
the User Manual.

How do you feel I should best proceed?

It is hard for me to answer until I understand what is being submitted and who
maintains it? I am sure you understand this due to the specialised nature of the
work.

Indeed, I quite agree.  I have some short answers below, with suggestions.

Thanks.
+1





What will be submitted, ie SPIN files, scripts, etc?

Promela/SPIN model files (ASCII text, C-like syntax)
C template files (.h,.c) to assist test code generation
YAML files to provide a mapping from model concepts to RTEMS C test code
python scripts to automate the test generation
Documentation - largely RTEMS compliant sphinx sources (.rst)
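To give a feel for how the YAML mapping might relate model concepts to RTEMS C
test code, here is a minimal, hypothetical sketch. Everything in it is invented
for illustration - the event names, the C fragments, and the flat map layout -
and the real refinement files in rtems-central will have their own schema. A
plain Python dict stands in for the parsed YAML:

```python
# Hypothetical sketch: mapping abstract model events to RTEMS C test
# fragments. Event names, C fragments, and the mapping layout are all
# invented for illustration; the real YAML schema will differ.

# In practice this mapping would be loaded from a YAML file; a plain
# dict keeps the sketch self-contained.
EVENT_MAP = {
    "send":    "rtems_event_send( ctx->worker_id, {events} );",
    "receive": "sc = rtems_event_receive( {events}, RTEMS_WAIT, "
               "RTEMS_NO_TIMEOUT, &out );",
    "check":   "T_rsc_success( sc );",
}

def render_scenario(scenario):
    """Turn an abstract scenario (event name plus arguments) into C lines."""
    lines = []
    for event, args in scenario:
        lines.append(EVENT_MAP[event].format(**args))
    return "\n".join(lines)

# A toy scenario, as test generation might emit it from a model run.
demo = [
    ("send",    {"events": "RTEMS_EVENT_5"}),
    ("receive", {"events": "RTEMS_EVENT_5"}),
    ("check",   {}),
]
print(render_scenario(demo))
```

The real scripts do considerably more (template files, multiple tasks, test
fixtures), but the core idea - model concept in, C test fragment out - is the
same.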



Where are you looking to add these pieces?

Everything except the documentation could live in a top-level folder ('formal')
in rtems-central.
In fact there is no particular constraint from my perspective for where they 
can go.

Using rtems-central is fine.
Do they require anything currently located in rtems-central? Are the
models or YAML files related to the current specification files? I
know I'm guilty of not spending the time yet to deeply learn
rtems-central, but I would like to know that these files will fit
within that repo as it currently is intended to operate.

At the moment there is nothing in rtems-central directly related to this. All 
the python
scripts there support the tools/specs that Sebastian and colleagues developed.
I can see a new top-level script being added here to support the formal stuff.

It is possible that we might introduce a new specification item that integrates
with the spec system that is used to trigger the Spin test generation. I'll
discuss the best way to do this with Sebastian.




RSB should be taught how to build the necessary host tools to run Promela/SPIN.

Spin is open source under a BSD 3-clause license, available from spinroot.com,
and also via https://github.com/nimble-code/Spin. It is written in C and has 
very few
dependencies. I'm not sure what is involved in adding it to the RSB, but I guess
Sebastian will know.
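For orientation, the classic Spin workflow is a short command pipeline, so an
RSB recipe or driver script would not have much to do. Below is a sketch that
only assembles the standard commands as data (a dry run); the flags shown are
the usual ones from the Spin documentation, and the actual options used by the
RTEMS tooling may well differ:

```python
# Sketch of the classic Spin verification workflow, expressed as shell
# command lists only (nothing is executed here). Flags are the standard
# ones; the RTEMS formal tooling may use different options.

def spin_verification_commands(model="model.pml"):
    return [
        ["spin", "-a", model],         # generate the pan.c verifier source
        ["cc", "-o", "pan", "pan.c"],  # compile the verifier
        ["./pan", "-a"],               # run verification (acceptance cycles)
        ["spin", "-t", "-p", model],   # replay a counterexample trail, if any
    ]

for cmd in spin_verification_commands():
    print(" ".join(cmd))
```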


The plan would be to add the pertinent parts of our project documentation into
new chapters
in the RTEMS software engineering manual. So that would be eng/ in the
rtems-docs repo.

Great.
+1

A new section in the eng should be suitable. I would actually
recommend using
https://docs.rtems.org/branches/master/eng/test-framework.html as a
good example.

Thanks - that's very helpful. I think our section should immediately follow that
one. All we need now is a suitable title. "Formal Test Framework"?




How is the verification run against the code? Do we manage regression testing
and is that even possible?

The python scripts basically run SPIN in such a way as to generate scenarios
that model correct behaviour, which then get turned into standard RTEMS test
programs. These all get added into a new testsuite in the rtems repo
(testsuites/models, say).
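The scenario-generation step being described can be pictured with a small,
self-contained sketch: the model is run so that it prints one annotation per
step, and a script collects those annotations, ignoring SPIN's own chatter.
The "@@@" marker, the log format, and the event names below are purely
illustrative - the real models and tools define their own annotation format:

```python
# Hypothetical sketch of extracting an ordered test scenario from SPIN
# output. The "@@@" annotation convention and the event names are
# invented for illustration.

def parse_spin_output(text):
    """Collect annotated model steps, skipping SPIN's own messages."""
    scenario = []
    for line in text.splitlines():
        if line.startswith("@@@"):
            # e.g. "@@@ 1 CALL event_send 5" -> (task, kind, arguments)
            _, task, kind, *args = line.split()
            scenario.append((int(task), kind, args))
    return scenario

# A toy fragment of what an annotated SPIN run might print.
sample = """\
spin: couldn't find claim (ignored)
@@@ 1 CALL event_send 5
@@@ 2 CALL event_receive 5
@@@ 2 SCALAR rc 0
3 processes created
"""
steps = parse_spin_output(sample)
print(steps)
```

Each collected step would then be refined into C test code via the YAML
mapping, giving a conventional RTEMS test program that waf builds and runs like
any other.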

Re: Integrating the Formal Methods part of Qualification

2022-07-06 Thread Gedare Bloom
On Sun, Jul 3, 2022 at 7:49 PM Chris Johns  wrote:
>
> > [...]
> >
> > Indeed, I quite agree.  I have some short answers below, with suggestions.
>
> Thanks.
>
+1

> > [...]
> >> Where are you looking to add these pieces?
> >
> > Everything except the documentation could live in a top-level folder 
> > ('formal')
> > in rtems-central.
> > In fact there is no particular constraint from my perspective for where 
> > they can go.
>
> Using rtems-central is fine.
>
Do they require anything currently located in rtems-central? Are the
models or YAML files related to the current specification files? I
know I'm guilty of not spending the time yet to deeply learn
rtems-central, but I would like to know that these files will fit
within that repo as it currently is intended to operate.

RSB should be taught how to build the necessary host tools to run Promela/SPIN.

> > [...]
>
> Great.
>
+1

A new section in the eng should be suitable. I would actually
recommend using
https://docs.rtems.org/branches/master/eng/test-framework.html as a
good example.


Re: Integrating the Formal Methods part of Qualification

2022-07-03 Thread Chris Johns
On 2/7/2022 12:59 am, Andrew Butterfield wrote:
> On 1 Jul 2022, at 00:59, Chris Johns wrote:
>> [...]
>> It is hard for me to answer until I understand what is being submitted and who
>> maintains it? I am sure you understand this due to the specialised nature of the
>> work.
> 
> Indeed, I quite agree.  I have some short answers below, with suggestions.

Thanks.

> 
>>
>> What will be submitted, ie SPIN files, scripts, etc?
> 
> [...]
> 
>>
>> Where are you looking to add these pieces?
> 
> Everything except the documentation could live in a top-level folder 
> ('formal')
> in rtems-central.
> In fact there is no particular constraint from my perspective for where they 
> can go.

Using rtems-central is fine.

> The plan would be to add the pertinent parts of our project documentation into
> new chapters
> in the RTEMS software engineering manual. So that would be eng/ in the
> rtems-docs repo.

Great.

>> How is the verification run against the code? Do we manage regression testing
>> and is that even possible?
> 
> [...]
> 
> From the perspective of a user that works out of git.rtems.org/rtems,
> there will be no obvious impact - just some extra tests in among the ones that
> already exist.

Thanks and this makes sense.

> 
>>
>> I hope my simple questions highlight a lack of understanding of how this works and
>> how we maintain it and use it once integrated.
> 
> I intend to continue to work and maintain this for the foreseeable future. I
> would hope as this beds in that other Promela/SPIN users out there will also 
> get
> more involved over time.

Thanks, it is appreciated.

> [...]

Re: Integrating the Formal Methods part of Qualification

2022-07-01 Thread andrew.butterfi...@scss.tcd.ie
-Original Message-
From: Chris Johns 


On 28/6/2022 11:09 pm, andrew.butterfi...@scss.tcd.ie wrote:
> [...]

Congratulations for the work and results you and others have managed to 
achieve.
It is exciting to see this happening with RTEMS and being made public.

> [...]
> How do you feel I should best proceed?

It is hard for me to answer until I understand what is being submitted and who
maintains it? I am sure you understand this due to the specialised nature of the
work.

Indeed, I quite agree.  I have some short answers below, with suggestions.

What will be submitted, ie SPIN files, scripts, etc?

Promela/SPIN model files (ASCII text, C-like syntax)
C template files (.h,.c) to assist test code generation
YAML files to provide a mapping from model concepts to RTEMS C test code 
python scripts to automate the test generation
Documentation - largely RTEMS compliant sphinx sources (.rst)

Where are you looking to add these pieces?

Everything except the documentation could live in a top-level folder ('formal') 
in rtems-central.
In fact there is no particular constraint from my perspective for where they 
can go.
 
The plan would be to add the pertinent parts of our project documentation into 
new chapters
in the RTEMS software engineering manual. So that would be eng/ in the 
rtems-docs repo.

How is the verification run against the code? Do we manage regression 
testing
and is that even possible?

The python scripts basically run SPIN in such a way as to generate scenarios 
that model
correct behaviour which then get turned into standard RTEMS test programs. 
These all
get added into a new testsuite in the rtems repo (testsuites/models, say).
They are properly integrated into the RTEMS test system, so get built and run 
by waf.
This is similar to how the tests generated by Sebastian in 
testsuites/validation are handled.
 
>From the perspective of a user that works out of git.rtems.org/rtems, 
there will be no obvious impact - just some extra tests in among the ones that 
already exist.

I hope my simple questions highlight a lack of understand on how this works 
and
how we maintain it and use it once integrated.

I intend to continue to work and maintain this for the foreseeable future. I 
would hope as this beds in that other Promela/SPIN users out there will also 
get more involved over time.
 
It is expected that Promela models will be as static as the corresponding APIs. 
They capture the specified behaviour of API calls, not their detailed 
implementation.
 
The python scripts should also be fairly stable, although I can see some 
tweaking for a while to improve workflow issues that might arise.
 
There are some extra python libraries that are required over and above what is 
currently specified in rtems-central/requirements.txt

Thanks
Chris

Thanks,
  Andrew

___
devel mailing list
devel@rtems.org
http://lists.rtems.org/mailman/listinfo/devel


Re: Integrating the Formal Methods part of Qualification

2022-07-01 Thread andrew.butterfi...@scss.tcd.ie

On 1 Jul 2022, at 00:59, Chris Johns <chr...@rtems.org> wrote:

> On 28/6/2022 11:09 pm, andrew.butterfi...@scss.tcd.ie wrote:

> > Dear RTEMS Developers,
> > 
> > While the validation tests from the RTEMS pre-qualification activity are
> > now merged into the RTEMS master, the work done in investigating and
> > deploying formal methods techniques is not yet merged.
> > 
> > The activity had two main phases: a planning phase (Nov 2018-Oct 2019)
> > that explored various formal techniques, followed by an execution phase
> > (Oct 2019-Nov 2021) that then applied selected techniques to selected
> > parts of RTEMS. Some discussions occurred with the RTEMS community
> > via the mailing lists over this time. A short third phase (Nov 2021-Dec 2021)
> > then reported on the outcomes. Each phase resulted in a technical report.
> > 
> > The key decision made was to use Promela/SPIN for test generation, and
> > to apply it to the Chains API, the Event Manager, and the SMP scheduler
> > thread queues. This involved developing models in the Promela language
> > and using the SPIN model-checker tool both to check their correctness
> > and to generate execution scenarios that could be used to generate tests.
> > Tools were developed using Python 3.8 to support this. Initial documentation
> > about the tools and how to use them was put into the 2nd phase report.
> 
> Congratulations on the work and results you and others have achieved.
> It is exciting to see this happening with RTEMS and being made public.
> 
> > We now come to the part where we explore the best way to integrate this
> > into RTEMS. I am proposing to do this under the guidance of Sebastian Huber.
> > 
> > The first suggested step is adding new documentation to the RTEMS
> > Software Engineering manual, as a new Formal Methods chapter. This would
> > provide some motivation, as well as a "howto".
> > 
> > I assume that I would initially describe the proposed changes using the
> > patch review process described in the section on Preparing and Sending
> > Patches in the User Manual.
> > 
> > How do you feel I should best proceed?
> 
> It is hard for me to answer until I understand what is being submitted and
> who maintains it. I am sure you understand this due to the specialised
> nature of the work.

Indeed, I quite agree. I have some short answers below, with suggestions.

> What will be submitted, i.e. SPIN files, scripts, etc.?

Promela/SPIN model files (ASCII text, C-like syntax)
C template files (.h, .c) to assist test code generation
YAML files providing a mapping from model concepts to RTEMS C test code
Python scripts to automate the test generation
Documentation - largely RTEMS-compliant Sphinx sources (.rst)
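To make the role of the YAML mapping concrete, here is a minimal Python sketch of the refinement idea. The mapping keys, the C snippets, and the scenario below are invented for illustration and are not the project's actual schema; in the real tooling the mapping would be loaded from a YAML file rather than written inline.

```python
# Hypothetical refinement mapping: Promela model events -> C test-code
# templates. In the actual tooling this dictionary would come from a YAML
# file; the names here are illustrative only.
refinement = {
    "event_send": "rtems_event_send( {task}, {events} );",
    "event_receive": "rtems_event_receive( {events}, RTEMS_WAIT, {timeout}, &out );",
}

def render(step: str, **params: str) -> str:
    """Turn one model step plus its parameters into a line of C test code."""
    return refinement[step].format(**params)

# A SPIN-generated scenario then becomes a sequence of C statements:
scenario = [
    ("event_send", {"task": "worker_id", "events": "EVENT_0"}),
    ("event_receive", {"events": "EVENT_0", "timeout": "0"}),
]
for step, params in scenario:
    print(render(step, **params))
```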


> Where are you looking to add these pieces?

Everything except the documentation could live in a top-level folder ('formal')
in rtems-central. In fact there is no particular constraint from my perspective
for where they can go.

The plan would be to add the pertinent parts of our project documentation into
new chapters in the RTEMS Software Engineering manual. So that would be eng/
in the rtems-docs repo.



> How is the verification run against the code? Do we manage regression testing
> and is that even possible?

The Python scripts basically run SPIN in such a way as to generate scenarios
that model correct behaviour, which then get turned into standard RTEMS test
programs. These all get added into a new test suite in the rtems repo
(testsuites/models, say). They are properly integrated into the RTEMS test
system, so they get built and run by waf. This is similar to how the tests
generated by Sebastian in testsuites/validation are handled.
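As a rough illustration of the scenario-extraction step: SPIN is run (elsewhere) on a Promela model whose printf statements tag test-relevant events, and the script keeps only those tagged lines for the refinement stage. The "@@@" tag, the field layout, and the sample output below are assumptions made for this sketch, not the project's actual format.

```python
# Hypothetical SPIN simulation output: only lines tagged "@@@" describe
# model events that should become test steps; everything else is SPIN noise.
SPIN_OUTPUT = """\
spin: trail ends after 42 steps
@@@ 0 CALL event_send worker EVENT_0
@@@ 0 CALL event_receive EVENT_0
@@@ 0 RETURN RTEMS_SUCCESSFUL
"""

def extract_scenario(output: str, tag: str = "@@@") -> list[list[str]]:
    """Keep only the model's tagged event lines, split into their fields."""
    return [line.split()[1:] for line in output.splitlines()
            if line.startswith(tag)]

for step in extract_scenario(SPIN_OUTPUT):
    print(step)
```

Each extracted step would then be fed through the YAML refinement mapping to emit the corresponding C test code.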

From the perspective of a user who works out of git.rtems.org/rtems,
there will be no obvious impact - just some extra tests among the ones that
already exist.


> I hope my simple questions highlight a lack of understanding of how this
> works and how we maintain it and use it once integrated.

I intend to continue to work on and maintain this for the foreseeable future. I
would hope, as this beds in, that other Promela/SPIN users out there will also
get more involved over time.

It is expected that Promela models will be as static as the corresponding APIs. 
They capture the specified behaviour of API calls, not their detailed 
implementation.

The python scripts should also be fairly stable, although I can see some 
tweaking for a while to improve workflow issues that might arise.

There are some extra Python libraries required over and above what is
currently specified in rtems-central/requirements.txt.


> Thanks
> Chris

Thanks,
  Andrew



___
devel mailing list
devel@rtems.org
http://lists.rtems.org/mailman/listinfo/devel

Re: Integrating the Formal Methods part of Qualification

2022-06-30 Thread Chris Johns
On 28/6/2022 11:09 pm, andrew.butterfi...@scss.tcd.ie wrote:
> Dear RTEMS Developers,
> 
> While the validation tests from the RTEMS pre-qualification activity are
> now merged into the RTEMS master, the work done in investigating and 
> deploying formal methods techniques is not yet merged.
> 
> The activity had two main phases: a planning phase (Nov 2018-Oct 2019) 
> that explored various formal techniques, followed by an execution phase 
> (Oct 2019-Nov 2021) that then applied selected techniques to selected 
> parts of RTEMS. Some discussions occurred with the RTEMS community 
> via the mailing lists over this time. A short third phase (Nov 2021-Dec 2021)
> then reported on the outcomes. Each phase resulted in a technical report.
> 
> The key decision made was to use Promela/SPIN for test generation, and 
> to apply it to the Chains API, the Event Manager, and the SMP scheduler
> thread queues. This involved developing models in the Promela language 
> and using the SPIN model-checker tool to both check their correctness
> and to generate execution scenarios that could be used to generate tests.
> Tools were developed using Python 3.8 to support this. Initial documentation 
> about tools and how to use them was put into the 2nd phase report.

Congratulations on the work and results you and others have achieved.
It is exciting to see this happening with RTEMS and being made public.

> We now come to the part where we explore the best way to integrate this
> into RTEMS. I am proposing to do this under the guidance of Sebastian Huber.
> 
> The first suggested step is adding in new documentation to the RTEMS 
> Software Engineering manual, as a new Formal Methods chapter. This would 
> provide some motivation, as well as a "howto".
> 
> I assume that I would initially describe the proposed changes using the patch 
> review process described in the section on Preparing and Sending Patches in 
> the User Manual.
> 
> How do you feel I should best proceed?

It is hard for me to answer until I understand what is being submitted and who
maintains it. I am sure you understand this due to the specialised nature of the
work.

What will be submitted, i.e. SPIN files, scripts, etc.?

Where are you looking to add these pieces?

How is the verification run against the code? Do we manage regression testing
and is that even possible?

I hope my simple questions highlight a lack of understanding of how this works
and how we maintain it and use it once integrated.

Thanks
Chris


Integrating the Formal Methods part of Qualification

2022-06-28 Thread andrew.butterfi...@scss.tcd.ie
Dear RTEMS Developers,

While the validation tests from the RTEMS pre-qualification activity are
now merged into the RTEMS master, the work done in investigating and 
deploying formal methods techniques is not yet merged.

The activity had two main phases: a planning phase (Nov 2018-Oct 2019) 
that explored various formal techniques, followed by an execution phase 
(Oct 2019-Nov 2021) that then applied selected techniques to selected 
parts of RTEMS. Some discussions occurred with the RTEMS community
via the mailing lists over this time. A short third phase (Nov 2021-Dec 2021)
then reported on the outcomes. Each phase resulted in a technical report.

The key decision made was to use Promela/SPIN for test generation, and 
to apply it to the Chains API, the Event Manager, and the SMP scheduler
thread queues. This involved developing models in the Promela language 
and using the SPIN model-checker tool to both check their correctness
and to generate execution scenarios that could be used to generate tests.
Tools were developed using Python 3.8 to support this. Initial documentation 
about tools and how to use them was put into the 2nd phase report.

We now come to the part where we explore the best way to integrate this
into RTEMS. I am proposing to do this under the guidance of Sebastian Huber.

The first suggested step is adding in new documentation to the RTEMS 
Software Engineering manual, as a new Formal Methods chapter. This would 
provide some motivation, as well as a "howto".

I assume that I would initially describe the proposed changes using the patch 
review process described in the section on Preparing and Sending Patches in 
the User Manual.

How do you feel I should best proceed?

Best regards,
  Andrew Butterfield,


Andrew Butterfield Tel: +353-1-896-2517 Fax: +353-1-677-2204
Lero@TCD, Head of Software Foundations & Verification Research Group
School of Computer Science and Statistics,
Room G.39, O'Reilly Institute, Trinity College, University of Dublin
 http://www.scss.tcd.ie/Andrew.Butterfield/

 
