[ The Types Forum, http://lists.seas.upenn.edu/mailman/listinfo/types-list ]
I had previously addressed the first part of this only to Philip Wadler,
so I reproduce it here:

CompCert uses evaluation contexts:
https://compcert.org/doc/html/compcert.cfrontend.Csem.html#context

Iris does too:
https://gitlab.mpi-sws.org/iris/iris/-/blob/master/iris_heap_lang/lang.v#L412

This is slightly different from what you propose, because here the
nesting is handled in the definition of contexts rather than in the
step relation. Your approach works too:
https://pastebin.com/raw/TQU9UrnS

I have here represented the frame f directly as its f[_] function, but
I think that shouldn't make a difference.

About decomposition/composition: I have found it useful to define
context typing as

  (K : A => B) := ∀ e, e : A => K[e] : B

This means you don't need to give the inductive rules, and you get
composition for free. In fact, I usually don't even give context typing
a name in Coq, and instead prove this lemma by induction on typing:

  K[e] : B  <=>  ∃ A, e : A ∧ (∀ e', e' : A => K[e'] : B)
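For concreteness, here is roughly what that setup looks like in Coq,
stated abstractly over the syntax and judgments. This is only an
illustrative sketch (the names are made up, and Decomposition and
preservation for base reduction are merely assumed here), but it shows
that Composition is free and that preservation for the full reduction
relation is a few lines of wrapping:

  Section EvalCtx.
    (* Syntax and judgments, left abstract. *)
    Variables term ty ectx : Type.
    Variable fill : ectx -> term -> term.        (* K[e]       *)
    Variable typed : term -> ty -> Prop.         (* e : A      *)
    Variable base_step : term -> term -> Prop.   (* beta, etc. *)

    (* Full reduction: a single congruence rule over evaluation contexts. *)
    Inductive step : term -> term -> Prop :=
    | step_fill (K : ectx) (e e' : term) :
        base_step e e' -> step (fill K e) (fill K e').

    (* Context typing, defined rather than given by inductive rules. *)
    Definition ctx_typed (K : ectx) (A B : ty) : Prop :=
      forall e, typed e A -> typed (fill K e) B.

    (* Composition is free: it is just the definition applied. *)
    Lemma composition (K : ectx) (A B : ty) (e : term) :
      ctx_typed K A B -> typed e A -> typed (fill K e) B.
    Proof. intros K A B e HK He. exact (HK e He). Qed.

    (* Decomposition needs induction on the typing derivation of K[e],
       so it is only assumed here, as is preservation for base reduction. *)
    Hypothesis decomposition :
      forall (K : ectx) (e : term) (B : ty),
        typed (fill K e) B -> exists A, typed e A /\ ctx_typed K A B.
    Hypothesis base_preservation :
      forall (e e' : term) (A : ty),
        typed e A -> base_step e e' -> typed e' A.

    (* Preservation for full reduction is then a few lines of wrapping. *)
    Lemma preservation (e e' : term) (B : ty) :
      step e e' -> typed e B -> typed e' B.
    Proof.
      intros e e' B Hstep. destruct Hstep as [K e1 e1' Hbase]. intros Hty.
      destruct (decomposition K e1 B Hty) as (A & He1 & HK).
      apply HK. exact (base_preservation e1 e1' A He1 Hbase).
    Qed.
  End EvalCtx.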
Jules

On Sat, Oct 9, 2021 at 11:14 PM Derek Dreyer <[email protected]> wrote:
> [ The Types Forum, http://lists.seas.upenn.edu/mailman/listinfo/types-list ]
>
> Hi, Phil.
>
> Yes, there is a way to make the single congruence rule work. See for
> example the way I set things up in my Semantics course notes (see
> Section 1.2):
> https://courses.ps.uni-saarland.de/sem_ws1920/dl/21/lecture_notes_2nd_half.pdf
>
> (Note: the general approach described below is not original, but my
> version may be easier to mechanize than others. Earlier work, such as
> Wright-Felleisen 94 and Harper-Stone 97, presents variations on this
> -- they use the term Replacement Lemma -- that I think are a bit
> clunkier and/or more annoying to mechanize. Wright-Felleisen cites
> Hindley-Seldin for this Replacement Lemma. In my version, the
> Replacement Lemma is broken into two lemmas -- Decomposition and
> Composition -- by defining a typing judgment for evaluation contexts.)
>
> Under this approach, you:
>
> 1. Divide the definition of reduction into two relations: let's call
> them "base reduction" and "full reduction". The base one has all the
> interesting basic reduction rules that actually do something (e.g.
> beta). The full one has just one rule, which handles all the "search"
> cases via eval ctxts: it says that K[e] reduces to K[e'] iff e
> base-reduces to e'. I believe it isn't strictly necessary to separate
> into two relations, but I've tried it without separating, and the
> proof is significantly cleaner with the separation.
>
> 2. Define a notion of evaluation context typing K : A => B (signifying
> that K takes a hole of type A and returns a term of type B). This is
> the key part that many other accounts skip, but it makes things
> cleaner.
>
> With eval ctxt typing in hand, we can now prove the following two very
> easy lemmas (each requires only 1 or 2 lines of Coq):
>
> 3. Decomposition Lemma: If K[e] : B, then there exists A such that
> K : A => B and e : A.
>
> 4. Composition Lemma: If K : A => B and e : A, then K[e] : B.
>
> (Without eval ctxt typing, you have to state and prove these lemmas as
> one joint Replacement lemma.)
>
> Then, to prove preservation, you first prove preservation for base
> reduction in the usual way. Then, the proof of preservation for full
> reduction follows immediately by wrapping the base-reduction
> preservation lemma with calls to Decomposition and Composition (again,
> just a few lines of Coq).
>
> My Semantics course notes just show this on pen and paper, but my
> students have also mechanized it in Coq, and we will be using that in
> the newest version of my course this fall. It is quite
> straightforward. The Coq source for the course is still in
> development at the moment, but I can share it with you if you're
> interested. I would be interested to know if for some reason this
> proof structure is harder to mechanize in Agda.
>
> Best wishes,
> Derek
>
>
> On Sat, Oct 9, 2021 at 8:55 PM Philip Wadler <[email protected]> wrote:
> >
> > [ The Types Forum, http://lists.seas.upenn.edu/mailman/listinfo/types-list ]
> >
> > Most mechanised formulations of reduction systems, such as those found
> > in Software Foundations or in Programming Language Foundations in Agda,
> > use one congruence rule for each evaluation context:
> >
> >   ξ-·₁ : ∀ {L L′ M}
> >     → L —→ L′
> >       -----------------
> >     → L · M —→ L′ · M
> >
> >   ξ-·₂ : ∀ {V M M′}
> >     → Value V
> >     → M —→ M′
> >       -----------------
> >     → V · M —→ V · M′
> >
> > One might instead define frames that specify evaluation contexts and
> > have a single congruence rule.
> >
> >   data Frame : Set where
> >     □·  : Term → Frame
> >     ·□  : (V : Term) → Value V → Frame
> >
> >   _[_] : Frame → Term → Term
> >   (□· M) [ L ]   = L · M
> >   (·□ V _) [ M ] = V · M
> >
> >   ξ : ∀ F {M M′}
> >     → M —→ M′
> >       -------------------
> >     → F [ M ] —→ F [ M′ ]
> >
> > However, one rapidly gets into problems. For instance, consider the
> > proof that types are preserved by reduction.
> >
> >   preserve : ∀ {M N A}
> >     → ∅ ⊢ M ⦂ A
> >     → M —→ N
> >       ----------
> >     → ∅ ⊢ N ⦂ A
> >   ...
> >   preserve (⊢L · ⊢M) (ξ (□· _) L—→L′)   = (preserve ⊢L L—→L′) · ⊢M
> >   preserve (⊢L · ⊢M) (ξ (·□ _ _) M—→M′) = ⊢L · (preserve ⊢M M—→M′)
> >   ...
> >
> > The first of these two lines gives an error message:
> >
> >   I'm not sure if there should be a case for the constructor ξ,
> >   because I get stuck when trying to solve the following unification
> >   problems (inferred index ≟ expected index):
> >     F [ M ] ≟ L · M₁
> >     F [ M′ ] ≟ N
> >   when checking that the pattern ξ (□· _) L—→L′ has type L · M —→ N
> >
> > And the second provokes a similar error.
> >
> > This explains why so many formulations use one congruence rule for
> > each evaluation context. But is there a way to make the approach with
> > a single congruence rule work? Any citations to such approaches in the
> > literature?
> >
> > Thank you for your help. Go well,  -- P
> >
> > .   \ Philip Wadler, Professor of Theoretical Computer Science,
> > .   /\ School of Informatics, University of Edinburgh
> > .  /  \ and Senior Research Fellow, IOHK
> > . http://homepages.inf.ed.ac.uk/wadler/
> > The University of Edinburgh is a charitable body, registered in
> > Scotland, with registration number SC005336.
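P.S. For what it's worth, the frame-based formulation from Phil's
message also transcribes quite directly into Coq. The sketch below is
only illustrative (it is not the pastebin linked above; the names are
mine, substitution is left as a parameter, and base reduction is split
out from the congruence rule in the way Derek suggests):

  Require Import Coq.Strings.String.

  (* Call-by-value lambda terms. *)
  Inductive term : Type :=
  | tvar : string -> term
  | tlam : string -> term -> term
  | tapp : term -> term -> term.

  Inductive value : term -> Prop :=
  | v_lam (x : string) (e : term) : value (tlam x e).

  (* Frames: □· M and V ·□ (the latter carries a proof that V is a value). *)
  Inductive frame : Type :=
  | FAppL (M : term) : frame
  | FAppR (V : term) (HV : value V) : frame.

  (* F [ e ] *)
  Definition fill (F : frame) (e : term) : term :=
    match F with
    | FAppL M   => tapp e M
    | FAppR V _ => tapp V e
    end.

  (* Capture-avoiding substitution, assumed rather than defined here. *)
  Parameter subst : string -> term -> term -> term.

  (* Base reduction: the rules that do real work (here, just beta). *)
  Inductive base_step : term -> term -> Prop :=
  | step_beta (x : string) (e v : term) :
      value v -> base_step (tapp (tlam x e) v) (subst x v e).

  (* Full reduction: base steps plus a single congruence rule, with the
     nesting handled in the step relation itself, as in the ξ rule above. *)
  Inductive step : term -> term -> Prop :=
  | step_base (e e' : term) :
      base_step e e' -> step e e'
  | step_frame (F : frame) (e e' : term) :
      step e e' -> step (fill F e) (fill F e').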
