# Equalities in Iris
Using Iris involves dealing with a few subtly different equivalence and equality
Using std++ and Iris involves dealing with a few subtly different equivalence and equality
relations, especially among propositions.
This document summarizes these relations and the subtle distinctions among them.
This is not a general introduction to Iris: instead, we discuss the different
......@@ -8,7 +8,7 @@ Iris equalities and the interface to their Coq implementation. In particular, we
discuss:
- Equality ("=") in the *on-paper* Iris metatheory
- Coq's Leibniz equality (`=`) and std++'s setoid equivalence (`≡`);
- N-equivalence on OFEs (`≡{n}≡`);
- Iris `n`-equivalence on OFEs (`≡{n}≡`);
- Iris internal equality (`≡` in `bi_scope`);
- Iris entailment and bi-entailment (`⊢`, `⊣⊢`).
......@@ -45,6 +45,52 @@ Here, stdpp adds the following facilities:
goal `f a ≡ f b` into `a ≡ b` given an appropriate `Proper` instance (here,
`Proper ((≡) ==> (≡)) f`).
## Defining Proper instances
- For each function `f` that could be used in generalized rewriting (e.g., has
setoid, ofe, ordered arguments), there should be a `Params f n` instance. This
instance forces Coq's setoid rewriting mechanism not to rewrite in the first
`n` arguments of the function `f`. This significantly reduces backtracking
during `Proper` search and thus improves performance/avoids failing instance
searches that diverge. These first arguments typically include type variables
(`A : Type` or `B : A → Type`), type class parameters (`C A`), and Leibniz
arguments (`i : nat` or `i : Z`), so they cannot be rewritten or do not need
setoid rewriting.
Examples:
+ For `cons : ∀ A, A → list A → list A` we have `Params (@cons) 1`,
indicating that the type argument named `A` is not up to rewriting.
+ For `replicate : ∀ A, nat → A → list A` we have `Params (@replicate) 2`
indicating that the type argument `A` is not up to rewriting and that the
`nat`-typed argument also does not show up as rewriteable in the `Proper`
instance (because rewriting with `=` doesn't need such an instance).
+ For `lookup : ∀ {Lookup K A M}, K → M → option A` we have
`Params (@lookup) 5`: there are 3 Type parameters, 1 type class, and a key
(which is Leibniz for all instances).
- Consequently, `Proper .. f` instances are always written in such a way
that `f` is partially applied with the first `n` arguments from `Params f n`.
Note that implicit arguments count here.
Further note that `Proper` instances never start with `(=) ==>`.
Examples:
+ `Proper ((≡@{A}) ==> (≡@{list A}) ==> (≡@{list A})) cons`,
where `cons` is `@cons A`, matching the 1 in `Params`.
+ `Proper ((≡@{A}) ==> (≡@{list A})) (replicate n)`,
where `replicate n` is `@replicate A n`.
+ `Proper ((≡@{M}) ==> (≡@{option A})) (lookup k)`,
where `lookup k` is `@lookup K A M _ k`, so 5 parameters are fixed, matching
the `Params` instance.
- Lemmas about higher-order functions often need `Proper` premises.
These are also written using the convention above. Example:
```coq
Lemma set_fold_ind `{FinSet A C} {B} (P : B → C → Prop) (f : A → B → B) (b : B) :
  (∀ x, Proper ((≡) ==> impl) (P x)) → ...
```
- For premises involving predicates (such as `P` in `set_fold_ind` above), we
always write the weakest `Proper`: that is, use `impl` instead of `iff` (and
in Iris, write `(⊢)` instead of `(⊣⊢)`). For "simple" `P`s, there should be
instances to solve both `impl` and `iff` using `solve_proper`, and for more
complicated cases where `solve_proper` fails, an `impl` is much easier to
prove by hand than an `iff`.
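
As a concrete illustration of these conventions, here is a small self-contained
sketch; the function `twice` and its instances are hypothetical (not part of
std++), but they follow the `Params`/`Proper` pattern described above:

```coq
From stdpp Require Import prelude list.

(* A hypothetical function with one type argument and one setoid argument. *)
Definition twice {A : Type} (x : A) : list A := replicate 2 x.

(* The type argument is not up to rewriting, so [Params] is 1 ... *)
Global Instance: Params (@twice) 1 := {}.

(* ... and the [Proper] instance fixes exactly that one argument. *)
Global Instance twice_proper `{Equiv A} :
  Proper ((≡) ==> (≡@{list A})) (@twice A).
Proof. unfold twice. solve_proper. Qed.
```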
## Equivalences on OFEs
On paper, OFEs involve two relations, equality "=" and distance "=_n". In Coq,
......
......@@ -33,7 +33,7 @@ constructor).
## Notation
Notation for writing HeapLang terms is defined in
[`notation.v`](../theories/heap_lang/notation.v). There are two scopes, `%E` for
[`notation.v`](../iris_heap_lang/notation.v). There are two scopes, `%E` for
expressions and `%V` for values. For example, `(a, b)%E` is an expression pair
and `(a, b)%V` a value pair. The `e` of a `WP e {{ Q }}` is implicitly in `%E`
scope.
......@@ -55,9 +55,12 @@ that the current goal is of the shape `WP e @ E {{ Q }}`.
Tactics to take one or more pure program steps:
- `wp_pure`: Perform one pure reduction step. Pure steps are defined by the
`PureExec` typeclass and include beta reduction, projections, constructors, as
well as unary and binary arithmetic operators.
- `wp_pure pat credit:"H"`: Perform one pure reduction step. `pat` optionally
defines the pattern that the redex has to match; it defaults to `_` (any
redex). The `credit:` argument is optional, too; when present, a later credit
will be generated in a fresh hypothesis named `"H"`.
Pure steps are defined by the `PureExec` typeclass and include beta reduction,
projections, constructors, as well as unary and binary arithmetic operators.
- `wp_pures`: Perform as many pure reduction steps as possible. This
tactic will **not** reduce lambdas/recs that are hidden behind a definition.
If the computation reaches a value, the `WP` will be entirely removed and the
......@@ -101,9 +104,16 @@ Further tactics:
particular when accessing invariants, which is only possible when the `WP` in
the goal is for a single, atomic operation -- `wp_bind` can be used to bring
the goal into the right shape.
- `wp_apply pm_trm`: Apply a lemma whose conclusion is a `WP`, automatically
applying `wp_bind` as needed. See the [ProofMode docs](./proof_mode.md) for an
explanation of `pm_trm`.
- `wp_apply pm_trm as (x1 ... xn) "ipat1 ... ipatn"`:
Apply a lemma whose conclusion is a `WP`, automatically applying `wp_bind` as
needed. The `as` clause is optional and can be used to introduce the
postcondition; this works particularly well for Texan triples. See the
[ProofMode docs](./proof_mode.md) for an explanation of `pm_trm` and `ipat`.
- `wp_smart_apply pm_trm as (x1 ... xn) "ipat1 ... ipatn"`:
like `wp_apply`, but also performs pure reduction steps to reveal a redex that
matches `pm_trm`. To be precise, if applying the lemma fails, `wp_smart_apply`
will perform a step of pure reduction (via `wp_pure`), and repeat. (This means
that `wp_smart_apply` is not the same as `wp_pures; wp_apply`.)
There is no tactic for `Fork`, just do `wp_apply wp_fork`.
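
As a rough illustration of how these tactics fit together, here is a sketch of
a proof of a hypothetical specification (the lemma and its name are made up for
this example); it assumes the usual `heapGS Σ` context and additionally uses
the per-operation tactics `wp_load` and `wp_store`:

```coq
From iris.heap_lang Require Import proofmode notation.

Lemma incr_spec `{!heapGS Σ} (l : loc) (n : Z) :
  {{{ l ↦ #n }}} #l <- ! #l + #1 {{{ RET #(); l ↦ #(n + 1) }}}.
Proof.
  iIntros (Φ) "Hl HΦ".
  wp_load.    (* symbolically execute [! #l] *)
  wp_pures.   (* the addition is a pure reduction step *)
  wp_store.   (* symbolically execute the store *)
  iApply ("HΦ" with "Hl").
Qed.
```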
......@@ -118,7 +128,7 @@ all of the redexes reduced.)
Sometimes, it is useful to define a derived notion in HeapLang that involves
thunks. For example, the parallel composition `e1 ||| e2` is definable in
HeapLang, but that requires thunking `e1` and `e2` before passing them to
`par`. (This is defined in [`par.v`](theories/heap_lang/lib/par.v).) However,
`par`. (This is defined in [`par.v`](../iris_heap_lang/lib/par.v).) However,
this is somewhat subtle because of the distinction between expression lambdas
and value lambdas.
......
;; Usage: copy to ~/.m17n.d/coq.mim
(input-method t coq)
(description "Input method for Coq")
(title "Coq")
(map (trans
;; Standard LaTeX math notations
("\\forall" "∀")
("\\exists" "∃")
("\\lam" "λ")
("\\not" "¬")
("\\/" "∨")
("/\\" "∧")
("->" "→")
("<->" "↔")
("\\<-" "←") ;; we add a backslash because the plain <- is used for the rewrite tactic
("\\==" "≡")
("\\/==" "≢")
("/=" "≠")
("<=" "≤")
("\\in" "∈")
("\\notin" "∉")
("\\cup" "∪")
("\\cap" "∩")
("\\setminus" "∖")
("\\subset" "⊂")
("\\subseteq" "⊆")
("\\sqsubseteq" "⊑")
("\\sqsubseteq" "⊑")
("\\notsubseteq" "⊈")
("\\meet" "⊓")
("\\join" "⊔")
("\\top" "⊤")
("\\bottom" "⊥")
("\\vdash" "⊢")
("\\dashv" "⊣")
("\\Vdash" "⊨")
("\\infty" "∞")
("\\comp" "∘")
("\\prf" "↾")
("\\bind" "≫=")
("\\mapsto" "↦")
("\\hookrightarrow" "↪")
("\\uparrow" "↑")
;; Iris specific
("\\fun" "λ")
("\\mult" "⋅")
("\\ent" "⊢")
("\\valid" "✓")
("\\diamond" "◇")
("\\box" "□")
("\\bbox" "■")
("\\later" "▷")
("\\pred" "φ")
("\\and" "∧")
("\\or" "∨")
("\\comp" "∘")
("\\ccomp" "◎")
("\\all" "∀")
("\\ex" "∃")
("\\to" "→")
("\\sep" "∗")
("\\lc" "⌜")
("\\rc" "⌝")
("\\Lc" "⎡")
("\\Rc" "⎤")
("\\empty" "∅")
("\\Lam" "Λ")
("\\Sig" "Σ")
("\\-" "∖")
("\\aa" "●")
("\\af" "◯")
("\\auth" "●")
("\\frag" "◯")
("\\iff" "↔")
("\\gname" "γ")
("\\incl" "≼")
("\\latert" "▶")
("\\update" "⇝")
("\\bind" "≫=")
;; accents (for iLöb)
("\\\"o" "ö")
;; subscripts and superscripts
("^^+" "⁺") ("__+" "₊") ("^^-" "⁻")
("__0" "₀") ("__1" "₁") ("__2" "₂") ("__3" "₃") ("__4" "₄")
("__5" "₅") ("__6" "₆") ("__7" "₇") ("__8" "₈") ("__9" "₉")
("__a" "ₐ") ("__e" "ₑ") ("__h" "ₕ") ("__i" "ᵢ") ("__k" "ₖ")
("__l" "ₗ") ("__m" "ₘ") ("__n" "ₙ") ("__o" "ₒ") ("__p" "ₚ")
("__r" "ᵣ") ("__s" "ₛ") ("__t" "ₜ") ("__u" "ᵤ") ("__v" "ᵥ") ("__x" "ₓ")
;; Greek alphabet
("\\Alpha" "Α") ("\\alpha" "α")
("\\Beta" "Β") ("\\beta" "β")
("\\Gamma" "Γ") ("\\gamma" "γ")
("\\Delta" "Δ") ("\\delta" "δ")
("\\Epsilon" "Ε") ("\\epsilon" "ε")
("\\Zeta" "Ζ") ("\\zeta" "ζ")
("\\Eta" "Η") ("\\eta" "η")
("\\Theta" "Θ") ("\\theta" "θ")
("\\Iota" "Ι") ("\\iota" "ι")
("\\Kappa" "Κ") ("\\kappa" "κ")
("\\Lamda" "Λ") ("\\lamda" "λ")
("\\Lambda" "Λ") ("\\lambda" "λ")
("\\Mu" "Μ") ("\\mu" "μ")
("\\Nu" "Ν") ("\\nu" "ν")
("\\Xi" "Ξ") ("\\xi" "ξ")
("\\Omicron" "Ο") ("\\omicron" "ο")
("\\Pi" "Π") ("\\pi" "π")
("\\Rho" "Ρ") ("\\rho" "ρ")
("\\Sigma" "Σ") ("\\sigma" "σ")
("\\Tau" "Τ") ("\\tau" "τ")
("\\Upsilon" "Υ") ("\\upsilon" "υ")
("\\Phi" "Φ") ("\\phi" "φ")
("\\Chi" "Χ") ("\\chi" "χ")
("\\Psi" "Ψ") ("\\psi" "ψ")
("\\Omega" "Ω") ("\\omega" "ω")
))
(state (init (trans)))
......@@ -81,7 +81,7 @@ avoided.
## Type class resolution control
When you are writing a module that exports some Iris term for others to use
(e.g., `join_handle` in the [spawn module](../theories/heap_lang/lib/spawn.v)), be
(e.g., `join_handle` in the [spawn module](../iris_heap_lang/lib/spawn.v)), be
sure to mark these terms as opaque for type class search at the *end* of your
module (and outside any section):
```coq
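(* A hypothetical sketch of such a command, for the [join_handle] example
   mentioned above: *)
Typeclasses Opaque join_handle.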
......@@ -99,7 +99,7 @@ for further details on `libG` classes). For example, the STS library is
parameterized by an STS and assumes that the STS state space is inhabited:
```coq
Class stsG Σ (sts : stsT) := {
sts_inG :> inG Σ (stsR sts);
#[local] sts_inG :: inG Σ (stsR sts);
sts_inhabited :> Inhabited (sts.state sts);
}.
```
......
......@@ -21,7 +21,7 @@ pervasively. These are defined in dedicated sections in this manual.
Many of the tactics below apply to more goals than described in this document
since the behavior of these tactics can be tuned via instances of the type
classes in the file [proofmode/classes](iris/proofmode/classes.v). Most notably, many
classes in the file [proofmode/classes](../iris/proofmode/classes.v). Most notably, many
of the tactics can be applied when the connective to be introduced or to be eliminated
appears under a later, an update modality, or in the conclusion of a
weakest precondition.
......@@ -162,11 +162,11 @@ Elimination of logical connectives
spatial context matching pattern `pat`.
In case all branches of `ipat` start with a `#` (which causes the hypothesis
to be moved to the intuitionistic context) or with an `%` (which causes the
hypothesis to be moved to the pure Coq context), then one can use all
hypotheses for proving the premises of `pm_trm`, as well as for proving the
resulting goal. Note that in this case the hypotheses still need to be
subdivided among the spatial premises.
to be moved to the intuitionistic context), with an `%` (which causes the
hypothesis to be moved to the pure Coq context), or with an `->`/`<-` (which
performs a rewrite), then one can use all hypotheses for proving the premises
of `pm_trm`, as well as for proving the resulting goal. Note that in this case
the hypotheses still need to be subdivided among the spatial premises.
Separation logic-specific tactics
---------------------------------
......@@ -184,10 +184,17 @@ Separation logic-specific tactics
- `iFrame select (pat)%I` : cancel the last hypothesis of the intuitionistic
or spatial context that matches pattern `pat`.
- `iCombine "H1 H2" as "ipat"` : combine `H1 : P1` and `H2 : P2` into `H: P1 ∗
P2` or something simplified but equivalent, then destruct the combined
P2` or something simplified but equivalent, then `destruct` the combined
hypothesis using `ipat`. Some examples of simplifications `iCombine` knows
about are to combine `own γ x` and `own γ y` into `own γ (x ⋅ y)`, and to
combine `l ↦{1/2} v` and `l ↦{1/2} v` into `l ↦ v`.
- `iCombine "H1 H2" gives "ipat"` : from `H1 : P1` and `H2 : P2`, find
persistent consequences of `P1 ∗ P2`, then `destruct` this consequence with
`ipat`. Some examples of persistent consequences `iCombine` knows about are
that `own γ x` and `own γ y` gives `✓ (x ⋅ y)`, and that
`l ↦{#q1} v1` and `l ↦{#q2} v2` gives `⌜(q1 + q2 ≤ 1) ∧ v1 = v2⌝`.
- `iCombine "H1 H2" as "ipat1" gives "ipat2"` combines the two functionalities
of `iCombine` described above.
- `iAccu` : solve a goal that is an evar by instantiating it with all
hypotheses from the spatial context joined together with a separating
conjunction (or `emp` in case the spatial context is empty). Not commonly
......@@ -233,18 +240,26 @@ Induction
+ `iLöb as "IH" forall (x1 ... xn) "selpat"` : perform Löb induction,
generalizing over the Coq level variables `x1 ... xn`, the hypotheses given by
the selection pattern `selpat`, and the spatial context as usual.
- `iInduction x as cpat "IH" "selpat"` : perform induction on
the Coq term `x`. The Coq introduction pattern `cpat` is used to name the introduced
variables. The induction hypotheses are inserted into the intuitionistic
context and given fresh names prefixed `IH`.
+ `iInduction x as cpat "IH" forall (x1 ... xn) "selpat"` : perform induction,
- `iInduction x as cpat` : perform induction on the Coq term `x`. The
Coq introduction pattern `cpat` is used to name the introduced variables,
including the induction hypotheses which are introduced into the proof mode
context. Note that induction hypotheses should not be put into string
quotation marks `".."`, e.g., use `iInduction n as [|n IH]` to perform
induction on a natural number `n`. An implementation caveat is that the names
of the induction hypotheses should be fresh in both the Coq context and the
proof mode context.
+ `iInduction x as cpat forall (x1 ... xn) "selpat"` : perform induction,
generalizing over the Coq level variables `x1 ... xn`, the hypotheses given by
the selection pattern `selpat`, and the spatial context.
+ `iInduction x as cpat "IH" using t` : perform induction using the induction
+ `iInduction x as cpat using t` : perform induction using the induction
scheme `t`. Common examples of induction schemes are `map_ind`, `set_ind_L`,
and `gmultiset_ind` for `gmap`, `gset`, and `gmultiset`.
+ `iInduction x as cpat "IH" using t forall (x1 ... xn) "selpat"` : the above
+ `iInduction x as cpat using t forall (x1 ... xn) "selpat"` : the above
variants combined.
+ `iInduction x as cpat "IH" "selpat"` : ignore the names of the induction
hypotheses from `cpat` and call them `IH`, `IH1`, `IH2`, etc. Here "IH" is
a string (in string quotation marks). This is *legacy* syntax that might be
deprecated in the future. Similar legacy syntax exists for all variants above.
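
As a small (hypothetical) example of the new syntax, here is a proof by
`iInduction` on a natural number, with the induction hypothesis named directly
in the Coq intro pattern:

```coq
From iris.proofmode Require Import proofmode.

Lemma iter_step {PROP : bi} (Φ : nat → PROP) (n : nat) :
  (∀ i, Φ i ⊢ Φ (S i)) →
  Φ 0 ⊢ Φ n.
Proof.
  iIntros (Hstep) "H0".
  iInduction n as [|n IH].
  - iExact "H0".
  - iApply Hstep. iApply ("IH" with "H0").
Qed.
```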
Rewriting / simplification
--------------------------
......@@ -266,6 +281,10 @@ Rewriting / simplification
- `iSimpl` / `iSimpl in "selpat"` : perform `simpl` on the proof mode
goal / hypotheses given by the selection pattern `selpat`. This is a
shorthand for `iEval (simpl)`.
- `iUnfold xs` / `iUnfold xs in "selpat"` : perform `unfold xs` on the proof
mode goal / hypotheses given by the selection pattern `selpat`. This is a
shorthand for `iEval (unfold xs)`. Similar to Coq's `unfold`, the argument
`xs` should be a comma-separated non-empty list.
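
For instance, `iSimpl in "H"` can be used to simplify a computation inside a
hypothesis (a small hypothetical sketch):

```coq
From iris.proofmode Require Import proofmode.

Lemma simpl_in_hyp {PROP : bi} (Φ : nat → PROP) :
  Φ (1 + 1) ⊢ Φ 2.
Proof.
  iIntros "H".
  iSimpl in "H".   (* the hypothesis now reads [Φ 2] *)
  iExact "H".
Qed.
```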
Iris
----
......
......@@ -14,12 +14,20 @@ Libraries typically bundle the `inG` they need in a `libG` typeclass, so they do
not have to expose to clients what exactly their resource algebras are. For
example, in the [one-shot example](../tests/one_shot.v), we have:
```coq
Class one_shotG Σ := { one_shot_inG :> inG Σ one_shotR }.
Class one_shotG Σ := { one_shot_inG : inG Σ one_shotR }.
Local Existing Instances one_shot_inG.
```
The `:>` means that the projection `one_shot_inG` is automatically registered as
an instance for type-class resolution. If you need several resource algebras,
just add more `inG` fields. If you are using another module as part of yours,
add a field like `one_shot_other :> otherG Σ`.
Here, the projection `one_shot_inG` is registered as an instance for type-class
resolution. If you need several resource algebras, just add more `inG` fields.
If you are using another module as part of yours, add a field like
`one_shot_other : otherG Σ`. All of these fields should be added to the
`Local Existing Instances` command.
The code above enables these typeclass instances only in the surrounding file
where they are used to define the new abstractions by the library. The
interface of these abstractions will only depend on the `one_shotG` class.
Since `one_shot_inG` will be hidden from clients, they will not accidentally
rely on it in their code.
Having defined the type class, we need to provide a way to instantiate it. This
is an important step, as not every resource algebra can actually be used: if
......@@ -51,12 +59,12 @@ Now you can add `one_shotG` as an assumption to all your module definitions and
proofs. We typically use a section for this:
```coq
Section proof.
Context `{!heapG Σ, !one_shotG Σ}.
Context `{!heapGS Σ, !one_shotG Σ}.
```
Notice that besides our own assumptions `one_shotG`, we also assume `heapG`,
Notice that besides our own assumptions `one_shotG`, we also assume `heapGS`,
which are assumptions that every HeapLang proof makes (they are related to
defining the `↦` connective as well as the basic Iris infrastructure for
invariants and WP). For this purpose, `heapG` contains not only assumptions
invariants and WP). For this purpose, `heapGS` contains not only assumptions
about `Σ`, it also contains some ghost names to refer to particular ghost state
(see "global ghost state instances" below).
......@@ -91,10 +99,10 @@ To obtain a closed Iris proof, i.e., a proof that does not make assumptions like
`inG`, you have to assemble a list of functors of all the involved modules,
and if your proof relies on some singleton (most do, at least indirectly; also
see the next section), you have to call the respective initialization or
adequacy lemma. [For example](tests/one_shot.v):
adequacy lemma. [For example](../tests/one_shot.v):
```coq
Section client.
Context `{!heapG Σ, !one_shotG Σ, !spawnG Σ}.
Context `{!heapGS Σ, !one_shotG Σ, !spawnG Σ}.
Lemma client_safe : WP client {{ _, True }}%I.
(* ... *)
......@@ -118,10 +126,11 @@ class for the generalized heap module, bundles the usual `inG` assumptions
together with the `gname`:
```coq
Class gen_heapGpreS (L V : Type) (Σ : gFunctors) `{Countable L} := {
gen_heapGpreS_heap :> ghost_mapG Σ L V;
#[local] gen_heapGpreS_heap :: ghost_mapG Σ L V;
}.
Class gen_heapGS (L V : Type) (Σ : gFunctors) `{Countable L} := GenHeapGS {
gen_heap_inG :> gen_heapGpreS L V Σ;
#[local] gen_heap_inG :: gen_heapGpreS L V Σ;
gen_heap_name : gname;
}.
```
......@@ -132,9 +141,9 @@ would be incompatible with each other.
The `gen_heapGpreS` typeclass (without the singleton data) is relevant for
initialization, i.e., to create an instance of `gen_heapGS`. This is happening as
part of [`heap_adequacy`](iris_heap_lang/adequacy.v) using the
part of [`heap_adequacy`](../iris_heap_lang/adequacy.v) using the
initialization lemma for `gen_heapGS` from
[`gen_heap_init`](iris/base_logic/lib/gen_heap.v):
[`gen_heap_init`](../iris/base_logic/lib/gen_heap.v):
```coq
Lemma gen_heap_init `{gen_heapGpreS L V Σ} σ :
(|==> ∃ _ : gen_heapGS L V Σ, gen_heap_ctx σ)%I.
......@@ -222,9 +231,9 @@ F (X,X⁻) := gmap K (agree (nat * ▶ X))
To make it convenient to construct such functors and prove their contractivity,
we provide a number of abstractions:
- [`oFunctor`](iris/algebra/ofe.v): functors from COFEs to OFEs.
- [`rFunctor`](iris/algebra/cmra.v): functors from COFEs to cameras.
- [`urFunctor`](iris/algebra/cmra.v): functors from COFEs to unital
- [`oFunctor`](../iris/algebra/ofe.v): functors from COFEs to OFEs.
- [`rFunctor`](../iris/algebra/cmra.v): functors from COFEs to cameras.
- [`urFunctor`](../iris/algebra/cmra.v): functors from COFEs to unital
cameras.
Besides, there are the classes `oFunctorContractive`, `rFunctorContractive`, and
......@@ -249,7 +258,7 @@ Using these combinators, one can easily construct bigger functors in point-free
style and automatically infer their contractivity, e.g:
```coq
F := gmaURF K (agreeRF (prodOF (constOF natO) (laterOF idOF)))
F := gmapURF K (agreeRF (prodOF (constOF natO) (laterOF idOF)))
```
which effectively defines the desired example functor
......@@ -273,7 +282,9 @@ Putting it all together, the `libG` typeclass and `libΣ` list of functors for
your example would look as follows:
```coq
Class libG Σ := { lib_inG :> inG Σ (gmapR K (agreeR (prodO natO (laterO (iPropO Σ))))) }.
Class libG Σ :=
{ #[local] lib_inG :: inG Σ (gmapR K (agreeR (prodO natO (laterO (iPropO Σ))))) }.
Definition libΣ : gFunctors := #[GFunctor (gmapRF K (agreeRF (natO * ▶ ∙)))].
Instance subG_libΣ {Σ} : subG libΣ Σ → libG Σ.
Proof. solve_inG. Qed.
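(* A hypothetical sketch of how a client would assemble the final list of
   functors for a closed proof, combining [libΣ] with the functor lists of the
   other libraries it uses (e.g. [heapΣ] for HeapLang): *)
Definition clientΣ : gFunctors := #[heapΣ; libΣ].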
......
# Iris proof style
**Warning:** this document is still in development and rather incomplete.
If you run across a question of style (for example, something comes
up in an MR) and it's not on this list, please do reach out to us on Mattermost
so we can add it.
## Basic syntax
### Binders
This applies to Context, Implicit Types, and definitions.
- **Put a space on both sides of the colon.**
Good: `(a : B)` <br>
Bad: `(a:B)`, `(a: B)`
- **Prefer `(a : B)` to `a : B`.** (TODO)
### Patterns
#### Disjunctions & branches
- **Always mark the disjuncts when destructuring a disjunctive pattern, even if you don't bind anything, to indicate that the proof branches.**
Good:
```coq
Lemma foo : ∀ b : bool, b = true ∨ b = false.
Proof.
intros [|].
...
```
Bad:
```coq
Lemma foo : ∀ b : bool, b = true ∨ b = false.
Proof.
intros [].
...
```
#### Uncategorized
- **Use `"[H1 H2]"` when possible; otherwise use `"(H1 & H2 & H3)"`.** (TODO)
### Unicode
- **Always use Unicode variants of forall, exists, ->, <=, >=.**
Good: `∀ ∃ → ≤ ≥`<br>
Bad: `forall exists -> <= >=`
### Equivalent vernacular commands
- **Use `Context`, never `Variable`.**
- **Use `Implicit Types`, never `Implicit Type`.** (TODO)
- **Use `Lemma`, not `Theorem` (or the other variants: `Fact`, `Corollary`, `Remark`)**.
- **Always add `Global` or `Local` to `Hint`, `Arguments`, and `Instance`.** (TODO)
### `Require`
- **Never put `Require` in the middle of a file.** All `Require` must be at the top.
If you only want to *import* a certain module in some specific place (for instance, in a `Section` or other `Module`), you can do something like:
```coq
From lib Require component.
(* ... *)
Import component.
```
### Ltac
- **Prefer `first [ t1 | fail 1 "..." ]` to `t1 || fail "..."`.** This is better because the latter will fail if `t1` doesn't make progress. See https://gitlab.mpi-sws.org/iris/iris/-/issues/216. Note that `first [ t1 | fail "..."]` is simply incorrect; the failure message will never show up and will be replaced with a generic failure.
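A minimal sketch of the recommended pattern (the tactic and message are hypothetical):
```coq
Ltac solve_foo :=
  first [ assumption | fail 1 "solve_foo: no applicable hypothesis" ].
```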
### Coqdoc comments
- **Module-level comments (covering the entire file) go at the top of the file, before the `Require`.**
### Uncategorized
- **Indent the body of a match by one space.**
Good:
```coq
match foo with
| Some x =>
long line here using
```
*RJ*: This is odd, usually everything is in (multiples of) 2 spaces, I think.
*Tej*: https://gitlab.mpi-sws.org/iris/iris/-/blob/920bc3d97b8830139045e1780f2aff4d05b910cd/iris_heap_lang/proofmode.v#L194
- **Avoid using the extra square brackets around an Ltac match.**
Good:
```coq
match goal with
| |- ?g => idtac g
```
Bad:
```coq
match goal with
| [ |- ?g ] => idtac g
```
- **Use coqdoc syntax in comments for Coq identifiers and inline code, e.g. `[cmraT]`.**
- **Put proofs either all on one line (`Proof. reflexivity. Qed.`) or split up the usual way with indentation.**
Bad:
```coq
Lemma foo : 2 + 2 = 4 ∧ 1 + 2 = 3.
Proof. split.
- reflexivity.
- done.
Qed.
```
- **Put the entire theorem statement on one line or one premise per line, indented by 2 spaces.**
Bad:
```coq
Lemma foo x y z :
x < y → y < z →
x < z.
```
Good:
```coq
Lemma foo x y z : x < y → y < z → x < z.
```
Good: (particularly if premises are longer)
```coq
Lemma foo x y z :
  x < y →
  y < z →
  x < z.
```
- **If the parameters before the `:` become too long, indent later lines by 4 spaces and always have a newline after `:`.**
Bad:
```coq
Lemma foo (very_long_name : has_a_very_long_type)
(even_longer_name : has_an_even_longer_type) : x < y → y < z →
x < z.
```
Good:
```coq
Lemma foo (very_long_name : has_a_very_long_type)
    (even_longer_name : has_an_even_longer_type) :
  x < y → y < z → x < z.
```
- **For definitions that don't fit into one line, put `:=` before the linebreak, not after.**
Bad:
```coq
Definition foo (arg1 arg2 arg3 : name_of_the_type)
:= the body that is very long.
```
Good:
```coq
Definition foo (arg1 arg2 arg3 : name_of_the_type) :=
the body that is very long.
```
- **For tests with output put `Check "theorem name in a string"` before it so that the output from different tests is separated.**
- **For long `t1; t2; t3` and `t; [t1 | t2 | t3]` split them like this, and indent by 2 spaces.**
Good:
```coq
t;
  [t1
  |t2
  |t3].
```
```coq
t;
  t1;
  t2.
```
## File organization
theories/algebra is for primitive ofe/RA/CMRA constructions
theories/algebra/lib is for derived constructions
theories/base_logic/lib is for constructions in the base logic (using own)
## Naming
* `*_ctx` for persistent facts (often an invariant) needed by everything in a library
* `*_interp` for a function from some representation to iProp
* If you have lemma `foo` which is an iff and you want single direction versions, name them `foo_1` (forward) and `foo_2` (backward)
* If you have a lemma `foo` parametrized by an equivalence relation, you might want a version specialized to Leibniz equality for better rewrite support; name that version `foo_L` and state it with plain equality (e.g., `dom_empty_L` in stdpp). You might take an assumption `LeibnizEquiv A` if the original version took an equivalence (say the OFE equivalence) to assume that the provided equivalence is plain equality.
* Lower-case theorem names, lower-case types, upper-case constructors
* **TODO:** how should `f (g x) = f' (g' x)` be named?
* `list_lookup_insert` is named by context (the type involved), then the two functions outside-in on the left-hand-side, so it has the type `lookup (insert …) = …` where the `insert` is on a list. Notations mean it doesn’t actually look like this and the insert is textually first.
* Operations that "extract" from the data structure (`lookup`, `elem_of`, ...) should come before operations that "alter" the data structure (`filter`, `insert`, `union`, `fmap`). For example, use `map_lookup_filter` instead of `map_filter_lookup`.
* Injectivity theorems are instances of `Inj` and then are used with `inj`
* Suffixes `_l` and `_r` when we have binary `op x y` and a theorem related to the left or right. For example, `sep_mono_l` says bi_sep is monotonic in its left argument (holding the right constant).
* Suffix `'` (prime) is used when `foo'` is a corollary of `foo`. Try to avoid
these since the name doesn't convey how `foo'` is related to `foo`.
* Given a polymorphic function/relation `f` (e.g., `eq`, `equiv`, `subseteq`), the instance of type `A` is called `A_f_instance`, and we add a lemma `A_f` that characterizes the instance. In case of many instances, this lemma is proved by unfolding the definition of the instance, e.g., `frac_op`, `frac_valid`. However, in some cases, e.g., `list_eq`, `map_eq`, `set_eq` this lemma might require non-trivial proof work.
* For lemmas involving modalities, we usually follow this naming scheme:
```
M1_into_M2: M1 P ⊢ M2 P
M1_M2_elim: M1 (M2 P) ⊣⊢ M1 P
M1_elim_M2: M1 (M2 P) ⊣⊢ M2 P
M1_M2: M1 (M2 P) ⊣⊢ M2 (M1 P)
```
* Monotonicity lemmas where the relation can be ambiguous are called `<f>_<relation>_mono`, e.g. `Some_included_mono`.
* For lemmas `f x = g ...` that give a definition of function `f` in terms of `g`, we use `f_as_g`. For example, `map_compose_as_omap : m ∘ₘ n = omap (m !!.) n`.
### Naming algebra libraries
**TODO:** describe any conclusions we came to with the `mono_nat` library
## Parameter order and implicitness for lemmas
* Parameter order is usually from more higher-order to less higher-order (types, functions, plain values), and otherwise follows the order in which variables appear in the lemma statement.
* In new lemmas, arguments should be marked as implicit when they can be inferred by unification in all intended usage scenarios. (If an argument can *sometimes* be inferred, we do not have a clear guideline yet -- consider on a case-by-case basis, for now.)
## Lemma statements
### Iris lemmas: `-∗` vs `⊢`
* For low-level lemmas, in particular if there is a high likelihood someone would want to rewrite with it / use them in non-proofmode goals (e.g. modality intro rules), use `⊢`
* `P ⊢ |==£> P`
* `(|==£> |==£> P) ⊢ |==£> P`
* `▷ own γ a ⊢ ◇ ∃ b, own γ b ∧ ▷ (a ≡ b)`
* `(P -∗ Q) i ⊢ (P i -∗ Q i)`
* If a lemma is a Coq implication of Iris entailments (where the entailments are visible, not hidden behind e.g. `Persistent`), then use `⊢`
* `(P1 ⊢ P2) → recv l P1 ⊢ recv l P2`
* Else use -∗
* `a1 ⋅ a2 ~~> a' → own γ a1 -∗ own γ a2 ==∗ own γ a'` (curried and hence not rewrite-friendly)
## Metavariables
**TODO:** move these to the right place
* `P` `Q` for bi:PROP (or specifically `iProp Σ`)
* `Φ` and `Ψ` for (?A -> iProp), like postconditions
* `φ` and `ψ` for `Prop`
* `A` `B` for types, ofeT, or cmraT
Suffixes like O, R, UR (already documented)
snippet \forall "" i
∀
endsnippet
snippet \exists "" i
∃
endsnippet
snippet \lam "" i
λ
endsnippet
snippet \not "" i
¬
endsnippet
snippet \lor "" i
∨
endsnippet
snippet \land "" i
∧
endsnippet
snippet \/ "" i
∨
endsnippet
snippet /\ "" i
∧
endsnippet
snippet \rightarrow "" i
→
endsnippet
snippet \implies "" i
endsnippet
snippet -> "" i
→
endsnippet
snippet \iff "" i
↔
endsnippet
snippet <-> "" i
↔
endsnippet
snippet \<- "" i
←
endsnippet
snippet \cong "" i
≅
endsnippet
snippet \== "" i
≡
endsnippet
snippet \/== "" i
≢
endsnippet
snippet \neq "" i
≠
endsnippet
snippet /= "" i
≠
endsnippet
snippet \less "" i
endsnippet
snippet \le "" i
≤
endsnippet
snippet <= "" i
≤
endsnippet
snippet \in "" i
∈
endsnippet
snippet \notin "" i
∉
endsnippet
snippet \cup "" i
∪
endsnippet
snippet \cap "" i
∩
endsnippet
snippet \setminus "" i
∖
endsnippet
snippet \subset "" i
⊂
endsnippet
snippet \subseteq "" i
⊆
endsnippet
snippet \sqsubseteq "" i
⊑
endsnippet
snippet \sqsubseteq "" i
⊑
endsnippet
snippet \notsubseteq "" i
⊈
endsnippet
snippet \meet "" i
⊓
endsnippet
snippet \join "" i
⊔
endsnippet
snippet \true "" i
endsnippet
snippet \top "" i
⊤
endsnippet
snippet \false "" i
endsnippet
snippet \bottom "" i
⊥
endsnippet
snippet \vdash "" i
⊢
endsnippet
snippet \dashv "" i
⊣
endsnippet
snippet \vDash "" i
⊨
endsnippet
snippet \Vdash
endsnippet
snippet \infty "" i
∞
endsnippet
snippet \comp "" i
∘
endsnippet
snippet \prf "" i
↾
endsnippet
snippet \bind "" i
≫=
endsnippet
snippet \mapsto "" i
↦
endsnippet
snippet \hookrightarrow "" i
↪
endsnippet
snippet \up "" i
endsnippet
snippet \uparrow "" i
↑
endsnippet
snippet \fun "" i
λ
endsnippet
snippet \mult "" i
⋅
endsnippet
snippet \ent "" i
⊢
endsnippet
snippet \valid "" i
✓
endsnippet
snippet \diamond "" i
◇
endsnippet
snippet \box "" i
□
endsnippet
snippet \bbox "" i
■
endsnippet
snippet \eval "" i
endsnippet
snippet \rhd "" i
endsnippet
snippet \later "" i
▷
endsnippet
snippet \pred "" i
φ
endsnippet
snippet \and "" i
∧
endsnippet
snippet \or "" i
∨
endsnippet
snippet \circ "" i
∘
endsnippet
snippet \comp "" i
∘
endsnippet
snippet \ccomp "" i
◎
endsnippet
snippet \pound "" i
£
endsnippet
snippet \all "" i
∀
endsnippet
snippet \ex "" i
∃
endsnippet
snippet \to "" i
→
endsnippet
snippet \ast "" i
∗
endsnippet
snippet \sep "" i
∗
endsnippet
snippet \ulc "" i
endsnippet
snippet \urc "" i
endsnippet
snippet \lc "" i
⌜
endsnippet
snippet \rc "" i
⌝
endsnippet
snippet \Lc "" i
⎡
endsnippet
snippet \Rc "" i
⎤
endsnippet
snippet \varnothing "" i
∅
endsnippet
snippet \empty "" i
∅
endsnippet
snippet \Lam "" i
Λ
endsnippet
snippet \Sig "" i
Σ
endsnippet
snippet \- "" i
∖
endsnippet
snippet \aa "" i
●
endsnippet
snippet \af "" i
◯
endsnippet
snippet \auth "" i
●
endsnippet
snippet \frag "" i
◯
endsnippet
snippet \iff "" i
↔
endsnippet
snippet \gname "" i
γ
endsnippet
snippet \incl "" i
≼
endsnippet
snippet \latert "" i
▶
endsnippet
snippet \update "" i
⇝
endsnippet
snippet \bind "" i
≫=
endsnippet
snippet ^^+ "" i
⁺
endsnippet
snippet __+ "" i
₊
endsnippet
snippet ^^- "" i
⁻
endsnippet
snippet __0 "" i
₀
endsnippet
snippet __1 "" i
₁
endsnippet
snippet __2 "" i
₂
endsnippet
snippet __3 "" i
₃
endsnippet
snippet __4 "" i
₄
endsnippet
snippet __5 "" i
₅
endsnippet
snippet __6 "" i
₆
endsnippet
snippet __7 "" i
₇
endsnippet
snippet __8 "" i
₈
endsnippet
snippet __9 "" i
₉
endsnippet
snippet __a "" i
ₐ
endsnippet
snippet __e "" i
ₑ
endsnippet
snippet __h "" i
ₕ
endsnippet
snippet __i "" i
ᵢ
endsnippet
snippet __k "" i
ₖ
endsnippet
snippet __l "" i
ₗ
endsnippet
snippet __m "" i
ₘ
endsnippet
snippet __n "" i
ₙ
endsnippet
snippet __o "" i
ₒ
endsnippet
snippet __p "" i
ₚ
endsnippet
snippet __r "" i
ᵣ
endsnippet
snippet __s "" i
ₛ
endsnippet
snippet __t "" i
ₜ
endsnippet
snippet __u "" i
ᵤ
endsnippet
snippet __v "" i
ᵥ
endsnippet
snippet __x "" i
ₓ
endsnippet
snippet \Alpha "" i
Α
endsnippet
snippet \alpha "" i
α
endsnippet
snippet \Beta "" i
Β
endsnippet
snippet \beta "" i
β
endsnippet
snippet \Gamma "" i
Γ
endsnippet
snippet \gamma "" i
γ
endsnippet
snippet \Delta "" i
Δ
endsnippet
snippet \delta "" i
δ
endsnippet
snippet \Epsilon "" i
Ε
endsnippet
snippet \epsilon "" i
ε
endsnippet
snippet \Zeta "" i
Ζ
endsnippet
snippet \zeta "" i
ζ
endsnippet
snippet \Eta "" i
Η
endsnippet
snippet \eta "" i
η
endsnippet
snippet \Theta "" i
Θ
endsnippet
snippet \theta "" i
θ
endsnippet
snippet \Iota "" i
Ι
endsnippet
snippet \iota "" i
ι
endsnippet
snippet \Kappa "" i
Κ
endsnippet
snippet \kappa "" i
κ
endsnippet
snippet \Lamda "" i
Λ
endsnippet
snippet \lamda "" i
λ
endsnippet
snippet \Lambda "" i
Λ
endsnippet
snippet \lambda "" i
λ
endsnippet
snippet \Mu "" i
Μ
endsnippet
snippet \mu "" i
μ
endsnippet
snippet \Nu "" i
Ν
endsnippet
snippet \nu "" i
ν
endsnippet
snippet \Xi "" i
Ξ
endsnippet
snippet \xi "" i
ξ
endsnippet
snippet \Omicron "" i
Ο
endsnippet
snippet \omicron "" i
ο
endsnippet
snippet \Pi "" i
Π
endsnippet
snippet \pi "" i
π
endsnippet
snippet \Rho "" i
Ρ
endsnippet
snippet \rho "" i
ρ
endsnippet
snippet \Sigma "" i
Σ
endsnippet
snippet \sigma "" i
σ
endsnippet
snippet \Tau "" i
Τ
endsnippet
snippet \tau "" i
τ
endsnippet
snippet \Upsilon "" i
Υ
endsnippet
snippet \upsilon "" i
υ
endsnippet
snippet \Phi "" i
Φ
endsnippet
snippet \phi "" i
ϕ
endsnippet
snippet \varphi "" i
φ
endsnippet
snippet \Chi "" i
Χ
endsnippet
snippet \chi "" i
χ
endsnippet
snippet \Psi "" i
Ψ
endsnippet
snippet \psi "" i
ψ
endsnippet
snippet \Omega "" i
Ω
endsnippet
snippet \omega "" i
ω
endsnippet
"generic-input-methods.input-methods": [
{
"name": "Iris Math",
"commandName": "text.math",
"languages": [
"coq"
],
"triggers": [
"\\"
],
"dictionary": [
// Standard LaTeX math notations
{ "label": "forall", "body": "∀", "description": "∀" },
{ "label": "exists", "body": "∃", "description": "∃" },
{ "label": "lam", "body": "λ", "description": "λ" },
{ "label": "not", "body": "¬", "description": "¬" },
{ "label": "->", "body": "→", "description": "→" },
{ "label": "<->", "body": "↔", "description": "↔" },
{ "label": "<-", "body": "←", "description": "←" },
{ "label": "==", "body": "≡", "description": "≡" },
{ "label": "/==", "body": "≢", "description": "≢" },
{ "label": "/=", "body": "≠", "description": "≠" },
{ "label": "neq", "body": "≠", "description": "≠" },
{ "label": "nequiv", "body": "≢", "description": "≢" },
{ "label": "<=", "body": "≤", "description": "≤" },
{ "label": "leq", "body": "≤", "description": "≤" },
{ "label": "in", "body": "∈", "description": "∈" },
{ "label": "notin", "body": "∉", "description": "∉" },
{ "label": "cup", "body": "∪", "description": "∪" },
{ "label": "cap", "body": "∩", "description": "∩" },
{ "label": "setminus", "body": "∖", "description": "∖" },
{ "label": "subset", "body": "⊂", "description": "⊂" },
{ "label": "subseteq", "body": "⊆", "description": "⊆" },
{ "label": "sqsubseteq", "body": "⊑", "description": "⊑" },
{ "label": "sqsubseteq", "body": "⊑", "description": "⊑" },
{ "label": "notsubseteq", "body": "⊈", "description": "⊈" },
{ "label": "meet", "body": "⊓", "description": "⊓" },
{ "label": "join", "body": "⊔", "description": "⊔" },
{ "label": "top", "body": "⊤", "description": "⊤" },
{ "label": "bottom", "body": "⊥", "description": "⊥" },
{ "label": "vdash", "body": "⊢", "description": "⊢" },
{ "label": "|-", "body": "⊢", "description": "⊢" },
{ "label": "dashv", "body": "⊣", "description": "⊣" },
{ "label": "Vdash", "body": "⊨", "description": "⊨" },
{ "label": "infty", "body": "∞", "description": "∞" },
{ "label": "comp", "body": "∘", "description": "∘" },
{ "label": "prf", "body": "↾", "description": "↾" },
{ "label": "bind", "body": "≫=", "description": "≫=" },
{ "label": "mapsto", "body": "↦", "description": "↦" },
{ "label": "hookrightarrow", "body": "↪", "description": "↪" },
{ "label": "uparrow", "body": "↑", "description": "↑" },
// Iris specific
{ "label": "fun", "body": "λ", "description": "λ" },
{ "label": "mult", "body": "⋅", "description": "⋅" },
{ "label": "ent", "body": "⊢", "description": "⊢" },
{ "label": "valid", "body": "✓", "description": "✓" },
{ "label": "diamond", "body": "◇", "description": "◇" },
{ "label": "box", "body": "□", "description": "□" },
{ "label": "bbox", "body": "■", "description": "■" },
{ "label": "later", "body": "▷", "description": "▷" },
{ "label": "pred", "body": "φ", "description": "φ" },
{ "label": "and", "body": "∧", "description": "∧" },
{ "label": "or", "body": "∨", "description": "∨" },
{ "label": "comp", "body": "∘", "description": "∘" },
{ "label": "ccomp", "body": "◎", "description": "◎" },
{ "label": "all", "body": "∀", "description": "∀" },
{ "label": "ex", "body": "∃", "description": "∃" },
{ "label": "to", "body": "→", "description": "→" },
{ "label": "sep", "body": "∗", "description": "∗" },
{ "label": "star", "body": "∗", "description": "∗" },
{ "label": "lc", "body": "⌜", "description": "⌜" },
{ "label": "rc", "body": "⌝", "description": "⌝" },
{ "label": "Lc", "body": "⎡", "description": "⎡" },
{ "label": "Rc", "body": "⎤", "description": "⎤" },
{ "label": "empty", "body": "∅", "description": "∅" },
{ "label": "Lam", "body": "Λ", "description": "Λ" },
{ "label": "Sig", "body": "Σ", "description": "Σ" },
{ "label": "-", "body": "∖", "description": "∖" },
{ "label": "aa", "body": "●", "description": "●" },
{ "label": "af", "body": "◯", "description": "◯" },
{ "label": "auth", "body": "●", "description": "●" },
{ "label": "frag", "body": "◯", "description": "◯" },
{ "label": "iff", "body": "↔", "description": "↔" },
{ "label": "gname", "body": "γ", "description": "γ" },
{ "label": "incl", "body": "≼", "description": "≼" },
{ "label": "latert", "body": "▶", "description": "▶" },
{ "label": "update", "body": "⇝", "description": "⇝" },
{ "label": "bind", "body": "≫=", "description": "≫=" },
// accents (for iLöb)
{ "label": "\"o", "body": "ö", "description": "ö" },
// subscripts and superscripts
{ "label": "^^+", "body": "⁺", "description": "⁺" },
{ "label": "__+", "body": "₊", "description": "₊" },
{ "label": "^^-", "body": "⁻", "description": "⁻" },
{ "label": "__0", "body": "₀", "description": "₀" },
{ "label": "__1", "body": "₁", "description": "₁" },
{ "label": "__2", "body": "₂", "description": "₂" },
{ "label": "__3", "body": "₃", "description": "₃" },
{ "label": "__4", "body": "₄", "description": "₄" },
{ "label": "__5", "body": "₅", "description": "₅" },
{ "label": "__6", "body": "₆", "description": "₆" },
{ "label": "__7", "body": "₇", "description": "₇" },
{ "label": "__8", "body": "₈", "description": "₈" },
{ "label": "__9", "body": "₉", "description": "₉" },
{ "label": "__a", "body": "ₐ", "description": "ₐ" },
{ "label": "__e", "body": "ₑ", "description": "ₑ" },
{ "label": "__h", "body": "ₕ", "description": "ₕ" },
{ "label": "__i", "body": "ᵢ", "description": "ᵢ" },
{ "label": "__k", "body": "ₖ", "description": "ₖ" },
{ "label": "__l", "body": "ₗ", "description": "ₗ" },
{ "label": "__m", "body": "ₘ", "description": "ₘ" },
{ "label": "__n", "body": "ₙ", "description": "ₙ" },
{ "label": "__o", "body": "ₒ", "description": "ₒ" },
{ "label": "__p", "body": "ₚ", "description": "ₚ" },
{ "label": "__r", "body": "ᵣ", "description": "ᵣ" },
{ "label": "__s", "body": "ₛ", "description": "ₛ" },
{ "label": "__t", "body": "ₜ", "description": "ₜ" },
{ "label": "__u", "body": "ᵤ", "description": "ᵤ" },
{ "label": "__v", "body": "ᵥ", "description": "ᵥ" },
{ "label": "__x", "body": "ₓ", "description": "ₓ" },
// Greek alphabet
{ "label": "Alpha", "body": "Α", "description": "Α" },
{ "label": "alpha", "body": "α", "description": "α" },
{ "label": "Beta", "body": "Β", "description": "Β" },
{ "label": "beta", "body": "β", "description": "β" },
{ "label": "Gamma", "body": "Γ", "description": "Γ" },
{ "label": "gamma", "body": "γ", "description": "γ" },
{ "label": "Delta", "body": "Δ", "description": "Δ" },
{ "label": "delta", "body": "δ", "description": "δ" },
{ "label": "Epsilon", "body": "Ε", "description": "Ε" },
{ "label": "epsilon", "body": "ε", "description": "ε" },
{ "label": "Zeta", "body": "Ζ", "description": "Ζ" },
{ "label": "zeta", "body": "ζ", "description": "ζ" },
{ "label": "Eta", "body": "Η", "description": "Η" },
{ "label": "eta", "body": "η", "description": "η" },
{ "label": "Theta", "body": "Θ", "description": "Θ" },
{ "label": "theta", "body": "θ", "description": "θ" },
{ "label": "Iota", "body": "Ι", "description": "Ι" },
{ "label": "iota", "body": "ι", "description": "ι" },
{ "label": "Kappa", "body": "Κ", "description": "Κ" },
{ "label": "kappa", "body": "κ", "description": "κ" },
{ "label": "Lamda", "body": "Λ", "description": "Λ" },
{ "label": "lamda", "body": "λ", "description": "λ" },
{ "label": "Lambda", "body": "Λ", "description": "Λ" },
{ "label": "lambda", "body": "λ", "description": "λ" },
{ "label": "Mu", "body": "Μ", "description": "Μ" },
{ "label": "mu", "body": "μ", "description": "μ" },
{ "label": "Nu", "body": "Ν", "description": "Ν" },
{ "label": "nu", "body": "ν", "description": "ν" },
{ "label": "Xi", "body": "Ξ", "description": "Ξ" },
{ "label": "xi", "body": "ξ", "description": "ξ" },
{ "label": "Omicron", "body": "Ο", "description": "Ο" },
{ "label": "omicron", "body": "ο", "description": "ο" },
{ "label": "Pi", "body": "Π", "description": "Π" },
{ "label": "pi", "body": "π", "description": "π" },
{ "label": "Rho", "body": "Ρ", "description": "Ρ" },
{ "label": "rho", "body": "ρ", "description": "ρ" },
{ "label": "Sigma", "body": "Σ", "description": "Σ" },
{ "label": "sigma", "body": "σ", "description": "σ" },
{ "label": "Tau", "body": "Τ", "description": "Τ" },
{ "label": "tau", "body": "τ", "description": "τ" },
{ "label": "Upsilon", "body": "Υ", "description": "Υ" },
{ "label": "upsilon", "body": "υ", "description": "υ" },
{ "label": "Phi", "body": "Φ", "description": "Φ" },
{ "label": "phi", "body": "φ", "description": "φ" },
{ "label": "Chi", "body": "Χ", "description": "Χ" },
{ "label": "chi", "body": "χ", "description": "χ" },
{ "label": "Psi", "body": "Ψ", "description": "Ψ" },
{ "label": "psi", "body": "ψ", "description": "ψ" },
{ "label": "Omega", "body": "Ω", "description": "Ω" },
{ "label": "omega", "body": "ω", "description": "ω" }
]
}
]
(env
(_ ; Applies to all profiles (dev, release, ...).
(coq
(flags ; Configure the coqc flags.
(:standard ; Keeping the default flags.
; Add custom flags (to be kept in sync with _CoqProject).
-w -notation-overridden
-w -redundant-canonical-projection
-w -unknown-warning
-w -argument-scope-delimiter
-w -notation-incompatible-prefix
)))))
(lang dune 3.8)
(using coq 0.8)
......@@ -2,11 +2,16 @@
import sys, os, subprocess
import requests, argparse
from datetime import datetime, timezone
from collections import namedtuple
################################################################################
# This script lets you automatically trigger some operations on the Iris CI to
# do further test/analysis on a branch (usually an MR).
# Set the GITLAB_TOKEN environment variable to a GitLab access token.
# You can generate such a token at
# <https://gitlab.mpi-sws.org/-/user_settings/personal_access_tokens>.
# Select only the "api" scope.
#
# Set at least one of IRIS_REV or STDPP_REV to control which branches of these
# projects to build against (defaults to default git branch). IRIS_REPO and
# STDPP_REPO can be used to take branches from forks. Setting IRIS to
......@@ -26,11 +31,11 @@ PROJECTS = {
'lambda-rust': { 'name': 'lambda-rust', 'branch': 'master', 'timing': True },
'lambda-rust-weak': { 'name': 'lambda-rust', 'branch': 'masters/weak_mem' }, # covers GPFSL and ORC11
'examples': { 'name': 'examples', 'branch': 'master', 'timing': True },
'gpfsl': { 'name': 'gpfsl', 'branch': 'master', 'timing': True }, # need separate entry for timing
'iron': { 'name': 'iron', 'branch': 'master', 'timing': True },
'reloc': { 'name': 'reloc', 'branch': 'master' },
'spygame': { 'name': 'spygame', 'branch': 'master' },
'time-credits': { 'name': 'time-credits', 'branch': 'master' },
'actris': { 'name': 'actris', 'branch': 'master' },
'simuliris': { 'name': 'simuliris', 'branch': 'master' },
'tutorial-popl20': { 'name': 'tutorial-popl20', 'branch': 'master' },
'tutorial-popl21': { 'name': 'tutorial-popl21', 'branch': 'master' },
}
......@@ -83,6 +88,8 @@ def build(args):
var = project.upper()
vars[var+"_REPO"] = repo
vars[var+"_REV"] = rev
if args.coq:
vars["NIGHTLY_COQ"] = args.coq
# Loop over all projects, and trigger build.
for (name, project) in PROJECTS.items():
if args.filter in name:
......@@ -90,51 +97,77 @@ def build(args):
pipeline_url = trigger_build(project['name'], project['branch'], vars)['web_url']
print(" Pipeline running at {}".format(pipeline_url))
def time(args):
# Make sure only 'iris' variables are set.
for project in BUILD_BRANCHES.keys():
if project != 'iris':
print("'time' command only supports Iris branches")
sys.exit(1)
(iris_repo, iris_rev) = BUILD_BRANCHES['iris']
# Get project to test and ensure it supports timing
project_name = args.project
if project_name not in PROJECTS:
print("ERROR: no such project: {}".format(project_name))
sys.exit(1)
project = PROJECTS[project_name]
if not project.get('timing'):
print("ERROR: {} does not support timing".format(project_name))
sys.exit(1)
TimeJob = namedtuple("TimeJob", "id base_commit base_pipeline test_commit test_pipeline compare")
def time_project(project, iris_repo, iris_rev, test_rev):
# Obtain a unique ID for this experiment
id = datetime.now(timezone.utc).strftime("%Y%m%d-%H%M%S")
# Determine the branch commit to build
subprocess.run(["git", "fetch", "-q", "https://gitlab.mpi-sws.org/{}".format(iris_repo), iris_rev], check=True)
test_commit = subprocess.run(["git", "rev-parse", "FETCH_HEAD"], check=True, stdout=subprocess.PIPE).stdout.decode().strip()
# Determine the base commit in master
subprocess.run(["git", "fetch", "-q", "origin", "master"], check=True)
base_commit = subprocess.run(["git", "merge-base", test_commit, "origin/master"], check=True, stdout=subprocess.PIPE).stdout.decode().strip()
subprocess.run(["git", "fetch", "-q", "https://gitlab.mpi-sws.org/iris/iris.git", "master"], check=True)
base_commit = subprocess.run(["git", "merge-base", test_commit, "FETCH_HEAD"], check=True, stdout=subprocess.PIPE).stdout.decode().strip()
# Trigger the builds
print("Triggering timing builds for {} with Iris base commit {} and test commit {} using ad-hoc ID {}...".format(project_name, base_commit[:8], test_commit[:8], id))
vars = {
'IRIS_REPO': iris_repo,
'IRIS_REV': base_commit,
'TIMING_AD_HOC_ID': id+"-base",
}
base_pipeline = trigger_build(project['name'], project['branch'], vars)
print(" Base pipeline running at {}".format(base_pipeline['web_url']))
vars = {
'IRIS_REPO': iris_repo,
'IRIS_REV': test_commit,
'TIMING_AD_HOC_ID': id+"-test",
}
if args.test_rev is None:
if test_rev is None:
# We hope that this repository did not change since the job we created just above...
test_pipeline = trigger_build(project['name'], project['branch'], vars)
print(" Test pipeline running at {}".format(test_pipeline['web_url']))
else:
test_pipeline = trigger_build(project['name'], args.test_rev, vars)
print(" Test pipeline (on non-standard branch {}) running at {}".format(args.test_rev, test_pipeline['web_url']))
print(" Once done, timing comparison will be available at https://coq-speed.mpi-sws.org/d/1QE_dqjiz/coq-compare?orgId=1&var-project={}&var-branch1=@hoc&var-commit1={}&var-config1={}&var-branch2=@hoc&var-commit2={}&var-config2={}".format(project['name'], base_pipeline['sha'], id+"-base", test_pipeline['sha'], id+"-test"))
compare = "https://coq-speed.mpi-sws.org/d/1QE_dqjiz/coq-compare?orgId=1&var-project={}&var-branch1=@hoc&var-commit1={}&var-config1={}&var-branch2=@hoc&var-commit2={}&var-config2={}".format(project['name'], base_pipeline['sha'], id+"-base", test_pipeline['sha'], id+"-test")
return TimeJob(id, base_commit, base_pipeline['web_url'], test_commit, test_pipeline['web_url'], compare)
def time(args):
# Make sure only 'iris' variables are set.
# One could imagine generalizing to "either Iris or std++", but then if the
# ad-hoc timing jobs honor STDPP_REV, how do we make it so that particular
# deterministic std++ versions are used for Iris timing? This does not
# currently seem worth the effort / hacks.
for project in BUILD_BRANCHES.keys():
if project != 'iris':
print("'time' command only supports Iris branches")
sys.exit(1)
(iris_repo, iris_rev) = BUILD_BRANCHES['iris']
# Special mode: time everything
if args.project == 'all':
if args.test_rev is not None:
print("'time all' does not support '--test-rev'")
sys.exit(1)
for (name, project) in PROJECTS.items():
if not project.get('timing'):
continue
job = time_project(project, iris_repo, iris_rev, None)
print("- [{}]({})".format(name, job.compare))
return
# Get project to test and ensure it supports timing
project_name = args.project
if project_name not in PROJECTS:
print("ERROR: no such project: {}".format(project_name))
sys.exit(1)
project = PROJECTS[project_name]
if not project.get('timing'):
print("ERROR: {} does not support timing".format(project_name))
sys.exit(1)
# Run it!
job = time_project(project, iris_repo, iris_rev, args.test_rev)
print("Triggering timing builds for {} with Iris base commit {} and test commit {} using ad-hoc ID {}...".format(project_name, job.base_commit[:8], job.test_commit[:8], job.id))
print(" Base pipeline running at {}".format(job.base_pipeline))
if args.test_rev is None:
print(" Test pipeline running at {}".format(job.test_pipeline))
else:
print(" Test pipeline (on non-standard branch {}) running at {}".format(args.test_rev, job.test_pipeline))
print(" Once done, timing comparison will be available at {}".format(job.compare))
# Dispatch
if __name__ == "__main__":
......@@ -143,10 +176,11 @@ if __name__ == "__main__":
parser_build = subparsers.add_parser('build', help='Build many reverse dependencies against an Iris branch')
parser_build.set_defaults(func=build)
parser_build.add_argument('--coq', help='the (opam) Coq version to use for these tests')
parser_build.add_argument('filter', nargs='?', default='', help='(optional) restrict build to projects matching the filter')
parser_time = subparsers.add_parser('time', help='Time one reverse dependency against an Iris branch')
parser_time.add_argument("project", help="the project to measure the time of")
parser_time.add_argument("project", help="the project to measure the time of, or 'all' to measure all of them")
parser_time.add_argument("--test-rev", help="use different revision on project for the test build (in case the project requires changes to still build)")
parser_time.set_defaults(func=time)
......
......@@ -72,7 +72,7 @@ Proof.
destruct (H2 b) as (c&?&?); eauto. by exists c; split; last etrans.
* intros a ?. destruct (H2' a) as (b&?&?); auto.
destruct (H1' b) as (c&?&?); eauto. by exists c; split; last etrans.
- intros n x y [??]; split; naive_solver eauto using dist_S.
- intros n m x y [??] ?; split; naive_solver eauto using dist_le with si_solver.
Qed.
Canonical Structure agreeO := Ofe (agree A) agree_ofe_mixin.
......@@ -128,6 +128,11 @@ Proof.
split; [|by intros ?; exists y].
by intros [z Hz]; rewrite Hz assoc agree_idemp.
Qed.
Lemma agree_includedN n x y : x ≼{n} y ↔ y ≡{n} x ⋅ y.
Proof.
split; [|by intros ?; exists y].
by intros [z Hz]; rewrite Hz assoc agree_idemp.
Qed.
Lemma agree_op_invN n x1 x2 : ✓{n} (x1 ⋅ x2) → x1 ≡{n} x2.
Proof.
......@@ -139,7 +144,7 @@ Qed.
Definition agree_cmra_mixin : CmraMixin (agree A).
Proof.
apply cmra_total_mixin; try apply _ || by eauto.
- intros n x; rewrite !agree_validN_def; eauto using dist_S.
- intros n x; rewrite !agree_validN_def; eauto using dist_le.
- intros x. apply agree_idemp.
- intros n x y; rewrite !agree_validN_def /=.
setoid_rewrite elem_of_app; naive_solver.
......@@ -178,6 +183,11 @@ Proof.
exists a. by rewrite elem_of_list_singleton -discrete_iff_0.
Qed.
Lemma agree_op_inv x y : ✓ (x ⋅ y) → x ≡ y.
Proof.
intros ?. apply equiv_dist=> n. by apply agree_op_invN, cmra_valid_validN.
Qed.
Global Instance to_agree_injN n : Inj (dist n) (dist n) (to_agree).
Proof.
move=> a b [_] /=. setoid_rewrite elem_of_list_singleton. naive_solver.
......@@ -191,7 +201,6 @@ Proof.
destruct (elem_of_agree x) as [a ?].
exists a; split=> b /=; setoid_rewrite elem_of_list_singleton; naive_solver.
Qed.
Lemma to_agree_uninj x : ✓ x → ∃ a, to_agree a ≡ x.
Proof.
rewrite /valid /agree_valid_instance; setoid_rewrite agree_validN_def.
......@@ -204,19 +213,24 @@ Proof.
move=> Hval [z Hy]; move: Hval; rewrite Hy.
by move=> /agree_op_invN->; rewrite agree_idemp.
Qed.
Lemma agree_valid_included x y : ✓ y → x ≼ y → x ≡ y.
Proof.
move=> Hval [z Hy]; move: Hval; rewrite Hy.
by move=> /agree_op_inv->; rewrite agree_idemp.
Qed.
Lemma to_agree_includedN n a b : to_agree a ≼{n} to_agree b ↔ a ≡{n} b.
Proof.
split; last by intros ->. intros [x [_ Hincl]].
by destruct (Hincl a) as (? & ->%elem_of_list_singleton & ?); first set_solver+.
split; last by intros ->.
intros. by apply (inj to_agree), agree_valid_includedN.
Qed.
Lemma to_agree_included a b : to_agree a ≼ to_agree b ↔ a ≡ b.
Proof.
split; last by intros ->.
intros (x & Heq). apply equiv_dist=>n. destruct (Heq n) as [_ Hincl].
by destruct (Hincl a) as (? & ->%elem_of_list_singleton & ?); first set_solver+.
intros. by apply (inj to_agree), agree_valid_included.
Qed.
Lemma to_agree_included_L `{!LeibnizEquiv A} a b : to_agree a ≼ to_agree b ↔ a = b.
Proof. unfold_leibniz. apply to_agree_included. Qed.
Global Instance agree_cancelable x : Cancelable x.
Proof.
......@@ -232,10 +246,6 @@ Proof.
by rewrite -EQy -EQz -Hx'y' -Hx'z'.
Qed.
Lemma agree_op_inv x y : ✓ (x ⋅ y) → x ≡ y.
Proof.
intros ?. apply equiv_dist=>n. by apply agree_op_invN, cmra_valid_validN.
Qed.
Lemma to_agree_op_invN a b n : ✓{n} (to_agree a ⋅ to_agree b) → a ≡{n} b.
Proof. by intros ?%agree_op_invN%(inj to_agree). Qed.
Lemma to_agree_op_inv a b : ✓ (to_agree a ⋅ to_agree b) → a ≡ b.
......
......@@ -5,7 +5,7 @@ From iris.prelude Require Import options.
(** The authoritative camera with fractional authoritative elements *)
(** The authoritative camera has 2 types of elements: the authoritative element
[●{dq} a] and the fragment [◯ b] (of which there can be several). To enable
sharing of the authoritative element [●{dq} a], it is equiped with a
sharing of the authoritative element [●{dq} a], it is equipped with a
discardable fraction [dq]. Updates are only possible with the full
authoritative element [● a] (syntax for [●{#1} a]]), while fractional
authoritative elements have agreement, i.e., [✓ (●{dq1} a1 ⋅ ●{dq2} a2) → a1 ≡
......@@ -65,17 +65,13 @@ Definition authUR (A : ucmra) : ucmra := viewUR (A:=A) (B:=A) auth_view_rel.
Definition auth_auth {A: ucmra} : dfrac → A → auth A := view_auth.
Definition auth_frag {A: ucmra} : A → auth A := view_frag.
Typeclasses Opaque auth_auth auth_frag.
Global Typeclasses Opaque auth_auth auth_frag.
Global Instance: Params (@auth_auth) 2 := {}.
Global Instance: Params (@auth_frag) 1 := {}.
(** FIXME: Refactor these notations using custom entries once Coq bug #13654
has been fixed. *)
Notation "●{ dq } a" := (auth_auth dq a) (at level 20, format "●{ dq } a").
Notation "●{# q } a" := (auth_auth (DfracOwn q) a) (at level 20, format "●{# q } a").
Notation "●□ a" := (auth_auth (DfracDiscarded) a) (at level 20).
Notation "● a" := (auth_auth (DfracOwn 1) a) (at level 20).
Notation "● dq a" := (auth_auth dq a)
(at level 20, dq custom dfrac at level 1, format "● dq a").
Notation "◯ a" := (auth_frag a) (at level 20).
(** * Laws of the authoritative camera *)
......@@ -120,8 +116,6 @@ Section auth.
(** Operation *)
Lemma auth_auth_dfrac_op dq1 dq2 a : ●{dq1 ⋅ dq2} a ≡ ●{dq1} a ⋅ ●{dq2} a.
Proof. apply view_auth_dfrac_op. Qed.
Lemma auth_auth_frac_op q1 q2 a : ●{#(q1 + q2)} a ≡ ●{#q1} a ⋅ ●{#q2} a.
Proof. apply view_auth_frac_op. Qed.
Global Instance auth_auth_dfrac_is_op dq dq1 dq2 a :
IsOp dq dq1 dq2 → IsOp' (●{dq} a) (●{dq1} a) (●{dq2} a).
Proof. rewrite /auth_auth. apply _. Qed.
......@@ -132,10 +126,19 @@ Section auth.
Proof. apply view_frag_mono. Qed.
Lemma auth_frag_core a : core (◯ a) = ◯ (core a).
Proof. apply view_frag_core. Qed.
Lemma auth_both_core_discarded a b :
core (●□ a ⋅ ◯ b) ≡ ●□ a ⋅ ◯ (core b).
Proof. apply view_both_core_discarded. Qed.
Lemma auth_both_core_frac q a b :
core (●{#q} a ⋅ ◯ b) ≡ ◯ (core b).
Proof. apply view_both_core_frac. Qed.
Global Instance auth_auth_core_id a : CoreId (●□ a).
Proof. rewrite /auth_auth. apply _. Qed.
Global Instance auth_frag_core_id a : CoreId a → CoreId (◯ a).
Proof. rewrite /auth_frag. apply _. Qed.
Global Instance auth_both_core_id a1 a2 : CoreId a2 → CoreId (●□ a1 ⋅ ◯ a2).
Proof. rewrite /auth_auth /auth_frag. apply _. Qed.
Global Instance auth_frag_is_op a b1 b2 :
IsOp a b1 b2 → IsOp' (◯ a) (◯ b1) (◯ b2).
Proof. rewrite /auth_frag. apply _. Qed.
......@@ -329,6 +332,10 @@ Section auth.
Qed.
Lemma auth_update_auth_persist dq a : ●{dq} a ~~> ●□ a.
Proof. apply view_update_auth_persist. Qed.
Lemma auth_updateP_auth_unpersist a : ●□ a ~~>: λ k, ∃ q, k = ●{#q} a.
Proof. apply view_updateP_auth_unpersist. Qed.
Lemma auth_updateP_both_unpersist a b : ●□ a ⋅ ◯ b ~~>: λ k, ∃ q, k = ●{#q} a ⋅ ◯ b.
Proof. apply view_updateP_both_unpersist. Qed.
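(* A usage sketch, not part of the diff: the new unpersist lemma composes with
   [auth_update_auth_persist] above, so a fully-owned authoritative element can
   be persisted and later traded back for some fraction. The derivation
   assumes [cmra_updateP_compose_l] from [iris.algebra.updates]; the name
   [auth_persist_unpersist] is made up for illustration. *)
Example auth_persist_unpersist (a : A) : ● a ~~>: λ k, ∃ q, k = ●{#q} a.
Proof.
  eapply cmra_updateP_compose_l.
  - apply auth_update_auth_persist.
  - apply auth_updateP_auth_unpersist.
Qed.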
Lemma auth_update_dfrac_alloc dq a b `{!CoreId b} :
b ≼ a → ●{dq} a ~~> ●{dq} a ⋅ ◯ b.
......
......@@ -20,14 +20,14 @@ Since these big operators are like quantifiers, they have the same precedence as
[∀] and [∃]. *)
(** * Big ops over lists *)
Fixpoint big_opL `{Monoid M o} {A} (f : nat → A → M) (xs : list A) : M :=
Fixpoint big_opL {M : ofe} {o : M → M → M} `{!Monoid o} {A} (f : nat → A → M) (xs : list A) : M :=
match xs with
| [] => monoid_unit
| x :: xs => o (f 0 x) (big_opL (λ n, f (S n)) xs)
end.
Global Instance: Params (@big_opL) 4 := {}.
Global Arguments big_opL {M} o {_ A} _ !_ /.
Typeclasses Opaque big_opL.
Global Typeclasses Opaque big_opL.
Notation "'[^' o 'list]' k ↦ x ∈ l , P" := (big_opL o (λ k x, P) l)
(at level 200, o at level 1, l at level 10, k, x at level 1, right associativity,
format "[^ o list] k ↦ x ∈ l , P") : stdpp_scope.
......@@ -35,12 +35,13 @@ Notation "'[^' o 'list]' x ∈ l , P" := (big_opL o (λ _ x, P) l)
(at level 200, o at level 1, l at level 10, x at level 1, right associativity,
format "[^ o list] x ∈ l , P") : stdpp_scope.
Definition big_opM_def `{Monoid M o} `{Countable K} {A} (f : K → A → M)
Local Definition big_opM_def {M : ofe} {o : M → M → M} `{!Monoid o} `{Countable K} {A} (f : K → A → M)
(m : gmap K A) : M := big_opL o (λ _, uncurry f) (map_to_list m).
Definition big_opM_aux : seal (@big_opM_def). Proof. by eexists. Qed.
Local Definition big_opM_aux : seal (@big_opM_def). Proof. by eexists. Qed.
Definition big_opM := big_opM_aux.(unseal).
Global Arguments big_opM {M} o {_ K _ _ A} _ _.
Definition big_opM_eq : @big_opM = @big_opM_def := big_opM_aux.(seal_eq).
Local Definition big_opM_unseal :
@big_opM = @big_opM_def := big_opM_aux.(seal_eq).
Global Instance: Params (@big_opM) 7 := {}.
Notation "'[^' o 'map]' k ↦ x ∈ m , P" := (big_opM o (λ k x, P) m)
(at level 200, o at level 1, m at level 10, k, x at level 1, right associativity,
......@@ -49,23 +50,25 @@ Notation "'[^' o 'map]' x ∈ m , P" := (big_opM o (λ _ x, P) m)
(at level 200, o at level 1, m at level 10, x at level 1, right associativity,
format "[^ o map] x ∈ m , P") : stdpp_scope.
Definition big_opS_def `{Monoid M o} `{Countable A} (f : A → M)
Local Definition big_opS_def {M : ofe} {o : M → M → M} `{!Monoid o} `{Countable A} (f : A → M)
(X : gset A) : M := big_opL o (λ _, f) (elements X).
Definition big_opS_aux : seal (@big_opS_def). Proof. by eexists. Qed.
Local Definition big_opS_aux : seal (@big_opS_def). Proof. by eexists. Qed.
Definition big_opS := big_opS_aux.(unseal).
Global Arguments big_opS {M} o {_ A _ _} _ _.
Definition big_opS_eq : @big_opS = @big_opS_def := big_opS_aux.(seal_eq).
Local Definition big_opS_unseal :
@big_opS = @big_opS_def := big_opS_aux.(seal_eq).
Global Instance: Params (@big_opS) 6 := {}.
Notation "'[^' o 'set]' x ∈ X , P" := (big_opS o (λ x, P) X)
(at level 200, o at level 1, X at level 10, x at level 1, right associativity,
format "[^ o set] x ∈ X , P") : stdpp_scope.
Definition big_opMS_def `{Monoid M o} `{Countable A} (f : A → M)
Local Definition big_opMS_def {M : ofe} {o : M → M → M} `{!Monoid o} `{Countable A} (f : A → M)
(X : gmultiset A) : M := big_opL o (λ _, f) (elements X).
Definition big_opMS_aux : seal (@big_opMS_def). Proof. by eexists. Qed.
Local Definition big_opMS_aux : seal (@big_opMS_def). Proof. by eexists. Qed.
Definition big_opMS := big_opMS_aux.(unseal).
Global Arguments big_opMS {M} o {_ A _ _} _ _.
Definition big_opMS_eq : @big_opMS = @big_opMS_def := big_opMS_aux.(seal_eq).
Local Definition big_opMS_unseal :
@big_opMS = @big_opMS_def := big_opMS_aux.(seal_eq).
Global Instance: Params (@big_opMS) 6 := {}.
Notation "'[^' o 'mset]' x ∈ X , P" := (big_opMS o (λ x, P) X)
(at level 200, o at level 1, X at level 10, x at level 1, right associativity,
......@@ -73,7 +76,7 @@ Notation "'[^' o 'mset]' x ∈ X , P" := (big_opMS o (λ x, P) X)
(** * Properties about big ops *)
Section big_op.
Context `{Monoid M o}.
Context {M : ofe} {o : M → M → M} `{!Monoid o}.
Implicit Types xs : list M.
Infix "`o`" := o (at level 50, left associativity).
......@@ -104,6 +107,16 @@ Section list.
Lemma big_opL_unit l : ([^o list] k↦y ∈ l, monoid_unit) ≡ (monoid_unit : M).
Proof. induction l; rewrite /= ?left_id //. Qed.
Lemma big_opL_take_drop Φ l n :
([^o list] k ↦ x ∈ l, Φ k x) ≡
([^o list] k ↦ x ∈ take n l, Φ k x) `o` ([^o list] k ↦ x ∈ drop n l, Φ (n + k) x).
Proof.
rewrite -{1}(take_drop n l) big_opL_app length_take.
destruct (decide (length l ≤ n)).
- rewrite drop_ge //=.
- rewrite Nat.min_l //=; lia.
Qed.
Lemma big_opL_gen_proper_2 {B} (R : relation M) f (g : nat → B → M)
l1 (l2 : list B) :
R monoid_unit monoid_unit →
......@@ -210,6 +223,18 @@ Section list.
revert f g; induction l as [|x l IH]=> f g /=; first by rewrite left_id.
by rewrite IH -!assoc (assoc _ (g _ _)) [(g _ _ `o` _)]comm -!assoc.
Qed.
(** Shows that some property [P] is closed under [big_opL]. Examples of [P]
are [Persistent], [Affine], [Timeless]. *)
Lemma big_opL_closed (P : M → Prop) f l :
P monoid_unit →
(∀ x y, P x → P y → P (x `o` y)) →
(∀ k x, l !! k = Some x → P (f k x)) →
P ([^o list] k↦x ∈ l, f k x).
Proof.
intros Hunit Hop. revert f. induction l as [|x l IH]=> f Hf /=; [done|].
apply Hop; first by auto. apply IH=> k. apply (Hf (S k)).
Qed.
End list.
Lemma big_opL_bind {A B} (h : A → list B) (f : B → M) l :
......@@ -242,12 +267,12 @@ Proof. by apply big_opL_sep_zip_with. Qed.
Lemma big_opM_empty `{Countable K} {B} (f : K → B → M) :
([^o map] k↦x ∈ ∅, f k x) = monoid_unit.
Proof. by rewrite big_opM_eq /big_opM_def map_to_list_empty. Qed.
Proof. by rewrite big_opM_unseal /big_opM_def map_to_list_empty. Qed.
Lemma big_opM_insert `{Countable K} {B} (f : K B M) (m : gmap K B) i x :
m !! i = None
([^o map] ky <[i:=x]> m, f k y) f i x `o` [^o map] ky m, f k y.
Proof. intros ?. by rewrite big_opM_eq /big_opM_def map_to_list_insert. Qed.
Proof. intros ?. by rewrite big_opM_unseal /big_opM_def map_to_list_insert. Qed.
Lemma big_opM_delete `{Countable K} {B} (f : K → B → M) (m : gmap K B) i x :
m !! i = Some x →
......@@ -294,7 +319,7 @@ Section gmap.
(∀ k x, m !! k = Some x → R (f k x) (g k x)) →
R ([^o map] k ↦ x ∈ m, f k x) ([^o map] k ↦ x ∈ m, g k x).
Proof.
intros ?? Hf. rewrite big_opM_eq. apply (big_opL_gen_proper R); auto.
intros ?? Hf. rewrite big_opM_unseal. apply (big_opL_gen_proper R); auto.
intros k [i x] ?%elem_of_list_lookup_2. by apply Hf, elem_of_map_to_list.
Qed.
......@@ -343,7 +368,14 @@ Section gmap.
[setoid_rewrite] in the proof of [big_sepS_sepS]. See Coq issue #14349. *)
Lemma big_opM_map_to_list f m :
([^o map] k↦x ∈ m, f k x) ≡ [^o list] xk ∈ map_to_list m, f (xk.1) (xk.2).
Proof. rewrite big_opM_eq. apply big_opL_proper'; [|done]. by intros ? [??]. Qed.
Proof. rewrite big_opM_unseal. apply big_opL_proper'; [|done]. by intros ? [??]. Qed.
Lemma big_opM_list_to_map f l :
NoDup l.*1 →
([^o map] k↦x ∈ list_to_map l, f k x) ≡ [^o list] xk ∈ l, f (xk.1) (xk.2).
Proof.
intros. rewrite big_opM_map_to_list.
by apply big_opL_permutation, map_to_list_to_map.
Qed.
Lemma big_opM_singleton f i x : ([^o map] k↦y ∈ {[i:=x]}, f k y) ≡ f i x.
Proof.
......@@ -353,13 +385,13 @@ Section gmap.
Lemma big_opM_unit m : ([^o map] k↦y ∈ m, monoid_unit) ≡ (monoid_unit : M).
Proof.
by induction m using map_ind; rewrite /= ?big_opM_insert ?left_id // big_opM_eq.
by induction m using map_ind; rewrite /= ?big_opM_insert ?left_id // big_opM_unseal.
Qed.
Lemma big_opM_fmap {B} (h : A → B) (f : K → B → M) m :
([^o map] k↦y ∈ h <$> m, f k y) ≡ ([^o map] k↦y ∈ m, f k (h y)).
Proof.
rewrite big_opM_eq /big_opM_def map_to_list_fmap big_opL_fmap.
rewrite big_opM_unseal /big_opM_def map_to_list_fmap big_opL_fmap.
by apply big_opL_proper=> ? [??].
Qed.
......@@ -411,8 +443,8 @@ Section gmap.
induction m as [|k v m ? IH] using map_ind.
{ by rewrite map_filter_empty !big_opM_empty. }
destruct (decide (φ (k, v))).
- rewrite map_filter_insert //.
assert (filter φ m !! k = None) by (apply map_filter_lookup_None; eauto).
- rewrite map_filter_insert_True //.
assert (filter φ m !! k = None) by (apply map_lookup_filter_None; eauto).
by rewrite !big_opM_insert // decide_True // IH.
- rewrite map_filter_insert_not' //; last by congruence.
rewrite !big_opM_insert // decide_False // IH. by rewrite left_id.
......@@ -435,7 +467,23 @@ Section gmap.
([^o map] k↦x ∈ m, f k x `o` g k x)
≡ ([^o map] k↦x ∈ m, f k x) `o` ([^o map] k↦x ∈ m, g k x).
Proof.
rewrite big_opM_eq /big_opM_def -big_opL_op. by apply big_opL_proper=> ? [??].
rewrite big_opM_unseal /big_opM_def -big_opL_op. by apply big_opL_proper=> ? [??].
Qed.
(** Shows that some property [P] is closed under [big_opM]. Examples of [P]
are [Persistent], [Affine], [Timeless]. *)
Lemma big_opM_closed (P : M → Prop) f m :
Proper ((≡) ==> iff) P →
P monoid_unit →
(∀ x y, P x → P y → P (x `o` y)) →
(∀ k x, m !! k = Some x → P (f k x)) →
P ([^o map] k↦x ∈ m, f k x).
Proof.
intros ?? Hop Hf. induction m as [|k x ?? IH] using map_ind.
{ by rewrite big_opM_empty. }
rewrite big_opM_insert //. apply Hop.
{ apply Hf. by rewrite lookup_insert. }
apply IH=> k' x' ?. apply Hf. rewrite lookup_insert_ne; naive_solver.
Qed.
End gmap.
......@@ -473,7 +521,7 @@ Section gset.
(∀ x, x ∈ X → R (f x) (g x)) →
R ([^o set] x ∈ X, f x) ([^o set] x ∈ X, g x).
Proof.
rewrite big_opS_eq. intros ?? Hf. apply (big_opL_gen_proper R); auto.
rewrite big_opS_unseal. intros ?? Hf. apply (big_opL_gen_proper R); auto.
intros k x ?%elem_of_list_lookup_2. by apply Hf, elem_of_elements.
Qed.
......@@ -504,10 +552,10 @@ Section gset.
[setoid_rewrite] in the proof of [big_sepS_sepS]. See Coq issue #14349. *)
Lemma big_opS_elements f X :
([^o set] x ∈ X, f x) ≡ [^o list] x ∈ elements X, f x.
Proof. by rewrite big_opS_eq. Qed.
Proof. by rewrite big_opS_unseal. Qed.
Lemma big_opS_empty f : ([^o set] x ∈ ∅, f x) = monoid_unit.
Proof. by rewrite big_opS_eq /big_opS_def elements_empty. Qed.
Proof. by rewrite big_opS_unseal /big_opS_def elements_empty. Qed.
Lemma big_opS_insert f X x :
x ∉ X → ([^o set] y ∈ {[ x ]} ∪ X, f y) ≡ (f x `o` [^o set] y ∈ X, f y).
......@@ -547,7 +595,7 @@ Section gset.
Lemma big_opS_unit X : ([^o set] y ∈ X, monoid_unit) ≡ (monoid_unit : M).
Proof.
by induction X using set_ind_L; rewrite /= ?big_opS_insert ?left_id // big_opS_eq.
by induction X using set_ind_L; rewrite /= ?big_opS_insert ?left_id // big_opS_unseal.
Qed.
Lemma big_opS_filter' (φ : A → Prop) `{∀ x, Decision (φ x)} f X :
......@@ -577,6 +625,22 @@ Section gset.
- rewrite /= big_opS_union; last set_solver.
by rewrite big_opS_singleton IHl.
Qed.
(** Shows that some property [P] is closed under [big_opS]. Examples of [P]
are [Persistent], [Affine], [Timeless]. *)
Lemma big_opS_closed (P : M → Prop) f X :
Proper ((≡) ==> iff) P →
P monoid_unit →
(∀ x y, P x → P y → P (x `o` y)) →
(∀ x, x ∈ X → P (f x)) →
P ([^o set] x ∈ X, f x).
Proof.
intros ?? Hop Hf. induction X as [|x X ? IH] using set_ind_L.
{ by rewrite big_opS_empty. }
rewrite big_opS_insert //. apply Hop.
{ apply Hf. set_solver. }
apply IH=> x' ?. apply Hf. set_solver.
Qed.
End gset.
Lemma big_opS_set_map `{Countable A, Countable B} (h : A → B) (X : gset A) (f : B → M) :
......@@ -592,12 +656,18 @@ Proof.
Qed.
Lemma big_opM_dom `{Countable K} {A} (f : K → M) (m : gmap K A) :
([^o map] k↦_ ∈ m, f k) ≡ ([^o set] k ∈ dom _ m, f k).
([^o map] k↦_ ∈ m, f k) ≡ ([^o set] k ∈ dom m, f k).
Proof.
induction m as [|i x ?? IH] using map_ind.
{ by rewrite big_opM_eq big_opS_eq dom_empty_L. }
{ by rewrite big_opM_unseal big_opS_unseal dom_empty_L. }
by rewrite dom_insert_L big_opM_insert // IH big_opS_insert ?not_elem_of_dom.
Qed.
Lemma big_opM_gset_to_gmap `{Countable K} {A} (f : K → A → M) (X : gset K) c :
([^o map] k↦a ∈ gset_to_gmap c X, f k a) ≡ ([^o set] k ∈ X, f k c).
Proof.
rewrite -{2}(dom_gset_to_gmap X c) -big_opM_dom.
apply big_opM_proper. by intros k ? [_ ->]%lookup_gset_to_gmap_Some.
Qed.
(** ** Big ops over finite msets *)
Section gmultiset.
......@@ -610,7 +680,7 @@ Section gmultiset.
(∀ x, x ∈ X → R (f x) (g x)) →
R ([^o mset] x ∈ X, f x) ([^o mset] x ∈ X, g x).
Proof.
rewrite big_opMS_eq. intros ?? Hf. apply (big_opL_gen_proper R); auto.
rewrite big_opMS_unseal. intros ?? Hf. apply (big_opL_gen_proper R); auto.
intros k x ?%elem_of_list_lookup_2. by apply Hf, gmultiset_elem_of_elements.
Qed.
......@@ -641,18 +711,18 @@ Section gmultiset.
[setoid_rewrite] in the proof of [big_sepS_sepS]. See Coq issue #14349. *)
Lemma big_opMS_elements f X :
([^o mset] x ∈ X, f x) ≡ [^o list] x ∈ elements X, f x.
Proof. by rewrite big_opMS_eq. Qed.
Proof. by rewrite big_opMS_unseal. Qed.
Lemma big_opMS_empty f : ([^o mset] x ∈ ∅, f x) = monoid_unit.
Proof. by rewrite big_opMS_eq /big_opMS_def gmultiset_elements_empty. Qed.
Proof. by rewrite big_opMS_unseal /big_opMS_def gmultiset_elements_empty. Qed.
Lemma big_opMS_disj_union f X Y :
([^o mset] y ∈ X ⊎ Y, f y) ≡ ([^o mset] y ∈ X, f y) `o` [^o mset] y ∈ Y, f y.
Proof. by rewrite big_opMS_eq /big_opMS_def gmultiset_elements_disj_union big_opL_app. Qed.
Proof. by rewrite big_opMS_unseal /big_opMS_def gmultiset_elements_disj_union big_opL_app. Qed.
Lemma big_opMS_singleton f x : ([^o mset] y ∈ {[+ x +]}, f y) ≡ f x.
Proof.
intros. by rewrite big_opMS_eq /big_opMS_def gmultiset_elements_singleton /= right_id.
intros. by rewrite big_opMS_unseal /big_opMS_def gmultiset_elements_singleton /= right_id.
Qed.
Lemma big_opMS_insert f X x :
......@@ -669,12 +739,28 @@ Section gmultiset.
Lemma big_opMS_unit X : ([^o mset] y ∈ X, monoid_unit) ≡ (monoid_unit : M).
Proof.
by induction X using gmultiset_ind;
rewrite /= ?big_opMS_disj_union ?big_opMS_singleton ?left_id // big_opMS_eq.
rewrite /= ?big_opMS_disj_union ?big_opMS_singleton ?left_id // big_opMS_unseal.
Qed.
Lemma big_opMS_op f g X :
([^o mset] y ∈ X, f y `o` g y) ≡ ([^o mset] y ∈ X, f y) `o` ([^o mset] y ∈ X, g y).
Proof. by rewrite big_opMS_eq /big_opMS_def -big_opL_op. Qed.
Proof. by rewrite big_opMS_unseal /big_opMS_def -big_opL_op. Qed.
(** Shows that some property [P] is closed under [big_opMS]. Examples of [P]
are [Persistent], [Affine], [Timeless]. *)
Lemma big_opMS_closed (P : M → Prop) f X :
Proper ((≡) ==> iff) P →
P monoid_unit →
(∀ x y, P x → P y → P (x `o` y)) →
(∀ x, x ∈ X → P (f x)) →
P ([^o mset] x ∈ X, f x).
Proof.
intros ?? Hop Hf. induction X as [|x X IH] using gmultiset_ind.
{ by rewrite big_opMS_empty. }
rewrite big_opMS_insert //. apply Hop.
{ apply Hf. set_solver. }
apply IH=> x' ?. apply Hf. set_solver.
Qed.
End gmultiset.
(** Commuting lemmas *)
......@@ -773,7 +859,7 @@ Proof. repeat setoid_rewrite big_opMS_elements. by rewrite big_opL_opL. Qed.
End big_op.
Section homomorphisms.
Context `{Monoid M1 o1, Monoid M2 o2}.
Context {M1 M2 : ofe} {o1 : M1 → M1 → M1} {o2 : M2 → M2 → M2} `{!Monoid o1, !Monoid o2}.
Infix "`o1`" := o1 (at level 50, left associativity).
Infix "`o2`" := o2 (at level 50, left associativity).
(** The ssreflect rewrite tactic only works for relations that have a
......
This diff is collapsed.
......@@ -13,7 +13,8 @@ Qed.
Lemma big_opM_None {M : cmra} `{Countable K} {A} (f : K → A → option M) m :
([^op map] k↦x ∈ m, f k x) = None ↔ ∀ k x, m !! k = Some x → f k x = None.
Proof.
induction m as [|i x m ? IH] using map_ind=> /=; first by rewrite big_opM_eq.
induction m as [|i x m ? IH] using map_ind=> /=.
{ by rewrite big_opM_empty. }
rewrite -None_equiv_eq big_opM_insert // None_equiv_eq op_None IH. split.
{ intros [??] k y. rewrite lookup_insert_Some; naive_solver. }
intros Hm; split.
......@@ -23,7 +24,8 @@ Qed.
Lemma big_opS_None {M : cmra} `{Countable A} (f : A → option M) X :
([^op set] x ∈ X, f x) = None ↔ ∀ x, x ∈ X → f x = None.
Proof.
induction X as [|x X ? IH] using set_ind_L; [by rewrite big_opS_eq |].
induction X as [|x X ? IH] using set_ind_L.
{ by rewrite big_opS_empty. }
rewrite -None_equiv_eq big_opS_insert // None_equiv_eq op_None IH. set_solver.
Qed.
Lemma big_opMS_None {M : cmra} `{Countable A} (f : A → option M) X :
......
......@@ -60,22 +60,22 @@ Section coPset.
Qed.
End coPset.
(* The disjoiny union CMRA *)
(* The disjoint union CMRA *)
Inductive coPset_disj :=
| CoPset : coPset → coPset_disj
| CoPsetBot : coPset_disj.
| CoPsetInvalid : coPset_disj.
Section coPset_disj.
Local Arguments op _ _ !_ !_ /.
Canonical Structure coPset_disjO := leibnizO coPset_disj.
Local Instance coPset_disj_valid_instance : Valid coPset_disj := λ X,
match X with CoPset _ => True | CoPsetBot => False end.
match X with CoPset _ => True | CoPsetInvalid => False end.
Local Instance coPset_disj_unit_instance : Unit coPset_disj := CoPset ∅.
Local Instance coPset_disj_op_instance : Op coPset_disj := λ X Y,
match X, Y with
| CoPset X, CoPset Y => if decide (X ## Y) then CoPset (X ∪ Y) else CoPsetBot
| _, _ => CoPsetBot
| CoPset X, CoPset Y => if decide (X ## Y) then CoPset (X ∪ Y) else CoPsetInvalid
| _, _ => CoPsetInvalid
end.
Local Instance coPset_disj_pcore_instance : PCore coPset_disj := λ _, Some ε.
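(* A quick sanity check, not part of the diff: on disjoint sets the operation
   above keeps the union. The [Local Arguments] hint above lets [simpl] expose
   the [decide], which [decide_True] then discharges. A lemma of this shape
   exists later in this file; the name below is made up to avoid clashing
   with it. *)
Example coPset_disj_union_example (X Y : coPset) :
  X ## Y → CoPset X ⋅ CoPset Y = CoPset (X ∪ Y).
Proof. intros. by rewrite /= decide_True. Qed.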
......
From iris.algebra Require Export ofe.
From iris.prelude Require Import options.
(* Note that [Inhabited] is not derivable. Take [F X := ▶ X], then a possible
solution is [Empty_set]. *)
Record solution (F : oFunctor) := Solution {
solution_car :> ofe;
solution_inhabited : Inhabited solution_car;
solution_cofe : Cofe solution_car;
solution_iso :> ofe_iso (oFunctor_apply F solution_car) solution_car;
}.
Global Existing Instance solution_cofe.
Global Existing Instances solution_inhabited solution_cofe.
Module solver. Section solver.
Context (F : oFunctor) `{Fcontr : oFunctorContractive F}.
Context (F : oFunctor) `{Fcontr : !oFunctorContractive F}.
Context `{Fcofe : ∀ (T : ofe) `{!Cofe T}, Cofe (oFunctor_apply F T)}.
Context `{Finh : Inhabited (oFunctor_apply F unitO)}.
Notation map := (oFunctor_map F).
......@@ -61,7 +64,7 @@ Proof.
+ by intros X n.
+ by intros X Y ? n.
+ by intros X Y Z ?? n; trans (Y n).
- intros k X Y HXY n; apply dist_S.
- intros k j X Y HXY Hlt n. apply (dist_le k); [|lia].
by rewrite -(g_tower X) (HXY (S n)) g_tower.
Qed.
Definition T : ofe := Ofe tower tower_ofe_mixin.
......@@ -177,6 +180,8 @@ Proof.
- rewrite (ff_tower k (i - S k) X). by destruct (Nat.sub_add _ _ _).
Qed.
Global Instance tower_inhabited : Inhabited tower := populate (embed 0 ()).
Program Definition unfold_chain (X : T) : chain (oFunctor_apply F T) :=
{| chain_car n := map (project n,embed' n) (X (S n)) |}.
Next Obligation.
......@@ -206,7 +211,7 @@ Proof. by intros n X Y HXY k; rewrite /fold /= HXY. Qed.
Theorem result : solution F.
Proof using Type*.
refine (Solution F T _ (OfeIso (OfeMor fold) (OfeMor unfold) _ _)).
refine (Solution F T _ _ (OfeIso (OfeMor fold) (OfeMor unfold) _ _)).
- move=> X /=. rewrite equiv_dist=> n k; rewrite /unfold /fold /=.
rewrite -g_tower -(gg_tower _ n); apply (_ : Proper (_ ==> _) (g _)).
trans (map (ff n, gg n) (X (S (n + k)))).
......
......@@ -12,14 +12,14 @@ Local Arguments cmra_valid _ !_ /.
Inductive csum (A B : Type) :=
| Cinl : A → csum A B
| Cinr : B → csum A B
| CsumBot : csum A B.
| CsumInvalid : csum A B.
Global Arguments Cinl {_ _} _.
Global Arguments Cinr {_ _} _.
Global Arguments CsumBot {_ _}.
Global Arguments CsumInvalid {_ _}.
Global Instance: Params (@Cinl) 2 := {}.
Global Instance: Params (@Cinr) 2 := {}.
Global Instance: Params (@CsumBot) 2 := {}.
Global Instance: Params (@CsumInvalid) 2 := {}.
Global Instance maybe_Cinl {A B} : Maybe (@Cinl A B) := λ x,
match x with Cinl a => Some a | _ => None end.
......@@ -35,12 +35,12 @@ Implicit Types b : B.
Inductive csum_equiv : Equiv (csum A B) :=
| Cinl_equiv a a' : a ≡ a' → Cinl a ≡ Cinl a'
| Cinr_equiv b b' : b ≡ b' → Cinr b ≡ Cinr b'
| CsumBot_equiv : CsumBot ≡ CsumBot.
| CsumInvalid_equiv : CsumInvalid ≡ CsumInvalid.
Local Existing Instance csum_equiv.
Inductive csum_dist : Dist (csum A B) :=
| Cinl_dist n a a' : a ≡{n}≡ a' → Cinl a ≡{n}≡ Cinl a'
| Cinr_dist n b b' : b ≡{n}≡ b' → Cinr b ≡{n}≡ Cinr b'
| CsumBot_dist n : CsumBot ≡{n}≡ CsumBot.
| CsumInvalid_dist n : CsumInvalid ≡{n}≡ CsumInvalid.
Local Existing Instance csum_dist.
Global Instance Cinl_ne : NonExpansive (@Cinl A B).
......@@ -65,13 +65,13 @@ Proof.
split.
- intros mx my; split.
+ by destruct 1; constructor; try apply equiv_dist.
+ intros Hxy; feed inversion (Hxy 0); subst; constructor; try done;
apply equiv_dist=> n; by feed inversion (Hxy n).
+ intros Hxy; oinversion (Hxy 0); subst; constructor; try done;
apply equiv_dist=> n; by oinversion (Hxy n).
- intros n; split.
+ by intros [|a|]; constructor.
+ by destruct 1; constructor.
+ destruct 1; inversion_clear 1; constructor; etrans; eauto.
- by inversion_clear 1; constructor; apply dist_S.
- by inversion_clear 1; constructor; eauto using dist_le with si_solver.
Qed.
Canonical Structure csumO : ofe := Ofe (csum A B) csum_ofe_mixin.
......@@ -81,32 +81,32 @@ Next Obligation. intros c a n i ?; simpl. by destruct (chain_cauchy c n i). Qed.
Program Definition csum_chain_r (c : chain csumO) (b : B) : chain B :=
{| chain_car n := match c n return _ with Cinr b' => b' | _ => b end |}.
Next Obligation. intros c b n i ?; simpl. by destruct (chain_cauchy c n i). Qed.
Definition csum_compl `{Cofe A, Cofe B} : Compl csumO := λ c,
Definition csum_compl `{!Cofe A, !Cofe B} : Compl csumO := λ c,
match c 0 with
| Cinl a => Cinl (compl (csum_chain_l c a))
| Cinr b => Cinr (compl (csum_chain_r c b))
| CsumBot => CsumBot
| CsumInvalid => CsumInvalid
end.
Global Program Instance csum_cofe `{Cofe A, Cofe B} : Cofe csumO :=
Global Program Instance csum_cofe `{!Cofe A, !Cofe B} : Cofe csumO :=
{| compl := csum_compl |}.
Next Obligation.
intros ?? n c; rewrite /compl /csum_compl.
feed inversion (chain_cauchy c 0 n); first auto with lia; constructor.
oinversion (chain_cauchy c 0 n); first auto with lia; constructor.
+ rewrite (conv_compl n (csum_chain_l c _)) /=. destruct (c n); naive_solver.
+ rewrite (conv_compl n (csum_chain_r c _)) /=. destruct (c n); naive_solver.
Qed.
Global Instance csum_ofe_discrete :
OfeDiscrete A → OfeDiscrete B → OfeDiscrete csumO.
Proof. by inversion_clear 3; constructor; apply (discrete _). Qed.
Proof. by inversion_clear 3; constructor; apply (discrete_0 _). Qed.
Global Instance csum_leibniz :
LeibnizEquiv A → LeibnizEquiv B → LeibnizEquiv csumO.
Proof. by destruct 3; f_equal; apply leibniz_equiv. Qed.
Global Instance Cinl_discrete a : Discrete a → Discrete (Cinl a).
Proof. by inversion_clear 2; constructor; apply (discrete _). Qed.
Proof. by inversion_clear 2; constructor; apply (discrete_0 _). Qed.
Global Instance Cinr_discrete b : Discrete b → Discrete (Cinr b).
Proof. by inversion_clear 2; constructor; apply (discrete _). Qed.
Proof. by inversion_clear 2; constructor; apply (discrete_0 _). Qed.
End ofe.
......@@ -118,7 +118,7 @@ Definition csum_map {A A' B B'} (fA : A → A') (fB : B → B')
match x with
| Cinl a => Cinl (fA a)
| Cinr b => Cinr (fB b)
| CsumBot => CsumBot
| CsumInvalid => CsumInvalid
end.
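(* A small sketch, not part of the diff, of how [csum_map] computes on the
   constructors; the symmetric equation for [Cinr] holds in the same way. The
   example name is made up for illustration. *)
Example csum_map_Cinl {A A' B B'} (fA : A → A') (fB : B → B') (a : A) :
  csum_map fA fB (Cinl a) = Cinl (fA a).
Proof. reflexivity. Qed.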
Global Instance: Params (@csum_map) 4 := {}.
......@@ -152,25 +152,25 @@ Local Instance csum_valid_instance : Valid (csum A B) := λ x,
match x with
| Cinl a => ✓ a
| Cinr b => ✓ b
| CsumBot => False
| CsumInvalid => False
end.
Local Instance csum_validN_instance : ValidN (csum A B) := λ n x,
match x with
| Cinl a => ✓{n} a
| Cinr b => ✓{n} b
| CsumBot => False
| CsumInvalid => False
end.
Local Instance csum_pcore_instance : PCore (csum A B) := λ x,
match x with
| Cinl a => Cinl <$> pcore a
| Cinr b => Cinr <$> pcore b
| CsumBot => Some CsumBot
| CsumInvalid => Some CsumInvalid
end.
Local Instance csum_op_instance : Op (csum A B) := λ x y,
match x, y with
| Cinl a, Cinl a' => Cinl (a ⋅ a')
| Cinr b, Cinr b' => Cinr (b ⋅ b')
| _, _ => CsumBot
| _, _ => CsumInvalid
end.
Lemma Cinl_op a a' : Cinl (a ⋅ a') = Cinl a ⋅ Cinl a'.
......@@ -184,14 +184,14 @@ Lemma Cinr_valid b : ✓ (Cinr b) ↔ ✓ b.
Proof. done. Qed.
Lemma csum_included x y :
x ≼ y ↔ y = CsumBot ∨ (∃ a a', x = Cinl a ∧ y = Cinl a' ∧ a ≼ a')
x ≼ y ↔ y = CsumInvalid ∨ (∃ a a', x = Cinl a ∧ y = Cinl a' ∧ a ≼ a')
∨ (∃ b b', x = Cinr b ∧ y = Cinr b' ∧ b ≼ b').
Proof.
split.
- unfold included. intros [[a'|b'|] Hy]; destruct x as [a|b|];
inversion_clear Hy; eauto 10.
- intros [->|[(a&a'&->&->&c&?)|(b&b'&->&->&c&?)]].
+ destruct x; exists CsumBot; constructor.
+ destruct x; exists CsumInvalid; constructor.
+ exists (Cinl c); by constructor.
+ exists (Cinr c); by constructor.
Qed.
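(* A small usage sketch, not part of the diff: by [csum_included], an element
   of the left summand is never included in an element of the right summand.
   The example name is made up for illustration. *)
Example Cinl_not_included_Cinr (a : A) (b : B) : ¬ Cinl a ≼ Cinr b.
Proof. by intros [?|[(?&?&?&?&?)|(?&?&?&?&?)]]%csum_included; simplify_eq. Qed.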
......@@ -199,16 +199,19 @@ Lemma Cinl_included a a' : Cinl a ≼ Cinl a' ↔ a ≼ a'.
Proof. rewrite csum_included. naive_solver. Qed.
Lemma Cinr_included b b' : Cinr b ≼ Cinr b' ↔ b ≼ b'.
Proof. rewrite csum_included. naive_solver. Qed.
Lemma CsumInvalid_included x : x ≼ CsumInvalid.
Proof. exists CsumInvalid. by destruct x. Qed.
(** We register a hint for [CsumInvalid_included] below. *)
Lemma csum_includedN n x y :
x {n} y y = CsumBot ( a a', x = Cinl a y = Cinl a' a {n} a')
x {n} y y = CsumInvalid ( a a', x = Cinl a y = Cinl a' a {n} a')
( b b', x = Cinr b y = Cinr b' b {n} b').
Proof.
split.
- unfold includedN. intros [[a'|b'|] Hy]; destruct x as [a|b|];
inversion_clear Hy; eauto 10.
- intros [->|[(a&a'&->&->&c&?)|(b&b'&->&->&c&?)]].
+ destruct x; exists CsumBot; constructor.
+ destruct x; exists CsumInvalid; constructor.
+ exists (Cinl c); by constructor.
+ exists (Cinr c); by constructor.
Qed.
......@@ -236,11 +239,11 @@ Proof.
constructor; eauto using cmra_pcore_l.
- intros [a|b|] ? [=]; subst; auto.
+ destruct (pcore a) as [ca|] eqn:?; simplify_option_eq.
feed inversion (cmra_pcore_idemp a ca); repeat constructor; auto.
oinversion (cmra_pcore_idemp a ca); repeat constructor; auto.
+ destruct (pcore b) as [cb|] eqn:?; simplify_option_eq.
feed inversion (cmra_pcore_idemp b cb); repeat constructor; auto.
oinversion (cmra_pcore_idemp b cb); repeat constructor; auto.
- intros x y ? [->|[(a&a'&->&->&?)|(b&b'&->&->&?)]]%csum_included [=].
+ exists CsumBot. rewrite csum_included; eauto.
+ exists CsumInvalid. rewrite csum_included; eauto.
+ destruct (pcore a) as [ca|] eqn:?; simplify_option_eq.
destruct (cmra_pcore_mono a a' ca) as (ca'&->&?); auto.
exists (Cinl ca'). rewrite csum_included; eauto 10.
......@@ -255,7 +258,7 @@ Proof.
+ destruct y1 as [a1|b1|], y2 as [a2|b2|]; try by exfalso; inversion Hx'.
destruct (cmra_extend n b b1 b2) as (z1&z2&?&?&?); [done|apply (inj Cinr), Hx'|].
exists (Cinr z1), (Cinr z2). by repeat constructor.
+ by exists CsumBot, CsumBot; destruct y1, y2; inversion_clear Hx'.
+ by exists CsumInvalid, CsumInvalid; destruct y1, y2; inversion_clear Hx'.
Qed.
Canonical Structure csumR := Cmra (csum A B) csum_cmra_mixin.
......@@ -295,7 +298,7 @@ Proof. intros ? [] ? EQ; inversion_clear EQ. by eapply id_free0_r. Qed.
(** Interaction with [option] *)
Lemma Some_csum_includedN x y n :
Some x ≼{n} Some y ↔
y = CsumBot ∨
y = CsumInvalid ∨
(∃ a a', x = Cinl a ∧ y = Cinl a' ∧ Some a ≼{n} Some a') ∨
(∃ b b', x = Cinr b ∧ y = Cinr b' ∧ Some b ≼{n} Some b').
Proof.
......@@ -305,7 +308,7 @@ Proof.
Qed.
Lemma Some_csum_included x y :
Some x ≼ Some y ↔
y = CsumBot ∨
y = CsumInvalid ∨
(∃ a a', x = Cinl a ∧ y = Cinl a' ∧ Some a ≼ Some a') ∨
(∃ b b', x = Cinr b ∧ y = Cinr b' ∧ Some b ≼ Some b').
Proof.
......@@ -366,6 +369,10 @@ Proof.
Qed.
End cmra.
(* We use a [Hint Extern] with [apply:], instead of [Hint Immediate], to invoke
the new unification algorithm. The old unification algorithm sometimes gets
confused by going from [ucmra]'s to [cmra]'s and back. *)
Global Hint Extern 0 (_ ≼ CsumInvalid) => apply: CsumInvalid_included : core.
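(* A quick sanity check, not part of the diff, that the hint fires: [auto]
   closes inclusions into [CsumInvalid] without naming the lemma. The example
   name is made up for illustration. *)
Example csum_included_invalid {A B : cmra} (x : csum A B) : x ≼ CsumInvalid.
Proof. auto. Qed.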
Global Arguments csumR : clear implicits.
(* Functor *)
......