deducteam / lambdapi
Proof assistant based on the λΠ-calculus modulo rewriting
License: Other
We should consider variables that are unused in the RHS as being wildcards.
I don't really understand how the arity restriction of lambdapi works. For example, in the following signature:
A : Type.
def h : (A -> A) -> (A -> A) -> A -> A.
The following rule is accepted:
[aa] h aa (x => aa x) --> aa.
But not this one:
[aa] h (x => aa x) aa --> aa.
with the error message: "Arity mismatch for metavariable [aa].".
A : Type.
a : A.
def f : (A -> A) -> A.
[y] f (x => y y) --> a.
I know that the matching is syntactic and not a higher-order one, but I still think that this rule should be rejected.
We should ask the maintainer of the library to pack the source files differently. I propose a .tar archive with .dk.gz files inside. This only takes a small amount of extra space and allows us to work on files independently. The problem with the current packaging is that it is not possible to extract a single file (efficiently) from a .tar.gz archive.
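As a sketch of the proposal (file names are hypothetical): each source file stays individually gzipped, and the archive itself is an uncompressed tar, so any single member can be extracted without decompressing the rest:

```shell
# Compress each source file individually (-k keeps the originals).
gzip -k foo.dk bar.dk

# Pack the pre-compressed files into an *uncompressed* tar archive.
tar -cf library.tar foo.dk.gz bar.dk.gz

# A single file can now be extracted and decompressed on its own,
# without touching the other members.
tar -xf library.tar foo.dk.gz
gunzip foo.dk.gz
```

With a .tar.gz instead, the gzip stream covers the whole archive, so extracting one member requires decompressing everything before it.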
https://github.com/rlepigre/lambdapi/blob/2258c46dd85ea119b12b91083ffb718296c96fd3/src/files.ml#L21
The error messages should be different for each check.
This example comes from this issue.
A : Type.
a : A.
def T : A -> Type.
[] T a --> A -> A.
True : Type.
def eq : A -> A -> Type.
[x] eq x x --> True.
true : True.
def f : x : A -> y : A -> eq x y -> T x -> A.
(; OK ;)
[] f a _ true (x => x) --> a.
(; Not OK ;)
[] f _ a true (x => x) --> a.
There is a bit of critical state that is outside the Handle.Pure interface; in particular Sign.loading and Sign.loaded. I think these should be taken care of in the Pure interface, maybe by modifying initial_state to be:
val initial_state : module_path -> state
so that we can create scoped states for each document that we are processing.
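A minimal sketch of what this could look like (everything besides initial_state and the Sign field names is illustrative, not the actual lambdapi code): the global Sign.loading / Sign.loaded references become fields of state, so each document gets its own copy:

```ocaml
type module_path = string list

(* Per-document state: the [loading] and [loaded] fields replace the
   global [Sign.loading] and [Sign.loaded] references. *)
type state =
  { path            : module_path        (* Module being processed. *)
  ; mutable loading : module_path list   (* Modules being loaded.   *)
  ; mutable loaded  : module_path list } (* Modules already loaded. *)

let initial_state : module_path -> state = fun path ->
  { path; loading = [path]; loaded = [] }
```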
For some applications/libraries like zenon, iProverModulo, etc., it is not necessary to generate a dko file because it will never be used. We just need to know whether type checking succeeded or not. This would improve running time, because dko files may be huge, and reduce disk space usage too.
Do we want to be able to reduce ill-typed terms or not?
For now, we do not type-check before evaluating.
The current implementation performs very badly in terms of memory usage, and this has big consequences on speed. The problem (at least part of it) comes from the new type-checking / inference functions, which allocate a lot of memory.
One possibility for improvement would be to type-check using a stack (pushing the arguments until a head term is found) to avoid introducing too many unification variables. We should also avoid situations where terms are copied (this breaks physical equalities).
https://github.com/rlepigre/lambdapi/blob/821a972add8c4075ed7eced7d9951a84b129e2f4/src/parser.ml#L43
Why not let the parser do the splitting itself by adding a parser for module names?
Module names will also be useful when we add a Require command.
An assertion failure is raised on the example below:
A : Type.
def g : A -> A.
[] g --> x => x.
Because they are not taken into account. This should be changed in Dedukti too.
The last rule does not type check with lambdapi but does type check with Dedukti (it seems to me that Dedukti is right):
#NAME cic.
Nat : Type.
z : Nat.
s : Nat -> Nat.
def m : Nat -> Nat -> Nat.
[i] m i z --> i.
[j] m z j --> j.
[i, j] m (s i) (s j) --> s (m i j).
(; Sorts ;)
Sort : Type.
prop : Sort.
type : Nat -> Sort.
(; Universe successors ;)
def succ : Sort -> Sort.
[] succ prop --> type z.
[i] succ (type i) --> type (s i).
(; Universe cumulativity ;)
def next : Sort -> Sort.
[] next prop --> type z.
[i] next (type i) --> type (s i).
(; Universe product rules ;)
def rule : Sort -> Sort -> Sort.
[s1] rule s1 prop --> prop.
[s2] rule prop s2 --> s2.
[i, j] rule (type i) (type j) --> type (m i j).
def max : Sort -> Sort -> Sort.
[s1] max s1 prop --> s1.
[s2] max prop s2 --> s2.
[i, j] max (type i) (type j) --> type (m i j).
(; Types and terms ;)
Univ : s : Sort -> Type.
def Term : s : Sort -> a : Univ s -> Type.
univ : s : Sort -> Univ (succ s).
def lift : s1 : Sort -> s2 : Sort -> a : Univ s1 -> Univ (max s1 s2).
def prod : s1 : Sort -> s2 : Sort -> a : Univ s1 -> b : (Term s1 a -> Univ s2) -> Univ (rule s1 s2).
def join : s1 : Sort -> s2 : Sort -> Univ s1 -> Univ s2 -> Univ (max s1 s2).
[s1,s2] succ (max s1 s2) --> max (succ s1) (succ s2).
[s1,s2] join _ _ (univ s1) (univ s2) --> univ (max s1 s2).
[s] Term _ (univ s) --> Univ s.
[s1, s2, a] Term _ (lift s1 s2 a) --> Term s1 a.
[s1, s2, a, b]
Term _ (prod s1 s2 a b) --> x : Term s1 a -> Term s2 (b x).
(; Canonicity rules ;)
[s] max s s --> s.
[s1, s2, s3] max (max s1 s2) s3 --> max s1 (max s2 s3).
[s1, s2, s3] rule (max s1 s3) s2 --> max (rule s1 s2) (rule s3 s2).
[s1, s2, s3] rule s1 (max s2 s3) --> max (rule s1 s2) (rule s1 s3).
[s1,s2,s3,a,b,c]
join _ _ (prod s1 s2 a b) (prod _ s3 a c)
--> prod s1 (max s2 s3) a (x : Term s1 a => join s2 s3 (b x) (c x)).
On this example:
#NAME bug.
type : Type.
def term : type -> Type.
pi : A : type -> (term A -> type) -> type.
[A,B] term (pi A B) --> a : term A -> term (B a).
def i (A : type) (B : term A -> type) : term (pi A B) := z : term A => z.
The error given is
Cannot type the definition of i test.dk, 9:4-5
Why?
Object files do not have to be produced. This may require going from Makefile
to bash scripts.
To discuss point 2 in #32 .
https://github.com/rlepigre/lambdapi/blob/bd4589f4c1fb94dae951a4b8d8703a27c501730a/src/extra.ml#L53
Is this part of the OCaml specification?
Rodolphe wrote:
As I already told you by email, I'd like to introduce the same changes that I did in https://github.com/rlepigre/lambdapi/tree/metavar . Part of the changes that were introduced was the complete removal of contexts in typing rules. I proposed the following syntax:
add Zero &m --> &m
add (Succ &n) &m --> Succ (add &n &m)
weird (x => y => z => &f[x,z]) --> &f[Zero, Zero]
Frédéric wrote:
Ah, I was writing a mail about this to you when I saw yours... ;-) I was also thinking that we could perhaps get rid of rule environment completely. On the other hand, I would propose to you to use the same syntax as metavariables, that is, to write:
add Zero ?m --> ?m.
instead of:
[m] add Zero m --> m.
But there are two things here we need to separate:
Forbid types in the rule environment. It should not be a big deal to do this both in Dedukti and in Lambdapi. There are very few things to change in the code. On the other hand, we perhaps need to change some Dedukti generators. Do you know of any Dedukti generators outputting types in the rule environment? This was the first goal of this issue.
Remove the rule environment completely. It seems possible to do this by replacing free variables by metavariables in rules. This should be discussed in another issue, I think.
To stay compatible with Dedukti, we could support the two syntaxes at the same time. Then, once the Dedukti generators are updated, we can drop the old syntax.
We could develop scoping for the new syntax only, and translate the old syntax into the new one internally.
Rodolphe wrote:
OK, it should be possible to keep the two syntaxes, by translating the old one into the new one, as you said. However, I disagree that we should use the same syntax as for meta-variables. As I explained in my email (below, originally in French), pattern variables are not meta-variables.
More precisely, they can be seen as:
free higher-order variables in the LHS (pattern),
bound higher-order variables in the RHS (action).
The representation is definitely not the same from an operational point of view, and I don't think that we should trick the user into thinking that these are meta-variables.
Hi again,
I am again wondering about Miller patterns and the typing of rewriting rules. A priori, what I did in the "metavar" branch was very satisfactory, and I wonder whether I should not redo that in the "pa" branch.
Roughly, this involves adding two new term constructors:
"Patt of int option * string * term array" for a pattern variable with its environment. The optional integer is "None" if the variable is a catch-all; otherwise it points to the index of the array slot into which the matched term must be written.
"TEnv of term_env * term array", which can be seen as a context carrying its environment, but whose "hole" is a binder variable. Roughly, the right-hand side is a multiple binder that binds the "holes" of the environment in a term. When matching, we build "tmbinder"s that can then be substituted into the right-hand side to obtain a term. In other words, the RHS has type "(term_env, term) Bindlib.mbinder".
Do you want me to try to reintegrate all of this soon?
Cheers,
Rodolphe
Once we agree, I'll start coding that as soon as possible.
It would be nice if the following program could be parsed:
A : Type.
foo : (x : A) -> A.
https://github.com/rlepigre/lambdapi/blob/7a256a6916501a9e9b9d0a9eb93194c258c31e62/terms.ml#L30
Why not factor more by using something like
and symbol =
  { sym_name : string            (** Name of the symbol. *)
  ; mutable sym_type : term      (** Type of the symbol. *)
  ; sym_path : module_path       (** Module in which it is defined. *)
  ; sym_kind : sym_kind }
and sym_kind = Undef | Def of (rule list) ref
?
Aren't the following exception handlings useless?
https://github.com/rlepigre/lambdapi/blob/4cef6e2202816ca0c5a289fae511bc771a1a0f4c/src/handle.ml#L202
https://github.com/rlepigre/lambdapi/blob/4cef6e2202816ca0c5a289fae511bc771a1a0f4c/src/handle.ml#L232
If so, we can remove the function handle_cmds completely.
I tested it: make tests still passes if we remove this code.
In many cases, we need to reason about the application head of a term t, that is, the term h such that t = h t1 .. tn and h is not an application. Currently, we need to use the function get_args. We could get it for free by replacing:
| Appl of term * term
by:
| Appl of term * term list
where the first component is never an Appl node, and the second component is never empty. This can be enforced by declaring term as a private data type and providing construction functions.
I think that this would clarify some parts of the code and improve efficiency.
This is done like this in Dedukti.
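A sketch of the proposed representation (illustrative names, not the actual lambdapi definitions): a smart constructor maintains the invariant, and the application head then comes for free:

```ocaml
type term =
  | Vari of string           (* Variable (stand-in for the other cases). *)
  | Appl of term * term list (* Invariant: the head is not an [Appl] and
                                the argument list is non-empty. *)

(* Smart constructor: flattens nested applications so that the
   invariant holds by construction. *)
let appl : term -> term list -> term = fun h args ->
  match h, args with
  | _, []            -> h
  | Appl (g, us), vs -> Appl (g, us @ vs)
  | _, vs            -> Appl (h, vs)

(* [get_args t] is now a constant-time pattern match. *)
let get_args : term -> term * term list = function
  | Appl (h, args) -> (h, args)
  | t              -> (t, [])
```

Declaring term private (as the issue suggests) would force all clients to go through appl, so the invariant cannot be broken elsewhere.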
Trying to test lambdapi on example #59, I found that the documentation was missing for HO rules.
I have the impression that the correct syntax (right now) is:
A : Type.
a : A.
def f : (A -> A) -> A.
f (x => ?y[x]) --> a.
However, the following syntax is rejected, though not at parsing level (by the way, the error message could be improved):
(; KO 9 ;)
A : Type.
a : A.
def f : (A -> A) -> A.
[y] f (x => ?y[x]) --> a.
While the example below is rejected at parsing level:
A : Type.
a : A.
def f : (A -> A) -> A.
[y] f (x => y[x]) --> a.
At the very least, we should have a note that explains the differences between the syntax of Dedukti and that of lambdapi.
The example of Deducteam/Dedukti#66 does not work in Lambdapi either.
type : Type.
def term : type -> Type.
eq : A : type -> term A -> term A -> type.
def Eq (A : type) (a : term A) (b : term A) : Type := term (eq A a b).
refl : A : type -> a : term A -> Eq A a a.
sym : A : type -> a : term A -> b : term A -> Eq A a b -> Eq A b a.
def trans : A : type -> a : term A -> b : term A -> c : term A ->
Eq A a b -> Eq A b c -> Eq A a c.
[A,a,b,h] trans A a b a h (sym A a b h) --> refl A a
[A,a,b,h] trans A b a b (sym A a b h) h --> refl A b
[A,a,b,h] trans A a b b h (refl A b) --> h
[A,a,b,h] trans A a a b (refl A a) h --> h
[A,w,x,y,z,h,h',h''] trans A w x z h (trans A x y z h' h'') -->
trans A w y z (trans A w x y h h') h''.
def bpi : X : type -> x : term X -> y : term X -> h : Eq X x y ->
P : (y : term X -> Eq X x y -> type) ->
term (P x (refl X x)) -> term (P y h).
(; The left-hand side ;)
def bpi_trans_1 (X : type) (x : term X) (y : term X) (z : term X)
(P : z : term X -> Eq X y z -> type)
(h : Eq X x y) (h' : Eq X y z)
(p : term (P x (sym X x y h))) :
term (P z h') :=
bpi X y z h' (z : term X => h' : Eq X y z => P z h')
(bpi X x y h
(z : term X => h'' : Eq X x z => P z (trans X y x z (sym X x y h) h''))
p).
(; The right-hand side ;)
def bpi_trans_2 (X : type) (x : term X) (y : term X) (z : term X)
(P : z : term X -> Eq X y z -> type)
(h : Eq X x y) (h' : Eq X y z)
(p : term (P x (sym X x y h))) : term (P z h') :=
bpi X x z (trans X x y z h h')
(z : term X => h'' : Eq X x z => P z (trans X y x z (sym X x y h) h'')) p.
(; The following rule is rejected ;)
[X,x,y,z,P,h,h',p]
bpi X y z h' P
(bpi X x y h
(z => h'' => P z (trans X y x z (sym X x y h) h''))
p)
-->
bpi X x z (trans X x y z h h')
(z : term X => h'' : Eq X x z => P z (trans X y x z (sym X x y h) h'')) p.
(; [dedukti66.dk, 46:0-53:77] Error on command.
Rule [bpi ?X ?y ?z ?h' ?P (bpi ?X ?x ?y ?h (z:?#0 => h'':?#1[z] => ?P z (trans ?X ?y ?x z (sym ?X ?x ?y ?h) h'')) ?p)
→ bpi <?X> <?x> <?z> (trans <?X> <?x> <?y> <?z> <?h> <?h'>) (z:term <?X> => h'':Eq <?X> <?x> z => <?P> z (trans <?X> <?y> <?x> z (sym <?X> <?x> <?y> <?h>) h'')) <?p>] does not preserve typing. ;)
12:28 ~/lambdapi-git make dklib
## Compiling the dklib library ##
Checked [cc.dk]
Checked [dk_bool.dk]
Fatal error: exception Console.Fatal("[dklib.dk, 1:0-23] error while handling a command.\n[dk_machine_int.dk, 3:0-15] error while handling a command.\n[dk_nat.dk, 3:0-16] error while handling a command.\n[dk_list.dk, 3:0-17] error while handling a command.\n[dk_logic.dk, 3:0-16] error while handling a command.\nParse error at [dk_type.dk, 53:15].\n\n\n\n\n\n")
Raised at file "format.ml" (inlined), line 242, characters 35-52
Called from file "format.ml", line 469, characters 8-33
Called from file "format.ml", line 484, characters 6-24
Command exited with non-zero status 2
Finished in 0:00.11 at 98% with 8620Kb of RAM
GNUmakefile:98: recipe for target 'dklib' failed
make: *** [dklib] Error 2
The relevant line in dk_type.dk:
left (A : cc.uT) (B : cc.uT) : cc.eT A -> Sum A B.
Can arguments be given before the type declaration?
Quoting @fblanqui :
I don't think that eq is a good name. You should rather call it unify and rename unify into safe_set_unif and set_unif into unsafe_set_unif.
12:17 ~/src/deducteam/lambdapi-lsp-git opam uninstall lambdapi
The following actions will be performed:
⊘ remove lambdapi dev
=-=- Processing actions -=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
[WARNING] Directory /local/blanqui/.opam/4.07.0/lib/lambdapi is not empty, not
removing
⊘ removed lambdapi.dev
Done.
Does make uninstall remove the library files too?
https://github.com/rlepigre/lambdapi/blob/7a256a6916501a9e9b9d0a9eb93194c258c31e62/scope.ml#L145
Move type cmd in a separate file, say cmd.ml, and rename cmds.ml into, say handle.ml?
Is this feature essential? What does it imply for the theory, besides the new rule for product types?
Example: (; dhdhdh (; dhdhd ;) ddd ;)
https://github.com/rlepigre/lambdapi/blob/6c3a72613cc9a91ec1cf02f7fb8fc1c5da407e87/src/cmd.ml#L27
What is the boolean argument for?
https://github.com/rlepigre/lambdapi/blob/ebfea320770e59506617753633f93bee989079db/src/scope.ml#L103
I guess that you use # because no identifier is allowed to start with #.
But why do you need to give a name to wildcard variables at all?
Some comment is needed here.
README.md should include some explanations about these directories: what their purposes are and how they are organized. And what is examples/CHANGES?
Wouldn't it be better, for the moment, to separate the code for type-checking from the code for subject reduction, even though subject reduction shares many things with type-checking? I think that it would be cleaner and would allow one to modify the code for type-checking independently of subject reduction, and vice versa. Type checking should fail when a unification variable is encountered. Subject reduction should be developed in a separate file, say sr.ml.
The following file generates a FIXME error
def A : Type.
def eA : A -> Type.
a : A.
[] A --> eA a.
B : Type.
def eB : B -> Type.
[] eB a --> eA a.
On the following example, lambdapi is happy (I tried to minimize the example):
Sort : Type.
succ : Sort -> Sort.
def max : Sort -> Sort -> Sort.
[s1,s2,s3] max (max s1 s2) s3 --> max s1 (max s2 s3).
Univ : Sort -> Type.
def Term : s : Sort -> a : Univ s -> Type.
univ : s : Sort -> Univ (succ s).
def join : s1 : Sort -> s2 : Sort -> Univ s1 -> Univ s2 -> Univ (max s1 s2).
def cast : s1 : Sort -> s2 : Sort -> a : Univ s1 -> b : Univ s2 -> Term s1 a -> Term (max s1 s2) (join s1 s2 a b).
(; [s] Term _ (univ s) --> Univ s. ;)
[s1,s2,s3,t1,t2,t3]
join _ s3 (join s1 s2 t1 t2) t3 -->
join s1 (max s2 s3) t1 (join s2 s3 t2 t3).
[s1,s2,s3,t1,t2,t3,a]
cast _ s3 _ t3 (cast s1 s2 t1 t2 a)
--> cast s1 (max s2 s3) t1 (join s2 s3 t2 t3) a.
However, if you uncomment the rewrite rule on Term, lambdapi does not typecheck anymore (because of the last rule). But if you put this rule at the end of the file, it typechecks again.
I suspect that even when the symbol Term is declared as definable, if there is no rewrite rule on this symbol, it is considered injective. Using the injectivity of Term allows the type checker to infer that, for the last rule, the second wildcard must be a join applied to some arguments, and hence the first wildcard has to be a max.
I don't know whether this behavior is wanted or not. By the way, Dedukti does not type check the last rule in any way (except if you explicitly put a join instead of the wildcard; but since join is definable, I don't want to write it explicitly, because I want the rule to match any term convertible to a join for this wildcard).