Date: Dec 4, 2012 6:16 PM
Author: Graham Cooper
Subject: Re: LOGICAL NEGATION working in PROLOG! <<Using Peano Arithmetic>>

On Dec 5, 8:05 am, Jan Burse <janbu...@fastmail.fm> wrote:

> Graham Cooper schrieb:

>

> > By putting the if( _ , _ ) first

>

> > thm(R) :- if(L,R), thm(L)

>

> I guess you mean:

>

> thm(B) :- thm(if(A,B)), thm(A)

I can add

thm( if(X,Y) ) :- if(X,Y)

to bring if statements into the theory, but you can also have

inference rules at a different level to theorems (a popular point from

GG)
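A rough Python sketch (my own; the fact names a, b, c are invented) of how a clause like thm(R) :- if(L,R), thm(L) searches backward through the if/2 facts:

```python
# Backward-chaining modus ponens, mimicking
#   thm(R) :- if(L, R), thm(L).
IFS = {("a", "b"), ("b", "c")}   # if(a,b). if(b,c).
AXIOMS = {"a"}                   # thm(a).

def thm(goal, depth=10):
    """Prove goal from AXIOMS via the if/2 facts (depth-bounded)."""
    if depth == 0:
        return False             # give up instead of looping forever
    if goal in AXIOMS:
        return True
    # thm(R) :- if(L, R), thm(L).
    return any(thm(l, depth - 1) for (l, r) in IFS if r == goal)

print(thm("c"))   # a, a->b, b->c |- c
```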

>

> Only then is it modus ponens. Namely, then you

> have the Hilbert-style inference rule:

>

>    |- A->B        |- A
>    -------------------
>    |- B

>

> But if you have:

>

> thm(B) :- if(A,B), thm(A).

>

> Then this is something else. I am not sure what it

> should be. You have to explain. It resembles

> the Prolog forward chaining vanilla interpreter

> which has (simplified version without oneormore

> and already):

>

> solved(B) :- clause(B,A), solved(A).

>

> Note clause/2 is the mirror of if/2. That is

> why the arguments are reversed.

>

> But the inference rule, which the forward chaining

> vanilla interpreter is based on, is not modus ponens.

> It is the (->L) rule of the cut-eliminated sequent

> calculus. Namely this rule reads, without Dragalin's

> optimization, i.e. the original Kleene G3:

>

>    G, A -> B |- A        G, A -> B, B |- C
>    ---------------------------------------- (->L)
>    G, A -> B |- C

>

> Forward chaining is now a special case of the above rule,

> when G already contains the solved unit clause A. We

> then have:

>

>    G', A, A -> B |- A        G', A, A -> B, B |- C
>    ------------------------------------------------ (->L)
>    G', A, A -> B |- C

>

> We can close the left branch via the (id) rule, since A

> is already part of the antecedent.

>

>    ------------------ (id)
>    G', A, A -> B |- A        G', A, A -> B, B |- C
>    ------------------------------------------------ (->L)
>    G', A, A -> B |- C

>

> So basically we go from:

>

> G', A, A -> B |- C

>

> To the following via backward chaining:

>

> G', A, A -> B, B |- C

>

> So we started with A, and we added B, because there was

> a rule A -> B. This is the backward chaining explanation

> of forward chaining. Of course A has been a theorem already,

> and B is a theorem now, by means of the (id) rule again:

>

> ------------------ (id)

> G', A, A -> B |- A

>

> --------------------- (id)

> G', A, A -> B, B |- B

>

> But there is one drawback of this approach: during back-

> tracking you lose B again. So the method is a little

> forgetful, and recomputes a lot of things again and

> again. But if you add:

>

> thm(C) :- if((A,B),C), thm(A), thm(B).

This is my whole proof ideology!

It constructs a Directed Acyclic Graph from the theorem back to the

axioms - I call it binary modus ponens.

(It ends up that one of A or B is usually an inference rule and it just

applies standard MP - going round in circles!)
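A hedged Python sketch (mine; the facts p, q, r are invented for illustration) of the binary clause thm(C) :- if((A,B),C), thm(A), thm(B):

```python
# "Binary modus ponens": a rule with a two-part antecedent fires
# only when both parts are provable, mimicking
#   thm(C) :- if((A, B), C), thm(A), thm(B).
BIN_IFS = {(("p", "q"), "r")}    # if((p,q), r).
AXIOMS = {"p", "q"}              # thm(p). thm(q).

def thm(goal, depth=10):
    if depth == 0:
        return False
    if goal in AXIOMS:
        return True
    # each proved antecedent becomes a parent of goal in the proof DAG
    return any(thm(a, depth - 1) and thm(b, depth - 1)
               for ((a, b), c) in BIN_IFS if c == goal)

print(thm("r"))   # p, q, (p & q) -> r |- r
```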

I should be able to construct sets without contradictions this way.

LOGIC

E(Y) Y={x|P(x)} <-> DERIVE( E(Y) Y={x|P(x)} )

DERIVE(T) <-> DERIVE(a) ^ DERIVE(b) ^ (a^b)->T

MATHEMATICS

E(Y) Y={x|P(x)} <-> PRVBLE( E(Y) Y={x|P(x)} )

NOT(PRVBLE(T)) <-> DERIVE(NOT(T))

since DERIVE(NOT( EXIST(russell-set) ))

Going back to my previous MODUS PONENS system.

----------------------------------

if( and(X , Y ), or(X,Y) ).

if( and(not(X) , not(Y) ), not(or(X,Y)) ).

if( and(not(X) , Y ), or(X,Y) ).

if( and(X , not(Y) ), or(X,Y) ).

if( and(not(X) , not(Y) ), not(and(X,Y)) ).

if( and(not(X) , Y ), not(and(X,Y)) ).

if( and(X , not(Y) ), not(and(X,Y)) ).

if( and(X , Y ), if(X,Y) ).

if( and(not(X) , not(Y) ), if(X,Y) ).

if( and(not(X) , Y ), if(X,Y) ).

if( and(X , not(Y) ), not(if(X,Y)) ).

if( and(X , Y ), iff(X,Y) ).

if( and(not(X) , not(Y) ), iff(X,Y) ).

if( and(not(X) , Y ), not(iff(X,Y)) ).

if( and(X , not(Y) ), not(iff(X,Y)) ).

if( and(if(A,B) , if(B,C)), if(A,C) ).

if( not(not(X)) , X ).

if( not(and(X,Y)) , or(not(X),not(Y)) ).

if( not(or(X,Y)) , and(not(X),not(Y)) ).

and(X,Y) :- t(X), t(Y).

and(X,not(Y)) :- t(X), not(Y).

and(not(X),not(Y)) :- not(X), not(Y).

and(not(X),Y) :- not(X), t(Y).

t(and(X,Y)) :- and(X,Y).

t(R) :- t(L), if(L,R).

t(less(X,Y)) :- less(X,Y).

less(0,X).

if( and(less(X,Y) , less(Y,Z)) , less(X,Z) ).
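The truth-functional rows in the if/2 table above can be sanity-checked mechanically; here is a quick Python check (my own) for three representative rows:

```python
# Verify some rows of the if/2 truth table: for each antecedent
# pattern, the stated consequent must hold classically.
from itertools import product

def IMP(x, y):
    return (not x) or y          # classical if(X,Y)

for x, y in product([True, False], repeat=2):
    if x and y:                  # if( and(X,Y), or(X,Y) ).
        assert x or y
    if x and not y:              # if( and(X,not(Y)), not(if(X,Y)) ).
        assert not IMP(x, y)
    if (not x) and (not y):      # if( and(not(X),not(Y)), not(and(X,Y)) ).
        assert not (x and y)

print("rows check out")
```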

Predicates are themselves constructed using

if( and( _,_ ) , _ )

and so are the arithmetic inferences etc..

and(X,Y) :- thm(X) , thm(Y)

does the cartesian calculation depth first in Unify.
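In Python terms (a sketch of the idea, not the actual Unify engine), that depth-first cartesian behaviour is just a nested loop: the second conjunct is re-enumerated for every solution of the first:

```python
# Depth-first enumeration of a conjunction's solutions, as in
#   and(X, Y) :- thm(X), thm(Y).
def thm_solutions():
    # stand-in for backtracking over the thm/1 facts
    yield from ["a", "b"]

def conj():
    for x in thm_solutions():        # bind X
        for y in thm_solutions():    # for each X, re-enumerate Y
            yield (x, y)

print(list(conj()))
# cartesian product, depth first:
# [('a', 'a'), ('a', 'b'), ('b', 'a'), ('b', 'b')]
```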

>

> Then you should be able to prove:

>

> p, p -> q, p -> r, q&r -> s |- s

>

> But for ternary conjunction you would need to add:

>

> thm(D) :- if((A,B,C),D), thm(A), thm(B), thm(C).

>

> And for quaternary conjunction you would need to add:

>

> thm(E) :- if((A,B,C,D),E), thm(A), thm(B), thm(C), thm(D).

>

> And so on. One solution to make it not forgetful, is the

> forward chaining rule I have given:

>

> :- forward solved/1.

>

> solved(H) :- solved(A), forward(H,B),

> oneormany(A,B,C), already(C).

>

> The important thing here is the forward declaration, so

> that solved facts do not become forgetful. If it is not

> forgetful, we can get away with checking only one

> newly arrived solved/1 fact and check the rest via the

> old solved/1 facts.

>

> Let's introduce:

>

> thm_arrived: newly arrived solved/1 fact.

> thm_new: delta of solved/1 facts.

> thm_memory: old solved/1 facts.

>

> The above solved/1 forward chaining rule then does nothing

> other than:

>

> thm_new(B) :- if(A,B), thm_arrived(A).

> thm_new(C) :- if((A,B),C), thm_arrived(A), thm_memory(B).

> thm_new(C) :- if((A,B),C), thm_memory(A), thm_arrived(B).

> thm_new(C) :- if((A,B),C), thm_arrived(A), thm_arrived(B).

> Etc..

>

> And the interpreter does nothing other than:

>

> 1) Pick the newest thm_new/1, retract it and assert

> it as thm_arrived/1.

>

> 2) Invoke all forward rules, and compute the new thm_new/1

> and assert them.

>

> 3) Retract the thm_arrived/1 and assert it as thm_memory/1.

>

> 4) Continue with 1).
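Steps 1)-4) above amount to semi-naive forward chaining; a rough Python rendering (my own; the rule sets are invented) of the thm_new / thm_arrived / thm_memory loop:

```python
# Agenda-based forward chaining with a memory, so derived facts
# are never forgotten and never recomputed.
UNARY = {("p", "q")}              # if(p, q).
BINARY = {(("q", "r"), "s")}      # if((q, r), s).

def forward(initial):
    new, memory = set(initial), set()   # thm_new / thm_memory
    while new:
        arrived = new.pop()             # 1) pick a thm_new fact
        known = memory | {arrived}
        # 2) fire every rule that mentions the arrived fact
        for (a, b) in UNARY:
            if a == arrived and b not in known:
                new.add(b)
        for ((a, b), c) in BINARY:
            if arrived in (a, b) and a in known and b in known \
                    and c not in known:
                new.add(c)
        memory.add(arrived)             # 3) thm_arrived -> thm_memory
    return memory                       # 4) loop until no thm_new left

print(sorted(forward({"p", "r"})))      # ['p', 'q', 'r', 's']
```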

>

> The Craft of Prolog has a section about delta fixpoint calculation.

> But I have not yet verified that it really matches the above. There

> are also a lot of other papers around about delta calculations.

>

> Using memory and delta computation is not mandatory. You can

> also do it as you do, via the backward chaining interpretation

> of forward chaining. But a non forgetful implementation is

> especially useful, if the derived thm/1 is used in multiple

> places, for example a cell in a chart parser.

>

>

Right, but with a billion records in MySQL and only logarithmic

slowdown, you could just turn off garbage collection after running

queries - a crude form of learning: the system retains more facts the

more queries you ask it.

Herc