AILog only needs to ask an askable atom when there is a clause whose head has not already been proved, all of the non-askable atoms in the body have been proved, and the askable atoms in the body that have already been asked were answered “yes”.
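The following is a minimal sketch of that asking condition in Python (not AILog's own code). The representation is assumed for illustration only: clauses as (head, body) pairs of atom names, a set of askable atoms, the set of atoms proved so far, and the yes/no answers collected so far.

```python
# Sketch (not AILog's implementation) of when an askable atom is worth asking.
# Assumed representation: clauses are (head, body) pairs of atom names,
# `askables` is the set of askable atoms, `proved` the atoms established so
# far, and `answers` maps already-asked askables to True ("yes") or False ("no").

def should_ask(atom, clauses, askables, proved, answers):
    """Ask `atom` only if some clause could still make use of the answer."""
    if atom in answers:
        return False                      # already asked; never ask twice
    for head, body in clauses:
        if head in proved or atom not in body:
            continue                      # head already proved, or atom unused
        usable = True
        for b in body:
            if b == atom:
                continue
            if b in askables:
                if answers.get(b) is False:
                    usable = False        # another askable was answered "no"
                    break
            elif b not in proved:
                usable = False            # a non-askable body atom is unproved
                break
        if usable:
            return True
    return False
```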
Here is a trace of AILog:
ailog: load 'elect_bug2.ail'.
AILog theory elect_bug2.ail loaded.
ailog: ask lit_l1.
Answer: lit_l1.
Runtime since last report: 0.0 secs.
[ok,more,how,help]: how.
lit_l1 <-
1: light_l1
2: live_l1
3: ok_l1
How? [Number,up,retry,ok,prompt,help]: how 2.
live_l1 <-
1: live_w0
How? [Number,up,retry,ok,prompt,help]: how 1.
live_w0 <-
1: live_w1
2: up_s2
How? [Number,up,retry,ok,prompt,help]: how 1.
live_w1 <-
1: live_w3
2: up_s1
How? [Number,up,retry,ok,prompt,help]: how 2.
up_s1 is a fact
The buggy clause is
up_s1.
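This trace follows the standard recipe for finding a buggy clause with how questions: repeatedly descend into a body atom that is false in the intended interpretation; when every body atom of the current clause is true in the intended interpretation (or the clause is a fact) while its head is false, that clause is the bug. The sketch below replays the same search on the proof tree read off the trace; the `intended` oracle encodes the assumption that switch s1 is actually down, so up_s1 and everything proved through it is false in the intended interpretation.

```python
# Sketch of declarative debugging on a proof tree.
# A proof-tree node is (atom, children), where children are the proof trees
# for the body atoms of the clause used to prove `atom`; a fact has
# children == [].

def find_buggy_clause(node, intended):
    """Return (head, body_atoms) of a clause whose head is false in the
    intended interpretation although its whole body is true there."""
    atom, children = node
    for child in children:
        if not intended(child[0]):
            # a body atom is also wrongly proved: the bug is deeper down
            return find_buggy_clause(child, intended)
    # every body atom is true in the intended interpretation (or it is a fact),
    # so the clause used at this node is the buggy one
    return atom, [child[0] for child in children]


# Proof tree read off the trace above.
proof = ("lit_l1",
         [("light_l1", []),
          ("live_l1", [("live_w0", [("live_w1", [("live_w3", []),
                                                 ("up_s1", [])]),
                                    ("up_s2", [])])]),
          ("ok_l1", [])])

# Assumption: switch s1 is actually down, so these atoms are false in the
# intended interpretation even though AILog proved them.
false_in_world = {"lit_l1", "live_l1", "live_w0", "live_w1", "up_s1"}

def intended(atom):
    return atom not in false_in_world

print(find_buggy_clause(proof, intended))   # -> ('up_s1', [])
```

The search ends at the fact up_s1, the same buggy clause identified in the trace.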
(a) Give a model of the knowledge base: The minimal model has {a, c, e, f, g} true and {b, d, h} false. Another model is the one in which all atoms are true.
(b) Give an interpretation that is not a model of the knowledge base: The interpretation with a false and all other atoms true is not a model, as the first clause is false in this interpretation. The interpretation with all atoms false is not a model, as the 7th clause is false in this interpretation.
(c) Give two atoms that are logical consequences of the KB: Any two of {a, c, e, f, g}.
(d) Give two atoms that are not logical consequences of the KB: Any two of {b, d, h}. A sketch for checking these answers mechanically is given below.
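For a propositional definite-clause knowledge base, the minimal model is the least fixed point obtained by repeatedly adding the head of any clause whose body is already true, and the logical consequences are exactly the atoms in that minimal model. The sketch below implements this in Python; since the exercise's clauses are not reproduced on this page, the KB listed is only an illustrative stand-in chosen to have the same minimal model {a, c, e, f, g}.

```python
# Sketch: minimal model of a propositional definite-clause KB via the
# immediate-consequence (fixpoint) operator, plus a model check.
# The clause list is a STAND-IN (the exercise's actual clauses are not shown
# here); it is chosen only so that its minimal model is {a, c, e, f, g}.

KB = [
    ("c", []),            # c.
    ("e", []),            # e.
    ("a", ["c", "e"]),    # a <- c & e.
    ("f", ["a"]),         # f <- a.
    ("g", ["f", "c"]),    # g <- f & c.
    ("b", ["d"]),         # b <- d.
    ("d", ["h"]),         # d <- h.
    ("h", ["b"]),         # h <- b.
]
ATOMS = {"a", "b", "c", "d", "e", "f", "g", "h"}

def minimal_model(kb):
    """Least fixed point: keep adding heads whose whole body is derived."""
    true = set()
    changed = True
    while changed:
        changed = False
        for head, body in kb:
            if head not in true and all(atom in true for atom in body):
                true.add(head)
                changed = True
    return true

def is_model(interpretation, kb):
    """A set of true atoms is a model if every clause holds in it."""
    return all(head in interpretation or
               any(atom not in interpretation for atom in body)
               for head, body in kb)

mm = minimal_model(KB)
print(sorted(mm))           # ['a', 'c', 'e', 'f', 'g']
print(is_model(mm, KB))     # True: the minimal model is a model
print(is_model(ATOMS, KB))  # True: the all-true interpretation is a model
print(is_model(set(), KB))  # False: the facts c. and e. are violated
# logical consequences = atoms in the minimal model; b, d, h are not among them
```

Note that the all-true interpretation is a model of any definite-clause knowledge base, which is why it can always be given as a second answer to (a).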