
General

There are two essential ways in which a user can influence the saturation strategy: orderings and selection. The orderings available in Saturate have been described above. For the precedence-based orderings it is difficult to give generally useful heuristics for setting up the precedence. In any case the ordering should be such that smaller terms and atoms are closer to the canonical form one wants to approximate. This implies that derived symbols should usually get a higher precedence than the symbols from which they are derived. Constants are usually minimal. Predicates should often be maximal, in particular top predicates whose definitions are to be expanded out, as in certain formulations of set theory. Sometimes it is helpful to give Skolem functions a high precedence. Skolem functions introduced by the system are given a precedence higher than that of any user function. Generated Skolem functions are ordered according to their generation order: symbols generated later (corresponding to inner quantifiers) are smaller than symbols generated earlier. Hence, by choosing the order of the variables in a quantifier prefix, a user can influence the ordering of the corresponding Skolem functions; an example where this is crucial is a certain lemma in the verification of quicksort. Equations in input formulas should be written in the orientation in which one wants them to be ordered. These heuristics are applied when Saturate computes or completes precedences automatically.
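
To make the precedence heuristics concrete, the following Python sketch (purely illustrative, not part of Saturate) implements a lexicographic path ordering over a hypothetical precedence in which the derived symbol length is placed above the constructors cons and s. Under this precedence the defining equation length(cons(X,L)) = s(length(L)) orients from left to right, as recommended above.

    # Minimal LPO sketch, for illustration only.  Terms are ('f', [args, ...])
    # for applications and plain strings for variables; PREC is a hypothetical
    # precedence with the derived symbol above its constructors.
    PREC = {'length': 3, 'cons': 2, 's': 2, 'nil': 1, '0': 0}

    def is_var(t):
        return isinstance(t, str)

    def occurs(x, t):
        return t == x if is_var(t) else any(occurs(x, a) for a in t[1])

    def lex_gt(ss, ts):
        for si, ti in zip(ss, ts):
            if si != ti:
                return gt(si, ti)
        return False

    def gt(s, t):
        """s >_lpo t under the precedence PREC."""
        if is_var(s):
            return False
        if is_var(t):
            return occurs(t, s)                     # variable case: t occurs properly in s
        (f, ss), (g, ts) = s, t
        if any(si == t or gt(si, t) for si in ss):  # some argument already covers t
            return True
        if PREC[f] > PREC[g]:                       # larger head symbol
            return all(gt(s, tj) for tj in ts)
        if f == g and all(gt(s, tj) for tj in ts):  # equal heads: compare arguments lexicographically
            return lex_gt(ss, ts)
        return False

    # length(cons(X, L)) >_lpo s(length(L)), so the equation orients left to right:
    print(gt(('length', [('cons', ['X', 'L'])]), ('s', [('length', ['L'])])))   # True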

While the ordering essentially guides rewriting and search, the selection strategies specify how negative literals are treated. The available strategies are explained with the sass command. Again, it is hard to say which strategy to apply in general. With the ordering-guided strategies sass([]) and sass([1]), simplification by recursive (contextual) rewriting seems to have a better effect. At present, only clauses in which no negative literal is selected can become potential simplifiers; for clauses with a selected negative literal we only check whether they subsume (or subsume-reduce) some other clause.
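
The restriction stated in the last sentence can be pictured as follows; this is an illustrative Python sketch with hypothetical names, not Saturate code.

    # Sketch of the role of literal selection: a clause keeps its (possibly empty)
    # list of selected negative literals, and only clauses with nothing selected
    # are eligible as simplifiers; the others are used for subsumption only.
    class Clause:
        def __init__(self, negative, positive):
            self.negative = list(negative)   # negative literals
            self.positive = list(positive)   # positive literals
            self.selected = []               # selected negative literals

    def select_first_negative(clause):
        # one simple selection function: select a single negative literal, if any
        clause.selected = clause.negative[:1]

    def may_simplify(clause):
        # only clauses with no selected negative literal may rewrite other clauses
        return not clause.selected

    c = Clause(negative=['p(x)'], positive=['q(x)'])
    select_first_negative(c)
    print(may_simplify(c))   # False: c may still subsume, but not rewrite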

Apart from these two principal ways of influencing search and simplification, one may choose how ordering constraints are handled (saci command). One needs to find the right balance between precision and the fraction of runtime spent on constraint solving. Constraint solving for the lpo is NP-complete, and the present implementation clearly shows that. In the future we would like to improve the implementation by providing better heuristics for constraint simplification. The best choice is usually saci([1]). Some examples run much better with constraint inheritance, saci([2]), e.g., a formulation of the theorem that the sum of two continuous functions is continuous. Constraint inheritance may be crucial for the finite saturation of a consistent theory.
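
The general idea of constraint inheritance can be sketched as follows; this is a generic illustration with hypothetical names, not Saturate's implementation.

    # Generic sketch of constraint handling.  A constrained clause carries a
    # conjunction of ordering constraints (opaque strings such as 's > t' here);
    # 'satisfiable' stands for the constraint solver, which for the lpo solves
    # an NP-complete problem.
    class ConstrainedClause:
        def __init__(self, literals, constraints=()):
            self.literals = list(literals)
            self.constraints = list(constraints)

    def make_conclusion(parents, literals, new_constraint, inherit, satisfiable):
        # Without inheritance: only the constraint of this inference is checked,
        # then dropped.  With inheritance: the parents' constraints are conjoined
        # and kept on the conclusion, pruning more inferences at the price of
        # larger constraint-solving problems later.
        inherited = [c for p in parents for c in p.constraints] if inherit else []
        constraints = inherited + [new_constraint]
        if not satisfiable(constraints):
            return None                      # inference blocked
        return ConstrainedClause(literals, constraints if inherit else [])

    # e.g. with a trivial solver that accepts everything:
    c1 = ConstrainedClause(['p(x)'], ['x > a'])
    c2 = ConstrainedClause(['q(x)'])
    print(make_conclusion([c1, c2], ['r(x)'], 'f(x) > x', True, lambda cs: True).constraints)
    # ['x > a', 'f(x) > x']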

Finally, there are a number of additional flags and options that one may set. Among other things, these affect the selection of inferences according to certain complexity measures on terms and clauses. Setting them one way or another is hacking, and nothing general can be said about it.
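
As an example of such a complexity measure (again only an illustration, not the measure Saturate actually uses), a clause can be weighted by its symbol count, and lighter clauses preferred when choosing the next inference.

    # Symbol-count weight of a term and of a clause (a list of literal terms),
    # using the same term representation as in the sketches above.
    def term_weight(t):
        if isinstance(t, str):               # variable or constant
            return 1
        f, args = t
        return 1 + sum(term_weight(a) for a in args)

    def clause_weight(literals):
        return sum(term_weight(l) for l in literals)

    print(clause_weight([('length', [('cons', ['X', 'L'])])]))   # 4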

