How to bias Z3's (Python) SAT solving towards a criterion, such as 'preferring' to have more negated literals

There are two main ways to handle this sort of problem in z3: either using the optimizer, or computing the answer manually via multiple calls to the regular solver.

Using the optimizer

As Axel noted, one way to handle this problem is to use the optimizing solver. Your example would be coded as follows:

from z3 import *

A, B = Bools('A B')

o = Optimize()
o.add(Or(A, B))         # hard constraint: at least one of A, B must be true
o.add_soft(Not(A))      # soft constraint: prefer A to be false
o.add_soft(Not(B))      # soft constraint: prefer B to be false

print(o.check())
print(o.model())

This prints:

sat
[A = False, B = True]

You can attach weights to soft constraints, which gives you a way to associate a penalty if the constraint is violated. For instance, if you wanted to make A true if at all possible, but don't care much about B, then you'd associate a bigger penalty with Not(B):

from z3 import *

A, B = Bools('A B')

o = Optimize()
o.add(Or(A, B))                  # hard constraint
o.add_soft(Not(A))               # penalty 1 if violated (default weight)
o.add_soft(Not(B), weight=10)    # penalty 10 if violated

print(o.check())
print(o.model())

This prints:

sat
[A = True, B = False]

The way to think about this is as follows: You're asking z3 to:

  • Satisfy all regular constraints (i.e., those you put in with add).
  • Satisfy as many of the soft constraints as possible (i.e., those you put in with add_soft). If no solution satisfies them all, the solver is allowed to "violate" some of them, trying to minimize the total cost of the violated constraints, computed by summing up their weights.
  • When no weight is given, it defaults to 1. You can also group these constraints by identifier (a small sketch of this appears below), though I doubt you need that generality.

So, in the second example, z3 violated Not(A), because doing so has a cost of 1, while violating Not(B) would've incurred a cost of 10.
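As for the grouping just mentioned: the optional identifier argument of add_soft puts soft constraints into named groups, and z3 treats each group as its own objective. Here's a minimal sketch using the weight and id keyword arguments of add_soft; the group name "negations" is just an illustrative label:

from z3 import *

A, B, C = Bools('A B C')

o = Optimize()
o.add(Or(A, B, C))                            # hard constraint

# The id argument groups soft constraints; "negations" is an
# arbitrary label picked for this illustration.
o.add_soft(Not(A), weight=1,  id="negations")
o.add_soft(Not(B), weight=10, id="negations")
o.add_soft(Not(C))                            # no id: goes into the default group

print(o.check())
print(o.model())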

Note that when you use the optimizer, z3 uses a different engine than the one it uses for regular SMT solving: In particular, this engine is not incremental. That is, if you call check twice on it (after introducing some new constraints), it'll solve the whole problem from scratch, instead of learning from the results of the first. Also, the optimizing solver is not as optimized as the regular solver (pun intended!), so it usually performs worse on straight satisfiability as well. See https://www.microsoft.com/en-us/research/wp-content/uploads/2016/02/nbjorner-nuz.pdf for details.

Manual approach

If you don't want to use the optimizer, you can also do this "manually" using the idea of tracker variables. The idea is to identify the soft constraints (i.e., those that can be violated at some cost) and associate each of them with a tracker variable.

Here's the basic algorithm:

  • Make a list of your soft constraints. In the above example, they'll be Not(A) and Not(B). (That is, you'd like these to hold so you get negated literals, but only if that is possible.) Call these S_i. Let's say you have N of them.

  • For each such constraint, create a new tracker variable, which will be a boolean. Call these t_i.

  • Assert N regular constraints, one of the form Implies(t_i, S_i) for each soft constraint.

  • Use a pseudo-boolean constraint of the form AtMost(..., K) to force that at most K of these tracker variables t_i are false. Then use a binary-search-like scheme to find the optimal value of K. Note that since you're using the regular solver, you can use it in incremental mode, i.e., with calls to push and pop.

For a detailed description of this technique, see https://stackoverflow.com/a/15075840/936310.
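Here's a minimal sketch of this recipe applied to the A/B example above. The tracker names and the simple linear search over K are illustrative choices; a binary search works the same way with push and pop:

from z3 import *

A, B = Bools('A B')

# Hard constraints go into a regular (incremental) solver.
s = Solver()
s.add(Or(A, B))

# Soft constraints (the ones we'd like to hold) and one tracker each.
soft = [Not(A), Not(B)]
trackers = [Bool('t_%d' % i) for i in range(len(soft))]

# If a tracker is true, its soft constraint must hold.
for t, c in zip(trackers, soft):
    s.add(Implies(t, c))

# Find the smallest K such that at most K trackers are false,
# i.e., at most K soft constraints are violated.
for K in range(len(soft) + 1):
    s.push()
    s.add(AtMost(*[Not(t) for t in trackers], K))
    if s.check() == sat:
        print("Optimal K =", K)
        print(s.model())
        s.pop()
        break
    s.pop()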

Summary

Which of the above two methods works better is problem dependent. Using the optimizer is easiest from an end-user point of view, but you're pulling in heavy machinery that you may not need, and you may pay a performance penalty for it. The second method might be faster, at the risk of more complicated (and thus error-prone!) programming. As usual, do some benchmarking to see which works best for your particular problem.


Z3py features an optimizing solver, Optimize, which has a method add_soft with the following description:

Add soft constraint with optional weight and optional identifier. If no weight is supplied, then the penalty for violating the soft constraint is 1. Soft constraints are grouped by identifiers. Soft constraints that are added without identifiers are grouped by default.

A small example can be found here:

The Optimize context provides three main extensions to satisfiability checking:

  from z3 import *

  o = Optimize()

  x, y = Ints('x y')
  o.maximize(x + 2*y)           # maximizes LIA objective

  u, v = BitVecs('u v', 32)
  o.minimize(u + v)             # minimizes BV objective

  o.add_soft(x > 4, 4)          # soft constraint with optional weight