# Boolean Logic¶

Tensor networks make for a great representation of Boolean expressions. Let's have a look at how to build Boolean formulas as tensors and what we can do with them.

[1]:

import tntorch as tn
import torch

p, q, r = tn.symbols(3)


These symbols are plain tntorch tensors (in this case, 3D ones) and have rank 1 each:

[2]:

p

[2]:

3D TT tensor:

2   2   2
|   |   |
(0) (1) (2)
/ \ / \ / \
1   1   1   1

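To see why each symbol has rank 1, note that its truth table over 3 variables is a 2×2×2 array that depends on one index only, so it factors as an outer product of three vectors. A minimal pure-Python sketch of this idea (using nested loops rather than tntorch's internal representation):

```python
from itertools import product

# Truth table of the symbol p over 3 variables: entry (i, j, k) equals i.
# It factors as the outer product [0, 1] x [1, 1] x [1, 1], i.e. rank 1.
core_p, ones = [0, 1], [1, 1]
table_p = {(i, j, k): core_p[i] * ones[j] * ones[k]
           for i, j, k in product(range(2), repeat=3)}

for (i, j, k), v in table_p.items():
    assert v == i  # the entry depends on the first index only
```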

The fun starts with overloaded Boolean operators:

[3]:

tn.is_tautology(p | ~p)  # Formula is always true

[3]:

True

[4]:

tn.is_contradiction(p & ~p)  # Formula is always false

[4]:

True

[5]:

tn.is_satisfiable(p ^ q | r)  # Formula is true for some values

[5]:

True

[6]:

tn.implies(p & q | q & r, q)  # First formula implies the second

[6]:

True

[7]:

tn.equiv(p & q, ~(~p | ~q))  # A double implication (De Morgan's law)

[7]:

True

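Semantically, each of these checks amounts to inspecting the formula's full truth table. A brute-force pure-Python sketch of the same semantics (not the tensor-network implementation, which avoids enumerating all inputs explicitly):

```python
from itertools import product

ASSIGNMENTS = list(product([False, True], repeat=3))

def is_tautology(f):      return all(f(*a) for a in ASSIGNMENTS)
def is_contradiction(f):  return not any(f(*a) for a in ASSIGNMENTS)
def is_satisfiable(f):    return any(f(*a) for a in ASSIGNMENTS)
def implies(f, g):        return all(g(*a) for a in ASSIGNMENTS if f(*a))
def equiv(f, g):          return all(f(*a) == g(*a) for a in ASSIGNMENTS)

assert is_tautology(lambda p, q, r: p or not p)
assert is_contradiction(lambda p, q, r: p and not p)
assert is_satisfiable(lambda p, q, r: (p != q) or r)      # ^ means XOR
assert implies(lambda p, q, r: (p and q) or (q and r), lambda p, q, r: q)
assert equiv(lambda p, q, r: p and q,
             lambda p, q, r: not (not p or not q))        # De Morgan
```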

Check out the quantifiers too:

[8]:

tn.equiv(tn.all(3), p & q & r)  # "for all"

[8]:

True

[9]:

tn.equiv(tn.any(3), p | q | r)  # "exists"

[9]:

True

[10]:

tn.equiv(tn.none(3), ~tn.any(3))  # "does not exist"

[10]:

True

[11]:

tn.equiv(tn.one(3), p&~q&~r | ~p&q&~r | ~p&~q&r)  # "exists and is unique"

[11]:

True

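The quantifiers above have simple truth-table readings: "all inputs true", "some input true", "no input true", and "exactly one input true". A pure-Python sketch of those readings, cross-checking the "exists and is unique" case against the explicit disjunction used in the cell above:

```python
from itertools import product

def all_(a):  return all(a)       # "for all"
def any_(a):  return any(a)       # "exists"
def none_(a): return not any(a)   # "does not exist"
def one_(a):  return sum(a) == 1  # "exists and is unique"

for p, q, r in product([False, True], repeat=3):
    explicit = ((p and not q and not r) or (not p and q and not r)
                or (not p and not q and r))
    assert one_((p, q, r)) == explicit
```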

To find out the relevant symbols of a formula (those whose values can affect its result), do the following:

[12]:

tn.relevant_symbols(p & q | ~p & q)

[12]:

[1]

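Conceptually, a symbol is relevant if flipping its value can change the formula's output for some assignment. A brute-force pure-Python sketch of that criterion (the hypothetical `relevant_symbols` helper below is for illustration, not tntorch's implementation):

```python
from itertools import product

def relevant_symbols(f, n=3):
    """Indices of variables whose flip can change the formula's value."""
    relevant = []
    for i in range(n):
        for a in product([False, True], repeat=n):
            flipped = list(a)
            flipped[i] = not flipped[i]
            if f(*a) != f(*flipped):
                relevant.append(i)
                break
    return relevant

# p & q | ~p & q simplifies to q, so only symbol 1 matters
f = lambda p, q, r: (p and q) or (not p and q)
assert relevant_symbols(f) == [1]
```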

Ranks grow with each binary operation, exponentially in the worst case over a long formula. It is recommended to run round() to try to reduce those ranks, often saving memory and computational costs:

[13]:

formula = ~p | (p & q) | (q & r)
print(formula)
formula.round()
print(formula)

3D TT tensor:

2   2   2
|   |   |
(0) (1) (2)
/ \ / \ / \
1   11  11  1

3D TT tensor:

2   2   2
|   |   |
(0) (1) (2)
/ \ / \ / \
1   2   2   1



How many ways are there to satisfy a given Boolean formula? This is known as the #SAT problem. For us, it’s just the sum of elements of the tensor:

[14]:

tn.sum(formula)

[14]:

tensor(6.0000)

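As a cross-check, the same count can be obtained by brute force, enumerating all 2^3 inputs in pure Python instead of summing the tensor:

```python
from itertools import product

# Same formula as above: ~p | (p & q) | (q & r)
f = lambda p, q, r: (not p) or (p and q) or (q and r)

count = sum(bool(f(*a)) for a in product([False, True], repeat=3))
assert count == 6  # matches tn.sum(formula)
```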

We can look at all the inputs that satisfy this formula with the function accepted_inputs():

[15]:

tn.accepted_inputs(formula)  # One row per accepted input

[15]:

tensor([[0, 0, 0],
        [0, 0, 1],
        [0, 1, 0],
        [0, 1, 1],
        [1, 1, 0],
        [1, 1, 1]])

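Again, a brute-force pure-Python enumeration recovers the same six rows:

```python
from itertools import product

# Same formula as above: ~p | (p & q) | (q & r)
f = lambda p, q, r: (not p) or (p and q) or (q and r)

accepted = [a for a in product([0, 1], repeat=3) if f(*a)]
assert accepted == [(0, 0, 0), (0, 0, 1), (0, 1, 0),
                    (0, 1, 1), (1, 1, 0), (1, 1, 1)]
```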

The function only() is useful to enforce that only certain variables play a role:

[16]:

tn.accepted_inputs(tn.only(p) | tn.only(r))

[16]:

tensor([[0, 0, 1],
        [1, 0, 0]])

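Reading only(p) as "p is true and every other variable is false" reproduces this output. A pure-Python sketch under that reading (the `only` helper below is an illustration, not tntorch's implementation):

```python
from itertools import product

def only(i, n=3):
    """True iff variable i is set and every other variable is clear."""
    return lambda *a: a[i] and not any(a[j] for j in range(n) if j != i)

f = lambda *a: only(0)(*a) or only(2)(*a)   # tn.only(p) | tn.only(r)
accepted = [a for a in product([0, 1], repeat=3) if f(*a)]
assert accepted == [(0, 0, 1), (1, 0, 0)]
```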

## Applications¶

Boolean tensors can be used for ANOVA and variance-based sensitivity analysis, as well as for compact derivative computation.