Automata
Tensor trains can compactly represent deterministic finite automata and weighted finite automata that read a fixed number of symbols.
[1]:
import torch
import tntorch as tn
For instance, weight_mask produces an automaton that accepts a string iff it contains a certain number of 1’s:
[2]:
m = tn.weight_mask(N=4, weight=2)
m
[2]:
4D TT tensor:

 2   2   2   2
 |   |   |   |
(0) (1) (2) (3)
/ \ / \ / \ / \
1   2   3   2   1
All accepted input strings can be retrieved alphabetically via accepted_inputs():
[3]:
tn.accepted_inputs(m)
[3]:
tensor([[0, 0, 1, 1],
[0, 1, 0, 1],
[0, 1, 1, 0],
[1, 0, 0, 1],
[1, 0, 1, 0],
[1, 1, 0, 0]])
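As a sanity check (using only the standard library, not tntorch), the same set can be re-derived by brute force: the accepted strings are exactly the C(4, 2) = 6 length-4 binary strings of Hamming weight 2, and itertools.product enumerates them in the same lexicographical order as the tensor above.

```python
from itertools import product

# Enumerate all length-4 binary strings and keep those with exactly two 1's
# (a brute-force re-derivation of the weight_mask(N=4, weight=2) acceptance set).
accepted = [x for x in product((0, 1), repeat=4) if sum(x) == 2]

print(len(accepted))  # 6, i.e. C(4, 2)
for x in accepted:
    print(x)          # first is (0, 0, 1, 1), last is (1, 1, 0, 0)
```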
On the other hand, weight() produces a slightly different automaton. Instead of accepting or rejecting strings, it counts how many 1’s the input string contains:
[4]:
m = tn.weight(N=4)
print(m[0, 0, 0, 0])
print(m[0, 1, 0, 0])
print(m[1, 0, 1, 1])
tensor(0.)
tensor(1.)
tensor(3.)
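This counting behaviour can be reproduced with a rank-2 transfer-matrix construction: each symbol s contributes the matrix [[1, s], [0, 1]], and the product of these matrices accumulates the number of 1’s in the corner entry. Below is a pure-Python sketch of that idea (weight_value is a hypothetical helper written for illustration, not tntorch’s internal cores):

```python
def weight_value(bits):
    """Evaluate a rank-2 weighted automaton that counts 1's.

    Each symbol s contributes the 2x2 transfer matrix [[1, s], [0, 1]];
    multiplying them out leaves sum(bits) in the top-right corner.
    (A sketch of the counting construction, not tntorch's actual cores.)
    """
    v = [1.0, 0.0]               # left boundary vector
    for s in bits:
        # multiply v on the right by [[1, s], [0, 1]]
        v = [v[0], v[0] * s + v[1]]
    return v[1]                  # right boundary picks out the count


print(weight_value([0, 0, 0, 0]))  # 0.0
print(weight_value([0, 1, 0, 0]))  # 1.0
print(weight_value([1, 0, 1, 1]))  # 3.0
```

Because every transfer matrix is 2x2, the resulting tensor train has rank 2 regardless of N, which is what makes the representation compact.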
Applications
TT automata come in handy for grouping and summing tensor entries, which is important for computing advanced sensitivity analysis metrics. See also the tutorial on Boolean logic with *tntorch*.
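The grouping idea can be sketched densely in plain Python: summing the entries of a tensor whose indices have a given weight amounts to a dot product with the corresponding mask. The names t and mask below are hypothetical, and tntorch would perform the same contraction directly in TT format rather than over all dense entries:

```python
from itertools import product
import random

random.seed(0)

# A random 2x2x2x2 tensor stored as a dict over index tuples
# (a dense stand-in for a TT tensor).
t = {idx: random.random() for idx in product((0, 1), repeat=4)}

# The weight-2 automaton viewed as a 0/1 mask over the same indices.
mask = {idx: 1.0 if sum(idx) == 2 else 0.0 for idx in product((0, 1), repeat=4)}

# Grouping-and-summing the weight-2 entries is a dot product with the mask.
grouped_sum = sum(t[idx] * mask[idx] for idx in t)
print(grouped_sum)
```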