# Arithmetics

## Basic Arithmetics

The most basic tensor operations (addition +, subtraction -, and product * with either a scalar or another tensor) can be accomplished by direct manipulation of the tensor cores (see e.g. the original tensor train paper).
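For instance, the sum of two TT tensors can be formed by stacking their cores block-diagonally, which is why TT ranks add under addition (and why rounding afterwards is useful). Below is a minimal numpy sketch of this idea; `tt_full` and `tt_add` are illustrative helpers, not tntorch's actual implementation:

```python
import numpy as np

def tt_full(cores):
    # Contract a list of TT cores of shape (r_{k-1}, n_k, r_k) into a full array.
    t = cores[0]
    for c in cores[1:]:
        t = np.tensordot(t, c, axes=([-1], [0]))
    return t.squeeze(axis=(0, -1))

def tt_add(a, b):
    # The sum of two TT tensors stacks their cores block-diagonally,
    # so the TT ranks of the result are the sums of the input ranks.
    out, last = [], len(a) - 1
    for k, (ca, cb) in enumerate(zip(a, b)):
        r1a, n, r2a = ca.shape
        r1b, _, r2b = cb.shape
        if k == 0:                      # first core: concatenate along columns
            c = np.concatenate([ca, cb], axis=2)
        elif k == last:                 # last core: concatenate along rows
            c = np.concatenate([ca, cb], axis=0)
        else:                           # middle cores: block-diagonal stacking
            c = np.zeros((r1a + r1b, n, r2a + r2b))
            c[:r1a, :, :r2a] = ca
            c[r1a:, :, r2a:] = cb
        out.append(c)
    return out

# Two random rank-2 TT tensors of shape 4 x 4 x 4
rng = np.random.default_rng(0)
a = [rng.standard_normal((1, 4, 2)),
     rng.standard_normal((2, 4, 2)),
     rng.standard_normal((2, 4, 1))]
b = [rng.standard_normal((1, 4, 2)),
     rng.standard_normal((2, 4, 2)),
     rng.standard_normal((2, 4, 1))]
```

Note how the middle core of `tt_add(a, b)` has shape (4, 4, 4): the ranks doubled, even though the represented tensor may be compressible again.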

[1]:

import tntorch as tn
import torch
import numpy as np

t1 = tn.ones([32]*4)
t2 = tn.ones([32]*4)

t = tn.round((t1+t2)*(t2-2))
print(t)

4D TT tensor:

32  32  32  32
|   |   |   |
(0) (1) (2) (3)
/ \ / \ / \ / \
1   1   1   1   1



You can also assign values to parts of a tensor:

[2]:

t = tn.ones(5, 5)
t[:3, :] = 2
t[:, :2] *= 3
print(t.torch())

tensor([[6., 6., 2., 2., 2.],
        [6., 6., 2., 2., 2.],
        [6., 6., 2., 2., 2.],
        [3., 3., 1., 1., 1.],
        [3., 3., 1., 1., 1.]])


Thanks to cross-approximation, tntorch supports many more advanced element-wise operations on tensors, including division /, exp(), log(), sin(), etc.

[3]:

domain = [torch.linspace(0, np.pi, 32)]*4
x, y, z, w = tn.meshgrid(domain)

t = tn.round(1 / (1+x+y+z+w))
print(t)

4D TT-Tucker tensor:

32  32  32  32
|   |   |   |
7  13  13   7
(0) (1) (2) (3)
/ \ / \ / \ / \
1   7   7   7   1
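Cross-approximation builds such compressed representations by sampling only a few rows and columns (fibers) of the tensor rather than evaluating all of it. The 2D case can be sketched with adaptive cross approximation (ACA); this is an illustrative variant in plain numpy, not tntorch's implementation, and `aca` is a hypothetical helper:

```python
import numpy as np

def aca(A, tol=1e-8, max_rank=32):
    # Adaptive cross approximation: at each step, sample one row and one
    # column of the current residual (a "cross") at its largest entry.
    R = A.copy()                        # residual
    cols, rows = [], []
    scale = np.abs(A).max()
    for _ in range(max_rank):
        i, j = np.unravel_index(np.abs(R).argmax(), R.shape)
        pivot = R[i, j]
        if abs(pivot) < tol * scale:
            break
        cols.append(R[:, j] / pivot)    # scaled column of the residual
        rows.append(R[i, :].copy())     # row of the residual
        R = R - np.outer(cols[-1], rows[-1])
    return np.array(cols).T, np.array(rows)

# A smooth bivariate function sampled on a grid is numerically low-rank
x = np.linspace(0, np.pi, 32)
A = 1.0 / (1.0 + x[:, None] + x[None, :])
C, Rw = aca(A)
err = np.linalg.norm(A - C @ Rw) / np.linalg.norm(A)
```

Only the sampled crosses are ever evaluated, which is what makes functions like 1/(1+x+y+z+w) tractable on exponentially large grids.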



We will now verify the trigonometric identity $\sin^2(x) + \cos^2(x) = 1$:

[4]:

t = tn.round(tn.sin(t)**2 + tn.cos(t)**2)
print(t)

4D TT tensor:

32  32  32  32
|   |   |   |
(0) (1) (2) (3)
/ \ / \ / \ / \
1   13  17  13  1
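The tn.round() calls above compress a TT tensor by discarding negligible singular values. A minimal numpy sketch of the classical TT rounding procedure (a left-to-right QR orthogonalization sweep followed by right-to-left truncated SVDs); this is illustrative only, not tntorch's implementation:

```python
import numpy as np

def tt_full(cores):
    # Contract TT cores of shape (r_{k-1}, n_k, r_k) into a full array.
    t = cores[0]
    for c in cores[1:]:
        t = np.tensordot(t, c, axes=([-1], [0]))
    return t.squeeze(axis=(0, -1))

def tt_round(cores, eps=1e-10):
    cores = [c.copy() for c in cores]
    # Sweep 1: left-to-right QR orthogonalization
    for k in range(len(cores) - 1):
        r1, n, r2 = cores[k].shape
        q, r = np.linalg.qr(cores[k].reshape(r1 * n, r2))
        cores[k] = q.reshape(r1, n, -1)
        cores[k + 1] = np.tensordot(r, cores[k + 1], axes=([1], [0]))
    # Sweep 2: right-to-left SVD truncation of negligible singular values
    for k in range(len(cores) - 1, 0, -1):
        r1, n, r2 = cores[k].shape
        u, s, vt = np.linalg.svd(cores[k].reshape(r1, n * r2),
                                 full_matrices=False)
        rank = max(1, int((s > eps * s[0]).sum()))
        cores[k] = vt[:rank].reshape(rank, n, r2)
        cores[k - 1] = np.tensordot(cores[k - 1], u[:, :rank] * s[:rank],
                                    axes=([2], [0]))
    return cores

# A rank-2 TT tensor, stored wastefully with ranks padded up to 4 by zeros
rng = np.random.default_rng(0)
a = [rng.standard_normal((1, 4, 2)),
     rng.standard_normal((2, 4, 2)),
     rng.standard_normal((2, 4, 1))]
padded = [np.pad(a[0], ((0, 0), (0, 0), (0, 2))),
          np.pad(a[1], ((0, 2), (0, 0), (0, 2))),
          np.pad(a[2], ((0, 2), (0, 0), (0, 0)))]
rounded = tt_round(padded)
```

Rounding recovers the true ranks while leaving the represented tensor (numerically) unchanged, which is why it is applied after rank-inflating operations like the sums and products above.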



The tensor t should equal $1$ everywhere. Indeed:

[5]:

print(tn.mean(t))
print(tn.var(t))

tensor(1.0000)
tensor(1.8159e-15)
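Statistics like the mean never require forming the full tensor: each core can be summed over its physical index and the resulting small matrices chained, which costs only O(d) core-sized operations. A minimal numpy sketch with illustrative helpers (not tntorch's implementation):

```python
import numpy as np

def tt_mean(cores):
    # Sum each core over its physical (middle) index to get an
    # (r_{k-1}, r_k) matrix, chain the matrices, and divide by the
    # number of entries. The full tensor is never materialized.
    m = np.eye(1)
    size = 1
    for c in cores:
        m = m @ c.sum(axis=1)
        size *= c.shape[1]
    return m.item() / size

def tt_full(cores):
    # Dense reference, for checking only
    t = cores[0]
    for c in cores[1:]:
        t = np.tensordot(t, c, axes=([-1], [0]))
    return t.squeeze(axis=(0, -1))

rng = np.random.default_rng(0)
cores = [rng.standard_normal((1, 5, 3)),
         rng.standard_normal((3, 5, 3)),
         rng.standard_normal((3, 5, 1))]
```

The variance can be obtained along the same lines from the mean of the tensor and the mean of its element-wise square.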