# Main Tensor Formats¶

Three of the most popular tensor decompositions are supported in *tntorch*:

- CANDECOMP/PARAFAC (CP)
- Tucker
- Tensor train (TT)

All of these formats are represented using \(N\) *tensor cores* (one per tensor dimension, used by CP and TT) and, optionally, up to \(N\) *factor matrices* (needed for Tucker).

In an \(N\)-D tensor of shape \(I_1 \times \dots \times I_N\), the \(n\)-th core can come in one of four flavors:

- \(R^{\mathrm{TT}}_n \times I_n \times R^{\mathrm{TT}}_{n+1}\): a TT core.
- \(R^{\mathrm{TT}}_n \times S_n^{\mathrm{Tucker}} \times R^{\mathrm{TT}}_{n+1}\): a TT-Tucker core, accompanied by an \(I_n \times S_n^{\mathrm{Tucker}}\) factor matrix.
- \(I_n \times R^{\mathrm{CP}}_n\): a CP core. Conceptually, it works as if it were a 3D TT core of shape \(R^{\mathrm{CP}}_n \times I_n \times R^{\mathrm{CP}}_n\) whose slices along the 2nd mode are all diagonal matrices.
- \(S_n^{\mathrm{Tucker}} \times R^{\mathrm{CP}}_n\): a CP-Tucker core, accompanied by an \(I_n \times S_n^{\mathrm{Tucker}}\) factor matrix. Conceptually, it works as a 3D TT-Tucker core.

One tensor network can combine cores of different kinds, so all in all one may have TT, TT-Tucker, Tucker, CP, TT-CP, CP-Tucker, and TT-CP-Tucker tensors. We will show examples of each.

*(see [this notebook](decompositions.ipynb) to decompose full tensors into those main formats)*

*(see [this notebook](other_formats.ipynb) for other structured and custom decompositions)*
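To make the CP flavor above concrete, here is a small sketch (using PyTorch only; the sizes `I = 4` and `R = 3` are arbitrary) that expands a 2D CP core into the equivalent 3D TT core whose slices along the 2nd mode are diagonal matrices:

```python
import torch

I, R = 4, 3  # arbitrary spatial size and CP rank, for illustration only
cp_core = torch.rand(I, R)  # a 2D CP core of shape I x R

# Equivalent 3D TT core of shape R x I x R: the slice at each spatial
# index i is the diagonal matrix diag(cp_core[i, :])
tt_core = torch.zeros(R, I, R)
for i in range(I):
    tt_core[:, i, :] = torch.diag(cp_core[i])
```

Contracting this 3D core in a tensor train reproduces exactly the same tensor as using the 2D CP factor directly, which is why every CP tensor is a particular case of the TT format.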

## TT¶

Tensor train cores are represented in parentheses `( )`:

```python
import tntorch as tn
tn.rand([32]*5, ranks_tt=5)
```

```
5D TT tensor:

 32  32  32  32  32
  |   |   |   |   |
 (0) (1) (2) (3) (4)
 / \ / \ / \ / \ / \
1   5   5   5   5   1
```
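For a rough sense of the compression, we can count the parameters this TT tensor stores (plain arithmetic, not a tntorch call): core \(n\) has shape \(R_n \times 32 \times R_{n+1}\), with the boundary ranks \([1, 5, 5, 5, 5, 1]\) shown above:

```python
# TT parameter count: sum over cores of R_n * I_n * R_{n+1}
ranks = [1, 5, 5, 5, 5, 1]
tt_params = sum(ranks[n] * 32 * ranks[n + 1] for n in range(5))
full_params = 32**5
print(tt_params, full_params)  # 2720 vs. 33554432
```

So this 5D tensor is described by a few thousand numbers instead of tens of millions.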

## TT-Tucker¶

In this format, the TT cores are compressed along their spatial dimension (2nd mode) using accompanying Tucker factor matrices. This variant was considered, e.g., in the original TT paper.

```python
tn.rand([32]*5, ranks_tt=5, ranks_tucker=6)
```

```
5D TT-Tucker tensor:

 32  32  32  32  32
  |   |   |   |   |
  6   6   6   6   6
 (0) (1) (2) (3) (4)
 / \ / \ / \ / \ / \
1   5   5   5   5   1
```

Here is an example where only *some* cores have Tucker factors:

```python
tn.rand([32]*5, ranks_tt=5, ranks_tucker=[None, 6, None, None, 7])
```

```
5D TT-Tucker tensor:

      32          32
 32   |  32  32   |
  |   6   |   |   7
 (0) (1) (2) (3) (4)
 / \ / \ / \ / \ / \
1   5   5   5   5   1
```

Note: if you want to leave some factors fixed during gradient descent, simply set them to PyTorch tensors with `requires_grad=False`.

## Tucker¶

“Pure” Tucker is technically speaking not supported, but is equivalent to a TT-Tucker tensor with full TT-ranks. The minimal necessary ranks are automatically computed and set up for you:

```python
tn.rand([32]*5, ranks_tucker=3)  # Actually a TT-Tucker network, but just as expressive as a pure Tucker decomposition
```

```
5D TT-Tucker tensor:

 32  32  32  32  32
  |   |   |   |   |
  3   3   3   3   3
 (0) (1) (2) (3) (4)
 / \ / \ / \ / \ / \
1   3   9   9   3   1
```

In other words, every \(32 \times 32 \times 32 \times 32 \times 32\) tensor of Tucker rank \(3\) can be represented exactly by a tensor with the structure shown above, and vice versa.
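The minimal TT ranks shown above follow a simple counting rule: at the boundary after the \(n\)-th core, a Tucker-rank-\(S\) tensor needs TT rank \(\min(S^n, S^{N-n})\). A quick check of the \([1, 3, 9, 9, 3, 1]\) boundary ranks above:

```python
# Minimal TT ranks for a 5D tensor of Tucker rank 3, listed at all
# N+1 core boundaries (including the outer dummy ranks of 1)
S, N = 3, 5
tt_ranks = [min(S**n, S**(N - n)) for n in range(N + 1)]
print(tt_ranks)  # [1, 3, 9, 9, 3, 1]
```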

## CP¶

CP factors are shown as cores in angle brackets `< >`:

```python
tn.rand([32]*5, ranks_cp=4)
```

```
5D CP tensor:

 32  32  32  32  32
  |   |   |   |   |
 <0> <1> <2> <3> <4>
 / \ / \ / \ / \ / \
4   4   4   4   4   4
```

Even though these factors act conceptually as 3D cores in a tensor train (every CP tensor is a particular case of the TT format), they are stored as 2D matrices, as in a standard CP decomposition. In this example, every core has shape \(32 \times 4\).

## TT-CP¶

TT and CP cores can be combined by specifying a list of ranks for each format. Provide \(N-1\) TT ranks and \(N\) CP ranks, using `None` wherever a rank does not apply, so that the two lists do not collide anywhere. Also note that consecutive CP ranks must coincide. Here is a tensor with 3 TT cores and 2 CP cores:

```python
tn.rand([32]*5, ranks_tt=[2, 3, None, None], ranks_cp=[None, None, None, 4, 4])
```

```
5D TT-CP tensor:

 32  32  32  32  32
  |   |   |   |   |
 (0) (1) (2) <3> <4>
 / \ / \ / \ / \ / \
1   2   3   4   4   4
```

Here is another example with 2 TT cores and 3 CP cores:

```python
tn.rand([32]*5, ranks_tt=[None, 2, None, None], ranks_cp=[4, None, None, 5, 5])
```

```
5D TT-CP tensor:

 32  32  32  32  32
  |   |   |   |   |
 <0> (1) (2) <3> <4>
 / \ / \ / \ / \ / \
4   4   2   5   5   5
```

## CP-Tucker¶

Similarly to TT-Tucker, this model restricts the columns of the CP factors to lie in a low-dimensional subspace. It is also known as canonical decomposition with linear constraints (*CANDELINC*). Compressing a Tucker core via CP leads to an equivalent format.

```python
tn.rand([32]*5, ranks_cp=2, ranks_tucker=4)
```

```
5D CP-Tucker tensor:

 32  32  32  32  32
  |   |   |   |   |
  4   4   4   4   4
 <0> <1> <2> <3> <4>
 / \ / \ / \ / \ / \
2   2   2   2   2   2
```

## TT-CP-Tucker¶

Finally, we can combine all sorts of cores to get a hybrid of all 3 models:

```python
tn.rand([32]*5, ranks_tt=[2, 3, None, None], ranks_cp=[None, None, None, 10, 10], ranks_tucker=[5, None, 5, 5, None])
```

```
5D TT-CP-Tucker tensor:

 32      32  32
  |  32   |   |  32
  5   |   5   5   |
 (0) (1) (2) <3> <4>
 / \ / \ / \ / \ / \
1   2   3  10  10  10
```