
WTF is a Tensor?

Trust me, we've all been there. If you do Math or Computer Science, chances are you've heard the word "tensor" thrown around a lot. But WTF is that even?

[Figure: the Cauchy stress tensor, a 2nd-order tensor with components σ11 … σ33 mapping the basis vectors e1, e2, e3 to the stress vectors T(e1), T(e2), T(e3)]

Image Source: https://en.wikipedia.org/wiki/Tensor

Well, WTF is a Tensor? ​

A tensor is an object that transforms and behaves like a tensor.

Great. That was helpful. Let's try a more formal definition:

A tensor is an algebraic object that describes a multilinear relationship between sets of algebraic objects associated with a vector space.

Cool. That was even worse. Let's try something else.

What a Tensor is NOT ​

There are some common misconceptions about tensors that we should clear up first.

  • "A tensor is just a matrix"
    • Nope. But a Matrix is a type of tensor! (Just one of many types)
  • "A tensor is just a higher-dimensional array"
    • Nope. But a higher-dimensional array can represent a tensor!
  • "A tensor is just a data structure"
    • Nope. But a data structure can be used to store a tensor!

Cool. So what is it then? ​

Well, a Tensor is something you probably use every day, even if you don't know it. If you go shopping and you see price tags, those are tensors. If you're a programmer and ever used number arrays or lists, you're using tensors. If you do any kind of math, computer science, or physics, you're probably using tensors.

NOW WHAT IS IT?!?!?!?!11

A Tensor is something that basically holds a number or a set of numbers and tells you how to interpret them.
In other words: It's a container for numbers + rules on how those numbers transform.

Types of Tensors / Dimensions ​

Tensors can have different numbers of dimensions, often referred to as "orders" or "ranks". You will definitely know some of these:

Note

Order and Rank are sometimes used interchangeably, but they can have different meanings in specific contexts.

  • Order: The number of indices required to uniquely identify each element of the tensor.
  • Rank: The minimum number of simple tensors that generate the tensor as their sum.

But here I'll use both to mean the same thing: the number of dimensions/indices.
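To make "number of indices" concrete, here's a tiny sketch (assuming Python with NumPy, which this post doesn't otherwise use) where the order is just how many indices you need to pick out one element:

import numpy as np

scalar = np.array(5)                  # no index needed to pick an element   -> order 0
vector = np.array([3, 4])             # one index, e.g. vector[1]            -> order 1
matrix = np.array([[1, 2], [3, 4]])   # two indices, e.g. matrix[0, 1]       -> order 2

print(scalar.ndim, vector.ndim, matrix.ndim)   # 0 1 2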


0th Order Tensor ​

Type: Scalar
Dimension: 0D - just a point: ⋅
Description: A single number. No direction, no magnitude. Just a value.
Examples: 0, 1, -5, 3.14
Used For: Well, you name it. It's just a number. So: counting, measuring, labeling, etc.

1st Order Tensor ​

Type: Vector
Dimension: 1D - a line of numbers
Description: A one-dimensional array/list of numbers. Has both magnitude and direction.
Examples:

[3, 4] (2D Vector)

[1, 0, 0] (3D Vector)

[-1, 2, 3, 4] (4D Vector)

$\begin{bmatrix}1\\2\\3\end{bmatrix}$ (Column Vector)

Used For: Representing quantities that have both magnitude and direction, like velocity, force, or position in space.

2nd Order Tensor ​

Type: Matrix
Dimension: 2D - a grid of numbers (x and y axes)
Description: A two-dimensional array of numbers. Can represent linear transformations, rotations, and more.
Examples:

[[1, 2], [3, 4]] (2×2 array)

In programming: An Array of Arrays:

[
    [1, 2],
    [3, 4]
]

$\begin{bmatrix}1 & 2\\3 & 4\end{bmatrix}$ (2×2 Matrix)

$\begin{bmatrix}1 & 0 & 0\\0 & 1 & 0\\0 & 0 & 1\end{bmatrix}$ (3×3 Identity Matrix)

Used For: Linear transformations, rotations, scaling, and more in physics, graphics, and ML.
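To illustrate the "matrices represent rotations" part, here's a small sketch (again assuming NumPy) that rotates a vector by 90°:

import numpy as np

# A 2x2 rotation matrix: rotate by 90 degrees counter-clockwise
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

v = np.array([1.0, 0.0])     # a vector pointing along the x-axis
print(R @ v)                 # ~[0, 1] -- the same vector, now pointing along the y-axis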

3rd Order Tensor ​

Type: 3D Array
Dimension: 3D - a "cube" of numbers (x, y, and z axes)
Description: A three-dimensional array of numbers. Can represent more complex relationships and transformations.
Examples:

A Matrix in a Matrix:

$\left[\begin{bmatrix}1 & 2\\3 & 4\end{bmatrix},\ \begin{bmatrix}5 & 6\\7 & 8\end{bmatrix}\right]$

In Programming: An Array of Arrays of Arrays:

[
    [
        [ 1, 1 ],
        [ 2, 3 ],
        [ 4, 5 ]
    ], [
        [ 6, 7 ],
        [ 8, 9 ],
        [ 10, 11 ]
    ]
]

A cube of numbers, like a stack of matrices.

Used For: Representing more complex relationships in fields like physics (stress tensors), computer vision (color images), and machine learning (convolutional layers). E.g. a color image can be represented as a 3D tensor: height × width × color channels (256×256×3 for RGB).
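The color-image example translates directly into code; a minimal sketch (assuming NumPy, with the 256×256 size from the text):

import numpy as np

# A 256x256 RGB image as a 3rd-order tensor:
# axis 0 = height, axis 1 = width, axis 2 = color channel (R, G, B)
image = np.zeros((256, 256, 3), dtype=np.uint8)
image[:, :, 0] = 255     # fill the red channel everywhere

print(image.shape)       # (256, 256, 3)
print(image[0, 0])       # [255   0   0] -- the top-left pixel, fully red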

Higher Order Tensors (4th Order and above) ​

Type: n-Dimensional Array
Dimension: Any
Description: Tensors of order 4 and above. These can represent even more complex relationships and transformations.
Examples: A 4D tensor could be represented as a stack of 3D tensors (like a video, which is a stack of images).
Used For: Advanced applications in physics, machine learning (like in deep learning models), and data analysis.
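And the video example as a 4th-order tensor, sketched the same way (the 30 frames and 64×64 resolution are just made-up numbers for illustration):

import numpy as np

# A short video as a 4th-order tensor: (frame, height, width, color channel)
video = np.zeros((30, 64, 64, 3), dtype=np.uint8)

print(video.ndim)        # 4
print(video[0].shape)    # (64, 64, 3) -- a single frame is itself a 3rd-order tensor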

Visually ​

[Interactive figures: 0D, 1D, 2D, 3D, 4D, 5D, and 6D tensors, each rendered visually]

Properties & Operations on Tensors ​

Tensors aren't just static blobs of numbers. You can do stuff with them. Here are some of the most common operations:

Addition & Subtraction ​

Tensors of the same shape can be added or subtracted elementwise.

$\begin{bmatrix}1 & 2\\3 & 4\end{bmatrix} + \begin{bmatrix}5 & 6\\7 & 8\end{bmatrix} = \begin{bmatrix}6 & 8\\10 & 12\end{bmatrix}$

Note

Works the same for vectors and higher-order tensors too, as long as the shapes match.
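In code this is just element-wise addition and subtraction; a minimal NumPy sketch:

import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])

print(A + B)   # [[ 6  8] [10 12]]
print(A - B)   # [[-4 -4] [-4 -4]]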


Scalar Multiplication ​

Multiply every element of the tensor by a number (a scalar):

$2 \times \begin{bmatrix}1 & 2\\3 & 4\end{bmatrix} = \begin{bmatrix}2 & 4\\6 & 8\end{bmatrix}$

Note

This scales the tensor uniformly.
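The same thing in NumPy, for reference:

import numpy as np

A = np.array([[1, 2], [3, 4]])
print(2 * A)   # [[2 4] [6 8]]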


Dot Product / Inner Product ​

For vectors, the dot product is:

$\mathbf{a} \cdot \mathbf{b} = \sum_i a_i b_i$

For tensors, "dot product" generalizes to tensor contraction, i.e.: summing over matching indices. Example: matrix-vector multiplication is just a contraction:

$\begin{bmatrix}1 & 2\\3 & 4\end{bmatrix}\begin{bmatrix}x\\y\end{bmatrix} = \begin{bmatrix}1x + 2y\\3x + 4y\end{bmatrix}$

Note

This reduces the order of the tensor by 2 (one for each contracted index).
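Here's a small NumPy sketch of both views: the plain dot product, and the matrix-vector contraction written out explicitly as a sum over the shared index:

import numpy as np

a = np.array([1, 2, 3])
b = np.array([4, 5, 6])
print(np.dot(a, b))                  # 32  (1*4 + 2*5 + 3*6)

M = np.array([[1, 2], [3, 4]])
v = np.array([10, 20])
print(M @ v)                         # [ 50 110] -- matrix-vector contraction
print(np.einsum('ij,j->i', M, v))    # same thing, as an explicit sum over the shared index j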


Tensor Product (Outer Product) ​

Given two tensors, you can produce a higher-order tensor by multiplying without summing. For example, vector outer product:

$\begin{bmatrix}1\\2\end{bmatrix} \otimes \begin{bmatrix}3\\4\end{bmatrix} = \begin{bmatrix}1 \cdot 3 & 1 \cdot 4\\2 \cdot 3 & 2 \cdot 4\end{bmatrix}$

Note

This is how you go from lower to higher dimensions.
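Sketched in NumPy (np.outer for vectors; np.tensordot with axes=0 is the general "multiply without summing" version):

import numpy as np

a = np.array([1, 2])
b = np.array([3, 4])

print(np.outer(a, b))
# [[3 4]
#  [6 8]]

# Same thing for arbitrary orders: tensordot with axes=0 multiplies without summing
print(np.tensordot(a, b, axes=0).shape)   # (2, 2) -- order 1 + order 1 = order 2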


Transpose & Reshape ​

You can rearrange tensor dimensions without changing the data.

  • Transpose: swap axes (e.g. rows ↔ columns for a matrix).
  • Reshape: reinterpret the same data with different dimensions (e.g. flatten an image tensor into a vector).
$\begin{bmatrix}1 & 2\\3 & 4\end{bmatrix}^T = \begin{bmatrix}1 & 3\\2 & 4\end{bmatrix}$

Note

These operations don't change the underlying data, just how you view it.
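A minimal NumPy sketch of both operations:

import numpy as np

A = np.array([[1, 2], [3, 4]])
print(A.T)                  # [[1 3] [2 4]] -- rows and columns swapped

image = np.zeros((256, 256, 3))
flat = image.reshape(-1)    # flatten the image tensor into a vector
print(flat.shape)           # (196608,) == 256 * 256 * 3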


Norms & Magnitudes ​

The norm of a tensor measures its "size" (generalizing vector length). For a vector:

$\|\mathbf{v}\| = \sqrt{\sum_i v_i^2}$

For matrices and higher tensors, various norms exist (Frobenius, spectral, etc.).
See: https://en.wikipedia.org/wiki/Matrix_norm

Note

Norms are useful for regularization and measuring distances.
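A few of these norms in NumPy, for reference:

import numpy as np

v = np.array([3.0, 4.0])
print(np.linalg.norm(v))           # 5.0 -- Euclidean length of the vector

M = np.array([[1.0, 2.0], [3.0, 4.0]])
print(np.linalg.norm(M, 'fro'))    # Frobenius norm: sqrt(1 + 4 + 9 + 16)
print(np.linalg.norm(M, 2))        # spectral norm: the largest singular value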

Things that aren't quite tensors ​

  • Areas and Volumes:
    • There is no "Area-Tensor" in a locally Euclidean plane. Areas and Volumes are scalar quantities derived from tensors (like the determinant of a matrix), but they are not tensors themselves.
  • Quaternions:
    • These are a number system that extends complex numbers, often used to represent rotations in 3D space. While they can be related to tensors, they are not tensors themselves.
    • They are closely related to spinors, which are a different kind of mathematical object.
  • Lie Groups and Lie Algebras:
    • These are algebraic structures that describe continuous symmetry. They can be represented using tensors, but they are not tensors themselves.
  • Tensor-Densities:
    • They transform almost like tensors, but pick up extra factors of the Jacobian determinant of the coordinate transformation (i.e. they don't just scale by a factor of $k^n$ when the coordinates are scaled by $k$).
  • Pseudotensors:
    • They behave like tensors under proper rotations but gain an additional sign flip under improper transformations (like reflections). An example is the cross product in 3D space (see the sketch just after this list).
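Here's a small NumPy sketch of that last point: under an improper transformation (an inversion with determinant -1), the cross product picks up an extra sign flip compared to a true vector:

import numpy as np

a = np.array([1.0, 0.0, 0.0])
b = np.array([0.0, 1.0, 0.0])

R = -np.eye(3)                 # an improper transformation (inversion, det = -1)

lhs = np.cross(R @ a, R @ b)   # cross product of the transformed vectors
rhs = R @ np.cross(a, b)       # transforming the cross product like a true vector

print(lhs)   # [0. 0. 1.]
print(rhs)   # [-0. -0. -1.] -- the extra sign flip: the cross product is a pseudovector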

Tensors and Transformations (the advanced bit) ​

Remember that one-line "formal definition" from earlier?

"A tensor is an object that transforms and behaves like a tensor."

This actually means something very specific. Let's consider a vector $v^i$ and a coordinate transformation given by a matrix $A^i_{\ j}$.

Under a change of basis:

$v'^i = A^i_{\ j}\, v^j$

That's how vectors transform. Now for a 2nd-order tensor (e.g. a matrix):

$T'^{ij} = A^i_{\ p}\, A^j_{\ q}\, T^{pq}$

Notice how each index transforms with its own copy of A. That's the essence of being a tensor:

Each index transforms independently and linearly under coordinate changes.

For lower indices (covariant), you use the inverse transpose (often written $(A^{-1})^q_{\ j}$). For mixed tensors (one index up, one down), you mix both rules.

Example: a rank-2 mixed tensor $T^i_{\ j}$:

$T'^i_{\ j} = A^i_{\ p}\,(A^{-1})^q_{\ j}\, T^p_{\ q}$

Important

This is a key property that makes a tensor a tensor, and that distinguishes tensors from arbitrary multi-dimensional arrays.
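To see the 2nd-order rule in action, here's a minimal numerical sketch (assuming NumPy; A and T are just random placeholder values):

import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))    # change-of-basis matrix (placeholder values)
T = rng.standard_normal((3, 3))    # a 2nd-order tensor (placeholder values)

# Each index gets its own copy of A: T'^{ij} = A^i_p A^j_q T^{pq}
T_new = np.einsum('ip,jq,pq->ij', A, A, T)

# For a plain matrix this is the same as A T A^T
print(np.allclose(T_new, A @ T @ A.T))   # True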

Conclusion ​

Tensor here. Tensor there. Tensors everywhere.

I am so sorry but I am too lazy to write a better conclusion.

But here you have a post-editorial note:
I should not have made that god damn 6D tensor in CSS...


© NullDev 2025