A tensor can be viewed as a multi-dimensional array. Just as an n-dimensional vector can be represented as a one-dimensional array of n elements relative to a specific basis, any tensor can be expressed as a multi-dimensional array when referenced to a basis. The individual values within this multi-dimensional structure are referred to as the tensor’s components.
PyTorch is an open-source machine learning library developed by Facebook’s AI Research lab. It is known for its flexibility, intuitive design, and dynamic computational graph, which makes debugging easier. The library offers multi-dimensional tensor data structures and implements a wide range of mathematical functions for manipulating them. It also includes tools for efficient tensor serialisation, support for arbitrary data types, and several other practical utilities.
PyTorch shares significant similarities with NumPy, though it uses the term “tensor” instead of “N-dimensional array”. For example, creating an array of zeros looks nearly identical in the two libraries, as the sketch below illustrates.
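The following is a minimal illustrative sketch (the specific shapes are our own choice, not from the original text), assuming both numpy and torch are installed:

import numpy as np
import torch

# NumPy: a 2x3 N-dimensional array of zeros
numpy_zeros = np.zeros((2, 3))

# PyTorch: the equivalent 2x3 tensor of zeros
torch_zeros = torch.zeros(2, 3)

print(numpy_zeros)
print(torch_zeros)

Both calls produce the same 2x3 grid of zeros; only the container type differs (numpy.ndarray versus torch.Tensor).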
Now let us create tensors and perform some operations in PyTorch. If you’re familiar with vector and matrix operations and linear algebra, you should see some familiar things below, but if you’re not, that’s ok! If you want an intro or a refresher on linear algebra, we highly recommend the YouTuber 3Blue1Brown and his series “Essence of Linear Algebra”.
import torch

# create specific tensors
zeros = torch.zeros(3, 4)  # 3x4 tensor of zeros
ones = torch.ones(2, 3)    # 2x3 tensor of ones
rand = torch.rand(2, 2)    # 2x2 tensor of random numbers (0-1)
print(zeros, "\n", ones, "\n", rand)
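Each of these objects is just a multi-dimensional array of components, and its layout can be inspected directly. The lines below are an illustrative sketch (not part of the original snippet), reusing the zeros tensor defined above:

# Inspect the structure of the 3x4 tensor of zeros
print(zeros.shape)   # torch.Size([3, 4]) -- a 3x4 grid, i.e. two dimensions
print(zeros.dtype)   # torch.float32, the default floating-point type
print(zeros[0, 1])   # tensor(0.), an individual component of the tensor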
# Create two 1D tensors (vectors)
vector_a = torch.tensor([1, 2, 3])
vector_b = torch.tensor([4, 5, 6])

# Computing the dot product between two vectors using torch.dot
dot_product_result = torch.dot(vector_a, vector_b)
print(dot_product_result)
tensor(32)
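Dot products extend naturally to matrix multiplication, one of the matrix operations mentioned above. The following is a hedged sketch; the 2x2 matrices are illustrative values, not from the original text:

# Create two 2x2 matrices (2D tensors)
matrix_a = torch.tensor([[1, 2], [3, 4]])
matrix_b = torch.tensor([[5, 6], [7, 8]])

# Matrix multiplication with torch.matmul (the @ operator is equivalent)
matmul_result = torch.matmul(matrix_a, matrix_b)
print(matmul_result)
# tensor([[19, 22],
#         [43, 50]])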
In this section, we have introduced some basic tensor operations. To learn more, refer to the official documentation: Tensor Operations