Understanding Tensors

00:00 Welcome to this lesson where you’ll learn about tensors, which are the backbone of TensorFlow and PyTorch.

00:08 Think of tensors as containers. Imagine you have a container, and this container can hold a bunch of numbers like 3, 5, 8, and 7, and that’s a tensor.
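As a quick sketch of that idea, here is what such a container looks like in NumPy (the variable name simple_tensor is just a placeholder):

```python
import numpy as np

# A 1D tensor: a container holding the numbers from the lesson.
simple_tensor = np.array([3, 5, 8, 7])

print(simple_tensor)        # [3 5 8 7]
print(simple_tensor.shape)  # (4,) -- one dimension with four values
```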

00:20 It’s like a container that can hold numbers,

00:23 but here’s where it gets interesting. Tensors can also hold multidimensional arrays. Let’s look at this picture: a tensor can be like rows of boxes, or containers, where each box represents a piece of data.

00:36 Machine learning models use these rows of boxes to process information.
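Here is a rough sketch of those rows of boxes as a 2D array (the values are illustrative, not from the lesson):

```python
import numpy as np

# A 2D tensor: rows of boxes, where each box holds a piece of data.
rows_of_boxes = np.array([
    [1, 2, 3],  # first row of boxes
    [4, 5, 6],  # second row of boxes
])

print(rows_of_boxes.shape)  # (2, 3) -- two rows, three boxes per row
```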

00:41 For example, 2D tensors can represent more complex data like grayscale images. How, you might ask? Let’s create a three-by-three tensor for a simple pixelated grayscale image.

00:54 Each cell in the tensor corresponds to a pixel, with white and black representing the maximum (255) and minimum (0) values, respectively. Okay, here’s the tensor that you’re creating with an np.array.
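If you’re coding along, that np.array would look roughly like this (the variable name image is just a placeholder):

```python
import numpy as np

# A 3x3 grayscale image: 255 is a white pixel, 0 is a black pixel.
image = np.array([
    [255,   0, 255],  # white, black, white
    [  0, 255,   0],  # black, white, black
    [255,   0, 255],  # white, black, white
])

print(image.shape)  # (3, 3)
```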

01:07 In the first row, you have 255, 0, 255. A value of 0 is a pixel that corresponds to black, and 255 corresponds to white. In the second row, you have 0, 255, 0, or black, white, black.

01:22 And in the third row, you have 255, 0, 255 again. Now, what would this correspond to as a whole? Let’s see.

01:31 Here it is, exactly as you described it. The tensor gives you a grayscale image, with each row made up of black and white pixels.

01:40 In a nutshell, tensors are just containers for numbers, and they come in different shapes and sizes depending on the data you’re working with.
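To make the “different shapes and sizes” point concrete, here is a small NumPy sketch; the same idea carries over to tf.constant in TensorFlow and torch.tensor in PyTorch:

```python
import numpy as np

# Tensors of different shapes and sizes -- all just containers for numbers.
scalar = np.array(7)                     # 0D: a single number,   shape ()
vector = np.array([3, 5, 8, 7])          # 1D: a list of numbers, shape (4,)
matrix = np.array([[255, 0], [0, 255]])  # 2D: rows and columns,  shape (2, 2)
cube = np.zeros((3, 3, 3))               # 3D: e.g. a tiny color image, shape (3, 3, 3)

for t in (scalar, vector, matrix, cube):
    print(t.shape)
```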
