Creating a torch tensor from a generator

I don't see why you'd want to use a generator here; whether you use a generator or a list doesn't really make a difference.

The question is: do you want to create your data in Python first and then move it to PyTorch (slower in most cases), or do you want to create it directly in PyTorch?
(A generator would always create the data in Python first.)

So if you want to load existing data the story is different, but if you want to generate data, I see no reason not to do so directly in PyTorch.
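As a rough illustration of that difference, you can time building the values through a Python list comprehension against computing them directly in PyTorch (a minimal sketch; the exact numbers depend on your machine and the tensor size):

import timeit

import torch

n = 1_000_000

# build the values in Python first, then hand them to PyTorch
t_python = timeit.timeit(lambda: torch.tensor([i ** 2 for i in range(n)]), number=10)

# create the values directly in PyTorch
t_direct = timeit.timeit(lambda: torch.arange(n) ** 2, number=10)

print(t_python, t_direct)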


If you want to create the tensor for your example directly in PyTorch, you can do so using arange and pow:

torch.arange(10).pow(2)

Output:

tensor([ 0,  1,  4,  9, 16, 25, 36, 49, 64, 81])

torch.arange(10) works the same way as range in Python, so it's exactly as versatile as range. pow(2) then simply raises every element of the tensor to the 2nd power.

But once you've created your tensor with arange, you can apply all sorts of other computations to it instead of pow.
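For example (just a small sketch of a few alternatives; note that some of these operations require a floating-point tensor, hence the dtype):

import torch

x = torch.arange(10, dtype=torch.float32)

x.pow(3)      # element-wise cube
x.sqrt()      # element-wise square root
torch.exp(x)  # element-wise exponential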


As @blue-phoenox already points out, it is preferable to use the built-in PyTorch functions to create the tensor directly. But if you have to deal with a generator, it can be advisable to use NumPy as an intermediate stage. Since PyTorch avoids copying the NumPy array, this should be quite performant compared to a simple list comprehension:

>>> import torch
>>> import numpy as np
>>> torch.from_numpy(np.fromiter((i**2 for i in range(10)), int))
tensor([ 0,  1,  4,  9, 16, 25, 36, 49, 64, 81])
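Note that torch.from_numpy shares memory with the NumPy array instead of copying it, which is where the performance advantage comes from. You can see the sharing directly:

>>> arr = np.fromiter((i**2 for i in range(10)), int)
>>> t = torch.from_numpy(arr)
>>> arr[0] = 100   # modifying the array also changes the tensor
>>> t[0]
tensor(100)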
