27

How do I convert a torch tensor to a numpy array?

8 Answers

44

Copied from the PyTorch docs:

a = torch.ones(5)
print(a)

tensor([1., 1., 1., 1., 1.])

b = a.numpy()
print(b)

[1. 1. 1. 1. 1.]
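The same doc example goes on to show that the tensor and the array share memory, so an in-place edit to one shows up in the other:

a.add_(1)
print(a)
print(b)

tensor([2., 2., 2., 2., 2.])
[2. 2. 2. 2. 2.]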


Following from the discussion with @John in the comments below:

In case the tensor is (or can be) on the GPU, or in case it requires (or might require) grad, one can use

t.detach().cpu().numpy()

I recommend uglifying your code only as much as required.
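For example (a minimal sketch; the tensor here is made to require grad, and is moved to the GPU only when one is available):

import torch

t = torch.ones(3, requires_grad=True)
if torch.cuda.is_available():
    t = t.to('cuda')  # exercise the GPU case when possible

# t.numpy() would raise: "Can't call numpy() on Tensor that requires grad"
arr = t.detach().cpu().numpy()  # detach from autograd, move to host, convert
print(arr)  # [1. 1. 1.]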


9 Comments

In my copy of torch better make that a.detach().cpu().numpy()
@LarsEricson why?
what would the complexity be to convert tensor to NumPy like this?
This is true, although I believe both are noops if unnecessary so the overkill is only in the typing and there's some value if writing a function that accepts a Tensor of unknown provenance. I apologize for misunderstanding your original question to Lars. To summarize, detach and cpu are not necessary in every case, but are necessary in perhaps the most common case (so there's value in mentioning them). numpy is necessary in every case but is often insufficient on its own. Any future persons should reference the question linked above or the pytorch documentation for more information.
11

You can try the following ways, where t is your tensor (a quick sketch comparing them follows the list):

1. t.numpy()
2. t.cpu().data.numpy()
3. t.cpu().detach().numpy()
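A rough sketch of when each form matters: a plain CPU tensor needs only (1), while a tensor that tracks gradients must be detached from autograd first, as in (2) or (3). Note that .data is a legacy way to get a detached view; detach() is the preferred modern spelling.

import torch

t = torch.rand(2)                       # plain CPU tensor: (1) is enough
print(t.numpy())

t_grad = torch.rand(2, requires_grad=True)
# t_grad.numpy() would raise a RuntimeError; detach from autograd first
print(t_grad.cpu().detach().numpy())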

1 Comment

"try" is a very bad phrasing for this. The behavior is very deterministic for each of those calls
4

This is a function from fastai core:

def to_np(x):
    "Convert a tensor to a numpy array."
    # `apply` is fastai's helper that maps a function over
    # (possibly nested) containers of tensors.
    return apply(lambda o: o.data.cpu().numpy(), x)

Using a function from an established PyTorch library is a nice choice.
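If you'd rather avoid the fastai dependency, here is a self-contained sketch of the same idea (the apply helper below is a minimal stand-in I wrote for fastai's version, not fastai's actual code):

import torch

def apply(func, x):
    # Minimal stand-in for fastai's apply: recurse into lists/tuples/dicts.
    if isinstance(x, (list, tuple)):
        return type(x)(apply(func, o) for o in x)
    if isinstance(x, dict):
        return {k: apply(func, v) for k, v in x.items()}
    return func(x)

def to_np(x):
    "Convert a tensor (or nested structure of tensors) to numpy arrays."
    return apply(lambda o: o.data.cpu().numpy(), x)

print(to_np([torch.ones(2), torch.zeros(1)]))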

If you look inside PyTorch Transformers you will find this code:

preds = logits.detach().cpu().numpy()

So you may ask: why is the detach() method needed? It detaches the tensor from the AD (autograd) computational graph; calling numpy() on a tensor that requires grad raises an error otherwise.

Still, note that the CPU tensor and the numpy array stay connected: they share the same underlying storage:

import torch
tensor = torch.zeros(2)
numpy_array = tensor.numpy()
print('Before edit:')
print(tensor)
print(numpy_array)

tensor[0] = 10

print()
print('After edit:')
print('Tensor:', tensor)
print('Numpy array:', numpy_array)

Output:

Before edit:
tensor([0., 0.])
[0. 0.]

After edit:
Tensor: tensor([10.,  0.])
Numpy array: [10.  0.]

The value of the first element is shared by the tensor and the numpy array. Changing it to 10 in the tensor changed it in the numpy array as well.

This is why we need to be careful, since altering the numpy array may alter the CPU tensor as well.
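If you need an array that is independent of the tensor, copy the buffer explicitly (a small sketch):

import torch

tensor = torch.zeros(2)
independent = tensor.numpy().copy()  # copies the data, breaking the link
tensor[0] = 10
print(independent)  # [0. 0.] -- unaffected by the in-place edit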

Comments

4

Another useful way:

a = torch.tensor(0.1, device='cuda')

a.cpu().data.numpy()

Output:

array(0.1, dtype=float32)

2 Comments

What is the benefit of including .data. ?
It's not a "useful way". .cpu().data is strictly useless in the context of the question.
1

You may find the following two functions useful (a round-trip sketch follows the list).

  1. torch.Tensor.numpy()
  2. torch.from_numpy()
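A quick round-trip sketch (note that torch.from_numpy also shares memory with the source array, just as Tensor.numpy does):

import torch
import numpy as np

a = np.array([1.0, 2.0])
t = torch.from_numpy(a)   # numpy -> tensor (shares memory with a)
b = t.numpy()             # tensor -> numpy (shares memory with t)
print(t, b)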

Comments

1

Sometimes, if the tensor is attached to a computation graph (it requires grad), you'll first have to call .detach() before the .numpy() call.

loss = loss_fn(preds, labels)
print(loss.detach().numpy())

Comments

1

You can use the force=True parameter from torch.Tensor.numpy:

import torch

t = torch.rand(3, 2, device='cuda:0')
print(t.numpy(force=True))

t.numpy(force=True) is shorthand for:

t.detach().cpu().resolve_conj().resolve_neg().numpy()

The force parameter was introduced in PyTorch 1.13.
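A small sketch showing that force=True also covers the requires-grad case, where a plain .numpy() call would raise:

import torch

t = torch.rand(2, requires_grad=True)
print(t.numpy(force=True))  # detaches (and copies if needed) automatically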

2 Comments

that's cool. nice idea by torch. what version introduced this?
Hi @Gulzar, it was introduced in PyTorch 1.13
-1
x = torch.tensor([0.1,0.32], device='cuda:0')

x.detach().cpu().data.numpy()

2 Comments

Code-only answers are not great. Please explain why you consider this code to answer the question.
the detach() and .data parts are not required for this answer
