
I'm trying to combine a set of images into one np.array and convert it to a tf.data.Dataset object like so:

test = np.array([np.array(PIL.Image.open(image), dtype=np.float32) for image in image_list],
                dtype=object)

test_set = tf.data.Dataset.from_tensor_slices(test)

But doing so raises the following error:

ValueError: Failed to convert a NumPy array to a Tensor (Unsupported object type numpy.ndarray).

Any idea what I could be doing wrong? Cheers

  • That's how a numpy array format is written. If you do print(test), it will be printed in the format you want. There is nothing wrong with the output. Commented May 5, 2022 at 7:41

1 Answer


Figured out the issue was caused by one np.array call too many: the outer np.array(..., dtype=object) wraps the images in an object-dtype array, which from_tensor_slices can't convert to a tensor. Dropping it fixed it for me:

test = [np.array(PIL.Image.open(image), dtype=np.float32) for image in image_list]
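To illustrate the difference without needing TensorFlow or image files, here is a minimal NumPy-only sketch (the 4×4×3 "image" shapes are hypothetical): forcing dtype=object keeps the array non-numeric, which is what from_tensor_slices rejects, while a plain list of uniformly shaped float32 arrays stacks into a single numeric tensor-like array.

```python
import numpy as np

# Two hypothetical "images" with the same (H, W, C) shape.
images = [np.zeros((4, 4, 3), dtype=np.float32),
          np.ones((4, 4, 3), dtype=np.float32)]

# The original code's outer wrapper: dtype=object produces an array of
# Python objects rather than a numeric array -- this is the unsupported
# type that from_tensor_slices complains about.
ragged = np.array(images, dtype=object)
print(ragged.dtype)  # object

# Without the wrapper, the arrays stack into one numeric float32 array,
# which TensorFlow can convert directly.
stacked = np.stack(images)
print(stacked.shape, stacked.dtype)  # (2, 4, 4, 3) float32
```

Note that this only works if every image has the same shape; for images of varying sizes you would need to resize them first or build the dataset another way (e.g. from a generator).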