
I was following the TFJS WebML YouTube course from Jason Mayes, working along with the TFJS Converter video; here's my notebook: Google Colab Notebook

It is basically an exact replica of the code from the video, and aside from a few extra warnings here and there, the model files were generated: MobileNetV2.zip

On my site, I used TFJS with tf.loadLayersModel('URL OF MODEL.JSON'), but I got an error saying an InputLayer should have been passed either a batchInputShape or an inputShape. There is a batch_shape in the model.json, though. TFJS is up to date, so I have no clue what's going on.

I am actually trying to convert a MobileNetV3-Large, but the same thing happened with it too (using the same notebook as earlier, just replacing tf.keras.applications.MobileNetV2 with tf.keras.applications.MobileNetV3Large, with the same result): MobileNetV3-Large.zip

Any help appreciated! This is probably just me being stupid, as I'm learning TF/TFJS and ML in general, so sorry in advance!

1 Answer


The error indicates that TensorFlow.js cannot determine the expected input shape for your model. It arises because model.save() saves the model architecture and weights but does not explicitly save the InputLayer's input shape.
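
A minimal sketch of the export step (the input size, weights argument, and file path are assumptions chosen to match the converter command below): build the model with an explicit input_shape so the InputLayer is written into the saved architecture, then save it in the Keras HDF5 format the converter expects.

import os
import tensorflow as tf

# Make sure the output directory exists before saving.
os.makedirs("sample_data/tf_model_MobileNetV3Large", exist_ok=True)

# Build MobileNetV3-Large with an explicit input shape so the InputLayer
# (and its batch input shape) becomes part of the saved model config.
model = tf.keras.applications.MobileNetV3Large(
    input_shape=(224, 224, 3),
    weights="imagenet",
    include_top=True,
)

# Save in the HDF5 (Keras) format used with --input_format=keras.
model.save("sample_data/tf_model_MobileNetV3Large/model_tf.h5")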

You can then convert the MobileNetV3-Large model with the TensorFlow.js converter like this:

!tensorflowjs_converter --input_format=keras sample_data/tf_model_MobileNetV3Large/model_tf.h5 sample_data/tfjs_model_keras_MobileNetV3Large
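
If you would rather run the conversion from Python inside the notebook, the tensorflowjs package exposes the same conversion as a function (the output directory name here is just illustrative):

import tensorflowjs as tfjs

# Write model.json and the binary weight shards in the TFJS Layers format.
tfjs.converters.save_keras_model(model, "sample_data/tfjs_model_keras_MobileNetV3Large")

Either way, the generated model.json and .bin weight files can then be served and loaded in the browser with tf.loadLayersModel.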