
I am trying to load a TensorFlow.js layers model from a model.json that exists only as an in-memory object on the browser side, i.e. there is no URL to fetch it from.

https://js.tensorflow.org/api/latest/#loadLayersModel

One approach might be to return the JSON from a dummy fetch function, since the load options allow overriding fetch:

fetchFunc (Function) A function used to override the window.fetch function.
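Something along these lines is roughly what I have in mind (just a sketch, assuming modelJson holds the in-memory JSON string; the weight .bin requests would presumably still need to be served somehow):

// Sketch only: serve the in-memory JSON for the model.json request and fall
// back to the real fetch for everything else (e.g. weight shards).
const fakeFetch = async (url, init) => {
  if (url.endsWith("model.json")) {
    return new Response(modelJson, {
      headers: { "Content-Type": "application/json" }
    });
  }
  return window.fetch(url, init);
};

tf.loadLayersModel("/model_0/model.json", { fetchFunc: fakeFetch })
  .then(model => { /* use model */ });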

Alternatively, it is possible to create a custom IOHandler, but there is very little documentation on this.

A tf.io.IOHandler object that loads model artifacts with its load method.

Does anyone know how to achieve this using the TensorFlow.js load methods?

var modelJson = "{...ModelAndWeightsConfig}";

// Do something here to load it.

var newModel = tf.loadLayersModel("/model_0/model.json", {
    onProgress: onProgressCallback
}).then(model => {});

Regards,

2 Answers


Yes, you can write your own IOHandler to load the model. Check out the definition of an IOHandler here. You have to implement the load function that returns a Promise<ModelArtifacts>.

That means, to load a model saved by the file IOHandler, you can check out the source code and reimplement the load function yourself.
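Put differently, the contract is small enough that a plain object works too. A minimal sketch, assuming modelJson is your in-memory JSON string and that loading the topology without any weights is acceptable (the model comes back with freshly initialized weights):

// Minimal sketch of the IOHandler contract: any object whose load() resolves
// to ModelArtifacts can be passed to tf.loadLayersModel. Without weightSpecs
// and weightData the weights are not restored.
const inlineHandler = {
  load: async () => ({
    modelTopology: JSON.parse(modelJson).modelTopology
  })
};

const model = await tf.loadLayersModel(inlineHandler);

The full example below fills in the remaining ModelArtifacts fields.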

Code Sample

Here is an example to get you started. The load() part is mostly copied from the loadJSONModel function from the file IOHandler. Basically, the JSON string is passed to the constructor and then used when TensorFlow.js calls the load function.

import * as tfc from '@tensorflow/tfjs-core';

export class JSONHandler implements tfc.io.IOHandler {
  private jsonString: string;

  constructor(jsonString: string) {
    this.jsonString = jsonString;
  }

  async load() {
    const modelJSON = JSON.parse(this.jsonString);
    const modelArtifacts: tfc.io.ModelArtifacts = {
      modelTopology: modelJSON.modelTopology,
      format: modelJSON.format,
      generatedBy: modelJSON.generatedBy,
      convertedBy: modelJSON.convertedBy
    };
    if (modelJSON.weightsManifest != null) {
      // load weights (if they exist) – see the sketch below
    }
    if (modelJSON.trainingConfig != null) {
      modelArtifacts.trainingConfig = modelJSON.trainingConfig;
    }
    if (modelJSON.userDefinedMetadata != null) {
      modelArtifacts.userDefinedMetadata = modelJSON.userDefinedMetadata;
    }
    return modelArtifacts;
  }
}
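The empty "load weights" branch above is where a real handler would have to fill in modelArtifacts.weightSpecs and modelArtifacts.weightData; leaving them out means the weights are never decoded (likely the cause of the error in the comment below). A hedged sketch of how that could be done, assuming the .bin shards listed in weightsManifest are still reachable over HTTP under some base URL (the helper name loadWeights and the pathPrefix parameter are mine, not part of the original answer):

// Hypothetical helper: turn a parsed model.json plus a base URL for the
// weight shards into the weightSpecs/weightData fields of ModelArtifacts.
async function loadWeights(modelJSON, pathPrefix) {
  // Flatten the per-group weight specs into the single list tfjs expects.
  const weightSpecs = [].concat(
      ...modelJSON.weightsManifest.map(group => group.weights));

  // Fetch every shard and concatenate the buffers in manifest order.
  const buffers = [];
  for (const group of modelJSON.weightsManifest) {
    for (const path of group.paths) {
      const response = await fetch(pathPrefix + path);
      buffers.push(await response.arrayBuffer());
    }
  }
  const totalBytes = buffers.reduce((n, b) => n + b.byteLength, 0);
  const joined = new Uint8Array(totalBytes);
  let offset = 0;
  for (const b of buffers) {
    joined.set(new Uint8Array(b), offset);
    offset += b.byteLength;
  }

  return { weightSpecs, weightData: joined.buffer };
}

Inside load() you would then merge the result into the artifacts, e.g. Object.assign(modelArtifacts, await loadWeights(modelJSON, '/model_0/')).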

To use the handler, create an instance of it and pass it to tf.loadLayersModel:

const modelJson = '{ ... }';
const handler = new JSONHandler(modelJson);
const model = await tf.loadLayersModel(handler);

1 Comment

Hi, I seem to get the following error: Uncaught (in promise) TypeError: specs is not iterable at Module.decodeWeights (io_utils.js:105). Is there a way to resolve this?

Fetch the models

    var fetchPromise = function(url) {
        return new Promise(function(resolve, reject) {
            fetch(url)
                .then(response => {
                    resolve(response);
                })
                .catch(err => {
                    reject(err);
                });
        });
    };
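One possible way to wire this into the loader (my own sketch, not part of the original answer) is to read the response as text and hand the string to the JSONHandler from the answer above:

    fetchPromise("/model_0/model.json")
        .then(response => response.text())
        .then(jsonString => tf.loadLayersModel(new JSONHandler(jsonString)))
        .then(model => {
            // model is ready to use here
        });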

