
I have uploaded the model.json file of my TensorFlow graph to a private AWS S3 bucket, and I am now trying to load the graph with loadGraphModel, along with the binary weight file listed in the weight manifest (group1-shard1of1). Here's my code, which I run with Node (I've kept the bucket path and signature keys private):

const TFJSConverter = require('@tensorflow/tfjs-converter')
const MODEL_URL = "https://[BucketName].s3.amazonaws.com/[PathToModel]/model.json?[credentials]&[securitykey]";
global.fetch = require('node-fetch') // tfjs needs a fetch implementation under Node
TFJSConverter.loadGraphModel(MODEL_URL)

However, loadGraphModel checks whether the model URL ends with '.json'. Since my signed URL ends with the query string instead, it also requests a weight manifest file called weights_manifest.json at the same path, without the signature. The request then fails with this error:

UnhandledPromiseRejectionWarning: Error: Request to https://[BucketName].s3.amazonaws.com/[PathToModel]/model.json?[credentials]&[securitykey],https://[BucketName].s3.amazonaws.com/[PathToModel]/weights_manifest.json failed with status code 403. Please verify this URL points to the model JSON of the model to load.

I've checked that the signed URL itself works. Is there a solution for signed URLs?

Installed versions: @tensorflow/[email protected], node v10.15.3

Many thanks!

  • Maybe download it to a temp directory and load it from there? Commented May 9, 2019 at 5:45
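
A minimal sketch of that workaround, assuming @tensorflow/tfjs-node is installed so the file:// handler is available (the file names, temp path, and second presigned URL are illustrative):

const fs = require('fs');
const os = require('os');
const path = require('path');
const fetch = require('node-fetch');
const tf = require('@tensorflow/tfjs-node'); // registers the file:// IO handler

async function loadFromTemp(signedModelUrl, signedShardUrl) {
  const dir = fs.mkdtempSync(path.join(os.tmpdir(), 'tfjs-model-'));

  // Download model.json and its weight shard with their presigned URLs;
  // the local shard name must match the path listed in model.json's weightsManifest
  for (const [name, url] of [['model.json', signedModelUrl],
                             ['group1-shard1of1', signedShardUrl]]) {
    const res = await fetch(url);
    fs.writeFileSync(path.join(dir, name), Buffer.from(await res.arrayBuffer()));
  }

  // The shard is then resolved relative to the local model.json
  return tf.loadGraphModel(`file://${path.join(dir, 'model.json')}`);
}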

2 Answers


The correct library to use to load the model is tfjs, not tfjs-converter:

let tf = require("@tensorflow/tfjs");
tf.loadGraphModel(MODEL_URL)

A 403 error is an authorization error response. Try setting the credentials on the request using the requestInit property of the options object passed as the second parameter of loadGraphModel.
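
For example, a minimal sketch passing custom headers through requestInit (the header name and value are placeholders; what your bucket actually expects depends on how access is configured):

const tf = require('@tensorflow/tfjs');
global.fetch = require('node-fetch');

// requestInit is forwarded to fetch for every request the loader makes,
// including the requests for the weight shards
tf.loadGraphModel(MODEL_URL, {
  requestInit: {
    headers: { Authorization: '[mySignature]' } // placeholder credential
  }
}).then(() => console.log('model loaded'));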


3 Comments

I've added a header with the signature: const mycred = '[mySignature]'; var myHeader = {'Authorization': mycred}; var myInit = {headers: myHeader}; const model = tf.loadGraphModel(MODEL_URL, {requestInit: myInit}); and I now get this error: Error: Request to s3.amazonaws.com/[PathToModel]/model.json failed with status code 400. Please verify this URL points to the model JSON of the model to load.
Can you check the Network tab in Chrome's developer tools to see if the headers are set on the AJAX requests that retrieve the model files?
Did you manage to get this to work @tlc91? From my understanding, presigned S3 URLs are limited to a single key (e.g. the model.json file), and tf.loadGraphModel would attempt to download the rest of the .bin files under the same prefix.

This worked for me:

const fetch = require('node-fetch')
global.fetch = fetch

but you can also try:

const fetch = require('node-fetch')
const tf = require('@tensorflow/tfjs')
tf.loadGraphModel(MODEL_URL, { fetchFunc: fetch })

as described in the documentation: https://js.tensorflow.org/api/latest/#loadGraphModel
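
Building on the fetchFunc option, here is a rough sketch of a custom fetch function that presigns each object key the loader requests, which would also cover the weight shards mentioned in the comments above. It assumes the AWS SDK v3 presigner; the bucket name, region, and paths are placeholders:

const tf = require('@tensorflow/tfjs');
const fetch = require('node-fetch');
const { S3Client, GetObjectCommand } = require('@aws-sdk/client-s3');
const { getSignedUrl } = require('@aws-sdk/s3-request-presigner');

const s3 = new S3Client({ region: 'us-east-1' }); // placeholder region
const BUCKET = '[BucketName]';                    // placeholder bucket

// Presign each file the loader asks for (model.json and every weight shard)
async function presignedFetch(url, init) {
  const key = decodeURIComponent(new URL(url).pathname.slice(1)); // object key from the URL path
  const signedUrl = await getSignedUrl(
    s3,
    new GetObjectCommand({ Bucket: BUCKET, Key: key }),
    { expiresIn: 300 }
  );
  return fetch(signedUrl, init);
}

tf.loadGraphModel('https://[BucketName].s3.amazonaws.com/[PathToModel]/model.json',
                  { fetchFunc: presignedFetch });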

