Load a TensorFlow.js model from the local file system in JavaScript

I have converted a Keras model to the TensorFlow.js JSON format and saved it locally on my computer. I am trying to load that JSON model in JavaScript using the command below

model = await tf.loadModel('web_model')

But the model is not getting loaded. Is there a way to load a TensorFlow.js JSON model from the local file system?


Solution 1:

I know you're trying to load your model in a browser, but if anybody lands here trying to do it in Node, here's how:

const tf = require("@tensorflow/tfjs");
const tfn = require("@tensorflow/tfjs-node");

// create an IO handler that reads the model from the local file system
const handler = tfn.io.fileSystem("./path/to/your/model.json");
// note: await must run inside an async function
const model = await tf.loadLayersModel(handler);
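A related option: once @tensorflow/tfjs-node is loaded it also understands file:// URLs, so (to the best of my knowledge) the following equivalent form works too. This is just a minimal sketch; the path is a placeholder:

const tf = require("@tensorflow/tfjs-node");

(async () => {
  // file:// URLs are dispatched to the node file-system handler
  const model = await tf.loadLayersModel("file://path/to/your/model.json");
  model.summary();
})();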

Solution 2:

loadModel uses fetch under the hood, and fetch cannot access local files directly; it is meant to retrieve files served over HTTP (see the fetch documentation for details). To load a local file in the browser, there are two approaches: asking the user to upload the file with

<input type="file"/>

Or serving the file by a server.

In both scenarios, tf.js provides a way to load the model.

  1. Load the model by asking the user to upload the file

html

<input type="file" id="upload-json"/>
<input type="file" id="upload-weights"/>

js

const uploadJSONInput = document.getElementById('upload-json');
const uploadWeightsInput = document.getElementById('upload-weights');
const model = await tf.loadLayersModel(tf.io.browserFiles(
  [uploadJSONInput.files[0], uploadWeightsInput.files[0]]));
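Note that files is only populated after the user has actually picked files, so the load usually has to be triggered from an event handler. A rough sketch of the wiring; the button id is an assumption, not part of the original answer:

const loadButton = document.getElementById('load-model'); // hypothetical button
loadButton.addEventListener('click', async () => {
  const jsonFile = document.getElementById('upload-json').files[0];
  // add the `multiple` attribute to the weights <input> if the model has several shards
  const weightFiles = Array.from(document.getElementById('upload-weights').files);
  // browserFiles expects the model.json first, followed by all weight shard files
  const model = await tf.loadLayersModel(tf.io.browserFiles([jsonFile, ...weightFiles]));
  console.log('model loaded', model);
});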
  2. Serving the local files using a server

To do so, one can use the npm module http-server to serve the directory containing both the weights and the model. It can be installed with the following command:

 npm install http-server -g

Inside the directory, one can run the following command to launch the server:

http-server -c1 --cors .

Now the model can be loaded:

 // load model in js script
 (async () => {
   ...
   const model = await tf.loadLayersModel('http://localhost:8080/model.json');
 })()
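Putting it together, a minimal page served from the same directory might look like the sketch below; the CDN script tag and file names are assumptions, so adjust them to your setup:

<!DOCTYPE html>
<html>
  <head>
    <!-- load tf.js from a CDN (assumed URL; pin a version in real use) -->
    <script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs"></script>
  </head>
  <body>
    <script>
      (async () => {
        // model.json and its weight shards are served by http-server on port 8080
        const model = await tf.loadLayersModel('http://localhost:8080/model.json');
        console.log('model loaded', model);
      })();
    </script>
  </body>
</html>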

Solution 3:

const tf = require('@tensorflow/tfjs');
const tfnode = require('@tensorflow/tfjs-node');

async function loadModel(){
    // fileSystem() creates an IO handler that reads the model from disk
    const handler = tfnode.io.fileSystem('tfjs_model/model.json');
    const model = await tf.loadLayersModel(handler);
    console.log("Model loaded");
    return model;
}

loadModel();

This worked for me in Node. Thanks to jafaircl.
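Once the promise resolves you can run inference. A rough sketch, assuming a model that takes a flat input of 784 values (the shape is a placeholder; substitute your model's actual input shape):

async function run() {
    const model = await loadModel();
    // dummy input; replace [1, 784] with your model's real input shape
    const input = tf.zeros([1, 784]);
    const output = model.predict(input);
    output.print();
}

run();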

Solution 4:

If you're using React with create-react-app, you can keep your saved model files in your public folder.

For example, say you want to use the blazeface model. You would

  1. Download the .tar.gz model from that web page.

  2. Unpack the model into your app's public directory. So now you have the files from the .tar.gz file in a public subdir:

    %YOUR_APP%/public/blazeface_1_default_1/model.json
    %YOUR_APP%/public/blazeface_1_default_1/group1-shard1of1.bin
    
  3. Load the model in your React app using

    const model = await tf.loadGraphModel(process.env.PUBLIC_URL + '/blazeface_1_default_1/model.json');
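In a component this is typically wrapped in an effect so it runs once on mount. A rough sketch under those assumptions (the component and state names are made up, not part of the original answer):

import { useEffect, useState } from 'react';
import * as tf from '@tensorflow/tfjs';

function FaceDetector() {
  const [model, setModel] = useState(null);

  useEffect(() => {
    // load the graph model from the public folder once, when the component mounts
    tf.loadGraphModel(process.env.PUBLIC_URL + '/blazeface_1_default_1/model.json')
      .then(setModel)
      .catch(console.error);
  }, []);

  return <div>{model ? 'Model ready' : 'Loading model...'}</div>;
}

export default FaceDetector;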