Serving multiple TensorFlow models using Docker

There is no Docker environment variable named "MODEL_CONFIG_FILE" (that is a tensorflow/serving command-line option, not an environment variable the image reads; see the docker image link). The image therefore falls back to its default environment variables ("MODEL_NAME=model" and "MODEL_BASE_PATH=/models") and tries to serve the single model at "/models/model" on startup. Instead, "config.conf" has to be passed to tensorflow/serving as a startup argument. Try running something like this:

docker run -p 8500:8500 -p 8501:8501 \
  --mount type=bind,source=/path/to/models/first/,target=/models/first \
  --mount type=bind,source=/path/to/models/second/,target=/models/second \
  --mount type=bind,source=/path/to/config/config.conf,target=/config/config.conf \
  -t tensorflow/serving --model_config_file=/config/config.conf
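
For reference, here is a minimal sketch of what /config/config.conf could contain for the two mounted directories (the model names "first" and "second" are placeholders chosen to match the bind mounts above):

model_config_list {
  config {
    name: "first"
    base_path: "/models/first"
    model_platform: "tensorflow"
  }
  config {
    name: "second"
    base_path: "/models/second"
    model_platform: "tensorflow"
  }
}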

I ran into the double-slash path issue with Git Bash on Windows, so instead I pass the argument mentioned by @KrisR89 via the command key in docker-compose.

The new docker-compose file looks like this and works with the supplied Dockerfile:

version: '3'

services:
  serving:
    build: .
    image: testing-models
    container_name: tf
    command: --model_config_file=/config/config.conf
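
The Dockerfile itself is not reproduced here; assuming it is based on the tensorflow/serving image and bakes the models and config file into the image, a minimal sketch could be:

FROM tensorflow/serving
# Paths below are illustrative; adjust them to your project layout
COPY ./models/first /models/first
COPY ./models/second /models/second
COPY ./config/config.conf /config/config.conf

Because the tensorflow/serving entrypoint already launches tensorflow_model_server, the command value from docker-compose is appended to it as an extra argument.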

The error occurs because Serving could not find your model:

E tensorflow_serving/sources/storage_path/file_system_storage_path_source.cc:369] FileSystemStoragePathSource encountered a file-system access error: Could not find base path /models/model for servable model

Your docker-compose file didn't mount your model files into the container, so Serving couldn't find your models. I suggest setting up three configuration files:

1. docker-compose.yml

2. .env

3. models.config

docker-compose.yml:

Mount your model files from the host into the container. You could do it like this:

 version: "3"
  services:
        sv:
                image: tensorflow/serving:latest
                restart: unless-stopped
                ports:
                        - 8500:8500
                        - 8501:8501
                volumes:
                        - ${MODEL1_PATH}:/models/${MODEL1_NAME}
                        - ${MODEL2_PATH}:/models/${MODEL2_NAME}
                        - /home/deploy/dcp-file/tf_serving/models.config:/models/models.config
                command: --model_config_file=/models/models.config

.env: docker-compose.yml loads the values of ${MODEL1_PATH}, ${MODEL1_NAME}, etc. from this file.

MODEL1_PATH=/home/notebooks/water_model
MODEL1_NAME=water_model
MODEL2_PATH=/home/notebooks/ice_model
MODEL2_NAME=ice_model
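
To confirm that the substitution works (Compose reads .env from the same directory as docker-compose.yml by default), you can print the resolved file and then start the service:

docker-compose config    # shows the compose file with ${MODEL1_PATH} etc. filled in
docker-compose up -d     # starts TensorFlow Serving in the background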

models.config:

model_config_list {
  config {
    name: "water_model"
    base_path: "/models/water_model"
    model_platform: "tensorflow"
    model_version_policy {
      specific {
        versions: 1588723537
        versions: 1588734567
      }
    }
  }
  config {
    name: "ice_model"
    base_path: "/models/ice_model"
    model_platform: "tensorflow"
    model_version_policy {
      specific {
        versions: 1588799999
        versions: 1588788888
      }
    }
  }
}
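
Once the container is running, you can verify that both models are loaded via the REST API on the mapped port 8501 (the model names come from models.config):

curl http://localhost:8501/v1/models/water_model
curl http://localhost:8501/v1/models/ice_model

Each request should report the listed versions with state AVAILABLE if the mounts and config are correct.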

For more details, see the official TensorFlow Serving configuration documentation.