
Image Classification Guide

A model example for image classification with a Keras wrapper within Acumos.

Sample images and example image classification scores


This model analyzes static images and produces a probability for each of a number of object, scene, and activity tags. It is a Keras-based wrapper around a pre-trained Inception v4 visual model, and this source code creates and pushes that model into Acumos. At the time of writing, this sample does not support retraining.


Input to the model is an array of one or more tuples of image binary data and a binary mime type. The position of each image within the array is used in the output signature as a zero-based index; for example, if three images were sent, the output probabilities would carry the index values 0, 1, and 2. The probabilities are normalized to sum to 1.0 across all values so that they can be used as relative confidence scores.
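As a minimal sketch of this I/O contract in plain Python (the real interface is protobuf-based; the function names and raw scores below are illustrative assumptions, not part of the actual API):

```python
import mimetypes

def load_image_tuple(path):
    """Read an image file into the (binary data, mime type) pair the model expects."""
    mime, _ = mimetypes.guess_type(path)
    with open(path, "rb") as f:
        return (f.read(), mime or "application/octet-stream")

def normalize_scores(raw_scores):
    """Normalize raw per-class scores so they sum to 1.0 (relative confidences)."""
    total = sum(raw_scores.values())
    return {label: score / total for label, score in raw_scores.items()}

# Hypothetical raw scores for the image at index 0 of a batch:
scores = normalize_scores({"cat": 2.0, "dog": 1.0, "car": 1.0})
# scores["cat"] == 0.5, and the values sum to 1.0
```

The zero-based index in the output simply mirrors each tuple's position in the input array, so a batch of three images yields score rows indexed 0, 1, and 2.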

A web demo is included with the source code, available via the Acumos Gerrit repository or the mirrored Acumos GitHub repository. It uses a protobuf JavaScript library and feeds frames captured from a few sample videos to the model, displaying the top N detected classification scores, as illustrated in the model image.

Once deployed, you can quickly jump to the default webhost page and point to your model for a demo; see Demonstrations: Tutorial for Image Classification Models in the Tutorials.


Formal performance is not provided here because this is a wrapped, pre-generated model, but the original authors point to these sources for information.

Error rates are actually slightly lower than the listed error rates on the non-blacklisted subset of the ILSVRC2012 validation dataset (single crop):

  • Top@1 Error: 20.0%
  • Top@5 Error: 5.0%
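For reference, a Top@k error occurs when the true label is absent from the k highest-scoring classes. A small illustrative check (the class names and scores below are made up, not model output):

```python
def top_k_correct(probs, true_label, k):
    """True if true_label appears among the k highest-scoring classes."""
    ranked = sorted(probs, key=probs.get, reverse=True)
    return true_label in ranked[:k]

# Hypothetical normalized scores for one image:
probs = {"tabby": 0.6, "tiger cat": 0.3, "lynx": 0.05, "sports car": 0.05}
top_k_correct(probs, "tiger cat", 1)  # False: "tabby" ranks first
top_k_correct(probs, "tiger cat", 5)  # True: within the top five
```

The Top@1 and Top@5 error rates above are simply the fraction of validation images for which these checks fail at k=1 and k=5, respectively.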

More Information

Enhancements to this model may include additional training capabilities or adaptation to new model weights (and classes) when available.

Source Installation

This section is useful for source-based installations and is not generally intended for catalog documentation.

Package dependencies

Package dependencies for the core code and testing have been flattened into a single file for convenience. Instead of installing this package into your local environment, execute the command below.

Note: If you are using an anaconda-based environment, you may want to try installing with conda first and then pip.

conda install --yes --file requirements.txt  # suggested first step if you're using conda

Installation of the package requirements for a new environment.

pip install -r requirements.txt


This package contains runnable scripts for command-line evaluation, packaging of a model (both dumping and pushing), and simple web-test uses. All functionality is encapsulated in the script, which accepts the following arguments.

usage: [-h] [-m MODEL_PATH] [-i IMAGE] [-I IMAGE_LIST]
                     [-p PREDICT_PATH] [-f {keras,tensorflow}]
                     [-C CUDA_ENV] [-l LABEL_PATH]
                     [-n NUM_TOP_PREDICTIONS] [-a PUSH_ADDRESS]
                     [-A AUTH_ADDRESS] [-d DUMP_MODEL]

optional arguments:
  -h, --help            show this help message and exit

main execution and evaluation functionality:
  -m MODEL_PATH, --model_path MODEL_PATH
                        Path to read and store image model. (created if not
  -i IMAGE, --image IMAGE
                        Absolute path to image file. (for now must be a jpeg)
  -I IMAGE_LIST, --image_list IMAGE_LIST
                        To batch process multiple images in one load
  -p PREDICT_PATH, --predict_path PREDICT_PATH
                        Optional place to save intermediate predictions from
  -l LABEL_PATH, --label_path LABEL_PATH
                        Path to class label file for output columns, unnamed
                        if empty (i.e. data/keras_class_names.txt).

model creation and configuration options:
  -f {keras,tensorflow}, --framework {keras,tensorflow}
                        Underlying framework to utilize
  -C CUDA_ENV, --cuda_env CUDA_ENV
                        Anything special to inject into CUDA_VISIBLE_DEVICES
                        environment string
  -n NUM_TOP_PREDICTIONS, --num_top_predictions NUM_TOP_PREDICTIONS
                        Display this many predictions. (0=disable)
  -a PUSH_ADDRESS, --push_address PUSH_ADDRESS
                        server address to push the model (e.g.
  -A AUTH_ADDRESS, --auth_address AUTH_ADDRESS
                        server address for login and push of the model (e.g.
  -d DUMP_MODEL, --dump_model DUMP_MODEL
                        dump model to a directory for local running

Example Usages

Please consult the Tutorials directory for usage examples.

Release Notes

The Image Classification Release Notes catalog additions and modifications across versions.