
Depth Multiplier becomes 0, on conversion to TFLITE #5767

Open
@SanthoshRajendiran

Description


MODEL: DeeplabV3+ MobilenetV2 Pretrained with Pascal VOC 2012
Tensorflow Version: 1.12.0
TFLITE Version: 1.12.0
Bazel Version: 0.17.2
OS version: Ubuntu 16.04
GPU Version: GTX 1080 Ti
Command to reproduce:

bazel run //tensorflow/lite/toco:toco -- \
  --input_file=frozen_inference_graph.pb \
  --output_file=converted.tflite \
  --input_format=TENSORFLOW_GRAPHDEF \
  --output_format=TFLITE \
  --input_arrays=ImageTensor \
  --output_arrays=SemanticPredictions \
  --input_shapes=1,513,513,3 \
  --allow_custom_ops \
  --inference_type=FLOAT \
  --inference_input_type=QUANTIZED_UINT8

(or)

CUDA_VISIBLE_DEVICES="0" tflite_convert \
  --output_file=converted.tflite \
  --graph_def_file=frozen_inference_graph.pb \
  --inference_type=FLOAT \
  --inference_input_type=QUANTIZED_UINT8 \
  --input_arrays=ImageTensor \
  --output_arrays=SemanticPredictions \
  --input_shapes=1,513,513,3 \
  --mean_values=513 \
  --std_dev_values=513
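The same conversion can also be attempted through the Python API, which sometimes gives more useful error output than the CLI. A minimal sketch assuming TF 1.12, where the converter lives under `tf.contrib.lite` (it moved to `tf.lite` in later releases); the file names and flag values mirror the commands above:

```python
def convert_deeplab(graph_def_file="frozen_inference_graph.pb",
                    output_file="converted.tflite"):
    # TF 1.12 assumed; import kept inside the function so the sketch
    # can be read without TensorFlow installed.
    import tensorflow as tf

    converter = tf.contrib.lite.TFLiteConverter.from_frozen_graph(
        graph_def_file,
        input_arrays=["ImageTensor"],
        output_arrays=["SemanticPredictions"],
        input_shapes={"ImageTensor": [1, 513, 513, 3]},
    )
    # Float inference with a quantized uint8 input, mirroring the
    # --inference_type / --inference_input_type flags above.
    converter.inference_type = tf.float32
    converter.inference_input_type = tf.uint8
    # (mean, std dev) for the uint8 input, as in --mean_values/--std_dev_values.
    converter.quantized_input_stats = {"ImageTensor": (513.0, 513.0)}
    converter.allow_custom_ops = True

    tflite_model = converter.convert()
    with open(output_file, "wb") as f:
        f.write(tflite_model)


if __name__ == "__main__":
    convert_deeplab()
```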

Benchmarking Command:

bazel run -c opt tensorflow/lite/tools/benchmark:benchmark_model -- \
  --graph="$(realpath converted.tflite)"

After converting the pre-trained models from the model zoo to TFLITE, both benchmarking and running on Android fail because the depth multiplier value is converted to 0.

Stack Overflow Issue: https://stackoverflow.com/questions/53228969/unable-to-test-and-deploy-a-deeplabv3-mobilenetv2-tensorflow-lite-segmentation-m
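To illustrate why a depth multiplier of 0 is fatal: in a depthwise convolution, each input channel is expanded into `depth_multiplier` output channels, so a multiplier of 0 leaves the op with no output channels at all. A plain-Python sketch of that channel arithmetic (not the TFLite kernel itself):

```python
def depthwise_output_channels(in_channels, depth_multiplier):
    """Output channel count of a depthwise convolution layer."""
    return in_channels * depth_multiplier


# MobilenetV2's depthwise layers use depth_multiplier=1,
# so a 32-channel input stays at 32 channels.
print(depthwise_output_channels(32, 1))  # -> 32

# With the corrupted value of 0, every depthwise layer would
# produce zero output channels, which no runtime can execute.
print(depthwise_output_channels(32, 0))  # -> 0
```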


Labels

comp:lite (TF Lite issues), models:research (models that come under research directory)
