quantize_model() rejects a valid standalone Keras 3 Sequential model #1270

@griffinstalha

Description

Describe the bug
tfmot.quantization.keras.quantize_model() rejects a valid standalone Keras 3 keras.Sequential model with:
ValueError: 'to_quantize' can only either be a keras Sequential or Functional model.

This appears to be a compatibility/type-check issue between standalone Keras 3 and TensorFlow Model Optimization Toolkit (TFMOT). The model is a real keras.Sequential instance, but TFMOT does not recognize it as a compatible Sequential model.
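To illustrate the suspected mechanism, here is a minimal, self-contained sketch. The class names are stand-ins, not the real Keras or TFMOT classes: when two packages each define their own Sequential class, an isinstance check against one never matches instances of the other.

```python
# Stand-in for keras.Sequential (standalone Keras 3).
class KerasSequential:
    pass

# Stand-in for the Keras 2 Sequential class TFMOT type-checks against.
class TfKerasSequential:
    pass

model = KerasSequential()
print(isinstance(model, KerasSequential))    # True
print(isinstance(model, TfKerasSequential))  # False -> model rejected
```

If TFMOT's check is written against the compat-Keras classes, a genuine Keras 3 model would fail it in exactly this way.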

System information

TensorFlow version (installed from source or binary):
2.17.0 (installed from binary / pip wheel)

TensorFlow Model Optimization version (installed from source or binary):
0.8.0 (installed from binary / pip wheel)

Python version:
3.11.15

Describe the expected behavior
A valid standalone Keras 3 keras.Sequential model should be accepted by tfmot.quantization.keras.quantize_model() or the API/docs should clearly state that standalone Keras 3 models are unsupported.

Describe the current behavior
A standalone Keras 3 keras.Sequential model is created successfully, but calling:
tfmot.quantization.keras.quantize_model(model)
raises:
ValueError: 'to_quantize' can only either be a keras Sequential or Functional model.
In the same run:

  • isinstance(model, keras.Sequential) is True
  • the model type is keras.src.models.sequential.Sequential
  • TFMOT’s compat-Keras type check does not recognize it as compatible

Code to reproduce the issue

import tensorflow as tf
import keras  # standalone Keras 3
from keras import layers
import tensorflow_model_optimization as tfmot

# Minimal, valid Keras 3 Sequential model.
model = keras.Sequential(
    [
        keras.Input(shape=(100,)),
        layers.Dense(10, activation="relu"),
        layers.Dense(2, activation="sigmoid"),
    ]
)

print("TF:", tf.__version__)
print("Keras:", keras.__version__)
print("TFMOT:", tfmot.__version__)
print("Model type:", type(model))
print("isinstance(model, keras.Sequential):", isinstance(model, keras.Sequential))

# Raises ValueError here, even though the isinstance check above prints True.
quantized_model = tfmot.quantization.keras.quantize_model(model)
print(quantized_model)

Observed output:

TF: 2.17.0
Keras: 3.13.2
TFMOT: 0.8.0
Model type: <class 'keras.src.models.sequential.Sequential'>
isinstance(model, keras.Sequential): True
ValueError: `to_quantize` can only either be a keras Sequential or Functional model.

Screenshots
N/A

Additional context
This does not appear to be GPU-specific. The same failure occurs before any actual GPU-dependent computation.

My environment during reproduction:

- keras==3.13.2
- tf_keras==2.17.0
- tensorflow==2.17.0
- tensorflow-model-optimization==0.8.0
- python==3.11.15

I also observed that the model is recognized as a standalone Keras Sequential model, but not as a TFMOT compat-Keras Sequential model. This suggests a type-identity mismatch between standalone Keras 3 and the Keras compatibility layer used inside TFMOT.

Note: this looks like a bug in TFMOT's type checking rather than a model-construction bug in Keras.
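Until TFMOT accepts Keras 3 models directly, a commonly suggested workaround (an assumption on my part, not verified against this exact setup) is to force TensorFlow onto the legacy Keras 2 implementation that TFMOT was built against:

```shell
pip install tf_keras          # legacy Keras 2 package (already present in my env above)
export TF_USE_LEGACY_KERAS=1  # must be set before importing tensorflow
```

With this flag set, the model would need to be built via tf.keras (which then resolves to tf_keras) rather than standalone keras for quantize_model to accept it.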

More details about my environment:

OS: Linux
Install method: pip wheels
Reproducible: yes
GPU required: no
