wafa_ath
DevHeads IoT Integration Server
Created by wafa_ath on 9/9/2024 in #middleware-and-os
Fixing INT8 Quantization Error for Depthwise Conv2D Layers
Hey everyone, thanks for the previous suggestions on tackling the inference timeout issue in my vibration anomaly detection project. I implemented quantization to optimize the model, but now I'm encountering a new error:
Quantization Error: Unsupported Layer Type in INT8 Conversion - Layer 5 (Depthwise Conv2D)
The quantization process seems to fail specifically at Layer 5, which uses a Depthwise Conv2D operation. What's the best way to handle layers that aren't compatible with INT8 quantization? Should I retrain with a different architecture, or is there a workaround to adjust these layers manually? I've sketched my current conversion setup below for context. Thanks in advance for your help!
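For reference, here is roughly the conversion flow I'm using, written against the TensorFlow Lite converter (my actual toolchain may differ, so treat this as a sketch). The tiny stand-in model and random calibration data are just placeholders for my real vibration model and dataset, and the commented-out `supported_ops` block is the mixed INT8/float fallback I'm asking about:

```python
import numpy as np
import tensorflow as tf

# Stand-in model with a DepthwiseConv2D layer, only to illustrate the flow;
# my real vibration anomaly model is larger but similar in structure.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(128, 1, 1)),
    tf.keras.layers.Conv2D(8, (3, 1), padding="same", activation="relu"),
    tf.keras.layers.DepthwiseConv2D((3, 1), padding="same", activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(2, activation="softmax"),
])

def representative_data_gen():
    # Stand-in calibration data; in my project these are real vibration windows.
    for _ in range(100):
        yield [np.random.rand(1, 128, 1, 1).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_data_gen

# Strict full-integer path: every op must have an INT8 kernel,
# which is where the Depthwise Conv2D conversion fails for me.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

# Workaround I'm wondering about: also allow float builtins so unsupported
# layers fall back to float32 instead of failing the whole conversion
# (input/output would then typically stay float32 as well).
# converter.target_spec.supported_ops = [
#     tf.lite.OpsSet.TFLITE_BUILTINS_INT8,
#     tf.lite.OpsSet.TFLITE_BUILTINS,
# ]

tflite_model = converter.convert()
with open("anomaly_model_int8.tflite", "wb") as f:
    f.write(tflite_model)
```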
3 replies