Comments (6)
Hi @MuhammadBilal848 ,
TF2.16 uses Keras 3 by default, and in Keras 3 saving to the TF SavedModel format via model.save() is no longer supported. Please refer to the migration guide for more details.
from tensorflow.
Hi @MuhammadBilal848 ,
Thanks for the confirmation, and happy that it worked. Could you please mark this issue as closed? Thanks!
My thinking is that this error is probably due to multiple issues. I can offer a few ideas, and anyone is free to correct me if I'm wrong.
First, I would check whether your TensorFlow build matches the compiler versions you're using; as the log itself suggests, rebuilding TensorFlow with the appropriate compiler flags may help.
Second, I would double-check that the versions you're combining are even compatible. That could be causing issues.
So I trained another model just to check and used model.export("FOLDER_NAME") instead of model.save("model.h5").
Got this as output, and the folder is saved with assets, variables, a .pb file, and a fingerprint.
I loaded the model using tf.keras.layers.TFSMLayer("FOLDER_NAME", call_endpoint="serving_default") and it worked. @SuryanarayanaY Thank you 🖤
Also, could you tell me how I can convert the .pb model to .h5? I want to convert the model to tflite.
Got:
To enable the following instructions: AVX2 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
WARNING: All log messages before absl::InitializeLog() is called are written to STDERR
W0000 00:00:1714042377.283703 4504 tf_tfl_flatbuffer_helpers.cc:390] Ignored output_format.
W0000 00:00:1714042377.284215 4504 tf_tfl_flatbuffer_helpers.cc:393] Ignored drop_control_dependency.
2024-04-25 15:52:57.285827: I tensorflow/cc/saved_model/reader.cc:83] Reading SavedModel from: C:\Users\Bilal\AppData\Local\Temp\tmphhpbtz_x
2024-04-25 15:52:57.287069: I tensorflow/cc/saved_model/reader.cc:51] Reading meta graph with tags { serve }
2024-04-25 15:52:57.287242: I tensorflow/cc/saved_model/reader.cc:146] Reading SavedModel debug info (if present) from: C:\Users\Bilal\AppData\Local\Temp\tmphhpbtz_x
2024-04-25 15:52:57.300472: I tensorflow/compiler/mlir/mlir_graph_optimization_pass.cc:388] MLIR V1 optimization pass is not enabled
2024-04-25 15:52:57.306249: I tensorflow/cc/saved_model/loader.cc:234] Restoring SavedModel bundle.
2024-04-25 15:52:57.378809: I tensorflow/cc/saved_model/loader.cc:218] Running initialization op on SavedModel bundle at path: C:\Users\Bilal\AppData\Local\Temp\tmphhpbtz_x
2024-04-25 15:52:57.391327: I tensorflow/cc/saved_model/loader.cc:317] SavedModel load for tags { serve }; Status: success: OK. Took 105497 microseconds.
2024-04-25 15:52:57.412351: I tensorflow/compiler/mlir/tensorflow/utils/dump_mlir_util.cc:268] disabling MLIR crash reproducer, set env var `MLIR_CRASH_REPRODUCER_DIRECTORY` to enable.
Traceback (most recent call last):
File "F:\Projects\Trigger Word Detection\converter.py", line 10, in <module>
tflite_model = converter.convert()
File "F:\Projects\Trigger Word Detection\twd\lib\site-packages\tensorflow\lite\python\lite.py", line 1175, in wrapper
return self._convert_and_export_metrics(convert_func, *args, **kwargs)
File "F:\Projects\Trigger Word Detection\twd\lib\site-packages\tensorflow\lite\python\lite.py", line 1129, in _convert_and_export_metrics
result = convert_func(self, *args, **kwargs)
File "F:\Projects\Trigger Word Detection\twd\lib\site-packages\tensorflow\lite\python\lite.py", line 1636, in convert
saved_model_convert_result = self._convert_as_saved_model()
File "F:\Projects\Trigger Word Detection\twd\lib\site-packages\tensorflow\lite\python\lite.py", line 1617, in _convert_as_saved_model
return super(TFLiteKerasModelConverterV2, self).convert(
File "F:\Projects\Trigger Word Detection\twd\lib\site-packages\tensorflow\lite\python\lite.py", line 1407, in convert
result = _convert_graphdef(
File "F:\Projects\Trigger Word Detection\twd\lib\site-packages\tensorflow\lite\python\convert_phase.py", line 212, in wrapper
raise converter_error from None # Re-throws the exception.
File "F:\Projects\Trigger Word Detection\twd\lib\site-packages\tensorflow\lite\python\convert_phase.py", line 205, in wrapper
return func(*args, **kwargs)
File "F:\Projects\Trigger Word Detection\twd\lib\site-packages\tensorflow\lite\python\convert.py", line 995, in convert_graphdef
data = convert(
File "F:\Projects\Trigger Word Detection\twd\lib\site-packages\tensorflow\lite\python\convert.py", line 367, in convert
raise converter_error
tensorflow.lite.python.convert_phase.ConverterError: Could not translate MLIR to FlatBuffer.
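For reference, converting to TFLite does not require an .h5 round trip: the exported SavedModel folder can be fed straight to the converter. A hedged sketch (the small Dense model and the "exported_model" path are placeholders; enabling select TF ops is one common workaround when an op has no native TFLite kernel and the MLIR-to-FlatBuffer translation fails):

```python
import tensorflow as tf

# Stand-in model so the snippet is self-contained
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(1),
])
model.export("exported_model")  # Keras 3 SavedModel export

# Convert the SavedModel folder directly, skipping the .h5 step
converter = tf.lite.TFLiteConverter.from_saved_model("exported_model")

# If some ops lack a native TFLite kernel, allowing select TF ops can
# avoid conversion failures (at the cost of a larger runtime)
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,
    tf.lite.OpsSet.SELECT_TF_OPS,
]

tflite_model = converter.convert()
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```

Whether this resolves the "Could not translate MLIR to FlatBuffer" error depends on which ops the actual model uses.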
I tried this method and it worked: