Comments (15)
Could you please provide the error message?
from onnx-simplifier.
Simplifying...
2019-09-23 23:08:05.474371187 [E:onnxruntime:, sequential_executor.cc:127 Execute] Non-zero status code returned while running Node: 2 Status Message: Invalid input shape: {0,0}
Traceback (most recent call last):
File "/usr/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/usr/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/usr/local/lib/python3.6/dist-packages/onnxsim/__main__.py", line 24, in <module>
main()
File "/usr/local/lib/python3.6/dist-packages/onnxsim/__main__.py", line 17, in main
model_opt = onnxsim.simplify(args.input_model, check_n=args.check_n, perform_optimization=not args.skip_optimization)
File "/usr/local/lib/python3.6/dist-packages/onnxsim/onnx_simplifier.py", line 200, in simplify
res = forward_all(model_opt)
File "/usr/local/lib/python3.6/dist-packages/onnxsim/onnx_simplifier.py", line 107, in forward_all
res = forward(model)
File "/usr/local/lib/python3.6/dist-packages/onnxsim/onnx_simplifier.py", line 99, in forward
res = OrderedDict(zip(outputs, sess.run(outputs, inputs)))
File "/usr/local/lib/python3.6/dist-packages/onnxruntime/capi/session.py", line 72, in run
return self._sess.run(output_names, input_feed, run_options)
RuntimeError: Method run failed due to: [ONNXRuntimeError] : 2 : INVALID_ARGUMENT : Non-zero status code returned while running Node: 2 Status Message: Invalid input shape: {0,0}
It seems dynamic input shapes are not supported by onnxruntime.
But actually, onnxruntime can run inference on models with dynamic input shapes correctly.
@jinfagang I think your source code is not correct; please post it.
It's not about the source code. I simply converted maskrcnn.onnx from the ONNX model zoo. You can try simplifying it, but it's challenging.
Currently this model is generated with an overly complicated node structure; the reason for simplifying is that I want to convert it to TensorRT, which can't be done without some model surgery on it.
I am able to make it run forward by sending a numpy.array with a fixed dimension converted from my input image. onnx-sim doesn't handle dynamic input, so it always gets a (3,0,0) input size.
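The workaround described above, fixing the dynamic dimensions by feeding an array with a concrete shape, can be sketched as follows. This is a minimal sketch: the function names and the model path are hypothetical, and the real MaskRCNN preprocessing (resizing, mean subtraction) is omitted.

```python
import numpy as np

def prepare_input(image_hwc):
    """Convert an HWC uint8 image into a CHW float32 array.
    The resulting array has a concrete (3, H, W) shape, so no
    dynamic dimensions are left for the runtime to guess."""
    return image_hwc.transpose(2, 0, 1).astype(np.float32)

def run_model(model_path, image_hwc):
    # Assumption: onnxruntime is installed; model_path is a placeholder.
    import onnxruntime as ort
    sess = ort.InferenceSession(model_path)
    input_name = sess.get_inputs()[0].name
    return sess.run(None, {input_name: prepare_input(image_hwc)})
```

Because the session receives a tensor with fully concrete dimensions, onnxruntime can infer all intermediate shapes for that single run, even though the graph itself declares the input as dynamic.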
I caught some weird Constant ops:
input: "605"
output: "913"
name: "911"
op_type: "Constant"
attribute {
name: "axis"
i: 0
type: INT
}
It seems these can neither be simplified nor left alone. I can run onnx-sim's forward pass now, but I can't remove all Constant layers.
Eliminating these nodes raises an error in this code:
elem_type = get_elem_type(model, node.output[0])
if elem_type != None:
    # print(node)
    # default is TENSOR, INT will not trace here
    shape = res[node.output[0]].shape
    new_attr = onnx.helper.make_attribute(
        'value',
        onnx.helper.make_tensor(
            name=node.output[0],
            data_type=elem_type,
            dims=shape,
            vals=np.array(res[node.output[0]]).flatten().astype(get_np_type_from_elem_type(elem_type))
        ))
    del node.input[:]
    del node.attribute[:]
    node.attribute.extend([new_attr])
Nodes with the structure above have no elem_type.
@jinfagang You are right, onnxsim cannot handle dynamic input shapes currently. I'll try to support it when I have time.
@daquexian May I ask: from your code I only saw constant simplification (clean Const and eliminate Const), which basically converts constants with tensor raw data into a combined Const node. How are you able to simplify these (shape -> gather -> unsqueeze) into a single reshape node? Correct me if I'm wrong.
@jinfagang Great question. As long as the input shape is determined, the outputs of the shape, gather and unsqueeze ops will all be marked as const in onnxsim and replaced by constant ops.
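The folding described above can be illustrated with a small numpy sketch (this mimics what each op computes, not onnxsim's actual internals): with a static input shape, every value in a Shape -> Gather -> Unsqueeze chain is known at simplification time, so the whole chain collapses to a single constant.

```python
import numpy as np

x = np.zeros((1, 3, 224, 224), dtype=np.float32)  # static input shape

shape = np.array(x.shape, dtype=np.int64)   # Shape op
gathered = shape[2]                         # Gather(axis=0, indices=2)
unsqueezed = np.expand_dims(gathered, 0)    # Unsqueeze(axes=[0])

# 'unsqueezed' is always [224] for this input shape, so the three ops
# can be replaced by one Constant node holding that value.
```

With a dynamic input, `shape` is unknown until runtime, which is exactly why this folding fails on graphs like maskrcnn.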
I see... so a dynamic-input graph such as maskrcnn is impossible to simplify in this way, because every intermediate output shape varies across different images, or even the same image at different input sizes.
Do you think it is still necessary to simplify it in this situation?
The essential point of onnxsim for me is that I can convert a simplified onnx model to TensorRT, while before simplification it simply can't be converted. What do you think is the root reason behind this?
Do you think it is still necessary to simplify it in this situation?
Sorry for the late reply. It depends. Ops performed on const data (e.g., weights and biases) will also be eliminated by simplification, regardless of whether the input shape is static.
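The point above can be sketched in numpy (illustrative only, not onnxsim internals): an op whose inputs are all constant initializers can be pre-computed once during simplification, even if the graph's data input has a dynamic shape.

```python
import numpy as np

# A constant weight initializer in the graph.
W = np.arange(6, dtype=np.float32).reshape(2, 3)

# A Transpose node applied only to W depends on no runtime input,
# so the simplifier can evaluate it offline ("constant folding").
W_folded = W.T

# The simplified graph keeps only W_folded as an initializer and
# drops the Transpose node entirely.
```

This is why simplification still helps dynamic-input models: weight-only subgraphs shrink even when shape-dependent chains cannot be folded.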
What do you think is the root reason behind this?
Could you provide the error log for the failure case? It would be helpful for finding the reason.
Some large models, such as maskrcnn.onnx, cannot be simplified, or are hard to simplify.
Some large models, such as maskrcnn.onnx, cannot be simplified, or are hard to simplify.
Could you please open a separate issue for it? Thanks! This issue is being closed since onnxsim now supports setting the input shape manually as of v0.1.9.
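For reference, a usage sketch of the manual input-shape option mentioned above. The exact flag name and value syntax here are an assumption for v0.1.9; check `python3 -m onnxsim --help` for the form your installed version accepts.

```
# Hypothetical invocation: fix the dynamic input to 1x3x800x800 before simplifying
python3 -m onnxsim input.onnx output_simplified.onnx --input-shape 1,3,800,800
```

With the shape pinned this way, the shape-dependent chains discussed earlier become foldable constants.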
https://github.com/onnx/models/tree/master/vision/object_detection_segmentation/faster-rcnn
What `--input-shape` did you use? Thanks. I want to convert the simplified model to Caffe; do you have any good suggestions?