Comments (7)
I also noticed inference time increasing significantly, making live detection with anything more than a super primitive model very ugly. Do you remember which version the inference started increasing in, @yingeo?
from flutter-tflite.
I looked into it, and it seems the issue is in Flutter. I forked an older repo and tested it:
Environment
This was tested on the following environment:
- Built on Windows 10 with Java 11.0.12
- Run on Pixel 5 (Android 14)
I just ran detection repeatedly on the same blank(ish) camera image. Not very scientific (e.g. the phone might heat up and produce lower results), so a few ms of difference should not be taken too seriously, but a 7x or 16x difference is significant.
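The timings below were collected roughly along these lines; this is a minimal sketch, not the exact code from the benchmark repo. The model asset name and the [1, 300, 300, 3] input shape are placeholder assumptions:

```dart
// Minimal inference-timing sketch with tflite_flutter.
// 'assets/detect.tflite' and the input shape are placeholders.
import 'package:tflite_flutter/tflite_flutter.dart';

Future<void> main() async {
  final interpreter = await Interpreter.fromAsset('assets/detect.tflite');

  // Dummy zero input shaped [1, 300, 300, 3] (SSD MobileNet style).
  final input =
      List.filled(1 * 300 * 300 * 3, 0.0).reshape([1, 300, 300, 3]);

  // Output buffer sized from the model's first output tensor.
  final outputShape = interpreter.getOutputTensor(0).shape;
  final output = List.filled(outputShape.reduce((a, b) => a * b), 0.0)
      .reshape(outputShape);

  final sw = Stopwatch();
  for (var i = 0; i < 50; i++) {
    sw
      ..reset()
      ..start();
    interpreter.run(input, output); // inference only, no pre-processing
    sw.stop();
    print('inference: ${sw.elapsedMilliseconds} ms');
  }
  interpreter.close();
}
```

Pre-processing and total predict times were measured the same way, with separate stopwatches around each stage.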
Results
Average times after the numbers settle (all times in ms).
Test | Inference | Pre-processing | Total predict | Total elapsed
---|---|---|---|---
tflite_flutter 0.9.5, Flutter 3.13.9 | 22 | 17 | 39 | 51
tflite_flutter 0.9.5, Flutter 3.16.9 | 359 | 18 | 378 | 387
tflite_flutter 0.10.4, Flutter 3.13.9 | 22 | 15 | 38 | 50
tflite_flutter 0.10.4, Flutter 3.16.9 | 351 | 16 | 368 | 376
Conclusion
- No difference between tflite_flutter 0.9.5 and 0.10.4
- Big difference between Flutter 3.13.9 and Flutter 3.16.9: 16x slower inference time and 7.5x slower total time
Flutter 3.19.3 seemed to be as slow as 3.16.9, so the problem does not appear to be fixed yet.
My repo: https://github.com/timukasr/object_detection_flutter
Similar results can be reproduced with the tflite_flutter sample app live_object_detection_ssd_mobilenet. That app seems a bit broken (it does not display results on screen), but it reports inference time: with Flutter 3.13.9 it was 170-200 ms, with Flutter 3.19.3 it was 500-550 ms. For some reason it is slow even with the older Flutter, but even slower with the newer one.
@timukasr Thanks a lot for your insights! I didn't even try different Flutter versions. I noticed an inference increase just by changing the tflite_flutter version when I upgraded to 0.10.4 (from 0.10.0, I think). I will continue to investigate this when I find time.
I take it all back: I was not able to reproduce significant inference differences between the 0.10.x versions. The massive spikes caused by different Flutter versions should still be addressed at some point.
It has nothing to do with the version of tflite_flutter, only with the version of Flutter. Currently I use tflite_flutter 0.10.4 with Flutter 3.7.12, and the inference time is consistent with native Android speed. If you upgrade Flutter to the latest version, the inference speed becomes terribly slow.
@yingeo Did you compare 3.7.12 with other versions (such as 3.13.9)? I'm having a hard time after downgrading to 3.7.12 (several configs need to be changed, lol).
I tried tflite_flutter 0.10.4 + Flutter 3.13.9 + video classification (MoviNet) + isolateInterpreter.runForMultipleInputs.
I can't find any difference between 3.13.9 and the latest version of Flutter.
My conclusion is that tflite_flutter is currently not optimized for some models/operators: one inference step (on a single frame image) takes more than 3000 ms on a Samsung Galaxy S8 and more than 2000 ms on a Samsung Galaxy Z Flip 3.
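For reference, the IsolateInterpreter setup mentioned above looks roughly like this in tflite_flutter 0.10.x; the asset name is a placeholder, and the input/output buffers are whatever the model expects:

```dart
// Sketch of running inference through an IsolateInterpreter so heavy
// models don't jank the UI thread. 'assets/movinet.tflite' is a
// placeholder asset name.
import 'package:tflite_flutter/tflite_flutter.dart';

Future<void> classify(List<Object> inputs, Map<int, Object> outputs) async {
  final interpreter = await Interpreter.fromAsset('assets/movinet.tflite');

  // Wraps the interpreter's native address in a background isolate.
  final isolateInterpreter =
      await IsolateInterpreter.create(address: interpreter.address);

  // Unlike Interpreter, this call is async and returns a Future.
  await isolateInterpreter.runForMultipleInputs(inputs, outputs);

  isolateInterpreter.close();
  interpreter.close();
}
```

Running in an isolate keeps the UI responsive, but it does not make the inference itself faster, so the per-frame times above are unaffected by it.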