Comments (25)
Hey @soudalan. That's right, currently we only have pre-built packages published for Python on Linux, macOS and Windows. We're working to bring the rest of the Continuous Integration system up to speed. Windows support has just been merged today, in fact. I'll close this bug once the Java builds are running and the package is available.
from stt.
Thanks for clarifying, looking forward to it! I assume in the meantime, using the old DeepSpeechDemo is my only alternative for Android then?
That, or building from source.
How would I build from source? I'm not sure if there's a wiki or page anywhere explaining the process.
Follow the building docs until the configure step, then switch to the Android build steps, which show how to build `libstt.so` and then `libstt.aar`. But as I'm taking a look now, I'm realizing the state of building from source is not ideal: you'll need to manually copy the built `libstt.so` into the correct `libstt/libs` folder, and then also edit `native_client/java/libstt/gradle.properties` to keep only the architecture you're interested in, before you run the `./gradlew libstt:build` step.
Hope this helps. If you try it and run into any problems please let me know so I can improve the docs. I'll also file an issue for improving the docs and the build system here to make it easier to build from source.
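For concreteness, the manual steps above can be sketched roughly like this. This is a hedged sketch: the paths assume a default STT checkout layout, and the sandbox directory below only exists so the copy step is safe to run verbatim.

```shell
ARCH=armeabi-v7a                 # the ABI you built libstt.so for (assumption)
STT=$(mktemp -d)                 # stand-in for your real STT checkout

# Pretend bazel already produced libstt.so under the tensorflow submodule:
mkdir -p "$STT/tensorflow/bazel-bin/native_client"
touch "$STT/tensorflow/bazel-bin/native_client/libstt.so"

# 1. Manually copy the built library into the Java project's libs folder:
mkdir -p "$STT/native_client/java/libstt/libs/$ARCH"
cp "$STT/tensorflow/bazel-bin/native_client/libstt.so" \
   "$STT/native_client/java/libstt/libs/$ARCH/"

# 2. Edit native_client/java/libstt/gradle.properties so only $ARCH remains,
#    then build the AAR (not run in this sandbox):
#      cd "$STT/native_client/java" && ./gradlew libstt:build
ls "$STT/native_client/java/libstt/libs/$ARCH"
```

In a real checkout, `bazel-bin` is where bazel leaves the output of the `//native_client:libstt.so` target; if yours differs, `find tensorflow/bazel-bin -name libstt.so` will locate it.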
I forgot to ask this earlier, but it seems I will need a GPU to build from source? And assuming I don't have a GPU available, are there any other alternatives?
No, you don't need a GPU to build from source. What made you think that?
Got it. The setup for TensorFlow asked for a path to my CUDA installation, so I wasn't sure if I would also need a GPU prepared as well.
Ah, yes. You need CUDA installed to make GPU builds, but no actual GPUs are required. And you can also just not enable CUDA in the configure step to do CPU-only builds.
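For reference, the CPU-only path can also be chosen non-interactively. This is just a sketch: `TF_NEED_CUDA` is the environment variable TensorFlow's `./configure` script reads for the CUDA question; everything else stays as in your normal setup.

```shell
# Setting TF_NEED_CUDA=0 answers the CUDA question up front, so ./configure
# never asks for CUDA paths and the resulting build is CPU-only. No CUDA
# toolkit and no GPU are needed.
export TF_NEED_CUDA=0
# ./configure        # not run here; it would now skip all CUDA path prompts
echo "TF_NEED_CUDA=$TF_NEED_CUDA"
```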
Under the "Building libstt.so for Android" section, when I run the command

```
bazel build --workspace_status_command="bash native_client/bazel_workspace_status_cmd.sh" --config=monolithic --config=android --config=android_arm --define=runtime=tflite --action_env ANDROID_NDK_API_LEVEL=21 --cxxopt=-std=c++14 --copt=-D_GLIBCXX_USE_C99 //native_client:libstt.so
```

I get the error:

```
ERROR: Skipping '//native_client:libstt.so': no such package 'native_client': BUILD file not found in any of the following directories. Add a BUILD file to a directory to mark it as a package.
```

This is after the symlink command `ln -s ../native_client .`. What BUILD file do I need to add exactly?
If you created the symlink inside the `tensorflow` folder and you're running `bazel build` from the `tensorflow` folder, that's all that should be needed.
Hmm, that's what I just did, but it's still throwing the error.
Are you on Windows by any chance?
No, I'm on macOS (Big Sur).
```
ERROR: Skipping '//native_client:libstt.so': no such package 'native_client': BUILD file not found in any of the following directories. Add a BUILD file to a directory to mark it as a package.
 - /Users/jlam/Documents/coqui/tensorflow/native_client
WARNING: Target pattern parsing failed.
ERROR: no such package 'native_client': BUILD file not found in any of the following directories. Add a BUILD file to a directory to mark it as a package.
 - /Users/jlam/Documents/coqui/tensorflow/native_client
INFO: Elapsed time: 0.156s
INFO: 0 processes.
FAILED: Build did NOT complete successfully (0 packages loaded)
jlam-macbookpro% ls
ACKNOWLEDGMENTS     LICENSE             configure.py
AUTHORS             README.md           models.BUILD
BUILD               RELEASE.md          native_client
CODEOWNERS          SECURITY.md         tensorflow
CODE_OF_CONDUCT.md  WORKSPACE           third_party
CONTRIBUTING.md     arm_compiler.BUILD  tools
ISSUES.md           configure
ISSUE_TEMPLATE.md   configure.cmd
jlam-macbookpro% pwd
/Users/jlam/Documents/coqui/tensorflow
jlam-macbookpro%
```
Can you run the following commands from the `tensorflow` folder and post their output?

```
ls -lha
ls -lha native_client
```
```
jlam-macbookpro% ls -lha
total 1112
drwxr-xr-x  31 jlam  primarygroup  992B Apr 20 11:37 .
drwxr-xr-x   4 jlam  primarygroup  128B Apr 19 15:23 ..
-rw-r--r--   1 jlam  primarygroup   29K Apr 19 15:26 .bazelrc
-rw-r--r--   1 jlam  primarygroup    6B Apr 19 15:26 .bazelversion
drwxr-xr-x  13 jlam  primarygroup  416B Apr 19 15:27 .git
drwxr-xr-x   6 jlam  primarygroup  192B Apr 19 15:26 .github
-rw-r--r--   1 jlam  primarygroup  896B Apr 19 15:26 .gitignore
lrwxr-xr-x   1 jlam  primarygroup   34B Apr 19 15:26 .pylintrc -> tensorflow/tools/ci_build/pylintrc
-rw-r--r--   1 jlam  primarygroup  688B Apr 20 08:55 .tf_configure.bazelrc
-rw-r--r--   1 jlam  primarygroup  2.2K Apr 19 15:26 ACKNOWLEDGMENTS
-rw-r--r--   1 jlam  primarygroup  349B Apr 19 15:26 AUTHORS
-rw-r--r--   1 jlam  primarygroup   95B Apr 19 15:26 BUILD
-rw-r--r--   1 jlam  primarygroup  656B Apr 19 15:26 CODEOWNERS
-rw-r--r--   1 jlam  primarygroup  5.2K Apr 19 15:26 CODE_OF_CONDUCT.md
-rw-r--r--   1 jlam  primarygroup  9.7K Apr 19 15:26 CONTRIBUTING.md
-rw-r--r--   1 jlam  primarygroup  606B Apr 19 15:26 ISSUES.md
-rw-r--r--   1 jlam  primarygroup  2.2K Apr 19 15:26 ISSUE_TEMPLATE.md
-rw-r--r--   1 jlam  primarygroup   13K Apr 19 15:26 LICENSE
-rw-r--r--   1 jlam  primarygroup   21K Apr 19 15:26 README.md
-rw-r--r--   1 jlam  primarygroup  333K Apr 19 15:26 RELEASE.md
-rw-r--r--   1 jlam  primarygroup   13K Apr 19 15:26 SECURITY.md
-rw-r--r--   1 jlam  primarygroup  665B Apr 19 15:26 WORKSPACE
-rw-r--r--   1 jlam  primarygroup  1.1K Apr 19 15:26 arm_compiler.BUILD
-rwxr-xr-x   1 jlam  primarygroup  285B Apr 19 15:26 configure
-rw-r--r--   1 jlam  primarygroup  782B Apr 19 15:26 configure.cmd
-rw-r--r--   1 jlam  primarygroup   52K Apr 19 15:26 configure.py
-rw-r--r--   1 jlam  primarygroup  328B Apr 19 15:26 models.BUILD
lrwxr-xr-x   1 jlam  primarygroup   16B Apr 20 11:37 native_client -> ../native_client
drwxr-xr-x  36 jlam  primarygroup  1.1K Apr 20 11:35 tensorflow
drwxr-xr-x  97 jlam  primarygroup  3.0K Apr 19 15:27 third_party
drwxr-xr-x   4 jlam  primarygroup  128B Apr 19 15:29 tools
jlam-macbookpro% ls -lha native_client
lrwxr-xr-x  1 jlam  primarygroup  16B Apr 20 11:37 native_client -> ../native_client
```
Yeah, it looks exactly like it should. Can you also run `cat native_client/BUILD` from the `tensorflow` folder? To be clear, that symlink should point to the `native_client` folder inside the STT repo.
It says:

```
cat: native_client/BUILD: No such file or directory
```
Yeah, so either you're not in the right `tensorflow` folder (the submodule) or something in your system is broken in a way I've never seen. Make sure you have the following structure:

```
STT:
- native_client/
  - BUILD
  - ...
- tensorflow/
  - native_client -> ../native_client
  - ...
- ...
```
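If it's useful, here's a quick way to sanity-check that structure. It's sandboxed in a temp directory so it's safe to run anywhere; in a real checkout you'd run the `ln` and `test` commands from the actual `tensorflow` submodule instead.

```shell
# Recreate the expected STT layout in a throwaway directory, then make the
# symlink exactly as the build docs describe and verify it resolves to a
# BUILD file (which is what bazel needs to see native_client as a package).
STT=$(mktemp -d)
mkdir -p "$STT/native_client" "$STT/tensorflow"
touch "$STT/native_client/BUILD"
cd "$STT/tensorflow"
ln -s ../native_client .            # the symlink from the build docs
readlink native_client              # -> ../native_client
test -f native_client/BUILD && echo "native_client/BUILD found"
```

If `test -f native_client/BUILD` fails in your real tree, the symlink exists but points at a folder without a `BUILD` file, which is exactly what bazel is complaining about.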
Good catch! It turns out I had two separate TensorFlow subdirectories, and I was running these commands inside the one that wasn't part of the STT directory. Thanks for pointing that out!
I ran into this issue when building specifically for Android:

```
ERROR: /Users/jlam/Documents/STT/tensorflow/tensorflow/lite/kernels/internal/BUILD:687:1: C++ compilation of rule '//tensorflow/lite/kernels/internal:kernel_utils' failed (Exit 1)
src/main/tools/process-wrapper-legacy.cc:58: "execvp(/bin/false, ...)": No such file or directory
Target //native_client:libstt.so failed to build
```

This is after running

```
bazel build --workspace_status_command="bash native_client/bazel_workspace_status_cmd.sh" --config=monolithic --config=android --config=android_arm --define=runtime=tflite --action_env ANDROID_NDK_API_LEVEL=21 --cxxopt=-std=c++14 --copt=-D_GLIBCXX_USE_C99 //native_client:libstt.so
```

I found this: tensorflow/tensorflow#2640

> it looks like /bin/false is failing with no such file directory on execvp. That could be because Mac OS has /bin/false in /usr/bin/false instead. Could you try ln -s /usr/bin/false to /bin/false and see if it gets any further?

But that gave me an error:

```
ln -s /usr/bin/false /bin/false
ln: /bin/false: Operation not permitted
```
Did you try the second solution posted in that issue? tensorflow/tensorflow#2640 (comment)
As that was posted 5 years ago, both versions are rather outdated now, so I wasn't sure if it made sense to build with such old versions. It could still be a version mismatch, though, so I will try changing the version.
Please, I need a firm answer: do I need a GPU (CUDA paths) to build libstt? If not, I did get through all the steps but ended up with this error; any help would be very appreciated 🙏

```
In function 'virtual Metadata* ModelState::decode_metadata(const DecoderState&, size_t)':
native_client/modelstate.cc:73:3: error: too many initializers for 'Metadata'
 };
 ^
```