leon-ai / leon
🧠 Leon is your open-source personal assistant.
Home Page: https://getleon.ai
License: MIT License
One module for each platform.
Leon, what's trending on Product Hunt?
Leon, what's trending on GitHub?
Might be related to another package.
I feel labeling this project as 1.0.0 is misleading, especially for a project this large in scope.
To most developers, 1.0.0 signals a "ready for production use" mark in a project's lifecycle.
I would suggest following semantic versioning and starting with 0.x.x.
If I use Docker to deploy Leon, it tries to connect to localhost:1337.
Is it possible to give Leon the correct URL and port?
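For context, host and port are usually overridden through environment variables. The snippet below is a hypothetical .env fragment; the variable names (LEON_HOST, LEON_PORT) are assumptions based on Leon's .env.sample and should be verified against your checkout:

```
# Assumed variables from .env.sample; verify the exact names in your copy
LEON_HOST=http://192.168.1.50   # address the client should reach, instead of localhost
LEON_PORT=1337                  # port published by the Docker container
```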
node --version -> v11.1.0
$ npm run check

> [email protected] check /mnt/storage0/overminddl1/leon/leon
> babel-node scripts/run-check.js

---
.: CHECKING :.
➡ /bin/sh -c node --version
✔ v11.1.0
➡ /bin/sh -c npm --version
✔ 6.4.1
➡ /bin/sh -c pipenv --version
✔ pipenv, version 2018.11.26
✖ Error: Command failed: /bin/sh -c pipenv run python --version
Warning: Python 3.6 was not found on your system…
You can specify specific versions of Python with:
$ pipenv --python path/to/python
And yet running that command manually:
$ /bin/sh -c 'pipenv run python --version'
Loading .env environment variables…
Python 3.7.2
Python 3.7 should work. Downgrading the system version is... not safe.
npm run check
is failing the Python check.
Use the updated version of Python.
Ask Leon the reputation of a particular website. For example: « Leon, what is the reputation of github.com? », « Is the aliexpress.com website safe? »
I think this can easily be done using free APIs like « AlienVault Open Threat Exchange (OTX) », « Google Safe Browsing », « Web of Trust (WOT) ». Using VirusTotal we can even launch URL analysis.
I think this feature could be added to the « checker » package.
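As a rough sketch of how the checker package might talk to the Google Safe Browsing Lookup API (v4): the clientId value and the module wiring are hypothetical, and an API key would still be needed when POSTing the payload to the endpoint.

```python
API_URL = "https://safebrowsing.googleapis.com/v4/threatMatches:find"  # Safe Browsing Lookup API v4

def build_lookup_payload(url):
    """Build the JSON body for a Safe Browsing threatMatches:find request."""
    return {
        "client": {"clientId": "leon-checker", "clientVersion": "0.1"},  # hypothetical client id
        "threatInfo": {
            "threatTypes": ["MALWARE", "SOCIAL_ENGINEERING"],
            "platformTypes": ["ANY_PLATFORM"],
            "threatEntryTypes": ["URL"],
            "threatEntries": [{"url": url}],
        },
    }

payload = build_lookup_payload("https://aliexpress.com")
print(payload["threatInfo"]["threatEntries"])
# → [{'url': 'https://aliexpress.com'}]
```

An empty "matches" response from the API would mean the URL is not currently flagged.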
npm run check
.: CHECKING :.
✖ Error: Command failed: /bin/sh -c pipenv --version
/bin/sh: warning: setlocale: LC_ALL: cannot change locale (C.UTF-8)
/bin/sh: warning: setlocale: LC_ALL: cannot change locale (C.UTF-8)
Traceback (most recent call last):
File "/usr/bin/pipenv", line 11, in
load_entry_point('pipenv==2018.11.15.dev0', 'console_scripts', 'pipenv')()
File "/usr/lib64/python3.6/site-packages/pipenv/vendor/click/core.py", line 764, in call
return self.main(*args, **kwargs)
File "/usr/lib64/python3.6/site-packages/pipenv/vendor/click/core.py", line 696, in main
_verify_python3_env()
File "/usr/lib64/python3.6/site-packages/pipenv/vendor/click/_unicodefun.py", line 124, in _verify_python3_env
' mitigation steps.' + extra
RuntimeError: Click will abort further execution because Python 3 was configured to use ASCII as encoding for the environment. Consult https://click.palletsprojects.com/en/7.x/python3/ for mitigation steps.
This system lists a couple of UTF-8 supporting locales that
you can pick from. The following suitable locales were
discovered: aa_DJ.utf8, aa_ER.utf8, aa_ET.utf8, af_ZA.utf8, agr_PE.utf8, ak_GH.utf8, am_ET.utf8, an_ES.utf8, anp_IN.utf8, ar_AE.utf8, ar_BH.utf8, ar_DZ.utf8, ar_EG.utf8, ar_IN.utf8, ar_IQ.utf8, ar_JO.utf8, ar_KW.utf8, ar_LB.utf8, ar_LY.utf8, ar_MA.utf8, ar_OM.utf8, ar_QA.utf8, ar_SA.utf8, ar_SD.utf8, ar_SS.utf8, ar_SY.utf8, ar_TN.utf8, ar_YE.utf8, as_IN.utf8, ast_ES.utf8, ayc_PE.utf8, az_AZ.utf8, az_IR.utf8, be_BY.utf8, bem_ZM.utf8, ber_DZ.utf8, ber_MA.utf8, bg_BG.utf8, bhb_IN.utf8, bho_IN.utf8, bho_NP.utf8, bi_VU.utf8, bn_BD.utf8, bn_IN.utf8, bo_CN.utf8, bo_IN.utf8, br_FR.utf8, brx_IN.utf8, bs_BA.utf8, byn_ER.utf8, ca_AD.utf8, ca_ES.utf8, ca_FR.utf8, ca_IT.utf8, ce_RU.utf8, chr_US.utf8, cmn_TW.utf8, crh_UA.utf8, cs_CZ.utf8, csb_PL.utf8, cv_RU.utf8, cy_GB.utf8, da_DK.utf8, de_AT.utf8, de_BE.utf8, de_CH.utf8, de_DE.utf8, de_IT.utf8, de_LI.utf8, de_LU.utf8, doi_IN.utf8, dsb_DE.utf8, dv_MV.utf8, dz_BT.utf8, el_CY.utf8, el_GR.utf8, en_AG.utf8, en_AU.utf8, en_BW.utf8, en_CA.utf8, en_DK.utf8, en_GB.utf8, en_HK.utf8, en_IE.utf8, en_IL.utf8, en_IN.utf8, en_NG.utf8, en_NZ.utf8, en_PH.utf8, en_SC.utf8, en_SG.utf8, en_US.utf8, en_ZA.utf8, en_ZM.utf8, en_ZW.utf8, eo.utf8, es_AR.utf8, es_BO.utf8, es_CL.utf8, es_CO.utf8, es_CR.utf8, es_CU.utf8, es_DO.utf8, es_EC.utf8, es_ES.utf8, es_GT.utf8, es_HN.utf8, es_MX.utf8, es_NI.utf8, es_PA.utf8, es_PE.utf8, es_PR.utf8, es_PY.utf8, es_SV.utf8, es_US.utf8, es_UY.utf8, es_VE.utf8, et_EE.utf8, eu_ES.utf8, fa_IR.utf8, ff_SN.utf8, fi_FI.utf8, fil_PH.utf8, fo_FO.utf8, fr_BE.utf8, fr_CA.utf8, fr_CH.utf8, fr_FR.utf8, fr_LU.utf8, fur_IT.utf8, fy_DE.utf8, fy_NL.utf8, ga_IE.utf8, gd_GB.utf8, gez_ER.utf8, gez_ET.utf8, gl_ES.utf8, gu_IN.utf8, gv_GB.utf8, ha_NG.utf8, hak_TW.utf8, he_IL.utf8, hi_IN.utf8, hif_FJ.utf8, hne_IN.utf8, hr_HR.utf8, hsb_DE.utf8, ht_HT.utf8, hu_HU.utf8, hy_AM.utf8, ia_FR.utf8, id_ID.utf8, ig_NG.utf8, ik_CA.utf8, is_IS.utf8, it_CH.utf8, it_IT.utf8, iu_CA.utf8, ja_JP.utf8, ka_GE.utf8, kab_DZ.utf8, kk_KZ.utf8, kl_GL.utf8, 
km_KH.utf8, kn_IN.utf8, ko_KR.utf8, kok_IN.utf8, ks_IN.utf8, ku_TR.utf8, kw_GB.utf8, ky_KG.utf8, lb_LU.utf8, lg_UG.utf8, li_BE.utf8, li_NL.utf8, lij_IT.utf8, ln_CD.utf8, lo_LA.utf8, lt_LT.utf8, lv_LV.utf8, lzh_TW.utf8, mag_IN.utf8, mai_IN.utf8, mai_NP.utf8, mfe_MU.utf8, mg_MG.utf8, mhr_RU.utf8, mi_NZ.utf8, miq_NI.utf8, mjw_IN.utf8, mk_MK.utf8, ml_IN.utf8, mn_MN.utf8, mni_IN.utf8, mr_IN.utf8, ms_MY.utf8, mt_MT.utf8, my_MM.utf8, nan_TW.utf8, nb_NO.utf8, nds_DE.utf8, nds_NL.utf8, ne_NP.utf8, nhn_MX.utf8, niu_NU.utf8, niu_NZ.utf8, nl_AW.utf8, nl_BE.utf8, nl_NL.utf8, nn_NO.utf8, nr_ZA.utf8, nso_ZA.utf8, oc_FR.utf8, om_ET.utf8, om_KE.utf8, or_IN.utf8, os_RU.utf8, pa_IN.utf8, pa_PK.utf8, pap_AW.utf8, pap_CW.utf8, pl_PL.utf8, ps_AF.utf8, pt_BR.utf8, pt_PT.utf8, quz_PE.utf8, raj_IN.utf8, ro_RO.utf8, ru_RU.utf8, ru_UA.utf8, rw_RW.utf8, sa_IN.utf8, sah_RU.utf8, sat_IN.utf8, sc_IT.utf8, sd_IN.utf8, se_NO.utf8, sgs_LT.utf8, shn_MM.utf8, shs_CA.utf8, si_LK.utf8, sid_ET.utf8, sk_SK.utf8, sl_SI.utf8, sm_WS.utf8, so_DJ.utf8, so_ET.utf8, so_KE.utf8, so_SO.utf8, sq_AL.utf8, sq_MK.utf8, sr_ME.utf8, sr_RS.utf8, ss_ZA.utf8, st_ZA.utf8, sv_FI.utf8, sv_SE.utf8, sw_KE.utf8, sw_TZ.utf8, szl_PL.utf8, ta_IN.utf8, ta_LK.utf8, tcy_IN.utf8, te_IN.utf8, tg_TJ.utf8, th_TH.utf8, the_NP.utf8, ti_ER.utf8, ti_ET.utf8, tig_ER.utf8, tk_TM.utf8, tl_PH.utf8, tn_ZA.utf8, to_TO.utf8, tpi_PG.utf8, tr_CY.utf8, tr_TR.utf8, ts_ZA.utf8, tt_RU.utf8, ug_CN.utf8, uk_UA.utf8, unm_US.utf8, ur_IN.utf8, ur_PK.utf8, uz_UZ.utf8, ve_ZA.utf8, vi_VN.utf8, wa_BE.utf8, wae_CH.utf8, wal_ET.utf8, wo_SN.utf8, xh_ZA.utf8, yi_US.utf8, yo_NG.utf8, yue_HK.utf8, yuw_PG.utf8, zh_CN.utf8, zh_HK.utf8, zh_SG.utf8, zh_TW.utf8, zu_ZA.utf8
Click discovered that you exported a UTF-8 locale
but the locale system could not pick up from it because
it does not exist. The exported locale is "C.UTF-8" but it
is not supported
It seems that Leon currently only speaks English (?)
It would be great if the software could support other languages. And do it in a way that is more extensible/linguistically informed than Mycroft.
Chromecast package allowing interaction with a connected Chromecast device. Two modules for a start:
Ask Leon to find a playlist on YouTube and play it. This performs a search and starts playing the first playlist from the results.
Control any media currently playing on the Chromecast. Allows for the following commands:
I have experimented with my Chromecast and Leon, and it works quite well with the current NLU, with potential for more accuracy with the coming updates. For the experiment, I used https://github.com/balloob/pychromecast to interact with the Chromecast, and the YouTube REST API (a very similar query to the video download module). I'm eager to work on it and make it a production-ready package if the time is right!
Hi!
Thanks for your interest in Leon! ❤️
Leon could manage tasks with a todo list.
It could contain three lists: To Do / Doing / Done.
We could tell him, for example, "I should send an email to my mother" and later "I sent the email to my mom."
We could set a label on any task for importance, or set a deadline for it.
All tasks could depend on time:
"Hey Leon, I should read that article tomorrow."
It would be great if anyone could implement this additional feature.
I started it in my fork of the project and will merge it.
Thanks for this project :)
Something I really dreamed about during the night of February 22nd to 23rd.
A good example could be the game of "guess the number".
User: Leon, let's play the guess the number game
Leon: Alright, I've chosen a number, you guess.
User: 42
Leon: More
User: 45
Leon: Less
User: 43
Leon: You got me with 3 tries, congrats!
Be able to have a dedicated "flow" for each module. As long as the user does not say "stop" and the module does not exit via a trigger, the flow continues for this specific module.
Dedicated "flow" for each module: a module has a flow option,
true or false. When true, do not exit the current flow until the user says "stop" or the module exits via a trigger.
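The guess-the-number dialogue above illustrates such a flow. The sketch below is a standalone approximation only: the loop, the "stop" keyword handling, and the reply strings are hypothetical, not Leon's actual module API.

```python
def guess_flow(secret, utterances):
    """Stay in the module's flow until the user guesses right or says "stop"."""
    replies = []
    for tries, text in enumerate(utterances, start=1):
        if text.strip().lower() == "stop":   # the user breaks out of the flow
            replies.append("Okay, stopping the game.")
            break
        guess = int(text)
        if guess < secret:
            replies.append("More")
        elif guess > secret:
            replies.append("Less")
        else:                                # exit trigger: correct guess ends the flow
            replies.append(f"You got me in {tries} tries, congrats!")
            break
    return replies

print(guess_flow(43, ["42", "45", "43"]))
# → ['More', 'Less', 'You got me in 3 tries, congrats!']
```

The key design point is that the module, not the core, decides when the conversation ends.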
> [email protected] check /home/emperoreby/sideprojects/leon
> babel-node scripts/run-check.js
sh: 1: babel-node: not found
npm ERR! file sh
npm ERR! code ELIFECYCLE
npm ERR! errno ENOENT
npm ERR! syscall spawn
npm ERR! [email protected] check: `babel-node scripts/run-check.js`
npm ERR! spawn ENOENT
npm ERR!
npm ERR! Failed at the [email protected] check script.
npm ERR! This is probably not a problem with npm. There is likely additional logging output above.
npm WARN Local package.json exists, but node_modules missing, did you mean to install?
npm ERR! A complete log of this run can be found in:
npm ERR! /home/emperoreby/.npm/_logs/2019-02-24T12_39_09_469Z-debug.log
Leon should give a proper reply when we ask questions.
It is throwing an error like "Sorry, I have an issue with the Greeting module of my Leon package!"
I have updated the .env.sample file as per this.
I got the error below at Docker step 7.
Step 7/10 : RUN npm install
---> Running in 145bed53e9ab
npm WARN lifecycle [email protected]~preinstall: cannot run in wd [email protected] node scripts/setup/preinstall.js (wd=/app)
But it installed successfully, with some warnings.
Please let me know if you need any extra info.
I have attached a GIF regarding the issue. Feel free to take a look.
Hello,
I created this issue, but the problem may be on my side ;).
I installed Leon on my Arch Linux with Docker. The build and installation look good, and Leon starts on leet's port ;), but when I want to speak with Leon it looks like no modules work, as seen on the screen capture :)
I think the problem may be on my side: I have Arch Linux with Deepin, and Docker doesn't have a reputation for working well on it :P. I created this issue so that if another person has the same problem, they can add to it ;).
Don't see it as a big priority, I have plenty of time to see what you do ;)
My English is not perfect as you can see, I speak better French than English, but I'm working on it ;)
Have a nice day and be happy!
Leon uses NLU entities to find domain names.
Leon uses a customized regex to find domain names.
Use the NLU entity extraction to find the domain names and give them to the core.
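For reference, a rough standalone sketch of the regex-based approach being replaced (the pattern and function names are illustrative, not Leon's actual implementation):

```python
import re

# Naive domain pattern, roughly what a customized regex approach looks like
DOMAIN_RE = re.compile(r"\b([a-z0-9-]+(?:\.[a-z0-9-]+)*\.[a-z]{2,})\b", re.IGNORECASE)

def extract_domains(query):
    """Pull candidate domain names out of a raw query string."""
    return DOMAIN_RE.findall(query)

print(extract_domains("Is github.com up?"))
# → ['github.com']
```

An NLU entity extractor would replace this with trained entities, avoiding the brittleness of a hand-written pattern.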
Hello,
I have seen many of these kinds of AI/Assistants.
What I am always looking for though, is a way to get them to perform specific actions on specific commands. I never see documentation on them. This is the kind of thing I mean:
"Leon, turn the lights on in the kitchen." == Leon will execute a pre-defined script I have to do this task.
"Leon, wake up MOHAWK." == Leon can execute a pre-defined script which sends a magic packet to a remote PC named MOHAWK.
"Leon, open my development tools." == Executes a simple batch file which launches App1, App2, App3.
Be able to extract a type of entity from a query, such as a date, number, location, etc.
If it uses a third-party tier other than Node.js, it should operate from the core and send the result object to modules via the child process parameters.
Otherwise, it should be fully implemented in the core itself.
I noted some leads in this Trello card.
When modules need to return HTML answers, such as the Trends package with the GitHub module, we need to display the repository links.
Simply allow the client to understand HTML when it displays the answers. We also need to strip HTML tags for the TTS.
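The TTS side could strip tags with the standard library before speaking; this is a generic sketch, not Leon's actual code:

```python
from html.parser import HTMLParser

class TagStripper(HTMLParser):
    """Collect only text content, dropping every HTML tag."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        self.chunks.append(data)

def strip_tags(html):
    """Return the plain text of an HTML fragment, for TTS consumption."""
    stripper = TagStripper()
    stripper.feed(html)
    return "".join(stripper.chunks)

print(strip_tags('Check <a href="https://github.com/leon-ai/leon">leon-ai/leon</a> out'))
# → Check leon-ai/leon out
```

The client keeps the raw HTML for display; only the TTS pipeline runs it through the stripper.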
Leon, what time is it?
- It's quarter past ten.
Implement a new module / package.
Leon is buildable and runs.
Leon is unbuildable and unable to run.
Followed the steps in the README with fresh packages on a clean Linux Mint install.
Using Bash-it for my Linux terminal.
https://docs.getleon.ai/offline.html#hotword
typo in: npm install setup:offline-hotword
must be: npm run setup:offline-hotword
Also, how do I manually install DeepSpeech? I cannot let npm download a 1.8 GB file and just fail because of a slow internet connection; I need to download the 1.8 GB file using aria2c / wget -c, but where do I unpack it?
My bandwidth is expensive.
Another question: I have installed Python 3.7.2 from source (and cannot remove it using make uninstall), but Leon requires 3.6.x only, so I installed 3.6.8 from source as well; yet pipenv keeps using 3.7.
I am using Debian, and this broke apt-get. Is there any way to make pipenv select 3.6 instead of 3.7?
# Example
--> Leon, play The Fat Rat
<-- Ok, playing the calling by TheFatRat
Implement a feature that uses the Spotify Web API to find the music the user is looking for and play it on a device available through the API.
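A minimal sketch of the search step against the Spotify Web API's search endpoint; the function is illustrative, and an OAuth bearer token would still be required to actually call the URL:

```python
from urllib.parse import urlencode

SEARCH_ENDPOINT = "https://api.spotify.com/v1/search"  # Spotify Web API search endpoint

def build_search_url(query, kind="track", limit=1):
    """Build a Spotify search URL; authentication happens separately via OAuth."""
    return SEARCH_ENDPOINT + "?" + urlencode({"q": query, "type": kind, "limit": limit})

print(build_search_url("The Fat Rat"))
```

Playback on a device would then go through the API's player endpoints using the URI returned by the search.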
I am trying to add some modules to an existing package. I have added a quotes module to the leon package.
I added this in leon/data/expressions/en.json:
"quote": [
  "Tell me a Quote",
  "Give me a Quote"
]
Then I updated the below in leon/data/expressions/en.json:
"quote": {
  "quotes": [
    "Design is the silent ambassador of your brand.",
    "If there is one word I'd like to remove from any conversation about design, it's 'pretty.'"
  ]
}
Then I created a new Python file named quote.py. Please see the code below.

import utils

def quote(string):
    """Leon says some quote"""
    return utils.output('end', 'quotes', utils.translate('quotes'))

I also created a new test Python file in the test folder.
But it seems this breaks Leon. Whenever I ask a question it replies "Sorry, I don't work correctly!"
How can I test the modules? Do I need to add anything else?
Please switch to DeepSpeech v0.4.1 models, the released v0.4.0 was a bogus one: https://github.com/mozilla/DeepSpeech/releases/tag/v0.4.1
Having an issue with npm install on macOS.
✖ Failed to install the Python packages: Error: Command failed: /bin/sh -c pipenv install
Warning: Python 3.6 was not found on your system...
You can specify specific versions of Python with:
$ pipenv --python path/to/python
Would you build a Docker image to simplify the local dev and production running process?
Just writing a Dockerfile would be enough, I think, or both a Dockerfile and a docker-compose.yml; it's not difficult at all.
Then it could be run easily without all these install steps, which would be really helpful.
Any thoughts? Thanks.
After running npm install, I got the following output:
added 1554 packages from 828 contributors and audited 31148 packages in 141.769s
found 66 vulnerabilities (65 low, 1 high)
If you'd like, I can paste the output of the npm audit command. Do you have any plans to update the dependencies using something like Greenkeeper?
There should be a "marketplace" to see all packages and modules available for leon, I don't know if it's a problem to paste a link from the competitor, but I think it's self-explanatory how a marketplace should be.
# Example
--> Leon, what do I have planned for tomorrow?
<-- At 13:00 you have a nap
Implement a Calendar package with sub-modules for each provider. For example:
User: Leon, add potatoes to my shopping list
Leon: I added potatoes to your shopping list
User: Leon, add go to the gym to my personal list
Leon: I added go to the gym to your personal list
Use named entity recognition to be able to build such modules as the Todo List module.
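A toy illustration of the kind of entity extraction the Todo List module needs. The pattern-based approach here is a stand-in for a real NER pipeline, and the function names are invented:

```python
import re

# Matches utterances like "add potatoes to my shopping list"
ADD_RE = re.compile(r"add (?P<item>.+?) to my (?P<list>.+?) list", re.IGNORECASE)

def parse_todo_command(utterance):
    """Return the (item, list_name) entities, or None when the intent doesn't match."""
    m = ADD_RE.search(utterance)
    if m is None:
        return None
    return m.group("item"), m.group("list")

print(parse_todo_command("Leon, add potatoes to my shopping list"))
# → ('potatoes', 'shopping')
```

Note that even "add go to the gym to my personal list" parses correctly here because the lazy quantifiers backtrack until " to my … list" fits; a trained NER model would handle such nesting far more robustly.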
Give information about the weather.
# Example
--> Leon, how's the weather?
<-- I'm searching the weather for Rouen
Today, clear sky, the temperature at Rouen is 18°C
Weather will remain stable
Implement a weather package which gives information about the weather using an external API like OpenWeatherMap. We could also use external IP geolocation, so that we can avoid asking for the location.
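The answer-building step could look like this sketch, which assumes a payload shaped like the OpenWeatherMap current-weather response (the sample dict and function are invented for illustration):

```python
def describe_weather(payload):
    """Turn an OpenWeatherMap-style current-weather payload into a spoken sentence."""
    city = payload["name"]
    description = payload["weather"][0]["description"]
    temp = round(payload["main"]["temp"])          # metric units assumed
    return f"Today, {description}, the temperature at {city} is {temp}°C"

sample = {  # shape modeled on the OpenWeatherMap current-weather response
    "name": "Rouen",
    "weather": [{"description": "clear sky"}],
    "main": {"temp": 18.2},
}
print(describe_weather(sample))
# → Today, clear sky, the temperature at Rouen is 18°C
```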
Awesome project!
Are there any plans to add German language support?
An interesting app; it would be nice to support a desktop version via Electron.
➡ C:\WINDOWS\system32\cmd.exe /q /s /c "npm --version"
✔ 6.4.1
➡ C:\WINDOWS\system32\cmd.exe /q /s /c "pipenv --version"
✔ pipenv, version 2018.11.26
✖ Error: Command failed: C:\WINDOWS\system32\cmd.exe /q /s /c "pipenv run python --version"
Warning: Python 3.6 was not found on your system…
You can specify specific versions of Python with:
$ pipenv --python path\to\python
I expected that pipenv would find Python, but it seems pipenv cannot find it. I also added Python to the PATH variable, and when I run where python.exe the result is:
C:\Users\Home\AppData\Local\Programs\Python\Python37-32\python.exe
✖ Error: Command failed: C:\WINDOWS\system32\cmd.exe /q /s /c "pipenv run python --version"
Warning: Python 3.6 was not found on your system…
You can specify specific versions of Python with:
$ pipenv --python path\to\python
nothing
# in Leon Interface
> Hello
< Hi! Please, you should sleep to be in shape for your day.
# in Leon Interface
> Hello
< Sorry, it seems I have a problem with the Greeting module of my Leon package!
< Sorry, it seems I have a problem with the Greeting module of my Leon package!
< Hi! Please, you should sleep to be in shape for your day.
Error message from the process:
✖ /bin/sh: warning: setlocale: LC_ALL: cannot change locale (C.UTF-8) in the Greeting module
LC_ALL=c
(c is a valid locale, see https://linux.die.net/man/3/setlocale)
$ LC_ALL=en_US.UTF8 npm start
works
Right now it is unclear to me what hardware is required to run an instance of Leon without actually running it.
The documentation should provide scaling suggestions for different use cases.
.: CHECKING :.
✖ Error: Command failed: /bin/sh -c pipenv --version
/bin/sh: pipenv: command not found
Get Leon Installed
The Installation fails.
Maybe it is specific to my machine and how Python is set up:
pipenv: "command not found"
But:
pip3 install pipenv says "Requirement already satisfied:"
I created an alias, alias pipenv="python3 -m pipenv", to get pipenv working, but the install looks for pipenv in /bin/sh.
Any idea? :p
Specs
Expected Behaviour
Leon Should build successfully
Actual Behaviour
Threw an error while installing the Python packages.
Error Message
✖ Failed to install the Python packages: Error: Command failed: /bin/sh -c pipenv install
Any help would be appreciated.
npm-debug.log and terminal stderr output. Why can't it build, and how can I fix it?
Hi,
I get an error when using STT. I use the offline version of STT and TTS.
$ docker exec -it 97c3b4910a12 sh
/app # npm run check

[email protected] check /app
babel-node scripts/run-check.js

.: CHECKING :.
➡ /bin/sh -c node --version
✔ v10.15.2
➡ /bin/sh -c npm --version
✔ 6.4.1
➡ /bin/sh -c pipenv --version
✔ pipenv, version 2018.11.26
➡ /bin/sh -c pipenv --where
✔ /app/bridges/python
➡ /bin/sh -c pipenv run python --version
✔ Python 3.6.6
➡ /bin/sh -c pipenv run python bridges/python/main.py en leon randomnumber "Give me a random number"
✔ {"package": "leon", "module": "randomnumber", "lang": "en", "input": "Give me a random number", "output": {"type": "end", "code": "success", "speech": 57, "options": {}}}
➡ Classifier state
✔ Found and valid
➡ Amazon Polly TTS
⚠ Amazon Polly TTS is not yet configured
➡ Google Cloud TTS/STT
⚠ Google Cloud TTS/STT is not yet configured
➡ Watson TTS
⚠ Watson TTS is not yet configured
➡ Offline TTS
✔ Found Flite at bin/flite/flite
➡ Watson STT
⚠ Watson STT is not yet configured
➡ Offline STT
✔ Found DeepSpeech language model at bin/deepspeech/lm.binary
.: REPORT :.
➡ Here is the diagnosis about your current setup
✔ Run
✔ Run modules
✔ Reply you by texting
⚠ Amazon Polly text-to-speech
⚠ Google Cloud text-to-speech
⚠ Watson text-to-speech
✔ Offline text-to-speech
⚠ Google Cloud speech-to-text
⚠ Watson speech-to-text
✔ Offline speech-to-text
✔ Hooray! Leon can run correctly
➡ If you have some yellow warnings, it is all good. It means some entities are not yet configured
No error :-)
I get the following error in the docker log:
✔ New instance
➡ Initializing STT...
➡ Loading model from file bin/deepspeech/output_graph.pb...
(node:23) UnhandledPromiseRejectionWarning: TypeError: DeepSpeech.Model is not a constructor
    at Object.parser.init.args [as init] (/app/server/dist/stt/deepspeech/parser.js:107:13)
    at Stt.init (/app/server/dist/stt/stt.js:55:27)
    at Socket.socket.on (/app/server/dist/core/server.js:193:17)
    at Socket.emit (events.js:189:13)
    at /app/node_modules/socket.io/lib/socket.js:528:12
    at process._tickCallback (internal/process/next_tick.js:61:11)
(node:23) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). (rejection id: 3)
Cheers
BjΓΆrn
In the README we can read the below:
2. With this generic structure, everyone can create his own modules and share them with others. Therefore there is only one core (to govern all of them).
The lord of the cores would require this to be written as proposed:
2. With this generic structure, everyone can create his own modules and share them with others. Therefore there is only one core (to rule them all).
node --version -> v11.1.0
$ npm run check

> [email protected] check /mnt/storage0/overminddl1/leon/leon
> babel-node scripts/run-check.js

---
.: CHECKING :.
➡ /bin/sh -c node --version
✔ v11.1.0
➡ /bin/sh -c npm --version
✔ 6.4.1
➡ /bin/sh -c pipenv --version
✔ pipenv, version 2018.11.26
➡ /bin/sh -c pipenv --where
✔ /mnt/storage0/overminddl1/leon/leon/bridges/python
➡ /bin/sh -c pipenv run python --version
✔ Python 3.6.8
➡ /bin/sh -c pipenv run python bridges/python/main.py en leon randomnumber "Give me a random number"
✖ Error: Command failed: /bin/sh -c pipenv run python bridges/python/main.py en leon randomnumber "Give me a random number"
Traceback (most recent call last):
File "bridges/python/main.py", line 4, in <module>
import utils
File "/mnt/storage0/overminddl1/leon/leon/bridges/python/utils.py", line 11, in <module>
from tinydb import TinyDB, Query, operations
ModuleNotFoundError: No module named 'tinydb'
Should not crash.
See above in the check. Relevant section:
➡ /bin/sh -c pipenv run python bridges/python/main.py en leon randomnumber "Give me a random number"
✖ Error: Command failed: /bin/sh -c pipenv run python bridges/python/main.py en leon randomnumber "Give me a random number"
Traceback (most recent call last):
File "bridges/python/main.py", line 4, in <module>
import utils
File "/mnt/storage0/overminddl1/leon/leon/bridges/python/utils.py", line 11, in <module>
from tinydb import TinyDB, Query, operations
ModuleNotFoundError: No module named 'tinydb'
Running the listed command manually has the same result.
All I did was spool up a default Python 3.6 + npm 11 Docker image with a data directory of the git clone, ran pip install pipenv, ran npm install, ran npm run check, and the above appeared.
Such an amazing name, what's the story behind the choice?
Just a quick question: how did you imagine Leon deciding which bridge to call?
I have been tinkering on a Go bridge; sadly, that one will only work on macOS and Linux until Golang fixes their plugin package to work with Windows.
.: CHECKING :.
➡ /bin/sh -c node --version
✔ v11.6.0
➡ /bin/sh -c npm --version
✔ 6.5.0
➡ /bin/sh -c pipenv --version
✔ pipenv, version 2018.11.26
➡ /bin/sh -c pipenv --where
✔ /Users/maverick/git-local/Personal/leon/bridges/python
➡ /bin/sh -c pipenv run python --version
✖ The Python version must be 3.6.x. Please install it: https://www.python.org/downloads
➡ /bin/sh -c pipenv run python bridges/python/main.py en leon randomnumber "Give me a random number"
✔ {"package": "leon", "module": "randomnumber", "lang": "en", "input": "Give me a random number", "output": {"type": "end", "code": "success", "speech": 74, "options": {}}}
➡ Classifier state
✔ Found and valid
➡ Amazon Polly TTS
⚠ Amazon Polly TTS is not yet configured
➡ Google Cloud TTS/STT
⚠ Google Cloud TTS/STT is not yet configured
➡ Watson TTS
⚠ Watson TTS is not yet configured
➡ Offline TTS
✖ Cannot find bin/flite/flite. You can setup the offline TTS by running: "npm run setup:offline-tts"
➡ Watson STT
⚠ Watson STT is not yet configured
➡ Offline STT
✖ Cannot find bin/deepspeech/lm.binary. You can setup the offline STT by running: "npm run setup:offline-stt"
.: REPORT :.
➡ Here is the diagnosis about your current setup
✔ Run
✔ Run modules
✔ Reply you by texting
⚠ Amazon Polly text-to-speech
⚠ Google Cloud text-to-speech
⚠ Watson text-to-speech
✖ Offline text-to-speech
⚠ Google Cloud speech-to-text
⚠ Watson speech-to-text
✖ Offline speech-to-text
✖ Please fix the errors above
Everything works mostly correctly; I get proper output using the checker, and I expect when giving the command
Is GitHub.com up?
to get back
GitHub is running correctly.
Actual Behaviour
Is GitHub.com up?
I'm trying to reach Ub.
Ub is up.
How Do We Reproduce?
Type the commands as above.
Extra (like a sample repo to reproduce the issue, etc.)
Are there plans to do Docker builds of Leon? As in, provide an official image for it.
I would like to set some env variables; where is this file typically stored? After the build is successful?
Be able to query the status of some website without the fully qualified domain name.
Currently a query like this fails:
is github up ?
I should ask: is github.com up ?
Maybe list some of the most popular websites and treat them with a special query, as I did just before.
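The popular-sites shortlist could be a simple lookup table; this is a hypothetical sketch, and the naive ".com" fallback is an assumption, not Leon's behavior:

```python
# Hypothetical shortlist mapping spoken names to fully qualified domains
POPULAR_SITES = {
    "github": "github.com",
    "google": "google.com",
    "wikipedia": "wikipedia.org",
}

def resolve_site(name):
    """Expand a bare site name to an FQDN; pass through anything already qualified."""
    name = name.strip().lower()
    if "." in name:
        return name                              # already a domain, use as-is
    return POPULAR_SITES.get(name, name + ".com")  # naive fallback: assume .com

print(resolve_site("github"))
# → github.com
```

With this in front of the checker module, "is github up?" resolves to the same query as "is github.com up?".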
I'm trying to create a new module/feature for Leon, but I'm having a hard time with third-party modules.
Whenever I try to import a module I get this error:
✖ Traceback (most recent call last):
File "bridges/python/main.py", line 23, in <module>
main()
File "bridges/python/main.py", line 18, in main
m = import_module('packages.' + package + '.' + module)
File "/home/gmaciel/docker/pessoal/leon/bridges/python/.venv/lib/python3.6/importlib/__init__.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 994, in _gcd_import
File "<frozen importlib._bootstrap>", line 971, in _find_and_load
File "<frozen importlib._bootstrap>", line 955, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "./packages/dateandtime/saytime.py", line 45, in <module>
import pyPEG
ModuleNotFoundError: No module named 'pyPEG'
Are there any steps I have to follow to get this done?
I already have pyPEG installed via the "pip install pypeg2" and "pip3 install pypeg2" commands.
I guess the ideal setup would be to install it on a Raspberry Pi or similar board, make it listen in the current room, and plug it into your favorite speakers.
It might be nice to add a tutorial about small-board installs.
> [email protected] check /home/metal-mighty/github-repos/leon
> babel-node scripts/run-check.js
---
.: CHECKING :.
➡ /bin/sh -c node --version
✔ v10.15.1
➡ /bin/sh -c npm --version
✔ 6.4.1
➡ /bin/sh -c pipenv --version
✔ pipenv, version 2018.11.26
➡ /bin/sh -c pipenv --where
✔ /home/metal-mighty/github-repos/leon/bridges/python
➡ /bin/sh -c pipenv run python --version
✔ Python 3.6.8
➡ /bin/sh -c pipenv run python bridges/python/main.py en leon randomnumber "Give me a random number"
✔ {"package": "leon", "module": "randomnumber", "lang": "en", "input": "Give me a random number", "output": {"type": "end", "code": "success", "speech": 90, "options": {}}}
➡ Classifier state
✔ Found and valid
➡ Amazon Polly TTS
⚠ Amazon Polly TTS is not yet configured
➡ Google Cloud TTS/STT
⚠ Google Cloud TTS/STT is not yet configured
➡ Watson TTS
⚠ Watson TTS is not yet configured
➡ Offline TTS
✖ Cannot find bin/flite/flite. You can setup the offline TTS by running: "npm run setup:offline-tts"
➡ Watson STT
⚠ Watson STT is not yet configured
➡ Offline STT
✖ Cannot find bin/deepspeech/lm.binary. You can setup the offline STT by running: "npm run setup:offline-stt"
---
.: REPORT :.
➡ Here is the diagnosis about your current setup
✔ Run
✔ Run modules
✔ Reply you by texting
⚠ Amazon Polly text-to-speech
⚠ Google Cloud text-to-speech
⚠ Watson text-to-speech
✖ Offline text-to-speech
⚠ Google Cloud speech-to-text
⚠ Watson speech-to-text
✖ Offline speech-to-text
✔ Hooray! Leon can run correctly
Display Leon's web interaction interface as shown on the demo
Stuck on the interface's loading screen (gray and loading animation). No error in the logs
Server is hosted at home behind a router/ISP box. Ports 1337 and 4242 (great ports btw ;)) have been opened in the server's firewall and added to the NAT rules in the router in case some communication with the outside world was necessary.
Console log when accessing Leon's page:
.: REQUESTING :.
➡ GET /
---
.: REQUESTING :.
➡ GET /css/style.css
---
.: REQUESTING :.
➡ GET /vendor/socket.io/2.0.3/socket.io.js
---
.: REQUESTING :.
➡ GET /js/main.js
---
.: REQUESTING :.
➡ GET /img/mic.svg
---
.: REQUESTING :.
➡ GET /img/logo.svg
---
.: REQUESTING :.
➡ GET /img/favicon.png
Hi! I have some unexpected behavior: when I type "hello" or any request the assistant should understand, it returns this message:
Sorry, it seems I have a problem with the [module (ex: Whoami or Greeting)] of my Leon package!
I don't know if it's due to a bad installation of the packages.
Thanks.