
enclosure-mark1's Introduction

Mycroft Enclosure

[Image: mycroft-front]

This repository holds the code run on the Arduino within a Mycroft unit. It manages the eyes, the mouth, and the button. The code is written entirely in C++ with Arduino's standard library.

Getting Started in Linux

Compiling

sudo ./install-arduino.sh
./compile.sh

Explanation: to compile the application you need the Arduino IDE installed to /opt/arduino. To set this up automatically, run the included script with sudo ./install-arduino.sh. If you have the Arduino IDE installed in a different location, you can set that path in compile.sh. Finally, to compile, run ./compile.sh.

Uploading to board

First, download this repository onto the unit into the folder /opt/enclosure if it doesn't already exist. Next, copy the generated file build/enclosure.ino.hex to the Mycroft unit into the folder /opt/enclosure/. If the unit is connected to your network, this can be done quickly with the following script: ./deploy.sh pi@192.168.X.X, substituting the unit's actual IP address for 192.168.X.X.

Finally, on the Mycroft Unit run the following:

sudo ./install-avrdude.sh
./upload.sh

This will write the generated hex file to the Arduino. After running through the setup steps, all that is necessary to compile and upload the hex to the Arduino is the following:

# From host machine:
./deploy.sh pi@192.168.X.X

# From Mycroft unit:
./upload.sh

Version number location

enclosure/lib/MycroftArduino/MycroftArduino.h

Serial Port Protocols

See the file protocols.txt for a description of commands that can be sent to the faceplate.

Commands can be sent from the command line on a Raspberry Pi, such as this:

$ echo "eyes.blink" > /dev/ttyAMA0
$ echo "eyes.color=16711680" > /dev/ttyAMA0
$ echo "mouth.text=I am angry!" > /dev/ttyAMA0

These commands blink the eyes, turn them red, and then display the phrase on the faceplate.
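
For reference, here is a minimal sketch of how the Arduino side could read and dispatch such newline-terminated commands. The handler functions and the 9600 baud rate are assumptions for illustration, not the firmware's actual values; the stubs just print what they would do.

#include <Arduino.h>

// Hypothetical handlers standing in for the real processor classes.
void blinkEyes() { Serial.println(F("blink eyes")); }
void setEyeColor(long color) { Serial.print(F("eye color: ")); Serial.println(color); }
void showText(const String &text) { Serial.print(F("mouth text: ")); Serial.println(text); }

void setup() {
    Serial.begin(9600);  // baud rate is an assumption
}

void loop() {
    if (Serial.available()) {
        // Each command arrives as one line, e.g. "eyes.color=16711680".
        String cmd = Serial.readStringUntil('\n');
        cmd.trim();

        if (cmd == "eyes.blink") {
            blinkEyes();
        } else if (cmd.startsWith("eyes.color=")) {
            setEyeColor(cmd.substring(11).toInt());  // decimal color value
        } else if (cmd.startsWith("mouth.text=")) {
            showText(cmd.substring(11));
        }
    }
}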

Mycroft Skill API Interface

Skills are written in Python. Access to the Mark 1 features is provided by mycroft.client.enclosure.api, which is configured and available inside a skill as self.enclosure. For example:

self.enclosure.mouth_text("I am angry!")

Graphics

The mouth.icon= command takes a custom format for its black and white images. You can use the HTML editor to create the image strings that can be sent. Within Mycroft you can send PNG files using self.enclosure.display_png() and they will be automatically converted to the correct format for you.

IO pins

When looking at the device from the back, pins are laid out as:

             -----------------------------------------------------------------------------------------------
RCA Port    | 2 | 4 | 6 | 8 | 10 | 12 | 14 | 16 | 18 | 20 | 22 | 24 | 26 | 28 | 30 | 32 | 34 | 36 | 38 | 40 |
            | 1 | 3 | 5 | 7 |  9 | 11 | 13 | 15 | 17 | 19 | 21 | 23 | 25 | 27 | 29 | 31 | 33 | 35 | 37 | 39 |
             -----------------------------------------------------------------------------------------------
             
 HDMI                         Ethernet                        USB     USB

Where:

Pin  Description
 1   GND
 2   +12V
 3   GND
 4   +5V
 5   GND
 6   +3.3V
 7   Arduino Reset
 8   Arduino D5
 9   Arduino D6
10   Arduino D10
11   Arduino A2
12   Arduino A3
13   +3.3V
14   +3.3V
15   +3.3V
16   GND
17   GND
18   GND
19   +5V
20   +5V
21   +5V
22   +5V
23   GND
24   GND
25   Raspberry Pi ID_SC
26   Raspberry Pi ID_SD
27   Raspberry Pi GPIO 4
28   Raspberry Pi GPIO 5
29   Raspberry Pi GPIO 6
30   Raspberry Pi GPIO 7
31   Raspberry Pi GPIO 8
32   Raspberry Pi GPIO 9
33   Raspberry Pi GPIO 10
34   Raspberry Pi GPIO 11
35   Raspberry Pi GPIO 12
36   Raspberry Pi GPIO 16
37   Raspberry Pi GPIO 25
38   Raspberry Pi GPIO 26
39   GND
40   GND

WARNING: This is not the same as the standard Raspberry Pi GPIO headers!

enclosure-mark1's People

Contributors

aatchison, augustnmonteiro, connor-penrod, forslund, isaacnward, jarbasal, jdorleans, kathyreid, kfezer, krisgesling, learnedvector, matthewscholefield


enclosure-mark1's Issues

Reformat code

This is planned for after @isaacnward's PR has been merged, rather than before, to prevent mass chaos. I would prefer to use braces on a new line, since that allows the programmer to more clearly see the start and end of a code block:

int func(int a, int b)
{
    if (a > b)
    {
        a += b << 2;
        return a * b;
    }
    else
    {
        for (int a = 0; a < b; ++a)
        {
            int c = a & b;
            if (c > b)
                return 0;
            return c;
        }
    }
}

However, if you all prefer, I can just format it to the existing convention, which is braces on the same line:

int func(int a, int b) {
    if (a > b) {
        a += b << 2;
        return a * b;
    }
    else {
        for (int a = 0; a < b; ++a) {
            int c = a & b;
            if (c > b)
                return 0;
            return c;
        }
    }
}

Edit: It has been decided to retain the current code style and use the second. A PR has been made to address this.

Introduce concept of `MycroftMatrix`

Problem

To me it makes sense that we have two classes representing the Neopixel rings and the LED matrix. Since the rings are currently only used as eyes, that name seems acceptable. However, the matrix is used both for facial expressions and for displaying various pieces of information such as the weather. Since the screen can only show one thing at a time, I was thinking we should have a class representing the matrix. @jdorleans mentioned that he did not want the mouth animations grouped with the weather, which makes sense since they are handled differently. On the other hand, I think there should be some state representing that the weather is displayed, rather than using the NONE state to retain the image drawn by the weather.

Solution

I suggest that we have a MycroftMatrix class that holds both a MycroftMouth and a MycroftWeather. This MycroftMatrix class could then hold a state indicating whether the mouth or the weather screen is currently displayed.
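
A rough sketch of the proposal, assuming hypothetical method names and a simple state enum (MycroftMouth and MycroftWeather are shown as empty stand-ins so the sketch is self-contained):

// Stand-ins for the existing classes, just to keep the sketch compilable.
class MycroftMouth   { public: void update() {} };
class MycroftWeather { public: void update() {} };

// Hypothetical wrapper owning the LED matrix and tracking what is shown.
class MycroftMatrix {
public:
    enum Screen { MOUTH, WEATHER };

    void showMouth()   { screen = MOUTH; }
    void showWeather() { screen = WEATHER; }

    // Only the screen that currently owns the matrix gets updated.
    void update() {
        if (screen == MOUTH) {
            mouth.update();
        } else {
            weather.update();
        }
    }

private:
    Screen screen = MOUTH;
    MycroftMouth mouth;
    MycroftWeather weather;
};

Additional screens (calendar, menu, plain text) would then become new values of the Screen enum rather than more special cases in the mouth-centric state machine.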

Is this necessary?

In the future there will be more screens, such as a calendar screen or a menu screen (in fact, I think we should also add a text screen to the matrix). Without a central system for handling these, the fragile mouth-centric state system will begin to show its age.

@isaacnward , @jdorleans , Any thoughts on this?

Following docs doesn't work - updating Arduino code

I was following the documentation to test a fix for #104.

I ran into the following issues while following the described steps:

user@user-Predator-PH315-51:~/PycharmProjects/enclosure-mark1$ ./deploy.sh pi@<mark-1-ip>
Compiling...
Success
Deploying to:  pi@<mark-1-ip>
./deploy.sh: line 35: sshpass: command not found
./deploy.sh: line 36: sshpass: command not found
./deploy.sh: line 37: sshpass: command not found
Burning...
./deploy.sh: line 39: sshpass: command not found
Cool! : D

This can be fixed with

sudo apt-get install sshpass

but the script should fail safely and give proper feedback.

The second error was the following; it doesn't seem important and I think we can safely ignore it:

pi@mark_1:/opt/mycroft/enclosure $ sudo ./install-avrdude.sh
Reading package lists... Done
Building dependency tree       
Reading state information... Done
Package libusb-dev is not available, but is referred to by another package.
This may mean that the package is missing, has been obsoleted, or
is only available from another source
However the following packages replace it:
  libusb-0.1-4

E: Package 'libusb-dev' has no installation candidate

Finally, when running upload.sh on the Mark 1:

pi@mark_1:/opt/mycroft/enclosure $ ./upload.sh
Traceback (most recent call last):
  File "/opt/mycroft/enclosure/verifyArduino.py", line 28, in <module>
    import serial
ImportError: No module named serial

but it completes successfully. I think the docs should mention that we need to activate the mycroft-venv first.

Mycroft Component

There should be an interface to provide a framework for common behavior which includes initialization and updating.
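
A minimal sketch of what such an interface could look like; the name MycroftComponent and the method names are assumptions, not existing code:

// Hypothetical common interface for enclosure components (eyes, mouth,
// button, ...): one-time initialization plus a non-blocking update step.
class MycroftComponent {
public:
    virtual ~MycroftComponent() {}
    virtual void setup() = 0;                     // hardware initialization
    virtual void update(unsigned long nowMs) = 0; // called on every loop() pass
};

Components could then be kept in an array and driven from a single loop(), which also fits the non-blocking timing discussed in the encoder issue below.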

Media progress bar

When Mycroft is playing music (and possibly when he's playing movies or television) his mouth should show a simple display of his current progress in what he's playing. This is probably a long-term goal, since it relies on (at the very least) MycroftAI/mycroft-core#123.

Mycroft Mark 1 : add one option on the screen menu to select language

Hello,

As far as I can see at https://docs.mycroft.ai/mark-1/using.mark1, and by checking the screen menu on my Mycroft Mark 1, there is no option to select the language used by the Mycroft Mark 1.
Can you add this option?

At https://docs.mycroft.ai/mark-1 I can see that the Mycroft Mark 1 uses an 8x32 LED display.
However, all of the options in the screen menu are no more than 6 characters long, so I think the label for this new option should be LANG rather than LANGUAGE.

PS: I hope I will be able to select French within the next 2 months, after a lot of work to add French support to mycroft-core and to translate mycroft-skills into French.

PS2: I opened this ticket in the mycroft-core GitHub repository because I could not find a more specific one under https://github.com/MycroftAI.

International text in the enclosure?

Characters such as À, Á, Ñ, Ç, Æ, and others are not supported by the enclosure.

There are two approaches to address this:

  • Be able to render all characters (may be a lot of work, unicode is big)
  • Transliteration (find similar characters)

The enclosure uses transliteration to replace unsupported English lower case characters with their English upper case equivalents (see here). However, it does not consider other languages.

For instance, ä could be transliterated to AE in German, but in English you would probably map it to A if you had to support it.

What if, instead of supporting only English here and struggling to support all the other languages elsewhere, we did no transliteration here and handled all language transliterations in mycroft-core/mycroft/client/enclosure/mouth.py?

Ideally the enclosure would expect UTF-8 encoded text, although most UTF-8 code points would not be rendered because they are not yet supported.
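
For illustration, a lookup-based transliteration on the enclosure side could look like the sketch below. The table is a tiny hypothetical sample, and the German-style folds are just one possible choice; the current firmware only folds English lower case to upper case.

#include <Arduino.h>

// Hypothetical two-byte UTF-8 -> ASCII fallback table (small sample only).
struct Translit { const char *utf8; const char *ascii; };
const Translit TABLE[] = {
    {"\xC3\xA4", "AE"},  // ä
    {"\xC3\xB6", "OE"},  // ö
    {"\xC3\xBC", "UE"},  // ü
    {"\xC3\xA9", "E"},   // é
    {"\xC3\xB1", "N"},   // ñ
};

String transliterate(const String &in) {
    String out = in;
    for (unsigned int i = 0; i < sizeof(TABLE) / sizeof(TABLE[0]); ++i) {
        out.replace(TABLE[i].utf8, TABLE[i].ascii);
    }
    out.toUpperCase();  // existing behavior: fold lower case to upper case
    return out;
}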

Weather display

When you ask Mycroft for the weather, he should display it on his mouth while he speaks, instead of showing the mouth animation. A possible implementation would display an image indicating the weather conditions (snowy, sunny, partly cloudy, etc) on one panel of the mouth, the current temperature on the next, the high temperature on the third, and the low temperature on the fourth.

Slow encoder response time

The encoder works consistently, but Mycroft responds very sluggishly to it: if you turn the dial to change the volume, or hit the button to stop a skill, it takes up to 15 seconds to register the change. This is almost certainly caused by two related problems: the Arduino code's usage of delay(), which stops code execution entirely while it runs, and the current implementation of mouth animations, which immediately jumps out of the main loop to run through the entire animation (including delay()) instead of producing them iteratively.
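
A sketch of the usual fix: replace delay() with millis()-based timing so the encoder is polled on every pass through loop(). The frame interval, frame count, and function names below are illustrative only.

#include <Arduino.h>

const unsigned long FRAME_MS = 60;  // illustrative frame interval
unsigned long lastFrame = 0;
int frame = 0;

void pollEncoder() { /* read the dial and button here */ }
void drawAnimationFrame(int f) { /* draw exactly one frame, no delay() */ (void)f; }

void setup() {
}

void loop() {
    // Always poll the encoder; never block inside an animation.
    pollEncoder();

    unsigned long now = millis();
    if (now - lastFrame >= FRAME_MS) {
        lastFrame = now;
        drawAnimationFrame(frame);  // advance the animation by one step
        frame = (frame + 1) % 8;    // 8 frames assumed for illustration
    }
}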

'Reset' is easily confused with 'Reboot'

The menu item 'Reset' should probably be renamed. Several people have accidentally selected this when they really intended to restart the device. Perhaps it should be called 'Clear'?

Menu mode

Holding the encoder button should eventually trigger menu mode, allowing the user to change volume, language, sleep time and LED brightness (as well as possibly other things?).

Reset Password from Faceplate ( referenced in mycroft-core issue)

From Joshua Montgomery

Let's implement a new feature that lets you reset the pi user password from the faceplate.

Select "Reset Password" and the faceplate displays it (while Mycroft says it out loud). Then call the passwd feature. Also: can someone please check that, if the root account has a login shell, the password is randomized on first boot.

Use only unambiguous characters (NOT 0, O, I, 1, etc.).

MycroftAI/mycroft-core#399

Animation editor

There should be a utility to create animations graphically rather than by manually editing the code. For the output, I was thinking it would generate the PROGMEM C array that can be copied and pasted into the code. This should actually be very simple if it is a terminal application that uses the WASD keys to move the cursor and another key to toggle the pixel.
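
For reference, the kind of output such a tool might emit could look like the sketch below: one PROGMEM byte per 8-pixel column of a frame. The layout, names, and byte values are assumptions about what the tool could generate, not the firmware's actual format.

#include <Arduino.h>
#include <avr/pgmspace.h>

// One hypothetical 8x32 frame: 32 column bytes, one bit per pixel.
const uint8_t FRAME_SMILE[32] PROGMEM = {
    0x00, 0x00, 0x24, 0x24, 0x24, 0x00, 0x00, 0x00,
    0x00, 0x40, 0x80, 0x80, 0x80, 0x80, 0x80, 0x40,
    0x40, 0x80, 0x80, 0x80, 0x80, 0x80, 0x40, 0x00,
    0x00, 0x00, 0x24, 0x24, 0x24, 0x00, 0x00, 0x00,
};

// Columns are read back out of program memory one byte at a time.
uint8_t frameColumn(uint8_t x) {
    return pgm_read_byte(&FRAME_SMILE[x]);
}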

Raspberry Pi support?

Hi!

Would it be possible to run this on a raspberry PI oled display? Has anyone tried that?

Thanks,

Andres

Mouth Animations

There should be multiple animation classes that extend a base MouthAnim class providing the behavior to change frames (see the sketch after this list). These include:

  • ForwardAnim
  • PingPongAnim
  • CustomAnim (for a manual sequence of frame ids)
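
A rough sketch of how these classes could relate; everything apart from the class names listed above is an assumption.

#include <stdint.h>

// Hypothetical base class: subclasses only decide which frame comes next,
// while the display code asks for frames at its own pace.
class MouthAnim {
public:
    MouthAnim(uint8_t frames) : frameCount(frames) {}
    virtual ~MouthAnim() {}
    virtual uint8_t nextFrame() = 0;
protected:
    uint8_t frameCount;
};

// Plays frames 0, 1, ..., n-1 and wraps around.
class ForwardAnim : public MouthAnim {
public:
    ForwardAnim(uint8_t n) : MouthAnim(n), i(0) {}
    uint8_t nextFrame() {
        uint8_t f = i;
        i = (i + 1) % frameCount;
        return f;
    }
private:
    uint8_t i;
};

// Plays 0..n-1 and then back down to 0, repeatedly.
class PingPongAnim : public MouthAnim {
public:
    PingPongAnim(uint8_t n) : MouthAnim(n), i(0), dir(1) {}
    uint8_t nextFrame() {
        if (frameCount < 2) return 0;
        uint8_t f = i;
        if (i == 0) dir = 1;
        if (i == frameCount - 1) dir = -1;
        i += dir;
        return f;
    }
private:
    uint8_t i;
    int8_t dir;
};

CustomAnim would hold a caller-supplied list of frame ids and simply walk through it.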

When displaying text on the screen...

It would be nice if:

  • text that fits on the screen (e.g. the pairing code) is shown without scrolling (see the sketch after this list)
  • text that is too large for the screen scrolls
  • we could vary the scroll speed to better sync up with the voice.
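
A tiny sketch of the fit/scroll decision mentioned in the first point; the 4-pixel glyph width is an assumption for illustration.

#include <Arduino.h>

const int DISPLAY_WIDTH_PX = 32;  // the Mark 1 matrix is 8x32
const int GLYPH_WIDTH_PX   = 4;   // assumed character width incl. spacing

// Text that fits on the faceplate is shown statically; anything wider scrolls.
bool needsScroll(const String &text) {
    return (int)text.length() * GLYPH_WIDTH_PX > DISPLAY_WIDTH_PX;
}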

Rethink MycroftHT1632 class

We should evaluate if this class is absolutely necessary or if it would be better to do away with it in favor of loading the program memory into a buffer. There is potential to implement this buffer inside MycroftDisplay to replace the class.
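
If the class is removed, the buffer inside MycroftDisplay could be as simple as the sketch below: an 8x32, one-bit-per-pixel framebuffer. All names and the column layout are assumptions.

#include <stdint.h>

// Hypothetical in-RAM framebuffer for the 8x32 matrix: one byte per column,
// one bit per row, 32 bytes in total.
class FrameBuffer {
public:
    void clear() {
        for (uint8_t x = 0; x < 32; ++x) cols[x] = 0;
    }
    void setPixel(uint8_t x, uint8_t y, bool on) {
        if (x >= 32 || y >= 8) return;
        if (on) cols[x] |= (1 << y);
        else    cols[x] &= ~(1 << y);
    }
    uint8_t column(uint8_t x) const {  // flushed to the display driver
        return x < 32 ? cols[x] : 0;
    }
private:
    uint8_t cols[32];
};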

Extra info for the README

Here's the place to mention anything that isn't already in the README file (or anything that should be changed). For instance, I think the image should be improved, or is perhaps unnecessary altogether.

Volume display

When Mycroft's volume is changed, instead of just displaying a speaking animation on his mouth as he tells the user what the new volume is, he should display a volume bar that shows the current volume level.

Bug: Long mouth image string doesn't work

@JarbasAl originally reported that the MycroftMouth::showIcon() method doesn't handle split icon strings correctly.

After debugging, we found that the $ sign indicating that a continuation message will be sent isn't stripped from the icon string, which produces a faulty code.

Additionally, the documentation doesn't match the implementation: it states that the follow-up code should start with a $ instead of ending with one.

A basic fix has been created in the bugfix/large-image branch.
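
For illustration only (the actual fix lives in the bugfix/large-image branch), the kind of change involved is stripping the trailing $ before the chunk is decoded; this helper is hypothetical.

#include <Arduino.h>

// Remove the trailing '$' continuation marker from an icon chunk before it
// is appended to the pending image data.
String stripContinuationMarker(const String &chunk, bool &moreToCome) {
    moreToCome = chunk.endsWith("$");
    if (moreToCome) {
        return chunk.substring(0, chunk.length() - 1);
    }
    return chunk;
}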

Documentation

To auto-generate the list of possible commands, run this in the root directory:
cat lib/*Processor/*Processor.cpp | grep -E "startsWith\(\"|Processor\(\"" | sed -e 's/^.*Processor("\([a-z]*\)").*$/\n=== \1 ===/gm' | sed -e 's/^.*"\(.*\)".*$/\1/gm'

Current output:

=== system ===
reset
mute
unmute
blink=
version

=== eyes ===
color=
level=
fill=
volume=
spin=
on
off
set
reset

=== test ===
begin

=== mouth ===
reset
faketalk
talk
listen
think
text=
icon
viseme=

=== weather ===
display=

Other notes to document:

eyes.color=

A hexadecimal color converted to a decimal integer. For example, pink (#FF00FF) would be sent as 16711935, which is FF00FF converted from hexadecimal to decimal.

For reference, hexadecimal colors are stored as #RRGGBB
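
A quick sketch of the same conversion in code (plain C++; names are illustrative):

#include <stdio.h>
#include <stdlib.h>

// Convert an "#RRGGBB" hex color into the decimal integer that
// eyes.color= expects, e.g. "#FF00FF" -> 16711935.
long hexColorToDecimal(const char *hex) {
    if (hex[0] == '#') ++hex;        // skip the leading '#'
    return strtol(hex, NULL, 16);
}

int main(void) {
    printf("eyes.color=%ld\n", hexColorToDecimal("#FF00FF"));  // prints 16711935
    return 0;
}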

State indications

Mycroft should somehow indicate what "state" he's in using his faceplate. Some examples: disconnected from the internet (eyes red?), recording, connecting to databases to answer search queries (which could possibly use the existing "..." thinking animation).

TEST ALL THE THINGS

We will need to test all the hardware before units are shipped. So we need some kind of hardware kick-off for the tests. This should be triggered by hitting the encoder 3 times in succession or holding it for 10 seconds (before the device is paired).

The device should kick on all the lights, flash the eyes in red, green, blue, and white (all three combined). Then finally the unit should say, "I'm listening." and listen for 5 seconds, and then play back whatever is said. Finally, we need to do something like kick up a WiFi AP as a test.
