spark-bouncer's Introduction

spark-bouncer

Overview

spark-bouncer is a security-focused door access control system built on top of the Spark Core platform.

It uses the Core's user flash memory to juggle up to 3000 RFID keys. Configuration happens via cloud-based function calls.

Security is provided by one-time passwords for each key usage, making your door immune to serial number spoofing attacks.

Your team is allowed to get in early, the crowd a bit later? No worries, the spark-bouncer keeps an eye on precise timing!

You plan to embed a flexible door access control into your existing infrastructure? The spark-bouncer is API driven!

Hook yourself into the live log event stream or query its persistently stored circular buffer.

Connect a relay to your electric strike and place a button on the inside to manually open the door, gentleman style.

Buzzing yourself in is just an API call away.

Hardware

Get started

Breadboard the parts together as described in the header and boot it up!

The code is currently set up to compile locally, outside of the cloud. If you just want to test it without a local build environment, flash the included firmware.bin to skip the setup.

If this is the first time you are running the spark-bouncer, its flash memory needs to be initialized:

$ spark call [core-id] reset

Hold a compatible RFID key to the reader. Nothing will happen - yet! Query the log and note your key's serial number:

$ spark get [core-id] log
123456;aa:bb:cc:dd;0
$ spark call [core-id] update "aa:bb:cc:dd;*;active,otp"
1

Try your RFID key again - the relay should make a happy noise.

Let's see what has happened:

$ spark get [core-id] log
123490;aa:bb:cc:dd;1
123480;aa:bb:cc:dd;9
123456;aa:bb:cc:dd;0

Reading from the bottom up: the key wasn't found at first (NOT_FOUND), then we updated it (UPDATED) - and finally access was granted (OPEN)!

Usage

Bouncer, let me in!

By calling the published open function, you'll get an instant buzz.

Example:

$ spark call [core-id] open

Configure RFID access

The spark-bouncer stores up to 3000 users, each identified by a 4 to 10 byte RFID serial number.

Store RFID key

You have to define whom to let in at which times. To do so, call the published update function with the following argument:

[key serial];[time restrictions];[flags]

Format used in the fields:

  • key serial - aa:bb:cc[:...] - up to 10 hex values separated by colons
  • time restrictions
    • * -> open at all times
    • - -> never open
    • up to seven 4 byte hex values defining the valid hours per weekday
  • flags - comma separated list; a flag not present is treated as false
    • otp -> enable one-time passwords for this key [recommended]
    • active -> mark key as active - mandatory for getting in
    • lost -> mark key as lost - won't get you in anymore
    • reset -> reset the stored OTP in case something went wrong

The call returns

  • 1 if all went well
  • -1 if the key couldn't be stored

Example:

$ spark call [core-id] update "aa:bb:cc:dd;*;active,otp"

Time based access

Each hour of a weekday is mapped to a bit in a 4-byte value. Setting a bit to 1 grants access for the corresponding hour.

Examples:

  • For the time between 16h and 17h, the 16th bit must be set (0x10000).
  • For full day access, set all bits to high (0xFFFFFFFF).
  • Grant access for all of Monday and Sunday; on Tuesdays, only buzz in between 16h and 17h and between 0h and 4h:
$ spark call [core-id] update "aa:bb:cc:dd;FFFFFFFF 1000F 0 0 0 0 FFFFFFFF;active,otp"

Logging

Data format

All logging data is returned as a semicolon separated list. The included elements are:

[timestamp];[key serial];[event code]

Event codes

Code  Event          Triggered when?
0     NOT_FOUND      scanned RFID key is not stored yet
1     OPEN           door access granted
2     OUT_OF_HOURS   valid key, but outside its allowed hours
3     DISABLED       usage of a key which is not flagged active
4     LOST           key is flagged as lost
5     OTP_MISSMATCH  possible hijack attempt; removes the key's active flag
8     STORAGE_FULL   very unlikely, but returned in case more than 3000 keys got stored
9     UPDATED        key data got updated via the update call

Subscribing to the live log

The spark-bouncer publishes all key usages to the Spark Cloud event system as private events.

Example subscription:

$ spark subscribe "" [core-id]

Published events:

  • card - after key handling or updating; the payload follows the data format above
  • button - when manual buzzer button is pressed
  • call - when door is opened via the Spark Cloud

Query the most recent events via the cloud

The Spark Cloud allows querying runtime variables with a maximum length of 622 characters.

The spark-bouncer always keeps an internal buffer up to date with the most recent log entries.

Published variables:

  • log - contains as many data format entries as it can hold, ordered newest first.

Example query:

$ spark get [core-id] log

Debugging

To control the Spark Core's debug output, call the published debug function with either

  • 1 -> to enable serial debug output, or
  • 0 -> to disable serial debug output

The debug mode can also be enabled by default at the top of the code.

Example:

$ spark call [core-id] debug 1
1
$ spark serial monitor
Opening serial monitor for com port: "/dev/cu.usbmodemfa131"
[rfid] identifying f3:65:1d:bc
[flash] Key found, index #0
-- Active? yes
-- Lost? no
-- Times:
          Monday   Tuesday  Wednesday  Thursday   Friday   Saturday   Sunday
 0 h                                      *                               
 1 h                                      *                               
 2 h                                      *                               
 3 h                                      *                               
 4 h                                      *                               
 5 h                                                                      
 6 h                                                                      
 7 h                                                                      
 8 h                                                                      
 9 h        *         *                                                   
10 h        *         *                                                   
11 h        *         *                                       *         * 
12 h        *         *                                       *         * 
13 h        *         *                                       *         * 
14 h        *         *                                       *         * 
15 h        *         *                                       *         * 
16 h        *         *                                       *         * 
17 h        *         *                                      (*)        * 
18 h        *         *                                       *         * 
19 h        *         *                                       *         * 
20 h        *         *         *                                         
21 h                            *                                         
22 h                            *                                         
23 h                            *                                         

-- last update of user configuration: Sat Sep 13 21:32:47 2014
-- last seen: Sat Sep 13 21:44:06 2014

-- OTP:      64 39 2C BC 4A F6 62 04 B1 FF 49 D0 58 2B F4 E3
OTP on Chip: 64 39 2C BC 4A F6 62 04 B1 FF 49 D0 58 2B F4 E3
New OTP:     DA 29 14 1D 37 12 7D 56 04 84 24 A6 49 E0 CA 67
[card] hours match, opening!
[door] opening
[door] closing

Reset storage

Be careful: if you need to reset your storage during development, call the published reset function. Your spark-bouncer will forget everything it knew.

It is recommended to disable this function in production environments.

spark-bouncer's People

Contributors

rastapasta

spark-bouncer's Issues

Possible to make a POST to a local REST API?

Can the Spark Core make a REST call to a local server when the door is opened?

I'm running openHAB as home automation and would love it if the spark-bouncer could send something upon successful/unsuccessful entry (e.g. to switch the alarm on/off).

door opening loop

Just to check it out, I flashed your firmware without changing anything, and as soon as the Spark boots up it opens the relay.

When I switch on debugging and check the serial port, it seems to be stuck in this loop:

Opening serial monitor for com port: "/dev/cu.usbmodem2623131"
[door] opening
[door] opening
...

Cloud Compilation Error

In file included from ../inc/spark_wiring.h:29:0,
from ../inc/application.h:29,
from MFRC522/MFRC522.h:77,
from MFRC522/MFRC522.cpp:8:
../../core-common-lib/SPARK_Firmware_Driver/inc/config.h:12:2: warning: #warning "Defaulting to Release Build" [-Wcpp]
#warning "Defaulting to Release Build"
^
MFRC522/MFRC522.cpp: In member function 'bool MFRC522::MIFARE_UnbrickUidSector(bool)':
MFRC522/MFRC522.cpp:1629:1: warning: control reaches end of non-void function [-Wreturn-type]
}
^
MFRC522/MFRC522.cpp: In member function 'byte MFRC522::PCD_CommunicateWithPICC(byte, byte, byte*, byte, byte*, byte*, byte*, byte, bool)':
MFRC522/MFRC522.cpp:379:20: warning: '_validBits' may be used uninitialized in this function [-Wmaybe-uninitialized]
if (*backLen < 2 || _validBits != 0) {
^
MFRC522/MFRC522.cpp: In member function 'void MFRC522::PICC_DumpMifareClassicSectorToSerial(MFRC522::Uid*, MFRC522::MIFARE_Key*, byte)':
MFRC522/MFRC522.cpp:1362:4: warning: 'invertedError' may be used uninitialized in this function [-Wmaybe-uninitialized]
if (invertedError) {
^
In file included from ../inc/spark_wiring.h:29:0,
from ../inc/application.h:29,
from flashee-eeprom/flashee-eeprom.h:22,
from flashee-eeprom/flashee-eeprom.cpp:17:
../../core-common-lib/SPARK_Firmware_Driver/inc/config.h:12:2: warning: #warning "Defaulting to Release Build" [-Wcpp]
#warning "Defaulting to Release Build"
^
In file included from ../inc/spark_wiring.h:29:0,
from ../inc/application.h:29,
from sparkbouncer.cpp:2:
../../core-common-lib/SPARK_Firmware_Driver/inc/config.h:12:2: warning: #warning "Defaulting to Release Build" [-Wcpp]
#warning "Defaulting to Release Build"
^
sparkbouncer.cpp:10:17: error: 'user_t' was not declared in this scope
int checkAccess(user_t &user);
^
sparkbouncer.cpp:10:25: error: 'user' was not declared in this scope
int checkAccess(user_t &user);
^
sparkbouncer.cpp:11:15: error: variable or field 'saveUser' declared void
void saveUser(user_t &user, uint16_t keyId);
^
sparkbouncer.cpp:11:15: error: 'user_t' was not declared in this scope
sparkbouncer.cpp:11:23: error: 'user' was not declared in this scope
void saveUser(user_t &user, uint16_t keyId);
^
sparkbouncer.cpp:11:38: error: expected primary-expression before 'keyId'
void saveUser(user_t &user, uint16_t keyId);
^
sparkbouncer.cpp:12:1: error: 'user_t' does not name a type
user_t readUser(uint16_t keyId);
^
sparkbouncer.cpp:13:15: error: variable or field 'dumpUser' declared void
void dumpUser(user_t &user);
^
sparkbouncer.cpp:13:15: error: 'user_t' was not declared in this scope
sparkbouncer.cpp:13:23: error: 'user' was not declared in this scope
void dumpUser(user_t &user);
^
sparkbouncer.cpp:116:25: error: 'int checkAccess(user_t&)' redeclared as different kind of symbol
} user_t;
^
sparkbouncer.cpp:10:5: error: previous declaration of 'int checkAccess'
int checkAccess(user_t &user);
^
sparkbouncer.cpp: In function 'void rfidIdentify()':
sparkbouncer.cpp:470:31: error: 'checkAccess' cannot be used as a function
if (debugMode) {
^
sparkbouncer.cpp: In function 'int checkAccess(user_t&)':
sparkbouncer.cpp:528:29: error: 'int checkAccess(user_t&)' redeclared as different kind of symbol
memcpy(target, buffer, 16);
^
sparkbouncer.cpp:10:5: error: previous declaration of 'int checkAccess'
int checkAccess(user_t &user);
^
make: *** [sparkbouncer.o] Error 1

Unfortunately I cannot compile locally either since I cannot get the toolchain installed with the guidance you provided...

connecting the RC522

you write in the header:

 * ************************************************************
 *  Connect RC522 module to following Spark Core pins:
 *    RST   -> A1
 *    SDA  ->  A2
 *    SCK  ->  A3
 *    MISO ->  A4
 *    MOSI ->  A5
 *
 *  Connect the door relais signal line to following default pin:
 *    D0
 *
 *  Connect the door open button to following default pins:
 *    D1  +  3.3V
 * ************************************************************

But I suppose you have to wire GND and 3.3V of the RC522 as well - right?
