aerosense-ai / data-gateway

Data influx for Aerosense.

Home Page: https://www.aerosense.ai/

License: Other

Languages: Python 99.32%, Dockerfile 0.68%
Topics: cloud, python, renewable-energy, renewables, wind-energy, wind-energy-analytics

data-gateway's People

Contributors: cortadocodes, juliusplindastus, thclark, time-trader, tommasopolonelli, weikangkong

data-gateway's Issues

Packet loss during persistence

Bug report

What is the current behavior?

Strange (negative gap) packet loss occurs after data persistence, possibly due to a serial buffer overflow.
Here is the output, with the number of bytes waiting in the buffer printed out:

[2021-12-06 18:04:30,224 | INFO | data_gateway.packet_reader] 0.000000 bytes waiting
[2021-12-06 18:04:30,241 | INFO | data_gateway.packet_reader] 769.000000 bytes waiting
[2021-12-06 18:04:30,242 | INFO | data_gateway.packet_reader] 522.000000 bytes waiting
[2021-12-06 18:04:30,243 | INFO | data_gateway.packet_reader] 275.000000 bytes waiting
[2021-12-06 18:04:30,244 | INFO | data_gateway.packet_reader] 28.000000 bytes waiting
[2021-12-06 18:04:30,265 | INFO | data_gateway.packet_reader] 0.000000 bytes waiting
[2021-12-06 18:04:34,236 | INFO | data_gateway.persistence] Window 0 written to disk.
[2021-12-06 18:04:34,236 | INFO | data_gateway.persistence] Saving Baros_P data to csv file.
[2021-12-06 18:04:34,494 | INFO | data_gateway.persistence] Saving Baros_T data to csv file.
[2021-12-06 18:04:34,664 | INFO | data_gateway.persistence] Saving Constat data to csv file.
[2021-12-06 18:04:34,721 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,722 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,722 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,723 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,723 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,724 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,725 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,725 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,726 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,726 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,727 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,727 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,728 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,728 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,728 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,729 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,729 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,730 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,730 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,731 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,731 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,732 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,732 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,733 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,733 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,733 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,734 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,734 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,735 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,735 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,736 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,736 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,737 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,737 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,737 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,738 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,738 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,739 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,739 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,740 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,740 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,741 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,741 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,742 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,742 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,742 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,743 | INFO | data_gateway.packet_reader] 4095.000000 bytes waiting
[2021-12-06 18:04:34,743 | INFO | data_gateway.packet_reader] 3952.000000 bytes waiting
[2021-12-06 18:04:34,744 | INFO | data_gateway.packet_reader] 3705.000000 bytes waiting
[2021-12-06 18:04:34,744 | INFO | data_gateway.packet_reader] 3458.000000 bytes waiting
[2021-12-06 18:04:34,745 | INFO | data_gateway.packet_reader] 3211.000000 bytes waiting
[2021-12-06 18:04:34,745 | INFO | data_gateway.packet_reader] 2964.000000 bytes waiting
[2021-12-06 18:04:34,746 | INFO | data_gateway.packet_reader] 2717.000000 bytes waiting
[2021-12-06 18:04:34,746 | INFO | data_gateway.packet_reader] 2470.000000 bytes waiting
[2021-12-06 18:04:34,746 | INFO | data_gateway.packet_reader] 2223.000000 bytes waiting
[2021-12-06 18:04:34,747 | INFO | data_gateway.packet_reader] 1976.000000 bytes waiting
[2021-12-06 18:04:34,747 | INFO | data_gateway.packet_reader] 1729.000000 bytes waiting
[2021-12-06 18:04:34,748 | INFO | data_gateway.packet_reader] 1482.000000 bytes waiting
[2021-12-06 18:04:34,748 | INFO | data_gateway.packet_reader] 1235.000000 bytes waiting
[2021-12-06 18:04:34,749 | INFO | data_gateway.packet_reader] 988.000000 bytes waiting
[2021-12-06 18:04:34,749 | INFO | data_gateway.packet_reader] 741.000000 bytes waiting
[2021-12-06 18:04:34,749 | INFO | data_gateway.packet_reader] 494.000000 bytes waiting
[2021-12-06 18:04:34,750 | INFO | data_gateway.packet_reader] 247.000000 bytes waiting
[2021-12-06 18:04:34,750 | INFO | data_gateway.packet_reader] 1023.000000 bytes waiting
[2021-12-06 18:04:34,751 | INFO | data_gateway.packet_reader] 776.000000 bytes waiting
[2021-12-06 18:04:34,751 | INFO | data_gateway.packet_reader] 529.000000 bytes waiting
[2021-12-06 18:04:34,752 | INFO | data_gateway.packet_reader] 282.000000 bytes waiting
[2021-12-06 18:04:34,752 | INFO | data_gateway.packet_reader] 35.000000 bytes waiting
[2021-12-06 18:04:34,766 | INFO | data_gateway.packet_reader] 7.000000 bytes waiting
[2021-12-06 18:04:34,766 | WARNING | data_gateway.packet_reader] Lost Baros_P packet(s): -57952858.82019043 ms gap
[2021-12-06 18:04:34,766 | WARNING | data_gateway.packet_reader] Lost Baros_T packet(s): -57952858.82019043 ms gap
[2021-12-06 18:04:34,811 | INFO | data_gateway.packet_reader] 741.000000 bytes waiting
[2021-12-06 18:04:34,811 | WARNING | data_gateway.packet_reader] Lost Baros_P packet(s): 57949038.85437011 ms gap
[2021-12-06 18:04:34,811 | WARNING | data_gateway.packet_reader] Lost Baros_T packet(s): 57949038.85437011 ms gap
[2021-12-06 18:04:34,812 | INFO | data_gateway.packet_reader] 494.000000 bytes waiting
[2021-12-06 18:04:34,812 | INFO | data_gateway.packet_reader] 247.000000 bytes waiting
[2021-12-06 18:04:34,813 | INFO | data_gateway.packet_reader] 0.000000 bytes waiting

Possible solutions:

@tommasopolonelli or @hmllr I would love your feedback on this.

Note that set_buffer_size is not available on Linux, and on Windows it is more of a "recommendation" to the driver.

  • Create our own buffer in the gateway code and manage it.
  • Put persistence operations on a separate thread from serial port reading.
  • Use the new and "experimental" Packetizer class when handling packets from the serial port (see the sketch below).
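
For the third option, a minimal sketch of what pyserial's threaded reader could look like here; the terminator byte, port and queue wiring are illustrative assumptions, not the gateway's actual protocol:

import queue

import serial
import serial.threaded

packet_queue = queue.Queue()


class GatewayPacketizer(serial.threaded.Packetizer):
    """Accumulate bytes from the serial port on a background thread."""

    TERMINATOR = b"\x00"  # assumption: whatever delimiter the firmware actually uses

    def handle_packet(self, packet):
        # Hand each packet to a queue; persistence runs on another thread,
        # so the reader never blocks on disk or cloud writes.
        packet_queue.put(packet)


with serial.threaded.ReaderThread(serial.Serial("/dev/ttyACM0", 2300000), GatewayPacketizer) as protocol:
    packet = packet_queue.get(timeout=60)  # the persistence side consumes from this queue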

Your environment

  • Library Version: 0.7.1

Allow override of configuration values from the environment

Feature request

I'd like to move to using a solution like balena to manage the raspberry pi images and keep all the hardware on consistent images; being able to override configuration values from the environment would make that much easier.
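
A minimal sketch of the kind of override this implies (the GATEWAY_ variable naming scheme is an assumption for illustration):

import os


def override_from_environment(configuration: dict) -> dict:
    """Override any configuration value with a GATEWAY_<KEY> environment variable."""
    for key in configuration:
        value = os.environ.get(f"GATEWAY_{key.upper()}")
        if value is not None:
            # Environment values arrive as strings and may need casting
            # to the type of the value they replace.
            configuration[key] = value
    return configuration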

Use Case

Please [describe your motivation and use case].

Current state

Please describe what you're doing presently to work around this or achieve what you're doing.

Data loss while writing to file

Recreated by @thclark after migration from gitlab - originally filed by Rafael Fischer Jun 17, 2021 12:52pm GMT+0100

Bug report

What is the current behavior?

Once, the data from the newly added constat "sensor" was not completely written to the file. It doesn't happen regularly.
The produced .json file can be found here

I think the issue is in the writing to the file, because the file stops abruptly (\n79.64799072265616,-30.104429244995117,-29,-9,28600,\n79.69299072265616,-30.070194244384766,-29,"}) without finishing the current sample. Note that the other sensors' data are complete up to timestamp 125 s.

Curiously, the end of the console output is also strange (see below), with the "Lost set of Constat packets: -125733.03588867171 ms gap" message appearing after the "Stopping gateway" message. A gap of -125 s would occur if the current_timestamp variable were set to zero somewhere.

I think what happened is that the constat packet arrived at some critical moment during shutdown of the Python application.
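
If so, the fix is probably in the shutdown ordering: stop feeding the parser, let it drain, and only then close the output files. A minimal sketch of that ordering, with all names hypothetical:

import threading

stop_reading = threading.Event()  # hypothetical flag checked by the serial reader loop


def shutdown(parser_thread, open_files):
    """Drain the parser before closing the files it writes to."""
    stop_reading.set()    # 1. stop pulling new packets off the serial port
    parser_thread.join()  # 2. wait for in-flight packets to be parsed and written
    for open_file in open_files.values():
        open_file.close()  # 3. only now is it safe to close the output files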

Your environment

  • Happened with commit 76d8d02
  • Platform: Windows

Console log:

(venv) PS C:\Users\Raphael Fischer\OneDrive\PBL\AeroSense\data-gateway\data_gateway> gateway start --interactive
[data_gateway.cli | INFO | 2021-06-17 12:05:28,950 | logging_handlers | 9496 | 15860] Using local logger.
[data_gateway.cli | INFO | 2021-06-17 12:05:28,950 | cli | 9496 | 15860] Using default configuration.
[data_gateway.cli | INFO | 2021-06-17 12:05:28,966 | cli | 9496 | 15860] Starting gateway in interactive mode - files will not be uploaded to cloud storage but will instead be saved to disk at '.\data_gateway' at intervals of 600.0 seconds.
Successfully updated handles.
startMics
Successfully updated handles.
Received first set of Constat packets
Lost set of Constat packets: 30.70434570312841 ms gap
startMics
Received first set of Mics packets
startBaros
Received first set of Baros_P packets
Received first set of Baros_T packets
startIMU
sReceived first Acc packet
Received first Gyro packet
tartAnalog
Received first Mag packet
Received first set of Analog Vbat packets
Lost set of Mics packets: 538.2341308577452 ms gap
Lost set of Baros_P packets: 515.991210937429 ms gap
Lost set of Baros_T packets: 515.991210937429 ms gap
Lost set of Mics packets: 539.6538085929024 ms gap
Successfully updated handles.
stop
[data_gateway.cli | INFO | 2021-06-17 12:07:46,315 | cli | 9496 | 15860] Stopping gateway.
Lost set of Constat packets: -125733.03588867171 ms gap

Assess handle updater for multiple nodes

Bug Report

In PacketReader.parse_packet(), we have a call (which originated in Raphael's old script) that allows the hardware to update its handles by sending a packet:

    if packet_type == str(self.config.type_handle_def):
        self.update_handles(packet)
        continue

In moving to v3 (i.e. #64), this would become

packet_type == str(self.config.nodes[node_id].type_handle_def)

i.e. the different nodes could update their handles differently.

However, we don't yet treat the handles as being different for each node; this could result in one node issuing an update that then corrupts data received from other nodes.

Workaround for implementing #64

We'll assume the handles are the same for every node and not worry about it for now; I'll close the loop with @tommasopolonelli to determine how this should be handled in future.
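
Until then, keeping one handle table per node would remove the cross-corruption risk; a minimal sketch, with DEFAULT_HANDLES, config and parse_handle_definition as hypothetical names:

# Sketch only: one handle table per node instead of a single shared one.
handles = {node_id: dict(DEFAULT_HANDLES) for node_id in config.nodes}


def update_handles(node_id, packet):
    # A handle update from one node can then no longer corrupt parsing
    # of packets received from the other nodes.
    handles[node_id] = parse_handle_definition(packet)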

Update configuration for multiple nodes

Feature request

We need the configuration to handle multiple nodes attached to one receiver.

Here's an example of what the configuration should look like (see the change notes below for extra detail):
{
  "gateway": {  
    "serial_buffer_rx_size": 4095,
    "serial_buffer_tx_size": 1280,
    "baudrate": 2300000,
    "endian": "little",
    "installation_reference": "ost-wt-evaluation",
    "longitude": "001",
    "latitude": "001",
    "turbine_id": "OST_WIND",
    "receiver_firmware_version": "2.1",
    "packet_key_offset": 245
  },
  "nodes": {
    "0": {
      "node_firmware_version": "whatever",
      "blade_id": "BAV01",
      "mics_freq": 15625,
      "mics_bm": 1023,
      "baros_freq": 100,
      "diff_baros_freq": 1000,
      "baros_bm": 1023,
      "acc_freq": 100,
      "acc_range": 16,
      "gyro_freq": 100,
      "gyro_range": 2000,
      "mag_freq": 12.5,
      "analog_freq": 16384,
      "constat_period": 45,
      "max_timestamp_slack": 0.005,
      "max_period_drift": 0.02,
      "type_handle_def": 255,
      "mics_samples_per_packet": 8,
      "imu_samples_per_packet": 40,
      "analog_samples_per_packet": 60,
      "baros_samples_per_packet": 1,
      "diff_baros_samples_per_packet": 24,
      "constat_samples_per_packet": 24,
      "sensor_names": [
        "Mics",
        "Baros_P",
        "Baros_T",
        "Diff_Baros",
        "Acc",
        "Gyro",
        "Mag",
        "Analog Vbat",
        "Constat"
      ],
      "default_handles": {
        "34": "Abs. baros",
        "36": "Diff. baros",
        "38": "Mic 0",
        "40": "Mic 1",
        "42": "IMU Accel",
        "44": "IMU Gyro",
        "46": "IMU Magnetometer",
        "48": "Analog1",
        "50": "Analog2",
        "52": "Constat",
        "54": "Cmd Decline",
        "56": "Sleep State",
        "58": "Info Message"
      },
      "decline_reason": {
        "0": "Bad block detection ongoing",
        "1": "Task already registered, cannot register again",
        "2": "Task is not registered, cannot de-register",
        "3": "Connection Parameter update unfinished"
      },
      "sleep_state": {
        "0": "Exiting sleep",
        "1": "Entering sleep"
      },
      "info_type": {
        "0": "Battery info"
      },
      "samples_per_packet": {
        "Mics": 8,
        "Diff_Baros": 24,
        "Baros_P": 1,
        "Baros_T": 1,
        "Acc": 40,
        "Gyro": 40,
        "Mag": 40,
        "Analog Vbat": 60,
        "Constat": 24
      },
      "number_of_sensors": {
        "Mics": 10,
        "Baros_P": 40,
        "Baros_T": 40,
        "Diff_Baros": 5,
        "Acc": 3,
        "Gyro": 3,
        "Mag": 3,
        "Analog Vbat": 1,
        "Constat": 4
      },
      "sensor_conversion_constants": {
        "Mics": [1, 1, 1, 1, 1, 1, 1, 1, 1, 1],
        "Diff_Baros": [1, 1, 1, 1, 1],
        "Baros_P": [
          40.96, 40.96, 40.96, 40.96, 40.96, 40.96, 40.96, 40.96, 40.96, 40.96,
          40.96, 40.96, 40.96, 40.96, 40.96, 40.96, 40.96, 40.96, 40.96, 40.96,
          40.96, 40.96, 40.96, 40.96, 40.96, 40.96, 40.96, 40.96, 40.96, 40.96,
          40.96, 40.96, 40.96, 40.96, 40.96, 40.96, 40.96, 40.96, 40.96, 40.96
        ],
        "Baros_T": [
          100, 100, 100, 100, 100, 100, 100, 100, 100, 100, 100, 100, 100, 100,
          100, 100, 100, 100, 100, 100, 100, 100, 100, 100, 100, 100, 100, 100,
          100, 100, 100, 100, 100, 100, 100, 100, 100, 100, 100, 100
        ],
        "Acc": [1, 1, 1],
        "Gyro": [1, 1, 1],
        "Mag": [1, 1, 1],
        "Analog Vbat": [1],
        "Constat": [1, 1, 1, 1]
      },
      "period": {
        "Mics": 6.4e-5,
        "Baros_P": 0.01,
        "Baros_T": 0.01,
        "Diff_Baros": 0.001,
        "Acc": 0.01,
        "Gyro": 0.01,
        "Mag": 0.08,
        "Analog Vbat": 6.103515625e-5,
        "Constat": 0.045
      },
      "sensor_commands": {
        "start": ["startBaros", "startDiffBaros", "startIMU", "startMics"],
        "stop": ["stopBaros", "stopDiffBaros", "stopIMU", "stopMics"],
        "configuration": [
          "configBaros",
          "configAccel",
          "configGyro",
          "configMics"
        ],
        "utilities": [
          "getBattery",
          "setConnInterval",
          "tpcBoostIncrease",
          "tpcBoostDecrease",
          "tpcBoostHeapMemThr1",
          "tpcBoostHeapMemThr2",
          "tpcBoostHeapMemThr4"
        ]
      },
      "sensor_coordinates": {
        "Mics": [
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0]
        ],
        "Baros_P": [
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0]
        ],
        "Baros_T": [
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0]
        ],
        "Diff_Baros": [
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0]
        ],
        "Acc": [
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0]
        ],
        "Gyro": [
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0]
        ],
        "Mag": [
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0]
        ],
        "Analog Vbat": [[0, 0, 0]],
        "Constat": [
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0],
          [0, 0, 0]
        ]
      }
    },
    "1": {},
    "2": {}
  },
  "session": {
    "label": null
  }
}
Here's what the configuration looks like now:
{
  "mics_freq": 15625,
  "mics_bm": 1023,
  "baros_freq": 100,
  "diff_baros_freq": 1000,
  "baros_bm": 1023,
  "acc_freq": 100,
  "acc_range": 16,
  "gyro_freq": 100,
  "gyro_range": 2000,
  "mag_freq": 12.5,
  "analog_freq": 16384,
  "constat_period": 45,
  "serial_buffer_rx_size": 4095,
  "serial_buffer_tx_size": 1280,
  "baudrate": 2300000,
  "endian": "little",
  "max_timestamp_slack": 0.005,
  "max_period_drift": 0.02,
  "packet_key": 245,
  "type_handle_def": 255,
  "mics_samples_per_packet": 8,
  "imu_samples_per_packet": 40,
  "analog_samples_per_packet": 60,
  "baros_samples_per_packet": 1,
  "diff_baros_samples_per_packet": 24,
  "constat_samples_per_packet": 24,
  "sensor_names": [
    "Mics",
    "Baros_P",
    "Baros_T",
    "Diff_Baros",
    "Acc",
    "Gyro",
    "Mag",
    "Analog Vbat",
    "Constat"
  ],
  "default_handles": {
    "34": "Abs. baros",
    "36": "Diff. baros",
    "38": "Mic 0",
    "40": "Mic 1",
    "42": "IMU Accel",
    "44": "IMU Gyro",
    "46": "IMU Magnetometer",
    "48": "Analog1",
    "50": "Analog2",
    "52": "Constat",
    "54": "Cmd Decline",
    "56": "Sleep State",
    "58": "Info Message"
  },
  "decline_reason": {
    "0": "Bad block detection ongoing",
    "1": "Task already registered, cannot register again",
    "2": "Task is not registered, cannot de-register",
    "3": "Connection Parameter update unfinished"
  },
  "sleep_state": {
    "0": "Exiting sleep",
    "1": "Entering sleep"
  },
  "info_type": {
    "0": "Battery info"
  },
  "samples_per_packet": {
    "Mics": 8,
    "Diff_Baros": 24,
    "Baros_P": 1,
    "Baros_T": 1,
    "Acc": 40,
    "Gyro": 40,
    "Mag": 40,
    "Analog Vbat": 60,
    "Constat": 24
  },
  "number_of_sensors": {
    "Mics": 10,
    "Baros_P": 40,
    "Baros_T": 40,
    "Diff_Baros": 5,
    "Acc": 3,
    "Gyro": 3,
    "Mag": 3,
    "Analog Vbat": 1,
    "Constat": 4
  },
  "sensor_conversion_constants": {
    "Mics": [
      1,
      1,
      1,
      1,
      1,
      1,
      1,
      1,
      1,
      1
    ],
    "Diff_Baros": [
      1,
      1,
      1,
      1,
      1
    ],
    "Baros_P": [
      40.96,
      40.96,
      40.96,
      40.96,
      40.96,
      40.96,
      40.96,
      40.96,
      40.96,
      40.96,
      40.96,
      40.96,
      40.96,
      40.96,
      40.96,
      40.96,
      40.96,
      40.96,
      40.96,
      40.96,
      40.96,
      40.96,
      40.96,
      40.96,
      40.96,
      40.96,
      40.96,
      40.96,
      40.96,
      40.96,
      40.96,
      40.96,
      40.96,
      40.96,
      40.96,
      40.96,
      40.96,
      40.96,
      40.96,
      40.96
    ],
    "Baros_T": [
      100,
      100,
      100,
      100,
      100,
      100,
      100,
      100,
      100,
      100,
      100,
      100,
      100,
      100,
      100,
      100,
      100,
      100,
      100,
      100,
      100,
      100,
      100,
      100,
      100,
      100,
      100,
      100,
      100,
      100,
      100,
      100,
      100,
      100,
      100,
      100,
      100,
      100,
      100,
      100
    ],
    "Acc": [
      1,
      1,
      1
    ],
    "Gyro": [
      1,
      1,
      1
    ],
    "Mag": [
      1,
      1,
      1
    ],
    "Analog Vbat": [
      1
    ],
    "Constat": [
      1,
      1,
      1,
      1
    ]
  },
  "period": {
    "Mics": 6.4e-05,
    "Baros_P": 0.01,
    "Baros_T": 0.01,
    "Diff_Baros": 0.001,
    "Acc": 0.01,
    "Gyro": 0.01,
    "Mag": 0.08,
    "Analog Vbat": 6.103515625e-05,
    "Constat": 0.045
  },
  "sensor_commands": {
    "start": ["startBaros", "startDiffBaros", "startIMU", "startMics"],
    "stop": ["stopBaros", "stopDiffBaros", "stopIMU", "stopMics"],
    "configuration": ["configBaros", "configAccel", "configGyro", "configMics"],
    "utilities": [
      "getBattery",
      "setConnInterval",
      "tpcBoostIncrease",
      "tpcBoostDecrease",
      "tpcBoostHeapMemThr1",
      "tpcBoostHeapMemThr2",
      "tpcBoostHeapMemThr4"
    ]
  },
  "installation_data": {
    "installation_reference": "ost-wt-evaluation",
    "longitude": "001",
    "latitude": "001",
    "turbine_id": "OST_WIND",
    "blade_id": "BAV01",
    "hardware_version": "2.1",
    "sensor_coordinates": {
      "Mics": [
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ]
      ],
      "Baros_P": [
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ]
      ],
      "Baros_T": [
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ]
      ],
      "Diff_Baros": [
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ]
      ],
      "Acc": [
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ]
      ],
      "Gyro": [
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ]
      ],
      "Mag": [
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ]
      ],
      "Analog Vbat": [
        [
          0,
          0,
          0
        ]
      ],
      "Constat": [
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ],
        [
          0,
          0,
          0
        ]
      ]
    }
  },
  "session_data": {
    "label": null
  }
}

Change Notes

  • gateway-specific values are split out from node-specific values
  • node-specific values are repeated, one set for each node, keyed on an integer.
    • they are keyed rather than put in an array so that, in future, we could change them to strings or hex or whatever
  • session_data is renamed session because _data is redundant/obvious
  • gateway.receiver_firmware_version was previously called hardware_version but that was ambiguous
  • nodes.x.node_firmware_version is the other part of disambiguating hardware_version.
  • blade_id was part of the data that ended up in the installations table (as was sensor position data). That installations table will require refactoring in the db / cloud functions because it shouldn't have that node-specific data on it; we should just keep that in configurations and query through for it.
  • much of the configuration is duplicated between the nodes, but that is deliberate so that, in future, it'll be easier to run different firmware versions on the same installation.
  • packet_key (which was 254 for the v2 firmware) has been replaced with packet_key_offset (see below for how to find the packet key).

Steps

  • Update Configuration class to accept that new form input.
  • Identify the set of possible packet keys and use them to identify packets being received in PacketReader.read_packets() (instead of one fixed packet key being used as the id). Subtract the packet_key_offset to get the node_id of the packet, and include that in the packet queue data.
  • Adapt the persistence layer to allow for different nodes, and different nodes having different sensor names (perhaps by dividing persistence by node?)
  • Update all usage of self.config to use the appropriate variables (now deeper in the config) (see "Using node-specific data in the packet parser" below).
  • Update the database schema to:
    • Add node_id to the measurements table so that each measurement is referenced to an individual node.
    • Remove the geometry field from the installations table because it is duplicated from the configurations table.
      • See these notes on how to delete columns
      • The reason we did this was so that we could show where the sensor locations were on the blade in the list of installations. But I think we should do this more smartly through materialized views rather than duplicating that data, particularly since it could change over time as configurations change.
    • Remove the blade_id value from the installations table (now node-specific, and again duplicated from the configurations table)
  • Update the sensor data sent to the cloud to include node_id.
  • Update the cloud function to correctly parse data from the configuration.
  • Update create_installation and add_sensor_type for the new configuration pattern (create_installation will also have to respect the new DB schema).

Getting node_id from packet_key_offset

The node id is encoded in, but is not the same as, the packet key:

packet_key = node_id + packet_key_offset

so for each node, node_id = packet_key - config.gateway.packet_key_offset.

BACKWARD COMPATIBILITY WITH V2 is achievable by setting the v2 hardware up with node_id=0, so that its packet key is trivially equal to the offset.
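
A minimal sketch of the lookup when a packet arrives (the constant values and node count here are illustrative):

PACKET_KEY_OFFSET = 245  # config.gateway.packet_key_offset
NUMBER_OF_NODES = 3

leading_byte = 246  # e.g. the first byte read from the serial port
node_id = leading_byte - PACKET_KEY_OFFSET  # -> 1

if 0 <= node_id < NUMBER_OF_NODES:
    pass  # valid packet key: route this packet to node 1's parser state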

Using node-specific data in the packet parser

Use something like:

        previous_timestamp = {}
        data = {}

        for node_id in self.config.node_ids:  # TODO: add a node_ids property to the config
            data[node_id] = {}
            previous_timestamp[node_id] = {}

            for sensor_name in self.config.nodes[node_id].sensor_names:
                previous_timestamp[node_id][sensor_name] = -1
                data[node_id][sensor_name] = [
                    ([0] * self.config.nodes[node_id].samples_per_packet[sensor_name])
                    for _ in range(self.config.nodes[node_id].number_of_sensors[sensor_name])
                ]

Extra Notes

  • The getBattery firmware bug (after sending getBattery you had to reset the sensor) is apparently fixed in v3.

TypeError while persisting to cloud

Bug report

What is the current behavior?

Running in cloud mode:

gateway start --gcp-project-name=aerosense-twined --gcp-bucket-name=aerosense-ingress-eu

Results in the following error during persistence to cloud:

[2021-12-06 15:38:51,200 | ERROR | data_gateway.persistence] Object of type Configuration is not JSON serializable
Traceback (most recent call last):
  File "/home/pi/aerosense/install/data-gateway/data_gateway/persistence.py", line 264, in _persist_window
    timeout=self.upload_timeout,
  File "/home/pi/flasks/gateway/lib/python3.7/site-packages/octue/cloud/storage/client.py", line 101, in upload_from_string
    self._overwrite_blob_custom_metadata(blob, metadata)
  File "/home/pi/flasks/gateway/lib/python3.7/site-packages/octue/cloud/storage/client.py", line 298, in _overwrite_blob_custom_metadata
    blob.metadata = self._encode_metadata(metadata)
  File "/home/pi/flasks/gateway/lib/python3.7/site-packages/octue/cloud/storage/client.py", line 320, in _encode_metadata
    return {key: json.dumps(value, cls=OctueJSONEncoder) for key, value in metadata.items()}
  File "/home/pi/flasks/gateway/lib/python3.7/site-packages/octue/cloud/storage/client.py", line 320, in <dictcomp>
    return {key: json.dumps(value, cls=OctueJSONEncoder) for key, value in metadata.items()}
  File "/usr/lib/python3.7/json/__init__.py", line 238, in dumps
    **kw).encode(obj)
  File "/usr/lib/python3.7/json/encoder.py", line 199, in encode
    chunks = self.iterencode(o, _one_shot=True)
  File "/usr/lib/python3.7/json/encoder.py", line 257, in iterencode
    return _iterencode(o, 0)
  File "/home/pi/flasks/gateway/lib/python3.7/site-packages/octue/utils/encoders.py", line 33, in default
    return TwinedEncoder.default(self, obj)
  File "/home/pi/flasks/gateway/lib/python3.7/site-packages/twined/utils/encoders.py", line 32, in default
    return json.JSONEncoder.default(self, obj)
  File "/usr/lib/python3.7/json/encoder.py", line 179, in default
    raise TypeError(f'Object of type {o.__class__.__name__} '
TypeError: Object of type Configuration is not JSON serializable
[2021-12-06 15:38:51,206 | WARNING | data_gateway.persistence] Upload of window may have failed - writing to disk at 'data_gateway/.backup/163946/window-0.json'
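
The window metadata evidently includes the Configuration object itself, which OctueJSONEncoder can't encode. A minimal sketch of one possible fix, converting it to primitives before upload (to_dict() is a hypothetical method name here, not necessarily the real Configuration API):

import json

# Sketch only: attach a plain-dict version of the configuration as metadata,
# so json.dumps in the storage client no longer raises TypeError.
metadata = {"data_gateway__configuration": configuration.to_dict()}  # to_dict() is hypothetical
json.dumps(metadata)  # succeeds once the value is JSON-serialisable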

Your environment

  • Library Version: 0.7.1

Factor out hard-coded sensor names/types and packet structures

Feature request

Current state

Currently, the names/types of sensors are hard-coded in lots of different places in packet_reader.py as separate string literals, and also as constants where they're reused in a defined way. The logic used to parse their data is also hard-coded and difficult to separate out. This is fine for now, but if the gateway needs to support different sensors, it would be better to make the packet-reading algorithm modular so it can easily accommodate new or different sensors.

Parsing packets into structured data can be generalised in the same way.
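
For example, each sensor could be described declaratively and parsed generically; a minimal sketch (all names and fields here are illustrative, not the current API), mirroring the byte layout used for Diff_Baros in the packet reader:

from dataclasses import dataclass
from typing import List


@dataclass
class SensorSpec:
    """Illustrative declarative description of one sensor type."""

    name: str  # e.g. "Diff_Baros"
    samples_per_packet: int
    number_of_sensors: int
    bytes_per_sample: int
    signed: bool
    byteorder: str = "little"

    def parse(self, payload: bytes) -> List[List[int]]:
        """Unpack one packet's payload into per-sensor lists of samples."""
        bps = self.bytes_per_sample
        return [
            [
                int.from_bytes(
                    payload[bps * (self.number_of_sensors * i + j) : bps * (self.number_of_sensors * i + j) + bps],
                    self.byteorder,
                    signed=self.signed,
                )
                for i in range(self.samples_per_packet)
            ]
            for j in range(self.number_of_sensors)
        ]


# Usage sketch: registering a sensor becomes data, not code.
DIFF_BAROS = SensorSpec(name="Diff_Baros", samples_per_packet=24, number_of_sensors=5, bytes_per_sample=2, signed=False)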

log handling not idempotent across tests

Bug report

What is the current behavior?

We are leaking global logger state across tests, and as a result get a pytest error:

/usr/local/bin/python ~/.vscode-server/extensions/ms-python.python-2022.8.1/pythonFiles/testing_tools/run_adapter.py discover pytest -- --rootdir . -s --cache-clear
cwd: .
[ERROR 2022-6-6 9:43:24.553]: Error discovering pytest tests:
 [n [Error]: --- Logging error ---
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/logging/__init__.py", line 1086, in emit
    stream.write(msg + self.terminator)
ValueError: I/O operation on closed file.
Call stack:
  File "/usr/local/lib/python3.9/multiprocessing/util.py", line 332, in _exit_function
    info('process shutting down')
  File "/usr/local/lib/python3.9/multiprocessing/util.py", line 54, in info
    _logger.log(INFO, msg, *args)
Message: 'process shutting down'
Arguments: ()

The issue is as described here: pytest-dev/pytest#5743

What is the expected behavior?

The log handlers need to be cleaned up on a per-test basis so that the tests run idempotently.
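
A minimal sketch of per-test cleanup via an autouse pytest fixture (the handler bookkeeping here is illustrative; adapt it to wherever the gateway actually installs its handlers):

import logging

import pytest


@pytest.fixture(autouse=True)
def clean_log_handlers():
    root = logging.getLogger()
    handlers_before = list(root.handlers)
    yield
    # Remove any handlers a test added, so closed streams don't leak into
    # the next test (or into pytest's own shutdown logging).
    for handler in root.handlers[:]:
        if handler not in handlers_before:
            root.removeHandler(handler)
            handler.close()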

Enable architecture-specific installs

Bug report

We're unable to get poetry to install the gateway on mac M1, x86_64 and armv7l (rPi), because poetry doesn't allow specification of different sources for different architectures - adding multiple sources is riddled with problems in poetry.

  • So, grpcio won't install on rPi, because pypi doesn't have a binary wheel for armv7l.
  • This is unlike pip, which on rPi preferentially uses piwheels (see /etc/pip.conf on the rPi) to fetch the wheels.

Plan

CI + development on mac M1 and Windows machines

poetry install

rPi development

poetry export -f requirements.txt --output requirements.txt --dev --without-hashes && pip install -r requirements.txt

rPi production

pip install <data_gateway_url>

Test how pip uses pyproject.toml.
Test whether pip installing from the data gateway repo (per the installation instructions for deploying):

  • Works on all three platforms
  • Installs only production dependencies, not dev dependencies

Implement WaitUntilSetComplete and timestamp synchronisation

Feature request

In parseSensorPacket() of the newer packet reader script (reproduced below), the hardware team have implemented a function waitTillSetComplete() in order to correct timestamps.

Sift through it and update our PacketReader.parse_packets() to do the same thing.

@cortadocodes I'm assigning this to you because I don't have a big monitor here, and I'm really struggling on my 13" screen to do this!

The New Packet Reader

import os
import sys
import time
from _thread import start_new_thread
from datetime import datetime

import serial

MICS_FREQ = 15625
MICS_BM = 0x3FF  # 1023
BAROS_FREQ = 100
DIFF_BAROS_FREQ = 1000
BAROS_BM = 0x3FF  # 1023
ACC_FREQ = 100
ACC_RANGE = 16
GYRO_FREQ = 100
GYRO_RANGE = 2000
ANALOG_FREQ = 16384
NUM_NODES = 6

# for interaction with the base station
mode = 1  # mode = 0: Linux, mode = 1: Windows

# for interaction with the aerosense debug
# mode = 1

if mode == 0:
    BAUDRATE = 2300000
    PORT = "/dev/ttyACM0"
elif mode == 1:
    BAUDRATE = 2300000
    PORT = "COM12"

ENDIAN = "little"
MAX_TIMESTAMP_SLACK = 5e-3  # 5 ms
MAX_PERIOD_DRIFT = 0.02  # 2% difference between IMU clock and CPU clock allowed

PACKET_KEY = 0xFE  # 254
PACKET_KEY_OFFSET = 0xF5  # 245
TYPE_HANDLE_DEF = 0xFF  # 255


handles = {
    34: "Abs. baros",
    36: "Diff. baros",
    38: "Mic 0",
    40: "Mic 1",
    42: "IMU Accel",
    44: "IMU Gyro",
    46: "IMU Magnetometer",
    48: "Analog1",
    50: "Analog2",
    52: "Constat",
    54: "Cmd Decline",
    56: "Sleep State",
    58: "Remote Info Message",
    60: "Timestamp Packet 0",
    62: "Timestamp Packet 1",
    64: "Local Info Message",
}


decline_reason = {
    0: "Bad block detection ongoing",
    1: "Task already registered, cannot register again",
    2: "Task is not registered, cannot de-register",
    3: "Connection parameter update unfinished",
    4: "Not ready to sleep",
    5: "Not in sleep",
}


sleep_state = {0: "Exiting sleep", 1: "Entering sleep"}


remote_info = {0: "Battery info"}


local_info = {
    0: "Synchronization not ready as not every sensor node is connected",
    1: "Time synchronization info",
    2: "Time sync exception",
    4: "Time sync coarse data record error",
    8: "Time sync alignment error",
    16: "Time sync coarse data time diff error",
    32: "Device not connected",
    64: "select message destination successful",
    128: "Time sync success",
    129: "Coarse sync finish",
    130: "time sync msg sent",
}


def errPrint(s):
    print("***** " + s + " *****")


def parseHandleDef(payload):
    global handles  # note: without this, the assignment below created a local and the update had no effect

    startHandle = int.from_bytes(payload[0:1], ENDIAN)
    endHandle = int.from_bytes(payload[2:3], ENDIAN)
    print(startHandle, endHandle)

    if endHandle - startHandle == 30:
        handles = {
            startHandle + 2: "Abs. baros",
            startHandle + 4: "Diff. baros",
            startHandle + 6: "Mic 0",
            startHandle + 8: "Mic 1",
            startHandle + 10: "IMU Accel",
            startHandle + 12: "IMU Gyro",
            startHandle + 14: "IMU Magnetometer",
            startHandle + 16: "Analog1",
            startHandle + 18: "Analog2",
            startHandle + 20: "Constat",
            startHandle + 22: "Cmd Decline",
            startHandle + 24: "Sleep State",
            startHandle + 26: "Remote Info Message",
            startHandle + 28: "Timestamp Packet 0",
            startHandle + 30: "Timestamp Packet 1",
            startHandle + 32: "Local Info Message",
        }
        print("Successfully updated the handles")
    else:
        errPrint("Handle error: " + str(startHandle) + " " + str(endHandle))


files = {}

MICS_SAMPLES_PER_PACKET = 8
BAROS_SAMPLES_PER_PACKET = 1
IMU_SAMPLES_PER_PACKET = int(240 / 2 / 3)
ANALOG_SAMPLES_PER_PACKET = 60
DIFF_BAROS_SAMPLES_PER_PACKET = 24


samplesPerPacket = {
    "Mics": MICS_SAMPLES_PER_PACKET,
    "Baros_P": BAROS_SAMPLES_PER_PACKET,
    "Baros_T": BAROS_SAMPLES_PER_PACKET,
    "Acc": IMU_SAMPLES_PER_PACKET,
    "Gyro": IMU_SAMPLES_PER_PACKET,
    "Mag": IMU_SAMPLES_PER_PACKET,
    "Analog": ANALOG_SAMPLES_PER_PACKET,
    "Diff_Baros": DIFF_BAROS_SAMPLES_PER_PACKET,
}


nMeasQty = {
    "Mics": 10,
    "Baros_P": 40,
    "Baros_T": 40,
    "Acc": 3,
    "Gyro": 3,
    "Mag": 3,
    "Analog": 2,
    "Diff_Baros": 5,
}


data = {
    "Mics": [([0] * samplesPerPacket["Mics"]) for i in range(nMeasQty["Mics"])],
    "Baros_P": [([0] * samplesPerPacket["Baros_P"]) for i in range(nMeasQty["Baros_P"])],
    "Baros_T": [([0] * samplesPerPacket["Baros_T"]) for i in range(nMeasQty["Baros_T"])],
    "Acc": [([0] * samplesPerPacket["Acc"]) for i in range(nMeasQty["Acc"])],
    "Gyro": [([0] * samplesPerPacket["Gyro"]) for i in range(nMeasQty["Gyro"])],
    "Mag": [([0] * samplesPerPacket["Mag"]) for i in range(nMeasQty["Mag"])],
    "Analog": [([0] * samplesPerPacket["Analog"]) for i in range(nMeasQty["Analog"])],
    "Diff_Baros": [([0] * samplesPerPacket["Diff_Baros"]) for i in range(nMeasQty["Diff_Baros"])],
}


period = {
    "Mics": 1 / MICS_FREQ,
    "Baros_P": 1 / BAROS_FREQ,
    "Baros_T": 1 / BAROS_FREQ,
    "Acc": 1 / ACC_FREQ,
    "Gyro": 1 / GYRO_FREQ,
    "Mag": 1 / 12.5,
    "Analog": 1 / ANALOG_FREQ,
    "Diff_Baros": 1 / DIFF_BAROS_FREQ,
}


currentTimestamp = {"Mics": 0, "Baros_P": 0, "Baros_T": 0, "Acc": 0, "Gyro": 0, "Mag": 0, "Analog": 0, "Diff_Baros": 0}

prevIdealTimestamp = {
    "Mics": 0,
    "Baros_P": 0,
    "Baros_T": 0,
    "Acc": 0,
    "Gyro": 0,
    "Mag": 0,
    "Analog": 0,
    "Diff_Baros": 0,
}


def writeData(type, timestamp, period, node=1):  # timestamp in s
    n = len(data[type][0])  # number of samples

    for i in range(len(data[type][0])):  # iterate through all sample times
        time = timestamp - (n - i) * period
        files[node][type].write(str(time) + ",")

        for meas in data[type]:  # iterate through all measured quantities
            files[node][type].write(str(meas[i]) + ",")

        files[node][type].write("\n")


# The sensor data arrive in packets that contain n samples from several sensors of the same type,
# e.g. one barometer packet contains 40 samples from 4 barometers each.
# For each sensor type (e.g. baro), this function waits until the packets from all sensors have
# arrived, then writes those to the .csv file. Since timestamps only come at a packet level, it
# also interpolates the within-packet timestamps.


def waitTillSetComplete(type, t, node=1):  # timestamp in 1/(2**16) s
    if type == "Mics" or type == "Baros_P" or type == "Baros_T" or type == "Diff_Baros" or type == "Analog":
        # For these measurement types, the samples are inherently synchronized to the CPU time already.
        # The timestamps may be slightly off, so take the first one as a reference and use the following
        # ones only to check whether a packet has been dropped. Also, for mics and baros, there exist
        # packet sets: several packets arrive with the same timestamp.
        if currentTimestamp[type] != 0:
            idealNewTimestamp = prevIdealTimestamp[type] + samplesPerPacket[type] * period[type] * (2 ** 16)

            # If at least one set (= one packet per mic/baro group) of packets was lost:
            if abs(idealNewTimestamp - currentTimestamp[type]) > MAX_TIMESTAMP_SLACK * (2 ** 16):
                if prevIdealTimestamp[type] != 0 and type != "Mics":
                    print(
                        "Lost set of "
                        + type
                        + " packets: "
                        + str((currentTimestamp[type] - idealNewTimestamp) / (2 ** 16) * 1000)
                        + "ms gap"
                    )

                if type != "Mics":
                    idealNewTimestamp = currentTimestamp[type]

            writeData(type, idealNewTimestamp / (2 ** 16), period[type], node)
            data[type] = [([0] * samplesPerPacket[type]) for i in range(nMeasQty[type])]
            prevIdealTimestamp[type] = idealNewTimestamp
            currentTimestamp[type] = t
        else:
            if type == "Mics":
                prevIdealTimestamp[type] = t

            currentTimestamp[type] = t
            print("Received first set of " + type + " packets")

    else:  # The IMU values are not synchronized to the CPU time, so simply always take the timestamp we have
        if currentTimestamp[type] != 0:
            per = period[type]

            # If there is a previous timestamp, calculate the actual sampling period from the
            # difference to the current timestamp:
            if prevIdealTimestamp[type] != 0:
                per = (currentTimestamp[type] - prevIdealTimestamp[type]) / samplesPerPacket[type] / (2 ** 16)

                # If the calculated period is reasonable, accept it. If not, most likely a packet got lost:
                if abs(per - period[type]) / period[type] < MAX_PERIOD_DRIFT:
                    period[type] = per
                else:
                    print(
                        "Lost "
                        + type
                        + " packet: "
                        + str((currentTimestamp[type] - prevIdealTimestamp[type]) / (2 ** 16) * 1000)
                        + "ms gap"
                    )
            else:
                print("Received first " + type + " packet")

            writeData(type, t / (2 ** 16), period[type], node)

        prevIdealTimestamp[type] = currentTimestamp[type]
        currentTimestamp[type] = t


def parseSensorPacket(type, len, payload, node=1):

    global mic_cnt

    if not type in handles:

        print("Received packet with unknown type: ", type)

        print("Payload len: ", len)

        #        print("Payload: ", int.from_bytes(payload, ENDIAN))

        return

    t = int.from_bytes(payload[240:244], ENDIAN, signed=False)  # Read timestamp from packet

    if handles[type] == "Abs. baros":

        waitTillSetComplete("Baros_P", t, node)

        waitTillSetComplete("Baros_T", t, node)

        # Write the received payload to the data field

        for i in range(BAROS_SAMPLES_PER_PACKET):

            for j in range(nMeasQty["Baros_P"]):

                bps = 6  # bytes per sample

                data["Baros_P"][j][i] = int.from_bytes(
                    payload[(bps * j) : (bps * j + 4)], ENDIAN, signed=False
                )  # /4096

                data["Baros_T"][j][i] = int.from_bytes(
                    payload[(bps * j + 4) : (bps * j + 6)], ENDIAN, signed=True
                )  # /100

    elif handles[type] == "Diff. baros":

        waitTillSetComplete("Diff_Baros", t, node)

        # int_payload = [x for x in payload]

        # print(int_payload)

        # Write the received payload to the data field

        for i in range(DIFF_BAROS_SAMPLES_PER_PACKET):

            for j in range(nMeasQty["Diff_Baros"]):

                bps = 2  # bytes per sample

                # this result depends on the sensor (multiply with the sensor max value to get the scaled result)

                # data["Diff_Baros"][j][i] = (int.from_bytes(payload[(bps*(nMeasQty["Diff_Baros"]*i+j)) : (bps*(nMeasQty["Diff_Baros"]*i+j)+bps)], ENDIAN, signed=False) - 6553)/(58982-6553)

                data["Diff_Baros"][j][i] = int.from_bytes(
                    payload[(bps * (nMeasQty["Diff_Baros"] * i + j)) : (bps * (nMeasQty["Diff_Baros"] * i + j) + bps)],
                    ENDIAN,
                    signed=False,
                )

    elif handles[type] == "Mic 0":

        waitTillSetComplete("Mics", t, node)

        bps = 3  # bytes per sample

        for i in range(MICS_SAMPLES_PER_PACKET // 2):

            for j in range(5):

                data["Mics"][j][2 * i] = int.from_bytes(
                    payload[(bps * j + 20 * bps * i) : (bps * j + 20 * bps * i + 3)], "big", signed=True
                )

                data["Mics"][j][2 * i + 1] = int.from_bytes(
                    payload[(bps * j + 20 * bps * i + 5 * bps) : (bps * j + 20 * bps * i + 3 + 5 * bps)],
                    "big",
                    signed=True,
                )

                data["Mics"][j + 5][2 * i] = int.from_bytes(
                    payload[(bps * j + 20 * bps * i + 10 * bps) : (bps * j + 20 * bps * i + 3 + 10 * bps)],
                    "big",
                    signed=True,
                )

                data["Mics"][j + 5][2 * i + 1] = int.from_bytes(
                    payload[(bps * j + 20 * bps * i + 15 * bps) : (bps * j + 20 * bps * i + 3 + 15 * bps)],
                    "big",
                    signed=True,
                )

    elif handles[type] == "Mic 1":

        if payload[0] == 1:

            print("Sensor reading from flash done")  # print("Mics reading done")

        elif payload[0] == 2:

            print("Flash erasing done")  # print("Mics erasing done")

        elif payload[0] == 3:

            print("Sensor started")  # print("Mics and or (diff)baros started")

    elif handles[type].startswith("IMU Accel"):

        waitTillSetComplete("Acc", t, node)

        # Write the received payload to the data field

        for i in range(IMU_SAMPLES_PER_PACKET):

            data["Acc"][0][i] = int.from_bytes(payload[(6 * i) : (6 * i + 2)], ENDIAN, signed=True)

            data["Acc"][1][i] = int.from_bytes(payload[(6 * i + 2) : (6 * i + 4)], ENDIAN, signed=True)

            data["Acc"][2][i] = int.from_bytes(payload[(6 * i + 4) : (6 * i + 6)], ENDIAN, signed=True)

    elif handles[type] == "IMU Gyro":

        waitTillSetComplete("Gyro", t, node)

        # Write the received payload to the data field

        for i in range(IMU_SAMPLES_PER_PACKET):

            data["Gyro"][0][i] = int.from_bytes(payload[(6 * i) : (6 * i + 2)], ENDIAN, signed=True)

            data["Gyro"][1][i] = int.from_bytes(payload[(6 * i + 2) : (6 * i + 4)], ENDIAN, signed=True)

            data["Gyro"][2][i] = int.from_bytes(payload[(6 * i + 4) : (6 * i + 6)], ENDIAN, signed=True)

    elif handles[type] == "IMU Magnetometer":

        waitTillSetComplete("Mag", t, node)

        # Write the received payload to the data field

        for i in range(IMU_SAMPLES_PER_PACKET):

            data["Mag"][0][i] = int.from_bytes(payload[(6 * i) : (6 * i + 2)], ENDIAN, signed=True)

            data["Mag"][1][i] = int.from_bytes(payload[(6 * i + 2) : (6 * i + 4)], ENDIAN, signed=True)

            data["Mag"][2][i] = int.from_bytes(payload[(6 * i + 4) : (6 * i + 6)], ENDIAN, signed=True)

    # elif handles[type] == "Analog":

    #     waitTillSetComplete("Analog", t)

    #     def valToV(val):

    #         return (val << 6) / 1e6

    #     for i in range(ANALOG_SAMPLES_PER_PACKET):

    #         data["Analog"][0][i] = valToV(int.from_bytes(payload[(4*i):(4*i+2)], ENDIAN, signed=False))

    #         data["Analog"][1][i] = valToV(int.from_bytes(payload[(4*i+2):(4*i+4)], ENDIAN, signed=False))

    # print(data["Analog"][0][0])

    elif handles[type] == "Constat":

        print(f"Node: {node}, Constat packet: %d" % (t / (2 ** 16)))

    elif handles[type] == "Cmd Decline":

        reason_index = int.from_bytes(payload, ENDIAN, signed=False)

        print("Command declined, " + decline_reason[reason_index])

    elif handles[type] == "Sleep State":

        state_index = int.from_bytes(payload, ENDIAN, signed=False)

        print("\n" + sleep_state[state_index] + "\n")

    #     elif handles[type] == "Info Message":

    #         info_index = int.from_bytes(payload[0:1], ENDIAN, signed=False)

    #         print(info_index)

    #         if info_type[info_index] == "Battery info":

    #             voltage = int.from_bytes(payload[1:5], ENDIAN, signed=False)

    #             cycle = int.from_bytes(payload[5:9], ENDIAN, signed=False)

    #             stateOfCharge = int.from_bytes(payload[9:13], ENDIAN, signed=False)

    #             print(f"Node: {node} \n Voltage : {voltage/1000000} v \n Cycle count: {cycle/100} \n State of charge: {stateOfCharge/256}%")

    #######################################################################################

    elif handles[type] == "Remote Info Message":
        info_index = int.from_bytes(payload[0:1], ENDIAN, signed=False)
        print(remote_info[info_index])

        if remote_info[info_index] == "Battery info":
            voltage = int.from_bytes(payload[1:5], ENDIAN, signed=False)
            cycle = int.from_bytes(payload[5:9], ENDIAN, signed=False)
            stateOfCharge = int.from_bytes(payload[9:13], ENDIAN, signed=False)
            print(
                f"Node: {node} \n Voltage : {voltage/1000000} v \n Cycle count: {cycle/100} \n State of charge: {stateOfCharge/256}%"
            )

    elif handles[type] == "Local Info Message":
        info_index = int.from_bytes(payload[0:1], ENDIAN, signed=False)
        print(local_info[info_index])

        if info_index == 130:
            print(int.from_bytes(payload[1:3], ENDIAN, signed=False))

        if local_info[info_index] == "Time synchronization info":
            info_type = int.from_bytes(payload[1:5], ENDIAN, signed=False)

            if info_type == 0:
                print("seq data")
                for i in range(15):
                    seqDataFile.write(str(int.from_bytes(payload[5 + i * 4 : 9 + i * 4], ENDIAN, signed=False)) + ",")
                for i in range(15, 18):
                    seqDataFile.write(str(int.from_bytes(payload[5 + i * 4 : 9 + i * 4], ENDIAN, signed=True)) + ",")
                seqDataFile.close()

            elif info_type == 1:
                print("central data")
                for i in range(60):
                    centralDataFile.write(str(int.from_bytes(payload[5 + i * 4 : 9 + i * 4], ENDIAN, signed=False)) + ",")
                    centralCnt = centralCnt + 1
                    if centralCnt == 187:
                        centralDataFile.close()
                        break

            elif info_type == 2:
                print("perif 0 data")
                for i in range(61):
                    perif0DataFile.write(str(int.from_bytes(payload[5 + i * 4 : 9 + i * 4], ENDIAN, signed=False)) + ",")
                perif0DataFile.close()

            elif info_type == 3:
                print("perif 1 data")
                for i in range(61):
                    perif1DataFile.write(str(int.from_bytes(payload[5 + i * 4 : 9 + i * 4], ENDIAN, signed=False)) + ",")
                perif1DataFile.close()

            elif info_type == 4:
                print("perif 2 data")
                for i in range(61):
                    perif2DataFile.write(str(int.from_bytes(payload[5 + i * 4 : 9 + i * 4], ENDIAN, signed=False)) + ",")
                perif2DataFile.close()

    elif handles[type] == "Timestamp Packet 0":
        print("timestamp packet", int(len / 4), len)
        for i in range(int(len / 4)):
            files["ts" + str(packet_source)].write(
                str(int.from_bytes(payload[i * 4 : (i + 1) * 4], ENDIAN, signed=False)) + ","
            )
        # files["sampleElapse"+str(packet_source)].close()

    elif handles[type] == "Timestamp Packet 1":
        print("time elapse packet", int(len / 4), len)
        for i in range(int(len / 4)):
            files["sampleElapse" + str(packet_source)].write(
                str(int.from_bytes(payload[i * 4 : (i + 1) * 4], ENDIAN, signed=False)) + ","
            )

    # else:
    #     print("unknown handle %d", type)

#######################################################################################


stop = False


def read_packets(ser):
    global stop

    while not stop:
        r = ser.read()
        if len(r) == 0:
            continue

        # print(f"Got packet key {r[0]}, key-PACKET_KEY_OFFSET = {r[0]-PACKET_KEY_OFFSET}")
        if (r[0] == PACKET_KEY) or (0 <= (r[0] - PACKET_KEY_OFFSET) <= 5):
            pack_type = int.from_bytes(ser.read(), ENDIAN)
            length = int.from_bytes(ser.read(), ENDIAN)
            payload = ser.read(length)

            # print(f"{time.time()}:  Got packet type {pack_type}")
            if pack_type == TYPE_HANDLE_DEF:
                parseHandleDef(payload)
                nextPacketStart = 0
                packetCnt = 0
            else:
                parseSensorPacket(pack_type, length, payload, r[0] - PACKET_KEY_OFFSET)

    # Close each node's sensor files on shutdown. (The original loop iterated the top-level
    # `files` dict, whose keys mix node indices with "ts<i>"/"sampleElapse<i>" strings, so
    # `files[i][type]` raised KeyError; the loop order needed inverting.)
    for i in range(NUM_NODES):
        for type in files[i]:
            files[i][type].close()


def writeHeaders():
    for i in range(NUM_NODES):
        files[i]["Mics"].write("Time (s),x,y,z\n")
        files[i]["Baros_P"].write("time,x,y,z\n")
        files[i]["Baros_T"].write("time,x,y,z\n")
        files[i]["Diff_Baros"].write('"time","baro0","baro1","baro2","baro3","baro4"\n')
        files[i]["Acc"].write('"time","x","y","z"\n')
        files[i]["Gyro"].write('"time","x","y","z"\n')
        files[i]["Mag"].write('"time","x","y","z"\n')
        files[i]["Analog"].write("time,x,y,z\n")


folderString = datetime.now().strftime("%Y_%m_%d__%H_%M_%S")
os.mkdir(folderString)

for i in range(NUM_NODES):
    files[i] = {}
    files[i]["Mics"] = open(folderString + "/" + str(i) + "_mics.csv", "w")
    files["ts" + str(i)] = open(folderString + "/ts" + str(i) + ".csv", "w")
    files["sampleElapse" + str(i)] = open(folderString + "/sampleElapse" + str(i) + ".csv", "w")
    files[i]["Baros_P"] = open(folderString + "/" + str(i) + "_baros_p.csv", "w")
    files[i]["Baros_T"] = open(folderString + "/" + str(i) + "_baros_T.csv", "w")
    files[i]["Diff_Baros"] = open(folderString + "/" + str(i) + "_diff_baros.csv", "w")
    files[i]["Acc"] = open(folderString + "/" + str(i) + "_acc.csv", "w")
    files[i]["Gyro"] = open(folderString + "/" + str(i) + "_gyro.csv", "w")
    files[i]["Mag"] = open(folderString + "/" + str(i) + "_mag.csv", "w")
    files[i]["Analog"] = open(folderString + "/" + str(i) + "_analog.csv", "w")

seqDataFile = open(folderString + "/seqData.csv", "w")
centralDataFile = open(folderString + "/centralData.csv", "w")
perif0DataFile = open(folderString + "/perif0Data.csv", "w")
perif1DataFile = open(folderString + "/perif1Data.csv", "w")
perif2DataFile = open(folderString + "/perif2Data.csv", "w")

writeHeaders()

ser = serial.Serial(PORT, BAUDRATE)  # open serial port
# ser.set_buffer_size(rx_size = 100000, tx_size = 1280)

start_new_thread(read_packets, (ser,))

"""
time.sleep(1)
ser.write(("configMics "  + str(MICS_FREQ)  + " " + str(MICS_BM) + "\n").encode('utf_8'))
time.sleep(1)
ser.write(("configBaros " + str(BAROS_FREQ) + " " + str(BAROS_BM) + "\n").encode('utf_8'))
time.sleep(1)
ser.write(("configAccel " + str(ACC_FREQ)   + " " + str(ACC_RANGE) + "\n").encode('utf_8'))
time.sleep(1)
ser.write(("configGyro "  + str(GYRO_FREQ)  + " " + str(GYRO_RANGE) + "\n").encode('utf_8'))
"""


for line in sys.stdin:
    if line == "saveFinish\n":
        ser.write("syncSensorFinish\n".encode("utf_8"))
        print("----command syncSensorFinish issued----")

        # Only close the timestamp files once saving has finished; the original code
        # closed them on every input line, which broke any later timestamp writes.
        for i in range(NUM_NODES):
            files["ts" + str(i)].close()
            files["sampleElapse" + str(i)].close()

    if line == "stop\n":
        stop = True
        break
    else:
        ser.write(line.encode("utf_8"))
        print("----command " + line[:-1] + " issued----")

Consider adding diagnostics like RPi cpu monitors to the gateway code

Feature request

There are some CPU frequency and temperature monitoring scripts on the Raspberry Pi:

cpu_freq.sh

#!/bin/bash
temp=`head -n 1 /sys/class/thermal/thermal_zone0/temp | xargs -I{} awk "BEGIN {printf \"%.2f\n\", {}/1000}"`
echo $((`cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq`/1000)) MHz, $temp degrees

cpu_monitor.sh

while : ; do date; sh cpu_freq.sh; sleep 600; done

Use Case

These will get scrubbed with reinstalls; consider whether that's useful information to have and, if so, add a subprocess that determines it periodically (e.g. hourly) and adds it to the results. A sketch follows.
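A minimal Python sketch of such a monitor, assuming the standard Raspberry Pi sysfs paths and an illustrative hourly interval (the function name and the print target are placeholders, not part of the gateway):

import time
from pathlib import Path

def log_cpu_stats(interval_seconds=3600):
    # Sample the Pi's CPU temperature (millidegrees) and clock frequency (kHz) from sysfs.
    while True:
        temp = int(Path("/sys/class/thermal/thermal_zone0/temp").read_text()) / 1000
        freq = int(Path("/sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq").read_text()) / 1000
        print(f"{freq:.0f} MHz, {temp:.2f} degrees")
        time.sleep(interval_seconds)

This could run in a daemon thread or subprocess alongside the packet reader and write into the same results stream.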

Code Review

BACKGROUND

What have we done:

  • Took some rudimentary python code from a hardware engineer (Rafael) and wrapped it in a CLI.
    • The rudimentary python reader persisted data line by line to a CSV.
  • Added a persistence layer to save data read from the port to disc.
    • Removed the CSV writer, replacing it with a JSON writer that dumps data in big chunks (10 minutes' worth of data at a time).
  • Added an uploader to push data (in 'windows' of 10 minutes) to a cloud endpoint.
  • We then tested the process (persisting to disc, not uploading) running on a laptop connected to the receiver hardware.
    • We got a report from Rafael of a problem (our first view of what we have been calling an "overflow error") - issue #19.
    • Shortly after, Rafael filed issue #20, which seems similar or to have an identical root cause.
    • Rafael was running these on a Windows laptop.

    Note: here we change the buffer size. Rafael had done this in his original script - we didn't know why. We added a condition so that the adjustment ran only on Windows, because you can't change the buffer size on a normal linux kernel.
    The first report of the "overflow error" came from a Windows machine with a modified buffer size. @time-trader has seen the same error on a linux machine (@time-trader further investigated in #6).
    Tom's conclusion: our "overflow" issue isn't purely due to changing the buffer size.
    Yuriy's explanation: the buffer size is the number of bytes the port can hold. There is a sequence of operations:

       1. read bytes from the port (one by one) until a packet is formed (we tell it how many bytes to read)
       2. parse the packet
       3. accumulate the parsed data in memory
       4. if time is more than 10 minutes (window full):
             - upload / persist parsed data to disc and empty memory
       5. If the time taken to execute steps 2-4 is greater than the time taken for the buffer to fill, given the bandwidth of the input data, then our overflow error occurs, **which manifests in missed packets**. Question: will it result in just missed packets, or corrupt ones? Almost certainly the last packet is corrupt.
             - so whether this condition occurs is a function of a) the speed of the CPU / efficiency of the code in steps 1-4, b) the bandwidth of influx data and c) the size of the buffer
       6. GOTO 1
  • We can choose to run different sensors (e.g. pressure sensors or mics). If we run mics (high bandwidth) then the overflow problem happens IMMEDIATELY. If we run pressure sensors (lower bandwidth) it won't happen until we persist the data.
  • So we were trying to multithread this, by putting the reading (step 1) into one thread and all the others (2-4) into another. What happened was:
    • we used pyserial's serial.threaded module on the port reader in this PR - it turns out this doesn't work as expected (i.e. the threaded reader doesn't thread!!!), but we discovered a faster way of executing step 1, which is to read all the waiting bytes at once.
    • Corollary: speeding up step 1 makes the overflow condition in step 5 less likely to occur, although it doesn't necessarily solve it.
  • Next, we will get rid of the (apparently useless) serial.threaded module and just implement that quicker read ourselves (see the sketch after this list). That'll help, but still not solve the core of the problem.
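For reference, a minimal sketch of that split, assuming pyserial and the standard library; the names, port and baudrate are illustrative, not the gateway's actual API:

import queue
import threading

import serial

raw_queue = queue.Queue()

def read_bytes(ser):
    # Step 1 only: drain everything waiting on the port in a single read.
    while True:
        raw_queue.put(ser.read(ser.in_waiting or 1))  # block for one byte if the buffer is empty

def parse_and_persist():
    # Steps 2-4: parse, accumulate and persist without ever blocking the reader.
    while True:
        raw_bytes = raw_queue.get()
        ...  # form packets, parse them, persist/upload when the window is full

ser = serial.Serial("/dev/ttyUSB0", 2300000)
threading.Thread(target=read_bytes, args=(ser,), daemon=True).start()
threading.Thread(target=parse_and_persist, daemon=True).start()

On the GIL question below: pyserial's blocking reads release the GIL while waiting on I/O, so a dedicated reader thread does genuinely overlap with the parsing work.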

THE QUESTION

How should we deal with this?

  • We plan to speed up step 1 by reading the entire port at once, rather than byte by byte.
  • Assuming Step 1 is fast enough that it's not the fundamental constraint...
    • Where should we multithread this (e.g. separate thread for 2-4, or separate thread for 2-3 and also 4?)
    • Will threading even help or will the GIL kill us?
    • Are there common patterns that we're missing here? What's "normal" for us to do?

OTHER NOTES

Approximate bandwidth of data coming in

10 mics, sampling at 15,625 Hz, each sample 3 bytes; we sample for 10 minutes, then it takes about 50 minutes to send over bluetooth (the transfer is bluetooth-capacity constrained).

10 * 3 * 15625 * (10/50) = 93750 bytes/s data inbound to the serial port

On f.read()-like operations efficiency

Notes for Marcus

Marcus' question about file pointers (looking at serial.read): it would be useful to know whether reading bytes one at a time from a file pointer (e.g. n calls to f.read(1)) is significantly slower than a single f.read(n).

# SLOW: the instructions aren't necessarily executed sequentially, so the CPU
# can't optimise the memory/disk access pattern
for i in range(n):
    f.read(1)

# FAST:
f.read(n)

# STILL PRETTY FAST:
f.read(1)
f.read(1)
f.read(n - 2)

# EFFICIENCY:
# Where p (total number of bytes waiting in the buffer) >>>> n (number of bytes in a packet),
# time(f.read(p)) / (p/n) will be quicker than (p/n) executions of f.read(n).



Check packet length if-statement

Bug report

In PacketReader.parse_packet(), we have a check for packet length which is:

if len(packet) == 244:  # If the full data payload is received, proceed parsing it
    timestamp = int.from_bytes(packet[240:244], self.config.endian, signed=False) / (2 ** 16)
    # .... and other stuff to process the packet

if self.handles[packet_type] in [
    "Mic 1",
    "Cmd Decline",
    "Sleep State",
    "Info Message",
]:
    self._parse_info_packet(self.handles[packet_type], packet, previous_timestamp)

But, in a discussion with @tommasopolonelli yesterday whilst looking at the C code, I gathered that the packet length transmitted may not be the full 244 bytes... and yet this seems to work!

We should close the loop to check whether I understood correctly. If I did, is ["Mic 1", "Cmd Decline", "Sleep State", "Info Message"] the exclusive list of possible non-244-byte packets? Because if so, this works fine; otherwise we must simply not be processing some packets.

In the case that we're not processing some packets, we should put the packet length onto the processing queue and check the received packet against it (see the sketch below).
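A sketch of that check, with names assumed rather than taken from the real PacketReader:

import logging

logger = logging.getLogger(__name__)

def packet_is_complete(item):
    # `item` is what the reader would place on the processing queue:
    # {"packet_type": ..., "length": <transmitted length byte>, "packet": <raw bytes>}
    if len(item["packet"]) != item["length"]:
        logger.warning(
            "Expected %d bytes for packet type %s but received %d - not parsing this packet.",
            item["length"],
            item["packet_type"],
            len(item["packet"]),
        )
        return False
    return True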

Allow json5 routines and configurations

Feature request

It would be useful to be able to comment the configuration files. Is there any easy way of accepting JSON5, with its built-in comments?

Example comments in a routine.json file:

{
  "commands": [
    ["startSync", 0],
    ["selMsgDest 3", 0.1],
    ["getBattery", 0.2],
    ["startBaros", 0.3],
    ["selMsgDest 4", 1.1],
    ["getBattery", 1.2],
    ["startBaros", 1.3],
    ["selMsgDest 5", 2.1],
    ["getBattery", 2.2],
    ["startBaros", 2.3],
    ["selMsgDest 3", 60.1],
    ["stopBaros", 60.2],
    ["selMsgDest 4", 61.1],
    ["stopBaros", 61.2],
    ["selMsgDest 5", 62.1],
    ["stopBaros", 62.2],
    ["stopSync", 65],
    ["enterHighSpeed", 65.1],
    ["selDevice 3", 70], // (wait until the connection parameter is updated, you can tell that with the constatpacket number)
    ["readSens", 75], // allow 1 minute download per minute collection
    ["selDevice 4", 135], // (wait until the connection parameter is updated, you can tell that with the constatpacket number)
    ["readSens", 140],
    ["selDevice 5", 200], // (wait until the connection parameter is updated, you can tell that with the constatpacket number)
    ["readSens", 205],
    ["exitHighSpeed", 265],
    ["selMsgDest 3", 270.0], // (now we reconnect all the nodes and put them into sleep)
    ["sleep", 270.1],
    ["selMsgDest 4", 271.0],
    ["sleep", 271.1],
    ["selMsgDest 5", 272.0],
    ["sleep", 272.1],
    ["selMsgDest 3", 330.0], // (wake up nodes and allow them to re-establish connection parameters)
    ["wakeUp", 330.1],
    ["selMsgDest 4", 331.0],
    ["wakeUp", 331.1],
    ["selMsgDest 5", 332.0],
    ["wakeUp", 332.1]
  ],
  "period": 360
}
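If adding a dependency is acceptable, the json5 package on PyPI exposes a json-compatible interface, so a sketch of the loader change could be as small as:

import json5  # pip install json5 - a drop-in replacement for the stdlib json module

with open("routine.json") as f:
    routine = json5.load(f)  # // comments and trailing commas are accepted

print(routine["period"])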

Installation on Raspberry Pi OS

The following packages are prerequisites for the gateway on Raspbian GNU/Linux 10 (buster):

RUST:

Using sudo apt-get install rustc does not cut it; the most recent build is needed:
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh

OpenSSL, HDF5, ATLAS:

sudo apt-get install libssl-dev libhdf5-dev libatlas-base-dev

This might change for the newly released Raspbian based on Debian 11 (Bullseye).

Gateway sensor scheduling routines

Feature request

Use Case

The goal is to start the gateway as a daemonised process and give the user the ability to schedule the serial-port commands that control the sensors.

A possible routine:

  1. getBattery
  2. startBaros, startIMU, startMics
  3. stopBaros, stopIMU after 10 mins -> write json file -> upload to cloud
  4. getBattery
  5. stopMics -> wait a (very long) time for the data to arrive via BT (around 1 hour) -> write Mic data file -> upload to cloud (will be heavy)
  6. getBattery
  7. startDiffBaros, startIMU, startMics
  8. stopDiffBaros, stopIMU after 10 mins -> write json file -> upload to cloud
  9. getBattery
  10. stopMics -> wait a (very long) time for the data to arrive via BT (around 1 hour) -> write Mic data file -> upload to cloud

and repeat from 1.

Current state

The commands can be issued manually in the interactive mode only.
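One possible shape for a routine runner, sketched with the standard library only; the class, the (command, delay) pairs and the serial handle are assumptions rather than the existing CLI's API:

import threading

class Routine:
    def __init__(self, commands, period, serial_port):
        # `commands` is a list of (command_string, delay_in_seconds) pairs, as in a routine file.
        self.commands = commands
        self.period = period
        self.serial_port = serial_port

    def run(self):
        # Issue each command to the serial port at its scheduled delay, then re-arm for the next period.
        for command, delay in self.commands:
            threading.Timer(delay, self.serial_port.write, args=((command + "\n").encode("utf_8"),)).start()
        threading.Timer(self.period, self.run).start()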

Edge case when handling KeyboardInterrupt module shutdown

Bug report

What is the current behavior?

Pressing ctrl+c on the raspberry pi during acquisition usually results in a safe shutdown.

However, I noticed that sometimes the following traceback appears in the middle of the shutdown:

 $ gateway start
/home/pi/.cache/pypoetry/virtualenvs/data-gateway-lz7aa1yn-py3.7/lib/python3.7/site-packages/google_crc32c/__init__.py:29: RuntimeWarning: As the c extension couldn't be imported, `google-crc32c` is using a pure python implementation that is significantly slower. If possible, please configure a c build environment and compile the extension
  warnings.warn(_SLOW_CRC32C_WARNING, RuntimeWarning)
[2022-06-09 14:29:20,591 | INFO | multiprocessing | MainProcess] No configuration file provided - using default configuration.
[2022-06-09 14:29:20,693 | WARNING | multiprocessing | MainProcess] No routine was provided and interactive mode is off - no commands will be sent to the sensors in this session.
[2022-06-09 14:29:20,705 | INFO | multiprocessing | MainProcess] allocating a new mmap of length 4096
[2022-06-09 14:29:20,718 | INFO | multiprocessing | Reader] child process calling self.run()
[2022-06-09 14:29:20,719 | INFO | multiprocessing | Reader] Packet reader process started.
[2022-06-09 14:29:20,725 | INFO | multiprocessing | Parser] child process calling self.run()
[2022-06-09 14:29:20,726 | INFO | multiprocessing | Parser] Packet parser process started.
[2022-06-09 14:29:20,751 | INFO | multiprocessing | Parser] Windows will be saved to 'data_gateway/661014/.backup' at intervals of 600.0 seconds.
[2022-06-09 14:29:20,751 | INFO | multiprocessing | Parser] Windows will be uploaded to 'data_gateway/661014' at intervals of 600.0 seconds.
^C[2022-06-09 14:29:53,966 | INFO | multiprocessing | Reader] Stopping gateway.
[2022-06-09 14:29:53,967 | INFO | multiprocessing | Reader] process shutting down
[2022-06-09 14:29:53,968 | INFO | multiprocessing | Reader] process exiting with exitcode 0
[2022-06-09 14:29:53,967 | INFO | multiprocessing | MainProcess] Sent 'stopBaros' command to sensors.
[2022-06-09 14:29:53,967 | ERROR | multiprocessing | Parser] expected str, bytes or os.PathLike object, not NoneType
Traceback (most recent call last):
  File "/home/pi/aerosense/install/data-gateway/data_gateway/packet_reader.py", line 169, in parse_packets
    packet_type, packet = packet_queue.get(timeout=timeout).values()
  File "/usr/lib/python3.7/multiprocessing/queues.py", line 104, in get
    if not self._poll(timeout):
  File "/usr/lib/python3.7/multiprocessing/connection.py", line 257, in poll
    return self._poll(timeout)
  File "/usr/lib/python3.7/multiprocessing/connection.py", line 414, in _poll
    r = wait([self], timeout)
  File "/usr/lib/python3.7/multiprocessing/connection.py", line 920, in wait
    ready = selector.select(timeout)
  File "/usr/lib/python3.7/selectors.py", line 415, in select
    fd_event_list = self._selector.poll(timeout)
KeyboardInterrupt

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/pi/aerosense/install/data-gateway/data_gateway/persistence.py", line 255, in _persist_window
    cloud_path=storage.path.generate_gs_path(self.bucket_name, self._generate_window_path()),
  File "/home/pi/.cache/pypoetry/virtualenvs/data-gateway-lz7aa1yn-py3.7/lib/python3.7/site-packages/octue/cloud/storage/path.py", line 63, in generate_gs_path
    return CLOUD_STORAGE_PROTOCOL + join(bucket_name, paths[0].lstrip("/"), *paths[1:])
  File "/home/pi/.cache/pypoetry/virtualenvs/data-gateway-lz7aa1yn-py3.7/lib/python3.7/site-packages/octue/cloud/storage/path.py", line 45, in join
    path = os.path.normpath(os.path.join(*paths)).replace("\\", "/")
  File "/usr/lib/python3.7/posixpath.py", line 80, in join
    a = os.fspath(a)
TypeError: expected str, bytes or os.PathLike object, not NoneType
[2022-06-09 14:29:54,005 | WARNING | multiprocessing | Parser] Upload of window may have failed - writing to disk at 'data_gateway/661014/.backup/window-0.json'.
[2022-06-09 14:29:54,009 | INFO | multiprocessing | Parser] Window 0 written to disk.
[2022-06-09 14:29:54,011 | INFO | multiprocessing | Parser] Stopping gateway.
[2022-06-09 14:29:54,012 | INFO | multiprocessing | Parser] process shutting down
[2022-06-09 14:29:54,014 | INFO | multiprocessing | Parser] process exiting with exitcode 0
[2022-06-09 14:29:58,977 | INFO | multiprocessing | MainProcess] Sent 'stopDiffBaros' command to sensors.
[2022-06-09 14:30:03,983 | INFO | multiprocessing | MainProcess] Sent 'stopIMU' command to sensors.
[2022-06-09 14:30:08,989 | INFO | multiprocessing | MainProcess] Sent 'stopMics' command to sensors.

What is the expected behavior?

A KeyboardInterrupt shuts the gateway down with no exception.
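One common pattern that might remove this race, assuming the Reader and Parser are multiprocessing children: have the children ignore SIGINT so that only the main process sees ctrl+c and coordinates shutdown through the existing stop mechanism. A self-contained sketch:

import signal
import time
from multiprocessing import Event, Process

def child_target(stop_event):
    # Ignore ctrl+c in the child; the main process asks it to stop instead.
    signal.signal(signal.SIGINT, signal.SIG_IGN)
    while not stop_event.is_set():
        time.sleep(0.1)  # placeholder for read/parse work

if __name__ == "__main__":
    stop_event = Event()
    child = Process(target=child_target, args=(stop_event,))
    child.start()
    try:
        while True:
            time.sleep(1)
    except KeyboardInterrupt:
        stop_event.set()
        child.join()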


Latest Raspberry Pi OS needed to use python 3.9.x

Bug report

I updated the Raspberry Pi to use python 3.9.12 using the following instructions (from here):

sudo apt update
sudo apt upgrade
cd ~
mkdir -p python-install && cd python-install
wget https://www.python.org/ftp/python/3.9.12/Python-3.9.12.tgz
tar -zxvf Python-3.9.12.tgz
cd Python-3.9.12
./configure --enable-optimizations
sudo make altinstall
cd /usr/bin
sudo rm python
sudo rm python3
sudo ln -s /usr/local/bin/python3.9 python
sudo ln -s /usr/local/bin/python3.9 python3
cd ~
python --version

Unfortunately when I then install and run the gateway I get the error described here, essentially:

libm.so.6: version `GLIBC_2.29' not found

This forum discusses the problem, concluding that a newer version of the OS is needed in order to run Python 3.9.

Current Workaround

I'm downgrading to python 3.8.13, rather than building my own RPi image!

Cloud Function Error: google.api_core.exceptions.Forbidden: 403 GET, when trying to store Mic data

Bug report

What is the current behavior?

When handling the window with Mic sensor data, the following error occurs in window_handler:

Traceback (most recent call last):
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/flask/app.py", line 2073, in wsgi_app
    response = self.full_dispatch_request()
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/flask/app.py", line 1518, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/flask/app.py", line 1516, in full_dispatch_request
    rv = self.dispatch_request()
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/flask/app.py", line 1502, in dispatch_request
    return self.ensure_sync(self.view_functions[rule.endpoint])(**req.view_args)
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/functions_framework/__init__.py", line 171, in view_func
    function(data, context)
  File "/workspace/main.py", line 41, in upload_window
    window_handler.persist_window(unix_timestamped_window["sensor_data"], window_metadata)
  File "/workspace/window_handler.py", line 92, in persist_window
    self._store_microphone_data(
  File "/workspace/window_handler.py", line 134, in _store_microphone_data
    datafile = Datafile(
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/octue/resources/datafile.py", line 144, in __init__
    self._use_cloud_metadata(**initialisation_parameters)
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/octue/resources/datafile.py", line 495, in _use_cloud_metadata
    self.get_cloud_metadata()
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/octue/resources/datafile.py", line 249, in get_cloud_metadata
    cloud_metadata = GoogleCloudStorageClient(self.project_name).get_metadata(cloud_path=self.cloud_path)
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/octue/cloud/storage/client.py", line 117, in get_metadata
    bucket = self.client.get_bucket(bucket_or_name=bucket_name)
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/google/cloud/storage/client.py", line 780, in get_bucket
    bucket.reload(
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/google/cloud/storage/bucket.py", line 1029, in reload
    super(Bucket, self).reload(
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/google/cloud/storage/_helpers.py", line 232, in reload
    api_response = client._get_resource(
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/google/cloud/storage/client.py", line 364, in _get_resource
    return self._connection.api_request(
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/google/cloud/storage/_http.py", line 80, in api_request
    return call()
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/google/api_core/retry.py", line 283, in retry_wrapped_func
    return retry_target(
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/google/api_core/retry.py", line 190, in retry_target
    return target()
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/google/cloud/_http/__init__.py", line 480, in api_request
    raise exceptions.from_http_response(response)
google.api_core.exceptions.Forbidden: 403 GET https://storage.googleapis.com/storage/v1/b/data-gateway-processed-data?projection=noAcl&prettyPrint=false: [email protected] does not have storage.buckets.get access to the Google Cloud Storage bucket. 

This looks like a permissions error: the cloud function's service account does not have storage.buckets.get on the data-gateway-processed-data bucket, so granting it a role that includes that permission (e.g. roles/storage.legacyBucketReader, or a broader storage role) should fix it.

KeyError in file_handler, while running ingress-eu cloud function

Bug report

There seems to be a problem with the cloud function that gets the window from the ingress bucket.

What is the current behavior?

{
  "textPayload": "Traceback (most recent call last):\n  File \"/layers/google.python.pip/pip/lib/python3.8/site-packages/flask/app.py\", line 2073, in wsgi_app\n    response = self.full_dispatch_request()\n  File \"/layers/google.python.pip/pip/lib/python3.8/site-packages/flask/app.py\", line 1518, in full_dispatch_request\n    rv = self.handle_user_exception(e)\n  File \"/layers/google.python.pip/pip/lib/python3.8/site-packages/flask/app.py\", line 1516, in full_dispatch_request\n    rv = self.dispatch_request()\n  File \"/layers/google.python.pip/pip/lib/python3.8/site-packages/flask/app.py\", line 1502, in dispatch_request\n    return self.ensure_sync(self.view_functions[rule.endpoint])(**req.view_args)\n  File \"/layers/google.python.pip/pip/lib/python3.8/site-packages/functions_framework/__init__.py\", line 171, in view_func\n    function(data, context)\n  File \"/workspace/main.py\", line 38, in clean_and_upload_window\n    window, window_metadata = window_handler.get_window()\n  File \"/workspace/window_handler.py\", line 73, in get_window\n    window_metadata = cloud_metadata[\"custom_metadata\"][\"data_gateway__configuration\"]\nKeyError: 'data_gateway__configuration'",
  "insertId": "000001-d74b9c37-a87c-4ad4-8879-37c0b2397de0",
  "resource": {
    "type": "cloud_function",
    "labels": {
      "project_id": "aerosense-twined",
      "function_name": "ingress-eu",
      "region": "europe-west6"
    }
  },
  "timestamp": "2021-12-11T12:54:03.594Z",
  "severity": "ERROR",
  "labels": {
    "execution_id": "22ajl43qdhwi"
  },
  "logName": "projects/aerosense-twined/logs/cloudfunctions.googleapis.com%2Fcloud-functions",
  "trace": "projects/aerosense-twined/traces/62cd5c882711eaae92ff91fb8fd7a643",
  "receiveTimestamp": "2021-12-11T12:54:04.565208817Z"
}

Traceback:

Traceback (most recent call last):
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/flask/app.py", line 2073, in wsgi_app
    response = self.full_dispatch_request()
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/flask/app.py", line 1518, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/flask/app.py", line 1516, in full_dispatch_request
    rv = self.dispatch_request()
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/flask/app.py", line 1502, in dispatch_request
    return self.ensure_sync(self.view_functions[rule.endpoint])(**req.view_args)
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/functions_framework/__init__.py", line 171, in view_func
    function(data, context)
  File "/workspace/main.py", line 38, in clean_and_upload_window
    window, window_metadata = window_handler.get_window()
  File "/workspace/window_handler.py", line 73, in get_window
    window_metadata = cloud_metadata["custom_metadata"]["data_gateway__configuration"]
KeyError: 'data_gateway__configuration'
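A defensive version of the failing lookup (a sketch; only the field names are taken from the traceback):

import logging

logger = logging.getLogger(__name__)

def get_window_metadata(cloud_metadata):
    metadata = (cloud_metadata.get("custom_metadata") or {}).get("data_gateway__configuration")
    if metadata is None:
        # The window was uploaded without the expected custom metadata - log and skip rather than crash.
        logger.warning("Window has no 'data_gateway__configuration' metadata - was it uploaded by the gateway?")
    return metadata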

Your environment

  • Library Version: 0.7.4 #8

Microphone data is not stored in data-gateway-processed-data bucket.

Bug report

What is the current behavior?

Microphone data still doesn't reach the data-gateway-processed-data bucket, possibly due to #45?

This causes a cloud function error, as it cannot locate the microphone files:

Traceback (most recent call last):
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/flask/app.py", line 2073, in wsgi_app
    response = self.full_dispatch_request()
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/flask/app.py", line 1518, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/flask/app.py", line 1516, in full_dispatch_request
    rv = self.dispatch_request()
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/flask/app.py", line 1502, in dispatch_request
    return self.ensure_sync(self.view_functions[rule.endpoint])(**req.view_args)
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/functions_framework/__init__.py", line 171, in view_func
    function(data, context)
  File "/workspace/main.py", line 41, in upload_window
    window_handler.persist_window(unix_timestamped_window["sensor_data"], window_metadata)
  File "/workspace/window_handler.py", line 92, in persist_window
    self._store_microphone_data(
  File "/workspace/window_handler.py", line 141, in _store_microphone_data
    with datafile.open("w") as f:
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/octue/resources/datafile.py", line 688, in __enter__
    os.makedirs(os.path.split(self.datafile.local_path)[0], exist_ok=True)
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/octue/resources/datafile.py", line 454, in local_path
    return self.download()
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/octue/resources/datafile.py", line 440, in download
    raise e
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/octue/resources/datafile.py", line 433, in download
    GoogleCloudStorageClient(project_name=self.project_name).download_to_file(
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/octue/cloud/storage/client.py", line 173, in download_to_file
    blob.download_to_filename(local_path, timeout=timeout)
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/google/cloud/storage/blob.py", line 1281, in download_to_filename
    client.download_blob_to_file(
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/google/cloud/storage/client.py", line 1152, in download_blob_to_file
    _raise_from_invalid_response(exc)
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/google/cloud/storage/blob.py", line 4466, in _raise_from_invalid_response
    raise exceptions.from_http_status(response.status_code, message, response=response)
google.api_core.exceptions.NotFound: 404 GET https://storage.googleapis.com/download/storage/v1/b/data-gateway-processed-data/o/microphone%2F790791%2Fwindow-343.hdf5?alt=media: No such object: data-gateway-processed-data/microphone/790791/window-343.hdf5: ('Request failed with status code', 404, 'Expected one of', <HTTPStatus.OK: 200>, <HTTPStatus.PARTIAL_CONTENT: 206>)

This surely goes hand in hand with another new error:

InvalidResponse: ('Request failed with status code', 404, 'Expected one of', <HTTPStatus.OK: 200>, <HTTPStatus.PARTIAL_CONTENT: 206>)

 Traceback (most recent call last):
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/google/cloud/storage/client.py", line 1139, in download_blob_to_file
    blob_or_uri._do_download(
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/google/cloud/storage/blob.py", line 1001, in _do_download
    response = download.consume(transport, timeout=timeout)
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/google/resumable_media/requests/download.py", line 214, in consume
    return _request_helpers.wait_and_retry(
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/google/resumable_media/requests/_request_helpers.py", line 147, in wait_and_retry
    response = func()
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/google/resumable_media/requests/download.py", line 207, in retriable_request
    self._process_response(result)
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/google/resumable_media/_download.py", line 188, in _process_response
    _helpers.require_status_code(
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/google/resumable_media/_helpers.py", line 105, in require_status_code
    raise common.InvalidResponse(
google.resumable_media.common.InvalidResponse: ('Request failed with status code', 404, 'Expected one of', <HTTPStatus.OK: 200>, <HTTPStatus.PARTIAL_CONTENT: 206>) 

Your environment

  • Gateway Version: 0.11.5
  • Platform Linux


pyenv local 3.7.0

On Windows, I get a package compatibility issue using the suggested Windows package.

Weird timestamp behaviour when writing output file

Reproduced from gitlab - filed by Rafael Fischer on Jun 21, 2021 4:05pm GMT+0100

Bug report

What is the current behavior?

During the T1 Aventa test we did today, we sampled once for over 10 minutes; exactly after 10 minutes, when the first data file was written, the following appeared on the CLI:

	Constat packet: 2265
	Constat packet: 2266
	Constat packet: 2267
	Constat packet: 2268
	Lost set of Mics packets: 4345173.096801933 ms gap
	Lost set of Mics packets: -4317922.26965332 ms gap
	Lost set of Baros_P packets: 27359.23461917446 ms gap
	Lost set of Baros_T packets: 27359.23461917446 ms gap
	Lost Acc packet: 27708.526611328125 ms gap
	Lost Gyro packet: 27708.526611328125 ms gap
	Constat packet: 2297
	Lost set of Constat packets: 28075.860595725317 ms gap
	Constat packet: 2299
	Constat packet: 2300
	Lost Mag packet: 28500.274658203125 ms gap
	Constat packet: 2301
	Constat packet: 2302
	Constat packet: 2303
	Lost set of Analog Vbat packets: 23995.330810546875 ms gap
	Constat packet: 2304
	Constat packet: 2305
	Constat packet: 2306

Note that the Constat messages were added temporarily by me to see when the connection is alive. They are sent on every constat packet that arrives together with its timestamp in seconds.

The packet losses with the huge gaps must have something to do with the file being written / the active file being changed.

You can find the produced files under [https://ostch.sharepoint.com/:f:/r/teams/TS-Aerosense/Freigegebene%20Dokumente/General/2_Implementation/21_WorkPackages/WP2_ResearchAndDevelopment/T2.3_Electronics/BLE%20performance%20tests/Measurements%20June%2021/424536?csf=1&web=1&e=Rvai4d]

Your environment

  • Branch: feature/extended_constat
  • Platform Windows

Merge basestation readme file to docs

Feature request

On the rpi, @time-trader made a readme (see below) for aspects of how to connect to different things. I'll be wiping this soon (and anyway, raspberry pis shouldn't be considered permanent storage; their SD cards are notoriously unreliable).

Here's the text; it should be merged into the relevant bits of the docs:

To connect to the RaspPi, contact Juri to get VPN keys:
Option 1: Husarnet: ssh pi@base-station
Option 2: WireGuard VPN: ssh [email protected]

Pass: REDACTED

To use the gateway:

1. Activate a virtual environment:
   source /home/pi/flasks/gateway/bin/activate

2. Use the gateway command:
   gateway

2.1 For help:
   gateway --help
   gateway start --help
   gateway create-installation --help

2.2 For the gateway to store data into the BigQuery SQL database, make sure the "installation" exists. The installation is defined in the configuration json file:

....
"installation_data": {
"installation_reference": "ost-wt-evaluation",
"longitude": "001",
"latitude": "001",
"turbine_id": "OST_WIND",
"blade_id": "BAV01",
"hardware_version": "2.1",
"sensor_coordinates": {.....}
}
....

If one wishes to make a new installation:
gateway create-installation --config-file="/home/pi/data_gateway_configurations/.json"

2.3 EXAMPLES:

  1. To start the gateway with upload to cloud and a pre-defined routine:
    gateway start --gcp-project-name=aerosense-twined --gcp-bucket-name=aerosense-ingress-eu --config-file="/home/pi/data_gateway_configurations/configuration.json" --routine-file="/home/pi/data_gateway_configurations/routine-pret4.json" --window-size=60

  2. If you wish to run in interactive mode (no upload to cloud):
    gateway start --config-file="/home/pi/data_gateway_configurations/configuration.json" --window-size=60 -l -nc -i

google.api_core.exceptions.GoogleAPICallError: 413 POST in the cloud Function

Bug report

What is the current behavior?

Executing the cloud function with windows of window_size=60 (seconds) containing Diff_Baros or Baros sensor data fails with the following error:

  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/flask/app.py", line 2073, in wsgi_app
    response = self.full_dispatch_request()
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/flask/app.py", line 1518, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/flask/app.py", line 1516, in full_dispatch_request
    rv = self.dispatch_request()
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/flask/app.py", line 1502, in dispatch_request
    return self.ensure_sync(self.view_functions[rule.endpoint])(**req.view_args)
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/functions_framework/__init__.py", line 171, in view_func
    function(data, context)
  File "/workspace/main.py", line 41, in upload_window
    window_handler.persist_window(unix_timestamped_window["sensor_data"], window_metadata)
  File "/workspace/window_handler.py", line 99, in persist_window
    self.dataset.add_sensor_data(
  File "/workspace/big_query.py", line 80, in add_sensor_data
    errors = self.client.insert_rows(table=self.client.get_table(self.table_names["sensor_data"]), rows=rows)
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/google/cloud/bigquery/client.py", line 3411, in insert_rows
    return self.insert_rows_json(table, json_rows, **kwargs)
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/google/cloud/bigquery/client.py", line 3590, in insert_rows_json
    response = self._call_api(
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/google/cloud/bigquery/client.py", line 760, in _call_api
    return call()
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/google/api_core/retry.py", line 283, in retry_wrapped_func
    return retry_target(
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/google/api_core/retry.py", line 190, in retry_target
    return target()
  File "/layers/google.python.pip/pip/lib/python3.8/site-packages/google/cloud/_http/__init__.py", line 480, in api_request
    raise exceptions.from_http_response(response)
google.api_core.exceptions.GoogleAPICallError: 413 POST https://bigquery.googleapis.com/bigquery/v2/projects/aerosense-twined/datasets/greta/tables/sensor_data/insertAll?prettyPrint=false: <!DOCTYPE html>

This might be related to the JSON file being too big? However, checking the ingress bucket shows that all files are below 20 MB. Note that the 413 comes from BigQuery's insertAll endpoint, whose per-request payload limit for streaming inserts (around 10 MB) is separate from the size of the window file in the bucket.
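If the payload limit is the culprit, batching the rows should avoid it. A sketch against the insert_rows call in the traceback (the batch size is a guess to tune empirically):

def insert_in_batches(client, table, rows, batch_size=500):
    # Keep each streaming-insert request well under BigQuery's request size limit.
    for start in range(0, len(rows), batch_size):
        errors = client.insert_rows(table=table, rows=rows[start : start + batch_size])
        if errors:
            raise ValueError(f"Errors encountered while inserting rows: {errors}")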

Add support for multiple sensor nodes

Feature request

Add the possibility to operate multiple sensor nodes with a single base station.

Open questions:

Command in multi-device context

@JuliusPlindastus @tommasopolonelli @hmllr @time-trader
Since we are extending the firmware to support multiple peripheral devices, it is a good time to discuss how you would like to issue commands in the multi-device context. To finalize the format of the command, the following questions need to be answered:

  1. Is it desired to send commands to either one of the peripheral devices or all of them, or would it be better if the destination could be any subset of the available devices?
  2. Is it better to first declare the peripheral devices we want to communicate with using a separate command, and then issue commands like we used to:
    - selConn 3 (this means we want to communicate with devices 1 and 2)
    - startMics
    - stopMics
    or to add an extra argument to each command:
    - startMics 3
    - stopMics 3

Configuration file

The configuration file contains sensor properties as well as installation metadata. We should probably allow for the possibility of different sampling frequencies on different nodes. Should we keep a single configuration file?

Integer keys in the configuration file

Bug report

A JSON key is always a string, so the current configuration dict should be refactored so that we do not use integers as keys, or we should make sure to convert keys to int when reading the configuration JSON.

What is the current behavior?

Starting the gateway with the --config-file flag throws UnknownPacketTypeError:

[2021-12-06 14:13:58,234 | INFO | data_gateway.cli] Loaded configuration file from '/home/pi/data_gateway/configuration.json'.
[2021-12-06 14:13:58,235 | INFO | data_gateway.cli] Starting packet reader in non-interactive mode - files will be uploaded to cloud storage at intervals of 600.0 seconds.
[2021-12-06 14:13:58,236 | WARNING | data_gateway.packet_reader] Timestamp synchronization unavailable with current hardware; defaulting to using system clock.
[2021-12-06 14:13:58,977 | ERROR | data_gateway.packet_reader] Received packet with unknown type: 34
[2021-12-06 14:13:59,247 | ERROR | data_gateway.persistence] Object of type Configuration is not JSON serializable
Traceback (most recent call last):
  File "/home/pi/aerosense/install/data-gateway/data_gateway/packet_reader.py", line 132, in read_packets
    packet_type=packet_type, payload=payload, data=data, previous_timestamp=previous_timestamp
  File "/home/pi/aerosense/install/data-gateway/data_gateway/packet_reader.py", line 200, in _parse_payload
    raise exceptions.UnknownPacketTypeError("Received packet with unknown type: {}".format(packet_type))
data_gateway.exceptions.UnknownPacketTypeError: Received packet with unknown type: 34

What is the expected behavior?

34 is a defined handle in valid_configuration.json but, as mentioned above, the problem is that it is defined as a string. A conversion sketch follows.
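A sketch of the conversion-on-read option (assuming the handles mapping is the dict with integer-like keys; the actual field name in the configuration may differ):

import json

with open("configuration.json") as f:
    configuration = json.load(f)

# JSON object keys are always strings, so turn "34" back into 34 before packet-type lookups.
configuration["handles"] = {
    int(packet_type): handle for packet_type, handle in configuration["handles"].items()
}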

Your environment

  • Library Version: 0.7.1
  • Platform: Raspbian GNU/Linux 10 (buster)


Possibility to add barometer and temperature sensors in the base station

Goal: improve calibration of barometers on the sensor node

Barometers on the sensor node measure the atmospheric pressure + aerodynamic pressure. As we are only interested in the aerodynamic pressure, we must remove the atmospheric pressure. A feature that would greatly help the calibration and improve the accuracy of the measurements is to measure the atmospheric pressure where there's no wind (i.e. in the base station).

Current state

For now, we retrieve only data from the sensor node(s) and transmit it to the cloud. It would be good to also be able to process data coming from other sensors/systems, which could follow a similar pipeline.
The sampling frequency does not need to be as high as 100 Hz; a mean value sent every 10 min/30 min/1 hour would be sufficient.
The temperature might also be valuable, but I think it will be highly influenced by the temperature inside the base station.

Recompute sensor_time_offset after handles update.

Feature request

Use Case

The sensor node's internal clock resets if the node crashes and restarts. If such an event occurs, sensor_time_offset should be recomputed.

Current state

Currently, sensor_time_offset is computed only once per gateway session. If the internal sensor node clock is reset during gateway operation and the connection is reestablished, the previously computed offset is no longer correct.

Proposed solution

When the connection is reestablished, the handles are updated, so it is possible to reinitialize sensor_time_offset to None in this method. This will force the time-stamping method to recompute it, as sketched below.
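A sketch of that change (method and attribute names assumed, not checked against the codebase):

def update_handles(self, payload):
    # ... existing handle-update logic ...

    # A handles update implies the node reconnected (and may have rebooted),
    # so any previously computed clock offset is stale.
    self.sensor_time_offset = None  # forces the time-stamping method to recompute it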

Persist received battery state info to cloud

Feature request

Use Case

Save the battery state info packet as "sensor" data and send it to the cloud.

Current state

The battery state info packet currently only outputs to the logger.
