
rosbag.js's People

Contributors

brianc, cjds, davepagurek, davidcrawford, davidswinegar, defunctzombie, disambiguator, esthersweon, gkjohnson, hhsaez, janpaul123, jtbandes, matthewsteel, pankdm, stuk, surajhpatil


rosbag.js's Issues

Feature Suggestion: Allow decompression functions to be asynchronous

A few of the rosbags I'm working with are lz4-compressed, and I'm noticing that my frame time is often dominated by the decompression function when reading the file, leading to hiccups in application responsiveness.

It would be great to be able to run the decompression asynchronously on a web worker -- using ArrayBuffers and SharedArrayBuffers it should be possible to win pretty easily on decompression time.

Is it as easy as awaiting a promise when calling the compression function here? The other question is then would it be safe to transfer the ArrayBuffer ownership temporarily to a webworker while decompression work happens? Or is it expected that multiple chunks share an array buffer? It looks like the answer might be no in the browser but it's unclear when running in node. If they are separate then it would be possible to decompress multiple chunks in parallel, too.
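For what it's worth, here's a rough sketch of the consumer side, assuming the library were changed to await the value returned from the decompress callback. lz4-worker.js is a hypothetical worker script that decompresses the chunk and posts the result back:

// main thread
const worker = new Worker('lz4-worker.js'); // hypothetical worker script
let nextId = 0;
const pending = new Map();
worker.onmessage = ({ data }) => {
    pending.get(data.id)(Buffer.from(data.result));
    pending.delete(data.id);
};

// could be supplied as the lz4 decompress function if the library awaited it
function decompressLz4Async(buffer) {
    return new Promise((resolve) => {
        const id = nextId++;
        pending.set(id, resolve);
        // transferring the underlying ArrayBuffer avoids a copy, but is
        // only safe if chunks don't share a buffer, per the question above
        worker.postMessage({ id, buffer }, [buffer.buffer]);
    });
}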

Propagate FileReader Errors

I hit this issue when trying to process a large (1.0+ GB) rosbag file in the browser -- when the FileReader hits an error, the bag handle never responds because onload is never called.

The error can be propagated by adding something like the following code in this function:

reader.onerror = function() {
    // detach both handlers so they can't fire again, then report the
    // error to the pending callback instead of leaving the bag hanging
    reader.onload = null;
    reader.onerror = null;

    setImmediate(cb, reader.error, null);
};

I'd submit a PR, but I'm not certain how to reproduce the issue -- it seems to happen intermittently. I'm only seeing it in Chrome, where the error returned by the reader is DOMException: The requested file could not be read, typically due to permission problems that have occurred after a reference to a file was acquired.

I'm loading and opening the bag like so:

import { open } from 'rosbag';

fetch(url, { credentials: 'same-origin' })
    .then(res => res.blob())
    .then(blob => open(new File([blob], 'temp.bag')))
    .then(() => { /* ... nothing happens ... */ })

Ultimately it seems to be a browser problem but it would be nice to propagate the error. Admittedly my experience with the FileReader API is limited -- maybe you guys have a bit more insight?

Thanks!

Add BigInt reading support

Not all values of int64 and uint64 types can be represented in plain Numbers in JS. We currently use int53 which just logs a console.assert error if the number is out of range, but doesn't prevent returning an incorrect value. The new BigInt type can be used instead, but this should be an opt-in feature because it's not source compatible (for instance, 1n + 2 is a type error).
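For reference, both the precision hazard and the source-compatibility hazard are easy to demonstrate:

// 2^53 + 1 is the first integer that Number cannot represent exactly:
Number(9007199254740993n); // => 9007199254740992 (silently off by one)

// BigInt preserves the value, but mixing types throws:
1n + 2; // TypeError: Cannot mix BigInt and other types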

Transform record is polluted with unused fields when loading Example.bag

When loading the example bag in the fixtures folder with the following code:

const bag = await open('.../example.bag');
bag.readMessages({ topics: '/tf' }, msg => {
    console.log(JSON.stringify(msg.message, (key, val) => val === undefined ? null : val, 2));
});

Record is printed out in the following form:

{
  "transforms": [
    {
      "x": null,
      "y": null,
      "z": null,
      "w": null,
      "header": {
        "x": null,
        "y": null,
        "z": null,
        "w": null,
        "seq": 0,
        "stamp": {
          "sec": 1396293888,
          "nsec": 56065082
        },
        "frame_id": "world"
      },
      "child_frame_id": "turtle2",
      "transform": {
        "x": null,
        "y": null,
        "z": null,
        "w": null,
        "translation": {
          "x": 4,
          "y": 9.088889122009277,
          "z": 0,
          "w": null
        },
        "rotation": {
          "x": 0,
          "y": 0,
          "z": 0,
          "w": 1
        }
      }
    }
  ]
}

The same thing happens in the browser, as well:

[screenshot of the same record rendered in the browser devtools console]

You can see that a lot of the objects in the record definition seem to be inheriting from the Quaternion class unnecessarily, even the translation field.

"extractFields" function triggers Buffer.indexOf polyfill slow path

Hello! When doing perf analysis on my app I noticed a lot of time spent in extractFields and Buffer.indexOf, and found a quirk of the Buffer.indexOf polyfill implementation: when a string is passed in, it creates a new buffer and uses a custom JavaScript search function rather than the native Uint8Array.indexOf implementation.

Here's a quick benchmark showing the performance difference. I'm seeing a 10-15x improvement when passing a number into indexOf rather than a string. Note that this is when running in a browser / in node with the polyfill; with the native node Buffer implementation there are still some small improvements, but nothing nearly as drastic:

import { Buffer } from 'buffer';

const buff = Buffer.alloc(10000);
buff[5000] = 61; // 61 === '='.charCodeAt(0)

const c = '=';
console.time('String');
for (let i = 0; i < 1000; i++) {
    buff.indexOf(c);
}
console.timeEnd('String');

const n = '='.charCodeAt(0);
console.time('Number');
for (let i = 0; i < 1000; i++) {
    buff.indexOf(n);
}
console.timeEnd('Number');

It seems that this gets run in a pretty tight loop, so it should be worth switching over to something like field.indexOf(61). If that sounds agreeable I can submit a PR with the change!

Error: Expected file to be a File or Blob. Make sure you are correctly importing the node or web version of Bag.

I am very new to all of this and this might be a very silly question, but I am getting the error below when trying to load the example.bag file in fixtures (or any bag file). I created an Angular CLI project and installed rosbag via npm install.

app.component.ts

import { Component, OnInit } from '@angular/core';
import { open } from 'rosbag';

@Component({
  selector: 'app-root',
  templateUrl: './app.component.html',
  styleUrls: ['./app.component.css']
})
export class AppComponent implements OnInit {
  title = 'RosUtil';

  async readRosBagFile() {
    const bag = await open('.../example.bag');
    bag.readMessages({ topics: '/tf' }, msg => {
        console.log(JSON.stringify(msg.message, (key, val) => val === undefined ? null : val, 2));
    });
  }

  ngOnInit() {
    this.readRosBagFile()
  }
}

Error

core.js:15724 ERROR Error: Uncaught (in promise): Error: Expected file to be a File or Blob. Make sure you are correctly importing the node or web version of Bag.
Error: Expected file to be a File or Blob. Make sure you are correctly importing the node or web version of Bag.
    at open (index.js:4784)
    at AppComponent.<anonymous> (app.component.ts:13)
    at step (tslib.es6.js:97)
    at Object.next (tslib.es6.js:78)
    at tslib.es6.js:71
    at new ZoneAwarePromise (zone.js:910)
    at Module.__awaiter (tslib.es6.js:67)
    at AppComponent.push../src/app/app.component.ts.AppComponent.readRosBagFile (app.component.ts:12)
    at AppComponent.push../src/app/app.component.ts.AppComponent.ngOnInit (app.component.ts:20)
    at checkAndUpdateDirectiveInline (core.js:22099)
    at resolvePromise (zone.js:831)
    at zone.js:741
    at rejected (tslib.es6.js:69)
    at ZoneDelegate.push../node_modules/zone.js/dist/zone.js.ZoneDelegate.invoke (zone.js:391)
    at Object.onInvoke (core.js:17299)
    at ZoneDelegate.push../node_modules/zone.js/dist/zone.js.ZoneDelegate.invoke (zone.js:390)
    at Zone.push../node_modules/zone.js/dist/zone.js.Zone.run (zone.js:150)
    at zone.js:889
    at ZoneDelegate.push../node_modules/zone.js/dist/zone.js.ZoneDelegate.invokeTask (zone.js:423)
    at Object.onInvokeTask (core.js:17290)

Error reports missing file header when file is nonexistent

When opening a file that doesn't exist, the error is reported as "Missing file header":

(node:69920) UnhandledPromiseRejectionWarning: Error: Missing file header.

rather than reporting that the file doesn't exist. Here's some repro code:

const { open } = require('rosbag');
open('../non/existent.bag');

Node.js streams support for .pipe()

Currently readMessages() reads all the messages as fast as possible. I have a use case where I process this data, and I would like to pause reading the bag while I process the messages that have already been read.

By having a stream based API, I will be able to handle the entire processing pipeline as a Node.js stream.
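For what it's worth, here's a rough consumer-side sketch of a .pipe()-able wrapper, under the assumption that readMessages() returns a promise that resolves once the bag is exhausted. Note it still buffers internally, because readMessages() drives the loop itself and cannot be paused -- which is why proper backpressure needs support inside the library:

const { Readable } = require('stream');
const { open } = require('rosbag');

async function bagToStream(path, options = {}) {
    // objectMode so each ReadResult is one chunk; read() is a no-op
    // because we push from the readMessages callback instead
    const stream = new Readable({ objectMode: true, read() {} });
    const bag = await open(path);
    bag.readMessages(options, (result) => stream.push(result))
        .then(() => stream.push(null)) // signal end-of-stream
        .catch((err) => stream.destroy(err));
    return stream;
}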

parseMessageDefinition fails for JSON-incompatible ROS message

The following ROS message definition is being passed to rosbag's parseMessageDefinition function:

bool Alive=True
bool Dead=False

time stamp
string robot_name
bool heartbeat_alive
string error_msg

The parseMessageDefinition function fails (at line 116) while parsing the above message, because the boolean constants True and False are not compatible with standard JSON true and false.
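If the constant values are indeed run through JSON.parse (as the leading-zero issue below also suggests), the failure is easy to reproduce in isolation:

JSON.parse('true'); // => true
JSON.parse('True'); // SyntaxError: 'True' is not valid JSON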

  • This issue occurred while loading a bag file into webviz.
  • In webviz itself, the call happens in bagConnectionsHelper.js, where the messageDefinition is passed to the parseMessageDefinition function.
  • In principle, the input content should be sanitized before processing.

Issue for bags with the type Header in it

Any bag with the type Header in it seems to have an unresolvable issue when it comes to looking up its connections.

I can't remove Header as a type from the bag, because the diagnostics package (https://wiki.ros.org/diagnostics) uses the Header type.

I've attached the bag that causes it. Here's the repro code:

const lz4 = require("lz4js");
const { open, parseMessageDefinition } = require('rosbag');

function connections_returner(connections) {
  const op = [];
  // iterate over the connection records themselves rather than using
  // for...in, which walks array indices as strings
  for (const connection of Object.values(connections)) {
    const { messageDefinition, md5sum, topic, type } = connection;
    if (messageDefinition && md5sum && topic && type) {
      op.push({ messageDefinition, md5sum, topic, type });
    }
  }
  return op;
}

async function checkConnections() {
  const bag = await open('test.bag');
  const connections = connections_returner(bag.connections);
  connections.forEach((connection) => {
    // parsing the definition is what throws for bags containing Header
    const connectionTypes = parseMessageDefinition(connection.messageDefinition);
  });
}

checkConnections();

This is the stack trace I get:

(node:442) UnhandledPromiseRejectionWarning: Error: Expected 1 top level type definition for 'Header' but found 0
    at findTypeByName (/test/node_modules/rosbag/dist/node/index.js:1450:11)
    at definitions.forEach.definition (/test/node_modules/rosbag/dist/node/index.js:1499:27)
    at Array.forEach (<anonymous>)
    at types.forEach (/test/node_modules/rosbag/dist/node/index.js:1497:17)
    at Array.forEach (<anonymous>)
    at parseMessageDefinition (/test/node_modules/rosbag/dist/node/index.js:1494:9)
    at connections.forEach (/test/test.js:39:29)
    at Array.forEach (<anonymous>)
    at checkConnections (/test/test.js:37:15)

Here's a link to the file that causes it
https://filebin.net/56sdttauoozk98i8/test.bag?t=ut7rwok9

How can you detect the end of reading results & other comments!

Hello!

First off -- thank you for making your parser available publicly! So far it seems to work fantastically. I'm still just getting started with it but I had a few questions and comments about how to use it. I'm using the currently published npm version 1.2.1, though I see that the structure of the repo has changed a bit since the last publish.

  • When reading rosbag data, is there a way to detect when the reader has finished returning ReadResults? You could compare the bag endTime to the ReadResult timestamp to see if they're equal, but that doesn't seem like the most robust approach (see the sketch after this list).
  • Is there documentation on how to load the library in a browser? Is a build process required? It would be nice to have an example.
  • The documentation for "BagOptions" references https://github.com/cruise-automation/rosbag.js/blob/master/test/rosbag.js#L139, which is no longer available. Presumably it should point to rosbag.test.js.
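For the first question, one possibility, under the assumption that readMessages() returns a promise that resolves only after the last ReadResult has been delivered (I haven't verified this on 1.2.1):

const bag = await open('example.bag');
await bag.readMessages({}, (result) => {
    // handle each ReadResult as it arrives
});
// if the promise indeed resolves after the final ReadResult,
// the reader is guaranteed to be done here
console.log('finished reading');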

Thanks again!
Garrett

Replay at given rate

Hi! thanks for the amazing library.

Since bag.readMessages reads through the entire bag in one go, I was wondering how to implement replay at a given rate.

One way I could think of is pushing the messages into an event-queue-like structure and then picking them up at appropriate intervals using setTimeout -- but that means the entire bag ends up buffered in the browser's memory.
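For illustration, a minimal sketch of exactly that buffer-then-schedule approach, assuming each ReadResult carries a { sec, nsec } timestamp -- note it keeps the whole bag in memory, which is the drawback described above:

async function replayAtRate(bag, rate, onMessage) {
    const results = [];
    await bag.readMessages({}, (r) => results.push(r));
    if (results.length === 0) return;

    const toMs = (t) => t.sec * 1000 + t.nsec / 1e6;
    const bagStart = toMs(results[0].timestamp);
    const wallStart = Date.now();
    for (const r of results) {
        // scale elapsed bag time by the playback rate and wait until due
        const due = wallStart + (toMs(r.timestamp) - bagStart) / rate;
        const delay = due - Date.now();
        if (delay > 0) await new Promise((res) => setTimeout(res, delay));
        onMessage(r);
    }
}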

Is there a recommended approach? Pardon if the question is stupid.
Thanks!

Get progress of readMessages

I want to show the progress of reading the file, something like a progress bar. How do I get the progress of how much of the file has been parsed/read?

Please provide info If I have overlooked already existing properties.
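In case it helps until there is first-class support, here is a sketch of a time-based approximation. It assumes the bag exposes startTime and endTime properties and that each ReadResult carries a { sec, nsec } timestamp; updateProgressBar is a hypothetical UI hook, and true byte-level progress would need support from the reader itself:

const toMs = (t) => t.sec * 1000 + t.nsec / 1e6;

const bag = await open('example.bag');
const span = toMs(bag.endTime) - toMs(bag.startTime);
await bag.readMessages({}, (result) => {
    // fraction of the bag's time range that has been read so far
    const progress = (toMs(result.timestamp) - toMs(bag.startTime)) / span;
    updateProgressBar(progress); // hypothetical UI hook
});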

Parse error for numeric constant with leading zero

A user reported this in the Webviz Slack workspace.

Certain msg definitions (for example kobuki_msgs/VersionInfo) have numeric constants with leading zeros: uint64 SMOOTH_MOVE_START = 0000000000000001

ROS seems to handle this fine, but rosbag.js throws an error when trying to parse the constant, because JSON.parse throws. (@foxglove/rosmsg does handle it correctly, although I don't think we were aware of this bug/limitation when we wrote it.)
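The failure is easy to reproduce in isolation, since JSON does not allow numeric literals with leading zeros:

JSON.parse('1');                // => 1
JSON.parse('0000000000000001'); // SyntaxError: leading zeros are not valid JSON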

"should read bytes from a file" test fails on Windows

The test fails here on Windows because the reported size is 22 bytes rather than 21 bytes. I'm not exactly sure why, but it may have something to do with line endings (for instance, git's autocrlf converting the fixture's LF to CRLF on checkout would add exactly one byte) or an extra byte at EOF when reading.

One option is just to make sure that the reported size matches the size reported by fs.stat:

assert.equal(reader.size(), fs.statSync(fixture).size);
