This repository has been merged into the MIL monorepo.
You can view the repository prior to the merge here
NaviGator ASV on-board software
I think this is caused by something requesting odom before the navigator singleton object has received it. If that is indeed the problem, the singleton needs to wait for odom when it hasn't been received yet.
Setting up Semaphore Continuous Integration requires admin privileges on the repo. @zachgoins, are you the admin? Since we have the install script now, it should be simple to actually set it up, but you need to be the Git admin.
Considering the ambiguity in the rules around using mission outputs (scan-the-code colors, coral survey shapes, pinger gate) to determine mission inputs (which shape to dock in, which color to shoot at), I think we should abstract this decision making away from missions. Each mission should only set the parameters corresponding to the observed data (if any) of that mission. Then the mission planner should decide how to use that data to run other missions with different inputs. This way we have one file to edit for scripting all the missions together. Thoughts?
Example: Scan the code sets Colors=[GREEN,BLUE,YELLOW], Coral Survey sets SHAPE_2=TRIANGLE, then the mission planner runs detect deliver with inputs SHAPE=TRIANGLE, COLOR=GREEN
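The example above could be sketched at the planner level like this. This is only a minimal illustration of the proposed separation; the function and key names (`plan_detect_deliver`, `scan_the_code_colors`, `coral_survey_shape_2`) are hypothetical, not from the actual codebase:

```python
# Hypothetical sketch: the planner alone decides how observed mission
# outputs become inputs for later missions.

def plan_detect_deliver(observed):
    """Build detect_deliver inputs from other missions' observed outputs.

    `observed` holds whatever each mission reported, e.g.
    {"scan_the_code_colors": ["GREEN", "BLUE", "YELLOW"],
     "coral_survey_shape_2": "TRIANGLE"}.
    """
    colors = observed.get("scan_the_code_colors")
    shape = observed.get("coral_survey_shape_2")
    return {
        # Fall back to defaults if a mission never reported its data.
        "SHAPE": shape if shape is not None else "CROSS",
        "COLOR": colors[0] if colors else "RED",
    }

inputs = plan_detect_deliver({
    "scan_the_code_colors": ["GREEN", "BLUE", "YELLOW"],
    "coral_survey_shape_2": "TRIANGLE",
})
# inputs == {"SHAPE": "TRIANGLE", "COLOR": "GREEN"}
```

With this shape, individual missions never reference each other; changing how the colors map to detect deliver inputs is a one-file edit in the planner.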
While we get the hardware and electrical components up and running we should start experimenting with creating a sim from scratch or perhaps using a third party product. Having a working simulation is necessary for control implementation.
The simulator does not launch a bounds server (it can't, because it doesn't have absolute odom), so the ogrid, and therefore everything that relies on it (obstacle avoidance, missions), does not work in the sim. This can be a pretty quick fix: just publish bounds forming a square the size of the Gazebo world.
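The quick fix described above amounts to generating four corner points. A minimal sketch of what a sim-only bounds server could compute (the real node would publish these in whatever message the bounds service uses):

```python
# Sketch: corners of a square bounds region centered on the Gazebo
# world origin. Side length is an assumption to match the sim world.

def square_bounds(side):
    """Return the four corner points (x, y) of a square of the given
    side length, centered on the origin, in counter-clockwise order."""
    h = side / 2.0
    return [(-h, -h), (h, -h), (h, h), (-h, h)]
```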
Missing write up on simulator, acoustic based transit, and stereo vision.
Daniel just tried to use the install script on a brand-new Ubuntu 14.04 install. We noticed that it does not install git. It might be nice to add this to the script so it can be run without installing anything else first.
Is there a reason to have an alarm both to kill and to revive the boat? The AlarmRaiser class has a clear_alarm method that could be used to clear the kill alarm, removing the redundancy.
Create sensor fusion/kalman filtering node to give accurate pose data using GPS and IMU
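To illustrate the predict/update cycle such a fusion node would run, here is a deliberately minimal 1-D constant-velocity Kalman filter. This is a sketch only: the real node would carry a full 3-D state, use the IMU in the prediction step, and tune the noise terms; the values of `q` and `r` here are arbitrary:

```python
import numpy as np

def kf_step(x, P, z, dt, q=0.1, r=1.0):
    """One Kalman predict/update for state x = [position, velocity],
    covariance P, and a GPS-style position measurement z."""
    F = np.array([[1.0, dt], [0.0, 1.0]])  # constant-velocity transition
    H = np.array([[1.0, 0.0]])             # we only measure position
    Q = q * np.eye(2)                      # process noise (assumed)
    R = np.array([[r]])                    # measurement noise (assumed)
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the position measurement z
    y = z - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P
```

Feeding it a constant measurement pulls the position estimate toward that value while the velocity estimate settles back toward zero.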
Create a serial node to communicate with whichever microcontroller we decide to use. I do not know how much we will need to communicate between the microcontroller and ROS right now, but it is good to start with serial early so it is ready when we need it.
Considerations
NaviGator is having problems going to GPS waypoints. The boat's local lat/long is very far off from Google Earth points.
Correct me if I'm wrong, but it seemed that the boat was unable to return to points recorded (using the Sylphase INS) with its own INS system. Maybe an error in the translation from relative to world coordinates?
Need to update firmware on controller to trip at 70 amps (currently 60 amps).
Need a TF for the downward-looking camera. The camera will be mounted on a 2x4 cross beam at a depth of less than 1 meter. Lucas has been notified. Has anyone done the camera calibration for this camera? The camera is in 121, on the file cabinet next to the window.
The Point Grey camera on the right side of NaviGator, under the shooter, is unreliable with the auto configuration compared to the front cameras, which look quite good under any of the tested conditions. This could be for a few reasons:
By competition, we should have a mission written for the identify symbols and dock mission. It will likely use the same perception used for detect deliver (but running on the front camera). So far here is the plan:
There still seem to be some frustrating problems launching the sim. It is probably a matter of timing. Sometimes Gazebo just freezes on a black screen, sometimes it doesn't come up at all, and sometimes it freezes while spawning models. This happens on slow computers and fast computers.
I don't really have a more detailed bug report; just launch it a couple of times in a row and you'll see some issues.
It seems to mostly be Gazebo crashing, which freezes everything else up.
Transition the kill system used on Propagator to be used on the WAM-V.
It would be nice if mission-specific parameters could be set through the navigator singleton with methods similar to what we have implemented for the vision services. A yaml file would list all the mission parameters and their valid values.
Ex:
detect_deliver_shape:
param: /mission/detect_deliver/Color
args: ['GREEN','BLUE','RED']
Then in a mission (like scan the code):
if scan_the_code_color[2] == "RED":
    # Throws an error if the color is invalid, warns if already set, etc.
    navigator.set_mission_param('detect_deliver_shape', 'RED')
I'll work on this. Not sure how this will fit into the mission planner system...
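A minimal sketch of the validation the proposal implies, using the yaml layout from the example above. The class and attribute names are hypothetical, and a plain dict stands in for the ROS parameter server:

```python
# Hypothetical: spec mirrors the proposed yaml file.
MISSION_PARAMS = {
    "detect_deliver_shape": {
        "param": "/mission/detect_deliver/Color",
        "args": ["GREEN", "BLUE", "RED"],
    },
}

class MissionParamStore:
    def __init__(self, spec):
        self.spec = spec
        self.values = {}  # stands in for the ROS parameter server

    def set_mission_param(self, name, value):
        # Error on unknown params or invalid values, warn on re-set.
        if name not in self.spec:
            raise KeyError("unknown mission parameter: %s" % name)
        if value not in self.spec[name]["args"]:
            raise ValueError("invalid value %r for %s" % (value, name))
        key = self.spec[name]["param"]
        if key in self.values:
            print("warning: %s already set, overwriting" % key)
        self.values[key] = value
```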
After resetting RViz, the markers keep their saved locations and don't update. If there is a way to have the markers update only with absolute coordinates, that would be best.
Dr. Eric M. Schwartz suggested that the shape perception contain some data about how confident we are that the shape is a given shape and/or color.
Good idea.
Create node to turn a desired wrench (linear and angular force) into the proper thrust to move the vehicle along this wrench.
Considerations:
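One common way to sketch this node is a least-squares allocation: build a matrix whose columns map each thruster's unit thrust to the body wrench it produces, then pseudoinvert it. The thruster positions and angles below are made up for illustration, not NaviGator's real geometry:

```python
import numpy as np

def thruster_matrix(positions, angles):
    """Columns map each thruster's unit thrust to (Fx, Fy, torque_z)."""
    cols = []
    for (x, y), a in zip(positions, angles):
        fx, fy = np.cos(a), np.sin(a)
        cols.append([fx, fy, x * fy - y * fx])  # z torque = r x F
    return np.array(cols).T

def allocate(wrench, B):
    """Least-squares thrusts t minimizing ||B t - wrench||."""
    return np.linalg.pinv(B) @ wrench

# Example: four fixed thrusters in an X configuration (assumed layout).
positions = [(1, 1), (1, -1), (-1, 1), (-1, -1)]
angles = [-np.pi / 4, np.pi / 4, np.pi / 4, -np.pi / 4]
B = thruster_matrix(positions, angles)
wrench = np.array([10.0, 2.0, 1.0])  # desired (Fx, Fy, torque_z)
t = allocate(wrench, B)
```

A real node would also clamp `t` to the thrusters' limits and possibly re-solve, which the pseudoinverse alone does not handle.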
This weekend, we double-fed the ball launcher during the last run. This was caused by starting a mission and killing it before the mission completed (ball loaded but not fired). When we re-attempted the mission, the mechanism fed a second ball into the chamber. Daniel thinks this can be solved by using the ROS action library (actionlib).
The camera launch files are configured to use a yaml file with the calibration found in the repo, but I've only observed zeroed-out CameraInfo messages and no rectified image available.
The boat needs to either station hold or move to a safe point when communication is lost.
So far, no work has been done to write the perception or mission for the find the break challenge. By the competition, we should have the mission written and the best perception we can get based on the sim, the footage from Hawaii, and potential testing in a pool.
Also: MRAC controller should initialize to the current position, not (0,0)
Everyone,
We should clarify who is doing what concerning lidar classification! After our Monday meeting, my goal was to take Tess's work and merge it back into my C++ code to have one node handling creating the ogrid, identifying objects, volume classification, and answering service requests. This information can then be used by other nodes for various purposes!
Essentially, I wanted to free up time for Tess and also bundle the lidar code in one place. This way you can yell at me for any problems/errors/mistakes.
Tess, you mentioned concerns about losing functionality, but I've already got your Python code converted to C++ for remembering object IDs and answering service requests.
Thoughts/comments/concerns?
-Ira
When the simulator starts up, the challenges do not appear. We are missing all of the challenges except for the boat. Also, I cannot see land in the simulator.
In file included from /home/user/nav_ws/devel/include/navigator_msgs/ShooterManual.h:8:0,
from /home/user/nav_ws/src/NaviGator/mission_systems/shooter/firmware/shooter.cpp:5:
/opt/ros/indigo/include/ros/service_traits.h:31:50: fatal error: boost/type_traits/remove_reference.hpp: No such file or directory
#include <boost/type_traits/remove_reference.hpp>
^
compilation terminated.
make[6]: *** [CMakeFiles/arduino.dir/shooter.cpp.obj] Error 1
make[5]: *** [CMakeFiles/arduino.dir/all] Error 2
make[4]: *** [CMakeFiles/arduino.dir/rule] Error 2
make[3]: *** [arduino] Error 2
make[2]: *** [NaviGator/mission_systems/shooter/CMakeFiles/navigator_shooter_firmware_arduino] Error 2
make[1]: *** [NaviGator/mission_systems/shooter/CMakeFiles/navigator_shooter_firmware_arduino.dir/all] Error 2
make: *** [all] Error 2
Invoking "make -j4 -l4" failed
Matt Langford said he and Tess ran catkin_make clean to fix it.
So far, no work has been done to write the perception or mission for the Underwater Shape Identification (previously Coral Survey) challenge. By the competition, we should have the mission written and the best perception we can get based on the sim, the footage from Hawaii, and potential testing in a pool.
Create a node to turn inputs from a game controller into a desired wrench to send to the primitive driver. It is advised to use the default Ubuntu drivers to maintain cross-platform integration.
We need to verify that we can command the WAM-V computer from a remote computer using the Ubiquiti system. This is necessary for us to take the boat to the lake.
This service will call the camera-to-lidar service I'm writing in #21 and produce a normal to the plane found at the returned points. We will use this in detect deliver to align perpendicular to the target.
It fails for almost every build. No idea what the problem could be. See me if you'd like SSH access into the Semaphore VM.
We could make our own CI system that wouldn't be garbage by having a script on mil plumbusi that pulls from any PR branches it sees and tests the build there. Not sure how we would get the cool checkmark or X that Semaphore has, though.
So missions know what object to search the database for, it would be helpful to define constants for the keys in the message file. For example:
string SHOOTER=detect deliver target
string DOCK=find and dock
Currently, a service call to /database/single or /database/full returns data in the format of PerceptionObject.msg, which contains the object's id, name, size, and position. This drops some of the data from the original Buoy.msg sent by object detection, such as the normal and the 3D points of the object. For use in missions and for less confusion, perhaps this should just be one message format?
The simulation publishes the right (shooter side) camera to /right, whereas on the real boat it is /right/right. It should be the same on both. For convenience, let's change the sim to use /right/right
We need a way to be able to fire the launcher from other programs using something like
shooter.fire(), shooter.cancel(), etc...
Basically just opens up the commands you would manually input into the terminal to the ROS server system where we can call the functions from any programs.
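A thin client wrapper would be enough for this. The sketch below is hypothetical: the service names are assumptions, and a plain callable stands in for what would really be a rospy.ServiceProxy per service:

```python
# Hypothetical sketch of the proposed shooter interface; real service
# names and transport are not decided here.

class ShooterClient:
    def __init__(self, call_service):
        # call_service(name) stands in for a ROS service call.
        self._call = call_service

    def load(self):
        return self._call("/shooter/load")

    def fire(self):
        return self._call("/shooter/fire")

    def cancel(self):
        return self._call("/shooter/cancel")
```

Any program (missions, the GUI, a teleop script) could then import one class instead of shelling out to terminal commands.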
The rules give an example of the pinger mission where the boat drives through one gate, circles a buoy, and then drives back through another gate (page 19, http://robotx.org/images/files/2016-MRC-Tasks-2016-11-03.pdf). We'll have to be careful with how we define this mission since the pinger mission cannot be dependent on itself (ie, running a pinger mission immediately after a pinger mission, pinger mission dependent on pinger mission). Recommend breaking the mission into three parts: pinger mission, circle mission, pinger mission. Thoughts?
Once we have found the object we are looking for in the image, we typically want to make some maneuver based on it; i.e., we need to know its location in 3D space relative to the boat.
I will make a service that, given a pixel location and the camera's extrinsic to the boat, will use the lidar to tell where the object is. Then in the mission system we just use one function to get the distance rather than interacting with the lidar directly.
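The core of such a service can be sketched as projecting each lidar point into the image with the pinhole model and returning the point whose projection lands closest to the queried pixel. The intrinsics `K` and sample points below are illustrative; the real service would use the calibrated intrinsics and the camera-to-lidar extrinsic:

```python
import numpy as np

def closest_lidar_point(pixel, points_cam, K):
    """points_cam: (N, 3) lidar points already in the camera frame (z > 0).
    Returns the 3-D point whose pinhole projection is nearest `pixel`."""
    uv = (K @ points_cam.T).T            # homogeneous image coordinates
    uv = uv[:, :2] / uv[:, 2:3]          # perspective divide
    d = np.linalg.norm(uv - np.asarray(pixel, dtype=float), axis=1)
    return points_cam[np.argmin(d)]
```

A production version would also reject points behind the camera and probably average over a small pixel neighborhood rather than taking a single nearest point.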
For testing / the team video, the shooter should be controllable using the xbox controller and/or keyboard control in the GUI. Ideally it would have one button for shoot, one for fire, and one for cancel. I'll look through the controller script to see if I can figure this out.
We need a GUI for the judges. See pages 27-28:
http://robotx.org/images/files/2016-MRC-Tasks-2016-07-13.pdf
One thing holding me back from making serious use of the simulator has been apparent differences between the locations of sensors on NaviGator in Gazebo and the transformations published by tf.launch. Rather than trying to get these numbers just right, perhaps the simulator should just publish its own TFs based on the boat model. A quick Google search seems to show that this may even be the default behavior of a ROS-ified Gazebo.
Considering we will have almost no testing with the identify-and-dock challenge before Hawaii, we really need a good model for the sim. Right now the displayed symbols are too dark (like detect deliver used to be). The model should be fixed to make it easy to change which symbols are present.
Transition the heartbeat monitoring used on PropaGator 2
The thruster config gets reset whenever the driver is launched. This causes problems with the controller, since the default acceleration is low.
Please leave detailed descriptions of any object misclassifications you have noticed at the last few lake days so we can get them resolved by competition. We should also discuss the desired behaviors of the object database, such as what to do when the type of an object changes.
I'll start
Not sure if it's a hardware, firmware, or software issue. The thrusters won't return feedback information (we mainly just want battery voltage)
This is what the stereo TF looks like right now. It looks like one stereo camera is right in front of the other.
I don't think this is intended. @DSsoto Is there a reason for this change?