
Comments (25)

Neel1302 commented on June 17, 2024

Hi @m-binev, Thanks again for the support as well.

@pablo-quilez I did create a pull request, and I hope it can be merged into the Basler ROS driver after your testing. More broadly, I hope it can help other ROS users take full advantage of the camera features.

Here is a link to the fork with the hardware trigger branch that I have implemented, for anyone to use until the pull request is tested and merged.

Thanks,
Neel

pablo-quilez commented on June 17, 2024

Hi Claudio,

are you suggesting to make it configurable, like the fps we were discussing in the other issue? I think providing services for this would be a good idea, but maybe just changing to the LatestImageOnly strategy is the better option. What do you think?

claudiofantacci commented on June 17, 2024

It is probably good to have it configurable with GrabStrategy_LatestImageOnly as default.
Also, it may be worth investigating how these options behave with trigger mode On and Off.
I think, at least for us, we started noticing this after we moved to trigger mode On.
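On the configurable part: something as simple as mapping a configuration option onto the pylon enum before the driver calls StartGrabbing() would probably do. A rough sketch (the option strings and the helper name here are made up for illustration):

#include <string>
#include <stdexcept>
#include <pylon/PylonIncludes.h>

// Illustrative only: map a configuration string onto the pylon grab strategy
// that the driver would then pass to StartGrabbing().
Pylon::EGrabStrategy toGrabStrategy(const std::string& option)
{
    if (option == "one_by_one")        return Pylon::GrabStrategy_OneByOne;
    if (option == "latest_image_only") return Pylon::GrabStrategy_LatestImageOnly;
    if (option == "latest_images")     return Pylon::GrabStrategy_LatestImages;
    if (option == "upcoming_image")    return Pylon::GrabStrategy_UpcomingImage;  // GigE only
    throw std::invalid_argument("unknown grab strategy: " + option);
}

A ROS service or a launch/yaml parameter could then feed that option in.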

m-binev commented on June 17, 2024

Hi Pablo, hi Claudio,

as LatestImageOnly (1 buffer in the queue) is a special case of LatestImages (one or more buffers, freely configurable in pylon), it may be a good idea to implement LatestImages.
I think this strategy would best suit ROS, because images are copied and published in the ROS space. As this operation may take longer, it may happen that you work with "older" images when using OneByOne under some circumstances.
On the other side, if the use case requires all images to be kept, you would need OneByOne.
Thus, having OneByOne and LatestImages should cover most of the use cases.
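For reference, a minimal sketch of what LatestImages looks like in pylon (the output queue size of 2 and the loop of 100 grabs are just example values):

#include <pylon/PylonIncludes.h>

// Minimal sketch: grab with GrabStrategy_LatestImages and a small output queue.
int main()
{
    Pylon::PylonInitialize();
    {
        Pylon::CInstantCamera camera(Pylon::CTlFactory::GetInstance().CreateFirstDevice());
        camera.Open();
        camera.OutputQueueSize = 2;  // a queue size of 1 behaves like GrabStrategy_LatestImageOnly
        camera.StartGrabbing(Pylon::GrabStrategy_LatestImages);

        Pylon::CGrabResultPtr result;
        for (int i = 0; i < 100 && camera.IsGrabbing(); ++i)
        {
            camera.RetrieveResult(5000, result, Pylon::TimeoutHandling_ThrowException);
            if (result->GrabSucceeded())
            {
                // Older frames that did not fit into the output queue are simply
                // dropped; result->GetNumberOfSkippedImages() reports how many.
            }
        }
        camera.StopGrabbing();
    }
    Pylon::PylonTerminate();
    return 0;
}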
@claudiofantacci , @pablo-quilez

claudiofantacci commented on June 17, 2024

Hi @m-binev, thanks for reaching out here 😄

I'm sorry, I'm not sure I got the point of your comment.
In general, as long as we have a way to

  • use only the latest available image (LatestImageOnly, or LatestImages with a buffer of 1)
  • buffer up to N images (LatestImages with a buffer > 1)
  • buffer all images (OneByOne)

we are covering most (or all?) use cases.

m-binev commented on June 17, 2024

Hi @claudiofantacci ,
I basically wanted to say that having OneByOne and LatestImages should cover all use cases with USB cameras (and most use cases with GigE).
For GigE cameras, having also "UpcomingImage" may make sense, but we may consider implementing that strategy in ROS only if we see real demand for that.

claudiofantacci commented on June 17, 2024

Ok, great. For the record, we use GigE cameras, and to me UpcomingImage sounds like a rather specific use case that I cannot really picture. Any example?

m-binev commented on June 17, 2024

Hi,
well, if I recall correctly, the users who were looking for such a grab strategy wanted to make sure they do not deal with "older" images that were, e.g., still available in the queue.
In those cases they were running the cameras in free-run or with a continuous hardware trigger and used the grabbed images as a kind of preview for the end customer, e.g. in ophthalmology applications. While showing the preview, they were also adjusting various camera parameters controlling the brightness and the color to find the perfect settings. Once done, they wanted to switch off the preview and get a fresh new image with the latest parameters.
I suppose something like that could also be achieved using e.g. a SW trigger, but as we said before, with a SW trigger you get a lower maximum frame rate.
I hope that sheds some more light))
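Roughly, retrieving such a guaranteed-fresh image with that strategy looks like this (GigE only; the helper name is made up for the sketch):

#include <pylon/PylonIncludes.h>

// Sketch only (GigE cameras; UpcomingImage is not supported for USB devices):
// fetch one guaranteed-fresh frame, e.g. after a preview/parameter-tuning phase.
Pylon::CGrabResultPtr GrabFreshImage(Pylon::CInstantCamera& camera)
{
    camera.StartGrabbing(Pylon::GrabStrategy_UpcomingImage);
    Pylon::CGrabResultPtr result;
    // The buffer is queued only now, so the call waits for the next image
    // instead of returning an older one from a queue.
    camera.RetrieveResult(5000, result, Pylon::TimeoutHandling_ThrowException);
    camera.StopGrabbing();
    return result;
}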

claudiofantacci commented on June 17, 2024

Interesting implementation! Thanks for the explanation!

Neel1302 commented on June 17, 2024

To follow up: what is a temporary fix to get the FPS as high as in the Pylon Viewer? Is it by modifying the grab strategy, or by changing the trigger mode/timeout properties? I have no bandwidth limits (quad-channel PCIe card from Basler installed).

@claudiofantacci I understand you mentioned that "setting grab_timeout_ = and timeout = 2 * grab_timeout_ resulted in very stable performance" in #29 (comment). Have you figured out how to set the frame rate based on the yaml file (i.e. using a variable rather than hardcoding it again)?
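For reference, the kind of thing I have in mind is roughly the following (the helper name and the factor of two are just for illustration; I am assuming the frame_rate parameter from the yaml is available on the private node handle):

#include <ros/ros.h>

// Rough sketch (not the actual driver code): derive the grab timeout from the
// configured frame rate instead of hardcoding it.
double loadGrabTimeoutMs(ros::NodeHandle& private_nh)
{
    double frame_rate = 30.0;
    private_nh.param<double>("frame_rate", frame_rate, 30.0);
    // Allow roughly two frame periods (in ms) before a grab counts as timed out.
    return 2.0 * 1000.0 / frame_rate;
}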

Thanks,
Neel

m-binev commented on June 17, 2024

@Neel1302 Hi, please check #21 and my comment from April 16th, especially regarding the resulting frame rate in free-run/hardware trigger mode (both not yet supported under ROS) and in software trigger mode. Hope that helps.

Neel1302 commented on June 17, 2024

@m-binev @claudiofantacci Hi, thanks. I did have a look at your comment, and it seems that the software trigger mode set by the ROS driver is what drives down the fps.

For anyone who wants to run in continuous acquisition mode, refer to this fork that I wrote and tested for USB cameras (users need to modify it accordingly for GigE cameras). Test results show no drop in fps.
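The gist of the change is to switch the trigger off and let the camera run at its configured frame rate. A condensed sketch of the idea (USB parameter names; the actual changes in the fork live in pylon_camera_base.hpp and are a bit more involved):

#include <pylon/usb/BaslerUsbInstantCamera.h>

// Condensed sketch of continuous acquisition for a USB camera: no trigger,
// the camera paces itself at the configured frame rate.
void configureContinuousAcquisition(Pylon::CBaslerUsbInstantCamera& cam, double fps)
{
    using namespace Basler_UsbCameraParams;
    cam.TriggerSelector.SetValue(TriggerSelector_FrameStart);
    cam.TriggerMode.SetValue(TriggerMode_Off);       // no software or hardware trigger
    cam.AcquisitionFrameRateEnable.SetValue(true);
    cam.AcquisitionFrameRate.SetValue(fps);
}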

[Screenshots from 2020-07-28 showing the measured frame rate with the fork]

Thanks,
Neel

m-binev commented on June 17, 2024

@Neel1302 Hi Neel, cool! You seem to have implemented the so-called free-run mode (the camera triggers itself automatically), which allows reaching the maximum possible frame rate (excluding any other possible bottlenecks, of course). Looking at the changes you have made in 'pylon_camera_base.hpp', I am pretty optimistic that a hardware trigger mode can be easily implemented, too.
That is, it is more or less a matter of taking the 'CSoftwareTriggerConfiguration' configuration handler (which is a header file), modifying 1-2 lines in it, renaming it, registering it instead of the software trigger or the AcquireContinuous configuration handlers, and finally adapting the part that waits for images to be received with some reasonable timeout.
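To make that concrete, a minimal sketch of such a handler (FrameStart on Line1 with a rising edge is an assumed example wiring, not a requirement):

#include <pylon/PylonIncludes.h>

// Minimal sketch of a hardware trigger configuration handler, modeled on the
// pattern of CSoftwareTriggerConfiguration.
class CHardwareTriggerConfiguration : public Pylon::CConfigurationEventHandler
{
public:
    virtual void OnOpened(Pylon::CInstantCamera& camera)
    {
        GenApi::INodeMap& nodemap = camera.GetNodeMap();
        // Switch on the frame start trigger and bind it to a hardware input line.
        GenApi::CEnumerationPtr(nodemap.GetNode("TriggerSelector"))->FromString("FrameStart");
        GenApi::CEnumerationPtr(nodemap.GetNode("TriggerMode"))->FromString("On");
        GenApi::CEnumerationPtr(nodemap.GetNode("TriggerSource"))->FromString("Line1");
        GenApi::CEnumerationPtr(nodemap.GetNode("TriggerActivation"))->FromString("RisingEdge");
    }
};

// Registered instead of the software trigger handler, e.g.:
// camera.RegisterConfiguration(new CHardwareTriggerConfiguration,
//                              Pylon::RegistrationMode_ReplaceAll, Pylon::Cleanup_Delete);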
@pablo-quilez I think we could use & adapt that implementation for further driver improvements.
Regards, Momchil

Neel1302 commented on June 17, 2024

@m-binev Thanks! Yes, I agree, that is a good suggestion. In fact, in about 2 weeks from now I will be adding another set of cameras to my platform's sensor suite that requires hardware triggering due to specific application constraints. I will report back and post the link to that modification/implementation here if I achieve success with it.

Thanks,
Neel

m-binev commented on June 17, 2024

@Neel1302 Hi Neel, attached is a hardware trigger configuration based on the software trigger configurator, which is typically located in 'C:\Program Files\Basler\pylon 6\Development\include\pylon' in the pylon installation.
I'll be curious to hear your results once you get to testing the hardware trigger. Regards, Momchil
HardwareTriggerConfiguration.zip

Neel1302 commented on June 17, 2024

@m-binev Thanks again. That's great. I will get to testing it in about 2 weeks' time.

I had a look at the configuration you suggested. I saw the TriggerMode being set, the trigger wait set to FrameStart, and the GPIO input line setting with a parameter for which signal edge to detect as a trigger. For my application, I will have a PPS input trigger signal at 1 Hz from a GPS, but the cameras need to operate at a higher frame rate: 30 FPS. Thus, I plan to set the triggerName to FrameBurstStart, so that for every PPS trigger sent by the GPS the camera captures 30 frames. I understand that for the part waiting for the trigger (in the base code), I need to poll FrameBurstTriggerWait. Nonetheless, I think what you have attached will give me a head start, and I will update after doing some tests.

Thanks again,
Neel

m-binev commented on June 17, 2024

Hi Neel, okay, as you seem to have a somewhat more specific setup, you will have to adapt/extend the header file according to your needs. But you have nailed it down in principle))
The combination of 'FrameBurstStart + AcquisitionBurstFrameCount + FrameBurstTriggerWait', together with an "infinite" timeout for the RetrieveResult() function, should do the job for you.
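In code that is roughly the following (USB parameter names; the input line and the burst count of 30 are placeholders to be adapted to your wiring and frame rate):

#include <pylon/usb/BaslerUsbInstantCamera.h>

// Rough sketch of the burst setup: one PPS pulse on a hardware input line
// (Line3 used as a placeholder) starts a burst of 30 frames.
void configureFrameBurst(Pylon::CBaslerUsbInstantCamera& cam)
{
    using namespace Basler_UsbCameraParams;
    cam.TriggerSelector.SetValue(TriggerSelector_FrameBurstStart);
    cam.TriggerMode.SetValue(TriggerMode_On);
    cam.TriggerSource.SetValue(TriggerSource_Line3);
    cam.TriggerActivation.SetValue(TriggerActivation_RisingEdge);
    cam.AcquisitionBurstFrameCount.SetValue(30);
}

In the grab loop, RetrieveResult() is then called with a very large timeout so that it simply blocks until the next PPS edge starts a burst.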
Yes, please drop me a line when you are done with your testing. Cheers

Neel1302 commented on June 17, 2024

Hi @m-binev,

I tried to use the configuration file you sent and placed it in /opt/pylon5/include/pylon, as I am on Linux. I confirmed that this is where the other configurations, such as AcquireContinuousConfiguration.h, are situated. It seems as though, after adding the CHardwareContinuousConfiguration.h file, it cannot be found in the Pylon namespace. Here is the error I receive when I use catkin to re-build the project.

[Screenshot of the catkin build error]

As far as I understand, I need to re-build the pylon driver itself. Do you have any idea regarding this?

Thanks,
Neel

m-binev commented on June 17, 2024

@Neel1302 Hi Neel, have you added an include with the path to the new header file in "pylon_camera_base.hpp"? It seems the compiler cannot find the header file, because it is a new one to pylon. Regards, Momchil

Neel1302 commented on June 17, 2024

Hi @m-binev, you are right, the compiler was not able to find the header file. So what I did was to add #include <pylon/HardwareTriggerConfiguration.h> // CHardwareTriggerConfiguration to /opt/pylon5/include/pylon/PylonIncludes.h. I figured out that this is where CSoftwareTriggerConfiguration.h is included as well.

What I am trying to figure out now is why CConfigurationHelper is included. I understand it is there to disable compression, but I am not sure if this is a must. Is there a specific directory where it sits? The compiler cannot find it.

Update: Assuming that disabling compression is not required, I went ahead and modified the hardware trigger configuration you sent for Frame Burst Triggering and initially it did not work. After some debugging, I found that the culprit was:

cam_->UserSetSelector.SetValue(Basler_UsbCameraParams::UserSetSelector_Default);
cam_->UserSetLoad.Execute();

The ROS driver resets the parameters of the camera after the hardware trigger configuration is loaded, and thus I was not seeing any image acquisition triggered by the input trigger signal! Perhaps this is why the software trigger mode was set to "On" yet again in the same file, despite already being specified in the CSoftwareTriggerConfiguration.h file.

// UserSetSelector_Default overrides Software Trigger Mode !!
cam_->TriggerSource.SetValue(Basler_UsbCameraParams::TriggerSource_Software);
cam_->TriggerMode.SetValue(Basler_UsbCameraParams::TriggerMode_On);

See the pylon camera reference for Basler_UsbCameraParams::UserSetSelectorEnums, as shown below:
[Screenshot of the UserSetSelectorEnums documentation]

The solution is to disable the resetting of the camera parameters. Having disabled the two lines referred to above (lines 95-96), everything started to work as expected. See below for confirmation of the achieved frame rate (in my case a 1 Hz signal is sent into GPIO Line3 and AcquisitionBurstFrameCount is set to 20):
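That is, in pylon_camera_base.hpp the reset is simply skipped (a sketch of the change, not the exact diff):

// Skipped so that the loaded hardware trigger configuration is not overwritten:
// cam_->UserSetSelector.SetValue(Basler_UsbCameraParams::UserSetSelector_Default);
// cam_->UserSetLoad.Execute();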

[Screenshot confirming the achieved frame rate]

So it seems that, with these additional changes, the hardware trigger configuration you suggested does work, and I have confirmed that experimentally.

Thanks again for your help. I hope this can help develop the ROS driver further to include these triggering features (potentially exposed directly through the config file),
Neel

m-binev commented on June 17, 2024

Hi @Neel1302 , that is positive news - great))
Just to complete the discussion on header files: "CConfigurationHelper.h" is a standard pylon header file for disabling all trigger modes and the compression used in ace 2 cameras. It is located in \pylon X\Development\include\pylon. In this case it is not used at all, though, and the include could be commented out.
Otherwise, it is great that the hardware trigger mode is working fine, too.
Many thanks for all the modifications, testing and confirmation. Cheers, Momchil
@pablo-quilez FYI.

pablo-quilez commented on June 17, 2024

Hi @Neel1302,

thank you for your implementation! Could you do a Pull Request to devel? I can test it here and then merge it.

Cheers

pablo-quilez commented on June 17, 2024

Hello,

we have tested and merged #56, which adds the possibility to choose the grabbing strategy as well as the timeout. @Neel1302 my colleague checked your PR; thanks for the contribution, but we cannot merge it directly, only use it as a guide to add the hardware trigger as an option. I will keep the issue and the PR open.

Cheers,
Pablo

Neel1302 commented on June 17, 2024

@pablo-quilez I am happy that my contribution can help users in need of this feature. If there is a specific reason why it cannot be merged directly, let me know and I can try to see how I can help, as I do think that for use cases such as autonomous driving and machine vision the hardware triggering feature is essential, especially given that most Basler cameras support it. While this is under implementation, I hope my fork implementing this functionality can help those who require it.

Thanks,
Neel

farhad-dalirani commented on June 17, 2024

The current implementation of the camera driver uses the following function call to start grabbing

void Pylon::CInstantCamera::StartGrabbing(EGrabStrategy strategy = GrabStrategy_OneByOne, EGrabLoop grabLoopType = GrabLoop_ProvidedByUser)

with default parameters, i.e. calling

Pylon::CInstantCamera::StartGrabbing()

in different parts of the code, see for example pylon_camera/include/pylon_camera/internal/impl/pylon_camera_base.hpp.

However, this implies that, for real-time applications, if a frame is not grabbed (e.g. a trigger fails), a queue of past frames builds up, resulting in an unsynchronized grabbed sequence of images. This may be desirable in some applications, but the driver/ROS node should provide a way to choose the most suitable strategy. The available strategies are:

enum Pylon::EGrabStrategy

  • GrabStrategy_OneByOne: The images are processed in the order of their arrival. This is the default grab strategy.
  • GrabStrategy_LatestImageOnly: Only the latest image is kept in the output queue; all other grabbed images are skipped. If no image is in the output queue when retrieving an image with CInstantCamera::RetrieveResult(), the processing waits for the upcoming image.
  • GrabStrategy_LatestImages: This strategy can be used to grab images while keeping only the latest images. If the application does not retrieve all images in time, all other grabbed images are skipped. The CInstantCamera::OutputQueueSize parameter can be used to control how many images can be queued in the output queue. When setting the output queue size to 1, this strategy is equivalent to GrabStrategy_LatestImageOnly. When setting the output queue size to CInstantCamera::MaxNumBuffer, this strategy is equivalent to GrabStrategy_OneByOne.
  • GrabStrategy_UpcomingImage: The input buffer queue is kept empty. This prevents grabbing. However, when retrieving an image with a call to the CInstantCamera::RetrieveResult() method, a buffer is queued into the input queue and the call then waits for the upcoming image. The buffer is removed from the queue on timeout, so after the call to CInstantCamera::RetrieveResult() the input buffer queue is empty again. The upcoming image grab strategy cannot be used together with USB camera devices.

Hi, I have a similar problem. Could you take a look at this question?
