
Noetic support, segfault #71

Open
chfritz opened this issue Jun 29, 2021 · 6 comments

Comments

@chfritz

chfritz commented Jun 29, 2021

Is there any support planned for Noetic? Compiling from source under Ubuntu 20.04 is easy enough and works out of the box, but when running it, the node segfaults right away after sending the first image.

@WaldoPepper

Any news on this? Noetic support would be highly appreciated, because we ROS1 users will stick with it for some time... ;-)

@clydemcqueen

GStreamer 1.16 (default on Ubuntu 20.04) will cause a segfault. Fix: #61

@chfritz
Author

chfritz commented Sep 10, 2023

You can use the appsink element, which will give you events in your C++ code with raw byte data. You'll probably want to decode from H.264 first (assuming that's what your IP camera provides -- most do). You can use avdec_h264 for that, or, if you have hardware acceleration, e.g. on a Jetson, use the vendor-specific GStreamer plugins instead. Once decoded, you will get full frames as byte arrays in C++.
Let me know if you need any more help.

@chfritz
Author

chfritz commented Sep 11, 2023

If the images you receive from gstreamer are already in the right format (e.g., video/x-raw,format=RGB which corresponds to the ROS format rgb8), then you can publish them directly as the data field in sensor_msgs/Image messages in ROS.

@chfritz
Author

chfritz commented Sep 11, 2023

Can you post your pipeline here? As I said earlier, you need to convert H.264 to raw using avdec_h264, but I thought you said you were already doing that?

@chfritz
Author

chfritz commented Sep 11, 2023

That looks right to me. You'll also want to add a capsfilter after the videoconvert to tell that node what format to convert to, e.g.:
gst-launch-1.0 -v rtspsrc location=(IP address) ! rtph264depay ! capsfilter caps="video/x-h264" ! h264parse ! avdec_h264 ! videoconvert ! video/x-raw,format=RGB ! appsink

The byte arrays you'll get in this appsink will then just be plain-old RGB frames that you can publish.

> Could you please also provide a relevant resource showing how the video stream could be published and subscribed using sensor_msgs/Image in ROS Noetic? I am a little confused since I am new to ROS and GStreamer.

That has nothing to do with GStreamer anymore; it's just publishing messages on ROS topics, so the very basic ROS tutorial should be all you need: http://wiki.ros.org/ROS/Tutorials/WritingPublisherSubscriber%28c%2B%2B%29#roscpp_tutorials.2FTutorials.2FWritingPublisherSubscriber.Writing_the_Publisher_Node
