Hi,
I have an image topic `camera/image_color` that streams images with the encoding `sensor_msgs::image_encodings::YUV422`. When I view this topic in rqt, the image displays normally with correct colors.
I have the image rectification nodelet loaded with the following arguments:
`--load image_proc/rectify image_proc_rectify_color image_mono:=image_color image_rect:=image_rect_color`.
It basically takes in `image_color` and publishes `image_rect_color`. According to the header of the rectified image when I echo it, its encoding is the same as the input (YUV422). However, when I try to visualize this image in rqt, it does appear rectified, but the colors are wrong.
Does anyone know what I might be missing? Does the image rectify nodelet assume RGB input and ignore the encoding in the header of the input image? Is that why the colors are wrong? Any help is appreciated.
Note: the YUV422 image is the direct output from the camera sensor, and I don't have an `image_raw` or `image_mono` topic at all.
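
As a possible workaround I am considering converting the stream to `bgr8` before it reaches the rectify nodelet, with a small relay node roughly like the sketch below. This is only a sketch under my own assumptions: the node name, the output topic `camera/image_color_bgr`, and the assumption that cv_bridge can convert `yuv422` directly to `bgr8` are mine, not something I have confirmed.

```python
#!/usr/bin/env python
# Sketch: republish a yuv422 image stream as bgr8 so downstream nodes
# (rectify, rqt) receive an encoding they handle per pixel.
# Topic and node names below are placeholders, not from my actual setup.
import rospy
from sensor_msgs.msg import Image
from cv_bridge import CvBridge

bridge = CvBridge()
pub = None


def callback(msg):
    # Assumption: cv_bridge converts the yuv422 (UYVY) message to a BGR image here.
    cv_img = bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
    out = bridge.cv2_to_imgmsg(cv_img, encoding="bgr8")
    out.header = msg.header  # keep the original timestamp and frame_id
    pub.publish(out)


if __name__ == "__main__":
    rospy.init_node("yuv422_to_bgr8")
    pub = rospy.Publisher("camera/image_color_bgr", Image, queue_size=1)
    rospy.Subscriber("camera/image_color", Image, callback, queue_size=1)
    rospy.spin()
```

With something like this I would remap the rectify nodelet's input to the converted topic (`image_mono:=image_color_bgr`), but I'd still like to know whether the rectify nodelet is supposed to handle YUV422 directly.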