Example of IR stereo image output and IMU #12264

Closed
Userpc1010 opened this issue Oct 6, 2023 · 3 comments

Userpc1010 commented Oct 6, 2023

Required Info
Camera Model: D455
Firmware Version: 5.15.0.2
Operating System & Version: Linux (Ubuntu 20.04)
Kernel Version (Linux Only): 5.15.0-83-generic
Platform: PC
SDK Version: v2.54.1
Language: C/C++ (OpenCV)
Segment: others

Issue Description

I'm trying to output the IR stereo images and IMU data with OpenCV imshow, but I only get a dark screen:

#include <signal.h>
#include <stdlib.h>
#include <iostream>
#include <algorithm>
#include <fstream>
#include <chrono>
#include <ctime>
#include <sstream>

#include <condition_variable>

#include <opencv2/opencv.hpp>

#include <librealsense2/rs.hpp>
#include "librealsense2/rsutil.h"

using namespace std;

rs2_vector interpolateMeasure(const double target_time, const rs2_vector current_data, const double current_time, const rs2_vector prev_data, const double prev_time);

int main() {

    double offset = 0; // ms

    // Declare RealSense pipeline, encapsulating the actual device and sensors
    rs2::pipeline pipe;
    // Create a configuration for configuring the pipeline with a non default profile
    rs2::config cfg;
    cfg.enable_stream(RS2_STREAM_INFRARED, 1, 848, 480, RS2_FORMAT_Y8, 30);
    cfg.enable_stream(RS2_STREAM_INFRARED, 2, 848, 480, RS2_FORMAT_Y8, 30);
    //cfg.enable_stream(RS2_STREAM_ACCEL, RS2_FORMAT_MOTION_XYZ32F, 200);
    //cfg.enable_stream(RS2_STREAM_GYRO, RS2_FORMAT_MOTION_XYZ32F, 200);

    // IMU callback
    std::mutex imu_mutex;
    std::condition_variable cond_image_rec;
    vector<double> v_gyro_timestamp;
    vector<rs2_vector> v_gyro_data;

    double prev_accel_timestamp = 0;
    rs2_vector prev_accel_data;
    double current_accel_timestamp = 0;
    rs2_vector current_accel_data;
    vector<double> v_accel_timestamp_sync;
    vector<rs2_vector> v_accel_data_sync;

    cv::Mat imCV, imRightCV;
    int width_img, height_img;
    double timestamp_image = -1.0;
    bool image_ready = false;
    int count_im_buffer = 0; // count dropped frames

    auto imu_callback = [&](const rs2::frame& frame)
    {
        std::unique_lock<std::mutex> lock(imu_mutex);

        if(rs2::frameset fs = frame.as<rs2::frameset>())
        {
//            count_im_buffer++;

//            double new_timestamp_image = fs.get_timestamp()*1e-3;
//            if(abs(timestamp_image-new_timestamp_image)<0.001){
//                std::cout << "Two frames with the same timeStamp!!!\n";
//                count_im_buffer--;
//                return;
//            }

            rs2::video_frame ir_frameL = fs.get_infrared_frame(1);
            assert(ir_frameL);
            rs2::video_frame ir_frameR = fs.get_infrared_frame(2);
            assert(ir_frameR);

            imCV = cv::Mat(cv::Size(848, 480), CV_8UC1, (void*)(ir_frameL.get_data()), cv::Mat::AUTO_STEP);
            imRightCV = cv::Mat(cv::Size(848, 480), CV_8UC1, (void*)(ir_frameR.get_data()), cv::Mat::AUTO_STEP);

            //cv::imshow("img_l", imCV);
            //cv::imshow("img_r", imRightCV);

            timestamp_image = fs.get_timestamp()*1e-3;
            image_ready = true;

//            while(v_gyro_timestamp.size() > v_accel_timestamp_sync.size())
//            {

//                int index = v_accel_timestamp_sync.size();
//                double target_time = v_gyro_timestamp[index];

//                v_accel_data_sync.push_back(current_accel_data);
//                v_accel_timestamp_sync.push_back(target_time);
//            }

            lock.unlock();
            cond_image_rec.notify_all();
        }
        else if (rs2::motion_frame m_frame = frame.as<rs2::motion_frame>())
        {
            if (m_frame.get_profile().stream_name() == "Gyro")
            {
                // It runs at 200Hz
                v_gyro_data.push_back(m_frame.get_motion_data());
                v_gyro_timestamp.push_back((m_frame.get_timestamp()+offset)*1e-3);
                //rs2_vector gyro_sample = m_frame.get_motion_data();
                //std::cout << "Gyro:" << gyro_sample.x << ", " << gyro_sample.y << ", " << gyro_sample.z << std::endl;
            }
            else if (m_frame.get_profile().stream_name() == "Accel")
            {
                // It runs at 60Hz
                prev_accel_timestamp = current_accel_timestamp;
                prev_accel_data = current_accel_data;

                current_accel_data = m_frame.get_motion_data();
                current_accel_timestamp = (m_frame.get_timestamp()+offset)*1e-3;

                while(v_gyro_timestamp.size() > v_accel_timestamp_sync.size())
                {
                    int index = v_accel_timestamp_sync.size();
                    double target_time = v_gyro_timestamp[index];

                    rs2_vector interp_data = interpolateMeasure(target_time, current_accel_data, current_accel_timestamp,
                                                                prev_accel_data, prev_accel_timestamp);

                    v_accel_data_sync.push_back(interp_data);
                    v_accel_timestamp_sync.push_back(target_time);

                }
                // std::cout << "Accel:" << current_accel_data.x << ", " << current_accel_data.y << ", " << current_accel_data.z << std::endl;
            }
        }
    };

    rs2::pipeline_profile pipe_profile = pipe.start(cfg, imu_callback);

    rs2::stream_profile cam_left = pipe_profile.get_stream(RS2_STREAM_INFRARED, 1);
    rs2::stream_profile cam_right = pipe_profile.get_stream(RS2_STREAM_INFRARED, 2);

//    rs2::stream_profile imu_stream = pipe_profile.get_stream(RS2_STREAM_GYRO);
//    float* Rbc = cam_left.get_extrinsics_to(imu_stream).rotation;
//    float* tbc = cam_left.get_extrinsics_to(imu_stream).translation;
//    std::cout << "Tbc (left) = " << std::endl;
//    for(int i = 0; i<3; i++){
//        for(int j = 0; j<3; j++)
//            std::cout << Rbc[i*3 + j] << ", ";
//        std::cout << tbc[i] << "\n";
//    }

    float* Rlr = cam_right.get_extrinsics_to(cam_left).rotation;
    float* tlr = cam_right.get_extrinsics_to(cam_left).translation;
    std::cout << "Tlr  = " << std::endl;
    for(int i = 0; i<3; i++){
        for(int j = 0; j<3; j++)
            std::cout << Rlr[i*3 + j] << ", ";
        std::cout << tlr[i] << "\n";
    }

    rs2_intrinsics intrinsics_left = cam_left.as<rs2::video_stream_profile>().get_intrinsics();
    width_img = intrinsics_left.width;
    height_img = intrinsics_left.height;
    std::cout << "Left camera: \n";
    std::cout << " fx = " << intrinsics_left.fx << std::endl;
    std::cout << " fy = " << intrinsics_left.fy << std::endl;
    std::cout << " cx = " << intrinsics_left.ppx << std::endl;
    std::cout << " cy = " << intrinsics_left.ppy << std::endl;
    std::cout << " height = " << intrinsics_left.height << std::endl;
    std::cout << " width = " << intrinsics_left.width << std::endl;
    std::cout << " Coeff = " << intrinsics_left.coeffs[0] << ", " << intrinsics_left.coeffs[1] << ", " <<
        intrinsics_left.coeffs[2] << ", " << intrinsics_left.coeffs[3] << ", " << intrinsics_left.coeffs[4] << ", " << std::endl;
    std::cout << " Model = " << intrinsics_left.model << std::endl;

    rs2_intrinsics intrinsics_right = cam_right.as<rs2::video_stream_profile>().get_intrinsics();
    width_img = intrinsics_right.width;
    height_img = intrinsics_right.height;
    std::cout << "Right camera: \n";
    std::cout << " fx = " << intrinsics_right.fx << std::endl;
    std::cout << " fy = " << intrinsics_right.fy << std::endl;
    std::cout << " cx = " << intrinsics_right.ppx << std::endl;
    std::cout << " cy = " << intrinsics_right.ppy << std::endl;
    std::cout << " height = " << intrinsics_right.height << std::endl;
    std::cout << " width = " << intrinsics_right.width << std::endl;
    std::cout << " Coeff = " << intrinsics_right.coeffs[0] << ", " << intrinsics_right.coeffs[1] << ", " <<
        intrinsics_right.coeffs[2] << ", " << intrinsics_right.coeffs[3] << ", " << intrinsics_right.coeffs[4] << ", " << std::endl;
    std::cout << " Model = " << intrinsics_right.model << std::endl;

//    float imageScale = 1.0f;

    double timestamp;
    cv::Mat im, imRight;

    // Clear IMU vectors
    v_gyro_data.clear();
    v_gyro_timestamp.clear();
    v_accel_data_sync.clear();
    v_accel_timestamp_sync.clear();

    while (true)
    {
        std::vector<rs2_vector> vGyro;
        std::vector<double> vGyro_times;
        std::vector<rs2_vector> vAccel;
        std::vector<double> vAccel_times;

        {
            std::unique_lock<std::mutex> lk(imu_mutex);
            if(!image_ready)
                cond_image_rec.wait(lk);

//            if(count_im_buffer>1)
//            std::cout << count_im_buffer -1 << " dropped frs\n";
//            count_im_buffer = 0;

            while(v_gyro_timestamp.size() > v_accel_timestamp_sync.size())
            {
                int index = v_accel_timestamp_sync.size();
                double target_time = v_gyro_timestamp[index];

                rs2_vector interp_data = interpolateMeasure(target_time, current_accel_data, current_accel_timestamp, prev_accel_data, prev_accel_timestamp);

                v_accel_data_sync.push_back(interp_data);
                // v_accel_data_sync.push_back(current_accel_data); // 0 interpolation
                v_accel_timestamp_sync.push_back(target_time);
            }

            // Copy the IMU data
            vGyro = v_gyro_data;
            vGyro_times = v_gyro_timestamp;
            vAccel = v_accel_data_sync;
            vAccel_times = v_accel_timestamp_sync;
            timestamp = timestamp_image;
            im = imCV.clone();
            imRight = imRightCV.clone();

            // Clear IMU vectors
            v_gyro_data.clear();
            v_gyro_timestamp.clear();
            v_accel_data_sync.clear();
            v_accel_timestamp_sync.clear();

            image_ready = false;
        }


//        if(imageScale != 1.f)
//        {
//            int width = im.cols * imageScale;
//            int height = im.rows * imageScale;
//            cv::resize(im, im, cv::Size(width, height));
//            cv::resize(imRight, imRight, cv::Size(width, height));
//        }

        cv::imshow("img_l", im);
        cv::imshow("img_r", imRight);

        std::cout << "loop!\n";

    }
    std::cout << "System shutdown!\n";
}

rs2_vector interpolateMeasure(const double target_time,
                              const rs2_vector current_data, const double current_time,
                              const rs2_vector prev_data, const double prev_time)
{

    // If there is no previous information, propagate the current data
    if(prev_time == 0)
    {
        return current_data;
    }

    rs2_vector increment;
    rs2_vector value_interp;

    if(target_time > current_time) {
        value_interp = current_data;
    }
    else if(target_time > prev_time)
    {
        increment.x = current_data.x - prev_data.x;
        increment.y = current_data.y - prev_data.y;
        increment.z = current_data.z - prev_data.z;

        double factor = (target_time - prev_time) / (current_time - prev_time);

        value_interp.x = prev_data.x + increment.x * factor;
        value_interp.y = prev_data.y + increment.y * factor;
        value_interp.z = prev_data.z + increment.z * factor;

        // zero interpolation
        value_interp = current_data;
    }
    else {
        value_interp = prev_data;
    }

    return value_interp;
}

Screenshot from 2023-10-06 22-57-43

Without the IMU I get the image without any problems:

#include <signal.h>
#include <stdlib.h>
#include <iostream>
#include <algorithm>
#include <fstream>
#include <chrono>
#include <ctime>
#include <sstream>

#include <condition_variable>

#include <opencv2/opencv.hpp>


#include <librealsense2/rs.hpp>
#include "librealsense2/rsutil.h"

int main(int argc, char * argv[]) try
{
    // Declare depth colorizer for pretty visualization of depth data
//    rs2::colorizer color_map;

    // Declare RealSense pipeline, encapsulating the actual device and sensors
    rs2::pipeline pipe;
    // Start streaming with default recommended configuration

    rs2::config cfg;
    cfg.enable_stream(RS2_STREAM_INFRARED, 1, 848, 480, RS2_FORMAT_Y8, 30);
    cfg.enable_stream(RS2_STREAM_INFRARED, 2, 848, 480, RS2_FORMAT_Y8, 30);

    pipe.start(cfg);

    using namespace cv;
//    const auto window_name = "Display Image";
//    namedWindow(window_name, WINDOW_AUTOSIZE);

    while (waitKey(1) < 0 )
    {
        rs2::frameset data = pipe.wait_for_frames(); // Wait for next set of frames from the camera

        rs2::video_frame ir_frameL = data.get_infrared_frame(1);
        rs2::video_frame ir_frameR = data.get_infrared_frame(2);

        // Query frame size (width and height)
        const int width_img = ir_frameL.as<rs2::video_frame>().get_width();
        const int height_img = ir_frameL.as<rs2::video_frame>().get_height();

        Mat imCV = cv::Mat(cv::Size(width_img, height_img), CV_8U, (void*)(ir_frameL.get_data()), cv::Mat::AUTO_STEP);
        Mat imRightCV = cv::Mat(cv::Size(width_img, height_img), CV_8U, (void*)(ir_frameR.get_data()), cv::Mat::AUTO_STEP);

        // Update the window with new data
        cv::imshow("img_l", imCV);
        cv::imshow("img_r", imRightCV);

    }

    return EXIT_SUCCESS;
}
catch (const rs2::error & e)
{
    std::cerr << "RealSense error calling " << e.get_failed_function() << "(" << e.get_failed_args() << "):\n    " << e.what() << std::endl;
    return EXIT_FAILURE;
}
catch (const std::exception& e)
{
    std::cerr << e.what() << std::endl;
    return EXIT_FAILURE;
}

Screenshot from 2023-10-06 23-02-31

I can't figure out what I'm doing wrong. I couldn't find an example for stereo + IMU, so I'm asking here.

Screenshot from 2023-10-07 12-18-57

Sometimes on startup, this small image from the camera appears.

MartyG-RealSense (Collaborator) commented:

Hi @Userpc1010 The RealSense SDK has a C++ example program called rs-data-collect that streams depth, infrared and color, and additionally streams IMU data automatically on camera models that are equipped with an IMU.

https://github.com/IntelRealSense/librealsense/tree/master/tools/data-collect

Another approach for streaming IMU data at the same time as other stream types such as infrared is to use a callback, as described in the C++ discussion at #6426.
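For reference, a minimal, self-contained sketch of that callback pattern (stream settings mirror the listings in this issue; the code is illustrative, not taken verbatim from #6426):

// One callback receives both framesets (video) and individual motion frames (gyro/accel).
#include <librealsense2/rs.hpp>
#include <chrono>
#include <iostream>
#include <mutex>
#include <thread>

int main()
{
    rs2::config cfg;
    cfg.enable_stream(RS2_STREAM_INFRARED, 1, 848, 480, RS2_FORMAT_Y8, 30);
    cfg.enable_stream(RS2_STREAM_INFRARED, 2, 848, 480, RS2_FORMAT_Y8, 30);
    cfg.enable_stream(RS2_STREAM_GYRO, RS2_FORMAT_MOTION_XYZ32F, 200);
    cfg.enable_stream(RS2_STREAM_ACCEL, RS2_FORMAT_MOTION_XYZ32F, 200);

    std::mutex mtx;
    auto callback = [&](const rs2::frame& frame)
    {
        std::lock_guard<std::mutex> lock(mtx);
        if (rs2::frameset fs = frame.as<rs2::frameset>())
        {
            // The two infrared images arrive together as one frameset
            rs2::video_frame left  = fs.get_infrared_frame(1);
            rs2::video_frame right = fs.get_infrared_frame(2);
            std::cout << "IR pair at " << fs.get_timestamp() << " ms ("
                      << left.get_width() << "x" << left.get_height() << ", "
                      << right.get_width() << "x" << right.get_height() << ")\n";
        }
        else if (rs2::motion_frame mf = frame.as<rs2::motion_frame>())
        {
            // Gyro and accel samples arrive as individual motion frames
            rs2_vector v = mf.get_motion_data();
            std::cout << mf.get_profile().stream_name() << ": "
                      << v.x << ", " << v.y << ", " << v.z << "\n";
        }
    };

    rs2::pipeline pipe;
    pipe.start(cfg, callback);
    std::this_thread::sleep_for(std::chrono::seconds(5)); // stream for a few seconds
    pipe.stop();
    return 0;
}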

Userpc1010 (Author) commented Oct 7, 2023

I found out what the problem was: it was missing cv::waitKey(1); a very important addition, I believe.
Thank you for your time; this can be closed.
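For reference, a minimal, self-contained illustration of why the call matters, using a synthetic grayscale image in place of a camera frame:

// cv::imshow only queues a repaint; cv::waitKey is what pumps the HighGUI
// event loop, so without it the window is created but never updated and
// appears dark.
#include <opencv2/opencv.hpp>

int main()
{
    cv::Mat img(480, 848, CV_8UC1, cv::Scalar(128)); // stand-in for an IR frame
    while (true)
    {
        cv::imshow("img_l", img);
        if (cv::waitKey(1) == 27) // without this call nothing is drawn; ESC exits
            break;
    }
    return 0;
}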

MartyG-RealSense (Collaborator) commented:

You are very welcome. I'm pleased to hear that you achieved a solution. As you suggested, I will close this case. Thanks again!
