implementing the code on aarch64 nvidia jetson [linux] #655
https://forums.developer.nvidia.com/t/mediapipe/121120
I can get past the glog and GPU hello world examples and run them;
Can you build it by running the following command?
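The command itself is missing in this copy of the thread. Based on the MediaPipe desktop GPU docs of that period, it was presumably along these lines (the exact flags are an assumption, not recovered from the thread):

```bash
# Presumed build command for the desktop GPU hand-tracking demo
# (copts taken from the MediaPipe GPU docs of that era, not from this thread).
bazel build -c opt --copt -DMESA_EGL_NO_X11_HEADERS --copt -DEGL_NO_X11 \
    mediapipe/examples/desktop/hand_tracking:hand_tracking_gpu
```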
To have a video-in-video-out demo, you can run the command quoted in full a few comments below.
@jiuqiant Thank you for your response!
it seems to output: … the output is attached;
it seems like an OpenCV installation issue by now;
reinstalling OpenCV with `…`
reinstalled OpenCV;
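For anyone retracing this step, a minimal sketch of rebuilding OpenCV on Jetson with the GStreamer backend enabled, which the camera pipelines later in this thread rely on (the version tag and flag set here are assumptions, not the exact commands used above):

```bash
# Sketch: build OpenCV from source with GStreamer support.
# The branch and flags are assumptions.
git clone --branch 4.3.0 https://github.com/opencv/opencv.git
cd opencv && mkdir build && cd build
cmake -D CMAKE_BUILD_TYPE=Release \
      -D CMAKE_INSTALL_PREFIX=/usr/local \
      -D WITH_GSTREAMER=ON \
      -D BUILD_TESTS=OFF -D BUILD_EXAMPLES=OFF ..
make -j"$(nproc)"
sudo make install
```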
```bash
GLOG_logtostderr=1 bazel-bin/mediapipe/examples/desktop/hand_tracking/hand_tracking_gpu \
    --calculator_graph_config_file=mediapipe/graphs/hand_tracking/hand_tracking_mobile.pbtxt \
    --input_video_path=/home/nvidia/Downloads/Big_Buck_Bunny_1080_10s_30MB.mp4 \
    --output_video_path=/home/nvidia/Downloads/output.mp4
```
```
I0430 06:23:11.340766 18936 demo_run_graph_main_gpu.cc:88] Start running the calculator graph.
```
it provides output;
It's possible to get the coordinates/landmarks out of the graph. The solution is to observe the "hand_landmarks" stream in the main function by using an output stream poller. Here is a similar issue for your reference: #200. If you are looking for some example code, please take a look at #200 (comment).
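A minimal sketch of that approach, adapted from the pattern in demo_run_graph_main.cc (the stream name `hand_landmarks` comes from the hand-tracking graph; the packet type is assumed to be `NormalizedLandmarkList`, as in the single-hand graph of that time):

```cpp
// Sketch: poll the "hand_landmarks" output stream from the main function.
#include "mediapipe/framework/calculator_framework.h"
#include "mediapipe/framework/formats/landmark.pb.h"

// ... after graph.Initialize(config) and before graph.StartRun({}):
ASSIGN_OR_RETURN(mediapipe::OutputStreamPoller poller,
                 graph.AddOutputStreamPoller("hand_landmarks"));
MP_RETURN_IF_ERROR(graph.StartRun({}));

mediapipe::Packet packet;
while (poller.Next(&packet)) {
  const auto& landmarks = packet.Get<mediapipe::NormalizedLandmarkList>();
  for (int i = 0; i < landmarks.landmark_size(); ++i) {
    const auto& lm = landmarks.landmark(i);
    LOG(INFO) << "landmark " << i << ": x=" << lm.x()
              << " y=" << lm.y() << " z=" << lm.z();
  }
}
```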
Looks great! Just want to double check that it's running on Nvidia Jetson, right?
Looks nice! It would also be great if you could document concisely what all you had to do to get things running on Nvidia Jetson. We are interested in officially supporting that platform, and others could benefit from your experience.
exactly! it was executed on nvidia jetson. |
here are the steps … then edit the file third-party/…
then I can execute …
(dropped in a later release)
```bash
make modules_prepare
mkdir v4l2loopback
```
The outputs are as follows; though I do not see any video output, only text
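A sketch of the v4l2loopback route being described here, reconstructed as a guess (module options, device numbers, and the caps strings are assumptions):

```bash
# Load a loopback device as /dev/video2 (numbers are assumptions).
sudo modprobe v4l2loopback devices=1 video_nr=2

# Forward the CSI sensor into the loopback node so OpenCV can read it
# as an ordinary V4L2 webcam.
gst-launch-1.0 nvarguscamerasrc \
  ! 'video/x-raw(memory:NVMM),width=1280,height=720,framerate=30/1' \
  ! nvvidconv ! 'video/x-raw,format=I420' \
  ! videoconvert ! 'video/x-raw,format=YUY2' \
  ! v4l2sink device=/dev/video2
```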
will --input_video_path=/dev/video2 be a correct link to the webcam?
If you don't specify --input_video_path, the demo reads from the default webcam (OpenCV device 0).
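The input-selection logic in demo_run_graph_main_gpu.cc of that period was roughly the following (quoted from memory, so treat it as a sketch rather than the exact source):

```cpp
// Sketch of the input selection in demo_run_graph_main_gpu.cc.
const bool load_video = !FLAGS_input_video_path.empty();
cv::VideoCapture capture;
if (load_video) {
  capture.open(FLAGS_input_video_path);  // read frames from the given file
} else {
  capture.open(0);                       // default: webcam at /dev/video0
}
RET_CHECK(capture.isOpened());
```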
@jiuqiant By default OpenCV reads from /dev/video0; that doesn't work for webcam mode here because on Jetson it is the CSI Bayer sensor device. A USB webcam on Jetson would be on /dev/video1, /dev/video2, etc.
Otherwise /dev/video0 will be unreadable, and only /dev/video2 will work as the webcam.
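To confirm which node is which, the stock v4l-utils tooling can be used (standard commands, not from the thread):

```bash
# List all V4L2 devices and the nodes they expose.
v4l2-ctl --list-devices
# Inspect the formats a given node offers (CSI Bayer vs. USB webcam).
v4l2-ctl -d /dev/video0 --list-formats-ext
```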
in the file https://gist.github.com/eknight7/d4a57504c8f866fc80c0eb2c61ff6b4f#file-multi_hand_tracking_run_graph_cpu_main-cc-L52 — but which file do I edit in the mediapipe folder? Is there any way to read from the CSI Bayer sensor directly, e.g. by incorporating the gst nvarguscamerasrc from the excerpt above?
Could you also advise whether it is possible to use mediapipe to localize gaze position (x, y) from an image? Can that be implemented on Jetson?
If you build …
I can get 1 hand with v4l2loopback forwarding from sensor 0 -> 2
What are the options to sink/write the output stream to a file, or to forward it on to a virtual device? What can be done with arguments passed through the command line for the output stream?
I mentioned an approach to get landmarks data in #655 (comment). Are you able to do this in the main function already? If yes, it's pretty easy to sink this to a file by using …
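The sentence above is cut off; presumably it ends with something like `std::ofstream`. A sketch of sinking the polled landmarks to a CSV file (file name and layout are assumptions):

```cpp
// Sketch: dump polled hand landmarks to a CSV file.
#include <fstream>

std::ofstream out("/home/nvidia/landmarks.csv");
mediapipe::Packet packet;
while (poller.Next(&packet)) {
  const auto& lms = packet.Get<mediapipe::NormalizedLandmarkList>();
  for (int i = 0; i < lms.landmark_size(); ++i) {
    out << packet.Timestamp().Value() << ',' << i << ','
        << lms.landmark(i).x() << ',' << lms.landmark(i).y() << ','
        << lms.landmark(i).z() << '\n';
  }
}
```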
I missed it; I was focusing on redirection of the video stream;
Do I have to build the code from https://github.com/mgyong/mediapipe-issue200?
@mgyong Thank you for your message. I will try to make the existing draft clearer. There is still a lot of variability which might get muddled in a single Readme.md file; for example, the OpenCV 4.3 installation is a separate story, as it will be up to …
@mgyong you may find the steps at https://github.com/AndreV84/mediapipe/blob/master/README.md
I can probably pack it into a Dockerfile for Jetson L4T.
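A rough idea of what such a Dockerfile could look like (the base image tag, package list, and bazel provisioning are assumptions; this is not the file from the thread):

```dockerfile
# Sketch: MediaPipe build environment on top of NVIDIA's L4T base image.
FROM nvcr.io/nvidia/l4t-base:r32.4.3

RUN apt-get update && apt-get install -y --no-install-recommends \
        build-essential git curl unzip \
        python3-dev python3-pip \
        libopencv-dev libegl1-mesa-dev libgles2-mesa-dev && \
    rm -rf /var/lib/apt/lists/*

# Bazel has to be provided for aarch64 (version is an assumption).
COPY bazel /usr/local/bin/bazel

WORKDIR /mediapipe
COPY . .
RUN bazel build -c opt --copt -DMESA_EGL_NO_X11_HEADERS --copt -DEGL_NO_X11 \
    mediapipe/examples/desktop/hand_tracking:hand_tracking_gpu
```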
@AndreV84 Thanks. This is great. We will look into incorporating a link to this in our documentation. We now have a community-driven repo, awesome MediaPipe (mediapipe.org), where we link related repos. I have invited you to make a change to include your repo doc in the awesome MediaPipe repo. If you have time, it would be great to include a cross-compilation section (where folks can use a Dockerfile to build the MediaPipe example into a binary in a container on desktop and deploy the binary to the Nvidia device).
I am trying to revise and rebuild mediapipe on a newer Jetson OS release with a newer OpenCV version; once implemented I will update the guide. If there are issues I will ask for help. Thanks
any idea?
we were able to make progress with it;
I tried to implement the above code, up until the color conversion
@radisc
Yes, thank you, I already saw that implementation, but I was trying to do something similar without using videoconvert, because as far as I know it is pretty CPU-intensive, and maybe use the nvvideoconvert plugin from the DeepStream package instead.
@radisc is your system a Jetson?
there is no videoconvert …
Oh, sorry, I missed the patch. However, I tried this also, and I managed to convert it and visualize it before feeding it to mediapipe. Instead of … But mediapipe doesn't seem to like it.
the patched file uses …
As long as you execute the patched file on a Jetson with a CSI camera, it should work as is.
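For reference, a capture string in the shape the patched file reportedly uses; treat it as a sketch and check demo_run_graph_main_gpu_mod.cc (linked below) for the authoritative version, since the resolution and framerate here are assumptions:

```cpp
// Sketch: open the CSI camera through OpenCV's GStreamer backend.
cv::VideoCapture capture(
    "nvarguscamerasrc ! video/x-raw(memory:NVMM), width=1280, height=720, "
    "format=NV12, framerate=30/1 ! nvvidconv ! video/x-raw, format=BGRx ! "
    "videoconvert ! video/x-raw, format=BGR ! appsink",
    cv::CAP_GSTREAMER);
```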
Ok, I followed the link you provided: https://github.com/AndreV84/mediapipe/blob/1aa89d348a91dad0d63e0b34b7ec50f668b4d232/demo_run_graph_main_gpu_mod.cc#L70
the file has …
Actually, the file has the videoconvert that wasn't present in the variant of the code you quoted in the earlier post, which didn't have videoconvert.
Once you get the default scenario to work, you could try adjusting certain parameters and then rebuilding.
What is your hardware? Jetson? Which model? Not Jetson? ARM? x86?
I'm using a Jetson Xavier NX. The link you provided has videoconvert in it; is there another file that I'm missing with the patched gstreamer part?
I can see a variant cited here #655 (comment)
Yes, I tested on NX / AGX devices. The patched file should work as is. However, you may try to improve it also.
The best improvement would probably be adding the jetson-utils library (#include "jetson-utils/videoSource.h"), but that is a separate story.
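A sketch of what that could look like with jetson-utils (API names are from the jetson-utils project; wiring the frame into MediaPipe is left out):

```cpp
// Sketch: capture CSI frames with jetson-utils instead of a GStreamer
// pipeline string in OpenCV.
#include "jetson-utils/videoSource.h"

int main() {
  // "csi://0" selects the first CSI sensor.
  videoSource* input = videoSource::Create("csi://0");
  if (!input) return 1;

  uchar3* frame = nullptr;  // RGB8 frame in shared CPU/GPU memory
  while (input->Capture(&frame, 1000)) {  // 1000 ms timeout
    // hand the frame to MediaPipe here (conversion omitted)
  }
  delete input;
  return 0;
}
```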
Yes, that is what I tried to do, and it works, because I can visualize it correctly with …
Then it seems mediapipe can't process it correctly because … and I get a blank screen from its output.
Here is another thread with a bit more detail: …
Does it work as is before you start modifying the code?
there is a library, jetson-utils
Hi @AndreV84, this is what I've done: … Then, to remove videoconvert, I tried this: …
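Both snippets were lost above. A typical videoconvert-free attempt looks like the following, where the BGRx-to-BGR step moves into OpenCV on the CPU (purely illustrative, not radisc's actual code; whether OpenCV's GStreamer backend accepts BGRx at the appsink depends on the OpenCV version, which may be exactly why mediapipe sees blank frames):

```cpp
// Illustrative: drop GStreamer's videoconvert and let nvvidconv emit BGRx.
cv::VideoCapture capture(
    "nvarguscamerasrc ! video/x-raw(memory:NVMM), format=NV12 ! "
    "nvvidconv ! video/x-raw, format=BGRx ! appsink",
    cv::CAP_GSTREAMER);

cv::Mat frame_bgrx, frame_bgr;
capture >> frame_bgrx;  // 4-channel BGRx if the backend accepts these caps
cv::cvtColor(frame_bgrx, frame_bgr, cv::COLOR_BGRA2BGR);  // strip the X channel
```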
Did you manage to include jetson_utils in bazel?
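One way to expose jetson-utils to bazel, sketched as a guess (repository name, paths, and the /usr/local install prefix are all assumptions):

```python
# WORKSPACE (sketch): point bazel at the installed jetson-utils prefix.
new_local_repository(
    name = "jetson_utils",
    path = "/usr/local",
    build_file = "//third_party:jetson_utils.BUILD",
)
```

```python
# third_party/jetson_utils.BUILD (sketch): wrap the prebuilt shared library.
cc_library(
    name = "jetson_utils",
    srcs = ["lib/libjetson-utils.so"],
    hdrs = glob(["include/jetson-utils/**/*.h"]),
    includes = ["include"],
    visibility = ["//visibility:public"],
)
```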
@radisc …
Hi Guys,
You mentioned Android & iOS devices;
I am just wondering if the code will run on arm64 nvidia jetson [Tegra Linux];