From c18dc16ae79910860ecf4409fc2fd8a20b5aaca5 Mon Sep 17 00:00:00 2001
From: ateRstones <8556589+ateRstones@users.noreply.github.com>
Date: Wed, 1 Nov 2023 18:47:58 +0100
Subject: [PATCH 1/6] Starting HOWTO file

---
 HOWTO.md | 79 ++++++++++++++++++++++++++++++++++++++++++++++++++++
 1 file changed, 79 insertions(+)
 create mode 100644 HOWTO.md

diff --git a/HOWTO.md b/HOWTO.md
new file mode 100644
index 00000000..3afbc22b
--- /dev/null
+++ b/HOWTO.md
@@ -0,0 +1,79 @@
+# Howto
+
+## Demo
+
+### 1. Build the project
+After cloning the repository from GitHub ([https://github.com/UHHRobotics22-23/MarimbaBot](https://github.com/UHHRobotics22-23/MarimbaBot)), the ROS project has to be built according to the instructions in the [main readme](./README.md).
+
+### 2. Set up the configuration
+In addition to the UR5, the following devices have to be connected and configured before launching the project:
+1. Logitech StreamCam (packages marimbabot_vision and marimbabot_speech)
+2. Scarlett 2i2 USB Audio Interface (package marimbabot_audio)
+3. Wifi connection to the MalletHolder (package marimbabot_hardware)
+
+##### Logitech StreamCam (required for packages marimbabot_vision and marimbabot_speech):
+Change the `device` parameter of the `audio_capture` node in the [launch file](marimbabot_speech/launch/command_recognition.launch) of the package marimbabot_speech:
+
+```bash
+marimbabot_speech/launch/command_recognition.launch
+```
+
+and modify the `device_id` parameter in the [configuration file](marimbabot_vision/config/cv_camera.yaml) of the package marimbabot_vision:
+
+```bash
+marimbabot_vision/config/cv_camera.yaml
+```
+
+##### Scarlett 2i2 USB Audio Interface (required for package marimbabot_audio):
+
+Adjust the `device` parameter for the `note_audio_capture` node in the [launch file](marimbabot_audio/launch/audio_feedback.launch) of the package marimbabot_audio:
+
+```bash
+marimbabot_audio/launch/audio_feedback.launch
+```
+
+### 3.
Run a demo environment
In order to run the whole project on the real robot, one has to run two launch files. First, run the launch file that sets up the robot and its hardware:

```bash
roslaunch marimbabot_bringup marimbabot_ur5_bringup.launch
```

Second, run the launch file that brings up the launch files of the individual packages:

```bash
roslaunch marimbabot_bringup marimbabot.launch
```

### 4. (Optional) rqt visualizations
*Honestly, I don't know how to do this - Tom*

## Hardware

### 1. Power Up the Hardware
The hardware is powered by a 7.4 V LiPo battery.
The connector is on the back.
Plugging the battery in is sufficient.
To check if the system is powered, test whether the servo is holding its position.

### 2. Connect to the Wifi
After powering on, the holder emits a wifi network.
By default, it is called '**Marimbabot_Mallet**' with the password '**12345678**'.
Connect the computer running the ROS application to the wifi.

### 3. Place the Mallet Holder in the Gripper
The mallet holder is held from the top.
The shape of the top part fits the gripper base plate and should allow for only one repeatable holding position.
To grab the holder, use the predefined gripper positions in RViz.
First, run the main bringup to initialize the necessary drivers.

    roslaunch marimbabot_bringup marimbabot_ur5_bringup.launch

RViz can then be started:

    rviz

First, use the gripper position *basic_open*.
Then hold the holder in the gripper and change it to *basic_closed*.

Note: When restarting the robot/ROS, the gripper sometimes opens and closes; be ready to catch it if necessary.
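The order of the two launch files above matters: the hardware bringup has to be running before the per-package launch file. As a small sketch, that fixed order can be encoded and reused (the commands are exactly the ones quoted above; the helper itself is hypothetical and not part of the repository):

```python
# The two-step demo bringup from this section, encoded as data.
# Step 1 (robot and hardware drivers) must always precede step 2
# (the launch files of the individual packages).
BRINGUP_SEQUENCE = (
    "roslaunch marimbabot_bringup marimbabot_ur5_bringup.launch",
    "roslaunch marimbabot_bringup marimbabot.launch",
)

def bringup_commands():
    """Return the launch commands in the order they have to be executed."""
    return list(BRINGUP_SEQUENCE)

for step, cmd in enumerate(bringup_commands(), start=1):
    print(f"{step}. {cmd}")
```

In practice each command runs in its own terminal; the helper only documents the ordering constraint.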
\ No newline at end of file
From 6a2968881a2bc8429861f49a2df7af56568af504 Mon Sep 17 00:00:00 2001
From: ateRstones <8556589+ateRstones@users.noreply.github.com>
Date: Mon, 6 Nov 2023 16:02:06 +0100
Subject: [PATCH 2/6] Added part for the marimba positioning

---
 HOWTO.md | 21 ++++++++++++++++++++-
 1 file changed, 20 insertions(+), 1 deletion(-)

diff --git a/HOWTO.md b/HOWTO.md
index 3afbc22b..d04aff66 100644
--- a/HOWTO.md
+++ b/HOWTO.md
@@ -76,4 +76,23 @@ RViz can then be started
 First use the gripper position *basic_open*.
 Then hold the holder in to gripper and change it to *basic_closed*.
 
-Note: When restarting the robot / ros the gripper sometimes opens and closes, be ready to catch it if necessary.
\ No newline at end of file
+Note: When restarting the robot / ros the gripper sometimes opens and closes, be ready to catch it if necessary.
+
+### 4. Calibration of the Marimba Position
+The marimba in our environment is expected to be in a specific orientation for each run.
+The debug sequences are used to hit the outermost keys at their centers.
+The marimba can be calibrated in 2-3 iterations of adjusting it so that the centers of these keys are hit.
+
+For the system to be ready, run (skip this if it was already started in a previous step)
+
+    roslaunch marimbabot_bringup marimbabot_ur5_bringup.launch
+
+Furthermore, the command GUI has to be started to run the testing sequences.
+
+    rosrun marimbabot_behavior commandGUI.py
+
+A GUI window will appear with the option to input play sequences.
+The two outermost notes are **c'4** and **c'''4**. To hit them repeatedly, input the notes separated by spaces, then confirm, read, and finally play them.
+
+The full scale is available as a preset for testing all notes.
+Using the chord testing program, the 2nd mallet can also be tested and further adjustments to the position can be made.
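The note names **c'4** and **c'''4** used above look like LilyPond-style notation, where apostrophes raise the octave and the trailing number is the duration (4 = quarter note). Assuming standard LilyPond absolute-octave conventions (an assumption; the project may define its own variant), a small sketch shows what the two outermost keys denote:

```python
# Semitone offsets of the natural note letters within an octave.
NOTE_OFFSETS = {"c": 0, "d": 2, "e": 4, "f": 5, "g": 7, "a": 9, "b": 11}

def lily_to_midi(token: str) -> int:
    """Convert a LilyPond-style token like "c'4" to a MIDI pitch.

    The trailing duration digit is ignored; each ' raises and each ,
    lowers the pitch by one octave. In LilyPond absolute mode, a plain
    "c" is the C below middle C (MIDI 48).
    """
    name = token[0]
    octave_shift = token.count("'") - token.count(",")
    return 48 + NOTE_OFFSETS[name] + 12 * octave_shift

# The outermost marimba keys from the calibration step:
print(lily_to_midi("c'4"))    # 60 (middle C)
print(lily_to_midi("c'''4"))  # 84 (two octaves higher)
```

So the keyboard spanned by the calibration sequence covers two octaves, from middle C upward.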
\ No newline at end of file
From 57cbcb863b986689adc7aa0681940e500e3cdfa4 Mon Sep 17 00:00:00 2001
From: ateRstones <8556589+ateRstones@users.noreply.github.com>
Date: Mon, 6 Nov 2023 16:07:45 +0100
Subject: [PATCH 3/6] Added stub step for the other devices

---
 HOWTO.md | 12 +++++++++++-
 1 file changed, 11 insertions(+), 1 deletion(-)

diff --git a/HOWTO.md b/HOWTO.md
index d04aff66..4bb6ccf5 100644
--- a/HOWTO.md
+++ b/HOWTO.md
@@ -95,4 +95,14 @@ A gui window will appear with the option to input play sequences.
 The both outermost notes are **c'4** and **c'''4**. For hitting them repeatedly, input the notes separated by a space, confirm, read and finally play them.
 
 The full scale is available as a present for testing all notes.
-Using the chord testing program, the 2nd mallet can also be tested and further adjustments to the position be made.
\ No newline at end of file
+Using the chord testing program, the 2nd mallet can also be tested and further adjustments to the position be made.
+
+### 5. Audio Feedback Microphone
+*I don't know the specific steps here - Tom*
+
+### 6. Note Reading Camera
+The Logitech StreamCam is used for the detection of notes on the whiteboard.
+Mount the camera on a tripod and point it at the whiteboard. Connect it to the computer using USB.
+
+Magnetic notes and note lines are provided in the hardware box.
+Attach the note lines to the whiteboard before adding notes.
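The magnetic notes are placed on note lines (a staff), and which line or space a note sits on determines its pitch. As a toy illustration of that mapping — unrelated to the actual marimbabot_vision detection code, and assuming an ordinary treble-clef staff whose bottom line is e' — the staff position can be translated to a note name like this:

```python
# Pitches from the bottom staff line upward, alternating line/space
# (treble clef: lines E G B D F, spaces F A C E).
TREBLE_POSITIONS = ["e'", "f'", "g'", "a'", "b'", "c''", "d''", "e''", "f''"]

def pitch_at(position: int) -> str:
    """position 0 = bottom staff line, 1 = first space, ..., 8 = top line."""
    return TREBLE_POSITIONS[position]

print(pitch_at(0))  # e' (bottom line)
print(pitch_at(8))  # f'' (top line)
```

Notes above or below this range would need ledger lines, which is one reason the note lines have to be attached straight and in the expected spot before any notes are added.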
\ No newline at end of file
From 3470f76336cc97f962bf9cc58fa540bf49b600ff Mon Sep 17 00:00:00 2001
From: Yunlong Wang <92165787+yunlong-wang-cn@users.noreply.github.com>
Date: Thu, 9 Nov 2023 12:35:29 +0100
Subject: [PATCH 4/6] Update HOWTO.md

update the visualization and setup the microphone

---
 HOWTO.md | 13 +++++++++----
 1 file changed, 9 insertions(+), 4 deletions(-)

diff --git a/HOWTO.md b/HOWTO.md
index 4bb6ccf5..389967c7 100644
--- a/HOWTO.md
+++ b/HOWTO.md
@@ -45,8 +45,13 @@ Second, the launch file that brings up the launch file for each package:
 roslaunch marimbabot_bringup marimbabot.launch
 ```
 
-### 4. (Optional) rqt visualizations
-*Honestly, I don't know how to do this - Tom*
+### 4. (Optional) rqt/rviz visualizations
+There are several visualization windows. Each one is published as an image on a ROS topic, so you can observe it in RViz or rqt by opening an image window. Following is the list of topics:
+- **/audio_node/spectrogram_img**: shows the spectrogram used for the music note detection
+- **/audio_node/live_midi_img**: shows the live MIDI image of the music note detection
+- **/audio_node/feedback_img**: the final evaluation image comparing the ground truth and the robot's playing
+- **cv_camera_node/image_raw**: the raw image captured by the camera, showing the whiteboard
+- **detection_visualization**: shows the recognized music notes
 
 ## Hardware
 
 ### 5. Audio Feedback Microphone
-*I don't know the specific steps here - Tom*
+For setting up the audio feedback, we used a microphone and a sound card.
**Be careful: you need to turn on the signal amplification on the sound card. Sometimes, even though the red light is on, it is not actually working, so it is best to toggle it again or to verify it by checking the system volume in the Ubuntu system settings.**

### 6. Note Reading Camera
The Logitech StreamCam is used for the detection of notes on the whiteboard.
Mount the camera on a tripod and point it at the whiteboard. Connect it to the computer using USB.

Magnetic notes and note lines are provided in the hardware box.
Attach the note lines to the whiteboard before adding notes.

From 1bfd0bea144ab3d8ae8d0de1074d37b8f625bd9c Mon Sep 17 00:00:00 2001
From: Florian Vahl
Date: Thu, 9 Nov 2023 17:59:51 +0100
Subject: [PATCH 5/6] Restructure, grammarly, fixes

---
 HOWTO.md | 151 +++++++++++++++++++++++++++++--------------------------
 1 file changed, 79 insertions(+), 72 deletions(-)

diff --git a/HOWTO.md b/HOWTO.md
index 389967c7..6d2b4d22 100644
--- a/HOWTO.md
+++ b/HOWTO.md
@@ -1,113 +1,120 @@
-# Howto
+# Howto - Demo
-## Demo
+### Build the project
+The project has to be built and set up according to the instructions in the [main readme](./README.md).
-### 1. Build the project
-After cloning the repository from github ([https://github.com/UHHRobotics22-23/MarimbaBot](https://github.com/UHHRobotics22-23/MarimbaBot)) the ros project has to be built according to the instructions in the [main readme](./README.md)
-
-### 2. Setup the configuration
+### Sensor Setup
 Additionally to the UR5, the following devices have to be connected and configured before launching the project:
 1. Logitech StreamCam (packages marimbabot_vision and marimbabot_speech)
 2. Scarlett 2i2 USB Audio Interface (package marimbabot_audio)
-3.
Wifi connection to the MalletHolder (package marimbabot_hardware)

-##### Logitech StreamCam (required for packages marimbabot_vision and marimbabot_speech):
-Change the parameter device of the node audio_capture in the [launch file](marimbabot_speech/launch/command_recognition.launch) of the package marimbabot_speech:
+#### Audio Feedback Microphone
+For setting up the audio feedback, we used a microphone and a sound card. **Be careful: you need to turn on the signal amplification on the sound card. Sometimes, even though the red light is on, it is not actually working, so it is best to toggle it again or to verify it by checking the system volume in the Ubuntu system settings.**

-```bash
-marimbabot_speech/launch/command_recognition.launch
-```
+Adjust the `device` parameter for the `note_audio_capture` node in the [launch file](marimbabot_audio/launch/audio_feedback.launch) of the package marimbabot_audio.

-and modify the device_id parameter in the [configuration file](marimbabot_vision/config/cv_camera.yaml) of the package marimbabot_vision:
+#### Note Reading Camera
+The Logitech StreamCam is used for the detection of notes on the whiteboard.
+Mount the camera on a tripod and point it at the whiteboard. Connect it to the computer using USB.

-```bash
-marimbabot_vision/config/cv_camera.yaml
-```
+Magnetic notes and note lines are provided in the hardware box.
+Attach the note lines to the whiteboard before adding notes.

-##### Scarlett 2i2 USB Audio Interface (required for package marimbabot_audio):
+Change the `device` parameter of the `audio_capture` node in the [launch file](marimbabot_speech/launch/command_recognition.launch) of the package marimbabot_speech.
-Adjust the device parameter for the note_audio_capture node in the [launch file](marimbabot_audio/launch/audio_feedback.launch) of the package marimbabot_audio:
+and modify the `device_id` parameter in the [configuration file](marimbabot_vision/config/cv_camera.yaml) of the package marimbabot_vision.

-```bash
-marimbabot_audio/launch/audio_feedback.launch
-```
+### Power Up the Mallet Holder
+The mallet joint is powered by a 7.4 V LiPo battery.
+The connector is on the back.
+Plugging the battery in is sufficient.
+To check if the system is powered, test whether the servo is holding its position.

-### 3. Run a demo environment
-In order to run the whole project on the real robot, one has to run two launch files. First, the launch file that sets up the robot and its hardware:
+### Connect the Mallet Holder to the Wifi
+After powering on, the holder emits a wifi network.
+By default, it is called '**Marimbabot_Mallet**' with the password '**12345678**'.
+Connect the computer running the ROS application to the wifi.
+
+### Place the Mallet Holder in the Gripper
+The mallet holder is held from the top.
+The shape of the top part fits the gripper base plate and should allow for only one repeatable holding position.
+To grab the holder, use the predefined gripper positions in RViz.
+First, run the main bringup to initialize the necessary drivers.

```bash
roslaunch marimbabot_bringup marimbabot_ur5_bringup.launch
```

-Second, the launch file that brings up the launch file for each package:
+RViz can then be started:

```bash
-roslaunch marimbabot_bringup marimbabot.launch
+rviz
```

-### 4. (Optional) rqt/rviz visualizations
-There are several visualization windows, basically, it just publishes the image to ros topic, you can observe the visualization at the rviz or rqt by opening a image windows.
Following is the list of topics:
-- **/audio_node/spectrogram_img**: show the spectrum of music notes detection
-- **/audio_node/live_midi_img**: show the live midi image of music notes detection
-- **/audio_node/feedback_img**: the final evaluation image of ground-truth and robot playing
-- **cv_camera_node/image_raw**: the image capture by camera, which come from the whiteboard
-- **detection_visualization**: show the recognized music note
+First, select the `arm` planning group in the MoveIt Motion Planning RViz plugin and move the robot to the `marimbabot_home` goal state.
+This makes inserting the mallet holder easier.
+Afterward, select the `gripper` planning group and move the gripper to the `basic_open` goal state.
+Now you can insert the holder and move it to the `basic_closed` goal state.

-## Hardware
+**Note: When restarting the robot/ROS, the gripper sometimes opens and closes; be ready to catch it if necessary.**

-### 1. Power Up the Hardware
-The hardware is powered by a 7.4v lipo battery.
-The connector is on the back.
-Plugging the battery in is sufficient.
-To check if the system is powered, test if the servo is holding its position.
+### Calibration

-### 2. Connect to the Wifi
-After powering on, the holder emits a wifi network.
-By default it is called '**Marimbabot_Mallet**' with the password '**12345678**'.
-Connect the computer running the ros application to the wifi.
+While we could calibrate the marimba position by adapting the URDF description, the following approach is often preferred due to its speed and simplicity.
+The marimba in our environment is expected to be in a specific position and orientation.
+We can command the robot to hit where it expects the middle of the highest and lowest keys and manually adjust the instrument's position accordingly.
+The `repeat_low_end` and `repeat_high_end` commands of the following script can be used to hit these keys several times.

-### 3.
Place the Mallet Holder in the Gripper
-The mallet holder is held from the top.
-The shape of the top part fits the gripper base plate and should allow for only one, repeatable holding position.
-For grabbing the holder use the predefined gripper positions in RViz.
-First run the main bringup to initialize the necessary drivers.
+```bash
+rosrun marimbabot_planning sound_pad.py
+```

- roslaunch marimbabot_bringup marimbabot_ur5_bringup.launch
+Repeating this process a few times with both ends of the instrument leads to a reasonably good positioning.

-RViz can then be started
+In the end, one might execute a `chromatic_scale` using the same script to verify all notes are hit correctly.

- rviz
+### Run a demo

-First use the gripper position *basic_open*.
-Then hold the holder in to gripper and change it to *basic_closed*.
+In order to run the whole project on the real robot, one has to run two launch files. Start them now if you have not already done so in an earlier step. First, run the launch file that sets up the robot and its hardware:

-Note: When restarting the robot / ros the gripper sometimes opens and closes, be ready to catch it if necessary.
+```bash
+roslaunch marimbabot_bringup marimbabot_ur5_bringup.launch
+```

-### 4. Calibration of the Marimba Position
+Second, run the launch file that brings up the launch files of the individual packages:
-The marimba in our environment is expected to be in a specific orientation for each run.
-The debug sequences are used to hit the outermost keys at the center.
-Through 2-3 iterations of adjusting the marimba such that the center of these keys is hit, the calibration of the marimba is possible.
+```bash
+roslaunch marimbabot_bringup marimbabot.launch
+```

-For the system to be ready run (skip if already started in previous step)
+### (Optional) rqt/rviz visualizations

- roslaunch marimbabot_bringup marimbabot_ur5_bringup.launch
+There are several visualizations. Each one is published as an image on a ROS topic, so you can observe it in RViz or rqt by opening an image window.
Following is the list of topics: +- **/audio_node/spectrogram_img**: the spectrum of the music notes detection +- **/audio_node/live_midi_img**: the live midi image of the music notes detection +- **/audio_node/feedback_img**: the final evaluation image of ground-truth and robot playing +- **cv_camera_node/image_raw**: the image captured by the camera, showing the whiteboard +- **detection_visualization**: the recognized music notation - roslaunch marimbabot_bringup marimbabot_ur5_bringup.launch +### Usage -Further the command gui has to be started to run the testing sequences. +Now that everything is set up we are able to use the MarimbaBot. - rosrun marimbabot_behavior commandGUI.py +The robot's wakeword is `Hi Marimbabot!`. -A gui window will appear with the option to input play sequences. -The both outermost notes are **c'4** and **c'''4**. For hitting them repeatedly, input the notes separated by a space, confirm, read and finally play them. +To play some notes of the whiteboard follow these steps: -The full scale is available as a present for testing all notes. -Using the chord testing program, the 2nd mallet can also be tested and further adjustments to the position be made. +- Say `Hi Marimbabot!` +- Wait for the activation sound +- Say `Read the notes` +- Wait for confirmation via TTS +- Say `Hi Marimbabot!` +- Wait for the activation sound +- Say `Play the music/notes` -### 5. Audio Feedback Microphone -For setting up the microphone regarding the audio feedback, we used a microphone and a soundcard. **There is one thing to be careful of, you need to open the signal amplification at the soundcard, something even the red light is on, but it doesn't mean it really works, so better to reopen it, or check it by observing the system volume at the configuration of ubuntu system.** +Some other instructions that you might want to try are: -### 6. Note Reading Camera -The Logitech StreamCam is used for the detection of notes on the whiteboard. 
-Mount the camera on a tripod and point it to the whiteboard. Connect to the computer using USB. +- Play faster/slower (by n bpm) +- Play louder/softer (by n steps) +- Play in a loop +- Stop playing +- Preview the piece -Magnetic notes and note lines are provided in the hardware box. -Attach the note lines to the whiteboard before adding notes. From 14708b61de1ff32313e75017843592e21ebdf8ee Mon Sep 17 00:00:00 2001 From: Florian Vahl Date: Thu, 9 Nov 2023 18:08:49 +0100 Subject: [PATCH 6/6] Add some usage tips --- HOWTO.md | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/HOWTO.md b/HOWTO.md index 6d2b4d22..e17f951f 100644 --- a/HOWTO.md +++ b/HOWTO.md @@ -98,7 +98,8 @@ There are several visualizations, basically, it just publishes the image to the Now that everything is set up we are able to use the MarimbaBot. -The robot's wakeword is `Hi Marimbabot!`. +The robot's wakeword is `Hi Marimbabot!`. +The webcam mic is used for the speech detection, so don't move too far away from the whiteboard and microphone. Also don't stand in front of the whiteboard while the robot is reading the notes. To play some notes of the whiteboard follow these steps:
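Stepping back from the patch series: the interaction pattern that patches 5 and 6 describe — say the wakeword, wait for the activation sound, then give a command — can be sketched as a tiny gate. This is a hypothetical simplification for illustration only, not the actual marimbabot_speech implementation:

```python
# Toy model of the wakeword flow: a command is only accepted if the
# wakeword was heard immediately before it; each command re-arms the gate.
class WakewordGate:
    WAKEWORD = "hi marimbabot!"

    def __init__(self):
        self.armed = False
        self.accepted = []

    def hear(self, utterance: str):
        text = utterance.strip().lower()
        if text == self.WAKEWORD:
            self.armed = True           # this is where the activation sound would play
        elif self.armed:
            self.accepted.append(text)  # forward the command to the behavior node
            self.armed = False          # every command needs a fresh wakeword
        # anything else is ignored

gate = WakewordGate()
for u in ["Hi Marimbabot!", "Read the notes", "Play the notes",
          "Hi Marimbabot!", "Play the notes"]:
    gate.hear(u)
print(gate.accepted)  # ['read the notes', 'play the notes']
```

Note how the third utterance is dropped: without a fresh `Hi Marimbabot!` in between, `Play the notes` is ignored, which mirrors the two-step "read, then play" sequence in the usage instructions above.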