Codebase for generating expressive robot motion with a large language model (GPT-4)
Paper link: TBA
This codebase implements the framework described in our submitted paper; this implementation uses the GPT-4 model. The main files in this repo are:
Main notebook (the start of this notebook includes an explanation of the OpenAI API hyperparameters used in this work; see the sketch after this list): https://github.com/liamreneroy/LLM_motion/blob/main/scripts/llm_to_motion.ipynb
Sample prompt: https://github.com/liamreneroy/LLM_motion/blob/main/media/sample_prompt.txt
Sample output: https://github.com/liamreneroy/LLM_motion/blob/main/media/sample_output.txt
Statistics: https://github.com/liamreneroy/LLM_motion/blob/main/stats/
Data: https://github.com/liamreneroy/LLM_motion/blob/main/data/
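The notebook's hyperparameter discussion maps directly onto the arguments of a chat-completion call. Below is a minimal sketch of such a call; the prompt text and parameter values are illustrative placeholders, not the settings from the paper, and the snippet assumes the legacy (pre-1.0) openai Python interface.

```python
# Minimal sketch of a GPT-4 call showing the OpenAI API hyperparameters
# explained at the start of the notebook. Values here are illustrative
# placeholders, not the settings used in the paper.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

response = openai.ChatCompletion.create(  # legacy openai<1.0 interface
    model="gpt-4",
    messages=[
        {"role": "system", "content": "You map emotional states to robot motion parameters."},
        {"role": "user", "content": "Describe a motion for 'excited'."},
    ],
    temperature=0.7,        # sampling randomness: higher = more varied output
    top_p=1.0,              # nucleus sampling: probability mass of tokens considered
    max_tokens=512,         # upper bound on response length
    frequency_penalty=0.0,  # penalize tokens in proportion to how often they appear
    presence_penalty=0.0,   # penalize tokens that have appeared at all
)
print(response["choices"][0]["message"]["content"])
```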
Motion Parameters: https://tinyurl.com/eight-motion-parameters
Final Poses: https://tinyurl.com/final-poses
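To show how a completion could be turned into motion commands, here is a hedged sketch of parsing and range-checking a model reply. The parameter names and the [0, 1] range below are hypothetical placeholders, not the eight parameters defined above.

```python
# Hypothetical sketch: parse a GPT-4 reply into a motion-parameter dict
# and clamp values to a normalized range. The keys below are placeholders,
# NOT the eight parameters defined in the paper.
import json

EXPECTED_KEYS = ["speed", "smoothness"]  # hypothetical subset

def parse_motion_parameters(llm_output: str) -> dict:
    """Parse a JSON object from the model reply and validate it."""
    params = json.loads(llm_output)
    missing = [k for k in EXPECTED_KEYS if k not in params]
    if missing:
        raise ValueError(f"missing parameters: {missing}")
    # Clamp each value into [0, 1] so a malformed reply cannot
    # command an out-of-range motion.
    return {k: min(max(float(params[k]), 0.0), 1.0) for k in EXPECTED_KEYS}

print(parse_motion_parameters('{"speed": 0.8, "smoothness": 0.3}'))
```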
To evaluate the generalizability of our framework, we conducted a second experiment to test its ability to generate nonverbal audio expressions: https://github.com/liamreneroy/LLM_motion/tree/main/llm_audio_testcase
Selected sounds: https://github.com/liamreneroy/LLM_motion/tree/main/llm_audio_testcase/selected%20sounds
Plots: https://github.com/liamreneroy/LLM_motion/tree/main/llm_audio_testcase/plots
Looped audio library (from the RL_audio repo): https://github.com/liamreneroy/RL_audio/tree/main/notebooks/audio/sonif_libA/looped
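The sounds from the looped library can be auditioned locally with pygame (one of the dependencies below). A minimal sketch, assuming a WAV file downloaded from the library; the filename is a placeholder.

```python
# Minimal sketch: loop one of the downloaded sounds with pygame.
# "example_sound.wav" is a placeholder filename.
import time
import pygame

pygame.mixer.init()                     # start the audio subsystem
sound = pygame.mixer.Sound("example_sound.wav")
channel = sound.play(loops=-1)          # loops=-1 repeats indefinitely
time.sleep(5)                           # let it play for five seconds
channel.stop()
pygame.mixer.quit()
```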
This codebase requires the following packages:
pygame (see the getting-started guide: https://www.pygame.org/wiki/GettingStarted)
jupyterlab (or notebook), numpy, termcolor, openpyxl, nbconvert-webpdf, openai, wandb
To install a package, use one of:
--> sudo apt-get install <package_name>
--> python3 -m pip install <package_name>
--> conda install -c conda-forge <package_name>
Example using conda to install numpy:
--> conda install -c conda-forge numpy
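All of the Python packages can also likely be installed in one pip command; this assumes that nbconvert-webpdf refers to nbconvert's webpdf extra:
--> python3 -m pip install jupyterlab numpy termcolor openpyxl "nbconvert[webpdf]" openai wandb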
Contact: Liam Rene Roy ([email protected], [email protected])