Resources
This appendix provides curated links to official documentation, tutorials, and community resources for the tools and technologies covered in this book.
ROS 2
Official Documentation
- ROS 2 Documentation: https://docs.ros.org/en/humble/
- ROS 2 Installation Guide: https://docs.ros.org/en/humble/Installation.html
- ROS 2 Tutorials: https://docs.ros.org/en/humble/Tutorials.html
- ROS 2 API Reference: https://docs.ros2.org/latest/api/
Community Resources
- ROS 2 Discourse Forum: https://discourse.ros.org/c/ros2/6
- ROS 2 GitHub: https://github.com/ros2/ros2
- ROS 2 Examples: https://github.com/ros2/examples
Key Packages
- Nav2 (Navigation): https://navigation.ros.org/
- MoveIt (Manipulation): https://moveit.ros.org/
- ros2_control: https://control.ros.org/
Gazebo
Official Documentation
- Gazebo Classic: https://classic.gazebosim.org/documentation
- Gazebo Garden: https://gazebosim.org/docs/garden
- Gazebo Tutorials: https://classic.gazebosim.org/tutorials
Community Resources
- Gazebo Forums: https://community.gazebosim.org/
- Gazebo Classic GitHub: https://github.com/gazebosim/gazebo
- Gazebo (gz-sim) GitHub: https://github.com/gazebosim/gz-sim
Robot Models
- Gazebo Model Database: https://app.gazebosim.org/
- URDF Tutorial: http://wiki.ros.org/urdf/Tutorials
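The URDF tutorial linked above builds robot descriptions from XML files like the following. As a quick orientation, here is a minimal (hypothetical) two-link arm with a single revolute joint; the element names and attributes are standard URDF:

```xml
<?xml version="1.0"?>
<robot name="two_link_arm">
  <link name="base_link">
    <visual>
      <geometry><box size="0.2 0.2 0.1"/></geometry>
    </visual>
  </link>
  <link name="arm_link">
    <visual>
      <geometry><cylinder radius="0.02" length="0.3"/></geometry>
    </visual>
  </link>
  <joint name="shoulder" type="revolute">
    <parent link="base_link"/>
    <child link="arm_link"/>
    <origin xyz="0 0 0.05"/>
    <axis xyz="0 0 1"/>
    <limit lower="-1.57" upper="1.57" effort="10" velocity="1.0"/>
  </joint>
</robot>
```

The same file can be loaded by RViz, Gazebo, and MoveIt, which is why URDF sits at the center of the ROS tooling linked in this appendix.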
Unity
Official Documentation
- Unity Learn: https://learn.unity.com/
- Unity Manual: https://docs.unity3d.com/Manual/index.html
- Unity Robotics Hub: https://github.com/Unity-Technologies/Unity-Robotics-Hub
ROS Integration
- ROS-TCP-Connector: https://github.com/Unity-Technologies/ROS-TCP-Connector
- Unity Robotics Tutorials: https://github.com/Unity-Technologies/Unity-Robotics-Hub/tree/main/tutorials
NVIDIA Isaac
Official Documentation
- Isaac Sim Documentation: https://docs.omniverse.nvidia.com/app_isaacsim/app_isaacsim/overview.html
- Isaac ROS Documentation: https://nvidia-isaac-ros.github.io/
- Isaac Sim Tutorials: https://docs.omniverse.nvidia.com/app_isaacsim/app_isaacsim/tutorial_intro.html
Getting Started
- Isaac Sim Installation: https://docs.omniverse.nvidia.com/app_isaacsim/app_isaacsim/install_workstation.html
- Isaac ROS Getting Started: https://nvidia-isaac-ros.github.io/getting_started/
Community Resources
- NVIDIA Isaac Forums: https://forums.developer.nvidia.com/c/omniverse/isaac-sim/
- Isaac Sim GitHub: https://github.com/NVIDIA-Omniverse/IsaacSim
Speech Recognition & LLMs
OpenAI Whisper
- Whisper GitHub: https://github.com/openai/whisper
- Whisper Paper: https://arxiv.org/abs/2212.04356
- Whisper API: https://platform.openai.com/docs/guides/speech-to-text
Large Language Models
- OpenAI API: https://platform.openai.com/docs
- Anthropic Claude: https://docs.anthropic.com/
- LLaMA (Meta): https://ai.meta.com/llama/
- Hugging Face: https://huggingface.co/
Robotics-Specific LLMs
- PaLM-E (Google): https://palm-e.github.io/
- RT-2 (Robotic Transformer): https://robotics-transformer2.github.io/
Hardware & Edge Computing
NVIDIA Jetson
- Jetson Developer Kits: https://developer.nvidia.com/embedded/jetson-developer-kit
- Jetson Software: https://developer.nvidia.com/embedded/jetpack
- Jetson AI Courses: https://developer.nvidia.com/embedded/learn/jetson-ai-certification-programs
Sensors
- Intel RealSense: https://www.intelrealsense.com/
- RealSense ROS 2: https://github.com/IntelRealSense/realsense-ros
Robot Platforms
- Unitree Robotics: https://www.unitree.com/
- Boston Dynamics: https://www.bostondynamics.com/
- Hiwonder: https://www.hiwonder.com/
Further Study
Advanced Robotics
- Modern Robotics (Lynch & Park): http://hades.mech.northwestern.edu/index.php/Modern_Robotics
- Introduction to Robotics (Stanford): https://see.stanford.edu/Course/CS223A
- Robotics: Computational Motion Planning (Coursera): https://www.coursera.org/learn/robotics-motion-planning
Control Theory
- Control Bootcamp (Steve Brunton): https://www.youtube.com/playlist?list=PLMrJAkhIeNNSVjnsviglFagA3gQmM1yuk
- Underactuated Robotics (MIT): https://underactuated.mit.edu/
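A recurring topic in both of these resources is the linear-quadratic regulator (LQR). As a minimal, self-contained sketch (plain Python, hypothetical double-integrator plant; not taken from either course), the discrete-time Riccati iteration they derive looks like this:

```python
def mm(X, Y):
    """Multiply two matrices given as lists of lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def tr(X):
    """Transpose a matrix."""
    return [list(row) for row in zip(*X)]

def lqr_gain(A, B, Q, R, iters=300):
    """Discrete-time LQR gain via fixed-point iteration of the Riccati
    equation  P = Q + A'PA - A'PB (R + B'PB)^-1 B'PA.
    Here B is n x 1 and R is a scalar, so the inverse is a division."""
    n = len(A)
    P = [row[:] for row in Q]
    for _ in range(iters):
        BtP = mm(tr(B), P)                    # 1 x n
        s = mm(BtP, B)[0][0] + R              # scalar R + B'PB
        K = [[v / s for v in mm(BtP, A)[0]]]  # 1 x n feedback gain
        AtP = mm(tr(A), P)
        AtPA = mm(AtP, A)
        AtPB = mm(AtP, B)                     # n x 1
        P = [[Q[i][j] + AtPA[i][j] - AtPB[i][0] * K[0][j]
              for j in range(n)] for i in range(n)]
    return K

# Hypothetical double integrator (position + velocity), dt = 0.1 s.
dt = 0.1
A = [[1.0, dt], [0.0, 1.0]]
B = [[0.0], [dt]]
K = lqr_gain(A, B, Q=[[1.0, 0.0], [0.0, 1.0]], R=0.1)

# Closed loop x <- (A - B K) x drives the state toward the origin.
x = [1.0, 0.0]
for _ in range(200):
    u = -(K[0][0] * x[0] + K[0][1] * x[1])
    x = [A[0][0] * x[0] + A[0][1] * x[1] + B[0][0] * u,
         A[1][0] * x[0] + A[1][1] * x[1] + B[1][0] * u]
```

The linked lectures cover when this iteration converges and how the gain relates to pole placement; the sketch is just the core computation with no dependencies.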
Machine Learning for Robotics
- CS330 Deep Multi-Task and Meta Learning (Stanford): http://cs330.stanford.edu/
- Deep RL Bootcamp (UC Berkeley): https://sites.google.com/view/deep-rl-bootcamp/
Computer Vision
- CS231n Convolutional Neural Networks (Stanford): http://cs231n.stanford.edu/
- OpenCV Tutorials: https://docs.opencv.org/4.x/d9/df8/tutorial_root.html
Reinforcement Learning
- Spinning Up in Deep RL (OpenAI): https://spinningup.openai.com/
- Reinforcement Learning Course (David Silver, UCL): https://www.davidsilver.uk/teaching/
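As a taste of what these courses cover, here is a minimal tabular Q-learning sketch (plain Python, on a hypothetical 1-D chain task; not taken from either resource):

```python
import random

def q_learning_chain(n_states=5, episodes=500, alpha=0.5, gamma=0.9,
                     eps=0.1, seed=0):
    """Tabular Q-learning on a toy 1-D chain: the agent starts at state 0
    and receives reward +1 on reaching the rightmost state.
    Actions: 0 = step left, 1 = step right."""
    rng = random.Random(seed)
    Q = [[0.0, 0.0] for _ in range(n_states)]

    def policy(s):
        # Epsilon-greedy with random tie-breaking.
        if rng.random() < eps or Q[s][0] == Q[s][1]:
            return rng.randrange(2)
        return 0 if Q[s][0] > Q[s][1] else 1

    for _ in range(episodes):
        s = 0
        while s != n_states - 1:
            a = policy(s)
            s2 = min(s + 1, n_states - 1) if a == 1 else max(s - 1, 0)
            r = 1.0 if s2 == n_states - 1 else 0.0
            # Q-learning update: move Q(s,a) toward r + gamma * max_a' Q(s',a').
            Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
            s = s2
    return Q
```

After training, the greedy policy steps right from every state; Spinning Up and Silver's lectures build from exactly this update rule toward the deep RL methods used in robot learning.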
Community & Forums
General Robotics
- r/robotics (Reddit): https://www.reddit.com/r/robotics/
- ROS Discourse: https://discourse.ros.org/
- Robotics Stack Exchange: https://robotics.stackexchange.com/
AI & Machine Learning
- r/MachineLearning (Reddit): https://www.reddit.com/r/MachineLearning/
- Papers with Code: https://paperswithcode.com/
Physical AI Specific
- Embodied AI Workshop: https://embodied-ai.org/
- CoRL (Conference on Robot Learning): https://www.corl.org/
Books
Robotics
- "Introduction to Robotics: Mechanics and Control" by John J. Craig
- "Modern Robotics" by Kevin M. Lynch and Frank C. Park
- "Robotics: Modelling, Planning and Control" by Bruno Siciliano, Lorenzo Sciavicco, Luigi Villani, and Giuseppe Oriolo
AI & Machine Learning
- "Deep Learning" by Ian Goodfellow, Yoshua Bengio, and Aaron Courville
- "Reinforcement Learning: An Introduction" by Sutton and Barto
- "Probabilistic Robotics" by Thrun, Burgard, and Fox
Computer Vision
- "Computer Vision: Algorithms and Applications" by Richard Szeliski
- "Deep Learning for Computer Vision" by Rajalingappaa Shanmugamani
Academic Papers
Vision-Language-Action
- PaLM-E: An Embodied Multimodal Language Model (Google, 2023)
- RT-2: Vision-Language-Action Models Transfer Web Knowledge to Robotic Control (Google, 2023)
- SayCan: Do As I Can, Not As I Say: Grounding Language in Robotic Affordances (Google, 2022)
Sim-to-Real Transfer
- Domain Randomization for Transferring Deep Neural Networks from Simulation to the Real World (OpenAI, 2017)
- Sim-to-Real Transfer of Robotic Control with Dynamics Randomization (Berkeley, 2018)
Humanoid Robotics
- Atlas, The Next Generation (Boston Dynamics video, 2016)
- Humanoid Robot Locomotion and Manipulation (Various)
This resource list is maintained and updated as new tools and papers emerge. Contributions and suggestions are welcome.