📚 Research & Open-Source Projects Using Pepper

  1. A Hybrid SLAM and Object Recognition System for Pepper Robot
    Integrates ORB‑SLAM2 and SIFT/RANSAC object recognition for autonomous indoor mapping and perception.
🔗 arXiv Paper and GitHub Code

  2. Setting Up Pepper For Autonomous Navigation And Personalized Interaction With Users
    Combines ROS navigation, cloud-based speech recognition, and facial recognition to enable speech-triggered, user-aware navigation.
🔗 arXiv Paper

  3. Upgrading Pepper’s Social Interaction with Advanced Hardware and Perception Enhancements
    Enhances Pepper with onboard Jetson GPU and RealSense camera, enabling real-time people detection and gaze estimation.
🔗 arXiv Paper

  4. Adapted Pepper
Hardware mod that adds a GPU and a 3D camera (Intel RealSense D435i), making Pepper capable of running OpenPose/YOLO onboard.
🔗 arXiv Paper


🛠️ Noteworthy GitHub Projects

  1. pepperchat (iLab Sweden)
    Integrates OpenAI's ChatGPT with Pepper using NAOqi, enabling open-domain conversation.
🔗 pepperchat GitHub

  2. pepper_robot (ros-naoqi)
    ROS meta-package offering basic Pepper control, drift fixes, and autonomy features via ROS wrappers.
🔗 pepper_robot GitHub

  3. Pepper-Nao Basic Tutorial (PenguinZhou)
    Educational resource with Choregraphe and Python demos—includes vision, expression, and interaction samples.
🔗 Pepper_Nao_Basic_Tutorial GitHub

  4. robotic-exercise-coach-pepper (M4rtinR)
    Demonstrates Pepper as a personal coach guiding squash and physiotherapy exercises using behavior trees.
    🔗 Robotic‑Exercise‑Coach‑Pepper GitHub

  5. Dialogue-Pepper-Robot (Igor Lirussi)
    Notebook + module offering open-domain conversational features using QI SDK and Java AIML backend.
🔗 Dialogue-Pepper-Robot GitHub

  6. pepper_dcm_robot (ros-naoqi)
    Provides ROS 1 controllers enabling smooth joint trajectory control via Naoqi DCM or MoveIt integration.
🔗 pepper_dcm_robot GitHub

  7. pepper_virtual (ros-naoqi)
    Simulated Pepper in Gazebo with ROS controllers—great for offline testing and development.
🔗 pepper_virtual GitHub
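The ChatGPT-bridge pattern used by pepperchat above can be sketched roughly as follows. This is an illustrative assumption, not the project's actual code: the `ChatHistory` class, `speak` stub, and history size are hypothetical, and on a real Pepper you would create a NAOqi `ALTextToSpeech` proxy and call the OpenAI chat API where the comments indicate.

```python
# Hypothetical sketch of a ChatGPT-to-Pepper bridge (not pepperchat's actual code).
# On a real robot you would connect via NAOqi, e.g.:
#   from naoqi import ALProxy
#   tts = ALProxy("ALTextToSpeech", "<pepper-ip>", 9559)
# and have `speak` call tts.say(text) instead of printing.

class ChatHistory:
    """Rolling message history in the OpenAI chat-message format."""

    def __init__(self, system_prompt, max_turns=10):
        self.system = {"role": "system", "content": system_prompt}
        self.max_turns = max_turns
        self.turns = []  # alternating user/assistant messages

    def add(self, role, content):
        self.turns.append({"role": role, "content": content})
        # Keep only the most recent exchanges so the prompt stays small.
        self.turns = self.turns[-2 * self.max_turns:]

    def messages(self):
        """Full prompt: system message plus the retained conversation turns."""
        return [self.system] + self.turns


def speak(text):
    # Placeholder for tts.say(text) on a real Pepper.
    print("Pepper says:", text)


if __name__ == "__main__":
    history = ChatHistory("You are Pepper, a friendly humanoid robot.")
    history.add("user", "Hello, who are you?")
    # In the real system, the reply would come from the OpenAI chat API
    # called with history.messages(); here we append a canned answer.
    history.add("assistant", "Hi! I'm Pepper.")
    speak(history.turns[-1]["content"])
```

The key design point is the rolling window: NAOqi-side code stays a thin loop while the history object bounds prompt size between turns.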


🎯 How to Use These

  • Clone and adapt demo code to your RAIL Lab Pepper environment.
  • Integrate research ideas (e.g., SLAM, emotion-aware navigation) into your workflows.
  • Use ROS packages to build autonomy stacks with SLAM, perception, and control.
  • Leverage conversational or coaching bots to build engaging user interactions.
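As one concrete example of the speech-triggered autonomy pattern above, a small command dispatcher can map recognized phrases to robot actions. Everything here is an illustrative assumption: on a real Pepper the phrases would come from NAOqi's `ALSpeechRecognition` (or a cloud ASR), and the hypothetical handlers would drive `ALMotion` or publish ROS navigation goals rather than return strings.

```python
# Hypothetical speech-to-action dispatcher (illustrative only).
# On a real robot, phrases would arrive from the speech-recognition
# event callback, and handlers would command ALMotion / move_base.

def go_to_kitchen():
    return "navigating to kitchen"  # would send a navigation goal


def wave():
    return "waving"  # would trigger an arm animation


# Phrase vocabulary mapped to action handlers.
COMMANDS = {
    "go to the kitchen": go_to_kitchen,
    "wave hello": wave,
}


def dispatch(phrase, confidence, threshold=0.5):
    """Run the handler for a recognized phrase, ignoring low-confidence hits."""
    if confidence < threshold:
        return "ignored (low confidence)"
    handler = COMMANDS.get(phrase.lower().strip())
    if handler is None:
        return "unknown command"
    return handler()


if __name__ == "__main__":
    print(dispatch("Go to the kitchen", 0.8))
```

The confidence threshold mirrors how NAOqi's word-recognized events report a score alongside each phrase, letting the robot ignore spurious matches.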

These resources offer a wealth of inspiration—from high-level research breakthroughs to hands-on robotics demos. Want help integrating any of these into the RAIL Lab environment?