# 🚀 Pepper Robot Demos and Tutorials
This page showcases fun and practical ways to interact with the Pepper robot for demos, teaching, and experimentation. Whether you're using Choregraphe, Python scripts, the browser interface, or your voice, Pepper is ready to perform!
## 🗣️ 1. Voice Commands
Pepper comes pre-programmed with a number of verbal triggers:
### ✅ Try saying:
- "close hand"
- "raise arm"
- "look left"
- "stop looking at me"
- "shake hand"
These commands are processed by Pepper's onboard speech recognition and mapped to pre-programmed behaviors.
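The built-in triggers ship with the robot, but you can wire your own phrases to actions through the NAOqi speech APIs. Below is a hedged sketch using `ALSpeechRecognition` and `ALMemory`; the IP address, vocabulary, and confidence threshold are example values, not Pepper's actual trigger configuration.

```python
# Sketch: map a custom spoken phrase to an action via ALSpeechRecognition.
# The IP, vocabulary, and 0.4 confidence threshold are example values.
import time
from naoqi import ALProxy

IP, PORT = "192.168.1.5", 9559
asr = ALProxy("ALSpeechRecognition", IP, PORT)
memory = ALProxy("ALMemory", IP, PORT)
tts = ALProxy("ALTextToSpeech", IP, PORT)

asr.setLanguage("English")
asr.setVocabulary(["raise arm", "look left"], False)  # exact phrases, no word spotting
asr.subscribe("Demo_ASR")  # start the recognition engine

try:
    while True:
        # "WordRecognized" holds [phrase, confidence, ...] for the last result.
        result = memory.getData("WordRecognized")
        if result and len(result) >= 2 and result[1] > 0.4:
            tts.say("You said " + result[0])
        time.sleep(0.5)
finally:
    asr.unsubscribe("Demo_ASR")  # always release the engine
```

In a real demo you would replace the `tts.say` call with a behavior trigger, and use an event callback rather than polling.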
## 🧠 2. Using Choregraphe
You can create and trigger behaviors using the Choregraphe GUI.
### Launch Choregraphe (in Docker or on the host):
```bash
./choregraphe/choregraphe
```
### Connect to Pepper:
Use the robot icon in the top-right and enter the robot's IP address.
### 🤹 Sample Behaviors
- **Text-to-Speech Node**: Add a `Text to Speech` box and connect a string input. Press play.
- **Head Movement**: Use a `Set Angle` box to move the head joints.
- **Dance**: Trigger sequences combining LED changes, text-to-speech, and body movement.
You can also import sample behaviors from:
```
/home/user/naoqi/samples/
```
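If you prefer to trigger an installed behavior from code rather than from the Choregraphe GUI, a minimal Python sketch using `ALBehaviorManager` (the IP address below is an example) looks like this:

```python
# Minimal sketch: list and run behaviors installed on the robot,
# including ones uploaded from Choregraphe. The IP is an example.
from naoqi import ALProxy

behavior_mgr = ALProxy("ALBehaviorManager", "192.168.1.5", 9559)

for name in behavior_mgr.getInstalledBehaviors():
    print(name)  # find the exact behavior path to run

behavior_mgr.runBehavior("animations/Stand/Gestures/Hey_1")  # blocks until done
```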
## 🌐 3. Web Interface Demos
Open Pepper's IP address in a browser (e.g., `http://192.168.1.5`).
### Try:
- Typing a sentence and pressing "Say"
- Playing sounds
- Monitoring posture and sensor status in real time (see the sketch below)
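The same status information is available programmatically. Here is a hedged sketch that reads the current posture and battery level with standard NAOqi calls (the IP address is an example):

```python
# Sketch: read posture and battery status from Python, mirroring
# what the web interface displays. The IP address is an example.
from naoqi import ALProxy

IP, PORT = "192.168.1.5", 9559
posture = ALProxy("ALRobotPosture", IP, PORT)
memory = ALProxy("ALMemory", IP, PORT)

print("Posture:", posture.getPosture())  # e.g. "Stand"
battery = memory.getData("Device/SubDeviceList/Battery/Charge/Sensor/Value")
print("Battery: {:.0%}".format(battery))  # charge as a fraction of 1.0
```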
## 🐍 4. Python SDK Examples
Run these from your Docker or SSH environment.
### 🗣️ Text-to-Speech:
```python
from naoqi import ALProxy

# Connect to the text-to-speech service on the robot and say a phrase.
tts = ALProxy("ALTextToSpeech", "192.168.1.5", 9559)
tts.say("Hello from Python!")
```
### 💡 LED Control:
```python
from naoqi import ALProxy

# Fade the face LEDs to green (0x00RRGGBB) over one second.
leds = ALProxy("ALLeds", "192.168.1.5", 9559)
leds.fadeRGB("FaceLeds", 0x00FF00, 1.0)
```
### 🕺 Move Pepper's Arms:
```python
from naoqi import ALProxy

motion = ALProxy("ALMotion", "192.168.1.5", 9559)
motion.setStiffnesses("LArm", 1.0)  # joints must be stiff before they can move
motion.setAngles(["LShoulderPitch"], [0.5], 0.2)  # 0.5 rad at 20% max speed
```
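To chain several joint targets into a gesture such as a wave, `ALMotion.angleInterpolation` takes per-joint lists of angles and timestamps. The values below are rough illustration guesses, not tuned on a real robot:

```python
# Rough wave sketch: raise the right arm, swing the elbow, then return.
# Angle and timing values are illustrative guesses.
from naoqi import ALProxy

motion = ALProxy("ALMotion", "192.168.1.5", 9559)
motion.setStiffnesses("RArm", 1.0)

names = ["RShoulderPitch", "RElbowRoll"]
angles = [[-0.5], [0.5, 1.2, 0.5, 1.2]]   # radians, one list per joint
times = [[1.0], [1.0, 1.5, 2.0, 2.5]]     # seconds, absolute from call time
motion.angleInterpolation(names, angles, times, True)  # True = absolute angles
```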
## 💻 5. CLI (qicli) Commands
Run these from a direct SSH terminal (`ssh nao@192.168.1.5`).
### Say something:
```bash
qicli call ALTextToSpeech.say "Pepper is ready!"
```
### Trigger a predefined behavior:
```bash
qicli call ALBehaviorManager.runBehavior "animations/Stand/Gestures/Hey_1"
```
## 🎉 Fun Demo Ideas
| Demo Title | Description |
| --- | --- |
| 👋 Say Hello | Use TTS and a hand wave to greet visitors |
| 🕺 Dance Party | Sequence LED lights, music, and choreographed movement |
| 🎤 Voice Control | Respond to user commands like "look left" or "raise arm" |
| 🤖 Mirror Me | Mimic human actions via computer vision input (advanced) |
| 🎨 Mood Light | Change facial LEDs based on emotion or time of day (see the sketch below) |
| 📚 Teaching Assistant | Say phrases, point to slides, react to classroom events |
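As a starting point for the Mood Light idea, here is a hypothetical sketch that picks a face-LED color from the time of day; the hour thresholds and colors are arbitrary choices:

```python
# Hypothetical "Mood Light": color Pepper's face LEDs by time of day.
# Hour ranges and colors are arbitrary illustration values.
from datetime import datetime
from naoqi import ALProxy

leds = ALProxy("ALLeds", "192.168.1.5", 9559)

hour = datetime.now().hour
if hour < 12:
    color = 0xFFFF00   # morning: yellow
elif hour < 18:
    color = 0x00FF00   # afternoon: green
else:
    color = 0x0000FF   # evening: blue

leds.fadeRGB("FaceLeds", color, 1.0)  # fade over one second
```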
Pepper isn't just a robot: it's a charming performer, teacher, and lab companion. Try mixing interaction types for a fully immersive demo!