The sensor features advanced gesture detection, proximity detection, and digital ambient light sensing. Gesture detection can accurately sense up, down, left, and right swipes as well as more complex movements.
The code using this sensor has been integrated into the OpenCat project. Uncomment the line #define GESTURE in OpenCat.ino, as shown in the figure below, then use the Arduino IDE to upload the sketch to the robot's main board. This reproduces the example function that triggers robot actions from gestures.
If you want to test the gesture sensor on its own, or learn more about how it works, you can use the Arduino IDE to upload the demo sketch (gesture.ino), as shown below:
After uploading the sketch, connect the sensor to the NyBoard with a wire, as shown in the following picture:
In actual use, the end connected to the sensor can be fixed on the robot's head (inside Bittle's mouth or attached to the top of Nybble's head); of course, you can also mount it creatively according to your needs.
This demo sketch (gesture.ino) prints the directional gestures (up, down, left, and right) made in front of the sensor to the serial monitor in real time.