Today we are going to explore more complex sensors and the kinds of interaction they can support.
Topics
Code for Today
#include <Wire.h>
#include <Servo.h>
#include "SparkFun_VL53L1X.h" //Click here to get the library: http://librarymanager/All#SparkFun_VL53L1X
//Optional interrupt and shutdown pins.
#define SHUTDOWN_PIN 2
#define INTERRUPT_PIN 3
SFEVL53L1X distanceSensor;
//Uncomment the following line to use the optional shutdown and interrupt pins.
//SFEVL53L1X distanceSensor(Wire, SHUTDOWN_PIN, INTERRUPT_PIN);
float filteredValue;
float prevFilteredValue;
// weight is the smoothing factor:
//   1.0 --> no smoothing at all
//   closer to 0 (e.g. 0.1) --> smoother values
float weight = 0.2;
Servo myServo;
void setup(void)
{
  Wire.begin();
  Serial.begin(115200);
  Serial.println("VL53L1X Qwiic Test");

  if (distanceSensor.begin() != 0) //Begin returns 0 on a good init
  {
    Serial.println("Sensor failed to begin. Please check wiring. Freezing...");
    while (1)
      ;
  }
  Serial.println("Sensor online!");

  myServo.attach(9);
}
void loop(void)
{
  distanceSensor.startRanging(); //Write configuration bytes to initiate measurement
  int distance = distanceSensor.getDistance(); //Get the result of the measurement from the sensor, in mm
  distanceSensor.stopRanging();

  filteredValue = filter(distance, weight, prevFilteredValue);
  prevFilteredValue = filteredValue;

  int angle = map(filteredValue, 0, 500, 0, 180);
  angle = constrain(angle, 0, 172);
  myServo.write(angle);

  // Print fixed lower/upper bounds first so the Serial Plotter keeps a stable scale.
  Serial.print(0);
  Serial.print(" ");
  Serial.print(4000);
  Serial.print(" ");
  Serial.print(filteredValue);
  Serial.print(" ");
  Serial.print(angle);
  Serial.println();

  delay(10);
}
// Exponential moving average: blends the new reading with the previous output.
float filter(float rawValue, float w, float prevValue) {
  float result = w * rawValue + (1.0 - w) * prevValue;
  return result;
}
Available Sensors in Your Kit
- MSA301 Accelerometer
- TLV493D Triple-Axis Magnetometer
- VL53L1X Distance Sensor
- LSM6DS3 IMU (Inertial Measurement Unit): 3D accelerometer and 3D gyroscope (Datasheet)
  - Note that this one is embedded in the Arduino Nano 33 IoT board.
  - I have tutorial videos on how to set it up: Video 1: Update the Firmware; Video 2: Basic setup and testing the sensor.
  - Or follow the Arduino quickstart guide.
Homework
- Choose one sensor from your kit, from the sensors we have in Mechatronics, or one you already own.
- Test out the sensor and see what kind of data you can get from it. Try mapping the values to different outputs (light, servo motor).
- Come up with 1–3 different ways you could turn the sensor into an experimental interface for an interactive system. The system could be:
- Interactive art installation
- Experimental game
- Musical instrument or controller
- An experimental interface for a real product
- Wearable device
- Add a description of your idea (text, sketches, images, references) to our Miro board for this week. There is a frame for each of you. Check MyCourses for the Miro board link.
The required information:
- Your name
- The sensor you have chosen
- Description of what kind of data you can get out of it
- Description of the type of interaction you would use the sensor for
- Description of what your interactive system does
- Sketches and images of concept
Additional (optional) information:
- A working prototype of your idea
- A video of it actually working (or a mock-up video showing how it should work)
- Code for your prototype
- Circuit schematic of your prototype
- You can also combine multiple sensors
Note: if you already have an idea of what you want to do as a final project for this class, use this assignment as a starting point for planning it.
Feeling stuck?
Check out some examples from the following sites:
- Physical Computing’s Greatest Hits and Misses
- Arduino Project Hub
- Adafruit tutorials
- Sparkfun tutorials
- ALT.CTRL.GDC archives (Game Developers Conference)
- Shake That Button (more alternative game controllers)
- Hackaday
Or just look around you. What kind of objects do you use in your daily life? What are the gestures and affordances associated with those objects? Could you somehow use a sensor to convert those interactions into input data?