# resonance a1
a physical computing experiment exploring tactile communication between two wearable devices — built with micro:bit, neopixel led strips, and grove sensors during an interaction design course at malmö university (mau).
## why
how do you communicate feeling without screens? this project treats vibration patterns and light as a shared language — two devices linked by radio, where shaking one triggers a haptic + visual response on the other. the interaction is immediate and physical, not mediated by pixels.
## how it works
each device runs on a bbc micro:bit with a neopixel strip (30 rgb leds) and a vibration motor.
- buttons a / b / ab select one of three vibration sequences
- shake gesture broadcasts the selected pattern to the paired device via radio
- receiving device plays the vibration pattern and runs a symmetric led animation (light expanding or contracting from center)
- grove ultrasonic sensor adds proximity feedback — triggering a pulse when something is within 5 cm
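the symmetric animation mentioned above boils down to lighting led pairs outward from the middle of the strip. a minimal sketch of that index math in plain typescript (the helper name `symmetricPair` is mine, not from the project):

```typescript
// model of the "expanding from center" animation on a 30-led strip,
// indexed 0..29; with an even led count the center sits between 14 and 15
const LED_COUNT = 30;

// indices lit at animation step k: one led left of center, one right
function symmetricPair(step: number): [number, number] {
  return [LED_COUNT / 2 - 1 - step, LED_COUNT / 2 + step];
}

// expanding plays steps 0..14 outward; contracting is the same
// sequence in reverse, back toward the center
const expanding: [number, number][] = [];
for (let k = 0; k < LED_COUNT / 2; k++) {
  expanding.push(symmetricPair(k));
}
```

playing the same step list in reverse gives the contracting variant for free, which keeps the two directions visually mirrored.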
the three vibration patterns (vibr01, vibr02, vibr03) use different rhythms of short pulses separated by pauses, creating distinct tactile signatures.
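the exact rhythms aren't spelled out here, so the tables below are illustrative rather than the project's real timings: one simple encoding is a list of millisecond segments alternating motor-on and pause.

```typescript
// illustrative timing tables for the three patterns, in milliseconds;
// even indices are motor-on segments, odd indices are pauses
// (the real rhythms live in the makecode project, not here)
const vibr01 = [100, 100, 100];               // short, pause, short
const vibr02 = [300, 150, 300, 150, 300];     // three longer pulses
const vibr03 = [80, 80, 80, 80, 80, 80, 80];  // rapid stutter

// total playback time of a pattern is the sum of its segments
function patternDuration(pattern: number[]): number {
  return pattern.reduce((sum, ms) => sum + ms, 0);
}
```

keeping the rhythms distinct in tempo and length is what makes each pattern recognizable by touch alone.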
## hardware
| component | pin | role |
|---|---|---|
| neopixel strip (30 leds) | P0 | visual feedback — symmetric animations |
| vibration motor | P1 | haptic feedback — 3 pulse patterns |
| grove ultrasonic v2 | P2 | proximity trigger (≤ 5 cm) |
| micro:bit radio | — | device-to-device communication (group 1) |
## stack
makecode · typescript · micro:bit · neopixel · grove · radio · physical-computing
## blocks preview
this image shows the block code from the latest commit; it may take a few minutes to refresh after a push.
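for readers without the image handy, a rough makecode-typescript sketch of the core send/receive flow (radio group and pins follow the hardware table; `playVibration` and `playAnimation` are hypothetical helpers, not names from the project). this targets the micro:bit runtime, so it runs in the makecode editor or simulator rather than standalone:

```typescript
// makecode (micro:bit) sketch of the paired-device flow
radio.setGroup(1)
let selected = 1

// buttons a / b / ab pick one of the three patterns
input.onButtonPressed(Button.A, () => { selected = 1 })
input.onButtonPressed(Button.B, () => { selected = 2 })
input.onButtonPressed(Button.AB, () => { selected = 3 })

// shake broadcasts the selected pattern id to the paired device
input.onGesture(Gesture.Shake, () => {
    radio.sendNumber(selected)
})

// the receiver plays the matching vibration pattern and led animation
radio.onReceivedNumber((id: number) => {
    playVibration(id)   // drives the motor on P1 (hypothetical helper)
    playAnimation(id)   // drives the neopixel strip on P0 (hypothetical helper)
})
```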

## open in makecode
to import and edit this project:
- open makecode.microbit.org
- click import → import url
- paste https://github.com/fabio-cassisa/ResonanceA1_MAU
## status
🟢 shipped — course project, complete
built by fabio cassisa · malmö university interaction design