How to Notify a Cyclist: Testing Sensory Inputs

As you may have seen, we’ve been experimenting with Reardar, a radar for cyclists. Whenever we’d discuss how Reardar would alert the cyclist to an approaching vehicle, the conversation would get hung up on the best means to provide that alert…or, for that matter, the best way to provide ANY information to a cyclist on the move. So we decided to “Stop Meeting. Start Making.”, built some quick prototypes, and conducted some quick field testing to find the answer.


After collecting all our ideas, we took those that were realistic or interesting and moved on to prototyping. We knew that testing needed to include riding a bike in city traffic; anyone who has missed a call while riding their bike can understand why. We also knew that setting up a proper test in actual traffic, using every mechanism we were interested in with every notification we’d need, would be impossible, especially on a short schedule with a minuscule budget.

Our solution was to test the mechanisms against each other on the street outside of our downtown Seattle office using “dumb notifications” (not triggered by anything and not conveying real information). This would allow for simple, cheap test prototypes that would be used to down-select which were worth pursuing. To that end, we built simple prototypes to test the three groups we saw our ideas falling into: Audible, Haptic, and Visual.

For each test, the participant rode around the block while the mechanism vibrated, beeped, spoke, or flashed colors. This meant we’d have all of the real-world “noise” of being near Pike Place Market during tourist season combined with the environmental effects of being out-of-the-lab (road vibration/potholes, dense city traffic, bright sun/dark shadows). Six TEAGUE’rs who regularly ride in these conditions opted to brave the streets for our testing.


Each mechanism tested was rated by the participant on its efficacy, whether it was distracting, and how intense the signal had to be to be noticed. All the mechanisms within a group (Haptic, Audio, and Visual) were ranked against the others. We did randomize the test order and normalize responses to add an air of scientific rigor, but given the complexity and our timescale/budget, the results are most certainly qualitative.
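To give a sense of what normalizing rankings like these can look like, here is a minimal sketch. The method (rescaling each participant’s ranks to a common 0–1 range before averaging) and the example rankings are our own illustration, not the exact analysis we ran:

```python
def normalize_ranks(ranks):
    """Rescale one participant's rankings (1 = best) to scores in [0, 1],
    where 1.0 is that participant's best-ranked mechanism."""
    worst = max(ranks.values())
    if worst == 1:
        return {name: 1.0 for name in ranks}
    return {name: (worst - r) / (worst - 1) for name, r in ranks.items()}

def average_scores(participants):
    """Average normalized scores across participants for each mechanism."""
    totals = {}
    for ranks in participants:
        for name, score in normalize_ranks(ranks).items():
            totals.setdefault(name, []).append(score)
    return {name: sum(s) / len(s) for name, s in totals.items()}

# Hypothetical haptic rankings from three participants (1 = best)
participants = [
    {"head": 1, "wrist": 2, "ankle": 3},
    {"head": 1, "ankle": 2, "wrist": 3},
    {"wrist": 1, "head": 2, "ankle": 3},
]
scores = average_scores(participants)
```

Rescaling per participant keeps one person’s harsh or generous rankings from dominating the group average.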



Two types of audio were tested: tones (things like beeps, chirps, and dings) and random spoken words. The audio types were heard via two different mechanisms: bone conduction headphones (to let the cyclist hear their surroundings) and a smartphone mounted to the bike stem.


The audible group had a distinct winner and loser. Test participants preferred hearing Tones through the bone conduction headphones to Spoken Words. The headphones transmitted messages sharply, even at high speeds.

The participants clearly didn’t like Spoken Words played through the phone’s speaker. Not surprisingly, they said that certain frequencies blended into the background. One participant said that he caught himself looking down at the phone, straining to hear. These problems could be mitigated by moving the speaker closer to the cyclist’s ear or using a directional speaker. But that wouldn’t solve a problem that one of the participants mentioned: not wanting to look dumb in public, a risk you’ll face if your helmet is speaking out loud to you.



We used an Eccentric Rotating Mass (ERM) vibration motor taped to the body at three different locations to provide a haptic notification. A microcontroller intermittently pulsed the motor at different intensity levels. These three test locations were down-selected from a much larger list we’d generated. The chosen locations have enough nerve endings to make them effective notification points, and all are spots where a small vibration device could live.
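On a microcontroller, this kind of intermittent buzz is essentially a schedule of PWM duty-cycle steps with pauses in between. A minimal sketch of that idea (the duty cycles and timings here are illustrative stand-ins, not the values we used):

```python
def buzz_pattern(intensities, on_ms=300, off_ms=700):
    """Build a schedule of (pwm_duty, duration_ms) steps for an ERM motor:
    each intensity buzzes for on_ms, followed by off_ms of silence."""
    schedule = []
    for duty in intensities:
        schedule.append((duty, on_ms))  # motor on at this duty cycle
        schedule.append((0, off_ms))    # motor off between buzzes
    return schedule

# Low, medium, and high intensity buzzes (8-bit duty cycle, 0-255)
pattern = buzz_pattern([80, 160, 255])
```

On the hardware side, each step just becomes an `analogWrite`-style PWM call followed by a delay.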


In the Haptic group, the Head location won, with ankle and wrist nearly equally ranked after it. All participants agreed that the head was the most effective (one participant commented “pretty ‘effing good!”), but it was also rated as being fairly distracting.

We used the same vibration intensities at every location tested; in retrospect, the higher intensities were way too strong for a person’s skull directly behind their ear. We could easily turn down the power and reduce distraction. We could also use Linear Resonant Actuator (LRA) vibration motors, the same type of motor found in most newer smartphones. This should enable more nuanced messages; instead of just vibrating, you could use a double-click or gradually ascending vibration to convey different messages.



Four different visual mechanisms were tested. All four ran through the same sequence: intermittently flashing three colors at low, medium, and high intensity. We tested super-bright LEDs affixed to the ends of the handlebars, a smartphone strapped to the stem flashing colors, and a helmet with both an LED strip on the visor and a single LED affixed to the mirror.
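The shared flash sequence can be sketched as the cross product of colors and brightness levels. The specific colors, levels, and timings here are stand-ins for illustration:

```python
from itertools import product

COLORS = ["red", "amber", "green"]   # stand-in colors
LEVELS = [0.2, 0.5, 1.0]             # low, medium, high brightness

def flash_sequence(flash_ms=400, pause_ms=600):
    """Every color at every brightness level, with a pause between flashes.
    Each step is (color, brightness, duration_ms)."""
    steps = []
    for color, level in product(COLORS, LEVELS):
        steps.append((color, level, flash_ms))
        steps.append(("off", 0.0, pause_ms))
    return steps

seq = flash_sequence()
```

Running the identical sequence on all four mechanisms is what made the rankings comparable across them.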


The visor LEDs ranked highest and weren’t perceived as being distracting. Not surprisingly, riding into the sun or in the shade impacted their efficacy, and some colors were easier to notice than others at all brightness levels.

Testers generally didn’t like the handlebar-mounted LEDs or the phone screen; both tended either to draw attention away from the road (potentially dangerous) or to go unnoticed entirely.


We learned a lot in this quick prototyping/testing phase. We will pursue the helmet visor LEDs, head-mounted haptic motor, and bone conduction headphones. Next up, we’ll flesh out different form factors for integrating these mechanisms into a cyclist’s kit. We will also investigate interaction methods for each that could convey many different types of notifications to a cyclist.

After that, we have a plan for how we might test a lot of scenarios rapidly with a lot of test participants. Stay tuned!