Brilliant to the bone
Techno Talk
Max the Magnificent
From bone-conduction headphones to exploring RealSense's depth cameras and Eyesynth's futuristic NIIRA headset, this month we journey into sound, vision and accessibility technology with life-changing potential.
I mentioned this last month but,
as a reminder, earlier this year the
folks at H2O Audio (h2oaudio.com)
gifted me a pair of their TR2 multisport, waterproof bone-conduction
beauties, which even work underwater if I want to listen to music while
swimming.
I must admit that before actually
taking these for a spin, I had wondered about the quality of sound that
can be achieved using bone conduction technology. I can honestly say
that they exceed my expectations. To
my untrained ear, the sound quality
is as good as that offered by my regular headphones.
Of course, they don’t block out external sound, but that’s an advantage
in many situations. For example,
when my wife (Gigi the Gorgeous)
and I recently visited her mother,
who lives in Nashville, Tennessee,
I spent a delightful couple of hours
trailing them as they meandered their
way around a series of stores.
I say “delightful” because I was
happily enjoying a science fiction
audiobook while still being able to
detect and respond to the occasional dulcet tones directed at me by my
spouse.
They’re also great if you’re on a
bicycle, scooter or similar, or even
just walking around the streets, as
you can still hear vehicles approaching, sirens and so on. Wearing them
allows you to listen to music, audiobooks etc without ruining your
situational awareness.
Similarly, bone conduction technology offers an ideal solution for
audio aids intended for the visually
impaired, who rely on being able to
hear what’s going on around them.
We will return to this in a moment,
but first…
RealSense’s D415
A few days ago, I had a very interesting chat with the folks from
RealSense (realsenseai.com). This
used to be a division at Intel, but they
recently spun out to begin operating
as an independent company.
Their primary focus (no pun intended) is computer vision technologies,
particularly depth-sensing systems,
that enable machines to perceive and
understand their environment. One
example of this is their D415 Depth
Camera.
There’s much more to this than
meets the eye (OK, pun intended this
time; sorry). Let’s start with the two
RGB+ cameras mounted on the left
and right sides (I’ll explain my use of
the ‘+’ qualifier in a moment). These
are used to provide stereoscopic depth
perception in much the same way as
human binocular vision.
Each camera captures the same
scene from slightly different angles.
By identifying matching features in
both images, the system measures
their horizontal displacement, called
disparity. Nearby objects show a significant disparity, while distant ones
show little.
Using triangulation, the system
converts disparity into depth, producing a 3D depth map where each
pixel encodes distance. This enables
applications like object detection,
robot navigation, augmented reality
and gesture tracking.
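The disparity-to-depth step described above boils down to one similar-triangles formula: depth Z = f x B / d, where f is the focal length in pixels, B the baseline between the two cameras, and d the disparity. Here is a minimal sketch of that calculation; the focal length and baseline figures are illustrative stand-ins, not actual D415 calibration values.

```python
def disparity_to_depth(disparity_px, focal_length_px, baseline_m):
    """Convert horizontal disparity (pixels) to depth (metres) for a
    rectified stereo pair, via triangulation: Z = f * B / d."""
    if disparity_px <= 0:
        return float("inf")  # zero disparity: the point is at infinity
    return focal_length_px * baseline_m / disparity_px

# Illustrative values: focal length 900 px, cameras 55 mm apart.
f, B = 900.0, 0.055
print(disparity_to_depth(45.0, f, B))  # nearby object: large disparity, ~1.1 m
print(disparity_to_depth(5.0, f, B))   # distant object: small disparity, ~9.9 m
```

Note how the formula captures the relationship in the text: nearby objects produce a large disparity (and a small Z), while distant ones produce little disparity (and a large Z).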
This is the clever bit. If we were
to use regular RGB cameras, their
performance would be degraded by
poor lighting and low-texture surfaces, such as a flat white wall. The
CMOS sensors used in RGB cameras are sensitive to a broad spectrum,
including infrared (IR). If left unchecked, this IR can ‘bleed’ into the
RGB channels, resulting in strange,
washed-out colours.
That’s why traditional RGB sensors
include an IR-cut filter to remove the
IR component from the image. The
RGB+ cameras in the D415 don’t have
this filter, meaning they also pick up
infrared light, hence the ‘+’.
Now observe the big ‘thing’ to the
left of centre (I hope I’m not being
too technical). This is an IR projector
that casts a random pattern of thousands of dots, allowing the system
to generate a 3D depth map even in
a dark room and/or when viewing
a flat, single-colour surface, such as a
painted wall.
The other ‘thing’ to the right of
centre is a regular RGB camera. This
captures a traditional RGB image of
the scene. This image is 100% aligned
with the 3D depth map, frame by
frame and pixel by pixel. The D415
makes this data available for use by
other people’s applications and systems (robotics, industrial automation,
3D scanning and modelling, medical
imaging, motion tracking etc).
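The value of that pixel-by-pixel alignment is that one index into the frame gives you both a point's colour and its distance. The toy sketch below illustrates the idea with invented NumPy arrays standing in for one aligned frame pair (real applications would get these buffers from the camera's SDK rather than building them by hand):

```python
import numpy as np

# Toy stand-in for one aligned frame pair: a depth map in millimetres
# plus an RGB image at the same resolution. Values are invented for
# illustration, not real D415 output.
h, w = 480, 640
depth_mm = np.full((h, w), 1200, dtype=np.uint16)  # background ~1.2 m away
depth_mm[100:140, 300:360] = 350                   # one nearby object
rgb = np.zeros((h, w, 3), dtype=np.uint8)          # colour frame (all black here)

# Because the streams are aligned frame by frame and pixel by pixel,
# the same (row, col) index yields a point's colour AND its distance:
row, col = 120, 330
distance_m = depth_mm[row, col] / 1000.0
colour = rgb[row, col]

# A one-line segmentation: flag every pixel closer than half a metre.
near_mask = depth_mm < 500
print(distance_m, near_mask.sum())  # 0.35 m; the 40x60 nearby patch
```

This is the raw material for the applications listed above: robotics, 3D scanning, motion tracking and the rest all start from "what is at this pixel, and how far away is it?".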
Speaking of which…
Eyesynth’s NIIRA
One application that really (I’m
going to resist saying “caught my eye”)
attracted my attention is the NIIRA
audiovisual perception headset from
Eyesynth (eyesynth.com).
The D415 Depth Camera (Source: RealSense).
The NIIRA audiovisual perception device (Source: Eyesynth).
If you look closely at this device,
you'll see it has a RealSense D415
depth camera seamlessly moulded
into the frame above the lenses. This
headset connects to a small AI computer, roughly
the size of a pack of playing cards.
The RealSense D415 delivers the raw
perception layer, and the Eyesynth
NIIRA system adds the intelligence,
interpreting the 3D scene, performing object detection, recognition, and
contextual analysis. Any relevant
information is translated into soundscapes that are conveyed to the user
via the NIIRA bone-conduction audio
system.
As we previously noted, people
with visual impairments rely heavily
on sound cues from their surroundings. The last thing they want is to
block their ears with headphones or
diminish their hearing with earbuds,
hence the use of bone conduction
technology.
A few years ago, I was invited to
give a talk on advanced technologies at Sheffield Hallam University.
Knowing my cousin Graham's interest in this
sort of thing (see the 'My cousin
Graham' sidebar), I invited him to attend.
A lot has changed since I was a
student there. I graduated in 1980.
Goodness gracious me. I just realised
that was 45 years ago. I’m too young
for all this excitement!
The thing is that I got lost on the
way to my designated lecture theatre.
I found myself standing at an intersection of corridors whose layout
would have made the designer of the
Paris street system blush with pride.
I was confused, which is my natural state, so at least I was playing to
my strengths. Then I heard a “tap tap
tap” sound, and Graham appeared
from around a corner, striding confidently along. I’m embarrassed to
say that it was Graham who led us
to our destination.
If I ever win the lottery, I think it’s safe
to say that Graham will find a NIIRA
audiovisual perception device lurking
in his Christmas stocking.
PE
My cousin Graham
Me aged ~3 (left) and Graham aged ~6 (right).
My cousin Graham is three years
older than I am. That means he knew
much more than I did when we were
younger. I know this to be true because he told me so. He also told me
that spaghetti was made from peeled
worms, which has left a bad taste in
my mouth to this day.
About 30 years ago, Graham started to lose his sight. He’s now totally
blind, but that hasn’t slowed him
down. He danced with gusto and
abandon at my wedding; he hosts a
local radio show; and he often attends
concerts in the city centre, making
his way there and back again by bus
and on foot.
Practical Electronics | November | 2025