Autonomous Vehicles
ADVANCED DRIVER ASSISTANCE SYSTEMS
Driving automation includes fully autonomous vehicles (that can drive
entirely by themselves) as well as advanced driver assistance systems
(ADASs), which make a human driver’s job easier. Both technologies have
made significant strides in recent years. By Dr David Maddison, VK3DSM
The ‘future of driverless cars’ from an advertisement in the Philadelphia Saturday Evening Post, 1956.
Source: www.saturdayeveningpost.com/2018/05/driverless-cars-flat-tvs-predictions-automated-future-1956/
This article will be about automation
in ground vehicles only; we have
previously discussed aerial automation in several articles, including the
recent one on Drones (also known as
UAVs) in the September issue. We also
discussed autonomous underwater
vehicles in the September 2015 issue,
and autonomous agricultural vehicles
in June 2018.
Classifications
Whether or not a vehicle is autonomous is not a simple yes/no answer;
there are different levels of autonomy.
Thus, there are several schemes to categorise levels of vehicle automation.
One of the most commonly used is
from the Society of Automotive Engineers (SAE), embodied in their J3016
standard. It defines six levels of vehicle automation.
For SAE levels 0-2, the driver is
fully driving the vehicle and remains
in complete control.
Level 0: No driving automation. The
vehicle may provide warnings and
momentary assistance only, such as
automatic emergency braking, blind
spot warning and lane departure warning. Most entry-level cars on the market today are at this level.
Level 1: Partial automation with a
single feature for the vehicle to control,
like steering, braking or acceleration.
May include lane centring or adaptive
cruise control. Many cars on the road
today have one of these features.
Level 2: Partial driving automation.
The vehicle can control (when necessary) steering, braking and acceleration, such as lane centring and adaptive cruise control. A reasonable proportion of cars on the road today have
both of these features, and they come
on most new higher-end vehicles.
Level 3: Conditional driving automation. This includes environment
detection, with capabilities like automated overtaking or negotiating traffic jams. The driver must be prepared
to take control of the vehicle when
required.
Examples include Audi A8L Traffic
Jam Pilot, Mercedes Benz Drive Pilot,
Honda Legend Traffic Jam Assist and
BMW Personal Pilot L3. Note that
these systems may not be approved
in certain locations.
For SAE levels 4-5, the driver is not
usually required to take control of the
vehicle (and may not be able to, as it
might not have controls).
Level 4: High level of driving automation. The vehicle drives itself
under nearly all circumstances. An
example is a driverless taxi for use on
local roads (eg, Waymo One; https://
waymo.com/waymo-one/), shuttle
buses in controlled urban environments, delivery services with trucks
(Gatik is partnering with Isuzu) and
public transportation.
Mercedes Benz is trialling level 4
driving on various roads in Beijing.
Pedals and steering wheel may not be
fitted to a Level 4 vehicle. Such a vehicle likely cannot go off road.
Level 5: This is similar to level 4,
but more advanced. Whereas level 4
is fully automated, it is restricted to
certain structured environments, like
road networks. Level 5 has full driving automation under all possible circumstances, including off road. There
is currently no example of a widely
available car that meets the level 5
criteria.
How do autonomous
vehicles work?
An autonomous vehicle requires
many integrated systems to function.
That includes multiple sensors to
sense and map the environment; actuators to operate systems like steering
or brakes; algorithms to guide tasks
like parking, lane keeping, or collision avoidance; machine learning to
handle a range of scenarios; powerful
computers to orchestrate this all; and
complex software running on reliable
operating systems.
A multitude of data from the sensors
must be brought together in a process
called sensor fusion. Sensor fusion
involves merging data from numerous sensors to create a more comprehensive and accurate view of the
environment than can be supplied by
individual sensors. It is equivalent to
how a human combines information
from multiple senses (sight, hearing,
balance etc).
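To make the idea concrete, here is a minimal Python sketch of one common fusion technique, inverse-variance weighting, where less noisy sensors get more say in the fused estimate. The sensor readings and noise figures are invented for illustration; production systems use far more sophisticated filters, such as Kalman filters.

```python
# A minimal sensor fusion sketch: combine a lidar and a radar range
# estimate by inverse-variance weighting, so the less noisy sensor
# dominates. All figures are illustrative, not from any real system.

def fuse(estimates):
    """Fuse (value, variance) pairs into one estimate."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, estimates)) / total
    return value, 1.0 / total  # fused variance is below any single sensor's

lidar = (42.3, 0.05)  # range in metres, variance in m^2 (assumed)
radar = (42.9, 0.50)  # noisier at range, but unaffected by fog
distance, variance = fuse([lidar, radar])
print(f"fused range: {distance:.2f} m (variance {variance:.3f} m^2)")
```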
The controlling computer receives
instructions from a person about
where to go, then plots a route and
sends appropriate instructions to
the actuators to move the vehicle in
the required direction. At the same
time, the vehicle is constantly monitoring its environment for collision
avoidance, lane keeping, observing
speed limits, stopping at signals and
stop signs, and observing other traffic rules.
Important Developments
Some significant developments toward
advanced driver assistance systems and
autonomous vehicles are:
1939 the GM Futurama display at New York
World’s Fair prophesied a future in which there
were semi-automated vehicles equipped with
lane centring, lane change & blind spot assist
systems, as described in the book Magic
Motorways by Norman Bel Geddes.
1952 GM introduced the Autronic Eye, an
automatic headlight dimming system, on
some Oldsmobile and Cadillac models.
1958 Chrysler offered cruise control,
invented by a blind engineer, Ralph Teetor.
1964 Twilight Sentinel was introduced on
some Cadillac models, controlled by a photocell to sense ambient light levels and turn
the headlights on or off. It was introduced in
other models throughout the 1970s and later.
Some versions switched on the lights whenever the wipers were activated, to improve
safety in low-visibility conditions.
1977 Japan’s Tsukuba Mechanical Engineering Laboratory developed an experimental car that could drive itself on specially
marked streets.
1978 Rain-sensing wipers were invented by
Australian Raymond J. Noack (siliconchip.au/
link/ac6o).
1989 the Volkswagen Futura concept car
had four-wheel steering to autonomously
manoeuvre into parking spots.
1992 the Mitsubishi Debonair used lidar to
warn the driver if they were too close to the
vehicle ahead, but couldn’t control the vehicle.
1995 the Mitsubishi Diamante had an adaptive cruise control using lidar but could not
apply the brakes.
2003 Honda introduced the Collision Mitigation Brake System to automatically apply
the brakes if it detected a collision was imminent.
2004 DARPA held their inaugural Grand
Challenge, a series of competitions to encourage the development of “autonomous ground
vehicles capable of completing a substantial
off-road course within a limited time”.
2006 the Lexus LS460 was sold with a Lane Keep Assist feature that steers the vehicle back into its lane if it deviates.
2015 Tesla offered the “Autopilot” feature on their Model S.
2019 Mercedes Benz and Bosch tested automated valet parking at Stuttgart Airport in Germany, to guide a car autonomously to a pre-booked parking place.
2023 Mercedes Benz’s DRIVE PILOT system
is approved in Nevada, USA, to drive on certain freeways during daylight below 40 miles
per hour (64km/h).
2024 BMW obtained approval for Personal
Pilot L3 in Germany, similar to Mercedes
DRIVE PILOT.
Figs.1 & 2: the architecture of
a typical autonomous vehicle.
ML = machine learning, AI
= artificial intelligence, DL =
deep learning, UI/UX = user interface/user experience, AUTOSAR = Automotive
Open System Architecture, ROS = robot operating system, RTOS = real time
operating system, V2X = vehicle to everything.
It will also monitor itself, to ensure
sufficient fuel or battery charge, while
looking for places along the way to
refill.
Figs.1 & 2 show the generic hardware and software architecture of a
typical autonomous vehicle, along
with information flows and actions.
Environment sensing
Autonomous vehicles, or vehicles
with ADAS (Advanced Driver Assistance System), need ‘eyes’ to see the
environment around them, as well as
other sensors. The main sensors are
lidar for 3D mapping; radar; sonar;
cameras; and GPS/GNSS for locating
the vehicle.
Other sensors such as gyroscopes
and accelerometers can provide ‘dead
reckoning’ navigation when there is
no GNSS signal available, such as in
tunnels. Those sensors are usually
also used to detect if the vehicle is
veering off course (eg, due to skidding on a slippery road), allowing the
vehicle to take corrective action, and
also to detect collisions (eg, to trigger airbags).
The vehicle will probably also
have sensors to detect the temperature, ambient light level (to control
lights) and so on. It may even have a
microphone to listen for the siren of
an emergency vehicle, so it can pull
over to let it pass.
Lidar stands for Light Detection
and Ranging. It is like radar, emitting
laser pulses (rather than RF pulses, as
in radar) from a rotating assembly to
make a three-dimensional map (point
cloud) of the environment based on
the time for the reflected signal to
return.
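As a back-of-envelope illustration (not any particular unit's firmware), the range calculation is simply the speed of light multiplied by half the round-trip time:

```python
# Lidar ranging reduces to time-of-flight: the pulse travels out and
# back, so range = c * t / 2. The timing value below is illustrative.

C = 299_792_458  # speed of light in m/s

def lidar_range(round_trip_seconds):
    return C * round_trip_seconds / 2

# A pulse returning after 2 microseconds corresponds to roughly 300m,
# the long-range figure quoted for automotive units.
print(f"{lidar_range(2e-6):.1f} m")  # ~299.8 m
```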
An example of a commercial lidar
device for ADAS or autonomous vehicles is the HESAI Automotive-Grade
120° Ultra-High Resolution Long-Range lidar (siliconchip.au/link/ac6p)
– see Figs.3 & 4. That model is said to
acquire 34 million data points per second to a range of 300m.
Figs.3 & 4: a HESAI lidar (Light Detection and Ranging) unit shown inset. Under it is an example of lidar imagery (a point cloud) with 128 channels (bottom) and the superior HESAI unit with 1440 channels (top). Source: www.hesaitech.com/product/at1440-360
ADAS and autonomous vehicles usually have multiple cameras. The imagery from these has to be turned into meaningful data that can be used by the controller. This is done by software that creates a three-dimensional map (point cloud) while extracting other useful data. An example of software used for this is Nodar Hammerhead (siliconchip.au/link/ac6s), shown in Fig.6.
Sonar sensors use ultrasonic sound
waves to measure distance, providing short-range information about
objects in the immediate vicinity of
the vehicle. Radar sensors use microwave radio beams to measure the
range, velocity and direction of objects
within their field of view.
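As rough illustrations (the echo time, Doppler shift and carrier frequency below are invented values), both ranging principles reduce to a few lines:

```python
# Sonar uses the speed of sound; radar can use the Doppler shift of its
# reflected carrier to measure closing speed. Values are illustrative.

SPEED_OF_SOUND = 343.0        # m/s in air at about 20°C
SPEED_OF_LIGHT = 299_792_458  # m/s

def sonar_range(echo_seconds):
    """Ultrasonic range: half the round trip at the speed of sound."""
    return SPEED_OF_SOUND * echo_seconds / 2

def radar_closing_speed(doppler_shift_hz, carrier_hz=77e9):
    """Closing speed from the Doppler shift of a 77GHz automotive radar."""
    return doppler_shift_hz * SPEED_OF_LIGHT / (2 * carrier_hz)

print(f"{sonar_range(0.012):.2f} m")           # a 12ms echo: ~2.06m away
print(f"{radar_closing_speed(5133):.1f} m/s")  # ~10 m/s closing speed
```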
The sensors need to work regardless of conditions such as heavy snow,
rain, ice, fog, road line markings
being obscured or absent, changes in
road surfaces, debris on the road, dirt
roads etc.
No single sensor is good at everything under all conditions, so a
variety of sensors are needed. For example, the sensors shown in Fig.5 are fused to produce the capability shown in Fig.7.
Fig.5: environmental sensing by an autonomous vehicle with multiple cameras, radars, ultrasonic systems and a lidar unit. Original source: www.mdpi.com/1424-8220/23/6/3335
Software
Hazard assessment
Relevant software standards include
ISO 26262, which is a process for
managing and reducing risks for electrical and electronic systems in road
vehicles. It covers planning, analysis,
design, implementation, verification,
validation, production, operation and
decommissioning. It includes guidance on model-based development,
software safety analysis, dependent
failure analysis, fault tolerance and
more.
ASIL refers to Automotive Safety
Integrity Level, a risk classification system specified by ISO 26262.
It defines functional safety as “the
absence of unreasonable risk due to
hazards caused by malfunctioning
behavior of electrical or electronic
systems”.
There are four levels of risk associated with system failure: A, B, C &
D, with A being the lowest level and
D the highest level of hazard if a system fails – see Fig.9. The higher the
risk level, the greater the required
reliability and robustness of the particular system.
AEC-Q100 is a standard that ensures
the safety of electronic parts used in
cars, focusing on reliability stress-testing of integrated circuits.
Fig.6: an actual image from the Nodar Hammerhead at upper left and the
processed image outputs from their stereovision software at upper right and
bottom. Source: www.nodarsensor.com/products/hammerhead
Fig.7: the capabilities of the sensors from Fig.5 fused to show the overall
detection capability for cameras,
radar & lidar at lower right. Original
source: www.mdpi.com/1424-8220/23/6/3335
Fig.8: the architecture of NVIDIA’s
DriveOS software.
According to Synopsys, today’s
autonomous cars use 100 million lines
of code, and in the near future, they
will have 300 million lines.
Operating systems for autonomous vehicles include QNX Neutrino
(used by Acura, Audi, BMW and Ford
among others; Unix-like); WindRiver
VxWorks (also used by BMW, Ford and
the Mars Perseverance rover); NVIDIA’s DriveOS (see Fig.8, used by Audi,
Mercedes-Benz, Tesla and Veoneer);
along with Integrity.
Apple, Google and Microsoft also
have their own versions of autonomous vehicle operating systems in use
or under development.
AUTOSAR
AUTOSAR (AUTomotive Open System Architecture; www.autosar.org)
is a global automotive and software
industry partnership to develop and
implement an open and standardised
software, electrical and electronic
framework for “intelligent mobility”.
It defines things such as common interfaces, communications protocols, data
formats etc.
The layered architecture of AUTOSAR includes an application layer
(vehicle specific), a runtime environment (that manages communications
between software components), a basic
software layer (communications and
memory management, etc) and a control unit abstraction layer, to allow
software to be developed regardless of
specific hardware – see Fig.10.
Fig.9: the ASIL hazard assessment levels for the failure of various systems on an
autonomous vehicle. A indicates the least concern of failure, while D is of most
concern. Source: www.synopsys.com/glossary/what-is-asil.html
Advanced Driver Assistance
Systems
Fig.11: the Cruise self-driving car. Source: https://unsplash.com/photos/a-car-that-is-sitting-in-the-street-PkKsHQ5u4g8
These systems can help a human to
operate a vehicle at SAE automation
levels 0 through 5, or be integrated
under the control of a master system to
drive a vehicle autonomously. Unfortunately, the names of these features
and their dashboard symbols are not
always standardised between manufacturers.
Adaptive Cruise Control is a system that automatically adjusts vehicle speed to maintain an appropriate
separation from the vehicle in front. It
uses sensors such as radar (typically
at 24GHz or 77GHz), lidar or binocular cameras (eg, Subaru’s “EyeSight”
system) to determine the distance to
the car ahead.
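A toy Python sketch of the core idea follows. The proportional gain, time gap and speeds are invented; real controllers add filtering, integral and derivative terms, and safety arbitration.

```python
# A toy adaptive cruise controller: track the set speed, but back off
# to maintain an assumed two-second following gap.

def acc_command(own_speed, gap_m, set_speed, time_gap_s=2.0, kp=0.5):
    """Return a speed command in m/s from current speed and measured gap."""
    desired_gap = own_speed * time_gap_s     # the gap we want, in metres
    correction = kp * (gap_m - desired_gap)  # negative if we are too close
    return min(set_speed, own_speed + correction)

# 27.8 m/s is ~100 km/h; a 40m gap at that speed is too close, so slow down.
print(f"{acc_command(own_speed=27.8, gap_m=40.0, set_speed=27.8):.1f} m/s")
```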
Adaptive Headlamps use a system
to automatically adjust the headlight
beam to avoid dazzling oncoming drivers (in theory, at least). The distance to
oncoming drivers, if any, is estimated
and the beam reach is adjusted appropriately. There is no binary high- or
low-beam in some systems; just a continuously variable range.
In one system by Mercedes, for
example, the beam reach is adjusted
between 65m and 300m, and adjustments are made every 40ms according
to information from a vehicle camera
that determines the distance to other
vehicles.
Anti-lock Braking Systems (ABSs)
are designed to prevent a vehicle from
skidding under hard braking, which
can both result in longer stopping distances and make steering ineffective. It
was originally introduced for rail vehicles in 1908 (although for a different
purpose; to improve brake effectiveness), and 1920 for aircraft, but it was
not universally adopted.
The widespread adoption of ABS for aircraft happened in the 1950s. These were hydraulic systems, but an electronic system was developed for the Concorde in the 1960s. The modern ABS system for cars was invented in 1971 by Fiat and has been used on many models since then. It has been required on almost all cars sold for decades now.
Fig.10: the AUTOSAR software architecture. The acronyms stand for VFB: Virtual Functional Bus; RTE: Runtime Environment; BSW: Basic Software. Original source: Fürst, Simon, “AUTOSAR – A Worldwide Standard is on the Road” – siliconchip.au/link/ac6t
Modern systems monitor the rotational speed of each wheel and compare that with the speed of the vehicle.
If one wheel is rotating slower than the
rest of the vehicle, the brake pressure
for that wheel is reduced, unless the
car is turning. Brake pressure can be
reduced or reapplied up to 15 times
per second, and each wheel can be
controlled individually.
In more modern vehicles, the ABS
system is also part of the electronic
stability control system.
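The per-wheel rule just described can be sketched as follows; the slip threshold and pressure steps are illustrative only, not any manufacturer's calibration.

```python
# One ABS control cycle: compare each wheel's speed to the vehicle
# speed and release brake pressure on any wheel that is locking up.
# Real systems run a loop like this around 15 times per second.

def abs_step(vehicle_speed, wheel_speeds, pressures, slip_limit=0.2):
    out = []
    for speed, pressure in zip(wheel_speeds, pressures):
        slip = (vehicle_speed - speed) / vehicle_speed  # 0 rolling, 1 locked
        if slip > slip_limit:
            out.append(pressure * 0.7)            # locking: reduce pressure
        else:
            out.append(min(pressure * 1.1, 1.0))  # reapply toward demand
    return out

# The first wheel has locked at 10 m/s while the vehicle does 25 m/s.
print(abs_step(25.0, [10.0, 24.5, 24.8, 24.6], [1.0, 1.0, 1.0, 1.0]))
```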
Automatic Emergency Braking uses
forward-looking vehicle sensors, such
as radar and lidar, to sense the distance and time to impact of a vehicle
or other obstacle. If the driver does not
brake in time, the brakes are automatically applied. This might also be used
in conjunction with automatic emergency steering (if fitted) if the braking
distance is insufficient.
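At the heart of such a system is a time-to-collision (TTC) calculation: range divided by closing speed. A minimal sketch, with an assumed intervention threshold:

```python
# Time-to-collision check for automatic emergency braking. The 1.5s
# threshold is an assumption, not any manufacturer's calibration.

def ttc_seconds(distance_m, closing_speed_ms):
    if closing_speed_ms <= 0:
        return float("inf")  # not closing on the obstacle
    return distance_m / closing_speed_ms

distance, closing = 18.0, 15.0  # from radar/lidar: metres and m/s
ttc = ttc_seconds(distance, closing)
if ttc < 1.5:
    print(f"TTC {ttc:.2f}s: apply emergency braking")
```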
Automatic Emergency Steering tries
to steer a vehicle away from an imminent collision. Hazards that can be
avoided include cars, cyclists, pedestrians, animals or road debris. Automatic emergency braking may also
be implemented. Decisions are made
based on inputs from radar, lidar, cameras, ultrasonic sensors etc.
The process for action, sketched in code after this list, is:
1. Detection: continuous monitoring from sensors
2. Assessment: the control module uses data from the sensors to determine the vehicle velocity, trajectory, distance to the obstacle etc
3. Decision: if a collision is determined to be imminent and cannot be avoided by emergency braking alone, the calculations are made for a steering manoeuvre
4. Action: the steering actuator is activated by the control module to steer the vehicle on a path calculated to avoid the obstacle and any other obstacles
5. Notification: the driver is notified of the action
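A skeleton of that five-step sequence might look like the following; the Obstacle class, thresholds and returned strings are hypothetical stand-ins for real sensor fusion and actuator interfaces.

```python
# Skeleton of the detect/assess/decide/act/notify sequence listed above.

from dataclasses import dataclass

@dataclass
class Obstacle:
    distance: float            # metres, from sensor fusion
    closing_speed: float       # m/s
    avoidable_by_braking: bool

def emergency_steer_cycle(obstacle):
    if obstacle is None:                         # 1. detection
        return "no action"
    ttc = obstacle.distance / max(obstacle.closing_speed, 0.01)  # 2. assessment
    if ttc < 1.0 and not obstacle.avoidable_by_braking:          # 3. decision
        return "steer along avoidance path; driver notified"     # 4 & 5
    return "brake only"

print(emergency_steer_cycle(Obstacle(9.0, 12.0, False)))
```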
There are various levels of Automated Parking, from basic to fully
automatic. For automated parking to work, the parking space needs to be ‘parameterised’ so that the appropriate vehicle direction, steering angle and speed can be computed – Fig.12 shows a reverse parking scenario. Other parking scenarios are possible, for example, right-angle parking.
Fig.12: the process of automatic parallel parking. Original source: “A novel control strategy of automatic parallel parking system based on Q-learning” – siliconchip.au/link/ac6u
Fig.13: possible parking scenarios for Volkswagen’s Parking Assist. Original source: Green Car Congress – siliconchip.au/link/ac6w
Volkswagen is one of many manufacturers who have developed automated parking, which they call “Parking Assist”, through three generations,
plus fully automatic parking.
Their first generation only allowed
for reverse parking into parallel
spaces, with a maximum of two moves,
and the target space had to be 1.4m
longer than the vehicle. Vacant parking spaces could be detected at up to
30km/h. It used ultrasonic sensors.
Their second generation could perform multiple manoeuvres to park, as
shown in Fig.13. It used cameras in the
side mirrors, at the front and the rear,
as well as ultrasonic sensors.
The third generation could park the
vehicle into a much smaller space and
detect vacant spaces at speeds up to
40km/h.
These Parking Assist modes correspond to SAE Level 1, and require
driver supervision. Beyond that, Parking Assist at SAE Level 4 provides
for fully automated parking with no
human intervention required.
Automated Valet Parking is a system developed by some manufacturers
for a car to park and retrieve itself in
certain parking garages. Infrastructure
is required at the car park, as well as
communication between the vehicle
and the car park via V2X technology
(see below) to receive instructions
and location information within the
car park.
For more on this, see the video at
https://youtu.be/30eB8Jj7xh0
Tesla also have an “Actually Smart
Summon” feature, where the car will
unpark itself and come to the driver
with the use of a smartphone app as
long as the car is within 65m of the
driver, with a clear line of sight, and
is not on a public road.
Automatic Wipers: rain-sensing wipers were invented by an Australian, Raymond J. Noack. Moisture is detected on the windscreen, and the wipers are activated at an appropriate speed and interval.
Fig.14: the operation of an automotive rain sensor. In the presence of raindrops, there is some loss in the strength of the infrared beam reflected. Source: https://w.wiki/ERxC
Fig.15: the output of the Tesla Driver Drowsiness Warning, which is not visible to the driver. Source: www.vehiclesuggest.com/tesla-hacker-figured-out-a-way-to-fool-tesla-camera-based-driver-monitoring-system
The rain sensor is typically located
in front of the rear-view mirror, and
monitors infrared light reflected back
from the outside surface of the glass,
as per Fig.14.
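Reduced to a sketch, the logic is simply 'the weaker the reflection, the more water on the glass, the faster the wipers'; the signal levels and speed bands below are invented.

```python
# Map the measured infrared reflection (relative to a dry-glass
# reference) to a wiper speed. All thresholds are illustrative.

def wiper_speed(reflected_ir, dry_level=1.0):
    loss = 1.0 - reflected_ir / dry_level  # 0 = dry glass, 1 = no reflection
    if loss < 0.05:
        return "off"
    if loss < 0.20:
        return "intermittent"
    if loss < 0.50:
        return "normal"
    return "fast"

print(wiper_speed(0.65))  # 35% of the beam lost -> "normal"
```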
Blind Spot Monitors use radar or
cameras to monitor a driver’s so-called
‘blind spot’ and provide a warning
before they attempt to move into it
if something is detected there (eg, a
motorbike).
Subaru’s EyeSight Camera system
uses a pair of stereo cameras and was
first launched in 1989. It is used for
Adaptive Cruise Control, but can also
provide sensory input for pre-collision
braking that detects cars, motorcycles,
bicycles and pedestrians.
In the USA, the system was found
to reduce rear-end crashes and injuries by up to 85%. Subaru is working to
integrate an AI judgement capability
into its EyeSight system.
Climate Control is a feature in most
vehicles now, providing both heating
and cooling. It is important for both
safety and comfort, for example, to
ensure that the windows remain clear
while driving. Some cars have automatic defogging features, including
some Kia and Hyundai models.
Collision Avoidance System is a system that monitors a vehicle’s speed,
the distance to the vehicle in front and
its speed, to provide a warning or take
corrective action if a collision is imminent. Sensors, such as radar and lidar,
are used to determine vehicle parameters, like speed and distance.
Automatic Emergency Braking and
Automatic Emergency Steering are
two possible systems that are used to
implement collision avoidance.
Crosswind Stability Control was
first used by Mercedes Benz from 2009
in some cars, then
later, vans and trucks.
A deviation caused by crosswinds can be automatically corrected with the vehicle’s ESC system by several methods, such as steering, torque vectoring to provide more drive force on the left or right side of the vehicle, or differential braking.
Fig.16: an algorithm flowchart for implementing electronic stability control (ESC). Original source: https://autoelectricalsystems.wordpress.com/2015/12/20/electronic-stability-programme-esp
Driver Drowsiness Detection uses
cameras and sensors, such as eye-tracking sensors, to monitor driver
behaviour and sound an alarm to alert
the driver if drowsiness is detected.
Drowsiness is detected by sensing behaviours such as yawning, eye
blinking rate, eye gaze, head movements, facial expressions and driving
behaviour, such as lane deviations and
speed variations. Machine learning
analyses behaviour patterns and learns
to identify behaviours corresponding
to drowsiness. The idea is to alert the
driver to rest before they fall asleep.
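One widely used metric of this kind is PERCLOS: the fraction of time the eyes are closed over a monitoring window. A minimal sketch, with an assumed alert threshold (real systems combine many cues, as described above):

```python
# PERCLOS: percentage of eye closure over a fixed window of samples.

def perclos(eye_closed_samples):
    """eye_closed_samples: booleans sampled at a fixed rate (~1 minute)."""
    return sum(eye_closed_samples) / len(eye_closed_samples)

samples = [False] * 80 + [True] * 20  # eyes closed 20% of the window
if perclos(samples) > 0.15:           # assumed alert threshold
    print("Drowsiness warning: take a break")
```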
Tesla Driver Drowsiness Warning
uses a camera to monitor the driver
and sounds an alert if drowsiness is
detected. Volkswagen monitors lane
deviations and steering movements
to detect drowsiness.
Other companies offering this feature include BMW (Attention Assistant), Citroën (AFIL/LDWS), Jeep
(Drowsy Driver Detection), Subaru
(Driver Monitoring System), Toyota
(Safety Sense) and Volvo (Driver Alert
System). Ford, GM, Hyundai and Kia also offer similar features.
Some fleet operators, such as trucking companies, install driver drowsiness detection systems in their vehicles that are centrally monitored by AI systems and/or humans.
Fig.15 shows the output of a Tesla
Driver Drowsiness Warning obtained
by @greentheonly as he tests the camera with different scenarios such as
“driver’s eyes nominal”, “driver’s
eyes down/closed/up”, “view of head
truncated”, “driver looking left/right”,
“camera dark/blinded”, “driver head
down”. You can see his video at https://
youtu.be/pZWR4MQBI4M
Driving Modes such as for snow, ice,
sand, hill ascent and descent control
etc are available on some vehicles. The
vehicle’s performance is optimised
via control algorithms with the throttle response, traction control, stability
control, transmission behaviour etc,
adjusted as required.
Electronic Stability Control (ESC)
expands on ABS by adding a steering
angle sensor and a gyroscopic sensor.
If the intended direction of the vehicle
doesn’t correspond to the actual direction it is travelling (ie, it is losing traction), the ABS system can individually
brake between one and three wheels to
bring the vehicle back into alignment
with its intended direction.
The steering wheel sensor also provides information for Cornering Brake
Control (CBC) to take into account the
differential rotational speed of the
wheels on the inside and outside of
the curve. A typical control algorithm
is shown in Fig.16.
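In miniature, the comparison can be sketched with the standard kinematic 'bicycle model' estimate of the yaw rate the driver is requesting; the wheelbase, dead band and braking strategy below are illustrative.

```python
import math

# Compare the yaw rate implied by the steering angle with the measured
# yaw rate, and brake one side of the car if they disagree.

def desired_yaw_rate(speed_ms, steer_rad, wheelbase_m=2.8):
    return speed_ms * math.tan(steer_rad) / wheelbase_m  # rad/s

def esc_action(speed_ms, steer_rad, measured_yaw, dead_band=0.05):
    error = desired_yaw_rate(speed_ms, steer_rad) - measured_yaw
    if abs(error) < dead_band:
        return "no intervention"
    # Understeer (turning less than asked): brake the inner rear wheel.
    # Oversteer (turning more than asked): brake the outer front wheel.
    return "brake inner rear" if error > 0 else "brake outer front"

print(esc_action(speed_ms=20.0, steer_rad=0.1, measured_yaw=0.4))
```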
Head-up displays (HUDs) convey information to the driver, such
as speed, the current speed limit, the
distance to the vehicle ahead, turns
for navigation etc. This information
is projected onto the windscreen; see
Fig.17.
Ice Warning is important in colder
climates as ice is often not visible on
the road (‘black ice’) and this is a serious safety hazard.
A variety of detection systems are
used, such as multispectral imaging
systems, to examine the road surface; thermal imaging systems; air
temperature and humidity measurement; weather data from external
sources; or information from vehicle-to-infrastructure (V2I) or vehicle-to-
vehicle (V2V) systems.
Intelligent Speed Adaptation (ISA) is a system that reads road signs or uses other data to ensure that the driver stays within the speed limit for that section of road. There may be a warning if the driver exceeds the limit, or the driver may be able to request that the car travel at or below the limit.
Intersection Assistance is when a vehicle is equipped with side-looking radar to detect vehicles approaching at right angles to the car; the brakes can be automatically activated to avoid a collision.
Lane Deviation (or Departure) Warning uses cameras to monitor lane markings, to warn a driver if they start to
depart from the lane they are in, or to
keep them in the centre of the lane even if they are not actively steering the vehicle.
Fig.17: a head-up display rendering showing various ADAS parameters. Source: www.eetimes.com/add-ar-displays-for-adas-safety
Fig.18: the Night Vision Assistant on an Audi A8. Source: https://w.wiki/ERxE
Fig.19: the live 360° camera view on a Mazda CX-9.
Lane Change Assistance uses sensors to detect if vehicles are in the
driver’s blind spots, and will alert the
driver if they are.
Navigation in an ADAS vehicle
may involve route recommendations
or alternatives, choice of toll or no
toll roads, advice on traffic congestion
etc. The vehicle may receive real-time
updates as conditions change, such as
traffic congestion forming. Position
information is obtained with GPS or
another GNSS system.
Night Vision is a system using infrared cameras to improve driver awareness at night or in poor conditions –
see Fig.18. The first car to be offered
with this technology was the 2000
Cadillac de Ville.
Do autonomous cars get confused?
This short video shows Waymo cars honking at each other: https://youtube.
com/shorts/PkVSoTZBh8U
This video shows a Waymo vehicle not taking the passenger where they
wanted to go on a simple trip: https://youtu.be/-Rxvl3INKSg
A police officer pulls over a Waymo: https://youtu.be/7W-VneUv8Gk
Omniview is a type of camera system that gives a 360° and/or bird’s-eye view of a vehicle. It is known by many other names, such as Surround View. It was first introduced on the 2007 Nissan Elgrand and Infiniti EX as “Around View Monitor”.
Video feeds from four to eight cameras are synthesised into a bird’s-eye view to assist drivers with parking, or to remotely view their vehicle and its surrounds – see Fig.19. There is quite a bit of processing required to convert the images from the cameras into a (mostly) seamless 360° image; a code sketch follows the list below. The steps include:
1. resizing the images
2. removing lens distortion
3. perspective transformation
4. stitching the images together
5. displaying the results
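Here is a condensed sketch of steps 1-4 using the OpenCV library on a synthetic frame; the camera matrix, distortion coefficients and homography are dummy values, since real systems calibrate these per camera.

```python
import cv2
import numpy as np

frame = np.random.randint(0, 255, (480, 640, 3), dtype=np.uint8)  # fake camera

frame = cv2.resize(frame, (320, 240))                # 1. resize
K = np.array([[300.0, 0, 160], [0, 300.0, 120], [0, 0, 1]])
dist = np.array([-0.3, 0.1, 0.0, 0.0, 0.0])          # assumed lens model
frame = cv2.undistort(frame, K, dist)                # 2. remove distortion
H = np.array([[1.0, 0.2, 0], [0, 1.0, 0], [0, 0, 1.0]])
topdown = cv2.warpPerspective(frame, H, (320, 240))  # 3. perspective transform
birdseye = np.hstack([topdown, topdown])             # 4. stitch (trivially here)
print(birdseye.shape)                                # 5. display would go here
```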
Such systems can also be retrofitted.
One example we found is the Taffio
360° Surround View Set (siliconchip.
au/link/ac6q).
Parking Sensors are usually ultrasonic rangefinders that give the driver
an audible (and visual) indication of
how close they are to objects. Typically, the closer the vehicle is to
an object, the faster it beeps. These
systems are often accompanied by a
rear-facing camera, which may have
lines marked on the image to assist
the driver with determining the path
of the vehicle in relation to obstacles.
Reversing Cameras are a common
feature now (required in new cars) and
a relatively simple one to implement.
The first known vehicle reversing camera was on the 1956 Buick Centurion
concept car. The first commercially
produced car to have one was the 1987
Toyota Crown in the Japanese market.
Temperature Sensors are used to
measure inside and outside temperatures, and may contribute to ice warning data or the operation of the climate
control system.
Traction Control is a system to
ensure that wheels don’t lose traction
with the road during heavy acceleration. Each wheel has a speed sensor,
and the speed data is sent to the ECU,
which compares it with the speed of
the vehicle. If there is a mismatch, taking into account if the car is cornering
or not, the engine torque is reduced or
a brake is applied on the wheel.
Traffic Jam Assist is a feature that
uses Adaptive Cruise Control and Lane
Departure Warning to take over driving in traffic jams. A safe distance is
maintained with the vehicle in front.
Traffic Sign Recognition uses a
camera to recognise traffic signs, such
as stop and speed limit signs, giving
appropriate warnings to drivers. Traffic sign recognition is facilitated by
the Vienna Convention on Road Signs
and Signals, which has attempted to
standardise road signs across various
countries, although Australia is not a
signatory.
Traffic sign recognition systems use
a variety of different algorithms, such
as recognising the board shape and
using character recognition to read the
writing. A further level of complexity
uses convolutional neural networks
(CNN), which are trained with real
signage and use deep learning to recognise various signs.
The output of the Freeman Chain
Code and shape determination of
the algorithm can also be used as an
input to CNNs. A typical sign recognition algorithm includes the following steps (sketched in code after the list):
1. capture an image of the sign(s)
with a colour camera
2. convert the image from RGB to
HSL (hue, saturation, lightness)
3. apply a Gaussian smoothing filter
4. detect edges using a Canny edge
detector algorithm
5. use a Freeman chain code algorithm to detect letters and numbers
6. use a polygonal approximation of
digital curves to detect the sign shape
7. display the result
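Steps 2 to 4 and 6 can be sketched with OpenCV as follows; the Freeman chain-code character recognition of step 5 is omitted, and the thresholds are typical defaults rather than tuned values.

```python
import cv2
import numpy as np

img = np.zeros((200, 200, 3), dtype=np.uint8)
cv2.circle(img, (100, 100), 60, (0, 0, 255), -1)  # stand-in for a red sign

hls = cv2.cvtColor(img, cv2.COLOR_BGR2HLS)        # 2. RGB -> HSL colour space
blur = cv2.GaussianBlur(hls[:, :, 1], (5, 5), 0)  # 3. Gaussian smoothing
edges = cv2.Canny(blur, 50, 150)                  # 4. Canny edge detection
contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_SIMPLE)
for c in contours:                                # 6. polygonal approximation
    poly = cv2.approxPolyDP(c, 0.02 * cv2.arcLength(c, True), True)
    shape = "circle" if len(poly) > 8 else f"{len(poly)}-sided"
    print("detected shape:", shape)               # 7. report the result
```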
Tyre Pressure Monitors use either
inferences from other data or direct
pressure measurements. For indirect
systems, parameters such as wheel
speeds, accelerometer outputs and
other vehicle data are used to make
inferences about tyre pressure, and
a warning is issued to the driver to
check pressures.
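In miniature, the inference works because an under-inflated tyre has a smaller rolling radius and so spins slightly faster at the same road speed; the 1% threshold below is an assumption.

```python
# Flag any wheel spinning measurably faster than the average of all four.

def soft_tyres(wheel_speeds_rpm, threshold=0.01):
    avg = sum(wheel_speeds_rpm) / len(wheel_speeds_rpm)
    return [i for i, rpm in enumerate(wheel_speeds_rpm)
            if (rpm - avg) / avg > threshold]

# Wheel 2 reads about 2% fast: probably under-inflated.
print(soft_tyres([750.0, 751.0, 766.0, 749.0]))  # -> [2]
```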
That is not as accurate as direct measurement systems, which use a sensor
in each wheel to determine the pressure. The sensor may either be battery-operated, which requires maintenance
to replace the battery, or may be wirelessly supplied with power like RFID
systems.
Wrong Way Driving Warning is a
system on some vehicles to alert the
driver if they are driving in a direction
which they are not meant to, as determined by GPS data. It doesn’t seem to
be widely implemented.
V2X stands for vehicle-to-everything and describes wireless communication between the vehicle and any
other vehicle or entity with which the
vehicle may interact. Vehicle to infrastructure (V2I) and vehicle to vehicle
(V2V) are related systems.
Operational Design Domain
The Operational Design Domain
(ODD) defines the set of conditions
such as environmental, geographic,
time of day etc under which the vehicle is certified to operate safely. In
other words, it is a recognition of the
limitations of the autonomous system.
If the situation in which the vehicle finds itself is outside of the ODD (for example, certain traffic or road conditions), it might warn the driver or passenger and deactivate itself to allow the driver to assume control. Alternatively, the vehicle may park itself.
Various standards and regulators
have defined the exact meaning of
ODD. An example is Mercedes Benz
stating the following for its Drive Pilot
Level 3 system for supervised autonomous driving, which is certified for
use in California and Nevada:
...requires speeds below 40 miles
per hour, clear lane markings, not
too much road curvature, clear
weather and lighting conditions, and
a high-definition map to be available
in the system’s memory...
Warning sounds
Electric autonomous vehicles can be
so quiet that pedestrians may not hear
them, so they are required to make a
sound at lower speeds. In Australia,
as of November 2025, all new electric,
hybrid and hydrogen-powered cars,
buses and trucks will be required to
be fitted with noise-making systems that emit a sound of at least 50dB when travelling below 20km/h. Similar laws apply in the EU,
Japan, the UK and the USA.
Legal liability for accidents
For SAE levels 0-3, the driver must
be able to take control of the vehicle
at any time, and they will be liable for
any accidents, as they should be constantly monitoring the vehicle, ready
to take control at any time.
For levels 4 & 5 vehicles, there is no
“driver”; they might not even have any
access to vehicle controls. It is unclear
who would be responsible for an accident that may occur.
Fully autonomous vehicles
We will now look at examples of
autonomous vehicles, starting with
one from Australia.
Australian road trains
Australian company Mineral
Resources (www.mineralresources.
com.au; MinRes) developed world-first autonomous road trains that can
haul 330 tonnes of iron ore along
150km of private road in Western Australia, from the Ken’s Bore mine site to
the Port of Ashburton. The trucks are
converted Kenworth models. There
are 150 trucks in the fleet, and they
drive at 80km/h.
There is an interval of 2-3 minutes
between each truck as they constantly
run along the road delivering iron ore.
Hexagon (https://hexagon.com) performed the conversions – see Fig.20.
According to their description, this
includes: a sensory system for awareness (truck performance, surroundings and location); an autonomy layer,
the brains for decision making; and
a by-wire system for controlling the
vehicle.
Table 1 – Tesla autopilot features (source: https://w.wiki/3wkp)
Feature | Autopilot | Enhanced Autopilot | Full Self Driving
Traffic-aware cruise control | ✔ | ✔ | ✔
Autosteer | ✔ | ✔ | ✔
Navigate on autopilot | ✖ | ✔ | ✔
Auto lane change | ✖ | ✔ | ✔
Autopark | ✖ | ✔ | ✔
Summon | ✖ | ✔ | ✔
Smart summon | ✖ | ✔ | ✔
Traffic & stop sign control | ✖ | ✖ | ✔
Autosteer on city streets | ✖ | ✖ | ✔
Fig.20: the world’s first autonomous road train, in Australia. Source: www.
mineralresources.com.au/our-business/onslow-iron-project/autonomous-road-trains
Buses and shuttles
The Apolong is a Level 4 driverless bus from China that has been in
production since 2017 – see Fig.21.
It travels at between 20km/h and
40km/h and can accommodate 14
people. It uses Baidu’s Apollo 3 Open
Driving Platform (https://github.com/
ApolloAuto/apollo).
Cars
Tesla is constantly updating the
software in its vehicles. It has a feature called “Autopilot” or “Enhanced
Autopilot” available in all its cars
produced since 2019, as well as some
vehicles offering “Full Self-Driving”
(FSD; supervised).
The capabilities of different versions of the software depend on the
Fig.21: the Apolong autonomous bus from China. Source: https://w.wiki/ERxF
Autonomous vehicle software
Few manufacturers have released the code for their autonomous cars, but the
Stanford Racing Team, the progenitor of Waymo One, released the code for
the vehicle that won the 2005 DARPA Grand Challenge event at:
https://sourceforge.net/projects/stanforddriving/
The vehicle ran this code, written in C and C++, on a Linux operating system
running on Pentium M CPUs.
Fig.22: a Tesla Hardware 3 (HW3) Full Self Driving (FSD) board. A lot of the
circuitry at the top and bottom is the power supply for the two large UBQ01B0
multi-core processors. Source: https://w.wiki/ERxG
Fig.23: an autonomous mining truck for transporting minerals. Source:
Fortescue Metals Group Ltd – www.mining-technology.com/features/australia-leads-the-way-in-autonomous-truck-use
market and local laws. Tesla classifies
these systems as SAE Level 2, possibly
for legal reasons, as FSD is arguably a
Level 4 technology (see Fig.26).
The FSD v12 software is available
for later vehicles with Hardware 4
(HW4; in Model S and Model Y after
January 2023).
It uses a neural network and artificial intelligence that has been trained
on millions of video clips. Older versions of the code were reliant upon
rule-based algorithms written in C++,
but later versions now use an ‘end-to-end’ neural network that constantly
learns and adapts.
End-to-end means that the entire
FSD system is a neural network, not
just parts of it. The high-level Python
programming language is used for
machine learning, with C++ for
embedded systems. The software all
runs under the Linux operating system.
Samsung makes the processor for
HW4, a custom ‘system on a chip’
(SoC) device that has 16GB of RAM
and 256GB of storage. The internals
of the HW4 computer can be seen at:
siliconchip.au/link/ac6l
siliconchip.au/link/ac6m
The second link states that HW4
is running Linux kernel 5.4.161-rt67.
Fig.22 shows a Tesla FSD board. We
can see that the main chips are labelled
UBQ0180. Wikichip (see siliconchip.
au/link/ac6n) states these are FSD
chips that incorporate “3 quad-core
Cortex-A72 clusters for a total of 12
CPUs operating at 2.2 GHz, a Mali G71
MP12 GPU operating 1 GHz, 2 neural
processing units operating at 2 GHz,
and various other hardware accelerators. The FSD supports up to 128-bit
LPDDR4-4266 memory”.
Each chip contains 6 billion transistors. As it was first shipped in Teslas
in 2019, we believe this unidentified
board is a Hardware 3 or HW3 board.
Table 1 illustrates the capabilities of
Tesla’s Autopilot, Enhanced Autopilot
and Full Self Driving.
Fig.24: the Liebherr T 264 battery-electric autonomous mining truck, jointly
developed with Fortescue. Source: Liebherr – siliconchip.au/link/ac6v
Mining vehicles
Australia is the world leader in the
use of autonomous mining trucks –
see Fig.23. As of May 2021, we had
575 such vehicles, compared to 143 in
Canada, 18 in Chile, 14 in Brazil, 12
in China, 7 in Russia, 6 in Norway, 5
in the USA and 3 in Ukraine.
Fortescue and Liebherr jointly developed an autonomous battery-electric
T 264 truck, resulting in an order for
475 Liebherr machines. The T 264 is
8.6m wide, 14.2m long, 7.2m high with
the dump body on and can carry a payload of 240 tonnes. The truck itself
weighs 176 tonnes.
The prototype truck (Fig.24) has a
1.4MWh battery weighing 15 tonnes
that’s 3.6m long, 1.6m wide and 2.4m
high. It’s made up of eight sub-packs,
each consisting of 36 modules. It can
regeneratively charge as it goes downhill.
Taxis
Waymo One (https://waymo.com/
waymo-one) is an autonomous taxi
service currently available in the US
cities of Austin, Los Angeles, Phoenix and San Francisco, with Atlanta and Miami coming soon. Waymo is a subsidiary of Alphabet Inc, Google’s parent company.
Waymo vehicles have been under development since 2015; in 2020, the company began offering the self-driving service without safety drivers present in the car. The company traces its origins to the 2005 and 2007 US Defense Advanced Research Projects Agency (DARPA) Grand Challenge competitions and the Stanford Racing Team, which won first place in 2005 and second in 2007.
Waymo have applied their self-driving technology to several vehicle platforms; currently, they use Jaguar I-Pace EVs (Fig.25) under a partnership with Jaguar, at an estimated additional cost of US$100,000 ($156,000) per vehicle. As of May 2025, approximately 1500 autonomous Waymo One vehicles were in service, mostly the I-Pace.
Waymo vehicles are twice as safe as
human drivers according to accident
statistics, but have nevertheless been
involved in incidents, mostly minor.
A Waymo One taxi can be summoned via an app.
Amazon’s Zoox (https://zoox.com)
could be considered a ‘competitor’ to
Waymo. They are also an autonomous
taxi service operating in California
and Las Vegas, Nevada. Their vehicles
are fully electric and have no steering
wheel (see Fig.27).
Fig.25: Waymo’s modified Jaguar I-Pace EV. I-Paces have been discontinued, but
Waymo acquired a large number and continues to deploy them. Source: https://
waymo.com/blog/2018/03/meet-our-newest-self-driving-vehicle
Fig.26: a screenshot taken from an example video of Tesla’s FSD (Full Self-Driving). Source: www.tesla.com/fsd
Further reading
More details on some of these ADAS
systems can be seen in our features on
Automotive Electronics, December
2020 and January 2021 (siliconchip.au/Series/353). SC
Fig.27: an Amazon Zoox robotaxi, which is designed as a fully autonomous taxi (see https://zoox.com). Source: https://w.wiki/ESSv