HUMANOID & ANDROID ROBOTS
Part 2: by Dr David Maddison, VK3DSM

Robots featured on the opening pages: Agility Digit (www.agilityrobotics.com), Boston Dynamics Atlas (https://bostondynamics.com/atlas), Unitree H1 (www.unitree.com/h1), Tesla Optimus (www.tesla.com/en_eu/AI), Figure 02 (www.figure.ai), 1X NEO Gamma (www.1x.tech/neo), Apptronik Apollo (https://apptronik.com/apollo) and Booster Robotics T1 (www.boosterobotics.com/robots/).
Like many ideas that started as science fiction, humanoid and android robots are now a reality. They have not yet been perfected – but they are here. Last month we covered the tech, and now we showcase some of the most interesting robots.
We will now look at some historical humanoid robots, followed by those that are under active development or are available commercially.
Leonardo da Vinci’s mechanical knight
Around 1495, Leonardo conceived
a “mechanical knight” that could perform actions such as moving its arms,
neck & jaw, raising its visor, sitting and
standing (Fig.18). It was operated with
gears and pulleys but obviously had
no electronic intelligence.
The significance of this robot is that
it is thought to be the first demonstration of the ability to mimic human-like
actions by mechanical means. Leonardo’s sketches for this machine were
rediscovered in the 1950s by Carlo
Pedretti and, in 1993, Mark Rosheim
collaborated with him to reconstruct
the robot.
As designed by Leonardo, it has
three degrees of freedom in its legs
and four in its arms. Pedretti described
it as “the first articulated humanoid
robot in the history of Western Civilisation”. There is a BBC audio presentation about the robot at www.bbc.co.uk/sounds/play/m0004mf2
Elektro
Elektro was an early humanoid robot
built by Westinghouse in 1937 for the
1939/1940 World’s Fair in New York.
It was extremely impressive and popular at the time, and it was featured in
daily shows at the fair – see Fig.19.
It was 2.1m tall, weighed 120kg, could perform 26 actions and had 700 words stored on eight 78 RPM records in its chest cavity. The 26 actions were hard-wired in a fixed sequence, so the same script from the handler resulted in the same sequence of actions at every show.
The robot gave the illusion of being
voice-controlled, but electronics of
the era were not advanced enough to
perform voice recognition. The robot
responded to the controller’s voice,
but only to the rhythm. For example, speaking one word might correspond to one impulse to close a
relay. Two words would correspond
to two impulses which might perform
another electro-mechanical action.
It could also recognise red or green colours with a photocell, and state the colour. See the video titled “Elektro the Smoking Robot (Odd History)” at https://youtu.be/sxGDdwbcfJg and “The World’s First Celebrity Robot” at https://youtu.be/dwBLOluOiUY

Fig.18: a reconstruction of Leonardo’s “mechanical knight” robot. Source: https://w.wiki/EotZ

Fig.19: the Elektro robot. Source: www.computertimeline.com/timeline/the-robots-ofwestinghouse
The system of voice control had its origins in the work of Westinghouse engineer Roy James Wensley, who was issued a patent in 1929 for a “supervisory control system” that enabled the control of machinery over telephone or radio connection by “voice”.
It wasn’t truly voice recognition, but a series of tones generated with pitch pipes or tuning forks at 600, 900 and 1400Hz. These activated relays via tuned circuits, enabling equipment at sites such as a power plant or telephone exchange to be controlled remotely.
The system was demonstrated with an earlier robot called Herbert Televox, which preceded Elektro. Presumably, this system was modified for Elektro to use impulses instead of tones.
This system of tone control became
the basis of the DTMF (dual-tone
multi-frequency) signalling used in
traditional telephone exchanges.
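To make the idea concrete, here is a minimal sketch (in Python, purely as a modern illustration; Wensley’s system used tuned circuits and relays, not software) of working out which of the three command tones is present in a block of audio samples, using the Goertzel algorithm as a stand-in for the tuned circuits. The sample rate, block length and threshold are assumptions.

# Detect which of three command tones (600, 900 or 1400Hz) dominates a
# block of audio samples. A toy illustration only, not a historical circuit.
import math

SAMPLE_RATE = 8000          # Hz, assumed
TONES = (600, 900, 1400)    # Hz, as described in the article

def goertzel_power(samples, freq, sample_rate=SAMPLE_RATE):
    """Return the relative power of one frequency in a block of samples."""
    n = len(samples)
    k = round(n * freq / sample_rate)       # nearest frequency bin
    coeff = 2 * math.cos(2 * math.pi * k / n)
    s_prev, s_prev2 = 0.0, 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2**2 + s_prev**2 - coeff * s_prev * s_prev2

def detect_tone(samples, threshold=1e4):
    """Return the detected tone in Hz, or None if nothing is loud enough.
    The threshold depends on block length and signal level (assumed here)."""
    powers = {f: goertzel_power(samples, f) for f in TONES}
    best = max(powers, key=powers.get)
    return best if powers[best] > threshold else None

# Example: a synthetic 900Hz burst should trip the 900Hz 'relay'.
block = [math.sin(2 * math.pi * 900 * t / SAMPLE_RATE) for t in range(400)]
print(detect_tone(block))   # -> 900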
Current humanoid and android robots
We do not have space to include a description of every available humanoid robot, so we have selected the most interesting ones.
1X Technologies
www.1x.tech
The Neo Gamma is a humanoid
robot under development by a Norwegian AI and robotics company,
intended for domestic service (Fig.20).
Neo Gamma uses artificial intelligence
and is trained on household tasks
using human motion-capture data. It is covered in a knitted fabric to give it a friendly appearance. EVE is another humanoid robot from 1X.

Fig.20: the Neo Gamma from 1X Technologies performs domestic duties. Source: www.therobotreport.com/1xbuilt-humanoid-neogamma-better-fit-home

Fig.21: Australian Abi companion robots. Source: www.dromeda.com.au/product
Abi
www.dromeda.com.au
Abi is an Australian humanoid
robot, made in Melbourne – see Fig.21.
It is intended as a companion robot
with “playful features and infinite
empathy”, for use in nursing homes
and similar care facilities.
It uses advanced AI and machine
learning to recognise faces, understand
and express emotions and remember
conversations from days or months
ago.
It interacts with each resident
based on their personal cues, including speaking their preferred language.
Abi is fluent in 90 languages and can
participate in small group activities
such as singing, dancing, games and
conversation.
Actroids
www.kokoro-dreams.co.jp
The Actroid-DER series of android
robots is designed with strong human
likeness by Osaka University and manufactured by the Kokoro Company Ltd.
Actroids were first displayed in 2003,
so they are relatively old, and various
models are available to rent.
A model from the Actroid-DER series is shown in Fig.22. It performs simple functions like blinking and speaking using AI. Later models have 47 actuators, which are pneumatically driven.
For more information, see the video at
https://youtu.be/l8qHXdKF300
AgiBot
www.agibot.com
This Chinese company has developed the AI-powered Genie Operator-1
(GO-1) model for quickly training
humanoid robots in a variety of tasks
– see Fig.23. According to AgiBot, the
model enables robots to “understand
instructions in natural language and
perform reasoning, rather than being
limited to preprogrammed routines”.
AgiBot also produces a variety of
robots, such as the AgiBot A2 interactive service robot. The open-source
AgiBot X1 robot is documented at
www.agibot.com/DOCS/OS
Apollo A1
https://apptronik.com
Apptronik from Austin, Texas, has
developed the Apollo A1 general-
purpose humanoid robot, intended
for jobs in warehouses, manufacturing
plants and so on – see Fig.24.
As the robot is further developed, its
use will be extended to areas like construction, oil and gas extraction, electronics production, retail, home delivery, aged care and others. Apptronik
has origins in the development of
robots for NASA, such as Valkyrie
(described below).
Apollo is 172cm tall, has a run time
of four hours with a swappable battery
pack, weighs 72kg, can carry 25kg, is
modular and the torso can even be
mounted on a wheeled or stationary
platform. It has a chest-mounted display that shows its status.
Apollo can be tethered to a power
supply to allow continuous operation.
It has ‘force control’ systems to limit
the amount of mechanical force it
can impose to enhance safety around
humans. It walks within a defined
perimeter, so it does not get too close
to people or objects. It immediately
pauses when moving objects are
detected in its vicinity.
Mercedes Benz has an agreement to
test Apollo in its manufacturing operations. See the video at https://youtu.be/TfUOg38iXxo
ASIMO https://global.honda/en/robotics/
Apart from cars, Honda is famous for
one of the earliest humanoid robots,
ASIMO (Advanced Step in Innovative
Mobility) – see Fig.25. It was retired
in 2018. Honda researchers investigated the following with ASIMO:
O Moving around while sharing the
same space with people.
O Performing tasks using its hands.
O Interacting with people, including understanding spoken words and controlling movement/behaviour by estimating the intention of nearby people.

Fig.22: an Actroid-DER series android. Source: www.kokoro-dreams.co.jp/english/rt_tokutyu/actroid/

Fig.23: AgiBot robots running GO-1 perform some kitchen tasks. Source: https://youtu.be/9dvygD4G93c
Over its 20 years of demonstrations,
it walked a total of 7907km.
Honda continues to develop robotics, but with a focus on developing a
variety of robots with specific functions rather than just one general-
purpose humanoid robot.
Fig.24: the Apptronik A1 general-purpose robot. Source: https://apptronik.com
Digit
www.agilityrobotics.com
Agility Robotics from Albany, Oregon, USA has its origins in Oregon
State University. In 2023, they released
the current version of their bipedal
robot, Digit – see Fig.26.
Digit is said to be the world’s first
commercially deployed humanoid
robot. They are currently in operation at Amazon and GXO Logistics.
Amazon has also stated they intend
to replace their entire human workforce by 2030. The robots cost around
US$250,000 ($380,000) each. See
https://youtu.be/NvYsGcQvMw8 and
https://youtu.be/MYzRPJ7TaLc
AheadForm
www.aheadform.com
Heads capable of expression are
among the most complicated things to
build for such robots, apart from the
software. Just as motor vehicle manufacturers might contract out specialists
to make certain components, the same
concept can apply to robots.
AheadForm specialises in making
robot heads capable of a wide variety of human-like facial expressions,
including subtle but important ones
such as smirks – see Fig.27. These
highly realistic heads are said to avoid
the “uncanny valley” in which human-like heads are seen as not quite realistic enough, and thus disturbing to
the viewer.
AiMOGA Robot
Artificial Intelligence with
Multi-Objective Genetic Algorithm
was the robot’s original name, but it
is now called Mornine. Its job is as an
intelligent sales consultant for some
Chery car dealerships, with about 220
to be delivered – see Fig.28.
The robot is manufactured by Chery.
It uses a ‘multimodal sensory model’
to perceive a customer’s gestures and
questions by combining speech, visual and other sensory data. It
then uses an “advanced emotion and
personality engine” for personalised
interactions.
Fig.25: Honda’s ASIMO (Advanced Step in Innovative Mobility)
showcased at the Tokyo Motor Show in 2011. Source: https://w.wiki/EpwP
Fig.26: a demonstration of
Digit robots from Agility
Robotics at work in a
warehouse.
Fig.27: the AheadForm
robot head (left) and
its creator (right).
Source: https://youtu.be/gnyFtUU-TJ8
Fig.28: a Mornine robot
at a Chery dealership.
Source: https://motorillustrated.com/chery-debuts-humanoidrobot-mornine-to-dealerstheir-future-salesreps/153657/
It takes advantage of DeepSeek’s AI
and CheryGPT’s large language models
to understand natural language, give
appropriate responses and chat in any
of ten languages.
The robot can be used for purposes apart from car sales, such as working as a bookshop assistant, in other sales roles, as a shopping guide, as a companion or as a caregiver. The robot is 166cm tall and
weighs 55kg. Chery intends to launch
the robot commercially in 2027. One
estimate is that the robots will cost
around $89,000 each.
Fig.29: Alter3 after being instructed to
“take a selfie”. Source: https://arxiv.org/pdf/2312.06571
Fig.30: the Ameca robot. Source:
https://engineeredarts.com/gallery/
Fig.31: the Berkeley Humanoid Lite.
Fig.32: the Booster T1
robot. Source: www.boosterobotics.com/robots
Alter3 https://tnoinkwms.github.io/ALTER-LLM
Alter3 is an experimental humanoid
robot from the University of Tokyo that
uses the GPT-4 large language model to
interpret instructions, then produces
separate code to generate the required
motions – see Fig.29.
There is a video showing the implementation of this code at https://youtu.be/l4d6N_Rf8mk
Ameca
https://engineeredarts.com
A humanoid robot by UK company
Engineered Arts (Fig.30), Ameca is
designed for interaction with humans;
its legs are still under development.
Despite that, Ameca has advanced
facial expression capabilities,
extremely advanced emotional intelligence and conversational capabilities in multiple languages. It runs an
in-house-developed operating framework called Tritium; for its language
model, it uses OpenAI’s GPT 4.0 or can
be controlled by teleoperation.
It can watch, listen, learn, track
faces in real time, analyse the emotional state of someone speaking to it
and respond appropriately. Ameca is
available for purchase or rental. See
https://youtu.be/b9xFM61KKyc
Fig.33: the latest version of Boston
Dynamics’ Atlas robot. Source:
https://bostondynamics.com/atlas/
Berkeley Humanoid Lite
Berkeley Humanoid Lite (https://lite.berkeley-humanoid.org) is an
open-source, customisable 3D-printed
humanoid robot – see Fig.31. It is 88cm
tall and weighs 16kg. Its developers
state that it can be made for around
US$5000 ($7650) with readily available components.
It can perform several useful tasks,
including manipulating a Rubik’s
Cube, and it can also be teleoperated.
To make it, you need a 3D printer that
can produce parts within a 20 × 20 ×
20cm workspace. For more details, see
https://youtu.be/dIdJGkMDFl4
Fig.34: Atlas picks and places parts in an experimental situation. Source: https://bostondynamics.com/video/pick-carry-place-repeat

Fig.35: the torso of the Clone android showing anatomical similarity to a human. Source: https://x.com/clonerobotics

Fig.36: the Pudu D9 robot. Source: www.pudurobotics.com/en/products/d9
Booster T1
www.boosterobotics.com
Booster Robotics is a Chinese company that has developed a humanoid
robot platform intended for developers to write their own software for –
see Fig.32. There is an online manual describing the robot with details
for developers at siliconchip.au/link/ac7h
D9
www.pudurobotics.com
Pudu Robotics is developing the
general-purpose D9 robot – see Fig.36
and the video at https://youtu.be/gd5DdfJX_RM
Boston Dynamics Atlas
Boston Dynamics (website: https://bostondynamics.com) is a subsidiary of Hyundai. It was one of the first
companies with a functional humanoid robot called Atlas, and there are
many impressive videos of its products on YouTube.
The latest version of Atlas is shown
in Fig.33. Unlike previous versions,
which were hydraulically actuated,
this one is fully electric. That makes
it quieter, less complicated and more compact, with more natural movements and, some say, less intimidating.
Computationally, Atlas uses NVIDIA’s Isaac GR00T framework and the
NVIDIA Jetson Thor computing platform, which is specifically designed
for humanoid robots.
It uses the Blackwell GPU architecture with 2560 CUDA cores, 96 tensor
cores and Arm Neoverse V3AE CPUs
with 14 cores and 128GB of LPDDR5X
memory, giving up to 2070 FP4 teraflops of AI computing power (FP4 refers to 4-bit floating-point operations).
Hyundai are testing Boston Dynamics Atlas robots for building electric
cars in their Georgia, US manufacturing facilities and plan to roll them
out globally. A demonstration of Atlas picking and placing a part is shown in Fig.34.
Clone
https://clonerobotics.com
Polish company Clone Robotics
developed Clone, which they describe
as an android. Clone adopts a different
approach from other humanoid robots.
It seeks to emulate the actual structure
of the human body, with anatomically
accurate bones, joints, muscles and a
nervous system – see Fig.35.
It is hydraulically actuated with
a 500W pump. It has an unusually
large 200 degrees of freedom, 206
bones, 1000 artificial muscles (known as myofibres) and 200 sensors. Myofibres are composed of mesh
tubes that contain balloons of hydraulic fluid. These are inflated or deflated
by hydraulic pressure. The aforementioned pump acts as the ‘heart’ of the
system.
The idea is based on the McKibben artificial muscle concept first
invented in the 1950s. For cooling, it
even ‘sweats’ fluid, just like a human.
In essence, Clone seeks to mimic the
human body.
Clone runs on an NVIDIA Jetson
Thor inference GPU in the skull running Cybernet, Clone’s visuomotor
foundation model. See Clone Robotics’ YouTube channel (www.youtube.com/@CloneRobotics).
Certis
www.certisgroup.com
Certis is a Singapore company that
is using an Agibot humanoid robot
to study potential applications for
humanoid robots in security and integrated facilities management.
EveR-4 android
The EveR-4 is
an android that
was developed at
the Korea Institute of Industrial
Technology and
was exhibited at
RoboWorld 2011
in Seoul.
It is one of a series of four such
androids. It can exhibit a wide range of
facial expressions and has over thirty
motors actuating its face.
The EveR-4 has been investigated
for use in jobs such as a medical receptionist, where the robot was positively
received by patients. See the videos at https://youtu.be/OvsZqPcnNIE
and https://youtu.be/b2GtBAPG1ho
(EveR-4 as a receptionist).
FALCON
FALCON is an experimental robot control framework from Carnegie Mellon University, designed to train humanoid robots for a complex task called “force-adaptive loco-manipulation”. This is walking or standing and using their arms at the same time, while applying strong, precise force. Examples include pushing a wheelbarrow, opening a door or engaging in a tug-of-war with another robot. Two AI agents are used to accomplish such tasks, one for the legs and the other for the arms; they communicate with each other. The system has been tested on humanoid platforms from Unitree and Booster Robotics. See https://youtu.be/OfsvJ5-Fyzg
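As a rough illustration of the two-agent split (a toy sketch only, not CMU’s FALCON code; the masses, heights and the simple moment-balance rule are assumptions), the arm agent can pass its intended push force to the leg agent, which shifts the torso to keep the robot balanced:

# Toy two-agent coordination: the arm policy decides how hard to push and
# tells the leg policy, which leans the body to cancel the tipping moment.
from dataclasses import dataclass

GRAVITY = 9.81                      # m/s^2

@dataclass
class ArmCommand:
    push_force_n: float             # forward force the hands will apply, in newtons

class ArmAgent:
    """Stand-in for the arm policy: apply the force the task demands."""
    def act(self, task_force_n: float) -> ArmCommand:
        return ArmCommand(push_force_n=task_force_n)

class LegAgent:
    """Stand-in for the leg policy: keep balance by shifting the centre of mass."""
    def __init__(self, mass_kg: float = 60.0, hand_height_m: float = 1.0):
        self.mass_kg = mass_kg
        self.hand_height_m = hand_height_m

    def act(self, arm_cmd: ArmCommand) -> float:
        # The object pushes back on the robot, tending to tip it backwards.
        # Counteract the tipping moment F*h by moving the centre of mass
        # forward a distance d so that m*g*d = F*h.
        return arm_cmd.push_force_n * self.hand_height_m / (self.mass_kg * GRAVITY)

arms, legs = ArmAgent(), LegAgent()
for force in (0.0, 50.0, 120.0):        # the task demands more and more force
    cmd = arms.act(force)               # arm agent decides the push...
    lean_m = legs.act(cmd)              # ...and tells the leg agent, which compensates
    print(f"push {cmd.push_force_n:5.1f} N -> shift centre of mass {lean_m*100:4.1f} cm forward")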
Figure AI
www.figure.ai
An American robotics company, its
investors include OpenAI, NVIDIA
and Jeff Bezos.
It is aiming to build robots that learn
and reason like humans. Their most
advanced robot is Figure 02 (shown
in Fig.37), which runs the Helix generalist vision language action (VLA)
model.
The aim of Figure AI is to enable
robots to work in an unstructured
home environment with thousands
of objects it has never encountered
before, and to reason how to deal with
them without prior programming or
demonstrations.
Helix can also run on two robots
simultaneously, enabling them to
cooperatively solve problems and
share learning. This is shown by putting groceries away in the video at
https://youtu.be/Z3yQHYNXPws
Figure 02 with Helix is also
intended for industrial use, where it
will be introduced first, before being
brought to the domestic market. An
industrial environment is much more
structured than a home environment,
and should be easier to operate in.
According to Figure AI: VLM
(vision language model) backbones
are general, but not fast, while
robot visuomotor policies are fast
but not general. Helix resolves
this trade-off through two
complementary systems,
trained end-to-end to communicate:
O System 2 (S2) is an
onboard internet-pretrained
VLM operating at 7-9Hz for
scene understanding and
language comprehension,
enabling broad generalisation across objects and
contexts.
O System 1 (S1) is a fast
reactive visuomotor policy that
translates the latent semantic representations produced
by S2 into precise continuous
robot actions at 200Hz.
Fig.37: Figure AI’s Figure 02
robot uses a VLA model for
control. Source: www.figure.ai
This decoupled architecture allows
each system to operate at its optimal
timescale. S2 can “think slow” about
high-level goals, while S1 can “think
fast” to execute and adjust actions in
real-time. For example, during collaborative behaviour, S1 quickly adapts
to the changing motions of a partner
robot while maintaining S2’s semantic objectives.
Gary
www.hospital-robots.com
Gary is a wheeled hospital robot
from Israeli company Unlimited
Robotics – see Fig.40. It can perform
tasks such as providing bedside assistance to staff, giving medication reminders, providing companionship to patients, delivering supplies, sanitising, cleaning and automating mundane tasks.
It runs on an Intel Core i7 processor
with 512GB of memory and has a Google tensor processing unit (TPU) for neural network machine learning. For software, it uses the Linux
operating system with the in-house
developed Ra-Ya platform, which is
designed to make it easier for inexperienced robot programmers to build
applications if they are familiar with
Python or JavaScript.
Geminoid HI-6
In 2006, Professor Hiroshi Ishiguro
of the Intelligent Robotics Laboratory at The University of Osaka made an android replica of himself, the Geminoid HI-1. It has since been upgraded to the Geminoid HI-6 – see Fig.42.

Fig.38 (below): the HUBO2 robot. Source: www.rainbowrobotics.com/en_hubo2
It is an upper body only and is teleoperated. It is pneumatically operated
with 16 actuators and has 16 degrees
of freedom.
The purpose of the android is to
explore questions of “what exactly
human presence is, whether human
presence can be transmitted to remote
locations, and whether androids can
surpass humans through experimentation”.
GR-1 & GR-2
www.fftai.com
Fourier is a Chinese company
that has developed the GR-1 and
GR-2 humanoid robots. The GR-2 is
a general purpose robot with sensors in its dextrous hands to make
them touch-sensitive. It can be programmed with frameworks like ROS
and NVIDIA Isaac Lab. See the video
at https://youtu.be/N7qYcOuR7P8
HMND 01
https://thehumanoid.ai
Humanoid has developed the
HMND 01 robot, which they describe
as a “labour automation unit” (see
Fig.39). It is 175cm tall, weighs 70kg,
has 41 degrees of freedom, a payload
capacity of 15kg, a runtime of four
hours and walking speed of 5.4km/h.
Uses for the robot include goods
handling, such as in warehouses,
picking and packing for e-commerce warehouses and parts handling in manufacturing operations.
HRP-5P
An experimental
humanoid robot developed in Japan in 2018
by the National Institute
of Advanced Industrial
Science and Technology, it was demonstrated
installing wall plasterboard sheeting similar
to Gyprock. See https://youtu.be/fMwiZXxo9Qg
HUBO www.rainbow-robotics.com
Rainbow Robotics is a
Korean company that offers
the HUBO2, shown in Fig.38, which they claim is the world’s first humanoid robot to be commercialised.

Fig.39: the HMND 01 robot. Source: TechNode – siliconchip.au/link/ac7p
The first HUBO was released in 2004,
with the HUBO2 going on sale in 2010.
It has made commercial appearances,
such as at the 2012 Philadelphia Phillies baseball game.
iCub
https://icub.iit.it
iCub is an open-source research and
recreational robot designed by a consortium of European universities for
research into human cognition and
artificial intelligence – see Fig.41. It
is 104cm tall and weighs 22kg.
It has been demonstrated with capabilities such as crawling, solving 3D
mazes, performing archery, producing
facial expressions, exercising force
control, grasping small objects, and
performing collision avoidance. It is
said to use a neuromorphic processor.
Fig.40: Gary, the hospital robot, enters
a room. Source: www.hospital-robots.com/about
Fig.41: iCub at the Center for Robotics
and Intelligent Systems (CRIS).
Source: https://w.wiki/Eotd
Iggy Rob
www.igus.com
A partially humanoid robot; instead
of having legs, it has a wheeled base
(see Fig.43). Its suggested applications
include performing wait staff duties,
such as delivering food and drinks to
restaurant customers, parts delivery on
factory floors, and for education and
research into robotics. It costs about
US$54,500 ($83,500), is 1.7m tall and
can carry 100kg.
InMoov
https://inmoov.fr
The InMoov humanoid robot project is for hobbyists and universities,
with a whole community of developers, including in Australia and New
Zealand – see Fig.44. It is open source
and can be 3D printed on any standard
printer with a 12 × 12 × 12cm area.
It utilises two Arduino Mega or
Arduino Uno microcontroller boards,
two Nervo Board shields and 28 servo
motors. It has two cameras for object
and face tracking, speakers for speech,
one Kinect sensor (discontinued) or
OAK-D-Lite-FF for 3D depth and gesture recognition, and a PIR sensor for
presence detection. All of its fingers
are motorised.
Iron
Iron from car manufacturer Xpeng
(siliconchip.au/link/ac7i) uses their
proprietary 40-core Turing AI chip
with 3000 TOPS (trillions of operations per second) of processing power
and their Tianji AIOS AI operating system, which is also used in their cars.
It features 60 joints, 200 degrees of freedom, is 173cm tall and weighs 70kg. It is already being used to produce Xpeng cars – see Fig.48.

Fig.42: Professor Hiroshi Ishiguro with the android robot replica he made of himself, Geminoid HI-6. Which one is which? Source: https://drive.google.com/drive/folders/1RN710FOs7r9KJ2TmXkh-W_0cqcHUtlNj

Fig.43: the Iggy Rob robot. Source: www.igus.com/automation/news/humanoid-robot

Fig.44: the torso of InMoov. Its lower legs have not yet been developed. Source: https://inmoov.fr/gallery-v2/

Figs.45, 46 & 47: the Kuavo 4.0 robot (source: www.lejurobot.com/en); the NASA Valkyrie or R5 robot, which is being tested in Australia (source: https://x.com/CWeezyeth/status/1643599326650986496/photo/1); and a range of humanoid industrial robots from Persona AI: a miner, builder, welder, fabricator and assembler (source: https://personainc.ai).
Kuavo 4.0
www.lejurobot.com/en
Kuavo is a product of Leju Robotics,
designed as a general-purpose humanoid robot for applications such as personal assistance and industrial automation (Fig.45). It utilises Huawei’s
HarmonyOS and PanGu multimodal
large language model for AI.
It is not clear whether Kuavo uses 5G-A positioning, but other products from this company are the world’s first to use it; the technology uses 5G for high-precision location within indoor spaces.
NASA
The 2011 NASA Robonaut2 or R2
was the first humanoid in space, tested
aboard the International Space Station
(ISS). It was initially only a torso, but a
mobility platform was added in 2014.
It had significant technical problems, so it was not used; it was returned to Earth in 2018.
The NASA Valkyrie (see Fig.46),
also known as R5, is still under active
development. It is also being tested
in Australia by Woodside Energy
“to develop remote mobile dexterous manipulation capabilities for
uncrewed and offshore energy facilities”. It is 1.8m tall, weighs 136kg and
runs on three Intel Core i7 CPUs. It is
not currently deployed in space.
Nurabot
www.foxconn.com/en-us
Foxconn, in cooperation with NVIDIA, has developed a robotic nurse with Taiwan’s Taichung Veterans General Hospital called Nurabot (Fig.49). It has perception and navigation capabilities, understands language and has an ability to adapt.
It is intended to address labour
shortages, monitor patients’ vital
signs, address caregiver burnout,
move patients, deliver meals and
medication, offer companionship,
turn patients over in bed and learn
patients’ habits. It does this with high-
resolution sensors, autonomous navigation capabilities and NLP (natural
language processing).
In trials, the nurses love it because
it takes over repetitive tasks and gives
them more time for non-routine tasks.
Patients like it as well, as the robot is
always there for them.
Optimus
www.tesla.com/en_eu/AI
Tesla has developed the Optimus Gen 2 general purpose humanoid robot shown in Fig.50. The status of Gen 3 shown in that figure is unclear.

Fig.48: Xpeng’s Iron robot making cars. Source: https://baa.yiche.com/xiaopengP7jia/thread-51117011.html

Fig.49: the Nurabot nursing robot. Source: www.honhai.com/en-us/press-center/press-releases/latest-news/1605

Fig.50: the Optimus robot through various design iterations. Left to right, they are Bumblebee (September 2022), Optimus Gen 1 (March 2023), Gen 2 (December 2023) and Gen 3 (current). Source: https://x.com/niccruzpatane/status/1937798034894762071/photo/1
Optimus uses Tesla’s in-house AI and shares parts with Tesla cars, such as the same computers, with custom-designed chips providing the robot’s neural networks for tasks like locomotion, manipulation, navigation, vision processing and decision-making. It uses the same AI architecture as Tesla’s Autopilot and FSD systems.
It uses the same cameras as Tesla
cars for vision processing, actuators
based on design principles developed for the cars, and the same 4680
batteries used in the Cybertruck. The
neural networks used by Optimus are the same as those used by Tesla cars to process visual inputs and adapt to environments. It also uses the same deep learning and auto-labelling techniques used by the vehicles.
Using its neural network, it can
learn from videos of humans performing various tasks such as vacuuming,
stirring food, disposing of rubbish or
moving items on a factory floor.
Elon Musk has previously shared a
vision of collective learning and data
sharing in Tesla’s AI and robotics initiatives, so the possibility exists that
when one Optimus robot learns a new
skill, they could all learn it, depending on the programming.
Some critics have pointed out that
early demonstration videos of Optimus involved teleoperation by human
operators. Tesla is currently using two
Optimus robots in its factories for tasks
like moving batteries, but they currently operate at half the efficiency of
human workers.
They intend to produce 1000 robots
by the end of 2025, to be used by Tesla,
with sales to other companies starting
in 2026. The estimated cost per unit
is US$20,000 to US$30,000 ($30,000
to $45,000).
There has been discussion of patients with the Neuralink brain-computer interface implant controlling an Optimus robot. This means a disabled person could ‘inhabit’ the body of an Optimus robot and control it to perform tasks by thought alone.
It has also been proposed that such
a person could control just parts of the
robot, such as the arms, legs or hands,
which could be attached to the body as
artificial limbs. A patient named Alex
has already demonstrated the ability
to control an Optimus hand through
Neuralink, via thought alone; see the
video at siliconchip.au/link/ac7n
Optimus has been integrated with X’s Grok AI, so it will be possible to have conversations with the robot and to have it carry out tasks as commanded by the operator. It may also be able to provide companionship.
PAL Robotics
https://pal-robotics.com
This Spanish firm makes a range of
Australia's electronics magazine
robots, including bipedal humanoid
types such as the REEM-C, TALOS and
Kangaroo research platforms for robotics and AI research. They also make
the ARI wheeled social robot for tasks
like answering customer enquiries or
running promotional campaigns.
It uses AI, running on Ubuntu and
the ROS framework, with facial recognition and other visualisation tools.
Persona AI
https://personainc.ai
Persona AI in Houston are developing a range of heavy-duty industrial humanoid robots, based on one
modular platform but customised for
a variety of tasks including shipbuilding and welding – see Fig.47. One of
the strengths of their robots is their
advanced hands, derived from work
by NASA.
The firm was
co-founded by
ex-NASA staff. One
usage model for these
robots includes renting
them out for specific
jobs. For precision shipyard welding, they have a development partnership with HD Hyundai Robotics, Vazil and HD Korea Shipbuilding & Offshore Engineering, and the robots are expected to be deployed in 2027.
Phoenix
www.sanctuary.ai
Phoenix is a robot from Canadian company Sanctuary AI (Fig.51). According to the company, Phoenix, running under their Carbon AI control system, “mimics subsystems found in the human brain, such as memory, sight, sound, and touch”. See the video at https://youtu.be/-HizP4UQvug for more information.
Fig.51: the upper body of the Phoenix robot.

Fig.54: the OP3 robot. Source: https://en.robotis.com/sub/business_platform_op3.php

Fig.52: the EngineAI PM01 in service as an experimental police robot. Source: South China Morning Post – siliconchip.au/link/ac7q

Fig.55: a robot hand from Shadow Robot. Source: https://shadowrobot.com/dexterous-hand-series

Fig.56: SoftBank Robotics’ NAO.
PM01
www.engineai.com.cn
EngineAI makes the PM01 general-
purpose humanoid robot. Chinese
police are using one in an experimental and demonstration role (see
Fig.52). In police work, it can perform
facial recognition and crowd scanning, although at this stage of development, it is not likely to be intelligent enough to be genuinely useful
for police work.
Still, it hints at a possible RoboCop-like future. See the video at https://youtu.be/vu930qj9CEI
Poppy Project www.poppy-project.org
An open-source platform for the creation, use and sharing of interactive
3D-printed robots for education, artists, scientists and hackers (see Fig.53).
Porton Man Robotic Test System
The Porton Man Robotic Test System (siliconchip.au/link/ac7j) is a
humanoid robot for the US Army.
Its purpose is to wear and test
nuclear, biological and chemical
(NBC) protective suits. It performs
tasks that soldiers would perform,
such as walking, running, kneeling, or lying prone while wearing the equipment, and has 100 embedded sensors
to test for leaks and to measure other
parameters.
Its main advantage is the ability to
repeat movements precisely, enabling
effective comparisons between different NBC ensembles.
RoboPrime humanoid project
RoboPrime is a very low-cost
3D-printed humanoid robot project for
the enthusiast. The website at https://github.com/simonepri/roboprime is
no longer actively maintained; however, builders might still find some
ideas there.
Fig.53: the Poppy Humanoid v1.0.2 is 83cm tall, weighs 3.5kg, has 25 actuators and runs an Odroid XU4 with Ubuntu 14.04. Source: www.poppy-project.org/en/robots/poppy-humanoid

Fig.57: SoftBank Robotics’ Pepper.
ROBOTIS
https://en.robotis.com
This Korean robot company makes
open-source humanoid platforms,
such as the OP3 (Fig.54), for research and education purposes. You can find the relevant files at siliconchip.au/link/ac7k

Humanoid robots losing control
There have been recent incidents of humanoid robots losing control, which can happen with any machine. There is obvious potential for this to harm people. That is why it is imperative that these robots are built with failsafe systems and some means for their handlers to deactivate them. You can see one such incident in the video at https://youtu.be/1eYZr9vdGl8

Legal concerns for AI
There are no specific laws governing AI in Australia. It has been argued that, if AI and humanoid robots become sufficiently advanced to develop consciousness, they should be afforded rights as humans have. However, they are still man-made machines that mimic humans; they are not human, just very advanced appliances.
In regards to foundation models, polls of organisations have suggested that there is agreement that those who develop the models should be responsible for the risks, not those who use the models.
Shadow Robot Company
The Shadow Robot company
(https://shadowrobot.com) specialises
in making dextrous robot hands for
other robotics manufacturers. One of
their robot hands is shown in Fig.55.
SoftBank
www.softbankrobotics.com
SoftBank Robotics offers two humanoid robots, NAO and Pepper. NAO (Fig.56) is described as a teaching assistant with a personality and an ability to inspire students of all ages, from preschool to university, including the ability to work with special-needs children.
It is described as being able to connect “theory to practice with hands-on projects that encourage participation, teamwork, and creative problem-solving”. NAO can speak 20 languages, move naturally and runs on the Linux-based NAOqi OS, a flexible framework that gives a lot of options for customising the robot. It is quite small at around 57cm tall, with a weight of 4.8kg.

Fig.58: Sophia by Hanson Robotics. Source: www.hansonrobotics.com/sophia
Pepper (Fig.57) is a robot designed
to greet customers in a business. It can
make personalised recommendations,
help people find what they’re looking
for, sell to and interact with humans.
Pepper is 1.2m tall, weighs 28kg, runs
for 12 hours on one charge, has a variety of sensors, runs on the NAOqi OS
and can be customised with various
SDKs (software development kits).
Sophia
www.hansonrobotics.com
A humanoid robot by Hong Kong-based Hanson Robotics (Fig.58). It can carry on human-like conversations.
The manufacturer describes Sophia as
“a human-crafted science fiction character depicting the future of AI and
robotics, and a platform for advanced
robotics and AI research”.
The robot has made appearances on
many popular TV shows. Sophia utilises symbolic AI, neural networks,
expert systems, machine perception,
conversational natural language processing, adaptive motor control and
cognitive architecture systems, among
others.
The manufacturer says that the robot
has demonstrated rudimentary levels
of consciousness under certain conditions using the Tononi Phi system of
measuring consciousness (siliconchip.au/link/ac7l).
StUWArt
A 140cm-tall experimental humanoid robot from the University of Western Australia (UWA; siliconchip.au/link/ac7m) – see Fig.59. It uses a commercial robot platform, with the focus
of UWA research being on the development by students of autonomous
control software enabling it to move
and walk.
The basic platform appears to be a
Unitree G1, described below.
Titan
www.roboforce.ai
RoboForce has developed the Titan
industrial robot, which can lift 40kg
and place it with 1mm accuracy, running for eight hours on one charge –
see Fig.60. The robots are modular and can be equipped with different hands and bases. It is optimised to perform the five so-called primitive actions of all human labour: pick, place, press, twist and connect.

Fig.59: the StUWArt humanoid robot with software under development by UWA students.

Fig.60: the Titan-T industrial robot by RoboForce. Source: www.roboforce.ai/product
Fig.61: the Unitree G1 robot.
Source: www.unitree.com/g1
Fig.62: the soccer-playing version of the Unitree G1, the G1-Comp. Source:
www.unitree.com/robocup
Toyota Research Institute
Toyota sees a future in which large numbers of its workers are humanoid robots. The Toyota Research
Institute has partnered with Boston Dynamics to integrate its ‘large
behaviour models’ with Boston’s
Atlas robot. Toyota has trained their
in-house AI large behaviour model
to perform 500 tasks, to be integrated
with Atlas.
Toyota trains its large behaviour models using humans to demonstrate the required tasks, either with joysticks to control robot movement or via videos. The AI model then synthesises these demonstrations into the relevant actions.
Toyota envisions using the robots
for tasks such as transporting parts,
assembling components and conducting inspections.
They may eventually replace
between 5% and 15% of human workers. There is a video about Toyota’s
large language models at https://youtu.be/DeLpnTgzJT4
Unitree G1 robot
www.unitree.com
Standing at 130cm tall, and weighing 35kg, the G1 has a swappable battery pack with a life of two hours,
three-finger hands with optional tactile sensor arrays, 3D lidar and depth
cameras, a microphone, speakers, 43
degrees of freedom, eight CPUs and
other features – see Fig.61.
Another version of this robot, the
G1-Comp, is designed for playing in
robot soccer competitions (Fig.62).
The G1-Comp uses the YOLOv11
algorithm for real-time object detection and recognition, pose estimation
and image classification using convolutional neural networks. The H1 is
similar to the G1 but is taller (180cm).
See the videos at https://youtu.be/GzX1qOIO1bE (G1) and https://youtu.be/M0KrTumJBFc (G1-Comp).
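As a hedged illustration of the kind of detection step involved (this is not Unitree’s software), a pretrained YOLO-family model from the open-source Ultralytics Python package can be run on a single camera frame as shown below; the weights file name and confidence threshold are assumptions.

# Run a YOLO-family detector on one camera frame and list what it found.
# Illustrative sketch only; requires the 'ultralytics' package and a frame image.
from ultralytics import YOLO

model = YOLO("yolo11n.pt")              # small pretrained model, assumed available

results = model("frame.jpg", conf=0.5)  # detect objects in one saved camera frame

for r in results:
    for box in r.boxes:
        label = model.names[int(box.cls)]          # class name, e.g. "sports ball"
        x1, y1, x2, y2 = box.xyxy[0].tolist()      # bounding box corners in pixels
        print(f"{label}: ({x1:.0f},{y1:.0f})-({x2:.0f},{y2:.0f}), conf={float(box.conf):.2f}")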
Fig.63: the Walker S Lite (left) and Walker S. Source: www.ubtrobot.com/en/humanoid/products/WalkerS
Walker S
www.ubtrobot.com
The Chinese UBTECH Walker S
Industrial Humanoid Robot assists
on production lines with inspections
and installing small parts. It comes in
two models: the Walker S (170cm tall,
65kg, 41 degrees of freedom, 2.5 hour
battery) and the Walker S Lite (130cm
tall, 63kg, 41 degrees of freedom, two hour battery) – see Fig.63.
Features of the Walker S include all-terrain autonomous adaptation, robust self-balancing, multi-modal large model-based decision making, hand-eye coordination and whole body manipulation, U-SLAM (UBTECH simultaneous location and mapping), 3D point cloud semantic navigation, human and environment comprehensive perception and multimodal human-robot interaction.
It runs on Robot Operating System, Linux ROSA 2.0, supports teleoperation and AIoT (artificial intelligence of things).
For more information, see the videos at https://youtu.be/UCt7qPpTt-g and siliconchip.au/link/ac7o

What is AIoT?
AIoT or artificial intelligence of things is like the internet of things (IoT), where devices can communicate with each other. However, each device also possesses artificial intelligence. The combination of IoT and AI in AIoT makes for an extremely powerful network, where AI-powered devices can interact and communicate with each other locally, area-wide, country-wide or even worldwide.
It could conceivably be dangerous in the future, if not managed appropriately, with limitations to stop AI getting out of control. Just imagine an army of robots being programmed with malicious intent…

Zyrex
https://ricrobotics.com
This 6m-tall AI robot for construction sites from the Californian company RIC Robotics is yet to be released, but is designed to perform welding, carpentry, 3D concrete printing and other tasks – see Fig.64. It is intended to address a shortage of skilled construction workers and to reduce human injuries.

Fig.64: the giant Zyrex construction robot, compared to a human-sized robot at lower left. The robot is not fully humanoid like the others in this article – but it has some similarities. Source: Robotics & Automation News – siliconchip.au/link/ac7r

The smallest humanoid robot
The world’s smallest humanoid robot is 57.7mm tall and was built by Tatsuhiko Mitsuya at the Nagoya Institute of Technology in Japan – see Fig.65.

Fig.65: the world’s smallest humanoid robot.

Upgrading humanoid robots
AI is progressing rapidly. Since AI is software-based, it is usually possible to upgrade a humanoid robot to make it smarter as new AI software is released, thus protecting the investment in robot hardware.

Further viewing
A fascinating look at “The Proto-Robots of Antiquity” is available in the YouTube video at https://youtu.be/0QGkf13fVs4

Glossary of Terms
AI – Artificial Intelligence; machines simulating human intelligence, such as learning, reasoning and problem-solving
ANN – Artificial Neural Network; computational models inspired by human brains, used in machine learning
ASIC – Application-Specific Integrated Circuit; a custom-designed chip optimised for a specific function or task
CNN – Convolutional Neural Network; deep learning models optimised for vision, detecting edges, shapes and patterns
CPU – Central Processing Unit; a general-purpose processor that executes instructions & manages computing tasks
DoF – Degrees of Freedom; independent movements a robot joint or mechanism can perform
End Effector – a tool/device at a robotic arm’s end that interacts with objects
FPGA – Field-Programmable Gate Array; a chip programmable for specific hardware tasks post-manufacturing
GPU – Graphics Processing Unit; a processor specialised for highly parallel tasks like machine learning
LLM – Large Language Model; an AI model trained on massive text datasets to generate or understand language
Multimodal – An AI that processes and integrates multiple data types (text, images, audio, video etc)
Neuromorphic Processor – a chip that uses artificial neurons to mimic the human brain
NLP – Natural Language Processing; an AI’s ability to understand, interpret and generate human language
Organoid – a simplified version of an organ designed to imitate it
RTOS – Real-Time Operating System; an operating system that guarantees timely processing for critical tasks
Tactel – Tactile Element; a sensor element that detects touch, pressure or texture information
Teleoperation – operating a machine remotely
TPU – Tensor Processing Unit; a Google-designed chip optimised for accelerating machine learning workloads
Transformer – a neural network architecture that uses attention to process sequential data efficiently
VLA – Vision-Language Action; an AI that combines visual input and language to perform actions or tasks
VLM – Vision-Language Model; an AI that combines image understanding with text comprehension and generation