
Compliant Pneumatic Feet with Real‐Time Stiffness Adaptation for Humanoid Locomotion

open access: yes · Advanced Robotics Research, EarlyView.
A compliant pneumatic foot with real‐time variable stiffness enables humanoid robots to adapt to changing terrains. Using onboard vision and pressure control, the foot modulates stiffness within each gait cycle, reducing impact forces and improving balance. The design, cast in soft silicone with embedded air chambers and Kevlar wrapping, offers durable, …
Irene Frizza   +3 more
wiley   +1 more source
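The within-cycle stiffness modulation described above can be sketched as a simple schedule: command chamber pressure from the current gait phase, assuming stiffness scales roughly linearly with pressure. All function names, phase thresholds, and constants here are illustrative assumptions, not values from the paper.

```python
# Hypothetical sketch of within-gait-cycle stiffness scheduling for a
# pneumatic foot. Assumption: stiffness k grows linearly with chamber
# pressure p, k = k_per_kpa * p.

def target_stiffness(gait_phase: float) -> float:
    """Map gait phase in [0, 1) to a desired stiffness in N/mm:
    soft at heel strike to absorb impact, stiff at push-off."""
    if gait_phase < 0.2:      # heel strike / loading
        return 20.0
    elif gait_phase < 0.6:    # mid-stance
        return 40.0
    else:                     # push-off
        return 60.0

def pressure_command(stiffness: float, k_per_kpa: float = 0.5) -> float:
    """Invert the assumed linear stiffness-pressure relation to get a
    chamber pressure command in kPa."""
    return stiffness / k_per_kpa

# One gait cycle sampled at 10 points:
schedule = [pressure_command(target_stiffness(p / 10)) for p in range(10)]
```

In practice the paper's controller also uses vision to pick the terrain-dependent stiffness levels; the schedule above only captures the phase-dependent part.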

Backpropagation Through Soft Body: Investigating Information Processing in Brain–Body Coupling Systems

open access: yes · Advanced Robotics Research, EarlyView.
This study explores how information processing is distributed between brains and bodies through a codesign approach. Using the “backpropagation through soft body” framework, brain–body coupling agents are developed and analyzed across several tasks in which output is generated through the agents’ physical dynamics.
Hiroki Tomioka   +3 more
wiley   +1 more source
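The core idea of backpropagating through a body's physical dynamics can be illustrated on a toy plant: unroll a damped spring driven by a learnable gain, then run a hand-written reverse (adjoint) pass through the unrolled steps to get the gradient of a terminal loss. This is a minimal sketch of the general technique, not the paper's soft-body model; the plant, parameters, and loss are assumed for illustration.

```python
# Toy "body": a unit-mass damped spring x'' = -k*x - c*x' + w*u(t),
# integrated with semi-implicit Euler. The learnable parameter w scales
# the input signal u; the loss is the squared error of the final position.

def simulate(w, u, k=4.0, c=0.5, dt=0.05):
    x = v = 0.0
    for ut in u:
        v += dt * (-k * x - c * v + w * ut)
        x += dt * v
    return x

def grad_w(w, u, target, k=4.0, c=0.5, dt=0.05):
    """Reverse-mode (adjoint) pass through the unrolled dynamics.
    For this linear plant no stored states are needed; a nonlinear
    soft body would also have to replay its saved trajectory."""
    xT = simulate(w, u, k, c, dt)
    gx = 2.0 * (xT - target)   # dL/dx_T
    gv = gw = 0.0
    for ut in reversed(u):
        gv += dt * gx          # reverse of x_{t+1} = x_t + dt*v_{t+1}
        gw += gv * dt * ut     # v_{t+1} depends on w through dt*w*u_t
        gx += gv * (-dt * k)
        gv *= (1.0 - dt * c)
    return xT, gw

# Usage: fit w so the final position reaches a target.
u = [1.0] * 40
w = 0.0
for _ in range(200):
    xT, g = grad_w(w, u, target=0.3)
    w -= 0.5 * g
```

Because the loss is quadratic in `w` for this linear plant, plain gradient descent converges; the interesting point is that the gradient flows through the body's dynamics rather than through a neural controller alone.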

Numerical Modeling of Photothermal Self‐Excited Composite Oscillators

open access: yes · Advanced Robotics Research, EarlyView.
We present a numerical framework for simulating photothermal self‐excited oscillations. The driving mechanism is elucidated by highlighting the roles of inertia and overshoot, as well as the phase lag between the thermal moment and the oscillation angle, which together construct the feedback loop between the system state and the environmental stimulus.
Zixiao Liu   +6 more
wiley   +1 more source
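The feedback loop the abstract describes, a thermal moment lagging behind the oscillation angle, can be reproduced with a minimal model: a damped oscillator whose driving moment relaxes toward an angle-dependent equilibrium with a first-order time constant. When the lag-induced energy input exceeds the damping, small perturbations grow into self-excited oscillation. The equations and parameter values below are an illustrative assumption, not the paper's model.

```python
# Minimal self-excited oscillator with first-order thermal lag:
#   theta'' = -omega2*theta - gamma*theta' + M
#   tau*M'  = -A*theta - M
# The phase lag of M behind theta (set by tau) pumps energy into the
# oscillation when A*tau / (1 + tau^2 * Omega^2) exceeds gamma.

def simulate_oscillator(theta0=0.01, dt=1e-3, steps=30000,
                        omega2=4.0, gamma=0.05, A=1.0, tau=0.5):
    theta, vel, M = theta0, 0.0, 0.0
    traj = []
    for _ in range(steps):
        acc = -omega2 * theta - gamma * vel + M
        dM = (-A * theta - M) / tau
        theta += dt * vel
        vel += dt * acc
        M += dt * dM
        traj.append(theta)
    return traj

traj = simulate_oscillator()
```

With these parameters the linearized system is Hopf-unstable, so the amplitude grows from the small initial angle; a physical device would saturate through nonlinearity, which this linear sketch omits.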

Robotic Control for Human–Robot Collaborative Assembly Based on Digital Human Model and Reinforcement Learning

open access: yes · Advanced Robotics Research, EarlyView.
This work presents a robotic control method for human–robot collaborative assembly based on a biomechanics‐constrained digital human model. Reinforcement learning is used to generate physiologically plausible human motion trajectories, which are integrated into a virtual environment for robot control learning.
Bitao Yao   +4 more
wiley   +1 more source

Multimodal Human–Robot Interaction Using Human Pose Estimation and Local Large Language Models

open access: yes · Advanced Robotics Research, EarlyView.
A multimodal human–robot interaction framework integrates human pose estimation (HPE) and a large language model (LLM) for gesture‐ and voice‐based robot control. Speech‐to‐text (STT) enables voice command interpretation, while a safety‐aware arbitration mechanism prioritizes gesture input for rapid intervention.
Nasiru Aboki   +2 more
wiley   +1 more source
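A safety-aware arbitration layer of the kind described, where gesture input pre-empts voice for rapid intervention, reduces to a small priority rule once commands are timestamped. This is a minimal sketch under assumed semantics (gesture always wins while fresh, stale commands are dropped); the names and the staleness window are hypothetical, not from the paper.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Command:
    source: str       # "gesture" or "voice"
    action: str       # e.g. "stop", "move"
    timestamp: float  # seconds

def arbitrate(gesture: Optional[Command], voice: Optional[Command],
              now: float, stale_after: float = 0.5) -> Optional[Command]:
    """A fresh gesture command always wins, since gestures are the
    rapid-intervention channel; otherwise fall back to a fresh voice
    command; commands older than stale_after are ignored."""
    def fresh(cmd: Optional[Command]) -> bool:
        return cmd is not None and now - cmd.timestamp <= stale_after
    if fresh(gesture):
        return gesture
    if fresh(voice):
        return voice
    return None
```

A real system would also debounce the pose estimator and gate LLM-interpreted voice commands through the same arbiter, but the priority inversion in favor of gestures is the essential safety property.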
