Results 281 to 290 of about 5,531,875
An AI‐Enabled All‐In‐One Visual, Proximity, and Tactile Perception Multimodal Sensor
Targeting integrated multimodal perception for robots, an AI‐enabled all‐in‐one multimodal sensor is proposed. The sensor perceives three modalities: vision, proximity, and touch. By toggling an ultraviolet light and adjusting the camera focus, it switches smoothly between these perceptual modalities, enabling ...
Menghao Pu +7 more
wiley +1 more source
The Future of Research in Cognitive Robotics: Foundation Models or Developmental Cognitive Models?
Research in cognitive robotics founded on principles of developmental psychology and enactive cognitive science would yield what we seek in autonomous robots: the ability to perceive their environment, learn from experience, anticipate the outcomes of events, act to pursue goals, and adapt to changing circumstances without resorting to training with ...
David Vernon
wiley +1 more source
Postural correlates of pleasant landscapes visual perception. [PDF]
Akounach M, Lelard T, Mouras H.
europepmc +1 more source
Hard‐Magnetic Soft Millirobots in Underactuated Systems
This review provides a comprehensive overview of hard‐magnetic soft millirobots in underactuated systems. It examines key advances in structural design, physics‐informed modeling, and control strategies, while highlighting the interplay among these domains.
Qiong Wang +4 more
wiley +1 more source
Process dynamics of serial biases in visual perception and working memory processes. [PDF]
Park HB.
europepmc +1 more source
Grounding Large Language Models for Robot Task Planning Using Closed‐Loop State Feedback
BrainBody‐LLM introduces a hierarchical, feedback‐driven planning framework in which two large language models (LLMs) coordinate high‐level reasoning and low‐level control for robotic tasks. By grounding decisions in real‐time state feedback, it reduces hallucinations and improves task reliability.
Vineet Bhat +4 more
wiley +1 more source