Results 141 to 150 of about 993,691
ABSTRACT Quantifying oral polymorphonuclear neutrophils (oPMNs) is a clinically validated approach for assessing periodontal inflammation. However, current methods, such as manual hemocytometry and flow cytometry, are time‐consuming (>3 h), require invasive sampling, and depend on staining and complex instrumentation, making them unsuitable for point ...
Mohsen Hassani +9 more
wiley +1 more source
This review presents recent progress in vision‐augmented wearable interfaces that combine artificial vision, soft wearable sensors, and exoskeletal robots. Inspired by biological visual systems, these technologies enable multimodal perception and intelligent human–machine interaction.
Jihun Lee +4 more
wiley +1 more source
Information Transmission Strategies for Self‐Organized Robotic Aggregation
In this review, we discuss how information transmission influences the neighbor‐based self‐organized aggregation of swarm robots. We focus specifically on local interactions regarding information transfer and categorize previous studies based on the functions of the information exchanged.
Shu Leng +5 more
wiley +1 more source
An AI‐Enabled All‐In‐One Visual, Proximity, and Tactile Perception Multimodal Sensor
Targeting integrated multimodal perception for robots, an AI‐enabled all‐in‐one multimodal sensor is proposed. The sensor perceives three modalities: vision, proximity, and touch. By toggling an ultraviolet light and adjusting the camera focus, it switches smoothly between perceptual modalities, enabling ...
Menghao Pu +7 more
wiley +1 more source
Continual Learning for Multimodal Data Fusion of a Soft Gripper
Models trained on a single data modality often struggle to generalize when exposed to a different modality. This work introduces a continual learning algorithm capable of incrementally learning different data modalities by leveraging both class‐incremental and domain‐incremental learning scenarios in an artificial environment where labeled data is ...
Nilay Kushawaha, Egidio Falotico
wiley +1 more source
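The class‐incremental versus domain‐incremental distinction in the entry above can be illustrated with a minimal rehearsal (replay‐buffer) sketch. All names here are hypothetical; this is a generic illustration of the two learning scenarios, not the paper's actual algorithm:

```python
import random

class ReplayContinualLearner:
    """Toy continual learner: keeps a small replay buffer so that
    training on a new modality or label set rehearses old samples too."""

    def __init__(self, buffer_size=100, seed=0):
        self.buffer = []           # stored (features, label) pairs
        self.buffer_size = buffer_size
        self.classes = set()       # grows in class-incremental steps
        self.rng = random.Random(seed)

    def observe(self, batch):
        """Mix the new batch with replayed old samples (rehearsal)."""
        replay = self.rng.sample(self.buffer, min(len(self.buffer), len(batch)))
        training_set = batch + replay   # what a real model would train on
        for x, y in batch:
            self.classes.add(y)
            if len(self.buffer) < self.buffer_size:
                self.buffer.append((x, y))
            else:  # reservoir-style replacement keeps old tasks represented
                idx = self.rng.randrange(self.buffer_size)
                self.buffer[idx] = (x, y)
        return training_set

# Class-incremental step: new labels appear.
learner = ReplayContinualLearner()
learner.observe([([0.1, 0.2], "grasp"), ([0.3, 0.1], "release")])
# Domain-incremental step: same labels, but a new modality's features.
learner.observe([([5.0, 7.0], "grasp")])
print(sorted(learner.classes))  # → ['grasp', 'release']
```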
Video games as stimuli in neuroimaging studies: a minireview. [PDF]
Blank IB, Klucharev V, Shestakova A.
europepmc +1 more source
Here, we present a textile, wearable capacitive interface enabling multidirectional remote control by dynamically modulating electrode overlap and spacing via a freely gliding upper electrode. A forearm‐mounted prototype drives robotic and media tasks with 12–15 ms latency, maintains < 0.8% drift after 500 cycles, and remains stably functional at 90 ...
Cagatay Gumus +8 more
wiley +1 more source
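The sensing principle in the entry above — modulating electrode overlap and spacing — follows the ideal parallel‐plate model, C = ε₀εᵣA/d. A short sketch of that relationship (the specific geometry values below are illustrative, not from the paper):

```python
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def plate_capacitance(overlap_area_m2, gap_m, eps_r=1.0):
    """Ideal parallel-plate model: C = eps0 * eps_r * A / d.
    Sliding the upper electrode changes A (overlap); pressing
    changes d (spacing) -- the two knobs such an interface modulates."""
    return EPS0 * eps_r * overlap_area_m2 / gap_m

c0 = plate_capacitance(1e-4, 1e-3)       # 1 cm^2 overlap, 1 mm gap
c_slide = plate_capacitance(2e-4, 1e-3)  # doubled overlap
c_press = plate_capacitance(1e-4, 5e-4)  # halved gap
# Doubling the overlap doubles C; halving the gap also doubles C.
print(c_slide / c0, c_press / c0)  # → 2.0 2.0
```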
Multimodal Human–Robot Interaction Using Human Pose Estimation and Local Large Language Models
A multimodal human–robot interaction framework integrates human pose estimation (HPE) and a large language model (LLM) for gesture‐ and voice‐based robot control. Speech‐to‐text (STT) enables voice command interpretation, while a safety‐aware arbitration mechanism prioritizes gesture input for rapid intervention.
Nasiru Aboki +2 more
wiley +1 more source
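The safety‐aware arbitration described above — gesture input overriding voice for rapid intervention — can be sketched as a simple priority rule. Function and field names are hypothetical; this is not the authors' implementation:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Command:
    source: str       # "gesture" (from HPE) or "voice" (from STT)
    action: str
    timestamp: float  # seconds

def arbitrate(gesture: Optional[Command], voice: Optional[Command],
              gesture_timeout: float = 0.5, now: float = 0.0) -> Optional[Command]:
    """Safety-aware arbitration: a sufficiently recent gesture command
    always wins, so a STOP gesture can interrupt an in-flight voice command."""
    if gesture and now - gesture.timestamp <= gesture_timeout:
        return gesture
    return voice

stop = Command("gesture", "STOP", timestamp=0.0)
move = Command("voice", "move_to_shelf", timestamp=-1.0)
print(arbitrate(stop, move, now=0.1).action)  # → STOP
print(arbitrate(None, move, now=0.1).action)  # → move_to_shelf
```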
Game over? Examining associations between video game play and visual and auditory spatial ability. [PDF]
Pasescu P +3 more
europepmc +1 more source
A flexible, skin‐integrated electromagnetic actuator is developed for wearable virtual/augmented reality (VR/AR) haptic systems. A tunable design model enables control over displacement and resonance frequency. The system is validated through a custom VR application with a 6 × 4 actuator array, demonstrating real‐time, spatially targeted tactile ...
Naji Tarabay +9 more
wiley +1 more source