Diar Abdlkarim

Research Scientist

As AI becomes the default interface to computing, the real question is how the brain adapts at the neural and behavioral levels. I study Human–Computer Interaction (HCI), Human–Robot Interaction (HRI), and extended reality (XR) to reveal how perception–action circuits encode, predict, and recalibrate during interaction with machines. To test these mechanisms, I build tracking hardware, haptic devices, and XR tools that capture fine-grained sensorimotor signals and close the loop in real time. The goal is brain-informed experiences that strengthen perception, movement, and collaboration.

Research

My work spans human–computer interaction, sensorimotor neuroscience, and rehabilitation. I study how people plan and adapt hand actions to reveal the mechanisms of neuroplasticity. These findings set measurement priorities, shaping study designs and clarifying clinical success criteria.

Hardware

Hardware translates those priorities into instruments. I build hand- and finger-tracking systems with haptics that capture motion and deliver tactile cues. They enable experiments that link touch, action, and learning with the precision these studies demand.

Software

Software turns instrument data into understanding. I develop XR pipelines that link robots and game engines in real time, synchronise devices, and analyse movement. Outputs become interpretable measures of brain–behaviour change and drive engaging rehabilitation experiences.

Courses & Activities

Contact Me