Diar Abdlkarim

Research Scientist

As a research scientist and engineer, I believe we are in the midst of a fundamental shift in how we interact with technology. This transformation is largely driven by recent innovations in AI, which is rapidly becoming our primary interface with digital systems and allowing more of us to benefit from technology more effectively. My work sits at the intersection of scientific research in human-technology interaction (HTI), encompassing human-computer interaction (HCI), human-robot interaction (HRI), and immersive technologies (e.g., extended reality, XR).

Research

I conduct scientific research on human sensory-motor action and perception, developing advanced hand and finger tracking hardware, haptic feedback devices, and immersive XR software tools to study and enhance human-technology interaction in both physical and virtual environments.

Hardware

Development of research-grade hardware for real-time hand and finger tracking, including health and fitness devices that support clinical assessment of human touch and the development of brain-computer interfaces (BCI), incorporating electroencephalography (EEG), electromyography (EMG), electrical muscle stimulation (EMS), pupillometry, and eye tracking.

Software

Development of software research tools, including phone applications (see Tactile on the iOS App Store), XR applications (see the SideQuest App Store), and communication tools for real-time data streaming between robotic devices and game engines such as Unity or Unreal Engine.
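As a rough illustration of what such a streaming bridge can look like (the port, frame layout, and field names below are hypothetical, not the actual protocol), a tracking frame can be serialised and pushed over UDP to a listener running inside the game engine:

```python
import json
import socket
import struct

def pack_frame(t, joint_angles):
    """Serialise one tracking frame as length-prefixed UTF-8 JSON bytes."""
    payload = json.dumps({"t": t, "angles": joint_angles}).encode("utf-8")
    return struct.pack("<I", len(payload)) + payload  # little-endian length header

def stream_frames(frames, host="127.0.0.1", port=9000):
    """Fire-and-forget UDP stream; a game-engine-side listener would decode
    each datagram and drive the virtual hand. Port number is illustrative."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for t, angles in frames:
        sock.sendto(pack_frame(t, angles), (host, port))
    sock.close()
```

UDP is a natural fit here because a dropped tracking frame is harmless: the next one, a few milliseconds later, supersedes it anyway.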

Projects

Obi Robotics Advanced Hand-Tracking Glove

Modular Hand-Tracking Glove ("Obi" Project). I built a custom glove that captures every subtle movement of my hand and fingers. It began as a tangle of wires and tiny IMU sensors taped to a glove – a rough prototype that let a digital hand on my screen mimic my own. That early experiment evolved into a refined, modular glove system I call "Obi Reach." Each finger’s motion is tracked with up to 3 degrees of freedom, sampling at 240 Hz for smooth, real-time response. The glove even incorporates an optical tracking module for precise position in space, and I’ve embedded small haptic actuators to provide tactile feedback to the wearer. Designing this from scratch – including custom electronics – was challenging, but incredibly rewarding. Now I can use this glove for everything from immersive VR interactions to controlling robotic hands, and it’s easily customizable for new experiments.
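To give a flavour of the math involved, here is a minimal sketch (not the glove's actual firmware) of recovering a joint angle from the orientation quaternions of two adjacent IMUs, such as those on neighbouring finger segments:

```python
import math

def q_conj(q):
    """Conjugate of a unit quaternion (w, x, y, z)."""
    w, x, y, z = q
    return (w, -x, -y, -z)

def q_mul(a, b):
    """Hamilton product of two quaternions."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def joint_angle(q_parent, q_child):
    """Angle (rad) of the relative rotation between two IMU orientations:
    the rotation taking the parent segment's frame to the child's."""
    w = q_mul(q_conj(q_parent), q_child)[0]
    # abs() picks the shorter of the two equivalent rotations; clamp for safety.
    return 2.0 * math.acos(max(-1.0, min(1.0, abs(w))))
```

In a real pipeline this computation would run once per joint per frame, i.e. at the glove's 240 Hz sampling rate.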

Fingertip Indentation Feedback Device

This fingertip-mounted haptic device enables users to feel the sensation of pressing virtual buttons and encountering bumps in digital environments, creating a tangible sensory link to virtual worlds. The device features a compact housing and an indentor mechanism that converts rotary motion from a small geared motor into precise linear indentation of the fingertip. Real-time force feedback allows for immersive, engaging experiences by rendering tactile cues directly to the user. Beyond enhancing VR immersion, this tool is valuable for human-computer interaction research and assessing sensory-motor performance, such as evaluating patients with peripheral neuropathy and related conditions.
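As a toy example of the rotary-to-linear conversion described above, assuming a lead screw on the geared motor output (the lead and gear ratio are made-up numbers, not the device's actual specs):

```python
import math

def depth_to_motor_angle(depth_mm, screw_lead_mm=1.0, gear_ratio=30.0):
    """Motor shaft angle (rad) needed to indent the fingertip by depth_mm,
    assuming a lead screw advancing screw_lead_mm per output revolution
    behind a gear_ratio:1 reduction. Constants are illustrative."""
    output_turns = depth_mm / screw_lead_mm   # revolutions of the screw
    return output_turns * gear_ratio * 2.0 * math.pi
```

The inverse mapping, dividing the measured shaft angle back down, gives the controller its estimate of the current indentation depth.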

Fingertip Vibrotactile Feedback Device

This fingertip-mounted haptic device delivers both low and high frequency vibrotactile signals to the user's fingertips, simulating the sensation of texture during exploratory touch and virtual object interaction. By providing nuanced vibration patterns, the device enhances the perception of surface qualities and material properties in digital environments. These tactile cues complement visual and auditory feedback, enriching multi-sensory experiences during active engagement. Clinically, the device is valuable for sensory assessment, enabling the evaluation of tactile function in patients with conditions such as Carpal Tunnel Syndrome (CTS), vibration-induced neuropathies, and other disorders affecting hand and finger sensation.
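One common way to render texture with vibration is to drive the actuator at a frequency proportional to sliding speed divided by the texture's spatial period; a minimal sketch of that model (illustrative, not the device's firmware):

```python
import math

def texture_vibration(t, slide_speed_mm_s, spatial_period_mm, amp=1.0):
    """Instantaneous vibrotactile drive signal at time t (s): sliding over a
    grating with a given spatial period (mm) at a given speed (mm/s) produces
    a vibration at f = speed / period Hz. A simple sinusoid stands in for
    the richer waveforms a real texture model would use."""
    f = slide_speed_mm_s / spatial_period_mm  # temporal frequency in Hz
    return amp * math.sin(2.0 * math.pi * f * t)
```

Faster strokes over the same virtual surface therefore produce higher-frequency vibration, which is exactly the cue fingertips use to judge fine texture.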

Desktop Clinical Vibrotactile Assessment Device

This desktop, grounded device is designed specifically for clinical research and assessment of sensory-motor performance. Unlike fingertip-mounted haptic devices, this system provides precise vibrotactile stimuli to the user's fingertip while simultaneously measuring the force applied by the user. This dual capability enables comprehensive evaluation of tactile sensitivity and motor control, which is essential for diagnosing and monitoring conditions such as Carpal Tunnel Syndrome (CTS), vibration-induced neuropathies, and other disorders affecting hand function. The device is not intended for virtual or augmented reality applications, but rather for use in clinical and research settings. It is accompanied by a dedicated phone application that wirelessly interfaces with the device, allowing clinicians and researchers to run a variety of preset assessments and collect data either on-site or remotely, streamlining the process of sensory-motor evaluation.
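A typical preset assessment of this kind is an adaptive detection-threshold procedure. The sketch below shows a two-down one-up staircase, which converges on the ~70.7% detection level; this is an assumption about the assessment style, not the app's actual implementation:

```python
def staircase(respond, start=1.0, step=0.5, min_step=0.05, reversals_needed=6):
    """Two-down one-up adaptive staircase. `respond(level)` runs one trial at
    the given stimulus level and returns True if the participant detected it.
    The threshold estimate is the mean stimulus level at the reversals."""
    level, streak, direction = start, 0, -1
    reversals = []
    while len(reversals) < reversals_needed:
        if respond(level):
            streak += 1
            if streak == 2:                      # two hits in a row: go down
                streak = 0
                if direction == +1:              # was going up: a reversal
                    reversals.append(level)
                    step = max(min_step, step / 2.0)
                direction = -1
                level = max(0.0, level - step)
        else:                                    # one miss: go up
            streak = 0
            if direction == -1:                  # was going down: a reversal
                reversals.append(level)
                step = max(min_step, step / 2.0)
            direction = +1
            level += step
    return sum(reversals) / len(reversals)
```

Halving the step size at each reversal lets the procedure start with coarse moves and finish with fine ones, keeping sessions short for patients.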

Immersive VR Pool Game with Haptics

As a fun experiment in combining simulation and tactile feedback, I created a virtual reality pool game in my home setup. I can step into a VR pool hall, grab a virtual cue, and line up shots just as in real life. The special part is the custom haptic twist I added: I connected my wrist squeeze-band to the game, so when the cue strikes a ball or a ball sinks into a pocket, the band gives my wrist a quick squeeze or pulse. The first time I felt that thud “for real,” I was hooked. This project blends my love of games with my obsession for realism in VR, and it taught me a lot about syncing physical feedback with virtual physics.

Squeeze-Based Wrist Haptic Band

To bring a sense of touch into my virtual experiences, I developed a haptic wristband that squeezes my wrist in sync with events in a simulation. I built it to explore whether pressure cues could make virtual actions feel more real, like the jolt of a cue striking a virtual ball or the weight of a digital object. The design went through many iterations to get the squeeze just right: firm enough to be felt, but comfortable and safe. When something happens in VR, a small motor tightens the band for a split second, translating a visual impact into a tactile one. It’s a quirky gadget, but strapping it on genuinely adds immersion: my brain starts interpreting those squeezes as if I’m touching the virtual world.
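The event-to-squeeze mapping can be as simple as clamping the physics engine's collision impulse into a brief motor pulse; a sketch with made-up constants (the real tuning lives in the band's firmware):

```python
def impact_to_pulse(impulse_ns, max_impulse=2.0, max_pwm=255,
                    base_ms=40, extra_ms=80):
    """Map a collision impulse (N*s) reported by the physics engine to a
    (pwm_duty, duration_ms) motor pulse. Harder impacts squeeze harder and
    slightly longer; values above max_impulse are clamped. Constants are
    illustrative, not the band's actual calibration."""
    k = min(1.0, impulse_ns / max_impulse)   # normalise and clamp to [0, 1]
    return int(k * max_pwm), int(base_ms + k * extra_ms)
```

Keeping the pulse short (tens of milliseconds) matters: a squeeze that lingers past the visual impact reads as lag rather than contact.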

VR Physics Object Stacking Sandbox

This project creates realistic hand-object interactions without bespoke interaction code, relying instead on the physics engine built into the game engine (Unity in this case). Users’ hands can naturally interact with the virtual world: holding objects, pressing buttons, and manipulating the environment in a physically realistic and plausible manner.

Publications

Development and validation of the Interoceptive States Vocalisations (ISV) and Interoceptive States Point Light Displays (ISPLD) databases

Biotti, F., Sidnick, L., Hatton, A. L., Abdlkarim, D., Wing, A., Treasure, J., Happé, F., Brewer, R. (2025).
Behavior Research Methods, 57(5), 133. Springer US New York.

Text Entry for XR Trove (TEXT): Collecting and Analyzing Techniques for Text Input in XR

Bhatia, A., Mughrabi, M. H., Abdlkarim, D., Di Luca, M., Gonzalez-Franco, M., Ahuja, K., Seifi, H. (2025).
In Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems (pp. 1–18).

Hovering Over the Key to Text Input in XR

Gonzalez-Franco, M., Abdlkarim, D., Bhatia, A., Macgregor, S., Fotso-Puepi, J. A., Gonzalez, E. J., Seifi, H., Di Luca, M., Ahuja, K. (2024).
In 2024 IEEE International Symposium on Emerging Metaverse (ISEMV) (pp. 13–16). IEEE.

Viewing angle matters in British Sign Language processing

Watkins, F., Abdlkarim, D., Winter, B., Thompson, R. L. (2024).
Scientific Reports, 14(1), 1043. Nature Publishing Group UK London.

A methodological framework to assess the accuracy of virtual reality hand-tracking systems: A case study with the Meta Quest 2

Abdlkarim, D., Di Luca, M., Aves, P., Maaroufi, M., Yeo, S.-H., Miall, R. C., Holland, P., Galea, J. M. (2024).
Behavior Research Methods, 56(2), 1052–1063. Springer US New York.

Tempo change and leadership in ensemble synchronisation: a case study

Li, M. S., Tomczak, M., Elliot, M., Bradbury, A., Goodman, T., Abdulkarim, D., Di Luca, M., Hockman, J., Wing, A. (2023).

Annotation of soft onsets in string ensemble recordings

Tomczak, M., Li, M. S., Bradbury, A., Elliott, M., Stables, R., Witek, M., Goodman, T., Abdlkarim, D., Di Luca, M., Wing, A., et al. (2022).
arXiv preprint arXiv:2211.08848.

Robot, Pass me the tool: Handle visibility facilitates task-oriented handovers

Ortenzi, V., Filipovica, M., Abdlkarim, D., Pardi, T., Takahashi, C., Wing, A. M., Di Luca, M., Kuchenbecker, K. J. (2022).
In 2022 17th ACM/IEEE International Conference on Human-Robot Interaction (HRI) (pp. 256–264). IEEE.

PrendoSim: Proxy-Hand-Based Robot Grasp Generator

Abdlkarim, D., Ortenzi, V., Pardi, T., Filipovica, M., Wing, A., Kuchenbecker, K. J., Di Luca, M. (2021).
In 18th International Conference on Informatics in Control, Automation and Robotics. SciTePress Digital Library.

Acting is not the same as feeling: Emotion expression in gait is different for posed and induced emotions

Schuster, B. A., Sowden, S. L., Abdulkarim, D., Wing, A. M., Cook, J. L. (2019).
Frontiers in Human Neuroscience, 13.

L1 and L2 sign recognition: the role of visual angle

Watkins, F., Abdlkarim, D., Thompson, R. L. (2018).
In 3rd International Conference on Sign Language Acquisition (Istanbul: Koç Üniversitesi).

About Me

I am a futurist with an optimistic belief in technology’s power to improve our lives. My passion lies in exploring how the human nervous system can interface with machines, especially as artificial intelligence transforms our world. From building custom haptic devices and VR gloves to developing immersive simulations, I strive to create technology that brings us closer to our digital experiences. My research is driven by curiosity about how we sense, move, and interact, and how new tools can expand what it means to be human in the age of AI.

Contact Me