Research Projects

Virtual Therapy Exergame for Upper Extremity Rehabilitation

   
Team Dr. Leila Barmaki, Lauren Baron
Project term 6/2020 - Present
Funder National Institutes of Health, NSF
Partner Delaware INBRE, University of Delaware College of Engineering

Description

In this project, we developed a creative drawing game in VR that promotes upper extremity mobility exercise: users draw models using broad, continuous arm movements. Several models of varying difficulty and dimensionality are available (2D fish, 3D fish, 2D chicken, 2D square, etc.). Our goal is to integrate this VR exergame into upper-limb rehabilitation for post-stroke patients to provide engaging yet effective therapy. We also explore multi-modal data collection to assess movement of the entire limb through hand-controller tracking and a wearable elbow sensor sleeve.
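The logging side of this multi-modal setup can be illustrated with a minimal Python sketch. The two reader functions below are hypothetical stand-ins for the actual controller-tracking and elbow-sleeve interfaces, which are not public; only the timestamp-aligned CSV logging pattern is the point.

```python
import csv
import time

# Hypothetical device readers: stand-ins for the project's hand-controller
# tracking and wearable elbow sensor sleeve (the actual APIs are assumptions).
def read_controller_position():
    """Return (x, y, z) of the drawing hand in meters (stub)."""
    return (0.0, 1.2, 0.4)

def read_elbow_flexion():
    """Return elbow flexion angle in degrees from the sensor sleeve (stub)."""
    return 42.0

def record_session(path, duration_s=10.0, rate_hz=60.0):
    """Log timestamp-aligned controller and elbow samples to one CSV file."""
    period = 1.0 / rate_hz
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["t", "x", "y", "z", "elbow_deg"])
        start = time.monotonic()
        while (t := time.monotonic() - start) < duration_s:
            writer.writerow([t, *read_controller_position(), read_elbow_flexion()])
            time.sleep(period)

if __name__ == "__main__":
    record_session("exergame_session.csv", duration_s=5.0)
```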

Publications

Baron, L.∗, Chheang, V.∗, Chaudhari, A., Liaqat, A., Chandrasekaran, A., Wang, Y., Cashaback, J., Thostenson, E. and Barmaki, R., 2023, June. Virtual Therapy Exergame for Upper Extremity Rehabilitation Using Smart Wearable Sensors. In ACM/IEEE International Conference on Connected Health: Applications, Systems and Engineering Technologies (CHASE ’23).
Baron, L.∗, Chheang, V.∗, Chaudhari, A., Liaqat, A., Chandrasekaran, A., Wang, Y., Cashaback, J., Thostenson, E. and Barmaki, R., 2023, June. Poster: Virtual Therapy Exergame for Upper Extremity Rehabilitation Using Smart Wearable Sensors. In ACM/IEEE International Conference on Connected Health: Applications, Systems and Engineering Technologies (CHASE ’23).
Baron, L., Wang, Q., Segear, S., Cohn, B.A., Kim, K. and Barmaki, R., 2021, November. Enjoyable Physical Therapy Experience with Interactive Drawing Games in Immersive Virtual Reality. In Symposium on Spatial User Interaction (pp. 1-8).

Multi-Modal Affect Analysis for Children with ASD

   
Team Dr. Leila Barmaki, Dr. Zhang Guo, Jicheng Li, Dr. Pinar Kullu, Eli Brignac      
Project term 1/2020 - Present
Funder Amazon Research Awards
Partner University of Delaware AI Center for Excellence

Description

In this project, we analyze mutual gaze to assess the social behavior of children with Autism Spectrum Disorder (ASD), particularly during play therapy. We also estimate the movement synchrony of children with ASD using a skeleton-based transformer network.
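For intuition, mutual gaze can be framed geometrically: each person's gaze ray must pass close to the other person's head. The published framework is learning-based, so the sketch below (with assumed inputs such as a fixed head radius) only illustrates the underlying idea; averaging the boolean over frames would give a mutual-gaze ratio for a session.

```python
import numpy as np

def gaze_hits_head(eye, gaze_dir, head_center, head_radius=0.12):
    """True if the gaze ray from `eye` along `gaze_dir` passes within
    `head_radius` meters of `head_center` (point-to-ray distance test).
    All inputs are 3-vectors (np.ndarray); head_radius is an assumption."""
    d = gaze_dir / np.linalg.norm(gaze_dir)
    t = max(np.dot(head_center - eye, d), 0.0)  # closest point, forward only
    closest = eye + t * d
    return np.linalg.norm(head_center - closest) <= head_radius

def mutual_gaze(eye_a, dir_a, eye_b, dir_b):
    """Mutual gaze holds when each gaze ray hits the other person's head."""
    return gaze_hits_head(eye_a, dir_a, eye_b) and gaze_hits_head(eye_b, dir_b, eye_a)
```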

Publications

Guo, Z., Chheang, V., Li, J., Barner, K.E., Bhat, A. and Barmaki, R., 2023, June. Social Visual Behavior Analytics for Autism Therapy of Children Based on Automated Mutual Gaze Detection. In ACM/IEEE International Conference on Connected Health: Applications, Systems and Engineering Technologies (CHASE ’23).
Li, J., Bhat, A. and Barmaki, R., 2022, November. Pose Uncertainty Aware Movement Synchrony Estimation via Spatial-Temporal Graph Transformer. In Proceedings of the 2022 International Conference on Multimodal Interaction (pp. 73-82).
Guo, Z., Kim, K., Bhat, A. and Barmaki, R., 2021. An Automated Mutual Gaze Detection Framework for Social Behavior Assessment in Therapy for Children with Autism. In Proceedings of the 2021 International Conference on Multimodal Interaction (pp. 444-452). 
Li, J., Bhat, A. and Barmaki, R., 2021, October. Improving the Movement Synchrony Estimation with Action Quality Assessment in Children Play Therapy. In Proceedings of the 2021 International Conference on Multimodal Interaction (pp. 397-406).   
Guo, Z. and Barmaki, R., 2020. Deep neural networks for collaborative learning analytics: Evaluating team collaborations using student gaze point prediction. Australasian Journal of Educational Technology, 36(6), pp.53-71.

VR Balance Training for Lower Extremity Rehabilitation

Team Dr. Leila Barmaki, Sydney Segear         
Project term 6/2021 - Present
Funder National Institutes of Health, NSF
Partner Delaware INBRE, University of Delaware College of Engineering

Description

In this project, users are immersed in a virtual ice skating rink and must follow a coach avatar through a series of balance exercises. Several settings customize the avatar's coaching style (e.g., audio feedback) and the point of view (first person vs. third person). The user is immersed through a Windows MR HMD, and body-tracking data is collected with an Azure Kinect. The goal of this application is to improve balance in lower extremity rehabilitation for post-stroke patients and to prevent falls in older adults.
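One way to quantify balance from the Azure Kinect skeleton is to summarize pelvis sway over a trial. The metrics below (mean/max radial sway and sway path length) are common postural-stability measures offered as a minimal sketch, not necessarily the analysis used in this study.

```python
import numpy as np

def sway_metrics(pelvis_xyz):
    """Postural-sway statistics from a (T, 3) array of pelvis positions
    in meters, assuming a y-up coordinate frame."""
    horizontal = pelvis_xyz[:, [0, 2]]        # drop the vertical axis
    center = horizontal.mean(axis=0)
    radial = np.linalg.norm(horizontal - center, axis=1)
    steps = np.linalg.norm(np.diff(horizontal, axis=0), axis=1)
    return {
        "mean_sway_m": float(radial.mean()),  # average distance from center
        "max_sway_m": float(radial.max()),
        "path_length_m": float(steps.sum()),  # total horizontal travel
    }
```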

VR and Robotics for Upper Extremity Rehabilitation

Team Dr. Leila Barmaki, Dr. Vuthea Chheang         
Project term 12/2022 - Present
Funder National Institutes of Health, NSF
Partner Delaware INBRE, University of Delaware College of Engineering

Description

In this project, users perform simple circle- and diamond-drawing tasks in a VR-only condition and a VR + KinArm condition. The KinArm is an end-point robot that we integrated with VR, HTC Vive Trackers, and a wearable elbow sensor sleeve. The goal of this study is to introduce a framework for upper extremity rehabilitation using VR and robotics, especially for patients with Parkinson's disease or those who have suffered a stroke.
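Accuracy on the circle task can be scored, for example, as the mean radial deviation of the traced path from the ideal circle. This simple metric is only an illustration under assumed 2-D inputs, not necessarily the study's analysis.

```python
import numpy as np

def circle_tracing_error(points, center, radius):
    """Mean absolute radial deviation (same units as input) of traced
    2-D points from an ideal circle of the given center and radius."""
    d = np.linalg.norm(np.asarray(points) - np.asarray(center), axis=1)
    return float(np.mean(np.abs(d - radius)))

# Example: a perfect unit circle scores 0.0.
theta = np.linspace(0, 2 * np.pi, 100)
print(circle_tracing_error(np.c_[np.cos(theta), np.sin(theta)], (0, 0), 1.0))
```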

Lab Streaming Layer Framework for VR/AR

   
Team Dr. Leila Barmaki, Ryan Bilash, Lauren Baron, Kyle Wang        
Project term 6/2022 - Present
Funder National Institutes of Health, NSF, NIGMS, NSERC
Partner UD Research Foundation, UD College of Engineering, AWS, Unidel Foundation

Description

In this project, we use the open-source Lab Streaming Layer (LSL) framework to stream multiple channels of data from VR/AR devices to any PC. We use a HoloLens to collect eye-tracking gaze data and the position/rotation of the headset, and an Azure Kinect to collect body-tracking data. The data is recorded as an XDF file, which we convert to CSV for easier readability. We plan to use this data to characterize the user's performance and where their attention goes during VR/AR tasks.
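The XDF-to-CSV step can be done with the open-source pyxdf library; the sketch below writes one CSV per recorded stream, keeping the LSL timestamps as the first column. File and column naming here are assumptions, not the project's actual conventions.

```python
import pandas as pd
import pyxdf  # pip install pyxdf

def xdf_to_csv(xdf_path):
    """Convert every stream in an XDF recording to its own CSV file."""
    streams, _header = pyxdf.load_xdf(xdf_path)
    for stream in streams:
        name = stream["info"]["name"][0]          # e.g. "HoloLensGaze" (assumed)
        df = pd.DataFrame(stream["time_series"])  # one column per channel
        df.insert(0, "lsl_timestamp", stream["time_stamps"])
        df.to_csv(f"{name}.csv", index=False)

if __name__ == "__main__":
    xdf_to_csv("session.xdf")
```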

Publications

Wang, Q., Zhang, Q., Sun, W., Boulay, C., Kim, K. and Barmaki, R., 2023, April. A scoping review of the use of lab streaming layer framework in virtual and augmented reality research. Virtual Reality.
Wang, Q., Beardsley, V.J., Zhang, Q., Kim, K. and Barmaki, R., 2021, November. An LSL-Middleware Prototype for VR/AR Data Collection. In Symposium on Spatial User Interaction (pp. 1-2).

Game- and Video-Based Learning for STEM+C Education

Team Dr. Leila Barmaki, Shayla Sharmin   
Project term 11/2022 - Present
Funder NSF
Partner Department of Computer and Information Sciences

Description

In this project, we compare game-based and video-based learning to investigate how engagement and knowledge gain change while learning computer science topics such as graph theory. During the interactive desktop-based game, we also collect brain oxygenation data from participants using functional near-infrared spectroscopy (fNIRS), along with eye-tracking data. Our goal is to use non-invasive multi-modal data to better understand the differences between watching a video and engaging with a game, and to comprehend how different learning methods affect our brains.
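As a minimal sketch of the intended comparison, average brain oxygenation can be contrasted across the two learning conditions. The label scheme below is assumed, and a real fNIRS pipeline would add filtering and baseline correction first.

```python
import numpy as np

def mean_oxygenation_by_condition(hbo, labels):
    """Mean oxygenated-hemoglobin (HbO) level per condition.

    hbo    : (T,) array of HbO samples from fNIRS
    labels : (T,) array of condition codes, e.g. "game" or "video" (assumed)
    """
    hbo, labels = np.asarray(hbo), np.asarray(labels)
    return {c: float(hbo[labels == c].mean()) for c in np.unique(labels)}
```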

Multi-User Metaverse for Parkinson’s Disease Patients

Team Dr. Leila Barmaki, Dr. Vuthea Chheang     
Project term 12/2022 - Present
Funder National Institutes of Health, NSF
Partner Delaware INBRE, University of Delaware College of Engineering

Description

In this project, we develop an immersive VR system in which multiple users can interact in a metaverse setting. Potential scenarios include co-located or distributed collaborative VR, so that Parkinson's disease patients can consult with their therapists remotely or attend physical therapy sessions together with other patients.

ChatGPT VR Learning Tool with Virtual Avatars

 
Team Dr. Leila Barmaki, Dr. Vuthea Chheang     
Project term 2/2023 - Present
Funder National Institutes of Health, NSF
Partner Delaware INBRE, University of Delaware College of Engineering

Description

In this project, we integrate the ChatGPT and DALL-E 2 AI systems with virtual avatars in an immersive VR environment. The goal is for users to verbally ask the avatar questions about anatomy and other medical concepts and receive responses that assist their learning. We will measure learning with an assessment administered in the VR environment.
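The question-answering loop can be sketched with the OpenAI Python client: a transcribed user question goes to the chat model, and the returned text is handed to the avatar's text-to-speech. The model choice and tutor prompt below are assumptions, not the project's actual configuration.

```python
from openai import OpenAI  # pip install openai; reads OPENAI_API_KEY from the environment

client = OpenAI()

def ask_avatar(question):
    """Return the avatar's answer text for a transcribed user question."""
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # assumed model
        messages=[
            {"role": "system",
             "content": "You are a virtual anatomy tutor inside a VR classroom."},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

print(ask_avatar("What does the left ventricle do?"))
```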
