
Scrivener Publishing
100 Cummings Center, Suite 541J
Beverly, MA 01915-6106

Publishers at Scrivener
Martin Scrivener (martin@scrivenerpublishing.com)
Phillip Carmical (pcarmical@scrivenerpublishing.com)

Emerging Technologies for Health and Medicine

Virtual Reality, Augmented Reality, Artificial Intelligence, Internet of Things, Robotics, Industry 4.0

Dac-Nhuong Le

Deputy-Head, Faculty of Information Technology, Haiphong University, Haiphong, Vietnam

Chung Van Le

CVS Center, Duy Tan University, Danang, Vietnam

Jolanda G. Tromp

University of New York in Oswego, NY, USA

Gia Nhu Nguyen

Dean, Graduate School, Duy Tan University, Danang, Vietnam

 

 


List of Figures

1.1 (a) Example of Virtual Reality [10], (b) Example of Augmented Reality Training in Health care [4]

1.2 Relationship of Real and Virtual Environment (Augmented Reality or Mixed Reality) [15]

1.3 Levels of VR Immersion: (a) A Non-Immersive VR System, (b) A Semi-Immersive VR System, (c) A Fully Immersive VR System [10]

1.4 AR System Formats: (a) Marker-based AR System, (b) Markerless AR System [38]

1.5 Gadgets and Wearable Devices used in Health Care Applications [37]

2.1 3D virtual reality simulation of human anatomy

2.2 Practicing in Virtual Reality

2.3 Design of the Study

2.4 Three teaching methods: A. Plastic models, B. Real cadaver, C. Virtual Reality

2.5 The scores of the different university students (HPMU, DTU and BMTU) after the first post-training exam

2.6 The scores of the different university students (HPMU, DTU and BMTU) after the second post-training exam

2.7 The scores of the different university students (HPMU, DTU and BMTU) grouped together per condition (Manikin, Cadaver, VR) after the first post-training exam (Post-test 1, yellow) and the second post-training exam (Post-test 2, red)

2.8 The aggregated scores of all university students (HPMU, DTU and BMTU) grouped together per condition (Manikin, Cadaver, VR), with the first post-training exam scores (Post-test 1, orange) and the scores on the second post-training exam (Post-test 2, green)

3.1 AR Market Predictions

3.2 An empathy scene

3.3 The response buttons

3.4 Among the Participants

3.5 Testing on Preschool Students

4.1 Oculus Rift Consumer Version 1

4.2 Example of two users communicating in a Virtual Reality space

4.3 Image of Virtual Reality Interview Training Session

4.4 Participants response to measure 12

4.5 Participants responses to measure 13

5.1 Oral and Maxillofacial Surgery-Before and After Results

5.2 3D Patient Skull Generation

5.3 Dental Implant Surgery via Screw fitted on Jawbone

5.4 Orthognathic Surgery

5.5 Dental Simulation

5.6 DentSim Real Time Root Canal Treatment Simulation

6.1 The iterative process of VR Development

6.2 Usability and other potential acceptance criteria (Nielsen's Usability Diagram)

6.3 The process of empirical evaluation

6.4 Electrode Placement for Recording Galvanic Skin Response

6.5 Electrode Placement for Recording Facial Expressions with EMG

6.6 The development cycle, using rapid prototype-and-test cycles, by the Interaction Design Foundation

7.1 Intelligent Telemedicine Rehabilitation Robotic Architecture

7.2 Hierarchical Intelligent Behavior Control for Robot

7.3 Intelligent Behavior Control Algorithm

7.4 General process model for Telemedicine sensor data management

7.5 Mobile Patient Emergency for Stroke Patients to Nearest Hospital

7.6 Artificial Intelligence Technologies Components

8.1 Basic Service-Oriented Architecture

8.2 SOA Service Communication using WCF

8.3 SOA implemented as WCF process and services

8.4 The Proposed Telemedicine System Modules

8.5 Block Diagram of Image Compression Using Wavelet Technique

8.6 Two level wavelet decomposition

8.7 Image Enhancement and ROI segmentation flowchart

8.8 Results of Image Enhancement and Region of Interest Segmentation

8.9 Mammogram images during the steps of the Fuzzy C-Means algorithm: (a) original image, (b) image with segmented ROI after applying the morphological operators, (c) the resulting image after clustering

9.1 Mobile Doctor Brain AI App

9.2 Research Area 1: AI for Raspberry Pi - System on Chip

9.3 Research Area 2: AI Real-time EMG Human Motion Analysis

9.4 General process model for Artificial Intelligence Telemedicine sensor data management (three layers: Signal Processing, Mobile Data Aggregation with AI Engine, and Cloud CBR Patient Expert System)

9.5 Patient Emergency Scenario for Stroke/Heart Attack Elderly and Expert Doctor Recommendations

9.6 Architecture of support vector machine

9.7 Classification accuracies for RBF kernel with different sigma values for Kneeing and Pulling actions

9.8 Case-Based Reasoning Cycle

9.9 EMG Commercial Shimmer Sensor

10.1 A hierarchical Case-Based Decision Support System for Cancer Diagnosis on three levels of medical expert

10.2 Frame scheme of the 44 real medical features of thyroid cancer

11.1 Proposed mechanical structure of Ro CIF VIP with 5 DOF

11.2 Translation Joint Class diagram

11.3 Top (a) and bottom (b) sliding prismatic joint

11.4 Top (a) and bottom (b) gripper prismatic joint

11.5 Extraction prismatic joint

11.6 Root component which starts the entire simulation

11.7 SimStarter class diagram

11.8 GUI Class which provides commands to the user

11.9 GUI required for controlling the virtual simulation

11.10 GUI used to compute CIF parameters using intelligent interfaces

11.11 GUI Class for navigating through the virtual environment

11.12 Pan (a), Zoom (b), and Rotation (c) navigation with mouse

11.13 SimManager class diagram required for accessing the simulated components from within the app

11.14 Slider class diagram inherited by each Translation Joint class

11.15 Ro CIF VIP class component required to link the simulated objects with the manager variables

11.16 Save to XML experimental data, class diagram

11.17 Saved data for Top Sliding (a) and Bottom Sliding (b) prismatic joint

11.18 Saved data for Top Gripper (a) and Bottom Gripper (b) prismatic joint

11.19 Saved data for Extractor prismatic joint

12.1 Telemedicine Technologies

12.2 Telemedicine Architecture

13.1 Physical model of the walking robot leg; Bt and OE are the centers of the circle and ellipse arc trajectories, respectively; P1, PA, P, PB are knee joint positions in the leg's cyclic evolution; Q1, QA, Q, QB are base point positions in the leg's cyclic evolution; P*A, P*B, Q*A, Q*B are critical positions

13.2 Physical model of the three-dimensional walking robot leg

14.1 Two-link inverted pendulum model in the sagittal plane

14.2 Stabilization is achieved in 18 seconds, with a disturbance to the ankle and the hip of x0 = [0.02, 0.03, 0, 0] and a low R value

14.3 Stabilization is achieved in 35 seconds, with a disturbance to the ankle and the hip of x0 = [0.02, 0.03, 0, 0] and a high R value

14.4 Stabilization is achieved in 18 seconds, with a disturbance to the ankle and the hip of x0 = [-0.02, 0.03, 0, 0] and a low R value

14.5 Stabilization is achieved in 60 seconds, with a disturbance to the ankle and the hip of x0 = [-0.02, 0.03, 0, 0] and a high R value

14.6 Results for a disturbance only to the hip, x0 = [0, 0.03, 0, 0], with high R values; stabilization is achieved in 40 seconds

14.7 Results for a disturbance only to the hip, x0 = [0, 0.03, 0, 0], with lower R values; stabilization is achieved in 20 seconds

15.1 The shapes and colors

15.2 The fixed motor directions

15.3 RoboTherapist flowchart

15.4 RoboTherapist

15.5 Survey on students’ attentiveness

15.6 Survey on the effectiveness of robotic approach

15.7 Opinions on robotic approach

15.8 Traditional method in teaching basic shapes using cardboard and whiteboard

15.9 After 10 minutes learning autistic children started to lose their interest

15.10 Autistic children still attracted to learn even after 20 minutes

15.11 Hands-on learning

15.12 Test after learning process with Robot

16.1 Lower limb rehabilitation robot

16.2 Left mechanical leg

16.3 Necessary sensor element

16.4 Linkage model of LLRR mechanical leg

16.5 Comparison between the original path and the new path planned

16.6 The angular position of the end point at X axis and Y axis

16.7 The velocity in the direction of X axis and Y axis

16.8 The acceleration in the direction of X axis and Y axis

16.9 The angular position of three joints

16.10 The angular velocity of three joints

16.11 The acceleration of three joints

16.12 The angular position curves at X axis and Y axis

16.13 Angular velocity curves at the line, X axis and Y axis

16.14 The acceleration in the direction of the line, X axis and Y axis

16.15 The displacement of three joints

16.16 The velocity of three joints

16.17 The acceleration of three joints

16.18 Riding body posture

16.19 Calculated circular trajectory

16.20 Interaction control strategy

16.21 Match scene in game

16.22 First-person perspective

16.23 Function test

16.24 Screenshot of synchronization test

16.25 Feedback terrains test

17.1 Brain Anatomy

17.2 Nervous System

17.3 Neuron

17.4 Pyramidal Neuron

17.5 Pyramidal Neuron Chain

17.6 Synapse with Pyramidal Neuron

17.7 Neurotransmitters

17.8 Neuron Dipole

17.9 Working of BCI

17.10 Delta Waves

17.11 Theta Waves

17.12 Alpha Waves

17.13 Beta Waves

17.14 Gamma Waves

17.15 Steps

17.16 Electrode position

17.17 Electrode Cap side view

17.18 Differential Amplifier

17.19 Differential Amplifier Working

17.20 Fp2-F8 Single Tracing

17.21 Chain

17.22 Bipolar Montage Electrodes

17.23 Anterior-Posterior Montage

17.24 Eye Deflection Readings

17.25 BCI methods Implantation

18.1 Ultrasonic Sensor Working

18.2 NodeMCU Board [4]

18.3 NodeMCU pin diagram [5]

18.4 Simple Buzzer

18.5 Basic Flow Diagram

18.6 7805 IC

18.7 Breadboard

18.8 SNAP Connector

18.9 Female to Female wire, Male to Male wire, Male to Female wire

18.10 All Components

18.11 Circuit Diagram

18.12 Connection of 7805 IC and Battery with Bread board

18.13 NodeMCU connections

18.14 Buzzer Implementation

19.1 Overview of system

19.2 Connection of Arduino with ACS712 sensor and Relay

19.3 Raspberry Pi and connected different modules

19.4 Baby Monitoring System and connected sub modules

19.5 Graph of Body Temperature of a baby

19.6 Graph of Pulse-Rate of a baby

19.7 Steps of face detection and Recognition

19.8 Energy Measurement and Conservation Module

19.9 LM35 Temperature Sensor

19.10 IR Temperature Sensor

19.11 Soil Moisture Sensor

19.12 PIR Motion Sensor

19.13 Sound Sensor

19.14 Pulse Rate Sensor

19.15 Accelerometer ADXL335 Module

19.16 Accelerometer sensor MEM mechanism

19.17 Sensor with neutral position

19.18 The sensor in a flexed position

19.19 ACS712 Current Sensor

19.20 Graph 1

19.21 Graph 2

19.22 Graph 3

19.23 Flow of process for Online Energy Meter module

19.24 Real Power

19.25 Reactive Power

19.26 Apparent Power

19.27 Power Factor

19.28 Root Mean Square

List of Tables

1.1 Standard terms in Virtual Reality and Augmented Reality

1.2 Design Elements in Virtual Reality and Augmented Reality

1.3 Smart phones health care apps

2.1 Age and gender variation among groups and conditions (Group A: plastic manikin, Group B: real cadaver, Group C: Virtual Reality)

2.2 The statistical summary of pre-test and post-test 1 scores

2.3 The statistics of scoring average after swapping participants from Manikin condition to VR condition

3.1 Comparative studies on the existing AR applications for empathy development [6]

3.2 Overall Result of EMPARTI Evaluation Form

4.1 SPSS output for a Paired-Samples T-Test comparing participants' anxiety levels before and after the VR mock interview

5.1 The foundations highlighting the origin of Augmented Reality

6.1 Overview of Design and Evaluation methods and recommended time and setting for using them

9.1 Sample of published experimental results

10.1 Retrieval Accuracy of the CBIR

12.1 Multiple barriers across different themes

14.1 Parameters of the NAO robot

15.1 Participants' Details

15.2 Test Assessment Results (traditional vs robotic intervention)

19.1 A few libraries used in this system

19.2 Minimum and maximum values obtained from the soil moisture sensor

19.3 Respiration rate by age group

Foreword

Several key factors are driving the increasing adoption of augmented reality (AR) and virtual reality (VR) technologies: chiefly the growing integration of technology and digitization in healthcare, as well as rising healthcare expenditures focused on the delivery of efficient health services and on training healthcare professionals. AR and VR technologies are having a great effect on the healthcare industry through their adoption in the virtual training of surgeons in 3D operating room simulations for difficult surgeries, in phobia treatment for mental health, and in chronic pain management. VR also plays a major role in eye movement desensitization and reprocessing (EMDR) therapy, enabling the reframing of traumatic memories through guided eye movements. Furthermore, these technologies offer benefits in various areas of care management, such as autism and depression therapy, cancer therapy, and assisted living. VR-based organ models have played a crucial part in preparing surgeons for delicate and complicated operations that demand greater precision, fewer complications, and reduced trauma. AR, in turn, is a powerful and interactive tool for training and education. AR-based applications are effectively used to improve the care of many patients. For example, the vein visualization technology developed by AccuVein Inc. handles vein scanning, helping doctors and nurses locate veins and valves on the first attempt, reducing both pain and the time required. Such applications are also used in the aftercare of patients and assist elderly people in managing their medications.
This book focuses on adopting robots in conjunction with VR and AR for healthcare and medicine applications. For instance, we discuss a training system developed for a lower limb rehabilitation robot based on virtual reality (VR), mainly comprising trajectory planning and a VR control strategy. The system simulates bike riding and encourages patients to take part in their recovery and rehabilitation through a built-in competitive game. The robot can follow linear, circular, and arbitrary trajectories based on speed control, and the training velocity and acceleration in the trajectory planning have been simulated. A human-machine dynamics equation was built and used to judge the intent of a patient's movement. The VR training mode is a variable-speed active training under a constrained trajectory, with an adaptive training-posture function that provides an individual riding training track according to the patient's leg length. Movement synchronization between the robot and the virtual model is achieved by an interaction control strategy, and the robot can change the training velocity based on the signal from feedback terrains in the game. A serious game about a bike match in a forest was designed, in which the user can select the training level and change perspective through the user interface.

The main purpose of this book is to publish the best papers submitted to the special session on VR/AR Healthcare and Medicine Applications at the International Conference on Communication, Management and Information Technology (ICCMIT 2018) in Madrid, Spain.1 ICCMIT 2018 is an annual meeting for scientists, engineers and academicians to discuss the latest discoveries and advances in the foundations, theory, models and applications of nature-inspired systems, as well as emerging areas related to the three conference tracks, covering all aspects of communication, engineering, management, and information technology, presented by panels of world-class speakers and in dedicated workshops.

Prof. Ibrahiem El Emary
Prof. Musbah J. Aqel
International Cyprus University