Science-- there's something for everyone

Monday, January 21, 2013

Touchless surgical interfaces



It won’t surprise you to know that computers and electronic imagery are nearly ubiquitous in today’s operating rooms (ORs). But have you thought about the problems involved in keeping such devices sufficiently sterile? Mithun Jacob, Juan Wachs and Rebecca Packer from Purdue University certainly have. The researchers are testing ‘non-contact gesture-controlled human-computer interfaces’, adapted from gaming consoles. Rather than touching controls, surgeons can access images via hand gestures.

Traditionally, there are two ways to control an OR computer workstation. The surgeon can manipulate it himself, in which case the workstation must be sterilized, a nearly impossible task. I suppose the surgeon could change gloves before and after each data entry, but that would greatly increase the length of surgeries and provide multiple opportunities for contamination. Alternatively, the surgeon could verbally instruct a nurse or assistant to manipulate the computer, but that can also lead to delays as miscommunications are straightened out. The Purdue University researchers had a better idea.

To begin with, the researchers asked ten surgeons to come up with a gesture command lexicon. For example, facing the palms toward each other and moving them closer or farther apart would signify zooming in or zooming out, respectively. They paired these gestures with special 3D-sensing cameras that could interact with the computer workstations. They next set about designing algorithms to let the camera system distinguish between motions intended to manipulate the computer and unrelated gestures. Body position and direction of gaze proved critical for ensuring that the interface interpreted the surgeons’ movements correctly. In trials, volunteers got the result they were looking for about 93% of the time, definitely good enough to continue testing.
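To get a feel for the idea, here is a minimal Python sketch of that kind of contextual filtering. This is not the authors’ actual algorithm; the `Frame` fields, thresholds, and command names are all assumptions chosen to illustrate how gaze and body orientation could gate whether hand motion counts as a command, with the palm-distance gesture mapped as described above (palms moving together means zoom in, apart means zoom out).

```python
from dataclasses import dataclass


@dataclass
class Frame:
    """One snapshot from a hypothetical 3D-sensing camera (all fields assumed)."""
    gaze_toward_screen: bool  # is the surgeon looking at the display?
    torso_angle_deg: float    # torso rotation relative to facing the screen
    palm_distance_cm: float   # distance between the two palms


def interpret(prev: Frame, curr: Frame, max_torso_angle: float = 30.0) -> str:
    """Toy contextual filter: hand motion only counts as a command when
    gaze and body position suggest the surgeon is addressing the screen."""
    if not curr.gaze_toward_screen or abs(curr.torso_angle_deg) > max_torso_angle:
        return "ignore"  # movement is probably not meant for the computer
    delta = curr.palm_distance_cm - prev.palm_distance_cm
    if delta > 2.0:       # palms moving apart
        return "zoom_out"
    if delta < -2.0:      # palms moving together
        return "zoom_in"
    return "no_op"        # hands roughly still
```

For example, `interpret(Frame(True, 10, 30), Frame(True, 10, 20))` returns `"zoom_in"`, while the same palm motion with the surgeon turned away from the screen returns `"ignore"`. A real system would, of course, track full skeletal pose over many frames rather than two snapshots.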

You can see a test example in the video below:

[Embedded video: demonstration of the touchless gesture interface]

Jacob MG, Wachs JP, & Packer RA (2012). Hand-gesture-based sterile interface for the operating room using contextual cues for the navigation of radiological images. Journal of the American Medical Informatics Association (JAMIA). PMID: 23250787.