Gregory J. Gerling, Ph.D.

    Associate Professor

    Systems and Information Engineering

    gregory-gerling virginia edu

Current Research


DARPA N11AP20002 (Gerling, G.J. with Cederna, P.S. and Urbanchek, M.G., U. of Michigan)

Computational Models and Real-time Prototypes to Mimic the Signal Modulation Capability in Connecting Artificial Tactile Sensors with the Peripheral Neural Afferents

Major goals: We seek to understand the signal modulation parameters of the biological generation of action potentials (APs) in response to the controlled input of current pulse waveforms, a necessary step in the design of a neural prosthetic sense of touch.  There are two proposed aims: 1) determining the amplitude and duration of the minimum current waveform to consistently elicit single APs in the rat peripheral nerve, and 2) eliciting trains of APs in the living rat that mimic the natural AP response to ramp-and-hold stimuli and vibration by inputting a) controlled modulation current waveforms to the sensory peripheral nerve interface (SPNI) and b) progressing to a direct connection between a force sensor and the SPNI.  Dr. Gerling's group in particular will perform research design and coordination, modeling of empirical relationships in the data, mathematical optimization, and experimental actuator design.

Example project: Designing artificial sensor systems to mimic touch receptors.  As artificial touch in neural prostheses requires replacing the biological touch receptors lost through amputation, we transition our computational model to create a physical sensor system that mimics the output of natural touch receptors. A force sensor is embedded in a skin-like silicone substrate (A). The analog output of the sensor is converted via algorithms (B) to biologically relevant trains of action potentials (C) that represent how mechanoreceptors code the magnitude of stimulus indentation and velocity of stimulus movement.
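The conversion step (B) can be sketched with a leaky integrate-and-fire neuron, the spiking model named later for the DARPA HR0011 work. This is a minimal illustration, not the lab's fitted algorithm; all parameter values below are made up.

```python
# Sketch: converting a sampled analog force signal into spike times with a
# leaky integrate-and-fire (LIF) neuron. Parameters are illustrative only.

def lif_spike_times(force, dt=1e-3, tau=0.02, gain=50.0, threshold=1.0):
    """Integrate a sampled force signal (N) and return spike times (s).

    Euler step of dV/dt = (-V + gain * force) / tau; V resets to 0
    whenever it crosses threshold, marking a spike.
    """
    v = 0.0
    spikes = []
    for i, f in enumerate(force):
        v += dt * (-v + gain * f) / tau
        if v >= threshold:
            spikes.append(i * dt)
            v = 0.0  # reset after a spike
    return spikes

# Ramp-and-hold stimulus: force ramps up over 50 ms, then holds at 0.1 N.
ramp = [0.1 * t / 0.05 for t in [i * 1e-3 for i in range(50)]]
hold = [0.1] * 450
spikes = lif_spike_times(ramp + hold)
```

Driving this model with a ramp-and-hold force profile, spikes arrive faster as force rises and settle to a steady rate during the hold, qualitatively matching the indentation-magnitude and velocity coding described above.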

Research Staff and Graduate Student: Elmer Kim and Aaron Williams

Artificial Sensors Mimic Mechanoreceptors


R01 NS073119-01 (Gerling, G.J. and Lumpkin, E.A., Columbia U.)

Part of NSF-NIH Collaborative Research in Computational Neuroscience (CRCNS) Program

Collaborative Research: Predictively Modeling the Impact of Receptor Arrangement on Action Potential Initiation in Mammalian Touch Receptors (short title: CRCNS: Modeling the impact of receptor arrangement on spike initiation in touch)

Major goals: To develop a computational model that describes action potential timing in mammalian touch receptors.  My group in particular will perform computational modeling and experimental actuator design.  In our modeling effort, my group will use computational methods including solid mechanics, differential equations, statistics, and information theory to explain the biphasic sensitivity and variability of SAI afferents.  In our work with experimental actuator platforms, we will use a custom-built mechanical indenter to perform mechanical characterization of mouse skin and a second indenter to deliver vibration stimuli to characterize the noise and variability of afferents.

Example project: Modeling the spike timing variability from the SAI (slowly adapting type I) mechanoreceptor in the mouse.  In general, to better understand the neural basis of touch, we design controlled experiments where our custom-built, computer-controlled indenter (A) stimulates mouse skin samples with an intact touch receptor and nerve (B) while neural responses are recorded (C). We then construct computational models to explain the sub-transformations underlying the touch receptor’s neural response (D).  

Graduate Students: Daine Lesniak and Yuxiang "Shawn" Wang

Modeling Mechanoreceptor Biological Response


National Library of Medicine T15LM009462 (Guerlain, S.A.; Harrison, J.H.; Gerling, G.J.; Bass, E.J. and other Co-Is)

Systems Engineering Focus on Clinical Informatics

Major goals: To train students in health and bioinformatics. This training grant funds approximately 7 graduate students and 3 postdocs per year who will attain a degree in Systems and Information Engineering and who are co-advised by faculty in the healthcare and biological domains.  The grant is a 5-year renewable grant ending 06/30/2012.  Dr. Gerling's role is in advising post-doctoral scholars, Ph.D. students, and STTP students (under-represented minority students on 3-month assignments). He also serves on the Executive Committee and is in charge of recruiting.

Example project: A mixed reality environment for treating phantom pain.  Phantom Limb Pain (PLP) is chronic, severe, and affects up to 90% of amputees.  Recently, non-invasive therapies like mirror visual feedback (MVF) have shown promise in treating PLP, and this approach has been extended to virtual reality (VR).  These treatments are thought to work, in part, by eliminating or reducing the incongruence between the sensory and motor systems caused by deafferentation.  We hypothesize that a dynamic interaction between the phantom limb, intact limb, and the environment will afford more life-like simulations and enhanced pain relief.  To test this hypothesis we are developing a mixed reality (MR) simulator for the treatment of phantom limb pain.  The goals are to augment virtual reality strategies with additional sense modalities, to enable the phantom to interact and impart forces on the intact limb in real-time.


Figure 1. (left) Prototype mirror box with proprioceptive and haptic feedback via a slider bar.  Preliminary results with a small group (N=3) of amputee volunteers: the haptic mirror box provided pain relief within 10 minutes, motion of the phantom increased, and tactile sensation of the intact limb was referred to the phantom.  Movement of the stump enhances the illusion for below-knee amputees and reduces it for above-knee amputees. (right) Concept for a 1-degree-of-freedom mixed reality environment utilizing the animation of phantom hands and fingers in the VR environment and the presentation of physical stimuli to the intact limb.

Graduate Students: Aaron Williams and Mark Farrington (past: Daine Lesniak and Anila Jahangiri)


Systems Engineering Research Center (SERC) (Scherer, W.T.; Bailey, R.R.; Gerling, G.J.; Louis, G.E.)

Development of an Extensible Systems Engineering Capstone Experience for Non-Systems Engineering Seniors.

Major goals: The request for proposals is entitled: Research on Building Education & Workforce Capacity in Systems Engineering.  This education-focused effort seeks to develop methods for teaching Systems Engineering concepts to students in other engineering departments (e.g., electrical and mechanical engineering) through an enhanced capstone experience. 

Example projects: Using Electroactive Polymers to Simulate Light Touch and Vibration and A Sensorized Glove for Tracking the Hand and Fingers for Visualization in a Virtual Reality Environment


Images: Hand Tracking, Finger Constriction, Electroactive Polymer Actuator

Completed Research

Dept. of Defense

Congressionally Directed Medical Research Program, administered by the Department of the Army W81XWH-08-1-005 (Gerling, G.J., Martin, M.L. & Childress, R.M.) 

The development of prostate palpation skills through simulation training may impact early detection of prostate abnormalities and early management

Major goals: To further develop the prostate exam simulator by determining distinct skill levels for the discernment of palpable inclusions, determining how contextual factors in the exam influence diagnosis decision-making, determining methods to customize performance assessment and training intervention, and determining whether applied finger techniques correlate with level of performance.

Example projects:
1. Characterize the material properties of prostate tissue, removed post-surgery and indented with a custom-built spherical indenter.  The mechanical characterization of prostate tissue has received little attention and is often disconnected from the clinic, where samples are readily attained.  We developed a spherical indenter to generate force-displacement data from ex vivo tissue, both whole-mount and 5 mm cross-sections.  Indentation velocity, depth, and sphere diameter, along with four means of estimating elastic modulus (EM), were validated.  EM was then estimated for ~30 prostate specimens in the clinic.
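One standard way to relate spherical-indenter force-displacement data to an elastic modulus is the Hertz contact model; the sketch below fits it by least squares. This is a generic illustration under idealized assumptions (small strain, tissue treated as incompressible), not one of the project's four validated estimation methods.

```python
# Hertz model for spherical indentation: F = (4/3) * E* * sqrt(R) * d^(3/2),
# with reduced modulus E* = E / (1 - nu^2). Values below are synthetic.
import math

def hertz_modulus(depths_m, forces_n, radius_m, poisson=0.5):
    """Least-squares estimate of E (Pa) assuming Hertzian spherical contact."""
    x = [d ** 1.5 for d in depths_m]       # F is linear in d^(3/2)
    k = sum(f * xi for f, xi in zip(forces_n, x)) / sum(xi * xi for xi in x)
    e_star = k / ((4.0 / 3.0) * math.sqrt(radius_m))  # reduced modulus
    return e_star * (1.0 - poisson ** 2)

# Synthetic check: data generated from E = 20 kPa should be recovered.
E_true, R, nu = 20e3, 2.5e-3, 0.5
depths = [i * 1e-4 for i in range(1, 11)]  # 0.1 to 1.0 mm
e_star_true = E_true / (1 - nu ** 2)
forces = [(4 / 3) * e_star_true * math.sqrt(R) * d ** 1.5 for d in depths]
E_fit = hertz_modulus(depths, forces, R, nu)
```

In practice the fit would be restricted to indentation depths small relative to the sphere radius and tissue thickness, where the Hertz assumptions hold.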

2. Understand the perceptible limits of the DRE, which are based on some unresolved combination of the size, depth, and hardness of abnormalities within a given prostate stiffness.  Using psychophysical testing methodologies, this work informs the range of disease states that are palpable, from human sensory limits.

3. Develop an efficient and accurate means of assessing the palpation skill of trainees.  Integrate computerized adaptive testing (CAT) with the VPES to provide proficiency estimates with fewer test items, thereby reducing testing duration. Building our CAT exam entails developing an item bank of prostate scenarios, implementing item response theory (IRT) and an item selection procedure, and determining the stopping criteria and scoring method.
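The item selection step of such a CAT exam can be illustrated with a two-parameter logistic (2PL) IRT model and maximum Fisher information selection. The item bank, parameter values, and function names here are invented for the sketch, not drawn from the VPES.

```python
# Sketch: 2PL IRT item selection by maximum Fisher information at the
# examinee's current ability estimate theta. All numbers are illustrative.
import math

def p_correct(theta, a, b):
    """2PL probability of a correct response (a: discrimination, b: difficulty)."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def fisher_info(theta, a, b):
    """Item information: a^2 * p * (1 - p), maximal when theta == b."""
    p = p_correct(theta, a, b)
    return a * a * p * (1.0 - p)

def pick_next_item(theta, item_bank, administered):
    """Choose the unused item most informative at the current ability."""
    candidates = [i for i in range(len(item_bank)) if i not in administered]
    return max(candidates, key=lambda i: fisher_info(theta, *item_bank[i]))

# Hypothetical item bank: (a, b) per simulated prostate scenario.
bank = [(1.2, -1.0), (0.8, 0.0), (1.5, 0.5), (1.0, 1.5)]
first = pick_next_item(0.0, bank, set())
```

After each response, theta would be re-estimated and the loop repeated until the stopping criterion (e.g., a target standard error or item budget) is met.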

4. Correlate general aspects of finger technique, and the technique patterns of experts and novices, with measures of performance assessment. We algorithmically defined a set of finger palpation techniques for the digital rectal exam (DRE) based upon past qualitative definitions of hands-on technique and evaluated performance between experts and novices.  Four palpation techniques were defined: global finger movement, local finger movement, average intentional finger pressure, and dominant intentional finger frequency.  Streaming feedback from force and balloon sensors in the instrumented prostate provided the source data.
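As a hedged sketch of the "dominant intentional finger frequency" measure, a discrete Fourier transform over a windowed force-sensor stream picks out the strongest palpation rhythm. The sampling rate and signal below are illustrative, not VPES sensor specifications.

```python
# Sketch: dominant frequency of a streamed force signal via a plain DFT.
import math

def dominant_frequency(samples, sample_rate_hz):
    """Return the nonzero DFT frequency (Hz) with the largest magnitude."""
    n = len(samples)
    mean = sum(samples) / n
    centered = [s - mean for s in samples]      # drop the DC pressure offset
    best_k, best_mag = 1, 0.0
    for k in range(1, n // 2):                  # positive frequencies only
        re = sum(c * math.cos(2 * math.pi * k * i / n) for i, c in enumerate(centered))
        im = sum(c * math.sin(2 * math.pi * k * i / n) for i, c in enumerate(centered))
        mag = math.hypot(re, im)
        if mag > best_mag:
            best_k, best_mag = k, mag
    return best_k * sample_rate_hz / n

# A finger palpating at ~2 Hz on top of a steady baseline pressure.
fs = 50.0
t = [i / fs for i in range(200)]               # 4 s of data
signal = [1.0 + 0.3 * math.sin(2 * math.pi * 2.0 * ti) for ti in t]
f_dom = dominant_frequency(signal, fs)
```

A production implementation would use an FFT over sliding windows of the stream rather than this O(n^2) loop; the sketch only shows the measure's definition.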

Graduate Students: Sarah Rigsbee, Ninghuan "Miki" Wang, Angela Lee and Bill Carson


Figure 1. (left) Main components of the Virginia Prostate Examination Simulator apparatus: (A) electronics for automatic balloon inflation and sensor signal conditioning, (B) instrumented torso, (C) laptop, and (D) instrumented prostate. (center) Example plot of force sensor and balloon sensor data for a testing scenario. (right) Finger pressure output over time, where three local palpation patterns are identified algorithmically and characterized.


Figure 2.  Portable indentation system and user interface built to make tissue measurements.


DARPA HR0011-08-1-0072 (Gerling, G.J.) 

Enabling the Sense of Touch: Mimicking Responses from Single-Receptors and Optimizing Populations

Major goals: To develop a multi-level mathematical model that describes single-unit and population responses in touch receptors.  The focus is upon the transition of algorithms that characterize the base level of transduction to artificial sensor-substrates which could be used in the development of prosthetic sensors to restore a person's sense of touch.

Examine and model how controlled variations of the skin, mechanoreceptors and stimuli influence the neuronal response of mechanoreceptors. Our approach is to build a computational understanding of touch sensing that links neurophysiological findings with prototype sensor arrays. Based on data from experiments with mice, we utilize a) solid mechanics models to calculate distributions of stress and strain in the skin upon surface deformation, b) neuron models to convert these quantities into spikes, and c) correlation and Bayesian techniques to compare the predictions to observed responses for like stimuli. We also use statistical signal analysis to characterize the variance in neuronal responses. The resultant models are used to implement artificial sensor grids in silicone substrates and analog hardware that mimics the spike-based response.

Example projects:
1. Previous models of touch have linked skin mechanics to neural firing rate, neural dynamics to action potential elicitation, and mechanoreceptor populations to psychophysical discrimination.  However, few span all levels.  The objective of the work herein is to build a computational model of cutaneous skin and tactile neuron, and then validate its predictions of skin surface deflection, single-afferent firing rate to indenter shift, and population response for sphere discrimination.  The model uses a 3D finite element representation of the distal phalange with hyper- and visco-elastic mechanics to model a population of receptors distributed over its surface.  Each receptor model is comprised of a bi-phasic function to represent Merkel cells' transformation of stress/strain to membrane current and a leaky integrate-and-fire neuronal model to generate the timing of action potentials.  Results indicate that predicted skin surface deflection matches Srinivasan's human observations for 50 micron and 3.17 mm cylinders, and single-afferent responses achieve R² = 0.81 when compared to Johnson's primate recordings. Additionally, sphere discrimination results correlate with Goodwin's psychophysical experiments, whereby 287 and 365 m⁻¹ spheres are discriminable, but not 287 and 296 m⁻¹.  The model predicts that a sensor density of 100 receptors/cm², as Johansson observed at the fingertip, is required at the limit of human discrimination.

2. The next generation of prosthetic limbs will restore tactile feedback to the nervous system by mimicking how skin mechanoreceptors, such as those innervated by the slowly adapting type I (SAI) afferent, produce trains of action potentials in response to compressive stimuli. Our systems integration effort seeks to computationally replicate the neural firing behavior of an SAI afferent in its response to both magnitude and rate of indentation force by integrating a force sensor, housed in a skin-like substrate, with a mathematical model of neuronal spiking, the leaky integrate-and-fire.  The effort is unique because it accounts for skin elasticity by measuring force within simulated skin, utilizes few free parameters, and separates parameter fitting and model validation using response surface methodology.  Additionally, the paradigm of ramp-and-hold, sustained stimuli ties with tasks of object manipulation and grasp.  Ramp-and-hold experiments were conducted on both the spiking-sensor model and mouse SAI afferents.  The results indicate that model-produced spike firing compares favorably with that observed for SAI afferents. As indentation magnitude increases (1.2, 1.3, to 1.4 mm), the time between spikes decreases from 98.81, 54.52, to 41.11 ms. Moreover, as rate of ramp-up increases, the time between spikes decreases from 21.85, 19.98, to 15.42 ms.

Figure 1. The model is validated at each of three points; the skin mechanics sub-model, single SAI electrophysiological response, and population response. (1) shows the indentation of a spherical stimulus into the skin mechanics model, (2) denotes the response in neural spike times for a single receptor directly underneath the sphere, and (3) shows the response from a population of 3 receptors. The shaded region under "Neural Spike Times" signifies the 50 ms timeframe in which the indenter was moving into the skin.

Figure 2. (left) 3D FE mesh of human distal phalange. Shown are the (a) overall mesh, (b) cross section of the mesh near the interconnect with the middle phalange, (c - d) longitudinal section for both the outer surface and inner mesh, and (e) four layers of microstructures.  In (e) the epidermis is 0.471 mm thick (0.371 mm stratum corneum and 0.1 mm living epidermis) and the dermis is 1.153 mm thick, (center) Mapping real-world objects to idealized primitives, (right) Sensor distribution for two biological variables: population layout and density. 


Simulation Framework for Training Chest Tube Insertion Using Virtual Reality and Force Feedback

Major goals: To train the cognitive and motor tasks that underlie the chest tube insertion procedure to medical and nursing students.  This emergency procedure is required under conditions of pneumothorax (air leak from lung into chest), hemothorax (bleeding into chest) and empyema (pus in the chest).

Example sub-projects:
1. Developing the first virtual simulator for training chest tube insertion

2. Utilizing force feedback robotic devices (SensAble OMNI) and programming graphic and haptic interaction

3. Designing a pre-simulation test, vitals monitor, virtual operating room for hands-on interaction, status aids, and a post-performance report.

4. Focusing on teaching cognitive tasks.

5. Creating a reconfigurable virtual environment that builds upon learning concepts (the grouping and presentation of cognitive tasks in blocks; navigation and status aids) to help trainees more readily learn the examination's numerous steps

6. Breaking down the 18 procedural steps into 6 major tasks within the simulation

Collaborators: Dr. Marcus Martin (Medicine, UVa) and Prof. Reba Moyer Childress (Nursing, UVa)