For decades, academic research has driven the latest in cutting-edge technology. Now, cutting-edge technology is at the forefront of university research. On Tuesday, March 22, the University of Arizona celebrated the grand opening of its Sensor Lab, a high-tech research center focused on human physiology and health.
Under the umbrella of UA Health Sciences, the Sensor Lab offers virtual reality, wearable sensors, 360° video and other technologies that enable unique research in a variety of fields. The lab supports university faculty and students, but it is also available to inventors and local businesses interested in developing custom technologies.
Sensor Lab projects range from virtual reality for training at the UA College of Nursing, to wearable sensors for remote tai chi classes for the elderly, to augmented art where dancers can interact with projections.
“I would say our strength here is in sensor-based research, where we provide expertise and connections and opportunities for different departments to work together,” said Sensor Lab coordinator Gustavo Almeida. “We bring together capabilities that would otherwise be difficult to organize. Not only could there be unexpected research results, but unexpected applications.”
The UA College of Engineering presented two research projects at the Sensor Lab. The first is a series of 3D-printed wearables that can track health data, similar to a Fitbit. However, these custom wearable devices are designed to detect frailty in the elderly and can be charged wirelessly. The devices are powered wirelessly from up to two meters away; when out of range, they can run on battery for about eight hours.
“Frailty is actually a pretty big issue, but maybe it’s not too prevalent in people’s minds. People can have very poor outcomes if it’s not detected early, but if you catch it, you can improve quality of life and prolong life. However, it’s a very gradual process that can be difficult to detect early,” said assistant professor of
biomedical engineering Philipp Gutruf. “So we’ve developed sensors that people can wear at home or at work, and we can get 24/7 health data that’s a lot more complex than what you’d get from a Fitbit.”
The second engineering project is a touch-sensing robotic arm that gives surgeons a “superhuman feel.” The robotic arm is designed to sense things like tissue stiffness and heat in ways that human senses can’t. Although neither of these research projects is yet used outside the lab, practicing doctors are beginning to experiment with them.
“Because the Sensor Lab is right next to the hospital, surgeons can come here to try out the robots and train. We monitor them closely and then can gauge how well these devices are working to help them,” Gutruf said. “In this Sensor Lab, we have a very controlled environment that allows us to train our algorithms to make very accurate diagnoses… It’s a great place with a variety of projects, but they all have the sensors in common. The focus is on the capabilities we can bring to these projects with this new facility.”
The Sensor Lab focuses not only on new research but also on the development of new types of sensors. If a company is developing a new type of sensor, the lab can provide the space to do so, and it can also evaluate other sensors currently available on the market, according to Jennifer Barton, professor of engineering and optical sciences.
“We can reconfigure the spaces to look like a hospital room, for example. It’s a really unique space that would be really hard to have anywhere else,” Barton said. “Having this wide array of sensors is great, not just commercially available ones but also experimental sensors… These kinds of collisions between disciplines are great for generating new insights.”