- Kim Osborne
What can Google's 'Project Soli' do for Healthcare?
Introduction to Google's 'Project Soli'
Soli is a new sensing technology that uses miniature radar to detect touchless gesture interactions.
Soli sensor technology works by emitting electromagnetic waves in a broad beam.
Objects within the beam scatter this energy, reflecting some portion back towards the radar antenna. Properties of the reflected signal, such as energy, time delay, and frequency shift, capture rich information about the object’s characteristics and dynamics, including size, shape, orientation, material, distance, and velocity.
Soli tracks and recognizes dynamic gestures expressed by fine motions of the fingers and hand. In order to accomplish this with a single chip sensor, we developed a novel radar sensing paradigm with tailored hardware, software, and algorithms. Unlike traditional radar sensors, Soli does not require large bandwidth and high spatial resolution; in fact, Soli’s spatial resolution is coarser than the scale of most fine finger gestures. Instead, our fundamental sensing principles rely on motion resolution by extracting subtle changes in the received signal over time. By processing these temporal signal variations, Soli can distinguish complex finger movements and deforming hand shapes within its field.
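The motion-resolution principle described above can be illustrated with a toy simulation. This is not Soli's actual processing; it is a minimal sketch, assuming a single scatterer and a plain FFT over slow time, of why a 60-GHz radar with coarse range resolution can still resolve sub-millimeter finger motion: at a ~5 mm wavelength, even a 0.5 mm displacement produces a large, easily measurable phase change in the received signal over time.

```python
import numpy as np

C = 3e8
F_CARRIER = 60e9                 # Soli operates in the 60 GHz ISM band
WAVELENGTH = C / F_CARRIER       # ~5 mm

FRAME_RATE = 2000.0              # slow-time frame rate in frames/s (illustrative)
N_FRAMES = 200

def received_phase(distances):
    """Round-trip phase of the echo from a single scatterer."""
    return 4 * np.pi * distances / WAVELENGTH

# A finger oscillating by only 0.5 mm at 20 Hz -- far smaller than the
# radar's range resolution, but clearly visible as phase modulation.
t = np.arange(N_FRAMES) / FRAME_RATE
d = 0.10 + 0.0005 * np.sin(2 * np.pi * 20 * t)   # distance in metres

echo = np.exp(1j * received_phase(d))            # unit-amplitude echo per frame

# Spectrum over slow time: sidebands at +/-20 Hz reveal the motion.
spectrum = np.abs(np.fft.fftshift(np.fft.fft(echo)))
freqs = np.fft.fftshift(np.fft.fftfreq(N_FRAMES, 1 / FRAME_RATE))

masked = spectrum * (np.abs(freqs) > 1.0)        # ignore the stationary (DC) return
peak_freq = abs(freqs[np.argmax(masked)])
print(f"dominant motion frequency: {peak_freq:.1f} Hz")
```

The stationary component of the scene lands at 0 Hz and is discarded; what remains is purely the temporal signal variation that Soli's sensing paradigm exploits.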
Soli Gesture Recognition and Hardware
The Soli software architecture consists of a generalized gesture recognition pipeline which is hardware agnostic and can work with different types of radar. The pipeline implements several stages of signal abstraction: from the raw radar data to signal transformations, core and abstract machine learning features, detection and tracking, gesture probabilities, and finally UI tools to interpret gesture controls.
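The staged pipeline described above can be sketched as a chain of composable transforms. The stage functions and their contents below are assumptions for illustration only (the feature choices and the placeholder classifier are not the Soli implementation); the point is the hardware-agnostic structure: raw data in, successive abstractions out.

```python
from typing import Callable, List
import numpy as np

# One pipeline stage: array in, array out (shapes may differ between stages).
Stage = Callable[[np.ndarray], np.ndarray]

def signal_transform(raw: np.ndarray) -> np.ndarray:
    """Stage 1 (illustrative): spectral transform of the raw radar frame."""
    return np.abs(np.fft.fft(raw, axis=-1))

def core_features(transformed: np.ndarray) -> np.ndarray:
    """Stage 2 (illustrative): low-level scalar features -- total energy
    and spectral centroid."""
    energy = transformed.sum()
    bins = np.arange(transformed.shape[-1])
    centroid = (transformed * bins).sum() / max(energy, 1e-9)
    return np.array([energy, centroid])

def gesture_probabilities(features: np.ndarray) -> np.ndarray:
    """Stage 3 (placeholder): softmax over a fixed random projection,
    standing in for a trained gesture classifier."""
    rng = np.random.default_rng(0)                   # stand-in for learned weights
    logits = rng.standard_normal((3, features.shape[0])) @ features
    exp = np.exp(logits - logits.max())
    return exp / exp.sum()

def run_pipeline(raw_frame: np.ndarray, stages: List[Stage]) -> np.ndarray:
    out = raw_frame
    for stage in stages:
        out = stage(out)
    return out

probs = run_pipeline(np.random.default_rng(1).standard_normal(64),
                     [signal_transform, core_features, gesture_probabilities])
print(probs)   # pseudo-probabilities over three hypothetical gestures
```

Because each stage only agrees on an array interface, swapping radar hardware only requires replacing the earliest stages, which is the sense in which such a pipeline is hardware agnostic.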
The Soli SDK enables developers to easily access and build upon our gesture recognition pipeline. The Soli libraries extract real-time signals from radar hardware, outputting signal transformations, high precision position and motion data, and gesture labels and parameters at frame rates from 100 to 10,000 frames per second.
The Soli sensor is a fully integrated, low-power radar operating in the 60-GHz ISM band. In our journey toward this form factor, we rapidly iterated through several hardware prototypes, beginning with a large bench-top unit built from off-the-shelf components -- including multiple cooling fans. Over the course of 10 months, we redesigned and rebuilt the entire radar system into a single solid-state component that can be easily integrated into small, mobile consumer devices and produced at scale.
The custom-built Soli chip greatly reduces radar system design complexity and power consumption compared to our initial prototypes. We developed two modulation architectures: a Frequency Modulated Continuous Wave (FMCW) radar and a Direct-Sequence Spread Spectrum (DSSS) radar. Both chips integrate the entire radar system into the package, including multiple beamforming antennas that enable 3D tracking and imaging with no moving parts.
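The FMCW architecture mentioned above follows a standard principle that can be shown in a few lines: a transmitted chirp is mixed with its delayed echo, and the resulting beat frequency is proportional to target range. The bandwidth, chirp duration, and target distance below are illustrative assumptions, not Soli chip specifications.

```python
import numpy as np

C = 3e8
BANDWIDTH = 4e9          # illustrative chirp bandwidth within the 60 GHz ISM band
CHIRP_TIME = 1e-3        # illustrative chirp duration (s)
N_SAMPLES = 4096
FS = N_SAMPLES / CHIRP_TIME

target_range = 0.30      # a hand 30 cm from the sensor (assumed)

slope = BANDWIDTH / CHIRP_TIME            # chirp rate in Hz/s
tau = 2 * target_range / C                # round-trip delay
f_beat = slope * tau                      # beat frequency after mixing

# De-chirped (mixed-down) receive signal: a tone at the beat frequency.
t = np.arange(N_SAMPLES) / FS
beat = np.cos(2 * np.pi * f_beat * t)

# Range estimation: locate the beat tone, then invert the FMCW relation
# range = f_beat * c * T_chirp / (2 * B).
spectrum = np.abs(np.fft.rfft(beat))
freqs = np.fft.rfftfreq(N_SAMPLES, 1 / FS)
est_beat = freqs[np.argmax(spectrum)]
est_range = est_beat * C * CHIRP_TIME / (2 * BANDWIDTH)
print(f"estimated range: {est_range:.3f} m")
```

Note that with a 4 GHz bandwidth the range resolution c/2B is about 3.75 cm, far coarser than a fine finger gesture, which is exactly why Soli leans on the temporal motion-resolution approach described earlier rather than on spatial resolution alone.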
What can Google's 'Project Soli' do for Healthcare?
In March 2015, Google’s Advanced Technology and Projects group (ATAP) announced Project Soli, which uses radar technology to create a touch-free interface controlled by hand gestures. While much of the hype has been around the implications for consumer wearables, Project Soli may also drive change in how clinicians and patients interact with technology.
Using radar rather than cameras, Project Soli appears to stand out from other gesture-based technologies by providing extremely high granularity and fast response times. The release video demonstrates its ability to quickly recognize sub-millimeter motions such as a finger flick or a small hand rotation, and shows an extremely small form-factor chip that can be embedded behind certain surfaces.
The implications for healthcare are very exciting: the ability to easily and intuitively control electronic information through gestures could radically change how we access information in hospital settings.
Eliminating Input Devices
The explosion of electronic information use in healthcare has driven a corresponding expansion in the number of computers, tablets and dashboards used in the hospital environment, and it is not unusual to find thousands of such devices in a medium-sized acute care hospital (300-500 beds). This poses a very real challenge for infection control professionals, as many of these surfaces are touched constantly but can be damaged by the strong chemicals used to clean them. Despite the best infection control procedures, these surfaces often become a breeding ground for two of Canada’s most common hospital-acquired infections (HAIs): methicillin-resistant Staphylococcus aureus (MRSA) and Clostridium difficile.
Incorporating Project Soli-type chips would allow users to control devices without touching them, reducing the risk of spreading infections. This could mean that not only would there be fewer germs present, but also that surfaces could be constructed with more durable materials if they do not have to recognize touch inputs.
Gesture recognition has the potential to streamline surgical procedures and provide better information to surgeons as they are operating, by allowing them to manipulate CT and MRI images without touching anything outside the sterile field. In this context, Project Soli does not represent a disruptive change, but rather an incremental improvement on technology already being pioneered in some hospitals.
A Kinect-based gesture recognition system named GestSure has been developed in Toronto and was first demonstrated at Sunnybrook Hospital in 2011, and Microsoft has worked with hospitals in London, England to develop a similar Kinect-based technology that meets this need. However, the increased responsiveness and granularity of Project Soli’s device may make gesture recognition in ORs more intuitive and easy to use – which is important when driving adoption by clinicians.
Finding a Place in Healthcare
Project Soli is still in the very early stages of development, which means it will likely be several years before it is commercially available – and even reaching that stage will depend on how it is adopted by application developers and hardware manufacturers – but the possibilities for healthcare are immense.
Source : http://atap.google.com/soli/
Source : https://www.linkedin.com/pulse/what-googles-project-soli-means-healthcare-kim-osborne-p-eng-/
Kim is a professional engineer specializing in Information Technology strategic planning and technical design on a range of healthcare & commercial projects throughout Canada. She excels at supporting projects from initiation through to implementation, and works closely with users and stakeholders to translate workflow needs and challenges into functional requirements and technical planning documentation. Her coordinated and knowledgeable approach to developing effective stakeholder engagement ensures that recommendations and project deliverables are feasible, and tailored to the unique needs of each client. Kim has applied her technical and business skills to drive the success of a range of IT, security and AV design projects from visioning to construction, and works closely with other consultants to ensure a coordinated approach to planning. Her experience includes technology strategic planning for multiple hospital clients across Canada, network design and output specifications for large hospital redevelopment projects, work on large P3 construction projects, and construction and commissioning support for technology systems implementation. Her extensive industry knowledge and real-world experience provide valuable strategic guidance throughout the IMIT visioning and planning stages, ensuring that solutions are value-driven, forward-thinking and can be realistically implemented.