Google not only invests in consumer electronics companies like Nest; it also invests heavily in healthcare companies. These include 23andMe, a privately held personal genomics and biotechnology company, and Foundation Medicine, a cancer diagnostics company. Another is Verily Life Sciences, whose mission is to make the world’s health data useful so that people enjoy healthier lives. And, judging by its latest publication in Nature Biomedical Engineering, that is exactly what it has done, jointly with the parent company Google, whose staff helped develop the artificial intelligence algorithms.
In a nutshell, the researchers used existing eye (retina) scans and patients’ medical histories to train a computer program to look for patterns. These patterns included the risk of cardiovascular events such as stroke, but also signals of a person’s age and whether that person was a smoker. The scientists trained the algorithm on data from about 300,000 patients and tested its predictive ability on two independent data sets, one with about 12,000 patients and the other with about 1,000. The mean absolute error for age prediction was 3.3 years, and the blood pressure error was around 11 mm Hg. The system is not perfect, though: in about 30% of cases the program got the cardiovascular-risk prediction wrong.
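To make the evaluation concrete, here is a minimal sketch of how a mean absolute error like the ones quoted above is computed on a held-out validation set. The numbers are invented for illustration; this is not the authors’ code or data.

```python
# Illustrative sketch: a model trained on one cohort is scored on held-out
# validation patients using mean absolute error (MAE).
# All values below are made up; this is not the study's code or data.

def mean_absolute_error(true_values, predicted_values):
    """Average of |true - predicted| across all patients."""
    return sum(abs(t - p) for t, p in zip(true_values, predicted_values)) / len(true_values)

# Hypothetical ground truth and model predictions for five validation patients.
true_ages = [54, 61, 47, 70, 58]
predicted_ages = [51, 65, 49, 66, 60]

true_sbp = [132, 118, 145, 160, 125]       # systolic blood pressure, mm Hg
predicted_sbp = [140, 122, 133, 150, 131]  # model's predictions, mm Hg

print(mean_absolute_error(true_ages, predicted_ages))  # → 3.0 (years)
print(mean_absolute_error(true_sbp, predicted_sbp))    # → 8.0 (mm Hg)
```

In the actual study, the same kind of averaging was done over thousands of patients in each validation cohort, yielding the reported 3.3-year and ~11 mm Hg errors.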
However, considering that this is a non-invasive method, and that simply looking into a camera could reveal with high probability whether you are at risk of stroke, this is already very impressive. It is not unreasonable to think that this capability will come to smartphones in the not-too-distant future.
Publication reference: Nature Biomedical Engineering (2018)