Mak Wen Yao

Digital phenotyping: The new era of healthcare technology or unauthorized pervasive surveillance?

Smartphones and the internet have taken over the lives of many. Old or young, many people today are glued to their mobile devices and inevitably leave behind a unique digital footprint that could be utilized for healthcare purposes. The idea of deciphering a person's well-being from their online activities seems far-fetched at first. How could a simple mouse click or a few keyboard strokes accurately depict the inner workings of a person's mind? As it turns out, an average person interacts with his or her phone more than 2,600 times daily, while accumulating an average of 2.4 hours of screen time every day. When these bits and pieces of information are accrued over time, a representative picture of that person's behaviour can be constructed.

Digital phenotypes

The emerging field of digital phenotyping is a multidisciplinary approach to better understanding human behaviour. "Our interactions with the digital world could actually unlock secrets of disease," stated Dr Sachin Jain, Chief Executive of CareMore Health. Dr Jain was also among the first researchers to introduce the term “digital phenotypes”. As smartphone ownership grows rapidly, these mobile devices are perfectly suited for digital phenotyping. Coupled with a well-designed application, smartphones could help researchers collect many different types of data that were previously impossible to obtain.

Dr Sachin Jain, Chief Executive of CareMore Health, who attempted to use Twitter as a research tool for studying sleep issues.

A good example is the artificial intelligence (AI) algorithm that Facebook plans to incorporate into its social media platform to help detect signs of possible suicidal thoughts. Natasha Singer of The New York Times called it "one of the most ambitious efforts [in digital phenotyping]." To better identify specific keywords associated with suicidal thoughts, the Facebook team relies on machine learning to discern the context behind a set of predetermined keywords. The algorithm is programmed to scan posts or live streams for specific language patterns that suggest a potential for self-harm. It also cross-references them with related comments from family and friends, such as “Are you OK?” or “Can I help you?”, to assess suicidal intent.
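To make the idea concrete, the rule-plus-context screening described above can be sketched in a few lines. This is a toy illustration only: Facebook's actual classifier is a proprietary machine-learning model, and the phrase lists, scoring and threshold here are invented for the example.

```python
# Toy sketch of keyword-plus-context screening. All phrase lists and the
# threshold are invented; a real system would use a trained language model.

RISK_PHRASES = {"no reason to go on", "i want to disappear", "end it all"}
CONCERN_PHRASES = {"are you ok", "can i help you", "please call me"}

def flag_post(post: str, comments: list[str]) -> bool:
    """Return True if a post plus its comments crosses a toy risk threshold."""
    text = post.lower()
    score = sum(phrase in text for phrase in RISK_PHRASES)
    # Cross-reference: concerned replies from friends and family raise the
    # score, mirroring the "Are you OK?" signal described in the article.
    score += sum(
        any(phrase in comment.lower() for phrase in CONCERN_PHRASES)
        for comment in comments
    )
    return score >= 2
```

Even in this toy form, the design choice is visible: a worrying post alone is ambiguous, so corroborating reactions from the poster's social circle are used to raise confidence before flagging.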

Digital phenotyping is not restricted to merely scanning social media posts. As argued by Jukka-Pekka Onnela and Scott L Rauch, data captured via a smartphone or other wearables – such as spatial trajectories via GPS, physical mobility patterns, social interactions and even voice samples – could shed light on complicated clinical cases. For example, spatial data could reveal how a depressed person spends his or her time at home and at work, while phone communication logs could reveal the size and interactions of that person's social network. These "social markers," as Onnela and Rauch call them, could provide information that is actionable and clinically relevant to treatment.
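As an illustration of how raw GPS trajectories become a "social marker", the sketch below summarises a day's location fixes into a single time-at-home fraction. The coordinates, home radius and function names are invented for the example; real digital-phenotyping pipelines are considerably more sophisticated.

```python
# Illustrative sketch: turning GPS fixes into a "time at home" marker.
# The 100 m home radius is an invented parameter for this example.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + \
        cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def fraction_at_home(fixes, home, radius_km=0.1):
    """Fraction of GPS fixes falling within radius_km of the home location."""
    near = sum(haversine_km(lat, lon, *home) <= radius_km for lat, lon in fixes)
    return near / len(fixes)
```

A clinician-facing system might track this fraction over weeks; a sustained rise could correspond to the withdrawn, housebound pattern the article associates with depression.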

Smartphones and wearables could be used to monitor and collect health and geospatial data in real time, generating a wealth of potentially useful information for clinical use.

Real-time data capture also allows clinicians to detect subtle social or behavioural changes that warrant closer inspection. For example, a sudden decrease in social activity may indicate an imminent depressive relapse. Essentially, both smartphone data capture and Facebook's screening operate on the basis of identifying stereotypical communication patterns related to certain psychiatric tendencies, allowing timely intervention.
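The kind of change detection described above can be sketched as a simple baseline comparison over daily communication counts. The window sizes and drop threshold are invented for the illustration; a clinical system would need validated, personalised thresholds.

```python
# Illustrative sketch: flag a sudden drop in social activity by comparing a
# short recent window of daily contact counts against a longer baseline.
# The 14-day baseline, 3-day window and 50% threshold are invented values.
from statistics import mean

def social_activity_alert(daily_counts, baseline_days=14, recent_days=3,
                          drop_ratio=0.5):
    """Return True if recent activity falls below drop_ratio of baseline."""
    if len(daily_counts) < baseline_days + recent_days:
        return False  # not enough history to judge
    baseline = mean(daily_counts[-(baseline_days + recent_days):-recent_days])
    recent = mean(daily_counts[-recent_days:])
    return recent < drop_ratio * baseline
```

The point of the sketch is the principle, not the numbers: each person serves as his or her own control, which is what lets passive data reveal a change that a single clinic visit would miss.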

Ethics, privacy and the right to practise medicine

Facebook's attempt to filter and identify suicidal tendencies among millions of its users is, perhaps, founded on the goodwill to help the vast community it serves. However, critics are not entirely convinced that the method is both effective and ethical. Currently, there is little evidence that a person's digital activity can be reliably translated into meaningful medical information, as too many confounding factors could affect a person's online interactions. "It's this whole new potential for snake oil," said Dr Steve Steinhubl, the Director of Digital Medicine at Scripps Translational Science Institute, describing digital phenotyping.

The effort also poses complicated ethical questions. The fundamental element of “informed consent” is glaringly missing from the whole Facebook screening exercise, where users were not given the choice of opting out. The issues of consent, access to data, privacy and confidentiality must assume top priority, given the large amount of sensitive information that may be collected via smartphone-based digital phenotyping or social media scans.

"The problem is just a larger issue which is, once you have tech companies implementing unilateral suicide detection or algorithms, then they can also very easily develop Alzheimer's detection algorithms or depression detection algorithms or addiction detection algorithms, and also say that we are doing this for good. And it becomes very surveillance and Big Brother," said Singer while responding to On Point radio host Jane Clayson on the Facebook initiative. Dr Steinhubl also commented that Facebook is "certainly right up to that line of practising medicine not only without a license, but maybe without proof that what they are doing provides more benefit than harm."

Digital phenotyping has the potential to offer clinicians and researchers a viable alternative for understanding their patients and the underlying diseases. As with any other technological development, however, it could be a double-edged sword and warrants serious consideration before indiscriminate use of the technology becomes widespread. MIMS
