Can conclusions be drawn about health problems from typing behavior, facial expressions, or other behavioral data available via the iPhone's sensors? The reported project partners Apple, the University of California, Los Angeles (UCLA), and the drug manufacturer Biogen want to find out, according to the Wall Street Journal (WSJ). The WSJ cites people familiar with the project and says it was able to view relevant documents.
iPhone sensors deliver data, but does this data allow such conclusions?
Specifically, Apple is said to be working with UCLA under the project name “Seabreeze” on easier diagnosis of depression, and with Biogen under the project name “Pi” on the diagnosis of cognitive impairments. In the summer, Biogen received FDA approval for a new drug for the treatment of mild cognitive disorders. Reliable diagnostics via the iPhone could help identify patients at an early stage of the disease, when drug interventions still show good effect. None of the project partners has confirmed the cooperation so far.
The sensor data collected by Apple’s iPhones include mobility, physical activity, sleep patterns, typing habits, and more. The researchers want to examine these for patterns related to the target conditions, and from this develop algorithms that can reliably recognize them.
Earlier academic studies had already suggested that such solutions have a chance of success: people with certain mental illnesses were found to use their digital devices differently than others. The extent to which these findings can be turned into reliable algorithms is still completely unclear.
Especially for the early diagnosis of mental illnesses and brain disorders, successful research by Apple would be an asset, because access to specialists who can reliably perform such diagnoses is not barrier-free, even in Western countries. With their work, Apple and its partners hope to create a generally available alternative for initial specialist contact.
According to the WSJ’s information, the project has received enthusiastic support within Apple. Apple executive Jeff Williams, who is responsible for the iPhone maker’s health division, is said to have spoken enthusiastically to his employees about the potential to combat rising rates of depression and anxiety as well as other brain disorders.
Since the data to be collected is highly sensitive, Apple reportedly wants to rely from the start on algorithms that process the data on users’ devices, i.e. without cloud involvement. To what extent the loss of trust following the at least communicatively disastrous episode surrounding the child-abuse-image scanning on customers’ iPhones will hamper Apple remains to be seen.
Depression research: pilot study is underway
Apple has worked with researchers in the past to develop health functions. For example, the detection of atrial fibrillation using the Watch’s EKG function was developed in cooperation with Stanford University.
The depression research with the University of California relies on data from the iPhone’s video camera, keyboard, and audio sensors, as well as data on movement, vital signs, and sleep. Among other things, typing speed, the frequency of typing errors, and the content of what is written are to be measured. In each individual data element, the researchers are looking for clues about the emotions, concentration, energy level, state of mind, and much more of the device users. At the same time, physical data is also collected, such as the level of the stress hormone cortisol in participants’ hair follicles. The researchers are currently collecting this data from 150 people using Apple Watch and iPhone. The study is to be expanded shortly; then the data of 3,000 study participants will be collected and evaluated.