By Jennifer Conrad | 08.16.21

Partners at EY, the accounting giant formerly known as Ernst & Young, are testing a new workplace gimmick. They spice up client presentations and routine emails with synthetic talking-head-style video clips starring virtual body doubles of themselves, made with AI software, a corporate spin on the technology commonly known as deepfakes. EY calls its virtual doubles ARIs, for artificial reality identities, rather than deepfakes. As Tom Simonite reports, the firm's exploration of the technology, provided by UK startup Synthesia, comes as the pandemic has quashed more traditional ways to cement business relationships.

To create an ARI, the subject sits in front of a camera for about 40 minutes, reading a special script. The footage and audio give Synthesia's algorithms enough examples of the person's facial movements and how they pronounce different phonemes to mimic their appearance and voice. After that, generating a video of a person's ARI is as easy as typing out what they want to say.

Two for One?

EY partners have used their doubles in emails and presentations. One partner who does not speak Japanese used the translation function built into Synthesia's technology to display his AI avatar speaking the native language of a client in Japan, to apparently good effect. Read whether these virtual doubles have a future in business.

In December 2020, researchers at DeepMind said they'd solved the problem of predicting a protein's structure from the sequence of amino acids that makes it up. The Google subsidiary achieved the feat using AlphaFold2, a multilayered neural network that generates a predicted 3D structure for a protein when given its amino acid sequence. Helping scientists figure out what proteins look like can facilitate research into the inner workings of cells and, by revealing ways to inhibit particular proteins, potentially aid in drug discovery, writes Grace Huckins.

So when DeepMind was slow to make the code public, a team at the University of Washington set out to recreate it. Its model, a protein prediction algorithm called RoseTTAFold, lets researchers submit amino acid sequences and see predictions without getting their hands dirty with computer code. In the end, both DeepMind and the Washington researchers published their findings this summer.

Why Time Is of the Essence

Understanding the three-dimensional structure of a protein that's essential to a pathogen's survival could help scientists develop drugs to fight that pathogen. The applications could extend to the pandemic; DeepMind used a version of AlphaFold2 to predict the structures of some SARS-CoV-2 proteins last August. Read about the race to solve this fundamental problem in biology.

Researchers at UC Davis recently tested an early detection system that uses AI to classify admissions to animal rehabilitation centers, in the hope of generating warnings about problems among marine birds and other animals. As Jennifer Clare Ball reports, the system scans intake reports from 30 California centers, which list information such as each animal's species, age, reason for admission, and diagnosis. The AI then uses natural language processing to categorize the reports, looking for patterns in admissions related to certain illnesses and injuries. When the system detects an unusually high number of cases in a given species, it automatically sends an alert to wildlife experts.
Because the system processes rehab center admission data in just a day or two, it can produce "prediagnostic" alerts, flagging potential outbreaks faster than waiting for diagnoses to be confirmed.

Not Just for the Birds

An early warning system can help stop animal disease outbreaks in their tracks, and it can benefit human health too. "Wild animals can serve as an early indicator" of diseases like West Nile virus, says UC Davis' Terra Kelly. Read about the promise of AI in detecting animal health issues.
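The UC Davis team's code isn't published here, but as a minimal sketch of the spike-detection idea, assuming intake reports have already been categorized by the NLP step described above, a system might flag any species whose recent admission count exceeds a historical baseline. All names, data, and the threshold rule below are hypothetical illustrations, not the researchers' method:

```python
from collections import Counter

# Hypothetical categorized intake records: (species, category) pairs as they
# might emerge from the NLP step. Real reports carry more fields, such as
# age, reason for admission, and diagnosis.
recent_reports = [
    ("common murre", "emaciation"),
    ("common murre", "emaciation"),
    ("common murre", "emaciation"),
    ("brown pelican", "injury"),
]

# Hypothetical historical baseline: typical admissions per species over a
# comparable time window.
baseline = {"common murre": 1.0, "brown pelican": 2.0}

def spike_alerts(reports, baseline, threshold=2.0):
    """Return species whose recent admission count exceeds `threshold`
    times their historical baseline (an assumed definition of
    'unusually high'; the article does not specify one)."""
    counts = Counter(species for species, _ in reports)
    return [
        species
        for species, count in counts.items()
        if count > threshold * baseline.get(species, 1.0)
    ]

for species in spike_alerts(recent_reports, baseline):
    # In the real system, this step would notify wildlife experts.
    print(f"ALERT: unusually high admissions for {species}")
```

A production system would presumably use a statistical baseline adjusted for season and region rather than a fixed multiplier, but the core idea is the same: categorize reports first, then compare counts against what's normal for each species.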