The Daily Dose: Visions of a dark technological future

His university says it has no record of him. He has no obvious online footprint beyond an account on the question-and-answer site Quora, where he was active for two days in March. Two newspapers that published his work say they have tried and failed to confirm his identity. And experts in deceptive imagery used state-of-the-art forensic analysis programs to determine that Taylor’s profile photo is a hyper-realistic forgery – a “deepfake.”

The international collaboration between the Russian and American space agencies appears to be fraying at the edges. NASA Chief Jim Bridenstine claimed everything continued to be fine, despite Roskosmos Chief Dmitri Rogozin's claim to the contrary the day before. “With the lunar project, we are witnessing the departure of our American partners from the principles of cooperation and mutual support that have developed with the ISS,” Rogozin said. “Frankly speaking, we are not interested in participating in such a project.” That sounds pretty dodgy to us. Someone’s not being completely candid.

Reproducibility of research results is a central tenet that separates science from “hokey” religions (as Han Solo would say). That’s why it’s so disturbing that roughly 89% of preclinical data (mostly from animal studies) cannot be reproduced because of sloppy experiment reporting. Now scientists are trying to address the problem. According to Science, “To address the problem of poor reporting, Percie du Sert and a team of researchers have developed a checklist of 10 critical details each animal study needs to report, such as the number of animals used, their sex, whether they were randomly allocated to a test group and control group, and whether the researchers knew which animal was in which group.” This is the second attempt at establishing standardized criteria; the first largely failed.

Human racial bias pervades many of the things we create. Standardized tests are one example. Certain aspects of artificial intelligence have already shown an inclination toward racial bias. You can add another to the list: speech-to-text technologies. A paper in the Proceedings of the National Academy of Sciences investigates the problem. According to the authors, “Automated speech recognition (ASR) systems are now used in a variety of applications to convert spoken language to text, from virtual assistants, to closed captioning, to hands-free computing. By analyzing a large corpus of sociolinguistic interviews with white and African American speakers, we demonstrate large racial disparities in the performance of five popular commercial ASR systems. Our results point to hurdles faced by African Americans in using increasingly widespread tools driven by speech recognition technology. More generally, our work illustrates the need to audit emerging machine-learning systems to ensure they are broadly inclusive.” We shudder to think about how future, more advanced, AIs will perform.
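For the curious: disparities like these are typically quantified with word error rate (WER), the fraction of words an ASR system gets wrong relative to a human ground-truth transcript. Here is a minimal sketch of how WER is computed; the function name and example transcripts are ours for illustration, not from the PNAS paper.

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """WER = (substitutions + insertions + deletions) / number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    # Standard edit-distance dynamic program, computed over words instead of characters.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i  # deleting every reference word
    for j in range(len(hyp) + 1):
        d[0][j] = j  # inserting every hypothesis word
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution or match
    return d[len(ref)][len(hyp)] / len(ref)

# One dropped word out of a six-word reference gives a WER of 1/6.
print(word_error_rate("the cat sat on the mat", "the cat sat on mat"))
```

An audit like the one the authors describe boils down to computing this score over many transcripts and comparing the averages across speaker groups.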

Sign up for Scientific Inquirer’s Steady State Newsletter for the week’s top stories, exclusive interviews, and weekly giveaways. Plenty of value added but without the tax.

IMAGE SOURCE: Creative Commons
