Project Silica: Microsoft’s Laser-Written Glass Could Store Data for 10,000 Years: Microsoft researchers have built an end-to-end archival storage system that writes digital data into borosilicate glass—the same family of glass used in ovenware—and is designed to keep it readable for at least 10,000 years. Instead of magnetizing tape or disk surfaces that degrade within a decade or two, the team uses an ultrafast, high-energy laser to create microscopic 3D “voxels” (tiny deformations) inside the glass. Data are retrieved by scanning the glass with a microscope and decoding the optical signatures with machine learning, including techniques that separate signals across hundreds of stacked layers. The paper reports 4.8 TB stored in a 12 cm-wide, 2 mm-thick glass tile—roughly 2 million books’ worth—and aging tests suggesting extreme heat tolerance and far longer lifetimes at room temperature. (Nature)
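
A quick back-of-envelope check of the reported figures. The tile geometry below is an assumption: the article gives a 12 cm width and 2 mm thickness, and a square tile is assumed for the volume estimate; the per-book and per-cubic-centimeter numbers are derived, not from the paper.

```python
# Back-of-envelope sketch of the reported Project Silica figures.
# Assumption: a square 12 cm x 12 cm x 2 mm tile (the article states
# only width and thickness); decimal units (1 TB = 10^12 bytes).
TB = 1e12  # bytes

capacity_bytes = 4.8 * TB
books = 2_000_000

bytes_per_book = capacity_bytes / books              # average data per "book"
tile_volume_cm3 = 12 * 12 * 0.2                      # assumed square tile, cm^3
density_gb_per_cm3 = capacity_bytes / tile_volume_cm3 / 1e9

print(f"{bytes_per_book / 1e6:.1f} MB per book")     # ~2.4 MB, plausible for text
print(f"{density_gb_per_cm3:.0f} GB per cm^3")
```

At roughly 2.4 MB per book, the "2 million books" framing is consistent with plain-text volumes rather than scanned page images.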

A Neural Blueprint for Human-Like Intelligence in Soft Robots: MIT researchers report an AI control system that helps soft robotic arms learn a broad “vocabulary” of motions once, then adapt to new conditions without retraining. The approach is aimed at a classic soft-robotics problem: compliance is safe, but it’s hard to control precisely when the environment changes. In demonstrations, the arm learned target shapes (including a curved “C”) and then held performance under disturbances like fixed and continuously changing fan speeds. In the toughest anti-disturbance scenario, the system still reached the target with 93.8% accuracy, suggesting a path toward soft devices that can be both gentle and reliable. The team frames the method as a step toward assistive, rehab, and medical soft robots that must operate safely around people. (MIT News)

Tech Is Taking Over Olympic Curling—With Robots in the Mix: A new IEEE Spectrum feature traces how curling—already a sport obsessed with millimeters—has become a testbed for sensing, analytics, and robot-assisted precision. The piece connects recent “double-touch” controversies to an arms race of measurement: higher-fidelity tracking, AI-based interpretation, and experimental curling robots designed to deliver repeatable throws in messy real-world ice conditions. Advocates argue that better instrumentation can clarify ambiguous calls and accelerate training feedback loops. Critics worry that outsourcing nuance to machines risks flattening the human craft that defines curling’s ethos. The article also highlights how robotic systems force the sport to define what it’s actually judging: outcome alone, or the human execution that produced it. (IEEE Spectrum)

New “Mars GPS” Lets Perseverance Self-Localize to 25 Centimeters: NASA engineers have rolled out a localization upgrade for the Perseverance rover that functions like “GPS on Mars,” enabling the rover to pinpoint its position to within about 25 centimeters. The payoff is autonomy: if the rover can reliably estimate where it is without constant human-in-the-loop corrections, it can drive farther between check-ins and spend more time doing science. A JPL robotics operations lead describes the change as letting Perseverance determine its own location, supporting longer autonomous traverses and faster exploration cadence. The technique is positioned as broadly reusable—potentially benefiting other rovers that need to move “fast and far” while keeping navigation error tightly bounded in a GPS-less environment. (phys.org)

A Robotic Root-Imaging Factory to Accelerate Tougher Crops: Oak Ridge National Laboratory has launched APPL, a robotic platform that repeatedly images plant roots growing in soil—without digging them up—producing large, AI-ready datasets to speed crop research. Roots are notoriously hard to measure at scale, yet they drive water uptake, nutrient acquisition, and stress tolerance. APPL aims to automate that hidden half of plant biology: standardized chambers, repeatable imaging, and continuous time-series data that can be linked to aboveground traits. ORNL says the resulting datasets are analyzed with AI and its Frontier exascale supercomputer, with an eye toward breeding or engineering crops better suited to drought, poor soils, and future climate volatility. It’s a robotics-meets-biology pipeline designed for throughput, not one-off experiments. (phys.org)

The “Goldilocks” Speed for AI Prosthetic Arms: About One Second per Reach: A ScienceDaily report spotlights a VR study suggesting that how fast an AI-powered prosthetic moves can determine whether it feels helpful—or unsettling. Researchers tested different autonomous reach speeds and found a sweet spot: movements that took about one second per reach were most likely to be perceived as natural. Too fast felt “creepy,” while too slow felt awkward and inefficient. In the preferred range, participants reported stronger feelings of control, comfort, and trust, pointing to an underappreciated design constraint for assistive robotics: acceptance isn’t just about accuracy, but about matching human expectations for timing and intent. The work frames motion speed as a tunable parameter that could improve real-world adoption of AI prostheses and other human-facing robots. (Science Daily)
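
The study reports only a preferred duration (about one second per reach), not a trajectory shape. One common model for natural-looking reaches, used here purely as an illustrative sketch and not taken from the paper, is the minimum-jerk profile, which makes the total duration `T` exactly the kind of tunable parameter the study describes.

```python
# Minimum-jerk reach profile: a standard model of smooth, human-like
# point-to-point motion. The duration T is the "Goldilocks" knob the
# study suggests tuning; the profile itself is an assumption here.
def minimum_jerk(x0: float, xf: float, T: float, t: float) -> float:
    """Position at time t for a reach from x0 to xf lasting T seconds."""
    tau = min(max(t / T, 0.0), 1.0)               # normalized time in [0, 1]
    s = 10 * tau**3 - 15 * tau**4 + 6 * tau**5    # smooth S-curve, s(0)=0, s(1)=1
    return x0 + (xf - x0) * s

# A ~1 s reach from 0 to 30 cm, sampled at a few instants:
for t in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"t={t:.2f}s  x={minimum_jerk(0.0, 0.30, 1.0, t) * 100:.1f} cm")
```

In this framing, "too fast" and "too slow" simply correspond to small and large values of `T` for the same smooth profile, which is why duration can be tuned independently of accuracy.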

Resource-Sharing Modular Robots That Get Stronger When Parts Fail: EPFL roboticists describe a modular system where individual units share key resources—power, sensing, and communication—so the collective becomes more resilient than conventional designs. Instead of each module operating as a fragile standalone, the robot redistributes capabilities when a segment is compromised, lowering the odds that a single failure cascades into total shutdown. The team reports the idea in the context of modular origami-style robots (highlighting EPFL’s Mori3 platform), emphasizing that shared “infrastructure” can keep the system functional even as components degrade. The larger implication is a design philosophy shift: build robot bodies like robust networks, not brittle chains. That matters for field deployment, where damage, wear, and partial outages are normal—not exceptional—and recovery must be automatic. (EurekAlert!)
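
The "robust networks, not brittle chains" point can be made concrete with some illustrative probability arithmetic (an assumption for exposition, not EPFL's analysis): if each of `n` modules fails independently with probability `p`, a brittle chain dies on any single failure, while a resource-sharing design that tolerates up to `k` failed modules dies only when more than `k` fail.

```python
from math import comb

def p_chain_fails(n: int, p: float) -> float:
    """Brittle chain: any one module failing kills the whole system."""
    return 1 - (1 - p) ** n

def p_shared_fails(n: int, p: float, k: int) -> float:
    """Shared-resource design: survives up to k failed modules.

    Returns P(more than k of n independent modules fail), binomial model.
    """
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k + 1, n + 1))

n, p = 10, 0.05  # hypothetical: 10 modules, 5% failure chance each
print(f"chain fails:          {p_chain_fails(n, p):.3f}")
print(f"shared (k=2) fails:   {p_shared_fails(n, p, 2):.4f}")
```

Under these hypothetical numbers, tolerating just two failed modules cuts the system-failure probability by more than an order of magnitude, which is the intuition behind treating power, sensing, and communication as shared infrastructure.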

Robots That “See” Around Corners Using Radio Waves and AI: Researchers at the University of Pennsylvania introduce HoloRadar, a system that uses radio signals—processed with AI—to reconstruct 3D geometry in spaces outside a robot’s line of sight. The core promise is safety: robots and autonomous vehicles routinely face T-intersections, blind warehouse aisles, and cluttered indoor layouts where cameras and lidar can’t directly observe hazards. By leveraging radio waves that can penetrate or reflect in ways light cannot, HoloRadar aims to infer hidden objects (including people) beyond corners, then translate that inference into actionable spatial maps. The team positions the approach as a complementary sensing modality rather than a replacement, potentially adding a “sixth sense” for navigation in occluded environments. The release notes the work’s link to NeurIPS 2025. (EurekAlert!)

A Robotic Leg Changes Your Body Image—But Not Perfectly: A new open-access paper in PNAS Nexus tracks how people’s subjective body image evolves as they learn to walk with a wearable robotic leg over multiple days of training. The authors measure both gait performance and perceived motion after each session, testing the idea that motor learning and “incorporation” of the device co-develop. Results suggest participants improved physical walking patterns and also adjusted their perceived body image toward more typical movement—evidence that practice can integrate a wearable robot into the sensorimotor system. But the study also reports a persistent mismatch between perceived and actual motion, likely because wearers still lack direct sensation and full control of the prosthesis. The findings matter for exoskeleton design: embodiment isn’t automatic; it’s learned—and may remain incomplete. (PNAS Nexus)

Flapping-Wing Robo-Bird Uses Two Tails to Fly Fast or Slow: A New Atlas report highlights “The Swift,” a flapping-wing ornithopter that tackles a common limitation of bird-like robots: they often fly only at one brisk speed, restricting where they can operate. The design uses two interchangeable tails. With the Speed tail, it reportedly reaches up to 31 km/h (19 mph); with the Precision tail, it can maintain stable flight as slow as 3.5 km/h (2 mph), enabling indoor use. The system relies on real-time inertial measurement unit corrections and adjustable assist levels, and it’s built from resilient materials intended to survive crashes. The article frames the project as a leap from earlier micro-ornithopters (like MetaFly), pushing toward more versatile maneuvering and controllability in a consumer-scale package. (New Atlas)

The Chemist Who Taught AI to Run the Lab: Scientific American profiles chemical engineer Gabriel Gomes and his push toward “autonomous science” via AI agents that can translate natural language into executable lab procedures—interfacing with robotics and lab infrastructure. Gomes describes building Coscientist to lower the barrier to using cloud labs packed with expensive, robot-controlled equipment, so chemists don’t have to become software engineers to run experiments. The piece includes concrete examples, from simple robot demonstrations (like drawing with food coloring on well plates) to scaling up experimental datasets that would be difficult for humans to produce consistently. Gomes also stresses the need to “trust but verify,” noting that models can be overly agreeable. The throughline is a future where robotics + language models reduce drudgery and expand experimental ambition—if safety and validation keep pace. (Scientific American)

Isomorphic’s IsoDDE Raises the Bar—and the Stakes—for Closed Drug-Discovery AI: Isomorphic Labs, DeepMind’s biopharma spin-off, has unveiled a proprietary drug-discovery model called IsoDDE, described in a 27-page technical report dated 10 February. Scientists say its performance—especially in predicting drug–protein binding affinity and antibody–target interactions—looks like a leap on the scale of a future “AlphaFold4.” But unlike AlphaFold’s research-friendly releases, IsoDDE’s methods remain largely opaque, offering few clues for replication. The report claims IsoDDE outperforms both open-source alternatives such as Boltz-2 and traditional physics-based binding calculations. Critics note that Isomorphic may have benefited from private structural datasets and partnerships, though others argue major gains are still possible using public data. The announcement spotlights a growing tension: breakthrough capability versus scientific transparency. (Nature)
