When Google announced it was rolling out its AI-powered search feature earlier this month, the company promised that “Google will do the googling for you.” The new feature, called AI Overviews, provides brief, AI-generated summaries highlighting key information and links on top of search results.
Unfortunately, AI systems are inherently unreliable. Within days of AI Overviews being released in the US, users shared examples of the feature suggesting they add glue to pizza and eat at least one small rock a day, and claiming that former US president Andrew Johnson earned university degrees between 1947 and 2012, despite dying in 1875.
Yesterday, Liz Reid, head of Google Search, announced that the company has been making technical improvements to the system.
But why is AI Overviews returning unreliable, potentially dangerous information in the first place? And what, if anything, can be done to fix it? Read the full story.
—Rhiannon Williams
AI-directed drones could help find lost hikers faster
If a hiker gets lost in the rugged Scottish Highlands, rescue teams sometimes send up a drone to search for clues to the hiker’s route. But with vast terrain to cover and limited battery life, picking the right area to search is critical.
Traditionally, expert drone pilots use a combination of intuition and statistical “search theory”—a strategy with roots in World War II–era hunting of German submarines—to prioritize certain search locations over others.
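For a sense of what that statistical prioritization looks like, here is a minimal sketch of the Bayesian idea behind search theory: keep a probability estimate over candidate areas, search the most promising one first, and downweight an area after an unsuccessful look. The area names, prior values, and detection probability below are illustrative assumptions, not the rescue teams’ actual model.

```python
# Illustrative sketch of Bayesian search prioritization.
# The areas, prior, and detection probability are made-up assumptions.

# Prior belief about where the hiker is (sums to 1).
belief = {"ridge": 0.5, "valley": 0.3, "loch_shore": 0.2}

# Chance the drone spots the hiker if it searches the right area.
DETECTION_PROB = 0.8

def best_area(belief):
    """Pick the area with the highest expected chance of a find."""
    return max(belief, key=lambda area: belief[area] * DETECTION_PROB)

def update_after_miss(belief, searched):
    """Bayes update: an unsuccessful search lowers the searched area's weight."""
    updated = dict(belief)
    updated[searched] *= (1 - DETECTION_PROB)
    total = sum(updated.values())
    return {area: p / total for area, p in updated.items()}

# Simulate three unsuccessful searches to show how priorities shift.
for step in range(3):
    area = best_area(belief)
    print(f"step {step}: search {area} (p={belief[area]:.2f})")
    belief = update_after_miss(belief, area)
```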
Now researchers want to see if a machine-learning system could do better. Read the full story.
—James O’Donnell