Google Unveils Major AI Search Upgrades at I/O 2025

Image Source: ChatGPT-4o
At its annual I/O conference, Google announced a sweeping set of updates aimed at transforming Search from a tool for retrieving information into a more interactive and intelligent assistant. The centerpiece of this shift is AI Mode, a new experience built on Google's Gemini AI models that introduces deeper reasoning, real-time visual interaction, and personalized assistance.
The updates expand on last year’s introduction of AI Overviews, which offered brief, AI-generated summaries for search results. Google says these summaries have significantly changed user behavior, especially in large markets like the U.S. and India, where they’ve driven a more than 10% increase in search activity for queries where AI Overviews appear. The company also emphasizes the speed of the responses, calling them the fastest AI-generated answers available in the industry.
AI Mode: A New Tab for a New Kind of Search
Rolling out to users in the U.S. starting today, AI Mode builds on the success of AI Overviews but offers a more comprehensive experience. It includes:
Advanced reasoning powered by Gemini 2.5
Multimodal input, allowing users to include text, images, or voice
Follow-up questioning, for deeper exploration
Custom links to relevant web sources
AI Mode uses a technique called query fan-out, which breaks a search down into subtopics and launches multiple parallel queries on your behalf. This lets Search pull from a broader and more nuanced set of web results than traditional keyword-based searches, surfacing content that is especially relevant to what you're asking.
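The fan-out idea described above can be sketched in a few lines: split a question into subtopic queries, run them in parallel, and merge the results. This is a conceptual illustration only, not Google's implementation; `decompose` and `search_web` are hypothetical stand-ins for the model-driven decomposition and the individual searches.

```python
# Conceptual sketch of "query fan-out": one question becomes several
# subtopic queries that run in parallel, and their results are merged.
from concurrent.futures import ThreadPoolExecutor

def decompose(question: str) -> list[str]:
    """Stand-in for a model step that splits a question into subtopics."""
    return [f"{question} / {aspect}" for aspect in ("overview", "pricing", "reviews")]

def search_web(subquery: str) -> list[str]:
    """Stand-in for a single keyword search returning result snippets."""
    return [f"result for: {subquery}"]

def fan_out(question: str) -> list[str]:
    subqueries = decompose(question)
    with ThreadPoolExecutor() as pool:
        result_lists = pool.map(search_web, subqueries)
    # Merge results from all parallel searches, dropping duplicates
    merged: list[str] = []
    for results in result_lists:
        for result in results:
            if result not in merged:
                merged.append(result)
    return merged
```

In practice the decomposition, the number of parallel queries, and the ranking of merged results would all be far more sophisticated; the sketch only shows the shape of the technique.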
AI Mode will also serve as a testing ground for features that may later become part of Google’s core Search product.
Deeper Research and Real-Time Interaction
Two new tools extend AI Mode’s functionality further:
Deep Search: Designed for in-depth research, Deep Search uses an enhanced version of query fan-out to issue hundreds of queries simultaneously. It synthesizes results into fully cited, expert-level reports within minutes, saving users potentially hours of manual searching.
Live Capabilities: Bringing Google's Project Astra into the fold, the Live feature lets users engage with Search in real time using their camera. For example, a user can point their phone at a science project, ask a question, and receive explanations or resources related to what’s being viewed. This conversational interaction also supports back-and-forth questions.
Search That Acts: Agentic Capabilities and Shopping Tools
Google is expanding Search beyond information retrieval into task execution with a new set of agentic capabilities—AI-powered tools that take action on a user’s behalf to save time and reduce complexity.
These capabilities are powered by Project Mariner, a framework designed to handle multi-step tasks in real-world scenarios. In AI Mode, this means Search can now assist with time-consuming actions like:
Finding and comparing event tickets based on specific criteria such as date, price range, seating section, and availability
Filling out online forms automatically, reducing the friction of completing repetitive steps
Surfacing real-time options from across the web while keeping the user in control of final decisions
For example, if a user asks, “Find two affordable tickets for this Saturday’s Reds game in the lower level,” AI Mode will fan out multiple queries, scan ticketing platforms for real-time pricing and availability, and compile a tailored list of options—all before the user clicks a link. The goal, Google says, is to minimize the manual work of searching across sites while still preserving user agency in the final purchase.
These capabilities will initially focus on three core use cases:
Event tickets
Restaurant reservations
Local appointments
To support this, Google is partnering with major platforms including Ticketmaster, StubHub, Resy, and Vagaro, enabling smoother integrations for booking and checkout.
Smarter Shopping with AI Assistance
On the e-commerce front, Google is enhancing its AI-powered shopping experience by combining the Gemini model with the Shopping Graph, which maps billions of products, prices, and sellers in real time.
In AI Mode, users can now:
Upload a single image to virtually try on outfits from billions of apparel listings—an effort to make product discovery more personal and visual
Ask detailed questions about products, materials, use cases, or comparisons, and receive curated recommendations backed by real product data
Enable agentic checkout, a new feature that monitors price changes and, with the user's opt-in, completes a purchase automatically once set conditions are met, such as a product reaching a certain discount level
Google emphasizes that these shopping tools are guided by user input and control. Even when AI assists with monitoring or action, users decide when and how to complete a transaction, often using services like Google Pay for checkout.
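The condition-based checkout described above reduces to a simple pattern: watch a price signal and trigger a purchase step only when a user-set threshold is reached. The sketch below is purely illustrative; `get_price` and `place_order` are hypothetical stand-ins, and the real feature runs inside Search with the user's explicit opt-in rather than as standalone code.

```python
# Minimal sketch of condition-based checkout: poll a price source and
# invoke the purchase step once the price meets the user's target.
from typing import Callable

def watch_and_buy(get_price: Callable[[], float],
                  target_price: float,
                  place_order: Callable[[float], None],
                  max_checks: int = 100) -> bool:
    """Check the price up to max_checks times; order when it hits the target."""
    for _ in range(max_checks):
        price = get_price()
        if price <= target_price:
            place_order(price)  # the user-approved action, per the opt-in
            return True
    return False  # condition never met within the polling budget
```

A production system would of course poll on a schedule, handle errors, and confirm the purchase through a payment service such as Google Pay, as the article notes.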
Personalized and Visual Search Experiences
AI Mode will soon support personal context, offering tailored results based on past searches and connected apps like Gmail. This could enable search suggestions aligned with upcoming travel plans, past reservations, or personal interests—provided the user opts in.
For instance, if you search for something like “things to do in Nashville this weekend with friends — we're big foodies who like music,” AI Mode can draw on your previous restaurant reservations and searches to recommend spots with outdoor seating that match your tastes. It can also suggest nearby events happening during your stay, using details from your flight and hotel confirmations to tailor suggestions based on your itinerary and location.
Whenever AI Mode uses personal context, it will be clearly indicated on screen. These features are optional and entirely under your control—you can choose to connect or disconnect them at any time.
Charts and Graphs
Another feature will allow users to generate custom charts and graphs, helping visualize complex data based on their specific queries. While it will launch with support for topics like sports and finance, the goal is to make it easier to interpret information across a range of subjects.
For example, a user might ask for a graph comparing two baseball teams' home field advantages; Search will analyze the question and generate an interactive visual tailored to the specifics of the request, built from Google's real-time sports data.
Availability
AI Mode begins rolling out to all users in the United States starting today. Additional features previewed during the I/O keynote—including Deep Search, agentic tools, and real-time visual capabilities—will launch first through Labs in the coming weeks and months.
Users who want early access to these experimental features can opt in by enabling the AI Mode experiment in Google Search Labs.
What This Means
With AI Mode, Google is signaling a major shift: from delivering information to providing intelligent assistance that can reason, respond, and take action. This move comes as competition in AI-powered search continues to accelerate.
ChatGPT, with its advanced voice mode, already supports real-time, camera-based conversations—allowing users to show an object, ask a question aloud, and receive spoken responses in a continuous, interactive dialogue. Perplexity offers voice interaction through its mobile app, and supports multimodal queries with image uploads, though it doesn't yet support live camera input or reverse image search. Claude also supports image uploads but lacks both voice interaction and real-time visual capabilities.
What sets Google apart is the integration of multiple capabilities—multimodal input, personalized context, real-time interaction, and agentic tools—directly into its core Search product. With AI Mode, the company is aiming to make help more immediate, contextual, and actionable—whether that means explaining what you’re pointing your camera at, visualizing data, or completing tasks like booking reservations.
For users, this marks a broader change in expectations: search engines are no longer just gateways to information, but partners in decision-making. As these systems take on more responsibility, how they use personal data, maintain transparency, and earn trust will be just as important as how well they perform.
The future of search may hinge not just on accuracy or speed, but on who builds the most intuitive, helpful, and human-centered experience.
Editor’s Note: This article was created by Alicia Shapiro, CMO of AiNews.com, with writing, image, and idea-generation support from ChatGPT, an AI assistant. However, the final perspective and editorial choices are solely Alicia Shapiro’s. Special thanks to ChatGPT for assistance with research and editorial support in crafting this article.