Microsoft Adds Copilot Mode To Edge With Multi-Tab AI Analysis via @sejournal, @MattGSouthern

Microsoft launches Copilot Mode in Edge, introducing multi-tab AI analysis, voice navigation, and more features in development.

  • Copilot Mode brings AI tools to Microsoft’s Edge browser.
  • Available tools include multi-tab content analysis, voice navigation, and a unified search/chat interface.
  • Features in development include task execution, topic-based organization, and a persistent AI assistant.
OpenAI Study Mode Brings Guided Learning to ChatGPT via @sejournal, @MattGSouthern

OpenAI has launched a new feature in ChatGPT called Study Mode, offering a step-by-step learning experience designed to guide users through complex topics.

While aimed at students, Study Mode reflects a broader trend in how people use AI tools for information and adapt their search habits.

As more people start using conversational AI tools to seek information, Study Mode could represent the next step of AI-assisted discovery.

A Shift Toward Guided Learning

Activate Study Mode by selecting “Study and learn” from the tools menu in ChatGPT, then ask a question.

Screenshot from: openai.com/index/chatgpt-study-mode/, July 2025.

Instead of giving direct answers, this feature promotes deeper engagement by asking questions, providing hints, and tailoring explanations to meet user needs.

Screenshot from: openai.com/index/chatgpt-study-mode/, July 2025.

Study Mode runs on custom instructions developed with input from teachers and learning experts. The feature incorporates research-based strategies, including:

  • Encouraging active participation
  • Helping manage cognitive load (how much information people take in at once)
  • Supporting self-awareness and a desire to learn
  • Giving helpful, practical feedback

Robbie Torney, Senior Director of AI Programs at Common Sense Media, explains:

“Instead of doing the work for them, study mode encourages students to think critically about their learning. Features like these are a positive step toward effective AI use for learning. Even in the AI era, the best learning still happens when students are excited about and actively engaging with the lesson material.”

How It Works

Study Mode adjusts responses based on a user’s skill level and context from prior chats.

Key features include:

  • Interactive Prompts: Socratic questioning and self-reflection prompts promote critical thinking.
  • Scaffolded Responses: Content is broken into manageable segments to maintain clarity.
  • Knowledge Checks: Quizzes and open-ended questions help reinforce understanding.
  • Toggle Functionality: Users can turn Study Mode on or off as needed during a conversation.

Early testers describe it as an on-demand tutor, useful for unpacking dense material or revisiting difficult subjects.

Looking Ahead

Study Mode is now available to logged-in users across Free, Plus, Pro, and Team plans, with ChatGPT Edu support expected in the coming weeks.

OpenAI plans to integrate Study Mode behavior directly into its models after gathering feedback. Future updates may include visual aids, goal tracking, and more personalized support.


Featured Image: Roman Samborskyi/Shutterstock

Google AI Mode Update: File Uploads, Live Video Search, More via @sejournal, @MattGSouthern

Google is expanding AI Mode in Search with new tools that include PDF uploads, persistent planning documents, and real-time video assistance.

The updates begin rolling out today, with the AI Mode button now appearing on the Google homepage for desktop users.

PDF Uploads Coming To Desktop

Desktop users can now upload images directly into search queries, a feature previously available only on mobile.

Support for PDFs is coming in the weeks ahead, allowing you to ask questions about uploaded files and receive AI-generated responses based on both document content and relevant web results.

For example, a student could upload lecture slides and use AI Mode to get help understanding the material. Responses include suggested links for deeper exploration.

Image Credit: Google

Google plans to support additional file types and integrate with Google Drive “in the months ahead.”

Canvas: A Tool For Multi-Session Planning

A new AI Mode feature called Canvas can help you stay organized across multiple search sessions.

When you ask AI Mode for help with planning or creating something, you’ll see an option to “Create Canvas.” This opens a dynamic side panel that saves and updates as queries evolve.

Use cases include building study guides, travel itineraries, or task checklists.

Image Credit: Google

Canvas is launching for desktop users in the U.S. enrolled in the AI Mode Labs experiment.

Real-Time Assistance With Search Live

Search Live with video input also launches this week on mobile. This allows you to use AI Mode while pointing your phone camera at real-world objects or scenes.

The feature builds on Project Astra and is available through Google Lens. Start by tapping the ‘Live’ icon in the Google app, then engage in back-and-forth conversations with AI Mode using live video as visual context.

Image Credit: Google

Chrome Adds Contextual AI Answers

Lens is getting expanded desktop functionality within Chrome. Soon, you’ll see an “Ask Google about this page” option in the address bar.

When selected, it opens a panel where you can highlight parts of a page, like a diagram or snippet of text, and receive an AI Overview.

This update also allows follow-up questions via AI Mode from within the Lens experience, either through a button labeled “Dive deeper” or by selecting AI Mode directly.

Looking Ahead

These updates reflect Google’s vision of search as a multi-modal, interactive experience rather than a one-off text query.

While most of these tools are limited to U.S.-based Labs users for now, they point to a future where AI Mode becomes central to how searchers explore, learn, and plan.

Rollout timelines vary by feature, so keep a close eye on how these capabilities add to the search experience and consider how to adapt your content strategies accordingly.

Google Explains The Process Of Indexing The Main Content via @sejournal, @martinibuster

Google’s Gary Illyes discussed the concept of “centerpiece content,” how they go about identifying it, and why soft 404s are the most critical error that gets in the way of indexing content. The context of the discussion was the recent Google Search Central Deep Dive event in Asia, as summarized by Kenichi Suzuki.

Main Body Content

According to Gary Illyes, Google goes to great lengths to identify the main content of a web page. The phrase “main content” will be familiar to those who have read Google’s Search Quality Rater Guidelines. The concept of “main content” is first introduced in Part 1 of the guidelines, in a section that teaches how to identify main content, which is followed by a description of main content quality.

The quality guidelines define main content (aka MC) as:

“Main Content is any part of the page that directly helps the page achieve its purpose. MC can be text, images, videos, page features (e.g., calculators, games), and it can be content created by website users, such as videos, reviews, articles, comments posted by users, etc. Tabs on some pages lead to even more information (e.g., customer reviews) and can sometimes be considered part of the MC.

The MC also includes the title at the top of the page (example). Descriptive MC titles allow users to make informed decisions about what pages to visit. Helpful titles summarize the MC on the page.”

Google’s Illyes referred to main content as the centerpiece content, saying that it is used for “ranking and retrieval.” The content in this section of a web page has greater weight than the content in the footer, header, and navigation areas (including sidebar navigation).

Suzuki summarized what Illyes said:

“Google’s systems heavily prioritize the “main content” (which he also calls the “centerpiece”) of a page for ranking and retrieval. Words and phrases located in this area carry significantly more weight than those in headers, footers, or navigation sidebars. To rank for important terms, you must ensure they are featured prominently within the main body of your page.”

Content Location Analysis To Identify Main Content

This part of Illyes’ presentation is important to get right. Gary Illyes said that Google analyzes the rendered web page to locate the content so that it can assign the appropriate amount of weight to the words located in the main content.

This isn’t about identifying the position of keywords in the page. It’s just about identifying the content within a web page.

Here’s what Suzuki transcribed:

“Google performs positional analysis on the rendered page to understand where content is located. It then uses this data to assign an importance score to the words (tokens) on the page. Moving a term from a low-importance area (like a sidebar) to the main content area will directly increase its weight and potential to rank.”

Insight: Semantic HTML is an excellent way to help Google identify the main content and the less important areas. Semantic HTML makes web pages less ambiguous because it uses HTML elements to mark the different areas of a web page, like the top header section, navigational areas, and footers, and can even flag advertising and navigational elements embedded within the main content area. This technical SEO process of making a web page less ambiguous is called disambiguation.
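To make the idea concrete, here is a minimal sketch (not Google’s method) of how you might check where a phrase sits on one of your own pages. The URL and phrase below are hypothetical, and the sketch assumes the server-rendered HTML is representative; a page that builds its content with JavaScript would need a headless browser instead.

```python
# Rough, illustrative check: does a target phrase appear inside the page's
# semantic main content, or only in lower-weight regions like <nav>,
# <header>, <footer>, or <aside>?
import requests
from bs4 import BeautifulSoup

LOW_WEIGHT_TAGS = ["nav", "header", "footer", "aside"]

def locate_phrase(url: str, phrase: str) -> None:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    phrase = phrase.lower()

    # Check the semantic main content first (<main> or <article>).
    main = soup.find("main") or soup.find("article")
    in_main = bool(main and phrase in main.get_text(" ").lower())

    # Then check the lower-weight regions.
    low_weight_hits = [
        tag for tag in LOW_WEIGHT_TAGS
        if any(phrase in el.get_text(" ").lower() for el in soup.find_all(tag))
    ]

    print(f"'{phrase}' in main content: {in_main}")
    print(f"'{phrase}' in low-weight regions: {low_weight_hits or 'none'}")

# Hypothetical URL and phrase for illustration only.
locate_phrase("https://example.com/some-page", "blue widgets")
```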

Tokenization Is The Foundation Of Google’s Index

Because of the prevalence of AI technologies today, many SEOs are aware of the concept of tokenization. Google also uses tokenization to convert words and phrases into a machine-readable format for indexing. What gets stored in Google’s index isn’t the original HTML; it’s the tokenized representation of the content.
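As a toy illustration only (Google’s actual tokenizer is not public), the point is that what gets indexed is a sequence of tokens derived from the page text, not the raw HTML markup:

```python
# Toy example: strip markup, then reduce the remaining text to word tokens.
import re

html_fragment = "<main><h1>Best hiking boots</h1><p>Waterproof boots for rocky trails.</p></main>"
text = re.sub(r"<[^>]+>", " ", html_fragment)     # strip the markup
tokens = re.findall(r"[a-z0-9]+", text.lower())   # crude word tokens

print(tokens)
# ['best', 'hiking', 'boots', 'waterproof', 'boots', 'for', 'rocky', 'trails']
```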

Soft 404s Are A Critical Error

This part is important because it frames soft 404s as a critical error. Soft 404s are pages that should return a 404 response but instead return a 200 OK response. This can happen when an SEO or publisher redirects a missing web page to the home page in order to conserve their PageRank. Sometimes a missing web page will redirect to an error page that returns a 200 OK response, which is also incorrect.

Many SEOs mistakenly believe that the 404 response code is an error that needs fixing. A 404 is something that needs fixing only if the URL is broken and is supposed to point to a different URL that is live with actual content.

But in the case of a URL for a web page that is gone and is likely never returning because it has not been replaced by other content, a 404 response is the correct one. If the content has been replaced or superseded by another web page, then it’s proper in that case to redirect the old URL to the URL where the replacement content exists.

The point of all this is that, to Google, a soft 404 is a critical error. That means that SEOs who try to fix a non-error event like a 404 response by redirecting the URL to the home page are actually creating a critical error by doing so.

Suzuki noted what Illyes said:

“A page that returns a 200 OK status code but displays an error message or has very thin/empty main content is considered a “soft 404.” Google actively identifies and de-prioritizes these pages as they waste crawl budget and provide a poor user experience. Illyes shared that for years, Google’s own documentation page about soft 404s was flagged as a soft 404 by its own systems and couldn’t be indexed.”
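As a practical check, here is a minimal sketch, assuming the hypothetical URLs below, for spotting likely soft 404s on your own site: URLs for content that is gone should return a 404 or 410, not a 200 or a redirect to the homepage.

```python
# Sketch only, not Google's detection logic: flag removed URLs that still
# answer with 200 or that redirect to the homepage.
import requests

HOMEPAGE = "https://example.com/"           # hypothetical site
REMOVED_URLS = [                            # URLs you expect to be gone
    "https://example.com/old-product",
    "https://example.com/discontinued-page",
]

for url in REMOVED_URLS:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    if resp.status_code == 200:
        if resp.url.rstrip("/") == HOMEPAGE.rstrip("/"):
            print(f"{url}: redirects to the homepage with 200 (likely soft 404)")
        else:
            print(f"{url}: returns 200 (check whether this is a real page or a soft 404)")
    else:
        print(f"{url}: returns {resp.status_code} (a 404/410 is correct for removed content)")
```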

Takeaways

  • Main Content
    Google gives priority to the main content portion of a given web page. Although Gary Illyes didn’t mention it, it may be helpful to use semantic HTML to clearly outline what parts of the page are the main content and which parts are not.
  • Google Tokenizes Content For Indexing
    Google’s use of tokenization enables semantic understanding of queries and content. The importance for SEO is that Google no longer relies heavily on exact-match keywords, which frees publishers and SEOs to focus on writing about topics (not keywords) from the point of view of how they are helpful to users.
  • Soft 404s Are A Critical Error
    Soft 404s are commonly thought of as something to avoid, but they’re not generally understood as a critical error that can negatively impact the crawl budget. This elevates the importance of avoiding soft 404s.

Featured Image by Shutterstock/Krakenimages.com

Google’s Mueller Advises Testing Ecommerce Sites For Agentic AI via @sejournal, @martinibuster

Google’s John Mueller re-posted the results of an experiment that tested whether ecommerce sites are accessible to AI agents, commenting that it may be useful to check if your ecommerce site works for AI agents that are shopping on behalf of actual customers.

AI Agent Experiment On Ecommerce Sites

Malte Polzin posted commentary on LinkedIn about an experiment he ran to test whether the top 50 Swiss ecommerce sites are open for business to users who shop online with ChatGPT’s agent.

He reported that most of the ecommerce stores were accessible to ChatGPT’s AI agent, but some stores were not, for a few reasons.

Reasons Why ChatGPT’s AI Agent Couldn’t Shop

  • A CAPTCHA prevented ChatGPT’s AI agent from shopping
  • Cloudflare’s Turnstile tool, a CAPTCHA alternative, blocked access
  • A maintenance page blocked access to the store
  • Bot defenses blocked access

Google’s John Mueller Offers Advice

Google’s John Mueller recommended checking whether your ecommerce store is open for business to shoppers who use AI agents, as agentic search for online shopping may become more commonplace.

He wrote:

“Pro tip: check your ecommerce site to see if it works for shoppers using the common agents. (Or, if you’d prefer they go elsewhere because you have too much business, maybe don’t.)

Bot-detection sometimes triggers on users with agents, and it can be annoying for them to get through. (Insert philosophical discussion on whether agents are more like bots or more like users, and whether it makes more sense to differentiate by actions rather than user-agent.)”

Should SEOs Add Agentic AI Testing To Site Audits?

SEOs may want to consider adding agentic AI accessibility to their site audits for ecommerce sites. There may be other use cases where an AI agent needs to fill out forms, for example on a local services website.
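A rough first check might compare how key pages respond to a normal browser user-agent versus an agent-style user-agent. This is a sketch only: the user-agent strings are illustrative (check each vendor’s documentation for the strings its agents actually send), the URLs are hypothetical, and a status-code check won’t catch CAPTCHAs or JavaScript challenges that only appear in a real browser session.

```python
# Compare HTTP status codes for key pages under different user-agent headers.
import requests

PAGES = [
    "https://example-shop.com/",             # hypothetical store
    "https://example-shop.com/product/123",
    "https://example-shop.com/checkout",
]

USER_AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    # Illustrative agent-style UA; verify the real strings with each vendor.
    "agent-like (example)": "Mozilla/5.0 (compatible; ChatGPT-User/1.0; +https://openai.com/bot)",
}

for page in PAGES:
    for label, ua in USER_AGENTS.items():
        resp = requests.get(page, headers={"User-Agent": ua}, timeout=10)
        print(f"{page} [{label}]: HTTP {resp.status_code}")
```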

Which Marketing Jobs Are Most Affected by AI? via @sejournal, @MattGSouthern

New research from Microsoft reveals that marketing and sales professionals are among the most affected by generative AI, based on an analysis of 200,000 real workplace conversations with Bing Copilot.

The research examined nine months of anonymized data from January to September 2024, offering a large-scale look at how professionals use AI in their daily tasks.

AI’s Role In Marketing & Sales Work

Microsoft calculated an “AI applicability score” to measure how often AI is used to complete or assist with job-related tasks and how effectively it performs those tasks.

Sales representatives received one of the highest scores (0.46), followed closely by writers and authors (0.45), customer service representatives (0.44), and other marketing roles like:

  • Technical Writers (0.38)
  • Public Relations Specialists (0.36)
  • Advertising Sales Agents (0.36)
  • Market Research Analysts (0.35)

Overall, “Sales and Related” occupations ranked highest in AI impact across all major job categories, followed by computing and administrative roles.

As Microsoft researchers note:

“The current capabilities of generative AI align most strongly with knowledge work and communication occupations.”

Tasks Where AI Performs Well

The study found AI is particularly effective at:

  • Gathering information
  • Writing and editing content
  • Communicating information to others
  • Supporting ongoing learning in a specific field

These tasks often show high success and satisfaction rates among users.

However, the study also uncovered that in 40% of conversations, the AI performed tasks different from what the user initially requested. For example, when someone asks for help with research, the AI might instead explain research methods rather than deliver information.

This reflects AI’s role as more of a helper than a replacement. As the researchers put it:

“The AI often acts in a service role to the human as a coach, advisor, or teacher.”

Areas Where Humans Still Excel

Some marketing tasks still show resistance to AI. These include:

  • Visual design and creative work
  • Strategic data analysis
  • Roles that require physical presence or in-person interaction, such as event marketing or client-based sales

These activities consistently scored lower for AI satisfaction and task completion.

Education, Wages & Job Security

The study found a weak correlation between AI impact and wages. The correlation coefficient was 0.07, indicating that AI is reshaping tasks across income levels, not just automating low-paying jobs.

For roles requiring a Bachelor’s degree, the average AI applicability score was slightly higher (0.27), compared to 0.19 for jobs with lower education requirements. This suggests knowledge work may see more AI involvement, but not necessarily replacement.

The researchers caution against assuming automation leads to job loss:

“This would be a mistake, as our data do not include the downstream business impacts of new technology, which are very hard to predict and often counterintuitive.”

What You Can Do

The data supports a clear takeaway: AI is here to stay, but it’s not taking over every aspect of marketing work.

Digital anthropologist Giles Crouch, quoted in coverage of the study, said:

“The conversation has gone from this fear of massive job loss to: How can we get real benefit from these tools? How will it make our work better?”

There are a few ways marketing professionals can adapt, such as:

  • Sharpening skills in areas where AI falls short, such as visual creativity and strategic interpretation
  • Using AI as a productivity booster for content creation and information gathering
  • Positioning themselves as AI collaborators rather than competitors

Looking Ahead

AI is reshaping marketing by changing how work gets done, not by eliminating roles.

As with past technological changes, those who adapt and integrate these tools into their workflow may find themselves better positioned for long-term success.

The full report includes a detailed breakdown of occupations and task types across the U.S. workforce.


Featured Image: Roman Samborskyi/Shutterstock

Google Warns: CSS Background Images Aren’t Indexed via @sejournal, @MattGSouthern

In a recent Search Off the Record podcast, Google’s Search Relations team cautioned developers against using CSS for all website images.

While CSS background images can enhance visual design, they’re invisible to Google Image Search. This could lead to missed opportunities in image indexing and search visibility.

Here’s what Google’s Search Advocates advise.

The CSS Image Problem

During the episode, John Mueller shared a recurring issue:

“I had someone ping me I think last week or a week before on social media: “It looks like my developer has decided to use CSS for all of the images because they believe it’s better.” Does this work?”

According to the Google team, this approach stems from a misunderstanding of how search engines interpret images.

When visuals are added via CSS background properties instead of standard HTML image tags, they may not appear in the page’s DOM, and therefore can’t be indexed.

As Martin Splitt explained:

“If you have a content image, if the image is part of the content… you want an img, an image tag or a picture tag that actually has the actual image as part of the DOM because you want us to see like ah so this page has this image that is not just decoration. It is part of the content and then image search can pick it up.”

Content vs. Decoration

The difference between a content image and a decorative image is whether it adds meaning or is purely cosmetic.

Decorative images, such as patterned backgrounds, atmospheric effects, or animations, can be safely implemented using CSS.

When the image conveys meaning or is referenced in the content, CSS is a poor fit.

Splitt offered the following example:

“If I have a blog post about this specific landscape and I want to like tell people like look at this amazing panoramic view of the landscape here and then it’s a background image… the problem is the content specifically references this image, but it doesn’t have the image as part of the content.”

In such cases, placing the image in HTML using the img or picture tag ensures it’s understood as part of the page’s content and eligible for indexing in Google Image Search.

What Makes CSS Images Invisible?

Splitt explained why this happens:

“For a user looking at the browser, what are you talking about, Martin? The image is right there. But if you look at the DOM, it absolutely isn’t there. It is just a CSS thing that has been loaded to style the page.”

Because Google parses the DOM to determine content structure, images styled purely through CSS are often overlooked, especially if they aren’t included as actual HTML elements.

This distinction reflects a broader web development principle.

Splitt adds:

“There is ideally a separation between the way the site looks and what the content is.”
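One quick, illustrative audit, assuming a hypothetical URL and server-rendered HTML, is to list the images that actually exist in the DOM as img or picture elements; a content image that only appears via CSS background properties won’t show up in this list, which is a hint it may be invisible to image indexing.

```python
# List image sources present in the DOM as <img> or <picture>/<source> elements.
import requests
from bs4 import BeautifulSoup

def images_in_dom(url: str) -> list[str]:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    sources = []
    for img in soup.find_all("img"):
        if img.get("src"):
            sources.append(img["src"])
    for source in soup.find_all("source"):   # <picture>/<source> variants
        if source.get("srcset"):
            sources.append(source["srcset"])
    return sources

# Hypothetical page for illustration only.
for src in images_in_dom("https://example.com/listing"):
    print(src)
```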

What About Stock Photos?

The team addressed the use of stock photos, which are sometimes added for visual appeal rather than original content.

Splitt says:

“The meaning is still like this image is not mine. It’s a stock image that we bought or licensed but it is still part of the content.”

While these images may not rank highly due to duplication, implementing them in HTML still helps ensure proper indexing and improves accessibility.

Why This Matters

The team highlighted several examples where improper implementation could reduce visibility:

  • Real estate listings: Home photos used as background images won’t show up in relevant image search queries.
  • News articles: Charts or infographics added via CSS can’t be indexed, weakening discoverability.
  • E-commerce sites: Product images embedded in background styles may not appear in shopping-related searches.

What To Do Next

Google’s comments indicate that you should follow these best practices:

  • Use HTML (img or picture) tags for any image that conveys content or is referenced on the page.
  • Reserve CSS backgrounds for decorative visuals that don’t carry meaning.
  • If users might expect to find an image via search, it should be in the HTML.
  • Proper implementation helps not only with SEO, but also with accessibility tools and screen readers.

Looking Ahead

Publishers should be mindful of how images are implemented.

While CSS is a powerful tool for design, using it to deliver content-related images may conflict with best practices for indexing, accessibility, and long-term SEO strategy.

Listen to the full podcast episode below:


Featured Image: Roman Samborskyi/Shutterstock

Google On Balancing Needs Of Users And The Web Ecosystem via @sejournal, @martinibuster

At the recent Search Central Live Deep Dive 2025, Kenichi Suzuki asked Google’s Gary Illyes how Google measures high quality and user satisfaction of traffic from AI Overviews. Illyes’ response, published by Suzuki on LinkedIn, covered multiple points.

Kenichi asked for specific data, and Gary’s answer offered an overview of how Google gathers external data to form internal opinions on how AI Overviews is perceived by users in terms of satisfaction. He said that the data informs public statements by Google, including those made by CEO Sundar Pichai.

Illyes began his answer by saying that he couldn’t share specifics about the user satisfaction data, but he went on to offer an overview.

User Satisfaction Surveys

The first data point that Illyes mentioned was user satisfaction surveys to understand how people feel about AI Overviews. Kenichi wrote that Illyes said:

“The public statements made by company leaders, such as Sundar Pichai, are validated by this internal data before being made public.”

Observed User Behavior

The second user satisfaction data point that Illyes mentioned was inferring user preference from the broader market. Kenichi wrote:

“Gary suggested that one can infer user preference by looking at the broader market. He pointed out that the rapidly growing user base for other AI tools (like ChatGPT and Copilot) likely consists of the same demographic that enjoys and finds value in AI Overviews.”

Motivated By User-Focus

This point is about putting the user first as the motivation for introducing a new feature. Illyes specifically said that causing disruption is not Google’s motivation for AI search features.

Acknowledged The Web Ecosystem

The last point he made was to explain that Google’s still figuring out how to balance their user-focused approach with the need to maintain a healthy web ecosystem.

Kenichi wrote that Illyes said:

“He finished by acknowledging that they are still figuring out how to balance this user-focused approach with the need to continue supporting the wider web ecosystem.”

Balancing The Needs Of The Web Ecosystem

At the dawn of modern SEO, Google did something extraordinary: they reached out to web publishers through the most popular SEO forum at the time, WebmasterWorld. Gary Illyes himself, before he joined Google, was a WebmasterWorld member. This outreach by Google was the initiative of one Googler, Matt Cutts. Other Googlers provided interviews, but Matt Cutts, under the WebmasterWorld nickname of GoogleGuy, held two-way conversations with the search and publisher community.

This is no longer the case at Google, which is largely back to one-way communication accompanied by intermittent social media outreach.

The SEO community may share in the blame for this situation, as some SEOs post abusive responses on social media. Fortunately, those people are in the minority, but that behavior nonetheless puts a chill on the few opportunities provided to have a constructive dialogue.

It’s encouraging to hear Illyes mention the web ecosystem, and it would be even more encouraging to hear Googlers, including the CEO, focus on how they intend to balance the needs of users with those of the creators who publish content, because many feel that Google’s current direction is not sustainable for publishers.

Featured Image by Shutterstock/1000 Words

Why A Site Deindexed By Google For Programmatic SEO Bounced Back via @sejournal, @martinibuster

A company founder shared their experience with programmatic SEO, which they credited for initial success until it was deindexed by Google, calling it a big mistake they won’t repeat. The post, shared on LinkedIn, received scores of supportive comments.

The website didn’t receive a manual action; Google deindexed the web pages due to poor content quality.

Programmatic SEO (pSEO)

Programmatic SEO (aka pSEO) is a phrase that encompasses a wide range of tactics with automation at their core. Some of these tactics can be very useful, like automating sitewide meta descriptions, titles, and alt text for images.

pSEO is also the practice of using AI automation to scale content creation sitewide, which is what the person did. They created fifty thousand pages targeting long-tail phrases (phrases that are not commonly queried). The site initially received hundreds of clicks and millions of impressions, but the success was not long-lived.

According to the post by Miquel Palet (LinkedIn Profile):

“Google flagged our domain. Pages started getting deindexed. Traffic plummeted overnight.

We learned the hard way that shortcuts don’t scale sustainably.

It was a huge mistake, but also a great lesson.

And it’s one of the reasons we rebranded to Tailride.”

Thin AI Content Was The Culprit

A follow-up post explained that they believe the AI-generated content backfired because it was thin content, which makes sense. Thin content, regardless of how it was authored, can be problematic.

One of the posts by Palet explained:

“We’re not sure, but probably not because AI. It was thin content and probably duplicated.”

Rasmus Sørensen (LinkedIn profile), an experienced digital marketer, shared his opinion that he’s seen some marketers pushing shady practices under the banner of pSEO:

“Thanks for sharing and putting some real live experiences forward. Programmatic SEO had been touted as the next best thing in SEO. It’s not and I’ve seen soo much garbage published the last few months and agencies claiming that their pSEO is the silver bullet.
It very rarely is.”

Joe Youngblood (LinkedIn profile) shared that SEO trends can be abused and implied that it is a viable strategy if done correctly:

“I would always do something like pSEO under the supervision of a seasoned SEO consultant. This tale happens all too frequently with an SEO trend…”

What They Did To Fix The site

The company founder shared that they rebranded the website on a new domain, redirected the old domain to the new one, and focused the site on higher-quality content that’s relevant to users.

They explained:

“Less pages + more quality”

A site: search for their domain shows that Google is now indexing their content, indicating that they are back on track.

Takeaways

Programmatic SEO can be useful if approached with an understanding of where the line is between good quality and “not-quality” content.

Featured Image by Shutterstock/Cast Of Thousands