Google Explains A Weird Domain Migration Outcome via @sejournal, @martinibuster

Google’s John Mueller offered insight into why domain name migrations for multiple language versions of the same website turned out vastly different, even though the same process was followed for each of the three websites.

Migrating To Different Domain Names

The person asking the question maintained three websites under three different country code top level domains (ccTLDs). The ccTLDs were .fr (France), .be (Belgium), and .de (Germany). The project was a migration from one domain name to another domain name, each within their respective ccTLD, like example-1.fr to example-2.fr.

Each site had the same content but in different languages that corresponded to the countries targeted by their respective ccTLDs. Thus, because everything about the migration was equal, the reasonable expectation was that the outcome would be the same for each site.

But that wasn’t the case.

Two out of the three site migrations failed and lost traffic. Only one of them experienced a seamless transition.

What Went Wrong?

The person asking for information about what went wrong tweeted:

“Hi @JohnMu,

AlicesGarden (.fr, .be, .de …) migrated to Sweeek (.fr, .be, .de …)

.FR and .BE lost a lot of traffic in Oct. 23

Other TLD performed well.

Redirects, canonical, hreflang, content, offer = OK
Search console migration = OK

What else could be wrong ?”


John Mueller Tweets His Response

Google’s John Mueller responded that each site is a different site and should be regarded differently, even if they share the same content assets (in different languages).

Mueller tweeted:

“I don’t know your sites, but even if the content’s the same, they’re essentially different sites (especially with ccTLDs), so it would be normal for a migration to affect them differently (and this seems to be quite a way back in the meantime).”


Are Site Migrations Essentially Equal?

John makes an important observation. How a site fits into the Internet may be affected by a site migration, especially by how users respond to a change in template or domain name. I’ve done domain name migrations that went well, with only a temporary slight dip. But that was one domain name at a time, not multiple domains.

What Might Be Going On?

Someone in that discussion tweeted to ask if they had used AI content.

The person asking the original question tweeted their response:

“Yes a bit of AI for short description, mainly in category pages, but nothing which could be deceptive from an end-user perspective.”

Could it be that two of the site migrations failed and a third was successful because they coincidentally overlapped with an update? Given that the extent of AI content was trivial, that’s probably unlikely.

The important takeaway is what Mueller said: they’re all different sites, so the outcome should naturally be different.

Featured Image by Shutterstock/William Barton

Female Founder Of A Billion-Dollar Company: 10 Signs You’re An Entrepreneur

I know what it is like to be a reluctant entrepreneur.

Back in 2014, when I first started thinking about Starling Bank, if anyone had told me, “Ah, so you are going to be an entrepreneur,” I’d have said, “Absolutely not.” I was similarly reluctant to use the word “founder.”

Growing up, I always believed entrepreneurs were inventors. They made something. Today’s example might be James Dyson, because he invented the first bagless vacuum cleaner.

Elon Musk invented a rocket that went into space. I didn’t believe I was like this. I had gone to university, joined Lloyd’s Bank as a graduate trainee, and then worked my way up the corporate ladder.

It wasn’t until Starling Bank became a unicorn, reaching a valuation of over £1 billion, that I finally relented on the word “entrepreneur.”

Since then, I have reflected on why I never saw myself as a bona fide entrepreneur.

My conclusion is that, when I started my business, I didn’t do it because I wanted to be an entrepreneur. I did it because I had long since grown disillusioned with the world of banking and was convinced that I knew what needed to be done to put things right.

I had zero doubt about my idea for a new type of bank. That, in itself, is the basis for entrepreneurship. What I called myself really didn’t matter.

If you, too, are a reluctant entrepreneur, it might be helpful if I began by sharing what I, and other female entrepreneurs, think about how to identify whether there is an entrepreneur lurking beneath the surface (whether or not you would like to admit it to yourself).

Here are 10 signs you are an entrepreneur:

1. You Don’t Accept “No” As An Answer

Anyone who gets discouraged by a “no” is never going to make it as an entrepreneur. I held meetings with more than 300 investors before I found someone interested in putting money into Starling Bank.

If I had given up at 10 meetings, or a hundred meetings, the business would never have gotten off the ground. Similarly, plenty of customers, regulators, potential hires, and seemingly everyone said no.

Hearing “no” time after time is part and parcel of being an entrepreneur. If you are the type to welcome this two-letter word as a challenge that allows you to show what you can do, then you’ve got the entrepreneur mindset.

2. Your Achievements Have Not Been Recognized (So Far)

Women, in particular, often get frustrated with corporate life because they feel underestimated or unrecognized.

Their input is ignored or played down. They report speaking up in meetings to put forward an idea, and it is met with silence, only for a male colleague to be praised when he repeats a virtually identical point later on.

I spent most of my career complaining about not getting the plum projects or not being paid enough, and very often was only able to make any progress by quitting one job to take another. In the end, I had to start my own business to achieve what I always believed to be my full potential.

It is quite possible that this is the reason why so many women become entrepreneurs following maternity leave.

After a break from the corporate world, they can’t face re-entering the fray where their achievements go unnoticed month after month. The idea of setting out alone and being judged on their own merits is more alluring.

3. You Don’t Care About Fitting In

Closely related to the above, many entrepreneurs have long since decided they don’t fit in, so don’t waste any time on it. When I started speaking with other female entrepreneurs, I was struck by just how different they were from the people I knew from corporate life.

In my previous career, I was surrounded by people who spent their entire careers trying to fit in. Entrepreneurs didn’t seem interested in all of that. They seemed more grounded and focused on the task at hand, rather than carefully weighing each word to see how they came across.

There’s an element of mischievousness here, too. It’s not enough to feel disgruntled about being passed over; you need to feel determined to do something to strike a blow at the establishment. Any entrepreneur who feels this way would be in fine company.

This is the motivation that spurred on Dame Stephanie Shirley, also known as Steve, who started a software house back in 1962 that introduced radical new flexible work practices for women in technology.

4. You’re A Dreamer (And A Little Naive)

Entrepreneurs who have previously had “proper jobs” often report spending a lot of time looking at their firms or bosses and imagining how they would do it better. Their imagination is filled with dreams and ideas.

Mixed in with this is a certain amount of naivety. During my own entrepreneurial journey, it really helped that I dreamed big, but also that I was a little naive about the scale of what I dreamed about.

When things were difficult, it helped me imagine the next bit had to be easier. I’d tell myself we only need to get this one thing done, and it’ll be plain sailing.

This was even though every single day as an entrepreneur is about solving a new problem.

5. You’re Competitive

For some reason, a desire to win is often viewed negatively, but it is a quality all entrepreneurs need. In fact, I have not met an entrepreneur who isn’t super-competitive.

During the toughest times, an arch-rival is what you need to keep going, even if that rival doesn’t really see your business as a competitor.

They are the mark. You need to get ahead of them to get closer to your goal. True entrepreneurs turn this rivalry into pure energy.

It should be said, this rivalry should be healthy. Going head-to-head with daggers drawn causes unnecessary tension and doesn’t help anyone.

New entrepreneurs are often quite surprised to hear that founders often speak with other businesses in their sector. It’s a great way to swap ideas and can help everyone.

6. You’re Goal-Driven

Entrepreneurs are goal-driven. We like to take the initiative and solve problems, even if it means working on them for long periods to remove all obstacles that get in the way. This also means we are strategic in our game plans.

Right from the beginning at Starling, I had a plan in mind – and it was a grand plan.

Whenever I talked to anyone about anything to do with the business, I was thinking ahead about how what we were discussing would impact my end goal.

7. You’re Constantly Looking To Learn Something And Improve

Whenever I approach something new, my first instinct is to head to a bookshop. I’ve been the same since I was a young child when my parents gave me an unlimited budget to spend at our local one. I always bought non-fiction and academic books and would pore over them to find out how to do things.

One year, my parents bought me a second-hand, 1956 copy of the Encyclopaedia Britannica – all 24 volumes. I devoured every page and, perhaps oddly, still have an incredible knowledge of life in the 1950s.

I’ve never lost the idea that books are there to advise and inform. Whenever I try anything new, I download dozens of books on the subject.

If I start something in the house, my Kindle is full of books on interior design. My shelves feature a lot of books about living and working in America/Ireland/Switzerland, basically tracking my career progression.

Beside them are endless books on starting a business, managing the transition from start-up to high-growth business, or engaging teams.

Not everyone is such a bookworm; I understand that. Some gather their intel by meeting as many experts as they can and listening carefully to every word. They learn something from every interaction.

Yet, however they absorb it, entrepreneurs are always looking to improve their outlook. If there is a better way to do something, they need to know it.

8. You’re Good With People

Your employees are people. Your customers are people. Your regulators are people. Your investors are people. All of your stakeholders are people.

Each one plays a role in making a high-growth business a success.

If just one stakeholder doesn’t engage, the business won’t succeed. Thus, being good with people is key.

9. You’re Persuasive

To be successful, an entrepreneur needs people to listen to them and be inspired, whether they are investors, customers, or colleagues.

This talent is essential right from the beginning of an entrepreneurial venture when early team members need to be persuaded to give up the security of their existing jobs to join you on this (potentially crazy) endeavor.

When things don’t go according to plan, you’ll have to persuade them to keep the faith and stick with it.

Every single difficult conversation has to be sprinkled with just enough magic optimism dust. This is even more so when people are working for next to nothing – or even nothing at all.

You need them to come back the next day. Sometimes, that takes some next-level charm skills. Remaining upbeat and persuasive can be particularly challenging during funding rounds when so much is riding on the outcome.

I have many vivid memories of returning to the office empty-handed and, yet, telling everyone I had a great lead.

10. You’re Fully Committed

Entrepreneurialism is not like an uplifting movie, where the main character has an idea and then a sped-up reel shows them hurtling towards success. There will be multiple high points and low points along the way.

Almost every entrepreneur will face a time when they are hours away from losing everything they worked for. Before Starling received its banking license and began opening accounts, the business came close to collapse three times.

It has been well documented that we lost a crucial potential investor at the eleventh hour, and the entire team walked out to start a competitive product (great news, a rival – see point five above!).

It takes 100% commitment to the vision to keep going at these low points. If you have that, you’re an entrepreneur.


This chapter is an extract from Female Founders’ Playbook by Anne Boden ©2024 and is reproduced with permission from Kogan Page Ltd.

To read the full book, SEJ readers have an exclusive 25% discount code and free shipping to the US or UK. Use promo code SEJ25 at koganpage.com.

Featured Image: Rido/Shutterstock

Google Confirms Links Are Not That Important via @sejournal, @martinibuster

Google’s Gary Illyes confirmed at a recent search marketing conference that Google needs very few links, adding to the growing body of evidence that publishers need to focus on other factors. Gary tweeted confirmation that he did indeed say those words.

Background Of Links For Ranking

Links were discovered in the late 1990s to be a good signal for search engines to use for validating how authoritative a website is, and Google soon after discovered that anchor text could be used to provide semantic signals about what a webpage was about.

One of the most important research papers was Authoritative Sources in a Hyperlinked Environment by Jon M. Kleinberg, published around 1998 (link to research paper at the end of the article). The main discovery of this research paper is that there were too many web pages and no objective way to filter search results for quality in order to rank web pages for a subjective idea of relevance.

The author of the research paper discovered that links could be used as an objective filter for authoritativeness.

Kleinberg wrote:

“To provide effective search methods under these conditions, one needs a way to filter, from among a huge collection of relevant pages, a small set of the most “authoritative” or ‘definitive’ ones.”

This is the most influential research paper on links because it kick-started more research on ways to use links beyond an authority metric, including as a subjective metric for relevance.

Objective is something factual. Subjective is something that’s closer to an opinion. The founders of Google discovered how to use the subjective opinions of the Internet as a relevance metric for what to rank in the search results.

What Larry Page and Sergey Brin discovered and shared in their research paper (The Anatomy of a Large-Scale Hypertextual Web Search Engine – link at end of this article) was that it was possible to harness the power of anchor text to determine the subjective opinion of relevance from actual humans. It was essentially crowdsourcing the opinions of millions of websites expressed through the link structure between each webpage.

What Did Gary Illyes Say About Links In 2024?

At a recent search conference in Bulgaria, Google’s Gary Illyes made a comment about how Google doesn’t really need that many links and how Google has made links less important.

Patrick Stox tweeted about what he heard at the search conference:

” ‘We need very few links to rank pages… Over the years we’ve made links less important.’ @methode #serpconf2024″

Google’s Gary Illyes tweeted a confirmation of that statement:

“I shouldn’t have said that… I definitely shouldn’t have said that”

Why Links Matter Less

The initial state of anchor text when Google first used links for ranking purposes was absolutely non-spammy, which is why it was so useful. Hyperlinks were primarily used as a way to send traffic from one website to another website.

But by 2004 or 2005, Google was using statistical analysis to detect manipulated links. Around the same time, “powered-by” links in website footers stopped passing anchor text value. By 2006, links close to the word “advertising” stopped passing link value, and links from directories stopped passing ranking value. Then in 2012, Google deployed a massive link algorithm called Penguin that destroyed the rankings of likely millions of websites, many of which were using guest posting.

The link signal eventually became so bad that Google decided in 2019 to selectively use nofollow links for ranking purposes. Google’s Gary Illyes confirmed that the change to nofollow was made because of the link signal.

Google Explicitly Confirms That Links Matter Less

In 2023 Google’s Gary Illyes shared at a PubCon Austin that links were not even in the top 3 of ranking factors. Then in March 2024, coinciding with the March 2024 Core Algorithm Update, Google updated their spam policies documentation to downplay the importance of links for ranking purposes.

Google March 2024 Core Update: 4 Changes To Link Signal

The documentation previously said:

“Google uses links as an important factor in determining the relevancy of web pages.”

The updated documentation removed the word “important.” Links are now listed as just another factor:

“Google uses links as a factor in determining the relevancy of web pages.”

At the beginning of April, Google’s John Mueller advised that there are more useful SEO activities to engage in than link building.

Mueller explained:

“There are more important things for websites nowadays, and over-focusing on links will often result in you wasting your time doing things that don’t make your website better overall”

Finally, Gary Illyes explicitly said that Google needs very few links to rank webpages and confirmed it.

Why Google Doesn’t Need Links

The reason Google doesn’t need many links is likely the extent of AI and natural language understanding that Google uses in its algorithms. Google must be highly confident in its algorithms to explicitly say that it doesn’t need many links.

Way back when Google implemented nofollow into the algorithm, many link builders who sold comment spam links continued to claim that comment spam still worked. As someone who started link building at the very beginning of modern SEO (I was the moderator of the link building forum at the #1 SEO forum of that time), I can say with confidence that links stopped playing much of a role in rankings several years ago, which is why I stopped building them about five or six years ago.

Read the research papers

Authoritative Sources in a Hyperlinked Environment – Jon M. Kleinberg (PDF)

The Anatomy of a Large-Scale Hypertextual Web Search Engine

Featured Image by Shutterstock/RYO Alexandre

Google On 404 Errors And Ranking Drops via @sejournal, @martinibuster

In a Google Office Hours podcast, Google’s Gary Illyes answered a question about 404 Page Not Found errors that coincided with a drop in rankings.

Fake External 404 Errors

There are probably many reasons for 404 errors created by bots. One reason for those error responses could be that they are originating from automated scanners that are looking for files or folders that are typical for specific vulnerable plugins or themes.

Checking the IP address and user agent of the bot that’s causing the 404 responses can also yield clues about whether those 404s are from automated scanning bots. If the IP address indicates the request originates from a web host, or from a Russian or Chinese IP address, then it’s probably a hacker. If the user agent is an out-of-date version of Chrome or Firefox, then that’s probably a hacker’s bot, too. That’s just one reason out of many.
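One way to surface those clues in bulk is to tally 404 responses by IP address and user agent from the server access log. The sketch below assumes the common “combined” log format (the Apache/Nginx default); the sample lines and bot names are purely illustrative.

```python
import re
from collections import Counter

# Matches the common "combined" access log format (an assumption;
# adjust the pattern if your server logs differently).
LOG_LINE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[[^\]]+\] '
    r'"(?P<method>\S+) (?P<path>\S+)[^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def tally_404s(lines):
    """Count 404 responses per (client IP, user agent) pair."""
    hits = Counter()
    for line in lines:
        m = LOG_LINE.match(line)
        if m and m.group("status") == "404":
            hits[(m.group("ip"), m.group("agent"))] += 1
    return hits

# Illustrative sample lines; in practice, read your real access log.
sample = [
    '203.0.113.9 - - [10/Oct/2024:13:55:36 +0000] "GET /wp-login.php HTTP/1.1" 404 162 "-" "ScannerBot/1.0"',
    '203.0.113.9 - - [10/Oct/2024:13:55:37 +0000] "GET /old-plugin/readme.txt HTTP/1.1" 404 162 "-" "ScannerBot/1.0"',
    '198.51.100.7 - - [10/Oct/2024:13:55:38 +0000] "GET / HTTP/1.1" 200 5120 "-" "Mozilla/5.0"',
]
for (ip, agent), count in tally_404s(sample).most_common():
    print(count, ip, agent)
```

A single IP hammering paths like /wp-login.php with an outdated or generic user agent fits the scanner profile described above.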

Google Answers The Question

The person asking the question correlated a drop in rankings with 404 Page Not Found server responses.

This is the question that was asked:

“False 404 URLs hitting my website from external source, could this be related to ranking drop? What can I do to fix it?”

Google’s Gary Illyes responded:

“Fake 404s that Googlebot might’ve crawled cannot be reasonably attributed to a ranking drop. It’s normal to have any number of 404s on a site and you don’t have to fix them, though if you see in your analytics software that a larger number of actual users are also coming through those 404 URLs, I would personally try to convert them somehow by, for example, showing them some relevant content instead.”

Ranking Drops And 404 Page Not Found

Gary said that 404s are normal and unlikely to cause a drop in search rankings. It’s true that 404 errors are a common occurrence. In general that’s okay and most of the time there’s no need to fix anything.

404s That Are Generated By Actual Users

There are other cases where 404s are created by real people who are following a link from somewhere and getting a Page Not Found response. This is easy to diagnose by checking whether the URL the site visitors are trying to reach closely resembles an actual URL. That indicates someone misspelled a URL, and the way to fix it is to create a redirect from the misspelled URL to the correct one.
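Finding the “closely resembling” URL can be automated with fuzzy string matching. This is a minimal sketch under assumptions: the path list and similarity cutoff are illustrative, not Google guidance.

```python
import difflib

# Illustrative list of real URLs on the site (an assumption).
REAL_PATHS = ["/contact-us/", "/pricing/", "/blog/site-migrations/"]

def suggest_redirect(missing_path, real_paths=REAL_PATHS, cutoff=0.8):
    """Return the closest real path as a redirect candidate,
    or None if nothing is similar enough."""
    matches = difflib.get_close_matches(missing_path, real_paths, n=1, cutoff=cutoff)
    return matches[0] if matches else None

# A likely misspelling of /contact-us/ maps back to the real page.
print(suggest_redirect("/contct-us/"))
```

Each suggested pair can then become a 301 redirect rule from the misspelled URL to the correct one.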

About The Drop In Rankings

Something that Gary didn’t mention but is worth mentioning is that there may be a small possibility that a bot did find a vulnerability and the 404s were caused by a scanner that was scanning for vulnerabilities before eventually finding one.

One way to check for that is to use phpMyAdmin, a database administration app, to view the users table in your database and see if there’s an unrecognized user.

Another way, if the site is hosted on WordPress, is to use a security plugin to scan the site to see if it’s using a vulnerable theme or plugin.

Jetpack Protect is a free vulnerability scanner that’s created by the developers at Automattic. It won’t fix a vulnerability but it will warn a user if it finds plugin or theme related vulnerabilities. The paid premium version offers more protection.

Other trustworthy WordPress security plugins are Sucuri and Wordfence, both of which do different things and are available in free and premium versions.

But if that’s not the case, then the ranking drops are pure coincidence and the real reasons lie elsewhere.

Listen to the question and answer at 12:27 minute mark of the Office Hours podcast:

Featured Image by Shutterstock/Asier Romero

Google On Diagnosing A Deindexed WordPress Site via @sejournal, @martinibuster

Google’s John Mueller answered a question about a WordPress site that was completely deindexed from Google Search after changing to a different web hosting platform. Mueller’s answer shows where to start investigating the reasons why that happens.

Dropped From Index After A Site Migration

A submitted question to the Google Office Hours podcast related that their site disappeared from the search engine results pages after they had migrated their site from WordPress to “self-publishing” and that the site was subsequently deindexed.

The question may mean that they were hosting the site on the WordPress.com managed WordPress platform and then migrated their site away to another web host, typically referred to as self-hosting.

Migrating a WordPress site to another web host takes a few relatively simple steps but it can go wrong at virtually every step of the process.

More on migrating a WordPress site later because it’s relevant to the question.

John Mueller Answers The Question

Mueller answered the question from the point of view of analyzing the website itself, which is the best place to start in this specific case. The reason is that the question implies the site can still be reached online.

This is the question:

“After the site was transferred from WordPress to self-publishing, almost all publications disappeared from the index. The search results are ‘0’.”

John Mueller responded:

“If your website dropped out of the search results and is no longer being indexed at all, right about the time when you did a migration, then my guess is that your new website is somehow blocking search engines, or at least, blocking Google. I’d start by analyzing the data in the Search Console, and working forward from there.”

Search Console may show the exact date that pages started dropping out of Google’s index and the reason why they’re dropping out. Typical reasons are that the pages are not found (404) or that Google was blocked from crawling by robots.txt. Those are the starting points for identifying what’s happening on Google’s side.

Diagnose If WordPress Is Blocking Google

This kind of problem typically happens when a WordPress site is set to be hidden from the search engines, which means there’s a robots.txt entry that’s blocking search engines from indexing the site.

Google Search Console will tell you when this is happening through the Page Indexing report, which will show that the site is blocked by robots.txt in the column labeled “Why pages aren’t indexed.”

If that’s the case, you can verify it in your robots.txt file, typically located at the root of your domain: /robots.txt (example.com/robots.txt).
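Since the underlying question is “does robots.txt block Googlebot?”, the rules can also be checked programmatically with Python’s standard library. This is a minimal sketch with the blocking rules supplied inline for illustration; in practice you would point set_url() at your live robots.txt and call read() (example.com is a placeholder).

```python
from urllib import robotparser

# The rules WordPress emits when "Discourage search engines" is on
# amount to blocking everything (illustrative version shown inline).
rules = """\
User-agent: *
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# With a blanket Disallow, Googlebot cannot fetch any page.
blocked = not rp.can_fetch("Googlebot", "https://example.com/some-page/")
print("Googlebot blocked:", blocked)
```

If this reports the site as blocked but the WordPress setting below isn’t the cause, a plugin or a manually edited robots.txt is the next suspect.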

If the page is blocked by robots.txt, it may be that a WordPress setting was applied at some point in the migration to block search indexing.

This is a setting that’s native to the WordPress admin panel that can be reached here:

Settings 🡪 Reading.

There you’ll find a setting called “Search Engine Visibility” with a checkbox labeled “Discourage search engines from indexing this site.”

Screenshot Of WordPress Visibility Setting

If that’s the case then untick that box and you’re done.

If there’s a robots.txt entry blocking search engines but the above box isn’t checked, it could be another plugin doing it, like an SEO or migration plugin. If that’s not the case, then maybe whoever was helping with the move inserted that entry, in which case it’s easy to download the robots.txt file, edit it in a text editor, and upload it back.

Other issues could be a failure to update DNS settings to point to the new web hosting service, or it could be something on the web host’s side. Starting the investigation in Google Search Console is good advice.

Listen to Google’s answer here at the 7:24 minute mark:

Featured Image by Shutterstock/Roman Samborskyi

Google Answers Whether Having Two Sites Affects Rankings via @sejournal, @martinibuster

Google’s John Mueller answered whether having two sites could negatively affect search rankings. His answer is surprisingly applicable to different ways of interpreting the question.

Can Having Two Sites Affect Rankings?

A person submitted a question to Google wanting to know if having two sites could negatively affect their rankings. The question as reproduced by Google is concise and also a little vague, which means it can be answered in a way that’s different from how Mueller answered it.

This is the question:

“My rankings have dropped due to Google finding out I have two websites. Is this correct?”

Google’s John Mueller answered:

“No. That’s not likely. Many people have several websites. Separate websites are not a problem.

The issue is often more indirect: if you work on a lot of websites, you’re not going to have a lot of time to make truly awesome websites everywhere. And, if you’re making websites that aren’t awesome, then that can be something which our algorithms pick up on when it comes to recommending your site to others.”

A Different Way To Answer The Question

John Mueller answered the question under the assumption that the two sites in the question are on different topics. He prefaced his answer by saying that many “people have several websites” which is true.

But many people don’t have multiple websites on the same topic. The person asking the question was vague about whether the sites were about different topics, too.

It’s very possible that the sites are on the same topic, in which case it makes sense why they may be concerned that Google found out about the two sites because it could be seen as trying to game Google. After all, who worries about having multiple sites on different topics?

If the sites were on the same topic then the answer to the question is somewhat different.

One of the important considerations when one person controls multiple sites on the same topic is that they’re doing it for ranking purposes which is not a good starting point for any website.

I’m not saying there’s something corrupt about the practice but I am saying that it’s not really the best starting point for creating signals of quality. It’s not a matter of someone thinking that they’re going to create multiple high quality sites for users, right?

Another reason why people create multiple sites for ranking (and not quality) is because people feel if they split up a topic into subsidiary subtopics they can create stronger sites about those related subtopics as opposed to one site with multiple related subtopics.

But what almost inevitably happens is that they wind up running multiple related sites that could be stronger together as one authoritative website.

I asked Bill Hartzer of Hartzer Consulting (Facebook profile) if he thought multiple sites on the same topic could affect rankings.

Bill agreed with me and shared:

“A lot of people, after building a website that ranks well, will think that they can simply create another website on the same topic and “make double the money” or get “double the traffic” and it’s simply not true.

Companies will also have one main website, but they’ll create a separate website on a separate domain name for each of their products or services. Over the past 10 years or so, that hasn’t been a good strategy. While it’s good to register the domain names of your products or services, it’s better to combine all those websites into one main, more authoritative website.

Typically if they’re on the same topic, one website, the original site, will continue to rank well. But the second website doesn’t rank as well. In most cases, it’s always better to combine the websites into one website.”

Multiple Sites And Rankings

John Mueller is right that publishing multiple sites (on different topics) could compromise a person’s ability to focus on one site to make it outstanding, remarking that there’s an indirect negative effect on rankings. He is also correct in saying that it’s unlikely to have a direct negative effect on rankings.

Changing the question to whether there’s an effect on rankings if the multiple sites are on the same topic, the answer becomes more nuanced but follows a similar trajectory as Mueller’s original answer: it detracts from being able to create one outstanding site and can lead to a person creating multiple middling sites.

But that’s not necessarily a foregone conclusion when a person is creating multiple sites on different topics. It’s absolutely possible to create multiple sites on different topics and be successful at it. It might be hard for one person alone to pull it off, but it’s not difficult when multiple people are working on the websites, creating content and focusing on promotion.

Watch/listen to the Google SEO Office hours at the 33 second mark:

Featured Image by Shutterstock/Asier Romero

Google Answers If Changing Web Hosting Affects SEO via @sejournal, @martinibuster

Google’s Gary Illyes answered whether switching web hosting platforms could result in a negative outcome for rankings and SEO. It’s a reasonable question because migrating a site to a new web host involves multiple technical factors that can go wrong and have an immediate negative effect.

What Does Changing A Website Host Entail?

Changing web hosts can be complicated, but if the site is a regular WordPress site then migrating to a new host is relatively easy, especially if you use plugins.

But even a manual migration isn’t that difficult if you’re already familiar with administering a website database with apps like phpMyAdmin and know how to use SFTP software.

Gary Illyes’ answer refers to doing it “by the book,” which actually covers quite a lot and can be overwhelming if you’ve never done it before.

Does Changing Website Hosting Affect SEO?

The question that was asked is:

“My company is considering switching hosts for our website. Would switching have a negative and lasting effect on our SEO rankings?”

Google’s Gary Illyes responded:

“If you do things by the book, meaning the website keeps being resolvable and the actual downtime is minimal, changing hosts should not have negative effects on your pages’ rankings in Google’s search results.”

Two Things To Be Mindful Of

Gary Illyes’ answer assumes that the new web host is as good as the old web hosting platform. Obviously, downgrading the web hosting may come with minor to major negative outcomes.

Gary mentioned two factors to be mindful of:

  1. Website domain resolves
  2. Downtime is minimal

1. Website Is Resolvable

This is a reference to how the domain name is translated to an IP address that matches where the website itself is hosted. It typically means obtaining the name server (NS) information from the new web hosting platform where the site files are and updating it at the domain name registrar. Additionally, the A record (address record) should reflect the correct IP address for the new web hosting space (along with other entries related to email).
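One quick way to confirm the cutover after updating the registrar is to check what the domain currently resolves to. A minimal sketch in Python using the standard library (the hostname and IP in the usage example are placeholders):

```python
import socket

def resolve_ipv4(hostname: str) -> set[str]:
    """Return the set of IPv4 addresses the hostname currently resolves to."""
    infos = socket.getaddrinfo(hostname, None, family=socket.AF_INET)
    return {info[4][0] for info in infos}

def dns_points_at_new_host(hostname: str, new_ip: str) -> bool:
    """True once the A record has propagated to the new host's address."""
    return new_ip in resolve_ipv4(hostname)

# Example (placeholder values):
# dns_points_at_new_host("www.example.com", "203.0.113.10")
```

It’s worth running a check like this from a few different networks, since DNS changes propagate at different speeds depending on each resolver’s cache.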

2. Downtime Is Minimal

Believe it or not, your website can be down for weeks and it won’t permanently lose rankings, as long as everything is exactly the same as before when it comes back. I know this from personal experience, having operated websites for 25 years. For example, there were a couple of times when one or another of my websites went down due to a hard drive failure at the dedicated web host, incorrect settings causing 500 errors, or having to take a website offline to fix hacked files.

A site can recover from being down for weeks, and in my experience it takes a couple of weeks for Google to recrawl and add all the webpages back into the search engine results pages (SERPs).

Listen to the answer on YouTube at the 7:58 mark:

Featured Image by Shutterstock/New Africa

Google On The SEO Impact Of 503 Status Codes via @sejournal, @MattGSouthern

Searchers who expect a website to always be available may be disappointed to encounter a 503 status code.

However, brief downtime is perfectly acceptable.

During the April edition of the Google Search Central SEO office hours, a question was raised about the potential impact of intermittently serving 503 “Service Unavailable” status codes.

Gary Illyes, a long-time Google webmaster trends analyst, clarified the search engine’s stance.

“Serving a 503 status code for an extended period of time will cause a decrease in crawl rate.

Fortunately for you, 10-15 minutes every now and then is not ‘extended’ by any means, so you should be fine.”

The Importance Of Uptime

While 100% uptime may be ideal, it’s not required to maintain good rankings in Google’s search results.

Websites undergo periods of maintenance and updates and can experience unplanned outages from time to time.

As long as these downtimes are brief and infrequent, they’re unlikely to impact crawling and indexing severely.

An extended period wasn’t clearly defined, but the example of 10-15-minute windows several times per week was deemed acceptable.

Previously, Google stated it would start deindexing webpages if a site is down for over a few days.

Planning For Updates

Advanced planning and strategy are recommended for websites expecting extended downtime.

Techniques like shadowing a staging site or employing progressive rollouts can reduce the visibility of errors and downtime.
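Another common technique for planned windows is to answer every request with a 503 plus a Retry-After header, so crawlers treat the outage as temporary and know when to come back. A minimal sketch using Python’s standard library (the 900-second value is an arbitrary example, not a Google recommendation):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

MAINTENANCE = True  # flip to False once the update window ends

class MaintenanceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if MAINTENANCE:
            body = b"<h1>Down for maintenance, back shortly</h1>"
            # 503 signals a temporary outage; Retry-After hints
            # (in seconds) when crawlers should try again.
            self.send_response(503)
            self.send_header("Retry-After", "900")
        else:
            body = b"<h1>Site is up</h1>"
            self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the sketch quiet
        pass
```

In production this would normally live in the web server or load balancer config rather than application code, but the status code and header are the same.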

Illyes advises:

“If you do things by the book, meaning the website keeps being resolvable and the actual downtime is minimal, changing [configuration] should not have negative effects on your pages’ rankings in Google’s search results.”

While continuous uptime is ideal for user experience, Google’s systems can tolerate short downtime without negatively ranking the website in search results.

Why SEJ Cares

Google’s guidance on brief 503 status codes provides relief for publishers who may be concerned about the potential negative impacts of website downtime.

Many sites go through regular update cycles on a weekly or monthly cadence, requiring some downtime windows.

These can be for publishing new content, product updates, security patches, and general housekeeping.

Additionally, unplanned outages, server issues, and other unavoidable downtime happen.

As long as temporary blips are being actively resolved, there’s no need to panic about losing rankings and search visibility.

How This Can Help You

Website owners, developers, and SEO professionals managing websites can benefit from understanding Google’s tolerances for 503 status codes.

Some key takeaways:

  • Plan for and minimize downtime during updates, but don’t stress over short periods serving 503s.
  • Monitor analytics and user feedback to ensure users aren’t severely impacted by unavailable periods.
  • Uptime and swift error resolutions should be prioritized as much as possible for overall website health.
  • Investigate techniques like staged rollouts to reduce errors for major updates requiring extended downtime.

With reasonable expectations set by Google, websites can make informed decisions about balancing downtimes with SEO priorities.

Hear the full question and answer in the video below:


Featured Image: Zikku Creative/Shutterstock

Google Ends Video Carousel Structured Data Test via @sejournal, @martinibuster

Google has ended support for the video carousel (limited access) markup documentation and the associated rich results because, after testing the structured data, it found the markup wasn’t useful at scale. Site publishers who use this specific structured data don’t necessarily have to remove it; other search engines may find it useful. However, Google won’t use it for rich results.

Google removed a paragraph from the section of the documentation that was labeled “Video with key moments in search results”.

Removed paragraph:

“Video host carousel (limited access): Enable users to explore your video gallery pages by adding ItemList structured data. This feature is currently limited to a small set of providers. Google Search may automatically show a host carousel for your site without you having to add ItemList structured data.”

Google also removed the entire structured data example.

Screenshot Of Removed Structured Data Example

Screenshot of a webpage featuring a VideoObject JSON-LD structured data example. The text and code snippets are clearly visible, intended to explain how to implement structured data for a video about self-driving bicycles.

Google removed a link to the “Carousel guidelines” from the section titled Guidelines, where Google links to specific guideline documentation, but curiously, the page being linked to still exists.

This is what was removed:

Carousel guidelines

Google removed the following phrase from the “Structured data type definitions” section:

“ItemList: Mark up pages with a list of videos to enable a video host carousel.”

And the entire section listing the recommended properties is now gone.
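For readers who never saw the removed example, ItemList markup for a video gallery page generally followed this shape (an illustrative reconstruction, not Google’s exact removed snippet; the URLs are placeholders):

```json
{
  "@context": "https://schema.org",
  "@type": "ItemList",
  "itemListElement": [
    {
      "@type": "ListItem",
      "position": 1,
      "url": "https://example.com/videos/video-one"
    },
    {
      "@type": "ListItem",
      "position": 2,
      "url": "https://example.com/videos/video-two"
    }
  ]
}
```

Each ListItem pointed at a page hosting a single video, and Google used the list to power the now-retired host carousel.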

Screenshot Of Removed Recommended Properties Section

Screenshot of the removed Google structured data documentation, showing JSON code examples for adding video objects to an ItemList, with explanations of the ListItem and ItemList properties and URLs.

Why Google Removed The Documentation

Google’s developer changelog lists the following reasons why the documentation was removed:

“Removing video carousel (limited access) documentation
What: Removed video carousel guidance from the video structured data documentation.

Why: We initially tested video carousel markup with a group of site owners, and ultimately found that it wasn’t useful for the ecosystem at scale. You can leave the markup on your site so that search engines and other systems can better understand your web page.”

Read the updated guidance for video structured data

Video (VideoObject, Clip, BroadcastEvent) structured data

View an archive of the old structured data documentation

Archive of Video (VideoObject, Clip, BroadcastEvent) structured data

Featured Image by Shutterstock/Koshiro K

Brave Announces AI Search Engine – Shares Insights For SEO via @sejournal, @martinibuster

Brave announced its new privacy-focused AI search engine, Answer with AI, which works with Brave’s own search index of billions of websites. Brave’s current search engine already serves 10 billion search queries per year, which makes its AI-powered search engine one of the largest AI search engines online.

Many in the search marketing and ecommerce communities have expressed anxiety about the future of the web because of AI search engines. Brave’s AI search engine still shows links and, most importantly, it does not by default answer commercial or transactional queries with AI, which should be good news for SEOs and online businesses. Brave values the web ecosystem and will be monitoring website visit patterns.

Search Engine Journal spoke with Josep M. Pujol, Chief of Search at Brave, who answered questions about the search index and how it works with AI. Most importantly, he shared what SEOs and business owners need to know in order to improve rankings.

Answer With AI Is Powered By Brave

Unlike other AI search solutions, Brave’s AI search engine is powered completely by its own search index of crawled and ranked websites. The entire underlying technology, from the search index to the Large Language Models (LLMs) and even the Retrieval Augmented Generation (RAG) technology, is developed by Brave. This is especially good from a privacy standpoint, and it also makes Brave’s search results unique, further distinguishing it from other me-too search engine alternatives.

Search Technology

The search engine itself is all done in-house. According to Josep M. Pujol, Chief of Search at Brave:

“We have query-time access to all our indexes, more than 20 billion pages, which means we are extracting arbitrary information in real-time (schemas, tables, snippets, descriptions, etc.). Also, we go very granular on what data to use, from whole paragraphs or texts on a page to single sentences or rows in a table.

Given that we have an entire search engine at our disposal, the focus is not on retrieval, but selection and ranking. Additionally, to pages in our index, we do have access to the same information used to rank, such as scores, popularity, etc. This is vital to help select which sources are more relevant.”

Retrieval Augmented Generation (RAG)

The search engine works by pairing a search index with large language models, with Retrieval Augmented Generation (RAG) technology in between that keeps the answers fresh and fact-based. I asked about RAG and Josep confirmed that’s how it works.

He answered:

“You are correct that our new feature is using RAG. As a matter of fact, we’ve already been using this technique on our previous Summarizer feature released in March 2023. However, in this new feature, we are expanding both the quantity and quality of the data used in the content of the prompt.”
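In broad strokes, RAG retrieves and ranks snippets from the index, then packs the best candidates into the model’s prompt so the generated answer is grounded in retrieved documents. A toy sketch of that selection step, with naive term-overlap scoring standing in for Brave’s actual retrieval and ranking:

```python
def build_rag_prompt(query: str, index: list[str], k: int = 2) -> str:
    """Select the k most relevant snippets and pack them into an LLM prompt."""
    query_terms = set(query.lower().split())

    def overlap(snippet: str) -> int:
        # Stand-in relevance score: count of terms shared with the query.
        return len(query_terms & set(snippet.lower().split()))

    top = sorted(index, key=overlap, reverse=True)[:k]
    context = "\n".join(f"- {s}" for s in top)
    return f"Answer using only these sources:\n{context}\n\nQuestion: {query}"
```

The point Josep makes is exactly this narrowing: the hard part isn’t retrieving lots of data but selecting the few thousand tokens of context that actually belong in the prompt.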

Large Language Models Used

I asked about the language models in use in the new AI search engine and how they’re deployed.

“Models are deployed on AWS p4 instances with VLLM.

We use a combination of Mixtral 8x7B and Mistral 7B as the main LLM model.

However, we also run multiple custom trained transformer models for auxiliary tasks such as semantic matching and question answering. Those models are much smaller due to strict latency requirements (10-20 ms).

Those auxiliary tasks are crucial for our feature, since those are the ones that do the selection of data that will end up being on the final LLM prompt; this data can be query-depending snippets of text, schemas, tabular data, or internal structured data coming from our rich snippets. It is not a matter of being able to retrieve a lot of data, but to select the candidates to be added to the prompt context.

For instance, the query “presidents of france by party” processes 220KB of raw data, including 462 rows selected from 47 tables, 7 schemas. The prompt size is around 6500 tokens, and the final response is a mere 876 bytes.

In short, one could say that with “Answer with AI” we go from 20 billion pages to a few thousand tokens.”

How AI Works With Local Search Results

I next asked how the new search engine will surface local search. I asked Josep if he could share some scenarios and example queries where the AI answer engine will surface local businesses. For example, if I query for the best burgers in San Francisco, will the AI answer engine provide an answer for that and links to it? Will this be useful for people making business or vacation travel plans?

Josep answered:

“The Brave Search index has more than 1 billion location-based schemas, from which we can extract more than 100 million businesses and other points of interest.

Answer with AI is an umbrella term for Search + LLMs + multiple specialized machine learning models and services to retrieve, rank, clean, combine and represent information. We mention this because LLMs do not make all the decisions. As of now, we use them predominantly to synthesize unstructured and structured information, which happens in offline operations as well as in query-time ones.

Sometimes the end result feels very LLM-influenced (this is the case when we believe the answer to the user question is a single Point of Interest, e.g. “checkin faro cuisine”, and other times their work is more subtle (e.g.”best burgers sf”), generating a business description across different web references or consolidating a category for the business in a consistent taxonomy.”

Tips For Ranking Well

I next asked if using Schema.org structured data was useful for helping a site rank better in Brave and if he had any other tips for SEO and online businesses.

He answered:

“Definitely, we pay special attention to schema.org structured data when building the context of the LLM prompt. The best is to have structured data about their business (standard schemas from schema.org). The more comprehensive those schemas are, the more accurate the answer will be.

That said, our Answer with AI will be able to surface data about the business not in those schemas too, but it is always advisable to repeat information in different formats.

Some businesses only rely on aggregators (Yelp, Tripadvisor, Yellow Pages) for their business information. There are advantages to adding schemas to the business web site even if only for crawling bots.”
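A minimal sketch of the kind of schema.org business markup he describes (all business details here are hypothetical placeholders):

```json
{
  "@context": "https://schema.org",
  "@type": "Restaurant",
  "name": "Example Burger Bar",
  "url": "https://example-burger-bar.example/",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example St",
    "addressLocality": "San Francisco",
    "addressRegion": "CA",
    "postalCode": "94100"
  },
  "servesCuisine": "Burgers",
  "openingHours": "Mo-Su 11:00-22:00"
}
```

Per Josep’s advice, the more complete the schema, the more the answer engine has to work with; relying on aggregator listings alone leaves this context off the business’s own site.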

Plans For AI Search In The Brave Browser

Brave shared that at some point in the near future they will integrate the new AI search functionality directly in the Brave Browser.

Josep explained:

“We plan to integrate the AI answer engine with Brave Leo (the AI assistant embedded in the Brave browser) very soon. Users will have the option to send the answer to Leo and continue the session there.”

Other Facts

Brave’s announcement also shared these facts about the new search engine:

“Brave Search’s generative answers are not just text. The deep integration between the index and model makes it possible for us to combine online, contextual, named entities enrichments (a process that adds more context to a person, place, or thing) as the answer is generated. This means that answers combine generative text with other media types, including informational cards and images.

The Brave Search answer engine can even combine data from the index and geo local results to provide rich information on points of interest. To date, the Brave Search index has more than 1 billion location-based schemas, from which we can extract more than 100 million businesses and other points of interest. These listings—larger than any public dataset—mean the answer engine can provide rich, instant results for points of interest all over the world.”

Try out the new AI search at http://search.brave.com/