Canada Still Works for U.S. Sellers

Cross-border ecommerce between the United States and Canada has rarely been more uncertain than in 2025. Nonetheless, for U.S. merchants, Canada remains an easy first step into international sales.

Canadian shoppers buy heavily from U.S. sites. But this year’s holiday shopping season is testing even experienced sellers. Tariffs, a postal strike, and shifting taxes have turned what should be a smooth northern route into a logistical puzzle.

Tariff Dust-Up

Many pundits have described recent trade relations as a war. Helaine Rich, the vice president of strategic sales and administration at ePost Global, an international shipping provider, was more tactful.

“The current administration really took a look at what countries are asking [American businesses] to pay when we’re shipping into their countries and what they’re charging as a premium on U.S. goods,” said Rich.

The result was a round of U.S. tariffs that upset North-South relations. The back-and-forth trade negotiations included on-again, off-again reciprocal tariffs and surtaxes from the Canadian government, in some cases as high as 25%.

At peaks in the U.S.-Canada dispute, ecommerce “shipment volumes dropped quickly as tariffs rose and some Canadian consumers began avoiding U.S. brands,” Rich said.

Yet duties existed long before the recent dust-up.

Canadian Duties

“Nothing is more frustrating than thinking you paid for a product and then getting surprised at the door that you have to pay this, that, and the other duty,” said Rich.

Unfortunately, this is a common occurrence for Canadian shoppers buying from American ecommerce stores. Here are a few examples of what Canada typically adds.

  • Duties. Based on the type of goods, their origin (including whether they qualify under the United States–Mexico–Canada free-trade agreement or other treaties), and classification (harmonized system code).
  • GST. The Canadian goods and services tax applies to most imported goods, calculated on the Canadian-dollar value of goods plus duties.
  • PST or HST. Depending on the destination province, the importer or consumer may also owe provincial sales tax (PST) or harmonized sales tax (HST), combining federal and provincial components.

Postal Strike

The final challenge, beyond a volatile trade environment and duties due, is getting a package delivered in Canada.

Rich noted that the Canadian Union of Postal Workers is again holding strikes during the holiday season, delaying most holiday shipments, as it did last year.

Canada Post is the nation’s primary last-mile delivery service. Thus sending shipments via the U.S. Postal Service or any other carrier that partners with Canada Post is a risk.

For instance, Walmart Marketplace does not allow Canada Post or any affiliated services. Sellers who try to circumvent Walmart’s restriction face suspension.

Opportunity Nonetheless

Despite tariffs, taxes, and strikes, Canada remains a significant growth market for U.S. ecommerce retailers.

Depending on the projection, total Canadian ecommerce sales will land between USD $40 billion and $43 billion in 2025. About 20% of those sales will go to American stores, making the Canadian market worth around $8 billion to U.S. sellers for the year. (In contrast, 2024 U.S. retail ecommerce sales were roughly $1.2 trillion.)

Canada offers U.S. merchants geographic proximity, familiar consumer behavior, and less competition. Plus, nearly all consumers speak English.

If a U.S.-based online store wants to expand internationally, Canada is a great place to start.

Selling to Canada

Success requires researching three factors: cost competitiveness, duty-paid pricing, and carrier contingencies.

Cost

Before marketing to Canadian shoppers, merchants should model the total landed cost — product, shipping, duties, and GST/HST — to ensure each sale is profitable, according to Rich at ePost Global.
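
To make the exercise concrete, here is a minimal landed-cost sketch in Python. The duty and provincial tax rates are illustrative placeholders, and shipping is treated as non-taxable for simplicity; look up the actual rates for your HS code and destination province before relying on the numbers.

    def landed_cost_cad(product_cad, shipping_cad, duty_rate, provincial_rate, gst_rate=0.05):
        """Estimate the total a Canadian buyer pays, in Canadian dollars."""
        duty = product_cad * duty_rate               # depends on HS code and origin (USMCA may zero it out)
        taxable = product_cad + duty                 # GST is calculated on value plus duties
        gst = taxable * gst_rate
        provincial = taxable * provincial_rate       # PST or the provincial portion of HST
        total = product_cad + shipping_cad + duty + gst + provincial
        return {"duty": duty, "gst": gst, "provincial": provincial, "total": total}

    # Example: a CAD $120 item, CAD $25 shipping, 6% duty, 8% provincial portion
    print(landed_cost_cad(120, 25, duty_rate=0.06, provincial_rate=0.08))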

This step includes identifying and understanding tariffs or product restrictions. Canada forbids the import of some products sold domestically in the United States.

Bottom line: Can a U.S. business competitively sell its products into Canada?

Duty-paid pricing

Duty-paid pricing is akin to free shipping.

Shipping fees create friction, prompting shoppers to abandon orders if too expensive. Ditto for import duties.

Free shipping offers remove that friction.

The calculation is the same as for “delivered duty paid” (DDP) shipping. If paying Canadian duties on customers’ behalf increases sales volume enough to offset the cost and generate more profit, do it.
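
A quick back-of-the-envelope check, with made-up numbers, shows the break-even logic:

    margin_per_order = 30.00          # profit per order when the customer pays duties
    duty_cost_absorbed = 9.00         # cost to prepay duties and taxes per order
    baseline_orders = 1000
    orders_with_ddp = 1500            # projected volume lift from frictionless checkout

    profit_without_ddp = baseline_orders * margin_per_order
    profit_with_ddp = orders_with_ddp * (margin_per_order - duty_cost_absorbed)

    print(profit_without_ddp, profit_with_ddp)  # 30000.0 vs 31500.0: worth it in this scenario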

Carriers

Finally, no carrier is a good choice when it is experiencing a strike. Shipments will almost certainly encounter delays.

Have a contingency plan that uses carriers not impacted by the current Canadian strikes.

Google Sharpens Suspension Accuracy and Speeds Up Appeals for Advertisers via @sejournal, @brookeosmundson

Google account suspensions have long been one of the most stressful issues advertisers face. A single notification can pause revenue, disrupt campaigns, and leave teams scrambling to understand what went wrong, often through no fault of their own.

Over the past several months, Google has heard that feedback and is now rolling out measurable improvements aimed at reducing the burden on legitimate advertisers.

These updates should bring meaningful relief. Misapplied suspensions are down, appeals are moving faster, and Google is promising more transparency into why enforcement actions happen at all.

What’s Changed in Google’s Process

Google announced several updates aimed at preventing unnecessary enforcement actions and speeding up resolutions when mistakes happen.

Google Ads Liaison Ginny Marvin shared additional context in a LinkedIn video. She explained that advertisers often faced long, unclear appeal processes. Many of those advertisers were compliant, but still got caught in broad enforcement filters designed to protect users. The new improvements are meant to address that gap and create a smoother experience for legitimate businesses.

Screenshot taken by author, November 2025

According to Google’s data:

  • Incorrect account suspensions are down more than 80%
  • Appeals are being resolved 70% faster
  • 99% of appeals are reviewed within 24 hours

These numbers reflect improvements in Google’s automated systems, better internal checks, and more precise policy evaluation. The goal is to reduce the number of trusted advertisers who get suspended by mistake and to shorten the time it takes to recover when an account needs review.

Google also mentioned ongoing work to make enforcement decisions easier to understand. While full visibility into every signal is unlikely, these updates indicate an effort to give advertisers clearer direction when issues occur.

How This Helps Advertisers

These changes bring meaningful stability to daily operations. When incorrect suspensions drop by such a large margin, advertisers experience fewer unexpected pauses in performance.

That consistency matters for both in-house teams and agencies managing multiple accounts.

The faster appeal timeline also reduces the fallout from any suspension that does occur. Getting nearly all appeals reviewed within a day helps advertisers avoid extended downtime and protects campaign momentum.

Clarity matters as well. Advertisers have long asked for more detail when suspensions happen.

Even small improvements in transparency can save hours of troubleshooting and prevent repeated appeals that contribute to delays.

These updates should also improve confidence in Google’s enforcement systems. When advertisers trust the process, they can focus on optimization instead of worrying that a routine change will trigger a policy issue.

How This Shapes Future Enforcement

Google’s changes reflect a broader effort to balance user protection with a better advertiser experience. Automated enforcement will always play a significant role in preventing harmful behavior, but legitimate businesses need a system that treats them fairly and resolves issues quickly.

The latest results show encouraging progress. There is still room for improvement, especially in policy clarity and long-term consistency, but the direction is positive.

Google has stated that this work will continue and that advertiser feedback remains central to future updates. For marketers, this signals a more stable and predictable enforcement environment, which supports healthier performance and stronger planning across campaigns.

Google Reminds Websites To Use One Review Target via @sejournal, @MattGSouthern

Google updated its review snippet documentation to clarify that each review or rating in structured data should point to one clear target, reducing ambiguity.

  • Google updated its review snippet docs to clarify how review targets should be specified
  • You should avoid attaching the same review or rating to multiple different entities
  • A quick audit of templates and plugins can catch confusing nesting.
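
As a rough illustration of the “one clear target” idea, the sketch below nests a review and rating inside the single Product they describe rather than attaching the same review to several entities. The property names are standard schema.org vocabulary, but the values are placeholders, so verify the final markup against Google’s review snippet documentation.

    import json

    # One Product, with its review and rating nested under that single target.
    product = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": "Example Widget",
        "review": {
            "@type": "Review",
            "reviewRating": {"@type": "Rating", "ratingValue": "4", "bestRating": "5"},
            "author": {"@type": "Person", "name": "Jane Doe"},
        },
        "aggregateRating": {
            "@type": "AggregateRating",
            "ratingValue": "4.4",
            "reviewCount": "89",
        },
    }

    # Emit as JSON-LD for a <script type="application/ld+json"> block.
    print(json.dumps(product, indent=2))
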
Lazy Link Building Strategies That Work via @sejournal, @martinibuster

I like coming up with novel approaches to link building. One way to brainstorm an approach is to reverse a common method. Below are a handful of approaches: several are passive, and two are a little more active but have very little to do with email outreach. I first wrote about these tips back around 2013, but I’ve polished them up and updated them for today.

Passive Link Building

Someone asked me to put together some tips for those who are too lazy to do link building. So here goes!

Guilt Trip Copyright Infringers

Check who’s stealing your content. Be hard on pure scrapers. But if it’s an otherwise legitimate site, you might want to hold off on asking them to take down your content. Check whether they’re linking to a competitor or similar sites, for example from a links page.

You can ask them nicely to take down the content, and after they email you back to confirm it’s down, reply to thank them. Then say something like, “I see you are linking to Site-X.com. If my content was good enough to show on your site, then I would be grateful and much obliged if you considered it good enough to list on your links page.”

I once heard a keynote speaker at an SEO conference encourage people to come down hard on anyone who steals their content. I strongly disagree with that approach. Some people who copy your content are under the impression that if it’s on the Internet, it’s free and they can use it on their own site. Others think it’s free to use as long as they link back to your site.

If they are linking to your site, tell them that you prefer they don’t infringe on your copyright but that you would be happy to write them a different article they can use as long as they link back to your site. You can be nice to people and still get a link.

Reverse Guest Posting

Instead of publishing articles on someone else’s site, solicit people to publish on your site. Many people tweet, promote, and link from their own sites to sites where they are interviewed. An interesting side effect is that interviewing people with a certain amount of celebrity helps bring more visitors to your site, especially if people are searching for that person.

Relationship Building

Authors of books are great for this kind of outreach. People are interested in what authors and experts say. Sometimes you can find the most popular authors and influencers at industry conferences. I’ve met some really famous and influential people at conferences, gotten their email addresses, and scored interviews simply by going up and talking to them.

This is called relationship building. SEOs and digital marketers are so overly focused on sending out emails and doing everything online that they forget that people actually get together in person at industry events, meetups, and other kinds of social events.

Giveaways

This is an oldie, and I get that many SEOs have talked about it. But it’s something I used successfully as far back as around 2005, when I ran an annual giveaway for my readers and website members.

The way I did it was to contact manufacturers of products popular with my readers, ask for a discount on a bulk purchase, and tell them I’d be promoting their products to my subscribers, readers, and members. I’ve been responsible for making several companies popular by bringing attention to their products, elevating them from a regional business to a nationwide business.

Leverage Niche Audience For Links

The way to do this is to identify an underserved subtopic of your niche, then create a useful section that addresses a need for that niche. The idea is to create a compelling reason to link to the site.

Here is an example of how to do this for a travel destination site.

Research gluten-free, dairy-free, nut-free, and raw-food dining destinations. Then make a point to visit them, interview the owners, and build a resource around them.

Conduct interviews with lodging and restaurant owners who offer gluten-free options. You’ll be surprised by how many restaurants and lodgings decide on their own to link to your site, sometimes with no more than a hint.

Summary

Reach out to sites about a niche topic, not just businesses but also organizations and associations related to that niche that have links and resources pages. Tell them about the site, quickly explain what it offers, and ask for a link. This method is flexible and can be adapted to a wide range of niche topics. And if they have an email newsletter or publish articles, offer to contribute, but don’t ask for a link; just ask for a mention.

Don’t underestimate the power of building positive awareness of your site. Focus on creating positive feelings for your site (goodwill) and generating positive word of mouth, otherwise known as external signals of quality. The rankings will generally follow.

Featured Image by Shutterstock/pathdoc

The Quid Pro No Method Of Link Building via @sejournal, @martinibuster

Expressly paying for links has been out for a while. Quid Pro No is in. These are some things you can do when a website asks for money in exchange for a link. In the course of building links, whether it’s free links, publishing an article, or getting a brand mention, it’s not unusual to be solicited for money. It’s tempting to take the bait and get the project done. But I’m going to suggest some considerations to weigh before making a decision, as well as a way to turn it around using an approach I call Quid Pro No.

Link building, digital PR, and brand mention building can all lead to solicitations for a paid link. There are many good reasons not to engage in paid links, and in my experience it’s often possible to get the link without doing it their way when someone asks for money in return.

Red Light Means Stop

The first consideration is that someone who has their hand out for money is a red light. It’s highly likely they have done this before and are linking to low-quality websites in really bad neighborhoods, which pushes the publisher’s site, and any sites associated with it, into the outlier part of the web graph where sites are identified as spam and tend not to get indexed. In that case, consider it a favor that they outed their site for the crap neighborhood it resides in, and walk away. Quid pro… no.

Getting solicited for money can be a frequent occurrence. Site publishers, some of them apparently legit, are publishing Guest Post Submission Guidelines for the purpose of attracting paying submissions. It’s an industry and overly normalized in certain circles. Beware.

Spook The Fish

A less frequent occurrence involves a newbie who’s simply trying to extract something. If the site checks out, there may be room for some kind of concession. In this case, Quid Pro No means using a little FUD to scare them away from this kind of activity, then turning them around to doing the project on your terms.

When angling on a river, a fish on the hook might make a run downstream away from you, which makes it tough to land because you’re fighting both the fish and the current. Sometimes a tap on the rod will spook it into changing position. Sometimes a sharp pull can make it turn around. For this character, I have found it efficacious to spook them with all the bad things that can happen and turn them around to where I want them to be.

Very briefly, and in the most polite terms, explain you’d love to do business, but that there are other considerations. Here’s what you can trot out:

  • FTC Guidelines
    FTC guidelines prohibit a web publisher from accepting money for an unlabeled advertisement.
  • Google Guidelines
    Google’s spam policies prohibit paid links that pass ranking credit unless they are qualified with rel="sponsored" or rel="nofollow".

Land The Link

“What’s in it for me?” is a useful framing for convincing someone that it’s in their interest to do things your way. The other party wants something, so it’s worthwhile to make them feel as if they’re getting something out of the deal.

The approach I take to closing a project, whether it’s a free link or an article, is to circle back to the ask while focusing on communicating why my site is high quality and the ways we can cross-promote. It’s essentially relationship building. The message is that your site is authoritative and well promoted, and that there are ways both sites can benefit without doing a straight link buy.

But at this point I want to emphasize again that any site that’s asking for money in exchange for a link is not necessarily a good neighborhood. So you might not actually want a link from them if they’re linking out to low quality sites.

Or Go For A Labeled Sponsored Post

However, another way to turn this around is to just go ahead and pay them, as long as the post is labeled as sponsored and contains multiple nofollow (or sponsored-qualified) links and/or brand mentions. Sponsored posts still get indexed by search engines and AI platforms, which can use those mentions as validation of how great your site is and recommend it.

What’s beautiful about a labeled sponsored post is that it gives you full control over the messaging, which can be more valuable than a tossed-off link in a random paragraph. And because everything is disclosed and compliant, you reduce the long-term risk while still capturing visibility in AI Mode, ChatGPT, and Perplexity through the citation signals.

Quid Pro No

Quid Pro No is about responding negatively to a solicitation, then turning it around and getting something you want, without ever actually saying the word no.

Featured Image by Shutterstock/Studio Romantic

Google Defends Parasite SEO Crackdown As EU Opens Investigation via @sejournal, @MattGSouthern

Google has defended its enforcement of site reputation abuse policies after the European Commission announced an investigation into whether the company unfairly demotes news publishers in search results.

The company published a blog post stating the investigation “is misguided and risks harming millions of European users” and that it “risks rewarding bad actors and degrading the quality of search results.”

Google’s Chief Scientist for Search, Pandu Nayak, wrote the response.

Background

The European Commission announced an investigation under the Digital Markets Act examining whether Google’s anti-spam policies unfairly penalize legitimate publisher revenue models.

Publishers complained that Google demotes news sites running sponsored content and third-party promotional material. EU antitrust chief Teresa Ribera said:

“We are concerned that Google’s policies do not allow news publishers to be treated in a fair, reasonable and non-discriminatory manner in its search results.”

Google updated its site reputation abuse policy last year to combat parasite SEO. The practice involves spammers paying publishers to host content on established domains to manipulate search rankings.

The policy targets content like payday loan reviews on educational sites, casino content on medical sites, or third-party coupon pages on news publishers. Google provided specific examples in its announcement including weight-loss pill spam and payday loan promotions.

Manual enforcement began shortly after. Google issued penalties to major publishers including Forbes, The Wall Street Journal, Time and CNN in November 2024.

Google later updated the policy to clarify that first-party oversight doesn’t exempt content primarily designed to exploit ranking signals.

Google’s Defense

Google’s response emphasized three points.

First, Google stated that a German court dismissed a similar claim, ruling the anti-spam policy was “valid, reasonable, and applied consistently.”

Second, Google says its policy protects users from scams and low-quality content. Allowing pay-to-play ranking manipulation would “enable bad actors to displace sites that don’t use those spammy tactics.”

Third, Google says smaller creators support the crackdown. The company claims its policy “helps level the playing field” so legitimate sites competing on content quality aren’t outranked by sites using deceptive tactics.

Nayak argues the Digital Markets Act is already making Search “less helpful for European businesses and users,” and says the new probe risks rewarding bad actors.

The company has relied exclusively on manual enforcement so far. Google confirmed in May 2024 that it hadn’t launched algorithmic actions for site reputation abuse, only manual reviews by human evaluators.

Google added site reputation abuse to its Search Quality Rater Guidelines in January 2025, defining it as content published on host sites “mainly because of that host site’s already-established ranking signals.”

Why This Matters

The investigation creates a conflict between spam enforcement and publisher business models.

Google maintains parasite SEO degrades search results regardless of who profits. Publishers argue sponsored content with editorial oversight provides legitimate value and revenue during challenging times for media.

The distinction matters. If Google’s policy captures legitimate publisher-advertiser partnerships, it restricts how news organizations monetize content. If the policy only targets manipulative tactics, it protects search quality.

The EU’s position suggests regulators view Google’s enforcement as potentially discriminatory. The Digital Markets Act prohibits gatekeepers from unfairly penalizing others, with fines up to 10% of global revenue for violations.

Google addressed concerns about the policy in December 2024, confirming that affiliate content properly marked isn’t affected and that publishers must submit reconsideration requests through Search Console to remove penalties.

The updated policy documentation clarified that simply having third-party content isn’t a violation unless explicitly published to exploit a site’s rankings.

The policy has sparked debate in the SEO community about whether Google should penalize sites based on business arrangements rather than content quality.

Looking Ahead

The European Commission has opened the investigation under the Digital Markets Act and will now gather evidence and define the specific DMA provisions under examination.

Google will receive formal statements of objections outlining alleged violations. The company can respond with arguments defending its policies.

DMA investigations move faster than traditional antitrust cases. Publishers may submit formal complaints providing evidence of traffic losses and revenue impacts.

The outcome could force changes to how Google enforces spam policies in Europe or validate its current approach to protecting search quality.


Featured Image: daily_creativity/Shutterstock

llms.txt: The Web’s Next Great Idea, Or Its Next Spam Magnet via @sejournal, @DuaneForrester

At a recent conference, I was asked if llms.txt mattered. I’m personally not a fan, and we’ll get into why below. I listened to a friend who told me I needed to learn more about it as she believed I didn’t fully understand the proposal, and I have to admit that she was right. After doing a deep dive on it, I now understand it much better. Unfortunately, that only served to crystallize my initial misgivings. And while this may sound like a single person disliking an idea, I’m actually trying to view this from the perspective of the search engine or the AI platform. Why would they, or why wouldn’t they, adopt this protocol? And that POV led me to some, I think, interesting insights.

We all know that search is not the only discovery layer anymore. Large-language-model (LLM)-driven tools are rewriting how web content is found, consumed, and represented. The proposed protocol, called llms.txt, attempts to help websites guide those tools. But the idea carries the same trust challenges that killed earlier “help the machine understand me” signals. This article explores what llms.txt is meant to do (as I understand it), why platforms would be reluctant, how it can be abused, and what must change before it becomes meaningful.

Image Credit: Duane Forrester

What llms.txt Hoped To Fix

Modern websites are built for human browsers: heavy JavaScript, complex navigation, interstitials, ads, dynamic templates. But most LLMs, especially at inference time, operate in constrained environments: limited context windows, single-pass document reads, and simpler retrieval than traditional search indexers. The original proposal from Answer.AI suggests adding an llms.txt markdown file at the root of a site, which lists the most important pages, optionally with flattened content so AI systems don’t have to scramble through noise.

Supporters describe the file as “a hand-crafted sitemap for AI tools” rather than a crawl-block file. In short, the theory: Give your site’s most valuable content in a cleaner, more accessible format so tools don’t skip it or misinterpret it.
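
For reference, the format the proposal describes is roughly a plain-markdown file served at /llms.txt, with a title, a short summary, and curated link lists. The sketch below follows that shape; the URLs and descriptions are invented for illustration.

    # Example Store
    > Online store selling outdoor gear; the pages below are the best entry points for AI tools.

    ## Docs
    - [Shipping policy](https://example.com/shipping.md): Rates, carriers, and delivery times
    - [Returns](https://example.com/returns.md): How refunds and exchanges work

    ## Optional
    - [Blog archive](https://example.com/blog.md): Long-tail articles, lower priority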

The Trust Problem That Never Dies

If you step back, you discover this is a familiar pattern. Early in the web’s history, something like the meta keywords tag let a site declare what it was about; it was widely abused and ultimately ignored. Similarly, authorship markup (rel=author, etc) tried to help machines understand authority, and again, manipulation followed. Structured data (schema.org) succeeded only after years of governance and shared adoption across search engines. llms.txt sits squarely inside this lineage: a self-declared signal that promises clarity but trusts the publisher to tell the truth. Without verification, every little root-file standard becomes a vector for manipulation.

The Abuse Playbook (What Spam Teams See Immediately)

What concerns platform policy teams is plain: If a website publishes a file called llms.txt and claims whatever it likes, how does the platform know that what’s listed matches the live content users see, or can be trusted in any way? Several exploit paths open up:

  1. Cloaking through the manifest. A site lists pages in the file that are hidden from regular visitors or behind paywalls, then the AI tool ingests content nobody else sees.
  2. Keyword stuffing or link dumping. The file becomes a directory stuffed with affiliate links, low-value pages, or keyword-heavy anchors aimed at gaming retrieval.
  3. Poisoning or biasing content. If agents trust manifest entries more than the crawl of messy HTML, a malicious actor can place manipulative instructions or biased lists that affect downstream results.
  4. Third-party link chains. The file could point to off-domain URLs, redirect farms, or content islands, making your site a conduit or amplifier for low-quality content.
  5. Trust laundering. The presence of a manifest might lead an LLM to assign higher weight to listed URLs, so a thin or spammy page gets a boost purely by appearance of structure.

The broader commentary flags this risk. For instance, some industry observers argue that llms.txt “creates opportunities for abuse, such as cloaking.” And community feedback apparently confirms minimal actual uptake: “No LLM reads them.” That absence of usage ironically means fewer real-world case studies of abuse, but it also means fewer safety mechanisms have been tested.

Why Platforms Hesitate

From a platform’s viewpoint, the calculus is pragmatic: New signals add cost, risk, and enforcement burden. Here’s how the logic works.

First, signal quality. If llms.txt entries are noisy, spammy, or inconsistent with the live site, then trusting them can reduce rather than raise content quality. Platforms must ask: Will this file improve our model’s answer accuracy or create risk of misinformation or manipulation?

Second, verification cost. To trust a manifest, you need to cross-check it against the live HTML, canonical tags, structured data, site logs, etc. That takes resources. Without verification, a manifest is just another list that might lie.

Third, abuse handling. If a bad actor publishes an llms.txt manifest that lists misleading URLs which an LLM ingests, who handles the fallout? The site owner? The AI platform? The model provider? That liability issue is real.

Fourth, user-harm risk. An LLM citing content from a manifest might produce inaccurate or biased answers. This just adds to the current problem we already face with inaccurate answers and people following incorrect, wrong, or dangerous answers.

Google has already stated that it will not rely on llms.txt for its “AI Overviews” feature and continues to follow “normal SEO.” And John Mueller wrote: “FWIW no AI system currently uses llms.txt.” So the tools that could use the manifest are largely staying on the sidelines. This reflects the idea that a root-file standard without established trust is a liability.

Why Adoption Without Governance Fails

Every successful web standard has shared DNA: a governing body, a clear vocabulary, and an enforcement pathway. The standards that survive all answer one question early … “Who owns the rules?”

Schema.org worked because that answer was clear. It began as a coalition between Bing, Google, Yahoo, and Yandex. The collaboration defined a bounded vocabulary, agreed syntax, and a feedback loop with publishers. When abuse emerged (fake reviews, fake product data), those engines coordinated enforcement and refined documentation. The signal endured because it wasn’t owned by a single company or left to self-police.

Robots.txt, in contrast, survived by being minimal. It didn’t try to describe content quality or semantics. It only told crawlers what not to touch. That simplicity reduced its surface area for abuse. It required almost no trust between webmasters and platforms. The worst that could happen was over-blocking your own content; there was no incentive to lie inside the file.

llms.txt lives in the opposite world. It invites publishers to self-declare what matters most and, in its full-text variant, what the “truth” of that content is. There’s no consortium overseeing the format, no standardized schema to validate against, and no enforcement group to vet misuse. Anyone can publish one. Nobody has to respect it. And no major LLM provider today is known to consume it in production. Maybe they are, privately, but publicly, no announcements about adoption.

What Would Need To Change For Trust To Build

To shift from optional neat-idea to actual trusted signal, several conditions must be met, and each of these entails a cost in either dollars or human time, so again, dollars.

  • First, manifest verification. A signature or DNS-based verification could tie an llms.txt file to site ownership, reducing spoof risk. (cost to website)
  • Second, cross-checking. Platforms should validate that URLs listed correspond to live, public pages, and identify mismatch or cloaking via automated checks. (cost to engine/platform)
  • Third, transparency and logging. Public registries of manifests and logs of updates would make dramatic changes visible and allow community auditing. (cost to someone)
  • Fourth, measurement of benefit. Platforms need empirical evidence that ingesting llms.txt leads to meaningful improvements in answer correctness, citation accuracy, or brand representation. Until then, this is speculative. (cost to engine/platform)
  • Finally, abuse deterrence. Mechanisms must be built to detect and penalize spammy or manipulative manifest usage. Without that, spam teams simply assume negative benefit. (cost to engine/platform)

Until those elements are in place, platforms will treat llms.txt as optional at best or irrelevant at worst. So maybe you get a small benefit? Or maybe not…

The Real Value Today

For site owners, llms.txt still may have some value, but not as a guaranteed path to traffic or “AI ranking.” It can function as a content alignment tool, guiding internal teams to identify priority URLs you want AI systems to see. For documentation-heavy sites, internal agent systems, or partner tools that you control, it may make sense to publish a manifest and experiment.

However, if your goal is to influence large public LLM-powered results (such as those by Google, OpenAI, or Perplexity), you should tread cautiously. There is no public evidence those systems honor llms.txt yet. In other words: Treat llms.txt as a “mirror” of your content strategy, not a “magnet” pulling traffic. Of course, this means building the file(s) and maintaining them, so factor in the added work v. whatever return you believe you will receive.

Closing Thoughts

The web keeps trying to teach machines about itself. Each generation invents a new format, a new way to declare “here’s what matters.” And each time the same question decides its fate: “Can this signal be trusted?” With llms.txt, the idea is sound, but the trust mechanisms aren’t yet baked in. Until verification, governance, and empirical proof arrive, llms.txt will reside in the grey zone between promise and problem.

This post was originally published on Duane Forrester Decodes.


Featured Image: Roman Samborskyi/Shutterstock

Ask A PPC: How To Protect A Budget From Competitor-Branded Terms via @sejournal, @navahf

This week’s question comes from Evan, who asked: “How do I prevent my PPC budget from getting eaten by branded competitor terms?”

It’s a good question, as few things frustrate advertisers more than watching transactional budgets get drained by competitor-branded searches. Marketing dollars intended for high-intent, conversion-ready audiences often get spent on clicks from users searching for competitors instead. These searches typically convert at lower rates and can produce deceptively low CPCs, creating false positives that distort performance data.

Protecting spend from competitor traffic requires a mix of negative keyword management, platform tools, and thoughtful campaign structure. Here’s how advertisers can take control and ensure their budgets stay focused on profitable intent.

Use Strategic Negatives

Negative keywords remain the most reliable way to prevent ads from serving on competitor-branded queries. Adding competitor names as phrase match negatives blocks variations of that brand name, while exact match negatives offer more precision when overlap risk is high.

However, advertisers must be careful. Some competitor names resemble valuable generic phrases. For example, if a competitor calls its business “Dog Trainer Near Me,” excluding that term could block qualified local leads. The goal is to remove competitor intent, not legitimate customer searches.

It’s also important to recognize that negative keyword limits are imposed by the ad platforms themselves. Google Ads and Microsoft Ads both restrict the number of negatives an account can include. Most advertisers can expect to cap out between 2,500 and 10,000 negatives per account, depending on structure and platform. Because of this limitation, advertisers should be selective about what they block.

The most efficient approach is to create a shared list of proven competitor negatives and apply it at the campaign or account level. This method saves space and keeps exclusions consistent across campaigns. Regularly review search term reports to identify new competitor variants and refine your list based on performance data.

Leverage Brand Inclusions And Exclusions In AI Campaigns

Advertisers running AI-driven campaign types, such as Performance Max, can use brand inclusion and exclusion controls to refine targeting. These tools allow advertisers to specify which brands their ads can or cannot appear alongside.

It’s important to understand that brand exclusions are not the same as negative keywords. A negative keyword blocks a specific word or phrase. A brand exclusion tells the system to avoid what it identifies as queries related to a particular brand. This AI-driven interpretation can reduce the need for lengthy negative lists, though close variants may still slip through.

These settings only apply to campaigns that use AI optimization, so advertisers must opt into automated formats to access them. If an account does not meet the required conversion thresholds for AI bidding, traditional negatives remain the best control option.

Assign Accurate Conversion Values And ROAS Goals

Competitor searches often look cheap on paper but cost more in practice due to lower conversion rates. A click on a competitor term may cost less, but it usually takes many more of those clicks to produce a single conversion.

To correct for this, advertisers should ensure their conversion tracking reflects actual business value. Assign different conversion values to calls, form fills, trial signups, or purchases to align with real-world outcomes. This helps automated bidding systems prioritize actions that contribute most to revenue rather than chasing inexpensive but unprofitable clicks.

On Google Ads, using Maximize Conversion Value with a ROAS target or applying cost-per-click floors can guide automation toward efficiency. Bid caps on both Google and Microsoft Ads help maintain control and prevent runaway spend on experimental traffic.

Structure Competitor Campaigns Separately

When an advertiser chooses to bid on competitor-branded keywords intentionally, those campaigns should operate in isolation. Competitor campaigns need their own budget, bidding strategy, and performance goals.

If the purpose is awareness, advertisers can remove ROAS targets and focus on visibility. If the purpose is performance, set high ROAS thresholds to ensure efficiency. The goal is to appear in competitor search results strategically, not to capture volume for its own sake.

Each competitor should live in a separate ad group with tailored creative. Avoid dynamic keyword insertion and never include competitor names in ad copy. Doing so risks ad disapprovals or account suspensions. Instead, ads should highlight what differentiates the advertiser (unique offers, service quality, or proprietary advantages) without mentioning the competitor directly.

Competitor bidding should remain limited to a short list of key rivals. A smaller, well-targeted approach allows for better creative control and clearer measurement of performance impact.

Continuously Audit And Refine

Competitor-related traffic shifts over time, and advertisers need to stay vigilant. Regularly reviewing search term reports helps uncover new variations or misspellings of competitor names that may be triggering ads. When low-performing competitor queries appear, add them to your shared negative list.

Segment performance by device, location, and audience type to find patterns. For instance, competitor clicks may be less efficient on mobile devices or in certain regions. These insights can guide bid adjustments, audience exclusions, or negative refinements that further protect the budget.

Balance Control With Opportunity

Blocking competitor-branded traffic improves efficiency, but advertisers must balance control with opportunity. Removing competitor terms completely eliminates the chance to influence potential buyers who are comparing options. This trade-off is worth making for consistently underperforming queries, but should always be intentional.

Negatives and brand exclusions create a strong defense. Accurate conversion valuation and disciplined bidding drive smarter optimization. Separate competitor campaigns allow for strategic engagement without risking broad budget leakage.

Featured Image: Paulo Bobita/Search Engine Journal

From slow to super fast: how to boost site speed the right way

Did you know that even a one-second delay in page loading speed can cause up to 11% fewer page views? That’s right, you might have the best content strategy and a solid plan to drive traffic, but visitors won’t stay long if your site lags. Page speed is one of the biggest factors in keeping users engaged and converting.

In this guide, we’ll uncover the most common causes of slow websites and explore proven ways to boost website performance. Whether your site feels sluggish or you simply want to make it faster, these insights will help you identify what’s holding it back and how to fix it.

What do we mean by ‘website performance’ and why is it important for you?

Website performance is all about how efficiently your site loads and responds when someone visits it. It’s not just about how fast a page appears; it’s about how smoothly users can interact with your content across devices, browsers, and locations. In simple terms, it’s the overall quality of your site’s experience that should feel fast, responsive, and effortless to use.

When your page loading speed is optimized, you’re not only improving the user experience but also setting the foundation for long-term website performance.

Here’s why it matters for every website owner:

Fast-loading sites have higher conversion rates and lower bounce rates

Attention spans are notoriously short. As the internet gets faster, they’re getting shorter still. Numerous studies have found a clear link between the time it takes a page to load and the percentage of visitors who become impatient while waiting.

By offering a fast site, you encourage your visitors to stay longer. Not to mention, you’re helping them complete their checkout journey more quickly. That helps improve your conversion rate and builds trust and brand loyalty. Think of all the times you’ve been cursing at the screen because you had to wait for a page to load or were running in circles because the user experience was atrocious. It happens so often; don’t be that site.

A fast page improves user experience

Google understands that the time it takes for a page to load is vital to the overall user experience. Waiting for content to appear, the inability to interact with a page, and even noticing delays create friction.

That friction costs time and money, and it degrades your visitor’s experience. Research shows that waiting for slow mobile pages can be more stressful than watching a horror movie. Surely not, you say? That’s what the fine folks at Ericsson Research found a few years back.

Ericsson Mobility Report MWC Edition, February 2016

Improving your site speed across the board means making people happy. They’ll enjoy using your site, make more purchases, and return more frequently. This means that Google will view your site as a great search result because you are delivering high-quality content. Eventually, you might get a nice ranking boost.

Frustration hurts your users and hurts your rankings

It’s not just Google – research from every corner of the web on all aspects of consumer behavior shows that speed has a significant impact on outcomes.

  • Nearly 70% of consumers say that page speed impacts their willingness to buy (unbounce)
  • 20% of users abandon their cart if the transaction process is too slow (radware.com)
  • The BBC found that they lost an additional 10% of users for every additional second their site took to load

These costs and site abandonment happen because users dislike being frustrated. Poor experiences lead them to leave, visit other websites, and switch to competitors. Google easily tracks these behaviors (through bounces back to search engine results pages, short visits, and other signals) and is a strong indicator that the page shouldn’t be ranking where it is.

Google needs fast sites

Speed isn’t only good for users – it’s good for Google, too. Slow websites are often inefficient. They may load too many large files, serve unoptimized media, or fail to use modern technologies to deliver their pages. That means that Google has to consume more bandwidth, allocate more resources, and spend more money.

Across the whole web, every millisecond they can save and every byte they don’t have to process adds up quickly. And quite often, simple changes to configuration, processes, or code can make websites much faster with no drawbacks. That may be why Google is so vocal about performance education.

A faster web is better for users and significantly reduces Google’s operating costs. Either way, that means that they’re going to continue rewarding fast(er) sites.

Improving page speed helps to improve crawling for search engines

Modern sites are incredibly unwieldy, and untangling that mess can make a big difference. The larger your site is, the greater the impact page speed optimizations will have. That not only impacts user experience and conversion rates but also affects crawl budget and crawl rate.

When a Googlebot comes around and crawls your webpage, it crawls the HTML file. Any resources referenced in the file, like images, CSS, and JavaScript, will be fetched separately. The more files you have and the heavier they are, the longer it will take for the Googlebot to go through them.

At the same time, the more time Google spends crawling a page and its files, the less time and resources it has to dedicate to other pages. That means Google may miss out on other important pages and content on your site.

Optimizing your website and content for speed will provide a good user experience for your visitors and help Googlebots better crawl your site. They can come around more often and accomplish more.

Page speed is a ranking factor

Google has repeatedly said that a fast site helps you rank better. It’s no surprise, then, that Google has been measuring the speed of your site and using that information in its ranking algorithms since 2010.

In 2018, Google launched the so-called ‘Speed Update,’ making page speed a ranking factor for mobile searches. Google emphasized that it would only affect the slowest sites and that fast sites would not receive a boost; however, they are evaluating website performance across the board.

In 2021, Google announced the page experience algorithm update, demonstrating that page speed and user experience are intertwined. Core Web Vitals clearly state that speed is an essential ranking factor. The update also gave site owners metrics and standards to work with.

Of course, Google still wants to serve searchers the most relevant information, even if the page experience is somewhat lacking. Creating high-quality content remains the most effective way to achieve a high ranking. However, Google also states that page experience signals become more important when many pages with relevant content compete for visibility in the search results.

Google mobile-first index

Another significant factor in page speed for ranking is Google’s mobile-first approach to indexing content. That means Google uses the mobile version of your pages for indexing and ranking. This approach makes sense as we increasingly rely on mobile devices to access the internet. In recent research, Semrush found that 66% of all website visits come from mobile devices.

To compete for a spot in the search results, your mobile page needs to meet Core Web Vitals standards and other page experience signals. And this is not easy at all. Pages on mobile take longer to load compared to their desktop counterparts, while attention span stays the same. People might be more patient on mobile devices, but not significantly so.

Take a look at some statistics:

  • The average website loading time is 2.5 seconds on desktop and 8.6 seconds on mobile, based on an analysis of the top 100 web pages worldwide (tooltester)
  • The average mobile web page takes 15.3 seconds to load (thinkwithgoogle)
  • On average, webpages on mobile take 70.9% longer to load than on desktop (tooltester)
  • A loading speed of 10 seconds increases the probability of a mobile site visitor bouncing by 123% compared to a one-second loading speed (thinkwithgoogle)

All the more reason to optimize your website and content if your goal is to win a spot in the SERP.

Understanding the web page loading process

When you click a link or type a URL and press Enter, your browser initiates a series of steps to load the web page. It might seem like magic, but behind the scenes, there’s a lot happening in just a few seconds. Understanding this process can help you see what affects your page loading speed and what you can do to boost website performance.

The “one second timeline” from Google’s site speed documentation

The process of loading a page can be divided into three key stages:

Network stage

This is where the connection begins. When someone visits your site, their browser looks up your domain name and connects to your server. These steps, the DNS lookup and TCP connection, enable data to travel between your website and the visitor’s device.

You don’t have much direct control over this stage, but technologies like content delivery networks (CDNs) and smart routing can make a big difference, especially if you serve visitors from around the world. For local websites, optimizing your hosting setup can still help improve overall page loading speed.

Server response stage

Once the connection is established, the visitor’s browser sends a request to your server asking for the web page and its content. This is when your server processes that request and sends back the necessary files.

The quality of your hosting, server configuration, and even your website’s theme or plugins all influence how quickly your server responds. A slow response is one of the most common issues with slow websites, so investing in a solid hosting environment is crucial if you want to boost your website’s performance.

One popular choice is Bluehost, which offers reliable infrastructure, SSD storage, and built-in CDN support, making it a go-to hosting solution for many website owners.

Browser rendering stage

Now it’s time for the browser to put everything together. It retrieves data from your server and begins displaying it by loading images, processing CSS and JavaScript, and rendering all visible elements.

Browsers typically load content in order, starting with what’s visible at the top (above the fold) and then proceeding down the page. That’s why optimizing the content at the top helps users interact with your site sooner. Even if the entire page isn’t fully loaded yet, a quick initial render can make it feel fast and keep users engaged.

Key causes that are causing your website to slow down

While you can’t control the quality of your visitors’ internet connection, most slow website issues come from within your own setup. Let’s examine the key areas that may be hindering your site and explore how to address them to boost performance.

Your hosting service

Your hosting plays a big role in your website’s performance because it’s where your site lives. The speed and stability of your host determine how quickly your site responds to visitors. Factors such as server configuration, uptime, and infrastructure all impact this performance.

Choosing a reliable host eliminates one major factor that affects speed optimization. Bluehost, for example, offers robust servers, reliable uptime, and built-in performance tools, making it a go-to hosting choice for anyone serious about speed and stability.

Your website theme

Themes define how your website looks and feels, but they also impact its loading speed. Some themes are designed with clean, lightweight code that’s optimized for performance, while others are heavy with animations and complex design elements. To boost website performance, opt for a theme that prioritizes simplicity, efficiency, and clean coding.

Large file size

From your HTML and CSS files to heavy JavaScript, large file sizes can slow down your website. Modern websites often rely heavily on JavaScript for dynamic effects, but overusing it can cause your pages to load slowly, especially on mobile devices. Reducing file sizes, compressing assets, and minimizing unnecessary scripts can significantly improve the perceived speed of your pages.

Badly written code

Poorly optimized code can cause a range of issues, from JavaScript errors to broken layouts. Messy or redundant code makes it harder for browsers to load your site efficiently. Cleaning up your code and ensuring it’s well-structured helps improve both performance and maintainability.

Images and videos

Unoptimized images and large video files are among the biggest causes of slow websites. Heavy media files increase your page weight, which directly impacts loading times. If your header image or hero banner is too large, it can delay the appearance of the main content. Optimizing your media files through compression, resizing, and Image SEO can dramatically improve your website’s speed.
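
As a small, hedged example of what that compression can look like in practice (assuming the Pillow imaging library and a local source image; the file names and the 1600px width are illustrative), resizing an oversized photo and re-encoding it as WebP can cut its weight dramatically:

    from PIL import Image  # assumes the Pillow package is installed

    # Shrink an oversized hero image and re-encode it as lossy WebP.
    with Image.open("hero-original.jpg") as img:
        img.thumbnail((1600, 1600))                 # resize in place, keeping the aspect ratio
        img.save("hero.webp", "WEBP", quality=80)   # quality 80 is a common size/quality trade-off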

Too many plugins and widgets

Plugins are what make WordPress so flexible, but adding too many can slow down your site. Each plugin adds extra code that your browser needs to process. Unused or outdated plugins can also conflict with your theme or other extensions, further reducing performance. Audit your plugins regularly and only keep the ones that truly add value.

Absence of a CDN

A content delivery network (CDN) helps your website load faster for users worldwide. It stores copies of your site’s static content, such as images and CSS files, across multiple servers located in different regions. This means that users access your site from the nearest available server, reducing loading time. If your audience is global, using a CDN is one of the easiest ways to boost website performance.

Redirects

Redirects are useful for managing URLs and maintaining SEO, but too many can slow down your site. Each redirect adds an extra step before reaching the final page. While a few redirects won’t hurt, long redirect chains can significantly affect performance. Whenever possible, try to link directly to the final URL to maintain consistent page loading speed.

For WordPress users, the redirect manager feature in Yoast SEO Premium makes handling URL changes effortless and performance-friendly. You can pick from redirect types such as 301, 302, 307, 410, and 451 right from the dashboard. Since server-side redirects tend to load faster than PHP-based ones, Yoast lets you choose the type your stack supports, helping you avoid a common cause of slow websites and boost performance.


How to measure page speed and diagnose performance issues

Before you can improve your website performance, you need to know how well (or poorly) your pages are performing. Measuring your page speed helps you identify what’s slowing down your website and provides a direction for optimization.

What is page speed, really?

Page speed refers to how quickly your website’s content loads and becomes usable. But it’s not as simple as saying, ‘My website loads in 4 seconds.’ Think of it as how fast a visitor can start interacting with your site.

A page might appear to load quickly, but still feel slow if buttons, videos, or images take time to respond. That’s why website performance isn’t defined by one single metric — it’s about the overall user experience.

Did you know?

There is a difference between page speed and site speed. Page speed measures how fast a single page loads, while site speed reflects your website’s overall performance. Since every page behaves differently, measuring site speed is a more challenging task. Simply put, if most pages on your website perform well in terms of Core Web Vitals, it is considered fast.

Core metrics that define website performance

Core Web Vitals are Google’s standard for evaluating how real users experience your website. These metrics focus on the three most important aspects of page experience: loading performance, interactivity, and visual stability. Improving them helps both your search visibility and your user satisfaction.

  • Largest Contentful Paint (LCP): Measures how long it takes for the main content on your page to load. Aim for LCP within 2.5 seconds for a smooth loading experience
  • Interaction to Next Paint (INP): Replaces the older First Input Delay metric and measures how quickly your site responds to user interactions like taps, clicks, or key presses. An INP score under 200 milliseconds ensures your site feels responsive and intuitive
  • Cumulative Layout Shift (CLS): Tracks how stable your content remains while loading. Elements shifting on screen can frustrate users, so keep CLS below 0.1 for a stable visual experience
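
One common way to collect these three metrics from real visitors (an illustration, not something this guide mandates) is the open-source web-vitals JavaScript library; the /analytics endpoint below is a placeholder:

  // Minimal sketch: report Core Web Vitals from real visitors using the
  // open-source "web-vitals" package. "/analytics" is a placeholder endpoint.
  import { onCLS, onINP, onLCP, type Metric } from "web-vitals";

  function sendToAnalytics(metric: Metric): void {
    // sendBeacon survives page unloads better than fetch for this use case
    navigator.sendBeacon("/analytics", JSON.stringify({ name: metric.name, value: metric.value }));
  }

  onLCP(sendToAnalytics);
  onINP(sendToAnalytics);
  onCLS(sendToAnalytics);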

How to interpret and improve your scores

Perfection is not the target. Progress and user comfort are what count. If you notice issues in your Core Web Vitals report, here are some practical steps:

  • If your LCP is slow: Compress images, serve modern formats like WebP, use lazy loading, or upgrade hosting to reduce load times
  • If your INP score is high: Reduce heavy JavaScript execution, minimize unused scripts, and avoid main thread blocking
  • If your CLS score is poor: Set defined width and height for images, videos, and ad containers so the layout does not jump around while loading
  • If your TTFB is high: Time to First Byte is not a Core Web Vital, but it still impacts loading speed. Improve server performance, use caching, and consider a CDN
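
On the TTFB point specifically, a quick way to see the figure for your own visit is the browser’s Navigation Timing API. A minimal sketch (drop the type cast if you paste it into the console as plain JavaScript):

  // Minimal sketch: read an approximate Time to First Byte for the current page.
  // responseStart is measured from the start of navigation.
  const [nav] = performance.getEntriesByType("navigation") as PerformanceNavigationTiming[];

  if (nav) {
    console.log(`TTFB: ${Math.round(nav.responseStart)} ms since navigation start`);
  }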

Remember that even small improvements create a noticeable difference. Faster load times, stable layouts, and quicker interactions directly contribute to a smoother experience that users appreciate and search engines reward.

Tools to measure and analyze your website’s performance

Here are some powerful tools that help you measure, analyze, and improve your page loading speed:

Google PageSpeed Insights

Google PageSpeed Insights is a free tool from Google that provides both lab data (simulated results) and field data (real-world user experiences). It evaluates your page’s Core Web Vitals, highlights problem areas, and even offers suggestions under ‘Opportunities’ to improve load times.
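
If you want to pull these numbers programmatically, the tool also exposes a public API. Here is a minimal sketch against the v5 endpoint; the field names are what the v5 response typically contains, so verify them against your own output:

  // Minimal sketch: query the PageSpeed Insights API for a URL's performance score.
  async function pageSpeedScore(url: string): Promise<void> {
    const endpoint = new URL("https://www.googleapis.com/pagespeedonline/v5/runPagespeed");
    endpoint.searchParams.set("url", url);
    endpoint.searchParams.set("strategy", "mobile");

    const res = await fetch(endpoint);
    const data = await res.json();
    const score = data.lighthouseResult?.categories?.performance?.score; // 0..1
    console.log(`${url}: performance score ${score !== undefined ? score * 100 : "n/a"}`);
  }

  pageSpeedScore("https://example.com").catch(console.error);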

Google Search Console (Page Experience Report)

The ‘Page Experience’ section gives you an overview of how your URLs perform for both mobile and desktop users. It groups URLs that fail Core Web Vitals, helping you identify whether you need to improve LCP, INP, or CLS scores.

Lighthouse (in Chrome DevTools)

Lighthouse is a built-in auditing tool in Chrome that measures page speed, accessibility, SEO, and best practices. It’s great for developers who want deeper insights into what’s affecting site performance.
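
For automated checks, Lighthouse can also run from Node. A minimal sketch, assuming the lighthouse and chrome-launcher npm packages are installed and a placeholder URL:

  // Minimal sketch: run a Lighthouse performance audit programmatically.
  import lighthouse from "lighthouse";
  import * as chromeLauncher from "chrome-launcher";

  async function audit(url: string): Promise<void> {
    const chrome = await chromeLauncher.launch({ chromeFlags: ["--headless"] });
    try {
      const result = await lighthouse(url, { port: chrome.port, onlyCategories: ["performance"] });
      const score = result?.lhr.categories.performance.score ?? 0; // 0..1
      console.log(`Lighthouse performance score: ${Math.round(score * 100)}`);
    } finally {
      await chrome.kill();
    }
  }

  audit("https://example.com").catch(console.error);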

WebPageTest

WebPageTest lets you test how your website performs across various networks, locations, and devices. Its ‘waterfall’ view shows exactly when each asset on your site loads, which makes it perfect for spotting slow resources or scripts that delay rendering.

Chrome Developer Tools (Network tab)

If you’re hands-on, Chrome DevTools is your real-time lab. Open your site, press F12, and monitor how each resource loads. It’s perfect for debugging and understanding what’s happening behind the scenes.
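
As a small illustration (not a step this guide prescribes), the Resource Timing API can list the slowest assets on the current page; paste this into the console as plain JavaScript:

  // Minimal sketch: show the ten slowest resources on the current page.
  const slowest = performance
    .getEntriesByType("resource")
    .map((e) => ({ name: e.name, ms: Math.round(e.duration) }))
    .sort((a, b) => b.ms - a.ms)
    .slice(0, 10);

  console.table(slowest);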

A quick checklist for diagnosing performance issues

Use this checklist whenever you’re analyzing your website performance:

  • Run your URL through PageSpeed Insights for Core Web Vitals data
  • Check your Page Experience report in Google Search Console
  • Use Lighthouse for a detailed technical audit
  • Review your WebPageTest waterfall to spot bottlenecks
  • Monitor your server performance (ask your host or use plugins like Query Monitor)
  • Re-test after every major update or plugin installation

Speed up, but with purpose

As Mahatma Gandhi once said, ‘There is more to life than increasing its speed.’ The same goes for your website. While optimizing speed is vital for better engagement, search rankings, and conversions, it is equally important to focus on creating an experience that feels effortless and meaningful to your visitors. A truly high-performing website strikes a balance between speed, usability, accessibility, and user intent.

When your pages load quickly, your content reads clearly, and your navigation feels intuitive, you create more than just a fast site; you create a space where visitors want to stay, explore, and connect.

Secrets Of A Wildly Successful Website

Back in 2005, I intuited that there were wildly successful Internet enterprises that owed nothing to SEO. These successes intrigued me because they happened according to undocumented rules, outside the SEO bubble. These sites hold stories and lessons about building success.

Turning Your Enthusiasm Into Success

In 2005, I interviewed the founder of the Church Of The Flying Spaghetti Monster, which at the time had a massive PageRank score of 7. The founder explained how promotion was never part of a plan; in fact, he denied having any success plan at all. He simply put the visual material out there and let people hotlink the heck out of it, at a rate of 40 GB per day back in 2005.

The site is controversial because it was created in response to an idea called Intelligent Design, an ideology holding that aspects of the universe and life are the products of an unseen intelligent hand rather than of undirected processes like evolution and natural selection. This article is not about religion; it’s about how someone leveraged their passion to create a wildly successful website.

The point is, there was no direct benefit to hotlinking, only the indirect benefits of putting his name out there and having it seen, known, and remembered. That is the essence of what we talk about when we talk about brand and mindshare building, which is why I say this interview is still wildly relevant in 2013. Many of my most innovative methods for obtaining links come from a mindset of identifying latent opportunities tied to indirect benefits. There is a lot of opportunity there because most of the industry is focused on a direct-benefits/ROI mindset. Without further ado, here is the interview. Enjoy!

Secrets Of A Wildly Popular Website

The other day I stumbled across a successful website called Church of the Flying Spaghetti Monster, which does about 40 GB of traffic (including hotlinks) every single day. The site was created as a response to a social, cultural, political, and religious issue of the day.

Many of you are interested in developing strategies for creating massively popular sites, so the following story of this hyper-successful website (PR 7, in case you were wondering) may be of interest.

Creating a website to react to controversy or a current event is an old but perhaps forgotten method for receiving links. Blogs fit into this plan very nicely. The following is the anatomy of a website created purely for the passion of it. It was not created for links or monetary benefit. Nevertheless, it has accomplished what thousands of link-hungry, money-grubbing webmasters aspire to every day. Ha!

So let’s take a peek behind the scenes of a wildly successful site that also makes decent change. The following is an interview with Bobby Henderson, the man behind the site.

Can you give me a little history of the Church of the Flying Spaghetti Monster website?

“The site was never planned. ‘The letter’ had been written and sent off – with no reply – for months before it occurred to me to post it online.”

Have you ever built a website before, what is your web background?

“I made a website for the Roseburg, Oregon school district when I was in high school.

“With the Flying Spaghetti Monster (FSM) site, I want things to be as plain and non-shiny as possible. Screw aesthetics. I don’t want it to look slick and well-designed at all. I prefer it to be just slapped together, with new content added frequently. I love it when people give me tips to make the site better. It’s received well over 100 million hits at this point, so maybe there’s something to this content-instead-of-shiny-ness thing.”

What made you decide to build your website?

“The idea of a Flying Spaghetti Monster was completely random. I wrote the letter at about 3am one night, for no particular reason other than I couldn’t sleep. And there must have been something in the news about ID that day.

“After posting the letter online, it was ‘discovered’ almost immediately. It got boingboing’ed within a couple weeks, and blew up from there. I’ve done zero ‘promotion.’ Promotion is fake. None of the site was planned; it has evolved over the months. Same with the whoring-out, the t-shirts, etc. None of that stuff was my idea. People asked for it, so I put it up. I can remember telling a friend that I would be shocked if one person bought a t-shirt. Now there have been around 20k sold.”

To what do you attribute the support of your site from so many people?

“I believe the support for the FSM project comes from spite…

“I get 100-200 emails a day. Depends on the news, though. I got maybe 300 emails about that ‘pirate’ attack on the cruise ship. Incidentally, the reason we saw no change in global weather was because they were not real pirates. Real pirates don’t have machine guns and speedboats. (Editor’s note: The FSM dogma asserts a connection between pirates and global warming.)”

Were you surprised at how the site took off?

“Yes, of course I’m surprised the site took off. And it blows my mind that it’s still alive. Yesterday was the highest-traffic day yet, with 3.5 million hits (most of those hits were hotlinked images).”

What advice do you have to others who have a site they want to promote?

“Advice... OK, here’s something. A lot of people go out of their way to stop hotlinking. I go out of my way to allow it – going so far as paying for the extra bandwidth to let people steal my stuff. Why? It’s all part of the propaganda machine. It would be easy enough to prevent people from hotlinking FSM images. But I WANT people to see my propaganda, so why not allow it?

“It’s like advertising, requiring zero effort by me. I am paying for about 40 GB in bandwidth every day in just hijacked images – and it’s totally worth it, because now the Flying Spaghetti Monster is everywhere.”

Seeing how your deity is a flying spaghetti monster, I am curious… do you like eating spaghetti?

“No comment.”
