New Ecommerce Tools: December 19, 2024

Each week we handpick new products and services for ecommerce merchants and curate them in a list. This installment includes updates on returns, payment financing, AI-powered assistants, social commerce, shipping, and security tools.

Got an ecommerce product release? Email releases@practicalecommerce.com.

New Tools for Merchants

eBay and Klarna expand BNPL partnership. eBay and Klarna have expanded their collaboration to European markets. eBay is offering Klarna’s buy-now-pay-later payments to shoppers in the U.K., Austria, France, Italy, Spain, and the Netherlands, with more markets coming soon. Payment options include interest-free “pay in 3,” pay in 30 days, or financing for larger purchases. The European expansion follows eBay’s recent announcement of free selling in the U.K.

Web page announcing the eBay and Klarna partnership

Amazon integrates Intuit QuickBooks for sellers. Intuit QuickBooks is now a preferred partner of Amazon for financial management solutions — integrated directly into Amazon Seller Central starting in mid-2025. The expanded Amazon-Intuit partnership will allow sellers to bring their existing Amazon data into Intuit products to understand and optimize profitability, manage cash flow, access capital, and simplify taxes.

Wix releases AI-powered virtual agent for customers. Wix.com, a website builder platform, has launched the AI Site-Chat virtual agent. According to Wix, with AI Site-Chat, businesses can connect with visitors anytime, answer questions, and provide relevant information in real-time. The customizable Site-Chat attempts to recognize the intent behind each user query, allowing it to deliver more precise search results and tailored recommendations.

Shopify releases Winter ’25 Boring Edition. Shopify has released its Winter ’25 Edition, called The Boring Edition, with more than 150 updates. According to Shopify, cart loading times are reduced, and checkout buttons are up to 58% faster. New customization options include checkout blocks, chat integration, and expansion of draft orders. Shopify POS has been updated, and Shopify Flow now automates more tasks, including returns and exchanges, marketing campaigns, and targeting with segment-based triggers.

Web page for Shopify’s Winter ’25 Boring Edition

ShipStation and Adobe Commerce partner on shipping tools for merchants. ShipStation, a shipping software from Auctane, has announced an expanded relationship with Adobe Commerce to provide merchants with shipping options to compare prices, print shipping labels, automate workflows, and more. Adobe Commerce sellers can unlock shipping features through the ShipStation extension, including in-cart delivery options within online storefronts and shipping cost calculators.

TikTok Shop launches in Ireland and Spain. TikTok Shop has launched in Ireland and Spain. Per TikTok, upcoming features include live shopping, shoppable videos in the “for you” feed, product showcases, affiliate programs, shop ads, and secure checkout. The launch in Ireland is in collaboration with Guaranteed Irish, which assists in identifying Irish products and merchants for consumers.

Shipium partners with DoorDash to improve local delivery for merchants. Shipium, an end-to-end shipping platform for ecommerce sellers, has partnered with DoorDash Drive On-Demand, a white-label fulfillment platform. Shipium customers now have immediate access to DoorDash’s services through the pre-integrated carrier network. Customers establish a relationship with DoorDash Drive On-Demand and turn on services. DoorDash Drive On-Demand helps retailers offer and outsource local delivery.

Web page for DoorDash’s Drive On-Demand

Loop acquires Wonderment to expand its returns platform. Loop, a commerce returns platform, has acquired Wonderment, customer experience and order-tracking software for ​​Shopify merchants. Wonderment’s tracking product includes real-time shipment insights, page tracking, proactive alerts, one-click integrations, and more, all from a single interface. Loop says the acquisition brings its merchants AI-powered insights that transform shipping and returns data into actionable intelligence.

Bloomreach Discovery launches self-service features to accelerate integrations. Bloomreach has released features to accelerate integrations with Discovery, its AI search and merchandising platform. Self-service Catalog Creation allows partners to create and manage product catalogs. Catalog Management helps users manage catalogs programmatically using APIs. API Key Management simplifies the management of Discovery modules. Feed and Indexing enables users to upload and index product data. Finally, Web Typescript SDK streamlines the integration of Bloomreach search features for front-end developers.

GroupBy partners with adCaptcha to strengthen ecommerce security. GroupBy, an ecommerce search and product discovery provider, has partnered with adCaptcha, a verification platform, to combine security measures with personalized product discovery. GroupBy and adCaptcha’s partnership empowers merchants to identify and block malicious bot traffic. According to the companies, by reducing bot traffic with adCaptcha and by leveraging GroupBy’s AI-powered ecommerce search and product discovery platform, retailers can focus on serving real customers with intent-driven search results.

Cart.com acquires OceanX and deepens capabilities for health and beauty brands. Cart.com, a unified commerce solutions provider, has acquired OceanX, the fulfillment operations arm of Guthy-Renker, the multi-brand, omnichannel retailer. According to Cart.com, the acquisition strengthens its position in enterprise logistics and expands capabilities to support high-volume beauty, wellness, and lifestyle brands. Cart.com will add two new facilities totaling over 600,000 square feet to its network, including a distribution hub in Southern California and its third facility near Columbus, Ohio.

Home page of Cart.com

Temu’s U.S. Seller Program Is a DTC Opportunity

Direct-to-consumer brands eager to find customers have an opportunity with Temu, China’s rapidly growing discount marketplace.

In February 2024, Temu launched a U.S. Seller Program, effectively opening the platform to American businesses. The program gives U.S. brands access to an estimated 185 million domestic and international shoppers each month — and growing.

In 2023, Temu became Apple’s most downloaded free application and dominated the iOS and Android app stores in 2024.

The Temu U.S. home page focuses on discount items.

U.S. Seller Program

Temu, like its sister site, Pinduoduo, operates primarily on a consignment model.

Chinese and East Asian manufacturers fill Temu’s warehouse with goods and create product listings for the marketplace. When sold, the item is shipped directly from the Temu facility in a familiar, bright orange bag using clever air freight strategies to keep costs low.

The company recently changed tactics, allowing American merchants to list products and optionally employ Temu’s warehouse and fulfillment system.

The program is free for small sellers, but businesses pay a fee ranging from 2% to 5% of the selling price. At the time of writing, the marketplace also charged a payment processing fee of 2.9% + $0.30 per transaction, and sellers paid all shipping costs. Collectively, the fees make Temu similar to other marketplaces.
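As a rough sketch of how those fees stack up on a single sale (using the rates quoted above; actual tiers, rates, and terms may differ and should be checked against Temu's current seller agreement):

```python
def temu_net_revenue(price, commission_rate=0.02, shipping_cost=0.0):
    """Estimate a seller's net revenue on a single Temu sale.

    Uses the fee structure quoted above: a 2%-5% fee on the selling
    price plus a 2.9% + $0.30 payment processing fee; the seller also
    pays shipping. These figures are illustrative, not authoritative.
    """
    commission = price * commission_rate
    processing = price * 0.029 + 0.30
    return price - commission - processing - shipping_cost

# Example: a $20 item at the 5% tier with $4 shipping
print(round(temu_net_revenue(20.00, commission_rate=0.05, shipping_cost=4.00), 2))
```

On a $20 item, roughly $5.88 of the sale goes to fees and shipping in this scenario, which is why testing margins per product before scaling is worthwhile.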

DTC Opportunity

Temu’s low prices may not fit traditional retailers, but DTC brands could have an opportunity.

A DTC product is unique. Similar products may exist in Temu, but none are identical. Plus, American-made products might have a competitive advantage owing to perceived value and quality.

All told, I see five potential benefits for DTC brands selling on Temu.

Brand building

DTC brands on Temu can introduce shoppers to the company and build relationships.

The introduction happens when a Temu buyer finds the brand’s products. The relationship starts with order fulfillment. DTC sellers could include in the packaging a physical product catalog, a coupon for a free item, or a note describing the brand’s story.

Items requiring a warranty registration offer the opportunity to collect the buyer’s email address and phone.

Revenue

Any established sales channel is a revenue opportunity. Temu has a massive user base, and those shoppers, discount-oriented as they may be, are the opportunity.

Temu’s media agency told me the company does not share estimated or average seller revenue. DTC shops should test, optimize, and iterate on the platform.

Marketing

DTC brands listing products on Temu can participate in platform-wide promotions and flash sales, driving traffic to listings and generating more interactions.

Chinese expansion

DTC brands can flip the script and offer products to Chinese buyers via inventory stored in Hong Kong or other Temu locations.

Product development

The absence of a Temu listing fee facilitates the testing of new items. DTC brands can create short runs of prototype products, offer them on Temu, and learn what appeals to shoppers.

Marketplaces Generally

Selling on Temu should be part of a general marketplace strategy for DTC brands.

A brand selling on one marketplace can consider others. Listing on Amazon, Temu, Walmart, Etsy, and eBay could all be part of an overall marketplace approach, such as:

  • Set marketplace-specific objectives. Define revenue targets, customer acquisition rates, or brand awareness metrics for each marketplace.
  • Establish marketplace audiences. Use analytics to learn customer demographics and purchasing behavior on a per-marketplace basis. A Temu shopper will likely differ from one on Amazon or Walmart.
  • Align products with the marketplace. A brand might have several versions of similar items. Perhaps the top quality goods are on Amazon and entry-level items on Temu. Returns and seconds could sell on eBay.
  • Optimize product listings. Common optimization tactics across all marketplaces include quality images, keyword-rich descriptions, and competitive prices. But keep in mind platform-specific practices, conventions, and rules.
Google Launches (Final?) Spam Update Of The Year via @sejournal, @MattGSouthern

Google announced the rollout of the December 2024 spam update.

The update, expected to be completed within a week, arrives amid ongoing industry discussions about the effectiveness of Google’s spam-fighting measures.

This December update caps off a year of spam-fighting measures, including the June Spam Update and the March Core Update, which targeted policy-violating websites and aimed to reduce “unhelpful” content by 40%.

It’s also worth mentioning that this update closely follows the December core update.

Looking Back At A Year Of Updates

This year saw an unprecedented frequency of major algorithm updates, with core updates in March, August, November, and December.

The August update, which took nearly three weeks to complete, targeted low-value SEO content while promoting high-quality material.

The December core update, launched on December 12, came unusually close to the November update, with Google explaining that different systems are often improved in parallel.

Policy Transformation

This year marked a shift in Google’s approach to spam detection and prevention with three major policy updates.

1. Site Reputation Abuse

Introduced in May 2024, this policy targets “parasite SEO” practices, where third-party content exploits an established domain’s authority.

This update mainly affected:

  • Major publishers hosting third-party product reviews
  • News sites with extensive coupon sections
  • Sports websites with AI-generated content

The policy change led to notable casualties, including several high-profile publishers receiving manual actions for hosting third-party content without sufficient oversight.

2. Expired Domain Abuse

Google’s enhanced focus on expired domain manipulation addressed:

  • Purchase of expired domains for backlink exploitation
  • Repurposing authoritative domains for unrelated content
  • Domain squatting for search ranking manipulation

3. Scaled Content Abuse

Previously known as “spammy auto-generated content,” this rebranded policy expanded to include:

  • AI-generated content at scale
  • Mass-produced content across multiple sites
  • Content translation manipulation
  • Automated content transformation techniques

See more: An In-Depth Look At Google Spam Policies Updates And What Changed

Spam-Specific Updates

June 2024 Spam Update

  • Week-long implementation period
  • Focused on policy-violating websites
  • Enhanced detection of automated content

November 2024 SRA Enforcement

  • Implementation of site reputation abuse penalties
  • Affected major publishers’ sponsored content strategies
  • Required significant content policy adjustments across news sites

Looking Ahead

With the December core update having completed its rollout and the new spam update now underway, prepare for another round of potential ranking fluctuations through the end of the year.

The spam update is expected to be completed next week, with progress tracked through Google’s Search Status Dashboard.


Featured Image: JHVEPhoto/Shutterstock

AI Crawlers Account For 28% Of Googlebot’s Traffic, Study Finds via @sejournal, @MattGSouthern

A report released by Vercel highlights the growing impact of AI bots in web crawling.

OpenAI’s GPTBot and Anthropic’s Claude generate nearly 1 billion requests monthly across Vercel’s network.

The data indicates that GPTBot made 569 million requests in the past month, while Claude accounted for 370 million.

Additionally, PerplexityBot contributed 24.4 million fetches, and AppleBot added 314 million requests.

Together, these AI crawlers represent approximately 28% of Googlebot’s total volume, which stands at 4.5 billion fetches.
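The 28% figure follows directly from the fetch counts above:

```python
# Monthly fetch counts (in millions), as reported by Vercel
gptbot, claude, perplexity, applebot = 569, 370, 24.4, 314
googlebot = 4500  # 4.5 billion fetches

ai_total = gptbot + claude + perplexity + applebot
share = ai_total / googlebot
print(f"{ai_total} million AI-crawler fetches, {share:.0%} of Googlebot's volume")
```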

Here’s what this could mean for SEO.

Key Findings On AI Crawlers

The analysis looked at traffic patterns on Vercel’s network and various web architectures. It found some key features of AI crawlers:

  • Major AI crawlers do not render JavaScript, though they do pull JavaScript files.
  • AI crawlers are often inefficient, with ChatGPT and Claude spending over 34% of their requests on 404 pages.
  • The type of content these crawlers focus on varies. ChatGPT prioritizes HTML (57.7%), while Claude focuses more on images (35.17%).

Geographic Distribution

Unlike traditional search engines that operate from multiple regions, AI crawlers currently maintain a concentrated U.S. presence:

  • ChatGPT operates from Des Moines (Iowa) and Phoenix (Arizona)
  • Claude operates from Columbus (Ohio)

Web Almanac Correlation

These findings align with data shared in the Web Almanac’s SEO chapter, which also notes the growing presence of AI crawlers.

According to the report, websites now use robots.txt files to set rules for AI bots, telling them what they can or cannot crawl.

GPTBot is the most mentioned bot, appearing on 2.7% of mobile sites studied. The Common Crawl bot, often used to collect training data for language models, is also frequently noted.
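As a sketch of how these robots.txt rules behave, Python's standard `urllib.robotparser` can evaluate a policy that blocks GPTBot while allowing other bots (the directives below are illustrative, not taken from any real site):

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt: block GPTBot sitewide, allow everyone else
robots_txt = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("GPTBot", "https://example.com/article"))     # False
print(rp.can_fetch("Googlebot", "https://example.com/article"))  # True
```

Note that robots.txt is advisory: well-behaved crawlers like GPTBot honor it, but it is not an enforcement mechanism.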

Both reports stress that website owners need to adjust to how AI crawlers behave.

3 Ways To Optimize For AI Crawlers

Based on recent data from Vercel and the Web Almanac, here are three ways to optimize for AI crawlers.

1. Server-Side Rendering

AI crawlers don’t execute JavaScript. This means any content that relies on client-side rendering might be invisible.

Recommended actions:

  • Implement server-side rendering for critical content
  • Ensure main content, meta information, and navigation structures are present in the initial HTML
  • Use static site generation or incremental static regeneration where possible
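A minimal, framework-agnostic sketch of the idea: assemble the complete HTML on the server, so the title, meta description, navigation, and body content are all visible to a crawler that never executes JavaScript (the page data here is hypothetical):

```python
# Hypothetical page data; in practice this would come from a CMS or database
ARTICLE = {
    "title": "Holiday Shipping Deadlines",
    "body": "Order by December 20 to receive items before the holiday.",
}
NAV = ["/", "/blog", "/contact"]

def render_page(article, nav):
    """Server-side render: critical content lands in the initial HTML."""
    links = "".join(f'<a href="{href}">{href}</a>' for href in nav)
    return (
        "<!doctype html><html><head>"
        f"<title>{article['title']}</title>"
        f'<meta name="description" content="{article["body"][:80]}">'
        "</head><body>"
        f"<nav>{links}</nav>"
        f"<main><h1>{article['title']}</h1><p>{article['body']}</p></main>"
        "</body></html>"
    )

html = render_page(ARTICLE, NAV)
print("Holiday Shipping Deadlines" in html)  # content present without any JS
```

The same principle applies whether the renderer is Next.js, Nuxt, or a static site generator: what matters is that the first response already contains the content.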

2. Content Structure & Delivery

Vercel’s data shows distinct content type preferences among AI crawlers:

ChatGPT:

  • Prioritizes HTML content (57.70%)
  • Spends 11.50% of fetches on JavaScript files

Claude:

  • Focuses heavily on images (35.17%)
  • Dedicates 23.84% of fetches to JavaScript files

Optimization recommendations:

  • Structure HTML content clearly and semantically
  • Optimize image delivery and metadata
  • Include descriptive alt text for images
  • Implement proper header hierarchy

3. Technical Considerations

High 404 rates from AI crawlers mean you need to keep these technical considerations top of mind:

  • Maintain updated sitemaps
  • Implement proper redirect chains
  • Use consistent URL patterns
  • Regular audit of 404 errors
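One way to run that 404 audit is to scan server access logs for AI-crawler requests that hit missing pages. A minimal sketch with simplified, made-up log lines and user-agent strings (real log formats and crawler UAs vary):

```python
import re
from collections import Counter

# Made-up, simplified access-log lines: request, status code, user agent
LOG = [
    '"GET /old-page HTTP/1.1" 404 "GPTBot/1.0"',
    '"GET /blog/post HTTP/1.1" 200 "GPTBot/1.0"',
    '"GET /promo-2022 HTTP/1.1" 404 "ClaudeBot/1.0"',
    '"GET /products HTTP/1.1" 200 "Googlebot/2.1"',
]

not_found = Counter()
for line in LOG:
    status = re.search(r'HTTP/1\.1" (\d{3})', line).group(1)
    bot = line.rsplit('"', 2)[-2]  # last quoted field is the user agent
    if status == "404":
        not_found[bot] += 1

print(not_found.most_common())  # which crawlers waste fetches on dead URLs
```

URLs that show up repeatedly here are candidates for redirects or sitemap cleanup.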

Looking Ahead

For search marketers, the message is clear: AI chatbots are a new force in web crawling, and sites need to adapt their SEO accordingly.

Although AI bots may rely on cached or dated information now, their capacity to parse fresh content from across the web will grow.

You can help ensure your content is crawled and indexed with server-side rendering, clean URL structures, and updated sitemaps.


Featured Image: tete_escape/Shutterstock

Ask An SEO: How To Move From Page 2 To Top Positions via @sejournal, @rollerblader

Today’s Ask an SEO question comes from Roy in Dinajpur:

“My website URL [is] still [in] position No. 15. How can increase to No. 3 or 4?”

Great question, and likely one of the top five that get asked. The answer is situational, and it is easier to resolve when you don’t overthink it.

The first thing to do is to look at the current pages in the top 10 positions and create a list by page of:

  • What they have in common.
  • Talking points and topics they cover.
  • How many internal links point to these pages.
  • The number of quality and spammy backlinks each page has.
  • On-page factors like HTML structure, schema, and the quality of the content.
  • Content formatting, and whether they present the content in the easiest formats to understand and use.

I like to do this in spreadsheets because it lets me either assign values from one to 10 and add them up, or see what is missing and what is included across the sites more easily.

By assigning a number to each page for each aspect I’m looking for, I can total the columns and rows to see how common that aspect is: the higher the total, the more widespread it is.

If you only use a one (1), meaning the aspect exists on the page, then the higher the column total, the more pages have it. When rating the quality of content, UX, formatting, sourcing, etc., I assign a score from one to 10.

Once added up across or down, I can see which pages are the best and look at why. From there, I can begin working on my variation and create an even better experience.
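The grid can be sketched quickly in code as well as in a spreadsheet (the pages, features, and scores below are hypothetical):

```python
# Rows are competing pages, columns are features: 1 = present,
# or a 1-10 quality rating for subjective aspects. All data made up.
features = ["comparison table", "video", "FAQ schema", "content quality"]
pages = {
    "competitor-a.com": [1, 1, 0, 8],
    "competitor-b.com": [1, 0, 1, 6],
    "competitor-c.com": [0, 1, 1, 9],
}

# Column totals: how common/strong each feature is across top results
col_totals = [sum(scores[i] for scores in pages.values()) for i in range(len(features))]
# Row totals: which page is strongest overall
row_totals = {page: sum(scores) for page, scores in pages.items()}

print(dict(zip(features, col_totals)))
print(max(row_totals, key=row_totals.get))  # best-scoring page to study
```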

Pro-tip: Better experiences may sometimes mean less content, removing specific sections as they may not be topically relevant, or adding in things I didn’t think of but make sense.

But don’t rely on this alone. Go deeper into the features on the pages and within the websites ranking above you, and then look at your own page.

Start To Review Your Own Content Or Page

Now, ask yourself:

  • Do I have the same content or not?
  • Is my content or page sharing something unique or more useful than these?
  • They all have X content, but is it topically relevant to the query I want my page to show up for?
    • If not, delete it so my page is more on-topic.
    • If yes, add it.
  • What could be better explained, or could clearer examples be used that are missing from theirs?
  • Can I easily absorb the text, or would bullets, tables, videos, sound clips, images, and infographics make it better?

These are ways you can begin to create more helpful content on your page. Then, look at some of the other factors that can help. Internal links can be a good place to start.

Where on my website do I reference this topic, product, or service, and will linking to my page help the website visitor?

If these same pages have traffic and backlinks and get social shares, add the internal link. Just make sure it benefits the end user and is not just there for SEO.

Now, look to see if you have conflicting internal links (links to different pages using the same keywords and the same intent).

In some cases, backlinks could be a factor, especially with “Your Money or Your Life” (YMYL) and medical queries. What does your page have that the others do not, and how is it more trustworthy than theirs?

You can use this to ask the websites linking to them to include you or replace their links with your resource instead.

Another option is to begin building quality links to your resource, but avoid spammy tactics like mass emailing, guest posting, scholarships, grants, forum and blog comments, PBNs, and link exchanges.

Technical audit and on-page SEO can help you as well. Schema does not help with rankings, but it does help with rich results and lets search engines know what your page is about. Make sure yours is not deprecated and is up to date.

Check your header tags, titles, descriptions, and wording. When doing that, also ensure that your content is around the same reading level and language style as the audience you want to reach.

Look At The Overall Site

Another thing is to consider the site overall.

Having one or two quality pages is good, but what about other topics that work for the same audience and would be interesting for them to read once they finish the page they’re on? This applies to ecommerce, publishers, and everything in between.

Are you using AI and LLMs to create content? You should probably delete that content immediately if you didn’t go in and edit it to have information only a human with experience would know.

If you’re using LLMs to create content, you’re recycling the knowledge already out there versus adding something new. It is the same as scraping four or five sites and using an article spinner to produce the output.

Is there thin content that is also in the category or being recommended? Delete that, too. Same with recommended articles from third parties and ad networks.

Having a couple of good-quality pages is great, but if the person clicks on the next article and it is thin, outdated, or inaccurate, you’re providing a bad experience, and some algorithms may use sitewide classifiers.

Those thin and spammy pages that do not educate and provide solutions impact the high-quality pages.

If all else is equal between you and another site, these low-quality pages could be the deciding factor in whether your high-quality page makes it to page one or stays on page two.

The same goes for page and site speed. Yes, they matter, but not that much unless you’re a publisher.

Do Everything Right And You Should Get There

Sometimes, you can do everything right and have the best experience, but Google, Bing, Yahoo, Baidu, or Naver doesn’t bring you up to page one or top positions. Then you magically jump there, as do other pages during a core update.

There’s no one-size-fits-all solution for moving to the top five positions from page two, but by doing everything right, you should eventually make it there.

Fix the issues above and then keep working on it. Eventually, it pays off, and you’ll likely see your site and pages start hitting page one and going to top positions when you’ve fixed enough.

If you’re on page two, that means your page and your site have some quality that is trustworthy.

Now, it’s a matter of fine-tuning that experience so that it can become a page one result. The tips above should help you diagnose what could be better; once that’s done, it’s a waiting game if your experience is already there. I hope this helps.


Featured Image: Paulo Bobita/Search Engine Journal

8 Metrics To Measure The Effectiveness Of Your Internal Linking Strategy via @sejournal, @xandervalencia

You might’ve thought we’ve covered everything there is to know about internal linking.

But few dare to dig into the tricky details of tracking the success of an internal linking hierarchy. That’s because it’s messy, it’s difficult, and it’s not always straightforward – but it’s worth it.

In this guide, we’re covering the metrics that matter most when it comes to internal linking, how to track them, and what they mean in terms of the collective benefit to your website’s SEO strategy.

Is Internal Linking “Measurable”?

Yes, it is measurable, but it’s not always simple.

With something as indeterminate as “internal linking,” it’s easy to assume that the results are more subjective than objective.

For instance, it is difficult to tag individual internal links to assess how a user navigates your site — let alone determine if that results in a goal completion or conversion.

So, measuring the success of your internal linking strategy requires some creative thinking.

While the metrics may not be direct, in context, they can paint a picture of whether the internal links are benefiting your website’s SEO. You just need to know where to look!

Why Measure Your Internal Linking Results?

Internal linking is one of those SEO activities most often treated as a “best practice,” less often venturing into the realm of technical assessment and in-depth strategy.

Sure, there’s an understanding that one should link to the most important pages of their website, but how far do we go beyond that?

Glad you asked because there are a few ways to nerd out about internal linking. If you’re an SEO savant, I’m sure you will appreciate this.

  • User Navigation and Intent: Auditing your internal linking strategy via Google Analytics will reveal surprising insights about how users navigate your site. It will allow you to infer what users intend to find when perusing your site content (i.e., where are they going next?).
  • Page Authority: You’ll likely notice that some pages get more traffic than others. This may be a result of higher search volume keywords, volume and quality of backlinks, page authority, and a range of other factors. Internal links allow you to direct some of this authority to lower-performing pages.
  • Information Architecture: Internal linking is an essential part of facilitating an intuitive and easy user experience. By directing users to relevant pages and posts, you remove friction from their navigational process, lifting barriers to purchase.
  • Content Gaps: Through auditing, you will likely find gaps in your content. Have you thoroughly exhausted the topic “pillar” on your website, or are there more items to cover? Where would a user likely want to venture next? How can you take them there?

In essence, there are several benefits to auditing, analyzing, and updating your internal linking strategy.

If you’re ready to go beyond “best practices” and dig into the data, you’ve come to the right place.

Internal Linking: How To Measure Success

As we all know, in SEO, some things are subjective, and others are objective. An internal linking strategy involves a bit of both.

The metrics used to assess internal linking success are mostly objective, while observations and applications can be wholly subjective.

Feel free to interpret the data as you see fit for your own SEO strategy purposes, and know that you’re not limited to these metrics when it comes to analyzing your internal links.

1. Crawl Depth

One of my favorite metrics for analyzing internal links is crawl depth. This metric, reported by Google Search Console’s Crawl Stats report, measures how many pages search engine bots can access and index within a single crawl.

Before implementing internal link updates, I take a baseline of the site’s current crawl depth.

As internal links are added/updated, I most often see an increase in the number of pages found and indexed (assuming there was a discrepancy at the beginning).

An optimized internal linking structure can help search engines crawl deeper into the site, ensuring more pages are indexed and capable of being ranked by Google.
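A related check you can run yourself: a breadth-first walk over a site's internal link graph (a hypothetical one here) reports each page's click depth from the homepage and flags pages that internal links never reach:

```python
from collections import deque

# Hypothetical internal link graph: page -> pages it links to
links = {
    "/": ["/blog", "/products"],
    "/blog": ["/blog/post-1"],
    "/products": ["/products/widget"],
    "/blog/post-1": [],
    "/products/widget": [],
    "/orphan-page": [],  # exists, but nothing links to it
}

# Breadth-first search from the homepage, recording click depth
depth = {"/": 0}
queue = deque(["/"])
while queue:
    page = queue.popleft()
    for target in links.get(page, []):
        if target not in depth:
            depth[target] = depth[page] + 1
            queue.append(target)

orphans = set(links) - set(depth)
print(depth)    # clicks from the homepage to each reachable page
print(orphans)  # pages crawlers can't reach via internal links
```

Shallower depths and an empty orphan set are what a healthy internal linking structure should produce.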

2. Bounce Rate

There are pros and cons to using bounce rate to measure SEO success. The metric alone can miss a lot of context.

For example, in cases of law firm SEO, a higher bounce rate might not be concerning if the end goal is a phone call rather than a user continuously navigating the site. There are many nuances to measuring and assessing the importance of bounce rate.

But when it comes to internal linking, assessing bounce rate can be informative.

Bounce rate (reported by Google Analytics) measures the percentage of website visitors who land on a website and then leave without taking any action. “Action” here could mean clicking on another page, completing a form, making a purchase, etc.

Internal links can increase the likelihood that a user will venture to another page on your website.

Again, compare the results before and after implementing your internal link improvements. A lower bounce rate may indicate that users are finding more relevant content, and are staying on your site for longer.
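As a sketch of the calculation itself, with hypothetical session data (note that GA4 counts bounces as non-engaged sessions, a slightly different definition than Universal Analytics used):

```python
# Hypothetical sessions: a bounce here is one pageview and no other action
sessions = [
    {"pageviews": 1, "events": 0},
    {"pageviews": 3, "events": 1},
    {"pageviews": 1, "events": 0},
    {"pageviews": 2, "events": 0},
]

bounces = sum(1 for s in sessions if s["pageviews"] == 1 and s["events"] == 0)
bounce_rate = bounces / len(sessions)
print(f"{bounce_rate:.0%}")
```

Comparing this number for a set of landing pages before and after an internal linking pass is the before/after test described above.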

3. Behavior Flow

Universal Analytics’ “Behavior Flow” report was deprecated with the upgrade to GA4, but there are other ways to view a user’s navigational path through your website.

With the new “path exploration report,” you can analyze a user’s journey through your site, including the pages they land on and the actions they take.

Though not exactly a “metric,” this report does reveal data about which pages users are visiting and where they navigate to next. It also reveals where they drop off.

This is critical information when it comes to internal linking, as you can add links to pages to reduce drop-off, add visual aids to direct users to important pages, and change the placement of your links to improve click-throughs.

4. Pages Per Session

Another Google Analytics metric, Pages Per Session measures the average number of pages a visitor views during a session.

For example, if a visitor only visits two pages and then leaves, that’s not ideal. But if they visit more than two pages, indicating an intent to find information and, potentially, make a purchase, things are looking up!

This can be a helpful metric because it (in part) indicates whether your internal links are well-placed and are making it easy for visitors to navigate to additional pages.

Effective internal linking encourages users to explore more content, increasing page views per session, and signaling good user engagement.

Note that, like bounce rate, there are many nuances to assessing the importance of pages per session as an indicator of SEO performance.

For example, a business would likely prefer that a user calls them right away rather than venturing to several pages of their website. Immediate action is ideal!

5. Time On Page

While pages per session measures the number of pages a user visits within a session, time on page measures the amount of time a user spends on a single webpage before navigating to another page.

In the context of internal linking, higher time on page may indicate that your links are effective in guiding users to content that holds their attention.

Also, while not a direct ranking factor, time on page can contribute to search engines’ understanding of your site’s quality.

Pages that keep users engaged signal a positive user experience, which search engines may consider when determining your rankings.

In that way, higher time on page as a result of internal linking improvements may indicate the success of your strategy.

6. Page Authority

Page Authority is a score developed by Moz to assess how well a particular page will rank in the SERPs based on a variety of factors. Scores range from 1 to 100, with a higher score indicating a higher expected ranking.

I like to look at Page Authority when it comes to internal linking because internal links can “send” authority to the pages they link to.

Basically, when you link from high-authority pages to other pages on your site, it helps distribute “link equity” across your site. This practice can raise the authority of less visible or lower-ranking pages.

You may notice that the Page Authority of a destination page increases after you link to it from a high-authority page. Measuring this, across multiple pages, can be a strong indicator of internal linking effectiveness.

7. Conversion Rate

You can use Google Tag Manager (GTM) to track conversions from users who click on internal links.

Internal links can guide users down the sales funnel as they navigate from one page to another and, ultimately, make a purchase, submit a form, etc.

Tracking whether linked pages lead to conversions (e.g., purchases or sign-ups) is crucial for assessing the effectiveness of your internal linking strategy.

Here’s how to track internal link conversions with GTM:

  1. Log into Google Analytics.
  2. Create a conversion event representing the action you want to track (e.g., form submissions, purchases, sign-ups).
  3. Take note of the event name and/or parameters (you’ll use them later).
  4. Log into Google Tag Manager and click “Triggers” in the sidebar. Select “New” to create a new trigger.
  5. Name the trigger (e.g., “Internal Link Click”).
  6. Choose “Click – Just Links” as the trigger type.
  7. In the Trigger Configuration section, set the following:
    • This trigger fires on: “Some Link Clicks”
  8. In the next section, create a condition to target only internal links. Set the condition to:
    • Click URL → Matches RegEx → ^https?://(www\.)?yoursite\.com (escape the dots so they match literal periods)
  9. Replace yoursite.com with your actual domain.
  10. Save the trigger.
  11. Next, go to the Tags section in GTM and click “New.” Name the tag (e.g., “Internal Link Click Event”).
  12. Choose Tag Type as “Google Analytics: GA4 Event.”
  13. Under Tag Configuration, fill in the following:
    • Configuration Tag: Select your GA4 configuration tag.
    • Event Name: Name the event (e.g., “internal_link_click”).
    • Event Parameters: Add additional parameters for deeper insights. Example: Parameter Name: “link_url”, Value: {{Click URL}}
  14. In the Triggering section, select the “Internal Link Click” trigger you created earlier.
  15. Save the tag.
  16. Back in GA4, click on Admin.
  17. Under the Property column, click on Events.
  18. You will see a list of events that GA4 has already tracked (including any custom events like “internal_link_click” if you’ve set up your GTM tag correctly).
  19. Find the event you want to track as a conversion (e.g., “internal_link_click”). If it is not listed, the event hasn’t been triggered yet; wait until it fires or create the event manually.
  20. Once the event appears in the list, toggle the “Mark as conversion” switch next to the event. This will now track the event as a conversion in GA4.
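The “Matches RegEx” condition from step 8 can be sanity-checked before you publish the container. Here is a minimal Python sketch, using the hypothetical placeholder domain example.com, of how that pattern separates internal clicks from external ones. It is an offline approximation of GTM’s regex matching, not GTM itself:

```python
import re

# The same pattern used in the GTM trigger condition (step 8),
# with the dots escaped so "." matches a literal period.
# Replace example.com with your actual domain.
INTERNAL_LINK_PATTERN = re.compile(r"^https?://(www\.)?example\.com")

def would_trigger_fire(click_url: str) -> bool:
    """Return True if the trigger's RegEx condition matches the click URL."""
    return INTERNAL_LINK_PATTERN.match(click_url) is not None

# Internal links match; external links do not.
print(would_trigger_fire("https://example.com/blog/post"))   # True
print(would_trigger_fire("http://www.example.com/pricing"))  # True
print(would_trigger_fire("https://partner-site.com/page"))   # False
```

Running a few of your own URLs through a check like this is a quick way to catch escaping mistakes before the trigger silently fails to fire.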

8. Organic Traffic

One of the clearest signs of SEO success is increased traffic. However, it can be challenging to directly link traffic growth to changes in your internal linking strategy.

But you can compare traffic stats before and after internal link updates, all else being held equal.

Be sure to track the organic traffic to your website over time using tools like Google Analytics or Semrush.

The addition of internal links can direct more traffic flow to other pages on your site, improve the rate at which pages are indexed, and distribute page authority, which can boost your overall organic traffic.

Improve Your Internal Linking Strategy With These Tips

Internal linking is an important yet oft-overlooked strategy in SEO. It’s so simple that it’s easy to forget how impactful it can be.

With the help of the metrics above and some creative thinking, you can drive better organic results for your site and your clients.

  • Audit Often: Analyze your website performance every quarter (if not more) to assess your internal pages and determine whether any content gaps exist on your site. Audit your website for broken and/or redirected links, fixing these as needed to improve user experience and the crawlability of your website.
  • Add Links Regularly: Any time you add new content, look for opportunities to link to existing pages or articles. Aim for at least three internal links on each page.
  • Examine Your Traffic: Identify high-traffic, high-authority pages and add internal links from these to your lower-performing pages. Compare traffic before and after these changes.
  • Play With Placement: Experiment with the placement and prominence of your internal links. Use different visual components, weight, and colors to make internal links more obvious and enticing.

With this guide, you can get a clear picture of how well your internal linking strategy is performing and make adjustments to improve your SEO results.

Want more user engagement and action on your website? Internal linking is one way to do that!

Featured Image: denayunebgt/Shutterstock

A brief history of Google’s algorithm updates

SEO has changed significantly over the last decade, largely because Google has continuously updated its algorithms to improve search results. These updates aim to better understand user intent, reward high-quality content, and discourage manipulative practices. From foundational changes like Panda and Penguin to more recent updates like the November and December 2024 core updates, each has shaped how websites rank and how we approach optimization. Below is a look at some of Google’s most impactful updates and what they mean for SEO today.

2011 – Panda

The Panda update marked a shift in SEO by targeting low-quality content and spammy practices. It penalized sites with thin content or those created solely to manipulate rankings, such as affiliate-heavy pages. Over time, Panda became part of Google’s core algorithm, reinforcing the need for meaningful, high-quality content that provides real value to users.

2012 – Venice

Google’s algorithm update Venice was a noteworthy update, as it showed that Google understood that searchers are sometimes looking for results that are local to them. After Venice, Google’s search results included pages based on the location you set, or your IP address.

2012 – Penguin

The Google Penguin update focused on eliminating manipulative link-building practices. It penalized sites with spammy or paid backlinks, shifting the focus to earning genuine, high-quality links. By 2016, Penguin became part of the core algorithm, emphasizing the importance of ethical and relevant link-building strategies.

2012 – Pirate

The Pirate update addressed copyright infringement by penalizing sites with repeated DMCA takedown requests. It aimed to reduce the visibility of websites sharing unauthorized content, ensuring legitimate sources were prioritized in search results. This update highlighted the importance of respecting intellectual property online.

2013 – Hummingbird

The Hummingbird update improved Google’s ability to understand the meaning behind search queries. Rather than focusing on individual keywords, it considered the entire phrase to deliver more accurate results. This shift encouraged natural, conversational content and reduced the need for over-optimized keyword stuffing. It also laid the foundation for advancements in voice search and semantic search technology.

2014 – Pigeon

Another bird-related Google update followed in 2014 with Google Pigeon, which focused on local SEO. The Pigeon update affected both the results pages and Google Maps. It led to more accurate localization, giving preference to results near the user’s location. It also aimed to make local results more relevant and higher quality, taking organic ranking factors into account. 

2014 – HTTPS/SSL

Google introduced HTTPS as a ranking signal to encourage secure web connections. Sites using HTTPS gained a slight ranking advantage, promoting better data encryption and security for users. While initially a minor factor, it signaled Google’s growing focus on user safety and set the stage for security becoming a standard expectation online.

2015 – Mobile Update

Known as “Mobilegeddon,” this update prioritized mobile-friendly websites in mobile search results. As mobile usage surpassed desktop, Google aimed to ensure a better experience for users on smaller screens. While the immediate impact wasn’t drastic, it marked a clear shift toward mobile-first indexing, emphasizing the importance of mobile optimization for long-term SEO success.

2015 – RankBrain

RankBrain introduced machine learning to Google’s algorithm, helping the search engine interpret unfamiliar or complex queries. It analyzed past searches to predict the most relevant results, even for terms it hadn’t encountered before. While you can’t directly optimize for RankBrain, creating clear, helpful, and user-focused content ensures your site aligns with its goal of improving search relevance.

2016 – Possum 

In September 2016, it was time for another local update. Google’s Possum update applied several changes to Google’s local ranking filter to further improve local search. After Possum, local results became more varied, depending more on the physical location of the searcher and the phrasing of the query. Some businesses that weren’t doing well in organic search found it easier to rank locally after this update, indicating that Possum made local search more independent of the organic results.

Read more: Near me searches: Is that a Possum near me? »

2018 – (Mobile) Speed Update

The Speed Update made page load time a ranking factor for mobile searches, building on its previous importance for desktop. Slow-loading sites were more likely to see a drop in rankings, especially on mobile devices. This update reinforced the need for fast, seamless user experiences, encouraging site owners to prioritize performance optimization.

2018 – Medic

The Medic Update was a broad core algorithm change that heavily impacted “Your Money or Your Life” (YMYL) websites, such as health, finance, and legal sites. It appeared to prioritize expertise, authoritativeness, and trustworthiness (E-A-T) in content, especially for topics affecting users’ well-being. While it wasn’t exclusively aimed at medical sites, it underscored the importance of credible, accurate, and user-focused information.

Keep reading: Google’s Medic update »

2019 – BERT

The BERT update (Bidirectional Encoder Representations from Transformers) enhanced Google’s ability to understand the context of words in a search query. By analyzing words in relation to the ones around them, BERT improved how Google interpreted natural language and intent. This update particularly helped with more conversational or complex queries, making search results more accurate and relevant. For content creators, it emphasized the value of clear, natural writing that directly addresses user needs.

Read on: Google BERT: A better understanding of complex queries »

2021 – Page Experience Update

The Page Experience update introduced a new ranking signal combining existing factors like mobile-friendliness and HTTPS with Core Web Vitals. These metrics measured real-world user experience, focusing on loading speed, interactivity, and visual stability. While content quality remained the top priority, this update emphasized the importance of delivering a smooth and user-friendly browsing experience.

Keep on reading: Page experience: a new Google ranking factor »

2021 – MUM (Multitask Unified Model)

Announced in 2021, MUM introduced a powerful AI system capable of processing information across multiple formats and languages. It can analyze text, images, and videos to deliver more comprehensive answers to complex queries. For example, MUM can combine insights from various sources to provide layered, context-rich results. This update signaled Google’s focus on deeper understanding and more diverse content delivery in search.

Read more: Google’s MUM understands what you need: 1000x more powerful than BERT »

2021 – Product Reviews Update

First run in April 2021, these updates prioritized detailed, insightful product reviews over thin or generic content. Google rewarded reviews that showed expertise, included real-world usage, and helped users make informed decisions. It’s a key update for affiliate and e-commerce sites focused on providing genuine value. The update ran multiple times over the years.

2022 – Helpful Content Update

The Helpful Content Update targeted low-quality, unoriginal content designed primarily to game search rankings. Instead, it rewarded “people-first” content—material that genuinely answers user questions and provides a satisfying experience. Sites with lots of unhelpful or shallow content saw declines, while those focused on creating valuable, user-centric content were prioritized. This update reinforced the importance of writing with the audience in mind, not just search engines.

Keep reading: Google to launch Helpful Content Update to diversify search results »

2023 / 2024 – A mix of updates

Between 2023 and 2024, Google rolled out a mix of core and spam algorithm updates to enhance search quality and combat manipulative practices. Core updates focused on refining how content is evaluated, rewarding pages that provide high-quality, relevant, and trustworthy information. At the same time, spam updates targeted tactics like keyword stuffing, spammy backlinks, and low-quality AI-generated content. These changes reinforced Google’s priorities: surfacing helpful, user-focused content while penalizing manipulative SEO strategies.

2024 – Site Reputation Abuse

Google is cracking down on site reputation abuse, including parasite SEO. This tactic involves using trusted domains to host unrelated third-party content, like payday loans or casino reviews, to manipulate rankings. Sites caught violating this policy risk manual penalties, which require removing or noindexing the problematic content to recover. Legitimate uses of third-party content, such as syndicated news or user-generated material, are still allowed when properly managed.

Google algorithm updates: What’s next?

Google continues to refine its search algorithms with a growing focus on AI-driven search experiences. Recent advancements, such as Google AI Overviews, show a shift toward providing users with more intuitive and context-rich results. These tools use AI to summarize complex topics, pull insights from multiple sources, and answer broader questions concisely.

Looking ahead, we can expect updates to further enhance understanding of search intent, prioritize high-quality content, and improve how information is presented. At the same time, technical factors like site speed, mobile usability, and security will remain essential. For website owners, the key is to stay adaptable by focusing on creating helpful, accurate, and user-centered content while keeping an eye on emerging AI trends in search.

Read on: Should I follow every change Google makes? »

Cut The Malarkey. Speaking Frankly About AI Search & SEO via @sejournal, @martinibuster

Search marketing is undergoing dramatic changes, with many debating whether SEO is on its way out as AI Search rises in popularity. What follows is a candid assessment of what is going on with SEO and search engines today.

An SEO School Shuts Down

An SEO school run by a group called Authority Hackers recently announced its closure, emphasizing that it’s not because SEO is dead but because the content site model has collapsed. They cited three reasons for the situation. What follows is not about the SEO school; its closure is just a symptom of something important going on today.

1. Google updates are one of the reasons cited for the decline of the content site model. Here’s the candid part: if Google’s updates killed your publishing site, that’s a red flag that something about the SEO needs examination.

Here’s the frank part: Google’s updates have generally crushed websites built by starting with keyword research, then stealing content ideas from competitors and scraping Google’s SERPs for more keyword phrases. That’s not audience research; that’s search engine research. Search engine research produces made-for-search-engine websites. This doesn’t describe all websites that lost rankings, but it’s a common method of SEO that, in my opinion, seriously needs to be reassessed.

2. The other reason cited by the SEO school is the “AI content tsunami.” I’m not sure what that means because it can mean a lot of things. Is that AI content spam? Or is that a reference to AI content sites overwhelming the publisher who cranks out two articles a week?

Do I need to say out loud what content output implies about site authority?

3. The third reason for the decline of the content model is the dramatic changes to Search Engine Results Pages (SERPs). Now this, this is a valid reason, but not for the reasons most SEOs think.

The organic SERPs have, for the past 25 years, been dominated by the top three ranked positions, with about 20-30% of the traffic siphoned off to Google Ads for search topics that convert. That’s the status quo: Three sites are winning and everyone else is losing.

AI Overviews has not changed a thing. AIO doubled down on the status quo. According to BrightEdge research, the top ranked websites in AIO are largely the same as the organic top ranked websites. What that means is that three sites are still winning and everyone else is still losing.

The biggest change to the SERPs that most SEOs are missing is what I already mentioned, that made for search engine websites have been getting wiped out by Google updates.

The helpful content update (HCU) is the scapegoat, but that’s just ONE algorithm out of hundreds. There is literally no way for anyone to claim with 100% certainty that the HCU is the reason why any given site lost rankings. Google is a black box. A lot of people are saying it is, but none of them can explain how they are able to pick out the effects of one algorithm out of hundreds.

The thing about being in SEO for 25 years is that people like me are accustomed to dramatic changes. Yes, the SERPs have changed dramatically. That’s how search engines have always done things.

If you’ve only been doing SEO for ten years, I can understand how the recent changes seem dramatic. But when you’ve been in it for as long as I have, dramatic changes are expected. That’s the status quo. Dramatic SERP changes are how it’s always been.

SEO Is Now AEO?

Someone started a discussion with two sentences that said AEO is the new SEO and that ChatGPT was quickly becoming the leading search engine, inspiring well over a hundred responses. The discussion is in a private Facebook group called AI/ChatGPT Prompts for Entrepreneurs.

AEO is a relatively new acronym meaning Answer Engine Optimization; it describes AI search optimization. AISEO is a more precise acronym, but it sounds too close to E-I-E-I-O.

Is AEO really a thing? Consider this: All AI search engines use a search index and traditional search ranking algorithms. For goodness sakes, Perplexity AI uses a version of Google’s PageRank, one of the most traditional ranking algorithms of all time.
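For context, classic PageRank is simple enough to sketch. Here is a toy power-iteration version in Python over a made-up four-page link graph; it is purely illustrative, not Perplexity’s or Google’s actual implementation:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy power-iteration PageRank over a dict of page -> outbound links."""
    pages = list(links)
    n = len(pages)
    ranks = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Every page starts each round with the "random jump" share.
        new_ranks = {p: (1 - damping) / n for p in pages}
        for page, outbound in links.items():
            if not outbound:  # dangling page: spread its rank evenly
                for p in pages:
                    new_ranks[p] += damping * ranks[page] / n
            else:  # split this page's rank among the pages it links to
                share = damping * ranks[page] / len(outbound)
                for target in outbound:
                    new_ranks[target] += share
        ranks = new_ranks
    return ranks

# A tiny illustrative link graph: every page links to "home",
# so "home" accumulates the most rank.
graph = {
    "home": ["about"],
    "about": ["home"],
    "blog": ["home", "about"],
    "contact": ["home"],
}
ranks = pagerank(graph)
print(max(ranks, key=ranks.get))  # home
```

The point is that the inputs are links, the most traditional ranking signal there is, which is why “AEO” for an engine built on such signals looks a lot like plain SEO.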

People in that discussion generally agreed that AEO is not a thing, that AI Search Engines were not yet a major challenge to Google and that SEO is still a thing.

All is not upside down with the world because at least in that discussion the overwhelming sentiment is that AEO is not a thing. Many observed that ChatGPT uses Bing’s index, so if you’re doing “AEO” for ChatGPT you’re actually just doing SEO for Bing. Others expressed that the average person has no experience with ChatGPT and until it’s integrated into a major browser it’s going to remain a niche search engine.

There was one person insisting that Perplexity AI was designed as an AI Search Engine, completely misunderstanding that Perplexity AI uses a search index and identifies authoritative websites with an updated version of Google’s old PageRank algorithm.

AI has been a strong factor in Google’s search engine for at least 10 years, longer if you consider that Google Brain began as a project in 2011.

  • AI in search is not new.
  • Search results summaries aren’t new either (Featured Snippets).
  • Google’s Information Gain patent for AI chatbots was filed in 2018.

AI in search feels new, but it’s not new. The biggest difference isn’t in the back end; it’s in the front end, and it’s changing how users interact with data. This is the big change that all SEOs should be paying close attention to.

Featured Image by Shutterstock/pathdoc

18 Essential Accessibility Changes To Drive Increased Website Growth via @sejournal, @skynet_lv

This post was sponsored by Skynet Technologies USA LLC.

Did you know that 1 billion people have not reached you or your customers’ websites yet?

1 billion potential customers are waiting for businesses to step up and do what’s right.

Find out if your website is accessible to 1 billion people >>>

Accessibility isn’t just a compliance checkbox anymore – it’s a growth strategy.

The demand for scalable, innovative accessibility solutions has skyrocketed.

And your competition is already making these improvements.

For agencies, this means an unprecedented opportunity to meet clients’ needs while driving revenue.

Learn how you can generate additional revenue and boost your clients’ SERP ranking by gaining access to:

Ready to get started?

How Accessibility Improvements Can Increase Growth

The digital economy thrives on inclusion.

There is a large market of individuals who are not included in modern website usability.

With over a billion people globally living with disabilities, accessible digital experiences open doors to untapped markets.

Do Websites Need To Be Accessible?

The short answer is yes.

How Does An Accessible Website Drive Traffic?

Traffic comes from people who have needs. Of course, everyone has needs, including people with disabilities.

Accessible websites and tools cater to all users, expanding reach to a diverse and often overlooked customer base.

Global Potential & Unlocking New Audiences

The global community of people with disabilities is a market estimated to hold a staggering $13 trillion in spending power.

By removing barriers and ensuring inclusive digital experiences, you can tap into this 1 billion-person market and drive substantial economic growth.

Digital accessibility helps to increase employment opportunities, education options, and simple access to various banking and financial services for everybody.

Boosts User Experience & Engagement 

Accessibility improvements run parallel with SEO improvements.

In fact, they often enhance overall website performance, which leads to:

  • Better user experience.
  • Higher rankings.
  • Increased traffic.
  • Higher conversion rates.

Ensures Your Websites Are Compliant

Increasing numbers of lawsuits against businesses that fail to comply with accessibility regulations have put pressure on them to implement accessibility in their digital assets.

Compliance with ADA, WCAG 2.0, 2.1, 2.2, Section 508, Australian DDA, European EAA EN 301 549, UK Equality Act (EA), Indian RPD Act, Israeli Standard 5568, California Unruh, Ontario AODA, Canada ACA, German BITV, Brazilian Inclusion Law (LBI 13.146/2015), Spain UNE 139803:2012, France RGAA standards, JIS X 8341 (Japan), Italian Stanca Act, Switzerland DDA, and Austrian Web Accessibility Act (WZG) guidelines isn’t optional. Accessibility solution partnerships ensure you stay ahead of potential lawsuits while fostering goodwill.

6 Steps To Boost Your Growth With Accessibility

  1. To drive growth, your agency should prioritize digital accessibility by following WCAG standards, regularly testing with tools like AXE, WAVE, or Skynet Technologies Website Accessibility Checker, and addressing accessibility gaps. Build accessible design frameworks with high-contrast colors, scalable text, and clear navigation.
  2. Integrate assistive technologies such as keyboard navigation, screen reader compatibility, and video accessibility. Focus on responsive design, accessible forms, and inclusive content strategies like descriptive link text, simplified language, and alternative formats.
  3. Providing accessibility training and creating inclusive marketing materials will further support compliance and growth.
  4. To ensure the website thrives, prioritize mobile-first design for responsiveness across all devices, adhere to WCAG accessibility standards, and incorporate keyboard-friendly navigation and alt text for media.
  5. Optimize page speed and core web vitals while using an intuitive interface with clear navigation and effective call-to-action buttons, and use SEO-friendly content with proper keyword optimization and schema markups to boost visibility.
  6. Ensure security with SSL certificates, clear cookie consent banners, and compliance with privacy regulations like GDPR and CCPA. Finally, implement analytics and conversion tracking tools to gather insights and drive long-term growth.
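The automated testing in step 1 can start even simpler than a full audit tool. Below is a minimal sketch, using only Python’s standard library, of one common WCAG check: flagging images that lack alt text. Real audits with AXE, WAVE, or similar tools cover far more criteria; the HTML here is a made-up example.

```python
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Collects <img> tags that lack an alt attribute (WCAG 1.1.1)."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        attributes = dict(attrs)
        if tag == "img" and "alt" not in attributes:
            self.missing.append(attributes.get("src", "(no src)"))

# Example page fragment: the first image is fine, the second is not.
html_fragment = """
<img src="logo.png" alt="Company logo">
<img src="hero.jpg">
"""
checker = MissingAltChecker()
checker.feed(html_fragment)
print(checker.missing)  # ['hero.jpg']
```

A lightweight check like this can run in CI on every deploy, so regressions are caught before a quarterly audit does.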

We know this is a lot.

If this sounds good to you, let us help you get set up.

How Can Digital Accessibility Partnerships Supercharge Your Clients’ SEO?

Partnering for digital accessibility isn’t just about inclusivity — it’s a game-changer for SEO, too!

Accessible websites are built with cleaner code, smarter structures, and user-friendly features like alt text and clear headings that search engines love.

Plus, faster load times, mobile-friendly designs, and seamless navigation keep users engaged, reducing bounce rates and boosting rankings. When you focus on making a site accessible to everyone, you’re not just widening your audience—you’re signaling to search engines that the website is high-quality and relevant. It’s a win-win for accessibility and SEO!

12 Essential Factors To Consider For Successful Accessibility Partnerships

  1. Expertise: Look for a provider with a proven track record in digital accessibility, including knowledge of relevant global website accessibility standards and best practices.
  2. Experience: Consider their experience working with similar industries or organizations.
  3. Tools and technologies: Evaluate their use of automated and manual testing tools to identify and remediate accessibility issues.
  4. Price Flexibility: Explore pricing models that align with both the budget and project requirements. Whether for a single site or multiple sites, the service should be compatible and scalable to meet the needs.
  5. Platform Compatibility: Ensure seamless accessibility integration across various platforms, providing a consistent and accessible experience for all users, regardless of the website environment.
  6. Multi-language support: Enhance user experience with global language support, making websites more inclusive and accessible to a global audience.
  7. Regular check-ins: Schedule regular meetings to discuss project progress, address any issues, and make necessary adjustments.
  8. Clear communication channels: Establish clear communication channels (for example: email, and project management tools) to facilitate efficient collaboration.
  9. Transparent reporting: Request detailed reports on the progress of accessibility testing, remediation efforts, and overall project status.
  10. KPIs to measure success: Review the partner’s historical data, especially those similar projects in terms of scale, complexity, and industry.
  11. Evaluate technical expertise: Assess their proficiency in using various accessibility testing tools and ability to integrate different APIs.
  12. Long-term partnership strategy: Compare previous data with the current one for improvement and optimization process. It is crucial for a long-term partnership that there is a specific interval of review and improvements.

Scaling Accessibility With Smart Partnerships

All in One Accessibility®: Simplicity meets efficiency!

The All in One Accessibility® is an AI-powered accessibility tool that helps organizations enhance their website accessibility level for ADA, WCAG 2.0, 2.1, 2.2, Section 508, and the other international standards listed above.

It is available with features like sign language LIBRAS (Brazilian Portuguese only) integration, 140+ language support, a screen reader, voice navigation, smart language auto-detection and voice customization, talk & type, and Google and Adobe Analytics tracking, along with premium add-ons including white label and custom branding, VPAT/ACR reports, manual accessibility audit and remediation, PDF remediation, and more.

  • Quick Setup: Install the widget on any site with ease, no advanced coding required.
  • Feature-Rich Design: From text resizing and color contrast adjustments to screen reader support, it’s packed with tools that elevate the user experience.
  • Revenue Opportunities: Agencies can resell the solution to clients, adding a high-value service to their offerings while earning attractive commissions through the affiliate program.
  • Reduced Development Costs: Minimizes the financial impact of accessibility remediation by implementing best practices and quick tools.

Agency Partnership: Scaling accessibility with ease!

  • Extended Service Offerings: The All in One Accessibility® Agency Partnership lets agencies add to their services a powerful accessibility widget, a quick accessibility solution that is in high demand.
  • White Label: As an agency partner, you can offer All in One Accessibility® under your own brand name.
  • Centralized Management: It simplifies oversight by consolidating accessibility data and reporting, allowing enterprises to manage multiple websites seamlessly.
  • Attractive Revenue Streams: Agencies can resell the widget to clients, earning significant revenue through competitive pricing structures and repeat business opportunities.
  • Boost Client Retention: By addressing accessibility needs proactively, agencies build stronger relationships with clients, fostering long-term loyalty and recurring contracts.
  • Increase Market Reach: Partnering with All in One Accessibility® positions agencies as leaders in inclusivity, attracting businesses looking for reliable accessibility solutions.
  • No Investment, High Return: With no setup costs, scalable features, and up to 30% commission, the partnership enables agencies to maximize profitability with their clients.

Affiliate Partnership: A revenue opportunity for everyone!

The All in One Accessibility® Affiliate Partnership program is for content creators, marketers, accessibility advocates, web professionals, 501(c) organizations (non-profit), and law firms.

  • Revenue Growth Through Referrals: The All in One Accessibility® affiliate partnership allows affiliates to earn competitive commissions by promoting a high-demand accessibility solution, turning referrals into consistent revenue.
  • Expanding Market Reach: Affiliates can tap into a diverse audience of businesses seeking ADA and WCAG compliance, scaling both revenue and the adoption of accessibility solutions.
  • Fostering Accessibility Awareness: By promoting the All in One Accessibility® widget, affiliates play a pivotal role in driving inclusivity, helping more websites become accessible to users with disabilities.
  • Leveraging Trusted Branding: Affiliates benefit from partnering with a reliable and recognized quick accessibility improvement tool, boosting their credibility and marketing impact.
  • Scaling With Zero Investment: With user-friendly promotional resources and a seamless onboarding process, affiliates can maximize returns without any costs.

    Use Accessibility As A Growth Engine

    Pursuing strategic partnerships with accessibility solution providers is a win-win for agencies aiming to meet the diverse needs of their clients. These partnerships not only enhance the accessibility of digital assets but also create opportunities for growth and loyalty, improve search engine rankings, boost revenue, strengthen compliance with legal standards, and let you contribute to a more accessible digital world.

    With Skynet Technologies USA LLC, transform accessibility from a challenge into a revenue-driving partnership. Let inclusivity power your success.

    Ready to get started? Embarking on a digital accessibility journey is simpler than you think! Take the first step by evaluating your website’s current WCAG compliance with a manual accessibility audit.

    For more information, reach out to hello@skynettechnologies.com.


    Image Credits

    Featured Image: Image by Skynet Technologies. Used with permission.

    AI is changing how we study bird migration

    A small songbird soars above Ithaca, New York, on a September night. He is one of 4 billion birds, a great annual river of feathered migration across North America. Midair, he lets out what ornithologists call a nocturnal flight call to communicate with his flock. It’s the briefest of signals, barely 50 milliseconds long, emitted in the woods in the middle of the night. But humans have caught it nevertheless, with a microphone topped by a focusing funnel. Moments later, software called BirdVoxDetect, the result of a collaboration between New York University, the Cornell Lab of Ornithology, and École Centrale de Nantes, identifies the bird and classifies it to the species level.

    Biologists like Cornell’s Andrew Farnsworth had long dreamed of snooping on birds this way. In a warming world increasingly full of human infrastructure that can be deadly to them, like glass skyscrapers and power lines, migratory birds are facing many existential threats. Scientists rely on a combination of methods to track the timing and location of their migrations, but each has shortcomings. Doppler radar, with the weather filtered out, can detect the total biomass of birds in the air, but it can’t break that total down by species. GPS tags on individual birds and careful observations by citizen-scientist birders help fill in that gap, but tagging birds at scale is an expensive and invasive proposition. And there’s another key problem: Most birds migrate at night, when it’s more difficult to identify them visually and while most birders are in bed. For over a century, acoustic monitoring has hovered tantalizingly out of reach as a method that would solve ornithologists’ woes.

    In the late 1800s, scientists realized that migratory birds made species-specific nocturnal flight calls—“acoustic fingerprints.” When microphones became commercially available in the 1950s, scientists began recording birds at night. Farnsworth led some of this acoustic ecology research in the 1990s. But even then it was challenging to spot the short calls, some of which are at the edge of the frequency range humans can hear. Scientists ended up with thousands of tapes they had to scour in real time while looking at spectrograms that visualize audio. Though digital technology made recording easier, the “perpetual problem,” Farnsworth says, “was that it became increasingly easy to collect an enormous amount of audio data, but increasingly difficult to analyze even some of it.”

    Then Farnsworth met Juan Pablo Bello, director of NYU’s Music and Audio Research Lab. Fresh off a project using machine learning to identify sources of urban noise pollution in New York City, Bello agreed to take on the problem of nocturnal flight calls. He put together a team including the French machine-listening expert Vincent Lostanlen, and in 2015, the BirdVox project was born to automate the process. “Everyone was like, ‘Eventually, when this nut is cracked, this is going to be a super-rich source of information,’” Farnsworth says. But in the beginning, Lostanlen recalls, “there was not even a hint that this was doable.” It seemed unimaginable that machine learning could approach the listening abilities of experts like Farnsworth.

    “Andrew is our hero,” says Bello. “The whole thing that we want to imitate with computers is Andrew.”

    They started by training BirdVoxDetect, a neural network, to ignore faults like low buzzes caused by rainwater damage to microphones. Then they trained the system to detect flight calls, which differ between (and even within) species and can easily be confused with the chirp of a car alarm or a spring peeper. The challenge, Lostanlen says, was similar to the one a smart speaker faces when listening for its unique “wake word,” except in this case the distance from the target noise to the microphone is far greater (which means much more background noise to compensate for). And, of course, the scientists couldn’t choose a unique sound like “Alexa” or “Hey Google” for their trigger. “For birds, we don’t really make that choice. Charles Darwin made that choice for us,” he jokes. Luckily, they had a lot of training data to work with—Farnsworth’s team had hand-annotated thousands of hours of recordings collected by the microphones in Ithaca.

    With BirdVoxDetect trained to detect flight calls, another difficult task lay ahead: teaching it to classify the detected calls by species, which few expert birders can do by ear. To deal with uncertainty, and because there is no training data for every species, they decided on a hierarchical system. For example, for a given call, BirdVoxDetect might be able to identify the bird’s order and family, even if it’s not sure about the species—just as a birder might at least identify a call as that of a warbler, whether yellow-rumped or chestnut-sided. In training, the neural network was penalized less when it mixed up birds that were closer on the taxonomical tree.
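    The taxonomy-aware penalty described above can be illustrated with a minimal sketch. This is not the BirdVox implementation—the species names, the tiny hand-built taxonomy, and the distance-to-penalty scaling below are all hypothetical—but it shows the core idea: a misprediction within the same family costs less than one across families, which in turn costs less than one across orders.

    ```python
    # Illustrative sketch of a taxonomy-aware training penalty.
    # The taxonomy table and penalty scaling are assumptions, not BirdVox's code.

    # Hypothetical taxonomy: species -> (order, family)
    TAXONOMY = {
        "yellow-rumped warbler": ("Passeriformes", "Parulidae"),
        "chestnut-sided warbler": ("Passeriformes", "Parulidae"),
        "swamp sparrow": ("Passeriformes", "Passerellidae"),
        "killdeer": ("Charadriiformes", "Charadriidae"),
    }

    def taxonomic_distance(true_species: str, predicted_species: str) -> int:
        """0 = correct species, 1 = same family, 2 = same order, 3 = different order."""
        if true_species == predicted_species:
            return 0
        t_order, t_family = TAXONOMY[true_species]
        p_order, p_family = TAXONOMY[predicted_species]
        if t_family == p_family:
            return 1
        if t_order == p_order:
            return 2
        return 3

    def penalty(true_species: str, predicted_species: str) -> float:
        # Scale the training loss by how far apart the labels sit on the tree,
        # so confusing two warblers is penalized less than calling a warbler a killdeer.
        return taxonomic_distance(true_species, predicted_species) / 3.0
    ```

    Under this scheme a model that confuses a yellow-rumped warbler with a chestnut-sided warbler incurs only a third of the penalty it would for confusing it with a killdeer, nudging the network toward errors that are at least taxonomically plausible.
    
    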

    Last August, capping off eight years of research, the team published a paper detailing BirdVoxDetect’s machine-learning algorithms. They also released the software as a free, open-source product for ornithologists to use and adapt. In a test on a full season of migration recordings totaling 6,671 hours, the neural network detected 233,124 flight calls. In a 2022 study in the Journal of Applied Ecology, the team that tested BirdVoxDetect found acoustic data as effective as radar for estimating total biomass.

    BirdVoxDetect works on a subset of North American migratory songbirds. But through “few-shot” learning, it can be trained to detect other, similar birds with just a few training examples. It’s like learning a language similar to one you already speak, Bello says. With cheap microphones, the system could be expanded to places around the world without birders or Doppler radar, even in vastly different recording conditions. “If you go to a bioacoustics conference and you talk to a number of people, they all have different use cases,” says Lostanlen. The next step for bioacoustics, he says, is to create a foundation model, like the ones scientists are working on for natural-language processing and image and video analysis, that would be reconfigurable for any species—even beyond birds. That way, scientists won’t have to build a new BirdVoxDetect for every animal they want to study.

    The BirdVox project is now complete, but scientists are already building on its algorithms and approach. Benjamin Van Doren, a migration biologist at the University of Illinois Urbana-Champaign who worked on BirdVox, is using Nighthawk, a new user-friendly neural network based on both BirdVoxDetect and the popular birdsong ID app Merlin, to study birds migrating over Chicago and elsewhere in North and South America. And Dan Mennill, who runs a bioacoustics lab at the University of Windsor, says he’s excited to try Nighthawk on flight calls his team currently hand-annotates after they’re recorded by microphones on the Canadian side of the Great Lakes. One weakness of acoustic monitoring is that unlike radar, a single microphone can’t detect the altitude of a bird overhead or the direction in which it is moving. Mennill’s lab is experimenting with an array of eight microphones that can triangulate to solve that problem. Sifting through recordings has been slow. But with Nighthawk, the analysis will speed up dramatically.

    With birds and other migratory animals under threat, Mennill says, BirdVoxDetect came at just the right time. Knowing exactly which birds are flying over in real time can help scientists keep tabs on how species are doing and where they’re going. That can inform practical conservation efforts like “Lights Out” initiatives that encourage skyscrapers to go dark at night to prevent bird collisions. “Bioacoustics is the future of migration research, and we’re really just getting to the stage where we have the right tools,” he says. “This ushers us into a new era.”

    Christian Elliott is a science and environmental reporter based in Illinois.