Google Product Studio Rolls Out To Additional Countries via @sejournal, @brookeosmundson

Google just expanded Google Product Studio, its AI-powered tool for creating better product images, to more countries.

Originally launched in May 2023, this handy feature is built right into Merchant Center Next, the revamped hub for managing product listings.

It’s a game-changer for small to medium-sized businesses (SMBs) and retailers looking to level up their visuals without breaking the bank—or spending hours in photo editing software.

What Is Google Product Studio?

At its core, it’s a tool that uses generative AI to help businesses enhance their product photos.

Whether you’re trying to grab attention in Shopping ads, make your images pop on organic search, or just keep up with competitors, this tool makes it quick and easy.

So, why should you care about Google Product Studio?

Well, let’s face it: consumers judge products by their visuals, and not everyone has the budget for professional photoshoots. That’s where Google Product Studio comes in, offering features like:

  • Background Removal and Replacement: Transform a cluttered image into a clean, professional-looking shot—or swap in a themed background for a seasonal promo.
  • Image Upscaling: Say goodbye to pixelated photos. Product Studio can upscale low-resolution images to make them shine.
  • Seasonal and Thematic Overlays: Want to add a holiday vibe or showcase a specific theme? It’s as simple as a few clicks.

Additionally, Product Studio now supports video generation, which launched just a few months ago.

These tools are especially useful for advertisers who need their listings to look polished without a lot of extra effort. Better visuals mean better click-through rates, which helps improve overall conversions and sales.

Where is Google Product Studio Available Now?

Until recently, Product Studio was only available in select regions, but this latest expansion means more merchants can now access it.

As of today, Product Studio is available in 15 new countries:

  • Czech Republic
  • Denmark
  • Finland
  • Greece
  • Hungary
  • Ireland
  • Mexico
  • New Zealand
  • Norway
  • Portugal
  • Romania
  • Slovakia
  • Sweden
  • Turkey
  • Ukraine

With this expansion, Product Studio is now available in 30 countries. It was previously made available in:

  • Australia
  • Austria
  • Belgium
  • Brazil
  • Canada
  • Germany
  • India
  • Italy
  • Japan
  • Netherlands
  • Poland
  • Spain
  • Switzerland
  • United Kingdom
  • United States

The Continued Expansion of AI

In a world where e-commerce competition keeps heating up, Product Studio is a lifeline for retailers who want to stay ahead.

Better images don’t just look good—they drive results. And with this expansion, more merchants worldwide can take advantage of Google’s AI magic to bring their product listings to life.

As e-commerce continues to evolve, tools like this make it easier than ever to keep up—and stand out.

Google expects to roll out Product Studio to additional countries in the coming months.

WPForms Plugin Vulnerability Affects Up To 6 Million Sites via @sejournal, @martinibuster

The WPForms plugin for WordPress exposes websites to a vulnerability that allows attackers to update subscriptions and issue refunds. This flaw enables attackers to modify data they normally should not have access to.

Missing Capability Check

The vulnerability is due to a missing capability check on a function within the plugin called wpforms_is_admin_page: the plugin does not verify that the user attempting to make a change through this function has the appropriate permissions. As a result, attackers lacking sufficient privileges can modify data.

Attackers need at least subscriber-level permissions to launch an attack. Authenticated vulnerabilities of this kind don’t usually attain such a high severity rating, but sites that sell paid subscriptions are likely to have many subscriber-level users, which may explain why this one is rated more severely than usual.
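
To illustrate the general pattern (a generic sketch, not WPForms or WordPress code), an authenticated endpoint that touches payment data should verify the caller’s capability before acting, roughly like this hypothetical Express handler:

```typescript
import express from 'express';

const app = express();
app.use(express.json());

// Hypothetical capability check; the user object is assumed to have been
// attached to the request by earlier authentication middleware.
function userCan(user: { capabilities: string[] } | undefined, capability: string): boolean {
  return !!user && user.capabilities.includes(capability);
}

app.post('/payments/:id/refund', (req, res) => {
  const user = (req as unknown as { user?: { capabilities: string[] } }).user;

  // This is the kind of check a missing-authorization bug omits: without it,
  // any low-privileged authenticated user could reach the refund logic.
  if (!userCan(user, 'manage_payments')) {
    return res.status(403).json({ error: 'Insufficient permissions' });
  }

  // ...issue the refund for req.params.id here...
  res.json({ refunded: req.params.id });
});
```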

The Wordfence announcement explains it like this:

“The WPForms plugin for WordPress is vulnerable to unauthorized modification of data due to a missing capability check on the ‘wpforms_is_admin_page’ function in versions starting from 1.8.4 up to, and including, 1.9.2.1. This makes it possible for authenticated attackers, with Subscriber-level access and above, to refund payments and cancel subscriptions.”

It’s recommended that WPForms users running versions 1.8.4 up to and including 1.9.2.1 update their plugin.

Read the Wordfence security alert:

WPForms 1.8.4 – 1.9.2.1 – Missing Authorization to Authenticated (Subscriber+) Payment Refund and Subscription Cancellation

Featured Image by Shutterstock/Tithi Luadthong

Reddit Integrates AI-Powered Search With New “Reddit Answers” via @sejournal, @MattGSouthern

Reddit is testing a new AI-powered search feature called “Reddit Answers” with a small group of users in the United States.

This tool should help users find information, recommendations, and personal opinions by using real conversations from Reddit’s many communities.

In an announcement, the company states:

“In line with our mission to empower communities and provide human perspectives to everyone, starting today, we’re rolling out a test of Reddit Answers, a new way to get the information, recommendations, discussions, and hot takes people are looking for – on any topic – from real conversations and communities across all of Reddit.”

Transforming Search On Reddit

Reddit Answers offers a simple search tool powered by AI.

You can ask questions and get relevant answers from discussions on Reddit.

Screenshot from redditinc.com/blog/introducing-reddit-answers, December 2024.

When a question is submitted, the tool creates summaries of conversations and details from different subreddits.

It also links related communities and posts, allowing you to explore full conversations for more context.

The announcement continues:

“People know that Reddit has answers, advice, and perspectives on almost anything they’re looking for, and AI-powered search is part of our longer-term vision to improve the search experience on Reddit – making it faster, smarter, and more relevant.”

Screenshot from reddit.com/answers, December 2024.

Why This Matters

Reddit’s search function has been a problem for users, who often find it less effective than other search engines. Reddit Answers aims to address this and help users find information more easily on the platform.

Many users turn to Google to search for Reddit content, adding “Reddit” to their queries because Reddit’s search often lacks relevant results.

By using AI to provide targeted answers and summaries, Reddit Answers could reduce this reliance on Google and keep users engaged.

Availability

Currently, the feature is available to a limited number of users in the U.S. and only works in English. The company plans to add more languages and expand to additional locations soon.

Those eager to experience the new search feature and stay informed about its availability can visit the dedicated Reddit Answers webpage for updates.

Google CEO: Search Will Change Profoundly In 2025 via @sejournal, @martinibuster

Sundar Pichai, Google’s CEO, was interviewed by Andrew Ross Sorkin at the New York Times DealBook Summit, where he discussed what to expect from Google Search in 2025 but also struggled to articulate Google’s concern for content creators.

When asked to compare where Google is today relative to the rest of the industry, and whether Google should be the “default winner,” Pichai reminded the interviewer that these were “the earliest stages of a profound shift” and underlined that Google is a leader in AI, not a follower. The entire AI industry is built on top of Google research discoveries that were subsequently open sourced, particularly transformers, without which the AI industry would not exist as it is today.

Pichai answered:

“Look, it’s a such a dynamic moment in the industry. When I look at what’s coming ahead, we are in the earliest stages of a profound shift. We have taken such a deep full stack approach to AI.

…we do world class research. We are the most cited, when you look at gen AI, the most cited… institution in the world, foundational research, we build AI infrastructure and when I’m saying AI infrastructure all the way from silicon, we are in our sixth generation of tensor processing units. You mentioned our product reach, we have 15 products at half a billion users, we are building foundational models, and we use it internally, we provide it to over three million developers and it’s a deep full stack investment.

We are getting ready for our next generation of models, I just think there’s so much innovation ahead, we are committed to being at the state of the art in this field and I think we are. Just coming today, we announced groundbreaking research on a text and image prompt creating a 3D scene. And so the frontier is moving pretty fast, so looking forward to 2025.”

Blue Link Economy And AI

It was pointed out by the interviewer that Google was the first mover on AI and then it wasn’t (a reference to OpenAI’s breakout in 2022 and subsequent runaway success). He asked Pichai how much of that was Google protecting the “blue link economy” so as not “to hurt or cannibalize that business” which is worth hundreds of billions of dollars.

Pichai answered that out of all the projects at Google, AI was applied the most to Search, citing BERT, MUM, and multimodal search as helping close gaps in search quality. Something that some in the search industry fail to understand is that AI has been a part of Google since 2012, when it used deep neural networks for image identification and speech recognition, and since 2014, when it introduced the world to sequence to sequence learning (PDF) for understanding strings of text. In 2015, Google introduced RankBrain, an AI system directly related to ranking search results.

Pichai answered:

“The area where we applied AI the most aggressively, if anything in the company was in search, the gaps in search quality was all based on Transformers internally. We call it BERT and MUM and you know, we made search multimodal, the search quality improvements, we were improving the language understanding of search. That’s why we built Transformers in the company.

So and if you look at the last couple of years, we have with AI overviews, Gemini is being used by over a billion users in search alone.”

Search Will Change Profoundly In 2025

Pichai continued his answer, stating directly that Search will change profoundly not just in 2025, but in early 2025. He also said that progress is going to get harder because the easier innovations have already been made (the low-hanging fruit).

He said:

“And I just feel like we are getting started. Search itself will continue to change profoundly in 2025. I think we are going to be able to tackle more complex questions than ever before. You know, I think we’ll be surprised even early in 2025, the kind of newer things search can do compared to where it is today… “

Pichai also said that progress wouldn’t be easy:

“I think the progress is going to get harder when I look at 2025, the low hanging fruit is gone.

But I think where the breakthroughs need to come from where the differentiation needs to come from is is your ability to achieve technical breakthroughs, algorithmic breakthroughs, how do you make the systems work, you know, from a planning standpoint or from a reasoning standpoint, how do you make these systems better? Those are the technical breakthroughs ahead.”

Is Search Going Away?

The interviewer asked Pichai if Google has leaned into AI enough, quoting an author who suggested that Google’s “core business is under siege” because people are increasingly getting answers from AI and other platforms outside of search, and that the value of search would be “deteriorating” because so much of the content online will be AI-generated.

He answered that it’s precisely in a scenario where the Internet is flooded with inauthentic content that search becomes even more valuable.

Pichai answered:

“In a world in which you’re flooded with like lot of content …if anything, something like search becomes more valuable. In a world in which you’re inundated with content, you’re trying to find trustworthy content, content that makes sense to you in a way reliably you can use it, I think it becomes more valuable.

To your previous part about there’s a lot of information out there, people are getting it in many different ways. Look, information is the essence of humanity. We’ve been on a curve on information… when Facebook came around, people had an entirely new way of getting information, YouTube, Facebook, Tik… I can keep going on and on.

…I think the problem with a lot of those constructs is they are zero sum in their inherent outlook. They just feel like people are consuming information in a certain limited way and people are all dividing that up. But that’s not the reality of what people are doing. “

Pichai Stumbles On Question About Impact On Creators

The interviewer next asked if content is being devalued. He used the example of someone who researches a topic for a book, reads twenty books, cites those sources in the bibliography, and then gets it published, whereas Google ingests everything and then “spits” out content all day long, defeating the human who in earlier times would have written the book.

Andrew Ross Sorkin said:

“You get to spit it out a million times. A million times a day. And I just wonder what the economics of that should be for the folks that create it in the beginning.”

Sundar Pichai defended Google by saying that it spends a lot of time thinking about the impact on the “ecosystem” of publishers and how much traffic it sends to them. The interviewer listened to the answer without mentioning the elephant in the room: search results stuffed with Reddit threads and advertising that crowd out content created by actual experts, and the de-prioritization of news content, which has negatively impacted traffic to news organizations around the world.

It was at this point that Pichai appeared to stumble as he tried to find the words to respond. He avoided mentioning websites, speaking in the abstract about the “ecosystem,” and when he ran out of things to say he changed course and began speaking about how Google compensates copyright holders who sign up for YouTube’s Content ID program.

He answered:

“Look I… uh… It’s a… very important question… uhm… look I… I… think… I think more than any other company… look you know… we for a long time through… you know… be it in search making sure… while it’s often debated, we spend a lot of time thinking about the traffic we send to the ecosystem.

Even through the moment through the transition over the past couple of years. It’s an important priority for us.”

At this point he started talking about Google’s content platform YouTube and Content ID, which is used to identify copyright-protected content. Content ID is a program that benefits the corporate music, film, and television industries: copyright owners who “own exclusive rights to a substantial body of original material that is frequently uploaded to YouTube.”

Pichai continued:

“In YouTube we put a lot of effort into understanding and you know identifying content and with content ID and uh creating monetization for creators.

I think… I think those are important principles, right. I think um… there’s always going to be a balance between understanding what is fair use uh… when new technology comes versus how do you… give value back proportionate to the value of the IP, the hard work people have put in.”

Insightful Interview Of Alphabet’s CEO

The interviewer did a great job of asking the hard questions, but I think many in the search marketing community who are more familiar with the search results would have asked follow-up questions about content creators who are not on Google’s YouTube platform, or about the non-expert content that pushes down content by actual experts.

Watch the New York Times Interview here:

Featured Image by Shutterstock/Shutterstock AI Generator
(no irony intended)

Google Rolls Out One-Click Event Tracking In GA4 via @sejournal, @MattGSouthern

Google simplifies analytics tracking with new one-click key event features in GA4, powered by machine learning.

  • Google released one-click event tracking in GA4 with two features: “Mark as key event” and “Create key event.”
  • Machine learning identifies important site events automatically, eliminating manual setup time.
  • These features are now available for all GA4 properties and enable better tracking and reporting.

Google: Focus On Field Data For Core Web Vitals via @sejournal, @MattGSouthern

Google stresses the importance of using actual user data to assess Core Web Vitals instead of relying only on lab data from tools like PageSpeed Insights (PSI) and Lighthouse.

This reminder comes as the company prepares to update the throttling settings in PSI. These updates are expected to increase the performance scores of websites in Lighthouse.

Field Data vs. Lab Data

Core Web Vitals measure a website’s performance in terms of loading speed, interactivity, and visual stability from the user’s perspective.

Field data shows users’ actual experiences, while lab data comes from tests done in controlled environments using tools like Lighthouse.
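
For site owners who want their own field data (in addition to the Chrome UX Report data that PageSpeed Insights surfaces), a minimal sketch using the open-source web-vitals JavaScript library might look like this; the /analytics endpoint is a hypothetical collection URL, not part of the library:

```typescript
// Report field Core Web Vitals from real visitors to a hypothetical endpoint.
import { onCLS, onINP, onLCP, type Metric } from 'web-vitals';

function sendToAnalytics(metric: Metric): void {
  const body = JSON.stringify({
    name: metric.name,   // 'CLS', 'INP', or 'LCP'
    value: metric.value, // the measured value for this page load
    id: metric.id,       // unique ID for this metric instance
  });
  // sendBeacon survives page unload, unlike a plain fetch in some cases.
  navigator.sendBeacon('/analytics', body);
}

onCLS(sendToAnalytics);
onINP(sendToAnalytics);
onLCP(sendToAnalytics);
```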

Barry Pollard, a Web Performance Developer Advocate at Google, recently emphasized focusing on field data.

In a LinkedIn post, he stated:

“You should concentrate on your field Core Web Vitals (the top part of PageSpeed Insights), and only use the lab Lighthouse Score as a very rough guide of whether Lighthouse has recommendations to improve performance or not…

The Lighthouse Score is best for comparing two tests made on the same Lighthouse (e.g. to test and compare fixes).

Performance IS—and hence LH Scores also ARE—highly variable. LH is particularly affected by where it is run from (PSI, DevTools, CI…), but also on the lots of other factors.

Lighthouse is a GREAT tool but it also can only test some things, under certain conditions.

So while it’s great to see people interested in improving webperf, make sure you’re doing just that (improve performance) and not just improving the score”

Upcoming Changes To PageSpeed Insights

Pollard addressed user concerns about slow PageSpeed Insights servers, which can cause Lighthouse tests to take longer than expected.

To fix this, Google is changing the throttling settings in PageSpeed Insights, which should lead to better performance scores when the update is released in the coming weeks.

These changes will affect both the web interface and the API but will not impact other versions of Lighthouse.

However, Pollard reminds users that “a score of 100 doesn’t mean perfect; it just means Lighthouse can’t help anymore.”

Goodhart’s Law & Web Performance

Pollard referenced Goodhart’s Law, which says that when a measure becomes a goal, it stops being a good measure.

In the web performance context, focusing only on improving Lighthouse scores may not improve actual user experience.

Lighthouse is a helpful tool, but it can only assess certain aspects of performance in specific situations.

Alon Kochba, Web Performance and Software Engineer at Wix, added context to the update, stating:

“Lighthouse scores may not be the most important – but this is a big deal for Lighthouse scores in PageSpeed Insights.

4x -> 1.2x CPU throttling for Mobile device simulation, which was way off for quite a while.”

Key Takeaway: Prioritize User Experience

As the update rolls out, website owners and developers should focus on user experience using field data for Core Web Vitals.

While Lighthouse scores can help find areas for improvement, they shouldn’t be the only goal.

Google encourages creating websites that load quickly, respond well, and are visually stable.


Featured Image: GoodStudio/Shutterstock

OpenAI Releases ChatGPT o1, ‘World’s Smartest Language Model’ via @sejournal, @martinibuster

Today OpenAI rolled out what Sam Altman says is the world’s smartest language model, plus a brand-new Pro tier that comes with unlimited usage and a higher level of computing resources.

OpenAI ChatGPT o1 Model

Sam Altman announced on X (formerly Twitter) that the new AI model is live in ChatGPT now and will be coming to the API soon.

He tweeted:

“o1, the smartest model in the world. smarter, faster, and more features (eg multimodality) than o1-preview. live in chatgpt now, coming to api soon.

chatgpt pro. $200/month. unlimited usage and even-smarter mode for using o1. more benefits to come!”

Screenshot Of ChatGPT o1 Model Availability

ChatGPT Pro Mode $200/Month

ChatGPT Pro Mode is a new tier that has more “thinking power” than the standard version of o1, which increases its reliability. Answers in Pro mode take longer to generate, displaying a progress bar and triggering an in-app notification if the user navigates to a different conversation.

OpenAI describes the new ChatGPT Pro Mode:

“ChatGPT Pro provides access to a version of our most intelligent model that thinks longer for the most reliable responses. In evaluations from external expert testers, o1 pro mode produces more reliably accurate and comprehensive responses, especially in areas like data science, programming, and case law analysis.

Compared to both o1 and o1-preview, o1 pro mode performs better on challenging ML benchmarks across math, science, and coding.”

The new tier is not a price increase from the regular plan, which is called Plus. It’s an entirely new plan called Pro.

OpenAI’s new o1 Pro plan provides unlimited access to its new o1 model, along with o1-mini, GPT-4o, and Advanced Voice. It also includes o1 Pro Mode, which has access to increased computational power to generate more refined and insightful responses to complex queries.

Read more about OpenAI’s new pro plan and O1 model:

Introducing ChatGPT Pro

Featured Image by Shutterstock/One Artist

Google Uses About 40 Signals To Determine Canonical URLs via @sejournal, @MattGSouthern

In a recent episode of Google’s Search Off the Record podcast, Allan Scott from the “Dups” team explained how Google decides which URL to consider as the main one when there are duplicate pages.

He revealed that Google looks at about 40 different signals to pick the main URL from a group of similar pages.

Around 40 Signals For Canonical URL Selection

Duplicate content is a common problem for search engines because many websites have multiple pages with the same or similar content.

To solve this, Google uses a process called canonicalization. This process allows Google to pick one URL as the main version to index and show in search results.

Google has discussed the importance of using signals like rel=”canonical” tags, sitemaps, and 301 redirects for canonicalization. However, the number of signals involved in this process is more than you may expect.

Scott revealed during the podcast:

“I’m not sure what the exact number is right now because it goes up and down, but I suspect it’s somewhere in the neighborhood of 40.”

Some of the known signals mentioned include:

  1. rel=”canonical” tags
  2. 301 redirects
  3. HTTPS vs. HTTP
  4. Sitemaps
  5. Internal linking
  6. URL length

The weight and importance of each signal may vary, and some signals, like rel=”canonical” tags, can influence both the clustering and canonicalization process.

Balancing Signals

With so many signals at play, Allan acknowledged the challenges in determining the canonical URL when signals conflict.

He stated:

“If your signals conflict with each other, what’s going to happen is the system will start falling back on lesser signals.”

This means that while strong signals like rel=”canonical” tags and 301 redirects are crucial, other factors can come into play when these signals are unclear or contradictory.

As a result, Google’s canonicalization process involves a delicate balancing act to determine the most appropriate canonical URL.

Best Practices For Canonicalization

Clear signals help Google identify the preferred canonical URL.

Best practices include:

  1. Use rel=”canonical” tags correctly.
  2. Implement 301 redirects for permanently moved content.
  3. Ensure HTTPS versions of pages are accessible and linked.
  4. Submit sitemaps with preferred canonical URLs.
  5. Keep internal linking consistent.

These signals help Google find the correct canonical URLs, improving your site’s crawling, indexing, and search visibility.
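
As a rough illustration of two of these practices, the sketch below (assuming an Express server and a hypothetical www.example.com host) issues 301 redirects to a single preferred HTTPS origin and renders an absolute, self-referencing rel="canonical" tag; it is only one way to implement these signals, not a Google-prescribed approach:

```typescript
import express from 'express';

const app = express();
const CANONICAL_HOST = 'www.example.com'; // hypothetical preferred host

// 301-redirect HTTP and non-preferred hosts to the canonical HTTPS origin.
app.use((req, res, next) => {
  const isHttps = req.secure || req.get('x-forwarded-proto') === 'https';
  if (!isHttps || req.hostname !== CANONICAL_HOST) {
    return res.redirect(301, `https://${CANONICAL_HOST}${req.originalUrl}`);
  }
  next();
});

app.get('/products/:slug', (req, res) => {
  const canonicalUrl = `https://${CANONICAL_HOST}/products/${req.params.slug}`;
  // Absolute, self-referencing canonical tag on the preferred URL.
  res.send(`<!doctype html>
<html>
  <head><link rel="canonical" href="${canonicalUrl}"></head>
  <body>Product: ${req.params.slug}</body>
</html>`);
});

app.listen(3000);
```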

Mistakes To Avoid

Here are a few common mistakes to watch out for.

1. Incorrect or conflicting canonical tags:

  • Pointing to non-existent or 404 pages
  • Multiple canonical tags with different URLs on one page
  • Pointing to a different domain entirely

Fix: Double-check canonical tags, use only one per page, and use absolute URLs.

2. Canonical chains or loops

When Page A points to Page B as canonical, but Page B points back to A or another page, creating a loop.

Fix: Ensure canonical URLs always point to the final, preferred version of the page (see the detection sketch after this list of mistakes).

3. Using noindex and canonical tags together

This sends mixed signals to search engines. Noindex tells them not to index the page at all, which makes the canonical tag irrelevant.

Fix: Use canonical tags for consolidation and noindex for exclusion.

4. Canonicalizing to redirect or noindex pages

Pointing canonicals to redirected or noindex pages confuses search engines.

Fix: Canonical URLs should be 200 status and indexable.

5. Ignoring case sensitivity

Inconsistent URL casing can cause duplicate content issues.

Fix: Keep URL and canonical tag casing consistent.

6. Overlooking pagination and parameters

Paginated content and parameter-heavy URLs can cause duplication if mishandled.

Fix: Use canonical tags pointing to the first page or “View All” for pagination, and keep parameters consistent.
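
As referenced under mistake 2, canonical chains and loops can be spotted by following rel="canonical" declarations until they stabilize. The hypothetical sketch below assumes Node 18+ (global fetch) and that rel appears before href in the link tag; a real audit tool would parse the HTML properly:

```typescript
// Follow rel="canonical" declarations to surface chains and loops.
async function resolveCanonical(startUrl: string, maxHops = 5): Promise<string[]> {
  const chain = [startUrl];
  let current = startUrl;

  for (let hop = 0; hop < maxHops; hop++) {
    const html = await (await fetch(current)).text();
    const match = html.match(/<link[^>]+rel=["']canonical["'][^>]*href=["']([^"']+)["']/i);
    if (!match) break;

    const next = new URL(match[1], current).toString();
    if (next === current) break;   // self-referencing: chain resolved cleanly
    if (chain.includes(next)) {    // revisiting a URL: loop detected
      chain.push(next);
      break;
    }
    chain.push(next);
    current = next;
  }

  // More than two entries suggests a chain worth flattening.
  return chain;
}

resolveCanonical('https://www.example.com/page').then((chain) => console.log(chain));
```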

Key Takeaways

It’s unlikely the complete list of 40+ signals used to determine canonical URLs will be made publicly available.

However, this was still an insightful discussion worth highlighting.

Here are the key takeaways:

  • Google uses approximately 40 different signals to determine canonical URLs, with rel=”canonical” tags and 301 redirects being among the strongest indicators
  • When signals conflict, Google falls back on secondary signals to make its determination
  • Clear, consistent implementation of canonicalization signals (tags, redirects, sitemaps, internal linking) is crucial
  • Common mistakes like canonical chains, mixed signals, or incorrect implementations can confuse search engines

Hear the full discussion in the video below:


Featured Image: chatiyanon/Shutterstock

Google Warns Of Duplicate Content “Black Holes” Caused By Error Pages via @sejournal, @MattGSouthern

Google’s “Search Off the Record” podcast recently highlighted an SEO issue that can make web pages disappear from search results.

In the latest episode, Google Search team member Allan Scott discussed “marauding black holes” formed by grouping similar-looking error pages.

Google’s system can accidentally cluster error pages that look alike, causing regular pages to get included in these groups.

This means Google may not crawl these pages again, which can lead to them being de-indexed, even after fixing the errors.

The podcast explained how this happens, its effects on search traffic, and how website owners can keep their pages from getting lost.

How Google Handles Duplicate Content

To understand content black holes, you must first know how Google handles duplicate content.

Scott explains this happens in two steps:

  1. Clustering: Google groups pages that have the same or very similar content.
  2. Canonicalization: Google then chooses the best URL from each group.

After clustering, Google stops re-crawling these pages. This saves resources and avoids unnecessary indexing of duplicate content.

How Error Pages Create Black Holes

The black hole problem happens when error pages group together because they have similar content, such as generic “Page Not Found” messages. Regular pages with occasional errors or temporary outages can get stuck in these error clusters.

The duplication system prevents the re-crawling of pages within a cluster. This makes it hard for mistakenly grouped pages to escape the “black hole,” even after fixing the initial errors. As a result, these pages can get de-indexed, leading to a loss of organic search traffic.

Scott explained:

“Only the things that are very towards the top of the cluster are likely to get back out. Where this really worries me is sites with transient errors… If those fail to fetch, they might break your render, in which case we’ll look at your page, and we’ll think it’s broken.”

How To Avoid Black Holes

To avoid problems with duplicate content black holes, Scott shared the following advice:

  1. Use the Right HTTP Status Codes: For error pages, use proper status codes (like 404, 403, and 503) instead of a 200 OK status. Only pages marked as 200 OK may be grouped together.
  2. Create Unique Content for Custom Error Pages: If you have custom error pages that use a 200 OK status (common in single-page apps), make sure these pages contain specific content to prevent grouping. For example, include the error code and name in the text.
  3. Caution with Noindex Tags: Do not use noindex tags on error pages unless you want them permanently removed from search results. This tag strongly indicates that you want the pages removed, more so than using error status codes.

Following these tips can help ensure regular pages aren’t accidentally mixed with error pages, keeping them in Google’s index.
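
As a concrete illustration of the first two tips (proper status codes, including a 503 for temporary outages, and unique content on error pages), here is a minimal sketch assuming an Express app with a hypothetical in-memory catalog:

```typescript
import express from 'express';

const app = express();

const MAINTENANCE_MODE: boolean = false; // hypothetical flag for planned downtime

app.use((req, res, next) => {
  if (MAINTENANCE_MODE) {
    // A 503 (not a 200) tells crawlers the outage is temporary, so the
    // maintenance page is not treated as the page's real content.
    return res.status(503).set('Retry-After', '3600').send('Temporarily down for maintenance.');
  }
  next();
});

// Hypothetical in-memory catalog standing in for a real data source.
const widgets = new Map<string, { name: string }>([['a1', { name: 'Widget A1' }]]);

app.get('/widgets/:id', (req, res) => {
  const widget = widgets.get(req.params.id);
  if (!widget) {
    // A true 404 with content unique to this URL, so it is not clustered
    // with every other generic error page on the site.
    return res
      .status(404)
      .send(`<h1>404 Not Found</h1><p>No widget with ID "${req.params.id}" exists.</p>`);
  }
  res.send(`<h1>${widget.name}</h1>`);
});

app.listen(3000);
```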

Regularly checking your site’s crawl coverage and indexation can help catch duplication issues early.

In Summary

Google’s “Search Off the Record” podcast highlighted a potential SEO issue where error pages can be seen as duplicate content. This can cause regular pages to be grouped with errors and removed from Google’s index, even if the errors are fixed.

To prevent duplicate content issues, website owners should:

  1. Use the correct HTTP status codes for error pages.
  2. Ensure custom error pages have unique content.
  3. Monitor their site’s crawl coverage and indexation.

Following technical SEO best practices is essential for maintaining strong search performance, as emphasized by Google’s Search team.

Hear the full discussion in the video below:


Featured Image: Nazarii_Neshcherenskyi/Shutterstock

Wix Integrates Session Recording Toolkit Into Analytics Interface via @sejournal, @martinibuster

Wix has integrated TWIPLA’s session recording toolkit into its analytics reports, enabling users to replay visitor interactions and make data-driven decisions to improve user experience and conversions.

Session Recordings Toolkit

TWIPLA, a German website analytics company, announced the native integration of its session recording toolkit directly within Wix analytics reports. Wix publishers can now replay recordings of actual customer journeys through their sites to better understand customer behavior.

Noa Kroytoro, Product Manager at Wix Analytics, commented:

“The launch of the session recordings toolkit enhances the reporting tools available to Wix users through Wix Analytics, providing them with deeper insights into customer behavior for more effective user experience optimization. Our partnership with TWIPLA enables us to deliver our users a powerful solution for data-driven decision making.”

TWIPLA CEO Tim Hammermann, said:

“It’s one of the most popular tools we have and our clients have found that it helps them to make tangible improvements to online success, particularly because the granular filtering system makes it so easy to find session replays that match specific visitor segments.”

Full instructions and details of how to use the new session recordings are available on the Wix website:

Wix Analytics: Adding and Setting Up Session Recordings

Read the official announcement by Twipla:

TWIPLA expands partnership with Wix, powering new session recordings toolkit for advanced UX optimization (PDF)