JavaScript SEO should be a solved problem by now. It isn’t.
Ecommerce sites keep hitting the same crawling, rendering, and indexing issues they were hitting five years ago, now stacked on top of headless builds, AI-powered recommendations, and frameworks that can hide critical content from Google.
The ecommerce players below have figured out how to ship fast, modern JavaScript without sacrificing organic visibility. Here are five lessons worth stealing.
1. Chewy uses JavaScript for UX
Chewy is one of the largest online retailers of pet food and supplies in the U.S. They use Next.js, a React framework for building websites with built-in support for server rendering, static generation, and full-stack development features.
That means you can put important content in the initial HTML response without relying on client-side JavaScript.
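In rough sketch form, the pattern looks like this (illustrative, not Chewy's actual code; getProduct is a hypothetical data-fetching helper):

```javascript
// A Next.js App Router server component: data is fetched on the server,
// so the critical content ships in the initial HTML response.
export default async function ProductPage({ params }) {
  const product = await getProduct(params.slug); // hypothetical helper

  return (
    <main>
      <h1>{product.title}</h1> {/* present before any client-side JS runs */}
      <p>{product.description}</p>
      <span>{product.price}</span>
    </main>
  );
}
```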
Let’s look at a product page like the Benebone Wishbone Chew Toy.


Navigate to View Page Source and you’ll see the product title, description, pricing, reviews, Q&A, and breadcrumb navigation all present in the initial HTML. Googlebot can access it on the first pass, without waiting for rendering.


That’s important because if a web crawler like Googlebot encounters issues rendering your page, the important content can still be parsed on the first crawl. With the rise of AI chatbots, some of which still don’t render JavaScript, this has become even more important.
Not everything needs to be in the initial HTML, though. Without client-side JavaScript, the page would feel static and clunky.
Take the “Compare Similar Items” carousel. It’s loaded client-side, primarily there for shoppers. The internal links could offer some SEO benefit, but they’re not critical for indexing this page the way the title, description, and pricing are.


Chewy gets this balance right. The content that matters most for indexing is available on initial load. Client-side JavaScript enhances the experience rather than delivering the content that needs to be indexed.
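If you wanted to keep a widget like that carousel out of the initial HTML in Next.js, one common approach is a client-only dynamic import (a sketch, not Chewy's actual implementation; CompareCarousel is a hypothetical component):

```javascript
// Defer a non-critical widget to the browser with next/dynamic.
// With ssr: false, the carousel never renders on the server, so it adds
// nothing to the initial HTML that Googlebot parses on the first pass.
import dynamic from 'next/dynamic';

const CompareCarousel = dynamic(() => import('./CompareCarousel'), {
  ssr: false,
});
```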
2. Myprotein makes navigation crawlable
Myprotein sells supplements, nutrition products, and some fitness apparel.
Their site is built on Astro, a content-first framework using Islands Architecture to ship zero JavaScript by default while supporting components from React, Vue, or Svelte.
Myprotein’s navigation is the part worth studying. It’s an important SEO area for ecommerce sites, and they get it right.


View the source on any Myprotein page and the navigation links (categories, dropdown items, and footer links) are all in the initial HTML response. Astro makes this possible through its island architecture.


The navigation ships as an interactive island, meaning Astro will hydrate it with JavaScript as soon as the browser is ready. But JavaScript makes the flyout menus interactive. It doesn’t create them.
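In Astro terms, the pattern looks something like this (a sketch, not Myprotein's actual code; MainNav is a hypothetical component):

```astro
---
// The nav renders to static HTML on the server; the client:idle directive
// then hydrates its flyout behavior once the browser is idle.
import MainNav from '../components/MainNav.jsx';
---
<MainNav client:idle />
```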
These links are also proper <a> elements with href attributes, which is what crawlers like Googlebot need to discover and follow links. Avoid using JavaScript click handlers to simulate navigation, such as:

```html
<!-- A click handler standing in for a link (path illustrative) -->
<span onclick="window.location.href='/clear-protein-drinks'">Clear Protein Drinks</span>
```
A crawler won’t follow that. Use a standard anchor element instead:
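```html
<!-- href gives crawlers a real URL to discover and follow (path illustrative) -->
<a href="/clear-protein-drinks">Clear Protein Drinks</a>
```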
Not every site gets this right. When navigation depends entirely on client-side rendering, there’s a window where it’s invisible or empty.
Googlebot processes JavaScript in a separate rendering pass that can lag behind the initial crawl, which can mean delayed discovery of internal links critical for crawl efficiency and link equity distribution.
3. Harrods embeds structured data in the HTML
Harrods is a luxury department store selling fashion, beauty, and homeware.
Their site is built on Nuxt, a Vue framework for building websites with built-in routing, server rendering, and static generation, plus an opinionated project structure.
Their structured data is delivered in the initial HTML response. View the source on any product page and you’ll find structured data inside a <script type="application/ld+json"> element. The Product schema includes the product name, images, description, brand, and an Offer with price, currency, availability, and seller.


JSON-LD is the format Google recommends for structured data, and because it’s in the HTML response, Google can parse it on the first crawl pass without needing to render the page.
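In skeleton form, it looks something like this (illustrative values, not Harrods’ actual markup):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Cashmere Scarf",
  "image": ["https://example.com/images/scarf.jpg"],
  "description": "A lightweight cashmere scarf.",
  "brand": { "@type": "Brand", "name": "Example Brand" },
  "offers": {
    "@type": "Offer",
    "price": "195.00",
    "priceCurrency": "GBP",
    "availability": "https://schema.org/InStock",
    "seller": { "@type": "Organization", "name": "Harrods" }
  }
}
</script>
```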
On JavaScript-powered sites, structured data can easily become a client-side dependency. If a framework fetches product data in the browser and generates JSON-LD from the response, that structured data only exists after JavaScript executes. The same is true for structured data injected through Google Tag Manager.
If markup is only added after the page loads, Google has to render the page to find it. Google has noted that dynamically generated Product markup can make Shopping crawls less frequent and less reliable, which matters when prices and availability change often.
By serving that structured data in the HTML directly, Harrods avoids this risk entirely.
4. Under Armour handles faceted navigation with JavaScript
Under Armour is a global sportswear brand selling athletic apparel, footwear, and accessories. Their site is built on Next.js, the same React framework Chewy uses.
A good place to see their JavaScript SEO in action is on category pages, where filters need to feel fast and interactive for shoppers, and be crawler-friendly.
Let’s look at the men’s shoes category page. When you apply a filter, say, selecting size 10, the product grid updates instantly without a full page reload. That’s client-side JavaScript updating the grid.


But the URL updates too. After selecting the filter, the address bar shows the category path with a named query parameter appended, along the lines of ?size=10.
A shopper can copy that URL, send it to a friend, or bookmark it, and land right back on the same filtered view.
Notice what the URL isn’t:
- Not a hash fragment (#size=10), which doesn’t get sent to the server and is ignored by Google.
- Not a mess of bracketed query strings (?filters[0][size]=10).
- Not a dynamic route artifact like /shoes/SEO/ leaking into the live URL.
It’s a clean, readable query string with named parameters.
Under Armour is using the Next.js router to update the URL as filters change. Under the hood, it wraps the browser’s History API and uses the pushState() method to update the address bar without a reload.
When someone visits that same URL directly, the page loads with the filter already applied.
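Stripped of the framework, the underlying pattern is small (a sketch, not Under Armour’s actual code; updateProductGrid is a hypothetical client-side function):

```javascript
// Update the grid client-side, then reflect the filter in the URL so the
// filtered view is shareable, bookmarkable, and loadable directly.
function applySizeFilter(size) {
  updateProductGrid({ size }); // hypothetical client-side re-render

  const url = new URL(window.location.href);
  url.searchParams.set('size', size); // clean named parameter, e.g. ?size=10
  window.history.pushState({ size }, '', url); // no full page reload
}
```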
5. Manors Golf loads third-party scripts without blocking rendering
Manors Golf sells golf apparel. Their site runs on Hydrogen, Shopify’s React-based framework for headless storefronts.
Hydrogen defers its own application scripts automatically since they load as ES modules. However, third-party scripts are the developer’s responsibility. On an ecommerce site, that can be a long list: reviews, chat, personalization, pixels, recommendations, payment scripts.
That matters for SEO in two ways. Render-blocking scripts hurt Core Web Vitals, most directly Largest Contentful Paint (LCP). They also give Googlebot more work to render the page, so it may get processed less reliably.
An external script (<script src="…">) without async or defer blocks HTML parsing while it downloads and executes. With async, the browser fetches the script in the background and runs it as soon as it’s ready; with defer, the script waits until parsing finishes.
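The difference is a single attribute on the script tag (a generic example, not one of Manors’ actual scripts):

```html
<!-- Blocking: HTML parsing stops while this downloads and executes -->
<script src="https://example.com/widget.js"></script>

<!-- async: fetched in parallel, executes as soon as it arrives -->
<script async src="https://example.com/widget.js"></script>

<!-- defer: fetched in parallel, executes after parsing finishes -->
<script defer src="https://example.com/widget.js"></script>
```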
Manors loads external scripts from 12 third-party domains, including Klaviyo, TikTok, Microsoft Clarity, and Gorgias.
A look at the Elements panel shows them all loading with async:


By loading third-party scripts with async, Manors keeps them from blocking the initial render. That protects LCP and reduces the work Google’s Web Rendering Service (WRS) has to do.

The balance between interactivity and crawlability
The issue isn’t that you’re using JavaScript. It’s what you’re using it for.
Googlebot can process JavaScript, but it’s slower and less reliable than reading HTML. The more your core content, structure, and navigation depend on JavaScript, the more room there is for things to go wrong.
The sites in this article all use JavaScript to enhance the experience rather than deliver it. Do that, and you won’t have to choose between a good user experience and good SEO.