A year or two ago, speed was a status symbol. Agencies opened pitches with perfect PageSpeed scores, green bars became trophies, and anyone who could show "100/100" had apparently proven that their shop was "state of the art." But when you visited those sites a little later in live operation, the page speed had usually degraded significantly. Today, things have become noticeably quiet. AI, personalization, and data linking now dominate many roadmaps. Speed appears at most as a hygiene factor – somewhere between "can be done later" and "works fine." The uncomfortable question behind this: has speed become unimportant, or are we just leaving money on the table because no one is paying attention anymore?
Where speed really works
Speed is not a technical issue but a matter of perception. Users do not experience a Lighthouse score; they experience moments: When will I see something? Does the page respond to my click? Does the layout jump? Google made this tangible in a large-scale analysis of mobile user data: when load time rises from 1 second to 10 seconds, the probability of a bounce increases by 123%; even the jump from 1 to 3 seconds raises it by 32%. This is the point at which speed stops being a "nice-to-have" and becomes a balance-sheet issue. Every bounce is a purchase that never had a chance to happen.

Core Web Vitals – how impatience became measurable
Google crawls billions of websites and shops every day to keep the latest content in its search index. Seen from a search engine's perspective, even the smallest differences in loading time have a huge impact on costs. It is estimated that Google requests between 20 and 40 billion distinct URLs every day. At 25 billion pages per day, that is around 9 trillion pages fetched per year. Compared with well-optimized pages that load in one second, Google's annual costs would rise by almost USD 1 billion if websites took an average of three seconds instead of one to deliver their content. Google's controllers therefore have a genuine interest in having their servers visit slow pages less often, and in displaying them less prominently in the search index. The Gemini analysis of the effects sums it up bluntly:
- Enormous leverage: Even the jump from 1s to 3s could cost Google almost one billion dollars in additional computing power and energy if they were to continue crawling all pages with the same intensity.
- Google's countermeasure (crawl budget): In reality, Google often does not pay these costs. Instead, it reduces the crawl budget. This means that if your page takes 10 seconds to load, Google will simply visit it far less frequently in order to keep its own costs down.
- Economic paradox: While Google officially states that they do not track costs per page individually, their budget allocation algorithms show that they are extremely optimized to avoid slow resources.
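The crawl-cost arithmetic above can be sketched with rough numbers. The pages-per-day figure and the USD 1 billion delta come from the estimates in the text; the implied cost per page-second is a back-of-the-envelope derivation for illustration, not a published Google figure.

```python
# Back-of-the-envelope crawl-cost estimate based on the article's figures.
PAGES_PER_DAY = 25e9          # assumed average crawl volume (article's estimate)
DAYS_PER_YEAR = 365

pages_per_year = PAGES_PER_DAY * DAYS_PER_YEAR  # ~9.1 trillion pages

# The article's claim: going from 1 s to 3 s average load time would add
# almost USD 1 billion per year, i.e. 2 extra seconds per page crawled.
extra_seconds_per_page = 3 - 1
extra_page_seconds = pages_per_year * extra_seconds_per_page

claimed_extra_cost_usd = 1e9
implied_cost_per_page_second = claimed_extra_cost_usd / extra_page_seconds

print(f"Pages crawled per year: {pages_per_year:.3e}")
print(f"Implied cost per page-second: ${implied_cost_per_page_second:.2e}")

# Same rate applied to a 10-second page: 9 extra seconds vs. the 1 s baseline.
cost_10s = pages_per_year * 9 * implied_cost_per_page_second
print(f"Hypothetical yearly extra cost at 10 s average load time: ${cost_10s:.2e}")
```

The point of the sketch is the leverage, not the exact dollar amount: at crawl volumes in the trillions per year, every extra second per page multiplies into billions of page-seconds.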

To ensure that speed is not just a feeling for website and shop operators, and that Google is not left holding the costs, Google introduced the Core Web Vitals: three key figures that translate user experience into measurable events. LCP (Largest Contentful Paint) asks: When is the main content visible? INP (Interaction to Next Paint) asks: How quickly does the site respond to interactions? CLS (Cumulative Layout Shift) asks: Does the layout remain stable? Important: since March 12, 2024, Google has counted INP as a Core Web Vital (replacing FID, First Input Delay) – a signal that responsiveness is now considered just as business-critical as loading time. Since 2026, Google has been evaluating slow pages even more strictly than in previous years. Currently, the following applies:
- Interaction to Next Paint (INP): This value measures how quickly the page responds to user input. A value above 500 ms is rated "poor"; 200 ms or less counts as "good."
- Largest Contentful Paint (LCP): The main content element must be loaded in less than 2.5 seconds to be rated as "good."
- The 75th percentile: Google does not evaluate a single test run, but field data from your real visitors over a period of 28 days; a metric passes when at least 75% of those visits meet the threshold.
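The thresholds above can be expressed as a small classifier. The "good" and "poor" cut-offs below are the published web.dev thresholds (LCP 2.5 s / 4 s, INP 200 ms / 500 ms, CLS 0.1 / 0.25); the function name and structure are my own illustration.

```python
# Classify a p75 Core Web Vitals value as "good", "needs improvement", or "poor",
# using the published web.dev thresholds.
THRESHOLDS = {
    "lcp_ms": (2500, 4000),   # Largest Contentful Paint (milliseconds)
    "inp_ms": (200, 500),     # Interaction to Next Paint (milliseconds)
    "cls":    (0.1, 0.25),    # Cumulative Layout Shift (unitless)
}

def classify(metric: str, p75: float) -> str:
    """Return the rating bucket for a 75th-percentile field value."""
    good, poor = THRESHOLDS[metric]
    if p75 <= good:
        return "good"
    if p75 <= poor:
        return "needs improvement"
    return "poor"

print(classify("lcp_ms", 2300))  # main content visible in 2.3 s -> "good"
print(classify("inp_ms", 600))   # interaction delay above 500 ms -> "poor"
```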
This naturally has an impact on Google's crawl budget. In other words, slow page speed damages the website even before it appears in the rankings:
- Crawl-Limit: If your server responds slowly, Googlebot will lower the crawl limit so as not to overload the server. Google will then simply crawl fewer pages of the website.
- Efficiency: Slow database queries and server responses mean that Google discovers important new or updated content much later.
Many teams still measure page speed like the weather: glancing at a tool every now and then. Core Web Vitals, however, are more like blood pressure readings. They show early on when something is gradually drifting out of balance, long before it "hurts." And "gradually" is the key word, because in modern shop stacks, performance loss is rarely caused by one big mistake, but by many small decisions.
What the large studies really prove
In speed debates, striking phrases such as "every second costs X%" often crop up. Serious studies are more cautious – and that is precisely why they are useful. One of the most reliable is "Milliseconds Make Millions" by Deloitte and Google. About 30 million real mobile sessions were analyzed across multiple industries (including retail, travel, luxury, and lead generation) in Europe and the US, with the aim of isolating speed as a variable and observing correlations with funnel progression, page views, bounce rates, and spend.
What makes this study so valuable is not a magic percentage but the pattern: even small improvements – in the range of 0.1 seconds – correlate with measurable gains in engagement and conversion metrics. The study presents its findings as observations, not as universal guarantees. In addition, Think with Google offers another finding in its section on how page elements reduce conversion: the more elements a page contains, the lower the probability of conversion – an indication that "more features" often means not only more benefit but also more friction. Neil Patel illustrated this friction clearly in a February 2025 analysis drawing on data from Ubersuggest, NP Digital, Crazy Egg, and 210 surveyed marketers. Numerous SEO agencies should take notice here. Strangely enough, as an e-commerce agency, the issue of speed has never come up in any of our collaborations with SEO agencies over the past ten years. The reason probably lies in the potential for friction with the existing e-commerce agency, which usually has the deeper technical expertise.

When statistics become causality
The statement "correlation does not imply causation" is correct – and yet it is often used as an excuse for inaction. This is precisely why A/B testing is so powerful in a performance context: it lets users decide, not opinions. Vodafone ran an A/B test that explicitly optimized for Web Vitals. The result is stated plainly in the web.dev case study: "a 31% improvement in LCP led to 8% more sales."
The key point: Vodafone didn't just "tweak the design a little" and happen to see better figures. They worked specifically on one aspect that users immediately notice (LCP: "Now I see what it's all about") – and measured the effect.
Let's assume a shop has 2,000,000 sessions per month, a conversion rate of 2.0%, and an average shopping cart value of €85. Monthly revenue = 2,000,000 × 0.02 × 85 = €3,400,000. If a performance improvement (e.g., better LCP/INP values) increased the conversion rate by 8% to 2.16% (Vodafone's order of magnitude in one specific experiment, not a guarantee), the revenue leverage would theoretically be €272,000 per month. The serious interpretation is not "it will be exactly the same for you," but rather: even individual percentage points are economically relevant when volumes are large – and therefore worth testing.
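The worked example above can be reproduced in a few lines. The session count, conversion rate, cart value, and 8% uplift are the assumed figures from the text, not benchmarks.

```python
# Revenue leverage of a conversion-rate uplift (figures assumed in the text).
sessions = 2_000_000
conversion_rate = 0.02          # 2.0 %
avg_order_value = 85            # EUR

monthly_revenue = sessions * conversion_rate * avg_order_value
print(f"Baseline monthly revenue: {monthly_revenue:,.0f} EUR")   # 3,400,000 EUR

uplift = 0.08                   # 8 % relative uplift (Vodafone's order of magnitude)
new_conversion_rate = conversion_rate * (1 + uplift)             # 2.16 %
new_revenue = sessions * new_conversion_rate * avg_order_value

delta = new_revenue - monthly_revenue
print(f"Monthly revenue delta: {delta:,.0f} EUR")                # 272,000 EUR
```

Note that the uplift is relative: 8% more conversions on a 2.0% rate yields 2.16%, not 10%.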
Speed in paid traffic is measured in EUR
In marketing, speed is not a philosophical concept but an accounting one. Every click costs money – regardless of whether the user stays or leaves. Google describes the Quality Score in the Google Ads Help article "About Quality Score" as a diagnostic value and refers to three components: expected CTR, ad relevance, and landing page experience. The latter can make the difference between buy and bounce.
What does that mean? When a landing page is slow, it rarely feels like a red card. It feels like a creeping loss of efficiency: more bounces after the click, fewer purchases per budget, a rising CPA. The Chrome team put it bluntly back in 2018 – something many teams still underestimate today: "Speed is now a landing page factor for Google Search and Ads."
Here, too, a small sample calculation illustrates the impact. Let's say a shop spends €500,000 per month on ads and generates 10,000 orders. CPA = €50. If the post-click conversion rate drops by just 5% (because users bounce or cancel later), the same order volume would require a CPA of €50 / 0.95 ≈ €52.63. That is around €526,316 of budget instead of €500,000. Difference: +€26,316 per month, roughly €315,789 per year – for exactly the same result. That is the speed tax in the paid sector: invisible but reliable.
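The same calculation in code, using the spend, order, and drop figures assumed in the text:

```python
# How a small post-click conversion drop taxes a fixed order target.
monthly_spend = 500_000          # EUR
orders = 10_000
cpa = monthly_spend / orders     # 50 EUR per order

cr_drop = 0.05                   # post-click conversion rate falls by 5 %
new_cpa = cpa / (1 - cr_drop)    # ~52.63 EUR per order

budget_needed = new_cpa * orders          # budget for the SAME 10,000 orders
monthly_tax = budget_needed - monthly_spend
yearly_tax = monthly_tax * 12

print(f"New CPA: {new_cpa:.2f} EUR")                       # 52.63 EUR
print(f"Extra budget per month: {monthly_tax:,.0f} EUR")   # ~26,316 EUR
print(f"Extra budget per year: {yearly_tax:,.0f} EUR")     # ~315,789 EUR
```

The division by (1 − drop) rather than a multiplication by (1 + drop) is the key step: to hold order volume constant when fewer clicks convert, every order must absorb proportionally more spend.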
Apps, plugins, and the economics of adding extras
Now comes the part that everyone is familiar with in everyday life, but which is rarely dealt with strategically: modern shops are ecosystems. In Shopify, functionality is often added via apps; in Shopware, plugins and extensions take on similar roles. This is a boon for speed to market—but a risk for performance, because each extension comes with its own code, its own scripts, and its own requests. The problem is usually not "the one app," but the sum of them all. Tracking scripts, chat widgets, review badges, A/B testing, consent tools—all useful, all easily installed with a few clicks, but all together they weigh heavily.
Shopware explicitly states in a recent post that page speed is burdened by, among other things, third-party scripts (tracking, plugins), and that reducing them is a key lever.
That this third-party load is often the invisible culprit is consistent with Google's own argument: the Chrome team's post on landing page speed highlights JavaScript and images as significant contributors to page weight.
AI crawling – why speed is becoming more important for GEO
AI is shifting the surface of search – but the physical basis remains web infrastructure. OpenAI officially documents its own crawlers and user agents (e.g., GPTBot, OAI-SearchBot) in its "Overview of OpenAI Crawlers," including how website operators can control their access.
This does not mean that AI ranks by speed; there is currently no reliable, published formula for that. It means something simpler: when more systems retrieve, render, and evaluate content, the importance of accessibility, stability, and efficiency grows. In a world where machines read along, the ticket to entry is no longer just content, but also the question: is the content delivered reliably and quickly?
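Controlling that access works through the standard robots.txt mechanism. A minimal sketch, using the GPTBot and OAI-SearchBot user-agent tokens that OpenAI documents (the allow/block split shown here is one possible policy, not a recommendation):

```
# Example robots.txt: let OpenAI's search crawler in, block model-training crawls.
User-agent: OAI-SearchBot
Allow: /

User-agent: GPTBot
Disallow: /
```

Whatever the policy, the crawlers still have to fetch and render the pages they are allowed to see – which is exactly where delivery speed comes back into play.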
The Speed Tax balance sheet
Putting these observations together, a sober balance sheet logic emerges: Speed is a silent tax on (1) revenue, (2) ad efficiency, and (3) visibility. The tricky part: it is rarely billed as a "speed problem," but rather as "the market is tougher," "ads are becoming more expensive," or "conversion rates are fluctuating." This is precisely where it is decided whether a partner is perceived as a service provider or a trusted advisor: not who improves the next score, but who makes the friction visible in euros and turns it into a lasting discipline.
For us as an e-commerce agency, this means selling speed not as a one-time clean-up operation, but as an operating system concept for international Shopify Plus and headless setups. Simply put, speed is a feature and requires constant optimization and effort. Core Web Vitals are a key performance indicator and not just a slide decoration; app/plugin ecosystems are a governance issue and not a playground for marketing departments. It's about a common goal in which marketing, product, and tech don't optimize against each other, but pull in the same direction.
The issue of speed has not disappeared because it has become unimportant. It has disappeared because it is inconvenient and optimization costs money up front. It also forces teams to weigh up "another plugin" against "a little more friction" – and this trade-off rarely feels urgent until it becomes expensive. The data is reliable enough to support decisions. Speed must be maintained as an ongoing discipline before milliseconds turn into millions—unfortunately on the wrong side of the balance sheet.
How fast is this blog page?
Performance as of 03/05/2026 at pagespeed.web.dev
- Performance: 96 points
- Accessibility: 100 points
- SEO: 100 points





