Blog

How to speed up your website — a practical guide

A slow website is an invisible tax on everything you do online. It costs you search rankings, conversions, and the trust of the people who visit. Google has been explicit about this: page speed is a ranking factor. And the data backs it up — Akamai's widely cited research found that a 100-millisecond delay in load time can hurt conversion rates by 7%. Amazon famously calculated that every 100ms of latency cost them 1% in sales.

But here is the thing most people miss: you do not need to be a performance engineer to make your website fast. Most of the biggest wins are straightforward, and many of them take less than an hour to implement. This guide walks you through everything — from understanding what "fast" actually means, to the specific techniques that will make the biggest difference for your site.

Whether you plan to do this yourself or hire someone, understanding these concepts will help you make better decisions and avoid wasting money on things that do not matter.

Why speed matters more than you think

Let us start with the numbers, because they are hard to ignore.

SEO rankings. As covered in our SEO basics guide, Google has used page speed as a ranking signal since 2010 for desktop and since 2018 for mobile. In 2021, they made Core Web Vitals — a specific set of speed and experience metrics — part of their ranking algorithm. Sites that pass Core Web Vitals have a measurable advantage in search results. It is not the only factor, but when two pages have similar content and authority, the faster one wins.

Conversion rates. Portent found that a site that loads in 1 second has a conversion rate 3x higher than a site that loads in 5 seconds. Vodafone ran an A/B test and found that a 31% improvement in LCP (Largest Contentful Paint) led to an 8% increase in sales. These are not marginal gains — they are the difference between a business that grows and one that stalls.

User experience. Google's own research shows that 53% of mobile visitors abandon a page that takes longer than 3 seconds to load. That is more than half your potential visitors gone before they even see your content. And the visitors who do stay on a slow site are less engaged, view fewer pages, and are less likely to come back.

Bounce rate. As page load time goes from 1 second to 3 seconds, the probability of bounce increases 32%. From 1 to 5 seconds, it increases 90%. From 1 to 10 seconds, it increases 123%. Every second counts, but the first few seconds count the most.

The bottom line: a fast website is not a nice-to-have. It directly affects your revenue, your search visibility, and how people perceive your brand.

How to measure your current speed

Before you optimize anything, you need to know where you stand. There are three tools you should use, and all of them are free.

Google PageSpeed Insights (pagespeed.web.dev). This is the most important tool — pair it with Google Search Console to see how speed affects your search performance. Enter your URL and it gives you two things: real-world data from actual Chrome users who visited your site (called "field data") and a lab test that simulates loading your page on a mid-range phone with a slow connection. The field data is what Google actually uses for ranking. The lab data tells you what specific issues to fix. You get a score from 0 to 100, but the score matters less than whether your Core Web Vitals are green (good), yellow (needs improvement), or red (poor).

Lighthouse (built into Chrome). Open Chrome DevTools (right-click, Inspect), go to the Lighthouse tab, and run an audit. This gives you the same analysis as PageSpeed Insights but runs it locally on your machine. It is useful for testing changes before you deploy them. Run it in an incognito window with no extensions, because browser extensions can skew the results.

WebPageTest (webpagetest.org). This is the most detailed tool. It lets you test from different locations around the world, on different devices and connection speeds. The waterfall chart is especially useful — it shows you exactly when each file starts and finishes loading, so you can see what is blocking your page. WebPageTest also shows a filmstrip view that captures screenshots at intervals during the load, so you can see exactly what your visitors see at each moment.

Run your homepage and your most important landing pages through all three tools. Write down the results. This is your baseline — you will compare against it after making changes.

Core Web Vitals explained simply

Core Web Vitals are the three metrics Google uses to judge your page experience. They sound technical, but the concepts are straightforward.

LCP — Largest Contentful Paint. This measures how long it takes for the biggest visible element on your page to finish loading. That is usually a hero image, a large heading, or a video thumbnail. LCP answers the question: "How long before the visitor sees the main content?" The target is under 2.5 seconds. If your LCP is over 4 seconds, Google considers it poor.

INP — Interaction to Next Paint. This replaced FID (First Input Delay) in March 2024. INP measures how quickly your page responds when someone interacts with it — clicking a button, tapping a link, typing in a form field. It captures the delay between the user's action and the visual response. The target is under 200 milliseconds. If your INP is over 500ms, users feel the lag and it hurts their experience.

CLS — Cumulative Layout Shift. This measures visual stability — how much the page layout shifts around while loading. You have experienced this: you start reading an article, then an ad loads above it and pushes the text down, or you are about to click a button and it moves because an image finally loaded. CLS quantifies that frustration. The target is under 0.1. Anything over 0.25 is poor.

To pass Core Web Vitals, you need all three metrics in the "good" range for at least 75% of your page visits. You do not need perfection — you need consistency. Google measures these on real user visits, not just lab tests, so your results depend on what devices and connections your actual visitors use.

Image optimization — the biggest quick win

Images are typically the heaviest resources on a webpage. On the average page, images account for roughly 50% of the total page weight. This makes image optimization the single most impactful thing you can do for most websites.

Use modern formats: WebP and AVIF. JPEG and PNG are decades old. WebP (supported by all modern browsers) produces files 25-35% smaller than JPEG at the same quality. AVIF is even better — up to 50% smaller — and all major modern browsers now support it, though older browser versions do not. If your site builder supports it, use AVIF with a WebP fallback. If not, just use WebP. A 200 KB JPEG might become a 130 KB WebP or a 100 KB AVIF. Multiply that across every image on your page and the savings add up fast.
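In plain HTML, the picture element handles the fallback chain for you: the browser walks the sources top to bottom and uses the first format it supports. A minimal sketch (the file names are placeholders):

```html
<!-- The browser picks the first source it supports, top to bottom -->
<picture>
  <source srcset="hero.avif" type="image/avif">
  <source srcset="hero.webp" type="image/webp">
  <!-- JPEG fallback for browsers that support neither -->
  <img src="hero.jpg" alt="Product hero shot" width="1200" height="600">
</picture>
```

No JavaScript is needed; browsers that do not recognize picture simply render the inner img tag.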

Compress your images. Most images are saved at much higher quality than necessary for the web. A JPEG at 80% quality is virtually indistinguishable from 100% quality to the human eye, but it can be 60% smaller. Tools like Squoosh (squoosh.app — free, by Google), TinyPNG, or ImageOptim can compress images without visible quality loss. If you use WordPress, plugins like ShortPixel or Imagify do this automatically on upload.

Resize images to their display size. This is one of the most common mistakes. Someone uploads a 4000 x 3000 pixel photo straight from their camera, but it is displayed at 800 x 600 on the page. The browser still downloads the full-size image. Resize your images to the maximum size they will be displayed at — and ideally provide multiple sizes for different screen widths using the srcset attribute. If you use a framework like Next.js, the built-in Image component handles this for you.

Lazy load images. By default, browsers try to load every image on the page immediately, even the ones that are far below the fold and not yet visible. Adding loading="lazy" to your image tags tells the browser to only load images when they are about to scroll into view. This dramatically reduces initial page load time. Do not lazy load your hero image or anything above the fold — those should load immediately.
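In practice the split looks like this (the file names are illustrative):

```html
<!-- Above the fold: load immediately, and hint high priority -->
<img src="hero.webp" alt="Hero" width="1200" height="600" fetchpriority="high">

<!-- Below the fold: deferred until it nears the viewport -->
<img src="gallery-1.webp" alt="Gallery" width="600" height="400" loading="lazy">
```

The fetchpriority="high" hint on the hero image is optional but tells the browser to fetch your likely LCP element ahead of other resources.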

Use responsive images. A visitor on a phone with a 375px-wide screen does not need the same image as someone on a 27-inch monitor (this is one place where knowing your mobile vs desktop traffic split pays off). The HTML srcset attribute and the picture element let you serve different image sizes to different devices. This can cut image payload by 70% or more on mobile.
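A sketch of srcset with the sizes hint; the breakpoints and file names here are made up, so adjust them to your actual layout:

```html
<!-- The browser downloads only the size appropriate for the viewport -->
<img
  src="photo-800.jpg"
  srcset="photo-400.jpg 400w, photo-800.jpg 800w, photo-1600.jpg 1600w"
  sizes="(max-width: 600px) 100vw, 800px"
  alt="Team photo" width="800" height="600">
```

The sizes attribute tells the browser how wide the image will be displayed, so it can choose the smallest file that still looks sharp (including on high-DPI screens).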

Serve images from a CDN. A content delivery network (CDN) serves your images from servers physically close to your visitors. If your server is in New York and your visitor is in Tokyo, loading a 200 KB image takes noticeably longer than if it comes from a server in Tokyo. Services like Cloudflare, Bunny CDN, or imgix handle this for you. Many hosting platforms include CDN functionality built in.

JavaScript optimization

JavaScript is often the second-largest contributor to slow pages, and it is uniquely expensive because it does not just take time to download — it also takes time to parse, compile, and execute. A 200 KB image loads and it is done. A 200 KB JavaScript file has to be downloaded, parsed by the browser's JavaScript engine, and executed — which blocks other work the browser needs to do. On slower devices, this is especially painful.

Code splitting. Instead of loading all your JavaScript in one giant bundle, split it into smaller chunks that load on demand. Your homepage does not need the JavaScript for your checkout page. Modern bundlers like webpack, Vite, and frameworks like Next.js support code splitting out of the box — often automatically. The result: visitors only download the JavaScript they actually need for the page they are viewing.

Tree shaking. This is the process of removing unused code from your JavaScript bundles. If you import a utility library but only use 2 of its 50 functions, tree shaking removes the other 48. Modern build tools do this automatically if you use ES module syntax (import/export). Check your bundle analyzer output — you might be surprised by how much dead code ships to your visitors.

Use defer and async on script tags. By default, when the browser encounters a script tag in your HTML, it stops rendering the page, downloads the script, executes it, and only then continues rendering. This is called render-blocking. The defer attribute tells the browser to download the script in the background and execute it after the HTML is parsed. The async attribute downloads in the background and executes as soon as it is ready. For most third-party scripts, defer is the right choice.
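The three variants side by side (the script paths are placeholders):

```html
<!-- Render-blocking: the parser stops here until this downloads and runs -->
<script src="/js/legacy.js"></script>

<!-- Downloads in parallel, runs after HTML parsing finishes, in document order -->
<script src="/js/app.js" defer></script>

<!-- Downloads in parallel, runs as soon as it arrives (order not guaranteed) -->
<script src="https://example.com/analytics.js" async></script>
```

Use async only for scripts that do not depend on the DOM or on other scripts; defer is the safer default.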

Remove unused JavaScript. Over time, websites accumulate scripts that nobody remembers adding. An old chat widget you no longer use, an A/B testing tool from a campaign that ended months ago, a social sharing plugin for a platform you abandoned. Open your browser's DevTools, go to the Coverage tab, and reload the page. It shows you exactly how much of each JavaScript file is actually used. Anything with a high percentage of unused code is a candidate for removal.

CSS optimization

CSS is often overlooked in performance conversations because individual CSS files are smaller than images or JavaScript. But CSS is render-blocking by default — the browser will not paint anything on screen until it has downloaded and parsed all your CSS. This means a large or slow-loading stylesheet delays everything.

Critical CSS. The idea is simple: identify the CSS needed to render the content visible in the initial viewport (above the fold) and inline it directly in the HTML. Then load the rest of the CSS asynchronously. This way the browser can start painting the page immediately instead of waiting for a full stylesheet to download. Tools like Critical (by Addy Osmani) or Critters can extract critical CSS automatically as part of your build process.
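A widely used pattern inlines the critical rules and loads the full stylesheet without blocking the first paint. A sketch, with placeholder styles and paths:

```html
<head>
  <!-- Inlined critical CSS: only the rules needed above the fold -->
  <style>
    body { margin: 0; font-family: system-ui, sans-serif; }
    .hero { min-height: 60vh; background: #0b2545; color: #fff; }
  </style>

  <!-- Load the full stylesheet asynchronously via the preload trick -->
  <link rel="preload" href="/css/main.css" as="style"
        onload="this.onload=null;this.rel='stylesheet'">
  <noscript><link rel="stylesheet" href="/css/main.css"></noscript>
</head>
```

The onload handler flips the preloaded file into a live stylesheet once it arrives; the noscript fallback covers visitors with JavaScript disabled.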

Remove unused CSS. Most websites ship far more CSS than they actually use. If you use a CSS framework like Bootstrap or Tailwind, you might be sending hundreds of kilobytes of styles for components you never use. Tools like PurgeCSS scan your HTML and JavaScript to find which CSS selectors are actually used and remove the rest. Tailwind CSS does this by default in production builds. The Chrome Coverage tab (the same one mentioned for JavaScript) also works for CSS files.

Minification. CSS minification removes whitespace, comments, and unnecessary characters from your stylesheets. It typically reduces file size by 10-20%. Every modern build tool (webpack, Vite, PostCSS) includes CSS minification. If you use a CMS like WordPress, a caching plugin like WP Rocket or Autoptimize handles this. It is a small win, but it is free and there is no reason not to do it.

Third-party scripts — the hidden performance killer

This is the section most performance guides gloss over, but it is often the biggest problem for real-world websites. Third-party scripts — analytics, chat widgets, ad networks, social media embeds, marketing pixels, A/B testing tools — can absolutely destroy your page performance.

Here is why they are so damaging. Each third-party script typically comes from a different server, which means a separate DNS lookup, TCP connection, and TLS handshake before a single byte of content arrives. Many of them load additional scripts of their own. A chat widget might pull in 300-500 KB of JavaScript. A social media embed can load an entire iframe with its own set of resources. A typical analytics script adds 30-70 KB. Stack five or six of these on a page and you are easily adding 1-2 MB of extra JavaScript and multiple seconds to your load time.

Audit everything. Open your browser DevTools, go to the Network tab, and reload your page. Filter by "JS" and look at what is loading. You will probably find scripts you forgot about. For each one, ask: is this actively providing value? If you have not looked at your Hotjar recordings in six months, remove the script. If your live chat widget gets used once a week, consider loading it only on your contact page instead of every page.

Load third-party scripts asynchronously. Never let a third-party script block your page rendering. Use the defer or async attribute. Better yet, delay loading non-essential scripts until after the page has fully loaded or until the user interacts with the page (a technique sometimes called "interaction-based loading").
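A minimal sketch of interaction-based loading; the widget URL is a placeholder for whatever vendor script you are delaying:

```html
<script>
  // Delay a non-essential third-party script until the first user interaction.
  var chatLoaded = false;
  function loadChat() {
    if (chatLoaded) return; // guard: only inject the script once
    chatLoaded = true;
    var s = document.createElement('script');
    s.src = 'https://chat.example.com/widget.js'; // placeholder vendor URL
    s.defer = true;
    document.body.appendChild(s);
  }
  // Any of these events counts as "the user is engaging with the page"
  ['scroll', 'click', 'keydown', 'touchstart'].forEach(function (evt) {
    window.addEventListener(evt, loadChat, { once: true, passive: true });
  });
</script>
```

Until someone scrolls, clicks, or types, the widget costs your page nothing.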

Choose lightweight alternatives. Not all tools in a category are equal. Some analytics scripts weigh 500 KB or more. sourcebeam's tracking script is under 1 KB — proof that analytics does not have to slow your site down. The same applies to other categories: there are lightweight chat widgets, minimal social sharing solutions, and lean A/B testing tools if you look for them.

Self-host when possible. If a third-party script allows it, hosting a copy on your own server or CDN eliminates the extra DNS lookup and connection time. It also gives you more control over caching. Just make sure you update the script periodically.

Use facades for heavy embeds. Instead of loading a YouTube embed (which pulls in over 1 MB of resources), show a static thumbnail with a play button. Only load the actual YouTube player when someone clicks. The lite-youtube-embed library does exactly this. The same pattern works for Google Maps, Twitter embeds, and chat widgets.
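With lite-youtube-embed, the markup is a single custom element. The CDN paths below are illustrative — check the library's README for current install instructions:

```html
<!-- Static thumbnail + play button; the real player loads only on click -->
<link rel="stylesheet"
      href="https://cdn.jsdelivr.net/npm/lite-youtube-embed/src/lite-yt-embed.css">
<script defer
        src="https://cdn.jsdelivr.net/npm/lite-youtube-embed/src/lite-yt-embed.js"></script>

<lite-youtube videoid="dQw4w9WgXcQ" playlabel="Play: Product demo"></lite-youtube>
```

The visible cost drops from over 1 MB of player resources to a thumbnail image and a few KB of script.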

Server-side optimizations

Front-end optimization can only take you so far. If your server takes 2 seconds to respond before the browser even starts loading your page, all the image compression in the world will not make your site feel fast.

Use a CDN (Content Delivery Network). A CDN caches your website's content on servers around the world and serves it from the location closest to each visitor. This reduces latency dramatically. If your server is in Virginia and your visitor is in Berlin, a CDN can cut the response time from 200ms to 30ms. Cloudflare has a generous free tier. Other good options include Fastly, Bunny CDN, and AWS CloudFront. For many sites, putting Cloudflare in front of your existing hosting is the single easiest server-side improvement you can make.

Set proper caching headers. When a browser loads your page, it downloads HTML, CSS, JavaScript, images, and fonts. Caching headers tell the browser how long it can keep those files before checking for updates. For static assets (images, fonts, CSS, JS with hashed filenames), set a long cache duration — one year is standard. The browser will reuse the cached version on subsequent visits instead of downloading everything again. For HTML pages, use shorter cache durations (a few minutes to a few hours) or no-cache with ETags so visitors always get fresh content.
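As a sketch, here is what those headers look like in an nginx config; adapt the file extensions and locations to your own setup:

```nginx
# Long cache for fingerprinted static assets -- safe because the
# filename changes whenever the content changes
location ~* \.(css|js|woff2|jpg|jpeg|png|webp|avif|svg)$ {
    add_header Cache-Control "public, max-age=31536000, immutable";
}

# HTML: always revalidate so visitors get fresh content
location / {
    add_header Cache-Control "no-cache";
}
```

The one-year max-age only works if your build process adds a content hash to asset filenames; without hashed filenames, use a shorter duration.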

Enable compression (gzip or Brotli). Text-based files (HTML, CSS, JavaScript, JSON, SVG) compress very well. Gzip reduces file sizes by 60-80%. Brotli is newer and compresses 15-20% better than gzip. Most modern web servers and CDNs support both. If you are on shared hosting, check that gzip is enabled — it usually is, but not always. If you have access to your server configuration, enable Brotli as well.
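On nginx, enabling gzip is a few lines of config (Brotli requires the separate ngx_brotli module, so it is omitted here):

```nginx
# Compress text-based responses; HTML is compressed by default
gzip on;
gzip_comp_level 6;
gzip_types text/css application/javascript application/json image/svg+xml;
```

If you are behind a CDN like Cloudflare, compression is usually applied for you automatically, and Brotli is often included at no extra effort.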

Hosting quality matters. Cheap shared hosting ($3-5/month) puts your site on a server with hundreds of other websites. When one of those sites gets a traffic spike, everyone else slows down. If your site is slow despite optimizing everything on the front end, your hosting might be the bottleneck. For static sites and Jamstack architectures, platforms like Vercel, Netlify, and Cloudflare Pages are fast and often free. For WordPress, managed hosting like Kinsta, Flywheel, or Cloudways is significantly faster than generic shared hosting.

Reduce server response time (TTFB). Time to First Byte (TTFB) measures how long it takes for the server to start sending back a response after receiving a request. A good TTFB is under 200ms. If yours is over 600ms, something is wrong on the server side — slow database queries, lack of server-side caching, or poor hosting. For dynamic sites, page caching (serving a pre-built HTML file instead of generating one on every request) can reduce TTFB from seconds to milliseconds.

Font optimization

Custom fonts are one of those things that seem harmless but can quietly add seconds to your page load. A single font family with multiple weights (regular, bold, italic) can easily be 200-400 KB. And because fonts are render-blocking by default in some browsers, visitors may see invisible text (FOIT — Flash of Invisible Text) while the font downloads.

Use font-display: swap. This CSS property tells the browser to immediately show text using a fallback system font, then swap in the custom font once it has loaded. This means visitors can start reading immediately instead of staring at blank space. Add it to your @font-face declarations or include it as a parameter when loading from Google Fonts (&display=swap in the URL).
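In a self-hosted @font-face declaration it is one extra line; the font name and file path below are placeholders:

```css
@font-face {
  font-family: "Inter";
  src: url("/fonts/inter-regular.woff2") format("woff2");
  font-weight: 400;
  /* Show fallback text immediately; swap in the custom font when it loads */
  font-display: swap;
}
```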

Subset your fonts. Most font files include characters for dozens of languages. If your site is only in English, you do not need Cyrillic, Greek, or Vietnamese characters. Subsetting removes the characters you do not use. Google Fonts does this automatically (it serves only Latin characters by default for English-language pages). If you self-host fonts, tools like glyphhanger or FontForge let you create a subset that includes only the characters you actually use. This can reduce a 150 KB font file to 20 KB.

Consider system font stacks. The fastest font is one that does not need to be downloaded at all. System font stacks use the fonts already installed on the visitor's device — San Francisco on Apple devices, Segoe UI on Windows, Roboto on Android. They look good, they load instantly, and they eliminate an entire category of performance problems. GitHub, Bootstrap, and many high-performance sites use system font stacks.
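A typical system font stack is a single CSS declaration and zero downloads:

```css
/* Uses whatever the visitor's OS ships with: San Francisco, Segoe UI, Roboto, ... */
body {
  font-family: system-ui, -apple-system, "Segoe UI", Roboto,
               "Helvetica Neue", Arial, sans-serif;
}
```

The system-ui keyword resolves to the platform's default UI font in modern browsers; the rest of the stack is a fallback chain for older ones.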

Self-host instead of using Google Fonts. Loading fonts from Google Fonts requires a DNS lookup and connection to fonts.googleapis.com, then another to fonts.gstatic.com. Self-hosting the font files on your own domain eliminates those extra connections. It also gives you more control over caching. Download the font files, convert them to WOFF2 (the most compressed format), and serve them from your own server or CDN. The google-webfonts-helper tool makes this easy.

Preload your most important font. Add a preload link in your HTML head for the font file used in your main body text. This tells the browser to start downloading that font immediately rather than waiting until it encounters it in the CSS. Only preload one or two fonts — preloading too many defeats the purpose.
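The preload tag itself is one line; note that font preloads require the crossorigin attribute even for same-origin files. The path is a placeholder:

```html
<link rel="preload" href="/fonts/inter-regular.woff2"
      as="font" type="font/woff2" crossorigin>
```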

The performance budget concept

A performance budget is a simple idea that prevents your site from getting slower over time. You set limits on specific metrics and treat them the way you would treat a financial budget — you do not spend more than you have.

Here is what a practical performance budget looks like:

Total page weight: under 1.5 MB.
Total JavaScript: under 300 KB (compressed).
Total CSS: under 100 KB (compressed).
LCP: under 2.5 seconds.
INP: under 200ms.
CLS: under 0.1.
Number of HTTP requests: under 50.

Why this matters in practice. Without a budget, performance degrades gradually. A developer adds a new library — 50 KB. Marketing installs a new pixel — 30 KB. A designer adds a video background — 2 MB. Nobody notices the individual additions, but after six months your page weighs 5 MB and takes 8 seconds to load.

How to enforce it. If you have a build process, tools like bundlesize or Lighthouse CI can automatically fail a deployment if the budget is exceeded. If you do not have a build process, check PageSpeed Insights monthly and track total page weight over time. The point is not rigid enforcement — it is awareness. When someone wants to add a new 200 KB script, the performance budget forces a conversation: what will we remove to make room for this?
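Lighthouse CI accepts a budget file that encodes exactly the limits above (sizes are in KB); a sketch:

```json
[
  {
    "path": "/*",
    "resourceSizes": [
      { "resourceType": "total", "budget": 1500 },
      { "resourceType": "script", "budget": 300 },
      { "resourceType": "stylesheet", "budget": 100 }
    ],
    "resourceCounts": [
      { "resourceType": "total", "budget": 50 }
    ]
  }
]
```

Wire this into your CI pipeline and a pull request that blows the budget fails before it ever reaches production.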

Think of it as a trade-off framework. Every feature has a cost. The performance budget makes that cost visible so you can make informed decisions instead of letting your site gradually slow down.

Quick wins that take 30 minutes or less

If you only have an afternoon, these are the highest-impact changes you can make with minimal effort:

Run your images through Squoosh or TinyPNG. Take the 5 largest images on your homepage and compress them. This alone can cut page weight by 50% or more. Time: 10 minutes.

Add loading="lazy" to below-the-fold images. A simple HTML attribute that prevents images from loading until the visitor scrolls to them. Time: 5 minutes.

Remove one unused third-party script. Open your Network tab, find the heaviest third-party script you are not actively using, and remove it. Time: 10 minutes.

Enable a CDN. Sign up for Cloudflare's free plan and point your domain to it. This adds caching, compression, and global distribution in one step. Time: 20 minutes.

Add font-display: swap to your custom fonts. One line of CSS that prevents invisible text during font loading. Time: 2 minutes.

Set explicit width and height on images. This prevents layout shifts (CLS) by reserving space for images before they load. The browser knows exactly how much space the image will take up, so the layout does not jump when the image arrives. Time: 10 minutes.

Preload your LCP image. If your LCP element is an image, add a preload link in your HTML head. This tells the browser to start downloading it immediately instead of waiting to discover it in the CSS or HTML. Time: 2 minutes.
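The tag is a single line in your head section; the image path is a placeholder:

```html
<link rel="preload" as="image" href="/images/hero.webp" fetchpriority="high">
```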

These seven changes, done in a single sitting, can easily improve your PageSpeed score by 20-40 points and shave 1-3 seconds off your load time. They are not glamorous, but they work.

When to hire someone vs. do it yourself

Most of the quick wins above are doable by anyone who can edit HTML or use a website builder. But some performance work requires genuine technical expertise.

Do it yourself if: your site runs on a modern website builder (Squarespace, Webflow, WordPress with a good theme) and your main issues are uncompressed images, too many plugins, or missing lazy loading. These are configuration changes, not engineering work. The quick wins section above covers most of it.

Consider hiring someone if: your site is a custom-built application with complex JavaScript, your LCP is above 4 seconds despite basic optimizations, you need to implement code splitting or critical CSS in a custom codebase, or your server response time is consistently slow and you do not know why. A good performance consultant can audit your site, identify the top 3-5 issues, and either fix them or give you a clear plan.

What to look for in a consultant. Ask for before-and-after case studies with actual Core Web Vitals data. Good performance consultants measure everything and can show you the specific impact of their work. Be wary of anyone who promises a "perfect 100 Lighthouse score" — that number is not always achievable or even necessary. What matters is passing Core Web Vitals in the field, not a perfect lab score.

Budget expectations. A performance audit with recommendations typically costs $500-2,000. Implementation depends on complexity — fixing image optimization might be a few hundred dollars, while restructuring a complex JavaScript application could be $5,000-15,000. For most small business websites, the total cost of meaningful performance improvement is $1,000-3,000. Given the impact on conversions and SEO, it almost always pays for itself within a few months.

The bottom line

Website speed is not a vanity metric. It directly affects how much money your website makes, where it ranks in Google, and how people feel when they visit. The good news is that most of the biggest wins are not complicated — they are just overlooked.

Start by measuring where you are today. Run PageSpeed Insights on your key pages and write down the numbers. Then work through the quick wins: compress your images, remove unused scripts, enable a CDN, and fix your fonts. That alone will put you ahead of the majority of websites on the internet.

If you want to go further, set a performance budget, audit your third-party scripts regularly, and invest in server-side optimizations. Monitor your Core Web Vitals monthly — use sourcebeam alongside PageSpeed Insights to understand how speed improvements affect your actual traffic and conversions.

A fast website is not something you build once and forget about. It is an ongoing practice — a habit of measuring, optimizing, and making conscious trade-offs. But the payoff is real: better rankings, more conversions, and visitors who actually enjoy using your site. That is worth the effort.

sourcebeam helps you track how site speed improvements affect your real traffic and conversions — with a script that weighs under 1 KB. Try it free