SEO calculators
Tools for the bits of SEO that reward a tidy, considered output: URL slugs, title and description lengths, schema snippets. All run in your browser.
SEO calculators that solve real problems, not vanity ones
I have been doing SEO for 25 years. The single biggest waste of time in this industry is tools that produce a number with no relationship to ranking. Keyword density scores from a tool with a hard-coded "optimal" range. "SEO scores" out of 100 generated by a black-box checker. Title-tag graders that flag an exclamation mark as a problem. None of that survives contact with how Google actually ranks pages, and it never has. The tools in this category exist because there are a handful of small, mechanical questions that genuinely matter and genuinely benefit from a quick answer: will this title truncate in the SERP? Is this slug clean? Does this robots.txt rule actually allow what I think it allows? Is this keyword overrepresented in a way a reader would notice? Beyond that, no tool can rank a page for you, and any that claims to is selling something.
The other reason these exist is that the obvious alternatives, the title-tag preview tools and the slug generators floating around, almost all upload your input to a server. That is fine for a public URL. It is not fine when you are drafting a launch headline for a client whose campaign is under embargo, or when the meta description references a feature that has not shipped yet. Everything in this category runs in the browser. Your text never leaves the device.
Where the inputs come from
For SERP preview work, the inputs are obvious: take the live or draft title, description, and URL straight from the CMS or the brief. The SERP Snippet Preview Tool measures by pixel width, not character count, because that is what Google actually truncates on. The same is true of the Meta Length Checker: a 60-character title in narrow letters renders fine, while the same character count packed with Ws, Ms and capitals will get cut off. Use the pixel reading, not the character count, when you are deciding whether to trim.
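The pixel point is worth making concrete. Here is a crude sketch of a width estimator using assumed per-character buckets for a roughly 20px desktop SERP font; the bucket values are illustrative only, not measured metrics, and the real tool measures actual rendered widths:

```python
def estimate_title_pixels(title: str) -> int:
    """Very rough pixel estimate for a desktop SERP title.

    Assumes three coarse width buckets (values are illustrative,
    not real font metrics). Tools that do this properly measure
    the string against the actual SERP font.
    """
    narrow = set("iIjl.,:;'!|ft ")   # assumed ~6 px each
    wide = set("mwMW@")              # assumed ~17 px each
    total = 0
    for ch in title:
        if ch in narrow:
            total += 6
        elif ch in wide:
            total += 17
        elif ch.isupper():
            total += 13              # assumed: capitals run wider
        else:
            total += 10              # assumed default width
    return total
```

Four capital Ms come out at nearly three times the width of four lowercase i's, which is exactly why two titles with identical character counts can have very different fates in the SERP.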
For CTR estimation, the volume input wants to come from Search Console or a keyword research tool you trust. Do not trust round numbers from a free Chrome extension; the precision is theatre. The CTR by SERP Position Estimator uses Advanced Web Ranking 2024 data, which is one of the better public CTR datasets available, and it lets you adjust for SERP features (featured snippets, AI overviews, ads above the fold) because position-one CTR on a clean SERP and position-one CTR underneath an AI Overview are very different numbers.
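The arithmetic behind the estimator is simple. A sketch, using placeholder CTR figures and assumed feature multipliers for illustration; these are not the Advanced Web Ranking numbers the tool actually ships with:

```python
# Illustrative CTR-by-position figures -- placeholder values, NOT the
# Advanced Web Ranking dataset the estimator uses.
BASE_CTR = {1: 0.32, 2: 0.16, 3: 0.10, 4: 0.07, 5: 0.05,
            6: 0.04, 7: 0.03, 8: 0.025, 9: 0.02, 10: 0.018}

# Assumed dampening for SERP features sitting above the organic results.
FEATURE_MULTIPLIER = {
    "ai_overview": 0.6,
    "featured_snippet": 0.7,
    "ads_top": 0.85,
}

def expected_clicks(position: int, monthly_volume: int, features=()) -> float:
    """Expected monthly clicks for a ranking at `position`."""
    ctr = BASE_CTR.get(position, 0.01)  # assumed floor beyond position 10
    for f in features:
        ctr *= FEATURE_MULTIPLIER.get(f, 1.0)
    return monthly_volume * ctr
```

With these placeholder numbers, position one on 1,000 monthly searches yields around 320 clicks on a clean SERP and roughly 190 underneath an AI Overview, which is the gap the paragraph above is describing.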
For robots.txt work, paste the actual file from the live site, not the version in your repository. The two drift more often than people admit. The Robots.txt Tester follows Google's published spec, including the longest-match rule that catches most teams out.
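The longest-match rule is simple to state and easy to get wrong. A minimal sketch of the precedence logic for a single user-agent group, ignoring the wildcard and `$` anchor handling the full spec also requires:

```python
def is_allowed(rules, path: str) -> bool:
    """Minimal sketch of Google's longest-match precedence.

    `rules` is a list of ("allow" | "disallow", path_prefix) pairs
    from one user-agent group. The most specific (longest) matching
    rule wins; on a length tie, the less restrictive (allow) rule
    wins. Wildcards and `$` anchors are out of scope for this sketch.
    """
    best_len, allowed = -1, True  # no matching rule means allowed
    for kind, prefix in rules:
        if not prefix or not path.startswith(prefix):
            continue
        if len(prefix) > best_len or (len(prefix) == best_len and kind == "allow"):
            best_len, allowed = len(prefix), (kind == "allow")
    return allowed
```

This is the shape of the mistake the paragraph mentions: `Disallow: /shop` does not block `/shop/sale/` if a longer `Allow: /shop/sale` rule exists, because the longer rule wins regardless of the order the lines appear in the file.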
Common mistakes
The first is treating keyword density as a target. It is a diagnostic, nothing more. If a piece of content has a keyword density that looks weird (either far too low for the topic, or unnaturally high), that is a flag to read the copy and check that it sounds like a person wrote it. There is no magic percentage. Anyone telling you to hit 2.4% is making it up.
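If you want the diagnostic without the tool, the calculation is trivial. A sketch with deliberately crude tokenisation, and no claim whatsoever about what a "good" number is:

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Share of the text's words accounted for by occurrences of `phrase`.

    Supports multi-word phrases via a sliding window. Tokenisation is
    deliberately crude (lowercase runs of word characters); this is a
    diagnostic, not a target.
    """
    words = re.findall(r"[a-z0-9']+", text.lower())
    target = re.findall(r"[a-z0-9']+", phrase.lower())
    n = len(target)
    if not words or n == 0:
        return 0.0
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == target)
    return hits * n / len(words)
```

Use it the way the paragraph describes: if the number is far out of line with comparable pages on the topic, read the copy aloud; the fix is editing, not chasing a percentage.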
The second is writing meta descriptions for a length limit instead of a click. Google rewrites descriptions on roughly 60-70% of queries anyway. Spending 20 minutes squeezing a description to exactly 156 characters is a worse use of time than writing two genuinely different descriptions and seeing which one earns better in the field. Use the length checker to make sure you are not getting truncated, then move on.
The third, and the one that costs the most search traffic in aggregate, is dirty URL slugs. Trailing dates, duplicate words, stop words like "and" and "the", uppercase characters, accents that get URL-encoded into noise. Every one of those is a small drag, and they compound across a site of any size. The URL Slug Generator exists because most CMS slug fields produce something usable but not quite right, and it is faster to fix it on the way in than to refactor a thousand URLs later.
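The usual cleanup rules fit in a few lines. A sketch with an assumed (and deliberately partial) stop-word list; this is the general shape of the rules, not the generator's exact algorithm:

```python
import re
import unicodedata

# Assumed, partial stop-word list for illustration.
STOP_WORDS = {"a", "an", "and", "the", "of", "to", "in", "for", "on"}

def slugify(title: str, max_words: int = 5) -> str:
    """Title -> clean, ASCII-safe, hyphen-separated slug.

    Transliterates accents, lowercases, strips stop words and
    four-digit years, de-duplicates words, and caps the length.
    """
    # NFKD decomposition splits accented characters into base + mark;
    # the ASCII encode then drops the marks ("café" -> "cafe").
    text = unicodedata.normalize("NFKD", title).encode("ascii", "ignore").decode()
    words = re.findall(r"[a-z0-9]+", text.lower())
    words = [w for w in words
             if w not in STOP_WORDS and not re.fullmatch(r"(19|20)\d{2}", w)]
    seen, out = set(), []          # drop duplicate words, keep order
    for w in words:
        if w not in seen:
            seen.add(w)
            out.append(w)
    return "-".join(out[:max_words])
```

Every rule in there maps to one of the drags listed above: the year strip handles dates that go stale, the NFKD pass handles accents that would otherwise be URL-encoded into noise, and the de-duplication handles CMS fields that repeat the site name.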
This category will keep growing into the genuinely useful corners: a hreflang validator, a canonical-tag conflict checker, a redirect-chain tester, an internal-link distribution viewer. If you only have time to bookmark one tool from this hub, make it the SERP Snippet Preview Tool. It is the difference between a click and a missed one.
-
URL Slug Generator
Turn any heading into a clean, ASCII-safe, hyphen-separated URL slug. Optional stop-word stripping and length cap. Underscore or hyphen as separator.
-
Meta Length Checker
Check SEO title and meta description against Google's display limits in characters and approximate pixel width. Live feedback, targets called out.
-
Keyword Density Calculator
Paste an article or page and see word counts plus 1-, 2- and 3-gram keyword density tables. Optional stop-word filter, runs in your browser.
-
Robots.txt Tester
Paste a robots.txt and check whether a URL is allowed or blocked for Googlebot, Bingbot or any user-agent. Follows Google's spec, shows the winning rule.
-
CTR by SERP Position Estimator
Estimate clicks from a Google ranking. Pick a position 1 to 20, set monthly volume and SERP features, and see expected click-through rate based on Advanced Web Ranking 2024 data.
-
SERP Snippet Preview Tool
Preview how a page title, meta description and URL render in Google's desktop and mobile search results. Truncation calculated by pixel width, not character count.
Frequently asked questions
How long should a meta title and description be?
Google truncates titles at around 60 characters (roughly 580 pixels in desktop SERPs) and meta descriptions at around 160. Aim for 50 to 60 characters on titles and 140 to 160 on descriptions to give yourself a buffer. The Meta Length Checker shows live character and pixel counts as you type.
Why does my page get indexed but not rank?
Indexed means Google has it in its database. Ranking requires beating the existing pages on the same query, which depends on relevance, content depth, links pointing in, site authority and a dozen smaller signals. New pages on a new site can take six to twelve months to rank for anything competitive even when the page itself is genuinely good.
What is keyword cannibalisation and how do I spot it?
Cannibalisation is two or more of your own pages competing for the same search query. Google ends up rotating which one shows, neither ranks as well as a single consolidated page would, and CTR falls. Spot it by looking at the Search Console Performance report filtered by query: if multiple URLs are getting impressions for the same term, you have a candidate.
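If you export the Performance report, the grouping is a few lines of code. A sketch assuming rows shaped as (query, url, impressions) tuples, with an arbitrary impressions threshold you should tune to your site's traffic:

```python
from collections import defaultdict

def cannibalisation_candidates(rows, min_impressions=10):
    """Flag queries where more than one of your URLs earns impressions.

    `rows` mimics a Search Console Performance export as
    (query, url, impressions) tuples. The threshold filters out
    one-off impressions that are noise, not competition; both the
    row shape and the default threshold are assumptions.
    """
    by_query = defaultdict(dict)
    for query, url, impressions in rows:
        if impressions >= min_impressions:
            by_query[query][url] = by_query[query].get(url, 0) + impressions
    # A query with two or more URLs above the threshold is a candidate.
    return {q: urls for q, urls in by_query.items() if len(urls) > 1}
```

A flagged query is a candidate, not a verdict: two URLs can legitimately share a term (a product page and a support article, say). The fix, when it is real cannibalisation, is usually consolidation into one page plus a redirect.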
How do I write a URL slug that ranks?
Short, lowercase, hyphens between words, the primary keyword included once, no stop words, no dates that will go stale. Three to five words is the sweet spot. The URL Slug Generator takes any title, produces a clean slug and validates it against the standard rules.
What does the SERP snippet preview actually predict?
It predicts what Google's listing will look like, given your current title and description. It does not predict whether Google will use them: Google rewrites titles on roughly 60% of pages and descriptions on a similar share, especially when the on-page text fits the query better than the meta. Use the SERP Snippet Preview to make sure your draft fits, then expect Google to remix it sometimes.
Why SEO tools that don't upload your text
Most title-tag and slug tools online do the same thing: send your input to a server, log it, save it, maybe train a model on it. For an SEO-agency draft or a client's unpublished page, that's the wrong answer. These tools run in your browser and nothing leaves your device. Paste freely.
The category is new and will expand. Next up: a hreflang validator, a canonical-tag conflict checker, a redirect-chain tester and an internal-link distribution viewer.