# What Is a Good Regex for URL Slug Validation?
**Pattern:** `^[a-z0-9]+(?:-[a-z0-9]+)*$`

A URL slug is the human-readable portion of a URL path, typically created from a page title. Good slugs use only lowercase letters, digits, and hyphens. They should not start or end with a hyphen, and should not contain consecutive hyphens.
## Breaking Down the Pattern
| Part | Meaning |
|---|---|
| `^` | Start of string |
| `[a-z0-9]+` | One or more lowercase letters or digits |
| `(?:-[a-z0-9]+)*` | Zero or more groups of a hyphen followed by one or more lowercase letters/digits |
| `$` | End of string |
## Test Cases
| Input | Match? | Note |
|---|---|---|
| `hello-world` | Yes | Valid format |
| `my-blog-post-2026` | Yes | Valid format |
| `singleword` | Yes | Valid format |
| `Hello-World` | No | Contains uppercase letters |
| `-leading-hyphen` | No | Starts with a hyphen |
## Usage Examples
### JavaScript

```javascript
const pattern = /^[a-z0-9]+(?:-[a-z0-9]+)*$/;
pattern.test('hello-world'); // true
pattern.test('Hello-World'); // false
```
### Python

```python
import re

pattern = r'^[a-z0-9]+(?:-[a-z0-9]+)*$'
bool(re.match(pattern, 'hello-world'))  # True
bool(re.match(pattern, 'Hello-World'))  # False
```
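One Python-specific caveat worth noting: `$` also matches just before a trailing newline, so `re.match` with this pattern accepts `'hello-world\n'`. `re.fullmatch` requires the entire string to match and avoids that edge case:

```python
import re

pattern = r'[a-z0-9]+(?:-[a-z0-9]+)*'

# With ^...$ and re.match, a trailing newline still slips through:
bool(re.match('^' + pattern + '$', 'hello-world\n'))  # True

# re.fullmatch anchors to the whole string, newline included:
bool(re.fullmatch(pattern, 'hello-world\n'))  # False
bool(re.fullmatch(pattern, 'hello-world'))    # True
```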
## Common Pitfalls
- This pattern disallows consecutive hyphens (`my--slug`). Decide whether that is the behavior you want.
- Trailing hyphens (`slug-`) are also rejected, which is usually correct.
- The regex does not enforce a maximum length; add a separate length check (typically 60-80 characters).
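A minimal validator sketch that combines the regex with a length check; the function name `is_valid_slug` and the 60-character limit are illustrative choices, not part of any standard:

```python
import re

SLUG_RE = re.compile(r'[a-z0-9]+(?:-[a-z0-9]+)*')
MAX_SLUG_LENGTH = 60  # assumed limit; pick whatever fits your URLs


def is_valid_slug(slug: str) -> bool:
    """Return True if slug matches the pattern and fits the length limit."""
    return len(slug) <= MAX_SLUG_LENGTH and SLUG_RE.fullmatch(slug) is not None


is_valid_slug('my-blog-post-2026')  # True
is_valid_slug('my--slug')           # False (consecutive hyphens)
is_valid_slug('a' * 61)             # False (too long)
```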
## Try It Yourself
Test this regex with our Regex Tester.
## Frequently Asked Questions
### How do I generate a slug from a title?
Convert to lowercase, replace spaces with hyphens, remove special characters, collapse consecutive hyphens, and trim leading/trailing hyphens. Most web frameworks have a built-in slugify function.
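The steps above can be sketched in a few lines; this is a simplified illustration (a framework slugify function would also handle Unicode transliteration):

```python
import re


def slugify(title: str) -> str:
    """Illustrative slugify: lowercase, hyphenate, trim."""
    slug = title.lower()
    # Replace each run of non-alphanumeric characters with a single hyphen;
    # this also collapses consecutive hyphens.
    slug = re.sub(r'[^a-z0-9]+', '-', slug)
    # Trim leading/trailing hyphens.
    return slug.strip('-')


slugify('Hello, World!')  # 'hello-world'
```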
### Should slugs include stop words?
It depends. Removing stop words (the, a, is, of) makes slugs shorter but can reduce readability; both "best-practices-for-seo" and "best-practices-seo" are acceptable.
### What is the ideal slug length?
Keep slugs under 60 characters. Shorter slugs are easier to read, share, and remember. Google does not penalize long URLs but truncates them in search results.