Build a robots.txt file visually by adding user-agent rules, Allow and Disallow paths, an optional crawl-delay, and a sitemap URL — then download the finished file. The output updates in real time so you can see the effect of each rule as you add it. Useful when setting up a new site or refining crawler access without memorising the robots.txt syntax.
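For reference, a minimal robots.txt of the kind the builder produces might look like this (the paths and sitemap URL are illustrative, not defaults of the tool):

```
User-agent: *
Disallow: /admin/
Allow: /admin/public/
Crawl-delay: 10

Sitemap: https://example.com/sitemap.xml
```

Each `User-agent` group applies its `Allow`/`Disallow` rules to the named crawler; `*` matches any crawler without a more specific group.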
Verify a file's integrity by computing its hash and comparing it against an expected checksum. Supports SHA-256, SHA-1, MD5, and more. Runs entirely in your browser — nothing is uploaded. Free.
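The same check can be scripted. A minimal sketch in Python using the standard-library `hashlib` (the function name and chunk size are illustrative):

```python
import hashlib
import hmac

def verify_checksum(path: str, expected: str, algo: str = "sha256") -> bool:
    """Return True if the file's digest matches the expected hex checksum."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        # Stream in chunks so large files never load fully into memory
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    # Constant-time, case-insensitive comparison
    return hmac.compare_digest(h.hexdigest(), expected.lower())
```

Streaming matters for multi-gigabyte downloads, where reading the whole file at once would exhaust memory.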
Paste a JSON Web Token to decode and inspect its header, payload, and expiry — entirely in your browser.
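Under the hood, decoding a JWT is just splitting and base64url-decoding its first two segments — no secret key is needed to *read* a token (only to verify its signature). A sketch in Python with the standard library alone (function names are illustrative):

```python
import base64
import json

def decode_jwt(token: str):
    """Return (header, payload) dicts from a JWT, without verifying the signature."""
    # A JWT is three base64url segments: header.payload.signature
    header_b64, payload_b64, _signature = token.split(".")

    def b64url_decode(segment: str) -> bytes:
        # base64url strips '=' padding; restore it before decoding
        return base64.urlsafe_b64decode(segment + "=" * (-len(segment) % 4))

    header = json.loads(b64url_decode(header_b64))
    payload = json.loads(b64url_decode(payload_b64))
    return header, payload
```

The expiry shown by the tool comes from the payload's `exp` claim, a Unix timestamp in seconds.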
Test regular expressions against a string with live match highlighting, flag toggles, and a match details table.
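The match details table the tester shows can be reproduced programmatically. A minimal sketch in Python's `re` module (the function name and row shape are illustrative):

```python
import re

def match_details(pattern: str, text: str, flags: int = 0):
    """Return one row per match: its span, the matched text, and capture groups."""
    return [
        {"span": m.span(), "match": m.group(0), "groups": m.groups()}
        for m in re.finditer(pattern, text, flags)
    ]
```

For example, `match_details(r"(\d+)", "a1 b22 c333")` yields one row per run of digits, with each row's `span` giving the start and end offsets highlighted in the input.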