Generate Your Sitemap
API Code Examples
Four endpoints are available. The three standard endpoints are aliases that return the same sitemap.xml, so pick whichever name suits your workflow; the premium endpoint performs a deeper crawl.
Standard endpoints
```bash
# Any of these endpoint names work identically:
curl "https://seo.sup.sg/crawl?url=https://ai.dominance.sg" -o sitemap.xml
curl "https://seo.sup.sg/sitemap_generator?url=https://ai.dominance.sg" -o sitemap.xml
curl "https://seo.sup.sg/seo_sitemap_generator?url=https://ai.dominance.sg" -o sitemap.xml
```
```python
import requests

url = "https://ai.dominance.sg"
endpoints = [
    "https://seo.sup.sg/crawl",
    "https://seo.sup.sg/sitemap_generator",
    "https://seo.sup.sg/seo_sitemap_generator",
]

# Use any endpoint - they are identical
resp = requests.get(endpoints[0], params={"url": url}, timeout=120)
resp.raise_for_status()
with open("sitemap.xml", "wb") as f:
    f.write(resp.content)
print("Saved to sitemap.xml")
```
```javascript
import fs from "fs";

const target = "https://ai.dominance.sg";
const api = `https://seo.sup.sg/crawl?url=${encodeURIComponent(target)}`;

const res = await fetch(api);
if (!res.ok) throw new Error(await res.text());
const buf = Buffer.from(await res.arrayBuffer());
fs.writeFileSync("sitemap.xml", buf);
console.log("Saved to sitemap.xml");
```
```rust
// Cargo.toml:
//   reqwest = { version = "0.11", features = ["blocking"] }
//   urlencoding = "2"
use std::fs;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let target = "https://ai.dominance.sg";
    let api = format!(
        "https://seo.sup.sg/crawl?url={}",
        urlencoding::encode(target)
    );
    let body = reqwest::blocking::get(&api)?.bytes()?;
    fs::write("sitemap.xml", &body)?;
    println!("Saved to sitemap.xml");
    Ok(())
}
```
Premium endpoint
```bash
# Deeper crawl: depth 3, up to 300 pages
curl "https://seo.sup.sg/sitemap_premium?url=https://ai.dominance.sg" -o sitemap.xml
```
```python
import requests

resp = requests.get(
    "https://seo.sup.sg/sitemap_premium",
    params={"url": "https://ai.dominance.sg"},
    timeout=180,
)
resp.raise_for_status()
with open("sitemap.xml", "wb") as f:
    f.write(resp.content)
```
```javascript
import fs from "fs";

const res = await fetch(
  `https://seo.sup.sg/sitemap_premium?url=${encodeURIComponent("https://ai.dominance.sg")}`
);
if (!res.ok) throw new Error(await res.text());
const buf = Buffer.from(await res.arrayBuffer());
fs.writeFileSync("sitemap.xml", buf);
```
```rust
let api = format!(
    "https://seo.sup.sg/sitemap_premium?url={}",
    urlencoding::encode("https://ai.dominance.sg")
);
let body = reqwest::blocking::get(&api)?.bytes()?;
fs::write("sitemap.xml", &body)?;
```
API Documentation
Base URL: https://seo.sup.sg
Standard SEO sitemap.xml generation. Crawls up to 50 pages at depth 2.
| Parameter | Type | Required | Description |
|---|---|---|---|
| url | string | Yes | The target website URL to crawl. Must be a valid http/https URL. |
| Response | Code | Description |
|---|---|---|
| sitemap.xml | 200 | Valid XML sitemap, Content-Type: application/xml |
| error message | 400 | Missing or invalid/unsafe URL |
| rate limited | 429 | 5 requests per 5 minutes per IP, burst of 5 |
| error message | 500 | Internal server error during sitemap generation |
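Because the API rate-limits to 5 requests per 5 minutes per IP, scripted callers may want to retry on 429. The sketch below is an assumption on our part: the API does not document a Retry-After header, so it uses a simple linear backoff, and it takes the HTTP getter as a parameter so any client (e.g. `requests.get`) can be plugged in.

```python
import time


def fetch_with_retry(get, url, retries=3, backoff=60):
    """Call get(url), retrying on HTTP 429 with a linear backoff.

    `get` is any callable returning an object with a `status_code`
    attribute, e.g. functools.partial(requests.get, params={...}).
    The 60-second default backoff is a guess, not a documented value.
    """
    resp = get(url)
    for attempt in range(1, retries + 1):
        if resp.status_code != 429:
            break
        time.sleep(backoff * attempt)  # wait longer on each retry
        resp = get(url)
    return resp
```

Any non-429 response (including 400 and 500) is returned to the caller unchanged, so normal error handling still applies.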
Standard tree-like sitemap.xml generation. Crawls up to 50 pages at depth 2.
| Parameter | Type | Required | Description |
|---|---|---|---|
| url | string | Yes | The target website URL to crawl. Must be a valid http/https URL. |
| Response | Code | Description |
|---|---|---|
| sitemap.xml | 200 | Valid XML sitemap, Content-Type: application/xml |
| error message | 400 | Missing or invalid/unsafe URL |
| rate limited | 429 | 5 requests per 5 minutes per IP, burst of 5 |
| error message | 500 | Internal server error during sitemap generation |
Standard crawling sitemap.xml generation. Crawls up to 50 pages at depth 2.
| Parameter | Type | Required | Description |
|---|---|---|---|
| url | string | Yes | The target website URL to crawl. Must be a valid http/https URL. |
| Response | Code | Description |
|---|---|---|
| sitemap.xml | 200 | Valid XML sitemap, Content-Type: application/xml |
| error message | 400 | Missing or invalid/unsafe URL |
| rate limited | 429 | 5 requests per 5 minutes per IP, burst of 5 |
| error message | 500 | Internal server error during sitemap generation |
Premium sitemap generation. Crawls up to 300 pages at depth 3 for a more comprehensive sitemap.
| Parameter | Type | Required | Description |
|---|---|---|---|
| url | string | Yes | The target website URL to crawl. Must be a valid http/https URL. |
| Response | Code | Description |
|---|---|---|
| sitemap.xml | 200 | Valid XML sitemap, Content-Type: application/xml |
| error message | 400 | Missing or invalid/unsafe URL |
| rate limited | 429 | 5 requests per 1 minute per IP, burst of 5 |
| error message | 500 | Internal server error during sitemap generation |
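Since only the 200 response carries `Content-Type: application/xml`, it can be worth verifying that a saved body really is a sitemap before publishing it. This is a minimal sketch using the standard library; the sample document in the usage note is illustrative, not real API output.

```python
import xml.etree.ElementTree as ET


def is_valid_sitemap(body: bytes) -> bool:
    """Return True if `body` parses as XML with a <urlset> root element."""
    try:
        root = ET.fromstring(body)
    except ET.ParseError:
        return False
    # Namespaced tags look like "{http://www.sitemaps.org/...}urlset"
    return root.tag.endswith("urlset")
```

A plain-text error body such as `rate limited` fails to parse and is rejected, while a sitemaps.org `<urlset>` document passes.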
Notes
Only links within the same domain as the target URL are crawled. The generated sitemap follows the sitemaps.org protocol and is compatible with Google Search Console, Bing Webmaster Tools, and all major search engines. Crawl depth and page limits are in place to ensure fair usage.
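Because the output follows the sitemaps.org protocol, its `<loc>` entries can be listed with the standard library alone, for example to spot-check that every crawled URL stays on the target domain. The sample XML in the test is illustrative; real output will contain your site's URLs.

```python
import xml.etree.ElementTree as ET

# Namespace defined by the sitemaps.org protocol
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}


def sitemap_urls(xml_text: str) -> list[str]:
    """Return every <loc> URL from a sitemaps.org-style sitemap."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall("sm:url/sm:loc", NS)]
```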