Retrieve a paginated list of all your batch scrape jobs. To fetch a single batch instead, use `GET /api/batch/scrape/{batchId}`.
Query parameters:

| Parameter | Type | Default | Description |
|---|---|---|---|
| page | number | 1 | Page number (1-based) |
| limit | number | 20 | Jobs per page; maximum 50 |
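As a minimal sketch of assembling the list request, the helper below builds a page URL from these parameters. The base URL is a placeholder, and the collection path `/api/batch/scrape` (without a batch ID) is an assumption inferred from the single-batch endpoint above:

```python
from urllib.parse import urlencode

BASE_URL = "https://api.example.com"  # hypothetical base URL


def build_list_url(page: int = 1, limit: int = 20) -> str:
    """Return the URL for one page of batch scrape jobs.

    The /api/batch/scrape collection path is assumed, not confirmed
    by the docs; adjust to your deployment.
    """
    if not 1 <= limit <= 50:  # the API caps limit at 50
        raise ValueError("limit must be between 1 and 50")
    query = urlencode({"page": page, "limit": limit})
    return f"{BASE_URL}/api/batch/scrape?{query}"
```

The actual request must also carry the `x-api-key` header; see the error codes at the bottom of this page.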
Each job in the response includes the following fields:

| Field | Type | Description |
|---|---|---|
| uuid | string | Batch ID; use this with other endpoints |
| status | string | `pending`, `running`, `completed`, `failed`, or `cancelled` |
| totalUrls | number | How many URLs were in this batch |
| completedCount | number | Items that completed successfully |
| failedCount | number | Items that failed |
| outputFormat | string | `"json"` or `"markdown"` |
| scrapingChannel | string | `"api"` or `"playground"` |
| createdAt | string | ISO 8601 submission timestamp |
| finishedAt | string \| null | ISO 8601 completion timestamp, or `null` if still in progress |
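A small sketch of working with a job record shaped like the table above: computing progress from the counters and checking for a terminal status. The sample dictionary is illustrative, not a real API response:

```python
def progress(job: dict) -> float:
    """Fraction of URLs finished (successes plus failures) out of totalUrls."""
    done = job["completedCount"] + job["failedCount"]
    return done / job["totalUrls"] if job["totalUrls"] else 0.0


def is_terminal(job: dict) -> bool:
    """A job stops changing once it reaches one of these statuses."""
    return job["status"] in {"completed", "failed", "cancelled"}


# Illustrative job record using the fields from the table above.
job = {
    "uuid": "b1946ac9",
    "status": "running",
    "totalUrls": 10,
    "completedCount": 7,
    "failedCount": 1,
    "outputFormat": "json",
    "scrapingChannel": "api",
    "createdAt": "2024-01-01T00:00:00Z",
    "finishedAt": None,
}
```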
The response also includes pagination metadata:

| Field | Type | Description |
|---|---|---|
| page | number | Current page number |
| limit | number | Items per page |
| total | number | Total number of batch jobs for your account |
| totalPages | number | Total pages at the current limit |
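`totalPages` is simply `total` divided by `limit`, rounded up, so a client can walk every page by incrementing `page` until it reaches `totalPages`. A sketch, assuming the job array lives under a `jobs` key (the response field name for the array is not stated above, so treat it as a placeholder):

```python
import math


def total_pages(total: int, limit: int) -> int:
    """totalPages as reported by the API: ceil(total / limit)."""
    return math.ceil(total / limit)


def iter_all_jobs(fetch_page):
    """Yield every job across all pages.

    fetch_page(page) is a hypothetical callable returning one decoded
    response with a 'jobs' array (assumed key) and the pagination fields.
    """
    page = 1
    while True:
        resp = fetch_page(page)
        yield from resp["jobs"]
        if page >= resp["totalPages"]:
            break
        page += 1
```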
Error responses:

| Code | Reason |
|---|---|
| 401 | Missing `x-api-key` header |
| 403 | Invalid or expired API key |
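Both codes indicate an authentication problem rather than a transient failure, so retrying without fixing the key is pointless. A minimal sketch of surfacing them to the caller, using the reasons from the table above:

```python
class AuthError(Exception):
    """Raised for the authentication failures documented above."""


def check_auth(status_code: int) -> None:
    """Map the auth failure codes to exceptions; other codes pass through."""
    if status_code == 401:
        raise AuthError("Missing x-api-key header")
    if status_code == 403:
        raise AuthError("Invalid or expired API key")
```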