Schedules Endpoint

Set up recurring scraping jobs that run automatically on a schedule. Perfect for monitoring prices, tracking changes, or collecting data regularly.

POST /v1/schedules

Create a Schedule

Request

```bash
curl -X POST https://api.scrpy.co/v1/schedules \
  -H "Authorization: Bearer sk_live_xxxxx" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "Daily Price Check",
    "url": "https://store.com/product/123",
    "selectors": {
      "price": ".price",
      "availability": ".stock-status"
    },
    "cron": "0 9 * * *",
    "timezone": "America/New_York",
    "webhook": "https://yoursite.com/price-webhook"
  }'
```

Parameters

| Parameter | Type | Description |
| --- | --- | --- |
| `name` | string | Descriptive name for the schedule |
| `url` | string | URL to scrape |
| `selectors` | object | Map of field names to CSS selectors |
| `cron` | string | Cron expression for scheduling |
| `timezone` | string | IANA timezone (default: `UTC`) |
| `webhook` | string | URL to receive results |

Cron Expression Examples

| Cron | Description |
| --- | --- |
| `0 * * * *` | Every hour |
| `0 9 * * *` | Daily at 9 AM |
| `0 9 * * 1-5` | Weekdays at 9 AM |
| `0 0 * * 0` | Weekly on Sunday at midnight |
| `*/15 * * * *` | Every 15 minutes |
| `0 9,18 * * *` | Daily at 9 AM and 6 PM |
GET /v1/schedules

List Schedules

Retrieve all your scheduled scraping jobs.
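A minimal request sketch, following the same curl conventions as the create example above (the response shape is not documented here, so none is shown):

```bash
# List all schedules for the authenticated account
curl https://api.scrpy.co/v1/schedules \
  -H "Authorization: Bearer sk_live_xxxxx"
```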

PATCH /v1/schedules/:id

Update Schedule

Update an existing schedule's configuration.
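For example, changing a schedule's cron expression might look like the sketch below. This assumes the endpoint accepts a partial body containing only the fields to change, which is typical for PATCH but not confirmed here; `sched_abc123` is a placeholder ID:

```bash
# Move the job from daily to weekdays-only
curl -X PATCH https://api.scrpy.co/v1/schedules/sched_abc123 \
  -H "Authorization: Bearer sk_live_xxxxx" \
  -H "Content-Type: application/json" \
  -d '{"cron": "0 9 * * 1-5"}'
```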

POST /v1/schedules/:id/pause

Pause / Resume

Temporarily pause or resume a schedule without deleting it.

```bash
# Pause a schedule
curl -X POST https://api.scrpy.co/v1/schedules/:id/pause \
  -H "Authorization: Bearer sk_live_xxxxx"

# Resume a schedule
curl -X POST https://api.scrpy.co/v1/schedules/:id/resume \
  -H "Authorization: Bearer sk_live_xxxxx"
```
DELETE /v1/schedules/:id

Delete Schedule

Permanently delete a schedule.
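A request sketch in the same style as the other examples (`sched_abc123` is a placeholder ID; deletion is permanent, so there is no corresponding undo endpoint documented here):

```bash
# Permanently remove a schedule and stop its recurring runs
curl -X DELETE https://api.scrpy.co/v1/schedules/sched_abc123 \
  -H "Authorization: Bearer sk_live_xxxxx"
```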
