
Get started in two ways

Choose your preferred method: use the web interface for quick testing or integrate via API for production use.

Option 1: Web Interface (No Code)

Perfect for testing and one-off scraping tasks.

Step 1: Create your account

  1. Sign up: Go to app.manypi.com and create your free account. You’ll get 20,000 credits to start.
  2. Verify email: Check your inbox and verify your email address to activate your account.

Step 2: Create your first scraper

  1. Click “Create Scraper” in your dashboard
  2. Enter a target URL (e.g., https://example.com/product)
  3. Describe what data you want to extract in plain English:
    • “Extract the product title, price, and rating”
    • “Get all article titles and publication dates”
    • “Scrape company name, location, and job listings”
  4. Click “Generate Schema” - AI will create the data structure
  5. Review and adjust the schema if needed
  6. Click “Create Scraper”
Be specific in your prompt for better results. Include field names and data types you expect.
If you prefer full control, you can manually define your JSON schema:
{
  "type": "object",
  "properties": {
    "title": { "type": "string" },
    "price": { "type": "string" },
    "rating": { "type": "number" },
    "inStock": { "type": "boolean" }
  },
  "required": ["title", "price"]
}

Step 3: Test your scraper

  1. Run a test: Click “Run Now” on your scraper to test it with the target URL.
  2. View results: See the extracted data in clean, structured JSON format. Results appear instantly in your dashboard.
  3. Refine if needed: If the results aren’t perfect, edit your scraper’s prompt or schema and test again. Each test only costs credits when successful.
Success! You’ve created and tested your first scraper. Now you can run it anytime from the dashboard.

Option 2: API Integration (Production)

Perfect for automation, scheduled jobs, and production applications.

Step 1: Get your API key

  1. Navigate to settings: Go to Settings → API Keys in your dashboard.
  2. Generate key: Click “Create API Key”, give it a name, and copy the key.
Store it securely - you won’t be able to see it again!

Step 2: Create a scraper

Follow the same process as the web interface to create and test your scraper. You’ll need the scraper ID for API calls.
Find your scraper ID in the dashboard - it’s shown in the scraper details.

Step 3: Make your first API call

curl -X POST \
  'https://app.manypi.com/api/scrape/YOUR_SCRAPER_ID' \
  -H 'Authorization: Bearer YOUR_API_KEY' \
  -H 'Content-Type: application/json' \
  -d '{
    "url": "https://example.com/product/123"
  }'
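
If you’re calling the API from application code rather than the command line, the same request looks like this in Python. This is a minimal sketch using the requests library: the endpoint, headers, and body mirror the curl example above, while reading the key from a MANYPI_API_KEY environment variable is just a convention assumed here, not something the API requires.

import os
import requests

# Same endpoint and payload as the curl example above.
SCRAPER_ID = "YOUR_SCRAPER_ID"
API_KEY = os.environ["MANYPI_API_KEY"]  # assumes you exported your key as an environment variable

response = requests.post(
    f"https://app.manypi.com/api/scrape/{SCRAPER_ID}",
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    json={"url": "https://example.com/product/123"},
    timeout=60,
)

result = response.json()
print(result)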

Step 4: Handle the response

Response
{
  "success": true,
  "data": {
    "title": "Wireless Bluetooth Headphones",
    "price": "$79.99",
    "rating": 4.5,
    "inStock": true
  },
  "metadata": {
    "scraperId": "550e8400-e29b-41d4-a716-446655440000",
    "timestamp": "2024-01-15T10:30:00.000Z",
    "tokensUsed": 1500,
    "creditsCost": 1500,
    "creditsRemaining": 8500
  }
}
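Whatever language you use, check the success flag before touching data. The sketch below continues the Python example from Step 3: the fields it reads (success, data, metadata.creditsRemaining) come straight from the response shown above, while the low-credit threshold and the failure branch are assumptions you’d adapt to your own application.

result = response.json()

if result.get("success"):
    product = result["data"]
    print(product["title"], product["price"])

    # metadata lets you monitor usage as you go
    remaining = result["metadata"]["creditsRemaining"]
    if remaining < 5000:  # threshold is an arbitrary example
        print(f"Warning: only {remaining} credits left")
else:
    # The exact error shape isn't documented here; log the whole body and retry or alert as needed.
    print("Scrape failed:", result)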
You’re live! Your scraper is now integrated and ready for production use.

Common use cases

E-commerce monitoring

Track competitor prices, product availability, and reviews across multiple sites.

Lead generation

Extract company information, contact details, and job postings from business directories.

Content aggregation

Collect articles, blog posts, and news from multiple sources for your platform.

Market research

Gather product data, pricing trends, and customer sentiment at scale.

Next steps


Tips for success

Be specific about what data you want. Instead of “get product info”, say “extract product title, price in USD, star rating out of 5, and availability status”.
Test your scraper on 2-3 different pages from the same site to ensure it works consistently.
Always check the success field in API responses and handle errors appropriately in your code.
Keep an eye on your credit usage in the dashboard. Set up alerts to avoid running out during important jobs.
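
As the second tip suggests, it’s worth running the same scraper against a handful of pages from one site before relying on it. The sketch below reuses the Python request from Step 3 in a loop; the URLs are placeholders, and flagging empty title or price fields is just one way to spot inconsistent extractions.

import os
import requests

SCRAPER_ID = "YOUR_SCRAPER_ID"
API_KEY = os.environ["MANYPI_API_KEY"]

# A few pages from the same site (placeholders - swap in real URLs).
test_urls = [
    "https://example.com/product/123",
    "https://example.com/product/456",
    "https://example.com/product/789",
]

for url in test_urls:
    resp = requests.post(
        f"https://app.manypi.com/api/scrape/{SCRAPER_ID}",
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"url": url},
        timeout=60,
    )
    result = resp.json()
    if not result.get("success"):
        print(f"{url}: failed")
        continue
    # Flag pages where expected fields came back empty, so you can refine the prompt or schema.
    missing = [k for k in ("title", "price") if not result["data"].get(k)]
    print(f"{url}: ok" + (f", missing {missing}" if missing else ""))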

Need help? Contact us at [email protected] or check out our full documentation.