Crawl4AI is the #1 trending GitHub repository, actively maintained by a vibrant community.

1. Install Crawl4AI:

```bash
pip install crawl4ai
crawl4ai-setup  # Setup the browser
```

2. Run a simple web crawl:
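The crawl snippet for this step falls outside this excerpt. A minimal sketch of a simple crawl, assuming crawl4ai's async `AsyncWebCrawler` API (verify the exact names against the installed version):

```python
import asyncio
from crawl4ai import AsyncWebCrawler

async def main():
    # Launches the Playwright browser installed by crawl4ai-setup,
    # fetches the page, and returns a cleaned markdown rendering.
    async with AsyncWebCrawler() as crawler:
        result = await crawler.arun(url="https://example.com")
        print(result.markdown)

asyncio.run(main())
```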
For basic web crawling and scraping tasks:

```bash
pip install crawl4ai
crawl4ai-setup  # Setup the browser
```

By default, this will install the asynchronous version of Crawl4AI, using Playwright for web crawling.

👉 **Note**: When you install Crawl4AI, the `crawl4ai-setup` command should automatically install and set up Playwright. However, if you encounter any Playwright-related errors, you can manually install it using one of these methods:
1. Through the command line:
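The exact command is not shown in this excerpt; Playwright's own CLI handles browser installation, so a typical invocation would be:

```shell
playwright install
# or, if the entry point is not on your PATH:
python -m playwright install
```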

Crawl4AI is available as Docker images for easy deployment. You can either pull a pre-built image from Docker Hub or build locally.

```bash
VERSION=basic docker-compose --profile hub-amd64 up # Basic version
VERSION=all docker-compose --profile hub-amd64 up # Full ML/LLM support
VERSION=gpu docker-compose --profile hub-amd64 up # GPU support
```
### For ARM64 (M1/M2 Macs, ARM servers):

```bash
# Build and run locally
docker-compose --profile local-arm64 up

# Run from Docker Hub
VERSION=basic docker-compose --profile hub-arm64 up # Basic version
VERSION=all docker-compose --profile hub-arm64 up # Full ML/LLM support
VERSION=gpu docker-compose --profile hub-arm64 up # GPU support
```
Environment variables (optional):

```bash
# Create a .env file
CRAWL4AI_API_TOKEN=your_token
OPENAI_API_KEY=your_openai_key
CLAUDE_API_KEY=your_claude_key
```
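Services read these keys from the environment; a minimal stdlib-only sketch of loading such a `.env` file in Python (the `load_dotenv` helper here is hand-written for illustration, not part of crawl4ai):

```python
import os

def load_dotenv(path=".env"):
    """Parse simple KEY=VALUE lines and export them into os.environ."""
    loaded = {}
    with open(path) as fh:
        for raw in fh:
            line = raw.strip()
            # Skip blank lines and comments
            if not line or line.startswith("#"):
                continue
            key, sep, value = line.partition("=")
            if not sep:
                continue  # ignore malformed lines with no '='
            loaded[key.strip()] = value.strip()
    os.environ.update(loaded)
    return loaded
```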
The compose file includes:

- Memory management (4GB limit, 1GB reserved)
- Shared memory volume for browser support
- Health checks
- Auto-restart policy
- All necessary port mappings
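These bullet points map onto compose directives roughly as follows (a hand-written sketch, not the project's actual compose file; the service name, image tag, and intervals are placeholders):

```yaml
services:
  crawl4ai:
    image: unclecode/crawl4ai:basic   # placeholder tag
    ports:
      - "11235:11235"                 # port mapping
    restart: unless-stopped           # auto-restart policy
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:11235/health"]
      interval: 30s
    volumes:
      - /dev/shm:/dev/shm             # shared memory for the browser
    deploy:
      resources:
        limits:
          memory: 4G                  # 4GB limit
        reservations:
          memory: 1G                  # 1GB reserved
```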
Test the installation:

```bash
curl http://localhost:11235/health
```
</details>
---
### Quick Test

```python
response = requests.post(
    ...  # request payload elided in this excerpt
)
task_id = response.json()["task_id"]

# Continue polling until the task is complete (status="completed")
result = requests.get(f"http://localhost:11235/task/{task_id}")
```
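The final `requests.get` above fetches the task state once; in practice you keep polling until the status is `"completed"`. A small generic helper sketch (the function name and the `"status"` response key shape are illustrative, not part of the API):

```python
import time

def poll_until_complete(fetch_status, timeout=60.0, interval=1.0):
    """Call fetch_status() repeatedly until it reports completion.

    fetch_status must return a dict with a "status" key, e.g. the
    JSON body of GET /task/{task_id}. Raises TimeoutError on timeout.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        data = fetch_status()
        if data.get("status") == "completed":
            return data
        time.sleep(interval)
    raise TimeoutError("task did not complete in time")
```

With the snippet above it would be used as `poll_until_complete(lambda: requests.get(f"http://localhost:11235/task/{task_id}").json())`.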

For more examples, see our [Docker Examples](https://github.com/unclecode/crawl4ai/blob/main/docs/examples/docker_example.py). For advanced configuration, environment variables, and usage examples, see our [Docker Deployment Guide](https://crawl4ai.com/mkdocs/basic/docker-deployment/).