Compare commits

...

7 Commits

Author SHA1 Message Date
Kunthawat Greethong
2d2a6d30a0 update EA 2026-01-08 13:21:40 +07:00
Kunthawat Greethong
92391f9d18 update code 2026-01-08 12:48:05 +07:00
Kunthawat Greethong
143f2567c7 update code and fix bugs 2026-01-08 12:30:31 +07:00
Kunthawat Greethong
e7487af624 feat(oi): improve csv loading with caching
Add price caching to prevent repeated file reads and improve performance.
Implement multi-path search for CSV files with fallback options. Add
comprehensive logging for CSV load success/failure states. Update dashboard
to display CSV loading status. Simplify scraper CSV output format and automate
file transfer to terminal MQL5 Files directory.
2026-01-08 11:49:48 +07:00
Kunthawat Greethong
b7c0e68fa8 refactor(oi): improve data extraction and consolidate documentation
- Fix MQL5 API usage in EA to use correct CopyRates and POSITION_TYPE enums
- Refactor scraper data extraction to use drop_duplicates for unique strikes
- Consolidate Windows setup guide into main README
- Add virtual environment batch files for easier setup and execution
- Simplify run_scraper.bat to focus on core execution
- Normalize lot calculation to use SymbolInfo.LotsStep()
2026-01-06 20:18:12 +07:00
Kunthawat Greethong
2e8e07ed17 refactor(oi): update scraper for new QuikStrike website structure
- Replace direct product URL navigation with fixed heatmap URL and UI product selection
- Implement cookie validation with automatic session cleanup
- Update login flow to use SSO authentication and new form selectors
- Improve data extraction with iframe context and better table parsing
- Add multiple fallback selectors for gold price scraping
- Enhance error handling, logging, and timeout management
2026-01-06 12:16:53 +07:00
Kunthawat Greethong
28a4546cd8 feat(oi): add open interest scraper module
add new oi_scraper directory for collecting open interest data
and update the main EA to integrate with the scraper functionality
2026-01-04 17:35:14 +07:00
14 changed files with 4492 additions and 0 deletions

BIN
.DS_Store vendored Normal file

Binary file not shown.

File diff suppressed because it is too large.

Binary file not shown.

File diff suppressed because it is too large.

22
oi_scraper/.env.example Normal file
View File

@@ -0,0 +1,22 @@
# CME Group QuikStrike Login Credentials
CME_USERNAME=your_username_here
CME_PASSWORD=your_password_here
CME_LOGIN_URL=https://login.cmegroup.com/sso/accountstatus/showAuth.action
# QuikStrike URL (fixed - always same page)
QUIKSTRIKE_URL=https://www.cmegroup.com/tools-information/quikstrike/open-interest-heatmap.html
# Gold Price Source (investing.com)
INVESTING_URL=https://www.investing.com/commodities/gold
# Output Settings
CSV_OUTPUT_PATH=./oi_data.csv
TOP_N_STRIKES=3
# Scraping Settings
HEADLESS=false # Set to true for production
TIMEOUT_SECONDS=30
RETRY_ATTEMPTS=3
# Logging
LOG_LEVEL=INFO # DEBUG, INFO, WARNING, ERROR

31
oi_scraper/.gitignore vendored Normal file
View File

@@ -0,0 +1,31 @@
# Python cache
__pycache__/
*.py[cod]
*$py.class
*.so
# Virtual environments
venv/
env/
ENV/
# Environment variables
.env
# Output files
*.csv
*.png
*.log
# Session data
cookies.json
# IDE
.vscode/
.idea/
*.swp
*.swo
# OS
.DS_Store
Thumbs.db

268
oi_scraper/README.md Normal file
View File

@@ -0,0 +1,268 @@
# CME OI Scraper
A Python scraper that extracts Open Interest data from CME Group QuikStrike and the current gold price from investing.com.
## What It Extracts
1. **OI Levels (from CME QuikStrike):**
- Top 3 CALL strikes by OI volume (unique strikes)
- Top 3 PUT strikes by OI volume (unique strikes)
2. **Gold Price (from investing.com):**
- Current gold futures price (e.g., 4476.50)
## Prerequisites
- Python 3.9 or higher
- CME Group QuikStrike account (free registration at https://www.cmegroup.com)
- Windows 10/11 (for batch files) or Linux/macOS
## Quick Start
### Windows
1. **Run one-time setup:**
```cmd
cd C:\Path\To\oi_scraper
setup_env.bat
```
2. **Run the scraper:**
```cmd
run_with_venv.bat
```
### Linux/macOS
1. **Setup:**
```bash
cd /path/to/oi_scraper
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
playwright install chromium
```
2. **Run:**
```bash
source venv/bin/activate
python main.py
```
## Configuration
### Edit `.env` File
Copy and edit the environment file:
```cmd
copy .env.example .env
notepad .env
```
Required settings:
```env
CME_USERNAME=your_cme_username
CME_PASSWORD=your_cme_password
```
Optional settings:
```env
# Number of top strikes to export (default: 3)
TOP_N_STRIKES=3
# Run browser without window (default: false)
HEADLESS=false
# Page timeout in seconds (default: 30)
TIMEOUT_SECONDS=30
# Output CSV path
CSV_OUTPUT_PATH=./oi_data.csv
# Logging level: DEBUG, INFO, WARNING, ERROR
LOG_LEVEL=INFO
```
## Output Format
The scraper exports to `oi_data.csv`:
```csv
Type,Strike,OI
CALL,4375.0,147
CALL,4450.0,173
CALL,4500.0,176
PUT,4435.0,49
PUT,4400.0,102
PUT,4515.0,150
Future,4467.8,0
```
The final `Future` row carries the current gold futures price scraped from investing.com; the trailing `0` is a placeholder in the OI column.
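For reference, here is a minimal sketch of how a downstream consumer could parse this file; it assumes only the simplified format shown above (the `read_oi_csv` function and default path are illustrative, not part of the project):
```python
import csv

def read_oi_csv(path="oi_data.csv"):
    """Split oi_data.csv into CALL levels, PUT levels, and the futures price."""
    calls, puts, future_price = [], [], 0.0
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):          # header: Type,Strike,OI
            if row["Type"] == "CALL":
                calls.append((float(row["Strike"]), int(row["OI"])))
            elif row["Type"] == "PUT":
                puts.append((float(row["Strike"]), int(row["OI"])))
            elif row["Type"] == "Future":
                future_price = float(row["Strike"])   # price lives in the Strike column
    return calls, puts, future_price

if __name__ == "__main__":
    print(read_oi_csv())
```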
## Session Persistence
The scraper saves login sessions to `cookies.json`:
- **First run:** Logs in with credentials, saves cookies
- **Subsequent runs:** Uses saved cookies if session is valid
- **Session expired:** Automatically re-logs in and saves new cookies
This makes scheduled runs faster and reduces login attempts to CME servers.
To force a fresh login:
```cmd
del cookies.json
```
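In outline, the decision order implemented in `main.py` is the following (a simplified sketch, not a drop-in function; the real `run_scraper()` also handles retries and saves cookies after a successful scrape):
```python
# Simplified sketch of the session-reuse flow in main.py
def ensure_cme_session(context, page):
    if load_cookies(context):           # cookies.json exists -> inject saved cookies
        if are_cookies_valid(page):     # OI table visible inside the iframe?
            return True                 # reuse the cached session
        delete_cookies()                # stale session -> drop cookies.json
    return login_to_cme(page)           # fall back to a fresh SSO login
```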
## Integration with EA
The EA reads OI data from the CSV file when configured as follows:
```mql5
input ENUM_OI_SOURCE InpOISource = OI_SOURCE_CSV_FILE;
```
Copy `oi_data.csv` to your MT5 `MQL5/Files` directory:
```
C:\Users\YourUsername\AppData\Roaming\MetaQuotes\Terminal\<Your_Terminal_ID>\MQL5\Files\oi_data.csv
```
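If you would rather automate this copy from Python than from a batch file, here is a minimal sketch (the terminal ID is a placeholder you must replace, and `copy_to_mt5` is illustrative, not part of the project):
```python
import shutil
from pathlib import Path

TERMINAL_ID = "YOUR_TERMINAL_ID"  # placeholder -- use your own terminal folder name
MT5_FILES = (
    Path.home()
    / "AppData/Roaming/MetaQuotes/Terminal"
    / TERMINAL_ID
    / "MQL5/Files"
)

def copy_to_mt5(csv_path="oi_data.csv"):
    """Copy the scraper output into the MT5 MQL5/Files directory."""
    MT5_FILES.mkdir(parents=True, exist_ok=True)
    dest = MT5_FILES / "oi_data.csv"
    shutil.copy2(csv_path, dest)
    return dest

if __name__ == "__main__":
    print(f"Copied to {copy_to_mt5()}")
```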
## Automatic Daily Scheduling
### Windows Task Scheduler
1. **Create scheduled task:**
- Open Task Scheduler (`taskschd.msc`)
- Click "Create Task"
2. **Configure General tab:**
- Name: `CME OI Scraper - Daily`
- ✅ Run whether user is logged on or not
- ✅ Run with highest privileges
3. **Configure Triggers tab:**
- New → On a schedule → Daily
- Start time: 9:00 AM (or your preferred time)
- ✅ Enabled
4. **Configure Actions tab:**
- Action: Start a program
- Program/script:
```
C:\Path\To\oi_scraper\run_scheduled.bat
```
- Start in:
```
C:\Path\To\oi_scraper
```
5. **Click OK to save**
### Linux/macOS (cron)
```bash
# Edit crontab
crontab -e
# Add line to run every day at 9 AM
0 9 * * * cd /path/to/oi_scraper && /path/to/venv/bin/python main.py
```
## Batch Files Reference
| File | Purpose |
|------|---------|
| `setup_env.bat` | One-time setup (creates virtual environment) |
| `run_with_venv.bat` | Manual run with visible window |
| `run_scheduled.bat` | For Task Scheduler (no window, no pause) |
## Troubleshooting
### Module Not Found Errors
**Error:** `ModuleNotFoundError: No module named 'playwright'`
**Solution:**
```cmd
run_with_venv.bat
```
The virtual environment ensures all dependencies are isolated.
### Login Fails
- Verify credentials in `.env`
- Check if CME requires 2FA (manual intervention needed)
- Set `HEADLESS=false` to see browser activity
- Check screenshots: `login_failed.png`, `login_error.png`
### No Data Extracted
- Check if CME table structure changed
- Increase `TIMEOUT_SECONDS=60` in `.env`
- Check logs for errors
- Screenshot saved as `login_debug.png`
### Browser Issues
```cmd
# Reinstall Chromium
python -m playwright install chromium
```
### Session Expires Frequently
Delete cookies to force fresh login:
```cmd
del cookies.json
```
### Check Python Path Issues (Windows)
```cmd
# Check which Python is being used
where python
# Use Python launcher
py -3 main.py
# Or use the virtual environment
run_with_venv.bat
```
## Finding Product IDs
To scrape other instruments (Silver, Crude Oil, etc.):
1. Visit CME QuikStrike OI Heatmap
2. Login to your CME account
3. Select a product from the dropdown
4. The URL updates with the `pid` parameter (see the sketch after this list)
5. Note: This scraper is configured for Gold by default
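As an illustration, the `pid` value can be pulled out of such a URL with the standard library (the URL and `XXXX` value below are hypothetical):
```python
from urllib.parse import urlparse, parse_qs

# Hypothetical URL -- the real pid appears after selecting a product in QuikStrike.
url = "https://www.cmegroup.com/tools-information/quikstrike/open-interest-heatmap.html?pid=XXXX"
pid = parse_qs(urlparse(url).query).get("pid", [None])[0]
print(pid)  # -> "XXXX"
```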
## Notes
- Targets the OI Heatmap table structure
- Exports the top N unique strikes by OI volume (see the sketch after this list)
- Uses session cookies for faster subsequent runs
- CME sessions typically last several days to weeks
- Virtual environment recommended to avoid Python path conflicts
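For clarity, the strike selection mentioned above boils down to these pandas steps (the same calls used in `extract_oi_data`, shown here with made-up rows):
```python
import pandas as pd

levels = pd.DataFrame([
    {"Type": "CALL", "Strike": 4500.0, "OI": 176},
    {"Type": "CALL", "Strike": 4500.0, "OI": 90},   # duplicate strike from another column
    {"Type": "CALL", "Strike": 4450.0, "OI": 173},
    {"Type": "CALL", "Strike": 4375.0, "OI": 147},
    {"Type": "CALL", "Strike": 4300.0, "OI": 12},
])

TOP_N_STRIKES = 3
top = (
    levels.drop_duplicates(subset="Strike", keep="first")  # one row per strike
          .sort_values("OI")                               # ascending by OI
          .tail(TOP_N_STRIKES)                             # keep the N largest
)
print(top)
```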
## Files
```
oi_scraper/
├── main.py # Main scraper script
├── requirements.txt # Python dependencies
├── .env.example # Environment template
├── .env # Your credentials (create from example)
├── setup_env.bat # Windows: Create virtual environment
├── run_with_venv.bat # Windows: Manual run
├── run_scheduled.bat # Windows: Task Scheduler run
├── oi_data.csv # Output file (generated)
├── cookies.json # Session cookies (generated)
└── scraper.log # Log file (generated)
```

376
oi_scraper/main.py Normal file
View File

@@ -0,0 +1,376 @@
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
CME OI Scraper - Extracts Open Interest data from CME QuikStrike and gold price from investing.com
Usage: python main.py
Requires: pip install -r requirements.txt
"""
import os
import logging
import json
from datetime import datetime
from playwright.sync_api import sync_playwright
from dotenv import load_dotenv
import pandas as pd
load_dotenv()
# Configuration
CME_USERNAME = os.getenv("CME_USERNAME")
CME_PASSWORD = os.getenv("CME_PASSWORD")
CME_LOGIN_URL = os.getenv(
"CME_LOGIN_URL", "https://login.cmegroup.com/sso/accountstatus/showAuth.action"
)
QUIKSTRIKE_URL = (
"https://www.cmegroup.com/tools-information/quikstrike/open-interest-heatmap.html"
)
INVESTING_URL = os.getenv("INVESTING_URL", "https://www.investing.com/commodities/gold")
CSV_OUTPUT_PATH = os.getenv("CSV_OUTPUT_PATH", "./oi_data.csv")
TOP_N_STRIKES = int(os.getenv("TOP_N_STRIKES", "3"))
HEADLESS = os.getenv("HEADLESS", "false").lower() == "true"
TIMEOUT_SECONDS = int(os.getenv("TIMEOUT_SECONDS", "30"))
RETRY_ATTEMPTS = int(os.getenv("RETRY_ATTEMPTS", "3"))
LOG_LEVEL = os.getenv("LOG_LEVEL", "INFO")
COOKIE_FILE = "./cookies.json"
logging.basicConfig(level=getattr(logging, LOG_LEVEL))
logger = logging.getLogger(__name__)
def save_cookies(context):
cookies = context.cookies()
with open(COOKIE_FILE, "w") as f:
json.dump(cookies, f)
logger.info("Cookies saved to file")
def load_cookies(context):
if os.path.exists(COOKIE_FILE):
with open(COOKIE_FILE, "r") as f:
cookies = json.load(f)
context.add_cookies(cookies)
logger.info("Cookies loaded from file")
return True
return False
def delete_cookies():
if os.path.exists(COOKIE_FILE):
os.remove(COOKIE_FILE)
logger.info("Cookies deleted")
def are_cookies_valid(page):
logger.info("Checking if cookies are valid...")
page.goto(QUIKSTRIKE_URL, timeout=TIMEOUT_SECONDS * 1000)
page.wait_for_load_state("domcontentloaded", timeout=TIMEOUT_SECONDS * 1000)
page.wait_for_timeout(3000)
try:
frame = page.frame_locator("iframe.cmeIframe").first
page.wait_for_timeout(5000)
table_exists = frame.locator("table.grid-thm").count() > 0
if table_exists:
logger.info("Cookies are valid - OI table found in iframe")
else:
logger.info("Cookies may be expired - no OI table found in iframe")
return table_exists
except Exception as e:
logger.info(f"Cookies expired - error checking iframe: {e}")
return False
def login_to_cme(page):
logger.info("Attempting to login to CME QuikStrike...")
page.goto(CME_LOGIN_URL, timeout=TIMEOUT_SECONDS * 1000)
page.wait_for_load_state("domcontentloaded", timeout=TIMEOUT_SECONDS * 1000)
page.wait_for_timeout(1000)
try:
page.fill("#user", CME_USERNAME)
page.fill("#pwd", CME_PASSWORD)
page.click("#loginBtn")
logger.info("Waiting for login redirect...")
page.wait_for_timeout(30000)
current_url = page.url.lower()
logger.info(f"Current URL after login attempt: {current_url}")
if "login" in current_url or "sso" in current_url:
logger.error("Login may have failed - still on SSO/login page")
page.screenshot(path="login_failed.png")
return False
logger.info("Login successful")
page.screenshot(path="login_success.png")
return True
except Exception as e:
logger.error(f"Login error: {e}")
page.screenshot(path="login_error.png")
return False
def select_gold_product(page):
logger.info("Selecting Gold product...")
logger.info("Switching to iframe context...")
frame = page.frame_locator("iframe.cmeIframe").first
page.wait_for_timeout(5000)
logger.info("Step 1: Clicking dropdown arrow...")
frame.locator("#ctl11_hlProductArrow").click()
page.wait_for_timeout(1000)
logger.info("Step 2: Clicking Metals...")
frame.locator('a[groupid="6"]:has-text("Metals")').click()
page.wait_for_timeout(500)
logger.info("Step 3: Clicking Precious Metals...")
frame.locator('a[familyid="6"]:has-text("Precious Metals")').click()
page.wait_for_timeout(500)
logger.info("Step 4: Clicking Gold...")
frame.locator('a[title="Gold"]').click()
logger.info("Waiting for Gold data to load...")
page.wait_for_timeout(10000)
logger.info("Gold product selected")
def navigate_to_oi_heatmap(page):
logger.info(f"Navigating to QuikStrike: {QUIKSTRIKE_URL}")
page.goto(QUIKSTRIKE_URL, timeout=TIMEOUT_SECONDS * 1000)
page.wait_for_load_state("domcontentloaded", timeout=TIMEOUT_SECONDS * 1000)
page.wait_for_timeout(5000)
select_gold_product(page)
def extract_oi_data(page):
logger.info("Extracting OI data from Gold matrix table...")
logger.info("Switching to iframe context...")
frame = page.frame_locator("iframe.cmeIframe").first
page.wait_for_timeout(8000)
logger.info("Looking for table.grid-thm...")
call_levels = []
put_levels = []
table = frame.locator("table.grid-thm").first
table.wait_for(state="visible", timeout=10000)
logger.info("Table found, waiting for data...")
rows = table.locator("tbody tr").all()
logger.info(f"Found {len(rows)} rows in table")
for row in rows:
try:
cells = row.locator("td").all()
if len(cells) < 3:
continue
strike = None
for cell in cells:
text = cell.text_content().strip()
if text and text.replace(".", "").isdigit():
strike = float(text)
break
if strike is None:
continue
number_cells = row.locator("td.number").all()
logger.debug(f"Strike {strike}: found {len(number_cells)} number cells")
for i in range(0, len(number_cells), 2):
if i + 1 >= len(number_cells):
break
call_cell = number_cells[i]
put_cell = number_cells[i + 1]
call_text = call_cell.text_content().strip()
put_text = put_cell.text_content().strip()
if call_text and call_text != "-":
call_oi = int(call_text.replace(",", ""))
call_levels.append(
{"Type": "CALL", "Strike": strike, "OI": call_oi}
)
if put_text and put_text != "-":
put_oi = int(put_text.replace(",", ""))
put_levels.append({"Type": "PUT", "Strike": strike, "OI": put_oi})
except Exception as e:
logger.warning(f"Error parsing row: {e}")
continue
logger.info(
f"Extracted {len(call_levels)} CALL levels, {len(put_levels)} PUT levels"
)
if call_levels:
call_df = pd.DataFrame(call_levels)
call_df = call_df.drop_duplicates(subset="Strike", keep="first")
call_df = call_df.sort_values("OI")
call_df = call_df.tail(TOP_N_STRIKES)
call_df["Type"] = "CALL"
else:
call_df = pd.DataFrame()
if put_levels:
put_df = pd.DataFrame(put_levels)
put_df = put_df.drop_duplicates(subset="Strike", keep="first")
put_df = put_df.sort_values("OI")
put_df = put_df.tail(TOP_N_STRIKES)
put_df["Type"] = "PUT"
else:
put_df = pd.DataFrame()
result_df = pd.concat([call_df, put_df])
result_df = result_df[["Type", "Strike", "OI"]]
logger.info(f"Final top {TOP_N_STRIKES} unique strikes for CALL and PUT extracted")
return result_df
def scrape_investing_gold_price(page):
logger.info(f"Scraping gold price from: {INVESTING_URL}")
try:
page.goto(INVESTING_URL, timeout=60000, wait_until="domcontentloaded")
logger.info(f"Page loaded, title: {page.title()}")
page.wait_for_timeout(5000)
logger.info("Waited for JavaScript to render")
selectors = [
'div[data-test="instrument-price-last"]',
".text-5xl\\/9.font-bold.text-\\[#232526\\]",
'[data-test="instrument-price-last"]',
".text-5xl\\/9",
]
price = 0.0
for selector in selectors:
try:
locator = page.locator(selector)
if locator.count() > 0:
locator.first.wait_for(state="visible", timeout=10000)
price_text = locator.first.text_content().strip()
if price_text:
price_text = price_text.replace(",", "")
price = float(price_text)
logger.info(f"Extracted gold price ({selector}): {price}")
break
except Exception as e:
logger.debug(f"Selector {selector} failed: {e}")
continue
if price == 0.0:
logger.warning("Could not extract gold price, all selectors failed")
return price
except Exception as e:
logger.error(f"Error scraping gold price: {e}")
return 0.0
def export_to_csv(df, future_price=0.0):
output_path = CSV_OUTPUT_PATH
with open(output_path, "w", encoding="utf-8") as f:
f.write("Type,Strike,OI\n")
call_df = df[df["Type"] == "CALL"] if len(df) > 0 else pd.DataFrame()
put_df = df[df["Type"] == "PUT"] if len(df) > 0 else pd.DataFrame()
if len(call_df) > 0:
for _, row in call_df.iterrows():
f.write(f"CALL,{row['Strike']:.1f},{row['OI']}\n")
if len(put_df) > 0:
for _, row in put_df.iterrows():
f.write(f"PUT,{row['Strike']:.1f},{row['OI']}\n")
f.write(f"Future,{future_price},0\n")
logger.info(f"Exported OI data and price to {output_path}")
def run_scraper():
if not CME_USERNAME or not CME_PASSWORD:
logger.error("Missing CME_USERNAME or CME_PASSWORD in .env file")
return
future_price = 0.0
for attempt in range(RETRY_ATTEMPTS):
try:
with sync_playwright() as p:
browser = p.chromium.launch(headless=HEADLESS)
context = browser.new_context()
page = context.new_page()
cookies_loaded = load_cookies(context)
cookies_valid = False
if cookies_loaded:
cookies_valid = are_cookies_valid(page)
if cookies_valid:
logger.info("Using cached session")
else:
if cookies_loaded:
logger.info("Cookies expired, deleting and re-logging in...")
delete_cookies()
logger.info("Logging in to CME...")
if not login_to_cme(page):
browser.close()
if attempt < RETRY_ATTEMPTS - 1:
logger.info(
f"Retrying... Attempt {attempt + 2}/{RETRY_ATTEMPTS}"
)
continue
else:
logger.error("All login attempts failed")
return
navigate_to_oi_heatmap(page)
oi_data = extract_oi_data(page)
save_cookies(context)
if len(oi_data) > 0:
logger.info("Extracting gold price from investing.com...")
future_price = scrape_investing_gold_price(page)
logger.info(f"Gold price extracted: {future_price}")
export_to_csv(oi_data, future_price)
else:
logger.warning("No OI data extracted")
browser.close()
break
except Exception as e:
logger.error(f"Scraper error (attempt {attempt + 1}): {e}")
if attempt < RETRY_ATTEMPTS - 1:
logger.info(f"Retrying... Attempt {attempt + 2}/{RETRY_ATTEMPTS}")
else:
logger.error("All attempts failed")
if __name__ == "__main__":
run_scraper()

3
oi_scraper/requirements.txt Normal file
View File

@@ -0,0 +1,3 @@
playwright>=1.40.0
python-dotenv>=1.0.0
pandas>=2.2.0

13
oi_scraper/run_scheduled.bat Normal file
View File

@@ -0,0 +1,13 @@
@echo off
REM ==========================================
REM CME OI Scraper - Scheduled Task Version
REM For use with Windows Task Scheduler
REM ==========================================
REM Navigate to script directory
cd /d %~dp0
REM Activate virtual environment and run scraper (no pause)
call venv\Scripts\activate.bat
python main.py
exit %ERRORLEVEL%

View File

@@ -0,0 +1,21 @@
@echo off
REM ==========================================
REM CME OI Scraper - Run with Virtual Environment
REM ==========================================
REM Navigate to script directory
cd /d %~dp0
echo ==========================================
echo CME OI Scraper
echo ==========================================
REM Activate virtual environment
call venv\Scripts\activate.bat
REM Run Python scraper
python main.py
REM Pause for 5 seconds if running manually (not scheduled)
if "%1"=="--scheduled" goto :eof
timeout /t 5

View File

@@ -0,0 +1,77 @@
# CME OI Scraper - PowerShell Script
# Copy this file to: run_scraper.ps1
# ==========================================
# Configuration
# ==========================================
$scriptPath = "C:\Users\YourUsername\Gitea\MeanRevisionEA\oi_scraper"
$logFile = "$scriptPath\scraper.log"
$csvFile = "$scriptPath\oi_data.csv"
$mt5Path = "C:\Users\YourUsername\AppData\Roaming\MetaQuotes\Terminal\[Your_Terminal_ID]\MQL5\Files\oi_data.csv"
# ==========================================
# Helper Functions
# ==========================================
function Write-Log {
param([string]$message)
$timestamp = Get-Date -Format "yyyy-MM-dd HH:mm:ss"
$logEntry = "[$timestamp] $message"
Write-Output $logEntry | Add-Content $logFile
Write-Host $logEntry
}
# ==========================================
# Main Script
# ==========================================
# Navigate to script directory
cd $scriptPath
Write-Log "=========================================="
Write-Log "CME OI Scraper - Daily Update"
Write-Log "=========================================="
try {
# Run Python scraper
Write-Log "Starting Python scraper..."
& python main.py *>> $logFile
$exitCode = $LASTEXITCODE
if ($exitCode -eq 0) {
Write-Log "Python scraper completed successfully"
# Check if CSV was created
if (Test-Path $csvFile) {
$fileInfo = Get-Item $csvFile
Write-Log "CSV file found (Last modified: $($fileInfo.LastWriteTime))"
# Copy to MT5 directory
Write-Log "Copying CSV to MetaTrader 5 Files directory..."
try {
Copy-Item -Path $csvFile -Destination $mt5Path -Force
Write-Log "CSV successfully copied to MT5 directory"
# Verify copy
if (Test-Path $mt5Path) {
Write-Log "Verified: MT5 CSV file exists"
} else {
Write-Log "ERROR: MT5 CSV file not found after copy"
}
} catch {
Write-Log "ERROR: Failed to copy to MT5 directory - $_"
}
} else {
Write-Log "WARNING: CSV file not found after scraper execution"
}
} else {
Write-Log "ERROR: Python scraper failed with exit code $exitCode"
}
} catch {
Write-Log "ERROR: Script failed - $($_.Exception.Message)"
exit 1
}
Write-Log "=========================================="
Write-Log "Script completed"
Write-Log "=========================================="

View File

@@ -0,0 +1,24 @@
@echo off
REM ==========================================
REM CME OI Scraper - Manual Run with Virtual Environment
REM ==========================================
REM Navigate to script directory
cd /d %~dp0
echo ==========================================
echo CME OI Scraper - Manual Run
echo ==========================================
REM Activate virtual environment
call venv\Scripts\activate.bat
REM Run Python scraper
python main.py
MOVE "oi_data.csv" "C:\Users\limitrack\AppData\Roaming\MetaQuotes\Terminal\53785E099C927DB68A545C249CDBCE06\MQL5\Files\"
echo.
echo Scraper completed. Check oi_data.csv for results.
timeout /t 5

34
oi_scraper/setup_env.bat Normal file
View File

@@ -0,0 +1,34 @@
@echo off
REM ==========================================
REM CME OI Scraper - Virtual Environment Setup
REM ==========================================
echo ==========================================
echo Setting up Python Virtual Environment
echo ==========================================
REM Navigate to script directory
cd /d %~dp0
REM Create virtual environment
echo Creating virtual environment...
py -3 -m venv venv
REM Activate virtual environment and install dependencies
echo Installing dependencies...
call venv\Scripts\activate.bat
pip install --upgrade pip
pip install -r requirements.txt
REM Install playwright browser
echo Installing Playwright browser...
python -m playwright install chromium
echo ==========================================
echo Setup Complete!
echo ==========================================
echo.
echo To run the scraper, use: run_with_venv.bat
echo.
pause