Agent services for EcoMate: research, supplier sync, price monitor, spec drafting, and compliance checks.
EcoMate AI is an intelligent automation platform designed to streamline environmental technology research, supplier management, compliance monitoring, and business operations.
fastapi==0.114.0 # Web API framework
uvicorn[standard]==0.30.6 # ASGI server
httpx==0.27.2 # HTTP client
selectolax==0.3.20 # HTML parsing
beautifulsoup4==4.12.3 # HTML/XML parsing
pydantic==2.8.2 # Data validation
pydantic-settings==2.4.0 # Settings management
python-dotenv==1.0.1 # Environment variables
pint==0.24.4 # Unit conversions
python-slugify==8.0.4 # URL-safe strings
boto3==1.34.152 # AWS SDK (MinIO)
psycopg[binary]==3.2.1 # PostgreSQL adapter
pgvector==0.3.3 # Vector database
temporalio==1.7.0 # Workflow engine
PyYAML==6.0.2 # YAML processing
pdfplumber==0.11.4 # PDF text extraction
Before starting, verify your Python installation:
python --version
# Clone the repository
git clone https://github.com/your-org/ecomate-ai.git
cd ecomate-ai
# Create and configure environment
cp .env.example .env
# Make scripts executable (Linux/macOS)
chmod +x scripts/*.sh
Edit the .env file with your specific values:
# Database Configuration
PGUSER=postgres
PGPASSWORD=your_secure_password
PGDATABASE=ecomate
# MinIO Storage
MINIO_ROOT_USER=minioadmin
MINIO_ROOT_PASSWORD=your_secure_password
MINIO_BUCKET=ecomate-artifacts
# AI Models
OLLAMA_URL=http://localhost:11434
VERTEX_PROJECT=your-gcp-project
VERTEX_LOCATION=us-central1
VERTEX_GEMINI_MODEL=gemini-2.5-pro
# GitHub Integration
DOCS_REPO=your-org/ecomate-docs
GH_TOKEN=your_github_token
# Parser Configuration
PARSER_STRICT=false
DEFAULT_CURRENCY=USD
CURRENCY_DEFAULT=ZAR
PRICE_DEVIATION_ALERT=0.10
# E-commerce Integration
SHOPIFY_API_KEY=your_shopify_api_key
SHOPIFY_API_SECRET=your_shopify_secret
SHOPIFY_WEBHOOK_SECRET=your_webhook_secret
WOOCOMMERCE_CONSUMER_KEY=your_woocommerce_key
WOOCOMMERCE_CONSUMER_SECRET=your_woocommerce_secret
MEDUSA_API_KEY=your_medusa_api_key
MEDUSA_BASE_URL=http://localhost:9000
# Proposal Service
PROPOSAL_TEMPLATE_PATH=./services/proposal/templates
PROPOSAL_DEFAULT_MARGIN=0.15
PROPOSAL_CURRENCY=USD
# Maintenance Service
MAINTENANCE_SCHEDULE_PATH=./data/maintenance_schedule.csv
MAINTENANCE_DEFAULT_INTERVAL=90
MAINTENANCE_ALERT_DAYS=7
# Compliance Service
COMPLIANCE_RULES_PATH=./services/compliance/rules
# Telemetry Service
TELEMETRY_ALERT_HEADROOM=0.10
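The variables above are read by the services at startup. A minimal stdlib sketch of that loading follows; the real services use pydantic-settings and python-dotenv, and the `Settings` class here is hypothetical with only an illustrative subset of fields:

```python
# Minimal sketch of reading a few .env keys with their documented defaults.
# The real services use pydantic-settings; this class is illustrative only.
import os
from dataclasses import dataclass

@dataclass
class Settings:
    pg_user: str
    default_currency: str
    price_deviation_alert: float

def load_settings() -> Settings:
    """Read selected keys from the environment, falling back to the documented defaults."""
    return Settings(
        pg_user=os.getenv("PGUSER", "postgres"),
        default_currency=os.getenv("DEFAULT_CURRENCY", "USD"),
        price_deviation_alert=float(os.getenv("PRICE_DEVIATION_ALERT", "0.10")),
    )

settings = load_settings()
print(settings.default_currency)
```

With python-dotenv, calling `load_dotenv()` before `load_settings()` would populate the environment from the `.env` file first.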
# Start infrastructure services
docker compose up -d postgres minio temporal nats
# Verify services are running
docker compose ps
# Create virtual environment
python -m venv .venv
# Activate virtual environment
# On Linux/macOS:
source .venv/bin/activate
# On Windows:
.venv\Scripts\activate
# Install dependencies
pip install -r requirements.txt
E-commerce Platform Configuration:
Compliance Rules Setup:
# Create compliance rules directory
mkdir -p services/compliance/rules
# Copy default rules (if available)
cp data/compliance_rules/* services/compliance/rules/
Proposal Templates Setup:
# Create proposal templates directory
mkdir -p services/proposal/templates
# Copy default templates (if available)
cp data/proposal_templates/* services/proposal/templates/
# Terminal 1: Start Temporal worker (includes all new workflows)
python services/orchestrator/worker.py
# Terminal 2: Start API server (includes all new endpoints)
uvicorn services.api.main:app --reload --port 8080
Service Health Verification:
# Test core services
curl http://localhost:8080/health
# Test new service endpoints
curl -X POST http://localhost:8080/proposal/generate -H "Content-Type: application/json" -d '{"project_name":"test"}'
curl -X POST http://localhost:8080/catalog/sync -H "Content-Type: application/json" -d '{"platform":"shopify"}'
curl -X POST http://localhost:8080/maintenance/schedule -H "Content-Type: application/json" -d '{"equipment_id":"test"}'
curl -X POST http://localhost:8080/compliance/check -H "Content-Type: application/json" -d '{"system_specs":{}}'
curl -X POST http://localhost:8080/telemetry/ingest -H "Content-Type: application/json" -d '{"system_id":"test","metrics":{}}'
The PostgreSQL service automatically runs initialization scripts from storage/init.sql
to set up required tables and extensions.
Artifacts are stored in the MinIO bucket named by MINIO_BUCKET in your .env file.
Ollama (Local):
# Install Ollama
curl -fsSL https://ollama.ai/install.sh | sh
# Pull required models
ollama pull llama2
ollama pull codellama
Google Services:
Google Vertex AI: set the GOOGLE_APPLICATION_CREDENTIALS environment variable.
Google Maps API: set GOOGLE_API_KEY in your .env file.
# Trigger research for specific query
curl -X POST 'http://localhost:8080/run/research' \
-H 'Content-Type: application/json' \
-d '{
"query": "domestic MBBR package South Africa",
"limit": 5
}'
# Trigger research for specific URLs
curl -X POST 'http://localhost:8080/run/new-research' \
-H 'Content-Type: application/json' \
-d '{
"urls": [
"https://supplier1.com/pumps",
"https://supplier2.com/uv-systems"
]
}'
# Manual price monitoring
curl -X POST 'http://localhost:8080/run/price-monitor' \
-H 'Content-Type: application/json' \
-d '{"create_pr": true}'
# Scheduled price monitoring
curl -X POST 'http://localhost:8080/run/scheduled-price-monitor'
The system includes specialized parsers for different product categories:
Pump Parser Features:
UV Reactor Parser Features:
Parser Selection Logic:
# Domain-based selection
parse_by_domain("https://grundfos.com/pumps")   # → pump parser
parse_by_domain("https://trojan-uv.com/systems") # → UV parser
# Category-based selection
parse_by_category(data, "pump")  # → pump parser
parse_by_category(data, "uv")    # → UV parser
When vendor parsers fail or find insufficient data, the system automatically falls back to LLM processing:
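That selection-plus-fallback flow can be sketched as follows; the registry contents and the `"llm"` sentinel are assumptions for illustration, not the actual services/parsers API:

```python
# Illustrative domain-based parser selection with LLM fallback, as described
# above. Registry entries and return values are assumptions, not the real API.
from urllib.parse import urlparse

PARSER_REGISTRY = {
    "grundfos.com": "pump",
    "trojan-uv.com": "uv",
}

def select_parser(url: str) -> str:
    """Return the parser key for a supplier URL, or 'llm' when no vendor parser matches."""
    host = urlparse(url).netloc.removeprefix("www.")
    return PARSER_REGISTRY.get(host, "llm")

print(select_parser("https://grundfos.com/pumps"))        # pump
print(select_parser("https://unknown-vendor.example/x"))  # llm
```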
Scenario: Research new suppliers for MBBR systems in South Africa
Process:
Scenario: Daily price tracking for existing suppliers
Process:
Scenario: Compare pump specifications across multiple suppliers
Process:
Scenario: Generate technical proposals with cost calculations
# Generate proposal for water treatment system
curl -X POST 'http://localhost:8080/run/proposal' \
-H 'Content-Type: application/json' \
-d '{
"project_requirements": {
"flow_rate": "100 m3/h",
"treatment_type": "MBBR",
"location": "Cape Town, South Africa",
"budget_range": "50000-100000 USD"
},
"include_calculations": true,
"format": "pdf"
}'
Process:
Scenario: Sync product data with e-commerce platforms
# Sync products to Shopify store
curl -X POST 'http://localhost:8080/run/catalog-sync' \
-H 'Content-Type: application/json' \
-d '{
"platform": "shopify",
"store_url": "https://mystore.myshopify.com",
"product_categories": ["pumps", "uv-systems"],
"sync_mode": "incremental"
}'
# Update WooCommerce inventory
curl -X POST 'http://localhost:8080/run/inventory-update' \
-H 'Content-Type: application/json' \
-d '{
"platform": "woocommerce",
"site_url": "https://mysite.com",
"update_stock": true,
"update_pricing": true
}'
Process:
Scenario: Schedule and track equipment maintenance
# Schedule maintenance for installed equipment
curl -X POST 'http://localhost:8080/run/schedule-maintenance' \
-H 'Content-Type: application/json' \
-d '{
"equipment_id": "PUMP-001",
"maintenance_type": "preventive",
"schedule": {
"frequency": "quarterly",
"next_date": "2024-04-01"
},
"technician_assignment": "auto"
}'
# Get maintenance schedule
curl -X GET 'http://localhost:8080/maintenance/schedule?month=2024-03'
Process:
Scenario: Monitor regulatory compliance and certifications
# Check compliance status for products
curl -X POST 'http://localhost:8080/run/compliance-check' \
-H 'Content-Type: application/json' \
-d '{
"product_ids": ["PUMP-001", "UV-002"],
"regulations": ["SANS", "ISO", "CE"],
"market": "south_africa"
}'
# Generate compliance report
curl -X POST 'http://localhost:8080/run/compliance-report' \
-H 'Content-Type: application/json' \
-d '{
"report_type": "certification_status",
"date_range": "2024-Q1",
"format": "pdf"
}'
Process:
Scenario: Monitor system health and performance metrics
# Configure monitoring alerts
curl -X POST 'http://localhost:8080/telemetry/alerts' \
-H 'Content-Type: application/json' \
-d '{
"alert_type": "performance",
"metric": "response_time",
"threshold": 5000,
"notification_channels": ["email", "slack"]
}'
# Get system metrics
curl -X GET 'http://localhost:8080/telemetry/metrics?timerange=1h'
Process:
Scenario: Complete project lifecycle from research to delivery
# 1. Research suppliers for project requirements
curl -X POST 'http://localhost:8080/run/research' \
-d '{"query": "MBBR systems 500m3/day South Africa", "limit": 10}'
# 2. Generate technical proposal
curl -X POST 'http://localhost:8080/run/proposal' \
-d '{"project_requirements": {...}, "include_calculations": true}'
# 3. Check compliance requirements
curl -X POST 'http://localhost:8080/run/compliance-check' \
-d '{"product_ids": [...], "regulations": ["SANS"], "market": "south_africa"}'
# 4. Schedule installation and maintenance
curl -X POST 'http://localhost:8080/run/schedule-maintenance' \
-d '{"equipment_id": "...", "maintenance_type": "installation"}'
# 5. Sync to e-commerce platform
curl -X POST 'http://localhost:8080/run/catalog-sync' \
-d '{"platform": "shopify", "product_categories": [...]}'
POST /run/research
application/json
{
"query": "string", // Search query for supplier research
"limit": 5 // Maximum number of URLs to process
}
200: Success
400: Invalid request parameters
500: Internal server error
POST /run/new-research
application/json
{
"urls": ["string"] // Array of URLs to crawl and extract
}
POST /run/price-monitor
{
"create_pr": true // Whether to create GitHub PR with results
}
POST /run/scheduled-price-monitor
curl -X POST 'http://localhost:8080/run/research' \
-H 'Content-Type: application/json' \
-d '{
"query": "submersible pumps wastewater treatment",
"limit": 3
}'
{
"workflow_id": "research-submersible-12345",
"status": "completed",
"results": {
"suppliers_found": 3,
"products_extracted": 15,
"parser_success_rate": 0.8,
"llm_fallback_used": 3,
"data_quality_score": 0.92
},
"artifacts": {
"csv_file": "suppliers_20240115.csv",
"pr_url": "https://github.com/org/ecomate-docs/pull/123"
}
}
{
"workflow_id": "price-monitor-67890",
"status": "completed",
"results": {
"products_monitored": 45,
"price_changes_detected": 7,
"significant_deviations": 2,
"average_change_percent": 0.03
},
"changes": [
{
"product_id": "pump-grundfos-123",
"old_price": 1500.00,
"new_price": 1650.00,
"change_percent": 0.10,
"currency": "USD"
}
]
}
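A change like the Grundfos example above is flagged when its relative deviation meets the PRICE_DEVIATION_ALERT threshold (0.10 by default). A hedged sketch of that check; the service's actual implementation may differ in rounding and currency handling:

```python
# Sketch of the deviation check behind PRICE_DEVIATION_ALERT=0.10.
def change_percent(old_price: float, new_price: float) -> float:
    """Relative price change, e.g. 1500 -> 1650 gives 0.10."""
    return (new_price - old_price) / old_price

def is_significant(old_price: float, new_price: float, threshold: float = 0.10) -> bool:
    """True when the absolute deviation meets or exceeds the alert threshold."""
    return abs(change_percent(old_price, new_price)) >= threshold

print(is_significant(1500.00, 1650.00))  # True (the example change above)
print(is_significant(100.00, 103.00))    # False (3% is below the threshold)
```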
POST /proposal/generate
application/json
{
"project_name": "string",
"requirements": {
"flow_rate": "number",
"treatment_type": "string",
"budget_range": "string"
},
"template_id": "string" // Optional: specific template to use
}
200: Success, 400: Invalid parameters, 500: Generation error
POST /catalog/sync
application/json
{
"platform": "shopify|woocommerce|medusa",
"store_config": {
"api_key": "string",
"store_url": "string",
"webhook_secret": "string"
},
"sync_options": {
"full_sync": "boolean",
"categories": ["string"]
}
}
200: Success, 401: Authentication failed, 500: Sync error
POST /maintenance/schedule
application/json
{
"equipment_id": "string",
"equipment_type": "string",
"installation_date": "ISO8601",
"operating_hours": "number",
"maintenance_type": "preventive|predictive|corrective"
}
200: Success, 404: Equipment not found, 500: Scheduling error
GET /maintenance/plan/{equipment_id}
POST /compliance/check
application/json
{
"system_specs": {
"flow_rate": "number",
"treatment_efficiency": "number",
"discharge_quality": "object"
},
"regulations": ["string"], // Regulatory frameworks to check
"jurisdiction": "string"
}
200: Success, 400: Invalid specs, 500: Validation error
POST /telemetry/ingest
application/json
{
"system_id": "string",
"metrics": {
"flow_rate": "number",
"pressure": "number",
"temperature": "number",
"power_consumption": "number",
"efficiency": "number"
},
"timestamp": "ISO8601"
}
200: Success, 400: Invalid data, 500: Processing error
curl -X POST 'http://localhost:8080/proposal/generate' \
-H 'Content-Type: application/json' \
-d '{
"project_name": "Municipal Wastewater Treatment Upgrade",
"requirements": {
"flow_rate": 1000,
"treatment_type": "biological",
"budget_range": "500000-750000"
}
}'
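Costing inside proposal generation applies the configured margin (PROPOSAL_DEFAULT_MARGIN=0.15). A simplified sketch, with the cost figure chosen only for illustration; the real calculation engine also covers sizing, labour, and currency:

```python
# Simplified margin application for proposal costing; figures are illustrative.
def priced_total(equipment_cost: float, margin: float = 0.15) -> float:
    """Sell price: cost plus the configured margin on cost, rounded to cents."""
    return round(equipment_cost * (1 + margin), 2)

print(priced_total(600_000))  # 690000.0
```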
curl -X POST 'http://localhost:8080/catalog/sync' \
-H 'Content-Type: application/json' \
-d '{
"platform": "shopify",
"store_config": {
"api_key": "your-api-key",
"store_url": "your-store.myshopify.com"
},
"sync_options": {
"full_sync": true,
"categories": ["pumps", "filters"]
}
}'
curl -X POST 'http://localhost:8080/maintenance/schedule' \
-H 'Content-Type: application/json' \
-d '{
"equipment_id": "pump-001",
"equipment_type": "submersible_pump",
"installation_date": "2024-01-15T00:00:00Z",
"operating_hours": 2400,
"maintenance_type": "predictive"
}'
curl -X POST 'http://localhost:8080/compliance/check' \
-H 'Content-Type: application/json' \
-d '{
"system_specs": {
"flow_rate": 500,
"treatment_efficiency": 0.95,
"discharge_quality": {
"bod": 10,
"tss": 15,
"ph": 7.2
}
},
"regulations": ["EPA_NPDES", "LOCAL_DISCHARGE"],
"jurisdiction": "US_CA"
}'
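At its core, a compliance check compares discharge values like those above against jurisdiction limits. An illustrative sketch; the limit values here are made-up placeholders, not real SANS/EPA thresholds:

```python
# Illustrative discharge-limit check; limits are hypothetical placeholders.
LIMITS_MG_PER_L = {"bod": 20, "tss": 25}

def violations(discharge: dict, limits: dict = LIMITS_MG_PER_L) -> dict:
    """Map each out-of-limit parameter to its (measured, limit) pair."""
    return {
        param: (value, limits[param])
        for param, value in discharge.items()
        if param in limits and value > limits[param]
    }

print(violations({"bod": 10, "tss": 15, "ph": 7.2}))  # {} -> compliant
print(violations({"bod": 30, "tss": 15}))             # {'bod': (30, 20)}
```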
curl -X POST 'http://localhost:8080/telemetry/ingest' \
-H 'Content-Type: application/json' \
-d '{
"system_id": "plant-001",
"metrics": {
"flow_rate": 450.5,
"pressure": 2.3,
"temperature": 22.1,
"power_consumption": 15.2,
"efficiency": 0.92
},
"timestamp": "2024-01-15T10:30:00Z"
}'
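Before POSTing a payload like the one above, a client can validate it locally. A sketch in which the required metric names mirror the request schema, but the checks themselves are illustrative rather than the API's actual validation:

```python
# Client-side sanity check for a telemetry ingest payload; checks are illustrative.
REQUIRED_METRICS = {"flow_rate", "pressure", "temperature",
                    "power_consumption", "efficiency"}

def validate_payload(payload: dict) -> list[str]:
    """Return a list of problems; an empty list means the payload looks ingestible."""
    problems = []
    if not payload.get("system_id"):
        problems.append("missing system_id")
    missing = REQUIRED_METRICS - set(payload.get("metrics", {}))
    if missing:
        problems.append("missing metrics: " + ", ".join(sorted(missing)))
    return problems

payload = {
    "system_id": "plant-001",
    "metrics": {"flow_rate": 450.5, "pressure": 2.3, "temperature": 22.1,
                "power_consumption": 15.2, "efficiency": 0.92},
}
print(validate_payload(payload))  # []
```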
Ensure your GitHub token includes the repo scope for private repositories, GH_TOKEN is set in the environment, and GOOGLE_APPLICATION_CREDENTIALS points to your service-account key file.
# Check all services status
docker compose ps
# Verify API health
curl http://localhost:8080/health
# Check Temporal UI
open http://localhost:8088
# Test database connection
psql -h localhost -U postgres -d ecomate -c "SELECT version();"
🔴 Service Won't Start
# Check logs for specific service
docker compose logs [service-name]
# Restart specific service
docker compose restart [service-name]
# Full system restart
docker compose down && docker compose up -d
🔴 Port Conflicts
# Check what's using the port
lsof -i :8080 # or netstat -tulpn | grep 8080
# Kill process using port
sudo kill -9 [PID]
🔴 Database Connection Issues
# Reset database
docker compose down postgres
docker volume rm ecomate-ai_postgres_data
docker compose up -d postgres
# Wait for initialization
sleep 30
🔴 Proposal Generation Service Issues
Issue: Proposal generation fails with calculation errors
Error: Unable to calculate system sizing for given parameters
Solution:
docker compose logs proposal-service
Issue: PDF generation timeout
Error: Proposal PDF generation timed out after 30 seconds
Solution:
pip list | grep weasyprint
🔴 E-commerce Integration Issues
Issue: Shopify API authentication failed
Error: 401 Unauthorized - Invalid API credentials
Solution:
# Verify the Shopify credentials in .env, then test them directly:
curl -H "X-Shopify-Access-Token: TOKEN" https://SHOP.myshopify.com/admin/api/2023-10/shop.json
Issue: Product sync conflicts
Warning: Product SKU conflicts detected during sync
Solution:
🔴 Maintenance Scheduler Issues
Issue: Calendar integration not working
Error: Failed to connect to CalDAV server
Solution:
Issue: Maintenance notifications not sent
Warning: Failed to send maintenance reminder notifications
Solution:
🔴 Compliance Management Issues
Issue: Regulation database out of date
Warning: Regulation data is older than 30 days
Solution:
Issue: Certification validation errors
Error: Unable to validate certification status
Solution:
🔴 Telemetry & Alerts Issues
Issue: Metrics not being collected
Error: Prometheus metrics endpoint unreachable
Solution:
curl http://localhost:8080/metrics
Issue: Alert notifications not working
Error: Failed to send alert to configured channels
Solution:
Proposal Service Health Check:
# Test proposal generation
curl -X POST 'http://localhost:8080/run/proposal' \
-H 'Content-Type: application/json' \
-d '{"project_requirements": {"flow_rate": "100 m3/h", "treatment_type": "MBBR"}}'
# Check calculation engine
curl http://localhost:8080/proposal/health
E-commerce Service Health Check:
# Test platform connectivity
curl -X GET 'http://localhost:8080/catalog/platforms'
# Verify sync status
curl -X GET 'http://localhost:8080/catalog/sync-status'
Maintenance Service Health Check:
# Test scheduling functionality
curl -X GET 'http://localhost:8080/maintenance/health'
# Check calendar integration
curl -X GET 'http://localhost:8080/maintenance/calendar-status'
Compliance Service Health Check:
# Test compliance checking
curl -X GET 'http://localhost:8080/compliance/health'
# Verify regulation database
curl -X GET 'http://localhost:8080/compliance/regulations/status'
Telemetry Service Health Check:
# Test metrics collection
curl -X GET 'http://localhost:8080/telemetry/health'
# Check alert system
curl -X GET 'http://localhost:8080/telemetry/alerts/status'
Issue: Temporal Worker Connection Failed
Error: Failed to connect to Temporal server at localhost:7233
Solution:
docker compose ps temporal
netstat -an | grep 7233
docker compose restart temporal
Issue: Parser Extraction Failures
Warning: Parser failed, falling back to LLM processing
Solution: inspect the vendor parser modules under services/parsers/ and update their selectors for the changed page structure.
Issue: MinIO Storage Errors
Error: Unable to upload artifact to MinIO
Solution:
docker compose ps minio
.env
curl http://localhost:9000/minio/health/live
Issue: Database Connection Errors
Error: Could not connect to PostgreSQL database
Solution:
docker compose ps postgres
.env
psql -h localhost -U postgres -d ecomate
Slow Research Workflows: tune worker concurrency in worker.py.
High Memory Usage:
Python Dependencies:
# Update requirements
pip install --upgrade -r requirements.txt
# Check for security vulnerabilities
pip audit
# Update specific packages
pip install --upgrade pydantic temporalio
Docker Services:
# Update service images
docker compose pull
# Restart with new images
docker compose down
docker compose up -d
# Clean up old images
docker image prune
Database Migrations:
# Backup database
docker exec postgres pg_dump -U postgres ecomate > backup.sql
# Apply schema changes
psql -h localhost -U postgres -d ecomate -f migrations/001_add_parser_metadata.sql
Environment Variables: compare .env.example with your current .env after each upgrade.
Parser Updates: review changes under services/parsers/ and run the demo test:
python services/parsers/_demo_test.py
Health Checks:
# API health
curl http://localhost:8080/health
# Temporal health
curl http://localhost:8088/api/v1/namespaces
# Database health
psql -h localhost -U postgres -d ecomate -c "SELECT 1;"
Log Monitoring:
# Application logs
tail -f logs/ecomate-ai.log
# Docker service logs
docker compose logs -f temporal
docker compose logs -f postgres
# Install development dependencies
pip install -r requirements-dev.txt
# Install pre-commit hooks
pre-commit install
# Run tests
pytest tests/ -v
# Start development server with hot reload
uvicorn services.api.main:app --reload --port 8080
# Run all tests with coverage
pytest --cov=services tests/
# Run specific test categories
pytest tests/unit/ # Unit tests only
pytest tests/integration/ # Integration tests only
# Run tests with verbose output
pytest -v -s tests/
# Enable debug logging
export LOG_LEVEL=DEBUG
# Run with debugger
python -m pdb services/api/main.py
# Profile performance
python -m cProfile -o profile.stats services/parsers/_demo_test.py
Unit Tests:
# Test calculation engine
pytest tests/unit/test_proposal_calculations.py -v
# Test PDF generation
pytest tests/unit/test_proposal_pdf.py -v
# Test template rendering
pytest tests/unit/test_proposal_templates.py -v
Integration Tests:
# Test complete proposal workflow
pytest tests/integration/test_proposal_workflow.py -v
# Test API endpoints
pytest tests/integration/test_proposal_api.py -v
Manual Testing:
# Test proposal generation with sample data
curl -X POST 'http://localhost:8080/run/proposal' \
-H 'Content-Type: application/json' \
-d '{
"project_name": "Test WWTP",
"requirements": {
"flow_rate": 500,
"treatment_type": "MBBR",
"budget_range": "200000-300000"
},
"include_calculations": true,
"generate_pdf": true
}'
# Verify calculation accuracy
python tests/manual/verify_calculations.py
# Test PDF output quality
python tests/manual/check_pdf_generation.py
Unit Tests:
# Test platform connectors
pytest tests/unit/test_shopify_connector.py -v
pytest tests/unit/test_woocommerce_connector.py -v
pytest tests/unit/test_medusa_connector.py -v
# Test product mapping
pytest tests/unit/test_product_mapping.py -v
# Test inventory sync
pytest tests/unit/test_inventory_sync.py -v
Integration Tests:
# Test complete sync workflow
pytest tests/integration/test_catalog_sync.py -v
# Test conflict resolution
pytest tests/integration/test_sync_conflicts.py -v
Manual Testing:
# Test Shopify integration
curl -X POST 'http://localhost:8080/catalog/sync' \
-H 'Content-Type: application/json' \
-d '{
"platform": "shopify",
"store_config": {
"api_key": "test-key",
"store_url": "test-store.myshopify.com"
},
"sync_options": {
"dry_run": true,
"categories": ["pumps"]
}
}'
# Test product mapping accuracy
python tests/manual/verify_product_mapping.py
# Test sync performance
python tests/manual/benchmark_sync_speed.py
Unit Tests:
# Test scheduling algorithms
pytest tests/unit/test_maintenance_algorithms.py -v
# Test calendar integration
pytest tests/unit/test_calendar_integration.py -v
# Test notification system
pytest tests/unit/test_maintenance_notifications.py -v
Integration Tests:
# Test complete scheduling workflow
pytest tests/integration/test_maintenance_workflow.py -v
# Test external calendar sync
pytest tests/integration/test_calendar_sync.py -v
Manual Testing:
# Test maintenance scheduling
curl -X POST 'http://localhost:8080/maintenance/schedule' \
-H 'Content-Type: application/json' \
-d '{
"equipment_id": "PUMP-001",
"equipment_type": "centrifugal_pump",
"installation_date": "2024-01-15",
"operating_hours": 2000,
"maintenance_type": "predictive"
}'
# Test calendar integration
python tests/manual/test_calendar_sync.py
# Verify notification delivery
python tests/manual/test_notifications.py
Unit Tests:
# Test regulation parsing
pytest tests/unit/test_regulation_parser.py -v
# Test compliance checking
pytest tests/unit/test_compliance_checker.py -v
# Test certification validation
pytest tests/unit/test_certification_validator.py -v
Integration Tests:
# Test complete compliance workflow
pytest tests/integration/test_compliance_workflow.py -v
# Test regulation updates
pytest tests/integration/test_regulation_updates.py -v
Manual Testing:
# Test compliance checking
curl -X POST 'http://localhost:8080/compliance/check' \
-H 'Content-Type: application/json' \
-d '{
"system_specs": {
"flow_rate": 1000,
"treatment_efficiency": 95,
"discharge_quality": {
"bod": 10,
"cod": 30,
"tss": 15
}
},
"regulations": ["SANS", "ISO"],
"jurisdiction": "south_africa"
}'
# Test regulation database updates
python tests/manual/test_regulation_updates.py
# Verify compliance report generation
python tests/manual/test_compliance_reports.py
Unit Tests:
# Test metrics collection
pytest tests/unit/test_metrics_collector.py -v
# Test alert engine
pytest tests/unit/test_alert_engine.py -v
# Test notification channels
pytest tests/unit/test_notification_channels.py -v
Integration Tests:
# Test complete telemetry workflow
pytest tests/integration/test_telemetry_workflow.py -v
# Test alert delivery
pytest tests/integration/test_alert_delivery.py -v
Manual Testing:
# Test telemetry ingestion
curl -X POST 'http://localhost:8080/telemetry/ingest' \
-H 'Content-Type: application/json' \
-d '{
"system_id": "WWTP-001",
"metrics": {
"flow_rate": 850,
"pressure": 2.5,
"temperature": 22.5,
"power_consumption": 45.2,
"efficiency": 94.8
},
"timestamp": "2024-01-15T10:30:00Z"
}'
# Test alert configuration
curl -X POST 'http://localhost:8080/telemetry/alerts' \
-H 'Content-Type: application/json' \
-d '{
"alert_type": "threshold",
"metric": "efficiency",
"threshold": 90,
"operator": "less_than",
"notification_channels": ["email", "slack"]
}'
# Verify metrics dashboard
python tests/manual/test_metrics_dashboard.py
# Test alert response times
python tests/manual/benchmark_alert_latency.py
# Test complete project lifecycle
pytest tests/integration/test_project_lifecycle.py -v
# Test service communication
pytest tests/integration/test_service_communication.py -v
# Test data consistency across services
pytest tests/integration/test_data_consistency.py -v
# Load testing for new endpoints
python tests/performance/load_test_new_services.py
# Stress testing for concurrent operations
python tests/performance/stress_test_concurrent.py
# Memory usage profiling
python tests/performance/profile_memory_usage.py
# Complete system integration test
pytest tests/e2e/test_complete_system.py -v
# User journey testing
pytest tests/e2e/test_user_journeys.py -v
# API contract testing
pytest tests/e2e/test_api_contracts.py -v
# Generate test data for proposals
python tests/fixtures/generate_proposal_data.py
# Create mock e-commerce data
python tests/fixtures/generate_catalog_data.py
# Setup maintenance test scenarios
python tests/fixtures/generate_maintenance_data.py
# Create compliance test cases
python tests/fixtures/generate_compliance_data.py
# Generate telemetry test data
python tests/fixtures/generate_telemetry_data.py
# Setup test database with new service schemas
python tests/setup/setup_test_database.py
# Seed test data for all services
python tests/setup/seed_test_data.py
# Clean test environment
python tests/setup/cleanup_test_env.py
# .github/workflows/test-new-services.yml
name: Test New Services
on: [push, pull_request]
jobs:
test-new-services:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: Setup Python
uses: actions/setup-python@v4
with:
python-version: '3.11'
- name: Install dependencies
run: pip install -r requirements-dev.txt
- name: Run new service tests
run: |
pytest tests/unit/test_proposal* -v
pytest tests/unit/test_catalog* -v
pytest tests/unit/test_maintenance* -v
pytest tests/unit/test_compliance* -v
pytest tests/unit/test_telemetry* -v
- name: Run integration tests
run: pytest tests/integration/ -v
- name: Generate coverage report
run: pytest --cov=services --cov-report=xml
- name: Upload coverage
uses: codecov/codecov-action@v3
# Code coverage requirements
pytest --cov=services --cov-fail-under=80
# Code quality checks
ruff check services/
black --check services/
mypy services/
# Security scanning
bandit -r services/
safety check
Example:
def extract_pump_specifications(
    html_content: str,
    base_url: str,
) -> list[Pump]:
    """Extract pump specifications from HTML content.

    Args:
        html_content: Raw HTML content from supplier page
        base_url: Base URL for resolving relative links

    Returns:
        List of Pump objects with extracted specifications

    Raises:
        ParserError: When HTML structure is incompatible
    """
    # Implementation here
    pass
pip audit
# Run all tests
pytest tests/
# Run with coverage
pytest --cov=services tests/
# Run specific test file
pytest tests/test_parsers.py
tests/
├── unit/
│   ├── test_parsers.py
│   ├── test_models.py
│   └── test_utils.py
├── integration/
│   ├── test_workflows.py
│   └── test_api.py
└── fixtures/
    ├── sample_html/
    └── test_data.json
# Test parser with sample data
def test_pump_parser_grundfos():
    with open('fixtures/grundfos_pumps.html') as f:
        html = f.read()
    pumps = parse_pump_table(html, 'https://grundfos.com')
    assert len(pumps) > 0
    assert all(pump.flow_rate_lpm > 0 for pump in pumps)
    assert all(pump.head_meters > 0 for pump in pumps)
Branch from main and record your changes in CHANGELOG.md before opening a pull request.
## Description
Brief description of changes and motivation.
## Changes Made
- [ ] Added new UV reactor parser
- [ ] Updated normalization functions
- [ ] Added unit tests
## Testing
- [ ] Unit tests pass
- [ ] Integration tests pass
- [ ] Manual testing completed
## Breaking Changes
None / List any breaking changes
## Dependencies
List any new dependencies or version updates
type(scope): description
feat(parsers): add UV reactor specification extraction
fix(api): handle timeout errors in research endpoint
docs(readme): update installation instructions
test(parsers): add comprehensive pump parser tests
Additional documentation is available in the docs/ directory.
This project is proprietary software owned by AvaPrime Technologies. All rights reserved. See the LICENSE file for complete terms and conditions.
For licensing inquiries, please contact: licensing@ecomate.co.za