An enterprise-grade AI agent platform with Model Context Protocol (MCP) capabilities, data intelligence features, and built-in monitoring and security, designed for large organizations and global-scale deployments.
Quick start with Docker Compose:

```bash
git clone https://github.com/yourusername/smartagents-enterprise.git
cd smartagents-enterprise
cp .env.example .env
docker-compose up -d
```
Or run the backend directly:

```bash
git clone https://github.com/yourusername/smartagents-enterprise.git
cd smartagents-enterprise
pip install -r requirements.txt
python backend/src/main_enterprise.py
```
Access points: the API is served at http://localhost:8000, Prometheus metrics at http://localhost:8000/metrics, and Grafana dashboards at http://localhost:3001.

The platform ships with the following agents:
| Agent | Purpose | Status | Features |
|---|---|---|---|
| DeerFlow Agent | Workflow automation | ✅ Active | Process orchestration, task management |
| Ollama Agent | Local LLM inference | ✅ Active | Chat, completions, embeddings |
| TensorZero Agent | ML model management | ✅ Active | Model serving, training, evaluation |
| Voice Assistant | Speech processing | ✅ Active | STT, TTS, voice commands |
| Computer Vision | Image analysis | ✅ Active | Object detection, OCR, classification |
| Computer Use | Desktop automation | ✅ Active | GUI automation, screen control |
| MCP Agent | Protocol server/client | ✅ Active | Model communication, interoperability |
| MCP Data Agent | Intelligent querying | ✅ Active | Data analysis, insights generation |
| System Agent | System monitoring | ✅ Active | Resource monitoring, health checks |
| Azure Foundry | Cloud ML integration | ✅ Active | Azure ML, cognitive services |
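As a quick sanity check, the registered agents can also be enumerated through the REST API (the `GET /api/agents` endpoint listed in the API section below). A minimal sketch, assuming the endpoint returns a JSON array and that a JWT has already been obtained via the login endpoint:

```python
import requests

API_BASE = "http://localhost:8000"
TOKEN = "your-jwt-token"  # obtained via POST /api/auth/login

# List all registered agents and print each entry as returned by the API.
resp = requests.get(
    f"{API_BASE}/api/agents",
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=10,
)
resp.raise_for_status()
for agent in resp.json():  # assumes a JSON array of agent objects
    print(agent)
```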
For a full development setup:

```bash
# Clone the repository
git clone https://github.com/yourusername/smartagents-enterprise.git
cd smartagents-enterprise

# Copy environment template
cp .env.example .env

# Edit environment variables
nano .env

# Install Python dependencies
pip install -r requirements.txt

# Install Node.js dependencies (for frontend development)
cd frontend && npm install && cd ..

# Start supporting services with Docker
docker-compose up -d

# Start Redis (if not using Docker)
redis-server

# Start the enterprise backend
python backend/src/main_enterprise.py
```
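To confirm the backend came up correctly, the health endpoint can be polled. A small sketch, assuming `GET /api/health` (listed in the API section below) is reachable on the default port:

```python
import time
import requests

# Poll the health endpoint until the backend responds, or give up after ~30s.
for attempt in range(10):
    try:
        resp = requests.get("http://localhost:8000/api/health", timeout=5)
        print(resp.status_code, resp.text)
        break
    except requests.RequestException:
        time.sleep(3)  # the backend may still be starting
else:
    print("Backend did not become healthy in time")
```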
All API requests require a JWT token in the `Authorization` header:

```
Authorization: Bearer <your-jwt-token>
```
Core endpoints:

```
POST /api/auth/login          # Authenticate user
GET  /api/agents              # List all agents
POST /api/agents/{id}/execute # Execute specific agent
GET  /api/health              # System health check
GET  /metrics                 # Prometheus metrics
WS   /ws                      # WebSocket connection
```
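To obtain a token, authenticate against `POST /api/auth/login` and reuse the returned JWT on subsequent calls. A sketch, assuming a username/password payload and a JSON response carrying an access token field (the exact field names are not documented here):

```python
import requests

API_BASE = "http://localhost:8000"

# Authenticate; the payload and response field names are assumptions.
login = requests.post(
    f"{API_BASE}/api/auth/login",
    json={"username": "admin", "password": "change-me"},
    timeout=10,
)
login.raise_for_status()
token = login.json().get("access_token")  # field name assumed

# Reuse the token on an authenticated endpoint.
agents = requests.get(
    f"{API_BASE}/api/agents",
    headers={"Authorization": f"Bearer {token}"},
    timeout=10,
)
print(agents.json())
```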
For example, to execute the Ollama agent:

```bash
curl -X POST "http://localhost:8000/api/agents/ollama/execute" \
  -H "Authorization: Bearer your-jwt-token" \
  -H "Content-Type: application/json" \
  -d '{
    "input": "Generate a summary of AI trends",
    "parameters": {
      "model": "llama2",
      "max_tokens": 500
    }
  }'
```
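The same request from Python, useful when embedding agent calls in scripts; this mirrors the curl call above, only the client library is new:

```python
import requests

# Execute the Ollama agent with the same payload as the curl example above.
resp = requests.post(
    "http://localhost:8000/api/agents/ollama/execute",
    headers={"Authorization": "Bearer your-jwt-token"},
    json={
        "input": "Generate a summary of AI trends",
        "parameters": {"model": "llama2", "max_tokens": 500},
    },
    timeout=120,  # LLM calls can be slow
)
resp.raise_for_status()
print(resp.json())
```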
Access metrics at: http://localhost:8000/metrics
Key metrics include:
- `smartagents_requests_total` - Total API requests
- `smartagents_request_duration_seconds` - Request duration
- `smartagents_active_agents` - Number of active agents
- `smartagents_errors_total` - Total errors by type
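For ad-hoc checks outside Grafana, the metrics endpoint can be scraped and parsed directly. A sketch using `prometheus_client`'s text parser; the metric names come from the list above:

```python
import requests
from prometheus_client.parser import text_string_to_metric_families

# Fetch the Prometheus exposition text and print the smartagents_* series.
text = requests.get("http://localhost:8000/metrics", timeout=10).text
for family in text_string_to_metric_families(text):
    if family.name.startswith("smartagents"):
        for sample in family.samples:
            print(sample.name, sample.labels, sample.value)
```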
Import the provided dashboard from `grafana-dashboard.json`:

- Grafana UI: http://localhost:3001
- Prometheus data source: http://prometheus:9090

Configuration is driven by environment variables:

| Variable | Description | Default |
|---|---|---|
| `JWT_SECRET` | Secret key for JWT tokens | `your-secret-key` |
| `REDIS_URL` | Redis connection URL | `redis://localhost:6379` |
| `LOG_LEVEL` | Logging level | `INFO` |
| `WORKERS` | Number of worker processes | `4` |
| `RATE_LIMIT_REQUESTS` | Requests per minute | `100` |
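A sketch of how these settings can be read at startup with their documented defaults (illustrative only; the actual settings loader in the codebase may differ):

```python
import os

# Read the core settings, falling back to the documented defaults.
JWT_SECRET = os.getenv("JWT_SECRET", "your-secret-key")
REDIS_URL = os.getenv("REDIS_URL", "redis://localhost:6379")
LOG_LEVEL = os.getenv("LOG_LEVEL", "INFO")
WORKERS = int(os.getenv("WORKERS", "4"))
RATE_LIMIT_REQUESTS = int(os.getenv("RATE_LIMIT_REQUESTS", "100"))  # requests per minute
```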
Each agent can be configured via environment variables or config files:
```python
# Example: Ollama Agent configuration
OLLAMA_BASE_URL = "http://localhost:11434"
OLLAMA_DEFAULT_MODEL = "llama2"
OLLAMA_TIMEOUT = 60
```
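For reference, these settings map onto Ollama's local HTTP API. A sketch that exercises the configured endpoint directly, independent of the agent wrapper:

```python
import requests

OLLAMA_BASE_URL = "http://localhost:11434"
OLLAMA_DEFAULT_MODEL = "llama2"
OLLAMA_TIMEOUT = 60

# Call Ollama's /api/generate endpoint with the configured default model.
resp = requests.post(
    f"{OLLAMA_BASE_URL}/api/generate",
    json={"model": OLLAMA_DEFAULT_MODEL, "prompt": "Say hello", "stream": False},
    timeout=OLLAMA_TIMEOUT,
)
print(resp.json().get("response"))
```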
```bash
# Run all tests
pytest

# Run with coverage
pytest --cov=backend/src

# Run specific test suite
pytest tests/test_agents.py
```
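As an illustration of what a test in `tests/` might look like, here is a minimal health-check test (hypothetical; the real suite's structure may differ):

```python
# tests/test_health.py (illustrative)
import requests

def test_health_endpoint_returns_ok():
    # Assumes the backend is running locally on port 8000.
    resp = requests.get("http://localhost:8000/api/health", timeout=5)
    assert resp.status_code == 200
```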
```bash
# Install testing tools
pip install locust

# Run load tests
locust -f tests/load_test.py --host=http://localhost:8000
```
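`tests/load_test.py` is not shown here; a minimal Locust user class along these lines would exercise the health and agent-listing endpoints (illustrative sketch only):

```python
# Illustrative shape of a Locust load test; the real tests/load_test.py may differ.
from locust import HttpUser, task, between

class SmartAgentsUser(HttpUser):
    wait_time = between(1, 3)  # seconds between simulated user actions

    @task
    def health(self):
        self.client.get("/api/health")

    @task
    def list_agents(self):
        # Token handling omitted; add an Authorization header for protected routes.
        self.client.get("/api/agents")
```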
See DEPLOYMENT.md for the comprehensive deployment guide. Deployment scripts are provided for AWS, Google Cloud, and Azure:
```bash
# Deploy to AWS ECS
./deploy/aws/deploy.sh

# Deploy to Google Cloud Run
./deploy/gcp/deploy.sh

# Deploy to Azure Container Instances
./deploy/azure/deploy.sh
```
Common troubleshooting commands:

```bash
# Check agent status
curl http://localhost:8000/api/agents/status

# View agent logs
tail -f logs/agents.log

# Monitor system resources
htop

# Check Redis memory usage
redis-cli info memory

# View application metrics
curl http://localhost:8000/metrics

# Test WebSocket connection
wscat -c ws://localhost:8000/ws

# Check firewall settings
sudo netstat -tlnp | grep :8000
```
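The same checks can be scripted. A small diagnostic sketch combining the HTTP and Redis checks above, assuming the `requests` and `redis` packages are installed and the default local ports are in use:

```python
import requests
import redis

# HTTP checks against the backend.
for path in ("/api/health", "/api/agents/status", "/metrics"):
    try:
        resp = requests.get(f"http://localhost:8000{path}", timeout=5)
        print(path, resp.status_code)
    except requests.RequestException as exc:
        print(path, "unreachable:", exc)

# Redis memory check (equivalent to `redis-cli info memory`).
try:
    info = redis.Redis.from_url("redis://localhost:6379").info("memory")
    print("redis used_memory_human:", info.get("used_memory_human"))
except redis.RedisError as exc:
    print("redis unreachable:", exc)
```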
Contributing:

1. Create a feature branch: `git checkout -b feature/amazing-feature`
2. Commit your changes: `git commit -m 'Add amazing feature'`
3. Push the branch and open a pull request: `git push origin feature/amazing-feature`

This project is licensed under the MIT License - see the LICENSE file for details.
Made with ❤️ for the enterprise AI community