Real-time Twitter event streaming application that connects via WebSocket to an Apify actor and distributes filtered, deduplicated Twitter events to multiple output channels. Monitor curated crypto Twitter accounts with live CLI streams, interactive web dashboards, and instant alerts.
Get up and running in under 10 minutes:
Clone the repository
git clone <repository-url>
cd crypto-twitter-alpha-stream
Install dependencies
npm install
Configure environment
cp .env.example .env
Add your Apify token
Edit .env and set your token:
APIFY_TOKEN=your_apify_token_here
Start streaming!
npm run dev
That's it! You should now see live Twitter events streaming in your terminal.
Prefer Docker? Run with docker-compose:
# Create .env file with your token
echo "APIFY_TOKEN=your_token_here" > .env
# Start the application
docker-compose up
Access the dashboard at http://localhost:3000
The application uses WebSocket (WSS) to connect to the Apify actor for real-time event streaming. Understanding the protocol helps with troubleshooting and advanced configuration.
wss://muhammetakkurtt--crypto-twitter-tracker.apify.actor/?token=YOUR_TOKEN

After connecting, the client sends a subscribe message to specify which channels to monitor:
{
"op": "subscribe",
"channels": ["all"],
"users": ["elonmusk", "vitalikbuterin"]
}
Fields:
- op: Always "subscribe"
- channels: Array of channel names (see below)
- users: Optional array of usernames for actor-side filtering (omitted if no filters)

Available Channels:
- all: All event types (tweets, follows, profile updates)
- tweets: Only tweet events (post_created)
- following: Only follow events (follow_created)
- profile: Only profile update events (user_updated)

Multiple Channels: You can subscribe to multiple channels simultaneously:
{
"op": "subscribe",
"channels": ["tweets", "following"]
}
User Filtering: Include the users array for actor-side filtering to reduce events and costs:
{
"op": "subscribe",
"channels": ["all"],
"users": ["elonmusk", "vitalikbuterin", "cz_binance"]
}
When you configure user filters in your .env file (via the USERS environment variable), the application automatically includes them in the subscribe message. The actor then filters events server-side, sending only events from your specified users. This reduces both the number of events delivered to your client and your usage costs.
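The subscribe payload above can be assembled with a small helper. This is an illustrative sketch (the names are hypothetical, not the project's actual code) that also applies the omit-when-empty rule for the users field:

```typescript
// Shape of the protocol's subscribe message, as documented above.
interface SubscribeMessage {
  op: "subscribe";
  channels: string[];
  users?: string[];
}

// Hypothetical helper: builds the subscribe payload sent after connecting.
function buildSubscribeMessage(channels: string[], users: string[]): SubscribeMessage {
  const msg: SubscribeMessage = { op: "subscribe", channels };
  // The users field is omitted entirely when no filters are configured,
  // matching the "omitted if no filters" rule above.
  if (users.length > 0) {
    msg.users = users;
  }
  return msg;
}
```

With an empty users list, the resulting JSON contains only op and channels, so the actor applies no user filter.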
Events are sent as JSON messages with this structure:
{
"event_type": "post_created",
"data": {
"username": "elonmusk",
"action": "post_created",
"tweetId": "123456789",
"tweet": {
"id": "123456789",
"type": "tweet",
"created_at": "2024-01-15T14:30:22.000Z",
"body": {
"text": "Bitcoin is the future...",
"urls": [],
"mentions": []
},
"author": {
"handle": "elonmusk",
"id": "44196397",
"verified": true,
"profile": {
"name": "Elon Musk",
"avatar": "https://...",
"bio": "..."
}
},
"metrics": {
"likes": 1000,
"retweets": 500,
"replies": 200,
"views": 50000
}
}
}
}
Event Types:
- connected: Connection established
- subscribed: Subscription confirmed
- post_created: New tweet posted
- follow_created: User followed another user
- user_updated: User profile updated
- shutdown: Server shutting down (client will reconnect after 5 seconds)
- error: Error occurred (includes error code and message)

The application transforms actor events to a standardized internal format:
Actor Format (received from WebSocket):
{
"event_type": "post_created",
"data": { /* nested tweet/user data */ }
}
Internal Format (used by application):
{
"type": "post_created",
"timestamp": "2024-01-15T14:30:22.000Z",
"primaryId": "123456789",
"user": {
"username": "elonmusk",
"displayName": "Elon Musk",
"userId": "44196397"
},
"data": { /* complete actor data preserved */ }
}
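A sketch of how an actor event might be mapped into this internal format. The helper is illustrative (not the project's actual code); field priorities follow the transformation rules below, and the timestamp source (tweet.created_at, falling back to the current time) is an assumption:

```typescript
interface ActorEvent {
  event_type: string;
  data: any;
}

interface InternalEvent {
  type: string;
  timestamp: string;
  primaryId: string;
  user: { username?: string; displayName?: string; userId?: string };
  data: any;
}

// Hypothetical transformer following the documented extraction priorities.
function transformActorEvent(event: ActorEvent, now: Date = new Date()): InternalEvent {
  const d = event.data ?? {};
  // username priority: data.username → data.user.handle → data.tweet.author.handle
  const username = d.username ?? d.user?.handle ?? d.tweet?.author?.handle;
  const userId = d.user?.id ?? d.tweet?.author?.id;
  const displayName = d.user?.profile?.name ?? d.tweet?.author?.profile?.name;
  // primaryId: tweetId for tweets, userId otherwise (simplified here)
  const primaryId = d.tweetId ?? d.tweet?.id ?? userId ?? "";
  return {
    type: event.event_type,
    timestamp: d.tweet?.created_at ?? now.toISOString(),
    primaryId,
    user: { username, displayName, userId },
    // deep copy so later mutation cannot corrupt the original payload
    data: JSON.parse(JSON.stringify(d)),
  };
}
```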
Transformation Rules:
- username: Extracted with priority: data.username → data.user.handle → data.tweet.author.handle
- userId: Extracted from data.user.id or data.tweet.author.id
- displayName: Extracted from data.user.profile.name or data.tweet.author.profile.name
- primaryId: Generated based on event type (tweetId for tweets, userId for follows/updates)
- data: Complete deep copy of actor data field

The application uses an intelligent deduplication system to prevent duplicate events from being processed multiple times. This is critical for avoiding redundant notifications and ensuring data consistency.
Deduplication Key Structure:
The deduplication key is composed of three parts:
{eventType}:{primaryId}:{contentHash}
Components:
- eventType: The type of event (e.g., post_created, follow_updated)
- primaryId: A stable identifier for the entity (see below)
- contentHash: A hash of the event content

Stable Identifier Extraction:
The application uses stable identifiers instead of timestamps for the primaryId field:
Tweet Events (post_created, post_updated):
- Uses tweetId or tweet.id from the event data, e.g. "123456789"

Follow Events (follow_created, follow_updated):
- Uses {userId}-{followingId}, e.g. "44196397-123456"

Profile Events (user_updated, profile_updated, profile_pinned):
- Uses user.id, e.g. "44196397"

Fallback: Only when no stable ID is available, uses {username}-{timestamp}
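The extraction rules above can be sketched as a single function. The event field names (tweetId, tweet.id, user.id, followingId) follow the payloads documented earlier; the helper name is hypothetical:

```typescript
// Illustrative sketch of stable-identifier extraction per event type.
function extractPrimaryId(type: string, data: any): string {
  if (type === "post_created" || type === "post_updated") {
    // Tweets: the tweet's own ID is the stable identifier
    return String(data.tweetId ?? data.tweet?.id);
  }
  if (type === "follow_created" || type === "follow_updated") {
    // Follows: composite of follower ID and followed ID
    return `${data.user?.id}-${data.followingId}`;
  }
  if (type === "user_updated" || type === "profile_updated" || type === "profile_pinned") {
    // Profile updates: the user's ID
    return String(data.user?.id);
  }
  // Fallback only when no stable ID exists
  return `${data.username}-${Date.now()}`;
}
```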
Why Stable IDs Matter:
Using stable identifiers instead of timestamps ensures that the same entity always maps to the same deduplication key: a tweet redelivered after a reconnection produces an identical key and is dropped, whereas a timestamp-based key would differ on every delivery.
Content Hash:
The content hash detects when the same entity has different content:
Example - Tweet Deduplication:
Event 1 (received at 10:00):
{
"type": "post_created",
"primaryId": "123456789",
"data": { "tweet": { "body": { "text": "Hello world" } } }
}
Dedup key: post_created:123456789:abc123
Event 2 (same tweet received at 10:05):
{
"type": "post_created",
"primaryId": "123456789",
"data": { "tweet": { "body": { "text": "Hello world" } } }
}
Dedup key: post_created:123456789:abc123 → Same key, deduplicated
Event 3 (tweet updated at 10:10):
{
"type": "post_updated",
"primaryId": "123456789",
"data": { "tweet": { "body": { "text": "Hello world updated" } } }
}
Dedup key: post_updated:123456789:def456 → Different type and hash, processed
Cache Behavior:
- Cached keys expire automatically after the configured TTL (DEDUP_TTL)

Configuration:
DEDUP_TTL=60 # Dedup cache TTL in seconds (default: 60)
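A minimal sketch of the dedup key and TTL cache described above. This is illustrative, not the actual DedupCache.ts: the hash algorithm, truncation length, and injectable clock are assumptions made for clarity and testability.

```typescript
import { createHash } from "crypto";

// Builds the {eventType}:{primaryId}:{contentHash} key described above.
function generateDedupKey(eventType: string, primaryId: string, content: unknown): string {
  const contentHash = createHash("sha256")
    .update(JSON.stringify(content))
    .digest("hex")
    .slice(0, 12); // truncated for readability; an assumption
  return `${eventType}:${primaryId}:${contentHash}`;
}

// TTL-based cache: a key passes once per TTL window.
class DedupCache {
  private seen = new Map<string, number>(); // key → expiry timestamp (ms)

  constructor(private ttlMs: number, private now: () => number = Date.now) {}

  /** Returns true the first time a key is seen within the TTL window. */
  check(key: string): boolean {
    const t = this.now();
    const expiry = this.seen.get(key);
    if (expiry !== undefined && expiry > t) return false; // duplicate
    this.seen.set(key, t + this.ttlMs);
    return true;
  }
}
```

The injectable clock lets the expiry behavior be verified without waiting for real time to pass.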
Benefits:
✅ Prevents duplicate notifications to users
✅ Reduces unnecessary processing overhead
✅ Works reliably across reconnections
✅ Detects content changes for update events
✅ Memory efficient with automatic expiration
Implementation Location: The deduplication logic is implemented in DedupCache.ts and generateDedupKey() function.
Connected Event:
{
"event_type": "connected",
"data": {
"connection_id": "ws_1234567890_abc123",
"channels": [],
"filter": {"enabled": false}
}
}
Subscribed Event:
{
"event_type": "subscribed",
"data": {
"channels": ["all"],
"filter": {
"enabled": true,
"users_count": 2,
"sample_users": ["elonmusk", "vitalikbuterin"]
}
}
}
Shutdown Event:
{
"event_type": "shutdown",
"data": {
"message": "Server shutting down"
}
}
When a shutdown event is received, the client waits 5 seconds and then automatically reconnects.
Error Event:
{
"event_type": "error",
"data": {
"code": "INVALID_SUBSCRIPTION",
"message": "channels must be an array"
}
}
The WebSocket connection uses protocol-level ping/pong frames for health monitoring:
- The ws library automatically responds with pong frames

The client implements automatic reconnection with exponential backoff:
Exponential Backoff Formula:
delay = min(initialDelay × multiplier^attempts, maxDelay)
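The formula can be computed directly. The default values here (1000 ms initial delay, 2.0 multiplier, 30000 ms cap) are taken from the reconnection settings shown in the troubleshooting section, and are assumptions about the actual defaults:

```typescript
// Exponential backoff: delay = min(initialDelay × multiplier^attempt, maxDelay)
function reconnectDelay(
  attempt: number,
  initialDelayMs = 1000, // RECONNECT_INITIAL_DELAY
  multiplier = 2.0, // RECONNECT_BACKOFF_MULTIPLIER
  maxDelayMs = 30000, // RECONNECT_MAX_DELAY
): number {
  return Math.min(initialDelayMs * Math.pow(multiplier, attempt), maxDelayMs);
}
```

Successive attempts wait 1 s, 2 s, 4 s, 8 s, 16 s, then stay capped at 30 s.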
Default Configuration:
Reconnection Sequence:
Special Cases:
The application accepts flexible URL formats and automatically converts them:
Accepted Formats:
- http://host → Converted to ws://host for WebSocket
- https://host → Converted to wss://host for WebSocket
- ws://host → Used as-is for WebSocket
- wss://host → Used as-is for WebSocket

REST Endpoints: For HTTP endpoints like /active-users and /health, the URL is converted back:
- ws://host → http://host
- wss://host → https://host

This allows you to configure the base URL in any format, and the application handles the conversion automatically.
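The conversions amount to two scheme rewrites; a sketch (the helper names are illustrative):

```typescript
// http/https → ws/wss for the WebSocket connection.
// Order matters: the "http:" pattern must not also match "https:".
function toWebSocketUrl(base: string): string {
  return base.replace(/^http:/, "ws:").replace(/^https:/, "wss:");
}

// ws/wss → http/https for REST endpoints like /active-users and /health.
function toHttpUrl(base: string): string {
  return base.replace(/^ws:/, "http:").replace(/^wss:/, "https:");
}
```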
The Apify actor uses a per-client charging model, meaning each connected client is charged separately based on the events delivered to that client.
Without User Filtering:
With User Filtering:
- Set the USERS environment variable with specific usernames

Example 1: No Filter
USERS=
# Result: Receives events from all monitored accounts
Example 2: Filter 3 Users
USERS=elonmusk,vitalikbuterin,cz_binance
# Result: Receives ONLY these 3 accounts' events
# You are charged only for events from these 3 accounts
Example 3: Filter 10 Users
USERS=elonmusk,vitalikbuterin,cz_binance,SBF_FTX,justinsuntron,aantonop,APompliano,naval,balajis,VitalikButerin
# Result: Receives ONLY these 10 accounts' events
# You are charged only for events from these 10 accounts
Follow these steps to optimize costs with user filtering:
Step 1: Check the Monitored Users List
Before configuring filters, check which accounts are returned by the actor's monitored users endpoint:
curl -H "Authorization: Bearer YOUR_APIFY_TOKEN" \
https://muhammetakkurtt--crypto-twitter-tracker.apify.actor/active-users
This returns a JSON array of monitored usernames:
["elonmusk", "vitalikbuterin", "cz_binance", "SBF_FTX", ...]
Note: The actor may monitor additional users beyond this list. This endpoint returns a subset for reference.
Step 2: Configure Your User Filter
Add the usernames you want to monitor to your .env file:
USERS=elonmusk,vitalikbuterin,cz_binance
Format Rules:
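A plausible parsing and normalization of the comma-separated USERS value. The exact rules (trimming, stripping a leading @, lowercasing, deduplicating, sorting) are assumptions; the dashboard's "Use selected users" flow does show a sorted, normalized list:

```typescript
// Hypothetical parser for the USERS environment variable.
function parseUsersEnv(raw: string | undefined): string[] {
  if (!raw) return [];
  const users = raw
    .split(",")
    .map((u) => u.trim().replace(/^@/, "").toLowerCase()) // normalize each handle
    .filter((u) => u.length > 0); // drop empty entries from stray commas
  return [...new Set(users)].sort(); // dedupe and order deterministically
}
```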
Step 3: Start the Application
npm start
The application will:
How User Filtering Works:
When you configure user filters, the application includes them in the WebSocket subscribe message:
{
"op": "subscribe",
"channels": ["all"],
"users": ["elonmusk", "vitalikbuterin", "cz_binance"]
}
The actor receives this subscribe message and applies the user filter server-side. Only events from the specified users are sent to your client over the WebSocket connection. This means:
⚠️ Monitored users list is informational: The /active-users endpoint returns a list of monitored users, but the actor may monitor additional users. If you configure a username not in the returned list, you may still receive events if the actor monitors that user.
⚠️ Validation warnings are advisory: The application validates filters on startup and warns about usernames not in the returned list. These warnings are informational - the actor may still monitor those users.
✅ Actor-side filtering: By filtering at the source, you only receive events from your specified users. This reduces the number of events delivered to your client.
✅ Client-side filtering still works: You can still use KEYWORDS to further filter events after they're received. This provides additional refinement.
✅ Two-layer filtering system: The application uses both actor-side filtering (by users) and client-side filtering (by keywords, event types):
When you start the application with user filters, it validates your configuration:
Valid Configuration:
✅ All configured users are in the returned monitored users list
✅ Connecting with user filters: elonmusk, vitalikbuterin
Advisory Warning:
⚠️ WARNING: User filter validation notice!
The following usernames are NOT in the returned monitored users list:
someuser, anotheruser
Note: The actor may monitor additional users beyond this list.
If these users are monitored by the actor, you WILL receive their events.
Valid configured users (confirmed in list):
elonmusk, vitalikbuterin
Sample of returned monitored users:
elonmusk, vitalikbuterin, cz_binance, SBF_FTX, ...
To see the full list of monitored users, visit:
/active-users endpoint
The application will still proceed with the connection, but you'll only receive events for valid usernames.
The dashboard provides runtime subscription management, allowing you to modify subscription parameters (channels and users) without restarting the application. This feature uses a staged-apply UX pattern with security controls.
Runtime subscription management enables you to:
Important: Runtime changes are temporary and do not persist across restarts. To make changes permanent, edit your configuration file.
The dashboard uses a staged-apply workflow to prevent accidental changes:
Benefits:
Example Workflow:
Current State: channels=["all"], users=["elonmusk"]
↓
User modifies in UI: channels=["tweets"], users=["elonmusk", "vitalikbuterin"]
↓
Staged state updated (not yet applied)
↓
User clicks "Apply Changes"
↓
Server updates subscription atomically
↓
All dashboards receive broadcast and update
Subscription modifications are restricted based on client origin:
Control Clients (localhost connections):
Read-Only Clients (remote connections):
Why This Matters:
Example:
// Local client (127.0.0.1)
socket.emit('setRuntimeSubscription', {...});
// ✅ Success: Subscription updated
// Remote client (192.168.1.100)
socket.emit('setRuntimeSubscription', {...});
// ❌ Error: Forbidden
Idle mode allows you to pause monitoring while maintaining the connection:
What is Idle Mode?
- Subscribing with channels: [] keeps the WebSocket connection open but delivers no events

When to Use Idle Mode:
How to Enter Idle Mode:
Via Dashboard:
Via API:
socket.emit('setRuntimeSubscription', {
channels: [],
users: []
}, (response) => {
// response.data.mode === "idle"
});
How to Exit Idle Mode:
Simply select channels and apply:
socket.emit('setRuntimeSubscription', {
channels: ['all'],
users: []
}, (response) => {
// response.data.mode === "active"
});
socket.emit('getRuntimeSubscription', (response) => {
if (response.success) {
console.log('Channels:', response.data.channels);
console.log('Users:', response.data.users);
console.log('Mode:', response.data.mode);
console.log('Source:', response.data.source);
console.log('Updated:', response.data.updatedAt);
}
});
// Switch from all events to tweets only
socket.emit('setRuntimeSubscription', {
channels: ['tweets'],
users: ['elonmusk', 'vitalikbuterin']
}, (response) => {
if (response.success) {
console.log('Now monitoring tweets only');
}
});
// Add user filters to reduce event volume
socket.emit('setRuntimeSubscription', {
channels: ['all'],
users: ['elonmusk', 'vitalikbuterin', 'cz_binance']
}, (response) => {
if (response.success) {
console.log('Now monitoring 3 users only');
}
});
// Pause monitoring
socket.emit('setRuntimeSubscription', {
channels: [],
users: []
}, (response) => {
if (response.success) {
console.log('Entered idle mode - no events will be received');
}
});
// Listen for subscription changes from other clients
socket.on('runtimeSubscriptionUpdated', (state) => {
console.log('Subscription updated by another client');
console.log('New channels:', state.channels);
console.log('New users:', state.users);
// Update UI to reflect new state
});
It's important to understand the difference between global subscription and local dashboard filters:
Global Subscription (Runtime Subscription Management):
Local Dashboard Filters (Client-Side):
Example:
Global Subscription: users=["elonmusk", "vitalikbuterin"]
↓
Actor sends only these 2 users' events
↓
Dashboard receives events from 2 users
↓
Local Filter: keywords=["bitcoin"]
↓
Dashboard displays only bitcoin-related events
Best Practice: Use global subscription to control event volume and costs, then use local filters for UI refinement.
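The local refinement layer boils down to a keyword match applied after actor-side user filtering. A sketch of that check, assuming the case-insensitive substring semantics described for KEYWORDS (the helper name is illustrative):

```typescript
// Client-side keyword filter: passes everything when no keywords are set,
// otherwise requires at least one case-insensitive keyword match.
function matchesKeywords(tweetText: string, keywords: string[]): boolean {
  if (keywords.length === 0) return true; // no keyword filter → pass all
  const text = tweetText.toLowerCase();
  return keywords.some((k) => text.includes(k.toLowerCase()));
}
```

Because this runs after delivery, it refines what is displayed but does not reduce event volume or cost; only the upstream users filter does that.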
The dashboard provides a convenient "Use selected users" feature to copy your local filter selections to the global upstream subscription:
Workflow:
Important Notes:
Example Workflow:
Step 1: Select users in left sidebar
Local Filters: ["elonmusk", "vitalikbuterin", "cz_binance"]
↓
Step 2: Click "Use selected users (3)"
Upstream Draft: ["cz_binance", "elonmusk", "vitalikbuterin"] (normalized)
↓
Step 3: Click "Apply Changes"
Global Subscription: ["cz_binance", "elonmusk", "vitalikbuterin"]
↓
Actor now sends only these 3 users' events
Why Manual Copy?:
Additional Features:
Runtime Changes Are Temporary:
- The source field indicates origin: "config" (from file) or "runtime" (modified at runtime)

Making Changes Permanent:
To make runtime changes permanent, edit your configuration file:
- Edit .env or config/config.json
- Set CHANNELS and USERS to match the desired state
- Restart the application

Example:
# Before restart (runtime changes)
# channels: ["tweets"], users: ["elonmusk"]
# Edit .env to make permanent
CHANNELS=tweets
USERS=elonmusk
# After restart
# source: "config", channels: ["tweets"], users: ["elonmusk"]
See Configuration Guide for details on making changes permanent.
The application supports three configuration methods with the following priority:
The required configuration includes your Apify token and the actor URL:
APIFY_TOKEN=your_apify_token_here
APIFY_ACTOR_URL=https://muhammetakkurtt--crypto-twitter-tracker.apify.actor
# Apify actor URL (already set to the deployed actor)
# Supports http/https/ws/wss formats - automatically converted
APIFY_ACTOR_URL=https://muhammetakkurtt--crypto-twitter-tracker.apify.actor
# Select channels (comma-separated: all, tweets, following, profile)
# Can specify multiple channels to subscribe to multiple event types
CHANNELS=all
# Filter by specific users (comma-separated)
USERS=elonmusk,vitalikbuterin,cz_binance
# Filter by keywords (comma-separated, case-insensitive)
KEYWORDS=bitcoin,ethereum,defi
# Debug mode (enables detailed logging throughout event processing pipeline)
# Set to 'true' to enable verbose logging for troubleshooting
# WARNING: Debug mode is very verbose and may impact performance
DEBUG=false
# Enable/disable web dashboard
# Default: false (disabled), set to true to enable
DASHBOARD_ENABLED=false
DASHBOARD_PORT=3000
# Telegram alerts
TELEGRAM_ENABLED=false
TELEGRAM_BOT_TOKEN=your_bot_token
TELEGRAM_CHAT_ID=your_chat_id
# Discord alerts
DISCORD_ENABLED=false
DISCORD_WEBHOOK_URL=https://discord.com/api/webhooks/...
# Generic webhook
WEBHOOK_ENABLED=false
WEBHOOK_URL=https://your-webhook-url.com/endpoint
For complete configuration options, see:
- .env.example - All environment variables with descriptions
- docs/CONFIGURATION.md - Detailed configuration guide
- config/config.example.json - JSON configuration example

Monitor all events in your terminal:
npm run dev
Output:
[post_created] @elonmusk: Bitcoin is the future of money...
[profile_update] @vitalikbuterin: changed bio
[following] @cz_binance: followed @SBF_FTX
--- Stats (60s) ---
events_total=120 delivered=95 deduped=25 rate=2.0/s
Enable the dashboard to get a visual interface:
DASHBOARD_ENABLED=true
DASHBOARD_PORT=3000
npm run dev
Open http://localhost:3000 in your browser to:
Get instant notifications on Telegram with rich formatting:
TELEGRAM_ENABLED=true
TELEGRAM_BOT_TOKEN=123456789:ABCdefGHIjklMNOpqrsTUVwxyz
TELEGRAM_CHAT_ID=123456789
ALERT_RATE_LIMIT=30
You'll receive rich formatted alerts with:
Example message:
🐦 New Tweet
👤 @elonmusk
Bitcoin is the future of money and will replace all fiat currencies...
🖼️ 2 images
🎥 1 video(s)
📅 2024-01-15 14:30:22 UTC
[🔗 View Tweet] [👤 View Profile]
Monitor only specific accounts for certain keywords:
CHANNELS=tweets
USERS=elonmusk,vitalikbuterin,cz_binance
KEYWORDS=bitcoin,ethereum,btc,eth
This will only show tweets from these three users that mention crypto keywords.
Send rich embeds to a Discord channel:
DISCORD_ENABLED=true
DISCORD_WEBHOOK_URL=https://discord.com/api/webhooks/123456789/abcdefg
ALERT_RATE_LIMIT=30
Discord embeds include:
Example embed features:
Run all outputs at once:
CLI_ENABLED=true
DASHBOARD_ENABLED=true
TELEGRAM_ENABLED=true
DISCORD_ENABLED=true
Events will be broadcast to all enabled channels independently.
WebSocket connection for real-time event streaming from the Apify actor.
URL: wss://muhammetakkurtt--crypto-twitter-tracker.apify.actor/
Authentication: Token passed as query parameter: ?token=YOUR_TOKEN
Protocol:
Subscribe Message:
{
"op": "subscribe",
"channels": ["all"],
"users": ["elonmusk", "vitalikbuterin"]
}
See WebSocket Protocol section for detailed protocol documentation.
GET /status
Returns application health and statistics.
Port: 3001 (configurable via HEALTH_PORT)
Response:
{
"connection": {
"status": "connected",
"channels": ["all"],
"uptime": 3600
},
"events": {
"total": 1250,
"delivered": 980,
"deduped": 270,
"rate": 2.5
},
"alerts": {
"telegram": { "sent": 45, "failed": 2 },
"discord": { "sent": 45, "failed": 0 },
"webhook": { "sent": 0, "failed": 0 }
},
"filters": {
"users": ["elonmusk", "vitalikbuterin"],
"keywords": ["bitcoin", "ethereum"]
}
}
Example:
curl http://localhost:3001/status
WebSocket connection for real-time event streaming to the dashboard.
Port: 3000 (configurable via DASHBOARD_PORT)
Events:
- event - New Twitter event
- connection-status - Connection status change
- activeUsers - Active users list update

See docs/API.md for detailed API documentation.
Terminal-based live stream with periodic statistics.
Features:
Configuration:
CLI_ENABLED=true
CLI_STATS_INTERVAL=60000 # Display stats every 60 seconds
Interactive web interface for visual monitoring.
Features:
Access: http://localhost:3000 (default)
Configuration:
DASHBOARD_ENABLED=false # Default: false, set to true to enable
DASHBOARD_PORT=3000
Push notifications to external services.
Telegram:
TELEGRAM_ENABLED=true
TELEGRAM_BOT_TOKEN=your_bot_token
TELEGRAM_CHAT_ID=your_chat_id
ALERT_RATE_LIMIT=30
Features:
Discord:
DISCORD_ENABLED=true
DISCORD_WEBHOOK_URL=https://discord.com/api/webhooks/...
ALERT_RATE_LIMIT=30
Features:
Generic Webhook:
WEBHOOK_ENABLED=true
WEBHOOK_URL=https://your-webhook-url.com/endpoint
ALERT_RATE_LIMIT=30
Rate Limiting: All alert channels support configurable rate limiting via ALERT_RATE_LIMIT (default: 10 messages per minute). For high-volume crypto streams, consider setting this to 30 or higher.
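A sliding-window limiter is one way to implement a per-minute cap like ALERT_RATE_LIMIT. This sketch is illustrative, not the project's RateLimiter.ts; the injectable clock is an assumption made for testability:

```typescript
// Sliding-window rate limiter: allows at most limitPerMinute sends
// within any trailing 60-second window.
class AlertRateLimiter {
  private sent: number[] = []; // timestamps (ms) of recent sends

  constructor(private limitPerMinute: number, private now: () => number = Date.now) {}

  /** Returns true if a message may be sent now; records the send if so. */
  tryAcquire(): boolean {
    const t = this.now();
    // Drop timestamps older than the 60-second window
    this.sent = this.sent.filter((ts) => t - ts < 60_000);
    if (this.sent.length >= this.limitPerMinute) return false;
    this.sent.push(t);
    return true;
  }
}
```

An alert channel would call tryAcquire() before each send and drop (or queue) the message when it returns false.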
Problem: "Failed to connect to WebSocket endpoint"
Solutions:
- Verify your APIFY_TOKEN is correct
- Test the actor endpoint directly:
  curl -H "Authorization: Bearer YOUR_TOKEN" https://muhammetakkurtt--crypto-twitter-tracker.apify.actor/health

Problem: "WebSocket connection closed unexpectedly"
Solutions:
- Enable debug logging: DEBUG=true npm start

Problem: "Subscription timeout - no subscribed event received"
Solutions:
- Check your channel configuration, e.g. CHANNELS=all

Problem: "Authentication failed"
Solutions:
- Verify the token in your .env

Problem: No events appearing
Solutions:
- Check your filters (USERS and KEYWORDS)
- If using a USERS filter, verify usernames are in the active users list:
  curl -H "Authorization: Bearer YOUR_TOKEN" \
    https://muhammetakkurtt--crypto-twitter-tracker.apify.actor/active-users
- Check the status endpoint: curl http://localhost:3001/status
- Enable debug logging: DEBUG=true npm start

Problem: Configured user filters but receiving no events
Solutions:
- Try a known-active username such as elonmusk to test

Problem: "Reconnection loop - constantly reconnecting"
Solutions:
RECONNECT_INITIAL_DELAY=1000
RECONNECT_MAX_DELAY=30000
RECONNECT_BACKOFF_MULTIPLIER=2.0
- Enable debug logging: DEBUG=true npm start

Problem: "Events are delayed or arriving in bursts"
Solutions:
Problem: "Heartbeat timeout - connection considered dead"
Solutions:
Problem: "Server shutdown event received"
Solutions:
Problem: "Invalid subscribe message error"
Solutions:
- Verify CHANNELS is set correctly (comma-separated: all,tweets,following,profile)
- Try the default: CHANNELS=all

Problem: Cannot access dashboard at http://localhost:3000
Solutions:
- Ensure DASHBOARD_ENABLED=true in your .env
- Check the configured port (DASHBOARD_PORT)
- Try http://127.0.0.1:3000 instead

Problem: Telegram/Discord alerts not arriving
Solutions:
curl -X POST -H "Content-Type: application/json" \
-d '{"content":"Test message"}' \
YOUR_WEBHOOK_URL
Problem: Application consuming too much memory
Solutions:
- Lower DEDUP_TTL to expire cache entries faster

Problem: Receiving duplicate events
Solutions:
- Increase DEDUP_TTL (default: 60 seconds)

Problem: npm test or npm run build fails
Solutions:
- Delete node_modules and reinstall: rm -rf node_modules && npm install
- Clear the build output: rm -rf dist
- Check your Node.js version: node --version
- Run the type checker: npx tsc --noEmit

Enable debug mode for comprehensive logging throughout the event processing pipeline. This is essential for troubleshooting data transformation issues, validation failures, and understanding how events flow through the system.
Enable Debug Mode:
# Enable debug logging
DEBUG=true npm start
# Or in development mode
DEBUG=true npm run dev
# Or set in .env file
DEBUG=true
What Debug Mode Logs:
Debug mode provides detailed visibility into the event processing pipeline:
Raw Actor Events (WSSClient):
Transformed Events (WSSClient):
Validation Failures (StreamCore):
Event Structure Details:
Transformation Errors:
WebSocket Connection Events:
Example Debug Output:
[WSSClient] WebSocket connected
[WSSClient] Sending subscribe message: {"op":"subscribe","channels":["all"],"users":["elonmusk"]}
[WSSClient] Subscribed successfully: {"channels":["all"],"filter":{"enabled":true,"users_count":1}}
[WSSClient] Raw actor event: {
"data": {
"username": "elonmusk",
"action": "post_created",
"tweetId": "123456789",
"tweet": {
"body": { "text": "Hello world" },
"author": { "handle": "elonmusk", "id": "44196397" }
}
},
"event_type": "post_created"
}
[WSSClient] Transformed event: {
"type": "post_created",
"timestamp": "2024-01-15T14:30:22.000Z",
"primaryId": "123456789",
"user": {
"username": "elonmusk",
"displayName": "Elon Musk",
"userId": "44196397"
},
"data": { ... }
}
[StreamCore] Event validated successfully
When to Use Debug Mode:
Performance Impact:
⚠️ Warning: Debug mode is very verbose and logs large JSON structures. This can:
Best Practices:
✅ Enable temporarily for troubleshooting specific issues
✅ Disable in production unless actively debugging
✅ Use with filters to reduce log volume (e.g., USERS=elonmusk)
✅ Redirect output to a file for analysis: DEBUG=true npm start > debug.log 2>&1
Troubleshooting with Debug Mode:
Problem: Events not appearing in CLI
DEBUG=true npm start
# Check if raw actor events are being received
# Check if transformation is working correctly
# Check if validation is passing
Problem: Missing tweet text or user information
DEBUG=true npm start
# Compare raw actor event structure with transformed event
# Verify nested fields are being preserved
# Check extraction logic is accessing correct paths
Problem: Events being rejected
DEBUG=true npm start
# Look for validation failure messages
# Check which required fields are missing
# Verify event structure matches expected format
Disabling Debug Mode:
# Remove from command line
npm start
# Or set in .env file
DEBUG=false
# Or remove the DEBUG variable entirely
Note: Debug logging is controlled by the DEBUG environment variable. Set it to 'true' (string) to enable, or 'false'/unset to disable.
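The gating rule from the note above, as a tiny sketch: only the exact string "true" enables debug logging, so "false", an unset variable, or any other value leaves it off. The helper name is hypothetical:

```typescript
// Debug is enabled only when DEBUG is exactly the string "true".
function isDebugEnabled(env: Record<string, string | undefined>): boolean {
  return env.DEBUG === "true";
}
```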
If you're still experiencing issues:
- Enable DEBUG=true for detailed diagnostics

# Development mode with auto-reload
npm run dev
# Build for production
npm run build
# Run production build
npm start
The dashboard is built with Svelte 5 and requires separate build steps:
# Navigate to frontend directory
cd frontend
# Install frontend dependencies
npm install
# Development mode with hot reload
npm run dev
# Build for production
npm run build
# Run frontend tests
npm test
# Type checking
npm run check
The production build output (frontend/build/) is automatically served by the backend's DashboardOutput.
For detailed frontend documentation, see:
# Run all tests
npm test
# Run tests in watch mode
npm run test:watch
# Generate coverage report
npm run test:coverage
# Run specific test file
npm test -- WSSClient.test.ts
The test suite includes:
crypto-twitter-alpha-stream/
├── src/                          # Backend TypeScript source
│   ├── index.ts                  # Application entry point
│   ├── Application.ts            # Main application orchestrator
│   ├── config/                   # Configuration management
│   │   ├── ConfigManager.ts      # Config loader with priority resolution
│   │   └── types.ts              # Configuration type definitions
│   ├── ws/                       # WebSocket client with reconnection
│   │   └── WSSClient.ts          # WebSocket wrapper with exponential backoff
│   ├── filters/                  # Event filtering pipeline
│   │   ├── FilterPipeline.ts     # Filter chain orchestrator
│   │   └── EventFilter.ts        # User and keyword filters
│   ├── streamcore/               # Core event processing
│   │   └── StreamCore.ts         # Event validation and distribution
│   ├── outputs/                  # Output channels
│   │   ├── CLIOutput.ts          # Terminal output with stats
│   │   ├── DashboardOutput.ts    # WebSocket server for dashboard
│   │   ├── AlertOutput.ts        # Alert orchestrator
│   │   ├── AlertChannel.ts       # Telegram, Discord, Webhook channels
│   │   └── RateLimiter.ts        # Rate limiting for alerts
│   ├── activeusers/              # Active users fetcher
│   │   └── ActiveUsersFetcher.ts # Periodic user list refresh
│   ├── health/                   # Health monitoring
│   │   └── HealthMonitor.ts      # HTTP status endpoint
│   ├── models/                   # Data models and types
│   │   └── types.ts              # Event types and interfaces
│   ├── utils/                    # Utility functions
│   │   └── LogSanitizer.ts       # Sensitive data sanitization
│   ├── validation/               # Input validation
│   │   └── UserFilterValidator.ts # User filter validation
│   ├── dedup/                    # Deduplication
│   │   └── DedupCache.ts         # TTL-based event cache
│   └── eventbus/                 # Event bus
│       └── EventBus.ts           # Pub/sub for internal events
├── frontend/                     # Dashboard frontend (Svelte 5)
│   ├── src/
│   │   ├── lib/                  # Reusable components and stores
│   │   │   ├── components/       # Svelte 5 components
│   │   │   ├── stores/           # Svelte 5 runes-based stores
│   │   │   ├── hooks/            # Custom hooks
│   │   │   ├── utils/            # Frontend utilities
│   │   │   └── types/            # TypeScript types
│   │   ├── routes/               # SvelteKit routes
│   │   └── app.css               # Global styles (Tailwind)
│   ├── build/                    # Production build (served by backend)
│   ├── static/                   # Static assets
│   ├── tests/                    # Frontend tests (Vitest)
│   └── vite.config.ts            # Vite configuration
├── tests/                        # Backend tests (Jest)
│   ├── unit tests                # Specific examples and edge cases
│   ├── integration tests         # Component interactions
│   └── property tests            # Property-based tests (fast-check)
├── config/                       # Configuration files
├── docs/                         # Documentation
└── dist/                         # Compiled backend output
crypto-twitter-alpha-stream/
├── src/                          # Backend source code
│   ├── index.ts                  # Application entry point
│   ├── Application.ts            # Main orchestrator
│   ├── config/                   # Configuration management
│   ├── ws/                       # WebSocket client with reconnection
│   ├── filters/                  # Event filtering pipeline
│   ├── streamcore/               # Core event processing
│   ├── outputs/                  # Output channels (CLI, Dashboard, Alerts)
│   ├── activeusers/              # Active users fetcher
│   ├── health/                   # Health monitoring
│   ├── models/                   # Data models and types
│   ├── utils/                    # Utility functions (LogSanitizer)
│   ├── validation/               # Input validation (UserFilterValidator)
│   ├── dedup/                    # Deduplication cache
│   └── eventbus/                 # Event bus for pub/sub
├── frontend/                     # Dashboard frontend (Svelte 5 + TypeScript)
│   ├── src/                      # Svelte components and stores
│   ├── build/                    # Production build output (served by backend)
│   ├── docs/                     # Frontend documentation
│   └── ...                       # Vite, Tailwind, test configs
├── tests/                        # Test files (mirrors src/ structure)
│   ├── unit tests                # Specific examples and edge cases
│   ├── integration tests         # Component interactions
│   └── property tests            # Universal correctness properties
├── config/                       # Configuration files
│   ├── config.json               # Optional JSON config (gitignored)
│   ├── config.example.json       # Example configuration
│   └── README.md                 # Configuration guide
├── docs/                         # Documentation
│   ├── API.md                    # API documentation
│   └── CONFIGURATION.md          # Detailed configuration guide
├── dist/                         # Compiled backend output
├── .env.example                  # Environment variables template
├── .env                          # Local environment variables (gitignored)
├── docker-compose.yml            # Docker Compose configuration
├── Dockerfile                    # Multi-stage Docker build
└── package.json                  # Project metadata and scripts