Utility Scripts

This document describes the utility scripts available in the Tracker GraphQL application for maintenance, testing, and troubleshooting.

Overview

The application includes several utility scripts located in the scripts/ directory that help with common tasks:

  1. Database Maintenance: Scripts for checking and refreshing database tables
  2. User Management: Scripts for creating and managing users
  3. Cache Management: Scripts for flushing the Redis cache
  4. Data Listing: Scripts for listing trackers and production runs

Database Maintenance Scripts

check_aggregated_tables.py

This script checks the status of the TimescaleDB continuous aggregates and materialized views.

python scripts/check_aggregated_tables.py

Output example:

Checking location_history_hourly...
- Last refresh: 2025-04-17 13:45:00+01
- Row count: 12,543

Checking location_history_daily...
- Last refresh: 2025-04-17 00:00:00+01
- Row count: 1,245

Checking location_history materialized view...
- Last refresh: 2025-04-17 14:00:00+01
- Row count: 13,788
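A minimal sketch of how the script's status report can be rendered, based on the output format shown above. The function name is an assumption; fetching the actual refresh time and row count requires a database connection and is out of scope here.

```python
from datetime import datetime

def format_aggregate_status(name: str, last_refresh: datetime, row_count: int) -> str:
    """Render one aggregate's status block in the script's output format.

    In the real script, row_count would come from `SELECT count(*) FROM <view>`
    and last_refresh from the TimescaleDB job catalog.
    """
    return (
        f"Checking {name}...\n"
        f"- Last refresh: {last_refresh:%Y-%m-%d %H:%M:%S%z}\n"
        f"- Row count: {row_count:,}"
    )
```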

refresh_continuous_aggregates.py

This script manually refreshes the TimescaleDB continuous aggregates and materialized views.

python scripts/refresh_continuous_aggregates.py

You can specify a time range to refresh:

python scripts/refresh_continuous_aggregates.py --start "2025-04-10" --end "2025-04-17"
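Internally, a manual refresh of a continuous aggregate boils down to TimescaleDB's `refresh_continuous_aggregate` procedure. This sketch only builds the parameterised statement (the helper name is an assumption); executing it requires a live database connection.

```python
from typing import Optional, Tuple

def build_refresh_sql(view_name: str,
                      start: Optional[str] = None,
                      end: Optional[str] = None) -> Tuple[str, tuple]:
    """Build a parameterised CALL for TimescaleDB's refresh procedure.

    Passing NULL for start/end refreshes the aggregate's entire range,
    which is what running the script without --start/--end would do.
    """
    return (
        "CALL refresh_continuous_aggregate(%s, %s, %s)",
        (view_name, start, end),
    )
```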

User Management Scripts

create_user.py

This script creates a new user in the database.

python scripts/create_user.py --email user@example.com --password securepassword --name "User Name" --role admin

Available roles:

  • admin: Full access to all features
  • user: Standard user access
  • viewer: Read-only access
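A sketch of the user-creation flow, assuming the script validates the role and hashes the password before storing it. The salted-scrypt scheme and field names here are illustrative; the real script may use a different hashing library.

```python
import hashlib
import secrets

VALID_ROLES = {"admin", "user", "viewer"}

def hash_password(password: str) -> str:
    """Salted scrypt hash, stored as 'salt$digest' (illustrative scheme)."""
    salt = secrets.token_hex(16)
    digest = hashlib.scrypt(password.encode(), salt=salt.encode(),
                            n=2**14, r=8, p=1)
    return f"{salt}${digest.hex()}"

def build_user_record(email: str, password: str, name: str, role: str) -> dict:
    """Validate inputs and assemble the row to insert into the users table."""
    if role not in VALID_ROLES:
        raise ValueError(f"invalid role: {role}")
    return {"email": email, "name": name, "role": role,
            "password_hash": hash_password(password)}
```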

Cache Management Scripts

flush_cache.py

This script flushes the Redis cache, which can be useful when troubleshooting or after making database changes.

python scripts/flush_cache.py

You can flush specific cache keys:

python scripts/flush_cache.py --pattern "location_history:*"
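Redis key patterns such as `location_history:*` use glob-style matching. The selection step can be sketched locally with `fnmatch` (the helper name is an assumption); against a real server the script would iterate matching keys with redis-py's `scan_iter(match=pattern)` and delete them.

```python
from fnmatch import fnmatchcase
from typing import Iterable, List

def keys_to_flush(keys: Iterable[str], pattern: str = "*") -> List[str]:
    """Return the cache keys a glob pattern would delete (useful for --dry-run)."""
    return [k for k in keys if fnmatchcase(k, pattern)]
```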

Data Import Scripts

import_trackers.py

This script imports tracker data from a CSV file into the trackers table.

python scripts/import_trackers.py <production_run_id> --csv-path path/to/trackers.csv

The script:

  1. Validates that the production run exists
  2. Reads tracker data from the CSV file
  3. Handles flexible column naming in the CSV
  4. Skips rows with missing required fields
  5. Avoids creating duplicate trackers
  6. Provides detailed logging of the import process
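Steps 2-5 above can be sketched as a small parsing function. The column aliases and required fields are assumptions about the CSV layout; the real script defines its own mapping.

```python
import csv
import io

# Map flexible CSV header spellings to canonical field names (illustrative).
COLUMN_ALIASES = {"tracker_name": "name", "tracker name": "name", "name": "name",
                  "serial": "serial_number", "serial_number": "serial_number"}
REQUIRED = ("name", "serial_number")

def parse_trackers(csv_text: str) -> list:
    """Read tracker rows, skipping incomplete rows and duplicate serials."""
    seen, rows = set(), []
    for raw in csv.DictReader(io.StringIO(csv_text)):
        # Normalise headers case-insensitively via the alias table.
        row = {COLUMN_ALIASES[k.strip().lower()]: (v or "").strip()
               for k, v in raw.items()
               if k and k.strip().lower() in COLUMN_ALIASES}
        if any(not row.get(f) for f in REQUIRED):
            continue  # skip rows with missing required fields
        if row["serial_number"] in seen:
            continue  # avoid creating duplicate trackers
        seen.add(row["serial_number"])
        rows.append(row)
    return rows
```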

Example usage:

# Import trackers for production run ID 5 using the default CSV file
python scripts/import_trackers.py 5

# Import trackers for production run ID 10 using a specific CSV file
python scripts/import_trackers.py 10 --csv-path data/new-trackers.csv

Data Listing Scripts

list_trackers.py

This script lists all trackers in the database with their current status.

python scripts/list_trackers.py

You can filter by client:

python scripts/list_trackers.py --client-id 123

Output example:

ID | Name        | Status     | Last Report         | Client
---+-------------+------------+---------------------+---------
 1 | Tracker-001 | in_transit | 2025-04-17 13:45:00 | Client A
 2 | Tracker-002 | in_storage | 2025-04-17 12:30:00 | Client A
 3 | Tracker-003 | delivered  | 2025-04-16 09:15:00 | Client B
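The listing scripts print aligned tables like the one above. A minimal sketch of that formatting step (the helper name is an assumption):

```python
def format_table(headers: list, rows: list) -> str:
    """Render headers and rows as a pipe-separated table with aligned columns."""
    widths = [max(len(str(x)) for x in (h, *col))
              for h, col in zip(headers, zip(*rows))]
    sep = "-+-".join("-" * w for w in widths)
    fmt = " | ".join(f"{{:<{w}}}" for w in widths)
    lines = [fmt.format(*headers), sep]
    lines += [fmt.format(*map(str, r)) for r in rows]
    return "\n".join(lines)
```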

list_production_runs.py

This script lists all production runs in the database.

python scripts/list_production_runs.py

You can filter by client:

python scripts/list_production_runs.py --client-id 123

Output example:

ID | Name        | Start Date | End Date   | Client   | Trackers
---+-------------+------------+------------+----------+---------
 1 | Run-2025-01 | 2025-01-15 | 2025-02-28 | Client A |        5
 2 | Run-2025-02 | 2025-03-01 | 2025-04-15 | Client A |        8
 3 | Run-2025-03 | 2025-03-15 | 2025-04-30 | Client B |        3

Mock Data Scripts

setup_mock_data.py

This script sets up mock data for testing and development purposes.

python scripts/setup_mock_data.py

This script:

  1. Creates mock clients
  2. Creates mock trackers
  3. Creates mock storage and delivery locations
  4. Creates mock production runs
  5. Associates trackers with production runs
  6. Generates mock location reports
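Step 6 above (mock location reports) can be sketched as a seeded random walk around a starting coordinate. The report shape, field names, and one-report-per-hour cadence are assumptions for illustration.

```python
import random
from datetime import datetime, timedelta

def mock_location_reports(tracker_id: int, start: datetime, count: int,
                          lat: float = 51.5, lon: float = -0.12,
                          seed: int = 0) -> list:
    """Generate `count` hourly mock reports drifting from (lat, lon)."""
    rng = random.Random(seed)  # seeded so test data is reproducible
    reports = []
    for i in range(count):
        lat += rng.uniform(-0.01, 0.01)
        lon += rng.uniform(-0.01, 0.01)
        reports.append({"tracker_id": tracker_id,
                        "reported_at": start + timedelta(hours=i),
                        "latitude": round(lat, 6),
                        "longitude": round(lon, 6)})
    return reports
```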

setup_mock_data_direct.py

This script sets up mock data directly in the database, bypassing the GraphQL API.

python scripts/setup_mock_data_direct.py

This is faster than setup_mock_data.py but doesn't test the GraphQL API.

Running Scripts in Production

When running scripts in a production environment, it's recommended to:

  1. Make a backup of the database first
  2. Run the script in a test environment first
  3. Use the --dry-run flag if available to see what changes would be made
  4. Run the script during off-peak hours

Example:

# Make a backup
pg_dump -U postgres -d tracker > backup_before_script.sql

# Run the script with dry run
python scripts/flush_cache.py --dry-run

# If everything looks good, run the script
python scripts/flush_cache.py

Creating New Scripts

When creating new utility scripts, follow these guidelines:

  1. Place the script in the scripts/ directory
  2. Use the same database connection method as existing scripts
  3. Include command-line arguments for flexibility
  4. Add a --dry-run flag for safety
  5. Include helpful output and error messages
  6. Document the script in this file
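The guidelines above can be sketched as a small argparse skeleton, including the `--dry-run` flag. The script name and arguments are illustrative, not an existing script.

```python
import argparse
import sys

def main(argv=None) -> int:
    parser = argparse.ArgumentParser(description="Example utility script")
    parser.add_argument("--dry-run", action="store_true",
                        help="show what would change without writing anything")
    parser.add_argument("--client-id", type=int,
                        help="restrict the operation to one client")
    args = parser.parse_args(argv)

    if args.dry_run:
        print("[dry-run] no changes will be made")
    # ... connect to the database the same way the existing scripts do,
    # perform the work, and print a summary of what was (or would be) done ...
    return 0

if __name__ == "__main__":
    sys.exit(main())
```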