# Frontend Caching System

The application implements a caching system to optimize data access, reduce API calls, and improve overall performance. This document details the technical implementation of our caching solution.
## Overview

The caching system is designed to:
- Minimize redundant API calls (Rule #1)
- Handle chunked data for virtual scrolling
- Provide fast access to frequently used data
- Maintain data freshness with TTL (Time-To-Live)
- Handle cache invalidation efficiently
- Manage memory usage through size limits
## Cache Keys

The system defines standard cache keys for different types of data:

```javascript
export const CACHE_KEYS = {
  FILTERED_RESULTS: "filtered_results",
  BRAND_LIST: "brand_list",
  CAMPAIGN_LIST: "campaign_list",
  RECENT_ACTIVITY: "recent_activity",
};
```
## Technical Implementation

### Core Cache Manager

```javascript
class CacheManager {
  constructor() {
    this.cache = new Map();
    this.expiryTimes = new Map();
    this.defaultTTL = 5 * 60 * 1000; // 5 minutes default TTL
    this.maxSize = 100; // Maximum number of cache entries
  }
}
```
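The read path that `getOrFetch` relies on is not shown in this document. A minimal sketch of `get`, assuming entries expire lazily based on the `expiryTimes` map set up in the constructor:

```javascript
// Sketch only: a lazy-expiry read consistent with the constructor above.
// Expired entries are evicted on access and reported as a cache miss.
async get(key) {
  const expiresAt = this.expiryTimes.get(key);
  if (expiresAt !== undefined && Date.now() > expiresAt) {
    this.cache.delete(key);
    this.expiryTimes.delete(key);
    return null; // treat as a miss
  }
  return this.cache.get(key) ?? null;
}
```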
### Cache Operations

#### Chunk-based Caching

```javascript
async getOrFetch(baseKey, fetchFn, offset = 0, limit = 100) {
  try {
    const pageNumber = Math.floor(offset / limit);
    const chunkKey = this.generateChunkKey(baseKey, pageNumber, offset, limit);

    // Try to get from cache
    const cachedData = await this.get(chunkKey);
    if (this.validateData(cachedData)) {
      return cachedData;
    }

    // Fetch new chunk
    const newData = await fetchFn();
    if (this.validateData(newData)) {
      await this.set(chunkKey, newData);
    }
    return newData;
  } catch (error) {
    console.error("Error in getOrFetch:", error);
    throw error;
  }
}
```
Features:
- Chunk-based data management
- Integration with virtual scrolling
- Efficient memory usage
- Request cancellation support
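Cancellation itself lives in the fetch function passed to `getOrFetch`, not in the cache manager. A sketch of how a caller might wire this up with an `AbortController` (the `dataLoader.loadTrackerData(offset, limit, signal)` call mirrors the usage example later in this document; the rest is illustrative):

```javascript
// Illustrative only: abort the in-flight chunk request when a newer one starts,
// e.g. because the user scrolled past the chunk before it finished loading.
let controller = null;

async function loadChunk(baseKey, offset, limit) {
  if (controller) controller.abort(); // cancel the previous request, if any
  controller = new AbortController();
  const { signal } = controller;

  return cacheManager.getOrFetch(
    baseKey,
    () => dataLoader.loadTrackerData(offset, limit, signal),
    offset,
    limit,
  );
}
```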
#### Setting Cache Entries

```javascript
async set(key, value, ttl = this.defaultTTL) {
  try {
    this.cache.set(key, value);
    this.expiryTimes.set(key, Date.now() + ttl);
    this.trimCache();
    return this.updateStore(key, value);
  } catch (error) {
    console.error("Error setting cache:", error);
    return false;
  }
}
```
Features:
- Configurable TTL per entry
- Automatic cache size management
- Store integration for state updates
- Error handling
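`trimCache` is called above but not shown. A minimal sketch, assuming the oldest-inserted entries are evicted once the cache grows past `maxSize` (a `Map` iterates its keys in insertion order):

```javascript
// Sketch only: evict the oldest-inserted entries once the cache exceeds maxSize.
trimCache() {
  while (this.cache.size > this.maxSize) {
    const oldestKey = this.cache.keys().next().value;
    this.cache.delete(oldestKey);
    this.expiryTimes.delete(oldestKey);
  }
}
```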
## Advanced Features

### 1. Chunk Key Generation

```javascript
generateChunkKey(baseKey, pageNumber, offset, limit) {
  return `${baseKey}_chunk_${pageNumber}_${offset}_${limit}`;
}

generateFilteredResultsKey(params) {
  // Create base key from filters
  const baseKey = {
    campaignId: params.campaignId || null,
    orderBy: {
      field: params.orderBy?.field || "ID",
      direction: params.orderBy?.direction || "ASC",
    },
    filters: {
      brandId: params.filters?.brandId || null,
      currentStatuses: (params.filters?.currentStatuses || []).sort(),
      productionRunIds: (params.filters?.productionRunIds || []).sort(),
    },
  };
  return `${CACHE_KEYS.FILTERED_RESULTS}_${JSON.stringify(baseKey)}`;
}
```
Features:
- Consistent chunk identification
- Support for complex filters
- Pagination handling
- Sorting consistency
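For example, given the code above, a filtered-results base key and one of its chunk keys look like this (the argument values are illustrative):

```javascript
// Base key for the current filter set
const baseKey = cacheManager.generateFilteredResultsKey({
  campaignId: "123",
  filters: { brandId: "42" },
});
// => 'filtered_results_{"campaignId":"123","orderBy":{"field":"ID","direction":"ASC"},
//     "filters":{"brandId":"42","currentStatuses":[],"productionRunIds":[]}}'

// Key for the chunk at offset 200 with a limit of 100 (page 2)
const chunkKey = cacheManager.generateChunkKey(baseKey, 2, 200, 100);
// => `${baseKey}_chunk_2_200_100`
```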
### 2. Store Integration

```javascript
updateStore(key, data) {
  try {
    // Parse chunk info from key
    const chunkMatch = key.match(/chunk_(\d+)_(\d+)_(\d+)$/);
    const pageNumber = chunkMatch ? parseInt(chunkMatch[1]) : -1;

    // Check if this is a filtered results key
    const isFilteredResults = key.startsWith(CACHE_KEYS.FILTERED_RESULTS);

    if (isFilteredResults) {
      // Update store with every chunk's data
      if (this.validateData(data)) {
        store.setTrackersData(data);

        // Only update brands on first chunk
        if (pageNumber === 0 && data.brands) {
          store.setBrandsData(data.brands);
        }
      }
    } else if (key === CACHE_KEYS.BRAND_LIST) {
      store.setBrandsData(data);
    } else if (key === CACHE_KEYS.RECENT_ACTIVITY) {
      store.setRecentActivity(data);
    }
    return data;
  } catch (error) {
    console.error("Error updating store:", error);
    return data;
  }
}
```
Features:
- Chunk-aware store updates
- Efficient state management
- Data consistency
- Error resilience
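`updateStore` depends on a `store` object whose implementation is outside the scope of this document. The sketch below shows only the interface the code above assumes; the setter names come from `updateStore`, while the bodies are placeholders:

```javascript
// Placeholder sketch of the store interface updateStore relies on.
// The real store (Redux, Zustand, a framework store, etc.) may differ.
const store = {
  setTrackersData(data) {
    /* merge the chunk's trackersConnection data into application state */
  },
  setBrandsData(brands) {
    /* replace the brand list (first chunk or BRAND_LIST key) */
  },
  setRecentActivity(activity) {
    /* update the recent-activity data */
  },
};
```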
### 3. Data Validation

```javascript
validateData(data) {
  if (!data?.trackersConnection?.edges) {
    return false;
  }
  return true;
}

logChunkInfo(prefix, data) {
  if (!this.validateData(data)) return;

  const edges = data.trackersConnection.edges;
  console.log(`${prefix}:`, {
    rows: edges.length,
    firstId: edges[0]?.node?.id,
    lastId: edges[edges.length - 1]?.node?.id,
    totalCount: data.trackersConnection.totalCount,
  });
}
```
Features:
- Data integrity checks
- Detailed logging
- Error prevention
- Debugging support
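A typical call site is right after a chunk has been fetched, before it is cached (illustrative; the prefix string is arbitrary):

```javascript
// Illustrative: log the shape of a freshly fetched chunk inside getOrFetch
const newData = await fetchFn();
this.logChunkInfo("Fetched chunk", newData);
// Logs "Fetched chunk:" followed by { rows, firstId, lastId, totalCount }
```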
## Usage Example

```javascript
// Initialize cache manager
const cacheManager = new CacheManager();

// Get or fetch chunked data
const result = await cacheManager.getOrFetch(
  baseKey,
  async () => {
    const data = await dataLoader.loadTrackerData(offset, limit, signal);
    return data;
  },
  offset,
  limit,
);
```
## Performance Considerations

The caching system is optimized for:
- Chunk-based data management
- Fast access times (O(1) lookups)
- Minimal memory footprint
- Efficient cache invalidation (see the sketch after this list)
- Reduced network requests
- Improved scrolling performance
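Invalidation is listed as a design goal, but no invalidation method appears in the code above. A minimal sketch, assuming all chunks belonging to a base key should be dropped together (for example, when the active filters change):

```javascript
// Sketch only: drop every cached chunk whose key starts with the given base key.
invalidate(baseKey) {
  for (const key of this.cache.keys()) {
    if (key.startsWith(baseKey)) {
      this.cache.delete(key);
      this.expiryTimes.delete(key);
    }
  }
}
```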
## Best Practices

- Chunk Management:
  - Use appropriate chunk sizes
  - Consider data access patterns
  - Clean up unused chunks
  - Monitor memory usage
- Cache Duration (see the TTL example after this list):
  - Use shorter TTLs for frequently changing data
  - Use longer TTLs for static content
  - Consider user activity patterns
- Error Handling:
  - Implement fallbacks for cache failures
  - Log cache errors for monitoring
  - Handle edge cases gracefully
- Memory Management:
  - Monitor chunk cache size
  - Clean up old chunks
  - Use appropriate buffer sizes
  - Consider device limitations
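For example, per-entry TTLs can be passed straight to `set` (the durations and the `activityData`/`brandData` variables below are illustrative placeholders, not prescribed values):

```javascript
// Frequently changing data: short TTL
await cacheManager.set(CACHE_KEYS.RECENT_ACTIVITY, activityData, 60 * 1000); // 1 minute

// Relatively static data: longer TTL
await cacheManager.set(CACHE_KEYS.BRAND_LIST, brandData, 30 * 60 * 1000); // 30 minutes
```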