Caching Implementation for Performance Optimization

This document describes the simple caching implementation added to improve authentication middleware performance in the ACC Server Manager.

Problem Identified

The authentication middleware was experiencing performance issues due to database queries on every request:

  • HasPermission() method was hitting the database for every permission check
  • User authentication data was being retrieved from database repeatedly
  • No caching mechanism for frequently accessed authentication data

Solution Implemented

Simple Permission Caching

Added lightweight caching to the authentication middleware using the existing cache infrastructure:

Location: local/middleware/auth.go

Key Features:

  • Permission Result Caching: Cache permission check results for 10 minutes
  • Existing Cache Integration: Uses the already available InMemoryCache system
  • Minimal Code Changes: Simple addition to existing middleware without major refactoring

Implementation Details

Cache Key Strategy

cacheKey := fmt.Sprintf("permission:%s:%s", userID, permission)

Cache Flow

  1. Cache Check First: Check if permission result exists in cache
  2. Database Fallback: If cache miss, query database via membership service
  3. Cache Result: Store result in cache with 10-minute TTL
  4. Return Result: Return cached or fresh result

Core Method

func (m *AuthMiddleware) hasPermissionCached(ctx context.Context, userID, permission string) (bool, error) {
    cacheKey := fmt.Sprintf("permission:%s:%s", userID, permission)

    // Try cache first
    if cached, found := m.cache.Get(cacheKey); found {
        if hasPermission, ok := cached.(bool); ok {
            return hasPermission, nil
        }
    }

    // Cache miss - check with service
    has, err := m.membershipService.HasPermission(ctx, userID, permission)
    if err != nil {
        return false, err
    }

    // Cache result for 10 minutes
    m.cache.Set(cacheKey, has, 10*time.Minute)
    return has, nil
}
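
For context, the fields this method relies on look roughly like the sketch below; the type names are assumptions based on the calls above, not the exact declarations in local/middleware/auth.go.

// Sketch of the middleware struct implied by the method above;
// the actual field types in local/middleware/auth.go may differ.
type AuthMiddleware struct {
    cache             *cache.InMemoryCache // existing cache from local/utl/cache/cache.go
    membershipService MembershipService    // backs the database-side HasPermission check
}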

Performance Benefits

Expected Improvements

  • Reduced Database Load: Permission checks avoid database queries after the first access
  • Faster Response Times: Cached lookups are served from memory instead of a database round trip
  • Better Scalability: The system can handle more concurrent users at the same database load
  • Minimal Memory Overhead: Only boolean values are cached, with automatic expiration

Cache Effectiveness

  • High Hit Ratio Expected: Users typically access the same resources repeatedly
  • 10-Minute TTL: Balances performance with data freshness
  • Per-User Per-Permission: Granular caching for precise invalidation

Configuration

Cache TTL

cacheTTL := 10 * time.Minute  // Permission cache duration

Cache Key Format

"permission:{userID}:{permissionName}"

Examples:

  • permission:user123:ServerView
  • permission:user456:ServerCreate
  • permission:admin789:SystemManage

Integration

Dependencies

  • Existing Cache System: Uses local/utl/cache/cache.go (assumed interface sketched after this list)
  • No New Dependencies: Leverages already available infrastructure
  • Minimal Changes: Only authentication middleware modified
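
The middleware only needs two operations from that cache. A minimal sketch of the interface it relies on, inferred from the Get and Set calls in hasPermissionCached (the concrete type in local/utl/cache/cache.go may expose more):

// Operations the middleware uses; inferred from m.cache.Get and m.cache.Set above.
type permissionCache interface {
    // Get returns the stored value and whether the key was present and not expired.
    Get(key string) (interface{}, bool)
    // Set stores a value that expires after the given TTL.
    Set(key string, value interface{}, ttl time.Duration)
}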

Backward Compatibility

  • Transparent Operation: No changes required to existing controllers
  • Same API: Permission checking interface remains unchanged
  • Graceful Degradation: Falls back to database if cache fails

Usage Examples

Automatic Caching

// In controller with HasPermission middleware
routeGroup.Get("/servers", auth.HasPermission(model.ServerView), controller.GetServers)

// First request: Database query + cache store
// Subsequent requests (within 10 minutes): Cache hit, no database query
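
A hedged sketch of the HasPermission factory behind that route declaration, assuming a Fiber-style handler and that an earlier auth step stored the user ID under a "userID" context local (the key name is an assumption):

// Sketch of the middleware factory; the real version in local/middleware/auth.go may differ.
// permission is assumed to be the string name of the permission (e.g. model.ServerView).
func (m *AuthMiddleware) HasPermission(permission string) fiber.Handler {
    return func(c *fiber.Ctx) error {
        // Assumes an earlier middleware stored the authenticated user ID here.
        userID, ok := c.Locals("userID").(string)
        if !ok || userID == "" {
            return c.SendStatus(fiber.StatusUnauthorized)
        }

        allowed, err := m.hasPermissionCached(c.UserContext(), userID, permission)
        if err != nil {
            return c.SendStatus(fiber.StatusInternalServerError)
        }
        if !allowed {
            return c.SendStatus(fiber.StatusForbidden)
        }
        return c.Next()
    }
}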

Manual Cache Invalidation

// If needed (currently not implemented but can be added)
auth.InvalidateUserPermissions(userID)
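
If that helper is added later, one possible shape is sketched below. It is purely illustrative: neither InvalidateUserPermissions, a Delete method on the cache, nor a model.AllPermissions list exists today.

// Hypothetical helper: drop every cached permission result for one user,
// e.g. right after a role change, so the next check hits the database again.
func (m *AuthMiddleware) InvalidateUserPermissions(userID string) {
    for _, permission := range model.AllPermissions { // hypothetical list of permission names
        cacheKey := fmt.Sprintf("permission:%s:%s", userID, permission)
        m.cache.Delete(cacheKey) // hypothetical Delete method on the existing cache
    }
}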

Monitoring

Built-in Logging

  • Cache Hits: Debug logs when permission found in cache
  • Cache Misses: Debug logs when querying database
  • Cache Operations: Debug logs for cache storage operations

Log Examples

[DEBUG] [AUTH_CACHE] Permission user123:ServerView found in cache: true
[DEBUG] [AUTH_CACHE] Permission user456:ServerCreate cached: true

Maintenance

Cache Invalidation

  • Automatic Expiration: 10-minute TTL handles most cases
  • User Changes: Permission changes take effect after cache expiration
  • Role Changes: New permissions available after cache expiration

Memory Management

  • Automatic Cleanup: Cache system handles expired entry removal (illustrated after this list)
  • Low Memory Impact: Boolean values have minimal memory footprint
  • Bounded Growth: TTL prevents unlimited cache growth
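
To make the bounded-growth point concrete, here is a minimal illustration of how a TTL cache typically treats expired entries as misses. This is not the code in local/utl/cache/cache.go, which may instead run a background cleanup goroutine:

// Illustrative only: entries carry an expiry time and read as misses once past it.
type ttlEntry struct {
    value     interface{}
    expiresAt time.Time
}

type ttlCache struct {
    mu    sync.RWMutex
    items map[string]ttlEntry
}

func (c *ttlCache) Get(key string) (interface{}, bool) {
    c.mu.RLock()
    defer c.mu.RUnlock()
    e, ok := c.items[key]
    if !ok || time.Now().After(e.expiresAt) {
        return nil, false // expired entries behave like misses and can be removed later
    }
    return e.value, true
}

func (c *ttlCache) Set(key string, value interface{}, ttl time.Duration) {
    c.mu.Lock()
    defer c.mu.Unlock()
    if c.items == nil {
        c.items = make(map[string]ttlEntry)
    }
    c.items[key] = ttlEntry{value: value, expiresAt: time.Now().Add(ttl)}
}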

Future Enhancements

Potential Improvements (if needed)

  1. User Data Caching: Cache full user objects in addition to permissions
  2. Role-Based Invalidation: Invalidate cache when user roles change
  3. Configurable TTL: Make cache duration configurable
  4. Cache Statistics: Add basic hit/miss ratio logging (sketched after this list)
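
As an example of enhancement 4, basic hit/miss counters could sit alongside the cache. A sketch only, not part of the current implementation:

// Hypothetical counters for cache statistics; not part of the current implementation.
type authCacheStats struct {
    hits   atomic.Int64
    misses atomic.Int64
}

// hitRatio returns the fraction of permission checks answered from the cache.
func (s *authCacheStats) hitRatio() float64 {
    hits := float64(s.hits.Load())
    misses := float64(s.misses.Load())
    if hits+misses == 0 {
        return 0
    }
    return hits / (hits + misses)
}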

Implementation Considerations

  • Keep It Simple: Current implementation prioritizes simplicity over features
  • Monitor Performance: Measure actual performance impact before adding complexity
  • Business Requirements: Add features only when business case is clear

Testing

  1. Permission Caching: Verify permissions are cached correctly (see the test sketch after this list)
  2. Cache Expiration: Confirm permissions expire after TTL
  3. Database Fallback: Ensure database queries work when cache fails
  4. Concurrent Access: Test cache behavior under concurrent requests
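
A sketch of the first scenario as a Go test, assuming a counting stub for the membership service and a constructor along the lines of NewAuthMiddleware(cache, service); the real constructor, cache factory, and type names may differ.

// stubMembershipService is a hypothetical test double that counts database-backed checks.
type stubMembershipService struct {
    result bool
    calls  int
}

func (s *stubMembershipService) HasPermission(ctx context.Context, userID, permission string) (bool, error) {
    s.calls++
    return s.result, nil
}

func TestHasPermissionCached_SecondCallHitsCache(t *testing.T) {
    stub := &stubMembershipService{result: true}
    m := NewAuthMiddleware(cache.NewInMemoryCache(), stub) // assumed constructor and cache factory

    for i := 0; i < 2; i++ {
        allowed, err := m.hasPermissionCached(context.Background(), "user123", "ServerView")
        if err != nil || !allowed {
            t.Fatalf("unexpected result: allowed=%v err=%v", allowed, err)
        }
    }

    if stub.calls != 1 {
        t.Fatalf("expected exactly one service call, got %d", stub.calls)
    }
}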

Performance Testing

  • Before/After Comparison: Measure response times with and without caching
  • Load Testing: Verify performance under realistic user loads
  • Database Load: Monitor database query reduction

Conclusion

This simple caching implementation provides significant performance benefits with minimal complexity:

  • Solves Core Problem: Reduces database load for permission checks
  • Simple Implementation: Uses existing infrastructure without major changes
  • Low Risk: Minimal code changes reduce chance of introducing bugs
  • Easy Maintenance: Simple cache strategy is easy to understand and maintain
  • Immediate Benefits: The performance gains apply as soon as the change is deployed

The implementation follows the principle of "simple solutions first" - addressing the performance bottleneck with the minimum viable solution that can be enhanced later if needed.