Caching Flow

The Caching Flow is designed to enhance API performance and reliability by caching responses to frequently accessed API requests. This reduces the load on API providers and improves response times for consumers. Caching based on the URL, query parameters, and specific headers is supported.

Scenarios

  • Reduced Latency: Serve responses faster by avoiding redundant API calls for frequently accessed data.
  • Cost Optimization: Reduce API call costs by minimizing requests to third-party providers.
  • Failover Support: Provide cached data when APIs are unavailable or experience downtime.
  • Header-Based Differentiation: Enable cache rules based on headers for advanced use cases like user-specific data or subscription tiers (sketched below).
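
For header-based differentiation, a header can be included in caching_key_parts so that each distinct header value receives its own cache entry. The following is a minimal sketch that follows the configuration template below; the X-Plan header (referenced as $.request.headers.x_plan) is an illustrative assumption, not a required name:

processors:
  ReadCache:
    processor: ReadCache
    parameters:
      - key: caching_key_parts
        value:
          - $.request.headers.x_plan # assumed subscription-tier header; each value gets its own cache entry

The same caching_key_parts list must also appear in the WriteCache processor so that reads and writes resolve to the same cache keys.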

Flow Configuration Template

name: CachingFlow

filter:
  url: <URLPattern> # Define the URL pattern for the filter, e.g., api.example.com/*, 195.45.125.1/*
  method: ["<HTTPMethod>"] # Optional: List of HTTP methods, e.g., GET, POST
  headers:
    - key: <HeaderKey> # Optional: Header key, e.g., 'X-API-Key'
      value: <HeaderValue> # Optional: Header value, e.g., '12345', '67890'

processors:
  ReadCache:
    processor: ReadCache # Reads responses from the cache if available.
    parameters:
      - key: caching_key_parts
        value:
          - <JSONPath.for.cache.part>

  WriteCache:
    processor: WriteCache # Writes responses to the cache.
    parameters:
      - key: ttl_seconds
        value: <TTLInSeconds> # Time to live for cached responses, in seconds
      - key: record_max_size_bytes
        value: <MaxRecordSize> # Maximum size of a single cached response, in bytes (up to 8 KB)
      - key: max_cache_size_mb
        value: <MaxCacheSize> # Maximum total cache size, in MB (up to 200 MB)
      - key: caching_key_parts
        value:
          - <JSONPath.for.cache.part>

flow:
  request:
    - from:
        stream:
          name: globalStream
          at: start
      to:
        processor:
          name: ReadCache # Every request first checks the cache.
    - from:
        processor:
          name: ReadCache
          condition: cache_miss # On a miss, the request continues to the API provider.
      to:
        stream:
          name: globalStream
          at: end
  response:
    - from:
        stream:
          name: globalStream
          at: start
      to:
        processor:
          name: WriteCache # Provider responses are stored in the cache.
    - from:
        processor:
          name: WriteCache
      to:
        stream:
          name: globalStream
          at: end
    - from:
        processor:
          name: ReadCache
          condition: cache_hit # On a hit, the cached response is returned directly.
      to:
        stream:
          name: globalStream
          at: end

Flow Example

name: CachingFlow

filter:
  url: httpbin.com/*

processors:
  ReadCache:
    processor: ReadCache
    parameters:
      - key: caching_key_parts
        value:
          - $.request.headers.api_key
          - $.request.query_param.resource_id

  WriteCache:
    processor: WriteCache
    parameters:
      - key: ttl_seconds
        value: 600
      - key: record_max_size_bytes
        value: 8192
      - key: max_cache_size_mb
        value: 200
      - key: caching_key_parts
        value:
          - $.request.headers.api_key
          - $.request.query_param.resource_id

flow:
  request:
    - from:
        stream:
          name: globalStream
          at: start
      to:
        processor:
          name: ReadCache
    - from:
        processor:
          name: ReadCache
          condition: cache_miss
      to:
        stream:
          name: globalStream
          at: end
  response:
    - from:
        stream:
          name: globalStream
          at: start
      to:
        processor:
          name: WriteCache
    - from:
        processor:
          name: WriteCache
      to:
        stream:
          name: globalStream
          at: end
    - from:
        processor:
          name: ReadCache
          condition: cache_hit
      to:
        stream:
          name: globalStream
          at: end

Flow Components

  • ReadCache: Reads responses from the cache. On a cache hit, the cached response is returned directly; on a cache miss, the request continues to the API provider.
  • WriteCache: Writes provider responses to the cache, subject to the ttl_seconds, record_max_size_bytes, and max_cache_size_mb parameters.

Troubleshooting

Ensure that caching_key_parts is identical in the ReadCache and WriteCache processors, as in the Flow Example above. If the key parts differ, WriteCache stores responses under keys that ReadCache never looks up, and every request is treated as a cache miss.
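
As a minimal sketch (reusing the JSONPaths from the Flow Example; the remaining WriteCache parameters are omitted for brevity), both processors declare the same parts in the same order:

processors:
  ReadCache:
    processor: ReadCache
    parameters:
      - key: caching_key_parts
        value:
          - $.request.headers.api_key
          - $.request.query_param.resource_id

  WriteCache:
    processor: WriteCache
    parameters:
      - key: ttl_seconds
        value: 600
      - key: caching_key_parts
        value:
          - $.request.headers.api_key # same parts, same order as in ReadCache
          - $.request.query_param.resource_id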