Caching
1. What is Caching?
1.1 Definition
Caching is a technique that temporarily stores frequently accessed data in a fast storage layer (such as memory) to reduce retrieval time and improve application performance. Instead of recalculating or reloading the same data repeatedly from a slow source (e.g., a database or an API), the system can fetch it from a cache, significantly speeding up responses.
1.2 Real-World Analogy
Imagine a busy coffee shop. If a customer orders the same drink repeatedly, instead of preparing it from scratch each time, the barista might pre-make a batch and serve it instantly when someone orders it. This is how caching works: storing results for quick retrieval instead of performing the same task repeatedly.
1.3 Basic Example of Caching in Python
import time

cache_store = {}

def expensive_computation(x: int) -> int:
    if x in cache_store:
        print("Returning cached result")
        return cache_store[x]
    print("Performing expensive computation...")
    time.sleep(3)  # Simulating a slow operation
    result = x * x
    cache_store[x] = result
    return result
# First call (slow)
print(expensive_computation(10)) # Computes and stores result
# Second call (fast)
print(expensive_computation(10)) # Fetches from cache
Explanation
- We use a dictionary, cache_store, to store computed results.
- If the result exists in the cache, it is returned instantly.
- If not, the function performs the slow operation and caches the result for future use.
- The second call avoids recomputation, making it much faster.
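For pure-function memoization like this, Python's standard library already provides functools.lru_cache, which handles the dictionary bookkeeping for you. A minimal sketch:

```python
import time
from functools import lru_cache

@lru_cache(maxsize=128)  # Keep up to the 128 most recently used results
def expensive_computation(x: int) -> int:
    time.sleep(0.1)  # Simulating a slow operation
    return x * x

start = time.perf_counter()
expensive_computation(10)  # Computed: pays the full cost
first = time.perf_counter() - start

start = time.perf_counter()
expensive_computation(10)  # Served from the cache
second = time.perf_counter() - start

print(second < first)  # The cached call is much faster
```

Note that lru_cache only suits functions whose result depends solely on their (hashable) arguments; the manual dictionary approach above gives you more control, e.g. over eviction.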
2. Benefits of Caching
2.1 Faster Response Times
Caching speeds up data retrieval by serving precomputed or preloaded results instead of performing expensive operations.
2.2 Reduced Load on Databases & APIs
Fetching data from cache reduces the number of database queries or API calls, minimizing load and costs.
2.3 Improved Scalability
With caching, applications handle more users efficiently without overwhelming the database or backend services.
2.4 Example: Database Query Without vs. With Caching
Without Caching (Slow Execution)
import sqlite3
import time
conn = sqlite3.connect(":memory:")
cursor = conn.cursor()
cursor.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
cursor.execute("INSERT INTO users (name) VALUES ('Alice')")
conn.commit()
def get_user_slow(user_id: int):
    time.sleep(2)  # Simulating a slow DB query
    cursor.execute("SELECT * FROM users WHERE id = ?", (user_id,))
    return cursor.fetchone()
# Slow execution
print(get_user_slow(1))
print(get_user_slow(1))
With Caching (Fast Execution)
import sqlite3
import time
conn = sqlite3.connect(":memory:")
cursor = conn.cursor()
cursor.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
cursor.execute("INSERT INTO users (name) VALUES ('Alice')")
conn.commit()
cache_store = {}
def get_user_fast(user_id: int):
    if user_id in cache_store:
        print("Returning from cache")
        return cache_store[user_id]
    time.sleep(2)  # Simulating a slow DB query
    cursor.execute("SELECT * FROM users WHERE id = ?", (user_id,))
    result = cursor.fetchone()
    cache_store[user_id] = result
    return result
# First call is slow, second call is instant
print(get_user_fast(1))
print(get_user_fast(1))
Explanation
- Without caching, every call queries the database, making repeated calls slow.
- With caching, the first call fetches from the database, but subsequent calls retrieve the stored value instantly.
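One caveat with a plain dictionary cache like cache_store is that entries never expire, so a row updated in the database would keep serving stale data forever. A minimal sketch of time-based expiry (TTL), storing an expiry timestamp alongside each value:

```python
import time

cache_store = {}  # Maps key -> (value, expiry timestamp)
DEFAULT_TTL = 30  # Seconds an entry stays valid

def cache_set(key, value, ttl=DEFAULT_TTL):
    cache_store[key] = (value, time.monotonic() + ttl)

def cache_get(key):
    entry = cache_store.get(key)
    if entry is None:
        return None
    value, expires_at = entry
    if time.monotonic() >= expires_at:
        del cache_store[key]  # Entry is stale: evict and report a miss
        return None
    return value

cache_set("user:1", {"id": 1, "name": "Alice"}, ttl=0.05)
print(cache_get("user:1"))  # Fresh entry: returns the cached dict
time.sleep(0.1)
print(cache_get("user:1"))  # Expired: returns None, forcing a reload
```

This is the same idea behind the ttl parameter used by caching frameworks: on a miss or an expired entry, the caller falls back to the slow source and re-populates the cache.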
3. What is Caching in Esmerald?
Esmerald provides a built-in caching system to speed up responses, reduce redundant processing, and optimize performance. It supports multiple backends, including:
- In-Memory Caching (default)
- Redis Caching
- Custom Backends
Esmerald's caching system integrates seamlessly with request handlers via the @cache decorator.
4. How to Use Caching in Esmerald
4.1 Using the @cache Decorator
The @cache decorator caches a handler's response for a given ttl (time-to-live, in seconds) using a chosen backend.
from esmerald.utils.decorators import cache
Basic Example
from esmerald import Esmerald, Gateway, get
from esmerald.utils.decorators import cache
@get("/expensive/{value}")
@cache(ttl=10)  # Cache the response for 10 seconds
async def expensive_operation(value: int) -> dict:
    return {"result": value * 2}
app = Esmerald(routes=[Gateway(handler=expensive_operation)])
4.2 Specifying a Cache Backend
Using Redis as a Backend
from esmerald import get
from esmerald.caches.redis import RedisCache
from esmerald.utils.decorators import cache
redis_cache = RedisCache(redis_url="redis://localhost:6379")
@get("/data/{key}")
@cache(backend=redis_cache, ttl=30)
async def fetch_data(key: str) -> dict:
    return {"key": key, "value": key[::-1]}  # Simulating an expensive operation
5. Customizing Caching in Esmerald
5.1 Using Esmerald Settings to Set a Default Cache Backend
Instead of specifying the backend every time, we can configure a global cache backend using EsmeraldAPISettings.
Example: Setting Redis as the Default Backend
from esmerald import EsmeraldAPISettings
from esmerald.caches.redis import RedisCache
class CustomSettings(EsmeraldAPISettings):
    cache_backend = RedisCache(redis_url="redis://localhost:6379")
✅ Now, all @cache decorators without an explicit backend will use Redis.
Tip
You can set the default backend to any supported backend, including custom ones. This allows you to maintain a consistent caching strategy across your application.
The default cache backend is the InMemoryCache, which is used if no backend is specified.
6. Building Custom Caching Backends
You can extend Esmerald's caching system by creating your own backend.
6.1 Custom File-Based Cache Backend
To create a custom backend, implement the CacheBackend interface, which can be imported from:
from esmerald.protocols.cache import CacheBackend
Example
import json
import os
from typing import Any
from esmerald.protocols.cache import CacheBackend
class FileCache(CacheBackend):
    def __init__(self, directory: str = "cache_files") -> None:
        self.directory = directory
        os.makedirs(directory, exist_ok=True)

    async def get(self, key: str) -> Any | None:
        filepath = os.path.join(self.directory, key)
        if os.path.exists(filepath):
            with open(filepath) as f:
                return json.load(f)
        return None

    async def set(self, key: str, value: Any, ttl: int | None = None) -> None:
        # ttl is accepted to satisfy the interface but not enforced in this simple example
        filepath = os.path.join(self.directory, key)
        with open(filepath, "w") as f:
            json.dump(value, f)

    async def delete(self, key: str) -> None:
        filepath = os.path.join(self.directory, key)
        if os.path.exists(filepath):
            os.remove(filepath)
✅ This custom backend caches data in files instead of memory or Redis.
6.2 Using the Custom Backend in Esmerald
Now you can use the custom backend in your Esmerald application.
from esmerald import get
from esmerald.utils.decorators import cache
file_cache = FileCache()
@get("/file-cache/{data}")
@cache(backend=file_cache, ttl=60)
async def file_cached_endpoint(data: str) -> dict:
    return {"data": data, "cached": True}
✅ Data is now cached in files instead of memory or Redis.
Recap
✅ Esmerald provides an easy-to-use caching system with multiple backends.
✅ You can use the @cache decorator to cache responses.
✅ You can set a global cache backend via EsmeraldAPISettings.
✅ You can create custom caching backends to store data in different ways.