Caching¶

1. What is Caching?ΒΆ

1.1 Definition¶

Caching is a technique that temporarily stores frequently accessed data in a fast storage layer (such as memory) to reduce retrieval time and improve application performance. Instead of recalculating or reloading the same data repeatedly from a slow source (e.g., a database or an API), the system can fetch it from a cache, significantly speeding up responses.

1.2 Real-World Analogy¶

Imagine a busy coffee shop. If a customer orders the same drink repeatedly, instead of preparing it from scratch each time, the barista might pre-make a batch and serve it instantly when someone orders it. This is how caching works: storing results for quick retrieval instead of performing the same task repeatedly.

1.3 Basic Example of Caching in Python¶

import time

cache_store = {}


def expensive_computation(x: int) -> int:
    if x in cache_store:
        print("Returning cached result")
        return cache_store[x]

    print("Performing expensive computation...")
    time.sleep(3)  # Simulating a slow operation
    result = x * x
    cache_store[x] = result
    return result


# First call (slow)
print(expensive_computation(10))  # Computes and stores result

# Second call (fast)
print(expensive_computation(10))  # Fetches from cache

Explanation¶

  1. We use a dictionary cache_store to store computed results.
  2. If the result exists in the cache, it is returned instantly.
  3. If not, the function performs a slow operation and caches the result for future use.
  4. The second call avoids recomputation, making it much faster.
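For pure functions like this, Python's standard library already provides memoization via functools.lru_cache, which manages the dictionary for you (note that 128 is just the default cache size, not a requirement):

```python
import functools
import time


@functools.lru_cache(maxsize=128)
def expensive_computation(x: int) -> int:
    """Square x, caching up to 128 recent results automatically."""
    time.sleep(0.1)  # Simulating a slow operation
    return x * x


print(expensive_computation(10))  # Computed (slow)
print(expensive_computation(10))  # Served from the cache (fast)
print(expensive_computation.cache_info())  # Shows hits, misses, and size
```

cache_info() is handy for verifying that the cache is actually being hit; cache_clear() resets it. Unlike the manual dictionary above, lru_cache also evicts the least recently used entries once maxsize is reached.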

2. Benefits of Caching¶

2.1 Faster Response Times¶

Caching speeds up data retrieval by serving precomputed or preloaded results instead of performing expensive operations.

2.2 Reduced Load on Databases & APIs¶

Fetching data from cache reduces the number of database queries or API calls, minimizing load and costs.

2.3 Improved Scalability¶

With caching, applications handle more users efficiently without overwhelming the database or backend services.

2.4 Example: Database Query Without vs. With Caching¶

Without Caching (Slow Execution)¶

import sqlite3
import time

conn = sqlite3.connect(":memory:")
cursor = conn.cursor()
cursor.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
cursor.execute("INSERT INTO users (name) VALUES ('Alice')")
conn.commit()


def get_user_slow(user_id: int):
    time.sleep(2)  # Simulating slow DB query
    cursor.execute("SELECT * FROM users WHERE id = ?", (user_id,))
    return cursor.fetchone()


# Slow execution
print(get_user_slow(1))
print(get_user_slow(1))

With Caching (Fast Execution)¶

import sqlite3
import time

conn = sqlite3.connect(":memory:")
cursor = conn.cursor()
cursor.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
cursor.execute("INSERT INTO users (name) VALUES ('Alice')")
conn.commit()


cache_store = {}


def get_user_fast(user_id: int):
    if user_id in cache_store:
        print("Returning from cache")
        return cache_store[user_id]

    time.sleep(2)  # Simulating slow DB query
    cursor.execute("SELECT * FROM users WHERE id = ?", (user_id,))
    result = cursor.fetchone()
    cache_store[user_id] = result
    return result


# First call is slow, second call is instant
print(get_user_fast(1))
print(get_user_fast(1))

Explanation¶

  • Without caching, every call queries the database, making repeated calls slow.
  • With caching, the first call fetches from the database, but subsequent calls retrieve the stored value instantly.
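A plain dictionary cache like the one above never invalidates, so it can keep serving stale rows after the underlying data changes. A common remedy is a time-to-live (TTL): each entry stores an expiry timestamp and is recomputed once it lapses. A minimal, framework-free sketch (the names get_cached and compute are illustrative, not part of any library):

```python
import time
from typing import Any, Callable

cache_store: dict[Any, tuple[Any, float]] = {}  # key -> (value, expires_at)


def get_cached(key: Any, compute: Callable[[Any], Any], ttl: float = 5.0) -> Any:
    """Return a cached value for key, recomputing it once its TTL has lapsed."""
    now = time.monotonic()
    entry = cache_store.get(key)
    if entry is not None and entry[1] > now:
        return entry[0]  # Entry is still fresh: serve it from the cache
    value = compute(key)  # Missing or expired: recompute and restamp
    cache_store[key] = (value, now + ttl)
    return value


print(get_cached(1, lambda k: k * k, ttl=0.5))  # Computed
print(get_cached(1, lambda k: k * k, ttl=0.5))  # Served from cache
time.sleep(0.6)
print(get_cached(1, lambda k: k * k, ttl=0.5))  # Recomputed after expiry
```

Choosing the TTL is a trade-off: a short TTL keeps data fresher but sends more traffic to the database, while a long TTL maximizes cache hits at the cost of staleness.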

3. What is Caching in Esmerald?¶

Esmerald provides a built-in caching system to speed up responses, reduce redundant processing, and optimize performance. It supports multiple backends, including:

  1. In-Memory Caching (default)
  2. Redis Caching
  3. Custom Backends

Esmerald’s caching system integrates seamlessly with request handlers using the @cache decorator.


4. How to Use Caching in Esmerald¶

4.1 Using the @cache Decorator¶

The @cache decorator allows caching responses for a defined ttl (time-to-live) and a chosen backend.

from esmerald.utils.decorators import cache

Basic Example¶

from esmerald import Esmerald, Gateway, get
from esmerald.utils.decorators import cache


@get("/expensive/{value}")
@cache(ttl=10)  # Cache for 10 seconds
async def expensive_operation(value: int) -> dict:
    return {"result": value * 2}


app = Esmerald(routes=[Gateway(handler=expensive_operation)])
✅ The first request is computed; subsequent requests within the 10-second TTL are served from the cache.
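Conceptually, a TTL-based caching decorator can be written in a few lines of plain Python. The sketch below is a simplified stand-in for illustration only, not Esmerald's actual implementation (which also handles async handlers, pluggable backends, and request-aware key generation):

```python
import functools
import time


def simple_cache(ttl: float):
    """Cache a function's results for ttl seconds, keyed by its arguments."""
    def decorator(func):
        store: dict = {}  # key -> (result, expires_at)

        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            key = (args, tuple(sorted(kwargs.items())))
            now = time.monotonic()
            if key in store and store[key][1] > now:
                return store[key][0]  # Fresh entry: skip the real call
            result = func(*args, **kwargs)
            store[key] = (result, now + ttl)
            return result

        return wrapper
    return decorator


@simple_cache(ttl=10)
def expensive_operation(value: int) -> dict:
    return {"result": value * 2}
```

The per-function store dictionary plays the role of the cache backend; swapping it for Redis or a file store is what Esmerald's backend abstraction generalizes.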


4.2 Specifying a Cache Backend¶

Using Redis as a Backend¶

from esmerald import get
from esmerald.caches.redis import RedisCache
from esmerald.utils.decorators import cache

redis_cache = RedisCache(redis_url="redis://localhost:6379")


@get("/data/{key}")
@cache(backend=redis_cache, ttl=30)
async def fetch_data(key: str) -> dict:
    return {"key": key, "value": key[::-1]}  # Simulating an expensive operation
✅ The response is stored in Redis and remains available for 30 seconds.


5. Customizing Caching in Esmerald¶

5.1 Using Esmerald Settings to Set a Default Cache Backend¶

Instead of specifying the backend every time, we can configure a global cache backend using EsmeraldAPISettings.

Example: Setting Redis as the Default Backend¶

from esmerald import EsmeraldAPISettings
from esmerald.caches.redis import RedisCache


class CustomSettings(EsmeraldAPISettings):
    cache_backend = RedisCache(redis_url="redis://localhost:6379")

✅ Now, all @cache decorators without a specified backend will use Redis.

Tip

You can set the default backend to any supported backend, including custom ones. This allows you to maintain a consistent caching strategy across your application.

The default cache backend is the InMemoryCache, which is used if no backend is specified.


6. Building Custom Caching Backends¶

You can extend Esmerald’s caching system by creating your own backend.

6.1 Custom File-Based Cache Backend¶

To create a custom backend, implement the CacheBackend interface, which can be imported from:

from esmerald.protocols.cache import CacheBackend

Example¶

import json
import os
from typing import Any

from esmerald.protocols.cache import CacheBackend


class FileCache(CacheBackend):
    def __init__(self, directory: str = "cache_files") -> None:
        self.directory = directory
        os.makedirs(directory, exist_ok=True)

    async def get(self, key: str) -> Any | None:
        # Keys are used directly as file names, so they must be
        # filesystem-safe (no path separators or special characters).
        filepath = os.path.join(self.directory, key)
        if os.path.exists(filepath):
            with open(filepath) as f:
                return json.load(f)
        return None

    async def set(self, key: str, value: Any, ttl: int | None = None) -> None:
        # ttl is accepted to satisfy the interface but is not enforced
        # in this simplified example; entries never expire on their own.
        filepath = os.path.join(self.directory, key)
        with open(filepath, "w") as f:
            json.dump(value, f)

    async def delete(self, key: str) -> None:
        filepath = os.path.join(self.directory, key)
        if os.path.exists(filepath):
            os.remove(filepath)

✅ This custom backend caches data in files instead of memory or Redis.

6.2 Using the Custom Backend in Esmerald¶

Now you can use the custom backend in your Esmerald application.

from esmerald import get
from esmerald.utils.decorators import cache

file_cache = FileCache()


@get("/file-cache/{data}")
@cache(backend=file_cache, ttl=60)
async def file_cached_endpoint(data: str) -> dict:
    return {"data": data, "cached": True}

✅ Data is now cached in files instead of memory or Redis.


Recap¶

✅ Esmerald provides an easy-to-use caching system with multiple backends.

✅ You can use the @cache decorator to cache responses.

✅ You can set a global cache backend via EsmeraldAPISettings.

✅ You can create custom caching backends to store data in different ways.