One of the ways to relieve strain on a server is by caching. This is done by caching data after it has been processed and then serving it from the cache the next time it is requested. This tutorial gives a detailed discussion of Redis, explaining how to install Redis and use it to cache data in Python applications.

Introduction to Redis and Caching

Caching refers to storing a server response in the client itself, so that the client need not make a request to the server for the same resource again and again. A server response should include information about how caching is to be done, so that the client caches the response for a specified period or does not cache it at all.

A cache, on the other hand, is a hardware or software component that is used to store data so future requests for the same data can be served faster.

In this age where users expect results within a second, it is wise to serve requests by reading data from the cache, which is ultimately faster than reading from a slower data store; thus, the system performance depends on how many requests can be served from the cache.

Redis is an open-source, in-memory data structure store, used as a database, cache, and message broker. It works by storing data in a cache and providing it the next time it is requested, instead of querying the database every time.

Install Redis

The first step is to get Redis up and running locally on your machine. The simplest way to install Redis is via the operating system’s package manager like so:
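On Ubuntu or Debian, for example, the install might look like the following (package names vary by distribution, and on macOS you could use Homebrew instead):

```shell
# Ubuntu/Debian (assumes apt is available)
sudo apt-get update
sudo apt-get install redis-server

# macOS (assumes Homebrew is installed)
brew install redis
```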

You can also follow the instructions from the official Redis site.

Download and extract Redis 4.0.6 tar as follows:
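Following the standard Redis quickstart, the download and build steps look like this:

```shell
# Download the 4.0.6 release tarball, extract it, and compile from source
wget http://download.redis.io/releases/redis-4.0.6.tar.gz
tar xzf redis-4.0.6.tar.gz
cd redis-4.0.6
make
```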

The binaries that are now compiled are available in the src directory. Run Redis with:
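From inside the extracted directory:

```shell
# Start the Redis server with the compiled binary
src/redis-server
```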

You can interact with Redis using the built-in client:
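A short session might look like this (the key name is just an example):

```shell
$ src/redis-cli
127.0.0.1:6379> SET greeting "Hello Redis"
OK
127.0.0.1:6379> GET greeting
"Hello Redis"
```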

To check if the Redis server is running, issue the following command in the terminal:
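A running server answers `PONG`:

```shell
$ redis-cli ping
PONG
```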

API Example

Let’s create our Django project. Our project will be able to cache all the products in a store, making it easy and fast to retrieve data in subsequent queries.

To make use of Redis in our Django application, we need to do the following:

  1. Check if results for the current query exist in the cache.
  2. If results exist in the cache, retrieve them.
  3. If results don’t exist, fetch them, store them in the cache, and then forward them to the requesting entity.
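The three steps above form the classic cache-aside pattern, which can be sketched in plain Python. Here a dictionary stands in for Redis and a function call stands in for the database query; all names are illustrative:

```python
def get_products(cache, query_db):
    """Cache-aside: return cached results if present, else query and cache."""
    key = "product"
    if key in cache:              # 1. check if results exist in the cache
        return cache[key]         # 2. cache hit: serve from the cache
    products = query_db()         # 3. cache miss: fetch from the database...
    cache[key] = products         #    ...store in the cache for next time
    return products

# Usage: the second call is served from the cache, not the "database".
calls = []
def fake_db():
    calls.append(1)
    return ["laptop", "phone"]

cache = {}
get_products(cache, fake_db)
get_products(cache, fake_db)
assert len(calls) == 1  # the database was queried only once
```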

Requirements

  • Django
  • django-redis
  • Redis
  • loadtest

Create Your Project

Before we get started, create a directory and set up a virtual environment. A virtual environment lets you install the specific library versions required by your application without affecting system-wide packages.
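For example (the directory name matches the project name used later in this tutorial):

```shell
mkdir django_cache && cd django_cache
python3 -m venv env
```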

Next, activate the virtual environment and install the project requirements.
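With the requirements listed above, that might look like:

```shell
source env/bin/activate
pip install django djangorestframework django-redis
```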

Create a Django Project

Create a new app called store, which will handle product management in our store.
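Assuming the project is named django_cache as used later in this tutorial:

```shell
django-admin startproject django_cache .
python manage.py startapp store
```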

Add the store application and rest_framework to the list of installed apps in the settings.py file.
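The installed-apps list would then look roughly like this (a config fragment; the default Django apps are kept as generated):

```python
# django_cache/settings.py
INSTALLED_APPS = [
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.messages',
    'django.contrib.staticfiles',
    'rest_framework',
    'store',
]
```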

Creating the Models

In store/models.py, we start by creating the Product model for storing the product details as follows:
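A minimal sketch of such a model; the exact fields are illustrative and can be adapted to your store:

```python
# store/models.py -- field names here are assumptions, not prescribed by the tutorial
from django.db import models

class Product(models.Model):
    name = models.CharField(max_length=255)
    description = models.TextField(blank=True)
    price = models.DecimalField(max_digits=10, decimal_places=2)
    date_created = models.DateTimeField(auto_now_add=True)

    def __str__(self):
        return self.name
```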

Migrations

Create an initial migration for our products model, and sync the database for the first time.
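The standard commands for this are:

```shell
python manage.py makemigrations store
python manage.py migrate
```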

Create superuser

Create a superuser, log in to the admin panel, and populate your database with some sample data which we will use to do our tests.
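Create the superuser and start the development server, then log in at localhost:8000/admin to add products:

```shell
python manage.py createsuperuser
python manage.py runserver
```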

Configuring Redis in Python Applications

In order to use Redis with a Django application, we need to set up Redis to store the application’s cache data. Add the following to your settings.py file:
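This is the standard django-redis backend configuration (a config fragment; the database index `1` in the URL is an arbitrary choice):

```python
# django_cache/settings.py -- point Django's cache at the local Redis server
CACHES = {
    'default': {
        'BACKEND': 'django_redis.cache.RedisCache',
        'LOCATION': 'redis://127.0.0.1:6379/1',
        'OPTIONS': {
            'CLIENT_CLASS': 'django_redis.client.DefaultClient',
        }
    }
}
```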

Next, we are going to create an endpoint that retrieves all the products from our database. We will first test the performance of the application in terms of how long it takes to retrieve data from the database without caching it. We will then implement another endpoint that retrieves data from a cache and compare the performance.

In store/views.py, add the following code which retrieves all the products present in the database.
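A minimal sketch of an uncached endpoint using Django REST Framework; the view and serializer names are illustrative:

```python
# store/views.py -- fetches every product from the database on every request
from rest_framework import serializers
from rest_framework.decorators import api_view
from rest_framework.response import Response

from .models import Product

class ProductSerializer(serializers.ModelSerializer):
    class Meta:
        model = Product
        fields = '__all__'

@api_view(['GET'])
def product_list(request):
    products = Product.objects.all()          # hits the database every time
    serializer = ProductSerializer(products, many=True)
    return Response(serializer.data)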

Configuring URLs

Create a file store/urls.py and add the following code.
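A sketch of the routing, assuming the view is named `product_list` as in the earlier snippet:

```python
# store/urls.py
from django.urls import path
from . import views

urlpatterns = [
    path('', views.product_list, name='product-list'),
]
```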

We also need to include the URLs from the store application in the main django_cache/urls.py file.
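For example:

```python
# django_cache/urls.py -- mount the store app under /store/
from django.contrib import admin
from django.urls import path, include

urlpatterns = [
    path('admin/', admin.site.urls),
    path('store/', include('store.urls')),
]
```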

Let’s do a test and see if we are on track. We will be using loadtest. If you are not familiar with loadtest, it’s an npm package for load-testing HTTP endpoints.

Installing loadtest as root is simple:
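Then run it against the uncached endpoint (`-n` sets the number of requests and `-k` enables keep-alive); the exact throughput will of course depend on your machine:

```shell
npm install -g loadtest
loadtest -n 100 -k http://localhost:8000/store/
```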

As seen from the above, 55 requests are processed per second.

Let’s create another endpoint for retrieving data after caching with Redis. Edit store/views.py to match the following:
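A sketch of the cached variant, assuming the django-redis backend configured earlier; the view name is illustrative, and `ProductSerializer` is the serializer defined earlier in this file:

```python
# store/views.py -- cached variant of the product listing
from django.core.cache import cache
from rest_framework.decorators import api_view
from rest_framework.response import Response

from .models import Product

@api_view(['GET'])
def product_list_cached(request):
    if 'product' in cache:                       # cache hit: skip the database
        data = cache.get('product')
        return Response(data)
    products = Product.objects.all()             # cache miss: query the database
    serializer = ProductSerializer(products, many=True)
    cache.set('product', serializer.data, timeout=60 * 60)  # cache for an hour
    return Response(serializer.data)
```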

The code above checks whether the key product is present in the cache; if it is found, the cached data is returned to the browser. If no data is present in the cache, we first retrieve the data from the database, store it in the cache, and then return the queried data to the browser.

Update store/urls.py as follows.
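Adding the cached endpoint alongside the original one (view names as assumed in the earlier sketches):

```python
# store/urls.py -- adds the cached endpoint at /store/cache/
from django.urls import path
from . import views

urlpatterns = [
    path('', views.product_list, name='product-list'),
    path('cache/', views.product_list_cached, name='product-list-cached'),
]
```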

Let’s carry out the tests.
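Run the same loadtest invocation against the cached endpoint; after the first request warms the cache, throughput should be noticeably higher than the uncached figure:

```shell
loadtest -n 100 -k http://localhost:8000/store/cache/
```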

The first time you hit the endpoint localhost:8000/store/cache, the application will query from the database and return data, but subsequent calls to the URL will bypass the database and query from the cache since the data is already available in the cache.

Conclusion

In this tutorial, we used Redis to give an application the illusion of speed. We leveraged Redis’s in-memory storage to cache the results of queries and then returned those results from the cache on subsequent queries rather than making the round trip to the database.

There are other caching tools available, such as Memcached, which is similar to Redis. However, Redis is more popular because it takes only a few minutes to set up and get working in applications. Redis also has more sophisticated mechanisms, as it has been described as a “data structure store”: it supports values such as lists, sets, and hashes, whereas Memcached stores only plain strings, making Redis more powerful and flexible.

Hopefully, this tutorial has shown you how easy it is to add a caching layer to your application, thereby improving performance. Caching is something to consider when you need to reduce loading times and server costs.


