How to boost performance with caching

Posted on August 5, 2017 by PhilZona

Nothing feels worse than visiting a website or application that takes forever to load. Whether you're trying to read the news or check out the latest photos and updates from your friends, slow websites are incredibly frustrating. And they're not just frustrating: page-load times have a measurable effect on business:

  • A one second delay can reduce conversions by up to 7 percent
  • 47% of internet users expect the sites they visit to load in 2 seconds or less
  • 40% of people will abandon a site that takes more than 3 seconds to load

Sometimes poor speeds result from a bad internet connection, but infrastructure and code can play a huge role as well. As a Solutions Architect, you have control over these factors, so let’s talk about how to boost performance with caching.

What is caching?

Caching is a term that often comes up around performance and databases. The database is frequently a performance bottleneck in an application: it can only handle so many write operations (saving data) and read operations (viewing or displaying data). A cache is like another data store that focuses on the read side. Because a cache keeps data in memory (RAM), it reads incredibly quickly compared to a traditional database stored on disk.
A cache layer sits between your application code and your database, which lets the cache serve data to the application without using database resources. If the cache receives a request from the application and does not have the required data (a cache miss), it retrieves that data from the database, stores it in the cache, and then serves it back to the application. Caching can be a complex part of an application stack because cached data can go "stale," meaning it has fallen behind the true data in the database.
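The read path described above is often called the cache-aside (or lazy loading) pattern. Here's a minimal sketch of it in Python; plain dicts stand in for the real database and cache (such as Redis on ElastiCache) so the example is self-contained, and the names `database`, `cache`, and `get_user` are illustrative, not a real API:

```python
# Stand-ins for a real database and cache -- plain dicts, for illustration only.
database = {"user:1": {"name": "Ada"}, "user:2": {"name": "Grace"}}
cache = {}

def get_user(key):
    """Cache-aside read: try the cache first, fall back to the database."""
    if key in cache:
        return cache[key]      # cache hit: no database work needed
    value = database[key]      # cache miss: read from the database...
    cache[key] = value         # ...store the result in the cache...
    return value               # ...and serve it back to the application

first = get_user("user:1")   # miss: reads the database and populates the cache
second = get_user("user:1")  # hit: served from memory
```

Only the first request for a key touches the database; every request after that is answered from memory until the cached entry is removed or expires.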

How does caching fit into an application?

You can use caching as part of a small process in your application or you can rely heavily on caching for nearly every database operation. The key with caching is to increase the performance of your application without adding too much complexity. Finding that balance will depend on what the application does and how it works.
Here are a couple of examples of when caching can help:

  • If you run a read-heavy application, such as a news site or blog, that gets many views (reads) but very few writes (saves to the database)
  • If you run an API that serves data from user requests (reads)

Caching software

Plenty of software exists to help with caching needs, but it can be hard to know which solution to choose. The two options that are most widely used are Redis and Memcached. Luckily, Amazon Web Services offers deployment options for each of these.
Deploying Redis or Memcached using AWS ElastiCache has many benefits compared to hosting them yourself:

  • Administration included – You don't have to worry about replication, scaling, clustering, or backups. AWS takes care of all of these.
  • Easily scalable – You can easily scale your instances up and down depending on your resource requirements.
  • Part of the AWS family – ElastiCache integrates with other Amazon Web Services, including their security, high-availability, support, and pricing models.

If you were to manage a Redis or Memcached server yourself, then you would need to handle your own updates and security patches. You’d also be responsible for monitoring and backups (which come with a whole new layer of complexity to manage). A server can go down at any time, so you’d essentially be on call 24/7. When you use AWS, these issues are managed automatically, so you won’t have to worry about urgent, middle-of-the-night maintenance calls.

What to watch out for

Caching is a fantastic way to increase performance, but you should know a few things before implementing it. Caching creates a complex layer in your application or infrastructure and must be thoroughly planned when deploying to larger applications. Some of the main things to watch out for are:

  • Stale data – A cache stores the version of data at the moment it is cached. If the data later changes, the cached copy becomes old and out of sync with the database. For example, imagine an e-commerce site where a product has been deleted from the database but not from the cache. If a customer requests that product, they may still see it as available and try to order it. As a solution, you can use cache expiration (a time-to-live, or TTL, which both Redis and Memcached support) to set how long data remains cached. You can choose long expiration times for less critical pieces of data and short expiration times for data that must stay fresh.
  • Overhead – Building a cache layer into your code can require time, planning, and testing. You’ll have to consider how to handle data that is not saved in the cache but is saved in the database and how to handle stale data.
  • Complexity – Caching creates complexity. How will you store and name the data? A key-value approach? How will you handle refreshing the cache? How long will you cache certain pieces of data? Will some elements require longer caching times than others? These are all considerations that need to be planned before a cache can be implemented in production.
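To make the expiration idea above concrete, here is a minimal sketch of a TTL-based cache. A dict stands in for the cache so the example is self-contained; the function names and keys are illustrative, not a real ElastiCache API:

```python
import time

ttl_cache = {}

def cache_set(key, value, ttl_seconds):
    """Store a value along with the time at which it should expire."""
    ttl_cache[key] = (value, time.monotonic() + ttl_seconds)

def cache_get(key):
    """Return the cached value, or None if the key is missing or stale."""
    entry = ttl_cache.get(key)
    if entry is None:
        return None
    value, expires_at = entry
    if time.monotonic() >= expires_at:
        del ttl_cache[key]  # evict stale data; the next read goes to the database
        return None
    return value

# Short TTL for data that must stay fresh, long TTL for data that rarely changes.
cache_set("product:42", {"name": "Widget", "in_stock": True}, ttl_seconds=30)
cache_set("site:footer", "<footer>...</footer>", ttl_seconds=3600)
```

In Redis the same idea is built in: the `EXPIRE` command (or the TTL argument to `SET`) tells the server to discard a key automatically after the given number of seconds.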

For many, the pros outweigh the cons. Using a cache wisely in your application or infrastructure can benefit your application, but only if it’s properly planned.

How does caching affect performance?

We’ve talked a lot about “performance,” but what does that mean exactly? Caching offers two key performance and application benefits:

  • Speed – Data stored in memory reads far faster than data stored on disk, decreasing load times
  • Efficiency – Serving reads from the cache lowers resource usage on the database server

This means that your application can run faster, which can be the difference between users staying on your site and clicking the back button in their browser. Lower resource consumption is also a big win: when your servers aren't under heavy load, you can support more connections (if you run an e-commerce site, that means more customers).
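A quick back-of-the-envelope calculation shows how much load a cache can take off the database. The request volume and hit rate below are made-up illustrative numbers, not measurements:

```python
# Hypothetical traffic figures, for illustration only.
requests_per_second = 1000
cache_hit_rate = 0.75  # 75% of reads are answered from the cache

# Only cache misses reach the database.
db_reads_per_second = requests_per_second * (1 - cache_hit_rate)
print(db_reads_per_second)  # 250.0 -- a quarter of the original read load
```

Even a modest hit rate cuts database reads substantially, and read-heavy workloads like news sites often achieve hit rates well above this.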

Caching with Amazon Web Services

AWS offers a caching solution called ElastiCache. ElastiCache is a cloud-based cache service that lets you easily set up and deploy caching services using either Redis or Memcached. ElastiCache offers features such as replication, scaling, clustering, and backups, and it boasts flexible pricing, with both on-demand and reserved nodes.
ElastiCache uses instance types similar to the EC2 compute service, so you can create an instance with the hardware specification that matches your needs. For example, a smaller application might use an instance such as the cache.t2.medium, which offers 2 vCPUs and 3.22 GB of memory, while a larger application might use an instance such as the cache.m4.2xlarge, which offers 8 vCPUs and 29.7 GB of memory.


As a Certified Solutions Architect, one of your main responsibilities will be figuring out how to get the most out of your applications and infrastructure. With a little planning, a cache can be a great tool to help you do just that. Now that you’ve got the basics, you’re ready to implement a cache in your own project! For some ideas on how to get started, check out this guide to caching strategies.

Have tips or tricks you’d like to share? Want to let everyone know how caching has boosted your performance? Let us know in the comments.
