Caching vs. Content Delivery Networks: What's the Difference?

In the world of network optimization, Content Delivery Networks (CDNs) and caching play a vital role in improving website performance and user experience.
While both aim to speed up website loading times, they serve distinct purposes through different mechanisms.
In this tutorial, we'll dive deep into the details of CDNs and caching to understand their similarities, differences, and how they contribute to enhancing online experiences.
What is Caching?
Imagine you’re a librarian managing a popular library. Every day, readers come in asking for the same set of books, like “Think and Grow Rich” or “The Intelligent Investor.”
Initially, you fetch these books from the main shelves, which takes time and effort. But soon you notice a pattern: the same set of books is requested repeatedly by different readers. So, what do you do?
You decide to create a special section near the entrance where you keep copies of these frequently requested books. Now, when readers come asking for them, you don’t have to run to the main shelves each time. Instead, you simply hand them the copies from the special section, saving time and making the process more efficient.
This special section represents the cache, storing frequently accessed books for quick retrieval.
Caching is a technique used to store copies of frequently accessed data temporarily. The cached data can be anything from web pages and images to database query results. When a user requests cached content, the server retrieves it from the cache instead of generating it anew, significantly reducing response times.
When a web server receives a request, it can follow different caching strategies to handle it efficiently. One prevalent strategy is known as read-through caching, in which the server checks the cache before falling back to the database:
- Request Received: The web server gets a request from a client.
- Check Cache: It first checks the cache to see whether a response to the request is already stored there.
- Cache Hit: If the response is in the cache (a hit), the server sends the data back to the client right away.
- Cache Miss: If the response isn’t in the cache (a miss), the server queries the database to fetch the required data.
- Store in Cache: Once it gets the data from the database, the server stores the response in the cache for future requests.
- Send Response: Finally, the server sends the data back to the client.
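The steps above can be sketched in a few lines of Python. This is a minimal illustration, not a production implementation: the cache is a plain in-memory dict, and `fetch_from_database` is a hypothetical stub standing in for a real (slow) database query.

```python
import time

cache = {}

def fetch_from_database(key):
    # Placeholder for a real database lookup; the sleep simulates
    # the latency the cache is meant to avoid.
    time.sleep(0.01)
    return f"value-for-{key}"

def handle_request(key):
    # Check cache: on a hit, respond right away.
    if key in cache:
        return cache[key], "hit"
    # Cache miss: query the database for the required data...
    value = fetch_from_database(key)
    # ...store the response in the cache for future requests...
    cache[key] = value
    # ...and send the data back to the client.
    return value, "miss"

print(handle_request("user:42"))  # first request: a miss, hits the database
print(handle_request("user:42"))  # repeat request: served from the cache
```

The first call for any key pays the database cost; every repeat for that key is answered from memory, which is exactly the librarian's special section near the entrance.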

What to Consider When Implementing a Cache System
Decide When to Use a Cache:
- A cache works best for data that is read frequently but modified infrequently.
- Cache servers are not suitable as the sole store for critical data, because they typically keep data in volatile memory.
- Important data should live in a persistent data store so it is not lost if a cache server restarts.