Revisiting Memcached

While Redis often dominates the conversation, Memcached remains a crucial tool for high-scale, high-concurrency applications. Companies like Facebook, YouTube, and Wikipedia continue to rely on Memcached for their demanding caching needs.

🚨 Before we start

Microsoft's in-memory caching extension (IMemoryCache) is designed for a single instance and cannot be shared across servers. Its distributed caching abstraction (IDistributedCache) does not natively support Memcached, although custom implementations are possible.

What Sets Memcached Apart?

Memcached is a high-performance distributed memory object caching system. Its primary purpose is to cache frequently accessed data.

It uses a straightforward key-value data model, so data such as database query results and user sessions can be read and written by key without any complex query language.
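That model can be sketched locally. The snippet below is a hedged, in-memory analogy (a dictionary plus an expiry timestamp), not the real networked client: Memcached applies the same key → value-with-TTL idea in memory, over the network, across many nodes.

```csharp
using System;
using System.Collections.Generic;

// Local stand-in for Memcached's data model: string keys mapped to
// values that carry an expiration time.
var store = new Dictionary<string, (object Value, DateTime ExpiresAt)>();

void Set(string key, object value, TimeSpan ttl) =>
    store[key] = (value, DateTime.UtcNow + ttl);

object? Get(string key) =>
    store.TryGetValue(key, out var e) && e.ExpiresAt > DateTime.UtcNow
        ? e.Value           // hit: key exists and has not expired
        : null;             // miss: absent or expired

// Cache a database-style result and a session under plain string keys.
Set("product:42", new { Name = "Widget", Price = 9.99m }, TimeSpan.FromSeconds(60));
Set("session:abc123", "user-7", TimeSpan.FromSeconds(30));

Console.WriteLine(Get("session:abc123")); // prints "user-7" while unexpired
```

No query planning, no joins: the only operations are get, set, and delete by key, which is precisely what keeps Memcached fast.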

💪 Key Difference

A key difference is its multithreading capability. Unlike Redis, where the primary request processing flow is handled by a single thread, Memcached can handle multiple client requests concurrently. This architecture makes it exceptionally efficient for applications requiring high throughput and parallel processing.

Memcached Integration in .NET with EnyimMemcachedCore

For .NET, EnyimMemcachedCore simplifies Memcached integration, providing built-in features for robust and reliable caching.

It also exposes the primitives needed to implement safety mechanisms manually:

  • CAS (Check and Set): Implements optimistic locking to prevent lost updates in concurrent environments.
  • Cache Stampede Prevention: Uses Add to implement a "lock" key, ensuring only one client regenerates expired data.
  • Race Condition Avoidance: Uses Increment, Decrement, or Add for atomic, thread-safe counters and flags.
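The Add-based "lock key" pattern from the list above can be sketched against an in-memory stand-in. Memcached's add command succeeds only for the first caller; here ConcurrentDictionary.TryAdd plays that role (with a real client, the TryAdd call would be a store with StoreMode.Add on the lock key). This is a simulation of the pattern, not EnyimMemcachedCore itself:

```csharp
using System;
using System.Collections.Concurrent;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

// In-memory stand-in for Memcached: TryAdd is atomic, like "add".
var cache = new ConcurrentDictionary<string, object>();
int regenerations = 0;

async Task<object> GetOrRegenerateAsync(string key)
{
    if (cache.TryGetValue(key, out var cached)) return cached;

    // Only the caller that wins the "add" on the lock key regenerates
    // the value; everyone else briefly waits and re-reads.
    if (cache.TryAdd(key + ":lock", true))
    {
        Interlocked.Increment(ref regenerations);
        await Task.Delay(50);                    // simulate a slow DB call
        cache[key] = "fresh-value";              // publish before releasing
        cache.TryRemove(key + ":lock", out _);   // release the lock key
        return cache[key];
    }

    while (!cache.TryGetValue(key, out cached)) await Task.Delay(10);
    return cached;
}

// Ten concurrent readers hit a cold key; only one regenerates it.
var results = await Task.WhenAll(
    Enumerable.Range(0, 10).Select(_ => GetOrRegenerateAsync("hot-key")));

Console.WriteLine(regenerations); // prints 1
```

With a real Memcached deployment the lock key would also carry a short TTL, so a crashed regenerator cannot leave the key locked forever.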

public async Task<T> GetAsync<T>(CacheKey key, Func<Task<T>> acquire)
{
	try
	{
		// 🚀 Use GetValueOrCreateAsync to handle cache misses and data fetching in one call
		// 🛡️ Prevents cache stampedes: Ensures only one thread fetches data for a given key
		// 🛡️ Handles race conditions: Internal locking ensures thread-safe cache updates
		var cacheEntry = await _memcachedClient.GetValueOrCreateAsync(
			key.Key, // Cache key
			key.CacheTimeSecond, // Cache expiration time in seconds
			async () => await acquire() // Factory method to fetch data on a cache miss
		);
		_logger.LogInformation("Cache entry for key {CacheKey} is {CacheEntry}", key.Key, cacheEntry);
		return cacheEntry;
	}
	catch (Exception ex)
	{
		// 🛑 Log the error and fall back to fetching fresh data
		_logger.LogError(ex, "Memcached error for key {CacheKey}", key.Key);
		return await acquire(); // 🛠️ Fallback to the data source (e.g., the database)
	}
}

// Program.cs
var memcachedSection = builder.Configuration.GetSection("Memcached");

builder.Services
	.AddOptions<MemcachedClientOptions>()
	.Bind(memcachedSection)
	.ValidateDataAnnotations()
	.Validate(options => options.Servers?.Any() ?? false, "At least one Memcached server must be configured")
	.ValidateOnStart();

builder.Services.AddEnyimMemcached();

// appsettings.json
  "Memcached": {
    "Servers": [
      {
        "Address": "localhost",
        "Port": 11211
      }
    ],
    "SocketPool": {
      "MinPoolSize": 5,
      "MaxPoolSize": 100,
      "ConnectionTimeout": "00:00:10"
    }
  }

✅ Pros of Memcached

  • Multithreaded Architecture: Enables high concurrency and efficient handling of numerous client requests.
  • Performance and Simplicity: Optimized for rapid data retrieval and straightforward deployment.
  • Horizontal Scalability: Supports easy scaling by adding more nodes to the cluster.
  • Distributed Caching: Facilitates sharing cached data across multiple servers.
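Horizontal scaling works because the client, not the server, decides which node owns a key: nodes never talk to each other. A minimal sketch of modulo-based key distribution follows (real clients such as EnyimMemcachedCore use consistent hashing, e.g. ketama, so adding a node remaps far fewer keys; the FNV-1a hash and addresses here are illustrative):

```csharp
using System;
using System.Linq;

// Deterministic string hash (string.GetHashCode is randomized per process,
// so it cannot be used for cross-client key placement).
static uint Hash(string key)
{
    uint h = 2166136261;                        // FNV-1a offset basis
    foreach (char c in key) { h ^= c; h *= 16777619; }
    return h;
}

string[] nodes = { "10.0.0.1:11211", "10.0.0.2:11211", "10.0.0.3:11211" };

// Every client computes the same mapping, so the cache is effectively
// shared across servers without any server-to-server coordination.
string NodeFor(string key) => nodes[Hash(key) % (uint)nodes.Length];

Console.WriteLine(NodeFor("product:42") == NodeFor("product:42")); // prints True
```

Adding capacity is just appending a node to the list on every client; with plain modulo that remaps most keys (a mass cache miss), which is exactly the problem consistent hashing mitigates.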

❌ Cons of Memcached

  • No Native Replication: Requires external solutions for data replication and redundancy.
  • Lack of Persistence: Data is not persisted to disk.
  • Limited Data Structures: Primarily supports basic key-value pairs.

Implementation Here: https://github.com/Maxofpower/FeatureManagement
