Caching

Setting up a caching provider

By default, a MemoryCacheClient is registered in every AppHost without any additional code. This means all the features that use ICacheClient work out of the box with no additional setup.

Configuring another implementation requires registering an ICacheClient with your application's IoC container. Providers backed by a connection pool are usually registered as a factory Func that resolves a single client from the pool, as shown below.
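As a minimal sketch, this is what registering the Redis cache provider could look like, assuming a Redis instance is available at localhost:6379 (the AppHost name and assembly are placeholders):

```csharp
using Funq;
using ServiceStack;
using ServiceStack.Redis;

public class AppHost : AppHostBase
{
    public AppHost() : base("MyApp", typeof(AppHost).Assembly) { }

    public override void Configure(Container container)
    {
        // Register the pooled Redis clients manager (host is an assumption)
        container.Register<IRedisClientsManager>(c =>
            new RedisManagerPool("localhost:6379"));

        // Register ICacheClient as a factory Func that resolves a cache client from the pool
        container.Register(c =>
            c.Resolve<IRedisClientsManager>().GetCacheClient());
    }
}
```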

Supported Caching Providers

ServiceStack comes with support for multiple Cache Providers out of the box. These include:

  • Memory Cache
  • Redis
  • OrmLiteCacheClient
  • AWS DynamoDb
  • Azure Table Storage
  • Memcached

Each of these options has its own strengths. MemoryCache is useful for single-host web services that don't need any infrastructure dependencies. Redis is a common choice as a fast key-value store with non-volatile persistent storage and support for rich data structures.

OrmLiteCacheClient supports all of OrmLite's RDBMS providers, letting you use an existing RDBMS as a distributed cache. Memcached is a tried and tested distributed memory caching provider. AWS DynamoDb and Azure Table Storage give you the option of a cloud-managed solution that may better suit your use case.
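For example, a sketch of registering the OrmLiteCacheClient inside the AppHost's Configure(Container container), assuming a connectionString variable and SQL Server as the dialect (both are placeholders):

```csharp
// Register the OrmLite connection factory the cache client will use
container.Register<IDbConnectionFactory>(c =>
    new OrmLiteConnectionFactory(connectionString, SqlServerDialect.Provider));

// Use the registered RDBMS as the ICacheClient implementation
container.RegisterAs<OrmLiteCacheClient, ICacheClient>();

// Create the backing cache table in the RDBMS if it doesn't already exist
container.Resolve<ICacheClient>().InitSchema();
```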

Using Cache in Services

ServiceStack Services have an ICacheClient autowired into their Cache property, which can be used for custom cache population or retrieval via Get&lt;T&gt; and Set&lt;T&gt;.
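A minimal sketch of this cache-aside pattern inside a Service; the DTOs, cache key convention and LoadCustomer helper are hypothetical placeholders:

```csharp
using System;
using ServiceStack;

// Hypothetical DTOs used for illustration
public class GetCustomer : IReturn<Customer>
{
    public int Id { get; set; }
}

public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class CustomerServices : Service
{
    public object Any(GetCustomer request)
    {
        var cacheKey = "customer:" + request.Id; // hypothetical key convention

        // Try the cache first, otherwise load the entity and cache it for 10 minutes
        var customer = Cache.Get<Customer>(cacheKey);
        if (customer == null)
        {
            customer = LoadCustomer(request.Id);
            Cache.Set(cacheKey, customer, TimeSpan.FromMinutes(10));
        }
        return customer;
    }

    // Placeholder for however the entity is normally loaded, e.g. from a database
    private Customer LoadCustomer(int id) =>
        new Customer { Id = id, Name = "Customer " + id };
}
```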

The [CacheResponse] attribute is a built-in Request Filter attribute for easily adding caching to an endpoint. It can be customized with VaryByUser and VaryByRoles, as well as options controlling the cache duration and client max-age.
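A sketch of applying it to a Service, with hypothetical DTOs and example durations:

```csharp
using ServiceStack;

// Hypothetical DTOs used for illustration
public class GetLatestNews : IReturn<GetLatestNewsResponse> { }
public class GetLatestNewsResponse { public string Html { get; set; } }

// Cache the server response for 60s, emit a Cache-Control max-age of 30s,
// and keep a separate cache entry per authenticated user
[CacheResponse(Duration = 60, MaxAge = 30, VaryByUser = true)]
public class NewsServices : Service
{
    public object Any(GetLatestNews request) =>
        new GetLatestNewsResponse { Html = "<h1>Latest News</h1>" };
}
```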

Since the [CacheResponse] attribute can be applied to an entire Service class, it enables patterns like creating a cached proxy of existing Services with minimal code by forwarding requests via the Service's Gateway property.
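A sketch of this cached-proxy pattern, where GetReport is assumed to be handled by an existing (uncached) Service and GetCachedReport is a new cached endpoint that simply forwards to it (all DTO names are placeholders):

```csharp
using ServiceStack;

// Hypothetical DTOs used for illustration
public class GetReportResponse { public string Result { get; set; } }

public class GetReport : IReturn<GetReportResponse>       // handled by an existing Service
{
    public int Id { get; set; }
}

public class GetCachedReport : IReturn<GetReportResponse> // new cached endpoint
{
    public int Id { get; set; }
}

// The class-level [CacheResponse] caches every response of this Service, while the
// implementation just forwards the request to the existing Service via the in-process Gateway
[CacheResponse(Duration = 10 * 60)]
public class CachedReportServices : Service
{
    public object Any(GetCachedReport request) =>
        Gateway.Send(request.ConvertTo<GetReport>());
}
```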