Using the HTTP Caching Policy in Policy Manager

Learn how to improve message-processing performance by using the HTTP Caching Policy to cache responses to previously made service requests.

For general information about policies, see About Policies, Managing Policies, and About Operational Policies.

For information about using policies in the context of the developer portal, see Business Policies.

Table of Contents

  1. Introduction
  2. HTTP Caching Policy Modes
  3. How Stale Caches are Managed
  4. HTTP Caching Policy Options
  5. Public/Private Caching
  6. Configuration

Introduction

Caching is used to increase performance by storing responses to previous requests so that they can be served as a response to similar new requests that come in, often without requiring the overhead of a downstream server round-trip.

To support caching, Akana provides an HTTP Caching Policy.

The HTTP Caching Policy is an Operational policy that allows you to:

  • Define how long a response can be cached for HTTP requests.
  • Select a caching mode (HTTP Proxy Mode or HTTP Mediation/Server Mode).
  • Select a caching scope (shared or private).

HTTP Caching Policy Modes

The HTTP Caching policy supports the following modes:

  • HTTP Proxy Mode: In this mode, the container expects the downstream call to be HTTP and to return cache-control headers. If cache-control headers are present in the downstream response, caching is performed according to the directives they contain. Note that these directives can take precedence over the policy configuration settings; for example, a max-age directive overrides the policy's Time To Live setting (see the sketch after this list).
  • HTTP Mediation / Server Mode: In this mode, the presence or absence of downstream caching headers is not taken into account; any caching headers included in the response are ignored. Caching decisions made by the HTTP Caching policy are based entirely on the client's cache-control headers, if present.
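
The difference between the two modes can be illustrated with a small sketch. The function below is not Akana code; using illustrative names only, it shows how a downstream max-age directive could override a configured Time To Live in proxy mode while being ignored in mediation/server mode.

```python
import re

def effective_ttl(response_headers: dict, policy_ttl: int, proxy_mode: bool) -> int:
    """Illustrative only: decide how long a response may be cached.

    In proxy mode, a downstream Cache-Control: max-age directive takes
    precedence over the policy's Time To Live; in mediation/server mode,
    downstream caching headers are ignored.
    """
    if proxy_mode:
        cache_control = response_headers.get("Cache-Control", "")
        match = re.search(r"max-age=(\d+)", cache_control)
        if match:
            return int(match.group(1))   # directive wins over the policy TTL
        if "no-store" in cache_control:
            return 0                     # downstream forbids caching
    return policy_ttl                    # fall back to the policy setting

# The downstream service allows 60 seconds of caching, so a policy TTL of
# 300 seconds is overridden in proxy mode but kept in mediation/server mode.
print(effective_ttl({"Cache-Control": "max-age=60"}, policy_ttl=300, proxy_mode=True))   # 60
print(effective_ttl({"Cache-Control": "max-age=60"}, policy_ttl=300, proxy_mode=False))  # 300
```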

How Stale Caches are Managed

When a message is processed and its headers are read, the policy determines whether the cached entry is stale. If the entry is found to be stale, response validators are used to send a conditional request to the origin server. If the server indicates that the cached response is still valid, the cached response is used; otherwise, a full response is returned and used in place of the cached one.

For more information on the stale-cache handling approach used for this policy, see Section 13 of RFC 2616 (Caching in HTTP): https://www.w3.org/Protocols/rfc2616/rfc2616-sec13.html#sec13.
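
As an illustration of the revalidation flow described above, the sketch below performs a conditional GET with Python's requests library against a hypothetical URL; the policy carries out the equivalent exchange internally.

```python
import requests

URL = "https://example.com/resource"  # hypothetical endpoint, for illustration only

# First request: cache the body along with its validator (ETag).
first = requests.get(URL)
cached_body = first.content
etag = first.headers.get("ETag")

# Later, when the cached entry is stale, revalidate with a conditional request.
headers = {"If-None-Match": etag} if etag else {}
revalidation = requests.get(URL, headers=headers)

if revalidation.status_code == 304:
    # 304 Not Modified: the cached response is still valid and can be reused.
    body = cached_body
else:
    # Full response: replace the stale cached entry.
    body = revalidation.content
```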

HTTP Caching Policy Options

The policy includes the following configuration options:

  • Time To Live: Allows you to specify the maximum time in seconds a response will be cached. If not specified, the maximum time is determined by the container settings.
  • Staleness Period Seconds: If a value is entered, a cached entry remains in the cache for the number of seconds in Time To Live plus the number of seconds in the Staleness Period. The "stale" portion of the entry is used only if the Cache-Control directives on the request allow a stale entry to be served, via a max-stale directive (see the sketch after this list). The default is 0.
  • Act as HTTP Proxy: Check to enable HTTP Proxy Mode; uncheck to enable HTTP Mediation / Server Mode.
  • Shared Cache: Check to use a shared (public) cache; uncheck to use a private cache. If unchecked, you must also select a Subject Category (below).
  • Subject Category: Allows you to select a subject category when Shared Cache is unchecked. The subject category is used to determine the principal name that is set when the cache entry is created, and that name is used as part of the cache key.
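
The way Time To Live, Staleness Period Seconds, and the client's max-stale directive interact can be summarized in a short sketch. This is illustrative only, not the product's internal implementation.

```python
def can_serve_from_cache(entry_age: float, ttl: int, staleness_period: int,
                         request_max_stale: int | None) -> bool:
    """Illustrative freshness check, assuming the policy semantics described above.

    entry_age          -- seconds since the response was cached
    ttl                -- policy Time To Live (seconds)
    staleness_period   -- policy Staleness Period Seconds
    request_max_stale  -- value of the request's max-stale directive, or None
    """
    if entry_age <= ttl:
        return True                  # entry is still fresh
    if request_max_stale is None:
        return False                 # client does not accept stale entries
    # The stale portion may be served only within the staleness period
    # and only as far as the client's max-stale directive allows.
    stale_for = entry_age - ttl
    return stale_for <= staleness_period and stale_for <= request_max_stale

# Fresh entry (age 100 s, TTL 300 s): served from cache.
print(can_serve_from_cache(100, ttl=300, staleness_period=60, request_max_stale=None))  # True
# Stale by 30 s, client allows max-stale=120, staleness period 60 s: still served.
print(can_serve_from_cache(330, ttl=300, staleness_period=60, request_max_stale=120))   # True
# Stale, but the client sent no max-stale directive: not served from cache.
print(can_serve_from_cache(330, ttl=300, staleness_period=60, request_max_stale=None))  # False
```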

Public/Private Caching

By default, the caching module considers itself to be a shared (public) cache (Shared Cache checked), and will not, for example, cache responses to requests with Authorization headers or responses marked with Cache-Control: private. If, however, the cache is only going to be used by one logical "user" (behaving similarly to a browser cache), then you will want to turn off the shared cache setting (Shared Cache unchecked).
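
A rough sketch of the shared-versus-private distinction, using illustrative names only: for a private cache, the principal resolved through the Subject Category becomes part of the cache key, so entries cached for one user are never served to another.

```python
from hashlib import sha256

def cache_key(method: str, url: str, shared: bool, principal: str | None = None) -> str:
    """Illustrative cache-key construction.

    For a shared (public) cache the key depends only on the request;
    for a private cache the principal name is mixed in as well, so
    responses cached for one user are never served to another.
    """
    parts = [method.upper(), url]
    if not shared:
        if principal is None:
            raise ValueError("a private cache requires a principal (Subject Category)")
        parts.append(principal)
    return sha256("|".join(parts).encode()).hexdigest()

# Shared cache: all users hit the same entry.
print(cache_key("GET", "/orders/42", shared=True))
# Private cache: each principal gets a distinct entry.
print(cache_key("GET", "/orders/42", shared=False, principal="alice"))
print(cache_key("GET", "/orders/42", shared=False, principal="bob"))
```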

Configuration

Let's take a quick walkthrough of the HTTP Caching Policy configuration process to get you started.

Step 1: Add Policy / Use System Policy

In Policy Manager, to create an HTTP Caching Policy instance, go to Policies > Operational Policies and choose Add Policy.

Step 2: Modify Policy

On the Policy Details page, click Modify to make changes to the HTTP Caching policy and display the policy options.

Configure the policy options based on your requirements and click Apply.

Step 3: Attach Policy

After you've saved your policy, you can attach it to the web service, binding, or binding operation whose message processing you want to enhance.

Step 4: Test Policy and View Monitoring Data

After you've attached the HTTP Caching Policy to a web service, operation, or binding, send a request to your service and go to the Services > Monitoring section to view the results in Logs, Real Time Charts, and Historical Charts. For more information on using the monitoring functions, refer to the Policy Manager Online Help, available via the Help button.
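
A simple way to check that caching is taking effect is to send the same request twice and compare timings; the endpoint URL below is a hypothetical placeholder for your own virtual service.

```python
import time
import requests

URL = "https://your-gateway.example.com/myservice/operation"  # hypothetical virtual service endpoint

for attempt in (1, 2):
    start = time.perf_counter()
    response = requests.get(URL)
    elapsed = time.perf_counter() - start
    # The second attempt should normally be faster if the response was cached,
    # and the monitoring logs should show the cached response being used.
    print(f"attempt {attempt}: status={response.status_code} elapsed={elapsed:.3f}s")
```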
