User:TerryE/Traffic Server Configuration

Revision as of 00:11, 22 August 2011

Why Traffic Server?

Apache Traffic Server is a lightweight yet high-performance web proxy cache that improves network efficiency and performance[1]. Like Squid and Varnish, Traffic Server can be configured as a reverse proxy[2]. In this mode it acts as a full surrogate for the back-end wiki, with port 80 on the wiki's advertised hostname resolving to Traffic Server. This offloads the processing of most web requests from the PHP- and database-intensive MediaWiki application.

Traffic Server can be configured to keep high-frequency cached content in memory, and even where content has been flushed to disk, serving it still involves significantly less physical I/O than regenerating it through the MediaWiki application. This permits a significantly higher throughput for a given CPU and I/O budget. MediaWiki has been designed to integrate closely with such web cache packages and will notify Traffic Server when a page should be purged from the cache so that it can be regenerated. From MediaWiki's point of view, a correctly-configured Traffic Server installation is interchangeable with Squid or Varnish.

The architecture

An example setup of Traffic Server, Apache and MediaWiki on a single server is outlined below. A more complex caching strategy may use multiple web servers behind the same Traffic Server caches (all of which can be made to appear to be a single host) or use independent servers to deliver wiki or image content.

Outside world  <--->  Traffic Server accelerator  <--->  Apache webserver
                      w.x.y.z:80                         127.0.0.1:80
                      (both services run on the same server)

To the outside world, Traffic Server appears to act as the web server. In reality it passes requests on to the Apache web server, but only when necessary. The Apache instance running on the same server listens only for requests from localhost (127.0.0.1), while Traffic Server listens only on the server's external IP address. Both services run on port 80 without conflict, as each is bound to a different IP address.
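The Apache side of this split can be made explicit with a single directive in httpd.conf; a minimal sketch, assuming the addressing used in the diagram above (Traffic Server's own listening address is set separately in its configuration and is not shown here):

# Bind Apache to the loopback address only, leaving the external IP free for Traffic Server
Listen 127.0.0.1:80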

Configuring Traffic Server 2.x

/etc/sysconfig/Traffic Server

This is the first configuration file loaded by Traffic Server on startup. It specifies the amount of memory to be allocated to the Traffic Server cache, the location of the main (*.vcl) configurations and the specific IP addresses to which Traffic Server must respond.

The remainder of the configuration data, including the address of the backend server(s), is listed in the main *.vcl file - not here.

# Needs to go in here

A sample XXXX

The address(es) of the backend server(s) must be specified here. In a simple installation, one server (localhost) is sufficient. Larger sites may operate multiple wiki or image servers behind a single Traffic Server cache[3]:

# set default backend if no server cluster specified
backend default {
        .host = "localhost";
        .port = "80"; }
 
# create a round-robin director: "apaches" uses roberto and sophia as backend servers.
director apaches round-robin {
  { .backend = { .host = "wiki1"; .port = "80"; } }
  { .backend = { .host = "wiki2"; .port = "80"; } } }
 
# access control list for "purge": open to only localhost and other local nodes
acl purge {
        "localhost";
        "wiki1";
        "wiki2";
        "image1";
}

If more than one backend webserver is available, a list of servers to be used may be selected here on a per-domain basis. This could allow multiple, relatively powerful servers to be used to respond to wiki page text requests while requests for static images are handled on a local web server. A simple one-server installation would simply pass all unhandled requests to the default web server.

Any request other than a simple GET or HEAD will be passed directly through to the web server, along with all requests from logged-in users.

Most common browsers support compression (gzip or deflate) of returned pages. While Traffic Server itself performs no compression, it is configured here to store separate copies of a page depending on whether the user's browser supports compression.[4] If a browser accepts both gzip and deflate, the gzip version of the page is served, as it is smaller and therefore slightly quicker to deliver. The vcl_recv code below normalises the browser's reported capabilities so that the gzipped version of a page is served wherever possible and the number of cached variants is kept small.

# vcl_recv is called whenever a request is received 
sub vcl_recv {
        # Serve objects up to 2 minutes past their expiry if the backend
        # is slow to respond.
        set req.grace = 120s;
 
        # Use our round-robin "apaches" cluster for the backend.
        if (req.http.host ~ "^images.example.org$") 
           {set req.backend = default;}
        else
           {set req.backend = apaches;}
 
        # This uses the ACL action called "purge". Basically if a request to
        # PURGE the cache comes from anywhere other than localhost, ignore it.
        if (req.request == "PURGE") 
            {if (!client.ip ~ purge)
                {error 405 "Not allowed.";}
            lookup;}
 
        # Pass any requests that Traffic Server does not understand straight to the backend.
        if (req.request != "GET" && req.request != "HEAD" &&
            req.request != "PUT" && req.request != "POST" &&
            req.request != "TRACE" && req.request != "OPTIONS" &&
            req.request != "DELETE") 
            {pipe;}     /* Non-RFC2616 or CONNECT which is weird. */
 
        # Pass anything other than GET and HEAD directly.
        if (req.request != "GET" && req.request != "HEAD")
           {pass;}      /* We only deal with GET and HEAD by default */
 
        # Pass requests from logged-in users directly.
        if (req.http.Authorization || req.http.Cookie)
           {pass;}      /* Not cacheable by default */
 
        # Pass any requests with the "If-None-Match" header directly.
        if (req.http.If-None-Match)
           {pass;}
 
        # Force lookup if the request is a no-cache request from the client.
        if (req.http.Cache-Control ~ "no-cache")
           {purge_url(req.url);}
 
        # normalize Accept-Encoding to reduce vary
        if (req.http.Accept-Encoding) {
          if (req.http.User-Agent ~ "MSIE 6") {
            unset req.http.Accept-Encoding;
          } elsif (req.http.Accept-Encoding ~ "gzip") {
            set req.http.Accept-Encoding = "gzip";
          } elsif (req.http.Accept-Encoding ~ "deflate") {
            set req.http.Accept-Encoding = "deflate";
          } else {
            unset req.http.Accept-Encoding;
          }
        }
 
        lookup;
}

Traffic Server must pass the user's IP address as part of the 'x-forwarded-for' header, so that MediaWiki may be configured to display the user's address in MWwiki:special:recentchanges instead of Traffic Server's local IP address.

sub vcl_pipe {
        # Note that only the first request to the backend will have
        # X-Forwarded-For set.  If you use X-Forwarded-For and want to
        # have it set for all requests, make sure to have:
        # set req.http.connection = "close";
 
        # This is otherwise not necessary if you do not do any request rewriting.
 
        set req.http.connection = "close";
}

Traffic Server must be configured to allow a PURGE request from MediaWiki, instructing the cache to discard stored copies of pages which have been modified by user edits. These requests normally originate only from wiki servers within the local site.

If the page is not in the cache, a 200 (success) code is still returned as the objective is to remove the outdated page from the cache.

# Called if the cache has a copy of the page.
sub vcl_hit {
        if (req.request == "PURGE") 
            {purge_url(req.url);
            error 200 "Purged";}
 
        if (!obj.cacheable)
           {pass;}
}
 
# Called if the cache does not have a copy of the page.
sub vcl_miss {
        if (req.request == "PURGE") 
           {error 200 "Not in cache";}
}

The web server may set default expiry times for various objects. As MediaWiki will indicate (via a PURGE request) when a page has been edited and therefore needs to be discarded from cache, the Apache-reported defaults for expiry time are best ignored or replaced with a significantly-longer expiry time.

Pages served to logged-in users (identified by MediaWiki setting browser cookies) or which require passwords to access are never cached.

In this example, the 'no-cache' flag is ignored on pages served to anonymous (IP) users. Such a measure is normally only needed if a wiki makes extensive use of extensions which add this flag indiscriminately (such as a wiki packed with random <choose>/<option> tags on the main page and in various often-used templates).

# Called after a document has been successfully retrieved from the backend.
sub vcl_fetch {
 
        # set minimum timeouts to auto-discard stored objects
#       set beresp.prefetch = -30s;
        set beresp.grace = 120s;
 
        if (beresp.ttl < 48h) {
          set beresp.ttl = 48h;}
 
        if (!beresp.cacheable) 
            {pass;}
 
        if (beresp.http.Set-Cookie) 
            {pass;}
 
#       if (beresp.http.Cache-Control ~ "(private|no-cache|no-store)") 
#           {pass;}
 
        if (req.http.Authorization && !beresp.http.Cache-Control ~ "public") 
            {pass;}
 
}
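The corresponding knob on the MediaWiki side is $wgSquidMaxage, which sets the s-maxage lifetime that MediaWiki advertises to the cache for anonymous page views once cache support is enabled (see the next section). A minimal LocalSettings.php sketch; the value is illustrative, chosen here to match the 48-hour TTL forced in vcl_fetch above:

// Lifetime advertised to the reverse proxy; PURGE requests still invalidate edited pages immediately
$wgSquidMaxage = 48 * 3600;   // 48 hours, matching the TTL in vcl_fetch (illustrative)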

Configuring MediaWiki

Since Traffic Server makes its requests to Apache from localhost, Apache sees "127.0.0.1" as the direct remote address. However, when Traffic Server forwards a request to Apache it is configured to add the "X-Forwarded-For" header so that the original client address is preserved. MediaWiki must be configured to use the "X-Forwarded-For" header in order to correctly display user addresses in MWwiki:special:recentchanges.

The required configuration is the same for Squid as for Traffic Server. Make sure the LocalSettings.php file contains the following lines:

$wgUseSquid = true;
$wgSquidServers = array('example.org:80');
//Use $wgSquidServersNoPurge if you don't want MediaWiki to purge modified pages
//$wgSquidServersNoPurge = array('127.0.0.1');

Be sure to replace 'example.org' with the IP address on which your Traffic Server cache is listening. These settings serve two purposes:

  • If a request is received from the Traffic Server cache server, the MediaWiki logs need to display the IP address of the user, not that of Traffic Server. A MWwiki:special:recentchanges in which every edit is reported as '127.0.0.1' is all but useless; listing that address as a Squid/Traffic Server server tells MediaWiki to ignore the IP address and instead look at the 'x-forwarded-for' header for the user's IP.
  • If a page or image is changed on the wiki, MediaWiki will send notification to every server listed in $wgSquidServers telling it to discard (purge) the outdated stored page.

See also Squid configuration settings for all settings related to Squid/Traffic Server caching.

Some notes

As most of the traffic is handled by the Traffic Server cache, a statistics package[5] will not give meaningful data if configured to analyse Apache's access_log. There are packages available to log Traffic Server access data to a file for analysis if needed. Counters on individual wiki pages will also severely underestimate the number of views to each page (and to the site overall) if a web cache is deployed. Many large sites will turn off the counters with $wgDisableCounters.

The display of the user's IP address in the user interface must also be disabled by setting $wgShowIPinHeader = false;
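Both of the settings mentioned above go in LocalSettings.php; a minimal sketch:

// View counters undercount badly behind a cache, so switch them off
$wgDisableCounters = true;
// Do not show the visitor's IP address in the interface; cached pages would show a stale address
$wgShowIPinHeader = false;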

Note that Traffic Server is an alternative to Squid, but does not replace other portions of a complete MediaWiki caching strategy such as:

Pre-compiled PHP code
The default behaviour of PHP under Apache is to load and interpret PHP web scripts each time they are accessed. Installation of a cache such as APC (yum install php-pecl-apc, then allocate memory by setting apc.shm_size=128 or better in /etc/php.d/apc.ini) can greatly reduce the amount of CPU time required by Apache to serve PHP content.
Localisation/Internationalisation
By default, MediaWiki (as of version 1.16+) will create a huge l10n_cache database table and access it constantly, possibly more than doubling the load on the database server after an "upgrade" to the latest MediaWiki version. Set $wgLocalisationCacheConf to force the localisation information to be stored on the file system to remedy this; a short example follows this list.
Variables and session data
Storing variable data such as the MediaWiki sidebar, the list of namespaces or the spam blacklist to a memory cache will substantially increase the speed of a MediaWiki installation. Forcing user login data to be stored in a common location is also essential to any installation in which multiple, interchangeable Apache servers are hidden behind the same Traffic Server caches to serve pages for the same wikis. Install the memcached package and set the following options in LocalSettings.php to force both user login information and cached variables to use memcache:
$wgMainCacheType = CACHE_MEMCACHED;
$wgMemCachedServers = array ( '127.0.0.1:11211' );
$wgSessionsInMemcached = true;
$wgUseMemCached = true;
Note that, if you have multiple servers, the localhost address needs to be replaced with that of the shared memcached server(s), which must be the same for all of the matching web servers at your site. This ensures that logging a user into one server in the cluster logs them into the wiki on all the interchangeable web servers.
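For the localisation cache item above, a minimal LocalSettings.php sketch (MediaWiki 1.16 or later; the cache directory path is illustrative and must be writable by the web server):

// Store localisation data as files rather than in the l10n_cache database table
$wgCacheDirectory = "$IP/cache";
$wgLocalisationCacheConf['store'] = 'files';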

In many cases, there are multiple alternative caching approaches which will produce the same result. See MWmanual:Cache.

Apache configuration

Log file

The Apache web server log, by default, shows only the address of the Traffic Server cache server; in this example every request appears to come from the localhost address 127.0.0.1.

Apache may be configured to log the original user's address by capturing "x-forwarded-for" information under a custom log file format.[6]

An example for Apache's httpd.conf to configure logging of x-forwarded-for is:

LogFormat "%{X-Forwarded-for}i %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\"" cached
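To actually write a log in this format, a CustomLog directive referring to the 'cached' nickname is also required; the log file path below is purely illustrative:

CustomLog /var/log/httpd/access_cached_log cached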

Image hotlinking

If a site uses Apache's mod_rewrite to block attempts by other websites to hotlink images, those rules will need to be removed from Apache and equivalent rules added to Traffic Server's configuration. Where an image server sits behind Traffic Server, typically 90% or more of common image requests never reach Apache and therefore will not be blocked by an "http_referer" check in Apache's configuration.

See also

References

  1. http://trafficserver.apache.org/
  2. http://trafficserver.apache.org/docs/v2/admin/reverse.htm
  3. http://wikia.googlecode.com/svn/utils/varnishhtcpd/mediawiki.vcl
  4. http://varnish.projects.linpro.no/wiki/FAQ/Compression
  5. AWStats
  6. http://httpd.apache.org/docs-2.2/mod/mod_log_config.html