Prevent your continuous integration/deployment workflow from pummeling the npm registry’s CouchDB backend. Reduce headaches and errors during npm install.
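One way to sketch this is with Nginx’s `proxy_cache` sitting between your build machines and the registry. This is a minimal illustration, not the post’s exact setup; the port, cache path, and sizes are assumptions:

```nginx
# Hypothetical local caching proxy in front of the npm registry.
proxy_cache_path /var/cache/nginx/npm levels=1:2 keys_zone=npm:16m
                 inactive=1d max_size=5g;

server {
    listen 8080;

    location / {
        proxy_pass https://registry.npmjs.org;
        proxy_set_header Host registry.npmjs.org;
        # Serve repeat downloads from the local cache instead of the registry.
        proxy_cache npm;
        proxy_cache_valid 200 302 1h;
    }
}
```

Build machines would then point npm at the proxy, e.g. `npm config set registry http://localhost:8080/`.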
Now that PyPI is being accelerated by the Fastly caching network, pip and easy_install are already running faster. However, this can be taken a step further by setting up a simple caching proxy. By caching packages locally (on the machine or in your private network), you don’t have to keep hitting Fastly/PyPI to download them. This is especially useful if you are constantly running builds and/or tests: AKA continuous integration.
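On the client side, pip can be pointed at such a proxy once and for all. A minimal sketch, assuming a hypothetical proxy at `pypi-cache.internal:8080` (the address is illustrative, not from the post):

```ini
# ~/.pip/pip.conf -- send all index requests through a local caching proxy
[global]
index-url = http://pypi-cache.internal:8080/simple/
```

With that in place, every `pip install` on the machine goes through the cache with no extra flags.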
Phabricator is an awesome suite of tools open sourced by Facebook and now maintained by Phacility. At Disqus, it’s the central nexus of our engineering team. Since so much of an engineer’s day revolves around using the web interface, I was tasked with trying to optimize our local instance of it. The quickest win was installing and enabling APC per the Installation Guide. Next up, I opened the network tab of Chrome’s developer tools and found that PHP was handling the serving of static assets. Granted, Phabricator sets very sane and liberal cache headers so browsers will heavily cache all the assets, but each browser still needs to fetch them once. To ease the pain of that first load, I set up Nginx to cache them as well. This way, PHP only has to serve and/or generate each asset once, and something far better at serving static content can handle the heavy lifting from then on out.
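As a rough sketch of the idea (the backend address and cache sizes are assumptions, and this caches Phabricator’s static resources, which it serves under `/res/`):

```nginx
# Hypothetical cache for Phabricator's static assets, so PHP renders
# each one only once and Nginx serves it thereafter.
proxy_cache_path /var/cache/nginx/phabricator levels=1:2
                 keys_zone=phab_static:10m max_size=1g;

server {
    listen 80;
    server_name phabricator.example.com;

    location /res/ {
        proxy_pass http://127.0.0.1:8080;   # assumed PHP backend
        proxy_cache phab_static;
        proxy_cache_valid 200 30d;          # assets are content-addressed, so cache long
    }

    location / {
        proxy_pass http://127.0.0.1:8080;
    }
}
```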
In the upcoming 0.4 release of the nginx-push-stream-module, it gains support for Nginx’s gzip filter. Being able to gzip messages frees up bandwidth and decreases latency under high load. However, the default deflate settings Nginx uses are not ideal for the high concurrency and small messages typically sent with the push-stream module. By default, Nginx may allocate a relatively large (~264 KB) chunk of memory for zlib up front for every request that supports gzip. This adds up fast when there are thousands of concurrent connections to Nginx.
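Where that number comes from: the zlib manual gives deflate’s memory footprint as `(1 << (windowBits+2)) + (1 << (memLevel+9))` bytes, plus a few kilobytes of bookkeeping. A quick sketch of the arithmetic:

```python
def deflate_memory(window_bits: int, mem_level: int) -> int:
    """Per-stream deflate memory, per the formula in the zlib manual."""
    return (1 << (window_bits + 2)) + (1 << (mem_level + 9))

# zlib defaults (windowBits=15, memLevel=8):
print(deflate_memory(15, 8))  # 262144 bytes, ~256 KB per gzipped connection

# Much smaller settings, more suited to tiny push-stream messages:
print(deflate_memory(9, 1))   # 3072 bytes
```

At 10,000 concurrent gzipped connections, the defaults would reserve on the order of 2.5 GB for zlib alone, which is why tuning these parameters down matters for this workload.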
Nginx already includes a neat module to proxy requests to a memcached server (memcached_pass). Combine that with the round-robin upstream load balancer, and you have the beginnings of a memcached page-cache cluster for your Nginx. However, trying each server sequentially can become quite expensive if the key resides on one of the last servers.
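The basic combination looks something like this (server addresses and the backend name are assumptions for illustration):

```nginx
upstream memcached_cluster {
    server 10.0.0.1:11211;
    server 10.0.0.2:11211;
}

server {
    listen 80;

    location / {
        set $memcached_key "$uri";          # key to look up in memcached
        memcached_pass memcached_cluster;   # round-robin across the cluster
        # On a miss (or a down server), fall through to the real app.
        error_page 404 502 504 = @fallback;
    }

    location @fallback {
        proxy_pass http://backend;          # hypothetical application upstream
    }
}
```

The catch described above: with plain round robin, a given key only lives on one server, so a request may have to fail over through several wrong servers before finding it.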