<?xml version="1.0" encoding="utf-8"?>
<feed xml:lang="en-us" xmlns="http://www.w3.org/2005/Atom"><title>Simon Willison's Weblog: bigpage</title><link href="http://simonwillison.net/" rel="alternate"/><link href="http://simonwillison.net/tags/bigpage.atom" rel="self"/><id>http://simonwillison.net/</id><updated>2010-02-19T09:14:08+00:00</updated><author><name>Simon Willison</name></author><entry><title>Making Facebook 2x Faster</title><link href="https://simonwillison.net/2010/Feb/19/facebook/#atom-tag" rel="alternate"/><published>2010-02-19T09:14:08+00:00</published><updated>2010-02-19T09:14:08+00:00</updated><id>https://simonwillison.net/2010/Feb/19/facebook/#atom-tag</id><summary type="html">
    
&lt;p&gt;&lt;strong&gt;&lt;a href="http://www.facebook.com/note.php?note_id=307069903919"&gt;Making Facebook 2x Faster&lt;/a&gt;&lt;/strong&gt;&lt;/p&gt;
Facebook have a system called BigPipe which allows them to progressively send their pages to the browser as the server-side processing completes, optimising client loading time. Has anyone reverse engineered this yet to figure out how they actually do it?


    &lt;p&gt;Tags: &lt;a href="https://simonwillison.net/tags/bigpage"&gt;bigpage&lt;/a&gt;, &lt;a href="https://simonwillison.net/tags/facebook"&gt;facebook&lt;/a&gt;, &lt;a href="https://simonwillison.net/tags/optimisation"&gt;optimisation&lt;/a&gt;, &lt;a href="https://simonwillison.net/tags/performance"&gt;performance&lt;/a&gt;&lt;/p&gt;



</summary><category term="bigpage"/><category term="facebook"/><category term="optimisation"/><category term="performance"/></entry></feed>