Server:WebProxy

Wishes warning! This article or section documents one or more OpenMoko Wish List items; the features described here may or may not be implemented in the future.

This is a brief page describing a web proxy optimised for use on devices with a reasonable amount of persistent storage but very limited bandwidth.

Once, each page linked to subsidiary content that remained static; the dates in the HTTP headers made it easy to tell whether anything had changed and to refresh it cheaply.

That is now true of only a minority of popular sites. Most sites serve a substantial fraction of pages with some non-static content.

As an example, consider http://www.ebay.com/index.html.

Over a 15-minute period its size stayed roughly constant at around 66K, yet the content differed almost every time it was loaded.

Simply compressing this page with advanced compression techniques already provides a useful saving, taking the page down to 15K.

A very simple test using diff and gzip, however, revealed that the variation between successive versions of the page is quite small.

This means that when the user clicks 'reload', a proxy that merely compresses the page still makes the user download 15K.

If, however, the user-agent and the proxy act in concert, this can be reduced to under 0.5K: split the page on "<" so the comparison aligns on tag boundaries, then count only the compressed differences.

This works by having the user-agent cache the pages it downloads and then inform the proxy which version of each page it holds.

The proxy then sends only the compressed differences between that version and the current one.
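
A minimal sketch of that exchange in Python, assuming the proxy still holds a copy of the version the user-agent reported (difflib and zlib stand in here for whatever real diff and compression scheme an implementation would pick, and the opcode format is invented for illustration):

    import json
    import zlib
    from difflib import SequenceMatcher

    def make_delta(old_html: str, new_html: str) -> bytes:
        """Proxy side: describe new_html as edits against old_html."""
        # Split on '<' so the diff aligns on tag boundaries, as noted above.
        old, new = old_html.split("<"), new_html.split("<")
        ops = []
        for tag, i1, i2, j1, j2 in SequenceMatcher(a=old, b=new).get_opcodes():
            if tag == "equal":
                ops.append(["=", i1, i2])      # reuse chunks the client has
            else:
                ops.append(["+", new[j1:j2]])  # ship only the changed chunks
        return zlib.compress(json.dumps(ops).encode("utf-8"), 9)

    def apply_delta(old_html: str, delta: bytes) -> str:
        """User-agent side: rebuild the current page from the cached copy."""
        old = old_html.split("<")
        out = []
        for op in json.loads(zlib.decompress(delta)):
            out.extend(old[op[1]:op[2]] if op[0] == "=" else op[1])
        return "<".join(out)

Each unchanged run of the page collapses to a three-number record, so when only a few dynamic fragments differ the compressed delta is a small fraction of the compressed page.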

Improvement: it would be better not to modify the client, but instead to run a 'reassembly proxy' on the client itself, so that all HTTP clients and user-agents benefit without hacks. The reassembly proxy could then inject a cookie to keep track of page versions. Compare http://ozlabs.org/~rusty/rproxy.html, however, and check for patent issues before any actual work is done.
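
A sketch of the bookkeeping such a reassembly proxy could do (the header name and digest scheme are invented here; rproxy itself negotiates rsync-style block checksums instead):

    import hashlib

    CACHE = {}  # url -> (digest, body), kept by the on-device reassembly proxy

    def on_request(url: str, headers: dict) -> dict:
        """Outgoing request: tell the remote proxy which version we hold."""
        if url in CACHE:
            # The cookie variant above would inject this digest as a cookie
            # so the browser itself carries the version identifier.
            headers["X-Page-Version"] = CACHE[url][0]
        return headers

    def on_response(url: str, body: bytes) -> bytes:
        """Incoming, fully reassembled response: keep it for the next delta."""
        CACHE[url] = (hashlib.sha1(body).hexdigest()[:16], body)
        return body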

Other optimisations:

  • Comparing pages to make sure a page has in fact changed before sending it on, since many servers report pages as changed when they have not (see the first sketch after this list).
  • Converting all JPEGs to progressive and initially downloading only the first 'scan' of the image, which is roughly 1/8th of the size; the user can fetch the remainder of the file for full resolution by clicking on it (second sketch below).
  • Ziproxy (http://ziproxy.sourceforge.net/) could probably be extended to provide this functionality.
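
For the change check, the proxy can do the fetch on its cheap side of the link and only push bytes over the air when the content actually differs (a sketch; the stored-hash table is illustrative):

    import hashlib

    last_served = {}  # url -> digest of the page last sent to the phone

    def really_changed(url: str, body: bytes) -> bool:
        """Return False when the 'new' page is byte-identical to the one
        the phone already has, so a tiny not-modified reply suffices."""
        digest = hashlib.sha256(body).hexdigest()
        if last_served.get(url) == digest:
            return False
        last_served[url] = digest
        return True

For the progressive JPEGs, the first scan can be cut out with a plain byte search: inside entropy-coded data 0xFF is byte-stuffed, so a real start-of-scan marker (FF DA) cannot occur there. Whether every renderer accepts the truncated file would need testing:

    def first_scan(jpeg: bytes) -> bytes:
        """Truncate a progressive JPEG after its first scan."""
        first = jpeg.find(b"\xff\xda")           # first SOS marker
        if first == -1:
            return jpeg                          # not a JPEG we understand
        second = jpeg.find(b"\xff\xda", first + 2)
        if second == -1:
            return jpeg                          # baseline/single scan: keep as-is
        return jpeg[:second] + b"\xff\xd9"       # drop later scans, append EOI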

Extended usage scenarios:

  • Different profiles, depending on how the Freerunner is connected (Wi-Fi vs. USB vs. GPRS).
  • Traffic measurement, especially for GPRS connections, for users with limited data allowances (for example 200 MB/month) or on metered networks such as commercial Wi-Fi access points at airports; a counting sketch follows this list.
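
A sketch of the accounting hook (profile names and the cap check are illustrative; the active profile would come from the device's connection manager):

    from collections import defaultdict

    class TrafficMeter:
        """Per-profile byte counter, updated as the proxy relays data."""

        def __init__(self, caps_mb):
            self.caps = {p: mb * 1024 * 1024 for p, mb in caps_mb.items()}
            self.used = defaultdict(int)

        def account(self, profile, nbytes):
            self.used[profile] += nbytes
            cap = self.caps.get(profile)
            if cap is not None and self.used[profile] > cap:
                print("warning: %s over its cap (%d KiB used)"
                      % (profile, self.used[profile] // 1024))

    meter = TrafficMeter({"gprs": 200})   # e.g. a 200 MB/month GPRS plan
    meter.account("gprs", 65536)          # call for every chunk relayed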