Offline Wikipedia reader

From Openmoko

Revision as of 02:11, 12 January 2009 by Myfanwy (Talk | contribs)


Instructions can be found here: http://users.softlab.ece.ntua.gr/~ttsiod/buildWikipediaOffline.html

Initial indexing should be carried out on a desktop or laptop, as the FreeRunner is not powerful enough.

Offline Wikipedia reader is one of the applications that run on the Openmoko phones. For a list of all applications, visit Applications.

This project provides software to store the entirety of Wikipedia (any language) locally on a Linux device.

All Wikipedia pages are downloaded from the Wikipedia page dump.

For more info visit the official website: http://users.softlab.ece.ntua.gr/~ttsiod/buildWikipediaOffline.html

The English Wikipedia (text only) is around 6 GB, so the entire content can be stored on one 8 GB card. The German Wikipedia is approximately a quarter of that size, so it fits on a correspondingly smaller card; the same scaling applies to other languages.


Development status

At present, a single tar.bz2 archive is downloaded from the site above, the pages extract is downloaded and copied to the correct location, and the indexing process is run (this takes approximately one hour on a laptop with a dual-core 1.1 GHz CPU and 1.5 GB of RAM).
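The manual workflow above amounts to "fetch the archive, extract it, then index". A minimal sketch of the extraction step is below; the archive name, directory layout, and file contents are assumptions for illustration (the script builds a tiny stand-in archive so it runs anywhere), so substitute the real dump from the homepage above.

```shell
#!/bin/sh
# Sketch of the extract-and-place step. The real archive comes from
# the project homepage; here a tiny stand-in archive is created so
# the script is self-contained. All names/paths are assumptions.
set -e

workdir=$(mktemp -d)
cd "$workdir"

# Stand-in for the downloaded Wikipedia dump (name is hypothetical).
mkdir -p pages
echo "<page>Example article</page>" > pages/Example.html
tar -cjf wikipediaOffline.tar.bz2 pages

# Extract the archive to the location the reader will use
# (the target directory is an assumption, not the project's path).
mkdir -p wikipedia-offline
tar -xjf wikipediaOffline.tar.bz2 -C wikipedia-offline

# The indexing step would then be run over the extracted pages,
# on a desktop/laptop rather than on the FreeRunner itself.
ls wikipedia-offline/pages
```

On the real dump the extraction is the same `tar -xjf` invocation; only the indexing pass afterwards is long-running.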

In the future the software will be released as an ipk package, with automatic download of the most recent Wikipedia dump and a diff utility to allow it to be updated.


[Image: Mokopedia]

Read the entirety of Wikipedia offline


Homepage: http://users.softlab.ece.ntua.gr/~ttsiod/buildWikipediaOffline.html
Package: no package yet
Tested on: -
