Offline Wikipedia reader

From Openmoko

Revision as of 02:32, 12 January 2009

Instructions can be found here: http://users.softlab.ece.ntua.gr/~ttsiod/buildWikipediaOffline.html

Initial indexing should be carried out on a desktop/laptop, as the FreeRunner is not powerful enough.

Offline Wikipedia is one of the applications that run on the Openmoko phones. For a list of all applications, visit Applications

This project provides software to store the entirety of Wikipedia (any language) locally on a Linux device.

All Wikipedia pages are downloaded from the Wikipedia page dump - http://download.wikimedia.org/enwiki/20081008/. The file needed is called 'pages-articles.xml.bz2', and the most recent is 4.1GB.
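The dump is distributed as a single bzip2-compressed XML file, so it is worth decompressing it incrementally rather than inflating the full 4.1GB archive at once. As a rough sketch (the URL below follows the pattern of the 20081008 snapshot mentioned above and is an assumption, not a documented endpoint):

```python
import bz2

# Hypothetical dump location, following the naming pattern of the
# 20081008 snapshot referenced above; the date component changes per dump.
DUMP_URL = ("http://download.wikimedia.org/enwiki/20081008/"
            "enwiki-20081008-pages-articles.xml.bz2")

def decompress_stream(chunks):
    """Incrementally decompress bzip2 data arriving in chunks,
    yielding decompressed bytes without holding the whole file."""
    decomp = bz2.BZ2Decompressor()
    for chunk in chunks:
        data = decomp.decompress(chunk)
        if data:
            yield data
```

In practice the chunks would come from something like `urllib.request.urlopen(DUMP_URL)`; the same generator works on any iterable of byte chunks, which keeps memory use flat regardless of dump size.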

For more info visit the official website: http://users.softlab.ece.ntua.gr/~ttsiod/buildWikipediaOffline.html

The English Wikipedia (text only) is around 6GB, so the entire content can be stored on one 8GB device. The German Wikipedia is approximately a quarter of that size, so it can be stored on a correspondingly smaller card; ditto for other languages.

The software works by running a lightweight webserver on the phone and using PHP to present the pages, which can then be viewed in any web browser.
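The project itself serves the pages through PHP; purely as an illustration of the same architecture (the article store, handler name, and port below are invented for the sketch), a minimal local HTTP server that hands article text to any browser on the phone might look like this:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import unquote

# Toy in-memory article store standing in for the on-card data.
ARTICLES = {"Openmoko": "Openmoko is a family of Linux-based phones."}

class ArticleHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Treat the URL path as an article title, e.g. /Openmoko
        title = unquote(self.path.lstrip("/"))
        found = title in ARTICLES
        body = ARTICLES.get(title, "Article not found.").encode("utf-8")
        self.send_response(200 if found else 404)
        self.send_header("Content-Type", "text/plain; charset=utf-8")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the console quiet on the phone

def serve(port=8080):
    HTTPServer(("127.0.0.1", port), ArticleHandler).serve_forever()
```

Pointing any browser at http://127.0.0.1:8080/Openmoko would then render the stored text; the real software does the same dance with PHP generating the HTML.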

Development status

At present, a single tar.bz2 is downloaded from the site above, the pages extract is downloaded and copied to the correct location, and the indexing process is run (this takes approximately one hour on a laptop with a dual-core 1.1GHz CPU and 1.5GB RAM).
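The indexing step essentially records where each article starts inside the dump, so a page can later be fetched by seeking rather than scanning the whole file. The real indexer's format is not described here, so everything in this sketch is illustrative:

```python
import re

def build_title_index(xml_text):
    """Map each <title> in a pages-articles style dump to the offset
    of its enclosing <page>, so lookups can seek instead of scan."""
    index = {}
    for m in re.finditer(r"<page>.*?<title>(.*?)</title>", xml_text, re.S):
        index[m.group(1)] = m.start()
    return index
```

A full implementation would work on byte offsets in the decompressed stream and persist the index to the card, which is why the hour-long build is done once on a desktop rather than on the phone.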

In the future the software will be released as an ipk, with automatic download of the most recent Wikipedia dump and a diff utility to allow it to be updated.


Mokopedia

Read entirety of Wikipedia offline


Homepage: http://users.softlab.ece.ntua.gr/~ttsiod/buildWikipediaOffline.html
Package: no package yet
Tested on: -
