(JoSch) I recently wrote a little Python app that lets me read *ALL* Wikipedia articles (German Wikipedia) on my Nokia E70 Symbian phone. It taught me that a 4 GB miniSD card is (with a whole bunch of tweaks) enough for the (German) Wikipedia, and that Symbian programming really is *no* fun. So why not do this for OpenMoko instead?
- what about compressing the data? probably wouldn't allow you to search, but hey.
- --Minime 16:21, 16 July 2007 (CEST)
- (JoSch) I mentioned compression as an option in the article. Each article would have to be compressed on its own, because seeking out a few kilobytes inside a single multi-gigabyte compressed file on every lookup would be far too slow. A title search could then be done by filename, but I think a separate title list file is the better idea.
- Compress the data in small portions - say 100K compressed - that can be decompressed in under a second. You probably want some sort of page-sorting compressor, so that the pages in one batch are similar and compress a bit better.
- It stands to reason that a batch of electronics articles will compress better than a random selection (and in practice it does)...
- Then store a search-keyword database on top of this data.
- Works well.
- I use 'wwwoffle' to search my browsed web pages.
- --Speedevil 17:10, 16 July 2007 (CEST)
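The batching idea above could be sketched roughly as follows (a hypothetical illustration; the chunk size, names, and the simple sort-by-title grouping are assumptions, not anything from the actual app). Articles are concatenated in sorted order so related pages tend to land in the same chunk, the stream is cut into chunks that should compress to very roughly 100K each, and an index records which chunk holds each title and where:

```python
# Hypothetical sketch of chunked compression with a title index.
# CHUNK_RAW is a guess: ~300 KB of wikitext often compresses to
# something on the order of 100 KB, but the ratio varies.
import zlib

CHUNK_RAW = 300 * 1024  # uncompressed bytes gathered per chunk

def build_chunks(articles):
    """articles: dict title -> text. Returns (chunks, index) where
    index[title] = (chunk_number, start, length) in the decompressed chunk."""
    chunks, index = [], {}
    buf, pending = b"", []
    for title, text in sorted(articles.items()):  # crude grouping by title
        data = text.encode("utf-8")
        pending.append((title, len(buf), len(data)))
        buf += data
        if len(buf) >= CHUNK_RAW:
            for t, start, length in pending:
                index[t] = (len(chunks), start, length)
            chunks.append(zlib.compress(buf))
            buf, pending = b"", []
    if pending:  # flush the last partial chunk
        for t, start, length in pending:
            index[t] = (len(chunks), start, length)
        chunks.append(zlib.compress(buf))
    return chunks, index

def lookup(title, chunks, index):
    """Decompress only the one chunk that contains the article."""
    chunk_no, start, length = index[title]
    data = zlib.decompress(chunks[chunk_no])
    return data[start:start + length].decode("utf-8")
```

A real keyword database would map search terms to chunk numbers the same way the index here maps titles, so a query still only decompresses the few chunks it actually needs.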
- Thanks for your ideas - I will consider them!
- --JoSch 17:57, 16 July 2007 (CEST)
Most read articles
Would it be possible, and a good idea, to filter for just the most read articles?