Talk:Mokopedia

From Openmoko


(JoSch) I recently programmed a little Python app to read *ALL* Wikipedia articles (the German Wikipedia) on my Nokia E70 Symbian phone. It taught me that a 4 GB miniSD card is (with a whole bunch of tweaks) enough for the German Wikipedia, and that Symbian programming really is *no* fun. So why not do this for OpenMoko instead?

What about compressing the data? It probably wouldn't allow you to search, but hey. --Minime 16:21, 16 July 2007 (CEST)

(JoSch) I mentioned compression as an option in the article. Every single article would have to be compressed on its own, because seeking out a few kilobytes in a multi-gigabyte compressed file on every lookup would be overkill. A title search could then be done by filename, but I think it is a better idea to have a title list file.
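For illustration, a minimal Python sketch of that per-article layout (the file names and the articles argument are placeholders, not anything from the actual app): every article is compressed on its own and appended to one data file, while a separate title list records each title's offset and compressed length, so reading a page only ever decompresses a few kilobytes.

<pre>
import zlib

def build_dump(articles, data_path="wikipedia.dat", index_path="titles.idx"):
    """articles: iterable of (title, wikitext) pairs."""
    with open(data_path, "wb") as data, open(index_path, "w", encoding="utf-8") as index:
        offset = 0
        for title, text in articles:
            blob = zlib.compress(text.encode("utf-8"), 9)     # each article compressed on its own
            data.write(blob)
            index.write(f"{title}\t{offset}\t{len(blob)}\n")  # title list: title, offset, length
            offset += len(blob)

def read_article(title, data_path="wikipedia.dat", index_path="titles.idx"):
    """Find the title in the list, then decompress just that one article."""
    with open(index_path, encoding="utf-8") as index:
        for line in index:
            name, offset, length = line.rstrip("\n").split("\t")
            if name == title:
                with open(data_path, "rb") as data:
                    data.seek(int(offset))
                    return zlib.decompress(data.read(int(length))).decode("utf-8")
    return None
</pre>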

Compress the data in small portions - say 100 KB compressed - that can be decompressed in under a second. You probably want some sort of page-sorting compressor, so that the pages in one batch are similar and compress a bit better; it sounds logical that electronics articles grouped together compress better than a random mix (whether that holds in reality is another question). Then store a search-keyword database on top of this data; that works well. I use 'wwwoffle' (http://www.gedanken.demon.co.uk/wwwoffle/) to search my browsed web pages. --Speedevil 17:10, 16 July 2007 (CEST)
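As a rough sketch of that batching idea (the ~100 KB figure comes from the comment above; sorting by title is only a crude stand-in for a real page-sorting step, and the keyword index is an assumed shape, not Speedevil's actual setup): similar pages are packed into the same batch, each batch is compressed separately, and a keyword index records which batches a search actually has to decompress.

<pre>
import bz2
import re
from collections import defaultdict

BATCH_RAW_SIZE = 100 * 1024   # roughly 100 KB of uncompressed text per batch

def build_batches(articles):
    """articles: list of (title, text) pairs. Returns (compressed_batches, keyword_index)."""
    batches, keyword_index = [], defaultdict(set)
    current, current_size = [], 0
    for title, text in sorted(articles):          # crude "page sorting" by title
        current.append((title, text))
        current_size += len(text)
        if current_size >= BATCH_RAW_SIZE:
            _flush(current, batches, keyword_index)
            current, current_size = [], 0
    if current:
        _flush(current, batches, keyword_index)
    return batches, keyword_index

def _flush(pages, batches, keyword_index):
    batch_no = len(batches)
    raw = "\n\x00\n".join(f"{title}\n{text}" for title, text in pages)
    batches.append(bz2.compress(raw.encode("utf-8")))        # one batch, compressed as a whole
    for title, _ in pages:
        for word in re.findall(r"\w+", title.lower()):       # index the title words only
            keyword_index[word].add(batch_no)

def search(word, batches, keyword_index):
    """Decompress only the batches whose index entry mentions the keyword."""
    for batch_no in sorted(keyword_index.get(word.lower(), ())):
        yield batch_no, bz2.decompress(batches[batch_no]).decode("utf-8")
</pre>

Decompressing one batch of that size should stay under the one-second target on phone-class hardware, though that is a guess rather than a measurement.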
