Kiwix

Wikix is a C-based program written by Jeffrey Vernon Merkey that reads any XML dump provided by the Wikimedia Foundation, extracts the names of all images referenced in the dump, and then generates a series of BASH or Bourne-style shell scripts that can be invoked to download those images from Wikimedia Commons and Wikipedia.
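
The overall flow is easy to sketch. The C fragment below is a minimal illustration of the idea, not Wikix itself: it scans wikitext on stdin for [[Image:...]] and [[File:...]] links and writes one curl command per image to stdout as a shell script. The download URL is a simplified placeholder for this example; the real Commons layout adds a hash-based directory prefix.

 /* Minimal sketch (not Wikix itself): read wikitext on stdin and write a
    shell script of curl commands on stdout.  The download URL below is a
    simplified placeholder for this example. */
 #include <stdio.h>
 #include <string.h>
 
 static void emit_curl(const char *name)
 {
     /* One download command per referenced image. */
     printf("curl -s -O \"https://upload.wikimedia.org/wikipedia/commons/%s\"\n", name);
 }
 
 static void scan_line(const char *line)
 {
     static const char *prefixes[] = { "[[Image:", "[[File:" };
     for (int i = 0; i < 2; i++) {
         const char *p = line;
         while ((p = strstr(p, prefixes[i])) != NULL) {
             p += strlen(prefixes[i]);
             char name[512];
             size_t n = 0;
             /* An image name ends at '|' (options/caption) or ']]'. */
             while (*p && *p != '|' && *p != ']' && n < sizeof(name) - 1)
                 name[n++] = *p++;
             name[n] = '\0';
             if (n > 0)
                 emit_curl(name);
         }
     }
 }
 
 int main(void)
 {
     char line[8192];
     printf("#!/bin/bash\n");
     while (fgets(line, sizeof(line), stdin) != NULL)
         scan_line(line);
     return 0;
 }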

The program relies on cURL, a command-line URL transfer tool, to download the referenced images. It also converts text-encoded UTF-8 escape sequences into actual UTF-8 strings for dumps that contain improperly formatted image names. The program can be configured to generate 16 scripts that run in parallel to download all of the images from Wikipedia, and it includes Jeff Bezanson's UTF-8 library.
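
The escape-to-UTF-8 step can be pictured as follows. This is a hedged sketch rather than Wikix's or Bezanson's actual code: it assumes the escapes appear as decimal numeric character references (for example &#244;) and rewrites them into real UTF-8 bytes.

 /* Sketch: rewrite decimal numeric character references ("&#NNN;") in a
    string into actual UTF-8 bytes.  The exact escape formats Wikix handles
    are not documented here; this covers the common decimal form only. */
 #include <stdio.h>
 #include <stdlib.h>
 #include <stdint.h>
 
 /* Encode one Unicode code point as UTF-8; returns the number of bytes written. */
 static int encode_utf8(uint32_t cp, char *out)
 {
     if (cp < 0x80) {
         out[0] = (char)cp;
         return 1;
     } else if (cp < 0x800) {
         out[0] = (char)(0xC0 | (cp >> 6));
         out[1] = (char)(0x80 | (cp & 0x3F));
         return 2;
     } else if (cp < 0x10000) {
         out[0] = (char)(0xE0 | (cp >> 12));
         out[1] = (char)(0x80 | ((cp >> 6) & 0x3F));
         out[2] = (char)(0x80 | (cp & 0x3F));
         return 3;
     }
     out[0] = (char)(0xF0 | (cp >> 18));
     out[1] = (char)(0x80 | ((cp >> 12) & 0x3F));
     out[2] = (char)(0x80 | ((cp >> 6) & 0x3F));
     out[3] = (char)(0x80 | (cp & 0x3F));
     return 4;
 }
 
 /* Copy src to dst, replacing "&#NNN;" references with UTF-8 bytes. */
 static void unescape_refs(const char *src, char *dst)
 {
     while (*src) {
         if (src[0] == '&' && src[1] == '#') {
             char *end;
             long cp = strtol(src + 2, &end, 10);
             if (end != src + 2 && *end == ';' && cp > 0 && cp <= 0x10FFFF) {
                 dst += encode_utf8((uint32_t)cp, dst);
                 src = end + 1;
                 continue;
             }
         }
         *dst++ = *src++;
     }
     *dst = '\0';
 }
 
 int main(void)
 {
     char out[256];
     unescape_refs("Boulogne-Billancourt_H&#244;tel_de_ville.jpg", out);
     printf("%s\n", out);   /* prints the name with a real 'o-circumflex' */
     return 0;
 }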

As of March 24, 2008, the complete set of Wikipedia images (about 420 GB at that time) could be downloaded over a cable modem in roughly 96 hours using this program.
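
As a rough sanity check on that figure, 420 GB in 96 hours works out to about 4.4 GB per hour, i.e. roughly 1.2 MB/s or on the order of 10 Mbit/s, which is consistent with cable-modem speeds of that era.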

Interesting Links

* [[Kiwix: Install Ubuntu 18.04]]
* [[Kiwix: Download & Instal]] RaspberryPi