I'm working on making Wikipedia, the (in)famous free encyclopaedia, available offline, for a project in a school in rural Zambia where Internet access will be slow, expensive and unreliable. The requirements are:

- A reasonable selection of content from the English Wikipedia, preferably with some images.
- Looks and feels like the Wikipedia website (e.g. …).
- Keyword search, like the Wikipedia website.

For an open source project that seems ideally suited to being used offline, and considering the amount of interest, there are surprisingly few options already developed. Tools that have built-in search engines usually require you to download a pages-and-articles dump file from Wikipedia (about a 3 GB download) and then generate a search index, which can take from half an hour to five days. The options also took me a long time to find, so I'm collating the information here in the hope that it will help others. Here are my impressions of the solutions that I've tried so far, gathered from various sources.

MediaWiki (the wiki software behind Wikipedia) can be downloaded and installed on a computer configured as an AMP server (Apache, MySQL, PHP). You can then import a Wikipedia database dump and use the wiki offline. This is quite a complex process, and importing takes a long time: about 4 hours for the articles themselves (on a 3 GHz P4), and apparently days to build the search index (I'm testing this at the moment). This method does not include any images, as the image dump is apparently 75 GB and no longer appears to be available, and it displays some odd template codes in the text (shown in red below) which may confuse users.

Wikipedia Selection for Schools is a static website, created by Wikimedia and SOS Children's Villages, with a hand-chosen and checked selection of articles from the main Wikipedia, plus images, that fits on a DVD or in 3 GB of disk space. It's available for free download using BitTorrent, which is rather slow. Although it looks like Wikipedia, it's a static website, so while it's easy to install, it has no search feature. It also has only 5,500 articles, compared to the 2 million in Wikipedia itself (about 0.25%). Another review is on the Speed of Creativity Blog.

Zipedia is a Firefox plugin which loads and indexes a Wikipedia dump file. It requires a different dump file, containing the latest metadata (8 GB), instead of the usual one (3 GB). You can then access Wikipedia offline in your browser by going to a URL such as wikipedia://wiki. It does not support images, and the search feature only searches article titles, not their contents. You may be able to share the indexed data between multiple users on a computer or a network, and you can pass it between users as a Zip file to save time and bandwidth.

WikiTaxi is a free Windows application which also loads and indexes Wikipedia dump files. It has its own user interface, which displays Wikipedia formatting properly (e.g. …). It looks very nice, but it's a shame that it doesn't run on Linux.

Moulin Wiki is a project to develop open source offline distributions of Wikipedia content, based on the Kiwix browser. They claim that their 150 MB Arabic version contains an impressive 70,000 articles, and that their 1.5 GB French version contains the entire French Wikipedia, more than 700,000 articles. Unfortunately they have not yet released an English version.
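For anyone wanting to try the MediaWiki route, the workflow boils down to three steps. This is only a minimal sketch, assuming MediaWiki is already installed and working under `~/mediawiki` on the AMP server; the dump URL and file names are illustrative, so check dumps.wikimedia.org for the current ones before running anything.

```shell
# Sketch of the MediaWiki import workflow, under the assumptions above.

# 1. Fetch the pages-and-articles dump (about 3 GB compressed).
wget https://dumps.wikimedia.org/enwiki/latest/enwiki-latest-pages-articles.xml.bz2

# 2. Stream-decompress and import the articles with MediaWiki's
#    maintenance script (this is the roughly 4-hour step).
bunzip2 -c enwiki-latest-pages-articles.xml.bz2 \
  | php ~/mediawiki/maintenance/importDump.php

# 3. Rebuild the full-text search index (this is the step that
#    can reportedly take days).
php ~/mediawiki/maintenance/rebuildtextindex.php
```

The last step is the expensive one because MediaWiki rebuilds its search table row by row for every imported article.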
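To make the title-only search offered by tools like Zipedia and WikiTaxi concrete, here is a hedged sketch of how a reader might stream a pages-and-articles dump and index just the titles. The XML below is a simplified stand-in for the real MediaWiki export format (which adds namespaces, revisions and much more metadata), and the function names are my own invention, not any tool's actual API.

```python
# Sketch: building a title-only search index by streaming a
# (simplified, hypothetical) pages-and-articles XML dump.
import xml.etree.ElementTree as ET
from io import StringIO

# Tiny stand-in for a real multi-gigabyte dump file.
SAMPLE_DUMP = """<mediawiki>
  <page><title>Zambia</title><revision><text>Zambia is a country...</text></revision></page>
  <page><title>Wikipedia</title><revision><text>Wikipedia is an encyclopaedia...</text></revision></page>
</mediawiki>"""

def build_title_index(xml_file):
    """Stream the dump and map lower-cased titles to page positions,
    without ever loading the whole file into memory."""
    index = {}
    page_number = 0
    for event, elem in ET.iterparse(xml_file, events=("end",)):
        if elem.tag == "page":
            index[elem.findtext("title").lower()] = page_number
            page_number += 1
            elem.clear()  # free the parsed subtree; essential for huge dumps
    return index

def title_search(index, query):
    """Title-only search: substring match on titles, nothing else."""
    q = query.lower()
    return [title for title in index if q in title]

idx = build_title_index(StringIO(SAMPLE_DUMP))
print(title_search(idx, "zamb"))  # → ['zambia']
```

Even this single streaming pass over a ~3 GB file is slow, which hints at why tools that also index article *contents* can take hours or days to build their indexes.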