User:Darxide/Clone

You can use the API to export all the text content, with something like action=query&generator=allpages&export. Files you'll have to scrape via some script, such as Pywikibot. You can see which extensions are installed via Special:Version if you want to set up an identical wiki; some of the configuration settings are available via the siteinfo API, but most you'll have to guess. There is no way to bulk-clone user accounts, but you can use the MediaWikiAuth extension to transfer them when users log in.
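As a sketch, the export query above can be assembled like this. The wiki hostname and the /w/api.php path are placeholders; adjust them for the target wiki:

```shell
# Hypothetical endpoint; most MediaWiki installs serve the API at /w/api.php.
API="https://wiki.example.org/w/api.php"

# Export the wikitext of up to 500 pages per request; page through the rest
# using the continuation value the API returns.
URL="${API}?action=query&generator=allpages&gaplimit=max&export&exportnowrap"
echo "$URL"

# Fetch it with e.g.:  curl -o pages.xml "$URL"
```

`exportnowrap` makes the response plain export XML rather than wrapping it inside an API result document.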

MediaWiki pages can be exported in a special XML format for import into another MediaWiki installation. On most standard installations you can use Special:Export, which at least lets you export all pages of a namespace. How practical this is depends on the size of the wiki; it works well for small ones.
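For reference, Special:Export can also be driven by URL. A sketch, assuming a hypothetical wiki at wiki.example.org: `pages` takes newline-separated titles (encoded as %0A) and `history=1` includes full edit histories:

```shell
WIKI="https://wiki.example.org/index.php"

# Two example page titles, newline-separated via the %0A escape.
PAGES="Main_Page%0AHelp:Contents"

URL="${WIKI}?title=Special:Export&pages=${PAGES}&history=1"
echo "$URL"

# curl -o export.xml "$URL"
```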

Backing up a wiki without server shell access requires Python 2 (Python 3 didn't work the last time I tried this).

From the command line, run the WikiTeam Python script dumpgenerator.py to get an XML dump, including edit histories, with all images and their descriptions.
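The invocation looks roughly like this (shown with echo so it can be inspected before running; the API URL is a placeholder, and the flag names follow the WikiTeam README):

```shell
# Placeholder API URL for the wiki you are backing up.
CMD="python dumpgenerator.py --api=https://wiki.example.org/w/api.php --xml --images"
echo "$CMD"

# --xml    : XML dump including full edit histories
# --images : download every file plus its description page
# Add --resume --path=<dumpdir> to continue an interrupted dump.
```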

Note that this XML dump is not a complete backup of the wiki database: it doesn't contain user accounts, extensions and their configuration aren't backed up, and file types other than images don't get saved. But it does save enough to recreate the wiki on another server.

Full instructions are at the WikiTeam tutorial.

For restoring the wiki from the XML dump, see the MediaWiki manual page Manual:Importing XML dumps.
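On the new server, where you do have shell access, the dump is usually loaded with the standard maintenance scripts. A sketch, with paths assumed for a typical install:

```shell
# Load the XML dump into the new wiki's database.
CMD="php maintenance/importDump.php --conf LocalSettings.php dump.xml"
echo "$CMD"

# Afterwards, rebuild the derived tables so recent changes show up:
echo "php maintenance/rebuildrecentchanges.php"
```

Small dumps can instead be uploaded through Special:Import in the browser.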