Β« Oatmeal

A stupid-simple bash script to create a local copy of a website.

#!/bin/bash

printf "\n\n    Please enter the URL of the website you wish to download.\n    Do not include a leading 'http://' or 'https://'.\n\n\n"
printf "    "
read -r URL
printf "\n\n    Depending on the size of the website you just\n    entered, this may take a few minutes.\n\n\n"
sleep 6
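# Mirror the site: follow links recursively, skip files already downloaded,
# grab page assets (images, CSS), save pages with an .html extension,
# rewrite links so the copy works offline, use Windows-safe filenames,
# and stay within the entered domain (never ascending to a parent directory).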
wget --recursive --no-clobber --page-requisites --html-extension --convert-links --restrict-file-names=windows --domains "${URL}" --no-parent "http://${URL}"
printf "\n\n  DONE!\n\n\n"
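
To use it, save the script to a file, make it executable, and run it; the filename below is just an example:

# save the script as, say, mirror-site.sh (any filename works), then:
chmod +x mirror-site.sh
./mirror-site.sh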

All the heavy lifting is done by wget, so it will need to be installed before running the script.
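
If wget isn't already on your machine, it's available from the usual package managers; for example (assuming Debian/Ubuntu or Homebrew):

# Debian/Ubuntu
sudo apt-get install wget
# macOS (Homebrew)
brew install wget
# confirm it is on your PATH
command -v wget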
