Sometimes you may need to download an entire website (website mirroring) for reference, or you may not have Internet access all the time and want an offline copy of a site you love. There are software tools capable of grabbing an entire website for you and keeping a local copy. They are highly configurable: you can specify the starting URL, which subdomains to include, wildcard patterns for the files to download, and so on. Here are some such web mirroring tools for your reference. However, these tools should not be used to steal content from websites.
HTTrack is a free, open-source, and easy-to-use website mirroring utility. It allows you to download a website from the Internet to a local directory, recursively building all directories and fetching the HTML, images, and other files from the server to your computer. HTTrack preserves the original site's relative link structure: simply open a page of the "mirrored" website in your browser, and you can browse the site from link to link as if you were viewing it online. HTTrack can also update an existing mirrored site and resume interrupted downloads. It is fully configurable and has an integrated help system.
curl is a command-line tool for transferring data with URL syntax, supporting protocols such as FTP, FTPS, HTTP, HTTPS, SCP, and SFTP. It does not crawl links on its own the way a dedicated mirroring tool does, but it is handy for scripted downloads of individual pages and files.
Wget is a free and open-source software package for retrieving files using HTTP, HTTPS, and FTP, the most widely used Internet protocols. It is a non-interactive command-line tool, so it can easily be called from scripts, cron jobs, terminals without X Window System support, and so on. Wget has many features that make retrieving large files or mirroring entire websites or FTP sites easy.