Can anyone recommend a free tool that can download an entire website (all the HTML, images, video, etc. that is linked from within the HTML)?
I need this to download my own website to my local hard disk.
I have added and changed a lot on my website, and I need to get a clean copy of it without all the outdated objects.
It must therefore be able to download from a localhost website.
This is urgent, as I want to refresh my existing websites.
Thanks, and please recommend tested and safe software.
HTTrack is a free (GPL, libre/free software) and easy-to-use offline browser utility.
It allows you to download a World Wide Web site from the Internet to a local directory, building recursively all directories, getting HTML, images, and other files from the server to your computer. HTTrack arranges the original site's relative link-structure. Simply open a page of the "mirrored" website in your browser, and you can browse the site from link to link, as if you were viewing it online. HTTrack can also update an existing mirrored site, and resume interrupted downloads. HTTrack is fully configurable, and has an integrated help system.
WinHTTrack is the Windows 9x/NT/2000/XP release of HTTrack, and WebHTTrack the Linux/Unix/BSD release.
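For anyone curious what a tool like this does under the hood, here is a minimal Python sketch of the idea: fetch one page, save the images it references, and rewrite the links so the local copy can be browsed offline. This is an illustration only, not how HTTrack itself is implemented; a real mirrorer also recurses through `<a>` links, uses a proper HTML parser, handles scripts and stylesheets, and rebuilds the server's directory tree.

```python
import os
import re
import urllib.parse
import urllib.request

def mirror_page(url, out_dir):
    """Download one page plus the images it references, and rewrite
    the <img src> attributes to point at the local copies."""
    os.makedirs(out_dir, exist_ok=True)
    html = urllib.request.urlopen(url).read().decode("utf-8", "replace")

    # Crude image-reference scan; a real mirrorer would use an HTML
    # parser and also follow <a>, <script>, <link>, etc.
    for src in set(re.findall(r'<img[^>]+src="([^"]+)"', html)):
        asset_url = urllib.parse.urljoin(url, src)
        local_name = os.path.basename(urllib.parse.urlparse(asset_url).path)
        with open(os.path.join(out_dir, local_name), "wb") as f:
            f.write(urllib.request.urlopen(asset_url).read())
        html = html.replace(src, local_name)  # point the link at the local copy

    with open(os.path.join(out_dir, "index.html"), "w", encoding="utf-8") as f:
        f.write(html)
```

Calling something like `mirror_page("http://localhost/site/index.html", "mirror")` would leave an offline-browsable copy of that one page in the `mirror/` folder.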
O.o Windows doesn't do that by itself?
My Ubuntu install will re-route all paths to a local folder in the format of <filename>_files and store all pics/vids/sounds....etc there. Even all .js files, everything really....
Actually it does, although I don't believe you can set where the files go. They should be in the Temporary Internet Files.
Well, that's rather worthless; you can't actually save a site that way. I have a large folder filled with all my purchase invoices: just save the page and voilà, no need to print anything.
Well I believe it is copyright infringement to steal images from other sites without permission.
You might find this of use: http://www.webreaper.net/. It will download an entire site and even connected sites (if you set it to). It can be set to modify the pages so that the site, once downloaded, can be browsed locally while offline. I've used Webreaper before and had no problems with it. From the previous posts, it sounds like HTTrack does a similar job.
I would have used FrontPage (I learned today that it is dying software). The trial version of FrontPage 2003 is still available. I would start an empty Web and then import your website into it. Quite easy to do. You would obviously need the FrontPage Server Extensions for that, however.
I have tried HTTrack, but I seem unable to download my http://localhost/site/index.html
The log shows that authentication is needed, even though I can surf my localhost site (using IIS) without any authentication.
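One possible cause: if the IIS site has Integrated Windows Authentication enabled, the browser logs you in silently with your Windows account, while a mirroring tool that only speaks anonymous or Basic auth gets a 401. As a quick check of whether the site at least answers to Basic credentials, here is a short Python sketch; the user name and password are placeholders, not values from this thread.

```python
import base64
import urllib.request

def fetch_with_basic_auth(url, user, password):
    """Request a page, sending HTTP Basic credentials up front,
    and return the response status code."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req = urllib.request.Request(url, headers={"Authorization": f"Basic {token}"})
    with urllib.request.urlopen(req) as resp:
        return resp.status

# Hypothetical account -- substitute whatever account IIS expects:
# fetch_with_basic_auth("http://localhost/site/index.html", "me", "secret")
```

If that also fails with a 401, another option is to temporarily enable anonymous access for the site in IIS while you mirror it.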
I actually finally copied the entire website, removed all my PNG files, work folders, and 3D animation files, tested the site manually for broken links, and restored the PNG files that are needed.
Finally I got my clean site out.
Thanks for all the info and help. Regards.
WebCopier Pro™ 5.0
WebCopier Pro™, based on the award-winning WebCopier™ product, is the most powerful version of our offline browser product line.
It goes beyond traditional web browsing by integrating powerful and innovative features to find, manage, analyze, and track information on the Internet.
Companies can use WebCopier to transfer intranet content to staff desktops and notebooks, create copies of online catalogs and brochures for sales personnel, back up corporate websites, and print downloaded files.
Individuals can use WebCopier to save complete copies of their favorite sites, magazines, or stock quotes. Students can download enormous amounts of information from the Internet for later study. Teachers can download whole sites so their students can view them later offline. Developers can use this tool to analyze a website's structure and find dead links.
The new version contains several enhancements, including:
# A redesigned Office 2007-like user interface,
# The ability to download FTP sites,
# Improved download speed.
Download it at:
Wah, so far I have only encountered referrals from members of Frihost; this one seems more like a marketing effort.
Anyway, thanks, but I am looking for freeware to do this task.
Thanks for the info albuferque.
Actually, I've not used Windows tools in a long time. I just remembered one day that I had used WebCopier a long time ago, so I could recommend it to someone someday. But I don't know if you're using a NIX-like OS or Windows exclusively.
If you need a GUI and you want to stay free, then you can visit:
Thank you, albuferque.
I must have forgotten that there are plenty of Linux/Unix users around too.
I am looking for Windows-based website downloaders and freeware.
I have taken a look at the pavuk open-source software, and it looks great, but it runs on Unix platforms.
Thank you. I have resolved the problem by copying my entire website from my web server to a folder, manually removing some of the source programs, and then uploading the cleaned-up folder to my external website.
So thanks for the info on pavuk; I may consider it when working on Unix.