

Download webpage





timetorock
How can I download all the pages of a website using PHP code?
sonam
I don't understand your question. What do you want to download: PHP scripts, JavaScript, CSS, images, etc.? If it's your own site, why use PHP when you can use FTP or cPanel (DirectAdmin), or perhaps a cron job?

If you want to download a site that you don't own, you can only get the HTML (and possibly the JavaScript, images, and CSS), nothing else.

If you just need a site's content, maybe you can use its RSS feed instead of downloading the HTML.

Sonam
jmraker
If you have to do it in PHP, you'd have to parse the HTML tags, build a list of links, download those pages, and keep downloading and parsing until you run out of things to download.

There are programs that do this (making a mirror of the site).

I use wget, a Linux or Cygwin command-line program that downloads web pages. It has a few switches for mirroring a site, controlling things like link depth, external links, etc.
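
For reference, a typical wget mirroring invocation looks something like this (example.com is a placeholder; the flags are the standard ones from wget's manual, and the depth value is just an illustration):

```shell
# Mirror a site: recurse up to 3 levels deep, rewrite links so the
# copy browses locally, fetch each page's images/CSS, and never
# climb above the starting directory.
wget --mirror --convert-links --page-requisites --no-parent \
     --level=3 http://example.com/
```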
Aredon
http://www.pagesucker.com/

This will only download the static HTML/CSS pages from the website that you have access to.
ogah
Aredon wrote:
http://www.pagesucker.com/

This will only download the static HTML/CSS pages from the website that you have access to.

Right. We can't download the PHP code of the pages, but we can do it with a PHP-based FTP client like net2ftp, as long as it is logged in to an FTP account.
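
To illustrate the point: fetching over FTP returns the raw .php source, because FTP serves files as-is rather than executing them. A minimal sketch using PHP's built-in FTP functions (host, credentials, and file names are placeholders, and the helper name is made up; assumes the ftp extension is enabled):

```php
<?php
// Hypothetical helper: download one file from an FTP account.
// Unlike an HTTP request, this retrieves the raw PHP source.
function download_via_ftp(string $host, string $user, string $pass,
                          string $remote, string $local): bool {
    $conn = ftp_connect($host);
    if ($conn === false) {
        return false;
    }
    $ok = ftp_login($conn, $user, $pass)
        && ftp_pasv($conn, true)                        // passive mode works through most firewalls
        && ftp_get($conn, $local, $remote, FTP_BINARY); // fetch the file as-is
    ftp_close($conn);
    return $ok;
}
```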
marron28
Try WinHTTrack (http://www.httrack.com/page/2/).
It is a website copier.
k_s_baskar
I don't think you can copy all the scripts on a website or webpage unless you have permission to access it, but you can copy the HTML content using utilities such as WebCopier.
Marcuzzo
timetorock wrote:
How can I download all the pages of a website using php code?



The OP wants to create a web page using PHP that will do the same as the website copiers available... I think.

Have you checked cURL? http://php.net/manual/en/book.curl.php
You may want to look at HTTrack for functionality, but the basic idea is:

Open a session with a website and get its content.
Look for links in that content and put them in an array.
Loop through that array:
create a new session for each link (make sure to close the session!),
fetch the content of the linked page,
get the links in that page and put them in a new array,
and so on and on. You may want to set a limit on recursion depth and exclude 'external' links.
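
The loop above can be sketched roughly like this in PHP, assuming the cURL and DOM extensions are available (the function names, depth limit, and output file naming are made up for illustration; this is not a production crawler and does no politeness/rate limiting):

```php
<?php
// Fetch one page over HTTP with cURL.
function fetch_page(string $url): string {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    $html = curl_exec($ch);
    curl_close($ch);                  // always close the session
    return $html === false ? '' : $html;
}

// Collect the links in a page, keeping relative links and links
// under $base, and skipping 'external' absolute links.
function extract_links(string $html, string $base): array {
    $links = [];
    $doc = new DOMDocument();
    @$doc->loadHTML($html);           // suppress warnings from malformed HTML
    foreach ($doc->getElementsByTagName('a') as $a) {
        $href = $a->getAttribute('href');
        if (strpos($href, $base) === 0 || strpos($href, 'http') !== 0) {
            $links[] = $href;
        }
    }
    return array_unique($links);
}

// Recursive crawl with a depth limit and a visited set.
function crawl(string $url, string $base, int $depth = 2, array &$seen = []): void {
    if ($depth <= 0 || isset($seen[$url])) {
        return;                       // stop at the depth limit, skip revisits
    }
    $seen[$url] = true;
    $html = fetch_page($url);
    file_put_contents('mirror_' . md5($url) . '.html', $html);
    foreach (extract_links($html, $base) as $link) {
        crawl($link, $base, $depth - 1, $seen);
    }
}
```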

Also check here: http://nashruddin.com/fetching-a-web-page-from-php-code.html

You also need to make sure that your hosting supports cURL.

And as the others said, the PHP files themselves can only be accessed if you have FTP access to the server.

hope this helps
k_s_baskar
wget will do it if you have shell access on your Linux host.
© 2005-2011 Frihost, forums powered by phpBB.