Hello everyone. I have a script that parses more than 3,000 pages, extracts some data, and saves it to my database. The problem is that the script can't finish because it times out.
I already tried set_time_limit(0), but it still doesn't work. Is there any way to keep the script running until it finishes without the request to the server being cut off, or, if possible, to split the long task into smaller pieces that can run in parallel?
Hope you can share anything :)
A cron job (as per BlueSky's suggestion) would probably be your best bet... but here's another possible workaround.
Note this isn't tested - but I think it would work.
Like snowboardalliance suggested, you could break it into smaller bits - but it would be a pain to have to click "Next" enough times to finish it. Instead, you can use a Header("Location:xyz") to reload the page and keep the script moving.
The basic structure would be like this...
The page loads. Build the list of pages to process.
Process x number of pages in a loop. At the end, save the last page processed in a $_SESSION variable.
Use a Header("Location:xyz") to send the browser back to the page... and when the page loads, it checks for the $_SESSION variable. If it's set, it starts off again where you left off.
This should make the script terminate and reload often, so it never times out. It will also be completely automated.
One drawback is that you can't get feedback as you go, since any output before the header() call would break the redirect. Instead, you could either write the feedback to a log file or save it in another $_SESSION variable and output the whole string at the end of the process. There's a rough sketch of the idea below.
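Something along these lines (again, untested; get_page_list(), process_page(), and the batch size of 20 are placeholders you'd swap for your own code):

```php
<?php
// Untested sketch of the redirect-and-resume idea above.
// get_page_list() and process_page() are placeholders for your own code.
session_start();

$batchSize = 20;                                   // pages handled per request
$pages     = get_page_list();                      // full list of pages to work through
$start     = isset($_SESSION['last_page']) ? $_SESSION['last_page'] : 0;
$end       = min($start + $batchSize, count($pages));

$log = '';
for ($i = $start; $i < $end; $i++) {
    process_page($pages[$i]);                      // parse one page and save it to the DB
    $log .= "Processed {$pages[$i]}\n";
}

// keep the feedback in the session, since we can't echo anything before header()
$_SESSION['log'] = (isset($_SESSION['log']) ? $_SESSION['log'] : '') . $log;

if ($end < count($pages)) {
    $_SESSION['last_page'] = $end;                 // remember where to resume
    header('Location: ' . $_SERVER['PHP_SELF']);   // reload this script for the next batch
    exit;
}

// all pages done: now it's safe to produce output
echo nl2br($_SESSION['log']);
unset($_SESSION['last_page'], $_SESSION['log']);
```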
I'm trying to fetch data from more than 10,000 files, so I made a script that fetches the detail pages to collect all the links. But the script doesn't finish; it stops in the middle of the process. When I tried fetching 20 sites, it took about 80 seconds, so imagine how long 10,000 would take. That's why I'm worried about how to fetch all 10,000+ without the script stopping before it's done. I read somewhere about using an Ajax HTTP request, but that still didn't work.
By the way, if it ends up running as a cron job, is there any limitation or time limit on execution?
If this is a one-time database write, you could just set up PHP on your own computer and get rid of the timeout there, then load that data onto your live site. That may not be an option for you, though; I'm not sure.
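For instance (untested, and import.php is just a placeholder name), a one-off import script run from the command line isn't bound by the web server's timeout at all, since max_execution_time defaults to 0 on the CLI:

```php
<?php
// One-off importer meant to be run locally from the command line:
//   php import.php
// (import.php is a placeholder name; get_page_list() and process_page() are your own code)
if (php_sapi_name() !== 'cli') {
    die("Run this from the command line, not through the web server.\n");
}

set_time_limit(0);              // a no-op on the CLI, where the limit is already 0, but makes the intent clear
ini_set('memory_limit', '-1');  // lift the memory cap for a big one-time job

$pages = get_page_list();       // build the full list of URLs to fetch
foreach ($pages as $n => $url) {
    process_page($url);         // parse the page and insert the data into the DB
    echo '[' . ($n + 1) . '/' . count($pages) . "] $url\n";
}
echo "Done.\n";
```

The same script would also run unchanged under a cron job, since cron invokes PHP through the CLI as well, so there's no web-style time limit there either (your host may still impose its own resource limits, though).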
Well, I just use plain PHP code that fetches the site's inner links to parse, but the problem is it breaks in the middle. I guess it only parses about 50 links and then stops.
Does the server on which you are running this script allow PHP's exec() command?
I don't know, but I guess yes, since it's paid hosting, so I assume we can use that functionality.
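If exec() really is allowed, one common trick (just a sketch, not tested on your host; import.php is a placeholder name) is to kick the long job off as a background process, so the browser request returns right away and the timeout never comes into play:

```php
<?php
// Launch the long-running import as a background process and return immediately.
// The heavy lifting lives in a separate CLI script, here called import.php (placeholder).
$cmd = '/usr/bin/php ' . escapeshellarg(dirname(__FILE__) . '/import.php');

// redirect output and append '&' so exec() doesn't wait for the job to finish
exec($cmd . ' > /dev/null 2>&1 &');

echo 'Import started in the background; check the database or a log file for progress.';
```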