

SPLIT long task?





AOP Web Development
Hello to all. I have a script that parses more than 3,000 pages, extracts some data, and saves it to my DB. The problem is that the script can't finish because of the timeout.

I already tried set_time_limit(0), but it still doesn't work. Is there any way to keep the script running without the request to the server being cut off, or, if possible, to split the long task into pieces so it can run to completion?

Hope you can share anything :)
snowboardalliance
AOP Web Development wrote:
Hello to all. I have a script that parses more than 3,000 pages, extracts some data, and saves it to my DB. The problem is that the script can't finish because of the timeout.

I already tried set_time_limit(0), but it still doesn't work. Is there any way to keep the script running without the request to the server being cut off, or, if possible, to split the long task into pieces so it can run to completion?

Hope you can share anything :)


Probably all you can do is break it up into smaller sections, with links to go to the next page. If you can figure out how much you can do before a timeout, and this is for some loop, you could simply use GET or POST to send the index # (or whatever) you left off at, so the page can reload and continue from that point. It could get messy if one of the subsequent pages never loads, but that's better than the whole thing timing out.
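
Roughly like this (untested, and pages.txt and fetch_and_store() are just placeholders for however you build your URL list and do the actual parsing/DB insert):

Code:

<?php
// Work through a big URL list in chunks, passing the offset via GET.
$pages = file('pages.txt', FILE_IGNORE_NEW_LINES); // the 3,000+ URLs
$chunk = 50;                                       // pages per request
$start = isset($_GET['start']) ? (int) $_GET['start'] : 0;
$end   = min($start + $chunk, count($pages));

for ($i = $start; $i < $end; $i++) {
    fetch_and_store($pages[$i]); // your parsing + DB insert goes here
}

if ($end < count($pages)) {
    // Link to the next chunk, so each request picks up where the last left off.
    echo '<a href="?start=' . $end . '">Continue from page ' . $end . '</a>';
} else {
    echo 'All done.';
}
?>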

Edit: or look into http://hudzilla.org/phpwiki/index.php?title=Connection-related_functions
Could be useful.
MrBlueSky
The execution timeout is only imposed when your script is executed through a webserver, so you want to call it like a normal script instead. Normally you would use the shell for this, but since you don't have shell access to the Frihost server, you have to use another way. PHP's exec() or system() functions would be another option, but those are disabled at Frihost. There are more ways to do it, though. You could:

1. use a one-line Perl CGI script which calls your PHP script, and then invoke the CGI script from your browser (there's a sketch of this after the cron example below), OR:

2. use a one-time cronjob. Set up a cronjob to execute at a certain time, with the command

Code:

/usr/local/bin/php -q /home/yourhome/some/path/3000pagescript.php


and remove it when it is done.
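
For option 1, the Perl wrapper could be as small as this (a rough sketch; the PHP path is the one from the cron example above, and both paths may differ on your server):

Code:

#!/usr/bin/perl
# Send a CGI header, then run the PHP CLI binary on the long-running
# script and pass its output through to the browser.
print "Content-type: text/plain\n\n";
print `/usr/local/bin/php -q /home/yourhome/some/path/3000pagescript.php`;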
SlowWalkere
The cronjob (as per MrBlueSky's suggestion) would probably be your best bet... but here's another possible workaround.

Note that this isn't tested, but I think it would work.

Like snowboardalliance suggested, you could break it into smaller bits, but it would be a pain to have to click "Next" enough times to finish. Instead, you can use a header("Location: ...") call to reload the page and keep the script moving.

The basic structure would be like this...

The page loads. Build the list of pages to process.

Process x number of pages in a loop. At the end, save the last page processed in a $_SESSION variable.

Use a header("Location: ...") to send the browser back to the same page. When the page loads, it checks for the $_SESSION variable; if it's set, the script starts again where it left off.

This should make the script terminate and reload often, so it never times out. It will also be completely automated.

One drawback is that you can't have feedback as you go, since any output would invalidate the header redirect. Instead, you could either write the feedback to a log file or save the information in another $_SESSION variable and output the string at the end of the whole process.
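
A rough sketch of the idea (untested; process_page() and pages.txt are placeholders for your own parsing code and URL list):

Code:

<?php
// session_start() must come before any output, or the header() redirect fails.
session_start();

$pages   = file('pages.txt', FILE_IGNORE_NEW_LINES); // list of pages to process
$perLoad = 50;                                       // pages per request
$start   = isset($_SESSION['last_page']) ? $_SESSION['last_page'] : 0;
$end     = min($start + $perLoad, count($pages));

for ($i = $start; $i < $end; $i++) {
    process_page($pages[$i]);                   // your parsing + DB insert
    $_SESSION['log'][] = 'done: ' . $pages[$i]; // save feedback for the end
}
$_SESSION['last_page'] = $end;

if ($end < count($pages)) {
    header('Location: ' . $_SERVER['PHP_SELF']); // reload and continue
    exit;
}

// Finished: now it's safe to produce output.
echo implode("<br>\n", $_SESSION['log']);
?>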

- Walkere
AOP Web Development
I'm trying to fetch data from more than 10,000 files, so I made a script that fetches the detail info of all of them to get the links. But the script doesn't continue; it stops in the middle of the process. When I tried fetching 20 sites, it took about 80 seconds, so at that rate 10,000+ sites would take more than 11 hours. That is why I'm worried about how to fetch all 10,000+ without the script stopping before it finishes executing. I read something about using an AJAX HTTP request, but that still didn't work.

By the way, if it ever runs as a cron job, is there any limitation or time limit on execution?
MrBlueSky
AOP Web Development wrote:


By the way, if it ever runs as a cron job, is there any limitation or time limit on execution?


No, but if your script runs for a long time (minutes or longer) it wouldn't be nice to the other processes on the server (and cronjobs that use a lot of resources are not allowed on Frihost). So in that case you should have your script wait a few seconds, using sleep(), after it has done some number (1000?) of files before it continues.
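
That's just a sleep() call inside the loop, something like this ($files and handle_file() are placeholders for your own list and per-file work):

Code:

<?php
// Pause a few seconds after every 1000 files so the cron job stays polite.
foreach ($files as $i => $file) {
    handle_file($file);
    if (($i + 1) % 1000 == 0) {
        sleep(5);
    }
}
?>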
AOP Web Development
In regard to that: I created a script that fetches data and saves it to the DB, and then another script that reads the data back from the DB with a while loop and mysql_fetch_array(), so that I can fetch all the data links. My script performs well, but it just never finishes the loop. I have 3,000+ records to fetch from a site link on another server; it's basically data mining. So how could I do it the way you're suggesting, using header() to reload the page and start the script running again? And what about exec() and system()? I don't know their functionality or how to use them yet. As for sleep(), would it work to sleep for a couple of seconds after every 50 records fetched and then continue again? Hope you can really share :)

Anyway, I'm not doing this on my Frihost account; I'm running this script on a different server. But I hope you can suggest something or help me with it... thanks a lot!
snowboardalliance
If this is a one-time database write, you could just set up PHP on your own computer and run the script there, without the timeout. Then just use the resulting data on your live site. Though this may not be a good option for you, I'm not sure.
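
The command-line version of PHP has no execution time limit by default, so (assuming you saved the script as fetch.php) running it from a command prompt would let it finish:

Code:

php fetch.php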
AOP Web Development
Well, I just use plain PHP code that fetches the inner links of the site to be parsed, but the problem is that it breaks in the middle. I guess it only parses about 50 links and then stops.
MrBlueSky
Does the server on which you are running this script allow PHP's exec() command?
AOP Web Development
I don't know, but I guess so, since it is paid hosting. So I guess we can use those functions.