
If I run a standard curl_multi_exec() loop (example below), all of the cURL handles fire at once. I would like to put a 100 ms delay between each request; is there a way to do that? (I found nothing on Google or in a Stack Overflow search.)

I've tried calling usleep() before curl_multi_exec(), which slows down the script but does not postpone each individual request.

// array of curl handles & results
$curlies = array();
$result = array();
$mh = curl_multi_init();

// setup curl requests
for ($id = 0; $id <= 10; $id += 1) {
    $curlies[$id] = curl_init();
    curl_setopt($curlies[$id], CURLOPT_URL,            "http://google.com");
    curl_setopt($curlies[$id], CURLOPT_HEADER,         0);
    curl_setopt($curlies[$id], CURLOPT_RETURNTRANSFER, 1);
    curl_multi_add_handle($mh, $curlies[$id]);
}

// execute the handles
$running = null;
do {
    curl_multi_exec($mh, $running);
} while($running > 0);

// get content and remove handles
foreach($curlies as $id => $c) {
    $result[$id] = curl_multi_getcontent($c);
    curl_multi_remove_handle($mh, $c);
}

// all done
curl_multi_close($mh);

I've been working on this all day; any help would be greatly appreciated! Thank you.

EDIT: Is there any other, non-cURL method? That would also answer my question.

  • No. PHP's cURL support does not offer that kind of functionality. Commented Aug 8, 2011 at 19:27
  • Any other non-cURL method? That would also answer my question. Thanks. Commented Aug 8, 2011 at 19:31
  • PHP's not multithreaded AT ALL. You'd have to run multiple copies of the script in parallel, and each copy would be completely independent of the others. You'd also need some way of telling each script which URL(s) it should fetch. Commented Aug 8, 2011 at 19:33
  • I understand that cURL multi is a single thread waiting for all connections to resolve. A similar single-threaded solution would solve my problem. I don't want to DDoS any server with 1000 requests at once, but I also don't want to run the requests one at a time (too slow). Commented Aug 8, 2011 at 19:38
  • Are all the URLs on a single site, or are you hitting multiple sites? If it's multiple, hit one site in each multi handle and put a 100 ms pause on the whole script. That would look like one hit per 100 ms on each site, even though you're hitting 5 or 10 sites at the same time. Commented Aug 8, 2011 at 19:47

4 Answers


Yes, this is possible. If you use the ParallelCurl library, you can add your 100 ms delay with usleep() much more easily, because each request is added to the download queue separately:

foreach ($urls as $url) {
    $pcurl->startRequest($url);
    usleep(100000); // 100 ms between queued requests
}
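
For context, here is a fuller sketch of how that might look, assuming ParallelCurl's documented API (a constructor that takes a cap on simultaneous requests plus cURL options, and a startRequest() that takes a completion callback); the URL list, callback name, and connection cap are illustrative:

<?php
// A minimal sketch, assuming ParallelCurl's documented API; the URL list,
// callback name, and 10-connection cap are illustrative.
require_once 'parallelcurl.php';

// Called by ParallelCurl as each transfer completes.
function on_request_done($content, $url, $ch, $user_data) {
    echo "Finished $url (" . strlen($content) . " bytes)\n";
}

$pcurl = new ParallelCurl(10, array(CURLOPT_RETURNTRANSFER => true));

$urls = array_fill(0, 10, 'http://google.com');
foreach ($urls as $url) {
    $pcurl->startRequest($url, 'on_request_done');
    usleep(100000); // stagger each new request by 100 ms
}

$pcurl->finishAllRequests(); // block until every transfer is done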

3 Comments

Sadly, it doesn't work. The script hangs forever, and I'm not the only one reporting the problem: github.com/petewarden/ParallelCurl/issues
Weird. Perhaps it has been updated since I last grabbed it. I'm using this method in two different applications right now and don't have any trouble. I'll take a look at my version tonight and post an update. Maybe it has to do with a different modification I made to my copy...
I can confirm: no trouble at all with this method. +1. Forking cannot be used on the web, but ParallelCurl works on both the web and the command line.

I don't think you can. If you run this from the CLI, you could instead fork your script into 10 processes and fire regular curl requests from each. That would give you fine-grained control over the timing; a sketch of that approach is below.
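
A minimal sketch of that forking idea, assuming the CLI SAPI with the pcntl extension; the URL list and the per-child 100 ms offset are illustrative:

<?php
// A minimal sketch of the forking approach; requires the pcntl extension
// (CLI only). $urls and the 100 ms offset are illustrative.
$urls = array_fill(0, 10, 'http://google.com');

foreach ($urls as $i => $url) {
    $pid = pcntl_fork();
    if ($pid === -1) {
        die("fork failed\n");
    } elseif ($pid === 0) {
        // Child: stagger the start, fire one plain curl request, then exit.
        usleep($i * 100000); // 100 ms offset per child
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        $result = curl_exec($ch);
        curl_close($ch);
        exit(0);
    }
    // Parent: loop around and fork the next child.
}

// Parent: wait for all children to finish.
while (pcntl_waitpid(-1, $status) > 0);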



PHP is not the right tool for this, and forking the script won't help either. It works at first, but once you have a few more websites to fetch this way, you'll find your server running very, very hot. In terms of cost and script stability, you should consider a different approach.

You can do this easily with Python, and for non-blocking, real-time calls to API endpoints you should use something like Node.js (possibly with Socket.IO).

If you have neither the time nor the inclination for that, you can use something like this:

http://framework.zend.com/manual/en/zendx.console.process.unix.overview.html

It all depends on what you're trying to achieve.

1 Comment

I agree with the last sentence :) I'm always pinging the same server (an API). The server CAN handle the requests, but I just want to space them out without waiting for the previous request to finish.

You can try this: store a timestamp in the DB, add one handle, and call curl_multi_exec(). Then use CURLOPT_PROGRESSFUNCTION to check the timing and add more handles when you need them. Daniel Stenberg (the author of cURL and libcurl) says it's possible to add more handles after curl_multi_exec() has been executed. A sketch of that drip-feed idea follows.
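
Here is a minimal sketch of drip-feeding handles into a running multi handle, using a plain clock check inside the multi loop rather than CURLOPT_PROGRESSFUNCTION; the URL list and 100 ms spacing are illustrative:

<?php
// A minimal sketch: add one new handle to the running multi handle every
// 100 ms, using microtime() instead of CURLOPT_PROGRESSFUNCTION.
$urls = array_fill(0, 10, 'http://google.com');

$mh      = curl_multi_init();
$pending = $urls;
$results = array();
$last    = 0.0;
$running = 0;

do {
    // Add the next handle once 100 ms have passed since the previous one.
    if ($pending && (microtime(true) - $last) >= 0.1) {
        $ch = curl_init(array_shift($pending));
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_multi_add_handle($mh, $ch);
        $last = microtime(true);
    }

    curl_multi_exec($mh, $running);
    curl_multi_select($mh, 0.05); // wait briefly for socket activity

    // Collect finished transfers as they complete.
    while ($info = curl_multi_info_read($mh)) {
        $done      = $info['handle'];
        $results[] = curl_multi_getcontent($done);
        curl_multi_remove_handle($mh, $done);
        curl_close($done);
    }
} while ($running > 0 || $pending);

curl_multi_close($mh);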

