Matrix Version:
Hi,
When running a links report on external links, it’s bound to create a lot of false positives due to anti-flooding measures built into external firewalls.
For instance, if our site has 200 total links to cnet.com and the report-generating tool makes 200 simultaneous requests to cnet.com/about, we can trip the anti-flood measure and get blocked, no response, an access-denied error, or a flooding message (depending on their firewall setup).
Is there a way to tell the report-generating tool (are you guys using cURL?) to insert a delay between subsequent requests to the same remote host? That would avoid tripping the anti-flood measure and produce a report that is actually accurate, without all the false positives. A rough sketch of the kind of throttling I mean is below.
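Just to illustrate, here's a minimal Python sketch of per-host throttling (this is only an example of the idea, not a claim about how your tool or cURL is actually wired up; the delay value and URLs are made up):

    # Illustrative only: wait a couple of seconds between requests to the
    # same remote host so we don't look like a flood to their firewall.
    import time
    from urllib.parse import urlparse
    from urllib.request import Request, urlopen

    PER_HOST_DELAY = 2.0   # seconds between hits on the same host (example value)
    last_hit = {}          # hostname -> time of the most recent request

    def check_link(url):
        host = urlparse(url).hostname
        # If we contacted this host recently, sleep off the remaining delay.
        elapsed = time.time() - last_hit.get(host, 0)
        if elapsed < PER_HOST_DELAY:
            time.sleep(PER_HOST_DELAY - elapsed)
        last_hit[host] = time.time()
        try:
            with urlopen(Request(url, method="HEAD"), timeout=10) as resp:
                return resp.status
        except Exception as exc:
            return exc  # genuinely broken link, or blocked by the remote end

    for link in ["https://www.cnet.com/about/", "https://www.cnet.com/news/"]:
        print(link, check_link(link))

Something equivalent in the report tool (a configurable per-host delay) is all I'm after.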
Thanks