
PHP Programming Glossary: robot

How to detect fake users (crawlers) and cURL

http://stackoverflow.com/questions/12257584/how-to-detect-fake-users-crawlers-and-curl

to avoid automatic crawling. Everything a human can do, a robot can do too. There are only solutions that make the job harder..
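
A minimal PHP sketch of that "make the job harder" idea. The function name and the specific checks are illustrative assumptions, not taken from the linked answer, and a determined robot can forge every header involved:

    <?php
    // Heuristic only: all of these headers can be forged, so the checks
    // merely raise the cost of casual crawling.
    function looks_like_bot(): bool
    {
        $ua = $_SERVER['HTTP_USER_AGENT'] ?? '';

        // Self-identified clients: curl, wget, and common bot strings.
        if ($ua === '' || preg_match('/curl|wget|bot|spider|crawler/i', $ua)) {
            return true;
        }

        // Real browsers normally send Accept-Language; bare cURL does not.
        if (empty($_SERVER['HTTP_ACCEPT_LANGUAGE'])) {
            return true;
        }

        return false;
    }

    if (looks_like_bot()) {
        http_response_code(403);
        exit('Automated access is not allowed.');
    }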

How to add scraped website data in database?

http://stackoverflow.com/questions/18997932/how-to-add-scraped-website-data-in-database

good for this I've found. When crawling, remember: be a good robot and define a unique USER_AGENT for yourself so site operators.. instead of trying to hide your identity, crawl in the open. Respect robots.txt; if a site wishes to block scrapers, they should be allowed..
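
A short cURL sketch of the "crawl in the open" advice. The agent string and URLs are placeholders; the point is a unique, descriptive User-Agent with a contact address:

    <?php
    // Identify yourself so site operators know who is crawling them and
    // how to reach you. Agent string and target URL are placeholders.
    $ch = curl_init('http://example.com/page-to-scrape');
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_USERAGENT      => 'MyScraper/1.0 (+http://example.com/bot-info)',
        CURLOPT_FOLLOWLOCATION => true,
    ]);
    $html = curl_exec($ch);
    curl_close($ch);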

Detecting Ajax in PHP and making sure request was from my own website

http://stackoverflow.com/questions/1953954/detecting-ajax-in-php-and-making-sure-request-was-from-my-own-website

request came from my own domain and not an external domain or robot.. www.example.com?ajax=true could allow anyone to make an Ajax..
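
The usual PHP check, for reference: test the X-Requested-With header that jQuery and most JavaScript libraries send. It filters casual requests only, since the header is trivial to forge from cURL:

    <?php
    // Sent automatically by jQuery and similar libraries; absent from plain
    // requests but easily added by any client, so treat it as a hint, not proof.
    $isAjax = isset($_SERVER['HTTP_X_REQUESTED_WITH'])
        && strtolower($_SERVER['HTTP_X_REQUESTED_WITH']) === 'xmlhttprequest';

    if (!$isAjax) {
        http_response_code(400);
        exit('Expected an Ajax request.');
    }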

Best solution to anti-spam in PHP?

http://stackoverflow.com/questions/2570367/best-solution-to-anti-spam-in-php

solution to anti-spam in PHP. How to distinguish robots from normal users? How does SO do this job? Currently I'm met with a robot which posts once every hour..
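
One cheap tactic against a robot that posts on a schedule is a honeypot field plus a minimum-fill-time check. The field and parameter names below are illustrative, not from the linked question:

    <?php
    // Honeypot: a field hidden from humans via CSS; simple bots fill it in.
    if (!empty($_POST['website'])) {
        http_response_code(400);
        exit('Spam detected.');
    }

    // Timing check: reject submissions faster than a human could type,
    // using a timestamp embedded when the form was rendered.
    $renderedAt = (int) ($_POST['form_ts'] ?? 0);
    if (time() - $renderedAt < 3) {
        http_response_code(400);
        exit('Form submitted too quickly.');
    }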

Cached, PHP generated Thumbnails load slow: How to find problem/solution from Waterfall graphs?

http://stackoverflow.com/questions/4810806/cached-php-generated-thumbnails-load-slow-how-to-find-problem-solution-from-wa

second image shows what's initiating what to load. The angry robot is my pet, ZAM. He is harmless and often happier. Load Waterfall..

How to identify web-crawler?

http://stackoverflow.com/questions/8404775/how-to-identify-web-crawler

cheap if I have to pay for ALL hits, including webcrawlers, robots, etc... There are two general ways to detect robots, and I would call them Polite/Passive and Aggressive. Basically.. how often you are crawled. Politeness is ensured through the robots.txt file, in which you specify which bots, if any, should be allowed..
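
An example robots.txt for the polite/passive side, with made-up bot names. Compliant crawlers honor it; aggressive detection is still needed for bots that ignore it:

    # Served from the site root as /robots.txt. Bot names are made up.
    User-agent: FriendlyBot
    Crawl-delay: 10          # non-standard directive, but widely honored

    User-agent: BadBot
    Disallow: /              # block this bot everywhere

    User-agent: *
    Disallow: /private/      # everyone else: stay out of /private/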