How to prevent self-clicking
5 subscribers
Hello,
I have a problem with self-clicking. I have a link on my website, and when someone clicks it, the click is recorded. Now I'm not quite sure how to make sure that multiple clicks don't get counted. I've already thought about protecting it by IP address, but I'm not sure that's enough, because I have a small program that checks the page and clicks, and it gets past that protection.
6 replies
With cookies or sessions. And if that "little program" isn't a browser, I think you could also check HTTP_USER_AGENT.
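A minimal sketch of the session idea in PHP; the session key name and the comment marking where the click gets recorded are assumptions, not code from this thread:

session_start();

// Count the click only once per session.
if (!isset($_SESSION['click_counted'])) {
    // ... record the click here (the existing counting code) ...
    $_SESSION['click_counted'] = true;  // remember that this visitor has already been counted
}

A plain cookie works the same way but is even easier for a script to drop or clear, so it is worth combining with a server-side check as well.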
To check whether the visitor is a crawler/bot, I use this function:
function getIsCrawler()
{
    // Read the User-Agent header; fall back to an empty string if the client did not send one.
    $userAgent = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
    // Known bot/crawler signatures, separated by | so they can be used as regex alternatives.
    $crawlers = 'bot|spider|crawler|curl|Bloglines subscriber|Dumbot|Sosoimagespider|QihooBot|FAST-WebCrawler|Superdownloads Spiderman|LinkWalker|msnbot|ASPSeek|WebAlta Crawler|Lycos|FeedFetcher-Google|Yahoo|YoudaoBot|AdsBot-Google|Googlebot|Scooter|Gigabot|Charlotte|eStyle|AcioRobot|GeonaBot|msnbot-media|Baidu|CocoCrawler|Google|Charlotte t|Yahoo! Slurp China|Sogou web spider|YodaoBot|MSRBOT|AbachoBOT|Sogou head spider|AltaVista|IDBot|Sosospider|Yahoo! Slurp|Java VM|DotBot|LiteFinder|Yeti|Rambler|Scrubby|Baiduspider|accoona';
    // Case-insensitive match: true if any signature appears in the User-Agent string.
    $isCrawler = (preg_match("/$crawlers/i", $userAgent) > 0);
    return $isCrawler;
}
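One way to use the function when recording a click; the surrounding counting code is only indicated by a comment, since it is not part of this thread:

if (!getIsCrawler()) {
    // Not a recognised bot: count the click as usual.
    // ... existing click-recording code ...
}

Note that the User-Agent header is supplied by the client, so this only filters out bots that identify themselves honestly.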
Thanks for the code, I tried it but it doesn't work for me. I did it like this:
function getIsCrawler(){
    $userAgent = $_SERVER['HTTP_USER_AGENT'];
    $crawlers = 'bot|spider|crawler|curl|Bloglines subscriber|Dumbot|Sosoimagespider|QihooBot|FAST-WebCrawler|Superdownloads Spiderman|LinkWalker|msnbot|ASPSeek|WebAlta Crawler|Lycos|FeedFetcher-Google|Yahoo|YoudaoBot|AdsBot-Google|Googlebot|Scooter|Gigabot|Charlotte|eStyle|AcioRobot|GeonaBot|msnbot-media|Baidu|CocoCrawler|Google|Charlotte t|Yahoo! Slurp China|Sogou web spider|YodaoBot|MSRBOT|AbachoBOT|Sogou head spider|AltaVista|IDBot|Sosospider|Yahoo! Slurp|Java VM|DotBot|LiteFinder|Yeti|Rambler|Scrubby|Baiduspider|accoona';
    $isCrawler = (preg_match("/$crawlers/i", $userAgent) > 0);
    return $isCrawler;
}
echo getIsCrawler($prenesi);
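Note that getIsCrawler() is declared without parameters, so the $prenesi argument above is simply ignored, and echoing the boolean result prints 1 for a bot and nothing for a regular browser. A small sketch of how the crawler check could be combined with the session check from the first reply; the session key and the click-recording comment are assumptions, not code from this thread:

session_start();

// Count the click only if the visitor is not a known bot and has not been counted in this session yet.
if (!getIsCrawler() && !isset($_SESSION['click_counted'])) {
    // ... record the click here ...
    $_SESSION['click_counted'] = true;
}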