Great solution! This modification introduces an additional check to prevent unnecessary IP abuse checks.
If the current page is a 'page_not_found', the IP abuse check will be skipped.
If the visitor is a known web spider or bot, the IP abuse check will also be skipped.
This reduces unnecessary API calls to AbuseIPDB when traffic comes from known web spiders or bots, which are usually harmless.
PHP Code:
// Do not execute the check for the 'page_not_found' page or for known spiders
if ($current_page_base == 'page_not_found' || (isset($spider_flag) && $spider_flag === true)) {
return;
}
I've changed the modification to the above to exclude spiders and 404s. I just have to wait until tomorrow to confirm that block caching is fully working, but in test mode all seems well. This mod is evolving quickly, and it will do wonders to keep the worst offenders away and massively reduce my manual blocking workload.
Thank you all.
Also, as an aside in case anyone else wants to include an error message on their block page, I'm still serving a 403 but now echoing a browser message as follows.
PHP Code:
header('HTTP/1.0 403 Forbidden');
echo 'You are forbidden! Your IP Address is marked as malicious in the abuseipdb.com database';
zen_exit();
Any chance we can get a github link to avoid having to add, delete, rinse, repeat?
Here is the GitHub link: https://github.com/CcMarc/AbuseIPDB.git
I updated to the latest commit from GitHub, made three hours ago, as the API results caching wasn't working for me. My API calls reset at midnight, and by 7am GMT I had already hit 3.5k API calls. Checking the calls log, there were multiple entries for the same IP. The latest commit, which saves sessions to the database, appears to have fixed this. Good work :)
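For anyone curious how saving results to the database avoids those duplicate calls, the idea is to look up the IP's cached score before hitting the API. This is a minimal sketch only; the table name `abuseipdb_cache` and its columns are hypothetical, not the module's actual schema:

```php
<?php
// Sketch: cache AbuseIPDB scores per IP so each address triggers at most
// one API call per TTL window. Table/column names are hypothetical.
function get_abuse_score(PDO $db, string $ip, callable $api_call, int $ttl = 86400): int
{
    $stmt = $db->prepare(
        'SELECT score FROM abuseipdb_cache WHERE ip = :ip AND checked_at > :cutoff'
    );
    $stmt->execute([':ip' => $ip, ':cutoff' => time() - $ttl]);
    $row = $stmt->fetch(PDO::FETCH_ASSOC);

    if ($row !== false) {
        return (int)$row['score'];   // cache hit: no API call made
    }

    $score = $api_call($ip);         // cache miss: exactly one API call
    $ins = $db->prepare(
        'REPLACE INTO abuseipdb_cache (ip, score, checked_at) VALUES (:ip, :score, :ts)'
    );
    $ins->execute([':ip' => $ip, ':score' => $score, ':ts' => time()]);
    return $score;
}
```

With a one-day TTL, repeat visits from the same IP between midnight resets cost nothing against the API quota.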
Last edited by johnjlarge; 26 May 2023 at 08:03 AM.
Yes, I'm on version 2.0 and fully up to date as of just a moment ago with the changes on github.
A couple of ideas: in abuseipdb_api_call_2023_05.log, the IPs of spiders that appear in spiders.txt are still showing up, for example:
2023-05-26 16:51:52 IP address 54.236.1.11 API call. Score: 63
(This is the Pinterest bot, which is still allowed to browse the site, but perhaps we could avoid adding spider sessions to the API log.)
Also, maybe beyond the scope of this plugin, but perhaps we could stop blocked IPs from showing in Who's Online?
Blocking is working really well; I'm just thinking about how the plugin could evolve. Perhaps it could even exclude anything in spiders.txt from making an API call at all, so spiders never show in the logs and don't use up any API hits?
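The exclusion suggested above could be done by matching the visitor's user agent against spiders.txt before the API call is ever made. A rough sketch, assuming spiders.txt holds one user-agent substring per line (the file path and matching rules here are illustrative, not the module's actual implementation):

```php
<?php
// Sketch: return true if the user agent matches any entry in spiders.txt,
// so the caller can skip both the API call and the log entry.
function is_known_spider(string $userAgent, string $spidersFile): bool
{
    if (!is_readable($spidersFile)) {
        return false;
    }
    $patterns = file($spidersFile, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    foreach ($patterns as $pattern) {
        $pattern = trim($pattern);
        if ($pattern === '') {
            continue;
        }
        // Case-insensitive substring match against the visitor's user agent
        if (stripos($userAgent, $pattern) !== false) {
            return true;   // matched: no API call, nothing logged
        }
    }
    return false;
}
```

A guard like `if (is_known_spider($_SERVER['HTTP_USER_AGENT'] ?? '', $spidersFile)) { return; }` at the top of the check would then keep spiders out of both the API quota and the call log.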
I run a relatively busy, established site with over 8,000 products that has been running on Zen Cart since 2005, so I have a lot of traffic to test this with. So far today, most normal users have scored a 0, which is as expected, but the block log shows a fair few really malicious IPs blocked, so this plugin could prove invaluable for protecting sites from the worst offenders.
The latest v2.0.4 release of the AbuseIPDB module is now live on GitHub. This update introduces a new feature that allows you to enable or disable known spiders from bypassing IP checks. Additionally, in the previous v2.0.3 release, I added an IP Cleanup feature that automatically deletes expired IP records. You can enable or disable this functionality and configure the IP record expiration period in the admin settings.
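For those wondering what the IP Cleanup feature amounts to mechanically, it boils down to deleting rows older than the configured expiration period. A minimal sketch, again assuming a hypothetical `abuseipdb_cache` table with a Unix-timestamp `checked_at` column (the real module's table and setting names may differ):

```php
<?php
// Sketch: delete cached IP records older than the configured expiry.
// Returns the number of rows removed.
function cleanup_expired_ips(PDO $db, int $expirySeconds): int
{
    $stmt = $db->prepare('DELETE FROM abuseipdb_cache WHERE checked_at < :cutoff');
    $stmt->execute([':cutoff' => time() - $expirySeconds]);
    return $stmt->rowCount();
}
```

Running this periodically (or on each admin page load) keeps the cache table from growing without bound while leaving recent scores intact.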