In reviewing your web server logs you notice that "automated spiders" used for web indexing or spam address harvesting consume a significant fraction of your allocated bandwidth. You would like to filter out spiders or abusers who slam your website with many hits, but without having to trawl your web logs and manually enter offending IP addresses.
IPNetSentryX provides a "Keep address" action and an "Include address" property you can use to recognize potential offenders.
Suppose we want to filter out any IP address that requests a specific web page more than X times per minute. The TCP Dump tool (with TCP Flow selected) shows that a typical web page access looks like this:
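The captured screenshot is not reproduced here, but a request seen this way is an ordinary HTTP GET; the hostname and path below are illustrative only:

```
GET /products/index.html HTTP/1.1
Host: www.example.com
User-Agent: Mozilla/5.0
Accept: text/html
```

Note that the method keyword "GET" appears in the first few bytes of the request, which is what makes the keyword match in the next step practical.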
So we might first search for packets to port 80 (HTTP) with the keyword "GET" in the first few bytes, and then specify a URL keyword to match the pages of interest.
In the first rule we detect web page requests to our site and use the "Keep address" action to save the source IP address in our Address table. Next we use the "Include address" property to determine whether this address has been seen before, and set up to test how many times. In a child rule we test whether this same source IP address has been seen more than 30 times, and if so we add it to our banned list. Notice that when we test the "Parent Match Count" of a rule with the "Include" property, we test against the match count for this address in the address table. A sibling rule fires every 60 seconds to reset the match count. If a single remote IP address tries to access our website more than 30 times within a 60-second interval, that remote address will be banned until the corresponding trigger entry expires (that is, remains idle for the expiration interval).
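The counting logic these rules implement can be sketched in Python. This is only an approximation of the behavior described above, not IPNetSentryX's actual implementation; the class and method names are hypothetical:

```python
import time
from collections import defaultdict

BAN_THRESHOLD = 30    # ban an address seen more than 30 times...
RESET_INTERVAL = 60.0  # ...before the 60-second reset rule fires


class HitCounter:
    """Approximates the Keep/Include address logic: count hits per
    source IP in the address table, reset all counts every
    RESET_INTERVAL seconds, and ban any address whose count exceeds
    BAN_THRESHOLD within one interval."""

    def __init__(self, now=time.monotonic):
        self.now = now
        self.window_start = now()
        self.counts = defaultdict(int)  # the "address table" match counts
        self.banned = set()             # the banned list

    def record_hit(self, src_ip):
        """Record one web page request; return True if src_ip is banned."""
        t = self.now()
        if t - self.window_start >= RESET_INTERVAL:
            # Corresponds to the sibling rule firing every 60 seconds.
            self.counts.clear()
            self.window_start = t
        self.counts[src_ip] += 1
        if self.counts[src_ip] > BAN_THRESHOLD:
            self.banned.add(src_ip)
        return src_ip in self.banned
```

One design point worth noting: resetting the counts every 60 seconds (rather than tracking a sliding window per address) matches the rule structure above and keeps the address table small.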
The "Keep address" table can be used to detect frequent repeated access attempts that often characterize abuse.
Please send questions, comments, or suggestions using our general requests form.