Building Firewall Blocklists

One of the questions I see asked most often on internet forums is how to build firewall block lists. We have to face the fact that there are many bad actors out there who seek to do nefarious things with their internet connections. Whether it is an entire country engaged in industrial espionage or individuals seeking to profit from spam, ransomware, or other activity, we often want and need to stop the nonsense. This little post is geared primarily toward building block lists in pf for OpenBSD. That said, these scripts could easily be adapted to work with iptables.

Let’s start by creating a script to download zone files by country. These have been compiled and are made available freely as a public service. If you’re hosting material that may not be even remotely germane to a certain country and want to block said country, this website is an invaluable resource. The script below needs wget, available in ports.


#!/bin/sh

# If a list already exists, delete it
if [ -r /var/db/countries ]; then
    rm /var/db/countries
fi

# Build an IP block list based on country.
# Set baseurl to your zone-file provider's URL (placeholder here).
baseurl="https://example.com/zones"
countries="cn tw mo ru ua"
for country in $countries; do
    wget "$baseurl/$country.zone"
    cat "$country.zone" >> /var/db/countries
    rm "$country.zone"
done

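Once the list is built, pf can load it into a table and block traffic from it. Below is a minimal sketch for pf.conf; the table name <countries> is my own choice, and you would adapt the interface and direction to your ruleset:

```
# /etc/pf.conf (sketch): load the country list into a table
table <countries> persist file "/var/db/countries"
block in quick from <countries> to any
```

After editing pf.conf, reload it with pfctl -f /etc/pf.conf. When the list file changes, the table can be refreshed without a full reload: pfctl -t countries -T replace -f /var/db/countries.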
Now say it isn’t a specific country but an organization that you want blocked. Say Digital Ocean or Choopa hosts are hammering you and you want to do away with them. The easiest thing to do is to obtain a list of the IP address blocks that belong to the organization. You can do this with the organization’s AS (Autonomous System) number. Below is a script that will do this and put the result in a form suitable for easy inclusion in a pf table. The script below uses curl.


#!/bin/sh

# aslookup: fetch the IP blocks originated by an AS number.
# The lookup host below is a placeholder; substitute a service
# that returns the prefixes for a given AS.
curl "https://example.net/lookup?q=AS$1" -o "as$1.txt"

Simply type aslookup <asnumber> and you’ll have a txt file that can be easily loaded into a pf table. You might have to delete a small header comment at the top of the file first. Back when I was running a WordPress website I had to make the unfortunate decision to block by both country and AS. Now that I have moved to an essentially static website powered by Hugo, this problem is nonexistent. In fact, the bots are no longer even trying. They haven’t been at it since 7:00 this morning.
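Deleting that header is easily scripted. A small sketch, using a hypothetical sample file (the %-prefixed line stands in for whatever comment header your lookup service emits, and AS64500 is from the private-use AS range):

```shell
# Create a hypothetical sample of the lookup output
cat > as64500.txt <<'EOF'
% Prefixes originated by AS64500 (sample header comment)
192.0.2.0/24
198.51.100.0/24
EOF

# Keep only the non-comment lines; the result is ready for a pf table
grep -v '^[%#]' as64500.txt > as64500.clean
cat as64500.clean
```

The cleaned file can then be loaded the same way as the country list, e.g. pfctl -t badhosts -T replace -f as64500.clean (with badhosts being whatever table name your ruleset defines).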
