Using Python to Monitor a .Onion Dark Web Site
I have a few servers running on the dark web for my SANS SEC497 Practical OSINT course. The dark web is known for many things, but reliability isn’t necessarily one of them, which is why I run more than one. When the class becomes available in March, students will be taking it all over the world at different times. Because of this, I needed a small program to monitor my dark web sites and let me know if they went offline.
My first attempt was using a popular open-source website monitor. I made several attempts to route its traffic through Tor to monitor my .onion sites, but they weren’t successful. I finally decided to write a simple Python script to fit my needs. Then I thought: why do that, when I can have ChatGPT do it for me?
I went to the ChatGPT website and asked it to write me some Python code to check whether my .onion site was online and to alert me if it wasn’t. I had to switch the port it wanted to use for the SOCKS proxy (more on that later), but the code worked. I then asked it to check a list of sites every 15 minutes and only to notify me if a site failed the check four times in a row. With the dark web being as unreliable as it is, I don’t want to get constant notifications if the issue is just a minor hiccup.
Anyone who’s built (or used) any persistent monitoring system knows there is the potential to get flooded with alerts, and nobody enjoys that. I asked ChatGPT to make the code use Amazon’s SES email service to email me if a site was down four or more checks in a row, but not to notify me more than once per day, per site. This way, if I’m asleep when a site goes down, I wake up to a single email instead of an inbox avalanche.
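That alerting rule is simple enough to express on its own. Here is a sketch of the check as a standalone helper; the function name and signature are my own, not part of the original script:

```python
import datetime

def should_notify(consecutive_failures, last_sent, now=None, threshold=4):
    """True when a site has failed `threshold` or more checks in a row
    and no alert for it has gone out in the last day."""
    now = now or datetime.datetime.now()
    return consecutive_failures >= threshold and (now - last_sent).days >= 1
```

Both conditions have to hold: a long enough failure streak, and at least a full day since the last email for that site.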
Ok, that was the why; let’s talk about the how. I’ve tested this code on Windows, but for production I wanted it on a small Intel NUC running Ubuntu 22.04.
The first step is to get Tor running on the NUC.
sudo apt install tor
sudo service tor status
The first command will install Tor, and the second will let you verify that it’s running.
I then used pip to install the Python packages “requests” (to make our web requests), “stem” (to interact with Tor), and “pysocks” (to let our code use the SOCKS proxy opened up by Tor when it’s running).
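Assuming pip points at the same Python you’ll run the script with, that’s a single command:

```shell
pip install requests stem pysocks
```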
On Windows with the Tor Browser running, I needed to funnel my traffic through port 9150. On Ubuntu with the Tor service running, I needed to use port 9050.
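If you’re not sure which port applies on a given machine, one quick way to find out is to check which of the two is accepting local connections. This helper is my own addition, not part of the monitoring script:

```python
import socket

def detect_tor_socks_port(candidates=(9050, 9150)):
    """Return the first candidate port accepting TCP connections on
    localhost, or None if neither is listening. 9050 is the default for
    the Tor service; 9150 is what the Tor Browser opens."""
    for port in candidates:
        try:
            # A successful connect means something is listening there.
            with socket.create_connection(("127.0.0.1", port), timeout=1):
                return port
        except OSError:
            continue
    return None
```

Note this only confirms a listener exists on the port; it doesn’t prove the listener is Tor.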
The last Python library I installed was “boto3”, which lets Python work with Amazon’s cloud, AWS. Their SES service makes it very easy to automate email notifications. If you’re interested in getting started with AWS, I wrote a few blog posts in 2020 which you can find here.
With the requirements out of the way, let’s look at the code:
import time
import datetime
import requests
from stem import Signal
from stem.control import Controller
import boto3

# Replace with your AWS access key and secret key
aws_access_key_id = "ACCESS_KEY"
aws_secret_access_key = "SECRET_KEY"

def check_onion_site(url, failures, last_notification_sent):
    session = requests.session()
    session.proxies = {'http': 'socks5h://127.0.0.1:9050',
                       'https': 'socks5h://127.0.0.1:9050'}
    try:
        response = session.get(url, timeout=10)
        response.raise_for_status()
        print(f"{url} is up!")
        return 0
    except requests.exceptions.RequestException as e:
        print(f"{url} is down: {e}")
        return 1

def send_notification(url):
    # Replace with your email address
    recipient = "recipient@example.com"
    # Replace with your sender email address
    sender = "sender@example.com"
    client = boto3.client('ses', region_name="us-west-2",
                          aws_access_key_id=aws_access_key_id,
                          aws_secret_access_key=aws_secret_access_key)
    subject = f"Notification: {url} is down"
    body = f"{url} is down for more than 4 consecutive times."
    client.send_email(
        Source=sender,
        Destination={
            'ToAddresses': [
                recipient,
            ]
        },
        Message={
            'Subject': {
                'Data': subject
            },
            'Body': {
                'Text': {
                    'Data': body
                }
            }
        }
    )
    print(f"Email sent to {recipient}")

urls = ["http://icanhazip.com", "http://f5xdqcmzgq6tld6cg7ttahzulzm656tg2jl7zhrordjkajjgilc2jgqd.onion/"]
failures = {url: 0 for url in urls}
last_notification_sent = {url: datetime.datetime.now() for url in urls}

while True:
    for url in urls:
        fail_count = check_onion_site(url, failures, last_notification_sent)
        if fail_count == 1:
            failures[url] += 1
        else:
            failures[url] = 0
        if failures[url] >= 4 and (datetime.datetime.now() - last_notification_sent[url]).days >= 1:
            send_notification(url)
            last_notification_sent[url] = datetime.datetime.now()
    time.sleep(900)  # check every 15 minutes
I did have to make one change to the ChatGPT-generated code. The original code tried to call the check_onion_site function with only two arguments instead of the three required. After that quick fix, I fired up the code and it worked like a champ. If you want the email notifications to work, you’ll have to put in your own AWS keys, but the rest of the code works without them.
The code here checks two sites, including one on the regular internet, icanhazip.com. I put this in there during testing, figuring that if the script couldn’t hit that site, it was likely an issue with my Tor proxy rather than with the .onion site.
There are a few changes I will make to this, but I wanted to share it since when I went looking for something like it online, I really couldn’t find anything.