From patchwork Sat Oct 23 07:15:31 2021
Content-Type: text/plain; charset="utf-8"
MIME-Version: 1.0
Content-Transfer-Encoding: 8bit
X-Patchwork-Submitter: Peter Müller
X-Patchwork-Id: 4812
Subject: [PATCH v3 2/2] location-importer.in: Add Spamhaus DROP lists
To: location@lists.ipfire.org
References: <12557f79-b0e0-04f1-99ed-3571eb7c68af@ipfire.org>
From: Peter Müller
Message-ID: <26065cd5-cea6-5355-950a-6baaf6429903@ipfire.org>
Date: Sat, 23 Oct 2021 09:15:31 +0200
In-Reply-To: <12557f79-b0e0-04f1-99ed-3571eb7c68af@ipfire.org>
Content-Language: en-US
A while ago, it was discussed whether or not libloc should become an
"opinionated database", i.e. whether it should include any information on a
network's reputation. In general, this idea was dismissed as libloc is
neither intended nor suitable for such tasks, and we do not want to make
(political?) decisions like these for various reasons. All we do is provide
a useful location database in a neutral way, and leave it up to our users
how to react to certain results.

However, there is a problematic area. Take AS55303 as an example: we _know_
this to be a dirty network, tampering with RIR data and hijacking IP space,
and strongly recommend against processing any connection originating from or
directed to it. Since it appears to be loaded with proxies used by miscreants
for abusive purposes, all we can do at the time of writing is to flag it as
"anonymous proxy", but we lack the possibility of telling our users something
like "this is not a safe area". The very same goes for known bulletproof
ISPs, IP hijackers, and so forth.

This patch therefore suggests populating the "is_drop" flag introduced in
libloc 0.9.8 (albeit currently unused in production) with the contents of
Spamhaus' DROP lists (https://www.spamhaus.org/drop/), to have at least the
baddest of the bad covered. The very same lists are, in fact, included in
popular IPS rulesets as well - a decent number of IPFire users are therefore
likely to have them enabled already, but in a very costly way.

It is not planned to go any further, partly because there is no other
publicly available feed that comes with the same intention, volatility,
and FP rate.

The third version of this patch makes use of an auxiliary function to
sanitise ASNs, hence avoiding boilerplate code, and treats any line starting
with a semicolon as a comment, which should be sufficient. Further,
extracting ASNs from the ASN-DROP feed is now done in a clearer way,
avoiding hard-to-read code snippets.

Signed-off-by: Peter Müller
---
 src/python/location-importer.in | 110 ++++++++++++++++++++++++++++++++
 1 file changed, 110 insertions(+)

diff --git a/src/python/location-importer.in b/src/python/location-importer.in
index 3ad335f..b6a5faa 100644
--- a/src/python/location-importer.in
+++ b/src/python/location-importer.in
@@ -1074,6 +1074,9 @@ class CLI(object):
 			# network allocation lists in a machine-readable format...
 			self._update_overrides_for_aws()
 
+			# Update overrides for Spamhaus DROP feeds...
+			self._update_overrides_for_spamhaus_drop()
+
 			for file in ns.files:
 				log.info("Reading %s..." % file)
 
@@ -1258,6 +1261,113 @@ class CLI(object):
 				)
 
+	def _update_overrides_for_spamhaus_drop(self):
+		downloader = location.importer.Downloader()
+
+		ip_urls = [
+			"https://www.spamhaus.org/drop/drop.txt",
+			"https://www.spamhaus.org/drop/edrop.txt",
+			"https://www.spamhaus.org/drop/dropv6.txt"
+		]
+
+		asn_urls = [
+			"https://www.spamhaus.org/drop/asndrop.txt"
+		]
+
+		for url in ip_urls:
+			try:
+				with downloader.request(url, return_blocks=False) as f:
+					fcontent = f.body.readlines()
+			except Exception as e:
+				log.error("Unable to download Spamhaus DROP URL %s: %s" % (url, e))
+				return
+
+			# Iterate through every line, filter comments and add remaining networks to
+			# the override table in case they are valid...
+			with self.db.transaction():
+				for sline in fcontent:
+
+					# The response is assumed to be encoded in UTF-8...
+					sline = sline.decode("utf-8")
+
+					# Comments start with a semicolon...
+					if sline.startswith(";"):
+						continue
+
+					# Extract network and ignore anything afterwards...
+					try:
+						network = ipaddress.ip_network(sline.split()[0], strict=False)
+					except ValueError:
+						log.error("Unable to parse line: %s" % sline)
+						continue
+
+					# Sanitize parsed networks...
+					if not self._check_parsed_network(network):
+						log.warning("Skipping bogus network found in Spamhaus DROP URL %s: %s" % \
+							(url, network))
+						continue
+
+					# Conduct SQL statement...
+					self.db.execute("""
+						INSERT INTO network_overrides(
+							network,
+							source,
+							is_drop
+						) VALUES (%s, %s, %s)
+						ON CONFLICT (network) DO NOTHING""",
+						"%s" % network,
+						"Spamhaus DROP lists",
+						True
+					)
+
+		for url in asn_urls:
+			try:
+				with downloader.request(url, return_blocks=False) as f:
+					fcontent = f.body.readlines()
+			except Exception as e:
+				log.error("Unable to download Spamhaus DROP URL %s: %s" % (url, e))
+				return
+
+			# Iterate through every line, filter comments and add remaining ASNs to
+			# the override table in case they are valid...
+			with self.db.transaction():
+				for sline in fcontent:
+
+					# The response is assumed to be encoded in UTF-8...
+					sline = sline.decode("utf-8")
+
+					# Comments start with a semicolon...
+					if sline.startswith(";"):
+						continue
+
+					# Throw away anything after the first space...
+					sline = sline.split()[0]
+
+					# ... strip the "AS" prefix from it ...
+					sline = sline.strip("AS")
+
+					# ... and convert it into an integer. Voila.
+					asn = int(sline)
+
+					# Filter invalid ASNs...
+					if not self._check_parsed_asn(asn):
+						log.warning("Skipping bogus ASN found in Spamhaus DROP URL %s: %s" % \
+							(url, asn))
+						continue
+
+					# Conduct SQL statement...
+					self.db.execute("""
+						INSERT INTO autnum_overrides(
+							number,
+							source,
+							is_drop
+						) VALUES (%s, %s, %s)
+						ON CONFLICT (number) DO NOTHING""",
+						"%s" % asn,
+						"Spamhaus ASN-DROP list",
+						True
+					)
+
 	@staticmethod
 	def _parse_bool(block, key):
 		val = block.get(key)
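
For reference, below is a minimal, self-contained sketch of how the line
handling above treats the two feed formats. The sample lines are illustrative
only (not taken from the actual feeds), and the snippet merely mirrors the
parsing steps of the patch, leaving out the download and database parts:

	import ipaddress

	# Illustrative sample lines; the feeds are assumed to use semicolon
	# comments and to carry one entry per line, with anything after the
	# first whitespace being ignored by the parser above.
	sample_drop = [
		"; Spamhaus DROP List - header comment",
		"192.0.2.0/24 ; SBL000000",
	]

	sample_asndrop = [
		"; Spamhaus ASN-DROP List - header comment",
		"AS64496 ; XX | example-network",
	]

	for line in sample_drop:
		if line.startswith(";"):
			continue
		# Same call as in the patch: take the first token, parse it leniently.
		network = ipaddress.ip_network(line.split()[0], strict=False)
		print(network)		# -> 192.0.2.0/24

	for line in sample_asndrop:
		if line.startswith(";"):
			continue
		# First token, "AS" prefix stripped, converted to an integer.
		asn = int(line.split()[0].strip("AS"))
		print(asn)		# -> 64496

Since both INSERT statements use ON CONFLICT ... DO NOTHING, re-running the
import is harmless: existing override rows for the same network or ASN are
left untouched.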