[2/5] ids-functions.pl: Allow "5" download attempts for each provider before fail.

Message ID 20220323040452.2609-2-stefan.schantl@ipfire.org
State Superseded
Headers
Series [1/5] ids-functions.pl: Drop downloader code for sourcefire based ruleset.

Commit Message

Stefan Schantl March 23, 2022, 4:04 a.m. UTC
  Signed-off-by: Stefan Schantl <stefan.schantl@ipfire.org>
---
 config/cfgroot/ids-functions.pl | 38 ++++++++++++++++++++++++---------
 1 file changed, 28 insertions(+), 10 deletions(-)
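The essence of the change is a bounded retry loop around the HTTP request. A minimal standalone sketch of the same pattern follows; the `$fetch` code reference is a stand-in for the real `$downloader->request($request, $tmpfile)` call, and the function name is illustrative, not from the patch:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Retry a download up to $max_attempts times. Returns 0 on
# success and 1 on failure, matching the patch's convention of
# returning "1" - false - when all attempts are exhausted.
sub download_with_retries {
	my ($fetch, $max_attempts) = @_;

	for my $attempt (1 .. $max_attempts) {
		# Perform the request (stand-in for $downloader->request).
		my $ok = $fetch->($attempt);

		# Stop retrying as soon as one attempt succeeds.
		return 0 if ($ok);
	}

	# All attempts failed - the caller logs the error.
	return 1;
}

# Example: fail twice, then succeed on the third attempt.
my $calls = 0;
my $rc = download_with_retries(sub { ++$calls >= 3 }, 5);
print "rc=$rc calls=$calls\n"; # prints "rc=0 calls=3"
```

Note the patch itself compares the counter with the string operator `eq` (`$dl_attempt eq $max_dl_attempts`); that happens to work for small integers, but the numeric `==` would be the idiomatic choice.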
  

Comments

Michael Tremer March 23, 2022, 9:28 a.m. UTC | #1
Hello,

What is the rationale for five attempts? Why not three?

-Michael

> On 23 Mar 2022, at 04:04, Stefan Schantl <stefan.schantl@ipfire.org> wrote:
> 
> Signed-off-by: Stefan Schantl <stefan.schantl@ipfire.org>
> ---
> config/cfgroot/ids-functions.pl | 38 ++++++++++++++++++++++++---------
> 1 file changed, 28 insertions(+), 10 deletions(-)
> 
> diff --git a/config/cfgroot/ids-functions.pl b/config/cfgroot/ids-functions.pl
> index eb276030b..c8bc52b1b 100644
> --- a/config/cfgroot/ids-functions.pl
> +++ b/config/cfgroot/ids-functions.pl
> @@ -256,6 +256,10 @@ sub downloadruleset ($) {
> 	# If no provider is given default to "all".
> 	$provider //= 'all';
> 
> +	# The amount of download attempts before giving up and
> +	# logging an error.
> +	my $max_dl_attempts = 5;
> +
> 	# Hash to store the providers and access id's, for which rules should be downloaded.
> 	my %sheduled_providers = ();
> 
> @@ -364,19 +368,33 @@ sub downloadruleset ($) {
> 		# Pass the requested url to the downloader.
> 		my $request = HTTP::Request->new(GET => $url);
> 
> -		# Perform the request and save the output into the tmpfile.
> -		my $response = $downloader->request($request, $tmpfile);
> +		my $dl_attempt = 1;
> +		my $response;
> 
> -		# Check if there was any error.
> -		unless ($response->is_success) {
> -			# Obtain error.
> -			my $error = $response->content;
> +		# Download and retry on failure.
> +		while ($dl_attempt <= $max_dl_attempts) {
> +			# Perform the request and save the output into the tmpfile.
> +			$response = $downloader->request($request, $tmpfile);
> 
> -			# Log error message.
> -			&_log_to_syslog("Unable to download the ruleset. \($error\)");
> +			# Check if the download was successfull.
> +			if($response->is_success) {
> +				# Break loop.
> +				last;
> 
> -			# Return "1" - false.
> -			return 1;
> +			# Check if we ran out of download re-tries.
> +			} elsif ($dl_attempt eq $max_dl_attempts) {
> +				# Obtain error.
> +				my $error = $response->content;
> +
> +				# Log error message.
> +				&_log_to_syslog("Unable to download the ruleset. \($error\)");
> +
> +				# Return "1" - false.
> +				return 1;
> +			}
> +
> +			# Increase download attempt counter.
> +			$dl_attempt++;
> 		}
> 
> 		# Obtain the connection headers.
> -- 
> 2.30.2
>
  
Stefan Schantl March 24, 2022, 6:23 p.m. UTC | #2
Hello Michael,

there was no special intention - I simply wanted to give the downloader
more than just one chance to do its job. For this I needed a value, so
I simply chose "5".

But I'm also fine with "3" or any other suggestion.

Best regards,

-Stefan

> Hello,
> 
> What is the rationale for five attempts? Why not three?
> 
> -Michael
> 
> > On 23 Mar 2022, at 04:04, Stefan Schantl
> > <stefan.schantl@ipfire.org> wrote:
> > 
> > Signed-off-by: Stefan Schantl <stefan.schantl@ipfire.org>
> > ---
> > config/cfgroot/ids-functions.pl | 38 ++++++++++++++++++++++++------
> > ---
> > 1 file changed, 28 insertions(+), 10 deletions(-)
> > 
> > diff --git a/config/cfgroot/ids-functions.pl b/config/cfgroot/ids-
> > functions.pl
> > index eb276030b..c8bc52b1b 100644
> > --- a/config/cfgroot/ids-functions.pl
> > +++ b/config/cfgroot/ids-functions.pl
> > @@ -256,6 +256,10 @@ sub downloadruleset ($) {
> >         # If no provider is given default to "all".
> >         $provider //= 'all';
> > 
> > +       # The amount of download attempts before giving up and
> > +       # logging an error.
> > +       my $max_dl_attempts = 5;
> > +
> >         # Hash to store the providers and access id's, for which
> > rules should be downloaded.
> >         my %sheduled_providers = ();
> > 
> > @@ -364,19 +368,33 @@ sub downloadruleset ($) {
> >                 # Pass the requested url to the downloader.
> >                 my $request = HTTP::Request->new(GET => $url);
> > 
> > -               # Perform the request and save the output into the
> > tmpfile.
> > -               my $response = $downloader->request($request,
> > $tmpfile);
> > +               my $dl_attempt = 1;
> > +               my $response;
> > 
> > -               # Check if there was any error.
> > -               unless ($response->is_success) {
> > -                       # Obtain error.
> > -                       my $error = $response->content;
> > +               # Download and retry on failure.
> > +               while ($dl_attempt <= $max_dl_attempts) {
> > +                       # Perform the request and save the output
> > into the tmpfile.
> > +                       $response = $downloader->request($request,
> > $tmpfile);
> > 
> > -                       # Log error message.
> > -                       &_log_to_syslog("Unable to download the
> > ruleset. \($error\)");
> > +                       # Check if the download was successfull.
> > +                       if($response->is_success) {
> > +                               # Break loop.
> > +                               last;
> > 
> > -                       # Return "1" - false.
> > -                       return 1;
> > +                       # Check if we ran out of download re-tries.
> > +                       } elsif ($dl_attempt eq $max_dl_attempts) {
> > +                               # Obtain error.
> > +                               my $error = $response->content;
> > +
> > +                               # Log error message.
> > +                               &_log_to_syslog("Unable to download
> > the ruleset. \($error\)");
> > +
> > +                               # Return "1" - false.
> > +                               return 1;
> > +                       }
> > +
> > +                       # Increase download attempt counter.
> > +                       $dl_attempt++;
> >                 }
> > 
> >                 # Obtain the connection headers.
> > -- 
> > 2.30.2
> > 
>
  
Michael Tremer March 28, 2022, 3:16 p.m. UTC | #3
Hello,

I generally don’t disagree with trying again. This should, however, happen after a little while (let’s say an hour or so).

Trying more than three times at one time is a bit excessive, I would say. Let’s not try to DDoS other people’s systems :)

-Michael

> On 24 Mar 2022, at 18:23, Stefan Schantl <stefan.schantl@ipfire.org> wrote:
> 
> Hello Michael,
> 
> there was no special intention - I simply wanted to give the downloader
> more than just one chance to do its job. For this I needed a value, so
> I simply chose "5".
> 
> But I'm also fine with "3" or any other suggestion.
> 
> Best regards,
> 
> -Stefan
> 
>> Hello,
>> 
>> What is the rationale for five attempts? Why not three?
>> 
>> -Michael
>> 
>>> On 23 Mar 2022, at 04:04, Stefan Schantl
>>> <stefan.schantl@ipfire.org> wrote:
>>> 
>>> Signed-off-by: Stefan Schantl <stefan.schantl@ipfire.org>
>>> [...]
>> 
> 
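Michael's suggestion - spacing the retries out instead of firing them back to back - could be sketched as a variant with a growing pause between failed attempts. This is purely illustrative: the function name, the extra delay parameter, and the doubling schedule are assumptions, not part of the patch:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical variant: sleep between failed attempts, doubling
# the delay each time (exponential backoff). $fetch stands in
# for the real $downloader->request() call; $delay is the
# initial pause in seconds.
sub download_with_backoff {
	my ($fetch, $max_attempts, $delay) = @_;

	for my $attempt (1 .. $max_attempts) {
		# Stop as soon as one attempt succeeds.
		return 0 if ($fetch->($attempt));

		# Wait before retrying, unless this was the last attempt.
		if ($attempt < $max_attempts) {
			sleep($delay);
			$delay *= 2;
		}
	}

	# All attempts failed.
	return 1;
}

# Example: three attempts, starting with a 10 second pause
# (so pauses of 10s and 20s between the attempts):
# download_with_backoff(\&fetch_ruleset, 3, 10);
```

For a much longer wait (Michael's "an hour or so"), rescheduling the whole ruleset download via the existing cron/fcron machinery would likely fit better than sleeping inside the CGI-facing function.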
  

Patch

diff --git a/config/cfgroot/ids-functions.pl b/config/cfgroot/ids-functions.pl
index eb276030b..c8bc52b1b 100644
--- a/config/cfgroot/ids-functions.pl
+++ b/config/cfgroot/ids-functions.pl
@@ -256,6 +256,10 @@  sub downloadruleset ($) {
 	# If no provider is given default to "all".
 	$provider //= 'all';
 
+	# The amount of download attempts before giving up and
+	# logging an error.
+	my $max_dl_attempts = 5;
+
 	# Hash to store the providers and access id's, for which rules should be downloaded.
 	my %sheduled_providers = ();
 
@@ -364,19 +368,33 @@  sub downloadruleset ($) {
 		# Pass the requested url to the downloader.
 		my $request = HTTP::Request->new(GET => $url);
 
-		# Perform the request and save the output into the tmpfile.
-		my $response = $downloader->request($request, $tmpfile);
+		my $dl_attempt = 1;
+		my $response;
 
-		# Check if there was any error.
-		unless ($response->is_success) {
-			# Obtain error.
-			my $error = $response->content;
+		# Download and retry on failure.
+		while ($dl_attempt <= $max_dl_attempts) {
+			# Perform the request and save the output into the tmpfile.
+			$response = $downloader->request($request, $tmpfile);
 
-			# Log error message.
-			&_log_to_syslog("Unable to download the ruleset. \($error\)");
+			# Check if the download was successfull.
+			if($response->is_success) {
+				# Break loop.
+				last;
 
-			# Return "1" - false.
-			return 1;
+			# Check if we ran out of download re-tries.
+			} elsif ($dl_attempt eq $max_dl_attempts) {
+				# Obtain error.
+				my $error = $response->content;
+
+				# Log error message.
+				&_log_to_syslog("Unable to download the ruleset. \($error\)");
+
+				# Return "1" - false.
+				return 1;
+			}
+
+			# Increase download attempt counter.
+			$dl_attempt++;
 		}
 
 		# Obtain the connection headers.