Additional filter request to remove spammers.

If you have suggestions for new features or feature changes in NewsLeecher, go ahead and let santa know.
bentbobb
Posts: 4
Joined: Tue Sep 06, 2011 11:42 pm

Additional filter request to remove spammers.

Post by bentbobb » Tue Sep 13, 2011 11:59 pm

Lately some scumbag has been flooding various groups with tens of thousands of spam/virus binaries. He's impossible to filter since he uses different names each day and moves around to different groups.
Would it be possible to add a filter that would NOT download headers from any poster posting over XX number of posts in any one group?
Thanks,
Bob
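
For what it's worth, the kind of per-poster threshold I'm asking for is easy to sketch outside the program. The snippet below is only an illustration of the idea, not anything NewsLeecher actually does: it assumes the group's header overviews are already available as (poster, subject) pairs, and the function name and the 500-post default are made up.

from collections import Counter

def drop_flood_posters(headers, max_posts_per_poster=500):
    # Count how many headers each poster contributed to this group.
    counts = Counter(poster for poster, _subject in headers)
    # Any poster over the threshold is treated as a flooder...
    flooders = {p for p, n in counts.items() if n > max_posts_per_poster}
    # ...and all of that poster's headers are skipped.
    return [(p, s) for p, s in headers if p not in flooders]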

jdgdcg
Posts: 129
Joined: Sun Jun 28, 2009 3:08 pm

Re: Additional filter request to remove spammers.

Post by jdgdcg » Fri Sep 23, 2011 11:17 am

bentbobb wrote:Lately some scumbag has been flooding various groups with tens of thousands of spam/virus binaries. He's impossible to filter since he uses different names each day and moves around to different groups.
Would it be possible to add a filter that would NOT download headers from any poster posting over XX number of posts in any one group?
Thanks,
Bob
That's a bad idea, because then you also filter the flood from people who upload complete runs of comics, books, or TV series. That's a lot of posts you'd lose.

Try playing with the crosspost filter in Settings - Group Browsing, and with the subject filter. I have these entries in my filter (a rough sketch of how they match is below the list):

german.dubbed
.german.
.exe"
.rar"
.exe.rar"
.rar - -
.rar.rar
.exe
keygens.nl
passworded
razorblade
eternalcracker
.dutch.
.swedish.
.french.
.foreign@
.german
highcompressed
(????)
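
As best I can tell, these entries behave as plain case-insensitive substring matches against the subject line, so one broad entry can hide a lot. A rough illustration of that behaviour (the matching function is my own sketch, not NewsLeecher's code, and the subject lines are invented):

FILTERS = ["german.dubbed", ".german.", '.exe"', "keygens.nl", "passworded"]

def is_filtered(subject, filters=FILTERS):
    # True if any filter string appears anywhere in the subject, ignoring case.
    s = subject.lower()
    return any(f.lower() in s for f in filters)

print(is_filtered('"movie.german.dubbed.x264.rar" yEnc (01/50)'))  # True  -> hidden
print(is_filtered('"holiday.photos.part1.rar" yEnc (1/3)'))        # False -> shown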

bentbobb
Posts: 4
Joined: Tue Sep 06, 2011 11:42 pm

Post by bentbobb » Fri Sep 30, 2011 12:53 am

I feel that it's a very good idea.
Many of the groups I visit have a daily posting limit which the regular posters have agreed to.
When I see Newsleecher start downloading over a million(!) headers as it updates a group, I know darn well that it's a spammer. This new type of spam targets Usenet search services, such as Google Groups, that use a search function to locate articles or content. The files are all named differently, so the unsuspecting searcher is almost guaranteed to get a hit on one of the spammer's files.
Being able to ignore any poster who posts some ridiculous number of binaries would help a lot. As it is now, I have to stop the update (it can take hours) and just get the latest 100,000 articles. This means I miss a lot of stuff from the regular legitimate posters.

f4gww
Posts: 2
Joined: Sun Dec 30, 2007 5:47 pm

Filters

Post by f4gww » Sat Oct 01, 2011 1:30 pm

I've been using Newsleecher for quite a while now, and one of the most annoying things is its lack of a regular-expression filter system (similar to the one in Forte Agent). These bulk spammers can be filtered out by matching their IP address with a regex, but Newsleecher does not expose the full message header to you.
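
For reference, this kind of full-header regex filtering can be done outside the newsreader, because the NNTP HEAD command returns every header line (unlike the overview data used for fast header updates). A minimal sketch in Python using the standard-library nntplib module (available up to Python 3.12); the server, group, and IP pattern are all placeholders:

import nntplib
import re

# Placeholder pattern: a Path or NNTP-Posting-Host header naming a known spam source.
SPAM = re.compile(r'^(Path|NNTP-Posting-Host):.*\b198\.51\.100\.\d+', re.I | re.M)

with nntplib.NNTP('news.example.com') as srv:               # placeholder server
    _resp, _count, first, last, _name = srv.group('alt.binaries.example')
    _resp, overviews = srv.over((last - 200, last))          # overview of the last ~200 articles
    for number, _fields in overviews:
        _resp, info = srv.head(number)                       # full header block for this article
        headers = b'\n'.join(info.lines).decode('latin-1', 'replace')
        if SPAM.search(headers):
            print('would filter article', number)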

I really like Newsleecher, but these spammers have gotten so bad in the a.b groups that I'm actually looking at other newsreaders that are capable of more robust filtering (BTW, Forte Agent is about the only one I've found that really does a good job of fully customizable filtering).

bentbobb
Posts: 4
Joined: Tue Sep 06, 2011 11:42 pm

Post by bentbobb » Sat Oct 01, 2011 11:10 pm

I agree with you about Agent's sophisticated filtering capabilities. I spend (too much) time in many porn binary groups, probably the most spammed groups on Usenet, and I'm able to filter out almost all of it. I got tired of Agent and decided to find something that would work better for binaries. NewsLeecher has been working well (I've tried many of the others), with a few notable exceptions. There doesn't seem to be much support for the program, though: there's no help file, and questions go unanswered in the forums. I don't think these folks are ready for the big time yet.

Chris202
Posts: 4
Joined: Mon Oct 26, 2009 9:46 pm

Post by Chris202 » Fri Oct 21, 2011 11:14 am

Totally agree. Newsleecher's spam filter capabilities are woefully lacking for an application that is supposedly designed to simplify/enhance newsgroup access. I actually use Forte Agent for day-to-day viewing of newsgroup posts simply because it has better filter capabilities, including one-click addition of a spammer to the killfile. I would love to use only Newsleecher for this job, but it is incapable.

Rather than spending so much time adding more bells and whistles, the authors of Newsleecher need to bolster this essential function!

Loren
Posts: 96
Joined: Tue Jan 03, 2006 9:25 pm

Post by Loren » Fri Mar 09, 2012 6:40 am

I believe the spam can be identified by a proper filtering technique built into the program.

Once the headers have been assembled, sort the group by size. Look for blocks of posts from a single poster where the post count is over some definable threshold (say 500) and the file sizes are all within 1% of each other (also definable). All posts included in such a block get zapped as spam.

Note that there might be multiple blocks per poster (I've seen as many as three different bits of malware in a single day's load) and there may be other posts by the "poster" that are legit. (They sometimes copy a legitimate name.)

The thresholds probably need to be definable per group. While I don't think a picture group would trip the 1% size test, I could imagine people worrying that it might.
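
To make the rule concrete, here is a rough sketch of it. Only the thresholds and the within-1% test come from what I described above; the data layout and names are invented:

from collections import defaultdict

def find_spam_blocks(headers, min_posts=500, size_tolerance=0.01):
    # headers: iterable of (article_id, poster, size_in_bytes).
    # Returns the article ids that sit in a block of at least min_posts posts
    # from one poster whose sizes are all within size_tolerance of each other.
    by_poster = defaultdict(list)
    for art_id, poster, size in headers:
        by_poster[poster].append((size, art_id))

    spam = set()
    for posts in by_poster.values():
        posts.sort()                              # this poster's posts, ordered by size
        start = 0
        for end in range(len(posts)):
            # Shrink the window until every size in it is within the tolerance.
            while posts[end][0] > posts[start][0] * (1 + size_tolerance):
                start += 1
            if end - start + 1 >= min_posts:
                spam.update(art_id for _size, art_id in posts[start:end + 1])
    return spam

Because only same-size blocks are flagged, a poster's unrelated (legitimate) posts are left alone, and a spammer who uploads several different pieces of malware in one day simply produces several flagged blocks.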

pixelmonk
Posts: 7
Joined: Wed Apr 26, 2006 12:35 pm

Post by pixelmonk » Thu Nov 08, 2012 8:40 pm

This needs to be looked at again. Groups are being flooded with random posts from random posters. Selecting a poster, ignoring that poster, and removing their posts from the cache is a three-step process. This should be a single button/hotkey option.

Outside of that easy fix, Newsleecher needs more sophisticated filtering options (e.g. filtering by IP) like Agent has. Spammers are getting smarter and are dropping the subject-line patterns that used to be easy to filter on. If you want me to keep paying to use your software, you should be making it easier to use your software.

tommo123
Posts: 120
Joined: Wed Oct 19, 2005 4:43 pm

Post by tommo123 » Sun Nov 11, 2012 10:42 pm

One click also makes it too easy to remove all posts by mistake.

I've often (first thing in the morning, still drowsy) hit "get latest 1 header" instead of "get all".

That's annoying.

I just remove rar files, tbh, and things under 5 MB. The spam is usually (from what I saw) a big block of rar files that are easily removed, and it thankfully doesn't affect split posts (rars and pars), so it's very easy to clean up.

pixelmonk
Posts: 7
Joined: Wed Apr 26, 2006 12:35 pm

Post by pixelmonk » Mon Nov 12, 2012 11:57 pm

No, that doesn't fix the underlying problem. Besides, previews, NFO, or PAR files fall below that threshold, or spam rars go above it. Also, this method is "after the fact", which is BS. Real filtering options would fix it, not just some half-assed workaround.

doppler
Posts: 152
Joined: Fri Apr 01, 2005 4:19 pm

Post by doppler » Thu Nov 15, 2012 2:31 am

Deleted by me
Last edited by doppler on Fri Sep 27, 2013 3:08 am, edited 1 time in total.

NotScouser
Posts: 17
Joined: Sat Feb 17, 2007 3:59 pm

Post by NotScouser » Fri Dec 14, 2012 5:20 am

I wish I had seen this thread before I started a new one on the same subject. Spam filtering has got to be the #1 priority in the next version. Unfortunately, NL can only filter by certain parameters. It cannot filter by "Path" (the Usenet server the spam originates from), as "Path" is not in the XOVER response. The only thing it can sensibly filter on is the "Subject".
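
For anyone wondering why: the XOVER/OVER overview that fast header updates rely on carries only a fixed set of fields (article number, Subject, From, Date, Message-ID, References, byte count, line count; see RFC 2980/3977), so Path simply isn't there to filter on. A tiny sketch of what one overview line contains (the sample line is invented):

OVERVIEW_FIELDS = ["number", "subject", "from", "date",
                   "message-id", "references", "bytes", "lines"]

def parse_xover_line(line):
    # Split one tab-separated XOVER response line into a dict of its fields.
    return dict(zip(OVERVIEW_FIELDS, line.split('\t')))

sample = ('12345\t"file.rar" yEnc (1/10)\tspammer@example.invalid\t'
          'Fri, 14 Dec 2012 05:20:00 +0000\t<abc@example.invalid>\t\t512000\t4000')
print(parse_xover_line(sample)['from'])   # From is available; Path is not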

JeffSD
Posts: 7
Joined: Tue Dec 11, 2012 11:47 am

Re: Additional filter request to remove spammers.

Post by JeffSD » Sat Dec 15, 2012 2:58 am

jdgdcg wrote: .rar"
.rar - -
.rar.rar
I just wanted to let you guys know that the filters above will cause downloads to go missing, since plenty of legitimate files have the .rar extension. So I'd suggest not including those filters.
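
To illustrate (assuming the subject filter is a plain substring match, which is how it appears to behave): an entry like .rar" hits legitimately posted archives just as readily as the spam. Both subjects here are invented:

legit = '"great.show.s01e01.part01.rar" yEnc (01/47)'   # made-up legitimate post
spam  = '"xKj29dQ.rar" yEnc (1/1)'                      # made-up spam post

for subject in (legit, spam):
    print('.rar"' in subject.lower(), subject)          # prints True for both, so both get filtered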

Bags
Posts: 397
Joined: Thu Jul 14, 2005 11:53 pm
Location: Outer Space

Post by Bags » Wed Jan 02, 2013 8:40 pm

If NL had a 'script-reading' function tied to the groups in the saved Group Properties, you could select a script for any of them to filter all new headers whenever the group is updated.
Now, how to build the scripts/files..?

There'll be a new 'Anti-Spammer' club!
