
Additional filter request to remove spammers.

Posted: Tue Sep 13, 2011 11:59 pm
by bentbobb
Lately some scumbag has been flooding various groups with tens of thousands of spam/virus binaries. He's impossible to filter since he uses different names each day and moves around to different groups.
Would it be possible to add a filter that would NOT download headers from any poster posting over XX number of posts in any one group?
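Roughly the kind of rule I'm asking for, as a sketch (the 'poster' field name and the threshold are just placeholders, not anything NewsLeecher actually exposes):

Code:

from collections import Counter

# Sketch only: drop headers from any poster who exceeds a per-group
# post-count threshold in one header update. Field names are made up.
MAX_POSTS_PER_POSTER = 2000  # the "XX" threshold, purely illustrative

def drop_flooders(headers, max_posts=MAX_POSTS_PER_POSTER):
    """headers: list of dicts, each with at least a 'poster' key."""
    counts = Counter(h["poster"] for h in headers)
    flooders = {p for p, n in counts.items() if n > max_posts}
    return [h for h in headers if h["poster"] not in flooders], flooders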
Thanks,
Bob

Re: Additional filter request to remove spammers.

Posted: Fri Sep 23, 2011 11:17 am
by jdgdcg
bentbobb wrote: Lately some scumbag has been flooding various groups with tens of thousands of spam/virus binaries. He's impossible to filter since he uses different names each day and moves around to different groups.
Would it be possible to add a filter that would NOT download headers from any poster posting over XX number of posts in any one group?
Thanks,
Bob
That's a bad idea, because there are also floods from people who upload complete runs of comics, books, or TV series. That's a lot of posting you'd lose.

Try playing with the crosspost filter in Settings - Group Browsing, and with the subject filter. I have this in my filter (a rough sketch of how these subject rules work follows the list):

german.dubbed
.german.
.exe"
.rar"
.exe.rar"
.rar - -
.rar.rar
.exe
keygens.nl
passworded
razorblade
eternalcracker
.dutch.
.swedish.
.french.
.foreign@
.german
highcompressed
(????)
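Roughly, those subject rules behave like simple case-insensitive substring matches; a sketch (this is not NewsLeecher's actual matching code):

Code:

FILTER_TERMS = [
    "german.dubbed", ".german.", '.exe"', '.rar"', '.exe.rar"',
    ".rar - -", ".rar.rar", ".exe", "keygens.nl", "passworded",
    "razorblade", "eternalcracker", ".dutch.", ".swedish.",
    ".french.", ".foreign@", ".german", "highcompressed", "(????)",
]

# Sketch of "hide the post if the subject contains any of these terms".
def is_filtered(subject, terms=FILTER_TERMS):
    s = subject.lower()
    return any(t in s for t in terms)  # terms above are already lowercase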

Posted: Fri Sep 30, 2011 12:53 am
by bentbobb
I feel that it's a very good idea.
Many of the groups I visit have a daily posting limit which the regular posters have agreed to.
When I see Newsleecher start downloading over a million(!) headers as it updates a group I know darn well that it's a spammer. This new type of spam is targeted at the usenet search services such as GoogleGroups that use a search function to locate articles or content. The files are all named differently so that the unsuspecting searcher is almost guaranteed to get a hit on one of the spammer's files.
Being able to ignore any poster who posts some ridiculous number of binaries would help a lot. As it is now, I have to stop the update (it can take hours) and just get the latest 100,000 articles. This means I miss a lot of stuff from the regular legitimate posters.

Filters

Posted: Sat Oct 01, 2011 1:30 pm
by f4gww
I've been using Newsleecher for quite a while now, and one of the most annoying things is its lack of a regular-expression filter system (similar to that found in Forte Agent). These bulk spammers can be filtered out by matching their IP address with a regular expression, but Newsleecher does not expose the full message header to you.
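Something like this is what a full-header regex kill rule could look like (the header names and the address below are only examples, and this is not something NewsLeecher currently supports):

Code:

import re

# Example only: kill rules matched against the raw header block of an
# article. The header names and the IP/host below are placeholders.
KILL_RULES = [
    re.compile(r"^NNTP-Posting-Host:\s*192\.0\.2\.\d+", re.I | re.M),
    re.compile(r"^Path:.*\bnews\.spamhost\.example\b", re.I | re.M),
]

def kill_article(raw_headers):
    """raw_headers: the article's full header text, one field per line."""
    return any(rule.search(raw_headers) for rule in KILL_RULES)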

I really like Newsleecher, but these spammers have gotten so bad in the a.b groups that I'm actually looking at other newsreaders that are capable of more robust filtering (BTW, Forte Agent is about the only one I've found that really does a good job of fully customizable filtering).

Posted: Sat Oct 01, 2011 11:10 pm
by bentbobb
I agree with you regarding Agent's sophisticated filtering capabilities. I spend (too much) time in many porn binary groups, probably the most spammed groups in usenet, and I'm able to filter out almost all of it. I got tired of Agent and decided to find something that would work better for binaries. NewsLeecher has been working well (I've tried many of the others) except for some notable exceptions. There doesn't seem to be much support for the program. There's no help file and questions go unanswered in the forums. I don't think these folks are ready for the big time yet.

Posted: Fri Oct 21, 2011 11:14 am
by Chris202
Totally agree. Newsleecher's SPAM filter capabilities are woefully lacking for an application that is supposedly designed to simplify/enhance newsgroup access. I actually use Forte Agent for day-to-day viewing of newsgroup posts simply because it has better filter capabilities - including one-click addition of a spammer to the killfile. I would love to use only Newsleecher for this job, but it is incapable.

Rather than spending so much time adding more bells and whistles, the authors of Newsleecher need to bolster this essential function!

Posted: Fri Mar 09, 2012 6:40 am
by Loren
I believe the spam can be identified by a proper filtering technique built into the program.

Once the headers have been assembled, sort the group by size. Find blocks of posts from a single poster where the number of posts exceeds some definable threshold (say 500) and the file sizes are all within 1% of each other (also definable). All posts in such a block get zapped as spam.

Note that there might be multiple blocks per poster (I've seen as many as three different bits of malware in a single day's load) and there may be other posts by the "poster" that are legit. (They sometimes copy a legitimate name.)

It probably needs to be definable per group. While I don't think a picture group would trip the 1% threshold, I could imagine people worrying that it might.
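A rough sketch of that logic (the field names, thresholds, and clustering are all illustrative; this is not how NewsLeecher works today):

Code:

from collections import defaultdict

# Sketch of the "size block" idea: per poster, cluster posts whose sizes
# are within a tolerance of each other; if a cluster is bigger than a
# threshold, mark the whole cluster as spam. Field names are made up.
def find_spam_blocks(headers, min_block=500, tolerance=0.01):
    """headers: dicts with 'poster' and 'bytes'. Returns posts judged spam."""
    by_poster = defaultdict(list)
    for h in headers:
        by_poster[h["poster"]].append(h)

    spam = []
    for posts in by_poster.values():
        posts.sort(key=lambda h: h["bytes"])
        block = [posts[0]]
        for h in posts[1:]:
            # Same block if the size is within `tolerance` of the block's smallest post.
            if h["bytes"] <= block[0]["bytes"] * (1 + tolerance):
                block.append(h)
            else:
                if len(block) >= min_block:
                    spam.extend(block)  # the whole block gets zapped
                block = [h]
        if len(block) >= min_block:
            spam.extend(block)
    return spam

Note that this allows multiple blocks per poster and leaves their other posts alone, which matches what I've seen.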

Posted: Thu Nov 08, 2012 8:40 pm
by pixelmonk
This needs to be looked at again. Groups are being flooded with random posts from random posters. Selecting a poster, ignoring the poster, and removing their posts from the cache is a three-step process. This should be a single button/hotkey option.

Outside of that easy fix, Newsleecher needs more sophisticated filtering options (e.g. by IP), like Agent has. Spammers are getting smarter by removing the subject lines that used to be easily filterable. If you want me to continue to pay to use your software, then you should be making it easier to use.

Posted: Sun Nov 11, 2012 10:42 pm
by tommo123
One click makes it too easy to remove all posts by mistake, too.

I've often (first thing in the morning, and drowsy) hit "get latest 1 header" instead of "all".

That's annoying.

I just remove rar files, tbh, and things under 5 MB. The spam is usually (from what I've seen) a big block of rar files that are easily removed, and thankfully it doesn't affect split posts (rars and pars), so it's very easy to clean up.
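Roughly what I do, as a sketch (field names are made up, and the part-number check is just a crude guess at what a split post looks like):

Code:

import re

SMALL = 5 * 1024 * 1024  # 5 MB

# Sketch of "drop small standalone rar posts, keep split sets".
# The 'subject' and 'bytes' field names are placeholders.
def looks_like_spam(header):
    subject = header["subject"].lower()
    is_rar = ".rar" in subject
    is_split = bool(re.search(r"\(\d+/\d+\)", subject))  # e.g. "(03/47)"
    return is_rar and header["bytes"] < SMALL and not is_split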

Posted: Mon Nov 12, 2012 11:57 pm
by pixelmonk
No. That doesn't fix the underlying problem. Besides, previews, NFO, or PAR files fall below that threshold, and spam rars go above it. Also, this method is after-the-fact cleanup, which is bs. Real filtering options would fix it, not just some half-assed workaround.

Posted: Thu Nov 15, 2012 2:31 am
by doppler
Deleted by me

Posted: Fri Dec 14, 2012 5:20 am
by NotScouser
I wish I had seen this thread before I started a new one on the same subject. Spam filtering has got to be the #1 priority in the next version. Unfortunately, NL can only filter on certain parameters. It cannot filter by "Path" (the Usenet server the spam originates from), as "Path" is not in the XOVER response. The only thing it can sensibly filter on is the "Subject".
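For reference, an overview (XOVER) line is tab-separated and only carries the fields below, which is why Path simply isn't there to filter on (the sample values are made up):

Code:

OVERVIEW_FIELDS = [
    "number", "subject", "from", "date",
    "message_id", "references", "bytes", "lines",
]

def parse_xover_line(line):
    """Split one tab-separated XOVER response line into its fields."""
    return dict(zip(OVERVIEW_FIELDS, line.split("\t")))

# sample = ("12345\tSome subject\tposter@example.invalid\t"
#           "Fri, 14 Dec 2012 05:20:00 GMT\t<msg-id@example.invalid>\t"
#           "\t73400320\t987")
# parse_xover_line(sample)["subject"]  -> 'Some subject'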

Re: Additional filter request to remove spammers.

Posted: Sat Dec 15, 2012 2:58 am
by JeffSD
jdgdcg wrote: .rar"
.rar - -
.rar.rar
I just wanted to let you guys know that the above filters will cause downloads to go missing, since legitimate files also have the .rar extension. So I'd suggest not including those filters.

Posted: Wed Jan 02, 2013 8:40 pm
by Bags
If NL had a 'script-reading' function tied to the groups in the saved Group Properties, one could select any of those scripts to filter the headers each time a group is updated.
Now, how to build the scripts/files .. ?

There'll be a new 'Anti-Spammer' club!