NewsLeecher V4.0 Beta 19 issues
Forum rules
IMPORTANT : Be sure to read the NewsLeecher forums policy before posting.
Please post any new issues with the 4.0 beta 19 version here.
Last edited by Smudge on Tue Jun 29, 2010 2:41 am, edited 1 time in total.
Please be aware of and use the following pages...
Services Status Page : SuperSearch and Usenet Access server status, retention, server load, indexing time, etc.
Support Request Form : Use this if you have a problem with billing or account status. The forum is only for NewsLeecher application issues.
I doubt this is a beta 19 issue.
It concerns SuperSearch (SS).
When you choose 2 days of retention, for instance, set the group to alt.binaries.erotica, and hit search expecting all headers from the last 1-2 days, you only get posts from roughly the last 9 hours.
When you search for an actual title, SS finds it.
Is this intended?
PS: There is also strange behaviour with the mindays:/maxdays: commands in the same erotica group.
For instance, with mindays:1 maxdays:7 and size set to 100000,
you won't get anything between days 1 and 6. Not a single post. Headers only start appearing on day 7.
This is something Spiril will need to look into and explain, but I know he has been working on the erotica server and making many changes. It is possible the indexing system limits the retention when searching that way. I just tested it using a 3-day retention on just the a.b.erotica group, and it returned only about 100 results, all posted within the last minute. Expanding to 10 days, I hit the 30,000-result limit. It looks like there is a gap between 2 minutes ago and 6d14h back.
Still, the NL folders inside my download folders are not deleted.
Please move these folders outside the download tree!
Cases I found where the folder is not deleted until NL exits:
1) If you try to download files that have identical names, either the same file or different files that share a name, though not always.
2) If a part of an archive is damaged, the archive part is created, but the NL temporary folder remains after the download is done.
This behaviour is very annoying: you cannot delete your own folders until NL is stopped.
Is there any chance to handle downloads more intelligently?
This is why I hate updating.
I just updated from 4.0 beta 9 to 4.0 beta 19, and now when I try to resume a download I get an error message every five seconds:
-------------------------------------------
BUG ID: 40019
DATLN1: ksdkja
DATLN2: Unable to create directory
DATLN3:
When a file is done downloading, it doesn't show up in the directory it's supposed to, and NL says it can't access it. I added a screenshot.
That pop-up repeats every ten seconds. It's so frustrating; it's why I haven't been updating the software. But now SuperSearch won't work unless I do.
Is there a way to roll back to my previous version of NewsLeecher until this gets sorted out?
alt.binaries.dvd.classics:
1) With retention set to 1-5 days, I hit search and get just 3 results.
2) If I set 10 days, I get more, but if you check nzbindex.nl you'll find that SS doesn't show anything posted between roughly 7 hours and 10 days ago.
BUT
when you search for a specific title posted in classics between 7 hours and 10 days ago, SS finds it.
Two days ago everything worked just fine. :(
While trying to download certain collections, I keep getting a stability error. With some other collections I have no issues at all.
Bug Data1:
læsafmjls
Bug Data2:
data error
Bug Report
When marking several articles and two collections (the collections that contained the par2 files for the articles, which themselves were not in a collection), this bug happened.
A side effect seems to be that several par2 files were downloaded rather than being added paused to the queue.
Here is the data:
BUG ID: 40019
DATLN1: idjfaoij4
DATLN2: Access violation at address 008A14D5 in module 'newsLeecher.exe'. Read of address 6C696385
DATLN3:
semel wrote: I doubt this is a beta 19 issue...
That, and the other SS-related issues you've mentioned, are fixed now. Let me know if you experience any of them again.
bug fixed. no idea how. hate it when that happens. trying to break it again now. will. not. be. defeated.