NewsLeecher V4.0 Beta 9

Forum to report beta release bugs and discuss the latest beta releases with other users.
• If reporting a beta release bug, be sure to read the bug reporting guidelines first.
Forum rules
IMPORTANT : Be sure to read the NewsLeecher forums policy before posting.
feuniks
Posts: 3
Joined: Mon Sep 26, 2005 6:10 pm

Post by feuniks »

I think there is a memory leak in the program. I didn't spot it in the previous betas, so I think this is new to this version. Gradually, newsleecher.exe consumes more and more memory until it reaches 4 GB. At that point the program crashes.

rjulie
Posts: 14
Joined: Sun Jun 26, 2005 6:06 am

Post by rjulie »

CTRL-R "Read article" doesn't work for me. I try to read an NFO and it doesn't open :-(

In 4.0 Beta 7 it worked fine!

Code: Select all

07:02:34  * SS Connect Info : 174.37.197.132:119 (ssgen.getmyip.com)
07:02:35 Fetched article download data from the SuperSearch service.
07:02:35 Added 1 article to the NewsLeecher transfer queue in 47ms, followed by an automatic queue backup, which was executed in 32ms.
07:02:36 Downloaded Article "[Mike post HRS] - "Mike post HRS.409.vostf.nfo"".


amaze
Posts: 6
Joined: Tue Aug 04, 2009 9:45 pm

Post by amaze »

Smudge wrote:
amaze wrote:
Smudge wrote:Can you post more of the log showing where it was trying to move the files?
As I have the same problem, here is my log:

9:24:36 AM * Unable to copy/move file. Returned Error "The system cannot find the path specified".
9:24:36 AM Src. : <L:\xxxxxxx.xxxxxx.xxx>
9:24:36 AM Dst. : <L:\!RnE - 2009.11.15 09.24.36 -*****-**\*****-**-xx.xx>
9:24:36 AM * Any delete action for Repair'n'Extract set, has been skipped due to errors.
9:24:36 AM * ERROR (Unable to copy/move file. Returned Error "The system cannot find the path specified".)

But it is not all of them; just some have the problem.
Disable the "Date and Time Stamp" option in the RnE settings.
Done that; same problem. It does extract the files, but does not delete them.

User avatar
Smudge
Site Admin
Posts: 10034
Joined: Tue Aug 17, 2004 1:42 am

Post by Smudge »

Please turn on detailed logging, try a download and RnE procedure and if it fails, please send the entire unedited log to support@newsleecher.com

For 3.9x versions, it is in the Options>Advanced>Logging screen. Leave "Only for failed downloads" unchecked.

For 4.0x versions, open the Settings>Advanced Nerdy Tweaks screen. Set the logging_detailed_download and logging_detailed_download_error options to true.
Please be aware of and use the following pages...
Services Status Page : SuperSearch and Usenet Access server status, retention, server load, indexing time, etc.
Support Request Form : Use this if you have a problem with billing or account status. The forum is only for NewsLeecher application issues.

pzam
Posts: 179
Joined: Fri Jul 18, 2008 8:34 pm

Lazy Par2's

Post by pzam »

Somewhere in the last few betas, you seem to have changed the PAR2 cleanup again.


I used to be able to unpause PAR2 files up to a size I set, but that option seems to be gone now. Every time a download finishes, it will not extract some of the files because it only sees the 0-size PAR2. The way I got it to work before was to set it to always unpause PAR2s that were less than 4k; this also usually gave just a tiny bit of a repair file in case it was needed.

The way it is now, I have to go to the queue and unpause each file, one by one, until I get them all going again.

Please put the "unpause PAR2s under ??K" option back.
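The rule pzam is asking for can be sketched in a few lines. This is purely illustrative pseudologic, not NewsLeecher's actual internals: `QueueItem` and the 4 KB default are hypothetical stand-ins for whatever the real queue uses.

```python
from dataclasses import dataclass

@dataclass
class QueueItem:
    name: str
    size_bytes: int
    paused: bool = True

def unpause_small_par2(queue, threshold_kb=4):
    """Unpause every .par2 entry at or below the size threshold,
    so the small index PAR2 (and a sliver of repair data) always
    downloads, while the big recovery volumes stay paused."""
    for item in queue:
        if item.name.lower().endswith(".par2") and item.size_bytes <= threshold_kb * 1024:
            item.paused = False
    return queue
```

With a 4 KB threshold, the tiny index PAR2 is fetched automatically while the large `.volNN+NN.par2` recovery volumes remain paused until a repair actually needs them.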

Antonin
Posts: 86
Joined: Mon Aug 01, 2005 12:51 am

Post by Antonin »

mi643 wrote: not really bugs, but.

I hate the reset button in leech specify!
Absolutely agreed. IMHO, it serves no purpose. Leech specify should always assume the folder that corresponds to the requested group. Or, just browse to the folder that's needed.

doppler
Posts: 152
Joined: Fri Apr 01, 2005 4:19 pm

new problem with turbo headers

Post by doppler »

New problem, yesterday and today. Usenetserver, which works for turbo headers, did something strange.

After starting an update, I noticed a group began downloading all headers in the group (non-turbo), starting from the beginning! Nothing would start or download in the headers. Yesterday I stopped it and restarted, and it resumed at the normal spot in turbo. Today it restarted the entire group in turbo. Different groups each time, not the same groups.

carloX
Posts: 38
Joined: Thu Nov 19, 2009 7:09 am

Post by carloX »

Hi,
guess I found some bugs in NL 4 Beta 9:

1. Closing NewsLeecher while NL is downloading headers from a big group (e.g. boneless) results in an app crash. To close the program without crashing, you have to stop downloading headers first.

2. Opening the search filter list: clicking the first time, the list crashes. The second time it opens and stays open.

3. Increasing the search filter list from the default to e.g. 40 doesn't work; it always lists only 16 items.

4. The NL icon in the tray is not resizable. Trying to change the size results in a small blue bar overlapping the icon.

Nevertheless, a BIG thanks to the developers for this version.
carloX

system: win 7, 64 bit / nvidia 8600gt graphics (driver v190.62)

User avatar
Ironside
Posts: 131
Joined: Tue Jul 12, 2005 11:36 am
Location: Melbourne, Australia

Post by Ironside »

Two observations with v4beta9

1) On occasion, after I have downloaded the files in the queue and restarted, the NZB file has been reloaded and the files are waiting to be downloaded again. Repair/Extract has no knowledge of the previous download, even though it did download/unrar.

2) Possibly related to 1): I am now seeing the warning below appearing at times in this version.

[Image: warning message]

amaze
Posts: 6
Joined: Tue Aug 04, 2009 9:45 pm

Post by amaze »

Smudge wrote:Please turn on detailed logging, try a download and RnE procedure and if it fails, please send the entire unedited log to support@newsleecher.com

For 3.9x versions, it is in the Options>Advanced>Logging screen. Leave "Only for failed downloads" unchecked.

For 4.0x versions, open the Settings>Advanced Nerdy Tweaks screen. Set the logging_detailed_download and logging_detailed_download_error options to true.
it is here by done

bbarker
Posts: 43
Joined: Fri May 29, 2009 6:18 am

Post by bbarker »

I've begun to notice a lot of .SRR files that are not being included in the file collections, even though they seem to be named the same as the other pieces.
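One plausible cause for what bbarker describes is collection grouping that filters on a fixed extension whitelist. The sketch below is hypothetical, not NewsLeecher's actual code: `KNOWN_EXTS` and `filter_collection` are illustrative names, showing how a whitelist that omits `.srr` would silently drop those files from a collection.

```python
# Hypothetical extension whitelist; ".srr" being absent here would
# reproduce the symptom (SRR files excluded from collections).
KNOWN_EXTS = {".rar", ".par2", ".nfo", ".sfv"}

def filter_collection(filenames):
    """Split files into those the grouper recognizes and those it drops,
    based purely on the filename extension."""
    kept, dropped = [], []
    for name in filenames:
        ext = "." + name.lower().rpartition(".")[2]
        (kept if ext in KNOWN_EXTS else dropped).append(name)
    return kept, dropped
```

Under this model, identically named pieces would still be excluded if their extension is not on the list, which matches the reported behavior.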

User avatar
Ironside
Posts: 131
Joined: Tue Jul 12, 2005 11:36 am
Location: Melbourne, Australia

Post by Ironside »

amaze wrote:
it is here by done
What does that mean?

acousticrand
Posts: 6
Joined: Tue Jan 30, 2007 8:39 pm

Post by acousticrand »

I am still seeing changed behavior in Article View:
Shift-PgUp/Shift-PgDn no longer highlights/selects, it does nothing.

amaze
Posts: 6
Joined: Tue Aug 04, 2009 9:45 pm

Post by amaze »

Ironside wrote:
amaze wrote:
it is here by done
What does that mean?
I was asked to send a log file.

User avatar
Ironside
Posts: 131
Joined: Tue Jul 12, 2005 11:36 am
Location: Melbourne, Australia

Post by Ironside »

Ironside wrote:Two observations with v4beta9

1) On occasion, after I have downloaded the files in the queue and restarted, the NZB file has been reloaded and the files are waiting to be downloaded again. Repair/Extract has no knowledge of the previous download, even though it did download/unrar.

2) Possibly related to 1): I am now seeing the warning below appearing at times in this version.

[Image: warning message]
After a heck of a lot of testing, it looks like the error I listed as number 2 is caused by the newest version of Kaspersky 2010. (Still evaluating.)

As for error number one, I'm still looking into it, as it was random.

Post Reply