I found how to get newsleecher to work.

Posted: Tue Mar 30, 2010 3:24 pm
by pzam
I got version 9 of newsleecher working almost flawlessly now.

I let it extract and create folders

Then I run all its scans to try to extract stuff, but it gets stuck like usual.

but i also run this program http://www.milow.net/public/projects/parnrar.html

at the same time, and just start it now and then to clean up the queue. It seems to get everything newsleecher misses.

I think they have some source code there. The program seems to extract much more smoothly, too. I am not sure how well it error-checks yet.

It is keeping me entertained till you get the next working beta done.

Posted: Fri Apr 02, 2010 4:13 am
by carloX
although this is the place for addressing bugs and deficiencies of newsleecher, let me say - congratulations spiril & co!!!
besides the desirable improvement of the collection-algorithm, nl now is a fine newsreader.
with nl4b16 i never had a crash; it's very fast, and, with the just discovered virtual-groups-functionality, at least for me a very comfortable newsleecher.
though it has been quite a long time to get to this point... thanks.

carloX

Re: Speed limit

Posted: Sat Apr 03, 2010 5:56 am
by elite
japeape wrote:Anyone experiencing speed limitations?

Since B15 I no longer get more than 5 MB/s, which is less than half of what it can be and was before.
I think for me, since Beta 6 or 7, I have not been able to download at high speed. Mostly I am getting about 1/10 of my normal speed (500 KB/s normally, but with the beta I get about 50 KB/s).

However, when I put 3.9 back in, it works beautifully. I currently have two versions of NewsLeecher installed - the 3.9 and the betas. I test the betas as they come, in the hope that one day I can use V4, as it has some cool features I really would like to use.

supersearch collecting doesn't gather srr file

Posted: Sun Apr 04, 2010 7:28 pm
by dgburns
it seems that many things posted to usenet of late have this thing called a .srr file that provides some descriptive text. supersearch does not seem to include this file (when uploaded) when it does its collecting. since many creators of .par2's include this srr in the recoverable files, newsleecher is forced to gather a par2 recovery file and recreate the srr file.

Please include the srr file in supersearch collections so that we don't waste time downloading recovery data and waiting for the file to be recreated. They are very small, yet consume an inordinate amount of resources to reconstruct before NL can do an extract.

recovery errors

Posted: Sun Apr 04, 2010 7:34 pm
by dgburns
I'm not sure what's up, but in both beta 15 and now in beta 16 I am finding many rar sets that NL is unable to reconstruct. The exact same .par2 launched in QuickPar recovers the missing RARs perfectly. No error is displayed; it just sits there consuming memory, CPU and disk space, trying over and over to recreate some missing files. Last night beta 15 consumed 173GB of hard drive space and had created 572 copies each of the 3 RARs missing from a set I downloaded; Repair and Extract had to be force-halted and over 1600 extraneous rars deleted.

I can provide subject of problem posts via PM if desired. I am going back to beta 14 (or earlier).

Re: recovery errors

Posted: Mon Apr 05, 2010 4:53 am
by Smudge
dgburns wrote:I can provide subject of problem posts via PM if desired. I am going back to beta 14 (or earlier).
Yes, please send Spiril a PM with details of the posts that create this issue. This should have been fixed a few versions ago.

Speed limit not honored.

Posted: Mon Apr 05, 2010 9:35 pm
by schekker
I also have speed problems with this release, but they are the opposite of what some others are complaining about. For me NL is downloading too fast.

I have a max speed of about 1 MB/s and set my Speed Limit at 50%, i.e. 512.0 KB/s. I disabled all my connections but one. Even then it is still downloading at max speed instead of limiting the download speed to 512 KB/s.

I have seen it working on other occasions. And this time, after I changed the speed limit to Scheduled with the scheduled speed limit set to 50% for the whole week, it worked again. If I thereafter set the speed limit back to the old value of 50% (so not scheduled anymore), it also worked. But for how long?...
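NewsLeecher's limiter internals aren't public, so as background only: a 50% cap like the one described (512 KB/s on a roughly 1 MB/s line) is commonly enforced with a token-bucket throttle. A minimal sketch, with all names illustrative:

```python
import time

class RateLimiter:
    """Token-bucket throttle: each connection calls throttle() after
    writing data; the call sleeps once the byte budget is exhausted."""

    def __init__(self, limit_bytes_per_sec):
        self.limit = limit_bytes_per_sec
        self.allowance = limit_bytes_per_sec  # start with 1 second's budget
        self.last = time.monotonic()

    def throttle(self, nbytes):
        # Refill the bucket in proportion to elapsed time, capped at
        # one second's worth so idle time can't build up a huge burst.
        now = time.monotonic()
        self.allowance = min(self.limit,
                             self.allowance + (now - self.last) * self.limit)
        self.last = now
        self.allowance -= nbytes
        if self.allowance < 0:
            # Over budget: sleep long enough to pay off the debt.
            time.sleep(-self.allowance / self.limit)
            self.allowance = 0
```

A bug of the kind reported here would fit a limiter that is consulted by only some code paths (e.g. the scheduled path but not the plain one), so toggling the schedule re-attaches it.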

Posted: Tue Apr 06, 2010 6:13 am
by Liam
when I try and connect or update headers the program just crashes.

I am running XP SP3 and just upgraded from Beta 11.

Any idea how to fix?

Posted: Tue Apr 06, 2010 9:45 am
by pzam
Reading everyone's reply on b16
It seems to sort of be working, other than that forever-annoying repair and extract of "par split files", the ones that end in .001, .002, .003.
Those are a special way of creating par files; if I read things correctly, the library they are using does not support it and needs to be updated. The next version of that library does support it, so after that this bug should be fixed.
It is time to bite the Thumbdrive, squeeze out room for this, and see how it behaves on my beast. Hopefully it is the beast tamer. I have a small bucket of multicolored praise awaiting if this works. I reserved a barrel of praise for the par and rar bug being fixed. If I don't report good things by tomorrow, my computer exploded with joy.

beta 9 to beta 16 upgrade went smoothly.

Posted: Wed Apr 07, 2010 12:44 am
by pzam
I backed up everything from b9, then installed b16 over it, figuring I might have to re-enter settings. But I was surprised: everything loaded in fine; it even kept the newsgroups and servers. From what I can tell, every setting I need was still as I configured it in b9.
I did a full header update and it worked fast and without noticeable error. The header speeds seemed more consistent, and the pauses between groups are gone.
There seem to have been some slight visual changes too. One of the things I hoped for is still not there: I hoped they would put a button back on the explorer view for file-name mode. But looking at it now, I think a button to allow sorting by subject or file-name would do it.

I did not turn on repair and extract, as others say it is still broken, so I am still using another utility that works for me.

Well, now to give it a few days' workout. Good job so far, NLTeam - one thumb up, the other is still wavering.

v4b16 "repair extract" stops self extract par of itself

Posted: Wed Apr 07, 2010 4:06 pm
by pzam
v4b16 "repair extract" stops self extract par of itself (Still happening)


If a par file named aaabbb.par is a self-extracting par and the file inside is also named aaabbb.par, newsleecher will get stuck in a loop.

I have these Nerdie Tweaks enabled:
Download keep read open articles
Download skip existing articles.

I need these tweaks on to keep duplicate articles from being downloaded, and so that if I look at a file I can find it again.

I saved the par if you'd like to see it.
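The loop described in this post (an archive whose extracted output carries the same name as the archive itself, so extraction keeps re-queuing its own input) is the kind of thing a re-queue guard prevents. A minimal sketch of such a check, with all names hypothetical:

```python
import os

def safe_to_requeue(archive_path, extracted_path, seen):
    """Return True only if `extracted_path` may be queued for another
    extraction pass. `seen` is a set of names already processed this run."""
    key = os.path.basename(extracted_path).lower()
    if key == os.path.basename(archive_path).lower():
        return False  # archive extracts to itself -> would loop forever
    if key in seen:
        return False  # this name was already handled once this run
    seen.add(key)
    return True
```

Tracking names across the whole run (not just parent vs. child) also catches longer cycles, e.g. a.par extracting b.par which extracts a.par again.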

Posted: Thu Apr 08, 2010 12:35 am
by Sunfox
Well, I just realized I've been posting in the wrong section.

Anyways, I'm not new to NL but I'm new to the 4.0 series. Here's my problem: partially downloaded articles are not saved when you exit/restart NL. This worked fine in 3.9X.

So if you have a file that's 98% completed and exit NL, when it starts back up it begins that file from 0%.

Now you might say this is no big deal... most files are say 15-50 megabytes. However sometimes posts are made with single files hundreds of megabytes big, and losing a good chunk of those is a huge problem.

Myself, I had an issue where one news server was throwing incompletes, and another was unavailable. I queued up about 15 gigabytes of stuff to download. It downloaded about 12 gigabytes of it - however due to incompletes not a single file was actually finished, as I was waiting on that other server to come back. Then I made the mistake of exiting NL. Now I've lost 12 gigabytes of downloaded material and have to start over those downloads entirely from scratch.

This is utterly unacceptable. Article caching worked perfectly fine in 3.9X, and while I love the new collections feature, I just can't believe that such a simple and important feature has been removed.

Re: errror in the log window

Posted: Sat Apr 10, 2010 1:59 am
by ptr727
Lundis2 wrote:I get this error every second.
"10:49:54 ThreadPR Error(121): Access violation at address 0080235A in module 'newsLeecher.exe'. Read of address 00000000"

Everything still seem to work though.

Lundis
Same problem here.

First one of these:
5:52:07 PM ThreadPR Error(121): Access violation at address 10003381 in module 'nlpar.dll'. Read of address 00000000

Then this over and over and over:
5:52:21 PM ThreadPR Error(121): Access violation at address 0080235A in module 'newsLeecher.exe'. Read of address 00000000

Only thing that helps is to remove the problem archive from repair and extract and manually extract using WinRAR.

P.

Posted: Sat Apr 10, 2010 2:17 pm
by gkar
Me too for the ThreadPR error. Circumstances exactly as above.

Youre You're

Posted: Sun Apr 11, 2010 11:28 am
by pzam
I am noticing some names are coming out different between version 9 and now. In version 9 it would put the ' in the name; now it seems to leave it out.

ver 9 You're
ver 16 Youre



It seemed right before and wrong now.