NewsLeecher V4.0 Beta 11

Forum to report beta release bugs and discuss the latest beta releases with other users.
• If reporting a beta release bug, be sure to read the bug reporting guidelines first.
Forum rules
IMPORTANT : Be sure to read the NewsLeecher forums policy before posting.
pzam
Posts: 179
Joined: Fri Jul 18, 2008 8:34 pm

Post by pzam »

I was just thinking, looking at all these bugs they keep adding to and taking out of the program to keep us amused.

They should really keep a list somewhere of each version they have made, how well it worked, and what was actually wrong with it. It would be amusing to see what they are really doing.


It would also be helpful for those who like to grab a mostly working build.

Version 4b9 worked pretty well for me, but that may be down to how I set up my system:
I disable 8.3 short file names in Windows, so file accesses don't each have to make an extra metadata write (see the sketch after this list).
I set my anti-virus not to scan NewsLeecher's temp work areas.
I set search indexing not to index the temp or incoming drives, since their contents keep changing.
I run NewsLeecher across two hard drives, so its temp cache files are on one and the completed files are on another.
When I used XP I had to change a setting for how XP cached things so it could handle more, as I had several drives.
I use 64-bit Vista now to handle the larger drives and such.
I only enable write caching on the drives that hold Windows and the NewsLeecher temp files. On all other hard drives I disable write caching; this helps prevent lost fragments if your system locks up.
I delete all the incoming HTML, exe, com, script, js and passwd files that get slipped into posts to infect machines, and I never open HTML or passwd links.
(If you see something interesting in a newsgroup, Google it first.)
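For reference, a rough Python check of that 8.3 short-name setting (my own sketch, not anything NewsLeecher ships), using the stock Windows fsutil tool:

    import subprocess

    def short_names_disabled():
        # "fsutil behavior query disable8dot3" is the standard query on
        # XP/Vista; the output ends in "1" when short-name creation is off
        out = subprocess.run(
            ["fsutil", "behavior", "query", "disable8dot3"],
            capture_output=True, text=True, check=True,
        ).stdout
        return out.strip().endswith("1")

    # to disable short names (from an administrator prompt):
    #   fsutil behavior set disable8dot3 1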

The main problem I still have is that 2 GB file issue they seem to be working on. The other one is the small PARs. I would prefer they be downloaded after the RARs, so the program isn't scanning for them all the time. If it downloads a RAR and then a PAR2, it handles them one at a time and keeps hard drive access way down. The only advantage to the current order would be if they are planning a feature that downloads more than one file at the same time.
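To illustrate the ordering I mean, a tiny Python sketch (illustrative only, not how NewsLeecher actually queues things):

    def order_queue(filenames):
        # data files first, all PAR2 repair files at the back of the queue
        pars = [f for f in filenames if f.lower().endswith(".par2")]
        rest = [f for f in filenames if not f.lower().endswith(".par2")]
        return rest + pars

    print(order_queue(["a.par2", "a.part01.rar", "a.part02.rar", "a.vol0+1.par2"]))
    # ['a.part01.rar', 'a.part02.rar', 'a.par2', 'a.vol0+1.par2']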

The one other thing, which I don't know if they can ever fix, is their feature to download to a folder of your choice. It works if you keep the download and extract locations the same.
The problem arises if you set the extract folder to a different drive. It will still happily create the requested folders in the download directory, but then it moves the files out of them into the download folder itself, so you end up with a download folder holding loose files. It also only seems to move the RARs. Some thought needs to go into keeping the files together somehow; a sketch of what I'd expect follows.
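For illustration, the behaviour I would expect (my assumption of a fix, nothing official): move the whole per-download subfolder to the extract drive so the RARs and PARs stay together:

    import os
    import shutil

    def move_download(subfolder, extract_root):
        # shutil.move copies across drives when a plain rename isn't
        # possible, and it moves the folder as one unit, so the RARs,
        # PARs and anything else stay together
        dest = os.path.join(extract_root, os.path.basename(subfolder))
        shutil.move(subfolder, dest)
        return dest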
The other thing: when NewsLeecher has finished downloading and is just idling, why doesn't it do a backup? It never fails that I come back, do a get-all-headers after it has been idling, and it crashes. Losing the history of what I selected that session is annoying.

Otherwise I am all smiles, loving how version 4b9 works, wondering and waiting for some good news. But looking back, it seems it will be at least a year or so before they get this fully working.

Does patiently using a buggy program make you virtuous? Oh my, I'm starting to glow. I hope the saints don't start calling my name.

Bob_B
Posts: 39
Joined: Fri Jan 26, 2007 10:43 am

Post by Bob_B »

I am having a problem with the last part of an archive hanging at around 97%. It started recently with V3 and continues with this beta.
Here is my temporary solution:
As an example, say the archive has 15 files, file.part01.rar to file.part15.rar, plus the usual PARs. If it hangs at part15, I go to the download folder and look for the temp file holding the 97-odd percent that has yet to be written out as completed. I make a copy of it and rename that copy file.part15.rar. I then unpause the PARs with the number of blocks required, moving them up in the queue if necessary so they download and the set auto-extracts. If that fails, use QuickPar.
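The same workaround as a small Python sketch (the paths and temp-file name here are made up; use whatever NewsLeecher actually leaves in the download folder):

    import shutil

    temp_file = r"C:\Downloads\file.part15.rar.tmp"  # hypothetical temp name
    target = r"C:\Downloads\file.part15.rar"         # name the PAR2 set expects

    # copy (don't move) the ~97%-complete temp file to the expected name,
    # then let the PAR2s / QuickPar repair the few missing blocks
    shutil.copyfile(temp_file, target)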

fiedler
Posts: 3
Joined: Fri May 15, 2009 7:39 pm

NL 4.0.11 Temp files

Post by fiedler »

The "do not delete" temp files still aren't being deleted by NL. The problem is often connected with the first RAR of a set, and that first one is also often corrupted.

Also, text messages or JPGs downloaded via "open", to be viewed immediately, are sometimes not saved to the appropriate folder but to NLtemp instead, and are deleted when NL closes.

Waiting with optimism and bated breath for Beta 12.

p.vanderwal
Posts: 22
Joined: Sat Sep 03, 2005 10:25 am

NL 4.0v9: Download time to go shows 133343:54:46 instead of a few minutes ... how come?

Post by p.vanderwal »

All of a sudden NL 4.0v9 is showing the remaining download time as 133343:54:46 instead of a few minutes, as it did before. That is not very convenient. I did not change anything. What went wrong? How can I set it back to minutes? Who knows an answer?
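For what it's worth, an hour count that large usually comes from dividing the bytes remaining by a measured speed of (nearly) zero. A rough Python sketch of the failure mode and a guard against it; this is a guess, not NewsLeecher's actual code:

    def eta_hms(bytes_left, bytes_per_sec):
        # guard the divisor: a stalled connection reports ~0 bytes/sec,
        # and dividing by that produces absurd ETAs like 133343:54:46
        secs = bytes_left / max(bytes_per_sec, 1)
        h, rem = divmod(int(secs), 3600)
        m, s = divmod(rem, 60)
        return f"{h}:{m:02}:{s:02}"

    print(eta_hms(50_000_000, 250_000))  # 0:03:20
    print(eta_hms(50_000_000, 0))  # 13888:53:20, finite; real code should say "stalled"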

Smudge
Site Admin
Posts: 10034
Joined: Tue Aug 17, 2004 1:42 am

Post by Smudge »

p.vanderwal wrote:All of a sudden NL 4.0v9 is showing the remaining download time as 133343:54:46 instead of a few minutes, as it did before. That is not very convenient. I did not change anything. What went wrong? How can I set it back to minutes? Who knows an answer?
I think that was one of the small bugs that has been fixed in a later version. Do you still have the problem with the b11 version?
Please be aware of and use the following pages...
Services Status Page : SuperSearch and Usenet Access server status, retention, server load, indexing time, etc.
Support Request Form : Use this if you have a problem with billing or account status. The forum is only for NewsLeecher application issues.

p.vanderwal
Posts: 22
Joined: Sat Sep 03, 2005 10:25 am

Post by p.vanderwal »

Smudge wrote:
p.vanderwal wrote:All of a sudden NL 4.0v9 is showing the remaining download time as 133343:54:46 instead of a few minutes, as it did before. That is not very convenient. I did not change anything. What went wrong? How can I set it back to minutes? Who knows an answer?
I think that was one of the small bugs that has been fixed in a later version. Do you still have the problem with the b11 version?

Thanks for your quick reply. Yes, it's the same in v11.

p.vanderwal
Posts: 22
Joined: Sat Sep 03, 2005 10:25 am

Post by p.vanderwal »

p.vanderwal wrote:
Smudge wrote:
p.vanderwal wrote:All of a sudden NL 4.0v9 is showing the remaining download time as 133343:54:46 instead of a few minutes, as it did before. That is not very convenient. I did not change anything. What went wrong? How can I set it back to minutes? Who knows an answer?
I think that was one of the small bugs that has been fixed in a later version. Do you still have the problem with the b11 version?

Thanks for your quick reply. Yes, it's the same in v11.
And now it is suddenly normal again ... I went back to v9, normal there too; reinstalled v11, still normal ... for the time being, that is.
I don't know what is going on, but thanks anyway for your attention, and keep up the good work.

pzam
Posts: 179
Joined: Fri Jul 18, 2008 8:34 pm

NL 4.0v9 crashes when updating headers, then deletes all PAR2s on restart

Post by pzam »

NL 4.0v9: I hope this gets fixed.

I don't know if you have found or fixed this bug, but I thought I would post it. It only seems to happen very rarely.

Sometimes, when I have had it running a while and then do a header update, it gets some error. I always miss writing it down, thinking it is one of the usual errors, but something different is happening.

It goes like this: I press send/receive headers, and a second later the error happens, at the point where it usually does when you send/receive and get an error.
The difference is that usually you just press Enter and it goes on and finishes getting headers. But I figure it is already doing something wrong at that point, so I usually stop it and start it again.

#########################################
What happens on the rare occasions is this: when I press send/receive headers, it gets the error after a second, I press Enter, and it immediately starts cleaning the cache and deleting all the PAR2s.

It totally deletes every PAR2 for every file on the repair/extract list.

###########################################
Then I have to download them all again.

I had the option set to clean up PAR2 files; I have turned it off now to see what that does.

thedoc31
Posts: 13
Joined: Tue Apr 12, 2005 5:58 am
Location: Salt Lake City, UT

Collections do not always expand/collapse when clicked

Post by thedoc31 »

Spiril,

I'm using 4b11 on Vista 32-bit and I've noticed that collections don't always expand properly when I click the + next to them. I've noticed this mostly after scrolling through a list in SuperSearch, shift- or ctrl-clicking a bunch of articles, then trying to expand the list so I can pick out one or two PAR files. It's inconsistent: sometimes it works every time I click the +, and sometimes it doesn't, which makes it all the more frustrating. When it does not expand properly and I click the + again, I lose my existing selection once the list finally expands.

I thought I saw a bug posted for something similar to this back in the early 4.0 betas, but I wanted to make sure you know this bug (or one like it) may still exist in 4b11.

pzam
Posts: 179
Joined: Fri Jul 18, 2008 8:34 pm

Post by pzam »

I think I finally ran into what you are trying to fix. I am still using 4b9.
What I noticed: most of the time I manually browse through the articles in the explorer view and select the files I want. I mark all the junk messages as old and have it set so old messages don't show. I never have it mark all messages as old. I wish there were two marks: spam and old.

The repair and extract queue had crashed earlier on some file it got stuck on, and I had to press stop and clear the list, exit the program, start the program, clear the error, exit the program, start the program, drag all the PAR2s from the download dir back to the repair/extract list, start repair/extract, quickly remove the file that locked it up from the list, clear all the finished items, exit the program, start the program, start downloading again, and start repair/extract again. Now it will keep running until the next time it hits a bad file.

After that I had a large file of about 20 GB downloading that I had added to the list, plus several smaller files.

What happened next was something I had not seen before, or may not have noticed. I went to the articles explorer and, instead of selecting Leech on the chosen files as usual, I thought I would be lazy and just press Open on several NZB files, each covering groups of 10 to 100 smaller files. After I opened a few, it started getting a bit slower, but it was still running.
The next part I am not clear on: I was typing some filters in the articles explorer, then double-clicked to clear it, and it seemed to be locked up. The drives were working away. NewsLeecher was using about 500,000 K of RAM and more CPU than usual. I stopped most other programs, antivirus and such, to see if it would come out of it, but after quite a while I terminated it.
I suspect it was stuck doing something in the cache. I hope whatever you are trying to figure out takes care of this.


I think I finally understand how you changed the PAR2 handling, and I am having better success with it.

Smudge
Site Admin
Posts: 10034
Joined: Tue Aug 17, 2004 1:42 am

Post by Smudge »

pzam wrote:I wish there were two marks: spam and old.
For spam, use shift-delete and it will purge the message from the cache. A normal delete will mark it as old but not remove it from the cache, in case you wish to view old messages.
Please be aware of and use the following pages...
Services Status Page : SuperSearch and Usenet Access server status, retention, server load, indexing time, etc.
Support Request Form : Use this if you have a problem with billing or account status. The forum is only for NewsLeecher application issues.

alexps
Posts: 30
Joined: Tue Jan 04, 2005 10:36 pm

Unpause doesn't work when '+' files collapsed

Post by alexps »

Hi. I posted this before, when a previous beta broke it. It was fixed in a subsequent beta, and then this beta (and the previous one) has broken it again. I also posted it here for this beta with no response.

In short, Pause/Unpause no longer works for queued files unless they are expanded. I manage my queue by pausing files I don't want to download yet (I have a peak/off-peak traffic allowance). Then, when I want to, I unpause them (select, right-click). This used to unpause ALL the files listed under a single queued header, but now it doesn't. Again. I have to click on all the '+' signs, select all the parts of each queued file I want to download, then unpause each one. Quite a pain with 200+ queued files of 100 parts each.
Please please PLEASE fix this one soon; it's very tedious to work around. And you fixed it before, then broke it again!
Thanks

Smudge
Site Admin
Posts: 10034
Joined: Tue Aug 17, 2004 1:42 am

Re: Unpause doesn't work when '+' files collapsed

Post by Smudge »

alexps wrote:Hi. I posted this before, when a previous beta broke it. It was fixed in a subsequent beta, and then this beta (and the previous one) has broken it again. I also posted it here for this beta with no response.

In short, Pause/Unpause no longer works for queued files unless they are expanded. I manage my queue by pausing files I don't want to download yet (I have a peak/off-peak traffic allowance). Then, when I want to, I unpause them (select, right-click). This used to unpause ALL the files listed under a single queued header, but now it doesn't. Again. I have to click on all the '+' signs, select all the parts of each queued file I want to download, then unpause each one. Quite a pain with 200+ queued files of 100 parts each.
Please please PLEASE fix this one soon; it's very tedious to work around. And you fixed it before, then broke it again!
Thanks
I think you are a bit confused about how pausing works with collections. I know because I was confused too, until Spiril explained it to me. I just tried it with 4.0b11 and it still works the same way.

When you pause a collection with the main collection entry selected, it pauses the downloads of all files within that collection, but it doesn't show each file as paused. Each file keeps its own paused/queued state. For example, the PAR2 files may be individually paused while the collection itself is queued; in that case the queued files within the collection download, but the paused ones do not.

If you expand the collection, select all the files, and then click the pause button, each file in the collection is changed to a paused state. That is not what you want to do, because then you have to unpause them all to get them to download.

What you want to do is leave the files within the collection alone and just pause the main collection entry. This pauses everything inside, and nothing downloads until you are ready. Don't mess with the individual files within a collection.

I agree that the individual files stating they are Queued even though the collection is paused is a bit confusing. Perhaps Spiril can change the Queued status to "Queued (Coll.Pause)" or something else that would show the master collection is paused, so it is clear the files will not be downloaded.
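As a minimal model of that two-level state (a sketch for illustration, not NewsLeecher internals):

    from dataclasses import dataclass, field

    @dataclass
    class QueuedFile:
        name: str
        paused: bool = False  # per-file state, shown in the queue

    @dataclass
    class Collection:
        paused: bool = False  # collection-level pause
        files: list = field(default_factory=list)

        def downloadable(self):
            # a paused collection downloads nothing, even files that
            # still display as Queued; otherwise only unpaused files go
            if self.paused:
                return []
            return [f for f in self.files if not f.paused]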
Please be aware of and use the following pages...
Services Status Page : SuperSearch and Usenet Access server status, retention, server load, indexing time, etc.
Support Request Form : Use this if you have a problem with billing or account status. The forum is only for NewsLeecher application issues.

capntripz
Posts: 78
Joined: Sun May 14, 2006 11:05 pm

Post by capntripz »

I have found that downloading just a few headers in a group and then wiping the cache for that group (right-click on the group) fixes the problem. The next time you get headers, everything will be in sync. Annoying, but at least there seems to be a workaround, at least in my case.

Capn.
