NewsLeecher V4.0 Beta 10

Forum to report beta release bugs and discuss the latest beta releases with other users.
• If reporting a beta release bug, be sure to read the bug reporting guidelines first.
Forum rules
IMPORTANT: Be sure to read the NewsLeecher forums policy before posting.
quickly
Posts: 109
Joined: Mon Sep 12, 2005 4:00 pm

missing buttons

Post by quickly »

Where are the buttons for "show text files" and "incomplete files"?
Why have such important features been removed?

BlueSteel
Posts: 17
Joined: Thu Jan 15, 2009 3:52 pm

Post by BlueSteel »

Mirx wrote: Is this a selectable option???

I ask because I only have 1 internal HD installed and I use a RAM disk as my temporary download directory. If I cannot split the temporary and download directories, this means I cannot use this configuration anymore, right?
If so, everything will need to go to the HD instead of being stored in memory, and it will slow things down instead of speeding them up.
This goes hand in hand with my suggestion of adding a configurable memory buffer to NL. If there was an option to buffer x MB of data in memory before it gets flushed to disk, there would not be any need to use workarounds like a RAM disk to reduce HDD thrashing.
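
Something along these lines would do it - just a rough Python sketch of the idea, not NewsLeecher's actual code (the class name and the 32 MB default are made up):

    import io

    class BufferedArticleWriter:
        """Collect decoded article data in memory and flush it to disk
        in one large sequential write once the buffer reaches a set size."""

        def __init__(self, path, buffer_mb=32):
            self.path = path
            self.limit = buffer_mb * 1024 * 1024
            self.buf = io.BytesIO()

        def write(self, data):
            self.buf.write(data)
            if self.buf.tell() >= self.limit:
                self.flush()

        def flush(self):
            # One big append instead of thousands of small writes,
            # which is what causes the HDD thrashing.
            with open(self.path, "ab") as f:
                f.write(self.buf.getvalue())
            self.buf = io.BytesIO()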

Good to see that I am not the only one who is a bit annoyed by the amount of HDD usage NL generates.

(On the same note, a lower HDD access priority for RnE operations would be very nice; my system usually stalls while my downloads get extracted from my temp HDD to my work/storage HDD.)

p0W3Rh0u5e
Posts: 176
Joined: Thu Apr 21, 2005 6:46 pm

Post by p0W3Rh0u5e »

BlueSteel wrote: This goes hand in hand with my suggestion of adding a configurable memory buffer to NL. If there was an option to buffer x MB of data in memory before it gets flushed to disk, there would not be any need to use workarounds like a RAM disk to reduce HDD thrashing.
Yep, this kind of option is really needed, especially with the new decoding engine.

I was using a RAM disk for a long time, but a few months ago I added a RAID controller to my system, which has 512 MB of RAM and can use it as a write cache. With that option enabled, I didn't need the RAM disk anymore - no more write pauses in NL, at least not many of them. ;)

But the new decoding/writing engine works differently; I see multiple write pauses while downloading a single(!) 100 MB file. It looks like NL is getting around the controller's cache.
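
Pure speculation, but if the new engine forces every chunk out to disk before moving on, something like this would explain the pauses (Python sketch; whether it really punches through a controller's write-back cache depends on the controller, but it stalls the writer once per chunk either way):

    import os

    def write_chunk(f, data):
        f.write(data)
        f.flush()              # push the chunk out of the process buffer
        os.fsync(f.fileno())   # block until the OS reports it committed
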
Good to see that I am not the only one who is a bit annoyed by the amount of HDD usage NL generates.
Believe me, you're not alone. ;)

acousticrand
Posts: 6
Joined: Tue Jan 30, 2007 8:39 pm

Post by acousticrand »

theprimal wrote: I reported this with b9 already; no one ever got back to me on whether it is a bug or not. Nothing major, but it really bothers me:

Keyboard shortcuts: Shift + Page Up/Page Down no longer marks a page of articles. Instead NewsLeecher tries to scroll left/right (provided there is a scrollbar).
I'll throw a "Me Too" in here; I reported this issue on B8 and B9.
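
For the record, the old (correct) behaviour is easy to pin down; here's a little Python mock-up of it, with made-up names, just to be precise about what we expect:

    class ArticleView:
        """Minimal stand-in for the article list, only to show the logic."""
        def __init__(self, rows_per_page=40):
            self.rows_per_page = rows_per_page
            self.cursor = 0
            self.selected = set()

        def move_cursor(self, step):
            self.cursor = max(0, self.cursor + step)

        def extend_selection(self, step):
            start = self.cursor
            self.move_cursor(step)
            lo, hi = sorted((start, self.cursor))
            self.selected.update(range(lo, hi + 1))   # mark the page

    def on_key(view, key, shift_held):
        # Shift + PageUp/PageDown should extend the article selection
        # by one page, never turn into a horizontal scroll.
        if key in ("PageUp", "PageDown"):
            step = view.rows_per_page if key == "PageDown" else -view.rows_per_page
            (view.extend_selection if shift_held else view.move_cursor)(step)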

the doctor
Posts: 250
Joined: Wed Apr 22, 2009 12:40 am
Location: DeLand, Florida USA

Re: Turbo Headers d/l very slow

Post by the doctor »

the doctor wrote: 300,000 headers taking 8 min to download, at a speed of 3.4 Kb/s.

Started happening when I updated to v4b10 - I also tried the forum release download - still the same.
This problem seems to have cleared up - perhaps it was a problem with my news server that just coincidentally happened around the same time I updated to V4b10.

coagulant
Posts: 1
Joined: Fri Dec 04, 2009 1:11 am

Post by coagulant »

acousticrand wrote:
theprimal wrote: I reported this with b9 already; no one ever got back to me on whether it is a bug or not. Nothing major, but it really bothers me:

Keyboard shortcuts: Shift + Page Up/Page Down no longer marks a page of articles. Instead NewsLeecher tries to scroll left/right (provided there is a scrollbar).
I'll throw a "Me Too" in here; I reported this issue on B8 and B9.
Agreed - this regression in functionality is a real hit to usability - please fix!! :cry:

bigsid05
Posts: 5
Joined: Sat Feb 04, 2006 5:16 am

Post by bigsid05 »

The new beta has introduced a bug where image files download corrupted. Not sure what the specific issue is, but downgrading to beta 9 fixed the problem. I'm running Win 7 x64, btw.

pzam
Posts: 179
Joined: Fri Jul 18, 2008 8:34 pm

Pars after rars = less collateral damage and more functionality

Post by pzam »

I suggest again putting the option back in to not download any pars until the rars are done, and only then download the little one.

This will stop the system from beating the drives to death, and when NewsLeecher crashes there will be a lot fewer things to clean up.


Note that it can extract faster if it is not scanning lots of file sets that are missing parts anyway and waiting for them to download.


This feature was in NewsLeecher up until a few betas ago, but it got lost.
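
The ordering itself is trivial; a rough Python sketch of what I mean (the shortest-name trick for spotting the index par2 is just a heuristic):

    def order_queue(filenames):
        # Queue all the rars first, then ONLY the small index par2.
        # The big vol*.par2 recovery files should stay out of the queue
        # until verification actually reports missing blocks.
        rars = [f for f in filenames if not f.lower().endswith(".par2")]
        pars = sorted((f for f in filenames if f.lower().endswith(".par2")),
                      key=len)   # the index par2 has the shortest name
        return rars + pars[:1]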

cobrala
Posts: 21
Joined: Mon Jan 23, 2006 3:26 am

Disappointed with b10

Post by cobrala »

Well, as much as I praised you for the massive performance gains in b9 (it performed the way it always used to, long ago), b10 is quite a stiff step backwards. :(

D'ling articles takes longer and my HDD is thrashing like a mofo. :cry: It's seriously affecting system responsiveness - Win7 goes unresponsive until it finishes WTH it's doing.

I'm going to go back to Beta 9.

flanders
Posts: 36
Joined: Mon Feb 27, 2006 9:54 pm
Location: France

Post by flanders »

GREAT job on the performance increase as far as I'm concerned.
Fetching articles from SuperSearch takes 10 times less time than in b9!
And I don't get the CPU hogging when pausing or when NL is idle.
Very nice!
p0W3Rh0u5e wrote: Another bug: temp download files ("!! newsleecher download - do not delete - -????????? !!") aren't deleted when you remove files from the queue while they're downloading.
I have this problem as well though... and the files are not deleted when NL is closed either.
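
Until it's fixed, the leftovers are easy to clean up by hand. A small Python snippet (assuming all the temp files start with that marker; point temp_dir at your own folder and double-check the matches before deleting anything):

    import glob, os

    temp_dir = r"D:\downloads\temp"   # your temp download directory
    pattern = "!! newsleecher download - do not delete -*"

    for path in glob.glob(os.path.join(temp_dir, pattern)):
        os.remove(path)
        print("deleted:", path)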

morv
Posts: 9
Joined: Fri Dec 04, 2009 6:15 pm
Location: Netherlands

Post by morv »

I get an error that has existed in the last few releases. I hoped it would be solved in one of the betas, but I suppose it's not a common error, since it hasn't been solved already.

"Repair'n'Extract could not create destination folder"

I have this error in Windows Vista as well as Windows 7.

It occurs only when downloading files that end up as a whole movie. (It also happens when the movie file (avi, mpg, etc.) is split into parts (file.avi.000, file.avi.001, etc.).)
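
As a workaround you can glue the pieces together yourself; a quick Python sketch (assuming the parts are named file.avi.000, file.avi.001, and so on, in the current folder):

    import glob, shutil

    parts = sorted(glob.glob("file.avi.[0-9][0-9][0-9]"))
    with open("file.avi", "wb") as out:
        for part in parts:
            with open(part, "rb") as src:
                shutil.copyfileobj(src, out)   # append each piece in order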

NewsLeecher works correctly with archive files (rar, zip, etc.); they are extracted to a sub-folder as they should be.

I hope you can solve this. Please send me a PM for further info if needed!

Mark


tommo123
Posts: 123
Joined: Wed Oct 19, 2005 4:43 pm

Post by tommo123 »

I still have the same headers fault. Really, *really* annoying, which means I use SuperSearch to find everything in a specific group, then search again and again to narrow things down - repeat for the next group - several times a day.

If others do this too, I can't see SS staying up for long.

pzam
Posts: 179
Joined: Fri Jul 18, 2008 8:34 pm

Post by pzam »

I don't think I will try this one - too many people grieving. Hope the next version is better.

I think I figured out why some files don't extract. NewsLeecher is stupid and downloads all the little pars way before it needs them. If you send 300 files to the queue, it puts 300 little pars in the download directory, then it starts parsing them and beating the hard drive to death. The silly thing is that it is only downloading 1 file at a time, so it only needs 1 par at a time.


I vote stop abusing my computer. 1 par per file at a time.


Let your binary voices be heard.

pgjensen
Posts: 9
Joined: Wed Oct 31, 2007 1:59 am

Post by pgjensen »

"Skip existing files" no longer works for me, and I think someone reported the same problem in b9. It downloads the full file into the temp "!! newsleecher !!" file, and it just stays there.

jdgdcg
Posts: 129
Joined: Sun Jun 28, 2009 3:08 pm

Post by jdgdcg »

With the latest update, beta 10, when I download a .avi file without parts - a direct video file - the downloaded file is broken.

Does anyone have the same problem?
