File size of groups of files sometimes way too small when downloaded

Jed
Posts: 19
Joined: Thu Mar 18, 2010 11:31 am

Post by Jed » Mon Jan 03, 2011 5:49 pm

Sometimes a group of files will look OK in the file list, but when they're downloaded, all of the files come out the same size and very small (220KB or 330KB). I'm using Giganews, and the problem isn't with the server: I just used Agent to download several hundred files that NewsLeecher was having trouble with. Judging from the file sizes, I'd guess that only the first part of each file is being saved, although it looks like NewsLeecher is downloading all of the parts. The latest files that I had a problem with were posted over a couple of months, but they all started with the same text string, so perhaps something in the file names is causing the problem.

Running NewsLeecher 4.0 Final under Windows 7.

Smudge
Site Admin
Posts: 10034
Joined: Tue Aug 17, 2004 1:42 am

Post by Smudge » Tue Jan 04, 2011 12:11 am

Please turn on detailed logging, then check the News:Log tab after trying to connect or download. The log should give you details of the error.

For 4.0x versions, open the Settings>Advanced Nerdy Tweaks screen. Set the "Logging Detailed Download" and "Logging Detailed Download Error" options to true.
Please be aware of and use the following pages...
Services Status Page : SuperSearch and Usenet Access server status, retention, server load, indexing time, etc.
Support Request Form : Use this if you have a problem with billing or account status. The forum is only for NewsLeecher application issues.

Jed
Posts: 19
Joined: Thu Mar 18, 2010 11:31 am

Post by Jed » Tue Jan 04, 2011 2:20 pm

All five of the downloaded files are 220KB, while the sizes shown in the file list range from 7MB to 40MB. The forum software complained about lines being too long when I tried to attach the log file, so I put it at
http://theunicorn.org/NewsleecherLog.txt
The same files downloaded just fine using Agent from the same Giganews account.

Smudge
Site Admin
Posts: 10034
Joined: Tue Aug 17, 2004 1:42 am

Post by Smudge » Wed Jan 05, 2011 7:42 am

I was able to download the same PDFs without problems, and your log looks exactly the same as mine, with no errors. The final file was about 5MB on disk and about 7MB in the SuperSearch results.

Your log says the news server is localhost. Are you using the Giganews Accelerator program? Perhaps it is causing a problem, so try downloading the PDFs without the Accelerator. It isn't needed anymore anyway, since NewsLeecher supports compressed headers from GN and the Accelerator doesn't help with article downloads.

Jed
Posts: 19
Joined: Thu Mar 18, 2010 11:31 am

Post by Jed » Wed Jan 05, 2011 11:16 am

Disabling the Giganews Accelerator didn't make any difference. I tried an XP box and it worked fine there. The system with the problem is running Windows 7 Ultimate under VirtualBox on Debian Linux Lenny. It looks like everything is being downloaded correctly but the extraction is breaking for some reason. Is there any way to turn off extraction and save all of the parts to disk?

Another problem that I've had is that very occasionally a file will get saved with the wrong name. I suspect that this has the same cause.

Thanks for the help.

Smudge
Site Admin
Posts: 10034
Joined: Tue Aug 17, 2004 1:42 am

Post by Smudge » Thu Jan 06, 2011 6:29 am

Those article segments are not saved to disk anymore. The 4.0 Beta 10 release notes state: "uuEncoded articles are now kept in a memory cache until all article parts are downloaded, whereafter the cache is flushed directly to the download destination folder."

So for some reason, either your system is not assembling the segments properly, the segments are being corrupted in memory, or the save to disk is failing partway through.
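To make the failure mode concrete, the cache-then-flush scheme quoted from the release notes can be sketched as follows. This is a hypothetical illustration only, not NewsLeecher's actual code; the class and method names are invented. It shows why a bug at the flush stage (e.g. writing only the first segment) would yield exactly the symptom reported: every file ending up small and the same size.

```python
# Hypothetical sketch of the "keep parts in a memory cache, flush once
# complete" download scheme. NOT NewsLeecher's real implementation.

class ArticleCache:
    def __init__(self, total_parts):
        self.total_parts = total_parts
        self.parts = {}  # part number -> decoded bytes

    def add_part(self, number, data):
        # Parts may arrive out of order from multiple connections.
        self.parts[number] = data

    def is_complete(self):
        return len(self.parts) == self.total_parts

    def flush(self, path):
        # Write every part in order. If this loop stopped after the
        # first part, each saved file would be one segment long --
        # small and uniform in size, as in the bug report.
        with open(path, "wb") as f:
            for n in sorted(self.parts):
                f.write(self.parts[n])

# Usage: three parts arriving out of order, then a single flush.
cache = ArticleCache(total_parts=3)
for n, chunk in [(2, b"bb"), (1, b"aa"), (3, b"cc")]:
    cache.add_part(n, chunk)
if cache.is_complete():
    cache.flush("example.bin")
```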
