NewsLeecher v5.0 Beta 16 not downloading headers from HUGE groups

eurodude
Posts: 9
Joined: Sat Apr 23, 2005 11:16 am

NewsLeecher v5.0 Beta 16 not downloading headers from HUGE groups

Post by eurodude »

Version: Beta 16 (RC1)
Operating System w/ Service Pack Version: Windows XP SP3
Installation Method: Full
Type of Installation: Beta
Number of Servers: 1
List your provider(s)? Usenetserver.com
List of antivirus/firewall software installed: McAfee VirusScan Enterprise 8.7.0i
Connections per server: 1
Server priority: 1

Bug Description:
****************************
As of about two days ago, alt.binaries.hdtv header numbers seem to have grown too large for NewsLeecher to handle. When downloading headers for the first time, the server connection shows a message to the effect of "0 of about 1,696,350,000 turbo compressed headers received", and then times out.
****************************
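
For reference, one way to see what the news server itself reports for this group, outside of NewsLeecher, is a quick check with Python's standard nntplib, as in the sketch below. The host name and login are placeholders, and the count that GROUP returns is only an estimate.

Code:

# Rough sketch: ask the news server for the group's estimated article count
# and its first/last article numbers. Host name and credentials are placeholders.
import nntplib

HOST = "news.usenetserver.com"   # replace with your provider's server name
USER = "username"                # placeholder
PASSWORD = "password"            # placeholder

with nntplib.NNTP_SSL(HOST, 563, user=USER, password=PASSWORD) as server:
    # group() returns (response, estimated count, first article, last article, name)
    resp, count, first, last, name = server.group("alt.binaries.hdtv")
    print(f"{name}: ~{count:,} articles, numbered {first:,} to {last:,}")
    # Anything above 2**31 - 1 no longer fits in a signed 32-bit integer.
    print("last article number fits in a signed 32-bit int:", last <= 2**31 - 1)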

Smudge
Site Admin
Posts: 10034
Joined: Tue Aug 17, 2004 1:42 am

Post by Smudge »

This is similar to an earlier problem in 4.0, but that was fixed in 5.0. The timeout part is different: the earlier bug would immediately come back saying all headers had been downloaded. With the timeout, I think something else is going on here.

I was able to download the headers without issue, even more than what your provider lists.

Code:

4:17:20 PM Receiving headers from news.newsleecher.com. Range 0 to 2,362,126,504. Local Cache Pointer is at: 0.
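
Worth noting: the upper end of that range is already past the signed 32-bit maximum of 2,147,483,647, so anything in the chain that still stores article numbers in a 32-bit integer would wrap negative. A quick illustration of the wrap itself, just the arithmetic rather than anything NewsLeecher actually does:

Code:

# Illustration only: an article number this large overflows a signed 32-bit integer.
import struct

last_article = 2_362_126_504              # upper bound from the log line above
print(last_article > 2**31 - 1)           # True: past the signed 32-bit ceiling

# Reinterpret the same 32 bits as a signed value to show the wrap-around result.
wrapped = struct.unpack("<i", struct.pack("<I", last_article))[0]
print(wrapped)                            # -1932840792
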
Does this happen with other large groups too? Can you try alt.binaries.boneless to see if it has the same problem?

Try turning off the turbo compressed header option: go to Toolbox > Settings > Adv. Nerdy Tweaks and set "Article Download Allow Xfeat Compression" to FALSE.

Can you try disabling McAfee and downloading headers again? Just make sure to turn it back on before downloading anything else or opening any downloaded files.

eurodude
Posts: 9
Joined: Sat Apr 23, 2005 11:16 am

Post by eurodude »

Smudge wrote:
This is similar to an earlier problem in 4.0, but that was fixed in 5.0. The timeout part is different: the earlier bug would immediately come back saying all headers had been downloaded. With the timeout, I think something else is going on here.
Yes, it's about 15 seconds before it times out and goes back to "Idle & ready for action!".
Smudge wrote:
I was able to download the headers without issue, even more than what your provider lists.
If I recall correctly, article numbers are specific to each provider, right? So is it possible that the numbers used by my provider are too big while those used by yours are not? If so, the header count reported for me might actually be bogus, and the fact that your count is bigger than mine might be irrelevant?
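
If it would help to confirm that, comparing what each server reports for the same group should show it directly. This is only a sketch; the host names are guesses based on this thread and the logins are placeholders.

Code:

# Sketch: compare what two servers report for the same group. Article numbers
# are assigned per server, so the count and the first/last numbers can differ.
import nntplib

SERVERS = [
    ("news.usenetserver.com", "user1", "pass1"),  # placeholder credentials
    ("news.newsleecher.com", "user2", "pass2"),   # placeholder credentials
]

for host, user, password in SERVERS:
    with nntplib.NNTP_SSL(host, 563, user=user, password=password) as server:
        resp, count, first, last, name = server.group("alt.binaries.hdtv")
        print(f"{host}: ~{count:,} articles, numbered {first:,} to {last:,}")
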
Smudge wrote:
Does this happen with other large groups too? Can you try alt.binaries.boneless to see if it has the same problem?
No other group is affected to my knowledge. alt.binaries.boneless downloads fine.
Smudge wrote:
Try turning off the turbo compressed header option: go to Toolbox > Settings > Adv. Nerdy Tweaks and set "Article Download Allow Xfeat Compression" to FALSE.
It does the same thing.
Smudge wrote:
Can you try disabling McAfee and downloading headers again? Just make sure to turn it back on before downloading anything else or opening any downloaded files.
It does the same thing.

Smudge
Site Admin
Posts: 10034
Joined: Tue Aug 17, 2004 1:42 am

Post by Smudge »

It could be a problem with the local cache files for that group. First quit NewsLeecher, then remove the three files for this group from the cacheV4\groups\ directory in your NewsLeecher data location. By default this is %APPDATA%\NewsLeecher\.
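
If it is easier to script, something along these lines would clear them out. The exact file names used inside cacheV4\groups\ are an assumption here, so check which files actually match the group before deleting anything, and make sure NewsLeecher is closed first.

Code:

# Sketch: delete the cached header files for one group from the NewsLeecher
# data folder. Assumes the cache file names contain the group name; verify
# the matches before deleting. Run only while NewsLeecher is not running.
import os
from pathlib import Path

GROUP = "alt.binaries.hdtv"
cache_dir = Path(os.environ["APPDATA"]) / "NewsLeecher" / "cacheV4" / "groups"

for path in cache_dir.iterdir():
    if GROUP in path.name:
        print("deleting", path)
        path.unlink()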

eurodude
Posts: 9
Joined: Sat Apr 23, 2005 11:16 am

Post by eurodude »

Actually, in the course of troubleshooting this I was already working from a clean install with nothing in the cache folder. When it times out, only an .nli file gets created.

eurodude
Posts: 9
Joined: Sat Apr 23, 2005 11:16 am

Post by eurodude »

My provider has been investigating too, and they just notified me that they resolved an article number issue on their end. The problem is not happening anymore, so it looks like this one can be marked as a non-issue. Thanks for all your help, Smudge!

stephenstud
Posts: 9
Joined: Mon Feb 01, 2010 4:07 am

Post by stephenstud »

I have the same problem as well, so I guess it's the provider?
