Keeps re-downloading all headers in all groups!!

Post your questions here if you need help to use NewsLeecher or if you have a question about a feature.

Post Reply
fdl333
Posts: 16
Joined: Thu Nov 24, 2005 9:15 am
Location: Italy

Keeps re-downloading all headers in all groups!!

Post by fdl333 »

This one is hard to diagnose because at the same time I had to:

- Reinstall the PC (see my previous post, where you all kindly helped me transfer my current article headers and groups)
- Upgrade to the latest beta
- Install the Giganews Accelerator.

The fact is that now, every morning when I ask it to download all new headers, it starts downloading each group from the start (in some cases, e.g. a.b.multimedia, over 15 million articles at 140 days of retention)!!

This happens both in groups I empty completely when I've found nothing of interest (selecting all and pressing Shift-Del), and in groups in which I leave some articles.

User avatar
Smudge
Site Admin
Posts: 10034
Joined: Tue Aug 17, 2004 1:42 am

Post by Smudge »

It could be a couple of things:

1. You have turned off all local caching. In the Options>Download>Group Caching screen, set the "Days to Keep: Group Headers" option to the number of days' worth of headers to keep.

2. The group header cache file is corrupted. Quit NewsLeecher, then delete the cache file for that group from the configuration files directory.

I'd suggest you test that it is working properly by downloading some headers, say 4,000 or so, then close the connection and quit NewsLeecher. Start NewsLeecher back up and open the group to make sure the headers remain in the local cache. You can then download the rest of the headers.

Be aware that the headers are saved to the local cache file when you quit NewsLeecher or unload the group from memory. If NewsLeecher crashes or you force-quit it, it won't save the headers.
Please be aware of and use the following pages...
Services Status Page : SuperSearch and Usenet Access server status, retention, server load, indexing time, etc.
Support Request Form : Use this if you have a problem with billing or account status. The forum is only for NewsLeecher application issues.

fdl333
Posts: 16
Joined: Thu Nov 24, 2005 9:15 am
Location: Italy

Post by fdl333 »

Thanks Smudge. First let me congratulate you on your "motto"... Coming from Italy, I totally agree! Actually, the "fan club" killed more people than Hitler and Stalin put together!!

Back to our point. I *would* agree, except that it's doing exactly the same thing on my office machine too, which *didn't* crash!

I think it's something to do with the Giganews Accelerator utility: it tells you to connect to localhost on port 119. Maybe NewsLeecher gets confused by the IP address of the host? Or could it be that Giganews doesn't just compress the headers but f..ks up the numbering somehow?

Anyway, I *sort of* did what you suggested this morning. I purged all the groups and started re-downloading all headers. I had set the cache to 999 days. The problem is that a.b.multimedia on Giganews has over 40 million headers and, unfortunately, NewsLeecher still doesn't have the ability (as Agent did about 10 years ago) to get just the last xxx DAYS of headers!

Anyway, I'll try and get back to you all. Just in case, I was thinking of putting an entry of:
127.0.0.1 news.giganews.com
in my "\windows\system32\drivers\etc\hosts" file and setting NewsLeecher's server address back to "news.giganews.com".
Could that do any good?

User avatar
Smudge
Site Admin
Posts: 10034
Joined: Tue Aug 17, 2004 1:42 am

Post by Smudge »

fdl333 wrote:I think it's something to do with the Giganews Accelerator utility: it tells you to connect to localhost on port 119. Maybe NewsLeecher gets confused by the IP address of the host? Or could it be that Giganews doesn't just compress the headers but f..ks up the numbering somehow?

Anyway, I *sort of* did what you suggested this morning. I purged all the groups and started re-downloading all headers. I had set the cache to 999 days. The problem is that a.b.multimedia on Giganews has over 40 million headers and, unfortunately, NewsLeecher still doesn't have the ability (as Agent did about 10 years ago) to get just the last xxx DAYS of headers!
Not a number of days, but a specific number of headers. You can right-click the group and update the latest 1|100|10,000|100,000|1,000,000 headers, or you can set an exact number in the Options>Download>Headers screen.

fdl333 wrote:Anyway, I'll try and get back to you all. Just in case, I was thinking of putting an entry of:
127.0.0.1 news.giganews.com
in my "\windows\system32\drivers\etc\hosts" file and setting NewsLeecher's server address back to "news.giganews.com".
Could that do any good?
Unless the localhost entry in the hosts file is missing or corrupt, setting news.giganews.com there would behave exactly the same as using the localhost name.
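If you want to confirm this on your own machine, a short Python sketch can show which IP address each name actually resolves to; the hostnames here are just examples, and news.giganews.com will only point at loopback if you have actually added the hosts-file line:

```python
import socket

# Quick check of which IP address each server name resolves to.
# With a hosts-file line "127.0.0.1 news.giganews.com" in place, both
# names below resolve to the same loopback address; without it,
# news.giganews.com resolves to the real Giganews server (or fails offline).
for name in ("localhost", "news.giganews.com"):
    try:
        print(name, "->", socket.gethostbyname(name))
    except socket.gaierror:
        print(name, "-> (not resolvable from this machine)")
```

If both names print 127.0.0.1, NewsLeecher will reach the accelerator either way, which is why the hosts-file trick makes no practical difference.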

User avatar
Quaraxkad
Posts: 310
Joined: Tue Aug 29, 2006 4:34 am

Post by Quaraxkad »

This happens to me also, with Giganews. I think I may know why NewsLeecher is doing this.

Since Giganews increased retention to 200 days, I think they are actually re-adding old posts that had already been deleted, so that they can fill up to 200 days sooner than just waiting 80 days for new posts to accumulate. If they are doing that, is NewsLeecher detecting that there are older headers on the server, and downloading everything starting from the (new) oldest header up to the most recent one, including everything in between that you've already downloaded?

...Did that make any sense at all?

dwazegek
Forum Moderator
Posts: 3006
Joined: Tue Sep 28, 2004 8:36 am

Post by dwazegek »

Quaraxkad wrote:...Did that make any sense at all?
Not really :wink:
Where would they get the older posts from? There aren't any other servers with that kind of retention, so they'd have to have had the old posts stored themselves. And if they had stored them themselves, the articles would have been accessible long before now.
Forum's Grammar Nazi:
I will hunt you down if you don't lose the extra o.

User avatar
Quaraxkad
Posts: 310
Joined: Tue Aug 29, 2006 4:34 am

Post by Quaraxkad »

dwazegek wrote:Where would they get the older posts from? There aren't any other servers with that kind of retention, so they'd have to have the old posts stored themselves.
You sound pretty sure of that. I'm not aware of any either, but that doesn't mean they don't exist.

Post Reply