File-handling for downloading group headers

Found a bug in NewsLeecher? Post a bug report here. Please describe how to reproduce the bug.
• Be sure to read the bug reporting guidelines before posting.
Posts: 3
Joined: Wed Oct 10, 2012 5:01 am


Post by SystemDisc »

The way NewsLeecher currently allocates space for the files that contain cached headers creates a large amount of fragmentation on NTFS. Typically, I download ~800 days of headers for all of my subscribed newsgroups because my system can handle it. However, it seems that NewsLeecher allocates small amounts of space at a time when writing to cache files, resulting in an extremely large amount of fragmentation on the hard drive.

As a side note, NewsLeecher does a pretty poor job of handling resources for large header counts in general. The reason I download as many headers as I can without creating too much lag is that if I can find something in my local cache, I know I have access to it. SuperSearch is nice, and online services such as binsearch are nice, but I like having the ability to cache and search newsgroups on my own.

Perhaps there's a way to better handle loading the articles from HDD to RAM, and fragmentation would be greatly reduced if writing the cache from network to HDD were handled more appropriately.

I'd be very willing to offer my services free of charge if NewsLeecher is written in C++. I wouldn't mind having it on my resume :P

Posts: 639
Joined: Wed Feb 16, 2005 3:15 pm

Post by Destroyer »

It isn't written in C++ afaik.

A lot of programs behave this way and create fragmentation. Firefox is one, and Spotify is absolutely atrocious. If you're running Diskeeper, IntelliWrite goes nuts trying to reduce the fragmentation they cause.
Please note: I am NOT official newsleecher support.
