Posted to uk.d-i-y
Defragging Linux (was should DIY be a green cause)

On 28/03/2016 10:02, 764hho wrote:

And it doesn't really matter if those log files do get quite
fragmented, because they are hardly ever read from end
to end except when browsing them, when you are reading
much more slowly than the file can be read anyway,
so extra seeks between fragments don't matter at all.


Unfortunately, in my experience, it can matter.

One particular application I used to deal with had such files, and they
were regularly accessed. The difference achieved by simply moving such a
file was often very obvious to a user.

It can also matter because, over time, as other files are created and
deleted, a fragmented log file can leave the free space itself fragmented.

One area of NTFS I have either forgotten or never read up on is how it
knows where the fragments of files reside. In another file system
which I did know well, there was a file which contained lots of records,
each something like:

File number ! Fragment number ! Starts at block ! For so many blocks

In that old system, locating a particular block required shuffling
through this file and counting. The amount of work this required was very closely
related to the number of fragments and hardly at all to the absolute
size of the file. To find the last block of a severely fragmented file
would require reading through lots of these small records. (Of course,
some or all of this might be cached - though probably not then. This
file would be a prime candidate for holding in memory.)
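
To make the cost of that lookup concrete, here is a minimal sketch in
Python of walking such a table of records, assuming a layout like the
one above (file number, fragment number, starting block, number of
blocks). The names and numbers are invented for illustration; this is
not how NTFS itself stores its run lists, just the counting scheme
described above.

from collections import namedtuple

Extent = namedtuple("Extent", "file_no fragment_no start_block block_count")

def locate_block(extents, file_no, logical_block):
    # Walk this file's fragment records in order, counting blocks,
    # until the wanted file-relative block falls inside a fragment.
    seen = 0
    for ext in sorted((e for e in extents if e.file_no == file_no),
                      key=lambda e: e.fragment_no):
        if logical_block < seen + ext.block_count:
            return ext.start_block + (logical_block - seen)
        seen += ext.block_count
    raise ValueError("block is beyond the end of the file")

# A file (number 7) split into three fragments scattered over the disk:
table = [
    Extent(7, 0, 1000, 4),   # file blocks 0-3  at disk blocks 1000-1003
    Extent(7, 1, 5000, 2),   # file blocks 4-5  at disk blocks 5000-5001
    Extent(7, 2, 2500, 10),  # file blocks 6-15 at disk blocks 2500-2509
]
print(locate_block(table, 7, 5))    # prints 5001
print(locate_block(table, 7, 15))   # last block: every record is visited

The point is that the loop visits one record per fragment, so the work
grows with the number of fragments rather than with the size of the file.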

--
Rod