In my experience heavy fragmentation just slows things down. Sometimes to the point of applications timing out, which causes software instability, because the application doesn't expect this to happen. This in turn may lead to points a), d) and e), but only as a side effect.

Additionally, on a very heavily fragmented disk that is also 99.999999999% full, file corruption may actually become an issue, as the filesystem itself runs out of elbow room to do its work. But in general you will consider the PC too slow to be usable long before you reach that point.

As for f): RAM use for caching will in general not increase, but the efficiency of the caching will take a plunge downwards.

In general: as of Windows XP, the NTFS filesystem is quite good at keeping fragmentation limited to reasonable levels all by itself. YMMV, but in my experience, for most use-cases (at home or on servers) there is no real need for continuous defragmentation like Diskeeper wants to sell to you.

For intensively used file-servers (lots of new files / modified files / deleted files) it's another matter: a low-priority defrag job running in the background can really help to keep system response times stable over a long period of time. That is, if the server is not constantly in use at this intensity 24/7 — the defrag software needs a chance to do its job. If it can't do its job properly, it's only making matters worse. In such cases it's often more efficient to dump the entire filesystem to tape (or another disk; HDs are cheap these days), format the filesystem, and copy everything back.
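The dump/format/restore approach at the end can be sketched as a small script. This is a hypothetical illustration, not a tested admin tool: the drive letter `D:`, the backup path `E:\backup`, and the `rebuild_volume` helper are all made-up placeholders, and it stays in dry-run mode by default so it only prints the commands it would run.

```python
# Hedged sketch of "dump everything off, reformat, copy everything back"
# for a Windows NTFS data volume. Paths and drive letters are placeholders.
import subprocess

def rebuild_volume(volume="D:", backup="E:\\backup", dry_run=True):
    """Build the three-step command sequence; print it (dry run) or run it.

    Writing all files back in one sequential pass after a fresh format is
    what leaves them contiguous on disk.
    """
    steps = [
        # 1. Mirror the whole tree to spare storage.
        ["robocopy", volume + "\\", backup, "/MIR", "/COPYALL"],
        # 2. Quick-format the now-backed-up volume as NTFS.
        ["format", volume, "/FS:NTFS", "/Q", "/Y"],
        # 3. Mirror everything back; files land contiguously.
        ["robocopy", backup, volume + "\\", "/MIR", "/COPYALL"],
    ]
    for cmd in steps:
        if dry_run:
            print("would run:", " ".join(cmd))
        else:
            # robocopy uses nonzero exit codes for success, so no check=True
            subprocess.run(cmd, check=False)
    return steps

rebuild_volume()  # dry run: prints the three commands
```

Naturally, for this to be safe the volume must be offline to users while the copy runs, and the backup copy should be verified before the `format` step.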