hard drive defragmenting

Posted in General Discussion · edited January 2014
This is probably a dumb question, but anyway:



Does it take longer to defragment a 250 Gigabyte hard drive than a 160 Gigabyte hard drive if they have the same amount of data on them?



In other words, if a 250 Gig and a 160 Gig drive each has 120 Gigs of data written on it, would there be any difference in the amount of time required to defrag the two drives?

Comments

  • Reply 1 of 14
    kickaha Posts: 8,760 member
    Oh thank god, I thought this was another "Why doesn't Apple provide a defrag tool!?" thread.



    Yeah, they should take the same amount of time. In fact, the larger drive might be a little quicker, if less data needs to be shuttled around to make contiguous room.
  • Reply 2 of 14
    Yep. Assuming the large drive has its greater amount of free space in larger contiguous chunks, it should move data and therefore defrag faster.
  • Reply 3 of 14
    Thanks guys. Helpful info.



    I'm tempted to ask if you're satisfied with OSX's "auto-defragmenting" or use additional software but apparently that is a touchy subject.
  • Reply 4 of 14
    IT'S TIME FOR A DEFRAGGER MADE BY APPLE!



    -runs for dear life-
  • Reply 5 of 14
    kickaha Posts: 8,760 member
    Heh. No, that's fine. It's when someone comes waltzing in whining that Apple doesn't provide an explicit defrag tool "like MS does!"... not realizing that that's only an admission that the filesystem is highly suboptimal. A good system should never make you do crud like that.



    I find the auto-defrag to be just dandy. Others who have highly demanding tasks such as real-time video editing and such need a higher quality of optimization.
  • Reply 6 of 14
    Quote:

    Originally posted by Kickaha

    I find the auto-defrag to be just dandy. Others who have highly demanding tasks such as real-time video editing and such need a higher quality of optimization.



    I'll be doing video editing and audio production on the Powermac, which is why I'm digging around for info about it. Thanks for your help!
  • Reply 7 of 14
    groverat Posts: 10,872 member
    Quote:

    Others who have highly demanding tasks such as real-time video editing and such need a higher quality of optimization.



    What, is that not what PowerMacs are designed to do?
  • Reply 8 of 14
    kickaha Posts: 8,760 member
    Yup, oh sarcastic one, but only those involved in *real-time* video editing, with massive resolutions (think HD) are going to be in serious need of such things. Upping to a faster drive will help more.



    Basically, IMO, if you can get by without upgrading to a faster rpm drive, you can get by without a specialized defrag tool. If you're going to be running iMovie or even FCP, I'm not sure it'd be worth it.
  • Reply 9 of 14
    What about a production graphics environment where you have frequent incidents of new files being added and old files being deleted? You will want to defrag eventually, if only to recover contiguous free space on the HD. Naturally, this becomes more important on smaller volumes. One may think this is no longer an issue with such gargantuan HD capacities these days, but remember that large HDs may also be partitioned into smaller logical volumes (for one reason or another).



    ...and technically, the 250 GB drive could defrag more quickly than the 160, just from the likelihood that there will be more free space to work with when moving files around during the defrag process. This could have a pronounced effect if that remaining 40 GB of free space on the 160 (assuming the 120 GB of data scenario) is not nice, neat contiguous space. It will take a good amount of time either way, but this could mean the difference between 1 hour vs. several hours and multiple passes to get everything tidy.
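    The "more contiguous free space means less shuttling" intuition above can be sketched with a toy model (hypothetical numbers; no relation to how any real defragmenter actually schedules its work):

```python
def moves_to_relocate(file_size, holes):
    """Toy estimate of the copy operations needed to rewrite one file
    contiguously, given the sizes of the free holes on the disk.

    If some hole can hold the whole file, one contiguous copy does it;
    otherwise the data gets shuttled through several holes, one copy per
    hole used. Real defraggers are far smarter -- this only illustrates
    why scattered free space means more work per file.
    """
    if max(holes) >= file_size:
        return 1
    moves, remaining = 0, file_size
    for hole in sorted(holes, reverse=True):
        if remaining <= 0:
            break
        moves += 1
        remaining -= hole
    if remaining > 0:
        raise ValueError("not enough free space to relocate the file")
    return moves

# 250 GB drive, 130 GB free with one big hole: one copy per file
print(moves_to_relocate(15, [100, 20, 10]))        # 1
# 160 GB drive, 40 GB free chopped into small holes: several copies
print(moves_to_relocate(15, [5, 5, 4, 4, 3, 3]))   # 4
```

    Same 15 GB file either way; the chopped-up drive just needs more moves to place it, which is the hours-vs.-one-hour difference described above.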
  • Reply 10 of 14
    kickaha Posts: 8,760 member
    Quote:

    Originally posted by Randycat99

    What about a production graphics environment where you have frequent incidents of new files being added and old files being deleted? You will want to defrag eventually, if only to recover contiguous free space on the HD.



    If the files are under 20MB, they are defragged automatically by HFS+ in 10.3.x, no matter how often you add or delete files.



    For files over 20MB in size, then yes, you would need an explicit defrag tool for those.



    Quick Q: what *need* is there to have contiguous free space? I know a couple of answers, but I'm curious to hear your opinion on why a user, specifically, would *need* to do it.



    OCD needs to have their invisible disk space tidy don't count.
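    For the curious, the on-open check described above can be sketched roughly in Python. This is a paraphrase of public descriptions of the 10.3 HFS+ behavior, not the kernel's actual code; the real test includes additional busy-file and uptime checks not modeled here, so treat the exact conditions as an assumption:

```python
MAX_HOT_DEFRAG_BYTES = 20 * 1024 * 1024  # the oft-quoted 20 MB cutoff

def hot_defrag_candidate(size_bytes, extent_count, journaled, read_only):
    """Rough sketch of when OS X 10.3 relocates a file contiguously on
    open. Paraphrased, not an exact reproduction of the kernel checks."""
    return (size_bytes < MAX_HOT_DEFRAG_BYTES  # small files only
            and extent_count > 1               # actually fragmented
            and journaled                      # journaled HFS+ volume
            and not read_only)                 # must be writable

print(hot_defrag_candidate(5 * 1024**2, 3, True, False))    # True
print(hot_defrag_candidate(50 * 1024**2, 3, True, False))   # False: too big
```

    Anything that fails the size test is exactly the over-20MB case that would still need an explicit defrag tool.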
  • Reply 11 of 14
    Quote:

    Originally posted by Kickaha

    If the files are under 20MB, they are defragged automatically by HFS+ in 10.3.x, no matter how often you add or delete files.



    ...assuming there is contiguous free space nearby to store it in a single block. If not, then you either end up with a delay before it finds a suitable space on the entire HD, or you just end up with a fragmented file.



    Quote:

    For files over 20MB in size, then yes, you would need an explicit defrag tool for those.



    Just about anything other than text files can easily range above 20 MB these days.



    Quote:

    Quick Q: what *need* is there to have contiguous free space? I know a couple of answers, but I'm curious to hear your opinion on why a user, specifically, would *need* to do it.



    To ensure the maximum probability that a new file can be written in a single block, regardless of file size. If you don't have contiguous space, then all you have are odd-sized "holes" where you may or may not be able to stick a file or fragment of a given size. If you are in a production or file-serving environment where loads of files come in and go out on a given day, the ability to find an appropriate space to fit a file w/o fragmenting may be impacted. The casual user may not deal with this sort of usage, but over a long enough period of regular usage, the fragmentation could look very similar. So by ensuring there is contiguous space on a maintenance basis, you allow this "intelligent HFS+" to do its handiwork in an optimal environment.

    True, it will be able to do its work over a wide range of conditions which deviate from the virginal state of "fully unfragmented space"; that's what it was designed to do. However, this also doesn't mean you can entirely ignore disk maintenance indefinitely and expect the file system to keep working optimally as claimed (though the sheer size of current HDs does invite very long service lives before maintenance even begins to have an impact). I guess it just comes down to taking your lumps little by little over time, or all at once at the end when the HD has become riddled with "unfillable holes". Anybody who has waited for the latter and then defragged with only the last 300 MB of free (non-contiguous) space available to buffer the process knows that taking the lumps little by little is definitely the way to go.



    Quote:

    OCD needs to have their invisible disk space tidy don't count.



    I find your expressed distaste for defragmentation altogether intriguing. I could understand a more neutral stance or an indifference. However, eschewing it seems just as irrational as the compulsion to overuse it.



    I don't recommend defragging every week or anything like that. In OSX, it may not even be useful over prolonged periods. So instead of every 3-4 mos, maybe you could stretch it out to a year. I say do it once while the OS installation is new and you have your basic software laid in. Then let it ride for a loooong time, before worrying about a full defrag. ...and then occasionally, you could do a quick file-only defrag to handle the handful that seem to have gotten that way. It seems OSX has a background process that already pretty much handles that on-the-fly, so even occasional maintenance is pretty much taken care of w/o the user.
  • Reply 12 of 14
    kickaha Posts: 8,760 member
    Quote:

    Originally posted by Randycat99

    ...assuming there is contiguous free space nearby to store it in a single block. If not, then you either end up with a delay before it finds a suitable space on the entire HD, or you just end up with a fragmented file.



    Er, if you have less than 20MB free, contiguous, you've got bigger problems than defragmenting.



    Quote:

    Just about anything other than text files can easily range above 20 MB these days.



    'Can' and 'probably will' are two different beasts by far. MP3s? Can, sure. Probably will? Not so much.



    Quote:

    To ensure maximum probability that a new file can be written in a single block, regardless of file size.



    Why is that a *benefit*??



    Quote:

    If you don't have contiguous space, then all you have are odd sized "holes" where you may or may not be able to stick a file or fragment of given size. If you are in a production or file serving environment where loads of files come in and go out on a given day, the ability to find an appropriate space to fit a file w/o fragmenting may be impacted.



    And the impact of *being* fragmented is... what again in these environments?



    Quote:

    The casual user may not deal with this sort of usage, but over a long enough period of regular usage, the fragmentation could look very similar. So by ensuring there is contiguous space on a maintenance basis, you allow this "intelligent HFS+" to do its handiwork in an optimal environment.



    Yes, but the benefit is what I'm looking for here...



    Quote:

    True, it will be able to do its work over a wide range of conditions which deviate from the virginal state of "fully unfragmented space"; that's what it was designed to do. However, this also doesn't mean you can entirely ignore disk maintenance indefinitely and expect the file system to keep working optimally as claimed (though the sheer size of current HDs does invite very long service lives before maintenance even begins to have an impact). I guess it just comes down to taking your lumps little by little over time, or all at once at the end when the HD has become riddled with "unfillable holes". Anybody who has waited for the latter and then defragged with only the last 300 MB of free (non-contiguous) space available to buffer the process knows that taking the lumps little by little is definitely the way to go.



    As before, anyone who has only 300MB free has far greater problems due to regular everyday VM swapping than they do from fragmentation. Period.



    Quote:

    I find your expressed distaste for defragmentation altogether intriguing. I could understand a more neutral stance or an indifference. However, eschewing it seems just as irrational as the compulsion to overuse it.



    I'm afraid you completely misunderstand and misrepresent me.



    I do not eschew or find defragmentation distasteful. What I find distasteful is unsubstantiated whining about a supposed lack of built-in tool that only a very small subset of users ever *need*. HFS+ has a very intelligent algorithm to *prevent* fragmentation in the first place, and the background defragging takes care of the vast majority of fragmentation for the vast majority of users.



    The *need* for a dedicated defragmentation tool, as measured by the benefit to the user, is quite small, and is only magnified by cockeyed perceptions from Wintel migrants who believe that every system is as brain-dead as their previous OS and needs the same sort of hand-holding.



    Quote:

    I don't recommend defragging every week or anything like that. In OSX, it may not even be useful over prolonged periods. So instead of every 3-4 mos, maybe you could stretch it out to a year. I say do it once while the OS installation is new and you have your basic software laid in. Then let it ride for a loooong time, before worrying about a full defrag. ...and then occasionally, you could do a quick file-only defrag to handle the handful that seem to have gotten that way. It seems OSX has a background process that already pretty much handles that on-the-fly, so even occasional maintenance is pretty much taken care of w/o the user.



    BINGO. It doesn't 'seem', it is fact. The only limitation is the previously mentioned 20MB/file. For *most users*, that's well within the bounds to get the *vast* majority of files. For *most users*, the benefits of a fully defragmented and streamlined disk are quite vanishingly small anyway.



    Ergo, the *need* for a defrag tool on MacOS X is limited to those whose real-time disk streaming needs are quite high... i.e., real-time video production... which is what I said in the first place.
  • Reply 13 of 14
    Quote:

    Originally posted by Kickaha

    Er, if you have less than 20MB free, contiguous, you've got bigger problems than defragmenting.



    It is not at all impossible to get into a situation like that. Mind you, this is not a case of your having only 20 MB of free space left. You may have an entire GB of "space" or 5 GB of "space" left as indicated by the drive status, yet none of that space may be contiguous. Maybe the largest contiguous chunk is 5 MB. So the drive will thrash around finding where it can stick a 15 MB file, and end up fragmenting it anyway. If you have a VM file growing into a situation like that, I suppose you could be in some serious trouble.
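    That "5 GB free on paper, 5 MB largest hole" scenario is easy to model with a naive first-fit allocator (a toy sketch only; real HFS+ allocation searches much harder for a contiguous extent before splitting a file):

```python
def first_fit_fragments(file_size, holes):
    """Write file_size units into a list of free holes, first-fit, and
    return how many fragments the new file ends up in. A toy model of
    the 'odd-sized holes' problem, not real allocator behavior."""
    fragments, remaining = 0, file_size
    for i, hole in enumerate(holes):
        if remaining == 0:
            break
        take = min(hole, remaining)
        if take > 0:
            holes[i] -= take       # consume part of this hole
            remaining -= take
            fragments += 1
    if remaining:
        raise ValueError("disk full")
    return fragments

# 5 GB "free" in total, but the largest hole is 5 MB:
# a 15 MB file lands in three pieces (units are MB here)
print(first_fit_fragments(15, [5, 5, 5, 4985]))   # 3
```

    With one adequately sized hole the same file would land in a single piece, which is exactly the case contiguous free space is meant to preserve.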



    Quote:

    And the impact of *being* fragmented is... what again in these environments?



    It may feel like you are running out of space before you really have run out of space.



    Quote:

    As before, anyone who has only 300MB free has far greater problems due to regular everyday VM swapping than they do from fragmentation. Period.



    On a smaller partition, sure, it's a possibility. Maybe it is only a temporary situation? Maybe you just transferred in a BIG file or archive to do a few things, and then you will export it off again? You already verified you had the space plus 300 MB to spare, so why not? There are plenty of situations where this can arise.



    If OSX is meant to serve the pro user as well as the casual user, it should provide basic maintenance tools for both of them, not just for what the casual user will encounter.



    Quote:

    I'm afraid you completely misunderstand and misrepresent me.



    Readers will be sure to question that as they read further on in your rant against whiny, "cockeyed" Wintel migrants. To be sure, you seem to demonstrate hatred or disrespect toward something; we are just not sure what.



    Quote:

    I do not eschew or find defragmentation distasteful. What I find distasteful is unsubstantiated whining about a supposed lack of built-in tool that only a very small subset of users ever *need*. HFS+ has a very intelligent algorithm to *prevent* fragmentation in the first place, and the background defragging takes care of the vast majority of fragmentation for the vast majority of users.



    The *need* for a dedicated defragmentation tool, as measured by the benefit to the user, is quite small, and is only magnified by cockeyed perceptions from Wintel migrants who believe that every system is as brain-dead as their previous OS and needs the same sort of hand-holding.



    Maybe it need not be "dedicated". "Basic" would be fine: something that lets you use your HD good to the last drop. God forbid your HD goes belly up and can't remember where your files are. I'm sure the HD that was "well maintained" will be eminently easier to recover than the one that was allowed to become fragmented down to the very last MB.



    Quote:

    BINGO. It doesn't 'seem', it is fact. The only limitation is the previously mentioned 20MB/file. For *most users*, that's well within the bounds to get the *vast* majority of files. For *most users*, the benefits of a fully defragmented and streamlined disk are quite vanishingly small anyway.



    Ergo, the *need* for a defrag tool on MacOS X is limited to those whose real-time disk streaming needs are quite high... i.e., real-time video production... which is what I said in the first place.



    How about CD/DVD authoring? Digital sound editing? Videogaming? A "heavy" document derived from M$ Office?
  • Reply 14 of 14
    I want my bloody defrag back because it lets me feel 'in control'. It is excellent for OCD folks.