This may be a fruitless search for info, but I'm interested in hearing how folks are handling disk usage stats on large quantities of data (say 2 or more TB). Of particular interest to me are environments with lots and lots of small files rather than a modest number of large ones.

Knowing that the network wire is usually the bottleneck, we run our du's from an admin host on the same isolated 100BT server network as our 6 filers. But because of a known incompatibility between the network card in our admin host and the Bay switch we're using, the card can only successfully run at half-duplex, not full. Lacking other suitable hosts to run these reports from, I now turn to this list for some collective brainstorming.
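For what it's worth, here's the sort of thing we've been experimenting with: kicking off one `du` per filer in parallel from the admin host so the slow half-duplex link and the filers' metadata work overlap, then summing the per-filer totals. This is just a sketch; the mount-point arguments (e.g. `/mnt/filer1` ... `/mnt/filer6`) are hypothetical and would need adjusting for your environment.

```shell
# du_all: run "du -sk" on each argument (an NFS mount point, presumably)
# in parallel, then print the summed grand total in KB.
du_all() {
    tmp=$(mktemp -d) || return 1
    i=0
    for fs in "$@"; do
        i=$((i + 1))
        # -s: summary only; -k: kilobytes, so all totals use the same units.
        # Backgrounding each du lets the filers crunch inodes concurrently.
        du -sk "$fs" > "$tmp/$i.du" &
    done
    wait    # block until every backgrounded du has finished
    # Sum the first column (KB) of each per-filer summary line.
    awk '{ total += $1 } END { printf "TOTAL: %d KB\n", total }' "$tmp"/*.du
    rm -rf "$tmp"
}
```

Usage would be something like `du_all /mnt/filer1 /mnt/filer2 ... /mnt/filer6` (again, hypothetical paths). It won't make the wire any faster, but with lots of small files much of the wall-clock time is per-file stat round-trips, and overlapping six of those streams has helped us more than running the filers one after another.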
Thanks, Derek Kelly