----- Original Message -----
From: Derek Kelly <derek.kelly@genomecorp.com>
To: toasters@mathworks.com
Sent: Tuesday, February 29, 2000 7:57 AM
Subject: Running du's
This may be a fruitless search for info, but I'm interested in hearing how folks are handling disk usage stats on large quantities of data (say 2 or more TB). Of particular interest to me are environments with lots and lots of small files rather than a modest number of large ones.

Knowing that the network wire is usually the bottleneck, we run our du's from an admin host on the same isolated 100BaseT server network as our 6 filers. But because of a known incompatibility between the network card in our admin host and the Bay switch we're using, the card can only successfully run half-duplex, not full. Lacking other suitable hosts to run these reports from, I now turn to this list for some collective brainstorming.
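If you do stay with du-style scans, one way to get more out of a slow (here, half-duplex) link is to overlap the NFS round trips by scanning several directory trees at once. Below is a minimal, hedged sketch of that idea in Python; the paths and worker count are illustrative, not from the original post, and a pure-Python walk sums apparent file sizes rather than allocated blocks, so its numbers can differ from `du`'s.

```python
# Parallel disk-usage sweep: one worker per top-level directory, so
# several NFS lookups are in flight at once instead of one at a time.
# Worker count and roots are assumptions for illustration.
import os
from concurrent.futures import ThreadPoolExecutor


def usage_bytes(root):
    """Sum apparent file sizes under root (a rough pure-Python `du -s`)."""
    total = 0
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            try:
                total += os.lstat(os.path.join(dirpath, name)).st_size
            except OSError:
                pass  # file vanished or is unreadable mid-scan
    return total


def sweep(roots, workers=8):
    """Return {root: bytes used} for each root, scanned concurrently."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(zip(roots, pool.map(usage_bytes, roots)))
```

For mostly-small files the per-file stat round trip dominates, so overlapping requests tends to help more than raw wire speed would suggest.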
Give every user a quota, but set the quota so outrageously high they'll never hit it. Then the quota report will give you instant usage for any user, and it will count everything that user owns on that filesystem, not just the stuff in their home directory.
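On a NetApp filer that would mean a default user quota entry per volume in `/etc/quotas`, something like the fragment below. This is a sketch only: the volume names are placeholders, and the exact column syntax and limit units vary between Data ONTAP releases, so check the quotas documentation for your version (a disk limit of `-` makes it a tracking-only entry, which avoids even the outrageously high limit).

```
# /etc/quotas -- default user quota on each volume (volume names assumed)
#target  type             disk   files
*        user@/vol/vol0   1000G  -
*        user@/vol/vol1   1000G  -
```

After editing the file, enable quotas (`quota on <vol>`) and then `quota report` returns per-user usage instantly, with no directory walk over the wire at all.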
Bruce