Here's a very simple script that emails you a CSV of your aggregate usage. It requires passwordless ssh from the system/user you cron it on.

 

It's not much, but I figured I'd share.

 

 

#!/bin/bash
# Emails a CSV of aggregate usage from filer1..filer8.
# Requires passwordless ssh from the host this runs on.

date=$(date -I)
dir=/root/sanreport
report="$dir/$date-San_Report.csv"

for i in {1..8}; do echo "============= filer$i ==============="; ssh "filer$i" df -gA | grep -v snap | grep -v Aggregate | sort | sed 's/GB//g' | awk '{print $1,"," $2,"," $3,"," $4}'; done > "$report"

mutt -s "$date San Capacity Report" -a "$report" -- emailaddress@domain < "$dir/body"

rm "$report"


From: toasters-bounces@teaparty.net [mailto:toasters-bounces@teaparty.net] On Behalf Of Daniel Keisling
Sent: Tuesday, July 30, 2013 11:53 AM
To: Ray Van Dolson; toasters@teaparty.net
Subject: RE: Metrics!

 

I've received several messages asking for the scripts, but unfortunately I will not be able to share them in their entirety.  They really are custom to my environment, but they shouldn't be too hard to create if you know Perl.

 

foreach $controller (@controllers)
{
    .....
    @output = `dfm perf data retrieve -o $controller -C system:avg_processor_busy -d $duration`;
    .....
}

foreach $day (@day)
{
    .....
    $data->add_point($data_day, $ctrla_values[$array_counter], $ctrlb_values[$array_counter], 50, 90);
    .....
}

print IMG $gd->png;
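To make the day loop concrete: one plausible reduction before add_point is averaging each day's samples. Everything here is a sketch; the two-column format standing in for "dfm perf data retrieve" output is an assumption:

```shell
# Hypothetical two-column sample (day, busy-percent) standing in for
# "dfm perf data retrieve" output; the real format may differ.
sample='2013-07-28 41.2
2013-07-28 44.8
2013-07-29 60.0'

# Average the counter value per day with awk's associative arrays.
echo "$sample" | awk '{sum[$1] += $2; n[$1]++}
    END {for (d in sum) printf "%s %.1f\n", d, sum[d] / n[d]}' | sort
# -> 2013-07-28 43.0
#    2013-07-29 60.0
```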

 

The result would be something like:

 

http://storage.wilm.ppdi.com/storage/graphs/AUSSTORE2-CPU-30-line.png
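The filename in that URL suggests a <controller>-<metric>-<days>-line.png naming scheme. That's an inference from the single example, and the web-root path below is invented:

```shell
controller=AUSSTORE2   # taken from the example URL
metric=CPU
days=30

# Hypothetical path under the web server's document root.
graph="/var/www/storage/graphs/${controller}-${metric}-${days}-line.png"
echo "$graph"
# -> /var/www/storage/graphs/AUSSTORE2-CPU-30-line.png
```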

 

 

-----Original Message-----
From: toasters-bounces@teaparty.net [mailto:toasters-bounces@teaparty.net] On Behalf Of Daniel Keisling
Sent: Tuesday, July 30, 2013 8:19 AM
To: Ray Van Dolson; toasters@teaparty.net
Subject: RE: Metrics!

 

I wrote custom Perl scripts that pull the output of "dfm perf data retrieve," put it into arrays, then use GD::Graph to plot custom graphs.  Graphs are created nightly and can be viewed by navigating to a URL on that server.  We graph CPU utilization, FCP latency, CIFS latency, and overall storage capacity.  It works very well and provides a great high-level view of our NetApp storage environment.

 

Regards,

 

Daniel

 

-----Original Message-----

From: toasters-bounces@teaparty.net

[mailto:toasters-bounces@teaparty.net] On Behalf Of Ray Van Dolson

Sent: Tuesday, July 30, 2013 1:11 AM

To: toasters@teaparty.net

Subject: Metrics!

 

We have a growing deployment with ~25 filers and about 750TB of usable space.  Most of it is VMware datastore space exposed via NFS, but some is general purpose NAS file sharing.

 

We've been using OnCommand Core (formerly DFM) to handle reporting and such, and for the most part it's worked pretty well.

 

I need to start pulling together some metrics for the environment as a whole, both to benefit the Storage Team who manages the devices, but also to be able to pass along to my bosses so they can get a basic grasp of how things are going.

 

I'm curious how those of you with deployments similar to mine or larger have dealt with this.  What sort of things are you reporting on, and how do you handle the "roll-ups" to summarize many (in our case 200+) volumes into something meaningful?  Are you generating reports natively from OnCommand Core, or pulling the data out into another tool?  Do you find the OnCommand DataSets feature useful?  We haven't really leveraged it properly, I think.

 

My thought is to present a sort of physical view of our environment showing roll-ups of storage capacity and performance (probably CPU, IOPS, and latency).  What I'm not sure about is how to give a snapshot overview of the whole environment.

 

I'm thinking that using DataSets will help me organize our various types of data, and then I can report on that.

 

I'm not sure, however, if I should spend my time trying to get the right reports out of OnCommand Core directly or if I should just start extracting the data to another tool (which?) or into a database where I can home-brew up what I'm after.

 

Thoughts/experiences?

 

Thanks,

Ray

_______________________________________________

Toasters mailing list

Toasters@teaparty.net

http://www.teaparty.net/mailman/listinfo/toasters

 

This email transmission and any documents, files or previous email messages attached to it may contain information that is confidential or legally privileged.

If you are not the intended recipient or a person responsible for delivering this transmission to the intended recipient, you are hereby notified that you must not read this transmission and that any disclosure, copying, printing, distribution or use of this transmission is strictly prohibited.

If you have received this transmission in error, please immediately notify the sender by telephone or return email and delete the original transmission and its attachments without reading or saving in any manner.

 

 
