
April 9th, 2014 04:00

how to get clients backup sizes

Hello,

what is the best way to get an approximate client backup size (for example, the last full backup size)? I've tried mminfo -q and pulling info from the savegrp log. Both report backup size per saveset, not per client, and transforming that information with awk (the only solution I came up with) puts unwanted load on the system and takes a few minutes. Is there a way to use NetWorker reporting to get the size of the last full backup of every client (as a whole, not per saveset) directly and automatically (from cron or similar, not by clicking in NMC)?

Basically, I need a list of per-client backup sizes (for example, the last full backup size, or a summary of last week's backups) automatically every day. How can this be done?

Thank you

PS: sorry if this question was asked before; I had no luck searching for an answer.

4 Operator • 14.3K Posts

April 9th, 2014 05:00

I'm doing something similar now.  Well, I've been doing it for a couple of years, but now the data is going into a DB too.

What I do is following:

mminfo -avot -q 'sscreate>=yesterday 00:00:00,sscreate<=yesterday 23:59:59' -xc, -r 'totalsize,pool,group,client,sscreate(20),sscomp(20),level,name' | grep -v 'ss-created' |sort -u > /nsr/scripts/stats/`yesterday`.lst

yesterday is just a small function:

yesterday()
{
    perl -e '@y=localtime(time()-86400);printf "%02d%02d%04d",$y[3],$y[4]+1,$y[5]+1900'
}
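The export and the helper function can be wired together for the daily run; a hypothetical sketch (the script path and schedule are assumptions, adjust to your environment):

```shell
#!/bin/sh
# /nsr/scripts/stats/daily_mminfo.sh (hypothetical path)
# crontab entry to run it each morning: 5 9 * * * /nsr/scripts/stats/daily_mminfo.sh

# DDMMYYYY stamp for yesterday, same helper as above
yesterday()
{
    perl -e '@y=localtime(time()-86400);printf "%02d%02d%04d",$y[3],$y[4]+1,$y[5]+1900'
}

# same mminfo export as above, written to a per-day file
mminfo -avot -q 'sscreate>=yesterday 00:00:00,sscreate<=yesterday 23:59:59' -xc, \
  -r 'totalsize,pool,group,client,sscreate(20),sscomp(20),level,name' \
  | grep -v 'ss-created' | sort -u > /nsr/scripts/stats/`yesterday`.lst
```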

Now, I run this each day and I get all the data I need.  After that I feed it into SQL and I have a table like:

(screenshot of the per-day summary table)

What you don't see above is a column on the left with client names and one on the far right with totals per client.  If I hover over a field I get the value in bytes, and if I click on "+" I get data per group (as one client may be in different groups, like FS, DB and ARCH).  Now, how you do it - that is your game.  I can only say that after a couple of years, if this is a larger environment, you will notice that even MySQL has issues coping with the load and that something like Hadoop would be needed.  Anyway, going back to your query, the simple thing to do is to list the backups you want and sum totalsize with blah blah | perl -lpe '$c+=$_}{$_=$c'

It is a very fast and efficient way of getting the total sum. If you use mminfo, you are limited to data within the mdb retention (this is why I export data out).  Give it a try and play with it. One nice thing about exporting this per day is that you can later run different queries without touching the mdb at all, so there is no impact when running queries, compiling reports or graphs.  Obviously, you can also feed the exported data into Excel and do whatever you want from there.
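Since the original question was about per-client totals rather than per-saveset sizes, the daily export can also be rolled up without touching the mdb. A minimal awk sketch, assuming the column order of the mminfo command above (totalsize is field 1, client is field 4) and a hypothetical export file name:

```shell
# Sum totalsize per client from one day's export.
# Field positions follow the -r report order: totalsize,pool,group,client,...
awk -F, '{ sum[$4] += $1 } END { for (c in sum) printf "%s,%s\n", c, sum[c] }' /nsr/scripts/stats/09042014.lst
```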

142 Posts

April 9th, 2014 06:00

Good explanation, Hrvoje.  I like the idea of using this data with a DB. However, that will require acquiring DB administration skills.

Regards

tech88kur

2.4K Posts

April 9th, 2014 07:00

For non-DB users it is useful to send the mminfo report to a CSV file (about 35 MB each month in my case).

Then go ahead and filter/sort whatever you want.

I create the report once a month, so you can easily compare multiple weekly backup cycles.

Of course, if you adjust the parameters, you can even run it more often.

As I am not familiar with Perl, I use PowerShell to summarize the columns and to store the report in an appropriate subdirectory, using the date as the name. You can even use PowerShell to sort the data - it should not be too difficult.
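The filter/sort step works on the CSV with plain UNIX sort as well; a sketch assuming the same eight-column layout as the export above (totalsize in field 1, client in field 4) and a hypothetical monthly file name:

```shell
# Group rows by client (field 4), largest savesets first within each client
# (field 1, numeric, descending)
sort -t, -k4,4 -k1,1nr monthly_report.csv
```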

142 Posts

April 9th, 2014 07:00

Thanks... I keep on asking for more and more... could you also guide me on what could be a starting point?

Regards

tech88kur


=====

4 Operator • 14.3K Posts

April 9th, 2014 07:00

Usually UNIX guys are OK with the basics, and it doesn't take much more than the basics to write it. Sure, there will always be some know-how required - especially when you move from a simple cat/grep/awk approach to a more structured and organized approach with one of the free DB offerings.  Of course, nothing stops you from creating a structure on the file system with mdb dumps for each day and having a script which processes this data and gives you the values you are interested in at the end of a certain time period - this will also work just fine.

But eventually, especially with a backup-as-a-service approach, you will end up with tasks to present it to customers, to link it with SLAs and similar - so one should start thinking about this now, as it is already happening elsewhere.

4 Operator • 14.3K Posts

April 9th, 2014 08:00

The mminfo query/output/dump is the starting point.  Make sure you have everything there that you feel you might need now and in the future. After that, at any point, it is just a matter of processing the raw data collected that way.
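As a concrete starting point, the dump can carry more fields than the daily example earlier, so that later questions don't require re-querying the mdb. A sketch (the field list is only a suggestion; all attributes shown are standard mminfo report attributes, and the output path is hypothetical):

```shell
# Wider daily dump: keep identifiers (ssid, volume) and counts (nfiles)
# alongside the size data, for whatever reports come up later
mminfo -avot -xc, \
  -q 'savetime>=yesterday' \
  -r 'client,name,group,pool,level,totalsize,nfiles,sscreate(20),sscomp(20),ssid,volume' \
  > /nsr/scripts/stats/raw_dump.csv
```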
