July 20th, 2012 15:00
foglightdatareport.groovy
[Updated to version 5.00]
The purpose of this script is to allow easy creation of .csv reports from pretty much any Foglight data.
This is the first public test of this script, so if it doesn't work, report the issues so it can be improved.
Purpose:
"Prepare a csv formatted file containing any Foglight topology object data for a range of dates, with a specified granularity, for a user-specified selection of retrieved or calculated data points."
Features:
- can be run with different config files, to contain static parameters such as query, column definitions, output file names, etc.
- cmd line options to allow dynamic parameters such as config file location or start/stop dates
- optimized to only retrieve properties once at the object level, while retrieving metrics at the timeframe level.
- ability to perform pretty much any calculation based on previously retrieved data.
- ability to include multi-step calculations, and use any public Foglight functions with passed parameters in calculations
- ability to scale and round resulting numeric data without renderers (see the short sketch after this list).
- [new 4.7] Multi-Level Debug options
- 0 = off, 1 = least detail, 2 = object detail, 3 = timeslice detail, 4 = insane detail
- (level 2+ will generate a lot more log file entries, and take up a lot more space on disk)
- [new 4.7] Verbose option, to display log entries to the console
- [new 4.7] command line arguments now override config file entries of the same name
- [new 4.7] fixed null handling of metric saved variables
- [new 4.7] test option, to run the full report but only for the first object's timeslices; output is saved to a file with -test appended, to avoid confusion with production runs.
- [new 4.72] corrected retrieval of stringObservations
- [new 4.72] fixed some scoping issues
- [new 4.72] now prints some helpful info on saved vars in the log file; particularly useful when trying to assemble calculations.
- [new 5.00] Corrected Null Pointer Error in RetrieveMetric and RetrieveCalcMetric; Added a LOT more debug code
- [new 5.00] Corrected "SQLServerException: Lock request time out period exceeded." issue, by adding retry loop with sleep to allow system to catch up.
- [new 5.00] Changed baseSave definition to add object.get("name") as x[1], moving all other x values up one index, to assist with error reporting.
- **Likely will require changing your config files.**
- [new 5.00] Changed config file structure to use the same format as is used in table definitions for metrics and properties:
- column:metric|Net Xfer Rate Mbps|network/hostNetwork|transferRate|avg|1073741824|2
- becomes:
- column:metric|Net Xfer Rate Mbps|/network/hostNetwork/transferRate/period/average|1073741824|2
- The leading '/' is optional, and can be included to completely match the path field on context column definitions.
- Old config file structure must be modified: avg must be changed to average
- [new 5.00] changed to automatically detect calculations, no longer have to include 'calc' in cfg file column definitions
- [new 5.00] changed log file name pattern to include startDate, so the log can be related to the resulting output file more easily.
- DailyVMExportByHour.cfg.2012-07-29.2012.07.30.07.18.11.355.log
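The two trailing fields in the column definition above (1073741824 and 2) appear to be a scale divisor and a rounding precision, matching the scale/round feature noted earlier. Below is a hedged Groovy sketch of that idea only, with made-up variable names; it is not the script's actual code.
// hedged sketch: applying a column definition's scale divisor and rounding
// precision (e.g. 1073741824 and 2) to a raw metric value; rawValue, divisor
// and decimals are hypothetical names
def rawValue = 2684354560.0   // example raw average for one timeslice
def divisor  = 1073741824     // scale field from the column definition
def decimals = 2              // rounding field from the column definition
def scaled   = rawValue / divisor
def rounded  = scaled.setScale(decimals, java.math.RoundingMode.HALF_UP)
println rounded               // prints 2.50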
Notes
Documentation still needs work.
Config File
Requires that a config file be specified; the script can be executed from fglcmd or from the script console (with a slight change to specify the config file, since you can't currently specify command line options in the script console).
D:\Quest_Software\Foglight55\bin\fglcmd.bat -srv 10.4.118.110 -port 8080 -usr foguser -pwd fog1pwd -cmd script:run -f foglightdatareport.groovy "config:D:\reportconfig\foglightdatareport.cfg"
or
/vfoglight/bin/fglcmd.sh -srv localhost -port 8080 -usr foguser -pwd fog1pwd -cmd script:run -f foglightdatareport.groovy "config:/vfoglight/reportconfig/foglightdatareport.cfg"
To run from the script console, you still use a config file from an FMS path. Uncomment/modify the following line to point to the config file.
//configPath = "/vfoglight/reportconfig/DailyESXExportByHour.cfg"
Options
Options can be added to the end of the command line, such as:
- "test:1"
- "debug:4"
- "starttime:2012.07.25 00:00:00"
- "endtime:2012.07.26 00:00:00"
A full list of options is presented below, in context with examples. Comments and suggestions are welcome at jmaincpa@gmail.com.
Performance
I’ve seen the system perform as many as 150,000 object-timeSlices (lines of output) per hour. Things which can slow it down are:
- excessive # of columns of retrieved data
- retrieving data from related object models
- called functions
DailyVMExportByHour.cfg
We run 13,000+ VMs, and it takes about 3h 10m to export 1 day of hourly data for the example below.
DailyESXExportByHour.cfg
For 850 ESX hosts, takes around 5 min to export 1 day of hourly data for the example below.
DailyClusterExportByHour.cfg
For 175 VMWClusters, takes around 3 min to export 1 day of hourly data for the example below.
//////////////////////////////////////////////////////////////////
// Things still to be added
//////////////////////////////////////////////////////////////////
/*
+ more documentation
+ Pre-pass calculations, to allow calculations to use saved values
which will print later in the column specification; currently
saved values are null until after their point in the column
order
+ A way to use calendar date math in the config file - other than
just using the default of previous calendar day, there's no
easy way to specify "run for last month" except by specific
date range (or changing the code.)
+ WCF functions can be called in calculations, but I've not tested
with WCF queries
+ Allow WCF functions or queries to be used for the object source
instead of the current
server.get("QueryService").queryTopologyObjects(query)
+ Add a user interface for preparing the config files - may never
happen.
*/
//////////////////////////////////////////////////////////////////
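For context, here is a minimal, hedged sketch of the current object-source mechanism referred to in the list above. Only server.get("QueryService").queryTopologyObjects(query) and object.get("name") come from this post; the query string, the loop, and everything else are illustrative assumptions, not the script's actual code.
// hedged sketch of the current object source (illustrative only)
def query   = "VMWVirtualMachine"   // example topology type query - an assumption
def objects = server.get("QueryService").queryTopologyObjects(query)
objects.each { object ->
    // properties such as the name are read once per object;
    // metric observations are then retrieved per timeslice
    println object.get("name")
}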
Comments and suggestions welcome


