
3.3 Creating Batches

VERSION 5
Created on: Oct 12, 2011 11:51 AM by Zenoss API - Last Modified:  Oct 12, 2011 12:01 PM by Zenoss API


You create batches to populate the reporting database. Every eight hours, the ETL processes a batch and extracts information from the previous eight-hour window.
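The eight-hour extraction windows described above can be sketched as follows. This is an illustration only; the function and constant names are not part of the Analytics API, which schedules these windows internally.

```python
from datetime import datetime, timedelta

# Assumed from the text: the ETL extracts one eight-hour window per batch.
EXTRACTION_WINDOW = timedelta(hours=8)

def batch_windows(start, count):
    """Yield (begin, end) pairs for consecutive eight-hour extraction windows.

    Hypothetical helper for illustration; Analytics computes these
    windows itself when it processes batches.
    """
    begin = start
    for _ in range(count):
        end = begin + EXTRACTION_WINDOW
        yield begin, end
        begin = end

# Three consecutive windows starting at midnight cover one full day.
windows = list(batch_windows(datetime(2011, 10, 12, 0, 0), 3))
```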

Existing batches and associated status information appear in the batch list. To access the batch list, select Reports > Configuration in the Resource Manager interface.

The batch list appears.


Figure 3.4. Batch List

The list shows each batch and its status (UNSTARTED, INPROGRESS, COMPLETED, or FAILED).

To add a batch:

  1. From the batch list, click Add Batches.

    The Add Batch dialog appears.


    Figure 3.5. Add Batch

  2. Make selections in the dialog:

    • Extractor - Select MODEL, EVENTS, or localhost.

    • Begin Date - Optionally adjust the begin date and time.

    • End Date - Optionally adjust the end date and time. By default, the value is set to the current Analytics server date and time.


      Do not create MODEL batches that extend into the future. A batch whose end date lies in the future remains in the UNSTARTED state.

  3. Click Submit. The batch is added to the batch list.
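The date check described in the note above can be sketched as a small validation routine. The function name is hypothetical and not part of the Analytics API; it only encodes the rule that a MODEL batch must not end in the future.

```python
from datetime import datetime

def model_batch_is_valid(begin, end, now=None):
    """Return True if a MODEL batch window is safe to create.

    Per the note above, a MODEL batch whose end date lies in the
    future stays UNSTARTED, so such a window is rejected.
    Hypothetical helper, not part of the product API.
    """
    now = now or datetime.utcnow()
    if end <= begin:
        raise ValueError("end date must be after begin date")
    return end <= now
```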


RRD stores data samples at several resolutions and is configured to retain a fixed number of samples at each resolution. By default, Resource Manager creates RRD files that store 288 samples at five-minute (300-second) intervals; this interval is called the step. This means that Resource Manager retains, at most, one day of five-minute samples for a data point. To keep disk usage bounded, RRD consolidates the five-minute samples into less granular periods (such as 15 minutes, two hours, one day, or one week).

When you schedule batches, remember that Resource Manager returns data at the finest granularity that exists over the entire query period. For example, if your query covers the last 26 hours, you will not receive 300-second samples but approximately 900-second (15-minute) samples, because the five-minute archive holds only one day of data. The further back your batch periods reach, the less granular the retrieved data will be.
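The resolution selection described above can be sketched as follows. The archive set here is hypothetical; real deployments define their own archives and retention spans.

```python
# Assumed archive set: sample interval (seconds) -> retained span (seconds).
ARCHIVES = {
    300: 86400,         # 5-minute samples, one day
    900: 7 * 86400,     # 15-minute samples, one week
    7200: 30 * 86400,   # 2-hour samples, ~one month
}

def finest_resolution(span_seconds, archives=ARCHIVES):
    """Pick the finest sample interval whose archive covers the whole span.

    Mirrors the behavior described above: data is returned at the finest
    granularity that exists over the entire query period.
    """
    candidates = [res for res, kept in archives.items() if kept >= span_seconds]
    if not candidates:
        raise ValueError("no archive covers the requested span")
    return min(candidates)

# A 26-hour query exceeds the one-day archive of 5-minute samples,
# so 15-minute samples are the finest available.
resolution = finest_resolution(26 * 3600)
```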
