Batch export functionality lets you collect data generated by your platform or system into a single data set for analysis and processing. It enables you to gather and use large amounts of platform data, improving your ability to analyze it and derive insights.
Partner users can configure batch exports for their clients and specify the category types to export. Data can be exported for clients either on demand or at scheduled intervals to Amazon S3 and Microsoft Azure Blob Storage.
As a partner, you can use the Batch Export API to schedule batch exports; the export is created for the clients based on the export category type (a minimal API sketch follows).
If a batch export has been set up for a client or a group of clients through the Batch Export API, the partner cannot also create that export through the user interface.
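The exact endpoint and payload of the Batch Export API are defined in the API reference; the sketch below only illustrates the shape of such a call. The base URL, path, token, and field names are assumptions for illustration, not the documented contract.

```python
import requests

# Hypothetical values -- replace with your OpsRamp instance URL, partner or
# client ID, and an OAuth2 access token obtained from the token endpoint.
API_BASE = "https://your-instance.api.opsramp.com"
TENANT_ID = "partner-or-client-id"
ACCESS_TOKEN = "oauth2-access-token"

# Illustrative export definition: category, destination integration, and
# schedule. Field names here are assumptions, not the documented schema.
export_request = {
    "name": "daily-alerts-export",
    "category": "ALERTS",
    "exportTo": "AWS S3",                       # installed integration to write to
    "schedule": {"type": "DAILY", "startTime": "02:00"},
}

response = requests.post(
    f"{API_BASE}/api/v2/tenants/{TENANT_ID}/exports",   # hypothetical path
    json=export_request,
    headers={
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Accept": "application/json",
    },
    timeout=30,
)
response.raise_for_status()
print(response.json())
```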
You can export the following category types for the clients:
Alerts: Each client's alert list is exported to cloud storage.
Audit: Audit data for each client, or for All Clients, is exported to cloud storage.
Integrations: Installed integration configuration data for each client or partner is exported to cloud storage.
Inventory: Client-level inventory data is exported to cloud storage.
Metrics: Client-level metric data is exported to cloud storage.
Tenants: Configured at the partner level; the export contains partner details and the details of all clients under the partner.
Tickets: Client-level ticket data is exported to cloud storage.
Usage: Usage collection data (metering usage data for billing purposes) is exported at the partner level to cloud storage.
Note
Cloud storage refers to an AWS S3 bucket or an Azure Blob container.
The Batch Export integration is available only at the partner level.
The following export types are available for the different data types:
Snapshot (applies to Inventory, Integrations, and Tenants): Provides a snapshot view of the data at the moment the export is generated into the Amazon S3 bucket folder or Azure Blob storage.
Incremental (applies to Alerts, Tickets, Metrics, Audit, and Usage): The first incremental batch export sends data for the past three months; subsequent exports contain only updated records. You can schedule recurring or on-demand exports:
On-demand - Provides a snapshot of the data at the time the export is generated.
Recurring - Provides snapshots of the data at the chosen time.
When you request incremental data on-demand, the export generates a snapshot of the past three months.
Metric data export occurs every hour by default and cannot be customized. You can get metric data using the metric APIs.
Data is exported in JSON format, and a list of failed exports is provided where applicable. The content of each export depends on the type and frequency of the export schedule, which determine the data items included and their quantity.
After the data is exported to the installed integrations, you can view it on the platform and in AWS S3 or Azure Blob storage.
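Because exports land as JSON objects in the configured bucket or container, you can also inspect them directly with the storage provider's SDK. The following is a minimal sketch using boto3; the bucket name and key prefix are placeholders for whatever folder you created for the AWS S3 integration.

```python
import json
import boto3

# Placeholder bucket and folder (prefix); use the values you configured
# for the AWS S3 integration.
BUCKET = "my-opsramp-exports"
PREFIX = "alerts/"

s3 = boto3.client("s3")

# List the exported JSON objects under the prefix and print a record count
# for each exported file.
listing = s3.list_objects_v2(Bucket=BUCKET, Prefix=PREFIX)
for obj in listing.get("Contents", []):
    body = s3.get_object(Bucket=BUCKET, Key=obj["Key"])["Body"].read()
    records = json.loads(body)
    count = len(records) if isinstance(records, list) else 1
    print(f"{obj['Key']}: {count} record(s)")
```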
Exporting data from OpsRamp involves installing an export integration and creating a data export.
Prerequisites
An AWS S3 or Azure Blob Storage integration must be installed.
Create the destination folders in the cloud storage (for example, in the Amazon S3 bucket); a sketch for S3 follows below.
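For S3, a "folder" is simply a key prefix. The following is a minimal sketch of creating the bucket and a folder marker with boto3, assuming placeholder bucket and region names:

```python
import boto3

BUCKET = "my-opsramp-exports"   # placeholder bucket name
REGION = "us-west-2"            # placeholder region

s3 = boto3.client("s3", region_name=REGION)

# Create the bucket (outside us-east-1 a LocationConstraint is required).
s3.create_bucket(
    Bucket=BUCKET,
    CreateBucketConfiguration={"LocationConstraint": REGION},
)

# "Folders" in S3 are key prefixes; a zero-byte object ending in "/" acts
# as a folder marker that the export integration can write into.
s3.put_object(Bucket=BUCKET, Key="alerts/")
```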
Step 1 - Enable the Batch Export Add-on
Follow these steps to enable the Batch Export add-on:
Navigate to Setup > Accounts > Partners.
Search for the partner and click the partner name.
Click Edit.
On the EDIT PARTNER screen, click the Add Ons tab.
Select Batch Exports.
Click Save.
Step 2 - Install an Export Integration
Integrate with AWS S3 or Azure Blob to export the data to an AWS S3 bucket or an Azure Blob container.
Follow these steps to create an Export integration:
Click Setup > Account.
Select the Integrations tile. The Installed Integrations screen is displayed, with all the installed applications.
If you do not have any installed applications, you are navigated to the Available Integrations page, which displays all the available applications along with newly created applications and their versions. Note: Search for the AWS S3 or Azure Blob application using the search option, or use the All Categories option to search.
Click +ADD.
The fields displayed depend on the application you selected. Enter the required information.
Click FINISH.
When creating a new batch export, this saved integration will be available in the Export to dropdown of the Configuration screen.
Refer to the following links for more information on how to create AWS S3 bucket storage or Microsoft Azure Blob storage folders:
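For Azure, the equivalent step is creating a Blob container (folders inside it are virtual and are created implicitly by blob names containing "/"). The following is a minimal sketch with the azure-storage-blob SDK, using a placeholder connection string and container name:

```python
from azure.storage.blob import BlobServiceClient

# Placeholder connection string and container name; use your storage
# account's values from the Azure portal.
CONNECTION_STRING = (
    "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...;"
    "EndpointSuffix=core.windows.net"
)
CONTAINER = "opsramp-exports"

service = BlobServiceClient.from_connection_string(CONNECTION_STRING)

# Create the container that the Azure Blob integration will write exports into.
container = service.create_container(CONTAINER)

# Blob storage folders are virtual: uploading a blob whose name contains "/"
# creates the folder path implicitly.
container.upload_blob(name="alerts/.keep", data=b"")
```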
Step 3 - Create a Batch Export
Follow these steps to create a batch export:
Click Setup > Account.
Select the Integrations tile. The Installed Integrations screen is displayed, with all the installed applications.
If you do not have any installed applications, you are navigated to the Available Integrations page, which displays all the available applications along with newly created applications and their versions. Note: Search for the Batch Export application using the search option, or search for Exports under All Categories and select it.
Click +ADD on the Batch Export tile.
Enter the following information:
GENERAL DETAILS
Name (String): Unique name of the export.
Category Type (Dropdown): Type of data to export: Alerts, Audit, Integrations, Inventory, Metrics, Tickets, Tenants, or Usage.
Select client (Dropdown): Client whose data you want to export. For the Audit category type, you can select All Clients or a single client.
New JSON / Old JSON (Radio button): Displayed only when Category Type is Metrics. Select New JSON or Old JSON; the data is exported in the selected format.
Export to (Dropdown): The integration to export to: AWS S3 or Azure Blob. If the integrations are not available, you can create them:
Click +ADD in the Export to dropdown. The ADD INTEGRATION window is displayed.
Select Integration type from the dropdown.
For AWS EventBridge
Name: Enter the integration name.
Access Key: Unique identifier used to access AWS EventBridge.
Secret key: Key generated from the AWS portal.
Confirm Secret key: Re-enter the secret key generated from the AWS portal.
Region name: Name of the cloud storage location.
Event Bus Name: Event bus that receives the events from OpsRamp and AWS services.
Event Source: Event pattern or text specified in the Rules section in EventBridge; use this to filter the incoming events. Example: { "Source": [ "pattern" ] }.
Detail Type: Parameter for further filtering in EventBridge (a minimal rule sketch follows these fields).
If you enable this option, you will receive a notification if the export fails.
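For context, the Event Source and Detail Type configured above are the values an EventBridge rule can filter on. The sketch below uses boto3 to create such a rule on the configured event bus; the region, bus name, rule name, and pattern values are placeholders, and note that actual EventBridge patterns use lowercase keys (source, detail-type).

```python
import json
import boto3

events = boto3.client("events", region_name="us-west-2")  # placeholder region

# Rule pattern matching the Event Source and Detail Type configured in the
# integration; adjust to the values you entered in the fields above.
event_pattern = {
    "source": ["pattern"],
    "detail-type": ["opsramp-export"],
}

events.put_rule(
    Name="opsramp-export-rule",          # placeholder rule name
    EventBusName="my-event-bus",         # the Event Bus Name from the integration
    EventPattern=json.dumps(event_pattern),
    State="ENABLED",
)
```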
SCHEDULE
Schedule Type (Selection): Specify the frequency of data generation:
ON DEMAND generates exports when a request is created or rerun.
HOURLY generates export data at a specified interval. Example: every 6 hours.
DAILY generates export data daily at the specified time.
WEEKLY generates export data weekly on the specified day and time.
MONTHLY generates export data monthly on the specified day and time.
Note: These options are displayed based on the selected Category Type.
Click FINISH to apply the export and display export details.
Note
During on-demand execution, metrics data becomes available in the configured buckets (AWS S3 or Azure Blob storage) only from the next full hour. Example: if the request was made at 13:00 GMT, 13:20 GMT, or 13:40 GMT, the data would be available in AWS S3 or Azure Blob storage only after 14:00 GMT.
This note applies only to the Metrics category type.
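In other words, the earliest availability is the top of the hour after the request. A small sketch of that calculation:

```python
from datetime import datetime, timedelta, timezone

def metrics_available_after(request_time: datetime) -> datetime:
    """Return the earliest time exported metrics data is expected in the
    bucket: the top of the hour following the on-demand request."""
    floor = request_time.replace(minute=0, second=0, microsecond=0)
    return floor + timedelta(hours=1)

# Example from the note: requests at 13:00, 13:20, or 13:40 GMT all become
# available only after 14:00 GMT.
for minute in (0, 20, 40):
    req = datetime(2024, 1, 1, 13, minute, tzinfo=timezone.utc)
    print(req.strftime("%H:%M"), "->", metrics_available_after(req).strftime("%H:%M"))
```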