Overview
Batch export functionality lets you bring data generated by your platform or system into a central collection for analysis and processing. It enables you to gather and use large volumes of platform data and derive insights from it.
Partner users can configure batch exports for their clients and specify the category types to export. Data can be exported for clients either on demand or at scheduled intervals to Amazon S3 and Microsoft Azure Blob Storage.
As a partner, you can use the Batch Export API to schedule batch exports. Based on the export category type, the export is created for the clients.
If a batch export has been set up for a client or a group of clients using the Batch Export API, the partner cannot create the same export through the user interface.
You can export the following category types for the clients:
Category Type | Description |
---|---|
Alerts | Each client's alert list is exported into cloud storage. |
Audit | Audit data for each client, or for All Clients, is exported into cloud storage. |
Integrations | Configuration data for integrations installed at the client or partner level is exported into cloud storage. |
Inventory | Client-level inventory data is exported into cloud storage. |
Metrics | Client-level metric data is exported into cloud storage. |
Tenants | The export is configured at the partner level and includes partner details and details of all clients under the partner. |
Tickets | Client-level ticket data is exported into cloud storage. |
Usage | Usage collection data (metering usage data for billing purposes) is exported at the partner level into cloud storage. |
Note
- Cloud refers to an AWS S3 bucket or an Azure Blob container.
- The Batch Export integration is available only at the partner level.
The following batch export types for different data types are available:
Export Type | Applicable Data Type(s) | Description |
---|---|---|
Snapshot | Inventory, Integrations, and Tenants | Provides a snapshot view of the data at the moment the export is generated, in the Amazon S3 bucket folder or Azure Blob storage. |
Incremental | Alerts, Tickets, Metrics, Audit, and Usage | The first incremental batch export sends data for the past three months; subsequent exports contain only updated records. You can schedule recurring or on-demand exports. Metric data export occurs every hour by default and cannot be customized; you can also retrieve metric data using the metric APIs. |
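The windowing rule described above can be sketched as a small helper. This is a hypothetical illustration, not OpsRamp's implementation: it approximates the initial "past three months" window as 90 days, and treats each later export as covering only the period since the previous one.

```python
from datetime import datetime, timedelta

def export_window(now, last_export=None):
    # Hypothetical helper illustrating the incremental-export rule above:
    # the first export covers roughly the past three months (approximated
    # here as 90 days); later exports cover only the period since the
    # previous export.
    if last_export is None:
        return now - timedelta(days=90), now
    return last_export, now

now = datetime(2024, 5, 1, 12, 0)
first = export_window(now)                                # initial window
nxt = export_window(now + timedelta(hours=6), last_export=now)
```

For example, a first export requested at 2024-05-01 12:00 would cover back to 2024-02-01 12:00, while a follow-up six hours later covers only that six-hour gap.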
Data is exported in JSON format. A list of failed exports is also provided if applicable.
The content of an export depends on its type and schedule frequency, which together determine which data items are included and how many.
After the data is exported to the installed integrations, you can view it on the platform, in AWS S3, or in Azure Blob storage.
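Because exports are delivered as JSON, they can be consumed with any standard JSON parser. The snippet below is a minimal sketch; the record schema shown (field names like `alertId` and `status`) is an assumption for illustration and is not the documented export schema.

```python
import json

# Hypothetical sample of exported content. The export format is JSON, as
# stated above, but this record schema is illustrative only.
sample_export = (
    '[{"alertId": 101, "clientId": "client-7", "status": "open"},'
    ' {"alertId": 102, "clientId": "client-7", "status": "closed"}]'
)

records = json.loads(sample_export)
# Filter the parsed records, e.g. keep only open alerts.
open_alerts = [r for r in records if r["status"] == "open"]
```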
Exporting data from OpsRamp involves installing an export integration and creating a data export.
Prerequisites
- An AWS S3 or Azure Blob Storage integration must be installed.
- Folders must be created in the cloud storage (for example, in the Amazon S3 bucket).
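The folder prerequisite can be scripted. The sketch below builds one "folder" key per category type, assuming exports land in a folder named after each category (as the alerts example later in this document suggests); the bucket name and the use of boto3 are assumptions, not part of this document.

```python
# Sketch of pre-creating per-category "folders" in an S3 bucket. S3 has no
# real directories; a zero-byte object whose key ends in "/" behaves as a
# folder in the console. Category names mirror the table above.
CATEGORY_TYPES = ["alerts", "audit", "integrations", "inventory",
                  "metrics", "tenants", "tickets", "usage"]

def folder_keys(categories):
    # Turn each category name into an S3 folder-style key.
    return [f"{c}/" for c in categories]

keys = folder_keys(CATEGORY_TYPES)

# With boto3 (hypothetical bucket name; not run here):
#   import boto3
#   s3 = boto3.client("s3")
#   for key in keys:
#       s3.put_object(Bucket="my-export-bucket", Key=key)
```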
How to Create a Batch Export?
Follow these steps to create a batch export:
Step 1 - Select the Batch Export Add-On
Follow these steps to enable the Batch Export Add-on:
- Navigate to Setup > Accounts > Partners.
- Search for the partner and click the partner name.
- Click Edit.
- On the EDIT PARTNER screen, click the Add Ons tab.
- Select Batch Exports.
- Click Save.
Step 2 - Install an Export Integration
Integrate with AWS S3 or Azure Blob to export the data to an AWS S3 bucket or an Azure Blob container.
Follow these steps to create an Export integration:
- Click Setup > Account.
- Select the Integrations tile. The Installed Integrations screen is displayed, with all the installed applications.
- If you do not have any installed applications, you are navigated to the Available Integrations page, which displays all available applications, including newly created applications with their versions.
Note: Search for the AWS S3 or Azure Blob application using the search option. Alternatively, use the All Categories option to search.
- Click +ADD.
- The fields displayed depend on the application. Enter the required information.
- Click FINISH.
When creating a new batch export, this saved integration will be available in the Export to dropdown of the Configuration screen.
Refer to the following links for more information on creating AWS S3 bucket storage or Microsoft Azure Blob storage folders:
- To create an AWS S3 data export integration, refer to Data Export to Amazon S3.
- To create an Azure Blob Storage data export integration, refer to Azure Blob Storage.
Step 3 - Configure a Batch Export
Follow these steps to configure a batch export:
Click Setup > Account.
Select the Integrations tile.
The Installed Integrations screen is displayed, with all the installed applications.
If you do not have any installed applications, you are navigated to the Available Integrations page, which displays all available applications, including newly created applications with their versions.
Note: Search for the Batch Export application using the search option. Alternatively, search for Exports from the All Categories option and select it.
Click +ADD in the Batch Export tile.
Enter the following information:
GENERAL DETAILS
Field Name | Field Type | Description |
---|---|---|
Name | String | Unique name of the export. |
Category Type | Dropdown | Type of data to export: Alerts, Audit, Integrations, Inventory, Metrics, Tickets, Tenants, and Usage. |
Select client | Dropdown | Client for whom you want to export the data. For the Audit category type, you can select All Clients or a single client. |
New JSON / Old JSON | Radio button | These options are displayed if you selected Metrics as the Category Type. Select New JSON or Old JSON; the data is exported in the specified format. |
Export to | Dropdown | Specify the integration: AWS S3 or Azure Blob. If the integrations are not available, you can create them as described below. |
Failure Export Notification | Checkbox | If you enable this option, you receive a notification when the export fails. |
To create an integration from the Export to dropdown:
- Click +ADD in the Export to dropdown. The ADD INTEGRATION window is displayed.
- Select the Integration type from the dropdown.
For AWS EventBridge, enter:
- Name: Enter the integration name.
- Access Key: Unique identifier to access AWS EventBridge.
- Secret key: Key generated from the AWS portal.
- Confirm Secret key: Reenter the secret key.
- Region name: Name of the cloud storage location.
- Event Bus Name: Event bus that receives the events from OpsRamp and AWS services.
- Event Source: Event pattern or text specified in the Rules section in EventBridge. Use this to filter the incoming events. Example: { "Source": [ "pattern" ] }
- Detail Type: Parameter for additional filtering in EventBridge.
- Click ADD.
For more information, see AWS Supported Services.
For Blob Storage, enter:
- Name: Enter the integration name.
- Storage account name: Azure Blob account name.
- Secret access key: Access key generated from the portal.
- Confirm Secret access key: Reenter the secret access key.
- Container name: Name of the Azure container for the export data.
- Base URI: Data location in the container. Example: https://portal.azure.com
- Click ADD.
For more information, see Azure Blob Storage.
SCHEDULE
Field Name | Field Type | Description |
---|---|---|
Schedule Type | Selection | Specify the frequency of data generation: ON DEMAND generates exports when a request is created or rerun; HOURLY generates export data at a specified interval (for example, every 6 hours); DAILY generates export data daily at the specified time; WEEKLY generates export data weekly on the specified day and time; MONTHLY generates export data monthly on the specified day and time. |
Click FINISH to apply the export and display export details.
Note
During on-demand execution, metrics data becomes available in the configured buckets (AWS S3 or Azure Blob storage) starting from the next hour.
Example: If the request was made at 13:00 GMT, 13:20 GMT, or 13:40 GMT, the data would be available on AWS S3 or Azure Blob storage only after 14:00 GMT.
This note applies only to the Metrics category type.
The file name format has changed. See View Metric Type Batch Export for more information.
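The availability rule in the note above amounts to rounding the request time up to the next full hour. A minimal sketch:

```python
from datetime import datetime, timedelta

def metrics_available_at(request_time):
    # Per the note above, on-demand metrics data reaches the bucket only
    # from the next full hour after the request.
    top_of_hour = request_time.replace(minute=0, second=0, microsecond=0)
    return top_of_hour + timedelta(hours=1)
```

Requests at 13:00, 13:20, or 13:40 GMT all resolve to availability at 14:00 GMT, matching the example in the note.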
View Batch Exports
You can view the configured batch export details in Setup > Accounts > Integrations > Batch Export.
Column Name | Description |
---|---|
Name | Name of the batch export. |
Status | Integration status. |
Added On | Date and time when the batch export was added. Information about the user who added it is also displayed. |
Updated On | Date and time when the batch export was modified. Information about the user who modified it is also displayed. |
View Batch Exports on AWS S3
You can view the generated batch exports in an AWS S3 bucket in the corresponding folders. For example, the alerts export is stored in the alerts folder.
AWS stores the export files in S3 folders in JSON format. The export file name has the following encoding:
- (a) schedule of batch export, recurring or on-demand
- (b) batch export type
- (c) unique client ID
- (d) schedule starting timestamp
- (e) schedule ending timestamp
- (f) recurring export serial number
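The encoding above can be split back into its components programmatically. The exact file name layout is not shown in this document, so the parser below assumes a hypothetical underscore-delimited name with a `.json` suffix; only the component order (a) through (f) comes from the list above.

```python
# Hypothetical parser for the export file name described above. The
# underscore separator and ".json" suffix are assumptions; the component
# order (a)-(f) follows the list above.
def parse_export_filename(name):
    stem = name.removesuffix(".json")
    schedule, export_type, client_id, start_ts, end_ts, serial = stem.split("_")
    return {
        "schedule": schedule,        # (a) recurring or on-demand
        "export_type": export_type,  # (b) batch export type
        "client_id": client_id,      # (c) unique client ID
        "start": start_ts,           # (d) schedule starting timestamp
        "end": end_ts,               # (e) schedule ending timestamp
        "serial": serial,            # (f) recurring export serial number
    }

parsed = parse_export_filename(
    "recurring_alerts_client123_20240501T0000_20240501T0600_1.json")
```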
View Metric Type Batch Export
The metric batch export for AWS S3 uses an updated file name format. The export file name has the following encoding:
- (A) schedule of batch export, recurring or on-demand
- (B) batch export type
- (C) unique client ID
- (D) schedule starting timestamp
- (E) unique id of the file
- (F) unique timestamp of the file
View Batch Export on Azure Blob
You can view the generated data exports in the Azure Blob container in the corresponding folders:
Azure Blob stores the export files in containers in JSON format. The export file name has the following encoding:
- (A) schedule of batch export, recurring or on-demand
- (B) batch export type
- (C) unique client ID
- (D) schedule starting timestamp
- (E) schedule ending timestamp
- (F) recurring export serial number