Data Export to S3

Easily integrate OpsLevel data into your data warehouse to create your own reports.

With our Data Export feature, you can integrate OpsLevel’s service ownership and operational maturity data into your data warehouse. This lets you create fully customized reports that join OpsLevel data against other data in your business, such as support tickets, security incidents, or costs.

Data Export works by regularly exporting full and incremental snapshots of your OpsLevel account data to an AWS S3 bucket of your choice. Data is exported as one .csv.gz file per table.

Setup

  1. First, email us at support@opslevel.com. We will provide you with the following details:
    • Our AWS user (%OPSLEVEL-USER%)
    • Our AWS account ID (%OPSLEVEL-ACCOUNT-ID%)
    • A processing bucket name (%PROCESSING-BUCKET%)
    • An external ID (%EXTERNAL-ID%)
  2. Log in to your AWS account.
  3. Create a new S3 bucket where you want to receive files (%DESTINATION-BUCKET%).
  4. Create a role in your account for OpsLevel to assume (%CUSTOMER-ROLE-TO-ASSUME%). Steps 3-6 can also be scripted; see the sketch following these steps.
  5. Add this policy to the new role:
     {
         "Version": "2012-10-17",
         "Statement": [
             {
                 "Effect": "Allow",
                 "Action": [
                     "s3:PutObject"
                 ],
                 "Resource": [
                     "arn:aws:s3:::%DESTINATION-BUCKET%/*"
                 ]
             },
             {
                 "Effect": "Allow",
                 "Action": [
                     "s3:GetObject"
                 ],
                 "Resource": [
                     "arn:aws:s3:::%PROCESSING-BUCKET%/*"
                 ]
             }
         ]
     }
    
  6. Add this trust relationship to the new role:
     {
       "Version": "2012-10-17",
       "Statement": [
         {
           "Effect": "Allow",
           "Principal": {
             "AWS": "arn:aws:iam::%OPSLEVEL-ACCOUNT-ID%:user/%OPSLEVEL-USER%"
           },
           "Condition": {"StringEquals": {"sts:ExternalId": "%EXTERNAL-ID%"}},
           "Action": "sts:AssumeRole"
         }
       ]
     }
    
  7. You will then need to provide us with:
    • Your AWS account ID
    • Destination bucket name (%DESTINATION-BUCKET%)
    • The role OpsLevel will assume to send data to your bucket (%CUSTOMER-ROLE-TO-ASSUME%)

For a more complete explanation of these steps, see the AWS Guide.
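
If you prefer to script steps 3-6, here is a minimal boto3 sketch. All names and values are illustrative placeholders standing in for the %...% tokens above; replace them with the details OpsLevel provides and your own choices of bucket and role name. This is an assumption-laden outline, not an official setup script.

    import json

    import boto3

    DESTINATION_BUCKET = "my-opslevel-exports"   # %DESTINATION-BUCKET% (your choice)
    PROCESSING_BUCKET = "opslevel-processing"    # %PROCESSING-BUCKET% (from OpsLevel)
    OPSLEVEL_ACCOUNT_ID = "123456789012"         # %OPSLEVEL-ACCOUNT-ID% (from OpsLevel)
    OPSLEVEL_USER = "opslevel-exporter"          # %OPSLEVEL-USER% (from OpsLevel)
    EXTERNAL_ID = "example-external-id"          # %EXTERNAL-ID% (from OpsLevel)
    ROLE_NAME = "opslevel-data-export"           # %CUSTOMER-ROLE-TO-ASSUME% (your choice)

    s3 = boto3.client("s3")
    iam = boto3.client("iam")

    # Step 3: the bucket that will receive the export files. (Outside
    # us-east-1, create_bucket also needs a CreateBucketConfiguration.)
    s3.create_bucket(Bucket=DESTINATION_BUCKET)

    # Steps 4 and 6: create the role with the trust relationship shown above.
    trust = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"AWS": f"arn:aws:iam::{OPSLEVEL_ACCOUNT_ID}:user/{OPSLEVEL_USER}"},
            "Condition": {"StringEquals": {"sts:ExternalId": EXTERNAL_ID}},
            "Action": "sts:AssumeRole",
        }],
    }
    iam.create_role(RoleName=ROLE_NAME, AssumeRolePolicyDocument=json.dumps(trust))

    # Step 5: attach the access policy shown above as an inline policy.
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {"Effect": "Allow", "Action": ["s3:PutObject"],
             "Resource": [f"arn:aws:s3:::{DESTINATION_BUCKET}/*"]},
            {"Effect": "Allow", "Action": ["s3:GetObject"],
             "Resource": [f"arn:aws:s3:::{PROCESSING_BUCKET}/*"]},
        ],
    }
    iam.put_role_policy(RoleName=ROLE_NAME, PolicyName="opslevel-data-export",
                        PolicyDocument=json.dumps(policy))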

Push Types

There are two types of pushes: full and incremental. A full push is a complete dump of all the data associated with your account; an incremental push contains only the changes since the last push.

Full pushes happen weekly; incremental pushes happen daily.

After ingesting the initial full dump, you can keep your system up to date by ingesting only the incremental files. However, we recommend refreshing from a full dump at a regular interval.

In incremental pushes, a deleted record is indicated by a row containing its id and a deleted_at column set to the time the record was deleted; all other columns are blank for the deleted entry. One way to apply an incremental file is sketched below.
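
Because a deletion marker carries only the id and deleted_at, an ingest job has to treat it as a delete rather than an update. Here is a minimal Python sketch, assuming each table is staged as a dict keyed by id before being written to your warehouse; the staging structure is an assumption on our part, not part of the export format.

    import csv
    import gzip

    def apply_incremental(path, table):
        """Apply one incremental .csv.gz file to a dict of rows keyed by id."""
        with gzip.open(path, mode="rt", newline="") as f:
            for row in csv.DictReader(f):
                if row["deleted_at"]:           # deletion marker: id + deleted_at only
                    table.pop(row["id"], None)  # remove the record if present
                else:                           # otherwise insert or update
                    table[row["id"]] = row

For example, after loading services from the last full dump, calling apply_incremental("services.csv.gz", services) brings that table up to date.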

Schema

We will upload files into the destination bucket under a path of the form: /v1/:push_type/:table_name/:date/:timestamp/

For example: /v1/full/services/2020-07-14/22:00:00/services.csv.gz
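
If your pipeline keys off object paths, the components can be pulled apart with a small parser. A sketch, assuming keys always follow the layout above:

    import re

    # Matches /v1/:push_type/:table_name/:date/:timestamp/:file
    KEY_PATTERN = re.compile(
        r"/?v1/(?P<push_type>full|incremental)/(?P<table>[^/]+)/"
        r"(?P<date>\d{4}-\d{2}-\d{2})/(?P<timestamp>[^/]+)/(?P<file>.+)"
    )

    m = KEY_PATTERN.match("/v1/full/services/2020-07-14/22:00:00/services.csv.gz")
    if m:
        print(m.group("push_type"), m.group("table"), m.group("date"))
        # -> full services 2020-07-14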

Examples

Here are some examples of the files you will receive.

services.csv

    id,name,alias,product,description,owner_id,language,framework,tier_index,lifecycle_index,html_url,created_at,updated_at,deleted_at
    83,Transaction Processor,transaction_processor,NULL,NULL,5,Python,Flask,NULL,NULL,https://app.opslevel.com/services/transaction_processor,2020-02-19 13:57:11,2020-02-19 13:57:11,
    84,Reconciliator,reconciliator,Payment,Responsible for reconciling various transactions,5,Ruby,Rails,1,NULL,https://app.opslevel.com/services/reconciliator,2020-02-19 13:57:42,2020-02-19 13:57:42,
    85,,,,,,,,,,,,,2020-02-19 13:57:42

The last row is a deleted record: only id and deleted_at are populated.

check_results.csv

    id,status,message,check_id,service_id,is_applied,created_at,updated_at,deleted_at
    1,passed,Success,1,1,1,2018-11-16 22:02:09,2018-11-16 22:02:09,
    2,failed,The service does not have a logs tool.,1,1,0,2018-11-16 22:02:44,2018-11-16 22:02:44,

teams.csv

    id,name,alias,manager_id,responsibilities,html_url,created_at,updated_at,deleted_at
    8,Discovery Team,discovery_team,NULL,NULL,https://app.opslevel.com/teams/discovery_team,2018-10-05 21:05:25,2019-03-12 18:20:52,
    11,Support,support,12,Supporting our product,https://app.opslevel.com/teams/support,2018-11-06 14:58:05,2019-03-12 18:20:52,

There are many more tables we export; essentially all the data in your OpsLevel account is included. Note that missing values in the examples above appear as the literal string NULL; a small normalization sketch follows.
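
If your pipeline distinguishes missing values from real strings, you may want to map those NULL literals (and the blank columns of deleted rows) to proper nulls on the way in. A tiny sketch, assuming rows come from csv.DictReader; the helper name is hypothetical:

    def normalize(row):
        """Map the literal string NULL and empty fields to Python None."""
        return {k: (None if v in ("", "NULL") else v) for k, v in row.items()}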

Next Steps

Now that you will be receiving regular data exports from OpsLevel, you will need to configure your data pipeline to ingest the data; a minimal fetch-and-decode sketch follows. We can get the data to you; how you'll use it is up to you.
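
As a starting point, here is a minimal sketch that fetches one export from the destination bucket and decodes it into rows. The bucket name and key are illustrative, matching the path example from the Schema section.

    import csv
    import gzip
    import io

    import boto3

    s3 = boto3.client("s3")
    obj = s3.get_object(Bucket="my-opslevel-exports",
                        Key="v1/full/services/2020-07-14/22:00:00/services.csv.gz")

    # The exports are gzip-compressed CSVs: decompress, then parse.
    with gzip.open(io.BytesIO(obj["Body"].read()), mode="rt", newline="") as f:
        for row in csv.DictReader(f):
            print(row["id"], row["name"])  # replace with your warehouse loader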