The Skytap webhook service is a powerful feature, yet at first glance it can be a bit intimidating. A webhook is a method for one application to talk to another. In our case, the Skytap webhook service can send audit data and usage data to other applications. For example, suppose you want to be notified of run state changes for specific environments, or of failed logins; webhooks can address these needs and more. This article provides a sample implementation that uses Microsoft Azure services to receive, process, report, and alert on webhook data.
To act on the webhook data I will use a few native Azure services, with Azure Log Analytics as the foundation. This is where the webhook data is stored; from there it can be queried, used for alerting, and surfaced on dashboards. An Azure Function receives the webhook data and writes it to a Log Analytics workspace, and an Azure Dashboard provides a visual representation of the data.
Here is an illustration of the flow of data: Skytap webhooks post to an Azure Function, which writes to a Log Analytics workspace, which in turn feeds dashboards and alerts.
Azure Log Analytics
The Azure documentation provides details on creating a new Log Analytics workspace: https://docs.microsoft.com/en-us/azure/azure-monitor/logs/data-platform-logs. The HTTP Data Collector API is used to post the webhook data from the Azure Function to the workspace. The data goes into a separate custom log table for audit data and for usage data. This Azure help article covers the HTTP Data Collector API, including sample code: https://docs.microsoft.com/en-us/azure/azure-monitor/logs/data-collector-api
Azure Function
Skytap webhooks send data formatted as JSON. You can create separate Azure Functions to receive and process audit and usage data respectively, or a single Azure Function that receives both and sends each to its respective table in Log Analytics. If you create a single function, it will need to interrogate the webhook data to determine whether it is audit or usage data: check the “category” attribute (“category”: “auditing” or “category”: “usage”), and then send the data to the appropriate log table. Writing, testing, and deploying an Azure Function is beyond the scope of this article; the Azure documentation explains this well, and using Visual Studio to write and deploy your function keeps things simple.
The sample code in the Azure documentation for the HTTP Data Collector API is a good starting point. The Azure Function can be broken down into three sections, described below: constants, the main function, and posting data. For simplicity I removed all error handling and logging; be sure to include them in your version.
Constants
Here you include the required authentication information (customer ID and shared key) for accessing your instance of Log Analytics. You also set the name of each log table. For example, I named mine SkytapMySkytapAccountAudit and SkytapMySkytapAccountUsage, where MySkytapAccount is replaced with the name of the Skytap account pushing data to this endpoint. This keeps things nice and neat, and lets me trace back to the source of the data.
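For example, here is a minimal sketch of the constants in Python; the workspace ID, shared key, and table names below are placeholders, so substitute the values from your own workspace and account.

# Log Analytics workspace credentials (placeholders; use your own
# workspace ID and primary/secondary key from the Azure portal).
CUSTOMER_ID = "00000000-0000-0000-0000-000000000000"
SHARED_KEY = "<your-shared-key>"

# Custom log table names; one per Skytap account and data type.
AUDIT_LOG_TYPE = "SkytapMySkytapAccountAudit"
USAGE_LOG_TYPE = "SkytapMySkytapAccountUsage"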
Main Function
In this section I set the timestamp field for the record, parse the JSON to determine whether I have audit or usage data and set the log table accordingly, build the API signature, and then pass everything to the post method.
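Here is a minimal Python sketch of that flow, assuming the Azure Functions Python runtime. It uses the constants from the previous section, and post_data is the helper shown in the next section; as noted above, error handling is omitted.

import datetime
import json

import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    # The webhook body is JSON, e.g. {"category": "auditing", "payload": [...]}.
    data = req.get_json()

    # Stamp the record; Log Analytics also records an ingestion-time
    # TimeGenerated value for every row.
    data["timestamp"] = datetime.datetime.utcnow().isoformat()

    # Route on the "category" attribute to the matching log table.
    if data.get("category") == "auditing":
        log_type = AUDIT_LOG_TYPE
    else:
        log_type = USAGE_LOG_TYPE

    # Build the signature and post (see the next section).
    post_data(CUSTOMER_ID, SHARED_KEY, json.dumps(data), log_type)
    return func.HttpResponse("OK", status_code=200)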
Posting Data
In this method, the request headers are created and the webhook data is posted to the Data Collector API endpoint.
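A sketch of the posting code, adapted from the Python sample in the Data Collector API documentation linked earlier:

import base64
import datetime
import hashlib
import hmac

import requests

def build_signature(customer_id, shared_key, date, content_length, method, content_type, resource):
    # HMAC-SHA256 signature over the request metadata, keyed with the
    # workspace shared key, per the Data Collector API authorization scheme.
    x_headers = "x-ms-date:" + date
    string_to_hash = "\n".join([method, str(content_length), content_type, x_headers, resource])
    decoded_key = base64.b64decode(shared_key)
    encoded_hash = base64.b64encode(
        hmac.new(decoded_key, string_to_hash.encode("utf-8"), digestmod=hashlib.sha256).digest()
    ).decode("utf-8")
    return "SharedKey {}:{}".format(customer_id, encoded_hash)

def post_data(customer_id, shared_key, body, log_type):
    # Create the request headers and post the webhook data to the
    # workspace's Data Collector endpoint. Error handling omitted.
    resource = "/api/logs"
    rfc1123date = datetime.datetime.utcnow().strftime("%a, %d %b %Y %H:%M:%S GMT")
    signature = build_signature(customer_id, shared_key, rfc1123date, len(body), "POST", "application/json", resource)
    uri = "https://" + customer_id + ".ods.opinsights.azure.com" + resource + "?api-version=2016-04-01"
    headers = {
        "content-type": "application/json",
        "Authorization": signature,
        "Log-Type": log_type,  # appears in the workspace as <Log-Type>_CL
        "x-ms-date": rfc1123date,
    }
    requests.post(uri, data=body, headers=headers)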
Querying Log Data
The Kusto Query Language (KQL) is the query language used to query the logs. The webhook data will be saved in custom log tables, meaning you will need to write your own queries for retrieving and reporting on the data. For more information on the Log Analytics workspace and KQL, refer to the Azure Help documentation: https://docs.microsoft.com/en-us/azure/azure-monitor/logs/log-analytics-overview
Querying audit data is straightforward. You will want to explore the JSON to identify which events and attributes you care about.
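For example, this query counts audit events by type over the last 30 days; it reuses the table name and payload fields from the alert example later in this article, so adjust the names to match your own tables.

SkytapPacRetailAudit_CL
| where TimeGenerated > ago(30d)
| extend type_code_ = tostring(parse_json(payload_s)[0].type_code)
| summarize count() by type_code_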
Querying usage data is a bit more complex because usage data is time-series data. An event is created each time something changes state; for example, each time an environment is created, started, stopped, or deleted. Here is a sample KQL query for reporting on storage usage across all regions; it also converts each region name to a friendly name.
Here is a chart produced by the query, followed by the query itself.
SkytapPacRetailUsage_CL
| where TimeGenerated > ago(30d)
| extend id_ = toint(parse_json(payload_s)[0].id)
| extend type_ = tostring(parse_json(payload_s)[0].type)
| extend region_ = tostring(parse_json(payload_s)[0].region)
| extend start_time_arr = extract_all(@"([0-9]{4})\/([0-9]{2})\/([0-9]{2})\s([0-9]{2}):([0-9]{2}):([0-9]{2})", tostring(parse_json(payload_s)[0].start_time))
| extend start_time_ = make_datetime(toint(start_time_arr[0][0]), toint(start_time_arr[0][1]), toint(start_time_arr[0][2]), toint(start_time_arr[0][3]), toint(start_time_arr[0][4]), toint(start_time_arr[0][5]))
| extend end_time_arr = extract_all(@"([0-9]{4})\/([0-9]{2})\/([0-9]{2})\s([0-9]{2}):([0-9]{2}):([0-9]{2})", tostring(parse_json(payload_s)[0].end_time))
| extend end_time_ = coalesce(make_datetime(toint(end_time_arr[0][0]), toint(end_time_arr[0][1]), toint(end_time_arr[0][2]), toint(end_time_arr[0][3]), toint(end_time_arr[0][4]), toint(end_time_arr[0][5])), now())
| extend value_ = toint(parse_json(payload_s)[0].value)
| extend TypeNew_ = case(type_ == 'svms', "x86 Metered RAM", type_ == 'svms_power', "Power Metered RAM", "storage")
| extend FriendlyRegion_ = case(region_ == 'prod/amm1r1', 'NL-Amsterdam-M-1', region_ == 'prod/dal1r1', 'US-Central', region_ == 'prod/fri1r1', 'DE-Frankfurt-I-1', region_ == 'prod/lon1r1', 'EMEA', region_ == 'prod/sgm1r1', 'SG-Singapore-M-1', region_ == 'prod/slg1r1', 'US-East', region_ == 'prod/syi1r1', 'AU-Sydney-I-1', region_ == 'prod/tor1r1', 'CAN-Toronto', region_ == 'prod/tuk1r1', 'US-West', region_ == 'prod/vam1r1', 'US-Virginia-M-1', region_ == 'prod/wbd1r1', 'US-East-2', region_ == 'prod/txm1r1', 'US-Texas-M-1', region_)
| extend region_type_ = strcat(TypeNew_, '/', FriendlyRegion_)
| where type_ == 'storage_size'
| summarize arg_max(TimeGenerated, *) by id_
| mv-expand samples = range(bin(start_time_, 1d), end_time_, 1d)
| summarize usage = sum(value_) by bin(todatetime(samples), 1d), region_type_
| where samples > ago(30d)
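In this query, range() together with mv-expand expands each usage record into one sample per day between its start and end times, so storage held across multiple days is counted in each day's total.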
Once the data is flowing from Skytap to your Log Analytics workspace, you can create charts for your Azure Dashboard and set up alerts.
Notifications
Alerting is a powerful capability of Log Analytics. Think about which conditions you care about most and build alerts for them. You can create custom alerts that fire based on specific conditions, and alerts can generate an email or an SMS text message. For example, I created an alert that sends an email each time a failed login attempt is detected. Here is the KQL to capture this condition.
SkytapPacRetailAudit_CL
| extend type_code_ = tostring(parse_json(payload_s)[0].type_code)
| where type_code_ == "FailedUserLoginHistory"
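When you create the alert rule from this query, a simple condition such as "number of results greater than 0" over your chosen evaluation window will fire on any failed login.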
Dashboards
Each query you create can be added to an Azure Dashboard, either in table format or as a chart. Below is a snippet of a dashboard I created to track logins and environment create, delete, and run operations. You can also add usage charts to your dashboard.
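From the Log Analytics query editor, the pin option is a quick way to send a rendered chart straight to a shared Azure dashboard.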
Skytap Webhooks
Webhooks are disabled by default. Once you have deployed your Azure Function, you will need to enable each Skytap webhook to post data to your function or functions. The Skytap Help documentation guides you through enabling and configuring the auditing and usage data webhooks. When you configure the Azure Function, you will have the option of using the default Azure certificate or deploying your own.
Configure Skytap to call the Azure Function:
- Get the URL for the Function App
  - This will look like https://[FunctionAppName].azurewebsites.net
- Append the function name to create the fully qualified URL
  - This will look like https://[FunctionAppName].azurewebsites.net/api/[FunctionName]
- In the Skytap portal, navigate to Account Settings
- Enable the webhook and paste the fully qualified URL into the webhook address field
- Create a certificate specific to your Azure Function or use the built-in certificate; to retrieve the certificate, use the following command:
openssl s_client -showcerts -connect {webhookserver}:{port} </dev/null 2>/dev/null | openssl x509 -outform PEM
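Replace {webhookserver} with your Function App host name and {port} with 443; the command prints the server's certificate in PEM format, which you can then use when configuring the webhook.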
Summary
Skytap tracks and logs over 200 audit events, and we are constantly adding more. Integrating Skytap webhooks with Azure Log Analytics creates a powerful tool for reporting, monitoring, and alerting on your Skytap usage. Whether you are using Skytap on Azure, Skytap on IBM Cloud, or both, you can be alerted when someone unlocks a critical environment, changes an important account setting, or adds a public IP address to a VM. Constant monitoring and alerting are key to successfully running applications in the cloud.