Data Push Destination Example Using the Generic HTTP Endpoint with Observe

Setting up a Data Push Destination integration using Generic HTTP Endpoint.


Observe is a powerful platform that helps derive meaningful insights from logs, metrics, and traces. Although there is no dedicated Observe Cloud integration, you can send issue data from your UXI dashboard to Observe Cloud using the Generic HTTP Endpoint Data Push Destination.

To get started:

  1. Log in to your Observe Cloud account and create a new data stream. You can optionally set the data retention period at this time.

  2. Next, create a token for the data stream. The token is shown on the next screen; copy it to a secure place. Note that the token contains a ":". You will need both strings — the one on the left-hand side and the one on the right-hand side of the ":" — when you configure your UXI dashboard, because UXI uses basic auth instead of a bearer token.

On the next page, you will be presented with setup instructions for how to send events. Copy the URL to a safe place.
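The token split described above can be illustrated with a short Python sketch. The token value here is invented for illustration — it is not a real Observe credential:

```python
# Hypothetical token for illustration only — a real Observe token will differ.
token = "abc123:s3cr3t"

# Split on the first ":" — the left side goes in the Username field and the
# right side in the Password field of the UXI Data Push Destination form.
username, password = token.split(":", 1)

print(username)  # left-hand side  -> "abc123"
print(password)  # right-hand side -> "s3cr3t"
```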

Next, open the UXI dashboard and navigate to Settings > Integrations.

Under Data Push Destinations, select Add Destination.

Enter the following in the menu:

  • Data Type: Issues (Do not send test results)

  • Destination Type: Generic HTTP Endpoint

  • Name: Give the data push a unique, friendly name

  • URL: Enter the URL you copied earlier

  • Username: Enter the string from the left-hand side of the ":" from the Observe token

  • Password: Enter the string from the right-hand side of the ":" from the Observe token

Select Submit.
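Once submitted, UXI will POST issue events as JSON to the Observe URL. Judging from the fields referenced in the OPAL example below, a payload has roughly the following shape — all field values here are invented for illustration:

```json
{
  "uid": "issue-uid-0001",
  "incident_uid": "incident-uid-0001",
  "code": "HIGH_DHCP_LATENCY",
  "event_type": "CONFIRMED",
  "timestamp": "2024-01-01T00:00:00Z",
  "context": {
    "hierarchy_node_name": "Head Office",
    "mac_address": "00:11:22:33:44:55",
    "network_name": "CorpWiFi",
    "sensor_name": "Lobby Sensor",
    "sensor_serial": "UXI00000000",
    "service_name": "Internal DNS"
  }
}
```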

Using the Dataset to Find Ongoing Issues (Example)

Here is an example OPAL filter to find ongoing issues and display them as a table:

filter DATASTREAM_TOKEN_ID = "<uxi token id>"
// filter for UXI data

make_col code:string(FIELDS.code),
hierarchy_node_name:FIELDS.context.hierarchy_node_name,
mac_address:string(FIELDS.context.mac_address),
network_name:string(FIELDS.context.network_name),
sensor_name:string(FIELDS.context.sensor_name),
service_name:FIELDS.context.service_name,
sensor_serial:string(FIELDS.context.sensor_serial),
event_type:string(FIELDS.event_type),
incident_uid:FIELDS.incident_uid,
timestamp:string(FIELDS.timestamp),
uid:string(FIELDS.uid)
// make columns for fields in the UXI JSON payload

filter is_null(event_type) or (event_type != "UPDATED")
filter is_null(event_type) or (event_type != "INCIDENT_ADDED")
filter is_null(event_type) or (event_type != "INCIDENT_REMOVED")
// ignore any logs where the event type is "UPDATED", "INCIDENT_ADDED" or "INCIDENT_REMOVED"

make_col last_status:window(last(event_type), group_by(uid))
// make a new column which displays the last status for a given issue UID

dedup uid
// deduplicate rows based on uid

filter (is_null(last_status) or (last_status != "RESOLVED")) and (event_type != "RESOLVED")
// only keep rows where the issue's latest status is not RESOLVED

sort asc(timestamp)
// sort oldest to newest

Using this filter in a worksheet, you can create a table in a dashboard that shows which issues are ongoing and for how long.
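In plain terms, the pipeline keeps the latest relevant event per issue UID and drops issues whose latest status is RESOLVED. A rough Python equivalent of that logic, using invented sample events, might look like this:

```python
# Hypothetical sample events mirroring the UXI payload fields used above;
# the values are invented for illustration.
events = [
    {"uid": "a", "event_type": "CONFIRMED", "timestamp": "2024-01-01T00:00:00Z"},
    {"uid": "a", "event_type": "RESOLVED",  "timestamp": "2024-01-02T00:00:00Z"},
    {"uid": "b", "event_type": "CONFIRMED", "timestamp": "2024-01-03T00:00:00Z"},
    {"uid": "b", "event_type": "UPDATED",   "timestamp": "2024-01-04T00:00:00Z"},
]

IGNORED = {"UPDATED", "INCIDENT_ADDED", "INCIDENT_REMOVED"}

def ongoing_issues(events):
    """Return the latest non-ignored event per uid, excluding resolved issues."""
    # Drop UPDATED / INCIDENT_* events, as the OPAL filters do.
    relevant = [e for e in events if e["event_type"] not in IGNORED]
    # Keep only the last event per uid (the OPAL window(last(...)) + dedup).
    latest = {}
    for e in sorted(relevant, key=lambda e: e["timestamp"]):
        latest[e["uid"]] = e
    # An issue is ongoing if its latest status is not RESOLVED.
    ongoing = [e for e in latest.values() if e["event_type"] != "RESOLVED"]
    # Sort oldest to newest, as the final OPAL sort does.
    return sorted(ongoing, key=lambda e: e["timestamp"])

print([e["uid"] for e in ongoing_issues(events)])  # -> ['b']
```

Issue "a" is excluded because its latest event is RESOLVED; issue "b" remains ongoing because its UPDATED event is ignored and its latest relevant status is CONFIRMED.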
