Change Point Detection

Change point detection identifies shifts in data patterns, helping to detect significant events or changes in system behavior.

What is it for?

In anomaly detection, a change point marks a moment in time when the statistical properties of a data series shift significantly. This can indicate a structural change in the data, such as a sudden shift in mean, variance, or other characteristics, which may signal an underlying event or change in behavior within the system being monitored.

In the context of IoT/OT environments, detecting change points can help in identifying critical events, such as equipment malfunction, environmental changes, or process shifts, enabling timely intervention and adjustments.
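To make the idea concrete, the short sketch below builds a synthetic series with a sudden shift in mean and flags the index where the means of two adjacent windows differ the most. It only illustrates the concept of a change point; it is not the AlphaX detection algorithm.

```python
# Illustrative only: a synthetic series with a sudden mean shift, and a naive
# sliding-window check that flags the point where the mean changes most.
import numpy as np

rng = np.random.default_rng(0)
# 200 points around 10.0, then 200 points around 14.0 (a shift in mean at index 200)
series = np.concatenate([rng.normal(10.0, 0.5, 200), rng.normal(14.0, 0.5, 200)])

window = 30
scores = np.zeros(len(series))
for i in range(window, len(series) - window):
    left = series[i - window:i]    # data just before point i
    right = series[i:i + window]   # data just after point i
    # A large difference between the two window means suggests a change point.
    scores[i] = abs(right.mean() - left.mean())

print("most likely change point index:", int(scores.argmax()))  # close to 200
```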


How does it work?

Change point detection identifies points in a time series where the underlying trend changes.

It sends the most recent 1000 data points to the AlphaX intelligence engine in a single request to build a model, then analyses each data point against that model.
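As a rough illustration of this flow, the sketch below batches the latest 1000 points and posts them to the engine in one request. The endpoint URL, payload shape, and authentication header are assumptions made for illustration, not the documented AlphaX API.

```python
# A minimal sketch of the batching step, assuming a hypothetical HTTP endpoint
# and payload shape -- the actual AlphaX intelligence engine API may differ.
import requests

def detect_change_points(points, api_url, api_key):
    """Send the most recent 1000 (timestamp, value) pairs in one request."""
    batch = points[-1000:]  # the engine models the latest 1000 points at once
    payload = {
        "series": [{"timestamp": ts, "value": v} for ts, v in batch],
    }
    response = requests.post(
        api_url,
        json=payload,
        headers={"Authorization": f"Bearer {api_key}"},  # assumed auth scheme
        timeout=30,
    )
    response.raise_for_status()
    return response.json()
```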

Parameters

Sensitivity
A numerical value (0% - 99%) that adjusts the tolerance of anomaly detection. Lowering the sensitivity leads to fewer anomalies being detected.

Max Anomaly Ratio
A numerical value (0% - 99%) that sets the maximum ratio of points in a time series that can be flagged as anomalies. For example, if it is set to 0.25 (25%), a time series with 100 points can contain at most 25 anomaly points.
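The snippet below sketches how these two parameters might appear in a request body. The field names "sensitivity" and "maxAnomalyRatio" are assumptions for illustration; consult the AlphaX API reference for the exact names and format.

```python
# Hypothetical request body showing where the two parameters could be set.
example_request = {
    "series": [
        {"timestamp": "2024-01-01T00:00:00Z", "value": 10.2},
        {"timestamp": "2024-01-01T00:01:00Z", "value": 10.4},
        # ... up to the most recent 1000 points
    ],
    "sensitivity": 85,        # 0-99: lowering it leads to fewer anomalies detected
    "maxAnomalyRatio": 0.25,  # at most 25% of the points may be flagged
}
```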

Output

Change Points
Boolean: whether the data point is identified as a change point.

Confidence Score
A percentage value from 1.00 to 100 (inclusive) indicating how confident the model is that the data point is a change point. Applying a lower confidence threshold results in more change points being reported, and vice versa. Start with a threshold between 70 and 90 and adjust it based on the results observed in development or testing.
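The sketch below shows one way these two output fields could be consumed, applying a confidence threshold on top of the boolean flag. The response field names are assumptions for illustration and may differ from the actual AlphaX response.

```python
# Hypothetical response with parallel lists for the two output fields.
example_response = {
    "changePoints":     [False, False, True, False],
    "confidenceScores": [12.5, 20.0, 91.3, 35.0],
}

threshold = 80  # start between 70 and 90 and tune on development or test data
for i, (is_cp, score) in enumerate(
    zip(example_response["changePoints"], example_response["confidenceScores"])
):
    if is_cp and score >= threshold:
        print(f"point {i}: change point with confidence {score}%")
```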
