Commit

[Cog Serv Anomaly Detector] add sample codes (#14591)
* add samples and sample data

* add README.md for samples

* add javadoc; rename variables

* add more comments
moreOver0 committed Sep 8, 2020
1 parent 1b5a85b commit 3f42cdb
Showing 5 changed files with 391 additions and 0 deletions.
59 changes: 59 additions & 0 deletions sdk/anomalydetector/azure-ai-anomalydetector/src/samples/README.md
@@ -0,0 +1,59 @@
---
page_type: sample
languages:
- java
products:
- azure
- azure-cognitive-services
- azure-anomaly-detector
urlFragment: anomalydetector-java-samples
---

# Azure Anomaly Detector client library samples for Java

Azure Anomaly Detector samples are a set of self-contained Java programs that demonstrate interacting with the Azure Anomaly Detector service using the client library. Each sample focuses on a specific scenario and can be executed independently.

## Key concepts

Key concepts are explained in detail [here][SDK_README_KEY_CONCEPTS].

## Getting started

Getting started is explained in detail [here][SDK_README_GETTING_STARTED].

## Examples

The following sections provide code samples covering common scenario operations with the Azure Anomaly Detector client library.

All of these samples require the endpoint of your Anomaly Detector resource and its API key; a minimal client-setup sketch follows the table below.

|**File Name**|**Description**|
|----------------|-------------|
|[DetectAnomaliesEntireSeries.java][detect_anomaly_entire]|Detect anomalies as a batch|
|[DetectAnomaliesLastPoint.java][detect_anomaly_last]|Detect whether the last point is an anomaly|
|[DetectChangePoints.java][detect_change_point]|Detect change points in a series|
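
The samples all construct the synchronous client the same way. The sketch below mirrors that setup; the placeholder endpoint and key are stand-ins for your own resource values, and the types are the same `com.azure.core` and `com.azure.ai.anomalydetector` classes imported by the sample files.

```java
// Minimal client setup, mirroring the samples in this directory.
// Replace the placeholders with your Anomaly Detector resource endpoint and key.
HttpPipeline pipeline = new HttpPipelineBuilder()
    .httpClient(HttpClient.createDefault())
    .policies(
        new AzureKeyCredentialPolicy("Ocp-Apim-Subscription-Key",
            new AzureKeyCredential("<anomaly-detector-resource-key>")),
        new AddHeadersPolicy(new HttpHeaders().put("Accept", ContentType.APPLICATION_JSON)))
    .build();

AnomalyDetectorClient anomalyDetectorClient = new AnomalyDetectorClientBuilder()
    .pipeline(pipeline)
    .endpoint("<anomaly-detector-resource-endpoint>")
    .buildClient();
```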

## Troubleshooting

Troubleshooting steps can be found [here][SDK_README_TROUBLESHOOTING].

## Next steps

See [Next steps][SDK_README_NEXT_STEPS].

## Contributing

If you would like to become an active contributor to this project, please refer to our [Contribution
Guidelines][SDK_README_CONTRIBUTING] for more information.

<!-- LINKS -->
[SDK_README_CONTRIBUTING]: ../../README.md#contributing
[SDK_README_GETTING_STARTED]: ../../README.md#getting-started
[SDK_README_TROUBLESHOOTING]: ../../README.md#troubleshooting
[SDK_README_KEY_CONCEPTS]: ../../README.md#key-concepts
[SDK_README_DEPENDENCY]: ../../README.md#include-the-package
[SDK_README_NEXT_STEPS]: ../../README.md#next-steps

[detect_anomaly_entire]: ./java/com/azure/ai/anomalydetector/DetectAnomaliesEntireSeries.java
[detect_anomaly_last]: ./java/com/azure/ai/anomalydetector/DetectAnomaliesLastPoint.java
[detect_change_point]: ./java/com/azure/ai/anomalydetector/DetectChangePoints.java
97 changes: 97 additions & 0 deletions sdk/anomalydetector/azure-ai-anomalydetector/src/samples/java/com/azure/ai/anomalydetector/DetectAnomaliesEntireSeries.java
@@ -0,0 +1,97 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// Licensed under the MIT License.

package com.azure.ai.anomalydetector;

import com.azure.ai.anomalydetector.models.DetectRequest;
import com.azure.ai.anomalydetector.models.EntireDetectResponse;
import com.azure.ai.anomalydetector.models.TimeGranularity;
import com.azure.ai.anomalydetector.models.TimeSeriesPoint;
import com.azure.core.credential.AzureKeyCredential;
import com.azure.core.http.ContentType;
import com.azure.core.http.HttpClient;
import com.azure.core.http.HttpHeaders;
import com.azure.core.http.HttpPipeline;
import com.azure.core.http.HttpPipelineBuilder;
import com.azure.core.http.policy.AddHeadersPolicy;
import com.azure.core.http.policy.AzureKeyCredentialPolicy;
import com.azure.core.http.policy.HttpPipelinePolicy;

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.time.OffsetDateTime;
import java.util.List;
import java.util.stream.Collectors;

/**
* Sample for detecting anomalies across an entire time series, submitted as a batch.
*/
public class DetectAnomaliesEntireSeries {

/**
* Main method to invoke this demo.
*
* @param args Unused arguments to the program.
* @throws IOException Exception thrown when there is an error in reading all the lines from the csv file.
*/
public static void main(final String[] args) throws IOException {
String endpoint = "<anomaly-detector-resource-endpoint>";
String key = "<anomaly-detector-resource-key>";
HttpHeaders headers = new HttpHeaders()
.put("Accept", ContentType.APPLICATION_JSON);

HttpPipelinePolicy authPolicy = new AzureKeyCredentialPolicy("Ocp-Apim-Subscription-Key",
new AzureKeyCredential(key));
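// The policy above attaches the API key to every request as the Ocp-Apim-Subscription-Key header,
// which is how the Anomaly Detector service authenticates the caller.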
AddHeadersPolicy addHeadersPolicy = new AddHeadersPolicy(headers);

HttpPipeline httpPipeline = new HttpPipelineBuilder().httpClient(HttpClient.createDefault())
.policies(authPolicy, addHeadersPolicy).build();
// Instantiate a client that will be used to call the service.
AnomalyDetectorClient anomalyDetectorClient = new AnomalyDetectorClientBuilder()
.pipeline(httpPipeline)
.endpoint(endpoint)
.buildClient();

// Read the time series from the CSV file and organize it into a list of TimeSeriesPoint objects.
// The sample CSV file has no header and contains two columns: timestamp and value.
// The following is a snippet of the sample CSV file:
// 2018-03-01T00:00:00Z,32858923
// 2018-03-02T00:00:00Z,29615278
// 2018-03-03T00:00:00Z,22839355
// 2018-03-04T00:00:00Z,25948736
Path path = Paths.get("./src/samples/java/sample_data/request-data.csv");
List<String> requestData = Files.readAllLines(path);
List<TimeSeriesPoint> series = requestData.stream()
.map(line -> line.trim())
.filter(line -> line.length() > 0)
.map(line -> line.split(",", 2))
.filter(splits -> splits.length == 2)
.map(splits -> {
TimeSeriesPoint timeSeriesPoint = new TimeSeriesPoint();
timeSeriesPoint.setTimestamp(OffsetDateTime.parse(splits[0]));
timeSeriesPoint.setValue(Float.parseFloat(splits[1]));
return timeSeriesPoint;
})
.collect(Collectors.toList());

System.out.println("Detecting anomalies as a batch...");
DetectRequest request = new DetectRequest();
request.setSeries(series);
// Set the granularity to DAILY since the minimum interval between points in the sample data is one day.
request.setGranularity(TimeGranularity.DAILY);
EntireDetectResponse response = anomalyDetectorClient.detectEntireSeries(request);
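// The response reports one boolean per input point: getIsAnomaly() is aligned by index with the request series.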
if (response.getIsAnomaly().contains(true)) {
System.out.println("Anomalies found in the following data positions:");
for (int i = 0; i < request.getSeries().size(); ++i) {
if (response.getIsAnomaly().get(i)) {
System.out.print(i + " ");
}
}
System.out.println();
} else {
System.out.println("No anomalies were found in the series.");
}
}
}
91 changes: 91 additions & 0 deletions sdk/anomalydetector/azure-ai-anomalydetector/src/samples/java/com/azure/ai/anomalydetector/DetectAnomaliesLastPoint.java
@@ -0,0 +1,91 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// Licensed under the MIT License.

package com.azure.ai.anomalydetector;

import com.azure.ai.anomalydetector.models.DetectRequest;
import com.azure.ai.anomalydetector.models.LastDetectResponse;
import com.azure.ai.anomalydetector.models.TimeGranularity;
import com.azure.ai.anomalydetector.models.TimeSeriesPoint;
import com.azure.core.credential.AzureKeyCredential;
import com.azure.core.http.ContentType;
import com.azure.core.http.HttpClient;
import com.azure.core.http.HttpHeaders;
import com.azure.core.http.HttpPipeline;
import com.azure.core.http.HttpPipelineBuilder;
import com.azure.core.http.policy.AddHeadersPolicy;
import com.azure.core.http.policy.AzureKeyCredentialPolicy;
import com.azure.core.http.policy.HttpPipelinePolicy;

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.time.OffsetDateTime;
import java.util.List;
import java.util.stream.Collectors;

/**
* Sample for detecting whether the last point of a time series is an anomaly.
*/
public class DetectAnomaliesLastPoint {

/**
* Main method to invoke this demo.
*
* @param args Unused arguments to the program.
* @throws IOException Exception thrown when there is an error in reading all the lines from the csv file.
*/
public static void main(final String[] args) throws IOException {
String endpoint = "<anomaly-detector-resource-endpoint>";
String key = "<anomaly-detector-resource-key>";
HttpHeaders headers = new HttpHeaders()
.put("Accept", ContentType.APPLICATION_JSON);

HttpPipelinePolicy authPolicy = new AzureKeyCredentialPolicy("Ocp-Apim-Subscription-Key",
new AzureKeyCredential(key));
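// The policy above attaches the API key to every request as the Ocp-Apim-Subscription-Key header,
// which is how the Anomaly Detector service authenticates the caller.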
AddHeadersPolicy addHeadersPolicy = new AddHeadersPolicy(headers);

HttpPipeline httpPipeline = new HttpPipelineBuilder().httpClient(HttpClient.createDefault())
.policies(authPolicy, addHeadersPolicy).build();
// Instantiate a client that will be used to call the service.
AnomalyDetectorClient anomalyDetectorClient = new AnomalyDetectorClientBuilder()
.pipeline(httpPipeline)
.endpoint(endpoint)
.buildClient();

// Read the time series from the CSV file and organize it into a list of TimeSeriesPoint objects.
// The sample CSV file has no header and contains two columns: timestamp and value.
// The following is a snippet of the sample CSV file:
// 2018-03-01T00:00:00Z,32858923
// 2018-03-02T00:00:00Z,29615278
// 2018-03-03T00:00:00Z,22839355
// 2018-03-04T00:00:00Z,25948736
Path path = Paths.get("./src/samples/java/sample_data/request-data.csv");
List<String> requestData = Files.readAllLines(path);
List<TimeSeriesPoint> series = requestData.stream()
.map(line -> line.trim())
.filter(line -> line.length() > 0)
.map(line -> line.split(",", 2))
.filter(splits -> splits.length == 2)
.map(splits -> {
TimeSeriesPoint timeSeriesPoint = new TimeSeriesPoint();
timeSeriesPoint.setTimestamp(OffsetDateTime.parse(splits[0]));
timeSeriesPoint.setValue(Float.parseFloat(splits[1]));
return timeSeriesPoint;
})
.collect(Collectors.toList());

System.out.println("Determining if latest data point is an anomaly...");
DetectRequest request = new DetectRequest();
request.setSeries(series);
// Set the granularity to DAILY since the minimum interval between points in the sample data is one day.
request.setGranularity(TimeGranularity.DAILY);
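// detectLastPoint evaluates only the latest point in the series, treating the earlier points as history.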
LastDetectResponse response = anomalyDetectorClient.detectLastPoint(request);
if (response.isAnomaly()) {
System.out.println("The latest point was detected as an anomaly.");
} else {
System.out.println("The latest point was not detected as an anomaly.");
}
}
}
97 changes: 97 additions & 0 deletions sdk/anomalydetector/azure-ai-anomalydetector/src/samples/java/com/azure/ai/anomalydetector/DetectChangePoints.java
@@ -0,0 +1,97 @@
// Copyright (c) Microsoft Corporation. All rights reserved.
// Licensed under the MIT License.

package com.azure.ai.anomalydetector;

import com.azure.ai.anomalydetector.models.ChangePointDetectRequest;
import com.azure.ai.anomalydetector.models.ChangePointDetectResponse;
import com.azure.ai.anomalydetector.models.TimeGranularity;
import com.azure.ai.anomalydetector.models.TimeSeriesPoint;
import com.azure.core.credential.AzureKeyCredential;
import com.azure.core.http.ContentType;
import com.azure.core.http.HttpClient;
import com.azure.core.http.HttpHeaders;
import com.azure.core.http.HttpPipeline;
import com.azure.core.http.HttpPipelineBuilder;
import com.azure.core.http.policy.AddHeadersPolicy;
import com.azure.core.http.policy.AzureKeyCredentialPolicy;
import com.azure.core.http.policy.HttpPipelinePolicy;

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.time.OffsetDateTime;
import java.util.List;
import java.util.stream.Collectors;

/**
* Sample for detecting change points in a time series.
*/
public class DetectChangePoints {

/**
* Main method to invoke this demo.
*
* @param args Unused arguments to the program.
* @throws IOException Exception thrown when there is an error in reading all the lines from the csv file.
*/
public static void main(final String[] args) throws IOException {
String endpoint = "<anomaly-detector-resource-endpoint>";
String key = "<anomaly-detector-resource-key>";
HttpHeaders headers = new HttpHeaders()
.put("Accept", ContentType.APPLICATION_JSON);

HttpPipelinePolicy authPolicy = new AzureKeyCredentialPolicy("Ocp-Apim-Subscription-Key",
new AzureKeyCredential(key));
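// The policy above attaches the API key to every request as the Ocp-Apim-Subscription-Key header,
// which is how the Anomaly Detector service authenticates the caller.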
AddHeadersPolicy addHeadersPolicy = new AddHeadersPolicy(headers);

HttpPipeline httpPipeline = new HttpPipelineBuilder().httpClient(HttpClient.createDefault())
.policies(authPolicy, addHeadersPolicy).build();
// Instantiate a client that will be used to call the service.
AnomalyDetectorClient anomalyDetectorClient = new AnomalyDetectorClientBuilder()
.pipeline(httpPipeline)
.endpoint(endpoint)
.buildClient();

// Read the time series from the CSV file and organize it into a list of TimeSeriesPoint objects.
// The sample CSV file has no header and contains two columns: timestamp and value.
// The following is a snippet of the sample CSV file:
// 2018-03-01T00:00:00Z,32858923
// 2018-03-02T00:00:00Z,29615278
// 2018-03-03T00:00:00Z,22839355
// 2018-03-04T00:00:00Z,25948736
Path path = Paths.get("./src/samples/java/sample_data/request-data.csv");
List<String> requestData = Files.readAllLines(path);
List<TimeSeriesPoint> series = requestData.stream()
.map(line -> line.trim())
.filter(line -> line.length() > 0)
.map(line -> line.split(",", 2))
.filter(splits -> splits.length == 2)
.map(splits -> {
TimeSeriesPoint timeSeriesPoint = new TimeSeriesPoint();
timeSeriesPoint.setTimestamp(OffsetDateTime.parse(splits[0]));
timeSeriesPoint.setValue(Float.parseFloat(splits[1]));
return timeSeriesPoint;
})
.collect(Collectors.toList());

System.out.println("Detecting change points...");
ChangePointDetectRequest request = new ChangePointDetectRequest();
request.setSeries(series);
// Set the granularity to DAILY since the minimum interval between points in the sample data is one day.
request.setGranularity(TimeGranularity.DAILY);
ChangePointDetectResponse response = anomalyDetectorClient.detectChangePoint(request);
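// The response reports one boolean per input point: getIsChangePoint() is aligned by index with the request series.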
if (response.getIsChangePoint().contains(true)) {
System.out.println("Change points found in the following data positions:");
for (int i = 0; i < request.getSeries().size(); ++i) {
if (response.getIsChangePoint().get(i)) {
System.out.print(i + " ");
}
}
System.out.println();
} else {
System.out.println("No change points were found in the series.");
}
}
}
47 changes: 47 additions & 0 deletions sdk/anomalydetector/azure-ai-anomalydetector/src/samples/java/sample_data/request-data.csv
@@ -0,0 +1,47 @@
2018-03-01T00:00:00Z,32858923
2018-03-02T00:00:00Z,29615278
2018-03-03T00:00:00Z,22839355
2018-03-04T00:00:00Z,25948736
2018-03-05T00:00:00Z,34139159
2018-03-06T00:00:00Z,33843985
2018-03-07T00:00:00Z,33637661
2018-03-08T00:00:00Z,32627350
2018-03-09T00:00:00Z,29881076
2018-03-10T00:00:00Z,22681575
2018-03-11T00:00:00Z,24629393
2018-03-12T00:00:00Z,34010679
2018-03-13T00:00:00Z,33893888
2018-03-14T00:00:00Z,33760076
2018-03-15T00:00:00Z,33093515
2018-03-16T00:00:00Z,29945555
2018-03-17T00:00:00Z,22676212
2018-03-18T00:00:00Z,25262514
2018-03-19T00:00:00Z,33631649
2018-03-20T00:00:00Z,34468310
2018-03-21T00:00:00Z,34212281
2018-03-22T00:00:00Z,38144434
2018-03-23T00:00:00Z,34662949
2018-03-24T00:00:00Z,24623684
2018-03-25T00:00:00Z,26530491
2018-03-26T00:00:00Z,35445003
2018-03-27T00:00:00Z,34250789
2018-03-28T00:00:00Z,33423012
2018-03-29T00:00:00Z,30744783
2018-03-30T00:00:00Z,25825128
2018-03-31T00:00:00Z,21244209
2018-04-01T00:00:00Z,22576956
2018-04-02T00:00:00Z,31957221
2018-04-03T00:00:00Z,33841228
2018-04-04T00:00:00Z,33554483
2018-04-05T00:00:00Z,32383350
2018-04-06T00:00:00Z,29494850
2018-04-07T00:00:00Z,22815534
2018-04-08T00:00:00Z,25557267
2018-04-09T00:00:00Z,34858252
2018-04-10T00:00:00Z,34750597
2018-04-11T00:00:00Z,34717956
2018-04-12T00:00:00Z,34132534
2018-04-13T00:00:00Z,30762236
2018-04-14T00:00:00Z,22504059
2018-04-15T00:00:00Z,26149060
2018-04-16T00:00:00Z,35250105
