Observation Outputs

API v2 Coming Soon

API v2 introduces improvements to observation output operations. The current release uses API v1. See Migrating from API v1 for a preview of what's changing.

When Novelty detects anomalies or patterns in your data, observation outputs route those detection results to external systems. Each observation can have any number of output destinations, which are processed in parallel. Outputs support various destinations including Kafka, Kinesis, SNS, HTTP webhooks, and local files.

Output Workflows

Novelty provides workflow-based output processing with filtering and parallel destination routing. Each observation result passes through optional filtering before reaching one or more destination outputs.
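Putting these pieces together, a complete output workflow might combine one filter with a list of destinations. The `filter` block uses the predicate syntax documented in this section; the `destinations` array and the destination objects inside it are illustrative assumptions, since the full workflow schema is not shown on this page:

```json
{
  "filter": { "type": "OnlyPositiveMatch" },
  "destinations": [
    { "type": "StandardOut", "format": { "type": "JSON" } },
    { "type": "File", "path": "/var/log/novelty/results.jsonl" }
  ]
}
```

Because destinations are processed in parallel, a slow destination (such as a webhook) does not delay the others.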

Filtering

Filter results with a predicate before they reach destinations. Use OnlyPositiveMatch to route only positive matches (when the pattern first matches), filtering out cancellations:

{
  "filter": {
    "type": "OnlyPositiveMatch"
  }
}

Observation results include an isPositiveMatch metadata flag:

  • true - Pattern newly matched (positive match)
  • false - Pattern no longer matches (cancellation)
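As an illustration, a serialized result carrying this flag might look like the following; only the `isPositiveMatch` field is documented here, and the other fields are placeholders:

```json
{
  "observation": "suspicious-login-pattern",
  "timestamp": "2024-01-15T10:32:07Z",
  "isPositiveMatch": true
}
```

With the OnlyPositiveMatch filter applied, a result like this would be routed to destinations, while the later `"isPositiveMatch": false` cancellation for the same pattern would be dropped.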

Output Formats

Destinations that support configurable serialization can emit results as JSON or Protobuf.

JSON (Default):

{
  "format": { "type": "JSON" }
}

Protobuf:

{
  "format": {
    "type": "Protobuf",
    "schemaUrl": "http://schema-registry:8081/schemas/ids/1",
    "typeName": "com.example.ResultMessage"
  }
}

Destination       JSON   Protobuf
Kafka             Yes    Yes
Kinesis           Yes    Yes
SNS               Yes    Yes
ReactiveStream    Yes    Yes
File              Yes    No
HTTP Webhook      Yes    No
Standard Out      Yes    No

Output Destinations

Drop

Drops the current result and ends processing for that destination.

POST to Webhook

Makes an HTTP or HTTPS POST request for each result.
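A webhook destination needs at minimum a target URL. The `type` and `url` field names in this sketch are assumptions, as the exact webhook schema is not listed on this page; note from the format table that webhooks support JSON only:

```json
{
  "type": "HttpPost",
  "url": "https://alerts.example.com/hooks/novelty",
  "format": { "type": "JSON" }
}
```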

Log JSON to Standard Out

Prints each result as a single-line JSON object to standard output on the Novelty server.

Log JSON to a File

Writes each result as a single-line JSON object to a file on the local filesystem.
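A file destination might be configured as in the sketch below; the `type` and `path` field names are assumptions, since the file destination's settings are not listed here:

```json
{
  "type": "File",
  "path": "/var/log/novelty/results.jsonl"
}
```

Because each result is written as a single line of JSON, the resulting file can be consumed as JSON Lines.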

Publish to Kafka Topic

Publishes a record for each result to the provided Apache Kafka topic. Records can be serialized as JSON or Protocol Buffers before being published to Kafka.
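A Kafka destination configuration might look like the following sketch; the `type`, `bootstrapServers`, and `topic` field names are assumptions, as this page does not list Kafka's settings, while the `format` block uses the documented format syntax:

```json
{
  "type": "Kafka",
  "bootstrapServers": "broker1:9092,broker2:9092",
  "topic": "novelty-results",
  "format": { "type": "JSON" }
}
```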

Publish to Kinesis Stream

Publishes a record for each result to the provided Kinesis stream. Records can be serialized as JSON or Protocol Buffers before being published to Kinesis.

Setting                      Description                         Default
streamName                   Kinesis stream name                 (required)
credentials                  AWS credentials (optional)          None
region                       AWS region (optional)               None
format                       Output format (JSON or Protobuf)    JSON
kinesisParallelism           Concurrent publish operations       None
kinesisMaxBatchSize          Maximum records per batch           None
kinesisMaxRecordsPerSecond   Rate limit (records/second)         None
kinesisMaxBytesPerSecond     Rate limit (bytes/second)           None
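Using the settings above, a Kinesis destination might be configured as follows. The setting names come from the table; the `type` discriminator value and the specific values shown are assumptions for illustration:

```json
{
  "type": "Kinesis",
  "streamName": "novelty-results",
  "region": "us-east-1",
  "format": { "type": "JSON" },
  "kinesisMaxRecordsPerSecond": 500
}
```

The rate-limit and batching settings are useful for staying under Kinesis throughput quotas when an observation produces bursts of results.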

Publish to SNS Topic

Publishes a record for each result to the provided AWS SNS topic. Records can be serialized as JSON or Protocol Buffers before being published to SNS.

Setting       Description                        Default
topic         SNS topic ARN                      (required)
credentials   AWS credentials (optional)         None
region        AWS region (optional)              None
format        Output format (JSON or Protobuf)   JSON
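Using the settings above, an SNS destination might look like the following sketch. The setting names come from the table; the `type` discriminator value and the example ARN are assumptions:

```json
{
  "type": "SNS",
  "topic": "arn:aws:sns:us-east-1:123456789012:novelty-results",
  "region": "us-east-1",
  "format": { "type": "JSON" }
}
```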

Credential Validation

Ensure your credentials and topic ARN are correct. If a write to SNS fails, it is retried indefinitely. If the error is not fixable (e.g., the topic or credentials cannot be found), results will never be emitted and the output may stall.

Publish to Reactive Stream

Broadcasts results to a TCP-based reactive stream endpoint. Clients can connect to receive a continuous stream of results.

Setting   Description                         Default
address   Address to bind the stream server   localhost
port      Port to bind the stream server      (required)
format    Output format (JSON or Protobuf)    JSON
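Using the settings above, a ReactiveStream destination might be configured as follows. The setting names come from the table; the `type` discriminator value and the port chosen are assumptions. Binding to `0.0.0.0` rather than the `localhost` default allows remote clients to connect:

```json
{
  "type": "ReactiveStream",
  "address": "0.0.0.0",
  "port": 9100,
  "format": { "type": "JSON" }
}
```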

Cluster Limitation

Reactive Stream outputs do not function correctly when running in a cluster. Use Kafka or Kinesis for clustered deployments.