
Features

Use Cases
Incident Detection
StreamPipes lets you immediately detect incidents you want to avoid. We support algorithms ranging from simple threshold-based tracking of sensor measurements, through trend analysis over time periods, to the integration of custom-tailored predictive maintenance algorithms.
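As a rough illustration of the simplest variant mentioned above, threshold-based tracking of sensor measurements, the following Java sketch fires a callback whenever a reading exceeds a configured limit. The SensorReading type, field names and threshold handling are illustrative assumptions, not part of the StreamPipes API.

```java
// Illustrative threshold-based incident detection (assumed types, not StreamPipes API).
import java.util.function.Consumer;

public class ThresholdDetector {

    // Hypothetical event type for a single sensor measurement.
    public record SensorReading(String sensorId, double value, long timestampMillis) {}

    private final double threshold;
    private final Consumer<SensorReading> onIncident;

    public ThresholdDetector(double threshold, Consumer<SensorReading> onIncident) {
        this.threshold = threshold;
        this.onIncident = onIncident;
    }

    // Called for every incoming measurement; reports an incident as soon as
    // the configured threshold is exceeded.
    public void onEvent(SensorReading reading) {
        if (reading.value() > threshold) {
            onIncident.accept(reading);
        }
    }
}
```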
Data Harmonization
StreamPipes helps you create a clean data lake based on sensor measurements from machines and other assets. Various data harmonization algorithms (e.g., filters, aggregations and unit converters) let you clean and enrich data continuously.
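The unit converters mentioned above can be pictured as small, stateless transformation steps. The sketch below converts a Fahrenheit field to Celsius on a map-based event; class, method and field names are assumptions chosen for illustration, not the StreamPipes SDK.

```java
// Illustrative stateless unit conversion step (assumed field names, not StreamPipes SDK).
import java.util.HashMap;
import java.util.Map;

public class FahrenheitToCelsiusConverter {

    // Transforms one event: replaces the assumed "temperatureF" field with "temperatureC".
    public Map<String, Object> apply(Map<String, Object> event) {
        Map<String, Object> out = new HashMap<>(event);
        Object raw = out.remove("temperatureF");
        if (raw instanceof Number fahrenheit) {
            out.put("temperatureC", (fahrenheit.doubleValue() - 32.0) * 5.0 / 9.0);
        }
        return out;
    }
}
```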
Monitoring
See what's happening right now: use StreamPipes as your real-time window into your current production performance. A live dashboard and a wide range of available notification channels let you monitor KPIs in a flexible and customizable manner.
Pipeline Editor

  • Easy to use. Web-based user interface to create stream processing pipelines in a graphical editor.
  • Powerful. Select data streams, add transformation modules and choose a data sink. StreamPipes takes care of the execution details of your pipeline.
  • Intelligent. Our built-in semantics recommend elements you can connect and prevent you from connecting elements that won't work together.
Data Sources

  • Intuitive. No deep knowledge of schemas, protocols or data formats is required. From now on, a data stream is just a circle.
  • Dynamic. When your sensor data changes or you need to add new streams, just reconfigure the stream in our web interface.
  • Versatile. We have successfully integrated data streams ranging from MES systems and mobile phone data to Twitter streams.
Data Processing and Transformation

  • Filter, Aggregate, Enrich. Need less data, more data or different data? Stateless processing components transform your input streams.
  • Advanced Analytics. Advanced methods (e.g., machine learning algorithms) can be used as pipeline elements.
  • Pattern Detection. We have developed modules supporting pattern detection, e.g., detection of sequences over time (see the sketch after this list).
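As a rough idea of how such a sequence can be detected, the following sketch reports when an event of type "A" is followed by an event of type "B" within a configurable time window. The event representation, timestamps and window length are illustrative assumptions rather than StreamPipes code.

```java
// Illustrative sequence detection: "A" followed by "B" within a time window
// (assumed event representation, not a StreamPipes module).
public class SequenceDetector {

    private final long maxGapMillis;
    private long lastATimestamp = -1;

    public SequenceDetector(long maxGapMillis) {
        this.maxGapMillis = maxGapMillis;
    }

    // Feeds one event; returns true when a "B" arrives within maxGapMillis of an "A".
    public boolean onEvent(String type, long timestampMillis) {
        if ("A".equals(type)) {
            lastATimestamp = timestampMillis;
            return false;
        }
        if ("B".equals(type) && lastATimestamp >= 0
                && timestampMillis - lastATimestamp <= maxGapMillis) {
            lastATimestamp = -1; // reset so each sequence is reported only once
            return true;
        }
        return false;
    }
}
```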
Data Sinks

  • Visualize. Create real-time visualizations and observe your KPIs and production systems from web-based cockpits.
  • Notify. Detect problems or other situations of interest and notify the people in your organization who need to be informed.
  • Store. Harmonize your data stream and store it in third-party systems, e.g. for offline analysis. We have adapters for Elasticsearch, HDFS, Cassandra and other NoSQL databases (see the storage sketch after this list).
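As an illustration of the "Store" sink, the sketch below indexes one harmonized event into Elasticsearch through its document REST API; the cluster address, index name and document fields are assumptions chosen for the example. In StreamPipes itself, the built-in adapters take care of this step.

```java
// Illustrative write of one JSON event to Elasticsearch via its REST API
// (assumed cluster address, index name and fields).
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ElasticsearchStoreExample {

    public static void main(String[] args) throws Exception {
        String json = "{\"sensorId\":\"press-01\",\"temperatureC\":73.4,\"timestamp\":1700000000000}";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:9200/sensor-measurements/_doc")) // assumed cluster + index
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(json))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```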
Technical Details
  • RDF-based data model to describe semantics of streams, processing logic and sinks
  • Semantics-based matching based on schema, protocols, formats and data quality aspects
  • Arbitrary output transformations: Extract, rename, replace, append, or customize the output of data processors.
  • Web-based assisted definition of new sources, data processors and sinks. Extend the system at runtime. Code generation for supported runtime implementations.
  • Transport protocols: Support for multiple message protocols and brokers, e.g. Kafka, REST, MQTT, JMS, AMQP or STOMP (see the consumer sketch after this list).
  • Data Formats: Support for multiple message formats, e.g., JSON, XML, Thrift
  • Runtime-independent integration of heterogeneous stream processing systems: We have already integrated Apache Storm, Apache Flink, Esper and standalone algorithms.
  • Intelligent Monitoring: Detection of sensor failures and semi-automatic replacement with backup sensors
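To make the transport and format support more concrete, here is a minimal sketch of a consumer that reads JSON-encoded sensor events from a Kafka topic using the standard Kafka client. The broker address, topic name and consumer group are assumptions for the example.

```java
// Illustrative Kafka consumer for JSON-encoded sensor events
// (assumed broker, topic and group id).
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

import java.time.Duration;
import java.util.List;
import java.util.Properties;

public class SensorTopicConsumer {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");               // assumed broker
        props.put("group.id", "sensor-demo");                           // assumed group id
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("sensor-data"));                  // assumed topic
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    // Each value is a JSON-encoded sensor event; parse it with a JSON
                    // library of your choice before handing it to downstream processors.
                    System.out.println(record.value());
                }
            }
        }
    }
}
```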