StreamPipes

Features

Example Use Cases
"Monitor the current scrap rate per production machine, detect a sudden increase of scrap and notify the production manager "
"Calculate the moving average of temperature values, group values into categories and store them in Elasticsearch "
"Show me real-time sales data for convenience products in Europe. "
Pipeline Editor

  • Easy to use. A web-based user interface lets you create stream processing pipelines in a graphical editor.
  • Powerful. Select data streams, add transformation modules, and choose a data sink (see the sketch below); StreamPipes takes care of the execution details of your pipeline.
  • Intelligent. Built-in semantics recommend elements you can connect and prevent you from connecting elements that won't work.
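Conceptually, a pipeline built in the editor is a chain from a data stream through one or more processors to a sink. The following plain-Java sketch models that chain; it is an illustration only, not the StreamPipes API, and all type and element names are made up:

```java
// Minimal, hypothetical model of a pipeline as built in the graphical editor:
// a data stream, a list of processors, and a sink.
import java.util.List;

public class PipelineSketch {

    record DataStream(String name) {}
    record DataProcessor(String name) {}
    record DataSink(String name) {}
    record Pipeline(DataStream source, List<DataProcessor> processors, DataSink sink) {}

    public static void main(String[] args) {
        // The scrap-rate use case from above, expressed as such a chain.
        Pipeline pipeline = new Pipeline(
                new DataStream("machine-scrap-rate"),
                List.of(new DataProcessor("increase-detection")),
                new DataSink("notify-production-manager"));
        System.out.println(pipeline);
    }
}
```
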
Data Sources

  • Intuitive. No deep knowledge of schemas, protocols, or data formats is required. From now on, a data stream is just a circle.
  • Dynamic. Once your sensor data changes or you need to add new streams, just reconfigure the stream in our web interface.
  • Versatile. We have successfully integrated data streams ranging from MES systems and mobile phone data to Twitter streams.
Data Processing and Transformation

  • Filter, Aggregate, Enrich. Need less data, more data, or different data? Stateless processing components transform your input streams (see the sketch below).
  • Advanced Analytics. Advanced methods (e.g., ML algorithms) can be used as pipeline elements.
  • Pattern Detection. We have developed modules supporting pattern detection, e.g., detection of sequences over time.
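To make the stateless transformation idea concrete, here is a minimal plain-Java sketch of a filter step that forwards or drops each event based on a predicate. It illustrates the concept only, not the StreamPipes SDK; the event layout (a flat key/value map) is an assumption:

```java
import java.util.Map;
import java.util.Optional;
import java.util.function.Predicate;

public class FilterProcessorSketch {

    // A stateless filter: each event is forwarded or dropped independently,
    // with no state kept between events.
    static Optional<Map<String, Object>> filter(Map<String, Object> event,
                                                Predicate<Map<String, Object>> keep) {
        return keep.test(event) ? Optional.of(event) : Optional.empty();
    }

    public static void main(String[] args) {
        Map<String, Object> event = Map.of("machineId", "press-01", "temperature", 81.5);

        // Forward only events whose temperature exceeds a threshold.
        filter(event, e -> (double) e.get("temperature") > 80.0)
                .ifPresent(e -> System.out.println("forwarded: " + e));
    }
}
```
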
Data Sinks

  • Visualize. Create real-time visualizations and observe your KPIs and production systems from web-based cockpits.
  • Notify. Detect problems or other situations of interest and notify the people in your organization who need to be informed.
  • Store. Harmonize your data stream and store it in third-party systems, e.g., for offline analysis. We have adapters for Elasticsearch, HDFS, Cassandra, and other NoSQL databases.
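The contract of a data sink can be sketched as a single callback that receives each harmonized event and hands it to an external system. The interface below is hypothetical (a real adapter would, for example, index the event into Elasticsearch); here the "external system" is simply standard output:

```java
import java.util.Map;

public class SinkSketch {

    // Hypothetical sink contract: one call per incoming event.
    interface EventSink {
        void onEvent(Map<String, Object> event);
    }

    public static void main(String[] args) {
        // Stand-in for a real adapter (Elasticsearch, HDFS, Cassandra, ...).
        EventSink consoleSink = event -> System.out.println("stored: " + event);

        consoleSink.onEvent(Map.of("sensor", "temp-7", "value", 23.4));
    }
}
```
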
Technical Details

  • RDF-based data model to describe semantics of streams, processing logic and sinks
  • Semantics-based matching of pipeline elements, using schema, protocol, format, and data quality aspects
  • Arbitrary output transformations: Extract, rename, replace, append, or customize the output of data processors.
  • Web-based assisted definition of new sources, data processors and sinks. Extend the system at runtime. Code generation for supported runtime implementations.
  • Transport protocols: Support for multiple message protocols and brokers, e.g., Kafka, REST, MQTT, JMS, AMQP, or STOMP (see the Kafka sketch after this list).
  • Data Formats: Support for multiple message formats, e.g., JSON, XML, Thrift
  • Runtime-independent integration of heterogeneous stream processing systems: We have already integrated Apache Storm, Apache Flink, Esper and standalone algorithms.
  • Intelligent Monitoring: Detection of sensor failures and semi-automatic replacement with backup sensors
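As an example of the transport layer, the sketch below publishes one JSON-serialized event to Kafka using the standard Apache Kafka Java client (the kafka-clients dependency). The broker address and topic name are placeholders, and this is illustrative wiring, not StreamPipes-internal code:

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class KafkaTransportSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder broker address
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");

        // One event in JSON format, as it might travel between pipeline elements.
        String event = "{\"machineId\":\"press-01\",\"temperature\":81.5}";

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("sensor-events", event)); // placeholder topic
        }
    }
}
```
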
Screenshots