"Monitor the current scrap rate per production machine, detect a sudden increase of scrap and notify the production manager."
"Calculate the moving average of temperature values, group values into categories and store them in Elasticsearch."
"Show me real-time sales data for convenience products in Europe."
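The second use case above can be sketched in a few lines of plain Python. This is only an illustration of the idea, not StreamPipes code; the window size and the category thresholds are made-up assumptions:

```python
from collections import deque

def moving_average(values, window=5):
    """Compute a simple moving average over a sliding window."""
    buf = deque(maxlen=window)
    averages = []
    for v in values:
        buf.append(v)
        averages.append(sum(buf) / len(buf))
    return averages

def categorize(avg, low=20.0, high=30.0):
    """Map an averaged temperature to a coarse category (thresholds are illustrative)."""
    if avg < low:
        return "LOW"
    if avg < high:
        return "NORMAL"
    return "HIGH"

temps = [18.0, 22.5, 31.0, 29.5, 35.0]
labeled = [(round(a, 2), categorize(a)) for a in moving_average(temps, window=3)]
```

In a real pipeline, the categorized values would then be forwarded to an Elasticsearch sink instead of a Python list.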
Easy to use.
Web-based user interface to create stream processing pipelines in a graphical editor.
Select data streams, add transformation modules and choose a data sink. StreamPipes takes care of the execution details of your pipeline.
Our built-in semantics recommend elements you can connect and prevent you from connecting elements that won't work.
No deeper knowledge of schemas, protocols or data formats required. From now on, a data stream is just a circle.
Once your sensor data changes or you need to add new streams, just reconfigure the stream in our web interface.
We have successfully integrated data streams ranging from MES systems and mobile phone data to Twitter streams.
Data Processing and Transformation
Filter, Aggregate, Enrich.
Need less data, more data, or different data? Stateless processing components transform your input streams.
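Stateless here means each event is handled on its own, with no window or history to maintain. A minimal sketch of two such components, a filter and an enrichment step, in plain Python (the field names `machine`, `temp` and `site` are hypothetical, not StreamPipes APIs):

```python
def filter_events(events, predicate):
    """Stateless filter: keep only events matching a predicate."""
    return [e for e in events if predicate(e)]

def enrich(event, extra):
    """Stateless enrichment: append static fields to each event."""
    return {**event, **extra}

stream = [{"machine": "m1", "temp": 78}, {"machine": "m2", "temp": 52}]
hot = filter_events(stream, lambda e: e["temp"] > 60)
tagged = [enrich(e, {"site": "plant-a"}) for e in hot]
```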
Advanced methods (e.g., ML algorithms) can be used as pipeline elements.
We have developed modules supporting pattern detection, e.g., detection of sequences over time.
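Sequence detection means recognizing one event type followed by another within a bounded time gap. The sketch below shows the core idea in plain Python under simplifying assumptions (events are `(timestamp, label)` pairs, only one pending match is tracked); it is not the module's actual implementation:

```python
def detect_sequence(events, first, second, max_gap=10.0):
    """Detect occurrences of `first` followed by `second`
    within `max_gap` time units."""
    matches = []
    pending = None  # timestamp of the last unmatched `first` event
    for ts, label in events:
        if label == first:
            pending = ts
        elif label == second and pending is not None:
            if ts - pending <= max_gap:
                matches.append((pending, ts))
            pending = None
    return matches

events = [(0.0, "overheat"), (4.0, "shutdown"),
          (20.0, "overheat"), (40.0, "shutdown")]
found = detect_sequence(events, "overheat", "shutdown", max_gap=10.0)
```

Only the first pair matches: the second "shutdown" arrives 20 time units after its "overheat", outside the allowed gap.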
Create real-time visualizations and observe your KPIs and production systems from web-based cockpits.
Detect problems or other situations of interest and notify the people in your organization who need to be informed.
Harmonize your data stream and store it in third-party systems, e.g., for offline analysis. We provide adapters for Elasticsearch, HDFS, Cassandra and other NoSQL databases.
Here are the technical details
RDF-based data model to describe semantics of streams, processing logic and sinks
Semantics-based matching based on schema, protocols, formats and data quality aspects
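The schema part of this matching boils down to checking that a stream provides every field a processor requires, with a compatible type. A deliberately simplified sketch (exact type equality only; real matching also considers protocols, formats and quality aspects, and the schemas below are invented examples):

```python
def schema_matches(stream_schema, required):
    """Return True if the stream provides every required field
    with exactly the required type (simplified compatibility check)."""
    return all(stream_schema.get(name) == typ for name, typ in required.items())

stream_schema = {"timestamp": "long", "temp": "double", "machine": "string"}
avg_processor_needs = {"timestamp": "long", "temp": "double"}
geo_processor_needs = {"lat": "double", "lon": "double"}
```

With this check, the editor could recommend the averaging processor for the stream and reject the geo processor.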
Arbitrary output transformations: Extract, rename, replace, append, or customize the output of data processors.
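The effect of these output transformations on a single event can be sketched as one function that extracts (keeps), renames and appends fields. The function and field names are hypothetical, chosen only to illustrate the operations listed above:

```python
def transform_output(event, keep=None, rename=None, append=None):
    """Apply extract (keep), rename, and append steps to one event."""
    out = {k: v for k, v in event.items() if keep is None or k in keep}
    for old, new in (rename or {}).items():
        if old in out:
            out[new] = out.pop(old)
    out.update(append or {})
    return out

raw = {"t": 1700000000, "val": 3.2, "debug": "x"}
shaped = transform_output(raw, keep={"t", "val"},
                          rename={"t": "timestamp"}, append={"unit": "bar"})
```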
Web-based assisted definition of new sources, data processors and sinks. Extend the system at runtime. Code generation for supported runtime implementations.
Transport protocols: Support for multiple message protocols and brokers, e.g. Kafka, REST, MQTT, JMS, AMQP or STOMP.
Data Formats: Support for multiple message formats, e.g., JSON, XML, Thrift
Runtime-independent integration of heterogeneous stream processing systems: We have already integrated Apache Storm, Apache Flink, Esper and standalone algorithms.
Intelligent Monitoring: Detection of sensor failures and semi-automatic replacement with backup sensors
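Two common failure signals are stale readings (the sensor stopped sending) and implausible values (the sensor sends garbage). A minimal sketch of falling back to a backup sensor on either signal, with made-up staleness and validity thresholds:

```python
def pick_reading(primary, backup, now, max_age=5.0, valid=(-40.0, 125.0)):
    """Return the primary reading unless it is stale or out of range,
    otherwise fall back to the backup sensor.
    Readings are (timestamp, value) pairs."""
    ts, value = primary
    if now - ts <= max_age and valid[0] <= value <= valid[1]:
        return value, "primary"
    return backup[1], "backup"

# The primary value 300.0 is outside the plausible range, so the
# backup sensor's reading is used instead.
value, source = pick_reading(primary=(99.5, 300.0), backup=(99.0, 21.7), now=100.0)
```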