
IoT Timeseries Data Streaming¶

Overview¶

Currently, to fetch time series data, an Insights Hub application has to query all asset instances, compare the timestamps, find the latest time series data, and update the application. This involves multiple requests through the Industrial IoT Gateway, which is expensive, suboptimal, and not customer friendly.

With time series data streaming, third-party applications receive time series data streamed by Insights Hub in near real time whenever new data arrives for the registered assets. This mechanism enables customers to consume time series data directly in their applications. It also improves the usability of Insights Hub services and the developer experience.

Note

IoT Timeseries Data Streaming is available only in Early Access (EA) for Local Private Cloud customers. The feature can be provisioned via offering 21000 for the respective customer tenant. Reach out to customer support to avail this feature.

Streaming output¶

The streaming component receives the data in the following format:

  1. owner (String) - Represented as tenantId/assetId/aspectName.
  2. registrationId (String) - The unique ID of the validated registration. The registrationId can be used to segregate the data.
  3. data (List<Map<String, Object>>) - The records appear in the order in which they were ingested into time series, for example:
```json
{
    "owner": "siemens/28068dafbf7046d2a7a2be4bccb49409/status",
    "registrationId": "c19677f8-e215-4a56-975d-8ee7f3ba4538",
    "data": [
        {
            "_time": "2024-03-27T01:21:00Z",
            "Pressure": 22
        },
        {
            "_time": "2024-03-27T01:22:00Z",
            "Pressure": 21,
            "Status": true,
            "Remarks": "all_good"
        },
        {
            "_time": "2024-03-27T01:23:00Z",
            "Pressure": 24,
            "Temperature": 102
        }
    ]
}
```
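
As an illustration, this payload can be read with a standard Kafka consumer. The sketch below is a minimal example, assuming a topic named `timeseries-stream` and a broker at `localhost:9092` (both hypothetical placeholders); it uses the Apache Kafka Java client and Jackson to deserialize each message into the structure shown above.

```java
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

public class TimeseriesStreamConsumer {

    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Hypothetical broker address and group ID; replace with your destination settings.
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "timeseries-demo");
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");

        ObjectMapper mapper = new ObjectMapper();

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("timeseries-stream"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    JsonNode payload = mapper.readTree(record.value());
                    // owner is "tenantId/assetId/aspectName"
                    String owner = payload.get("owner").asText();
                    String registrationId = payload.get("registrationId").asText();
                    // "data" holds the time series records in ingestion order.
                    for (JsonNode row : payload.get("data")) {
                        System.out.printf("%s [%s] %s%n", owner, registrationId, row);
                    }
                }
            }
        }
    }
}
```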

Limitations¶

  • Only asset-level registrations are supported.
  • Only 10 registrations are allowed per tenant, with a maximum of 50 assets per registration.
  • Customers can have up to 5 destinations per tenant.
  • Only Kafka is supported as the destination streaming component.
  • Updating a registration is not supported; customers need to delete and recreate the registration. (In case of key/secret rotation, the customer needs to delete, recreate, and validate the registration.)
  • The same assetId cannot be used in or across multiple registrations.
  • Validation of a stream registration can fail (BROKEN) for several reasons; the cause is specified in the error field. If the failure was caused by timeouts, a simple retry suffices; otherwise, the user needs to delete and re-create the registration.
  • Asset deletion is not reflected in registrations. Deleted assets may still be visible in a registration, but their data will not be streamed.
  • Changes to aspect/variable metadata (rename/update/delete) may not be reflected in the output data for up to 30 minutes.

Setting up the destination¶

Customers can create a Kafka stream in the same Kafka cluster as their customer account, or deploy a Kafka cluster on an external server.

  • If the Kafka setup is in the same cluster as the customer account, the customer only needs to provide the ZooKeeper broker address and the stream name while creating the registration (see the sketch after this list).
  • If the Kafka cluster is created on an external server:
    • The customer needs to provide a publicly accessible route.
    • The customer needs to take care of encryption in transit and the other security aspects of the Kafka cluster.
    • It is recommended to follow Kafka best practices for optimum performance.
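
As an illustration of preparing the destination, the sketch below creates a Kafka topic (the "stream") using the Kafka Java AdminClient. The broker address and topic name are hypothetical placeholders; use the values you intend to supply when creating the registration.

```java
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;

import java.util.Collections;
import java.util.Properties;

public class CreateDestinationStream {

    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Hypothetical broker address; for an external cluster this must be publicly reachable.
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "broker.example.com:9092");

        try (AdminClient admin = AdminClient.create(props)) {
            // One partition and a replication factor matching your cluster size.
            NewTopic topic = new NewTopic("timeseries-stream", 1, (short) 1);
            admin.createTopics(Collections.singleton(topic)).all().get();
            System.out.println("Destination stream created: timeseries-stream");
        }
    }
}
```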

FAQs¶

  1. What will be the delay in getting time series data after ingestion?
    • The delay depends on the intermediate processing and validation at various stages (Edge, Connectivity, and IoT time series streaming) before the data is forwarded to the customer's streaming component. It may vary from case to case; however, the functionality is built to deliver data in near real time.
  2. Is there a retry mechanism if the customer's destination component is unable to receive data?
    • The functionality internally performs an exponential backoff retry for failed messages. If delivery fails despite that, the registration is marked BROKEN.
  3. What if a customer is interested in streaming only a specific variable's data?
    • Customers need to filter the variable's data at the recipient end to consume it in their respective use case, as in the sketch below.
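
As a minimal sketch of such recipient-side filtering, the snippet below keeps only a single variable (`Pressure`, a hypothetical choice) from the `data` array of a received payload, using Jackson. The payload format is the one shown under "Streaming output".

```java
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class VariableFilter {

    // Prints (timestamp, value) pairs for one variable from a streamed payload.
    public static void printVariable(String payloadJson, String variable) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        JsonNode payload = mapper.readTree(payloadJson);
        for (JsonNode row : payload.get("data")) {
            // A record only contains a variable if it was present at ingestion time.
            if (row.has(variable)) {
                System.out.printf("%s %s=%s%n",
                        row.get("_time").asText(), variable, row.get(variable));
            }
        }
    }

    public static void main(String[] args) throws Exception {
        String sample = "{\"owner\":\"siemens/28068dafbf7046d2a7a2be4bccb49409/status\","
                + "\"registrationId\":\"c19677f8-e215-4a56-975d-8ee7f3ba4538\","
                + "\"data\":[{\"_time\":\"2024-03-27T01:21:00Z\",\"Pressure\":22},"
                + "{\"_time\":\"2024-03-27T01:22:00Z\",\"Pressure\":21,\"Status\":true}]}";
        printVariable(sample, "Pressure");  // hypothetical variable of interest
    }
}
```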