
IoT TS Bulk Client for Python¶

Introduction¶

The IoT TS Bulk Python client allows you to use IoT TS Bulk APIs while developing applications for Insights Hub. Refer to IoT TS Bulk Service for more information about the service.

Usage of the IoT TS Bulk SDK library is further demonstrated in a sample project that you can download and test locally or on an Insights Hub application. Please refer to this repository: industrial-iot-python-sdk-examples

Hint

In the IoT context, assets are referred to as entities and aspects as property sets.
Placeholders in the following samples are indicated by angular brackets < >.

IoT TS Bulk Operations¶

Client Name: BulkImportOperationsClient

Create bulk import job for importing time series data¶

Creates an import job resource to asynchronously import IoT time series data from files uploaded through the IoT File Service. After successful creation of an import job, the provided file contents are validated and imported in the background. The status of a job can be retrieved using the returned job ID. Note that if validation errors occur during or after job creation, no time series data is imported from any of the provided files.

Restrictions:

- Currently only one asset-aspect (entity-property set) combination can be specified as the target of the import.
- Data for performance assets (entities) must be older than 30 minutes in order to be imported; for simulation assets (entities) no restriction on minimum age exists.
- For simulation assets (entities), all data must be within the same hour. For performance assets (entities), all data must be within the same day.
- The overall size of the files used to import data for one asset-aspect (entity-property set) combination is limited: for simulation assets (entities), a maximum of 350 MB per hour is allowed; for performance assets (entities), a maximum of 1 GB per day is allowed.
- Hour and day intervals are fixed with respect to UTC time hours and days.
- A maximum of 100 files can be specified per request.
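The time-window restrictions above can be sketched as a client-side pre-check before submitting a job. This is an illustrative helper, not part of the SDK; the function and message names are our own.

```python
from datetime import datetime, timedelta, timezone

# Illustrative pre-check of the documented time-window restrictions.
# Helper name and messages are assumptions, not part of the SDK.
def check_import_window(start, end, is_simulation, now=None):
    """Return a list of violated restrictions for one file's time range."""
    now = now or datetime.now(timezone.utc)
    problems = []
    if is_simulation:
        # Simulation data must fall within a single UTC hour
        if (start.date(), start.hour) != (end.date(), end.hour):
            problems.append("data spans more than one UTC hour")
    else:
        # Performance data must fall within a single UTC day ...
        if start.date() != end.date():
            problems.append("data spans more than one UTC day")
        # ... and must be older than 30 minutes
        if end > now - timedelta(minutes=30):
            problems.append("data is younger than 30 minutes")
    return problems
```

Running such a check locally avoids submitting a job that the service will reject during background validation.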

# Import the RestClientConfig and UserToken from mindsphere_core module
from mindsphere_core import RestClientConfig
from mindsphere_core import UserToken

# Import the MindsphereError from mindsphere_core.exceptions module
from mindsphere_core.exceptions import MindsphereError

# Import the BulkImportOperationsClient and request/model classes from the
# IoT TS Bulk module (the module name may vary with the SDK version)
from iottsbulk import BulkImportOperationsClient
from iottsbulk import CreateImportJobRequest, BulkImportInput, Data, FileInfo


# Create the BulkImportOperationsClient object using the RestClientConfig
# object ('config') and UserToken object ('credentials') initialized earlier
bulkImportOperationsClient = BulkImportOperationsClient(rest_client_config = config, mindsphere_credentials = credentials)

try:
    # Create the request object
    createImportJobRequest = CreateImportJobRequest()
    bulkImportInput = BulkImportInput()
    data = Data()
    data.entity = "5908ae5c5e4f4e18b0be58cd21ee675f"
    data.property_set_name = "test_2020_11_11"

    fileInfo = FileInfo()
    fileInfo.file_path = "test11.json"
    fileInfo._from = "2020-12-16T04:30:00.01Z"
    fileInfo.to = "2020-12-16T04:35:00.30Z"
    fileInfoList = [fileInfo]

    data.timeseries_files = fileInfoList
    dataList = [data]

    bulkImportInput.data = dataList
    createImportJobRequest.bulk_import_input = bulkImportInput

    # Initiate the API call
    response = bulkImportOperationsClient.create_import_job(createImportJobRequest)

except MindsphereError as err:
    # Exception handling
    print(err)

Retrieve status of bulk import job¶

Retrieves the status of a bulk import job using the job ID returned on creation.

try:
    # Create the request object
    retrieveImportJobRequest = RetrieveImportJobRequest()
    retrieveImportJobRequest.id = "4e37b10ad4774b8bb391209a9d18f1b6"

    # Initiate the API call
    response = bulkImportOperationsClient.retrieve_import_job(retrieveImportJobRequest)

except MindsphereError as err:
    # Exception handling
    print(err)
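Since the import runs asynchronously, a common pattern is to poll the job status until it reaches a terminal state. The sketch below is a generic polling helper; `fetch_status` stands in for a call such as `retrieve_import_job`, and the status strings are assumptions about the job lifecycle, not confirmed SDK values.

```python
import time

# Hypothetical polling helper: 'fetch_status' is any callable returning the
# current job status string; terminal state names here are assumptions.
def wait_for_job(fetch_status, interval=5.0, timeout=300.0, sleep=time.sleep):
    """Poll fetch_status() until a terminal state or the timeout elapses."""
    waited = 0.0
    while True:
        status = fetch_status()
        if status in ("SUCCESS", "ERROR"):
            return status
        if waited >= timeout:
            raise TimeoutError("import job still %r after %.0f s" % (status, timeout))
        sleep(interval)
        waited += interval
```

In an application, `fetch_status` would wrap the SDK call and extract the status field from the retrieved job resource.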

Client Name: ReadOperationsClient

Retrieve time series data¶

Retrieves time series data for a single asset (entity) and aspect (property set) within the specified time range. If the 'select' parameter is used, only the selected properties are returned. The maximum number of time series data items returned per request is defined by the 'limit' parameter. If more items are present in the requested time range, only a subset is returned; in that case the response property 'nextRecord' contains the request URL for fetching the next set of items, with the 'from' parameter increased accordingly.

# Import the ReadOperationsClient and request class from the IoT TS Bulk
# module (the module name may vary with the SDK version)
from iottsbulk import ReadOperationsClient, RetrieveTimeseriesRequest

# Create the ReadOperationsClient object using the RestClientConfig and UserToken objects
readOperationsClient = ReadOperationsClient(rest_client_config = config, mindsphere_credentials = credentials)

try:
    # Create the request object
    retrieveTimeseriesRequest = RetrieveTimeseriesRequest()
    retrieveTimeseriesRequest.entity = "5908ae5c5e4f4e18b0be58cd21ee675f"
    retrieveTimeseriesRequest.property_set_name = "test_2020_11_11"
    retrieveTimeseriesRequest._from = "2020-11-11T02:50:00Z"
    retrieveTimeseriesRequest.to = "2020-11-11T03:52:00Z"

    # Initiate the API call
    response = readOperationsClient.retrieve_timeseries(retrieveTimeseriesRequest)

except MindsphereError as err:
    # Exception handling
    print(err)
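When the response is truncated, the 'nextRecord' property described above can be followed in a loop to collect the full result set. The sketch below is generic: `fetch_page` stands in for the actual retrieval call, and the response field names used here are assumptions for illustration.

```python
# Hypothetical pagination helper: 'fetch_page' is any callable that takes a
# request URL and returns a dict with "records" (a list) and "nextRecord"
# (the next URL, or None when all data has been returned).
def collect_all_records(fetch_page, first_url):
    """Follow nextRecord links until the service reports no more data."""
    records, url = [], first_url
    while url:
        page = fetch_page(url)
        records.extend(page["records"])
        url = page.get("nextRecord")
    return records
```

In practice, increasing the 'limit' parameter reduces the number of round trips, subject to the service's per-request maximum.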