Blob
Machine data can be sent via AWS S3, Azure Blob Storage, or an S3-compatible data store such as Oracle Object Storage. Senseye periodically gathers the files and stores the time series data in Senseye's time series database.
Data should be sent in one of our standard formats. For time series data, these are CSV, JSON, or Parquet. For vibration data, these are Waveform CSV/JSON or Spectrum CSV/JSON. See their respective pages for more information on the structure we expect. We can also support custom data formats - please contact support for more information.
Note
Once a file has been processed, it will be deleted from the data store to avoid reprocessing.
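As an example, a time series extract can be written in any of the standard formats above before it is uploaded. The sketch below uses pandas; the column names, asset identifier, and file names are illustrative placeholders, not the exact structure we expect - see the format pages for that.

```python
# Illustrative sketch: write a small time series extract in two of the
# standard formats. The columns ("timestamp", "asset", "value") and the
# file names are placeholders, not the required structure.
import pandas as pd

readings = pd.DataFrame(
    {
        "timestamp": pd.to_datetime(
            ["2024-01-01T00:00:00Z", "2024-01-01T00:01:00Z"]
        ),
        "asset": ["pump_01", "pump_01"],
        "value": [1.02, 1.05],
    }
)

readings.to_csv("pump_01_readings.csv", index=False)          # CSV
readings.to_parquet("pump_01_readings.parquet", index=False)  # Parquet (requires pyarrow)
```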
AWS S3¶
Senseye can retrieve data files from your bucket, or we can create one for your use. If Senseye is to connect to your bucket, we require the following information:
- Access key ID and secret access key for an account with read and delete permissions on the bucket's contents
- Bucket name
Files should be placed at the root of the bucket. As noted above, we delete files once they have been processed; depending on your data volume, you may wish to enable versioning on the bucket so that files can be recovered if required.
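As a sketch, a file can be pushed to the root of the bucket with boto3. The bucket name, region, file name, and credentials below are placeholders.

```python
# Sketch: upload a data file to the root of the S3 bucket with boto3.
# Bucket name, region, file name and credentials are placeholders.
import boto3

s3 = boto3.client(
    "s3",
    region_name="eu-west-1",
    aws_access_key_id="YOUR_ACCESS_KEY_ID",
    aws_secret_access_key="YOUR_SECRET_ACCESS_KEY",
)

# A key with no prefix places the file at the root of the bucket.
s3.upload_file("pump_01_readings.csv", "your-senseye-bucket", "pump_01_readings.csv")
```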
For S3-compatible data stores, the only additional requirement is to provide an endpoint URL for the data store.
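The same client can be pointed at an S3-compatible store by passing its endpoint URL; the endpoint shown below is a placeholder.

```python
# Sketch: the same upload against an S3-compatible store via its endpoint URL.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://objectstorage.example.com",  # placeholder endpoint
    aws_access_key_id="YOUR_ACCESS_KEY_ID",
    aws_secret_access_key="YOUR_SECRET_ACCESS_KEY",
)
s3.upload_file("pump_01_readings.csv", "your-senseye-bucket", "pump_01_readings.csv")
```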
Azure Blob¶
As with S3, we can retrieve data files from your storage account, or we can create one for your use. If Senseye is to connect to your storage account, we require the following information:
- Storage account name
- Container name
- Access key associated with the storage account
Files can be placed within a directory structure if desired, but this is not required.
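As a sketch, a file can be uploaded with the azure-storage-blob SDK. The account name, container name, access key, and blob path below are placeholders.

```python
# Sketch: upload a data file to the container with the azure-storage-blob SDK.
# Account name, container name, access key and blob path are placeholders.
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient(
    account_url="https://yourstorageaccount.blob.core.windows.net",
    credential="YOUR_STORAGE_ACCOUNT_ACCESS_KEY",
)
container = service.get_container_client("your-container")

# A directory-style prefix in the blob name is optional.
with open("pump_01_readings.csv", "rb") as data:
    container.upload_blob("pump_01/pump_01_readings.csv", data, overwrite=True)
```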