Ingestion API
With the Ingestion API, you can upload data to SAP Signavio Process Intelligence from any connection and then get the status of the data upload request. The API accepts CSV and TSV files.
First, create a data source with the Ingestion API source system or a process data pipeline. Then you can upload data and get the status of the upload request. Read more in section Upload Data Using the Ingestion API.
Access the ingestion API
To use the Ingestion API, you need an API access token. The token authenticates requests from your third-party connection. Read more in section Ingestion API Authentication.
Use the ingestion API
This section provides the reference information for the Ingestion API.
Ingestion API base URL
All API endpoints are relative to the base URL. The endpoint URL is generated when you create an Ingestion API connection. Read more in section Connector - Ingestion API.
In the following region, this URL format applies: https://baseURL/ingestion/data
Region baseURL:
- EU: spi-etl-ingestion.eu-prod-cloud-os-eu.suite-saas-prod.signav.io/
In the following regions, this URL format applies: https://baseURL/spi/ingestions/v1/data
Region baseURLs:
- Australia (AU): api.au.signavio.cloud.sap/
- Canada (CA): api.ca.signavio.cloud.sap/
- Japan (JP): api.jp.signavio.cloud.sap/
- Singapore (SGP): api.sgp.signavio.cloud.sap/
- South Korea (KR): api.kr.signavio.cloud.sap/
- USA (US): api.us.signavio.cloud.sap/
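For example, assuming the US region, the data upload endpoint would be composed as follows (illustrative only; your actual endpoint URL is generated when you create the Ingestion API connection):

https://api.us.signavio.cloud.sap/spi/ingestions/v1/data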
Requests
Request Type: POST

Header:
Accept: application/json
Content-Type: multipart/form-data
Authorization: Bearer <<ACCESS_TOKEN>>

Form-data:
schema= {
  "type": "record",
  "name": "users",
  "fields": [
    { "name": "id", "type": ["null", "int"] },
    { "name": "birthdate", "type": { "type": "int", "logicalType": "date" } },
    { "name": "createdAt", "type": { "type": "long", "logicalType": "timestamp-millis" } },
    { "name": "arrivalTime", "type": { "type": "int", "logicalType": "time-millis" } }
  ]
}
primaryKeys= "id"
delimiter= ","
<<files>>

Response:
{
  "status": 200,
  "payload": {
    "executionId": <<REQUEST-UUID>>
  }
}
- <<ACCESS_TOKEN>>: The API token from the connection.
- <<files>>: The data to be pushed.
- schema: The JSON (Avro) schema for which this data is pushed.
- primaryKeys: The comma-separated list of primary keys, for example primaryKeys="key1,key2,key3" (values must exist in the provided schema).
- delimiter: (optional) The delimiter character separating the data in the files. Any character can be used; if no value is provided, the delimiter defaults to a comma (",").
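The executionId returned in the response payload is the <<REQUEST-UUID>> that the Ingestion status API expects. A minimal Python sketch of reading it after an upload call made with the requests library (placeholders as above; the schema and file values are abbreviated and hypothetical):

import requests

response = requests.post(
    'https://<<base-url>>/ingestion/data',
    headers={'Authorization': 'Bearer <<ACCESS_TOKEN>>'},
    data={'schema': '<<schema-json>>', 'primaryKeys': 'id'},
    files={'file1': open('<<file-path>>', 'rb')},
)
# The executionId is used as <<REQUEST-UUID>> when calling the Ingestion status API.
execution_id = response.json()['payload']['executionId']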
Response codes
Code | Description |
---|---|
200 | OK |
400 | Bad Request |
401 | Unauthorized |
Ingestion API schema definition
A schema is represented in JSON following the Apache Avro specification, using the type record to define the table structure. No other type may be used for this field.
For information about supported data types, read more in section Supported data types.
Example:
{ "type": "record", "name": "tableName", "fields" : [ {"name": "id", "type": "long"}, {"name": "title", "type": "string"} ] }
CSV example
The following shows the contents of an example CSV file that uses a comma as the delimiter.
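As an illustration only (the columns here are hypothetical and match the schema example above):

id,title
1,First record
2,Second record
3,Third record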
cURL example

curl --location --request POST 'https://<<base-url>>/ingestion/data' \
--header 'Authorization: Bearer <<ACCESS_TOKEN>>' \
--form 'schema="{
\"type\": \"record\",
\"name\": \"tableName\",
\"fields\": [
{
\"name\": \"id\",
\"type\": \"int\"
},
{
\"name\": \"name\",
\"type\": [\"null\", \"string\"]
},
{
\"name\": \"description\",
\"type\": \"string\"
},
{
\"name\": \"rand\",
\"type\": \"double\"
},
{
\"name\": \"assigned\",
\"type\": \"boolean\"
}
]
}"' \
--form 'primaryKeys="id,name"' \
--form 'file1=@"/file-path/valid.csv"' \
--form 'file2=@"/file-path/valid2.csv"' \
--form 'delimiter=";"'
Python example

import json
import requests
columns = [
    {'name': 'StartDate', 'type': {'type': 'long', 'logicalType': 'timestamp-millis'}},
    {'name': 'EndDate', 'type': {'type': 'long', 'logicalType': 'timestamp-millis'}},
    {'name': 'IncidendID', 'type': ['null', 'string']},
    {'name': 'Status', 'type': ['null', 'string']},
    {'name': 'IPAddress', 'type': ['null', 'string']},
    {'name': 'Progress', 'type': ['null', 'string']},
    {'name': 'Duration', 'type': ['null', 'string']},
    {'name': 'Finished', 'type': ['null', 'string']},
    {'name': 'RecordedDate', 'type': ['null', 'string']},
    {'name': 'ResponseId', 'type': ['null', 'string']},
    {'name': 'RecipientLastName', 'type': ['null', 'string']},
    {'name': 'RecipientFirstName', 'type': ['null', 'string']},
    {'name': 'RecipientEmail', 'type': ['null', 'string']},
    {'name': 'ExternalReference', 'type': ['null', 'string']},
    {'name': 'LocationLatitude', 'type': ['null', 'string']},
    {'name': 'LocationLongitude', 'type': ['null', 'string']},
    {'name': 'DistributionChannel', 'type': ['null', 'string']},
    {'name': 'UserLanguage', 'type': ['null', 'string']},
    {'name': 'Q1', 'type': ['null', 'string']},
    {'name': 'Recommendation', 'type': ['null', 'string']},
    {'name': 'Q2', 'type': ['null', 'string']},
    {'name': 'Q3', 'type': ['null', 'string']},
    {'name': 'Q4', 'type': ['null', 'string']},
    {'name': 'Q5', 'type': ['null', 'string']},
    {'name': 'Q6', 'type': ['null', 'string']},
    {'name': 'Q7', 'type': ['null', 'string']},
    {'name': 'Q8', 'type': ['null', 'string']},
    {'name': 'Q9', 'type': ['null', 'string']},
    {'name': 'Q2Sentiment', 'type': ['null', 'string']},
    {'name': 'Q2SentimentScore', 'type': ['null', 'string']},
    {'name': 'Q2SentimentPolarity', 'type': ['null', 'string']},
    {'name': 'Q2TopicSentimentLabel', 'type': ['null', 'string']},
    {'name': 'Q2TopicSentimentScore', 'type': ['null', 'string']},
    {'name': 'Q2Topics', 'type': ['null', 'string']},
    {'name': 'Q2ParentTopics', 'type': ['null', 'string']},
    {'name': 'TicketID', 'type': ['null', 'string']}
]
token = "<<ACCESS_TOKEN>>"  # API token from the connection
url = "https://<<base-url>>/ingestion/data"  # Ingestion API endpoint (see base URL above)
file_url = "<<path-to-csv-file>>"  # local path of the file to upload
headers = {'Authorization': f'Bearer {token}'}
schema = {"type": "record", "name": "SV_optue57fznaOEU43", "fields": columns}
data = {"delimiter": ",", 'schema': json.dumps(schema), 'primaryKeys': 'ResponseId'}
with open(file_url, 'rb') as file:
    response = requests.post(url, headers=headers, data=data, files={'file1': file})
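To push more than one file in a single request, as in the cURL example, add further entries to the files dictionary. A minimal sketch continuing the example above, assuming a second hypothetical file path:

with open(file_url, 'rb') as file1, open('<<path-to-second-file>>', 'rb') as file2:
    response = requests.post(url, headers=headers, data=data, files={'file1': file1, 'file2': file2})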
Use the ingestion status API
This section provides the reference information for the Ingestion status API. You can view the status of data upload calls in the logs section of an Ingestion API integration. Read more in section Monitor the data extraction.
Ingestion status API base URL
All API endpoints are relative to the base URL. The endpoint URL is generated when you create an Ingestion API connection. Read more in section Connector - Ingestion API.
In the following region, this URL format applies: https://baseURL/ingestions/<<REQUEST-UUID>>/status
Region baseURL:
- EU: spi-etl-ingestion.eu-prod-cloud-os-eu.suite-saas-prod.signav.io/
In the following regions, this URL format applies: https://baseURL/spi/ingestions/v1/<<REQUEST-UUID>>/status
Region baseURLs:
- Australia (AU): api.au.signavio.cloud.sap/
- Canada (CA): api.ca.signavio.cloud.sap/
- Japan (JP): api.jp.signavio.cloud.sap/
- Singapore (SGP): api.sgp.signavio.cloud.sap/
- South Korea (KR): api.kr.signavio.cloud.sap/
- USA (US): api.us.signavio.cloud.sap/
Requests
Request Type: GET

Header:
Accept: application/json
Content-Type: application/json
Authorization: Bearer <<ACCESS_TOKEN>>

Response:
{
  "status": 200,
  "payload": {
    "status": "COMPLETED",
    "displayStatus": "completed",
    "message": "OPTIONAL-MESSAGE"
  }
}
- <<ACCESS_TOKEN>>: The API token from the connection.
- <<REQUEST-UUID>>: The ingestion request ID received in the response of an Ingestion API call. Read more in section Requests.
Response codes
Code | Description |
---|---|
200 | OK |
400 | Bad Request |
401 | Unauthorized |
Ingestion API request statuses
Below is a list of the statuses that can be returned when executing an Ingestion status API call:
Status | Description |
---|---|
REQUEST_VALIDATING | The ingestion request is being validated. The inputs, schema, and file headers are checked for matching.
REQUEST_VALIDATED | The ingestion request is valid and ready for the next step.
REQUEST_VALIDATION_FAILED | The ingestion request is invalid. See the 'message' field in the response for more details.
FILE_CONVERTING | The request is converting and parsing the data files.
FILE_CONVERTED | The request has converted the data files and is ready for the next step.
FILE_CONVERSION_FAILED | The conversion of the data files has failed. See the 'message' field in the response for more details.
FILE_UPLOADING | The data files are uploading to SAP Signavio Process Intelligence.
FILE_UPLOADED | The data files are uploaded to SAP Signavio Process Intelligence and ready for the next step.
FILE_UPLOAD_FAILED | The upload of the data files to SAP Signavio Process Intelligence has failed. See the 'message' field in the response for more details.
INTERNAL_SYNCHRONISING | The internal system is synchronizing and preparing the ingested data for transformation and load (T&L).
INTERNAL_SYNCHRONISING_FAILED | The synchronization of the internal systems has failed. See the 'message' field in the response for more details.
COMPLETED | The ingestion request has completed successfully and is ready for transformation and load (T&L).
cURL example

curl --location --request GET 'https://<<base-url>>/ingestions/<<REQUEST-UUID>>/status' \
--header 'Authorization: Bearer <<ACCESS_TOKEN>>' \
--header 'Content-Type: application/json'
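To wait for an upload to finish, you can poll the status endpoint until a terminal status is reached. A minimal Python sketch, assuming the EU-style endpoint format shown above and the placeholders used throughout this page:

import time
import requests

status_url = 'https://<<base-url>>/ingestions/<<REQUEST-UUID>>/status'
headers = {'Authorization': 'Bearer <<ACCESS_TOKEN>>', 'Content-Type': 'application/json'}
# Statuses after which no further progress is expected (see the table above)
terminal_statuses = {'COMPLETED', 'REQUEST_VALIDATION_FAILED', 'FILE_CONVERSION_FAILED',
                     'FILE_UPLOAD_FAILED', 'INTERNAL_SYNCHRONISING_FAILED'}
while True:
    payload = requests.get(status_url, headers=headers).json()['payload']
    if payload['status'] in terminal_statuses:
        print(payload['status'], payload.get('message', ''))
        break
    time.sleep(10)  # polling interval chosen arbitrarily for this sketch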