POST https://api.pretectum.io/v1/businessareas/{businessAreaId}/schemas/{schemaId}/datasets/{datasetId}/dataobjects
Create Data Object
curl --request POST \
  --url https://api.pretectum.io/v1/businessareas/{businessAreaId}/schemas/{schemaId}/datasets/{datasetId}/dataobjects \
  --header 'Authorization: <authorization>' \
  --header 'Content-Type: <content-type>' \
  --data '
{
  "[Field Name]": {}
}
'
{
  "_dataObjectId": "<string>",
  "_version": 123,
  "_errors": [
    {
      "name": "<string>",
      "errors": "<string>"
    }
  ],
  "[Field Name]": {}
}
The Create Data Object endpoint allows you to add new records to a dataset. The data object structure is determined by the schema definition, and the API validates field values according to schema rules.

Prerequisites

Authentication

Include your access token in the Authorization header.
Pass the token directly without the “Bearer” prefix.
Authorization: your_access_token
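In client code, this header scheme can be captured in a tiny helper. The sketch below is illustrative, not part of any official SDK:

```python
def auth_headers(access_token: str) -> dict:
    # Pretectum expects the raw token in the Authorization header,
    # not the common "Bearer <token>" form.
    return {
        "Authorization": access_token,
        "Content-Type": "application/json",
    }
```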

Request

Path Parameters

businessAreaId
string
required
The unique identifier of the business area. You can obtain this from the List Business Areas endpoint.
schemaId
string
required
The unique identifier of the schema that defines the data structure. You can obtain this from the List Schemas endpoint.
datasetId
string
required
The unique identifier of the dataset where the data object will be created. You can obtain this from the List Datasets endpoint.
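The three path parameters combine into the request URL. A small helper (a sketch, not an official client function) makes the pattern explicit:

```python
BASE_URL = "https://api.pretectum.io/v1"

def data_objects_url(business_area_id: str, schema_id: str, dataset_id: str) -> str:
    # Builds the Create Data Object endpoint URL from the three identifiers.
    return (
        f"{BASE_URL}/businessareas/{business_area_id}"
        f"/schemas/{schema_id}/datasets/{dataset_id}/dataobjects"
    )
```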

Headers

Authorization
string
required
Your access token obtained from the /v1/oauth2/token endpoint. Pass the token directly without the “Bearer” prefix.
Content-Type
string
required
Must be application/json.
Accept
string
default:"application/json"
The response content type. Currently only application/json is supported.

Request Body

The request body should be a JSON object with field names as keys and their values. Field names must match the schema field names exactly.
[Field Name]
varies
required
Dynamic fields based on the schema definition. Use the field names as defined in the schema. Values should be formatted according to the field’s data type:
  • string: Plain text values
  • integer: Whole numbers
  • float: Decimal numbers
  • boolean: true or false
  • date: Date string in the format defined by the schema (e.g., “MM/DD/YYYY”)
  • datetime: DateTime string in the format defined by the schema
  • time: Time string in the format defined by the schema
  • email: Valid email address
  • url: Valid URL
  • phone: Phone number string
  • picklist: Value from the predefined list

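A request body mixing these types might look like the following. The field names here are hypothetical; the real names and formats come from your schema definition:

```python
import json

# Hypothetical field names -- your schema defines the actual names and formats.
payload = {
    "First Name": "John",               # string
    "Login Count": 42,                  # integer
    "Score": 4.5,                       # float
    "Active": True,                     # boolean
    "Date of Birth": "01/15/1985",      # date (MM/DD/YYYY in this example schema)
    "Email": "john.smith@example.com",  # email
    "Website": "https://example.com",   # url
    "Phone": "+1 555-123-4567",         # phone
    "Status": "Active",                 # picklist value
}
body = json.dumps(payload)
```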
Example Requests

curl -X POST "https://api.pretectum.io/v1/businessareas/20240115103000123a1b2c3d4e5f6789012345678901234/schemas/20240115103000456d1e2f3a4b5c6789012345678901234/datasets/20240925152201042a1b2c3d4e5f6789012345678901234/dataobjects" \
  -H "Authorization: your_access_token" \
  -H "Content-Type: application/json" \
  -H "Accept: application/json" \
  -d '{
    "First Name": "John",
    "Last Name": "Smith",
    "Email": "john.smith@example.com",
    "Phone": "+1 555-123-4567",
    "Date of Birth": "01/15/1985",
    "Status": "Active"
  }'
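The same call can be made from Python's standard library. The sketch below only builds the request object; sending it with `urllib.request.urlopen()` and handling network errors is left to the caller:

```python
import json
import urllib.request

def build_create_request(url: str, token: str, fields: dict) -> urllib.request.Request:
    # Prepares the POST; pass the result to urllib.request.urlopen() to send it.
    return urllib.request.Request(
        url,
        data=json.dumps(fields).encode("utf-8"),
        headers={
            "Authorization": token,  # raw token, no "Bearer" prefix
            "Content-Type": "application/json",
            "Accept": "application/json",
        },
        method="POST",
    )
```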

Response

A successful request returns the created data object with system-generated fields.
_dataObjectId
string
required
The unique identifier generated for the new data object.
_version
integer
required
The version number of the data object. New objects start at version 1.
_errors
array
required
An array of validation errors. If the data doesn’t conform to schema validation rules, errors will be listed here. The data object is still created, but flagged with errors.
[Field Name]
varies
The field values as stored in the system. These may be transformed based on schema rules (e.g., date formatting).

Example Response

{
  "_dataObjectId": "20240601120000123f1a2b3c4d5e6789012345678901234",
  "_version": 1,
  "_errors": [],
  "First Name": "John",
  "Last Name": "Smith",
  "Email": "john.smith@example.com",
  "Phone": "+1 555-123-4567",
  "Date of Birth": "01/15/1985",
  "Status": "Active"
}

Response with Validation Errors

If the data doesn’t conform to schema rules, the object is created but flagged with errors:
{
  "_dataObjectId": "20240601120000123f1a2b3c4d5e6789012345678901234",
  "_version": 1,
  "_errors": [
    {
      "name": "Email",
      "errors": "Invalid email format"
    },
    {
      "name": "Date of Birth",
      "errors": "Date format does not match expected format MM/DD/YYYY"
    }
  ],
  "First Name": "John",
  "Last Name": "Smith",
  "Email": "invalid-email",
  "Phone": "+1 555-123-4567",
  "Date of Birth": "1985-01-15",
  "Status": "Active"
}
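Because the object is created even when validation fails, a client should inspect `_errors` after every call. A minimal check might look like this (a sketch; adapt the handling to your application):

```python
def validation_errors(response_json: dict) -> list:
    # Returns (field name, message) pairs from the _errors array;
    # an empty list means the object passed all schema validation rules.
    return [(e["name"], e["errors"]) for e in response_json.get("_errors", [])]
```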

Error Responses

400 Bad Request
Invalid request body or malformed JSON. Check your request format.
401 Unauthorized
Invalid or expired access token. Obtain a new token from /v1/oauth2/token and try again.
403 Forbidden
Your application client does not have permission to create data objects. Contact your tenant administrator.
404 Not Found
The specified business area, schema, or dataset does not exist, or you do not have access to it.
500 Internal Server Error
An unexpected error occurred on the server. Try again later or contact support.

Validation

The API validates field values based on the schema definition:
  • Required fields: Fields marked as required in the schema must have a value.
  • Data types: Values must match the expected data type (e.g., integers for integer fields).
  • Format validation: Emails, URLs, phones, and dates are validated against their expected formats.
  • Picklist values: Values must be from the predefined list if the field is a picklist.
Data objects with validation errors are still created but flagged in the _errors array. This allows you to import data and fix validation issues later.
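A client-side pre-check can mirror a subset of these rules before the API call. The sketch below assumes a hypothetical schema description dictionary; the authoritative rules live in the schema itself, which you can retrieve via the schema endpoints:

```python
import re

def precheck(fields: dict, schema: dict) -> list:
    # schema is a hypothetical client-side description, e.g.
    # {"Email": {"type": "email", "required": True}} -- not the API's own format.
    errors = []
    for name, rules in schema.items():
        value = fields.get(name)
        if rules.get("required") and value in (None, ""):
            errors.append((name, "required field is missing"))
        elif rules.get("type") == "email" and value is not None:
            if not re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", value):
                errors.append((name, "invalid email format"))
    return errors
```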

Best Practices

  1. Validate before sending: Validate data on the client side before making API calls to reduce errors.
  2. Use correct field names: Field names are case-sensitive and must match the schema exactly.
  3. Handle errors gracefully: Check the _errors array in the response to identify and report validation issues.
  4. Store the ID: Save the _dataObjectId for future updates or deletions.
  5. Batch imports: For large data imports, consider using the batch import feature instead of individual API calls.