DIY Template Creation
The DIY Template API allows programmatic creation of custom templates without using the UI. The API accepts a list of blocks in order and wires them into a new template.
Overview and prerequisites
Getting workspace ID
Retrieving available processors
Use the following endpoint to list all available block types, including each type's name (the blockType identifier), description, and whether it is a source block. Use this listing to validate blockType values before creating a DIY template.
GET /api/v1/processors
| blockType | Description |
|---|---|
| sftp_pull | Poll SFTP server and download matching files |
| s3_pull | Poll S3 bucket and download files |
| kafka_connect_to_source | Kafka pull connect to source |
| optional_decrypt_content | Decryption block |
| optional_encrypt_content | Encryption block |
| csv_json_neo_transformer | CSV to JSON transformer for DIY templates |
| neo_transformer | Neo transformer (applies business logic) |
| retro_destination | OAuth-based InTouch Transaction Add V2 Retro |
| intouch_transaction_v2 | OAuth-based InTouch Transaction Add V2 |
| ok_file | OK file presence check |
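The listing above can drive a pre-flight check before template creation. A minimal Python sketch, using only the standard library; the `BASE_URL` host and bearer-token auth header are assumptions (this doc does not specify the host or auth scheme), so substitute your workspace's actual values:

```python
import json
import urllib.request

BASE_URL = "https://example.com"  # hypothetical host; replace with your API host
TOKEN = "<token>"  # placeholder; the actual auth scheme is an assumption

def fetch_processors():
    """GET /api/v1/processors and return the parsed list of block types."""
    req = urllib.request.Request(
        f"{BASE_URL}/api/v1/processors",
        headers={"Authorization": f"Bearer {TOKEN}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def invalid_block_types(processors, requested):
    """Return the requested blockType values absent from the processors list.

    The 'name' field of each processor entry holds the blockType identifier.
    """
    known = {p["name"] for p in processors}
    return [bt for bt in requested if bt not in known]
```

For example, checking `["sftp_pull", "bogus_block"]` against a processors list that contains only `sftp_pull` would flag `bogus_block` as invalid.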
Creating a template via API
POST /api/v1/templates/diy
| Parameter | Description |
|---|---|
name * | Name of the template being created |
blockList * | Array of block objects defining the template structure |
blockName * | Display name for the block (customizable) |
blockType * | The processor type identifier. Must match the name field from GET Processors response |
blockOrder * | Position of the block in the dataflow. Source block = 0, subsequent blocks increment by 1 |
* Required parameter
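The request body can be assembled programmatically; in particular, blockOrder must start at 0 for the source block and increment by 1 for each subsequent block. A minimal Python sketch, using only the standard library; `BASE_URL` and the bearer-token header are hypothetical placeholders, not values from this doc:

```python
import json
import urllib.request

BASE_URL = "https://example.com"  # hypothetical host; replace with your API host
TOKEN = "<token>"  # placeholder; the actual auth scheme is an assumption

def build_block_list(blocks):
    """Assign blockOrder automatically: source block gets 0, then +1 each.

    `blocks` is an ordered list of (blockName, blockType) pairs.
    """
    return [
        {"blockName": name, "blockType": btype, "blockOrder": i}
        for i, (name, btype) in enumerate(blocks)
    ]

def create_diy_template(name, blocks):
    """POST /api/v1/templates/diy with the ordered block list."""
    payload = {"name": name, "blockList": build_block_list(blocks)}
    req = urllib.request.Request(
        f"{BASE_URL}/api/v1/templates/diy",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Keeping block order as a list position (rather than hand-numbering each block) avoids gaps or duplicates in blockOrder when blocks are reordered.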
Sample request:
{
"name": "My_Custom_Template",
"blockList": [
{ "blockName": "Connect to Source", "blockType": "sftp_pull", "blockOrder": 0 },
{ "blockName": "CSV to JSON", "blockType": "csv_json_neo_transformer", "blockOrder": 1 },
{ "blockName": "Neo Transformer", "blockType": "neo_transformer", "blockOrder": 2 },
{ "blockName": "Connect to Destination", "blockType": "retro_destination", "blockOrder": 3 }
]
}
blockType reference
| Block name | blockType | Category | Key purpose |
|---|---|---|---|
| Connect to Source | sftp_pull | Source | Fetch files from SFTP/FTP |
| Connect to Source (S3) | s3_pull | Source | Fetch files from S3 |
| Kafka source | kafka_connect_to_source | Source | Consume from Kafka topic |
| Decrypt data | optional_decrypt_content | Processing | Decrypt encrypted files |
| Join data | JoinProcessor | Processing | Merge two CSV files |
| Hash CSV fields | HashCSVProcessor | Processing | Hash/mask sensitive columns |
| Rebuild / Define headers | HeaderProcessor | Processing | Rename and reorder CSV headers |
| Transform data | AryaProcessor | Processing | Map CSV columns to API fields |
| CSV-to-XML | CsvToXmlProcessor | Processing | Convert CSV to XML format |
| CSV-to-JSON (Neo DIY) | csv_json_neo_transformer | Processing | Convert CSV to JSON for Neo |
| Neo transformer | neo_transformer | Processing | Apply Neo business logic |
| Encrypt data | optional_encrypt_content | Processing | Encrypt output file |
| Connect to Destination | PutSFTPProcessor | Destination | Transfer file to SFTP/FTP |
| Connect to Destination (API) | CBRequestProcessor | Destination | Post data to Capillary API |
| Push to S3 | PutS3ObjectV2 | Destination | Upload file to S3 bucket |
| OAuth client | OAuthClientProcessor | Control | Generate OAuth token for API calls |
| Trigger | (Cron scheduler) | Control | Schedule dataflow execution |
