Data Transformation
Move data between systems while cleaning and transforming it along the way.
Use Case
You have data in one system (e.g. a spreadsheet, API, or database) and need to:
- Extract the data
- Clean and transform it (rename fields, filter rows, convert formats)
- Load it into another system
Step 1: Extract Data
Add a trigger or data source node to pull the data.
Common sources:
- Google Sheets — Pull rows from a spreadsheet
- HTTP Utilities — Fetch data from a REST API
- Webhook Trigger — Receive data pushed from another system
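Conceptually, the extract step boils down to parsing the source payload into an array of records. A minimal Python sketch, assuming a JSON API response (the field names and payload shape here are illustrative, not from any specific system):

```python
import json

# Hypothetical response body, as an HTTP Utilities node might receive it.
response_body = """
{
  "results": [
    {"name": "Ada Lovelace", "email": "ada@example.com", "status": "active"},
    {"name": "Alan Turing", "email": "alan@example.com", "status": "inactive"}
  ]
}
"""

# Extract: parse the payload and pull out the array of records
# that the rest of the workflow will operate on.
records = json.loads(response_body)["results"]
print(len(records))  # → 2
```

Whatever the source, the goal of this step is the same: end up with a list of record objects that downstream nodes can transform one at a time.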
Step 2: Transform with JSON Node
Add a JSON node to reshape the data.
Common transformations:
- Rename fields: Map source field names to the names your target system expects.
- Filter rows: Use a Router node to keep only rows that match your criteria (e.g. status == "active").
- Convert formats: Use Parameter Mapping helper functions to:
  - Convert dates between formats
  - Parse strings to numbers
  - Merge or split text fields
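The three transformations above can be sketched in plain Python. This is not the JSON node itself, just an illustration of what the reshaping amounts to; all field names and date formats are assumptions:

```python
from datetime import datetime

def transform(row):
    """Rename fields and convert formats for one source record."""
    return {
        # Rename: source field names -> names the target system expects
        "name": row["Full Name"],
        # Convert: reformat the date and parse the string to a number
        "signed_up": datetime.strptime(row["Signup"], "%m/%d/%Y").strftime("%Y-%m-%d"),
        "age": int(row["Age"]),
        "status": row["status"],
    }

rows = [
    {"Full Name": "Ada Lovelace", "Signup": "03/15/2024", "Age": "28", "status": "active"},
    {"Full Name": "Alan Turing", "Signup": "01/02/2024", "Age": "37", "status": "inactive"},
]

# Filter: keep only rows matching the criteria (the Router node's job)
active = [transform(r) for r in rows if r["status"] == "active"]
print(active)  # → one renamed, reformatted record
```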
Step 3: Loop Over Records
If your data contains multiple records, use a Loop node to process each one individually.
- Add a Loop node
- Set the input to the array from your data source
- Inside the loop, add nodes to process each record
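The Loop pattern above is equivalent to a per-record iteration. A small sketch with illustrative data, where the loop body stands in for the nodes placed inside the Loop node:

```python
# Input: the array from the data source (illustrative records).
records = [
    {"email": "ada@example.com"},
    {"email": "ALAN@Example.com"},
]

processed = []
for record in records:  # the Loop node iterates over the array
    # Nodes inside the loop act on one record at a time;
    # here we just normalize the email address.
    record["email"] = record["email"].lower()
    processed.append(record)

print(processed)
```

Processing record-by-record also makes it possible to handle failures individually, which the Best Practices section returns to.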
Step 4: Load Into Target System
Add a destination node to push the transformed data.
Common targets:
- Google Sheets — Append rows to a spreadsheet
- HTTP Utilities — POST data to an API
- HubSpot / Salesforce / Pipedrive — Create or update CRM records
- Notion — Create database entries
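For an API target, the load step amounts to serializing each transformed record and POSTing it. A sketch of that shape, with the actual HTTP call replaced by an in-memory collector so it runs offline (endpoint and field names are assumptions):

```python
import json

# Transformed records ready for the target system (illustrative data).
records = [
    {"name": "Ada Lovelace", "email": "ada@example.com"},
    {"name": "Grace Hopper", "email": "grace@example.com"},
]

sent = []  # stand-in for the target API, so the sketch needs no network

def post_record(record):
    # In a real workflow the HTTP Utilities node would POST this body
    # to the target endpoint; here we serialize and collect it instead.
    body = json.dumps(record)
    sent.append(body)

for record in records:
    post_record(record)

print(len(sent))  # → 2
```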
Step 5: Test and Publish
- Test with a small dataset first
- Verify the output in your target system
- Publish the workflow
Best Practices
- Always test with sample data before processing large datasets
- Use a Data Validator node to catch malformed records early
- Add error handling inside loops to prevent one bad record from stopping the entire batch
- Use the Delay Utilities node between API calls if your target system has rate limits
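The last three practices combine naturally: validate each record, catch failures per record so the batch keeps going, and throttle between calls. A minimal sketch, assuming email is the field being validated and using a token delay in place of the Delay Utilities node:

```python
import time

# Illustrative batch containing one malformed record.
records = [
    {"email": "ada@example.com"},
    {"email": None},                 # malformed: missing email
    {"email": "grace@example.com"},
]

loaded, failed = [], []

for record in records:
    try:
        # Validate early (the Data Validator node's role) so one bad
        # record raises here instead of breaking the load step.
        if not record.get("email"):
            raise ValueError("missing email")
        loaded.append(record)
    except ValueError as exc:
        failed.append((record, str(exc)))  # record the failure, keep going
    time.sleep(0.01)  # throttle if the target system enforces rate limits

print(len(loaded), len(failed))  # → 2 1
```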