Array to CSV - Performance
I have JSON data from an API call which contains about 500k records and 90+ fields, and I am using the Array to CSV command to convert the data to CSV before importing to a Wdata table. The Array to CSV command currently takes about 7-8 minutes to complete. Is this performance normal for a JSON of this size? Thanks!
Hi Mark. Data size can certainly impact performance, and several other factors also influence run time. For instance:
- Large datasets naturally take longer to process
- If you're using GroundRunner, server load may further increase run time
To reduce run time, you can consider the following:
- Schedule the Chain to run overnight or during periods of low usage. This ensures the data is ready when needed
- Reduce the data size by transforming the data to remove unnecessary columns and rows
- Reduce the data size by only loading new and updated data instead of reloading previously loaded data
These are just examples, but hopefully they give you a good starting point on areas where you may be able to trim overall run time.
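To illustrate the last two suggestions, here is a minimal sketch of pruning unnecessary columns and skipping previously loaded rows before the CSV conversion step. This is a generic Python example, not the Array to CSV command itself; the field names (`last_modified`, `debug_blob`) and the timestamp-based filter are assumptions for illustration.

```python
import csv
import io

def records_to_csv(records, keep_fields, updated_since=None, ts_field="last_modified"):
    """Convert a list of JSON records (dicts) to CSV text.

    Only the columns in keep_fields are written, and if updated_since
    (an ISO-8601 date string) is given, rows whose ts_field is not newer
    than it are skipped -- i.e., rows assumed already loaded in a prior run.
    """
    buf = io.StringIO()
    # extrasaction="ignore" drops any fields not listed in keep_fields
    writer = csv.DictWriter(buf, fieldnames=keep_fields, extrasaction="ignore")
    writer.writeheader()
    for rec in records:
        if updated_since is not None and rec.get(ts_field, "") <= updated_since:
            continue  # skip rows unchanged since the last load
        writer.writerow(rec)
    return buf.getvalue()

# Hypothetical records with an extra column we don't need downstream
records = [
    {"id": 1, "name": "Alpha", "last_modified": "2024-01-05", "debug_blob": "..."},
    {"id": 2, "name": "Beta",  "last_modified": "2024-03-01", "debug_blob": "..."},
]
csv_text = records_to_csv(records, ["id", "name"], updated_since="2024-02-01")
```

Shrinking the payload this way reduces both the conversion time and the size of the import into the Wdata table; the same filtering could equally be done in a transform step earlier in the Chain.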