
Deletion of Datasets

7 comments

  • Jeff Hickey

    Hi Barnett.

    Yes, it is possible to delete datasets from a Wdata table via the API. First, get a list of the files in a table with the Retrieve a List of Files endpoint and filter the results to find files created 14 or more days ago. Then, for each matching file, unimport it with the Unimport a Single File endpoint and delete it with the Delete a Single File endpoint. You can also reference the example workflow Updating a File.
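    The steps above can be sketched in Python. This is a rough outline rather than a drop-in script: the base URL, endpoint paths, auth header, and response field names below are placeholders (check the Wdata API reference for the real ones); only the date arithmetic is concrete.

    ```python
    import json
    import urllib.request
    from datetime import datetime, timedelta, timezone

    # Hypothetical host, paths, and token -- substitute the actual Wdata
    # API details from the documentation; these are illustrative only.
    BASE_URL = "https://api.example.com/wdata/v1"
    TOKEN = "YOUR_API_TOKEN"

    def _request(method, path):
        """Make an authenticated request and return the decoded JSON body."""
        req = urllib.request.Request(
            f"{BASE_URL}{path}",
            method=method,
            headers={"Authorization": f"Bearer {TOKEN}"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read().decode())

    def is_older_than(created_iso, days=14, now=None):
        """True if an ISO-8601 'created' timestamp is more than `days` old."""
        now = now or datetime.now(timezone.utc)
        created = datetime.fromisoformat(created_iso.replace("Z", "+00:00"))
        return now - created > timedelta(days=days)

    def purge_old_files(table_id, days=14):
        """List a table's files, then unimport and delete those older than `days`."""
        files = _request("GET", f"/tables/{table_id}/files")
        for f in files:
            if is_older_than(f["created"], days):
                _request("POST", f"/files/{f['id']}/unimport")
                _request("DELETE", f"/files/{f['id']}")
    ```

    Unimporting before deleting matters: a file still imported into the table has to be unimported first, which is why the two endpoints are called in that order.
    
    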

  • Barnett Chenoweth

    Hi Jeff,

    Thanks for the info. Is it possible to automate this deletion process, so that datasets older than 14 days are deleted from the Wdata table automatically? These tables are just so large, and there is no need to keep the older datasets past 14 days or so.

    Thanks,

    Barnett

  • Jeff Hickey

    The process can be automated using the API endpoints listed above. It does require a developer to build custom scripting against those endpoints, or an integration platform that can make external HTTP requests, to implement the logic.

    Alternatively, the Chains tool could be used to build the automation logic. If you use Chains, you could take it a step further and configure Chains to retrieve the new dataset as well. In that case, the Load Data to Wdata template could be helpful, as it has a load method option to replace old datasets with new ones. This does require access to both Wdata and Chains.

  • Barnett Chenoweth

    Hi Jeff.

    Using the Chains methodology outlined above, is it possible to keep up to 14 days of old datasets?  I want to make sure it's not just replacing the previous version each time.  Let me know if that doesn't make sense.

    Thanks,

    Barnett

  • Jeff Hickey

    Hi Barnett.

    Yes. You can keep files that are less than 14 days old and delete the rest by building that logic with either the API or Chains.

    For the API route, you could filter the output of the Retrieve a List of Files endpoint to exclude files that are less than 14 days old. For example, one of the attributes included in the response is "created". You could use this value to identify files created more than 14 days ago and keep the rest. After filtering, run the resulting list of files older than 14 days through your logic to remove them.
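    As a sketch of that filtering step, assuming the response is a JSON list of records each carrying an ISO-8601 "created" timestamp (the field name comes from the response described above; the rest of the record shape is illustrative):

    ```python
    from datetime import datetime, timedelta, timezone

    def split_by_age(files, days=14, now=None):
        """Partition file records on their 'created' attribute.

        Returns (keep, delete): files newer than the cutoff, and files
        older than `days` that should go through the removal logic.
        """
        now = now or datetime.now(timezone.utc)
        cutoff = now - timedelta(days=days)
        keep, delete = [], []
        for f in files:
            created = datetime.fromisoformat(f["created"].replace("Z", "+00:00"))
            (delete if created < cutoff else keep).append(f)
        return keep, delete

    # Example with a fixed "now" so the split is deterministic:
    files = [
        {"id": "a", "created": "2021-05-01T00:00:00Z"},
        {"id": "b", "created": "2021-05-28T00:00:00Z"},
    ]
    keep, delete = split_by_age(files, now=datetime(2021, 6, 1, tzinfo=timezone.utc))
    # "a" (31 days old) lands in `delete`; "b" (4 days old) stays in `keep`
    ```

    Everything in `delete` would then be passed to the unimport and delete endpoints; everything in `keep` is left untouched.
    
    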

    For the Chains route, the List File command will provide information about the files that can then be filtered. For example, you could use the Convert JSON to CSV and Smart Filter Rows commands to filter on the created date. As above, run your removal logic on the filtered list of files older than 14 days.

  • Barnett Chenoweth

    Jeff,


    Thanks for the clarification. I think we are going to try the Chains method first.  Do I have to create a new chain entirely or can I just edit the List File command in the Fact_Actuals_Update chain?  This is the chain that holds the largest datasets.

    Thanks,

    Barnett


  • Jeff Hickey

    I can't give you a definitive answer since everyone's Chains are unique. In your case, it may make sense either to modify an existing Chain and add this logic to it, or to create a new Chain to handle these steps.

