Create File error: 'duplicate file name'
Hi,
I was trying to follow the steps in https://support.workiva.com/hc/en-us/articles/360050552311-Extract-Data-from-Oracle-Hyperion-Financial-Management-to-Workiva-as-a-File, and am having some problems with the 'Create File' command.
Below is my initial setup (note the Name field), and the chain ran successfully.
However, when I tried to run the chain again (to update the data in the Wdata Table), I got the error message below: "duplicate file name".
So, to get a 'dynamic' file name that changes whenever the chain is run (I don't want to go back and manually update the file name every time), I switched to using a Runtime variable (in this case: "Chain.ExecutionDateTime") with '.csv' appended at the end.
Unfortunately, this didn't work as I expected; see below.
Could somebody help with this? Ideally, I wouldn't have to modify the Name field or manually delete the file/dataset in the Table every time the chain runs.
(some additional background: I'm connected to HFM using a GroundRunner and the Create File command is therefore also using the same GroundRunner)
Thanks.
Lina
-
Official comment
Hi Lina,
You've got two different things going on here. First, Wdata requires that each file name be unique. To account for this, you can build the Chain to unimport and delete the existing file before importing the new one. We have a Template that shows how this is done:
https://support.workiva.com/hc/en-us/articles/360058919852-Update-Datasets-in-a-Table-chain-template
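As a rough illustration of the order of operations that template implements, here is a minimal, self-contained Python sketch. The dict stands in for the Wdata table's file store, and the file name is a made-up example; in a real chain these steps are built from the chain commands themselves, not written as code.

```python
# Minimal simulation of the "remove the old file, then re-create it"
# pattern from the template. The dict stands in for the table's file
# store; a real chain uses built-in commands for each step.

files: dict[str, bytes] = {}  # file name -> file contents

def upload(name: str, contents: bytes) -> None:
    """Re-upload a file under a fixed name without tripping the
    'duplicate file name' check."""
    if name in files:
        # Clear the stale copy first, mirroring the template's
        # unimport + delete steps; skipping this is what produces
        # the 'duplicate file name' error.
        del files[name]
    files[name] = contents

upload("hfm_extract.csv", b"run 1 data")  # first run succeeds
upload("hfm_extract.csv", b"run 2 data")  # second run also succeeds
print(files)  # {'hfm_extract.csv': b'run 2 data'}
```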
The second problem is that the Runtime Variable Chain.RunDateTime returns a date-time stamp in ISO format, which includes a colon as the hours/minutes separator. Since a colon is an invalid character in a file name, you need to use a Variable Transformation to make the file name compliant:
https://support.workiva.com/hc/en-us/articles/360035643752-Transform-variables-and-outputs
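For illustration, here is what that transformation amounts to, shown outside of Chains in plain Python; the timestamp value and the "hfm_extract" prefix are made-up examples:

```python
# Example value in the shape an ISO date-time stamp takes, with colons
# separating hours, minutes, and seconds.
run_datetime = "2021-03-15T14:30:05"

# Colons are invalid in file names, so swap them for a legal character
# (hyphens here, but any valid character works).
safe_stamp = run_datetime.replace(":", "-")

file_name = f"hfm_extract_{safe_stamp}.csv"
print(file_name)  # hfm_extract_2021-03-15T14-30-05.csv
```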
I would strongly recommend familiarizing yourself with the Chain Template, though, because failing to delete the old file will cause duplicate data if you are loading an updated dataset for essentially the same data.
Hope that helps.
Tony
Thank you very much, Tony, for pointing me in the right direction! I will definitely get familiar with the update-datasets chain template, as I can see bad consequences down the road if this isn't set up properly...
Lina