
How to access file content after conditional branch in Chains?

Answered

5 comments

  • Jeff Hickey

    Hi Waldo, I think a better solution would be to adjust your architecture so that you have a fail path on your condition. Each branch could then call a subroutine chain that would handle and accept the appropriate input. In the success path, you would insert the new column first and pass the resulting output to the subroutine. The fail path would just pass the Advanced Query output.
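
    In case it helps, the success-path column insertion could itself be an Advanced Query step. A rough sketch (tmp_tbl and New_Column are placeholders for your actual table mapping and column name):

        -- Success path: add the new column with an empty default
        -- before handing the output to the subroutine chain
        select *, '' as New_Column from tmp_tbl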

    Alternatively, if you stuck with your current architecture, you could probably do it by dumping the file contents into a dynamic variable and updating the variable as needed while the chain progresses. When it's time to call your final List File Content, first create a new file by dumping the variable contents into it, then run List File Content on the newly created file. Below is a high-level screenshot of what it might look like using a dynamic variable: [screenshot not shown]

  • Waldo Nell

    So dynamic variables are out. The problem is that List File Content is orange because it says the output is too large, so my dynamic variable is null when that happens. Chains - this is SO hard. The other problem is that I have two of these conditional splits. Normally I'd put the grouped section in a chain and call it as a subroutine. That does not work here because the grouped section has two outputs, and a chain cannot return two outputs as one. It just shifts the issue downstream.

    So I have to break the chain up into non-logical sections of partial chains. I will try that now.

  • Waldo Nell

    I clearly do not understand chains at all. I cannot get this working. I have Chains 1, 2, and 3. Chain 1 calls Chain 2, and Chain 2 calls Chain 3. Chain 1 runs an Advanced Query and generates this output:

    VARIABLE         VALUE
    Output CSV       output-1620760297585.csv
    Record Count     13228
    Result           output-1620760297585.csv
    Command Details  {"id":18557050,"status":"succeeded"}

    Chain 1 takes the output of Advanced Query (Result, of type File) and passes it to Chain 2, which receives it through a Runtime Parameter called Source File, also of type File. Chain 2 then calls Chain 3, which has its own Source File runtime parameter of type File, and passes along the Source File input it received.

    Chain 3 tries to run an Advanced Query on that Source File runtime input file and fails:

    ⚙️ Command Configuration:
    Input file: None
    Output file: <none>
    Delimiter: ,
    Preview: true
    Header line: 0

    Command: Query
    Tables: [{"file":"output-1620760297585.csv","table":"tmp_tbl"},{"file":"query_result_1620760335448.csv","table":"tmp_tbl_existing"}]
    Output Delimiter: ,
    Query: select distinct * from tmp_tbl where Account not in (select distinct Account_Code from tmp_tbl_existing)


    Querying file...


    Error with table tmp_tbl: stat output-1620760297585.csv: no such file or directory

    Elapsed Time:27.645759ms
    Return Code:2
    Exit message:stat output-1620760297585.csv: no such file or directory
    Status:❌

    Why can it not find file output-1620760297585.csv?

  • Jeff Hickey

    Passing a file from Chain 1 to Chain 2 and then to Chain 3 will unfortunately result in the "no such file" error. The file can be found and used by Chain 2, though. It appears that files can only be passed one level deep. I am investigating whether this is by design or a bug and will provide an update when I learn more.

    A workaround is to manipulate the file in Chain 2 in a way that does not change its contents and pass that output to Chain 3. For example, use Advanced Query to select everything (select * from table) from the file, which creates a copy. Then pass the resulting Advanced Query output to Chain 3. This keeps the file "one level deep", since you are passing a file created in Chain 2 to Chain 3.
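
    As a minimal sketch, assuming the incoming file is mapped to a table named tmp_tbl in the Advanced Query step's Tables configuration, the pass-through query is just:

        -- No-op copy: re-emits every row and column,
        -- so the resulting output file is created by Chain 2 itself
        select * from tmp_tbl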

  • Waldo Nell

    I can confirm the workaround works.

