With Persefoni, your company can easily calculate, analyze, and manage its carbon footprint. To use Persefoni's carbon accounting data in Environmental, Social, and Governance (ESG) or sustainability reports, use the Persefoni | Fetch transactions Chain Builder template.
This template creates four chains to download data from Persefoni and import it into Workiva:
- Persefoni | Fetch transactions
- Load data to Wdata | Primary chain
- Load data to Wdata | Replace dataset
- Load data to Wdata | Add new dataset
Note: Although the template creates multiple chains, you only need to run Persefoni | Fetch transactions to download data from Persefoni and update your tables' datasets.
The Persefoni | Fetch transactions chain can fetch various types of transactions from Persefoni:
- Activities, for all investment and lending activity used to calculate your carbon footprint
- Facilities, for your company's locations associated with transactions
- Emissions, for the greenhouse gas (GHG) emissions associated with transactions
Prerequisites
The template requires these connectors to create the chains:
Note: All of the chains' commands use CloudRunner. No GroundRunners are needed.
Before you create the chains, identify:
- The IDs of the Persefoni resource group and language of the data to download.
- The IDs of the Wdata tables to upload activity, facility, and emission data into.
After you identify these IDs, add these workspace variables under the Connections workspace settings:
Name | Default value
---|---
cv_EmissionsTableId | Enter the ID of the table to import emission data into.
cv_ActivitiesTableId | Enter the ID of the table to import activity data into.
cv_FacilitiesTableId | Enter the ID of the table to import facility data into.
wsv_ResourceGroupId | Enter the ID of the Persefoni resource group to download transactions from.
wsv_LanguageId | Enter the ID of the language to download from Persefoni.
wsv_WdataLoadWarningThreshold | Enter 250. Note: The chains created from the template use this threshold to prevent timeouts when importing large datasets.
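For reference, the workspace variables might be set up like the minimal sketch below. The variable name and the placeholder values are illustrative only; use the IDs from your own Persefoni resource group, language, and Wdata tables.

```python
# Hypothetical illustration of the workspace variables and their values.
# The IDs shown here are placeholders, not real identifiers.
workspace_variables = {
    "cv_EmissionsTableId": "<emissions-table-id>",
    "cv_ActivitiesTableId": "<activities-table-id>",
    "cv_FacilitiesTableId": "<facilities-table-id>",
    "wsv_ResourceGroupId": "<persefoni-resource-group-id>",
    "wsv_LanguageId": "<persefoni-language-id>",
    "wsv_WdataLoadWarningThreshold": 250,  # prevents timeouts on large imports
}
```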
Create chains from the Persefoni | Fetch transactions template
After you set up the required connectors and workspace variables, create the chains from the Persefoni | Fetch transactions template:
- From Chains, select Create, Create chain from template.
- Select Persefoni | Fetch transactions, and click Create from template.
- Select the environment in which to create the chains, and click Next.
- Select the workspace variable that corresponds to each of the template's variables, and click Next.
- Select the Persefoni connector for the Persefoni | Fetch transactions command to use, and click Next chain.
- Select the wsv_WdataLoadWarningThreshold workspace variable, and click Next.
- Select the connector for the Load data to Wdata | Primary chain command to use, and click Next chain.
- Select the wsv_WdataLoadWarningThreshold workspace variable, and click Next.
- Select the Workiva and Handlebars connectors to use for the Load data to Wdata | Replace dataset command, click Next chain, and then click Next.
- Select the same connectors for the Load data to Wdata | Add new dataset command, click Submit, and then click View your new chain.
- From the Persefoni | Fetch transactions chain, click Publish and then Publish.
- From Chains, publish the other chains created from the template.
Run the Persefoni | Fetch transactions chain
To run the chain created from the template:
- From Chains, select Execute for the Persefoni | Fetch transactions chain.
- Click Run with inputs.
- For each type of transaction data to download from Persefoni, such as Facilities, Activities, and Emissions, select Run.
- To further filter the downloaded data, enter any optional parameters to apply to the transaction type:

Parameter | Input
---|---
Aggregates | Enter the fields to aggregate transactions by.
Filters | Enter the filters to apply to transactions.
Operators | Enter the operators to apply to transactions.
Group by | Enter a comma-separated list of columns to group by.

- Click Start.
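As an illustration only, these optional parameters behave like the clauses of a query over the downloaded transactions. The sketch below is hypothetical: the field names and the payload shape are not taken from Persefoni's API, and it only mirrors the inputs described in the table above.

```python
# Hypothetical sketch of the optional runtime parameters. The keys and values
# are illustrative placeholders, not documented Persefoni API inputs.
optional_parameters = {
    "aggregates": "<field to aggregate transactions by>",
    "filters": "<field to filter transactions on>",
    "operators": "<operator applied to the filter>",
    "group_by": "<comma-separated columns to group by>",
}

# Conceptually, the combination works like a query of the form:
#   SELECT <group_by columns>, SUM(<aggregates field>)
#   FROM transactions
#   WHERE <filters field> <operator> <value>
#   GROUP BY <group_by columns>
```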
***
With Persefoni, your company can easily calculate, analyze, and manage its carbon footprint. From Wdata, you can build a chain to download carbon accounting data from Persefoni and update tables with the latest datasets, such as for Environmental, Social, and Governance (ESG) reporting in Workiva.
When you run the chain, you can select the type of data to download from Persefoni:
- Activities, for all investment and lending activity used to calculate your carbon footprint
- Facilities, for your company's locations associated with transactions
- Emissions, for the greenhouse gas (GHG) emissions associated with transactions
Tip: To download all data types from Persefoni at once, you can create an optional chain that starts multiple runs of this chain at the same time.
Prerequisites
To build this chain, you'll need these connectors:
Note: All of this chain's commands use the default CloudRunner. No GroundRunners are needed.
Before you run the chain, identify:
- The IDs for the Persefoni resource group and language of the data to download.
- The IDs of the Wdata tables to upload activity, facility, and emission data into.
Step 1. Create the chain
- In Chain Builder, from Chains, click Create, Create chain.
- Under Setup, enter a name and description to help identify the chain and its intent.
- To enable the chain to run multiple times at once, select Allow concurrent runs.
- Under Variables, add a variable for the ID of each Wdata table to upload Persefoni data to, based on data type:
- activity_table_ID
- facility_table_ID
- emission_table_ID
- In Value, enter the default table ID for each variable.
- Click Save.
Step 2. Start with a Runtime Inputs event
To prompt for the type of data to download from Persefoni each time the chain runs, start with a Runtime inputs event:
- Move Runtime Inputs from under Trigger Events to Start.
- Select the Runtime Inputs event, and click Edit.
- Click Add input, and select DropdownField.
- Enter a name and description to help identify the input's context of a Persefoni data type.
- Select Required.
- Add each option for the input, and click Save:

Value | Display name
---|---
emissions | Emissions
facilities | Facilities
activities | Activities
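Note that the conditional rules in Step 3 compare against each option's Value (the lowercase string), not its Display name. A minimal sketch of this mapping, with a hypothetical variable name, looks like:

```python
# Dropdown options for the runtime input. The keys are the Values that the
# Step 3 conditionals test against; the Display names are only shown to the
# person running the chain. (The variable name here is illustrative.)
data_type_options = {
    "emissions": "Emissions",
    "facilities": "Facilities",
    "activities": "Activities",
}
```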
Step 3. Add logic to run commands based on data type
To run commands depending on the desired Persefoni data, add conditional logic and groups for each runtime input value—Emissions, Facilities, and Activities (the resulting branching is sketched after these steps):
- Move Conditional from under Events to the canvas, and drag a link to it from Start.
- Select the Conditional event, and click Edit.
- Under Basic info, enter a name and description to help identify the context of Emissions data.
- Under Conditions, add a rule based on the input selection of Emissions, and click Save:

Data type | Variable | Operator | Value
---|---|---|---
Select String. | Select the runtime input variable under Trigger. | Select =. | Enter emissions.

- Move Command group to the canvas, and drag a link to it from the Conditional event.
- Select the command group, and click Edit.
- Under Basic info, enter a name and description to identify its context of Emissions data, and click Save.
- Move another Conditional from under Events to the canvas, and drag a link to it from Start.
- Select this Conditional event, and click Edit.
- Under Basic info, enter a name and description to help identify the context of Facilities data.
- Under Conditions, add a rule based on the input selection of Facilities, and click Save:

Data type | Variable | Operator | Value
---|---|---|---
Select String. | Select the variable under Trigger, Runtime inputs. | Select =. | Enter facilities.

- Move two more command groups to the canvas, and drag a link to each from the Facilities Conditional event.
- For each command group, click Edit, enter a name and description to identify the context of its data (Facilities or Activities, respectively), and click Save.
- To run the Activities command group only when the runtime input is not Emissions or Facilities, double-click its link, select Error for Edit link condition, and click Close.
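Taken together, the routing in this step behaves roughly like the sketch below. The Python is only pseudocode for the canvas layout; the three group functions are stand-ins for the command groups, not real chain APIs.

```python
# Pseudocode sketch of the Step 3 branching (illustrative only).
def run_emissions_group() -> None:
    print("Run the Emissions command group")

def run_facilities_group() -> None:
    print("Run the Facilities command group")

def run_activities_group() -> None:
    print("Run the Activities command group")

def route_by_data_type(data_type: str) -> None:
    # data_type is the runtime input's Value from Step 2 (lowercase).
    if data_type == "emissions":
        run_emissions_group()        # Emissions Conditional, matching link
    elif data_type == "facilities":
        run_facilities_group()       # Facilities Conditional, matching link
    else:
        # The link whose condition is set to Error acts like this else branch:
        # anything that is not emissions or facilities runs the Activities group.
        run_activities_group()

route_by_data_type("activities")
```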
Step 4. Add commands to remove any datasets from each table
To prepare each Wdata table for the latest data from Persefoni, build these commands three times—once for each command group of Emissions, Facilities, or Activities—to remove any existing datasets (the overall flow is sketched after these steps):
- Under BizApps, select Workiva, and move List files to the canvas.
- Drag a link from Group start of the command group to the List files command.
- Select the List files command, and click Edit.
- Under Basic info, enter a name and description to help identify the intended data.
- Enter the command's properties, and click Save:

Property | Value
---|---
Connector | Select the Workiva connector to use.
Table ID | Select the chain variable for the ID of the table to import the data into. For example, for the Wdata table to upload Emissions into, select the chain variable for the Emissions table's ID.

- Under BizApps, select JSON, and move Array to CSV to the canvas.
- Drag a link from the List files command to the Array to CSV command.
- Select the Array to CSV command, and click Edit.
- Under Basic info, enter a name and description to help identify the intent to convert the list of the table's datasets to a comma-separated values (CSV) format.
- Enter the command's properties:

Property | Value
---|---
Connector | Select the JSON connector to use.
JSON data | Select the Files list output of the List files command.
Multi-value delimiter | Enter a comma (,).
Preview result | Select this checkbox.

- Add columns for IDs and names:

Column name | JSONpath
---|---
id | .id
name | .name

- In Delimiter, select Comma, and click Save.
- Under BizApps, select Tabular Transformation, and move Advanced query to the canvas.
- Drag a link from the Array to CSV command to the Advanced query command.
- Select the Advanced query command, and click Edit.
- Under Basic info, enter a name and description to help identify the intent to query the table's datasets.
- Select the Tabular Transformation connector to use.
- Under Tables, add the file with the data to query:

Property | Value
---|---
File | Select the Converted file output of the Array to CSV command.
Table name | Enter a.

- Enter the command's properties, and click Save:

Property | Value
---|---
Query | Enter select * from a.
Input delimiter | Select Comma.
Output delimiter | Select Comma.
Preview results | Select this checkbox.

- To run the next commands only when the table has an existing dataset, move Conditional from under Events to the canvas.
- Drag a link from the Advanced query command to the Conditional event.
- Select the Conditional event, and click Edit.
- Under Basic info, enter a name and description to help identify the intent to check for datasets.
- Under Conditions, add a rule based on whether the table already has a dataset, and click Save:

Data type | Variable | Operator | Value
---|---|---|---
Select Integer. | Select the Record count output of the Advanced query command. | Select >=. | Enter 1.

- Under BizApps, select Workiva, and move Un-import file from table to the canvas.
- Drag a link from the new Conditional event to the Un-import file from table command.
- Select the Un-import file from table command, and click Edit.
- Under Basic info, enter a name and description to help identify the intent to un-import datasets.
- Enter the command's properties:

Property | Value
---|---
Connector | Select the same Workiva connector as earlier.
Table ID | Select the chain variable for the ID of the table to import data into.
File ID | Select the Files list output of the List files command.

- In the File ID field, click the Files list output, and apply a transformation:

Transformation | Output | Value
---|---|---
Select Get value from JSON. | Select Text. | Enter 0 and id.

- Click Save.
- Under BizApps, select Workiva, and move Delete file to the canvas.
- Drag a link from the Un-import file from table command to the Delete file command.
- Select the Delete file command, and click Edit.
- Under Basic info, enter a name and description to help identify the intent to delete datasets.
- Enter the command's properties, and click Save:

Property | Value
---|---
Connector | Select the same Workiva connector as earlier.
File ID | Select ID from the File import output of the Un-import file from table command.
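Taken together, the commands in this step behave roughly like the sketch below. It is only an illustration: the helper parameters stand in for the List files, Un-import file from table, and Delete file chain commands and are not real API calls.

```python
import csv
import io

def clear_existing_dataset(table_id: str, list_files, unimport_file, delete_file) -> None:
    """Illustrative sketch of Step 4: remove any existing dataset from a Wdata table.

    list_files, unimport_file, and delete_file are placeholders for the
    List files, Un-import file from table, and Delete file chain commands.
    """
    # List files: returns a JSON array of the table's datasets.
    files = list_files(table_id)  # e.g. [{"id": "abc", "name": "emissions.csv"}]

    # Array to CSV: keep only the id and name columns (JSONpath .id and .name).
    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=["id", "name"])
    writer.writeheader()
    for item in files:
        writer.writerow({"id": item["id"], "name": item["name"]})

    # Advanced query (select * from a): count the records in the converted file.
    record_count = len(buffer.getvalue().splitlines()) - 1  # exclude the header row

    # Conditional: only continue when the table already has a dataset.
    if record_count >= 1:
        # Get value from JSON with 0 and id: the ID of the first listed file.
        first_file_id = files[0]["id"]
        unimport_file(table_id, first_file_id)  # Un-import file from table
        delete_file(first_file_id)              # Delete file
```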
Step 5. Add commands to download and upload Persefoni data
To pull and upload the latest data from Persefoni, build these commands three times—once for each command group of Emissions, Facilities, or Activities (the overall flow is sketched after these steps):
- Under BizApps, select Persefoni, and move Fetch transactions to the canvas.
- Drag a link from Group start of the command group to the Fetch transactions command.
Note: Since both the Fetch transactions and List files commands link to Group start, they'll start at the same time when the group runs.
- Select the Fetch transactions command, and click Edit.
- Under Basic info, enter a name and description to help identify the intent to download the respective data type.
- Enter the command's properties, and click Save:

Property | Value
---|---
Connector | Select the Persefoni connector to use.
Resource group ID | Enter the ID of the Persefoni resource group to download from.
Named query | Select the same data type as the command group—Emissions, Facilities, or Activities.
Language ID | Enter the language ID of the data to download.

- Under BizApps, select Workiva, and move Create file to the canvas.
- Drag a link from the Fetch transactions command to the Create file command.
- Select the Create file command, and click Edit.
- Under Basic info, enter a name and description to help identify the intent to create a comma-separated values (CSV) file of the data.
- Enter the command's properties, and click Save:

Property | Value
---|---
Connector | Select the Workiva connector to use.
Table ID | Select the chain variable for the ID of the table to import the data into. For example, for the Wdata table to upload Emissions into, select the chain variable for the Emissions table's ID.
File | Select the Fetched transactions output of the Fetch transactions command.
Name | Enter the name for the created dataset. For example, for Emissions data, enter emissions.csv.

- Under BizApps, select Workiva, and move Import file into table to the canvas.
- Drag a link from the Create file command to the Import file into table command.
- Select the Import file into table command, and click Edit.
- Under Basic info, enter a name and description to help identify the intent to import the dataset into the Wdata table.
- Enter the command's properties, and click Save:

Property | Value
---|---
Connector | Select the same Workiva connector as earlier.
Table ID | Select the same chain variable as the Create file command.
File ID | Select ID from the Result output of the Create file command.

- Drag links from both the previous Conditional event and Delete file command to the Create file command.
- To skip the Un-import file from table and Delete file commands when there is no existing dataset, double-click the link between the Conditional event and the Create file command, select Error for Edit link condition, and click Close.
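For orientation, the download-and-upload half of each command group behaves roughly like the sketch below. The helper parameters are placeholders for the Fetch transactions, Create file, and Import file into table chain commands, and the IDs are placeholders as well; none of this is a real Persefoni or Workiva API call.

```python
def refresh_table(data_type: str, table_id: str,
                  fetch_transactions, create_file, import_file_into_table) -> None:
    """Illustrative sketch of Step 5: pull one Persefoni data type into a Wdata table."""
    # Fetch transactions: download the named query matching the command group.
    transactions_csv = fetch_transactions(
        named_query=data_type,          # "emissions", "facilities", or "activities"
        resource_group_id="<persefoni-resource-group-id>",
        language_id="<persefoni-language-id>",
    )

    # Create file: store the fetched data as a dataset, e.g. emissions.csv.
    file_id = create_file(table_id, name=f"{data_type}.csv", content=transactions_csv)

    # Import file into table: load the new dataset into the Wdata table.
    import_file_into_table(table_id, file_id)
```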
Step 6. Publish and run the chain
After you set up the command groups for each data type, enable the chain to run:
- Click Publish.
- Enter any comment about its publication, and click Publish.
To run the chain:
- From Build, Chains, select Edit from the chain's menu.
- Click Chain Settings, enter the table IDs in their respective variables, and click Save.
- Click Execute and Run with inputs.
- Select the type of data to download from Persefoni, and click Start.
Step 7. Create optional chain to start concurrent runs for all data types
When you run the chain, it downloads only the data type selected for the runtime input. To download all data types from Persefoni at once, create a second chain that starts multiple runs at the same time (a minimal sketch follows these steps):
- In Chain Builder, from Chains, click Create, Create chain.
- Under Setup, enter a name and description to help identify the chain and its intent.
- To enable the chain to run multiple times at once, select Allow concurrent runs.
- Click Save.
- Move Command group to Start.
- Move three Run chain events to the canvas, one for each data type—Emissions, Facilities, and Activities.
- Drag a link from Group start to each Run chain event.
- For each Run chain event, click Edit, set it up for its respective data type, and click Save:

Property | Value
---|---
Name | Enter a name to identify the context of its data type.
Chain to run | Select the chain created earlier.
Chain runtime inputs | Select the respective data type—Emissions, Facilities, or Activities.

- Click Publish.
- Enter any comment about its publication, and click Publish.
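Conceptually, this second chain just kicks off three runs of the first chain at the same time, one per data type. A minimal Python sketch of that idea follows; run_chain is a placeholder for the Run chain event, not a real API.

```python
from concurrent.futures import ThreadPoolExecutor

def run_chain(data_type: str) -> None:
    # Placeholder for the Run chain event; each call stands in for one run of
    # the first chain with the given runtime input.
    print(f"Starting chain run for {data_type}")

# Start all three runs at the same time, one per Persefoni data type.
with ThreadPoolExecutor(max_workers=3) as pool:
    pool.map(run_chain, ["emissions", "facilities", "activities"])
```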