Once the approach has been approved, we can update the GHGP policy, run the example data, publish the PCFs to the Hedera Network, and demonstrate how another Guardian policy (of a supply chain partner) can reference a dynamic PCF to support Scope 3 calculations. I believe Wes was interested in having this be a methodology breakdown.
API facilities to retrieve unique references (IDs) of results for API-triggered operations
Design a generic approach to the 'traceability' of API calls such that, for each API call, a chain of events and actions within the Guardian policy, and especially to outside systems, can be established via the unique IDs (a sketch of such a record follows the list below), culminating in:
Hedera transactions
Hedera topic messages
Hedera contract calls
Artifacts published on IPFS
Introduce a corresponding UI where users can visually observe the same information
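A minimal sketch, in TypeScript, of what such a traceability record and its retrieval could look like; the endpoint path and field names are illustrative assumptions rather than existing Guardian API definitions:

// Hypothetical shape of the traceability record for an API-triggered operation
interface OperationTrace {
  operationId: string;   // unique ID returned by the original API call
  events: Array<{
    type: 'hedera-transaction' | 'hedera-topic-message' | 'hedera-contract-call' | 'ipfs-artifact';
    reference: string;   // e.g. transaction ID, topic message sequence number, contract ID, or IPFS CID
    timestamp: string;
  }>;
}

// Hypothetical call retrieving the chain of events for a previously returned operation ID
const operationId = 'example-operation-id';
const trace: OperationTrace = await fetch(`/api/v1/operations/${operationId}/trace`).then(r => r.json());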
Guardian analytics: labels and top down data way points
Introduce 2 new workflows into Guardian, which include the corresponding roles and access permissions:
labels author, for users to be able to create the 'rulesets' for evaluating data for compliance with the chosen 'label',
auditor workflow, for users who would apply these 'rulesets' to data.
Introduce the concept of labels, which can be specified to combine multiple statistics (introduced in Guardian analytics: bottom-up data traceability #3336) to create 'higher-order' statistics, which themselves can be combined further, essentially enabling the creation of 'data transformation' trees which, when applied to data, would ultimately get resolved into binary compliant/non-compliant answers. The top-level 'nodes' in these trees are 'Labels' (see the sketch after this list).
Enhance the current capability of qualitative evaluations in Statistics to support the ability for users to attach external evidence and add textual comments/explanations whenever human input is enabled. The evidence would then become part of the 'evaluation trust-chain', i.e. it should be hashed and stored verifiably. Evidence in image formats should be viewable in the browser; archives (zip files), PDFs, and CSV files should be supported for attachment and subsequent download.
Enable Auditors to apply 'label rulesets' to tokens; Guardian would then automatically traverse the token trust-chain to find and evaluate the required data to produce the label conclusion, i.e. the compliant/non-compliant result. These results can optionally be published to IPFS/topics by the Auditors that generated them.
Enable ordinary users to search for statistics, label rulesets, and label conclusions that have been published.
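A minimal sketch, in TypeScript, of how a label 'ruleset' tree could be represented, with statistics as leaves and higher-order combinations above them; all names and the rule syntax are assumptions for illustration only:

// Hypothetical structure for a label 'ruleset' tree
interface LabelNode {
  name: string;
  statisticId?: string;                      // leaf nodes reference an underlying statistic
  children?: LabelNode[];                    // branch nodes combine lower-level results
  rule: (results: boolean[]) => boolean;     // resolves this node to compliant (true) / non-compliant (false)
}

const label: LabelNode = {
  name: 'High-integrity credit',
  rule: results => results.every(Boolean),   // compliant only if every child node is compliant
  children: [
    { name: 'Additionality demonstrated', statisticId: 'stat-additionality', rule: r => r[0] },
    { name: 'MRV data complete', statisticId: 'stat-mrv-completeness', rule: r => r[0] },
  ],
};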
Trustchain support for contract-based issuance and retirement implementation
Extend/modify trustchain implementation to support new contract-based issuance and retirement functionality such that users have visibility to the entire lifecycle of the token and have access to all significant artifacts produced as a result.
American Carbon Registry (ACR): ACR Methodology for Quantifying, Monitoring, Reporting, and Verifying Greenhouse Gas Emissions Reductions and Removals from Landfill Gas Destruction and Beneficial Use Projects
Creating the schema design for this methodology.
Development of the schema and policy.
Testing the policy development through Guardian UI and configurator.
Some interesting next steps could be to address the final data gaps; engage with PACT to review the policy and schema, provide feedback, and identify new use cases and features; engage with a third-party auditor to review or potentially verify the policy; and eventually have actual downstream supply chain partners referencing the PCF.
Formula Driven Definitions & Schema Tree Enhancement
A clear label (and an ability to define a formula) for a business user to build high-level definitions and link them to underlying schemas, tools, and patterns, more closely aligned with VCM formulaic definitions
Dry-run policy execution 'savepoints' - restart policy dry-run from the list of 'saved' places
Introduce new functionality for users to 'save' the dry-run execution status at arbitrary points by clicking a 'save state' button (hypothetical API calls are sketched after this list).
The system should support the creation of multiple save points for the same execution workflow
Next time the (draft) policy is executed in dry-run mode, users should be given a choice whether to restart from the beginning or continue execution from any of the 'save points'.
Starting execution from a 'save point' invalidates and removes all the other save points that logically followed it
It should be possible to delete some or all save points manually
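A minimal sketch, in TypeScript, of what the savepoint interactions could look like over the API; the endpoint paths are hypothetical and not part of the current Guardian API:

const policyId = 'example-policy-id';
const savepointId = 'example-savepoint-id';

// Save the current dry-run state at an arbitrary point
await fetch(`/api/v1/policies/${policyId}/dry-run/savepoints`, { method: 'POST' });

// List the existing savepoints for this policy
const savepoints = await fetch(`/api/v1/policies/${policyId}/dry-run/savepoints`).then(r => r.json());

// Continue execution from a chosen savepoint (invalidating the savepoints that logically followed it)
await fetch(`/api/v1/policies/${policyId}/dry-run/savepoints/${savepointId}/restore`, { method: 'POST' });

// Delete a savepoint manually
await fetch(`/api/v1/policies/${policyId}/dry-run/savepoints/${savepointId}`, { method: 'DELETE' });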
Improving the Guardian UI by adding more UI elements and more colorful headers which can be customized.
Creating a static landing page which will have the capability of performing project comparisons within the same instance using different parameters such as scale size, sectoral scopes, etc.
Implementing AI search to allow Project Developers to search for policies based on the information entered.
Implementing Guided Search to allow Project Developers to search for policies using different parameters within the same instance.
Implementation of a property field when a schema is created, which will be used for standardization as per the IWA specification.
Implement the policy deprecation workflow which includes:
Guardian UI allowing the issuing SR to discontinue a policy (version) or the entire policy from a certain date (in the future or 'now').
The policy grid should display a suitable marker against non-active policies, and a different one for those expiring soon.
An appropriate message posted in the corresponding Hedera topic recording the 'discontinuing' decision
For in-progress projects that have been registered and are operating under the policy it should be possible to 'switch' to the appropriate version of the policy which is still valid.
Gold Standard's Carbon Sequestration through Accelerated Carbonation of Concrete Aggregate Webinar
Design schemas for the Carbon Sequestration through Accelerated Carbonation of Concrete Aggregate methodology, create a PowerPoint presentation, and conduct a webinar.
Development of the policy using the schemas and workflow designed
Business User Policy Development Feature - schemas MVP
Create an Excel 'schema representation' standard suitable for non-technical users. Note: use existing Excel schemas from Tools and UNFCCC initiatives as guidance.
Create an explicit template for the above, downloadable from Guardian UI, which users can take and update/change to develop new schemas.
Create an Export/Import UI and tooling which would allow seamless transformation of schemas written in Excel into valid Guardian JSON schemas and vice versa
Ensure manual interventions are possible for corrections/adjustments of complex formulas and other issues.
Introduce support for geoTIFF and other raster types of data such that:
Guardian documents (i.e. in schemas) can reference raster data (in geoTIFF and other common formats) which are located on external (3rd party) systems (see the sketch after this list).
Guardian UI can display raster images and their georeferencing data when they are encountered in documents.
Guardian policy can access and manipulate (use in calculations, etc) data from raster sources.
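A minimal sketch, in TypeScript, of how a document field could reference an externally hosted geoTIFF together with its georeferencing data; the field names and structure are assumptions:

// Hypothetical schema fragment referencing raster data on an external (3rd party) system
const rasterFieldExample = {
  projectBoundaryRaster: {
    format: 'geoTIFF',                                       // assumed raster format identifier
    uri: 'https://example.org/data/project-boundary.tif',    // external location of the raster file
    crs: 'EPSG:4326',                                        // coordinate reference system
    bbox: [-47.93, -15.87, -47.85, -15.78],                  // [minLon, minLat, maxLon, maxLat]
  },
};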
Support externally controlled DIDs with keys in Guardian
Introduce a workflow into the Guardian where a DID Controller would introduce a dedicated verification method into the main DID, for which the private key would be stored and managed by a Guardian instance. This way Guardian would only be able to control the specific verification method's key, but not the rest of the DID.
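A minimal sketch of what such a DID document could look like, using the generic did:example method purely for illustration; only the dedicated verification method's private key would be held by Guardian, while the DID and its other keys remain under the external controller:

// Illustrative DID document (W3C DID Core shape); identifiers and keys are placeholders
const didDocument = {
  '@context': 'https://www.w3.org/ns/did/v1',
  id: 'did:example:123456',
  verificationMethod: [
    {
      id: 'did:example:123456#owner-key-1',      // key controlled by the external DID Controller
      type: 'Ed25519VerificationKey2018',
      controller: 'did:example:123456',
      publicKeyBase58: '<owner public key>',
    },
    {
      id: 'did:example:123456#guardian-key-1',   // dedicated method; private key stored and managed by Guardian
      type: 'Ed25519VerificationKey2018',
      controller: 'did:example:123456',
      publicKeyBase58: '<guardian-managed public key>',
    },
  ],
};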
Introduce the ability to mint Mitigation Asset Type tokens as the result of calculating the diff between the planned (and reported on the Environmental) and the actual results of the calculations based on the MRV data for a reporting period (see the sketch after this list). This would likely require:
New type of blocks in the policy definition language specifying 'target' numbers.
Policy Engine ability to mint different types of tokens depending on the conditions
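One possible interpretation of the diff calculation, sketched in TypeScript; the field names and the rule that only a positive difference is minted are assumptions:

// Hypothetical reporting-period result comparing the planned 'target' with the MRV-based actual
interface ReportingPeriodResult {
  plannedReductionTCO2e: number;   // 'target' number specified via the new policy block type
  actualReductionTCO2e: number;    // calculated from the MRV data for the period
}

function mitigationTokensToMint(result: ReportingPeriodResult): number {
  const diff = result.actualReductionTCO2e - result.plannedReductionTCO2e;
  // mint Mitigation Asset Type tokens only for the amount by which actuals exceed the plan
  return diff > 0 ? Math.floor(diff) : 0;
}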
Development of ACM0002: Grid-Connected Electricity Generation from Renewable Sources
Development of the policy with all details mentioned in the design schema.
The tools involved in this policy also need to be developed. The tools are listed below:
Tool 01- Tool for the demonstration and assessment of additionality
Tool 02- Combined tool to identify the baseline scenario and demonstrate additionality
Tool 03- Tool to calculate project or leakage CO2 emissions from fossil fuel combustion
Tool 05- Baseline, project and/or leakage emissions from electricity consumption and monitoring of electricity generation
Tool 07- Tool to calculate the emission factor for an electricity system
Tool 10- Tool to determine the remaining lifetime of equipment
Live project (data) migration across Policies, across Guardian instances
Implement User Interface (UI) and tooling allowing users to execute multiple cycles of 'export a live project' from a policy and 'import a live project' into another policy. This migration process should work irrespective of the policy versions, standard registries, and Guardian instances, automatically mapping data/documents to the corresponding policy steps in an intelligent way, referring to the Project Developer in situations needing human input via a convenient UI/UX ('User Experience'):
Project Developer can preview and assess the compatibility of policies and data, and the result of the migration using something analogous to the 'dry-run' mode.
For cases where the 'new' schemas and policy steps match perfectly the 'old' valid data/documents from the 'source', the 'old' ones should be automatically accepted into the 'target' policy flow with no human intervention.
Project Developer can review and select/guide the matching and the destination of the 'source' data/documents into the new policy flow with full visibility with regard to:
'source' and 'target' policy structure (side by side), with details of block parameters etc where required.
content of the original and destination documents with field-level granularity
Where data needs to be augmented and thus new signatures are required the corresponding Guardian users (e.g. Standard Registry) get requests to sign the data.
The migration process should be automated, and should result in the 'stopped' project/policy execution on the 'source platform' and 'resumed' from the same point in the policy flow on the 'destination' (other) platform, with full data and tokens visibility and provenance provability in the trust chain. The 'old' data and artifacts produced on the 'source' should be fully useable on the 'target', e.g.
used in reports
viewable in the UI
data referenceable and usable in calculations and other policy actions (such as minting)
operations on 'old' tokens are supported in the new policy smart contracts (retirement, exchanges, etc)
We need to integrate Fireblocks, a key management tool, to manage keys and secure Guardian. For complete information on Fireblocks, please see https://www.fireblocks.com/
Full project data comparison as produced/captured by policies
Introduce a comparison functionality where it would be possible to 'diff' arbitrary sections or the entire trust-chains for different tokens, potentially issued by different policies, such that the system would (see the sketch after this list):
graphically display the differences where a user would then be able to 'scroll' through and review them in the UI
get a numerical 'similarity score' indicating how similar the two 'chains' are
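A minimal sketch, in TypeScript, of one way a 'similarity score' could be computed, treating each trust-chain as a set of document hashes and using a Jaccard index; the actual scoring approach is not specified by this item:

function similarityScore(chainA: string[], chainB: string[]): number {
  const a = new Set(chainA);
  const b = new Set(chainB);
  const intersection = [...a].filter(hash => b.has(hash)).length;
  const union = new Set([...a, ...b]).size;
  return union === 0 ? 1 : intersection / union;   // 1 = identical chains, 0 = nothing in common
}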
Global environmental/Guardian data search (indexer) component for Hedera and IPFS
Improve the data storage and indexing capabilities of Guardian for the data belonging to the local instance such that complex analytical queries could be run efficiently, such as 'search for data similar to this' and 'what is the possibility of this being a double entry for something submitted elsewhere'.
Introduce a global search and indexing capability for data produced by other (all) instances such that the queries above could be run on the entire body of Guardian data produced from the beginning of time (in the blockchain sense).
Extend Block and policy discoverability/search #2281 for users to be able to preview the usage of a block without having to import the "other SR's" policy into their Guardian instance
Fundamentally separate the concept of users, roles and permissions in Guardian
Introduce a granular concept of permissions which could be assigned to users; a user could then perform a specific function if their assigned role 'contains' this permission (a sketch follows this list). These should include (but not be limited to):
Policy edit/submit for review
Policy view
Policy approval & publish
Introduce a "user admin" role, which allows:
defining new roles from permissions
assigning of roles to users
Create a permissioning system which verifies the actor's role before any action is taken throughout Guardian
Package a suitable set of the most common roles into Guardian so it can be operated immediately 'out of the box' without the need for additional configuration
Create a concept of 'delegation' where a user with a particular role/permission can explicitly 'delegate' this role/permission to another user
Introduce the functionality to produce a report (page, download) which lists all users and their roles/permissions mapping in the system
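A minimal sketch, in TypeScript, of how roles, permissions, and delegation could relate; the permission names are illustrative, not an existing Guardian enumeration:

type Permission = 'POLICY_EDIT' | 'POLICY_SUBMIT_FOR_REVIEW' | 'POLICY_VIEW' | 'POLICY_APPROVE_PUBLISH';

interface Role {
  name: string;
  permissions: Permission[];
}

interface User {
  did: string;
  roles: Role[];
  delegatedPermissions: Permission[];   // permissions explicitly delegated by another user
}

// Check performed before any action is taken throughout Guardian
function hasPermission(user: User, required: Permission): boolean {
  return user.roles.some(role => role.permissions.includes(required))
    || user.delegatedPermissions.includes(required);
}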
Create a Guardian 'transaction execution' service which would assure orderly transaction execution and status tracking, and provide intelligent retry and failure-recovery functionality such that required transactions would be guaranteed to be executed asynchronously once, and only once, and in the right order.
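A minimal sketch, in TypeScript, of the ordering and retry behaviour such a service could have; a real implementation would persist queue state and rely on idempotency keys so that retried submissions cannot execute twice:

interface PendingTransaction {
  id: string;                          // idempotency key
  status: 'pending' | 'submitted' | 'confirmed' | 'failed';
  execute: () => Promise<void>;        // submits the transaction once per attempt
}

async function processQueue(queue: PendingTransaction[], maxRetries = 3): Promise<void> {
  for (const tx of queue) {                          // strictly in order
    for (let attempt = 1; attempt <= maxRetries; attempt++) {
      try {
        tx.status = 'submitted';
        await tx.execute();
        tx.status = 'confirmed';
        break;                                       // done; move on to the next transaction
      } catch (err) {
        if (attempt === maxRetries) {
          tx.status = 'failed';
          throw err;                                 // stop so ordering is preserved for recovery
        }
        tx.status = 'pending';                       // an intelligent retry would add backoff here
      }
    }
  }
}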
Further evolution of policy comparison (a.k.a 'mass diff')
Relying on the work done in the course of #1793 (i.e. creating data structures (hashes) to enable more efficient comparison), allow for mass comparison of policies such that a user should be able to search for local policies 'similar' to or 'different' from some other policy based on some similarity threshold. This is related (but different) to #2281 as it focuses on 'easy diff' vs 'easy search'.
Add suitable API facilities which would allow programmatic access to the indexed data and analytics, which include policy structure data (such as formulas used in the various elements - e.g. Tools) as well as project data.
Filtering data for blocks is a stateful API; introduce stateless data filters for API usage
I don't necessarily think there is a hard requirement to remove the stateful nature of Guardian filtering, as we cannot predict which downstream API consumers are using this functionality, or what the effects on them would be, without some kind of deprecation notice.
So, the recommendation would be:
Add the ability to filter using a GET request, so data can be fetched and filtered in one action
(As an alternative, preferred) It would be preferable to enable filtering at the block level when retrieving data, so an API consumer does not need to add explicit filter blocks and can use the Guardian API in a more RESTful way by default.
Post a six-month deprecation notice for stateful usage of the filter (revert if it is a hard requirement for others)
As an example, the code enhancement could be implemented like this (tags are easier to reason about):
From old version:
// Current (stateful) version: the filter value is POSTed to the block first,
// and the filtered data has to be retrieved in a separate call
public function filterByTag(string $policyId, string $tag, string $uuid): object
{
    return (object) $this->httpClient->post("policies/{$policyId}/tag/{$tag}/blocks", [
        'filterValue' => $uuid
    ], true);
}
to:
// Proposed (stateless) version: the data is fetched and filtered in a single GET request
public function filterByTag(string $policyId, string $tag, string $uuid): object
{
    return (object) $this->httpClient->get("policies/{$policyId}/tag/{$tag}/blocks?filterValue={$uuid}");
}
Or provide/document clearly a mechanism to filter on an interface document block itself, which would be preferred.
Different token IDs for different projects by the same policy
Introduce the facility to dynamically create new TokenIDs and 'assign' them to (newly registered) specific projects such that all data associated with these specific projects would be linked to the corresponding TokenIDs upon minting instances of the token.
Ensure clear association with the same methodology for all TokenIDs and their respective trustchains. I.e. it should be clear that 'these tokens' have been issued by the same Policy, but for different projects.
Extend trust chain to show multiple tokens and multiple projects 'managed' by the same policy
Enhance MongoDB Integration by incorporating seamless support for popular third-party services, such as MongoDB Atlas.
The task at hand involves modifying the codebase to seamlessly integrate the new MongoDB Atlas connection string without the redundant mongodb:// prefix. The correct format for the DB_HOST environment variable should be mongodb+srv://:@staging.wj9lvfj.mongodb.net/?retryWrites=true&w=majority. This adjustment will ensure a successful and accurate connection to our MongoDB Atlas instance.
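A minimal sketch, in TypeScript with the Node MongoDB driver, of honouring a DB_HOST value that already carries the mongodb+srv:// scheme (the placeholder values are illustrative):

import { MongoClient } from 'mongodb';

// e.g. mongodb+srv://<user>:<password>@cluster.example.mongodb.net/?retryWrites=true&w=majority
const dbHost = process.env.DB_HOST ?? '';

// Only prepend a scheme for legacy host-only values; never double-prefix an Atlas connection string
const uri = dbHost.startsWith('mongodb') ? dbHost : `mongodb://${dbHost}`;
const client = new MongoClient(uri);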
Global Carbon Council (GCC) GCCM001: Methodology for Renewable Energy Generation Projects Supplying Electricity to Grid or Captive Consumers – Version 4.0
Creating the schema design for this methodology.
Development of the schema and policy.
Testing the policy development through Guardian UI and configurator.
Introduce facilities into the Guardian schema language which would allow the Guardian policy engine (and humans, when they read these schemas in JSON) to recognize which values should be considered default for documents based on these schemas (see the sketch after this list).
Make the Guardian policy engine UI pre-populate the default values into the fields of forms based on such schemas. The fact that these are default values automatically inserted into the field should be clearly identifiable, i.e. they need to look different from the values users explicitly put into the (other) fields.
All standard tools/libraries (e.g. for verification) should work with such schemas out of the box
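JSON Schema already provides a 'default' annotation keyword, so a schema fragment using it could look like the following sketch (field names are illustrative):

// Hypothetical Guardian schema fragment with default values the UI would pre-populate and visually mark
const schemaFragment = {
  type: 'object',
  properties: {
    reportingPeriodMonths: { type: 'integer', default: 12 },
    monitoringFrequency:   { type: 'string',  default: 'monthly' },
  },
};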
Calculation Logic for values in 'automatic fields' in schemas
Introduce facilities into the schema definition language which would allow for the referencing of other fields/values and the specification of the mathematics needed to calculate the field value based on those (see the sketch after this list).
It should be possible to reference fields from 'other' schemas in the chain - parents/children.
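A minimal sketch of what such an 'automatic field' definition could look like; the 'calculated'/'formula' keywords and the parent reference syntax are assumptions, not an existing Guardian feature:

// Hypothetical schema fragment where one field is derived from others, including a parent-schema field
const calculatedFieldExample = {
  baselineEmissions: { type: 'number' },
  projectEmissions:  { type: 'number' },
  emissionReductions: {
    type: 'number',
    calculated: true,
    formula: 'baselineEmissions - projectEmissions - parent.leakageEmissions',
  },
};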
Design and implement specialized analytics engine which would enable Guardian to identify, trace and display mathematical relations between data in different artifacts (VCs/VPs/tokens) including events (transactions/messages) on Hedera hashgraph, with unlimited traceability depth.
Intelligent 'understanding' of the nature of the transformations (e.g. in formulas in calculation blocks) is out of scope of this ticket; the analytics engine can view transformations as black boxes. If the 'original' data are used as 'input' into such a black box, for the purposes of this analytics reporting it can be assumed that the 'output' data depends on that 'original' data.
The system should correctly identify and display references to the 'original' data such as when VC document fields reference document fields in other VCs.
Users should be able to perform complex data searches with the scope limited to the dependencies graph.
Guardian UI should enable Project Developers to define (and name) new datapoints and publish these definitions in a standardized manner linking them both to the originator's author DID, project ID and policy ID.
Let's say I submit incorrect data to a block (as I would like to rely on the schema validation on the Guardian).
Looking at the logs, I received an error about JSON schema validation:
I am receiving a 500 status code back from a block submission at a point where I would normally expect to receive something akin to a 422 (Unprocessable Entity).
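A minimal sketch, in TypeScript with an Express-style error handler, of mapping a JSON schema validation failure to a 422 with details instead of a generic 500; the class and handler names are assumptions, not Guardian's actual implementation:

import { Request, Response, NextFunction } from 'express';

class SchemaValidationError extends Error {
  constructor(public details: unknown) { super('JSON schema validation failed'); }
}

// Express error handlers must keep the four-parameter signature even if 'next' is unused
function errorHandler(err: Error, req: Request, res: Response, next: NextFunction) {
  if (err instanceof SchemaValidationError) {
    return res.status(422).json({ message: err.message, details: err.details });
  }
  return res.status(500).json({ message: 'Internal server error' });
}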