# National Climate Transparency Tool
<a name="about"></a>
The National Climate Transparency Tool is your gateway to ensure robust Measurement, Reporting and Verification (MRV) toward the Enhanced Transparency Framework (ETF) and to accelerate implementation of the Nationally Determined Contribution (NDC).
In an era where climate action is not just an aspiration but a responsibility, the National Climate Transparency System is a tool to support a country's uncompromising commitment to environmental stewardship. This module is an integral part of our open-source software ecosystem, built to support countries in easing the process of developing national data management systems for NDC tracking, GHG inventory data management, and Biennial Transparency Reporting (BTR). The interface, co-designed with countries and coupled with a robust backend, ensures that managing complex climate data becomes a seamless experience, fostering a culture of transparency and accountability.
By employing the National Climate Transparency tool, countries can effortlessly align their climate actions with international standards, ensuring a harmonized approach towards a sustainable future. The tool is engineered to provide a clear lens into the progress and impact of NDC actions, making compliance with international commitments and national institutional arrangements a streamlined process. This Digital Public Good codebase aims to encapsulate the essence of effective climate action management, and countries can configure, adapt, and build on it to meet national circumstances.
The system has three key features, planned for release by the third quarter of 2024.
- NDC Actions Tracking: Effortlessly track and report NDC mitigation / adaptation actions, programmes, projects, activities and support. The system is intended to support monitoring and reporting of national activities and finance, facilitating compliance with international reporting commitments. The codebase can be configured to national institutional arrangements and NDC context.
- GHG Inventory: Maintain a comprehensive inventory of greenhouse gas (GHG) emissions with ease. The system allows for accurate data collection (with Excel integration), automated calculations, and reporting, supporting informed decision-making.
- Reporting Module: Pull together the data collected and managed across the above two modules into the format required for reporting to the UNFCCC. The standard codebase uses the recently approved Common Tabular Format and can be configured to any other format.
## Index
The contents below are planned to be updated by the third quarter of 2024 based on user feedback and recent changes in international requirements.
- About
- Standards
- System Architecture
- Project Structure
- Run Services as Containers
- Run Services Locally
- Deploy System on the AWS Cloud
- Modules
- Web Frontend
- Localization
- API (Application Programming Interface)
- Status Page
- User Manual
- Demonstration Video
- Data Sovereignty
- Governance and Support
<a name="standards"></a>
## Standards

This codebase aims to fulfill the Digital Public Goods Standard, adheres to the UNDP Data Principles, and is built according to the Principles for Digital Development.
<a name="architecture"></a>
## System Architecture
<a name="deployment"></a>
### Deployment

System services can be deployed in two ways.
- As a Container - Each service boundary is containerized into a Docker container and can be deployed on any container orchestration service. Please refer to the Docker Compose file.
- As a Function - Each service boundary is packaged as a function (serverless) and can be hosted on any Function as a Service (FaaS) stack. Please refer to the Serverless configuration file.
### External Service Providers

All external services are accessed through a generic interface. This decouples the system implementation from the external services and enables extensibility to multiple providers.
#### File Service

Two options are implemented for static file hosting:

- NestJS static file hosting using local storage and container volumes.
- AWS S3 file storage.

More options can be added by implementing the file handler interface. Switch between them with the environment variable `FILE_SERVICE`. Supported types are `LOCAL` (default) and `S3`.
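The `FILE_SERVICE` switch described above can be sketched as follows. This is a minimal illustration only: the `FileHandler` interface name, method shapes, and URL formats here are assumptions for the sketch, not the actual codebase API.

```typescript
// Sketch of a pluggable file service behind a generic interface, selected by
// the FILE_SERVICE environment variable ("LOCAL" by default, or "S3").
// Interface name and method shapes are illustrative, not the codebase API.
interface FileHandler {
  publicUrl(path: string): string;
}

class LocalFileHandler implements FileHandler {
  // Serves files from local storage / container volumes via NestJS static hosting.
  publicUrl(path: string): string {
    return `/static/${path}`;
  }
}

class S3FileHandler implements FileHandler {
  constructor(private readonly bucket: string) {}

  // Serves files from an AWS S3 bucket.
  publicUrl(path: string): string {
    return `https://${this.bucket}.s3.amazonaws.com/${path}`;
  }
}

// Pass process.env.FILE_SERVICE as the first argument; unset falls back to LOCAL.
function createFileHandler(service: string = 'LOCAL', bucket = 'my-bucket'): FileHandler {
  switch (service) {
    case 'LOCAL':
      return new LocalFileHandler();
    case 'S3':
      return new S3FileHandler(bucket);
    default:
      throw new Error(`Unsupported FILE_SERVICE: ${service}`);
  }
}
```

Keeping the switch behind a single factory function is what makes additional providers easy to add: a new option only needs a class implementing the interface and one more `case`.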
<a name="structure"></a>
## Project Structure

```
.
├── .github                      # CI/CD [Github Actions files]
├── deployment                   # Declarative configuration files for initial resource creation and setup [AWS Cloudformation]
├── backend                      # System service implementation
│   └── services                 # Services implementation [NestJS application]
│       ├── src
│       │   ├── national-api     # National API [NestJS module]
│       │   ├── stats-api        # Statistics API [NestJS module]
│       │   └── async-operations-handler  # Async Operations Handler [NestJS module]
│       └── serverless.yml       # Service deployment scripts [Serverless + AWS Lambda]
├── web                          # System web frontend implementation [ReactJS]
├── .gitignore
├── docker-compose.yml           # Docker container definitions
└── README.md
```
<a name="container"></a>
## Run Services as Containers

- Update the docker compose file environment variables as required.
  - Currently all emails are disabled via the environment variable `IS_EMAIL_DISABLED`. While emails are disabled, the email payload is printed to the console instead. User account passwords (including the root user account) need to be extracted from this console log: search for a log line starting with `Password (temporary)` on the national container (`docker logs -f climate-transparency-national-1`).
  - Add or update the following environment variables to enable email functionality: `IS_EMAIL_DISABLED=false`, `SOURCE_EMAIL` (sender email address), `SMTP_ENDPOINT`, `SMTP_USERNAME`, `SMTP_PASSWORD`.
  - Use the `DB_PASSWORD` environment variable to change the PostgreSQL database password.
  - Configure the system root account email by updating the environment variable `ROOT_EMAIL`. If the email service is enabled, this address will receive an email with the root user password on the first docker start.
- Add user data:
  - Update the users.csv file to add users.
  - When updating the file, keep the header and replace the existing dummy data with your data.
  - These users will be added to the system on each docker restart.
- Run `docker-compose up -d --build`. This will build and start containers for the following services:
  - PostgreSQL DB container
  - National service
  - Analytics service
  - Async Operations Handler service
  - Migration service (this service shuts down automatically once the DB migration scripts complete)
  - React web server with Nginx
- Web frontend on http://localhost:9030/
- API endpoints:
  - Swagger documentation will be available at http://localhost:9000/local/national

<a name="local"></a>
## Run Services Locally

Follow the same steps mentioned above to run the services locally using Docker.
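The users.csv bootstrap from the container steps above can be sketched as follows. This is an illustrative parser only: the real column schema and seeding logic live in the codebase and may differ.

```typescript
// Illustrative parser for a users.csv bootstrap file: the first row is the
// header (which must be kept), and every following row is one user record.
// Column names are whatever the header declares; the codebase's actual schema
// may differ.
interface SeedUser {
  [column: string]: string;
}

function parseUsersCsv(csv: string): SeedUser[] {
  const [headerLine, ...rows] = csv.trim().split(/\r?\n/);
  if (!headerLine) return [];
  const headers = headerLine.split(',').map((h) => h.trim());
  return rows
    .filter((row) => row.trim().length > 0)
    .map((row) => {
      const values = row.split(',').map((v) => v.trim());
      // Pair each header with the value in the same position.
      return Object.fromEntries(headers.map((h, i) => [h, values[i] ?? '']));
    });
}
```

Because the file is re-read on every docker restart, the real seeding logic should be idempotent, i.e. skip users that already exist rather than creating duplicates.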
<a name="cloud"></a>
## Deploy System on the AWS Cloud

- Execute the following command to create all the required resources on AWS:

  ```
  aws cloudformation deploy --template-file ./deployment/aws-formation.yml --stack-name ndc-transparency-basic --parameter-overrides EnvironmentName=<stage> DBPassword=<password> --capabilities CAPABILITY_NAMED_IAM
  ```
- Set up the following Github Secrets to enable CI/CD:
  - `AWS_ACCESS_KEY_ID`
  - `AWS_SECRET_ACCESS_KEY`
- Run the CI/CD workflow manually to deploy all the lambda services immediately. It will create two lambda layers and the following lambda functions:
  - national-api: Handles all user and program creation. Triggered by external HTTP requests.
  - async-operations-handler: Handles all async operations, such as managing notification emails.
  - setup: Function to add initial system user data.
- Create the initial user data in the system by invoking the setup lambda function:

  ```
  aws lambda invoke \
    --function-name ndc-transparency-services-dev-setup --cli-binary-format raw-in-base64-out \
    --payload '{"rootEmail": "<Root user email>","systemCountryCode": "<System country Alpha 2 code>", "name": "<System country name>", "logoBase64": "<System country logo base64>"}' \
    response.json
  ```
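The payload for the setup invocation can also be assembled programmatically. The sketch below only shows how the fields fit together: the field names are taken from the invoke command above, while the helper itself and its validation rules are illustrative assumptions.

```typescript
// Assemble the JSON payload expected by the setup lambda shown above.
// Field names come from the invoke command; the helper is illustrative.
interface SetupPayload {
  rootEmail: string;
  systemCountryCode: string; // ISO 3166-1 alpha-2 code, e.g. "FR"
  name: string; // system country name
  logoBase64: string; // base64-encoded country logo
}

function buildSetupPayload(fields: SetupPayload): string {
  // Basic sanity checks before serializing (illustrative, not the lambda's rules).
  if (!fields.rootEmail.includes('@')) {
    throw new Error('rootEmail must be an email address');
  }
  if (!/^[A-Z]{2}$/.test(fields.systemCountryCode)) {
    throw new Error('systemCountryCode must be an alpha-2 country code');
  }
  return JSON.stringify(fields);
}
```

The resulting string is what goes into the `--payload` argument of `aws lambda invoke`.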
<a name="modules"></a>
## Modules

### UNDP Platform for Voluntary Bilateral Cooperation

UNDP Platform for Voluntary Bilateral Cooperation generation is implemented as a separate node module. Please refer to it for more information.
<a name="frontend"></a>
## Web Frontend

The web frontend is implemented using the ReactJS framework. Please refer to getting started with the React app for more information.
<a name="localization"></a>
## Localization

- Languages (current): English
- Languages (in progress): French, Spanish

To update translations or add new ones, see https://github.com/undp/national-climate-transparency/tree/main/web/src/locales/i18n
<a name="api"></a>
## API (Application Programming Interface)

For integration, refer to the RESTful Web API documentation available via Swagger. To access:

- National API: `APP_URL`/national
- Status API: `APP_URL`/stats
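As a small illustration of the two base paths above (a hypothetical helper, not part of the codebase), each API base URL is simply `APP_URL` plus the module path:

```typescript
// Hypothetical helper composing the API base paths listed above from APP_URL.
type ApiName = 'national' | 'stats';

function apiBaseUrl(appUrl: string, api: ApiName): string {
  // Strip any trailing slashes from APP_URL so the path joins cleanly.
  return `${appUrl.replace(/\/+$/, '')}/${api}`;
}
```

For example, `apiBaseUrl('https://transparency.example.org', 'national')` yields `https://transparency.example.org/national` (the host name here is a placeholder).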
<a name="resource"></a>
## Resource Requirements

| Resource | Minimum | Recommended |
|---|---|---|
| Memory | 4 GB | 8 GB |
| CPU | 4 Cores | 4 Cores |
| Storage | 20 GB | 50 GB |
| OS | Linux <br/> Windows Server 2016 and later versions | |

Note: The above resource requirements are for a single instance of each microservice.
<a name="status"></a>
## Status Page
Coming soon...
<a name="manual"></a>
## User Manual
Coming soon...
<a name="demo"></a>
## Demonstration Video
Coming soon...
<a name="data"></a>
## Data Sovereignty
The code is designed with data sovereignty at its core, empowering nations and organizations to have greater control and governance over their environmental data. Here are the key points highlighting how this system promotes data sovereignty:
- Local Control:
- Allows nations and entities to store, manage, and process their data locally or in a preferred jurisdiction, adhering to local laws and regulations.
- Open Source Architecture:
- Facilitates transparency, customization, and control over the software, enabling adaptation to specific legal and regulatory requirements.
- Decentralized Infrastructure:
- Supports a decentralized data management approach, minimizing reliance on external or centralized systems.
- Standardized yet Flexible Protocols:
- Provides standardized protocols for data management while allowing for local customization, aligning with the diverse legal landscapes.
- Secure Data Sharing and Access Control:
- Implements robust access control and secure data sharing mechanisms, ensuring only authorized entities can access or alter the data.
- Audit Trails:
- Offers comprehensive audit trails for all data transactions, ensuring traceability and accountability in data handling and reporting.
- Enhanced Privacy Compliance:
- Helps in ensuring compliance with privacy laws and regulations by providing tools for secure data handling and consent management.
By integrating these features, the code significantly contributes to achieving data sovereignty, promoting a more localized and accountable management of environmental data in line with the goals of the Paris Agreement.
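As a rough illustration of the audit-trail point above, here is a generic append-only log sketch. It only demonstrates the concept of traceable, unmodifiable transaction records; it is not the system's actual implementation, and the record fields are assumptions.

```typescript
// Generic append-only audit trail: every data transaction is recorded with
// who, what, and when, and records can be listed but never rewritten.
// Concept illustration only, not the system's implementation.
interface AuditRecord {
  readonly actor: string; // who performed the transaction
  readonly action: string; // what was done, e.g. "update-inventory"
  readonly timestamp: number; // when it happened (Unix milliseconds)
}

class AuditTrail {
  private readonly records: AuditRecord[] = [];

  record(actor: string, action: string): void {
    this.records.push({ actor, action, timestamp: Date.now() });
  }

  // Returns a copy so callers cannot rewrite history.
  list(): AuditRecord[] {
    return [...this.records];
  }
}
```

In a real deployment the same idea would be backed by durable storage (e.g. a database table with no update/delete path), so the trail survives restarts and supports accountability in reporting.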
<a name="support"></a>
## Governance and Support
Digital For Climate (D4C) is responsible for managing the application. D4C is a collaboration between the European Bank for Reconstruction and Development (EBRD), United Nations Development Programme (UNDP), United Nations Framework Convention on Climate Change (UNFCCC), International Emissions Trading Association (IETA), European Space Agency (ESA), and World Bank Group that aims to coordinate respective workflows and create a modular and interoperable end-to-end digital ecosystem for the carbon market. The overarching goal is to support a transparent, high-integrity global carbon market that can channel capital for impactful climate action and low-carbon development.
This code is managed by United Nations Development Programme as custodian, detailed in the press release. For any questions, contact us at digital4planet@undp.org.