[login to view URL] is a nonprofit dedicated to helping communities reduce their carbon emissions to net zero by 2040. Our key technology is the Carbon Tracker, a web-based application built with Angular 9 and Google Firestore and hosted by Google.
We have three Firebase environments: cure100-dev, cure100-staging, and cure100-prod. Currently, cure100-staging is serving as the "production" environment and stores the data of close to 200 users from about 3-4 communities.
We need backend support, especially:
• Ability to back up and restore any Firebase environment.
• Ability to upload and download a collection (as a JSON file) to or from any Firebase environment, either from a developer's machine or from another Firebase environment.
• Ability to ETL any Firebase environment into an SQL data lake, so that we can run queries at both the community and CURE100 levels. We are primarily considering Snowflake, but possibly also BigQuery, at least initially.
• Using Data Studio to create dashboards based on queries in Snowflake and/or BigQuery.
• Automating community subdomains, so that we can automatically route users from each community to the correct Firebase environment. Note:
1. Currently we are using only a single Firebase Environment for all communities.
2. When we move data from the current "cure100-staging" to "cure100-prod" we will change the reference on the backend, so from a community's webpage perspective, nothing changes.
3. Automation is a 2nd level requirement that will become important only when we scale beyond 50 or 100 communities.
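To illustrate the subdomain-automation idea above, the routing could boil down to a small lookup from a community's subdomain to the Firebase project that holds its data. This is only a sketch: the host names, community slugs, and project mapping below are hypothetical examples, not part of the current system.

```python
# Sketch: map a community's subdomain to the Firebase environment serving it.
# All host names and mappings here are hypothetical illustrations.

ENV_BY_COMMUNITY = {
    # community slug -> Firebase project serving that community
    "greenville": "cure100-prod",
    "riverside": "cure100-prod",
    "pilot-town": "cure100-staging",
}

DEFAULT_ENV = "cure100-prod"


def environment_for_host(host: str) -> str:
    """Return the Firebase project for a request host such as
    'greenville.carbontracker.example.org' (hypothetical domain)."""
    subdomain = host.split(".", 1)[0].lower()
    return ENV_BY_COMMUNITY.get(subdomain, DEFAULT_ENV)
```

With a table like this maintained in one place, moving a community between environments becomes a one-line configuration change rather than a code change.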
We have over 40 new communities that are waiting to join and use our Carbon Tracker. We are reluctant to add that many users before:
1. We have the initial backend support described above.
2. We have moved all user data from "cure100-staging" to "cure100-prod", in coordination with redirecting any traffic that uses old URLs pointing to "staging" over to "cure100-prod".
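The staging-to-prod move in step 2 could use Firestore's managed export/import: export cure100-staging to a Cloud Storage bucket, then import that export into cure100-prod. A hedged sketch follows; the bucket path is an assumption, and running it requires the `google-cloud-firestore` library plus GCP credentials with access to both projects.

```python
# Sketch of a staging -> prod migration via Firestore managed export/import.
# The bucket URI is a hypothetical example.

def export_request(project: str, bucket_uri: str) -> dict:
    """Build the request for FirestoreAdminClient.export_documents."""
    return {
        "name": f"projects/{project}/databases/(default)",
        "output_uri_prefix": bucket_uri,
    }


def import_request(project: str, export_uri: str) -> dict:
    """Build the request for FirestoreAdminClient.import_documents."""
    return {
        "name": f"projects/{project}/databases/(default)",
        "input_uri_prefix": export_uri,
    }


def migrate(source: str, target: str, bucket_uri: str) -> None:
    # Requires: pip install google-cloud-firestore, plus GCP credentials.
    from google.cloud import firestore_admin_v1

    client = firestore_admin_v1.FirestoreAdminClient()
    client.export_documents(request=export_request(source, bucket_uri)).result()
    client.import_documents(request=import_request(target, bucket_uri)).result()


if __name__ == "__main__":
    migrate("cure100-staging", "cure100-prod",
            "gs://cure100-migration/staging-export")  # hypothetical bucket
```

The export bucket must be accessible to both projects; the URL redirect would then be flipped once the import completes.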
Scope of Work
Set up, in Google Cloud Platform (GCP), the backend support for the Carbon Tracker described in the Background section above.
Implementation has to be done in such a way that:
1. A non-technical admin can easily use the tools via the GCP console or another console.
2. All source code, such as the cloud functions, is made available. If a programming language is required, we prefer Python or Node.js.
3. Code and settings can easily be updated in the future by a more technical admin.
1. Firebase Environment Backup and Restore
1. Daily backup of all collections into a storage bucket (except zip_code_data).
2. Backup should span at least 30 days, with an admin ability to increase or decrease.
2. Ability to restore any collection or collections from any selected backup.
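A minimal sketch of the daily backup job (items 1.1–1.2), assuming it is triggered on a daily schedule (e.g. by Cloud Scheduler): the collection filter and retention cutoff are plain helpers, and the actual export uses Firestore's managed export into a dated folder. The bucket name is a hypothetical example; restoring (item 2) would point `import_documents` at the chosen dated prefix.

```python
# Sketch: daily Firestore backup with an exclusion list and a retention window.
from datetime import date, timedelta

EXCLUDED = {"zip_code_data"}   # excluded per requirement 1.1
RETENTION_DAYS = 30            # admin-adjustable per requirement 1.2


def collections_to_back_up(all_ids):
    """Everything except the excluded collections."""
    return sorted(c for c in all_ids if c not in EXCLUDED)


def expired_prefixes(prefixes, today, retention_days=RETENTION_DAYS):
    """Backup folders named YYYY-MM-DD that fall outside the retention window
    and may be deleted from the bucket."""
    cutoff = today - timedelta(days=retention_days)
    return [p for p in prefixes if date.fromisoformat(p) < cutoff]


def run_daily_backup(project="cure100-staging",
                     bucket="gs://cure100-backups"):  # hypothetical bucket
    # Requires: pip install google-cloud-firestore, plus GCP credentials.
    from google.cloud import firestore, firestore_admin_v1

    db = firestore.Client(project=project)
    ids = [c.id for c in db.collections()]
    admin = firestore_admin_v1.FirestoreAdminClient()
    admin.export_documents(request={
        "name": f"projects/{project}/databases/(default)",
        "collection_ids": collections_to_back_up(ids),
        "output_uri_prefix": f"{bucket}/{date.today().isoformat()}",
    })
```

Because each day's export lands under its own `YYYY-MM-DD` prefix, a restore of any collection from any day is just an import of that prefix, filtered by `collection_ids`.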
3. Ability to move collections between any two Firebase environments.
1. Likely this will first move the data to the new environment's storage bucket, and then replace or add to an existing collection or collections.
4. Ability to move any data between a storage bucket and a developer's computer.
5. Move data from Firebase to a cloud SQL database: Snowflake (possibly also BigQuery if need be).
1. Snowflake is our preferred backend environment, but we are open to other suggestions.
2. Daily ETL of backed-up data into Snowflake (or BigQuery).
3. A set of 10 basic SQL queries. Note: we will provide the requested query logic.
i. On a Community Level: each community, identified by a unique ID, will have access to queries that present ONLY the data of its own users.
ii. On the overall CURE100 level: CURE100 can see data of all users from all communities.
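One way to realize the two query levels in item 5.3 with BigQuery (the Snowflake path is analogous) is to scope the community query with a bound parameter, so a community can only ever see rows matching its own ID. The table and column names (`users`, `community_id`, `total_kg_co2`) below are assumptions for illustration, not the actual schema.

```python
# Sketch: community-scoped vs. CURE100-wide queries against a hypothetical table.
USERS_TABLE = "cure100.carbon_tracker.users"  # hypothetical dataset/table


def community_sql(table=USERS_TABLE):
    """Parameterized query scoped to a single community (item 5.3.i)."""
    return (f"SELECT user_id, total_kg_co2 FROM `{table}` "
            "WHERE community_id = @community_id")


def cure100_sql(table=USERS_TABLE):
    """Organization-wide rollup across all communities (item 5.3.ii)."""
    return (f"SELECT community_id, COUNT(*) AS users FROM `{table}` "
            "GROUP BY community_id")


def run_community_query(community_id):
    # Requires: pip install google-cloud-bigquery, plus GCP credentials.
    from google.cloud import bigquery

    client = bigquery.Client()
    job = client.query(
        community_sql(),
        job_config=bigquery.QueryJobConfig(query_parameters=[
            bigquery.ScalarQueryParameter("community_id", "STRING", community_id),
        ]),
    )
    return list(job.result())
```

Binding the community ID as a query parameter, rather than interpolating it into the SQL string, both prevents injection and makes it straightforward to expose only the community-scoped query to community admins.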
6. A Data Studio Dashboard with Query Results
1. On a Community Level (a community can only see data of its own members).
2. On a CURE100 Level.
7. Improve data security across Firebase environments, storage buckets, the data lake, and dashboards.