Compare commits


47 Commits

Author SHA1 Message Date
nolash
31268c0b47 Add config for extended results for task graph generator 2021-09-30 08:50:01 +02:00
8a9d2ee0be Merge branch 'bvander/default-docker-compose' into 'master'
chore: gift defaults

See merge request grassrootseconomics/cic-internal-integration!282
2021-09-27 13:15:21 +00:00
3608fd1fc7 chore: gift defaults 2021-09-27 16:14:31 +03:00
0d275f358b Merge branch 'philip/removes-docker-compose-profiles' into 'master'
removes profile attrs from docker-compose file.

See merge request grassrootseconomics/cic-internal-integration!281
2021-09-27 12:33:19 +00:00
3aef2aa65f removes profile attrs from docker-compose file. 2021-09-27 15:26:22 +03:00
5644baefb2 Merge branch 'philip/bump-cic-ussd-version' into 'master'
Bumps cic-ussd version following token fix.

See merge request grassrootseconomics/cic-internal-integration!280
2021-09-27 11:37:42 +00:00
1a7c4deab6 Bumps cic-ussd version following token fix. 2021-09-27 14:27:10 +03:00
0389d8623d Merge branch 'philip/token-value-fix' into 'master'
Refactors to avoid conversion of zero values to wei.

See merge request grassrootseconomics/cic-internal-integration!279
2021-09-22 08:17:39 +00:00
cf64387d81 Refactors to avoid conversion of zero values to wei. 2021-09-22 11:10:33 +03:00
79bcc8a9f1 Merge branch 'dev-k8s-deploy-new-data-seeding' into 'master'
Dataseeding container

See merge request grassrootseconomics/cic-internal-integration!267
2021-09-21 06:04:44 +00:00
7b57f1b4c2 Dataseeding container 2021-09-21 06:04:44 +00:00
76b8519637 Merge branch 'philip/cache-migration' into 'master'
Fixes migrations in cic-cache (ideally)

See merge request grassrootseconomics/cic-internal-integration!276
2021-09-21 05:33:46 +00:00
e89aec76fa Fixes migrations in cic-cache (ideally) 2021-09-21 05:33:46 +00:00
a138a0ec75 Merge branch 'bvander/update-buildkit-cache' into 'master'
chore: clean old dockerfile and add docs for clearing cache

See merge request grassrootseconomics/cic-internal-integration!277
2021-09-20 07:01:53 +00:00
5128c7828c chore: clean old dockerfile and add docs for clearing cache 2021-09-20 09:59:12 +03:00
2f005195e5 Merge branch 'philip/ussd-cli-fix' into 'master'
Philip/ussd cli fix

See merge request grassrootseconomics/cic-internal-integration!275
2021-09-18 09:07:05 +00:00
fb8db3ffd2 Philip/ussd cli fix 2021-09-18 09:07:05 +00:00
b5f647c4aa Merge branch 'philip/ussd-demurrage' into 'master'
Philip/ussd demurrage

See merge request grassrootseconomics/cic-internal-integration!263
2021-09-17 11:15:43 +00:00
6019143ba1 Philip/ussd demurrage 2021-09-17 11:15:43 +00:00
610440b722 Merge branch 'lash/signer-missing-symbol' into 'master'
Update signer to fill in missing sign to wire symbol

See merge request grassrootseconomics/cic-internal-integration!273
2021-09-17 06:54:58 +00:00
Louis Holbrook
d65455fc29 Update signer to fill in missing sign to wire symbol 2021-09-17 06:54:58 +00:00
43f8d1c30c Merge branch 'bvander/docker-compose-restarts' into 'master'
chore: fix docker compose restarts

See merge request grassrootseconomics/cic-internal-integration!274
2021-09-16 11:17:42 +00:00
b855211eed chore: fix docker compose restarts 2021-09-16 13:06:53 +03:00
1e0c475f39 Merge branch 'lash/contract-migration-dump' into 'master'
fix(contract-migration): Replace missing environment

See merge request grassrootseconomics/cic-internal-integration!271
2021-09-15 13:36:10 +00:00
Louis Holbrook
3e6cf594e3 fix(contract-migration): Replace missing environment 2021-09-15 13:36:09 +00:00
b8f79a2dd1 Merge branch 'bvander/extend-timoeout-contract-migration' into 'master'
chore: extend timeout for contract migration

See merge request grassrootseconomics/cic-internal-integration!272
2021-09-15 11:08:14 +00:00
540c2fd950 chore: extend timeout for contract migration 2021-09-15 14:04:18 +03:00
b9b06eced8 Merge branch 'staging' into 'master'
feat: add multi-project build to trigger deploy-k8s-dev in the devops repo

See merge request grassrootseconomics/cic-internal-integration!268
2021-09-09 21:28:49 +00:00
949bb29379 feat: add multi-project build to trigger deploy-k8s-dev in the devops repo 2021-09-09 21:28:49 +00:00
Louis Holbrook
0468906601 Merge branch 'lash/contract-migration-dump' into 'master'
fix(contract-migration): Make http signer work with local deployment

See merge request grassrootseconomics/cic-internal-integration!270
2021-09-09 13:38:10 +00:00
nolash
471243488e Add latest crypto-dev-signer install explicitly in dockerfile for cic-eth 2021-09-09 14:27:47 +02:00
3c4acd82ff test eth and meta only mrs 2021-09-07 11:56:17 -07:00
e07f992c5a Add new file 2021-09-07 18:38:20 +00:00
17e95cb19c Merge branch 'bvander/signer-as-service' into 'master'
run signer as a service over http

See merge request grassrootseconomics/cic-internal-integration!265
2021-09-06 19:10:08 +00:00
3c3a97ce15 run signer as a service over http 2021-09-06 12:07:57 -07:00
a492be4927 Merge branch 'lash/contract-migration-config' into 'master'
Sanitize contract migration configs

See merge request grassrootseconomics/cic-internal-integration!262
2021-09-06 10:06:58 +00:00
Louis Holbrook
1f555748b0 Sanitize contract migration configs 2021-09-06 10:06:58 +00:00
8aa4d20eea Update .gitlab-ci.yml 2021-09-01 20:00:44 +00:00
Louis Holbrook
90cf24dcee Merge branch 'lash/lockfix' into 'master'
Rehabilitate imports

See merge request grassrootseconomics/cic-internal-integration!261
2021-09-01 16:54:10 +00:00
Louis Holbrook
75b711dbd5 Rehabilitate imports 2021-09-01 16:54:10 +00:00
c21c1eb2ef fix data seeding node installs 2021-08-31 11:43:01 -07:00
eb5e612105 minor update to import_ussd script 2021-08-30 11:09:47 -07:00
e017d11770 update readme 2021-08-30 10:14:22 -07:00
e327af68e1 Merge branch 'philip/refactor-import-scripts' into 'master'
Consolidated ussd dataseeding script

See merge request grassrootseconomics/cic-internal-integration!252
2021-08-29 09:55:47 +00:00
92cc6a3f27 Consolidated ussd dataseeding script 2021-08-29 09:55:47 +00:00
f42bf7754a Merge branch 'lash/lockfix' into 'master'
Normalize initial INIT lock address

See merge request grassrootseconomics/cic-internal-integration!260
2021-08-29 09:07:46 +00:00
nolash
7342927e91 Normalize initial INIT lock address 2021-08-29 10:43:12 +02:00
124 changed files with 3446 additions and 4860 deletions

View File

@@ -12,56 +12,32 @@ include:
stages:
- build
- test
- deploy
- deploy
image: registry.gitlab.com/grassrootseconomics/cic-internal-integration/docker-with-compose:latest
image: registry.gitlab.com/grassrootseconomics/cic-internal-integration/docker-with-compose:latest
variables:
DOCKER_BUILDKIT: "1"
COMPOSE_DOCKER_CLI_BUILD: "1"
COMPOSE_DOCKER_CLI_BUILD: "1"
CI_DEBUG_TRACE: "true"
TAG: $CI_COMMIT_REF_SLUG-$CI_COMMIT_SHORT_SHA
before_script:
- docker login -u gitlab-ci-token -p $CI_JOB_TOKEN $CI_REGISTRY
# runs on protected branches and pushes to repo
build-push:
stage: build
tags:
- integration
before_script:
- docker login -u gitlab-ci-token -p $CI_JOB_TOKEN $CI_REGISTRY
#script:
# - TAG=$CI_COMMIT_REF_SLUG-$CI_COMMIT_SHORT_SHA sh ./scripts/build-push.sh
script:
- TAG=$TAG sh ./scripts/build-push.sh
- TAG=latest sh ./scripts/build-push.sh
rules:
- if: $CI_COMMIT_REF_PROTECTED == "true"
when: always
deploy-k8s-dev:
deploy-dev:
stage: deploy
image: line/kubectl-kustomize
variables:
CI_DEBUG_TRACE: "true"
script:
- kubectl config set-cluster k8s --server="${K8S_DEV_SERVER?dev server missing}"
- kubectl config set clusters.k8s.certificate-authority-data ${K8S_DEV_CERTIFICATE_AUTHORITY_DATA}
- kubectl config set-credentials gitlab --token="${K8S_DEV_USER_TOKEN}"
- kubectl config set-context grassroots --cluster=k8s --user=gitlab --namespace grassroots
- kubectl config use-context grassroots
#- sed -i "s/<VERSION>/${CI_COMMIT_SHORT_SHA}/g" deployment.yaml
#- kubectl apply -f deployment.yaml
- echo "Wiping state..."
- kubectl delete jobs.batch --all
- kubectl delete hr postgresql && kubectl delete pvc -l 'app.kubernetes.io/name=postgresql'
- kubectl delete sts,pvc -l 'app=bloxberg-validator'
- kubectl delete hr redis && kubectl delete pvc -l 'app=redis'
- kubectl apply -f kubernetes/eth-node/ -f kubernetes/postgresql/ -f kubernetes/redis/
- echo "deploy and run database migrations..."
# set image based on deploy tag
- bash ./scripts/set-image.sh
- kubectl apply -f .
- echo "run contract migrations..."
- kubectl apply -f kubernetes/contract-migration/contract-migration-job.yaml
rules:
- if: $CI_COMMIT_REF_PROTECTED == "true"
when: always
trigger: grassrootseconomics/devops
when: manual

View File

@@ -2,25 +2,21 @@
## Getting started
## Make some keys
This repo uses docker-compose and docker buildkit. Set the following environment variables to get started:
```
docker build -t bloxie . && docker run -v "$(pwd)/keys:/root/keys" --rm -it -t bloxie account new --chain /root/bloxberg.json --keys-path /root/keys
export COMPOSE_DOCKER_CLI_BUILD=1
export DOCKER_BUILDKIT=1
```
### Prepare the repo
This is stuff we need to put in a makefile, but for now...
File mounts and permissions need to be set
start services, database, redis and a local ethereum node
```
chmod -R 755 scripts/initdb apps/cic-meta/scripts/initdb
```
start cluster
docker-compose up -d
```
docker-compose up
Run app/contract-migration to deploy contracts
```
RUN_MASK=3 docker-compose up contract-migration
```
stop cluster
@@ -28,9 +24,9 @@ stop cluster
docker-compose down
```
delete data
stop cluster and delete data
```
docker-compose down -v
docker-compose down -v --remove-orphans
```
rebuild an image
@@ -38,5 +34,7 @@ rebuild an images
docker-compose up --build <service_name>
```
Deployment variables are written to service-configs/.env after everything is up.
to delete the buildkit cache
```
docker builder prune --filter type=exec.cachemount
```

View File

@@ -27,11 +27,11 @@ class RPC:
@staticmethod
def from_config(config):
chain_spec = ChainSpec.from_chain_str(config.get('CHAIN_SPEC'))
RPCConnection.register_location(config.get('RPC_HTTP_PROVIDER'), chain_spec, 'default')
RPCConnection.register_location(config.get('RPC_PROVIDER'), chain_spec, 'default')
if config.get('SIGNER_PROVIDER'):
RPCConnection.register_constructor(ConnType.UNIX, EthUnixSignerConnection, tag='signer')
RPCConnection.register_location(config.get('SIGNER_PROVIDER'), chain_spec, 'signer')
rpc = RPC(chain_spec, config.get('RPC_HTTP_PROVIDER'), signer_provider=config.get('SIGNER_PROVIDER'))
rpc = RPC(chain_spec, config.get('RPC_PROVIDER'), signer_provider=config.get('SIGNER_PROVIDER'))
logg.info('set up rpc: {}'.format(rpc))
return rpc

View File

@@ -9,7 +9,6 @@ psycopg2==2.8.6
celery==4.4.7
redis==3.5.3
chainsyncer[sql]>=0.0.6a3,<0.1.0
erc20-faucet>=0.3.2a1, <0.4.0
chainlib-eth>=0.0.9a7,<0.1.0
chainlib>=0.0.9a3,<0.1.0
eth-address-index>=0.2.3a1,<0.3.0
erc20-faucet>=0.3.2a2, <0.4.0
chainlib-eth>=0.0.9a14,<0.1.0
eth-address-index>=0.2.3a4,<0.3.0

View File

@@ -1,5 +1,4 @@
celery==4.4.7
erc20-demurrage-token~=0.0.3a1
cic-eth-registry~=0.5.8a1
chainlib~=0.0.7a1
cic_eth~=0.12.2a4
cic-eth-registry>=0.6.1a2,<0.7.0
cic-eth[services]~=0.12.4a8

View File

@@ -9,8 +9,8 @@ build-test-cic-eth:
- cd apps/cic-eth
- docker build -t $MR_IMAGE_TAG -f docker/Dockerfile .
- docker run $MR_IMAGE_TAG sh docker/run_tests.sh
#rules:
#- if: $CI_PIPELINE_SOURCE == "merge_request_event"
# changes:
# - apps/$APP_NAME/**/*
# when: always
rules:
- if: $CI_PIPELINE_SOURCE == "merge_request_event"
changes:
- apps/$APP_NAME/**/*
when: always

View File

@@ -35,14 +35,14 @@ class RPC:
def from_config(config, use_signer=False, default_label='default', signer_label='signer'):
chain_spec = ChainSpec.from_chain_str(config.get('CHAIN_SPEC'))
RPCConnection.register_location(config.get('RPC_HTTP_PROVIDER'), chain_spec, default_label)
RPCConnection.register_location(config.get('RPC_PROVIDER'), chain_spec, default_label)
if use_signer:
RPCConnection.register_constructor(ConnType.UNIX, EthUnixSignerConnection, signer_label)
RPCConnection.register_constructor(ConnType.HTTP, EthHTTPSignerConnection, signer_label)
RPCConnection.register_constructor(ConnType.HTTP_SSL, EthHTTPSignerConnection, signer_label)
RPCConnection.register_location(config.get('SIGNER_PROVIDER'), chain_spec, signer_label)
rpc = RPC(chain_spec, config.get('RPC_HTTP_PROVIDER'), signer_provider=config.get('SIGNER_PROVIDER'))
rpc = RPC(chain_spec, config.get('RPC_PROVIDER'), signer_provider=config.get('SIGNER_PROVIDER'))
logg.info('set up rpc: {}'.format(rpc))
return rpc

View File

@@ -8,8 +8,8 @@ Create Date: 2021-04-02 18:41:20.864265
import datetime
from alembic import op
import sqlalchemy as sa
from chainlib.eth.constant import ZERO_ADDRESS
from cic_eth.db.enum import LockEnum
from cic_eth.encode import ZERO_ADDRESS_NORMAL
# revision identifiers, used by Alembic.
@@ -30,7 +30,7 @@ def upgrade():
sa.Column("otx_id", sa.Integer, sa.ForeignKey('otx.id'), nullable=True),
)
op.create_index('idx_chain_address', 'lock', ['blockchain', 'address'], unique=True)
op.execute("INSERT INTO lock (address, date_created, blockchain, flags) VALUES('{}', '{}', '::', {})".format(ZERO_ADDRESS, datetime.datetime.utcnow(), LockEnum.INIT | LockEnum.SEND | LockEnum.QUEUE))
op.execute("INSERT INTO lock (address, date_created, blockchain, flags) VALUES('{}', '{}', '::', {})".format(ZERO_ADDRESS_NORMAL, datetime.datetime.utcnow(), LockEnum.INIT | LockEnum.SEND | LockEnum.QUEUE))
def downgrade():
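The migration change above swaps the checksummed `ZERO_ADDRESS` constant for a normalized form from `cic_eth.encode`. A minimal sketch of what such normalization typically does, assuming the normal form is simply the lowercased hex address without the `0x` prefix (the actual `ZERO_ADDRESS_NORMAL` definition may differ):

```python
# Hypothetical reconstruction; the real cic_eth.encode.ZERO_ADDRESS_NORMAL
# may be defined differently.
ZERO_ADDRESS = '0x' + '00' * 20

def normalize(address: str) -> str:
    """Strip the 0x prefix and lowercase the hex digits."""
    if address.startswith('0x'):
        address = address[2:]
    return address.lower()

ZERO_ADDRESS_NORMAL = normalize(ZERO_ADDRESS)
```

With a normalized value in the initial `lock` row, lookups that compare stripped addresses match without per-query prefix handling.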

View File

@@ -3,7 +3,10 @@ import logging
# external imports
import celery
from hexathon import strip_0x
from hexathon import (
strip_0x,
add_0x,
)
#from chainlib.eth.constant import ZERO_ADDRESS
from chainlib.chain import ChainSpec
from chainlib.eth.address import is_checksum_address
@@ -180,6 +183,7 @@ def check_gas(self, tx_hashes_hex, chain_spec_dict, txs_hex=[], address=None, ga
if not is_checksum_address(address):
raise ValueError('invalid address {}'.format(address))
address = tx_normalize.wallet_address(address)
address = add_0x(address)
tx_hashes = []
txs = []

View File

@@ -69,7 +69,6 @@ from cic_eth.registry import (
)
from cic_eth.task import BaseTask
logging.basicConfig(level=logging.WARNING)
logg = logging.getLogger()
@@ -204,6 +203,9 @@ def main():
argv.append(config.get('CELERY_QUEUE'))
argv.append('-n')
argv.append(config.get('CELERY_QUEUE'))
argv.append('--config')
argv.append('celeryconfig')
BaseTask.default_token_symbol = config.get('CIC_DEFAULT_TOKEN_SYMBOL')
BaseTask.default_token_address = registry.by_name(BaseTask.default_token_symbol)

View File

@@ -13,7 +13,6 @@ from chainlib.eth.nonce import RPCNonceOracle
from chainlib.eth.gas import RPCGasOracle
from cic_eth_registry import CICRegistry
from cic_eth_registry.error import UnknownContractError
import liveness.linux
# local imports
from cic_eth.error import SeppukuError
@@ -48,6 +47,7 @@ class BaseTask(celery.Task):
def on_failure(self, exc, task_id, args, kwargs, einfo):
if isinstance(exc, SeppukuError):
import liveness.linux
liveness.linux.reset(rundir=self.run_dir)
logg.critical(einfo)
msg = 'received critical exception {}, calling shutdown'.format(str(exc))

View File

@@ -10,7 +10,7 @@ version = (
0,
12,
4,
'alpha.7',
'alpha.8',
)
version_object = semver.VersionInfo(

View File

@@ -1,4 +0,0 @@
[signer]
secret = deadbeef
#database = crypto-dev-signer
socket_path = ipc:///run/crypto-dev-signer/jsonrpc.ipc

View File

@@ -1,8 +0,0 @@
[database]
NAME=cic-signer
USER=
PASSWORD=
HOST=
PORT=
ENGINE=
DRIVER=

View File

@@ -1,8 +1,6 @@
@node cic-eth configuration
@section Configuration
(refer to @code{cic-base} for a general overview of the config pipeline)
Configuration parameters are grouped by configuration filename.
@@ -40,7 +38,26 @@ Boolean value. If set, the amount of available context for a task in the result
@subsection database
See ref cic-base when ready
@table @var
@item host
Database host
@item port
Database port
@item name
Database name
@item user
Database user
@item password
Database password
@item engine
The engine part of the dsn connection string (@code{postgresql} in @code{postgresql+psycopg2})
@item driver
The driver part of the dsn connection string (@code{psycopg2} in @code{postgresql+psycopg2})
@item pool_size
Connection pool size for database drivers that provide connection pooling
@item debug
Output actual sql queries to logs. Potentially very verbose
@end table
@subsection eth

View File

@@ -13,7 +13,16 @@ ARG EXTRA_PIP_ARGS=""
# pip install --index-url https://pypi.org/simple \
# --force-reinstall \
# --extra-index-url $GITLAB_PYTHON_REGISTRY --extra-index-url $EXTRA_INDEX_URL \
# -r requirements.txt
# -r requirements.txt
RUN --mount=type=cache,mode=0755,target=/root/.cache/pip \
pip install --index-url https://pypi.org/simple \
--extra-index-url $GITLAB_PYTHON_REGISTRY \
--extra-index-url $EXTRA_INDEX_URL \
$EXTRA_PIP_ARGS \
cic-eth-aux-erc20-demurrage-token~=0.0.2a6
COPY *requirements.txt ./
RUN --mount=type=cache,mode=0755,target=/root/.cache/pip \
pip install --index-url https://pypi.org/simple \
@@ -22,19 +31,21 @@ RUN --mount=type=cache,mode=0755,target=/root/.cache/pip \
$EXTRA_PIP_ARGS \
-r requirements.txt \
-r services_requirements.txt \
-r admin_requirements.txt
COPY . .
RUN python setup.py install
ENV PYTHONPATH .
-r admin_requirements.txt
# always install the latest signer
RUN --mount=type=cache,mode=0755,target=/root/.cache/pip \
pip install --index-url https://pypi.org/simple \
--extra-index-url $GITLAB_PYTHON_REGISTRY \
--extra-index-url $EXTRA_INDEX_URL \
$EXTRA_PIP_ARGS \
cic-eth-aux-erc20-demurrage-token~=0.0.2a6
crypto-dev-signer
COPY . .
RUN python setup.py install
ENV PYTHONPATH .
COPY docker/entrypoints/* ./
RUN chmod 755 *.sh

View File

@@ -5,27 +5,27 @@ set -e
# set CONFINI_ENV_PREFIX to override the env prefix to override env vars
echo "!!! starting signer"
python /usr/local/bin/crypto-dev-daemon -c /usr/local/etc/crypto-dev-signer -vv 2> /tmp/signer.log &
#echo "!!! starting signer"
#python /usr/local/bin/crypto-dev-daemon -c /usr/local/etc/crypto-dev-signer -vv 2> /tmp/signer.log &
echo "!!! starting taskerd"
/usr/local/bin/cic-eth-taskerd $@
# thanks! https://docs.docker.com/config/containers/multi-service_container/
sleep 1;
echo "!!! entering monitor loop"
while true; do
ps aux | grep crypto-dev-daemon | grep -q -v grep
PROCESS_1_STATUS=$?
ps aux | grep cic-eth-tasker |grep -q -v grep
PROCESS_2_STATUS=$?
# If the greps above find anything, they exit with 0 status
# If they are not both 0, then something is wrong
if [ $PROCESS_1_STATUS -ne 0 -o $PROCESS_2_STATUS -ne 0 ]; then
echo "One of the processes has already exited."
exit 1
fi
sleep 15;
done
#sleep 1;
#echo "!!! entering monitor loop"
#while true; do
# ps aux | grep crypto-dev-daemon | grep -q -v grep
# PROCESS_1_STATUS=$?
# ps aux | grep cic-eth-tasker |grep -q -v grep
# PROCESS_2_STATUS=$?
# # If the greps above find anything, they exit with 0 status
# # If they are not both 0, then something is wrong
# if [ $PROCESS_1_STATUS -ne 0 -o $PROCESS_2_STATUS -ne 0 ]; then
# echo "One of the processes has already exited."
# exit 1
# fi
# sleep 15;
#done
#
set +e

View File

@@ -1,3 +1,3 @@
celery==4.4.7
chainlib-eth>=0.0.9a7,<0.1.0
chainlib-eth>=0.0.9a14,<0.1.0
semver==2.13.0

View File

@@ -75,7 +75,7 @@ def test_task_check_gas_ok(
'cic_eth.eth.gas.check_gas',
[
[
tx_hash_hex,
strip_0x(tx_hash_hex),
],
default_chain_spec.asdict(),
[],
@@ -283,4 +283,3 @@ def test_task_resend_explicit(
tx_after = unpack(bytes.fromhex(strip_0x(otx.signed_tx)), default_chain_spec)
logg.debug('gasprices before {} after {}'.format(tx_before['gasPrice'], tx_after['gasPrice']))
assert tx_after['gasPrice'] > tx_before['gasPrice']

View File

@@ -1,4 +1,4 @@
crypto-dev-signer>=0.4.15a1,<=0.4.15
crypto-dev-signer>=0.4.15a7,<=0.4.15
chainqueue>=0.0.5a1,<0.1.0
cic-eth-registry>=0.6.1a2,<0.7.0
redis==3.5.3

View File

@@ -9,8 +9,8 @@ build-test-cic-meta:
- cd apps/cic-meta
- docker build -t $MR_IMAGE_TAG -f docker/Dockerfile .
- docker run --entrypoint=sh $MR_IMAGE_TAG docker/run_tests.sh
#rules:
#- if: $CI_PIPELINE_SOURCE == "merge_request_event"
# changes:
# - apps/$APP_NAME/**/*
# when: always
rules:
- if: $CI_PIPELINE_SOURCE == "merge_request_event"
changes:
- apps/$APP_NAME/**/*
when: always

apps/cic-meta/README.md Normal file
View File

@@ -0,0 +1 @@
# CIC-Meta

View File

@@ -11,12 +11,12 @@ celery_app = celery.current_app
@celery_app.task
def persist_notification(recipient, message):
def persist_notification(message, recipient):
"""
:param recipient:
:type recipient:
:param message:
:type message:
:param recipient:
:type recipient:
:return:
:rtype:
"""

View File

@@ -11,12 +11,13 @@ local_logg = logging.getLogger(__name__)
@celery_app.task
def log(recipient, message):
def log(message, recipient):
"""
:param recipient:
:type recipient:
:param message:
:type message:
:param recipient:
:type recipient:
:return:
:rtype:
"""

View File

@@ -1,10 +1,12 @@
# standard imports
import json
import logging
from typing import Optional
# third-party imports
from cic_eth.api import Api
from cic_eth_aux.erc20_demurrage_token.api import Api as DemurrageApi
# local imports
from cic_ussd.account.transaction import from_wei
@@ -73,6 +75,24 @@ def calculate_available_balance(balances: dict) -> float:
return from_wei(value=available_balance)
def get_adjusted_balance(balance: int, chain_str: str, timestamp: int, token_symbol: str):
"""
:param balance:
:type balance:
:param chain_str:
:type chain_str:
:param timestamp:
:type timestamp:
:param token_symbol:
:type token_symbol:
:return:
:rtype:
"""
logg.debug(f'retrieving adjusted balance on chain: {chain_str}')
demurrage_api = DemurrageApi(chain_str=chain_str)
return demurrage_api.get_adjusted_balance(token_symbol, balance, timestamp).result
def get_cached_available_balance(blockchain_address: str) -> float:
"""This function attempts to retrieve balance data from the redis cache.
:param blockchain_address: Ethereum address of an account.
@@ -88,3 +108,14 @@ def get_cached_available_balance(blockchain_address: str) -> float:
return calculate_available_balance(json.loads(cached_balances))
else:
raise CachedDataNotFoundError(f'No cached available balance for address: {blockchain_address}')
def get_cached_adjusted_balance(identifier: bytes):
"""
:param identifier:
:type identifier:
:return:
:rtype:
"""
key = cache_data_key(identifier, ':cic.adjusted_balance')
return get_cached_data(key)

View File

@@ -1,5 +1,6 @@
# standard import
import decimal
import json
import logging
from typing import Dict, Tuple
@@ -8,6 +9,8 @@ from cic_eth.api import Api
from sqlalchemy.orm.session import Session
# local import
from cic_ussd.account.chain import Chain
from cic_ussd.account.tokens import get_cached_default_token
from cic_ussd.db.models.account import Account
from cic_ussd.db.models.base import SessionBase
from cic_ussd.error import UnknownUssdRecipient
@@ -59,7 +62,9 @@ def from_wei(value: int) -> float:
:return: SRF equivalent of value in Wei
:rtype: float
"""
value = float(value) / 1e+6
cached_token_data = json.loads(get_cached_default_token(Chain.spec.__str__()))
token_decimals: int = cached_token_data.get('decimals')
value = float(value) / (10**token_decimals)
return truncate(value=value, decimals=2)
@@ -70,7 +75,9 @@ def to_wei(value: int) -> int:
:return: Wei equivalent of value in SRF
:rtype: int
"""
return int(value * 1e+6)
cached_token_data = json.loads(get_cached_default_token(Chain.spec.__str__()))
token_decimals: int = cached_token_data.get('decimals')
return int(value * (10**token_decimals))
def truncate(value: float, decimals: int):
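The change above replaces the hardcoded `1e+6` scaling factor with the default token's cached `decimals` value. A standalone sketch of the conversion arithmetic, with the decimals passed in directly (the real code reads `decimals` from redis-cached token data; these helper signatures are illustrative):

```python
def truncate(value: float, decimals: int) -> float:
    # Truncate (not round) to the given number of decimal places.
    factor = 10 ** decimals
    return int(value * factor) / factor

def from_wei(value: int, token_decimals: int) -> float:
    """Convert an amount in the token's smallest unit to a display value."""
    return truncate(value / (10 ** token_decimals), 2)

def to_wei(value: float, token_decimals: int) -> int:
    """Convert a display value back to the token's smallest unit."""
    return int(value * (10 ** token_decimals))
```

A token with 6 decimals behaves exactly as the old hardcoded path did; tokens with other precisions now convert correctly as well.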

View File

@@ -44,7 +44,7 @@ class MetadataRequestsHandler(Metadata):
def create(self, data: Union[Dict, str]):
""""""
data = json.dumps(data)
data = json.dumps(data).encode('utf-8')
result = make_request(method='POST', url=self.url, data=data, headers=self.headers)
error_handler(result=result)

View File

@@ -1,13 +1,17 @@
# standard imports
import json
import logging
from datetime import datetime, timedelta
# external imports
import i18n.config
from sqlalchemy.orm.session import Session
# local imports
from cic_ussd.account.balance import calculate_available_balance, get_balances, get_cached_available_balance
from cic_ussd.account.balance import (calculate_available_balance,
get_adjusted_balance,
get_balances,
get_cached_adjusted_balance,
get_cached_available_balance)
from cic_ussd.account.chain import Chain
from cic_ussd.account.metadata import get_cached_preferred_language
from cic_ussd.account.statement import (
@@ -16,14 +20,15 @@ from cic_ussd.account.statement import (
query_statement,
statement_transaction_set
)
from cic_ussd.account.transaction import from_wei, to_wei
from cic_ussd.account.tokens import get_default_token_symbol
from cic_ussd.account.transaction import from_wei, to_wei
from cic_ussd.cache import cache_data_key, cache_data
from cic_ussd.db.models.account import Account
from cic_ussd.metadata import PersonMetadata
from cic_ussd.phone_number import Support
from cic_ussd.processor.util import latest_input, parse_person_metadata
from cic_ussd.processor.util import parse_person_metadata
from cic_ussd.translation import translation_for
from sqlalchemy.orm.session import Session
logg = logging.getLogger(__name__)
@@ -43,21 +48,26 @@ class MenuProcessor:
:rtype:
"""
available_balance = get_cached_available_balance(self.account.blockchain_address)
logg.debug('Requires call to retrieve tax and bonus amounts')
tax = ''
bonus = ''
adjusted_balance = get_cached_adjusted_balance(self.identifier)
token_symbol = get_default_token_symbol()
preferred_language = get_cached_preferred_language(self.account.blockchain_address)
if not preferred_language:
preferred_language = i18n.config.get('fallback')
return translation_for(
key=self.display_key,
preferred_language=preferred_language,
available_balance=available_balance,
tax=tax,
bonus=bonus,
token_symbol=token_symbol
)
with_available_balance = f'{self.display_key}.available_balance'
with_fees = f'{self.display_key}.with_fees'
if not adjusted_balance:
return translation_for(key=with_available_balance,
preferred_language=preferred_language,
available_balance=available_balance,
token_symbol=token_symbol)
adjusted_balance = json.loads(adjusted_balance)
tax_wei = to_wei(int(available_balance)) - int(adjusted_balance)
tax = from_wei(int(tax_wei))
return translation_for(key=with_fees,
preferred_language=preferred_language,
available_balance=available_balance,
tax=tax,
token_symbol=token_symbol)
def account_statement(self) -> str:
"""
@@ -67,7 +77,7 @@ class MenuProcessor:
cached_statement = get_cached_statement(self.account.blockchain_address)
statement = json.loads(cached_statement)
statement_transactions = parse_statement_transactions(statement)
transaction_sets = [statement_transactions[tx:tx+3] for tx in range(0, len(statement_transactions), 3)]
transaction_sets = [statement_transactions[tx:tx + 3] for tx in range(0, len(statement_transactions), 3)]
preferred_language = get_cached_preferred_language(self.account.blockchain_address)
if not preferred_language:
preferred_language = i18n.config.get('fallback')
@@ -149,12 +159,22 @@ class MenuProcessor:
:return:
:rtype:
"""
chain_str = Chain.spec.__str__()
token_symbol = get_default_token_symbol()
blockchain_address = self.account.blockchain_address
balances = get_balances(blockchain_address, Chain.spec.__str__(), token_symbol, False)[0]
balances = get_balances(blockchain_address, chain_str, token_symbol, False)[0]
key = cache_data_key(self.identifier, ':cic.balances')
cache_data(key, json.dumps(balances))
available_balance = calculate_available_balance(balances)
now = datetime.now()
if (now - self.account.created).days >= 30:
if available_balance <= 0:
logg.info(f'Not retrieving adjusted balance, available balance: {available_balance} is insufficient.')
else:
timestamp = int((now - timedelta(30)).timestamp())
adjusted_balance = get_adjusted_balance(to_wei(int(available_balance)), chain_str, timestamp, token_symbol)
key = cache_data_key(self.identifier, ':cic.adjusted_balance')
cache_data(key, json.dumps(adjusted_balance))
query_statement(blockchain_address)
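The menu change above derives a fee ("tax") as the difference between the cached available balance, converted back to wei, and the demurrage-adjusted balance returned by the aux API. A simplified sketch of that arithmetic, assuming 6 token decimals (the function name is illustrative, not from the source):

```python
def demurrage_tax(available_balance: float, adjusted_balance_wei: int,
                  token_decimals: int = 6) -> float:
    # Convert the display balance to wei, subtract the post-demurrage
    # balance, and convert the difference back to a display amount.
    available_wei = int(available_balance) * 10 ** token_decimals
    tax_wei = available_wei - adjusted_balance_wei
    return tax_wei / 10 ** token_decimals
```

The adjusted balance is only fetched when the account is at least 30 days old and the available balance is positive, which is why the renderer falls back to the fee-less translation key when no cached value exists.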

View File

@@ -63,7 +63,7 @@ elif ssl == 0:
else:
ssl = True
valid_service_codes = config.get('USSD_SERVICE_CODE').split(",")
def main():
# TODO: improve url building
@@ -79,9 +79,9 @@ def main():
session = uuid.uuid4().hex
data = {
'sessionId': session,
'serviceCode': config.get('APP_SERVICE_CODE'),
'serviceCode': valid_service_codes[0],
'phoneNumber': args.phone,
'text': config.get('APP_SERVICE_CODE'),
'text': "",
}
state = "_BEGIN"

View File

@@ -146,7 +146,7 @@ def create_ussd_session(
)
def update_ussd_session(ussd_session: UssdSession,
def update_ussd_session(ussd_session: DbUssdSession,
user_input: str,
state: str,
data: Optional[dict] = None) -> UssdSession:

View File

@@ -154,15 +154,14 @@ def parse_person_metadata(account: Account, metadata: dict):
phone_number = account.phone_number
date_registered = int(account.created.replace().timestamp())
blockchain_address = account.blockchain_address
chain_spec = f'{Chain.spec.common_name()}:{Chain.spec.engine()}: {Chain.spec.chain_id()}'
chain_str = Chain.spec.__str__()
if isinstance(metadata.get('identities'), dict):
identities = metadata.get('identities')
else:
identities = manage_identity_data(
blockchain_address=blockchain_address,
blockchain_type=Chain.spec.engine(),
chain_spec=chain_spec
chain_str=chain_str
)
return {

View File

@@ -138,26 +138,14 @@ def transaction_balances_callback(self, result: list, param: dict, status_code:
balances_data = result[0]
available_balance = calculate_available_balance(balances_data)
transaction = param
blockchain_address = transaction.get('blockchain_address')
transaction['available_balance'] = available_balance
queue = self.request.delivery_info.get('routing_key')
s_preferences_metadata = celery.signature(
'cic_ussd.tasks.metadata.query_preferences_metadata', [blockchain_address], queue=queue
)
s_process_account_metadata = celery.signature(
'cic_ussd.tasks.processor.parse_transaction', [transaction], queue=queue
)
s_notify_account = celery.signature('cic_ussd.tasks.notifications.transaction', queue=queue)
if transaction.get('transaction_type') == 'transfer':
celery.chain(s_preferences_metadata, s_process_account_metadata, s_notify_account).apply_async()
if transaction.get('transaction_type') == 'tokengift':
s_process_account_metadata = celery.signature(
'cic_ussd.tasks.processor.parse_transaction', [{}, transaction], queue=queue
)
celery.chain(s_process_account_metadata, s_notify_account).apply_async()
celery.chain(s_process_account_metadata, s_notify_account).apply_async()
@celery_app.task

View File

@@ -24,7 +24,8 @@ def transaction(notification_data: dict):
:rtype:
"""
role = notification_data.get('role')
amount = from_wei(notification_data.get('token_value'))
token_value = notification_data.get('token_value')
amount = token_value if token_value == 0 else from_wei(token_value)
balance = notification_data.get('available_balance')
phone_number = notification_data.get('phone_number')
preferred_language = notification_data.get('preferred_language')
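The notification fix above skips the `from_wei` conversion when the token value is zero, passing it through unchanged. A minimal illustration of the guard, with a stand-in `from_wei` (the real one depends on cached token data, per the decimals change elsewhere in this compare):

```python
def from_wei(value: int) -> float:
    # Stand-in for the real conversion, which reads token decimals
    # from the cache; assumes 6 decimals here.
    return value / 10 ** 6

def display_amount(token_value: int):
    # Zero values bypass conversion entirely.
    return token_value if token_value == 0 else from_wei(token_value)
```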

View File

@@ -8,6 +8,7 @@ import i18n
from chainlib.hash import strip_0x
# local imports
from cic_ussd.account.metadata import get_cached_preferred_language
from cic_ussd.account.statement import get_cached_statement
from cic_ussd.account.transaction import aux_transaction_data, validate_transaction_account
from cic_ussd.cache import cache_data, cache_data_key
@@ -58,19 +59,17 @@ def cache_statement(parsed_transaction: dict, querying_party: str):
@celery_app.task
def parse_transaction(preferences: dict, transaction: dict) -> dict:
def parse_transaction(transaction: dict) -> dict:
"""This function parses transaction objects and collates all relevant data for system use i.e:
- An account's set preferred language.
- Account identifier that facilitates notification.
- Contextual tags i.e action and direction tags.
:param preferences: An account's set preferences.
:type preferences: dict
:param transaction: Transaction object.
:type transaction: dict
:return: Transaction object with contextual data for use in the system.
:rtype: dict
"""
preferred_language = preferences.get('preferred_language')
preferred_language = get_cached_preferred_language(transaction.get('blockchain_address'))
if not preferred_language:
preferred_language = i18n.config.get('fallback')
transaction['preferred_language'] = preferred_language
@@ -83,6 +82,8 @@ def parse_transaction(preferences: dict, transaction: dict) -> dict:
alt_account = session.query(Account).filter_by(blockchain_address=alt_blockchain_address).first()
if alt_account:
transaction['alt_metadata_id'] = alt_account.standard_metadata_id()
else:
transaction['alt_metadata_id'] = 'GRASSROOTS ECONOMICS'
transaction['metadata_id'] = account.standard_metadata_id()
transaction['phone_number'] = account.phone_number
session.close()
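The refactor above drops the `preferences` argument: `parse_transaction` now reads the preferred language from the cache itself and falls back to the i18n default. A minimal sketch of that lookup-with-fallback, with hypothetical names standing in for `get_cached_preferred_language` and `i18n.config.get('fallback')`:

```python
FALLBACK_LANGUAGE = 'en'  # assumed; the real value comes from i18n.config.get('fallback')


def preferred_language_for(blockchain_address: str, cache: dict) -> str:
    # Look up the account's cached preferred language; fall back to the
    # configured default when nothing is cached.
    cached = cache.get(blockchain_address)
    return cached if cached else FALLBACK_LANGUAGE
```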


@@ -1,7 +1,7 @@
# standard imports
import semver
version = (0, 3, 1, 'alpha.4')
version = (0, 3, 1, 'alpha.5')
version_object = semver.VersionInfo(
major=version[0],


@@ -1,2 +0,0 @@
[app]
service_code = *483*46#


@@ -1,3 +1,4 @@
[ussd]
service_code = *483*46#
user =
pass =


@@ -10,6 +10,13 @@ RUN mkdir -vp data
ARG EXTRA_INDEX_URL="https://pip.grassrootseconomics.net:8433"
ARG GITLAB_PYTHON_REGISTRY="https://gitlab.com/api/v4/projects/27624814/packages/pypi/simple"
RUN --mount=type=cache,mode=0755,target=/root/.cache/pip \
pip install --index-url https://pypi.org/simple \
--extra-index-url $GITLAB_PYTHON_REGISTRY \
--extra-index-url $EXTRA_INDEX_URL \
cic-eth-aux-erc20-demurrage-token~=0.0.2a6
COPY requirements.txt .
RUN --mount=type=cache,mode=0755,target=/root/.cache/pip \


@@ -1,9 +1,12 @@
alembic==1.4.2
attrs==21.2.0
billiard==3.6.4.0
bcrypt==3.2.0
celery==4.4.7
cffi==1.14.6
cic-eth[services]~=0.12.4a7
cic-notify~=0.4.0a10
cic-types~=0.1.0a14
cic-types~=0.1.0a15
confini>=0.4.1a1,<0.5.0
phonenumbers==8.12.12
psycopg2==2.8.6


@@ -43,10 +43,13 @@ def test_sync_get_balances(activated_account,
(5000000, 89000000, 67000000, 27.00)
])
def test_calculate_available_balance(activated_account,
available_balance,
balance_incoming,
balance_network,
balance_outgoing,
available_balance):
cache_balances,
cache_default_token_data,
load_chain_spec):
balances = {
'address': activated_account.blockchain_address,
'converters': [],
@@ -57,7 +60,11 @@ def test_calculate_available_balance(activated_account,
assert calculate_available_balance(balances) == available_balance
def test_get_cached_available_balance(activated_account, cache_balances, balances):
def test_get_cached_available_balance(activated_account,
balances,
cache_balances,
cache_default_token_data,
load_chain_spec):
cached_available_balance = get_cached_available_balance(activated_account.blockchain_address)
available_balance = calculate_available_balance(balances[0])
assert cached_available_balance == available_balance


@@ -27,6 +27,8 @@ def test_filter_statement_transactions(transactions_list):
def test_generate(activated_account,
cache_default_token_data,
cache_statement,
cache_preferences,
celery_session_worker,
init_cache,
@@ -60,7 +62,7 @@ def test_get_cached_statement(activated_account, cache_statement, statement):
assert cached_statement[0].get('blockchain_address') == statement[0].get('blockchain_address')
def test_parse_statement_transactions(statement):
def test_parse_statement_transactions(cache_default_token_data, statement):
parsed_transactions = parse_statement_transactions(statement)
parsed_transaction = parsed_transactions[0]
assert parsed_transaction.startswith('Sent')
@@ -76,7 +78,7 @@ def test_query_statement(blockchain_address, limit, load_chain_spec, activated_a
assert mock_transaction_list_query.get('limit') == limit
def test_statement_transaction_set(preferences, set_locale_files, statement):
def test_statement_transaction_set(cache_default_token_data, load_chain_spec, preferences, set_locale_files, statement):
parsed_transactions = parse_statement_transactions(statement)
preferred_language = preferences.get('preferred_language')
transaction_set = statement_transaction_set(preferred_language, parsed_transactions)


@@ -36,19 +36,19 @@ def test_aux_transaction_data(preferences, set_locale_files, transactions_list):
check_aux_data('helpers.sent', 'helpers.to', preferred_language, sender_tx_aux_data)
@pytest.mark.parametrize("wei, expected_result", [
@pytest.mark.parametrize("value, expected_result", [
(50000000, Decimal('50.00')),
(100000, Decimal('0.10'))
])
def test_from_wei(wei, expected_result):
assert from_wei(wei) == expected_result
def test_from_wei(cache_default_token_data, expected_result, value):
assert from_wei(value) == expected_result
@pytest.mark.parametrize("value, expected_result", [
(50, 50000000),
(0.10, 100000)
])
def test_to_wei(value, expected_result):
def test_to_wei(cache_default_token_data, expected_result, value):
assert to_wei(value) == expected_result
@@ -96,6 +96,7 @@ def test_validate_transaction_account(activated_account, init_database, transact
@pytest.mark.parametrize("amount", [50, 0.10])
def test_outgoing_transaction_processor(activated_account,
amount,
cache_default_token_data,
celery_session_worker,
load_config,
load_chain_spec,


@@ -1,5 +1,6 @@
# standard imports
import json
import datetime
# external imports
from chainlib.hash import strip_0x
@@ -14,6 +15,7 @@ from cic_ussd.account.statement import (
)
from cic_ussd.account.tokens import get_default_token_symbol
from cic_ussd.account.transaction import from_wei, to_wei
from cic_ussd.cache import cache_data, cache_data_key
from cic_ussd.menu.ussd_menu import UssdMenu
from cic_ussd.metadata import PersonMetadata
from cic_ussd.phone_number import Support
@@ -38,24 +40,34 @@ def test_menu_processor(activated_account,
load_chain_spec,
load_support_phone,
load_ussd_menu,
mock_get_adjusted_balance,
mock_sync_balance_api_query,
mock_transaction_list_query,
valid_recipient):
preferred_language = get_cached_preferred_language(activated_account.blockchain_address)
available_balance = get_cached_available_balance(activated_account.blockchain_address)
token_symbol = get_default_token_symbol()
tax = ''
bonus = ''
display_key = 'ussd.kenya.account_balances'
with_available_balance = 'ussd.kenya.account_balances.available_balance'
with_fees = 'ussd.kenya.account_balances.with_fees'
ussd_menu = UssdMenu.find_by_name('account_balances')
name = ussd_menu.get('name')
resp = response(activated_account, display_key, name, init_database, generic_ussd_session)
assert resp == translation_for(display_key,
resp = response(activated_account, 'ussd.kenya.account_balances', name, init_database, generic_ussd_session)
assert resp == translation_for(with_available_balance,
preferred_language,
available_balance=available_balance,
token_symbol=token_symbol)
identifier = bytes.fromhex(strip_0x(activated_account.blockchain_address))
key = cache_data_key(identifier, ':cic.adjusted_balance')
adjusted_balance = 45931650.64654012
cache_data(key, json.dumps(adjusted_balance))
resp = response(activated_account, 'ussd.kenya.account_balances', name, init_database, generic_ussd_session)
tax_wei = to_wei(int(available_balance)) - int(adjusted_balance)
tax = from_wei(int(tax_wei))
assert resp == translation_for(key=with_fees,
preferred_language=preferred_language,
available_balance=available_balance,
tax=tax,
bonus=bonus,
token_symbol=token_symbol)
cached_statement = get_cached_statement(activated_account.blockchain_address)
@@ -123,6 +135,15 @@ def test_menu_processor(activated_account,
account_balance=available_balance,
account_token_name=token_symbol)
display_key = 'ussd.kenya.start'
ussd_menu = UssdMenu.find_by_name('start')
name = ussd_menu.get('name')
older_timestamp = (activated_account.created - datetime.timedelta(days=35))
activated_account.created = older_timestamp
init_database.flush()
response(activated_account, display_key, name, init_database, generic_ussd_session)
assert mock_get_adjusted_balance['timestamp'] == int((datetime.datetime.now() - datetime.timedelta(days=30)).timestamp())
display_key = 'ussd.kenya.transaction_pin_authorization'
ussd_menu = UssdMenu.find_by_name('transaction_pin_authorization')
name = ussd_menu.get('name')
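The test above derives the displayed fee as the gap between the nominal balance (converted to wei) and the cached demurrage-adjusted balance. A sketch of that arithmetic, assuming six token decimals; the helper names mirror the test but are self-contained stand-ins:

```python
from decimal import Decimal

DECIMALS = 6  # assumed token precision


def to_wei(value) -> int:
    return int(Decimal(value) * 10 ** DECIMALS)


def from_wei(value: int) -> Decimal:
    return Decimal(value) / 10 ** DECIMALS


def demurrage_fee(available_balance, adjusted_balance_wei) -> Decimal:
    # Mirrors the test: tax_wei = to_wei(int(available_balance)) - int(adjusted_balance)
    tax_wei = to_wei(int(available_balance)) - int(adjusted_balance_wei)
    return from_wei(tax_wei)
```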


@@ -49,6 +49,7 @@ def test_is_valid_transaction_amount(activated_account, amount, expected_result,
])
def test_has_sufficient_balance(activated_account,
cache_balances,
cache_default_token_data,
expected_result,
generic_ussd_session,
init_database,


@@ -121,6 +121,7 @@ def test_statement_callback(activated_account, mocker, transactions_list):
def test_transaction_balances_callback(activated_account,
balances,
cache_balances,
cache_default_token_data,
cache_person_metadata,
cache_preferences,
load_chain_spec,


@@ -13,7 +13,8 @@ from cic_ussd.translation import translation_for
# tests imports
def test_transaction(celery_session_worker,
def test_transaction(cache_default_token_data,
celery_session_worker,
load_support_phone,
mock_notifier_api,
notification_data,


@@ -30,10 +30,11 @@ def test_generate_statement(activated_account,
def test_cache_statement(activated_account,
cache_default_token_data,
cache_person_metadata,
cache_preferences,
celery_session_worker,
init_database,
preferences,
transaction_result):
recipient_transaction, sender_transaction = transaction_actors(transaction_result)
identifier = bytes.fromhex(strip_0x(activated_account.blockchain_address))
@@ -41,7 +42,7 @@ def test_cache_statement(activated_account,
cached_statement = get_cached_data(key)
assert cached_statement is None
s_parse_transaction = celery.signature(
'cic_ussd.tasks.processor.parse_transaction', [preferences, sender_transaction])
'cic_ussd.tasks.processor.parse_transaction', [sender_transaction])
result = s_parse_transaction.apply_async().get()
s_cache_statement = celery.signature(
'cic_ussd.tasks.processor.cache_statement', [result, activated_account.blockchain_address]
@@ -61,15 +62,15 @@ def test_cache_statement(activated_account,
def test_parse_transaction(activated_account,
cache_person_metadata,
cache_preferences,
celery_session_worker,
init_database,
preferences,
transaction_result):
recipient_transaction, sender_transaction = transaction_actors(transaction_result)
assert sender_transaction.get('metadata_id') is None
assert sender_transaction.get('phone_number') is None
s_parse_transaction = celery.signature(
'cic_ussd.tasks.processor.parse_transaction', [preferences, sender_transaction])
'cic_ussd.tasks.processor.parse_transaction', [sender_transaction])
result = s_parse_transaction.apply_async().get()
assert result.get('metadata_id') == activated_account.standard_metadata_id()
assert result.get('phone_number') == activated_account.phone_number


@@ -41,6 +41,22 @@ def mock_async_balance_api_query(mocker):
return query_args
@pytest.fixture(scope='function')
def mock_get_adjusted_balance(mocker, task_uuid):
query_args = {}
def get_adjusted_balance(self, token_symbol, balance, timestamp):
sync_res = mocker.patch('celery.result.AsyncResult')
sync_res.id = task_uuid
sync_res.result = 45931650.64654012
query_args['balance'] = balance
query_args['timestamp'] = timestamp
query_args['token_symbol'] = token_symbol
return sync_res
mocker.patch('cic_eth_aux.erc20_demurrage_token.api.Api.get_adjusted_balance', get_adjusted_balance)
return query_args
@pytest.fixture(scope='function')
def mock_notifier_api(mocker):
sms = {}


@@ -141,12 +141,22 @@ en:
0. Back
retry: |-
%{retry_pin_entry}
account_balances: |-
CON Your balances are as follows:
balance: %{available_balance} %{token_symbol}
fees: %{tax} %{token_symbol}
rewards: %{bonus} %{token_symbol}
0. Back
account_balances:
available_balance: |-
CON Your balances are as follows:
balance: %{available_balance} %{token_symbol}
0. Back
with_fees: |-
CON Your balances are as follows:
balances: %{available_balance} %{token_symbol}
fees: %{tax} %{token_symbol}
0. Back
with_rewards: |-
CON Your balances are as follows:
balance: %{available_balance} %{token_symbol}
fees: %{tax} %{token_symbol}
rewards: %{bonus} %{token_symbol}
0. Back
first_transaction_set: |-
CON %{first_transaction_set}
1. Next


@@ -140,12 +140,22 @@ sw:
0. Nyuma
retry: |-
%{retry_pin_entry}
account_balances: |-
CON Salio zako ni zifuatazo:
salio: %{available_balance} %{token_symbol}
ushuru: %{tax} %{token_symbol}
tuzo: %{bonus} %{token_symbol}
0. Nyuma
account_balances:
available_balance: |-
CON Salio zako ni zifuatazo:
salio: %{available_balance} %{token_symbol}
0. Nyuma
with_fees: |-
CON Salio zako ni zifuatazo:
salio: %{available_balance} %{token_symbol}
ushuru: %{tax} %{token_symbol}
0. Nyuma
with_rewards: |-
CON Salio zako ni zifuatazo:
salio: %{available_balance} %{token_symbol}
ushuru: %{tax} %{token_symbol}
tuzo: %{bonus} %{token_symbol}
0. Nyuma
first_transaction_set: |-
CON %{first_transaction_set}
1. Mbele


@@ -0,0 +1,46 @@
#!/bin/bash
set -a
if [ -z $DEV_DATA_DIR ]; then
export DEV_DATA_DIR=`mktemp -d`
else
mkdir -p $DEV_DATA_DIR
fi
if [ -z $DEV_CONFIG_RESET ]; then
if [ -f ${DEV_DATA_DIR}/env_reset ]; then
>&2 echo "importing existing configuration values from ${DEV_DATA_DIR}/env_reset"
. ${DEV_DATA_DIR}/env_reset
fi
fi
# Handle wallet
export WALLET_KEY_FILE=${WALLET_KEY_FILE:-`realpath ./keystore/UTC--2021-01-08T17-18-44.521011372Z--eb3907ecad74a0013c259d5874ae7f22dcbcc95c`}
if [ ! -f $WALLET_KEY_FILE ]; then
>&2 echo "wallet path '$WALLET_KEY_FILE' does not point to a file"
exit 1
fi
export DEV_ETH_ACCOUNT_CONTRACT_DEPLOYER=`eth-checksum $(cat $WALLET_KEY_FILE | jq -r .address)`
# Wallet dependent variable defaults
export DEV_ETH_ACCOUNT_RESERVE_MINTER=${DEV_ETH_ACCOUNT_RESERVE_MINTER:-$DEV_ETH_ACCOUNT_CONTRACT_DEPLOYER}
export DEV_ETH_ACCOUNT_ACCOUNTS_INDEX_WRITER=${DEV_ETH_ACCOUNT_ACCOUNTS_INDEX_WRITER:-$DEV_ETH_ACCOUNT_CONTRACT_DEPLOYER}
export CIC_TRUST_ADDRESS=${CIC_TRUST_ADDRESS:-$DEV_ETH_ACCOUNT_CONTRACT_DEPLOYER}
export CIC_DEFAULT_TOKEN_SYMBOL=$TOKEN_SYMBOL
export TOKEN_SINK_ADDRESS=${TOKEN_SINK_ADDRESS:-$DEV_ETH_ACCOUNT_CONTRACT_DEPLOYER}
# Legacy variable defaults
# Migration variable processing
confini-dump --schema-dir ./config --prefix export > ${DEV_DATA_DIR}/env_reset
echo "export CIC_TRUST_ADDRESS=$CIC_TRUST_ADDRESS
export CIC_DEFAULT_TOKEN_SYMBOL=$CIC_DEFAULT_TOKEN_SYMBOL
export WALLET_KEY_FILE=$WALLET_KEY_FILE
" >> ${DEV_DATA_DIR}/env_reset
cat ${DEV_DATA_DIR}/env_reset
set +a


@@ -0,0 +1,13 @@
[dev]
eth_account_contract_deployer =
eth_account_reserve_minter =
eth_account_accounts_index_writer =
reserve_amount = 10000000000000000000000000000000000
faucet_amount = 0
gas_amount = 100000000000000000000000
token_amount = 100000000000000000000000
eth_gas_price =
data_dir =
pip_extra_index_url =
eth_provider_host =
eth_provider_port =


@@ -1,44 +0,0 @@
# syntax = docker/dockerfile:1.2
FROM registry.gitlab.com/grassrootseconomics/cic-base-images:python-3.8.6-dev-55da5f4e
WORKDIR /root
RUN touch /etc/apt/sources.list.d/ethereum.list
RUN echo 'deb http://ppa.launchpad.net/ethereum/ethereum/ubuntu bionic main' > /etc/apt/sources.list.d/ethereum.list
RUN echo 'deb-src http://ppa.launchpad.net/ethereum/ethereum/ubuntu bionic main' >> /etc/apt/sources.list.d/ethereum.list
RUN cat /etc/apt/sources.list.d/ethereum.list
RUN apt-key adv --keyserver keyserver.ubuntu.com --recv-keys 2A518C819BE37D2C2031944D1C52189C923F6CA9
#RUN apt-get install solc
RUN mkdir -vp /usr/local/etc/cic
ENV CONFINI_DIR /usr/local/etc/cic/
COPY config_template/ /usr/local/etc/cic/
COPY requirements.txt .
COPY override_requirements.txt .
ARG pip_index_url=https://pypi.org/simple
ARG EXTRA_INDEX_URL="https://pip.grassrootseconomics.net:8433"
ARG EXTRA_PIP_ARGS=""
ARG GITLAB_PYTHON_REGISTRY="https://gitlab.com/api/v4/projects/27624814/packages/pypi/simple"
ARG pip_trusted_host=pypi.org
RUN pip install --index-url https://pypi.org/simple \
pip install --index-url https://pypi.org/simple \
--pre \
--force-reinstall \
--trusted-host $pip_trusted_host \
--extra-index-url $GITLAB_PYTHON_REGISTRY --extra-index-url $EXTRA_INDEX_URL $EXTRA_PIP_ARGS \
-r requirements.txt
RUN pip install --index-url https://pypi.org/simple \
--force-reinstall \
--pre \
--trusted-host $pip_trusted_host \
--extra-index-url $GITLAB_PYTHON_REGISTRY --extra-index-url $EXTRA_INDEX_URL $EXTRA_PIP_ARGS \
-r override_requirements.txt
COPY . .
RUN chmod +x *.sh


@@ -56,6 +56,7 @@ ETH_PROVIDER
ETH_ABI_DIR
SIGNER_SOCKET_PATH
SIGNER_SECRET
SIGNER_PROVIDER
CELERY_BROKER_URL
CELERY_RESULT_URL
META_PROVIDER


@@ -1,6 +1,6 @@
cic-eth[tools]==0.12.4a4
chainlib-eth>=0.0.9a7,<0.1.0
eth-erc20>=0.1.2a2,<0.2.0
cic-eth[tools]==0.12.4a8
chainlib-eth>=0.0.9a14,<0.1.0
eth-erc20>=0.1.2a3,<0.2.0
erc20-demurrage-token>=0.0.5a2,<0.1.0
eth-accounts-index>=0.1.2a2,<0.2.0
eth-address-index>=0.2.3a4,<0.3.0
@@ -8,3 +8,5 @@ cic-eth-registry>=0.6.1a2,<0.7.0
erc20-transfer-authorization>=0.3.5a2,<0.4.0
erc20-faucet>=0.3.2a2,<0.4.0
sarafu-faucet>=0.0.7a2,<0.1.0
confini>=0.4.2rc3,<1.0.0
crypto-dev-signer>=0.4.15a7,<=0.4.15


@@ -2,179 +2,122 @@
set -a
default_token=giftable_erc20_token
TOKEN_SYMBOL=${CIC_DEFAULT_TOKEN_SYMBOL}
TOKEN_NAME=${TOKEN_NAME}
TOKEN_TYPE=${TOKEN_TYPE:-$default_token}
cat <<EOF
external token settings:
token_type: $TOKEN_TYPE
token_symbol: $TOKEN_SYMBOL
token_name: $TOKEN_NAME
token_decimals: $TOKEN_DECIMALS
token_demurrage: $TOKEN_DEMURRAGE_LEVEL
token_redistribution_period: $TOKEN_REDISTRIBUTION_PERIOD
token_supply_limit: $TOKEN_SUPPLY_LIMIT
EOF
. ${DEV_DATA_DIR}/env_reset
CHAIN_SPEC=${CHAIN_SPEC:-$CIC_CHAIN_SPEC}
RPC_HTTP_PROVIDER=${RPC_HTTP_PROVIDER:-$ETH_PROVIDER}
DEV_ETH_ACCOUNT_RESERVE_MINTER=${DEV_ETH_ACCOUNT_RESERVE_MINTER:-$DEV_ETH_ACCOUNT_CONTRACT_DEPLOYER}
DEV_ETH_ACCOUNT_ACCOUNTS_INDEX_WRITER=${DEV_ETH_ACCOUNT_ACCOUNTS_INDEX_WRITER:-$DEV_ETH_ACCOUNT_CONTRACT_DEPLOYER}
DEV_RESERVE_AMOUNT=${DEV_ETH_RESERVE_AMOUNT:-10000000000000000000000000000000000}
DEV_FAUCET_AMOUNT=${DEV_FAUCET_AMOUNT:-0}
DEV_ETH_KEYSTORE_FILE=${DEV_ETH_KEYSTORE_FILE:-`realpath ./keystore/UTC--2021-01-08T17-18-44.521011372Z--eb3907ecad74a0013c259d5874ae7f22dcbcc95c`}
WAIT_FOR_TIMEOUT=${WAIT_FOR_TIMEOUT:-60}
set -e
DEV_ETH_ACCOUNT_CONTRACT_DEPLOYER=`eth-checksum $(cat $DEV_ETH_KEYSTORE_FILE | jq -r .address)`
if [ ! -z $DEV_ETH_GAS_PRICE ]; then
gas_price_arg="--gas-price $DEV_ETH_GAS_PRICE"
fee_price_arg="--fee-price $DEV_ETH_GAS_PRICE"
>&2 echo using static gas price $DEV_ETH_GAS_PRICE
fi
echo "environment:"
printenv
echo ""
echo "using wallet address '$DEV_ETH_ACCOUNT_CONTRACT_DEPLOYER' from keystore file $DEV_ETH_KEYSTORE_FILE"
# This is a grassroots team convention for building the Bancor contracts using the bancor protocol repository truffle setup
# Running this in docker-internal dev container (built from Docker folder in this repo) will write a
# source-able env file to CIC_DATA_DIR. Services dependent on these contracts can mount this file OR
# define these parameters at runtime
# pushd /usr/src
if [ -z $CIC_DATA_DIR ]; then
CIC_DATA_DIR=`mktemp -d`
fi
>&2 echo using data dir $CIC_DATA_DIR
init_level_file=${CIC_DATA_DIR}/.init
if [ ! -f ${CIC_DATA_DIR}/.init ]; then
echo "Creating .init file..."
mkdir -p $CIC_DATA_DIR
touch $CIC_DATA_DIR/.init
# touch $init_level_file
fi
echo -n 1 > $init_level_file
# Abort on any error (including if wait-for-it fails).
# Wait for the backend to be up, if we know where it is.
if [[ -n "${ETH_PROVIDER}" ]]; then
export CONFINI_DIR=$_CONFINI_DIR
unset CONFINI_DIR
if [ ! -z "$DEV_USE_DOCKER_WAIT_SCRIPT" ]; then
echo "waiting for ${ETH_PROVIDER}..."
./wait-for-it.sh "${ETH_PROVIDER_HOST}:${ETH_PROVIDER_PORT}"
fi
if [ "$TOKEN_TYPE" == "$default_token" ]; then
if [ -z "$TOKEN_SYMBOL" ]; then
>&2 echo token symbol not set, setting defaults for type $TOKEN_TYPE
TOKEN_SYMBOL="GFT"
TOKEN_NAME="Giftable Token"
elif [ -z "$TOKEN_NAME" ]; then
>&2 echo token name not set, setting same as symbol for type $TOKEN_TYPE
TOKEN_NAME=$TOKEN_SYMBOL
fi
>&2 echo deploying default token $TOKEN_TYPE
echo giftable-token-deploy $fee_price_arg -p $ETH_PROVIDER -y $DEV_ETH_KEYSTORE_FILE -i $CIC_CHAIN_SPEC -vv -s -ww --name "$TOKEN_NAME" --symbol $TOKEN_SYMBOL --decimals 6 -vv
DEV_RESERVE_ADDRESS=`giftable-token-deploy $fee_price_arg -p $ETH_PROVIDER -y $DEV_ETH_KEYSTORE_FILE -i $CIC_CHAIN_SPEC -vv -s -ww --name "$TOKEN_NAME" --symbol $TOKEN_SYMBOL --decimals 6 -vv`
elif [ "$TOKEN_TYPE" == "erc20_demurrage_token" ]; then
if [ -z "$TOKEN_SYMBOL" ]; then
>&2 echo token symbol not set, setting defaults for type $TOKEN_TYPE
TOKEN_SYMBOL="SARAFU"
TOKEN_NAME="Sarafu Token"
elif [ -z "$TOKEN_NAME" ]; then
>&2 echo token name not set, setting same as symbol for type $TOKEN_TYPE
TOKEN_NAME=$TOKEN_SYMBOL
fi
>&2 echo deploying token $TOKEN_TYPE
if [ -z $TOKEN_SINK_ADDRESS ]; then
if [ ! -z $TOKEN_REDISTRIBUTION_PERIOD ]; then
>&2 echo -e "\033[;93mtoken sink address not set, so redistribution will be BURNED\033[;39m"
fi
fi
DEV_RESERVE_ADDRESS=`erc20-demurrage-token-deploy $fee_price_arg -p $ETH_PROVIDER -y $DEV_ETH_KEYSTORE_FILE -i $CIC_CHAIN_SPEC --name "$TOKEN_NAME" --symbol $TOKEN_SYMBOL -vv -ww -s`
else
>&2 echo unknown token type $TOKEN_TYPE
exit 1
fi
echo "giftable-token-gift $fee_price_arg -p $ETH_PROVIDER -y $DEV_ETH_KEYSTORE_FILE -i $CIC_CHAIN_SPEC -vv -w -e $DEV_RESERVE_ADDRESS $DEV_RESERVE_AMOUNT"
giftable-token-gift $fee_price_arg -p $ETH_PROVIDER -y $DEV_ETH_KEYSTORE_FILE -i $CIC_CHAIN_SPEC -u -vv -s -w -e $DEV_RESERVE_ADDRESS $DEV_RESERVE_AMOUNT
>&2 echo "deploy account index contract"
DEV_ACCOUNT_INDEX_ADDRESS=`eth-accounts-index-deploy $fee_price_arg -i $CIC_CHAIN_SPEC -p $ETH_PROVIDER -y $DEV_ETH_KEYSTORE_FILE -vv -s -u -w`
>&2 echo "add deployer address as account index writer"
eth-accounts-index-writer $fee_price_arg -s -u -y $DEV_ETH_KEYSTORE_FILE -i $CIC_CHAIN_SPEC -p $ETH_PROVIDER -e $DEV_ACCOUNT_INDEX_ADDRESS -ww -vv $debug $DEV_ETH_ACCOUNT_CONTRACT_DEPLOYER
>&2 echo "deploy contract registry contract"
CIC_REGISTRY_ADDRESS=`eth-contract-registry-deploy $fee_price_arg -i $CIC_CHAIN_SPEC -y $DEV_ETH_KEYSTORE_FILE --identifier AccountRegistry --identifier TokenRegistry --identifier AddressDeclarator --identifier Faucet --identifier TransferAuthorization --identifier ContractRegistry -p $ETH_PROVIDER -vv -s -u -w`
eth-contract-registry-set $fee_price_arg -s -u -w -y $DEV_ETH_KEYSTORE_FILE -e $CIC_REGISTRY_ADDRESS -i $CIC_CHAIN_SPEC -p $ETH_PROVIDER -vv --identifier ContractRegistry $CIC_REGISTRY_ADDRESS
eth-contract-registry-set $fee_price_arg -s -u -w -y $DEV_ETH_KEYSTORE_FILE -e $CIC_REGISTRY_ADDRESS -i $CIC_CHAIN_SPEC -p $ETH_PROVIDER -vv --identifier AccountRegistry $DEV_ACCOUNT_INDEX_ADDRESS
# Deploy address declarator registry
>&2 echo "deploy address declarator contract"
declarator_description=0x546869732069732074686520434943206e6574776f726b000000000000000000
DEV_DECLARATOR_ADDRESS=`eth-address-declarator-deploy -s -u -y $DEV_ETH_KEYSTORE_FILE -i $CIC_CHAIN_SPEC -p $ETH_PROVIDER -w -vv $declarator_description`
eth-contract-registry-set $fee_price_arg -s -u -w -y $DEV_ETH_KEYSTORE_FILE -e $CIC_REGISTRY_ADDRESS -i $CIC_CHAIN_SPEC -p $ETH_PROVIDER -vv --identifier AddressDeclarator $DEV_DECLARATOR_ADDRESS
# Deploy transfer authorization contact
>&2 echo "deploy transfer auth contract"
DEV_TRANSFER_AUTHORIZATION_ADDRESS=`erc20-transfer-auth-deploy $gas_price_arg -y $DEV_ETH_KEYSTORE_FILE -i $CIC_CHAIN_SPEC -p $ETH_PROVIDER -w -vv`
eth-contract-registry-set $fee_price_arg -s -u -w -y $DEV_ETH_KEYSTORE_FILE -e $CIC_REGISTRY_ADDRESS -i $CIC_CHAIN_SPEC -p $ETH_PROVIDER -vv --identifier TransferAuthorization $DEV_TRANSFER_AUTHORIZATION_ADDRESS
# Deploy token index contract
>&2 echo "deploy token index contract"
DEV_TOKEN_INDEX_ADDRESS=`eth-token-index-deploy -s -u $fee_price_arg -y $DEV_ETH_KEYSTORE_FILE -i $CIC_CHAIN_SPEC -p $ETH_PROVIDER -w -vv`
eth-contract-registry-set $fee_price_arg -s -u -w -y $DEV_ETH_KEYSTORE_FILE -e $CIC_REGISTRY_ADDRESS -i $CIC_CHAIN_SPEC -p $ETH_PROVIDER -vv --identifier TokenRegistry $DEV_TOKEN_INDEX_ADDRESS
>&2 echo "add reserve token to token index"
eth-token-index-add $fee_price_arg -s -u -w -y $DEV_ETH_KEYSTORE_FILE -i $CIC_CHAIN_SPEC -p $ETH_PROVIDER -vv -e $DEV_TOKEN_INDEX_ADDRESS $DEV_RESERVE_ADDRESS
# Sarafu faucet contract
>&2 echo "deploy token faucet contract"
DEV_FAUCET_ADDRESS=`sarafu-faucet-deploy $fee_price_arg -y $DEV_ETH_KEYSTORE_FILE -i $CIC_CHAIN_SPEC -p $ETH_PROVIDER -w -vv --account-index-address $DEV_ACCOUNT_INDEX_ADDRESS $DEV_RESERVE_ADDRESS -s`
>&2 echo "set token faucet amount"
sarafu-faucet-set $fee_price_arg -y $DEV_ETH_KEYSTORE_FILE -i $CIC_CHAIN_SPEC -p $ETH_PROVIDER -e $DEV_FAUCET_ADDRESS -vv -s --fee-limit 100000 $DEV_FAUCET_AMOUNT
>&2 echo "register faucet in registry"
eth-contract-registry-set -s -u $fee_price_arg -w -y $DEV_ETH_KEYSTORE_FILE -e $CIC_REGISTRY_ADDRESS -i $CIC_CHAIN_SPEC -p $ETH_PROVIDER -vv --identifier Faucet $DEV_FAUCET_ADDRESS
>&2 echo "set faucet as token minter"
giftable-token-minter -s -u $fee_price_arg -w -y $DEV_ETH_KEYSTORE_FILE -e $DEV_RESERVE_ADDRESS -i $CIC_CHAIN_SPEC -p $ETH_PROVIDER -vv $DEV_FAUCET_ADDRESS
export CONFINI_DIR=$_CONFINI_DIR
else
echo "\$ETH_PROVIDER not set!"
if [ -z "${RPC_PROVIDER}" ]; then
echo "\$RPC_PROVIDER not set!"
exit 1
fi
mkdir -p $CIC_DATA_DIR
>&2 echo using data dir $CIC_DATA_DIR for environment variable dump
unset CONFINI_DIR
# this is consumed in downstream services to set environment variables
cat << EOF > $CIC_DATA_DIR/.env
export CIC_REGISTRY_ADDRESS=$CIC_REGISTRY_ADDRESS
export CIC_TRUST_ADDRESS=$DEV_ETH_ACCOUNT_CONTRACT_DEPLOYER
export CIC_DECLARATOR_ADDRESS=$CIC_DECLARATOR_ADDRESS
EOF
if [ ! -z "$DEV_USE_DOCKER_WAIT_SCRIPT" ]; then
IFS=: read -a p <<< "$RPC_PROVIDER"
rpc_provider_port=${p[2]%%/*}
rpc_provider_host=${p[1]:2}
echo "waiting for provider host $rpc_provider_host port $rpc_provider_port..."
./wait-for-it.sh "$rpc_provider_host:$rpc_provider_port" -t $WAIT_FOR_TIMEOUT
fi
cat ./envlist | bash from_env.sh > $CIC_DATA_DIR/.env_all
cat ./envlist
# popd
if [ "$TOKEN_TYPE" == "giftable_erc20_token" ]; then
if [ -z "$TOKEN_SYMBOL" ]; then
>&2 echo token symbol not set, setting defaults for type $TOKEN_TYPE
TOKEN_SYMBOL="GFT"
TOKEN_NAME="Giftable Token"
elif [ -z "$TOKEN_NAME" ]; then
>&2 echo token name not set, setting same as symbol for type $TOKEN_TYPE
TOKEN_NAME=$TOKEN_SYMBOL
fi
>&2 echo deploying default token $TOKEN_TYPE
echo giftable-token-deploy $fee_price_arg -p $RPC_PROVIDER -y $WALLET_KEY_FILE -i $CIC_CHAIN_SPEC -vv -s -ww --name "$TOKEN_NAME" --symbol $TOKEN_SYMBOL --decimals 6 -vv
DEV_RESERVE_ADDRESS=`giftable-token-deploy $fee_price_arg -p $RPC_PROVIDER -y $WALLET_KEY_FILE -i $CIC_CHAIN_SPEC -vv -s -ww --name "$TOKEN_NAME" --symbol $TOKEN_SYMBOL --decimals 6 -vv`
elif [ "$TOKEN_TYPE" == "erc20_demurrage_token" ]; then
if [ -z "$TOKEN_SYMBOL" ]; then
>&2 echo token symbol not set, setting defaults for type $TOKEN_TYPE
TOKEN_SYMBOL="DET"
TOKEN_NAME="Demurrage Token"
elif [ -z "$TOKEN_NAME" ]; then
>&2 echo token name not set, setting same as symbol for type $TOKEN_TYPE
TOKEN_NAME=$TOKEN_SYMBOL
fi
>&2 echo deploying token $TOKEN_TYPE
if [ -z $TOKEN_SINK_ADDRESS ]; then
if [ ! -z $TOKEN_REDISTRIBUTION_PERIOD ]; then
>&2 echo -e "\033[;93mtoken sink address not set, so redistribution will be BURNED\033[;39m"
fi
fi
DEV_RESERVE_ADDRESS=`erc20-demurrage-token-deploy $fee_price_arg -p $RPC_PROVIDER -y $WALLET_KEY_FILE -i $CIC_CHAIN_SPEC --name "$TOKEN_NAME" --symbol $TOKEN_SYMBOL -vv -ww -s`
else
>&2 echo unknown token type $TOKEN_TYPE
exit 1
fi
echo "giftable-token-gift $fee_price_arg -p $RPC_PROVIDER -y $WALLET_KEY_FILE -i $CIC_CHAIN_SPEC -vv -w -e $DEV_RESERVE_ADDRESS $DEV_RESERVE_AMOUNT"
giftable-token-gift $fee_price_arg -p $RPC_PROVIDER -y $WALLET_KEY_FILE -i $CIC_CHAIN_SPEC -u -vv -s -w -e $DEV_RESERVE_ADDRESS $DEV_RESERVE_AMOUNT
>&2 echo "deploy account index contract"
DEV_ACCOUNT_INDEX_ADDRESS=`eth-accounts-index-deploy $fee_price_arg -i $CIC_CHAIN_SPEC -p $RPC_PROVIDER -y $WALLET_KEY_FILE -vv -s -u -w`
>&2 echo "add deployer address as account index writer"
eth-accounts-index-writer $fee_price_arg -s -u -y $WALLET_KEY_FILE -i $CIC_CHAIN_SPEC -p $RPC_PROVIDER -e $DEV_ACCOUNT_INDEX_ADDRESS -ww -vv $debug $DEV_ETH_ACCOUNT_CONTRACT_DEPLOYER
>&2 echo "deploy contract registry contract"
CIC_REGISTRY_ADDRESS=`eth-contract-registry-deploy $fee_price_arg -i $CIC_CHAIN_SPEC -y $WALLET_KEY_FILE --identifier AccountRegistry --identifier TokenRegistry --identifier AddressDeclarator --identifier Faucet --identifier TransferAuthorization --identifier ContractRegistry -p $RPC_PROVIDER -vv -s -u -w`
eth-contract-registry-set $fee_price_arg -s -u -w -y $WALLET_KEY_FILE -e $CIC_REGISTRY_ADDRESS -i $CIC_CHAIN_SPEC -p $RPC_PROVIDER -vv --identifier ContractRegistry $CIC_REGISTRY_ADDRESS
eth-contract-registry-set $fee_price_arg -s -u -w -y $WALLET_KEY_FILE -e $CIC_REGISTRY_ADDRESS -i $CIC_CHAIN_SPEC -p $RPC_PROVIDER -vv --identifier AccountRegistry $DEV_ACCOUNT_INDEX_ADDRESS
# Deploy address declarator registry
>&2 echo "deploy address declarator contract"
declarator_description=0x546869732069732074686520434943206e6574776f726b000000000000000000
DEV_DECLARATOR_ADDRESS=`eth-address-declarator-deploy -s -u -y $WALLET_KEY_FILE -i $CIC_CHAIN_SPEC -p $RPC_PROVIDER -w -vv $declarator_description`
eth-contract-registry-set $fee_price_arg -s -u -w -y $WALLET_KEY_FILE -e $CIC_REGISTRY_ADDRESS -i $CIC_CHAIN_SPEC -p $RPC_PROVIDER -vv --identifier AddressDeclarator $DEV_DECLARATOR_ADDRESS
# Deploy transfer authorization contact
>&2 echo "deploy transfer auth contract"
DEV_TRANSFER_AUTHORIZATION_ADDRESS=`erc20-transfer-auth-deploy $gas_price_arg -y $WALLET_KEY_FILE -i $CIC_CHAIN_SPEC -p $RPC_PROVIDER -w -vv`
eth-contract-registry-set $fee_price_arg -s -u -w -y $WALLET_KEY_FILE -e $CIC_REGISTRY_ADDRESS -i $CIC_CHAIN_SPEC -p $RPC_PROVIDER -vv --identifier TransferAuthorization $DEV_TRANSFER_AUTHORIZATION_ADDRESS
# Deploy token index contract
>&2 echo "deploy token index contract"
DEV_TOKEN_INDEX_ADDRESS=`eth-token-index-deploy -s -u $fee_price_arg -y $WALLET_KEY_FILE -i $CIC_CHAIN_SPEC -p $RPC_PROVIDER -w -vv`
eth-contract-registry-set $fee_price_arg -s -u -w -y $WALLET_KEY_FILE -e $CIC_REGISTRY_ADDRESS -i $CIC_CHAIN_SPEC -p $RPC_PROVIDER -vv --identifier TokenRegistry $DEV_TOKEN_INDEX_ADDRESS
>&2 echo "add reserve token to token index"
eth-token-index-add $fee_price_arg -s -u -w -y $WALLET_KEY_FILE -i $CIC_CHAIN_SPEC -p $RPC_PROVIDER -vv -e $DEV_TOKEN_INDEX_ADDRESS $DEV_RESERVE_ADDRESS
# Sarafu faucet contract
>&2 echo "deploy token faucet contract"
DEV_FAUCET_ADDRESS=`sarafu-faucet-deploy $fee_price_arg -y $WALLET_KEY_FILE -i $CIC_CHAIN_SPEC -p $RPC_PROVIDER -w -vv --account-index-address $DEV_ACCOUNT_INDEX_ADDRESS $DEV_RESERVE_ADDRESS -s`
>&2 echo "set token faucet amount"
sarafu-faucet-set $fee_price_arg -w -y $WALLET_KEY_FILE -i $CIC_CHAIN_SPEC -p $RPC_PROVIDER -e $DEV_FAUCET_ADDRESS -vv -s --fee-limit 100000 $DEV_FAUCET_AMOUNT
>&2 echo "register faucet in registry"
eth-contract-registry-set -s -u $fee_price_arg -w -y $WALLET_KEY_FILE -e $CIC_REGISTRY_ADDRESS -i $CIC_CHAIN_SPEC -p $RPC_PROVIDER -vv --identifier Faucet $DEV_FAUCET_ADDRESS
>&2 echo "set faucet as token minter"
giftable-token-minter -s -u $fee_price_arg -w -y $WALLET_KEY_FILE -e $DEV_RESERVE_ADDRESS -i $CIC_CHAIN_SPEC -p $RPC_PROVIDER -vv $DEV_FAUCET_ADDRESS
#echo "export CIC_DEFAULT_TOKEN_SYMBOL=$TOKEN_SYMBOL" >> ${DEV_DATA_DIR}/env_reset
export CIC_DEFAULT_TOKEN_SYMBOL=$TOKEN_SYMBOL
echo "Writing env_reset file ..."
echo "export CIC_REGISTRY_ADDRESS=$CIC_REGISTRY_ADDRESS
export CIC_DEFAULT_TOKEN_SYMBOL=$CIC_DEFAULT_TOKEN_SYMBOL
export TOKEN_NAME=$TOKEN_NAME
" >> "${DEV_DATA_DIR}"/env_reset
set +a
set +e
echo -n 2 > $init_level_file
exec "$@"


@@ -1,5 +1,13 @@
#! /bin/bash
>&2 echo -e "\033[;96mRUNNING\033[;39m configurations"
./config.sh
if [ $? -ne "0" ]; then
>&2 echo -e "\033[;31mFAILED\033[;39m configurations"
exit 1;
fi
>&2 echo -e "\033[;32mSUCCEEDED\033[;39m configurations"
if [[ $((RUN_MASK & 1)) -eq 1 ]]
then
>&2 echo -e "\033[;96mRUNNING\033[;39m RUN_MASK 1 - contract deployment"
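The `$((RUN_MASK & 1))` test above gates the contract-deployment stage on bit 0 of a run mask. A minimal Python sketch of the same bitmask gating; only the bit-0 stage name comes from the script, the other stage names are illustrative:

```python
# Each pipeline stage is gated on one bit of a run mask, mirroring
# the shell test `[[ $((RUN_MASK & 1)) -eq 1 ]]` above.
STAGES = {
    1: "contract deployment",   # bit 0, enabled when RUN_MASK has 1 set
    2: "services",              # bit 1 (illustrative)
    4: "data seeding",          # bit 2 (illustrative)
}

def enabled_stages(run_mask: int) -> list:
    """Return the names of the stages whose bit is set in run_mask."""
    return [name for bit, name in STAGES.items() if run_mask & bit]

# RUN_MASK=3 enables the first two stages, RUN_MASK=1 only the first.
print(enabled_stages(3))
```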


@@ -1,68 +1,40 @@
#!/bin/bash
# defaults
#initlevel=`cat ${CIC_DATA_DIR}/.init`
#echo $initlevel
#if [ $initlevel -lt 2 ]; then
# >&2 echo "initlevel too low $initlevel"
# exit 1
#fi
source ${CIC_DATA_DIR}/.env
source ${CIC_DATA_DIR}/.env_all
DEV_PIP_EXTRA_INDEX_URL=${DEV_PIP_EXTRA_INDEX_URL:-https://pip.grassrootseconomics.net:8433}
DEV_DATABASE_NAME_CIC_ETH=${DEV_DATABASE_NAME_CIC_ETH:-"cic-eth"}
CIC_DATA_DIR=${CIC_DATA_DIR:-/tmp/cic}
ETH_PASSPHRASE=''
CIC_DEFAULT_TOKEN_SYMBOL=${CIC_DEFAULT_TOKEN_SYMBOL:-GFT}
TOKEN_SYMBOL=$CIC_DEFAULT_TOKEN_SYMBOL
CHAIN_SPEC=${CHAIN_SPEC:-$CIC_CHAIN_SPEC}
RPC_HTTP_PROVIDER=${RPC_HTTP_PROVIDER:-$ETH_PROVIDER}
source ${DEV_DATA_DIR}/env_reset
cat ${DEV_DATA_DIR}/env_reset
# Debug flag
DEV_ETH_ACCOUNT_CONTRACT_DEPLOYER=0xEb3907eCad74a0013c259D5874AE7f22DcBcC95C
keystore_file=./keystore/UTC--2021-01-08T17-18-44.521011372Z--eb3907ecad74a0013c259d5874ae7f22dcbcc95c
debug='-vv'
gas_amount=100000000000000000000000
token_amount=${gas_amount}
env_out_file=${CIC_DATA_DIR}/.env_seed
init_level_file=${CIC_DATA_DIR}/.init
empty_config_dir=$CONFINI_DIR/empty
truncate $env_out_file -s 0
set -e
set -a
#pip install --extra-index-url $DEV_PIP_EXTRA_INDEX_URL eth-address-index==0.1.1a7
export CONFINI_DIR=$_CONFINI_DIR
unset CONFINI_DIR
# get required addresses from registries
DEV_TOKEN_INDEX_ADDRESS=`eth-contract-registry-list -u -i $CHAIN_SPEC -p $ETH_PROVIDER -e $CIC_REGISTRY_ADDRESS -vv --raw TokenRegistry`
DEV_ACCOUNT_INDEX_ADDRESS=`eth-contract-registry-list -u -i $CHAIN_SPEC -p $ETH_PROVIDER -e $CIC_REGISTRY_ADDRESS -vv --raw AccountRegistry`
DEV_RESERVE_ADDRESS=`eth-token-index-list -i $CHAIN_SPEC -u -p $ETH_PROVIDER -e $DEV_TOKEN_INDEX_ADDRESS -vv --raw $CIC_DEFAULT_TOKEN_SYMBOL`
cat <<EOF
Token registry: $DEV_TOKEN_INDEX_ADDRESS
Account registry: $DEV_ACCOUNT_INDEX_ADDRESS
Reserve address: $DEV_RESERVE_ADDRESS ($CIC_DEFAULT_TOKEN_SYMBOL)
EOF
token_index_address=`eth-contract-registry-list -u -i $CHAIN_SPEC -p $RPC_PROVIDER -e $CIC_REGISTRY_ADDRESS -vv --raw TokenRegistry`
account_index_address=`eth-contract-registry-list -u -i $CHAIN_SPEC -p $RPC_PROVIDER -e $CIC_REGISTRY_ADDRESS -vv --raw AccountRegistry`
reserve_address=`eth-token-index-list -i $CHAIN_SPEC -u -p $RPC_PROVIDER -e $token_index_address -vv --raw $CIC_DEFAULT_TOKEN_SYMBOL`
>&2 echo "Token registry: $token_index_address"
>&2 echo "Account registry: $account_index_address"
>&2 echo "Reserve address: $reserve_address ($TOKEN_SYMBOL)"
>&2 echo "create account for gas gifter"
old_gas_provider=$DEV_ETH_ACCOUNT_GAS_PROVIDER
DEV_ETH_ACCOUNT_GAS_GIFTER=`CONFINI_DIR=$empty_config_dir cic-eth-create --redis-timeout 120 $debug --redis-host $REDIS_HOST --redis-host-callback=$REDIS_HOST --redis-port-callback=$REDIS_PORT --no-register`
echo DEV_ETH_ACCOUNT_GAS_GIFTER=$DEV_ETH_ACCOUNT_GAS_GIFTER >> $env_out_file
#DEV_ETH_ACCOUNT_GAS_GIFTER=`CONFINI_DIR=$empty_config_dir cic-eth-create --redis-timeout 120 $debug --redis-host $REDIS_HOST --redis-host-callback=$REDIS_HOST --redis-port-callback=$REDIS_PORT --no-register`
DEV_ETH_ACCOUNT_GAS_GIFTER=`cic-eth-create --redis-timeout 120 $debug --redis-host $REDIS_HOST --redis-host-callback=$REDIS_HOST --redis-port-callback=$REDIS_PORT --no-register`
cic-eth-tag -i $CHAIN_SPEC GAS_GIFTER $DEV_ETH_ACCOUNT_GAS_GIFTER
>&2 echo "create account for sarafu gifter"
DEV_ETH_ACCOUNT_SARAFU_GIFTER=`CONFINI_DIR=$empty_config_dir cic-eth-create $debug --redis-host $REDIS_HOST --redis-host-callback=$REDIS_HOST --redis-port-callback=$REDIS_PORT --no-register`
echo DEV_ETH_ACCOUNT_SARAFU_GIFTER=$DEV_ETH_ACCOUNT_SARAFU_GIFTER >> $env_out_file
cic-eth-tag -i $CHAIN_SPEC SARAFU_GIFTER $DEV_ETH_ACCOUNT_SARAFU_GIFTER
>&2 echo "create account for approval escrow owner"
DEV_ETH_ACCOUNT_TRANSFER_AUTHORIZATION_OWNER=`CONFINI_DIR=$empty_config_dir cic-eth-create $debug --redis-host $REDIS_HOST --redis-host-callback=$REDIS_HOST --redis-port-callback=$REDIS_PORT --no-register`
echo DEV_ETH_ACCOUNT_TRANSFER_AUTHORIZATION_OWNER=$DEV_ETH_ACCOUNT_TRANSFER_AUTHORIZATION_OWNER >> $env_out_file
cic-eth-tag -i $CHAIN_SPEC TRANSFER_AUTHORIZATION_OWNER $DEV_ETH_ACCOUNT_TRANSFER_AUTHORIZATION_OWNER
#>&2 echo "create account for faucet owner"
@@ -72,50 +44,45 @@ cic-eth-tag -i $CHAIN_SPEC TRANSFER_AUTHORIZATION_OWNER $DEV_ETH_ACCOUNT_TRANSFE
>&2 echo "create account for accounts index writer"
DEV_ETH_ACCOUNT_ACCOUNT_REGISTRY_WRITER=`CONFINI_DIR=$empty_config_dir cic-eth-create $debug --redis-host $REDIS_HOST --redis-host-callback=$REDIS_HOST --redis-port-callback=$REDIS_PORT --no-register`
echo DEV_ETH_ACCOUNT_ACCOUNT_REGISTRY_WRITER=$DEV_ETH_ACCOUNT_ACCOUNT_REGISTRY_WRITER >> $env_out_file
cic-eth-tag -i $CHAIN_SPEC ACCOUNT_REGISTRY_WRITER $DEV_ETH_ACCOUNT_ACCOUNT_REGISTRY_WRITER
>&2 echo "add acccounts index writer account as writer on contract"
eth-accounts-index-writer -s -u -y $keystore_file -i $CHAIN_SPEC -p $ETH_PROVIDER -e $DEV_ACCOUNT_INDEX_ADDRESS -ww $debug $DEV_ETH_ACCOUNT_ACCOUNT_REGISTRY_WRITER
eth-accounts-index-writer -s -u -y $WALLET_KEY_FILE -i $CHAIN_SPEC -p $RPC_PROVIDER -e $account_index_address -ww $debug $DEV_ETH_ACCOUNT_ACCOUNT_REGISTRY_WRITER
# Transfer gas to custodial gas provider address
_CONFINI_DIR=$CONFINI_DIR
unset CONFINI_DIR
>&2 echo gift gas to gas gifter
>&2 eth-gas -s -u -y $keystore_file -i $CHAIN_SPEC -p $ETH_PROVIDER -w $debug -a $DEV_ETH_ACCOUNT_GAS_GIFTER $gas_amount
>&2 eth-gas -s -u -y $WALLET_KEY_FILE -i $CHAIN_SPEC -p $RPC_PROVIDER -w $debug -a $DEV_ETH_ACCOUNT_GAS_GIFTER $DEV_GAS_AMOUNT
>&2 echo gift gas to sarafu token owner
>&2 eth-gas -s -u -y $keystore_file -i $CHAIN_SPEC -p $ETH_PROVIDER -w $debug -a $DEV_ETH_ACCOUNT_SARAFU_GIFTER $gas_amount
>&2 eth-gas -s -u -y $WALLET_KEY_FILE -i $CHAIN_SPEC -p $RPC_PROVIDER -w $debug -a $DEV_ETH_ACCOUNT_SARAFU_GIFTER $DEV_GAS_AMOUNT
>&2 echo gift gas to account index owner
>&2 eth-gas -s -u -y $keystore_file -i $CHAIN_SPEC -p $ETH_PROVIDER -w $debug -a $DEV_ETH_ACCOUNT_ACCOUNT_REGISTRY_WRITER $gas_amount
>&2 eth-gas -s -u -y $WALLET_KEY_FILE -i $CHAIN_SPEC -p $RPC_PROVIDER -w $debug -a $DEV_ETH_ACCOUNT_ACCOUNT_REGISTRY_WRITER $DEV_GAS_AMOUNT
# Send token to token creator
>&2 echo "gift tokens to sarafu owner"
echo "giftable-token-gift -s -u -y $keystore_file -i $CHAIN_SPEC -p $ETH_PROVIDER -e $DEV_RESERVE_ADDRESS -a $DEV_ETH_ACCOUNT_SARAFU_GIFTER -w $debug $token_amount"
>&2 giftable-token-gift -s -u -y $keystore_file -i $CHAIN_SPEC -p $ETH_PROVIDER -e $DEV_RESERVE_ADDRESS -a $DEV_ETH_ACCOUNT_SARAFU_GIFTER -w $debug $token_amount
echo "giftable-token-gift -s -u -y $WALLET_KEY_FILE -i $CHAIN_SPEC -p $RPC_PROVIDER -e $reserve_address -a $DEV_ETH_ACCOUNT_SARAFU_GIFTER -w $debug $DEV_TOKEN_AMOUNT"
>&2 giftable-token-gift -s -u -y $WALLET_KEY_FILE -i $CHAIN_SPEC -p $RPC_PROVIDER -e $reserve_address -a $DEV_ETH_ACCOUNT_SARAFU_GIFTER -w $debug $DEV_TOKEN_AMOUNT
# Send token to token gifter
>&2 echo "gift tokens to keystore address"
>&2 giftable-token-gift -s -u -y $keystore_file -i $CHAIN_SPEC -p $ETH_PROVIDER -e $DEV_RESERVE_ADDRESS -a $DEV_ETH_ACCOUNT_CONTRACT_DEPLOYER -w $debug $token_amount
>&2 giftable-token-gift -s -u -y $WALLET_KEY_FILE -i $CHAIN_SPEC -p $RPC_PROVIDER -e $reserve_address -a $DEV_ETH_ACCOUNT_CONTRACT_DEPLOYER -w $debug $DEV_TOKEN_AMOUNT
>&2 echo "set sarafu token to reserve token (temporarily while bancor contracts are not connected)"
echo DEV_ETH_SARAFU_TOKEN_ADDRESS=$DEV_ETH_RESERVE_ADDRESS >> $env_out_file
export DEV_ETH_SARAFU_TOKEN_ADDRESS=$DEV_ETH_RESERVE_ADDRESS
# Transfer tokens to gifter address
>&2 echo "transfer tokens to token gifter address"
>&2 erc20-transfer -s -u -y $keystore_file -i $CHAIN_SPEC -p $ETH_PROVIDER --fee-limit 100000 -e $DEV_RESERVE_ADDRESS -w $debug -a $DEV_ETH_ACCOUNT_SARAFU_GIFTER ${token_amount:0:-1}
>&2 erc20-transfer -s -u -y $WALLET_KEY_FILE -i $CHAIN_SPEC -p $RPC_PROVIDER --fee-limit 100000 -e $reserve_address -w $debug -a $DEV_ETH_ACCOUNT_SARAFU_GIFTER ${DEV_TOKEN_AMOUNT:0:-1}
#echo -n 0 > $init_level_file
#CONFINI_DIR=$_CONFINI_DIR
# Remove the SEND (8), QUEUE (16) and INIT (2) locks (or'ed), set by default at migration
cic-eth-ctl -i $CHAIN_SPEC unlock INIT
cic-eth-ctl -i $CHAIN_SPEC unlock SEND
cic-eth-ctl -i $CHAIN_SPEC unlock QUEUE
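The comment above gives the lock bits as SEND (8), QUEUE (16) and INIT (2), or'ed into one value at migration time; each `cic-eth-ctl ... unlock` call clears one of them. A small sketch of that flag arithmetic, with the values taken from the comment (the enum itself is illustrative, not cic-eth's actual representation):

```python
from enum import IntFlag

class Lock(IntFlag):
    # bit values as given in the comment above
    INIT = 2
    SEND = 8
    QUEUE = 16

locks = Lock.INIT | Lock.SEND | Lock.QUEUE   # state set at migration: 26

def unlock(state: Lock, flag: Lock) -> Lock:
    """Clear one lock bit, as each `unlock` invocation does."""
    return state & ~flag

locks = unlock(locks, Lock.INIT)
locks = unlock(locks, Lock.SEND)
locks = unlock(locks, Lock.QUEUE)
assert locks == 0  # all locks cleared: sending and queueing are allowed
```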
export CONFINI_DIR=$_CONFINI_DIR
#confini-dump --schema-module chainlib.eth.data.config --schema-module cic_eth.data.config --schema-dir ./config
set +a
set +e


@@ -2,7 +2,7 @@
.cache
.dot
**/doc
**/node_modules
node_modules/
**/venv
**/.venv


@@ -7,6 +7,7 @@ import json
# external imports
import celery
import confini
import celery.utils.graph
# local imports
from cic_eth.api import Api
@@ -35,7 +36,7 @@ class Fmtr(celery.utils.graph.GraphFormatter):
def main():
api = Api(
config.get('CIC_CHAIN_SPEC'),
config.get('CHAIN_SPEC'),
queue='cic-eth',
#callback_param='{}:{}:{}:{}'.format(args.redis_host_callback, args.redis_port_callback, redis_db, redis_channel),
#callback_task='cic_eth.callbacks.redis.redis',


@@ -1,64 +1,61 @@
# standard imports
import argparse
import logging
import sys
import os
import sys
# external imports
import celery
import confini
import redis
from chainlib.chain import ChainSpec
from chainlib.eth.address import to_checksum_address
from chainlib.eth.connection import EthHTTPConnection
from confini import Config
from crypto_dev_signer.eth.signer import ReferenceSigner as EIP155Signer
from crypto_dev_signer.keystore.dict import DictKeystore
# local imports
from import_task import ImportTask, MetadataTask
from import_util import BalanceProcessor, get_celery_worker_status
from import_task import ImportTask, MetadataTask
logging.basicConfig(level=logging.WARNING)
default_config_dir = '/usr/local/etc/data-seeding/'
logg = logging.getLogger()
config_dir = './config'
arg_parser = argparse.ArgumentParser(description='Daemon worker that handles data seeding tasks.')
arg_parser.add_argument('-c', type=str, default=default_config_dir, help='config root to use.')
arg_parser.add_argument('--env-prefix',
default=os.environ.get('CONFINI_ENV_PREFIX'),
dest='env_prefix',
type=str,
help='environment prefix for variables to overwrite configuration.')
arg_parser.add_argument('--head', action='store_true', help='start at current block height (overrides --offset)')
arg_parser.add_argument('-i', '--chain-spec', type=str, dest='i', help='chain spec')
arg_parser.add_argument('--include-balances', dest='include_balances', help='include opening balance transactions',
action='store_true')
arg_parser.add_argument('--meta-host', dest='meta_host', type=str, help='metadata server host')
arg_parser.add_argument('--meta-port', dest='meta_port', type=int, help='metadata server port')
arg_parser.add_argument('-p', '--provider', dest='p', type=str, help='chain rpc provider address')
arg_parser.add_argument('-q', type=str, default='cic-import-ussd', help='celery queue to submit data seeding tasks to.')
arg_parser.add_argument('-r', '--registry-address', type=str, dest='r', help='CIC Registry address')
arg_parser.add_argument('--redis-db', dest='redis_db', type=int, help='redis db to use for task submission and callback')
arg_parser.add_argument('--redis-host', dest='redis_host', type=str, help='redis host to use for task submission')
arg_parser.add_argument('--redis-port', dest='redis_port', type=int, help='redis port to use for task submission')
arg_parser.add_argument('--token-symbol', default='GFT', type=str, dest='token_symbol',
help='Token symbol to use for transactions')
arg_parser.add_argument('-v', help='be verbose', action='store_true')
arg_parser.add_argument('-vv', help='be more verbose', action='store_true')
arg_parser.add_argument('-y', '--key-file', dest='y', type=str, help='Ethereum keystore file to use for signing')
arg_parser.add_argument('--offset', type=int, default=0, help='block offset to start syncer from')
arg_parser.add_argument('--old-chain-spec', type=str, dest='old_chain_spec', default='evm:oldchain:1',
help='chain spec')
arg_parser.add_argument('import_dir', default='out', type=str, help='user export directory')
args = arg_parser.parse_args()
argparser = argparse.ArgumentParser(description='daemon that monitors transactions in new blocks')
argparser.add_argument('-p', '--provider', dest='p', type=str, help='chain rpc provider address')
argparser.add_argument('-y', '--key-file', dest='y', type=str, help='Ethereum keystore file to use for signing')
argparser.add_argument('-c', type=str, default=config_dir, help='config root to use')
argparser.add_argument('--old-chain-spec', type=str, dest='old_chain_spec', default='evm:oldchain:1', help='chain spec')
argparser.add_argument('-i', '--chain-spec', type=str, dest='i', help='chain spec')
argparser.add_argument('-r', '--registry-address', type=str, dest='r', help='CIC Registry address')
argparser.add_argument('--meta-host', dest='meta_host', type=str, help='metadata server host')
argparser.add_argument('--meta-port', dest='meta_port', type=int, help='metadata server port')
argparser.add_argument('--redis-host', dest='redis_host', type=str, help='redis host to use for task submission')
argparser.add_argument('--redis-port', dest='redis_port', type=int, help='redis port to use for task submission')
argparser.add_argument('--redis-db', dest='redis_db', type=int, help='redis db to use for task submission and callback')
argparser.add_argument('--token-symbol', default='GFT', type=str, dest='token_symbol',
help='Token symbol to use for transactions')
argparser.add_argument('--head', action='store_true', help='start at current block height (overrides --offset)')
argparser.add_argument('--env-prefix', default=os.environ.get('CONFINI_ENV_PREFIX'), dest='env_prefix', type=str,
help='environment prefix for variables to overwrite configuration')
argparser.add_argument('-q', type=str, default='cic-import-ussd', help='celery queue to submit transaction tasks to')
argparser.add_argument('--offset', type=int, default=0, help='block offset to start syncer from')
argparser.add_argument('-v', help='be verbose', action='store_true')
argparser.add_argument('-vv', help='be more verbose', action='store_true')
argparser.add_argument('user_dir', default='out', type=str, help='user export directory')
args = argparser.parse_args(sys.argv[1:])
if args.v:
if args.vv:
logging.getLogger().setLevel(logging.DEBUG)
elif args.v:
logging.getLogger().setLevel(logging.INFO)
elif args.vv:
logging.getLogger().setLevel(logging.DEBUG)
config_dir = os.path.join(args.c)
os.makedirs(config_dir, 0o777, True)
config = confini.Config(config_dir, args.env_prefix)
config = Config(args.c, args.env_prefix)
config.process()
# override args
args_override = {
'CIC_CHAIN_SPEC': getattr(args, 'i'),
'ETH_PROVIDER': getattr(args, 'p'),
@@ -73,88 +70,76 @@ args_override = {
config.dict_override(args_override, 'cli flag')
config.censor('PASSWORD', 'DATABASE')
config.censor('PASSWORD', 'SSL')
logg.debug('config loaded from {}:\n{}'.format(config_dir, config))
logg.debug(f'config loaded from {args.c}:\n{config}')
redis_host = config.get('REDIS_HOST')
redis_port = config.get('REDIS_PORT')
redis_db = config.get('REDIS_DB')
r = redis.Redis(redis_host, redis_port, redis_db)
db_config = {
'database': config.get('DATABASE_NAME'),
'host': config.get('DATABASE_HOST'),
'port': config.get('DATABASE_PORT'),
'user': config.get('DATABASE_USER'),
'password': config.get('DATABASE_PASSWORD')
}
ImportTask.db_config = db_config
# create celery apps
celery_app = celery.Celery(backend=config.get('CELERY_RESULT_URL'), broker=config.get('CELERY_BROKER_URL'))
status = get_celery_worker_status(celery_app=celery_app)
signer_address = None
keystore = DictKeystore()
if args.y is not None:
logg.debug('loading keystore file {}'.format(args.y))
signer_address = keystore.import_keystore_file(args.y)
logg.debug('now have key for signer address {}'.format(signer_address))
# define signer
os.path.isfile(args.y)
logg.debug(f'loading keystore file {args.y}')
signer_address = keystore.import_keystore_file(args.y)
logg.debug(f'now have key for signer address {signer_address}')
signer = EIP155Signer(keystore)
queue = args.q
chain_str = config.get('CIC_CHAIN_SPEC')
block_offset = 0
if args.head:
block_offset = -1
else:
block_offset = args.offset
block_offset = -1 if args.head else args.offset
chain_str = config.get('CIC_CHAIN_SPEC')
chain_spec = ChainSpec.from_chain_str(chain_str)
ImportTask.chain_spec = chain_spec
old_chain_spec_str = args.old_chain_spec
old_chain_spec = ChainSpec.from_chain_str(old_chain_spec_str)
user_dir = args.user_dir # user_out_dir from import_users.py
token_symbol = args.token_symbol
MetadataTask.meta_host = config.get('META_HOST')
MetadataTask.meta_port = config.get('META_PORT')
ImportTask.chain_spec = chain_spec
txs_dir = os.path.join(args.import_dir, 'txs')
os.makedirs(txs_dir, exist_ok=True)
sys.stdout.write(f'created txs dir: {txs_dir}')
celery_app = celery.Celery(broker=config.get('CELERY_BROKER_URL'), backend=config.get('CELERY_RESULT_URL'))
get_celery_worker_status(celery_app)
def main():
conn = EthHTTPConnection(config.get('ETH_PROVIDER'))
ImportTask.balance_processor = BalanceProcessor(conn, chain_spec, config.get('CIC_REGISTRY_ADDRESS'),
signer_address, signer)
ImportTask.balance_processor.init(token_symbol)
# TODO get decimals from token
ImportTask.balance_processor = BalanceProcessor(conn,
chain_spec,
config.get('CIC_REGISTRY_ADDRESS'),
signer_address,
signer)
ImportTask.balance_processor.init(args.token_symbol)
balances = {}
f = open('{}/balances.csv'.format(user_dir, 'r'))
remove_zeros = 10 ** 6
i = 0
while True:
l = f.readline()
if l is None:
break
r = l.split(',')
try:
address = to_checksum_address(r[0])
sys.stdout.write('loading balance {} {} {}'.format(i, address, r[1]).ljust(200) + "\r")
except ValueError:
break
balance = int(int(r[1].rstrip()) / remove_zeros)
balances[address] = balance
i += 1
f.close()
accuracy = 10 ** 6
count = 0
with open(f'{args.import_dir}/balances.csv', 'r') as balances_file:
while True:
line = balances_file.readline()
if line is None:
break
balance_data = line.split(',')
try:
blockchain_address = to_checksum_address(balance_data[0])
logg.info(
'loading balance: {} {} {}'.format(count, blockchain_address, balance_data[1].ljust(200) + "\r"))
except ValueError:
break
balance = int(int(balance_data[1].rstrip()) / accuracy)
balances[blockchain_address] = balance
count += 1
ImportTask.balances = balances
ImportTask.count = i
ImportTask.import_dir = user_dir
s = celery.signature(
'import_task.send_txs',
[
MetadataTask.balance_processor.nonce_offset,
],
queue=queue,
)
s.apply_async()
ImportTask.count = count
ImportTask.include_balances = args.include_balances is True
ImportTask.import_dir = args.import_dir
s_send_txs = celery.signature(
'import_task.send_txs', [ImportTask.balance_processor.nonce_offset], queue=args.q)
s_send_txs.apply_async()
argv = ['worker']
if args.vv:
@@ -165,6 +150,7 @@ def main():
argv.append(args.q)
argv.append('-n')
argv.append(args.q)
argv.append(f'--pidfile={args.q}.pid')
celery_app.worker_main(argv)
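Both versions of the balance loop above scale each CSV amount down by `10 ** 6` before caching it per address. A standalone sketch of that loading step using the `csv` module, assuming the `address,amount` layout implied by the code (sample addresses are made up):

```python
import csv
import io

def load_balances(csv_text: str, accuracy: int = 10 ** 6) -> dict:
    """Map each address to its truncated balance, as the import loop does."""
    balances = {}
    for row in csv.reader(io.StringIO(csv_text)):
        if len(row) < 2:
            continue
        address, amount = row[0], row[1].rstrip()
        # same truncation as `int(int(r[1].rstrip()) / remove_zeros)` above
        balances[address] = int(int(amount) / accuracy)
    return balances

sample = "0x01,2500000\n0x02,999999\n"
print(load_balances(sample))  # → {'0x01': 2, '0x02': 0}
```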


@@ -1,71 +1,63 @@
# standard import
# standard imports
import argparse
import csv
import logging
import os
import psycopg2
# third-party imports
import celery
import confini
# external imports
from confini import Config
# local imports
from import_util import get_celery_worker_status
default_config_dir = './config'
logging.basicConfig(level=logging.WARNING)
logg = logging.getLogger()
default_config_dir = './config'
arg_parser = argparse.ArgumentParser()
arg_parser.add_argument('-c', type=str, default=default_config_dir, help='config root to use')
arg_parser = argparse.ArgumentParser(description='Pins import script.')
arg_parser.add_argument('-c', type=str, default=default_config_dir, help='config root to use.')
arg_parser.add_argument('--env-prefix',
default=os.environ.get('CONFINI_ENV_PREFIX'),
dest='env_prefix',
type=str,
help='environment prefix for variables to overwrite configuration')
arg_parser.add_argument('-q', type=str, default='cic-import-ussd', help='celery queue to submit transaction tasks to')
help='environment prefix for variables to overwrite configuration.')
arg_parser.add_argument('import_dir', default='out', type=str, help='user export directory')
arg_parser.add_argument('-v', help='be verbose', action='store_true')
arg_parser.add_argument('-vv', help='be more verbose', action='store_true')
arg_parser.add_argument('pins_dir', default='out', type=str, help='user export directory')
args = arg_parser.parse_args()
# set log levels
if args.v:
logg.setLevel(logging.INFO)
elif args.vv:
logg.setLevel(logging.DEBUG)
if args.vv:
logging.getLogger().setLevel(logging.DEBUG)
elif args.v:
logging.getLogger().setLevel(logging.INFO)
# process configs
config_dir = args.c
config = confini.Config(config_dir, os.environ.get('CONFINI_ENV_PREFIX'))
config = Config(args.c, args.env_prefix)
config.process()
config.censor('PASSWORD', 'DATABASE')
logg.debug('config loaded from {}:\n{}'.format(args.c, config))
celery_app = celery.Celery(broker=config.get('CELERY_BROKER_URL'), backend=config.get('CELERY_RESULT_URL'))
status = get_celery_worker_status(celery_app=celery_app)
db_configs = {
'database': config.get('DATABASE_NAME'),
'host': config.get('DATABASE_HOST'),
'port': config.get('DATABASE_PORT'),
'user': config.get('DATABASE_USER'),
'password': config.get('DATABASE_PASSWORD')
}
logg.debug(f'config loaded from {args.c}:\n{config}')
def main():
with open(f'{args.pins_dir}/pins.csv') as pins_file:
with open(f'{args.import_dir}/pins.csv') as pins_file:
phone_to_pins = [tuple(row) for row in csv.reader(pins_file)]
s_import_pins = celery.signature(
'import_task.set_pins',
(db_configs, phone_to_pins),
queue=args.q
db_conn = psycopg2.connect(
database=config.get('DATABASE_NAME'),
host=config.get('DATABASE_HOST'),
port=config.get('DATABASE_PORT'),
user=config.get('DATABASE_USER'),
password=config.get('DATABASE_PASSWORD')
)
result = s_import_pins.apply_async()
logg.debug(f'TASK: {result.id}, STATUS: {result.status}')
db_cursor = db_conn.cursor()
sql = 'UPDATE account SET password_hash = %s WHERE phone_number = %s'
for element in phone_to_pins:
db_cursor.execute(sql, (element[1], element[0]))
logg.debug(f'Updating account: {element[0]} with: {element[1]}')
db_conn.commit()
db_cursor.close()
db_conn.close()
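The direct-update approach above can be exercised against an in-memory SQLite table standing in for the ussd `account` table (note SQLite uses `?` placeholders where psycopg2 uses `%s`; table contents here are made up):

```python
import sqlite3

phone_to_pins = [('+254700000001', 'hash1'), ('+254700000002', 'hash2')]

conn = sqlite3.connect(':memory:')
cur = conn.cursor()
cur.execute('CREATE TABLE account (phone_number TEXT PRIMARY KEY, password_hash TEXT)')
cur.executemany('INSERT INTO account (phone_number) VALUES (?)',
                [(p,) for p, _ in phone_to_pins])

# same statement as the script, with sqlite-style placeholders
sql = 'UPDATE account SET password_hash = ? WHERE phone_number = ?'
for phone_number, password_hash in phone_to_pins:
    cur.execute(sql, (password_hash, phone_number))
conn.commit()

rows = dict(cur.execute('SELECT phone_number, password_hash FROM account'))
```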
if __name__ == '__main__':


@@ -1,38 +1,37 @@
# standard imports
import csv
import json
import logging
import os
import random
import urllib.error
import urllib.parse
import urllib.request
import uuid
from urllib import error, parse, request
# external imports
import celery
import psycopg2
from celery import Task
from chainlib.chain import ChainSpec
from chainlib.eth.address import to_checksum_address
from chainlib.eth.tx import (
unpack,
raw,
)
from cic_types.models.person import Person
from cic_types.processor import generate_metadata_pointer
from hexathon import (
strip_0x,
add_0x,
)
from chainlib.eth.tx import raw, unpack
from cic_types.models.person import Person, generate_metadata_pointer
from hexathon import add_0x, strip_0x
# local imports
logg = logging.getLogger()
celery_app = celery.current_app
logg = logging.getLogger()
class ImportTask(celery.Task):
class ImportTask(Task):
balances = None
import_dir = 'out'
count = 0
chain_spec = None
balance_processor = None
chain_spec: ChainSpec = None
count = 0
db_config: dict = None
import_dir = ''
include_balances = False
max_retries = None
@@ -41,121 +40,70 @@ class MetadataTask(ImportTask):
meta_port = None
meta_path = ''
meta_ssl = False
autoretry_for = (
urllib.error.HTTPError,
OSError,
)
autoretry_for = (error.HTTPError, OSError,)
retry_jitter = True
retry_backoff = True
retry_backoff_max = 60
@classmethod
def meta_url(self):
def meta_url(cls):
scheme = 'http'
if self.meta_ssl:
if cls.meta_ssl:
scheme += 's'
url = urllib.parse.urlparse('{}://{}:{}/{}'.format(scheme, self.meta_host, self.meta_port, self.meta_path))
return urllib.parse.urlunparse(url)
url = parse.urlparse(f'{scheme}://{cls.meta_host}:{cls.meta_port}/{cls.meta_path}')
return parse.urlunparse(url)
def old_address_from_phone(base_path, phone):
pidx = generate_metadata_pointer(phone.encode('utf-8'), ':cic.phone')
phone_idx_path = os.path.join('{}/phone/{}/{}/{}'.format(
base_path,
pidx[:2],
pidx[2:4],
pidx,
)
)
f = open(phone_idx_path, 'r')
old_address = f.read()
f.close()
def old_address_from_phone(base_path: str, phone_number: str):
pid_x = generate_metadata_pointer(phone_number.encode('utf-8'), ':cic.phone')
phone_idx_path = os.path.join(f'{base_path}/phone/{pid_x[:2]}/{pid_x[2:4]}/{pid_x}')
with open(phone_idx_path, 'r') as f:
old_address = f.read()
return old_address
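`old_address_from_phone` resolves a phone number to a sharded index path via `generate_metadata_pointer`, fanning files out over `phone/<p[:2]>/<p[2:4]>/<p>`. A sketch of that scheme, using sha256 as a stand-in for the real pointer function (whose digest algorithm is not shown in this diff):

```python
import hashlib

def metadata_pointer(identifier: bytes, cic_type: str) -> str:
    """Stand-in for cic_types' generate_metadata_pointer: a hex digest
    over the identifier plus a type salt (the actual algorithm is an
    assumption here)."""
    return hashlib.sha256(identifier + cic_type.encode('utf-8')).hexdigest()

def phone_index_path(base_path: str, phone_number: str) -> str:
    """Same two-level fan-out as old_address_from_phone above."""
    pid_x = metadata_pointer(phone_number.encode('utf-8'), ':cic.phone')
    return f'{base_path}/phone/{pid_x[:2]}/{pid_x[2:4]}/{pid_x}'

path = phone_index_path('out', '+254700000000')
```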
@celery_app.task(bind=True, base=MetadataTask)
def resolve_phone(self, phone):
identifier = generate_metadata_pointer(phone.encode('utf-8'), ':cic.phone')
url = urllib.parse.urljoin(self.meta_url(), identifier)
logg.debug('attempt getting phone pointer at {} for phone {}'.format(url, phone))
r = urllib.request.urlopen(url)
address = json.load(r)
address = address.replace('"', '')
logg.debug('address {} for phone {}'.format(address, phone))
return address
@celery_app.task(bind=True, base=MetadataTask)
def generate_metadata(self, address, phone):
old_address = old_address_from_phone(self.import_dir, phone)
logg.debug('address {}'.format(address))
old_address_upper = strip_0x(old_address).upper()
metadata_path = '{}/old/{}/{}/{}.json'.format(
self.import_dir,
old_address_upper[:2],
old_address_upper[2:4],
old_address_upper,
)
f = open(metadata_path, 'r')
o = json.load(f)
f.close()
u = Person.deserialize(o)
if u.identities.get('evm') == None:
u.identities['evm'] = {}
sub_chain_str = '{}:{}'.format(self.chain_spec.common_name(), self.chain_spec.network_id())
u.identities['evm'][sub_chain_str] = [add_0x(address)]
new_address_clean = strip_0x(address)
filepath = os.path.join(
def generate_person_metadata(self, blockchain_address: str, phone_number: str):
logg.debug(f'blockchain address: {blockchain_address}')
old_blockchain_address = old_address_from_phone(self.import_dir, phone_number)
old_address_upper = strip_0x(old_blockchain_address).upper()
metadata_path = f'{self.import_dir}/old/{old_address_upper[:2]}/{old_address_upper[2:4]}/{old_address_upper}.json'
with open(metadata_path, 'r') as metadata_file:
person_metadata = json.load(metadata_file)
person = Person.deserialize(person_metadata)
if not person.identities.get('evm'):
person.identities['evm'] = {}
sub_chain_str = f'{self.chain_spec.common_name()}:{self.chain_spec.network_id()}'
person.identities['evm'][sub_chain_str] = [add_0x(blockchain_address)]
blockchain_address = strip_0x(blockchain_address)
file_path = os.path.join(
self.import_dir,
'new',
new_address_clean[:2].upper(),
new_address_clean[2:4].upper(),
new_address_clean.upper() + '.json',
blockchain_address[:2].upper(),
blockchain_address[2:4].upper(),
blockchain_address.upper() + '.json'
)
os.makedirs(os.path.dirname(filepath), exist_ok=True)
o = u.serialize()
f = open(filepath, 'w')
f.write(json.dumps(o))
f.close()
meta_key = generate_metadata_pointer(bytes.fromhex(new_address_clean), ':cic.person')
os.makedirs(os.path.dirname(file_path), exist_ok=True)
serialized_person_metadata = person.serialize()
with open(file_path, 'w') as metadata_file:
metadata_file.write(json.dumps(serialized_person_metadata))
logg.debug(f'written person metadata for address: {blockchain_address}')
meta_filepath = os.path.join(
self.import_dir,
'meta',
'{}.json'.format(new_address_clean.upper()),
'{}.json'.format(blockchain_address.upper()),
)
os.symlink(os.path.realpath(filepath), meta_filepath)
os.symlink(os.path.realpath(file_path), meta_filepath)
return blockchain_address
# write ussd data
ussd_data = {
'phone': phone,
'is_activated': 1,
'preferred_language': random.sample(['en', 'sw'], 1)[0],
'is_disabled': False
}
ussd_data_dir = os.path.join(self.import_dir, 'ussd')
ussd_data_file_path = os.path.join(ussd_data_dir, f'{old_address}.json')
f = open(ussd_data_file_path, 'w')
f.write(json.dumps(ussd_data))
f.close()
# write preferences data
@celery_app.task(bind=True, base=MetadataTask)
def generate_preferences_data(self, data: tuple):
blockchain_address: str = data[0]
preferences = data[1]
preferences_dir = os.path.join(self.import_dir, 'preferences')
preferences_data = {
'preferred_language': ussd_data['preferred_language']
}
preferences_key = generate_metadata_pointer(bytes.fromhex(new_address_clean[2:]), ':cic.preferences')
preferences_key = generate_metadata_pointer(bytes.fromhex(strip_0x(blockchain_address)), ':cic.preferences')
preferences_filepath = os.path.join(preferences_dir, 'meta', preferences_key)
filepath = os.path.join(
preferences_dir,
'new',
@@ -164,95 +112,95 @@ def generate_metadata(self, address, phone):
preferences_key.upper() + '.json'
)
os.makedirs(os.path.dirname(filepath), exist_ok=True)
f = open(filepath, 'w')
f.write(json.dumps(preferences_data))
f.close()
with open(filepath, 'w') as preferences_file:
preferences_file.write(json.dumps(preferences))
logg.debug(f'written preferences metadata: {preferences} for address: {blockchain_address}')
os.symlink(os.path.realpath(filepath), preferences_filepath)
logg.debug('found metadata {} for phone {}'.format(o, phone))
return address
return blockchain_address
@celery_app.task(bind=True, base=MetadataTask)
def opening_balance_tx(self, address, phone, serial):
old_address = old_address_from_phone(self.import_dir, phone)
def generate_pins_data(self, blockchain_address: str, phone_number: str):
pins_file = f'{self.import_dir}/pins.csv'
file_op = 'a' if os.path.exists(pins_file) else 'w'
with open(pins_file, file_op) as pins_file:
password_hash = uuid.uuid4().hex
pins_file.write(f'{phone_number},{password_hash}\n')
logg.debug(f'written pin data for address: {blockchain_address}')
return blockchain_address
k = to_checksum_address(strip_0x(old_address))
balance = self.balances[k]
logg.debug('found balance {} for address {} phone {}'.format(balance, old_address, phone))
@celery_app.task(bind=True, base=MetadataTask)
def generate_ussd_data(self, blockchain_address: str, phone_number: str):
ussd_data_file = f'{self.import_dir}/ussd_data.csv'
file_op = 'a' if os.path.exists(ussd_data_file) else 'w'
preferred_language = random.sample(["en", "sw"], 1)[0]
preferences = {'preferred_language': preferred_language}
with open(ussd_data_file, file_op) as ussd_data_file:
ussd_data_file.write(f'{phone_number}, {1}, {preferred_language}, {False}\n')
logg.debug(f'written ussd data for address: {blockchain_address}')
return blockchain_address, preferences
@celery_app.task(bind=True, base=MetadataTask)
def opening_balance_tx(self, blockchain_address: str, phone_number: str, serial: str):
old_blockchain_address = old_address_from_phone(self.import_dir, phone_number)
address = to_checksum_address(strip_0x(old_blockchain_address))
balance = self.balances[address]
logg.debug(f'found balance: {balance} for address: {address} phone: {phone_number}')
decimal_balance = self.balance_processor.get_decimal_amount(int(balance))
(tx_hash_hex, o) = self.balance_processor.get_rpc_tx(address, decimal_balance, serial)
tx_hash_hex, o = self.balance_processor.get_rpc_tx(blockchain_address, decimal_balance, serial)
tx = unpack(bytes.fromhex(strip_0x(o)), self.chain_spec)
logg.debug('generated tx token value {} to {} tx hash {}'.format(decimal_balance, address, tx_hash_hex))
tx_path = os.path.join(
self.import_dir,
'txs',
strip_0x(tx_hash_hex),
)
f = open(tx_path, 'w')
f.write(strip_0x(o))
f.close()
tx_nonce_path = os.path.join(
self.import_dir,
'txs',
'.' + str(tx['nonce']),
)
logg.debug(f'generated tx token value: {decimal_balance}: {blockchain_address} tx hash {tx_hash_hex}')
tx_path = os.path.join(self.import_dir, 'txs', strip_0x(tx_hash_hex))
with open(tx_path, 'w') as tx_file:
tx_file.write(strip_0x(o))
logg.debug(f'written tx with tx hash: {tx["hash"]} for address: {blockchain_address}')
tx_nonce_path = os.path.join(self.import_dir, 'txs', '.' + str(tx['nonce']))
os.symlink(os.path.realpath(tx_path), tx_nonce_path)
return tx['hash']
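The two writes above set up a small on-disk index: the raw transaction is stored under its hash, and a hidden `.<nonce>` symlink points at it so the sender can replay transactions in strict nonce order without knowing any hashes. A stdlib sketch of the same layout (names and values here are illustrative only):

```python
import os
import tempfile

txs_dir = tempfile.mkdtemp()
tx_hash, nonce, raw_tx = 'deadbeef', 7, 'f86b...'

# Store the raw tx under its hash ...
tx_path = os.path.join(txs_dir, tx_hash)
with open(tx_path, 'w') as tx_file:
    tx_file.write(raw_tx)

# ... and index it by nonce via a hidden symlink.
os.symlink(os.path.realpath(tx_path), os.path.join(txs_dir, '.' + str(nonce)))

# A consumer can now resolve "the tx with nonce 7" directly.
with open(os.path.join(txs_dir, '.' + str(nonce))) as nonce_file:
    replayed = nonce_file.read()
```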
@celery_app.task(bind=True, base=ImportTask, autoretry_for=(FileNotFoundError,), max_retries=None,
@celery_app.task(bind=True, base=MetadataTask)
def resolve_phone(self, phone_number: str):
identifier = generate_metadata_pointer(phone_number.encode('utf-8'), ':cic.phone')
url = parse.urljoin(self.meta_url(), identifier)
logg.debug(f'attempt getting phone pointer at: {url} for phone: {phone_number}')
r = request.urlopen(url)
address = json.load(r)
address = address.replace('"', '')
logg.debug(f'address: {address} for phone: {phone_number}')
return address
@celery_app.task(autoretry_for=(FileNotFoundError,),
bind=True,
base=ImportTask,
max_retries=None,
default_retry_delay=0.1)
def send_txs(self, nonce):
if nonce == self.count + self.balance_processor.nonce_offset:
logg.info('reached nonce {} (offset {} + count {}) exiting'.format(nonce, self.balance_processor.nonce_offset,
self.count))
return
logg.debug('attempt to open symlink for nonce {}'.format(nonce))
tx_nonce_path = os.path.join(
self.import_dir,
'txs',
'.' + str(nonce),
)
f = open(tx_nonce_path, 'r')
tx_signed_raw_hex = f.read()
f.close()
os.unlink(tx_nonce_path)
o = raw(add_0x(tx_signed_raw_hex))
tx_hash_hex = self.balance_processor.conn.do(o)
logg.info('sent nonce {} tx hash {}'.format(nonce, tx_hash_hex))  # tx_signed_raw_hex
nonce += 1
queue = self.request.delivery_info.get('routing_key')
s = celery.signature(
'import_task.send_txs',
[
nonce,
],
queue=queue,
)
s.apply_async()
if nonce == self.count + self.balance_processor.nonce_offset:
logg.info(f'reached nonce {nonce} (offset {self.balance_processor.nonce_offset} + count {self.count}).')
celery_app.control.broadcast('shutdown', destination=[f'celery@{queue}'])
logg.debug(f'attempt to open symlink for nonce {nonce}')
tx_nonce_path = os.path.join(self.import_dir, 'txs', '.' + str(nonce))
with open(tx_nonce_path, 'r') as tx_nonce_file:
tx_signed_raw_hex = tx_nonce_file.read()
os.unlink(tx_nonce_path)
o = raw(add_0x(tx_signed_raw_hex))
if self.include_balances:
tx_hash_hex = self.balance_processor.conn.do(o)
logg.info(f'sent nonce {nonce} tx hash {tx_hash_hex}')
nonce += 1
s = celery.signature('import_task.send_txs', [nonce], queue=queue)
s.apply_async()
return nonce
@celery_app.task
def set_pins(config: dict, phone_to_pins: list):
# define db connection
@celery_app.task()
def set_pin_data(config: dict, phone_to_pins: list):
db_conn = psycopg2.connect(
database=config.get('database'),
host=config.get('host'),
@@ -261,24 +209,17 @@ def set_pins(config: dict, phone_to_pins: list):
password=config.get('password')
)
db_cursor = db_conn.cursor()
# update db
sql = 'UPDATE account SET password_hash = %s WHERE phone_number = %s'
for element in phone_to_pins:
sql = 'UPDATE account SET password_hash = %s WHERE phone_number = %s'
db_cursor.execute(sql, (element[1], element[0]))
logg.debug(f'Updating: {element[0]} with: {element[1]}')
# commit changes
db_conn.commit()
# close connections
db_cursor.close()
db_conn.close()
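The update loop above uses parameterized SQL, so the PIN hash and phone number are never interpolated into the statement. A stdlib sketch with `sqlite3` (the real code uses `psycopg2`, whose placeholder style is `%s` rather than `?`):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE account (phone_number TEXT, password_hash TEXT)')
conn.execute("INSERT INTO account VALUES ('+254700000001', 'old')")

# Same shape as the loop above: one parameterized UPDATE per (phone, hash) pair.
phone_to_pins = [('+254700000001', 'hash1')]
sql = 'UPDATE account SET password_hash = ? WHERE phone_number = ?'
for phone, pin_hash in phone_to_pins:
    conn.execute(sql, (pin_hash, phone))
conn.commit()

row = conn.execute('SELECT password_hash FROM account').fetchone()
```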
@celery_app.task
def set_ussd_data(config: dict, ussd_data: dict):
# define db connection
def set_ussd_data(config: dict, ussd_data: list):
db_conn = psycopg2.connect(
database=config.get('database'),
host=config.get('host'),
@@ -287,20 +228,12 @@ def set_ussd_data(config: dict, ussd_data: dict):
password=config.get('password')
)
db_cursor = db_conn.cursor()
# process ussd_data
account_status = 1
if ussd_data['is_activated'] == 1:
account_status = 2
preferred_language = ussd_data['preferred_language']
phone_number = ussd_data['phone']
sql = 'UPDATE account SET status = %s, preferred_language = %s WHERE phone_number = %s'
db_cursor.execute(sql, (account_status, preferred_language, phone_number))
# commit changes
for element in ussd_data:
status = 2 if int(element[1]) == 1 else 1
preferred_language = element[2]
phone_number = element[0]
db_cursor.execute(sql, (status, preferred_language, phone_number))
db_conn.commit()
# close connections
db_cursor.close()
db_conn.close()


@@ -3,56 +3,61 @@ import argparse
import json
import logging
import os
import redis
import sys
import time
import urllib.request
import uuid
from urllib import request
from urllib.parse import urlencode
# external imports
import celery
import confini
import phonenumbers
import redis
from chainlib.chain import ChainSpec
from cic_types.models.person import Person
from confini import Config
# local imports
from import_util import get_celery_worker_status
default_config_dir = './config'
logging.basicConfig(level=logging.WARNING)
logg = logging.getLogger()
default_config_dir = '/usr/local/etc/cic'
arg_parser = argparse.ArgumentParser(description='Daemon worker that handles data seeding tasks.')
# batch size should stay slightly below one block's worth of gas: e.g. with 80000-gas txs and an 8000000 gas limit, a bit under 100 txs fit per batch
arg_parser.add_argument('--batch-size',
dest='batch_size',
default=100,
type=int,
help='burst size of sending transactions to node')
arg_parser.add_argument('--batch-delay', dest='batch_delay', default=3, type=int, help='seconds delay between batches')
arg_parser.add_argument('-c', type=str, default=default_config_dir, help='config root to use.')
arg_parser.add_argument('--env-prefix',
default=os.environ.get('CONFINI_ENV_PREFIX'),
dest='env_prefix',
type=str,
help='environment prefix for variables to overwrite configuration.')
arg_parser.add_argument('-i', '--chain-spec', type=str, dest='i', help='chain spec')
arg_parser.add_argument('-q', type=str, default='cic-import-ussd', help='celery queue to submit data seeding tasks to.')
arg_parser.add_argument('--redis-db', dest='redis_db', type=int, help='redis db to use for task submission and callback')
arg_parser.add_argument('--redis-host', dest='redis_host', type=str, help='redis host to use for task submission')
arg_parser.add_argument('--redis-port', dest='redis_port', type=int, help='redis host to use for task submission')
arg_parser.add_argument('--ussd-host', dest='ussd_host', type=str,
help="host to ussd app responsible for processing ussd requests.")
arg_parser.add_argument('--ussd-no-ssl', dest='ussd_no_ssl', help='do not use ssl (careful)', action='store_true')
arg_parser.add_argument('--ussd-port', dest='ussd_port', type=str,
help="port to ussd app responsible for processing ussd requests.")
arg_parser.add_argument('-v', help='be verbose', action='store_true')
arg_parser.add_argument('-vv', help='be more verbose', action='store_true')
arg_parser.add_argument('import_dir', default='out', type=str, help='user export directory')
args = arg_parser.parse_args()
argparser = argparse.ArgumentParser()
argparser.add_argument('-c', type=str, default=default_config_dir, help='config file')
argparser.add_argument('-i', '--chain-spec', dest='i', type=str, help='Chain specification string')
argparser.add_argument('--redis-host', dest='redis_host', type=str, help='redis host to use for task submission')
argparser.add_argument('--redis-port', dest='redis_port', type=int, help='redis host to use for task submission')
argparser.add_argument('--redis-db', dest='redis_db', type=int, help='redis db to use for task submission and callback')
argparser.add_argument('--batch-size', dest='batch_size', default=100, type=int,
help='burst size of sending transactions to node') # batch size should be slightly below cumulative gas limit worth, eg 80000 gas txs with 8000000 limit is a bit less than 100 batch size
argparser.add_argument('--batch-delay', dest='batch_delay', default=3, type=int, help='seconds delay between batches')
argparser.add_argument('--timeout', default=60.0, type=float, help='Callback timeout')
argparser.add_argument('--ussd-host', dest='ussd_host', type=str,
help="host to ussd app responsible for processing ussd requests.")
argparser.add_argument('--ussd-port', dest='ussd_port', type=str,
help="port to ussd app responsible for processing ussd requests.")
argparser.add_argument('--ussd-no-ssl', dest='ussd_no_ssl', help='do not use ssl (careful)', action='store_true')
argparser.add_argument('-q', type=str, default='cic-eth', help='Task queue')
argparser.add_argument('-v', action='store_true', help='Be verbose')
argparser.add_argument('-vv', action='store_true', help='Be more verbose')
argparser.add_argument('user_dir', type=str, help='path to users export dir tree')
args = argparser.parse_args()
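The `--batch-size` help comment above is simple block-capacity arithmetic; checking it with the figures from the comment:

```python
# One block's gas limit divided by per-tx gas bounds the txs per batch;
# the default batch size should sit at or just below that bound.
block_gas_limit = 8_000_000
gas_per_tx = 80_000
max_txs_per_block = block_gas_limit // gas_per_tx
default_batch_size = 100  # "a bit less than 100" is the safer choice
headroom_ok = default_batch_size <= max_txs_per_block
```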
if args.vv:
logging.getLogger().setLevel(logging.DEBUG)
elif args.v:
logging.getLogger().setLevel(logging.INFO)
if args.v:
logg.setLevel(logging.INFO)
elif args.vv:
logg.setLevel(logging.DEBUG)
config_dir = args.c
config = confini.Config(config_dir, os.environ.get('CONFINI_ENV_PREFIX'))
config = Config(args.c, args.env_prefix)
config.process()
args_override = {
'CIC_CHAIN_SPEC': getattr(args, 'i'),
@@ -60,44 +65,29 @@ args_override = {
'REDIS_PORT': getattr(args, 'redis_port'),
'REDIS_DB': getattr(args, 'redis_db'),
}
config.dict_override(args_override, 'cli')
logg.debug('config loaded from {}:\n{}'.format(args.c, config))
config.dict_override(args_override, 'cli flag')
config.censor('PASSWORD', 'DATABASE')
config.censor('PASSWORD', 'SSL')
logg.debug(f'config loaded from {args.c}:\n{config}')
celery_app = celery.Celery(broker=config.get('CELERY_BROKER_URL'), backend=config.get('CELERY_RESULT_URL'))
get_celery_worker_status(celery_app=celery_app)
old_account_dir = os.path.join(args.import_dir, 'old')
os.stat(old_account_dir)
logg.debug(f'found old system data dir: {old_account_dir}')
redis_host = config.get('REDIS_HOST')
redis_port = config.get('REDIS_PORT')
redis_db = config.get('REDIS_DB')
r = redis.Redis(redis_host, redis_port, redis_db)
new_account_dir = os.path.join(args.import_dir, 'new')
os.makedirs(new_account_dir, exist_ok=True)
logg.debug(f'created new system data dir: {new_account_dir}')
ps = r.pubsub()
person_metadata_dir = os.path.join(args.import_dir, 'meta')
os.makedirs(person_metadata_dir, exist_ok=True)
logg.debug(f'created person metadata dir: {person_metadata_dir}')
user_new_dir = os.path.join(args.user_dir, 'new')
os.makedirs(user_new_dir, exist_ok=True)
ussd_data_dir = os.path.join(args.user_dir, 'ussd')
os.makedirs(ussd_data_dir, exist_ok=True)
preferences_dir = os.path.join(args.user_dir, 'preferences')
preferences_dir = os.path.join(args.import_dir, 'preferences')
os.makedirs(os.path.join(preferences_dir, 'meta'), exist_ok=True)
logg.debug(f'created preferences metadata dir: {preferences_dir}')
meta_dir = os.path.join(args.user_dir, 'meta')
os.makedirs(meta_dir, exist_ok=True)
valid_service_codes = config.get('USSD_SERVICE_CODE').split(",")
user_old_dir = os.path.join(args.user_dir, 'old')
os.stat(user_old_dir)
txs_dir = os.path.join(args.user_dir, 'txs')
os.makedirs(txs_dir, exist_ok=True)
chain_spec = ChainSpec.from_chain_str(config.get('CIC_CHAIN_SPEC'))
chain_str = str(chain_spec)
batch_size = args.batch_size
batch_delay = args.batch_delay
ussd_port = args.ussd_port
ussd_host = args.ussd_host
ussd_no_ssl = args.ussd_no_ssl
if ussd_no_ssl is True:
ussd_ssl = False
@@ -105,7 +95,17 @@ else:
ussd_ssl = True
def build_ussd_request(phone, host, port, service_code, username, password, ssl=False):
celery_app = celery.Celery(broker=config.get('CELERY_BROKER_URL'), backend=config.get('CELERY_RESULT_URL'))
get_celery_worker_status(celery_app)
def build_ussd_request(host: str,
password: str,
phone_number: str,
port: str,
service_code: str,
username: str,
ssl: bool = False):
url = 'http'
if ssl:
url += 's'
@@ -115,16 +115,16 @@ def build_ussd_request(phone, host, port, service_code, username, password, ssl=
url += '/?username={}&password={}'.format(username, password)
logg.info('ussd service url {}'.format(url))
logg.info('ussd phone {}'.format(phone))
logg.info('ussd phone {}'.format(phone_number))
session = uuid.uuid4().hex
data = {
'sessionId': session,
'serviceCode': service_code,
'phoneNumber': phone,
'phoneNumber': phone_number,
'text': service_code,
}
req = urllib.request.Request(url)
req = request.Request(url)
req.method = 'POST'
data_str = urlencode(data)
data_bytes = data_str.encode('utf-8')
@@ -134,85 +134,77 @@ def build_ussd_request(phone, host, port, service_code, username, password, ssl=
return req
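`build_ussd_request` above assembles a POST against the USSD gateway; the request shape can be sketched with stdlib `urllib` alone (the host, port, and service code below are placeholders, and no network call is made):

```python
import uuid
from urllib import request
from urllib.parse import urlencode

url = 'http://localhost:9000/?username=&password='
data = {
    'sessionId': uuid.uuid4().hex,   # fresh session id per request
    'serviceCode': '*483*46#',
    'phoneNumber': '+254712345678',
    'text': '*483*46#',
}
req = request.Request(url, data=urlencode(data).encode('utf-8'), method='POST')
```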
def register_ussd(i, u):
phone_object = phonenumbers.parse(u.tel)
phone = phonenumbers.format_number(phone_object, phonenumbers.PhoneNumberFormat.E164)
logg.debug('tel {} {}'.format(u.tel, phone))
req = build_ussd_request(
phone,
ussd_host,
ussd_port,
config.get('APP_SERVICE_CODE'),
'',
'',
ussd_ssl
)
response = urllib.request.urlopen(req)
def e164_phone_number(phone_number: str):
phone_object = phonenumbers.parse(phone_number)
return phonenumbers.format_number(phone_object, phonenumbers.PhoneNumberFormat.E164)
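`e164_phone_number` above delegates to the `phonenumbers` library; this naive stand-in (illustrative only, not equivalent to the library) shows the target shape — E.164 is `+` followed by country code and subscriber digits, with no spaces or punctuation:

```python
import re

def naive_e164(raw: str, default_country_code: str = '254') -> str:
    digits = re.sub(r'\D', '', raw)    # drop spaces, dashes, parens
    if digits.startswith('0'):         # national format -> prepend country code
        digits = default_country_code + digits[1:]
    return '+' + digits

formatted = naive_e164('0712 345-678')
```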
def register_account(person: Person):
phone_number = e164_phone_number(person.tel)
logg.debug(f'tel: {phone_number}')
req = build_ussd_request(args.ussd_host,
'',
phone_number,
args.ussd_port,
valid_service_codes[0],
'',
ussd_ssl)
response = request.urlopen(req)
response_data = response.read().decode('utf-8')
state = response_data[:3]
out = response_data[4:]
logg.debug('ussd reponse: {}'.format(out))
logg.debug(f'ussd response: {response_data[4:]}')
if __name__ == '__main__':
i = 0
j = 0
for x in os.walk(user_old_dir):
for x in os.walk(old_account_dir):
for y in x[2]:
if y[len(y) - 5:] != '.json':
continue
# handle json containing person object
filepath = os.path.join(x[0], y)
f = open(filepath, 'r')
try:
o = json.load(f)
except json.decoder.JSONDecodeError as e:
f.close()
logg.error('load error for {}: {}'.format(y, e))
continue
f.close()
u = Person.deserialize(o)
register_ussd(i, u)
phone_object = phonenumbers.parse(u.tel)
phone = phonenumbers.format_number(phone_object, phonenumbers.PhoneNumberFormat.E164)
s_phone = celery.signature(
'import_task.resolve_phone',
[
phone,
],
queue='cic-import-ussd',
file_path = os.path.join(x[0], y)
with open(file_path, 'r') as account_file:
try:
account_data = json.load(account_file)
except json.decoder.JSONDecodeError as e:
logg.error('load error for {}: {}'.format(y, e))
continue
person = Person.deserialize(account_data)
register_account(person)
phone_number = e164_phone_number(person.tel)
s_resolve_phone = celery.signature(
'import_task.resolve_phone', [phone_number], queue=args.q
)
s_meta = celery.signature(
'import_task.generate_metadata',
[
phone,
],
queue='cic-import-ussd',
s_person_metadata = celery.signature(
'import_task.generate_person_metadata', [phone_number], queue=args.q
)
s_balance = celery.signature(
'import_task.opening_balance_tx',
[
phone,
i,
],
queue='cic-import-ussd',
s_ussd_data = celery.signature(
'import_task.generate_ussd_data', [phone_number], queue=args.q
)
s_meta.link(s_balance)
s_phone.link(s_meta)
# block time plus a bit of time for ussd processing
s_phone.apply_async(countdown=7)
s_preferences_metadata = celery.signature(
'import_task.generate_preferences_data', [], queue=args.q
)
s_pins_data = celery.signature(
'import_task.generate_pins_data', [phone_number], queue=args.q
)
s_opening_balance = celery.signature(
'import_task.opening_balance_tx', [phone_number, i], queue=args.q
)
celery.chain(s_resolve_phone,
s_person_metadata,
s_ussd_data,
s_preferences_metadata,
s_pins_data,
s_opening_balance).apply_async(countdown=7)
i += 1
sys.stdout.write('imported {} {}'.format(i, u).ljust(200) + "\r")
sys.stdout.write('imported: {} {}'.format(i, person).ljust(200) + "\r\n")
j += 1
if j == batch_size:
time.sleep(batch_delay)
if j == args.batch_size:
time.sleep(args.batch_delay)
j = 0
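The `celery.chain` above pipes each task's return value into the next task's first argument; the data flow (only) can be mimicked with plain functions — the three steps below are hypothetical stand-ins, not the real tasks:

```python
from functools import reduce

def resolve_phone(phone):
    return '0x' + phone[-4:]            # pretend phone -> address lookup

def person_metadata(address):
    return {'address': address}         # pretend metadata generation

def attach_language(meta):
    return {**meta, 'preferred_language': 'sw'}

steps = [resolve_phone, person_metadata, attach_language]
result = reduce(lambda value, step: step(value), steps, '+254712345678')
```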


@@ -1,67 +1,67 @@
# standard imports
import argparse
import json
import csv
import logging
import os
import psycopg2
# external imports
import celery
from confini import Config
# local imports
default_config_dir = './config'
logging.basicConfig(level=logging.WARNING)
logg = logging.getLogger()
default_config_dir = '/usr/local/etc/cic'
arg_parser = argparse.ArgumentParser(description='Pins import script.')
arg_parser.add_argument('-c', type=str, default=default_config_dir, help='config root to use.')
arg_parser.add_argument('--env-prefix',
default=os.environ.get('CONFINI_ENV_PREFIX'),
dest='env_prefix',
type=str,
help='environment prefix for variables to overwrite configuration.')
arg_parser.add_argument('import_dir', default='out', type=str, help='user export directory')
arg_parser.add_argument('-v', help='be verbose', action='store_true')
arg_parser.add_argument('-vv', help='be more verbose', action='store_true')
arg_parser = argparse.ArgumentParser()
arg_parser.add_argument('-c', type=str, default=default_config_dir, help='config file')
arg_parser.add_argument('-q', type=str, default='cic-import-ussd', help='Task queue')
arg_parser.add_argument('-v', action='store_true', help='Be verbose')
arg_parser.add_argument('-vv', action='store_true', help='Be more verbose')
arg_parser.add_argument('user_dir', type=str, help='path to users export dir tree')
args = arg_parser.parse_args()
if args.v:
logg.setLevel(logging.INFO)
elif args.vv:
logg.setLevel(logging.DEBUG)
if args.vv:
logging.getLogger().setLevel(logging.DEBUG)
elif args.v:
logging.getLogger().setLevel(logging.INFO)
config_dir = args.c
config = Config(config_dir, os.environ.get('CONFINI_ENV_PREFIX'))
config = Config(args.c, args.env_prefix)
config.process()
logg.debug('config loaded from {}:\n{}'.format(args.c, config))
config.censor('PASSWORD', 'DATABASE')
logg.debug(f'config loaded from {args.c}:\n{config}')
ussd_data_dir = os.path.join(args.user_dir, 'ussd')
db_configs = {
'database': config.get('DATABASE_NAME'),
'host': config.get('DATABASE_HOST'),
'port': config.get('DATABASE_PORT'),
'user': config.get('DATABASE_USER'),
'password': config.get('DATABASE_PASSWORD')
}
celery_app = celery.Celery(broker=config.get('CELERY_BROKER_URL'), backend=config.get('CELERY_RESULT_URL'))
def main():
with open(f'{args.import_dir}/ussd_data.csv') as ussd_data_file:
ussd_data = [tuple(row) for row in csv.reader(ussd_data_file)]
db_conn = psycopg2.connect(
database=config.get('DATABASE_NAME'),
host=config.get('DATABASE_HOST'),
port=config.get('DATABASE_PORT'),
user=config.get('DATABASE_USER'),
password=config.get('DATABASE_PASSWORD')
)
db_cursor = db_conn.cursor()
sql = 'UPDATE account SET status = %s, preferred_language = %s WHERE phone_number = %s'
for element in ussd_data:
status = 2 if int(element[1]) == 1 else 1
preferred_language = element[2]
phone_number = element[0]
db_cursor.execute(sql, (status, preferred_language, phone_number))
logg.debug(f'Updating account:{phone_number} with: preferred language: {preferred_language} status: {status}.')
db_conn.commit()
db_cursor.close()
db_conn.close()
if __name__ == '__main__':
for x in os.walk(ussd_data_dir):
for y in x[2]:
if y[len(y) - 5:] == '.json':
filepath = os.path.join(x[0], y)
f = open(filepath, 'r')
try:
ussd_data = json.load(f)
logg.debug(f'LOADING USSD DATA: {ussd_data}')
except json.decoder.JSONDecodeError as e:
f.close()
logg.error('load error for {}: {}'.format(y, e))
continue
f.close()
s_set_ussd_data = celery.signature(
'import_task.set_ussd_data',
[db_configs, ussd_data]
)
s_set_ussd_data.apply_async(queue='cic-import-ussd')
main()


@@ -1,27 +1,4 @@
[app]
ALLOWED_IP=0.0.0.0/0
LOCALE_FALLBACK=en
LOCALE_PATH=/usr/src/cic-ussd/var/lib/locale/
MAX_BODY_LENGTH=1024
PASSWORD_PEPPER=QYbzKff6NhiQzY3ygl2BkiKOpER8RE/Upqs/5aZWW+I=
SERVICE_CODE=*483*46#
[phone_number]
REGION=KE
[ussd]
MENU_FILE=/usr/src/data/ussd_menu.json
user =
pass =
[statemachine]
STATES=/usr/src/cic-ussd/states/
TRANSITIONS=/usr/src/cic-ussd/transitions/
[client]
host =
port =
ssl =
[keystore]
file_path = keystore/UTC--2021-01-08T17-18-44.521011372Z--eb3907ecad74a0013c259d5874ae7f22dcbcc95c
allowed_ip=0.0.0.0/0
max_body_length=1024
password_pepper=


@@ -1,11 +1,10 @@
[cic]
registry_address = 0x32E860c2A0645d1B7B005273696905F5D6DC5D05
registry_address =
token_index_address =
accounts_index_address =
declarator_address =
approval_escrow_address =
chain_spec = evm:bloxberg:8996
chain_spec =
tx_retry_delay =
trust_address = 0xEb3907eCad74a0013c259D5874AE7f22DcBcC95C
trust_address =
user_ussd_svc_service_port =


@@ -1,10 +1,10 @@
[database]
NAME=sempo
USER=postgres
PASSWORD=
HOST=localhost
PORT=5432
ENGINE=postgresql
DRIVER=psycopg2
DEBUG=0
POOL_SIZE=1
name=cic_ussd
user=postgres
password=
host=localhost
port=5432
engine=postgresql
driver=psycopg2
debug=0
pool_size=1


@@ -0,0 +1,5 @@
[ussd]
menu_file=data/ussd_menu.json
service_code=*483*46#,*483*061#,*384*96#
user =
pass =


@@ -1,91 +0,0 @@
# standard imports
import argparse
import json
import logging
import os
import uuid
# third-party imports
import bcrypt
import celery
import confini
import phonenumbers
import random
from cic_types.models.person import Person
from cryptography.fernet import Fernet
# local imports
logging.basicConfig(level=logging.WARNING)
logg = logging.getLogger()
script_dir = os.path.realpath(os.path.dirname(__file__))
default_config_dir = os.environ.get('CONFINI_DIR', os.path.join(script_dir, 'config'))
arg_parser = argparse.ArgumentParser()
arg_parser.add_argument('-c', type=str, default=default_config_dir, help='Config dir')
arg_parser.add_argument('-v', action='store_true', help='Be verbose')
arg_parser.add_argument('-vv', action='store_true', help='Be more verbose')
arg_parser.add_argument('--userdir', type=str, help='path to users export dir tree')
arg_parser.add_argument('pins_dir', type=str, help='path to pin export dir tree')
args = arg_parser.parse_args()
if args.v:
logg.setLevel(logging.INFO)
elif args.vv:
logg.setLevel(logging.DEBUG)
config = confini.Config(args.c, os.environ.get('CONFINI_ENV_PREFIX'))
config.process()
logg.info('loaded config\n{}'.format(config))
celery_app = celery.Celery(broker=config.get('CELERY_BROKER_URL'), backend=config.get('CELERY_RESULT_URL'))
user_dir = args.userdir
pins_dir = args.pins_dir
def generate_password_hash():
key = Fernet.generate_key()
fnt = Fernet(key)
pin = str(random.randint(1000, 9999))
return fnt.encrypt(bcrypt.hashpw(pin.encode('utf-8'), bcrypt.gensalt())).decode()
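`generate_password_hash` above wraps a bcrypt digest in Fernet encryption; a stdlib-only stand-in with PBKDF2 (illustrative of the salted-hash idea, not a drop-in replacement) looks like:

```python
import hashlib
import os
import secrets

def hash_pin(pin: str) -> str:
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac('sha256', pin.encode('utf-8'), salt, 100_000)
    return salt.hex() + ':' + digest.hex()

pin = str(secrets.randbelow(9000) + 1000)   # 4-digit pin, as in the original
hashed = hash_pin(pin)
```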
user_old_dir = os.path.join(user_dir, 'old')
logg.debug(f'reading user data from: {user_old_dir}')
pins_file = open(f'{pins_dir}/pins.csv', 'w')
if __name__ == '__main__':
for x in os.walk(user_old_dir):
for y in x[2]:
# skip non-json files
if y[len(y) - 5:] != '.json':
continue
# define file path for
filepath = None
if y[:15] != '_ussd_data.json':
filepath = os.path.join(x[0], y)
f = open(filepath, 'r')
try:
o = json.load(f)
except json.decoder.JSONDecodeError as e:
f.close()
logg.error('load error for {}: {}'.format(y, e))
continue
f.close()
u = Person.deserialize(o)
phone_object = phonenumbers.parse(u.tel)
phone = phonenumbers.format_number(phone_object, phonenumbers.PhoneNumberFormat.E164)
password_hash = uuid.uuid4().hex
pins_file.write(f'{phone},{password_hash}\n')
logg.info(f'Writing phone: {phone}, password_hash: {password_hash}')
pins_file.close()


@@ -28,12 +28,11 @@ logg = logging.getLogger()
fake = Faker(['sl', 'en_US', 'no', 'de', 'ro'])
script_dir = os.path.realpath(os.path.dirname(__file__))
# config_dir = os.environ.get('CONFINI_DIR', '/usr/local/etc/cic')
config_dir = os.environ.get('CONFINI_DIR', os.path.join(script_dir, 'config'))
default_config_dir = './config'
argparser = argparse.ArgumentParser()
argparser.add_argument('-c', type=str, default=config_dir, help='Config dir')
argparser.add_argument('-c', type=str, default=default_config_dir, help='Config dir')
argparser.add_argument('--tag', type=str, action='append',
help='Tags to add to record')
argparser.add_argument('--gift-threshold', type=int,
@@ -53,7 +52,7 @@ elif args.vv:
config = confini.Config(args.c, os.environ.get('CONFINI_ENV_PREFIX'))
config.process()
logg.info('loaded config\n{}'.format(config))
logg.debug('loaded config\n{}'.format(config))
dt_now = datetime.datetime.utcnow()


@@ -9,9 +9,12 @@ COPY package.json \
package-lock.json \
.
RUN --mount=type=cache,mode=0755,target=/root/node_modules npm install
COPY requirements.txt .
RUN npm ci --production
#RUN --mount=type=cache,mode=0755,target=/root/node_modules npm install
COPY requirements.txt .
COPY config/ /usr/local/etc/data-seeding
ARG EXTRA_INDEX_URL="https://pip.grassrootseconomics.net:8433"
ARG GITLAB_PYTHON_REGISTRY="https://gitlab.com/api/v4/projects/27624814/packages/pypi/simple"


@@ -1,24 +0,0 @@
# syntax = docker/dockerfile:1.2
FROM registry.gitlab.com/grassrootseconomics/cic-base-images:python-3.8.6-dev-5ab8bf45
WORKDIR /root
RUN mkdir -vp /usr/local/etc/cic
COPY package.json \
package-lock.json \
.
RUN npm install
COPY requirements.txt .
ARG EXTRA_INDEX_URL="https://pip.grassrootseconomics.net:8433"
ARG GITLAB_PYTHON_REGISTRY="https://gitlab.com/api/v4/projects/27624814/packages/pypi/simple"
RUN pip install \
--extra-index-url $GITLAB_PYTHON_REGISTRY \
--extra-index-url $EXTRA_INDEX_URL -r requirements.txt
COPY . .
ENTRYPOINT [ ]


@@ -0,0 +1,94 @@
#!/bin/bash
if [[ -d "$OUT_DIR" ]]
then
echo "found existing IMPORT DIR, cleaning up..."
rm -rf "$OUT_DIR"
mkdir -p "$OUT_DIR"
else
echo "IMPORT DIR does not exist, creating it."
mkdir -p "$OUT_DIR"
fi
# using timeout because the timeout flag for celery inspect does not work
timeout 5 celery inspect ping -b "$CELERY_BROKER_URL"
if [[ $? -eq 124 ]]
then
>&2 echo "Celery workers not available. Is the CELERY_BROKER_URL ($CELERY_BROKER_URL) correct?"
exit 1
fi
echo "Creating seed data..."
python create_import_users.py -vv -c "$CONFIG" --dir "$OUT_DIR" "$NUMBER_OF_USERS"
wait $!
echo "Check for running celery workers ..."
if [ -f ./cic-ussd-import.pid ];
then
echo "Found a running worker. Killing ..."
kill -9 $(<cic-ussd-import.pid)
fi
echo "Purge tasks from celery worker"
celery -A cic_ussd.import_task purge -Q "$CELERY_QUEUE" --broker redis://"$REDIS_HOST":"$REDIS_PORT" -f
echo "Start celery worker and import balance job"
if [ "$INCLUDE_BALANCES" != "y" ]
then
echo "Running worker without opening balance transactions"
TARGET_TX_COUNT=$NUMBER_OF_USERS
nohup python cic_ussd/import_balance.py -vv -c "$CONFIG" -p "$ETH_PROVIDER" -r "$CIC_REGISTRY_ADDRESS" --token-symbol "$TOKEN_SYMBOL" -y "$KEYSTORE_PATH" "$OUT_DIR" > nohup.out 2> nohup.err < /dev/null &
else
echo "Running worker with opening balance transactions"
TARGET_TX_COUNT=$((NUMBER_OF_USERS*2))
nohup python cic_ussd/import_balance.py -vv -c "$CONFIG" -p "$ETH_PROVIDER" -r "$CIC_REGISTRY_ADDRESS" --include-balances --token-symbol "$TOKEN_SYMBOL" -y "$KEYSTORE_PATH" "$OUT_DIR" &
fi
echo "Target count set to ${TARGET_TX_COUNT}"
until [ -f ./cic-import-ussd.pid ]
do
echo "Polling for celery worker pid file..."
sleep 1
done
IMPORT_BALANCE_JOB=$(<cic-import-ussd.pid)
echo "Start import users job"
if [ "$USSD_SSL" == "y" ]
then
echo "Targeting secure ussd-user server"
python cic_ussd/import_users.py -vv -c "$CONFIG" --ussd-host "$USSD_HOST" --ussd-port "$USSD_PORT" "$OUT_DIR"
else
python cic_ussd/import_users.py -vv -c "$CONFIG" --ussd-host "$USSD_HOST" --ussd-port "$USSD_PORT" --ussd-no-ssl "$OUT_DIR"
fi
echo "Waiting for import balance job to complete ..."
tail --pid="$IMPORT_BALANCE_JOB" -f /dev/null
set -e
echo "Importing pins"
python cic_ussd/import_pins.py -c "$CONFIG" -vv "$OUT_DIR"
set +e
wait $!
set -e
echo "Importing ussd data"
python cic_ussd/import_ussd_data.py -c "$CONFIG" -vv "$OUT_DIR"
set +e
wait $!
echo "Importing person metadata"
node cic_meta/import_meta.js "$OUT_DIR" "$NUMBER_OF_USERS"
echo "Import preferences metadata"
node cic_meta/import_meta_preferences.js "$OUT_DIR" "$NUMBER_OF_USERS"
CIC_NOTIFY_DATABASE=postgres://$DATABASE_USER:$DATABASE_PASSWORD@$DATABASE_HOST:$DATABASE_PORT/$NOTIFY_DATABASE_NAME
NOTIFICATION_COUNT=$(psql -qtA "$CIC_NOTIFY_DATABASE" -c 'SELECT COUNT(message) FROM notification WHERE message IS NOT NULL')
while [[ "$NOTIFICATION_COUNT" -lt "$TARGET_TX_COUNT" ]]
do
NOTIFICATION_COUNT=$(psql -qtA "$CIC_NOTIFY_DATABASE" -c 'SELECT COUNT(message) FROM notification WHERE message IS NOT NULL')
sleep 5
echo "Notification count is: ${NOTIFICATION_COUNT} of ${TARGET_TX_COUNT}. Checking again in 5 seconds ..."
done
echo "Running verify script."
python verify.py -c "$CONFIG" -v -p "$ETH_PROVIDER" -r "$CIC_REGISTRY_ADDRESS" --exclude "$EXCLUSIONS" --meta-provider "$META_URL" --token-symbol "$TOKEN_SYMBOL" --ussd-provider "$USSD_PROVIDER" "$OUT_DIR"

File diff suppressed because it is too large


@@ -1,13 +1,14 @@
sarafu-faucet~=0.0.7a1
cic-eth[tools]~=0.12.4a7
cic-types~=0.1.0a14
crypto-dev-signer>=0.4.15a1,<=0.4.15
sarafu-faucet~=0.0.7a2
cic-eth[tools]~=0.12.4a8
cic-types~=0.1.0a15
crypto-dev-signer~=0.4.15a7
faker==4.17.1
chainsyncer~=0.0.6a3
chainlib-eth~=0.0.9a7
chainlib-eth~=0.0.9a14
eth-address-index~=0.2.3a4
eth-contract-registry~=0.6.3a3
eth-accounts-index~=0.1.2a3
eth-erc20~=0.1.2a2
eth-erc20~=0.1.2a3
erc20-faucet~=0.3.2a2
psycopg2==2.8.6
liveness~=0.0.1a7

View File

@@ -64,6 +64,10 @@ phone_tests = [
'ussd_pins'
]
admin_tests = [
'local_key',
]
all_tests = eth_tests + custodial_tests + metadata_tests + phone_tests
argparser = argparse.ArgumentParser(description='daemon that monitors transactions in new blocks')
@@ -74,6 +78,7 @@ argparser.add_argument('-i', '--chain-spec', type=str, dest='i', help='chain spe
argparser.add_argument('--meta-provider', type=str, dest='meta_provider', default='http://localhost:63380', help='cic-meta url')
argparser.add_argument('--ussd-provider', type=str, dest='ussd_provider', default='http://localhost:63315', help='cic-ussd url')
argparser.add_argument('--skip-custodial', dest='skip_custodial', action='store_true', help='skip all custodial verifications')
argparser.add_argument('--skip-ussd', dest='skip_ussd', action='store_true', help='skip all ussd verifications')
argparser.add_argument('--skip-metadata', dest='skip_metadata', action='store_true', help='skip all metadata verifications')
argparser.add_argument('--exclude', action='append', type=str, default=[], help='skip specified verification')
argparser.add_argument('--include', action='append', type=str, help='include specified verification')
@@ -134,6 +139,11 @@ if args.skip_custodial:
for t in custodial_tests:
if t not in exclude:
exclude.append(t)
if args.skip_ussd:
logg.info('will skip all ussd verifications ({})'.format(','.join(phone_tests)))
for t in phone_tests:
if t not in exclude:
exclude.append(t)
if args.skip_metadata:
logg.info('will skip all metadata verifications ({})'.format(','.join(metadata_tests)))
for t in metadata_tests:
@@ -154,20 +164,8 @@ for t in custodial_tests:
logg.info('activating custodial module {}'.format(t))
break
cols = os.get_terminal_size().columns
def to_terminalwidth(s):
ss = s.ljust(int(cols)-1)
ss += "\r"
return ss
def default_outfunc(s):
ss = to_terminalwidth(s)
sys.stdout.write(ss)
outfunc = default_outfunc
if logg.isEnabledFor(logging.DEBUG):
outfunc = logg.debug
outfunc = logg.debug
def send_ussd_request(address, data_dir):
@@ -187,9 +185,10 @@ def send_ussd_request(address, data_dir):
phone = p.tel
session = uuid.uuid4().hex
valid_service_codes = config.get('USSD_SERVICE_CODE').split(",")
data = {
'sessionId': session,
'serviceCode': config.get('APP_SERVICE_CODE'),
'serviceCode': valid_service_codes[0],
'phoneNumber': phone,
'text': '',
}


@@ -60,8 +60,6 @@ services:
contract-migration:
image: registry.gitlab.com/grassrootseconomics/cic-internal-integration/contract-migration:${TAG:-latest}
profiles:
- migrations
build:
context: apps/contract-migration
dockerfile: docker/Dockerfile
@@ -72,16 +70,14 @@ services:
EXTRA_PIP_ARGS: $EXTRA_PIP_ARGS
# image: registry.gitlab.com/grassrootseconomics/cic-internal-integration/contract-migration:latest
environment:
CIC_REGISTRY_ADDRESS: $CIC_REGISTRY_ADDRESS
ETH_PROVIDER: ${CIC_HTTP_PROVIDER:-http://eth:8545}
RPC_HTTP_PROVIDER: ${CIC_HTTP_PROVIDER:-http://eth:8545}
# And these two are for wait-for-it (could parse this)
RPC_PROVIDER: ${RPC_PROVIDER:-http://eth:8545}
ETH_PROVIDER: ${RPC_PROVIDER:-http://eth:8545}
RPC_HTTP_PROVIDER: ${RPC_PROVIDER:-http://eth:8545}
DEV_USE_DOCKER_WAIT_SCRIPT: 1
ETH_PROVIDER_HOST: eth
ETH_PROVIDER_PORT: 8545
CHAIN_SPEC: ${CHAIN_SPEC:-evm:bloxberg:8996}
CIC_CHAIN_SPEC: ${CHAIN_SPEC:-evm:bloxberg:8996}
CIC_DATA_DIR: ${CIC_DATA_DIR:-/tmp/cic/config}
DEV_DATA_DIR: ${DEV_DATA_DIR:-/tmp/cic/config}
DEV_CONFIG_RESET: $DEV_CONFIG_RESET
DATABASE_HOST: ${DATABASE_HOST:-postgres}
DATABASE_PORT: ${DATABASE_PORT:-5432}
DATABASE_NAME: ${DEV_DATABASE_NAME_CIC_ETH:-cic_eth}
@@ -95,18 +91,19 @@ services:
CELERY_RESULT_URL: ${CELERY_RESULT_URL:-redis://redis:6379}
DEV_PIP_EXTRA_INDEX_URL: ${DEV_PIP_EXTRA_INDEX_URL:-https://pip.grassrootseconomics.net:8433}
RUN_MASK: ${RUN_MASK:-0} # bit flags; 1: contract migrations 2: seed data
DEV_FAUCET_AMOUNT: ${DEV_FAUCET_AMOUNT:-0}
#DEV_SARAFU_DEMURRAGE_LEVEL: ${DEV_SARAFU_DEMURRAGE_LEVEL:-196454828847045000000000000000000}
DEV_ETH_GAS_PRICE: ${DEV_ETH_GAS_PRICE:-1}
CIC_DEFAULT_TOKEN_SYMBOL: $CIC_DEFAULT_TOKEN_SYMBOL
TOKEN_NAME: $TOKEN_NAME
DEV_FAUCET_AMOUNT: ${DEV_FAUCET_AMOUNT:-50000000}
DEV_ETH_GAS_PRICE: $DEV_ETH_GAS_PRICE
TOKEN_NAME: ${TOKEN_NAME:-Giftable Token}
TOKEN_SYMBOL: ${TOKEN_SYMBOL:-GFT}
TOKEN_TYPE: ${TOKEN_TYPE:-giftable_erc20_token}
TOKEN_DECIMALS: $TOKEN_DECIMALS
TOKEN_REDISTRIBUTION_PERIOD: $TOKEN_REDISTRIBUTION_PERIOD
TOKEN_REDISTRIBUTION_PERIOD: $TOKEN_DEMURRAGE_REDISTRIBUTION_PERIOD
TASKS_TRANSFER_CALLBACKS: ${TASKS_TRANSFER_CALLBACKS:-"cic-eth:cic_eth.callbacks.noop.noop,cic-ussd:cic_ussd.tasks.callback_handler.transaction_callback"}
TOKEN_SUPPLY_LIMIT: $TOKEN_SUPPLY_LIMIT
TOKEN_DEMURRAGE_LEVEL: ${TOKEN_DEMURRAGE_LEVEL:-196454828847045000000000000000000}
TOKEN_DEMURRAGE_LEVEL: $TOKEN_DEMURRAGE_LEVEL
TOKEN_SINK_ADDRESS: $TOKEN_SINK_ADDRESS
TOKEN_TYPE: $TOKEN_TYPE
#CONFINI_DIR: ${CONFINI_DIR:-/tmp/cic/config}
SIGNER_PROVIDER: ${SIGNER_SOCKET_PATH:-http://cic-eth-signer:8000}
restart: on-failure
command: ["./run_job.sh"]
#command: ["./reset.sh"]
depends_on:
@@ -118,10 +115,50 @@ services:
- contract-config:/tmp/cic/config
data-seeding:
image: registry.gitlab.com/grassrootseconomics/cic-internal-integration/data-seeding:${TAG:-latest}
build:
context: apps/data-seeding
dockerfile: docker/Dockerfile
args:
pip_index_url: ${PIP_DEFAULT_INDEX_URL:-https://pypi.org/simple}
pip_extra_args: $PIP_EXTRA_ARGS
environment:
CIC_REGISTRY_ADDRESS: ${CIC_REGISTRY_ADDRESS:-0xea6225212005e86a4490018ded4bf37f3e772161}
OUT_DIR: out
NUMBER_OF_USERS: 10
CONFIG: /usr/local/etc/data-seeding
CIC_CHAIN_SPEC: ${CIC_CHAIN_SPEC:-evm:bloxberg:8996}
ETH_PROVIDER: ${CIC_HTTP_PROVIDER:-http://eth:8545}
TOKEN_SYMBOL: GFT
KEYSTORE_PATH: keystore/UTC--2021-01-08T17-18-44.521011372Z--eb3907ecad74a0013c259d5874ae7f22dcbcc95c
USSD_HOST: cic-user-ussd-server
USSD_PORT: 9000
INCLUDE_BALANCES: y
USSD_SSL: n
DATABASE_NAME: ${DATABASE_NAME:-cic_ussd}
DATABASE_USER: ${DATABASE_USER:-grassroots}
DATABASE_PASSWORD: ${DATABASE_PASSWORD:-tralala} # this is set at initdb, see: postgres/initdb/create_db.sql
DATABASE_HOST: ${DATABASE_HOST:-postgres}
DATABASE_PORT: ${DATABASE_PORT:-5432}
CELERY_BROKER_URL: ${CELERY_BROKER_URL:-redis://redis:6379}
CELERY_RESULT_URL: ${CELERY_RESULT_URL:-redis://redis:6379}
NOTIFY_DATABASE_NAME: cic_notify
REDIS_HOST: redis
REDIS_PORT: 6379
REDIS_DB: 0
META_HOST: meta
META_PORT: 8000
META_URL: http://meta:8000
USSD_PROVIDER: http://cic-user-ussd-server:9000
CELERY_QUEUE: cic-import-ussd
EXCLUSIONS: ussd
command: bash import_ussd.sh
volumes:
- contract-config:/tmp/cic/config/:ro
cic-cache-tracker:
image: registry.gitlab.com/grassrootseconomics/cic-internal-integration/cic-cache:${TAG:-latest}
profiles:
- cache
build:
context: apps/cic-cache
dockerfile: docker/Dockerfile
@@ -129,8 +166,8 @@ services:
EXTRA_INDEX_URL: ${EXTRA_INDEX_URL:-https://pip.grassrootseconomics.net:8433}
environment:
CIC_REGISTRY_ADDRESS: $CIC_REGISTRY_ADDRESS # supplied at contract-config after contract provisioning
ETH_PROVIDER: ${RPC_HTTP_PROVIDER:-http://eth:8545}
RPC_HTTP_PROVIDER: ${RPC_HTTP_PROVIDER:-http://eth:8545}
ETH_PROVIDER: ${RPC_PROVIDER:-http://eth:8545}
RPC_PROVIDER: ${RPC_PROVIDER:-http://eth:8545}
DATABASE_USER: ${DATABASE_USER:-grassroots}
DATABASE_PASSWORD: ${DATABASE_PASSWORD:-tralala} # this is set at initdb, see: postgres/initdb/create_db.sql
DATABASE_HOST: ${DATABASE_HOST:-postgres}
@@ -144,9 +181,7 @@ services:
CIC_CHAIN_SPEC: ${CHAIN_SPEC:-evm:bloxberg:8996}
CELERY_BROKER_URL: redis://redis:6379
CELERY_RESULT_URL: redis://redis:6379
deploy:
restart_policy:
condition: on-failure
restart: on-failure
depends_on:
- redis
- postgres
@@ -155,15 +190,13 @@ services:
- /bin/bash
- -c
- |
if [[ -f /tmp/cic/config/.env ]]; then source /tmp/cic/config/.env; fi
if [[ -f /tmp/cic/config/env_reset ]]; then source /tmp/cic/config/env_reset; fi
./start_tracker.sh -c /usr/local/etc/cic-cache -vv
volumes:
- contract-config:/tmp/cic/config/:ro
cic-cache-tasker:
image: registry.gitlab.com/grassrootseconomics/cic-internal-integration/cic-cache:${TAG:-latest}
profiles:
- cache
build:
context: apps/cic-cache
dockerfile: docker/Dockerfile
@@ -185,9 +218,7 @@ services:
CIC_CHAIN_SPEC: ${CIC_CHAIN_SPEC:-evm:bloxberg:8996}
CELERY_BROKER_URL: redis://redis:6379
CELERY_RESULT_URL: redis://redis:6379
deploy:
restart_policy:
condition: on-failure
restart: unless-stopped
depends_on:
- redis
- postgres
@@ -196,15 +227,13 @@ services:
- /bin/bash
- -c
- |
if [[ -f /tmp/cic/config/.env ]]; then source /tmp/cic/config/.env; fi
if [[ -f /tmp/cic/config/env_reset ]]; then source /tmp/cic/config/env_reset; fi
/usr/local/bin/cic-cache-taskerd -vv
volumes:
- contract-config:/tmp/cic/config/:ro
cic-cache-server:
image: registry.gitlab.com/grassrootseconomics/cic-internal-integration/cic-cache:${TAG:-latest}
profiles:
- cache
build:
context: apps/cic-cache
dockerfile: docker/Dockerfile
@@ -219,18 +248,18 @@ services:
DATABASE_DEBUG: 1
#PGPASSWORD: $DATABASE_PASSWORD
SERVER_PORT: 8000
restart: on-failure
ports:
- ${HTTP_PORT_CIC_CACHE:-63313}:8000
depends_on:
- postgres
deploy:
restart_policy:
condition: on-failure
- cic-cache-tasker
- cic-cache-tracker
command:
- /bin/bash
- -c
- |
if [[ -f /tmp/cic/config/.env ]]; then source /tmp/cic/config/.env; fi
if [[ -f /tmp/cic/config/env_reset ]]; then source /tmp/cic/config/env_reset; fi
"/usr/local/bin/uwsgi" \
--wsgi-file /root/cic_cache/runnable/daemons/server.py \
--http :8000 \
@@ -249,7 +278,7 @@ services:
CIC_REGISTRY_ADDRESS: $CIC_REGISTRY_ADDRESS
ETH_GAS_PROVIDER_ADDRESS: $DEV_ETH_ACCOUNT_GAS_PROVIDER
ETH_PROVIDER: ${ETH_PROVIDER:-http://eth:8545}
RPC_HTTP_PROVIDER: ${ETH_PROVIDER:-http://eth:8545}
RPC_PROVIDER: ${ETH_PROVIDER:-http://eth:8545}
DATABASE_USER: ${DATABASE_USER:-grassroots}
DATABASE_HOST: ${DATABASE_HOST:-postgres}
DATABASE_PASSWORD: ${DATABASE_PASSWORD:-tralala}
@@ -267,28 +296,81 @@ services:
CELERY_BROKER_URL: ${CELERY_BROKER_URL:-redis://redis}
CELERY_RESULT_URL: ${CELERY_RESULT_URL:-redis://redis}
CELERY_DEBUG: ${CELERY_DEBUG:-1}
SIGNER_SOCKET_PATH: ${SIGNER_SOCKET_PATH:-ipc:///run/crypto-dev-signer/jsonrpc.ipc}
SIGNER_PROVIDER: ${SIGNER_SOCKET_PATH:-ipc:///run/crypto-dev-signer/jsonrpc.ipc}
#SIGNER_SOCKET_PATH: ${SIGNER_SOCKET_PATH:-http://cic-eth-signer:8000}
SIGNER_PROVIDER: ${SIGNER_PROVIDER:-http://cic-eth-signer:8000}
SIGNER_SECRET: ${SIGNER_SECRET:-deadbeef}
ETH_ACCOUNT_ACCOUNTS_INDEX_WRITER: ${DEV_ETH_ACCOUNT_ACCOUNTS_INDEX_WRITER:-0xACB0BC74E1686D62dE7DC6414C999EA60C09F0eA}
TASKS_TRACE_QUEUE_STATUS: ${TASKS_TRACE_QUEUE_STATUS:-1}
CIC_DEFAULT_TOKEN_SYMBOL: ${CIC_DEFAULT_TOKEN_SYMBOL:-GFT}
restart: unless-stopped
depends_on:
- eth
- postgres
- redis
deploy:
restart_policy:
condition: on-failure
- cic-eth-signer
volumes:
- signer-data:/tmp/cic/signer
- signer-data:/run/crypto-dev-signer
- contract-config:/tmp/cic/config/:ro
#command: ["/usr/local/bin/cic-eth-taskerd"]
#command: ["sleep", "3600"]
command:
- /bin/bash
- -c
- |
if [[ -f /tmp/cic/config/.env ]]; then source /tmp/cic/config/.env; fi
if [[ -f /tmp/cic/config/env_reset ]]; then source /tmp/cic/config/env_reset; fi
./start_tasker.sh --aux-all -q cic-eth -vv
cic-eth-signer:
image: registry.gitlab.com/grassrootseconomics/cic-internal-integration/cic-eth:${TAG:-latest}
build:
context: apps/cic-eth
dockerfile: docker/Dockerfile
target: dev
args:
EXTRA_INDEX_URL: ${EXTRA_INDEX_URL:-https://pip.grassrootseconomics.net:8433}
environment:
CIC_REGISTRY_ADDRESS: $CIC_REGISTRY_ADDRESS
ETH_GAS_PROVIDER_ADDRESS: $DEV_ETH_ACCOUNT_GAS_PROVIDER
ETH_PROVIDER: ${ETH_PROVIDER:-http://eth:8545}
RPC_PROVIDER: ${ETH_PROVIDER:-http://eth:8545}
DATABASE_USER: ${DATABASE_USER:-grassroots}
DATABASE_HOST: ${DATABASE_HOST:-postgres}
DATABASE_PASSWORD: ${DATABASE_PASSWORD:-tralala}
DATABASE_NAME: ${DATABASE_NAME_CIC_ETH:-cic_eth}
DATABASE_PORT: ${DATABASE_PORT:-5432}
DATABASE_ENGINE: ${DATABASE_ENGINE:-postgres}
DATABASE_DRIVER: ${DATABASE_DRIVER:-psycopg2}
DATABASE_DEBUG: ${DATABASE_DEBUG:-0}
DATABASE_POOL_SIZE: 0
REDIS_PORT: 6379
REDIS_HOST: redis
PGPASSWORD: ${DATABASE_PASSWORD:-tralala}
CIC_CHAIN_SPEC: ${CIC_CHAIN_SPEC:-evm:bloxberg:8996}
CHAIN_SPEC: ${CIC_CHAIN_SPEC:-evm:bloxberg:8996}
CELERY_BROKER_URL: ${CELERY_BROKER_URL:-redis://redis}
CELERY_RESULT_URL: ${CELERY_RESULT_URL:-redis://redis}
CELERY_DEBUG: ${CELERY_DEBUG:-1}
SIGNER_SOCKET_PATH: ${SIGNER_SOCKET_PATH:-http://0.0.0.0:8000}
SIGNER_PROVIDER: ${SIGNER_SOCKET_PATH:-http://0.0.0.0:8000}
SIGNER_SECRET: ${SIGNER_SECRET:-deadbeef}
ETH_ACCOUNT_ACCOUNTS_INDEX_WRITER: ${DEV_ETH_ACCOUNT_ACCOUNTS_INDEX_WRITER:-0xACB0BC74E1686D62dE7DC6414C999EA60C09F0eA}
TASKS_TRACE_QUEUE_STATUS: ${TASKS_TRACE_QUEUE_STATUS:-1}
CIC_DEFAULT_TOKEN_SYMBOL: ${CIC_DEFAULT_TOKEN_SYMBOL:-GFT}
restart: on-failure
depends_on:
- eth
- postgres
- redis
volumes:
- signer-data:/run/crypto-dev-signer
- contract-config:/tmp/cic/config/:ro
command: ["python", "/usr/local/bin/crypto-dev-daemon", "-c", "/usr/local/etc/crypto-dev-signer", "-vv"]
#command:
# - /bin/bash
# - -c
# - |
# if [[ -f /tmp/cic/config/env_reset ]]; then source /tmp/cic/config/env_reset; fi
# ./start_tasker.sh --aux-all -q cic-eth -vv
# command: [/bin/sh, "./start_tasker.sh", -q, cic-eth, -vv ]
cic-eth-tracker:
@@ -300,8 +382,8 @@ services:
args:
EXTRA_INDEX_URL: ${EXTRA_INDEX_URL:-https://pip.grassrootseconomics.net:8433}
environment:
RPC_HTTP_PROVIDER: ${RPC_HTTP_PROVIDER:-http://eth:8545}
ETH_PROVIDER: ${RPC_HTTP_PROVIDER:-http://eth:8545}
RPC_PROVIDER: ${RPC_PROVIDER:-http://eth:8545}
ETH_PROVIDER: ${RPC_PROVIDER:-http://eth:8545}
DATABASE_USER: ${DATABASE_USER:-grassroots}
DATABASE_HOST: ${DATABASE_HOST:-postgres}
DATABASE_PASSWORD: ${DATABASE_PASSWORD:-tralala}
@@ -313,24 +395,21 @@ services:
CIC_CHAIN_SPEC: ${CHAIN_SPEC:-evm:bloxberg:8996}
CHAIN_SPEC: ${CHAIN_SPEC:-evm:bloxberg:8996}
CIC_REGISTRY_ADDRESS: $CIC_REGISTRY_ADDRESS
#BANCOR_DIR: $BANCOR_DIR
CELERY_BROKER_URL: ${CELERY_BROKER_URL:-redis://redis}
CELERY_RESULT_URL: ${CELERY_RESULT_URL:-redis://redis}
TASKS_TRANSFER_CALLBACKS: $TASKS_TRANSFER_CALLBACKS
TASKS_TRANSFER_CALLBACKS: ${TASKS_TRANSFER_CALLBACKS:-"cic-eth:cic_eth.callbacks.noop.noop,cic-ussd:cic_ussd.tasks.callback_handler.transaction_callback"}
restart: on-failure
depends_on:
- eth
- postgres
- redis
deploy:
restart_policy:
condition: on-failure
volumes:
- contract-config:/tmp/cic/config/:ro
command:
- /bin/bash
- -c
- |
if [[ -f /tmp/cic/config/.env ]]; then source /tmp/cic/config/.env; fi
if [[ -f /tmp/cic/config/env_reset ]]; then source /tmp/cic/config/env_reset; fi
#./start_tracker.sh -vv -c /usr/local/etc/cic-eth
./start_tracker.sh -vv
# command: "/root/start_manager.sh head -vv"
@@ -346,7 +425,7 @@ services:
EXTRA_INDEX_URL: ${EXTRA_INDEX_URL:-https://pip.grassrootseconomics.net:8433}
environment:
ETH_PROVIDER: http://eth:8545
RPC_HTTP_PROVIDER: http://eth:8545
RPC_PROVIDER: http://eth:8545
DATABASE_USER: ${DATABASE_USER:-grassroots}
DATABASE_HOST: ${DATABASE_HOST:-postgres}
DATABASE_PASSWORD: ${DATABASE_PASSWORD:-tralala}
@@ -363,22 +442,21 @@ services:
TASKS_TRANSFER_CALLBACKS: $TASKS_TRANSFER_CALLBACKS
DATABASE_DEBUG: ${DATABASE_DEBUG:-false}
#DATABASE_DEBUG: 1
restart: on-failure
depends_on:
- eth
- postgres
- redis
deploy:
restart_policy:
condition: on-failure
volumes:
- contract-config:/tmp/cic/config/:ro
command:
- /bin/bash
- -c
- |
if [[ -f /tmp/cic/config/.env ]]; then source /tmp/cic/config/.env; fi
if [[ -f /tmp/cic/config/env_reset ]]; then source /tmp/cic/config/env_reset; fi
./start_dispatcher.sh -q cic-eth -vv
# command: "/root/start_dispatcher.sh -q cic-eth -vv"
cic-eth-retrier:
@@ -391,7 +469,7 @@ services:
EXTRA_INDEX_URL: ${EXTRA_INDEX_URL:-https://pip.grassrootseconomics.net:8433}
environment:
ETH_PROVIDER: http://eth:8545
RPC_HTTP_PROVIDER: http://eth:8545
RPC_PROVIDER: http://eth:8545
DATABASE_USER: ${DATABASE_USER:-grassroots}
DATABASE_HOST: ${DATABASE_HOST:-postgres}
DATABASE_PASSWORD: ${DATABASE_PASSWORD:-tralala}
@@ -410,20 +488,18 @@ services:
CIC_TX_RETRY_DELAY: 60
BATCH_SIZE: ${RETRIER_BATCH_SIZE:-50}
#DATABASE_DEBUG: 1
restart: on-failure
depends_on:
- eth
- postgres
- redis
deploy:
restart_policy:
condition: on-failure
volumes:
- contract-config:/tmp/cic/config/:ro
command:
- /bin/bash
- -c
- |
if [[ -f /tmp/cic/config/.env ]]; then source /tmp/cic/config/.env; fi
if [[ -f /tmp/cic/config/env_reset ]]; then source /tmp/cic/config/env_reset; fi
./start_retry.sh -vv
# command: "/root/start_retry.sh -q cic-eth -vv"
@@ -449,19 +525,15 @@ services:
AFRICASTALKING_API_USERNAME: $AFRICASTALKING_API_USERNAME
AFRICASTALKING_API_KEY: $AFRICASTALKING_API_KEY
AFRICASTALKING_API_SENDER_ID: $AFRICASTALKING_API_SENDER_ID
restart: unless-stopped
depends_on:
- postgres
- redis
deploy:
restart_policy:
condition: on-failure
command: "/root/start_tasker.sh -q cic-notify -vv"
cic-meta-server:
image: registry.gitlab.com/grassrootseconomics/cic-internal-integration/cic-meta:${TAG:-latest}
profiles:
- custodial-meta
hostname: meta
build:
context: apps/cic-meta
@@ -483,21 +555,17 @@ services:
PGP_PUBLICKEY_ACTIVE_FILE: publickeys.asc
PGP_PUBLICKEY_ENCRYPT_FILE: publickeys.asc
SCHEMA_SQL_PATH: scripts/initdb/server.postgres.sql
restart: on-failure
ports:
- ${HTTP_PORT_CIC_META:-63380}:8000
depends_on:
- postgres
deploy:
restart_policy:
condition: on-failure
volumes:
- ./apps/contract-migration/testdata/pgp/:/tmp/cic/pgp
# command: "/root/start_server.sh -vv"
cic-user-ussd-server:
image: registry.gitlab.com/grassrootseconomics/cic-internal-integration/cic-ussd:${TAG:-latest}
profiles:
- custodial-ussd
build:
context: apps/cic-ussd
dockerfile: docker/Dockerfile
@@ -515,6 +583,7 @@ services:
PGP_PASSPHRASE: merman
SERVER_PORT: 9000
CIC_META_URL: ${CIC_META_URL:-http://meta:8000}
restart: on-failure
ports:
- ${HTTP_PORT_CIC_USER_USSD_SERVER:-63315}:9000
depends_on:
@@ -522,15 +591,10 @@ services:
- redis
volumes:
- ./apps/contract-migration/testdata/pgp/:/usr/src/secrets/
deploy:
restart_policy:
condition: on-failure
command: "/root/start_cic_user_ussd_server.sh -vv"
cic-user-server:
image: registry.gitlab.com/grassrootseconomics/cic-internal-integration/cic-ussd:${TAG:-latest}
profiles:
- custodial-ussd
build:
context: apps/cic-ussd
dockerfile: docker/Dockerfile
@@ -544,19 +608,15 @@ services:
DATABASE_ENGINE: postgresql
DATABASE_DRIVER: psycopg2
DATABASE_POOL_SIZE: 0
restart: on-failure
ports:
- ${HTTP_PORT_CIC_USER_SERVER:-63415}:9500
depends_on:
- postgres
deploy:
restart_policy:
condition: on-failure
command: "/root/start_cic_user_server.sh -vv"
cic-user-tasker:
image: registry.gitlab.com/grassrootseconomics/cic-internal-integration/cic-ussd:${TAG:-latest}
profiles:
- custodial-ussd
build:
context: apps/cic-ussd/
dockerfile: docker/Dockerfile
@@ -574,12 +634,10 @@ services:
CELERY_RESULT_URL: ${CELERY_BROKER_URL:-redis://redis}
PGP_PASSPHRASE: merman
CIC_META_URL: ${CIC_META_URL:-http://meta:8000}
restart: unless-stopped
depends_on:
- postgres
- redis
volumes:
- ./apps/contract-migration/testdata/pgp/:/usr/src/secrets/
deploy:
restart_policy:
condition: on-failure
command: "/root/start_cic_user_tasker.sh -q cic-ussd -vv"
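The contract-migration service above documents `RUN_MASK` inline as bit flags (1: contract migrations, 2: seed data). A sketch of how such a mask might be interpreted; the function and flag names are assumptions, only the bit values come from the comment in the compose file:

```python
RUN_CONTRACT_MIGRATIONS = 1  # bit 0, per the inline comment
RUN_SEED_DATA = 2            # bit 1, per the inline comment

def jobs_for(run_mask: int) -> list:
    """Return the job names selected by a RUN_MASK bit-flag value."""
    jobs = []
    if run_mask & RUN_CONTRACT_MIGRATIONS:
        jobs.append('contract migrations')
    if run_mask & RUN_SEED_DATA:
        jobs.append('seed data')
    return jobs
```

With this scheme the default `RUN_MASK: 0` runs nothing, `1` runs only the contract migrations, and `3` runs both steps.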

View File

@@ -1,10 +0,0 @@
apiVersion: v1
kind: ConfigMap
metadata:
name: cic-auth-proxy-credentials-configmap
namespace: grassroots
data:
credentials.yaml: |
level: 9
items:
user: 1

View File

@@ -1,10 +0,0 @@
apiVersion: v1
kind: ConfigMap
metadata:
name: cic-auth-proxy-acl-configmap
namespace: grassroots
data:
F3FAF668E82EF5124D5187BAEF26F4682343F692: |
- "^/user(/.*)?$":
read:
- user

View File

@@ -1,114 +0,0 @@
# https://kubernetes.io/docs/concepts/workloads/controllers/deployment/
apiVersion: apps/v1
kind: Deployment
metadata:
name: cic-auth-proxy-meta
namespace: grassroots
labels:
app: cic-auth-proxy-meta
group: cic
annotations:
keel.sh/policy: "glob:master-*"
keel.sh/trigger: poll
keel.sh/pollSchedule: "@every 5m"
spec:
selector:
matchLabels:
app: cic-auth-proxy-meta
replicas: 1
template:
metadata:
labels:
app: cic-auth-proxy-meta
group: cic
spec:
containers:
- name: cic-auth-proxy-meta
#image: registry.gitlab.com/grassrootseconomics/cic-auth-proxy:master-c05fafbf-1627493790 # {"$imagepolicy": "flux-system:cic-auth-proxy"}
image: registry.gitlab.com/grassrootseconomics/cic-auth-proxy:latest
imagePullPolicy: Always
resources:
requests:
cpu: 50m
memory: 100Mi
limits:
cpu: 100m
memory: 200Mi
env:
- name: PROXY_HOST
value: cic-meta-server
- name: PROXY_PORT
value: "80"
- name: PROXY_PATH_PREFIX
value: "/"
- name: HTTP_AUTH_ORIGIN
value: https://meta-auth.dev.grassrootseconomics.net:443
- name: HTTP_AUTH_REALM
value: GE
- name: ACL_CREDENTIALS_ENDPOINT
value: http://key-server:8081/
- name: ACL_PATH
value: /data/acls/F3FAF668E82EF5124D5187BAEF26F4682343F692
- name: GPG_PUBLICKEYS_ENDPOINT
value: http://key-server:8080/.well-known/publickeys/
- name: GPG_SIGNATURE_ENDPOINT
value: http://key-server:8080/.well-known/signature/
- name: GPG_TRUSTED_PUBLICKEY_FINGERPRINT # fingerprint of trusted key
value: CCE2E1D2D0E36ADE0405E2D0995BB21816313BD5
- name: GPG_HOMEDIR
value: /usr/local/etc/cic-auth-proxy/.gnupg/
- name: GPG_IMPORT_DIR
value: /usr/local/etc/cic-auth-proxy/import/
- name: GPG_PUBLICKEY_FILENAME
value: publickeys.asc
- name: GPG_SIGNATURE_FILENAME
value: signature.asc
- name: GPG_TRUSTED_PUBLICKEY_MATERIAL
value: /usr/local/etc/cic-auth-proxy/trusted/trustedpublickey.asc
ports:
- containerPort: 8080
name: http
volumeMounts:
- name: acl-config
mountPath: /data/acls/
readOnly: true
- name: credentials-config
mountPath: /data/noop/
readOnly: true
- name: trusted-publickey
mountPath: /usr/local/etc/cic-auth-proxy/trusted/
- name: gpg-homedir
mountPath: /usr/local/etc/cic-auth-proxy/.gnupg
- name: pgp-meta-test
mountPath: /usr/local/etc/cic-auth-proxy/import
volumes:
- name: pgp-meta-test
configMap:
name: pgp-meta-test
- name: acl-config
configMap:
name: cic-auth-proxy-acl-configmap
- name: credentials-config
configMap:
name: cic-auth-proxy-credentials-configmap
- name: trusted-publickey
configMap:
name: pgp-trusted-publickey
- name: gpg-homedir
emptyDir: {}
---
# https://kubernetes.io/docs/concepts/services-networking/service/
apiVersion: v1
kind: Service
metadata:
name: cic-auth-proxy-meta
namespace: grassroots
spec:
selector:
app: cic-auth-proxy-meta
type: ClusterIP
ports:
- name: http
protocol: TCP
port: 80
targetPort: 8080

View File

@@ -1,114 +0,0 @@
# https://kubernetes.io/docs/concepts/workloads/controllers/deployment/
apiVersion: apps/v1
kind: Deployment
metadata:
name: cic-auth-proxy-user
namespace: grassroots
labels:
app: cic-auth-proxy-user
group: cic
annotations:
keel.sh/policy: "glob:master-*"
keel.sh/trigger: poll
keel.sh/pollSchedule: "@every 5m"
spec:
selector:
matchLabels:
app: cic-auth-proxy-user
replicas: 1
template:
metadata:
labels:
app: cic-auth-proxy-user
group: cic
spec:
containers:
- name: cic-auth-proxy-user
#image: registry.gitlab.com/grassrootseconomics/cic-auth-proxy:master-c05fafbf-1627493790 # {"$imagepolicy": "flux-system:cic-auth-proxy"}
image: registry.gitlab.com/grassrootseconomics/cic-auth-proxy:latest
imagePullPolicy: Always
resources:
requests:
cpu: 50m
memory: 100Mi
limits:
cpu: 100m
memory: 200Mi
env:
- name: PROXY_HOST
value: cic-user-server
- name: PROXY_PORT
value: "80"
- name: PROXY_PATH_PREFIX
value: "/"
- name: HTTP_AUTH_ORIGIN
value: https://meta-auth.dev.grassrootseconomics.net:443
- name: HTTP_AUTH_REALM
value: GE
- name: ACL_CREDENTIALS_ENDPOINT
value: http://key-server:8081/
- name: ACL_PATH
value: /data/acls/F3FAF668E82EF5124D5187BAEF26F4682343F692
- name: GPG_PUBLICKEYS_ENDPOINT
value: http://key-server:8080/.well-known/publickeys/
- name: GPG_SIGNATURE_ENDPOINT
value: http://key-server:8080/.well-known/signature/
- name: GPG_TRUSTED_PUBLICKEY_FINGERPRINT # fingerprint of trusted key
value: CCE2E1D2D0E36ADE0405E2D0995BB21816313BD5
- name: GPG_HOMEDIR
value: /usr/local/etc/cic-auth-proxy/.gnupg/
- name: GPG_IMPORT_DIR
value: /usr/local/etc/cic-auth-proxy/import/
- name: GPG_PUBLICKEY_FILENAME
value: publickeys.asc
- name: GPG_SIGNATURE_FILENAME
value: signature.asc
- name: GPG_TRUSTED_PUBLICKEY_MATERIAL
value: /usr/local/etc/cic-auth-proxy/trusted/trustedpublickey.asc
ports:
- containerPort: 8080
name: http
volumeMounts:
- name: acl-config
mountPath: /data/acls/
readOnly: true
- name: credentials-config
mountPath: /data/noop/
readOnly: true
- name: trusted-publickey
mountPath: /usr/local/etc/cic-auth-proxy/trusted/
- name: gpg-homedir
mountPath: /usr/local/etc/cic-auth-proxy/.gnupg
- name: pgp-meta-test
mountPath: /usr/local/etc/cic-auth-proxy/import
volumes:
- name: pgp-meta-test
configMap:
name: pgp-meta-test
- name: acl-config
configMap:
name: cic-auth-proxy-acl-configmap
- name: credentials-config
configMap:
name: cic-auth-proxy-credentials-configmap
- name: trusted-publickey
configMap:
name: pgp-trusted-publickey
- name: gpg-homedir
emptyDir: {}
---
# https://kubernetes.io/docs/concepts/services-networking/service/
apiVersion: v1
kind: Service
metadata:
name: cic-auth-proxy-user
namespace: grassroots
spec:
selector:
app: cic-auth-proxy-user
type: ClusterIP
ports:
- name: http
protocol: TCP
port: 80
targetPort: 8080

View File

@@ -1,129 +0,0 @@
# https://kubernetes.io/docs/concepts/workloads/controllers/deployment/
apiVersion: apps/v1
kind: Deployment
metadata:
name: cic-auth-proxy-ussd
namespace: grassroots
labels:
app: cic-auth-proxy-ussd
group: cic
annotations:
keel.sh/policy: "glob:master-*"
keel.sh/trigger: poll
keel.sh/pollSchedule: "@every 5m"
spec:
selector:
matchLabels:
app: cic-auth-proxy-ussd
replicas: 1
template:
metadata:
labels:
app: cic-auth-proxy-ussd
group: cic
spec:
containers:
- name: cic-auth-proxy-ussd
#image: registry.gitlab.com/grassrootseconomics/cic-auth-proxy:master-c05fafbf-1627493790 # {"$imagepolicy": "flux-system:cic-auth-proxy"}
image: registry.gitlab.com/grassrootseconomics/cic-auth-proxy:latest
imagePullPolicy: Always
command: ["uwsgi", "--wsgi-file", "meta/scripts/proxy-ussd.py", "--http",
":8080"]
resources:
requests:
cpu: 50m
memory: 100Mi
limits:
cpu: 100m
memory: 200Mi
env:
- name: PROXY_HOST
value: cic-user-ussd-server
- name: PROXY_PORT
value: "80"
- name: PROXY_PATH_PREFIX
value: "/"
- name: HTTP_AUTH_ORIGIN
value: https://ussd-auth.dev.grassrootseconomics.net:443
- name: HTTP_AUTH_REALM
value: GE
- name: ACL_CREDENTIALS_ENDPOINT
value: http://key-server:8081/
- name: ACL_PATH
value: /data/acls/F3FAF668E82EF5124D5187BAEF26F4682343F692
- name: ACL_QUERYSTRING_USERNAME
valueFrom:
secretKeyRef:
name: cic-ussd-querystring-creds
key: username
- name: ACL_QUERYSTRING_PASSWORD
valueFrom:
secretKeyRef:
name: cic-ussd-querystring-creds
key: password
- name: ACL_WHITELIST
value: "37.188.113.15, 164.177.157.18, 5.79.0.242, 164.177.141.82, 164.177.141.83"
- name: GPG_PUBLICKEYS_ENDPOINT
value: http://key-server:8080/.well-known/publickeys/
- name: GPG_SIGNATURE_ENDPOINT
value: http://key-server:8080/.well-known/signature/
- name: GPG_TRUSTED_PUBLICKEY_FINGERPRINT # fingerprint of trusted key
value: CCE2E1D2D0E36ADE0405E2D0995BB21816313BD5
- name: GPG_HOMEDIR
value: /usr/local/etc/cic-auth-proxy/.gnupg/
- name: GPG_IMPORT_DIR
value: /usr/local/etc/cic-auth-proxy/import/
- name: GPG_PUBLICKEY_FILENAME
value: publickeys.asc
- name: GPG_SIGNATURE_FILENAME
value: signature.asc
- name: GPG_TRUSTED_PUBLICKEY_MATERIAL
value: /usr/local/etc/cic-auth-proxy/trusted/trustedpublickey.asc
ports:
- containerPort: 8080
name: http
volumeMounts:
- name: acl-config
mountPath: /data/acls/
readOnly: true
- name: credentials-config
mountPath: /data/noop/
readOnly: true
- name: trusted-publickey
mountPath: /usr/local/etc/cic-auth-proxy/trusted/
- name: gpg-homedir
mountPath: /usr/local/etc/cic-auth-proxy/.gnupg
- name: pgp-meta-test
mountPath: /usr/local/etc/cic-auth-proxy/import
volumes:
- name: pgp-meta-test
configMap:
name: pgp-meta-test
- name: acl-config
configMap:
name: cic-auth-proxy-acl-configmap
- name: credentials-config
configMap:
name: cic-auth-proxy-credentials-configmap
- name: trusted-publickey
configMap:
name: pgp-trusted-publickey
- name: gpg-homedir
emptyDir: {}
---
# https://kubernetes.io/docs/concepts/services-networking/service/
apiVersion: v1
kind: Service
metadata:
name: cic-auth-proxy-ussd
namespace: grassroots
spec:
selector:
app: cic-auth-proxy-ussd
type: ClusterIP
ports:
- name: http
protocol: TCP
port: 80
targetPort: 8080

View File

@@ -1,16 +0,0 @@
apiVersion: bitnami.com/v1alpha1
kind: SealedSecret
metadata:
creationTimestamp: null
name: cic-ussd-querystring-creds
namespace: grassroots
spec:
encryptedData:
password: AgAO18hv8gZZRoySuTqNOg06imdnfES7io54TEGO501cq8usRcj9Bu3us+7V8zD20DQTZ7xCGCV2eyZ1kMsnFNFUBPMG0raAtwe4dAskYeQxKp0VyE1y+d9TbecvlqRXmAD3lF1exMSS3PD68VZcrDpenXE7Ag74uEJM7YmAKnUXIzBhxZlG7bIwLrRuNiTye83e/jijaPD3+66NrExBM2H//b8u1l8QVNd13XwgAWkJEW9QNMP1b2ir9VWKE4P8Da22SQRIed/Nhyc+aR3izqEpCbeRYmQqlo9r66oXK4Yr5v0IkI38gBGORrPv1OZK01plvGgMoxTe3pSiW5cyMtCXx6GgyIFKII1yYvtlI86Sf4J3DRU6kC3NSvF+2B99yFrbUPyoAGQFd6oSWAQLBI2kYqf6NXuaT3kxBNroAICMlAYygPKG0t6jEzTWk+E5l4F/PHWFD5DM+diHRqQAdDStACcMCl2i2133PPjlK1gUhQkCmeRcKeEqMT5ssBx/KM5p+v8syV4/0VlNtJpnP/aWcbwvVsiHhDE9trM/+p5E6gsVymAIiW20nRnuOSjHCnHOtjtfPtEZ3A8eHAqgJjdE6fY7DvZmwKfx2A4tMcrikz208Pa7JIFE6nj9osTz7eMrWXTmquVRotZ9we3WmGBycgVUuv9hfzUC6srkTIyT9UGHH9UNiRba/73ZDF2Zr2XvN0ofkQz8dN0dKLxy/OptCciV3Rdn/4s/IQ0usSwXLEb2tQtHgy5+twj7IQij6amjLbulRm1U2GLatmvgbjqf
username: AgCJQ9JiJErCGfH3pkX3I3696Garu40pGWgvcUOa9wz3r3Q+5SY0IMnroWO3z66L/HG7DW70upgfJBVyhncrUBrC/z73D++nMz43JvFc39MHcUYVmqvjIw1401705+G7UxxgcMOx09++CBZ97wbEfKc251v98s8G/bLtMcqS/12pVNfMGgjpkE6InlS0n9VD26HDs4A3uNtfN2GK2dsazV05UXs8W+6JOZsJ60QfPJylCpKh+sxPLDlYt4lRIM/6pP7kQXLn+VmWtzuo1dZTaUliMYH+DOXO2V9ePnjTUXGrMgRfuZP2PCmG3usdC45mOPpuURPFUmF8SDQ4IXKhBd7N+8pjZDiqQ2RxK61Qz6MXv851u7HABgVMhtjlZfaD4hVY+mr7KYDVQvrhJ971y84KBHuQFOxwZZnXAx5FdBWmkKkz959bjulJaRe1ZWA01k1SQHiVFeIArbbNSlvH45XoNR8rxFiQE+5Olt9UwdpXAH/sAfH7CRY1SnRrAW3LCyCVqAJcXF9kU+bnezVgWoJ/h9ff6VKRFqh3o+wa6V6J8GNbwC0/oeXo0XyUjbwAUjIRKeWbJ1obkvWDfYILpo2CVt34tzYowiVqmYYB8SqKBPgrca+Xmn9GpggOMibJKk0LC6R04iAajMy73cuxkX+PMV1Wz5xOuF60ccCsdiD15deYOlA34UdfeNQgGA7EKPhsl0kD/xm5
template:
metadata:
creationTimestamp: null
name: cic-ussd-querystring-creds
namespace: grassroots

View File

@@ -1,100 +0,0 @@
# https://kubernetes.io/docs/concepts/workloads/controllers/deployment/
apiVersion: apps/v1
kind: Deployment
metadata:
name: cic-cache-server
namespace: grassroots
labels:
app: cic-cache-server
annotations:
keel.sh/policy: "glob:master-*"
keel.sh/trigger: poll
keel.sh/pollSchedule: "@every 5m"
spec:
selector:
matchLabels:
app: cic-cache-server
replicas: 1
strategy:
rollingUpdate:
maxSurge: 25%
maxUnavailable: 25%
type: RollingUpdate
template:
metadata:
labels:
app: cic-cache-server
group: cic
tier: backend
spec:
containers:
- name: cic-cache-server
#image: registry.gitlab.com/grassrootseconomics/cic-internal-integration/cic-cache:master-402b968b-1626300208 # {"$imagepolicy": "flux-system:cic-cache"}
image: registry.gitlab.com/grassrootseconomics/cic-internal-integration/cic-cache:latest
imagePullPolicy: Always
command: ["/usr/local/bin/uwsgi", "--wsgi-file=/root/cic_cache/runnable/daemons/server.py",
"--http=:8000", "--pyargv", "-vv"]
resources:
requests:
cpu: 50m
memory: 100Mi
limits:
cpu: 100m
memory: 100Mi
env:
- name: DATABASE_USER
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_USER
- name: DATABASE_HOST
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_HOST
- name: DATABASE_PORT
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_PORT
- name: DATABASE_ENGINE
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_ENGINE
- name: DATABASE_DRIVER
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_DRIVER
- name: DATABASE_PASSWORD
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_PASSWORD
- name: DATABASE_NAME
value: cic_cache
- name: SERVER_PORT
value: "8000"
- name: DATABASE_DEBUG
value: "0"
ports:
- containerPort: 8000
name: server
restartPolicy: Always
---
# https://kubernetes.io/docs/concepts/services-networking/service/
apiVersion: v1
kind: Service
metadata:
name: cic-cache-svc
namespace: grassroots
spec:
selector:
app: cic-cache-server
type: ClusterIP
ports:
- name: server
protocol: TCP
port: 80
targetPort: 8000

View File

@@ -1,175 +0,0 @@
# https://kubernetes.io/docs/concepts/workloads/controllers/deployment/
apiVersion: apps/v1
kind: Deployment
metadata:
name: cic-cache-watchers
namespace: grassroots
labels:
app: cic-cache-watchers
annotations:
keel.sh/policy: "glob:master-*"
keel.sh/trigger: poll
keel.sh/pollSchedule: "@every 5m"
spec:
selector:
matchLabels:
app: cic-cache-watchers
replicas: 1
strategy:
rollingUpdate:
maxSurge: 25%
maxUnavailable: 25%
type: RollingUpdate
template:
metadata:
labels:
app: cic-cache-watchers
group: cic
tier: queue
spec:
containers:
- name: cic-cache-tasker
#image: registry.gitlab.com/grassrootseconomics/cic-internal-integration/cic-cache:master-402b968b-1626300208 # {"$imagepolicy": "flux-system:cic-cache"}
image: registry.gitlab.com/grassrootseconomics/cic-internal-integration/cic-cache:latest
imagePullPolicy: Always
command: ["/usr/local/bin/cic-cache-taskerd", "-vv"]
resources:
requests:
cpu: 50m
memory: 100Mi
limits:
cpu: 100m
memory: 100Mi
env:
- name: DATABASE_USER
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_USER
- name: DATABASE_HOST
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_HOST
- name: DATABASE_PORT
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_PORT
- name: DATABASE_ENGINE
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_ENGINE
- name: DATABASE_DRIVER
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_DRIVER
- name: DATABASE_PASSWORD
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_PASSWORD
- name: CELERY_BROKER_URL
valueFrom:
configMapKeyRef:
name: redis-conn-common
key: CELERY_BROKER_URL
- name: CELERY_RESULT_URL
valueFrom:
configMapKeyRef:
name: redis-conn-common
key: CELERY_RESULT_URL
- name: CIC_CHAIN_SPEC
value: "evm:bloxberg:8996"
- name: DATABASE_NAME
value: cic_cache
- name: ETH_PROVIDER
value: http://bloxberg-validator.grassroots.svc.cluster.local:8547
- name: CIC_REGISTRY_ADDRESS
valueFrom:
configMapKeyRef:
name: contract-migration-output
key: CIC_REGISTRY_ADDRESS
- name: CIC_TRUST_ADDRESS
valueFrom:
configMapKeyRef:
name: contract-migration-output
key: CIC_TRUST_ADDRESS
- name: cic-cache-tracker
#image: registry.gitlab.com/grassrootseconomics/cic-internal-integration/cic-cache:master-402b968b-1626300208 # {"$imagepolicy": "flux-system:cic-cache"}
image: registry.gitlab.com/grassrootseconomics/cic-internal-integration/cic-cache:latest
# command: ["/usr/local/bin/cic-cache-trackerd", "-vv", "-c", "/usr/local/etc/cic-cache"]
command: ["./start_tracker.sh", "-c", "/usr/local/etc/cic-cache", "-vv"]
resources:
requests:
cpu: 50m
memory: 100Mi
limits:
cpu: 100m
memory: 100Mi
env:
- name: CIC_REGISTRY_ADDRESS
valueFrom:
configMapKeyRef:
name: contract-migration-output
key: CIC_REGISTRY_ADDRESS
- name: CIC_TRUST_ADDRESS
valueFrom:
configMapKeyRef:
name: contract-migration-output
key: CIC_TRUST_ADDRESS
- name: DATABASE_USER
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_USER
- name: DATABASE_HOST
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_HOST
- name: DATABASE_PORT
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_PORT
- name: DATABASE_ENGINE
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_ENGINE
- name: DATABASE_DRIVER
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_DRIVER
- name: DATABASE_PASSWORD
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_PASSWORD
- name: CELERY_BROKER_URL
valueFrom:
configMapKeyRef:
name: redis-conn-common
key: CELERY_BROKER_URL
- name: CELERY_RESULT_URL
valueFrom:
configMapKeyRef:
name: redis-conn-common
key: CELERY_RESULT_URL
- name: DATABASE_NAME
value: cic_cache
- name: ETH_PROVIDER
value: http://bloxberg-validator.grassroots.svc.cluster.local:8547
- name: CIC_CHAIN_SPEC
value: "evm:bloxberg:8996"
- name: SERVER_PORT
value: "8000"
- name: ETH_ABI_DIR
value: /usr/local/share/cic/solidity/abi
- name: DATABASE_DEBUG
value: "0"
restartPolicy: Always
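The `CIC_CHAIN_SPEC` value `evm:bloxberg:8996` used throughout these manifests encodes the engine, network name and chain id as colon-separated fields. A minimal sketch of a parser for that format (a hypothetical helper; the actual chainlib `ChainSpec` class used by these services may differ):

```python
from dataclasses import dataclass

@dataclass
class ChainSpec:
    """Parsed form of a 'engine:network:chain_id' spec string."""
    engine: str
    network: str
    chain_id: int

    @classmethod
    def from_string(cls, spec: str) -> "ChainSpec":
        # Split the three colon-separated fields seen in the manifests.
        engine, network, chain_id = spec.split(":")
        return cls(engine, network, int(chain_id))

spec = ChainSpec.from_string("evm:bloxberg:8996")
```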


@@ -1,221 +0,0 @@
# https://kubernetes.io/docs/concepts/workloads/controllers/deployment/
apiVersion: apps/v1
kind: Deployment
metadata:
name: cic-eth-tasker
namespace: grassroots
labels:
app: cic-eth-tasker
annotations:
keel.sh/policy: "glob:master-*"
keel.sh/trigger: poll
keel.sh/pollSchedule: "@every 5m"
spec:
selector:
matchLabels:
app: cic-eth-tasker
replicas: 1
strategy:
rollingUpdate:
maxSurge: 25%
maxUnavailable: 25%
type: RollingUpdate
template:
metadata:
labels:
app: cic-eth-tasker
group: cic
tier: queue
spec:
containers:
- name: cic-eth-tasker
#image: registry.gitlab.com/grassrootseconomics/cic-internal-integration/cic-eth:master-f1917300-1626888924 # {"$imagepolicy": "flux-system:cic-eth"}
image: registry.gitlab.com/grassrootseconomics/cic-internal-integration/cic-eth:latest
imagePullPolicy: Always
# command: ["./start_tasker.sh", "-q", "cic-eth", "-vv"]
command: ["/usr/local/bin/cic-eth-taskerd"]
resources:
requests:
cpu: 50m
memory: 100Mi
limits:
cpu: 500m
memory: 250Mi
env:
- name: DATABASE_USER
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_USER
- name: DATABASE_HOST
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_HOST
- name: DATABASE_PORT
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_PORT
- name: DATABASE_ENGINE
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_ENGINE
- name: DATABASE_DRIVER
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_DRIVER
- name: DATABASE_PASSWORD
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_PASSWORD
- name: CELERY_BROKER_URL
valueFrom:
configMapKeyRef:
name: redis-conn-common
key: CELERY_BROKER_URL
- name: CELERY_RESULT_URL
valueFrom:
configMapKeyRef:
name: redis-conn-common
key: CELERY_RESULT_URL
- name: CIC_REGISTRY_ADDRESS
valueFrom:
configMapKeyRef:
name: contract-migration-output
key: CIC_REGISTRY_ADDRESS
- name: CIC_TRUST_ADDRESS # - name: ETH_GAS_PROVIDER_ADDRESS
valueFrom:
configMapKeyRef:
name: contract-migration-output
key: CIC_TRUST_ADDRESS
- name: REDIS_HOST
value: redis-master
- name: REDIS_PORT
value: "6379"
- name: REDIS_DB
value: "0"
- name: ETH_PROVIDER
value: http://bloxberg-validator.grassroots.svc.cluster.local:8547
- name: ETH_ABI_DIR
value: /usr/local/share/cic/solidity/abi
- name: DATABASE_NAME
value: cic_eth
- name: DATABASE_POOL_SIZE
value: "0"
- name: CIC_CHAIN_SPEC
value: "evm:bloxberg:8996"
- name: BANCOR_DIR
value: /usr/local/share/cic/bancor
- name: SIGNER_SOCKET_PATH
value: ipc:///run/crypto-dev-signer/jsonrpc.ipc
- name: SIGNER_SECRET
value: deadbeef
- name: ETH_ACCOUNT_ACCOUNTS_INDEX_WRITER
value: "0xACB0BC74E1686D62dE7DC6414C999EA60C09F0eA"
- name: TASKS_TRACE_QUEUE_STATUS
value: "1"
- name: "DATABASE_DEBUG"
value: "false"
volumeMounts:
- name: socket-path
mountPath: /run/crypto-dev-signer/
- name: cic-eth-signer
#image: registry.gitlab.com/grassrootseconomics/cic-internal-integration/cic-eth:master-f1917300-1626888924 # {"$imagepolicy": "flux-system:cic-eth"}
image: registry.gitlab.com/grassrootseconomics/cic-internal-integration/cic-eth:latest
imagePullPolicy: Always
# command: ["./start_tasker.sh", "-q", "cic-eth", "-vv"]
command: ["python", "/usr/local/bin/crypto-dev-daemon", "-c", "/usr/local/etc/crypto-dev-signer",
"-vv"]
resources:
requests:
cpu: 50m
memory: 100Mi
limits:
cpu: 500m
memory: 250Mi
env:
- name: DATABASE_USER
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_USER
- name: DATABASE_HOST
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_HOST
- name: DATABASE_PORT
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_PORT
- name: DATABASE_ENGINE
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_ENGINE
- name: DATABASE_DRIVER
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_DRIVER
- name: DATABASE_PASSWORD
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_PASSWORD
- name: CELERY_BROKER_URL
valueFrom:
configMapKeyRef:
name: redis-conn-common
key: CELERY_BROKER_URL
- name: CELERY_RESULT_URL
valueFrom:
configMapKeyRef:
name: redis-conn-common
key: CELERY_RESULT_URL
- name: CIC_REGISTRY_ADDRESS
valueFrom:
configMapKeyRef:
name: contract-migration-output
key: CIC_REGISTRY_ADDRESS
- name: CIC_TRUST_ADDRESS # - name: ETH_GAS_PROVIDER_ADDRESS
valueFrom:
configMapKeyRef:
name: contract-migration-output
key: CIC_TRUST_ADDRESS
- name: ETH_PROVIDER
value: http://bloxberg-validator.grassroots.svc.cluster.local:8547
- name: ETH_ABI_DIR
value: /usr/local/share/cic/solidity/abi
- name: DATABASE_NAME
value: cic_eth
- name: DATABASE_POOL_SIZE
value: "0"
- name: CIC_CHAIN_SPEC
value: "evm:bloxberg:8996"
- name: BANCOR_DIR
value: /usr/local/share/cic/bancor
- name: SIGNER_SOCKET_PATH
value: ipc:///run/crypto-dev-signer/jsonrpc.ipc
- name: SIGNER_SECRET
value: deadbeef
- name: ETH_ACCOUNT_ACCOUNTS_INDEX_WRITER
value: "0xACB0BC74E1686D62dE7DC6414C999EA60C09F0eA"
- name: TASKS_TRACE_QUEUE_STATUS
value: "1"
- name: "DATABASE_DEBUG"
value: "false"
- name: "CIC_DEFAULT_TOKEN_SYMBOL"
value: GFT
volumeMounts:
- name: socket-path
mountPath: /run/crypto-dev-signer/
volumes:
- name: socket-path
emptyDir: {}
restartPolicy: Always
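Each service pulls the same `DATABASE_*` variables from the `postgresql-conn-common` ConfigMap and composes them into a connection string. A sketch of how such a DSN might be assembled (the function name and defaults are illustrative assumptions, not the actual cic-eth config loader):

```python
import os

def database_dsn(env=None) -> str:
    """Build an SQLAlchemy-style DSN from the DATABASE_* variables.

    Defaults mirror values appearing in these manifests; the real
    config loader in cic-eth may behave differently.
    """
    env = os.environ if env is None else env
    engine = env.get("DATABASE_ENGINE", "postgres")
    driver = env.get("DATABASE_DRIVER", "psycopg2")
    user = env.get("DATABASE_USER", "grassroots")
    password = env.get("DATABASE_PASSWORD", "")
    host = env.get("DATABASE_HOST", "localhost")
    port = env.get("DATABASE_PORT", "5432")
    name = env.get("DATABASE_NAME", "cic_eth")
    return f"{engine}+{driver}://{user}:{password}@{host}:{port}/{name}"

dsn = database_dsn({"DATABASE_PASSWORD": "tralala", "DATABASE_NAME": "cic_eth"})
```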


@@ -1,248 +0,0 @@
# https://kubernetes.io/docs/concepts/workloads/controllers/deployment/
# The guardian is composed of:
# cic-manager-head
# cic-dispatch
# cic-retrier
apiVersion: apps/v1
kind: Deployment
metadata:
name: cic-eth-tracker
namespace: grassroots
labels:
app: cic-eth-tracker
annotations:
keel.sh/policy: "glob:master-*"
keel.sh/trigger: poll
keel.sh/pollSchedule: "@every 5m"
spec:
selector:
matchLabels:
app: cic-eth-tracker
replicas: 1 # these are all strictly 1
strategy:
rollingUpdate:
maxSurge: 25%
maxUnavailable: 25%
type: RollingUpdate
template:
metadata:
labels:
app: cic-eth-tracker
group: cic
spec:
containers:
- name: cic-eth-tracker
#image: registry.gitlab.com/grassrootseconomics/cic-internal-integration/cic-eth:master-f1917300-1626888924 # {"$imagepolicy": "flux-system:cic-eth"}
image: registry.gitlab.com/grassrootseconomics/cic-internal-integration/cic-eth:latest
imagePullPolicy: Always
command: ["./start_tracker.sh", "-v", "-c", "/usr/local/etc/cic-eth"]
resources:
requests:
cpu: 50m
memory: 100Mi
limits:
cpu: 500m
memory: 250Mi
env:
- name: TASKS_TRANSFER_CALLBACKS
value: "cic-eth:cic_eth.callbacks.noop.noop,cic-ussd:cic_ussd.tasks.callback_handler.transaction_callback"
- name: CIC_REGISTRY_ADDRESS
valueFrom:
configMapKeyRef:
name: contract-migration-output
key: CIC_REGISTRY_ADDRESS
- name: CIC_TRUST_ADDRESS
valueFrom:
configMapKeyRef:
name: contract-migration-output
key: CIC_TRUST_ADDRESS
- name: DATABASE_USER
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_USER
- name: DATABASE_HOST
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_HOST
- name: DATABASE_PORT
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_PORT
- name: DATABASE_ENGINE
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_ENGINE
- name: DATABASE_DRIVER
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_DRIVER
- name: DATABASE_PASSWORD
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_PASSWORD
- name: CELERY_BROKER_URL
valueFrom:
configMapKeyRef:
name: redis-conn-common
key: CELERY_BROKER_URL
- name: CELERY_RESULT_URL
valueFrom:
configMapKeyRef:
name: redis-conn-common
key: CELERY_RESULT_URL
- name: ETH_PROVIDER
value: http://bloxberg-validator.grassroots.svc.cluster.local:8547
- name: DATABASE_NAME
value: cic_eth
- name: DATABASE_DEBUG
value: "0"
- name: ETH_ABI_DIR
value: /usr/local/share/cic/solidity/abi
- name: CIC_CHAIN_SPEC
value: "evm:bloxberg:8996"
- name: REDIS_HOSTNAME
value: redis-master
- name: cic-eth-dispatcher
#image: registry.gitlab.com/grassrootseconomics/cic-internal-integration/cic-eth:master-f1917300-1626888924 # {"$imagepolicy": "flux-system:cic-eth"}
image: registry.gitlab.com/grassrootseconomics/cic-internal-integration/cic-eth:latest
imagePullPolicy: Always
command: ["./start_dispatcher.sh", "-q", "cic-eth", "-v"]
resources:
requests:
cpu: 50m
memory: 100Mi
limits:
cpu: 500m
memory: 250Mi
env:
- name: CIC_REGISTRY_ADDRESS
valueFrom:
configMapKeyRef:
name: contract-migration-output
key: CIC_REGISTRY_ADDRESS
- name: CIC_TRUST_ADDRESS
valueFrom:
configMapKeyRef:
name: contract-migration-output
key: CIC_TRUST_ADDRESS
- name: TASKS_TRANSFER_CALLBACKS
value: "cic-eth:cic_eth.callbacks.noop.noop,cic-ussd:cic_ussd.tasks.callback_handler.transaction_callback"
- name: DATABASE_USER
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_USER
- name: DATABASE_HOST
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_HOST
- name: DATABASE_PORT
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_PORT
- name: DATABASE_ENGINE
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_ENGINE
- name: DATABASE_DRIVER
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_DRIVER
- name: DATABASE_PASSWORD
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_PASSWORD
- name: CELERY_BROKER_URL
valueFrom:
configMapKeyRef:
name: redis-conn-common
key: CELERY_BROKER_URL
- name: CELERY_RESULT_URL
valueFrom:
configMapKeyRef:
name: redis-conn-common
key: CELERY_RESULT_URL
- name: ETH_PROVIDER
value: http://bloxberg-validator.grassroots.svc.cluster.local:8547
- name: DATABASE_NAME
value: cic_eth
- name: DATABASE_DEBUG
value: "0"
- name: CIC_CHAIN_SPEC
value: "evm:bloxberg:8996"
- name: REDIS_HOSTNAME
value: redis-master
# - name: cic-eth-retrier
# image: registry.gitlab.com/grassrootseconomics/cic-internal-integration/cic-eth:latest
# command: [ "./start_retry.sh", "-v" ]
# resources:
# requests:
# cpu: 50m
# memory: 100Mi
# limits:
# cpu: 500m
# memory: 250Mi
# env:
# - name: CIC_REGISTRY_ADDRESS
# valueFrom:
# configMapKeyRef:
# name: contract-migration-output
# key: CIC_REGISTRY_ADDRESS
# - name: CIC_TRUST_ADDRESS
# valueFrom:
# configMapKeyRef:
# name: contract-migration-output
# key: CIC_TRUST_ADDRESS
# - name: CIC_TX_RETRY_DELAY # TODO what is this value?
# value: "15"
# - name: TASKS_TRANSFER_CALLBACKS # TODO what is this value?
# value: "taskcall:cic_eth.callbacks.noop.noop"
# - name: ETH_PROVIDER
# value: http://bloxberg-validator.grassroots.svc.cluster.local:8547
# - name: DATABASE_USER
# value: grassroots
# - name: DATABASE_HOST
# value: postgres-helm-postgresqlsql
# - name: DATABASE_PASSWORD
# value: tralala
# - name: DATABASE_NAME
# value: cic_eth
# - name: DATABASE_PORT
# value: "5432"
# - name: DATABASE_ENGINE
# value: postgres
# - name: DATABASE_DRIVER
# value: psycopg2
# - name: DATABASE_DEBUG
# value: "1"
# - name: REDIS_HOSTNAME
# value: grassroots-redis-master
# - name: CIC_CHAIN_SPEC
# value: "evm:bloxberg:8996"
# - name: CELERY_BROKER_URL
# value: redis://grassroots-redis-master
# - name: CELERY_RESULT_URL
# value: redis://grassroots-redis-master
restartPolicy: Always
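`TASKS_TRANSFER_CALLBACKS` above is a comma-separated list of `queue:task.path` pairs routing transfer events to Celery tasks. A small parser illustrating that format (an assumed consumption pattern, not the actual cic-eth code):

```python
def parse_transfer_callbacks(value: str) -> dict:
    """Split 'queue:task.path,queue:task.path' into {queue: task.path}."""
    out = {}
    for entry in value.split(","):
        # partition keeps any further colons inside the task path intact
        queue, _, task = entry.partition(":")
        out[queue] = task
    return out

callbacks = parse_transfer_callbacks(
    "cic-eth:cic_eth.callbacks.noop.noop,"
    "cic-ussd:cic_ussd.tasks.callback_handler.transaction_callback"
)
```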


@@ -1,122 +0,0 @@
# https://kubernetes.io/docs/concepts/workloads/controllers/deployment/
apiVersion: apps/v1
kind: Deployment
metadata:
name: cic-meta-server
namespace: grassroots
labels:
app: cic-meta-server
annotations:
keel.sh/policy: "glob:master-*"
keel.sh/trigger: poll
keel.sh/pollSchedule: "@every 5m"
spec:
selector:
matchLabels:
app: cic-meta-server
replicas: 1
strategy:
rollingUpdate:
maxSurge: 25%
maxUnavailable: 25%
type: RollingUpdate
template:
metadata:
labels:
app: cic-meta-server
group: cic
spec:
containers:
- name: cic-meta-server
#image: registry.gitlab.com/grassrootseconomics/cic-internal-integration/cic-meta:master-fe017d2b-1625932004 # {"$imagepolicy": "flux-system:cic-meta"}
image: registry.gitlab.com/grassrootseconomics/cic-internal-integration/cic-meta:latest
imagePullPolicy: Always
resources:
requests:
cpu: 50m
memory: 250Mi
limits:
cpu: 100m
memory: 500Mi
env:
- name: DATABASE_USER
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_USER
- name: DATABASE_HOST
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_HOST
- name: DATABASE_PORT
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_PORT
- name: DATABASE_ENGINE
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_ENGINE
- name: DATABASE_DRIVER
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_DRIVER
- name: DATABASE_PASSWORD
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_PASSWORD
- name: SCHEMA_SQL_PATH
value: scripts/initdb/server.postgres.sql
- name: DATABASE_NAME
value: cic_meta
- name: SERVER_HOST
value: localhost
- name: SERVER_PORT
value: "8000"
- name: DATABASE_SCHEMA_SQL_PATH
value: ""
- name: PGP_EXPORTS_DIR
value: /tmp/src/keys
- name: PGP_PRIVATEKEY_FILE # Private key here is for encrypting data
value: privatekey.asc
- name: PGP_PASSPHRASE
value: queenmarlena # TODO move to secret
- name: PGP_PUBLICKEY_TRUSTED_FILE
value: publickeys.asc
- name: PGP_PUBLICKEY_ACTIVE_FILE # public key here is to know who to trust
value: publickeys.asc
- name: PGP_PUBLICKEY_ENCRYPT_FILE
value: publickeys.asc
ports:
- containerPort: 8000
name: cic-meta-server
volumeMounts:
- mountPath: /tmp/src/keys
readOnly: true
name: pgp
volumes:
- name: pgp
configMap:
name: pgp-meta-test
items:
restartPolicy: Always
---
# https://kubernetes.io/docs/concepts/services-networking/service/
apiVersion: v1
kind: Service
metadata:
name: cic-meta-server
namespace: grassroots
spec:
selector:
app: cic-meta-server
type: ClusterIP
ports:
- name: http
protocol: TCP
port: 80
targetPort: 8000


@@ -1,17 +0,0 @@
apiVersion: bitnami.com/v1alpha1
kind: SealedSecret
metadata:
creationTimestamp: null
name: cic-notify-africastalking-sandbox-secret
namespace: grassroots
spec:
encryptedData:
api_key: AgC1j5LwcOocVchLac0kPmCqyqljV40RzlCXVrtaxFvFvO+s2KcaUNFrIxfirZ8pbSDPKAUAgQam9tJiZFg4Wdu0x6LDDo6x2/0t/HXqb1eeH+OxO045T0AQZPpqFK+5QZ7y+c0tnUJfppCgOd+PhvB0AyHndWoWmTMGy48zpKaeRZriRRM5+rkNgGzazUNdsQoThOhEStIDCa1+zpq++KckU+VS5Qgsj7X1gOth+XFeyVFuXLY/QTTmvflgXj/YA5HMcJVCO3/l4UGaxHaUWps6IlZ5RgALamavAww7HIT7iecjXbAKx4fQt7bKuQY/FHiwDLAJmfdYUv7YK3EVsW4lSWVxgQ4RKM1dru8Kgs4nE+jmlApeuMl2Y3yT0tyCXSNPBzcsAbeumctsWO2gypQbL/0N1a9OZYgJv4/rfbonl8lK8detZDnAOqhOEiJ2fX/H9bkLc7n5hEHxnIt72sgdurD3r7WkxNsD8/laBx7hDtr7+Zv9RZ8MlyyUd+ZBVax6tO7immo8ya+BRsFKcrr3FqSuvwl3q7y/DNLMU6wqEv+/FmfSW7J1HuGlOJApHM96R6Jfqpw+ZNZ/hS6Z0CDNicLRdRfDaKglOu4UvkiWQ2F1JtqtJEtishbk3bNtZSAOO7Uan/V1XfeUvrYBAyXCoLLwK2pm/eWNZ71BW7TghCNgqRpeALPZy2JrDudnmyoXzcidJeBi5lBEnbXzIcOuos/0xgM/DiKQDO8mxh1ycbiu9XSydxdkLrRFXMtO8KK4qhIdgJp8INjZxIzgm5j3
api_sender_id: AgCTvYXxQNTCcRnMrfcc3D+OY+N8/mFtfI+O3xI45qxdC1n7/OE2MI0XIleYSbbAh3R7+5lNjq3pVhdKpRvCYxUGerCdSeYFQZf0RN71sAtDV8NnHCB6pRIsCWlvjDHw7XRH6PiPrr0xj4rwd1Qou8EMybVQXwT6nvV20kdKhAd6dTTgqT8x1AB5N+L0o1H/ThOKNadjN6oqlROz4in0DTTmis+Ebk4pywFroVWa5pMCjy7Ua5omKqB45lAKDp4FwogSnEUVS1b1pQTO/o0mnCQRObPcOzVjze6oqfUaEPkvtrOkmKuAxEXFKdAzG8bzPwx6EuSCbMmcUekeBGKDZBniE1YyOkPyLZelBEtyXFhQBmKbdW4qhlEJwYwflFBll8eNo2ruMVH7HTJvAqW4p2mxTIoM9Nx+UWDJjZ736aYvDfCgZX0MVKvXjrNtWEhuVBYGZxFIWfj/RqNkcJf+ZlNYK8MripzPReNWDlvhH1jzlPGJozUlYfLY8hY/NDGMbdc+EflLy8p4oRLWUo0Z0W3ARwLM1WQMk8FSXAgCjMmL7m3d310lVO5LLVdj3GTpjfCNPm2m7Re0lsOkLDE8x0vhJnlIUJ8WlY1EdezKfPWR0laNSknCCxfPYU88XiO2DVgUdWBH0Hg13yQQ3IXz49Pnf6LrAx46EgKUzoKnCgA5L/hRAIV3u8vav/5kdSF4CxY=
api_username: AgBQZXCDQY5B1RKfwjih04XSvdPJjmlfuA2bL6qiphYDIM223DJ5HEolHIWlCeSAjDC5JjkkesPmVzxybgD7hRwVuyOyWfmiXBG6+552v/DNv17IfSNNtZMCsEj9jyscHfvRI1NCG5pxywhBfRxhOVIYflbFI/H+13ReAs/6J/uhVfWVPMXzdg7Zj7mYsWzj3otv0npzXc+j2G4JZk53h7fDTY/XsXyZCe+Xpn9auf7Y+m+/shOiaM67CuE+eMWYhpTpTDy6oG3rjSrqhjVnE4wu8fP85nTGK2ctrQ9qgyOdcf1vPdlt1CwP5FZ+zxY8GrhUsYnunQ50znPD55e/8m5AqZhIrCjQ0lfzB2iCn0OzwOjnQpfv0W0QXpw6aDH/GYfMbDBLyXnF5yU0J+R7tv2Q45NKyRHbdO2VQXhXak95YGvK0Er3+8b0Jx4k3L+hM3OTFaJgdGOyb4IsUFG6vfQlfG3Bl4ggYqAGCJo041SpT4gc1XgGkar8WD4sOir61qKde8NnIOdRn1bhaZ8qD2eP+4p20hqsgpAA38SN6qZfrZc5sj/Re2mXxYIbmdALlP6yt1exhtU4QRzF23RDHmwiGtzh0YqIUj62+icrLcn+ZXgpufW9BgxKXilczF/bfsN2v4VFKsrkp+wRtAlFj3riDoles/IhItIJ9KpmSHEsRDRrkrUl+sFQ5xqQvfIuEoy/DHKf/Q3L
template:
metadata:
creationTimestamp: null
name: cic-notify-africastalking-sandbox-secret
namespace: grassroots


@@ -1,17 +0,0 @@
apiVersion: bitnami.com/v1alpha1
kind: SealedSecret
metadata:
creationTimestamp: null
name: cic-notify-africastalking-secret
namespace: grassroots
spec:
encryptedData:
api_key: AgAbH1KDx0kVxtqPBvkb0w1uYJm0Vwd/eHBMhLJRC9Z5zs/YkHkV+Zrz4Pk0loSIdyV3fFdL+zcqnwWmkYjrkBYUXqB+F6HcfYPlND+zAtv8QMbpmHoF6faqx1MY981Sx7qqURlORCZzkwzrDEC4XK2nwut414PFEQ4Rn9SWe2RWXl75G3+TVkj0jd24rYYlGCmD8krjFLY9IMigKZOr0TqoPhHv9xKPg7+Xpwtd86QpiFcTR0fBtN8FLDuKlYGHvTvaUqT2kyBEXeiSn3V+rB7mD72QtNBwS9KJfQla/Gh1z4kps5DkZNoHKRz3O0SnMND1UrID713g7coYI+3q3Xk4/IFiYH4iTQwls/TZWuu/aAc74Ofs6tNwNfUeGGwa0uX/EdMpQ0cWBsGQXi3wG91kvCbnHQ1IERXgCLzGur9yB7YARy6H3tJkQrjzY2ETmMK6wj8yelKC1LLiVk/UVPiRI3O85JPxDDdtcOWkRtGso5Z1Q4D0AFWhbrhBXZc3cS2MmU1bIXoZ+fsYoZ1Mrg+uFQyuf4XwIq+HZuaYksN6xN0olId/H53Bau/z54pteFwT3VOKJNrLHCnAz0YJG2LO08Iu3FcNIUVh1tf39019QOFW2QGemhxznfPH1Mn4l9GwhE2MNfU2JqbFLnCfPZPoFSY+7YZ8I271aY4sM/OlaE2Nrk9RSsSgpS+WUCijSzZ8Crgc8dffVQryDEc/t+JIW7T4BH16lk+fgx2W1JoU/BMMvaRitHX5XsIs5xchwfuvb5+WbNnT8WbYdomGokEg
api_sender_id: AgBt9Sp+yb1Qek38Imhs4zW4pcujj2ExStcfLgWPXgD3WhIOfMS7AFAe5uDTICZqLEoMnluVTJn8Hk02BIU3quXWjOOgLJRmpNlgqAbchR1XA07BwGeYWMZomC4M+8nknMOHeQvo8HGN/Y5HnXw5N2Gw8Utlm/cLVVJOEYCKGiR8vWpzMMGepxIKVeex93Ev09SQsxbOJ3QO3SGdJiKFBvXgSeGnsdg+uAeC0NG1zzVBbiCgiNoIZgv6ZnwzBZSl7BYAazt4xOoHSRoicgmIX3PWdjIz+lzVHIuEuFZK9heq+MODQhZcvf54JRB1qPURGJobu3IVDAckIvnFlSCRaBp2rAqtyzAtngkau3Et/66eI3SChJFDgnbourQ1QRPzqWipzYQ11JpUbc0luXRfA+HvzrlIf5r/rSL93jepWQS0Exk2VuCrBAS+QPJccmncJYRZ98SU0fAu+5ZcHFVkEpc4iatW6j1tXyxn+tO1pl7yB/9GGfNA72XxVHwLZmTE7LN0Y4VyhdIQYtXX5ZX49bD0Lfk/m1pkwkWtj1Cas7YpFvX0JUN0bjn6JQoJAIh7fCW5O2ATp/eqpsCcfyw+Fy8kNPFVXlfvVmD9aqwYojfCE2CV0lXFGwHLkNp0nxxTPoqrx2YAa9R/nILaUawvMdMU+a8n1Ee79al/eVpqr4WdTkQikrxcoud8PDzNOyk2y8eZY+Kb2dU=
api_username: AgBOCaA5MnRsnGSkQw8Md2XIRsb+9Dg3XiM5yagk1GvAcO+eNnL4i2F4pbA0Ctad0LpQVXwVhppJDqxS7ffy3VJwMdEWxChKN3AiE3e8Eqx8LzD1xXU48OxAfksFKQrXNBVjfBcSgyoTwI5ohYSGxuUL3fWo4/Wp0AWdy92bwNZfhOvkLpfn+Q3nblLX9tQeu/ky6+hqkiyZSWjgykJXGnVnpPMoRS8BLbcxRD1kpJfMORfW7q1JyzheBGByStRvr7i6z3m4KVC/a9H3o0OpIQQTNtvvy4zECQ5NnjrxXYdOJfJZNq1enVqqCeA06h2e3DESfxYPPx6tL8+Ugo+4TorFb5Fw8vqFQRd/1SiGQE++W/MRY4P+fj9hQrxyzdnG5oBq1j2JTpx5VMwkOEk1yVLmRS9nYrZOfFE/6EJJSzRZMbh+xVTEvcag9ZzSzkbKoakR+VKjZ2Rpum6rwXPUUhJ5F6ohIZx2x/qE3QJlmxtnJQGfmNwe6HTiLxqjU3o0SkVQgMPN275Uydlm6Omn9eo2jkYnA5hg4TPoFjcqAmbgV5O372U6ShpNtA2hYi6CdF7K/sbKxhFtHTrPAdF5NhMGZ6N81X1AMWq3a1mojMBJbVTPVzSWPovkiqhQ62gKUUbPp191aLuQlX83dqEuRxyYTprId/ODB7RtaAddDcLsF4z/7i7yh8VtKyqHBs7hA52O1b2sQJwxmhw=
template:
metadata:
creationTimestamp: null
name: cic-notify-africastalking-secret
namespace: grassroots


@@ -1,100 +0,0 @@
# https://kubernetes.io/docs/concepts/workloads/controllers/deployment/
apiVersion: apps/v1
kind: Deployment
metadata:
name: cic-notify-tasker
namespace: grassroots
labels:
app: cic-notify-tasker
annotations:
keel.sh/policy: "glob:master-*"
keel.sh/trigger: poll
keel.sh/pollSchedule: "@every 5m"
spec:
selector:
matchLabels:
app: cic-notify-tasker
replicas: 1
template:
metadata:
labels:
app: cic-notify-tasker
group: cic
spec:
containers:
- name: cic-notify-tasker
#image: registry.gitlab.com/grassrootseconomics/cic-internal-integration/cic-notify:master-7a3cb7ab-1627053362 # {"$imagepolicy": "flux-system:cic-notify"}
image: registry.gitlab.com/grassrootseconomics/cic-internal-integration/cic-notify:latest
imagePullPolicy: Always
command: ["./start_tasker.sh", "-q", "cic-notify", "-vv"]
resources:
requests:
cpu: 25m
memory: 100Mi
limits:
cpu: 50m
memory: 200Mi
env:
- name: DATABASE_USER
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_USER
- name: DATABASE_HOST
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_HOST
- name: DATABASE_PORT
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_PORT
- name: DATABASE_ENGINE
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_ENGINE
- name: DATABASE_DRIVER
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_DRIVER
- name: DATABASE_PASSWORD
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_PASSWORD
- name: DATABASE_POOL_SIZE
value: "0"
- name: CELERY_BROKER_URL
valueFrom:
configMapKeyRef:
name: redis-conn-common
key: CELERY_BROKER_URL
- name: CELERY_RESULT_URL
valueFrom:
configMapKeyRef:
name: redis-conn-common
key: CELERY_RESULT_URL
- name: DATABASE_NAME
value: cic_notify
- name: AFRICASTALKING_API_USERNAME
valueFrom:
secretKeyRef:
name: cic-notify-africastalking-sandbox-secret
key: api_username
- name: AFRICASTALKING_API_KEY
valueFrom:
secretKeyRef:
name: cic-notify-africastalking-sandbox-secret
key: api_key
- name: AFRICASTALKING_API_SENDER_ID
valueFrom:
secretKeyRef:
name: cic-notify-africastalking-sandbox-secret
key: api_sender_id
ports:
- containerPort: 80 # What is this value?
name: cic-eth-manager
restartPolicy: Always
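The `resources` stanzas use Kubernetes quantity notation: `25m` is 25 millicores (0.025 of a CPU) and `100Mi` is 100 mebibytes. A simplified converter covering only the suffixes that appear in these manifests (the full Kubernetes quantity grammar admits more forms):

```python
def parse_cpu(q: str) -> float:
    """'50m' -> 0.05 cores; plain numbers are whole cores."""
    return int(q[:-1]) / 1000 if q.endswith("m") else float(q)

def parse_memory(q: str) -> int:
    """Binary-suffix quantity ('100Mi', '1Gi') to bytes."""
    units = {"Ki": 1024, "Mi": 1024 ** 2, "Gi": 1024 ** 3}
    for suffix, mult in units.items():
        if q.endswith(suffix):
            return int(q[:-2]) * mult
    return int(q)
```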


@@ -1,55 +0,0 @@
# https://kubernetes.io/docs/concepts/workloads/controllers/deployment/
apiVersion: apps/v1
kind: Deployment
metadata:
name: cic-staff-client
namespace: grassroots
labels:
app: cic-staff-client
annotations:
keel.sh/policy: "glob:master-*"
keel.sh/trigger: poll
keel.sh/pollSchedule: "@every 5m"
spec:
selector:
matchLabels:
app: cic-staff-client
replicas: 1
template:
metadata:
labels:
app: cic-staff-client
group: cic
spec:
containers:
- name: cicada
#image: registry.gitlab.com/grassrootseconomics/cic-staff-client:master-858e1e65-1627284988 # {"$imagepolicy": "flux-system:cic-staff-client"}
image: registry.gitlab.com/grassrootseconomics/cic-staff-client:latest
imagePullPolicy: Always
resources:
requests:
cpu: 100m
memory: 100Mi
limits:
cpu: 100m
memory: 100Mi
ports:
- containerPort: 80
name: http
restartPolicy: Always
---
# https://kubernetes.io/docs/concepts/services-networking/service/
apiVersion: v1
kind: Service
metadata:
name: cic-staff-client
namespace: grassroots
spec:
selector:
app: cic-staff-client
type: ClusterIP
ports:
- name: http
protocol: TCP
port: 80
targetPort: 80


@@ -1,113 +0,0 @@
# https://kubernetes.io/docs/concepts/workloads/controllers/deployment/
apiVersion: apps/v1
kind: Deployment
metadata:
name: cic-user-server
namespace: grassroots
labels:
app: cic-user-server
annotations:
keel.sh/policy: "glob:master-*"
keel.sh/trigger: poll
keel.sh/pollSchedule: "@every 5m"
spec:
selector:
matchLabels:
app: cic-user-server
replicas: 1
strategy:
rollingUpdate:
maxSurge: 25%
maxUnavailable: 25%
type: RollingUpdate
template:
metadata:
labels:
app: cic-user-server
group: cic
tier: backend
spec:
containers:
- name: cic-user-server
#image: registry.gitlab.com/grassrootseconomics/cic-internal-integration/cic-ussd:master-fad0a4b5-1628267359 # {"$imagepolicy": "flux-system:cic-ussd"}
image: registry.gitlab.com/grassrootseconomics/cic-internal-integration/cic-ussd:latest
command: ["/root/start_cic_user_server.sh", "-vv"]
resources:
requests:
cpu: 100m
memory: 100Mi
limits:
cpu: 500m
memory: 250Mi
env:
- name: DATABASE_USER
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_USER
- name: DATABASE_HOST
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_HOST
- name: DATABASE_PORT
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_PORT
- name: DATABASE_ENGINE
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_ENGINE
- name: DATABASE_DRIVER
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_DRIVER
- name: DATABASE_PASSWORD
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_PASSWORD
- name: DATABASE_POOL_SIZE
value: "0"
- name: DATABASE_NAME
value: cic_ussd
- name: HTTP_PORT_CIC_USER_SERVER
value: "9500"
- name: PGP_KEYS_PATH
value: /tmp/src/keys/
- name: PGP_EXPORTS_DIR
value: /tmp/src/keys/
ports:
- containerPort: 9500
name: server
volumeMounts:
- mountPath: /tmp/src/keys
name: pgp
readOnly: true
volumes:
#- name: pgp
# secret:
# secretName: pgp
- name: pgp
configMap:
name: pgp-meta-test
restartPolicy: Always
---
# https://kubernetes.io/docs/concepts/services-networking/service/
apiVersion: v1
kind: Service
metadata:
name: cic-user-server-svc
namespace: grassroots
spec:
selector:
app: cic-user-server
type: ClusterIP
ports:
- name: server
protocol: TCP
port: 80
targetPort: 9500


@@ -1,122 +0,0 @@
# https://kubernetes.io/docs/concepts/workloads/controllers/deployment/
apiVersion: apps/v1
kind: Deployment
metadata:
name: cic-user-tasker
namespace: grassroots
labels:
app: cic-user-tasker
annotations:
keel.sh/policy: "glob:master-*"
keel.sh/trigger: poll
keel.sh/pollSchedule: "@every 5m"
spec:
selector:
matchLabels:
app: cic-user-tasker
replicas: 1
strategy:
rollingUpdate:
maxSurge: 25%
maxUnavailable: 25%
type: RollingUpdate
template:
metadata:
labels:
app: cic-user-tasker
group: cic
task: queue
spec:
containers:
- name: cic-user-tasker
#image: registry.gitlab.com/grassrootseconomics/cic-internal-integration/cic-ussd:master-7a3cb7ab-1627053361 # {"$imagepolicy": "flux-system:cic-ussd"}
image: registry.gitlab.com/grassrootseconomics/cic-internal-integration/cic-ussd:latest
imagePullPolicy: Always
command: ["/root/start_cic_user_tasker.sh", "-q", "cic-ussd", "-vv"]
resources:
requests:
cpu: 100m
memory: 100Mi
limits:
cpu: 500m
memory: 250Mi
env:
- name: APP_PASSWORD_PEPPER
valueFrom:
secretKeyRef:
name: cic-ussd-secret
key: app_password_pepper
- name: DATABASE_USER
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_USER
- name: DATABASE_HOST
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_HOST
- name: DATABASE_PORT
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_PORT
- name: DATABASE_ENGINE
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_ENGINE
- name: DATABASE_DRIVER
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_DRIVER
- name: DATABASE_PASSWORD
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_PASSWORD
- name: DATABASE_NAME
value: cic_ussd
- name: DATABASE_POOL_SIZE
value: "0"
- name: CELERY_BROKER_URL
valueFrom:
configMapKeyRef:
name: redis-conn-common
key: CELERY_BROKER_URL
- name: CELERY_RESULT_URL
valueFrom:
configMapKeyRef:
name: redis-conn-common
key: CELERY_RESULT_URL
- name: REDIS_HOST
value: redis-master
- name: REDIS_PORT
value: "6379"
- name: REDIS_DATABASE
value: "0"
- name: CIC_META_URL
value: http://cic-meta-server:80
- name: PGP_KEYS_PATH
value: /tmp/src/keys/
- name: PGP_EXPORTS_DIR
value: /tmp/src/keys/
- name: PGP_PRIVATE_KEYS
value: privatekey.asc
- name: PGP_PASSPHRASE
value: queenmarlena
volumeMounts:
- mountPath: /tmp/src/keys
name: pgp
readOnly: true
volumes:
#- name: pgp
# secret:
# secretName: pgp
- name: pgp
configMap:
name: pgp-meta-test
restartPolicy: Always


@@ -1,139 +0,0 @@
# https://kubernetes.io/docs/concepts/workloads/controllers/deployment/
apiVersion: apps/v1
kind: Deployment
metadata:
name: cic-user-ussd-server
namespace: grassroots
labels:
app: cic-user-ussd-server
annotations:
keel.sh/policy: "glob:master-*"
keel.sh/trigger: poll
keel.sh/pollSchedule: "@every 5m"
spec:
selector:
matchLabels:
app: cic-user-ussd-server
replicas: 1
strategy:
rollingUpdate:
maxSurge: 25%
maxUnavailable: 25%
type: RollingUpdate
template:
metadata:
labels:
app: cic-user-ussd-server
group: cic
tier: backend
spec:
containers:
- name: cic-user-ussd-server
#image: registry.gitlab.com/grassrootseconomics/cic-internal-integration/cic-ussd:master-7a3cb7ab-1627053361 # {"$imagepolicy": "flux-system:cic-ussd"}
image: registry.gitlab.com/grassrootseconomics/cic-internal-integration/cic-ussd:latest
imagePullPolicy: Always
command: ["/root/start_cic_user_ussd_server.sh", "-vv"]
resources:
requests:
cpu: 100m
memory: 100Mi
limits:
cpu: 500m
memory: 250Mi
env:
- name: APP_PASSWORD_PEPPER
valueFrom:
secretKeyRef:
name: cic-ussd-secret
key: app_password_pepper
- name: DATABASE_USER
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_USER
- name: DATABASE_HOST
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_HOST
- name: DATABASE_PORT
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_PORT
- name: DATABASE_ENGINE
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_ENGINE
- name: DATABASE_DRIVER
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_DRIVER
- name: DATABASE_PASSWORD
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_PASSWORD
- name: DATABASE_POOL_SIZE
value: "0"
- name: CELERY_BROKER_URL
valueFrom:
configMapKeyRef:
name: redis-conn-common
key: CELERY_BROKER_URL
- name: CELERY_RESULT_URL
valueFrom:
configMapKeyRef:
name: redis-conn-common
key: CELERY_RESULT_URL
- name: REDIS_HOST
value: redis-master
- name: REDIS_PORT
value: "6379"
- name: REDIS_DATABASE
value: "0"
- name: DATABASE_NAME
value: cic_ussd
- name: SERVER_PORT
value: "9000"
- name: APP_ALLOWED_IP
value: "0.0.0.0/0"
- name: CIC_META_URL
value: http://cic-meta-server:80
- name: PGP_KEYS_PATH
value: /tmp/src/keys/
- name: PGP_EXPORTS_DIR
value: /tmp/src/keys/
- name: PGP_PRIVATE_KEYS
value: privatekey.asc
- name: PGP_PASSPHRASE
value: queenmarlena # TODO move to secret
volumeMounts:
- mountPath: /tmp/src/keys
name: pgp
ports:
- containerPort: 9000
name: server
volumes:
- name: pgp
configMap:
name: pgp-meta-test
restartPolicy: Always
---
# https://kubernetes.io/docs/concepts/services-networking/service/
apiVersion: v1
kind: Service
metadata:
name: cic-user-ussd-svc
namespace: grassroots
spec:
selector:
app: cic-user-ussd-server
type: ClusterIP
ports:
- name: server
protocol: TCP
port: 80
targetPort: 9000
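`APP_ALLOWED_IP` set to `0.0.0.0/0` above admits requests from any IPv4 source, i.e. no restriction. A sketch of such a CIDR gate using the stdlib `ipaddress` module (how a server might check callers; the actual cic-ussd logic may differ):

```python
import ipaddress

def ip_allowed(client_ip: str, allowed_cidr: str = "0.0.0.0/0") -> bool:
    # "0.0.0.0/0" matches every IPv4 address, so the default is permissive.
    return ipaddress.ip_address(client_ip) in ipaddress.ip_network(allowed_cidr)
```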


@@ -1,15 +0,0 @@
apiVersion: bitnami.com/v1alpha1
kind: SealedSecret
metadata:
creationTimestamp: null
name: cic-ussd-secret
namespace: grassroots
spec:
encryptedData:
app_password_pepper: AgCrkW9x9iLDjcmmNHTopd1CXAdamuqWkiS3Tviev5OtR5d4FBw9L9Qm8P9ZsBiH9ucQSNu4XLxUuIEgyBLEPQDM5DELl+qvCmPboQ73kPxjEH4/ppqE1QnJHBFcTqfkUVDu4dBkmGPCbYAGrWqDa1bZt1Hujl+ZpWuc919nN2yxFEArs8jt8JIP2bbGKwPl21MDNP53ITtpw1euBswxUzLRcfUrw+6ghce8faFjDEnCS9NKftCwBFMQN2ddNn8aY0K1Dl/DJLTJw9c6L+3YAHzaQAkNz2I6W6ERri6Mmkc1bE1cujvftNxrWkPqh9bJD163xhMbA6vQd8rEpQelilBQpT0plcH5dj11mhAX+n3IR4Xp5kD+fwwPPgoQ5P3TJj4nfAccM/ubVVR6vLCXrO2TV56V8YgQ2pd0H3H8IJYRdU6LE9xLWdG5J32EKrhnhTNaGcFpWq3g4UUv6Fy0z7iZ8TkBce13zCiEf7AXeXzdj2KDgVW6/FBLknJoDd2m/j4GA5UC6MiKXc/X9sgHeSIOstR5nGBuYYr/12FbAPdnvV8zQzfHQsSCysciLQBqFhC5Yeepwu+UYsYCDWKGgvmC284zl52VkK2G6cXi77qcKNc6NtViZpKkJ0vRysgLLpBiDERB/NOrXef+kA6hJXsc5hqAPJjceh0XV0iT/4FA/2qq0A7qfm4meSJAxrj2PssoVQSUQmo2jopTCsa7IEq9opqy0yJaXu+dJG32IHimDLcAm1Sp0uumcd9MOg==
template:
metadata:
creationTimestamp: null
name: cic-ussd-secret
namespace: grassroots


@@ -1,68 +0,0 @@
# https://kubernetes.io/docs/concepts/configuration/configmap/
kind: ConfigMap
apiVersion: v1
metadata:
name: contract-migration-envlist
namespace: grassroots
data:
envlist: |
SYNCER_LOOP_INTERVAL
SSL_ENABLE_CLIENT
SSL_CERT_FILE
SSL_KEY_FILE
SSL_PASSWORD
SSL_CA_FILE
BANCOR_DIR
REDIS_HOST
REDIS_PORT
REDIS_DB
PGP_PRIVATEKEY_FILE
PGP_PASSPHRASE
DATABASE_USER
DATABASE_PASSWORD
DATABASE_NAME
DATABASE_HOST
DATABASE_PORT
DATABASE_ENGINE
DATABASE_DRIVER
DATABASE_DEBUG
TASKS_AFRICASTALKING
TASKS_SMS_DB
TASKS_LOG
TASKS_TRACE_QUEUE_STATUS
TASKS_TRANSFER_CALLBACKS
DEV_MNEMONIC
DEV_ETH_RESERVE_ADDRESS
DEV_ETH_ACCOUNTS_INDEX_ADDRESS
DEV_ETH_RESERVE_AMOUNT
DEV_ETH_ACCOUNT_BANCOR_DEPLOYER
DEV_ETH_ACCOUNT_CONTRACT_DEPLOYER
DEV_ETH_ACCOUNT_GAS_PROVIDER
DEV_ETH_ACCOUNT_RESERVE_OWNER
DEV_ETH_ACCOUNT_RESERVE_MINTER
DEV_ETH_ACCOUNT_ACCOUNTS_INDEX_OWNER
DEV_ETH_ACCOUNT_ACCOUNTS_INDEX_WRITER
DEV_ETH_ACCOUNT_SARAFU_OWNER
DEV_ETH_ACCOUNT_SARAFU_GIFTER
DEV_ETH_ACCOUNT_APPROVAL_ESCROW_OWNER
DEV_ETH_ACCOUNT_SINGLE_SHOT_FAUCET_OWNER
DEV_ETH_SARAFU_TOKEN_NAME
DEV_ETH_SARAFU_TOKEN_SYMBOL
DEV_ETH_SARAFU_TOKEN_DECIMALS
DEV_ETH_SARAFU_TOKEN_ADDRESS
DEV_PGP_PUBLICKEYS_ACTIVE_FILE
DEV_PGP_PUBLICKEYS_TRUSTED_FILE
DEV_PGP_PUBLICKEYS_ENCRYPT_FILE
CIC_REGISTRY_ADDRESS
CIC_APPROVAL_ESCROW_ADDRESS
CIC_TOKEN_INDEX_ADDRESS
CIC_ACCOUNTS_INDEX_ADDRESS
CIC_DECLARATOR_ADDRESS
CIC_CHAIN_SPEC
ETH_PROVIDER
ETH_ABI_DIR
SIGNER_SOCKET_PATH
SIGNER_SECRET
CELERY_BROKER_URL
CELERY_RESULT_URL
META_PROVIDER
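The `envlist` above is a newline-separated allowlist of variable names. A minimal sketch of how such a list could be consumed — filtering a process environment down to only the listed names — is shown below; the exact consumption mechanism in the migration scripts is an assumption, not taken from this repository.

```python
def filter_env(envlist_text, environ):
    """Keep only the variables whose names appear in the envlist (one name per line)."""
    wanted = {line.strip() for line in envlist_text.splitlines() if line.strip()}
    return {k: v for k, v in environ.items() if k in wanted}

# Hypothetical inputs for illustration:
envlist = "REDIS_HOST\nREDIS_PORT\nDATABASE_NAME\n"
env = {
    "REDIS_HOST": "redis-master",
    "REDIS_PORT": "6379",
    "PATH": "/usr/bin",        # not in the envlist, so dropped
    "DATABASE_NAME": "cic_eth",
}
print(filter_env(envlist, env))
```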


@@ -1,122 +0,0 @@
# https://kubernetes.io/docs/concepts/workloads/controllers/job/
apiVersion: batch/v1
kind: Job
metadata:
name: contract-migration
namespace: grassroots
labels:
app: contract-migration
spec:
backoffLimit: 6
template:
spec:
imagePullSecrets:
- name: gitlab-internal-integration-registry
# securityContext:
# runAsUser: 1000
# runAsGroup: 1000
restartPolicy: Never
containers:
- name: contract-migration
#image: registry.gitlab.com/grassrootseconomics/cic-internal-integration/contract-migration:master-621780e9-1618865959 # {"$imagepolicy": "flux-system:cic-contract-migration"}
image: registry.gitlab.com/grassrootseconomics/cic-internal-integration/contract-migration:latest
command: ["./run_job.sh"]
# command: ["sleep", "3600"]
env:
- name: CIC_REGISTRY_ADDRESS
valueFrom:
configMapKeyRef:
name: contract-migration-output
key: CIC_REGISTRY_ADDRESS
- name: CIC_TRUST_ADDRESS # - name: ETH_GAS_PROVIDER_ADDRESS
valueFrom:
configMapKeyRef:
name: contract-migration-output
key: CIC_TRUST_ADDRESS
- name: DATABASE_USER
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_USER
- name: DATABASE_HOST
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_HOST
- name: DATABASE_PORT
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_PORT
- name: DATABASE_ENGINE
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_ENGINE
- name: DATABASE_DRIVER
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_DRIVER
- name: DATABASE_PASSWORD
valueFrom:
configMapKeyRef:
name: postgresql-conn-common
key: DATABASE_PASSWORD
- name: CELERY_BROKER_URL
valueFrom:
configMapKeyRef:
name: redis-conn-common
key: CELERY_BROKER_URL
- name: CELERY_RESULT_URL
valueFrom:
configMapKeyRef:
name: redis-conn-common
key: CELERY_RESULT_URL
- name: DATABASE_NAME
value: cic_eth
- name: REDIS_HOST
value: redis-master
- name: REDIS_PORT
value: "6379"
- name: REDIS_DB
value: "0"
- name: DEV_PIP_EXTRA_INDEX_URL
value: https://pip.grassrootseconomics.net:8433
- name: ETH_PROVIDER
value: http://bloxberg-validator.grassroots.svc.cluster.local:8547
- name: ETH_PROVIDER_HOST
value: bloxberg-validator.grassroots.svc.cluster.local
- name: ETH_PROVIDER_PORT
value: "8547"
- name: CIC_CHAIN_SPEC
value: "evm:bloxberg:8996"
- name: CIC_DATA_DIR
value: /tmp/cic/config
- name: RUN_MASK
value: "3" # bit flags; 1: contract migrations, 2: seed data
- name: DEV_FAUCET_AMOUNT
value: "50000000"
- name: CIC_DEFAULT_TOKEN_SYMBOL
value: GFT
- name: DEV_SARAFU_DEMURRAGE_LEVEL
value: "196454828847045000000000000000000"
- name: DEV_ETH_GAS_PRICE
value: "1"
- name: TOKEN_TYPE
value: giftable_erc20_token
- name: WALLET_KEY_FILE
value: /root/keystore/UTC--2021-01-08T17-18-44.521011372Z--eb3907ecad74a0013c259d5874ae7f22dcbcc95c
volumeMounts:
- mountPath: /tmp/cic/config
name: migration-output
resources:
requests:
memory: "250Mi"
cpu: "100m"
limits:
memory: "500Mi"
cpu: "250m"
volumes:
- name: migration-output
emptyDir: {}
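The `RUN_MASK` variable in the Job above selects migration steps via bit flags (per its inline comment: 1 enables contract migrations, 2 enables data seeding, so `"3"` runs both). A small sketch of that decoding logic, assuming those are the only two flags:

```python
# Assumed flag values, matching the RUN_MASK comment in the Job manifest:
CONTRACT_MIGRATIONS = 1  # bit 0
SEED_DATA = 2            # bit 1

def decode_run_mask(mask):
    """Return the migration steps enabled by the RUN_MASK bit flags."""
    steps = []
    if mask & CONTRACT_MIGRATIONS:
        steps.append("contract migrations")
    if mask & SEED_DATA:
        steps.append("seed data")
    return steps

print(decode_run_mask(3))  # mask "3" from the manifest enables both steps
```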


@@ -1,11 +0,0 @@
# https://kubernetes.io/docs/concepts/configuration/configmap/
# PURPOSE: These values are *manually* populated after the contract migration container runs.
# The contract migration pod prints these variables to STDOUT.
kind: ConfigMap
apiVersion: v1
metadata:
name: contract-migration-output
namespace: grassroots
data:
CIC_REGISTRY_ADDRESS: "0xea6225212005e86a4490018ded4bf37f3e772161"
CIC_TRUST_ADDRESS: "0xEb3907eCad74a0013c259D5874AE7f22DcBcC95C"
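Since the ConfigMap above is filled in by hand from the migration job's STDOUT, the copy step could be scripted. A minimal sketch, assuming the job prints `KEY=value` lines (the log format is an assumption, as are the sample log contents):

```python
def extract_outputs(log_text, keys):
    """Pull KEY=value lines for the given keys out of a job's log output."""
    found = {}
    for line in log_text.splitlines():
        key, sep, value = line.partition("=")
        if sep and key.strip() in keys:
            found[key.strip()] = value.strip()
    return found

# Hypothetical log output from the contract-migration pod:
sample_log = """\
deploying contracts...
CIC_REGISTRY_ADDRESS=0xea6225212005e86a4490018ded4bf37f3e772161
CIC_TRUST_ADDRESS=0xEb3907eCad74a0013c259D5874AE7f22DcBcC95C
done
"""
print(extract_outputs(sample_log, {"CIC_REGISTRY_ADDRESS", "CIC_TRUST_ADDRESS"}))
```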

Some files were not shown because too many files have changed in this diff.