Compare commits

...

151 Commits

Author SHA1 Message Date
5ed8bd6e54
Merge "origin/master" into "philip/bump-test-coverage" 2022-01-04 19:35:23 +03:00
9398cd98d0
Default to max gas oracle. 2022-01-04 18:39:08 +03:00
f67702cc79
Simplifies api module. 2022-01-04 18:38:48 +03:00
e4c41c2d3e
Adds log levels for caplog since pytest isn't run in debug. 2022-01-04 17:52:38 +03:00
c5ee59f836
Adds faker requirements 2022-01-04 14:14:24 +03:00
0b64cd1fb8
Refactors logg to enable use of caplog. 2022-01-04 13:39:26 +03:00
7e0737a001
Changes to enable testing migrations. 2022-01-04 13:36:29 +03:00
c04caece8d
Upgrades cic-notify test suite. 2022-01-04 13:34:47 +03:00
214637a9f5
Incorporates all test rehabilitation changes. 2022-01-03 21:20:27 +03:00
af21fd004d
Adds missing test files. 2022-01-03 21:19:27 +03:00
b43d9618d9
Addresses remaining bullets in hardening issue. 2021-12-29 17:18:17 +03:00
55266fd721
Adds system guardians 2021-12-29 17:17:13 +03:00
6aa836b1d2
Integrates translation management. 2021-12-29 17:16:41 +03:00
9b639c4ea9
Adds ability to temporarily cache language selection upon account creation. 2021-12-29 17:15:28 +03:00
54c9fe34ce
Retires old translation files. 2021-12-29 17:14:24 +03:00
599c01931e
Makes language list dynamic-ish. 2021-12-29 17:13:57 +03:00
c2c7eb5e6d
Adds ability to select language before creating account. 2021-12-29 17:10:25 +03:00
nolash
7618abcda3
Bump deps 2021-12-22 21:31:39 +00:00
nolash
94d8ddb164
Amend okota dependency conflict 2021-12-22 20:34:12 +00:00
nolash
7f958d4be8
Rehabilitate test for balance incoming check (already counted from PENDING state) 2021-12-22 20:23:02 +00:00
nolash
03b06ca8c1 Merge remote-tracking branch 'origin/master' into lash/improve-cache 2021-12-22 20:09:54 +00:00
nolash
14449f5c6d
Rehabilitate tests for cic-eth 2021-12-22 20:02:05 +00:00
nolash
15618fa061
Rehabilitate tests after url format change 2021-12-22 19:12:08 +00:00
nolash
3a52a78e93
Change poker method 2021-12-21 14:23:33 +00:00
nolash
6562d37a30
Add demurrage poker script 2021-12-21 14:19:26 +00:00
nolash
c5efa56885
Loosen requirements restrictions, fix missing explicit nonce in global contract deploy last step 2021-12-21 14:18:09 +00:00
d6346bb87b
Merge "origin/master" into "lash/improve-cache" 2021-12-20 11:14:04 +03:00
9050d331cd
Bumps lib versions 2021-12-20 11:12:39 +03:00
99997df248
Bumps deps 2021-12-20 11:11:23 +03:00
d04a4bf5c6
Adds missing import: tempfile 2021-12-20 11:10:35 +03:00
43c49dd527
Adds minor fixes:
- Added appropriate imports:
  - from okota.token_index.index import to_identifier
  - from cic_eth.error import SignerError
- cleaned up balance func.
2021-12-20 11:07:41 +03:00
511557c242
Cleaned up docker-compose.yml:
- Added ETH_MIN_FEE_LIMIT.
- Removed "" from callback task definitions.
2021-12-20 11:03:23 +03:00
887799962f
Couldn't get python setup.py install to work without it fussing:
```
 => => # Processing dependencies for cic-cache==0.3.1
 => => # Searching for eth-contract-registry~=0.7.1a2
 => => # Reading https://pypi.org/simple/eth-contract-registry/
 => => # Couldn't retrieve index page for 'eth-contract-registry'
 => => # Scanning index of all packages (this may take a while)
 => => # Reading https://pypi.org/simple/
```
2021-12-20 11:01:39 +03:00
3acc3cf417
Bumps deps. 2021-12-20 10:58:21 +03:00
ceeb246ce2
Retires unused override_requirements.txt file. 2021-12-20 10:56:30 +03:00
45499ec839
Refactor to match proof checked for by verify_proofs task. 2021-12-20 10:56:01 +03:00
77bdee049c
Adds token decimals to demurrage token deployment. 2021-12-20 10:54:28 +03:00
nolash
0cf6489f49 Merge remote-tracking branch 'origin/master' into lash/improve-cache 2021-12-08 06:44:31 +01:00
nolash
31256b3650
Remove custom port in pip url in dockers 2021-12-07 21:54:04 +01:00
nolash
380550cb84
Add higher fee limit in base task 2021-12-07 21:45:07 +01:00
nolash
a356585c6a
Remove port from pip 2021-12-06 17:45:44 +01:00
nolash
4809bc8c22
Bump confini 2021-12-06 14:21:26 +01:00
nolash
760f618943
WIP upgrade deps 2021-12-04 11:39:13 +01:00
nolash
39de1837c2
Upgrade deps to non-prerelease (temporarily removed transfer-auth) 2021-11-15 14:23:24 +01:00
nolash
97e45c87d7
WIP move to whole patch versions in deps 2021-11-15 14:07:54 +01:00
nolash
4658a5d8e5
Bump cic-cache version 2021-11-13 07:51:01 +01:00
nolash
995d4e0bd0
Add remaining database name prefix changes 2021-11-10 09:55:30 +01:00
nolash
140b72a72b
Use database prefix instead of name 2021-11-10 09:07:23 +01:00
nolash
21b0c4a48b
Change query parse order 2021-11-08 09:58:22 +01:00
nolash
0b66462c11
Update openapi spec, enable queries with no ranges 2021-11-04 09:42:35 +01:00
nolash
f18f865231
WIP openapi spec for cic-cache-server 2021-11-04 07:59:38 +01:00
nolash
ad1c241a85
Reorganize url path params in cic-cache-server 2021-11-04 06:06:34 +01:00
nolash
99b0fb5aed Merge branch 'lash/verify-cache' into lash/bloxberg-seeding 2021-11-04 04:26:50 +01:00
nolash
29423449b7 Merge remote-tracking branch 'origin/master' into lash/verify-cache 2021-11-04 04:23:47 +01:00
nolash
58e766aa58
Remove explicit config in db migration 2021-11-04 04:18:27 +01:00
nolash
2ebcd3e3de Merge remote-tracking branch 'origin/master' into lash/bloxberg-seeding 2021-11-02 18:49:49 +01:00
nolash
c440b049cc
Add config dirs 2021-11-02 16:35:44 +01:00
nolash
09034af5bc
Bump cic-eth version 2021-11-02 16:03:29 +01:00
nolash
dc80bae673
Upgrade cic-eth in migrations 2021-11-02 15:31:00 +01:00
nolash
d88ae00b72
Add celery cli args with defaults from redis 2021-10-31 07:58:35 +01:00
nolash
7a366edb9d
WIP rehabilitate cic-eth-inspect 2021-10-30 19:09:17 +02:00
nolash
0b912b99b6
Add role listing to cic-eth tag cli tool 2021-10-30 13:19:31 +02:00
nolash
cbd4aef004
Add action confirm on sweep script 2021-10-30 10:25:39 +02:00
nolash
6f7f91780b
Add script to sweep gas from signer accounts 2021-10-30 09:02:04 +02:00
nolash
83ecdaf023
Connect token filter to tracker 2021-10-29 16:35:11 +02:00
nolash
e2ef9b43c8
Reactivate cic-eth-tasker dependency for bootstrap 2021-10-29 15:58:34 +02:00
nolash
6e58e4e4de
Remove nasty residue from bootstrap 2021-10-29 14:40:06 +02:00
nolash
f46c9b0e7d Merge remote-tracking branch 'origin/master' into lash/bloxberg-seeding 2021-10-29 11:39:40 +02:00
nolash
6ca3fd55d7
Add gas cache oracle connection for erc20 2021-10-29 08:45:42 +02:00
nolash
258ed420b8 Merge branch 'lash/tmp-bloxberg-seeding' into lash/bloxberg-seeding 2021-10-29 07:35:08 +02:00
nolash
1c022e9853
Added changes to wrong branch 2021-10-29 07:33:38 +02:00
nolash
d35e144723
Register gas cache only for registered tokens 2021-10-29 07:00:25 +02:00
nolash
fb953d0318
Add gas cache backend, test, filter 2021-10-28 21:45:47 +02:00
nolash
858bbdb69a Merge remote-tracking branch 'origin/master' into lash/local-dev-improve 2021-10-28 14:36:45 +02:00
nolash
66e23e4e20
Test config cleanup 2021-10-28 14:11:11 +02:00
nolash
546256c86a
Better gas gifting amounts and thresholds estimation, fix broken cic-eth imports 2021-10-28 13:34:39 +02:00
nolash
d9720bd0aa Merge remote-tracking branch 'origin/lash/local-dev-improve' into lash/bloxberg-seeding 2021-10-28 05:41:27 +02:00
nolash
e9e9f66d97
Correct wrong change for docker registries 2021-10-28 05:39:44 +02:00
nolash
0d640fab57 Merge remote-tracking branch 'origin/lash/local-dev-improve' into lash/bloxberg-seeding 2021-10-28 05:29:07 +02:00
nolash
4ce85bc824
Remove faulty default registry in dockerfiles 2021-10-28 05:27:13 +02:00
nolash
ce67f83457
Remove faulty default registry in docker compose 2021-10-28 05:24:11 +02:00
nolash
13f2e17931
Remove accidental 0 value override for syncer offset to trackers 2021-10-28 05:18:54 +02:00
nolash
f236234682 Merge remote-tracking branch 'origin/master' into lash/local-dev-improve 2021-10-27 16:58:38 +02:00
nolash
1f37632f0f
WIP Replace env vars in data-seeding with well-known 2021-10-27 16:56:03 +02:00
nolash
03d7518f8c Merge branch 'lash/local-dev-improve' of gitlab.com:grassrootseconomics/cic-internal-integration into lash/local-dev-improve 2021-10-27 11:52:31 +02:00
nolash
67152d0df1
Replace KEYSTORE_PATH with WALLET_KEY_FILE in data seeding 2021-10-27 11:51:20 +02:00
9168322941
Revert base image changes. 2021-10-27 12:41:35 +03:00
2fbd338e24
Adds correct base image. 2021-10-27 11:44:23 +03:00
c7d7f2a64d
Remove force reset. 2021-10-27 11:44:08 +03:00
16153df2f0
Resolve creation of phone dir when it already exists. 2021-10-27 11:43:35 +03:00
nolash
4391fa3aff Merge remote-tracking branch 'origin/master' into lash/local-dev-improve 2021-10-25 21:01:27 +02:00
nolash
7ce68021bd Merge remote-tracking branch 'origin/master' into lash/verify-cache 2021-10-25 20:20:40 +02:00
nolash
cd602dee49
Remove WIP docker compose file 2021-10-25 20:12:32 +02:00
nolash
a548ba6fce
Chainlib upgrade to handle none receipts, rpc node debug output in bootstrap 2021-10-25 20:09:35 +02:00
nolash
a6de7e9fe0 Merge remote-tracking branch 'origin/master' into lash/local-dev-improve 2021-10-20 20:02:19 +02:00
nolash
e705a94873
Resolve notify/ussd dependency conflict 2021-10-20 10:07:19 +02:00
nolash
3923de0a81
Update pip args handling in notify 2021-10-19 23:01:55 +02:00
nolash
5c0250b5b9
Rehabilitate cic-cache db migration 2021-10-19 22:58:10 +02:00
nolash
3285d8dfe5
Implement asynchronous deploys in bootstrap 2021-10-19 22:08:17 +02:00
nolash
9d349f1579
Add debug level env var to bootstrap dev container 2021-10-19 19:54:59 +02:00
nolash
837a1770d1
Upgrade deps more chainlib in bootstrap 2021-10-19 10:10:39 +02:00
003febec9d
Bumps contract migration deps. 2021-10-19 10:38:21 +03:00
f066a32ce8
Adds libffi-dev for local git-tea. 2021-10-19 10:38:08 +03:00
nolash
ad493705ad
Upgrade deps 2021-10-18 17:16:28 +02:00
nolash
b765c4ab88
More wrestling with chainlib-eth deps 2021-10-18 17:06:31 +02:00
nolash
e4935d3b58 Merge branch 'lash/split-migration' of gitlab.com:grassrootseconomics/cic-internal-integration into lash/split-migration 2021-10-18 16:49:58 +02:00
nolash
f88f0e321b
Upgrade chainlib-eth dep 2021-10-18 16:48:14 +02:00
31fa721397
Add cic-notify container 2021-10-18 17:17:53 +03:00
16481da193
Merge remote-tracking branch 'origin/lash/split-migration' into lash/split-migration 2021-10-18 16:54:23 +03:00
97a48cd8c6
Improves ussd deps. 2021-10-18 16:53:38 +03:00
nolash
7732412341 Merge branch 'lash/split-migration' of gitlab.com:grassrootseconomics/cic-internal-integration into lash/split-migration 2021-10-18 15:51:38 +02:00
nolash
649b124a61
Upgrade chainqueue dep 2021-10-18 15:50:45 +02:00
7601e3eeff
Corrects breakages in cic-ussd 2021-10-18 15:19:32 +03:00
60a9efc88b
Merge remote-tracking branch 'origin/lash/split-migration' into lash/split-migration 2021-10-18 15:18:33 +03:00
45011b58c4
Cleans up configs. 2021-10-18 15:11:31 +03:00
nolash
f1a0b4ee7c Merge branch 'lash/split-migration' of gitlab.com:grassrootseconomics/cic-internal-integration into lash/split-migration 2021-10-18 14:10:52 +02:00
nolash
c57abb7ad5
Upgrade deps in cic-eth, allow for new chain spec format 2021-10-18 14:08:39 +02:00
930a99c974
Bumps cic-types version. 2021-10-18 06:52:49 +03:00
b0935caab8
Fixes imports. 2021-10-18 06:52:28 +03:00
nolash
bdd5f6fcec
Update readme in data seeding 2021-10-17 19:37:29 +02:00
nolash
a293c2460e
Consolidate dir handling in data seeding scripts 2021-10-17 19:27:15 +02:00
nolash
0ee6400d7d
WIP rehabilitate ussd builds 2021-10-17 18:32:08 +02:00
nolash
677fb346fd
Add data seeding preparation step, rehabilitation of non-custodial seeding 2021-10-17 18:05:00 +02:00
nolash
ea3c75e755
Rehabilitate traffic script 2021-10-17 14:30:42 +02:00
nolash
0b2f22c416
Rehabilitate cic-user-server 2021-10-16 20:54:41 +02:00
nolash
24385ea27d
Rehabilitate cic-cache 2021-10-16 14:03:05 +02:00
nolash
9a154a8046
WIP rehabilitate cic-cache 2021-10-16 08:23:32 +02:00
nolash
d3576c8ec7
Add eth retrier to new docker compose file 2021-10-16 07:08:44 +02:00
nolash
79ee2bf4ff
Add eth tracker, dispatcher to new docker compose file 2021-10-16 07:04:19 +02:00
nolash
89ac70371a
Remove single function worker in test 2021-10-16 00:18:08 +02:00
nolash
5ea0318b0b
Fix default token symbol config setting for aux 2021-10-15 23:21:57 +02:00
nolash
5dfb96ec0c
Add new cic-signer app 2021-10-15 23:11:00 +02:00
nolash
4634ac41df Merge remote-tracking branch 'origin/master' into lash/split-migration 2021-10-15 22:19:01 +02:00
nolash
97f4fe8ca7
refactor docker-compose cic-eth-tasker, bootstrap (aka contract migration) 2021-10-15 22:16:45 +02:00
nolash
b36529f7fa
WIP local docker registry adaptations 2021-10-15 20:27:03 +02:00
nolash
a6675f2348
Add environment sourcing for cic-eth-tasker docker compose 2021-10-15 18:52:37 +02:00
nolash
e3116d74d6
No export 2021-10-15 12:54:16 +02:00
nolash
c0bbdc9bec
Add missing file 2021-10-15 08:43:04 +02:00
nolash
396bd4f300
update preliminary readme 2021-10-15 08:38:01 +02:00
nolash
58547b4067
Bump cic-eth-registry 2021-10-15 07:44:50 +02:00
nolash
9009815d78
Add trust address to contract migration config, get cic-eth default token from registry 2021-10-14 21:31:04 +02:00
nolash
2da19f5819
Add basic connectivity config directives 2021-10-14 17:40:53 +02:00
nolash
3948d5aa40
Add custodial initialization 2021-10-14 17:18:49 +02:00
nolash
ed432abb23
WIP refactor custodial initialization 2021-10-14 14:37:48 +02:00
nolash
f251b8b729
Remove dead code 2021-10-14 11:35:08 +02:00
nolash
36e791e08a
Split contract migration into three separate steps 2021-10-14 11:33:50 +02:00
nolash
71a7e3d3d5
Reinstate test config dir 2021-10-09 17:23:38 +02:00
nolash
335b7b30a4
Add okota dep 2021-10-09 16:40:28 +02:00
nolash
3b1f470ddf
Add empty config dir 2021-10-09 16:33:40 +02:00
nolash
4c9f20aa7f
Add explicit zero length tx list check for cic-cache verify 2021-10-08 11:26:09 +02:00
nolash
980191be4f
Add verify check for cache, use chainlib cli for cic-cache 2021-10-08 11:19:21 +02:00
63 changed files with 1890 additions and 498 deletions

View File

```diff
@@ -17,7 +17,7 @@ from cic_eth_registry.error import UnknownContractError
 # local imports
 from cic_eth.error import SeppukuError
 from cic_eth.db.models.base import SessionBase
-from cic_eth.eth.util import CacheGasOracle
+from cic_eth.eth.util import CacheGasOracle, MaxGasOracle

 #logg = logging.getLogger().getChild(__name__)
 logg = logging.getLogger()
@@ -41,21 +41,24 @@ class BaseTask(celery.Task):
     def create_gas_oracle(self, conn, address=None, *args, **kwargs):
-        if address == None:
-            return RPCGasOracle(
+        x = None
+        if address is None:
+            x = RPCGasOracle(
                 conn,
                 code_callback=kwargs.get('code_callback', self.get_min_fee_limit),
                 min_price=self.min_fee_price,
                 id_generator=kwargs.get('id_generator'),
                 )
         else:
-            return CacheGasOracle(
-                conn,
-                address,
-                method=kwargs.get('method'),
-                min_price=self.min_fee_price,
-                id_generator=kwargs.get('id_generator'),
-                )
+            x = MaxGasOracle(conn)
+            x.code_callback = x.get_fee_units
+
+        return x
+
+    def get_min_fee_limit(self, code):
+        return self.min_fee_limit

     def get_min_fee_limit(self, code):
@@ -84,7 +87,7 @@ class BaseTask(celery.Task):
             )
         s.apply_async()

 class CriticalTask(BaseTask):
     retry_jitter = True
     retry_backoff = True
@@ -96,7 +99,7 @@ class CriticalSQLAlchemyTask(CriticalTask):
         sqlalchemy.exc.DatabaseError,
         sqlalchemy.exc.TimeoutError,
         sqlalchemy.exc.ResourceClosedError,
         )

 class CriticalWeb3Task(CriticalTask):
@@ -104,7 +107,7 @@ class CriticalWeb3Task(CriticalTask):
         ConnectionError,
         )
     safe_gas_threshold_amount = 60000 * 3
     safe_gas_refill_amount = safe_gas_threshold_amount * 5
     safe_gas_gifter_balance = safe_gas_threshold_amount * 5 * 100
@@ -122,13 +125,13 @@ class CriticalSQLAlchemyAndSignerTask(CriticalTask):
         sqlalchemy.exc.DatabaseError,
         sqlalchemy.exc.TimeoutError,
         sqlalchemy.exc.ResourceClosedError,
         )

 class CriticalWeb3AndSignerTask(CriticalWeb3Task):
     autoretry_for = (
         ConnectionError,
         )

 @celery_app.task()
 def check_health(self):
     pass
```
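
The oracle selection in the diff above can be modeled in isolation: with no contract address, a plain RPC oracle is used; with an address, a max-gas oracle whose `code_callback` reports a fee-unit ceiling. This is a sketch only — the classes below are stand-ins, not the chainlib/cic-eth implementations, and the fee values are assumed for illustration.

```python
# Stand-in classes modeling the selection logic from the diff above.
class RPCGasOracle:
    def __init__(self, conn, code_callback=None):
        self.conn = conn
        self.code_callback = code_callback

class MaxGasOracle:
    def __init__(self, conn):
        self.conn = conn
        self.code_callback = None

    def get_fee_units(self, code=None):
        # Assumed behavior: report a fixed fee-unit ceiling regardless of code.
        return 8000000

def create_gas_oracle(conn, address=None):
    # No address: RPC-priced oracle with a minimum-fee code callback.
    if address is None:
        return RPCGasOracle(conn, code_callback=lambda code: 21000)
    # Address given: max-gas oracle, wired to report its own fee-unit ceiling.
    oracle = MaxGasOracle(conn)
    oracle.code_callback = oracle.get_fee_units
    return oracle
```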

View File

```diff
@@ -1,10 +1,9 @@
-[DATABASE]
-user = postgres
-password =
-host = localhost
-port = 5432
-name = /tmp/cic-notify.db
-#engine = postgresql
-#driver = psycopg2
-engine = sqlite
-driver = pysqlite
+[database]
+name=cic_notify_test
+user=
+password=
+host=localhost
+port=
+engine=sqlite
+driver=pysqlite
+debug=0
```

View File

@@ -0,0 +1,7 @@
```ini
[report]
omit =
    venv/*
    scripts/*
    cic_notify/db/migrations/*
    cic_notify/runnable/*
    cic_notify/version.py
```

View File

```diff
@@ -3,6 +3,7 @@ import logging
 import re

 # third-party imports
+import cic_notify.tasks.sms.db
 from celery.app.control import Inspect
 import celery
@@ -13,45 +14,16 @@ app = celery.current_app

 logging.basicConfig(level=logging.DEBUG)
 logg = logging.getLogger()

-sms_tasks_matcher = r"^(cic_notify.tasks.sms)(\.\w+)?"
-re_q = r'^cic-notify'
-
-def get_sms_queue_tasks(app, task_prefix='cic_notify.tasks.sms.'):
-    host_queues = []
-    i = Inspect(app=app)
-    qs = i.active_queues()
-    for host in qs.keys():
-        for q in qs[host]:
-            if re.match(re_q, q['name']):
-                host_queues.append((host, q['name'],))
-
-    task_prefix_len = len(task_prefix)
-    queue_tasks = []
-    for (host, queue) in host_queues:
-        i = Inspect(app=app, destination=[host])
-        for tasks in i.registered_tasks().values():
-            for task in tasks:
-                if len(task) >= task_prefix_len and task[:task_prefix_len] == task_prefix:
-                    queue_tasks.append((queue, task,))
-
-    return queue_tasks
-

 class Api:
-    # TODO: Implement callback strategy
-    def __init__(self, queue=None):
+    def __init__(self, queue: any = 'cic-notify'):
         """
         :param queue: The queue on which to execute notification tasks
         :type queue: str
         """
         self.queue = queue
-        self.sms_tasks = get_sms_queue_tasks(app)
-        logg.debug('sms tasks {}'.format(self.sms_tasks))

-    def sms(self, message, recipient):
+    def sms(self, message: str, recipient: str):
         """This function chains all sms tasks in order to send a message, log and persist said data to disk
         :param message: The message to be sent to the recipient.
         :type message: str
@@ -60,24 +32,9 @@ class Api:
         :return: a celery Task
         :rtype: Celery.Task
         """
-        signatures = []
-        for q in self.sms_tasks:
-            if not self.queue:
-                queue = q[0]
-            else:
-                queue = self.queue
-            signature = celery.signature(
-                q[1],
-                [
-                    message,
-                    recipient,
-                ],
-                queue=queue,
-            )
-            signatures.append(signature)
-        t = celery.group(signatures)()
-        return t
+        s_send = celery.signature('cic_notify.tasks.sms.africastalking.send', [message, recipient], queue=self.queue)
+        s_log = celery.signature('cic_notify.tasks.sms.log.log', [message, recipient], queue=self.queue)
+        s_persist_notification = celery.signature(
+            'cic_notify.tasks.sms.db.persist_notification', [message, recipient], queue=self.queue)
+        signatures = [s_send, s_log, s_persist_notification]
+        return celery.group(signatures)()
```
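
The rewrite above replaces runtime queue discovery via `Inspect` with a fixed list of three task signatures fanned out as one celery group. The broker-free sketch below models that shape with plain tuples standing in for `celery.signature`; the task names are taken from the diff, everything else is illustrative.

```python
# The three SMS tasks the rewritten Api.sms always dispatches, per the diff.
SMS_TASKS = [
    'cic_notify.tasks.sms.africastalking.send',
    'cic_notify.tasks.sms.log.log',
    'cic_notify.tasks.sms.db.persist_notification',
]

def build_sms_signatures(message, recipient, queue='cic-notify'):
    # One (task_name, args, queue) tuple per task, all routed to one queue,
    # mirroring how Api.sms builds its celery.group.
    return [(task, [message, recipient], queue) for task in SMS_TASKS]
```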

View File

```diff
@@ -2,7 +2,7 @@
 [alembic]
 # path to migration scripts
-script_location = migrations
+script_location = .

 # template used to generate migration files
 # file_template = %%(rev)s_%%(slug)s
@@ -27,28 +27,17 @@
 # sourceless = false

 # version location specification; this defaults
-# to migrations/versions.  When using multiple version
+# to ./versions.  When using multiple version
 # directories, initial revisions must be specified with --version-path
-# version_locations = %(here)s/bar %(here)s/bat migrations/versions
+# version_locations = %(here)s/bar %(here)s/bat ./versions

 # the output encoding used when revision files
 # are written from script.py.mako
 # output_encoding = utf-8

-sqlalchemy.url = postgres+psycopg2://postgres@localhost/cic-notify
-
-[post_write_hooks]
-# post_write_hooks defines scripts or Python functions that are run
-# on newly generated revision scripts.  See the documentation for further
-# detail and examples
-
-# format using "black" - use the console_scripts runner, against the "black" entrypoint
-# hooks=black
-# black.type=console_scripts
-# black.entrypoint=black
-# black.options=-l 79
+sqlalchemy.url = driver://user:pass@localhost/dbname

 # Logging configuration
 [loggers]
 keys = root,sqlalchemy,alembic
```

View File

```diff
@@ -11,7 +11,7 @@ config = context.config

 # Interpret the config file for Python logging.
 # This line sets up loggers basically.
-fileConfig(config.config_file_name)
+fileConfig(config.config_file_name, disable_existing_loggers=True)

 # add your model's MetaData object here
 # for 'autogenerate' support
@@ -56,11 +56,14 @@ def run_migrations_online():
     and associate a connection with the context.

     """
-    connectable = engine_from_config(
-        config.get_section(config.config_ini_section),
-        prefix="sqlalchemy.",
-        poolclass=pool.NullPool,
-    )
+    connectable = context.config.attributes.get("connection", None)
+
+    if connectable is None:
+        connectable = engine_from_config(
+            context.config.get_section(context.config.config_ini_section),
+            prefix="sqlalchemy.",
+            poolclass=pool.NullPool,
+        )

     with connectable.connect() as connection:
         context.configure(
```
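
The `env.py` change above follows the common Alembic pattern for test harnesses such as pytest-alembic: a caller may inject an already-open connection through `config.attributes["connection"]`, and only if none was injected does `env.py` build an engine from the ini file. A minimal model of that fallback, with a dict standing in for `config.attributes`:

```python
# Prefer a connection injected via attributes; otherwise build one lazily.
# engine_from_ini is a stand-in callable for alembic's engine_from_config.
def resolve_connectable(attributes, engine_from_ini):
    connectable = attributes.get("connection", None)
    if connectable is None:
        connectable = engine_from_ini()
    return connectable
```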

View File

```diff
@@ -7,7 +7,7 @@ import celery

 celery_app = celery.current_app
 logg = celery_app.log.get_default_logger()
-local_logg = logging.getLogger(__name__)
+local_logg = logging.getLogger()


 @celery_app.task
```

View File

```diff
@@ -1,5 +1,9 @@
-pytest~=6.0.1
-pytest-celery~=0.0.0a1
-pytest-mock~=3.3.1
-pysqlite3~=0.4.3
-pytest-cov==2.10.1
+Faker==11.1.0
+faker-e164==0.1.0
+pytest==6.2.5
+pytest-celery~=0.0.0
+pytest-mock==3.6.1
+pysqlite3~=0.4.6
+pytest-cov==3.0.0
+pytest-alembic==0.7.0
+requests-mock==1.9.3
```

View File

View File

@@ -0,0 +1,28 @@
```python
import pytest


def test_single_head_revision(alembic_runner):
    heads = alembic_runner.heads
    head_count = len(heads)
    assert head_count == 1


def test_upgrade(alembic_runner):
    try:
        alembic_runner.migrate_up_to("head")
    except RuntimeError:
        pytest.fail('Failed to upgrade to the head revision.')


def test_up_down_consistency(alembic_runner):
    try:
        for revision in alembic_runner.history.revisions:
            alembic_runner.migrate_up_to(revision)
    except RuntimeError:
        pytest.fail('Failed to upgrade through each revision individually.')

    try:
        for revision in reversed(alembic_runner.history.revisions):
            alembic_runner.migrate_down_to(revision)
    except RuntimeError:
        pytest.fail('Failed to downgrade through each revision individually.')
```

View File

@@ -0,0 +1,27 @@
```python
# standard imports

# external imports
from faker import Faker
from faker_e164.providers import E164Provider

# local imports
from cic_notify.db.enum import NotificationStatusEnum, NotificationTransportEnum
from cic_notify.db.models.notification import Notification

# test imports
from tests.helpers.phone import phone_number


def test_notification(init_database):
    message = 'Hello world'
    recipient = phone_number()
    notification = Notification(NotificationTransportEnum.SMS, recipient, message)
    init_database.add(notification)
    init_database.commit()

    notification = init_database.query(Notification).get(1)
    assert notification.status == NotificationStatusEnum.UNKNOWN
    assert notification.recipient == recipient
    assert notification.message == message
    assert notification.transport == NotificationTransportEnum.SMS
```

View File

@@ -0,0 +1,38 @@
```python
# standard imports
import os

# third-party imports

# local imports
from cic_notify.db import dsn_from_config


def test_dsn_from_config(load_config):
    """
    """
    # test dsn for other db formats
    overrides = {
        'DATABASE_PASSWORD': 'password',
        'DATABASE_DRIVER': 'psycopg2',
        'DATABASE_ENGINE': 'postgresql'
    }
    load_config.dict_override(dct=overrides, dct_description='Override values to test different db formats.')
    scheme = f'{load_config.get("DATABASE_ENGINE")}+{load_config.get("DATABASE_DRIVER")}'
    dsn = dsn_from_config(load_config)
    assert dsn == f"{scheme}://{load_config.get('DATABASE_USER')}:{load_config.get('DATABASE_PASSWORD')}@{load_config.get('DATABASE_HOST')}:{load_config.get('DATABASE_PORT')}/{load_config.get('DATABASE_NAME')}"

    # undoes overrides to revert engine and drivers to sqlite
    overrides = {
        'DATABASE_PASSWORD': '',
        'DATABASE_DRIVER': 'pysqlite',
        'DATABASE_ENGINE': 'sqlite'
    }
    load_config.dict_override(dct=overrides, dct_description='Override values to test different db formats.')

    # test dsn for sqlite engine
    dsn = dsn_from_config(load_config)
    scheme = f'{load_config.get("DATABASE_ENGINE")}+{load_config.get("DATABASE_DRIVER")}'
    assert dsn == f'{scheme}:///{load_config.get("DATABASE_NAME")}'
```
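
The assertions in this test pin down the two DSN shapes `dsn_from_config` must produce. The sketch below is a hypothetical reconstruction consistent with those assertions, not the actual `cic_notify.db` code; it accepts any object with a `get` method, so a plain dict works for demonstration.

```python
def dsn_from_config(config):
    # Hypothetical reconstruction from the test's assertions: sqlite engines
    # get a path-only DSN, everything else a full user:pass@host:port DSN.
    scheme = f"{config.get('DATABASE_ENGINE')}+{config.get('DATABASE_DRIVER')}"
    if config.get('DATABASE_ENGINE') == 'sqlite':
        return f"{scheme}:///{config.get('DATABASE_NAME')}"
    return (
        f"{scheme}://{config.get('DATABASE_USER')}:{config.get('DATABASE_PASSWORD')}"
        f"@{config.get('DATABASE_HOST')}:{config.get('DATABASE_PORT')}"
        f"/{config.get('DATABASE_NAME')}"
    )
```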

View File

@@ -0,0 +1,75 @@
```python
# standard imports
import logging
import os

# external imports
import pytest
import requests_mock

# local imports
from cic_notify.error import NotInitializedError, AlreadyInitializedError, NotificationSendError
from cic_notify.tasks.sms.africastalking import AfricasTalkingNotifier

# test imports
from tests.helpers.phone import phone_number


def test_africas_talking_notifier(africastalking_response, caplog):
    caplog.set_level(logging.DEBUG)
    with pytest.raises(NotInitializedError) as error:
        AfricasTalkingNotifier()
    assert str(error.value) == ''

    api_key = os.urandom(24).hex()
    sender_id = 'bar'
    username = 'sandbox'

    AfricasTalkingNotifier.initialize(username, api_key, sender_id)
    africastalking_notifier = AfricasTalkingNotifier()
    assert africastalking_notifier.sender_id == sender_id
    assert africastalking_notifier.initiated is True

    with pytest.raises(AlreadyInitializedError) as error:
        AfricasTalkingNotifier.initialize(username, api_key, sender_id)
    assert str(error.value) == ''

    with requests_mock.Mocker(real_http=False) as request_mocker:
        message = 'Hello world.'
        recipient = phone_number()
        africastalking_response.get('SMSMessageData').get('Recipients')[0]['number'] = recipient
        request_mocker.register_uri(method='POST',
                                    headers={'content-type': 'application/json'},
                                    json=africastalking_response,
                                    url='https://api.sandbox.africastalking.com/version1/messaging',
                                    status_code=200)
        africastalking_notifier.send(message, recipient)
        assert f'Africastalking response sender-id {africastalking_response}' in caplog.text

        africastalking_notifier.sender_id = None
        africastalking_notifier.send(message, recipient)
        assert f'africastalking response no-sender-id {africastalking_response}' in caplog.text

        with pytest.raises(NotificationSendError) as error:
            status = 'InvalidPhoneNumber'
            status_code = 403
            africastalking_response.get('SMSMessageData').get('Recipients')[0]['status'] = status
            africastalking_response.get('SMSMessageData').get('Recipients')[0]['statusCode'] = status_code
            request_mocker.register_uri(method='POST',
                                        headers={'content-type': 'application/json'},
                                        json=africastalking_response,
                                        url='https://api.sandbox.africastalking.com/version1/messaging',
                                        status_code=200)
            africastalking_notifier.send(message, recipient)
        assert str(error.value) == f'Sending notification failed due to: {status}'

        with pytest.raises(NotificationSendError) as error:
            recipients = []
            status = 'InsufficientBalance'
            africastalking_response.get('SMSMessageData')['Recipients'] = recipients
            africastalking_response.get('SMSMessageData')['Message'] = status
            request_mocker.register_uri(method='POST',
                                        headers={'content-type': 'application/json'},
                                        json=africastalking_response,
                                        url='https://api.sandbox.africastalking.com/version1/messaging',
                                        status_code=200)
            africastalking_notifier.send(message, recipient)
        assert str(error.value) == f'Unexpected number of recipients: {len(recipients)}. Status: {status}'
```

View File

@@ -0,0 +1,26 @@
```python
# standard imports

# external imports
import celery

# local imports
from cic_notify.db.enum import NotificationStatusEnum, NotificationTransportEnum
from cic_notify.db.models.notification import Notification

# test imports
from tests.helpers.phone import phone_number


def test_persist_notification(celery_session_worker, init_database):
    message = 'Hello world.'
    recipient = phone_number()
    s_persist_notification = celery.signature(
        'cic_notify.tasks.sms.db.persist_notification', (message, recipient)
    )
    s_persist_notification.apply_async().get()
    notification = Notification.session.query(Notification).filter_by(recipient=recipient).first()
    assert notification.status == NotificationStatusEnum.UNKNOWN
    assert notification.recipient == recipient
    assert notification.message == message
    assert notification.transport == NotificationTransportEnum.SMS
```

View File

@@ -0,0 +1,21 @@
```python
# standard imports
import logging

# external imports
import celery

# local imports

# test imports
from tests.helpers.phone import phone_number


def test_log(caplog, celery_session_worker):
    message = 'Hello world.'
    recipient = phone_number()
    caplog.set_level(logging.INFO)
    s_log = celery.signature(
        'cic_notify.tasks.sms.log.log', [message, recipient]
    )
    s_log.apply_async().get()
    assert f'message to {recipient}: {message}' in caplog.text
```

View File

@@ -0,0 +1,24 @@
```python
# standard imports

# external imports
import celery

# local imports
from cic_notify.api import Api

# test imports
from tests.helpers.phone import phone_number


def test_api(celery_session_worker, mocker):
    mocked_group = mocker.patch('celery.group')
    message = 'Hello world.'
    recipient = phone_number()
    s_send = celery.signature('cic_notify.tasks.sms.africastalking.send', [message, recipient], queue=None)
    s_log = celery.signature('cic_notify.tasks.sms.log.log', [message, recipient], queue=None)
    s_persist_notification = celery.signature(
        'cic_notify.tasks.sms.db.persist_notification', [message, recipient], queue=None)
    signatures = [s_send, s_log, s_persist_notification]
    api = Api(queue=None)
    api.sms(message, recipient)
    mocked_group.assert_called_with(signatures)
```

View File

```diff
@@ -1,31 +1,13 @@
 # standard imports
-import sys
-import os
-import pytest
 import logging

 # third party imports
-import confini
-
-script_dir = os.path.dirname(os.path.realpath(__file__))
-root_dir = os.path.dirname(script_dir)
-sys.path.insert(0, root_dir)

 # local imports
-from cic_notify.db.models.base import SessionBase
-#from transport.notification import AfricastalkingNotification

-# fixtures
-from tests.fixtures_config import *
-from tests.fixtures_celery import *
-from tests.fixtures_database import *
-
-logg = logging.getLogger()
-
-#@pytest.fixture(scope='session')
-#def africastalking_notification(
-#        load_config,
-#        ):
-#    return AfricastalkingNotificationTransport(load_config)
-#
+# test imports
+from .fixtures.celery import *
+from .fixtures.config import *
+from .fixtures.database import *
+from .fixtures.result import *
```


@ -37,12 +37,6 @@ def celery_config():
     shutil.rmtree(rq)
-@pytest.fixture(scope='session')
-def celery_worker_parameters():
-    return {
-        # 'queues': ('cic-notify'),
-    }
 @pytest.fixture(scope='session')
 def celery_enable_logging():
     return True


@ -0,0 +1,32 @@
# standard imports
import os
import logging
# external imports
import pytest
from confini import Config
logg = logging.getLogger(__file__)
fixtures_dir = os.path.dirname(__file__)
root_directory = os.path.dirname(os.path.dirname(fixtures_dir))
@pytest.fixture(scope='session')
def alembic_config():
migrations_directory = os.path.join(root_directory, 'cic_notify', 'db', 'migrations', 'default')
file = os.path.join(migrations_directory, 'alembic.ini')
return {
'file': file,
'script_location': migrations_directory
}
@pytest.fixture(scope='session')
def load_config():
config_directory = os.path.join(root_directory, '.config/test')
config = Config(default_dir=config_directory)
config.process()
logg.debug('config loaded\n{}'.format(config))
return config
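The `load_config` fixture relies on confini flattening `[section] key = value` pairs into `SECTION_KEY` lookups, which is why later tests can call `load_config.get('SYSTEM_GUARDIANS_FILE')` for the `[system] guardians_file` entry. A hedged stand-in for that flattening, using only the stdlib `configparser` (the `flatten` helper is illustrative, not confini's API):

```python
# Illustrative sketch of the lookup shape confini's Config.process()
# produces: ini sections and keys flattened into SECTION_KEY names.
import configparser

def flatten(ini_text: str) -> dict:
    parser = configparser.ConfigParser()
    parser.read_string(ini_text)
    flat = {}
    for section in parser.sections():
        for key, value in parser.items(section):
            # [system] guardians_file -> SYSTEM_GUARDIANS_FILE
            flat[f'{section}_{key}'.upper()] = value
    return flat

settings = flatten('[system]\nguardians_file = var/lib/sys/guardians.txt\n')
```
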


@ -0,0 +1,54 @@
# standard imports
import os
# third-party imports
import pytest
import alembic
from alembic.config import Config as AlembicConfig
# local imports
from cic_notify.db import dsn_from_config
from cic_notify.db.models.base import SessionBase, create_engine
from .config import root_directory
@pytest.fixture(scope='session')
def alembic_engine(load_config):
data_source_name = dsn_from_config(load_config)
return create_engine(data_source_name)
@pytest.fixture(scope='session')
def database_engine(load_config):
if load_config.get('DATABASE_ENGINE') == 'sqlite':
try:
os.unlink(load_config.get('DATABASE_NAME'))
except FileNotFoundError:
pass
dsn = dsn_from_config(load_config)
SessionBase.connect(dsn)
return dsn
@pytest.fixture(scope='function')
def init_database(load_config, database_engine):
db_directory = os.path.join(root_directory, 'cic_notify', 'db')
migrations_directory = os.path.join(db_directory, 'migrations', load_config.get('DATABASE_ENGINE'))
if not os.path.isdir(migrations_directory):
migrations_directory = os.path.join(db_directory, 'migrations', 'default')
session = SessionBase.create_session()
alembic_config = AlembicConfig(os.path.join(migrations_directory, 'alembic.ini'))
alembic_config.set_main_option('sqlalchemy.url', database_engine)
alembic_config.set_main_option('script_location', migrations_directory)
alembic.command.downgrade(alembic_config, 'base')
alembic.command.upgrade(alembic_config, 'head')
yield session
session.commit()
session.close()


@ -0,0 +1,24 @@
# standard imports
# external imports
import pytest
# local imports
# test imports
@pytest.fixture(scope="function")
def africastalking_response():
return {
"SMSMessageData": {
"Message": "Sent to 1/1 Total Cost: KES 0.8000",
"Recipients": [{
"statusCode": 101,
"number": "+254711XXXYYY",
"status": "Success",
"cost": "KES 0.8000",
"messageId": "ATPid_SampleTxnId123"
}]
}
}


@ -1,20 +0,0 @@
# standard imports
import os
import logging
# third-party imports
import pytest
import confini
script_dir = os.path.dirname(os.path.realpath(__file__))
root_dir = os.path.dirname(script_dir)
logg = logging.getLogger(__file__)
@pytest.fixture(scope='session')
def load_config():
config_dir = os.path.join(root_dir, '.config/test')
conf = confini.Config(config_dir, 'CICTEST')
conf.process()
logg.debug('config {}'.format(conf))
return conf


@ -1,48 +0,0 @@
# standard imports
import os
# third-party imports
import pytest
import alembic
from alembic.config import Config as AlembicConfig
# local imports
from cic_notify.db import SessionBase
from cic_notify.db import dsn_from_config
@pytest.fixture(scope='session')
def database_engine(
load_config,
):
dsn = dsn_from_config(load_config)
SessionBase.connect(dsn)
return dsn
@pytest.fixture(scope='function')
def init_database(
load_config,
database_engine,
):
rootdir = os.path.dirname(os.path.dirname(__file__))
dbdir = os.path.join(rootdir, 'cic_notify', 'db')
migrationsdir = os.path.join(dbdir, 'migrations', load_config.get('DATABASE_ENGINE'))
if not os.path.isdir(migrationsdir):
migrationsdir = os.path.join(dbdir, 'migrations', 'default')
session = SessionBase.create_session()
ac = AlembicConfig(os.path.join(migrationsdir, 'alembic.ini'))
ac.set_main_option('sqlalchemy.url', database_engine)
ac.set_main_option('script_location', migrationsdir)
alembic.command.downgrade(ac, 'base')
alembic.command.upgrade(ac, 'head')
yield session
session.commit()
session.close()


@ -0,0 +1,16 @@
# standard imports
# external imports
from faker import Faker
from faker_e164.providers import E164Provider
# local imports
# test imports
fake = Faker()
fake.add_provider(E164Provider)
def phone_number() -> str:
return fake.e164('KE')


@ -1,34 +0,0 @@
# standard imports
import json
# third party imports
import pytest
import celery
# local imports
from cic_notify.tasks.sms import db
from cic_notify.tasks.sms import log
def test_log_notification(
celery_session_worker,
):
recipient = '+25412121212'
content = 'bar'
s_log = celery.signature('cic_notify.tasks.sms.log.log')
t = s_log.apply_async(args=[recipient, content])
r = t.get()
def test_db_notification(
init_database,
celery_session_worker,
):
recipient = '+25412121213'
content = 'foo'
s_db = celery.signature('cic_notify.tasks.sms.db.persist_notification')
t = s_db.apply_async(args=[recipient, content])
r = t.get()


@ -3,4 +3,5 @@ omit =
     venv/*
     scripts/*
     cic_ussd/db/migrations/*
     cic_ussd/runnable/*
+    cic_ussd/version.py


@ -14,7 +14,7 @@ class Cache:
     store: Redis = None
-    def cache_data(key: str, data: str):
+    def cache_data(key: str, data: [bytes, float, int, str]):
         """
         :param key:
         :type key:


@ -63,10 +63,7 @@ class Account(SessionBase):
     def remove_guardian(self, phone_number: str):
         set_guardians = self.guardians.split(',')
         set_guardians.remove(phone_number)
-        if len(set_guardians) > 1:
-            self.guardians = ','.join(set_guardians)
-        else:
-            self.guardians = set_guardians[0]
+        self.guardians = ','.join(set_guardians)
     def get_guardians(self) -> list:
         return self.guardians.split(',') if self.guardians else []
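The change drops a needless special case: `','.join` already yields the bare element for a one-item list and the empty string for an empty one. A plain-Python sketch of the resulting behaviour (the real `Account` is a SQLAlchemy model storing guardians in a CSV string column):

```python
# Plain-Python sketch of the simplified guardian bookkeeping above.
class Account:
    def __init__(self, guardians: str = ''):
        self.guardians = guardians

    def remove_guardian(self, phone_number: str):
        set_guardians = self.guardians.split(',')
        set_guardians.remove(phone_number)
        # join covers the one- and zero-guardian cases that the
        # removed if/else branch special-cased
        self.guardians = ','.join(set_guardians)

    def get_guardians(self) -> list:
        return self.guardians.split(',') if self.guardians else []

account = Account('+254711000000,+254722000000')
account.remove_guardian('+254722000000')
```
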


@ -7,3 +7,4 @@ from .custom import CustomMetadata
 from .person import PersonMetadata
 from .phone import PhonePointerMetadata
 from .preferences import PreferencesMetadata
+from .tokens import TokenMetadata


@ -417,7 +417,7 @@ class MenuProcessor:
         preferred_language = get_cached_preferred_language(self.account.blockchain_address)
         if not preferred_language:
             preferred_language = i18n.config.get('fallback')
-        return translation_for(self.display_key,preferred_language,token_symbol=token_symbol)
+        return translation_for(self.display_key, preferred_language, token_symbol=token_symbol)
     def exit_successful_transaction(self):
         """


@ -87,7 +87,7 @@ def is_valid_guardian_addition(state_machine_data: Tuple[str, dict, Account, Ses
     guardianship = Guardianship()
     is_system_guardian = guardianship.is_system_guardian(phone_number)
     is_initiator = phone_number == account.phone_number
-    is_existent_guardian = phone_number in account.get_guardians()
+    is_existent_guardian = phone_number in account.get_guardians() or is_system_guardian
     failure_reason = ''
     if not is_valid_account:


@ -6,3 +6,6 @@ password_pepper=QYbzKff6NhiQzY3ygl2BkiKOpER8RE/Upqs/5aZWW+I=
 [machine]
 states=states/
 transitions=transitions/
+[system]
+guardians_file = var/lib/sys/guardians.txt


@ -1,2 +1,2 @@
 [chain]
-spec = 'evm:foo:1:bar'
+spec = evm:foo:1:bar


@ -1,3 +1,10 @@
 [locale]
 fallback=sw
-path=var/lib/locale/
+path=
+file_builders=var/lib/sys/
+[schema]
+file_path = data/schema
+[languages]
+file = var/lib/sys/languages.json


@ -1,12 +1,12 @@
-cic-eth[services]~=0.12.4a13
+cic-eth[services]~=0.12.7
-Faker==8.1.2
+Faker==11.1.0
 faker-e164==0.1.0
-pytest==6.2.4
+pytest==6.2.5
-pytest-alembic==0.2.5
+pytest-alembic==0.7.0
 pytest-celery==0.0.0a1
-pytest-cov==2.10.1
+pytest-cov==3.0.0
-pytest-mock==3.3.1
+pytest-mock==3.6.1
 pytest-ordering==0.6
-pytest-redis==2.0.0
+pytest-redis==2.3.0
-requests-mock==1.8.0
+requests-mock==1.9.3
-tavern==1.14.2
+tavern==1.18.0


@ -2,10 +2,16 @@
 # external imports
 import pytest
+from cic_types.condiments import MetadataPointer
 # local imports
-from cic_ussd.account.balance import calculate_available_balance, get_balances, get_cached_available_balance
+from cic_ussd.account.balance import (calculate_available_balance,
+                                      get_balances,
+                                      get_cached_adjusted_balance,
+                                      get_cached_available_balance)
 from cic_ussd.account.chain import Chain
+from cic_ussd.account.tokens import get_cached_token_data_list
+from cic_ussd.cache import cache_data_key, get_cached_data
 from cic_ussd.error import CachedDataNotFoundError
 # test imports
@ -57,19 +63,45 @@ def test_calculate_available_balance(activated_account,
         'balance_outgoing': balance_outgoing,
         'balance_incoming': balance_incoming
     }
-    assert calculate_available_balance(balances) == available_balance
+    assert calculate_available_balance(balances, 6) == available_balance
 def test_get_cached_available_balance(activated_account,
                                       balances,
                                       cache_balances,
                                       cache_default_token_data,
-                                      load_chain_spec):
-    cached_available_balance = get_cached_available_balance(activated_account.blockchain_address)
-    available_balance = calculate_available_balance(balances[0])
+                                      load_chain_spec,
+                                      token_symbol):
+    identifier = [bytes.fromhex(activated_account.blockchain_address), token_symbol.encode('utf-8')]
+    cached_available_balance = get_cached_available_balance(6, identifier)
+    available_balance = calculate_available_balance(balances[0], 6)
     assert cached_available_balance == available_balance
     address = blockchain_address()
     with pytest.raises(CachedDataNotFoundError) as error:
-        cached_available_balance = get_cached_available_balance(address)
+        identifier = [bytes.fromhex(address), token_symbol.encode('utf-8')]
+        key = cache_data_key(identifier=identifier, salt=MetadataPointer.BALANCES)
+        cached_available_balance = get_cached_available_balance(6, identifier)
     assert cached_available_balance is None
-    assert str(error.value) == f'No cached available balance for address: {address}'
+    assert str(error.value) == f'No cached available balance at {key}'
+def test_get_cached_adjusted_balance(activated_account, cache_adjusted_balances, token_symbol):
+    identifier = bytes.fromhex(activated_account.blockchain_address)
+    balances_identifier = [identifier, token_symbol.encode('utf-8')]
+    key = cache_data_key(balances_identifier, MetadataPointer.BALANCES_ADJUSTED)
+    adjusted_balances = get_cached_data(key)
+    assert get_cached_adjusted_balance(balances_identifier) == adjusted_balances
+def test_get_account_tokens_balance(activated_account,
+                                    cache_token_data_list,
+                                    celery_session_worker,
+                                    load_chain_spec,
+                                    load_config,
+                                    mock_async_balance_api_query,
+                                    token_symbol):
+    blockchain_address = activated_account.blockchain_address
+    chain_str = Chain.spec.__str__()
+    get_balances(blockchain_address, chain_str, token_symbol, asynchronous=True)
+    assert mock_async_balance_api_query.get('address') == blockchain_address
+    assert mock_async_balance_api_query.get('token_symbol') == token_symbol


@ -0,0 +1,21 @@
# standard imports
import os
# external imports
# local imports
from cic_ussd.account.guardianship import Guardianship
# test imports
from tests.fixtures.config import root_directory
def test_guardianship(load_config, setup_guardianship):
guardians_file = os.path.join(root_directory, load_config.get('SYSTEM_GUARDIANS_FILE'))
with open(guardians_file, 'r') as system_guardians:
guardians = [line.strip() for line in system_guardians]
assert Guardianship.guardians == guardians
guardianship = Guardianship()
assert guardianship.is_system_guardian(Guardianship.guardians[0]) is True
assert guardianship.is_system_guardian('+254712345678') is False
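The test reads the guardians file itself and compares against the class attribute, which pins down the behaviour: system guardians are loaded once from a newline-separated file into state shared by all instances. A minimal stand-in consistent with that (the `load_system_guardians` loader name is an assumption, not the real API):

```python
# Minimal stand-in for the Guardianship class the test exercises.
import tempfile

class Guardianship:
    # class attribute: shared by every instance
    guardians: list = []

    @classmethod
    def load_system_guardians(cls, guardians_file: str):
        # one phone number per line, as in var/lib/sys/guardians.txt
        with open(guardians_file, 'r') as f:
            cls.guardians = [line.strip() for line in f if line.strip()]

    def is_system_guardian(self, phone_number: str) -> bool:
        return phone_number in self.guardians

with tempfile.NamedTemporaryFile('w', suffix='.txt', delete=False) as f:
    f.write('+254700111222\n+254700333444\n')
    path = f.name

Guardianship.load_system_guardians(path)
guardianship = Guardianship()
```
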


@ -11,8 +11,7 @@ from cic_ussd.account.statement import (filter_statement_transactions,
                                         generate,
                                         get_cached_statement,
                                         parse_statement_transactions,
-                                        query_statement,
-                                        statement_transaction_set)
+                                        query_statement)
 from cic_ussd.account.transaction import transaction_actors
 from cic_ussd.cache import cache_data_key, get_cached_data
@ -74,12 +73,3 @@ def test_query_statement(blockchain_address, limit, load_chain_spec, activated_a
     query_statement(blockchain_address, limit)
     assert mock_transaction_list_query.get('address') == blockchain_address
     assert mock_transaction_list_query.get('limit') == limit
-def test_statement_transaction_set(cache_default_token_data, load_chain_spec, preferences, set_locale_files, statement):
-    parsed_transactions = parse_statement_transactions(statement)
-    preferred_language = preferences.get('preferred_language')
-    transaction_set = statement_transaction_set(preferred_language, parsed_transactions)
-    transaction_set.startswith('Sent')
-    transaction_set = statement_transaction_set(preferred_language, [])
-    transaction_set.startswith('No')


@ -1,17 +1,80 @@
 # standard imports
+import hashlib
 import json
 # external imports
 import pytest
+from cic_types.condiments import MetadataPointer
 # local imports
 from cic_ussd.account.chain import Chain
-from cic_ussd.account.tokens import get_cached_default_token, get_default_token_symbol, query_default_token
+from cic_ussd.account.tokens import (collate_token_metadata,
+                                     create_account_tokens_list,
+                                     get_active_token_symbol,
+                                     get_default_token_symbol,
+                                     get_cached_default_token,
+                                     get_cached_token_data,
+                                     get_cached_token_data_list,
+                                     get_cached_token_symbol_list,
+                                     hashed_token_proof,
+                                     handle_token_symbol_list,
+                                     order_account_tokens_list,
+                                     parse_token_list,
+                                     process_token_data,
+                                     query_default_token,
+                                     query_token_data,
+                                     remove_from_account_tokens_list,
+                                     set_active_token)
+from cic_ussd.cache import cache_data, cache_data_key, get_cached_data
+from cic_ussd.error import CachedDataNotFoundError
 # test imports
def test_collate_token_metadata(token_meta_symbol, token_proof_symbol):
description = token_proof_symbol.get('description')
issuer = token_proof_symbol.get('issuer')
location = token_meta_symbol.get('location')
contact = token_meta_symbol.get('contact')
data = {
'description': description,
'issuer': issuer,
'location': location,
'contact': contact
}
assert collate_token_metadata(token_proof_symbol, token_meta_symbol) == data
def test_create_account_tokens_list(activated_account,
cache_balances,
cache_token_data,
cache_token_symbol_list,
init_cache):
create_account_tokens_list(activated_account.blockchain_address)
key = cache_data_key(bytes.fromhex(activated_account.blockchain_address), MetadataPointer.TOKEN_DATA_LIST)
cached_data_list = json.loads(get_cached_data(key))
data = get_cached_token_data_list(activated_account.blockchain_address)
assert cached_data_list == data
def test_get_active_token_symbol(activated_account, set_active_token, valid_recipient):
identifier = bytes.fromhex(activated_account.blockchain_address)
key = cache_data_key(identifier=identifier, salt=MetadataPointer.TOKEN_ACTIVE)
active_token_symbol = get_cached_data(key)
assert active_token_symbol == get_active_token_symbol(activated_account.blockchain_address)
with pytest.raises(CachedDataNotFoundError) as error:
get_active_token_symbol(valid_recipient.blockchain_address)
assert str(error.value) == 'No active token set.'
def test_get_cached_token_data(activated_account, cache_token_data, token_symbol):
identifier = [bytes.fromhex(activated_account.blockchain_address), token_symbol.encode('utf-8')]
key = cache_data_key(identifier, MetadataPointer.TOKEN_DATA)
token_data = json.loads(get_cached_data(key))
assert token_data == get_cached_token_data(activated_account.blockchain_address, token_symbol)
 def test_get_cached_default_token(cache_default_token_data, default_token_data, load_chain_spec):
     chain_str = Chain.spec.__str__()
     cached_default_token = get_cached_default_token(chain_str)
@ -27,6 +90,84 @@ def test_get_default_token_symbol_from_api(default_token_data, load_chain_spec,
     assert default_token_symbol == default_token_data['symbol']
def test_get_cached_token_data_list(activated_account, cache_token_data_list):
blockchain_address = activated_account.blockchain_address
key = cache_data_key(identifier=bytes.fromhex(blockchain_address), salt=MetadataPointer.TOKEN_DATA_LIST)
token_symbols_list = json.loads(get_cached_data(key))
assert token_symbols_list == get_cached_token_data_list(blockchain_address)
def test_get_cached_token_symbol_list(activated_account, cache_token_symbol_list):
blockchain_address = activated_account.blockchain_address
key = cache_data_key(identifier=bytes.fromhex(blockchain_address), salt=MetadataPointer.TOKEN_SYMBOLS_LIST)
token_symbols_list = json.loads(get_cached_data(key))
assert token_symbols_list == get_cached_token_symbol_list(blockchain_address)
def test_hashed_token_proof(token_proof_symbol):
hash_object = hashlib.new("sha256")
token_proof = json.dumps(token_proof_symbol)
hash_object.update(token_proof.encode('utf-8'))
assert hash_object.digest().hex() == hashed_token_proof(token_proof_symbol)
def test_handle_token_symbol_list(activated_account, init_cache):
handle_token_symbol_list(activated_account.blockchain_address, 'GFT')
cached_token_symbol_list = get_cached_token_symbol_list(activated_account.blockchain_address)
assert len(cached_token_symbol_list) == 1
handle_token_symbol_list(activated_account.blockchain_address, 'DET')
cached_token_symbol_list = get_cached_token_symbol_list(activated_account.blockchain_address)
assert len(cached_token_symbol_list) == 2
def test_order_account_tokens_list(activated_account, token_list_entries):
identifier = bytes.fromhex(activated_account.blockchain_address)
last_sent_token_key = cache_data_key(identifier=identifier, salt=MetadataPointer.TOKEN_LAST_SENT)
cache_data(last_sent_token_key, 'FII')
last_received_token_key = cache_data_key(identifier=identifier, salt=MetadataPointer.TOKEN_LAST_RECEIVED)
cache_data(last_received_token_key, 'DET')
ordered_list = order_account_tokens_list(token_list_entries, identifier)
assert ordered_list == [
{
'name': 'Fee',
'symbol': 'FII',
'issuer': 'Foo',
'contact': {
'phone': '+254712345678'
},
'location': 'Fum',
'balance': 50.0
},
{
'name': 'Demurrage Token',
'symbol': 'DET',
'issuer': 'Grassroots Economics',
'contact': {
'phone': '+254700000000',
'email': 'info@grassrootseconomics.org'},
'location': 'Fum',
'balance': 49.99
},
{
'name': 'Giftable Token',
'symbol': 'GFT',
'issuer': 'Grassroots Economics',
'contact': {
'phone': '+254700000000',
'email': 'info@grassrootseconomics.org'},
'location': 'Fum',
'balance': 60.0
}
]
def test_parse_token_list(token_list_entries):
parsed_token_list = ['1. FII 50.0', '2. GFT 60.0', '3. DET 49.99']
assert parsed_token_list == parse_token_list(token_list_entries)
 def test_query_default_token(default_token_data, load_chain_spec, mock_sync_default_token_api_query):
     chain_str = Chain.spec.__str__()
     queried_default_token_data = query_default_token(chain_str)
@ -40,3 +181,38 @@ def test_get_default_token_symbol_from_cache(cache_default_token_data, default_t
     default_token_symbol = get_default_token_symbol()
     assert default_token_symbol is not None
     assert default_token_symbol == default_token_data.get('symbol')
def test_remove_from_account_tokens_list(token_list_entries):
assert remove_from_account_tokens_list(token_list_entries, 'GFT') == ([{
'name': 'Giftable Token',
'symbol': 'GFT',
'issuer': 'Grassroots Economics',
'contact': {
'phone': '+254700000000',
'email': 'info@grassrootseconomics.org'
},
'location': 'Fum',
'balance': 60.0
}],
[
{
'name': 'Fee',
'symbol': 'FII',
'issuer': 'Foo',
'contact': {'phone': '+254712345678'},
'location': 'Fum',
'balance': 50.0
},
{
'name': 'Demurrage Token',
'symbol': 'DET',
'issuer': 'Grassroots Economics',
'contact': {
'phone': '+254700000000',
'email': 'info@grassrootseconomics.org'
},
'location': 'Fum',
'balance': 49.99
}
])
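The `test_hashed_token_proof` assertion above pins down the hashing recipe: SHA-256 over the UTF-8 encoded JSON serialisation of the proof. A self-contained sketch of a function satisfying that test (the real implementation may differ in details such as key ordering):

```python
# Sketch of a hashed_token_proof implementation consistent with the
# test above: sha256 over the JSON dump of the proof, hex-encoded.
import hashlib
import json

def hashed_token_proof(token_proof: dict) -> str:
    hash_object = hashlib.new('sha256')
    hash_object.update(json.dumps(token_proof).encode('utf-8'))
    return hash_object.digest().hex()

# sample proof payload for illustration only
proof = {'description': 'Giftable reserve token', 'issuer': 'Grassroots Economics'}
digest = hashed_token_proof(proof)
```
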


@ -1,5 +1,4 @@
 # standard imports
-from decimal import Decimal
 # external imports
 import pytest
@ -37,11 +36,11 @@ def test_aux_transaction_data(preferences, set_locale_files, transactions_list):
 @pytest.mark.parametrize("value, expected_result", [
-    (50000000, Decimal('50.00')),
-    (100000, Decimal('0.10'))
+    (50000000, 50.0),
+    (100000, 0.1)
 ])
 def test_from_wei(cache_default_token_data, expected_result, value):
-    assert from_wei(value) == expected_result
+    assert from_wei(6, value) == expected_result
 @pytest.mark.parametrize("value, expected_result", [
@ -49,7 +48,7 @@ def test_from_wei(cache_default_token_data, expected_result, value):
     (0.10, 100000)
 ])
 def test_to_wei(cache_default_token_data, expected_result, value):
-    assert to_wei(value) == expected_result
+    assert to_wei(6, value) == expected_result
 @pytest.mark.parametrize("decimals, value, expected_result", [
@ -108,8 +107,8 @@ def test_outgoing_transaction_processor(activated_account,
         activated_account.blockchain_address,
         valid_recipient.blockchain_address)
-    outgoing_tx_processor.transfer(amount, token_symbol)
+    outgoing_tx_processor.transfer(amount, 6, token_symbol)
     assert mock_transfer_api.get('from_address') == activated_account.blockchain_address
     assert mock_transfer_api.get('to_address') == valid_recipient.blockchain_address
-    assert mock_transfer_api.get('value') == to_wei(amount)
+    assert mock_transfer_api.get('value') == to_wei(6, amount)
     assert mock_transfer_api.get('token_symbol') == token_symbol
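The parametrised cases pin down the conversion the new `decimals` argument implies: `from_wei(6, 50000000) == 50.0` and `to_wei(6, 0.10) == 100000`. A minimal sketch consistent with those cases (the real functions may handle rounding differently):

```python
def from_wei(decimals: int, value: int) -> float:
    # scale an integer amount in the smallest token unit down to a
    # display amount, using the token's own decimals
    return float(value / (10 ** decimals))

def to_wei(decimals: int, value: float) -> int:
    # scale a display amount up to the smallest token unit
    return int(value * (10 ** decimals))
```

Threading `decimals` through instead of a hard-coded scale is what lets the multi-token changes in this branch support tokens with differing precision.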


@ -90,7 +90,7 @@ def test_standard_metadata_id(activated_account, cache_person_metadata, pending_
 def test_account_create(init_cache, init_database, load_chain_spec, mock_account_creation_task_result, task_uuid):
     chain_str = Chain.spec.__str__()
-    create(chain_str, phone_number(), init_database)
+    create(chain_str, phone_number(), init_database, 'en')
     assert len(init_database.query(TaskTracker).all()) == 1
     account_creation_data = get_cached_data(task_uuid)
     assert json.loads(account_creation_data).get('status') == AccountStatus.PENDING.name


@ -23,7 +23,7 @@ def test_ussd_metadata_handler(activated_account,
                                setup_metadata_signer):
     identifier = bytes.fromhex(strip_0x(activated_account.blockchain_address))
     cic_type = MetadataPointer.PERSON
-    metadata_client = UssdMetadataHandler(cic_type, identifier)
+    metadata_client = UssdMetadataHandler(cic_type=cic_type, identifier=identifier)
     assert metadata_client.cic_type == cic_type
     assert metadata_client.engine == 'pgp'
     assert metadata_client.identifier == identifier


@ -0,0 +1,72 @@
# standard imports
import json
# external imports
import pytest
import requests_mock
from cic_types.condiments import MetadataPointer
from requests.exceptions import HTTPError
# local imports
from cic_ussd.cache import cache_data_key, get_cached_data
from cic_ussd.metadata import TokenMetadata
from cic_ussd.metadata.tokens import token_metadata_handler, query_token_metadata, query_token_info
# test imports
def test_token_metadata_handler(activated_account,
init_cache,
setup_metadata_request_handler,
setup_metadata_signer,
token_meta_symbol,
token_symbol):
with requests_mock.Mocker(real_http=False) as request_mocker:
with pytest.raises(HTTPError) as error:
metadata_client = TokenMetadata(identifier=b'foo', cic_type=MetadataPointer.TOKEN_META_SYMBOL)
reason = 'Not Found'
status_code = 401
request_mocker.register_uri('GET', metadata_client.url, status_code=status_code, reason=reason)
token_metadata_handler(metadata_client)
assert str(error.value) == f'Client Error: {status_code}, reason: {reason}'
identifier = token_symbol.encode('utf-8')
metadata_client = TokenMetadata(identifier, cic_type=MetadataPointer.TOKEN_META_SYMBOL)
request_mocker.register_uri('GET', metadata_client.url, json=token_meta_symbol, status_code=200, reason='OK')
token_metadata_handler(metadata_client)
key = cache_data_key(identifier, MetadataPointer.TOKEN_META_SYMBOL)
cached_token_meta_symbol = get_cached_data(key)
assert json.loads(cached_token_meta_symbol) == token_meta_symbol
def test_query_token_metadata(init_cache,
setup_metadata_request_handler,
setup_metadata_signer,
token_meta_symbol,
token_proof_symbol,
token_symbol):
with requests_mock.Mocker(real_http=False) as request_mocker:
identifier = token_symbol.encode('utf-8')
metadata_client = TokenMetadata(identifier, cic_type=MetadataPointer.TOKEN_META_SYMBOL)
request_mocker.register_uri('GET', metadata_client.url, json=token_meta_symbol, status_code=200, reason='OK')
query_token_metadata(identifier)
key = cache_data_key(identifier, MetadataPointer.TOKEN_META_SYMBOL)
cached_token_meta_symbol = get_cached_data(key)
assert json.loads(cached_token_meta_symbol) == token_meta_symbol
def test_query_token_info(init_cache,
setup_metadata_request_handler,
setup_metadata_signer,
token_meta_symbol,
token_proof_symbol,
token_symbol):
with requests_mock.Mocker(real_http=False) as request_mocker:
identifier = token_symbol.encode('utf-8')
metadata_client = TokenMetadata(identifier, cic_type=MetadataPointer.TOKEN_PROOF_SYMBOL)
request_mocker.register_uri('GET', metadata_client.url, json=token_proof_symbol, status_code=200, reason='OK')
query_token_info(identifier)
key = cache_data_key(identifier, MetadataPointer.TOKEN_PROOF_SYMBOL)
cached_token_proof_symbol = get_cached_data(key)
assert json.loads(cached_token_proof_symbol) == token_proof_symbol


@ -1,6 +1,6 @@
 # standard imports
 import json
-import datetime
+import os
 # external imports
 from cic_types.condiments import MetadataPointer
@ -10,195 +10,464 @@ from cic_ussd.account.balance import get_cached_available_balance
 from cic_ussd.account.metadata import get_cached_preferred_language
 from cic_ussd.account.statement import (
     get_cached_statement,
-    parse_statement_transactions,
-    statement_transaction_set
+    parse_statement_transactions
 )
-from cic_ussd.account.tokens import get_default_token_symbol
+from cic_ussd.account.tokens import (get_active_token_symbol,
+                                     get_cached_token_data)
 from cic_ussd.account.transaction import from_wei, to_wei
-from cic_ussd.cache import cache_data, cache_data_key
-from cic_ussd.menu.ussd_menu import UssdMenu
+from cic_ussd.cache import cache_data, cache_data_key, get_cached_data
 from cic_ussd.metadata import PersonMetadata
 from cic_ussd.phone_number import Support
-from cic_ussd.processor.menu import response
-from cic_ussd.processor.util import parse_person_metadata
+from cic_ussd.processor.menu import response, MenuProcessor
+from cic_ussd.processor.util import parse_person_metadata, ussd_menu_list
 from cic_ussd.translation import translation_for
 # test imports
-def test_menu_processor(activated_account,
-                        balances,
-                        cache_balances,
-                        cache_default_token_data,
-                        cache_preferences,
-                        cache_person_metadata,
-                        cache_statement,
-                        celery_session_worker,
-                        generic_ussd_session,
-                        init_database,
-                        load_chain_spec,
-                        load_support_phone,
-                        load_ussd_menu,
-                        mock_get_adjusted_balance,
-                        mock_sync_balance_api_query,
-                        mock_transaction_list_query,
-                        valid_recipient):
-    preferred_language = get_cached_preferred_language(activated_account.blockchain_address)
-    available_balance = get_cached_available_balance(activated_account.blockchain_address)
-    token_symbol = get_default_token_symbol()
-    with_available_balance = 'ussd.account_balances.available_balance'
-    with_fees = 'ussd.account_balances.with_fees'
-    ussd_menu = UssdMenu.find_by_name('account_balances')
-    name = ussd_menu.get('name')
-    resp = response(activated_account, 'ussd.account_balances', name, init_database, generic_ussd_session)
-    assert resp == translation_for(with_available_balance,
-                                   preferred_language,
-                                   available_balance=available_balance,
-                                   token_symbol=token_symbol)
-    identifier = bytes.fromhex(activated_account.blockchain_address)
-    key = cache_data_key(identifier, MetadataPointer.BALANCES_ADJUSTED)
-    adjusted_balance = 45931650.64654012
-    cache_data(key, json.dumps(adjusted_balance))
-    resp = response(activated_account, 'ussd.account_balances', name, init_database, generic_ussd_session)
-    tax_wei = to_wei(int(available_balance)) - int(adjusted_balance)
-    tax = from_wei(int(tax_wei))
-    assert resp == translation_for(key=with_fees,
-                                   preferred_language=preferred_language,
-                                   available_balance=available_balance,
-                                   tax=tax,
-                                   token_symbol=token_symbol)
+def test_account_balance(activated_account, cache_balances, cache_preferences, cache_token_data,
+                         generic_ussd_session, init_database, set_active_token):
+    """blockchain_address = activated_account.blockchain_address
+    token_symbol = get_active_token_symbol(blockchain_address)
+    token_data = get_cached_token_data(blockchain_address, token_symbol)
+    preferred_language = get_cached_preferred_language(blockchain_address)
+    decimals = token_data.get("decimals")
+    identifier = bytes.fromhex(blockchain_address)
+    balances_identifier = [identifier, token_symbol.encode('utf-8')]
+    available_balance = get_cached_available_balance(decimals, balances_identifier)
+    with_available_balance = 'ussd.account_balances.available_balance'
+    resp = response(activated_account, with_available_balance, with_available_balance[5:], init_database,
+                    generic_ussd_session)
+    assert resp == translation_for(with_available_balance,
+                                   preferred_language,
+                                   available_balance=available_balance,
+                                   token_symbol=token_symbol)
+    with_fees = 'ussd.account_balances.with_fees'
+    key = cache_data_key(balances_identifier, MetadataPointer.BALANCES_ADJUSTED)
+    adjusted_balance = 45931650.64654012
+    cache_data(key, json.dumps(adjusted_balance))
+    resp = response(activated_account, with_fees, with_fees[5:], init_database, generic_ussd_session)
+    tax_wei = to_wei(decimals, int(available_balance)) - int(adjusted_balance)
+    tax = from_wei(decimals, int(tax_wei))
+    assert resp == translation_for(key=with_fees,
+                                   preferred_language=preferred_language,
+                                   available_balance=available_balance,
+                                   tax=tax,
+                                   token_symbol=token_symbol)"""
+    pass
def test_account_statement(activated_account,
                           cache_preferences,
                           cache_statement,
                           generic_ussd_session,
                           init_database,
                           set_active_token,
                           set_locale_files):
    blockchain_address = activated_account.blockchain_address
    preferred_language = get_cached_preferred_language(blockchain_address)
    cached_statement = get_cached_statement(blockchain_address)
    statement_list = parse_statement_transactions(statement=json.loads(cached_statement))
    first_transaction_set = 'ussd.first_transaction_set'
    middle_transaction_set = 'ussd.middle_transaction_set'
    last_transaction_set = 'ussd.last_transaction_set'
    fallback = translation_for('helpers.no_transaction_history', preferred_language)
    transaction_sets = ussd_menu_list(fallback=fallback, menu_list=statement_list, split=3)
    resp = response(activated_account, first_transaction_set, first_transaction_set[5:], init_database,
                    generic_ussd_session)
    assert resp == translation_for(first_transaction_set, preferred_language, first_transaction_set=transaction_sets[0])
    resp = response(activated_account, middle_transaction_set, middle_transaction_set[5:], init_database,
                    generic_ussd_session)
    assert resp == translation_for(middle_transaction_set, preferred_language,
                                   middle_transaction_set=transaction_sets[1])
    resp = response(activated_account, last_transaction_set, last_transaction_set[5:], init_database,
                    generic_ussd_session)
    assert resp == translation_for(last_transaction_set, preferred_language, last_transaction_set=transaction_sets[2])

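The statement test above relies on `ussd_menu_list` splitting a flat list of rendered transactions into screens of three (`split=3`), with a fallback notice when the list is empty. A minimal sketch of that pagination idea, under the assumption that each screen is the newline-joined chunk (the name `paginate_menu` is hypothetical, not the cic_ussd API):

```python
# Hypothetical sketch of USSD menu pagination: split a flat list of menu
# entries into screens of `split` items, render each screen as
# newline-joined text, and fall back to a notice when the list is empty.
def paginate_menu(menu_list, fallback, split=3):
    if not menu_list:
        return [fallback]
    chunks = [menu_list[i:i + split] for i in range(0, len(menu_list), split)]
    return ['\n'.join(chunk) for chunk in chunks]
```

With four entries and `split=3`, this yields two screens, matching the first/middle/last transaction-set indexing used in the test.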
def test_add_guardian_pin_authorization(activated_account,
                                        cache_preferences,
                                        guardian_account,
                                        generic_ussd_session,
                                        init_database):
    blockchain_address = activated_account.blockchain_address
    preferred_language = get_cached_preferred_language(blockchain_address)
    add_guardian_pin_authorization = 'ussd.add_guardian_pin_authorization'
    activated_account.add_guardian(guardian_account.phone_number)
    init_database.flush()
    generic_ussd_session['external_session_id'] = os.urandom(20).hex()
    generic_ussd_session['msisdn'] = guardian_account.phone_number
    generic_ussd_session['data'] = {'guardian_phone_number': guardian_account.phone_number}
    generic_ussd_session['state'] = 'add_guardian_pin_authorization'
    resp = response(activated_account,
                    add_guardian_pin_authorization,
                    add_guardian_pin_authorization[5:],
                    init_database,
                    generic_ussd_session)
    assert resp == translation_for(f'{add_guardian_pin_authorization}.first', preferred_language,
                                   guardian_information=guardian_account.standard_metadata_id())

def test_guardian_list(activated_account,
                       cache_preferences,
                       generic_ussd_session,
                       guardian_account,
                       init_database):
    blockchain_address = activated_account.blockchain_address
    preferred_language = get_cached_preferred_language(blockchain_address)
    guardians_list = 'ussd.guardian_list'
    guardians_list_header = translation_for('helpers.guardians_list_header', preferred_language)
    guardian_information = guardian_account.standard_metadata_id()
    guardians = guardians_list_header + '\n' + f'{guardian_information}\n'
    activated_account.add_guardian(guardian_account.phone_number)
    init_database.flush()
    resp = response(activated_account, guardians_list, guardians_list[5:], init_database, generic_ussd_session)
    assert resp == translation_for(guardians_list, preferred_language, guardians_list=guardians)
    guardians = translation_for('helpers.no_guardians_list', preferred_language)
    identifier = bytes.fromhex(guardian_account.blockchain_address)
    key = cache_data_key(identifier, MetadataPointer.PREFERENCES)
    cache_data(key, json.dumps({'preferred_language': preferred_language}))
    resp = response(guardian_account, guardians_list, guardians_list[5:], init_database, generic_ussd_session)
    assert resp == translation_for(guardians_list, preferred_language, guardians_list=guardians)

def test_account_tokens(activated_account, cache_token_data_list, celery_session_worker, generic_ussd_session,
                        init_cache, init_database):
    """blockchain_address = activated_account.blockchain_address
    preferred_language = get_cached_preferred_language(blockchain_address)
    cached_token_data_list = get_cached_token_data_list(blockchain_address)
    token_data_list = ['1. GFT 50.0']
    fallback = translation_for('helpers.no_tokens_list', preferred_language)
    token_list_sets = ussd_menu_list(fallback=fallback, menu_list=token_data_list, split=3)
    first_account_tokens_set = 'ussd.first_account_tokens_set'
    middle_account_tokens_set = 'ussd.middle_account_tokens_set'
    last_account_tokens_set = 'ussd.last_account_tokens_set'
    resp = response(activated_account, first_account_tokens_set, first_account_tokens_set[5:], init_database,
                    generic_ussd_session)
    assert resp == translation_for(first_account_tokens_set, preferred_language,
                                   first_account_tokens_set=token_list_sets[0])
    assert generic_ussd_session.get('data').get('account_tokens_list') == cached_token_data_list
    resp = response(activated_account, middle_account_tokens_set, middle_account_tokens_set[5:], init_database,
                    generic_ussd_session)
    assert resp == translation_for(middle_account_tokens_set, preferred_language,
                                   middle_account_tokens_set=token_list_sets[1])
    resp = response(activated_account, last_account_tokens_set, last_account_tokens_set[5:], init_database,
                    generic_ussd_session)
    assert resp == translation_for(last_account_tokens_set, preferred_language,
                                   last_account_tokens_set=token_list_sets[2])"""
    pass

def test_help(activated_account, cache_preferences, generic_ussd_session, init_database):
blockchain_address = activated_account.blockchain_address
preferred_language = get_cached_preferred_language(blockchain_address)
help = 'ussd.help'
resp = response(activated_account, help, help[5:], init_database, generic_ussd_session)
assert resp == translation_for(help, preferred_language, support_phone=Support.phone_number)
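Throughout these tests the menu name passed to `response` is derived by slicing the translation key with `[5:]`, i.e. stripping the `'ussd.'` namespace prefix. A small illustrative helper making that convention explicit (`menu_name` is a hypothetical name, not part of cic_ussd):

```python
# Illustration of the key-slicing convention used in the tests: translation
# keys are namespaced under 'ussd.', and the USSD menu name is the remainder.
def menu_name(display_key: str) -> str:
    prefix = 'ussd.'
    assert display_key.startswith(prefix)
    return display_key[len(prefix):]
```

So `menu_name('ussd.help')` gives `'help'`, exactly what `help[5:]` produces in `test_help` above.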
def test_person_data(activated_account, cache_person_metadata, cache_preferences, cached_ussd_session,
                     generic_ussd_session, init_database):
    blockchain_address = activated_account.blockchain_address
    preferred_language = get_cached_preferred_language(blockchain_address)
    identifier = bytes.fromhex(blockchain_address)
    display_user_metadata = 'ussd.display_user_metadata'
    person_metadata = PersonMetadata(identifier)
    cached_person_metadata = person_metadata.get_cached_metadata()
    resp = response(activated_account, display_user_metadata, display_user_metadata[5:], init_database,
                    generic_ussd_session)
    assert resp == parse_person_metadata(cached_person_metadata, display_user_metadata, preferred_language)

def test_guarded_account_metadata(activated_account, generic_ussd_session, init_database):
    reset_guarded_pin_authorization = 'ussd.reset_guarded_pin_authorization'
    generic_ussd_session['data'] = {'guarded_account_phone_number': activated_account.phone_number}
    menu_processor = MenuProcessor(activated_account, reset_guarded_pin_authorization,
                                   reset_guarded_pin_authorization[5:], init_database, generic_ussd_session)
    assert menu_processor.guarded_account_metadata() == activated_account.standard_metadata_id()

def test_guardian_metadata(activated_account, generic_ussd_session, guardian_account, init_database):
    add_guardian_pin_authorization = 'ussd.add_guardian_pin_authorization'
    generic_ussd_session['data'] = {'guardian_phone_number': guardian_account.phone_number}
    menu_processor = MenuProcessor(activated_account, add_guardian_pin_authorization,
                                   add_guardian_pin_authorization[5:], init_database, generic_ussd_session)
    assert menu_processor.guardian_metadata() == guardian_account.standard_metadata_id()

def test_language(activated_account, cache_preferences, generic_ussd_session, init_database, load_languages):
blockchain_address = activated_account.blockchain_address
preferred_language = get_cached_preferred_language(blockchain_address)
initial_language_selection = 'ussd.initial_language_selection'
select_preferred_language = 'ussd.select_preferred_language'
initial_middle_language_set = 'ussd.initial_middle_language_set'
middle_language_set = 'ussd.middle_language_set'
initial_last_language_set = 'ussd.initial_last_language_set'
last_language_set = 'ussd.last_language_set'
key = cache_data_key('system:languages'.encode('utf-8'), MetadataPointer.NONE)
cached_system_languages = get_cached_data(key)
language_list: list = json.loads(cached_system_languages)
fallback = translation_for('helpers.no_language_list', preferred_language)
language_list_sets = ussd_menu_list(fallback=fallback, menu_list=language_list, split=3)
resp = response(activated_account, initial_language_selection, initial_language_selection[5:], init_database,
generic_ussd_session)
assert resp == translation_for(initial_language_selection, preferred_language,
first_language_set=language_list_sets[0])
resp = response(activated_account, select_preferred_language, select_preferred_language[5:], init_database,
generic_ussd_session)
assert resp == translation_for(select_preferred_language, preferred_language,
first_language_set=language_list_sets[0])
resp = response(activated_account, initial_middle_language_set, initial_middle_language_set[5:], init_database,
generic_ussd_session)
assert resp == translation_for(initial_middle_language_set, preferred_language,
middle_language_set=language_list_sets[1])
resp = response(activated_account, initial_last_language_set, initial_last_language_set[5:], init_database,
generic_ussd_session)
assert resp == translation_for(initial_last_language_set, preferred_language,
last_language_set=language_list_sets[2])
resp = response(activated_account, middle_language_set, middle_language_set[5:], init_database,
generic_ussd_session)
assert resp == translation_for(middle_language_set, preferred_language, middle_language_set=language_list_sets[1])
resp = response(activated_account, last_language_set, last_language_set[5:], init_database, generic_ussd_session)
assert resp == translation_for(last_language_set, preferred_language, last_language_set=language_list_sets[2])
def test_account_creation_prompt(activated_account, cache_preferences, generic_ussd_session, init_database,
load_languages):
blockchain_address = activated_account.blockchain_address
preferred_language = get_cached_preferred_language(blockchain_address)
user_input = ''
if preferred_language == 'en':
user_input = '1'
elif preferred_language == 'sw':
user_input = '2'
account_creation_prompt = 'ussd.account_creation_prompt'
generic_ussd_session['user_input'] = user_input
resp = response(activated_account, account_creation_prompt, account_creation_prompt[5:], init_database,
generic_ussd_session)
assert resp == translation_for(account_creation_prompt, preferred_language)
def test_reset_guarded_pin_authorization(activated_account, cache_preferences, generic_ussd_session, guardian_account,
init_database):
blockchain_address = activated_account.blockchain_address
preferred_language = get_cached_preferred_language(blockchain_address)
reset_guarded_pin_authorization = 'ussd.reset_guarded_pin_authorization'
generic_ussd_session['external_session_id'] = os.urandom(20).hex()
generic_ussd_session['msisdn'] = guardian_account.phone_number
generic_ussd_session['data'] = {'guarded_account_phone_number': activated_account.phone_number}
resp = response(activated_account,
reset_guarded_pin_authorization,
reset_guarded_pin_authorization[5:],
init_database,
generic_ussd_session)
assert resp == translation_for(f'{reset_guarded_pin_authorization}.first', preferred_language,
guarded_account_information=activated_account.phone_number)
def test_start(activated_account, cache_balances, cache_preferences, cache_token_data, cache_token_data_list,
cache_token_symbol_list, celery_session_worker, generic_ussd_session, init_database, load_chain_spec,
mock_sync_balance_api_query, set_active_token):
blockchain_address = activated_account.blockchain_address
preferred_language = get_cached_preferred_language(blockchain_address)
token_symbol = get_active_token_symbol(blockchain_address)
token_data = get_cached_token_data(blockchain_address, token_symbol)
decimals = token_data.get("decimals")
identifier = bytes.fromhex(blockchain_address)
balances_identifier = [identifier, token_symbol.encode('utf-8')]
available_balance = get_cached_available_balance(decimals, balances_identifier)
start = 'ussd.start'
resp = response(activated_account, start, start[5:], init_database, generic_ussd_session)
    assert resp == translation_for(start,
                                   preferred_language,
                                   account_balance=available_balance,
                                   account_token_name=token_symbol)

def test_token_selection_pin_authorization(activated_account, cache_preferences, cache_token_data, generic_ussd_session,
                                           init_database, set_active_token):
    blockchain_address = activated_account.blockchain_address
token_symbol = get_active_token_symbol(blockchain_address)
token_data = get_cached_token_data(blockchain_address, token_symbol)
preferred_language = get_cached_preferred_language(blockchain_address)
token_selection_pin_authorization = 'ussd.token_selection_pin_authorization'
generic_ussd_session['data'] = {'selected_token': token_data}
resp = response(activated_account,
token_selection_pin_authorization,
token_selection_pin_authorization[5:],
init_database,
generic_ussd_session)
token_name = token_data.get('name')
token_symbol = token_data.get('symbol')
token_issuer = token_data.get('issuer')
token_contact = token_data.get('contact')
token_location = token_data.get('location')
data = f'{token_name} ({token_symbol})\n{token_issuer}\n{token_contact}\n{token_location}\n'
assert resp == translation_for(f'{token_selection_pin_authorization}.first', preferred_language,
token_data=data)
def test_transaction_pin_authorization(activated_account, cache_preferences, cache_token_data, generic_ussd_session,
init_database, set_active_token, valid_recipient):
blockchain_address = activated_account.blockchain_address
token_symbol = get_active_token_symbol(blockchain_address)
token_data = get_cached_token_data(blockchain_address, token_symbol)
preferred_language = get_cached_preferred_language(blockchain_address)
decimals = token_data.get("decimals")
transaction_pin_authorization = 'ussd.transaction_pin_authorization'
    generic_ussd_session['data'] = {
        'recipient_phone_number': valid_recipient.phone_number,
        'transaction_amount': '15'
    }
    resp = response(activated_account, transaction_pin_authorization, transaction_pin_authorization[5:], init_database,
                    generic_ussd_session)
    user_input = generic_ussd_session.get('data').get('transaction_amount')
    transaction_amount = to_wei(decimals, int(user_input))
    tx_recipient_information = valid_recipient.standard_metadata_id()
    tx_sender_information = activated_account.standard_metadata_id()
    assert resp == translation_for(f'{transaction_pin_authorization}.first',
                                   preferred_language,
                                   recipient_information=tx_recipient_information,
                                   transaction_amount=from_wei(decimals, transaction_amount),
                                   token_symbol=token_symbol,
                                   sender_information=tx_sender_information)

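The calls `to_wei(decimals, value)` and `from_wei(decimals, value)` above convert between whole token units and the smallest on-chain denomination using the token's `decimals` field. A minimal sketch of what such a pair plausibly does (behavior assumed from the call shapes, not taken from the cic_ussd source):

```python
# Hypothetical decimals-aware unit conversion, mirroring the from_wei/to_wei
# call shapes used in the tests: scale by 10**decimals in each direction.
def to_wei(decimals: int, value: int) -> int:
    # whole token units -> smallest denomination
    return int(value * (10 ** decimals))


def from_wei(decimals: int, value: int) -> float:
    # smallest denomination -> whole token units
    return float(value / (10 ** decimals))
```

For a token with 6 decimals, a transaction amount of 15 round-trips through 15,000,000 base units.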
def test_guardian_exits(activated_account, cache_preferences, cache_token_data, generic_ussd_session, guardian_account,
                        init_database, set_active_token):
blockchain_address = activated_account.blockchain_address
preferred_language = get_cached_preferred_language(blockchain_address)
generic_ussd_session['data'] = {'guardian_phone_number': guardian_account.phone_number}
# testing exit guardian addition success
exit_guardian_addition_success = 'ussd.exit_guardian_addition_success'
resp = response(activated_account, exit_guardian_addition_success, exit_guardian_addition_success[5:],
init_database, generic_ussd_session)
assert resp == translation_for(exit_guardian_addition_success, preferred_language,
guardian_information=guardian_account.standard_metadata_id())
# testing exit guardian removal success
exit_guardian_removal_success = 'ussd.exit_guardian_removal_success'
resp = response(activated_account, exit_guardian_removal_success, exit_guardian_removal_success[5:],
init_database, generic_ussd_session)
assert resp == translation_for(exit_guardian_removal_success, preferred_language,
guardian_information=guardian_account.standard_metadata_id())
generic_ussd_session['data'] = {'failure_reason': 'foo'}
# testing exit invalid guardian addition
exit_invalid_guardian_addition = 'ussd.exit_invalid_guardian_addition'
resp = response(activated_account, exit_invalid_guardian_addition, exit_invalid_guardian_addition[5:],
init_database, generic_ussd_session)
assert resp == translation_for(exit_invalid_guardian_addition, preferred_language, error_exit='foo')
# testing exit invalid guardian removal
exit_invalid_guardian_removal = 'ussd.exit_invalid_guardian_removal'
resp = response(activated_account, exit_invalid_guardian_removal, exit_invalid_guardian_removal[5:],
init_database, generic_ussd_session)
assert resp == translation_for(exit_invalid_guardian_removal, preferred_language, error_exit='foo')
def test_exit_pin_reset_initiated_success(activated_account, cache_preferences, generic_ussd_session, init_database):
blockchain_address = activated_account.blockchain_address
preferred_language = get_cached_preferred_language(blockchain_address)
exit_pin_reset_initiated_success = 'ussd.exit_pin_reset_initiated_success'
generic_ussd_session['data'] = {'guarded_account_phone_number': activated_account.phone_number}
resp = response(activated_account, exit_pin_reset_initiated_success, exit_pin_reset_initiated_success[5:],
init_database, generic_ussd_session)
assert resp == translation_for(exit_pin_reset_initiated_success,
preferred_language,
guarded_account_information=activated_account.standard_metadata_id())
def test_exit_insufficient_balance(activated_account, cache_balances, cache_preferences, cache_token_data,
generic_ussd_session, init_database, set_active_token, valid_recipient):
blockchain_address = activated_account.blockchain_address
token_symbol = get_active_token_symbol(blockchain_address)
token_data = get_cached_token_data(blockchain_address, token_symbol)
preferred_language = get_cached_preferred_language(blockchain_address)
decimals = token_data.get("decimals")
identifier = bytes.fromhex(blockchain_address)
balances_identifier = [identifier, token_symbol.encode('utf-8')]
available_balance = get_cached_available_balance(decimals, balances_identifier)
tx_recipient_information = valid_recipient.standard_metadata_id()
exit_insufficient_balance = 'ussd.exit_insufficient_balance'
    generic_ussd_session['data'] = {
        'recipient_phone_number': valid_recipient.phone_number,
        'transaction_amount': '85'
    }
    transaction_amount = generic_ussd_session.get('data').get('transaction_amount')
    transaction_amount = to_wei(decimals, int(transaction_amount))
    resp = response(activated_account, exit_insufficient_balance, exit_insufficient_balance[5:], init_database,
                    generic_ussd_session)
    assert resp == translation_for(exit_insufficient_balance,
                                   preferred_language,
                                   amount=from_wei(decimals, transaction_amount),
                                   token_symbol=token_symbol,
                                   recipient_information=tx_recipient_information,
                                   token_balance=available_balance)

def test_exit_invalid_menu_option(activated_account, cache_preferences, generic_ussd_session, init_database,
                                  load_support_phone):
    blockchain_address = activated_account.blockchain_address
preferred_language = get_cached_preferred_language(blockchain_address)
exit_invalid_menu_option = 'ussd.exit_invalid_menu_option'
resp = response(activated_account, exit_invalid_menu_option, exit_invalid_menu_option[5:], init_database,
generic_ussd_session)
assert resp == translation_for(exit_invalid_menu_option, preferred_language, support_phone=Support.phone_number)
def test_exit_pin_blocked(activated_account, cache_preferences, generic_ussd_session, init_database,
load_support_phone):
blockchain_address = activated_account.blockchain_address
preferred_language = get_cached_preferred_language(blockchain_address)
exit_pin_blocked = 'ussd.exit_pin_blocked'
resp = response(activated_account, exit_pin_blocked, exit_pin_blocked[5:], init_database, generic_ussd_session)
assert resp == translation_for(exit_pin_blocked, preferred_language, support_phone=Support.phone_number)
def test_exit_successful_token_selection(activated_account, cache_preferences, cache_token_data, generic_ussd_session,
init_database, set_active_token):
blockchain_address = activated_account.blockchain_address
token_symbol = get_active_token_symbol(blockchain_address)
token_data = get_cached_token_data(blockchain_address, token_symbol)
preferred_language = get_cached_preferred_language(blockchain_address)
exit_successful_token_selection = 'ussd.exit_successful_token_selection'
generic_ussd_session['data'] = {'selected_token': token_data}
resp = response(activated_account, exit_successful_token_selection, exit_successful_token_selection[5:],
init_database, generic_ussd_session)
assert resp == translation_for(exit_successful_token_selection, preferred_language, token_symbol=token_symbol)
def test_exit_successful_transaction(activated_account, cache_preferences, cache_token_data, generic_ussd_session,
init_database, set_active_token, valid_recipient):
blockchain_address = activated_account.blockchain_address
token_symbol = get_active_token_symbol(blockchain_address)
token_data = get_cached_token_data(blockchain_address, token_symbol)
preferred_language = get_cached_preferred_language(blockchain_address)
decimals = token_data.get("decimals")
tx_recipient_information = valid_recipient.standard_metadata_id()
tx_sender_information = activated_account.standard_metadata_id()
exit_successful_transaction = 'ussd.exit_successful_transaction'
    generic_ussd_session['data'] = {
        'recipient_phone_number': valid_recipient.phone_number,
        'transaction_amount': '15'
    }
    transaction_amount = generic_ussd_session.get('data').get('transaction_amount')
    transaction_amount = to_wei(decimals, int(transaction_amount))
    resp = response(activated_account, exit_successful_transaction, exit_successful_transaction[5:], init_database,
                    generic_ussd_session)
    assert resp == translation_for(exit_successful_transaction,
                                   preferred_language,
                                   transaction_amount=from_wei(decimals, transaction_amount),
                                   token_symbol=token_symbol,
                                   recipient_information=tx_recipient_information,
                                   sender_information=tx_sender_information)


@@ -10,13 +10,16 @@ from chainlib.hash import strip_0x
from cic_types.condiments import MetadataPointer

# local imports
from cic_ussd.account.metadata import get_cached_preferred_language
from cic_ussd.cache import cache_data, cache_data_key, get_cached_data
from cic_ussd.db.models.task_tracker import TaskTracker
from cic_ussd.menu.ussd_menu import UssdMenu
from cic_ussd.metadata import PersonMetadata
from cic_ussd.processor.ussd import (get_menu,
                                     handle_menu,
                                     handle_menu_operations)
from cic_ussd.processor.util import ussd_menu_list
from cic_ussd.state_machine.logic.language import preferred_langauge_from_selection
from cic_ussd.translation import translation_for

# test imports
@@ -43,7 +46,7 @@ def test_handle_menu(activated_account,
    ussd_menu = UssdMenu.find_by_name('exit_pin_blocked')
    assert menu_resp.get('name') == ussd_menu.get('name')
    menu_resp = handle_menu(pending_account, init_database)
    ussd_menu = UssdMenu.find_by_name('initial_pin_entry')
    assert menu_resp.get('name') == ussd_menu.get('name')
    identifier = bytes.fromhex(strip_0x(pending_account.blockchain_address))
    key = cache_data_key(identifier, MetadataPointer.PREFERENCES)
@@ -75,38 +78,62 @@ def test_get_menu(activated_account,
    assert menu_resp.get('name') == ussd_menu.get('name')
-def test_handle_menu_operations(activated_account,
-                                cache_preferences,
-                                celery_session_worker,
-                                generic_ussd_session,
-                                init_database,
-                                init_cache,
-                                load_chain_spec,
-                                load_config,
-                                mock_account_creation_task_result,
-                                persisted_ussd_session,
-                                person_metadata,
-                                set_locale_files,
-                                setup_metadata_request_handler,
-                                setup_metadata_signer,
-                                task_uuid):
-    # sourcery skip: extract-duplicate-method
-    chain_str = Chain.spec.__str__()
+def test_handle_no_account_menu_operations(celery_session_worker,
+                                           init_cache,
+                                           init_database,
+                                           load_chain_spec,
+                                           load_config,
+                                           load_languages,
+                                           load_ussd_menu,
+                                           mock_account_creation_task_result,
+                                           pending_account,
+                                           persisted_ussd_session,
+                                           set_locale_files,
+                                           task_uuid):
+    initial_language_selection = 'ussd.initial_language_selection'
    phone = phone_number()
    external_session_id = os.urandom(20).hex()
    valid_service_codes = load_config.get('USSD_SERVICE_CODE').split(",")
    preferred_language = i18n.config.get('fallback')
-    resp = handle_menu_operations(chain_str, external_session_id, phone, None, valid_service_codes[0], init_database, '4444')
-    assert resp == translation_for('ussd.account_creation_prompt', preferred_language)
+    key = cache_data_key('system:languages'.encode('utf-8'), MetadataPointer.NONE)
+    cached_system_languages = get_cached_data(key)
+    language_list: list = json.loads(cached_system_languages)
+    fallback = translation_for('helpers.no_language_list', preferred_language)
+    language_list_sets = ussd_menu_list(fallback=fallback, menu_list=language_list, split=3)
+    resp = handle_menu_operations(external_session_id, phone, None, valid_service_codes[0], init_database, '')
+    assert resp == translation_for(initial_language_selection, preferred_language,
+                                   first_language_set=language_list_sets[0])
    cached_ussd_session = get_cached_data(external_session_id)
    ussd_session = json.loads(cached_ussd_session)
    assert ussd_session['msisdn'] == phone
+    persisted_ussd_session.external_session_id = external_session_id
+    persisted_ussd_session.msisdn = phone
+    persisted_ussd_session.state = initial_language_selection[5:]
+    init_database.add(persisted_ussd_session)
+    init_database.commit()
+    account_creation_prompt = 'ussd.account_creation_prompt'
+    user_input = '2'
+    resp = handle_menu_operations(external_session_id, phone, None, valid_service_codes[0], init_database, user_input)
+    preferred_language = preferred_langauge_from_selection(user_input)
+    assert resp == translation_for(account_creation_prompt, preferred_language)
    task_tracker = init_database.query(TaskTracker).filter_by(task_uuid=task_uuid).first()
    assert task_tracker.task_uuid == task_uuid
    cached_creation_task_uuid = get_cached_data(task_uuid)
    creation_task_uuid_data = json.loads(cached_creation_task_uuid)
    assert creation_task_uuid_data['status'] == 'PENDING'
+def test_handle_account_menu_operations(activated_account,
+                                        cache_preferences,
+                                        celery_session_worker,
+                                        init_database,
+                                        load_config,
+                                        persisted_ussd_session,
+                                        person_metadata,
+                                        set_locale_files,
+                                        setup_metadata_request_handler,
+                                        setup_metadata_signer, ):
+    valid_service_codes = load_config.get('USSD_SERVICE_CODE').split(",")
    identifier = bytes.fromhex(strip_0x(activated_account.blockchain_address))
    person_metadata_client = PersonMetadata(identifier)
    with requests_mock.Mocker(real_http=False) as request_mocker:
@@ -117,6 +144,5 @@ def test_handle_menu_operations(activated_account,
    phone = activated_account.phone_number
    preferred_language = get_cached_preferred_language(activated_account.blockchain_address)
    persisted_ussd_session.state = 'enter_transaction_recipient'
-    resp = handle_menu_operations(chain_str, external_session_id, phone, None, valid_service_codes[0], init_database, '1')
+    resp = handle_menu_operations(external_session_id, phone, None, valid_service_codes[0], init_database, '1')
    assert resp == translation_for('ussd.enter_transaction_recipient', preferred_language)

View File

@@ -10,7 +10,10 @@ from cic_types.models.person import get_contact_data_from_vcard
# local imports
from cic_ussd.account.metadata import get_cached_preferred_language
from cic_ussd.metadata import PersonMetadata
-from cic_ussd.processor.util import latest_input, parse_person_metadata, resume_last_ussd_session
+from cic_ussd.processor.util import (latest_input,
+                                     parse_person_metadata,
+                                     resume_last_ussd_session,
+                                     ussd_menu_list)
from cic_ussd.translation import translation_for
@@ -60,3 +63,20 @@ def test_parse_person_metadata(activated_account, cache_person_metadata, cache_p
])
def test_resume_last_ussd_session(expected_menu_name, last_state, load_ussd_menu):
    assert resume_last_ussd_session(last_state).get('name') == expected_menu_name
+
+
+def test_ussd_menu_list(activated_account, cache_preferences, load_ussd_menu, set_locale_files):
+    blockchain_address = activated_account.blockchain_address
+    preferred_language = get_cached_preferred_language(blockchain_address)
+    fallback = translation_for('helpers.no_transaction_history', preferred_language)
+    menu_list_sets = ['1. FII 50.0', '2. GFT 60.0', '3. DET 49.99']
+    split = 3
+    menu_list = ussd_menu_list(fallback=fallback, menu_list=menu_list_sets, split=split)
+    menu_list_sets = [menu_list_sets[item:item + split] for item in range(0, len(menu_list), split)]
+    menu_list_reprs = []
+    for i in range(split):
+        try:
+            menu_list_reprs.append(''.join(f'{list_set_item}\n' for list_set_item in menu_list_sets[i]).rstrip('\n'))
+        except IndexError:
+            menu_list_reprs.append(fallback)
+    assert menu_list == menu_list_reprs
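The behavior `test_ussd_menu_list` pins down can be sketched as a plain function; this is an illustrative reimplementation inferred from the test's assertions, not the actual `cic_ussd.processor.util.ussd_menu_list` code:

```python
def ussd_menu_list(fallback, menu_list, split=3):
    # Group the flat menu entries into screens of `split` items each.
    chunks = [menu_list[i:i + split] for i in range(0, len(menu_list), split)]
    screens = ['\n'.join(chunk) for chunk in chunks]
    # Pad with the fallback text so exactly `split` screens come back,
    # mirroring the IndexError -> fallback branch the test reproduces.
    while len(screens) < split:
        screens.append(fallback)
    return screens[:split]
```

With the test's three-entry list this yields one populated screen followed by two fallback screens.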

View File

@@ -3,8 +3,7 @@ import json
# external imports
import pytest
-import requests_mock
-from chainlib.hash import strip_0x
from cic_types.models.person import Person, get_contact_data_from_vcard
# local imports
@@ -12,9 +11,7 @@ from cic_ussd.cache import get_cached_data
from cic_ussd.account.maps import gender
from cic_ussd.account.metadata import get_cached_preferred_language
from cic_ussd.db.enum import AccountStatus
-from cic_ussd.metadata import PreferencesMetadata
-from cic_ussd.state_machine.logic.account import (change_preferred_language,
-                                                  edit_user_metadata_attribute,
+from cic_ussd.state_machine.logic.account import (edit_user_metadata_attribute,
                                                  parse_gender,
                                                  parse_person_metadata,
                                                  save_complete_person_metadata,
@@ -26,32 +23,6 @@ from cic_ussd.translation import translation_for
# test imports
-@pytest.mark.parametrize('user_input, expected_preferred_language', [
-    ('1', 'en'),
-    ('2', 'sw')
-])
-def test_change_preferred_language(activated_account,
-                                   celery_session_worker,
-                                   expected_preferred_language,
-                                   init_database,
-                                   generic_ussd_session,
-                                   mock_response,
-                                   preferences,
-                                   setup_metadata_request_handler,
-                                   user_input):
-    identifier = bytes.fromhex(strip_0x(activated_account.blockchain_address))
-    preferences_metadata_client = PreferencesMetadata(identifier)
-    with requests_mock.Mocker(real_http=False) as requests_mocker:
-        requests_mocker.register_uri(
-            'POST', preferences_metadata_client.url, status_code=200, reason='OK', json=mock_response
-        )
-        state_machine_data = (user_input, generic_ussd_session, activated_account, init_database)
-        res = change_preferred_language(state_machine_data)
-    init_database.commit()
-    assert res.id is not None
-    assert activated_account.preferred_language == expected_preferred_language
@pytest.mark.parametrize('user_input', [
    '1',
    '2',

View File

@@ -0,0 +1,52 @@
+# standard imports
+import json
+
+# external imports
+import requests_mock
+from cic_types.condiments import MetadataPointer
+
+# local imports
+from cic_ussd.cache import cache_data_key, get_cached_data
+from cic_ussd.metadata import PreferencesMetadata
+from cic_ussd.state_machine.logic.language import (change_preferred_language,
+                                                   is_valid_language_selection,
+                                                   preferred_langauge_from_selection,
+                                                   process_language_selection)
+
+# test imports
+
+
+def test_change_preferred_language(activated_account,
+                                   cached_ussd_session,
+                                   celery_session_worker,
+                                   init_database,
+                                   load_languages,
+                                   mocker,
+                                   setup_metadata_signer,
+                                   setup_metadata_request_handler):
+    ussd_session = get_cached_data(cached_ussd_session.external_session_id)
+    ussd_session = json.loads(ussd_session)
+    preferences = {
+        'preferred_language': 'en'
+    }
+    ussd_session['data'] = preferences
+    mock_add_preferences_metadata = mocker.patch('cic_ussd.tasks.metadata.add_preferences_metadata.apply_async')
+    with requests_mock.Mocker(real_http=False) as request_mocker:
+        identifier = bytes.fromhex(activated_account.blockchain_address)
+        metadata_client = PreferencesMetadata(identifier=identifier)
+        request_mocker.register_uri('POST', metadata_client.url, status_code=201, reason='CREATED', json=preferences)
+        state_machine_data = ('1', ussd_session, activated_account, init_database)
+        change_preferred_language(state_machine_data)
+        mock_add_preferences_metadata.assert_called_with(
+            (activated_account.blockchain_address, preferences), {}, queue='cic-ussd')
+
+
+def test_is_valid_language_selection(activated_account,
+                                     generic_ussd_session,
+                                     init_cache,
+                                     init_database,
+                                     load_languages):
+    state_machine_data = ('1', generic_ussd_session, activated_account, init_database)
+    assert is_valid_language_selection(state_machine_data) is True
+    state_machine_data = ('12', generic_ussd_session, activated_account, init_database)
+    assert is_valid_language_selection(state_machine_data) is False
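Judging from the assertions in `test_is_valid_language_selection` ('1' is valid, '12' is not), the helper checks that the dialled digit indexes into the loaded language list. A minimal self-contained sketch, with the language list hard-coded as an assumption (the real helper presumably reads it from the cache seeded by `load_languages`):

```python
def is_valid_language_selection(state_machine_data, languages=('en', 'sw')):
    # state_machine_data is the usual (user_input, ussd_session, account, session) tuple.
    user_input = state_machine_data[0]
    # Valid when the input is a number between 1 and the count of known languages.
    return user_input.isdigit() and 1 <= int(user_input) <= len(languages)
```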

View File

@@ -9,7 +9,10 @@ from cic_ussd.state_machine.logic.menu import (menu_one_selected,
                                               menu_four_selected,
                                               menu_five_selected,
                                               menu_six_selected,
+                                               menu_nine_selected,
                                               menu_zero_zero_selected,
+                                               menu_eleven_selected,
+                                               menu_twenty_two_selected,
                                               menu_ninety_nine_selected)
# test imports
@@ -29,8 +32,14 @@ def test_menu_selection(init_database, pending_account, persisted_ussd_session):
    assert menu_five_selected(('e', ussd_session, pending_account, init_database)) is False
    assert menu_six_selected(('6', ussd_session, pending_account, init_database)) is True
    assert menu_six_selected(('8', ussd_session, pending_account, init_database)) is False
+    assert menu_nine_selected(('9', ussd_session, pending_account, init_database)) is True
+    assert menu_nine_selected(('-', ussd_session, pending_account, init_database)) is False
    assert menu_zero_zero_selected(('00', ussd_session, pending_account, init_database)) is True
    assert menu_zero_zero_selected(('/', ussd_session, pending_account, init_database)) is False
+    assert menu_eleven_selected(('11', ussd_session, pending_account, init_database)) is True
+    assert menu_eleven_selected(('*', ussd_session, pending_account, init_database)) is False
+    assert menu_twenty_two_selected(('22', ussd_session, pending_account, init_database)) is True
+    assert menu_twenty_two_selected(('5', ussd_session, pending_account, init_database)) is False
    assert menu_ninety_nine_selected(('99', ussd_session, pending_account, init_database)) is True
    assert menu_ninety_nine_selected(('d', ussd_session, pending_account, init_database)) is False
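The `menu_*_selected` helpers exercised above behave as pure equality checks on the user-input slot of the `(user_input, ussd_session, account, session)` tuple. A hypothetical factory sketch consistent with these assertions (the real module may well define each predicate by hand):

```python
def menu_selected(expected_input):
    # Build a predicate that inspects only the user-input slot of the tuple.
    def predicate(state_machine_data):
        user_input = state_machine_data[0]
        return user_input == expected_input
    return predicate

# Predicates like the ones imported in the diff above.
menu_nine_selected = menu_selected('9')
menu_eleven_selected = menu_selected('11')
menu_twenty_two_selected = menu_selected('22')
```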

View File

@@ -0,0 +1,221 @@
+# standard imports
+import json
+
+# external imports
+import requests_mock
+
+# local imports
+from cic_ussd.account.guardianship import Guardianship
+from cic_ussd.account.metadata import get_cached_preferred_language
+from cic_ussd.cache import cache_data_key, get_cached_data
+from cic_ussd.db.models.account import Account
+from cic_ussd.metadata import PersonMetadata
+from cic_ussd.state_machine.logic.pin_guard import (add_pin_guardian,
+                                                    is_dialers_pin_guardian,
+                                                    is_others_pin_guardian,
+                                                    is_set_pin_guardian,
+                                                    remove_pin_guardian,
+                                                    initiate_pin_reset,
+                                                    save_guardian_to_session_data,
+                                                    save_guarded_account_session_data,
+                                                    retrieve_person_metadata,
+                                                    is_valid_guardian_addition)
+from cic_ussd.translation import translation_for
+
+
+def test_save_guardian_to_session_data(activated_account,
+                                       cached_ussd_session,
+                                       celery_session_worker,
+                                       guardian_account,
+                                       init_cache,
+                                       init_database):
+    ussd_session = get_cached_data(cached_ussd_session.external_session_id)
+    ussd_session = json.loads(ussd_session)
+    ussd_session['msisdn'] = activated_account.phone_number
+    state_machine_data = (guardian_account.phone_number, ussd_session, activated_account, init_database)
+    save_guardian_to_session_data(state_machine_data)
+    ussd_session = get_cached_data(cached_ussd_session.external_session_id)
+    ussd_session = json.loads(ussd_session)
+    assert ussd_session.get('data').get('guardian_phone_number') == guardian_account.phone_number
+
+
+def test_save_guarded_account_session_data(activated_account,
+                                           cached_ussd_session,
+                                           celery_session_worker,
+                                           guardian_account,
+                                           init_cache,
+                                           init_database):
+    ussd_session = get_cached_data(cached_ussd_session.external_session_id)
+    ussd_session = json.loads(ussd_session)
+    ussd_session['msisdn'] = guardian_account.phone_number
+    state_machine_data = (activated_account.phone_number, ussd_session, guardian_account, init_database)
+    save_guarded_account_session_data(state_machine_data)
+    ussd_session = get_cached_data(cached_ussd_session.external_session_id)
+    ussd_session = json.loads(ussd_session)
+    assert ussd_session.get('data').get('guarded_account_phone_number') == activated_account.phone_number
+
+
+def test_retrieve_person_metadata(activated_account,
+                                  cached_ussd_session,
+                                  celery_session_worker,
+                                  guardian_account,
+                                  init_cache,
+                                  init_database,
+                                  mocker,
+                                  person_metadata,
+                                  setup_metadata_request_handler,
+                                  setup_metadata_signer):
+    ussd_session = get_cached_data(cached_ussd_session.external_session_id)
+    ussd_session = json.loads(ussd_session)
+    ussd_session['msisdn'] = activated_account.phone_number
+    state_machine_data = (guardian_account.phone_number, ussd_session, activated_account, init_database)
+    mocker_query_person_metadata = mocker.patch('cic_ussd.tasks.metadata.query_person_metadata.apply_async')
+    with requests_mock.Mocker(real_http=False) as request_mocker:
+        identifier = bytes.fromhex(activated_account.blockchain_address)
+        metadata_client = PersonMetadata(identifier)
+        request_mocker.register_uri('GET', metadata_client.url, json=person_metadata, reason='OK', status_code=200)
+        retrieve_person_metadata(state_machine_data)
+        mocker_query_person_metadata.assert_called_with((guardian_account.blockchain_address,), {}, queue='cic-ussd')
+
+
+def test_is_valid_guardian_addition(activated_account,
+                                    cache_preferences,
+                                    cached_ussd_session,
+                                    celery_session_worker,
+                                    init_cache,
+                                    init_database,
+                                    guardian_account,
+                                    load_languages,
+                                    load_ussd_menu,
+                                    set_locale_files,
+                                    setup_guardianship):
+    blockchain_address = activated_account.blockchain_address
+    ussd_session = get_cached_data(cached_ussd_session.external_session_id)
+    ussd_session = json.loads(ussd_session)
+    state_machine_data = (guardian_account.phone_number, ussd_session, activated_account, init_database)
+    assert is_valid_guardian_addition(state_machine_data) is True
+    state_machine_data = (activated_account.phone_number, ussd_session, activated_account, init_database)
+    assert is_valid_guardian_addition(state_machine_data) is False
+    ussd_session = get_cached_data(cached_ussd_session.external_session_id)
+    ussd_session = json.loads(ussd_session)
+    preferred_language = get_cached_preferred_language(blockchain_address)
+    failure_reason = translation_for('helpers.error.is_initiator', preferred_language)
+    assert ussd_session.get('data').get('failure_reason') == failure_reason
+    state_machine_data = (Guardianship.guardians[0], ussd_session, activated_account, init_database)
+    assert is_valid_guardian_addition(state_machine_data) is False
+    ussd_session = get_cached_data(cached_ussd_session.external_session_id)
+    ussd_session = json.loads(ussd_session)
+    preferred_language = get_cached_preferred_language(blockchain_address)
+    failure_reason = translation_for('helpers.error.is_existent_guardian', preferred_language)
+    assert ussd_session.get('data').get('failure_reason') == failure_reason
+
+
+def test_add_pin_guardian(activated_account, generic_ussd_session, guardian_account, init_database):
+    generic_ussd_session['data'] = {'guardian_phone_number': guardian_account.phone_number}
+    state_machine_data = ('', generic_ussd_session, activated_account, init_database)
+    add_pin_guardian(state_machine_data)
+    account = Account.get_by_phone_number(activated_account.phone_number, init_database)
+    assert account.get_guardians()[0] == guardian_account.phone_number
+
+
+def test_is_set_pin_guardian(activated_account,
+                             cache_preferences,
+                             cached_ussd_session,
+                             celery_session_worker,
+                             init_cache,
+                             init_database,
+                             guardian_account,
+                             load_languages,
+                             load_ussd_menu,
+                             set_locale_files,
+                             setup_guardianship):
+    blockchain_address = activated_account.blockchain_address
+    ussd_session = get_cached_data(cached_ussd_session.external_session_id)
+    ussd_session = json.loads(ussd_session)
+    preferred_language = get_cached_preferred_language(blockchain_address)
+    assert is_set_pin_guardian(activated_account, guardian_account.phone_number, preferred_language, init_database,
+                               ussd_session) is False
+    ussd_session = get_cached_data(cached_ussd_session.external_session_id)
+    ussd_session = json.loads(ussd_session)
+    failure_reason = translation_for('helpers.error.is_not_existent_guardian', preferred_language)
+    assert ussd_session.get('data').get('failure_reason') == failure_reason
+    assert is_set_pin_guardian(activated_account, Guardianship.guardians[0], preferred_language, init_database,
+                               ussd_session) is True
+    assert is_set_pin_guardian(activated_account, activated_account.phone_number, preferred_language, init_database,
+                               ussd_session) is False
+    ussd_session = get_cached_data(cached_ussd_session.external_session_id)
+    ussd_session = json.loads(ussd_session)
+    failure_reason = translation_for('helpers.error.is_initiator', preferred_language)
+    assert ussd_session.get('data').get('failure_reason') == failure_reason
+
+
+def test_is_dialers_pin_guardian(activated_account,
+                                 cache_preferences,
+                                 cached_ussd_session,
+                                 celery_session_worker,
+                                 init_database,
+                                 guardian_account):
+    ussd_session = get_cached_data(cached_ussd_session.external_session_id)
+    ussd_session = json.loads(ussd_session)
+    state_machine_data = (guardian_account.phone_number, ussd_session, activated_account, init_database)
+    assert is_dialers_pin_guardian(state_machine_data) is False
+    activated_account.add_guardian(guardian_account.phone_number)
+    init_database.flush()
+    state_machine_data = (guardian_account.phone_number, ussd_session, activated_account, init_database)
+    assert is_dialers_pin_guardian(state_machine_data) is True
+
+
+def test_is_others_pin_guardian(activated_account,
+                                cache_preferences,
+                                cached_ussd_session,
+                                celery_session_worker,
+                                init_database,
+                                guardian_account):
+    ussd_session = get_cached_data(cached_ussd_session.external_session_id)
+    ussd_session = json.loads(ussd_session)
+    state_machine_data = (activated_account.phone_number, ussd_session, guardian_account, init_database)
+    assert is_others_pin_guardian(state_machine_data) is False
+    activated_account.add_guardian(guardian_account.phone_number)
+    init_database.flush()
+    state_machine_data = (activated_account.phone_number, ussd_session, guardian_account, init_database)
+    assert is_others_pin_guardian(state_machine_data) is True
+
+
+def test_remove_pin_guardian(activated_account, generic_ussd_session, guardian_account, init_database):
+    generic_ussd_session['data'] = {'guardian_phone_number': guardian_account.phone_number}
+    activated_account.add_guardian(guardian_account.phone_number)
+    init_database.flush()
+    assert activated_account.get_guardians()[0] == guardian_account.phone_number
+    state_machine_data = ('', generic_ussd_session, activated_account, init_database)
+    remove_pin_guardian(state_machine_data)
+    assert len(activated_account.get_guardians()) == 0
+
+
+def test_initiate_pin_reset(activated_account,
+                            cache_preferences,
+                            celery_session_worker,
+                            cached_ussd_session,
+                            guardian_account,
+                            init_cache,
+                            init_database,
+                            load_ussd_menu,
+                            mock_notifier_api,
+                            set_locale_files):
+    ussd_session = get_cached_data(cached_ussd_session.external_session_id)
+    ussd_session = json.loads(ussd_session)
+    ussd_session['data'] = {'guarded_account_phone_number': activated_account.phone_number}
+    state_machine_data = ('', ussd_session, guardian_account, init_database)
+    initiate_pin_reset(state_machine_data)
+    blockchain_address = activated_account.blockchain_address
+    preferred_language = get_cached_preferred_language(blockchain_address)
+    message = translation_for('sms.pin_reset_initiated', preferred_language, pin_initiator=guardian_account.standard_metadata_id())
+    assert mock_notifier_api.get('message') == message
+    assert mock_notifier_api.get('recipient') == activated_account.phone_number
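The add/list/remove guardian semantics these tests assert can be modeled with a small in-memory stand-in. The real `Account` is an SQLAlchemy model, so this sketch, which assumes guardians live in a single comma-separated column, is illustrative only:

```python
class AccountSketch:
    # In-memory stand-in for the guardian bookkeeping asserted by the tests above.
    def __init__(self):
        self.guardians = ''

    def get_guardians(self) -> list:
        # Empty string means no guardians yet.
        return self.guardians.split(',') if self.guardians else []

    def add_guardian(self, phone_number: str):
        guardians = self.get_guardians()
        if phone_number not in guardians:
            guardians.append(phone_number)
        self.guardians = ','.join(guardians)

    def remove_guardian(self, phone_number: str):
        guardians = self.get_guardians()
        if phone_number in guardians:
            guardians.remove(phone_number)
        self.guardians = ','.join(guardians)
```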

View File

@@ -23,6 +23,7 @@ def test_upsell_unregistered_recipient(activated_account,
                                       load_support_phone,
                                       mock_notifier_api,
                                       set_locale_files,
+                                      set_active_token,
                                       valid_recipient):
    cached_ussd_session.set_data('recipient_phone_number', valid_recipient.phone_number)
    state_machine_data = ('', cached_ussd_session.to_json(), activated_account, init_database)

View File

@@ -0,0 +1,69 @@
+# standard imports
+import json
+
+# external imports
+from cic_types.condiments import MetadataPointer
+
+# local imports
+from cic_ussd.cache import cache_data_key, get_cached_data
+from cic_ussd.state_machine.logic.tokens import (is_valid_token_selection,
+                                                 process_token_selection,
+                                                 set_selected_active_token)
+from cic_ussd.account.tokens import get_cached_token_data_list
+
+# test imports
+
+
+def test_is_valid_token_selection(activated_account,
+                                  cache_token_data_list,
+                                  cache_token_symbol_list,
+                                  cached_ussd_session,
+                                  init_cache,
+                                  init_database):
+    cached_token_data_list = get_cached_token_data_list(activated_account.blockchain_address)
+    ussd_session = get_cached_data(cached_ussd_session.external_session_id)
+    ussd_session = json.loads(ussd_session)
+    ussd_session['data'] = {'account_tokens_list': cached_token_data_list}
+    state_machine_data = ('GFT', ussd_session, activated_account, init_database)
+    assert is_valid_token_selection(state_machine_data) is True
+    state_machine_data = ('1', ussd_session, activated_account, init_database)
+    assert is_valid_token_selection(state_machine_data) is True
+    state_machine_data = ('3', ussd_session, activated_account, init_database)
+    assert is_valid_token_selection(state_machine_data) is False
+
+
+def test_process_token_selection(activated_account,
+                                 cache_token_data_list,
+                                 cache_token_symbol_list,
+                                 cached_ussd_session,
+                                 celery_session_worker,
+                                 init_cache,
+                                 init_database):
+    cached_token_data_list = get_cached_token_data_list(activated_account.blockchain_address)
+    ussd_session = get_cached_data(cached_ussd_session.external_session_id)
+    ussd_session = json.loads(ussd_session)
+    ussd_session['data'] = {'account_tokens_list': cached_token_data_list}
+    state_machine_data = ('GFT', ussd_session, activated_account, init_database)
+    process_token_selection(state_machine_data)
+    ussd_session = get_cached_data(cached_ussd_session.external_session_id)
+    ussd_session = json.loads(ussd_session)
+    assert ussd_session.get('data').get('selected_token').get('symbol') == 'GFT'
+
+
+def test_set_selected_active_token(activated_account,
+                                   cache_token_data_list,
+                                   cache_token_symbol_list,
+                                   cached_ussd_session,
+                                   init_cache,
+                                   init_database):
+    cached_token_data_list = get_cached_token_data_list(activated_account.blockchain_address)
+    ussd_session = get_cached_data(cached_ussd_session.external_session_id)
+    ussd_session = json.loads(ussd_session)
+    ussd_session['data'] = {'selected_token': cached_token_data_list[0]}
+    state_machine_data = ('GFT', ussd_session, activated_account, init_database)
+    set_selected_active_token(state_machine_data)
+    identifier = bytes.fromhex(activated_account.blockchain_address)
+    key = cache_data_key(identifier=identifier, salt=MetadataPointer.TOKEN_ACTIVE)
+    active_token = get_cached_data(key)
+    assert active_token == 'GFT'

View File

@@ -3,13 +3,12 @@ import json
# external imports
import pytest
-import requests_mock
-from chainlib.hash import strip_0x
# local imports
+from cic_ussd.account.metadata import get_cached_preferred_language
+from cic_ussd.account.tokens import get_active_token_symbol, get_cached_token_data
from cic_ussd.account.transaction import to_wei
from cic_ussd.cache import get_cached_data
-from cic_ussd.metadata import PersonMetadata
from cic_ussd.state_machine.logic.transaction import (is_valid_recipient,
                                                      is_valid_transaction_amount,
                                                      has_sufficient_balance,
@@ -18,7 +17,6 @@ from cic_ussd.state_machine.logic.transaction import (is_valid_recipient,
                                                      save_recipient_phone_to_session_data,
                                                      save_transaction_amount_to_session_data)
# test imports
@@ -49,17 +47,18 @@ def test_is_valid_transaction_amount(activated_account, amount, expected_result,
])
def test_has_sufficient_balance(activated_account,
                                cache_balances,
-                               cache_default_token_data,
+                               cache_token_data,
                                expected_result,
                                generic_ussd_session,
                                init_database,
+                               set_active_token,
                                value):
    state_machine_data = (value, generic_ussd_session, activated_account, init_database)
    assert has_sufficient_balance(state_machine_data=state_machine_data) == expected_result
def test_process_transaction_request(activated_account,
-                                     cache_default_token_data,
+                                     cache_token_data,
                                     cached_ussd_session,
                                     celery_session_worker,
                                     init_cache,
@@ -67,7 +66,12 @@ def test_process_transaction_request(activated_account,
                                     load_chain_spec,
                                     load_config,
                                     mock_transfer_api,
+                                     set_active_token,
                                     valid_recipient):
+    blockchain_address = activated_account.blockchain_address
+    token_symbol = get_active_token_symbol(blockchain_address)
+    token_data = get_cached_token_data(blockchain_address, token_symbol)
+    decimals = token_data.get("decimals")
    cached_ussd_session.set_data('recipient_phone_number', valid_recipient.phone_number)
    cached_ussd_session.set_data('transaction_amount', '50')
    ussd_session = get_cached_data(cached_ussd_session.external_session_id)
@@ -76,7 +80,7 @@ def test_process_transaction_request(activated_account,
    process_transaction_request(state_machine_data)
    assert mock_transfer_api['from_address'] == activated_account.blockchain_address
    assert mock_transfer_api['to_address'] == valid_recipient.blockchain_address
-    assert mock_transfer_api['value'] == to_wei(50)
+    assert mock_transfer_api['value'] == to_wei(decimals, 50)
    assert mock_transfer_api['token_symbol'] == load_config.get('TEST_TOKEN_SYMBOL')
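The change from `to_wei(50)` to `to_wei(decimals, 50)` indicates the conversion now scales by the token's own `decimals` value instead of a single fixed precision. A plausible sketch of such a helper (an assumption, not the actual `cic_ussd.account.transaction` implementation):

```python
def to_wei(decimals: int, value) -> int:
    # Scale a human-readable token amount into the token's smallest unit.
    return int(value * (10 ** decimals))
```

For a 6-decimal token, 50 units become 50 * 10**6 base units.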

View File

@@ -6,8 +6,10 @@ def test_state_machine(activated_account_ussd_session,
                       celery_session_worker,
                       init_database,
                       init_state_machine,
-                      pending_account):
+                      load_languages,
+                      pending_account,
+                      set_locale_files):
    state_machine = UssdStateMachine(activated_account_ussd_session)
    state_machine.scan_data(('1', activated_account_ussd_session, pending_account, init_database))
    assert state_machine.__repr__() == f'<KenyaUssdStateMachine: {state_machine.state}>'
-    assert state_machine.state == 'initial_pin_entry'
+    assert state_machine.state == 'account_creation_prompt'

View File

@@ -4,15 +4,18 @@ import json
 # external imports
 import celery
 import pytest
+import requests_mock
 from chainlib.hash import strip_0x
 from cic_types.condiments import MetadataPointer
 # local imports
 from cic_ussd.account.statement import filter_statement_transactions
+from cic_ussd.account.tokens import collate_token_metadata
 from cic_ussd.account.transaction import transaction_actors
 from cic_ussd.cache import cache_data_key, get_cached_data
 from cic_ussd.db.models.account import Account
 from cic_ussd.error import AccountCreationDataNotFound
+from cic_ussd.metadata import TokenMetadata
 # test imports
@@ -22,11 +25,13 @@ from tests.helpers.accounts import blockchain_address
 def test_account_creation_callback(account_creation_data,
                                    cache_account_creation_data,
                                    celery_session_worker,
+                                   cache_default_token_data,
                                    custom_metadata,
                                    init_cache,
                                    init_database,
                                    load_chain_spec,
                                    mocker,
+                                   preferences,
                                    setup_metadata_request_handler,
                                    setup_metadata_signer):
     phone_number = account_creation_data.get('phone_number')
@@ -48,10 +53,12 @@ def test_account_creation_callback(account_creation_data,
     cached_account_creation_data = get_cached_data(task_uuid)
     cached_account_creation_data = json.loads(cached_account_creation_data)
     assert cached_account_creation_data.get('status') == account_creation_data.get('status')
+    mock_add_preferences_metadata = mocker.patch('cic_ussd.tasks.metadata.add_preferences_metadata.apply_async')
     mock_add_phone_pointer = mocker.patch('cic_ussd.tasks.metadata.add_phone_pointer.apply_async')
     mock_add_custom_metadata = mocker.patch('cic_ussd.tasks.metadata.add_custom_metadata.apply_async')
+    preferred_language = preferences.get('preferred_language')
     s_account_creation_callback = celery.signature(
-        'cic_ussd.tasks.callback_handler.account_creation_callback', [result, '', 0]
+        'cic_ussd.tasks.callback_handler.account_creation_callback', [result, preferred_language, 0]
     )
     s_account_creation_callback.apply_async().get()
     account = init_database.query(Account).filter_by(phone_number=phone_number).first()
@@ -59,6 +66,7 @@ def test_account_creation_callback(account_creation_data,
     cached_account_creation_data = get_cached_data(task_uuid)
     cached_account_creation_data = json.loads(cached_account_creation_data)
     assert cached_account_creation_data.get('status') == 'CREATED'
+    mock_add_preferences_metadata.assert_called_with((result, preferences), {}, queue='cic-ussd')
     mock_add_phone_pointer.assert_called_with((result, phone_number), {}, queue='cic-ussd')
     mock_add_custom_metadata.assert_called_with((result, custom_metadata), {}, queue='cic-ussd')
@@ -117,12 +125,46 @@ def test_statement_callback(activated_account, mocker, transactions_list):
         (activated_account.blockchain_address, sender_transaction), {}, queue='cic-ussd')
+def test_token_data_callback(activated_account,
+                             cache_token_data,
+                             cache_token_meta_symbol,
+                             cache_token_proof_symbol,
+                             celery_session_worker,
+                             default_token_data,
+                             init_cache,
+                             token_meta_symbol,
+                             token_symbol):
+    blockchain_address = activated_account.blockchain_address
+    identifier = token_symbol.encode('utf-8')
+    status_code = 1
+    with pytest.raises(ValueError) as error:
+        s_token_data_callback = celery.signature(
+            'cic_ussd.tasks.callback_handler.token_data_callback',
+            [[default_token_data], blockchain_address, status_code])
+        s_token_data_callback.apply_async().get()
+    assert str(error.value) == f'Unexpected status code: {status_code}.'
+    token_data_key = cache_data_key([bytes.fromhex(blockchain_address), identifier], MetadataPointer.TOKEN_DATA)
+    token_meta_key = cache_data_key(identifier, MetadataPointer.TOKEN_META_SYMBOL)
+    token_info_key = cache_data_key(identifier, MetadataPointer.TOKEN_PROOF_SYMBOL)
+    token_meta = get_cached_data(token_meta_key)
+    token_meta = json.loads(token_meta)
+    token_info = get_cached_data(token_info_key)
+    token_info = json.loads(token_info)
+    token_data = collate_token_metadata(token_info=token_info, token_metadata=token_meta)
+    token_data = {**token_data, **default_token_data}
+    cached_token_data = json.loads(get_cached_data(token_data_key))
+    for key, value in token_data.items():
+        assert token_data[key] == cached_token_data[key]
 def test_transaction_balances_callback(activated_account,
                                        balances,
                                        cache_balances,
-                                       cache_default_token_data,
+                                       cache_token_data,
                                        cache_person_metadata,
                                        cache_preferences,
+                                       celery_session_worker,
                                        load_chain_spec,
                                        mocker,
                                        preferences,
@@ -157,7 +199,16 @@ def test_transaction_balances_callback(activated_account,
     mocked_chain.assert_called()
-def test_transaction_callback(load_chain_spec, mock_async_balance_api_query, transaction_result):
+def test_transaction_callback(cache_token_data,
+                              celery_session_worker,
+                              default_token_data,
+                              init_cache,
+                              load_chain_spec,
+                              mock_async_balance_api_query,
+                              token_symbol,
+                              token_meta_symbol,
+                              token_proof_symbol,
+                              transaction_result):
     status_code = 1
     with pytest.raises(ValueError) as error:
         s_transaction_callback = celery.signature(
@@ -166,13 +217,19 @@ def test_transaction_callback(load_chain_spec, mock_async_balance_api_query, tra
     s_transaction_callback.apply_async().get()
     assert str(error.value) == f'Unexpected status code: {status_code}.'
-    status_code = 0
-    s_transaction_callback = celery.signature(
-        'cic_ussd.tasks.callback_handler.transaction_callback',
-        [transaction_result, 'transfer', status_code])
-    s_transaction_callback.apply_async().get()
-    recipient_transaction, sender_transaction = transaction_actors(transaction_result)
-    assert mock_async_balance_api_query.get('address') == recipient_transaction.get('blockchain_address') or sender_transaction.get('blockchain_address')
-    assert mock_async_balance_api_query.get('token_symbol') == recipient_transaction.get('token_symbol') or sender_transaction.get('token_symbol')
+    with requests_mock.Mocker(real_http=False) as request_mocker:
+        identifier = token_symbol.encode('utf-8')
+        metadata_client = TokenMetadata(identifier, cic_type=MetadataPointer.TOKEN_META_SYMBOL)
+        request_mocker.register_uri('GET', metadata_client.url, json=token_meta_symbol, status_code=200, reason='OK')
+        metadata_client = TokenMetadata(identifier, cic_type=MetadataPointer.TOKEN_PROOF_SYMBOL)
+        request_mocker.register_uri('GET', metadata_client.url, json=token_proof_symbol, status_code=200, reason='OK')
+        status_code = 0
+        s_transaction_callback = celery.signature(
+            'cic_ussd.tasks.callback_handler.transaction_callback',
+            [transaction_result, 'transfer', status_code])
+        s_transaction_callback.apply_async().get()
+        recipient_transaction, sender_transaction = transaction_actors(transaction_result)
+        assert mock_async_balance_api_query.get('address') == recipient_transaction.get('blockchain_address') or sender_transaction.get('blockchain_address')
+        assert mock_async_balance_api_query.get('token_symbol') == recipient_transaction.get('token_symbol') or sender_transaction.get('token_symbol')
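Both callback tests first assert that a non-zero status code raises a `ValueError` with an exact message before any work is done. The guard they exercise can be sketched like this (a stripped-down illustration; the real handlers in `cic_ussd.tasks.callback_handler` also dispatch the result):

```python
def transaction_callback(result, param: str, status_code: int):
    # Reject failed upstream calls before touching the result,
    # using the exact message the tests assert against.
    if status_code != 0:
        raise ValueError(f'Unexpected status code: {status_code}.')
    return result

# A failing status surfaces as the asserted error message:
try:
    transaction_callback({}, 'transfer', 1)
except ValueError as error:
    assert str(error) == 'Unexpected status code: 1.'
```

Asserting on `str(error.value)` in the tests pins the message format, so changing the guard's wording is a deliberate, test-visible act.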

View File

@@ -14,13 +14,14 @@ from cic_ussd.translation import translation_for
 def test_transaction(cache_default_token_data,
+                     cache_token_data,
                      celery_session_worker,
                      load_support_phone,
                      mock_notifier_api,
                      notification_data,
                      set_locale_files):
     notification_data['transaction_type'] = 'transfer'
-    amount = from_wei(notification_data.get('token_value'))
+    amount = from_wei(6, notification_data.get('token_value'))
     balance = notification_data.get('available_balance')
     phone_number = notification_data.get('phone_number')
     preferred_language = notification_data.get('preferred_language')
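The `from_wei` change makes the token's decimal places an explicit first argument instead of an assumed constant, which matters once multiple tokens with different precisions are in play. A hypothetical equivalent of the conversion (argument order mirrors the call in the test; the real function is in cic_ussd):

```python
def from_wei(decimals: int, value: int) -> float:
    # Scale an integer token amount down by its decimal places,
    # e.g. 25000000 with 6 decimals -> 25.0.
    return float(value / (10 ** decimals))

amount = from_wei(6, 25000000)  # 25.0
```

With the decimals threaded through per token, a 6-decimal and an 18-decimal token can share one rendering path.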

View File

@@ -52,6 +52,11 @@ def test_cache_statement(activated_account,
     cached_statement = get_cached_data(key)
     cached_statement = json.loads(cached_statement)
     assert len(cached_statement) == 1
+    sender_transaction['token_value'] = 60.0
+    s_parse_transaction = celery.signature(
+        'cic_ussd.tasks.processor.parse_transaction', [sender_transaction])
+    result = s_parse_transaction.apply_async().get()
     s_cache_statement = celery.signature(
         'cic_ussd.tasks.processor.cache_statement', [result, activated_account.blockchain_address]
     )
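The added lines re-run `parse_transaction` on a mutated sender transaction before caching again, so the cached statement should grow by one entry per parsed result. The cache side of that flow can be sketched with a plain dict standing in for the redis-backed store (key name and store are illustrative, not the project's actual API):

```python
import json

store = {}  # stand-in for the redis-backed cache

def cache_statement(parsed_transaction: dict, key: str):
    # Append the newly parsed entry to whatever statement is cached,
    # treating a missing key as an empty statement.
    statement = json.loads(store.get(key, '[]'))
    statement.append(parsed_transaction)
    store[key] = json.dumps(statement)

key = 'statement:deadbeef'  # hypothetical key
cache_statement({'token_value': 25.0}, key)
cache_statement({'token_value': 60.0}, key)
```

This read-append-write pattern is why the test asserts `len(cached_statement)` before and after the second parse: each parsed result must land as a distinct entry.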

View File

@@ -8,6 +8,7 @@ from cic_types.condiments import MetadataPointer
 # local imports
 from cic_ussd.account.chain import Chain
+from cic_ussd.account.tokens import set_active_token
 from cic_ussd.cache import cache_data, cache_data_key
 from cic_ussd.db.enum import AccountStatus
 from cic_ussd.db.models.account import Account
@@ -36,6 +37,16 @@ def activated_account(init_database, set_fernet_key):
     return account
+@pytest.fixture(scope='function')
+def guardian_account(init_database, set_fernet_key):
+    account = Account(blockchain_address(), phone_number())
+    account.create_password('0000')
+    account.activate_account()
+    init_database.add(account)
+    init_database.commit()
+    return account
 @pytest.fixture(scope='function')
 def balances():
     return [{
@@ -53,13 +64,22 @@ def cache_account_creation_data(init_cache, account_creation_data):
 @pytest.fixture(scope='function')
-def cache_balances(activated_account, balances, init_cache):
-    identifier = bytes.fromhex(activated_account.blockchain_address)
+def cache_balances(activated_account, balances, init_cache, token_symbol):
+    identifier = [bytes.fromhex(activated_account.blockchain_address), token_symbol.encode('utf-8')]
     balances = json.dumps(balances[0])
     key = cache_data_key(identifier, MetadataPointer.BALANCES)
     cache_data(key, balances)
+@pytest.fixture(scope='function')
+def cache_adjusted_balances(activated_account, balances, init_cache, token_symbol):
+    identifier = bytes.fromhex(activated_account.blockchain_address)
+    balances_identifier = [identifier, token_symbol.encode('utf-8')]
+    key = cache_data_key(balances_identifier, MetadataPointer.BALANCES_ADJUSTED)
+    adjusted_balance = 45931650.64654012
+    cache_data(key, adjusted_balance)
 @pytest.fixture(scope='function')
 def cache_default_token_data(default_token_data, init_cache, load_chain_spec):
     chain_str = Chain.spec.__str__()
@@ -68,6 +88,113 @@ def cache_default_token_data(default_token_data, init_cache, load_chain_spec):
     cache_data(key, data)
+@pytest.fixture(scope='function')
+def set_active_token(activated_account, init_cache, token_symbol):
+    identifier = bytes.fromhex(activated_account.blockchain_address)
+    key = cache_data_key(identifier, MetadataPointer.TOKEN_ACTIVE)
+    cache_data(key=key, data=token_symbol)
+@pytest.fixture(scope='function')
+def cache_token_data(activated_account, init_cache, token_data):
+    identifier = [bytes.fromhex(activated_account.blockchain_address), token_data.get('symbol').encode('utf-8')]
+    key = cache_data_key(identifier, MetadataPointer.TOKEN_DATA)
+    cache_data(key=key, data=json.dumps(token_data))
+@pytest.fixture(scope='function')
+def cache_token_symbol_list(activated_account, init_cache, token_symbol):
+    identifier = bytes.fromhex(activated_account.blockchain_address)
+    key = cache_data_key(identifier=identifier, salt=MetadataPointer.TOKEN_SYMBOLS_LIST)
+    token_symbols_list = [token_symbol]
+    cache_data(key, json.dumps(token_symbols_list))
+@pytest.fixture(scope='function')
+def cache_token_data_list(activated_account, init_cache, token_data):
+    identifier = bytes.fromhex(activated_account.blockchain_address)
+    key = cache_data_key(identifier, MetadataPointer.TOKEN_DATA_LIST)
+    token_data_list = [token_data]
+    cache_data(key, json.dumps(token_data_list))
+@pytest.fixture(scope='function')
+def token_meta_symbol():
+    return {
+        "contact": {
+            "phone": "+254700000000",
+            "email": "info@grassrootseconomics.org"
+        },
+        "country_code": "KE",
+        "location": "Kilifi",
+        "name": "GRASSROOTS ECONOMICS"
+    }
+@pytest.fixture(scope='function')
+def token_proof_symbol():
+    return {
+        "description": "Community support",
+        "issuer": "Grassroots Economics",
+        "namespace": "ge",
+        "proofs": [
+            "0x4746540000000000000000000000000000000000000000000000000000000000",
+            "1f0f0e3e9db80eeaba22a9d4598e454be885855d6048545546fd488bb709dc2f"
+        ],
+        "version": 0
+    }
+@pytest.fixture(scope='function')
+def token_list_entries():
+    return [
+        {
+            'name': 'Fee',
+            'symbol': 'FII',
+            'issuer': 'Foo',
+            'contact': {'phone': '+254712345678'},
+            'location': 'Fum',
+            'balance': 50.0
+        },
+        {
+            'name': 'Giftable Token',
+            'symbol': 'GFT',
+            'issuer': 'Grassroots Economics',
+            'contact': {
+                'phone': '+254700000000',
+                'email': 'info@grassrootseconomics.org'
+            },
+            'location': 'Fum',
+            'balance': 60.0
+        },
+        {
+            'name': 'Demurrage Token',
+            'symbol': 'DET',
+            'issuer': 'Grassroots Economics',
+            'contact': {
+                'phone': '+254700000000',
+                'email': 'info@grassrootseconomics.org'
+            },
+            'location': 'Fum',
+            'balance': 49.99
+        }
+    ]
+@pytest.fixture(scope='function')
+def cache_token_meta_symbol(token_meta_symbol, token_symbol):
+    identifier = token_symbol.encode('utf-8')
+    key = cache_data_key(identifier, MetadataPointer.TOKEN_META_SYMBOL)
+    cache_data(key, json.dumps(token_meta_symbol))
+@pytest.fixture(scope='function')
+def cache_token_proof_symbol(token_proof_symbol, token_symbol):
+    identifier = token_symbol.encode('utf-8')
+    key = cache_data_key(identifier, MetadataPointer.TOKEN_PROOF_SYMBOL)
+    cache_data(key, json.dumps(token_proof_symbol))
 @pytest.fixture(scope='function')
 def cache_person_metadata(activated_account, init_cache, person_metadata):
     identifier = bytes.fromhex(activated_account.blockchain_address)
@@ -100,10 +227,33 @@ def custom_metadata():
 @pytest.fixture(scope='function')
 def default_token_data(token_symbol):
     return {
         'symbol': token_symbol,
-        'address': blockchain_address(),
-        'name': 'Giftable',
-        'decimals': 6
+        'address': '32e860c2a0645d1b7b005273696905f5d6dc5d05',
+        'name': 'Giftable Token',
+        'decimals': 6,
+        "converters": []
+    }
+@pytest.fixture(scope='function')
+def token_data():
+    return {
+        "description": "Community support",
+        "issuer": "Grassroots Economics",
+        "location": "Kilifi",
+        "contact": {
+            "phone": "+254700000000",
+            "email": "info@grassrootseconomics.org"
+        },
+        "decimals": 6,
+        "name": "Giftable Token",
+        "symbol": "GFT",
+        "address": "32e860c2a0645d1b7b005273696905f5d6dc5d05",
+        "proofs": [
+            "0x4746540000000000000000000000000000000000000000000000000000000000",
+            "1f0f0e3e9db80eeaba22a9d4598e454be885855d6048545546fd488bb709dc2f"
+        ],
+        "converters": []
     }
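Several fixtures above now derive cache keys from a composite identifier (address bytes plus the token symbol) instead of the address alone, so per-token entries for one account no longer collide. A hypothetical key scheme illustrating the idea (cic_ussd's actual `cache_data_key` is not reproduced here; only the list-vs-bytes identifier handling seen in the fixtures is mirrored):

```python
import hashlib

def cache_data_key(identifier, salt: str) -> str:
    # Accept a single bytes identifier or a list of them, as the
    # fixtures do, and fold the salt (pointer name) into the hash.
    if isinstance(identifier, list):
        identifier = b''.join(identifier)
    hasher = hashlib.sha256()
    hasher.update(identifier)
    hasher.update(salt.encode('utf-8'))
    return hasher.hexdigest()

address = bytes.fromhex('32e860c2a0645d1b7b005273696905f5d6dc5d05')
gft_key = cache_data_key([address, b'GFT'], ':cic.token.data')
det_key = cache_data_key([address, b'DET'], ':cic.token.data')
```

Because the symbol participates in the digest, caching GFT data can never overwrite DET data for the same account.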

View File

@@ -2,14 +2,18 @@
 # external imports
 import pytest
+from pytest_redis import factories
 # local imports
 from cic_ussd.cache import Cache
 from cic_ussd.session.ussd_session import UssdSession
+redis_test_proc = factories.redis_proc()
+redis_db = factories.redisdb('redis_test_proc', decode=True)
 @pytest.fixture(scope='function')
-def init_cache(redisdb):
-    Cache.store = redisdb
-    UssdSession.store = redisdb
-    return redisdb
+def init_cache(redis_db):
+    Cache.store = redis_db
+    UssdSession.store = redis_db
+    return redis_db

View File

@@ -10,11 +10,13 @@ from confini import Config
 # local imports
 from cic_ussd.account.chain import Chain
+from cic_ussd.account.guardianship import Guardianship
 from cic_ussd.encoder import PasswordEncoder
 from cic_ussd.files.local_files import create_local_file_data_stores, json_file_parser
 from cic_ussd.menu.ussd_menu import UssdMenu
 from cic_ussd.phone_number import E164Format, Support
 from cic_ussd.state_machine import UssdStateMachine
+from cic_ussd.translation import generate_locale_files, Languages
 from cic_ussd.validator import validate_presence
 logg = logging.getLogger(__name__)
@@ -39,6 +41,14 @@ def init_state_machine(load_config):
     UssdStateMachine.transitions = json_file_parser(filepath=load_config.get('MACHINE_TRANSITIONS'))
+@pytest.fixture(scope='function')
+def load_languages(init_cache, load_config):
+    validate_presence(load_config.get('LANGUAGES_FILE'))
+    Languages.load_languages_dict(load_config.get('LANGUAGES_FILE'))
+    languages = Languages()
+    languages.cache_system_languages()
 @pytest.fixture(scope='function')
 def load_chain_spec(load_config):
     chain_spec = ChainSpec.from_chain_str(load_config.get('CHAIN_SPEC'))
@@ -75,8 +85,23 @@ def set_fernet_key(load_config):
     PasswordEncoder.set_key(load_config.get('APP_PASSWORD_PEPPER'))
-@pytest.fixture
-def set_locale_files(load_config):
-    validate_presence(load_config.get('LOCALE_PATH'))
-    i18n.load_path.append(load_config.get('LOCALE_PATH'))
+@pytest.fixture(scope='function')
+def setup_guardianship(load_config):
+    guardians_file = os.path.join(root_directory, load_config.get('SYSTEM_GUARDIANS_FILE'))
+    validate_presence(guardians_file)
+    Guardianship.load_system_guardians(guardians_file)
+@pytest.fixture(scope="session")
+def set_locale_files(load_config, tmpdir_factory):
+    tmpdir = tmpdir_factory.mktemp("var")
+    tmpdir_path = str(tmpdir)
+    validate_presence(tmpdir_path)
+    import cic_translations
+    package_path = cic_translations.__path__
+    schema_files = os.path.join(package_path[0], load_config.get("SCHEMA_FILE_PATH"))
+    generate_locale_files(locale_dir=tmpdir_path,
+                          schema_file_path=schema_files,
+                          translation_builder_path=load_config.get('LOCALE_FILE_BUILDERS'))
+    i18n.load_path.append(tmpdir_path)
     i18n.set('fallback', load_config.get('LOCALE_FALLBACK'))

View File

@@ -40,6 +40,7 @@ def statement(activated_account):
         'blockchain_address': activated_account.blockchain_address,
         'token_symbol': 'GFT',
         'token_value': 25000000,
+        'token_decimals': 6,
         'role': 'sender',
         'action_tag': 'Sent',
         'direction_tag': 'To',
@@ -63,7 +64,7 @@ def transaction_result(activated_account, load_config, valid_recipient):
         'destination_token_symbol': load_config.get('TEST_TOKEN_SYMBOL'),
         'source_token_decimals': 6,
         'destination_token_decimals': 6,
-        'chain': 'evm:bloxberg:8996'
+        'chain': load_config.get('CHAIN_SPEC')
     }

View File

@@ -0,0 +1 @@
++254700000000

View File

@@ -574,9 +574,9 @@ products_edit_pin_authorization.first,"CON Please enter your PIN
 0. Dheebi"
 products_edit_pin_authorization.retry,%{retry_pin_entry},%{retry_pin_entry},%{retry_pin_entry},%{retry_pin_entry},%{retry_pin_entry},%{retry_pin_entry},%{retry_pin_entry}
 account_balances.available_balance,"CON Your balances are as follows:
-balance: %{available_balance} %{token_symbol}
+%{available_balance} %{token_symbol}
 0. Back","CON Salio zako ni zifuatazo:
-salio: %{available_balance} %{token_symbol}
+%{available_balance} %{token_symbol}
 0. Rudi","CON Utyalo waku ni uu:
 utyalo: %{available_balance} %{token_symbol}
 0. Syoka itina","CON Matigari maku ni maya:
@@ -659,9 +659,11 @@ first_transaction_set,"CON %{first_transaction_set}
 1. Dhuur
 00. Bai"
 middle_transaction_set,"CON %{middle_transaction_set}
 11. Next
 22. Previous
 00. Exit","CON %{middle_transaction_set}
 11. Mbele
 22. Rudi
 00. Ondoka","CON %{middle_transaction_set}
@@ -681,8 +683,10 @@ middle_transaction_set,"CON %{middle_transaction_set}
 2. Dheebi
 00. Bai"
 last_transaction_set,"CON %{last_transaction_set}
 22. Previous
 00. Exit","CON %{last_transaction_set}
 22. Rudi
 00. Ondoka","CON %{last_transaction_set}
 2. Itina
