Compare commits
4 commits: cic-eth-se...sohail/con
| Author | SHA1 | Date |
|---|---|---|
| | def5cf0a1f | |
| | 4304ca8603 | |
| | 635bab6896 | |
| | 0b5363cd99 | |
.gitignore (vendored, 1 change)

@@ -15,4 +15,3 @@ build/
.idea
**/.vim
**/*secret.yaml
.envrc
Code of Conduct document:

@@ -1,4 +1,6 @@
-# Contributor Covenant Code of Conduct
+# Code of Conduct
+
+By contributing toward this project you agree to the following code of conduct.

## Our Pledge

@@ -14,21 +16,21 @@ appearance, race, religion, or sexual identity and orientation.
Examples of behavior that contributes to creating a positive environment
include:

-* Using welcoming and inclusive language
-* Being respectful of differing viewpoints and experiences
-* Gracefully accepting constructive criticism
-* Focusing on what is best for the community
-* Showing empathy towards other community members
+- Using welcoming and inclusive language
+- Being respectful of differing viewpoints and experiences
+- Gracefully accepting constructive criticism
+- Focusing on what is best for the community
+- Showing empathy towards other community members

Examples of unacceptable behavior by participants include:

-* The use of sexualized language or imagery and unwelcome sexual attention or
+- The use of sexualized language or imagery and unwelcome sexual attention or
  advances
-* Trolling, insulting/derogatory comments, and personal or political attacks
-* Public or private harassment
-* Publishing others' private information, such as a physical or electronic
+- Trolling, insulting/derogatory comments, and personal or political attacks
+- Public or private harassment
+- Publishing others' private information, such as a physical or electronic
  address, without explicit permission
-* Other conduct which could reasonably be considered inappropriate in a
+- Other conduct which could reasonably be considered inappropriate in a
  professional setting

## Our Responsibilities

@@ -55,7 +57,7 @@ a project may be further defined and clarified by project maintainers.
## Enforcement

Instances of abusive, harassing, or otherwise unacceptable behavior may be
-reported by contacting the project team at [INSERT EMAIL ADDRESS]. All
+reported by contacting the project team at info@grassecon.org. All
complaints will be reviewed and investigated and will result in a response that
is deemed necessary and appropriate to the circumstances. The project team is
obligated to maintain confidentiality with regard to the reporter of an incident.

@@ -75,9 +77,8 @@ available at https://www.contributor-covenant.org/version/1/4/code-of-conduct.ht
For answers to common questions about this code of conduct, see
https://www.contributor-covenant.org/faq

+...Try to keep in mind the immortal
+words of Bill and Ted, "Be excellent to each other."
+
+These documents are licensed under Creative Commons Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) https://creativecommons.org/licenses/by-sa/4.0/
+And further, any software Grassroots Economics supports is licensed under the GNU GENERAL PUBLIC LICENSE Version 3 https://www.gnu.org/licenses/
Contribution guidelines:

@@ -1,16 +1,66 @@
-Hello and welcome to the CIC Stack repository. Targeted for use with the ethereum virtual machine and a ussd capable telecom provider.
+# Contribution Guidelines

-__To request a change to the code please fork this repository and sumbit a merge request.__
+**Karibu sana!** Hello and welcome to the Grassroots Economics [project](https://gitlab.com/grassrootseconomics). This guide will help you understand the overall
+organization of the CIC project to get you started with contributing to the project. You'll be able to pick up issues, write code to fix them, and get your work reviewed and merged.

-__If there is a Grassroots Economics Kanban Issue please include that in our MR it will help us track contributions. Karibu sana!__
+## Table of Contents

-__Visit the Development Kanban board here: https://gitlab.com/grassrootseconomics/cic-internal-integration/-/boards/2419764__
+- [Prerequisites](#Prerequisites)
+- [Scope of contributions](#Scope-of-contributions)
+- [Issues](#Issues)
+  - [Bugs](#Bugs)
+  - [Good first issues](#Good-first-issues)
+- [Git guidelines](#Git-guidelines)
+- [Merge request process](#Merge-request-process)
+- [Environment setup](#Environment-setup)
+- [Code style](#Code-style)
+- [Building and testing](#Building-and-testing)

-__Ask a question in our dev chat:__
+## Prerequisites

-[Mattermost](https://chat.grassrootseconomics.net/cic/channels/dev)
+Before contributing to the CIC Stack, ensure you have:

-[Discord](https://discord.gg/XWunwAsX)
+- Read and agreed to abide by the Code of Conduct
+  <!-- TODO: Add link to CoC -->
+- Read and accepted the Developers Certificate of Origin
+  <!-- TODO: Add link to DCO -->

-[Matrix, IRC soon?]
+## Scope of contributions

+We generally welcome all sorts of contributions. No contribution is too small. We gladly accept contributions such as:

+- Documentation improvements, including inline documentation or even minor typos
+- Answering questions in issue discussions and pull requests
+- Fixing known bugs on our issue tracker
+- Reporting unknown bugs
+  <!-- TODO: Add responsible disclosure guidelines -->
+- Security issues through responsible disclosure

+## Issues

+Use the existing issue template and issue tags to correctly categorize an issue when creating it. Make sure to search for duplicates before creating a new issue.

+### Bugs

+If a bug is encountered and you can reproduce it:

+- Define its priority with an issue tag
+- Describe the steps to reproduce it, providing the version of the affected service/library and your environment setup

+### Good first issues

+These are pre-tagged issues which are friendly (low barrier to entry and clearly defined tasks) for new contributors to get acquainted with the project.

+## Git guidelines

+<!-- TODO: Review after finalizing branching or forking model -->

+## Merge request process

+<!-- TODO: Review after core_contrib -->

+## Environment setup

+## Code style

+## Building and testing
Core team contribution guide (file deleted):

@@ -1,117 +0,0 @@
-# CORE TEAM CONTRIBUTION GUIDE
-
-# 1. Transparency
-
-1.1 Use work logs for reflection of work done, aswell as telling your peers about changes that may affect their own tasks
-
-1.2 A work log SHOULD be submitted after a "unit of work" is complete.
-
-1.2.1 A "unit of work" should not span more than one full day's worth of work.
-
-1.2.2 A "unit of work" should be small enough that the log entries give useful insight.
-
-1.3 Individual logs are reviewed in weekly meetings
-
-<!--1.4 Bullet point list of topics and one or more sub-points describing each item in short sentences, eg;
-
-```
-- Core
-  * fixed foo
-  * fixed bar
-- Frontend
-  * connected bar to baz
-
-```-->
-
-1.4 Work log format is defined in []()
-
-1.5 Link to issue/MR in bullet point where appropriate
-
-1.6
-
-
-# 2. Code hygiene
-
-2.1 Keep function names and variable names short
-
-2.2 Keep code files, functions and test fixtures short
-
-2.3 The less magic the better. Recombinable and replaceable is king
-
-2.4 Group imports by `standard`, `external`, `local`, `test` - in that order
-
-2.5 Only auto-import when necessary, and always with a minimum of side-effects
-
-2.6 Use custom errors. Let them bubble up
-
-2.7 No logs in tight loops
-
-2.8 Keep executable main routine minimal. Pass variables (do not use globals) in main business logic function
-
-2.9 Test coverage MUST be kept higher than 90% after changes
-
-2.10 Docstrings. Always. Always!
-
-
-# 3. Versioning
-
-3.1 Use [Semantic Versioning](https://semver.org/)
-
-3.2 When merging code, explicit dependencies SHOULD NOT use pre-release version
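Rule 3.2 rests on Semantic Versioning's ordering: a pre-release such as `0.2.1-alpha.2` sorts before the final `0.2.1`, so pinning one in merged code risks silently lagging the release. A quick illustration using the `semver` package already pinned in this diff's requirements (`semver==2.13.0`):

```python
import semver

# Pre-releases sort before the corresponding release...
assert semver.compare('0.2.1-alpha.2', '0.2.1') == -1
# ...and order among themselves by their pre-release identifiers.
assert semver.compare('0.2.1-alpha.2', '0.2.1-alpha.3') == -1
```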
-
-# 4. Issues
-
-4.1 Issue title should use [Convention Commit structure](https://www.conventionalcommits.org/en/v1.0.0-beta.2/)
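For reference, the Conventional Commit structure linked in 4.1 prefixes a title with a type and an optional scope. Hypothetical examples of issue titles following it:

```
feat(cic-cache): add per-account transaction data endpoint
fix(cic-eth): normalize wallet addresses before cache lookup
docs: correct anchors in contribution guidelines
```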
-
-4.2 Issues need proper problem statement
-
-4.2.1. What is the current state
-
-4.2.2. If current state is not behaving as expected, what was the expected state
-
-4.2.3. What is the desired new state.
-
-4.3 Issues need proper resolution statement
-
-4.3.1. Bullet point list of short sentences describing practical steps to reach desired state
-
-4.3.2. Builet point list of external resources informing the issue and resolution
-
-4.4 Tasks needs to be appropriately labelled using GROUP labels.
-
-
-# 5. Code submission
-
-5.1 A branch and new MR is always created BEFORE THE WORK STARTS
-
-5.2 An MR should solve ONE SINGLE PART of a problem
-
-5.3 Every MR should have at least ONE ISSUE associated with it. Ideally issue can be closed when MR is merged
-
-5.4 MRs should not be open for more than one week (during normal operation periods)
-
-5.5 MR should ideally not be longer than 400 lines of changes of logic
-
-5.6 MRs that MOVE or DELETE code should not CHANGE that same code in a single MR. Scope MOVEs and DELETEs in separate commits (or even better, separate MRs) for transparency
-
-
-# 6. Code reviews
-
-6.1 At least one peer review before merge
-
-6.2 If MR is too long, evaluate whether this affects the quality of the review negatively. If it does, expect to be asked to split it up
-
-6.3 Evaluate changes against associated issues' problem statement and proposed resolution steps. If there is a mismatch, either MR needs to change or issue needs to be amended accordingly
-
-6.4 Make sure all technical debt introduced by MR is documented in issues. Add them according to criteria in section ISSUES if not
-
-6.5 If CI is not working, reviewer MUST make sure code builds and runs
-
-6.6 Behave!
-
-6.6.1 Don't be a jerk
-
-6.6.2 Don't block needlessly
-
-6.6.3 Say please
README.md (27 changes)

@@ -1,19 +1,24 @@
-# Community Inclusion Currency Stack (CIC Stack)
+## Community Inclusion Currency Stack (CIC Stack)

-A custodial evm wallet for executing transactions via USSD
+A custodial wallet and blockchain bidirectional interface engine for community inclusion currencies. Check the [CIC Stack summary](https://docs.grassecon.org/software) page for more information.

-## Getting started
+### Contributing

-This repo uses docker-compose and docker buildkit. Set the following environment variables to get started:
+See our [contribution guidelines](https://docs.grassecon.org/community/contrib/) for more details on:

-```
-export COMPOSE_DOCKER_CLI_BUILD=1
-export DOCKER_BUILDKIT=1
-```
+- Environment setup
+- Source Code Management (SCM) guidelines
+- Filing issues

-To get started see [./apps/contract-migration/README.md](./apps/contract-migration/README.md)
+#### Code of Conduct

-## Documentation
+This project is released with a [Code of Conduct](https://docs.grassecon.org/community/conduct/). By participating in this project you agree to abide by its terms.

-[https://docs.grassecon.org/cic_stack/](https://docs.grassecon.org/cic_stack/)
+### Community

+- [Mattermost](https://chat.grassrootseconomics.net/cic/channels/dev)
+- [Discord](https://discord.gg/ud32KMgH76)

+### License

+Licensed under [GNU AGPLv3](https://gitlab.com/grassrootseconomics/cic-internal-integration/-/blob/master/LICENSE).
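The removed "Getting started" text above is still the relevant local-build recipe. A minimal sketch of how those variables are used, assuming a compose file at the repository root (the build commands themselves are not part of this diff):

```sh
# From the removed README section: enable BuildKit for docker-compose builds
export COMPOSE_DOCKER_CLI_BUILD=1
export DOCKER_BUILDKIT=1

# Illustrative only; actual service names depend on the compose file
docker-compose build
docker-compose up -d
```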
MANIFEST.in:

@@ -1 +1 @@
-include *requirements.txt cic_cache/data/config/* cic_cache/db/migrations/default/* cic_cache/db/migrations/default/versions/*
+include *requirements.txt cic_cache/data/config/*
Config ini:

@@ -1,4 +1,4 @@
[cic]
registry_address =
trust_address =
-health_modules =
+health_modules = cic_eth.check.db,cic_eth.check.redis,cic_eth.check.signer,cic_eth.check.gas
Database config ini:

@@ -3,8 +3,7 @@ engine =
driver =
host =
port =
-#name = cic-cache
-prefix =
+name = cic-cache
user =
password =
debug = 0
cic_cache/db/__init__.py:

@@ -9,26 +9,21 @@ from .list import (
        tag_transaction,
        add_tag,
    )
-from cic_cache.db.models.base import SessionBase


logg = logging.getLogger()


-def dsn_from_config(config, name):
+def dsn_from_config(config):
    scheme = config.get('DATABASE_ENGINE')
    if config.get('DATABASE_DRIVER') != None:
        scheme += '+{}'.format(config.get('DATABASE_DRIVER'))

-    database_name = name
-    if config.get('DATABASE_PREFIX'):
-        database_name = '{}_{}'.format(config.get('DATABASE_PREFIX'), database_name)
    dsn = ''
    if config.get('DATABASE_ENGINE') == 'sqlite':
-        SessionBase.poolable = False
        dsn = '{}:///{}'.format(
                scheme,
-                database_name,
+                config.get('DATABASE_NAME'),
                )

    else:
@@ -38,7 +33,7 @@ def dsn_from_config(config, name):
                config.get('DATABASE_PASSWORD'),
                config.get('DATABASE_HOST'),
                config.get('DATABASE_PORT'),
-                database_name,
+                config.get('DATABASE_NAME'),
                )
    logg.debug('parsed dsn from config: {}'.format(dsn))
    return dsn
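The change above removes the `name` argument and the `DATABASE_PREFIX` handling, so the database name comes straight from `DATABASE_NAME` in configuration. A sketch of the resulting DSN for the non-sqlite branch, using a plain dict in place of the confini config object and assuming the conventional `user:password@host:port/name` layout (the full format string is elided in the hunk; values are taken from the docker database.ini added later in this diff):

```python
# Stand-in for the confini config read by dsn_from_config.
config = {
    'DATABASE_ENGINE': 'postgresql',
    'DATABASE_DRIVER': 'psycopg2',
    'DATABASE_USER': 'grassroots',
    'DATABASE_PASSWORD': '',
    'DATABASE_HOST': 'localhost',
    'DATABASE_PORT': '63432',
    'DATABASE_NAME': 'cic_cache',
}

scheme = '{}+{}'.format(config['DATABASE_ENGINE'], config['DATABASE_DRIVER'])
dsn = '{}://{}:{}@{}:{}/{}'.format(
    scheme,
    config['DATABASE_USER'],
    config['DATABASE_PASSWORD'],
    config['DATABASE_HOST'],
    config['DATABASE_PORT'],
    config['DATABASE_NAME'],
)
print(dsn)  # postgresql+psycopg2://grassroots:@localhost:63432/cic_cache
```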
cic_cache/runnable/daemons/query.py:

@@ -5,11 +5,7 @@ import re
import base64

# external imports
-from hexathon import (
-    add_0x,
-    strip_0x,
-)
-from chainlib.encode import TxHexNormalizer
+from hexathon import add_0x

# local imports
from cic_cache.cache import (
@@ -20,72 +16,27 @@ from cic_cache.cache import (
logg = logging.getLogger(__name__)
+#logg = logging.getLogger()

-re_transactions_all_bloom = r'/tx/?(\d+)?/?(\d+)?/?(\d+)?/?(\d+)?/?'
+re_transactions_all_bloom = r'/tx/(\d+)?/?(\d+)/?'
re_transactions_account_bloom = r'/tx/user/((0x)?[a-fA-F0-9]+)(/(\d+)(/(\d+))?)?/?'
-re_transactions_all_data = r'/txa/?(\d+)?/?(\d+)?/?(\d+)?/?(\d+)?/?'
-re_transactions_account_data = r'/txa/user/((0x)?[a-fA-F0-9]+)(/(\d+)(/(\d+))?)?/?'
-re_default_limit = r'/defaultlimit/?'
+re_transactions_all_data = r'/txa/(\d+)?/?(\d+)/?'

DEFAULT_LIMIT = 100

-tx_normalize = TxHexNormalizer()
-
-def parse_query_account(r):
-    address = strip_0x(r[1])
-    #address = tx_normalize.wallet_address(address)
-    limit = DEFAULT_LIMIT
-    g = r.groups()
-    if len(g) > 3:
-        limit = int(r[4])
-        if limit == 0:
-            limit = DEFAULT_LIMIT
-    offset = 0
-    if len(g) > 4:
-        offset = int(r[6])
-
-    logg.debug('account query is address {} offset {} limit {}'.format(address, offset, limit))
-
-    return (address, offset, limit,)
-
-
-# r is an re.Match
-def parse_query_any(r):
-    limit = DEFAULT_LIMIT
-    offset = 0
-    block_offset = None
-    block_end = None
-    if r.lastindex != None:
-        if r.lastindex > 0:
-            limit = int(r[1])
-        if r.lastindex > 1:
-            offset = int(r[2])
-        if r.lastindex > 2:
-            block_offset = int(r[3])
-        if r.lastindex > 3:
-            block_end = int(r[4])
-            if block_end < block_offset:
-                raise ValueError('cart before the horse, dude')
-
-    logg.debug('data query is offset {} limit {} block_offset {} block_end {}'.format(offset, limit, block_offset, block_end))
-
-    return (offset, limit, block_offset, block_end,)
-
-
-def process_default_limit(session, env):
-    r = re.match(re_default_limit, env.get('PATH_INFO'))
-    if not r:
-        return None
-
-    return ('application/json', str(DEFAULT_LIMIT).encode('utf-8'),)
-
-
def process_transactions_account_bloom(session, env):
    r = re.match(re_transactions_account_bloom, env.get('PATH_INFO'))
    if not r:
        return None
-    logg.debug('match account bloom')

-    (address, offset, limit,) = parse_query_account(r)
+    address = r[1]
+    if r[2] == None:
+        address = add_0x(address)
+    offset = 0
+    if r.lastindex > 2:
+        offset = r[4]
+    limit = DEFAULT_LIMIT
+    if r.lastindex > 4:
+        limit = r[6]

    c = BloomCache(session)
    (lowest_block, highest_block, bloom_filter_block, bloom_filter_tx) = c.load_transactions_account(address, offset, limit)
@@ -108,9 +59,13 @@ def process_transactions_all_bloom(session, env):
    r = re.match(re_transactions_all_bloom, env.get('PATH_INFO'))
    if not r:
        return None
-    logg.debug('match all bloom')

-    (limit, offset, block_offset, block_end,) = parse_query_any(r)
+    offset = DEFAULT_LIMIT
+    if r.lastindex > 0:
+        offset = r[1]
+    limit = 0
+    if r.lastindex > 1:
+        limit = r[2]

    c = BloomCache(session)
    (lowest_block, highest_block, bloom_filter_block, bloom_filter_tx) = c.load_transactions(offset, limit)
@@ -133,16 +88,17 @@ def process_transactions_all_data(session, env):
    r = re.match(re_transactions_all_data, env.get('PATH_INFO'))
    if not r:
        return None
-    #if env.get('HTTP_X_CIC_CACHE_MODE') != 'all':
-    #    return None
-    logg.debug('match all data')
+    if env.get('HTTP_X_CIC_CACHE_MODE') != 'all':
+        return None

+    logg.debug('got data request {}'.format(env))

-    (offset, limit, block_offset, block_end) = parse_query_any(r)
+    block_offset = r[1]
+    block_end = r[2]
+    if int(r[2]) < int(r[1]):
+        raise ValueError('cart before the horse, dude')

    c = DataCache(session)
-    (lowest_block, highest_block, tx_cache) = c.load_transactions_with_data(offset, limit, block_offset, block_end, oldest=True) # oldest needs to be settable
+    (lowest_block, highest_block, tx_cache) = c.load_transactions_with_data(0, 0, block_offset, block_end, oldest=True) # oldest needs to be settable

    for r in tx_cache:
        r['date_block'] = r['date_block'].timestamp()
@@ -157,30 +113,3 @@ def process_transactions_all_data(session, env):
    j = json.dumps(o)

    return ('application/json', j.encode('utf-8'),)
-
-
-def process_transactions_account_data(session, env):
-    r = re.match(re_transactions_account_data, env.get('PATH_INFO'))
-    if not r:
-        return None
-    logg.debug('match account data')
-    #if env.get('HTTP_X_CIC_CACHE_MODE') != 'all':
-    #    return None
-
-    (address, offset, limit,) = parse_query_account(r)
-
-    c = DataCache(session)
-    (lowest_block, highest_block, tx_cache) = c.load_transactions_account_with_data(address, offset, limit)
-
-    for r in tx_cache:
-        r['date_block'] = r['date_block'].timestamp()
-
-    o = {
-        'low': lowest_block,
-        'high': highest_block,
-        'data': tx_cache,
-    }
-
-    j = json.dumps(o)
-
-    return ('application/json', j.encode('utf-8'),)
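The route regexes above act as both dispatch table and argument parser: `re.match` group positions carry limit, offset and block range. A standalone illustration of the four optional capture groups in the removed-side `/tx` route (stdlib only):

```python
import re

# Removed-side route: groups are limit, offset, block_offset, block_end.
re_transactions_all_bloom = r'/tx/?(\d+)?/?(\d+)?/?(\d+)?/?(\d+)?/?'

r = re.match(re_transactions_all_bloom, '/tx/42/13/1337/1453')
print(r.groups())  # ('42', '13', '1337', '1453')

# Trailing segments may be omitted; the later groups are then None,
# which is what parse_query_any's r.lastindex checks handle.
r = re.match(re_transactions_all_bloom, '/tx/42')
print(r.groups())  # ('42', None, None, None)
```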
WSGI server daemon:

@@ -12,20 +12,21 @@ import cic_cache.cli
from cic_cache.db import dsn_from_config
from cic_cache.db.models.base import SessionBase
from cic_cache.runnable.daemons.query import (
    process_default_limit,
    process_transactions_account_bloom,
    process_transactions_account_data,
    process_transactions_all_bloom,
    process_transactions_all_data,
)
import cic_cache.cli

logging.basicConfig(level=logging.WARNING)
logg = logging.getLogger()

rootdir = os.path.dirname(os.path.dirname(os.path.realpath(__file__)))
dbdir = os.path.join(rootdir, 'cic_cache', 'db')
migrationsdir = os.path.join(dbdir, 'migrations')

-arg_flags = cic_cache.cli.argflag_std_read
-local_arg_flags = cic_cache.cli.argflag_local_sync | cic_cache.cli.argflag_local_task
+# process args
+arg_flags = cic_cache.cli.argflag_std_base
+local_arg_flags = cic_cache.cli.argflag_local_task
argparser = cic_cache.cli.ArgumentParser(arg_flags)
argparser.process_local_flags(local_arg_flags)
args = argparser.parse_args()
@@ -34,7 +35,7 @@ args = argparser.parse_args()
config = cic_cache.cli.Config.from_args(args, arg_flags, local_arg_flags)

# connect to database
-dsn = dsn_from_config(config, 'cic_cache')
+dsn = dsn_from_config(config)
SessionBase.connect(dsn, config.true('DATABASE_DEBUG'))


@@ -46,11 +47,9 @@ def application(env, start_response):

    session = SessionBase.create_session()
    for handler in [
        process_transactions_account_data,
-        process_transactions_account_bloom,
        process_transactions_all_data,
        process_transactions_all_bloom,
-        process_default_limit,
+        process_transactions_account_bloom,
    ]:
        r = None
        try:
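`application(env, start_response)` is a plain WSGI callable and `uwsgi==2.0.19.1` is pinned in the requirements, so the server daemon would typically be run under uwsgi. An illustrative invocation (the module path and port are assumptions, not taken from this diff):

```sh
uwsgi --http :8000 --module cic_cache.runnable.daemons.server
```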
Task worker daemon (celery):

@@ -3,7 +3,6 @@ import logging
import os
import sys
import argparse
import tempfile

# third-party imports
import celery
@@ -29,7 +28,7 @@ args = argparser.parse_args()
config = cic_cache.cli.Config.from_args(args, arg_flags, local_arg_flags)

# connect to database
-dsn = dsn_from_config(config, 'cic_cache')
+dsn = dsn_from_config(config)
SessionBase.connect(dsn)

# set up celery

Tracker daemon (rpc):

@@ -50,7 +50,7 @@ args = argparser.parse_args()
config = cic_cache.cli.Config.from_args(args, arg_flags, local_arg_flags)

# connect to database
-dsn = dsn_from_config(config, 'cic_cache')
+dsn = dsn_from_config(config)
SessionBase.connect(dsn, debug=config.true('DATABASE_DEBUG'))

# set up rpc
cic_cache/version.py:

@@ -5,7 +5,7 @@ version = (
    0,
    2,
    1,
-    'alpha.3',
+    'alpha.2',
)

version_object = semver.VersionInfo(
apps/cic-cache/config/celery.ini (new file, 3 lines)
@@ -0,0 +1,3 @@
+[celery]
+broker_url = redis:///
+result_url = redis:///

apps/cic-cache/config/cic.ini (new file, 3 lines)
@@ -0,0 +1,3 @@
+[cic]
+registry_address =
+trust_address =

apps/cic-cache/config/database.ini (new file, 9 lines)
@@ -0,0 +1,9 @@
+[database]
+NAME=cic_cache
+USER=postgres
+PASSWORD=
+HOST=localhost
+PORT=5432
+ENGINE=postgresql
+DRIVER=psycopg2
+DEBUG=0

apps/cic-cache/config/docker/celery.ini (new file, 3 lines)
@@ -0,0 +1,3 @@
+[celery]
+broker_url = redis://localhost:63379
+result_url = redis://localhost:63379

apps/cic-cache/config/docker/cic.ini (new file, 3 lines)
@@ -0,0 +1,3 @@
+[cic]
+registry_address =
+trust_address = 0xEb3907eCad74a0013c259D5874AE7f22DcBcC95C

apps/cic-cache/config/docker/database.ini (new file, 9 lines)
@@ -0,0 +1,9 @@
+[database]
+NAME=cic_cache
+USER=grassroots
+PASSWORD=
+HOST=localhost
+PORT=63432
+ENGINE=postgresql
+DRIVER=psycopg2
+DEBUG=0
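As the Dockerfile comment later in this diff notes, these ini files are read by confini and every value can be overridden from the environment; the variable name is the section and key joined with an underscore, which is exactly the shape of the `config.get('DATABASE_NAME')` calls throughout this diff. A short sketch of that convention (directory path as added above; the env-prefix argument seen elsewhere in this diff is omitted here):

```python
import confini

# [database] NAME in database.ini surfaces as DATABASE_NAME,
# and an environment variable DATABASE_NAME would override it.
config = confini.Config('apps/cic-cache/config/docker')
config.process()
print(config.get('DATABASE_NAME'))  # cic_cache
```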
apps/cic-cache/config/docker/syncer.ini (new file, 4 lines)
@@ -0,0 +1,4 @@
+[syncer]
+loop_interval = 1
+offset = 0
+no_history = 0

apps/cic-cache/config/test/bancor.ini (new file, 2 lines)
@@ -0,0 +1,2 @@
+[bancor]
+dir =
Test config inis:

@@ -1,3 +1,4 @@
[cic]
registry_address =
+chain_spec =
trust_address =

@@ -1,5 +1,5 @@
[database]
-PREFIX=cic-cache-test
+NAME=cic-cache-test
USER=postgres
PASSWORD=
HOST=localhost
apps/cic-cache/config/test/eth.ini (new file, 5 lines)
@@ -0,0 +1,5 @@
+[eth]
+#ws_provider = ws://localhost:8546
+#ttp_provider = http://localhost:8545
+provider = http://localhost:8545
+#chain_id =
OpenAPI specification:

@@ -1,4 +1,4 @@
-openapi: "3.0.2"
+openapi: "3.0.3"
info:
  title: Grassroots Economics CIC Cache
  description: Cache of processed transaction data from Ethereum blockchain and worker queues
@@ -9,34 +9,17 @@ info:
    email: will@grassecon.org
  license:
    name: GPLv3
-  version: 0.2.0
+  version: 0.1.0

paths:
  /defaultlimit:
    summary: The default limit value of result sets.
    get:
      tags:
        - transactions
      description:
        Retrieve default limit
      operationId: limit.default
      responses:
        200:
          description: Limit query successful
          content:
            application/json:
              schema:
                $ref: "#/components/schemas/Limit"

-  /tx:
-    summary: Bloom filter for batch of latest transactions
-    description: Generate a bloom filter of the latest transactions in the cache. The number of maximum number of transactions returned is returned by the `/defaultlimit` API call.
+  /tx/{offset}/{limit}:
+    description: Bloom filter for batch of latest transactions
    get:
      tags:
        - transactions
      description:
        Retrieve transactions
-      operationId: tx.get.latest
+      operationId: tx.get
      responses:
        200:
          description: Transaction query successful.
@@ -46,153 +29,27 @@ paths:
              schema:
                $ref: "#/components/schemas/BlocksBloom"

  /tx/{limit}:
    summary: Bloom filter for batch of latest transactions
    description: Generate a bloom filter of the latest transactions in the cache. If `limit` is 0, the number of maximum number of transactions returned is returned by the `/defaultlimit` API call.
    get:
      tags:
        - transactions
      description:
        Retrieve transactions
      operationId: tx.get.latest.limit
      responses:
        200:
          description: Transaction query successful. Results are ordered from newest to oldest.
          content:
            application/json:
              schema:
                $ref: "#/components/schemas/BlocksBloom"
    parameters:
      - name: limit
        in: path
        required: true
        schema:
          type: integer
          format: int32

  /tx/{limit}/{offset}:
    summary: Bloom filter for batch of latest transactions
    description: Generate a bloom filter of the latest transactions in the cache. If `limit` is 0, the number of maximum number of transactions returned is returned by the `/defaultlimit` API call.
    get:
      tags:
        - transactions
      description:
        Retrieve transactions
      operationId: tx.get.latest.range
      responses:
        200:
          description: Transaction query successful. Results are ordered from newest to oldest.
          content:
            application/json:
              schema:
                $ref: "#/components/schemas/BlocksBloom"
    parameters:
      - name: limit
        in: path
        required: true
        schema:
          type: integer
          format: int32
      - name: offset
        in: path
        required: true
        schema:
          type: integer
          format: int32

  /tx/{limit}/{offset}/{block_offset}:
    summary: Bloom filter for batch of transactions since a particular block.
    description: Generate a bloom filter of the latest transactions since a particular block in the cache. The block parameter is inclusive. If `limit` is 0, the number of maximum number of transactions returned is returned by the `/defaultlimit` API call.
    get:
      tags:
        - transactions
      description:
        Retrieve transactions
      operationId: tx.get.latest.range.block.offset
      responses:
        200:
          description: Transaction query successful. Results are ordered from oldest to newest.
          content:
            application/json:
              schema:
                $ref: "#/components/schemas/BlocksBloom"

    parameters:
      - name: limit
        in: path
        required: true
        schema:
          type: integer
          format: int32
      - name: offset
        in: path
        required: true
        schema:
          type: integer
          format: int32
      - name: block_offset
        in: path
        required: true
        schema:
          type: integer
          format: int32

-  /tx/{limit}/{offset}/{block_offset}/{block_end}:
-    summary: Bloom filter for batch of transactions within a particular block range.
-    description: Generate a bloom filter of the latest transactions within a particular block range in the cache. The block parameters are inclusive. If `limit` is 0, the number of maximum number of transactions returned is returned by the `/defaultlimit` API call.
+  /tx/{address}/{offset}/{limit}:
+    description: Bloom filter for batch of latest transactions by account
    get:
      tags:
        - transactions
      description:
        Retrieve transactions
      operationId: tx.get.latest.range.block.range
      responses:
        200:
          description: Transaction query successful.
          content:
            application/json:
              schema:
                $ref: "#/components/schemas/BlocksBloom"

    parameters:
      - name: limit
        in: path
        required: true
        schema:
          type: integer
          format: int32
      - name: offset
        in: path
        required: true
        schema:
          type: integer
          format: int32
      - name: block_offset
        in: path
        required: true
        schema:
          type: integer
          format: int32
      - name: block_end
        in: path
        required: true
        schema:
          type: integer
          format: int32

  /tx/{address}:
    summary: Bloom filter for batch of latest transactions by account.
    description: Generate a bloom filter of the latest transactions where a specific account is the spender or beneficiary.
    get:
      tags:
        - transactions
      description:
        Retrieve transactions
-      operationId: tx.get.user
+      operationId: tx.get
      responses:
        200:
          description: Transaction query successful.
@@ -201,30 +58,6 @@ paths:
          schema:
            $ref: "#/components/schemas/BlocksBloom"

    parameters:
      - name: address
        in: path
        required: true
        schema:
          type: string

  /tx/{address}/{limit}:
    summary: Bloom filter for batch of latest transactions by account.
    description: Generate a bloom filter of the latest transactions where a specific account is the spender or beneficiary. If `limit` is 0, the number of maximum number of transactions returned is returned by the `/defaultlimit` API call.
    get:
      tags:
        - transactions
      description:
        Retrieve transactions
      operationId: tx.get.user.limit
      responses:
        200:
          description: Transaction query successful.
          content:
            application/json:
              schema:
                $ref: "#/components/schemas/BlocksBloom"

    parameters:
      - name: address
@@ -232,317 +65,26 @@
        in: path
        required: true
        schema:
          type: string
      - name: limit
        in: path
        required: true
        schema:
          type: integer
          format: int32

  /tx/{address}/{limit}/{offset}:
    summary: Bloom filter for batch of latest transactions by account
    description: Generate a bloom filter of the latest transactions where a specific account is the spender or beneficiary. If `limit` is 0, the number of maximum number of transactions returned is returned by the `/defaultlimit` API call.
    get:
      tags:
        - transactions
      description:
        Retrieve transactions
      operationId: tx.get.user.range
      responses:
        200:
          description: Transaction query successful.
          content:
            application/json:
              schema:
                $ref: "#/components/schemas/BlocksBloom"

    parameters:
      - name: address
        in: path
        required: true
        schema:
          type: string
      - name: limit
        in: path
        required: true
        schema:
          type: integer
          format: int32
      - name: offset
        in: path
        required: true
        schema:
          type: integer
          format: int32

  /txa:
    summary: Cached data for latest transactions.
    description: Return data entries of the latest transactions in the cache. The number of maximum number of transactions returned is returned by the `/defaultlimit` API call.
    get:
      tags:
        - transactions
      description:
        Retrieve transactions
      operationId: txa.get.latest
      responses:
        200:
          description: Transaction query successful.
          content:
            application/json:
              schema:
                $ref: "#/components/schemas/TransactionList"

  /txa/{limit}:
    summary: Cached data for latest transactions.
    description: Return data entries of the latest transactions in the cache. If `limit` is 0, the number of maximum number of transactions returned is returned by the `/defaultlimit` API call.
    get:
      tags:
        - transactions
      description:
        Retrieve transactions
      operationId: txa.get.latest.limit
      responses:
        200:
          description: Transaction query successful.
          content:
            application/json:
              schema:
                $ref: "#/components/schemas/TransactionList"

    parameters:
      - name: limit
        in: path
        required: true
        schema:
          type: integer
          format: int32

  /txa/{limit}/{offset}:
    summary: Cached data for latest transactions.
    description: Return data entries of the latest transactions in the cache. If `limit` is 0, the number of maximum number of transactions returned is returned by the `/defaultlimit` API call.
    get:
      tags:
        - transactions
      description:
        Retrieve transactions
      operationId: txa.get.latest.range
      responses:
        200:
          description: Transaction query successful.
          content:
            application/json:
              schema:
                $ref: "#/components/schemas/TransactionList"

    parameters:
      - name: limit
        in: path
        required: true
        schema:
          type: integer
          format: int32
      - name: offset
        in: path
        required: true
        schema:
          type: integer
          format: int32

  /txa/{limit}/{offset}/{block_offset}:
    summary: Cached data for transactions since a particular block.
    description: Return cached data entries of transactions since a particular block. The block parameter is inclusive. If `limit` is 0, the number of maximum number of transactions returned is returned by the `/defaultlimit` API call.
    get:
      tags:
        - transactions
      description:
        Retrieve transactions
      operationId: txa.get.latest.range.block.offset
      responses:
        200:
          description: Transaction query successful.
          content:
            application/json:
              schema:
                $ref: "#/components/schemas/TransactionList"

    parameters:
      - name: limit
        in: path
        required: true
        schema:
          type: integer
          format: int32
      - name: offset
        in: path
        required: true
        schema:
          type: integer
          format: int32
      - name: block_offset
        in: path
        required: true
        schema:
          type: integer
          format: int32

  /txa/{limit}/{offset}/{block_offset}/{block_end}:
    summary: Cached data for transactions within a particular block range.
    description: Return cached data entries of transactions within a particular block range in the cache. The block parameters are inclusive. If `limit` is 0, the number of maximum number of transactions returned is returned by the `/defaultlimit` API call.
    get:
      tags:
        - transactions
      description:
        Retrieve transactions
      operationId: txa.get.latest.range.block.range
      responses:
        200:
          description: Transaction query successful.
          content:
            application/json:
              schema:
                $ref: "#/components/schemas/TransactionList"

    parameters:
      - name: limit
        in: path
        required: true
        schema:
          type: integer
          format: int32
      - name: offset
        in: path
        required: true
        schema:
          type: integer
          format: int32
      - name: block_offset
        in: path
        required: true
        schema:
          type: integer
          format: int32
      - name: block_end
        in: path
        required: true
        schema:
          type: integer
          format: int32

  /txa/{address}:
    summary: Cached data for batch of latest transactions by account.
    description: Return cached data of the latest transactions where a specific account is the spender or beneficiary.
    get:
      tags:
        - transactions
      description:
        Retrieve transactions
      operationId: txa.get.user
      responses:
        200:
          description: Transaction query successful.
          content:
            application/json:
              schema:
                $ref: "#/components/schemas/TransactionList"

    parameters:
      - name: address
        in: path
        required: true
        schema:
          type: string

  /txa/{address}/{limit}:
    summary: Cached data for batch of latest transactions by account.
    description: Return cached data of the latest transactions where a specific account is the spender or beneficiary. If `limit` is 0, the number of maximum number of transactions returned is returned by the `/defaultlimit` API call.
    get:
      tags:
        - transactions
      description:
        Retrieve transactions
      operationId: txa.get.user.limit
      responses:
        200:
          description: Transaction query successful.
          content:
            application/json:
              schema:
                $ref: "#/components/schemas/TransactionList"

    parameters:
      - name: address
        in: path
        required: true
        schema:
          type: string
      - name: limit
        in: path
        required: true
        schema:
          type: integer
          format: int32

  /txa/{address}/{limit}/{offset}:
    summary: Cached data for batch of latest transactions by account.
    description: Return cached data of the latest transactions where a specific account is the spender or beneficiary. If `limit` is 0, the number of maximum number of transactions returned is returned by the `/defaultlimit` API call.
    get:
      tags:
        - transactions
      description:
        Retrieve transactions
      operationId: txa.get.user.range
      responses:
        200:
          description: Transaction query successful.
          content:
            application/json:
              schema:
                $ref: "#/components/schemas/TransactionList"

    parameters:
      - name: address
        in: path
        required: true
        schema:
          type: string
      - name: limit
        in: path
        required: true
        schema:
          type: integer
          format: int32
      - name: offset
        in: path
        required: true
        schema:
          type: integer
          format: int32

components:
  schemas:
    Limit:
      type: integer
      format: int32
    BlocksBloom:
      type: object
      properties:
        low:
-          type: integer
+          type: int
          format: int32
          description: The lowest block number included in the filter
        high:
          type: integer
          format: int32
          description: The highest block number included in the filter
        block_filter:
          type: string
          format: byte
@@ -555,89 +97,6 @@ components:
          type: string
          description: Hashing algorithm (currently only using sha256)
        filter_rounds:
-          type: integer
+          type: int
          format: int32
          description: Number of hash rounds used to create the filter
    TransactionList:
      type: object
      properties:
        low:
          type: integer
          format: int32
          description: The lowest block number included in the result set
        high:
          type: integer
          format: int32
          description: The highest block number included in the filter
        data:
          type: array
          description: Cached transaction data
          items:
            $ref: "#/components/schemas/Transaction"
    Transaction:
      type: object
      properties:
        block_number:
          type: integer
          format: int64
          description: Block number transaction was included in.
        tx_hash:
          type: string
          description: Transaction hash, in hex.
        date_block:
          type: integer
          format: int32
          description: Block timestamp.
        sender:
          type: string
          description: Spender address, in hex.
        recipient:
          type: string
          description: Beneficiary address, in hex.
        from_value:
          type: integer
          format: int64
          description: Value deducted from spender's balance.
        to_value:
          type: integer
          format: int64
          description: Value added to beneficiary's balance.
        source_token:
          type: string
          description: Network address of token in which `from_value` is denominated.
        destination_token:
          type: string
          description: Network address of token in which `to_value` is denominated.
        success:
          type: boolean
          description: Network consensus state on whether the transaction was successful or not.
        tx_type:
          type: string
          enum:
            - erc20.faucet
            - faucet.give_to

  examples:
    data_last:
      summary: Get the latest cached transactions, using the server's default limit.
      value: "/txa"

    data_limit:
      summary: Get the last 42 cached transactions.
      value: "/txa/42"

    data_range:
      summary: Get the next 42 cached transactions, starting from the 13th (zero-indexed).
      value: "/txa/42/13"

    data_range_block_offset:
      summary: Get the next 42 cached transactions, starting from block 1337 (inclusive).
      value: "/txa/42/0/1337"

    data_range_block_offset:
      summary: Get the next 42 cached transactions within blocks 1337 and 1453 (inclusive).
      value: "/txa/42/0/1337/1453"

    data_range_block_range:
      summary: Get the next 42 cached transactions after the 13th, within blocks 1337 and 1453 (inclusive).
      value: "/txa/42/13/1337/1453"
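Both versions of the spec return a `BlocksBloom` document from the `/tx` routes. A hedged sketch of a client round trip, assuming a locally running daemon on port 8000 and the 0.2.0-side `/tx/{limit}` route shape (host, port, and any keys beyond the schema properties shown above are assumptions):

```python
import base64
import json
import urllib.request

# Fetch a bloom filter over the latest 42 transactions.
with urllib.request.urlopen('http://localhost:8000/tx/42') as resp:
    bloom = json.loads(resp.read())

print(bloom['low'], bloom['high'])             # block range covered by the filter
raw = base64.b64decode(bloom['block_filter'])  # schema says format: byte (base64)
```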
Dockerfile:

@@ -4,9 +4,9 @@ FROM $DOCKER_REGISTRY/cic-base-images:python-3.8.6-dev-e8eb2ee2

COPY requirements.txt .

-ARG EXTRA_PIP_INDEX_URL=https://pip.grassrootseconomics.net
+ARG EXTRA_PIP_INDEX_URL="https://pip.grassrootseconomics.net"
ARG EXTRA_PIP_ARGS=""
-ARG PIP_INDEX_URL=https://pypi.org/simple
+ARG PIP_INDEX_URL="https://pypi.org/simple"

RUN --mount=type=cache,mode=0755,target=/root/.cache/pip \
    pip install --index-url $PIP_INDEX_URL \
@@ -14,9 +14,14 @@ RUN --mount=type=cache,mode=0755,target=/root/.cache/pip \
    --extra-index-url $EXTRA_PIP_INDEX_URL $EXTRA_PIP_ARGS \
    -r requirements.txt

-COPY . .
-RUN pip install . --extra-index-url $EXTRA_PIP_INDEX_URL
+COPY . .
+RUN python setup.py install
+
+# ini files in config directory defines the configurable parameters for the application
+# they can all be overridden by environment variables
+# to generate a list of environment variables from configuration, use: confini-dump -z <dir> (executable provided by confini package)
+#COPY config/ /usr/local/etc/cic-cache/
+
+# for db migrations
+COPY ./aux/wait-for-it/wait-for-it.sh ./
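The ARG lines make both pip indexes overridable at build time, and the `RUN --mount=type=cache` line requires BuildKit. An illustrative build invocation (registry value and image tag are assumptions):

```sh
DOCKER_BUILDKIT=1 docker build \
    --build-arg DOCKER_REGISTRY=registry.gitlab.com/grassrootseconomics \
    --build-arg EXTRA_PIP_INDEX_URL=https://pip.grassrootseconomics.net \
    -t cic-cache:local .
```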
Database migration entrypoint script:

@@ -2,5 +2,5 @@

set -e
>&2 echo executing database migration
-python scripts/migrate_cic_cache.py --migrations-dir /usr/local/share/cic-cache/alembic -vv
+python scripts/migrate.py --migrations-dir /usr/local/share/cic-cache/alembic -vv
set +e
requirements.txt:

@@ -1,15 +1,14 @@
alembic==1.4.2
-confini~=0.5.3
+confini>=0.3.6rc4,<0.5.0
uwsgi==2.0.19.1
-moolb~=0.2.0
-cic-eth-registry~=0.6.6
+moolb~=0.1.1b2
+cic-eth-registry~=0.6.1a1
SQLAlchemy==1.3.20
semver==2.13.0
psycopg2==2.8.6
celery==4.4.7
redis==3.5.3
-chainsyncer[sql]~=0.0.7
-erc20-faucet~=0.3.2
-chainlib-eth~=0.0.15
-eth-address-index~=0.2.4
-okota~=0.2.5
+chainsyncer[sql]>=0.0.6a3,<0.1.0
+erc20-faucet>=0.3.2a2, <0.4.0
+chainlib-eth>=0.0.9a14,<0.1.0
+eth-address-index>=0.2.3a4,<0.3.0
scripts/migrate_cic_cache.py → scripts/migrate.py:

@@ -1,55 +1,54 @@
-#!/usr/bin/python3
-
-# standard imports
+#!/usr/bin/python
import os
import argparse
import logging
import re

# external imports
import alembic
from alembic.config import Config as AlembicConfig
import confini

# local imports
from cic_cache.db import dsn_from_config
import cic_cache.cli

logging.basicConfig(level=logging.WARNING)
logg = logging.getLogger()

# BUG: the dbdir doesn't work after script install
-rootdir = os.path.dirname(os.path.dirname(os.path.realpath(cic_cache.__file__)))
+rootdir = os.path.dirname(os.path.dirname(os.path.realpath(__file__)))
dbdir = os.path.join(rootdir, 'cic_cache', 'db')
-default_migrations_dir = os.path.join(dbdir, 'migrations')
+migrationsdir = os.path.join(dbdir, 'migrations')
configdir = os.path.join(rootdir, 'cic_cache', 'data', 'config')

#config_dir = os.path.join('/usr/local/etc/cic-cache')

-arg_flags = cic_cache.cli.argflag_std_base
-local_arg_flags = cic_cache.cli.argflag_local_sync
-argparser = cic_cache.cli.ArgumentParser(arg_flags)
-argparser.process_local_flags(local_arg_flags)
-argparser.add_argument('--reset', action='store_true', help='downgrade before upgrading')
-argparser.add_argument('-f', '--force', action='store_true', help='force action')
-argparser.add_argument('--migrations-dir', dest='migrations_dir', default=default_migrations_dir, type=str, help='migrations directory')
+argparser = argparse.ArgumentParser()
+argparser.add_argument('-c', type=str, help='config file')
+argparser.add_argument('--env-prefix', default=os.environ.get('CONFINI_ENV_PREFIX'), dest='env_prefix', type=str, help='environment prefix for variables to overwrite configuration')
+argparser.add_argument('--migrations-dir', dest='migrations_dir', default=migrationsdir, type=str, help='path to alembic migrations directory')
+argparser.add_argument('-f', action='store_true', help='force action')
argparser.add_argument('-v', action='store_true', help='be verbose')
argparser.add_argument('-vv', action='store_true', help='be more verbose')
args = argparser.parse_args()

-extra_args = {
-    'reset': None,
-    'force': None,
-    'migrations_dir': None,
-}
-# process config
-config = cic_cache.cli.Config.from_args(args, arg_flags, local_arg_flags, extra_args=extra_args)
+if args.vv:
+    logging.getLogger().setLevel(logging.DEBUG)
+elif args.v:
+    logging.getLogger().setLevel(logging.INFO)

-migrations_dir = os.path.join(config.get('_MIGRATIONS_DIR'), config.get('DATABASE_ENGINE', 'default'))
+config = confini.Config(configdir, args.env_prefix)
+config.process()
+config.censor('PASSWORD', 'DATABASE')
+config.censor('PASSWORD', 'SSL')
+logg.debug('config:\n{}'.format(config))
+
+migrations_dir = os.path.join(args.migrations_dir, config.get('DATABASE_ENGINE'))
if not os.path.isdir(migrations_dir):
    logg.debug('migrations dir for engine {} not found, reverting to default'.format(config.get('DATABASE_ENGINE')))
    migrations_dir = os.path.join(args.migrations_dir, 'default')

# connect to database
-dsn = dsn_from_config(config, 'cic_cache')
+dsn = dsn_from_config(config)

logg.info('using migrations dir {}'.format(migrations_dir))
setup.cfg:

@@ -1,7 +1,6 @@
[metadata]
name = cic-cache
description = CIC Cache API and server
-version = 0.3.0a2
author = Louis Holbrook
author_email = dev@holbrook.no
url = https://gitlab.com/grassrootseconomics/cic-eth
@@ -35,7 +34,7 @@ packages =
    cic_cache.runnable.daemons
    cic_cache.runnable.daemons.filters
scripts =
-    ./scripts/migrate_cic_cache.py
+    ./scripts/migrate.py

[options.entry_points]
console_scripts =
setup.py:

@@ -1,39 +1,38 @@
from setuptools import setup

-# import configparser
+import configparser
import os
+import time

-# import time
+from cic_cache.version import (
+    version_object,
+    version_string
+)

-# from cic_cache.version import (
-#     version_object,
-#     version_string
-# )
-#
-# class PleaseCommitFirstError(Exception):
-#     pass
-#
-# def git_hash():
-#     import subprocess
-#     git_diff = subprocess.run(['git', 'diff'], capture_output=True)
-#     if len(git_diff.stdout) > 0:
-#         raise PleaseCommitFirstError()
-#     git_hash = subprocess.run(['git', 'rev-parse', 'HEAD'], capture_output=True)
-#     git_hash_brief = git_hash.stdout.decode('utf-8')[:8]
-#     return git_hash_brief
-#
-# version_string = str(version_object)
-#
-# try:
-#     version_git = git_hash()
-#     version_string += '+build.{}'.format(version_git)
-# except FileNotFoundError:
-#     time_string_pair = str(time.time()).split('.')
-#     version_string += '+build.{}{:<09d}'.format(
-#         time_string_pair[0],
-#         int(time_string_pair[1]),
-#     )
-# print('final version string will be {}'.format(version_string))
+class PleaseCommitFirstError(Exception):
+    pass
+
+def git_hash():
+    import subprocess
+    git_diff = subprocess.run(['git', 'diff'], capture_output=True)
+    if len(git_diff.stdout) > 0:
+        raise PleaseCommitFirstError()
+    git_hash = subprocess.run(['git', 'rev-parse', 'HEAD'], capture_output=True)
+    git_hash_brief = git_hash.stdout.decode('utf-8')[:8]
+    return git_hash_brief
+
+version_string = str(version_object)
+
+try:
+    version_git = git_hash()
+    version_string += '+build.{}'.format(version_git)
+except FileNotFoundError:
+    time_string_pair = str(time.time()).split('.')
+    version_string += '+build.{}{:<09d}'.format(
+        time_string_pair[0],
+        int(time_string_pair[1]),
+    )
+print('final version string will be {}'.format(version_string))

requirements = []
f = open('requirements.txt', 'r')
@@ -53,8 +52,9 @@ while True:
    test_requirements.append(l.rstrip())
f.close()


setup(
-    # version=version_string,
+    version=version_string,
    install_requires=requirements,
    tests_require=test_requirements,
)
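With `version=version_string` active again, the installed package carries a build metadata tag. A small illustration of the two branches above (the hash and base version are hypothetical; the format strings are the ones from setup.py):

```python
import time

version_base = '0.2.1-alpha.2'  # e.g. str(version_object) under semver

# Clean git tree available: tag with an abbreviated commit hash.
print('{}+build.{}'.format(version_base, '4304ca86'))
# -> 0.2.1-alpha.2+build.4304ca86

# git binary missing (FileNotFoundError): fall back to a timestamp tag.
pair = str(time.time()).split('.')
print('{}+build.{}{:<09d}'.format(version_base, pair[0], int(pair[1])))
```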
Test requirements:

@@ -7,4 +7,4 @@ pytest-celery==0.0.0a1
eth_tester==0.5.0b3
py-evm==0.3.0a20
sarafu-faucet~=0.0.7a1
-erc20-transfer-authorization~=0.3.6
+erc20-transfer-authorization>=0.3.5a1,<0.4.0
Test fixtures:

@@ -6,7 +6,6 @@ import datetime
# external imports
import pytest
import moolb
-from chainlib.encode import TxHexNormalizer

# local imports
from cic_cache import db
@@ -43,8 +42,6 @@ def txs(
        list_tokens,
    ):

-    tx_normalize = TxHexNormalizer()
-
    session = init_database

    tx_number = 13
@@ -57,10 +54,10 @@
            tx_hash_first,
            list_defaults['block'],
            tx_number,
-            tx_normalize.wallet_address(list_actors['alice']),
-            tx_normalize.wallet_address(list_actors['bob']),
-            tx_normalize.executable_address(list_tokens['foo']),
-            tx_normalize.executable_address(list_tokens['foo']),
+            list_actors['alice'],
+            list_actors['bob'],
+            list_tokens['foo'],
+            list_tokens['foo'],
            1024,
            2048,
            True,
@@ -77,10 +74,10 @@
            tx_hash_second,
            list_defaults['block']-1,
            tx_number,
-            tx_normalize.wallet_address(list_actors['diane']),
-            tx_normalize.wallet_address(list_actors['alice']),
-            tx_normalize.executable_address(list_tokens['foo']),
-            tx_normalize.wallet_address(list_tokens['foo']),
+            list_actors['diane'],
+            list_actors['alice'],
+            list_tokens['foo'],
+            list_tokens['foo'],
            1024,
            2048,
            False,
@@ -106,8 +103,6 @@ def more_txs(

    session = init_database

-    tx_normalize = TxHexNormalizer()
-
    tx_number = 666
    tx_hash = '0x' + os.urandom(32).hex()
    tx_signed = '0x' + os.urandom(128).hex()
@@ -120,10 +115,10 @@
            tx_hash,
            list_defaults['block']+2,
            tx_number,
-            tx_normalize.wallet_address(list_actors['alice']),
-            tx_normalize.wallet_address(list_actors['diane']),
-            tx_normalize.executable_address(list_tokens['bar']),
-            tx_normalize.executable_address(list_tokens['bar']),
+            list_actors['alice'],
+            list_actors['diane'],
+            list_tokens['bar'],
+            list_tokens['bar'],
            2048,
            4096,
            False,
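On the side that keeps them, the fixtures push every address through chainlib's `TxHexNormalizer` before it reaches the cache, so lookups compare canonicalized hex. A sketch of the two methods used throughout this diff (the exact canonical form, such as case folding or 0x handling, is chainlib's concern and is not asserted here):

```python
from chainlib.encode import TxHexNormalizer

tx_normalize = TxHexNormalizer()

address = '0xEb3907eCad74a0013c259D5874AE7f22DcBcC95C'
wallet = tx_normalize.wallet_address(address)      # for account addresses
token = tx_normalize.executable_address(address)   # for token/contract addresses
```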
Test configuration fixtures:

@@ -14,8 +14,7 @@ logg = logging.getLogger(__file__)
@pytest.fixture(scope='session')
def load_config():
    config_dir = os.path.join(root_dir, 'config/test')
-    schema_config_dir = os.path.join(root_dir, 'cic_cache', 'data', 'config')
-    conf = confini.Config(schema_config_dir, 'CICTEST', override_dirs=config_dir)
+    conf = confini.Config(config_dir, 'CICTEST')
    conf.process()
    logg.debug('config {}'.format(conf))
    return conf
@@ -24,15 +24,11 @@ def database_engine(
    if load_config.get('DATABASE_ENGINE') == 'sqlite':
        SessionBase.transactional = False
        SessionBase.poolable = False
-        name = 'cic_cache'
-        database_name = name
-        if load_config.get('DATABASE_PREFIX'):
-            database_name = '{}_{}'.format(load_config.get('DATABASE_PREFIX'), database_name)
        try:
-            os.unlink(database_name)
+            os.unlink(load_config.get('DATABASE_NAME'))
        except FileNotFoundError:
            pass
-    dsn = dsn_from_config(load_config, name)
+    dsn = dsn_from_config(load_config)
    SessionBase.connect(dsn, debug=load_config.true('DATABASE_DEBUG'))
    return dsn
@@ -14,7 +14,7 @@ def test_api_all_data(
|
||||
):
|
||||
|
||||
env = {
|
||||
'PATH_INFO': '/txa/100/0/410000/420000',
|
||||
'PATH_INFO': '/txa/410000/420000',
|
||||
'HTTP_X_CIC_CACHE_MODE': 'all',
|
||||
}
|
||||
j = process_transactions_all_data(init_database, env)
|
||||
@@ -23,7 +23,7 @@ def test_api_all_data(
|
||||
assert len(o['data']) == 2
|
||||
|
||||
env = {
|
||||
'PATH_INFO': '/txa/100/0/420000/410000',
|
||||
'PATH_INFO': '/txa/420000/410000',
|
||||
'HTTP_X_CIC_CACHE_MODE': 'all',
|
||||
}
|
||||
|
||||
|
||||
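The two hunks above drop the limit and offset segments from the bulk data endpoint: PATH_INFO moves from `/txa/<limit>/<offset>/<block_start>/<block_end>` to `/txa/<block_start>/<block_end>`. A minimal sketch of building the new-style request environment, assuming a WSGI-style handler like the suite's `process_transactions_all_data` (the helper name is illustrative, not part of the test suite):

```python
# Sketch only: mirrors the env dicts used in the tests above.
def make_txa_env(block_start, block_end, mode='all'):
    # New-style path: block range only, no limit/offset segments.
    return {
        'PATH_INFO': '/txa/{}/{}'.format(block_start, block_end),
        'HTTP_X_CIC_CACHE_MODE': mode,
    }

env = make_txa_env(410000, 420000)
assert env['PATH_INFO'] == '/txa/410000/420000'
```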
@@ -6,7 +6,6 @@ import json

# external imports
import pytest
from chainlib.encode import TxHexNormalizer

# local imports
from cic_cache import db
@@ -63,8 +62,6 @@ def test_cache_ranges(

session = init_database

tx_normalize = TxHexNormalizer()

oldest = list_defaults['block'] - 1
mid = list_defaults['block']
newest = list_defaults['block'] + 2
@@ -103,39 +100,32 @@ def test_cache_ranges(
assert b[1] == mid

# now check when supplying account
account = tx_normalize.wallet_address(list_actors['alice'])
b = c.load_transactions_account(account, 0, 100)
b = c.load_transactions_account(list_actors['alice'], 0, 100)
assert b[0] == oldest
assert b[1] == newest

account = tx_normalize.wallet_address(list_actors['bob'])
b = c.load_transactions_account(account, 0, 100)
b = c.load_transactions_account(list_actors['bob'], 0, 100)
assert b[0] == mid
assert b[1] == mid

account = tx_normalize.wallet_address(list_actors['diane'])
b = c.load_transactions_account(account, 0, 100)
b = c.load_transactions_account(list_actors['diane'], 0, 100)
assert b[0] == oldest
assert b[1] == newest

# add block filter to the mix
account = tx_normalize.wallet_address(list_actors['alice'])
b = c.load_transactions_account(account, 0, 100, block_offset=list_defaults['block'])
b = c.load_transactions_account(list_actors['alice'], 0, 100, block_offset=list_defaults['block'])
assert b[0] == mid
assert b[1] == newest

account = tx_normalize.wallet_address(list_actors['alice'])
b = c.load_transactions_account(account, 0, 100, block_offset=list_defaults['block'])
b = c.load_transactions_account(list_actors['alice'], 0, 100, block_offset=list_defaults['block'])
assert b[0] == mid
assert b[1] == newest

account = tx_normalize.wallet_address(list_actors['bob'])
b = c.load_transactions_account(account, 0, 100, block_offset=list_defaults['block'] - 1, block_limit=list_defaults['block'])
b = c.load_transactions_account(list_actors['bob'], 0, 100, block_offset=list_defaults['block'] - 1, block_limit=list_defaults['block'])
assert b[0] == mid
assert b[1] == mid

account = tx_normalize.wallet_address(list_actors['diane'])
b = c.load_transactions_account(account, 0, 100, block_offset=list_defaults['block'] - 1, block_limit=list_defaults['block'])
b = c.load_transactions_account(list_actors['diane'], 0, 100, block_offset=list_defaults['block'] - 1, block_limit=list_defaults['block'])
assert b[0] == oldest
assert b[1] == oldest

@@ -150,8 +140,6 @@ def test_cache_ranges_data(

session = init_database

tx_normalize = TxHexNormalizer()

oldest = list_defaults['block'] - 1
mid = list_defaults['block']
newest = list_defaults['block'] + 2
@@ -215,8 +203,7 @@ def test_cache_ranges_data(
assert b[2][1]['tx_hash'] == more_txs[1]

# now check when supplying account
account = tx_normalize.wallet_address(list_actors['alice'])
b = c.load_transactions_account_with_data(account, 0, 100)
b = c.load_transactions_account_with_data(list_actors['alice'], 0, 100)
assert b[0] == oldest
assert b[1] == newest
assert len(b[2]) == 3
@@ -224,15 +211,13 @@ def test_cache_ranges_data(
assert b[2][1]['tx_hash'] == more_txs[1]
assert b[2][2]['tx_hash'] == more_txs[2]

account = tx_normalize.wallet_address(list_actors['bob'])
b = c.load_transactions_account_with_data(account, 0, 100)
b = c.load_transactions_account_with_data(list_actors['bob'], 0, 100)
assert b[0] == mid
assert b[1] == mid
assert len(b[2]) == 1
assert b[2][0]['tx_hash'] == more_txs[1]

account = tx_normalize.wallet_address(list_actors['diane'])
b = c.load_transactions_account_with_data(account, 0, 100)
b = c.load_transactions_account_with_data(list_actors['diane'], 0, 100)
assert b[0] == oldest
assert b[1] == newest
assert len(b[2]) == 2
@@ -240,31 +225,27 @@ def test_cache_ranges_data(
assert b[2][1]['tx_hash'] == more_txs[2]

# add block filter to the mix
account = tx_normalize.wallet_address(list_actors['alice'])
b = c.load_transactions_account_with_data(account, 0, 100, block_offset=list_defaults['block'])
b = c.load_transactions_account_with_data(list_actors['alice'], 0, 100, block_offset=list_defaults['block'])
assert b[0] == mid
assert b[1] == newest
assert len(b[2]) == 2
assert b[2][0]['tx_hash'] == more_txs[0]
assert b[2][1]['tx_hash'] == more_txs[1]

account = tx_normalize.wallet_address(list_actors['alice'])
b = c.load_transactions_account_with_data(account, 0, 100, block_offset=list_defaults['block'])
b = c.load_transactions_account_with_data(list_actors['alice'], 0, 100, block_offset=list_defaults['block'])
assert b[0] == mid
assert b[1] == newest
assert len(b[2]) == 2
assert b[2][0]['tx_hash'] == more_txs[0]
assert b[2][1]['tx_hash'] == more_txs[1]

account = tx_normalize.wallet_address(list_actors['bob'])
b = c.load_transactions_account_with_data(account, 0, 100, block_offset=list_defaults['block'] - 1, block_limit=list_defaults['block'])
b = c.load_transactions_account_with_data(list_actors['bob'], 0, 100, block_offset=list_defaults['block'] - 1, block_limit=list_defaults['block'])
assert b[0] == mid
assert b[1] == mid
assert len(b[2]) == 1
assert b[2][0]['tx_hash'] == more_txs[1]

account = tx_normalize.wallet_address(list_actors['diane'])
b = c.load_transactions_account_with_data(account, 0, 100, block_offset=list_defaults['block'] - 1, block_limit=list_defaults['block'])
b = c.load_transactions_account_with_data(list_actors['diane'], 0, 100, block_offset=list_defaults['block'] - 1, block_limit=list_defaults['block'])
assert b[0] == oldest
assert b[1] == oldest
assert len(b[2]) == 1

@@ -82,7 +82,7 @@ def test_query_regex(
[
('alice', None, None, [(420000, 13), (419999, 42)]),
('alice', None, 1, [(420000, 13)]),
('alice', 1, 1, [(419999, 42)]), # 420000 == list_defaults['block']
('alice', 1, None, [(419999, 42)]), # 420000 == list_defaults['block']
('alice', 2, None, []), # 420000 == list_defaults['block']
],
)
@@ -107,11 +107,10 @@ def test_query_process_txs_account(
path_info = '/tx/user/0x' + strip_0x(actor)
if query_offset != None:
path_info += '/' + str(query_offset)
if query_limit == None:
query_limit = 100
path_info += '/' + str(query_limit)
if query_offset == None:
path_info += '/0'
if query_limit != None:
if query_offset == None:
path_info += '/0'
path_info += '/' + str(query_limit)
env = {
'PATH_INFO': path_info,
}
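The hunk above reworks how the per-account test composes PATH_INFO, which takes the shape `/tx/user/0x<address>[/<offset>[/<limit>]]`. One possible reading of the revised branching, written out as a standalone sketch (the helper name is illustrative and not part of the suite):

```python
def make_user_path(actor_hex, query_offset=None, query_limit=None):
    # /tx/user/0x<address>[/<offset>[/<limit>]]; when only a limit is
    # given, a default offset of 0 is inserted so the limit segment
    # stays in the right position.
    path_info = '/tx/user/0x' + actor_hex
    if query_offset is not None:
        path_info += '/' + str(query_offset)
    if query_limit is not None:
        if query_offset is None:
            path_info += '/0'
        path_info += '/' + str(query_limit)
    return path_info

assert make_user_path('ab' * 20, 1, 100) == '/tx/user/0x' + 'ab' * 20 + '/1/100'
```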
@@ -193,7 +192,7 @@ def test_query_process_txs_bloom(
@pytest.mark.parametrize(
'query_block_start, query_block_end, query_match_count',
[
(1, 42, 0),
(None, 42, 0),
(420000, 420001, 1),
(419999, 419999, 1), # matches are inclusive
(419999, 420000, 2),
@@ -212,7 +211,7 @@ def test_query_process_txs_data(
query_match_count,
):

path_info = '/txa/100/0'
path_info = '/txa'
if query_block_start != None:
path_info += '/' + str(query_block_start)
if query_block_end != None:
@@ -228,5 +227,4 @@ def test_query_process_txs_data(
assert r != None

o = json.loads(r[1])
logg.debug('oo {}'.format(o))
assert len(o['data']) == query_match_count

@@ -1,5 +1,5 @@
celery==4.4.7
erc20-demurrage-token~=0.0.6
cic-eth-registry~=0.6.3
chainlib~=0.0.14
cic_eth~=0.12.6
erc20-demurrage-token~=0.0.5a3
cic-eth-registry~=0.6.1a6
chainlib~=0.0.9rc1
cic_eth~=0.12.4a11

@@ -1,6 +1,6 @@
[metadata]
name = cic-eth-aux-erc20-demurrage-token
version = 0.0.3
version = 0.0.2a7
description = cic-eth tasks supporting erc20 demurrage token
author = Louis Holbrook
author_email = dev@holbrook.no

@@ -1,36 +1 @@
# CIC-ETH

## Testing CIC-ETH locally

### Set up a Virtual Env

```bash
python3 -m venv ./venv # Python 3.9
source ./venv/bin/activate
```

### Running All Unit Tests

```bash
bash ./tests/run_tests.sh # This will also install required dependencies
```

### Running Specific Unit Tests

Ensure that:

- You have run `bash ./tests/run_tests.sh` at least once, or have installed the required dependencies with the pip command below
- You have activated the virtual environment

```bash
pip install --extra-index-url https://pip.grassrootseconomics.net --extra-index-url https://gitlab.com/api/v4/projects/27624814/packages/pypi/simple \
-r admin_requirements.txt \
-r services_requirements.txt \
-r test_requirements.txt
```

Here is an example that runs only the tests matching the keyword (`-k`) `test_server`:

```bash
pytest -s -v --log-cli-level DEBUG --log-level DEBUG -k test_server
```

@@ -1,4 +1,5 @@
SQLAlchemy==1.3.20
hexathon~=0.1.0
chainqueue~=0.0.6a4
eth-erc20~=0.1.5
cic-eth-registry>=0.6.1a6,<0.7.0
hexathon~=0.0.1a8
chainqueue>=0.0.4a6,<0.1.0
eth-erc20>=0.1.2a2,<0.2.0

@@ -6,8 +6,11 @@
# standard imports
import logging

# external imports
# external imports
import celery
from chainlib.chain import ChainSpec
from hexathon import strip_0x

# local imports
from cic_eth.api.base import ApiBase
from cic_eth.enum import LockEnum

@@ -63,32 +63,22 @@ class Config(BaseConfig):
config.get('REDIS_HOST'),
config.get('REDIS_PORT'),
)
db = getattr(args, 'redis_db', None)
if db != None:
db = str(db)

redis_url = (
'redis',
hostport,
db,
getattr(args, 'redis_db', None),
)


celery_config_url = urllib.parse.urlsplit(config.get('CELERY_BROKER_URL'))
hostport = urlhostmerge(
celery_config_url[1],
getattr(args, 'celery_host', None),
getattr(args, 'celery_port', None),
)
db = getattr(args, 'redis_db', None)
if db != None:
db = str(db)
celery_arg_url = (
getattr(args, 'celery_scheme', None),
hostport,
db,
getattr(args, 'celery_db', None),
)

celery_url = urlmerge(redis_url, celery_config_url, celery_arg_url)
celery_url_string = urllib.parse.urlunsplit(celery_url)
local_celery_args_override['CELERY_BROKER_URL'] = celery_url_string

@@ -22,7 +22,7 @@ from hexathon import (
from chainqueue.error import NotLocalTxError
from eth_erc20 import ERC20
from chainqueue.sql.tx import cache_tx_dict
from okota.token_index.index import to_identifier
from okota.token_index import to_identifier

# local imports
from cic_eth.db.models.base import SessionBase
@@ -46,14 +46,13 @@ from cic_eth.task import (
from cic_eth.eth.nonce import CustodialTaskNonceOracle
from cic_eth.encode import tx_normalize
from cic_eth.eth.trust import verify_proofs
from cic_eth.error import SignerError

celery_app = celery.current_app
logg = logging.getLogger()


@celery_app.task(bind=True, base=CriticalWeb3Task)
def balance(self, tokens, holder_address, chain_spec_dict):
@celery_app.task(base=CriticalWeb3Task)
def balance(tokens, holder_address, chain_spec_dict):
"""Return token balances for a list of tokens for given address

:param tokens: Token addresses
@@ -72,9 +71,8 @@ def balance(self, tokens, holder_address, chain_spec_dict):
for t in tokens:
address = t['address']
logg.debug('address {} {}'.format(address, holder_address))
gas_oracle = self.create_gas_oracle(rpc, min_price=self.min_fee_price)
token = ERC20Token(chain_spec, rpc, add_0x(address))
c = ERC20(chain_spec, gas_oracle=gas_oracle)
c = ERC20(chain_spec)
o = c.balance_of(address, holder_address, sender_address=caller_address)
r = rpc.do(o)
t['balance_network'] = c.parse_balance(r)

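The balance task above reads each token contract directly over JSON-RPC. A minimal standalone sketch of that read path with the same eth-erc20/chainlib calls the hunk uses; the node URL, chain spec string, and addresses are placeholders, and the gas oracle argument is dropped in the newer variant since `balance_of` is a read-only call:

```python
from chainlib.chain import ChainSpec
from chainlib.eth.connection import EthHTTPConnection
from eth_erc20 import ERC20

# Illustrative values only; substitute a real node and real addresses.
chain_spec = ChainSpec.from_chain_str('evm:byzantium:8996:bloxberg')
rpc = EthHTTPConnection('http://localhost:8545')
token_address = '0x' + '00' * 20   # placeholder token contract
holder_address = '0x' + '11' * 20  # placeholder wallet
caller_address = holder_address

c = ERC20(chain_spec)
o = c.balance_of(token_address, holder_address, sender_address=caller_address)
r = rpc.do(o)                  # raw JSON-RPC response
balance = c.parse_balance(r)   # integer balance in the token's smallest unit
```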
@@ -92,7 +92,7 @@ def apply_gas_value_cache_local(address, method, value, tx_hash, session=None):

if o == None:
o = GasCache(address, method, value, tx_hash)
elif value > o.value:
elif tx.gas_used > o.value:
o.value = value
o.tx_hash = strip_0x(tx_hash)


@@ -1,12 +0,0 @@
# standard imports
import os
import random
import uuid





def blockchain_address() -> str:
return os.urandom(20).hex().lower()

@@ -1,132 +0,0 @@
# standard imports
import os

# external imports
import pytest
from celery import uuid
# test imports
from cic_eth.pytest.helpers.accounts import blockchain_address


@pytest.fixture(scope='function')
def task_uuid():
return uuid()

@pytest.fixture(scope='function')
def default_token_data(foo_token_symbol, foo_token):
return {
'symbol': foo_token_symbol,
'address': foo_token,
'name': 'Giftable Token',
'decimals': 6,
"converters": []
}

@pytest.fixture(scope='function')
def mock_account_creation_task_request(mocker, task_uuid):
def mock_request(self):
mocked_task_request = mocker.patch('celery.app.task.Task.request')
mocked_task_request.id = task_uuid
return mocked_task_request
mocker.patch('cic_eth.api.api_task.Api.create_account', mock_request)


@pytest.fixture(scope='function')
def mock_account_creation_task_result(mocker, task_uuid):
def task_result(self):
sync_res = mocker.patch('celery.result.AsyncResult')
sync_res.id = task_uuid
sync_res.get.return_value = blockchain_address()
return sync_res
mocker.patch('cic_eth.api.api_task.Api.create_account', task_result)

@pytest.fixture(scope='function')
def mock_token_api_query(foo_token_symbol, foo_token, mocker, task_uuid):
def mock_query(self, token_symbol, proof=None):
sync_res = mocker.patch('celery.result.AsyncResult')
sync_res.id = task_uuid
sync_res.get.return_value = [
{
'address': foo_token,
'converters': [],
'decimals': 6,
'name': 'Giftable Token',
'proofs': ['5b1549818725ca07c19fc47fda5d8d85bbebb1283855d5ab99785dcb7d9051d3'],
'symbol': foo_token_symbol,
},{'5b1549818725ca07c19fc47fda5d8d85bbebb1283855d5ab99785dcb7d9051d3': ['Eb3907eCad74a0013c259D5874AE7f22DcBcC95C','Eb3907eCad74a0013c259D5874AE7f22DcBcC95C']}
]
return sync_res
mocker.patch('cic_eth.api.api_task.Api.token', mock_query)

@pytest.fixture(scope='function')
def mock_tokens_api_query(foo_token_symbol, foo_token, mocker, task_uuid):
def mock_query(self, token_symbol, proof=None):
sync_res = mocker.patch('celery.result.AsyncResult')
sync_res.id = task_uuid
sync_res.get.return_value = [[
{
'address': foo_token,
'converters': [],
'decimals': 6,
'name': 'Giftable Token',
'proofs': ['5b1549818725ca07c19fc47fda5d8d85bbebb1283855d5ab99785dcb7d9051d3'],
'symbol': foo_token_symbol,
},{'5b1549818725ca07c19fc47fda5d8d85bbebb1283855d5ab99785dcb7d9051d3': ['Eb3907eCad74a0013c259D5874AE7f22DcBcC95C','Eb3907eCad74a0013c259D5874AE7f22DcBcC95C']}
], [
{
'address': foo_token,
'converters': [],
'decimals': 6,
'name': 'Giftable Token',
'proofs': ['5b1549818725ca07c19fc47fda5d8d85bbebb1283855d5ab99785dcb7d9051d3'],
'symbol': foo_token_symbol,
},{'5b1549818725ca07c19fc47fda5d8d85bbebb1283855d5ab99785dcb7d9051d3': ['Eb3907eCad74a0013c259D5874AE7f22DcBcC95C','Eb3907eCad74a0013c259D5874AE7f22DcBcC95C']}
]]
return sync_res
mocker.patch('cic_eth.api.api_task.Api.tokens', mock_query)


@pytest.fixture(scope='function')
def mock_sync_balance_api_query(balances, mocker, task_uuid):
def sync_api_query(self, address: str, token_symbol: str):
sync_res = mocker.patch('celery.result.AsyncResult')
sync_res.id = task_uuid
sync_res.get.return_value = balances
return sync_res
mocker.patch('cic_eth.api.api_task.Api.balance', sync_api_query)


@pytest.fixture(scope='function')
def mock_sync_default_token_api_query(default_token_data, mocker, task_uuid):
def mock_query(self):
sync_res = mocker.patch('celery.result.AsyncResult')
sync_res.id = task_uuid
sync_res.get.return_value = default_token_data
return sync_res
mocker.patch('cic_eth.api.api_task.Api.default_token', mock_query)


@pytest.fixture(scope='function')
def mock_transaction_list_query(mocker):
query_args = {}

def mock_query(self, address: str, limit: int):
query_args['address'] = address
query_args['limit'] = limit

mocker.patch('cic_eth.api.api_task.Api.list', mock_query)
return query_args


@pytest.fixture(scope='function')
def mock_transfer_api(mocker):
transfer_args = {}

def mock_transfer(self, from_address: str, to_address: str, value: int, token_symbol: str):
transfer_args['from_address'] = from_address
transfer_args['to_address'] = to_address
transfer_args['value'] = value
transfer_args['token_symbol'] = token_symbol

mocker.patch('cic_eth.api.api_task.Api.transfer', mock_transfer)
return transfer_args
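These fixtures patch the cic_eth task `Api` so the server tests can run without a live Celery worker. A hedged sketch of how a test might consume `mock_transfer_api`; the test body and the chain spec string are illustrative, not taken from the suite:

```python
from cic_eth.api.api_task import Api

def test_transfer_records_args(mock_transfer_api):
    # The fixture has already patched Api.transfer; calling it only
    # records the arguments into the dict the fixture returned.
    api = Api('evm:byzantium:8996:bloxberg')  # chain spec string is an assumption
    api.transfer('0x' + 'aa' * 20, '0x' + 'bb' * 20, 100, 'FOO')
    assert mock_transfer_api['value'] == 100
    assert mock_transfer_api['token_symbol'] == 'FOO'
```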
@@ -1,11 +1,15 @@
#!/usr/bin/python
import sys
import os
import logging
import uuid
import json
import argparse

# external imports
import redis
from xdg.BaseDirectory import xdg_config_home
from chainlib.chain import ChainSpec

# local imports
import cic_eth.cli

@@ -1,38 +0,0 @@
import logging

import cic_eth.cli
from cic_eth.server.app import create_app
from cic_eth.server.celery import create_celery_wrapper

arg_flags = cic_eth.cli.argflag_std_base
local_arg_flags = cic_eth.cli.argflag_local_taskcallback
argparser = cic_eth.cli.ArgumentParser(arg_flags)
argparser.process_local_flags(local_arg_flags)
args = argparser.parse_args()
config = cic_eth.cli.Config.from_args(args, arg_flags, local_arg_flags)
# Define log levels
if args.vv:
logging.getLogger().setLevel(logging.DEBUG)
elif args.v:
logging.getLogger().setLevel(logging.INFO)


# Setup Celery App
celery_app = cic_eth.cli.CeleryApp.from_config(config)
celery_app.set_default()


chain_spec = config.get('CHAIN_SPEC')
celery_queue = config.get('CELERY_QUEUE')
redis_host = config.get('REDIS_HOST')
redis_port = config.get('REDIS_PORT')
redis_db = config.get('REDIS_DB')
redis_timeout = config.get('REDIS_TIMEOUT')

celery_wrapper = create_celery_wrapper(celery_queue=celery_queue, chain_spec=chain_spec,
redis_db=redis_db, redis_host=redis_host, redis_port=redis_port, redis_timeout=redis_timeout)
app = create_app(celery_wrapper)

if __name__ == "__main__":
import uvicorn
uvicorn.run(app, host="0.0.0.0", port=5000, log_level="info")
@@ -1,3 +0,0 @@
from . import converters
from . import cache
from . import celery
@@ -1,119 +0,0 @@
import logging
import sys
from typing import List, Optional, Union

from cic_eth.server import cache, converters
from cic_eth.server.cache import setup_cache
from cic_eth.server.celery import create_celery_wrapper
from cic_eth.server.models import (DefaultToken, Token, TokenBalance,
Transaction)
from fastapi import FastAPI, Query

log = logging.getLogger(__name__)


def create_app(celery_wrapper):

app = FastAPI(debug=True,
title="Grassroots Economics",
description="CIC ETH API",
version="0.0.1",
terms_of_service="https://www.grassrootseconomics.org/pages/terms-and-conditions.html",
contact={
"name": "Grassroots Economics",
"url": "https://www.grassrootseconomics.org",
"email": "will@grassecon.org"
},
license_info={
"name": "GPLv3",
})

@app.get("/transactions", response_model=List[Transaction])
def transactions(address: str, limit: Optional[str] = 10):
return celery_wrapper('list', address, limit=limit)

@app.get("/balance", response_model=List[TokenBalance])
def balance(token_symbol: str, address: str = Query(..., title="Address", min_length=40, max_length=42), include_pending: bool = True):
log.info(f"address: {address}")
log.info(f"token_symbol: {token_symbol}")
data = celery_wrapper('balance', address, token_symbol,
include_pending=include_pending)
for b in data:
token = get_token(token_symbol)
b['balance_network'] = converters.from_wei(
token.decimals, int(b['balance_network']))
b['balance_incoming'] = converters.from_wei(
token.decimals, int(b['balance_incoming']))
b['balance_outgoing'] = converters.from_wei(
token.decimals, int(b['balance_outgoing']))

b.update({
"balance_available": int(b['balance_network']) + int(b['balance_incoming']) - int(b['balance_outgoing'])
})
return data

@app.post("/create_account")
def create_account(password: Optional[str] = None, register: bool = True):
data = celery_wrapper(
'create_account', password=password, register=register)
return data

# def refill_gas(start_response, query: dict):
#     address = query.pop('address')
#     data = celery_wrapper('refill_gas', address)
#     return data

# def ping(start_response, query: dict):
#     data = celery_wrapper('ping', **query)
#     return data

@app.post("/transfer")
def transfer(from_address: str, to_address: str, value: int, token_symbol: str):
token = get_token(
token_symbol)
wei_value = converters.to_wei(token.decimals, int(value))
data = celery_wrapper('transfer', from_address,
to_address, wei_value, token_symbol)
return data

@app.post("/transfer_from")
def transfer_from(from_address: str, to_address: str, value: int, token_symbol: str, spender_address: str):
token = get_token(
token_symbol)
wei_value = converters.to_wei(token.decimals, int(value))
data = celery_wrapper('transfer_from', from_address, to_address,
wei_value, token_symbol, spender_address)
return data

@app.get("/token", response_model=Token)
def token(token_symbol: str, proof: Optional[str] = None):
token = get_token(token_symbol)
if token == None:
sys.stderr.write(f"Cached Token {token_symbol} not found")
data = celery_wrapper('token', token_symbol, proof=proof)
token = Token.new(data)
sys.stderr.write(f"Token {token}")

return token

@app.get("/tokens", response_model=List[Token])
def tokens(token_symbols: Optional[List[str]] = Query(None), proof: Optional[Union[str, List[str], List[List[str]]]] = None):
data = celery_wrapper('tokens', token_symbols,
catch=len(token_symbols), proof=proof)
if data:
tokens = []
for token in data:
print(f"Token: {token}")
tokens.append(Token.new(token))
return tokens
return None

@app.get("/default_token", response_model=DefaultToken)
def default_token():
data = celery_wrapper('default_token')
return data

def get_token(token_symbol: str):
data = celery_wrapper('token', token_symbol)
return Token.new(data)
return app
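The `/balance` handler above derives `balance_available = balance_network + balance_incoming - balance_outgoing` after scaling each raw figure by the token's decimals. A worked sketch of that arithmetic with made-up values (the local `from_wei` mirrors the shape of `cic_eth.server.converters.from_wei`):

```python
token_decimals = 6
raw = {
    'balance_network': 30_000_000_000,  # smallest-unit values from the chain
    'balance_incoming': 0,
    'balance_outgoing': 100_000_000,
}

def from_wei(decimals, value):
    # divide by 10**decimals and truncate to two decimal places
    v = float(value) / (10 ** decimals)
    return int(v * 100) / 100.0

network = from_wei(token_decimals, raw['balance_network'])    # 30000.0
incoming = from_wei(token_decimals, raw['balance_incoming'])  # 0.0
outgoing = from_wei(token_decimals, raw['balance_outgoing'])  # 100.0
available = int(network) + int(incoming) - int(outgoing)      # 29900
```

These are the same figures the skipped `test_account` below expects (30000 before a 100-token transfer, 29900 after).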
@@ -1,130 +0,0 @@
# standard imports
import hashlib
import json
import logging
from typing import Optional, Union

from cic_eth.server.models import Token
from cic_types.condiments import MetadataPointer
from redis import Redis, StrictRedis

logg = logging.getLogger(__file__)


class Cache:
store: Redis = None


def setup_cache(redis_host, redis_port, redis_db):
# Define universal redis cache access
Cache.store = StrictRedis(
host=redis_host, port=redis_port, db=redis_db, decode_responses=True)


def get_token_data(token_symbol: str):
"""
:param token_symbol:
:type token_symbol:
:return:
:rtype:
"""
identifier = [token_symbol.encode('utf-8')]
key = cache_data_key(identifier, MetadataPointer.TOKEN_DATA)
logg.debug(f'Retrieving token data for: {token_symbol} at: {key}')
token_data_str = get_cached_data(key=key)
if(token_data_str is None):
logg.debug(f'No token data found for: {token_symbol}')
return None
else:
token_data = json.loads(token_data_str)
logg.debug(f'Retrieved token data: {token_data}')
return token_data


def set_token_data(token_symbol: str, token: dict):
"""
:param token_symbol:
:type token_symbol:
:return:
:rtype:
"""
identifier = [token_symbol.encode('utf-8')]
key = cache_data_key(identifier, MetadataPointer.TOKEN_DATA)
cache_data(key, json.dumps(token))
logg.debug(f'Cached token data for: {token_symbol} at: {key}')


def get_default_token() -> Optional[str]:
"""This function attempts to retrieve the default token's data from the redis cache.
:return:
:rtype:
"""
logg.debug(f'Retrieving default token from cache')
# TODO: What should the identifier be?
key = cache_data_key(identifier="ff".encode('utf-8'),
salt=MetadataPointer.TOKEN_DEFAULT)
default_token_str = get_cached_data(key=key)
if default_token_str is None:
logg.debug(f'No cached default token found: {key}')
return None
default_token = json.loads(default_token_str)
logg.debug(f'Retrieved default token data: {default_token}')
return default_token


def set_default_token(default_token: dict):
"""
:param default_token:
:type default_token:
:return:
:rtype:
"""
logg.debug(f'Setting default token in cache')
key = cache_data_key(identifier="ff".encode('utf-8'),
salt=MetadataPointer.TOKEN_DEFAULT)
cache_data(key, json.dumps(default_token))


def cache_data(key: str, data: str):
"""
:param key:
:type key:
:param data:
:type data:
:return:
:rtype:
"""
cache = Cache.store
cache.set(name=key, value=data)
cache.persist(name=key)
logg.debug(f'caching: {data} with key: {key}.')


def get_cached_data(key: str):
"""
:param key:
:type key:
:return:
:rtype:
"""
cache = Cache.store
return cache.get(name=key)


def cache_data_key(identifier: Union[list, bytes], salt: MetadataPointer):
"""
:param identifier:
:type identifier:
:param salt:
:type salt:
:return:
:rtype:
"""
hash_object = hashlib.new("sha256")
if isinstance(identifier, list):
for identity in identifier:
hash_object.update(identity)
else:
hash_object.update(identifier)
hash_object.update(salt.value.encode(encoding="utf-8"))
return hash_object.digest().hex()
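`cache_data_key` above derives a redis key as sha256 over the identifier bytes followed by the salt's string value. A standalone sketch of the same derivation using only hashlib; the salt string here is an assumption, since real values come from `cic_types.condiments.MetadataPointer`:

```python
import hashlib

def cache_data_key(identifier: bytes, salt_value: str) -> str:
    # sha256(identifier || salt), hex-encoded, as in the module above
    h = hashlib.new('sha256')
    h.update(identifier)
    h.update(salt_value.encode('utf-8'))
    return h.digest().hex()

# e.g. a token-data key for symbol FOO, with a placeholder salt value
key = cache_data_key('FOO'.encode('utf-8'), ':cic.token.data')
```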
@@ -1,64 +0,0 @@
import json
import logging
import sys
import uuid

import redis
from cic_eth.api.api_task import Api

log = logging.getLogger(__name__)


def create_celery_wrapper(chain_spec,
celery_queue,
redis_host,
redis_port,
redis_db,
redis_timeout):
def call(method, *args, catch=1, **kwargs):
""" Creates a redis channel and calls `cic_eth.api` with the provided `method` and `*args`. Returns the result of the api call. Catch allows you to specify how many messages to catch before returning.
"""
log.debug(f"Using redis: {redis_host}, {redis_port}, {redis_db}")
redis_channel = str(uuid.uuid4())
r = redis.Redis(redis_host, redis_port, redis_db)
ps = r.pubsub()
ps.subscribe(redis_channel)
api = Api(
chain_spec,
queue=celery_queue,
callback_param='{}:{}:{}:{}'.format(
redis_host, redis_port, redis_db, redis_channel),
callback_task='cic_eth.callbacks.redis.redis',
callback_queue=celery_queue,
)
getattr(api, method)(*args, **kwargs)

ps.get_message()
try:
data = []
if catch == 1:
message = ps.get_message(timeout=redis_timeout)
data = json.loads(message['data'])["result"]
raise data
else:
for _i in range(catch):
message = ps.get_message(
timeout=redis_timeout)
result = json.loads(message['data'])["result"]
data.append(result)

except TimeoutError as e:
sys.stderr.write(
f"cic_eth.api.{method}({args}, {kwargs}) timed out:\n {e}")
raise e
except Exception as e:
sys.stderr.write(
f'Unable to parse Data:\n{data}\n Error:\n{e}')
raise e

log.debug(
f"cic_eth.api.{method}(args={args}, kwargs={kwargs})\n {data}")

ps.unsubscribe()
return data
return call
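A hedged usage sketch of the wrapper returned by `create_celery_wrapper`; the host names, chain spec string, and queue name are placeholders, and a running redis plus a cic-eth worker are required for the call to return:

```python
# Illustrative wiring only, matching the signature of the factory above.
call = create_celery_wrapper(
    chain_spec='evm:byzantium:8996:bloxberg',  # assumption
    celery_queue='cic-eth',
    redis_host='localhost',
    redis_port=6379,
    redis_db=0,
    redis_timeout=20,
)

# Same calling convention the FastAPI handlers use.
balances = call('balance', '0x' + 'aa' * 20, 'FOO', include_pending=True)
```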
@@ -1,39 +0,0 @@
# Stolen from ussd

from math import trunc

def from_wei(decimals: int, value: int) -> float:
"""This function converts values in Wei to a token in the cic network.
:param decimals: The decimals required for wei values.
:type decimals: int
:param value: Value in Wei
:type value: int
:return: SRF equivalent of value in Wei
:rtype: float
"""
value = float(value) / (10**decimals)
return truncate(value=value, decimals=2)


def to_wei(decimals: int, value: int) -> int:
"""This function converts values from a token in the cic network to Wei.
:param decimals: The decimals required for wei values.
:type decimals: int
:param value: Value in SRF
:type value: int
:return: Wei equivalent of value in SRF
:rtype: int
"""
return int(value * (10**decimals))

def truncate(value: float, decimals: int) -> float:
"""This function truncates a value to a specified number of decimal places.
:param value: The value to be truncated.
:type value: float
:param decimals: The number of decimal places for the value to be truncated to
:type decimals: int
:return: The truncated value.
:rtype: int
"""
stepper = 10.0**decimals
return trunc(stepper*value) / stepper
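A worked example of the conversions above for a 6-decimal token: `to_wei` multiplies by 10^decimals, `from_wei` divides and truncates to two decimal places, and `truncate` truncates rather than rounds.

```python
# Assumes from_wei, to_wei and truncate from the module above are in scope.
assert to_wei(6, 100) == 100_000_000
assert from_wei(6, 100_000_000) == 100.0
# truncation, not rounding: 1.239999 becomes 1.23
assert truncate(1.239999, 2) == 1.23
```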
@@ -1,92 +0,0 @@
from __future__ import annotations

from typing import List, Optional

from pydantic import BaseModel, Field


class Transaction(BaseModel):
block_number: Optional[int] = Field(None, example=24531)
date_checked: Optional[str] = Field(
None, example='2021-11-12T09:36:40.725296')
date_created: Optional[str] = Field(
None, example='2021-11-12T09:36:40.131292')
date_updated: Optional[str] = Field(
None, example='2021-11-12T09:36:40.131292')
destination_token: Optional[str] = Field(
None, example=365185044137427460620354810422988491181438940190
)
destination_token_decimals: Optional[int] = Field(None, example=6)
destination_token_symbol: Optional[str] = Field(None, example='COFE')
from_value: Optional[int] = Field(None, example=100000000)
hash: Optional[str] = Field(
None,
example=90380195350511178677041624165156640995490505896556680958001954705731707874291,
)
nonce: Optional[int] = Field(None, example=1)
recipient: Optional[str] = Field(
None, example='872e1ec9d499b242ebfcfd0a279a4c3e0cd472c0'
)
sender: Optional[str] = Field(
None, example='1a92b05e0b880127a4c26ac0f68a52df3ac6b89d'
)
signed_tx: Optional[str] = Field(
None,
example=1601943273486236942256143665779318355236220334071247753507187634376562549990085710958441113013370129915441072693447256942510246386178938683325073160349857879326297351587330623503997011254644396580777843154770873208185332563272343361515226115860084201932230246018679661802320007832375955345977725551120479084062615799940692628221555193198194825737613358738414884130187144700126061702642574663703095161159219410608270,
)
source_token: Optional[str] = Field(
None, example=365185044137427460620354810422988491181438940190
)
source_token_decimals: Optional[int] = Field(None, example=6)
source_token_symbol: Optional[str] = Field(None, example='COFE')
status: Optional[str] = Field(None, example='SUCCESS')
status_code: Optional[int] = Field(None, example=4104)
timestamp: Optional[int] = Field(None, example=1636709800)
to_value: Optional[int] = Field(None, example=100000000)
tx_hash: Optional[str] = Field(
None,
example=90380195350511178677041624165156640995490505896556680958001954705731707874291,
)
tx_index: Optional[int] = Field(None, example=0)


class DefaultToken(BaseModel):
symbol: Optional[str] = Field(None, description='Token Symbol')
address: Optional[str] = Field(None, description='Token Address')
name: Optional[str] = Field(None, description='Token Name')
decimals: Optional[int] = Field(None, description='Decimals')


class TokenBalance(BaseModel):
address: Optional[str] = None
converters: Optional[List[str]] = None
balance_network: Optional[int] = None
balance_incoming: Optional[int] = None
balance_outgoing: Optional[int] = None
balance_available: Optional[int] = None


class Token(BaseModel):
decimals: Optional[int] = None
name: Optional[str] = None
address: Optional[str] = None
symbol: Optional[str] = None
proofs: Optional[List[str]] = None
converters: Optional[List[str]] = None
proofs_with_signers: Optional[List[Proof]] = None

@staticmethod
def new(data: List[dict]) -> Token:
proofs_with_signers = [{"proof": proof, "signers": signers}
for (proof, signers) in data[1].items()]
return Token(**data[0],
proofs_with_signers=proofs_with_signers,
)


class Proof(BaseModel):
proof: Optional[str] = None
signers: Optional[List[str]] = None


Token.update_forward_refs()
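`Token.new` above expects a two-element payload: the token fields dict followed by a proof-to-signers mapping, which it folds into `proofs_with_signers`. A minimal sketch with placeholder strings:

```python
# Placeholder values; real payloads carry proof hashes and signer addresses.
data = [
    {'symbol': 'FOO', 'address': '0x' + 'ee' * 20, 'decimals': 6,
     'name': 'Giftable Token', 'proofs': ['proof-hash'], 'converters': []},
    {'proof-hash': ['signer-address']},  # proof -> signers mapping
]
token = Token.new(data)
assert token.proofs_with_signers[0].proof == 'proof-hash'
```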
@@ -17,7 +17,7 @@ from cic_eth_registry.error import UnknownContractError
# local imports
from cic_eth.error import SeppukuError
from cic_eth.db.models.base import SessionBase
from cic_eth.eth.util import CacheGasOracle, MaxGasOracle
from cic_eth.eth.util import CacheGasOracle

#logg = logging.getLogger().getChild(__name__)
logg = logging.getLogger()
@@ -25,14 +25,12 @@ logg = logging.getLogger()
celery_app = celery.current_app



class BaseTask(celery.Task):

session_func = SessionBase.create_session
call_address = ZERO_ADDRESS
trusted_addresses = []
min_fee_price = 1
min_fee_limit = 30000
default_token_address = None
default_token_symbol = None
default_token_name = None
@@ -41,28 +39,21 @@ class BaseTask(celery.Task):


def create_gas_oracle(self, conn, address=None, *args, **kwargs):
x = None
if address is None:
x = RPCGasOracle(
if address == None:
return RPCGasOracle(
conn,
code_callback=kwargs.get('code_callback', self.get_min_fee_limit),
code_callback=kwargs.get('code_callback'),
min_price=self.min_fee_price,
id_generator=kwargs.get('id_generator'),
)
else:

x = MaxGasOracle(conn)
x.code_callback = x.get_fee_units

return x


def get_min_fee_limit(self, code):
return self.min_fee_limit


def get_min_fee_limit(self, code):
return self.min_fee_limit
return CacheGasOracle(
conn,
address,
method=kwargs.get('method'),
min_price=self.min_fee_price,
id_generator=kwargs.get('id_generator'),
)


def create_session(self):
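The old and new lines of the hunk above are interleaved, but the apparent intent is: with no contract address, fall back to an RPC gas oracle seeded with the task's minimum fee settings; with an address, use an oracle keyed to that contract's observed fee units. A minimal sketch of that selection under those assumptions, reusing the oracle names from the hunk (their constructors are assumptions based on the surrounding code):

```python
def create_gas_oracle(self, conn, address=None, *args, **kwargs):
    # Sketch only: RPC-backed oracle for plain gas transfers...
    if address is None:
        return RPCGasOracle(
            conn,
            code_callback=kwargs.get('code_callback', self.get_min_fee_limit),
            min_price=self.min_fee_price,
            id_generator=kwargs.get('id_generator'),
        )
    # ...and a contract-aware oracle when a token address is involved.
    oracle = MaxGasOracle(conn)
    oracle.code_callback = oracle.get_fee_units
    return oracle
```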
@@ -87,7 +78,7 @@ class BaseTask(celery.Task):
|
||||
)
|
||||
s.apply_async()
|
||||
|
||||
|
||||
|
||||
class CriticalTask(BaseTask):
|
||||
retry_jitter = True
|
||||
retry_backoff = True
|
||||
@@ -99,7 +90,7 @@ class CriticalSQLAlchemyTask(CriticalTask):
|
||||
sqlalchemy.exc.DatabaseError,
|
||||
sqlalchemy.exc.TimeoutError,
|
||||
sqlalchemy.exc.ResourceClosedError,
|
||||
)
|
||||
)
|
||||
|
||||
|
||||
class CriticalWeb3Task(CriticalTask):
|
||||
@@ -107,7 +98,7 @@ class CriticalWeb3Task(CriticalTask):
|
||||
ConnectionError,
|
||||
)
|
||||
safe_gas_threshold_amount = 60000 * 3
|
||||
safe_gas_refill_amount = safe_gas_threshold_amount * 5
|
||||
safe_gas_refill_amount = safe_gas_threshold_amount * 5
|
||||
safe_gas_gifter_balance = safe_gas_threshold_amount * 5 * 100
|
||||
|
||||
|
||||
@@ -125,13 +116,13 @@ class CriticalSQLAlchemyAndSignerTask(CriticalTask):
|
||||
sqlalchemy.exc.DatabaseError,
|
||||
sqlalchemy.exc.TimeoutError,
|
||||
sqlalchemy.exc.ResourceClosedError,
|
||||
)
|
||||
)
|
||||
|
||||
class CriticalWeb3AndSignerTask(CriticalWeb3Task):
|
||||
autoretry_for = (
|
||||
ConnectionError,
|
||||
)
|
||||
|
||||
|
||||
@celery_app.task()
|
||||
def check_health(self):
|
||||
pass
|
||||
|
||||
@@ -1,5 +0,0 @@
|
||||
[redis]
|
||||
host=redis
|
||||
database=0
|
||||
password=
|
||||
port=6379
|
||||
@@ -11,6 +11,13 @@ ARG EXTRA_PIP_INDEX_URL=https://pip.grassrootseconomics.net
|
||||
ARG EXTRA_PIP_ARGS=""
|
||||
ARG PIP_INDEX_URL=https://pypi.org/simple
|
||||
|
||||
RUN --mount=type=cache,mode=0755,target=/root/.cache/pip \
|
||||
pip install --index-url $PIP_INDEX_URL \
|
||||
--pre \
|
||||
--extra-index-url $EXTRA_PIP_INDEX_URL $EXTRA_PIP_ARGS \
|
||||
cic-eth-aux-erc20-demurrage-token~=0.0.2a7
|
||||
|
||||
|
||||
COPY *requirements.txt ./
|
||||
RUN --mount=type=cache,mode=0755,target=/root/.cache/pip \
|
||||
pip install --index-url $PIP_INDEX_URL \
|
||||
@@ -18,7 +25,7 @@ RUN --mount=type=cache,mode=0755,target=/root/.cache/pip \
|
||||
--extra-index-url $EXTRA_PIP_INDEX_URL $EXTRA_PIP_ARGS \
|
||||
-r requirements.txt \
|
||||
-r services_requirements.txt \
|
||||
-r admin_requirements.txt
|
||||
-r admin_requirements.txt
|
||||
|
||||
COPY . .
|
||||
RUN python setup.py install
|
||||
@@ -33,6 +40,8 @@ RUN chmod 755 *.sh
|
||||
# # they can all be overridden by environment variables
|
||||
# # to generate a list of environment variables from configuration, use: confini-dump -z <dir> (executable provided by confini package)
|
||||
#COPY config/ /usr/local/etc/cic-eth/
|
||||
COPY cic_eth/db/migrations/ /usr/local/share/cic-eth/alembic/
|
||||
COPY crypto_dev_signer_config/ /usr/local/etc/crypto-dev-signer/
|
||||
|
||||
# TODO this kind of code sharing across projects should be discouraged...can we make util a library?
|
||||
#COPY util/liveness/health.sh /usr/local/bin/health.sh
|
||||
@@ -57,8 +66,9 @@ ENTRYPOINT []
|
||||
## # they can all be overridden by environment variables
|
||||
## # to generate a list of environment variables from configuration, use: confini-dump -z <dir> (executable provided by confini package)
|
||||
#COPY config/ /usr/local/etc/cic-eth/
|
||||
COPY cic_eth/db/migrations/ /usr/local/share/cic-eth/alembic/
|
||||
#COPY scripts/ scripts/
|
||||
#COPY cic_eth/db/migrations/ /usr/local/share/cic-eth/alembic/
|
||||
#COPY crypto_dev_signer_config/ /usr/local/etc/crypto-dev-signer/
|
||||
#COPY scripts/ scripts/
|
||||
#
|
||||
## TODO this kind of code sharing across projects should be discouraged...can we make util a library?
|
||||
##COPY util/liveness/health.sh /usr/local/bin/health.sh
|
||||
|
||||
@@ -1,9 +1,4 @@
|
||||
celery==4.4.7
|
||||
chainlib-eth>=0.0.10a20,<0.1.0
|
||||
semver==2.13.0
|
||||
chainlib-eth~=0.0.15
|
||||
urlybird~=0.0.1
|
||||
cic-eth-registry~=0.6.6
|
||||
cic-types~=0.2.1a8
|
||||
cic-eth-aux-erc20-demurrage-token~=0.0.3
|
||||
fastapi[all]==0.70.1
|
||||
uvicorn[standard]<0.16.0
|
||||
urlybird~=0.0.1a2
|
||||
|
||||
@@ -1,15 +1,16 @@
|
||||
chainqueue~=0.0.6a4
|
||||
chainsyncer[sql]~=0.0.7
|
||||
chainqueue>=0.0.6a1,<0.1.0
|
||||
chainsyncer[sql]>=0.0.7a3,<0.1.0
|
||||
alembic==1.4.2
|
||||
confini~=0.5.3
|
||||
confini>=0.3.6rc4,<0.5.0
|
||||
redis==3.5.3
|
||||
hexathon~=0.1.0
|
||||
hexathon~=0.0.1a8
|
||||
pycryptodome==3.10.1
|
||||
liveness~=0.0.1a7
|
||||
eth-address-index~=0.2.4
|
||||
eth-accounts-index~=0.1.2
|
||||
erc20-faucet~=0.3.2
|
||||
erc20-transfer-authorization~=0.3.6
|
||||
sarafu-faucet~=0.0.7
|
||||
moolb~=0.2.0
|
||||
okota~=0.2.5
|
||||
eth-address-index>=0.2.4a1,<0.3.0
|
||||
eth-accounts-index>=0.1.2a3,<0.2.0
|
||||
cic-eth-registry>=0.6.1a6,<0.7.0
|
||||
erc20-faucet>=0.3.2a2,<0.4.0
|
||||
erc20-transfer-authorization>=0.3.5a2,<0.4.0
|
||||
sarafu-faucet>=0.0.7a2,<0.1.0
|
||||
moolb~=0.1.1b2
|
||||
okota>=0.2.4a6,<0.3.0
|
||||
|
||||
@@ -1,7 +1,7 @@
|
||||
[metadata]
|
||||
name = cic-eth
|
||||
#version = attr: cic_eth.version.__version_string__
|
||||
version = 0.12.7
|
||||
version = 0.12.5a2
|
||||
description = CIC Network Ethereum interaction
|
||||
author = Louis Holbrook
|
||||
author_email = dev@holbrook.no
|
||||
@@ -36,7 +36,6 @@ packages =
|
||||
cic_eth.db.models
|
||||
cic_eth.queue
|
||||
cic_eth.ext
|
||||
cic_eth.server
|
||||
cic_eth.runnable
|
||||
cic_eth.runnable.daemons
|
||||
cic_eth.runnable.daemons.filters
|
||||
|
||||
@@ -6,5 +6,4 @@ pytest-redis==2.0.0
|
||||
redis==3.5.3
|
||||
eth-tester==0.5.0b3
|
||||
py-evm==0.3.0a20
|
||||
eth-erc20~=0.1.5
|
||||
erc20-transfer-authorization~=0.3.6
|
||||
eth-erc20~=0.1.2a2
|
||||
|
||||
@@ -24,8 +24,6 @@ from cic_eth.pytest.fixtures_database import *
|
||||
from cic_eth.pytest.fixtures_role import *
|
||||
from cic_eth.pytest.fixtures_contract import *
|
||||
from cic_eth.pytest.fixtures_token import *
|
||||
from cic_eth.pytest.patches.account import *
|
||||
|
||||
from chainlib.eth.pytest import *
|
||||
from eth_contract_registry.pytest import *
|
||||
from cic_eth_registry.pytest.fixtures_contracts import *
|
||||
|
||||
@@ -40,7 +40,6 @@ def test_filter_gas(
|
||||
foo_token,
|
||||
token_registry,
|
||||
register_lookups,
|
||||
register_tokens,
|
||||
celery_session_worker,
|
||||
cic_registry,
|
||||
):
|
||||
@@ -70,7 +69,7 @@ def test_filter_gas(
|
||||
tx = Tx(tx_src, block=block)
|
||||
tx.apply_receipt(rcpt)
|
||||
t = fltr.filter(eth_rpc, block, tx, db_session=init_database)
|
||||
assert t.get() == None
|
||||
assert t == None
|
||||
|
||||
nonce_oracle = RPCNonceOracle(contract_roles['CONTRACT_DEPLOYER'], eth_rpc)
|
||||
c = TokenUniqueSymbolIndex(default_chain_spec, signer=eth_signer, nonce_oracle=nonce_oracle)
|
||||
|
||||
@@ -2,9 +2,9 @@
|
||||
|
||||
set -e
|
||||
|
||||
pip install --extra-index-url https://pip.grassrootseconomics.net --extra-index-url https://gitlab.com/api/v4/projects/27624814/packages/pypi/simple \
|
||||
-r admin_requirements.txt \
|
||||
-r services_requirements.txt \
|
||||
pip install --extra-index-url https://pip.grassrootseconomics.net --extra-index-url https://gitlab.com/api/v4/projects/27624814/packages/pypi/simple
|
||||
-r admin_requirements.txt
|
||||
-r services_requirements.txt
|
||||
-r test_requirements.txt
|
||||
|
||||
export PYTHONPATH=. && pytest -x --cov=cic_eth --cov-fail-under=90 --cov-report term-missing tests
|
||||
|
||||
@@ -288,6 +288,7 @@ def test_fix_nonce(
|
||||
|
||||
init_database.commit()
|
||||
|
||||
logg.debug('!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!')
|
||||
txs = get_nonce_tx_local(default_chain_spec, 3, agent_roles['ALICE'], session=init_database)
|
||||
ks = txs.keys()
|
||||
assert len(ks) == 2
|
||||
|
||||
@@ -191,17 +191,11 @@ def test_tokens(
|
||||
break
|
||||
|
||||
api_param = str(uuid.uuid4())
|
||||
fp = os.path.join(CallbackTask.mmap_path, api_param)
|
||||
f = open(fp, 'wb+')
|
||||
f.write(b'\x00')
|
||||
f.close()
|
||||
|
||||
api = Api(str(default_chain_spec), queue=None, callback_param=api_param, callback_task='cic_eth.pytest.mock.callback.test_callback')
|
||||
t = api.tokens(['BAR'], proof=[[bar_token_declaration]])
|
||||
r = t.get()
|
||||
logg.debug('rr {} {}'.format(r, t.children))
|
||||
|
||||
|
||||
while True:
|
||||
fp = os.path.join(CallbackTask.mmap_path, api_param)
|
||||
try:
|
||||
|
||||
@@ -35,26 +35,10 @@ from hexathon import strip_0x
|
||||
from cic_eth.eth.gas import cache_gas_data
|
||||
from cic_eth.error import OutOfGasError
|
||||
from cic_eth.queue.tx import queue_create
|
||||
from cic_eth.task import BaseTask
|
||||
|
||||
logg = logging.getLogger()
|
||||
|
||||
|
||||
def test_task_gas_limit(
|
||||
eth_rpc,
|
||||
eth_signer,
|
||||
default_chain_spec,
|
||||
agent_roles,
|
||||
celery_session_worker,
|
||||
):
|
||||
rpc = RPCConnection.connect(default_chain_spec, 'default')
|
||||
gas_oracle = BaseTask().create_gas_oracle(rpc)
|
||||
c = Gas(default_chain_spec, signer=eth_signer, gas_oracle=gas_oracle)
|
||||
(tx_hash_hex, o) = c.create(agent_roles['ALICE'], agent_roles['BOB'], 10, tx_format=TxFormat.RLP_SIGNED)
|
||||
tx = unpack(bytes.fromhex(strip_0x(o)), default_chain_spec)
|
||||
assert (tx['gas'], BaseTask.min_fee_price)
|
||||
|
||||
|
||||
def test_task_check_gas_ok(
|
||||
default_chain_spec,
|
||||
eth_rpc,
|
||||
|
||||
@@ -1,171 +0,0 @@
|
||||
# coding: utf-8
|
||||
from __future__ import absolute_import
|
||||
|
||||
import logging
|
||||
import time
|
||||
|
||||
import hexathon
|
||||
import pytest
|
||||
from cic_eth.server.app import create_app
|
||||
from cic_eth.server.celery import create_celery_wrapper
|
||||
from fastapi.testclient import TestClient
|
||||
|
||||
log = logging.getLogger(__name__)
|
||||
|
||||
|
||||
@pytest.fixture(scope='function')
|
||||
def celery_wrapper(api):
|
||||
""" Creates a redis channel and calls `cic_eth.api` with the provided `method` and `*args`. Returns the result of the api call. Catch allows you to specify how many messages to catch before returning.
|
||||
"""
|
||||
def wrapper(method, *args, catch=1, **kwargs):
|
||||
t = getattr(api, method)(*args, **kwargs)
|
||||
return t.get()
|
||||
return wrapper
|
||||
|
||||
@pytest.fixture(scope='function')
|
||||
def client(celery_wrapper):
|
||||
app = create_app(celery_wrapper)
|
||||
return TestClient(app)
|
||||
|
||||
|
||||
def test_default_token(client,mock_sync_default_token_api_query):
|
||||
# Default Token
|
||||
response = client.get('/default_token')
|
||||
log.debug(f"balance response {response}")
|
||||
default_token = response.json()
|
||||
assert default_token == {'symbol': 'FOO', 'address': '0xe7c559c40B297d7f039767A2c3677E20B24F1385', 'name': 'Giftable Token', 'decimals': 6}
|
||||
|
||||
def test_token(client, mock_token_api_query):
|
||||
# Default Token
|
||||
response = client.get('/token?token_symbol=FOO')
|
||||
log.debug(f"token response {response}")
|
||||
token = response.json()
|
||||
assert token == {
|
||||
'address': '0xe7c559c40B297d7f039767A2c3677E20B24F1385',
|
||||
'converters': [],
|
||||
'decimals': 6,
|
||||
'name': 'Giftable Token',
|
||||
'proofs': ['5b1549818725ca07c19fc47fda5d8d85bbebb1283855d5ab99785dcb7d9051d3'],
|
||||
'proofs_with_signers': [{'proof': '5b1549818725ca07c19fc47fda5d8d85bbebb1283855d5ab99785dcb7d9051d3',
|
||||
'signers': ['Eb3907eCad74a0013c259D5874AE7f22DcBcC95C', 'Eb3907eCad74a0013c259D5874AE7f22DcBcC95C']}],
|
||||
'symbol': 'FOO',
|
||||
}
|
||||
|
||||
def test_tokens(client, mock_tokens_api_query):
|
||||
# Default Token
|
||||
response = client.get(
|
||||
'/tokens', params={'token_symbols': ['FOO', 'FOO']})
|
||||
|
||||
log.debug(f"tokens response {response}")
|
||||
tokens = response.json()
|
||||
assert tokens == [
|
||||
{
|
||||
'address': '0xe7c559c40B297d7f039767A2c3677E20B24F1385',
|
||||
'converters': [],
|
||||
'decimals': 6,
|
||||
'name': 'Giftable Token',
|
||||
'proofs': ['5b1549818725ca07c19fc47fda5d8d85bbebb1283855d5ab99785dcb7d9051d3'],
|
||||
'proofs_with_signers': [{'proof': '5b1549818725ca07c19fc47fda5d8d85bbebb1283855d5ab99785dcb7d9051d3',
|
||||
'signers': ['Eb3907eCad74a0013c259D5874AE7f22DcBcC95C', 'Eb3907eCad74a0013c259D5874AE7f22DcBcC95C']}],
|
||||
'symbol': 'FOO',
|
||||
},
|
||||
{
|
||||
'address': '0xe7c559c40B297d7f039767A2c3677E20B24F1385',
|
||||
'converters': [],
|
||||
'decimals': 6,
|
||||
'name': 'Giftable Token',
|
||||
'proofs': ['5b1549818725ca07c19fc47fda5d8d85bbebb1283855d5ab99785dcb7d9051d3'],
|
||||
'proofs_with_signers': [{'proof': '5b1549818725ca07c19fc47fda5d8d85bbebb1283855d5ab99785dcb7d9051d3',
|
||||
'signers': ['Eb3907eCad74a0013c259D5874AE7f22DcBcC95C', 'Eb3907eCad74a0013c259D5874AE7f22DcBcC95C']}],
|
||||
'symbol': 'FOO',
|
||||
},
|
||||
]
|
||||
|
||||
@pytest.mark.skip("Not implemented")
|
||||
def test_account(client):
|
||||
# Default Token
|
||||
response = client.get('/default_token')
|
||||
log.debug(f"balance response {response}")
|
||||
default_token = response.json()
|
||||
|
||||
# Create Account 1
|
||||
params = {
|
||||
'password': '',
|
||||
'register': True
|
||||
}
|
||||
response = client.post(
|
||||
'/create_account',
|
||||
params=params)
|
||||
address_1 = hexathon.valid(response.json())
|
||||
|
||||
# Create Account 2
|
||||
params = {
|
||||
'password': '',
|
||||
'register': True
|
||||
}
|
||||
response = client.post('/create_account',
|
||||
params=params)
|
||||
address_2 = hexathon.valid(response.json())
|
||||
time.sleep(30) # Required to allow balance to show
|
||||
|
||||
# Balance Account 1
|
||||
params = {
|
||||
'address': address_1,
|
||||
'token_symbol': 'COFE',
|
||||
'include_pending': True
|
||||
}
|
||||
response = client.get('/balance',
|
||||
params=params)
|
||||
balance = response.json()
|
||||
|
||||
assert (balance[0] == {
|
||||
"address": default_token.get('address').lower(),
|
||||
"balance_available": 30000,
|
||||
"balance_incoming": 0,
|
||||
"balance_network": 30000,
|
||||
"balance_outgoing": 0,
|
||||
"converters": []
|
||||
})
|
||||
|
||||
# Transfer
|
||||
params = {
|
||||
'from_address': address_1,
|
||||
'to_address': address_2,
|
||||
'value': 100,
|
||||
'token_symbol': 'COFE'
|
||||
}
|
||||
response = client.post('/transfer',
|
||||
params=params)
|
||||
transfer = response.json()
|
||||
|
||||
# Balance Account 1
|
||||
params = {
|
||||
'address': address_1,
|
||||
'token_symbol': 'COFE',
|
||||
'include_pending': True
|
||||
}
|
||||
response = client.get('/balance',
|
||||
params=params)
|
||||
balance_after_transfer = response.json()
|
||||
assert (balance_after_transfer[0] == {
|
||||
"address": default_token.get('address').lower(),
|
||||
"balance_available": 29900,
|
||||
"balance_incoming": 0,
|
||||
"balance_network": 30000,
|
||||
"balance_outgoing": 100,
|
||||
"converters": []
|
||||
})
|
||||
|
||||
# Transactions Account 1
|
||||
params = {
|
||||
'address': address_1,
|
||||
'limit': 10
|
||||
}
|
||||
response = client.get('/transactions',
|
||||
params=params)
|
||||
transactions = response.json()
|
||||
# TODO: What are the other 2 transactions
|
||||
assert len(transactions) == 3
|
||||
# Check the transaction is correct
|
||||
# TODO wtf is READSEND (Ready to send? Or already sent)
|
||||
assert transactions[0].status == 'READYSEND'
|
||||
@@ -143,7 +143,7 @@ def test_incoming_balance(
|
||||
'converters': [],
|
||||
}
|
||||
b = balance_incoming([token_data], recipient, default_chain_spec.asdict())
|
||||
assert b[0]['balance_incoming'] == 1000
|
||||
assert b[0]['balance_incoming'] == 0
|
||||
|
||||
otx.readysend(session=init_database)
|
||||
init_database.flush()
|
||||
@@ -152,8 +152,8 @@ def test_incoming_balance(
|
||||
otx.sent(session=init_database)
|
||||
init_database.commit()
|
||||
|
||||
#b = balance_incoming([token_data], recipient, default_chain_spec.asdict())
|
||||
#assert b[0]['balance_incoming'] == 1000
|
||||
b = balance_incoming([token_data], recipient, default_chain_spec.asdict())
|
||||
assert b[0]['balance_incoming'] == 1000
|
||||
|
||||
otx.success(block=1024, session=init_database)
|
||||
init_database.commit()
|
||||
|
||||
@@ -1,5 +1,7 @@
|
||||
chainqueue~=0.0.6a4
|
||||
crypto-dev-signer>=0.4.15rc2,<=0.4.15
|
||||
chainqueue>=0.0.5a3,<0.1.0
|
||||
cic-eth-registry>=0.6.1a6,<0.7.0
|
||||
redis==3.5.3
|
||||
hexathon~=0.1.0
|
||||
hexathon~=0.0.1a8
|
||||
pycryptodome==3.10.1
|
||||
pyxdg==0.27
|
||||
|
||||
@@ -1,9 +1,10 @@
|
||||
[database]
|
||||
name=cic_notify_test
|
||||
user=
|
||||
password=
|
||||
host=localhost
|
||||
port=
|
||||
engine=sqlite
|
||||
driver=pysqlite
|
||||
debug=0
|
||||
[DATABASE]
|
||||
user = postgres
|
||||
password =
|
||||
host = localhost
|
||||
port = 5432
|
||||
name = /tmp/cic-notify.db
|
||||
#engine = postgresql
|
||||
#driver = psycopg2
|
||||
engine = sqlite
|
||||
driver = pysqlite
|
||||
|
||||
@@ -1,7 +0,0 @@
|
||||
[report]
|
||||
omit =
|
||||
venv/*
|
||||
scripts/*
|
||||
cic_notify/db/migrations/*
|
||||
cic_notify/runnable/*
|
||||
cic_notify/version.py
|
||||
@@ -3,7 +3,6 @@ import logging
import re

# third-party imports
import cic_notify.tasks.sms.db
from celery.app.control import Inspect
import celery

@@ -14,16 +13,45 @@ app = celery.current_app
logging.basicConfig(level=logging.DEBUG)
logg = logging.getLogger()

sms_tasks_matcher = r"^(cic_notify.tasks.sms)(\.\w+)?"


re_q = r'^cic-notify'
def get_sms_queue_tasks(app, task_prefix='cic_notify.tasks.sms.'):
    host_queues = []

    i = Inspect(app=app)
    qs = i.active_queues()
    for host in qs.keys():
        for q in qs[host]:
            if re.match(re_q, q['name']):
                host_queues.append((host, q['name'],))

    task_prefix_len = len(task_prefix)
    queue_tasks = []
    for (host, queue) in host_queues:
        i = Inspect(app=app, destination=[host])
        for tasks in i.registered_tasks().values():
            for task in tasks:
                if len(task) >= task_prefix_len and task[:task_prefix_len] == task_prefix:
                    queue_tasks.append((queue, task,))

    return queue_tasks


class Api:
    def __init__(self, queue: any = 'cic-notify'):
    # TODO: Implement callback strategy
    def __init__(self, queue=None):
        """
        :param queue: The queue on which to execute notification tasks
        :type queue: str
        """
        self.queue = queue
        self.sms_tasks = get_sms_queue_tasks(app)
        logg.debug('sms tasks {}'.format(self.sms_tasks))

    def sms(self, message: str, recipient: str):

    def sms(self, message, recipient):
        """Chains all sms tasks in order to send a message, log it, and persist the notification to disk.
        :param message: The message to be sent to the recipient.
        :type message: str
@@ -32,9 +60,24 @@ class Api:
        :return: a celery Task
        :rtype: Celery.Task
        """
        s_send = celery.signature('cic_notify.tasks.sms.africastalking.send', [message, recipient], queue=self.queue)
        s_log = celery.signature('cic_notify.tasks.sms.log.log', [message, recipient], queue=self.queue)
        s_persist_notification = celery.signature(
            'cic_notify.tasks.sms.db.persist_notification', [message, recipient], queue=self.queue)
        signatures = [s_send, s_log, s_persist_notification]
        return celery.group(signatures)()
        signatures = []
        for q in self.sms_tasks:

            if not self.queue:
                queue = q[0]
            else:
                queue = self.queue

            signature = celery.signature(
                q[1],
                [
                    message,
                    recipient,
                ],
                queue=queue,
            )
            signatures.append(signature)

        t = celery.group(signatures)()

        return t

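The net effect of the rework above: instead of a hard-coded group of three signatures, Api.sms builds its group from tasks discovered on live workers whose queue names match ^cic-notify. A minimal usage sketch, assuming such a worker is running (the recipient number is illustrative):

from cic_notify.api import Api

api = Api()                                    # queue=None: use each task's own host queue
t = api.sms('Hello world.', '+254700000000')   # returns a celery GroupResult
t.get()                                        # block until all notification tasks finish
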
@@ -2,7 +2,7 @@

[alembic]
# path to migration scripts
script_location = .
script_location = migrations

# template used to generate migration files
# file_template = %%(rev)s_%%(slug)s
@@ -27,17 +27,28 @@ script_location = .
# sourceless = false

# version location specification; this defaults
# to ./versions. When using multiple version
# to migrations/versions. When using multiple version
# directories, initial revisions must be specified with --version-path
# version_locations = %(here)s/bar %(here)s/bat ./versions
# version_locations = %(here)s/bar %(here)s/bat migrations/versions

# the output encoding used when revision files
# are written from script.py.mako
# output_encoding = utf-8

sqlalchemy.url = driver://user:pass@localhost/dbname
sqlalchemy.url = postgres+psycopg2://postgres@localhost/cic-notify


[post_write_hooks]
# post_write_hooks defines scripts or Python functions that are run
# on newly generated revision scripts. See the documentation for further
# detail and examples

# format using "black" - use the console_scripts runner, against the "black" entrypoint
# hooks=black
# black.type=console_scripts
# black.entrypoint=black
# black.options=-l 79

# Logging configuration
[loggers]
keys = root,sqlalchemy,alembic

@@ -11,7 +11,7 @@ config = context.config

# Interpret the config file for Python logging.
# This line sets up loggers basically.
fileConfig(config.config_file_name, disable_existing_loggers=True)
fileConfig(config.config_file_name)

# add your model's MetaData object here
# for 'autogenerate' support
@@ -56,14 +56,11 @@ def run_migrations_online():
    and associate a connection with the context.

    """
    connectable = context.config.attributes.get("connection", None)

    if connectable is None:
        connectable = engine_from_config(
            context.config.get_section(context.config.config_ini_section),
            prefix="sqlalchemy.",
            poolclass=pool.NullPool,
        )
    connectable = engine_from_config(
        config.get_section(config.config_ini_section),
        prefix="sqlalchemy.",
        poolclass=pool.NullPool,
    )

    with connectable.connect() as connection:
        context.configure(

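The connectable = context.config.attributes.get("connection", None) branch removed above is the standard Alembic hook that lets a caller hand env.py an already-open connection (pytest-alembic relies on it) instead of env.py building its own engine. A minimal sketch of that pattern, with an illustrative sqlite DSN and ini path:

from alembic import command
from alembic.config import Config
from sqlalchemy import create_engine

engine = create_engine('sqlite:///cic-notify-test.db')  # illustrative DSN
alembic_cfg = Config('alembic.ini')                     # illustrative path
with engine.connect() as connection:
    # the removed env.py branch would find and reuse this connection
    alembic_cfg.attributes['connection'] = connection
    command.upgrade(alembic_cfg, 'head')
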
@@ -7,7 +7,7 @@ import celery

celery_app = celery.current_app
logg = celery_app.log.get_default_logger()
local_logg = logging.getLogger()
local_logg = logging.getLogger(__name__)


@celery_app.task

@@ -9,7 +9,7 @@ import semver

logg = logging.getLogger()

version = (0, 4, 0, 'alpha.12')
version = (0, 4, 0, 'alpha.11')

version_object = semver.VersionInfo(
    major=version[0],

@@ -1,4 +1,4 @@
confini~=0.5.1
confini>=0.3.6rc4,<0.5.0
africastalking==1.2.3
SQLAlchemy==1.3.20
alembic==1.4.2

@@ -1,9 +1,5 @@
Faker==11.1.0
faker-e164==0.1.0
pytest==6.2.5
pytest-celery~=0.0.0
pytest-mock==3.6.1
pysqlite3~=0.4.6
pytest-cov==3.0.0
pytest-alembic==0.7.0
requests-mock==1.9.3
pytest~=6.0.1
pytest-celery~=0.0.0a1
pytest-mock~=3.3.1
pysqlite3~=0.4.3
pytest-cov==2.10.1

@@ -1,28 +0,0 @@
import pytest


def test_single_head_revision(alembic_runner):
    heads = alembic_runner.heads
    head_count = len(heads)
    assert head_count == 1


def test_upgrade(alembic_runner):
    try:
        alembic_runner.migrate_up_to("head")
    except RuntimeError:
        pytest.fail('Failed to upgrade to the head revision.')


def test_up_down_consistency(alembic_runner):
    try:
        for revision in alembic_runner.history.revisions:
            alembic_runner.migrate_up_to(revision)
    except RuntimeError:
        pytest.fail('Failed to upgrade through each revision individually.')

    try:
        for revision in reversed(alembic_runner.history.revisions):
            alembic_runner.migrate_down_to(revision)
    except RuntimeError:
        pytest.fail('Failed to downgrade through each revision individually.')
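The alembic_runner fixture driving the deleted tests above is provided by pytest-alembic, which assembles it from two user-defined fixtures, alembic_config and alembic_engine (their definitions are removed further down in this diff, under tests/fixtures/config.py and tests/fixtures/database.py). A minimal conftest sketch of that wiring, with illustrative paths:

# conftest.py (sketch): hook fixtures consumed by pytest-alembic
import pytest
from sqlalchemy import create_engine


@pytest.fixture
def alembic_config():
    # options forwarded to alembic; paths are illustrative
    return {'file': 'alembic.ini', 'script_location': 'migrations'}


@pytest.fixture
def alembic_engine():
    return create_engine('sqlite:///test-migrations.db')
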
@@ -1,27 +0,0 @@
# standard imports

# external imports
from faker import Faker
from faker_e164.providers import E164Provider

# local imports
from cic_notify.db.enum import NotificationStatusEnum, NotificationTransportEnum
from cic_notify.db.models.notification import Notification


# test imports
from tests.helpers.phone import phone_number


def test_notification(init_database):
    message = 'Hello world'
    recipient = phone_number()
    notification = Notification(NotificationTransportEnum.SMS, recipient, message)
    init_database.add(notification)
    init_database.commit()

    notification = init_database.query(Notification).get(1)
    assert notification.status == NotificationStatusEnum.UNKNOWN
    assert notification.recipient == recipient
    assert notification.message == message
    assert notification.transport == NotificationTransportEnum.SMS
@@ -1,38 +0,0 @@
# standard imports
import os

# third-party imports

# local imports
from cic_notify.db import dsn_from_config


def test_dsn_from_config(load_config):
    """
    """
    # test dsn for other db formats
    overrides = {
        'DATABASE_PASSWORD': 'password',
        'DATABASE_DRIVER': 'psycopg2',
        'DATABASE_ENGINE': 'postgresql'
    }
    load_config.dict_override(dct=overrides, dct_description='Override values to test different db formats.')

    scheme = f'{load_config.get("DATABASE_ENGINE")}+{load_config.get("DATABASE_DRIVER")}'

    dsn = dsn_from_config(load_config)
    assert dsn == f"{scheme}://{load_config.get('DATABASE_USER')}:{load_config.get('DATABASE_PASSWORD')}@{load_config.get('DATABASE_HOST')}:{load_config.get('DATABASE_PORT')}/{load_config.get('DATABASE_NAME')}"

    # undo overrides to revert engine and driver to sqlite
    overrides = {
        'DATABASE_PASSWORD': '',
        'DATABASE_DRIVER': 'pysqlite',
        'DATABASE_ENGINE': 'sqlite'
    }
    load_config.dict_override(dct=overrides, dct_description='Override values to test different db formats.')

    # test dsn for sqlite engine
    dsn = dsn_from_config(load_config)
    scheme = f'{load_config.get("DATABASE_ENGINE")}+{load_config.get("DATABASE_DRIVER")}'
    assert dsn == f'{scheme}:///{load_config.get("DATABASE_NAME")}'

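dsn_from_config itself does not appear in this diff; reading the deleted test's assertions back, a sketch of the behaviour it must implement (names taken from the test, body inferred, not the actual implementation):

def dsn_from_config(config):
    # inferred from test_dsn_from_config above
    scheme = f"{config.get('DATABASE_ENGINE')}+{config.get('DATABASE_DRIVER')}"
    if config.get('DATABASE_ENGINE') == 'sqlite':
        # sqlite DSNs carry no credentials or host, only the database path
        return f"{scheme}:///{config.get('DATABASE_NAME')}"
    return (f"{scheme}://{config.get('DATABASE_USER')}:{config.get('DATABASE_PASSWORD')}"
            f"@{config.get('DATABASE_HOST')}:{config.get('DATABASE_PORT')}"
            f"/{config.get('DATABASE_NAME')}")
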
@@ -1,75 +0,0 @@
# standard imports
import logging
import os

# external imports
import pytest
import requests_mock


# local imports
from cic_notify.error import NotInitializedError, AlreadyInitializedError, NotificationSendError
from cic_notify.tasks.sms.africastalking import AfricasTalkingNotifier

# test imports
from tests.helpers.phone import phone_number


def test_africas_talking_notifier(africastalking_response, caplog):
    caplog.set_level(logging.DEBUG)
    with pytest.raises(NotInitializedError) as error:
        AfricasTalkingNotifier()
    assert str(error.value) == ''

    api_key = os.urandom(24).hex()
    sender_id = 'bar'
    username = 'sandbox'
    AfricasTalkingNotifier.initialize(username, api_key, sender_id)
    africastalking_notifier = AfricasTalkingNotifier()
    assert africastalking_notifier.sender_id == sender_id
    assert africastalking_notifier.initiated is True

    with pytest.raises(AlreadyInitializedError) as error:
        AfricasTalkingNotifier.initialize(username, api_key, sender_id)
    assert str(error.value) == ''

    with requests_mock.Mocker(real_http=False) as request_mocker:
        message = 'Hello world.'
        recipient = phone_number()
        africastalking_response.get('SMSMessageData').get('Recipients')[0]['number'] = recipient
        request_mocker.register_uri(method='POST',
                                    headers={'content-type': 'application/json'},
                                    json=africastalking_response,
                                    url='https://api.sandbox.africastalking.com/version1/messaging',
                                    status_code=200)
        africastalking_notifier.send(message, recipient)
        assert f'Africastalking response sender-id {africastalking_response}' in caplog.text
        africastalking_notifier.sender_id = None
        africastalking_notifier.send(message, recipient)
        assert f'africastalking response no-sender-id {africastalking_response}' in caplog.text
        with pytest.raises(NotificationSendError) as error:
            status = 'InvalidPhoneNumber'
            status_code = 403
            africastalking_response.get('SMSMessageData').get('Recipients')[0]['status'] = status
            africastalking_response.get('SMSMessageData').get('Recipients')[0]['statusCode'] = status_code

            request_mocker.register_uri(method='POST',
                                        headers={'content-type': 'application/json'},
                                        json=africastalking_response,
                                        url='https://api.sandbox.africastalking.com/version1/messaging',
                                        status_code=200)
            africastalking_notifier.send(message, recipient)
        assert str(error.value) == f'Sending notification failed due to: {status}'
        with pytest.raises(NotificationSendError) as error:
            recipients = []
            status = 'InsufficientBalance'
            africastalking_response.get('SMSMessageData')['Recipients'] = recipients
            africastalking_response.get('SMSMessageData')['Message'] = status

            request_mocker.register_uri(method='POST',
                                        headers={'content-type': 'application/json'},
                                        json=africastalking_response,
                                        url='https://api.sandbox.africastalking.com/version1/messaging',
                                        status_code=200)
            africastalking_notifier.send(message, recipient)
        assert str(error.value) == f'Unexpected number of recipients: {len(recipients)}. Status: {status}'
@@ -1,26 +0,0 @@
# standard imports

# external imports
import celery

# local imports
from cic_notify.db.enum import NotificationStatusEnum, NotificationTransportEnum
from cic_notify.db.models.notification import Notification

# test imports
from tests.helpers.phone import phone_number


def test_persist_notification(celery_session_worker, init_database):
    message = 'Hello world.'
    recipient = phone_number()
    s_persist_notification = celery.signature(
        'cic_notify.tasks.sms.db.persist_notification', (message, recipient)
    )
    s_persist_notification.apply_async().get()

    notification = Notification.session.query(Notification).filter_by(recipient=recipient).first()
    assert notification.status == NotificationStatusEnum.UNKNOWN
    assert notification.recipient == recipient
    assert notification.message == message
    assert notification.transport == NotificationTransportEnum.SMS
@@ -1,21 +0,0 @@
# standard imports
import logging

# external imports
import celery

# local imports

# test imports
from tests.helpers.phone import phone_number


def test_log(caplog, celery_session_worker):
    message = 'Hello world.'
    recipient = phone_number()
    caplog.set_level(logging.INFO)
    s_log = celery.signature(
        'cic_notify.tasks.sms.log.log', [message, recipient]
    )
    s_log.apply_async().get()
    assert f'message to {recipient}: {message}' in caplog.text
@@ -1,24 +0,0 @@
# standard imports

# external imports
import celery

# local imports
from cic_notify.api import Api

# test imports
from tests.helpers.phone import phone_number


def test_api(celery_session_worker, mocker):
    mocked_group = mocker.patch('celery.group')
    message = 'Hello world.'
    recipient = phone_number()
    s_send = celery.signature('cic_notify.tasks.sms.africastalking.send', [message, recipient], queue=None)
    s_log = celery.signature('cic_notify.tasks.sms.log.log', [message, recipient], queue=None)
    s_persist_notification = celery.signature(
        'cic_notify.tasks.sms.db.persist_notification', [message, recipient], queue=None)
    signatures = [s_send, s_log, s_persist_notification]
    api = Api(queue=None)
    api.sms(message, recipient)
    mocked_group.assert_called_with(signatures)
@@ -1,13 +1,31 @@
# standard imports
import sys
import os
import pytest
import logging

# third party imports
import confini

script_dir = os.path.dirname(os.path.realpath(__file__))
root_dir = os.path.dirname(script_dir)
sys.path.insert(0, root_dir)

# local imports
from cic_notify.db.models.base import SessionBase
#from transport.notification import AfricastalkingNotification

# test imports
# fixtures
from tests.fixtures_config import *
from tests.fixtures_celery import *
from tests.fixtures_database import *

from .fixtures.celery import *
from .fixtures.config import *
from .fixtures.database import *
from .fixtures.result import *
logg = logging.getLogger()


#@pytest.fixture(scope='session')
#def africastalking_notification(
#        load_config,
#        ):
#    return AfricastalkingNotificationTransport(load_config)
#

32 apps/cic-notify/tests/fixtures/config.py vendored
@@ -1,32 +0,0 @@
# standard imports
import os
import logging

# external imports
import pytest
from confini import Config

logg = logging.getLogger(__file__)


fixtures_dir = os.path.dirname(__file__)
root_directory = os.path.dirname(os.path.dirname(fixtures_dir))


@pytest.fixture(scope='session')
def alembic_config():
    migrations_directory = os.path.join(root_directory, 'cic_notify', 'db', 'migrations', 'default')
    file = os.path.join(migrations_directory, 'alembic.ini')
    return {
        'file': file,
        'script_location': migrations_directory
    }


@pytest.fixture(scope='session')
def load_config():
    config_directory = os.path.join(root_directory, '.config/test')
    config = Config(default_dir=config_directory)
    config.process()
    logg.debug('config loaded\n{}'.format(config))
    return config
54 apps/cic-notify/tests/fixtures/database.py vendored
@@ -1,54 +0,0 @@
# standard imports
import os

# third-party imports
import pytest
import alembic
from alembic.config import Config as AlembicConfig

# local imports
from cic_notify.db import dsn_from_config
from cic_notify.db.models.base import SessionBase, create_engine

from .config import root_directory


@pytest.fixture(scope='session')
def alembic_engine(load_config):
    data_source_name = dsn_from_config(load_config)
    return create_engine(data_source_name)


@pytest.fixture(scope='session')
def database_engine(load_config):
    if load_config.get('DATABASE_ENGINE') == 'sqlite':
        try:
            os.unlink(load_config.get('DATABASE_NAME'))
        except FileNotFoundError:
            pass
    dsn = dsn_from_config(load_config)
    SessionBase.connect(dsn)
    return dsn


@pytest.fixture(scope='function')
def init_database(load_config, database_engine):
    db_directory = os.path.join(root_directory, 'cic_notify', 'db')
    migrations_directory = os.path.join(db_directory, 'migrations', load_config.get('DATABASE_ENGINE'))
    if not os.path.isdir(migrations_directory):
        migrations_directory = os.path.join(db_directory, 'migrations', 'default')

    session = SessionBase.create_session()

    alembic_config = AlembicConfig(os.path.join(migrations_directory, 'alembic.ini'))
    alembic_config.set_main_option('sqlalchemy.url', database_engine)
    alembic_config.set_main_option('script_location', migrations_directory)

    alembic.command.downgrade(alembic_config, 'base')
    alembic.command.upgrade(alembic_config, 'head')

    yield session
    session.commit()
    session.close()

24 apps/cic-notify/tests/fixtures/result.py vendored
@@ -1,24 +0,0 @@
# standard imports

# external imports
import pytest


# local imports

# test imports

@pytest.fixture(scope="function")
def africastalking_response():
    return {
        "SMSMessageData": {
            "Message": "Sent to 1/1 Total Cost: KES 0.8000",
            "Recipients": [{
                "statusCode": 101,
                "number": "+254711XXXYYY",
                "status": "Success",
                "cost": "KES 0.8000",
                "messageId": "ATPid_SampleTxnId123"
            }]
        }
    }
@@ -37,6 +37,12 @@ def celery_config():
    shutil.rmtree(rq)


@pytest.fixture(scope='session')
def celery_worker_parameters():
    return {
        # 'queues': ('cic-notify'),
    }

@pytest.fixture(scope='session')
def celery_enable_logging():
    return True
20 apps/cic-notify/tests/fixtures_config.py Normal file
@@ -0,0 +1,20 @@
# standard imports
import os
import logging

# third-party imports
import pytest
import confini

script_dir = os.path.dirname(os.path.realpath(__file__))
root_dir = os.path.dirname(script_dir)
logg = logging.getLogger(__file__)


@pytest.fixture(scope='session')
def load_config():
    config_dir = os.path.join(root_dir, '.config/test')
    conf = confini.Config(config_dir, 'CICTEST')
    conf.process()
    logg.debug('config {}'.format(conf))
    return conf
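The second argument to confini.Config above ('CICTEST') is an environment-variable prefix, so values read from the .config/test ini files can be overridden from the environment. A sketch of the intended effect (the variable name is illustrative, and the exact override semantics are an assumption about confini):

import os
import confini

os.environ['CICTEST_DATABASE_NAME'] = '/tmp/cic-notify-alt.db'  # illustrative override
conf = confini.Config('.config/test', 'CICTEST')
conf.process()
assert conf.get('DATABASE_NAME') == '/tmp/cic-notify-alt.db'
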
Some files were not shown because too many files have changed in this diff.