Reorganize script files

parent 8f2d3997d3
commit 5e833f900f

This folder contains tools to generate and import test data.

## OVERVIEW

Three sets of tools are available, sorted by respective subdirectories.

* **eth**: Import using sovereign wallets.
* **cic_eth**: Import using the `cic_eth` custodial engine.
* **cic_ussd**: Import using the `cic_ussd` interface (backed by `cic_eth`).

Each of the modules includes two main scripts:

* **import_users.py**: Registers all created accounts in the network
* **import_balance.py**: Transfers an opening balance using an external keystore wallet

The balance script will sync with the blockchain, processing transactions and triggering actions when it finds them (a filter sketch follows at the end of this overview). In its current version it does not keep track of any other state, so it will run indefinitely and needs you, the human, to decide when it has done what it needs to do.

In addition, the following common tools are available:

* User creation script
* Import verification script
* **cic_meta**: Metadata imports
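
To make the "triggering actions" part concrete, the balance importers follow the chainsyncer filter pattern (see `eth/import_balance.py` added in this commit). The class below is a hypothetical, minimal sketch of that pattern, not the actual handler:

```python
# Minimal sketch of a chainsyncer-style filter, modelled on the Handler class
# in eth/import_balance.py. The syncer calls filter() for every transaction in
# every new block; the filter decides which transactions to act on.
class OpeningBalanceFilter:

    def __init__(self, balances):
        # mapping of original (old chain) address -> opening balance to send
        self.balances = balances

    def name(self):
        return 'opening_balance_filter'

    def filter(self, conn, block, tx, db_session):
        if tx.payload is None or len(tx.payload) == 0:
            return
        # Inspect tx.payload here: when it is an AccountRegistry add(address)
        # call, look up the new account's opening balance and submit an ERC20
        # transfer to it (as the real Handler does).
```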

## HOW TO USE

### Step 1 - Data creation

Before running any of the imports, the user data to import has to be generated and saved to disk.

The script does not need any services to run.

Vanilla version:

`python create_import_users.py [--dir <datadir>] <number_of_users>`

If you want to use the `import_balance.py` script to add to the users' balances from an external address, use:

`python create_import_users.py --gift-threshold <max_units_to_send> [--dir <datadir>] <number_of_users>`

### Step 2 - Services

Make sure the following is running in the cluster:

* eth
* postgres
* redis

If metadata is to be imported, also run:

* cic-meta-server

If importing using `cic_eth` or `cic_ussd`, also run:

* cic-eth-tasker
* cic-eth-dispatcher
* cic-eth-tracker

If importing using `cic_ussd`, also run:

* cic-ussd-tasker
* cic-ussd-server
* cic-notify-tasker

You will want to run these in sequence:

### Step 3 - User imports

#### Alternative 1 - Sovereign wallet import - `eth`

First, make a note of the **block height** before running anything (see the note at the end of this alternative).

To import, run to _completion_:

`python eth/import_users.py -v -c config -p <eth_provider> -r <cic_registry_address> -y ../keystore/UTC--2021-01-08T17-18-44.521011372Z--eb3907ecad74a0013c259d5874ae7f22dcbcc95c <datadir>`

If you are transferring balances externally, then run:

`python eth/import_balance.py -v -c config -r <cic_registry_address> -p <eth_provider> --offset <block_height_at_start> -y ../keystore/UTC--2021-01-08T17-18-44.521011372Z--eb3907ecad74a0013c259d5874ae7f22dcbcc95c <datadir>`
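
Why the block height matters: the balance importer starts its syncer at `--offset` (or at the chain head with `--head`), so it must start no later than the first account registration. A rough sketch of the start-block selection, mirroring the logic in `eth/import_balance.py` from this commit:

```python
from chainlib.eth.block import block_latest
from hexathon import strip_0x

def resolve_start_block(conn, head, offset):
    # --head: start from the current chain head; otherwise start from the
    # block height noted before the user import (--offset), so the syncer
    # sees every AccountRegistry registration made during the import.
    if not head:
        return offset
    o = block_latest()          # chainlib JSON-RPC request for the latest block
    r = conn.do(o)
    return int(strip_0x(r), 16) + 1
```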

#### Alternative 2 - Custodial engine import - `cic_eth`

Run in sequence, in first terminal:

`python cic_eth/import_balance.py -v -c config -p http://localhost:63545 -r 0xea6225212005e86a4490018ded4bf37f3e772161 -y ../keystore/UTC--2021-01-08T17-18-44.521011372Z--eb3907ecad74a0013c259d5874ae7f22dcbcc95c --head out`

In another terminal:

`python cic_eth/import_users.py -v -c config --redis-host-callback <redis_hostname_in_docker> out`

The `redis_hostname_in_docker` value is the hostname required to reach the redis server from within the docker cluster. The `import_users` script will receive the address of each newly created custodial account on a redis subscription fed by a callback task in the `cic_eth` account creation task chain.
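
For illustration, this is roughly how a client can listen for addresses published through such a callback. The channel name and payload shape below are assumptions made for the sketch; the actual subscription logic lives in `cic_eth/import_users.py`:

```python
# Illustrative only: subscribe to a redis channel and print each address
# published on it. Channel name and JSON payload shape are assumed here.
import json
import redis

r = redis.Redis(host='localhost', port=6379, db=0)
ps = r.pubsub()
ps.subscribe('cic_eth_import_callback')  # assumed channel name

for message in ps.listen():
    if message['type'] != 'message':
        continue
    data = json.loads(message['data'])   # assumed JSON payload
    print('new custodial account:', data)
```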

#### Alternative 3 - USSD import

Run in sequence, in first terminal:

`python cic_eth/import_balance.py -v -c config -p http://localhost:63545 -r 0xea6225212005e86a4490018ded4bf37f3e772161 -y ../keystore/UTC--2021-01-08T17-18-44.521011372Z--eb3907ecad74a0013c259d5874ae7f22dcbcc95c out`

In second terminal:

`python cic_ussd/import_users.py -v -c config out`

The balance script is a celery task worker, and will not exit by itself in its current version. However, after it has done its job, you will find "reached nonce ... exiting" among the last lines of the log.

The connection parameters for the `cic-ussd-server` are currently _hardcoded_ in the `import_users.py` script file.
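
The "reached nonce" message reflects a simple completion condition (compare the `send_txs` task changed later in this commit): the worker has submitted all opening balances once the next nonce equals the signer's starting nonce offset plus the number of imported users. As a sketch:

```python
# Sketch of the completion condition behind "reached nonce ... exiting".
def reached_final_nonce(next_nonce, nonce_offset, user_count):
    # nonce_offset: the signer's nonce when the import started
    # user_count: the number of users read from the import data
    return next_nonce == nonce_offset + user_count
```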

### Step 4 - Metadata import (optional)

The metadata import scripts can be run at any time after step 1 has been completed.

#### Importing user metadata

To import the main user metadata structs, run:

`node cic_meta/import_meta.js <datadir> <number_of_users>`

Monitors a folder for output from the `import_users.py` script, adding the metadata found to the `cic-meta` service.

If _number of users_ is omitted, the script will run until manually interrupted.
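
The node script above is the actual tool; for orientation, the folder-monitoring idea can be sketched in Python as follows. The `new/` subdirectory matches the layout used by the import scripts in this commit, while the metadata service URL and HTTP method are assumptions:

```python
# Hypothetical sketch of the monitoring loop: poll the import output folder
# and push each new user JSON document to the metadata service.
import json
import os
import time
import urllib.request

def watch_and_post(datadir, meta_url, poll_interval=1.0):
    seen = set()
    while True:
        for root, _, files in os.walk(os.path.join(datadir, 'new')):
            for name in files:
                if not name.endswith('.json') or name in seen:
                    continue
                with open(os.path.join(root, name)) as f:
                    body = json.dumps(json.load(f)).encode('utf-8')
                req = urllib.request.Request(meta_url, data=body, method='PUT')  # assumed endpoint and method
                req.add_header('Content-Type', 'application/json')
                urllib.request.urlopen(req)
                seen.add(name)
        time.sleep(poll_interval)
```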

#### Importing phone pointer

**IMPORTANT** If you imported using `cic_ussd`, the phone pointer is _already added_ and this script should _NOT_ be run.

`node cic_meta/import_meta_phone.js <datadir> <number_of_users>`

### Step 5 - Verify

`python verify.py -v -c config -r <cic_registry_address> -p <eth_provider> <datadir>`

Included checks:

* Private key is in cic-eth keystore
* Address is in accounts index
* Address has gas balance
* Address has triggered the token faucet
* Address has token balance matching the gift threshold
* Metadata can be retrieved and has exact match

Checks can be selectively included and excluded. See `--help` for details.

Will output one line for each check, with name of check and number of errors found per check.

Should exit with code 0 if all input data is found in the respective services.
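
As an example of what one of these checks involves, a token balance can be read with the same raw `eth_call` pattern the import scripts in this commit use. This is a sketch, not the code of `verify.py`; the `10**6` decimals multiplier mirrors the value currently hardcoded in the import scripts:

```python
# Sketch of a token balance check via eth_call, in the style of the scripts in
# this commit (not the actual verify.py implementation).
import eth_abi
from chainlib.eth.connection import EthHTTPConnection
from chainlib.eth.hash import keccak256_string_to_hex
from chainlib.eth.rpc import jsonrpc_template
from hexathon import add_0x, strip_0x

def token_balance(conn, token_address, holder_address):
    selector = keccak256_string_to_hex('balanceOf(address)')[:8]
    data = add_0x(selector)
    data += eth_abi.encode_single('address', holder_address).hex()
    o = jsonrpc_template()
    o['method'] = 'eth_call'
    o['params'].append({'to': token_address, 'data': data})
    o['params'].append('latest')
    r = conn.do(o)
    return eth_abi.decode_single('uint256', bytes.fromhex(strip_0x(r)))

# usage sketch:
# conn = EthHTTPConnection('<eth_provider>')
# assert token_balance(conn, token_address, user_address) == gift_value * 10**6
```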

## KNOWN ISSUES

- If the faucet disbursement is set to a non-zero amount, the balances will be off. The verify script needs to be improved to check the faucet amount.

- When the account callback in `cic_eth` fails, the `cic_eth/import_users.py` script will exit with a cryptic complaint concerning a `None` value.

- Sovereign import scripts use the same keystore, and running them simultaneously will mess up the transaction nonce sequence. It would be better to use two different keystore wallets so the balance and user import scripts can be run simultaneously.

@@ -108,13 +108,13 @@ token_symbol = args.token_symbol

 MetadataTask.meta_host = config.get('META_HOST')
 MetadataTask.meta_port = config.get('META_PORT')
-MetadataTask.chain_spec = chain_spec
+ImportTask.chain_spec = chain_spec

 def main():
     conn = EthHTTPConnection(config.get('ETH_PROVIDER'))

-    MetadataTask.balance_processor = BalanceProcessor(conn, chain_spec, config.get('CIC_REGISTRY_ADDRESS'), signer_address, signer)
-    MetadataTask.balance_processor.init()
+    ImportTask.balance_processor = BalanceProcessor(conn, chain_spec, config.get('CIC_REGISTRY_ADDRESS'), signer_address, signer)
+    ImportTask.balance_processor.init()

     # TODO get decimals from token
     balances = {}

@@ -137,8 +137,8 @@ def main():
     f.close()

-    MetadataTask.balances = balances
-    MetadataTask.count = i
+    ImportTask.balances = balances
+    ImportTask.count = i

     s = celery.signature(
         'import_task.send_txs',

@@ -25,17 +25,21 @@ logg = logging.getLogger().getChild(__name__)
 celery_app = celery.current_app


-class MetadataTask(celery.Task):
+class ImportTask(celery.Task):

-    count = 0
     balances = None
-    chain_spec = None
     import_dir = 'out'
+    count = 0
+    chain_spec = None
+    balance_processor = None
+    max_retries = None
+
+
+class MetadataTask(ImportTask):

     meta_host = None
     meta_port = None
     meta_path = ''
     meta_ssl = False
-    balance_processor = None
     autoretry_for = (
         urllib.error.HTTPError,
         OSError,

@@ -43,7 +47,6 @@ class MetadataTask(celery.Task):
     retry_jitter = True
     retry_backoff = True
     retry_backoff_max = 60
-    max_retries = None

     @classmethod
     def meta_url(self):

@@ -172,7 +175,7 @@ def opening_balance_tx(self, address, phone, serial):
     return tx['hash']


-@celery_app.task(bind=True, base=MetadataTask, autoretry_for=(FileNotFoundError,), max_retries=None, countdown=0.1)
+@celery_app.task(bind=True, base=ImportTask, autoretry_for=(FileNotFoundError,), max_retries=None, default_retry_delay=0.1)
 def send_txs(self, nonce):

     if nonce == self.count + self.balance_processor.nonce_offset:

apps/contract-migration/scripts/eth/import_balance.py (new file, 300 lines):

# standard imports
import os
import sys
import logging
import time
import argparse
import re
import hashlib
import csv
import json

# external imports
import eth_abi
import confini
from hexathon import (
        strip_0x,
        add_0x,
        )
from chainsyncer.backend import MemBackend
from chainsyncer.driver import HeadSyncer
from chainlib.eth.connection import EthHTTPConnection
from chainlib.eth.block import (
        block_latest,
        block_by_number,
        Block,
        )
from chainlib.eth.hash import keccak256_string_to_hex
from chainlib.eth.address import to_checksum_address
from chainlib.eth.erc20 import ERC20
from chainlib.eth.gas import OverrideGasOracle
from chainlib.eth.nonce import RPCNonceOracle
from chainlib.eth.tx import TxFactory
from chainlib.eth.rpc import jsonrpc_template
from chainlib.eth.error import EthException
from chainlib.chain import ChainSpec
from crypto_dev_signer.eth.signer import ReferenceSigner as EIP155Signer
from crypto_dev_signer.keystore.dict import DictKeystore
from cic_types.models.person import Person


logging.basicConfig(level=logging.WARNING)
logg = logging.getLogger()

config_dir = './config'

argparser = argparse.ArgumentParser(description='daemon that monitors transactions in new blocks')
argparser.add_argument('-p', '--provider', dest='p', type=str, help='chain rpc provider address')
argparser.add_argument('-y', '--key-file', dest='y', type=str, help='Ethereum keystore file to use for signing')
argparser.add_argument('-c', type=str, default=config_dir, help='config root to use')
argparser.add_argument('--old-chain-spec', type=str, dest='old_chain_spec', default='evm:oldchain:1', help='chain spec')
argparser.add_argument('-i', '--chain-spec', type=str, dest='i', help='chain spec')
argparser.add_argument('-r', '--registry-address', type=str, dest='r', help='CIC Registry address')
argparser.add_argument('--token-symbol', default='SRF', type=str, dest='token_symbol', help='Token symbol to use for transactions')
argparser.add_argument('--head', action='store_true', help='start at current block height (overrides --offset)')
argparser.add_argument('--env-prefix', default=os.environ.get('CONFINI_ENV_PREFIX'), dest='env_prefix', type=str, help='environment prefix for variables to overwrite configuration')
argparser.add_argument('-q', type=str, default='cic-eth', help='celery queue to submit transaction tasks to')
argparser.add_argument('--offset', type=int, default=0, help='block offset to start syncer from')
argparser.add_argument('-v', help='be verbose', action='store_true')
argparser.add_argument('-vv', help='be more verbose', action='store_true')
argparser.add_argument('user_dir', type=str, help='user export directory')
args = argparser.parse_args(sys.argv[1:])

if args.v == True:
    logging.getLogger().setLevel(logging.INFO)
elif args.vv == True:
    logging.getLogger().setLevel(logging.DEBUG)

config_dir = os.path.join(args.c)
os.makedirs(config_dir, 0o777, True)
config = confini.Config(config_dir, args.env_prefix)
config.process()
# override args
args_override = {
        'CIC_CHAIN_SPEC': getattr(args, 'i'),
        'ETH_PROVIDER': getattr(args, 'p'),
        'CIC_REGISTRY_ADDRESS': getattr(args, 'r'),
        }
config.dict_override(args_override, 'cli flag')
config.censor('PASSWORD', 'DATABASE')
config.censor('PASSWORD', 'SSL')
logg.debug('config loaded from {}:\n{}'.format(config_dir, config))

#app = celery.Celery(backend=config.get('CELERY_RESULT_URL'), broker=config.get('CELERY_BROKER_URL'))

signer_address = None
keystore = DictKeystore()
if args.y != None:
    logg.debug('loading keystore file {}'.format(args.y))
    signer_address = keystore.import_keystore_file(args.y)
    logg.debug('now have key for signer address {}'.format(signer_address))
signer = EIP155Signer(keystore)

queue = args.q
chain_str = config.get('CIC_CHAIN_SPEC')
block_offset = 0
if args.head:
    block_offset = -1
else:
    block_offset = args.offset

chain_spec = ChainSpec.from_chain_str(chain_str)
old_chain_spec_str = args.old_chain_spec
old_chain_spec = ChainSpec.from_chain_str(old_chain_spec_str)

user_dir = args.user_dir # user_out_dir from import_users.py

token_symbol = args.token_symbol

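# The Handler below acts as the chainsyncer filter for this script: for each
# transaction in a synced block it checks whether the payload is an
# AccountRegistry add(address) call, looks up the registered account's original
# balance in the imported user data, and submits an ERC20 transfer of the
# opening balance (scaled by the hardcoded 10**6 decimals) to the new address.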
class Handler:

    account_index_add_signature = keccak256_string_to_hex('add(address)')[:8]

    def __init__(self, conn, chain_spec, user_dir, balances, token_address, signer, gas_oracle, nonce_oracle):
        self.token_address = token_address
        self.user_dir = user_dir
        self.balances = balances
        self.chain_spec = chain_spec
        self.tx_factory = ERC20(chain_spec, signer, gas_oracle, nonce_oracle)


    def name(self):
        return 'balance_handler'


    def filter(self, conn, block, tx, db_session):
        if tx.payload == None or len(tx.payload) == 0:
            logg.debug('no payload, skipping {}'.format(tx))
            return

        if tx.payload[:8] == self.account_index_add_signature:
            recipient = eth_abi.decode_single('address', bytes.fromhex(tx.payload[-64:]))
            #original_address = to_checksum_address(self.addresses[to_checksum_address(recipient)])
            user_file = 'new/{}/{}/{}.json'.format(
                    recipient[2:4].upper(),
                    recipient[4:6].upper(),
                    recipient[2:].upper(),
                    )
            filepath = os.path.join(self.user_dir, user_file)
            o = None
            try:
                f = open(filepath, 'r')
                o = json.load(f)
                f.close()
            except FileNotFoundError:
                logg.error('no import record of address {}'.format(recipient))
                return
            u = Person.deserialize(o)
            original_address = u.identities[old_chain_spec.engine()]['{}:{}'.format(old_chain_spec.common_name(), old_chain_spec.network_id())][0]
            try:
                balance = self.balances[original_address]
            except KeyError as e:
                logg.error('balance get fail orig {} new {}'.format(original_address, recipient))
                return

            # TODO: store token object in handler, get decimals from there
            multiplier = 10**6
            balance_full = balance * multiplier
            logg.info('registered {} originally {} ({}) tx hash {} balance {}'.format(recipient, original_address, u, tx.hash, balance_full))

            (tx_hash_hex, o) = self.tx_factory.transfer(self.token_address, signer_address, recipient, balance_full)
            logg.info('submitting erc20 transfer tx {} for recipient {}'.format(tx_hash_hex, recipient))
            r = conn.do(o)
            # except TypeError as e:
            #     logg.warning('typerror {}'.format(e))
            #     pass
            # except IndexError as e:
            #     logg.warning('indexerror {}'.format(e))
            #     pass
            # except EthException as e:
            #     logg.error('send error {}'.format(e).ljust(200))
            #except KeyError as e:
            #     logg.error('key record not found in imports: {}'.format(e).ljust(200))

#class BlockGetter:
|
||||||
|
#
|
||||||
|
# def __init__(self, conn, gas_oracle, nonce_oracle, chain_spec):
|
||||||
|
# self.conn = conn
|
||||||
|
# self.tx_factory = ERC20(signer=signer, gas_oracle=gas_oracle, nonce_oracle=nonce_oracle, chain_id=chain_id)
|
||||||
|
#
|
||||||
|
#
|
||||||
|
# def get(self, n):
|
||||||
|
# o = block_by_number(n)
|
||||||
|
# r = self.conn.do(o)
|
||||||
|
# b = None
|
||||||
|
# try:
|
||||||
|
# b = Block(r)
|
||||||
|
# except TypeError as e:
|
||||||
|
# if r == None:
|
||||||
|
# logg.debug('block not found {}'.format(n))
|
||||||
|
# else:
|
||||||
|
# logg.error('block retrieve error {}'.format(e))
|
||||||
|
# return b
|
||||||
|
|
||||||
|
|
||||||
|
def progress_callback(block_number, tx_index, s):
|
||||||
|
sys.stdout.write(str(s).ljust(200) + "\n")
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
def main():
|
||||||
|
global chain_str, block_offset, user_dir
|
||||||
|
|
||||||
|
conn = EthHTTPConnection(config.get('ETH_PROVIDER'))
|
||||||
|
gas_oracle = OverrideGasOracle(conn=conn, limit=8000000)
|
||||||
|
nonce_oracle = RPCNonceOracle(signer_address, conn)
|
||||||
|
|
||||||
|
# Get Token registry address
|
||||||
|
txf = TxFactory(chain_spec, signer=signer, gas_oracle=gas_oracle, nonce_oracle=None)
|
||||||
|
tx = txf.template(signer_address, config.get('CIC_REGISTRY_ADDRESS'))
|
||||||
|
|
||||||
|
registry_addressof_method = keccak256_string_to_hex('addressOf(bytes32)')[:8]
|
||||||
|
data = add_0x(registry_addressof_method)
|
||||||
|
data += eth_abi.encode_single('bytes32', b'TokenRegistry').hex()
|
||||||
|
txf.set_code(tx, data)
|
||||||
|
|
||||||
|
o = jsonrpc_template()
|
||||||
|
o['method'] = 'eth_call'
|
||||||
|
o['params'].append(txf.normalize(tx))
|
||||||
|
o['params'].append('latest')
|
||||||
|
r = conn.do(o)
|
||||||
|
token_index_address = to_checksum_address(eth_abi.decode_single('address', bytes.fromhex(strip_0x(r))))
|
||||||
|
logg.info('found token index address {}'.format(token_index_address))
|
||||||
|
|
||||||
|
|
||||||
|
# Get Sarafu token address
|
||||||
|
tx = txf.template(signer_address, token_index_address)
|
||||||
|
data = add_0x(registry_addressof_method)
|
||||||
|
h = hashlib.new('sha256')
|
||||||
|
h.update(token_symbol.encode('utf-8'))
|
||||||
|
z = h.digest()
|
||||||
|
data += eth_abi.encode_single('bytes32', z).hex()
|
||||||
|
txf.set_code(tx, data)
|
||||||
|
o = jsonrpc_template()
|
||||||
|
o['method'] = 'eth_call'
|
||||||
|
o['params'].append(txf.normalize(tx))
|
||||||
|
o['params'].append('latest')
|
||||||
|
r = conn.do(o)
|
||||||
|
try:
|
||||||
|
sarafu_token_address = to_checksum_address(eth_abi.decode_single('address', bytes.fromhex(strip_0x(r))))
|
||||||
|
except ValueError as e:
|
||||||
|
logg.critical('lookup failed for token {}: {}'.format(token_symbol, e))
|
||||||
|
sys.exit(1)
|
||||||
|
logg.info('found token address {}'.format(sarafu_token_address))
|
||||||
|
|
||||||
|
syncer_backend = MemBackend(chain_str, 0)
|
||||||
|
|
||||||
|
if block_offset == -1:
|
||||||
|
o = block_latest()
|
||||||
|
r = conn.do(o)
|
||||||
|
block_offset = int(strip_0x(r), 16) + 1
|
||||||
|
#
|
||||||
|
# addresses = {}
|
||||||
|
# f = open('{}/addresses.csv'.format(user_dir, 'r'))
|
||||||
|
# while True:
|
||||||
|
# l = f.readline()
|
||||||
|
# if l == None:
|
||||||
|
# break
|
||||||
|
# r = l.split(',')
|
||||||
|
# try:
|
||||||
|
# k = r[0]
|
||||||
|
# v = r[1].rstrip()
|
||||||
|
# addresses[k] = v
|
||||||
|
# sys.stdout.write('loading address mapping {} -> {}'.format(k, v).ljust(200) + "\r")
|
||||||
|
# except IndexError as e:
|
||||||
|
# break
|
||||||
|
# f.close()
|
||||||
|
|
||||||
|
# TODO get decimals from token
|
||||||
|
balances = {}
|
||||||
|
f = open('{}/balances.csv'.format(user_dir, 'r'))
|
||||||
|
remove_zeros = 10**6
|
||||||
|
i = 0
|
||||||
|
while True:
|
||||||
|
l = f.readline()
|
||||||
|
if l == None:
|
||||||
|
break
|
||||||
|
r = l.split(',')
|
||||||
|
try:
|
||||||
|
address = to_checksum_address(r[0])
|
||||||
|
sys.stdout.write('loading balance {} {} {}'.format(i, address, r[1]).ljust(200) + "\r")
|
||||||
|
except ValueError:
|
||||||
|
break
|
||||||
|
balance = int(int(r[1].rstrip()) / remove_zeros)
|
||||||
|
balances[address] = balance
|
||||||
|
i += 1
|
||||||
|
|
||||||
|
f.close()
|
||||||
|
|
||||||
|
syncer_backend.set(block_offset, 0)
|
||||||
|
syncer = HeadSyncer(syncer_backend, progress_callback=progress_callback)
|
||||||
|
handler = Handler(conn, chain_spec, user_dir, balances, sarafu_token_address, signer, gas_oracle, nonce_oracle)
|
||||||
|
syncer.add_filter(handler)
|
||||||
|
syncer.loop(1, conn)
|
||||||
|
|
||||||
|
|
||||||
|
if __name__ == '__main__':
|
||||||
|
main()
|

@@ -88,7 +88,7 @@ signer = EIP155Signer(keystore)

 nonce_oracle = RPCNonceOracle(signer_address, rpc)

-registry = Registry()
+registry = Registry(chain_spec)
 o = registry.address_of(config.get('CIC_REGISTRY_ADDRESS'), 'AccountRegistry')
 r = rpc.do(o)
 account_registry_address = registry.parse_address_of(r)

File diff suppressed because it is too large.

@@ -1 +0,0 @@
-{"metricId":"ea11447f-da1c-49e6-b0a2-8a988a99e3ce","metrics":{"from":"2021-02-12T18:59:21.666Z","to":"2021-02-12T18:59:21.666Z","successfulInstalls":0,"failedInstalls":1}}

@@ -1 +0,0 @@
-python import_balance.py -c config -i evm:bloxberg:8996 -y /home/lash/tmp/d/keystore/UTC--2021-02-07T09-58-35.341813355Z--eb3907ecad74a0013c259d5874ae7f22dcbcc95c -v $@

@@ -1 +0,0 @@
-python import_users.py -c config --redis-host-callback redis -vv $@