Compare commits

37 Commits

Author SHA1 Message Date
André Silva
a42d780d02 [Beta] Backports (#7756)
* Filter-out nodes.json (#7716)

* Filter-out nodes.json

* network: sort node table nodes by failure ratio

* network: fix node table tests

* network: fit node failure percentage into buckets of 5%

* network: consider number of attempts in sorting of node table

* network: fix node table grumbles

* Fix client not being dropped on shutdown (#7695)

* parity: wait for client to drop on shutdown

* parity: fix grumbles in shutdown wait

* parity: increase shutdown timeouts

* Wrap --help output to 120 characters (#7626)

* Update Clap dependency and remove workarounds

* WIP

* Remove line breaks in help messages for now

* Multiple values can only be separated by commas (closes #7428)

* Grumbles; refactor repeating code; add constant

* Use a single Wrapper rather than allocate a new one for each call

* Wrap --help to 120 characters rather than 100 characters
2018-01-31 21:45:23 +01:00
GitLab Build Bot
582fa8ce45 [ci skip] js-precompiled 20180131-171157 2018-01-31 17:13:01 +00:00
Jaco Greeff
73be0fb096 [beta] Token filter balances (throttle) (#7742)
* [beta] Token filter balances (throttle)

* Cleanups

* Remove unused uniq

* Update @parity/shared to 2.2.23

* Remove unused code paths
2018-01-31 14:59:53 +01:00
Afri Schoedon
627d1a4971
Bump beta to 1.9.1 (#7751) 2018-01-31 13:25:11 +01:00
Jaco Greeff
a7807106f5 [beta] Explicitly add branch name (#7754)
* [beta] Explicitly add branch name

* Fix cargo update branch to beta
2018-01-31 12:04:04 +01:00
Tomasz Drwięga
33b39f0725 Revert "revert to #7677 #7679" (#7715)
This reverts commit 568dc33a02.
2018-01-29 11:43:30 +01:00
Denis S. Soldatov aka General-Beck
53ec1141cf
fix permissions 2018-01-25 03:45:03 +03:00
Denis S. Soldatov aka General-Beck
145229d46d
add display of stages in js-release 2018-01-25 03:38:31 +03:00
Denis S. Soldatov aka General-Beck
568dc33a02
revert to #7677 #7679 2018-01-25 03:23:33 +03:00
Denis S. Soldatov aka General-Beck
cf10450108
add display of stages in js-release 2018-01-25 02:16:58 +03:00
GitLab Build Bot
fe779686ca [ci skip] js-precompiled 20180124-230347 2018-01-24 23:04:39 +00:00
Denis S. Soldatov aka General-Beck
58c1dbe322 Update gitlab-test.sh 2018-01-24 23:52:00 +01:00
Denis S. Soldatov aka General-Beck
14b578832d Update gitlab-test.sh 2018-01-24 23:39:03 +01:00
Denis S. Soldatov aka General-Beck
e961398393 Update gitlab-test.sh 2018-01-24 23:25:06 +01:00
Denis S. Soldatov aka General-Beck
0fad2a6d8c Update gitlab-test.sh 2018-01-24 23:12:09 +01:00
Denis S. Soldatov aka General-Beck
f3bcada7b9 Update gitlab-test.sh 2018-01-24 23:09:39 +01:00
Amaury Martiny
b814f1ccbf Add when when too many accounts (#7677) (#7679) 2018-01-24 09:45:08 +01:00
Afri Schoedon
cad91df2b8
Update installer.nsi 2018-01-23 22:37:33 +01:00
Denis S. Soldatov aka General-Beck
50a58e1ae8
fix conditions in gitlab-test (#7676)
* fix conditions in gitlab-test

* Update gitlab-test.sh
2018-01-23 14:55:02 +03:00
Denis S. Soldatov aka General-Beck
1e36fc5d0f
remove cargo cache 2018-01-23 14:42:24 +03:00
Marek Kotewicz
fa6a0a6b60 Backports to beta (#7660)
* Improve handling of RocksDB corruption (#7630)

* kvdb-rocksdb: update rust-rocksdb version

* kvdb-rocksdb: mark corruptions and attempt repair on db open

* kvdb-rocksdb: better corruption detection on open

* kvdb-rocksdb: add corruption_file_name const

* kvdb-rocksdb: rename mark_corruption to check_for_corruption

* Hardening of CSP (#7621)

* Fixed delegatecall's from/to (#7568)

* Fixed delegatecall's from/to, closes #7166

* added tests for delegatecall traces, #7167

* Light client RPCs (#7603)

* Implement registrar.

* Implement eth_getCode

* Don't wait for providers.

* Don't wait for providers.

* Fix linting and wasm tests.

* Problem: AttachedProtocols don't get registered (#7610)

I was investigating issues I am having with Whisper support. I've
enabled Whisper on a custom test network and inserted traces into
Whisper handler implementation (Network<T> and NetworkProtocolHandler
for Network<T>) and I noticed that the handler was never invoked.

After further research on this matter, I found out that
AttachedProtocol's register function does nothing:
https://github.com/paritytech/parity/blob/master/sync/src/api.rs#L172
but there was an implementation originally:
99075ad#diff-5212acb6bcea60e9804ba7b50f6fe6ec and it did the actual
expected logic of registering the protocol in the NetworkService.

However, as of 16d84f8#diff-5212acb6bcea60e9804ba7b50f6fe6ec ("finished
removing ipc") this implementation is gone and only the no-op function
is left.

Which leads me to a conclusion that in fact Whisper's handler never gets
registered in the service and therefore two nodes won't communicate
using it.

Solution: Resurrect original non-empty `AttachedProtocols.register`
implementation

Resolves #7566

* Fix Temporarily Invalid blocks handling (#7613)

* Handle temporarily invalid blocks in sync.

* Fix tests.
2018-01-23 12:32:34 +01:00
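The "Problem: AttachedProtocols don't get registered" note in the commit above describes restoring a register implementation that actually hands the protocol over to the NetworkService instead of doing nothing. The following is a hypothetical, self-contained Rust sketch of that shape only; the type names, fields, and the register_protocol signature are stand-ins for illustration, not the actual parity / ethcore-network API.

// Hypothetical sketch; stand-in types, not the real ethcore-network / sync API.
use std::sync::Arc;

type ProtocolId = [u8; 3];

trait ProtocolHandler {}

struct NetworkService;

impl NetworkService {
    // Stand-in for the network service's protocol registration entry point.
    fn register_protocol(
        &self,
        _handler: Arc<dyn ProtocolHandler>,
        id: ProtocolId,
        versions: &[u8],
    ) -> Result<(), String> {
        println!("registered protocol {:?}, versions {:?}", id, versions);
        Ok(())
    }
}

struct AttachedProtocol {
    handler: Arc<dyn ProtocolHandler>,
    protocol_id: ProtocolId,
    versions: Vec<u8>,
}

impl AttachedProtocol {
    // Before the fix this was a no-op, so handlers such as Whisper's were
    // never wired into the network service; the restored version forwards
    // the stored handler and protocol metadata.
    fn register(&self, network: &NetworkService) {
        if let Err(e) = network.register_protocol(
            self.handler.clone(),
            self.protocol_id,
            &self.versions,
        ) {
            println!("error registering protocol {:?}: {}", self.protocol_id, e);
        }
    }
}

struct Whisper;
impl ProtocolHandler for Whisper {}

fn main() {
    let proto = AttachedProtocol {
        handler: Arc::new(Whisper),
        protocol_id: *b"shh",
        versions: vec![6],
    };
    proto.register(&NetworkService);
}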
Denis S. Soldatov aka General-Beck
a8fc42d282
add docker build for beta (#7671)
* add docker build for beta

* add cargo cache
2018-01-23 06:19:39 +03:00
Denis S. Soldatov aka General-Beck
c6685a7f57
fix snapcraft build for beta (#7670) 2018-01-23 04:12:22 +03:00
Denis S. Soldatov aka General-Beck
736a8c40f0
Update Parity.pkgproj
1.10.0 ->1.9.0
2018-01-23 02:53:18 +03:00
Denis S. Soldatov aka General-Beck
5f74f8c265
update gitlab build from master
Signed-off-by: Denis S. Soldatov aka General-Beck <general.beck@gmail.com>
2018-01-23 01:50:52 +03:00
GitLab Build Bot
97ed569588 [ci skip] js-precompiled 20180119-115500 2018-01-19 11:55:49 +00:00
Jaco Greeff
6766ef988d
Update references to dapp sources (#7634) (#7636)
* Update plugin references

* Update dapp references

* Update console references
2018-01-19 12:19:33 +01:00
GitLab Build Bot
8a87cfb893 [ci skip] js-precompiled 20180118-163407 2018-01-18 16:34:59 +00:00
Jaco Greeff
54aebdcb45
Update tokenreg (#7618) (#7619)
* Update tokenreg

* Add commit hash
2018-01-18 16:54:23 +01:00
GitLab Build Bot
86a6145d76 [ci skip] js-precompiled 20180117-211011 2018-01-17 21:11:06 +00:00
Afri Schoedon
718020b64b [beta] fix cache:key (#7598)
* Bump 1.9 to beta

* Update .gitlab-ci.yml

fix cache:key
2018-01-17 23:36:13 +03:00
Afri Schoedon
8c36a56365
Bump 1.9 to beta (#7533) 2018-01-17 12:28:21 +01:00
GitLab Build Bot
7bccaa5c15 [ci skip] js-precompiled 20180111-130237 2018-01-11 13:03:32 +00:00
Jaco Greeff
98ec46fff6
[beta] Trigger js-precompiled (#7535) 2018-01-11 13:27:01 +01:00
GitLab Build Bot
8dc584ece9 [ci skip] js-precompiled 20180111-094838 2018-01-11 09:50:08 +00:00
André Silva
63d154dad3 kvdb: update rust-rocksdb version (#7512) 2018-01-10 11:23:37 +01:00
Amaury Martiny
0030bb4f1d Update js-api (#7510) 2018-01-09 17:57:52 +01:00
56 changed files with 1515 additions and 2415 deletions

.gitlab-ci.yml
View File

@ -4,13 +4,15 @@ stages:
- push-release
- build
variables:
SIMPLECOV: "true"
RUST_BACKTRACE: "1"
RUSTFLAGS: ""
CARGOFLAGS: ""
CI_SERVER_NAME: "GitLab CI"
LIBSSL: "libssl1.0.0 (>=1.0.0)"
cache:
key: "$CI_BUILD_STAGE/$CI_BUILD_REF_NAME"
key: "$CI_BUILD_STAGE-$CI_BUILD_REF_NAME"
paths:
- target
untracked: true
linux-stable:
stage: build
@ -22,77 +24,14 @@ linux-stable:
- triggers
script:
- rustup default stable
- cargo build -j $(nproc) --release --features final $CARGOFLAGS
- cargo build -j $(nproc) --release -p evmbin
- cargo build -j $(nproc) --release -p ethstore-cli
- cargo build -j $(nproc) --release -p ethkey-cli
- strip target/release/parity
- strip target/release/parity-evm
- strip target/release/ethstore
- strip target/release/ethkey
- export SHA3=$(target/release/parity tools hash target/release/parity)
- md5sum target/release/parity > parity.md5
- sh scripts/deb-build.sh amd64
- cp target/release/parity deb/usr/bin/parity
- cp target/release/parity-evm deb/usr/bin/parity-evm
- cp target/release/ethstore deb/usr/bin/ethstore
- cp target/release/ethkey deb/usr/bin/ethkey
- export VER=$(grep -m 1 version Cargo.toml | awk '{print $3}' | tr -d '"' | tr -d "\n")
- dpkg-deb -b deb "parity_"$VER"_amd64.deb"
- md5sum "parity_"$VER"_amd64.deb" > "parity_"$VER"_amd64.deb.md5"
- aws configure set aws_access_key_id $s3_key
- aws configure set aws_secret_access_key $s3_secret
- if [[ $CI_BUILD_REF_NAME =~ ^(master|beta|stable|nightly)$ ]]; then export S3_BUCKET=builds-parity-published; else export S3_BUCKET=builds-parity; fi
- aws s3 rm --recursive s3://$S3_BUCKET/$CI_BUILD_REF_NAME/x86_64-unknown-linux-gnu
- aws s3api put-object --bucket $S3_BUCKET --key $CI_BUILD_REF_NAME/x86_64-unknown-linux-gnu/parity --body target/release/parity
- aws s3api put-object --bucket $S3_BUCKET --key $CI_BUILD_REF_NAME/x86_64-unknown-linux-gnu/parity.md5 --body parity.md5
- aws s3api put-object --bucket $S3_BUCKET --key $CI_BUILD_REF_NAME/x86_64-unknown-linux-gnu/"parity_"$VER"_amd64.deb" --body "parity_"$VER"_amd64.deb"
- aws s3api put-object --bucket $S3_BUCKET --key $CI_BUILD_REF_NAME/x86_64-unknown-linux-gnu/"parity_"$VER"_amd64.deb.md5" --body "parity_"$VER"_amd64.deb.md5"
- curl --data "commit=$CI_BUILD_REF&sha3=$SHA3&filename=parity&secret=$RELEASES_SECRET" http://update.parity.io:1337/push-build/$CI_BUILD_REF_NAME/x86_64-unknown-linux-gnu
- curl --data "commit=$CI_BUILD_REF&sha3=$SHA3&filename=parity&secret=$RELEASES_SECRET" http://update.parity.io:1338/push-build/$CI_BUILD_REF_NAME/x86_64-unknown-linux-gnu
# ARGUMENTS: 1. BUILD_PLATFORM (target for binaries) 2. PLATFORM (target for cargo) 3. ARC (architecture) 4. & 5. CC & CXX flags
- scripts/gitlab-build.sh x86_64-unknown-linux-gnu x86_64-unknown-linux-gnu amd64 gcc g++
tags:
- rust
- rust-stable
artifacts:
paths:
- target/release/parity
- target/release/parity-evm
- target/release/ethstore
- target/release/ethkey
- parity.zip
name: "stable-x86_64-unknown-linux-gnu_parity"
linux-snap:
stage: build
image: parity/snapcraft:gitlab-ci
only:
- snap
- beta
- tags
- triggers
script:
- export VER=$(grep -m 1 version Cargo.toml | awk '{print $3}' | tr -d '"' | tr -d "\n")
- cd snap
- rm -rf *snap
- sed -i 's/master/'"$VER"'/g' snapcraft.yaml
- echo "Version:"$VER
- snapcraft
- ls
- cp "parity_"$CI_BUILD"_REF_NAME_amd64.snap" "parity_"$VER"_amd64.snap"
- md5sum "parity_"$VER"_amd64.snap" > "parity_"$VER"_amd64.snap.md5"
- aws configure set aws_access_key_id $s3_key
- aws configure set aws_secret_access_key $s3_secret
- if [[ $CI_BUILD_REF_NAME =~ ^(master|beta|stable|nightly)$ ]]; then export S3_BUCKET=builds-parity-published; else export S3_BUCKET=builds-parity; fi
- aws s3api put-object --bucket $S3_BUCKET --key $CI_BUILD_REF_NAME/x86_64-unknown-linux-gnu/"parity_"$VER"_amd64.snap" --body "parity_"$VER"_amd64.snap"
- aws s3api put-object --bucket $S3_BUCKET --key $CI_BUILD_REF_NAME/x86_64-unknown-linux-gnu/"parity_"$VER"_amd64.snap.md5" --body "parity_"$VER"_amd64.snap.md5"
- curl --data "commit=$CI_BUILD_REF&sha3=$SHA3&filename=parity&secret=$RELEASES_SECRET" http://update.parity.io:1337/push-build/$CI_BUILD_REF_NAME/x86_64-unknown-linux-gnu
- curl --data "commit=$CI_BUILD_REF&sha3=$SHA3&filename=parity&secret=$RELEASES_SECRET" http://update.parity.io:1338/push-build/$CI_BUILD_REF_NAME/x86_64-unknown-linux-gnu
tags:
- rust
- rust-stable
artifacts:
paths:
- scripts/parity_*_amd64.snap
name: "stable-x86_64-unknown-snap-gnu_parity"
allow_failure: true
linux-stable-debian:
stage: build
image: parity/rust-debian:gitlab-ci
@ -102,81 +41,14 @@ linux-stable-debian:
- stable
- triggers
script:
- cargo build -j $(nproc) --release --features final $CARGOFLAGS
- cargo build -j $(nproc) --release -p evmbin
- cargo build -j $(nproc) --release -p ethstore-cli
- cargo build -j $(nproc) --release -p ethkey-cli
- strip target/release/parity
- strip target/release/parity-evm
- strip target/release/ethstore
- strip target/release/ethkey
- export SHA3=$(target/release/parity tools hash target/release/parity)
- md5sum target/release/parity > parity.md5
- sh scripts/deb-build.sh amd64
- cp target/release/parity deb/usr/bin/parity
- cp target/release/parity-evm deb/usr/bin/parity-evm
- cp target/release/ethstore deb/usr/bin/ethstore
- cp target/release/ethkey deb/usr/bin/ethkey
- export VER=$(grep -m 1 version Cargo.toml | awk '{print $3}' | tr -d '"' | tr -d "\n")
- dpkg-deb -b deb "parity_"$VER"_amd64.deb"
- md5sum "parity_"$VER"_amd64.deb" > "parity_"$VER"_amd64.deb.md5"
- aws configure set aws_access_key_id $s3_key
- aws configure set aws_secret_access_key $s3_secret
- if [[ $CI_BUILD_REF_NAME =~ ^(master|beta|stable|nightly)$ ]]; then export S3_BUCKET=builds-parity-published; else export S3_BUCKET=builds-parity; fi
- aws s3 rm --recursive s3://$S3_BUCKET/$CI_BUILD_REF_NAME/x86_64-unknown-debian-gnu
- aws s3api put-object --bucket $S3_BUCKET --key $CI_BUILD_REF_NAME/x86_64-unknown-debian-gnu/parity --body target/release/parity
- aws s3api put-object --bucket $S3_BUCKET --key $CI_BUILD_REF_NAME/x86_64-unknown-debian-gnu/parity.md5 --body parity.md5
- aws s3api put-object --bucket $S3_BUCKET --key $CI_BUILD_REF_NAME/x86_64-unknown-debian-gnu/"parity_"$VER"_amd64.deb" --body "parity_"$VER"_amd64.deb"
- aws s3api put-object --bucket $S3_BUCKET --key $CI_BUILD_REF_NAME/x86_64-unknown-debian-gnu/"parity_"$VER"_amd64.deb.md5" --body "parity_"$VER"_amd64.deb.md5"
- curl --data "commit=$CI_BUILD_REF&sha3=$SHA3&filename=parity&secret=$RELEASES_SECRET" http://update.parity.io:1337/push-build/$CI_BUILD_REF_NAME/x86_64-unknown-debian-gnu
- curl --data "commit=$CI_BUILD_REF&sha3=$SHA3&filename=parity&secret=$RELEASES_SECRET" http://update.parity.io:1338/push-build/$CI_BUILD_REF_NAME/x86_64-unknown-debian-gnu
- export LIBSSL="libssl1.1 (>=1.1.0)"
- scripts/gitlab-build.sh x86_64-unknown-debian-gnu x86_64-unknown-linux-gnu amd64 gcc g++
tags:
- rust
- rust-debian
artifacts:
paths:
- target/release/parity
- parity.zip
name: "stable-x86_64-unknown-debian-gnu_parity"
linux-beta:
stage: build
image: parity/rust:gitlab-ci
only:
- beta
- tags
- stable
- triggers
script:
- rustup default beta
- cargo build -j $(nproc) --release $CARGOFLAGS
- strip target/release/parity
tags:
- rust
- rust-beta
artifacts:
paths:
- target/release/parity
name: "beta-x86_64-unknown-linux-gnu_parity"
allow_failure: true
linux-nightly:
stage: build
image: parity/rust:gitlab-ci
only:
- beta
- tags
- stable
- triggers
script:
- rustup default nightly
- cargo build -j $(nproc) --release $CARGOFLAGS
- strip target/release/parity
tags:
- rust
- rust-nightly
artifacts:
paths:
- target/release/parity
name: "nigthly-x86_64-unknown-linux-gnu_parity"
allow_failure: true
linux-centos:
stage: build
image: parity/rust-centos:gitlab-ci
@ -186,42 +58,12 @@ linux-centos:
- stable
- triggers
script:
- export CXX="g++"
- export CC="gcc"
- export PLATFORM=x86_64-unknown-centos-gnu
- cargo build -j $(nproc) --release --features final $CARGOFLAGS
- cargo build -j $(nproc) --release -p evmbin
- cargo build -j $(nproc) --release -p ethstore-cli
- cargo build -j $(nproc) --release -p ethkey-cli
- strip target/release/parity
- strip target/release/parity-evm
- strip target/release/ethstore
- strip target/release/ethkey
- md5sum target/release/parity > parity.md5
- md5sum target/release/parity-evm > parity-evm.md5
- md5sum target/release/ethstore > ethstore.md5
- md5sum target/release/ethkey > ethkey.md5
- export SHA3=$(target/release/parity tools hash target/release/parity)
- aws configure set aws_access_key_id $s3_key
- aws configure set aws_secret_access_key $s3_secret
- if [[ $CI_BUILD_REF_NAME =~ ^(master|beta|stable|nightly)$ ]]; then export S3_BUCKET=builds-parity-published; else export S3_BUCKET=builds-parity; fi
- aws s3 rm --recursive s3://$S3_BUCKET/$CI_BUILD_REF_NAME/x86_64-unknown-centos-gnu
- aws s3api put-object --bucket builds-parity --key $CI_BUILD_REF_NAME/x86_64-unknown-centos-gnu/parity --body target/release/parity
- aws s3api put-object --bucket builds-parity --key $CI_BUILD_REF_NAME/x86_64-unknown-centos-gnu/parity.md5 --body parity.md5
- aws s3api put-object --bucket builds-parity --key $CI_BUILD_REF_NAME/x86_64-unknown-centos-gnu/parity-evm --body target/release/parity-evm
- aws s3api put-object --bucket builds-parity --key $CI_BUILD_REF_NAME/x86_64-unknown-centos-gnu/parity-evm.md5 --body parity-evm.md5
- aws s3api put-object --bucket builds-parity --key $CI_BUILD_REF_NAME/x86_64-unknown-centos-gnu/ethstore --body target/release/ethstore
- aws s3api put-object --bucket builds-parity --key $CI_BUILD_REF_NAME/x86_64-unknown-centos-gnu/ethstore.md5 --body ethstore.md5
- aws s3api put-object --bucket builds-parity --key $CI_BUILD_REF_NAME/x86_64-unknown-centos-gnu/ethkey --body target/release/ethkey
- aws s3api put-object --bucket builds-parity --key $CI_BUILD_REF_NAME/x86_64-unknown-centos-gnu/ethkey.md5 --body ethkey.md5
- curl --data "commit=$CI_BUILD_REF&sha3=$SHA3&filename=parity&secret=$RELEASES_SECRET" http://update.parity.io:1337/push-build/$CI_BUILD_REF_NAME/$PLATFORM
- curl --data "commit=$CI_BUILD_REF&sha3=$SHA3&filename=parity&secret=$RELEASES_SECRET" http://update.parity.io:1338/push-build/$CI_BUILD_REF_NAME/$PLATFORM
- scripts/gitlab-build.sh x86_64-unknown-centos-gnu x86_64-unknown-linux-gnu x86_64 gcc g++
tags:
- rust
- rust-centos
artifacts:
paths:
- target/release/parity
- parity.zip
name: "x86_64-unknown-centos-gnu_parity"
linux-i686:
stage: build
@ -232,47 +74,13 @@ linux-i686:
- stable
- triggers
script:
- export HOST_CC=gcc
- export HOST_CXX=g++
- export COMMIT=$(git rev-parse HEAD)
- export PLATFORM=i686-unknown-linux-gnu
- cargo build -j $(nproc) --target $PLATFORM --features final --release $CARGOFLAGS
- cargo build -j $(nproc) --target $PLATFORM --release -p evmbin
- cargo build -j $(nproc) --target $PLATFORM --release -p ethstore-cli
- cargo build -j $(nproc) --target $PLATFORM --release -p ethkey-cli
- strip target/$PLATFORM/release/parity
- strip target/$PLATFORM/release/parity-evm
- strip target/$PLATFORM/release/ethstore
- strip target/$PLATFORM/release/ethkey
- strip target/$PLATFORM/release/parity
- md5sum target/$PLATFORM/release/parity > parity.md5
- export SHA3=$(target/$PLATFORM/release/parity tools hash target/$PLATFORM/release/parity)
- sh scripts/deb-build.sh i386
- cp target/$PLATFORM/release/parity deb/usr/bin/parity
- cp target/$PLATFORM/release/parity-evm deb/usr/bin/parity-evm
- cp target/$PLATFORM/release/ethstore deb/usr/bin/ethstore
- cp target/$PLATFORM/release/ethkey deb/usr/bin/ethkey
- export VER=$(grep -m 1 version Cargo.toml | awk '{print $3}' | tr -d '"' | tr -d "\n")
- dpkg-deb -b deb "parity_"$VER"_i386.deb"
- md5sum "parity_"$VER"_i386.deb" > "parity_"$VER"_i386.deb.md5"
- aws configure set aws_access_key_id $s3_key
- aws configure set aws_secret_access_key $s3_secret
- if [[ $CI_BUILD_REF_NAME =~ ^(master|beta|stable|nightly)$ ]]; then export S3_BUCKET=builds-parity-published; else export S3_BUCKET=builds-parity; fi
- aws s3 rm --recursive s3://$S3_BUCKET/$CI_BUILD_REF_NAME/$PLATFORM
- aws s3api put-object --bucket $S3_BUCKET --key $CI_BUILD_REF_NAME/$PLATFORM/parity --body target/$PLATFORM/release/parity
- aws s3api put-object --bucket $S3_BUCKET --key $CI_BUILD_REF_NAME/$PLATFORM/parity.md5 --body parity.md5
- aws s3api put-object --bucket $S3_BUCKET --key $CI_BUILD_REF_NAME/$PLATFORM/"parity_"$VER"_i386.deb" --body "parity_"$VER"_i386.deb"
- aws s3api put-object --bucket $S3_BUCKET --key $CI_BUILD_REF_NAME/$PLATFORM/"parity_"$VER"_i386.deb.md5" --body "parity_"$VER"_i386.deb.md5"
- curl --data "commit=$CI_BUILD_REF&sha3=$SHA3&filename=parity&secret=$RELEASES_SECRET" http://update.parity.io:1337/push-build/$CI_BUILD_REF_NAME/$PLATFORM
- curl --data "commit=$CI_BUILD_REF&sha3=$SHA3&filename=parity&secret=$RELEASES_SECRET" http://update.parity.io:1338/push-build/$CI_BUILD_REF_NAME/$PLATFORM
- scripts/gitlab-build.sh i686-unknown-linux-gnu i686-unknown-linux-gnu i386 gcc g++
tags:
- rust
- rust-i686
artifacts:
paths:
- target/i686-unknown-linux-gnu/release/parity
- parity.zip
name: "i686-unknown-linux-gnu"
allow_failure: true
linux-armv7:
stage: build
image: parity/rust-armv7:gitlab-ci
@ -282,55 +90,13 @@ linux-armv7:
- stable
- triggers
script:
- export CC=arm-linux-gnueabihf-gcc
- export CXX=arm-linux-gnueabihf-g++
- export HOST_CC=gcc
- export HOST_CXX=g++
- export PLATFORM=armv7-unknown-linux-gnueabihf
- rm -rf .cargo
- mkdir -p .cargo
- echo "[target.$PLATFORM]" >> .cargo/config
- echo "linker= \"arm-linux-gnueabihf-gcc\"" >> .cargo/config
- cat .cargo/config
- cargo build -j $(nproc) --target $PLATFORM --features final --release $CARGOFLAGS
- cargo build -j $(nproc) --target $PLATFORM --release -p evmbin
- cargo build -j $(nproc) --target $PLATFORM --release -p ethstore-cli
- cargo build -j $(nproc) --target $PLATFORM --release -p ethkey-cli
- md5sum target/$PLATFORM/release/parity > parity.md5
- export SHA3=$(target/$PLATFORM/release/parity tools hash target/$PLATFORM/release/parity)
- sh scripts/deb-build.sh i386
- arm-linux-gnueabihf-strip target/$PLATFORM/release/parity
- arm-linux-gnueabihf-strip target/$PLATFORM/release/parity-evm
- arm-linux-gnueabihf-strip target/$PLATFORM/release/ethstore
- arm-linux-gnueabihf-strip target/$PLATFORM/release/ethkey
- export SHA3=$(rhash --sha3-256 target/$PLATFORM/release/parity -p %h)
- md5sum target/$PLATFORM/release/parity > parity.md5
- sh scripts/deb-build.sh armhf
- cp target/$PLATFORM/release/parity deb/usr/bin/parity
- cp target/$PLATFORM/release/parity-evm deb/usr/bin/parity-evm
- cp target/$PLATFORM/release/ethstore deb/usr/bin/ethstore
- cp target/$PLATFORM/release/ethkey deb/usr/bin/ethkey
- export VER=$(grep -m 1 version Cargo.toml | awk '{print $3}' | tr -d '"' | tr -d "\n")
- dpkg-deb -b deb "parity_"$VER"_armhf.deb"
- md5sum "parity_"$VER"_armhf.deb" > "parity_"$VER"_armhf.deb.md5"
- aws configure set aws_access_key_id $s3_key
- aws configure set aws_secret_access_key $s3_secret
- if [[ $CI_BUILD_REF_NAME =~ ^(master|beta|stable|nightly)$ ]]; then export S3_BUCKET=builds-parity-published; else export S3_BUCKET=builds-parity; fi
- aws s3 rm --recursive s3://$S3_BUCKET/$CI_BUILD_REF_NAME/$PLATFORM
- aws s3api put-object --bucket $S3_BUCKET --key $CI_BUILD_REF_NAME/$PLATFORM/parity --body target/$PLATFORM/release/parity
- aws s3api put-object --bucket $S3_BUCKET --key $CI_BUILD_REF_NAME/$PLATFORM/parity.md5 --body parity.md5
- aws s3api put-object --bucket $S3_BUCKET --key $CI_BUILD_REF_NAME/$PLATFORM/"parity_"$VER"_armhf.deb" --body "parity_"$VER"_armhf.deb"
- aws s3api put-object --bucket $S3_BUCKET --key $CI_BUILD_REF_NAME/$PLATFORM/"parity_"$VER"_armhf.deb.md5" --body "parity_"$VER"_armhf.deb.md5"
- curl --data "commit=$CI_BUILD_REF&sha3=$SHA3&filename=parity&secret=$RELEASES_SECRET" http://update.parity.io:1337/push-build/$CI_BUILD_REF_NAME/$PLATFORM
- curl --data "commit=$CI_BUILD_REF&sha3=$SHA3&filename=parity&secret=$RELEASES_SECRET" http://update.parity.io:1338/push-build/$CI_BUILD_REF_NAME/$PLATFORM
- scripts/gitlab-build.sh armv7-unknown-linux-gnueabihf armv7-unknown-linux-gnueabihf armhf arm-linux-gnueabihf-gcc arm-linux-gnueabihf-g++
tags:
- rust
- rust-arm
artifacts:
paths:
- target/armv7-unknown-linux-gnueabihf/release/parity
- parity.zip
name: "armv7_unknown_linux_gnueabihf_parity"
allow_failure: true
linux-arm:
stage: build
image: parity/rust-arm:gitlab-ci
@ -340,52 +106,13 @@ linux-arm:
- stable
- triggers
script:
- export CC=arm-linux-gnueabihf-gcc
- export CXX=arm-linux-gnueabihf-g++
- export HOST_CC=gcc
- export HOST_CXX=g++
- export PLATFORM=arm-unknown-linux-gnueabihf
- rm -rf .cargo
- mkdir -p .cargo
- echo "[target.$PLATFORM]" >> .cargo/config
- echo "linker= \"arm-linux-gnueabihf-gcc\"" >> .cargo/config
- cat .cargo/config
- cargo build -j $(nproc) --target $PLATFORM --features final --release $CARGOFLAGS
- cargo build -j $(nproc) --target $PLATFORM --release -p evmbin
- cargo build -j $(nproc) --target $PLATFORM --release -p ethstore-cli
- cargo build -j $(nproc) --target $PLATFORM --release -p ethkey-cli
- arm-linux-gnueabihf-strip target/$PLATFORM/release/parity
- arm-linux-gnueabihf-strip target/$PLATFORM/release/parity-evm
- arm-linux-gnueabihf-strip target/$PLATFORM/release/ethstore
- arm-linux-gnueabihf-strip target/$PLATFORM/release/ethkey
- export SHA3=$(rhash --sha3-256 target/$PLATFORM/release/parity -p %h)
- md5sum target/$PLATFORM/release/parity > parity.md5
- sh scripts/deb-build.sh armhf
- cp target/$PLATFORM/release/parity deb/usr/bin/parity
- cp target/$PLATFORM/release/parity-evm deb/usr/bin/parity-evm
- cp target/$PLATFORM/release/ethstore deb/usr/bin/ethstore
- cp target/$PLATFORM/release/ethkey deb/usr/bin/ethkey
- export VER=$(grep -m 1 version Cargo.toml | awk '{print $3}' | tr -d '"' | tr -d "\n")
- dpkg-deb -b deb "parity_"$VER"_armhf.deb"
- md5sum "parity_"$VER"_armhf.deb" > "parity_"$VER"_armhf.deb.md5"
- aws configure set aws_access_key_id $s3_key
- aws configure set aws_secret_access_key $s3_secret
- if [[ $CI_BUILD_REF_NAME =~ ^(master|beta|stable|nightly)$ ]]; then export S3_BUCKET=builds-parity-published; else export S3_BUCKET=builds-parity; fi
- aws s3 rm --recursive s3://$S3_BUCKET/$CI_BUILD_REF_NAME/$PLATFORM
- aws s3api put-object --bucket $S3_BUCKET --key $CI_BUILD_REF_NAME/$PLATFORM/parity --body target/$PLATFORM/release/parity
- aws s3api put-object --bucket $S3_BUCKET --key $CI_BUILD_REF_NAME/$PLATFORM/parity.md5 --body parity.md5
- aws s3api put-object --bucket $S3_BUCKET --key $CI_BUILD_REF_NAME/$PLATFORM/"parity_"$VER"_armhf.deb" --body "parity_"$VER"_armhf.deb"
- aws s3api put-object --bucket $S3_BUCKET --key $CI_BUILD_REF_NAME/$PLATFORM/"parity_"$VER"_armhf.deb.md5" --body "parity_"$VER"_armhf.deb.md5"
- curl --data "commit=$CI_BUILD_REF&sha3=$SHA3&filename=parity&secret=$RELEASES_SECRET" http://update.parity.io:1337/push-build/$CI_BUILD_REF_NAME/$PLATFORM
- curl --data "commit=$CI_BUILD_REF&sha3=$SHA3&filename=parity&secret=$RELEASES_SECRET" http://update.parity.io:1338/push-build/$CI_BUILD_REF_NAME/$PLATFORM
- scripts/gitlab-build.sh arm-unknown-linux-gnueabihf arm-unknown-linux-gnueabihf armhf arm-linux-gnueabihf-gcc arm-linux-gnueabihf-g++
tags:
- rust
- rust-arm
artifacts:
paths:
- target/arm-unknown-linux-gnueabihf/release/parity
- parity.zip
name: "arm-unknown-linux-gnueabihf_parity"
allow_failure: true
linux-aarch64:
stage: build
image: parity/rust-arm64:gitlab-ci
@ -395,50 +122,29 @@ linux-aarch64:
- stable
- triggers
script:
- export CC=aarch64-linux-gnu-gcc
- export CXX=aarch64-linux-gnu-g++
- export HOST_CC=gcc
- export HOST_CXX=g++
- export PLATFORM=aarch64-unknown-linux-gnu
- rm -rf .cargo
- mkdir -p .cargo
- echo "[target.$PLATFORM]" >> .cargo/config
- echo "linker= \"aarch64-linux-gnu-gcc\"" >> .cargo/config
- cat .cargo/config
- cargo build -j $(nproc) --target $PLATFORM --features final --release $CARGOFLAGS
- cargo build -j $(nproc) --target $PLATFORM --release -p evmbin
- cargo build -j $(nproc) --target $PLATFORM --release -p ethstore-cli
- cargo build -j $(nproc) --target $PLATFORM --release -p ethkey-cli
- aarch64-linux-gnu-strip target/$PLATFORM/release/parity
- aarch64-linux-gnu-strip target/$PLATFORM/release/parity-evm
- aarch64-linux-gnu-strip target/$PLATFORM/release/ethstore
- aarch64-linux-gnu-strip target/$PLATFORM/release/ethkey
- export SHA3=$(rhash --sha3-256 target/$PLATFORM/release/parity -p %h)
- md5sum target/$PLATFORM/release/parity > parity.md5
- sh scripts/deb-build.sh arm64
- cp target/$PLATFORM/release/parity deb/usr/bin/parity
- cp target/$PLATFORM/release/parity-evm deb/usr/bin/parity-evm
- cp target/$PLATFORM/release/ethstore deb/usr/bin/ethstore
- cp target/$PLATFORM/release/ethkey deb/usr/bin/ethkey
- export VER=$(grep -m 1 version Cargo.toml | awk '{print $3}' | tr -d '"' | tr -d "\n")
- dpkg-deb -b deb "parity_"$VER"_arm64.deb"
- md5sum "parity_"$VER"_arm64.deb" > "parity_"$VER"_arm64.deb.md5"
- aws configure set aws_access_key_id $s3_key
- aws configure set aws_secret_access_key $s3_secret
- if [[ $CI_BUILD_REF_NAME =~ ^(master|beta|stable|nightly)$ ]]; then export S3_BUCKET=builds-parity-published; else export S3_BUCKET=builds-parity; fi
- aws s3 rm --recursive s3://$S3_BUCKET/$CI_BUILD_REF_NAME/$PLATFORM
- aws s3api put-object --bucket $S3_BUCKET --key $CI_BUILD_REF_NAME/$PLATFORM/parity.md5 --body parity.md5
- aws s3api put-object --bucket $S3_BUCKET --key $CI_BUILD_REF_NAME/$PLATFORM/"parity_"$VER"_arm64.deb" --body "parity_"$VER"_arm64.deb"
- aws s3api put-object --bucket $S3_BUCKET --key $CI_BUILD_REF_NAME/$PLATFORM/"parity_"$VER"_arm64.deb.md5" --body "parity_"$VER"_arm64.deb.md5"
- curl --data "commit=$CI_BUILD_REF&sha3=$SHA3&filename=parity&secret=$RELEASES_SECRET" http://update.parity.io:1337/push-build/$CI_BUILD_REF_NAME/$PLATFORM
- curl --data "commit=$CI_BUILD_REF&sha3=$SHA3&filename=parity&secret=$RELEASES_SECRET" http://update.parity.io:1338/push-build/$CI_BUILD_REF_NAME/$PLATFORM
- scripts/gitlab-build.sh aarch64-unknown-linux-gnu aarch64-unknown-linux-gnu arm64 aarch64-linux-gnu-gcc aarch64-linux-gnu-g++
tags:
- rust
- rust-arm
artifacts:
paths:
- target/aarch64-unknown-linux-gnu/release/parity
- parity.zip
name: "aarch64-unknown-linux-gnu_parity"
linux-snap:
stage: build
image: parity/snapcraft:gitlab-ci
only:
- stable
- beta
- tags
- triggers
script:
- scripts/gitlab-build.sh x86_64-unknown-snap-gnu x86_64-unknown-linux-gnu amd64 gcc g++
tags:
- rust-stable
artifacts:
paths:
- scripts/parity_*_amd64.snap
name: "stable-x86_64-unknown-snap-gnu_parity"
allow_failure: true
darwin:
stage: build
@ -447,45 +153,17 @@ darwin:
- tags
- stable
- triggers
script: |
export COMMIT=$(git rev-parse HEAD)
export PLATFORM=x86_64-apple-darwin
rustup default stable
cargo clean
cargo build -j 8 --features final --release #$CARGOFLAGS
cargo build -j 8 --release -p ethstore-cli #$CARGOFLAGS
cargo build -j 8 --release -p ethkey-cli #$CARGOFLAGS
cargo build -j 8 --release -p evmbin #$CARGOFLAGS
rm -rf parity.md5
md5sum target/release/parity > parity.md5
export SHA3=$(target/release/parity tools hash target/release/parity)
cd mac
xcodebuild -configuration Release
cd ..
packagesbuild -v mac/Parity.pkgproj
productsign --sign 'Developer ID Installer: PARITY TECHNOLOGIES LIMITED (P2PX3JU8FT)' target/release/Parity\ Ethereum.pkg target/release/Parity\ Ethereum-signed.pkg
export VER=$(grep -m 1 version Cargo.toml | awk '{print $3}' | tr -d '"' | tr -d "\n")
mv target/release/Parity\ Ethereum-signed.pkg "parity-"$VER"-macos-installer.pkg"
md5sum "parity-"$VER"-macos-installer.pkg" >> "parity-"$VER"-macos-installer.pkg.md5"
aws configure set aws_access_key_id $s3_key
aws configure set aws_secret_access_key $s3_secret
if [[ $CI_BUILD_REF_NAME =~ ^(master|beta|stable|nightly)$ ]]; then export S3_BUCKET=builds-parity-published; else export S3_BUCKET=builds-parity; fi
aws s3 rm --recursive s3://$S3_BUCKET/$CI_BUILD_REF_NAME/$PLATFORM
aws s3api put-object --bucket $S3_BUCKET --key $CI_BUILD_REF_NAME/$PLATFORM/parity --body target/release/parity
aws s3api put-object --bucket $S3_BUCKET --key $CI_BUILD_REF_NAME/$PLATFORM/parity.md5 --body parity.md5
aws s3api put-object --bucket $S3_BUCKET --key $CI_BUILD_REF_NAME/$PLATFORM/"parity-"$VER"-macos-installer.pkg" --body "parity-"$VER"-macos-installer.pkg"
aws s3api put-object --bucket $S3_BUCKET --key $CI_BUILD_REF_NAME/$PLATFORM/"parity-"$VER"-macos-installer.pkg.md5" --body "parity-"$VER"-macos-installer.pkg.md5"
curl --data "commit=$CI_BUILD_REF&sha3=$SHA3&filename=parity&secret=$RELEASES_SECRET" http://update.parity.io:1337/push-build/$CI_BUILD_REF_NAME/$PLATFORM
curl --data "commit=$CI_BUILD_REF&sha3=$SHA3&filename=parity&secret=$RELEASES_SECRET" http://update.parity.io:1338/push-build/$CI_BUILD_REF_NAME/$PLATFORM
script:
- scripts/gitlab-build.sh x86_64-apple-darwin x86_64-apple-darwin macos gcc g++
tags:
- osx
artifacts:
paths:
- target/release/parity
- parity.zip
name: "x86_64-apple-darwin_parity"
windows:
cache:
key: "%CI_BUILD_STAGE%/%CI_BUILD_REF_NAME%"
key: "%CI_BUILD_STAGE%-%CI_BUILD_REF_NAME%"
untracked: true
stage: build
only:
@ -494,62 +172,12 @@ windows:
- stable
- triggers
script:
- set PLATFORM=x86_64-pc-windows-msvc
- set INCLUDE=C:\Program Files (x86)\Microsoft SDKs\Windows\v7.1A\Include;C:\vs2015\VC\include;C:\Program Files (x86)\Windows Kits\10\Include\10.0.10240.0\ucrt
- set LIB=C:\vs2015\VC\lib;C:\Program Files (x86)\Windows Kits\10\Lib\10.0.10240.0\ucrt\x64
- set RUST_BACKTRACE=1
- set RUSTFLAGS=%RUSTFLAGS%
- rustup default stable-x86_64-pc-windows-msvc
- cargo clean
- cargo build --features final --release #%CARGOFLAGS%
- cargo build --release -p ethstore-cli #%CARGOFLAGS%
- cargo build --release -p ethkey-cli #%CARGOFLAGS%
- cargo build --release -p evmbin #%CARGOFLAGS%
- signtool sign /f %keyfile% /p %certpass% target\release\parity.exe
- target\release\parity.exe tools hash target\release\parity.exe > parity.sha3
- set /P SHA3=<parity.sha3
- curl -sL --url "https://github.com/paritytech/win-build/raw/master/SimpleFC.dll" -o nsis\SimpleFC.dll
- curl -sL --url "https://github.com/paritytech/win-build/raw/master/vc_redist.x64.exe" -o nsis\vc_redist.x64.exe
- msbuild windows\ptray\ptray.vcxproj /p:Platform=x64 /p:Configuration=Release
- signtool sign /f %keyfile% /p %certpass% windows\ptray\x64\release\ptray.exe
- cd nsis
- makensis.exe installer.nsi
- copy installer.exe InstallParity.exe
- signtool sign /f %keyfile% /p %certpass% InstallParity.exe
- md5sums InstallParity.exe > InstallParity.exe.md5
- zip win-installer.zip InstallParity.exe InstallParity.exe.md5
- md5sums win-installer.zip > win-installer.zip.md5
- cd ..\target\release\
- md5sums parity.exe > parity.exe.md5
- zip parity.zip parity.exe parity.md5
- md5sums parity.zip > parity.zip.md5
- cd ..\..
- aws configure set aws_access_key_id %s3_key%
- aws configure set aws_secret_access_key %s3_secret%
- echo %CI_BUILD_REF_NAME%
- echo %CI_BUILD_REF_NAME% | findstr /R "master" >nul 2>&1 && set S3_BUCKET=builds-parity-published|| set S3_BUCKET=builds-parity
- echo %CI_BUILD_REF_NAME% | findstr /R "beta" >nul 2>&1 && set S3_BUCKET=builds-parity-published|| set S3_BUCKET=builds-parity
- echo %CI_BUILD_REF_NAME% | findstr /R "stable" >nul 2>&1 && set S3_BUCKET=builds-parity-published|| set S3_BUCKET=builds-parity
- echo %CI_BUILD_REF_NAME% | findstr /R "nightly" >nul 2>&1 && set S3_BUCKET=builds-parity-published|| set S3_BUCKET=builds-parity
- echo %S3_BUCKET%
- aws s3 rm --recursive s3://%S3_BUCKET%/%CI_BUILD_REF_NAME%/x86_64-pc-windows-msvc
- aws s3api put-object --bucket %S3_BUCKET% --key %CI_BUILD_REF_NAME%/x86_64-pc-windows-msvc/parity.exe --body target\release\parity.exe
- aws s3api put-object --bucket %S3_BUCKET% --key %CI_BUILD_REF_NAME%/x86_64-pc-windows-msvc/parity.exe.md5 --body target\release\parity.exe.md5
- aws s3api put-object --bucket %S3_BUCKET% --key %CI_BUILD_REF_NAME%/x86_64-pc-windows-msvc/parity.zip --body target\release\parity.zip
- aws s3api put-object --bucket %S3_BUCKET% --key %CI_BUILD_REF_NAME%/x86_64-pc-windows-msvc/parity.zip.md5 --body target\release\parity.zip.md5
- aws s3api put-object --bucket %S3_BUCKET% --key %CI_BUILD_REF_NAME%/x86_64-pc-windows-msvc/InstallParity.exe --body nsis\InstallParity.exe
- aws s3api put-object --bucket %S3_BUCKET% --key %CI_BUILD_REF_NAME%/x86_64-pc-windows-msvc/InstallParity.exe.md5 --body nsis\InstallParity.exe.md5
- aws s3api put-object --bucket %S3_BUCKET% --key %CI_BUILD_REF_NAME%/x86_64-pc-windows-msvc/win-installer.zip --body nsis\win-installer.zip
- aws s3api put-object --bucket %S3_BUCKET% --key %CI_BUILD_REF_NAME%/x86_64-pc-windows-msvc/win-installer.zip.md5 --body nsis\win-installer.zip.md5
- curl --data "commit=%CI_BUILD_REF%&sha3=%SHA3%&filename=parity.exe&secret=%RELEASES_SECRET%" http://update.parity.io:1337/push-build/%CI_BUILD_REF_NAME%/%PLATFORM%
- curl --data "commit=%CI_BUILD_REF%&sha3=%SHA3%&filename=parity.exe&secret=%RELEASES_SECRET%" http://update.parity.io:1338/push-build/%CI_BUILD_REF_NAME%/%PLATFORM%
- sh scripts/gitlab-build.sh x86_64-pc-windows-msvc x86_64-pc-windows-msvc installer "" "" ""
tags:
- rust-windows
artifacts:
paths:
- target/release/parity.exe
- target/release/parity.pdb
- nsis/InstallParity.exe
- parity.zip
name: "x86_64-pc-windows-msvc_parity"
docker-build:
stage: build
@ -559,10 +187,10 @@ docker-build:
before_script:
- docker info
script:
- if [ "$CI_BUILD_REF_NAME" == "beta-release" ]; then DOCKER_TAG="latest"; else DOCKER_TAG=$CI_BUILD_REF_NAME; fi
- DOCKER_TAG=$CI_BUILD_REF_NAME
- echo "Tag:" $DOCKER_TAG
- docker login -u $Docker_Hub_User_Parity -p $Docker_Hub_Pass_Parity
- sh scripts/docker-build.sh $DOCKER_TAG
- scripts/docker-build.sh $DOCKER_TAG
- docker logout
tags:
- docker
@ -571,63 +199,16 @@ test-coverage:
only:
- master
script:
- git submodule update --init --recursive
- rm -rf target/*
- rm -rf js/.coverage
- scripts/cov.sh
# - COVERAGE=$(grep -Po 'covered":.*?[^\\]"' target/cov/index.json | grep "[0-9]*\.[0-9]" -o)
# - echo "Coverage:" $COVERAGE
- scripts/gitlab-test.sh test-coverage
tags:
- kcov
allow_failure: true
test-darwin:
stage: test
only:
- triggers
variables:
RUST_BACKTRACE: 1
script:
- git submodule update --init --recursive
- if [ $RUST_FILES_MODIFIED -eq 0 ]; then echo "Skipping Rust tests since no Rust files modified."; else ./test.sh $CARGOFLAGS; fi
tags:
- osx
allow_failure: true
test-windows:
stage: test
only:
- triggers
variables:
RUST_BACKTRACE: 1
script:
- git submodule update --init --recursive
- echo cargo test --features json-tests -p rlp -p ethash -p ethcore -p ethcore-bigint -p parity-dapps -p parity-rpc -p ethcore-util -p ethcore-network -p ethcore-io -p ethkey -p ethstore -p ethsync -p ethcore-ipc -p ethcore-ipc-tests -p ethcore-ipc-nano -p parity-rpc-client -p parity %CARGOFLAGS% --verbose --release
tags:
- rust-windows
allow_failure: true
test-rust-stable:
stage: test
image: parity/rust:gitlab-ci
variables:
RUST_BACKTRACE: 1
script:
- git submodule update --init --recursive
- rustup show
- if [ $RUST_FILES_MODIFIED -eq 0 ]; then echo "Skipping Rust tests since no Rust files modified."; else ./test.sh $CARGOFLAGS; fi
- if [ "$CI_BUILD_REF_NAME" == "nightly" ]; then sh scripts/aura-test.sh; fi
- scripts/gitlab-test.sh stable
tags:
- rust
- rust-stable
js-test:
stage: test
image: parity/rust:gitlab-ci
script:
- git submodule update --init --recursive
- if [ $JS_FILES_MODIFIED -eq 0 ]; then echo "Skipping JS deps install since no JS files modified."; else ./js/scripts/install-deps.sh;fi
- if [ $JS_OLD_FILES_MODIFIED -eq 0 ]; then echo "Skipping JS (old) deps install since no JS files modified."; else ./js-old/scripts/install-deps.sh;fi
- if [ $JS_FILES_MODIFIED -eq 0 ]; then echo "Skipping JS lint since no JS files modified."; else ./js/scripts/lint.sh && ./js/scripts/test.sh && ./js/scripts/build.sh; fi
- if [ $JS_OLD_FILES_MODIFIED -eq 0 ]; then echo "Skipping JS (old) lint since no JS files modified."; else ./js-old/scripts/lint.sh && ./js-old/scripts/test.sh && ./js-old/scripts/build.sh; fi
tags:
- rust
- rust-stable
test-rust-beta:
stage: test
@ -635,14 +216,9 @@ test-rust-beta:
- triggers
- master
image: parity/rust:gitlab-ci
variables:
RUST_BACKTRACE: 1
script:
- git submodule update --init --recursive
- rustup default beta
- if [ $RUST_FILES_MODIFIED -eq 0 ]; then echo "Skipping Rust tests since no Rust files modified."; else ./test.sh $CARGOFLAGS; fi
- scripts/gitlab-test.sh beta
tags:
- rust
- rust-beta
allow_failure: true
test-rust-nightly:
@ -651,16 +227,19 @@ test-rust-nightly:
- triggers
- master
image: parity/rust:gitlab-ci
variables:
RUST_BACKTRACE: 1
script:
- git submodule update --init --recursive
- rustup default nightly
- if [ $RUST_FILES_MODIFIED -eq 0 ]; then echo "Skipping Rust tests since no Rust files modified."; else ./test.sh $CARGOFLAGS; fi
- scripts/gitlab-test.sh nightly
tags:
- rust
- rust-nightly
allow_failure: true
js-test:
stage: test
image: parity/rust:gitlab-ci
script:
- scripts/gitlab-test.sh js-test
tags:
- rust-stable
js-release:
stage: js-build
only:
@ -671,16 +250,7 @@ js-release:
- triggers
image: parity/rust:gitlab-ci
script:
- rustup default stable
- echo $JS_FILES_MODIFIED
- if [ $JS_FILES_MODIFIED -eq 0 ]; then echo "Skipping JS deps install since no JS files modified."; else ./js/scripts/install-deps.sh;fi
- if [ $JS_FILES_MODIFIED -eq 0 ]; then echo "Skipping JS rebuild since no JS files modified."; else ./js/scripts/build.sh && ./js/scripts/push-precompiled.sh; fi
- echo $JS_OLD_FILES_MODIFIED
- if [ $JS_OLD_FILES_MODIFIED -eq 0 ]; then echo "Skipping JS (old) deps install since no JS files modified."; else ./js-old/scripts/install-deps.sh;fi
- if [ $JS_OLD_FILES_MODIFIED -eq 0 ]; then echo "Skipping JS (old) rebuild since no JS files modified."; else ./js-old/scripts/build.sh && ./js-old/scripts/push-precompiled.sh; fi
- if [ $JS_FILES_MODIFIED -eq 0 ] && [ $JS_OLD_FILES_MODIFIED -eq 0 ]; then echo "Skipping Cargo update since no JS files modified."; else ./js/scripts/push-cargo.sh; fi
- scripts/gitlab-test.sh js-release
tags:
- javascript
push-release:
@ -695,13 +265,3 @@ push-release:
- curl --data "secret=$RELEASES_SECRET" http://update.parity.io:1338/push-release/$CI_BUILD_REF_NAME/$CI_BUILD_REF
tags:
- curl
# ---------------------------------------------------------------------------
.functions: &functions |
export JS_FILES_MODIFIED=$(git --no-pager diff --name-only master...$CI_BUILD_REF | grep ^js/ | wc -l)
export JS_OLD_FILES_MODIFIED=$(git --no-pager diff --name-only master...$CI_BUILD_REF | grep ^js-old/ | wc -l)
export RUST_FILES_MODIFIED=$(git --no-pager diff --name-only master...$CI_BUILD_REF | grep -v -e ^js -e ^\\. -e ^LICENSE -e ^README.md -e ^test.sh -e ^windows/ -e ^scripts/ -e^mac/ -e ^nsis/ | wc -l)
before_script:
- *functions

Cargo.lock generated
View File

@ -25,6 +25,11 @@ name = "ansi_term"
version = "0.9.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "ansi_term"
version = "0.10.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "app_dirs"
version = "1.1.1"
@ -163,6 +168,11 @@ name = "bitflags"
version = "0.9.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "bitflags"
version = "1.0.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
[[package]]
name = "bloomable"
version = "0.1.0"
@ -234,15 +244,14 @@ dependencies = [
[[package]]
name = "clap"
version = "2.26.2"
version = "2.29.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"ansi_term 0.9.0 (registry+https://github.com/rust-lang/crates.io-index)",
"ansi_term 0.10.2 (registry+https://github.com/rust-lang/crates.io-index)",
"atty 0.2.2 (registry+https://github.com/rust-lang/crates.io-index)",
"bitflags 0.9.1 (registry+https://github.com/rust-lang/crates.io-index)",
"bitflags 1.0.1 (registry+https://github.com/rust-lang/crates.io-index)",
"strsim 0.6.0 (registry+https://github.com/rust-lang/crates.io-index)",
"term_size 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)",
"textwrap 0.8.0 (registry+https://github.com/rust-lang/crates.io-index)",
"textwrap 0.9.0 (registry+https://github.com/rust-lang/crates.io-index)",
"unicode-width 0.1.4 (registry+https://github.com/rust-lang/crates.io-index)",
"vec_map 0.8.0 (registry+https://github.com/rust-lang/crates.io-index)",
]
@ -663,6 +672,8 @@ dependencies = [
"rlp 0.2.1",
"rust-crypto 0.2.36 (registry+https://github.com/rust-lang/crates.io-index)",
"rustc-hex 1.0.0 (registry+https://github.com/rust-lang/crates.io-index)",
"serde 1.0.15 (registry+https://github.com/rust-lang/crates.io-index)",
"serde_derive 1.0.15 (registry+https://github.com/rust-lang/crates.io-index)",
"serde_json 1.0.3 (registry+https://github.com/rust-lang/crates.io-index)",
"slab 0.2.0 (registry+https://github.com/rust-lang/crates.io-index)",
"snappy 0.1.0 (git+https://github.com/paritytech/rust-snappy)",
@ -1903,11 +1914,11 @@ dependencies = [
[[package]]
name = "parity"
version = "1.9.0"
version = "1.9.1"
dependencies = [
"ansi_term 0.9.0 (registry+https://github.com/rust-lang/crates.io-index)",
"app_dirs 1.1.1 (registry+https://github.com/rust-lang/crates.io-index)",
"clap 2.26.2 (registry+https://github.com/rust-lang/crates.io-index)",
"clap 2.29.1 (registry+https://github.com/rust-lang/crates.io-index)",
"ctrlc 1.1.1 (git+https://github.com/paritytech/rust-ctrlc.git)",
"daemonize 0.2.3 (registry+https://github.com/rust-lang/crates.io-index)",
"dir 0.1.0",
@ -1951,7 +1962,7 @@ dependencies = [
"parity-rpc 1.9.0",
"parity-rpc-client 1.4.0",
"parity-updater 1.9.0",
"parity-version 1.9.0",
"parity-version 1.9.1",
"parity-whisper 0.1.0",
"parking_lot 0.4.8 (registry+https://github.com/rust-lang/crates.io-index)",
"path 0.1.0",
@ -1966,6 +1977,8 @@ dependencies = [
"serde 1.0.15 (registry+https://github.com/rust-lang/crates.io-index)",
"serde_derive 1.0.15 (registry+https://github.com/rust-lang/crates.io-index)",
"serde_json 1.0.3 (registry+https://github.com/rust-lang/crates.io-index)",
"term_size 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)",
"textwrap 0.9.0 (registry+https://github.com/rust-lang/crates.io-index)",
"time 0.1.38 (registry+https://github.com/rust-lang/crates.io-index)",
"toml 0.4.5 (registry+https://github.com/rust-lang/crates.io-index)",
"winapi 0.2.8 (registry+https://github.com/rust-lang/crates.io-index)",
@ -1997,7 +2010,7 @@ dependencies = [
"parity-hash-fetch 1.9.0",
"parity-reactor 0.1.0",
"parity-ui 1.9.0",
"parity-version 1.9.0",
"parity-version 1.9.1",
"parking_lot 0.4.8 (registry+https://github.com/rust-lang/crates.io-index)",
"rand 0.3.16 (registry+https://github.com/rust-lang/crates.io-index)",
"rustc-hex 1.0.0 (registry+https://github.com/rust-lang/crates.io-index)",
@ -2145,7 +2158,7 @@ dependencies = [
"order-stat 0.1.3 (registry+https://github.com/rust-lang/crates.io-index)",
"parity-reactor 0.1.0",
"parity-updater 1.9.0",
"parity-version 1.9.0",
"parity-version 1.9.1",
"parking_lot 0.4.8 (registry+https://github.com/rust-lang/crates.io-index)",
"pretty_assertions 0.1.2 (registry+https://github.com/rust-lang/crates.io-index)",
"rand 0.3.16 (registry+https://github.com/rust-lang/crates.io-index)",
@ -2204,8 +2217,8 @@ version = "1.9.0"
dependencies = [
"parity-ui-dev 1.9.0",
"parity-ui-old-dev 1.9.0",
"parity-ui-old-precompiled 1.9.0 (git+https://github.com/js-dist-paritytech/parity-master-1-9-v1.git)",
"parity-ui-precompiled 1.9.0 (git+https://github.com/js-dist-paritytech/parity-master-1-9-shell.git)",
"parity-ui-old-precompiled 1.9.0 (git+https://github.com/js-dist-paritytech/parity-beta-1-9-v1.git)",
"parity-ui-precompiled 1.9.0 (git+https://github.com/js-dist-paritytech/parity-beta-1-9-shell.git)",
"rustc_version 0.1.7 (registry+https://github.com/rust-lang/crates.io-index)",
]
@ -2226,7 +2239,7 @@ dependencies = [
[[package]]
name = "parity-ui-old-precompiled"
version = "1.9.0"
source = "git+https://github.com/js-dist-paritytech/parity-master-1-9-v1.git#ae75b135453ed081a6beb7c2bf3e3aa2b9957f69"
source = "git+https://github.com/js-dist-paritytech/parity-beta-1-9-v1.git#26ece85bf8a6c75a7f05f48917a86ab9d24bbd15"
dependencies = [
"parity-dapps-glue 1.9.1 (registry+https://github.com/rust-lang/crates.io-index)",
]
@ -2234,7 +2247,7 @@ dependencies = [
[[package]]
name = "parity-ui-precompiled"
version = "1.9.0"
source = "git+https://github.com/js-dist-paritytech/parity-master-1-9-shell.git#47c6e031f8f5b16af8e1d99cb5c6f27055c0156d"
source = "git+https://github.com/js-dist-paritytech/parity-beta-1-9-shell.git#700333b520a71d3dfcadde24b3e6eeafc7db9884"
dependencies = [
"parity-dapps-glue 1.9.1 (registry+https://github.com/rust-lang/crates.io-index)",
]
@ -2253,7 +2266,7 @@ dependencies = [
"log 0.3.8 (registry+https://github.com/rust-lang/crates.io-index)",
"parity-hash-fetch 1.9.0",
"parity-reactor 0.1.0",
"parity-version 1.9.0",
"parity-version 1.9.1",
"parking_lot 0.4.8 (registry+https://github.com/rust-lang/crates.io-index)",
"path 0.1.0",
"semver 0.6.0 (registry+https://github.com/rust-lang/crates.io-index)",
@ -2262,7 +2275,7 @@ dependencies = [
[[package]]
name = "parity-version"
version = "1.9.0"
version = "1.9.1"
dependencies = [
"ethcore-bytes 0.1.0",
"rlp 0.2.1",
@ -2505,7 +2518,7 @@ dependencies = [
name = "pwasm-run-test"
version = "0.1.0"
dependencies = [
"clap 2.26.2 (registry+https://github.com/rust-lang/crates.io-index)",
"clap 2.29.1 (registry+https://github.com/rust-lang/crates.io-index)",
"ethcore-bigint 0.2.1",
"ethcore-logger 1.9.0",
"ethjson 0.1.0",
@ -2671,7 +2684,7 @@ dependencies = [
[[package]]
name = "rocksdb"
version = "0.4.5"
source = "git+https://github.com/paritytech/rust-rocksdb#166e14ed63cbd2e44b51267b8b98e4b89b0f236f"
source = "git+https://github.com/paritytech/rust-rocksdb#ecf06adf3148ab10f6f7686b724498382ff4f36e"
dependencies = [
"libc 0.2.31 (registry+https://github.com/rust-lang/crates.io-index)",
"local-encoding 0.2.0 (registry+https://github.com/rust-lang/crates.io-index)",
@ -2681,10 +2694,11 @@ dependencies = [
[[package]]
name = "rocksdb-sys"
version = "0.3.0"
source = "git+https://github.com/paritytech/rust-rocksdb#166e14ed63cbd2e44b51267b8b98e4b89b0f236f"
source = "git+https://github.com/paritytech/rust-rocksdb#ecf06adf3148ab10f6f7686b724498382ff4f36e"
dependencies = [
"cc 1.0.0 (registry+https://github.com/rust-lang/crates.io-index)",
"libc 0.2.31 (registry+https://github.com/rust-lang/crates.io-index)",
"local-encoding 0.2.0 (registry+https://github.com/rust-lang/crates.io-index)",
"snappy-sys 0.1.0 (git+https://github.com/paritytech/rust-snappy)",
]
@ -3112,10 +3126,9 @@ dependencies = [
[[package]]
name = "textwrap"
version = "0.8.0"
version = "0.9.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
dependencies = [
"term_size 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)",
"unicode-width 0.1.4 (registry+https://github.com/rust-lang/crates.io-index)",
]
@ -3476,7 +3489,7 @@ version = "0.1.0"
source = "git+https://github.com/paritytech/wasm-utils#3d59f7ca0661317bc66894a26b2a5a319fa5d229"
dependencies = [
"byteorder 1.1.0 (registry+https://github.com/rust-lang/crates.io-index)",
"clap 2.26.2 (registry+https://github.com/rust-lang/crates.io-index)",
"clap 2.29.1 (registry+https://github.com/rust-lang/crates.io-index)",
"env_logger 0.4.3 (registry+https://github.com/rust-lang/crates.io-index)",
"glob 0.2.11 (registry+https://github.com/rust-lang/crates.io-index)",
"lazy_static 0.2.8 (registry+https://github.com/rust-lang/crates.io-index)",
@ -3555,6 +3568,7 @@ dependencies = [
"checksum adler32 1.0.2 (registry+https://github.com/rust-lang/crates.io-index)" = "6cbd0b9af8587c72beadc9f72d35b9fbb070982c9e6203e46e93f10df25f8f45"
"checksum advapi32-sys 0.2.0 (registry+https://github.com/rust-lang/crates.io-index)" = "e06588080cb19d0acb6739808aafa5f26bfb2ca015b2b6370028b44cf7cb8a9a"
"checksum aho-corasick 0.6.3 (registry+https://github.com/rust-lang/crates.io-index)" = "500909c4f87a9e52355b26626d890833e9e1d53ac566db76c36faa984b889699"
"checksum ansi_term 0.10.2 (registry+https://github.com/rust-lang/crates.io-index)" = "6b3568b48b7cefa6b8ce125f9bb4989e52fbcc29ebea88df04cc7c5f12f70455"
"checksum ansi_term 0.9.0 (registry+https://github.com/rust-lang/crates.io-index)" = "23ac7c30002a5accbf7e8987d0632fa6de155b7c3d39d0067317a391e00a2ef6"
"checksum app_dirs 1.1.1 (registry+https://github.com/rust-lang/crates.io-index)" = "b7d1c0d48a81bbb13043847f957971f4d87c81542d80ece5e84ba3cba4058fd4"
"checksum arrayvec 0.4.4 (registry+https://github.com/rust-lang/crates.io-index)" = "1c0250693b17316353df525fb088da32a8c18f84eb65d113dde31f5a76ed17b6"
@ -3573,6 +3587,7 @@ dependencies = [
"checksum bitflags 0.7.0 (registry+https://github.com/rust-lang/crates.io-index)" = "aad18937a628ec6abcd26d1489012cc0e18c21798210f491af69ded9b881106d"
"checksum bitflags 0.8.2 (registry+https://github.com/rust-lang/crates.io-index)" = "1370e9fc2a6ae53aea8b7a5110edbd08836ed87c88736dfabccade1c2b44bff4"
"checksum bitflags 0.9.1 (registry+https://github.com/rust-lang/crates.io-index)" = "4efd02e230a02e18f92fc2735f44597385ed02ad8f831e7c1c1156ee5e1ab3a5"
"checksum bitflags 1.0.1 (registry+https://github.com/rust-lang/crates.io-index)" = "b3c30d3802dfb7281680d6285f2ccdaa8c2d8fee41f93805dba5c4cf50dc23cf"
"checksum bloomchain 0.1.0 (registry+https://github.com/rust-lang/crates.io-index)" = "3f421095d2a76fc24cd3fb3f912b90df06be7689912b1bdb423caefae59c258d"
"checksum bn 0.4.4 (git+https://github.com/paritytech/bn)" = "<none>"
"checksum byteorder 1.1.0 (registry+https://github.com/rust-lang/crates.io-index)" = "ff81738b726f5d099632ceaffe7fb65b90212e8dce59d518729e7e8634032d3d"
@ -3580,7 +3595,7 @@ dependencies = [
"checksum cc 1.0.0 (registry+https://github.com/rust-lang/crates.io-index)" = "7db2f146208d7e0fbee761b09cd65a7f51ccc38705d4e7262dad4d73b12a76b1"
"checksum cfg-if 0.1.2 (registry+https://github.com/rust-lang/crates.io-index)" = "d4c819a1287eb618df47cc647173c5c4c66ba19d888a6e50d605672aed3140de"
"checksum cid 0.2.2 (registry+https://github.com/rust-lang/crates.io-index)" = "34aa7da06f10541fbca6850719cdaa8fa03060a5d2fb33840f149cf8133a00c7"
"checksum clap 2.26.2 (registry+https://github.com/rust-lang/crates.io-index)" = "3451e409013178663435d6f15fdb212f14ee4424a3d74f979d081d0a66b6f1f2"
"checksum clap 2.29.1 (registry+https://github.com/rust-lang/crates.io-index)" = "8f4a2b3bb7ef3c672d7c13d15613211d5a6976b6892c598b0fcb5d40765f19c2"
"checksum coco 0.1.1 (registry+https://github.com/rust-lang/crates.io-index)" = "c06169f5beb7e31c7c67ebf5540b8b472d23e3eade3b2ec7d1f5b504a85f91bd"
"checksum conv 0.3.3 (registry+https://github.com/rust-lang/crates.io-index)" = "78ff10625fd0ac447827aa30ea8b861fead473bb60aeb73af6c1c58caf0d1299"
"checksum cookie 0.3.1 (registry+https://github.com/rust-lang/crates.io-index)" = "d53b80dde876f47f03cda35303e368a79b91c70b0d65ecba5fd5280944a08591"
@ -3695,8 +3710,8 @@ dependencies = [
"checksum owning_ref 0.3.3 (registry+https://github.com/rust-lang/crates.io-index)" = "cdf84f41639e037b484f93433aa3897863b561ed65c6e59c7073d7c561710f37"
"checksum parity-dapps-glue 1.9.1 (registry+https://github.com/rust-lang/crates.io-index)" = "261c025c67ba416e9fe63aa9b3236520ce3c74cfbe43590c9cdcec4ccc8180e4"
"checksum parity-tokio-ipc 0.1.5 (git+https://github.com/nikvolf/parity-tokio-ipc)" = "<none>"
"checksum parity-ui-old-precompiled 1.9.0 (git+https://github.com/js-dist-paritytech/parity-master-1-9-v1.git)" = "<none>"
"checksum parity-ui-precompiled 1.9.0 (git+https://github.com/js-dist-paritytech/parity-master-1-9-shell.git)" = "<none>"
"checksum parity-ui-old-precompiled 1.9.0 (git+https://github.com/js-dist-paritytech/parity-beta-1-9-v1.git)" = "<none>"
"checksum parity-ui-precompiled 1.9.0 (git+https://github.com/js-dist-paritytech/parity-beta-1-9-shell.git)" = "<none>"
"checksum parity-wasm 0.15.3 (registry+https://github.com/rust-lang/crates.io-index)" = "8431a184ad88cfbcd71a792aaca319cc7203a94300c26b8dce2d0df0681ea87d"
"checksum parity-wordlist 1.2.0 (registry+https://github.com/rust-lang/crates.io-index)" = "1d0dec124478845b142f68b446cbee953d14d4b41f1bc0425024417720dce693"
"checksum parking_lot 0.4.8 (registry+https://github.com/rust-lang/crates.io-index)" = "149d8f5b97f3c1133e3cfcd8886449959e856b557ff281e292b733d7c69e005e"
@ -3784,7 +3799,7 @@ dependencies = [
"checksum tempdir 0.3.5 (registry+https://github.com/rust-lang/crates.io-index)" = "87974a6f5c1dfb344d733055601650059a3363de2a6104819293baff662132d6"
"checksum term 0.4.6 (registry+https://github.com/rust-lang/crates.io-index)" = "fa63644f74ce96fbeb9b794f66aff2a52d601cbd5e80f4b97123e3899f4570f1"
"checksum term_size 0.3.0 (registry+https://github.com/rust-lang/crates.io-index)" = "e2b6b55df3198cc93372e85dd2ed817f0e38ce8cc0f22eb32391bfad9c4bf209"
"checksum textwrap 0.8.0 (registry+https://github.com/rust-lang/crates.io-index)" = "df8e08afc40ae3459e4838f303e465aa50d823df8d7f83ca88108f6d3afe7edd"
"checksum textwrap 0.9.0 (registry+https://github.com/rust-lang/crates.io-index)" = "c0b59b6b4b44d867f1370ef1bd91bfb262bf07bf0ae65c202ea2fbc16153b693"
"checksum thread_local 0.3.4 (registry+https://github.com/rust-lang/crates.io-index)" = "1697c4b57aeeb7a536b647165a2825faddffb1d3bad386d507709bd51a90bb14"
"checksum threadpool 1.7.1 (registry+https://github.com/rust-lang/crates.io-index)" = "e2f0c90a5f3459330ac8bc0d2f879c693bb7a2f59689c1083fc4ef83834da865"
"checksum time 0.1.38 (registry+https://github.com/rust-lang/crates.io-index)" = "d5d788d3aa77bc0ef3e9621256885555368b47bd495c13dd2e7413c89f845520"

Cargo.toml
View File

@ -1,7 +1,7 @@
[package]
description = "Parity Ethereum client"
name = "parity"
version = "1.9.0"
version = "1.9.1"
license = "GPL-3.0"
authors = ["Parity Technologies <admin@parity.io>"]
build = "build.rs"
@ -12,6 +12,8 @@ env_logger = "0.4"
rustc-hex = "1.0"
docopt = "0.8"
clap = "2"
term_size = "0.3"
textwrap = "0.9"
time = "0.1"
num_cpus = "1.2"
number_prefix = "0.2"

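The two dependencies added above (term_size and textwrap) back the "Wrap --help output to 120 characters" change (#7626). Below is a minimal sketch of that approach, not parity's actual help formatter: wrap to the detected terminal width but never wider than 120 columns. Per the commit message, the real change reuses a single textwrap Wrapper across calls; this sketch uses the simpler fill function instead.

// Minimal sketch only; assumes term_size 0.3 and textwrap 0.9 as in Cargo.toml.
extern crate term_size;
extern crate textwrap;

const MAX_TERM_WIDTH: usize = 120;

fn wrap_help(text: &str) -> String {
    // Detected terminal width, capped at 120 columns, with 120 as the
    // fallback when the width cannot be detected.
    let width = term_size::dimensions()
        .map(|(w, _)| w)
        .unwrap_or(MAX_TERM_WIDTH)
        .min(MAX_TERM_WIDTH);
    textwrap::fill(text, width)
}

fn main() {
    let help = "A long option description that should be wrapped to at most \
                120 columns when printed as part of --help output.";
    println!("{}", wrap_help(help));
}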
View File

@ -47,6 +47,8 @@ pub fn add_security_headers(headers: &mut header::Headers, embeddable_on: Embedd
// Content Security Policy headers
headers.set_raw("Content-Security-Policy", String::new()
// Restrict everything to the same origin by default.
+ "default-src 'self';"
// Allow connecting to WS servers and HTTP(S) servers.
// We could be more restrictive and allow only RPC server URL.
+ "connect-src http: https: ws: wss:;"
@ -64,7 +66,9 @@ pub fn add_security_headers(headers: &mut header::Headers, embeddable_on: Embedd
+ "style-src 'self' 'unsafe-inline' data: blob: https:;"
// Allow fonts from data: and HTTPS.
+ "font-src 'self' data: https:;"
// Allow inline scripts and scripts eval (webpack/jsconsole)
// Disallow objects
+ "object-src 'none';"
// Allow scripts
+ {
let script_src = embeddable_on.as_ref()
.map(|e| e.extra_script_src.iter()
@ -72,18 +76,16 @@ pub fn add_security_headers(headers: &mut header::Headers, embeddable_on: Embedd
.join(" ")
).unwrap_or_default();
&format!(
"script-src 'self' 'unsafe-inline' 'unsafe-eval' {};",
"script-src 'self' {};",
script_src
)
}
// Same restrictions as script-src with additional
// blob: that is required for camera access (worker)
+ "worker-src 'self' 'unsafe-inline' 'unsafe-eval' https: blob:;"
// Restrict everything else to the same origin.
+ "default-src 'self';"
+ "worker-src 'self' https: blob:;"
// Run in sandbox mode (although it's not fully safe since we allow same-origin and script)
+ "sandbox allow-same-origin allow-forms allow-modals allow-popups allow-presentation allow-scripts;"
// Disallow subitting forms from any dapps
// Disallow submitting forms from any dapps
+ "form-action 'none';"
// Never allow mixed content
+ "block-all-mixed-content;"

View File

@ -13,8 +13,8 @@ rustc_version = "0.1"
parity-ui-dev = { path = "../../js", optional = true }
parity-ui-old-dev = { path = "../../js-old", optional = true }
# This is managed by the js/scripts/release.sh script on CI - keep it in a single line
parity-ui-old-precompiled = { git = "https://github.com/js-dist-paritytech/parity-master-1-9-v1.git", optional = true }
parity-ui-precompiled = { git = "https://github.com/js-dist-paritytech/parity-master-1-9-shell.git", optional = true }
parity-ui-old-precompiled = { git = "https://github.com/js-dist-paritytech/parity-beta-1-9-v1.git", optional = true }
parity-ui-precompiled = { git = "https://github.com/js-dist-paritytech/parity-beta-1-9-shell.git", optional = true }
[features]
no-precompiled-js = ["parity-ui-dev", "parity-ui-old-dev"]

View File

@ -44,8 +44,7 @@ use bigint::hash::{H256, H520};
use semantic_version::SemanticVersion;
use parking_lot::{Mutex, RwLock};
use unexpected::{Mismatch, OutOfBounds};
use util::*;
use bytes::Bytes;
use util::Address;
mod finality;
@ -291,9 +290,11 @@ struct EpochVerifier {
impl super::EpochVerifier<EthereumMachine> for EpochVerifier {
fn verify_light(&self, header: &Header) -> Result<(), Error> {
// Validate the timestamp
verify_timestamp(&*self.step, header_step(header)?)?;
// always check the seal since it's fast.
// nothing heavier to do.
verify_external(header, &self.subchain_validators, &*self.step, |_| {})
verify_external(header, &self.subchain_validators)
}
fn check_finality_proof(&self, proof: &[u8]) -> Option<Vec<H256>> {
@ -317,7 +318,7 @@ impl super::EpochVerifier<EthereumMachine> for EpochVerifier {
//
// `verify_external` checks that signature is correct and author == signer.
if header.seal().len() != 2 { return None }
otry!(verify_external(header, &self.subchain_validators, &*self.step, |_| {}).ok());
otry!(verify_external(header, &self.subchain_validators).ok());
let newly_finalized = otry!(finality_checker.push_hash(header.hash(), header.author().clone()).ok());
finalized.extend(newly_finalized);
@ -327,16 +328,6 @@ impl super::EpochVerifier<EthereumMachine> for EpochVerifier {
}
}
// Report misbehavior
#[derive(Debug)]
#[allow(dead_code)]
enum Report {
// Malicious behavior
Malicious(Address, BlockNumber, Bytes),
// benign misbehavior
Benign(Address, BlockNumber),
}
fn header_step(header: &Header) -> Result<usize, ::rlp::DecoderError> {
UntrustedRlp::new(&header.seal().get(0).expect("was either checked with verify_block_basic or is genesis; has 2 fields; qed (Make sure the spec file has a correct genesis seal)")).as_val()
}
@ -355,34 +346,35 @@ fn is_step_proposer(validators: &ValidatorSet, bh: &H256, step: usize, address:
step_proposer(validators, bh, step) == *address
}
fn verify_external<F: Fn(Report)>(header: &Header, validators: &ValidatorSet, step: &Step, report: F)
-> Result<(), Error>
{
let header_step = header_step(header)?;
fn verify_timestamp(step: &Step, header_step: usize) -> Result<(), BlockError> {
match step.check_future(header_step) {
Err(None) => {
trace!(target: "engine", "verify_block_external: block from the future");
report(Report::Benign(*header.author(), header.number()));
return Err(BlockError::InvalidSeal.into())
trace!(target: "engine", "verify_timestamp: block from the future");
Err(BlockError::InvalidSeal.into())
},
Err(Some(oob)) => {
trace!(target: "engine", "verify_block_external: block too early");
return Err(BlockError::TemporarilyInvalid(oob).into())
// NOTE This error might be returned only in early stage of verification (Stage 1).
// Returning it further won't recover the sync process.
trace!(target: "engine", "verify_timestamp: block too early");
Err(BlockError::TemporarilyInvalid(oob).into())
},
Ok(_) => {
let proposer_signature = header_signature(header)?;
let correct_proposer = validators.get(header.parent_hash(), header_step);
let is_invalid_proposer = *header.author() != correct_proposer ||
!verify_address(&correct_proposer, &proposer_signature, &header.bare_hash())?;
Ok(_) => Ok(()),
}
}
if is_invalid_proposer {
trace!(target: "engine", "verify_block_external: bad proposer for step: {}", header_step);
Err(EngineError::NotProposer(Mismatch { expected: correct_proposer, found: header.author().clone() }))?
} else {
Ok(())
}
}
fn verify_external(header: &Header, validators: &ValidatorSet) -> Result<(), Error> {
let header_step = header_step(header)?;
let proposer_signature = header_signature(header)?;
let correct_proposer = validators.get(header.parent_hash(), header_step);
let is_invalid_proposer = *header.author() != correct_proposer ||
!verify_address(&correct_proposer, &proposer_signature, &header.bare_hash())?;
if is_invalid_proposer {
trace!(target: "engine", "verify_block_external: bad proposer for step: {}", header_step);
Err(EngineError::NotProposer(Mismatch { expected: correct_proposer, found: header.author().clone() }))?
} else {
Ok(())
}
}
@ -655,26 +647,38 @@ impl Engine<EthereumMachine> for AuthorityRound {
/// Check the number of seal fields.
fn verify_block_basic(&self, header: &Header) -> Result<(), Error> {
if header.number() >= self.validate_score_transition && *header.difficulty() >= U256::from(U128::max_value()) {
Err(From::from(BlockError::DifficultyOutOfBounds(
return Err(From::from(BlockError::DifficultyOutOfBounds(
OutOfBounds { min: None, max: Some(U256::from(U128::max_value())), found: *header.difficulty() }
)))
} else {
Ok(())
)));
}
// TODO [ToDr] Should this go from epoch manager?
// If yes then probably benign reporting needs to be moved further in the verification.
let set_number = header.number();
match verify_timestamp(&*self.step, header_step(header)?) {
Err(BlockError::InvalidSeal) => {
self.validators.report_benign(header.author(), set_number, header.number());
Err(BlockError::InvalidSeal.into())
}
Err(e) => Err(e.into()),
Ok(()) => Ok(()),
}
}
/// Do the step and gas limit validation.
fn verify_block_family(&self, header: &Header, parent: &Header) -> Result<(), Error> {
let step = header_step(header)?;
let parent_step = header_step(parent)?;
// TODO [ToDr] Should this go from epoch manager?
let set_number = header.number();
// Ensure header is from the step after parent.
if step == parent_step
|| (header.number() >= self.validate_step_transition && step <= parent_step) {
trace!(target: "engine", "Multiple blocks proposed for step {}.", parent_step);
self.validators.report_malicious(header.author(), header.number(), header.number(), Default::default());
self.validators.report_malicious(header.author(), set_number, header.number(), Default::default());
Err(EngineError::DoubleVote(header.author().clone()))?;
}
@ -687,7 +691,7 @@ impl Engine<EthereumMachine> for AuthorityRound {
let skipped_primary = step_proposer(&*self.validators, &parent.hash(), s);
// Do not report this signer.
if skipped_primary != me {
self.validators.report_benign(&skipped_primary, header.number(), header.number());
self.validators.report_benign(&skipped_primary, set_number, header.number());
}
// Stop reporting once validators start repeating.
if !reported.insert(skipped_primary) { break; }
@ -702,9 +706,8 @@ impl Engine<EthereumMachine> for AuthorityRound {
// fetch correct validator set for current epoch, taking into account
// finality of previous transitions.
let active_set;
let (validators, set_number) = if self.immediate_transitions {
(&*self.validators, header.number())
let validators = if self.immediate_transitions {
&*self.validators
} else {
// get correct validator set for epoch.
let client = match self.client.read().as_ref().and_then(|weak| weak.upgrade()) {
@ -722,21 +725,12 @@ impl Engine<EthereumMachine> for AuthorityRound {
}
active_set = epoch_manager.validators().clone();
(&active_set as &_, epoch_manager.epoch_transition_number)
};
// always report with "self.validators" so that the report actually gets
// to the contract.
let report = |report| match report {
Report::Benign(address, block_number) =>
self.validators.report_benign(&address, set_number, block_number),
Report::Malicious(address, block_number, proof) =>
self.validators.report_malicious(&address, set_number, block_number, proof),
&active_set as &_
};
// verify signature against fixed list, but reports should go to the
// contract itself.
verify_external(header, validators, &*self.step, report)
verify_external(header, validators)
}
fn genesis_epoch_data(&self, header: &Header, call: &Call) -> Result<Vec<u8>, String> {
@ -1059,8 +1053,7 @@ mod tests {
assert!(engine.verify_block_family(&header, &parent_header).is_ok());
assert!(engine.verify_block_external(&header).is_ok());
header.set_seal(vec![encode(&5usize).into_vec(), encode(&(&*signature as &[u8])).into_vec()]);
assert!(engine.verify_block_family(&header, &parent_header).is_ok());
assert!(engine.verify_block_external(&header).is_err());
assert!(engine.verify_block_basic(&header).is_err());
}
#[test]
@ -1200,3 +1193,4 @@ mod tests {
AuthorityRound::new(params, machine).unwrap();
}
}
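
The hunks above split the old combined check in two: verify_timestamp is a cheap standalone check now invoked from verify_block_basic (the earliest verification stage), where a block "from the future" is reported as benign misbehaviour directly on the validator set, while verify_external only checks the proposer signature and no longer takes a reporting closure. A heavily simplified sketch of that ordering follows; Header, VerifyError and the threshold logic here are hypothetical stand-ins, not the engine types above.

// Hypothetical, simplified stand-ins for the AuthorityRound types.
struct Header { author: u64, step: usize, number: u64 }

#[derive(Debug)]
enum VerifyError { FromTheFuture, NotProposer }

// Cheap check, suitable for the earliest verification stage.
fn verify_timestamp(current_step: usize, header_step: usize) -> Result<(), VerifyError> {
    if header_step > current_step { Err(VerifyError::FromTheFuture) } else { Ok(()) }
}

// Signature/proposer check only; timestamp handling has moved out.
fn verify_external(header: &Header, expected_proposer: u64) -> Result<(), VerifyError> {
    if header.author == expected_proposer { Ok(()) } else { Err(VerifyError::NotProposer) }
}

fn verify_block_basic(header: &Header, current_step: usize, report_benign: impl Fn(u64, u64)) -> Result<(), VerifyError> {
    verify_timestamp(current_step, header.step).map_err(|e| {
        // Benign misbehaviour is reported as early as possible.
        report_benign(header.author, header.number);
        e
    })
}

fn main() {
    let header = Header { author: 7, step: 12, number: 100 };
    let basic = verify_block_basic(&header, 10, |author, num| println!("report_benign({}, {})", author, num));
    assert!(basic.is_err()); // step 12 is ahead of the current step 10: reported, then rejected
    assert!(verify_external(&header, 7).is_ok());
}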

View File

@ -192,7 +192,7 @@ mod tests {
header.set_number(2);
header.set_parent_hash(client.chain_info().best_block_hash);
// `reportBenign` when the designated proposer releases block from the future (bad clock).
assert!(client.engine().verify_block_external(&header).is_err());
assert!(client.engine().verify_block_basic(&header).is_err());
// Seal a block.
client.engine().step();
assert_eq!(client.chain_info().best_block_number, 1);

View File

@ -1406,7 +1406,7 @@ mod tests {
}
#[test]
fn should_not_trace_delegatecall() {
fn should_trace_delegatecall_properly() {
init_log();
let mut state = get_temp_state();
@ -1426,7 +1426,7 @@ mod tests {
}.sign(&secret(), None);
state.init_code(&0xa.into(), FromHex::from_hex("6000600060006000600b618000f4").unwrap()).unwrap();
state.init_code(&0xb.into(), FromHex::from_hex("6000").unwrap()).unwrap();
state.init_code(&0xb.into(), FromHex::from_hex("60056000526001601ff3").unwrap()).unwrap();
let result = state.apply(&info, &machine, &t, true).unwrap();
let expected_trace = vec![FlatTrace {
@ -1441,23 +1441,23 @@ mod tests {
call_type: CallType::Call,
}),
result: trace::Res::Call(trace::CallResult {
gas_used: U256::from(721), // in post-eip150
gas_used: U256::from(736), // in post-eip150
output: vec![]
}),
}, FlatTrace {
trace_address: vec![0].into_iter().collect(),
subtraces: 0,
action: trace::Action::Call(trace::Call {
from: "9cce34f7ab185c7aba1b7c8140d620b4bda941d6".into(),
to: 0xa.into(),
from: 0xa.into(),
to: 0xb.into(),
value: 0.into(),
gas: 32768.into(),
input: vec![],
call_type: CallType::DelegateCall,
}),
result: trace::Res::Call(trace::CallResult {
gas_used: 3.into(),
output: vec![],
gas_used: 18.into(),
output: vec![5],
}),
}];

View File

@ -74,13 +74,23 @@ pub struct Call {
impl From<ActionParams> for Call {
fn from(p: ActionParams) -> Self {
Call {
from: p.sender,
to: p.address,
value: p.value.value(),
gas: p.gas,
input: p.data.unwrap_or_else(Vec::new),
call_type: p.call_type,
match p.call_type {
CallType::DelegateCall => Call {
from: p.address,
to: p.code_address,
value: p.value.value(),
gas: p.gas,
input: p.data.unwrap_or_else(Vec::new),
call_type: p.call_type,
},
_ => Call {
from: p.sender,
to: p.address,
value: p.value.value(),
gas: p.gas,
input: p.data.unwrap_or_else(Vec::new),
call_type: p.call_type,
},
}
}
}
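
The From&lt;ActionParams&gt; change above makes DELEGATECALL traces report the executing contract (p.address) as "from" and the contract providing the code (p.code_address) as "to", rather than the original sender/address pair. The sketch below shows only that mapping rule with made-up, simplified types; it is not the actual ActionParams or trace::Call definition.

// Hypothetical, simplified types illustrating the from/to mapping only.
#[derive(Clone, Copy)]
enum CallType { Call, DelegateCall }

struct Params { sender: u64, address: u64, code_address: u64, call_type: CallType }
struct TraceCall { from: u64, to: u64 }

fn trace_call(p: &Params) -> TraceCall {
    match p.call_type {
        // For DELEGATECALL the executing contract is the caller and the
        // code provider is the callee.
        CallType::DelegateCall => TraceCall { from: p.address, to: p.code_address },
        _ => TraceCall { from: p.sender, to: p.address },
    }
}

fn main() {
    let p = Params { sender: 0x9c, address: 0xa, code_address: 0xb, call_type: CallType::DelegateCall };
    let t = trace_call(&p);
    // Matches the updated test expectation above: from 0xa, to 0xb.
    assert_eq!((t.from, t.to), (0xa, 0xb));
}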

View File

@ -4,8 +4,9 @@ set -e
# variables
PVER="1-9"
PTYPE="v1"
TRACK="beta"
UTCDATE=`date -u "+%Y%m%d-%H%M%S"`
PRE_REPO="js-dist-paritytech/parity-${CI_BUILD_REF_NAME}-${PVER}-${PTYPE}.git"
PRE_REPO="js-dist-paritytech/parity-${TRACK}-${PVER}-${PTYPE}.git"
PRE_REPO_TOKEN="https://${GITHUB_JS_PRECOMPILED}:@github.com/${PRE_REPO}"
BASEDIR=`dirname $0`

View File

@ -1 +1 @@
// test script 23
// test script 24

View File

@ -306,7 +306,7 @@ export function fetchTokensBalances (updates, skipNotifications = false) {
const tokenIdsToFetch = Object.values(balances)
.reduce((tokenIds, balance) => {
const nextTokenIds = Object.keys(balance)
.filter((tokenId) => balance[tokenId].gt(0));
.filter((tokenId) => balance[tokenId] && balance[tokenId].gt(0));
return tokenIds.concat(nextTokenIds);
}, []);
@ -328,7 +328,7 @@ export function fetchTokensBalances (updates, skipNotifications = false) {
dispatch(setBalances(balances, skipNotifications));
})
.catch((error) => {
console.warn('balances::fetchTokensBalances', error);
console.warn('v1: balances::fetchTokensBalances', error);
});
};
}

View File

@ -105,7 +105,7 @@ export function loadTokens (options = {}) {
}
export function loadTokensBasics (tokenIndexes, options) {
const limit = 64;
const limit = 128;
return (dispatch, getState) => {
const { api } = getState();
@ -154,7 +154,7 @@ export function loadTokensBasics (tokenIndexes, options) {
export function fetchTokens (_tokenIndexes) {
const tokenIndexes = uniq(_tokenIndexes || []);
const tokenChunks = chunk(tokenIndexes, 64);
const tokenChunks = chunk(tokenIndexes, 128);
return (dispatch, getState) => {
const { tokenReg } = Contracts.get();

View File

@ -65,15 +65,15 @@ export default function (api, browserHistory, forEmbed = false) {
.then(() => console.log('v1: started Status Provider'))
.then(() => console.log('v1: starting Personal Provider...'))
.then(() => PersonalProvider.start())
.then(() => withTimeoutForLight('personal', PersonalProvider.start(), store))
.then(() => console.log('v1: started Personal Provider'))
.then(() => console.log('v1: starting Balances Provider...'))
.then(() => BalancesProvider.start())
.then(() => withTimeoutForLight('balances', BalancesProvider.start(), store))
.then(() => console.log('v1: started Balances Provider'))
.then(() => console.log('v1: starting Tokens Provider...'))
.then(() => TokensProvider.start())
.then(() => withTimeoutForLight('tokens', TokensProvider.start(), store))
.then(() => console.log('v1: started Tokens Provider'));
};
@ -97,3 +97,39 @@ export default function (api, browserHistory, forEmbed = false) {
return store;
}
function withTimeoutForLight (id, promise, store) {
const { nodeKind } = store.getState().nodeStatus;
const isLightNode = nodeKind.capability !== 'full';
if (!isLightNode) {
// make sure that no values are passed
return promise.then(() => {});
}
return new Promise((resolve, reject) => {
let isResolved = false;
const doResolve = () => {
if (!isResolved) {
isResolved = true;
resolve();
}
};
const timeout = setTimeout(() => {
console.warn(`Resolving ${id} by timeout.`);
doResolve();
}, 1000);
promise
.then(() => {
clearTimeout(timeout);
doResolve();
})
.catch(err => {
clearTimeout(timeout);
if (!isResolved) {
reject(err);
}
});
});
}

View File

@ -14,7 +14,7 @@
// You should have received a copy of the GNU General Public License
// along with Parity. If not, see <http://www.gnu.org/licenses/>.
import { range } from 'lodash';
import { chunk, range } from 'lodash';
import BigNumber from 'bignumber.js';
import { hashToImageUrl } from '~/redux/util';
@ -58,13 +58,11 @@ export function fetchTokensBasics (api, tokenReg, start = 0, limit = 100) {
return decodeArray(api, 'address[]', result);
})
.then((tokenAddresses) => {
return tokenAddresses.map((tokenAddress, index) => {
return tokenAddresses.map((address, index) => {
const tokenIndex = start + index;
return {
address: /^0x0*$/.test(tokenAddress)
? ''
: tokenAddress,
address,
id: getTokenId(tokenIndex),
index: tokenIndex,
fetched: false
@ -80,12 +78,17 @@ export function fetchTokensBasics (api, tokenReg, start = 0, limit = 100) {
return tokens.map((token) => {
if (balances[token.id] && balances[token.id].gt(0)) {
token.address = '';
token.address = null;
}
return token;
});
});
})
.then((tokens) => {
return tokens.filter(({ address }) => {
return address && !/^0x0*$/.test(address);
});
});
}
@ -195,19 +198,22 @@ export function fetchAccountsBalances (api, tokens, updates) {
});
const tokenPromise = Object.keys(tokenUpdates)
.reduce((tokenPromise, accountAddress) => {
.reduce((promises, accountAddress) => {
const tokenIds = tokenUpdates[accountAddress];
const updateTokens = tokens
.filter((t) => tokenIds.includes(t.id));
return tokenPromise
.then(() => fetchTokensBalances(api, updateTokens, [ accountAddress ]))
.then((balances) => {
tokensBalances[accountAddress] = balances[accountAddress];
});
}, Promise.resolve());
promises.push(
fetchTokensBalances(api, updateTokens, [ accountAddress ])
.then((balances) => {
tokensBalances[accountAddress] = balances[accountAddress];
})
);
return Promise.all([ ethPromise, tokenPromise ])
return promises;
}, []);
return Promise.all([ ethPromise, Promise.all(tokenPromise) ])
.then(() => {
const balances = Object.assign({}, tokensBalances);
@ -243,29 +249,24 @@ function fetchEthBalances (api, accountAddresses) {
});
}
function fetchTokensBalances (api, tokens, accountAddresses) {
const tokenAddresses = tokens.map((t) => t.address);
const tokensBalancesCallData = encode(
api,
[ 'address[]', 'address[]' ],
[ accountAddresses, tokenAddresses ]
);
function fetchTokensBalances (api, _tokens, accountAddresses) {
const promises = chunk(_tokens, 128).map((tokens) => {
const data = tokensBalancesBytecode + encode(
api,
[ 'address[]', 'address[]' ],
[ accountAddresses, tokens.map(({ address }) => address) ]
);
return api.eth
.call({ data: tokensBalancesBytecode + tokensBalancesCallData })
.then((result) => {
const rawBalances = decodeArray(api, 'uint[]', result);
return api.eth.call({ data }).then((result) => {
const balances = {};
const rawBalances = decodeArray(api, 'uint[]', result);
accountAddresses.forEach((accountAddress, accountIndex) => {
const preIndex = accountIndex * tokens.length;
const balance = {};
const preIndex = accountIndex * tokenAddresses.length;
tokenAddresses.forEach((tokenAddress, tokenIndex) => {
const index = preIndex + tokenIndex;
const token = tokens[tokenIndex];
balance[token.id] = rawBalances[index];
tokens.forEach((token, tokenIndex) => {
balance[token.id] = rawBalances[preIndex + tokenIndex];
});
balances[accountAddress] = balance;
@ -273,6 +274,31 @@ function fetchTokensBalances (api, tokens, accountAddresses) {
return balances;
});
});
return Promise.all(promises).then((results) => {
return results.reduce((combined, result) => {
Object
.keys(result)
.forEach((address) => {
if (!combined[address]) {
combined[address] = {};
}
Object
.keys(result[address])
.forEach((token) => {
const value = result[address][token];
if (value && value.gt(0)) {
combined[address][token] = result[address][token];
}
});
});
return combined;
}, {});
});
}
function getTokenId (...args) {

js/package-lock.json (generated; 1619 lines changed): file diff suppressed because it is too large.

View File

@ -44,17 +44,17 @@
"test:coverage": "cross-env NODE_ENV=test istanbul cover _mocha -- 'src/**/*.spec.js'"
},
"devDependencies": {
"@parity/dapp-console": "paritytech/dapp-console",
"@parity/dapp-console": "parity-js/dapp-console#5562e6c11783dc82c1ee37fd85dc3403ff28f8f5",
"@parity/dapp-dapp-methods": "js-dist-paritytech/dapp-dapp-methods",
"@parity/dapp-dapp-visible": "js-dist-paritytech/dapp-dapp-visible",
"@parity/dapp-dappreg": "paritytech/dapp-dappreg",
"@parity/dapp-githubhint": "paritytech/dapp-githubhint",
"@parity/dapp-localtx": "paritytech/dapp-localtx",
"@parity/dapp-registry": "paritytech/dapp-registry",
"@parity/dapp-signaturereg": "paritytech/dapp-signaturereg",
"@parity/dapp-dappreg": "parity-js/dapp-dappreg#66f2e52dfa9a783cc0ca616505460e5e34ebdf54",
"@parity/dapp-githubhint": "parity-js/dapp-githubhint#b5cdef016e8bead7669ca077be526897ee42f83a",
"@parity/dapp-localtx": "parity-js/dapp-localtx#ac1d82c7f55bf0b6e2bf0e56044f95fa49c52ea9",
"@parity/dapp-registry": "parity-js/dapp-registry#d3aeab6437ebcc5537e5f71a1cd49c8de8329f34",
"@parity/dapp-signaturereg": "parity-js/dapp-signaturereg#f597e976bd89b07809173d7a35e3929ac81e7e75",
"@parity/dapp-status": "js-dist-paritytech/dapp-status",
"@parity/dapp-tokendeploy": "paritytech/dapp-tokendeploy",
"@parity/dapp-tokenreg": "paritytech/dapp-tokenreg",
"@parity/dapp-tokendeploy": "parity-js/dapp-tokendeploy#0f6f9f2adb82c02e35056dba792a75e95f440cdd",
"@parity/dapp-tokenreg": "parity-js/dapp-tokenreg#9750a2c10a934f9ae0e6e7fd6fa5b9e25a7b0785",
"babel-cli": "6.26.0",
"babel-core": "6.26.0",
"babel-eslint": "7.1.1",
@ -141,12 +141,12 @@
"yargs": "6.6.0"
},
"dependencies": {
"@parity/api": "^2.1.14",
"@parity/plugin-signer-account": "paritytech/plugin-signer-account",
"@parity/plugin-signer-default": "paritytech/plugin-signer-default",
"@parity/plugin-signer-hardware": "paritytech/plugin-signer-hardware",
"@parity/plugin-signer-qr": "paritytech/plugin-signer-qr",
"@parity/shared": "2.2.21",
"@parity/api": "2.1.15",
"@parity/plugin-signer-account": "parity-js/plugin-signer-account#c1272caa242c8b97dac78e5d0b1e068614657fdc",
"@parity/plugin-signer-default": "parity-js/plugin-signer-default#9a47bded9d6d70b69bb2f719732bd6f7854d1842",
"@parity/plugin-signer-hardware": "parity-js/plugin-signer-hardware#4320d818a053d4efae890b74a7476e4c8dc6ba10",
"@parity/plugin-signer-qr": "parity-js/plugin-signer-qr#2d1fafad347ba53eaf58c14265d4d07631d6a45c",
"@parity/shared": "2.2.23",
"@parity/ui": "3.0.22",
"keythereum": "1.0.2",
"lodash.flatten": "4.4.0",
@ -170,7 +170,6 @@
"redux": "3.7.2",
"semantic-ui-react": "0.77.0",
"solc": "ngotchac/solc-js",
"store": "1.3.20",
"web3": "1.0.0-beta.26"
"store": "1.3.20"
}
}

View File

@ -4,7 +4,7 @@ set -e
# variables
PVER="1-9"
UTCDATE=`date -u "+%Y%m%d-%H%M%S"`
BRANCH=$CI_BUILD_REF_NAME
BRANCH="beta"
GIT_PARITY="https://${GITHUB_JS_PRECOMPILED}:@github.com/paritytech/parity.git"
echo "*** [cargo] Setting up GitHub config for parity"

View File

@ -4,8 +4,9 @@ set -e
# variables
PVER="1-9"
PTYPE="shell"
TRACK="beta"
UTCDATE=`date -u "+%Y%m%d-%H%M%S"`
PRE_REPO="js-dist-paritytech/parity-${CI_BUILD_REF_NAME}-${PVER}-${PTYPE}.git"
PRE_REPO="js-dist-paritytech/parity-${TRACK}-${PVER}-${PTYPE}.git"
PRE_REPO_TOKEN="https://${GITHUB_JS_PRECOMPILED}:@github.com/${PRE_REPO}"
BASEDIR=`dirname $0`

View File

@ -1 +1 @@
// test script 29
// test script 30

View File

@ -26,11 +26,14 @@
.list {
margin: 0 !important;
padding: 1em 1em !important;
background-color: #f5f5f5;
background-color: white;
}
.isDefault {
background-color: white;
.accountsList {
background-color: #f5f5f5;
height: 300px;
overflow-y: auto;
}
.hasOtherAccounts {

View File

@ -68,14 +68,14 @@ class DefaultAccount extends Component {
}
content={
<div>
<List relaxed='very' selection className={ [styles.list, styles.isDefault, allAccounts.length > 1 && styles.hasOtherAccounts].join(' ') }>
<List relaxed='very' selection className={ [styles.list, allAccounts.length > 1 && styles.hasOtherAccounts].join(' ') }>
<AccountItem
isDefault
account={ defaultAccount }
/>
</List>
{allAccounts.length > 1 &&
<List relaxed='very' selection className={ styles.list } divided>
<List relaxed='very' selection className={ [styles.list, styles.accountsList].join(' ') } divided>
{allAccounts
.filter(({ address }) => address !== defaultAddress)
.map(account => (

View File

@ -16,7 +16,6 @@
import Api from '@parity/api';
import qs from 'query-string';
import Web3 from 'web3';
function initProvider () {
const path = window.location.pathname.split('/');
@ -48,24 +47,9 @@ function initProvider () {
}
function initWeb3 (ethereum) {
// FIXME: Use standard provider for web3
const provider = new Api.Provider.SendAsync(ethereum);
const web3 = new Web3(provider);
const currentProvider = new Api.Provider.SendAsync(ethereum);
if (!web3.currentProvider) {
web3.currentProvider = provider;
}
// set default account
web3.eth.getAccounts((error, accounts) => {
if (error || !accounts || !accounts[0]) {
return;
}
web3.eth.defaultAccount = accounts[0];
});
window.web3 = web3;
window.web3 = { currentProvider };
}
function initParity (ethereum) {

View File

@ -13,6 +13,8 @@
0ACF9AC71E30FAB600D5C935 /* MainMenu.xib in Resources */ = {isa = PBXBuildFile; fileRef = 0ACF9AC51E30FAB600D5C935 /* MainMenu.xib */; };
0AE564F11E3CE42C00BD01F7 /* GetBSDProcessList.swift in Sources */ = {isa = PBXBuildFile; fileRef = 0AE564F01E3CE42C00BD01F7 /* GetBSDProcessList.swift */; };
0AED4DA01E3E22F800BF87C0 /* ethstore in CopyFiles */ = {isa = PBXBuildFile; fileRef = 0AED4D9F1E3E22F800BF87C0 /* ethstore */; settings = {ATTRIBUTES = (CodeSignOnCopy, ); }; };
84CF92B3200E559900AD6E78 /* parity-evm in CopyFiles */ = {isa = PBXBuildFile; fileRef = 84CF92B2200E559900AD6E78 /* parity-evm */; settings = {ATTRIBUTES = (CodeSignOnCopy, ); }; };
84CF92B6200E56AE00AD6E78 /* ethkey in CopyFiles */ = {isa = PBXBuildFile; fileRef = 84CF92B5200E56AE00AD6E78 /* ethkey */; settings = {ATTRIBUTES = (CodeSignOnCopy, ); }; };
/* End PBXBuildFile section */
/* Begin PBXCopyFilesBuildPhase section */
@ -22,6 +24,8 @@
dstPath = "";
dstSubfolderSpec = 6;
files = (
84CF92B6200E56AE00AD6E78 /* ethkey in CopyFiles */,
84CF92B3200E559900AD6E78 /* parity-evm in CopyFiles */,
0AED4DA01E3E22F800BF87C0 /* ethstore in CopyFiles */,
0A7A475D1E3D2CDD0093D1AB /* parity in CopyFiles */,
);
@ -38,6 +42,8 @@
0ACF9AC81E30FAB600D5C935 /* Info.plist */ = {isa = PBXFileReference; lastKnownFileType = text.plist.xml; path = Info.plist; sourceTree = "<group>"; };
0AE564F01E3CE42C00BD01F7 /* GetBSDProcessList.swift */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.swift; path = GetBSDProcessList.swift; sourceTree = "<group>"; };
0AED4D9F1E3E22F800BF87C0 /* ethstore */ = {isa = PBXFileReference; lastKnownFileType = "compiled.mach-o.executable"; name = ethstore; path = ../target/release/ethstore; sourceTree = "<group>"; };
84CF92B2200E559900AD6E78 /* parity-evm */ = {isa = PBXFileReference; lastKnownFileType = "compiled.mach-o.executable"; name = "parity-evm"; path = "../target/release/parity-evm"; sourceTree = "<group>"; };
84CF92B5200E56AE00AD6E78 /* ethkey */ = {isa = PBXFileReference; lastKnownFileType = "compiled.mach-o.executable"; name = ethkey; path = ../target/release/ethkey; sourceTree = "<group>"; };
/* End PBXFileReference section */
/* Begin PBXFrameworksBuildPhase section */
@ -54,6 +60,8 @@
0ACF9AB51E30FAB600D5C935 = {
isa = PBXGroup;
children = (
84CF92B5200E56AE00AD6E78 /* ethkey */,
84CF92B2200E559900AD6E78 /* parity-evm */,
0AED4D9F1E3E22F800BF87C0 /* ethstore */,
0A7A475C1E3D2CDD0093D1AB /* parity */,
0ACF9AC01E30FAB600D5C935 /* Parity Ethereum */,
@ -110,7 +118,7 @@
isa = PBXProject;
attributes = {
LastSwiftUpdateCheck = 0800;
LastUpgradeCheck = 0800;
LastUpgradeCheck = 0820;
ORGANIZATIONNAME = "Parity Technologies";
TargetAttributes = {
0ACF9ABD1E30FAB600D5C935 = {
@ -192,6 +200,7 @@
CLANG_WARN_INFINITE_RECURSION = YES;
CLANG_WARN_INT_CONVERSION = YES;
CLANG_WARN_OBJC_ROOT_CLASS = YES_ERROR;
CLANG_WARN_SUSPICIOUS_MOVE = YES;
CLANG_WARN_SUSPICIOUS_MOVES = YES;
CLANG_WARN_UNREACHABLE_CODE = YES;
CLANG_WARN__DUPLICATE_METHOD_MATCH = YES;
@ -241,6 +250,7 @@
CLANG_WARN_INFINITE_RECURSION = YES;
CLANG_WARN_INT_CONVERSION = YES;
CLANG_WARN_OBJC_ROOT_CLASS = YES_ERROR;
CLANG_WARN_SUSPICIOUS_MOVE = YES;
CLANG_WARN_SUSPICIOUS_MOVES = YES;
CLANG_WARN_UNREACHABLE_CODE = YES;
CLANG_WARN__DUPLICATE_METHOD_MATCH = YES;

View File

@ -462,7 +462,7 @@
<key>OVERWRITE_PERMISSIONS</key>
<false/>
<key>VERSION</key>
<string>1.9.0</string>
<string>1.9.1</string>
</dict>
<key>UUID</key>
<string>2DCD5B81-7BAF-4DA1-9251-6274B089FD36</string>

View File

@ -1,4 +1,4 @@
// Copyright 2015-2017 Parity Technologies (UK) Ltd.
// Copyright 2015-2018 Parity Technologies (UK) Ltd.
// This file is part of Parity.
// Parity is free software: you can redistribute it and/or modify
@ -22,12 +22,12 @@ import Cocoa
class AppDelegate: NSObject, NSApplicationDelegate {
@IBOutlet weak var statusMenu: NSMenu!
@IBOutlet weak var startAtLogonMenuItem: NSMenuItem!
let statusItem = NSStatusBar.system().statusItem(withLength: NSVariableStatusItemLength)
var parityPid: Int32? = nil
var commandLine: [String] = []
let defaultDefaults = "{\"fat_db\":false,\"mode\":\"passive\",\"mode.alarm\":3600,\"mode.timeout\":300,\"pruning\":\"fast\",\"tracing\":false}"
func menuAppPath() -> String {
return Bundle.main.executablePath!
}
@ -40,20 +40,20 @@ class AppDelegate: NSObject, NSApplicationDelegate {
return NSRunningApplication.runningApplications(withBundleIdentifier: Bundle.main.bundleIdentifier!).count > 1
}
func isParityRunning() -> Bool {
if let pid = self.parityPid {
return kill(pid, 0) == 0
}
return false
}
func killParity() {
if let pid = self.parityPid {
kill(pid, SIGKILL)
}
}
func openUI() {
let parity = Process()
parity.launchPath = self.parityPath()
@ -61,29 +61,29 @@ class AppDelegate: NSObject, NSApplicationDelegate {
parity.arguments!.append("ui")
parity.launch()
}
func writeConfigFiles() {
let basePath = FileManager.default.urls(for: .applicationSupportDirectory, in: .userDomainMask).first?
.appendingPathComponent(Bundle.main.bundleIdentifier!, isDirectory: true)
if FileManager.default.fileExists(atPath: basePath!.path) {
return
}
do {
let defaultsFileDir = basePath?.appendingPathComponent("chains").appendingPathComponent("ethereum")
let defaultsFile = defaultsFileDir?.appendingPathComponent("user_defaults")
try FileManager.default.createDirectory(atPath: (defaultsFileDir?.path)!, withIntermediateDirectories: true, attributes: nil)
if !FileManager.default.fileExists(atPath: defaultsFile!.path) {
try defaultDefaults.write(to: defaultsFile!, atomically: false, encoding: String.Encoding.utf8)
}
let configFile = basePath?.appendingPathComponent("config.toml")
}
catch {}
}
func autostartEnabled() -> Bool {
return itemReferencesInLoginItems().existingReference != nil
}
@ -123,7 +123,7 @@ class AppDelegate: NSObject, NSApplicationDelegate {
}
return (nil, nil)
}
func toggleLaunchAtStartup() {
let itemReferences = itemReferencesInLoginItems()
let shouldBeToggled = (itemReferences.existingReference == nil)
@ -155,7 +155,7 @@ class AppDelegate: NSObject, NSApplicationDelegate {
func launchParity() {
self.commandLine = CommandLine.arguments.dropFirst().filter({ $0 != "ui"})
let processes = GetBSDProcessList()!
let parityProcess = processes.index(where: {
var name = $0.kp_proc.p_comm
@ -166,7 +166,7 @@ class AppDelegate: NSObject, NSApplicationDelegate {
}
return str == "parity"
})
if parityProcess == nil {
let parity = Process()
let p = self.parityPath()
@ -178,7 +178,7 @@ class AppDelegate: NSObject, NSApplicationDelegate {
self.parityPid = processes[parityProcess!].kp_proc.p_pid
}
}
func applicationDidFinishLaunching(_ aNotification: Notification) {
if self.isAlreadyRunning() {
openUI()
@ -188,12 +188,12 @@ class AppDelegate: NSObject, NSApplicationDelegate {
self.writeConfigFiles()
self.launchParity()
Timer.scheduledTimer(withTimeInterval: 1.0, repeats: true, block: {_ in
Timer.scheduledTimer(withTimeInterval: 1.0, repeats: true, block: {_ in
if !self.isParityRunning() {
NSApplication.shared().terminate(self)
}
})
let icon = NSImage(named: "statusIcon")
icon?.isTemplate = true // best for dark mode
statusItem.image = icon
@ -206,19 +206,18 @@ class AppDelegate: NSObject, NSApplicationDelegate {
}
return true
}
@IBAction func quitClicked(_ sender: NSMenuItem) {
self.killParity()
NSApplication.shared().terminate(self)
}
@IBAction func openClicked(_ sender: NSMenuItem) {
self.openUI()
}
@IBAction func startAtLogonClicked(_ sender: NSMenuItem) {
self.toggleLaunchAtStartup()
}
}

Binary file not shown (before: 115 KiB, after: 51 KiB).

Binary file not shown (before: 372 B, after: 679 B).

Binary file not shown (before: 1.3 KiB, after: 3.6 KiB).

Binary file not shown (before: 707 B, after: 1.7 KiB).

View File

@ -1,4 +1,4 @@
// Copyright 2015-2017 Parity Technologies (UK) Ltd.
// Copyright 2015-2018 Parity Technologies (UK) Ltd.
// This file is part of Parity.
// Parity is free software: you can redistribute it and/or modify
@ -21,21 +21,21 @@ import Foundation
import Darwin
public func GetBSDProcessList() -> ([kinfo_proc]?) {
var done = false
var result: [kinfo_proc]?
var err: Int32
repeat {
let name = [CTL_KERN, KERN_PROC, KERN_PROC_ALL, 0];
let namePointer = name.withUnsafeBufferPointer { UnsafeMutablePointer<Int32>(mutating: $0.baseAddress) }
var length: Int = 0
err = sysctl(namePointer, u_int(name.count), nil, &length, nil, 0)
if err == -1 {
err = errno
}
if err == 0 {
let count = length / MemoryLayout<kinfo_proc>.stride
result = [kinfo_proc](repeating: kinfo_proc(), count: count)
@ -54,6 +54,6 @@ public func GetBSDProcessList() -> ([kinfo_proc]?) {
}
}
} while err == 0 && !done
return result
}

View File

@ -3,8 +3,6 @@ Parity Wallet
Welcome to Parity Wallet, your all-in-one Ethereum node and wallet.
WARNING: This installer is **EXPERIMENTAL**. Use it at your own risk.
If you continue, Parity will be installed as a user service. You will be able to use the Parity Wallet through your browser by using the menu bar icon, following the shortcut in the Launchpad or navigating to http://localhost:8080/ in your browser.
If you continue, Parity will be installed as a user service. You will be able to use the Parity Wallet through your browser by using the menu bar icon, following the shortcut in the Launchpad or navigating to http://localhost:8180/ in your browser.
Parity is distributed under the terms of the GPL.

View File

@ -9,5 +9,4 @@ fi
PLIST=~/Library/LaunchAgents/io.parity.ethereum.plist
su $SUDO_USER -c "launchctl stop io.parity.ethereum"
su $SUDO_USER -c "launchctl unload $PLIST"
rm -f /usr/local/libexec/parity /usr/local/libexec/uninstall-parity.sh /usr/local/bin/ethstore $PLIST
rm -f /usr/local/libexec/parity /usr/local/libexec/uninstall-parity.sh /usr/local/bin/ethstore /usr/local/bin/ethkey /usr/local/bin/parity-evm $PLIST

View File

@ -6,17 +6,17 @@
!define SYNC_TERM 0x00100001
!define APPNAME "Parity"
!define COMPANYNAME "Parity"
!define COMPANYNAME "Parity Technologies"
!define DESCRIPTION "Fast, light, robust Ethereum implementation"
!define VERSIONMAJOR 1
!define VERSIONMINOR 9
!define VERSIONBUILD 0
!define VERSIONBUILD 1
!define ARGS ""
!define FIRST_START_ARGS "--mode=passive ui"
!addplugindir .\
!define HELPURL "https://github.com/paritytech/parity/wiki" # "Support Information" link
!define HELPURL "https://paritytech.github.io/wiki/" # "Support Information" link
!define UPDATEURL "https://github.com/paritytech/parity/releases" # "Product Updates" link
!define ABOUTURL "https://github.com/paritytech/parity" # "Publisher" link
!define INSTALLSIZE 26120
@ -88,14 +88,13 @@ section "install"
!insertmacro TerminateApp
# Files added here should be removed by the uninstaller (see section "uninstall")
file /oname=parity.exe ..\target\release\parity.exe
file /oname=parity.exe ..\target\x86_64-pc-windows-msvc\release\parity.exe
file /oname=parity-evm.exe ..\target\x86_64-pc-windows-msvc\release\parity-evm.exe
file /oname=ethstore.exe ..\target\x86_64-pc-windows-msvc\release\ethstore.exe
file /oname=ethkey.exe ..\target\x86_64-pc-windows-msvc\release\ethkey.exe
file /oname=ptray.exe ..\windows\ptray\x64\Release\ptray.exe
file "logo.ico"
file vc_redist.x64.exe
ExecWait '"$INSTDIR\vc_redist.x64.exe" /passive /norestart'
delete $INSTDIR\vc_redist.x64.exe
# Add any other files for the install directory (license files, app data, etc) here
# Uninstaller - See function un.onInit and section "uninstall" for configuration
@ -167,6 +166,9 @@ section "uninstall"
# Remove files
delete $INSTDIR\parity.exe
delete $INSTDIR\parity-evm.exe
delete $INSTDIR\ethstore.exe
delete $INSTDIR\ethkey.exe
delete $INSTDIR\ptray.exe
delete $INSTDIR\logo.ico
@ -187,4 +189,3 @@ section "uninstall"
DeleteRegValue HKLM "Software\Microsoft\Windows\CurrentVersion\Run" "${APPNAME}"
DeleteRegValue HKCU "Software\Microsoft\Windows\CurrentVersion\Run" "${APPNAME}"
sectionEnd

View File

@ -257,12 +257,7 @@ usage! {
ARG arg_mode: (String) = "last", or |c: &Config| otry!(c.parity).mode.clone(),
"--mode=[MODE]",
"Set the operating mode. MODE can be one of:
last - Uses the last-used mode, active if none.
active - Parity continuously syncs the chain.
passive - Parity syncs initially, then sleeps and wakes regularly to resync.
dark - Parity syncs only when the RPC is active.
offline - Parity doesn't sync.",
"Set the operating mode. MODE can be one of: last - Uses the last-used mode, active if none; active - Parity continuously syncs the chain; passive - Parity syncs initially, then sleeps and wakes regularly to resync; dark - Parity syncs only when the RPC is active; offline - Parity doesn't sync.",
ARG arg_mode_timeout: (u64) = 300u64, or |c: &Config| otry!(c.parity).mode_timeout.clone(),
"--mode-timeout=[SECS]",
@ -274,19 +269,11 @@ usage! {
ARG arg_auto_update: (String) = "critical", or |c: &Config| otry!(c.parity).auto_update.clone(),
"--auto-update=[SET]",
"Set a releases set to automatically update and install.
all - All updates in the our release track.
critical - Only consensus/security updates.
none - No updates will be auto-installed.",
"Set a releases set to automatically update and install. SET can be one of: all - All updates in the our release track; critical - Only consensus/security updates; none - No updates will be auto-installed.",
ARG arg_release_track: (String) = "current", or |c: &Config| otry!(c.parity).release_track.clone(),
"--release-track=[TRACK]",
"Set which release track we should use for updates.
stable - Stable releases.
beta - Beta releases.
nightly - Nightly releases (unstable).
testing - Testing releases (do not use).
current - Whatever track this executable was released on",
"Set which release track we should use for updates. TRACK can be one of: stable - Stable releases; beta - Beta releases; nightly - Nightly releases (unstable); testing - Testing releases (do not use); current - Whatever track this executable was released on.",
ARG arg_chain: (String) = "foundation", or |c: &Config| otry!(c.parity).chain.clone(),
"--chain=[CHAIN]",
@ -311,8 +298,7 @@ usage! {
["Convenience options"]
FLAG flag_unsafe_expose: (bool) = false, or |c: &Config| otry!(c.misc).unsafe_expose,
"--unsafe-expose",
"All servers will listen on external interfaces and will be remotely accessible. It's equivalent with setting the following: --{{ws,jsonrpc,ui,ipfs,secret_store,stratum}}-interface=all --*-hosts=all
This option is UNSAFE and should be used with great care!",
"All servers will listen on external interfaces and will be remotely accessible. It's equivalent with setting the following: --{{ws,jsonrpc,ui,ipfs,secret_store,stratum}}-interface=all --*-hosts=all This option is UNSAFE and should be used with great care!",
ARG arg_config: (String) = "$BASE/config.toml", or |_| None,
"-c, --config=[CONFIG]",
@ -498,7 +484,7 @@ usage! {
ARG arg_ws_hosts: (String) = "none", or |c: &Config| otry!(c.websockets).hosts.as_ref().map(|vec| vec.join(",")),
"--ws-hosts=[HOSTS]",
"List of allowed Host header values. This option will validate the Host header sent by the browser, it is additional security against some attack vectors. Special options: \"all\", \"none\",.",
"List of allowed Host header values. This option will validate the Host header sent by the browser, it is additional security against some attack vectors. Special options: \"all\", \"none\".",
["API and console options IPC"]
FLAG flag_no_ipc: (bool) = false, or |c: &Config| otry!(c.ipc).disable.clone(),
@ -582,7 +568,7 @@ usage! {
ARG arg_secretstore_path: (String) = "$BASE/secretstore", or |c: &Config| otry!(c.secretstore).path.clone(),
"--secretstore-path=[PATH]",
"Specify directory where Secret Store should save its data..",
"Specify directory where Secret Store should save its data.",
ARG arg_secretstore_secret: (Option<String>) = None, or |c: &Config| otry!(c.secretstore).self_secret.clone(),
"--secretstore-secret=[SECRET]",
@ -671,7 +657,7 @@ usage! {
ARG arg_tx_queue_gas: (String) = "off", or |c: &Config| otry!(c.mining).tx_queue_gas.clone(),
"--tx-queue-gas=[LIMIT]",
"Maximum amount of total gas for external transactions in the queue. LIMIT can be either an amount of gas or 'auto' or 'off'. 'auto' sets the limit to be 20x the current block gas limit..",
"Maximum amount of total gas for external transactions in the queue. LIMIT can be either an amount of gas or 'auto' or 'off'. 'auto' sets the limit to be 20x the current block gas limit.",
ARG arg_tx_queue_strategy: (String) = "gas_price", or |c: &Config| otry!(c.mining).tx_queue_strategy.clone(),
"--tx-queue-strategy=[S]",
@ -778,7 +764,7 @@ usage! {
ARG arg_pruning_history: (u64) = 64u64, or |c: &Config| otry!(c.footprint).pruning_history.clone(),
"--pruning-history=[NUM]",
"Set a minimum number of recent states to keep when pruning is active..",
"Set a minimum number of recent states to keep when pruning is active.",
ARG arg_pruning_memory: (usize) = 32usize, or |c: &Config| otry!(c.footprint).pruning_memory.clone(),
"--pruning-memory=[MB]",
@ -1322,12 +1308,15 @@ mod tests {
let args = Args::parse(&["parity", "--secretstore-nodes", "abc@127.0.0.1:3333,cde@10.10.10.10:4444"]).unwrap();
assert_eq!(args.arg_secretstore_nodes, "abc@127.0.0.1:3333,cde@10.10.10.10:4444");
// Arguments with a single value shouldn't accept multiple values
let args = Args::parse(&["parity", "--auto-update", "critical", "all"]);
assert!(args.is_err());
let args = Args::parse(&["parity", "--password", "~/.safe/1", "~/.safe/2"]).unwrap();
let args = Args::parse(&["parity", "--password", "~/.safe/1", "--password", "~/.safe/2", "--ui-port", "8123", "ui"]).unwrap();
assert_eq!(args.arg_password, vec!["~/.safe/1".to_owned(), "~/.safe/2".to_owned()]);
assert_eq!(args.arg_ui_port, 8123);
assert_eq!(args.cmd_ui, true);
let args = Args::parse(&["parity", "--password", "~/.safe/1,~/.safe/2", "--ui-port", "8123", "ui"]).unwrap();
assert_eq!(args.arg_password, vec!["~/.safe/1".to_owned(), "~/.safe/2".to_owned()]);
assert_eq!(args.arg_ui_port, 8123);
assert_eq!(args.cmd_ui, true);
}
#[test]

View File

@ -150,14 +150,20 @@ macro_rules! usage {
}
) => {
use toml;
use std::{fs, io, process};
use std::{fs, io, process, cmp};
use std::io::{Read, Write};
use parity_version::version;
use clap::{Arg, App, SubCommand, AppSettings, ArgMatches as ClapArgMatches, Error as ClapError, ErrorKind as ClapErrorKind};
use clap::{Arg, App, SubCommand, AppSettings, ArgSettings, Error as ClapError, ErrorKind as ClapErrorKind};
use dir::helpers::replace_home;
use std::ffi::OsStr;
use std::collections::HashMap;
extern crate textwrap;
extern crate term_size;
use self::textwrap::{Wrapper};
const MAX_TERM_WIDTH: usize = 120;
#[cfg(test)]
use regex::Regex;
@ -365,11 +371,23 @@ macro_rules! usage {
#[allow(unused_mut)] // subc_subc_exist may be assigned true by the macro
#[allow(unused_assignments)] // Rust issue #22630
pub fn print_help() -> String {
const TAB: &str = " ";
const TAB_TAB: &str = " ";
let term_width = match term_size::dimensions() {
None => MAX_TERM_WIDTH,
Some((w, _)) => {
cmp::min(w, MAX_TERM_WIDTH)
}
};
let mut help : String = include_str!("./usage_header.txt").to_owned();
help.push_str("\n\n");
help.push_str("\n");
// Subcommands
let mut subcommands_wrapper = Wrapper::new(term_width).subsequent_indent(TAB);
help.push_str("parity [options]\n");
$(
{
@ -386,11 +404,14 @@ macro_rules! usage {
)*
];
if subc_subc_usages.is_empty() {
help.push_str(&format!("parity [options] {} {}\n", underscore_to_hyphen!(&stringify!($subc)[4..]), underscore_to_hyphen!(&stringify!($subc_subc)[stringify!($subc).len()+1..])));
} else {
help.push_str(&format!("parity [options] {} {} {}\n", underscore_to_hyphen!(&stringify!($subc)[4..]), underscore_to_hyphen!(&stringify!($subc_subc)[stringify!($subc).len()+1..]), subc_subc_usages.join(" ")));
}
help.push_str(&subcommands_wrapper.fill(
format!(
"parity [options] {} {} {}\n",
underscore_to_hyphen!(&stringify!($subc)[4..]),
underscore_to_hyphen!(&stringify!($subc_subc)[stringify!($subc).len()+1..]),
subc_subc_usages.join(" ")
).as_ref())
);
)*
// Print the subcommand on its own only if it has no subsubcommands
@ -404,22 +425,30 @@ macro_rules! usage {
)*
];
if subc_usages.is_empty() {
help.push_str(&format!("parity [options] {}\n", underscore_to_hyphen!(&stringify!($subc)[4..])));
} else {
help.push_str(&format!("parity [options] {} {}\n", underscore_to_hyphen!(&stringify!($subc)[4..]), subc_usages.join(" ")));
}
help.push_str(&subcommands_wrapper.fill(
format!(
"parity [options] {} {}\n",
underscore_to_hyphen!(&stringify!($subc)[4..]),
subc_usages.join(" ")
).as_ref())
);
}
}
)*
help.push_str("\n");
// Arguments and flags
let args_wrapper = Wrapper::new(term_width).initial_indent(TAB_TAB).subsequent_indent(TAB_TAB);
$(
help.push_str("\n");
help.push_str($group_name); help.push_str(":\n");
$(
help.push_str(&format!("\t{}\n\t\t{}\n", $flag_usage, $flag_help));
help.push_str(&format!("{}{}\n{}\n",
TAB, $flag_usage,
args_wrapper.fill($flag_help)
));
help.push_str("\n");
)*
$(
@ -429,10 +458,24 @@ macro_rules! usage {
if_option_vec!(
$($arg_type_tt)+,
THEN {
help.push_str(&format!("\t{}\n\t\t{} (default: {:?})\n", $arg_usage, $arg_help, {let x : inner_option_type!($($arg_type_tt)+)> = $arg_default; x}))
help.push_str(&format!("{}{}\n{}\n",
TAB, $arg_usage,
args_wrapper.fill(format!(
"{} (default: {:?})",
$arg_help,
{let x : inner_option_type!($($arg_type_tt)+)> = $arg_default; x}
).as_ref())
))
}
ELSE {
help.push_str(&format!("\t{}\n\t\t{}{}\n", $arg_usage, $arg_help, $arg_default.map(|x: inner_option_type!($($arg_type_tt)+)| format!(" (default: {})",x)).unwrap_or("".to_owned())))
help.push_str(&format!("{}{}\n{}\n",
TAB, $arg_usage,
args_wrapper.fill(format!(
"{}{}",
$arg_help,
$arg_default.map(|x: inner_option_type!($($arg_type_tt)+)| format!(" (default: {})",x)).unwrap_or("".to_owned())
).as_ref())
))
}
)
}
@ -440,14 +483,27 @@ macro_rules! usage {
if_vec!(
$($arg_type_tt)+,
THEN {
help.push_str(&format!("\t{}\n\t\t{} (default: {:?})\n", $arg_usage, $arg_help, {let x : $($arg_type_tt)+ = $arg_default; x}))
help.push_str(&format!("{}{}\n{}\n", TAB, $arg_usage,
args_wrapper.fill(format!(
"{} (default: {:?})",
$arg_help,
{let x : $($arg_type_tt)+ = $arg_default; x}
).as_ref())
))
}
ELSE {
help.push_str(&format!("\t{}\n\t\t{} (default: {})\n", $arg_usage, $arg_help, $arg_default))
help.push_str(&format!("{}{}\n{}\n", TAB, $arg_usage,
args_wrapper.fill(format!(
"{} (default: {})",
$arg_help,
$arg_default
).as_ref())
))
}
)
}
);
help.push_str("\n");
)*
)*
@ -503,36 +559,6 @@ macro_rules! usage {
args
}
pub fn hydrate_with_globals(self: &mut Self, matches: &ClapArgMatches) -> Result<(), ClapError> {
$(
$(
self.$flag = self.$flag || matches.is_present(stringify!($flag));
)*
$(
if let some @ Some(_) = return_if_parse_error!(if_option!(
$($arg_type_tt)+,
THEN {
if_option_vec!(
$($arg_type_tt)+,
THEN { values_t!(matches, stringify!($arg), inner_option_vec_type!($($arg_type_tt)+)) }
ELSE { value_t!(matches, stringify!($arg), inner_option_type!($($arg_type_tt)+)) }
)
}
ELSE {
if_vec!(
$($arg_type_tt)+,
THEN { values_t!(matches, stringify!($arg), inner_vec_type!($($arg_type_tt)+)) }
ELSE { value_t!(matches, stringify!($arg), $($arg_type_tt)+) }
)
}
)) {
self.$arg = some;
}
)*
)*
Ok(())
}
#[allow(unused_variables)] // the submatches of arg-less subcommands aren't used
pub fn parse<S: AsRef<str>>(command: &[S]) -> Result<Self, ClapError> {
@ -582,21 +608,31 @@ macro_rules! usage {
let matches = App::new("Parity")
.global_setting(AppSettings::VersionlessSubcommands)
.global_setting(AppSettings::DisableHelpSubcommand)
.max_term_width(MAX_TERM_WIDTH)
.help(Args::print_help().as_ref())
.args(&usages.iter().map(|u| Arg::from_usage(u).use_delimiter(false).allow_hyphen_values(true)).collect::<Vec<Arg>>())
.args(&usages.iter().map(|u| {
let mut arg = Arg::from_usage(u)
.allow_hyphen_values(true) // Allow for example --allow-ips -10.0.0.0/8
.global(true) // Argument doesn't have to come before the first subcommand
.hidden(true); // Hide global arguments from the (subcommand) help messages generated by Clap
if arg.is_set(ArgSettings::Multiple) {
arg = arg.require_delimiter(true); // Multiple values can only be separated by commas, not spaces (#7428)
}
arg
}).collect::<Vec<Arg>>())
$(
.subcommand(
SubCommand::with_name(&underscore_to_hyphen!(&stringify!($subc)[4..]))
.about($subc_help)
.args(&subc_usages.get(stringify!($subc)).unwrap().iter().map(|u| Arg::from_usage(u).use_delimiter(false).allow_hyphen_values(true)).collect::<Vec<Arg>>())
.args(&usages.iter().map(|u| Arg::from_usage(u).use_delimiter(false).allow_hyphen_values(true)).collect::<Vec<Arg>>()) // accept global arguments at this position
$(
.setting(AppSettings::SubcommandRequired) // prevent from running `parity account`
.subcommand(
SubCommand::with_name(&underscore_to_hyphen!(&stringify!($subc_subc)[stringify!($subc).len()+1..]))
.about($subc_subc_help)
.args(&subc_usages.get(stringify!($subc_subc)).unwrap().iter().map(|u| Arg::from_usage(u).use_delimiter(false).allow_hyphen_values(true)).collect::<Vec<Arg>>())
.args(&usages.iter().map(|u| Arg::from_usage(u).use_delimiter(false).allow_hyphen_values(true)).collect::<Vec<Arg>>()) // accept global arguments at this position
)
)*
)
@ -605,15 +641,39 @@ macro_rules! usage {
let mut raw_args : RawArgs = Default::default();
raw_args.hydrate_with_globals(&matches)?;
// Globals
$(
$(
raw_args.$flag = raw_args.$flag || matches.is_present(stringify!($flag));
)*
$(
if let some @ Some(_) = return_if_parse_error!(if_option!(
$($arg_type_tt)+,
THEN {
if_option_vec!(
$($arg_type_tt)+,
THEN { values_t!(matches, stringify!($arg), inner_option_vec_type!($($arg_type_tt)+)) }
ELSE { value_t!(matches, stringify!($arg), inner_option_type!($($arg_type_tt)+)) }
)
}
ELSE {
if_vec!(
$($arg_type_tt)+,
THEN { values_t!(matches, stringify!($arg), inner_vec_type!($($arg_type_tt)+)) }
ELSE { value_t!(matches, stringify!($arg), $($arg_type_tt)+) }
)
}
)) {
raw_args.$arg = some;
}
)*
)*
// Subcommands
$(
if let Some(submatches) = matches.subcommand_matches(&underscore_to_hyphen!(&stringify!($subc)[4..])) {
raw_args.$subc = true;
// Globals
raw_args.hydrate_with_globals(&submatches)?;
// Subcommand flags
$(
raw_args.$subc_flag = submatches.is_present(&stringify!($subc_flag));
@ -643,8 +703,6 @@ macro_rules! usage {
if let Some(subsubmatches) = submatches.subcommand_matches(&underscore_to_hyphen!(&stringify!($subc_subc)[stringify!($subc).len()+1..])) {
raw_args.$subc_subc = true;
// Globals
raw_args.hydrate_with_globals(&subsubmatches)?;
// Sub-subcommand flags
$(
raw_args.$subc_subc_flag = subsubmatches.is_present(&stringify!($subc_subc_flag));

View File

@ -1,3 +1,3 @@
Parity. Ethereum Client.
By Wood/Paronyan/Kotewicz/Drwięga/Volf et al.
Copyright 2015, 2016, 2017 Parity Technologies (UK) Ltd
Copyright 2015, 2016, 2017, 2018 Parity Technologies (UK) Ltd

View File

@ -1,6 +1,6 @@
Parity
version {}
Copyright 2015, 2016, 2017 Parity Technologies (UK) Ltd
Copyright 2015, 2016, 2017, 2018 Parity Technologies (UK) Ltd
License GPLv3+: GNU GPL version 3 or later <http://gnu.org/licenses/gpl.html>.
This is free software: you are free to change and redistribute it.
There is NO WARRANTY, to the extent permitted by law.

View File

@ -586,7 +586,12 @@ impl Configuration {
let mut extra_embed = dev_ui.clone();
match self.ui_hosts() {
// In case host validation is disabled allow all frame ancestors
None => extra_embed.push(("*".to_owned(), ui_port)),
None => {
// NOTE Chrome does not seem to support "*:<port>"
// we use `http(s)://*:<port>` instead.
extra_embed.push(("http://*".to_owned(), ui_port));
extra_embed.push(("https://*".to_owned(), ui_port));
},
Some(hosts) => extra_embed.extend(hosts.into_iter().filter_map(|host| {
let mut it = host.split(":");
let host = it.next();
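
The hunk above works around Chrome apparently rejecting a bare "*:<port>" frame ancestor: when UI host validation is disabled, the embed list now gets scheme-qualified wildcards for the UI port instead. A tiny self-contained sketch of that expansion follows; extra_embed_hosts is a hypothetical helper, and the port-parsing fallback is an assumption since the hunk is truncated here.

// Hypothetical helper mirroring the hunk above.
fn extra_embed_hosts(ui_port: u16, ui_hosts: Option<Vec<String>>) -> Vec<(String, u16)> {
    match ui_hosts {
        // Host validation disabled: allow any http/https origin on the UI port,
        // since "*:<port>" alone does not appear to be accepted by Chrome.
        None => vec![("http://*".to_owned(), ui_port), ("https://*".to_owned(), ui_port)],
        // Otherwise keep the configured hosts, defaulting to the UI port.
        Some(hosts) => hosts
            .into_iter()
            .filter_map(|host| {
                let mut it = host.split(':');
                let name = it.next()?.to_owned();
                let port = it.next().and_then(|p| p.parse().ok()).unwrap_or(ui_port);
                Some((name, port))
            })
            .collect(),
    }
}

fn main() {
    assert_eq!(
        extra_embed_hosts(8180, None),
        vec![("http://*".to_owned(), 8180), ("https://*".to_owned(), 8180)]
    );
}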

View File

@ -16,6 +16,8 @@
use std::fmt;
use std::sync::{Arc, Weak};
use std::time::{Duration, Instant};
use std::thread;
use std::net::{TcpListener};
use ctrlc::CtrlC;
@ -171,8 +173,10 @@ impl ::local_store::NodeInfo for FullNodeInfo {
}
}
type LightClient = ::light::client::Client<::light_helpers::EpochFetch>;
// helper for light execution.
fn execute_light(cmd: RunCmd, can_restart: bool, logger: Arc<RotatingLogger>) -> Result<(bool, Option<String>), String> {
fn execute_light_impl(cmd: RunCmd, can_restart: bool, logger: Arc<RotatingLogger>) -> Result<((bool, Option<String>), Weak<LightClient>), String> {
use light::client as light_client;
use ethsync::{LightSyncParams, LightSync, ManageNetwork};
use parking_lot::{Mutex, RwLock};
@ -237,8 +241,9 @@ fn execute_light(cmd: RunCmd, can_restart: bool, logger: Arc<RotatingLogger>) ->
let service = light_client::Service::start(config, &spec, fetch, &db_dirs.client_path(algorithm), cache.clone())
.map_err(|e| format!("Error starting light client: {}", e))?;
let client = service.client();
let txq = Arc::new(RwLock::new(::light::transaction_queue::TransactionQueue::default()));
let provider = ::light::provider::LightProvider::new(service.client().clone(), txq.clone());
let provider = ::light::provider::LightProvider::new(client.clone(), txq.clone());
// start network.
// set up bootnodes
@ -275,7 +280,7 @@ fn execute_light(cmd: RunCmd, can_restart: bool, logger: Arc<RotatingLogger>) ->
// queue cull service.
let queue_cull = Arc::new(::light_helpers::QueueCull {
client: service.client().clone(),
client: client.clone(),
sync: light_sync.clone(),
on_demand: on_demand.clone(),
txq: txq.clone(),
@ -299,7 +304,7 @@ fn execute_light(cmd: RunCmd, can_restart: bool, logger: Arc<RotatingLogger>) ->
let signer_service = Arc::new(signer::new_service(&cmd.ws_conf, &cmd.ui_conf, &cmd.logger_config));
let (node_health, dapps_deps) = {
let contract_client = Arc::new(::dapps::LightRegistrar {
client: service.client().clone(),
client: client.clone(),
sync: light_sync.clone(),
on_demand: on_demand.clone(),
});
@ -342,7 +347,7 @@ fn execute_light(cmd: RunCmd, can_restart: bool, logger: Arc<RotatingLogger>) ->
let dapps_service = dapps::service(&dapps_middleware);
let deps_for_rpc_apis = Arc::new(rpc_apis::LightDependencies {
signer_service: signer_service,
client: service.client().clone(),
client: client.clone(),
sync: light_sync.clone(),
net: light_sync.clone(),
health: node_health,
@ -382,7 +387,7 @@ fn execute_light(cmd: RunCmd, can_restart: bool, logger: Arc<RotatingLogger>) ->
// the informant
let informant = Arc::new(Informant::new(
LightNodeInformantData {
client: service.client().clone(),
client: client.clone(),
sync: light_sync.clone(),
cache: cache,
},
@ -397,26 +402,13 @@ fn execute_light(cmd: RunCmd, can_restart: bool, logger: Arc<RotatingLogger>) ->
let res = wait_for_exit(None, None, can_restart);
informant.shutdown();
Ok(res)
// Create a weak reference to the client so that we can wait on shutdown until it is dropped
let weak_client = Arc::downgrade(&client);
Ok((res, weak_client))
}
pub fn execute(cmd: RunCmd, can_restart: bool, logger: Arc<RotatingLogger>) -> Result<(bool, Option<String>), String> {
if cmd.ui && cmd.dapps_conf.enabled {
// Check if Parity is already running
let addr = format!("{}:{}", cmd.ui_conf.interface, cmd.ui_conf.port);
if !TcpListener::bind(&addr as &str).is_ok() {
return open_ui(&cmd.ws_conf, &cmd.ui_conf, &cmd.logger_config).map(|_| (false, None));
}
}
// increase max number of open files
raise_fd_limit();
// run as light client.
if cmd.light {
return execute_light(cmd, can_restart, logger);
}
pub fn execute_impl(cmd: RunCmd, can_restart: bool, logger: Arc<RotatingLogger>) -> Result<((bool, Option<String>), Weak<Client>), String> {
// load spec
let spec = cmd.spec.spec(&cmd.dirs.cache)?;
@ -855,6 +847,9 @@ pub fn execute(cmd: RunCmd, can_restart: bool, logger: Arc<RotatingLogger>) -> R
open_dapp(&cmd.dapps_conf, &cmd.http_conf, &dapp)?;
}
// Create a weak reference to the client so that we can wait on shutdown until it is dropped
let weak_client = Arc::downgrade(&client);
// Handle exit
let restart = wait_for_exit(Some(updater), Some(client), can_restart);
@ -868,7 +863,33 @@ pub fn execute(cmd: RunCmd, can_restart: bool, logger: Arc<RotatingLogger>) -> R
// just Arc is dropping here, to allow other reference release in its default time
drop(informant);
Ok(restart)
Ok((restart, weak_client))
}
pub fn execute(cmd: RunCmd, can_restart: bool, logger: Arc<RotatingLogger>) -> Result<(bool, Option<String>), String> {
if cmd.ui && cmd.dapps_conf.enabled {
// Check if Parity is already running
let addr = format!("{}:{}", cmd.ui_conf.interface, cmd.ui_conf.port);
if !TcpListener::bind(&addr as &str).is_ok() {
return open_ui(&cmd.ws_conf, &cmd.ui_conf, &cmd.logger_config).map(|_| (false, None));
}
}
// increase max number of open files
raise_fd_limit();
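// Helper: once the run loop has exited, block until the client Arc has actually been dropped (see wait_for_drop below), then return the restart status.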
fn wait<T>(res: Result<((bool, Option<String>), Weak<T>), String>) -> Result<(bool, Option<String>), String> {
res.map(|(restart, weak_client)| {
wait_for_drop(weak_client);
restart
})
}
if cmd.light {
wait(execute_light_impl(cmd, can_restart, logger))
} else {
wait(execute_impl(cmd, can_restart, logger))
}
}
#[cfg(not(windows))]
@ -1001,3 +1022,27 @@ fn wait_for_exit(
let _ = exit.1.wait(&mut l);
l.clone()
}
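// Poll the weak reference once a second; warn after 60 seconds and give up with an unclean-shutdown warning after 300 seconds.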
fn wait_for_drop<T>(w: Weak<T>) {
let sleep_duration = Duration::from_secs(1);
let warn_timeout = Duration::from_secs(60);
let max_timeout = Duration::from_secs(300);
let instant = Instant::now();
let mut warned = false;
while instant.elapsed() < max_timeout {
if w.upgrade().is_none() {
return;
}
if !warned && instant.elapsed() > warn_timeout {
warned = true;
warn!("Shutdown is taking longer than expected.");
}
thread::sleep(sleep_duration);
}
warn!("Shutdown timeout reached, exiting uncleanly.");
}


@ -147,19 +147,30 @@ impl LightFetch {
Err(e) => return Box::new(future::err(e)),
};
let maybe_future = self.sync.with_context(move |ctx| {
Box::new(self.on_demand.request_raw(ctx, reqs)
.expect("all back-references known to be valid; qed")
.map(|res| extract_header(&res, header_ref)
.expect("these responses correspond to requests that header_ref belongs to. \
therefore it will not fail; qed"))
.map_err(errors::on_demand_cancel))
});
match maybe_future {
Some(recv) => recv,
None => Box::new(future::err(errors::network_disabled()))
}
self.send_requests(reqs, |res|
extract_header(&res, header_ref)
.expect("these responses correspond to requests that header_ref belongs to \
therefore it will not fail; qed")
)
}
/// Helper for getting contract code at a given block.
pub fn code(&self, address: Address, id: BlockId) -> BoxFuture<Vec<u8>> {
let mut reqs = Vec::new();
let header_ref = match self.make_header_requests(id, &mut reqs) {
Ok(r) => r,
Err(e) => return Box::new(future::err(e)),
};
reqs.push(request::Account { header: header_ref.clone(), address: address }.into());
let account_idx = reqs.len() - 1;
reqs.push(request::Code { header: header_ref, code_hash: Field::back_ref(account_idx, 0) }.into());
self.send_requests(reqs, |mut res| match res.pop() {
Some(OnDemandResponse::Code(code)) => code,
_ => panic!("responses correspond directly with requests in amount and type; qed"),
})
}
/// Helper for getting account info at a given block.
@ -173,20 +184,10 @@ impl LightFetch {
reqs.push(request::Account { header: header_ref, address: address }.into());
let maybe_future = self.sync.with_context(move |ctx| {
Box::new(self.on_demand.request_raw(ctx, reqs)
.expect("all back-references known to be valid; qed")
.map(|mut res| match res.pop() {
Some(OnDemandResponse::Account(acc)) => acc,
_ => panic!("responses correspond directly with requests in amount and type; qed"),
})
.map_err(errors::on_demand_cancel))
});
match maybe_future {
Some(recv) => recv,
None => Box::new(future::err(errors::network_disabled()))
}
self.send_requests(reqs, |mut res| match res.pop() {
Some(OnDemandResponse::Account(acc)) => acc,
_ => panic!("responses correspond directly with requests in amount and type; qed"),
})
}
/// Helper for getting proved execution.
@ -277,20 +278,10 @@ impl LightFetch {
reqs.push(request::Body(header_ref).into());
let maybe_future = self.sync.with_context(move |ctx| {
Box::new(self.on_demand.request_raw(ctx, reqs)
.expect(NO_INVALID_BACK_REFS)
.map(|mut res| match res.pop() {
Some(OnDemandResponse::Body(b)) => b,
_ => panic!("responses correspond directly with requests in amount and type; qed"),
})
.map_err(errors::on_demand_cancel))
});
match maybe_future {
Some(recv) => recv,
None => Box::new(future::err(errors::network_disabled()))
}
self.send_requests(reqs, |mut res| match res.pop() {
Some(OnDemandResponse::Body(b)) => b,
_ => panic!("responses correspond directly with requests in amount and type; qed"),
})
}
/// Get the block receipts. Fails on unknown block ID.
@ -303,20 +294,10 @@ impl LightFetch {
reqs.push(request::BlockReceipts(header_ref).into());
let maybe_future = self.sync.with_context(move |ctx| {
Box::new(self.on_demand.request_raw(ctx, reqs)
.expect(NO_INVALID_BACK_REFS)
.map(|mut res| match res.pop() {
Some(OnDemandResponse::Receipts(b)) => b,
_ => panic!("responses correspond directly with requests in amount and type; qed"),
})
.map_err(errors::on_demand_cancel))
});
match maybe_future {
Some(recv) => recv,
None => Box::new(future::err(errors::network_disabled()))
}
self.send_requests(reqs, |mut res| match res.pop() {
Some(OnDemandResponse::Receipts(b)) => b,
_ => panic!("responses correspond directly with requests in amount and type; qed"),
})
}
/// Get transaction logs
@ -433,6 +414,23 @@ impl LightFetch {
Either::B(extract_transaction)
}))
}
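// Shared helper for the methods above: dispatch the on-demand requests over the light sync context, map the raw responses with parse_response, and fall back to a network-disabled error when no context is available.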
fn send_requests<T, F>(&self, reqs: Vec<OnDemandRequest>, parse_response: F) -> BoxFuture<T> where
F: FnOnce(Vec<OnDemandResponse>) -> T + Send + 'static,
T: Send + 'static,
{
let maybe_future = self.sync.with_context(move |ctx| {
Box::new(self.on_demand.request_raw(ctx, reqs)
.expect(NO_INVALID_BACK_REFS)
.map(parse_response)
.map_err(errors::on_demand_cancel))
});
match maybe_future {
Some(recv) => recv,
None => Box::new(future::err(errors::network_disabled()))
}
}
}
#[derive(Clone)]


@ -349,8 +349,8 @@ impl<T: LightChainClient + 'static> Eth for EthClient<T> {
}))
}
fn code_at(&self, _address: RpcH160, _num: Trailing<BlockNumber>) -> BoxFuture<Bytes> {
Box::new(future::err(errors::unimplemented(None)))
fn code_at(&self, address: RpcH160, num: Trailing<BlockNumber>) -> BoxFuture<Bytes> {
Box::new(self.fetcher().code(address.into(), num.unwrap_or_default().into()).map(Into::into))
}
fn send_raw_transaction(&self, raw: Bytes) -> Result<RpcH256> {


@ -208,7 +208,12 @@ impl Parity for ParityClient {
}
fn registry_address(&self) -> Result<Option<H160>> {
Err(errors::light_unimplemented(None))
let reg = self.light_dispatch.client.engine().params().registrar;
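// An all-zero (default) registrar address in the chain spec params means no registry contract is configured.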
if reg == Default::default() {
Ok(None)
} else {
Ok(Some(reg.into()))
}
}
fn rpc_settings(&self) -> Result<RpcSettings> {


@ -12,42 +12,21 @@
### Running coverage
set -x
KCOV=${1:-kcov}
if ! type $KCOV > /dev/null; then
echo "Install kcov first (details inside this file). Aborting."
exit 1
fi
RUSTFLAGS="-C link-dead-code" cargo test --all --exclude parity-ipfs-api --exclude evmjit --no-run || exit $?
RUSTFLAGS="-C link-dead-code" cargo test --all --exclude evmjit --no-run || exit $?
KCOV_TARGET="target/cov"
KCOV_FLAGS="--verify"
EXCLUDE="/usr/lib,\
/usr/include,\
$HOME/.cargo,\
$HOME/.multirust,\
rocksdb,\
secp256k1
"
rm -rf $KCOV_TARGET
EXCLUDE="/usr/lib,/usr/include,$HOME/.cargo,$HOME/.multirust,rocksdb,secp256k1"
mkdir -p $KCOV_TARGET
echo "Cover RUST"
for FILE in `find target/debug/deps ! -name "*.*"`
do
$KCOV --exclude-pattern $EXCLUDE $KCOV_FLAGS $KCOV_TARGET $FILE
done
$KCOV --exclude-pattern $EXCLUDE $KCOV_FLAGS $KCOV_TARGET target/debug/parity-*
do
timeout --signal=SIGKILL 5m kcov --exclude-pattern $EXCLUDE $KCOV_FLAGS $KCOV_TARGET $FILE
done
timeout --signal=SIGKILL 5m kcov --exclude-pattern $EXCLUDE $KCOV_FLAGS $KCOV_TARGET target/debug/parity-*
echo "Cover JS"
cd js
npm install&&npm run test:coverage
cd ..
codecov
bash <(curl -s https://codecov.io/bash)&&
echo "Uploaded code coverage"
exit 0

scripts/gitlab-build.sh Executable file

@ -0,0 +1,292 @@
#!/bin/bash
set -e # fail on any error
set -u # treat unset variables as error
#ARGUMENTS: 1. BUILD_PLATFORM (target for binaries) 2. PLATFORM (target for cargo) 3. ARC (architecture) 4. & 5. CC & CXX flags
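# Example invocation (illustrative values only): ./scripts/gitlab-build.sh x86_64-unknown-linux-gnu x86_64-unknown-linux-gnu amd64 gcc g++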
BUILD_PLATFORM=$1
PLATFORM=$2
ARC=$3
CC=$4
CXX=$5
VER="$(grep -m 1 version Cargo.toml | awk '{print $3}' | tr -d '"' | tr -d "\n")"
S3WIN=""
echo "--------------------"
echo "Build for platform: " $BUILD_PLATFORM
echo "Cargo target: " $PLATFORM
echo "CC&CXX flags: " $CC ", " $CXX
echo "Architecture: " $ARC
echo "Libssl version: " $LIBSSL
echo "Parity version: " $VER
echo "Branch: " $CI_BUILD_REF_NAME
echo "--------------------"
set_env () {
echo "Set ENVIROMENT"
export HOST_CC=gcc
export HOST_CXX=g++
rm -rf .cargo
mkdir -p .cargo
echo "[target.$PLATFORM]" >> .cargo/config
echo "linker= \"$CC\"" >> .cargo/config
cat .cargo/config
}
set_env_win () {
set PLATFORM=x86_64-pc-windows-msvc
set INCLUDE="C:\Program Files (x86)\Microsoft SDKs\Windows\v7.1A\Include;C:\vs2015\VC\include;C:\Program Files (x86)\Windows Kits\10\Include\10.0.10240.0\ucrt"
set LIB="C:\vs2015\VC\lib;C:\Program Files (x86)\Windows Kits\10\Lib\10.0.10240.0\ucrt\x64"
set RUST_BACKTRACE=1
#export RUSTFLAGS=$RUSTFLAGS
rustup default stable-x86_64-pc-windows-msvc
echo "MsBuild.exe windows\ptray\ptray.vcxproj /p:Platform=x64 /p:Configuration=Release" > msbuild.cmd
echo "@ signtool sign /f "\%"1 /p "\%"2 /tr http://timestamp.comodoca.com /du https://parity.io "\%"3" > sign.cmd
}
build () {
echo "Build parity:"
cargo build --target $PLATFORM --features final --release
echo "Build evmbin:"
cargo build --target $PLATFORM --release -p evmbin
echo "Build ethstore-cli:"
cargo build --target $PLATFORM --release -p ethstore-cli
echo "Build ethkey-cli:"
cargo build --target $PLATFORM --release -p ethkey-cli
}
strip_md5 () {
echo "Strip binaries:"
$STRIP_BIN -v target/$PLATFORM/release/parity
$STRIP_BIN -v target/$PLATFORM/release/parity-evm
$STRIP_BIN -v target/$PLATFORM/release/ethstore
$STRIP_BIN -v target/$PLATFORM/release/ethkey;
echo "Checksum calculation:"
rm -rf *.md5
export SHA3=$(rhash --sha3-256 target/$PLATFORM/release/parity -p %h)
echo "Parity file SHA3:" $SHA3
md5sum target/$PLATFORM/release/parity > parity.md5
md5sum target/$PLATFORM/release/parity-evm > parity-evm.md5
md5sum target/$PLATFORM/release/ethstore > ethstore.md5
md5sum target/$PLATFORM/release/ethkey > ethkey.md5
}
make_deb () {
rm -rf deb
echo "create DEBIAN files"
mkdir -p deb/usr/bin/
mkdir -p deb/DEBIAN
echo "create copyright, docs, compat"
cp LICENSE deb/DEBIAN/copyright
echo "https://github.com/paritytech/parity/wiki" >> deb/DEBIAN/docs
echo "8" >> deb/DEBIAN/compat
echo "create control file"
control=deb/DEBIAN/control
echo "Package: parity" >> $control
echo "Version: $VER" >> $control
echo "Source: parity" >> $control
echo "Section: science" >> $control
echo "Priority: extra" >> $control
echo "Maintainer: Parity Technologies <devops@parity.io>" >> $control
echo "Build-Depends: debhelper (>=9)" >> $control
echo "Standards-Version: 3.9.5" >> $control
echo "Homepage: https://parity.io" >> $control
echo "Vcs-Git: git://github.com/paritytech/parity.git" >> $control
echo "Vcs-Browser: https://github.com/paritytech/parity" >> $control
echo "Architecture: $ARC" >> $control
echo "Depends: $LIBSSL" >> $control
echo "Description: Ethereum network client by Parity Technologies" >> $control
size=`du deb/|awk 'END {print $1}'`
echo "Installed-Size: $size" >> $control
echo "build .deb package"
cp target/$PLATFORM/release/parity deb/usr/bin/parity
cp target/$PLATFORM/release/parity-evm deb/usr/bin/parity-evm
cp target/$PLATFORM/release/ethstore deb/usr/bin/ethstore
cp target/$PLATFORM/release/ethkey deb/usr/bin/ethkey
dpkg-deb -b deb "parity_"$VER"_"$ARC".deb"
md5sum "parity_"$VER"_"$ARC".deb" > "parity_"$VER"_"$ARC".deb.md5"
}
make_rpm () {
rm -rf /install
mkdir -p /install/usr/bin
cp target/$PLATFORM/release/parity /install/usr/bin
cp target/$PLATFORM/release/parity-evm /install/usr/bin/parity-evm
cp target/$PLATFORM/release/ethstore /install/usr/bin/ethstore
cp target/$PLATFORM/release/ethkey /install/usr/bin/ethkey
fpm -s dir -t rpm -n parity -v $VER --epoch 1 --license GPLv3 -d openssl --provides parity --url https://parity.io --vendor "Parity Technologies" -a x86_64 -m "<devops@parity.io>" --description "Ethereum network client by Parity Technologies" -C /install/
cp "parity-"$VER"-1."$ARC".rpm" "parity_"$VER"_"$ARC".rpm"
md5sum "parity_"$VER"_"$ARC".rpm" > "parity_"$VER"_"$ARC".rpm.md5"
}
make_pkg () {
echo "make PKG"
cp target/$PLATFORM/release/parity target/release/parity
cp target/$PLATFORM/release/parity-evm target/release/parity-evm
cp target/$PLATFORM/release/ethstore target/release/ethstore
cp target/$PLATFORM/release/ethkey target/release/ethkey
cd mac
xcodebuild -configuration Release
cd ..
packagesbuild -v mac/Parity.pkgproj
productsign --sign 'Developer ID Installer: PARITY TECHNOLOGIES LIMITED (P2PX3JU8FT)' target/release/Parity\ Ethereum.pkg target/release/Parity\ Ethereum-signed.pkg
mv target/release/Parity\ Ethereum-signed.pkg "parity_"$VER"_"$ARC".pkg"
md5sum "parity_"$VER"_"$ARC"."$EXT >> "parity_"$VER"_"$ARC".pkg.md5"
}
make_exe () {
./sign.cmd $keyfile $certpass "target/$PLATFORM/release/parity.exe"
SHA3=$(rhash --sha3-256 target/$PLATFORM/release/parity.exe -p %h)
echo "Checksum calculation:"
rm -rf *.md5
echo "Parity file SHA3:" $SHA3
rhash --md5 target/$PLATFORM/release/parity.exe -p %h > parity.exe.md5
rhash --md5 target/$PLATFORM/release/parity-evm.exe -p %h > parity-evm.exe.md5
rhash --md5 target/$PLATFORM/release/ethstore.exe -p %h > ethstore.exe.md5
rhash --md5 target/$PLATFORM/release/ethkey.exe -p %h > ethkey.exe.md5
./msbuild.cmd
./sign.cmd $keyfile $certpass windows/ptray/x64/release/ptray.exe
cd nsis
curl -sL --url "https://github.com/paritytech/win-build/raw/master/vc_redist.x64.exe" -o vc_redist.x64.exe
echo "makensis.exe installer.nsi" > nsis.cmd
./nsis.cmd
cd ..
cp nsis/installer.exe "parity_"$VER"_"$ARC"."$EXT
./sign.cmd $keyfile $certpass "parity_"$VER"_"$ARC"."$EXT
rhash --md5 "parity_"$VER"_"$ARC"."$EXT -p %h > "parity_"$VER"_"$ARC"."$EXT".md5"
}
push_binaries () {
echo "Push binaries to AWS S3"
aws configure set aws_access_key_id $s3_key
aws configure set aws_secret_access_key $s3_secret
if [[ "$CI_BUILD_REF_NAME" = "master" || "$CI_BUILD_REF_NAME" = "beta" || "$CI_BUILD_REF_NAME" = "stable" || "$CI_BUILD_REF_NAME" = "nightly" ]];
then
export S3_BUCKET=builds-parity-published;
else
export S3_BUCKET=builds-parity;
fi
aws s3 rm --recursive s3://$S3_BUCKET/$CI_BUILD_REF_NAME/$BUILD_PLATFORM
aws s3api put-object --bucket $S3_BUCKET --key $CI_BUILD_REF_NAME/$BUILD_PLATFORM/parity$S3WIN --body target/$PLATFORM/release/parity$S3WIN
aws s3api put-object --bucket $S3_BUCKET --key $CI_BUILD_REF_NAME/$BUILD_PLATFORM/parity$S3WIN.md5 --body parity$S3WIN.md5
aws s3api put-object --bucket $S3_BUCKET --key $CI_BUILD_REF_NAME/$BUILD_PLATFORM/parity-evm$S3WIN --body target/$PLATFORM/release/parity-evm$S3WIN
aws s3api put-object --bucket $S3_BUCKET --key $CI_BUILD_REF_NAME/$BUILD_PLATFORM/parity-evm$S3WIN.md5 --body parity-evm$S3WIN.md5
aws s3api put-object --bucket $S3_BUCKET --key $CI_BUILD_REF_NAME/$BUILD_PLATFORM/ethstore$S3WIN --body target/$PLATFORM/release/ethstore$S3WIN
aws s3api put-object --bucket $S3_BUCKET --key $CI_BUILD_REF_NAME/$BUILD_PLATFORM/ethstore$S3WIN.md5 --body ethstore$S3WIN.md5
aws s3api put-object --bucket $S3_BUCKET --key $CI_BUILD_REF_NAME/$BUILD_PLATFORM/ethkey$S3WIN --body target/$PLATFORM/release/ethkey$S3WIN
aws s3api put-object --bucket $S3_BUCKET --key $CI_BUILD_REF_NAME/$BUILD_PLATFORM/ethkey$S3WIN.md5 --body ethkey$S3WIN.md5
aws s3api put-object --bucket $S3_BUCKET --key $CI_BUILD_REF_NAME/$BUILD_PLATFORM/"parity_"$VER"_"$ARC"."$EXT --body "parity_"$VER"_"$ARC"."$EXT
aws s3api put-object --bucket $S3_BUCKET --key $CI_BUILD_REF_NAME/$BUILD_PLATFORM/"parity_"$VER"_"$ARC"."$EXT".md5" --body "parity_"$VER"_"$ARC"."$EXT".md5"
}
make_archive () {
echo "add artifacts to archive"
rm -rf parity.zip
zip -r parity.zip target/$PLATFORM/release/parity$S3WIN target/$PLATFORM/release/parity-evm$S3WIN target/$PLATFORM/release/ethstore$S3WIN target/$PLATFORM/release/ethkey$S3WIN parity$S3WIN.md5 parity-evm$S3WIN.md5 ethstore$S3WIN.md5 ethkey$S3WIN.md5
}
push_release () {
echo "push release"
curl --data "commit=$CI_BUILD_REF&sha3=$SHA3&filename=parity&secret=$RELEASES_SECRET" http://update.parity.io:1337/push-build/$CI_BUILD_REF_NAME/$PLATFORM
curl --data "commit=$CI_BUILD_REF&sha3=$SHA3&filename=parity&secret=$RELEASES_SECRET" http://update.parity.io:1338/push-build/$CI_BUILD_REF_NAME/$PLATFORM
}
case $BUILD_PLATFORM in
x86_64-unknown-linux-gnu)
#set strip bin
STRIP_BIN="strip"
#package extention
EXT="deb"
build
strip_md5
make_deb
make_archive
push_binaries
push_release
;;
x86_64-unknown-debian-gnu)
STRIP_BIN="strip"
EXT="deb"
LIBSSL="libssl1.1 (>=1.1.0)"
echo "Use libssl1.1 (>=1.1.0) for Debian builds"
build
strip_md5
make_deb
make_archive
push_binaries
push_release
;;
x86_64-unknown-centos-gnu)
STRIP_BIN="strip"
EXT="rpm"
build
strip_md5
make_rpm
make_archive
push_binaries
push_release
;;
i686-unknown-linux-gnu)
STRIP_BIN="strip"
EXT="deb"
set_env
build
strip_md5
make_deb
make_archive
push_binaries
push_release
;;
armv7-unknown-linux-gnueabihf)
STRIP_BIN="arm-linux-gnueabihf-strip"
EXT="deb"
set_env
build
strip_md5
make_deb
make_archive
push_binaries
push_release
;;
arm-unknown-linux-gnueabihf)
STRIP_BIN="arm-linux-gnueabihf-strip"
EXT="deb"
set_env
build
strip_md5
make_deb
make_archive
push_binaries
push_release
;;
aarch64-unknown-linux-gnu)
STRIP_BIN="aarch64-linux-gnu-strip"
EXT="deb"
set_env
build
strip_md5
make_deb
make_archive
push_binaries
push_release
;;
x86_64-apple-darwin)
STRIP_BIN="strip"
PLATFORM="x86_64-apple-darwin"
EXT="pkg"
build
strip_md5
make_pkg
make_archive
push_binaries
push_release
;;
x86_64-unknown-snap-gnu)
cd snap
ARC="amd64"
EXT="snap"
rm -rf *snap
sed -i 's/master/'"$VER"'/g' snapcraft.yaml
snapcraft
cp "parity_"$CI_BUILD_REF_NAME"_amd64.snap" "parity_"$VER"_amd64.snap"
md5sum "parity_"$VER"_amd64.snap" > "parity_"$VER"_amd64.snap.md5"
push_binaries
;;
x86_64-pc-windows-msvc)
set_env_win
EXT="exe"
S3WIN=".exe"
build
make_exe
make_archive
push_binaries
push_release
esac

scripts/gitlab-test.sh Executable file

@ -0,0 +1,100 @@
#!/bin/bash
#ARGUMENT test for RUST, JS, COVERAGE or JS_RELEASE
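# Example: ./scripts/gitlab-test.sh stable (the accepted values are listed in the case statement at the bottom)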
set -e # fail on any error
set -u # treat unset variables as error
if [[ "$CI_COMMIT_REF_NAME" = "beta" || "$CI_COMMIT_REF_NAME" = "stable" ]]; then
export GIT_COMPARE=$CI_COMMIT_REF_NAME;
else
export GIT_COMPARE=master;
fi
export JS_FILES_MODIFIED="$(git --no-pager diff --name-only $GIT_COMPARE...$CI_COMMIT_SHA | grep ^js/ | wc -l)"
export JS_OLD_FILES_MODIFIED="$(git --no-pager diff --name-only $GIT_COMPARE...$CI_COMMIT_SHA | grep ^js-old/ | wc -l)"
export RUST_FILES_MODIFIED="$(git --no-pager diff --name-only $GIT_COMPARE...$CI_COMMIT_SHA | grep -v -e ^js -e ^\\. -e ^LICENSE -e ^README.md -e ^test.sh -e ^windows/ -e ^scripts/ -e ^mac/ -e ^nsis/ | wc -l)"
echo "RUST_FILES_MODIFIED: $RUST_FILES_MODIFIED"
echo "JS_FILES_MODIFIED: $JS_FILES_MODIFIED"
echo "JS_OLD_FILES_MODIFIED: $JS_OLD_FILES_MODIFIED"
TEST_SWITCH=$1
rust_test () {
git submodule update --init --recursive
rustup show
if [[ "${RUST_FILES_MODIFIED}" == "0" ]];
then echo "Skipping Rust tests since no Rust files modified.";
else ./test.sh;
fi
if [[ "$CI_COMMIT_REF_NAME" == "nightly" ]];
then sh scripts/aura-test.sh;
fi
}
js_test () {
git submodule update --init --recursive
if [[ "${JS_FILES_MODIFIED}" == "0" ]];
then echo "Skipping JS deps install since no JS files modified.";
else ./js/scripts/install-deps.sh;
fi
if [[ "${JS_OLD_FILES_MODIFIED}" == "0" ]];
then echo "Skipping JS (old) deps install since no JS files modified.";
else ./js-old/scripts/install-deps.sh;
fi
if [[ "${JS_FILES_MODIFIED}" == "0" ]];
then echo "Skipping JS lint since no JS files modified.";
else ./js/scripts/lint.sh && ./js/scripts/test.sh && ./js/scripts/build.sh;
fi
if [[ "${JS_OLD_FILES_MODIFIED}" == "0" ]];
then echo "Skipping JS (old) lint since no JS files modified.";
else ./js-old/scripts/lint.sh && ./js-old/scripts/test.sh && ./js-old/scripts/build.sh;
fi
}
js_release () {
rustup default stable
if [[ "${JS_FILES_MODIFIED}" == "0" ]];
then echo "Skipping JS deps install since no JS files modified.";
else echo "install JS deps---------------"&&./js/scripts/install-deps.sh&&echo "done----------------";
fi
if [[ "${JS_FILES_MODIFIED}" == "0" ]];
then echo "Skipping JS rebuild since no JS files modified.";
else echo "build JS--------------"&&./js/scripts/build.sh&&echo "Puch JS precompiled-----------------"&&./js/scripts/push-precompiled.sh&&echo "done----------------";
fi
if [[ "${JS_OLD_FILES_MODIFIED}" == "0" ]];
then echo "Skipping JS (old) deps install since no JS files modified.";
else echo "install JS_OLD deps---------------"&&./js-old/scripts/install-deps.sh&&echo "done----------------";
fi
if [[ "${JS_OLD_FILES_MODIFIED}" == "0" ]];
then echo "Skipping JS (old) rebuild since no JS files modified.";
else echo "build JS--------------"&&./js-old/scripts/build.sh&&echo "Puch JS precompiled-----------------"&&./js-old/scripts/push-precompiled.sh&&echo "done----------------";
fi
if [[ "${JS_FILES_MODIFIED}" == "0" ]] && [[ "${JS_OLD_FILES_MODIFIED}" == "0" ]];
then echo "Skipping Cargo update since no JS files modified.";
else echo "push cargo---------"&&./js/scripts/push-cargo.sh&&echo "done----------------";
fi
}
coverage_test () {
git submodule update --init --recursive
rm -rf target/*
rm -rf js/.coverage
scripts/cov.sh
}
case $TEST_SWITCH in
stable )
rustup default stable
rust_test
;;
beta)
rustup default beta
rust_test
;;
nightly)
rustup default nightly
rust_test
;;
js-test)
js_test
;;
js-release)
js_release
;;
test-coverage)
coverage_test
;;
esac


@ -19,4 +19,4 @@ parts:
parity:
source: ..
plugin: rust
build-packages: [g++, libudev-dev, libssl-dev, make, pkg-config]
build-packages: [g++, libudev-dev, libssl-dev, make, pkg-config]


@ -170,7 +170,18 @@ pub struct AttachedProtocol {
}
impl AttachedProtocol {
fn register(&self, _network: &NetworkService) {}
fn register(&self, network: &NetworkService) {
let res = network.register_protocol(
self.handler.clone(),
self.protocol_id,
self.packet_count,
self.versions
);
if let Err(e) = res {
warn!(target: "sync", "Error attaching protocol {:?}: {:?}", self.protocol_id, e);
}
}
}
/// EthSync initialization parameters.


@ -522,6 +522,10 @@ impl BlockDownloader {
trace!(target: "sync", "Unknown new block parent, restarting sync");
break;
},
Err(BlockImportError::Block(BlockError::TemporarilyInvalid(_))) => {
debug!(target: "sync", "Block temporarily invalid, restarting sync");
break;
},
Err(e) => {
debug!(target: "sync", "Bad block {:?} : {:?}", h, e);
bad = true;


@ -32,7 +32,7 @@ use std::cmp;
use std::collections::HashMap;
use std::marker::PhantomData;
use std::path::{PathBuf, Path};
use std::{mem, fs, io};
use std::{fs, io, mem, result};
use parking_lot::{Mutex, MutexGuard, RwLock};
use rocksdb::{
@ -257,7 +257,25 @@ pub struct Database {
flushing_lock: Mutex<bool>,
}
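// If a write fails with a RocksDB "Corruption:" error, drop a marker file next to the database so that a repair is attempted on the next open.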
#[inline]
fn check_for_corruption<T, P: AsRef<Path>>(path: P, res: result::Result<T, String>) -> result::Result<T, String> {
if let Err(ref s) = res {
if s.starts_with("Corruption:") {
warn!("DB corrupted: {}. Repair will be triggered on next restart", s);
let _ = fs::File::create(path.as_ref().join(Database::CORRUPTION_FILE_NAME));
}
}
res
}
fn is_corrupted(s: &str) -> bool {
s.starts_with("Corruption:") || s.starts_with("Invalid argument: You have to open all column families")
}
impl Database {
const CORRUPTION_FILE_NAME: &'static str = "CORRUPTED";
/// Open database with default settings.
pub fn open_default(path: &str) -> Result<Database> {
Database::open(&DatabaseConfig::default(), path)
@ -287,6 +305,14 @@ impl Database {
block_opts.set_cache(cache);
}
// attempt database repair if it has been previously marked as corrupted
let db_corrupted = Path::new(path).join(Database::CORRUPTION_FILE_NAME);
if db_corrupted.exists() {
warn!("DB has been previously marked as corrupted, attempting repair");
DB::repair(&opts, path)?;
fs::remove_file(db_corrupted)?;
}
let columns = config.columns.unwrap_or(0) as usize;
let mut cf_options = Vec::with_capacity(columns);
@ -306,12 +332,11 @@ impl Database {
let mut cfs: Vec<Column> = Vec::new();
let db = match config.columns {
Some(columns) => {
Some(_) => {
match DB::open_cf(&opts, path, &cfnames, &cf_options) {
Ok(db) => {
cfs = cfnames.iter().map(|n| db.cf_handle(n)
.expect("rocksdb opens a cf_handle for each cfname; qed")).collect();
assert!(cfs.len() == columns as usize);
Ok(db)
}
Err(_) => {
@ -321,7 +346,7 @@ impl Database {
cfs = cfnames.iter().enumerate().map(|(i, n)| db.create_cf(n, &cf_options[i])).collect::<::std::result::Result<_, _>>()?;
Ok(db)
},
err @ Err(_) => err,
err => err,
}
}
}
@ -331,14 +356,18 @@ impl Database {
let db = match db {
Ok(db) => db,
Err(ref s) if s.starts_with("Corruption:") => {
info!("{}", s);
info!("Attempting DB repair for {}", path);
Err(ref s) if is_corrupted(s) => {
warn!("DB corrupted: {}, attempting repair", s);
DB::repair(&opts, path)?;
match cfnames.is_empty() {
true => DB::open(&opts, path)?,
false => DB::open_cf(&opts, path, &cfnames, &cf_options)?
false => {
let db = DB::open_cf(&opts, path, &cfnames, &cf_options)?;
cfs = cfnames.iter().map(|n| db.cf_handle(n)
.expect("rocksdb opens a cf_handle for each cfname; qed")).collect();
db
},
}
},
Err(s) => { return Err(s.into()); }
@ -425,7 +454,11 @@ impl Database {
}
}
}
db.write_opt(batch, &self.write_opts)?;
check_for_corruption(
&self.path,
db.write_opt(batch, &self.write_opts))?;
for column in self.flushing.write().iter_mut() {
column.clear();
column.shrink_to_fit();
@ -471,7 +504,10 @@ impl Database {
},
}
}
db.write_opt(batch, &self.write_opts).map_err(Into::into)
check_for_corruption(
&self.path,
db.write_opt(batch, &self.write_opts)).map_err(Into::into)
},
None => Err("Database is closed".into())
}


@ -31,7 +31,9 @@ ethcore-logger = { path ="../../logger" }
ipnetwork = "0.12.6"
keccak-hash = { path = "../hash" }
snappy = { git = "https://github.com/paritytech/rust-snappy" }
serde = "1.0"
serde_json = "1.0"
serde_derive = "1.0"
error-chain = { version = "0.11", default-features = false }
[dev-dependencies]


@ -719,7 +719,7 @@ impl Host {
let address = {
let mut nodes = self.nodes.write();
if let Some(node) = nodes.get_mut(id) {
node.last_attempted = Some(::time::now());
node.attempts += 1;
node.endpoint.address
}
else {
@ -738,6 +738,7 @@ impl Host {
}
}
};
if let Err(e) = self.create_connection(socket, Some(id), io) {
debug!(target: "network", "Can't create connection: {:?}", e);
}
@ -1281,4 +1282,3 @@ fn host_client_url() {
let host: Host = Host::new(config, Arc::new(NetworkStats::new()), None).unwrap();
assert!(host.local_url().starts_with("enode://101b3ef5a4ea7a1c7928e24c4c75fd053c235d7b80c22ae5c03d145d0ac7396e2a4ffff9adee3133a7b05044a5cee08115fd65145e5165d646bde371010d803c@"));
}


@ -80,14 +80,16 @@ extern crate path;
extern crate ethcore_logger;
extern crate ipnetwork;
extern crate keccak_hash as hash;
extern crate serde;
extern crate serde_json;
extern crate snappy;
#[macro_use]
extern crate error_chain;
#[macro_use]
extern crate log;
#[macro_use]
extern crate serde_derive;
#[cfg(test)]
extern crate ethcore_devtools as devtools;
@ -213,4 +215,3 @@ pub enum AllowIP {
/// Block all addresses
None,
}


@ -14,25 +14,20 @@
// You should have received a copy of the GNU General Public License
// along with Parity. If not, see <http://www.gnu.org/licenses/>.
use std::mem;
use std::slice::from_raw_parts;
use std::net::{SocketAddr, ToSocketAddrs, SocketAddrV4, SocketAddrV6, Ipv4Addr, Ipv6Addr};
use std::hash::{Hash, Hasher};
use std::str::{FromStr};
use std::collections::{HashMap, HashSet};
use std::fmt::{Display, Formatter};
use std::path::{PathBuf};
use std::fmt;
use std::fs;
use std::io::{Read, Write};
use bigint::hash::*;
use std::fmt::{self, Display, Formatter};
use std::hash::{Hash, Hasher};
use std::net::{SocketAddr, ToSocketAddrs, SocketAddrV4, SocketAddrV6, Ipv4Addr, Ipv6Addr};
use std::path::PathBuf;
use std::str::FromStr;
use std::{fs, mem, slice};
use bigint::hash::H512;
use rlp::*;
use time::Tm;
use error::{Error, ErrorKind};
use {AllowIP, IpFilter};
use discovery::{TableUpdates, NodeEntry};
use ip_utils::*;
use serde_json::Value;
use serde_json;
/// Node public key
pub type NodeId = H512;
@ -80,7 +75,7 @@ impl NodeEndpoint {
4 => Ok(SocketAddr::V4(SocketAddrV4::new(Ipv4Addr::new(addr_bytes[0], addr_bytes[1], addr_bytes[2], addr_bytes[3]), tcp_port))),
16 => unsafe {
let o: *const u16 = mem::transmute(addr_bytes.as_ptr());
let o = from_raw_parts(o, 8);
let o = slice::from_raw_parts(o, 8);
Ok(SocketAddr::V6(SocketAddrV6::new(Ipv6Addr::new(o[0], o[1], o[2], o[3], o[4], o[5], o[6], o[7]), tcp_port, 0, 0)))
},
_ => Err(DecoderError::RlpInconsistentLengthAndData)
@ -95,7 +90,7 @@ impl NodeEndpoint {
}
SocketAddr::V6(a) => unsafe {
let o: *const u8 = mem::transmute(a.ip().segments().as_ptr());
rlp.append(&from_raw_parts(o, 16));
rlp.append(&slice::from_raw_parts(o, 16));
}
};
rlp.append(&self.udp_port);
@ -143,18 +138,30 @@ pub struct Node {
pub id: NodeId,
pub endpoint: NodeEndpoint,
pub peer_type: PeerType,
pub attempts: u32,
pub failures: u32,
pub last_attempted: Option<Tm>,
}
const DEFAULT_FAILURE_PERCENTAGE: usize = 50;
impl Node {
pub fn new(id: NodeId, endpoint: NodeEndpoint) -> Node {
Node {
id: id,
endpoint: endpoint,
peer_type: PeerType::Optional,
attempts: 0,
failures: 0,
last_attempted: None,
}
}
/// Returns the node's failure percentage (0..100) in buckets of 5%. If there are 0 connection attempts for this
/// node the default failure percentage is returned (50%).
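/// For example, 1 failure out of 3 attempts gives 1 * 100 / 3 = 33, which the 5% bucketing rounds down to 30.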
pub fn failure_percentage(&self) -> usize {
if self.attempts == 0 {
DEFAULT_FAILURE_PERCENTAGE
} else {
(self.failures * 100 / self.attempts / 5 * 5) as usize
}
}
}
@ -184,7 +191,7 @@ impl FromStr for Node {
id: id,
endpoint: endpoint,
peer_type: PeerType::Optional,
last_attempted: None,
attempts: 0,
failures: 0,
})
}
@ -203,6 +210,9 @@ impl Hash for Node {
}
}
const MAX_NODES: usize = 1024;
const NODES_FILE: &str = "nodes.json";
/// Node table backed by disk file.
pub struct NodeTable {
nodes: HashMap<NodeId, Node>,
@ -221,23 +231,37 @@ impl NodeTable {
/// Add a node to table
pub fn add_node(&mut self, mut node: Node) {
// preserve failure counter
let failures = self.nodes.get(&node.id).map_or(0, |n| n.failures);
// preserve attempts and failure counter
let (attempts, failures) =
self.nodes.get(&node.id).map_or((0, 0), |n| (n.attempts, n.failures));
node.attempts = attempts;
node.failures = failures;
self.nodes.insert(node.id.clone(), node);
}
/// Returns node ids sorted by number of failures
/// Returns node ids sorted by failure percentage, for nodes with the same failure percentage the absolute number of
/// failures is considered.
pub fn nodes(&self, filter: IpFilter) -> Vec<NodeId> {
let mut refs: Vec<&Node> = self.nodes.values().filter(|n| !self.useless_nodes.contains(&n.id) && n.endpoint.is_allowed(&filter)).collect();
refs.sort_by(|a, b| a.failures.cmp(&b.failures));
refs.iter().map(|n| n.id.clone()).collect()
let mut refs: Vec<&Node> = self.nodes.values()
.filter(|n| !self.useless_nodes.contains(&n.id))
.filter(|n| n.endpoint.is_allowed(&filter))
.collect();
refs.sort_by(|a, b| {
a.failure_percentage().cmp(&b.failure_percentage())
.then_with(|| a.failures.cmp(&b.failures))
.then_with(|| b.attempts.cmp(&a.attempts)) // we use reverse ordering for number of attempts
});
refs.into_iter().map(|n| n.id).collect()
}
/// Unordered list of all entries
pub fn unordered_entries(&self) -> Vec<NodeEntry> {
// preserve failure counter
self.nodes.values().map(|n| NodeEntry { endpoint: n.endpoint.clone(), id: n.id.clone() }).collect()
self.nodes.values().map(|n| NodeEntry {
endpoint: n.endpoint.clone(),
id: n.id.clone(),
}).collect()
}
/// Get particular node
@ -270,7 +294,7 @@ impl NodeTable {
}
}
/// Mark as useless, no furter attempts to connect until next call to `clear_useless`.
/// Mark as useless, no further attempts to connect until next call to `clear_useless`.
pub fn mark_as_useless(&mut self, id: &NodeId) {
self.useless_nodes.insert(id.clone());
}
@ -282,77 +306,62 @@ impl NodeTable {
/// Save the nodes.json file.
pub fn save(&self) {
if let Some(ref path) = self.path {
let mut path_buf = PathBuf::from(path);
if let Err(e) = fs::create_dir_all(path_buf.as_path()) {
warn!("Error creating node table directory: {:?}", e);
return;
};
path_buf.push("nodes.json");
let mut json = String::new();
json.push_str("{\n");
json.push_str("\"nodes\": [\n");
let node_ids = self.nodes(IpFilter::default());
for i in 0 .. node_ids.len() {
let node = self.nodes.get(&node_ids[i]).expect("self.nodes() only returns node IDs from self.nodes");
json.push_str(&format!("\t{{ \"url\": \"{}\", \"failures\": {} }}{}\n", node, node.failures, if i == node_ids.len() - 1 {""} else {","}))
}
json.push_str("]\n");
json.push_str("}");
let mut file = match fs::File::create(path_buf.as_path()) {
Ok(file) => file,
Err(e) => {
warn!("Error creating node table file: {:?}", e);
return;
let mut path = match self.path {
Some(ref path) => PathBuf::from(path),
None => return,
};
if let Err(e) = fs::create_dir_all(&path) {
warn!("Error creating node table directory: {:?}", e);
return;
}
path.push(NODES_FILE);
let node_ids = self.nodes(IpFilter::default());
let nodes = node_ids.into_iter()
.map(|id| self.nodes.get(&id).expect("self.nodes() only returns node IDs from self.nodes"))
.take(MAX_NODES)
.map(|node| node.clone())
.map(Into::into)
.collect();
let table = json::NodeTable { nodes };
match fs::File::create(&path) {
Ok(file) => {
if let Err(e) = serde_json::to_writer_pretty(file, &table) {
warn!("Error writing node table file: {:?}", e);
}
};
if let Err(e) = file.write(&json.into_bytes()) {
warn!("Error writing node table file: {:?}", e);
},
Err(e) => {
warn!("Error creating node table file: {:?}", e);
}
}
}
fn load(path: Option<String>) -> HashMap<NodeId, Node> {
let mut nodes: HashMap<NodeId, Node> = HashMap::new();
if let Some(path) = path {
let mut path_buf = PathBuf::from(path);
path_buf.push("nodes.json");
let mut file = match fs::File::open(path_buf.as_path()) {
Ok(file) => file,
Err(e) => {
debug!("Error opening node table file: {:?}", e);
return nodes;
}
};
let mut buf = String::new();
match file.read_to_string(&mut buf) {
Ok(_) => {},
Err(e) => {
warn!("Error reading node table file: {:?}", e);
return nodes;
}
}
let json: Value = match ::serde_json::from_str(&buf) {
Ok(json) => json,
Err(e) => {
warn!("Error parsing node table file: {:?}", e);
return nodes;
}
};
if let Some(list) = json.as_object().and_then(|o| o.get("nodes")).and_then(|n| n.as_array()) {
for n in list.iter().filter_map(|n| n.as_object()) {
if let Some(url) = n.get("url").and_then(|u| u.as_str()) {
if let Ok(mut node) = Node::from_str(url) {
if let Some(failures) = n.get("failures").and_then(|f| f.as_u64()) {
node.failures = failures as u32;
}
nodes.insert(node.id.clone(), node);
}
}
}
}
let path = match path {
Some(path) => PathBuf::from(path).join(NODES_FILE),
None => return Default::default(),
};
let file = match fs::File::open(&path) {
Ok(file) => file,
Err(e) => {
debug!("Error opening node table file: {:?}", e);
return Default::default();
},
};
let res: Result<json::NodeTable, _> = serde_json::from_reader(file);
match res {
Ok(table) => {
table.nodes.into_iter()
.filter_map(|n| n.into_node())
.map(|n| (n.id.clone(), n))
.collect()
},
Err(e) => {
warn!("Error reading node table file: {:?}", e);
Default::default()
},
}
nodes
}
}
@ -364,13 +373,51 @@ impl Drop for NodeTable {
/// Check if node url is valid
pub fn validate_node_url(url: &str) -> Option<Error> {
use std::str::FromStr;
match Node::from_str(url) {
Ok(_) => None,
Err(e) => Some(e)
}
}
mod json {
use super::*;
#[derive(Serialize, Deserialize)]
pub struct NodeTable {
pub nodes: Vec<Node>,
}
#[derive(Serialize, Deserialize)]
pub struct Node {
pub url: String,
pub attempts: u32,
pub failures: u32,
}
impl Node {
pub fn into_node(self) -> Option<super::Node> {
match super::Node::from_str(&self.url) {
Ok(mut node) => {
node.attempts = self.attempts;
node.failures = self.failures;
Some(node)
},
_ => None,
}
}
}
impl<'a> From<&'a super::Node> for Node {
fn from(node: &'a super::Node) -> Self {
Node {
url: format!("{}", node),
attempts: node.attempts,
failures: node.failures,
}
}
}
}
#[cfg(test)]
mod tests {
use super::*;
@ -408,26 +455,42 @@ mod tests {
}
#[test]
fn table_failure_order() {
fn table_failure_percentage_order() {
let node1 = Node::from_str("enode://a979fb575495b8d6db44f750317d0f4622bf4c2aa3365d6af7c284339968eef29b69ad0dce72a4d8db5ebb4968de0e3bec910127f134779fbcb0cb6d3331163c@22.99.55.44:7770").unwrap();
let node2 = Node::from_str("enode://b979fb575495b8d6db44f750317d0f4622bf4c2aa3365d6af7c284339968eef29b69ad0dce72a4d8db5ebb4968de0e3bec910127f134779fbcb0cb6d3331163c@22.99.55.44:7770").unwrap();
let node3 = Node::from_str("enode://c979fb575495b8d6db44f750317d0f4622bf4c2aa3365d6af7c284339968eef29b69ad0dce72a4d8db5ebb4968de0e3bec910127f134779fbcb0cb6d3331163c@22.99.55.44:7770").unwrap();
let node4 = Node::from_str("enode://d979fb575495b8d6db44f750317d0f4622bf4c2aa3365d6af7c284339968eef29b69ad0dce72a4d8db5ebb4968de0e3bec910127f134779fbcb0cb6d3331163c@22.99.55.44:7770").unwrap();
let id1 = H512::from_str("a979fb575495b8d6db44f750317d0f4622bf4c2aa3365d6af7c284339968eef29b69ad0dce72a4d8db5ebb4968de0e3bec910127f134779fbcb0cb6d3331163c").unwrap();
let id2 = H512::from_str("b979fb575495b8d6db44f750317d0f4622bf4c2aa3365d6af7c284339968eef29b69ad0dce72a4d8db5ebb4968de0e3bec910127f134779fbcb0cb6d3331163c").unwrap();
let id3 = H512::from_str("c979fb575495b8d6db44f750317d0f4622bf4c2aa3365d6af7c284339968eef29b69ad0dce72a4d8db5ebb4968de0e3bec910127f134779fbcb0cb6d3331163c").unwrap();
let id4 = H512::from_str("d979fb575495b8d6db44f750317d0f4622bf4c2aa3365d6af7c284339968eef29b69ad0dce72a4d8db5ebb4968de0e3bec910127f134779fbcb0cb6d3331163c").unwrap();
let mut table = NodeTable::new(None);
table.add_node(node3);
table.add_node(node1);
table.add_node(node2);
table.add_node(node3);
table.add_node(node4);
// node 1 - failure percentage 100%
table.get_mut(&id1).unwrap().attempts = 2;
table.note_failure(&id1);
table.note_failure(&id1);
// node2 - failure percentage 33%
table.get_mut(&id2).unwrap().attempts = 3;
table.note_failure(&id2);
// node3 - failure percentage 0%
table.get_mut(&id3).unwrap().attempts = 1;
// node4 - failure percentage 50% (default when no attempts)
let r = table.nodes(IpFilter::default());
assert_eq!(r[0][..], id3[..]);
assert_eq!(r[1][..], id2[..]);
assert_eq!(r[2][..], id1[..]);
assert_eq!(r[2][..], id4[..]);
assert_eq!(r[3][..], id1[..]);
}
#[test]
@ -441,6 +504,9 @@ mod tests {
let mut table = NodeTable::new(Some(tempdir.path().to_str().unwrap().to_owned()));
table.add_node(node1);
table.add_node(node2);
table.get_mut(&id1).unwrap().attempts = 1;
table.get_mut(&id2).unwrap().attempts = 1;
table.note_failure(&id2);
}


@ -1,7 +1,7 @@
[package]
name = "parity-version"
# NOTE: this value is used for Parity version string.
version = "1.9.0"
version = "1.9.1"
authors = ["Parity Technologies <admin@parity.io>"]
build = "build.rs"


@ -28,7 +28,7 @@ include!(concat!(env!("OUT_DIR"), "/version.rs"));
include!(concat!(env!("OUT_DIR"), "/rustc_version.rs"));
#[cfg(feature = "final")]
const THIS_TRACK: &'static str = "nightly";
const THIS_TRACK: &'static str = "beta";
// ^^^ should be reset to "stable" or "beta" according to the release branch.
#[cfg(not(feature = "final"))]