Backports for beta 2.1.2 (#9649)

* parity-version: bump beta to 2.1.2

* docs(rpc): push the branch along with tags (#9578)

* docs(rpc): push the branch along with tags

* ci: remove old rpc docs script

* Remove snapcraft clean (#9585)

* Revert " add snapcraft package image (master) (#9584)"

This reverts commit ceaedbbd7f.

* Update package-snap.sh

* Update .gitlab-ci.yml

* ci: fix regex 🙄 (#9597)

* docs(rpc): annotate tag with the provided message (#9601)

* Update ropsten.json (#9602)

* HF in POA Sokol (2018-09-19) (#9607)

https://github.com/poanetwork/poa-chain-spec/pull/86

* fix(network): don't disconnect reserved peers (#9608)

The priority of && and || was borked.
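
A minimal sketch of the precedence issue (hypothetical predicate names, not the actual sync code): in Rust `&&` binds tighter than `||`, so without parentheses a reserved peer can still end up in the disconnect set.

```rust
// Hypothetical illustration of the operator-precedence bug; the real peer
// bookkeeping in parity-ethereum is more involved.
fn should_disconnect_buggy(is_reserved: bool, is_idle: bool, over_limit: bool) -> bool {
    // parsed as `(!is_reserved && is_idle) || over_limit`
    !is_reserved && is_idle || over_limit
}

fn should_disconnect_fixed(is_reserved: bool, is_idle: bool, over_limit: bool) -> bool {
    // reserved peers are never disconnected
    !is_reserved && (is_idle || over_limit)
}

fn main() {
    // a reserved peer over the connection limit: the buggy predicate drops it
    assert!(should_disconnect_buggy(true, false, true));
    assert!(!should_disconnect_fixed(true, false, true));
}
```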

* fix failing node-table tests on mac os, closes #9632 (#9633)

* ethcore-io retries failed work steal (#9651)

* ethcore-io uses newer version of crossbeam && retries failed work steal

* ethcore-io non-mio service uses newer crossbeam

* remove master from releasable branches (#9655)

* remove master from releasable branches

Needs backporting in beta.
Fixes https://gitlab.parity.io/parity/parity-ethereum/-/jobs/101065 etc.

* add except for snap packages for master

* Test fix for windows cache name... (#9658)

* Test fix for windows cache name...

* Fix variable name.

* fix(light_fetch): avoid race with BlockNumber::Latest (#9665)

* Calculate sha3 instead of sha256 for push-release. (#9673)

* Calculate sha3 instead of sha256 for push-release.

* Add pushes to the script.

* Hardfork the testnets (#9562)

* ethcore: propose hardfork block number 4230000 for ropsten

* ethcore: propose hardfork block number 9000000 for kovan

* ethcore: enable kip-4 and kip-6 on kovan

* ethcore: bump kovan hardfork to block 9.2M

* ethcore: fix ropsten constantinople block number to 4.2M

* ethcore: disable difficulty_test_ropsten until ethereum/tests are updated upstream

* ci: fix push script (#9679)

* ci: fix push script

* Fix copying & running on windows.

* CI: Remove unnecessary pipes (#9681)

* ci: reduce gitlab pipelines significantly

* ci: build pipeline for PR

* ci: remove dead weight

* ci: remove github release script

* ci: remove forever broken aura tests

* ci: add random stuff to the end of the pipes

* ci: add wind and mac to the end of the pipe

* ci: remove snap artifacts

* ci: (re)move dockerfiles

* ci: clarify job names

* ci: add cargo audit job

* ci: make audit script executable

* ci: ignore snap and docker files for rust check

* ci: simplify audit script

* ci: rename misc to optional

* ci: add publish script to releaseable branches

* ci: more verbose cp command for windows build

* ci: fix weird binary checksum logic in push script

* ci: fix regex in push script for windows

* ci: simplify gitlab caching

* docs: align README with ci changes

* ci: specify default cargo target dir

* ci: print verbose environment

* ci: proper naming of scripts

* ci: restore docker files

* ci: use docker hub file

* ci: use cargo home instead of cargo target dir

* ci: touch random rust file to trigger real builds

* ci: set cargo target dir for audit script

* ci: remove temp file

* ci: don't export the cargo target dir in the audit script

* ci: fix windows unbound variable

* docs: fix gitlab badge path

* rename deprecated gitlab ci variables

https://docs.gitlab.com/ee/ci/variables/#9-0-renaming

* ci: fix git compare for nightly builds

* test: skip c++ example for all platforms but linux

* ci: add random rust file to trigger tests

* ci: remove random rust file

* disable cpp lib test for mac, win and beta (#9686)

* cleanup ci merge

* ci: fix tests

* fix bad-block reporting no reason (#9638)

* ethcore: fix detection of major import (#9552)

* sync: set state to idle after sync is completed

* sync: refactor sync reset
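
Roughly what the reset refactor does, as a minimal sketch with simplified, hypothetical types (the real `reset` also takes the `SyncIo` handle and clears downloaded data): `reset` now accepts the state to fall into, so completing a sync can force `Idle` instead of re-deriving the initial state.

```rust
#[derive(Debug, PartialEq)]
enum SyncState { Idle, Blocks }

struct Sync { state: SyncState, warp_sync: bool }

impl Sync {
    fn initial_state(&self) -> SyncState {
        if self.warp_sync { SyncState::Blocks } else { SyncState::Idle }
    }

    // Clear downloaded data (elided here) and move to `state`,
    // or to the initial state if `None` is given.
    fn reset(&mut self, state: Option<SyncState>) {
        self.state = state.unwrap_or_else(|| self.initial_state());
    }

    fn complete_sync(&mut self) {
        // previously a plain `reset()`, which could land back in a non-idle state
        self.reset(Some(SyncState::Idle));
    }
}

fn main() {
    let mut sync = Sync { state: SyncState::Blocks, warp_sync: true };
    sync.complete_sync();
    assert_eq!(sync.state, SyncState::Idle);
}
```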

* Don't hash the init_code of CREATE. (#9688)

* Docker: run as parity user (#9689)

* Implement CREATE2 gas changes and fix some potential overflowing (#9694)

* Implement CREATE2 gas changes and fix some potential overflowing

* Ignore create2 state tests

* Split CREATE and CREATE2 in gasometer

* Generalize rounding `(x + 31) / 32` to `to_word_size`
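
A minimal sketch of the factored-out rounding (with `u64` standing in for the EVM `Gas` cost type): `to_word_size` computes `ceil(bytes / 32)` as `(bytes + 31) / 32`, but with checked arithmetic so the `+ 31` cannot silently wrap.

```rust
fn to_word_size(bytes: u64) -> Option<u64> {
    // `n >> 5` is `n / 32`; `None` signals overflow of the `+ 31`
    bytes.checked_add(31).map(|n| n >> 5)
}

fn main() {
    assert_eq!(to_word_size(0), Some(0));
    assert_eq!(to_word_size(1), Some(1));
    assert_eq!(to_word_size(32), Some(1));
    assert_eq!(to_word_size(33), Some(2));
    assert_eq!(to_word_size(u64::MAX), None); // overflow is surfaced, not ignored
}
```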

* make instantSeal engine backwards compatible, closes #9696 (#9700)

* ethcore: delay ropsten hardfork (#9704)

* fix(light/provider): Make `read_only executions` read-only (#9591)

* `ExecutionsRequest` from light-clients as read-only

This changes it so that all `ExecutionRequests` from light-clients are executed
as read-only, which the `virtual` flag == true ensures.

This lets the current transaction always succeed.

Note, this only affects `eth_estimateGas` and `eth_call` AFAIK.

* grumbles(revert renaming): TransactionProof

* grumbles(trace): remove incorrect trace

* grumbles(state/prove_tx): explicit `virt`

Remove the boolean flag that determined whether `state::prove_transaction`
should be executed in a virtual context or not.

Because of that, also rename the function to
`state::prove_transaction_virtual` to make this clearer.

* CI: Skip docs job for nightly (#9693)

* ci: force-tag wiki changes

* ci: force-tag wiki changes

* ci: skip docs job for master and nightly

* ci: revert docs job checking for nightly tag

* ci: exclude docs job from nightly builds in gitlab script
Author: Afri Schoedon, 2018-10-09 15:04:30 +02:00 (committed by GitHub)
Commit: 52fe28a052 (parent: 8e347b2602)
65 changed files with 967 additions and 1008 deletions


@ -1,16 +1,14 @@
stages:
- test
- build
- package
- publish
- docs
- optional
image: parity/rust:gitlab-ci
variables:
CI_SERVER_NAME: "GitLab CI"
CARGO_HOME: "${CI_PROJECT_DIR}/cargo"
CARGO_HOME: "${CI_PROJECT_DIR}/.cargo"
BUILD_TARGET: ubuntu
BUILD_ARCH: amd64
CARGO_TARGET: x86_64-unknown-linux-gnu
@ -18,21 +16,15 @@ variables:
cache:
key: "${CI_JOB_NAME}"
paths:
- ${CI_PROJECT_DIR}/target/
- ${CI_PROJECT_DIR}/cargo/
- ./target
- ./.cargo
.releaseable_branches: # list of git refs for building GitLab artifacts (think "pre-release binaries")
only: &releaseable_branches
- master
- stable
- beta
- tags
.publishable_branches: # list of git refs for publishing builds to the "production" locations
only: &publishable_branches
- nightly # Our nightly builds from schedule, on `master`
- /^v2.*$/ # Our version tags
.collect_artifacts: &collect_artifacts
artifacts:
name: "${CI_JOB_NAME}_${CI_COMMIT_REF_NAME}"
@ -49,57 +41,16 @@ cache:
- export VERSION
- echo "Version = ${VERSION}"
#### stage: test
test-rust-stable: &test
test-linux:
stage: test
variables:
RUN_TESTS: all
script:
- scripts/gitlab/test.sh stable
- scripts/gitlab/test-all.sh stable
tags:
- rust-stable
.optional_test: &optional_test
<<: *test
allow_failure: true
only:
- master
test-rust-beta:
<<: *optional_test
script:
- scripts/gitlab/test.sh beta
test-rust-nightly:
<<: *optional_test
script:
- scripts/gitlab/test.sh nightly
test-lint-rustfmt:
<<: *optional_test
script:
- scripts/gitlab/rustfmt.sh
test-lint-clippy:
<<: *optional_test
script:
- scripts/gitlab/clippy.sh
test-coverage-kcov:
stage: test
only:
- master
script:
- scripts/gitlab/coverage.sh
tags:
- shell
allow_failure: true
#### stage: build
build-linux-ubuntu-amd64: &build
build-linux:
stage: build
only: *releaseable_branches
variables:
@ -109,59 +60,23 @@ build-linux-ubuntu-amd64: &build
<<: *collect_artifacts
tags:
- rust-stable
allow_failure: true
build-linux-ubuntu-i386:
<<: *build
image: parity/rust-i686:gitlab-ci
variables:
CARGO_TARGET: i686-unknown-linux-gnu
tags:
- rust-i686
build-linux-ubuntu-arm64:
<<: *build
image: parity/rust-arm64:gitlab-ci
variables:
CARGO_TARGET: aarch64-unknown-linux-gnu
tags:
- rust-arm
build-linux-ubuntu-armhf:
<<: *build
image: parity/rust-armv7:gitlab-ci
variables:
CARGO_TARGET: armv7-unknown-linux-gnueabihf
tags:
- rust-arm
build-linux-android-armhf:
<<: *build
image: parity/rust-android:gitlab-ci
variables:
CARGO_TARGET: armv7-linux-androideabi
tags:
- rust-arm
build-darwin-macos-x86_64:
<<: *build
build-darwin:
stage: build
only: *releaseable_branches
variables:
CARGO_TARGET: x86_64-apple-darwin
CC: gcc
CXX: g++
script:
- scripts/gitlab/build-unix.sh
tags:
- osx
- rust-osx
<<: *collect_artifacts
build-windows-msvc-x86_64:
build-windows:
stage: build
only: *releaseable_branches
cache:
key: "%CI_JOB_NAME%"
paths:
- "%CI_PROJECT_DIR%/target/"
- "%CI_PROJECT_DIR%/cargo/"
# No cargo caching, since fetch-locking on Windows gets stuck
variables:
CARGO_TARGET: x86_64-pc-windows-msvc
script:
@ -170,132 +85,96 @@ build-windows-msvc-x86_64:
- rust-windows
<<: *collect_artifacts
#### stage: package
package-linux-snap-amd64: &package_snap
stage: package
only: *releaseable_branches
image: parity/snapcraft:gitlab-ci
cache: {}
before_script: *determine_version
variables:
CARGO_TARGET: x86_64-unknown-linux-gnu
dependencies:
- build-linux-ubuntu-amd64
script:
- scripts/gitlab/package-snap.sh
tags:
- rust-stable
<<: *collect_artifacts
package-linux-snap-i386:
<<: *package_snap
variables:
BUILD_ARCH: i386
CARGO_TARGET: i686-unknown-linux-gnu
dependencies:
- build-linux-ubuntu-i386
package-linux-snap-arm64:
<<: *package_snap
variables:
BUILD_ARCH: arm64
CARGO_TARGET: aarch64-unknown-linux-gnu
dependencies:
- build-linux-ubuntu-arm64
package-linux-snap-armhf:
<<: *package_snap
variables:
BUILD_ARCH: armhf
CARGO_TARGET: armv7-unknown-linux-gnueabihf
dependencies:
- build-linux-ubuntu-armhf
#### stage: publish
publish-linux-snap-amd64: &publish_snap
stage: publish
only: *publishable_branches
image: parity/snapcraft:gitlab-ci
cache: {}
before_script: *determine_version
variables:
BUILD_ARCH: amd64
dependencies:
- package-linux-snap-amd64
script:
- scripts/gitlab/publish-snap.sh
tags:
- rust-stable
publish-linux-snap-i386:
<<: *publish_snap
variables:
BUILD_ARCH: i386
dependencies:
- package-linux-snap-i386
publish-linux-snap-arm64:
<<: *publish_snap
variables:
BUILD_ARCH: arm64
dependencies:
- package-linux-snap-arm64
publish-linux-snap-armhf:
<<: *publish_snap
variables:
BUILD_ARCH: armhf
dependencies:
- package-linux-snap-armhf
publish-docker-parity-amd64: &publish_docker
publish-docker:
stage: publish
only: *releaseable_branches
cache: {}
dependencies:
- build-linux-ubuntu-amd64
- build-linux
tags:
- shell
allow_failure: true
script:
- scripts/gitlab/publish-docker.sh parity
publish-docker-parityevm-amd64:
<<: *publish_docker
script:
- scripts/gitlab/publish-docker.sh parity-evm
publish-github-and-s3:
publish-awss3:
stage: publish
only: *publishable_branches
only: *releaseable_branches
cache: {}
dependencies:
- build-linux-ubuntu-amd64
- build-linux-ubuntu-i386
- build-linux-ubuntu-armhf
- build-linux-ubuntu-arm64
- build-darwin-macos-x86_64
- build-windows-msvc-x86_64
- build-linux
- build-darwin
- build-windows
before_script: *determine_version
script:
- scripts/gitlab/push.sh
- scripts/gitlab/publish-awss3.sh
tags:
- shell
allow_failure: true
####stage: docs
docs-rpc-json:
stage: docs
docs-jsonrpc:
stage: optional
only:
- tags
except:
- nightly
cache: {}
script:
- scripts/gitlab/rpc-docs.sh
- scripts/gitlab/docs-jsonrpc.sh
tags:
- shell
cargo-audit:
stage: optional
script:
- scripts/gitlab/cargo-audit.sh
tags:
- rust-stable
test-android:
stage: optional
image: parity/rust-android:gitlab-ci
variables:
CARGO_TARGET: armv7-linux-androideabi
script:
- scripts/gitlab/test-all.sh stable
tags:
- rust-arm
test-darwin:
stage: optional
variables:
CARGO_TARGET: x86_64-apple-darwin
CC: gcc
CXX: g++
RUN_TESTS: cargo
script:
- scripts/gitlab/test-all.sh stable
tags:
- rust-osx
test-windows:
stage: optional
variables:
CARGO_TARGET: x86_64-pc-windows-msvc
RUN_TESTS: cargo
script:
- sh scripts/gitlab/test-all.sh stable
tags:
- rust-windows
test-beta:
stage: optional
variables:
RUN_TESTS: cargo
script:
- scripts/gitlab/test-all.sh beta
tags:
- rust-beta
test-nightly:
stage: optional
variables:
RUN_TESTS: all
script:
- scripts/gitlab/test-all.sh nightly
tags:
- rust-nightly

Cargo.lock (generated)

@ -599,7 +599,7 @@ version = "1.12.0"
name = "ethcore-io"
version = "1.12.0"
dependencies = [
"crossbeam 0.3.2 (registry+https://github.com/rust-lang/crates.io-index)",
"crossbeam-deque 0.6.1 (registry+https://github.com/rust-lang/crates.io-index)",
"fnv 1.0.6 (registry+https://github.com/rust-lang/crates.io-index)",
"log 0.4.5 (registry+https://github.com/rust-lang/crates.io-index)",
"mio 0.6.16 (registry+https://github.com/rust-lang/crates.io-index)",
@ -2007,7 +2007,7 @@ version = "1.12.0"
dependencies = [
"jni 0.10.2 (registry+https://github.com/rust-lang/crates.io-index)",
"panic_hook 0.1.0",
"parity-ethereum 2.1.1",
"parity-ethereum 2.1.2",
]
[[package]]
@ -2023,7 +2023,7 @@ dependencies = [
[[package]]
name = "parity-ethereum"
version = "2.1.1"
version = "2.1.2"
dependencies = [
"ansi_term 0.10.2 (registry+https://github.com/rust-lang/crates.io-index)",
"atty 0.2.11 (registry+https://github.com/rust-lang/crates.io-index)",
@ -2073,7 +2073,7 @@ dependencies = [
"parity-rpc 1.12.0",
"parity-rpc-client 1.4.0",
"parity-updater 1.12.0",
"parity-version 2.1.1",
"parity-version 2.1.2",
"parity-whisper 0.1.0",
"parking_lot 0.6.4 (registry+https://github.com/rust-lang/crates.io-index)",
"pretty_assertions 0.1.2 (registry+https://github.com/rust-lang/crates.io-index)",
@ -2238,7 +2238,7 @@ dependencies = [
"parity-crypto 0.1.0 (registry+https://github.com/rust-lang/crates.io-index)",
"parity-reactor 0.1.0",
"parity-updater 1.12.0",
"parity-version 2.1.1",
"parity-version 2.1.2",
"parking_lot 0.6.4 (registry+https://github.com/rust-lang/crates.io-index)",
"patricia-trie 0.2.2 (registry+https://github.com/rust-lang/crates.io-index)",
"pretty_assertions 0.1.2 (registry+https://github.com/rust-lang/crates.io-index)",
@ -2327,7 +2327,7 @@ dependencies = [
"parity-bytes 0.1.0 (registry+https://github.com/rust-lang/crates.io-index)",
"parity-hash-fetch 1.12.0",
"parity-path 0.1.1 (registry+https://github.com/rust-lang/crates.io-index)",
"parity-version 2.1.1",
"parity-version 2.1.2",
"parking_lot 0.6.4 (registry+https://github.com/rust-lang/crates.io-index)",
"rand 0.4.3 (registry+https://github.com/rust-lang/crates.io-index)",
"semver 0.9.0 (registry+https://github.com/rust-lang/crates.io-index)",
@ -2337,7 +2337,7 @@ dependencies = [
[[package]]
name = "parity-version"
version = "2.1.1"
version = "2.1.2"
dependencies = [
"parity-bytes 0.1.0 (registry+https://github.com/rust-lang/crates.io-index)",
"rlp 0.2.4 (registry+https://github.com/rust-lang/crates.io-index)",


@ -2,7 +2,7 @@
description = "Parity Ethereum client"
name = "parity-ethereum"
# NOTE Make sure to update util/version/Cargo.toml as well
version = "2.1.1"
version = "2.1.2"
license = "GPL-3.0"
authors = ["Parity Technologies <admin@parity.io>"]


@ -4,9 +4,7 @@
<p align="center"><strong><a href="https://github.com/paritytech/parity-ethereum/releases/latest">» Download the latest release «</a></strong></p>
<p align="center"><a href="https://gitlab.parity.io/parity/parity/commits/master" target="_blank"><img src="https://gitlab.parity.io/parity/parity/badges/master/build.svg" /></a>
<a href="https://codecov.io/gh/paritytech/parity-ethereum" target="_blank"><img src="https://codecov.io/gh/paritytech/parity-ethereum/branch/master/graph/badge.svg" /></a>
<a href="https://build.snapcraft.io/user/paritytech/parity" target="_blank"><img src="https://build.snapcraft.io/badge/paritytech/parity.svg" /></a>
<p align="center"><a href="https://gitlab.parity.io/parity/parity-ethereum/commits/master" target="_blank"><img src="https://gitlab.parity.io/parity/parity-ethereum/badges/master/build.svg" /></a>
<a href="https://www.gnu.org/licenses/gpl-3.0.en.html" target="_blank"><img src="https://img.shields.io/badge/license-GPL%20v3-green.svg" /></a></p>
**Built for mission-critical use**: Miners, service providers, and exchanges need fast synchronisation and maximum uptime. Parity Ethereum provides the core infrastructure essential for speedy and reliable services.
@ -25,11 +23,11 @@ By default, Parity Ethereum runs a JSON-RPC HTTP server on port `:8545` and a We
If you run into problems while using Parity Ethereum, check out the [wiki for documentation](https://wiki.parity.io/), feel free to [file an issue in this repository](https://github.com/paritytech/parity-ethereum/issues/new), or hop on our [Gitter](https://gitter.im/paritytech/parity) or [Riot](https://riot.im/app/#/group/+parity:matrix.parity.io) chat room to ask a question. We are glad to help! **For security-critical issues**, please refer to the security policy outlined in [SECURITY.md](SECURITY.md).
Parity Ethereum's current beta-release is 2.0. You can download it at [the releases page](https://github.com/paritytech/parity-ethereum/releases) or follow the instructions below to build from source. Please, mind the [CHANGELOG.md](CHANGELOG.md) for a list of all changes between different versions.
Parity Ethereum's current beta-release is 2.1. You can download it at [the releases page](https://github.com/paritytech/parity-ethereum/releases) or follow the instructions below to build from source. Please, mind the [CHANGELOG.md](CHANGELOG.md) for a list of all changes between different versions.
## Build Dependencies
Parity Ethereum requires **Rust version 1.28.x** to build.
Parity Ethereum requires **Rust version 1.29.x** to build.
We recommend installing Rust through [rustup](https://www.rustup.rs/). If you don't already have `rustup`, you can install it like this:
@ -60,26 +58,6 @@ Once you have `rustup` installed, then you need to install:
Make sure that these binaries are in your `PATH`. After that, you should be able to build Parity Ethereum from source.
## Install from the Snapcraft Store
In any of the [supported Linux distros](https://snapcraft.io/docs/core/install):
```bash
sudo snap install parity
```
Alternatively, if you want to contribute testing the upcoming release:
```bash
sudo snap install parity --beta
```
Moreover, to test the latest code from the master branch:
```bash
sudo snap install parity --edge
```
## Build from Source Code
```bash


@ -1,61 +0,0 @@
FROM ubuntu:xenial
LABEL maintainer="Parity Technologies <devops@parity.io>"
RUN apt-get update && \
apt-get install -yq sudo curl file build-essential wget git g++ cmake pkg-config bison flex \
unzip lib32stdc++6 lib32z1 python autotools-dev automake autoconf libtool \
gperf xsltproc docbook-xsl
# Rust & Cargo
RUN curl https://sh.rustup.rs -sSf | sh -s -- -y
ENV PATH /root/.cargo/bin:$PATH
RUN rustup toolchain install stable
RUN rustup target add --toolchain stable arm-linux-androideabi
RUN rustup target add --toolchain stable armv7-linux-androideabi
# Android NDK and toolchain
RUN cd /usr/local && \
wget -q https://dl.google.com/android/repository/android-ndk-r16b-linux-x86_64.zip && \
unzip -q android-ndk-r16b-linux-x86_64.zip && \
rm android-ndk-r16b-linux-x86_64.zip
ENV NDK_HOME /usr/local/android-ndk-r16b
RUN /usr/local/android-ndk-r16b/build/tools/make-standalone-toolchain.sh \
--arch=arm --install-dir=/opt/ndk-standalone --stl=libc++ --platform=android-26
ENV PATH $PATH:/opt/ndk-standalone/bin
# Compiling libudev for Android
# This is the most hacky part of the process, as we need to apply a patch and pass specific
# options that the compiler environment doesn't define.
RUN cd /root && \
git clone https://github.com/gentoo/eudev.git
ADD libudev.patch /root
RUN cd /root/eudev && \
git checkout 83d918449f22720d84a341a05e24b6d109e6d3ae && \
./autogen.sh && \
./configure --disable-introspection --disable-programs --disable-hwdb \
--host=arm-linux-androideabi --prefix=/opt/ndk-standalone/sysroot/usr/ \
--enable-shared=false CC=arm-linux-androideabi-clang \
CFLAGS="-D LINE_MAX=2048 -D RLIMIT_NLIMITS=15 -D IPTOS_LOWCOST=2 -std=gnu99" \
CXX=arm-linux-androideabi-clang++ && \
git apply - < /root/libudev.patch && \
make && \
make install
RUN rm -rf /root/eudev
RUN rm /root/libudev.patch
# Rust-related configuration
ADD cargo-config.toml /root/.cargo/config
ENV ARM_LINUX_ANDROIDEABI_OPENSSL_DIR /opt/ndk-standalone/sysroot/usr
ENV ARMV7_LINUX_ANDROIDEABI_OPENSSL_DIR /opt/ndk-standalone/sysroot/usr
ENV CC_arm_linux_androideabi arm-linux-androideabi-clang
ENV CC_armv7_linux_androideabi arm-linux-androideabi-clang
ENV CXX_arm_linux_androideabi arm-linux-androideabi-clang++
ENV CXX_armv7_linux_androideabi arm-linux-androideabi-clang++
ENV AR_arm_linux_androideabi arm-linux-androideabi-ar
ENV AR_armv7_linux_androideabi arm-linux-androideabi-ar
ENV CFLAGS_arm_linux_androideabi -std=gnu11 -fPIC -D OS_ANDROID
ENV CFLAGS_armv7_linux_androideabi -std=gnu11 -fPIC -D OS_ANDROID
ENV CXXFLAGS_arm_linux_androideabi -std=gnu++11 -fPIC -fexceptions -frtti -static-libstdc++ -D OS_ANDROID
ENV CXXFLAGS_armv7_linux_androideabi -std=gnu++11 -fPIC -fexceptions -frtti -static-libstdc++ -D OS_ANDROID
ENV CXXSTDLIB_arm_linux_androideabi ""
ENV CXXSTDLIB_armv7_linux_androideabi ""


@ -1,9 +0,0 @@
[target.armv7-linux-androideabi]
linker = "arm-linux-androideabi-clang"
ar = "arm-linux-androideabi-ar"
rustflags = ["-C", "link-arg=-lc++_static", "-C", "link-arg=-lc++abi", "-C", "link-arg=-landroid_support"]
[target.arm-linux-androideabi]
linker = "arm-linux-androideabi-clang"
ar = "arm-linux-androideabi-ar"
rustflags = ["-C", "link-arg=-lc++_static", "-C", "link-arg=-lc++abi", "-C", "link-arg=-landroid_support"]


@ -1,216 +0,0 @@
diff --git a/src/collect/collect.c b/src/collect/collect.c
index 2cf1f00..b24f26b 100644
--- a/src/collect/collect.c
+++ b/src/collect/collect.c
@@ -84,7 +84,7 @@ static void usage(void)
" invoked for each ID in <idlist>) collect returns 0, the\n"
" number of missing IDs otherwise.\n"
" On error a negative number is returned.\n\n"
- , program_invocation_short_name);
+ , "parity");
}
/*
diff --git a/src/scsi_id/scsi_id.c b/src/scsi_id/scsi_id.c
index 8b76d87..7bf3948 100644
--- a/src/scsi_id/scsi_id.c
+++ b/src/scsi_id/scsi_id.c
@@ -321,7 +321,7 @@ static void help(void) {
" -u --replace-whitespace Replace all whitespace by underscores\n"
" -v --verbose Verbose logging\n"
" -x --export Print values as environment keys\n"
- , program_invocation_short_name);
+ , "parity");
}
diff --git a/src/shared/hashmap.h b/src/shared/hashmap.h
index a03ee58..a7c2005 100644
--- a/src/shared/hashmap.h
+++ b/src/shared/hashmap.h
@@ -98,10 +98,7 @@ extern const struct hash_ops uint64_hash_ops;
#if SIZEOF_DEV_T != 8
unsigned long devt_hash_func(const void *p, const uint8_t hash_key[HASH_KEY_SIZE]) _pure_;
int devt_compare_func(const void *a, const void *b) _pure_;
-extern const struct hash_ops devt_hash_ops = {
- .hash = devt_hash_func,
- .compare = devt_compare_func
-};
+extern const struct hash_ops devt_hash_ops;
#else
#define devt_hash_func uint64_hash_func
#define devt_compare_func uint64_compare_func
diff --git a/src/shared/log.c b/src/shared/log.c
index 4a40996..1496984 100644
--- a/src/shared/log.c
+++ b/src/shared/log.c
@@ -335,7 +335,7 @@ static int write_to_syslog(
IOVEC_SET_STRING(iovec[0], header_priority);
IOVEC_SET_STRING(iovec[1], header_time);
- IOVEC_SET_STRING(iovec[2], program_invocation_short_name);
+ IOVEC_SET_STRING(iovec[2], "parity");
IOVEC_SET_STRING(iovec[3], header_pid);
IOVEC_SET_STRING(iovec[4], buffer);
@@ -383,7 +383,7 @@ static int write_to_kmsg(
char_array_0(header_pid);
IOVEC_SET_STRING(iovec[0], header_priority);
- IOVEC_SET_STRING(iovec[1], program_invocation_short_name);
+ IOVEC_SET_STRING(iovec[1], "parity");
IOVEC_SET_STRING(iovec[2], header_pid);
IOVEC_SET_STRING(iovec[3], buffer);
IOVEC_SET_STRING(iovec[4], "\n");
diff --git a/src/udev/udevadm-control.c b/src/udev/udevadm-control.c
index 6af7163..3271e56 100644
--- a/src/udev/udevadm-control.c
+++ b/src/udev/udevadm-control.c
@@ -41,7 +41,7 @@ static void print_help(void) {
" -p --property=KEY=VALUE Set a global property for all events\n"
" -m --children-max=N Maximum number of children\n"
" --timeout=SECONDS Maximum time to block for a reply\n"
- , program_invocation_short_name);
+ , "parity");
}
static int adm_control(struct udev *udev, int argc, char *argv[]) {
diff --git a/src/udev/udevadm-info.c b/src/udev/udevadm-info.c
index 0aec976..a31ac02 100644
--- a/src/udev/udevadm-info.c
+++ b/src/udev/udevadm-info.c
@@ -279,7 +279,7 @@ static void help(void) {
" -P --export-prefix Export the key name with a prefix\n"
" -e --export-db Export the content of the udev database\n"
" -c --cleanup-db Clean up the udev database\n"
- , program_invocation_short_name);
+ , "parity");
}
static int uinfo(struct udev *udev, int argc, char *argv[]) {
diff --git a/src/udev/udevadm-monitor.c b/src/udev/udevadm-monitor.c
index 15ded09..b58dd08 100644
--- a/src/udev/udevadm-monitor.c
+++ b/src/udev/udevadm-monitor.c
@@ -73,7 +73,7 @@ static void help(void) {
" -u --udev Print udev events\n"
" -s --subsystem-match=SUBSYSTEM[/DEVTYPE] Filter events by subsystem\n"
" -t --tag-match=TAG Filter events by tag\n"
- , program_invocation_short_name);
+ , "parity");
}
static int adm_monitor(struct udev *udev, int argc, char *argv[]) {
diff --git a/src/udev/udevadm-settle.c b/src/udev/udevadm-settle.c
index 33597bc..b36a504 100644
--- a/src/udev/udevadm-settle.c
+++ b/src/udev/udevadm-settle.c
@@ -43,7 +43,7 @@ static void help(void) {
" --version Show package version\n"
" -t --timeout=SECONDS Maximum time to wait for events\n"
" -E --exit-if-exists=FILE Stop waiting if file exists\n"
- , program_invocation_short_name);
+ , "parity");
}
static int adm_settle(struct udev *udev, int argc, char *argv[]) {
diff --git a/src/udev/udevadm-test-builtin.c b/src/udev/udevadm-test-builtin.c
index baaeca9..50ed812 100644
--- a/src/udev/udevadm-test-builtin.c
+++ b/src/udev/udevadm-test-builtin.c
@@ -39,7 +39,7 @@ static void help(struct udev *udev) {
" -h --help Print this message\n"
" --version Print version of the program\n\n"
"Commands:\n"
- , program_invocation_short_name);
+ , "parity");
udev_builtin_list(udev);
}
diff --git a/src/udev/udevadm-test.c b/src/udev/udevadm-test.c
index 47fd924..a855412 100644
--- a/src/udev/udevadm-test.c
+++ b/src/udev/udevadm-test.c
@@ -39,7 +39,7 @@ static void help(void) {
" --version Show package version\n"
" -a --action=ACTION Set action string\n"
" -N --resolve-names=early|late|never When to resolve names\n"
- , program_invocation_short_name);
+ , "parity");
}
static int adm_test(struct udev *udev, int argc, char *argv[]) {
diff --git a/src/udev/udevadm-trigger.c b/src/udev/udevadm-trigger.c
index 4dc756a..67787d3 100644
--- a/src/udev/udevadm-trigger.c
+++ b/src/udev/udevadm-trigger.c
@@ -92,7 +92,7 @@ static void help(void) {
" -y --sysname-match=NAME Trigger devices with this /sys path\n"
" --name-match=NAME Trigger devices with this /dev name\n"
" -b --parent-match=NAME Trigger devices with that parent device\n"
- , program_invocation_short_name);
+ , "parity");
}
static int adm_trigger(struct udev *udev, int argc, char *argv[]) {
diff --git a/src/udev/udevadm.c b/src/udev/udevadm.c
index 3e57cf6..b03dfaa 100644
--- a/src/udev/udevadm.c
+++ b/src/udev/udevadm.c
@@ -62,7 +62,7 @@ static int adm_help(struct udev *udev, int argc, char *argv[]) {
printf("%s [--help] [--version] [--debug] COMMAND [COMMAND OPTIONS]\n\n"
"Send control commands or test the device manager.\n\n"
"Commands:\n"
- , program_invocation_short_name);
+ , "parity");
for (i = 0; i < ELEMENTSOF(udevadm_cmds); i++)
if (udevadm_cmds[i]->help != NULL)
@@ -128,7 +128,7 @@ int main(int argc, char *argv[]) {
goto out;
}
- fprintf(stderr, "%s: missing or unknown command\n", program_invocation_short_name);
+ fprintf(stderr, "%s: missing or unknown command\n", "parity");
rc = 2;
out:
mac_selinux_finish();
diff --git a/src/udev/udevd.c b/src/udev/udevd.c
index cf826c6..4eec0af 100644
--- a/src/udev/udevd.c
+++ b/src/udev/udevd.c
@@ -1041,7 +1041,7 @@ static void help(void) {
" -t --event-timeout=SECONDS Seconds to wait before terminating an event\n"
" -N --resolve-names=early|late|never\n"
" When to resolve users and groups\n"
- , program_invocation_short_name);
+ , "parity");
}
static int parse_argv(int argc, char *argv[]) {
diff --git a/src/v4l_id/v4l_id.c b/src/v4l_id/v4l_id.c
index 1dce0d5..f65badf 100644
--- a/src/v4l_id/v4l_id.c
+++ b/src/v4l_id/v4l_id.c
@@ -49,7 +49,7 @@ int main(int argc, char *argv[]) {
printf("%s [-h,--help] <device file>\n\n"
"Video4Linux device identification.\n\n"
" -h Print this message\n"
- , program_invocation_short_name);
+ , "parity");
return 0;
case '?':
return -EINVAL;
diff --git a/src/shared/path-util.c b/src/shared/path-util.c
index 0744563..7151356 100644
--- a/src/shared/path-util.c
+++ b/src/shared/path-util.c
@@ -109,7 +109,7 @@ char *path_make_absolute_cwd(const char *p) {
if (path_is_absolute(p))
return strdup(p);
- cwd = get_current_dir_name();
+ cwd = getcwd(malloc(128), 128);
if (!cwd)
return NULL;


@ -176,9 +176,8 @@ impl<Gas: evm::CostType> Gasometer<Gas> {
Request::GasMem(default_gas, mem_needed(stack.peek(0), stack.peek(1))?)
},
instructions::SHA3 => {
let w = overflowing!(add_gas_usize(Gas::from_u256(*stack.peek(1))?, 31));
let words = w >> 5;
let gas = Gas::from(schedule.sha3_gas) + (Gas::from(schedule.sha3_word_gas) * words);
let words = overflowing!(to_word_size(Gas::from_u256(*stack.peek(1))?));
let gas = overflowing!(Gas::from(schedule.sha3_gas).overflow_add(overflowing!(Gas::from(schedule.sha3_word_gas).overflow_mul(words))));
Request::GasMem(gas, mem_needed(stack.peek(0), stack.peek(1))?)
},
instructions::CALLDATACOPY | instructions::CODECOPY | instructions::RETURNDATACOPY => {
@ -231,9 +230,24 @@ impl<Gas: evm::CostType> Gasometer<Gas> {
Request::GasMemProvide(gas, mem, Some(requested))
},
instructions::CREATE | instructions::CREATE2 => {
instructions::CREATE => {
let start = stack.peek(1);
let len = stack.peek(2);
let gas = Gas::from(schedule.create_gas);
let mem = mem_needed(stack.peek(1), stack.peek(2))?;
let mem = mem_needed(start, len)?;
Request::GasMemProvide(gas, mem, None)
},
instructions::CREATE2 => {
let start = stack.peek(1);
let len = stack.peek(2);
let base = Gas::from(schedule.create_gas);
let word = overflowing!(to_word_size(Gas::from_u256(*len)?));
let word_gas = overflowing!(Gas::from(schedule.sha3_word_gas).overflow_mul(word));
let gas = overflowing!(base.overflow_add(word_gas));
let mem = mem_needed(start, len)?;
Request::GasMemProvide(gas, mem, None)
},
@ -283,8 +297,8 @@ impl<Gas: evm::CostType> Gasometer<Gas> {
},
Request::GasMemCopy(gas, mem_size, copy) => {
let (mem_gas_cost, new_mem_gas, new_mem_size) = self.mem_gas_cost(schedule, current_mem_size, &mem_size)?;
let copy = overflowing!(add_gas_usize(copy, 31)) >> 5;
let copy_gas = Gas::from(schedule.copy_gas) * copy;
let copy = overflowing!(to_word_size(copy));
let copy_gas = overflowing!(Gas::from(schedule.copy_gas).overflow_mul(copy));
let gas = overflowing!(gas.overflow_add(copy_gas));
let gas = overflowing!(gas.overflow_add(mem_gas_cost));
@ -311,7 +325,7 @@ impl<Gas: evm::CostType> Gasometer<Gas> {
};
let current_mem_size = Gas::from(current_mem_size);
let req_mem_size_rounded = (overflowing!(mem_size.overflow_add(Gas::from(31 as usize))) >> 5) << 5;
let req_mem_size_rounded = overflowing!(to_word_size(*mem_size)) << 5;
let (mem_gas_cost, new_mem_gas) = if req_mem_size_rounded > current_mem_size {
let new_mem_gas = gas_for_mem(req_mem_size_rounded)?;
@ -343,6 +357,16 @@ fn add_gas_usize<Gas: evm::CostType>(value: Gas, num: usize) -> (Gas, bool) {
value.overflow_add(Gas::from(num))
}
#[inline]
fn to_word_size<Gas: evm::CostType>(value: Gas) -> (Gas, bool) {
let (gas, overflow) = add_gas_usize(value, 31);
if overflow {
return (gas, overflow);
}
(gas >> 5, false)
}
#[inline]
fn calculate_eip1283_sstore_gas<Gas: evm::CostType>(schedule: &Schedule, original: &U256, current: &U256, new: &U256) -> Gas {
Gas::from(


@ -306,8 +306,7 @@ impl<Cost: CostType> Interpreter<Cost> {
match result {
InstructionResult::JumpToPosition(position) => {
if self.valid_jump_destinations.is_none() {
let code_hash = self.params.code_hash.clone().unwrap_or_else(|| keccak(self.reader.code.as_ref()));
self.valid_jump_destinations = Some(self.cache.jump_destinations(&code_hash, &self.reader.code));
self.valid_jump_destinations = Some(self.cache.jump_destinations(&self.params.code_hash, &self.reader.code));
}
let jump_destinations = self.valid_jump_destinations.as_ref().expect("jump_destinations are initialized on first jump; qed");
let pos = self.verify_jump(position, jump_destinations)?;


@ -50,7 +50,8 @@ impl SharedCache {
}
/// Get jump destinations bitmap for a contract.
pub fn jump_destinations(&self, code_hash: &H256, code: &[u8]) -> Arc<BitSet> {
pub fn jump_destinations(&self, code_hash: &Option<H256>, code: &[u8]) -> Arc<BitSet> {
if let Some(ref code_hash) = code_hash {
if code_hash == &KECCAK_EMPTY {
return Self::find_jump_destinations(code);
}
@ -58,9 +59,13 @@ impl SharedCache {
if let Some(d) = self.jump_destinations.lock().get_mut(code_hash) {
return d.0.clone();
}
}
let d = Self::find_jump_destinations(code);
self.jump_destinations.lock().insert(code_hash.clone(), Bits(d.clone()));
if let Some(ref code_hash) = code_hash {
self.jump_destinations.lock().insert(*code_hash, Bits(d.clone()));
}
d
}


@ -43,7 +43,13 @@
"eip211Transition": 5067000,
"eip214Transition": 5067000,
"eip658Transition": 5067000,
"wasmActivationTransition": 6600000
"wasmActivationTransition": 6600000,
"eip145Transition": 9200000,
"eip1014Transition": 9200000,
"eip1052Transition": 9200000,
"eip1283Transition": 9200000,
"kip4Transition": 9200000,
"kip6Transition": 9200000
},
"genesis": {
"seal": {


@ -18,9 +18,14 @@
},
"509355": {
"safeContract": "0x03048F666359CFD3C74a1A5b9a97848BF71d5038"
},
"4622420": {
"safeContract": "0x4c6a159659CCcb033F4b2e2Be0C16ACC62b89DDB"
}
}
}
},
"blockRewardContractAddress": "0x3145197AD50D7083D0222DE4fCCf67d9BD05C30D",
"blockRewardContractTransition": 4639000
}
}
},


@ -9,12 +9,14 @@
"durationLimit": "0x0d",
"blockReward": {
"0": "0x4563918244F40000",
"1700000": "0x29A2241AF62C0000"
"1700000": "0x29A2241AF62C0000",
"4230000": "0x1BC16D674EC80000"
},
"homesteadTransition": 0,
"eip100bTransition": 1700000,
"difficultyBombDelays": {
"1700000": 3000000
"1700000": 3000000,
"4230000": 2000000
}
}
}
@ -39,7 +41,11 @@
"eip140Transition": 1700000,
"eip211Transition": 1700000,
"eip214Transition": 1700000,
"eip658Transition": 1700000
"eip658Transition": 1700000,
"eip145Transition": 4230000,
"eip1014Transition": 4230000,
"eip1052Transition": 4230000,
"eip1283Transition": 4230000
},
"genesis": {
"seal": {
@ -1978,7 +1984,8 @@
"enode://6332792c4a00e3e4ee0926ed89e0d27ef985424d97b6a45bf0f23e51f0dcb5e66b875777506458aea7af6f9e4ffb69f43f3778ee73c81ed9d34c51c4b16b0b0f@52.232.243.152:30303",
"enode://94c15d1b9e2fe7ce56e458b9a3b672ef11894ddedd0c6f247e0f1d3487f52b66208fb4aeb8179fce6e3a749ea93ed147c37976d67af557508d199d9594c35f09@192.81.208.223:30303",
"enode://30b7ab30a01c124a6cceca36863ece12c4f5fa68e3ba9b0b51407ccc002eeed3b3102d20a88f1c1d3c3154e2449317b8ef95090e77b312d5cc39354f86d5d606@52.176.7.10:30303",
"enode://865a63255b3bb68023b6bffd5095118fcc13e79dcf014fe4e47e065c350c7cc72af2e53eff895f11ba1bbb6a2b33271c1116ee870f266618eadfc2e78aa7349c@52.176.100.77:30303"
"enode://865a63255b3bb68023b6bffd5095118fcc13e79dcf014fe4e47e065c350c7cc72af2e53eff895f11ba1bbb6a2b33271c1116ee870f266618eadfc2e78aa7349c@52.176.100.77:30303",
"enode://691907d5a7dee24884b791e799183e5db01f4fe0b6e9b795ffaf5cf85a3023a637f2abadc82fc0da168405092df869126377c5f190794cd2d1c067245ae2b1ce@13.125.237.43:30303"
],
"accounts": {
"0000000000000000000000000000000000000000": { "balance": "1" },


@ -0,0 +1,468 @@
{ "block":
[
{
"reference": "9590",
"failing": "stCreateTest",
"subtests": [
"CreateOOGafterInitCodeReturndata2_d0g1v0_Constantinople"
]
},
{
"reference": "9590",
"failing": "stCreate2",
"subtests": [
"RevertDepthCreateAddressCollision_d0g1v0_Constantinople",
"RevertDepthCreateAddressCollision_d1g1v1_Constantinople",
"CREATE2_Suicide_d5g0v0_Constantinople",
"CREATE2_Suicide_d7g0v0_Constantinople",
"create2collisionSelfdestructedOOG_d2g0v0_Byzantium",
"create2collisionSelfdestructedOOG_d2g0v0_Constantinople",
"create2collisionNonce_d1g0v0_Byzantium",
"create2collisionNonce_d1g0v0_Constantinople",
"CreateMessageRevertedOOGInInit_d0g1v0_Constantinople",
"create2callPrecompiles_d3g0v0_Constantinople",
"create2collisionCode_d1g0v0_Byzantium",
"create2collisionCode_d1g0v0_Constantinople",
"create2collisionStorage_d0g0v0_Byzantium",
"create2collisionStorage_d0g0v0_Constantinople",
"create2callPrecompiles_d4g0v0_Constantinople",
"create2collisionSelfdestructedRevert_d0g0v0_Byzantium",
"create2collisionSelfdestructedRevert_d0g0v0_Constantinople",
"CreateMessageReverted_d0g1v0_Constantinople",
"RevertOpcodeCreate_d0g1v0_Constantinople",
"CREATE2_Suicide_d11g0v0_Constantinople",
"create2checkFieldsInInitcode_d5g0v0_Constantinople",
"create2collisionSelfdestructedOOG_d1g0v0_Byzantium",
"create2collisionSelfdestructedOOG_d1g0v0_Constantinople",
"returndatacopy_following_create_d1g0v0_Constantinople",
"RevertDepthCreate2OOG_d1g1v1_Constantinople",
"create2collisionSelfdestructed_d2g0v0_Byzantium",
"create2collisionSelfdestructed_d2g0v0_Constantinople",
"create2callPrecompiles_d2g0v0_Constantinople",
"create2InitCodes_d2g0v0_Constantinople",
"create2collisionNonce_d2g0v0_Byzantium",
"create2collisionNonce_d2g0v0_Constantinople",
"create2collisionCode_d0g0v0_Byzantium",
"create2collisionCode_d0g0v0_Constantinople",
"CREATE2_Bounds_d0g0v0_Constantinople",
"RevertDepthCreate2OOG_d0g0v0_Constantinople",
"CREATE2_Suicide_d1g0v0_Constantinople",
"CREATE2_Bounds3_d0g1v0_Constantinople",
"create2collisionStorage_d2g0v0_Byzantium",
"create2collisionStorage_d2g0v0_Constantinople",
"RevertDepthCreateAddressCollision_d0g0v1_Constantinople",
"create2callPrecompiles_d5g0v0_Constantinople",
"create2collisionCode2_d0g0v0_Byzantium",
"create2collisionCode2_d0g0v0_Constantinople",
"create2noCash_d0g0v0_Byzantium",
"create2noCash_d0g0v0_Constantinople",
"create2checkFieldsInInitcode_d7g0v0_Constantinople",
"create2SmartInitCode_d1g0v0_Constantinople",
"create2InitCodes_d6g0v0_Constantinople",
"create2noCash_d1g0v0_Byzantium",
"create2noCash_d1g0v0_Constantinople",
"CREATE2_ContractSuicideDuringInit_ThenStoreThenReturn_d0g0v0_Constantinople",
"RevertOpcodeInCreateReturns_d0g0v0_Constantinople",
"create2collisionStorage_d1g0v0_Byzantium",
"create2collisionStorage_d1g0v0_Constantinople",
"create2checkFieldsInInitcode_d3g0v0_Constantinople",
"create2collisionBalance_d0g0v0_Byzantium",
"create2collisionBalance_d0g0v0_Constantinople",
"create2collisionSelfdestructed2_d0g0v0_Constantinople",
"create2InitCodes_d3g0v0_Constantinople",
"create2collisionCode2_d1g0v0_Byzantium",
"create2collisionCode2_d1g0v0_Constantinople",
"create2checkFieldsInInitcode_d1g0v0_Constantinople",
"create2collisionBalance_d1g0v0_Byzantium",
"create2collisionBalance_d1g0v0_Constantinople",
"CREATE2_Bounds3_d0g2v0_Constantinople",
"create2callPrecompiles_d6g0v0_Constantinople",
"Create2Recursive_d0g0v0_Constantinople",
"create2collisionSelfdestructedOOG_d0g0v0_Byzantium",
"create2collisionSelfdestructedOOG_d0g0v0_Constantinople",
"CREATE2_Suicide_d3g0v0_Constantinople",
"returndatacopy_following_create_d0g0v0_Constantinople",
"create2InitCodes_d8g0v0_Constantinople",
"RevertDepthCreate2OOG_d0g0v1_Constantinople",
"create2checkFieldsInInitcode_d2g0v0_Constantinople",
"RevertDepthCreate2OOG_d1g0v1_Constantinople",
"Create2OnDepth1024_d0g0v0_Constantinople",
"create2collisionSelfdestructed2_d1g0v0_Constantinople",
"create2collisionSelfdestructedRevert_d2g0v0_Byzantium",
"create2collisionSelfdestructedRevert_d2g0v0_Constantinople",
"create2callPrecompiles_d0g0v0_Constantinople",
"RevertDepthCreateAddressCollision_d0g1v1_Constantinople",
"create2collisionSelfdestructed_d1g0v0_Byzantium",
"create2collisionSelfdestructed_d1g0v0_Constantinople",
"call_outsize_then_create2_successful_then_returndatasize_d0g0v0_Byzantium",
"call_outsize_then_create2_successful_then_returndatasize_d0g0v0_Constantinople",
"Create2OOGafterInitCodeRevert_d0g0v0_Constantinople",
"Create2OOGafterInitCodeReturndata3_d0g0v0_Constantinople",
"Create2OOGafterInitCodeReturndataSize_d0g0v0_Constantinople",
"create2InitCodes_d7g0v0_Constantinople",
"CREATE2_Suicide_d10g0v0_Constantinople",
"RevertDepthCreate2OOG_d0g1v0_Constantinople",
"create2InitCodes_d5g0v0_Constantinople",
"create2collisionSelfdestructedRevert_d1g0v0_Byzantium",
"create2collisionSelfdestructedRevert_d1g0v0_Constantinople",
"RevertDepthCreate2OOG_d1g1v0_Constantinople",
"create2collisionSelfdestructed_d0g0v0_Byzantium",
"create2collisionSelfdestructed_d0g0v0_Constantinople",
"create2noCash_d2g0v0_Byzantium",
"create2noCash_d2g0v0_Constantinople",
"CREATE2_Bounds3_d0g0v0_Constantinople",
"create2collisionNonce_d0g0v0_Byzantium",
"create2collisionNonce_d0g0v0_Constantinople",
"CREATE2_Suicide_d2g0v0_Constantinople",
"Create2OOGafterInitCode_d0g0v0_Constantinople",
"call_then_create2_successful_then_returndatasize_d0g0v0_Byzantium",
"call_then_create2_successful_then_returndatasize_d0g0v0_Constantinople",
"create2collisionBalance_d2g0v0_Byzantium",
"create2collisionBalance_d2g0v0_Constantinople",
"create2checkFieldsInInitcode_d6g0v0_Constantinople",
"RevertDepthCreate2OOG_d0g1v1_Constantinople",
"returndatacopy_afterFailing_create_d0g0v0_Constantinople",
"returndatacopy_following_revert_in_create_d0g0v0_Constantinople",
"CREATE2_Suicide_d9g0v0_Constantinople",
"create2callPrecompiles_d7g0v0_Constantinople",
"RevertDepthCreateAddressCollision_d1g0v1_Constantinople",
"create2InitCodes_d1g0v0_Constantinople",
"CREATE2_Bounds_d0g1v0_Constantinople",
"Create2OOGafterInitCodeReturndata_d0g0v0_Constantinople",
"create2checkFieldsInInitcode_d4g0v0_Constantinople",
"CreateMessageRevertedOOGInInit_d0g0v0_Constantinople",
"RevertDepthCreateAddressCollision_d1g1v0_Constantinople",
"returndatacopy_following_successful_create_d0g0v0_Constantinople",
"create2checkFieldsInInitcode_d0g0v0_Constantinople",
"CreateMessageReverted_d0g0v0_Constantinople",
"create2SmartInitCode_d0g0v0_Constantinople",
"CREATE2_Bounds2_d0g0v0_Constantinople",
"returndatasize_following_successful_create_d0g0v0_Constantinople",
"CREATE2_Bounds2_d0g1v0_Constantinople",
"returndatacopy_0_0_following_successful_create_d0g0v0_Constantinople",
"RevertDepthCreateAddressCollision_d0g0v0_Constantinople",
"CREATE2_Suicide_d0g0v0_Constantinople",
"create2InitCodes_d0g0v0_Constantinople",
"Create2OnDepth1023_d0g0v0_Constantinople",
"create2InitCodes_d4g0v0_Constantinople",
"Create2OOGafterInitCodeReturndata2_d0g0v0_Constantinople",
"create2collisionBalance_d3g0v0_Byzantium",
"create2collisionBalance_d3g0v0_Constantinople",
"CREATE2_Suicide_d4g0v0_Constantinople",
"Create2OOGafterInitCode_d0g1v0_Constantinople",
"RevertDepthCreateAddressCollision_d1g0v0_Constantinople",
"Create2OOGafterInitCodeRevert2_d0g0v0_Constantinople",
"Create2OOGafterInitCodeReturndata_d0g1v0_Constantinople",
"Create2Recursive_d0g1v0_Constantinople",
"create2collisionCode_d2g0v0_Byzantium",
"create2collisionCode_d2g0v0_Constantinople",
"CREATE2_Suicide_d6g0v0_Constantinople",
"CREATE2_Suicide_d8g0v0_Constantinople",
"RevertOpcodeCreate_d0g0v0_Constantinople",
"Create2OOGafterInitCodeReturndata2_d0g1v0_Constantinople",
"create2callPrecompiles_d1g0v0_Constantinople",
"RevertInCreateInInit_d0g0v0_Constantinople",
"RevertDepthCreate2OOG_d1g0v0_Constantinople"
]
},
{
"reference": "9590",
"failing": "bcStateTest",
"subtests": [
"suicideStorageCheck_Byzantium",
"suicideStorageCheck_Constantinople",
"suicideStorageCheckVCreate2_Byzantium",
"suicideStorageCheckVCreate2_Constantinople",
"create2collisionwithSelfdestructSameBlock_Constantinople",
"blockhashNonConstArg_Constantinople",
"suicideThenCheckBalance_Constantinople",
"suicideThenCheckBalance_Homestead",
"suicideStorageCheckVCreate_Byzantium",
"suicideStorageCheckVCreate_Constantinople"
]
},
{
"reference": "9590",
"failing": "stEIP158Specific",
"subtests": [
"callToEmptyThenCallError_d0g0v0_Byzantium",
"callToEmptyThenCallError_d0g0v0_Constantinople",
"callToEmptyThenCallError_d0g0v0_EIP158"
]
},
{
"reference": "9590",
"failing": "stPreCompiledContracts",
"subtests": [
"identity_to_smaller_d0g0v0_Constantinople",
"identity_to_bigger_d0g0v0_Constantinople"
]
},
{
"reference": "9590",
"failing": "stReturnDataTest",
"subtests": [
"modexp_modsize0_returndatasize_d0g1v0_Constantinople",
"modexp_modsize0_returndatasize_d0g2v0_Constantinople",
"modexp_modsize0_returndatasize_d0g3v0_Constantinople"
]
},
{
"reference": "9590",
"failing": "stSpecialTest",
"subtests": [
"push32withoutByte_d0g0v0_Constantinople"
]
}
],
"state":
[
{
"reference": "9590",
"failing": "stCreateTest",
"subtests": {
"CreateOOGafterInitCodeReturndata2": {
"subnumbers": ["2"],
"chain": "Constantinople (test)"
}
}
},
{
"reference": "9590",
"failing": "stCreate2Test",
"subtests": {
"RevertInCreateInInit": {
"subnumbers": ["1"],
"chain": "Constantinople (test)"
}
}
},
{
"reference": "9590",
"failing": "stEIP150Specific",
"subtests": {
"NewGasPriceForCodes": {
"subnumbers": ["1"],
"chain": "Constantinople (test)"
}
}
},
{
"reference": "9590",
"failing": "stInitCodeTest",
"subtests": {
"OutOfGasContractCreation": {
"subnumbers": ["4"],
"chain": "Constantinople (test)"
}
}
},
{
"reference": "9590",
"failing": "stPreCompiledContracts",
"subtests": {
"modexp": {
"subnumbers": ["*"],
"chain": "Constantinople (test)"
}
}
},
{
"reference": "9590",
"failing": "stRevertTest",
"subtests": {
"LoopCallsDepthThenRevert3": {
"subnumbers": ["1"],
"chain": "Constantinople (test)"
},
"RevertOpcodeCreate": {
"subnumbers": ["1"],
"chain": "Constantinople (test)"
},
"RevertSubCallStorageOOG2": {
"subnumbers": ["1","3"],
"chain": "Constantinople (test)"
},
"RevertDepthCreateOOG": {
"subnumbers": ["3","4"],
"chain": "Constantinople (test)"
},
"RevertOpcodeMultipleSubCalls": {
"subnumbers": ["*"],
"chain": "Constantinople (test)"
},
"RevertOpcodeDirectCall": {
"subnumbers": ["1","2"],
"chain": "Constantinople (test)"
},
"LoopCallsDepthThenRevert2": {
"subnumbers": ["1"],
"chain": "Constantinople (test)"
},
"RevertDepth2": {
"subnumbers": ["1"],
"chain": "Constantinople (test)"
},
"RevertRemoteSubCallStorageOOG2": {
"subnumbers": ["1","2"],
"chain": "Constantinople (test)"
},
"RevertDepthCreateAddressCollision": {
"subnumbers": ["3","4"],
"chain": "Constantinople (test)"
}
}
},
{
"reference": "9590",
"failing": "stStaticCall",
"subtests": {
"static_RevertDepth2": {
"subnumbers": ["1","3"],
"chain": "Constantinople (test)"
},
"static_CheckOpcodes4": {
"subnumbers": ["3"],
"chain": "Constantinople (test)"
},
"static_CheckOpcodes3": {
"subnumbers": ["5","6","7","8"],
"chain": "Constantinople (test)"
},
"static_callBasic": {
"subnumbers": ["1","2"],
"chain": "Constantinople (test)"
},
"static_CheckOpcodes2": {
"subnumbers": ["5","6","7","8"],
"chain": "Constantinople (test)"
},
"static_callCreate": {
"subnumbers": ["2"],
"chain": "Constantinople (test)"
}
}
},
{
"reference": "https://github.com/ethereum/tests/issues/512",
"failing": "stZeroKnowledge",
"subtests": {
"pointAddTrunc": {
"subnumbers": ["*"],
"chain": "Constantinople (test)"
},
"pointAdd": {
"subnumbers": ["*"],
"chain": "Constantinople (test)"
},
"pointMulAdd": {
"subnumbers": ["*"],
"chain": "Constantinople (test)"
},
"pointMulAdd2": {
"subnumbers": ["*"],
"chain": "Constantinople (test)"
}
}
},
{
"reference": "9590",
"failing": "stCreate2Test",
"subtests": {
"call_then_create2_successful_then_returndatasize": {
"subnumbers": ["1"],
"chain": "Constantinople (test)"
},
"returndatacopy_afterFailing_create": {
"subnumbers": ["1"],
"chain": "Constantinople (test)"
},
"create2checkFieldsInInitcode": {
"subnumbers": ["1","2","3","5","6","7"],
"chain": "Constantinople (test)"
},
"Create2Recursive": {
"subnumbers": ["*"],
"chain": "Constantinople (test)"
},
"create2collisionBalance": {
"subnumbers": ["2","3"],
"chain": "Constantinople (test)"
},
"create2InitCodes": {
"subnumbers": ["1","5","6","7","8","9"],
"chain": "Constantinople (test)"
},
"Create2OOGafterInitCode": {
"subnumbers": ["2"],
"chain": "Constantinople (test)"
},
"CreateMessageRevertedOOGInInit": {
"subnumbers": ["2"],
"chain": "Constantinople (test)"
},
"returndatacopy_following_revert_in_create": {
"subnumbers": ["*"],
"chain": "Constantinople (test)"
},
"create2collisionSelfdestructed": {
"subnumbers": ["2"],
"chain": "Constantinople (test)"
},
"returndatacopy_0_0_following_successful_create": {
"subnumbers": ["*"],
"chain": "Constantinople (test)"
},
"Create2OnDepth1023": {
"subnumbers": ["*"],
"chain": "Constantinople (test)"
},
"Create2OOGafterInitCodeReturndata2": {
"subnumbers": ["2"],
"chain": "Constantinople (test)"
},
"RevertOpcodeInCreateReturns": {
"subnumbers": ["*"],
"chain": "Constantinople (test)"
},
"CREATE2_ContractSuicideDuringInit_ThenStoreThenReturn": {
"subnumbers": ["*"],
"chain": "Constantinople (test)"
},
"returndatasize_following_successful_create": {
"subnumbers": ["*"],
"chain": "Constantinople (test)"
},
"call_outsize_then_create2_successful_then_returndatasize": {
"subnumbers": ["*"],
"chain": "Constantinople (test)"
},
"CreateMessageReverted": {
"subnumbers": ["2"],
"chain": "Constantinople (test)"
},
"CREATE2_Suicide": {
"subnumbers": ["*"],
"chain": "Constantinople (test)"
},
"Create2OOGafterInitCodeRevert": {
"subnumbers": ["*"],
"chain": "Constantinople (test)"
},
"Create2OnDepth1024": {
"subnumbers": ["*"],
"chain": "Constantinople (test)"
},
"create2collisionStorage": {
"subnumbers": ["2","3"],
"chain": "Constantinople (test)"
},
"create2callPrecompiles": {
"subnumbers": ["*"],
"chain": "Constantinople (test)"
}
}
}
]
}


@ -52,7 +52,7 @@ use encoded;
use engines::{EthEngine, EpochTransition, ForkChoice};
use error::{
ImportErrorKind, BlockImportErrorKind, ExecutionError, CallError, BlockError, ImportResult,
QueueError, QueueErrorKind, Error as EthcoreError
QueueError, QueueErrorKind, Error as EthcoreError, EthcoreResult,
};
use vm::{EnvInfo, LastHashes};
use evm::Schedule;
@ -296,7 +296,6 @@ impl Importer {
continue;
}
let raw = block.bytes.clone();
match self.check_and_lock_block(block, client) {
Ok(closed_block) => {
if self.engine.is_proposal(&header) {
@ -314,8 +313,8 @@ impl Importer {
}
},
Err(err) => {
self.bad_blocks.report(raw, format!("{:?}", err));
invalid_blocks.insert(header.hash());
self.bad_blocks.report(bytes, format!("{:?}", err));
invalid_blocks.insert(hash);
},
}
}
@ -356,7 +355,7 @@ impl Importer {
imported
}
fn check_and_lock_block(&self, block: PreverifiedBlock, client: &Client) -> Result<LockedBlock, ()> {
fn check_and_lock_block(&self, block: PreverifiedBlock, client: &Client) -> EthcoreResult<LockedBlock> {
let engine = &*self.engine;
let header = block.header.clone();
@ -364,7 +363,7 @@ impl Importer {
let best_block_number = client.chain.read().best_block_number();
if client.pruning_info().earliest_state > header.number() {
warn!(target: "client", "Block import failed for #{} ({})\nBlock is ancient (current best block: #{}).", header.number(), header.hash(), best_block_number);
return Err(());
bail!("Block is ancient");
}
// Check if parent is in chain
@ -372,7 +371,7 @@ impl Importer {
Some(h) => h,
None => {
warn!(target: "client", "Block import failed for #{} ({}): Parent not found ({}) ", header.number(), header.hash(), header.parent_hash());
return Err(());
bail!("Parent not found");
}
};
@ -391,13 +390,13 @@ impl Importer {
if let Err(e) = verify_family_result {
warn!(target: "client", "Stage 3 block verification failed for #{} ({})\nError: {:?}", header.number(), header.hash(), e);
return Err(());
bail!(e);
};
let verify_external_result = self.verifier.verify_block_external(&header, engine);
if let Err(e) = verify_external_result {
warn!(target: "client", "Stage 4 block verification failed for #{} ({})\nError: {:?}", header.number(), header.hash(), e);
return Err(());
bail!(e);
};
// Enact Verified Block
@ -417,9 +416,13 @@ impl Importer {
&mut chain.ancestry_with_metadata_iter(*header.parent_hash()),
);
let mut locked_block = enact_result.map_err(|e| {
let mut locked_block = match enact_result {
Ok(b) => b,
Err(e) => {
warn!(target: "client", "Block import failed for #{} ({})\nError: {:?}", header.number(), header.hash(), e);
})?;
bail!(e);
}
};
// Strip receipts for blocks before validate_receipts_transition,
// if the expected receipts root header does not match.
@ -433,7 +436,7 @@ impl Importer {
// Final Verification
if let Err(e) = self.verifier.verify_block_final(&header, locked_block.block().header()) {
warn!(target: "client", "Stage 5 block verification failed for #{} ({})\nError: {:?}", header.number(), header.hash(), e);
return Err(());
bail!(e);
}
Ok(locked_block)
@ -443,7 +446,7 @@ impl Importer {
///
/// The block is guaranteed to be the next best blocks in the
/// first block sequence. Does no sealing or transaction validation.
fn import_old_block(&self, unverified: Unverified, receipts_bytes: &[u8], db: &KeyValueDB, chain: &BlockChain) -> Result<(), ::error::Error> {
fn import_old_block(&self, unverified: Unverified, receipts_bytes: &[u8], db: &KeyValueDB, chain: &BlockChain) -> EthcoreResult<()> {
let receipts = ::rlp::decode_list(receipts_bytes);
let _import_lock = self.import_lock.lock();
@ -2353,14 +2356,13 @@ impl ProvingBlockChainClient for Client {
env_info.gas_limit = transaction.gas.clone();
let mut jdb = self.state_db.read().journal_db().boxed_clone();
state::prove_transaction(
state::prove_transaction_virtual(
jdb.as_hashdb_mut(),
header.state_root().clone(),
&transaction,
self.engine.machine(),
&env_info,
self.factories.clone(),
false,
)
}


@ -18,7 +18,7 @@ use engines::{Engine, Seal};
use parity_machine::{Machine, Transactions, TotalScoredHeader};
/// `InstantSeal` params.
#[derive(Debug, PartialEq)]
#[derive(Default, Debug, PartialEq)]
pub struct InstantSealParams {
/// Whether to use millisecond timestamp
pub millisecond_timestamp: bool,


@ -32,7 +32,7 @@ pub mod epoch;
pub use self::authority_round::AuthorityRound;
pub use self::basic_authority::BasicAuthority;
pub use self::epoch::{EpochVerifier, Transition as EpochTransition};
pub use self::instant_seal::InstantSeal;
pub use self::instant_seal::{InstantSeal, InstantSealParams};
pub use self::null_engine::NullEngine;
pub use self::tendermint::Tendermint;


@ -693,7 +693,6 @@ impl Engine<EthereumMachine> for Tendermint {
}
fn stop(&self) {
self.step_service.stop()
}
fn is_proposal(&self, header: &Header) -> bool {


@ -95,10 +95,11 @@ mod difficulty_test_foundation {
declare_test!{DifficultyTests_difficultyMainNetwork, "BasicTests/difficultyMainNetwork.json"}
}
mod difficulty_test_ropsten {
difficulty_json_test_nopath!(new_ropsten_test);
declare_test!{DifficultyTests_difficultyRopsten, "BasicTests/difficultyRopsten.json"}
}
// Disabling Ropsten diff tests; waiting for upstream ethereum/tests Constantinople update
//mod difficulty_test_ropsten {
// difficulty_json_test_nopath!(new_ropsten_test);
// declare_test!{DifficultyTests_difficultyRopsten, "BasicTests/difficultyRopsten.json"}
//}
mod difficulty_test_frontier {
difficulty_json_test_nopath!(new_frontier_test);


@ -33,7 +33,10 @@ use vm::{EnvInfo, CallType, ActionValue, ActionParams, ParamsType};
use builtin::Builtin;
use encoded;
use engines::{EthEngine, NullEngine, InstantSeal, BasicAuthority, AuthorityRound, Tendermint, DEFAULT_BLOCKHASH_CONTRACT};
use engines::{
EthEngine, NullEngine, InstantSeal, InstantSealParams, BasicAuthority,
AuthorityRound, Tendermint, DEFAULT_BLOCKHASH_CONTRACT
};
use error::Error;
use executive::Executive;
use factory::Factories;
@ -596,7 +599,8 @@ impl Spec {
match engine_spec {
ethjson::spec::Engine::Null(null) => Arc::new(NullEngine::new(null.params.into(), machine)),
ethjson::spec::Engine::Ethash(ethash) => Arc::new(::ethereum::Ethash::new(spec_params.cache_dir, ethash.params.into(), machine, spec_params.optimization_setting)),
ethjson::spec::Engine::InstantSeal(instant_seal) => Arc::new(InstantSeal::new(instant_seal.params.into(), machine)),
ethjson::spec::Engine::InstantSeal(Some(instant_seal)) => Arc::new(InstantSeal::new(instant_seal.params.into(), machine)),
ethjson::spec::Engine::InstantSeal(None) => Arc::new(InstantSeal::new(InstantSealParams::default(), machine)),
ethjson::spec::Engine::BasicAuthority(basic_authority) => Arc::new(BasicAuthority::new(basic_authority.params.into(), machine)),
ethjson::spec::Engine::AuthorityRound(authority_round) => AuthorityRound::new(authority_round.params.into(), machine)
.expect("Failed to start AuthorityRound consensus engine."),
@ -873,14 +877,13 @@ impl Spec {
data: d,
}.fake_sign(from);
let res = ::state::prove_transaction(
let res = ::state::prove_transaction_virtual(
db.as_hashdb_mut(),
*genesis.state_root(),
&tx,
self.engine.machine(),
&env_info,
factories.clone(),
true,
);
res.map(|(out, proof)| {


@ -224,17 +224,16 @@ pub fn check_proof(
}
}
/// Prove a transaction on the given state.
/// Prove a `virtual` transaction on the given state.
/// Returns `None` when the transaction could not be proved,
/// and a proof otherwise.
pub fn prove_transaction<H: AsHashDB<KeccakHasher> + Send + Sync>(
pub fn prove_transaction_virtual<H: AsHashDB<KeccakHasher> + Send + Sync>(
db: H,
root: H256,
transaction: &SignedTransaction,
machine: &Machine,
env_info: &EnvInfo,
factories: Factories,
virt: bool,
) -> Option<(Bytes, Vec<DBValue>)> {
use self::backend::Proving;
@ -252,7 +251,7 @@ pub fn prove_transaction<H: AsHashDB<KeccakHasher> + Send + Sync>(
};
let options = TransactOptions::with_no_tracing().dont_check_nonce().save_output_from_contract();
match state.execute(env_info, machine, transaction, options, virt) {
match state.execute(env_info, machine, transaction, options, true) {
Err(ExecutionError::Internal(_)) => None,
Err(e) => {
trace!(target: "state", "Proved call failed: {}", e);

View File

@ -465,19 +465,19 @@ impl<K: Kind> VerificationQueue<K> {
/// Add a block to the queue.
pub fn import(&self, input: K::Input) -> ImportResult {
let h = input.hash();
let hash = input.hash();
{
if self.processing.read().contains_key(&h) {
if self.processing.read().contains_key(&hash) {
bail!(ErrorKind::Import(ImportErrorKind::AlreadyQueued));
}
let mut bad = self.verification.bad.lock();
if bad.contains(&h) {
if bad.contains(&hash) {
bail!(ErrorKind::Import(ImportErrorKind::KnownBad));
}
if bad.contains(&input.parent_hash()) {
bad.insert(h.clone());
bad.insert(hash);
bail!(ErrorKind::Import(ImportErrorKind::KnownBad));
}
}
@ -486,21 +486,21 @@ impl<K: Kind> VerificationQueue<K> {
Ok(item) => {
self.verification.sizes.unverified.fetch_add(item.heap_size_of_children(), AtomicOrdering::SeqCst);
self.processing.write().insert(h.clone(), item.difficulty());
self.processing.write().insert(hash, item.difficulty());
{
let mut td = self.total_difficulty.write();
*td = *td + item.difficulty();
}
self.verification.unverified.lock().push_back(item);
self.more_to_verify.notify_all();
Ok(h)
Ok(hash)
},
Err(err) => {
match err {
// Don't mark future blocks as bad.
Error(ErrorKind::Block(BlockError::TemporarilyInvalid(_)), _) => {},
_ => {
self.verification.bad.lock().insert(h.clone());
self.verification.bad.lock().insert(hash);
}
}
Err(err)

View File

@ -507,8 +507,9 @@ impl ChainSync {
self.peers.clear();
}
/// Reset sync. Clear all downloaded data but keep the queue
fn reset(&mut self, io: &mut SyncIo) {
/// Reset sync. Clear all downloaded data but keep the queue.
/// Set sync state to the given state or to the initial state if `None` is provided.
fn reset(&mut self, io: &mut SyncIo, state: Option<SyncState>) {
self.new_blocks.reset();
let chain_info = io.chain().chain_info();
for (_, ref mut p) in &mut self.peers {
@ -520,7 +521,7 @@ impl ChainSync {
}
}
}
self.state = ChainSync::get_init_state(self.warp_sync, io.chain());
self.state = state.unwrap_or_else(|| ChainSync::get_init_state(self.warp_sync, io.chain()));
// Reactivate peers only if some progress has been made
// since the last sync round or if starting fresh.
self.active_peers = self.peers.keys().cloned().collect();
@ -534,7 +535,7 @@ impl ChainSync {
io.snapshot_service().abort_restore();
}
self.snapshot.clear();
self.reset(io);
self.reset(io, None);
self.continue_sync(io);
}
@ -699,7 +700,7 @@ impl ChainSync {
/// Called after all blocks have been downloaded
fn complete_sync(&mut self, io: &mut SyncIo) {
trace!(target: "sync", "Sync complete");
self.reset(io);
self.reset(io, Some(SyncState::Idle));
}
/// Enter waiting state
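A tiny, self-contained illustration of the `Option` + `unwrap_or_else` shape the new `reset` uses (plain stand-in types, not the real sync module): a caller-supplied state wins, and the initial-state computation only runs when `None` is passed.

```rust
// Stand-in types; `get_init_state` plays the role of
// `ChainSync::get_init_state(warp_sync, io.chain())`.
#[derive(Debug, PartialEq)]
enum SyncState { Idle, WaitingPeers }

fn get_init_state() -> SyncState {
    // Imagine this inspecting the chain and peer set.
    SyncState::WaitingPeers
}

fn next_state(forced: Option<SyncState>) -> SyncState {
    // Only computes the default when no state is forced.
    forced.unwrap_or_else(get_init_state)
}

fn main() {
    assert_eq!(next_state(Some(SyncState::Idle)), SyncState::Idle); // complete_sync path
    assert_eq!(next_state(None), SyncState::WaitingPeers);          // restart path
}
```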

View File

@ -26,7 +26,7 @@ pub enum Engine {
Null(NullEngine),
/// Instantly sealing engine.
#[serde(rename="instantSeal")]
InstantSeal(InstantSeal),
InstantSeal(Option<InstantSeal>),
/// Ethash engine.
Ethash(Ethash),
/// BasicAuthority engine.
@ -71,6 +71,17 @@ mod tests {
_ => panic!(),
};
let s = r#"{
"instantSeal": null
}"#;
let deserialized: Engine = serde_json::from_str(s).unwrap();
match deserialized {
Engine::InstantSeal(_) => {}, // instant seal is unit tested in its own file.
_ => panic!(),
};
let s = r#"{
"Ethash": {
"params": {

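The reason the variant payload becomes `Option<InstantSeal>`: in a serde externally tagged enum, a `null` payload deserializes to `None`, so a bare `"instantSeal": null` spec now parses instead of erroring. A simplified, self-contained sketch (illustrative types, not the real ethjson definitions; assumes `serde` with the `derive` feature plus `serde_json`):

```rust
use serde::Deserialize;

#[derive(Debug, Deserialize, PartialEq)]
struct InstantSealParams {
    #[serde(default)]
    millisecond_timestamp: bool,
}

#[derive(Debug, Deserialize, PartialEq)]
struct InstantSeal {
    params: InstantSealParams,
}

#[derive(Debug, Deserialize, PartialEq)]
enum Engine {
    #[serde(rename = "instantSeal")]
    InstantSeal(Option<InstantSeal>),
}

fn main() {
    // A bare null payload maps to `None`, i.e. "use default params".
    let e: Engine = serde_json::from_str(r#"{ "instantSeal": null }"#).unwrap();
    assert_eq!(e, Engine::InstantSeal(None));

    // An explicit params object still lands in `Some(..)`.
    let e: Engine = serde_json::from_str(
        r#"{ "instantSeal": { "params": { "millisecond_timestamp": true } } }"#,
    ).unwrap();
    assert!(matches!(e, Engine::InstantSeal(Some(_))));
}
```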
View File

@ -188,7 +188,7 @@ impl LightFetch {
}
/// Helper for getting proved execution.
pub fn proved_execution(&self, req: CallRequest, num: Trailing<BlockNumber>) -> impl Future<Item = ExecutionResult, Error = Error> + Send {
pub fn proved_read_only_execution(&self, req: CallRequest, num: Trailing<BlockNumber>) -> impl Future<Item = ExecutionResult, Error = Error> + Send {
const DEFAULT_GAS_PRICE: u64 = 21_000;
// starting gas when gas not provided.
const START_GAS: u64 = 50_000;
@ -247,19 +247,20 @@ impl LightFetch {
}).join(header_fut).and_then(move |((gas_known, tx), hdr)| {
// then request proved execution.
// TODO: get last-hashes from network.
let env_info = match client.env_info(id) {
let hash = hdr.hash();
let env_info = match client.env_info(BlockId::Hash(hash)) {
Some(env_info) => env_info,
_ => return Either::A(future::err(errors::unknown_block())),
};
Either::B(execute_tx(gas_known, ExecuteParams {
from: from,
tx: tx,
hdr: hdr,
env_info: env_info,
Either::B(execute_read_only_tx(gas_known, ExecuteParams {
from,
tx,
hdr,
env_info,
engine: client.engine().clone(),
on_demand: on_demand,
sync: sync,
on_demand,
sync,
}))
}))
}
@ -598,10 +599,10 @@ struct ExecuteParams {
// has a peer execute the transaction with given params. If `gas_known` is false,
// this will double the gas on each `OutOfGas` error.
fn execute_tx(gas_known: bool, params: ExecuteParams) -> impl Future<Item = ExecutionResult, Error = Error> + Send {
fn execute_read_only_tx(gas_known: bool, params: ExecuteParams) -> impl Future<Item = ExecutionResult, Error = Error> + Send {
if !gas_known {
Box::new(future::loop_fn(params, |mut params| {
execute_tx(true, params.clone()).and_then(move |res| {
execute_read_only_tx(true, params.clone()).and_then(move |res| {
match res {
Ok(executed) => {
// TODO: how to distinguish between actual OOG and
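The substantive fix in this hunk is pinning `env_info` to the already-resolved header's hash instead of re-resolving the block id (which may be `Latest`), so the header and the execution environment cannot come from two different heads if a new block lands in between. A minimal sketch of that shape with stand-in types (not the parity light-client API):

```rust
// Stand-in chain; `env_info_by_hash` plays the role of
// `client.env_info(BlockId::Hash(hash))` in the diff above.
#[derive(Clone, Debug)]
struct Header { hash: u64, number: u64 }

struct Chain { headers: Vec<Header> }

impl Chain {
    fn latest(&self) -> Header {
        self.headers.last().cloned().expect("non-empty chain")
    }
    fn env_info_by_hash(&self, hash: u64) -> Option<u64> {
        self.headers.iter().find(|h| h.hash == hash).map(|h| h.number)
    }
}

fn main() {
    let chain = Chain { headers: vec![Header { hash: 0xaa, number: 1 }] };

    // Racy shape: two independent "latest" lookups can straddle a new import
    // and silently refer to different blocks.
    //
    // Fixed shape: resolve once, then address everything by the concrete hash.
    let hdr = chain.latest();
    let env_number = chain
        .env_info_by_hash(hdr.hash)
        .expect("hash was observed a moment ago");
    println!("executing against block #{} ({:#x})", env_number, hdr.hash);
}
```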

View File

@ -398,7 +398,7 @@ impl<T: LightChainClient + 'static> Eth for EthClient<T> {
}
fn call(&self, req: CallRequest, num: Trailing<BlockNumber>) -> BoxFuture<Bytes> {
Box::new(self.fetcher().proved_execution(req, num).and_then(|res| {
Box::new(self.fetcher().proved_read_only_execution(req, num).and_then(|res| {
match res {
Ok(exec) => Ok(exec.output.into()),
Err(e) => Err(errors::execution(e)),
@ -408,7 +408,7 @@ impl<T: LightChainClient + 'static> Eth for EthClient<T> {
fn estimate_gas(&self, req: CallRequest, num: Trailing<BlockNumber>) -> BoxFuture<RpcU256> {
// TODO: binary chop for more accurate estimates.
Box::new(self.fetcher().proved_execution(req, num).and_then(|res| {
Box::new(self.fetcher().proved_read_only_execution(req, num).and_then(|res| {
match res {
Ok(exec) => Ok((exec.refunded + exec.gas_used).into()),
Err(e) => Err(errors::execution(e)),

View File

@ -1,12 +0,0 @@
#!/usr/bin/env bash
set -e # fail on any error
set -u # treat unset variables as error
cargo build -j $(nproc) --release --features final $CARGOFLAGS
git clone https://github.com/paritytech/parity-import-tests
cp target/release/parity parity-import-tests/aura/parity
cd parity-import-tests/aura
echo "Start Aura test"
./parity import blocks.rlp --chain chain.json
./parity restore snap --chain chain.json
echo "Aura test complete"

View File

@ -15,8 +15,17 @@ RUN apt autoremove -y
RUN apt clean -y
RUN rm -rf /tmp/* /var/tmp/* /var/lib/apt/lists/*
RUN groupadd -g 1000 parity \
&& useradd -m -u 1000 -g parity -s /bin/sh parity
USER parity
WORKDIR /home/parity
ENV PATH "~/bin:${PATH}"
#add TARGET to docker image
COPY artifacts/x86_64-unknown-linux-gnu/$TARGET /usr/bin/$TARGET
COPY artifacts/x86_64-unknown-linux-gnu/$TARGET ./bin/$TARGET
# Build a shell script because the ENTRYPOINT command doesn't like using ENV
RUN echo "#!/bin/bash \n ${TARGET} \$@" > ./entrypoint.sh

View File

@ -1,54 +0,0 @@
#!/usr/bin/env bash
clone_repos() {
git clone https://github.com/parity-js/jsonrpc.git jsonrpc
git clone https://github.com/paritytech/wiki.git wiki
}
build_docs() {
npm install
npm run build:markdown
}
update_wiki_docs() {
for file in $(ls jsonrpc/docs); do
module_name=${file:0:-3}
mv jsonrpc/docs/$file wiki/JSONRPC-$module_name-module.md
done
}
set_remote_wiki() {
git config remote.origin.url "https://${GITHUB_TOKEN}@github.com/paritytech/wiki.git"
}
setup_git() {
git config --global user.email "devops@parity.com"
git config --global user.name "Devops Parity"
}
commit_files() {
git checkout -b rpcdoc-update-${CI_COMMIT_REF_NAME}
git add .
git commit -m "Update docs to ${CI_COMMIT_REF_NAME}"
git tag -a "${CI_COMMIT_REF_NAME}" -m "Updated to ${CI_COMMIT_REF_NAME}"
}
upload_files() {
git push origin HEAD
git push --tags
}
PROJECT_DIR=$(pwd)
setup_git
cd ..
clone_repos
cp -r $PROJECT_DIR jsonrpc/.parity
cd jsonrpc
build_docs
cd ..
update_wiki_docs
cd wiki
set_remote_wiki
commit_files
upload_files

View File

@ -2,13 +2,19 @@
set -e # fail on any error
set -u # treat unset variables as error
echo "__________Show ENVIRONMENT__________"
echo "CI_SERVER_NAME: " $CI_SERVER_NAME
echo "CARGO_HOME: " $CARGO_HOME
echo "BUILD_TARGET: " $BUILD_TARGET
echo "BUILD_ARCH: " $BUILD_ARCH
echo "CARGO_TARGET: " $CARGO_TARGET
echo "CC: " $CC
echo "CXX: " $CXX
echo "__________CARGO CONFIG__________"
rm -rf .cargo
mkdir -p .cargo
rm -f .cargo/config
echo "[target.$CARGO_TARGET]" >> .cargo/config
echo "linker= \"$CC\"" >> .cargo/config
cat .cargo/config
@ -26,10 +32,15 @@ mkdir -p artifacts
cd artifacts
mkdir -p $CARGO_TARGET
cd $CARGO_TARGET
cp ../../target/$CARGO_TARGET/release/{parity,parity-evm,ethstore,ethkey,whisper} .
cp ../../target/$CARGO_TARGET/release/parity ./parity
cp ../../target/$CARGO_TARGET/release/parity-evm ./parity-evm
cp ../../target/$CARGO_TARGET/release/ethstore ./ethstore
cp ../../target/$CARGO_TARGET/release/ethkey ./ethkey
cp ../../target/$CARGO_TARGET/release/whisper ./whisper
strip -v ./*
echo "_____ Calculating checksums _____"
for binary in $(ls)
do
rhash --sha256 $binary -o $binary.sha256
./parity tools hash $binary > $binary.sha3
done

View File

@ -6,18 +6,27 @@ set INCLUDE="C:\Program Files (x86)\Microsoft SDKs\Windows\v7.1A\Include;C:\vs20
set LIB="C:\vs2015\VC\lib;C:\Program Files (x86)\Windows Kits\10\Lib\10.0.10240.0\ucrt\x64"
rustup default stable-x86_64-pc-windows-msvc
echo "_____ Building _____"
echo "__________Show ENVIRONMENT__________"
echo "CI_SERVER_NAME: " $CI_SERVER_NAME
echo "CARGO_HOME: " $CARGO_HOME
echo "BUILD_TARGET: " $BUILD_TARGET
echo "BUILD_ARCH: " $BUILD_ARCH
echo "CARGO_TARGET: " $CARGO_TARGET
echo "_____ Building target: "$CARGO_TARGET" _____"
time cargo build --target $CARGO_TARGET --release --features final
time cargo build --target $CARGO_TARGET --release -p evmbin
time cargo build --target $CARGO_TARGET --release -p ethstore-cli
time cargo build --target $CARGO_TARGET --release -p ethkey-cli
time cargo build --target $CARGO_TARGET --release -p whisper-cli
echo "__________Sign binaries__________"
scripts/gitlab/sign.cmd $keyfile $certpass target/$CARGO_TARGET/release/parity.exe
scripts/gitlab/sign.cmd $keyfile $certpass target/$CARGO_TARGET/release/parity-evm.exe
scripts/gitlab/sign.cmd $keyfile $certpass target/$CARGO_TARGET/release/ethstore.exe
scripts/gitlab/sign.cmd $keyfile $certpass target/$CARGO_TARGET/release/ethkey.exe
scripts/gitlab/sign.cmd $keyfile $certpass target/$CARGO_TARGET/release/whisper.exe
scripts/gitlab/sign-win.cmd $keyfile $certpass target/$CARGO_TARGET/release/parity.exe
scripts/gitlab/sign-win.cmd $keyfile $certpass target/$CARGO_TARGET/release/parity-evm.exe
scripts/gitlab/sign-win.cmd $keyfile $certpass target/$CARGO_TARGET/release/ethstore.exe
scripts/gitlab/sign-win.cmd $keyfile $certpass target/$CARGO_TARGET/release/ethkey.exe
scripts/gitlab/sign-win.cmd $keyfile $certpass target/$CARGO_TARGET/release/whisper.exe
echo "_____ Post-processing binaries _____"
rm -rf artifacts
@ -25,11 +34,17 @@ mkdir -p artifacts
cd artifacts
mkdir -p $CARGO_TARGET
cd $CARGO_TARGET
cp --verbose ../../target/$CARGO_TARGET/release/{parity.exe,parity-evm.exe,ethstore.exe,ethkey.exe,whisper.exe} .
cp --verbose ../../target/$CARGO_TARGET/release/parity.exe ./parity.exe
cp --verbose ../../target/$CARGO_TARGET/release/parity-evm.exe ./parity-evm.exe
cp --verbose ../../target/$CARGO_TARGET/release/ethstore.exe ./ethstore.exe
cp --verbose ../../target/$CARGO_TARGET/release/ethkey.exe ./ethkey.exe
cp --verbose ../../target/$CARGO_TARGET/release/whisper.exe ./whisper.exe
echo "_____ Calculating checksums _____"
for binary in $(ls)
do
rhash --sha256 $binary -o $binary.sha256
./parity.exe tools hash $binary > $binary.sha3
done
cp parity.exe.sha256 parity.sha256
cp parity.exe.sha3 parity.sha3

View File

@ -3,5 +3,5 @@
set -e # fail on any error
set -u # treat unset variables as error
cargo install clippy
cargo clippy -- -D warnings
CARGO_TARGET_DIR=./target cargo +stable install cargo-audit
cargo +stable audit

View File

@ -1,20 +0,0 @@
#!/bin/bash
set -x
git submodule update --init --recursive
rm -rf target/*
cargo test --all --exclude evmjit --no-run -- --test-threads 8|| exit $?
KCOV_TARGET="target/cov"
KCOV_FLAGS="--verify"
mkdir -p $KCOV_TARGET
echo "__________Cover RUST___________"
for FILE in `find target/debug/deps ! -name "*.*" -type f`
do
timeout --signal=SIGKILL 5m kcov --include-path=$(pwd) --exclude-path=$(pwd)/target $KCOV_FLAGS $KCOV_TARGET $FILE
done
timeout --signal=SIGKILL 5m kcov --include-path=$(pwd) --exclude-path=$(pwd)/target $KCOV_FLAGS $KCOV_TARGET target/debug/parity-*
bash <(curl -s https://codecov.io/bash) &&
echo "Uploaded code coverage"
exit 0

View File

@ -43,6 +43,7 @@ commit_files() {
upload_files() {
echo "__________Upload files__________"
git push origin HEAD
git push --tags
}

View File

@ -1,8 +0,0 @@
echo "Parity Wallet
=============
Welcome to Parity Wallet, your all-in-one Ethereum node and wallet.
If you continue, Parity will be installed as a user service. You will be able to use the Parity Wallet through your browser by using the menu bar icon, following the shortcut in the Launchpad or navigating to http://localhost:8180/ in your browser.
Parity is distributed under the terms of the GPL."

View File

@ -1,24 +0,0 @@
#!/bin/bash
set -e # fail on any error
set -u # treat unset variables as error
case ${CI_COMMIT_REF_NAME} in
nightly|*v2.2*) export GRADE="devel";;
beta|*v2.1*) export GRADE="stable";;
stable|*v2.0*) export GRADE="stable";;
*) echo "No release" exit 0;;
esac
SNAP_PACKAGE="parity_"$VERSION"_"$BUILD_ARCH".snap"
echo "__________Create snap package__________"
echo "Release channel :" $GRADE " Branch/tag: " $CI_COMMIT_REF_NAME
echo $VERSION:$GRADE:$BUILD_ARCH
cat scripts/gitlab/templates/snapcraft.template.yaml | envsubst '$VERSION:$GRADE:$BUILD_ARCH:$CARGO_TARGET' > snapcraft.yaml
cat snapcraft.yaml
snapcraft --target-arch=$BUILD_ARCH
ls *.snap
echo "__________Post-processing snap package__________"
mkdir -p artifacts
mv -v $SNAP_PACKAGE "artifacts/"$SNAP_PACKAGE
echo "_____ Calculating checksums _____"
cd artifacts
rhash --sha256 $SNAP_PACKAGE -o $SNAP_PACKAGE".sha256"

scripts/gitlab/publish-awss3.sh (new executable file, 49 lines)
View File

@ -0,0 +1,49 @@
#!/bin/bash
set -e # fail on any error
set -u # treat unset variables as error
echo "__________Register Release__________"
DATA="secret=$RELEASES_SECRET"
echo "Pushing release to Mainnet"
./scripts/gitlab/safe-curl.sh $DATA "http://update.parity.io:1337/push-release/$CI_COMMIT_REF_NAME/$CI_COMMIT_SHA"
echo "Pushing release to Kovan"
./scripts/gitlab/safe-curl.sh $DATA "http://update.parity.io:1338/push-release/$CI_COMMIT_REF_NAME/$CI_COMMIT_SHA"
cd artifacts
ls -l | sort -k9
filetest=( * )
echo ${filetest[*]}
for DIR in "${filetest[@]}";
do
cd $DIR
if [[ $DIR =~ "windows" ]];
then
WIN=".exe";
else
WIN="";
fi
sha3=$(cat parity.sha3 | awk '{print $1}')
case $DIR in
x86_64* )
DATA="commit=$CI_COMMIT_SHA&sha3=$sha3&filename=parity$WIN&secret=$RELEASES_SECRET"
../../scripts/gitlab/safe-curl.sh $DATA "http://update.parity.io:1337/push-build/$CI_COMMIT_REF_NAME/$DIR"
# Kovan
../../scripts/gitlab/safe-curl.sh $DATA "http://update.parity.io:1338/push-build/$CI_COMMIT_REF_NAME/$DIR"
;;
esac
cd ..
done
echo "__________Push binaries to AWS S3____________"
aws configure set aws_access_key_id $s3_key
aws configure set aws_secret_access_key $s3_secret
if [[ "$CI_COMMIT_REF_NAME" = "beta" || "$CI_COMMIT_REF_NAME" = "stable" || "$CI_COMMIT_REF_NAME" = "nightly" ]];
then
export S3_BUCKET=builds-parity-published;
else
export S3_BUCKET=builds-parity;
fi
aws s3 sync ./ s3://$S3_BUCKET/$CI_COMMIT_REF_NAME/

View File

@ -4,8 +4,8 @@ set -e # fail on any error
set -u # treat unset variables as error
if [ "$CI_COMMIT_REF_NAME" == "master" ];
then export DOCKER_BUILD_TAG="latest";
else export DOCKER_BUILD_TAG=$CI_COMMIT_REF_NAME;
then export DOCKER_BUILD_TAG="latest";
else export DOCKER_BUILD_TAG=$CI_COMMIT_REF_NAME;
fi
docker login -u $Docker_Hub_User_Parity -p $Docker_Hub_Pass_Parity
@ -17,6 +17,6 @@ export DOCKER_TARGET=$1
echo $DOCKER_TARGET
echo "__________Docker build and push__________"
docker build --build-arg TARGET=$DOCKER_TARGET --no-cache=true --tag parity/$DOCKER_TARGET:$DOCKER_BUILD_TAG -f docker/hub/Dockerfile .
docker build --build-arg TARGET=$DOCKER_TARGET --no-cache=true --tag parity/$DOCKER_TARGET:$DOCKER_BUILD_TAG -f scripts/docker/hub/Dockerfile .
docker push parity/$DOCKER_TARGET:$DOCKER_BUILD_TAG
docker logout

View File

@ -1,18 +0,0 @@
#!/bin/bash
set -e # fail on any error
set -u # treat unset variables as error
case ${CI_COMMIT_REF_NAME} in
nightly|*v2.2*) export CHANNEL="edge";;
beta|*v2.1*) export CHANNEL="beta";;
stable|*v2.0*) export CHANNEL="stable";;
*) echo "No release" exit 0;;
esac
echo "Release channel :" $CHANNEL " Branch/tag: " $CI_COMMIT_REF_NAME
echo $SNAPCRAFT_LOGIN_PARITY_BASE64 | base64 --decode > snapcraft.login
snapcraft login --with snapcraft.login
snapcraft push --release $CHANNEL "artifacts/parity_"$VERSION"_"$BUILD_ARCH".snap"
snapcraft status parity
snapcraft logout

View File

@ -1,67 +0,0 @@
#!/bin/bash
set -e # fail on any error
set -u # treat unset variables as error
updater_push_release () {
echo "push release"
# Mainnet
}
echo "__________Set ENVIROMENT__________"
DESCRIPTION="$(cat CHANGELOG.md)"
RELEASE_TABLE="$(cat scripts/gitlab/templates/release-table.md)"
RELEASE_TABLE="$(echo "${RELEASE_TABLE//\$VERSION/${VERSION}}")"
#The text in the file CANGELOG.md before which the table with links is inserted. Must be present in this file necessarily
REPLACE_TEXT="The full list of included changes:"
case ${CI_COMMIT_REF_NAME} in
nightly|*v2.2*) NAME="Parity "$VERSION" nightly";;
beta|*v2.1*) NAME="Parity "$VERSION" beta";;
stable|*v2.0*) NAME="Parity "$VERSION" stable";;
*) echo "No release" exit 0;;
esac
cd artifacts
ls -l | sort -k9
filetest=( * )
echo ${filetest[*]}
for DIR in "${filetest[@]}";
do
cd $DIR
if [[ $DIR == "*windows*" ]];
then
WIN=".exe";
else
WIN="";
fi
for binary in $(ls parity.sha256)
do
sha256=$(cat $binary | awk '{ print $1}' )
case $DIR in
x86_64* )
DATA="commit=$CI_BUILD_REF&sha3=$sha256&filename=parity$WIN&secret=$RELEASES_SECRET"
../../scripts/gitlab/safe_curl.sh $DATA "http://update.parity.io:1337/push-build/$CI_BUILD_REF_NAME/$DIR"
# Kovan
../../scripts/gitlab/safe_curl.sh $DATA "http://update.parity.io:1338/push-build/$CI_BUILD_REF_NAME/$DIR"
;;
esac
RELEASE_TABLE="$(echo "${RELEASE_TABLE/sha$DIR/${sha256}}")"
done
cd ..
done
#do not touch the following 3 lines. Features of output in Markdown
DESCRIPTION="$(echo "${DESCRIPTION/${REPLACE_TEXT}/${RELEASE_TABLE}
${REPLACE_TEXT}}")"
echo "$DESCRIPTION"
if [[ "$CI_COMMIT_REF_NAME" == "nightly" ]]; then DESCRIPTION=""; fi #TODO in the future, we need to prepare a script that will do changelog
echo "__________Create release to Github____________"
github-release release --user devops-parity --repo parity-ethereum --tag "$CI_COMMIT_REF_NAME" --draft --name "$NAME" --description "$DESCRIPTION"
echo "__________Push binaries to AWS S3____________"
aws configure set aws_access_key_id $s3_key
aws configure set aws_secret_access_key $s3_secret
if [[ "$CI_BUILD_REF_NAME" = "beta" || "$CI_BUILD_REF_NAME" = "stable" || "$CI_BUILD_REF_NAME" = "nightly" ]];
then
export S3_BUCKET=builds-parity-published;
else
export S3_BUCKET=builds-parity;
fi
aws s3 sync ./ s3://$S3_BUCKET/$CI_BUILD_REF_NAME/

View File

@ -1,7 +0,0 @@
#!/bin/bash
set -e # fail on any error
set -u # treat unset variables as error
cargo install rustfmt-nightly
cargo fmt -- --write-mode=diff

View File

@ -1,16 +0,0 @@
| OS | Arch | Download | SHA256 Checksum |
|:---:|:---:|:---|:---|
| linux | arm64 | [parity](https://releases.parity.io/$VERSION/aarch64-unknown-linux-gnu/parity) | <sup>`shaaarch64-unknown-linux-gnu`</sup> |
| android | armv7 | [parity](https://releases.parity.io/$VERSION/armv7-linux-androideabi/parity) | <sup>`shaarmv7-linux-androideabi`</sup> |
| linux | armv7 | [parity](https://releases.parity.io/$VERSION/armv7-unknown-linux-gnueabihf/parity) | <sup>`shaarmv7-unknown-linux-gnueabihf`</sup> |
| linux | i686 | [parity](https://releases.parity.io/$VERSION/i686-unknown-linux-gnu/parity) | <sup>`shai686-unknown-linux-gnu`</sup> |
| osx | x64 | [parity](https://releases.parity.io/$VERSION/x86_64-apple-darwin/parity) | <sup>`shax86_64-apple-darwin`</sup> |
| windows | x64 | [parity.exe](https://releases.parity.io/$VERSION/x86_64-pc-windows-msvc/parity.exe) | <sup>`shax86_64-pc-windows-msvc`</sup> |
| linux | x64 | [parity](https://releases.parity.io/$VERSION/x86_64-unknown-linux-gnu/parity) | <sup>`shax86_64-unknown-linux-gnu`</sup> |
| OS | Alternative | Link |
|:---:|:---:|:---|
| <img src="https://gist.github.com/5chdn/1fce888fde1d773761f809b607757f76/raw/44c4f0fc63f1ea8e61a9513af5131ef65eaa6c75/apple.png" alt="Apple Icon by Pixel Perfect from https://www.flaticon.com/authors/pixel-perfect" style="width: 32px;"/> | Homebrew |[github.com/paritytech/homebrew-paritytech/blob/master/README.md](https://github.com/paritytech/homebrew-paritytech/blob/master/README.md) |
| <img src="https://gist.github.com/5chdn/1fce888fde1d773761f809b607757f76/raw/44c4f0fc63f1ea8e61a9513af5131ef65eaa6c75/linux.png" alt="Linux Icon by Pixel Perfect from https://www.flaticon.com/authors/pixel-perfect" style="width: 32px;"/> | Snapcraft | [snapcraft.io/parity](https://snapcraft.io/parity/) |
| <img src="https://gist.github.com/5chdn/1fce888fde1d773761f809b607757f76/raw/44c4f0fc63f1ea8e61a9513af5131ef65eaa6c75/settings.png" alt="Settings Icon by Pixel Perfect from https://www.flaticon.com/authors/pixel-perfect" style="width: 32px;"/> | Docker | [hub.docker.com/r/parity/parity](https://hub.docker.com/r/parity/parity) |
| <img src="https://gist.github.com/5chdn/1fce888fde1d773761f809b607757f76/raw/44c4f0fc63f1ea8e61a9513af5131ef65eaa6c75/settings.png" alt="Settings Icon by Pixel Perfect from https://www.flaticon.com/authors/pixel-perfect" style="width: 32px;"/> | Other binaries | [vanity-service.parity.io/parity-binaries?format=markdown&version=$VERSION](https://vanity-service.parity.io/parity-binaries?format=markdown&version=$VERSION) |

View File

@ -1,58 +0,0 @@
name: parity
version: $VERSION
architectures: [$BUILD_ARCH]
grade: $GRADE
confinement: strict
summary: Fast, light, robust Ethereum implementation
description: |
Parity's goal is to be the fastest, lightest, and most secure Ethereum
client. We are developing Parity using the sophisticated and cutting-edge
Rust programming language. Parity is licensed under the GPLv3, and can be
used for all your Ethereum needs.
apps:
parity:
command: parity
plugs: [home, network, network-bind, mount-observe, x11, unity7, desktop, desktop-legacy, wayland]
desktop: usr/share/applications/parity.desktop
parity-evm:
command: parity-evm
plugs: [home, network, network-bind]
ethkey:
command: ethkey
plugs: [home]
ethstore:
command: ethstore
plugs: [home]
whisper:
command: whisper
plugs: [home]
icon: snap/gui/icon.png
parts:
desktop-icon:
source: ./snap
plugin: nil
override-build: |
mkdir -p $SNAPCRAFT_PART_INSTALL/usr/share/applications
mkdir -p $SNAPCRAFT_PART_INSTALL/usr/share/pixmaps
cp -v gui/parity.desktop $SNAPCRAFT_PART_INSTALL/usr/share/applications/
cp -v gui/icon.png $SNAPCRAFT_PART_INSTALL/usr/share/pixmaps/
parity:
source: ./artifacts/$CARGO_TARGET
plugin: nil
override-build: |
mkdir -p $SNAPCRAFT_PART_INSTALL/usr/bin
cp -v parity $SNAPCRAFT_PART_INSTALL/usr/bin/parity
cp -v parity-evm $SNAPCRAFT_PART_INSTALL/usr/bin/parity-evm
cp -v ethkey $SNAPCRAFT_PART_INSTALL/usr/bin/ethkey
cp -v ethstore $SNAPCRAFT_PART_INSTALL/usr/bin/ethstore
cp -v whisper $SNAPCRAFT_PART_INSTALL/usr/bin/whisper
stage-packages: [libc6, libssl1.0.0, libudev1, libstdc++6, cmake]
df:
plugin: nil
stage-packages: [coreutils]
stage: [bin/df]

scripts/gitlab/test-all.sh (new executable file, 35 lines)
View File

@ -0,0 +1,35 @@
#!/bin/bash
# ARGUMENT $1 Rust flavor to test with (stable/beta/nightly)
set -e # fail on any error
set -u # treat unset variables as error
git log --graph --oneline --all --decorate=short -n 10
case $CI_COMMIT_REF_NAME in
(beta|stable)
export GIT_COMPARE=$CI_COMMIT_REF_NAME~
;;
(master|nightly)
export GIT_COMPARE=master~
;;
(*)
export GIT_COMPARE=master
;;
esac
export RUST_FILES_MODIFIED="$(git --no-pager diff --name-only $GIT_COMPARE...$CI_COMMIT_SHA | grep -v -e ^\\. -e ^LICENSE -e ^README.md -e ^CHANGELOG.md -e ^test.sh -e ^scripts/ -e ^docs/ -e ^docker/ -e ^snap/ | wc -l | tr -d ' ')"
echo "RUST_FILES_MODIFIED: $RUST_FILES_MODIFIED"
if [ "${RUST_FILES_MODIFIED}" = "0" ]
then
echo "__________Skipping Rust tests since no Rust files modified__________";
exit 0
fi
rustup default $1
git submodule update --init --recursive
rustup show
exec ./test.sh

View File

@ -1,28 +0,0 @@
#!/bin/bash
# ARGUMENT $1 Rust flavor to test with (stable/beta/nightly)
set -e # fail on any error
set -u # treat unset variables as error
rustup default $1
if [[ "$CI_COMMIT_REF_NAME" = "beta" || "$CI_COMMIT_REF_NAME" = "stable" ]]; then
export GIT_COMPARE=$CI_COMMIT_REF_NAME~;
else
export GIT_COMPARE=master;
fi
export RUST_FILES_MODIFIED="$(git --no-pager diff --name-only $GIT_COMPARE...$CI_COMMIT_SHA | grep -v -e ^\\. -e ^LICENSE -e ^README.md -e ^test.sh -e ^windows/ -e ^scripts/ -e ^mac/ -e ^nsis/ | wc -l)"
echo "RUST_FILES_MODIFIED: $RUST_FILES_MODIFIED"
git submodule update --init --recursive
rustup show
if [[ "${RUST_FILES_MODIFIED}" == "0" ]];
then echo "__________Skipping Rust tests since no Rust files modified__________";
else ./test.sh || exit $?;
fi
# if [[ "$CI_COMMIT_REF_NAME" == "nightly" ]];
# ### @TODO re-enable fail after https://github.com/paritytech/parity-import-tests/issues/3
# then sh scripts/aura-test.sh; # || exit $?;
# fi

Binary image file removed (was 4.9 KiB); contents not shown.

View File

@ -1,8 +0,0 @@
[Desktop Entry]
Type=Application
Encoding=UTF-8
Name=Parity Ethereum
Comment=The fastest and most advanced Ethereum client.
Exec=parity
Icon=/usr/share/pixmaps/icon.png
Terminal=true

test.sh (78 changed lines)
View File

@ -4,6 +4,7 @@
FEATURES="json-tests"
OPTIONS="--release"
VALIDATE=1
THREADS=8
case $1 in
--no-json)
@ -29,31 +30,72 @@ esac
set -e
if [ "$VALIDATE" -eq "1" ]; then
# Validate --no-default-features build
echo "________Validate build________"
time cargo check --no-default-features
time cargo check --manifest-path util/io/Cargo.toml --no-default-features
time cargo check --manifest-path util/io/Cargo.toml --features "mio"
# Validate chainspecs
echo "________Validate chainspecs________"
time ./scripts/validate_chainspecs.sh
fi
validate () {
if [ "$VALIDATE" -eq "1" ]
then
echo "________Validate build________"
time cargo check $@ --no-default-features
time cargo check $@ --manifest-path util/io/Cargo.toml --no-default-features
time cargo check $@ --manifest-path util/io/Cargo.toml --features "mio"
# Running the C++ example
echo "________Running the C++ example________"
cd parity-clib-examples/cpp && \
# Validate chainspecs
echo "________Validate chainspecs________"
time ./scripts/validate_chainspecs.sh
else
echo "# not validating due to \$VALIDATE!=1"
fi
}
cpp_test () {
case $CARGO_TARGET in
(x86_64-unknown-linux-gnu)
# Running the C++ example
echo "________Running the C++ example________"
cd parity-clib-examples/cpp && \
mkdir -p build && \
cd build && \
cmake .. && \
make -j 8 && \
make -j $THREADS && \
./parity-example && \
cd .. && \
rm -rf build && \
cd ../..
;;
(*)
echo "________Skipping the C++ example________"
;;
esac
}
cargo_test () {
echo "________Running Parity Full Test Suite________"
git submodule update --init --recursive
time cargo test $OPTIONS --features "$FEATURES" --all $@ -- --test-threads $THREADS
}
if [ "$CARGO_TARGET" ]
then
validate --target $CARGO_TARGET
else
validate
fi
test "${RUN_TESTS}" = "all" && cpp_test
if [ "$CARGO_TARGET" ]
then
case "${RUN_TESTS}" in
(cargo|all)
cargo_test --target $CARGO_TARGET $@
;;
('')
cargo_test --no-run --target $CARGO_TARGET $@
;;
esac
else
cargo_test $@
fi
# Running tests
echo "________Running Parity Full Test Suite________"
git submodule update --init --recursive
time cargo test $OPTIONS --features "$FEATURES" --all $1 -- --test-threads 8

View File

@ -9,7 +9,7 @@ authors = ["Parity Technologies <admin@parity.io>"]
[dependencies]
fnv = "1.0"
mio = { version = "0.6.8", optional = true }
crossbeam = "0.3"
crossbeam-deque = "0.6"
parking_lot = "0.6"
log = "0.4"
slab = "0.4"

View File

@ -74,7 +74,7 @@ extern crate mio;
#[macro_use]
extern crate log as rlog;
extern crate slab;
extern crate crossbeam;
extern crate crossbeam_deque as deque;
extern crate parking_lot;
extern crate num_cpus;
extern crate timer;

View File

@ -20,7 +20,7 @@ use std::collections::HashMap;
use mio::*;
use mio::timer::{Timeout};
use mio::deprecated::{EventLoop, Handler, Sender, EventLoopBuilder};
use crossbeam::sync::chase_lev;
use deque;
use slab::Slab;
use {IoError, IoHandler};
use worker::{Worker, Work, WorkType};
@ -184,7 +184,7 @@ pub struct IoManager<Message> where Message: Send + Sync {
timers: Arc<RwLock<HashMap<HandlerId, UserTimer>>>,
handlers: Arc<RwLock<Slab<Arc<IoHandler<Message>>>>>,
workers: Vec<Worker>,
worker_channel: chase_lev::Worker<Work<Message>>,
worker_channel: deque::Worker<Work<Message>>,
work_ready: Arc<Condvar>,
}
@ -194,7 +194,7 @@ impl<Message> IoManager<Message> where Message: Send + Sync + 'static {
event_loop: &mut EventLoop<IoManager<Message>>,
handlers: Arc<RwLock<Slab<Arc<IoHandler<Message>>>>>
) -> Result<(), IoError> {
let (worker, stealer) = chase_lev::deque();
let (worker, stealer) = deque::fifo();
let num_workers = 4;
let work_ready_mutex = Arc::new(Mutex::new(()));
let work_ready = Arc::new(Condvar::new());
@ -430,7 +430,7 @@ impl<Message> IoChannel<Message> where Message: Send + Sync + 'static {
/// General IO Service. Starts an event loop and dispatches IO requests.
/// 'Message' is a notification message type
pub struct IoService<Message> where Message: Send + Sync + 'static {
thread: Mutex<Option<JoinHandle<()>>>,
thread: Option<JoinHandle<()>>,
host_channel: Mutex<Sender<IoMessage<Message>>>,
handlers: Arc<RwLock<Slab<Arc<IoHandler<Message>>>>>,
}
@ -448,19 +448,19 @@ impl<Message> IoService<Message> where Message: Send + Sync + 'static {
IoManager::<Message>::start(&mut event_loop, h).expect("Error starting IO service");
});
Ok(IoService {
thread: Mutex::new(Some(thread)),
thread: Some(thread),
host_channel: Mutex::new(channel),
handlers: handlers,
})
}
pub fn stop(&self) {
pub fn stop(&mut self) {
trace!(target: "shutdown", "[IoService] Closing...");
// Clear handlers so that shared pointers are not stuck on stack
// in Channel::send_sync
self.handlers.write().clear();
self.host_channel.lock().send(IoMessage::Shutdown).unwrap_or_else(|e| warn!("Error on IO service shutdown: {:?}", e));
if let Some(thread) = self.thread.lock().take() {
if let Some(thread) = self.thread.take() {
thread.join().unwrap_or_else(|e| {
debug!(target: "shutdown", "Error joining IO service event loop thread: {:?}", e);
});
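A hedged, self-contained sketch (plain std threads, not the real `IoService`) of the shutdown pattern introduced above: holding the handle as `Option<JoinHandle<_>>` lets `stop(&mut self)` take and join it exactly once, and a `Drop` impl can call `stop` again safely because the second call finds `None`.

```rust
use std::sync::mpsc::{channel, Sender};
use std::thread::{self, JoinHandle};

struct Service {
    thread: Option<JoinHandle<()>>,
    shutdown: Sender<()>,
}

impl Service {
    fn start() -> Service {
        let (tx, rx) = channel();
        let thread = thread::spawn(move || {
            // Event-loop stand-in: block until a shutdown message (or hangup) arrives.
            let _ = rx.recv();
        });
        Service { thread: Some(thread), shutdown: tx }
    }

    fn stop(&mut self) {
        let _ = self.shutdown.send(());
        // `take()` leaves `None` behind, so a second `stop` is a no-op.
        if let Some(thread) = self.thread.take() {
            let _ = thread.join();
        }
    }
}

impl Drop for Service {
    fn drop(&mut self) {
        self.stop();
    }
}

fn main() {
    let mut svc = Service::start();
    svc.stop(); // explicit stop; the later implicit drop does nothing extra
}
```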

View File

@ -16,7 +16,7 @@
use std::sync::{Arc, Weak};
use std::thread;
use crossbeam::sync::chase_lev;
use deque;
use slab::Slab;
use fnv::FnvHashMap;
use {IoError, IoHandler};
@ -198,7 +198,7 @@ struct Shared<Message> where Message: Send + Sync + 'static {
// necessary.
timers: Mutex<FnvHashMap<TimerToken, TimerGuard>>,
// Channel used to send work to the worker threads.
channel: Mutex<Option<chase_lev::Worker<WorkTask<Message>>>>,
channel: Mutex<Option<deque::Worker<WorkTask<Message>>>>,
}
// Messages used to communicate with the event loop from other threads.
@ -224,7 +224,7 @@ impl<Message> Clone for WorkTask<Message> where Message: Send + Sized {
impl<Message> IoService<Message> where Message: Send + Sync + 'static {
/// Starts IO event loop
pub fn start() -> Result<IoService<Message>, IoError> {
let (tx, rx) = chase_lev::deque();
let (tx, rx) = deque::fifo();
let shared = Arc::new(Shared {
handlers: RwLock::new(Slab::with_capacity(MAX_HANDLERS)),
@ -251,7 +251,7 @@ impl<Message> IoService<Message> where Message: Send + Sync + 'static {
}
/// Stops the IO service.
pub fn stop(&self) {
pub fn stop(&mut self) {
trace!(target: "shutdown", "[IoService] Closing...");
// Clear handlers so that shared pointers are not stuck on stack
// in Channel::send_sync
@ -307,15 +307,15 @@ impl<Message> Drop for IoService<Message> where Message: Send + Sync {
}
}
fn do_work<Message>(shared: &Arc<Shared<Message>>, rx: chase_lev::Stealer<WorkTask<Message>>)
fn do_work<Message>(shared: &Arc<Shared<Message>>, rx: deque::Stealer<WorkTask<Message>>)
where Message: Send + Sync + 'static
{
loop {
match rx.steal() {
chase_lev::Steal::Abort => continue,
chase_lev::Steal::Empty => thread::park(),
chase_lev::Steal::Data(WorkTask::Shutdown) => break,
chase_lev::Steal::Data(WorkTask::UserMessage(message)) => {
deque::Steal::Retry => continue,
deque::Steal::Empty => thread::park(),
deque::Steal::Data(WorkTask::Shutdown) => break,
deque::Steal::Data(WorkTask::UserMessage(message)) => {
for id in 0 .. MAX_HANDLERS {
if let Some(handler) = shared.handlers.read().get(id) {
let ctxt = IoContext { handler: id, shared: shared.clone() };
@ -323,7 +323,7 @@ fn do_work<Message>(shared: &Arc<Shared<Message>>, rx: chase_lev::Stealer<WorkTa
}
}
},
chase_lev::Steal::Data(WorkTask::TimerTrigger { handler_id, token }) => {
deque::Steal::Data(WorkTask::TimerTrigger { handler_id, token }) => {
if let Some(handler) = shared.handlers.read().get(handler_id) {
let ctxt = IoContext { handler: handler_id, shared: shared.clone() };
handler.timeout(&ctxt, token);

View File

@ -17,7 +17,7 @@
use std::sync::Arc;
use std::thread::{JoinHandle, self};
use std::sync::atomic::{AtomicBool, Ordering as AtomicOrdering};
use crossbeam::sync::chase_lev;
use deque;
use service_mio::{HandlerId, IoChannel, IoContext};
use IoHandler;
use LOCAL_STACK_SIZE;
@ -53,7 +53,7 @@ pub struct Worker {
impl Worker {
/// Creates a new worker instance.
pub fn new<Message>(index: usize,
stealer: chase_lev::Stealer<Work<Message>>,
stealer: deque::Stealer<Work<Message>>,
channel: IoChannel<Message>,
wait: Arc<Condvar>,
wait_mutex: Arc<Mutex<()>>,
@ -75,8 +75,9 @@ impl Worker {
worker
}
fn work_loop<Message>(stealer: chase_lev::Stealer<Work<Message>>,
channel: IoChannel<Message>, wait: Arc<Condvar>,
fn work_loop<Message>(stealer: deque::Stealer<Work<Message>>,
channel: IoChannel<Message>,
wait: Arc<Condvar>,
wait_mutex: Arc<Mutex<()>>,
deleting: Arc<AtomicBool>)
where Message: Send + Sync + 'static {
@ -91,8 +92,9 @@ impl Worker {
while !deleting.load(AtomicOrdering::Acquire) {
match stealer.steal() {
chase_lev::Steal::Data(work) => Worker::do_work(work, channel.clone()),
_ => break,
deque::Steal::Data(work) => Worker::do_work(work, channel.clone()),
deque::Steal::Retry => {},
deque::Steal::Empty => break,
}
}
}
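A hedged sketch of the new work-stealing loop shape, assuming crossbeam-deque 0.6 (the version this diff moves to): `Steal::Retry` means the steal lost a race with another consumer and should simply be retried, rather than being treated like an empty queue as the old catch-all arm effectively did.

```rust
extern crate crossbeam_deque as deque;

use std::thread;

fn main() {
    // One producer end, one (cloneable) stealer end.
    let (worker, stealer) = deque::fifo();
    for task in 0..4 {
        worker.push(task);
    }

    let handle = thread::spawn(move || {
        let mut done = Vec::new();
        loop {
            match stealer.steal() {
                deque::Steal::Retry => continue,         // lost a race: try again
                deque::Steal::Empty => break,            // nothing left to do
                deque::Steal::Data(task) => done.push(task),
            }
        }
        done
    });

    println!("stolen: {:?}", handle.join().unwrap());
}
```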

View File

@ -751,12 +751,15 @@ impl Host {
let max_ingress = max(max_peers - min_peers, min_peers / 2);
if reserved_only ||
(s.info.originated && egress_count > min_peers) ||
(!s.info.originated && ingress_count > max_ingress) && !self.reserved_nodes.read().contains(&id) {
(!s.info.originated && ingress_count > max_ingress) {
if !self.reserved_nodes.read().contains(&id) {
// only disconnect if the connecting peer is not reserved.
trace!(target: "network", "Disconnecting non-reserved peer {:?}", id);
s.disconnect(io, DisconnectReason::TooManyPeers);
kill = true;
break;
}
}
if !self.filter.as_ref().map_or(true, |f| f.connection_allowed(&self_id, &id, ConnectionDirection::Inbound)) {
trace!(target: "network", "Inbound connection not allowed for {:?}", id);
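The restructuring above addresses an operator-precedence pitfall: in Rust, `&&` binds tighter than `||`, so in the old single condition the `!reserved` check only guarded the ingress clause, and a reserved peer could still be dropped via the other branches. A tiny self-contained illustration (the flag values are made up purely to show the difference):

```rust
fn main() {
    let reserved_only = false;
    let originated_overflow = true;  // stands in for `egress_count > min_peers`
    let ingress_overflow = false;    // stands in for `ingress_count > max_ingress`
    let is_reserved = true;

    // Old shape: `&& !is_reserved` only applies to the last clause.
    let old = reserved_only || originated_overflow || ingress_overflow && !is_reserved;

    // New shape: decide "too many peers" first, then exempt reserved peers.
    let too_many = reserved_only || originated_overflow || ingress_overflow;
    let new = too_many && !is_reserved;

    assert!(old);  // the reserved peer would have been disconnected
    assert!(!new); // the reserved peer is kept
}
```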

View File

@ -624,19 +624,29 @@ mod tests {
// unknown - node 6
// nodes are also ordered according to their addition time
//
// nanosecond precision lost since mac os x high sierra update so let's not compare their order
// https://github.com/paritytech/parity-ethereum/issues/9632
let r = table.nodes(&IpFilter::default());
assert_eq!(r[0][..], id4[..]); // most recent success
assert_eq!(r[1][..], id3[..]);
// most recent success
assert!(
(r[0] == id4 && r[1] == id3) ||
(r[0] == id3 && r[1] == id4)
);
// unknown (old contacts and new nodes), randomly shuffled
assert!(
r[2][..] == id5[..] && r[3][..] == id6[..] ||
r[2][..] == id6[..] && r[3][..] == id5[..]
(r[2] == id5 && r[3] == id6) ||
(r[2] == id6 && r[3] == id5)
);
assert_eq!(r[4][..], id1[..]); // oldest failure
assert_eq!(r[5][..], id2[..]);
// oldest failure
assert!(
(r[4] == id1 && r[5] == id2) ||
(r[4] == id2 && r[5] == id1)
);
}
#[test]

View File

@ -3,7 +3,7 @@
[package]
name = "parity-version"
# NOTE: this value is used for Parity Ethereum version string (via env CARGO_PKG_VERSION)
version = "2.1.1"
version = "2.1.2"
authors = ["Parity Technologies <admin@parity.io>"]
build = "build.rs"