Compare commits

...

79 Commits

Author SHA1 Message Date
taw1313
1a25eda598
Community wip 2022 (#334)
* update tools script

* update community docs
2022-07-26 13:56:26 -06:00
David Rigby
495130a40f
Merge pull request #332 from dragonchain/txn-type-fix
check for null object
2021-11-30 23:38:36 +00:00
DRigby26
a7151a6520 removes log 2021-11-30 12:38:29 -08:00
DRigby26
a184a850a9 check for null object 2021-06-15 14:13:22 -07:00
Alex Benedetto
a1a8bb887b
Merge pull request #327 from dragonchain/object-storage-change
Increase block size by storing payloads in separate file
2020-11-19 11:25:51 -08:00
Alex Benedetto
4579be1fcd store payloads in separate file and fixed worker issue where long queries result in replay attack error 2020-11-19 10:58:53 -08:00
Alex Benedetto
24a28ac609 Fixed bug in re-indexing 2020-07-24 08:33:27 -07:00
Alex Benedetto
0d3fb63500
Querying interchain broadcast bug fix with UUID chain ID (#324)
* bug fix for querying redisearch with a UUID chain ID instead of public key
2020-07-23 13:40:23 -07:00
Alex Benedetto
0f60f78115 remove drop index 2020-07-22 16:55:39 -07:00
Adam
eb432480e9
update dependencies (#323) 2020-07-21 01:36:28 +00:00
Alex Benedetto
34d34e344b
Merge pull request #322 from dragonchain/verification-redisearch-index
Subsequent L5 verification search
2020-07-20 12:28:41 -07:00
Alex Benedetto
53ba26d629 add new endpoint for querying subsequent interchain transactions 2020-07-20 12:20:52 -07:00
Adam
5ae03cc0b2
Merge pull request #320 from cheeseandcereal/update_dependencies
update dependencies
2020-06-19 14:07:38 -07:00
Adam Crowder
f1d7a88b48
update dependencies/fix lint errors
Signed-off-by: Adam Crowder <adam@adamcrowder.net>
2020-06-19 14:03:00 -07:00
Adam Crowder
7bf7b756e0
update dependencies
Signed-off-by: Adam Crowder <adam@adamcrowder.net>
2020-06-01 22:46:11 -07:00
David Rigby
b817ea1de6
Merge pull request #319 from dragonchain/block-level-fix
check current level for -1
2020-06-01 09:37:56 -07:00
DavidR
0aa8658c40 fix 2020-06-01 09:23:56 -07:00
David Rigby
cbd7f0dea8
Update dragonchain/broadcast_processor/broadcast_processor.py
Co-authored-by: Alex Benedetto <alex061994@gmail.com>
2020-05-29 15:40:22 -07:00
DavidR
d3f1f808dc replaced break with continue 2020-05-29 15:38:02 -07:00
DavidR
7855f47432 removed bad f strings 2020-05-29 15:30:32 -07:00
DavidR
9cc1439227 updated readme 2020-05-29 14:23:25 -07:00
DavidR
7d3f59d756 added break to if statements 2020-05-29 14:19:36 -07:00
DavidR
9f84f21f6c check current level for -1 2020-05-29 14:03:10 -07:00
Adam
55b6fe4256
update various dependencies and containers (#318) 2020-03-19 12:06:46 -07:00
Adam
9103ee4c42
update dependencies (#317)
* update dependencies and typing for web3

* update dependency containers
2020-03-04 12:45:01 -08:00
Adam
8e68babfd7
update dependencies (#316) 2020-02-24 15:14:55 -08:00
Adam
6dc803cb27
update dependencies (#315) 2020-02-20 09:42:31 -08:00
Adam
f7e2227791
bump dependencies (#314)
* bump dependencies
2020-02-18 10:19:43 -08:00
Adam
2be3d2bcd3
update dependencies (#313) 2020-02-12 18:59:18 +00:00
Adam
364fe2ee05
version bumps (#312) 2020-02-05 21:10:27 +00:00
Adam
cc53cfcf02
fix transaction query with bad type (#310) 2020-02-05 01:03:08 +00:00
Adam
4516d78ccf
provide better error when interchain publish fails (#309) 2020-02-04 23:41:19 +00:00
Adam
ab5c53b879
fix l5 backlog queue (#308) 2020-02-04 19:05:32 +00:00
Adam
a855f1d073
add interchain transaction publish (#307)
* add interchain transaction publish

* remove codeowners
2020-02-03 22:55:17 +00:00
Alex Benedetto
57e7de3371
Delete contract by transaction type (#306)
* added delete route for contract by transaction type

Signed-off-by: Adam Crowder <adam@adamcrowder.net>

Co-authored-by: Adam <adam@adamcrowder.net>
2020-01-30 11:06:56 -08:00
Adam
980b26a486
Update L4 block schema and fwatchdog (#305) 2020-01-29 20:01:30 +00:00
Adam
59826e9727
update redisearch (#304) 2020-01-28 23:33:31 +00:00
Adam
d093c4a15d
update dependencies (#303) 2020-01-27 19:27:07 +00:00
Adam
088d5abb7e
post-release version bump (#302) 2020-01-17 00:52:20 +00:00
Adam
dbede7feed
Fix contract building bug (#300) 2020-01-16 22:16:18 +00:00
Adam
51a5180bf5
fix l5 receipt processing bug (#299) 2020-01-16 18:21:07 +00:00
Adam
ff2b1553ec
update dependencies (#297)
* update dependencies

* update helm in dependency/test container
2020-01-14 01:55:38 +00:00
Dean Shelton
e389ae53a8 (configmap.yaml) Removes duplicate quotes (#296)
* (configmap.yaml) Removes duplicate quotes

* update CHANGELOG
2020-01-10 21:46:12 +00:00
Adam
280165ce85
update dependencies (#295) 2020-01-08 20:02:21 +00:00
Adam
d15a2de50d
remove tx processor callback checking logs (#294) 2020-01-03 01:37:58 +00:00
Roby Daquilante
a29d77e636
Add recovery queue and brpoplpush to job processor (#281) 2019-12-31 15:00:03 -08:00
Adam
4ccdeca854
update copyright dates (#293) 2019-12-31 18:34:07 +00:00
Dean Shelton
b1a98e5288
Merge pull request #292 from dragonchain/addHeader
adds header to verification notification
2019-12-27 15:45:42 -08:00
Dean Shelton
3681d0a4f1 adds header to verification notification 2019-12-27 15:26:27 -08:00
Adam
c78b635755
fix tests (python 3.8.1) (#291)
* fix tests

* assert await
2019-12-27 21:54:44 +00:00
Adam
ccc414cdd5
post-release version bump (#290) 2019-12-24 01:13:21 +00:00
Adam
5c5215fa2c
fix take 3 (#289) 2019-12-24 00:33:17 +00:00
Adam
28d0585d8a
fix broadcast processor bug (#288)
* fix final

* with dependency containers
2019-12-24 00:07:57 +00:00
Adam
84feb860d9
fix dockerfiles (#287) 2019-12-23 23:29:10 +00:00
Adam
bd137aa6a2
broadcast processor fix (#285) 2019-12-23 23:22:05 +00:00
Adam
47b8fb686f
post-release bumps (#284) 2019-12-23 18:37:05 +00:00
Adam
cd8fbcc96f
Restrict smart contract secret name charset (#282) 2019-12-20 18:30:11 +00:00
Adam
de1f7b21cf
Deprecate morden and add raspi docs (#280)
* deprecate morden and add docs
2019-12-19 21:10:39 +00:00
Adam
0cfde63a5b
Fix bugs (#278)
* fix cache redis config map

* fix contract secret update

* update contract create/update schema
2019-12-13 22:01:38 +00:00
Adam
a0071be227
Allow updating smart contract api keys, and remove 'reserved' key concept from WEB_ keys (#276)
* allow get/list/update/delete of WEB_ keys and get/list/update of SC_ keys

* update changelog

* refactor list api keys
2019-12-13 19:14:23 +00:00
Adam
6469e8bb53
tweak resource usages (#277) 2019-12-13 19:07:38 +00:00
Adam
1d31100fe7
Update status endpoint to indicate if indexing is on or off (#275)
* update status endpoint to add indication is redisearch is on or off

* add changelog/tests
2019-12-12 23:38:32 +00:00
Adam
d5544e7c06
Remove broken arm7 builds (#274) 2019-12-12 18:15:26 +00:00
Adam
4c322ae102
Armv7 builds (#273)
* add armv7 build

* update changelog
2019-12-11 21:36:57 +00:00
Adam
1b2315924f
fix buildspec again (#272) 2019-12-10 23:04:45 +00:00
Wingman4l7
fa315563f6
implementing deadline for L5s (#268) 2019-12-10 14:55:18 -08:00
Adam
55c07cea22
fix buildspec (#271) 2019-12-10 22:30:17 +00:00
Adam
4ee89873aa
Add Arm64 builds (#270)
* arm64
2019-12-10 22:19:12 +00:00
Adam
12a40e73f1
Make redisearch optional on verification nodes (#269) 2019-12-06 21:01:20 +00:00
Adam
8e8f5eabf3
3.4.1 version bump (#267) 2019-12-05 21:13:37 +00:00
Adam
ed83c5309f
reduce interchain timeout (#266) 2019-12-05 20:03:38 +00:00
Adam
d0ae7f3015
fix bulk endpoint behavior (#265) 2019-12-05 18:47:35 +00:00
Roby Daquilante
4f083642de Support changes to matchmaking error code updates (#186)
* update

* handle matchmaking not found
2019-12-04 23:39:51 +00:00
Adam
3ad7fe9225
Add Key Permissioning (#263)
* key permissioning
2019-12-04 22:23:23 +00:00
Wingman4l7
8d2df3a340
adding L4 block limit (#262)
* added L4 block limit, updated changelog
2019-12-03 15:51:55 -08:00
Wingman4l7
8ad5a6a78c
squash (#255) 2019-11-26 12:41:55 -08:00
Adam
360efde694
update gunicorn (#260) 2019-11-26 18:47:15 +00:00
Adam
dd73edf418
Add getting started links/update dependencies (#259) 2019-11-22 23:49:11 +00:00
Adam
e14c8d0255
dependency updates (#257) 2019-11-20 20:50:55 +00:00
178 changed files with 3870 additions and 1310 deletions

.github/CODEOWNERS

@@ -1 +0,0 @@
* @cheeseandcereal @deanshelton913 @regan-karlewicz

.gitignore

@@ -11,6 +11,7 @@ coverage.xml
.envrc
.direnv
.venv
venv
# Installer logs
pip-log.txt

@@ -5,6 +5,7 @@ branches:
language: minimal
services:
- docker
before_script:
- curl -L https://codeclimate.com/downloads/test-reporter/test-reporter-latest-linux-amd64 > ./cc-test-reporter
- chmod +x ./cc-test-reporter

@@ -1 +1 @@
4.2.0
4.5.1

.vscode/settings.json

@@ -1,16 +1,12 @@
{
"files.watcherExclude": {
"**/.mypy_cache": true,
"**/.venv": true
},
"python.linting.flake8Enabled": true,
"python.formatting.provider": "black",
"python.formatting.blackArgs": [
"-l",
"150",
"-t",
"py38"
],
"editor.formatOnSave": true,
"restructuredtext.confPath": "${workspaceFolder}/docs"
"files.watcherExclude": {
"**/.mypy_cache": true,
"**/.venv": true
},
"python.linting.flake8Enabled": true,
"python.formatting.provider": "black",
"python.formatting.blackArgs": ["-l", "150", "-t", "py38"],
"editor.formatOnSave": true,
"restructuredtext.confPath": "${workspaceFolder}/docs",
"cSpell.words": ["Dragonchain"]
}

@@ -1,5 +1,120 @@
# Changelog
## 4.5.1
- **Bugs**
- Fixed bug where transaction queries taking longer than `10s` resulted in a replay attack error
- Fixed bug where retrieving transactions with payloads larger than `1mb` resulted in the S3 Select error `OverMaxRecordSize`
## 4.5.0
- **Feature:**
- Add new endpoint `GET /v1/verifications/interchains/<block_id>` for getting the subsequent interchain broadcasts
- **Packaging:**
- Update web3, docker, bit, requests, fastjsonschema, kubernetes, redis, redisearch, Flask, pycoin, base58, and boto3 dependencies
- Update redisearch in helm chart to 1.6.13
- Update redis in helm chart to 6.0.5
- **Bugs**
- Fixed bug in broadcast processor causing failure when a block cannot be found in redis
- Fixed bug in querying interchain broadcasts when the old L5 chain ID is a UUID and not a public key
- Fixed bug in re-indexing of L5 verifications where redisearch throws a document-exists error
- **Development:**
- Use helm 3.2.4 and yq 3.3.2 for dependency container and lint checking
## 4.4.0
- **Feature:**
- Modify L4 block schema to include chain name in the L4 block header
- Add new endpoint `DELETE /v1/contract/txn_type/<txn_type>` for deleting smart contracts by transaction type
- Add new endpoint `POST /v1/interchains/transaction/publish` for publishing a signed interchain transaction
- **Bugs:**
- Fix issue where L5 recovery queue would not process when trying to resolve already resolved claim-checks
- **Packaging:**
- Update redis, web3, and boto3 dependencies
- Update redisearch in helm chart to 1.6.7
- Update fwatchdog to 0.18.10 for OpenFaaS smart contracts
- **Development:**
- Removed all references to lab chain
- Remove codeowners
## 4.3.3
- **Bugs:**
- Adds Content-Type header to all verification notifications
- Add retry logic to catch failed job_processor jobs
- Fixes double quote error on VERIFICATION_NOTIFICATION json deployment
- Fix bug where an L1 processing a receipt from an L5 could result in dropping some receipts
- **Packaging:**
- Update redisearch, base58, and boto3 dependencies
- Un-pin python to version 3.8.X
- Update redisearch in helm chart to 1.4.20
- **Development:**
- Fix tests for python 3.8.1
- Use helm 3 (specifically 3.0.2) for dependency container and lint checking
## 4.3.2
- **Bugs:**
- Fix bug that would cause broadcast processor to fail when handling sending blocks to L5 chains
- **Packaging:**
- Update boto3 dependency
- Pin python to 3.8.0
- **CICD:**
- Only push edge containers if building dev code (master)
- Modify/fix dependency/test docker containers for docker caching
## 4.3.1
- **Feature:**
- Add field to status endpoint return to indicate if indexing (redisearch) is on/off
- Allow get/list/update on smart contract api keys (allowing their permissions to be viewed/changed)
- **Bugs:**
- Fix bug where updating an existing secret for a smart contract would cause the contract deployment to fail
- Restrict contract create/update schema to deny overwriting reserved 'secret-key' and 'auth-key-id' secrets
- Restrict contract create/update schema to only allow a restricted character-set for secret names so invalid names can't be used (which would previously cause the smart contract build to fail)
- **Documentation:**
- Edit notes about RAM usage requirements for Dragonchain
- Add documentation for deploying with a raspberry pi
- **Packaging:**
- Update boto3, fastjsonschema, and web3 dependencies
- Change default node level with helm install to L2 (from L1)
- Allow option for turning redisearch on/off with verification (L2+) nodes (and disable by default)
- Provide multiarch (arm64/amd64) manifests for built containers in dockerhub
- Update redisearch in helm chart to 1.4.19
- Tweak pod resource usages in helm chart for lower requirements
- Update fwatchdog to 0.18.7 for OpenFaaS smart contracts
- **CICD:**
- Add building for arm64 containers (in addition to existing amd64)
- **Development:**
- Add script to check for newer requirements.txt package versions
- Implemented deadlines for L5 blocks based on block times and confirmations for public blockchains
- Remove any concept of api keys starting with `WEB_` from being special
- Deprecate support for Ethereum Classic Testnet (Morden) with ethereum interchain functionality
## 4.3.0
- **Feature:**
- Add api key permissioning (check their [docs](https://dragonchain-core-docs.dragonchain.com/latest/usage/permissioning.html) for more info)
- Add root and permissions document api key information in response when creating/getting/listing/updating api keys
- Speed up bulk transaction intake with redis pipelines, and increase max transactions in a bulk request to 5000 (from 250)
- Change error codes expected from matchmaking to support new error code for not enough validating nodes and properly handle claim not found
- **Bugs:**
- Fix a bug where a failure in matchmaking would result in claims not being finalized
- **Documentation:**
- Update documentation links to add the [getting started guide](https://docs.dragonchain.com/)
- Add top-level section for usage documentation
- Add pages for authentication and permissioning in usage documentation
- **Packaging:**
- Update boto3, aioredis, and gunicorn dependencies
- Remove now unnecessary `binutils` and `musl-dev` packages from docker build since gunicorn update
- Update redisearch in helm chart to 1.4.18
- Update redis in helm chart to 5.0.7
- Update fwatchdog to 0.18.4 for OpenFaaS smart contracts
- **Development:**
- Added hard limit to the number of L4 blocks included in an L5 block
- Use independent model and dao for api keys
- Reduce interchain timeouts so client requests don't timeout
## 4.2.0
- **Feature:**
@@ -11,7 +126,7 @@
- Fix a bug where L2+ chains could have the transaction processor go into a failure loop if a block failed to write to storage at some point
- Fix a bug where Ethereum L5 nodes could estimate a gas price of 0 for low-activity networks
- Fix a bug where an open-source chain couldn't build smart contracts due to a bad environment variable
- Fix a bug where a chain could infinitely retry to connect to dragon net
- Fix a bug where a chain could infinitely retry to connect to Dragon Net
- Fix a bug with storage deletion using the disk storage interface which could cause unexpected failures
- Fix a bug with private docker registry delete when deleting smart contracts
- Fix a bug with smart contract heap get where pre-pending an extra '/' could give bad results
@@ -40,7 +155,7 @@ field where necessary.
- **Feature:**
- Default error reporting to save to disk, so that crash logs/tracebacks can be automatically saved
- Provide better error message when bad input to api doesn't match required schemas
- Adds verification-notification callback in the reciept endpoint
- Adds verification-notification callback in the receipt endpoint
- Add indexed redisearch tag field "invoker" by default when indexing smart contract transactions
- Remove max limit for custom indexes on a transaction type/smart contract
- **Bugs:**

@@ -2,17 +2,16 @@ FROM python:3.8-alpine as base
WORKDIR /usr/src/core
# Install necessary base dependencies and set UTC timezone for apscheduler
RUN apk --no-cache upgrade && apk --no-cache add libffi libstdc++ gmp && echo "UTC" > /etc/timezone && apk --no-cache add binutils musl-dev
# apk --no-cache add binutils musl-dev is required for gunicorn 20.0.0 until https://github.com/benoitc/gunicorn/issues/2160 is fixed
RUN apk --no-cache upgrade && apk --no-cache add libffi libstdc++ gmp && echo "UTC" > /etc/timezone
FROM base AS builder
# Install build dependencies
RUN apk add g++ make gmp-dev libffi-dev automake autoconf libtool
RUN apk --no-cache add g++ make gmp-dev libffi-dev automake autoconf libtool
# Install python dependencies
ENV SECP_BUNDLED_EXPERIMENTAL 1
ENV SECP_BUNDLED_WITH_BIGNUM 1
COPY requirements.txt .
RUN python3 -m pip install -r requirements.txt
RUN python3 -m pip install --no-cache-dir -r requirements.txt
FROM base AS release
# Copy the installed python dependencies from the builder

Dockerfile.arm64 (new file)

@@ -0,0 +1,21 @@
FROM arm64v8/python:3.8-alpine as base
WORKDIR /usr/src/core
# Install necessary base dependencies and set UTC timezone for apscheduler
RUN apk --no-cache upgrade && apk --no-cache add libffi libstdc++ gmp && echo "UTC" > /etc/timezone
FROM base AS builder
# Install build dependencies
RUN apk --no-cache add g++ make gmp-dev libffi-dev automake autoconf libtool
# Install python dependencies
ENV SECP_BUNDLED_EXPERIMENTAL 1
ENV SECP_BUNDLED_WITH_BIGNUM 1
COPY requirements.txt .
RUN python3 -m pip install --no-cache-dir -r requirements.txt
FROM base AS release
# Copy the installed python dependencies from the builder
COPY --from=builder /usr/local/lib/python3.8/site-packages /usr/local/lib/python3.8/site-packages
COPY --from=builder /usr/local/bin/gunicorn /usr/local/bin/gunicorn
# Copy our actual application
COPY --chown=1000:1000 . .

@@ -17,8 +17,15 @@ and public blockchain interoperability, Dragonchain shines a new and interesting
</div>
## 🏁 Getting Started
For getting started with Dragonchain, we recommend visiting [docs.dragonchain.com](https://docs.dragonchain.com/)
Those docs provide the friendliest overview for understanding, installing, and using Dragonchain.
## 🔗 Quick Links
- [Getting Started With Dragonchain](https://docs.dragonchain.com/)
- [Project Documentation](https://dragonchain-core-docs.dragonchain.com/latest/)
- [Dragonchain Console](https://console.dragonchain.com/)
- [Dragonchain Inc](https://dragonchain.com/)
@@ -31,15 +38,13 @@ and public blockchain interoperability, Dragonchain shines a new and interesting
## 📝 Documentation
Please read the [docs](https://dragonchain-core-docs.dragonchain.com/latest/) for further details and documentation.
Documentation for this specific repository is [available here](https://dragonchain-core-docs.dragonchain.com/latest/).
The documentation is intended for developers wishing to learn about and contribute to the Dragonchain core platform itself.
This documentation is intended for developers wishing to learn about and contribute to the Dragonchain core platform itself.
For _interaction_ with the Dragonchain, we recommend signing up for a [Dragonchain Console](https://console.dragonchain.com)
account and testing with our managed service, as it will be easier for getting started with developing _on top of_ dragonchain
(rather than developing the actual Dragonchain core platform).
For _interaction_ with the Dragonchain, we recommend using the [Getting Started link](https://docs.dragonchain.com/) instead.
For interaction and using the Dragonchain, check out the SDKs (or CLI) and their relevant documentation instead:
You can also view one of our SDKs (or their docs) to start interacting with a Dragonchain:
- Python: [SDK](https://pypi.org/project/dragonchain-sdk/) - [Documentation](https://python-sdk-docs.dragonchain.com/latest/)
- Javascript: [SDK](https://www.npmjs.com/package/dragonchain-sdk) - [Documentation](https://node-sdk-docs.dragonchain.com/latest/)

@@ -6,10 +6,10 @@ Parameters:
Default: GitHubReadOnlyToken
Resources:
DevCodeBuild:
DevCodeBuildAmd64:
Type: AWS::CodeBuild::Project
Properties:
Name: "dragonchain-codebuild-deploy-dev"
Name: "dragonchain-codebuild-deploy-dev-amd64"
ServiceRole:
Ref: CICDPipelineRole
Artifacts:
@@ -22,16 +22,44 @@ Resources:
- Name: STAGE
Type: PLAINTEXT
Value: dev
- Name: ARCHITECTURE
Type: PLAINTEXT
Value: amd64
PrivilegedMode: true
Source:
Type: CODEPIPELINE
BuildSpec: cicd/buildspec.deploy.yml
TimeoutInMinutes: 60
ProdCodeBuild:
DevCodeBuildArm64:
Type: AWS::CodeBuild::Project
Properties:
Name: "dragonchain-codebuild-deploy-prod"
Name: "dragonchain-codebuild-deploy-dev-arm64"
ServiceRole:
Ref: CICDPipelineRole
Artifacts:
Type: CODEPIPELINE
Environment:
Type: ARM_CONTAINER
ComputeType: BUILD_GENERAL1_LARGE
Image: aws/codebuild/amazonlinux2-aarch64-standard:1.0
EnvironmentVariables:
- Name: STAGE
Type: PLAINTEXT
Value: dev
- Name: ARCHITECTURE
Type: PLAINTEXT
Value: arm64
PrivilegedMode: true
Source:
Type: CODEPIPELINE
BuildSpec: cicd/buildspec.deploy.yml
TimeoutInMinutes: 60
ProdCodeBuildAmd64:
Type: AWS::CodeBuild::Project
Properties:
Name: "dragonchain-codebuild-deploy-prod-amd64"
ServiceRole:
Ref: CICDPipelineRole
Artifacts:
@@ -44,6 +72,34 @@ Resources:
- Name: STAGE
Type: PLAINTEXT
Value: prod
- Name: ARCHITECTURE
Type: PLAINTEXT
Value: amd64
PrivilegedMode: true
Source:
Type: CODEPIPELINE
BuildSpec: cicd/buildspec.deploy.yml
TimeoutInMinutes: 60
ProdCodeBuildArm64:
Type: AWS::CodeBuild::Project
Properties:
Name: "dragonchain-codebuild-deploy-prod-arm64"
ServiceRole:
Ref: CICDPipelineRole
Artifacts:
Type: CODEPIPELINE
Environment:
Type: ARM_CONTAINER
ComputeType: BUILD_GENERAL1_LARGE
Image: aws/codebuild/amazonlinux2-aarch64-standard:1.0
EnvironmentVariables:
- Name: STAGE
Type: PLAINTEXT
Value: prod
- Name: ARCHITECTURE
Type: PLAINTEXT
Value: arm64
PrivilegedMode: true
Source:
Type: CODEPIPELINE
@@ -123,7 +179,7 @@ Resources:
RunOrder: 1
- Name: Deploy
Actions:
- Name: Test-Build-Deploy
- Name: Test-Build-Deploy-amd64
ActionTypeId:
Category: Build
Owner: AWS
@@ -131,7 +187,20 @@
Version: 1
Configuration:
ProjectName:
Ref: DevCodeBuild
Ref: DevCodeBuildAmd64
InputArtifacts:
- Name: MasterDragonchainCode
RoleArn: !GetAtt CICDPipelineRole.Arn
RunOrder: 1
- Name: Test-Build-Deploy-arm64
ActionTypeId:
Category: Build
Owner: AWS
Provider: CodeBuild
Version: 1
Configuration:
ProjectName:
Ref: DevCodeBuildArm64
InputArtifacts:
- Name: MasterDragonchainCode
RoleArn: !GetAtt CICDPipelineRole.Arn
@@ -166,7 +235,7 @@ Resources:
RunOrder: 1
- Name: Deploy
Actions:
- Name: Test-Build-Deploy
- Name: Test-Build-Deploy-amd64
ActionTypeId:
Category: Build
Owner: AWS
@@ -174,7 +243,20 @@
Version: 1
Configuration:
ProjectName:
Ref: ProdCodeBuild
Ref: ProdCodeBuildAmd64
InputArtifacts:
- Name: ProductionDragonchainCode
RoleArn: !GetAtt CICDPipelineRole.Arn
RunOrder: 1
- Name: Test-Build-Deploy-arm64
ActionTypeId:
Category: Build
Owner: AWS
Provider: CodeBuild
Version: 1
Configuration:
ProjectName:
Ref: ProdCodeBuildArm64
InputArtifacts:
- Name: ProductionDragonchainCode
RoleArn: !GetAtt CICDPipelineRole.Arn

@@ -1,20 +1,25 @@
# This container is used as a base by Dockerfile.test in order to speed up dependency install for testing purposes only
FROM python:3.8-alpine
FROM amd64/python:3.8-alpine
# Install helm for linting chart, and yq for building docs
RUN wget -O helm-v2.14.3-linux-amd64.tar.gz 'https://get.helm.sh/helm-v2.14.3-linux-amd64.tar.gz' && \
tar xzf helm-v2.14.3-linux-amd64.tar.gz && mv linux-amd64/helm /usr/local/bin/helm && \
rm -rf helm-v2.14.3-linux-amd64.tar.gz linux-amd64 && \
wget -O yq 'https://github.com/mikefarah/yq/releases/download/2.4.0/yq_linux_amd64' && \
chmod +x yq && mv yq /usr/local/bin/
WORKDIR /usr/src/core
# Install necessary base dependencies and set UTC timezone for apscheduler
RUN apk --no-cache upgrade && apk --no-cache add libffi libstdc++ gmp && echo "UTC" > /etc/timezone
# Install dev build dependencies
RUN apk --no-cache upgrade && apk --no-cache add g++ make gmp-dev libffi-dev automake autoconf libtool && echo "UTC" > /etc/timezone && apk --no-cache add binutils musl-dev
# apk --no-cache add binutils musl-dev is required for gunicorn 20.0.0 until https://github.com/benoitc/gunicorn/issues/2160 is fixed
RUN apk --no-cache add g++ make gmp-dev libffi-dev automake autoconf libtool
# Install python dev dependencies
ENV SECP_BUNDLED_EXPERIMENTAL 1
ENV SECP_BUNDLED_WITH_BIGNUM 1
COPY requirements.txt .
RUN python3 -m pip install -r requirements.txt
RUN python3 -m pip install --no-cache-dir -r requirements.txt
COPY dev_requirements.txt .
RUN python3 -m pip install -r dev_requirements.txt
RUN python3 -m pip install --no-cache-dir --upgrade -r dev_requirements.txt
# Install Helm for chart linting and/or yq for doc builds if it doesn't exist
RUN if ! command -v helm; then \
wget -O helm-v3.2.4-linux-amd64.tar.gz 'https://get.helm.sh/helm-v3.2.4-linux-amd64.tar.gz' && \
tar xzf helm-v3.2.4-linux-amd64.tar.gz && mv linux-amd64/helm /usr/local/bin/helm && \
rm -rf helm-v3.2.4-linux-amd64.tar.gz linux-amd64; fi && \
if ! command -v yq; then \
wget -O yq 'https://github.com/mikefarah/yq/releases/download/3.3.2/yq_linux_amd64' && \
chmod +x yq && mv yq /usr/local/bin/; fi

@@ -0,0 +1,25 @@
# This container is used as a base by Dockerfile.test.arm64 in order to speed up dependency install for testing purposes only
FROM arm64v8/python:3.8-alpine
WORKDIR /usr/src/core
# Install necessary base dependencies and set UTC timezone for apscheduler
RUN apk --no-cache upgrade && apk --no-cache add libffi libstdc++ gmp && echo "UTC" > /etc/timezone
# Install dev build dependencies
RUN apk --no-cache add g++ make gmp-dev libffi-dev automake autoconf libtool
# Install python dev dependencies
ENV SECP_BUNDLED_EXPERIMENTAL 1
ENV SECP_BUNDLED_WITH_BIGNUM 1
COPY requirements.txt .
RUN python3 -m pip install --no-cache-dir -r requirements.txt
COPY dev_requirements.txt .
RUN python3 -m pip install --no-cache-dir --upgrade -r dev_requirements.txt
# Install Helm for chart linting and/or yq for doc builds if it doesn't exist
RUN if ! command -v helm; then \
wget -O helm-v3.2.4-linux-arm64.tar.gz 'https://get.helm.sh/helm-v3.2.4-linux-arm64.tar.gz' && \
tar xzf helm-v3.2.4-linux-arm64.tar.gz && mv linux-arm64/helm /usr/local/bin/helm && \
rm -rf helm-v3.2.4-linux-arm64.tar.gz linux-arm64; fi && \
if ! command -v yq; then \
wget -O yq 'https://github.com/mikefarah/yq/releases/download/3.3.2/yq_linux_arm64' && \
chmod +x yq && mv yq /usr/local/bin/; fi

@@ -1,37 +1,29 @@
# Change FROM to python:3.8-alpine to test without the dependencies container
FROM dragonchain/dragonchain_core_dependencies:latest as base
# Install Helm for chart linting and/or yq for doc builds if it doesn't exist
RUN if ! command -v helm; then \
wget -O helm-v2.14.3-linux-amd64.tar.gz 'https://get.helm.sh/helm-v2.14.3-linux-amd64.tar.gz' && \
tar xzf helm-v2.14.3-linux-amd64.tar.gz && mv linux-amd64/helm /usr/local/bin/helm && \
rm -rf helm-v2.14.3-linux-amd64.tar.gz linux-amd64; fi && \
helm init --client-only && \
if ! command -v yq; then \
wget -O yq 'https://github.com/mikefarah/yq/releases/download/2.4.0/yq_linux_amd64' && \
chmod +x yq && mv yq /usr/local/bin/; fi
# Change FROM to amd64/python:3.8-alpine to test without the dependencies container
FROM dragonchain/dragonchain_core_dependencies:linux-amd64-latest
WORKDIR /usr/src/core
# Install necessary base dependencies and set UTC timezone for apscheduler
RUN apk --no-cache upgrade && apk --no-cache add libffi libstdc++ gmp && echo "UTC" > /etc/timezone && apk --no-cache add binutils musl-dev
# apk --no-cache add binutils musl-dev is required for gunicorn 20.0.0 until https://github.com/benoitc/gunicorn/issues/2160 is fixed
RUN apk --no-cache upgrade && apk --no-cache add libffi libstdc++ gmp && echo "UTC" > /etc/timezone
FROM base AS builder
# Install dev build dependencies
RUN apk add g++ make gmp-dev libffi-dev automake autoconf libtool
RUN apk --no-cache add g++ make gmp-dev libffi-dev automake autoconf libtool
# Install python dev dependencies
ENV SECP_BUNDLED_EXPERIMENTAL 1
ENV SECP_BUNDLED_WITH_BIGNUM 1
COPY requirements.txt .
RUN python3 -m pip install -r requirements.txt
RUN python3 -m pip install --no-cache-dir -r requirements.txt
COPY dev_requirements.txt .
RUN python3 -m pip install --upgrade -r dev_requirements.txt
RUN python3 -m pip install --no-cache-dir --upgrade -r dev_requirements.txt
# Install Helm for chart linting and/or yq for doc builds if it doesn't exist
RUN if ! command -v helm; then \
wget -O helm-v3.2.4-linux-amd64.tar.gz 'https://get.helm.sh/helm-v3.2.4-linux-amd64.tar.gz' && \
tar xzf helm-v3.2.4-linux-amd64.tar.gz && mv linux-amd64/helm /usr/local/bin/helm && \
rm -rf helm-v3.2.4-linux-amd64.tar.gz linux-amd64; fi && \
if ! command -v yq; then \
wget -O yq 'https://github.com/mikefarah/yq/releases/download/3.3.2/yq_linux_amd64' && \
chmod +x yq && mv yq /usr/local/bin/; fi
FROM base AS release
# Copy the installed python dependencies from the builder
COPY --from=builder /usr/local/lib/python3.8/site-packages /usr/local/lib/python3.8/site-packages
# Sphinx is needed to build the docs
COPY --from=builder /usr/local/bin/sphinx-build /usr/local/bin/sphinx-build
# Copy our actual application
COPY --chown=1000:1000 . .
RUN chmod 777 .

@@ -0,0 +1,31 @@
# Change FROM to arm64v8/python:3.8-alpine to test without the dependencies container
FROM dragonchain/dragonchain_core_dependencies:linux-arm64-latest
WORKDIR /usr/src/core
# Install necessary base dependencies and set UTC timezone for apscheduler
RUN apk --no-cache upgrade && apk --no-cache add libffi libstdc++ gmp && echo "UTC" > /etc/timezone
# Install dev build dependencies
RUN apk --no-cache add g++ make gmp-dev libffi-dev automake autoconf libtool
# Install python dev dependencies
ENV SECP_BUNDLED_EXPERIMENTAL 1
ENV SECP_BUNDLED_WITH_BIGNUM 1
COPY requirements.txt .
RUN python3 -m pip install --no-cache-dir -r requirements.txt
COPY dev_requirements.txt .
RUN python3 -m pip install --no-cache-dir --upgrade -r dev_requirements.txt
# Install Helm for chart linting and/or yq for doc builds if it doesn't exist
RUN if ! command -v helm; then \
wget -O helm-v3.2.4-linux-arm64.tar.gz 'https://get.helm.sh/helm-v3.2.4-linux-arm64.tar.gz' && \
tar xzf helm-v3.2.4-linux-arm64.tar.gz && mv linux-arm64/helm /usr/local/bin/helm && \
rm -rf helm-v3.2.4-linux-arm64.tar.gz linux-arm64; fi && \
if ! command -v yq; then \
wget -O yq 'https://github.com/mikefarah/yq/releases/download/3.3.2/yq_linux_arm64' && \
chmod +x yq && mv yq /usr/local/bin/; fi
# Copy our actual application
COPY --chown=1000:1000 . .
RUN chmod 777 .
CMD [ "sh", "tools.sh", "full-test" ]

@@ -7,6 +7,8 @@ phases:
commands:
- if [ -z "$STAGE" ]; then echo "STAGE env var is missing"; exit 1; fi
- if [ -z "$AWS_DEFAULT_REGION" ]; then echo "AWS_DEFAULT_VERSION env var is missing"; exit 1; fi
# Enable docker cli experimental features for manifests
- mkdir -p "$HOME/.docker" && echo '{"experimental":"enabled"}' > "$HOME/.docker/config.json"
# Install helm and s3repo plugin
- curl -LO https://git.io/get_helm.sh
- bash get_helm.sh --version v2.14.3
@@ -20,32 +22,39 @@ phases:
commands:
# Run tests before building
- echo Building and running tests
- docker build . -f cicd/Dockerfile.test -t built
- if [ "$ARCHITECTURE" = "amd64" ]; then docker build . -f cicd/Dockerfile.test -t built; fi
- if [ "$ARCHITECTURE" = "arm64" ]; then docker build . -f cicd/Dockerfile.test.arm64 -t built; fi
- docker run -v $(pwd)/docs:/usr/src/core/docs built
- if [ ! -d "./docs/.build/html" ]; then echo "Docs did not build correctly!"; exit 1; fi
build:
commands:
# Package/upload helm chart (if necessary)
- sh cicd/deploy_helm.sh
- if [ "$ARCHITECTURE" = "amd64" ]; then sh cicd/deploy_helm.sh; fi
# Set docker tags
- export VERSION=$(cat .version)
- TAG="381978683274.dkr.ecr.$AWS_DEFAULT_REGION.amazonaws.com/dragonchain_core:$STAGE-$VERSION"
- PUB_TAG_EDGE="dragonchain/dragonchain_core:edge"
- PUB_TAG_LATEST="dragonchain/dragonchain_core:latest"
- PUB_TAG_VERSION="dragonchain/dragonchain_core:$VERSION"
- PUB_TAG_EDGE="dragonchain/dragonchain_core:edge-linux-$ARCHITECTURE"
- PUB_TAG_VERSION="dragonchain/dragonchain_core:$VERSION-linux-$ARCHITECTURE"
# Login to docker repositories
- echo Logging into docker
- $(aws ecr get-login --no-include-email --region us-west-2)
- aws secretsmanager get-secret-value --secret-id dockerHubPassword --query SecretString --output text | docker login -u dragonchain --password-stdin
# Build/tag container
- echo Building and pushing docker containers
- docker build . -t $TAG
- docker tag $TAG $PUB_TAG_LATEST
- if [ "$ARCHITECTURE" = "amd64" ]; then docker build . -f Dockerfile -t "$TAG"; fi
- if [ "$ARCHITECTURE" = "arm64" ]; then docker build . -f Dockerfile.arm64 -t "$TAG"; fi
- docker tag $TAG $PUB_TAG_VERSION
- docker tag $TAG $PUB_TAG_EDGE
# Upload built containers and docs
- docker push $TAG
- docker push $PUB_TAG_EDGE
- sh cicd/deploy_docs.sh
- if [ "$STAGE" = dev ]; then echo Generating Banana Index && jq -c ".message |= \"$(grep -ir banana . | wc -l)\"" cicd/banana-shield.json > shield.json && aws s3 cp shield.json s3://dragonchain-core-docs/banana-shield.json; fi
- if [ "$STAGE" = prod ]; then docker push $PUB_TAG_LATEST && docker push $PUB_TAG_VERSION; fi
- if [ "$ARCHITECTURE" = "amd64" ]; then docker push $TAG; fi
- if [ "$STAGE" = dev ]; then docker push $PUB_TAG_EDGE; fi
- if [ "$ARCHITECTURE" = "amd64" ]; then sh cicd/deploy_docs.sh; fi
- if [ "$STAGE" = dev ] && [ "$ARCHITECTURE" = "amd64" ]; then echo Generating Banana Index && jq -c ".message |= \"$(grep -ir banana . | wc -l)\"" cicd/banana-shield.json > shield.json && aws s3 cp shield.json s3://dragonchain-core-docs/banana-shield.json; fi
- if [ "$STAGE" = prod ]; then docker push $PUB_TAG_VERSION; fi
# Update docker manifests
- rm -rf "$HOME/.docker/manifests/"
- if [ "$STAGE" = dev ]; then if docker manifest create dragonchain/dragonchain_core:edge dragonchain/dragonchain_core:edge-linux-amd64 dragonchain/dragonchain_core:edge-linux-arm64; then docker manifest push dragonchain/dragonchain_core:edge; fi; fi
- if [ "$STAGE" = prod ]; then if docker manifest create dragonchain/dragonchain_core:$VERSION dragonchain/dragonchain_core:$VERSION-linux-amd64 dragonchain/dragonchain_core:$VERSION-linux-arm64; then docker manifest push dragonchain/dragonchain_core:$VERSION; fi; fi
- if [ "$STAGE" = prod ]; then if docker manifest create dragonchain/dragonchain_core:latest dragonchain/dragonchain_core:$VERSION-linux-amd64 dragonchain/dragonchain_core:$VERSION-linux-arm64; then docker manifest push dragonchain/dragonchain_core:latest; fi; fi
# Build dependencies container if necessary
- if [ "$STAGE" = dev ]; then sh scripts/build_dependency_docker.sh; fi

@@ -7,7 +7,6 @@ flake8-builtins
flake8-import-order
flake8-mypy
mypy
mypy-extensions
coverage
bandit
sphinx

@@ -27,7 +27,7 @@ sys.path.insert(0, os.path.abspath(".."))
# -- Project information -----------------------------------------------------
project = "Dragonchain"
copyright = "2019, Dragonchain" # noqa: A001
copyright = "2020, Dragonchain" # noqa: A001
author = "Dragonchain"
# The short X.Y version

@@ -72,7 +72,7 @@ Once the values are set, install the helm chart with:
```sh
helm repo add dragonchain https://dragonchain-charts.s3.amazonaws.com
helm repo update
helm upgrade --install my-dragonchain --values opensource-config.yaml --namespace dragonchain dragonchain/dragonchain-k8s --version 1.0.3
helm upgrade --install my-dragonchain --values opensource-config.yaml --namespace dragonchain dragonchain/dragonchain-k8s --version 1.0.9
```
If you need to change any values AFTER the helm chart has already been

@@ -0,0 +1,56 @@
# Running On A Raspberry Pi
Dragonchain has ARM64 builds of its docker container to support running on ARM
devices such as the raspberry pi.
Although all of the Dragonchain code supports running on ARM, redisearch does
not run on ARM, so any chain with a redisearch will not be able to run on a
raspberry pi.
With that said, verification nodes (L2-5) do not require redisearch and are
deployed without a redisearch by default, so the existing helm chart fully
supports deploying onto a kubernetes cluster running on an ARM machine such as
a raspberry pi.
## Requirements
Currently, because Dragonchain requires kubernetes, and does not yet run on
something lighter weight for a single deployment (such as docker compose), a
kubernetes cluster is required to run Dragonchain.
Running a lightweight kubernetes distribution on a raspberry pi (such as
[k3s](https://k3s.io/) or [microk8s](https://microk8s.io/)) ends up using
around 500MB of RAM, on top of the OS. This means that before Dragonchain is
deployed, around 750MB of RAM is used just by linux/kubernetes.
Unfortunately, this means that Dragonchain will currently only run on a
raspberry pi with **2GB or more of total RAM**. The only devices that currently
support this are the raspberry pi model 4 in either the 2GB or 4GB variant.
Also note that you must install a 64 bit OS onto your raspberry pi. Raspbian
_**is not**_ currently a 64 bit operating system, so installing an alternative
such as
[ubuntu's 64bit raspberry pi distribution](https://ubuntu.com/download/raspberry-pi)
is required.
## Installing Dragonchain
All of the previous docs still apply to deploying a dragonchain on a raspberry
pi, with the exception that only L2+ chains are supported.
Additionally, when deploying the helm chart, some cpu resource limits should
be increased in order to compensate for the lower performance of the device's
CPU.
These suggested limits are provided, commented out, at the bottom of the
available `opensource-config.yaml` from the previous deployment docs.
Alternatively, simply add this flag to your `helm upgrade` or `helm install`
command when installing dragonchain:
```sh
--set cacheredis.resources.limits.cpu=1,persistentredis.resources.limits.cpu=1,webserver.resources.limits.cpu=1,transactionProcessor.resources.limits.cpu=1
```
Beyond that, no other changes should be required in order to have Dragonchain
running on a raspberry pi.

@@ -32,9 +32,13 @@ cluster with the intent to run dragonchain(s).
layer, the relevant ports to your running chain(s) are forwarded
appropriately.
- ~1GB of RAM is required for L2+ chains to run, and ~1.5GB of RAM is
required for L1 chains. This means that the kubernetes node running the chain
(or the VM in the case of minikube) should have at least ~2GB of RAM total.
- ~600MB of RAM is required for L2+ chains to run (~900MB if redisearch is
enabled), and ~1.25GB of RAM is required for L1 chains (or a bit more if
openfaas/docker registry is running on the same machine). This means that the
kubernetes node running the chain (or the VM in the case of minikube) should
have at least ~1.5-2GB of RAM total for an L2+ chain, or ~3GB for an L1 (with
openfaas/docker also running). This is because there is also overhead with
linux and kubernetes itself.
### Recommended

@@ -40,8 +40,16 @@ All of the source code, as well as issue tracker can be viewed `on github <https
deployment/dragonnet
deployment/deploying
deployment/links
deployment/raspberry_pi
deployment/migrating_v4
.. toctree::
:caption: Usage
:maxdepth: 2
usage/authentication
usage/permissioning
.. toctree::
:caption: Components
:maxdepth: 2

@@ -1,18 +1,9 @@
# Community
Dragonchain has a slack for technical discussion and support.
If you need technical support please email info@dragonchain.com.
We highly recommend joining if you want to participate in technical discussion
to try to understand or contribute to dragonchain, or need technical support.
For general Dragonchain (company/business) chat please join us in the [Dragonchain Lair](https://den.social/l/Dragonchain/) on Den.social.
Please note that this slack is for technical purposes only, and not general
Dragonchain (company/business) chat. For that, please use the
[Dragonchain Telegram](https://t.me/dragontalk).
## More Information
## Form
If you are interested in joining this slack, please fill out
[this form](https://forms.gle/ec7sACnfnpLCv6tXA).
After submitting the form, we will review the application and send an
invitation via email.
More information about Dragonchain [can be found here](https://docs.dragonchain.com/).

@@ -0,0 +1,122 @@
# Webserver Authentication
Dragonchain utilizes a custom HMAC-based scheme in order to authenticate
HTTP requests to the api webserver.
## Crafting An Authenticated Request
Please note the following applies to dragonchain auth version `1`, which is the
only supported version at this time.
### Required Elements
In order to create an authenticated HTTP request to Dragonchain, the following
elements are needed:
- Capitalized HTTP verb of the request (i.e. `GET, POST, PUT, DELETE, PATCH`)
- Full path of the request, including query parameters (e.g.
`/v1/path?some=value`)
- Public dragonchain id of the request (to be provided in an HTTP header:
`dragonchain`)
- ISO 8601 UTC timestamp of the request (to be provided in an HTTP header:
`timestamp`)
- The `Content-Type` header of the request (if it exists)
- The actual bytes of the body of the HTTP request (if it exists)
- The auth key id that you are using to perform the HMAC operations
- The auth key itself that is used as the secret in the HMAC operation
- The name of the supported HMAC hashing algorithm you are using (currently
`SHA256`, `BLAKE2b512`, or `SHA3-256`)
- The version of the HMAC authentication scheme that you are using (currently
only `1`)
### Generating the HMAC "signature"
In order to generate the actual HMAC "signature" that is to be sent in the
`Authorization` HTTP header, first assemble the message that you are going to
perform the HMAC operation on.
In order to do this, first take the bytes that will make up your HTTP request
body, and perform a hash on this data (using your desired supported hashing
method which must match the HMAC hash method). Take the result of this hash and
base64 encode it into a simple ascii string.
Note that this step is still _required_ even if your HTTP request does not have
a body. If this is the case, simply perform the above hash digest with no
bytes. For example, if you are using SHA256, then these operations should
result in using `47DEQpj8HBSa+/TImW+5JCeuQeRkm5NMpJWZG3hSuFU=` as the hashed
and base64-encoded (empty) body for every `GET` request (or any other request
without a body).
Now take the following elements and assemble them into a single string
delimited with newlines (`\n`) in this order:
1. Uppercase HTTP Verb
1. Full Request Path
1. Public Dragonchain ID
1. ISO 8601 Timestamp
1. Content-Type header (empty string if it doesn't exist)
1. Base64-encoded hashed HTTP Body (created as described above)
Full HMAC Message Example String:
```text
POST
/v1/transaction-type
294sjLHcCc8dMqMUdFzAnqLmiaCMWmoMTspuuYpSeBMvM
2019-12-04T21:49:49.990Z
application/json
REHbyAdHuzqToafZVXvpIKIGYODingumtIuIZXmAia4=
```
Now UTF-8 encode this string, and perform an HMAC operation on the resulting
encoded bytes, using your auth key as the HMAC secret. Remember that the hash
method you used to hash the HTTP body must match the hash method used for this
HMAC operation.
Take the result of this HMAC operation and base64 encode it to get the
resulting "signature" that will be used in the final HTTP request.
### Assembling the HTTP Request
In order to assemble the authenticated HTTP request to send to Dragonchain,
make sure that the following data is set on the request:
#### Headers
- `timestamp`: Must be set to the ISO 8601 timestamp that was used when
generating the HMAC signature.
If this timestamp is too far off the current time, you will get an
authorization error claiming that the timestamp is too skewed to be
authenticated.
- `dragonchain`: Must be set to the public dragonchain ID that was used when
generating the HMAC signature.
If this ID does not match the ID of the chain that you are calling, then an
authorization error will occur.
- `Content-Type`: Must be set to the value that was used when generating the
HMAC signature. This header can be omitted entirely if an empty string was
used for the content type in the HMAC generation.
- `Authorization`: Must be set to the authentication scheme used
(`DC<version>-HMAC-<algorithm>`), followed by a space and a colon (`:`)
separated string of `auth_key_id_used:hmac_signature`.
For example, if SHA256 was used as the hash/hmac algorithm, and an auth key
with the id `ABCDEF123456` was used, then this full header may look like:
`DC1-HMAC-SHA256 ABCDEF123456:hpbpaheNqGkJlT2OrUNiRtKAXLLs7e4nBKS/xkYNmpI=`
#### Body
Ensure that the HTTP body that you send is the same exact bytes used when
hashing the body for the HMAC signature. If there is a mismatch, authorization
will not work because the HMAC will not match.
#### Path
Ensure that the path of your http request (everything _after_ the fully
qualified domain name and protocol) starting from the initial `/`, and
**including** any query parameters, is exactly what was used when creating the
HMAC signature. If there is a mismatch, the request cannot be authenticated.
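
Assembling and sending the final request could then look like the following
sketch, which reuses the variables from the signature example above. The base
URL is a placeholder, and the `requests` library is just one assumed HTTP
client, not something this scheme requires:

```python
import requests  # assumed third-party HTTP client; any client works

response = requests.post(
    "https://my-chain.example.com" + full_path,  # placeholder host
    data=body,  # must be the exact bytes that were hashed for the signature
    headers={
        "dragonchain": chain_id,
        "timestamp": timestamp,
        "Content-Type": content_type,
        "Authorization": authorization,
    },
)
print(response.status_code, response.text)
```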

docs/usage/permissioning.md (new file)

@@ -0,0 +1,309 @@
# Permissioning
Dragonchain has a built-in permissioning system, allowing specific api keys to
be either allowed or denied for certain dragonchain operations via the
Dragonchain api.
When using these permissions, policy documents on api keys determine which
operations are allowed and forbidden.
## Setting Permissions
In order to set permissions, create or update an api key with a permissions
document (schema explained below).
By default, an api key with no explicitly set permissions will have access to
every endpoint, except the ability to create/update/delete api keys.
Keep in mind that the ability to create or update api keys is effectively a
root-level permission, because keys can be created or modified with elevated
permissions via those endpoints.
### Root API Key
The root api key (in the kubernetes secret when deploying the chain) will
always have permission to use any Dragonchain api endpoint (aside from
endpoints for Dragon Net operations, which are reserved for
dragonchain-to-dragonchain communication).
The root api key cannot be deleted.
## Permissions Document
The permissions document codifies all permissions explicitly allowed or denied
on a given api-key.
The document is a JSON object containing the document version (currently only
`"1"`), a `"default_allow"` boolean, which determines if the permissions should
be treated as a whitelist or a blacklist by default, and a `"permissions"`
object which defines further permissions.
The `"permissions"` object contains 3 parts (in order from most generic to most
specific):
1. The global object which can contain `"allow_create"`, `"allow_read"`,
`"allow_update"`, `"allow_delete"` booleans, as well as api resource
objects.
1. API resource objects which can contain `"allow_create"`, `"allow_read"`,
`"allow_update"`, `"allow_delete"` booleans, as well as specific api
endpoint permission objects.
1. API endpoint permission objects, which can contain a special schema for
allowing or denying a particular api operation on a per-endpoint basis.
In a permissions document, the **_most specific_** (aka most deeply nested)
defined permission is the permission that is followed. This means that if an
endpoint permission object is defined, then that _and only that_ permission is
used to determine if the api key is allowed to perform an operation on that
endpoint. This happens because an endpoint permission object is the most deeply
nested item in a permissions document.
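
As an illustration of that resolution order, the following is a minimal,
hypothetical Python sketch of a most-specific-wins lookup. It models the rule
described above and the document shapes shown on this page; it is not
Dragonchain's actual implementation:

```python
# Hypothetical sketch of most-specific-wins resolution -- not Dragonchain's code.
def is_operation_allowed(doc: dict, resource: str, endpoint: str, op: str) -> bool:
    """Resolve one api operation against a permissions document.

    Precedence (least to most specific): default_allow, global allow_<op>,
    resource allow_<op>, then the per-endpoint permission object.
    """
    allowed = doc.get("default_allow", False)
    permissions = doc.get("permissions", {})
    if f"allow_{op}" in permissions:  # global level
        allowed = permissions[f"allow_{op}"]
    resource_obj = permissions.get(resource, {})
    if f"allow_{op}" in resource_obj:  # api resource level
        allowed = resource_obj[f"allow_{op}"]
    endpoint_obj = resource_obj.get(endpoint)
    if isinstance(endpoint_obj, dict) and "allowed" in endpoint_obj:
        allowed = endpoint_obj["allowed"]  # most deeply nested wins
    return allowed
```

For example, against the first full example document near the bottom of this
page, `is_operation_allowed(doc, "interchains", "create_interchain_transaction", "create")`
resolves to `False` even though `"default_allow"` is `true`, because the
endpoint permission object is the most specific permission defined.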
### Important Privilege Escalation Note
Since creating/modifying/deleting permissions via an api key is a permissioned
action, it is important to explicitly deny api key operations if
create/update/delete permissions were implicitly given elsewhere.
Failure to do so can result in creating an api key, which itself can create a
more-permissioned key, thus leading to privilege escalation.
See the examples at the bottom of this page for more details/examples.
### API Endpoint Schemas
Each api endpoint can be individually turned on or off with an endpoint
permissions object. Most endpoints use a default schema, which is simply an
object with the boolean `"allowed"` that turns that particular endpoint on or
off.
See the table (or custom endpoint list) below to check for custom permissions
on a per-endpoint basis.
### API Resources and Permission Names
The following are the available api resources exposed via the Dragonchain
RESTful API:
- `api_keys` : Operations related to dragonchain api keys
- `blocks` : Operations related to blocks on the chain
- `interchains` : Operations related to interchain (eth, btc, etc) operations
(L1/5 Only)
- `misc` : Miscellaneous Operations (currently only getting status)
- `contracts` : Operations related to dragonchain smart contracts (L1 Only)
- `transaction_types` : Operations related to transaction types (L1 Only)
- `transactions` : Operations related to individual chain transactions (L1
Only)
- `verifications` : Operations related to Dragon Net verifications (L1 Only)
The following are all the available api endpoints for permissioning, along with
their operation type, and whether or not their endpoint permission object has a
custom schema:
| API Resource | Endpoint Name | Operation Type | Endpoint Schema |
|---------------------|----------------------------------------|----------------|-----------------|
| `api_keys` | `create_api_key` | `create` | default |
| `api_keys` | `get_api_key` | `read` | default |
| `api_keys` | `list_api_keys` | `read` | default |
| `api_keys` | `delete_api_key` | `delete` | default |
| `api_keys` | `update_api_key` | `update` | default |
| `blocks` | `get_block` | `read` | default |
| `blocks` | `query_blocks` | `read` | default |
| `interchains` | `create_interchain` | `create` | default |
| `interchains` | `update_interchain` | `update` | default |
| `interchains` | `create_interchain_transaction` | `create` | default |
| `interchains` | `publish_interchain_transaction` | `create` | default |
| `interchains` | `list_interchains` | `read` | default |
| `interchains` | `get_interchain` | `read` | default |
| `interchains` | `delete_interchain` | `delete` | default |
| `interchains` | `get_default_interchain` | `read` | default |
| `interchains` | `set_default_interchain` | `create` | default |
| `interchains` | `get_interchain_legacy` | `read` | default |
| `interchains` | `create_interchain_transaction_legacy` | `create` | default |
| `misc` | `get_status` | `read` | default |
| `contracts` | `get_contract` | `read` | default |
| `contracts` | `get_contract_logs` | `read` | default |
| `contracts` | `list_contracts` | `read` | default |
| `contracts` | `create_contract` | `create` | default |
| `contracts` | `update_contract` | `update` | default |
| `contracts` | `delete_contract` | `delete` | default |
| `contracts` | `get_contract_object` | `read` | default |
| `contracts` | `list_contract_objects` | `read` | default |
| `transaction_types` | `create_transaction_type` | `create` | default |
| `transaction_types` | `delete_transaction_type` | `delete` | default |
| `transaction_types` | `list_transaction_types` | `read` | default |
| `transaction_types` | `get_transaction_type` | `read` | default |
| `transactions` | `create_transaction` | `create` | custom |
| `transactions` | `query_transactions` | `read` | default |
| `transactions` | `get_transaction` | `read` | default |
| `verifications` | `get_verifications` | `read` | default |
| `verifications` | `get_pending_verifications` | `read` | default |
| `verifications` | `query_interchain_verifications` | `read` | default |
### Custom Endpoint Permissions
The following are all of the endpoints with a custom permission schema for more
fine-grained permissioning control on that endpoint.
#### `create_transaction`
This endpoint affects both the regular and bulk create transaction endpoints.
The custom endpoint permissions object for this permission allows an api key
to be allowed or denied permission to create a transaction based on the
transaction type of the transaction(s) that the api call is creating.
##### Schema
The schema for this custom endpoint permission object has the boolean
`"allowed"`, which similar to all other schemas, simply indicates if this
endpoint is enabled or disabled by default.
Additionally, there is the `"transaction_types"` object which defines which
transaction types are allowed (or denied), regardless of all other
permissions (including `"allowed"`).
The `"transaction_types"` object is a simple map of strings to booleans where
the string key is the name of the transaction type, and the boolean is whether
or not to allow the creation of a transaction with that type.
The following example allows all transactions to be created, _except_ for
transactions with the transaction type `honey`. Note that the `"butter": true`
is technically redundant, since the key already implicitly has permission to
create any other transaction type due to the `"allowed": true`.
```text
...
{
"allowed": true,
"transaction_types": {
"honey": false,
"butter": true
}
}
...
```
The following example allows _only_ transactions with the type `banana` to be
created.
```text
...
{
"allowed": false,
"transaction_types": {
"banana": true
}
}
...
```
If `"allowed"` is not defined, then its value is derived from its parent, which
is whether or not it is allowed to perform a `create` operation on the
`transactions` resource.
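
To make that fallback concrete, here is a small hypothetical sketch (again,
not the real implementation) of evaluating this custom endpoint object:

```python
# Hypothetical sketch of the create_transaction custom schema -- not Dragonchain's code.
def can_create_transaction(endpoint_obj: dict, txn_type: str, parent_allowed: bool) -> bool:
    """Per-transaction-type booleans win; otherwise "allowed" applies,
    falling back to the parent create permission when "allowed" is absent."""
    type_map = endpoint_obj.get("transaction_types", {})
    if txn_type in type_map:
        return type_map[txn_type]
    return endpoint_obj.get("allowed", parent_allowed)

# With {"allowed": False, "transaction_types": {"banana": True}},
# only transactions of type "banana" may be created.
```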
### Examples
The following are some examples of a full permissions document object,
explaining what the permissions document is allowing/denying.
---
This is a permissions document that allows all operations on all actions by
default, but globally disables any `delete` abilities, while explicitly
allowing `delete` on interchain operations and explicitly denying creating any
interchain transaction. Additionally, because `"default_allow": true` was set,
it also ensures that creating or updating api keys is not allowed (so as to
avoid privilege escalation).
Note that the `"allow_delete": false` in the `api_keys` resource is technically
redundant, because deletions were already denied at the global level.
Regardless, this is still a valid schema.
```json
{
"version": "1",
"default_allow": true,
"permissions": {
"allow_delete": false,
"interchains": {
"allow_delete": true,
"create_interchain_transaction": {
"allowed": false
}
},
"api_keys": {
"allow_create": false,
"allow_update": false,
"allow_delete": false
}
}
}
```
---
This is a permissions document which disables all actions by default, but
globally allows any `read` operations. Additionally, it allows the creation of
transaction types, and explicitly denies reading any smart contract logs.
```json
{
"version": "1",
"default_allow": false,
"permissions": {
"allow_read": true,
"transaction_types": {
"allow_create": true
},
"contracts": {
"get_contract_logs": {
"allowed": false
}
}
}
}
```
---
This is a permissions document that allows all operations on all actions by
default, but only allows creating transactions with the transaction type:
`banana`. Additionally, it also has disabled creating/updating/deleting api
keys in order to avoid privilege escalation.
```json
{
"version": "1",
"default_allow": true,
"permissions": {
"transactions": {
"create_transaction": {
"allowed": false,
"transaction_types": {
"banana": true
}
}
},
"api_keys": {
"allow_create": false,
"allow_update": false,
"allow_delete": false
}
}
}
```
---
This is a permissions document that allows all operations on all actions by
default. The only difference between this and a root key is that a root key
cannot be deleted.
Note that `"permissions"` with an empty object is still required, even though
no further permissions were defined.
```json
{
"version": "1",
"default_allow": true,
"permissions": {}
}
```
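
Run through the hypothetical `is_operation_allowed` sketch from earlier, such a
document resolves to allowed for every operation, for example:

```python
root_like_doc = {"version": "1", "default_allow": True, "permissions": {}}
# Every lookup falls through to default_allow and returns True
assert is_operation_allowed(root_like_doc, "api_keys", "delete_api_key", "delete")
assert is_operation_allowed(root_like_doc, "blocks", "query_blocks", "read")
```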

@@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:

@@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:

@@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:
@@ -31,6 +31,8 @@ BROADCAST_BLOCK_PREFIX = "broadcast:block"
CLAIM_CHECK_KEY = "broadcast:claimcheck"
NOTIFICATION_KEY = "broadcast:notifications"
STORAGE_FOLDER = "BROADCASTS"
FAULT_TOLERATION = 10 # Number of times fetching verifications fails before rolling back block verification state
HAS_VERIFICATION_NOTIFICATIONS = os.environ.get("VERIFICATION_NOTIFICATION") is not None
@@ -206,7 +208,7 @@ def schedule_notification_for_broadcast_sync(notification_location: str) -> None
def verification_storage_location(l1_block_id: str, level_received_from: int, chain_id: str) -> str:
""" Format the path for the storage of a verification object
"""Format the path for the storage of a verification object
Args:
l1_block_id: the id of the L1 block which this verification verifies
level_received_from: the level from which this verification was received
@@ -342,3 +344,14 @@ async def remove_block_from_broadcast_system_async(block_id: str) -> None:
transaction.hdel(CLAIM_CHECK_KEY, block_id)
await transaction.execute()
async def save_unfinished_claim(block_id: str) -> None:
"""If a claim no longer exists in Dragon Net, but we don't have all the results,
save its id for a potentially later date.
Args:
block_id: The block_id to save and remove from the broadcasting system
"""
storage.put_object_as_json(f"{STORAGE_FOLDER}/UNFINISHED/{block_id}", {"time": time.time()})
await remove_block_from_broadcast_system_async(block_id)

@@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:
@@ -15,29 +15,15 @@
# KIND, either express or implied. See the Apache License for the specific
# language governing permissions and limitations under the Apache License.
import asyncio
import importlib
import unittest
from unittest.mock import patch, MagicMock, call
from unittest.mock import patch, MagicMock, AsyncMock, call, ANY
from dragonchain import test_env # noqa: F401
from dragonchain.broadcast_processor import broadcast_functions
from dragonchain import exceptions
def async_test(coro):
def wrapper(*args, **kwargs):
loop = asyncio.get_event_loop()
return loop.run_until_complete(coro(*args, **kwargs))
return wrapper
class BroadcastFunctionTests(unittest.TestCase):
def tearDown(self):
# Necessary because we modify the class during testing...
importlib.reload(broadcast_functions.redis)
class BroadcastFunctionTests(unittest.IsolatedAsyncioTestCase):
def test_state_key_returns_correct_key(self):
self.assertEqual(broadcast_functions.state_key("id"), "broadcast:block:id:state")
@@ -61,6 +47,7 @@ class BroadcastFunctionTests(unittest.TestCase):
get_sync.assert_called_once_with("key", decode=False)
set_sync.assert_called_once_with("key", "3")
@patch("dragonchain.broadcast_processor.broadcast_functions.redis.pipeline_sync")
@patch("dragonchain.broadcast_processor.broadcast_functions.storage_error_key", return_value="error_key")
@patch("dragonchain.broadcast_processor.broadcast_functions.verifications_key", return_value="verifications_key")
@patch("dragonchain.broadcast_processor.broadcast_functions.state_key", return_value="state_key")
@@ -68,39 +55,35 @@ class BroadcastFunctionTests(unittest.TestCase):
@patch("dragonchain.broadcast_processor.broadcast_functions.redis.smembers_sync", return_value={"abc", "def"})
@patch("dragonchain.broadcast_processor.broadcast_functions.storage.list_objects", return_value=["BLOCK/blah-l2-abc"])
def test_storage_error_rolls_back_state_correctly_when_needed(
self, list_objects, smembers_sync, get_sync, state_key, verifications_key, error_key
self, list_objects, smembers_sync, get_sync, state_key, verifications_key, error_key, mock_pipeline
):
fake_pipeline = MagicMock()
broadcast_functions.redis.pipeline_sync = MagicMock(return_value=fake_pipeline)
mock_pipeline.return_value = fake_pipeline
broadcast_functions.increment_storage_error_sync("blah", 3)
broadcast_functions.redis.pipeline_sync.assert_called_once()
mock_pipeline.assert_called_once()
fake_pipeline.srem.assert_called_once_with("verifications_key", "def")
fake_pipeline.delete.assert_called_once_with("error_key")
fake_pipeline.set.assert_called_once_with("state_key", "2")
fake_pipeline.execute.assert_called_once()
@patch("dragonchain.broadcast_processor.broadcast_functions.redis.z_range_by_score_async", return_value="dummy")
@patch("dragonchain.broadcast_processor.broadcast_functions.time.time", return_value=123)
@async_test
async def test_get_for_process_async(self, mock_time):
broadcast_functions.redis.z_range_by_score_async = MagicMock(return_value=asyncio.Future())
broadcast_functions.redis.z_range_by_score_async.return_value.set_result("dummy")
async def test_get_for_process_async(self, mock_time, mock_zrange):
self.assertEqual(await broadcast_functions.get_blocks_to_process_for_broadcast_async(), "dummy")
mock_time.assert_called_once()
broadcast_functions.redis.z_range_by_score_async.assert_called_once_with("broadcast:in-flight", 0, 123, withscores=True, offset=0, count=1000)
mock_zrange.assert_awaited_once_with("broadcast:in-flight", 0, 123, withscores=True, offset=0, count=1000)
@patch("dragonchain.broadcast_processor.broadcast_functions.redis.get_async", return_value="3")
@patch("dragonchain.broadcast_processor.broadcast_functions.state_key", return_value="key")
@async_test
async def test_get_block_level_async(self, mock_key):
broadcast_functions.redis.get_async = MagicMock(return_value=asyncio.Future())
broadcast_functions.redis.get_async.return_value.set_result("3")
async def test_get_block_level_async(self, mock_key, mock_get):
self.assertEqual(await broadcast_functions.get_current_block_level_async("blah"), 3)
broadcast_functions.redis.get_async.assert_called_once_with("key", decode=False)
mock_get.assert_awaited_once_with("key", decode=False)
@patch("dragonchain.broadcast_processor.broadcast_functions.redis.get_sync", return_value=b"3")
@patch("dragonchain.broadcast_processor.broadcast_functions.state_key", return_value="key")
def test_get_block_level_sync(self, mock_key):
broadcast_functions.redis.get_sync = MagicMock(return_value=b"3")
def test_get_block_level_sync(self, mock_key, mock_get):
self.assertEqual(broadcast_functions.get_current_block_level_sync("blah"), 3)
broadcast_functions.redis.get_sync.assert_called_once_with("key", decode=False)
mock_get.assert_called_once_with("key", decode=False)
@patch("dragonchain.broadcast_processor.broadcast_functions.get_current_block_level_sync", return_value=3)
def test_block_accepting_from_level(self, mock_get_block_level):
@@ -108,42 +91,39 @@ class BroadcastFunctionTests(unittest.TestCase):
mock_get_block_level.assert_called_once_with("blah")
self.assertFalse(broadcast_functions.is_block_accepting_verifications_from_level("blah", 4))
@patch("dragonchain.broadcast_processor.broadcast_functions.redis.set_async")
@patch("dragonchain.broadcast_processor.broadcast_functions.state_key", return_value="key")
@async_test
async def test_set_block_level_async_calls_redis_with_correct_params(self, mock_key):
broadcast_functions.redis.set_async = MagicMock(return_value=asyncio.Future())
broadcast_functions.redis.set_async.return_value.set_result("doesnt matter")
async def test_set_block_level_async_calls_redis_with_correct_params(self, mock_key, mock_set):
await broadcast_functions.set_current_block_level_async("blah", 3)
broadcast_functions.redis.set_async.assert_called_once_with("key", "3")
mock_set.assert_awaited_once_with("key", "3")
@patch("dragonchain.broadcast_processor.broadcast_functions.redis.set_sync")
@patch("dragonchain.broadcast_processor.broadcast_functions.state_key", return_value="key")
def test_set_block_level_sync_calls_redis_with_correct_params(self, mock_key):
broadcast_functions.redis.set_sync = MagicMock(return_value="doesnt matter")
def test_set_block_level_sync_calls_redis_with_correct_params(self, mock_key, mock_set):
broadcast_functions.set_current_block_level_sync("blah", 3)
broadcast_functions.redis.set_sync.assert_called_once_with("key", "3")
mock_set.assert_called_once_with("key", "3")
@async_test
async def test_schedule_block_async_calls_redis_with_correct_params(self):
broadcast_functions.redis.zadd_async = MagicMock(return_value=asyncio.Future())
broadcast_functions.redis.zadd_async.return_value.set_result("doesnt matter")
@patch("dragonchain.broadcast_processor.broadcast_functions.redis.zadd_async")
async def test_schedule_block_async_calls_redis_with_correct_params(self, mock_zadd):
await broadcast_functions.schedule_block_for_broadcast_async("id", 123)
broadcast_functions.redis.zadd_async.assert_called_once_with("broadcast:in-flight", 123, "id")
mock_zadd.assert_awaited_once_with("broadcast:in-flight", 123, "id")
def test_schedule_block_sync_calls_redis_with_correct_params(self):
broadcast_functions.redis.zadd_sync = MagicMock(return_value="doesnt matter")
@patch("dragonchain.broadcast_processor.broadcast_functions.redis.zadd_sync")
def test_schedule_block_sync_calls_redis_with_correct_params(self, mock_zadd):
broadcast_functions.schedule_block_for_broadcast_sync("id", 123)
broadcast_functions.redis.zadd_sync.assert_called_once_with("broadcast:in-flight", {"id": 123})
mock_zadd.assert_called_once_with("broadcast:in-flight", {"id": 123})
@patch("dragonchain.broadcast_processor.broadcast_functions.redis.pipeline_sync")
@patch("dragonchain.broadcast_processor.broadcast_functions.verifications_key", return_value="verification_key")
def test_get_all_verifications_sync(self, mock_verification_key):
def test_get_all_verifications_sync(self, mock_verification_key, mock_pipeline):
fake_pipeline = MagicMock()
fake_pipeline.execute.return_value = [{b"l2chain1", b"l2chain2"}, {b"l3chain"}, {b"l4chain"}, set()]
broadcast_functions.redis.pipeline_sync = MagicMock(return_value=fake_pipeline)
mock_pipeline.return_value = fake_pipeline
result = broadcast_functions.get_all_verifications_for_block_sync("id")
# Check that mocks were called as expected
broadcast_functions.redis.pipeline_sync.assert_called_once()
mock_pipeline.assert_called_once()
smembers_calls = [call("verification_key"), call("verification_key"), call("verification_key"), call("verification_key")]
fake_pipeline.smembers.assert_has_calls(smembers_calls)
fake_pipeline.execute.assert_called_once()
@@ -153,28 +133,27 @@ class BroadcastFunctionTests(unittest.TestCase):
# Check actual result
self.assertEqual(result, [{"l2chain1", "l2chain2"}, {"l3chain"}, {"l4chain"}, set()])
@patch("dragonchain.broadcast_processor.broadcast_functions.redis.smembers_sync", return_value={b"thing"})
@patch("dragonchain.broadcast_processor.broadcast_functions.verifications_key", return_value="key")
def test_get_verifications_sync(self, mock_key):
broadcast_functions.redis.smembers_sync = MagicMock(return_value={b"thing"})
def test_get_verifications_sync(self, mock_key, mock_smembers):
self.assertEqual(broadcast_functions.get_receieved_verifications_for_block_and_level_sync("id", 2), {b"thing"})
broadcast_functions.redis.smembers_sync.assert_called_once_with("key")
mock_smembers.assert_called_once_with("key")
@patch("dragonchain.broadcast_processor.broadcast_functions.redis.smembers_async", return_value={"thing"})
@patch("dragonchain.broadcast_processor.broadcast_functions.verifications_key", return_value="key")
@async_test
async def test_get_verifications_async(self, mock_key):
broadcast_functions.redis.smembers_async = MagicMock(return_value=asyncio.Future())
broadcast_functions.redis.smembers_async.return_value.set_result({"thing"})
async def test_get_verifications_async(self, mock_key, mock_smembers):
self.assertEqual(await broadcast_functions.get_receieved_verifications_for_block_and_level_async("id", 2), {"thing"})
broadcast_functions.redis.smembers_async.assert_called_once_with("key")
mock_smembers.assert_awaited_once_with("key")
@patch("dragonchain.broadcast_processor.broadcast_functions.redis.pipeline_sync")
@patch("dragonchain.broadcast_processor.broadcast_functions.state_key", return_value="state_key")
@patch("dragonchain.broadcast_processor.broadcast_functions.verifications_key", return_value="verification_key")
@patch("dragonchain.broadcast_processor.broadcast_functions.storage_error_key", return_value="storage_error_key")
def test_remove_block_sync_calls_redis_with_correct_deletes(self, mock_error_key, mock_verification_key, mock_state_key):
def test_remove_block_sync_calls_redis_with_correct_deletes(self, mock_error_key, mock_verification_key, mock_state_key, mock_pipeline):
fake_pipeline = MagicMock()
broadcast_functions.redis.pipeline_sync = MagicMock(return_value=fake_pipeline)
mock_pipeline.return_value = fake_pipeline
broadcast_functions.remove_block_from_broadcast_system_sync("id")
broadcast_functions.redis.pipeline_sync.assert_called_once()
mock_pipeline.assert_called_once()
fake_pipeline.zrem.assert_called_once_with("broadcast:in-flight", "id")
fake_pipeline.hdel.assert_called_once_with("broadcast:claimcheck", "id")
delete_calls = [
@@ -191,18 +170,15 @@ class BroadcastFunctionTests(unittest.TestCase):
verification_key_calls = [call("id", 2), call("id", 3), call("id", 4), call("id", 5)]
mock_verification_key.assert_has_calls(verification_key_calls)
@patch("dragonchain.broadcast_processor.broadcast_functions.redis.multi_exec_async")
@patch("dragonchain.broadcast_processor.broadcast_functions.state_key", return_value="state_key")
@patch("dragonchain.broadcast_processor.broadcast_functions.verifications_key", return_value="verification_key")
@patch("dragonchain.broadcast_processor.broadcast_functions.storage_error_key", return_value="storage_error_key")
@async_test
async def test_remove_block_async_calls_redis_with_correct_deletes(self, mock_error_key, mock_verification_key, mock_state_key):
fake_pipeline = MagicMock()
fake_pipeline.execute = MagicMock(return_value=asyncio.Future())
fake_pipeline.execute.return_value.set_result(None)
broadcast_functions.redis.multi_exec_async = MagicMock(return_value=asyncio.Future())
broadcast_functions.redis.multi_exec_async.return_value.set_result(fake_pipeline)
async def test_remove_block_async_calls_redis_with_correct_deletes(self, mock_error_key, mock_verification_key, mock_state_key, mock_multi_exec):
fake_pipeline = MagicMock(execute=AsyncMock())
mock_multi_exec.return_value = fake_pipeline
await broadcast_functions.remove_block_from_broadcast_system_async("id")
broadcast_functions.redis.multi_exec_async.assert_called_once()
mock_multi_exec.assert_awaited_once()
fake_pipeline.zrem.assert_called_once_with("broadcast:in-flight", "id")
fake_pipeline.hdel.assert_called_once_with("broadcast:claimcheck", "id")
delete_calls = [
@@ -214,7 +190,7 @@ class BroadcastFunctionTests(unittest.TestCase):
call("verification_key"),
]
fake_pipeline.delete.assert_has_calls(delete_calls)
fake_pipeline.execute.assert_called_once()
fake_pipeline.execute.assert_awaited_once()
mock_state_key.assert_called_once_with("id")
verification_key_calls = [call("id", 2), call("id", 3), call("id", 4), call("id", 5)]
mock_verification_key.assert_has_calls(verification_key_calls)
@@ -226,20 +202,22 @@ class BroadcastFunctionTests(unittest.TestCase):
exceptions.NotAcceptingVerifications, broadcast_functions.set_receieved_verification_for_block_from_chain_sync, "block_id", 2, "chain_id"
)
@patch("dragonchain.broadcast_processor.broadcast_functions.redis.pipeline_sync")
@patch("dragonchain.broadcast_processor.broadcast_functions.redis.sadd_sync")
@patch("dragonchain.broadcast_processor.broadcast_functions.dragonnet_config.DRAGONNET_CONFIG", {"l2": {"nodesRequired": 3}})
@patch("dragonchain.broadcast_processor.broadcast_functions.verifications_key", return_value="key")
@patch("dragonchain.broadcast_processor.broadcast_functions.get_current_block_level_sync", return_value=2)
def test_set_record_for_block_sync_calls_redis_with_correct_params(self, mock_get_block_level, mock_key, mock_sadd):
def test_set_record_for_block_sync_calls_redis_with_correct_params(self, mock_get_block_level, mock_key, mock_sadd, mock_pipeline):
fake_pipeline = MagicMock()
fake_pipeline.execute = MagicMock(return_value=[1, 2])
broadcast_functions.redis.pipeline_sync = MagicMock(return_value=fake_pipeline)
fake_pipeline.execute.return_value = [1, 2]
mock_pipeline.return_value = fake_pipeline
broadcast_functions.set_receieved_verification_for_block_from_chain_sync("block_id", 2, "chain_id")
broadcast_functions.redis.pipeline_sync.assert_called_once()
mock_pipeline.assert_called_once()
fake_pipeline.sadd.assert_called_once_with("key", "chain_id")
fake_pipeline.scard.assert_called_once_with("key")
fake_pipeline.execute.assert_called_once()
@patch("dragonchain.broadcast_processor.broadcast_functions.redis.pipeline_sync")
@patch("dragonchain.broadcast_processor.broadcast_functions.redis.sadd_sync")
@patch("dragonchain.broadcast_processor.broadcast_functions.dragonnet_config.DRAGONNET_CONFIG", {"l3": {"nodesRequired": 3}})
@patch("dragonchain.broadcast_processor.broadcast_functions.set_current_block_level_sync")
@@ -248,46 +226,57 @@ class BroadcastFunctionTests(unittest.TestCase):
@patch("dragonchain.broadcast_processor.broadcast_functions.get_current_block_level_sync", return_value=3)
@patch("dragonchain.broadcast_processor.broadcast_functions.redis.delete_sync")
def test_set_record_for_block_sync_promotes_when_needed_met(
self, mock_delete_sync, mock_get_block_level, mock_key, mock_schedule, mock_set_block, mock_sadd
self, mock_delete_sync, mock_get_block_level, mock_key, mock_schedule, mock_set_block, mock_sadd, mock_pipeline
):
fake_pipeline = MagicMock()
fake_pipeline.execute = MagicMock(return_value=[1, 3])
broadcast_functions.redis.pipeline_sync = MagicMock(return_value=fake_pipeline)
fake_pipeline.execute.return_value = [1, 3]
mock_pipeline.return_value = fake_pipeline
broadcast_functions.set_receieved_verification_for_block_from_chain_sync("block_id", 3, "chain_id")
mock_set_block.assert_called_once_with("block_id", 4)
mock_schedule.assert_called_once_with("block_id")
@patch("dragonchain.broadcast_processor.broadcast_functions.redis.pipeline_sync")
@patch("dragonchain.broadcast_processor.broadcast_functions.redis.sadd_sync")
@patch("dragonchain.broadcast_processor.broadcast_functions.dragonnet_config.DRAGONNET_CONFIG", {"l5": {"nodesRequired": 3}})
@patch("dragonchain.broadcast_processor.broadcast_functions.remove_block_from_broadcast_system_sync")
@patch("dragonchain.broadcast_processor.broadcast_functions.verifications_key", return_value="key")
@patch("dragonchain.broadcast_processor.broadcast_functions.get_current_block_level_sync", return_value=5)
def test_set_record_for_block_sync_calls_remove_when_required_met_and_level_5(self, mock_get_block_level, mock_key, mock_remove, mock_sadd):
def test_set_record_for_block_sync_calls_remove_when_required_met_and_level_5(
self, mock_get_block_level, mock_key, mock_remove, mock_sadd, mock_pipeline
):
fake_pipeline = MagicMock()
fake_pipeline.execute = MagicMock(return_value=[1, 3])
broadcast_functions.redis.pipeline_sync = MagicMock(return_value=fake_pipeline)
fake_pipeline.execute.return_value = [1, 3]
mock_pipeline.return_value = fake_pipeline
broadcast_functions.set_receieved_verification_for_block_from_chain_sync("block_id", 5, "chain_id")
mock_remove.assert_called_once_with("block_id")
@async_test
async def test_get_notification_verifications_for_broadcast_async(self):
broadcast_functions.redis.smembers_async = MagicMock(return_value=asyncio.Future())
broadcast_functions.redis.smembers_async.return_value.set_result({"thing"})
@patch("dragonchain.broadcast_processor.broadcast_functions.redis.smembers_async", return_value={"thing"})
async def test_get_notification_verifications_for_broadcast_async(self, mock_smembers):
await broadcast_functions.get_notification_verifications_for_broadcast_async()
broadcast_functions.redis.smembers_async.assert_called_once_with("broadcast:notifications")
mock_smembers.assert_awaited_once_with("broadcast:notifications")
@async_test
async def test_remove_notification_verification_for_broadcast_async(self):
broadcast_functions.redis.srem_async = MagicMock(return_value=asyncio.Future())
broadcast_functions.redis.srem_async.return_value.set_result(1)
@patch("dragonchain.broadcast_processor.broadcast_functions.redis.srem_async", return_value=1)
async def test_remove_notification_verification_for_broadcast_async(self, mock_srem):
await broadcast_functions.remove_notification_verification_for_broadcast_async("banana")
broadcast_functions.redis.srem_async.assert_called_once_with("broadcast:notifications", "banana")
mock_srem.assert_awaited_once_with("broadcast:notifications", "banana")
def test_schedule_notification_for_broadcast_sync(self):
broadcast_functions.redis.sadd_sync = MagicMock(return_value=1)
@patch("dragonchain.broadcast_processor.broadcast_functions.redis.sadd_sync", return_value=1)
def test_schedule_notification_for_broadcast_sync(self, mock_sadd):
broadcast_functions.schedule_notification_for_broadcast_sync("banana")
broadcast_functions.redis.sadd_sync.assert_called_once_with("broadcast:notifications", "banana")
mock_sadd.assert_called_once_with("broadcast:notifications", "banana")
def test_verification_storage_location(self):
result = broadcast_functions.verification_storage_location("l1_block_id", 2, "chain_id")
self.assertEqual(result, "BLOCK/l1_block_id-l2-chain_id")
@patch("dragonchain.broadcast_processor.broadcast_functions.remove_block_from_broadcast_system_async")
@patch("dragonchain.broadcast_processor.broadcast_functions.storage.put_object_as_json")
async def test_save_unfinished_claim_writes_to_storage(self, mock_put, mock_remove):
await broadcast_functions.save_unfinished_claim("123")
mock_put.assert_called_once_with("BROADCASTS/UNFINISHED/123", ANY)
@patch("dragonchain.broadcast_processor.broadcast_functions.remove_block_from_broadcast_system_async")
@patch("dragonchain.broadcast_processor.broadcast_functions.storage")
async def test_save_unfinished_claim_removes_claim_from_system(self, mock_storage, mock_remove):
await broadcast_functions.save_unfinished_claim("123")
mock_remove.assert_awaited_once_with("123")

@@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:
@@ -35,19 +35,23 @@ from dragonchain.lib import crypto
from dragonchain import logger
from dragonchain import exceptions
from dragonchain.lib.interfaces import storage
from dragonchain.lib.dto import eth, btc, bnb
BROADCAST = os.environ["BROADCAST"]
LEVEL = os.environ["LEVEL"]
HTTP_REQUEST_TIMEOUT = 30 # seconds
BROADCAST_RECEIPT_WAIT_TIME = 35 # seconds
# TODO Make L5 wait time dynamic
BROADCAST_RECEIPT_WAIT_TIME_L5 = 43200 # seconds
VERIFICATION_NOTIFICATION: Dict[str, List[str]] = {}
if os.environ.get("VERIFICATION_NOTIFICATION") is not None:
VERIFICATION_NOTIFICATION = cast(Dict[str, List[str]], json.loads(os.environ["VERIFICATION_NOTIFICATION"]))
_network_attr = {
"bitcoin": {"confirmations": btc.CONFIRMATIONS_CONSIDERED_FINAL, "block_time": btc.AVERAGE_BLOCK_TIME, "delay_buffer": 3.0},
"ethereum": {"confirmations": eth.CONFIRMATIONS_CONSIDERED_FINAL, "block_time": eth.AVERAGE_BLOCK_TIME, "delay_buffer": 3.0},
"binance": {"confirmations": bnb.CONFIRMATIONS_CONSIDERED_FINAL, "block_time": bnb.AVERAGE_BLOCK_TIME, "delay_buffer": 1.5},
}
_l5_wait_times: Dict[str, int] = {} # dcID: wait in seconds
_log = logger.get_logger()
# For these variables, we are sure to call setup() when initializing this module before using it, so we ignore type error for None
_requirements: dict = {}
@@ -87,6 +91,29 @@ def chain_id_set_from_matchmaking_claim(claim: dict, level: int) -> Set[str]:
return set(claim["validations"][f"l{level}"].keys())
def get_l5_wait_time(chain_id: str) -> int:
if chain_id in _l5_wait_times:
return _l5_wait_times[chain_id]
else:
return set_l5_wait_time(chain_id)
def set_l5_wait_time(chain_id: str) -> int:
try:
mm_config = matchmaking.get_registration(chain_id)
interchain_network = mm_config["network"].split(" ", 1)[0] # first word of network string
broadcast_interval = mm_config["broadcastInterval"] # returns: decimal value in hours
broadcast_interval = int(broadcast_interval * 3600) # converts to int value in seconds
attr = _network_attr[interchain_network]
broadcast_receipt_wait_time_l5 = int(attr["confirmations"] * attr["block_time"] * attr["delay_buffer"]) # in seconds
broadcast_receipt_wait_time_l5 += broadcast_interval
except Exception: # if there is an error when contacting matchmaking
_log.exception(f"[BROADCAST PROCESSOR] Exception when fetching config from matchmaking for chain {chain_id}")
broadcast_receipt_wait_time_l5 = 43200 # seconds (12 hours) [fallback value]
_l5_wait_times[chain_id] = broadcast_receipt_wait_time_l5 # adds to module-level dictionary cache
return broadcast_receipt_wait_time_l5
def make_broadcast_futures(session: aiohttp.ClientSession, block_id: str, level: int, chain_ids: set) -> Optional[Set[asyncio.Task]]:
"""Initiate broadcasts for a block id to certain higher level nodes
Args:
@@ -112,6 +139,8 @@ def make_broadcast_futures(session: aiohttp.ClientSession, block_id: str, level:
headers, data = authorization.generate_authenticated_request("POST", chain, path, broadcast_dto)
if level != 5:
headers["deadline"] = str(BROADCAST_RECEIPT_WAIT_TIME)
else:
headers["deadline"] = str(get_l5_wait_time(chain))
url = f"{matchmaking.get_dragonchain_address(chain)}{path}"
_log.info(f"[BROADCAST PROCESSOR] Firing transaction for {chain} (level {level}) at {url}")
broadcasts.add(asyncio.create_task(session.post(url=url, data=data, headers=headers, timeout=HTTP_REQUEST_TIMEOUT)))
@@ -170,7 +199,7 @@ async def process_verification_notifications(session: aiohttp.ClientSession) ->
async def send_notification_verification(
session: aiohttp.ClientSession, url: str, verification_bytes: bytes, signature: str, redis_list_value: str
) -> None:
""" Send a notification verification to a preconfigured address
"""Send a notification verification to a preconfigured address
This is the actual async broadcast of a single notification at its most atomic
@@ -187,16 +216,19 @@ async def send_notification_verification(
_log.debug(f"Notification -> {url}")
try:
resp = await session.post(
url=url, data=verification_bytes, headers={"dragonchainId": keys.get_public_id(), "signature": signature}, timeout=HTTP_REQUEST_TIMEOUT
url=url,
data=verification_bytes,
headers={"Content-Type": "application/json", "dragonchainId": keys.get_public_id(), "signature": signature},
timeout=HTTP_REQUEST_TIMEOUT,
)
_log.debug(f"Notification <- {resp.status} {url}")
except Exception:
_log.exception(f"Unable to send verification notification.")
_log.exception("Unable to send verification notification.")
await broadcast_functions.remove_notification_verification_for_broadcast_async(redis_list_value)
async def process_blocks_for_broadcast(session: aiohttp.ClientSession) -> None:
async def process_blocks_for_broadcast(session: aiohttp.ClientSession) -> None: # noqa: C901
"""Main function of the broadcast processor
Retrieves blocks that need to be processed, gets matchmaking claims for blocks,
@@ -213,18 +245,31 @@ async def process_blocks_for_broadcast(session: aiohttp.ClientSession) -> None:
for block_id, score in await broadcast_functions.get_blocks_to_process_for_broadcast_async():
_log.info(f"[BROADCAST PROCESSOR] Checking block {block_id}")
current_level = await broadcast_functions.get_current_block_level_async(block_id)
if current_level == -1:
_log.warning(f"Failed to lookup current level for block {block_id}.")
continue
try:
claim: Any = matchmaking.get_or_create_claim_check(block_id, _requirements)
except exceptions.InsufficientFunds:
_log.warning("[BROADCAST PROCESSOR] Out of funds! Will not broadcast anything for 30 minutes")
await asyncio.sleep(1800) # Sleep for 30 minutes if insufficient funds
break
except exceptions.NotFound:
except exceptions.UnableToUpdate:
_log.warning("Matchmaking does not have enough matches to create a claim check")
# Schedule this block for 5 minutes later, so we don't spam matchmaking every second if there aren't matches available
await broadcast_functions.schedule_block_for_broadcast_async(block_id, int(time.time()) + 300)
continue
except exceptions.NotFound:
_log.warning(
f"Matchmaking does not have record of claim for block {block_id}."
"Presumably closed. Saving to unfinished claim and removing from broadcast system."
)
await broadcast_functions.save_unfinished_claim(block_id)
continue
claim_chains = chain_id_set_from_matchmaking_claim(claim, current_level)
if current_level == 5:
chain_id = claim_chains.pop() # 'peek' l5 chain id from set by popping and re-adding
claim_chains.add(chain_id)
if score == 0:
# If this block hasn't been broadcast at this level before (score is 0)
_log.info(f"[BROADCAST PROCESSOR] Block {block_id} Level {current_level} not broadcasted yet. Broadcasting to all chains in claim")
@@ -236,7 +281,7 @@ async def process_blocks_for_broadcast(session: aiohttp.ClientSession) -> None:
request_futures.update(futures)
# Schedule this block to be re-checked after BROADCAST_RECEIPT_WAIT_TIME more seconds have passed
await broadcast_functions.schedule_block_for_broadcast_async(
block_id, int(time.time()) + (BROADCAST_RECEIPT_WAIT_TIME if current_level != 5 else BROADCAST_RECEIPT_WAIT_TIME_L5)
block_id, int(time.time()) + (BROADCAST_RECEIPT_WAIT_TIME if current_level != 5 else get_l5_wait_time(chain_id))
)
else:
# Block has been broadcast at this level before. Figure out which chains didn't respond in time
@@ -247,26 +292,33 @@ async def process_blocks_for_broadcast(session: aiohttp.ClientSession) -> None:
_log.info(f"[BROADCAST PROCESSOR] Chain {chain} didn't respond to broadcast in time. Fetching new chain")
try:
claim = matchmaking.overwrite_no_response_node(block_id, current_level, chain)
except exceptions.NotFound:
except exceptions.UnableToUpdate:
_log.warning(f"Matchmaking does not have enough matches to update this claim check with new chains for level {current_level}")
# Schedule for 5 minutes later, so we don't spam matchmaking every second if there aren't matches
await broadcast_functions.schedule_block_for_broadcast_async(block_id, int(time.time()) + 300)
claim = None
break
except exceptions.NotFound:
_log.warning(
f"Matchmaking does not have record of claim for block {block_id}."
"Presumably closed. Saving to unfinished claim and removing from broadcast system."
)
await broadcast_functions.save_unfinished_claim(block_id)
claim = None
break
# Can't continue processing this block if the claim wasn't updated
if claim is None:
# Schedule for 5 minutes later, so we don't spam matchmaking every second if there aren't matches
await broadcast_functions.schedule_block_for_broadcast_async(block_id, int(time.time()) + 300)
continue
new_claim_chains = chain_id_set_from_matchmaking_claim(claim, current_level)
# Make requests for all the new chains
futures = make_broadcast_futures(session, block_id, current_level, new_claim_chains.difference(current_verifications))
if (
futures is None
): # This occurs when make_broadcast_futures failed to create the broadcast dto (we need to process this block later)
if futures is None:
# This occurs when make_broadcast_futures failed to create the broadcast dto (we need to process this block later)
continue
request_futures.update(futures)
# Schedule this block to be re-checked after BROADCAST_RECEIPT_WAIT_TIME more seconds have passed
await broadcast_functions.schedule_block_for_broadcast_async(
block_id, int(time.time()) + (BROADCAST_RECEIPT_WAIT_TIME if current_level != 5 else BROADCAST_RECEIPT_WAIT_TIME_L5)
block_id, int(time.time()) + (BROADCAST_RECEIPT_WAIT_TIME if current_level != 5 else get_l5_wait_time(chain_id))
)
else:
if current_level >= 5:

@@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:
@@ -18,22 +18,14 @@
import importlib
import asyncio
import unittest
from unittest.mock import patch, MagicMock
from unittest.mock import patch, MagicMock, AsyncMock
from dragonchain import test_env # noqa: F401
from dragonchain.broadcast_processor import broadcast_processor
from dragonchain import exceptions
def async_test(coro):
def wrapper(*args, **kwargs):
loop = asyncio.get_event_loop()
return loop.run_until_complete(coro(*args, **kwargs))
return wrapper
class BroadcastProcessorTests(unittest.TestCase):
class BroadcastProcessorTests(unittest.IsolatedAsyncioTestCase):
def setUp(self):
importlib.reload(broadcast_processor)
broadcast_processor.BROADCAST = "true"
@@ -87,6 +79,34 @@ class BroadcastProcessorTests(unittest.TestCase):
urls = broadcast_processor.get_notification_urls("all")
self.assertEqual(urls, {"url1"})
@patch(
"dragonchain.broadcast_processor.broadcast_processor.matchmaking.get_registration",
return_value={"network": "bitcoin mainnet", "broadcastInterval": 1.23},
)
def test_set_l5_wait_time_success(self, mock_get_rego):
self.assertEqual(broadcast_processor.set_l5_wait_time("chainid"), 15228) # (600 * 6 * 3) + ((1.23 * 60) *60)
mock_get_rego.assert_called_once_with("chainid")
@patch("dragonchain.broadcast_processor.broadcast_processor.matchmaking.get_registration", return_value={"fruit": "banana"})
def test_set_l5_wait_time_throws_exception(self, mock_get_rego):
self.assertEqual(broadcast_processor.set_l5_wait_time("chainid"), 43200) # hardcoded fallback value
mock_get_rego.assert_called_once_with("chainid")
@patch.dict("dragonchain.broadcast_processor.broadcast_processor._l5_wait_times", {"banana": 123})
@patch("dragonchain.broadcast_processor.broadcast_processor.matchmaking.get_registration")
def test_get_l5_wait_time_is_cached(self, mock_get_rego):
self.assertEqual(broadcast_processor.get_l5_wait_time("banana"), 123)
mock_get_rego.assert_not_called()
@patch.dict("dragonchain.broadcast_processor.broadcast_processor._l5_wait_times", {})
@patch(
"dragonchain.broadcast_processor.broadcast_processor.matchmaking.get_registration",
return_value={"network": "bitcoin mainnet", "broadcastInterval": 1.23},
)
def test_get_l5_wait_time_not_cached(self, mock_get_rego):
self.assertEqual(broadcast_processor.get_l5_wait_time("chainid"), 15228)
mock_get_rego.assert_called_once_with("chainid")
@patch("dragonchain.broadcast_processor.broadcast_processor.block_dao.get_broadcast_dto")
def test_broadcast_futures_gets_broadcast_dto_for_block_id(self, patch_get_broadcast):
broadcast_processor.make_broadcast_futures(None, "id", 3, set())
@@ -122,12 +142,19 @@ class BroadcastProcessorTests(unittest.TestCase):
"dragonchain.broadcast_processor.broadcast_processor.authorization.generate_authenticated_request",
return_value=({"header": "thing"}, b"some data"),
)
def test_broadcast_futures_doesnt_set_deadline_header_for_l5(self, mock_gen_request, mock_get_address, mock_create_task, mock_dto):
@patch(
"dragonchain.broadcast_processor.broadcast_processor.matchmaking.get_registration",
return_value={"network": "bitcoin mainnet", "broadcastInterval": 1.23},
)
def test_broadcast_futures_sets_deadline_header_for_l5(self, mock_get_rego, mock_gen_request, mock_get_address, mock_create_task, mock_dto):
fake_session = MagicMock()
fake_session.post = MagicMock(return_value="session_request")
broadcast_processor.make_broadcast_futures(fake_session, "block_id", 5, {"chain_id"})
fake_session.post.assert_called_once_with(
url="addr/v1/enqueue", data=b"some data", headers={"header": "thing"}, timeout=broadcast_processor.HTTP_REQUEST_TIMEOUT
url="addr/v1/enqueue",
data=b"some data",
headers={"header": "thing", "deadline": "15228"},
timeout=broadcast_processor.HTTP_REQUEST_TIMEOUT,
)
@patch("dragonchain.broadcast_processor.broadcast_processor.block_dao.get_broadcast_dto", return_value="dto")
@@ -145,76 +172,61 @@ class BroadcastProcessorTests(unittest.TestCase):
mock_increment_error.assert_called_once_with("block_id", 2)
@patch("dragonchain.broadcast_processor.broadcast_processor.asyncio.gather", return_value=asyncio.Future())
@patch(
"dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.get_blocks_to_process_for_broadcast_async",
return_value=asyncio.Future(),
)
@async_test
@patch("dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.get_blocks_to_process_for_broadcast_async", return_value=[])
async def test_process_blocks_gets_blocks_for_broadcast(self, mock_get_blocks, mock_gather):
mock_gather.return_value.set_result(None)
mock_get_blocks.return_value.set_result([])
await broadcast_processor.process_blocks_for_broadcast(None)
mock_get_blocks.assert_called_once()
mock_get_blocks.assert_awaited_once()
@patch("dragonchain.broadcast_processor.broadcast_processor.matchmaking.get_or_create_claim_check", return_value="claim")
@patch(
"dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.schedule_block_for_broadcast_async", return_value=asyncio.Future()
"dragonchain.broadcast_processor.broadcast_processor.matchmaking.get_or_create_claim_check",
return_value={"metadata": {"dcId": "banana-dc-id"}},
)
@patch("dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.schedule_block_for_broadcast_async")
@patch("dragonchain.broadcast_processor.broadcast_processor.make_broadcast_futures")
@patch("dragonchain.broadcast_processor.broadcast_processor.chain_id_set_from_matchmaking_claim")
@patch("dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.get_current_block_level_async", return_value=asyncio.Future())
@patch("dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.get_current_block_level_async", return_value=2)
@patch("dragonchain.broadcast_processor.broadcast_processor.asyncio.gather", return_value=asyncio.Future())
@patch(
"dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.get_blocks_to_process_for_broadcast_async",
return_value=asyncio.Future(),
return_value=[("block_id", 0)],
)
@async_test
async def test_process_blocks_calls_matchmaking_for_claims(
self, mock_get_blocks, mock_gather, mock_get_block_level, mock_chain_id_set, mock_get_futures, mock_schedule_broadcast, mock_claim
):
mock_schedule_broadcast.return_value.set_result(None)
mock_get_block_level.return_value.set_result(2)
mock_gather.return_value.set_result(None)
mock_get_blocks.return_value.set_result([("block_id", 0)])
await broadcast_processor.process_blocks_for_broadcast(None)
mock_claim.assert_called_once_with("block_id", broadcast_processor._requirements)
mock_chain_id_set.assert_called_once_with("claim", 2)
mock_chain_id_set.assert_called_once_with({"metadata": {"dcId": "banana-dc-id"}}, 2)
@patch("dragonchain.broadcast_processor.broadcast_processor.matchmaking.get_or_create_claim_check", side_effect=exceptions.InsufficientFunds)
@patch("dragonchain.broadcast_processor.broadcast_processor.asyncio.sleep", return_value=asyncio.Future())
@patch("dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.get_current_block_level_async", return_value=asyncio.Future())
@patch("dragonchain.broadcast_processor.broadcast_processor.asyncio.sleep")
@patch("dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.get_current_block_level_async", return_value=None)
@patch("dragonchain.broadcast_processor.broadcast_processor.asyncio.gather", return_value=asyncio.Future())
@patch(
"dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.get_blocks_to_process_for_broadcast_async",
return_value=asyncio.Future(),
return_value=[("block_id", 0)],
)
@async_test
async def test_process_blocks_sleeps_with_insufficient_funds(self, mock_get_blocks, mock_gather, mock_get_block_level, mock_sleep, mock_claim):
mock_sleep.return_value.set_result(None)
mock_get_block_level.return_value.set_result(None)
mock_gather.return_value.set_result(None)
mock_get_blocks.return_value.set_result([("block_id", 0)])
await broadcast_processor.process_blocks_for_broadcast(None)
mock_sleep.assert_called_once_with(1800)
mock_sleep.assert_awaited_once_with(1800)
@patch("dragonchain.broadcast_processor.broadcast_processor.matchmaking.get_or_create_claim_check")
@patch("dragonchain.broadcast_processor.broadcast_processor.time.time", return_value=123)
@patch(
"dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.get_receieved_verifications_for_block_and_level_async",
return_value=asyncio.Future(),
)
@patch(
"dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.schedule_block_for_broadcast_async", return_value=asyncio.Future()
return_value=None,
)
@patch("dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.schedule_block_for_broadcast_async")
@patch("dragonchain.broadcast_processor.broadcast_processor.make_broadcast_futures")
@patch("dragonchain.broadcast_processor.broadcast_processor.chain_id_set_from_matchmaking_claim", return_value={"chain_id"})
@patch("dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.get_current_block_level_async", return_value=asyncio.Future())
@patch("dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.get_current_block_level_async", return_value=2)
@patch("dragonchain.broadcast_processor.broadcast_processor.asyncio.gather", return_value=asyncio.Future())
@patch(
"dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.get_blocks_to_process_for_broadcast_async",
return_value=asyncio.Future(),
return_value=[("block_id", 0)],
)
@async_test
async def test_process_blocks_fires_requests_and_reschedules_for_new_block(
self,
mock_get_blocks,
@@ -227,60 +239,47 @@ class BroadcastProcessorTests(unittest.TestCase):
mock_time,
mock_claim,
):
mock_get_verifications.return_value.set_result(None)
mock_schedule_broadcast.return_value.set_result(None)
mock_get_block_level.return_value.set_result(2)
mock_gather.return_value.set_result(None)
mock_get_blocks.return_value.set_result([("block_id", 0)])
await broadcast_processor.process_blocks_for_broadcast(None)
mock_get_futures.assert_called_once_with(None, "block_id", 2, {"chain_id"})
mock_schedule_broadcast.assert_called_once_with("block_id", 123 + broadcast_processor.BROADCAST_RECEIPT_WAIT_TIME)
mock_schedule_broadcast.assert_awaited_once_with("block_id", 123 + broadcast_processor.BROADCAST_RECEIPT_WAIT_TIME)
mock_gather.assert_called_once_with(return_exceptions=True)
mock_get_verifications.assert_not_called()
@patch("dragonchain.broadcast_processor.broadcast_processor.matchmaking.get_or_create_claim_check")
@patch(
"dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.schedule_block_for_broadcast_async", return_value=asyncio.Future()
)
@patch("dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.schedule_block_for_broadcast_async")
@patch("dragonchain.broadcast_processor.broadcast_processor.make_broadcast_futures", return_value=None)
@patch("dragonchain.broadcast_processor.broadcast_processor.chain_id_set_from_matchmaking_claim", return_value={"chain_id"})
@patch("dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.get_current_block_level_async", return_value=asyncio.Future())
@patch("dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.get_current_block_level_async", return_value=2)
@patch("dragonchain.broadcast_processor.broadcast_processor.asyncio.gather", return_value=asyncio.Future())
@patch(
"dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.get_blocks_to_process_for_broadcast_async",
return_value=asyncio.Future(),
return_value=[("block_id", 0)],
)
@async_test
async def test_process_blocks_doesnt_reschedule_new_block_which_failed_had_no_futures(
self, mock_get_blocks, mock_gather, mock_get_block_level, mock_chain_id_set, mock_get_futures, mock_schedule_broadcast, mock_claim
):
mock_schedule_broadcast.return_value.set_result(None)
mock_get_block_level.return_value.set_result(2)
mock_gather.return_value.set_result(None)
mock_get_blocks.return_value.set_result([("block_id", 0)])
await broadcast_processor.process_blocks_for_broadcast(None)
mock_get_futures.assert_called_once()
mock_schedule_broadcast.assert_not_called()
@patch("dragonchain.broadcast_processor.broadcast_processor.matchmaking.get_or_create_claim_check")
@patch("dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.set_current_block_level_async", return_value=asyncio.Future())
@patch("dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.set_current_block_level_async")
@patch("dragonchain.broadcast_processor.broadcast_processor.needed_verifications", return_value=0)
@patch(
"dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.get_receieved_verifications_for_block_and_level_async",
return_value=asyncio.Future(),
)
@patch(
"dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.schedule_block_for_broadcast_async", return_value=asyncio.Future()
return_value={"verification"},
)
@patch("dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.schedule_block_for_broadcast_async")
@patch("dragonchain.broadcast_processor.broadcast_processor.make_broadcast_futures")
@patch("dragonchain.broadcast_processor.broadcast_processor.chain_id_set_from_matchmaking_claim", return_value={"chain_id"})
@patch("dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.get_current_block_level_async", return_value=asyncio.Future())
@patch("dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.get_current_block_level_async", return_value=2)
@patch("dragonchain.broadcast_processor.broadcast_processor.asyncio.gather", return_value=asyncio.Future())
@patch(
"dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.get_blocks_to_process_for_broadcast_async",
return_value=asyncio.Future(),
return_value=[("block_id", 1)],
)
@async_test
async def test_process_blocks_promotes_block_with_enough_verifications(
self,
mock_get_blocks,
@@ -294,15 +293,10 @@ class BroadcastProcessorTests(unittest.TestCase):
mock_set_block_level,
mock_claim_check,
):
mock_set_block_level.return_value.set_result(None)
mock_get_verifications.return_value.set_result({"verification"})
mock_schedule_broadcast.return_value.set_result(None)
mock_get_block_level.return_value.set_result(2)
mock_gather.return_value.set_result(None)
mock_get_blocks.return_value.set_result([("block_id", 1)])
await broadcast_processor.process_blocks_for_broadcast(None)
mock_set_block_level.assert_called_once_with("block_id", 3)
mock_schedule_broadcast.assert_called_once_with("block_id")
mock_set_block_level.assert_awaited_once_with("block_id", 3)
mock_schedule_broadcast.assert_awaited_once_with("block_id")
@patch("dragonchain.broadcast_processor.broadcast_processor.matchmaking.get_or_create_claim_check")
@patch(
@@ -312,17 +306,16 @@ class BroadcastProcessorTests(unittest.TestCase):
@patch("dragonchain.broadcast_processor.broadcast_processor.needed_verifications", return_value=0)
@patch(
"dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.get_receieved_verifications_for_block_and_level_async",
return_value=asyncio.Future(),
return_value={"verification"},
)
@patch("dragonchain.broadcast_processor.broadcast_processor.make_broadcast_futures")
@patch("dragonchain.broadcast_processor.broadcast_processor.chain_id_set_from_matchmaking_claim", return_value={"chain_id"})
@patch("dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.get_current_block_level_async", return_value=asyncio.Future())
@patch("dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.get_current_block_level_async", return_value=5)
@patch("dragonchain.broadcast_processor.broadcast_processor.asyncio.gather", return_value=asyncio.Future())
@patch(
"dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.get_blocks_to_process_for_broadcast_async",
return_value=asyncio.Future(),
return_value=[("block_id", 1)],
)
@async_test
async def test_process_blocks_removes_l5_block_with_enough_verifications(
self,
mock_get_blocks,
@@ -335,11 +328,8 @@ class BroadcastProcessorTests(unittest.TestCase):
mock_remove_block,
mock_claim_check,
):
mock_get_verifications.return_value.set_result({"verification"})
mock_remove_block.return_value.set_result(None)
mock_get_block_level.return_value.set_result(5)
mock_gather.return_value.set_result(None)
mock_get_blocks.return_value.set_result([("block_id", 1)])
mock_remove_block.return_value.set_result(None)
await broadcast_processor.process_blocks_for_broadcast(None)
mock_remove_block.assert_called_once_with("block_id")
@@ -348,20 +338,17 @@ class BroadcastProcessorTests(unittest.TestCase):
@patch("dragonchain.broadcast_processor.broadcast_processor.needed_verifications", return_value=3)
@patch(
"dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.get_receieved_verifications_for_block_and_level_async",
return_value=asyncio.Future(),
)
@patch(
"dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.schedule_block_for_broadcast_async", return_value=asyncio.Future()
return_value={"verification"},
)
@patch("dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.schedule_block_for_broadcast_async")
@patch("dragonchain.broadcast_processor.broadcast_processor.make_broadcast_futures")
@patch("dragonchain.broadcast_processor.broadcast_processor.chain_id_set_from_matchmaking_claim", return_value={"chain_id"})
@patch("dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.get_current_block_level_async", return_value=asyncio.Future())
@patch("dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.get_current_block_level_async", return_value=2)
@patch("dragonchain.broadcast_processor.broadcast_processor.asyncio.gather", return_value=asyncio.Future())
@patch(
"dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.get_blocks_to_process_for_broadcast_async",
return_value=asyncio.Future(),
return_value=[("block_id", 1)],
)
@async_test
async def test_process_blocks_updates_matchmaking_claim_for_new_chain_verification(
self,
mock_get_blocks,
@@ -375,34 +362,30 @@ class BroadcastProcessorTests(unittest.TestCase):
mock_no_response_node,
mock_claim_check,
):
mock_get_verifications.return_value.set_result({"verification"})
mock_schedule_broadcast.return_value.set_result(None)
mock_get_block_level.return_value.set_result(2)
mock_gather.return_value.set_result(None)
mock_get_blocks.return_value.set_result([("block_id", 1)])
await broadcast_processor.process_blocks_for_broadcast(None)
mock_no_response_node.assert_called_once_with("block_id", 2, "chain_id")
@patch("dragonchain.broadcast_processor.broadcast_processor.matchmaking.get_or_create_claim_check")
@patch(
"dragonchain.broadcast_processor.broadcast_processor.matchmaking.get_or_create_claim_check",
return_value={"metadata": {"dcId": "banana-dc-id"}},
)
@patch("dragonchain.broadcast_processor.broadcast_processor.matchmaking.overwrite_no_response_node", return_value={"verification"})
@patch("dragonchain.broadcast_processor.broadcast_processor.time.time", return_value=123)
@patch("dragonchain.broadcast_processor.broadcast_processor.needed_verifications", return_value=3)
@patch(
"dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.get_receieved_verifications_for_block_and_level_async",
return_value=asyncio.Future(),
)
@patch(
"dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.schedule_block_for_broadcast_async", return_value=asyncio.Future()
return_value={"verification"},
)
@patch("dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.schedule_block_for_broadcast_async")
@patch("dragonchain.broadcast_processor.broadcast_processor.make_broadcast_futures")
@patch("dragonchain.broadcast_processor.broadcast_processor.chain_id_set_from_matchmaking_claim", return_value={"chain_id"})
@patch("dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.get_current_block_level_async", return_value=asyncio.Future())
@patch("dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.get_current_block_level_async", return_value=2)
@patch("dragonchain.broadcast_processor.broadcast_processor.asyncio.gather", return_value=asyncio.Future())
@patch(
"dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.get_blocks_to_process_for_broadcast_async",
return_value=asyncio.Future(),
return_value=[("block_id", 1)],
)
@async_test
async def test_process_blocks_makes_broadcast_and_reschedules_block_when_sending_new_requests(
self,
mock_get_blocks,
@@ -417,14 +400,10 @@ class BroadcastProcessorTests(unittest.TestCase):
mock_no_response_node,
mock_claim_check,
):
mock_get_verifications.return_value.set_result({"verification"})
mock_schedule_broadcast.return_value.set_result(None)
mock_get_block_level.return_value.set_result(2)
mock_gather.return_value.set_result(None)
mock_get_blocks.return_value.set_result([("block_id", 1)])
await broadcast_processor.process_blocks_for_broadcast(None)
mock_get_futures.assert_called_once_with(None, "block_id", 2, {"chain_id"})
mock_schedule_broadcast.assert_called_once_with("block_id", 123 + broadcast_processor.BROADCAST_RECEIPT_WAIT_TIME)
mock_schedule_broadcast.assert_awaited_once_with("block_id", 123 + broadcast_processor.BROADCAST_RECEIPT_WAIT_TIME)
mock_gather.assert_called_once_with(return_exceptions=True)
@patch("dragonchain.broadcast_processor.broadcast_processor.matchmaking.get_or_create_claim_check")
@@ -432,20 +411,17 @@ class BroadcastProcessorTests(unittest.TestCase):
@patch("dragonchain.broadcast_processor.broadcast_processor.needed_verifications", return_value=3)
@patch(
"dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.get_receieved_verifications_for_block_and_level_async",
return_value=asyncio.Future(),
)
@patch(
"dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.schedule_block_for_broadcast_async", return_value=asyncio.Future()
return_value={"verification"},
)
@patch("dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.schedule_block_for_broadcast_async")
@patch("dragonchain.broadcast_processor.broadcast_processor.make_broadcast_futures", return_value=None)
@patch("dragonchain.broadcast_processor.broadcast_processor.chain_id_set_from_matchmaking_claim", return_value={"chain_id"})
@patch("dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.get_current_block_level_async", return_value=asyncio.Future())
@patch("dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.get_current_block_level_async", return_value=2)
@patch("dragonchain.broadcast_processor.broadcast_processor.asyncio.gather", return_value=asyncio.Future())
@patch(
"dragonchain.broadcast_processor.broadcast_processor.broadcast_functions.get_blocks_to_process_for_broadcast_async",
return_value=asyncio.Future(),
return_value=[("block_id", 1)],
)
@async_test
async def test_process_blocks_doesnt_reschedule_existing_block_which_failed_had_no_futures(
self,
mock_get_blocks,
@@ -459,53 +435,51 @@ class BroadcastProcessorTests(unittest.TestCase):
mock_no_response_node,
mock_claim_check,
):
mock_get_verifications.return_value.set_result({"verification"})
mock_schedule_broadcast.return_value.set_result(None)
mock_get_block_level.return_value.set_result(2)
mock_gather.return_value.set_result(None)
mock_get_blocks.return_value.set_result([("block_id", 1)])
await broadcast_processor.process_blocks_for_broadcast(None)
mock_get_futures.assert_called_once()
mock_schedule_broadcast.assert_not_called()
@patch("dragonchain.broadcast_processor.broadcast_processor.VERIFICATION_NOTIFICATION", {"all": ["url1"]})
@patch("dragonchain.broadcast_processor.broadcast_functions.get_notification_verifications_for_broadcast_async", return_value=asyncio.Future())
@patch(
"dragonchain.broadcast_processor.broadcast_functions.get_notification_verifications_for_broadcast_async",
return_value={"BLOCK/banana-l2-whatever"},
)
@patch("dragonchain.broadcast_processor.broadcast_processor.sign", return_value="my-signature")
@patch("dragonchain.broadcast_processor.broadcast_processor.storage.get", return_value=b"location-object-bytes")
@patch("dragonchain.broadcast_processor.broadcast_processor.keys.get_public_id", return_value="my-public-id")
@patch("dragonchain.broadcast_processor.broadcast_functions.redis.srem_async", return_value=asyncio.Future())
@async_test
@patch("dragonchain.broadcast_processor.broadcast_functions.redis.srem_async", return_value="OK")
async def test_process_verification_notification_calls_configured_url(
self, srem_mock, public_id_mock, storage_get_mock, sign_mock, get_location_mock
):
get_location_mock.return_value.set_result(["BLOCK/banana-l2-whatever"])
mock = MagicMock(return_value=asyncio.Future())
mock.return_value.set_result(MagicMock(status=200))
fake_session = MagicMock(post=mock)
srem_mock.return_value.set_result("OK")
broadcast_processor.VERIFICATION_NOTIFICATION = {"all": ["url1"]}
fake_session = AsyncMock(post=AsyncMock())
await broadcast_processor.process_verification_notifications(fake_session)
fake_session.post.assert_called_once_with(
data=b"location-object-bytes", headers={"dragonchainId": "my-public-id", "signature": "my-signature"}, timeout=30, url="url1"
fake_session.post.assert_awaited_once_with(
data=b"location-object-bytes",
headers={"Content-Type": "application/json", "dragonchainId": "my-public-id", "signature": "my-signature"},
timeout=30,
url="url1",
)
srem_mock.assert_called_once_with("broadcast:notifications", "BLOCK/banana-l2-whatever")
srem_mock.assert_awaited_once_with("broadcast:notifications", "BLOCK/banana-l2-whatever")
@patch("dragonchain.broadcast_processor.broadcast_processor.VERIFICATION_NOTIFICATION", {"all": ["url1"]})
@patch("dragonchain.broadcast_processor.broadcast_functions.get_notification_verifications_for_broadcast_async", return_value=asyncio.Future())
@patch(
"dragonchain.broadcast_processor.broadcast_functions.get_notification_verifications_for_broadcast_async",
return_value={"BLOCK/banana-l2-whatever"},
)
@patch("dragonchain.broadcast_processor.broadcast_processor.sign", return_value="my-signature")
@patch("dragonchain.broadcast_processor.broadcast_processor.storage.get", return_value=b"location-object-bytes")
@patch("dragonchain.broadcast_processor.broadcast_processor.keys.get_public_id", return_value="my-public-id")
@patch("dragonchain.broadcast_processor.broadcast_functions.redis.srem_async", return_value=asyncio.Future())
@async_test
@patch("dragonchain.broadcast_processor.broadcast_functions.redis.srem_async", return_value="OK")
async def test_process_verification_notification_removes_from_set_when_fail(
self, srem_mock, public_id_mock, storage_get_mock, sign_mock, get_location_mock
):
get_location_mock.return_value.set_result(["BLOCK/banana-l2-whatever"])
mock = MagicMock(side_effect=Exception("boom"))
mock.return_value.set_result(MagicMock(status=200))
fake_session = MagicMock(post=mock)
srem_mock.return_value.set_result("OK")
broadcast_processor.VERIFICATION_NOTIFICATION = {"all": ["url1"]}
fake_session = AsyncMock(post=AsyncMock(side_effect=Exception("boom")))
await broadcast_processor.process_verification_notifications(fake_session)
fake_session.post.assert_called_once_with(
data=b"location-object-bytes", headers={"dragonchainId": "my-public-id", "signature": "my-signature"}, timeout=30, url="url1"
data=b"location-object-bytes",
headers={"Content-Type": "application/json", "dragonchainId": "my-public-id", "signature": "my-signature"},
timeout=30,
url="url1",
)
srem_mock.assert_called_once_with("broadcast:notifications", "BLOCK/banana-l2-whatever")
srem_mock.assert_awaited_once_with("broadcast:notifications", "BLOCK/banana-l2-whatever")
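
A note on the pattern running through these test diffs: every `@patch(..., return_value=asyncio.Future())` plus manual `set_result(...)` pair is collapsed into `unittest.mock.AsyncMock` (available since Python 3.8), which also unlocks the await-specific assertions used above. A minimal, repo-independent sketch of the idiom:

import asyncio
from unittest.mock import AsyncMock

async def notify(session, url):
    # the code under test simply awaits session.post
    return await session.post(url=url, timeout=30)

async def main():
    fake_session = AsyncMock(post=AsyncMock(return_value="ok"))
    assert await notify(fake_session, "url1") == "ok"
    # assert_awaited_once_with exists only on AsyncMock, which is why the
    # diffs switch from assert_called_once_with to the awaited variant
    fake_session.post.assert_awaited_once_with(url="url1", timeout=30)

asyncio.run(main())
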

View File

@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:

View File

@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:

View File

@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:

View File

@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:
@ -57,14 +57,14 @@ class OpenFaasException(DragonchainException):
"""Exception raised when OpenFaaS returns with error status"""
class LabChainForbiddenException(DragonchainException):
"""Exception raised when lab chain action is not allowed"""
class NotFound(DragonchainException):
"""Exception raised when object is not found"""
class UnableToUpdate(DragonchainException):
"""Exception raised by matchmaking client when it cannot find enough nodes to replace non responsive"""
class BadRequest(DragonchainException):
"""Exception raised on bad request"""
@ -151,6 +151,10 @@ class SanityCheckFailure(DragonchainException):
"""Exception raised when sanity check fails"""
class InterchainPublishError(DragonchainException):
"""Exception raised when an interchain publish action fails"""
class InterchainConnectionError(DragonchainException):
"""Exception raise when RPC / API call has an error"""
@ -167,5 +171,9 @@ class MatchmakingError(DragonchainException):
"""Exception raised by matchmaking client when a problem has occurred"""
class MatchmakingRetryableError(DragonchainException):
"""Exception raised by matchmaking when there is a server error"""
class PartyError(DragonchainException):
"""Exception raised by party client when a problem has occurred"""

View File

@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:

View File

@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:
@ -27,11 +27,12 @@ import requests
import docker
from dragonchain.scheduler import scheduler
from dragonchain.lib.dto import api_key_model
from dragonchain.lib.dao import api_key_dao
from dragonchain.lib.dao import transaction_dao
from dragonchain.lib.dao import transaction_type_dao
from dragonchain.lib.dao import smart_contract_dao
from dragonchain.lib.dto import smart_contract_model
from dragonchain.lib import authorization
from dragonchain.lib import error_reporter
from dragonchain.lib import keys
from dragonchain.lib.interfaces import storage
@ -115,10 +116,11 @@ class ContractJob(object):
self.end_error_state = smart_contract_model.ContractState.DELETE_FAILED.value
def populate_api_keys(self) -> None:
key = authorization.register_new_auth_key(smart_contract=True)
self.model.secrets["secret-key"] = key["key"]
self.model.secrets["auth-key-id"] = key["id"]
self.model.auth_key_id = key["id"]
key = api_key_model.new_from_scratch(smart_contract=True)
api_key_dao.save_api_key(key)
self.model.secrets["secret-key"] = key.key
self.model.secrets["auth-key-id"] = key.key_id
self.model.auth_key_id = key.key_id
def populate_env(self) -> None:
"""Populate environment variables for the job"""
@ -208,10 +210,10 @@ class ContractJob(object):
"com.openfaas.scale.max": "20",
"com.openfaas.scale.factor": "20",
"com.dragonchain.id": INTERNAL_ID,
"com.openfaas.fwatchdog.version": "0.18.2", # Update this as the fwatchdog executable in bin is updates
"com.openfaas.fwatchdog.version": "0.18.10", # Update this as the fwatchdog executable in bin is updates
},
"limits": {"cpu": "0.50", "memory": "600M"},
"requests": {"cpu": "0.25", "memory": "600M"},
"requests": {"cpu": "0.1", "memory": "600M"},
}
_log.info(f"OpenFaaS spec: {spec}")
return spec
@ -283,11 +285,11 @@ class ContractJob(object):
def create_openfaas_secrets(self) -> None:
"""Creates secrets for openfaas functions
Args:
existing_model (obj, optional): The existing model for this contract if action is update
Args:
existing_model (obj, optional): The existing model for this contract if action is update
Returns:
None
Returns:
None
"""
existing_secrets = self.model.existing_secrets or []
@ -297,29 +299,29 @@ class ContractJob(object):
new_secrets = self.model.secrets
for secret, value in new_secrets.items():
secret_name = f"sc-{self.model.id}-{secret.lower()}"
secret = secret.lower()
secret_name = f"sc-{self.model.id}-{secret}"
requests_method = requests.post if secret not in existing_secrets else requests.put
_log.info(f"Creating secret: {secret_name} at {FAAS_GATEWAY}")
response = requests_method(
f"{FAAS_GATEWAY}/system/secrets", headers={"Authorization": faas.get_faas_auth()}, json={"name": secret_name, "value": value}
)
_log.info(f"Response: {response.status_code}")
_log.info(f"Response Body: {response.text}")
if response.status_code != 202:
self.model.set_state(state=self.end_error_state, msg="Error creating contract secrets")
raise exceptions.ContractException("Error creating contract secret")
existing_secrets.append(secret.lower())
if secret not in existing_secrets:
existing_secrets.append(secret)
self.model.existing_secrets = existing_secrets
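
The hunk above is the heart of this change: the secret name is lowercased once up front, the gateway verb is chosen by whether the secret already exists, and the bookkeeping list only grows when the name is genuinely new. A condensed, self-contained sketch of that flow (the gateway URL and auth header are stand-ins for the module's FAAS_GATEWAY and faas.get_faas_auth()):

import requests

def sync_contract_secret(gateway, auth, contract_id, secret, value, existing_secrets):
    secret = secret.lower()
    secret_name = f"sc-{contract_id}-{secret}"
    # POST creates a brand-new secret; PUT updates one the gateway already has
    method = requests.post if secret not in existing_secrets else requests.put
    response = method(
        f"{gateway}/system/secrets",
        headers={"Authorization": auth},
        json={"name": secret_name, "value": value},
        timeout=30,
    )
    if response.status_code != 202:  # the gateway answers 202 Accepted on success
        raise RuntimeError("Error creating contract secret")
    if secret not in existing_secrets:
        existing_secrets.append(secret)
    return existing_secrets
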
def delete_openfaas_secrets(self) -> None:
"""Deletes secrets for an openfaas function
Returns:
None
Returns:
None
"""
_log.info(f"Deleting OpenFaaS secrets: {self.model.existing_secrets}")
for secret in self.model.existing_secrets:
@ -332,8 +334,8 @@ class ContractJob(object):
def deploy_to_openfaas(self) -> None:
"""Deploy this job's smart contract to OpenFaaS and update the faas_spec
Returns:
None, or throws exceptions.InternalServerError
Returns:
None, or throws exceptions.InternalServerError
"""
_log.info("Deploying to OpenFaaS cluster")
spec = self.get_openfaas_spec()
@ -354,8 +356,8 @@ class ContractJob(object):
def delete_openfaas_function(self) -> None:
"""Delete this job's smart contract in OpenFaaS and remove the faas_spec
Returns:
None, or throws exceptions.InternalServerError
Returns:
None, or throws exceptions.InternalServerError
"""
_log.info("Deleting OpenFaaS function")
response = requests.delete(

View File

@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:
@ -127,12 +127,15 @@ class ContractJobTest(unittest.TestCase):
self.assertIsNone(test_job.update_model)
self.assertEqual(test_job.model.task_type, "delete")
@patch("dragonchain.job_processor.contract_job.authorization.register_new_auth_key", return_value={"key": "ban", "id": "ana"})
def test_populate_api_keys(self, mock_register_auth):
@patch("dragonchain.job_processor.contract_job.api_key_model.new_from_scratch", return_value=MagicMock(key="banana", key_id="fish"))
@patch("dragonchain.job_processor.contract_job.api_key_dao.save_api_key")
def test_populate_api_keys(self, mock_save_key, mock_new_key):
self.test_job.populate_api_keys()
self.assertEqual(self.test_job.model.secrets["secret-key"], "ban")
self.assertEqual(self.test_job.model.secrets["auth-key-id"], "ana")
self.assertEqual(self.test_job.model.auth_key_id, "ana")
mock_save_key.assert_called_once_with(mock_new_key.return_value)
mock_new_key.assert_called_once_with(smart_contract=True)
self.assertEqual(self.test_job.model.secrets["secret-key"], "banana")
self.assertEqual(self.test_job.model.secrets["auth-key-id"], "fish")
self.assertEqual(self.test_job.model.auth_key_id, "fish")
@patch("dragonchain.lib.keys.get_public_id", return_value="z7S3WADvnjCyFkUmL48cPGqrSHDrQghNxLFMwBEwwtMa")
def test_populate_env(self, mock_get_id):
@ -220,10 +223,10 @@ class ContractJobTest(unittest.TestCase):
"com.openfaas.scale.factor": "20",
"com.openfaas.scale.max": "20",
"com.openfaas.scale.min": "1",
"com.openfaas.fwatchdog.version": "0.18.2",
"com.openfaas.fwatchdog.version": "0.18.10",
},
"limits": {"cpu": "0.50", "memory": "600M"},
"requests": {"cpu": "0.25", "memory": "600M"},
"requests": {"cpu": "0.1", "memory": "600M"},
"image": "/customer-contracts@sha256:imasha",
},
)

View File

@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:
@ -40,6 +40,9 @@ STORAGE_LOCATION = os.environ["STORAGE_LOCATION"]
SECRET_LOCATION = os.environ["SECRET_LOCATION"]
DRAGONCHAIN_IMAGE = os.environ["DRAGONCHAIN_IMAGE"]
CONTRACT_TASK_KEY = "mq:contract-task"
PENDING_TASK_KEY = "mq:contract-pending"
_log = logger.get_logger()
_kube: kubernetes.client.BatchV1Api = cast(kubernetes.client.BatchV1Api, None) # This will always be defined before starting by being set in start()
_validate_sc_build_task = fastjsonschema.compile(schema.smart_contract_build_task_schema)
@ -55,6 +58,14 @@ def start() -> None:
_kube = kubernetes.client.BatchV1Api()
_log.debug("Job processor ready!")
if redis.llen_sync(PENDING_TASK_KEY):
_log.warning("WARNING! Pending job processor queue was not empty. Last job probably crashed. Re-queueing these dropped items.")
to_recover = redis.lrange_sync(PENDING_TASK_KEY, 0, -1, decode=False)
p = redis.pipeline_sync()
p.rpush(CONTRACT_TASK_KEY, *to_recover)
p.delete(PENDING_TASK_KEY)
p.execute()
while True:
start_task()
@ -93,10 +104,12 @@ def start_task() -> None:
job = get_existing_job_status(task_definition)
if job and job.status.active:
_log.warning("Throwing away task because job already exists")
redis.lpop_sync(PENDING_TASK_KEY)
return
if job and (job.status.succeeded or job.status.failed):
delete_existing_job(task_definition)
attempt_job_launch(task_definition)
redis.lpop_sync(PENDING_TASK_KEY)
def get_next_task() -> Optional[dict]:
@ -105,16 +118,16 @@ def get_next_task() -> Optional[dict]:
The next task. Blocks until a job is found.
"""
_log.info("Awaiting contract task...")
pop_result = redis.brpop_sync("mq:contract-task", 0, decode=False)
pop_result = redis.brpoplpush_sync(CONTRACT_TASK_KEY, PENDING_TASK_KEY, 0, decode=False)
if pop_result is None:
return None
_, event = pop_result
_log.debug(f"received task: {event}")
_log.debug(f"received task: {pop_result}")
try:
event = json.loads(event)
event = json.loads(pop_result)
_validate_sc_build_task(event)
except Exception:
_log.exception("Error processing task, skipping")
redis.lpop_sync(PENDING_TASK_KEY)
return None
_log.info(f"New task request received: {event}")
return event
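
What these job_processor hunks implement is the classic Redis reliable-queue pattern: BRPOPLPUSH atomically moves a task from the work queue into a pending list, the task is LPOPed from pending only once handled (or discarded as invalid), and anything left in pending at startup is pushed back for a retry. A minimal sketch with redis-py standing in for dragonchain's redis wrapper:

import json
import redis

CONTRACT_TASK_KEY = "mq:contract-task"
PENDING_TASK_KEY = "mq:contract-pending"
client = redis.Redis()

def recover_dropped_tasks():
    # anything still pending at startup belonged to a crashed worker
    if client.llen(PENDING_TASK_KEY):
        to_recover = client.lrange(PENDING_TASK_KEY, 0, -1)
        pipe = client.pipeline()
        pipe.rpush(CONTRACT_TASK_KEY, *to_recover)
        pipe.delete(PENDING_TASK_KEY)
        pipe.execute()

def get_next_task():
    # blocks until a task arrives; the pop and push happen atomically
    raw = client.brpoplpush(CONTRACT_TASK_KEY, PENDING_TASK_KEY, timeout=0)
    try:
        return json.loads(raw)
    except Exception:
        client.lpop(PENDING_TASK_KEY)  # malformed task: drop it from pending
        return None
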

View File

@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:
@ -59,20 +59,42 @@ class TestJobPoller(unittest.TestCase):
},
)
@patch("dragonchain.job_processor.job_processor.redis.brpop_sync", return_value=(1, valid_task_definition_string))
def test_can_get_next_task(self, mock_brpop):
@patch("dragonchain.job_processor.job_processor.start_task", side_effect=Exception("this is really stupid"))
@patch("dragonchain.job_processor.job_processor.kubernetes.client.BatchV1Api")
@patch("dragonchain.job_processor.job_processor.kubernetes.config.load_incluster_config")
@patch("dragonchain.job_processor.job_processor.redis.llen_sync", return_value=1)
@patch("dragonchain.job_processor.job_processor.redis.lrange_sync")
@patch("dragonchain.job_processor.job_processor.redis.pipeline_sync")
def test_restores_from_pending_queue(self, mock_pipeline, mock_lrange, mock_redis_llen, mock_kube_config, mock_kube_client, mock_start_task):
try:
job_processor.start()
except Exception:
# catch the exception so start(), which otherwise loops forever, can exit
mock_redis_llen.assert_called_once_with("mq:contract-pending")
mock_lrange.assert_called_once_with("mq:contract-pending", 0, -1, decode=False)
mock_start_task.assert_called_once()
mock_pipeline.assert_called_once()
return
self.fail("Should have had an exception, honestly not sure how you got here")
@patch("dragonchain.job_processor.job_processor.redis.brpoplpush_sync", return_value=valid_task_definition_string.encode("utf8"))
def test_can_get_next_task(self, mock_brpoplpush):
self.assertEqual(job_processor.get_next_task(), valid_task_definition)
mock_brpop.assert_called_once_with("mq:contract-task", 0, decode=False)
mock_brpoplpush.assert_called_once_with("mq:contract-task", "mq:contract-pending", 0, decode=False)
@patch("dragonchain.job_processor.job_processor.redis.brpop_sync", return_value=(1, invalid_task_definition_string))
def test_get_next_task_returns_none_on_invalid_json_schema(self, mock_brpop):
@patch("dragonchain.job_processor.job_processor.redis.lpop_sync")
@patch("dragonchain.job_processor.job_processor.redis.brpoplpush_sync", return_value=(1, invalid_task_definition_string.encode("utf8")))
def test_get_next_task_returns_none_on_invalid_json_schema(self, mock_brpoplpush, mock_lpop):
self.assertIsNone(job_processor.get_next_task())
mock_brpop.assert_called_once_with("mq:contract-task", 0, decode=False)
mock_brpoplpush.assert_called_once_with("mq:contract-task", "mq:contract-pending", 0, decode=False)
mock_lpop.assert_called_once_with("mq:contract-pending")
@patch("dragonchain.job_processor.job_processor.redis.brpop_sync", return_value=(1, '!i "am" not {valid} json!'))
def test_get_next_task_returns_none_on_invalid_json(self, mock_brpop):
@patch("dragonchain.job_processor.job_processor.redis.lpop_sync")
@patch("dragonchain.job_processor.job_processor.redis.brpoplpush_sync", return_value=(1, '!i "am" not {valid} json!'))
def test_get_next_task_returns_none_on_invalid_json(self, mock_brpoplpush, mock_lpop):
self.assertIsNone(job_processor.get_next_task())
mock_brpop.assert_called_once_with("mq:contract-task", 0, decode=False)
mock_brpoplpush.assert_called_once_with("mq:contract-task", "mq:contract-pending", 0, decode=False)
mock_lpop.assert_called_once_with("mq:contract-pending")
@patch("dragonchain.job_processor.job_processor._kube", read_namespaced_job_status=MagicMock(return_value={"test": "dict"}))
def test_get_existing_job_status(self, mock_kube):
@ -120,45 +142,53 @@ class TestJobPoller(unittest.TestCase):
"contract-my-id", "", body=kubernetes.client.V1DeleteOptions(propagation_policy="Background")
)
@patch("dragonchain.job_processor.job_processor.redis.lpop_sync")
@patch("dragonchain.job_processor.job_processor.get_next_task", return_value=valid_task_definition)
@patch("dragonchain.job_processor.job_processor.get_existing_job_status", return_value=None)
@patch("dragonchain.job_processor.job_processor.delete_existing_job")
@patch("dragonchain.job_processor.job_processor.attempt_job_launch")
def test_start_task_launches_job_when_no_existing_job(self, mock_job_launch, mock_delete_job, mock_get_job, mock_get_task):
def test_start_task_launches_job_when_no_existing_job(self, mock_job_launch, mock_delete_job, mock_get_job, mock_get_task, mock_lpopsync):
job_processor.start_task()
mock_get_task.assert_called_once_with()
mock_get_job.assert_called_once_with(valid_task_definition)
mock_delete_job.assert_not_called()
mock_job_launch.assert_called_once_with(valid_task_definition)
mock_lpopsync.assert_called_once()
@patch("dragonchain.job_processor.job_processor.redis.lpop_sync")
@patch("dragonchain.job_processor.job_processor.get_next_task", return_value=valid_task_definition)
@patch(
"dragonchain.job_processor.job_processor.get_existing_job_status", return_value=MagicMock(status=MagicMock(active=0, succeeded=1, failed=0))
)
@patch("dragonchain.job_processor.job_processor.delete_existing_job")
@patch("dragonchain.job_processor.job_processor.attempt_job_launch")
def test_start_task_deletes_and_launches_job_when_finished_existing_job(self, mock_job_launch, mock_delete_job, mock_get_job, mock_get_task):
def test_start_task_deletes_and_launches_job_when_finished_existing_job(
self, mock_job_launch, mock_delete_job, mock_get_job, mock_get_task, mock_lpopsync
):
job_processor.start_task()
mock_get_task.assert_called_once_with()
mock_get_job.assert_called_once_with(valid_task_definition)
mock_delete_job.assert_called_once_with(valid_task_definition)
mock_job_launch.assert_called_once_with(valid_task_definition)
mock_lpopsync.assert_called_once()
@patch("dragonchain.job_processor.job_processor.redis.lpop_sync")
@patch("dragonchain.job_processor.job_processor.get_next_task", return_value=valid_task_definition)
@patch(
"dragonchain.job_processor.job_processor.get_existing_job_status", return_value=MagicMock(status=MagicMock(active=1, succeeded=0, failed=0))
)
@patch("dragonchain.job_processor.job_processor.delete_existing_job")
@patch("dragonchain.job_processor.job_processor.attempt_job_launch")
def test_start_task_no_ops_when_running_job(self, mock_job_launch, mock_delete_job, mock_get_job, mock_get_task):
def test_start_task_no_ops_when_running_job(self, mock_job_launch, mock_delete_job, mock_get_job, mock_get_task, mock_lpop):
job_processor.start_task()
mock_get_task.assert_called_once_with()
mock_get_job.assert_called_once_with(valid_task_definition)
mock_delete_job.assert_not_called()
mock_job_launch.assert_not_called()
mock_lpop.assert_called_once()
def test_attempt_job_launch_raises_on_too_many_retries(self):
self.assertRaises(RuntimeError, job_processor.attempt_job_launch, valid_task_definition, retry=6)

View File

@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:

View File

@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:
@ -20,16 +20,14 @@ import re
import time
import json
import datetime
import string
import random
import secrets
import base64
from typing import Optional, Tuple, Dict, Any
from typing import Optional, Tuple
import requests
from dragonchain.lib.interfaces import storage
from dragonchain.lib.interfaces import secrets as dc_secrets
from dragonchain.lib.dto import api_key_model
from dragonchain.lib.dao import api_key_dao
from dragonchain.lib import matchmaking
from dragonchain.lib import crypto
from dragonchain.lib import keys
@ -72,29 +70,6 @@ def get_supported_hmac_hash(hash_type_str: str) -> crypto.SupportedHashes:
raise ValueError(f"{hash_type_str} is an unsupported HMAC hash type")
def gen_auth_key() -> str:
"""Generate an auth key string
Returns:
String of the newly generated auth key
"""
# Note a 43 character key with this keyset gives us ~256 bits of entropy for these auth_keys
return "".join(secrets.choice(string.ascii_letters + string.digits) for _ in range(43))
def gen_auth_key_id(smart_contract: bool = False) -> str:
"""Generate an auth key ID string
Args:
smart_contract: if the key id should be generated for a smart contract
Returns:
String of the newly generated auth key ID
"""
# Generate key ID consisting of 12 characters, all uppercase characters
key_id = "".join(secrets.choice(string.ascii_uppercase) for _ in range(12))
if smart_contract:
key_id = "SC_" + key_id
return key_id
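
The entropy note in the removed gen_auth_key is easy to double-check: 43 characters drawn uniformly from the 62-symbol alphabet of string.ascii_letters + string.digits carry 43 * log2(62) bits. A one-line sanity check:

import math

# 62 symbols (a-z, A-Z, 0-9) per character, 43 characters per key
print(43 * math.log2(62))  # ~256.0 bits of entropy
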
def get_hmac_message_string(
http_verb: str, full_path: str, dcid: str, timestamp: str, content_type: str, content: bytes, hash_type: crypto.SupportedHashes
) -> str:
@ -139,92 +114,6 @@ def get_authorization(
return f"DC{version}-HMAC-{hmac_hash_type} {auth_key_id}:{hmac}"
def get_auth_key(auth_key_id: str, interchain: bool) -> Optional[str]:
"""Retrieve the auth key corresponding to a key id
Args:
auth_key_id: The key id to grab (if interchain, this is the interchain dcid)
interchain: boolean whether the key to get is an interchain key or not
Returns:
The base64 encoded auth key string corresponding to the id (None if not found)
"""
response = None
try:
if interchain:
response = storage.get_json_from_object(f"KEYS/INTERCHAIN/{auth_key_id}")
else:
response = storage.get_json_from_object(f"KEYS/{auth_key_id}")
except exceptions.NotFound:
pass
if response:
return response.get("key")
return None
def register_new_auth_key(smart_contract: bool = False, auth_key: str = "", auth_key_id: str = "", nickname: str = "") -> Dict[str, Any]:
"""Register a new auth key for use with the chain
Args:
smart_contract: whether it should generate a key for a smart contract
auth_key: (optional) specify an auth_key to use (must be in conjunction with auth_key_id)
auth_key_id: (optional) specify an auth_key_id to use (must be in conjunction with auth_key)
Returns:
Dictionary where 'id' is the new auth_key_id and 'key' is the new auth_key
Raises:
ValueError when only one of auth_key or auth_key_id is defined without the other
"""
if (not auth_key) or (not auth_key_id):
# Check that both are not specified (don't allow only auth_key or auth_key_id to be individually provided)
if auth_key or auth_key_id:
raise ValueError("auth_key and auth_key_id must both be specified together if provided")
# Python do-while
while True:
auth_key_id = gen_auth_key_id(smart_contract)
# Make sure this randomly generated key id doesn't already exist
if not get_auth_key(auth_key_id, False):
break
auth_key = gen_auth_key()
register = {"key": auth_key, "id": auth_key_id, "registration_time": int(time.time()), "nickname": nickname}
storage.put_object_as_json(f"KEYS/{auth_key_id}", register)
return register
def remove_auth_key(auth_key_id: str, interchain: bool = False) -> bool:
"""Remove a registered auth key from this chain
Args:
auth_key_id: The key id string associated with the auth_key to delete
Note: in case of interchain, this is the interchain dcid
interchain: boolean whether this key to remove is an interchain key
Returns:
False if failed to delete, True otherwise
"""
path = None
if interchain:
path = f"KEYS/INTERCHAIN/{auth_key_id}"
else:
path = f"KEYS/{auth_key_id}"
try:
storage.delete(path)
return True
except Exception:
return False
def save_interchain_auth_key(interchain_dcid: str, auth_key: str) -> bool:
"""Register a new interchain auth key. !This will overwrite any existing interchain key for this dcid!
Args:
interchain_dcid: chain id of the interchain sharing this key
auth_key: auth_key to add
Returns:
Boolean if successful
"""
try:
# Add the new key
register = {"key": auth_key, "registration_time": int(time.time())}
storage.put_object_as_json(f"KEYS/INTERCHAIN/{interchain_dcid}", register)
return True
except Exception:
return False
def save_matchmaking_auth_key(auth_key: str) -> bool:
"""Register a new matchmaking auth key. !This will overwrite the existing matchmaking key for this chain!
Args:
@ -247,28 +136,27 @@ def get_matchmaking_key() -> Optional[str]:
return redis.get_sync(MATCHMAKING_KEY_LOCATION)
def register_new_interchain_key_with_remote(interchain_dcid: str) -> str:
def register_new_interchain_key_with_remote(interchain_dcid: str) -> api_key_model.APIKeyModel:
"""Make a new auth key and register it with a remote dragonchain for inter-level communication
Args:
interchain_dcid: chain id of the interchain sharing this key
Returns:
auth key string of the newly shared key
API key model for the newly shared key
Raises:
RuntimeError when bad response from chain or couldn't save to storage
"""
# We need to establish a shared HMAC key for this chain before we can post
auth_key = gen_auth_key()
signature = keys.get_my_keys().make_signature(f"{interchain_dcid}_{auth_key}".encode("utf-8"), crypto.SupportedHashes.sha256)
new_key = {"dcid": keys.get_public_id(), "key": auth_key, "signature": signature}
new_interchain_key = api_key_model.new_from_scratch(interchain_dcid=interchain_dcid)
signature = keys.get_my_keys().make_signature(f"{interchain_dcid}_{new_interchain_key.key}".encode("utf-8"), crypto.SupportedHashes.sha256)
new_key = {"dcid": keys.get_public_id(), "key": new_interchain_key.key, "signature": signature}
try:
r = requests.post(f"{matchmaking.get_dragonchain_address(interchain_dcid)}/v1/interchain-auth-register", json=new_key, timeout=30)
except Exception as e:
raise RuntimeError(f"Unable to register shared auth key with dragonchain {interchain_dcid}\nError: {e}")
if r.status_code < 200 or r.status_code >= 300:
raise RuntimeError(f"Unable to register shared auth key with dragonchain {interchain_dcid}\nStatus code: {r.status_code}")
if not save_interchain_auth_key(interchain_dcid, auth_key):
raise RuntimeError("Unable to add new interchain auth key to storage")
return auth_key
api_key_dao.save_api_key(new_interchain_key)
return new_interchain_key
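
The handshake ordering above is worth spelling out: build the key model locally, sign interchain_dcid concatenated with the key, POST it to the remote's /v1/interchain-auth-register, and persist through the dao only after the remote accepts. A compressed sketch under those assumptions (save_api_key is a stand-in for api_key_dao.save_api_key, and the signature is taken as precomputed):

import requests

def save_api_key(key_record):  # stand-in for api_key_dao.save_api_key
    print(f"persisting shared key {key_record['key']}")

def register_interchain_key(remote_address, interchain_dcid, key_record):
    body = {"dcid": key_record["dcid"], "key": key_record["key"], "signature": key_record["signature"]}
    try:
        r = requests.post(f"{remote_address}/v1/interchain-auth-register", json=body, timeout=30)
    except Exception as e:
        raise RuntimeError(f"Unable to register shared auth key with dragonchain {interchain_dcid}\nError: {e}")
    if r.status_code < 200 or r.status_code >= 300:
        raise RuntimeError(f"Unable to register shared auth key with dragonchain {interchain_dcid}\nStatus code: {r.status_code}")
    save_api_key(key_record)  # persist only once the remote has accepted
    return key_record
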
def register_new_key_with_matchmaking() -> str:
@ -278,7 +166,7 @@ def register_new_key_with_matchmaking() -> str:
Raises:
RuntimeError when bad response from chain or couldn't save to storage
"""
auth_key = gen_auth_key()
auth_key = api_key_model.gen_auth_key()
signature = keys.get_my_keys().make_signature(f"matchmaking_{auth_key}".encode("utf-8"), crypto.SupportedHashes.sha256)
new_key = {"dcid": keys.get_public_id(), "key": auth_key, "signature": signature}
try:
@ -317,10 +205,11 @@ def generate_authenticated_request(
# We need to establish a shared HMAC key with matchmaking before we can make a request
auth_key = register_new_key_with_matchmaking()
else:
auth_key = get_auth_key(dcid, interchain=True)
if auth_key is None:
try:
auth_key = api_key_dao.get_api_key(dcid, interchain=True).key
except exceptions.NotFound:
# We need to establish a shared HMAC key for this chain before we can make a request
auth_key = register_new_interchain_key_with_remote(dcid)
auth_key = register_new_interchain_key_with_remote(dcid).key
timestamp = get_now_datetime().isoformat() + "Z"
content_type = ""
content = b""
@ -395,8 +284,10 @@ def verify_request_authorization( # noqa: C901
content_type: str,
content: bytes,
interchain: bool,
root_only: bool,
) -> None:
api_resource: str,
api_operation: str,
api_name: str,
) -> api_key_model.APIKeyModel:
"""Verify an http request to the webserver
Args:
authorization: Authorization header of the request
@ -407,10 +298,15 @@ def verify_request_authorization( # noqa: C901
content-type: content-type header of the request (if it exists)
content: byte object of the body of the request (if it exists)
interchain: boolean whether to use interchain keys to check or not
root_only: boolean whether or not root is required
api_resource: the api resource name of this endpoint
api_operation: the CRUD api operation of this endpoint ("create", "read", "update", "delete")
api_name: the api name of this particular endpoint
Raises:
exceptions.UnauthorizedException (with message) when the authorization is not valid
exceptions.ActionForbidden (with message) when the authorization is valid, but the action is not allowed
exceptions.APIRateLimitException (with message) when rate limit is currently exceeded for the provided api key id
Returns:
The api key model used for this request (if successfully authenticated)
"""
if dcid != keys.get_public_id():
raise exceptions.UnauthorizedException("Incorrect Dragonchain ID")
@ -442,24 +338,26 @@ def verify_request_authorization( # noqa: C901
message_string = get_hmac_message_string(http_verb, full_path, dcid, timestamp, content_type, content, supported_hash)
try:
auth_key_id = re.search(" (.*):", authorization).group(1) # noqa: T484
if "/" in auth_key_id:
_log.info(f"Authorization failure from potentially malicious key id {auth_key_id}")
raise exceptions.UnauthorizedException("Invalid HMAC Authentication")
if root_only and auth_key_id != dc_secrets.get_dc_secret("hmac-id"):
raise exceptions.ActionForbidden("this action can only be performed with root auth key")
auth_key = get_auth_key(auth_key_id, interchain)
if not auth_key:
try:
auth_key = api_key_dao.get_api_key(auth_key_id, interchain)
except exceptions.NotFound:
_log.info(f"Authorization failure from key that does not exist {auth_key_id}")
raise exceptions.UnauthorizedException("Invalid HMAC Authentication")
# Check if this key should be rate limited (does not apply to interchain keys)
if not interchain and should_rate_limit(auth_key_id):
raise exceptions.APIRateLimitException(f"API Rate Limit Exceeded. {RATE_LIMIT} requests allowed per minute.")
if crypto.compare_hmac(supported_hash, hmac, auth_key, message_string):
if crypto.compare_hmac(supported_hash, hmac, auth_key.key, message_string):
# Check if this signature has already been used for replay protection
if signature_is_replay(f"{auth_key_id}:{base64.b64encode(hmac).decode('ascii')}"):
raise exceptions.UnauthorizedException("Previous matching request found (no replays allowed)")
# Signature is valid; Return nothing on success
return
# Check that this key is allowed to perform this action
try:
if auth_key.is_key_allowed(api_resource, api_operation, api_name, interchain):
# Signature is valid and key is allowed; Return the api key used on success
return auth_key
except Exception:
_log.exception("Uncaught exception checking if api key is allowed")
raise exceptions.ActionForbidden(f"This key is not allowed to perform {api_name}")
else:
# HMAC doesn't match
raise exceptions.UnauthorizedException("Invalid HMAC Authentication")
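
Pulling the verify_request_authorization changes together, the check order is now: key lookup via the dao, rate limit, HMAC comparison, replay guard, then the new per-key permission check, with the key model returned on success instead of None. A self-contained sketch of that control flow (the exception classes and the dao/is_key_allowed collaborators are illustrative stand-ins):

class NotFound(Exception): ...
class UnauthorizedException(Exception): ...
class APIRateLimitException(Exception): ...
class ActionForbidden(Exception): ...

def verify_sketch(dao, auth_key_id, hmac_matches, is_replay, should_rate_limit,
                  interchain, api_resource, api_operation, api_name):
    try:
        auth_key = dao.get_api_key(auth_key_id, interchain)  # 1. key lookup
    except NotFound:
        raise UnauthorizedException("Invalid HMAC Authentication")
    if not interchain and should_rate_limit(auth_key_id):  # 2. rate limit
        raise APIRateLimitException("API Rate Limit Exceeded.")
    if not hmac_matches(auth_key.key):  # 3. constant-time HMAC comparison
        raise UnauthorizedException("Invalid HMAC Authentication")
    if is_replay():  # 4. replay guard on the (key id, hmac) pair
        raise UnauthorizedException("Previous matching request found (no replays allowed)")
    try:  # 5. the new permission check
        if auth_key.is_key_allowed(api_resource, api_operation, api_name, interchain):
            return auth_key  # success now returns the key model, not None
    except Exception:
        pass  # a broken permission document is treated as "not allowed"
    raise ActionForbidden(f"This key is not allowed to perform {api_name}")
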

View File

@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:
@ -39,15 +39,19 @@ class TestAuthorization(unittest.TestCase):
def test_datetime(self):
self.assertIsInstance(authorization.get_now_datetime(), datetime.datetime)
def test_gen_auth_key(self):
auth_key = authorization.gen_auth_key()
self.assertRegex(auth_key, r"[a-zA-Z0-9]{43}")
@patch("dragonchain.lib.authorization.redis.set_sync")
def test_save_matchmaking_auth_key_calls_redis(self, mock_redis_set):
self.assertTrue(authorization.save_matchmaking_auth_key("key"))
mock_redis_set.assert_called_once_with("authorization:matchmaking", "key")
def test_gen_auth_key_id(self):
auth_key_id = authorization.gen_auth_key_id()
self.assertRegex(auth_key_id, r"[A-Z]{12}")
auth_key_id = authorization.gen_auth_key_id(True)
self.assertRegex(auth_key_id, r"SC_[A-Z]{12}")
@patch("dragonchain.lib.authorization.redis.set_sync", side_effect=Exception)
def test_save_matchmaking_auth_returns_false_on_redis_error(self, mock_redis_set):
self.assertFalse(authorization.save_matchmaking_auth_key("key"))
@patch("dragonchain.lib.authorization.redis.get_sync", return_value="banana")
def test_get_matchmaking_key_returns_from_redis(self, mock_redis_get):
self.assertEqual(authorization.get_matchmaking_key(), "banana")
mock_redis_get.assert_called_once_with("authorization:matchmaking")
def test_get_hmac_string(self):
http_verb = "TEST"
@ -69,103 +73,72 @@ class TestAuthorization(unittest.TestCase):
"DC1-HMAC-SHA256 id:G0ufeozs9/jOZCvIAkEfWhwCxx0NBDrvapnqdqShxWA=",
)
@patch("dragonchain.lib.authorization.storage.get_json_from_object", return_value={"key": "thing"})
def test_get_auth_key(self, mock_storage):
self.assertEqual(authorization.get_auth_key("test", False), "thing")
mock_storage.assert_called_with("KEYS/test")
@patch("dragonchain.lib.authorization.storage.get_json_from_object", return_value={"key": "thing"})
def test_get_auth_key_interchain(self, mock_storage):
self.assertEqual(authorization.get_auth_key("test", True), "thing")
mock_storage.assert_called_with("KEYS/INTERCHAIN/test")
@patch("dragonchain.lib.authorization.storage.get_json_from_object", side_effect=exceptions.NotFound)
def test_get_auth_key_returns_none_on_not_found(self, mock_storage):
self.assertIsNone(authorization.get_auth_key("test", False))
@patch("dragonchain.lib.authorization.storage.get_json_from_object", return_value=None)
def test_get_auth_key_returns_none_on_empty_storage_get(self, mock_storage):
self.assertIsNone(authorization.get_auth_key("test", False))
@patch("dragonchain.lib.authorization.storage.delete", return_value=True)
def test_remove_auth_key(self, mock_storage):
self.assertTrue(authorization.remove_auth_key("test"))
mock_storage.assert_called_with("KEYS/test")
@patch("dragonchain.lib.authorization.storage.delete", return_value=True)
def test_remove_auth_key_interchain(self, mock_storage):
self.assertTrue(authorization.remove_auth_key("test", True))
mock_storage.assert_called_with("KEYS/INTERCHAIN/test")
@patch("dragonchain.lib.authorization.storage.delete", return_value=True)
def test_remove_auth_key_returns_false_on_error(self, mock_storage):
mock_storage.side_effect = RuntimeError
self.assertFalse(authorization.remove_auth_key("test"))
@patch("dragonchain.lib.authorization.gen_auth_key", return_value="test_key")
@patch("dragonchain.lib.authorization.gen_auth_key_id", return_value="test_key_id")
@patch("dragonchain.lib.authorization.storage.put_object_as_json")
@patch("dragonchain.lib.authorization.get_auth_key", return_value=False)
def test_register_new_auth_key_with_valid_data(self, mock_get_auth_key, mock_storage, mock_gen_key_id, mock_gen_key):
self.assertRaises(ValueError, authorization.register_new_auth_key, False, None, "id")
result = authorization.register_new_auth_key()
mock_storage.assert_called_with("KEYS/test_key_id", result)
self.assertEqual(result["key"], "test_key")
self.assertEqual(result["id"], "test_key_id")
@patch("dragonchain.lib.authorization.storage.put_object_as_json")
def test_register_new_auth_key_supplying_both_key_and_id(self, mock_storage):
result = authorization.register_new_auth_key(auth_key="test", auth_key_id="yes")
mock_storage.assert_called_with("KEYS/yes", result)
self.assertEqual(result["key"], "test")
self.assertEqual(result["id"], "yes")
@patch("dragonchain.lib.authorization.storage.put_object_as_json")
def test_register_new_interchain_key_returns_true_on_success(self, mock_storage):
self.assertTrue(authorization.save_interchain_auth_key("test", "key"))
mock_storage.assert_called_once()
@patch("dragonchain.lib.authorization.storage.put_object_as_json", side_effect=Exception)
def test_register_new_interchain_key_returns_false_on_error(self, mock_storage):
self.assertFalse(authorization.save_interchain_auth_key("test", "key"))
@patch("dragonchain.lib.authorization.api_key_dao.save_api_key")
@patch("dragonchain.lib.authorization.api_key_model.new_from_scratch")
@patch("dragonchain.lib.authorization.keys.get_public_id", return_value="test_dcid")
@patch("dragonchain.lib.authorization.keys.get_my_keys", return_value=MagicMock(make_signature=MagicMock(return_value="sig")))
@patch("dragonchain.lib.authorization.save_interchain_auth_key", return_value=True)
@patch("dragonchain.lib.authorization.requests.post", return_value=MagicMock(status_code=201))
@patch("dragonchain.lib.authorization.gen_auth_key", return_value="key")
@patch("dragonchain.lib.authorization.matchmaking.get_dragonchain_address", return_value="https://someurl")
def test_register_interchain_key_with_remote_returns_valid(self, mock_get_address, mock_gen_auth, mock_post, mock_save, mock_keys, mock_dcid):
def test_register_interchain_key_with_remote_returns_valid(self, mock_get_address, mock_post, mock_keys, mock_dcid, mock_new_key, mock_save):
remote_dcid = "remote"
url = "https://someurl/v1/interchain-auth-register"
expected_key = {"dcid": "test_dcid", "key": "key", "signature": "sig"}
self.assertEqual(authorization.register_new_interchain_key_with_remote(remote_dcid), "key")
expected_key = {"dcid": "test_dcid", "key": mock_new_key.return_value.key, "signature": "sig"}
self.assertEqual(authorization.register_new_interchain_key_with_remote(remote_dcid), mock_new_key.return_value)
mock_post.assert_called_with(url, json=expected_key, timeout=30)
mock_save.assert_called_once_with(mock_new_key.return_value)
@patch("dragonchain.lib.authorization.api_key_model.new_from_scratch")
@patch("dragonchain.lib.keys.get_public_id", return_value="z7S3WADvnjCyFkUmL48cPGqrSHDrQghNxLFMwBEwwtMa")
@patch("dragonchain.lib.authorization.keys.get_my_keys")
@patch("dragonchain.lib.authorization.save_interchain_auth_key", return_value=True)
@patch("dragonchain.lib.authorization.requests.post", return_value=MagicMock(status_code=100))
@patch("dragonchain.lib.authorization.gen_auth_key", return_value="key")
@patch("dragonchain.lib.authorization.matchmaking.get_dragonchain_address", return_value="https://someurl")
def test_register_interchain_key_raises_with_bad_status_code(self, mock_get_address, mock_gen_auth, mock_post, mock_save, mock_keys, mock_get_id):
def test_register_interchain_key_raises_with_bad_status_code(self, mock_get_address, mock_post, mock_keys, mock_get_id, mock_new_key):
self.assertRaises(RuntimeError, authorization.register_new_interchain_key_with_remote, "thing")
@patch("dragonchain.lib.authorization.api_key_model.new_from_scratch")
@patch("dragonchain.lib.keys.get_public_id", return_value="z7S3WADvnjCyFkUmL48cPGqrSHDrQghNxLFMwBEwwtMa")
@patch("dragonchain.lib.authorization.keys.get_my_keys")
@patch("dragonchain.lib.authorization.save_interchain_auth_key", return_value=False)
@patch("dragonchain.lib.authorization.requests.post", return_value=MagicMock(status_code=201))
@patch("dragonchain.lib.authorization.gen_auth_key", return_value="key")
@patch("dragonchain.lib.authorization.requests.post", side_effect=Exception)
@patch("dragonchain.lib.authorization.matchmaking.get_dragonchain_address", return_value="https://someurl")
def test_register_interchain_key_raises_with_failure_to_register_interchain_key(
self, mock_get_address, mock_gen_auth, mock_post, mock_save, mock_keys, mock_get_id
):
def test_register_interchain_key_raises_with_bad_request_exception(self, mock_get_address, mock_post, mock_keys, mock_get_id, mock_new_key):
self.assertRaises(RuntimeError, authorization.register_new_interchain_key_with_remote, "thing")
@patch("dragonchain.lib.authorization.api_key_model.gen_auth_key", return_value="banana")
@patch("dragonchain.lib.keys.get_public_id", return_value="z7S3WADvnjCyFkUmL48cPGqrSHDrQghNxLFMwBEwwtMa")
@patch("dragonchain.lib.authorization.keys.get_my_keys", return_value=MagicMock(make_signature=MagicMock(return_value="signature")))
@patch("dragonchain.lib.authorization.requests.post", return_value=MagicMock(status_code=201))
@patch("dragonchain.lib.authorization.save_matchmaking_auth_key", return_value=True)
def test_register_with_matchmaking_returns_valid(self, mock_save_key, mock_post, mock_get_keys, mock_get_id, mock_gen_key):
self.assertEqual(authorization.register_new_key_with_matchmaking(), "banana")
@patch("dragonchain.lib.authorization.api_key_model.gen_auth_key", return_value="banana")
@patch("dragonchain.lib.keys.get_public_id", return_value="z7S3WADvnjCyFkUmL48cPGqrSHDrQghNxLFMwBEwwtMa")
@patch("dragonchain.lib.authorization.keys.get_my_keys", return_value=MagicMock(make_signature=MagicMock(return_value="signature")))
@patch("dragonchain.lib.authorization.requests.post", return_value=MagicMock(status_code=100))
@patch("dragonchain.lib.authorization.save_matchmaking_auth_key", return_value=True)
def test_register_with_matchmaking_raises_with_bad_status_code(self, mock_save_key, mock_post, mock_get_keys, mock_get_id, mock_gen_key):
self.assertRaises(RuntimeError, authorization.register_new_key_with_matchmaking)
@patch("dragonchain.lib.authorization.api_key_model.gen_auth_key", return_value="banana")
@patch("dragonchain.lib.keys.get_public_id", return_value="z7S3WADvnjCyFkUmL48cPGqrSHDrQghNxLFMwBEwwtMa")
@patch("dragonchain.lib.authorization.keys.get_my_keys", return_value=MagicMock(make_signature=MagicMock(return_value="signature")))
@patch("dragonchain.lib.authorization.requests.post", side_effect=Exception)
@patch("dragonchain.lib.authorization.save_matchmaking_auth_key", return_value=True)
def test_register_with_matchmaking_raises_with_request_exception(self, mock_save_key, mock_post, mock_get_keys, mock_get_id, mock_gen_key):
self.assertRaises(RuntimeError, authorization.register_new_key_with_matchmaking)
@patch("dragonchain.lib.authorization.api_key_model.gen_auth_key", return_value="banana")
@patch("dragonchain.lib.keys.get_public_id", return_value="z7S3WADvnjCyFkUmL48cPGqrSHDrQghNxLFMwBEwwtMa")
@patch("dragonchain.lib.authorization.keys.get_my_keys", return_value=MagicMock(make_signature=MagicMock(return_value="signature")))
@patch("dragonchain.lib.authorization.requests.post", return_value=MagicMock(status_code=200))
@patch("dragonchain.lib.authorization.save_matchmaking_auth_key", return_value=False)
def test_register_with_matchmaking_raises_with_bad_key_save(self, mock_save_key, mock_post, mock_get_keys, mock_get_id, mock_gen_key):
self.assertRaises(RuntimeError, authorization.register_new_key_with_matchmaking)
@patch("dragonchain.lib.authorization.keys.get_public_id", return_value="test_dcid")
@patch("dragonchain.lib.authorization.register_new_interchain_key_with_remote", return_value="key")
@patch("dragonchain.lib.authorization.register_new_interchain_key_with_remote", return_value=MagicMock(key="key"))
@patch("dragonchain.lib.authorization.get_now_datetime", return_value=MagicMock(isoformat=MagicMock(return_value="timestamp")))
@patch("dragonchain.lib.authorization.get_auth_key", return_value=None)
@patch("dragonchain.lib.authorization.api_key_dao.get_api_key", side_effect=exceptions.NotFound)
def test_gen_interchain_request_dcid(self, mock_get_auth_key, date_mock, mock_register, mock_dcid):
dcid = "adcid"
full_path = "/path"
@ -225,7 +198,7 @@ class TestAuthorization(unittest.TestCase):
@patch("dragonchain.lib.authorization.keys.get_public_id", return_value="test_dcid")
@patch("dragonchain.lib.authorization.signature_is_replay", return_value=False)
@patch("dragonchain.lib.authorization.get_now_datetime", return_value=datetime.datetime(2018, 11, 14, 9, 5, 25, 128176))
@patch("dragonchain.lib.authorization.get_auth_key", return_value="key")
@patch("dragonchain.lib.authorization.api_key_dao.get_api_key", return_value=MagicMock(key="key", is_key_allowed=MagicMock(return_value=True)))
def test_verify_req_auth_raises_with_wrong_dc_id(self, mock_get_auth_key, mock_date, mock_is_replay, mock_dcid):
auth_str = "DC1-HMAC-SHA256 id:gr1FvIvTe1oOmFZqHgRQUhi6s/EyBvZmJWqH1oWV+UQ="
http_verb = "GET"
@ -243,14 +216,16 @@ class TestAuthorization(unittest.TestCase):
"",
b"",
False,
False,
"api_keys",
"create",
"create_api_key",
)
@patch("dragonchain.lib.authorization.RATE_LIMIT", 0)
@patch("dragonchain.lib.authorization.keys.get_public_id", return_value="test_dcid")
@patch("dragonchain.lib.authorization.signature_is_replay", return_value=False)
@patch("dragonchain.lib.authorization.get_now_datetime", return_value=datetime.datetime(2018, 11, 14, 9, 5, 25, 128176))
@patch("dragonchain.lib.authorization.get_auth_key", return_value="key")
@patch("dragonchain.lib.authorization.api_key_dao.get_api_key", return_value=MagicMock(key="key", is_key_allowed=MagicMock(return_value=True)))
def test_verify_req_auth_raises_with_unsupported_auth_version(self, mock_get_auth_key, mock_date, mock_is_replay, mock_dcid):
http_verb = "GET"
full_path = "/path"
@ -268,14 +243,16 @@ class TestAuthorization(unittest.TestCase):
"",
b"",
False,
False,
"api_keys",
"create",
"create_api_key",
)
@patch("dragonchain.lib.authorization.RATE_LIMIT", 0)
@patch("dragonchain.lib.authorization.keys.get_public_id", return_value="test_dcid")
@patch("dragonchain.lib.authorization.signature_is_replay", return_value=False)
@patch("dragonchain.lib.authorization.get_now_datetime", return_value=datetime.datetime(2018, 11, 14, 9, 5, 25, 128176))
@patch("dragonchain.lib.authorization.get_auth_key", return_value="key")
@patch("dragonchain.lib.authorization.api_key_dao.get_api_key", return_value=MagicMock(key="key", is_key_allowed=MagicMock(return_value=True)))
def test_verify_req_auth_raises_with_unsupported_hmac_hash(self, mock_get_auth_key, mock_date, mock_is_replay, mock_dcid):
http_verb = "GET"
full_path = "/path"
@ -293,14 +270,16 @@ class TestAuthorization(unittest.TestCase):
"",
b"",
False,
False,
"api_keys",
"create",
"create_api_key",
)
@patch("dragonchain.lib.authorization.RATE_LIMIT", 0)
@patch("dragonchain.lib.authorization.keys.get_public_id", return_value="test_dcid")
@patch("dragonchain.lib.authorization.signature_is_replay", return_value=False)
@patch("dragonchain.lib.authorization.get_now_datetime", return_value=datetime.datetime(2018, 11, 14, 9, 5, 25, 128176))
@patch("dragonchain.lib.authorization.get_auth_key", return_value="key")
@patch("dragonchain.lib.authorization.api_key_dao.get_api_key", return_value=MagicMock(key="key", is_key_allowed=MagicMock(return_value=True)))
def test_verify_req_auth_raises_with_old_timestamp(self, mock_get_auth_key, mock_date, mock_is_replay, mock_dcid):
auth_str = "DC1-HMAC-SHA256 id:gr1FvIvTe1oOmFZqHgRQUhi6s/EyBvZmJWqH1oWV+UQ="
http_verb = "GET"
@ -318,14 +297,16 @@ class TestAuthorization(unittest.TestCase):
"",
b"",
False,
False,
"api_keys",
"create",
"create_api_key",
)
@patch("dragonchain.lib.authorization.RATE_LIMIT", 0)
@patch("dragonchain.lib.authorization.keys.get_public_id", return_value="test_dcid")
@patch("dragonchain.lib.authorization.signature_is_replay", return_value=False)
@patch("dragonchain.lib.authorization.get_now_datetime", return_value=datetime.datetime(2018, 11, 14, 9, 5, 25, 128176))
@patch("dragonchain.lib.authorization.get_auth_key", return_value="key")
@patch("dragonchain.lib.authorization.api_key_dao.get_api_key", return_value=MagicMock(key="key", is_key_allowed=MagicMock(return_value=True)))
def test_verify_req_auth_raises_with_malformed_authorization(self, mock_get_auth_key, mock_date, mock_is_replay, mock_dcid):
http_verb = "GET"
full_path = "/path"
@ -343,7 +324,9 @@ class TestAuthorization(unittest.TestCase):
"",
b"",
False,
False,
"api_keys",
"create",
"create_api_key",
)
self.assertRaisesWithMessage(
exceptions.UnauthorizedException,
@ -357,14 +340,16 @@ class TestAuthorization(unittest.TestCase):
"",
b"",
False,
False,
"api_keys",
"create",
"create_api_key",
)
@patch("dragonchain.lib.authorization.RATE_LIMIT", 0)
@patch("dragonchain.lib.authorization.keys.get_public_id", return_value="test_dcid")
@patch("dragonchain.lib.authorization.signature_is_replay", return_value=False)
@patch("dragonchain.lib.authorization.get_now_datetime", return_value=datetime.datetime(2018, 11, 14, 9, 5, 25, 128176))
@patch("dragonchain.lib.authorization.get_auth_key", return_value="key")
@patch("dragonchain.lib.authorization.api_key_dao.get_api_key", return_value=MagicMock(key="key", is_key_allowed=MagicMock(return_value=True)))
def test_verify_req_auth_raises_with_invalid_hmac(self, mock_get_auth_key, mock_date, mock_is_replay, mock_dcid):
http_verb = "GET"
full_path = "/path"
@ -382,14 +367,16 @@ class TestAuthorization(unittest.TestCase):
"",
b"",
False,
False,
"api_keys",
"create",
"create_api_key",
)
@patch("dragonchain.lib.authorization.RATE_LIMIT", 0)
@patch("dragonchain.lib.authorization.keys.get_public_id", return_value="test_dcid")
@patch("dragonchain.lib.authorization.signature_is_replay", return_value=False)
@patch("dragonchain.lib.authorization.get_now_datetime", return_value=datetime.datetime(2018, 11, 14, 9, 5, 25, 128176))
@patch("dragonchain.lib.authorization.get_auth_key", return_value="key")
@patch("dragonchain.lib.authorization.api_key_dao.get_api_key", return_value=MagicMock(key="key", is_key_allowed=MagicMock(return_value=True)))
def test_verify_req_auth_passes_when_valid(self, mock_get_auth_key, mock_date, mock_is_replay, mock_dcid):
auth_str = "DC1-HMAC-SHA256 id:gr1FvIvTe1oOmFZqHgRQUhi6s/EyBvZmJWqH1oWV+UQ="
http_verb = "GET"
@ -397,7 +384,9 @@ class TestAuthorization(unittest.TestCase):
dcid = "test_dcid"
timestamp = "2018-11-14T09:05:25.128176Z"
# Test valid SHA256
authorization.verify_request_authorization(auth_str, http_verb, full_path, dcid, timestamp, "", b"", False, False)
authorization.verify_request_authorization(
auth_str, http_verb, full_path, dcid, timestamp, "", b"", False, "api_keys", "create", "create_api_key"
)
# Test valid BLAKE2b512
authorization.verify_request_authorization(
"DC1-HMAC-BLAKE2b512 id:x1PrKtbs51CR1X6/NTIxyjwOPmZF3rxIXdtJARDialRV+H3FbmUxLmqDuCQvPKEOLN9rNUFhsZa3QZVf8+kXkA==",
@ -408,18 +397,30 @@ class TestAuthorization(unittest.TestCase):
"",
b"",
False,
False,
"api_keys",
"create",
"create_api_key",
)
# Test valid SHA3-256
authorization.verify_request_authorization(
"DC1-HMAC-SHA3-256 id:IjPhj3dzTyj0VhcI5oUl5vcFapX8/GpJaO5M82SD3dE=", http_verb, full_path, dcid, timestamp, "", b"", False, False
"DC1-HMAC-SHA3-256 id:IjPhj3dzTyj0VhcI5oUl5vcFapX8/GpJaO5M82SD3dE=",
http_verb,
full_path,
dcid,
timestamp,
"",
b"",
False,
"api_keys",
"create",
"create_api_key",
)
@patch("dragonchain.lib.authorization.RATE_LIMIT", 0)
@patch("dragonchain.lib.authorization.keys.get_public_id", return_value="test_dcid")
@patch("dragonchain.lib.authorization.signature_is_replay", return_value=True)
@patch("dragonchain.lib.authorization.get_now_datetime", return_value=datetime.datetime(2018, 11, 14, 9, 5, 25, 128176))
@patch("dragonchain.lib.authorization.get_auth_key", return_value="key")
@patch("dragonchain.lib.authorization.api_key_dao.get_api_key", return_value=MagicMock(key="key", is_key_allowed=MagicMock(return_value=True)))
def test_verify_req_auth_raises_on_replay(self, mock_get_auth_key, mock_date, mock_is_replay, mock_dcid):
auth_str = "DC1-HMAC-SHA256 id:gr1FvIvTe1oOmFZqHgRQUhi6s/EyBvZmJWqH1oWV+UQ="
http_verb = "GET"
@ -438,7 +439,67 @@ class TestAuthorization(unittest.TestCase):
"",
b"",
False,
"api_keys",
"create",
"create_api_key",
)
@patch("dragonchain.lib.authorization.RATE_LIMIT", 0)
@patch("dragonchain.lib.authorization.keys.get_public_id", return_value="test_dcid")
@patch("dragonchain.lib.authorization.signature_is_replay", return_value=False)
@patch("dragonchain.lib.authorization.get_now_datetime", return_value=datetime.datetime(2018, 11, 14, 9, 5, 25, 128176))
@patch("dragonchain.lib.authorization.api_key_dao.get_api_key", return_value=MagicMock(key="key", is_key_allowed=MagicMock(return_value=False)))
def test_verify_req_auth_raises_on_key_not_allowed(self, mock_get_auth_key, mock_date, mock_is_replay, mock_dcid):
auth_str = "DC1-HMAC-SHA256 id:gr1FvIvTe1oOmFZqHgRQUhi6s/EyBvZmJWqH1oWV+UQ="
http_verb = "GET"
full_path = "/path"
dcid = "test_dcid"
timestamp = "2018-11-14T09:05:25.128176Z"
self.assertRaisesWithMessage(
exceptions.ActionForbidden,
"This key is not allowed to perform create_api_key",
authorization.verify_request_authorization,
auth_str,
http_verb,
full_path,
dcid,
timestamp,
"",
b"",
False,
"api_keys",
"create",
"create_api_key",
)
@patch("dragonchain.lib.authorization.RATE_LIMIT", 0)
@patch("dragonchain.lib.authorization.keys.get_public_id", return_value="test_dcid")
@patch("dragonchain.lib.authorization.signature_is_replay", return_value=False)
@patch("dragonchain.lib.authorization.get_now_datetime", return_value=datetime.datetime(2018, 11, 14, 9, 5, 25, 128176))
@patch(
"dragonchain.lib.authorization.api_key_dao.get_api_key", return_value=MagicMock(key="key", is_key_allowed=MagicMock(side_effect=Exception))
)
def test_verify_req_auth_raises_on_key_allowed_exception(self, mock_get_auth_key, mock_date, mock_is_replay, mock_dcid):
auth_str = "DC1-HMAC-SHA256 id:gr1FvIvTe1oOmFZqHgRQUhi6s/EyBvZmJWqH1oWV+UQ="
http_verb = "GET"
full_path = "/path"
dcid = "test_dcid"
timestamp = "2018-11-14T09:05:25.128176Z"
self.assertRaisesWithMessage(
exceptions.ActionForbidden,
"This key is not allowed to perform create_api_key",
authorization.verify_request_authorization,
auth_str,
http_verb,
full_path,
dcid,
timestamp,
"",
b"",
False,
"api_keys",
"create",
"create_api_key",
)
@patch("dragonchain.lib.authorization.RATE_LIMIT", 0)
@ -446,7 +507,7 @@ class TestAuthorization(unittest.TestCase):
@patch("dragonchain.lib.authorization.signature_is_replay", return_value=False)
@patch("dragonchain.lib.authorization.should_rate_limit", return_value=True)
@patch("dragonchain.lib.authorization.get_now_datetime", return_value=datetime.datetime(2018, 11, 14, 9, 5, 25, 128176))
@patch("dragonchain.lib.authorization.get_auth_key", return_value="key")
@patch("dragonchain.lib.authorization.api_key_dao.get_api_key", return_value=MagicMock(key="key", is_key_allowed=MagicMock(return_value=True)))
def test_verify_req_auth_raises_with_rate_limit(self, mock_get_auth_key, mock_date, mock_should_limit, mock_is_replay, mock_dcid):
auth_str = "DC1-HMAC-SHA256 id:gr1FvIvTe1oOmFZqHgRQUhi6s/EyBvZmJWqH1oWV+UQ="
http_verb = "GET"
@ -465,14 +526,16 @@ class TestAuthorization(unittest.TestCase):
"",
b"",
False,
False,
"api_keys",
"create",
"create_api_key",
)
@patch("dragonchain.lib.authorization.RATE_LIMIT", 0)
@patch("dragonchain.lib.authorization.keys.get_public_id", return_value="test_dcid")
@patch("dragonchain.lib.authorization.signature_is_replay", return_value=False)
@patch("dragonchain.lib.authorization.get_now_datetime", return_value=datetime.datetime(2018, 11, 14, 9, 5, 25, 128176))
@patch("dragonchain.lib.authorization.get_auth_key", return_value=None)
@patch("dragonchain.lib.authorization.api_key_dao.get_api_key", side_effect=exceptions.NotFound)
def test_verify_req_auth_raises_with_no_key(self, mock_get_auth_key, mock_date, mock_is_replay, mock_dcid):
auth_str = "DC1-HMAC-SHA256 id:gr1FvIvTe1oOmFZqHgRQUhi6s/EyBvZmJWqH1oWV+UQ="
http_verb = "GET"
@ -491,14 +554,16 @@ class TestAuthorization(unittest.TestCase):
"",
b"",
False,
False,
"api_keys",
"create",
"create_api_key",
)
@patch("dragonchain.lib.authorization.RATE_LIMIT", 0)
@patch("dragonchain.lib.authorization.keys.get_public_id", return_value="test_dcid")
@patch("dragonchain.lib.authorization.signature_is_replay", return_value=False)
@patch("dragonchain.lib.authorization.get_now_datetime", return_value=datetime.datetime(2018, 11, 14, 9, 5, 25, 128176))
@patch("dragonchain.lib.authorization.get_auth_key", side_effect=Exception)
@patch("dragonchain.lib.authorization.api_key_dao.get_api_key", side_effect=Exception)
def test_verify_req_auth_raises_on_get_key_error(self, mock_get_auth_key, mock_date, mock_is_replay, mock_dcid):
auth_str = "DC1-HMAC-SHA256 id:gr1FvIvTe1oOmFZqHgRQUhi6s/EyBvZmJWqH1oWV+UQ="
http_verb = "GET"
@ -517,13 +582,15 @@ class TestAuthorization(unittest.TestCase):
"",
b"",
False,
False,
"api_keys",
"create",
"create_api_key",
)
@patch("dragonchain.lib.authorization.keys.get_public_id", return_value="test_dcid")
@patch("dragonchain.lib.authorization.signature_is_replay", return_value=False)
@patch("dragonchain.lib.authorization.get_now_datetime", return_value=datetime.datetime(2018, 11, 14, 9, 5, 25, 128176))
@patch("dragonchain.lib.authorization.get_auth_key", return_value="key")
@patch("dragonchain.lib.authorization.api_key_dao.get_api_key", return_value=MagicMock(key="key", is_key_allowed=MagicMock(return_value=True)))
def test_generated_authenticated_request_with_verifier(self, mock_get_auth_key, mock_date, mock_is_replay, mock_get_id):
"""
This is more of a pseudo integration test, ensuring that
@ -540,15 +607,21 @@ class TestAuthorization(unittest.TestCase):
headers, content = authorization.generate_authenticated_request("POST", dcid, full_path, json_content, "SHA256")
auth_str = headers["Authorization"]
# Test with SHA256 HMAC Auth
authorization.verify_request_authorization(auth_str, "POST", full_path, dcid, timestamp, "application/json", content, False, False)
authorization.verify_request_authorization(
auth_str, "POST", full_path, dcid, timestamp, "application/json", content, False, "api_keys", "create", "create_api_key"
)
headers, content = authorization.generate_authenticated_request("POST", dcid, full_path, json_content, "BLAKE2b512")
auth_str = headers["Authorization"]
# Test with BLAKE2b512 HMAC Auth
authorization.verify_request_authorization(auth_str, "POST", full_path, dcid, timestamp, "application/json", content, False, False)
authorization.verify_request_authorization(
auth_str, "POST", full_path, dcid, timestamp, "application/json", content, False, "api_keys", "create", "create_api_key"
)
headers, content = authorization.generate_authenticated_request("POST", dcid, full_path, json_content, "SHA3-256")
auth_str = headers["Authorization"]
# Test with SHA3-256 HMAC Auth
authorization.verify_request_authorization(auth_str, "POST", full_path, dcid, timestamp, "application/json", content, False, False)
authorization.verify_request_authorization(
auth_str, "POST", full_path, dcid, timestamp, "application/json", content, False, "api_keys", "create", "create_api_key"
)
@patch("dragonchain.lib.authorization.RATE_LIMIT", 0)
@patch("dragonchain.lib.authorization.redis.lindex_sync")

View File

@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:
@ -22,8 +22,10 @@ from typing import TYPE_CHECKING, cast
import requests
from dragonchain import logger
from dragonchain import exceptions
from dragonchain.lib import authorization
from dragonchain.lib import matchmaking
from dragonchain.lib.database import redis
if TYPE_CHECKING:
from dragonchain.lib.dto import l5_block_model
@ -99,8 +101,11 @@ def send_receipts(l5_block: "l5_block_model.L5BlockModel") -> None:
try:
claim_check_id = f"{chain_id}-{block}"
matchmaking.resolve_claim_check(claim_check_id)
except exceptions.MatchmakingRetryableError: # any 500-level server errors
_log.exception(f"Adding claim to failed queue. Claim ID: {claim_check_id}")
redis.sadd_sync("mq:failed-claims", claim_check_id) # using a set avoids duplicates
except Exception:
_log.exception("Failure to finalize claim in matchmaking. Sending reciepts to lower level nodes.")
_log.exception("Failure to finalize claim in matchmaking. Sending receipts to lower level nodes.")
except Exception as e:
_log.exception(f"[BROADCAST] Error while trying to broadcast down for l4 block {l4_block}\n{e}\n!Will ignore this broadcast!")

View File

@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:
@ -32,10 +32,8 @@ def register_callback(txn_id: str, callback_url: str) -> None:
def fire_if_exists(txn_id: str, transaction_model: transaction_model.TransactionModel) -> None:
""" Fires a callback with a given payload, then removes from redis"""
_log.debug(f"Looking in redis for callback: {CALLBACK_REDIS_KEY} {txn_id}")
"""Fires a callback with a given payload, then removes from redis"""
url = redis.hget_sync(CALLBACK_REDIS_KEY, txn_id)
_log.debug(f"Found {url}")
if url is not None:
try:
_log.debug(f"POST -> {url}")

View File

@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:

View File

@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:

View File

@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:

View File

@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:

View File

@ -0,0 +1,102 @@
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:
# Section 6. Trademarks. is deleted and replaced with:
# 6. Trademarks. This License does not grant permission to use the trade
# names, trademarks, service marks, or product names of the Licensor
# and its affiliates, except as required to comply with Section 4(c) of
# the License and to reproduce the content of the NOTICE file.
# You may obtain a copy of the Apache License at
# http://www.apache.org/licenses/LICENSE-2.0
# Unless required by applicable law or agreed to in writing, software
# distributed under the Apache License with the above modification is
# distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the Apache License for the specific
# language governing permissions and limitations under the Apache License.
from typing import List
from dragonchain.lib.interfaces import storage
from dragonchain.lib.dto import api_key_model
from dragonchain import exceptions
from dragonchain import logger
FOLDER = "KEYS"
INTERCHAIN_FOLDER = "KEYS/INTERCHAIN"
MIGRATION_V1 = "MIGRATION_V1_COMPLETE"
_log = logger.get_logger()
def save_api_key(api_key: api_key_model.APIKeyModel) -> None:
"""Save an api key model to storage"""
storage.put_object_as_json(f"{INTERCHAIN_FOLDER if api_key.interchain else FOLDER}/{api_key.key_id}", api_key.export_as_at_rest())
def get_api_key(key_id: str, interchain: bool) -> api_key_model.APIKeyModel:
"""Get an api key from storage
Args:
key_id: The key id to fetch (public chain id if interchain)
interchain: Whether or not this is an interchain key
"""
# Explicitly don't allow permission keys with slashes (may be malicious)
if "/" in key_id:
raise exceptions.NotFound
model = api_key_model.new_from_at_rest(storage.get_json_from_object(f"{INTERCHAIN_FOLDER if interchain else FOLDER}/{key_id}"))
if model.interchain != interchain: # Double check the interchain value of the key is what we expect; otherwise panic
raise RuntimeError(f"Bad interchain key {key_id} found. Expected interchain: {interchain} but got {model.interchain}")
return model
def list_api_keys(include_interchain: bool) -> List[api_key_model.APIKeyModel]:
"""Retrieve a list of api keys
Args:
include_interchain: whether or not to include interchain api keys
Returns:
List of api key models
"""
# Get keys from storage, excluding migration marker and interchain keys
return_list = []
for key in storage.list_objects(prefix=FOLDER):
if (MIGRATION_V1 in key) or (key.startswith("KEYS/INTERCHAIN") and not include_interchain):
continue
return_list.append(api_key_model.new_from_at_rest(storage.get_json_from_object(key)))
return return_list
def delete_api_key(key_id: str, interchain: bool) -> None:
"""Delete an api key from this chain
Args:
key_id: The key id to delete (public chain id if interchain)
interchain: Whether or not this is an interchain key
"""
if not interchain and key_id.startswith("INTERCHAIN"):
raise RuntimeError("Attempt to remove interchain key when not intended")
storage.delete(f"{INTERCHAIN_FOLDER if interchain else FOLDER}/{key_id}")
def perform_api_key_migration_v1_if_necessary() -> None:
"""Checks if an api key migration needs to be performed, and does so if necessary"""
try:
if storage.get(f"{FOLDER}/{MIGRATION_V1}") == b"1":
# Migration was previously performed. No action necessary
return
except exceptions.NotFound:
pass
_log.info("Api key migration required. Performing now")
valid_keys = storage.list_objects(prefix=FOLDER)
regular_keys = list(filter(lambda x: not x.startswith("KEYS/INTERCHAIN/"), valid_keys))
interchain_keys = list(filter(lambda x: x.startswith("KEYS/INTERCHAIN/"), valid_keys))
for key in regular_keys:
_log.info(f"Migrating {key}")
api_key = api_key_model.new_from_legacy(storage.get_json_from_object(key), interchain_dcid="")
save_api_key(api_key)
for key in interchain_keys:
_log.info(f"Migrating interchain key {key}")
interchain_dcid = key[key.find("KEYS/INTERCHAIN/") + 16 :] # Get the interchain dcid from the key
api_key = api_key_model.new_from_legacy(storage.get_json_from_object(key), interchain_dcid=interchain_dcid)
save_api_key(api_key)
# Save migration marker once complete
storage.put(f"{FOLDER}/{MIGRATION_V1}", b"1")
_log.info("Api key migration v1 complete")

View File

@ -0,0 +1,181 @@
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:
# Section 6. Trademarks. is deleted and replaced with:
# 6. Trademarks. This License does not grant permission to use the trade
# names, trademarks, service marks, or product names of the Licensor
# and its affiliates, except as required to comply with Section 4(c) of
# the License and to reproduce the content of the NOTICE file.
# You may obtain a copy of the Apache License at
# http://www.apache.org/licenses/LICENSE-2.0
# Unless required by applicable law or agreed to in writing, software
# distributed under the Apache License with the above modification is
# distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the Apache License for the specific
# language governing permissions and limitations under the Apache License.
import unittest
from unittest.mock import patch, call, MagicMock
from dragonchain import test_env # noqa: F401
from dragonchain.lib.dao import api_key_dao
from dragonchain import exceptions
class TestApiKeyDAO(unittest.TestCase):
@patch(
"dragonchain.lib.dao.api_key_dao.storage.get_json_from_object",
return_value={
"key_id": "blah",
"registration_time": 1234,
"key": "my_auth_key",
"version": "1",
"permissions_document": {"version": "1", "default_allow": True, "permissions": {}},
"interchain": False,
"root": False,
"nickname": "",
},
)
@patch("dragonchain.lib.dao.api_key_dao.storage.list_objects", return_value=["KEYS/INTERCHAIN/blah", "KEYS/blah"])
def test_list_api_keys_removes_interchain_keys(self, mock_list_objects, mock_get_object):
response = api_key_dao.list_api_keys(include_interchain=False)
self.assertEqual(len(response), 1)
self.assertEqual(response[0].key_id, "blah")
self.assertEqual(response[0].registration_time, 1234)
mock_get_object.assert_called_once()
@patch(
"dragonchain.lib.dao.api_key_dao.storage.get_json_from_object",
return_value={
"key_id": "blah",
"registration_time": 1234,
"key": "my_auth_key",
"version": "1",
"permissions_document": {"version": "1", "default_allow": True, "permissions": {}},
"interchain": True,
"root": False,
"nickname": "",
},
)
@patch("dragonchain.lib.dao.api_key_dao.storage.list_objects", return_value=["KEYS/INTERCHAIN/blah"])
def test_list_api_keys_include_interchain_keys(self, mock_list_objects, mock_get_object):
response = api_key_dao.list_api_keys(include_interchain=True)
self.assertEqual(len(response), 1)
self.assertEqual(response[0].key_id, "blah")
self.assertEqual(response[0].registration_time, 1234)
mock_get_object.assert_called_once()
@patch("dragonchain.lib.dao.api_key_dao.storage.put_object_as_json")
def test_save_api_key_calls_storage_correctly(self, mock_save):
fake_api_key = MagicMock()
fake_api_key.export_as_at_rest.return_value = {"thing": "yup"}
fake_api_key.interchain = False
fake_api_key.key_id = "someid"
api_key_dao.save_api_key(fake_api_key)
fake_api_key.export_as_at_rest.assert_called_once()
mock_save.assert_called_once_with("KEYS/someid", fake_api_key.export_as_at_rest.return_value)
@patch(
"dragonchain.lib.dao.api_key_dao.storage.get_json_from_object",
return_value={
"key_id": "blah",
"registration_time": 1234,
"key": "my_auth_key",
"version": "1",
"permissions_document": {"version": "1", "default_allow": True, "permissions": {}},
"interchain": True,
"root": False,
"nickname": "",
},
)
def test_get_api_key_gets_from_storage_correctly(self, mock_get_object):
api_key_dao.get_api_key("some_id", interchain=True)
mock_get_object.return_value = { # Set interchain value now to false or else error will be thrown
"key_id": "blah",
"registration_time": 1234,
"key": "my_auth_key",
"version": "1",
"permissions_document": {"version": "1", "default_allow": True, "permissions": {}},
"interchain": False,
"root": False,
"nickname": "",
}
returned_key = api_key_dao.get_api_key("some_id", interchain=False)
self.assertEqual(returned_key.key_id, "blah")
mock_get_object.assert_has_calls([call("KEYS/INTERCHAIN/some_id"), call("KEYS/some_id")])
@patch(
"dragonchain.lib.dao.api_key_dao.storage.get_json_from_object",
return_value={
"key_id": "blah",
"registration_time": 1234,
"key": "my_auth_key",
"version": "1",
"permissions_document": {"version": "1", "default_allow": True, "permissions": {}},
"interchain": False,
"root": False,
"nickname": "",
},
)
def test_get_api_key_raises_error_when_mismatching_interchain(self, mock_get_object):
self.assertRaises(RuntimeError, api_key_dao.get_api_key, "some_id", interchain=True)
@patch("dragonchain.lib.dao.api_key_dao.storage.get_json_from_object")
def test_get_api_key_raises_not_found_when_slash_in_key_id(self, mock_get_object):
self.assertRaises(exceptions.NotFound, api_key_dao.get_api_key, "some/malicious/key", interchain=False)
mock_get_object.assert_not_called()
@patch("dragonchain.lib.dao.api_key_dao.storage.delete")
def test_delete_api_key_deletes_from_storage_correctly(self, mock_delete):
api_key_dao.delete_api_key("interchain", interchain=True)
api_key_dao.delete_api_key("notinterchain", interchain=False)
mock_delete.assert_has_calls([call("KEYS/INTERCHAIN/interchain"), call("KEYS/notinterchain")])
def test_delete_api_key_throws_error_if_deleting_interchain_key_when_not_intended(self):
self.assertRaises(RuntimeError, api_key_dao.delete_api_key, "INTERCHAIN/malicious", False)
@patch("dragonchain.lib.dao.api_key_dao.storage.list_objects")
@patch("dragonchain.lib.dao.api_key_dao.storage.get", return_value=b"1")
def test_perform_api_key_migration_doesnt_do_anything_when_already_migrated(self, mock_get, mock_list):
api_key_dao.perform_api_key_migration_v1_if_necessary()
mock_get.assert_called_once_with("KEYS/MIGRATION_V1_COMPLETE")
mock_list.assert_not_called()
@patch("dragonchain.lib.dao.api_key_dao.api_key_model.new_from_legacy", return_value="banana")
@patch("dragonchain.lib.dao.api_key_dao.save_api_key")
@patch(
"dragonchain.lib.dao.api_key_dao.storage.get_json_from_object",
return_value={"id": "some_id", "key": "some_key", "registration_time": 1234, "nickname": "banana"},
)
@patch("dragonchain.lib.dao.api_key_dao.storage.put")
@patch("dragonchain.lib.dao.api_key_dao.storage.list_objects", return_value=["KEYS/whatever"])
@patch("dragonchain.lib.dao.api_key_dao.storage.get", side_effect=exceptions.NotFound)
def test_perform_api_key_migration_migrates_regular_keys(
self, mock_get, mock_list, mock_put, mock_get_object, mock_save_key, mock_new_from_legacy
):
api_key_dao.perform_api_key_migration_v1_if_necessary()
mock_new_from_legacy.assert_called_once_with(
{"id": "some_id", "key": "some_key", "registration_time": 1234, "nickname": "banana"}, interchain_dcid=""
)
mock_save_key.assert_called_once_with("banana")
@patch("dragonchain.lib.dao.api_key_dao.api_key_model.new_from_legacy", return_value="banana")
@patch("dragonchain.lib.dao.api_key_dao.save_api_key")
@patch("dragonchain.lib.dao.api_key_dao.storage.get_json_from_object", return_value={"key": "some_key", "registration_time": 1234})
@patch("dragonchain.lib.dao.api_key_dao.storage.put")
@patch("dragonchain.lib.dao.api_key_dao.storage.list_objects", return_value=["KEYS/INTERCHAIN/whatever"])
@patch("dragonchain.lib.dao.api_key_dao.storage.get", side_effect=exceptions.NotFound)
def test_perform_api_key_migration_migrates_interchain_keys(
self, mock_get, mock_list, mock_put, mock_get_object, mock_save_key, mock_new_from_legacy
):
api_key_dao.perform_api_key_migration_v1_if_necessary()
mock_new_from_legacy.assert_called_once_with({"key": "some_key", "registration_time": 1234}, interchain_dcid="whatever")
mock_save_key.assert_called_once_with("banana")
@patch("dragonchain.lib.dao.api_key_dao.storage.put")
@patch("dragonchain.lib.dao.api_key_dao.storage.list_objects", return_value=[])
@patch("dragonchain.lib.dao.api_key_dao.storage.get", return_value=b"not1")
def test_perform_api_key_migration_saves_migration_marker_when_complete(self, mock_storage_get, mock_storage_list, mock_storage_put):
api_key_dao.perform_api_key_migration_v1_if_necessary()
mock_storage_put.assert_called_once_with("KEYS/MIGRATION_V1_COMPLETE", b"1")

View File

@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:
@ -99,8 +99,15 @@ def insert_block(block: "model.BlockModel") -> None:
# Create ref to this block for the next block
last_block_ref = {"block_id": block.block_id, "proof": block.proof}
# Upload stripped block
redisearch.put_document(redisearch.Indexes.block.value, block.block_id, block.export_as_search_index(), upsert=True)
if redisearch.ENABLED:
redisearch.put_document(redisearch.Indexes.block.value, block.block_id, block.export_as_search_index(), upsert=True)
storage.put_object_as_json(f"{FOLDER}/{block.block_id}", block.export_as_at_rest())
# Upload ref
storage.put_object_as_json(f"{FOLDER}/{LAST_CLOSED_KEY}", last_block_ref)
def insert_l5_verification(storage_location: str, block: "model.BlockModel") -> None:
if redisearch.ENABLED:
index_id = storage_location.split("/")[1]
redisearch.put_document(redisearch.Indexes.verification.value, index_id, block.export_as_search_index(), upsert=True)
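# Illustratively (the path layout is an assumption here): a storage_location of
# "BLOCK/1234-l5-somechain" would be indexed in the "ver" redisearch index under
# the document id "1234-l5-somechain".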

View File

@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:

View File

@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:

View File

@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:

View File

@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:
@ -63,6 +63,7 @@ def store_full_txns(block_model: "l1_block_model.L1BlockModel") -> None:
"""
_log.info("[TRANSACTION DAO] Putting transaction to storage")
storage.put(f"{FOLDER}/{block_model.block_id}", block_model.export_as_full_transactions().encode("utf-8"))
block_model.store_transaction_payloads()
txn_dict: Dict[str, Dict[str, Dict[str, Any]]] = {}
txn_dict[redisearch.Indexes.transaction.value] = {}
# O(N) loop where N = # of txn

View File

@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:

View File

@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:
@ -114,7 +114,7 @@ def create_new_transaction_type(txn_type_model: transaction_type_model.Transacti
txn_type_dto = txn_type_model.export_as_at_rest()
_log.info(f"Adding transaction index for {txn_type_model.txn_type}")
redisearch.create_transaction_index(txn_type_model.txn_type, txn_type_model.custom_indexes)
_log.debug(f"Queuing for activation")
_log.debug("Queuing for activation")
redis.lpush_sync(QUEUED_TXN_TYPES, txn_type_model.txn_type)
_log.debug(f"Adding the transaction type to storage")
_log.debug("Adding the transaction type to storage")
storage.put_object_as_json(f"{FOLDER}/{txn_type_model.txn_type}", txn_type_dto)

View File

@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:

View File

@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:

View File

@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:
@ -32,9 +32,9 @@ REDIS_ENDPOINT = os.environ["REDIS_ENDPOINT"]
LRU_REDIS_ENDPOINT = os.environ["LRU_REDIS_ENDPOINT"]
REDIS_PORT = int(os.environ["REDIS_PORT"]) or 6379
redis_client: redis.Redis = cast(redis.Redis, None)
redis_client_lru: redis.Redis = cast(redis.Redis, None)
async_redis_client: aioredis.Redis = cast(aioredis.Redis, None)
redis_client = cast(redis.Redis, None)
redis_client_lru = cast(redis.Redis, None)
async_redis_client = cast(aioredis.Redis, None)
def _set_redis_client_if_necessary() -> None:
@ -141,7 +141,7 @@ async def _initialize_async_redis(host: str, port: int, wait_time: int = 30) ->
sleep_time = 1 # Number of seconds to wait after a failure to connect before retrying
while time.time() < expire_time:
try:
client = await aioredis.create_redis_pool((host, port), loop=asyncio.get_running_loop())
client = await aioredis.create_redis_pool((host, port))
if await client.ping():
_log.debug(f"Successfully connected with redis at {host}:{port}")
return client # Connected to a working redis, return now
@ -284,7 +284,7 @@ def hset_sync(name: str, key: str, value: str) -> int:
def brpop_sync(keys: str, timeout: int = 0, decode: bool = True) -> Optional[tuple]:
"""Preform a blocking pop against redis list(s)
"""Perform a blocking pop against redis list(s)
Args:
keys: Can be a single key (bytes, string, int, etc), or an array of keys to wait on
timeout: Number of seconds to wait before 'timing out' and returning None. If 0, it will block indefinitely (default)
@ -299,6 +299,23 @@ def brpop_sync(keys: str, timeout: int = 0, decode: bool = True) -> Optional[tup
return _decode_tuple_response(response, decode)
def brpoplpush_sync(pop_key: str, push_key: str, timeout: int = 0, decode: bool = True) -> Optional[str]:
"""Perform a blocking pop against redis list(s)
Args:
pop_key: Can be a single key (bytes, string, int, etc), or an array of keys to wait on popping from
push_key: key to push currently processing items to
timeout: Number of seconds to wait before 'timing out' and returning None. If 0, it will block indefinitely (default)
Returns:
None when no element could be popped and the timeout expired. This is only possible when timeout is not 0
The element that was moved between the lists
"""
_set_redis_client_if_necessary()
response = redis_client.brpoplpush(pop_key, push_key, timeout)
if response is None:
return None
return _decode_response(response, decode)
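# A sketch of the reliable-queue pattern this enables: the pop and the push to a
# processing list happen atomically, so a crashed consumer leaves its item on the
# processing list instead of losing it. Queue names and handle() are hypothetical;
# the ack relies on redis-py's lrem(name, count, value).
def _consume_one(handle) -> None:
    item = brpoplpush_sync("work:queue", "work:processing", timeout=5)
    if item is None:
        return  # timed out with nothing queued
    handle(item)
    _set_redis_client_if_necessary()
    redis_client.lrem("work:processing", 1, item)  # ack: remove from the processing list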
def get_sync(name: str, decode: bool = True) -> Optional[str]:
_set_redis_client_if_necessary()
response = redis_client.get(name)
@ -379,4 +396,4 @@ def hexists_sync(name: str, key: str) -> bool:
def zadd_sync(name: str, mapping: Dict[str, int], nx: bool = False, xx: bool = False, ch: bool = False, incr: bool = False) -> int:
_set_redis_client_if_necessary()
return redis_client.zadd(name, mapping, nx=nx, xx=xx, ch=ch, incr=incr)
return redis_client.zadd(name, mapping, nx=nx, xx=xx, ch=ch, incr=incr) # noqa: T484

View File

@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:
@ -15,26 +15,17 @@
# KIND, either express or implied. See the Apache License for the specific
# language governing permissions and limitations under the Apache License.
import asyncio
import unittest
from unittest.mock import patch, MagicMock
from unittest.mock import patch, MagicMock, AsyncMock
from dragonchain.lib.database import redis
def async_test(coro):
def wrapper(*args, **kwargs):
loop = asyncio.get_event_loop()
return loop.run_until_complete(coro(*args, **kwargs))
return wrapper
class TestRedisAccess(unittest.TestCase):
class TestRedisAccess(unittest.IsolatedAsyncioTestCase):
def setUp(self):
redis.redis_client = MagicMock()
redis.redis_client_lru = MagicMock()
redis.async_redis_client = MagicMock(return_value=asyncio.Future())
redis.async_redis_client = AsyncMock(multi_exec=MagicMock())
@patch("dragonchain.lib.database.redis._initialize_redis")
def test_set_redis_client_if_necessary(self, mock_redis):
@ -48,104 +39,63 @@ class TestRedisAccess(unittest.TestCase):
redis._set_redis_client_lru_if_necessary()
mock_redis.assert_called_once()
@async_test
async def test_set_redis_client_async_if_necessary(self):
redis._initialize_async_redis = MagicMock(return_value=asyncio.Future())
redis._initialize_async_redis.return_value.set_result("dummy")
@patch("dragonchain.lib.database.redis._initialize_async_redis")
async def test_set_redis_client_async_if_necessary(self, mock_redis):
redis.async_redis_client = None
await redis._set_redis_client_async_if_necessary()
redis._initialize_async_redis.assert_called_once()
@async_test
async def test_z_range_by_score_async(self):
redis.async_redis_client.zrangebyscore = MagicMock(return_value=asyncio.Future())
redis.async_redis_client.zrangebyscore.return_value.set_result("dummy")
await redis.z_range_by_score_async("banana", 1, 2)
redis.async_redis_client.zrangebyscore.assert_called_once_with("banana", 1, 2, count=None, encoding="utf8", offset=None, withscores=False)
redis.async_redis_client.zrangebyscore.assert_awaited_once_with("banana", 1, 2, count=None, encoding="utf8", offset=None, withscores=False)
@async_test
async def test_get_async(self):
redis.async_redis_client.get = MagicMock(return_value=asyncio.Future())
redis.async_redis_client.get.return_value.set_result("dummy")
await redis.get_async("banana")
redis.async_redis_client.get.assert_called_once_with("banana", encoding="utf8")
redis.async_redis_client.get.assert_awaited_once_with("banana", encoding="utf8")
@async_test
async def test_set_async(self):
redis.async_redis_client.set = MagicMock(return_value=asyncio.Future())
redis.async_redis_client.set.return_value.set_result("dummy")
await redis.set_async("banana", "banana")
redis.async_redis_client.set.assert_called_once_with("banana", "banana", expire=0, pexpire=0, exist=None)
redis.async_redis_client.set.assert_awaited_once_with("banana", "banana", expire=0, pexpire=0, exist=None)
@async_test
async def test_zadd_async(self):
redis.async_redis_client.zadd = MagicMock(return_value=asyncio.Future())
redis.async_redis_client.zadd.return_value.set_result("dummy")
await redis.zadd_async("banana", "banana", "banana")
redis.async_redis_client.zadd.assert_called_once_with("banana", "banana", "banana", exist=None)
redis.async_redis_client.zadd.assert_awaited_once_with("banana", "banana", "banana", exist=None)
@async_test
async def test_smembers_async(self):
redis.async_redis_client.smembers = MagicMock(return_value=asyncio.Future())
redis.async_redis_client.smembers.return_value.set_result("dummy")
await redis.smembers_async("banana")
redis.async_redis_client.smembers.assert_called_once_with("banana", encoding="utf8")
redis.async_redis_client.smembers.assert_awaited_once_with("banana", encoding="utf8")
@async_test
async def test_multi_exec_async(self):
redis.async_redis_client.multi_exec = MagicMock(return_value=asyncio.Future())
redis.async_redis_client.multi_exec.return_value.set_result("dummy")
await redis.multi_exec_async()
redis.async_redis_client.multi_exec.assert_called_once()
@async_test
async def test_hgetall_async(self):
redis.async_redis_client.hgetall = MagicMock(return_value=asyncio.Future())
redis.async_redis_client.hgetall.return_value.set_result("dummy")
await redis.hgetall_async("banana")
redis.async_redis_client.hgetall.assert_called_once_with("banana", encoding="utf8")
redis.async_redis_client.hgetall.assert_awaited_once_with("banana", encoding="utf8")
@async_test
async def test_rpush_async(self):
redis.async_redis_client.rpush = MagicMock(return_value=asyncio.Future())
redis.async_redis_client.rpush.return_value.set_result("dummy")
await redis.rpush_async("banana", "banana", "banana", "banana")
redis.async_redis_client.rpush.assert_called_once_with("banana", "banana", "banana", "banana")
redis.async_redis_client.rpush.assert_awaited_once_with("banana", "banana", "banana", "banana")
@async_test
async def test_delete_async(self):
redis.async_redis_client.delete = MagicMock(return_value=asyncio.Future())
redis.async_redis_client.delete.return_value.set_result("dummy")
await redis.delete_async("banana", "banana", "banana")
redis.async_redis_client.delete.assert_called_once_with("banana", "banana", "banana")
redis.async_redis_client.delete.assert_awaited_once_with("banana", "banana", "banana")
@async_test
async def test_brpop_async(self):
redis.async_redis_client.brpop = MagicMock(return_value=asyncio.Future())
redis.async_redis_client.brpop.return_value.set_result("dummy")
await redis.brpop_async("banana", "banana", "banana")
redis.async_redis_client.brpop.assert_called_once_with("banana", "banana", "banana", encoding="utf8", timeout=0)
redis.async_redis_client.brpop.assert_awaited_once_with("banana", "banana", "banana", encoding="utf8", timeout=0)
@async_test
async def test_hset_async(self):
redis.async_redis_client.hset = MagicMock(return_value=asyncio.Future())
redis.async_redis_client.hset.return_value.set_result("dummy")
await redis.hset_async("banana", "banana", "banana")
redis.async_redis_client.hset.assert_called_once_with("banana", "banana", "banana")
redis.async_redis_client.hset.assert_awaited_once_with("banana", "banana", "banana")
@async_test
async def test_srem_async(self):
redis.async_redis_client.srem = MagicMock(return_value=asyncio.Future())
redis.async_redis_client.srem.return_value.set_result(1)
await redis.srem_async("apple", "banana")
redis.async_redis_client.srem.assert_called_once_with("apple", "banana")
redis.async_redis_client.srem.assert_awaited_once_with("apple", "banana")
@async_test
async def test_hdel_async(self):
redis.async_redis_client.hdel = MagicMock(return_value=asyncio.Future())
redis.async_redis_client.hdel.return_value.set_result("dummy")
await redis.hdel_async("banana", "banana", "banana")
redis.async_redis_client.hdel.assert_called_once_with("banana", "banana", "banana")
redis.async_redis_client.hdel.assert_awaited_once_with("banana", "banana", "banana")
def test_cache_put_with_cache_expire(self):
redis.cache_put("banana", "banana", cache_expire=60)
@ -195,6 +145,10 @@ class TestRedisAccess(unittest.TestCase):
redis.brpop_sync("banana")
redis.redis_client.brpop.assert_called_once_with("banana", timeout=0)
def test_brpoplpush(self):
redis.brpoplpush_sync("banana", "apple")
redis.redis_client.brpoplpush.assert_called_once_with("banana", "apple", 0)
def test_get_sync(self):
redis.get_sync("banana")
redis.redis_client.get.assert_called_once_with("banana")

View File

@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:
@ -44,13 +44,19 @@ if TYPE_CHECKING:
_log = logger.get_logger()
BROADCAST_ENABLED = os.environ["BROADCAST"].lower() != "false"
LEVEL = os.environ["LEVEL"]
REDISEARCH_ENDPOINT = os.environ["REDISEARCH_ENDPOINT"]
REDIS_PORT = int(os.environ["REDIS_PORT"]) or 6379
ENABLED = not (LEVEL != "1" and os.environ.get("USE_REDISEARCH") == "false")
if ENABLED:
REDISEARCH_ENDPOINT = os.environ["REDISEARCH_ENDPOINT"]
REDIS_PORT = int(os.environ["REDIS_PORT"]) or 6379
INDEX_L5_VERIFICATION_GENERATION_KEY = "dc:l5_index_generation_complete"
INDEX_GENERATION_KEY = "dc:index_generation_complete"
L5_BLOCK_MIGRATION_KEY = "dc:migrations:l5_block"
BLOCK_MIGRATION_KEY = "dc:migrations:block"
TXN_MIGRATION_KEY = "dc:migrations:txn"
L5_NODES = "dc:nodes:l5"
_escape_transformation = str.maketrans(
{
@ -89,6 +95,7 @@ class Indexes(enum.Enum):
block = "bk"
smartcontract = "sc"
transaction = "tx"
verification = "ver"
_redis_connection = None
@ -103,6 +110,8 @@ def _get_redisearch_index_client(index: str) -> redisearch.Client:
"""
global _redis_connection
if _redis_connection is None:
if not ENABLED:
raise RuntimeError("Redisearch was attempted to be used, but is disabled")
_redis_connection = dragonchain_redis._initialize_redis(host=REDISEARCH_ENDPOINT, port=REDIS_PORT)
return redisearch.Client(index, conn=_redis_connection)
@ -295,23 +304,69 @@ def generate_indexes_if_necessary() -> None:
"""Initialize redisearch with necessary indexes and fill them from storage if migration has not been marked as complete"""
redisearch_redis_client = _get_redisearch_index_client("").redis
needs_generation = not bool(redisearch_redis_client.get(INDEX_GENERATION_KEY))
needs_l5_generation = not bool(redisearch_redis_client.get(INDEX_L5_VERIFICATION_GENERATION_KEY))
# No-op if indexes are marked as already generated
if not needs_generation:
if not needs_generation and not needs_l5_generation:
return
# Create block index
_log.info("Creating block indexes")
_generate_block_indexes()
# Create indexes for transactions
_log.info("Creating transaction indexes")
_generate_transaction_indexes()
# Create smart contract index
_log.info("Creating smart contract indexes")
_generate_smart_contract_indexes()
# Mark index generation as complete
_log.info("Marking redisearch index generation complete")
redisearch_redis_client.delete(BLOCK_MIGRATION_KEY)
redisearch_redis_client.delete(TXN_MIGRATION_KEY)
redisearch_redis_client.set(INDEX_GENERATION_KEY, "a")
if needs_l5_generation:
# Create L5 verification indexes
_generate_l5_verification_indexes()
# Mark index generation as complete
redisearch_redis_client.delete(L5_BLOCK_MIGRATION_KEY)
redisearch_redis_client.set(INDEX_L5_VERIFICATION_GENERATION_KEY, "a")
if needs_generation:
# Create block index
_log.info("Creating block indexes")
_generate_block_indexes()
# Create indexes for transactions
_log.info("Creating transaction indexes")
_generate_transaction_indexes()
# Create smart contract index
_log.info("Creating smart contract indexes")
_generate_smart_contract_indexes()
# Mark index generation as complete
_log.info("Marking redisearch index generation complete")
redisearch_redis_client.delete(BLOCK_MIGRATION_KEY)
redisearch_redis_client.delete(TXN_MIGRATION_KEY)
redisearch_redis_client.set(INDEX_GENERATION_KEY, "a")
def _generate_l5_verification_indexes() -> None:
client = _get_redisearch_index_client(Indexes.verification.value)
try:
client.create_index(
[
redisearch.NumericField("block_id", sortable=True),
redisearch.NumericField("prev_id", sortable=True),
redisearch.NumericField("timestamp", sortable=True),
redisearch.TagField("dc_id"),
]
)
except redis.exceptions.ResponseError as e:
if not str(e).startswith("Index already exists"): # We don't care if index already exists
raise
_log.info("Listing all blocks in storage")
block_paths = storage.list_objects("BLOCK/")
pattern = re.compile(r"BLOCK\/([0-9]+)-([Ll])5(.*)$")
for block_path in block_paths:
if LEVEL == "1" and BROADCAST_ENABLED and re.search(pattern, block_path):
if not client.redis.sismember(L5_BLOCK_MIGRATION_KEY, block_path):
raw_block = storage.get_json_from_object(block_path)
block = l5_block_model.new_from_at_rest(raw_block)
storage_location = block_path.split("/")[1]
try:
put_document(Indexes.verification.value, storage_location, block.export_as_search_index())
except redis.exceptions.ResponseError as e:
if not str(e).startswith("Document already exists"):
raise
else:
_log.info(f"Document {storage_location} already exists")
client.redis.sadd(L5_NODES, block.dc_id)
client.redis.sadd(L5_BLOCK_MIGRATION_KEY, block_path)
else:
_log.info(f"Skipping already indexed L5 block {block_path}")
def _generate_block_indexes() -> None:

View File

@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:
@ -17,7 +17,7 @@
import os
import unittest
from unittest.mock import patch, MagicMock
from unittest.mock import patch, MagicMock, call
import redis
@ -194,6 +194,7 @@ class TestRedisearch(unittest.TestCase):
redisearch._get_redisearch_index_client.assert_any_call("bk")
redisearch._get_redisearch_index_client.assert_any_call("sc")
redisearch._get_redisearch_index_client.assert_any_call("tx")
mock_redis.get.assert_called_once_with("dc:index_generation_complete")
mock_redis.set.assert_called_once()
redisearch._get_redisearch_index_client.assert_any_call("ver")
mock_redis.get.assert_has_calls([call("dc:index_generation_complete"), call("dc:l5_index_generation_complete")])
self.assertEqual(mock_redis.set.call_count, 2)
mock_put_document.assert_called()

View File

@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:

View File

@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:

View File

@ -0,0 +1,320 @@
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:
# Section 6. Trademarks. is deleted and replaced with:
# 6. Trademarks. This License does not grant permission to use the trade
# names, trademarks, service marks, or product names of the Licensor
# and its affiliates, except as required to comply with Section 4(c) of
# the License and to reproduce the content of the NOTICE file.
# You may obtain a copy of the Apache License at
# http://www.apache.org/licenses/LICENSE-2.0
# Unless required by applicable law or agreed to in writing, software
# distributed under the Apache License with the above modification is
# distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the Apache License for the specific
# language governing permissions and limitations under the Apache License.
import time
import string
import secrets
from typing import Dict, Any, Optional, TYPE_CHECKING
from dragonchain.lib.dto import model
from dragonchain import logger
if TYPE_CHECKING:
from dragonchain.lib.types import permissions_doc # noqa: F401
_log = logger.get_logger()
# Default permissions document allows all actions except for create/update/delete api keys
DEFAULT_PERMISSIONS_DOCUMENT: "permissions_doc" = {
"version": "1",
"default_allow": True,
"permissions": {"api_keys": {"allow_create": False, "allow_update": False, "allow_delete": False}},
}
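# For contrast, a hand-written v1 document scoping a key down to read-only block
# access might look like this (illustrative; not referenced elsewhere in this module):
READ_ONLY_BLOCKS_PERMISSIONS_DOCUMENT: "permissions_doc" = {
    "version": "1",
    "default_allow": False,
    "permissions": {"blocks": {"allow_read": True}},
}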
def _check_default_endpoint_permission(api_name_permissions: Dict[str, Any], extra_data: Optional[Dict[str, Any]]) -> Optional[bool]:
"""Helper method which parses an endpoint with a default permission policy
Args:
api_name_permissions: The specific section of the self.permissions_document for the endpoint being checked
extra_data: The extra data from is_key_allowed (not used in this function, but required for compatibility)
Returns:
Boolean of allowed/disallowed if able to be parsed correctly, else None
"""
return api_name_permissions.get("allowed")
def _check_create_transaction_permission(api_name_permissions: Dict[str, Any], extra_data: Optional[Dict[str, Any]]) -> Optional[bool]:
"""Method to check if creating a transaction is allowed
Args:
api_name_permissions: The specific section of the self.permissions_document for create_transaction
extra_data: Dictionary with the key "requested_types" which is an iterable of strings of the transaction types to check if allowed
Returns:
Boolean if allowed to create the transaction(s)
"""
if not extra_data: # Will not have extra data when checking with request authorizer
return True
# Will have extra data here when checking in create transaction library functions
allowed = api_name_permissions.get("allowed")
transaction_type_permissions = api_name_permissions.get("transaction_types")
if transaction_type_permissions is not None:
if allowed is False:
for txn_type in extra_data["requested_types"]:
# All transaction types must be explicitly true (since allowed is explicitly false)
if transaction_type_permissions.get(txn_type) is not True:
return False
return True
else:
for txn_type in extra_data["requested_types"]:
# Only deny if explicitly false (since allowed is not explicitly false)
if transaction_type_permissions.get(txn_type) is False:
return False
return allowed
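# A worked example of the branching above (values illustrative): with "allowed"
# explicitly False, every requested type must be explicitly True, while otherwise a
# type only needs to avoid an explicit False entry.
def _example_create_transaction_checks() -> None:
    strict = {"allowed": False, "transaction_types": {"payments": True}}
    assert _check_create_transaction_permission(strict, {"requested_types": ["payments"]}) is True
    assert _check_create_transaction_permission(strict, {"requested_types": ["payments", "other"]}) is False
    lenient = {"allowed": True, "transaction_types": {"banned": False}}
    assert _check_create_transaction_permission(lenient, {"requested_types": ["banned"]}) is False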
ENDPOINT_MAP = {
"create_api_key": _check_default_endpoint_permission,
"get_api_key": _check_default_endpoint_permission,
"list_api_keys": _check_default_endpoint_permission,
"delete_api_key": _check_default_endpoint_permission,
"update_api_key": _check_default_endpoint_permission,
"get_block": _check_default_endpoint_permission,
"query_blocks": _check_default_endpoint_permission,
"create_interchain": _check_default_endpoint_permission,
"update_interchain": _check_default_endpoint_permission,
"create_interchain_transaction": _check_default_endpoint_permission,
"publish_interchain_transaction": _check_default_endpoint_permission,
"list_interchains": _check_default_endpoint_permission,
"get_interchain": _check_default_endpoint_permission,
"delete_interchain": _check_default_endpoint_permission,
"get_default_interchain": _check_default_endpoint_permission,
"set_default_interchain": _check_default_endpoint_permission,
"get_interchain_legacy": _check_default_endpoint_permission,
"create_interchain_transaction_legacy": _check_default_endpoint_permission,
"get_status": _check_default_endpoint_permission,
"get_contract": _check_default_endpoint_permission,
"get_contract_logs": _check_default_endpoint_permission,
"list_contracts": _check_default_endpoint_permission,
"create_contract": _check_default_endpoint_permission,
"update_contract": _check_default_endpoint_permission,
"delete_contract": _check_default_endpoint_permission,
"get_contract_object": _check_default_endpoint_permission,
"list_contract_objects": _check_default_endpoint_permission,
"create_transaction_type": _check_default_endpoint_permission,
"delete_transaction_type": _check_default_endpoint_permission,
"list_transaction_types": _check_default_endpoint_permission,
"get_transaction_type": _check_default_endpoint_permission,
"create_transaction": _check_create_transaction_permission,
"query_transactions": _check_default_endpoint_permission,
"get_transaction": _check_default_endpoint_permission,
"get_verifications": _check_default_endpoint_permission,
"get_pending_verifications": _check_default_endpoint_permission,
"query_interchain_verifications": _check_default_endpoint_permission,
}
def gen_auth_key() -> str:
"""Generate an auth key string
Returns:
String of the newly generated auth key
"""
# Note a 43 character key with this keyset gives us ~256 bits of entropy for these auth_keys
return "".join(secrets.choice(string.ascii_letters + string.digits) for _ in range(43))
def new_root_key(key_id: str, key: str) -> "APIKeyModel":
"""Create a new root api key model from only the provided key/key_id
Args:
key_id: The key id for this root key
key: The key for this root key
Returns:
Constructed APIKeyModel
"""
return APIKeyModel(
key_id=key_id,
key=key,
registration_time=int(time.time()),
root=True,
nickname="",
interchain=False,
permissions_document={"version": "1", "default_allow": True, "permissions": {}},
)
def new_from_scratch(smart_contract: bool = False, nickname: str = "", interchain_dcid: str = "") -> "APIKeyModel":
"""Create a new api key model from scratch, generating necessary fields
Args:
smart_contract: Whether or not this key is for a smart contract
nickname: The nickname for this api key
interchain_dcid: The dcid of the interchain (if this is an interchain key; otherwise leave blank)
Returns:
Constructed APIKeyModel
"""
if smart_contract and interchain_dcid:
raise RuntimeError("Can't create a smart contract api key that is also an interchain key")
interchain = bool(interchain_dcid)
if not interchain:
key_id = "".join(secrets.choice(string.ascii_uppercase) for _ in range(12))
if smart_contract:
key_id = f"SC_{key_id}"
permissions_document = DEFAULT_PERMISSIONS_DOCUMENT
else:
key_id = interchain_dcid
# Default interchain keys aren't allowed any permissions (can still call Dragon Net reserved interchain endpoints)
permissions_document = {"version": "1", "default_allow": False, "permissions": {}}
return APIKeyModel(
key_id=key_id,
key=gen_auth_key(),
registration_time=int(time.time()),
root=False,
nickname=nickname,
interchain=interchain,
permissions_document=permissions_document,
)
def new_from_at_rest(api_key_data: Dict[str, Any]) -> "APIKeyModel":
"""Construct an api key model from at rest (cached storage)"""
if api_key_data.get("version") == "1":
return APIKeyModel(
key_id=api_key_data["key_id"],
key=api_key_data["key"],
registration_time=api_key_data["registration_time"],
root=api_key_data["root"],
nickname=api_key_data["nickname"],
permissions_document=api_key_data["permissions_document"],
interchain=api_key_data["interchain"],
)
else:
raise NotImplementedError(f"Version {api_key_data.get('version')} is not supported")
def new_from_legacy(api_key_data: Dict[str, Any], interchain_dcid: str) -> "APIKeyModel":
"""Construct an api key model from legacy (pre-4.3.0) api key dto storage"""
permissions_document = DEFAULT_PERMISSIONS_DOCUMENT
if interchain_dcid:
permissions_document = {"version": "1", "default_allow": False, "permissions": {}}
elif api_key_data.get("root"):
permissions_document = {"version": "1", "default_allow": True, "permissions": {}}
return APIKeyModel(
key_id=api_key_data.get("id") or interchain_dcid,
key=api_key_data["key"],
registration_time=api_key_data.get("registration_time") or 0,
root=api_key_data.get("root") or False,
nickname=api_key_data.get("nickname") or "",
interchain=bool(interchain_dcid),
permissions_document=permissions_document,
)
class APIKeyModel(model.Model):
"""
APIKeyModel class is an abstracted representation of an api key
"""
def __init__(
self, key_id: str, key: str, registration_time: int, root: bool, nickname: str, interchain: bool, permissions_document: "permissions_doc"
):
self.key_id = key_id
self.key = key
self.root = root
self.registration_time = registration_time
self.nickname = nickname
self.interchain = interchain
self.permissions_document = permissions_document
def is_key_allowed(
self, api_resource: str, api_operation: str, api_name: str, interchain: bool, extra_data: Optional[Dict[str, Any]] = None
) -> bool:
"""Checks if this keys is allowed to perform an action for a given api endpoint
Args:
api_resource: The resource that the endpoint being checked belongs to. i.e. api_keys, blocks, interchains, etc
api_operation: The CRUD operation that this endpoint is performing. Should be one of: create, read, update, or delete
api_name: The exact name of this api action used for permissioning. i.e. create_api_key, list_contracts, get_status, etc
extra_data: Any extra data required for non-default permission endpoints
Returns:
boolean whether or not this key is allowed to perform the action
"""
# Ensure that if this is a reserved Dragon Net action, only interchain keys can invoke it
if interchain:
return self.interchain
# Interchain keys are not allowed to invoke any other endpoint
if self.interchain:
return False
if self.root:
return True
if self.permissions_document.get("version") == "1":
return self.is_key_allowed_v1(api_resource, api_operation, api_name, extra_data)
else:
_log.error(f"Auth from invalid permissioning on key {self.key_id}\nPermissions: {self.permissions_document}")
raise RuntimeError(f"Invalid permissions document version: {self.permissions_document.get('version')}")
def is_key_allowed_v1(self, api_resource: str, api_operation: str, api_name: str, extra_data: Optional[Dict[str, Any]] = None) -> bool:
"""Checks if a key is allowed with v1 permissions"""
allowed = self.permissions_document["default_allow"]
# Get our per-endpoint validation function now to ensure that api_name is valid before continuing
try:
validation_function = ENDPOINT_MAP[api_name]
except Exception:
# This should never happen
_log.exception(f"Error api_name {api_name} is wrong. This should never happen!")
raise RuntimeError(f"'{api_name}' is not a valid know api_name")
# Check the 'global' CRUD values
group_allow = _process_api_resource(self.permissions_document["permissions"], api_operation)
if group_allow is not None:
allowed = group_allow
# Check the specific api resource CRUD values
api_resource_permissions = self.permissions_document["permissions"].get(api_resource)
if api_resource_permissions:
group_allow = _process_api_resource(api_resource_permissions, api_operation)
if group_allow is not None:
allowed = group_allow
# Check the specific api operation permissions itself
api_name_permissions = api_resource_permissions.get(api_name)
if api_name_permissions:
# Special permissions on a per-endpoint level are handled here
endpoint_allow = validation_function(api_name_permissions, extra_data)
if endpoint_allow is not None:
allowed = endpoint_allow
return allowed
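# Resolution above is most-specific-wins: default_allow, then the global CRUD flag,
# then the resource-level CRUD flag, then the endpoint entry. An illustrative v1
# document where only the endpoint-level entry grants access:
#     {
#         "version": "1",
#         "default_allow": False,                       # 1) deny everything by default
#         "permissions": {
#             "allow_create": False,                    # 2) global create flag
#             "api_keys": {
#                 "allow_create": False,                # 3) resource-level create flag
#                 "create_api_key": {"allowed": True},  # 4) endpoint entry wins
#             },
#         },
#     }
# A key carrying this document may create api keys and do nothing else.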
def export_as_at_rest(self):
return {
"version": "1",
"key_id": self.key_id,
"key": self.key,
"registration_time": self.registration_time,
"root": self.root,
"nickname": self.nickname,
"permissions_document": self.permissions_document,
"interchain": self.interchain,
}
def _process_api_resource(permission_resource: Dict[str, Any], api_operation: str) -> Optional[bool]:
"""Helper method to check if the api action permission is in this API resource
Args:
permission_resource: The dictionary for this permission resource to check
api_operation: The api_operation as defined from is_key_allowed
Returns:
Value of the resource permission if it exists, else None
"""
if api_operation == "create":
return permission_resource.get("allow_create")
elif api_operation == "read":
return permission_resource.get("allow_read")
elif api_operation == "update":
return permission_resource.get("allow_update")
elif api_operation == "delete":
return permission_resource.get("allow_delete")
else:
raise RuntimeError(f"'{api_operation}' is not a valid api_operation (must be one of create, read, update, delete)")

View File

@ -0,0 +1,288 @@
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:
# Section 6. Trademarks. is deleted and replaced with:
# 6. Trademarks. This License does not grant permission to use the trade
# names, trademarks, service marks, or product names of the Licensor
# and its affiliates, except as required to comply with Section 4(c) of
# the License and to reproduce the content of the NOTICE file.
# You may obtain a copy of the Apache License at
# http://www.apache.org/licenses/LICENSE-2.0
# Unless required by applicable law or agreed to in writing, software
# distributed under the Apache License with the above modification is
# distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the Apache License for the specific
# language governing permissions and limitations under the Apache License.
import unittest
from unittest.mock import patch
from dragonchain.lib.dto import api_key_model
def create_generic_api_key_model() -> api_key_model.APIKeyModel:
return api_key_model.APIKeyModel(
key="whatever",
key_id="some_id",
registration_time=0,
nickname="",
root=False,
interchain=False,
permissions_document={"version": "1", "default_allow": True, "permissions": {}},
)
class TestApiKeyModel(unittest.TestCase):
def test_gen_auth_key(self):
auth_key = api_key_model.gen_auth_key()
self.assertRegex(auth_key, r"[a-zA-Z0-9]{43}")
def test_new_from_legacy_parses_old_normal_key_dto(self):
old_dto = {"id": "some_id", "key": "some_key", "registration_time": 1234, "nickname": "banana"}
model = api_key_model.new_from_legacy(old_dto, interchain_dcid="")
self.assertEqual(model.key_id, old_dto["id"])
self.assertEqual(model.key, old_dto["key"])
self.assertEqual(model.registration_time, old_dto["registration_time"])
self.assertEqual(model.nickname, old_dto["nickname"])
self.assertFalse(model.root)
self.assertFalse(model.interchain)
self.assertEqual(model.permissions_document, api_key_model.DEFAULT_PERMISSIONS_DOCUMENT)
def test_new_from_legacy_parses_old_root_key_dto(self):
old_dto = {"id": "some_id", "key": "some_key", "root": True, "registration_time": 0}
model = api_key_model.new_from_legacy(old_dto, interchain_dcid="")
self.assertEqual(model.key_id, old_dto["id"])
self.assertEqual(model.key, old_dto["key"])
self.assertEqual(model.registration_time, old_dto["registration_time"])
self.assertEqual(model.nickname, "")
self.assertTrue(model.root)
self.assertFalse(model.interchain)
self.assertEqual(model.permissions_document, {"version": "1", "default_allow": True, "permissions": {}})
def test_new_from_legacy_parses_old_interchain_key_dto(self):
old_dto = {"key": "some_key", "registration_time": 1234}
model = api_key_model.new_from_legacy(old_dto, interchain_dcid="banana")
self.assertEqual(model.key_id, "banana")
self.assertEqual(model.key, old_dto["key"])
self.assertEqual(model.registration_time, old_dto["registration_time"])
self.assertEqual(model.nickname, "")
self.assertFalse(model.root)
self.assertTrue(model.interchain)
self.assertEqual(model.permissions_document, {"version": "1", "default_allow": False, "permissions": {}})
def test_new_from_at_rest_parses_version_1_dto(self):
v1_dto = {
"version": "1",
"key_id": "some_id",
"key": "some_key",
"registration_time": 1234,
"root": False,
"nickname": "banana",
"interchain": True,
"permissions_document": {},
}
model = api_key_model.new_from_at_rest(v1_dto)
self.assertEqual(model.key_id, v1_dto["key_id"])
self.assertEqual(model.key, v1_dto["key"])
self.assertEqual(model.registration_time, v1_dto["registration_time"])
self.assertEqual(model.nickname, v1_dto["nickname"])
self.assertFalse(model.root)
self.assertTrue(model.interchain)
def test_new_from_at_rest_throws_with_bad_version(self):
bad_version_dto = {"version": "bad"}
self.assertRaises(NotImplementedError, api_key_model.new_from_at_rest, bad_version_dto)
def test_new_root_key_sets_root_and_keys(self):
model = api_key_model.new_root_key("key_id", "key")
self.assertTrue(model.root)
self.assertEqual(model.key_id, "key_id")
self.assertEqual(model.key, "key")
def test_new_from_scratch_generates_api_key_model_and_uses_default_permissions(self):
model = api_key_model.new_from_scratch()
self.assertIsInstance(model, api_key_model.APIKeyModel)
self.assertEqual(model.permissions_document, api_key_model.DEFAULT_PERMISSIONS_DOCUMENT)
def test_new_from_scratch_generates_sc_id_if_contract(self):
model = api_key_model.new_from_scratch(smart_contract=True)
self.assertTrue(model.key_id.startswith("SC_"))
def test_new_from_scratch_uses_interchain_dcid_for_key_id(self):
model = api_key_model.new_from_scratch(interchain_dcid="banana")
self.assertEqual(model.key_id, "banana")
self.assertTrue(model.interchain)
# Also check that the correct permissions document was created
self.assertEqual(model.permissions_document, {"version": "1", "default_allow": False, "permissions": {}})
def test_new_from_scratch_raises_if_dcid_and_contract(self):
self.assertRaises(RuntimeError, api_key_model.new_from_scratch, interchain_dcid="banana", smart_contract=True)
def test_export_as_at_rest_returns_good_dto(self):
model = api_key_model.APIKeyModel(
key_id="some_id",
key="some_key",
registration_time=1234,
root=False,
nickname="Banana",
interchain=False,
permissions_document={"version": "1", "default_allow": True, "permissions": {}},
)
self.assertEqual(
model.export_as_at_rest(),
{
"key": "some_key",
"key_id": "some_id",
"nickname": "Banana",
"registration_time": 1234,
"root": False,
"interchain": False,
"permissions_document": {"version": "1", "default_allow": True, "permissions": {}},
"version": "1",
},
)
@patch("dragonchain.lib.dto.api_key_model.APIKeyModel.is_key_allowed_v1")
def test_root_key_is_allowed(self, is_key_allowed_v1):
model = create_generic_api_key_model()
model.root = True
model.permissions_document["default_allow"] = False
self.assertTrue(model.is_key_allowed("cool", "banana", "salad", interchain=False))
is_key_allowed_v1.assert_not_called()
@patch("dragonchain.lib.dto.api_key_model.APIKeyModel.is_key_allowed_v1")
def test_permissions_doc_v1_is_allowed_uses_is_allowed_v1(self, is_key_allowed_v1):
model = create_generic_api_key_model()
model.permissions_document["default_allow"] = False
self.assertEqual(model.is_key_allowed("cool", "banana", "salad", interchain=False), is_key_allowed_v1.return_value)
def test_is_allowed_handles_interchain_keys(self):
model = create_generic_api_key_model()
model.interchain = True
self.assertTrue(model.is_key_allowed("", "", "", interchain=True))
self.assertFalse(model.is_key_allowed("", "", "", interchain=False))
def test_bad_permission_document_version_raises(self):
model = create_generic_api_key_model()
model.permissions_document = {"version": "banana", "default_allow": False, "permissions": {}}
self.assertRaises(RuntimeError, model.is_key_allowed, "cool", "banana", "salad", interchain=False)
def test_is_allowed_v1_raises_error_on_bad_api_name(self):
model = create_generic_api_key_model()
self.assertRaises(RuntimeError, model.is_key_allowed_v1, "invalid", "api", "name")
def test_is_allowed_v1_uses_default(self):
model = create_generic_api_key_model()
model.permissions_document["default_allow"] = True
self.assertTrue(model.is_key_allowed_v1("api_keys", "create", "create_api_key"))
def test_is_allowed_v1_global_crud_overrides_default(self):
model = create_generic_api_key_model()
model.permissions_document = {"version": "1", "default_allow": False, "permissions": {"allow_create": True}}
self.assertTrue(model.is_key_allowed_v1("api_keys", "create", "create_api_key"))
def test_is_allowed_v1_group_crud_overrides_global_crud(self):
model = create_generic_api_key_model()
model.permissions_document = {
"version": "1",
"default_allow": False,
"permissions": {"allow_create": False, "api_keys": {"allow_create": True}},
}
self.assertTrue(model.is_key_allowed_v1("api_keys", "create", "create_api_key"))
def test_is_allowed_v1_group_crud_ignored_when_no_matching_group_crud(self):
model = create_generic_api_key_model()
model.permissions_document = {
"version": "1",
"default_allow": False,
"permissions": {"allow_create": True, "api_keys": {"something": "whatever"}},
}
self.assertTrue(model.is_key_allowed_v1("api_keys", "create", "create_api_key"))
def test_is_allowed_v1_specific_api_name_overrides_group_crud(self):
model = create_generic_api_key_model()
model.permissions_document = {
"version": "1",
"default_allow": False,
"permissions": {"allow_create": False, "api_keys": {"allow_create": False, "create_api_key": {"allowed": True}}},
}
self.assertTrue(model.is_key_allowed_v1("api_keys", "create", "create_api_key"))
def test_is_allowed_v1_specific_group_ignored_when_no_matching_specific_group(self):
model = create_generic_api_key_model()
model.permissions_document = {
"version": "1",
"default_allow": False,
"permissions": {"allow_create": False, "api_keys": {"allow_create": True, "create_api_key": {"irrelevant": "data"}}},
}
self.assertTrue(model.is_key_allowed_v1("api_keys", "create", "create_api_key"))
def test_is_allowed_v1_raises_with_bad_action(self):
model = create_generic_api_key_model()
self.assertRaises(RuntimeError, model.is_key_allowed_v1, "api_keys", "not_an_action", "create_api_key")
def test_is_allowed_v1_crud_reads_correct_fields(self):
model = create_generic_api_key_model()
model.permissions_document = {
"version": "1",
"default_allow": False,
"permissions": {"allow_create": True, "allow_read": True, "allow_update": True, "allow_delete": True},
}
self.assertTrue(model.is_key_allowed_v1("api_keys", "create", "create_api_key"))
self.assertTrue(model.is_key_allowed_v1("api_keys", "read", "get_api_key"))
self.assertTrue(model.is_key_allowed_v1("api_keys", "update", "update_api_key"))
self.assertTrue(model.is_key_allowed_v1("api_keys", "delete", "delete_api_key"))
def test_check_create_transaction_permission_returns_true_with_no_extra_data(self):
model = create_generic_api_key_model()
model.permissions_document = {
"version": "1",
"default_allow": False,
"permissions": {"transactions": {"create_transaction": {"not": "checked"}}},
}
self.assertTrue(model.is_key_allowed("transactions", "create", "create_transaction", False))
def test_check_create_transaction_works_with_allowed_value_and_provided_extra_data(self):
model = create_generic_api_key_model()
model.permissions_document = {
"version": "1",
"default_allow": False,
"permissions": {"transactions": {"create_transaction": {"allowed": True, "transaction_types": {"banana": False, "salad": True}}}},
}
# Check allowed true, specific type false
self.assertFalse(model.is_key_allowed("transactions", "create", "create_transaction", False, {"requested_types": {"banana"}}))
# Check allowed true, specific type true
self.assertTrue(model.is_key_allowed("transactions", "create", "create_transaction", False, {"requested_types": {"salad"}}))
# Check allowed true, no specific type
self.assertTrue(model.is_key_allowed("transactions", "create", "create_transaction", False, {"requested_types": {"bacon"}}))
# Check allowed true, specific type none and true
self.assertTrue(model.is_key_allowed("transactions", "create", "create_transaction", False, {"requested_types": {"bacon", "salad"}}))
# Check allowed true, specific type none and false
self.assertFalse(model.is_key_allowed("transactions", "create", "create_transaction", False, {"requested_types": {"bacon", "banana"}}))
# Check allowed true, specific type true and false
self.assertFalse(model.is_key_allowed("transactions", "create", "create_transaction", False, {"requested_types": {"banana", "salad"}}))
model.permissions_document["permissions"]["transactions"]["create_transaction"]["allowed"] = False
# Check allowed false, specific type false
self.assertFalse(model.is_key_allowed("transactions", "create", "create_transaction", False, {"requested_types": {"banana"}}))
# Check allowed false, specific type true
self.assertTrue(model.is_key_allowed("transactions", "create", "create_transaction", False, {"requested_types": {"salad"}}))
# Check allowed false, no specific type
self.assertFalse(model.is_key_allowed("transactions", "create", "create_transaction", False, {"requested_types": {"bacon"}}))
# Check allowed false, specific type none and true
self.assertFalse(model.is_key_allowed("transactions", "create", "create_transaction", False, {"requested_types": {"bacon", "salad"}}))
# Check allowed false, specific type none and false
self.assertFalse(model.is_key_allowed("transactions", "create", "create_transaction", False, {"requested_types": {"bacon", "banana"}}))
# Check allowed false, specific type true and false
self.assertFalse(model.is_key_allowed("transactions", "create", "create_transaction", False, {"requested_types": {"banana", "salad"}}))
def test_check_create_transaction_defaults_if_no_allowed_set(self):
model = create_generic_api_key_model()
model.permissions_document = {
"version": "1",
"default_allow": False,
"permissions": {"transactions": {"create_transaction": {"transaction_types": {"banana": False, "salad": True}}}},
}
self.assertFalse(model.is_key_allowed("transactions", "create", "create_transaction", False, {"requested_types": {"bacon"}}))
model.permissions_document["default_allow"] = True
self.assertTrue(model.is_key_allowed("transactions", "create", "create_transaction", False, {"requested_types": {"bacon"}}))


@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:
@ -37,6 +37,7 @@ MAINNET_API_PORT = 1169
TESTNET_RPC_PORT = 26657
TESTNET_API_PORT = 11699
AVERAGE_BLOCK_TIME = 1 # in seconds
CONFIRMATIONS_CONSIDERED_FINAL = 1 # https://docs.binance.org/faq.html#what-is-the-design-principle-of-binance-chain
BLOCK_THRESHOLD = 3 # The number of blocks that can pass by before trying to send another transaction
SEND_FEE = 37500 # transfer fee fixed at 0.000375 BNB : https://docs.binance.org/trading-spec.html#current-fees-table-on-mainnet
@ -162,7 +163,7 @@ class BinanceNetwork(model.InterchainModel):
response_rpc = self._call_node_rpc("status", {}).json()
response_api = self._call_node_api("tokens/BNB").json()
if response_rpc.get("error") or response_api.get("error"):
raise exceptions.InterchainConnectionError(f"[BINANCE] Node ping checks failed!")
raise exceptions.InterchainConnectionError("[BINANCE] Node ping checks failed!")
# https://docs.binance.org/api-reference/node-rpc.html#6114-query-tx
def is_transaction_confirmed(self, transaction_hash: str) -> bool:
@ -202,7 +203,7 @@ class BinanceNetwork(model.InterchainModel):
# cannot rely on HTTP status codes; errors still return 200
if response_json.get("error") is not None:
if "interface is nil, not types.NamedAccount" in response_json["error"]["data"] and response.status_code == 500:
_log.warning(f"[BINANCE] Non 200 response from Binance node:")
_log.warning("[BINANCE] Non 200 response from Binance node:")
_log.warning(f"[BINANCE] response code: {response.status_code}")
_log.warning(f"[BINANCE] response error: {response_json['error']['data']}")
_log.warning("[BINANCE] This is actually expected for a zero balance address.")
@ -338,7 +339,22 @@ class BinanceNetwork(model.InterchainModel):
raise exceptions.BadRequest(f"[BINANCE] Error signing transaction: {e}")
# https://docs.binance.org/api-reference/node-rpc.html#622-broadcasttxcommit
def _publish_transaction(self, transaction_payload: str) -> str:
def publish_transaction(self, signed_transaction: str) -> str:
"""Publish an already signed transaction to this network
Args:
signed_transaction: The already signed transaction from self.sign_transaction
Returns:
The string of the published transaction hash
"""
_log.debug(f"[BINANCE] Publishing transaction {signed_transaction}")
response = self._call_node_rpc("broadcast_tx_commit", {"tx": signed_transaction})
response_json = response.json()
# cannot rely on HTTP status codes; errors still return 200
if response_json.get("error") is not None:
_log.warning(f"[BINANCE] Error response from Binance node: {response_json['error']['data']}")
return response_json["result"]["hash"] # transaction hash
def _publish_l5_transaction(self, transaction_payload: str) -> str:
"""Publish a transaction to this network with a certain data payload
Args:
transaction_payload: The arbitrary data to send with this transaction
@ -350,13 +366,7 @@ class BinanceNetwork(model.InterchainModel):
# send funds to yourself to avoid hardcoding a dummy recipient address
raw_transaction = {"amount": 1, "to_address": self.address, "symbol": "BNB", "memo": transaction_payload}
signed_tx = self.sign_transaction(raw_transaction)
_log.info(f"[BINANCE] Sending signed transaction: {signed_tx}")
response = self._call_node_rpc("broadcast_tx_commit", {"tx": signed_tx})
response_json = response.json()
# cannot rely on HTTP status codes; errors still return 200
if response_json.get("error") is not None:
_log.warning(f"[BINANCE] Error response from Binance node: {response_json['error']['data']}")
return response_json["result"]["hash"] # transaction hash
return self.publish_transaction(signed_tx)
# endpoints currently hit are:
# "status" (ping check)
@ -368,7 +378,7 @@ class BinanceNetwork(model.InterchainModel):
body = {"method": method, "jsonrpc": "2.0", "params": params, "id": "dontcare"}
_log.debug(f"Binance RPC: -> {full_address} {body}")
try:
response = requests.post(full_address, json=body, timeout=30)
response = requests.post(full_address, json=body, timeout=10)
except Exception as e:
raise exceptions.InterchainConnectionError(f"Error sending post request to binance node: {e}")
_log.debug(f"Binance <- {response.status_code} {response.text}")
@ -382,7 +392,7 @@ class BinanceNetwork(model.InterchainModel):
def _call_node_api(self, path: str) -> Any:
full_address = f"{self.node_url}:{self.api_port}/api/v1/{path}"
try:
response = requests.get(full_address, timeout=30)
response = requests.get(full_address, timeout=10)
except Exception as e:
raise exceptions.InterchainConnectionError(f"Error sending get request to binance node: {e}")
_log.debug(f"Binance <- {response.status_code} {response.text}")


@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:
@ -112,7 +112,7 @@ class TestBinanceMethods(unittest.TestCase):
self.assertEqual(response, "ZmFrZV9lbmNvZGVkX3R4bg==")
mock_encode.assert_called_once()
def test_publish_transaction(self):
def test_publish_l5_transaction(self):
fake_response = requests.Response()
fake_response._content = b'{"result": {"hash": "BOGUS_RESULT_HASH"}}'
self.client._build_transaction_msg = MagicMock(return_value={"built_tx": "fake"})
@ -121,7 +121,7 @@ class TestBinanceMethods(unittest.TestCase):
fake_acct._content = b'{"sequence": 0, "account_number": 12345}'
self.client._call_node_api = MagicMock(return_value=fake_acct)
self.client._call_node_rpc = MagicMock(return_value=fake_response)
response = self.client._publish_transaction("DC-L5:_fake_L5_block_hash")
response = self.client._publish_l5_transaction("DC-L5:_fake_L5_block_hash")
self.assertEqual(response, "BOGUS_RESULT_HASH")
self.client._call_node_rpc.assert_called_once_with("broadcast_tx_commit", {"tx": "signed_tx"})
@ -213,14 +213,14 @@ class TestBinanceMethods(unittest.TestCase):
response = self.client._call_node_rpc("MyMethod", {"symbol": "BANANA"})
self.assertEqual(response.json(), {"result": "MyResult"})
mock_post.assert_called_once_with(
"b.a.n.a.n.a:27147/", json={"method": "MyMethod", "jsonrpc": "2.0", "params": {"symbol": "BANANA"}, "id": "dontcare"}, timeout=30
"b.a.n.a.n.a:27147/", json={"method": "MyMethod", "jsonrpc": "2.0", "params": {"symbol": "BANANA"}, "id": "dontcare"}, timeout=10
)
@patch("requests.get", return_value=MagicMock(status_code=200, json=MagicMock(return_value={"result": "MyResult"})))
def test_api_request_success(self, mock_get):
response = self.client._call_node_api("MyPath")
self.assertEqual(response.json(), {"result": "MyResult"})
mock_get.assert_called_once_with("b.a.n.a.n.a:1169/api/v1/MyPath", timeout=30)
mock_get.assert_called_once_with("b.a.n.a.n.a:1169/api/v1/MyPath", timeout=10)
@patch("dragonchain.lib.dto.bnb.requests.post")
def test_rpc_request_error(self, mock_requests):


@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:
@ -32,6 +32,7 @@ DRAGONCHAIN_MAINNET_NODE = "http://internal-Btc-Mainnet-Internal-297595751.us-we
DRAGONCHAIN_TESTNET_NODE = "http://internal-Btc-Testnet-Internal-1334656512.us-west-2.elb.amazonaws.com:18332"
DRAGONCHAIN_NODE_AUTHORIZATION = "Yml0Y29pbnJwYzpkcmFnb24=" # Username: bitcoinrpc | Password: dragon
AVERAGE_BLOCK_TIME = 600 # in seconds (10 minutes)
CONFIRMATIONS_CONSIDERED_FINAL = 6
BLOCK_THRESHOLD = 10 # The number of blocks that can pass by before trying to send another transaction
MINIMUM_SATOSHI_PER_BYTE = 10
@ -253,7 +254,17 @@ class BitcoinNetwork(model.InterchainModel):
"""
return base64.b64encode(self.priv_key.to_bytes()).decode("ascii")
def _publish_transaction(self, transaction_payload: str) -> str:
def publish_transaction(self, signed_transaction: str) -> str:
"""Publish an already signed transaction to this network
Args:
signed_transaction: The already signed transaction from self.sign_transaction
Returns:
The string of the published transaction hash
"""
_log.debug(f"[BTC] Publishing transaction {signed_transaction}")
return self._call("sendrawtransaction", signed_transaction)
def _publish_l5_transaction(self, transaction_payload: str) -> str:
"""Publish a transaction to this network with a certain data payload
Args:
transaction_payload: The arbitrary data to send with this transaction
@ -264,7 +275,7 @@ class BitcoinNetwork(model.InterchainModel):
# Sign transaction data
signed_transaction = self.sign_transaction({"data": transaction_payload})
# Send signed transaction
return self._call("sendrawtransaction", signed_transaction)
return self.publish_transaction(signed_transaction)
def _calculate_transaction_fee(self) -> int:
"""Get the current satoshi/byte fee estimate
@ -308,7 +319,7 @@ class BitcoinNetwork(model.InterchainModel):
self.rpc_address,
json={"method": method, "params": list(args), "id": "1", "jsonrpc": "1.0"},
headers={"Authorization": f"Basic {self.authorization}", "Content-Type": "text/plain"},
timeout=30,
timeout=20,
)
if r.status_code != 200:
raise exceptions.InterchainConnectionError(f"Error from bitcoin node with http status code {r.status_code} | {r.text}")
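A short sketch of the new split in practice: L5 publishing still signs first, while a caller holding an already-signed transaction can now publish it directly. Here `btc_network` is a hypothetical, already-configured BitcoinNetwork instance.

signed_hex = btc_network.sign_transaction({"data": "DC-L5:0xhash"})  # sign locally
txn_hash = btc_network.publish_transaction(signed_hex)  # sendrawtransaction under the hood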


@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:
@ -60,7 +60,7 @@ class TestBitcoinMethods(unittest.TestCase):
def test_publish_creates_signs_and_sends(self):
self.client._call = MagicMock(return_value="MyFakeTransactionHash")
self.client.sign_transaction = MagicMock(return_value="signed_transaction")
response = self.client._publish_transaction("DC-L5:0xhash")
response = self.client._publish_l5_transaction("DC-L5:0xhash")
self.assertEqual(response, "MyFakeTransactionHash")
self.client._call.assert_called_once_with("sendrawtransaction", "signed_transaction")
@ -161,7 +161,7 @@ class TestBitcoinMethods(unittest.TestCase):
"http://whatever",
json={"method": "myMethod", "params": ["arg1", 2, True], "id": "1", "jsonrpc": "1.0"},
headers={"Authorization": "Basic auth", "Content-Type": "text/plain"},
timeout=30,
timeout=20,
)
@patch("requests.post", return_value=MagicMock(status_code=200, json=MagicMock(return_value={"error": "MyResult"})))
@ -171,7 +171,7 @@ class TestBitcoinMethods(unittest.TestCase):
"http://whatever",
json={"method": "myMethod", "params": ["arg1", 2, True], "id": "1", "jsonrpc": "1.0"},
headers={"Authorization": "Basic auth", "Content-Type": "text/plain"},
timeout=30,
timeout=20,
)
@patch("dragonchain.lib.dto.btc.BitcoinNetwork.ping")


@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:
@ -18,6 +18,7 @@
import base64
from typing import Dict, Any
from eth_typing import URI, ChecksumAddress, HexStr
import secp256k1
import web3
import web3.gas_strategies.time_based
@ -33,10 +34,8 @@ DRAGONCHAIN_MAINNET_NODE = "http://internal-Parity-Mainnet-Internal-1844666982.u
DRAGONCHAIN_ROPSTEN_NODE = "http://internal-Parity-Ropsten-Internal-1699752391.us-west-2.elb.amazonaws.com:8545"
# Mainnet ETC
DRAGONCHAIN_CLASSIC_NODE = "http://internal-Parity-Classic-Internal-2003699904.us-west-2.elb.amazonaws.com:8545"
# Testnet ETC
DRAGONCHAIN_MORDEN_NODE = "http://internal-Parity-Morden-Internal-26081757.us-west-2.elb.amazonaws.com:8545"
AVERAGE_BLOCK_TIME = 15 # in seconds
CONFIRMATIONS_CONSIDERED_FINAL = 12
BLOCK_THRESHOLD = 30 # The number of blocks that can pass by before trying to send another transaction
STANDARD_GAS_LIMIT = 60000
@ -76,11 +75,9 @@ def new_from_user_input(user_input: Dict[str, Any]) -> "EthereumNetwork": # noq
user_input["rpc_address"] = DRAGONCHAIN_ROPSTEN_NODE
elif user_input.get("chain_id") == 61:
user_input["rpc_address"] = DRAGONCHAIN_CLASSIC_NODE
elif user_input.get("chain_id") == 62:
user_input["rpc_address"] = DRAGONCHAIN_MORDEN_NODE
else:
raise exceptions.BadRequest(
"If an rpc address is not provided, a valid chain id must be provided. ETH_MAIN = 1, ETH_ROPSTEN = 3, ETC_MAIN = 61, ETC_MORDEN = 62"
"If an rpc address is not provided, a valid chain id must be provided. ETH_MAIN = 1, ETH_ROPSTEN = 3, ETC_MAIN = 61"
)
# Create our client with a still undetermined chain id
try:
@ -95,9 +92,9 @@ def new_from_user_input(user_input: Dict[str, Any]) -> "EthereumNetwork": # noq
except Exception as e:
raise exceptions.BadRequest(f"Error trying to contact ethereum rpc node. Error: {e}")
effective_chain_id = user_input.get("chain_id")
# For ethereum classic, the mainnet/testnet nodes are chain ID 1/2, however their transactions
# must be signed with chain id 61/62, so we have an exception here for the chain id sanity check
if effective_chain_id == 61 or effective_chain_id == 62:
# For ethereum classic, the mainnet node is chain ID 1, however its transactions
# must be signed with chain id 61, so we have an exception here for the chain id sanity check
if effective_chain_id == 61:
effective_chain_id -= 60
# Sanity check that a user-provided chain id matches what the RPC node reports
if isinstance(effective_chain_id, int) and effective_chain_id != reported_chain_id:
@ -131,6 +128,8 @@ def new_from_at_rest(ethereum_network_at_rest: Dict[str, Any]) -> "EthereumNetwo
class EthereumNetwork(model.InterchainModel):
address: ChecksumAddress
def __init__(self, name: str, rpc_address: str, chain_id: int, b64_private_key: str):
self.blockchain = "ethereum"
self.name = name
@ -138,7 +137,7 @@ class EthereumNetwork(model.InterchainModel):
self.chain_id = chain_id
self.priv_key = eth_keys.keys.PrivateKey(base64.b64decode(b64_private_key))
self.address = self.priv_key.public_key.to_checksum_address()
self.w3 = web3.Web3(web3.HTTPProvider(self.rpc_address))
self.w3 = web3.Web3(web3.HTTPProvider(URI(self.rpc_address)))
# Set gas strategy
self.w3.eth.setGasPriceStrategy(web3.gas_strategies.time_based.medium_gas_price_strategy)
@ -185,12 +184,12 @@ class EthereumNetwork(model.InterchainModel):
"""
_log.info(f"[ETHEREUM] Getting confirmations for {transaction_hash}")
try:
transaction_block_number = self.w3.eth.getTransaction(transaction_hash)["blockNumber"]
transaction_block_number = self.w3.eth.getTransaction(HexStr(transaction_hash))["blockNumber"]
except web3.exceptions.TransactionNotFound:
raise exceptions.TransactionNotFound(f"Transaction {transaction_hash} not found")
latest_block_number = self.get_current_block()
_log.info(f"[ETHEREUM] Latest ethereum block number: {latest_block_number} | Block number of transaction: {transaction_block_number}")
return transaction_block_number and (latest_block_number - transaction_block_number) >= CONFIRMATIONS_CONSIDERED_FINAL
return bool(transaction_block_number) and (latest_block_number - transaction_block_number) >= CONFIRMATIONS_CONSIDERED_FINAL
def check_balance(self) -> int:
"""Check the balance of the address for this network
@ -238,7 +237,17 @@ class EthereumNetwork(model.InterchainModel):
"""
return base64.b64encode(self.priv_key.to_bytes()).decode("ascii")
def _publish_transaction(self, transaction_payload: str) -> str:
def publish_transaction(self, signed_transaction: str) -> str:
"""Publish an already signed transaction to this network
Args:
signed_transaction: The already signed transaction from self.sign_transaction
Returns:
The hex string of the published transaction hash
"""
_log.debug(f"[ETH] Publishing transaction {signed_transaction}")
return self.w3.toHex(self.w3.eth.sendRawTransaction(HexStr(signed_transaction)))
def _publish_l5_transaction(self, transaction_payload: str) -> str:
"""Publish a transaction to this network with a certain data payload
Args:
transaction_payload: The arbitrary data to send with this transaction
@ -255,15 +264,15 @@ class EthereumNetwork(model.InterchainModel):
}
)
# Send signed transaction
return self.w3.toHex(self.w3.eth.sendRawTransaction(signed_transaction))
return self.publish_transaction(signed_transaction)
def _calculate_transaction_fee(self) -> int:
"""Get the current gas price estimate
Returns:
Gas price estimate in wei
"""
_log.debug(f"[ETHEREUM] Getting estimated gas price")
gas_price = max(int(self.w3.eth.generateGasPrice()), 100000000) # Calculate gas price, but set minimum to 0.1 gwei for safety
_log.debug("[ETHEREUM] Getting estimated gas price")
gas_price = max(int(self.w3.eth.generateGasPrice() or 0), 100000000) # Calculate gas price, but set minimum to 0.1 gwei for safety
_log.info(f"[ETHEREUM] Current estimated gas price: {gas_price}")
return gas_price


@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:
@ -54,7 +54,7 @@ class TestEthereumMethods(unittest.TestCase):
return_value=b"\xec>s;\xb6\x8a\xbb?\xfa\x87\xa1+\x03\x9at\x9f\xcc\xafXDn\xee\xed\xa9:\xd0\xd5\x9fQ\x03\x8f\xf2"
)
response = self.client._publish_transaction("DC-L5:0xhash")
response = self.client._publish_l5_transaction("DC-L5:0xhash")
self.assertEqual(response, "0xec3e733bb68abb3ffa87a12b039a749fccaf58446eeeeda93ad0d59f51038ff2")
self.client.sign_transaction.assert_called_once_with(
@ -166,10 +166,6 @@ class TestEthereumMethods(unittest.TestCase):
client = eth.new_from_user_input({"version": "1", "name": "banana", "chain_id": 61})
self.assertEqual(client.rpc_address, "http://internal-Parity-Classic-Internal-2003699904.us-west-2.elb.amazonaws.com:8545")
self.assertEqual(client.chain_id, 61) # Ensure the chain id for ETC mainnet is correct
mock_check_chain_id.return_value = 2
client = eth.new_from_user_input({"version": "1", "name": "banana", "chain_id": 62})
self.assertEqual(client.rpc_address, "http://internal-Parity-Morden-Internal-26081757.us-west-2.elb.amazonaws.com:8545")
self.assertEqual(client.chain_id, 62) # Ensure the chain id for ETC testnet is correct
@patch("dragonchain.lib.dto.eth.EthereumNetwork.check_rpc_chain_id", return_value=1)
def test_new_from_user_input_sets_good_private_keys(self, mock_check_chain_id):


@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:
@ -22,6 +22,7 @@ from typing import Dict, Any, List, Set, TYPE_CHECKING
import fastjsonschema
from dragonchain.lib.interfaces import storage
from dragonchain.lib.dto import transaction_model
from dragonchain.lib.dto import schema
from dragonchain.lib.dto import model
@ -182,6 +183,11 @@ class L1BlockModel(model.BlockModel):
"""Export full transactions in block as NDJSON (for storage select when querying)"""
txn_string = ""
for transaction in self.transactions:
txn_string += '{"txn_id": "' + transaction.txn_id + '", '
txn_string += '{"txn_id": "' + transaction.txn_id + '", "stripped_payload": true, '
txn_string += '"txn": ' + json.dumps(transaction.export_as_full(), separators=(",", ":")) + "}\n"
return txn_string
def store_transaction_payloads(self) -> None:
"""Stores full transaction payloads for block"""
for transaction in self.transactions:
storage.put(f"PAYLOADS/{transaction.txn_id}", json.dumps(transaction.payload, separators=(",", ":")).encode("utf-8"))


@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:


@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:


@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:
@ -36,7 +36,23 @@ def new_from_at_rest(block: Dict[str, Any]) -> "L4BlockModel":
validations = []
for item in block["l3-validations"]:
validations.append({"l3_dc_id": item["l3_dc_id"], "l3_block_id": item["l3_block_id"], "l3_proof": item["l3_proof"], "valid": item["valid"]})
if block.get("version") == "2":
if block.get("version") == "3":
return L4BlockModel(
dc_id=block["header"]["dc_id"],
current_ddss=block["header"].get("current_ddss"),
block_id=block["header"]["block_id"],
timestamp=block["header"].get("timestamp") or "-1",
prev_proof=block["header"]["prev_proof"],
scheme=block["proof"]["scheme"],
proof=block["proof"]["proof"],
nonce=block["proof"].get("nonce"),
l1_dc_id=block["header"]["l1_dc_id"],
l1_block_id=block["header"]["l1_block_id"],
l1_proof=block["header"]["l1_proof"],
validations=validations,
chain_name=block["header"]["chain_name"],
)
elif block.get("version") == "2":
return L4BlockModel(
dc_id=block["header"]["dc_id"],
current_ddss=block["header"].get("current_ddss"),
@ -78,6 +94,7 @@ class L4BlockModel(model.BlockModel):
l1_block_id=None,
l1_proof=None,
validations=None,
chain_name="",
):
"""Model Constructor"""
if validations is None:
@ -96,6 +113,7 @@ class L4BlockModel(model.BlockModel):
self.l1_block_id = l1_block_id
self.l1_proof = l1_proof
self.validations = validations
self.chain_name = chain_name
def get_associated_l1_dcid(self) -> str:
"""Interface function for compatibility"""
@ -113,9 +131,10 @@ class L4BlockModel(model.BlockModel):
else:
proof = {"scheme": self.scheme, "proof": self.proof, "nonce": self.nonce}
return {
"version": "2",
"version": "3",
"dcrn": schema.DCRN.Block_L4_At_Rest.value,
"header": {
"chain_name": self.chain_name,
"dc_id": self.dc_id,
"current_ddss": self.current_ddss,
"level": 4,


@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:
@ -16,6 +16,7 @@
# language governing permissions and limitations under the Apache License.
import json
from typing import Dict, Any
import fastjsonschema
@ -29,7 +30,7 @@ _validate_l5_block_at_rest = fastjsonschema.compile(schema.l5_block_at_rest_sche
def new_from_at_rest(block: dict) -> "L5BlockModel":
"""
Used in querying from the DAO
Input: Block::L4::AtRest DTO
Input: Block::L5::AtRest DTO
Returns: BlockModel object
"""
# Validate inputted schema
@ -139,3 +140,12 @@ class L5BlockModel(model.BlockModel):
"l4-blocks": self.l4_blocks,
"proof": proof,
}
def export_as_search_index(self) -> Dict[str, Any]:
"""Export as block search index DTO"""
return {
"block_id": int(self.block_id),
"timestamp": int(self.timestamp),
"prev_id": int(self.prev_id) if self.prev_id else 0,
"dc_id": self.dc_id if self.dc_id else "",
}
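For instance, a hypothetical L5 block would index as (values illustrative):

{"block_id": 4105, "timestamp": 1595270452, "prev_id": 4104, "dc_id": "some-l5-dc-id"}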


@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:
@ -71,7 +71,7 @@ class InterchainModel(Model):
Returns:
String of the transaction hash (or equivalent) for the posted transaction
"""
return self._publish_transaction(f"DC-L5:{l5_block_hash}")
return self._publish_l5_transaction(f"DC-L5:{l5_block_hash}")
def is_transaction_confirmed(self, transaction_hash: str) -> bool:
"""Check if a transaction is considered confirmed
@ -121,8 +121,17 @@ class InterchainModel(Model):
"""
raise NotImplementedError("This is an abstract method")
def _publish_transaction(self, payload: str) -> str:
"""Publish a transaction to this network with a certain data payload
def publish_transaction(self, signed_transaction: str) -> str:
"""Publish an already signed transaction to this network
Args:
signed_transaction: The already signed transaction from self.sign_transaction
Returns:
String of created transaction hash (or equivalent)
"""
raise NotImplementedError("This is an abstract method")
def _publish_l5_transaction(self, payload: str) -> str:
"""Publish an l5 transaction to this network with a certain data payload (l5 block hash)
Args:
transaction_payload: The arbitrary data to send with this transaction
Returns:


@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:


@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:
@ -47,12 +47,6 @@ class DCRN(enum.Enum):
Error_InTransit_Template = "Error::L{}::InTransit"
api_key_create_schema_v1 = {"type": "object", "properties": {"nickname": {"type": "string"}}, "additionalProperties": False}
api_key_update_schema_v1 = {"type": "object", "properties": {"nickname": {"type": "string"}}, "required": ["nickname"], "additionalProperties": False}
interchain_auth_registration_schema_v1 = {
"type": "object",
"properties": {"dcid": {"type": "string"}, "key": {"type": "string"}, "signature": {"type": "string"}},
@ -81,7 +75,7 @@ bulk_transaction_create_schema_v1 = {
"type": "array",
"items": transaction_create_schema_v1,
"minItems": 1,
"maxItems": 250, # Arbitrarily set for now. Feel free to change this if needed
"maxItems": 5000, # Arbitrarily set for now. Feel free to change this if needed
}
@ -112,7 +106,7 @@ transaction_full_schema = {
"payload": {"type": "string"},
"proof": {"type": "object", "properties": {"full": {"type": "string"}, "stripped": {"type": "string"}}, "required": ["full", "stripped"]},
},
"required": ["version", "dcrn", "header", "payload", "proof"],
"required": ["version", "dcrn", "header", "proof"],
}
@ -537,7 +531,12 @@ smart_contract_create_schema_v1 = {
"cmd": {"type": "string"},
"args": {"type": "array", "items": {"type": "string"}},
"env": {"type": "object"},
"secrets": {"type": "object"},
"secrets": {
"type": "object",
# Don't allow secrets to overwrite 'secret-key' or 'auth-key-id'
"patternProperties": {"^(?=(?!secret-key))(?=(?!auth-key-id))[a-z0-9-]+$": {"type": "string"}},
"additionalProperties": False,
},
"seconds": {"type": "integer", "minimum": 1, "maximum": 60},
"cron": {"type": "string"},
"execution_order": {"type": "string", "enum": ["serial", "parallel"]},
@ -560,7 +559,12 @@ smart_contract_update_schema_v1 = {
"cmd": {"type": "string"},
"args": {"type": "array", "items": {"type": "string"}},
"env": {"type": "object"},
"secrets": {"type": "object"},
"secrets": {
"type": "object",
# Don't allow secrets to overwrite 'secret-key' or 'auth-key-id'
"patternProperties": {"^(?=(?!secret-key))(?=(?!auth-key-id))[a-z0-9-]+$": {"type": "string"}},
"additionalProperties": False,
},
"seconds": {"type": "integer", "minimum": 1, "maximum": 60},
"cron": {"type": "string"},
"execution_order": {"type": "string", "enum": ["serial", "parallel"]},
@ -573,6 +577,17 @@ set_default_interchain_schema_v1 = {
"properties": {"version": {"type": "string", "enum": ["1"]}, "blockchain": {"type": "string"}, "name": {"type": "string"}},
}
publish_interchain_transaction_schema_v1 = {
"type": "object",
"properties": {
"version": {"type": "string", "enum": ["1"]},
"blockchain": {"type": "string"},
"name": {"type": "string"},
"signed_txn": {"type": "string"},
},
"additionalProperties": False,
}
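A hypothetical request body matching publish_interchain_transaction_schema_v1 above (the name field identifies a previously created interchain network):

{
    "version": "1",
    "blockchain": "bitcoin",
    "name": "my-btc-network",
    "signed_txn": "<signed transaction produced by sign_transaction>",
}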
# BITCOIN INTERCHAIN #
create_bitcoin_interchain_schema_v1 = {
@ -715,3 +730,151 @@ bnb_transaction_schema_v1 = {
"required": ["version", "amount", "to_address"],
"additionalProperties": False,
}
def add_crud_default_properties(other_properties: Dict[str, Any]):
other_properties["allow_create"] = {"type": "boolean"}
other_properties["allow_read"] = {"type": "boolean"}
other_properties["allow_update"] = {"type": "boolean"}
other_properties["allow_delete"] = {"type": "boolean"}
return other_properties
default_endpoint_property_schema = {
"type": "object",
"properties": {"allowed": {"type": "boolean"}},
"additionalProperties": False,
"required": ["allowed"],
}
create_transaction_endpoint_property_schema = {
"type": "object",
"properties": {"allowed": {"type": "boolean"}, "transaction_types": {"type": "object", "patternProperties": {".*": {"type": "boolean"}}}},
"additionalProperties": False,
}
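For example, a create_transaction entry that permits creation overall but denies one named type, mirroring the model tests earlier in this diff:

{"allowed": True, "transaction_types": {"banana": False}}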
permission_document_schema_v1 = {
"type": "object",
"properties": {
"version": {"type": "string", "enum": ["1"]},
"default_allow": {"type": "boolean"},
"permissions": {
"type": "object",
"properties": add_crud_default_properties(
{
"api_keys": {
"type": "object",
"properties": add_crud_default_properties(
{
"create_api_key": default_endpoint_property_schema,
"get_api_key": default_endpoint_property_schema,
"list_api_keys": default_endpoint_property_schema,
"delete_api_key": default_endpoint_property_schema,
"update_api_key": default_endpoint_property_schema,
}
),
"additionalProperties": False,
},
"blocks": {
"type": "object",
"properties": add_crud_default_properties(
{"get_block": default_endpoint_property_schema, "query_blocks": default_endpoint_property_schema}
),
"additionalProperties": False,
},
"interchains": {
"type": "object",
"properties": add_crud_default_properties(
{
"create_interchain": default_endpoint_property_schema,
"update_interchain": default_endpoint_property_schema,
"create_interchain_transaction": default_endpoint_property_schema,
"publish_interchain_transaction": default_endpoint_property_schema,
"list_interchains": default_endpoint_property_schema,
"get_interchain": default_endpoint_property_schema,
"delete_interchain": default_endpoint_property_schema,
"get_default_interchain": default_endpoint_property_schema,
"set_default_interchain": default_endpoint_property_schema,
"get_interchain_legacy": default_endpoint_property_schema,
"create_interchain_transaction_legacy": default_endpoint_property_schema,
}
),
"additionalProperties": False,
},
"misc": {
"type": "object",
"properties": add_crud_default_properties({"get_status": default_endpoint_property_schema}),
"additionalProperties": False,
},
"contracts": {
"type": "object",
"properties": add_crud_default_properties(
{
"get_contract": default_endpoint_property_schema,
"get_contract_logs": default_endpoint_property_schema,
"list_contracts": default_endpoint_property_schema,
"create_contract": default_endpoint_property_schema,
"update_contract": default_endpoint_property_schema,
"delete_contract": default_endpoint_property_schema,
"get_contract_object": default_endpoint_property_schema,
"list_contract_objects": default_endpoint_property_schema,
}
),
"additionalProperties": False,
},
"transaction_types": {
"type": "object",
"properties": add_crud_default_properties(
{
"create_transaction_type": default_endpoint_property_schema,
"delete_transaction_type": default_endpoint_property_schema,
"list_transaction_types": default_endpoint_property_schema,
"get_transaction_type": default_endpoint_property_schema,
}
),
"additionalProperties": False,
},
"transactions": {
"type": "object",
"properties": add_crud_default_properties(
{
"create_transaction": create_transaction_endpoint_property_schema,
"query_transactions": default_endpoint_property_schema,
"get_transaction": default_endpoint_property_schema,
}
),
"additionalProperties": False,
},
"verifications": {
"type": "object",
"properties": add_crud_default_properties(
{
"get_verifications": default_endpoint_property_schema,
"get_pending_verifications": default_endpoint_property_schema,
"query_interchain_verifications": default_endpoint_property_schema,
}
),
"additionalProperties": False,
},
}
),
"additionalProperties": False,
},
},
"required": ["version", "default_allow", "permissions"],
"additionalProperties": False,
}
api_key_create_schema_v1 = {
"type": "object",
"properties": {"nickname": {"type": "string"}, "permissions_document": permission_document_schema_v1},
"additionalProperties": False,
}
api_key_update_schema_v1 = {
"type": "object",
"properties": {"nickname": {"type": "string"}, "permissions_document": permission_document_schema_v1},
"additionalProperties": False,
}
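Putting it together, a hypothetical body for api_key_create_schema_v1 that provisions a key able only to read blocks:

{
    "nickname": "read-only-blocks",
    "permissions_document": {
        "version": "1",
        "default_allow": False,
        "permissions": {"blocks": {"allow_read": True}},
    },
}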


@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:


@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:
@ -164,7 +164,6 @@ class TransactionModel(model.Model):
"tag": self.tag,
"invoker": self.invoker or "",
},
"payload": self.payload,
"proof": {"full": self.full_hash, "stripped": self.signature},
}
@ -234,7 +233,7 @@ class TransactionModel(model.Model):
indexable_object = jsonpath.jsonpath(json_payload, path)
_log.debug(f"indexable_object: {indexable_object}")
# If we found a valid item at the specified indexable path
if indexable_object and isinstance(indexable_object, list) and len(indexable_object) == 1:
if indexable_object and indexable_object[0] and isinstance(indexable_object, list) and len(indexable_object) == 1:
index_item = indexable_object[0]
# Check that the item we extracted is a string for tag or text type custom indexes
if index["type"] == "tag" or index["type"] == "text":


@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:


@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:


@ -1,3 +1,20 @@
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:
# Section 6. Trademarks. is deleted and replaced with:
# 6. Trademarks. This License does not grant permission to use the trade
# names, trademarks, service marks, or product names of the Licensor
# and its affiliates, except as required to comply with Section 4(c) of
# the License and to reproduce the content of the NOTICE file.
# You may obtain a copy of the Apache License at
# http://www.apache.org/licenses/LICENSE-2.0
# Unless required by applicable law or agreed to in writing, software
# distributed under the Apache License with the above modification is
# distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the Apache License for the specific
# language governing permissions and limitations under the Apache License.
import os
import base64
import json


@ -1,9 +1,25 @@
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:
# Section 6. Trademarks. is deleted and replaced with:
# 6. Trademarks. This License does not grant permission to use the trade
# names, trademarks, service marks, or product names of the Licensor
# and its affiliates, except as required to comply with Section 4(c) of
# the License and to reproduce the content of the NOTICE file.
# You may obtain a copy of the Apache License at
# http://www.apache.org/licenses/LICENSE-2.0
# Unless required by applicable law or agreed to in writing, software
# distributed under the Apache License with the above modification is
# distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the Apache License for the specific
# language governing permissions and limitations under the Apache License.
import unittest
from unittest.mock import patch, MagicMock
import base64
import os
from dragonchain.lib import faas
from dragonchain import exceptions


@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:


@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:


@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:


@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:
@ -91,7 +91,7 @@ def select_transaction(location: str, block_id: str, txn_id: str) -> dict:
obj = s3.select_object_content(
Bucket=location,
Key=f"TRANSACTION/{block_id}",
Expression=f"select s.txn from s3object s where s.txn_id = '{txn_id}' limit 1", # nosec (this s3 select query is safe)
Expression=f"select s.txn, s.stripped_payload from s3object s where s.txn_id = '{txn_id}' limit 1", # nosec (this s3 select query is safe)
ExpressionType="SQL",
InputSerialization={"JSON": {"Type": "DOCUMENT"}},
OutputSerialization={"JSON": {"RecordDelimiter": "\n"}},
@ -104,7 +104,14 @@ def select_transaction(location: str, block_id: str, txn_id: str) -> dict:
if event.get("Records"):
txn_data = f'{txn_data}{event["Records"]["Payload"].decode("utf-8")}'
if txn_data:
return json.loads(txn_data)["txn"]
loaded_txn = json.loads(txn_data)
if loaded_txn.get("stripped_payload"):
payload_key = f"PAYLOADS/{txn_id}"
if does_object_exist(location, payload_key):
loaded_txn["txn"]["payload"] = json.loads(get(location, payload_key).decode("utf-8"))
else:
loaded_txn["txn"]["payload"] = json.dumps({})
return loaded_txn["txn"]
raise exceptions.NotFound


@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:
@ -62,7 +62,7 @@ class TestS3Interface(unittest.TestCase):
mock_select_object_content.assert_called_once_with(
Bucket="loc",
Key="TRANSACTION/block",
Expression="select s.txn from s3object s where s.txn_id = 'txn' limit 1",
Expression="select s.txn, s.stripped_payload from s3object s where s.txn_id = 'txn' limit 1",
ExpressionType="SQL",
InputSerialization={"JSON": {"Type": "DOCUMENT"}},
OutputSerialization={"JSON": {"RecordDelimiter": "\n"}},


@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:


@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:


@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:


@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:
@ -100,6 +100,12 @@ def select_transaction(location: str, block_id: str, txn_id: str) -> dict:
try:
loaded_txn = json.loads(transaction)
if loaded_txn["txn_id"] == txn_id:
if loaded_txn.get("stripped_payload"):
payload_key = os.path.join("PAYLOADS", txn_id)
if does_object_exist(location, payload_key):
loaded_txn["txn"]["payload"] = json.loads(get(location, payload_key).decode("utf-8"))
else:
loaded_txn["txn"]["payload"] = json.dumps({})
return loaded_txn["txn"]
except Exception:
_log.exception("Error loading retrieved transaction from disk select_transaction")


@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:


@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:


@ -1,4 +1,4 @@
# Copyright 2019 Dragonchain, Inc.
# Copyright 2020 Dragonchain, Inc.
# Licensed under the Apache License, Version 2.0 (the "Apache License")
# with the following modification; you may not use this file except in
# compliance with the Apache License and the following modification to it:
