chore: add import ordering rule #255

Merged · 4 commits · Aug 27, 2024
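
The diff hunks below all rewrite `use` statements into the same layout: one group for the standard library, one for external and workspace crates, and one for `crate`/`super` imports, with imports from the same crate merged into a single `use`. This is consistent with rustfmt's `group_imports = "StdExternalCrate"` and `imports_granularity = "Crate"` settings, but the configuration change itself is not visible in this view, so treat those option names as an assumption. A minimal illustration of the target style, using imports drawn from the changed files:

```rust
// Illustrative only — mirrors the import ordering applied throughout this PR.

// Group 1: standard library, merged into one `use` per crate.
use std::{collections::VecDeque, sync::Arc, time::Duration};

// Group 2: external and workspace crates, alphabetized.
use alpen_express_db::types::{L1TxEntry, L1TxStatus};
use bitcoin::{hashes::Hash, Txid};
use tokio::sync::mpsc;
use tracing::*;

// Group 3: `super` and `crate` imports last.
use super::task::broadcaster_task;
use crate::rpc::traits::{L1Client, SeqL1Client};
```
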
58 changes: 58 additions & 0 deletions CONTRIBUTING.md
@@ -0,0 +1,58 @@
# Contribution Guidelines

## Development Tools

Please install the following tools in your development environment so that you can
run the basic CI checks locally (a short usage sketch follows the list of tools):

- `taplo`

This is a tool that is used to lint and format `TOML` files. You can install it with:

```bash
brew install taplo
```

You can learn more [here](https://taplo.tamasfe.dev/cli/installation/binary.html).

- `codespell`

This is a tool that is used to check for common misspellings in code. You can install it with:

```bash
pip install codespell # or `pip3 install codespell`
```

You can learn more [here](https://github.com/codespell-project/codespell).

- `nextest`

This is a modern test runner for Rust. You can install it with:

```bash
cargo install cargo-nextest --locked
```

Learn more [here](https://nexte.st).

- `cargo audit`

This is a tool that checks `Cargo.lock` files for crates with known security vulnerabilities. You can install it with:

```bash
cargo install --locked cargo-audit
```

Learn more [here](https://docs.rs/cargo-audit/latest/cargo_audit/).

- Functional test runner

For the dependencies required to run functional tests, see the instructions in [`README.md`](./functional-tests/README.md).
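
Once these tools are installed, the individual checks can also be run by hand. The following is a minimal sketch assuming default configurations; the exact flags used by CI may differ:

```bash
# Check TOML formatting without rewriting files
taplo fmt --check

# Scan the repository for common misspellings
codespell

# Run the Rust test suite through nextest
cargo nextest run --workspace

# Audit Cargo.lock for crates with known vulnerabilities
cargo audit
```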

## Before Creating a PR

Before you create a PR, make sure that all the required CI checks pass locally.
For convenience, a `Makefile` recipe is provided, which you can run via:

```bash
make pr # `make` should already be installed on most systems
```
49 changes: 0 additions & 49 deletions README.md
@@ -87,52 +87,3 @@ Start CL Client/Sequencer
cargo run --bin alpen-express-sequencer -- --config config.toml
```

## Contribution Guidelines

### Development Tools

Please install the following tools in your development environment to make sure that
you can run the basic CI checks in your local environment:

- `taplo`

This is a tool that is used to lint and format `TOML` files. You can install it with:

```bash
brew install taplo
```

You can learn more [here](https://taplo.tamasfe.dev/cli/installation/binary.html).

- `codespell`

This is a tool that is used to check for common misspellings in code. You can install it with:

```bash
pip install codespell # or `pip3 install codespell`
```

You can learn more [here](https://github.com/codespell-project/codespell).

- `nextest`

This is a modern test runner for Rust. You can install it with:

```bash
cargo install --locked nextest
```

Learn more [here](https://nexte.st).

- Functional test runner

For dependencies required to run functional tests, see instructions in its [`README.md`](./functional-tests/README.md).

### Before Creating a PR

Before you create a PR, make sure that all the required CI checks pass locally.
For your convenience, a `Makefile` recipe has been created which you can run via:

```bash
make pr # `make` should already be installed in most systems
```
10 changes: 4 additions & 6 deletions crates/btcio/src/broadcaster/handle.rs
@@ -1,19 +1,17 @@
use std::sync::Arc;

use express_tasks::TaskExecutor;
use tokio::sync::mpsc;
use tracing::*;

use alpen_express_db::{
types::{L1TxEntry, L1TxStatus},
DbResult,
};
use alpen_express_primitives::buf::Buf32;
use express_storage::BroadcastDbOps;

use crate::rpc::traits::{L1Client, SeqL1Client};
use express_tasks::TaskExecutor;
use tokio::sync::mpsc;
use tracing::*;

use super::task::broadcaster_task;
use crate::rpc::traits::{L1Client, SeqL1Client};

pub struct L1BroadcastHandle {
ops: Arc<BroadcastDbOps>,
15 changes: 7 additions & 8 deletions crates/btcio/src/broadcaster/task.rs
@@ -1,13 +1,13 @@
use std::{collections::BTreeMap, sync::Arc, time::Duration};

use alpen_express_db::types::{ExcludeReason, L1TxEntry, L1TxStatus};
use alpen_express_primitives::buf::Buf32;
use bitcoin::{hashes::Hash, Txid};
use express_storage::{ops::l1tx_broadcast, BroadcastDbOps};
use tokio::sync::mpsc::Receiver;
use tracing::*;

use alpen_express_db::types::{ExcludeReason, L1TxEntry, L1TxStatus};

use super::error::BroadcasterResult;
use crate::{
broadcaster::{error::BroadcasterError, state::BroadcasterState},
rpc::{
@@ -16,8 +16,6 @@ use crate::{
},
};

use super::error::BroadcasterResult;

// TODO: make these configurable, get from config
const BROADCAST_POLL_INTERVAL: u64 = 1000; // millis
const FINALITY_DEPTH: u64 = 6;
@@ -206,14 +204,15 @@ async fn send_tx(
#[cfg(test)]
mod test {
use alpen_express_db::{traits::TxBroadcastDatabase, types::ExcludeReason};
use alpen_express_rocksdb::broadcaster::db::{BroadcastDatabase, BroadcastDb};
use alpen_express_rocksdb::test_utils::get_rocksdb_tmp_instance;
use alpen_express_rocksdb::{
broadcaster::db::{BroadcastDatabase, BroadcastDb},
test_utils::get_rocksdb_tmp_instance,
};
use alpen_test_utils::ArbitraryGenerator;
use express_storage::ops::l1tx_broadcast::Context;

use crate::test_utils::TestBitcoinClient;

use super::*;
use crate::test_utils::TestBitcoinClient;

fn get_db() -> Arc<impl TxBroadcastDatabase> {
let (db, dbops) = get_rocksdb_tmp_instance().unwrap();
23 changes: 14 additions & 9 deletions crates/btcio/src/reader/query.rs
@@ -1,18 +1,23 @@
use std::collections::VecDeque;
use std::sync::Arc;
use std::time::{Duration, SystemTime, UNIX_EPOCH};
use std::{
collections::VecDeque,
sync::Arc,
time::{Duration, SystemTime, UNIX_EPOCH},
};

use alpen_express_rpc_types::types::L1Status;
use anyhow::bail;
use bitcoin::{Block, BlockHash};
use tokio::sync::{mpsc, RwLock};
use tracing::*;

use alpen_express_rpc_types::types::L1Status;

use super::config::ReaderConfig;
use super::messages::{BlockData, L1Event};
use crate::rpc::traits::L1Client;
use crate::status::{apply_status_updates, StatusUpdate};
use super::{
config::ReaderConfig,
messages::{BlockData, L1Event},
};
use crate::{
rpc::traits::L1Client,
status::{apply_status_updates, StatusUpdate},
};

fn filter_interesting_txs(block: &Block) -> Vec<u32> {
// TODO actually implement the filter logic. Now it returns everything
36 changes: 17 additions & 19 deletions crates/btcio/src/rpc/client.rs
@@ -1,32 +1,30 @@
use std::sync::atomic::AtomicU64;
use std::time::Duration;
use std::{fmt::Display, str::FromStr};
use std::{fmt::Display, str::FromStr, sync::atomic::AtomicU64, time::Duration};

use async_trait::async_trait;
use bitcoin::consensus::encode::{deserialize_hex, serialize_hex};
use bitcoin::Txid;

use base64::engine::general_purpose;
use base64::Engine;
use base64::{engine::general_purpose, Engine};
use bitcoin::{
block::{Header, Version},
consensus::deserialize,
consensus::{
deserialize,
encode::{deserialize_hex, serialize_hex},
},
hash_types::TxMerkleNode,
hashes::Hash as _,
Address, Block, BlockHash, CompactTarget, Network, Transaction,
Address, Block, BlockHash, CompactTarget, Network, Transaction, Txid,
};
use reqwest::header::HeaderMap;
use reqwest::StatusCode;
use reqwest::{header::HeaderMap, StatusCode};
use serde::{Deserialize, Serialize};
use serde_json::{json, to_value, value::RawValue, value::Value};
use tracing::*;

use super::{traits::SeqL1Client, types::RPCTransactionInfo};

use serde_json::{
json, to_value,
value::{RawValue, Value},
};
use thiserror::Error;
use tracing::*;

use super::traits::L1Client;
use super::types::{RawUTXO, RpcBlockchainInfo};
use super::{
traits::{L1Client, SeqL1Client},
types::{RPCTransactionInfo, RawUTXO, RpcBlockchainInfo},
};

const MAX_RETRIES: u32 = 3;

3 changes: 2 additions & 1 deletion crates/btcio/src/rpc/types.rs
@@ -104,9 +104,10 @@ where
#[cfg(test)]
mod test {

use super::*;
use serde::Deserialize;

use super::*;

#[derive(Deserialize)]
struct TestStruct {
#[serde(deserialize_with = "deserialize_satoshis")]
3 changes: 1 addition & 2 deletions crates/btcio/src/status.rs
@@ -1,8 +1,7 @@
use std::sync::Arc;

use tokio::sync::RwLock;

use alpen_express_rpc_types::types::L1Status;
use tokio::sync::RwLock;

#[derive(Debug, Clone)]
pub enum StatusUpdate {
9 changes: 4 additions & 5 deletions crates/btcio/src/writer/broadcast.rs
@@ -2,17 +2,16 @@

use std::{sync::Arc, time::Duration};

use alpen_express_db::{
traits::{SeqDataProvider, SeqDataStore, SequencerDatabase},
types::BlobL1Status,
};
use alpen_express_rpc_types::L1Status;
use anyhow::anyhow;
use bitcoin::{consensus::deserialize, Txid};
use tokio::sync::RwLock;
use tracing::*;

use alpen_express_db::{
traits::{SeqDataProvider, SeqDataStore, SequencerDatabase},
types::BlobL1Status,
};

use crate::{
rpc::{
traits::{L1Client, SeqL1Client},
3 changes: 1 addition & 2 deletions crates/btcio/src/writer/builder.rs
@@ -31,13 +31,12 @@ use bitcoin::{
use rand::RngCore;
use thiserror::Error;

use super::config::{InscriptionFeePolicy, WriterConfig};
use crate::rpc::{
traits::{L1Client, SeqL1Client},
types::RawUTXO,
};

use super::config::{InscriptionFeePolicy, WriterConfig};

const BITCOIN_DUST_LIMIT: u64 = 546;

// TODO: these might need to be in rollup params
3 changes: 2 additions & 1 deletion crates/btcio/src/writer/config.rs
@@ -1,6 +1,7 @@
use bitcoin::Address;
use std::str::FromStr;

use bitcoin::Address;

#[derive(Debug, Clone)]
pub struct WriterConfig {
/// The sequencer change_address. This is where the reveal txn spends its utxo to
25 changes: 11 additions & 14 deletions crates/btcio/src/writer/utils.rs
@@ -1,20 +1,16 @@
use std::sync::Arc;

use alpen_express_db::{
traits::{SeqDataProvider, SeqDataStore, SequencerDatabase},
types::{BlobEntry, BlobL1Status},
};
use alpen_express_primitives::buf::Buf32;
use anyhow::Context;
use bitcoin::hashes::Hash;
use bitcoin::{consensus::serialize, Transaction};
use bitcoin::{consensus::serialize, hashes::Hash, Transaction};
use sha2::{Digest, Sha256};

use super::{builder::build_inscription_txs, config::WriterConfig};
use crate::rpc::traits::{L1Client, SeqL1Client};
use alpen_express_db::types::BlobL1Status;
use alpen_express_db::{
traits::{SeqDataProvider, SeqDataStore, SequencerDatabase},
types::BlobEntry,
};

use super::builder::build_inscription_txs;
use super::config::WriterConfig;

// Helper function to fetch a blob entry from within tokio
pub async fn get_blob_by_idx<D: SequencerDatabase + Send + Sync + 'static>(
@@ -143,16 +139,17 @@ pub fn calculate_blob_hash(blob: &[u8]) -> Buf32 {
mod test {
use std::{str::FromStr, sync::Arc};

use bitcoin::{Address, Network};

use alpen_express_db::traits::SequencerDatabase;
use alpen_express_rocksdb::{
sequencer::db::SequencerDB, test_utils::get_rocksdb_tmp_instance, SeqDb,
};
use bitcoin::{Address, Network};

use super::*;
use crate::test_utils::TestBitcoinClient;
use crate::writer::config::{InscriptionFeePolicy, WriterConfig};
use crate::{
test_utils::TestBitcoinClient,
writer::config::{InscriptionFeePolicy, WriterConfig},
};

fn get_db() -> Arc<SequencerDB<SeqDb>> {
let (db, db_ops) = get_rocksdb_tmp_instance().unwrap();