Fuzzing Bark for server reliability
Ark users are always in control of their bitcoin. They can perform an emergency exit to get back on-chain whenever they need to. But Bark is a client-server system, and users depend on the server to make payments and refresh their VTXOs before expiry. Careful engineering, rigorous code review, and thorough testing go a long way, but we wanted to go further, so we've started fuzzing Bark.
Sniper vs. shotgun
Traditional testing is targeted: you write tests for bugs you can anticipate. Fuzzing takes the opposite approach, bombarding the system with random, malformed inputs to surface bugs you'd never think to look for.
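As a toy illustration of the shotgun approach, here's a minimal random-input loop. Everything in it is hypothetical — a stand-in `parse` function with a planted bug and a tiny xorshift PRNG — not Bark code, and unlike a real fuzzer it isn't coverage-guided:

```rust
use std::panic;

// Stand-in parser with a planted bug: it panics on an input prefix that a
// hand-written test suite might never think to try.
fn parse(data: &[u8]) -> Option<u32> {
    if data.first() == Some(&0xFF) {
        panic!("unhandled prefix byte");
    }
    data.get(..4).map(|b| u32::from_le_bytes(b.try_into().unwrap()))
}

// Dependency-free xorshift PRNG, just to keep the sketch self-contained.
fn random_bytes(state: &mut u64, n: usize) -> Vec<u8> {
    (0..n)
        .map(|_| {
            *state ^= *state << 13;
            *state ^= *state >> 7;
            *state ^= *state << 17;
            (*state & 0xFF) as u8
        })
        .collect()
}

fn main() {
    panic::set_hook(Box::new(|_| {})); // silence per-panic output
    let mut state = 0x9E37_79B9_7F4A_7C15_u64;
    let mut crashes = 0;
    for _ in 0..10_000 {
        let input = random_bytes(&mut state, 8);
        if panic::catch_unwind(|| parse(&input)).is_err() {
            crashes += 1;
        }
    }
    println!("found {} crashing inputs", crashes);
}
```

A coverage-guided fuzzer improves on this blind loop by instrumenting the binary and mutating inputs that reach new code paths, which is what makes it practical on real parsers.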
VTXO roundtrip: deserialize, serialize, repeat
We use Honggfuzz, a coverage-guided fuzzer that instruments the binary and systematically explores code paths. We built two helper scripts around it: fuzz.sh to run targets, and debug.sh to analyze crashes when they're found.
Our first fuzz target is a VTXO roundtrip test. Feed random bytes into the VTXO deserializer, and if it parses successfully, re-serialize and deserialize again, then check the results match. If it panics at any step, that's a bug.
fn do_test(data: &[u8]) {
    let result: Result<Vtxo<ServerVtxoPolicy>, ProtocolDecodingError> =
        Vtxo::deserialize(&mut data.as_ref());
    if let Ok(vtxo) = result {
        let serialized = vtxo.serialize();
        let vtxo2: Vtxo<ServerVtxoPolicy> =
            Vtxo::deserialize(&mut serialized.as_slice())
                .expect("re-deserialization should succeed");
        let serialized2 = vtxo2.serialize();
        assert_eq!(
            serialized, serialized2,
            "serialization should be deterministic"
        );
    }
}
Unchecked allocation crashes the decoder
It paid off quickly. Fuzzing found a bug where a malformed VTXO could claim an arbitrary vector length during deserialization. The decoder would call Vec::with_capacity with whatever size the input specified, causing a capacity overflow panic. A single crafted request could have brought down the server.
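To make the failure mode concrete, here's a sketch of the vulnerable pattern — a hypothetical `naive_decode_vec`, not Bark's actual decoder: it trusts a length prefix and pre-allocates before validating anything.

```rust
use std::panic;

// Hypothetical naive decoder: reads a little-endian u64 length prefix and
// pre-allocates that many bytes, trusting the input completely.
fn naive_decode_vec(input: &[u8]) -> Option<Vec<u8>> {
    let len = u64::from_le_bytes(input.get(..8)?.try_into().ok()?) as usize;
    let mut buf = Vec::with_capacity(len); // panics on absurd lengths
    buf.extend_from_slice(input.get(8..8 + len)?);
    Some(buf)
}

fn main() {
    // Eight bytes claiming a u64::MAX-length vector, plus a tiny payload.
    let mut crafted = u64::MAX.to_le_bytes().to_vec();
    crafted.extend_from_slice(b"boom");
    let result = panic::catch_unwind(|| naive_decode_vec(&crafted));
    assert!(result.is_err()); // capacity overflow panic
}
```

The panic comes from `Vec::with_capacity` itself, which aborts with a "capacity overflow" panic when the requested capacity exceeds `isize::MAX` bytes — well before any honest bounds check in the decoder would run.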
Stable Rust doesn't yet offer try_with_capacity to handle this natively, so we added an explicit check. We introduced a MAX_VEC_SIZE of 4 MB (borrowing rust-bitcoin's approach) and a guard that runs before every vector allocation during decoding:
pub const MAX_VEC_SIZE: usize = 4_000_000;

impl OversizedVectorError {
    pub fn check<T>(requested: usize) -> Result<(), Self> {
        let max = MAX_VEC_SIZE / mem::size_of::<T>();
        if requested > max {
            Err(Self { requested, max })
        } else {
            Ok(())
        }
    }
}
The decoder now returns a clean error instead of panicking. Without fuzzing, this would have gone unnoticed until someone exploited it.
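For illustration, a decode step using this guard might look like the following self-contained sketch. The `decode_bytes` function and the error's fields are assumptions made for the example; the `check` logic mirrors the guard above:

```rust
use std::mem;

pub const MAX_VEC_SIZE: usize = 4_000_000;

#[derive(Debug, PartialEq)]
pub struct OversizedVectorError {
    requested: usize,
    max: usize,
}

impl OversizedVectorError {
    pub fn check<T>(requested: usize) -> Result<(), Self> {
        let max = MAX_VEC_SIZE / mem::size_of::<T>();
        if requested > max {
            Err(Self { requested, max })
        } else {
            Ok(())
        }
    }
}

// Hypothetical decode step: validate the claimed length *before* allocating,
// so a malicious length prefix yields an error instead of a panic.
fn decode_bytes(input: &[u8]) -> Result<Vec<u8>, OversizedVectorError> {
    let len = u64::from_le_bytes(input[..8].try_into().unwrap()) as usize;
    OversizedVectorError::check::<u8>(len)?;
    let mut buf = Vec::with_capacity(len); // bounded: at most MAX_VEC_SIZE
    buf.extend_from_slice(&input[8..(8 + len).min(input.len())]);
    Ok(buf)
}

fn main() {
    let mut crafted = u64::MAX.to_le_bytes().to_vec();
    crafted.push(0);
    assert!(decode_bytes(&crafted).is_err()); // clean error, no panic
}
```

Dividing `MAX_VEC_SIZE` by `size_of::<T>()` makes the limit a byte budget rather than an element count, so a vector of large structs is capped proportionally tighter than a vector of bytes.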
Continuous fuzzing and public test vectors
We have a fuzzer running around the clock, and we're setting up a workflow to routinely push minimized corpora to our bark-qa repo, which also holds the test vectors used throughout Bark's development.
More targets are on the way: dedicated serialization/deserialization targets for every protocol message type, then "getter" targets that fuzz method calls on deserialized types to catch errors beyond encoding.
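The getter idea can be sketched with a toy type — `Record` here is a stand-in invented for the example, not one of Bark's protocol types. Decoding succeeds, yet an accessor can still panic on internally inconsistent state:

```rust
use std::panic;

// Toy stand-in type: the header claims a payload length that the
// accessor trusts without checking.
struct Record {
    claimed_len: usize,
    payload: Vec<u8>,
}

impl Record {
    fn decode(data: &[u8]) -> Option<Self> {
        let claimed_len = *data.first()? as usize;
        Some(Record { claimed_len, payload: data[1..].to_vec() })
    }

    // Panics when the header lies about the payload length -- exactly the
    // class of bug a getter target is meant to surface.
    fn payload_prefix(&self) -> &[u8] {
        &self.payload[..self.claimed_len]
    }
}

// Fuzz body: decode, then exercise the getters on the result.
fn do_getter_test(data: &[u8]) {
    if let Some(rec) = Record::decode(data) {
        let _ = rec.payload_prefix();
    }
}

fn main() {
    do_getter_test(&[2, b'o', b'k']); // consistent input: fine
    let crash = panic::catch_unwind(|| do_getter_test(&[5, b'x'])); // header lies
    assert!(crash.is_err());
}
```

A roundtrip target alone would never flag this: the bytes decode and re-encode perfectly, and the panic only fires once a caller touches the inconsistent field.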
Your bitcoin is safe by protocol design. We're making sure the infrastructure is just as solid.
Follow us on X or sign up to our newsletter to keep up with Bark development.