I wanted to learn more about blue noise generation, and ended up making a demo with Bevy (version 0.18). The demo works as a desktop app or on the web—scroll to the end of this post to check it out. The full code is in this repo.
White noise is produced by independent random sampling over the whole domain. This image is an example of white noise. Notice how there are clumps where many points are very close together, and empty spaces without any points.

By contrast, blue noise spreads points over the domain more evenly. This image has approximately the same number of points as the one above, just distributed differently.

This kind of effect can be quite useful, for example when spreading out randomly-placed foliage in a video game, or when sampling randomly-but-evenly from a domain.
We can think of blue noise as enforcing a minimum distance between any two points, which forces them to spread out more evenly. There are many ways to produce this effect, with varying tradeoffs.
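The most literal way to enforce that minimum distance is plain rejection sampling, sometimes called "dart throwing": propose uniformly random points and discard any that land too close to an already-accepted point. It's the obvious baseline that the algorithms below improve on. This sketch is my own illustration, not code from the demo; it uses plain tuples and a tiny inline xorshift RNG rather than Bevy's `Vec2` and the `rand` crate.

```rust
// Naive dart throwing: accept a candidate only if it is at least `r`
// away from every previously accepted point.
fn dart_throwing(width: f32, height: f32, r: f32, attempts: u32) -> Vec<(f32, f32)> {
    // Tiny xorshift64* RNG in [0, 1) so the sketch has no dependencies.
    let mut state: u64 = 0x9E3779B97F4A7C15;
    let mut rand01 = move || -> f32 {
        state ^= state >> 12;
        state ^= state << 25;
        state ^= state >> 27;
        (state.wrapping_mul(0x2545F4914F6CDD1D) >> 40) as f32 / (1u64 << 24) as f32
    };
    let mut points: Vec<(f32, f32)> = Vec::new();
    for _ in 0..attempts {
        let (x, y) = (rand01() * width, rand01() * height);
        // Scan every accepted point: this linear check per candidate is
        // what makes naive dart throwing quadratic overall.
        let far_enough = points
            .iter()
            .all(|&(px, py)| (x - px).powi(2) + (y - py).powi(2) >= r * r);
        if far_enough {
            points.push((x, y));
        }
    }
    points
}
```

Each candidate scans every accepted point, so the total cost grows quadratically with the number of points; Bridson's grid is exactly the trick that brings this down to \(O(n)\).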
This table lists several of the blue noise generation algorithms I came across while searching for one with the right properties for this demo (a very quick runtime with decent quality). The “quality” assignment is a nonrigorous guess. Statistical analysis is possible here (something about comparing Fourier transforms with the optimal blue noise profile), but I simply didn’t dive that deep. Large slowdown factors are indicated by a \(k\) in the (casual) big O analysis, but small ones are ignored.
| algorithm description | (first) author | publication year | time complexity | quality |
|---|---|---|---|---|
| Jittered Grid1 | Cook | 1986 | \(O(n)\) | very low |
| Best Candidate2 | Mitchell | 1987 | \(O(k n^2)\) | low |
| Void and Cluster3 | Ulichney | 1993 | \(O(n^2 \log n)\) | high |
| Grid Cell Spatial Lookup4 | Bridson | 2007 | \(O(n)\) | medium |
| Optimal Transport5 | de Goes | 2012 | \(O(n^2)\) | high |
| Adapted Walk Monster6 | Muratori | 2014 | \(O(n^2)\) | low |
| Unsupervised Learning7 | Giunchi | 2022 | \(O(n^2)\) | medium |
| Filter-adapted Spatiotemporal8 | Donnelly | 2024 | \(O(k n)\) | high |
Clearly, Bridson’s algorithm wasn’t the last word in blue noise generation, but its simplicity and linear runtime complexity stood out to me.
Many papers focus on the problem of sampling (especially when rendering images), but I only wanted to generate a set of points. It’s possible to get higher-quality blue noise than Bridson’s, especially in “offline” algorithms that take a lot of time to run, but my focus was on efficient runtime generation. Some algorithms would have been simpler to implement than Bridson’s, but I didn’t mind a little extra implementation challenge. Overall, it seemed like a good fit along each tradeoff dimension.
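For contrast, the simplest entry in the table, Cook's jittered grid, fits in a few lines: divide the domain into cells and drop one uniformly random point in each. This sketch is again my own (plain tuples and an inline xorshift RNG rather than the demo's `Vec2` and `rand`):

```rust
// Jittered grid: one uniformly placed point per cell. Linear time, but
// points in adjacent cells can still land arbitrarily close together,
// which is why its quality is rated "very low" above.
fn jittered_grid(cols: usize, rows: usize, cell: f32) -> Vec<(f32, f32)> {
    // Tiny xorshift64* RNG in [0, 1) so the sketch has no dependencies.
    let mut state: u64 = 0x853C49E6748FEA9B;
    let mut rand01 = move || -> f32 {
        state ^= state >> 12;
        state ^= state << 25;
        state ^= state >> 27;
        (state.wrapping_mul(0x2545F4914F6CDD1D) >> 40) as f32 / (1u64 << 24) as f32
    };
    let mut points = Vec::with_capacity(cols * rows);
    for row in 0..rows {
        for col in 0..cols {
            points.push((
                (col as f32 + rand01()) * cell,
                (row as f32 + rand01()) * cell,
            ));
        }
    }
    points
}
```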
The paper that introduces Bridson’s algorithm is Fast Poisson Disk Sampling in Arbitrary Dimensions. It’s only a single page!
Because I was writing this in the context of producing a demo with Bevy, we’ll write the algorithm in Rust. While the algorithm works in any dimension, we’ll be producing a vector of 2d points.
```rust
fn bridson() -> Vec<Vec2> {
```

One insight this algorithm uses is that it's possible to produce a grid where each cell contains at most one sample point. This is because if the cells were any larger, it'd be possible to have two points more than `RADIUS` apart in the same cell. This can help us eliminate chunks of space from contention later.
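Concretely, the cell size is chosen as \(\text{cell} = \text{RADIUS}/\sqrt{2}\), which makes the diagonal of each cell exactly \(\text{RADIUS}\):

\[
\sqrt{\text{cell}^2 + \text{cell}^2} = \sqrt{2}\,\text{cell} = \sqrt{2}\cdot\frac{\text{RADIUS}}{\sqrt{2}} = \text{RADIUS}.
\]

Two points inside the same cell can be at most a diagonal apart, so a cell holding two samples would violate the minimum-distance constraint; each cell therefore holds at most one.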
When writing out code in the rest of this post, I’ll often unindent or move code around to save horizontal space.
```rust
let cell = RADIUS / (2.0_f32).sqrt();
let grid_width = (WIDTH / cell).ceil() as usize;
let grid_height = (HEIGHT / cell).ceil() as usize;
```

We initialize an empty grid, an empty list of sampled points, and an "active frontier" of indices into those samples.
```rust
let mut grid: Vec<Option<usize>> =
    vec![None; grid_width * grid_height];
let mut samples: Vec<Vec2> = Vec::new();
let mut active: Vec<usize> = Vec::new();
```

Insertion of a new point adds it to `samples`, expands `active`, and marks that cell in the grid as containing a point.
```rust
let insert = |point: Vec2,
              grid: &mut Vec<Option<usize>>,
              samples: &mut Vec<Vec2>,
              active: &mut Vec<usize>| {
    let index = samples.len();
    samples.push(point);
    active.push(index);
    let col = (point.x / cell) as usize;
    let row = (point.y / cell) as usize;
    grid[row * grid_width + col] = Some(index);
};
```

We can check the validity of a candidate point before insertion. First, we ensure it's in the target domain at all.
```rust
let is_valid = |point: Vec2, grid: &Vec<Option<usize>>,
                samples: &Vec<Vec2>| -> bool {
    let Vec2 { x, y } = point;
    if !(0.0..WIDTH).contains(&x) ||
       !(0.0..HEIGHT).contains(&y) {
        return false;
    }
```

Then, we can find nearby grid cells in the neighborhood—up to two cells away in each direction (since \(\text{RADIUS} = \sqrt{2}\,\text{cell} < 2\,\text{cell}\), any point that could conflict lies within two cells of the candidate).
```rust
    let col = (x / cell) as usize;
    let row = (y / cell) as usize;
    let col_min = col.saturating_sub(2);
    let col_max = (col + 2).min(grid_width - 1);
    let row_min = row.saturating_sub(2);
    let row_max = (row + 2).min(grid_height - 1);
```

For every cell in the neighborhood, if there's a point in that cell and the distance to the new point is less than `RADIUS`, our new point is invalid. We calculate distance squared rather than distance—a common trick to avoid an expensive `sqrt` operation—but also immediately discount cells that don't contain a point.
```rust
    for r in row_min..=row_max {
        for c in col_min..=col_max {
            if let Some(i) = grid[r * grid_width + c]
                && (x - samples[i].x).powi(2) +
                   (y - samples[i].y).powi(2) <
                   RADIUS * RADIUS
            {
                return false;
            }
        }
    }
    true
};
```

We start with an initial point, from anywhere in the entire domain, and work outwards from that point.
```rust
let mut rng = thread_rng();
let initial_point = Vec2::new(
    rng.gen_range(0.0..WIDTH),
    rng.gen_range(0.0..HEIGHT));
insert(initial_point, &mut grid, &mut samples, &mut active);
```

We keep working while the active frontier still has work left for us to do.
```rust
while !active.is_empty() {
```

Pick a random sample from the active frontier.
```rust
    let active_index = rng.gen_range(0..active.len());
    let sample = samples[active[active_index]];
```

We try up to \(k\) different candidates—in our case \(k=30\) (a fairly standard choice).
```rust
    let mut placed = false;
    for _ in 0..K_CANDIDATES {
```

We calculate a random angle and distance to generate a candidate point.
```rust
        let angle: f32 = rng.gen_range(0.0..TAU);
        let dist: f32 = rng.gen_range(RADIUS..2.0 * RADIUS);
        let point = Vec2::new(sample.x + dist * angle.cos(),
                              sample.y + dist * angle.sin());
```

Drawing the distance from the annulus between `RADIUS` and `2.0 * RADIUS` guarantees the candidate can't conflict with the sample that spawned it, while keeping it close enough to pack tightly. If that point is valid, we insert it and move on. Otherwise, keep trying new candidates.
```rust
        if is_valid(point, &grid, &samples) {
            insert(point, &mut grid, &mut samples, &mut active);
            placed = true;
            break;
        }
    } // end of the candidate loop
```

If we still haven't managed to place a new point after trying \(k\) times, drop this index from the active frontier and move on to another one.
```rust
    if !placed {
        active.swap_remove(active_index);
    }
} // end of the while loop
```

Finally, return the sampled points. While the number of points can vary from run to run, a typical run with my default settings produced about 9,050 points.
```rust
println!("points count = {}", samples.len());
samples
}
```

Recently I've been learning Bevy, and it seemed like a good fit for a demo showing off a 2d point generation algorithm. Bevy is a not-yet-stable ECS-based game engine in Rust.
From the bevy website:
> Components are Rust structures, Systems are Rust functions
This fact alone makes Bevy much more comfortable to work with than many other ECS solutions and game engines I’ve tried in the past. We’re basically just writing normal code, and the engine “fades into the background” more nicely than any other I’ve tried.
We’ll end up with an interactive app that looks like this screenshot:

A Resource is sort of like a singleton
Component. Here, we’ll use one to track the points we’ve
generated.
```rust
#[derive(Resource)]
struct BlueNoisePoints(Vec<Vec2>);
```

We can easily generate a new set of points to use in our app by swapping out the `Vec<Vec2>` in this resource. We look for R key presses (from another resource), and update `BlueNoisePoints` when one happens.
```rust
fn handle_keyboard(
    keys: Res<ButtonInput<KeyCode>>,
    mut points: ResMut<BlueNoisePoints>
) {
    if keys.just_pressed(KeyCode::KeyR) {
        points.0 = bridson();
    }
}
```

Thanks to Rust, mutability is reflected in the type signature (by `mut` and `ResMut`). I also like how it seems we're grabbing these resources "out of thin air"—Bevy handles everything behind the scenes.
Creation involves describing which `Plugin`s, `Resource`s, and `System`s we want, and telling that to an `App`. In `main` we make a new `App`:

```rust
fn main() {
    App::new()
```

Then, we use `WindowPlugin` to produce a 1600×900 pixel window with a title of `"Bridson's blue noise algorithm"`.
```rust
        .add_plugins(DefaultPlugins.set(WindowPlugin {
            primary_window: Some(Window {
                title: "Bridson's blue noise algorithm".into(),
                resolution: WindowResolution::new(1600, 900),
                ..default()
            }),
            ..default()
        }))
```

The background color is a dark blue.

```rust
        .insert_resource(ClearColor(Color::srgb(0.05, 0.05, 0.1)))
```

We add our points resource, using the `bridson()` function to generate them.
```rust
        .insert_resource(BlueNoisePoints(bridson()))
```

Then, we add some "systems". Each one runs on a schedule. On `Startup` (once at the beginning of our `App`'s lifetime) we'll run the `setup` function. Then, on every `Update` (every new frame) we'll draw to the screen and handle user input.
```rust
        .add_systems(Startup, setup)
        .add_systems(Update, (draw, handle_zoom, handle_keyboard))
```

Finally, we run the `App`.

```rust
        .run();
}
```

The `setup` system (a normal Rust function) runs on the `Startup` schedule, taking `Commands` that can spawn entities.
```rust
fn setup(mut commands: Commands) {
```

First, we'll spawn an orthographic camera, using mostly default settings. `DEFAULT_ZOOM` is a zoom level somewhere between the minimum and maximum zoom levels, defined elsewhere with `const DEFAULT_ZOOM: f32 = 1.33;`.
```rust
    commands.spawn((
        Camera2d,
        Projection::Orthographic(OrthographicProjection {
            scale: DEFAULT_ZOOM,
            ..OrthographicProjection::default_2d()
        }),
    ));
```

Our `setup` system/function also spawns some white text in the top left corner of the window, to give users some information about the possible inputs.
```rust
    commands.spawn((
        Text::new(
            "Bridson's blue noise algorithm\n".to_owned() +
            "Pinch or scroll mouse wheel to zoom\n" +
            "R to regenerate points"
        ),
        TextColor(Color::WHITE),
        Node {
            position_type: PositionType::Absolute,
            top: Val::Px(16.0),
            left: Val::Px(16.0),
            ..default()
        },
    ));
}
```

There are several constants that determine how zooming feels. These were tuned by hand, with guess-and-check. `cargo watch -x run` was the main command run while developing, to allow this quick-reloading workflow.
```rust
const ZOOM_SENSITIVITY_PINCH: f32 = 1.0;
const ZOOM_SENSITIVITY_SCROLL: f32 = 0.01;
const ZOOM_MIN: f32 = 0.2;
const DEFAULT_ZOOM: f32 = 1.33;
const ZOOM_MAX: f32 = 8.0;
```

One adjustment that made guess-and-check more feasible was adding these profile settings to `Cargo.toml`. They allow faster recompiles of code in development, but optimize package code for speed (since it only has to be compiled once).
```toml
[profile.dev]
opt-level = 1

[profile.dev.package."*"]
opt-level = 3
```

I wanted to allow both "pinch to zoom" and using a mouse scroll wheel. Bevy has `MessageReader`s for both. We grab those, as well as querying for all entities that are a `Projection` with a `Camera2d` attached.
```rust
fn handle_zoom(
    mut pinch: MessageReader<PinchGesture>,
    mut scroll: MessageReader<MouseWheel>,
    mut query: Query<
        &mut Projection,
        With<Camera2d>
    >,
) {
```

Since we expect to have one (and only one) camera, we'll bail out if that isn't true. `single_mut()` does what it says on the tin: find a single matching entity, and give it to us as mutable.
```rust
    let Ok(projection) = query.single_mut() else { return };
    let Projection::Orthographic(ortho) =
        projection.into_inner()
    else {
        return;
    };
```

We read all the possible zoom-affecting events, and update `ortho.scale` based on them.
```rust
    for event in pinch.read() {
        ortho.scale *= 1.0 - event.0 * ZOOM_SENSITIVITY_PINCH;
    }
    for event in scroll.read() {
        let delta = match event.unit {
            MouseScrollUnit::Line => event.y * 20.0,
            MouseScrollUnit::Pixel => event.y,
        };
        ortho.scale *= 1.0 - delta * ZOOM_SENSITIVITY_SCROLL;
    }
```

Finally, we clamp the zoom level so it never exceeds its bounds.
```rust
    ortho.scale = ortho.scale.clamp(ZOOM_MIN, ZOOM_MAX);
}
```

The last thing we need Bevy to do for us is take our points and draw them to the screen. We grab our points and camera, as well as something called `Gizmos`, which provides basic drawing capabilities, like drawing lines.
```rust
fn draw(
    mut gizmos: Gizmos,
    points: Res<BlueNoisePoints>,
    query: Query<
        &Projection,
        With<Camera2d>
    >,
) {
```

As before, we expect exactly one camera, so bail out if we can't find it. This one is simpler due to immutability.
```rust
    let Ok(Projection::Orthographic(ortho)) = query.single() else {
        return;
    };
```

`offset` moves the grid of points to be centered in the window. We'll draw small "circular" (actually 12-sided) points in light blue.
```rust
    let offset = Vec2::new(WIDTH * 0.5, HEIGHT * 0.5);
    let scaled_radius = POINT_RADIUS * ortho.scale;
    let fill = Color::srgb(0.184, 0.674, 0.929);
    let segments = 12_usize;
```

Then, for each point, we draw a bunch of lines to fill in a tiny 12-sided "circle".
```rust
    for &point in &points.0 {
        let center = point - offset;
        for i in 0..segments {
            let a0 = (i as f32 / segments as f32) * TAU;
            let a1 = ((i + 1) as f32 / segments as f32) * TAU;
            let v0 = center +
                Vec2::new(a0.cos(), a0.sin()) *
                scaled_radius;
            let v1 = center +
                Vec2::new(a1.cos(), a1.sin()) *
                scaled_radius;
            gizmos.line_2d(center, v0, fill);
            gizmos.line_2d(v0, v1, fill);
            gizmos.line_2d(v1, center, fill);
        }
    }
}
```

One last thing: let's build our demo for the web, using WebAssembly. In `Cargo.toml` we have to add Bevy's `web` feature, and introduce a dependency on `getrandom` to get a working random number generator.
```toml
[dependencies]
bevy = { version = "0.18", features = ["web"] }
rand = "0.8"
getrandom = { version = "0.2", features = ["js"] }
```

In our case, only one code change is needed. We'll update our `Window` definition to look for an HTML canvas element with ID `#bevy-canvas`, and set `fit_canvas_to_parent` so our web demo doesn't stay locked to the 1600×900 window dimensions, but instead expands with the canvas. Neither of these settings affects the native application.
```rust
primary_window: Some(Window {
    title: "Bridson's blue noise algorithm".into(),
    resolution: WindowResolution::new(1600, 900),
    fit_canvas_to_parent: true, // new line
    canvas: Some("#bevy-canvas".to_string()), // new line
    ..default()
}),
```

When building, we'll target wasm and produce a `/web` directory with the files we need.
```sh
cargo build --release --target wasm32-unknown-unknown
wasm-bindgen \
    --out-dir ./web/out \
    --target web \
    ./target/wasm32-unknown-unknown/release/blue_noise_bevy.wasm
```

Add an `index.html` file to the new `/web` directory, with these contents. I want to use the demo in an `iframe`, so I made the `#bevy-canvas` take up 100% of the window width and height, with no margins. We'll post a message when bevy is initialized, so the `iframe` can move out of its loading state.
```html
<!doctype html>
<html lang="en">
<body>
  <style>
    body, html, canvas {
      width: 100%;
      height: 100%;
      margin: 0;
      padding: 0;
      box-sizing: border-box;
      display: block;
    }
  </style>
  <canvas id="bevy-canvas"></canvas>
  <script type="module">
    import init from './out/blue_noise_bevy.js'
    init()
      .then(() =>
        window.parent.postMessage('bevy-ready', '*')
      )
      .catch(e => {
        if (!e.message?.startsWith(
          "Using exceptions for control flow"
        )) throw e;
      });
  </script>
</body>
</html>
```

We can test locally by running an HTTP server. I normally use Python for this, out of habit. Run the command below then navigate to `localhost:8000/web`.
```sh
python3 -m http.server 8000
```

Upload the files wherever (I used static storage alongside my website) and use an `iframe` to share them. This is getting a bit meta, but I used this HTML to embed the demo:
```html
<div class="bevy-frame">
  <div class="bevy-loading-message" id="bevy-overlay">
    <button onclick="loadBevy()">Click to run demo</button>
  </div>
</div>
```

Along with this script to start loading upon a button click, and hide the loading message once loading completes.
```js
function loadBevy() {
  const overlay = document.getElementById('bevy-overlay');
  overlay.innerHTML = '<p>Loading demo...</p>';
  const frame = document.createElement('iframe');
  frame.src = '//vitez.me/files/bridson/index.html';
  document.querySelector('.bevy-frame').appendChild(frame);
  window.addEventListener('message', (e) => {
    if (e.data === 'bevy-ready') {
      overlay.classList.add('hidden');
    }
  });
}
```

It might take a little while to load. The demo window needs focus for inputs to be captured.
Cook, Robert L. 1986. Stochastic sampling in computer graphics. ACM Transactions on Graphics Volume 5, Issue 1.↩︎
Mitchell, Don P. 1987. Generating antialiased images at low sampling densities. SIGGRAPH ’87: Proceedings of the 14th annual conference on Computer graphics and interactive techniques.↩︎
Ulichney, Robert A. 1993. Void-and-cluster method for dither array generation. Proceedings Volume 1913, Human Vision, Visual Processing, and Digital Display IV.↩︎
Bridson, Robert. 2007. Fast Poisson Disk Sampling in Arbitrary Dimensions. SIGGRAPH sketches 10, no. 1.↩︎
de Goes, Fernando. 2012. Blue noise through optimal transport. ACM Transactions on Graphics (TOG), Volume 31, Issue 6.↩︎
Muratori, Casey. 2014. The Color of Noise. Casey’s Tech Stuff.↩︎
Giunchi, Daniele. 2022. Fast Blue-Noise Generation via Unsupervised Learning. International Joint Conference on Neural Networks.↩︎
Donnelly, William. 2024. FAST: Filter-Adapted Spatio-Temporal Sampling for Real-Time Rendering. Proceedings of the ACM on Computer Graphics and Interactive Techniques, Volume 7, Issue 1.↩︎