My Journey into Shaders: Building a Particle Tunnel with React Three Fiber
How I moved from standard meshes to custom shaders to create a cool glowing tunnel effect.
In the world of 3D web graphics, libraries like Three.js and React Three Fiber (R3F) are amazing. They make it super easy to drop a 3D box or sphere into a scene and make it look realistic. But sometimes, "realistic" isn't what you want. Sometimes you want something crazy and creative—like a glowing sci-fi tunnel made of thousands of floating particles.
To do that properly, standard materials just aren't enough. I realized I had to learn Shaders.
In this post, I want to share how I built a 3D particle tunnel. We will start with the React setup and then dive into the GLSL shader code (the code that runs on the graphics card).
Part 1: The Setup (React & Geometry)
Before we touch the GPU code, we need some data. A particle system is basically just a big list of points in 3D space. We don't need to load a heavy 3D model for this; we can calculate the positions using simple JavaScript math.
I wanted a tunnel shape. Geometrically, that is just a cylinder. I generated 2000 random points, placed them in a circle, and stretched them out along the Z-axis.
Here is my R3F component:
"use client";
import { useMemo, useRef } from "react";
import { useFrame } from "@react-three/fiber";
// (I will show the Shaders in the next section...)
export default function Tunnel() {
  const count = 2000;
  const radius = 2;
  const length = 40;

  // 1. Generate the positions
  const particlesPosition = useMemo(() => {
    // We need a flat array: [x,y,z, x,y,z, x,y,z...]
    const positions = new Float32Array(count * 3);

    for (let i = 0; i < count; i++) {
      const i3 = i * 3;

      // Random angle around the circle (0 to 360 degrees)
      const angle = Math.random() * Math.PI * 2;

      // Vary the radius slightly so the tunnel walls look thick
      const r = radius + Math.random() * 0.5;

      // Convert angle/radius to X and Y
      const x = Math.cos(angle) * r;
      const y = Math.sin(angle) * r;

      // Spread particles deep into the screen (negative Z)
      const z = Math.random() * -length;

      positions[i3 + 0] = x;
      positions[i3 + 1] = y;
      positions[i3 + 2] = z;
    }

    return positions;
  }, [count, radius, length]);

  // 2. The Bridge to the GPU (Uniforms)
  // This is how we pass data from JS to the Shader
  const uniforms = useMemo(
    () => ({
      uTime: { value: 0 },
      uLength: { value: length },
    }),
    [length],
  );

  // 3. Tick uTime every frame so the shaders can animate with it
  const materialRef = useRef();
  useFrame((state) => {
    if (materialRef.current) {
      materialRef.current.uniforms.uTime.value = state.clock.elapsedTime;
    }
  });

  return (
    <points>
      {/* Feed our calculated positions into the geometry */}
      <bufferGeometry>
        <bufferAttribute
          attach="attributes-position"
          count={count}
          array={particlesPosition}
          itemSize={3}
        />
      </bufferGeometry>

      {/* The Custom Material */}
      <shaderMaterial
        ref={materialRef}
        depthWrite={false} // Important! Lets transparent particles stack nicely
        fragmentShader={fragmentShader}
        vertexShader={vertexShader}
        uniforms={uniforms}
        transparent={true}
      />
    </points>
  );
}
I recommend reading Understanding Three.js Buffers and Points by Creating a Tunnel first to get up to speed.
Part 2: The Brain (The Shaders)
This was the scary part for me. A shaderMaterial needs two small programs written in a language called GLSL.
- Vertex Shader: Runs once for every single dot. It handles Position.
- Fragment Shader: Runs once for every pixel inside the dot. It handles Color.
The Vertex Shader (Where does the dot go?)
Our goal here is to take that XYZ position from React and figure out where it sits on your screen.
// 1. Receive data from JS
uniform float uLength;
uniform float uTime; // updated every frame from React (not used yet, but handy for animation)

// 2. Prepare a variable to send to the Fragment shader
varying float vDepthRatio;

void main() {
  // --- POSITIONING ---
  // Combine the object position (Model) and camera position (View)
  vec4 mvPosition = modelViewMatrix * vec4(position, 1.0);

  // Project it onto the 2D screen
  gl_Position = projectionMatrix * mvPosition;

  // --- SIZING ---
  // This is the magic perspective line!
  // It divides 100 by the depth (-mvPosition.z).
  // Far away things get divided by a big number -> they become small.
  gl_PointSize = 100.0 / -mvPosition.z;

  // --- COLOR DATA PREP ---
  // Calculate how deep this particle is (0.0 at the entrance, 1.0 at the far end)
  vDepthRatio = position.z / -uLength;
}
Wait, what is vec4?
I was confused by this too. Our points are 3D $(x, y, z)$, but the shader uses 4 numbers. The 4th number ($w$) is a helper that lets the 4×4 matrices move the point around. If $w$ is 1.0, the matrices treat it as a point in space, so translations apply to it; if $w$ is 0.0, they treat it as a pure direction and ignore translations.
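To see the difference, here is a tiny standalone GLSL sketch (my own illustration, not part of the tunnel code) with a made-up translation matrix:

void main() {
  // A matrix that translates +5 on X.
  // GLSL matrices are column-major, so column 3 holds the translation.
  mat4 moveRight = mat4(1.0);              // start from the identity
  moveRight[3] = vec4(5.0, 0.0, 0.0, 1.0); // put the translation in column 3

  vec4 point     = vec4(1.0, 2.0, 3.0, 1.0); // w = 1.0 -> a position
  vec4 direction = vec4(1.0, 2.0, 3.0, 0.0); // w = 0.0 -> a direction

  // moveRight * point     evaluates to (6.0, 2.0, 3.0, 1.0) -- the point moved
  // moveRight * direction evaluates to (1.0, 2.0, 3.0, 0.0) -- the direction did not
}

That is exactly why modelViewMatrix * vec4(position, 1.0) uses 1.0: our particles are positions, and we want the camera's translation to affect them.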
And gl_PointSize?
Standard points are all 1 pixel wide. That looks flat. By dividing by the Z-depth, we manually create perspective.
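Plugging real numbers into that division makes the effect obvious:

// gl_PointSize = 100.0 / -mvPosition.z
// a particle  2 units from the camera -> 100.0 / 2.0  = 50.0 px (big, right in your face)
// a particle 40 units from the camera -> 100.0 / 40.0 =  2.5 px (a distant speck)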
The Fragment Shader (What color is it?)
By default, points in OpenGL are squares. We want circles. We have to "carve" the circle out of the square pixel by pixel.
// Receive the depth info from the vertex shader
varying float vDepthRatio;

void main() {
  // --- SHAPE ---
  // gl_PointCoord gives us coordinates inside the tiny point square (0 to 1)
  // We check the distance from the center (0.5, 0.5)
  float distanceToCenter = distance(gl_PointCoord, vec2(0.5));

  // If the pixel is in the corner (too far from center), delete it!
  if (distanceToCenter > 0.5) discard;

  // --- COLOR ---
  vec3 colorNear = vec3(1.0, 0.0, 0.5); // Hot Pink
  vec3 colorFar = vec3(0.0, 1.0, 1.0);  // Cyan

  // Mix the colors based on depth!
  vec3 finalColor = mix(colorNear, colorFar, vDepthRatio);

  gl_FragColor = vec4(finalColor, 1.0);
}
We used a cool trick called a Varying. We calculated the depth ratio in the first shader, passed it to the second shader, and used mix() to blend the colors. Now, the tunnel fades from Pink to Cyan as it goes deeper!
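Under the hood, mix() is plain linear interpolation, so the blend is easy to predict:

// mix(a, b, t) = a * (1.0 - t) + b * t
// vDepthRatio = 0.0 -> pure hot pink (the tunnel entrance)
// vDepthRatio = 0.5 -> an even pink/cyan blend (halfway in)
// vDepthRatio = 1.0 -> pure cyan (the far end)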
Part 3: What else can Shaders do?
Once I understood this basic pipeline (Vertex -> Fragment), I realized you can do so much more than just tunnels. Here are a few examples of what you can build with similar logic:
1. Audio Visualizers
You can pass audio data (like bass or treble levels) into the shader as a uniform; a rough sketch of the wiring follows the list below.
- In Vertex Shader: Make the particles scale up or shake when the bass hits.
- In Fragment Shader: Make the colors flash brighter when the volume is loud.
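Here is what that wiring could look like on the JavaScript side. This is a hypothetical sketch, not code from this project: the component name, the uBass uniform, and the "first 8 bins = bass" shortcut are all my own assumptions, and it expects a playing <audio> element plus the materialRef pattern from Part 1.

import { useMemo } from "react";
import { useFrame } from "@react-three/fiber";

// Hypothetical component: drives a uBass uniform from a playing <audio> element.
// `materialRef` points at a <shaderMaterial> that declares `uniform float uBass;`
function AudioReactiveUniform({ audioEl, materialRef }) {
  // Wire the audio element into a Web Audio analyser (created once)
  const { analyser, data } = useMemo(() => {
    const ctx = new AudioContext();
    const analyser = ctx.createAnalyser();
    analyser.fftSize = 256; // gives us 128 frequency bins
    ctx.createMediaElementSource(audioEl).connect(analyser);
    analyser.connect(ctx.destination); // keep the sound audible
    return { analyser, data: new Uint8Array(analyser.frequencyBinCount) };
  }, [audioEl]);

  useFrame(() => {
    analyser.getByteFrequencyData(data);

    // Average the lowest bins (roughly the bass) and normalize to 0..1
    let bass = 0;
    for (let i = 0; i < 8; i++) bass += data[i];
    bass /= 8 * 255;

    if (materialRef.current) {
      materialRef.current.uniforms.uBass.value = bass;
    }
  });

  return null; // renders nothing; it only updates the uniform
}

In the shaders, uBass then works exactly like uLength did: scale gl_PointSize with it in the vertex shader, or multiply the final color by it in the fragment shader.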
[Image of webgl audio visualizer particles]
2. Interactive Water/Terrain
Instead of random dots, you can use a plane geometry (a flat grid).
- In Vertex Shader: Use math functions (like Sine waves or Perlin Noise) to move the vertices up and down. This creates moving waves or hills without needing to model them by hand; a minimal version of the idea is sketched below.
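For example, here is a tiny vertex shader sketch (my own illustration, with made-up wave numbers) that turns a flat plane into rolling water. It assumes the plane is rotated to lie flat so its local Z axis points up, and that uTime is fed in exactly like in the tunnel:

uniform float uTime;
varying float vHeight; // let the fragment shader tint peaks vs. troughs

void main() {
  vec3 pos = position;

  // Two overlapping sine waves look less mechanical than one
  pos.z += sin(pos.x * 2.0 + uTime) * 0.3;
  pos.z += sin(pos.y * 3.0 + uTime * 0.7) * 0.15;

  vHeight = pos.z;
  gl_Position = projectionMatrix * modelViewMatrix * vec4(pos, 1.0);
}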