Bridge Pattern
Separate what something does from how it does it, so both can change independently.
The Problem
You have audio sources (Ode To The Winners, The Epic Hero, Gaia) and visualizers (Nebula, Silk, Smear, Cosmos). Without Bridge, you'd need OdeNebula, OdeSilk, OdeSmear, EpicNebula... an explosion of classes.
interface AudioSource {
  getFrame(time: number): number[];
}

interface Visualizer {
  render(frame: number[]): void;
}

abstract class MusicPlayer {
  constructor(
    protected source: AudioSource,
    protected visualizer: Visualizer,
  ) {}
  abstract play(): void;
}

class LivePlayer extends MusicPlayer {
  play() {
    const frame = this.source.getFrame(Date.now());
    this.visualizer.render(frame);
  }
}
Two hierarchies, an abstract class, and concrete subclasses. All to combine two things that vary independently.
The Solution
Pass the function as a parameter. The two axes compose freely.
type AudioFrame = Uint8Array;

type Visualizer = (
  ctx: CanvasRenderingContext2D,
  frame: AudioFrame,
  w: number, h: number,
  accent: string, time: number,
) => void;

function visualize(
  ctx: CanvasRenderingContext2D,
  frame: AudioFrame,
  renderer: Visualizer,
  w: number, h: number,
  accent: string, time: number,
) {
  renderer(ctx, frame, w, h, accent, time);
}

// 3 sources × 5 visualizers = 15 combinations, zero coupling
visualize(ctx, frame, nebulaVisualizer, w, h, accent, t);
visualize(ctx, frame, silkVisualizer, w, h, accent, t);
visualize(ctx, frame, cosmosVisualizer, w, h, accent, t);
No classes. No inheritance. Just functions and data.
This solution doesn't use Pithos. That's the point.
In functional TypeScript, passing a function as a parameter is the Bridge pattern. Eidos exports a @deprecated createBridge() function that exists only to guide you here.
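To see why this claim holds without any canvas machinery, both axes can be modeled as plain functions and joined by one higher-order function. A minimal sketch with illustrative names (`sine`, `textBars` are not part of the library):

```typescript
// Axis 1: how audio is produced — a function of time yielding a frame.
type Source = (time: number) => number[];

// Axis 2: how a frame is rendered — here to a string, for simplicity.
type Renderer = (frame: number[]) => string;

// The bridge is just composition: any source works with any renderer.
const bridge =
  (source: Source, render: Renderer) =>
  (time: number): string =>
    render(source(time));

// One example per axis.
const sine: Source = (t) => [Math.abs(Math.sin(t)), Math.abs(Math.cos(t))];
const textBars: Renderer = (frame) =>
  frame.map((v) => "#".repeat(Math.round(v * 4))).join("|");

const play = bridge(sine, textBars);
// play(0) → "|####": sin(0) = 0 gives an empty bar, cos(0) = 1 gives four marks.
```

Adding a fourth source or a sixth renderer touches neither the bridge nor the other axis, which is exactly the independence the class hierarchy above was built to provide.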
Live Demo
Pick an audio source and a visualizer. The two axes vary independently: 3 sources × 5 visualizers = 15 combinations, all from a single visualize() call.
- bridge.ts
- audio.ts
- Usage
/**
 * The Bridge function.
 *
 * Two independent axes: AudioSource × Visualizer.
 * This function is the bridge: it connects a frame of audio data
 * to any renderer, with zero coupling between them.
 */
import type { AudioFrame, Visualizer } from "./types";

export function visualize(
  ctx: CanvasRenderingContext2D,
  frame: AudioFrame,
  renderer: Visualizer,
  w: number,
  h: number,
  accent: string,
  time: number,
): void {
  renderer(ctx, frame, w, h, accent, time);
}
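Because `visualize` only forwards its arguments, it can be exercised outside the browser by stubbing the 2D context. The sketch below inlines the same bridge function so it is self-contained, uses a local `Ctx2D` interface standing in for `CanvasRenderingContext2D`, and defines a toy `bars` renderer purely for illustration:

```typescript
type AudioFrame = Uint8Array;

// Minimal stand-in for CanvasRenderingContext2D, so the sketch runs anywhere.
interface Ctx2D {
  fillStyle: string;
  fillRect(x: number, y: number, w: number, h: number): void;
}

type Visualizer = (
  ctx: Ctx2D, frame: AudioFrame,
  w: number, h: number,
  accent: string, time: number,
) => void;

// Same shape as the bridge in bridge.ts: pure forwarding, zero coupling.
function visualize(
  ctx: Ctx2D, frame: AudioFrame, renderer: Visualizer,
  w: number, h: number, accent: string, time: number,
): void {
  renderer(ctx, frame, w, h, accent, time);
}

// Toy "bars" renderer: one rectangle per frequency bin, scaled to 0..255.
const bars: Visualizer = (ctx, frame, w, h, accent) => {
  ctx.fillStyle = accent;
  const barW = w / frame.length;
  frame.forEach((v, i) =>
    ctx.fillRect(i * barW, h - (v / 255) * h, barW, (v / 255) * h),
  );
};

// Recording stub in place of a real canvas context.
const calls: number[][] = [];
const ctx: Ctx2D = {
  fillStyle: "",
  fillRect: (x, y, w, h) => { calls.push([x, y, w, h]); },
};

visualize(ctx, new Uint8Array([0, 128, 255]), bars, 300, 100, "#f43f5e", 0);
// calls now holds three fillRect invocations, one per bin.
```

Swapping `bars` for any other renderer requires no change to `visualize`, which is the whole contract of the pattern.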
/**
 * Web Audio API engine: loads, plays, pauses, seeks audio tracks.
 * Independent from the visualizer (that's the bridge).
 */
import type { AudioFrame } from "./types";

let audioCtx: AudioContext | null = null;
let analyser: AnalyserNode | null = null;
let gainNode: GainNode | null = null;
let currentSourceNode: AudioBufferSourceNode | null = null;
let currentBuffer: AudioBuffer | null = null;
let startOffset = 0;
let startTime = 0;

function getAudioContext(): AudioContext {
  if (!audioCtx) {
    audioCtx = new AudioContext();
    analyser = audioCtx.createAnalyser();
    analyser.fftSize = 512;
    analyser.smoothingTimeConstant = 0.82;
    gainNode = audioCtx.createGain();
    analyser.connect(gainNode);
    gainNode.connect(audioCtx.destination);
  }
  return audioCtx;
}

const bufferCache = new Map<string, AudioBuffer>();

export async function loadAndPlay(url: string): Promise<void> {
  const ctx = getAudioContext();
  if (ctx.state === "suspended") await ctx.resume();
  if (currentSourceNode) {
    currentSourceNode.stop();
    currentSourceNode.disconnect();
    currentSourceNode = null;
  }
  let buffer = bufferCache.get(url);
  if (!buffer) {
    const response = await fetch(url);
    const arrayBuffer = await response.arrayBuffer();
    buffer = await ctx.decodeAudioData(arrayBuffer);
    bufferCache.set(url, buffer);
  }
  currentBuffer = buffer;
  const source = ctx.createBufferSource();
  source.buffer = buffer;
  source.loop = true;
  source.connect(analyser as AnalyserNode);
  source.start(0);
  startOffset = 0;
  startTime = ctx.currentTime;
  currentSourceNode = source;
}

export function stopPlayback(): void {
  if (currentSourceNode) {
    currentSourceNode.stop();
    currentSourceNode.disconnect();
    currentSourceNode = null;
  }
}

export function pausePlayback(): void {
  if (audioCtx?.state === "running") audioCtx.suspend();
}

export function resumePlayback(): void {
  if (audioCtx?.state === "suspended") audioCtx.resume();
}

export function setVolume(v: number): void {
  if (gainNode) gainNode.gain.value = v;
}

export function getVolume(): number {
  return gainNode ? gainNode.gain.value : 1;
}

export function isPlaying(): boolean {
  return currentSourceNode !== null && audioCtx !== null && audioCtx.state === "running";
}

export function getCurrentTime(): number {
  if (!audioCtx || !currentBuffer) return 0;
  const elapsed = audioCtx.currentTime - startTime + startOffset;
  return elapsed % currentBuffer.duration;
}

export function seekTo(time: number): void {
  if (!audioCtx || !currentBuffer || !analyser) return;
  if (currentSourceNode) {
    currentSourceNode.stop();
    currentSourceNode.disconnect();
  }
  const source = audioCtx.createBufferSource();
  source.buffer = currentBuffer;
  source.loop = true;
  source.connect(analyser);
  source.start(0, time);
  startOffset = time;
  startTime = audioCtx.currentTime;
  currentSourceNode = source;
}

export function getDuration(): number {
  return currentBuffer?.duration ?? 0;
}

export function readFrame(): AudioFrame {
  if (!analyser) return new Uint8Array(256);
  const data = new Uint8Array(analyser.frequencyBinCount);
  analyser.getByteFrequencyData(data);
  return data;
}
import { useState, useEffect, useCallback } from "react";
import {
  loadAndPlay, stopPlayback, pausePlayback, resumePlayback,
  setVolume, isPlaying, getDuration,
} from "@/lib/audio";
import { setCoverImage } from "@/lib/visualizers/sun";
import { SOURCES } from "@/data/tracks";
import type { SourceKey, VisualizerKey } from "@/lib/types";

export function useMusicPlayer() {
  const [sourceKey, setSourceKey] = useState<SourceKey>("track1");
  const [vizKey, setVizKey] = useState<VisualizerKey>("bars");
  const [playing, setPlaying] = useState(false);
  const [loading, setLoading] = useState(false);
  const [volume, setVolumeState] = useState(0.7);

  const sourceMeta = SOURCES.find((s) => s.key === sourceKey);
  const accent = sourceMeta?.color ?? "#f43f5e";

  const handleSourceChange = useCallback(async (key: SourceKey) => {
    setSourceKey(key);
    const meta = SOURCES.find((s) => s.key === key);
    if (!meta) return;
    setCoverImage(`${import.meta.env.BASE_URL}${meta.cover}`);
    setLoading(true);
    await loadAndPlay(`${import.meta.env.BASE_URL}${meta.filename}`);
    setVolume(volume);
    setLoading(false);
    setPlaying(true);
  }, [volume]);

  const handlePlayPause = useCallback(async () => {
    if (playing) {
      pausePlayback();
      setPlaying(false);
    } else if (isPlaying() || getDuration() > 0) {
      resumePlayback();
      setPlaying(true);
    } else {
      const meta = SOURCES.find((s) => s.key === sourceKey);
      if (!meta) return;
      setLoading(true);
      await loadAndPlay(`${import.meta.env.BASE_URL}${meta.filename}`);
      setVolume(volume);
      setLoading(false);
      setPlaying(true);
    }
  }, [playing, sourceKey, volume]);

  const handleVolume = useCallback((v: number) => {
    setVolumeState(v);
    setVolume(v);
  }, []);

  useEffect(() => {
    const meta = SOURCES.find((s) => s.key === sourceKey);
    if (meta) setCoverImage(`${import.meta.env.BASE_URL}${meta.cover}`);
    return () => stopPlayback();
  }, []);

  return {
    sourceKey, vizKey, setVizKey, playing, loading, volume, accent,
    handleSourceChange, handlePlayPause, handleVolume,
  };
}
API
- createBridge (@deprecated: just pass functions as parameters)
Related
- Eidos: Design Patterns Module. All 23 GoF patterns reimagined for functional TypeScript.
- Why FP over OOP? The philosophy behind Eidos: no classes, no inheritance, just functions and types
- Zygos Result: Wrap bridge calls in Result for typed error handling.