Bridge Pattern
Separate what something does from how it does it, so the two can evolve independently.
The Problem
You have audio sources (Ode To The Winners, The Epic Hero, Gaia) and visualizers (Nebula, Silk, Smear, Cosmos). Without Bridge, you would need OdeNebula, OdeSilk, OdeSmear, EpicNebula... a combinatorial explosion of classes.
```ts
interface AudioSource {
  getFrame(time: number): number[];
}

interface Visualizer {
  render(frame: number[]): void;
}

abstract class MusicPlayer {
  constructor(
    protected source: AudioSource,
    protected visualizer: Visualizer,
  ) {}
  abstract play(): void;
}

class LivePlayer extends MusicPlayer {
  play() {
    const frame = this.source.getFrame(Date.now());
    this.visualizer.render(frame);
  }
}
```
Two hierarchies, an abstract class, and concrete subclasses. All of that just to combine two things that vary independently.
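To see the ceremony up close, here is a sketch of what wiring even a trivial pairing requires under the class-based design. `SilenceSource` and `LogVisualizer` are hypothetical stand-ins invented for this example; only the interfaces and classes mirror the code above.

```typescript
interface AudioSource {
  getFrame(time: number): number[];
}
interface Visualizer {
  render(frame: number[]): void;
}

abstract class MusicPlayer {
  constructor(
    protected source: AudioSource,
    protected visualizer: Visualizer,
  ) {}
  abstract play(): void;
}

class LivePlayer extends MusicPlayer {
  play() {
    this.visualizer.render(this.source.getFrame(Date.now()));
  }
}

// Even a constant source must be wrapped in a class to satisfy the interface.
class SilenceSource implements AudioSource {
  getFrame(_time: number): number[] {
    return [0, 0, 0, 0];
  }
}

// A visualizer that just records what it was asked to render.
class LogVisualizer implements Visualizer {
  frames: number[][] = [];
  render(frame: number[]): void {
    this.frames.push(frame);
  }
}

const viz = new LogVisualizer();
new LivePlayer(new SilenceSource(), viz).play();
console.log(viz.frames.length); // 1 frame was rendered
```

Four type declarations and two class wrappers, where a plain function on each side would have carried the same information.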
The Solution
Pass the function as a parameter. The two axes compose freely.
```ts
type AudioFrame = Uint8Array;

type Visualizer = (
  ctx: CanvasRenderingContext2D,
  frame: AudioFrame,
  w: number, h: number,
  accent: string, time: number,
) => void;

function visualize(
  ctx: CanvasRenderingContext2D,
  frame: AudioFrame,
  renderer: Visualizer,
  w: number, h: number,
  accent: string, time: number,
) {
  renderer(ctx, frame, w, h, accent, time);
}

// 3 sources × 5 visualizers = 15 combinations, zero coupling
visualize(ctx, frame, nebulaVisualizer, w, h, accent, t);
visualize(ctx, frame, silkVisualizer, w, h, accent, t);
visualize(ctx, frame, cosmosVisualizer, w, h, accent, t);
```
No classes. No inheritance. Just functions and data.
This solution doesn't use Pithos. That's exactly the point.
In functional TypeScript, passing a function as a parameter is the Bridge pattern. Eidos exports a @deprecated createBridge() function that exists only to guide you here.
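To make that claim concrete, here is a minimal sketch of one renderer plugging into the bridge. `barsVisualizer` and the stubbed canvas context are hypothetical, invented for this example; only the `Visualizer` type and `visualize` mirror the code above.

```typescript
type AudioFrame = Uint8Array;

type Visualizer = (
  ctx: CanvasRenderingContext2D,
  frame: AudioFrame,
  w: number, h: number,
  accent: string, time: number,
) => void;

function visualize(
  ctx: CanvasRenderingContext2D,
  frame: AudioFrame,
  renderer: Visualizer,
  w: number, h: number,
  accent: string, time: number,
) {
  renderer(ctx, frame, w, h, accent, time);
}

// Hypothetical renderer: one vertical bar per frequency bin.
const barsVisualizer: Visualizer = (ctx, frame, w, h, accent) => {
  ctx.fillStyle = accent;
  const barW = w / frame.length;
  frame.forEach((v, i) => {
    const barH = (v / 255) * h;
    ctx.fillRect(i * barW, h - barH, barW - 1, barH);
  });
};

// Stub context, just enough to observe the draw calls (no DOM needed).
const calls: unknown[][] = [];
const ctx = {
  fillStyle: "",
  fillRect: (...args: unknown[]) => { calls.push(args); },
} as unknown as CanvasRenderingContext2D;

visualize(ctx, new Uint8Array([0, 128, 255]), barsVisualizer, 300, 100, "#f43f5e", 0);
console.log(calls.length); // 3: one fillRect per frequency bin
```

Swapping in a different renderer is a one-argument change; neither side knows the other exists.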
Demo
Pick an audio source and a visualizer. The two axes vary independently: 3 sources × 5 visualizers = 15 combinations, all driven by a single visualize(frame, renderer) call.
- bridge.ts
- audio.ts
- Usage
```ts
/**
 * The Bridge function.
 *
 * Two independent axes: AudioSource × Visualizer.
 * This function is the bridge — it connects a frame of audio data
 * to any renderer, with zero coupling between them.
 */
import type { AudioFrame, Visualizer } from "./types";

export function visualize(
  ctx: CanvasRenderingContext2D,
  frame: AudioFrame,
  renderer: Visualizer,
  w: number,
  h: number,
  accent: string,
  time: number,
): void {
  renderer(ctx, frame, w, h, accent, time);
}
```
```ts
/**
 * Web Audio API engine — loads, plays, pauses, seeks audio tracks.
 * Independent from the visualizer (that's the bridge).
 */
import type { AudioFrame } from "./types";

let audioCtx: AudioContext | null = null;
let analyser: AnalyserNode | null = null;
let gainNode: GainNode | null = null;
let currentSourceNode: AudioBufferSourceNode | null = null;
let currentBuffer: AudioBuffer | null = null;
let startOffset = 0;
let startTime = 0;

function getAudioContext(): AudioContext {
  if (!audioCtx) {
    audioCtx = new AudioContext();
    analyser = audioCtx.createAnalyser();
    analyser.fftSize = 512;
    analyser.smoothingTimeConstant = 0.82;
    gainNode = audioCtx.createGain();
    analyser.connect(gainNode);
    gainNode.connect(audioCtx.destination);
  }
  return audioCtx;
}

const bufferCache = new Map<string, AudioBuffer>();

export async function loadAndPlay(url: string): Promise<void> {
  const ctx = getAudioContext();
  if (ctx.state === "suspended") await ctx.resume();
  if (currentSourceNode) {
    currentSourceNode.stop();
    currentSourceNode.disconnect();
    currentSourceNode = null;
  }
  let buffer = bufferCache.get(url);
  if (!buffer) {
    const response = await fetch(url);
    const arrayBuffer = await response.arrayBuffer();
    buffer = await ctx.decodeAudioData(arrayBuffer);
    bufferCache.set(url, buffer);
  }
  currentBuffer = buffer;
  const source = ctx.createBufferSource();
  source.buffer = buffer;
  source.loop = true;
  source.connect(analyser as AnalyserNode);
  source.start(0);
  startOffset = 0;
  startTime = ctx.currentTime;
  currentSourceNode = source;
}

export function stopPlayback(): void {
  if (currentSourceNode) {
    currentSourceNode.stop();
    currentSourceNode.disconnect();
    currentSourceNode = null;
  }
}

export function pausePlayback(): void {
  if (audioCtx?.state === "running") audioCtx.suspend();
}

export function resumePlayback(): void {
  if (audioCtx?.state === "suspended") audioCtx.resume();
}

export function setVolume(v: number): void {
  if (gainNode) gainNode.gain.value = v;
}

export function getVolume(): number {
  return gainNode ? gainNode.gain.value : 1;
}

export function isPlaying(): boolean {
  return currentSourceNode !== null && audioCtx !== null && audioCtx.state === "running";
}

export function getCurrentTime(): number {
  if (!audioCtx || !currentBuffer) return 0;
  const elapsed = audioCtx.currentTime - startTime + startOffset;
  return elapsed % currentBuffer.duration;
}

export function seekTo(time: number): void {
  if (!audioCtx || !currentBuffer || !analyser) return;
  if (currentSourceNode) {
    currentSourceNode.stop();
    currentSourceNode.disconnect();
  }
  const source = audioCtx.createBufferSource();
  source.buffer = currentBuffer;
  source.loop = true;
  source.connect(analyser);
  source.start(0, time);
  startOffset = time;
  startTime = audioCtx.currentTime;
  currentSourceNode = source;
}

export function getDuration(): number {
  return currentBuffer?.duration ?? 0;
}

export function readFrame(): AudioFrame {
  if (!analyser) return new Uint8Array(256);
  const data = new Uint8Array(analyser.frequencyBinCount);
  analyser.getByteFrequencyData(data);
  return data;
}
```
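One piece the listings leave implicit is the render loop that feeds readFrame() output to the current renderer every animation frame. The sketch below is an assumption, not the demo's actual code: the scheduler and the frame reader are injected as parameters (so the loop can run without a browser), where the real app would presumably pass requestAnimationFrame and the engine's readFrame.

```typescript
type AudioFrame = Uint8Array;
type Visualizer = (
  ctx: CanvasRenderingContext2D,
  frame: AudioFrame,
  w: number, h: number,
  accent: string, time: number,
) => void;

// Hypothetical render loop. getRenderer is a thunk so the UI can swap
// visualizers live; readFrame and schedule are injected dependencies.
function startRenderLoop(
  ctx: CanvasRenderingContext2D,
  getRenderer: () => Visualizer,
  readFrame: () => AudioFrame,
  w: number, h: number, accent: string,
  schedule: (cb: (t: number) => void) => void, // e.g. requestAnimationFrame
): { stop: () => void } {
  let running = true;
  const tick = (time: number) => {
    if (!running) return;
    ctx.clearRect(0, 0, w, h);
    getRenderer()(ctx, readFrame(), w, h, accent, time);
    schedule(tick); // request the next frame
  };
  schedule(tick);
  return { stop: () => { running = false; } };
}

// Fake collaborators so the loop runs synchronously, without a browser.
let frames = 0;
const ctx = { clearRect: () => {} } as unknown as CanvasRenderingContext2D;
const countingViz: Visualizer = () => { frames += 1; };

// Synchronous scheduler that drains after exactly 5 ticks.
let ticks = 0;
const schedule = (cb: (t: number) => void) => {
  if (ticks < 5) { ticks += 1; cb(ticks * 16); }
};

const loop = startRenderLoop(ctx, () => countingViz, () => new Uint8Array(4), 300, 100, "#fff", schedule);
loop.stop(); // would halt a live loop; here the fake scheduler is already drained
console.log(frames); // 5
```

Note that the renderer is looked up on every tick: changing the visualizer mid-playback never touches the audio side, which is the whole point of the bridge.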
```ts
import { useState, useEffect, useCallback } from "react";
import {
  loadAndPlay, stopPlayback, pausePlayback, resumePlayback,
  setVolume, isPlaying, getDuration,
} from "@/lib/audio";
import { setCoverImage } from "@/lib/visualizers/sun";
import { SOURCES } from "@/data/tracks";
import type { SourceKey, VisualizerKey } from "@/lib/types";

export function useMusicPlayer() {
  const [sourceKey, setSourceKey] = useState<SourceKey>("track1");
  const [vizKey, setVizKey] = useState<VisualizerKey>("bars");
  const [playing, setPlaying] = useState(false);
  const [loading, setLoading] = useState(false);
  const [volume, setVolumeState] = useState(0.7);

  const sourceMeta = SOURCES.find((s) => s.key === sourceKey);
  const accent = sourceMeta?.color ?? "#f43f5e";

  const handleSourceChange = useCallback(async (key: SourceKey) => {
    setSourceKey(key);
    const meta = SOURCES.find((s) => s.key === key);
    if (!meta) return;
    setCoverImage(`${import.meta.env.BASE_URL}${meta.cover}`);
    setLoading(true);
    await loadAndPlay(`${import.meta.env.BASE_URL}${meta.filename}`);
    setVolume(volume);
    setLoading(false);
    setPlaying(true);
  }, [volume]);

  const handlePlayPause = useCallback(async () => {
    if (playing) {
      pausePlayback();
      setPlaying(false);
    } else if (isPlaying() || getDuration() > 0) {
      resumePlayback();
      setPlaying(true);
    } else {
      const meta = SOURCES.find((s) => s.key === sourceKey);
      if (!meta) return;
      setLoading(true);
      await loadAndPlay(`${import.meta.env.BASE_URL}${meta.filename}`);
      setVolume(volume);
      setLoading(false);
      setPlaying(true);
    }
  }, [playing, sourceKey, volume]);

  const handleVolume = useCallback((v: number) => {
    setVolumeState(v);
    setVolume(v);
  }, []);

  useEffect(() => {
    const meta = SOURCES.find((s) => s.key === sourceKey);
    if (meta) setCoverImage(`${import.meta.env.BASE_URL}${meta.cover}`);
    return () => stopPlayback();
  }, []);

  return {
    sourceKey, vizKey, setVizKey, playing, loading, volume, accent,
    handleSourceChange, handlePlayPause, handleVolume,
  };
}
```
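How does the hook's vizKey string become a function? Presumably through a lookup table in the component that owns the canvas. The sketch below is an assumption about that wiring: the key set is guessed from the names in this page, and the renderers are stubs that only record their name; only the Visualizer shape comes from the code above.

```typescript
type AudioFrame = Uint8Array;
type Visualizer = (
  ctx: CanvasRenderingContext2D,
  frame: AudioFrame,
  w: number, h: number,
  accent: string, time: number,
) => void;

// Assumed key set; "bars" is the hook's default, the rest are guessed.
type VisualizerKey = "bars" | "nebula" | "silk" | "smear" | "cosmos";

// Stub renderers that record which one ran (real ones draw to the canvas).
const drawn: string[] = [];
const stub = (name: string): Visualizer => () => { drawn.push(name); };

const VISUALIZERS: Record<VisualizerKey, Visualizer> = {
  bars: stub("bars"),
  nebula: stub("nebula"),
  silk: stub("silk"),
  smear: stub("smear"),
  cosmos: stub("cosmos"),
};

// The component's draw step: a string key picks a function; nothing else changes.
const vizKey: VisualizerKey = "bars";
const renderer = VISUALIZERS[vizKey];
renderer(null as unknown as CanvasRenderingContext2D, new Uint8Array(0), 0, 0, "#fff", 0);
console.log(drawn); // [ "bars" ]
```

The table itself is the only place where keys and renderers meet; adding a sixth visualizer is one new entry, not a new class hierarchy.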
API
- createBridge
  @deprecated: just pass functions as parameters
Related Links
- Eidos: Design Patterns module. The 23 GoF patterns reimagined for functional TypeScript.
- Why FP instead of OOP? The philosophy behind Eidos: no classes, no inheritance, just functions and types.
- Zygos Result. Wrap your bridge calls in Result for typed error handling.