The Für Elise Scroll Experience lets you play Beethoven's most recognizable melody with nothing but your scroll wheel. Each scroll tick advances through the score. Each note plays with velocity-mapped duration. The whole thing runs in 950 lines of self-contained HTML with zero dependencies. No libraries, no build tools, no frameworks.
Except it was built entirely with frameworks.
Not JavaScript frameworks. Not React, not Svelte, not anything you install with npm. These frameworks were extracted from sound engineering, film cinematography, motion graphics, and scroll behavior research. They are decision-making systems that turned a vague idea ("what if you could play piano by scrolling?") into specific, implementable architecture.
This article walks through exactly how 24 frameworks from four different disciplines converged to build it.
Try the experience first: Für Elise Scroll Experience. Play Beethoven with your scroll wheel.

Four Disciplines, One Interface
The experience draws from four framework series, each extracted from a domain that has solved problems web design is still figuring out:
Sound Engineering
How audio professionals shape sound became the blueprint for shaping each note's behavior, envelope, and spatial feel.
Film Cinematography
How directors control what you see and when became the system for managing note states and visual focus.
Motion Graphics
How motion designers time, ease, and choreograph movement became the language for note transitions and performance.
Scroll Behavior
How scroll engineers build responsive, accessible, performant scroll experiences became the structural foundation.
None of these disciplines teach you JavaScript. None of them know what a DOM node is. But all of them have spent decades refining how humans perceive timing, space, focus, and movement. That refinement is exactly what frameworks capture.
The Audio Engine: Sound Engineering Meets CSS
When you scroll through the score and hear each note, you are hearing decisions that came directly from sound engineering frameworks.
ADSR Envelopes Shape Each Note
AUDIO-004 The ADSR framework (Attack, Decay, Sustain, Release) is how sound engineers describe the life cycle of any sound. A piano note has a sharp attack, fast decay, minimal sustain, and a long release. That is what gives piano its character. A violin is the opposite: slow attack, no decay, indefinite sustain. The framework doesn't tell you to use the Web Audio API. It tells you that every interaction has these four phases.
In the scroll experience, the ADSR envelope is implemented with three gain ramps. The attack is 15 milliseconds. The decay drops to 58% of peak volume over 30% of the note duration. Then an exponential ramp to near-silence provides the release. Those specific numbers came from studying how acoustic piano envelopes behave.
// ADSR envelope applied to each note
env.gain.setValueAtTime(0, now);
env.gain.linearRampToValueAtTime(0.6, now + 0.015); // attack
env.gain.exponentialRampToValueAtTime(0.35, now + dur * 0.3); // decay
env.gain.exponentialRampToValueAtTime(0.01, now + dur); // release

A second oscillator runs one octave above the fundamental frequency at very low volume. This adds warmth without muddying the tone. That decision came from AUDIO-002, which teaches frequency separation: how to add richness in a different register so sounds don't mask each other.
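The two-oscillator voice can be sketched as pure data. The function name and the 0.15 overtone gain are assumptions for illustration; the article says only that the second oscillator runs at very low volume:

```javascript
// Hypothetical helper: describe the two partials of each note as data.
// The fundamental carries the tone; a second partial one octave up
// (double the frequency) adds warmth at low gain without masking it.
function voicePartials(fundamental) {
  return [
    { freq: fundamental, gain: 1.0 },      // main oscillator
    { freq: fundamental * 2, gain: 0.15 }, // octave overtone (gain assumed)
  ];
}
```

Each entry would feed one OscillatorNode through its own GainNode before the shared ADSR envelope, keeping the overtone in a separate register per AUDIO-002.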
Reverb Creates the Room
AUDIO-003 Spatial Depth teaches that reverb doesn't just add echo. It creates a room around the sound. In professional mixing, the wet/dry ratio determines how close or far a sound feels. The scroll experience uses a simple delay-based reverb: 120ms delay time with a 15% gain feedback. This places the piano in a small, intimate space, like a private recital room rather than a concert hall.
Reverb doesn't add decoration. It adds location.
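Those two numbers explain the "small room" feel. A rough sketch of the tail-length math (not the engine's actual code) shows how quickly a 15% feedback loop dies away:

```javascript
// How long does a feedback-delay tail stay audible? Each echo is
// `feedback` times the previous one, spaced `delaySec` apart; the loop
// counts time until the next echo would fall below the threshold.
function reverbTailSeconds(delaySec, feedback, threshold = 0.001) {
  let level = 1;
  let t = 0;
  while (level * feedback >= threshold) {
    level *= feedback;
    t += delaySec;
  }
  return t;
}
```

With a 120 ms delay and 15% feedback, the tail drops below a thousandth of the original level in roughly a third of a second, which is why the space reads as an intimate room rather than a hall.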
Velocity Maps to Duration
AUDIO-001 Dynamic Range teaches that louder isn't always better. In sound engineering, compression controls the ratio between quiet and loud. In the scroll experience, scroll speed controls note duration. Slow scrolling produces longer, more expressive notes (up to 0.7 seconds). Fast scrolling creates short, staccato notes (as brief as 0.1 seconds). The scroll wheel becomes a velocity-sensitive input, just like a weighted piano key.
The Velocity Formula
A five-sample moving average smooths the raw scroll velocity to prevent jittery note lengths. The duration formula maps the smoothed velocity inversely: duration = 0.7 - (speed * 0.35), clamped between 0.1 and 1.2 seconds. Slow, deliberate scrolling rewards you with full, singing notes. Rushing through produces a rapid, percussive run.
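A minimal sketch of that pipeline; the function names are hypothetical, since the article doesn't show this part of the source:

```javascript
// Five-sample moving average over the most recent raw scroll velocities.
function smoothVelocity(samples) {
  const recent = samples.slice(-5);
  return recent.reduce((sum, v) => sum + v, 0) / recent.length;
}

// Inverse mapping: faster scrolling yields shorter notes,
// clamped to the article's 0.1–1.2 second range.
function noteDuration(speed) {
  return Math.min(1.2, Math.max(0.1, 0.7 - speed * 0.35));
}
```

Smoothing before mapping is what keeps a single jittery wheel tick from producing one abruptly short note in the middle of a legato phrase.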
The Visual System: Cinematography Meets DOM
How the notes look as you scroll past them is not CSS decoration. It is a camera system borrowed directly from film cinematography.
Depth of Field Controls Focus
FILM-005 In film, depth of field determines what is sharp and what is blurred. A shallow depth of field pulls your eye to the subject. Everything else falls away. The scroll experience applies this same principle to notes using four visual states:
Active: The note at center viewport. Full opacity, 130% scale, copper glow with animated pulse, note label visible. This is the focal point.
Approaching: The next two or three notes. 80% opacity, 105% scale, no color shift. These are entering your peripheral awareness.
Played: Notes behind the center point. 30% opacity, 95% scale, desaturated to muted grey. They are memories now, not active elements.
Distant: Notes far ahead in the score. 40% opacity, 0.5px blur filter applied. Out of focus, literally and visually.
This four-state system mirrors exactly how a camera with a shallow depth of field renders a scene. One sharp subject, a soft gradient of focus falling away in both directions.
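The four states reduce to a small classifier plus a style table. This is a sketch with assumed names; the "approaching" window of three notes follows the "next two or three" description above, and the distant-state scale is an assumption (the article specifies only its opacity and blur):

```javascript
// Classify a note by its index distance from the note at viewport center.
function noteState(index, activeIndex) {
  const delta = index - activeIndex;
  if (delta === 0) return "active";
  if (delta < 0) return "played";       // behind the center point
  if (delta <= 3) return "approaching"; // the next few notes
  return "distant";
}

// Visual properties per state (opacity, scale, blur in px), per the list above.
const STATE_STYLE = {
  active:      { opacity: 1.0, scale: 1.30, blur: 0 },
  approaching: { opacity: 0.8, scale: 1.05, blur: 0 },
  played:      { opacity: 0.3, scale: 0.95, blur: 0 },
  distant:     { opacity: 0.4, scale: 1.0,  blur: 0.5 }, // scale assumed
};
```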
Camera Movement as Scroll Direction
FILM-002 The vocabulary of camera movement is precise. A pan moves horizontally. A dolly moves the camera itself through space. A tilt moves vertically. The scroll experience uses a "dolly" motion: the viewport moves through the score rather than the score sliding past a fixed viewport. This distinction matters. When you scroll, you feel like you are moving forward through the music, not watching it slide by.
Technically, this is achieved by mapping vertical scrollY to horizontal translateX. The body is tall enough to create scroll distance, but the fixed-position score container moves sideways based on scroll position. The input is vertical (natural scroll direction), but the visual movement is horizontal (natural reading direction for musical notation).
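Sketched as a pure function (names assumed), the mapping is a straight proportion; with body height equal to canvas width it becomes a 1:1 pixel exchange:

```javascript
// Map vertical scroll position to a horizontal score offset.
// When maxScroll equals canvasWidth, one pixel scrolled down
// moves the score exactly one pixel to the left.
function scoreOffsetX(scrollY, maxScroll, canvasWidth) {
  const progress = maxScroll > 0 ? scrollY / maxScroll : 0;
  return -progress * canvasWidth; // negative: score slides left as you scroll down
}

// Applying it would look something like:
// score.style.transform = `translateX(${scoreOffsetX(window.scrollY, max, width)}px)`;
```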
The Timing System: Motion Graphics Meets Transitions
Every transition in the experience, the way notes grow, glow, and fade, is governed by motion graphics frameworks.
Easing Creates Character
MOTION-002 Easing Intelligence teaches that timing curves have personality. Linear motion feels mechanical. Ease-out feels natural, like something coming to rest. Ease-in feels like something building energy. The note transitions use CSS ease curves at 0.25 to 0.3 seconds, fast enough to feel responsive but long enough for the eye to register the change. The opacity, scale, and text-shadow transitions all share the same easing function, which creates a unified feeling of "one object changing state" rather than "three separate properties animating."
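One way to guarantee that unified feel is to build the transition value from a single shared duration and easing, as in this hypothetical helper (the 0.28 s default sits inside the article's 0.25 to 0.3 second range):

```javascript
// Build one CSS transition string from a shared duration and easing,
// so every animated property changes in lockstep.
function sharedTransition(props, seconds = 0.28, easing = "ease") {
  return props.map((p) => `${p} ${seconds}s ${easing}`).join(", ");
}

// note.style.transition = sharedTransition(["opacity", "transform", "text-shadow"]);
```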
Choreography Across the Score
MOTION-001 The Choreography framework teaches stagger and sequence. When you scroll through a passage of eighth notes, the state changes cascade naturally because each note hits the center viewport in sequence. But the framework's real contribution is the entrance/exit order principle: approaching notes fade in at 80% opacity before reaching the active state, and played notes fade down to 30% after passing. This creates a wave-like visual rhythm that mirrors the musical rhythm.
Performance Under Pressure
MOTION-006 Motion Performance Intelligence establishes the 16ms budget: every frame must complete in under 16 milliseconds or you drop frames and break the illusion. The scroll experience uses will-change: transform on the score container to hint GPU compositing, and throttles scroll handling through requestAnimationFrame to prevent layout thrashing. The note state updates use only opacity, transform, and filter, all compositor-friendly properties that avoid triggering layout or paint.
The Foundation: Scroll Behavior Engineering
The entire experience is a scroll experience. Every decision about how the scroll input is captured, processed, and translated into visual and audio output comes from the scroll behavior frameworks.
Passive Listeners and rAF Throttling
SCROLL-001 CSS Scroll-Driven Animations Intelligence teaches the performance hierarchy: compositor-thread animations are fastest, then rAF-throttled JavaScript, then unthrottled scroll handlers. The experience uses a rAF-gated scroll listener with a passive: true flag. This means the browser never waits for the handler to decide whether to cancel the scroll event. The scroll is always smooth.
// rAF-throttled scroll (SCROLL-001 principle)
let ticking = false;
window.addEventListener('scroll', () => {
if (!ticking) {
requestAnimationFrame(() => {
onScroll();
ticking = false;
});
ticking = true;
}
}, { passive: true });

Scroll Physics Shape the Feel
SCROLL-003 Scroll Physics teaches that scroll behavior is felt before it is seen. The experience calculates instantaneous scroll velocity, smooths it with a five-sample moving average, and applies a decay function when scrolling stops. This smoothed velocity drives two things simultaneously: the BPM display and the note duration. The velocity decay (80% reduction every 100ms when idle) prevents lingering ghost notes after you stop scrolling.
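The decay step can be sketched as a pure function (name assumed): each full 100 ms of idle time keeps only 20% of the smoothed velocity, so a stopped wheel goes silent almost immediately:

```javascript
// Apply the idle decay: an 80% reduction for every full 100 ms
// that has passed since the last scroll event.
function decayVelocity(velocity, idleMs) {
  const steps = Math.floor(idleMs / 100);
  return velocity * Math.pow(0.2, steps);
}
```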
Spatial Navigation: Vertical Input, Horizontal Output
SCROLL-006 Spatial Navigation Architecture provides the decision framework for when horizontal scroll is appropriate. Musical notation reads left to right. Scroll wheels move vertically. The framework's content-scroll fit matrix identifies this exact pattern: when the content axis and input axis are perpendicular, you need an explicit mapping. The body height is set equal to the canvas width, creating a 1:1 pixel ratio between scrollY and translateX.
Accessibility Is Not Optional
SCROLL-005 The first line of CSS in the experience is a prefers-reduced-motion media query that disables transitions and animations. Keyboard navigation is fully supported: arrow keys advance by one note spacing in either direction. The scroll experience never hijacks native scroll behavior. You scroll normally. The visual and audio layers respond to that scroll. You can always scroll away, scroll back, or leave the page. There is no scroll jacking.
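The keyboard path reduces to a clamped step function. The exact key bindings here are an assumption; the article says only that arrow keys advance by one note spacing in either direction:

```javascript
// Compute the next scroll target for a keypress: one note spacing
// forward or back, clamped to the scrollable range. Other keys
// return the current position so native behavior is untouched.
function keyboardScrollTarget(key, scrollY, noteSpacing, maxScroll) {
  if (key === "ArrowRight" || key === "ArrowDown") {
    return Math.min(maxScroll, scrollY + noteSpacing);
  }
  if (key === "ArrowLeft" || key === "ArrowUp") {
    return Math.max(0, scrollY - noteSpacing);
  }
  return scrollY;
}
```

A handler would pass the result to `window.scrollTo`, which keeps the keyboard path flowing through the same scroll pipeline as the wheel.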
The framework draws a hard line: scroll-triggered animation (preserving native scroll) is fine. Scroll hijacking (overriding it) almost never is.
Choreography Ties It All Together
SCROLL-004 Scroll-Triggered Choreography Intelligence provides the pin-and-scrub pattern that gives the experience its structure. The score container is position-fixed (pinned), and the scroll position scrubs through it. The ending screen triggers at 92% scroll progress, with a 1.5 second delay before fading in. The play-again button resets all state variables, re-shows the instruction overlay, and scrolls back to zero without a page reload.
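The trigger itself is a one-line progress check, sketched here with an assumed name (the 1.5 second fade delay would be handled separately by a timer):

```javascript
// Show the ending screen once scroll progress crosses 92%.
function shouldShowEnding(scrollY, maxScroll) {
  return maxScroll > 0 && scrollY / maxScroll >= 0.92;
}
```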
What the Frameworks Actually Did
None of these 24 frameworks wrote a single line of code. They did something more valuable. They answered design questions before those questions became debugging sessions.
How should a note sound? AUDIO-004 said: attack, decay, release. Map the envelope to the interaction physics. Use exponential ramps, not linear ones.
How should notes look as they pass? FILM-005 said: focal point, transition zone, background. Four states, each with measurable visual properties.
How fast should transitions be? MOTION-006 said: under 16ms compute budget. Use compositor-friendly properties only. Hint the GPU with will-change.
How should the scroll input work? SCROLL-006 said: when content axis is perpendicular to input axis, create an explicit 1:1 pixel mapping. Use rAF throttling. Never block the main thread.
What about accessibility? SCROLL-005 said: prefers-reduced-motion first. Keyboard support always. Never override native scroll behavior.
The Compound Effect
Each framework series also has a compound framework (AUDIO-007, FILM-007, MOTION-007, SCROLL-007) that synthesizes the individual frameworks into a unified decision system. These compounds resolved the integration questions: how does ADSR timing interact with scroll velocity? How does depth of field interact with easing curves? The answer in every case was the same principle: let each system own its domain and connect them through shared variables.
950 Lines, Zero Dependencies
The final experience is a single HTML file. No build step, no package.json, no node_modules. The CSS is inline. The JavaScript is inline. The Web Audio engine generates tones with raw oscillator nodes. The reverb is a feedback delay. The visual system is pure DOM manipulation with CSS transitions.
This is not a constraint for its own sake. It is a direct consequence of the frameworks. MOTION-006 teaches that every dependency is a performance risk. SCROLL-001 teaches that compositor-thread operations are always faster than JavaScript library abstractions. AUDIO-003 teaches that a simple delay creates more convincing space than a complex convolution reverb if the parameters are right.
The frameworks didn't just guide the design. They eliminated the need for tools that most developers would reach for by default. When you know exactly what each note should sound like, look like, and feel like, you don't need a library to figure it out for you.
Experience it yourself: Play Für Elise With Your Scroll Wheel. 52 notes, zero dependencies, 24 frameworks.