Nine planets, one helmet, and the sound of places you can't breathe.

SoundTrip started as a festival aggregator — trip planning, interactive maps, everything you need to actually get to the music. Somewhere along the way, the map kept expanding. Past the coastlines. Past the atmosphere. Here's where we are — and where we're going.

The map is live. All nine worlds.

You can open SoundTrip right now and fly from a festival in Berlin to the rim of a crater on Luna, then drop into Mars' Valles Marineris — 4,000 km of canyon that we filled with dark techno because the reverb profile demanded it.

Earth shows real festivals with real dates. Every other planet hosts fictional events placed on real coordinates, using real NASA satellite imagery. The craters are real. The volcanoes are real. The methane lakes on Titan are real. The festivals are the only fiction.

Each planet has its own marker style: gold diamonds on the Moon, copper hexagons on Mars, purple circles on Mercury, gold rhombuses on Venus, amber drops on Titan. Not decorative choices — they're visual signals that tell you where you are before you even read the name.
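Internally, a per-planet style lookup like this is all it takes; the field names here are illustrative, not SoundTrip's actual schema.

```javascript
// Sketch of the per-planet marker config described above.
// Shape and color names are illustrative, not the real stylesheet.
const MARKER_STYLES = {
  moon:    { shape: 'diamond', color: 'gold'   },
  mars:    { shape: 'hexagon', color: 'copper' },
  mercury: { shape: 'circle',  color: 'purple' },
  venus:   { shape: 'rhombus', color: 'gold'   },
  titan:   { shape: 'drop',    color: 'amber'  },
};

// Fall back to a neutral marker for worlds without a defined style.
function markerFor(planet) {
  return MARKER_STYLES[planet] || { shape: 'circle', color: 'white' };
}
```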

You can put on a helmet.

On non-Earth planets, there's a helmet button. Press it and the screen changes: a visor frame closes around your view. Dust starts accumulating on the glass. A HUD lights up — coordinates, signal bars, event name, audio filter readout.

The comms system types out messages character by character, the way radio transmissions feel when you're far from anything. On the Moon it tells you that sound reaches you only through solid conduction — because there's no air. It tells you Earth is visible at bearing 078. It reminds you to wipe the regolith dust off your visor, because it's electrostatically charged.

What happens when the helmet starts making sound? That's Post 004.

What we're building now.

The helmet is visual. We want it to be auditory too — and physical, as much as a screen can be. Here's what's in progress and what's coming.

Done → Post 004
Helmet Audio Engine
A real-time audio chain that simulates how sound propagates through a spacesuit on each planet. On the Moon: no air, ground conduction only, suit resonance, 35 Hz seismic rumble with 12-second decay. All derived from Apollo mission data. Full story in Post 004.
Done → Post 004
Haptic feedback
Seismic events now shake the screen and trigger vibration on Android. On iOS, where Safari blocks the Vibration API, the audio engine sends a 20 Hz impulse through the speaker — low enough to feel, not hear. It's not a haptic motor, but your phone vibrates. Solved as part of the Helmet Audio Engine.
Evolved → Post 003
Groups — shared exploration
We rethought this from the ground up. No server, no real-time sync — just multiple passports loaded on the same device. Sit next to your friends, load your passports together, and see your explorations overlap on the map. The group exists between friends, not on a server. Full story in Post 003.
Done → Post 002
Gamepad navigation
The map works with a controller. Pan with sticks, zoom with triggers, cycle planets, scroll through markers, enter events. Full DualSense and Xbox controller support. Bearing-aware navigation — push the stick and the map moves where you're looking. Spec in Post 002.
Planned
Full planetary datasets
Right now only Mars has complete event data — genre, description, acoustic narrative. Luna, Mercury, Venus, and Titan have names and coordinates but nothing else. Each event needs a sonic identity rooted in the actual surface feature it sits on. A festival inside Caloris Basin on Mercury should sound different from one on the methane shores of Kraken Mare on Titan.
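The 20 Hz speaker fallback from the haptic item above can be sketched roughly like this. The function name, the 400 ms duration, and the gain values are ours; `nav` and `ctx` stand in for `navigator` and a Web Audio `AudioContext`, injected so the decision logic stays testable.

```javascript
// Rough sketch of the haptic strategy: try the Vibration API first,
// fall back to a sub-audible speaker impulse where it's blocked.
function seismicHaptic(nav, ctx, durationMs = 400) {
  // Android Chrome: the Vibration API just works.
  if (typeof nav.vibrate === 'function' && nav.vibrate(durationMs)) {
    return 'vibration';
  }
  // iOS Safari blocks navigator.vibrate, so send a 20 Hz sine through
  // the speaker instead: below hearing, but the diaphragm still moves.
  const osc = ctx.createOscillator();
  const gain = ctx.createGain();
  osc.frequency.value = 20;
  gain.gain.setValueAtTime(1, ctx.currentTime);
  gain.gain.exponentialRampToValueAtTime(0.001, ctx.currentTime + durationMs / 1000);
  osc.connect(gain).connect(ctx.destination);
  osc.start();
  osc.stop(ctx.currentTime + durationMs / 1000);
  return 'audio-impulse';
}
```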

Where this goes.

The question SoundTrip is built around is simple: a festival on Earth — or on another planet?

On Earth, SoundTrip is a travel companion. Festivals, maps, the kind of information you need to actually plan a trip — where to go, when to be there, what's nearby. The platform is built to help you explore, discover, and get moving.

Off Earth, it becomes something else entirely. The same instinct — explore, discover, feel something — but directed at places where no one has ever stood. Where the atmosphere is wrong, or missing. Where sound itself behaves differently. Helmet mode, audio engines, planetary acoustics — they all serve the same idea: what if you could experience a festival on the surface of Mars, and the experience was shaped by actual Martian physics?

Both sides feed each other. The trip planning makes SoundTrip useful. The planetary exploration makes it unforgettable.

The craters are real. The volcanoes are real. The methane lakes are real. The festivals are the only fiction.

More soon.

We gave the map a controller. It changed everything.

SoundTrip already felt more like exploring a world than browsing a website. So we asked the obvious question: what if you could navigate it with a gamepad? We spent a week inside the Gamepad API, WebHID, and a DualSense. Here's what we found.

The analog stick changes the relationship.

A mouse is a pointer. You click on things. A keyboard is a set of discrete commands. You press and something happens. An analog stick is different — it's a continuous gesture. You push gently and the map drifts. You push hard and it flies. The relationship between your hand and the map becomes physical, proportional, felt.

On a planetary map with nine worlds to explore, that shift matters. Panning across Valles Marineris with a stick feels like steering. Zooming into a crater on Mercury with a trigger feels like descending. Selecting a festival with a face button feels like arriving.

We wrote the spec before the code.

Before implementing anything, we built a full interactive specification — a single HTML document that covers the entire Gamepad API surface: core polling loop, button and axis mapping, browser compatibility, controller hardware profiles, haptic feedback, and the WebHID layer that unlocks what the standard API can't reach.

The spec focuses on what actually works today, not what the W3C spec promises in theory. We tested across Chrome, Firefox, and Safari, with DualSense, Xbox, and Switch Pro controllers. The results were sometimes surprising.

What we discovered along the way.

Firefox drops the analog.

On Chrome and Edge, the L2 and R2 triggers return a smooth float between 0.0 and 1.0 — press halfway, get 0.5. On Firefox, the same triggers return only 0 or 1. All the analog depth is gone. For any interaction that depends on gradual pressure — smooth zoom, progressive acceleration, intensity control — Firefox isn't an option. This isn't a controller issue; it's a browser implementation choice.

The standard API has a ceiling.

The Gamepad API gives you buttons, axes, and basic vibration. That's it. The DualSense has a gyroscope, an accelerometer, a touchpad, adaptive triggers, and a programmable light bar — none of which are accessible through the standard API. To reach them, you need WebHID: a low-level protocol that lets the browser talk directly to the USB device.

The DualSense is a full instrument.

Through WebHID, we wrote a driver that reads raw input reports from the DualSense at 250 Hz. Gyroscope data becomes map rotation — tilt the controller and the map tilts with you. The touchpad becomes a pan surface. The light bar changes color based on zoom level, cycling from deep blue when zoomed out to green-yellow when you're close to the ground.
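Decoding those raw reports looks roughly like this. The byte layout comes from community reverse-engineering of the DualSense USB input report (report ID 0x01); Bluetooth mode uses different offsets, and the function name is ours.

```javascript
// Sketch of decoding the stick axes from a DualSense USB input report:
// per community documentation, bytes 0-3 after the report ID are the
// four stick axes as unsigned bytes centred on 128.
function parseSticks(view) {
  const norm = b => (b - 128) / 128; // map 0..255 to roughly -1..1
  return {
    lx: norm(view.getUint8(0)),
    ly: norm(view.getUint8(1)),
    rx: norm(view.getUint8(2)),
    ry: norm(view.getUint8(3)),
  };
}

// In the browser this would be fed from a WebHID 'inputreport' event,
// e.g. device.addEventListener('inputreport', e => parseSticks(e.data));
```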

The gyro pipeline alone required bias calibration, exponential moving average smoothing, non-linear response curves, and configurable dead zones. Small hand tremors produce zero movement. Deliberate tilts produce smooth, curved rotation. It's the difference between a raw sensor and a usable input device.
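The conditioning steps above chain together like this; the constants are illustrative, not the tuned values SoundTrip ships.

```javascript
// Sketch of the gyro conditioning pipeline: bias calibration, EMA
// smoothing, dead zone, then a non-linear response curve.
const ALPHA = 0.2;      // EMA smoothing factor (illustrative)
const DEAD_ZONE = 0.08; // normalized rate below which input is ignored
const CURVE = 2.0;      // exponent of the non-linear response

function conditionGyro(raw, bias, prevSmoothed) {
  const unbiased = raw - bias;                                    // bias calibration
  const smoothed = ALPHA * unbiased + (1 - ALPHA) * prevSmoothed; // EMA
  const mag = Math.abs(smoothed);
  if (mag < DEAD_ZONE) return { rotation: 0, smoothed };          // tremors produce nothing
  // Non-linear curve: gentle tilts rotate slowly, firm tilts ramp up.
  const shaped = Math.pow((mag - DEAD_ZONE) / (1 - DEAD_ZONE), CURVE);
  return { rotation: Math.sign(smoothed) * shaped, smoothed };
}
```

Each frame feeds its `smoothed` value back in as `prevSmoothed`, so hand tremor never accumulates into motion while a sustained tilt ramps in smoothly.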

Mapping a controller to a planet.

The standard W3C mapping is modeled on the Xbox layout — 17 buttons and 4 axes. Every controller that reports mapping: "standard" follows this scheme, regardless of brand. That gives us a stable foundation: left stick for panning, right stick for camera rotation, triggers for zoom, face buttons for interaction.

But SoundTrip isn't a game. The controls need to feel like exploration, not combat. So the mapping is gentle — slow default speeds, acceleration curves that reward patience, and a dead zone wide enough that resting your thumb on the stick produces nothing. The map only moves when you mean it to.

A 2,777-line reference.

The specification document is complete — every API surface, every compatibility caveat, every browser quirk, every DualSense byte offset. It's not a tutorial. It's a reference we built for ourselves and decided to keep as part of the project archive.

Implementation into SoundTrip is next. The research is done. The mapping is defined. The driver is written. Now it needs to meet the map.

The map should work with a controller. Now we know exactly how.

More soon.

Your passport is not a profile. It's a notebook you carry.

The passport started as an identity file — encrypted, local, yours. Then we realized it could hold more than a name and a planet. It could hold your entire relationship with music. Here's what changed.

Save what matters. Forget the rest.

Every platform you've ever used saves everything. Every click, every pause, every scroll — logged, timestamped, profiled. Your listening history isn't yours, it's a dataset that works for someone else.

The SoundTrip passport does the opposite. It saves nothing unless you tell it to. Found an artist you want to share with a friend? Save it. Found the fastest route to a festival? Save it. Heard something at 2 AM on Mars that you can't explain but don't want to lose? Save it.

Everything else disappears when you close the tab. No history. No behavioral log. No ghost of your last session haunting your next one.

Each saved item is a block appended to the passport file. The structure grows with you, but it stays a single file. A few hundred bytes per entry. Thousands of saved items still weigh less than one photo on your phone.
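In shape, an append is as simple as this sketch suggests; the field names are illustrative, and the real passport file is encrypted on disk.

```javascript
// Sketch of the append-only passport structure: each save is one
// small block added to the entries list, nothing else is recorded.
function appendEntry(passport, type, payload) {
  const entry = { type, payload, savedAt: Date.now() };
  return { ...passport, entries: [...passport.entries, entry] };
}

// A saved item really is only a few hundred bytes of JSON:
const sizeOf = entry => JSON.stringify(entry).length;
```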

Bring your friends. Bring their passports.

We spent weeks thinking about how to build groups. WebSocket rooms, real-time sync, server-side state management, presence channels. The architecture was almost done. Then we threw it all away.

Not because it didn't work. Because it didn't fit.

A server that knows who's in a room is a server that stores relationships. A channel that syncs cursors is a channel that logs movement. Every real-time feature we designed required something we didn't want to build: an intermediary that sees what you do.

So we went back to something older. You sit next to your friend. You open SoundTrip on one device. You load your passport — and then you load theirs.

Two explorations, overlapping on the same map. The festivals they saved that you haven't seen. The route you found that they didn't know existed. The artist you both saved independently — there, visible as a convergence on the screen, without any algorithm suggesting it.

Three passports loaded? Three layers of curation, three perspectives on the same planet. The group exists on that device, in that moment, in that room. No server knows it happened. When you close the tab, it's gone — unless someone saved something new to their passport while they were there.

The group doesn't live on a server. It lives wherever your circle of friends decides to meet.

Split the headphone output. Navigate together. Decide where to go this summer while the map is right there between you. It's the same thing you did as a kid with a second controller — except now the game is real, and the levels are festivals you can actually attend.

Your listening history belongs to you. Now use it.

If you've been on Spotify for years, you have a history. Artists, tracks, play counts, timestamps — all of it exportable as a JSON dump that Spotify gives you because the law says they have to.

That export sits on your hard drive doing nothing. It's a dataset designed to be useful inside Spotify and awkward everywhere else. Until now.

We're building an import path. Take your Spotify export — or Apple Music, or any service that lets you pull your data — parse it, filter it, and load it into your passport. Not all of it. You choose what comes in. The artists that matter. The ones you want to see on the map.

Once they're in the passport, those artists become map data. SoundTrip cross-references them against festival lineups, open databases like MusicBrainz, and real-world event coordinates. The result: your personal listening history, overlaid on nine planets of live music. An artist you've listened to 300 times is playing 200 km from you in July — and you see it because the passport told the map, not because an algorithm decided to show you an ad.
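The cross-reference step can be sketched as a name match; the real pipeline would resolve artists through MusicBrainz identifiers rather than strings, and the function names here are ours.

```javascript
// Sketch: imported passport artists matched against festival lineups
// by normalized name. Illustrative only; ID-based matching is sturdier.
const norm = s => s.trim().toLowerCase();

function matchArtistsToFestivals(passportArtists, festivals) {
  const wanted = new Set(passportArtists.map(norm));
  return festivals
    .map(f => ({ ...f, matches: f.lineup.filter(a => wanted.has(norm(a))) }))
    .filter(f => f.matches.length > 0);
}
```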

The streaming service stays your streaming service. SoundTrip doesn't replace it, doesn't compete with it, doesn't even know which one you use. It just takes what was always yours and shows you what it connects to in the real world.

A file that grows with you.

The passport was an identity. Now it's becoming a companion — a single encrypted file that holds your name, your planet, your saved artists, your routes, your imported history, and whatever else you decide is worth keeping.

It weighs almost nothing. Even after years of use, with thousands of entries, it stays under a few megabytes. Smaller than a single song. Portable enough to move between devices with a drag and drop. No cloud. No sync service. No account recovery. If you lose it and forget the password, it's gone — like a notebook left on a train.

That's not a flaw. That's the point.

It weighs less than a song and holds more than a profile.

More soon.

You can hear the Moon now.

We built an audio engine that simulates what sound would actually do on the lunar surface. Not an approximation. Not a mood. A DSP chain derived from Apollo seismic data, NASA spacesuit pressure specs, and the physics of a vacuum.

The Helmet Audio Engine

When you enter a lunar event, the helmet activates automatically. You land — visor on, audio on. No toggle, no menu. You're there.

What you hear is synthesized in real time. No samples, no files. Every sound is generated procedurally from physics parameters: the breathing cycle inside a pressurized O₂ helmet at 29.6 kPa. The pink noise of a PLSS ventilation fan filtered through suit fabric. The sub-bass rumble of a seismic event propagating through regolith at 104 meters per second — a number measured by Apollo 14's active seismic experiment in 1971.

When you press Comms, a two-tone radio chirp cuts through static hiss. The typewriter text on your visor tells you about the ground beneath your boots. When you press Seismic, the map shakes, dust kicks up, and a 35 Hz oscillator decays for twelve seconds through a convolution reverb modeled on the Moon's famous "bell" effect — the phenomenon that made Apollo 12 scientists watch their instruments ring for 55 minutes after impact.
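The seismic layer needs only two Web Audio nodes. Web Audio's `exponentialRampToValueAtTime` follows v(t) = v0 · (v1/v0)^(t/T), shown as a pure function below; the start and end gains are illustrative, and the shipped engine adds the convolution reverb on top of this.

```javascript
// Sketch of the 35 Hz seismic rumble: a sine oscillator whose gain
// decays exponentially over 12 seconds, synthesized, not sampled.
function playRumble(ctx, freq = 35, decay = 12) {
  const osc = ctx.createOscillator();
  const gain = ctx.createGain();
  osc.type = 'sine';
  osc.frequency.value = freq;
  gain.gain.setValueAtTime(0.8, ctx.currentTime);
  gain.gain.exponentialRampToValueAtTime(0.001, ctx.currentTime + decay);
  osc.connect(gain).connect(ctx.destination);
  osc.start();
  osc.stop(ctx.currentTime + decay);
}

// The curve Web Audio applies: v(t) = v0 * (v1 / v0) ** (t / T).
function rampGainAt(t, T = 12, v0 = 0.8, v1 = 0.001) {
  return v0 * Math.pow(v1 / v0, t / T);
}
```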

Bring your own music.

The helmet has a player. You load your own tracks — one file, or twenty. They get routed through the lunar DSP chain: bandpass-filtered like they're playing from a small speaker inside your helmet, colored by the resonance of a pressurized cavity, mixed with the ambient breathing and suit noise that never stops.
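The "small speaker inside your helmet" coloration starts with a bandpass stage, roughly like this; the centre frequency and Q are illustrative, not the engine's real values, and the resonance and ambient layers would be chained after it.

```javascript
// Sketch: a user's track routed through a bandpass filter so it sounds
// like it's playing from a tiny speaker inside the helmet.
function routeThroughHelmet(ctx, audioEl) {
  const src = ctx.createMediaElementSource(audioEl);
  const band = ctx.createBiquadFilter();
  band.type = 'bandpass';
  band.frequency.value = 1200; // mid-band of a small speaker (illustrative)
  band.Q.value = 0.7;
  src.connect(band).connect(ctx.destination);
  return band;
}
```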

Your music on the Moon doesn't sound like your music on Earth. That's the point.

The player adapts to what you give it. One file, one play button. Three files, you get prev and next. No autoplay — you decide when the music starts. The helmet's ambient layer is the default experience. The music is yours to add.

Planetary events.

Copernicus crater is the first event with full audio support. 9.62°N, 20.08°W. Ninety-three kilometers across, three point eight kilometers deep. Five points of interest inside: Central Peaks, North Rim Terrace, South Wall Slump, Impact Melt Pool, Ray Origin Point. You fly in, the view locks, the helmet activates.

Seven more lunar events are placed at real coordinates — Mare Serenitatis, Tycho, Tranquillitatis, the far side, Aristarchus, Shackleton at the south pole, Taurus-Littrow where Apollo 17 walked for the last time. They're waiting for their data. They're waiting for better tiles.

Waiting for Artemis.

The current lunar tiles work for the world view. They don't work for zoom level 11 inside a crater. We know this. The architecture is ready — when Artemis missions return high-resolution orbital photography, we update one URL and the experience transforms. The DSP chain, the events, the POIs, the helmet — all of it is resolution-independent. We built it that way on purpose.

In the meantime, we're pulling imagery from NASA's public domain archive for the event popups. Lunar Orbiter 2's "Picture of the Century" for Copernicus. Apollo 17 surface photography for Taurus-Littrow. Hubble ultraviolet composites for Aristarchus. The images exist. They've been waiting fifty years for a reason to be seen again.

Passport required.

Planetary events are now gated behind the SoundTrip Passport. No account, no email, no server. Just a file encrypted on your device that proves you chose a home planet. Click a lunar event without a passport and you'll be asked to create one — it takes thirty seconds. With a passport loaded, you're in.

This isn't a paywall. It's a door. A door that asks one thing: where are you from?

What's next.

We're working on two things that will change how the passport feels in your hands.

The first is multi-passport import. Right now the passport is a single identity — one name, one planet. We're building the ability to load multiple passports on the same device, so you and your friends can sit together, each load your own file, and see your explorations overlap on the same map. The group doesn't live on a server. It lives between the people in the room.

The second is MusicBrainz integration. MusicBrainz is an open encyclopedia of music — artist data, release histories, genre taxonomies — all free and community-maintained. We're connecting it to SoundTrip so that every festival and every event carries richer artist information without relying on any commercial API. No Spotify dependency. No licensing middlemen. Just open data flowing into open exploration.

More soon.