The Problem
Most virtual pet games are menu-driven. You tap a button labeled "feed" or select "play" from an icon grid. The game tells you what each action does. There is nothing to discover.
The design challenge was to build a virtual pet where the gene-editing mechanic is implicit. CRISPR in biology is precise and learnable but not self-explanatory - you have to understand the rules before the system responds the way you want. The game should work the same way. The player has three buttons and one organism. The mutation is real, but the rule that triggers it is hidden. Discovery is the experience.
The computational constraint is real: an ESP32 with 520 KB of SRAM and a 128x64 OLED display running a single-threaded C++ loop. Every state the organism can be in - idle, alert, unstable, mutated - must be readable from that one screen without modes, menus, or context switches.
Approach
The organism has a state machine with four observable states. Idle is the baseline: the specimen moves with continuous ambient motion. Probe pressure (repeated button 16 presses) accumulates alert state and holds it. Adding disturb (button 17) while alert is active escalates the organism into unstable. Alert and unstable are visible on the OLED - not as labels but as motion and animation changes.
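The escalation path can be sketched as a small transition function. The state and event names here are illustrative, and the three-press alert threshold is an assumption (it mirrors the triple probe press in the integration test later in this piece), not the firmware's actual API:

```cpp
#include <cstdint>

// Illustrative sketch of the four-state organism machine.
enum OrganismState : uint8_t { IDLE, ALERT, UNSTABLE, MUTATED };
enum InputEvent : uint8_t { FEED, PROBE, DISTURB };

// Assumed: repeated probe presses accumulate before alert latches.
constexpr int PROBE_ALERT_THRESHOLD = 3;

OrganismState applyInput(OrganismState s, InputEvent e, int &probeCount) {
    switch (e) {
    case PROBE:
        // probe pressure accumulates; alert latches at the threshold
        if (s == IDLE && ++probeCount >= PROBE_ALERT_THRESHOLD) return ALERT;
        return s;
    case DISTURB:
        // disturb while alert is active escalates to unstable
        if (s == ALERT) return UNSTABLE;
        return s;
    case FEED:
    default:
        return s;
    }
}
```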
The mutation mechanic is the CRISPR layer. A specific input rhythm, feed (button 15) -> disturb -> feed, repeated twice inside documented timing windows, triggers echo_bloom. Near misses clear trigger memory. Waiting too long resets. When the rhythm completes, a discovery cue appears briefly on the OLED, then the mutation overlay settles. The mutation is persistent but temporary - it decays over time.
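The "persistent but temporary" behavior can be sketched as a timestamp plus a decay window checked once per frame. The 10-minute figure and the names are placeholders, not the shipped values:

```cpp
#include <cstdint>

// Assumed decay window for the echo_bloom overlay (placeholder value).
constexpr unsigned long ECHO_BLOOM_DECAY_MS = 10UL * 60UL * 1000UL;

struct MutationState {
    bool active = false;
    unsigned long triggeredAt = 0;
};

void triggerMutation(MutationState &m, unsigned long now) {
    m.active = true;
    m.triggeredAt = now;
}

// Called once per frame: the overlay stays visible until the window elapses.
bool mutationVisible(MutationState &m, unsigned long now) {
    if (m.active && now - m.triggeredAt >= ECHO_BLOOM_DECAY_MS) {
        m.active = false; // decay: the mutation fades on its own
    }
    return m.active;
}
```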
I structured the runtime around six observable channels: [button], [input], [dispatch], [organism], [render], and [runtime]. These emit structured output that the host verification suite reads. The same channels visible on the serial monitor during play are the ones the test harness asserts against. There is no separate debug mode.
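A sketch of what one channel line might look like, assuming a `[channel] key=value` serialization; the write-up shows the channel names but does not pin down the exact line format:

```cpp
#include <cstdio>
#include <string>

// Assumed structured-output format: one "[channel] key=value" line per
// event, the same text on the serial monitor and in the test harness.
std::string channelLine(const char *channel, const char *key, const char *value) {
    char buf[96];
    std::snprintf(buf, sizeof(buf), "[%s] %s=%s", channel, key, value);
    return std::string(buf);
}
```

Under that assumption, `channelLine("organism", "state", "alert")` yields `[organism] state=alert`, which a host-side parser can match with a single split on `] ` and `=`.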
Architecture
Three GPIO buttons feed an input dispatch layer. The dispatch layer drives organism state transitions. The organism state feeds a render layer that composes the single OLED frame.
GPIO 15 (feed) / GPIO 16 (probe) / GPIO 17 (disturb)
↓
Input Dispatch [input] → [dispatch]
↓
Organism State Machine [organism]
├─ idle → alert → unstable
└─ echo_bloom trigger check (rhythm + timing window)
↓
OLED Render [render] (128x64, single composed frame)
↓
Adafruit SSD1306 → Physical Display
EEPROM persistence saves organism state every 5 minutes. On boot, the last state is restored. The mutation history survives a power cycle.
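One way the persisted record could be laid out. Every field name here is an assumption — the write-up only states that organism state and mutation history survive a power cycle — and the checksum is a minimal guard against restoring a torn write:

```cpp
#include <cstdint>
#include <cstddef>
#include <cstring>

// Assumed save record written to EEPROM every 5 minutes and read on boot.
struct SaveRecord {
    uint8_t  version;        // bump when the layout changes
    uint8_t  organismState;  // idle / alert / unstable / mutated
    uint8_t  traitCount;     // accumulated mutation traits
    uint32_t mutationAgeMs;  // how far the echo_bloom decay has progressed
    uint16_t checksum;       // reject torn or stale writes on restore
};

// Additive checksum over the payload bytes (everything before checksum).
// Zero the struct before filling it so padding bytes are deterministic.
uint16_t recordChecksum(const SaveRecord &r) {
    const uint8_t *p = reinterpret_cast<const uint8_t *>(&r);
    uint16_t sum = 0;
    for (size_t i = 0; i < offsetof(SaveRecord, checksum); ++i) sum += p[i];
    return sum;
}

bool restoreValid(const SaveRecord &r) {
    return r.version == 1 && r.checksum == recordChecksum(r);
}
```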
Key Technical Details
Trait system defines the organism's behavioral parameters. Each trait modifies hunger decay rate and happiness decay rate. Traits accumulate from mutations and interact:
struct Trait {
  String name;           // internal identifier, e.g. "echo_bloom" (Arduino String)
  String displayName;    // human-readable name
  int hungerModifier;    // adjusts the hunger decay rate
  int happinessModifier; // adjusts the happiness decay rate
};
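How accumulated traits might combine into effective decay rates — the additive-modifier rule and the base-rate numbers are assumptions; the write-up only says traits modify the two rates and interact:

```cpp
#include <string>
#include <vector>

// Host-side sketch (std::string in place of Arduino String).
struct Trait {
    std::string name;
    int hungerModifier;     // assumed additive over the base hunger rate
    int happinessModifier;  // assumed additive over the base happiness rate
};

struct DecayRates { int hunger; int happiness; };

// Fold every accumulated trait's modifiers into the base rates.
DecayRates effectiveRates(const std::vector<Trait> &traits, DecayRates base) {
    for (const auto &t : traits) {
        base.hunger    += t.hungerModifier;
        base.happiness += t.happinessModifier;
    }
    return base;
}
```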
Echo_bloom timing gate enforces the mutation rhythm precisely. The trigger is the feed -> disturb -> feed sequence repeated twice (six presses total), each press inside a millisecond window. The implementation tracks the last trigger time and a phase index. A near miss (correct button, wrong window) resets the phase to zero. A timeout (correct phase, window expired) also resets:
bool checkEchoBloomTrigger(uint8_t button, unsigned long now) {
  if (now - lastTriggerTime > ECHO_BLOOM_WINDOW_MS) {
    triggerPhase = 0; // timeout clears progress
  }
  if (button == expectedPhaseButton[triggerPhase]) {
    triggerPhase++;
    lastTriggerTime = now;
  } else {
    triggerPhase = 0; // wrong button resets
  }
  if (triggerPhase >= ECHO_BLOOM_REQUIRED_PHASES) {
    triggerPhase = 0; // consume the sequence; also avoids indexing past the phase table
    return true;
  }
  return false;
}
Java integration test validates the assembled three-button loop in a host environment. The test drives button inputs programmatically and asserts organism state transitions via the observable channel output. This means M002 acceptance criteria can be verified without physical hardware:
@Test
void probeEscalatesOrganismToAlert() {
handheld.pressButton(PROBE);
handheld.pressButton(PROBE);
handheld.pressButton(PROBE);
assertThat(handheld.getOrganismState()).isEqualTo(ALERT);
}
Single-frame rendering is the design constraint that enforces gameplay coherence. The player is never navigating menus - they are always watching the same organism. Alert, unstable, idle motion, and mutation overlay all coexist on one 128x64 composition.
Three decisions shaped the design. C++ over MicroPython for real-time performance: the game loop must handle button debouncing, state transitions, animation timing, and OLED rendering in sequence without missing a frame. Single composed frame over multiple screens: mode switching would break the perturb-and-observe interaction model. Java test harness over C++ unit tests: authoring integration tests in Java is simpler and more expressive than equivalent assertions in C++ with no standard test runner on ESP32.
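The non-blocking debounce that decision protects can be sketched like this; the 20 ms stability window and the names are assumptions:

```cpp
#include <cstdint>

// Assumed stability window: a raw level must hold this long to register.
constexpr uint32_t DEBOUNCE_MS = 20;

struct Debounce {
    bool stableLevel = false;  // last accepted level
    bool lastRaw = false;      // most recent raw sample
    uint32_t lastChangeMs = 0; // when the raw level last flipped
};

// Sampled once per frame; returns true exactly once per accepted press
// (the rising edge), never blocking the loop while it waits.
bool debouncedPress(Debounce &d, bool raw, uint32_t now) {
    if (raw != d.lastRaw) {    // raw level flipped: restart the stability timer
        d.lastRaw = raw;
        d.lastChangeMs = now;
    }
    if (raw != d.stableLevel && now - d.lastChangeMs >= DEBOUNCE_MS) {
        d.stableLevel = raw;   // level held long enough: accept it
        return raw;            // report only the press edge, not the release
    }
    return false;
}
```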
Outcome
Built a working ESP32 handheld with a gene-editing mutation mechanic. The hidden echo_bloom rhythm triggers a persistent mutation overlay that coexists with the full specimen loop on one 128x64 OLED frame. M002 demo complete: all six observable channels confirmed, mutation mechanic verified reproducible.
ESP32 520 KB SRAM limits runtime state complexity. OLED 128x64 frame is the entire UI surface - all states must be readable from one composition. Single-threaded C++ loop must complete input, state, and render within one frame budget without blocking.
C++ over MicroPython: real-time frame timing requires deterministic execution. Single frame over mode switching: keeps the perturb-and-observe mechanic coherent. Java test harness over C++ unit tests: simpler integration test authoring, host-verifiable without hardware.