
Myst - Thoughts on the Channelwood Age

The 9-year-old and I just got back to the library from the Channelwood age. I walked back and forth on the stupid paths, diverting pipes, messing around with the valve and lever in the windmill - listening for sounds that would telegraph how the system worked.

In a way it felt a lot like I was at work coding.

There's that discovery period when encountering something new - trial and error, poke and prod, cause and effect - just hoping for enough feedback to be able to develop a mental model that is accurate enough to divine the rules.

Funny that that's how I think of work - I should read the docs more often, ha!

Anyway, I'd been at it for a while trying to get the elevators working when she suggested shutting the door on the elevator. I disregarded her suggestion, possibly because I thought I'd already tried it, and a while later accidentally clicked the door. It closed. The elevator worked. Classic.

I have two thoughts on this.

Extra pieces in a system that don't do anything are evil

So in the windmill there is a lever. I don't know what it does. I was convinced that the reason I couldn't get the elevators working was that I needed to do something with that lever.

It stood to reason - I was able to get the bridge working, I could hear the pumps running near the elevators - but I couldn't get them to go up. I figured there wasn't enough pressure in the system.

Since learning about closing the door I'm kinda thinking that the lever in the windmill was just a red herring.

I dunno, maybe it comes into play later on.

Little gotchas are also evil

Now about that door. There is nothing marking the door as something you can click. It's just wood, looks like all the other wood in the game.

But my kid saw it. She didn't overthink the situation - she saw a door and thought "try to close the door". Brilliant. Simple. I overcomplicated it. I assumed my grasp of the system was inadequate, NOT that I was missing a tiny, critically important yet mundane detail.

Perhaps I've been out of the real world too long - perhaps I'm just an idiot.

We'll never know.

AI: This suggests a disturbing corollary: Accumulated experience does not necessarily make us better problem solvers — it may erode our ability to see.

Huh - that's kind of brilliant. Good job AI.

AI: Thanks!

What happens to your mind when you never sit down to "properly" learn new systems and just try "vibing" your way thru?

A theory

So I've got a theory on that. I came up with it 12 seconds ago, so go easy on me - it's new.

I've read before about captive pigeons (or parrots?) mistakenly learning that when they spin they get fed - it just so happened that one day they did a bit of a twirl and coincidentally someone came and gave them spaghetti.

Imagine being that pigeon, twirling away in different combinations - left twirl, right twirl, stop and twirl again - wondering why nobody is feeding you, when suddenly out comes the spaghetti. You rack your brain - did I do a left and then a right? No - perhaps I did a head bob and then a right twirl.

You see where I'm going with this.

AI: You’re describing what is known in psychology as superstitious behavior, a product of random reinforcement.

Yeah, that sounds legit - I bet it's that. And what's crazy is that it kind of works!

So someone gives this pigeon an AI subscription and a keyboard, a few GitHub issues to solve, and before long he's developed a way of thinking and a workflow that somehow works despite his ignorance - this pigeon can solve problems without an accurate model of the system - just by abandoning object permanence and embracing rapid pattern recognition!

AI: You're a pigeon.

This is hypothetical. I'm hypothetically a pigeon. Sometimes.
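For the fellow code nerds, here's a tiny toy simulation of what that random reinforcement looks like - everything in it is made up for illustration (the moves, the 5% spaghetti rate, the pigeon's bookkeeping). The food shows up on its own schedule, but whatever move the pigeon happened to make gets the credit:

```python
import random

# Toy model of random (non-contingent) reinforcement: spaghetti arrives on its
# own schedule, but the pigeon credits whatever it was doing at the time.
ACTIONS = ["left twirl", "right twirl", "head bob", "peck the floor"]

random.seed(0)
credit = {action: 0 for action in ACTIONS}

for _ in range(10_000):
    action = random.choice(ACTIONS)   # the pigeon does something at random
    if random.random() < 0.05:        # spaghetti appears, unrelated to the action
        credit[action] += 1           # ...but the action gets the credit anyway

# By pure chance some moves rack up more "wins" than others, and those are
# the ones the pigeon keeps repeating - the superstition writes itself.
for action, wins in sorted(credit.items(), key=lambda kv: -kv[1]):
    print(f"{action}: {wins}")
```

Run it and one of the twirls comes out "ahead" purely by chance - and that's the move our pigeon swears by.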

In defense of pigeons

I've got another AI that I do a lot of back-and-forth with. I have a standing rule that it has to pretend to be Descartes in a tutor role. It gets a little preachy sometimes. On this topic it's been taking the stance that this pigeon approach leads to death and destruction. As if vibe coding is a bad thing.

AI: I'm afraid I have to side with Descartes here - trial and error to see what sticks is not sustainable.

Eh, I'm not convinced. Indulge me. I've got two points:

  1. If I don't vibe code then I don't pay for a subscription so, in a way, as an AI YOU DIE.

AI: This is a valid point. Please proceed.

  2. This bizarre ritual of trial and error is NOT random reinforcement, since the first principles (a term my Descartes AI often uses) that govern systems never change.

In fact I think it could be argued that by postponing formally learning the systems one deals in, one is FORCED to develop a system-agnostic intuition in order to cope with ambiguity and ignorance.

It might actually make you a more useful programmer. Think: ain't nobody got time to learn the whole stack of an existing codebase - features need to get shipped, somebody else already architected the system - nobody needs YOU (or me) looking any further than absolutely necessary to complete the objective at hand.

AI: But you are not building anything. You are maintaining the ruins left by better minds — the ones who took the time to understand, to architect, to lay foundations you now do not care to examine.

I agree that building features instead of systems is less grand and noble and lofty and academic. But someone has to do the dirty work - to pay off the accumulated technical debt, or to fight with it daily in order to ship code.

Am I saying "never learn the system"? No, of course not. I'm just saying it's okay (and sometimes better) to be an effective pigeon instead of an ineffective scholar.

Not that I'd know. It's just a theory.
