Discussion about this post

D. Schmudde:

Perhaps a slightly more optimistic view from a systems perspective.

Computer science is the discipline of breaking down problems into smaller and smaller parts. Each level of abstraction has its own specialty but someone needs to know how to combine these parts. And someone, in turn, combines *those* parts. Turtles all the way up.

Within each level, there are small details and large inter-dependencies. So maybe each level of abstraction needs different kinds of minds (and time frames) to become robust?

Kristin:

This is right on, as is D. Schmudde's comment.

Code suffers from a sort of entropic decay, but not all systems/software decays at the same rate. In the churn of pursuing product/market fit, that decay rate is devastatingly high unless the system is designed for iteration (and then is merely "high").

Before I did much coding myself, I was dismayed at how often developers were "refactoring" – not just old code but code they had written 2-3 months ago! But due to a weird career arc that led to me coding, I now find myself refactoring constantly. Always Be Refactoring! But there is a "right way" to refactor when the rapid iterations just keep on coming. We can distill and generalize down to the right set of primitives with the right affordances and interfaces – legos, in a sense – with which we can build and modify quickly. Moving legos creates far less entropy than moving around the lower-level constructs.
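(A minimal sketch of the "legos" idea, with hypothetical names: small primitives sharing one interface, so rearranging them is a one-line edit rather than surgery on lower-level code.)

```python
from typing import Callable

# A "lego": any function that takes a record dict and returns a record dict.
Step = Callable[[dict], dict]

def pipeline(*steps: Step) -> Step:
    """Compose legos into one step; reordering them is a one-line change."""
    def run(record: dict) -> dict:
        for step in steps:
            record = step(record)
        return record
    return run

# Two primitives built against the same interface.
def normalize_email(record: dict) -> dict:
    return {**record, "email": record["email"].strip().lower()}

def add_domain(record: dict) -> dict:
    return {**record, "domain": record["email"].split("@")[1]}

process = pipeline(normalize_email, add_domain)
print(process({"email": "  Ada@Example.COM "}))
# → {'email': 'ada@example.com', 'domain': 'example.com'}
```

When requirements churn, the refactor is usually confined to one small primitive or to the composition line, which is what keeps the entropy low.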

