Each week, I share one insight. One piece of wisdom. One question to reflect on. (And a little lagniappe.)

Insight

An illusionist is not a sorcerer. Pulling a rabbit from a hat can feel like real magic, but it isn't. A thinking person knows we are being fooled; that is part of the fun. The illusionist knows that we know, even if we can't quite figure out how the trick works. But no matter how clever the illusion, it will never become "real" magic. No matter how skilled the illusionist becomes, they cannot truly materialize a rabbit from thin air.

This is where we are with AI. We are told that if the model gets just a little bit bigger, or the training gets just a little bit better, our sophisticated "next most likely token prediction machine" will transcend the limits of its training data and inference and surpass the most deeply creative work of all humanity. An illusionist who sells you on the idea that they are a sorcerer is either delusional or a crook.

Wisdom

"The first principle is that you must not fool yourself, and you are the easiest person to fool." — Richard Feynman, Caltech commencement address, 1974

Reflection

What assumption are we making right now that we haven't tested?

Lagniappe