From what you highlighted in the Cheeky Chesky piece, I reckon you’ll like this from Vaughn Tan, George: https://vaughntan.org/unpacking-boris
I have some critique of the approach if you’re doing complex adaptive discovery work (where alignment can be a weakness), but I can see it working well in a more ordered, predictable, bureaucratic situation (where alignment is kind of a prerequisite for anyone to get anything done).
Thanks for sharing, Tom; you always find these niche articles. Why do you think alignment can be a weakness when doing complex adaptive discovery work? The way I understand it, Airbnb has very much found product-market fit, and its main job now is to keep that fit; it just needs to be more deliberate and not sacrifice quality along the way.
Also, re "you always find these niche articles": when the mainstream dogma is failing, the niches are where we'll find better ways xx
Made me re-read that BORIS article; going to try it out. Loved it. Thanks again for sharing, Tom! Keep them coming, please.
Fascinating question, George.
I think you’ve nailed it with Airbnb. They have a clear idea of what they do and what outcomes they support/chase. They have distribution sorted. And they also have the capital to take bigger swings. (This is very different from their early days, when they ran hundreds of different experiments as fast as possible.)
When working with a novel startup (or feature), I think we mostly accept that we can’t know enough about how it’s going to work upfront to be able to specify all the details.
But in reality the lack of knowledge goes even further than that. Frequently we can't even know enough upfront to specify which outcome(s) we should be chasing. There are too many patterns, degrees of freedom, and possible paths. Some outcomes we would like are simply not reachable from where we are. Others would be great next steps on the journey, but we never consider them.
When I talk about alignment being a weakness, I'm referring to situations where we insist on agreeing on a firm goal, vision, OKR, etc., but that agreement is premature convergence.
Put another way, the risk of focus is that you miss everything that's outside of the focal point. If you can see the target clearly and the way there is safe, that's going to be OK. But most product work is more like wandering about in thick fog looking for a target you heard about on someone's blog.