New tools continue to radically improve workflows. Hugging Face is horrible but a whole lot better than what existed before. The structure of a PyTorch Lightning repo is a night and day improvement over one written in pure torch, and it provides a lot of tooling around reproducibility that heavily overlaps with MLOps. Dagster made testing pipelines far easier in the data engineering world.

Good ML packages (and their documentation and communities) are the most efficient way to nudge users toward a better engineering and ML culture and set of processes. I find that the quality of the code I write and the systems I build correlates most closely with the quality of the core APIs I build on top of, not with deadlines or culture. Good systems evolve naturally, bottom up, when they're easy and fast to implement, and better tooling buys you the time (and budget: hire more business-savvy people, fewer raw tech people) to think about the fit between the business problem and the technology solution. Tooling can help all the way to the business value: systems that link the quality of predictions to financial fundamentals help evaluate whether the project makes sense in the first place.

The sheer volume of overlapping tooling feels silly, but that's just free-market capitalism, not the wrong focus. These arguments imply that there is some plateau in the fundamental underlying technologies and their designs, which couldn't be further from the truth.

I agree with the main thesis, but disagree with the dismissal of tooling.
