Two Years In: Reflecting on GenAI / LLM / MLOps Open-Source Development
I went on a bit of an Amazon tear, and it felt right to put some physical, real-world textbooks covering LLMs, GenAI, and RAG pipelines on the shelf (photo below).
Reflecting on the two years since GenAI and GPTs took center stage, I've never been this close to the bleeding edge of open-source development: pulling code literally minutes after commits are pushed, and watching that rapid iteration unfold in real time over an extended period.
On the one hand, it's incredible to be part of this immediate feedback loop between development and community usage, and it really highlights how far software collaboration has evolved.
On the other hand, it's incredibly frustrating to finally get code running, only to have it break unexpectedly right before a team demo. Not because you changed anything at all, but because the rug got pulled out from under you in source control: the community decided to refactor or re-engineer core functionality. Justifiably, of course; the refactor is better. So, back to the starting blocks it is.
The contrast between traditional software development cycles and the near-instantaneous pace of modern open-source development across cloud data platforms is remarkable. The ground covered in two years' time, the steady stream of mind-blowing aha moments, and the progress in functionality are unlike anything else I've been a part of.
Better buckle up and learn the moves, or hop off the ride. AI isn't going to replace you in your life or work, but a human who knows how to maximize their output with AI certainly will.
My current MLOps stack and top GenAI/ML tools, the ones that emerged on top after two years in the trenches, are pinned at https://lnkd.in/eXy76MdG