I am taking some time off, primarily to recharge, refocus, and spend some high quality time with my family. After that, I’ll be ready to move fast and break things. 😀 2/2
but it’s good to remember that incremental improvements on the ImageNet problem have led us to superhuman vision that we now take for granted. It’s not unreasonable to suppose that something similar might play out with LLMs and superhuman intelligence. 3/3
focused everyone’s attention and resources on LLMs. People are now starting to get impatient and frustrated with incremental-seeming improvements over the past year (at best!), 2/3
or any other topic that interests you and I might be the relevant person for it. Link to my official topic in this year’s GTC catalog: www.nvidia.com/gtc/session-...
The focus of my talk will be on some more advanced and nuanced topics, such as multi-GPU training and deployment, Shapley values, and unsupervised learning, but I also plan to have a fairly long Q&A session where I answer many other questions that you might have either about XGBoost, 2/3
The pdf is freely available online: fleuret.org/public/lbdl....
I might have a fleeting awareness of this subject.
I got your back bro. Just don’t get fired from Bluesky. I don’t think I’d be able to handle that much cruelty in this racket.
If Waterloo is anything like Indiana those don’t come too often. So try to enjoy it!