Thomas Vitale
@thomasvitale.com
Software Engineer 📚 Author of “Cloud Native Spring in Action” 🛳️ CNCF Ambassador 🎙️ International Speaker 👨‍💻 Open Source. I write about Java, Cloud Native, Kubernetes, Security, Continuous Delivery, and Platform Engineering.
Thomas Vitale @thomasvitale.com

4. Ollama is set to Assess. It's an open-source tool to run and manage LLMs in local environments, useful for development, for testing (check out the Testcontainers Ollama module), and for running inference services on-prem. www.thoughtworks.com/radar/tools/...

Ollama | Technology Radar | Thoughtworks

Ollama is an open-source tool for running and managing large language models (LLMs) on your local machine. Previously, we talked about the benefits of self-hosted [...]
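A minimal sketch of the Testcontainers angle mentioned above, assuming the Ollama module (org.testcontainers:ollama) is on the test classpath; the image tag and the test class are illustrative assumptions, not something from the post:

    // Starts a disposable Ollama container for a test, then builds the base URL of its HTTP API.
    import org.testcontainers.ollama.OllamaContainer;

    class OllamaSmokeTest {

        void startLocalOllama() {
            // Image tag is an assumption for the example.
            try (OllamaContainer ollama = new OllamaContainer("ollama/ollama:latest")) {
                ollama.start();
                // Ollama's API listens on port 11434 inside the container;
                // Testcontainers maps it to a random free port on the host.
                String baseUrl = "http://" + ollama.getHost() + ":" + ollama.getMappedPort(11434);
                // From here you could call the Ollama REST API, e.g. POST {baseUrl}/api/generate.
                System.out.println("Ollama available at " + baseUrl);
            }
        }
    }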


Thomas Vitale @thomasvitale.com

5. Overenthusiastic LLM Use is put on Hold. LLMs might not be the right solution to some problems. For example, some sentiment analysis and classification problems "can be solved more cheaply and easily using traditional natural language processing (NLP)". www.thoughtworks.com/radar/techni...

Overenthusiastic LLM use | Technology Radar | Thoughtworks

In the rush to leverage the latest in AI, many organizations are quickly adopting large language models (LLMs) for a variety of applications, from content [...]
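To make the "traditional NLP" comparison concrete, here is a toy lexicon-based sentiment check in plain Java; it is entirely illustrative (the word lists and class name are invented for the example), but it shows the kind of cheap, model-free approach the Radar entry alludes to:

    // A toy lexicon-based sentiment classifier: no model calls, just word counting.
    import java.util.Set;

    class SimpleSentiment {

        // Invented word lists, far too small for real use.
        private static final Set<String> POSITIVE = Set.of("good", "great", "love", "excellent");
        private static final Set<String> NEGATIVE = Set.of("bad", "poor", "hate", "terrible");

        static String classify(String text) {
            int score = 0;
            for (String token : text.toLowerCase().split("\\W+")) {
                if (POSITIVE.contains(token)) score++;
                if (NEGATIVE.contains(token)) score--;
            }
            return score > 0 ? "positive" : score < 0 ? "negative" : "neutral";
        }

        public static void main(String[] args) {
            System.out.println(classify("I love this, it works great")); // prints "positive"
        }
    }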
