Latest posts
Building local LLM AI-Powered Applications with Quarkus, Ollama and Testcontainers
Traditionally, many AI-powered applications rely on cloud-based APIs or centralized services for model hosting and execution. While this approach has its advantages, such as scalability and ease of use, it also introduces challenges around latency, data privacy, and dependency on …
Langchain4J Musings
I’m coming relatively late to the LLM party, but then I rarely arrive early in the hype cycle. For example, I never bought into blockchain, the solution still searching for problems to solve, nor into microservices, the latest in the …