Title: Understanding Jlama in Quarkus
Short Description:
Jake Luciani and Mario Fusco join us to explain the benefits of LLM inference in the JVM for Java applications, discuss Jlama's pros and cons, and show how to create a Java LLM-infused application using Quarkus, LangChain4j, and Jlama.
Longer Description:
Jake Luciani and Mario Fusco join us to explain the benefits of performing LLM inference directly in the same JVM as your Java application. They will discuss Jlama's technical aspects, its pros and cons, ongoing work, and future improvements. They will also provide a practical example of how Quarkus, LangChain4j, and Jlama simplify creating a pure Java LLM-infused application.
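To give a feel for what such an application looks like, here is a minimal sketch of an AI service backed by an in-JVM Jlama model via the quarkus-langchain4j-jlama extension; the interface name and prompts are illustrative assumptions, not code from the episode.

```java
import dev.langchain4j.service.SystemMessage;
import io.quarkiverse.langchain4j.RegisterAiService;

// Quarkus generates the implementation of this interface at build time and
// routes calls to the configured chat model - here a Jlama model running
// inside the same JVM, so no external inference server is involved.
@RegisterAiService
public interface ChatAssistant {

    // The single String parameter is sent as the user message.
    @SystemMessage("You are a concise assistant for Java developers.")
    String chat(String question);
}
```

The model itself is selected in application.properties through the extension's configuration (typically a quantized model pulled from Hugging Face); the exact property names depend on the extension version.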
Guests: Mario Fusco & Jake Luciani
Twitter: @mariofusco, @tjake