This is a standalone configuration type; its prefix from the configuration root is `langchain4j.jlama.chat-model`.

Configuration options
| key | type | default value | description |
|---|---|---|---|
| `auth-token` | string | | Generated from `dev.langchain4j.model.jlama.JlamaChatModel.JlamaChatModelBuilder.authToken(java.lang.String)` |
| | boolean | | If set to |
| `max-tokens` | int | | Generated from `dev.langchain4j.model.jlama.JlamaChatModel.JlamaChatModelBuilder.maxTokens(java.lang.Integer)` |
| `model-cache-path` | Path | | Generated from `dev.langchain4j.model.jlama.JlamaChatModel.JlamaChatModelBuilder.modelCachePath(java.nio.file.Path)` |
| `quantize-model-at-runtime` | boolean | | Generated from `dev.langchain4j.model.jlama.JlamaChatModel.JlamaChatModelBuilder.quantizeModelAtRuntime(java.lang.Boolean)` |
| `temperature` | float | | Generated from `dev.langchain4j.model.jlama.JlamaChatModel.JlamaChatModelBuilder.temperature(java.lang.Float)` |
| `thread-count` | int | | Generated from `dev.langchain4j.model.jlama.JlamaChatModel.JlamaChatModelBuilder.threadCount(java.lang.Integer)` |
| `working-directory` | Path | | Generated from `dev.langchain4j.model.jlama.JlamaChatModel.JlamaChatModelBuilder.workingDirectory(java.nio.file.Path)` |
| `working-quantized-type` | DType (`BOOL`, `U8`, `I8`, `I16`, `U16`, `F16`, `BF16`, `I32`, `U32`, `F32`, `F64`, `I64`, `U64`, `Q4`, `Q5`) | | Generated from `dev.langchain4j.model.jlama.JlamaChatModel.JlamaChatModelBuilder.workingQuantizedType(com.github.tjake.jlama.safetensors.DType)` |
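Putting the prefix and the generated keys together, a configuration file might look like the sketch below. This is illustrative only: the key names are assumed to follow the usual relaxed-binding convention of kebab-casing the builder setter names under the documented prefix, and every value shown is a placeholder, not a default.

```properties
# Illustrative sketch, assuming kebab-case keys derived from the builder
# setters under the langchain4j.jlama.chat-model prefix. Values are
# placeholders, not documented defaults.
langchain4j.jlama.chat-model.auth-token=hf_your_token_here
langchain4j.jlama.chat-model.max-tokens=512
langchain4j.jlama.chat-model.model-cache-path=/opt/jlama/models
langchain4j.jlama.chat-model.quantize-model-at-runtime=true
langchain4j.jlama.chat-model.temperature=0.7
langchain4j.jlama.chat-model.thread-count=4
langchain4j.jlama.chat-model.working-directory=/opt/jlama/work
langchain4j.jlama.chat-model.working-quantized-type=Q4
```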