llama.rn / LlamaContext
- applyLoraAdapters
- bench
- completion
- detokenize
- embedding
- getFormattedChat
- getLoadedLoraAdapters
- loadSession
- release
- removeLoraAdapters
- saveSession
- stopCompletion
- tokenize

• new LlamaContext(«destructured»)

Name | Type |
---|---|
«destructured» | NativeLlamaContext |

• gpu: boolean = false

• id: number

• model: Object = {}

Name | Type |
---|---|
isChatTemplateSupported? | boolean |

• reasonNoGPU: string = ''

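In practice a LlamaContext is usually obtained from the library's initLlama helper rather than constructed by hand. A minimal sketch, assuming initLlama and the model / n_ctx / n_gpu_layers init options, none of which are documented on this page:

```ts
import { initLlama, LlamaContext } from 'llama.rn'

async function createContext(): Promise<LlamaContext> {
  // Model path is a placeholder; point it at a GGUF file present on the device.
  const context = await initLlama({
    model: '/path/to/model.gguf',
    n_ctx: 2048,      // assumed option: context window size
    n_gpu_layers: 99, // assumed option: layers to offload to GPU where supported
  })

  console.log('id:', context.id)
  console.log('gpu:', context.gpu, 'reasonNoGPU:', context.reasonNoGPU)
  console.log('chat template supported:', context.model.isChatTemplateSupported)
  return context
}
```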
▸ applyLoraAdapters(loraList): Promise<void>

Name | Type |
---|---|
loraList | { path: string; scaled?: number }[] |

Returns: Promise<void>

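A usage sketch; the adapter path is a placeholder, and scaled is the optional scale factor from the parameter type above:

```ts
import type { LlamaContext } from 'llama.rn'
declare const context: LlamaContext // initialized elsewhere (see constructor sketch)

// Apply one LoRA adapter; the path is a placeholder and 0.5 is an arbitrary scale.
await context.applyLoraAdapters([
  { path: '/path/to/lora-adapter.gguf', scaled: 0.5 },
])
```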
▸ bench(pp, tg, pl, nr): Promise<BenchResult>

Name | Type |
---|---|
pp | number |
tg | number |
pl | number |
nr | number |

Returns: Promise<BenchResult>

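The four numeric arguments appear to follow llama.cpp's bench conventions (prompt-processing tokens, text-generation tokens, parallel sequences, repetitions); that reading is an assumption, not something this page states:

```ts
import type { LlamaContext } from 'llama.rn'
declare const context: LlamaContext // initialized elsewhere (see constructor sketch)

// 512 prompt tokens, 128 generated tokens, 1 parallel sequence, 3 repetitions (assumed meanings).
const bench = await context.bench(512, 128, 1, 3)
console.log(bench) // BenchResult; exact field names depend on the library version
```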
▸ completion(params, callback?): Promise<NativeCompletionResult>

Name | Type |
---|---|
params | CompletionParams |
callback? | (data: TokenData) => void |

Returns: Promise<NativeCompletionResult>

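A streaming sketch. The CompletionParams fields (prompt, n_predict, temperature, stop) and the token field on TokenData are assumptions based on llama.cpp-style naming, not definitions given on this page:

```ts
import type { LlamaContext } from 'llama.rn'
declare const context: LlamaContext // initialized elsewhere (see constructor sketch)

let streamed = ''
const result = await context.completion(
  {
    prompt: 'Q: What is the capital of France?\nA:', // assumed CompletionParams fields
    n_predict: 64,
    temperature: 0.7,
    stop: ['\n'],
  },
  (data) => {
    streamed += data.token // assumed TokenData field
  },
)
console.log('streamed:', streamed)
console.log('final text:', result.text) // assumed NativeCompletionResult field
```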
▸ detokenize(tokens): Promise<string>

Name | Type |
---|---|
tokens | number[] |

Returns: Promise<string>

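A sketch; the token ids are placeholders and only make sense for a particular model's vocabulary:

```ts
import type { LlamaContext } from 'llama.rn'
declare const context: LlamaContext // initialized elsewhere (see constructor sketch)

// Token ids are placeholders; real ids come from tokenize() for the loaded model.
const text = await context.detokenize([1, 15043, 3186])
console.log(text)
```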
▸ embedding(text, params?): Promise<NativeEmbeddingResult>

Name | Type |
---|---|
text | string |
params? | NativeEmbeddingParams |

Returns: Promise<NativeEmbeddingResult>

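A sketch assuming the result exposes the vector as an embedding array (the NativeEmbeddingResult shape is not spelled out here) and that the context was created with embeddings enabled:

```ts
import type { LlamaContext } from 'llama.rn'
declare const context: LlamaContext // initialized with embedding support (assumed requirement)

const result = await context.embedding('The quick brown fox')
console.log('dimensions:', result.embedding.length) // `embedding` field is assumed
```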
▸ getFormattedChat(messages, template?): Promise<string>

Name | Type |
---|---|
messages | RNLlamaOAICompatibleMessage[] |
template? | string |

Returns: Promise<string>

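A sketch; role/content follow the OpenAI-compatible message shape implied by the RNLlamaOAICompatibleMessage type, and omitting template is assumed to fall back to the model's built-in chat template:

```ts
import type { LlamaContext } from 'llama.rn'
declare const context: LlamaContext // initialized elsewhere (see constructor sketch)

const prompt = await context.getFormattedChat([
  { role: 'system', content: 'You are a helpful assistant.' },
  { role: 'user', content: 'Hello!' },
]) // no template argument: assumed to use the model's own chat template
console.log(prompt)
```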
▸ getLoadedLoraAdapters(): Promise<{ path: string; scaled?: number }[]>

Returns: Promise<{ path: string; scaled?: number }[]>

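A sketch that lists whatever adapters are currently applied:

```ts
import type { LlamaContext } from 'llama.rn'
declare const context: LlamaContext // initialized elsewhere (see constructor sketch)

const adapters = await context.getLoadedLoraAdapters()
for (const { path, scaled } of adapters) {
  console.log(`adapter: ${path} (scale: ${scaled ?? 'default'})`)
}
```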
▸ loadSession(filepath): Promise<NativeSessionLoadResult>

Load cached prompt & completion state from a file.

Name | Type |
---|---|
filepath | string |

Returns: Promise<NativeSessionLoadResult>

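A sketch; the file path is a placeholder and the shape of NativeSessionLoadResult is not documented on this page:

```ts
import type { LlamaContext } from 'llama.rn'
declare const context: LlamaContext // initialized elsewhere (see constructor sketch)

// Restore prompt/completion state previously written by saveSession() (path is a placeholder).
const loaded = await context.loadSession('/path/to/session.bin')
console.log(loaded) // NativeSessionLoadResult; exact fields depend on the library version
```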
▸ release(): Promise<void>

Returns: Promise<void>

▸ removeLoraAdapters(): Promise<void>

Returns: Promise<void>

▸ saveSession(filepath, options?): Promise<number>

Save current cached prompt & completion state to a file.

Name | Type |
---|---|
filepath | string |
options? | Object |
options.tokenSize | number |

Returns: Promise<number>

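A sketch; the path is a placeholder, and treating tokenSize as a cap on how many tokens get written is an assumption:

```ts
import type { LlamaContext } from 'llama.rn'
declare const context: LlamaContext // initialized elsewhere (see constructor sketch)

// Persist the current state; 1024 is an arbitrary tokenSize chosen for illustration.
const tokensSaved = await context.saveSession('/path/to/session.bin', { tokenSize: 1024 })
console.log('tokens saved:', tokensSaved) // return value assumed to be a token count
```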
▸ stopCompletion(): Promise<void>

Returns: Promise<void>

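A cancellation sketch: start a completion without awaiting it, then stop it from a timer. That the pending promise resolves with whatever was generated before the stop is an assumption:

```ts
import type { LlamaContext } from 'llama.rn'
declare const context: LlamaContext // initialized elsewhere (see constructor sketch)

// Kick off a long completion without awaiting it (`prompt` is an assumed field).
const pending = context.completion({ prompt: 'Write a very long story about a lighthouse.' })

// Cancel after two seconds.
setTimeout(() => {
  void context.stopCompletion()
}, 2000)

const partial = await pending // assumed to resolve with the text generated so far
console.log(partial.text)
```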
▸ tokenize(text): Promise<NativeTokenizeResult>

Name | Type |
---|---|
text | string |

Returns: Promise<NativeTokenizeResult>

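A sketch assuming NativeTokenizeResult exposes the ids as a tokens array:

```ts
import type { LlamaContext } from 'llama.rn'
declare const context: LlamaContext // initialized elsewhere (see constructor sketch)

const result = await context.tokenize('Hello, world!')
console.log(result.tokens) // `tokens` field is assumed
```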