package kaun
Flax-inspired neural network library for OCaml
Sources: raven-1.0.0.alpha2.tbz
sha256=93abc49d075a1754442ccf495645bc4fdc83e4c66391ec8aca8fa15d2b4f44d2
sha512=5eb958c51f30ae46abded4c96f48d1825f79c7ce03f975f9a6237cdfed0d62c0b4a0774296694def391573d849d1f869919c49008acffca95946b818ad325f6f
Module GPT2.Tokenizer
GPT-2 tokenizer instance with BPE
Create a BPE tokenizer for GPT-2. Either provide vocab_file and merges_file paths, or a model_id to download from HuggingFace (defaults to gpt2)
Encode text directly to input tensors ready for a forward pass
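A minimal usage sketch follows. The constructor and single-text encoder names (create, encode) and their argument labels are assumptions inferred from the descriptions above, not taken from documented signatures:

(* Sketch only: [create], [encode], and the argument labels below are
   hypothetical names inferred from the descriptions above. *)
open Kaun_models.GPT2

(* Download the default "gpt2" vocabulary and merges from HuggingFace. *)
let tokenizer = Tokenizer.create ()

(* Or point at local BPE files instead of downloading. *)
let local_tokenizer =
  Tokenizer.create ~vocab_file:"vocab.json" ~merges_file:"merges.txt" ()

(* Encode a single prompt into an int32 tensor for the forward pass. *)
let input_ids = Tokenizer.encode tokenizer "Hello, OCaml!"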
val encode_batch :
t ->
?max_length:int ->
?padding:bool ->
string list ->
(int32, Rune.int32_elt) Rune.t

Encode multiple texts with optional padding
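An illustrative call, reusing the tokenizer value from the sketch above (that value's construction is assumed; only the encode_batch signature is documented here):

(* Encode two prompts into a single padded int32 batch tensor.
   [tokenizer] comes from the hypothetical creation sketch above. *)
let batch : (int32, Rune.int32_elt) Rune.t =
  Tokenizer.encode_batch tokenizer ~max_length:16 ~padding:true
    [ "Hello, world!"; "Kaun is a Flax-inspired library for OCaml." ]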