tokenizer
Projects with this topic
Tools and thoughts for constructing a learnable tokenizer, that is, the module that cuts a string (say, some text) into sub-strings (say, some words). The goal is to work in (possibly very) noisy environments, for instance on the output of an optical character recognition task, or for specific applications / expert-domain speech recognition. A rough sketch of the task follows below.
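As an illustration of the task only (not of the project's learnable approach), a plain regex baseline in Python shows what "cutting a string into sub-strings" means; the function name `naive_tokenize` is made up for this sketch:

```python
# Minimal sketch of the tokenization task: cut a raw string into
# sub-strings ("tokens"). This is a plain regex baseline, NOT the
# learnable tokenizer developed in the project.
import re

def naive_tokenize(text: str) -> list[str]:
    # Keep runs of word characters, and treat any other non-space
    # symbol as its own token. A learnable tokenizer would instead
    # infer the cut points from data, which matters for noisy input
    # such as OCR output.
    return re.findall(r"\w+|[^\w\s]", text)

print(naive_tokenize("0CR 0utput, s0mewhat n0isy..."))
# ['0CR', '0utput', ',', 's0mewhat', 'n0isy', '.', '.', '.']
```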
Extract parts of a string in a versatile way, without destroying information from the parent string. Allows discontinuous parts of a string to be collected as an ExtractionString. Allows several string-splitting strategies to be applied to the same string at once. See the sketch below for the underlying idea.
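To illustrate the underlying idea (keeping the parent string plus the extracted spans, rather than copying sub-strings out of it), here is a toy Python sketch; the class `SpanExtraction` and its fields are hypothetical and do not reflect the actual ExtractionString API:

```python
# Illustrative sketch only: a toy data structure storing the parent
# string together with a list of (start, stop) spans, so discontinuous
# parts can be extracted without losing the original context.
from dataclasses import dataclass

@dataclass
class SpanExtraction:      # hypothetical name, not the library's class
    parent: str
    spans: list            # list of (start, stop) index pairs

    def text(self) -> str:
        # Concatenate the selected (possibly discontinuous) parts.
        return "".join(self.parent[a:b] for a, b in self.spans)

parent = "The quick brown fox"
part = SpanExtraction(parent, [(0, 3), (10, 15)])   # "The" + "brown"
print(part.text())    # 'Thebrown'
print(part.parent)    # the full original string remains available
```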
Deprecated since September 2022. See https://framagit.org/nlp/extractionstring for improved tools to extract any sub-string from a parent string without losing information from the parent string.