This endpoint converts strings into tokens. It is a simple proxy that forwards your requests to the desired model. Every LightOn model is deployed on a vLLM-based image.
Supported Input:
- prompt: a simple text string to tokenize
- messages: an array of chat messages to tokenize (alternative to prompt)

Authentication: a Bearer authentication header of the form Bearer <token>, where <token> is your auth token.
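As a minimal sketch, the request below tokenizes a prompt using Bearer authentication. The endpoint URL, the model identifier, and the `model` body field are assumptions for illustration only; the `prompt` and `messages` fields follow the descriptions above.

```python
import requests

# Assumed values for illustration; replace with your deployment's URL, model, and token.
API_URL = "https://api.example.com/v1/tokenize"
AUTH_TOKEN = "<token>"  # your auth token

payload = {
    "model": "alfred-40b",      # assumed model identifier
    "prompt": "Hello, world!",  # simple text string to tokenize
    # Alternative to `prompt`: an array of chat messages, e.g.
    # "messages": [{"role": "user", "content": "Hello, world!"}],
}

response = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {AUTH_TOKEN}"},
    json=payload,
    timeout=30,
)
response.raise_for_status()
print(response.json())
```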
Response:
The tokenize endpoint returns the following fields:
- Total number of tokens in the input text
- The requested model for tokenization
- The model used for tokenization
- The type of tokenizer used
- The id of the response
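For illustration only, here is a sketch of reading those fields from a decoded response. The field names (`id`, `model_requested`, `model_used`, `tokenizer_type`, `total_tokens`) and the sample values are assumptions inferred from the descriptions above, not names confirmed by the API.

```python
# Hypothetical response payload; every field name and value below is an
# assumption inferred from the field descriptions above.
example_response = {
    "id": "tok-0123456789",           # the id of the response
    "model_requested": "alfred-40b",  # the requested model for tokenization
    "model_used": "alfred-40b",       # the model actually used
    "tokenizer_type": "BPE",          # the type of tokenizer used
    "total_tokens": 4,                # total number of tokens in the input text
}

print(f"{example_response['total_tokens']} tokens "
      f"(tokenizer: {example_response['tokenizer_type']})")
```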