Module Imandrakit_twine.Decode

type t
val of_string : string -> t
type 'a decoder = t -> int -> 'a
type cstor_index = int
val fail : string -> 'a
val failf : ('a, unit, string, 'b) format4 -> 'a
module Value : sig ... end
module Array_cursor : sig ... end
module Dict_cursor : sig ... end
val deref_rec : int decoder

Given the address of any value, follow pointers until a non-pointer value is reached, and return its address.
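As a sketch (where `st : t` and `off : int` are hypothetical values, not part of this API):

```ocaml
(* Sketch: resolve a chain of pointers starting at offset [off] in the
   slice [st], yielding the address of the underlying non-pointer value.
   [st] and [off] are hypothetical. *)
let real_off : int = Imandrakit_twine.Decode.deref_rec st off
```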

val read : ?auto_deref:bool -> Value.t decoder

Read a value of any kind.

  • parameter auto_deref

    If true (the default), follow pointers implicitly.
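For example (a sketch; `st : t` and `off : int` are hypothetical values):

```ocaml
(* Sketch: read the value at [off] without following pointers implicitly,
   so a pointer value is returned as-is rather than dereferenced. *)
let v : Imandrakit_twine.Decode.Value.t =
  Imandrakit_twine.Decode.read ~auto_deref:false st off
```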

val null : unit decoder
val bool : bool decoder
val int_truncate : int decoder
val int64 : int64 decoder
val float : float decoder
val string : string decoder
val blob : string decoder
val tag : (int * int) decoder

Entrypoint

val get_entrypoint : t -> int

Offset of the entrypoint (the toplevel value).

val read_entrypoint : t -> Value.t

Read the entrypoint, located at the end of the slice.

val decode_string : 'a decoder -> string -> 'a
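For instance, decoding a slice held in a string might look like the following sketch (`encoded` is a hypothetical string containing a Twine slice whose entrypoint is assumed to be an integer; the equivalence between the two forms is an assumption, not stated by the library):

```ocaml
module D = Imandrakit_twine.Decode

(* Sketch: decode in one call. [encoded] is hypothetical. *)
let n : int = D.decode_string D.int_truncate encoded

(* Presumably equivalent, going through the slice type explicitly: *)
let n' : int =
  let st = D.of_string encoded in
  D.int_truncate st (D.get_entrypoint st)
```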

Caching

Caching is used to reflect the sharing of values embedded in a Twine slice into the decoded values. It means that, for a given type, if values of this type are encoded with sharing (e.g. a graph-heavy term representation), then with caching we can decode them to OCaml values that also exhibit sharing.

type 'a cache_key

Generative key used to cache values during decoding

val create_cache_key : unit -> _ cache_key

Generate a new (generative) cache key for a type.

NOTE: this should be called only at module toplevel, as a constant, not dynamically inside a function, e.g. let key : foo value_pack.Deser.cache_key = value_pack.Deser.create_cache_key ();;. Indeed, keys are generative, so creating multiple keys for the same type will result in sub-par performance or non-existent caching.

val with_cache : 'a cache_key -> 'a decoder -> 'a decoder

with_cache key dec is the same decoder as dec, except that it uses key to retrieve values directly from an internal table for entries that have already been decoded. This means that a value encoded with a lot of sharing (e.g. in a graph, or a large string added via Ser.add_string) will be decoded only once.
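A sketch of the intended usage, for a hypothetical record type point (the raw decoder's body is an assumption about the encoding, not the library's actual API beyond what is listed above):

```ocaml
module D = Imandrakit_twine.Decode

type point = { x : int; y : int }

(* Created once, at module toplevel, per the NOTE on create_cache_key. *)
let point_key : point D.cache_key = D.create_cache_key ()

(* Hypothetical raw decoder for [point]; how the fields are laid out in
   the slice is an assumption made for illustration only. *)
let decode_point_raw : point D.decoder =
  fun st off ->
    let x = D.int_truncate st off in
    { x; y = 0 (* second field elided in this sketch *) }

(* Shared occurrences of a point now decode to the same OCaml value. *)
let decode_point : point D.decoder = D.with_cache point_key decode_point_raw
```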

val add_cache : 'a decoder ref -> unit

add_cache dec_ref modifies the decoder so it uses a new cache key. It is the same as:

let key = create_cache_key ()
let () = dec_ref := with_cache key !dec_ref