
## Other APIs and types

### Messages

#### `Message<Key, Value, HeaderKey, HeaderValue>`

Represents a message that has been consumed from Kafka by using a Consumer.

The types of the key, value and headers fields are determined by the current serialisation settings of the Producer or the Consumer.

| Property | Type | Description |
| --- | --- | --- |
| `topic` | `string` | The topic of the message. |
| `partition` | `number` | The topic's partition of the message. |
| `key` | `Key` | The key of the message. |
| `value` | `Value` | The value of the message. |
| `timestamp` | `bigint` | The timestamp of the message. When producing, it defaults to the current timestamp. |
| `headers` | `Map<HeaderKey, HeaderValue>` | A map with the message headers. |
| `offset` | `bigint` | The message offset. |
| `commit` | `() => Promise` | A function to commit the offset. This is a no-op unless the consumer's `autocommit` option is `false`. |

`Message` also provides a `.toJSON()` method for debugging and logging.

#### `MessageToProduce<Key, Value, HeaderKey, HeaderValue>`

Represents a message that is being produced to Kafka using a Producer.

All fields, except `topic` and `value`, are optional.

The types of the key, value and headers fields are determined by the current serialisation settings of the Producer.

| Property | Type | Description |
| --- | --- | --- |
| `topic` | `string` | The topic of the message. |
| `partition` | `number` | The topic's partition of the message. |
| `key` | `Key` | The key of the message. |
| `value` | `Value` | The value of the message. |
| `timestamp` | `bigint` | The timestamp of the message. When producing, it defaults to the current timestamp. |
| `headers` | `Map<HeaderKey, HeaderValue> \| Record<HeaderKey, HeaderValue>` | A map or plain JavaScript object with the message headers. |

#### `MessageStream<Key, Value, HeaderKey, HeaderValue>`

A Node.js Readable stream returned by the Consumer's `consume` method.

Do not create this manually.

The readonly `context` getter exposes the opaque value supplied through `Consumer.consume({ context })` or the consumer's `streamContext` default.

### `ClusterMetadata`

Metadata about the Kafka cluster. It is returned by the Base client, which is the base class for the Producer, Consumer and Admin clients.

| Property | Type | Description |
| --- | --- | --- |
| `id` | `string` | The cluster ID. |
| `brokers` | `Map<number, Broker>` | Map of brokers. The keys are node IDs, while the values are objects with `host` and `port` properties. |
| `topics` | `Map<string, ClusterTopicMetadata>` | Map of topics. The keys are the topics, while the values contain partition information. |
| `lastUpdate` | `number` | Timestamp of the metadata. |

### Utilities

#### `debugDump(...values)`

Debug/logger utility to inspect any object.

```typescript
import { debugDump } from '@platformatic/kafka'

debugDump('received-message', message)
```

### Serialisation and Deserialisation

#### `stringSerializer` and `stringDeserializer`

Courtesy string serialisers implementing `Serializer<string>` and `Deserializer<string>`.

#### `jsonSerializer` and `jsonDeserializer`

Courtesy JSON serialisers implementing `Serializer<T = object>` and `Deserializer<T = object>`.

#### `stringSerializers` and `stringDeserializers`

Courtesy serializers and deserializers objects using `stringSerializer` or `stringDeserializer`, ready to be used in a Producer or Consumer.

#### `serializersFrom` and `deserializersFrom`

Courtesy methods to create a `Serializers<T, T, T, T>` out of a single `Serializer<T>`, or a `Deserializers<T, T, T, T>` out of a single `Deserializer<T>`.

For instance, the following two snippets are equivalent:

```typescript
import { Producer } from '@platformatic/kafka'

function serialize (source: YourType): Buffer {
  return Buffer.from(JSON.stringify(source))
}

const producer = new Producer({
  clientId: 'my-producer',
  bootstrapBrokers: ['localhost:9092'],
  serializers: {
    key: serialize,
    value: serialize,
    headerKey: serialize,
    headerValue: serialize
  }
})
```

```typescript
import { Producer, serializersFrom } from '@platformatic/kafka'

function serialize (source: YourType): Buffer {
  return Buffer.from(JSON.stringify(source))
}

const producer = new Producer({
  clientId: 'my-producer',
  bootstrapBrokers: ['localhost:9092'],
  serializers: serializersFrom(serialize)
})
```