# Kafka-Flink-Druid demo stack
Start the stack:

```shell
DRUID_VERSION=26.0.0 docker compose --profile all-services up -d --build
```

Tear it down:

```shell
DRUID_VERSION=26.0.0 docker compose --profile all-services down
```

Open a Flink SQL client:

```shell
docker compose run sql-client
```
- Kafka broker listeners:
  - `localhost:9092` from the host's point of view
  - `kafka:9094` from inside Docker Compose
- Flink console at `localhost:18081`
- Druid console at `localhost:8888`
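The dual-listener setup above is usually configured with two advertised listeners on the broker's Compose service. A minimal sketch, assuming a Bitnami-style Kafka image; the exact variable names depend on the image this repo's `docker-compose.yml` actually uses:

```yaml
# Hypothetical excerpt: one listener advertised to the host,
# one to containers on the Compose network.
services:
  kafka:
    ports:
      - "9092:9092"   # host applications connect here
    environment:
      KAFKA_CFG_LISTENERS: "EXTERNAL://:9092,INTERNAL://:9094"
      KAFKA_CFG_ADVERTISED_LISTENERS: "EXTERNAL://localhost:9092,INTERNAL://kafka:9094"
      KAFKA_CFG_LISTENER_SECURITY_PROTOCOL_MAP: "EXTERNAL:PLAINTEXT,INTERNAL:PLAINTEXT"
      KAFKA_CFG_INTER_BROKER_LISTENER_NAME: "INTERNAL"
```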
To run the demo:

1. Run all containers and log on to the Flink SQL client (see above).
2. Make a copy of `create-table-adsb-raw.sql` and enter the proper Confluent Cloud credentials.
3. Run the copied `create-table-adsb-raw.sql`. This creates a table on the stream that exposes the raw message value while still reading all the Kafka metadata correctly.
4. Run `create-table-adsb-json.sql`, which defines the target table.
5. Run `insert-adsb-json.sql`, which starts the transformation job.
6. Go to the Druid console and set up a supervisor for bootstrap server `kafka:9092`, topic `adsb-json`.
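The SQL steps above can be sketched as follows. This is a hypothetical reconstruction, not the repo's actual files: the column names, the source topic name, and the JSON paths are all assumptions; only the file roles (raw table with metadata, JSON target table, transformation job) come from this README.

```sql
-- create-table-adsb-raw.sql (sketch): raw value plus Kafka metadata columns
CREATE TABLE adsb_raw (
  raw STRING,                                        -- raw message value
  kafka_ts TIMESTAMP_LTZ(3) METADATA FROM 'timestamp',
  part INT METADATA FROM 'partition',
  off BIGINT METADATA FROM 'offset'
) WITH (
  'connector' = 'kafka',
  'topic' = 'adsb-raw',                     -- assumed source topic name
  'properties.bootstrap.servers' = '...',   -- Confluent Cloud endpoint and
                                            -- credentials go here
  'value.format' = 'raw',
  'scan.startup.mode' = 'latest-offset'
);

-- create-table-adsb-json.sql (sketch): target table on the local broker
CREATE TABLE adsb_json (
  icao STRING,
  lat DOUBLE,
  lon DOUBLE
) WITH (
  'connector' = 'kafka',
  'topic' = 'adsb-json',
  'properties.bootstrap.servers' = 'kafka:9094',
  'format' = 'json'
);

-- insert-adsb-json.sql (sketch): the transformation job
INSERT INTO adsb_json
SELECT JSON_VALUE(raw, '$.icao'),
       CAST(JSON_VALUE(raw, '$.lat') AS DOUBLE),
       CAST(JSON_VALUE(raw, '$.lon') AS DOUBLE)
FROM adsb_raw;
```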