
org.apache.kafka.common.errors.RecordTooLargeException: The message is N bytes when serialized which is larger than 1048576

Root cause

A single change event, once serialized, exceeds the producer's default max.request.size of 1 MB (1048576 bytes). This is common with large TEXT/BLOB/JSONB columns, wide tables with many columns, or when the JSON converter is used with schemas.enable=true, which embeds the full schema in every message and often makes it 10–100× larger than the data alone.
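
For illustration, this pair of standard Kafka Connect converter properties is what produces the schema-per-message bloat; flipping the flag to false (when downstream consumers do not need the embedded schema) shrinks every record accordingly:

    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "value.converter.schemas.enable": "true"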

How to fix

  1. Increase the producer limit in the connector config: "producer.override.max.request.size": "10485760" (see the connector sketch after this list). Note that producer.override.* settings take effect only if the worker's override policy permits them; see step 3.
  2. Increase the topic limit:
    kafka-configs.sh --bootstrap-server <broker>:9092 --entity-type topics --entity-name <topic> --alter --add-config max.message.bytes=10485760
  3. Set CONNECT_CONNECTOR_CLIENT_CONFIG_OVERRIDE_POLICY=All on the Kafka Connect worker (sketch below); the default policy, None, rejects per-connector client overrides.
  4. Exclude large columns with column.exclude.list if they are not needed downstream (also shown in the connector sketch).
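
A minimal sketch of steps 1 and 4 together, assuming a Debezium PostgreSQL connector named inventory-connector, a Connect REST API at connect:8083, and a hypothetical oversized column public.documents.raw_payload (the connection settings are illustrative placeholders):

    curl -X PUT http://connect:8083/connectors/inventory-connector/config \
      -H "Content-Type: application/json" \
      -d '{
        "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
        "database.hostname": "postgres",
        "database.port": "5432",
        "database.user": "debezium",
        "database.password": "dbz",
        "database.dbname": "inventory",
        "topic.prefix": "inventory",
        "producer.override.max.request.size": "10485760",
        "column.exclude.list": "public.documents.raw_payload"
      }'

PUT /connectors/{name}/config creates or updates the connector in one call, so the same request works both for a first deployment and for applying the fix in place.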
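The override policy in step 3 is a worker-level setting, not a connector setting. On a plain worker it goes in the worker properties file; the CONNECT_-prefixed environment variable is the equivalent for container images that map such variables onto worker properties (a sketch, assuming a Docker Compose deployment):

    # connect-distributed.properties (bare-metal worker)
    connector.client.config.override.policy=All

    # docker-compose.yml (images such as confluentinc/cp-kafka-connect)
    environment:
      CONNECT_CONNECTOR_CLIENT_CONFIG_OVERRIDE_POLICY: All

The worker reads this property at startup, so it must be restarted for the change to take effect.
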
Official Debezium documentation ↗