
Debezium flush timeout and OutOfMemoryError errors with MySQL

I'm using Debezium 0.7 to read from MySQL, but I'm getting flush timeout and OutOfMemoryError errors during the initial snapshot phase. Looking at the logs, it seems the connector is trying to flush too many messages in one go.

I wonder what the correct settings (http://debezium.io/docs/connectors/mysql/#connector-properties) are for sizeable databases (>50 GB). I didn't have this issue with smaller databases, and simply increasing the timeout doesn't seem like a good strategy. I'm currently using the default connector settings.

Update

I changed the settings as suggested in the answer below and it fixed the problem.


Answer

This is a complex question. First of all, the default memory settings for the Debezium Docker images are quite low, so if you are using them it may be necessary to increase them.
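As a rough sketch (not part of the original answer), raising the Connect worker heap with the Debezium container images is typically done through the image's HEAP_OPTS environment variable; the service names, topic names, and 2 GB heap size below are illustrative, so verify the variable name and defaults against the image documentation for your Debezium version:

```yaml
# docker-compose sketch: increase the JVM heap of the Connect worker.
# All values here are illustrative placeholders.
connect:
  image: debezium/connect:0.7
  environment:
    - BOOTSTRAP_SERVERS=kafka:9092
    - GROUP_ID=1
    - CONFIG_STORAGE_TOPIC=connect_configs
    - OFFSET_STORAGE_TOPIC=connect_offsets
    # Raise the heap from the image default (quite low) to 2 GB:
    - HEAP_OPTS=-Xms2g -Xmx2g
```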

Next, there are multiple factors at play. I recommend the following steps:

  1. Increase max.batch.size and max.queue.size – this reduces the number of offset commits
  2. Increase offset.flush.timeout.ms – this gives Connect time to process the accumulated records
  3. Decrease offset.flush.interval.ms – this should reduce the number of accumulated offsets
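The steps above touch two different configuration files: max.batch.size and max.queue.size are Debezium connector properties, while the offset.flush.* settings belong to the Kafka Connect worker configuration. A minimal sketch, with illustrative values you should tune for your own workload:

```properties
# Debezium MySQL connector properties (connector config) --
# larger batches/queue mean fewer, bigger flushes:
max.batch.size=32768
max.queue.size=131072

# Kafka Connect worker properties (e.g. connect-distributed.properties) --
# give flushes more time, but trigger them more often:
offset.flush.timeout.ms=60000
offset.flush.interval.ms=15000
```

Note that max.queue.size should stay larger than max.batch.size, since the queue buffers records between the snapshot reader and the batches handed to Connect.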

Unfortunately, there is an issue, KAFKA-6551, lurking in the background that can still wreak havoc.

User contributions licensed under: CC BY-SA