When we tried to queue a page containing an attachment stream to the queue processor, the serialized page grew beyond 5 MB, so the Kafka producer could not publish the page to the Kafka server/broker.
We were getting the error below:
Caused by: java.util.concurrent.ExecutionException: org.apache.kafka.common.errors.RecordTooLargeException: The message is 10812412 bytes when serialized which is larger than the maximum request size you have configured with the max.request.size configuration.
Approach 1: Update the prconfig file
This approach works well if you have a single node. When the application is deployed to a higher environment, you need to make sure the prconfig in that environment is also updated.
<env name="dnode/kafka/producer/max.message.bytes" value="5000000"/>
<env name="dnode/kafka/producer/max.request.size" value="5000000"/>
<env name="dsm/services/stream/server_properties/max.request.size" value="5000000"/>
<env name="dsm/services/stream/server_properties/message.max.bytes" value="5000200"/>
<env name="dsm/services/stream/server_properties/replica.fetch.max.bytes" value="5000200"/>
<env name="dsm/services/stream/server_properties/replica.fetch.response.max.bytes" value="5000200"/>
Approach 2: Update dynamic system settings (DSS)
This approach is best suited to a multi-node environment, because you do not need to maintain the configuration file on each node; DSS applies the settings across all nodes.
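As a sketch, assuming the common Pega convention that a prconfig entry can be overridden by a dynamic system setting with owning ruleset Pega-Engine and setting purpose prconfig/<prconfig path>/default, the producer-side setting would look like:

Owning Ruleset: Pega-Engine
Setting Purpose: prconfig/dnode/kafka/producer/max.request.size/default
Value: 5000000

The remaining settings from Approach 1 would follow the same pattern; verify the exact purpose strings against your Pega version's documentation.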
message.max.bytes -> should be the same on the Kafka producer and the Kafka server
replica.fetch.max.bytes -> must be greater than 'message.max.bytes' of the Kafka server
replica.fetch.response.max.bytes -> must be greater than 'message.max.bytes' of the Kafka server
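The relationships above can be sanity-checked with a minimal sketch (plain Java, not Pega code), assuming the standard Kafka configuration key names and the example values from Approach 1:

```java
import java.util.Properties;

public class KafkaSizeLimits {
    public static void main(String[] args) {
        // Producer side: largest request the producer will send.
        Properties producer = new Properties();
        producer.setProperty("max.request.size", "5000000");

        // Broker side: limits from the prconfig example above.
        Properties broker = new Properties();
        broker.setProperty("message.max.bytes", "5000200");
        broker.setProperty("replica.fetch.max.bytes", "5000200");
        broker.setProperty("replica.fetch.response.max.bytes", "5000200");

        long maxRequestSize = Long.parseLong(producer.getProperty("max.request.size"));
        long messageMaxBytes = Long.parseLong(broker.getProperty("message.max.bytes"));
        long replicaFetchMaxBytes = Long.parseLong(broker.getProperty("replica.fetch.max.bytes"));

        // The broker must accept anything the producer is allowed to send,
        // and replica fetches must be able to carry the largest accepted message.
        if (maxRequestSize > messageMaxBytes) {
            throw new IllegalStateException("max.request.size exceeds message.max.bytes");
        }
        if (replicaFetchMaxBytes < messageMaxBytes) {
            throw new IllegalStateException("replica.fetch.max.bytes is below message.max.bytes");
        }
        System.out.println("size limits are consistent");
    }
}
```

If either check fails, the broker will reject oversized requests with the same RecordTooLargeException shown above.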