Sending Kafka broker logs to a MySQL database

I have been working on storing Apache Kafka broker logs in a MySQL database. Kafka ships with the log4j JAR, which includes a JDBC appender that can write logs directly to a database. So I found the log4j.properties file in Kafka and tried adding JDBC appender properties to it, so that logs are sent to the database in addition to the console and log file, which is Kafka's default logging behavior.
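A sketch of the kind of additions this describes, assuming Kafka's stock log4j 1.x setup in config/log4j.properties. The database name, table, columns, and credentials below are placeholders, and the MySQL Connector/J JAR must be on the broker's classpath for the appender to load:

```properties
# Hypothetical JDBC appender added alongside Kafka's default appenders
log4j.appender.DB=org.apache.log4j.jdbc.JDBCAppender
log4j.appender.DB.URL=jdbc:mysql://localhost:3306/kafka_logs
log4j.appender.DB.driver=com.mysql.cj.jdbc.Driver
log4j.appender.DB.user=kafka
log4j.appender.DB.password=secret
log4j.appender.DB.layout=org.apache.log4j.PatternLayout
# The SQL is itself a pattern; conversion characters are filled per log event
log4j.appender.DB.sql=INSERT INTO broker_logs (logged_at, level, logger, message) VALUES ('%d{yyyy-MM-dd HH:mm:ss}', '%p', '%c', '%m')

# Keep the default console and file logging and add the DB appender
log4j.rootLogger=INFO, stdout, kafkaAppender, DB
```

Note that log4j 1.x's JDBCAppender does not escape quotes in the message, so broker log lines containing a single quote can break the generated INSERT statement.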

Connecting to a topic with a new consumer group for every websocket client in Kafka (node-rdkafka)

I'm building a websocket backend that connects to a topic (with only one partition), consumes data from the earliest offset, and keeps consuming new data until the websocket connection is closed. More than one websocket connection can exist at a time.

How to control the concurrency of message processing with a ConsumerGroup

I am using the kafka-node ConsumerGroup to consume messages from a topic. For each message it consumes, the ConsumerGroup must call an external API, which might take up to a second to respond.
I want to hold off consuming the next message from the queue until I get the response from the API, so that messages are processed sequentially.
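A minimal sketch of one way to serialize the work: append each incoming message to a promise chain, so the handler for message N+1 does not start until the handler for message N has resolved. The kafka-node wiring is shown only as comments (names assumed, not verified here); the queueing logic itself is plain JavaScript:

```javascript
// Wrap an async handler so calls run strictly one at a time, in arrival order.
function createSequentialProcessor(handler) {
  let chain = Promise.resolve();
  return function onMessage(message) {
    // consumerGroup.pause();  // optionally stop fetching while busy
    chain = chain
      .then(() => handler(message)) // next handler waits for the previous one
      .catch((err) => console.error('handler failed:', err)); // keep chain alive
      // .finally(() => consumerGroup.resume());
    return chain;
  };
}

// Hypothetical wiring with kafka-node:
//   const consumerGroup = new ConsumerGroup(options, ['my-topic']);
//   consumerGroup.on('message', createSequentialProcessor(callExternalApi));
```

If unbounded buffering of fetched messages is a concern while a slow API call is in flight, pairing this with the consumer's pause()/resume() (as sketched in the comments) keeps kafka-node from pulling far ahead of the processor.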