Kafka Metrics
Kafka is often used for operational monitoring data. This involves aggregating statistics from distributed applications to produce centralized feeds of operational data.
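As a minimal sketch of this use case (assuming Spring Kafka's KafkaTemplate is available and that a topic named app-metrics exists; the topic name and the MetricsPublisher class are illustrative, not part of the notes above), a service could publish metric samples like this:

import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class MetricsPublisher {

    // Topic name is an assumption for illustration
    private static final String METRICS_TOPIC = "app-metrics";

    private final KafkaTemplate<String, String> kafkaTemplate;

    public MetricsPublisher(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    // Publish one metric sample; the key groups samples from the same host
    public void publish(String hostName, String metricName, double value) {
        kafkaTemplate.send(METRICS_TOPIC, hostName, metricName + "=" + value);
    }
}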
Kafka Log Aggregation
Kafka can also be used to gather logs from multiple services across an organization into centralized feeds.
Stream Processing
For stream processing, Kafka's strong durability is very useful.
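As a rough sketch of a stream-processing job (the kafka-streams dependency, the broker address localhost:9092, the application id, and the topic names raw-events and processed-events are all assumptions for illustration), a simple topology that transforms one topic into another could look like this:

import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class StreamProcessingSketch {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "event-uppercaser");  // assumed id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // Read raw events, transform each value, and write to an output topic
        KStream<String, String> raw = builder.stream("raw-events");
        raw.mapValues(value -> value.toUpperCase())
           .to("processed-events");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        // Close the topology cleanly on shutdown
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}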
Reliability, Scalability, Durability, Performance
Built-in partitioning, replication, and fault tolerance
Global scale, real time, persistent storage, stream processing
Use cases
Messaging
Metrics
Log Aggregation
Stream Processing
Activity Tracking
Event Sourcing
Spring Kafka Consumer Configuration
The following configuration class registers a ConsumerFactory and a listener container factory for String keys and values.
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafka;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;

@EnableKafka
@Configuration
public class KafkaConsumerConfig {

    // Broker address and consumer group id, injected from application
    // properties; the property keys shown here are illustrative
    @Value(value = "${spring.kafka.bootstrap-servers}")
    private String bootstrapAddress;

    @Value(value = "${spring.kafka.consumer.group-id}")
    private String groupId;

    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        // Where to find the Kafka cluster
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
        // Consumer group this listener belongs to
        props.put(ConsumerConfig.GROUP_ID_CONFIG, groupId);
        // Deserialize both keys and values as plain strings
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        return new DefaultKafkaConsumerFactory<>(props);
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, String> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }
}
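A consumer that uses the container factory above might look like the following sketch; the topic name example-topic and the MessageListener class are assumptions for illustration:

import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class MessageListener {

    // Topic name is an assumption; the listener container is built from
    // the kafkaListenerContainerFactory bean configured above
    @KafkaListener(topics = "example-topic", containerFactory = "kafkaListenerContainerFactory")
    public void listen(String message) {
        System.out.println("Received message: " + message);
    }
}

Spring creates a listener container from kafkaListenerContainerFactory and invokes listen() for each record read from the topic.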