Kafka Training

Kafka Consumer: Advanced Consumers

Kafka Tutorial 14: Creating Advanced Kafka Consumers in Java - Part 1

In this tutorial, you are going to create advanced Kafka Consumers.

Before you start

The prerequisites to this tutorial are

Welcome to the first article on Advanced Kafka Consumers.

In this article, we are going to set up an advanced Kafka Consumer.

Continue reading

Kafka Tutorial

Kafka Tutorial

This comprehensive Kafka tutorial covers Kafka architecture and design. The Kafka tutorial has example Java Kafka producers and Kafka consumers. The Kafka tutorial also covers Avro and Schema Registry.

Kafka Training - Onsite, Instructor-led

Training for DevOps, Architects and Developers

This Kafka course teaches the basics of the Apache Kafka distributed streaming platform. Apache Kafka is one of the most powerful and widely used reliable streaming platforms. Kafka is fault tolerant and highly scalable, and it is used for log aggregation, stream processing, event sourcing, and commit logs. Kafka is used by LinkedIn, Yahoo, Twitter, Square, Uber, Box, PayPal, Etsy and more to enable stream processing and online messaging, to facilitate in-memory computing by providing a distributed commit log, to collect data for big data, and much more.

Continue reading

Kafka Tutorial: Creating Advanced Kafka Producers in Java

Kafka Tutorial 13: Creating Advanced Kafka Producers in Java

In this tutorial, you are going to create advanced Kafka Producers.

Before you start

The prerequisites to this tutorial are

This tutorial picks up right where Kafka Tutorial Part 11: Writing a Kafka Producer example in Java and Kafka Tutorial Part 12: Writing a Kafka Consumer example in Java left off. In the last two tutorials, we created simple Java examples that create a Kafka producer and a Kafka consumer.

Continue reading

Kafka Tutorial: Creating Advanced Kafka Consumers in Java

Kafka Tutorial 14: Creating Advanced Kafka Consumers in Java

In this tutorial, you are going to create advanced Kafka Consumers.

UNDER CONSTRUCTION.

Before you start

The prerequisites to this tutorial are

This tutorial picks up right where Kafka Tutorial Part 13: Creating Advanced Kafka Producers in Java left off. In the last tutorial, we created advanced Java producers; now we will do the same with consumers.

Continue reading

Kafka Architecture: Log Compaction

Kafka Architecture: Log Compaction

This post picks up from our series on Kafka architecture, which includes Kafka topics architecture, Kafka producer architecture, Kafka consumer architecture, and Kafka ecosystem architecture.

This article is heavily inspired by the log compaction section of the Kafka design documentation. You can think of it as the Cliff's Notes on Kafka's log compaction design.

Kafka can delete older records based on the age or size of a log. Kafka also supports log compaction, which compacts records by key. With log compaction, Kafka keeps the latest version of each record key and deletes the older versions during a log compaction.
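As a rough sketch, a compacted topic is simply a topic whose cleanup.policy is set to compact. The config keys below are standard Kafka topic-level configs; the specific values are illustrative assumptions, not recommendations:

```java
import java.util.HashMap;
import java.util.Map;

public class CompactedTopicConfig {
    // Topic-level configs you would pass when creating a compacted topic.
    public static Map<String, String> compactionConfigs() {
        Map<String, String> configs = new HashMap<>();
        // Keep only the latest record per key instead of deleting by time/size.
        configs.put("cleanup.policy", "compact");
        // How "dirty" (uncompacted) the log may get before the cleaner runs;
        // 0.5 is the broker default.
        configs.put("min.cleanable.dirty.ratio", "0.5");
        // How long delete markers (tombstones) survive after compaction.
        configs.put("delete.retention.ms", "86400000"); // 24 hours
        return configs;
    }

    public static void main(String[] args) {
        System.out.println(compactionConfigs());
    }
}
```

You would hand this map to the AdminClient (or set the same keys with the kafka-topics command) when creating the topic.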

Continue reading

Kafka Architecture: Low Level

If you are not sure what Kafka is, see What is Kafka?.

Kafka Architecture: Low-Level Design

This post picks up from our series on Kafka architecture, which includes Kafka topics architecture, Kafka producer architecture, Kafka consumer architecture, and Kafka ecosystem architecture.

This article is heavily inspired by the design section of the Kafka documentation. You can think of it as the Cliff's Notes.


Kafka Design Motivation

LinkedIn engineering built Kafka to support real-time analytics. Kafka was designed to feed an analytics system that did real-time processing of streams. LinkedIn developed Kafka as a unified platform for real-time handling of streaming data feeds. The goal behind Kafka: build a high-throughput streaming data platform that supports high-volume event streams like log aggregation, user activity, etc.

Continue reading

Kafka Tutorial: Creating a Kafka Consumer in Java

Kafka Tutorial: Writing a Kafka Consumer in Java

In this tutorial, you are going to create a simple Kafka Consumer. This consumer consumes messages from the Kafka Producer you wrote in the last tutorial. This tutorial demonstrates how to process records from a Kafka topic with a Kafka Consumer.

This tutorial describes how Kafka consumers in the same group divide up and share partitions, while each consumer group gets its own copy of the same data.
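To make the group behavior concrete, here is a minimal sketch of the consumer configuration that controls it. The config keys are standard Kafka consumer configs; the broker address and group name are hypothetical placeholders:

```java
import java.util.Properties;

public class ConsumerGroupConfig {
    // Hypothetical broker address for illustration.
    static final String BOOTSTRAP_SERVERS = "localhost:9092";

    public static Properties consumerProps(String groupId) {
        Properties props = new Properties();
        props.put("bootstrap.servers", BOOTSTRAP_SERVERS);
        // Consumers that share the same group.id divide the topic's
        // partitions among themselves; each group as a whole still sees
        // every record, so distinct groups each get a full copy.
        props.put("group.id", groupId);
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.LongDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        return props;
    }
}
```

Start two consumers with consumerProps("group-a") and they split the partitions; start a third with consumerProps("group-b") and it receives its own copy of every record.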

Continue reading

Kafka Tutorial: Creating a Kafka Producer in Java

Kafka Tutorial: Writing a Kafka Producer in Java

In this tutorial, we are going to create a simple Java example that creates a Kafka producer. You create a new replicated Kafka topic called my-example-topic, then you create a Kafka producer that sends records to this topic. You will first send records synchronously and later send records asynchronously.
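As a preview, here is a minimal sketch of the synchronous and asynchronous send paths using the kafka-clients API. It assumes the kafka-clients jar on the classpath and a broker at localhost:9092 (a placeholder address); my-example-topic is the topic used in the tutorial:

```java
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;
import org.apache.kafka.common.serialization.LongSerializer;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

public class SimpleProducerSketch {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
                LongSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                StringSerializer.class.getName());

        try (Producer<Long, String> producer = new KafkaProducer<>(props)) {
            ProducerRecord<Long, String> record =
                    new ProducerRecord<>("my-example-topic", 1L, "hello");

            // Synchronous send: block until the broker acknowledges the record.
            RecordMetadata meta = producer.send(record).get();
            System.out.printf("sent to partition %d at offset %d%n",
                    meta.partition(), meta.offset());

            // Asynchronous send: register a callback instead of blocking.
            producer.send(record, (metadata, exception) -> {
                if (exception != null) exception.printStackTrace();
                else System.out.println("async offset: " + metadata.offset());
            });
        }
    }
}
```

The synchronous path trades throughput for a per-record delivery guarantee; the asynchronous path batches sends and reports the result through the callback.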

Before you start

Prerequisites to this tutorial are Kafka from the command line and Kafka clustering and failover basics.

Continue reading

Kafka Tutorial: Kafka clusters, Kafka consumer failover, and Kafka broker failover

If you are not sure what Kafka is, start here “What is Kafka?”.

Getting started with Kafka cluster tutorial

Understanding Kafka Failover

This Kafka tutorial picks up right where the first Kafka tutorial from the command line left off. The first tutorial has instructions on how to run ZooKeeper and use Kafka utils.

In this tutorial, we are going to run many Kafka brokers on our development laptop, so you will need at least 16 GB of RAM on your local dev machine. You can run just two brokers if you have less than 16 GB of memory. We are going to create a replicated topic. We then demonstrate consumer failover and broker failover. We also demonstrate load balancing Kafka consumers. We show how, with many groups, Kafka acts like a publish/subscribe system. But, when we put all of our consumers in the same group, Kafka load balances the messages across the consumers in that group (more like a queue than a topic in the traditional MOM sense).

Continue reading

Kafka Tutorial: Using Kafka from the command line

If you are not sure what Kafka is, start here “What is Kafka?”.

Getting started with Kafka tutorial

Let’s show a simple example using producers and consumers from the Kafka command line.

Download Kafka 0.10.2.x from the Kafka download page. Later versions will likely work, but this example was done with 0.10.2.x.

We assume that you have Java SDK 1.8.x installed.

We unzipped the Kafka download and put it in ~/kafka-training/, and then renamed the Kafka install folder to kafka. Please do the same.

Continue reading

