Apache Kafka Streams for Building Real-Time Data Processing with STOMP over WebSocket using Spring Boot and Angular 8


In this tutorial I will show you how to use Apache Kafka Streams to build real-time data processing with STOMP over WebSocket using Spring Boot and Angular 8. We will see how to build push notifications using Apache Kafka, Spring Boot and Angular 8. We need to provide some basic things that Kafka Streams requires, such as the cluster information, the application id, the topic to consume from, the Serdes to use, and so on. I will show you how to build the application using both the Gradle and Maven build tools.

I will not make the content lengthy by explaining Kafka Streams in detail, but you can always find very good documentation here.

Let’s look at some of these basic things that Kafka Streams requires:

Cluster Information

By default, a Kafka Streams application will try to connect to a cluster that is running on localhost:9092. If that is not the case, you can override it using the available configuration properties.
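For example, with Spring Boot the broker address can be overridden in application.properties using Boot's standard Kafka property; the host names below are placeholders for your own cluster:

```properties
# Point the application at a non-default Kafka cluster
spring.kafka.bootstrap-servers=kafka1.example.com:9092,kafka2.example.com:9092
```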

Application ID

In a Kafka Streams application, application.id is a mandatory field. Without it, you cannot start a Kafka Streams application. In this tutorial we will set it explicitly in the stream configuration. (If you use the Spring Cloud Stream binder instead, the binder will generate an application ID for you, using the function bean name as a prefix.)

Topic to consume from

You need to provide a topic from which Kafka Streams will consume the stream of messages.

Serialization and Deserialization (Serdes)

Kafka Streams uses a special class called Serde to deal with data marshaling. It is essentially a wrapper around a deserializer on the inbound path and a serializer on the outbound path. Normally, you have to tell Kafka Streams which Serde to use for each stream's keys and values, either through the default Serde configuration properties or per operation (for example, Consumed.with(...)).
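Conceptually, a Serde just pairs the two directions of marshaling. The sketch below is not Kafka's actual Serde interface (which uses topic-aware serializers over byte[]); it is a minimal stand-alone model of the same idea:

```java
import java.nio.charset.StandardCharsets;
import java.util.function.Function;

// Minimal model of what a Serde bundles: a serializer for the outbound
// path and a deserializer for the inbound path. Kafka's real Serde
// produces and consumes byte[]; we do the same for String values here.
public class SerdeSketch {

    // outbound: object -> bytes on the wire
    static final Function<String, byte[]> SERIALIZER =
            s -> s.getBytes(StandardCharsets.UTF_8);

    // inbound: bytes on the wire -> object
    static final Function<byte[], String> DESERIALIZER =
            b -> new String(b, StandardCharsets.UTF_8);

    public static void main(String[] args) {
        byte[] wire = SERIALIZER.apply("Good Morning");
        // the value round-trips intact through the "wire" format
        System.out.println(DESERIALIZER.apply(wire));
    }
}
```

A real Serde is built the same way, by pairing a Serializer and a Deserializer (Kafka even provides Serdes.serdeFrom(serializer, deserializer) for this).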


Prerequisites

At least Java 8, Eclipse 4.12, Spring Boot 2.2.2, Maven 3.6.1, Gradle 5.6, Spring Kafka 2.3.4, Angular 8

How to setup and work with Apache Kafka in Windows Environment

How to create new Angular project in Windows

Server Application

Create Project

Let’s create a Maven or Gradle based project in Eclipse IDE. The name of the project is spring-apache-kafka-streams-websocket-stomp-server.

If you are creating a Gradle based project then you can use the below build.gradle script:

buildscript {
	ext {
		springBootVersion = '2.2.2.RELEASE'
	}
	repositories {
		mavenCentral()
	}
	dependencies {
		classpath("org.springframework.boot:spring-boot-gradle-plugin:${springBootVersion}")
	}
}

plugins {
    id 'java-library'
    id 'org.springframework.boot' version '2.2.2.RELEASE'
}

sourceCompatibility = 12
targetCompatibility = 12

repositories {
    mavenCentral()
}

dependencies {
    implementation 'org.springframework.boot:spring-boot-starter-websocket:2.2.2.RELEASE'
    implementation 'org.springframework.kafka:spring-kafka:2.3.4.RELEASE'
    implementation 'org.apache.kafka:kafka-streams:2.4.0'
}

If you are creating a Maven based project then you can use the below pom.xml file:

<project xmlns="http://maven.apache.org/POM/4.0.0"
	xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
	<modelVersion>4.0.0</modelVersion>

	<groupId>com.roytuts</groupId>
	<artifactId>spring-apache-kafka-streams-websocket-stomp-server</artifactId>
	<version>0.0.1-SNAPSHOT</version>

	<parent>
		<groupId>org.springframework.boot</groupId>
		<artifactId>spring-boot-starter-parent</artifactId>
		<version>2.2.2.RELEASE</version>
	</parent>

	<dependencies>
		<dependency>
			<groupId>org.springframework.boot</groupId>
			<artifactId>spring-boot-starter-websocket</artifactId>
		</dependency>
		<dependency>
			<groupId>org.springframework.kafka</groupId>
			<artifactId>spring-kafka</artifactId>
			<version>2.3.4.RELEASE</version>
		</dependency>
		<dependency>
			<groupId>org.apache.kafka</groupId>
			<artifactId>kafka-streams</artifactId>
			<version>2.4.0</version>
		</dependency>
	</dependencies>

	<build>
		<plugins>
			<plugin>
				<groupId>org.apache.maven.plugins</groupId>
				<artifactId>maven-compiler-plugin</artifactId>
				<configuration>
					<source>12</source>
					<target>12</target>
				</configuration>
			</plugin>
		</plugins>
	</build>
</project>

Application Properties

We will create application.properties file under classpath directory src/main/resources to configure some basic settings for Kafka.

The producer will produce messages into the roytuts-input topic. The Kafka stream processor will consume messages from the roytuts-input topic and write them to the roytuts-output topic. Then a consumer will consume messages from the roytuts-output topic.

Finally, SimpMessagingTemplate will publish them to the STOMP destination /topic/greeting.
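The property file itself is not reproduced in this post, so the snippet below is a plausible reconstruction based on the property keys the code refers to (kafka.output.topic and spring.kafka.consumer.group-id appear in the listener; the other keys are assumptions chosen to match the field names); adjust the values to your environment:

```properties
# Kafka cluster and consumer settings
spring.kafka.bootstrap-servers=localhost:9092
spring.kafka.consumer.group-id=roytuts

# Topics used by the producer, the stream processor and the consumer
kafka.input.topic=roytuts-input
kafka.output.topic=roytuts-output

# STOMP destination the consumer forwards messages to
stomp.topic=/topic/greeting
```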




Create Topics

We will create two topics in Kafka for consuming and publishing messages.

To create a topic we need to add a bean of type NewTopic. If the topic already exists, this bean is ignored.

The topics will be created during application start up.

package com.roytuts.spring.apache.kafka.streams.websocket.stomp.server.config;

import org.apache.kafka.clients.admin.NewTopic;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class KafkaTopicConfig {

	@Value("${kafka.input.topic}")
	private String kafkaInputTopic;

	@Value("${kafka.output.topic}")
	private String kafkaOutputTopic;

	@Bean
	public NewTopic inputTopic() {
		return new NewTopic(kafkaInputTopic, 1, (short) 1);
	}

	@Bean
	public NewTopic outputTopic() {
		return new NewTopic(kafkaOutputTopic, 1, (short) 1);
	}

}

Kafka Streams

The Kafka stream processor will consume from the input topic, perform some business processing on the input data, and write to the output topic. In this example the processor just reads the messages and writes them to the output topic, but ideally your application will do some business processing on the input data.

package com.roytuts.spring.apache.kafka.streams.websocket.stomp.server.config;

import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Produced;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class KafkaStreamsConfig {

	@Value("${kafka.input.topic}")
	private String kafkaInputTopic;

	@Value("${kafka.output.topic}")
	private String kafkaOutputTopic;

	@Value("${spring.kafka.bootstrap-servers}")
	private String kafkaBootstrapServer;

	@Bean
	public KStream<String, String> kstream() {
		Properties props = new Properties();
		props.put(StreamsConfig.APPLICATION_ID_CONFIG, "roytuts-stomp-websocket");
		props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, kafkaBootstrapServer);
		props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
		props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

		StreamsBuilder streamsBuilder = new StreamsBuilder();

		final KStream<String, String> stream = streamsBuilder.stream(kafkaInputTopic,
				Consumed.with(Serdes.String(), Serdes.String()));

		// in a real application some business processing would happen in map()
		stream.map((key, value) -> KeyValue.pair(key, value)).to(kafkaOutputTopic,
				Produced.with(Serdes.String(), Serdes.String()));

		KafkaStreams streams = new KafkaStreams(streamsBuilder.build(), props);
		streams.start();

		return stream;
	}

}

Configure WebSocket and Stomp

The WebSocketConfig class below is annotated with @Configuration to indicate that it is a Spring configuration class.

The class is also annotated with @EnableWebSocketMessageBroker, which enables WebSocket message handling, backed by a message broker.

The configureMessageBroker() method overrides the default method in WebSocketMessageBrokerConfigurer interface to configure the message broker.

It starts by calling enableSimpleBroker() to enable a simple memory-based message broker to carry the greeting messages back to the client on destinations prefixed with /topic.

The registerStompEndpoints() method registers the /websocket endpoint, enabling SockJS fallback options so that alternate transports may be used if WebSocket is not available.

The SockJS client will attempt to connect to /websocket and use the best transport available (websocket, xhr-streaming, xhr-polling, etc).

package com.roytuts.spring.apache.kafka.streams.websocket.stomp.server.config;

import org.springframework.context.annotation.Configuration;
import org.springframework.messaging.simp.config.MessageBrokerRegistry;
import org.springframework.web.socket.config.annotation.EnableWebSocketMessageBroker;
import org.springframework.web.socket.config.annotation.StompEndpointRegistry;
import org.springframework.web.socket.config.annotation.WebSocketMessageBrokerConfigurer;

@Configuration
@EnableWebSocketMessageBroker
public class WebSocketConfig implements WebSocketMessageBrokerConfigurer {

	@Override
	public void registerStompEndpoints(StompEndpointRegistry registry) {
		// register the /websocket endpoint with SockJS fallback enabled
		registry.addEndpoint("/websocket").setAllowedOrigins("*").withSockJS();
	}

	@Override
	public void configureMessageBroker(MessageBrokerRegistry config) {
		// enable a simple in-memory broker for destinations prefixed with /topic
		config.enableSimpleBroker("/topic");
	}

}


Greeting Service

This class generates a different greeting depending on the time of day. I have also appended a randomly generated string to the actual greeting so that each message is unique, which makes it easy to see that a new message arrives every time.

package com.roytuts.spring.apache.kafka.streams.websocket.stomp.server.service;

import java.util.Calendar;
import java.util.UUID;

import org.springframework.stereotype.Service;

@Service
public class GreetingService {

	public String greet() {
		Calendar c = Calendar.getInstance();

		int timeOfDay = c.get(Calendar.HOUR_OF_DAY);

		StringBuilder sb = new StringBuilder();

		String message = "Have a Good Day";

		if (timeOfDay >= 0 && timeOfDay < 12) {
			message = "Good Morning";
		} else if (timeOfDay >= 12 && timeOfDay < 16) {
			message = "Good Afternoon";
		} else if (timeOfDay >= 16 && timeOfDay < 21) {
			message = "Good Evening";
		} else if (timeOfDay >= 21 && timeOfDay < 24) {
			message = "Good Night";
		}

		sb.append(message).append(" - ").append(generateString());

		return sb.toString();
	}

	private String generateString() {
		return UUID.randomUUID().toString();
	}

}

Send Message

Spring’s KafkaTemplate is auto-configured and can be autowired directly into a bean to send a message.

KafkaTemplate provides several overloaded send() methods, and you can choose one according to your needs.

In a real application the source of data would ideally be different, such as a feed URL, an external web service or anything else, but in this example I am pushing data every 3 seconds using a scheduler to simulate a data feed.

The messages are sent to the topic roytuts-input.

package com.roytuts.spring.apache.kafka.streams.websocket.stomp.server.producer;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.scheduling.annotation.EnableScheduling;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

import com.roytuts.spring.apache.kafka.streams.websocket.stomp.server.service.GreetingService;

@Component
@EnableScheduling
public class MessageProducer {

	@Value("${kafka.input.topic}")
	private String kafkaInputTopic;

	@Autowired
	private GreetingService greetingService;

	@Autowired
	private KafkaTemplate<String, String> kafkaTemplate;

	// push a new greeting every 3 seconds to simulate a data feed
	@Scheduled(fixedRate = 3000)
	public void produce() {
		String msg = greetingService.greet();

		System.out.println("Greeting Message :: " + msg);

		kafkaTemplate.send(kafkaInputTopic, msg);
	}

}

Consume Message

Now we will consume the messages that the Kafka stream processor wrote to the roytuts-output topic.

With the Spring for Apache Kafka infrastructure, a bean method can be annotated with @KafkaListener to create a listener endpoint on a topic.

Finally we will send each message to the STOMP destination /topic/greeting.

package com.roytuts.spring.apache.kafka.streams.websocket.stomp.server.consumer;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.messaging.simp.SimpMessagingTemplate;
import org.springframework.stereotype.Component;

@Component
public class MessageConsumer {

	@Value("${stomp.topic}")
	private String stompTopic; // e.g. /topic/greeting, externalized in application.properties

	@Autowired
	private SimpMessagingTemplate messagingTemplate;

	@KafkaListener(topics = "${kafka.output.topic}", groupId = "${spring.kafka.consumer.group-id}")
	public void consumeMessage(String msg) {
		System.out.println("Message received: " + msg);
		messagingTemplate.convertAndSend(stompTopic, msg);
	}

}

Create Main Class

A main class is a class that has the main method, which starts the application. In our Spring Boot application this main class alone is enough to run the whole application.

package com.roytuts.spring.apache.kafka.streams.websocket.stomp.server;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication(scanBasePackages = "com.roytuts.spring.apache.kafka.streams.websocket.stomp.server")
public class SpringKafkaStreamaRealTimeApp {

	public static void main(String[] args) {
		SpringApplication.run(SpringKafkaStreamaRealTimeApp.class, args);
	}

}

Testing the Application

Make sure your ZooKeeper server and Kafka broker are running before you run the main class.

Running the main class will produce the stream of messages in the console:

Greeting Message :: Good Afternoon - a6bc90a4-0edf-43ec-9e13-2ede13465fdd
Message received: Good Afternoon - a6bc90a4-0edf-43ec-9e13-2ede13465fdd
Greeting Message :: Good Afternoon - 035ab6dd-9b35-4144-a7a7-10c02bace24c
Message received: Good Afternoon - 035ab6dd-9b35-4144-a7a7-10c02bace24c
Greeting Message :: Good Afternoon - 8dc634ac-a316-4de3-af9d-214ee3c60e66
Message received: Good Afternoon - 8dc634ac-a316-4de3-af9d-214ee3c60e66
Greeting Message :: Good Afternoon - 87d57df9-1a8b-4610-90c2-85a62a7a7e67
Message received: Good Afternoon - 87d57df9-1a8b-4610-90c2-85a62a7a7e67
Greeting Message :: Good Afternoon - d48eeccf-b68d-4fdc-ab08-070207237d1c
... and so on

We are done with the server application of our real-time data processing example using Apache Kafka Streams.

Now we will see how to create the client application in Angular 8 to see the push notifications continuously in the browser.

Client Application

Create Project

As described in the prerequisites section on creating an Angular project in a Windows environment, first create an Angular project. The name of the project is spring-apache-kafka-streams-websocket-stomp-client-angular.

Installing Required Modules

Install the required modules with the following commands:

npm install stompjs
npm install sockjs-client
npm install jquery
npm i net -S

The stompjs module is required to communicate over STOMP.

The sockjs-client module is required to establish a connection with the WebSocket server.

The jquery module is required to directly access DOM elements in the HTML page.

The net module is installed to avoid a "Can't resolve 'net'" build error triggered by the stompjs dependency.

Update index.html

We need to declare window as global in the src/index.html file to avoid the below issue:

Uncaught ReferenceError: global is not defined
    at Object../node_modules/sockjs-client/lib/utils/browser-crypto.js

The complete content of src/index.html file is given below:

<!doctype html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <title>Angular Spring Websocket</title>
  <base href="/">
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <link rel="icon" type="image/x-icon" href="favicon.ico">
  <script>
    if (global === undefined) {
      var global = window;
    }
  </script>
</head>
<body>
  <app-root></app-root>
</body>
</html>

Update app.component.html

We will update the src/app/app.component.html file to put a div tag where greeting message will be updated.

<div class="msg"></div>


Update app.component.ts

We will update src/app/app.component.ts file to consume the message over STOMP.

We set the page title by implementing OnInit interface in the ngOnInit() method.

We establish a connection to the WebSocket server, subscribe the client socket to the /topic/greeting destination, where the server publishes the greeting messages, and finally update the div (with class msg) on the HTML page.

import { OnInit, Component } from '@angular/core';
import { Title } from '@angular/platform-browser';

import * as Stomp from 'stompjs';
import * as SockJS from 'sockjs-client';
import $ from 'jquery';

@Component({
  selector: 'app-root',
  templateUrl: './app.component.html',
  styleUrls: ['./app.component.css']
})
export class AppComponent implements OnInit {

  url = 'http://localhost:8080/websocket';
  client: any;
  greeting: string;

  constructor(private title: Title) {
    let ws = new SockJS(this.url);
    this.client = Stomp.over(ws);
    let that = this;
    this.client.connect({}, function (frame) {
      that.client.subscribe('/topic/greeting', (message) => {
        if (message.body) {
          that.greeting = message.body;
          $('.msg').html(that.greeting);
        }
      });
    });
  }

  ngOnInit() {
    this.title.setTitle('Angular Spring Websocket');
  }

}

Here the client simply receives the messages that the server pushes as notifications.

Testing the Application

With your server running, now run the client application by executing the command ng serve --open.

Your application opens at http://localhost:4200 and you will see the message being updated every 3 seconds.

Apache Kafka Streams for building Real Time Data Processing with STOMP over Websocket using Spring and Angular

In the above image the highlighted random string will be continuously changing.

Source Code


Thanks for reading.
