Micronaut Kafka Consumer and Producer Example.

Micronaut is a Java framework that has become popular for developing microservice-based applications because of its low memory footprint and fast startup.

In this article, we will see how to write simple Kafka consumers and producers using the Micronaut framework.

You can read my other articles on the Micronaut framework at https://www.techwasti.com/.

Start by generating a project using https://micronaut.io/launch/.

You can create a project either using the Launch site or using the CLI tool:

$ mn create-app techwastikafkaexample --features kafka

The Micronaut version for this demo is 2.0.0, with Java 8.

The CLI also provides a powerful option to generate the project with the Kafka profile:

$ mn create-app techwasti-kafka-service --profile kafka

Prerequisites:-

  1. Java programming
  2. Kafka
  3. Micronaut

I assume you are already familiar with these; if not, it is worth learning the basics before continuing.

Micronaut features dedicated support for defining both Kafka Producer and Consumer instances.
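Before defining producers or consumers, the application needs to know where the Kafka broker runs. A minimal sketch of the connection settings in application.yml, assuming a local broker on the default port:

kafka:
  bootstrap:
    servers: localhost:9092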

Kafka Producer:-

We will create a simple producer using annotations.

package com.techwasti.kafkaex;

import io.micronaut.configuration.kafka.annotation.KafkaClient;
import io.micronaut.configuration.kafka.annotation.KafkaKey;
import io.micronaut.configuration.kafka.annotation.Topic;

@KafkaClient
public interface GreetMessageClient {

    // greet is a kafka topic
    @Topic("greet")
    void sendGreetMessage(@KafkaKey String day, String message);

    void sendGreetMessage(@Topic String topic, @KafkaKey String day, String message);
}

The @KafkaClient annotation marks the interface as a Kafka client.

The @Topic annotation indicates the topic to which the message should be published.

The @KafkaKey annotation marks the argument that carries the message key.

In the above code snippet we have defined two methods:

  1. The first method accepts two arguments, a key and a value, and the topic name is declared with the @Topic annotation.
  2. The second method, instead of annotating the topic name, accepts the topic name as an argument.

If you omit the @KafkaKey argument, the key is null.

One of the nice things about the Micronaut framework is that it produces an implementation of the above client interface at compile time. We can retrieve this instance either by looking up the bean from the ApplicationContext or by injecting it with @Inject.

GreetMessageClient client = applicationContext.getBean(GreetMessageClient.class);
client.sendGreetMessage("Thursday", "Good morning");
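Alternatively, you can let Micronaut inject the client. A minimal sketch, assuming a hypothetical controller class (not part of the generated project) and that the Micronaut HTTP server dependency is on the classpath:

package com.techwasti.kafkaex;

import io.micronaut.http.annotation.Controller;
import io.micronaut.http.annotation.Get;

import javax.inject.Inject;

@Controller("/greet")
public class GreetController {

    // Micronaut injects the compile-time generated implementation of the @KafkaClient interface
    @Inject
    GreetMessageClient client;

    // GET /greet/{day} publishes a greeting for the given day
    @Get("/{day}")
    public String greet(String day) {
        client.sendGreetMessage(day, "Good morning");
        return "Greeting sent for " + day;
    }
}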

Now our producer is ready, and we have successfully sent a message.

Let us create Kafka Consumer.

Kafka Consumer:-

We have seen a couple of annotations used to create a producer and publish a message to a topic. In the same way, the @KafkaListener annotation is used to create a Kafka consumer.

package com.techwasti.kafkaex;

import io.micronaut.configuration.kafka.annotation.KafkaKey;
import io.micronaut.configuration.kafka.annotation.KafkaListener;
import io.micronaut.configuration.kafka.annotation.OffsetReset;
import io.micronaut.configuration.kafka.annotation.Topic;

@KafkaListener(offsetReset = OffsetReset.EARLIEST)
public class GreetMessageConsumer {

    @Topic("greet")
    public void receive(@KafkaKey String day, String message) {
        System.out.println("Got Message for the  - " + day + " and Message is  " + message);
    }
}

@KafkaListener indicates that this class is a Kafka consumer reading messages from the topic "greet"; with offsetReset set to EARLIEST, it starts reading messages from the beginning of the topic.

The receive method has two arguments: one is the key and the other is the message.

This is a simple example of Kafka consumers and producers.

Advanced Options for producer and consumer:

@Header: adds a header to the Kafka message.

Suppose we want to add a header when producing a message, for example an authentication token that should accompany every message published over Kafka. In that case the header can be declared on the client:

@Header(name = "JWT-Token", value = "${my.authentication.token}")

Alternatively, you can pass the header as a method argument, just like the topic name.
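As a sketch, the header value can also be supplied per call by adding an extra parameter (the token parameter here is hypothetical, and @Header comes from io.micronaut.messaging.annotation):

@Topic("greet")
void sendGreetMessage(@KafkaKey String day, String message, @Header("JWT-Token") String token);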

@Body: explicitly indicates the message body.

Generally, the value sent by the producer is resolved using the @Body annotation; if it is not present, the first argument that is not otherwise annotated is treated as the message body.

e.g 

@Topic("greet")
void sendGreetMessage(@KafkaKey String day, String message);

or

@Topic("greet")
void sendGreetMessage(@KafkaKey String day, @Body String message);

For more options, please visit the Micronaut Kafka documentation.

Reference:

https://micronaut-projects.github.io/micronaut-kafka/latest/guide/

Spring Boot Neo4j Reactive CRUD.

This article is about Spring Data for the Neo4j database. Neo4j is a popular graph database.


The existing Spring Data Neo4j module supports only the imperative style and is currently in maintenance mode.

Prerequisites:-

If you are reading this article, you have probably at least heard about Neo4j and Spring Boot. Below are the prerequisites:

  1. Neo4j (https://neo4j.com/graphacademy/online-training/introduction-to-neo4j-40/)
  2. Neo4j installed locally, or use the Neo4j sandbox.
  3. Knowledge of Spring Data and Spring Boot.
  4. For this example, we are using JDK 11.

If any of these are unfamiliar, I recommend exploring them first and then coming back.

In this example, I am using Neo4j sandbox environment: https://neo4j.com/sandbox/

Advantages of using SDN-Rx:

  1. It supports both imperative and reactive development.
  2. Built-in OGM (object graph mapping), very lightweight.
  3. Supports immutable entities for both Java and Kotlin.

Maven/Gradle Dependencies:-

Right now, the Spring Data Neo4j Reactive starter is not yet part of the official Spring repositories, so it is not available on the Spring Initializr website and we have to add it manually.

## maven dependency
<dependency>
    <groupId>org.neo4j.springframework.data</groupId>
    <artifactId>spring-data-neo4j-rx-spring-boot-starter</artifactId>
    <version>1.1.1</version>
</dependency>
## gradle 
dependencies {
    compile 'org.neo4j.springframework.data:spring-data-neo4j-rx-spring-boot-starter:1.1.1'
}

Prepare Database:-

For this article, we are using the standard Neo4j movie graph database because it is small and available both in the sandbox and in your local installation.

Use this command to load it:

:play movies

Execute the command; the guide is an interactive slide deck, so the data loads seamlessly. The movie database contains data such as movie names, release dates, cast and crew, directors, and ratings given by different individuals or rating companies. The minimal schema relation looks like this:

(:Person {name})-[:ACTED_IN {roles}]->(:Movie {title,released})
movie DB schema
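To get a feel for the data, you can run a simple Cypher query against this schema in the Neo4j browser, for example listing the actors of one movie:

MATCH (p:Person)-[:ACTED_IN]->(m:Movie {title: "The Matrix"})
RETURN p.name, m.title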

Create Project:

The best way to start a Spring Boot project is start.spring.io. Create a Spring Boot project there.

Do not choose Spring Data Neo4j here, as it will show the legacy generation of Spring Data Neo4j that has only imperative support.

Once your project is ready, add the Spring Data Neo4j RX dependency to your POM or build.gradle.

Configurations:

Put your database-specific configuration in application.properties:

org.neo4j.driver.uri=neo4j://localhost:7687
org.neo4j.driver.authentication.username=neo4j
org.neo4j.driver.authentication.password=password
spring.data.neo4j.repositories.type=reactive

Domain Entity:

All our configuration is done; now let us define the domain entity. As stated, we are using the movie database, so we create Movie as a domain entity with a few properties.

Entities are nodes.

package com.techwasti.entity;

import org.neo4j.springframework.data.core.schema.Id;
import org.neo4j.springframework.data.core.schema.Node;
import org.neo4j.springframework.data.core.schema.Property;
import org.neo4j.springframework.data.core.schema.Relationship;

import java.util.HashSet;
import java.util.Set;

import static org.neo4j.springframework.data.core.schema.Relationship.Direction.INCOMING;

@Node("Movie")
public class Movie {

    @Id
    private final String mtitle;

    @Property("tagline")
    private final String tagline;

    @Relationship(type = "ACTED_IN", direction = INCOMING)
    private Set<Person> actors = new HashSet<>();

    @Relationship(type = "DIRECTED", direction = INCOMING)
    private Set<Person> directors = new HashSet<>();

    public Movie(String title, String tagline) {
        this.mtitle = title;
        this.tagline = tagline;
    }

    public String getTitle() {
        return mtitle;
    }

    public String getTagline() {
        return tagline;
    }
    
    public Set<Person> getActors() {
        return actors;
    }

    public void setActors(Set<Person> actors) {
        this.actors = actors;
    }

    public Set<Person> getDirectors() {
        return directors;
    }

    public void setDirectors(Set<Person> directors) {
        this.directors = directors;
    }
}

In the movie entity, we defined a movie name, tagline, actors, and directors.

The @Node annotation marks the class as a managed node. The @Id annotation identifies the unique property, and the relationships are defined using the @Relationship annotation. In the same way, we have a Person entity that contains two fields.

package com.techwasti.entity;

import org.neo4j.springframework.data.core.schema.Id;
import org.neo4j.springframework.data.core.schema.Node;

@Node("Person")
public class Person {

    @Id
    private final String name;

    private final Integer born;

    public Person(Integer born, String name) {
        this.born = born;
        this.name = name;
    }

    public String getName() {
        return name;
    }

    public Integer getBorn() {
        return born;
    }
}

In these entities, we defined only one-way relationships to keep the demonstration simple, but you can also model the entities to express two-way relationships.

Let us create a repository class then.

package com.techwasti.dao;

import com.techwasti.entity.Movie;
import reactor.core.publisher.Mono;
import org.neo4j.springframework.data.repository.ReactiveNeo4jRepository;

public interface MovieRepository extends ReactiveNeo4jRepository<Movie, String> {
    Mono<Movie> findOneByTitle(String title);
}

To demonstrate the reactive programming style, we used ReactiveNeo4jRepository, which is the reactive repository implementation.
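The article does not walk through the web layer, but a minimal sketch of a reactive controller backing the endpoints below could look like this (class and method names are illustrative, and spring-boot-starter-webflux is assumed to be on the classpath):

package com.techwasti.web;

import com.techwasti.dao.MovieRepository;
import com.techwasti.entity.Movie;
import org.springframework.web.bind.annotation.DeleteMapping;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

@RestController
@RequestMapping("/movies")
public class MovieController {

    private final MovieRepository movieRepository;

    public MovieController(MovieRepository movieRepository) {
        this.movieRepository = movieRepository;
    }

    // GET /movies streams all movies reactively
    @GetMapping
    public Flux<Movie> getMovies() {
        return movieRepository.findAll();
    }

    // DELETE /movies/{title} deletes a movie by its title (the @Id property)
    @DeleteMapping("/{title}")
    public Mono<Void> deleteMovie(@PathVariable String title) {
        return movieRepository.deleteById(title);
    }
}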

You can hit below endpoints to see the output:

GET http://localhost:8080/movies

DELETE http://localhost:8080/movies/The Matrix

This is it for now.

References:-

https://neo4j.com/developer/spring-data-neo4j-rx/
https://neo4j.com/developer/spring-data-neo4j/

https://spring.io/guides/gs/accessing-data-neo4j/

Micronaut Launch: The best way to get started.

Micronaut has launched a website that lets you generate a Micronaut project without installing the Micronaut CLI SDK.

We have covered Micronaut in a couple of previous blog posts; if you are new to it, you can check those out first.

Micronaut is very similar to the Spring framework. Micronaut took inspiration from Spring, and most of the APIs are closely aligned, which is why adopting Micronaut is very easy for Spring developers. Just as we have start.spring.io to create Spring or Spring Boot projects, Micronaut has launched its own website, aka Micronaut Launch.

https://micronaut.io/launch/

As we know, we can generate a Micronaut project using the CLI, but we can get the same benefit from the website as well.

On the site we have different options; a few are listed below.

  1. Application type
  2. Java version
  3. Base Package
  4. Name of application
  5. etc

1. Application type:-

Application type is where we specify which type of application we want, such as a standard application (web or otherwise), a CLI application, a serverless function, a gRPC application, or a messaging application. The application type helps organize the dependencies.

2. Java Version:-

The Java version is where we specify which JDK we want to develop the application against, e.g. Java 8, 11, or 14.

3. Base Package:-

The base package is where we specify the package of the application under which we want to organize our classes and interfaces.

e.g. com.techwasti.micronaut.demo

4. Name:-

Here we specify the name of the application.

e.g. HelloworldLaunch.

5. Micronaut Version:-

This is the Micronaut version our application should be built against; the latest one as I write this blog post is 2.0.0.

6. Language:-

Select the language in which you want to write your code; right now Micronaut supports Java, Kotlin, and Groovy.

7. Build Tool:-

Select the build tool, either Maven or Gradle.

8. Test Framework:-

Here we choose the test framework from the list: JUnit, Spock, or KotlinTest.

9. Features:

When you click on the Features button, a popup opens.

Features are organized into different groups such as cache, config, database, etc.

10. Diff:-

This is an interesting option that shows the diff: the changes that the selected features make to an application compared with one generated without any features selected.

11. Preview:-

Another useful option the site provides is a preview of your project based on your selections.

The final option is to generate the project; once you click it, you get a zip file. After extracting the zip, you will get a structure like the one below.
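A typical generated project looks roughly like this (file names depend on the build tool and features you selected):

helloworldlaunch/
├── Dockerfile
├── micronaut-cli.yml
├── .gitignore
├── build.gradle (or pom.xml)
└── src/
    ├── main/java/...                  (application sources, including the Application class)
    ├── main/resources/application.yml
    └── test/java/...                  (tests)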

Here we have the Dockerfile, build file, and .gitignore along with the source directory structure. Download and import this into your IDE of choice (Eclipse, IntelliJ) and happy coding.

This is it for now. Let me know your findings, if any.

Spring Boot Cloud Native Buildpacks and Layered Jars.

In May 2020, Spring Boot 2.3 was released with some interesting features. There are many, but in this article we will talk about support for building OCI images using Cloud Native Buildpacks.

Cloud Native Buildpacks?

These days cloud migration, and with it cloud-native application development, is becoming a trend.

Cloud-Native Buildpacks
transform your application source code to images that can run on any cloud.

The Cloud Native Buildpacks definition from https://buildpacks.io/:

The Cloud Native Buildpacks project was initiated by Pivotal and Heroku in January 2018 and joined the Cloud Native Sandbox in October 2018. The project aims to unify the buildpack ecosystems with a platform-to-buildpack contract that is well-defined and that incorporates learnings from maintaining production-grade buildpacks for years at both Pivotal and Heroku.

credit goes to https://buildpacks.io/

Cloud-Native Buildpacks embrace modern container standards, such as the OCI(Open container initiative) image format. They take advantage of the latest capabilities of these standards, such as cross-repository blob mounting and image layer “rebasing” on Docker API v2 registries.

All the above information is from the Buildpacks website. In fewer words: a buildpack transforms your source code into runnable container images.

The Paketo Java buildpack is used by default to create an image.

Prerequisites for this example:

  1. Java
  2. Docker
  3. Any IDE if you want.

Note:- For this demo, I am using Spring Boot 2.3.1, JDK 8, and Maven.

As always, for Spring, start with https://start.spring.io/.

I have imported the project into the VSCode. If you want to learn about Spring tools for Visual Studio Code, please go through this link: https://www.techwasti.com/spring-tools-4-for-visual-studio-code/

Create One REST Controller:-

As part of this article, our focus is on buildpacks, not on complex coding.

We have a simple controller that returns the current date.

package com.techwasti.spring.buildpackex.springboot23ocibuildpackex;

import java.util.Date;

import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class CurrentDateController{


    @GetMapping("/gettodaysdate")
     public String getTodaysDate(){
        return new Date().toString();
     }
}

Output: Sun Jul 19 09:00:36 IST 2020

Build the image for our app:

As mentioned above, and as per the Spring Boot documentation, we will build an image. Package the source code and build the image according to the OCI standard using the Maven task:

$ mvn spring-boot:build-image

When you run this command, everything is handled by the Spring Boot build-image task. After the image builds successfully, you will see something like this in your logs:

Successfully built image 'docker.io/library/spring-boot23-oci-buildpack-ex:0.0.1-SNAPSHOT'

We can validate our docker image using the below command.

$ docker images| grep spring

Output
spring-boot23-oci-buildpack-ex 0.0.1-SNAPSHOT ddabb93c2218 40 ago 231MB

Now our image is ready; let us run the image and create a container:

$ docker run -d -p 8080:8080 --name springbuildpackex spring-boot23-oci-buildpack-ex:0.0.1-SNAPSHOT

Once the container is ready, verify it using:

$ docker ps

Hit REST API endpoint http://localhost:8080/gettodaysdate

You can also hit the actuator endpoints:

http://localhost:8080/actuator/metrics

http://localhost:8080/actuator/info
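These endpoints are only available if the Spring Boot Actuator starter is on the classpath; if it is not already in your POM, it can be added as below. Note that by default only the health and info endpoints are exposed over HTTP, so exposing metrics may additionally require management.endpoints.web.exposure.include=info,metrics in application.properties.

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-actuator</artifactId>
</dependency>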

Since Spring Boot 2.3.0.RC1, the Paketo Java buildpack has been used by default to create images.

You can check your local Docker: when you run the command below, you can see that the Paketo images were downloaded.

$ docker images| grep gcr
gcr.io/paketo-buildpacks/run       base-cnb                c8c8215efa6f        8 days ago          71.1MB
gcr.io/paketo-buildpacks/builder base-platform-api-0.3 e49209451fa6 40 years ago 696MB

Customize buildpack configuration:

We have seen that by default the name of the image is based on the artifactId and the tag is the version of our Maven project. The image name here is spring-boot23-oci-buildpack-ex:0.0.1-SNAPSHOT, following the pattern:

docker.io/library/${project.artifactId}:${project.version}

In a real-life project, you would likely push the OCI image to a specific Docker registry internal to your organization; here I am using Docker Hub, the public central registry. You can configure parameters such as the name of the Docker image in pom.xml:

<plugin>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-maven-plugin</artifactId>
    <configuration>
        <image>
            <name>docker.io/maheshwarligade/spring-boot23-oci-buildpack-ex</name>
        </image>
    </configuration>
</plugin>

With these custom tags and names, I can push this image to the docker hub.

docker push docker.io/maheshwarligade/spring-boot23-oci-buildpack-ex

You can also set the image name from the command line:

$ mvn spring-boot:build-image -Dspring-boot.build-image.imageName=enterprise.com/library/domain/spring-boot23-oci-buildpack-ex

We can also configure the buildpack builder used to build the image with the configuration parameter below:

<configuration>
    <image>
        <builder>cloudfoundry/cnb</builder>
    </image>
</configuration>

Proxy Configuration:

If any proxy is configured between the Docker daemon the builder runs in and the network locations that buildpacks download artifacts from, you will need to configure the builder to use the proxy. When using the default builder, this can be accomplished by setting the HTTPS_PROXY and/or HTTP_PROXY environment variables:

<configuration>
    <image>
        <env>
            <HTTP_PROXY>http://proxy.example.com</HTTP_PROXY>
            <HTTPS_PROXY>https://proxy.example.com</HTTPS_PROXY>
        </env>
    </image>
</configuration>

Layered Jars:

We have seen above that we used a buildpack to create the image, but you might not want to use a buildpack; perhaps you want to use a Dockerfile-based tool already in use within your organization to create the application image. Spring wanted to make it easier to create optimized Docker images that can be built with a regular Dockerfile, so it has added support for layered jars.

Traditionally, the approach to create a Docker image for a Spring Boot application is to take the fat jar, copy it into a Dockerfile, and add a command to execute it.
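For contrast, a typical single-layer Dockerfile for such a fat jar looks roughly like this (illustrative only; any code change invalidates the whole jar layer and forces it to be rebuilt):

FROM adoptopenjdk:11-jre-hotspot
ARG JAR_FILE=target/*.jar
COPY ${JAR_FILE} app.jar
ENTRYPOINT ["java", "-jar", "/app.jar"]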

The jar is organized into three main parts:

  • Classes used to bootstrap jar loading
  • Your application classes in BOOT-INF/classes
  • Dependencies in BOOT-INF/lib

Since this format is unique to Spring Boot, version 2.3.0.M1 introduced a new layout type called LAYERED_JAR.

As we know, a Docker image is made of layers, and when we rebuild an image during development only the layers where changes happened should be rebuilt, instead of rebuilding the fat jar layer again and again. The layered jar type is designed to separate code based on how likely it is to change between application builds. Library code is less likely to change between builds, so it is placed in its own layers to allow tooling to re-use the layers from the cache. Application code is more likely to change between builds, so it is isolated in a separate layer. The layers are:

  • dependencies (for regularly released dependencies)
  • snapshot-dependencies (for snapshot dependencies)
  • resources (for static resources)
  • application (for application classes and resources)

Enable the layered layout in the build file:

<build>
    <plugins>
        <plugin>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-maven-plugin</artifactId>
            <configuration>
                <layout>LAYERED_JAR</layout>
            </configuration>
        </plugin>
    </plugins>
</build>

Build the jar for the application:

$ mvn clean package

We can inspect the layers of the jar using jarmode:

$ java -Djarmode=layertools -jar target/spring-boot23-oci-buildpack-ex-0.0.1-SNAPSHOT.jar list

Output

dependencies
snapshot-dependencies
resources
application

Based on this, we can craft a Dockerfile similar to the one below:

FROM adoptopenjdk:11-jre-hotspot as builder
WORKDIR application
ARG JAR_FILE=target/*.jar
COPY ${JAR_FILE} application.jar
RUN java -Djarmode=layertools -jar application.jar extract

FROM adoptopenjdk:11-jre-hotspot
WORKDIR application
COPY --from=builder application/dependencies/ ./
COPY --from=builder application/snapshot-dependencies/ ./
COPY --from=builder application/resources/ ./
COPY --from=builder application/application/ ./
ENTRYPOINT ["java", "org.springframework.boot.loader.JarLauncher"]

Summary:

Here we have seen multiple ways to create an image of our Spring Boot application. Between buildpacks, Dockerfiles, and existing plugins such as Jib, there is no single best way. Each approach has pros and cons, and we should choose the tool based on ease of use and simplicity.

Source Code: https://github.com/maheshwarLigade/spring-boot23-oci-buildpack-ex

References: