
Saturday, December 23, 2017

Developers Need SDKMAN, Not Super-Man


Every developer has felt the pain of setting up a development environment on his/her machine, with all of its setup steps. Sometimes the pain goes even further when we need to test the same application on multiple versions of SDKs or virtual machines.


If you are a Mac user, you have a great option called the brew installer.




But if you are a Linux user, your pain is unpredictable.






We are Java developers and Linux users, and we share the same pain of setting up a development environment with lots of configuration and different versions of virtual machines.

For the sake of innocent developers, and for the sake of time, we are going to introduce our superhero called SDKMAN, which saves us from the cruel world of setting up development tools.



Technical Introduction:  


SDKMAN! is a tool for managing parallel versions of multiple Software Development Kits on most Unix-based systems. It provides a convenient Command Line Interface (CLI) and API for installing, switching, removing and listing Candidates. SDKMAN is primarily used for JVM-based languages and frameworks; in the future, the team plans to bring SDKMAN to other environments as well. Currently SDKMAN has a huge list of SDKs, which we can get from here.

Install SDKMAN

$ curl -s "https://get.sdkman.io" | bash

$ source "$HOME/.sdkman/bin/sdkman-init.sh"

$ sdk version

Install Java

For installing Java, SDKMAN provides a simple and easy command, as below:


$ sdk install java

Downloading: java 8u152-zulu

In progress...

######################################################################## 100.0%

Repackaging Java 8u152-zulu...

Done repackaging...

Installing: java 8u152-zulu
Done installing!


Setting java 8u152-zulu as default.
root@a33316a976d9:~/.sdkman# java -version
openjdk version "1.8.0_152"
OpenJDK Runtime Environment (Zulu 8.25.0.1-linux64) (build 1.8.0_152-b16)
OpenJDK 64-Bit Server VM (Zulu 8.25.0.1-linux64) (build 25.152-b16, mixed mode)

By default, SDKMAN downloads Zulu, an open-source build of the JDK. But if we need to install some specific version of the JDK, or specifically the Oracle JDK, what can we do?


SDKMAN gives us a way to download SDKs with specific versions as well. We can easily list the existing SDKs that SDKMAN supports and install them as per our requirements.


$ sdk list

================================================================================
Available Candidates
================================================================================
q-quit                                  /-search down
j-down                                  ?-search up
k-up                                    h-help

--------------------------------------------------------------------------------
Ant (1.10.1)                                             https://ant.apache.org/

Apache Ant is a Java library and command-line tool whose mission is to drive
processes described in build files as targets and extension points dependent
upon each other. The main known usage of Ant is the build of Java applications.
Ant supplies a number of built-in tasks allowing to compile, assemble, test and
run Java applications. Ant can also be used effectively to build non Java

... and so on.

$ sdk list java

================================================================================
Available Java Versions
================================================================================
     9.0.1-zulu                                                                    
     9.0.1-oracle                                                                  
     9.0.0-zulu                                                                    
 > * 8u152-zulu                                                                    
     8u151-oracle                                                                  
     8u144-zulu                                                                    
     8u131-zulu                                                                    
     7u141-zulu                                                                    
     6u93-zulu                              
	 
$ sdk install java 8u151-oracle

Oracle requires that you agree with the Oracle Binary Code License Agreement
prior to installation. The license agreement can be found at:

  http://www.oracle.com/technetwork/java/javase/terms/license/index.html

Do you agree to the terms of this agreement? (Y/n): y


Downloading: java 8u151-oracle

In progress...

######################################################################## 100.0%

Repackaging Java 8u151-oracle...

Done repackaging...

Installing: java 8u151-oracle
Done installing!

Do you want java 8u151-oracle to be set as default? (Y/n): y

Setting java 8u151-oracle as default.

$ java -version
java version "1.8.0_151"
Java(TM) SE Runtime Environment (build 1.8.0_151-b12)
Java HotSpot(TM) 64-Bit Server VM (build 25.151-b12, mixed mode)

As it shows, we installed Oracle Java successfully. But, as we discussed at the start of this blog, we can install multiple versions of the same SDK and manage them easily. If we go through the blog again: first we installed OpenJDK, then we installed the Oracle JDK, so we have a single machine with multiple JDKs, and we also set the Oracle JDK as default. So how can we use OpenJDK as per our requirements?

Below are the powerful and easy SDKMAN commands which help us achieve this:


$ sdk list java

================================================================================
Available Java Versions
================================================================================
     9.0.1-zulu                                                                    
     9.0.1-oracle                                                                  
     9.0.0-zulu                                                                    
   * 8u152-zulu                                                                    
 > * 8u151-oracle                                                                  
     8u144-zulu                                                                    
     8u131-zulu                                                                    
     7u141-zulu                                                                    
     6u93-zulu                                                                     
                                                                                   
                                                                                   
                                                                                   
                                                                                   
                                                                                   
                                                                                   

================================================================================
+ - local version
* - installed
> - currently in use
================================================================================


$ java -version
java version "1.8.0_151"
Java(TM) SE Runtime Environment (build 1.8.0_151-b12)
Java HotSpot(TM) 64-Bit Server VM (build 25.151-b12, mixed mode)

$ sdk use java 8u152-zulu

$ java -version
openjdk version "1.8.0_152"
OpenJDK Runtime Environment (Zulu 8.25.0.1-linux64) (build 1.8.0_152-b16)
OpenJDK 64-Bit Server VM (Zulu 8.25.0.1-linux64) (build 25.152-b16, mixed mode)
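Note that `sdk use` switches the version only for the current shell session. To make a version the default for every new shell, SDKMAN also provides the default command; for example, to go back to the Zulu build from above:

$ sdk default java 8u152-zulu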

I am sure that by now you can feel the power of SDKMAN and how easy this tool is to use. It makes a developer's life happy and safe.


References: 

  1. http://sdkman.io/index.html

Sunday, July 2, 2017

Cinnamon: A Way for Monitoring & Metrics Generation for the Akka ActorSystem

We develop huge applications and deploy them on multiple virtual machines and clusters. For monitoring these applications, we enable logging and analyse the logs with the help of tools like the Elastic Stack.
But what if we need to check the health of our application on those virtual machines and clusters? For that, we use several metrics for system health checks, like gauges, histograms and more.

Lightbend Telemetry gives us one way to generate metrics and monitor systems (applications), using Cinnamon plugins. Today we look into Cinnamon for monitoring the Akka ActorSystem. Configuring Cinnamon is not rocket science; there are a few simple steps, which we are going to walk through here.

The first step for using Cinnamon is to create an account on Lightbend, from where we can download credentials and paste them into our home directory (for Linux users). All instructions are defined at this link.

Note: For today's example we are using an sbt project, but we can easily integrate with Maven and Gradle as well.

>>> We need to add the Cinnamon plugin in our project/plugins.sbt file; a minimal sketch (the plugin version here is an assumption, so check the Cinnamon docs for the current one):
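addSbtPlugin("com.lightbend.cinnamon" % "sbt-cinnamon" % "2.5.1")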


>>> Now we need to add some dependencies in our build.sbt; a minimal sketch, with the module names taken from the Cinnamon docs (versions are managed by the plugin):
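enablePlugins(Cinnamon)

// Version-matched Cinnamon modules exposed by the sbt plugin.
libraryDependencies += Cinnamon.library.cinnamonAkka
libraryDependencies += Cinnamon.library.cinnamonCHMetrics

// Attach the Cinnamon agent when running from sbt (sbt 0.13 syntax).
cinnamon in run := true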


>>> Add the Cinnamon configuration in application.conf, as below; a minimal sketch that reports metrics for all user actors to the console (double-check the keys against the docs for your Cinnamon version):
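cinnamon.application = "hello-world-akka"

cinnamon.akka {
  actors {
    "/user/*" {
      report-by = class
    }
  }
}

cinnamon.chmetrics {
  reporters += "console-reporter"
}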


There are lots of options for configuring the monitoring of an actor system; for more information please click on this link.

Example: 

My example is a simple hello-world actor application, but when we run it we see Cinnamon metrics in the console. A minimal sketch of such an application (the actor and system names are illustrative):
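import akka.actor.{Actor, ActorSystem, Props}

// A trivial actor: Cinnamon instruments it once the agent and config are in place.
class HelloActor extends Actor {
  def receive: Receive = {
    case name: String => println(s"Hello, $name!")
  }
}

object Main extends App {
  val system = ActorSystem("hello-system")
  val helloActor = system.actorOf(Props[HelloActor], "hello-actor")
  helloActor ! "Cinnamon"
}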


For running the application we need to execute the sbt run command:
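$ sbt run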


As we can see in the logs, there are 4 types of metrics. Every metric has its own benefits and analysis. These metrics give us a report on the Akka actor system, like actor counts, thread counts, mailbox capacity and more.

For more examples, you can check the github repo.


Saturday, February 25, 2017

Apache Kafka: Multiple Ways to Produce or Push Messages to Kafka Topics

Today, I am going to describe the various ways Apache Kafka offers to put messages into topics. Apache Kafka has support for several languages and also provides APIs for Java; one of the reasons is that Java is the primary language of the JVM and most JVM-based languages have full support for using Java libraries easily.

Kafka has the concepts of topics, partitions etc., which you can explore from the Apache Kafka documentation or the Confluent documentation. For putting messages into a Kafka queue, Kafka supports serialization and various formats for messages. Some of the formats Kafka provides by default, but the recommended format for Kafka is Apache Avro. Avro is a lightweight and type-safe format for serialized data. For more, you can explore Apache Avro.

Kafka has the concept of Producer/Consumer. A Producer produces data to the queue and a Consumer consumes data from the queue. Today we are creating various Kafka Producers to produce data to a Kafka topic.

Prerequisites:

  • Install JDK 8.
  • Download Apache Kafka.
  • Download Zookeeper.
  • Download Confluent Kafka Kit. 
  • IDE
  • Build Tool (We are using SBT; a sketch of the build definition follows this list)
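For the examples in this post, a build.sbt along these lines should do; the exact versions are assumptions from around the time of this post, so adjust them to your setup:

resolvers += "confluent" at "http://packages.confluent.io/maven/"

libraryDependencies ++= Seq(
  "org.apache.kafka" % "kafka-clients" % "0.10.1.1",
  "org.apache.avro" % "avro" % "1.8.1",
  "io.confluent" % "kafka-avro-serializer" % "3.1.1"
)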

I. Simple Producer: 

First, we create a simple Kafka producer for producing messages to a Kafka topic using Java.

import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class SimpleProducer {

    private static Properties kafkaProps = new Properties();
    private static KafkaProducer<String, String> kafkaProducer;

    static {
        kafkaProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        kafkaProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        kafkaProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        kafkaProducer = new KafkaProducer<>(kafkaProps);
    }

    // Send without waiting for, or checking, the result.
    public static void fireAndForget(ProducerRecord<String, String> record) {
        kafkaProducer.send(record);
    }

    // Send with a callback that is invoked once the broker acknowledges the record.
    public static void asyncSend(ProducerRecord<String, String> record) {
        kafkaProducer.send(record, (recordMetaData, ex) -> {
            if (ex != null) {
                ex.printStackTrace();
                return;
            }
            System.out.println("Offset: " + recordMetaData.offset());
            System.out.println("Topic: " + recordMetaData.topic());
            System.out.println("Partition: " + recordMetaData.partition());
            System.out.println("Timestamp: " + recordMetaData.timestamp());
        });
    }

    public static void main(String[] args) throws InterruptedException {
        ProducerRecord<String, String> record1 = new ProducerRecord<>("CustomerCountry",
                "Record 1", "Japan1");

        ProducerRecord<String, String> record2 = new ProducerRecord<>("CustomerCountry",
                "Record 2", "Punjab1");

        fireAndForget(record1);
        asyncSend(record2);

        // Give the async send time to complete before the JVM exits.
        Thread.sleep(10000);
    }
}

In this example, we just produce the data into the queue with Kafka's default `StringSerializer`.

II. Apache Avro Serialization Generic Format: 

For using Apache Avro, we need to create a schema for our messages, because that schema helps us deserialize messages with type safety. For using Avro, various build-tool plugins are available that generate our POJO classes from the Avro schema file. As good practice, we should use those tools, because sometimes our messages may be too complex, and doing it manually we will always end up with a mistake.

NOTE: For using Avro, we need to start the Confluent Schema Registry server, because the registry server is used to manage message schemas, while the messages themselves are managed by Kafka queues. For more details, please visit the documentation. With the Confluent distribution, starting the stack looks roughly like this (the paths are assumptions; adjust them to your install):
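$ ./bin/zookeeper-server-start ./etc/kafka/zookeeper.properties
$ ./bin/kafka-server-start ./etc/kafka/server.properties
$ ./bin/schema-registry-start ./etc/schema-registry/schema-registry.properties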

For SBT, I am using the sbt-avro plugin; it depends on which build tool you are using, or you can also write the generated classes manually. A plugins.sbt sketch is below (the exact coordinates and version are assumptions, so check the plugin's docs):
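addSbtPlugin("com.cavorite" % "sbt-avro" % "1.1.0")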
As we discussed, for Avro we need to create a schema first; in my example I have the following schema:

{"namespace": "com.harmeetsingh13.java",
    "type": "record",
    "name": "Customer",
    "fields": [{"name": "id","type": "int"},
            {"name": "name","type": "string"}]
}

By using the sbt-avro plugin, my POJO class is generated automatically. Now we create our Kafka producer using Avro. Since this section covers the generic format, the example below uses Avro's GenericRecord API; the class and topic names are illustrative:

import java.io.IOException;
import java.util.Properties;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

import io.confluent.kafka.serializers.KafkaAvroSerializer;

public class AvroGenericProducer {
    private static Properties kafkaProps = new Properties();
    private static KafkaProducer<String, GenericRecord> kafkaProducer;

    static {
        kafkaProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        kafkaProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        kafkaProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class);
        // The Avro serializer registers and looks up schemas here.
        kafkaProps.put("schema.registry.url", "http://localhost:8081");
        kafkaProducer = new KafkaProducer<>(kafkaProps);
    }

    public static void asyncSend(ProducerRecord<String, GenericRecord> record) {
        kafkaProducer.send(record, (recordMetaData, ex) -> {
            if (ex != null) {
                ex.printStackTrace();
                return;
            }
            System.out.println("Offset: " + recordMetaData.offset());
            System.out.println("Topic: " + recordMetaData.topic());
            System.out.println("Partition: " + recordMetaData.partition());
            System.out.println("Timestamp: " + recordMetaData.timestamp());
        });
    }

    public static void main(String[] args) throws InterruptedException, IOException {
        // With the generic API we build records against the schema at runtime;
        // no generated POJO class is required.
        Schema schema = new Schema.Parser().parse(AvroGenericProducer.class
                .getClassLoader().getResourceAsStream("avro/customer.avsc"));

        GenericRecord customer1 = new GenericData.Record(schema);
        customer1.put("id", 1001);
        customer1.put("name", "Jimmy");

        GenericRecord customer2 = new GenericData.Record(schema);
        customer2.put("id", 1002);
        customer2.put("name", "James");

        asyncSend(new ProducerRecord<>("AvroGenericProducerTopic", "KeyOne", customer1));
        asyncSend(new ProducerRecord<>("AvroGenericProducerTopic", "KeyOne", customer2));

        Thread.sleep(1000);
    }
}

In this example, the one thing we need to note is that for serializing our message key we still use Kafka's `StringSerializer` class, because when we try to deserialize a plain string using Avro's `KafkaAvroDeserializer` we face this issue:

Error deserializing Avro message for id 351 org.apache.kafka.common.errors.SerializationException: 
Error deserializing Avro message for id 351 Caused org.apache.kafka.common.errors.SerializationException: 
string specified by the writers schema could not be instantiated to find the readers schema

For more details, you can look into this discussion.

As we discussed, there are various ways to serialize messages to a Kafka queue using Avro. In the above example we used the generic way of message serialization, but we can serialize messages in a more specific way as well, which we discuss in the next examples.

III: Apache Avro Serialization Specific Format One: 

Another Avro example serializes messages in a specific way, using the generated `Customer` POJO, as below:

import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

import io.confluent.kafka.serializers.KafkaAvroSerializer;

public class AvroSpecificProducerOne {
    private static Properties kafkaProps = new Properties();
    private static KafkaProducer<String, Customer> kafkaProducer;

    static {
        kafkaProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        kafkaProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        kafkaProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class);
        kafkaProps.put("schema.registry.url", "http://localhost:8081");
        kafkaProducer = new KafkaProducer<>(kafkaProps);
    }

    public static void fireAndForget(ProducerRecord<String, Customer> record) {
        kafkaProducer.send(record);
    }

    public static void asyncSend(ProducerRecord<String, Customer> record) {
        kafkaProducer.send(record, (recordMetaData, ex) -> {
            if (ex != null) {
                ex.printStackTrace();
                return;
            }
            System.out.println("Offset: " + recordMetaData.offset());
            System.out.println("Topic: " + recordMetaData.topic());
            System.out.println("Partition: " + recordMetaData.partition());
            System.out.println("Timestamp: " + recordMetaData.timestamp());
        });
    }

    public static void main(String[] args) throws InterruptedException {
        // Customer is the POJO generated from avro/customer.avsc by sbt-avro.
        Customer customer1 = new Customer(1001, "Jimmy");
        Customer customer2 = new Customer(1002, "James");

        ProducerRecord<String, Customer> record1 = new ProducerRecord<>("AvroSpecificProducerOneTopic",
                "KeyOne", customer1);
        ProducerRecord<String, Customer> record2 = new ProducerRecord<>("AvroSpecificProducerOneTopic",
                "KeyOne", customer2);

        asyncSend(record1);
        asyncSend(record2);

        Thread.sleep(1000);
    }
}

IV: Apache Avro Serialization Specific Format Two: 

Another way to serialize messages using Avro is to encode the record to bytes manually and send the raw bytes, as below:

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.Properties;

import org.apache.avro.Schema;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.EncoderFactory;
import org.apache.avro.specific.SpecificDatumWriter;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.ByteArraySerializer;
import org.apache.kafka.common.serialization.StringSerializer;

public class AvroSpecificProducerTwo {

    private static Properties kafkaProps = new Properties();
    private static KafkaProducer<String, byte[]> kafkaProducer;

    static {
        kafkaProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        kafkaProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        // We encode the Avro bytes ourselves, so the value goes out as raw bytes
        // and no schema registry is involved.
        kafkaProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, ByteArraySerializer.class);
        kafkaProducer = new KafkaProducer<>(kafkaProps);
    }

    public static void fireAndForget(ProducerRecord<String, byte[]> record) {
        kafkaProducer.send(record);
    }

    public static void asyncSend(ProducerRecord<String, byte[]> record) {
        kafkaProducer.send(record, (recordMetaData, ex) -> {
            if (ex != null) {
                ex.printStackTrace();
                return;
            }
            System.out.println("Offset: " + recordMetaData.offset());
            System.out.println("Topic: " + recordMetaData.topic());
            System.out.println("Partition: " + recordMetaData.partition());
            System.out.println("Timestamp: " + recordMetaData.timestamp());
        });
    }

    // Serialize a Customer to Avro binary using the schema from the classpath.
    private static byte[] convertCustomerToAvroBytes(Customer customer) throws IOException {
        Schema schema = new Schema.Parser().parse(AvroSpecificProducerTwo.class
                .getClassLoader().getResourceAsStream("avro/customer.avsc"));

        SpecificDatumWriter<Customer> writer = new SpecificDatumWriter<>(schema);
        try (ByteArrayOutputStream os = new ByteArrayOutputStream()) {
            BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(os, null);
            writer.write(customer, encoder);
            encoder.flush();

            return os.toByteArray();
        }
    }

    public static void main(String[] args) throws InterruptedException, IOException {
        Customer customer1 = new Customer(1001, "Jimmy");
        Customer customer2 = new Customer(1002, "James");

        byte[] customer1AvroBytes = convertCustomerToAvroBytes(customer1);
        byte[] customer2AvroBytes = convertCustomerToAvroBytes(customer2);

        ProducerRecord<String, byte[]> record1 = new ProducerRecord<>("AvroSpecificProducerTwoTopic",
                "KeyOne", customer1AvroBytes);
        ProducerRecord<String, byte[]> record2 = new ProducerRecord<>("AvroSpecificProducerTwoTopic",
                "KeyOne", customer2AvroBytes);

        asyncSend(record1);
        asyncSend(record2);

        Thread.sleep(1000);
    }
}

There are still multiple ways to serialize messages for Kafka using Avro or other message serializers. In the next post, we will discuss Kafka consumers and the various ways to consume messages from a Kafka queue using Avro.

For the above examples, you can also download the code from the github repo.