Start-Up Kit for Machine Learning – I

A few days ago, I wrote an article about the “Paradigm Shift,” in which Machine Learning was one of the listed topics. In this article, I will share some information about Machine Learning, or rather a start-up kit for getting started with it. To begin with…

What is Machine Learning?

Before understanding the concept behind Machine Learning and how to program for it, let’s see how we used to work in the traditional programming style.


In traditional programming, we provide the data (inputs) and the program (algorithm) that consumes that data to produce the result on a platform, which is the Machine. The machine only has the ability to execute what the developer provides to it.

How Is Programming for Machine Learning Different From Traditional Programming?

But what if we provide the data (inputs) and some sample results to the Machine, from which it infers the logic of how those inputs yield the sample outputs and builds an algorithm? With the help of that algorithm, it can start predicting results for new inputs. While doing so, it also refines the algorithm to improve its predictions.


But deducing an algorithm is not always as easy as it sounds. The Machine has to do a lot of processing and computation to predict an accurate result.

We will see more details later, but for starters we must know that Machine Learning is not just executing something to get a result; it is the other way around: developing a program based on sample outputs.
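To make this concrete, here is a toy sketch in plain Java (the linear model, the class name, and the numbers are all my own illustration, not from any ML library): given sample inputs and outputs, the program infers the rule instead of being handed it, then predicts for an unseen input.

```java
// A toy "learner": given example inputs x and sample outputs y, infer the rule
// y = a * x by choosing the slope a that minimizes squared error
// (closed form: a = sum(x*y) / sum(x*x)).
public class SlopeLearner {

    // "Training": derive the program (the slope) from data instead of hand-coding it.
    public static double fit(double[] x, double[] y) {
        double sumXY = 0, sumXX = 0;
        for (int i = 0; i < x.length; i++) {
            sumXY += x[i] * y[i];
            sumXX += x[i] * x[i];
        }
        return sumXY / sumXX;
    }

    // "Prediction": apply the learned rule to a new, unseen input.
    public static double predict(double slope, double x) {
        return slope * x;
    }

    public static void main(String[] args) {
        double[] inputs  = {1, 2, 3, 4};
        double[] outputs = {2, 4, 6, 8};        // sample results shown to the machine
        double slope = fit(inputs, outputs);    // machine infers: output = 2 * input
        System.out.println(predict(slope, 10)); // predicts 20.0 for a new input
    }
}
```

Real Machine Learning algorithms are vastly more sophisticated, but the inversion is the same: data and sample outputs go in, the program comes out.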

Usage of Machine Learning:

Continue reading


Need of the hour: The paradigm shift in technology

There come times when we need to move from an existing level to the next, either to stay ahead in the race or to save ourselves from becoming obsolete. This is known as a Paradigm Shift, meaning switching from what you have to a new but feasible level.

And in the world of technology, this shift is nothing short of mandatory, as every other day we see a newer, faster, more error-proof, and more cost-effective technology. In this short article, I would like to bring up some such areas where we are shifting rapidly. This list might grow, and I urge you to keep adding to it in the comments!

Continue reading

PUT or POST – Which one to choose!!!

When we develop a RESTful application, we use the HTTP methods (verbs) to create, modify, or access resources in the application. So, what are these HTTP methods anyway? They simply define the action that the HTTP request will perform on the server. These are some of the HTTP verbs available to perform actions:

  1. GET
  2. POST
  3. PUT
  4. PATCH

Most of these verbs are self-explanatory, right?

But what about POST and PUT? The most common answer is POST for creating resources and PUT for updating them. That is indeed how they are used, but why?

In this article, I will try to explain the difference between POST and PUT…

Continue reading

Maven: A Brief look…

For every application, regardless of whether it is small, big, or huge, we have to follow some procedures or cycles. Configuring these steps manually every time we need to run a cycle is a very cumbersome job: for example, dependency management, pre-deployment validations and checks, etc.

Maven came into existence to eliminate most of these manual efforts and automate the process.

What is Maven?

It is well known as “The Build Tool,” but I believe it is more than just a build tool, given its capabilities. In this tutorial, however, we will stick to the basics of Maven:

  • Create a Maven Project
  • Understand Build Cycle, Build Phase & Goals
  • Understand POM
  • Understand GroupID and ArtifactID
  • Parent POM Concept; Inheritance & Aggregation
  • Simple Maven Project.

Create a Maven Project: 

This is quite easy so I will save it for the last part of the article while creating a simple project, but for now, let’s understand how it works first!!

Understand Build Cycle, Build Phase & Goals:

Build Cycle(s): As I said before, there are lots of procedures we have to follow in a build process, and these are coined “Build Cycles.” A cycle consists of one or more “Build Phases,” which run sequentially; in turn, each phase has one or more “Goals” assigned to it.


A build cycle is defined and can be executed as a whole; it signifies a stage of the build. For example: site is a build cycle responsible for documentation; clean is a build cycle responsible for gracefully cleaning up the Maven directory where the compiled code resides; and default is the cycle that controls base functionality, like validating the dependencies and so on…

Each build cycle contains sequential phases, and we can invoke a phase directly from a build cycle. Build phases run in a sequential manner, so when we execute a build phase, the phases before it are executed first.

mvn install

When the above command is executed, all the phases before “install” are executed first.

Now, Goals… These are the granular-level commands present in each build phase. If a goal is bound to a phase, it runs as part of that phase; we can also invoke a goal directly, using the plugin prefix and the goal name separated by “:” (as in dependency:copy-dependencies).

mvn clean dependency:copy-dependencies

Understand POM (Project Object Model): 

This is an XML file that contains the information required by Maven to create the builds, like the project name, ArtifactID, GroupID, version, dependencies, etc. It also defines the build cycles and phases for building the application. Basically, it contains the what and how of the builds.

This file has to be kept in the root folder of the project and has to be named “pom.xml”. A simple pom.xml would be as follows.
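For instance, a minimal pom.xml could look like this (the com.example coordinates are placeholders, not from a real project):

```xml
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>

  <!-- placeholder coordinates; use your own -->
  <groupId>com.example</groupId>
  <artifactId>demo-app</artifactId>
  <version>1.0.0</version>
  <packaging>jar</packaging>
</project>
```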


pom.xml derives some information from the Super POM, which can be overridden in the project-level pom.xml as well. Later, we will see how to inherit information from a parent POM in other POMs, and also how to aggregate POMs into a parent POM.

Understand GroupID and ArtifactID:

These tags {<groupId>, <artifactId> and <version>} act as the address, or unique identifier, of a POM file. When we create a POM we define:

  • Group ID: usually the organization’s web domain; if it is a common project, we can add the project name as well.
  • Artifact ID: mostly the project name; this is what Maven uses for naming the JAR or WAR file.
  • Version: a revision of the POM/project.

Parent POM Concept; Inheritance & Aggregation:

As Maven encourages the DRY principle, it provides the capability to inherit common properties from a parent project in sub-projects, without repeating the same stuff. This is known as POM inheritance. Let’s see how to do that:
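The essence of inheritance is a <parent> block in the sub-project’s pom.xml (the coordinates below are placeholders); the groupId, version, and common configuration then come from the parent:

```xml
<parent>
  <groupId>com.example</groupId>
  <artifactId>parent-project</artifactId>
  <version>1.0.0</version>
</parent>
```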


Also, there will be scenarios with multiple sub-projects, where we might have to aggregate all the sub-projects’ pom.xml files into a parent pom.xml.
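Aggregation works in the opposite direction: the parent pom.xml lists the sub-projects as <module> entries (module names below are placeholders):

```xml
<packaging>pom</packaging>
<modules>
  <!-- folder names of the sub-projects -->
  <module>child-project-1</module>
  <module>child-project-2</module>
</modules>
```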


NB: <packaging>pom</packaging> – This means that the parent POM will be packaged as a POM and used by reference only.

Hint: It can be very difficult to see which POM a dependency is pulled in from, so to find the effective POM we can use the following command.

mvn help:effective-pom

Before jumping into the code to create a simple Maven project with inheritance and aggregation, we will look at one more Maven concept: its folder structure.
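Maven’s standard directory layout (the conventional skeleton; a project may add more folders) looks like this:

```
my-app
├── pom.xml
└── src
    ├── main
    │   ├── java        (application sources)
    │   └── resources   (configuration files)
    └── test
        └── java        (test sources)
```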


Simple Maven Project:

In this simple project, I will create a Maven project that will be the parent for another sub-project, and in the parent project I will add a common dependency on javax.mail. I will then create a new Maven project that inherits from the parent project’s pom.xml and thus has the parent’s (common) dependencies in it too.

Parent pom.xml
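Something along these lines (coordinates are placeholders; the javax.mail version shown is just one published release):

```xml
<project>
  <modelVersion>4.0.0</modelVersion>

  <groupId>com.example</groupId>
  <artifactId>parent-project</artifactId>
  <version>1.0.0</version>
  <packaging>pom</packaging>

  <modules>
    <module>child-project</module>
  </modules>

  <!-- common dependency, inherited by every child module -->
  <dependencies>
    <dependency>
      <groupId>javax.mail</groupId>
      <artifactId>mail</artifactId>
      <version>1.4.7</version>
    </dependency>
  </dependencies>
</project>
```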


Child pom.xml

The child project has the mail.jar dependency as well, inherited from the parent, even without mentioning it in the child’s dependencies list.
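A sketch of such a child pom.xml (coordinates are placeholders, matching the parent above):

```xml
<project>
  <modelVersion>4.0.0</modelVersion>

  <parent>
    <groupId>com.example</groupId>
    <artifactId>parent-project</artifactId>
    <version>1.0.0</version>
  </parent>

  <artifactId>child-project</artifactId>

  <!-- no <dependencies> needed here: mail.jar comes from the parent -->
</project>
```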


Hope this helps in understanding Maven.

Thanks and Happy Coding,


Spring Boot – Profiles…

Spring Boot is gaining popularity like anything at present, and I believe it will be a persistent player in the coming days as well. There are certain features every technology has that are especially useful in enterprise applications; I am going to write about one of them: “Profiles.”

What are Profiles?

Every enterprise application has many environments, like:

Dev | Test | Stage | Prod | UAT / Pre-Prod

Each environment requires certain settings specific to it. For example, in DEV we do not need to check database consistency constantly, whereas in TEST and STAGE we do. These environment-specific configurations are called Profiles.

How do we maintain Profiles? 

Simple: properties files!
We make a properties file for each environment and set the profile in the application accordingly, so it will pick up the respective properties file. Don’t worry, we will see how to set it up.

This article will show how to set up Profiles for a Spring Boot application.

Let’s start by setting up a Spring Boot application from Spring Starter.


Next, import the project into STS as a Maven project.

In this demo application, we will see how to configure a different database at runtime based on the specific environment, selected via the respective profile.

As a DB connection is better kept in a property file, so that it remains external to the application and can be changed, we will do just that. But Spring Boot by default provides just one property file (application.properties). So how will we segregate the properties based on environment?

The solution is to create more property files, adding the “profile” name as a suffix, and to configure Spring Boot to pick the appropriate properties based on the active profile.

Create three more: application-dev.properties, application-test.properties, and application-prod.properties.


Of course, application.properties will remain the master properties file, but if we override any key in a profile-specific file, the latter will take precedence.

I will now define the DB configuration properties for each environment in its respective properties file, and add code in DBConfiguration.class to pick up the appropriate settings.



In DEV, we will use an in-memory database.
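For example (keys and values here are illustrative stand-ins; the actual files are in the repository):

```properties
# application-dev.properties - in-memory H2 database
spring.datasource.url=jdbc:h2:mem:devdb
spring.datasource.driver-class-name=org.h2.Driver
spring.datasource.username=sa
spring.datasource.password=
```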


In TEST, we will use a lower instance of an RDS MySQL database, and in PROD a higher instance of the MySQL database. (It’s the price that matters…)
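Again as illustrative sketches (hosts and credentials are placeholders):

```properties
# application-test.properties - lower RDS MySQL instance
spring.datasource.url=jdbc:mysql://test-db.example.com:3306/testdb
spring.datasource.username=test_user
spring.datasource.password=<secret>

# application-prod.properties - higher RDS MySQL instance
spring.datasource.url=jdbc:mysql://prod-db.example.com:3306/proddb
spring.datasource.username=prod_user
spring.datasource.password=<secret>
```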


We are done with the properties files; now let’s configure DBConfiguration.class to pick the correct one.

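A sketch of what DBConfiguration.class could look like (the bean bodies and messages are my own stand-ins; the real class is in the linked repository):

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Profile;

@Configuration
public class DBConfiguration {

    // Created only when spring.profiles.active=Dev
    @Profile("Dev")
    @Bean
    public String devDatabaseConnection() {
        return "DB connection for DEV - H2 in-memory";
    }

    @Profile("Test")
    @Bean
    public String testDatabaseConnection() {
        return "DB connection for TEST - low-instance RDS MySQL";
    }

    @Profile("Prod")
    @Bean
    public String prodDatabaseConnection() {
        return "DB connection for PROD - high-instance RDS MySQL";
    }
}
```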

We have used @Profile(“Dev”) to let the system know that this is the bean that should be picked up when we set the application profile to Dev. The other two beans will not be created at all.

One last setting: how do we let the system know whether this is DEV, TEST, or PROD?

To do that, we set the spring.profiles.active key in application.properties, as below.
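A single line in application.properties does it (the value must match the name used in @Profile):

```properties
spring.profiles.active=Dev
```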

From here, Spring Boot will know which profile to pick. Let’s run the application now!

First, run with the profile set to Dev; it should pick the H2 DB.


Change the profile to Prod, and we will see that the MySQL with the HIGH config is picked for the DB, and the message is overridden with the PROD message.


That’s it! We just have to change spring.profiles.active once in application.properties to let Spring Boot know which environment the code is deployed in, and it will do the magic with the settings.

Please visit the repository to access the code and see this in action!

Happy Coding

Netflix Eureka – Microservice – Registry-Discovery

In the headline, we saw three buzzwords.

  1. Microservice
  2. Netflix Eureka
  3. Registry-Discovery

What is a microservice?

In simple words, microservices are clusters of small applications that work together in coordination to provide a complete solution.

When we have a lot of small applications running independently together, each will have its own URL and port. In that scenario, it becomes very cumbersome to keep all these microservices running in synchronization, and more importantly to monitor them all. This problem multiplies when we start introducing load balancers.

To solve this issue, we need a tool that will monitor and maintain a registry of all the microservices in the ecosystem.

What is Netflix Eureka?

This is a tool provided by Netflix as a solution to the above problem. It consists of the Eureka Server and Eureka Clients. The Eureka Server is itself a microservice, to which all the other microservices register. The Eureka Clients are the independent microservices. We will see how to configure this in a microservice ecosystem.

I will be using Spring Boot to create a few microservices that will act as Eureka Clients, and a discovery server that will be the Eureka Server.


Let’s now discuss the Eureka Discovery Server

This is the Eureka Server, and for that we have to include the Eureka dependency in the project. Below is the pom.xml for the Eureka discovery server.
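The relevant part is the Eureka Server starter from Spring Cloud Netflix (the version is managed by the Spring Cloud BOM, so none is listed here):

```xml
<dependency>
  <groupId>org.springframework.cloud</groupId>
  <artifactId>spring-cloud-starter-netflix-eureka-server</artifactId>
</dependency>
```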


Also, we need to update the properties file for this project to indicate that it is a discovery server and not a client.
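These two standard Eureka client flags tell the server not to register with itself and not to fetch a registry:

```properties
eureka.client.register-with-eureka=false
eureka.client.fetch-registry=false
```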


To bind the discovery application to a specific port and name the application we need to add the following as well.
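For example (8761 is the conventional Eureka port; the application name here is my choice):

```properties
server.port=8761
spring.application.name=discovery-server
```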



One last thing to do is to annotate the Spring Boot application to enable it as a Eureka Server. To do so, we add @EnableEurekaServer.
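A sketch of the main class (the class name is assumed):

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.netflix.eureka.server.EnableEurekaServer;

@SpringBootApplication
@EnableEurekaServer
public class DiscoveryServerApplication {
    public static void main(String[] args) {
        SpringApplication.run(DiscoveryServerApplication.class, args);
    }
}
```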


Boot up the application and we will see a UI provided by Eureka listing all the servers that have registered. But at this point, we have none!


Now, let’s add a few microservices into the ecosystem and register them with the discovery server. For this, too, we need to add the required dependencies to each service and register it with the server. We will see the details below.

I have created three simple microservices (microservice1, microservice2, microservice3) with Spring Boot, each one running on its own port (8002, 8003, and 8004).


As a client, each service should register itself with the server, and that happens in its property file, as below.
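For microservice3, something like this (the defaultZone URL assumes the discovery server above runs on localhost:8761):

```properties
spring.application.name=microservice3
server.port=8004
eureka.client.service-url.defaultZone=http://localhost:8761/eureka
```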


And the main application class is annotated with @EnableEurekaClient in each microservice.
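A sketch for one of the clients (the class name is assumed):

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.netflix.eureka.EnableEurekaClient;

@SpringBootApplication
@EnableEurekaClient
public class Microservice3Application {
    public static void main(String[] args) {
        SpringApplication.run(Microservice3Application.class, args);
    }
}
```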


Boot up this application to run on port 8004, and it will automatically register itself with the discovery server. In a similar manner, I have created two more microservices and registered them with the discovery server.


We can see three servers running in the ecosystem, and we can monitor their status too.

This eases the monitoring of all the servers, and of their replicas in case we have used a load balancer.

I hope this helps you get started with a discovery server and clients using Eureka…

Eureka! We are done!!

Here is a reference to the Git repository containing the code used for this demo!

Happy Coding!!

Deploying Spring Boot on Docker

Docker is currently a hot cake in container-based deployment, just as Spring Boot is in microservice development. Together, Spring Boot and Docker form a great combo for developing microservice-based applications. In this article, I will try to explain in very simple words:

  • What Docker is and its benefits.
  • What a Spring Boot application is and how to create a simple one.
  • Hosting the Spring Boot application in a Docker container.


What is Docker?

This is a tool that makes it very easy to deploy and run an application using containers. A container allows a developer to create an all-in-one package of the developed application with all its dependencies. For example, a Java application requires the Java libraries, and when we deploy it on any system or VM, we need to install Java there first. But in a container, everything is kept together and shipped as one package: the Docker container. Read this article for more information about Docker containers.


What is Spring Boot?

Spring Boot is a framework that eases the development of web applications. It has a lot of pre-configured modules that eliminate the manual addition of dependencies when developing an application with Spring. This is the sole reason it is one of the favourites for creating microservices. Let’s now see how to create a Spring Boot application in a few minutes.

Open Spring Starter to create a Java Maven application with the Spring Starter libraries.


Provide the artifact group and name, add “Web” under dependencies, and leave everything else at the defaults; this creates a Maven project with Java and Spring Boot. It will generate a ZIP, which is to be imported into STS as a Maven project.


That’s it!! You have just created a Spring Boot application in the workspace. Now we need to add a simple RestController so we can test the API.

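Something like this (the class name and mapping path are my own; the returned text is what we expect in the browser):

```java
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class DockerDemoController {

    // Responds to GET / with a plain-text message
    @GetMapping("/")
    public String home() {
        return "Simple Spring Boot Application";
    }
}
```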

Upon running the application and accessing the endpoint of the API, we will see the output “Simple Spring Boot Application” in the browser.


We have successfully created and run the application in the IDE’s embedded server; now let’s deploy the same in a Docker container. For this, we have to create a Docker file that will contain the steps to be executed by Docker to create an image of this application, and then we will run that image in Docker.

JAR file of this application:

As we have defined in the pom.xml that the packaging will be of type JAR, let us run the Maven commands to create the JAR file for us.

To do so, first clean up the target folder, then install:

mvn clean       [This can also be done from the IDE: Run As Maven Clean]
mvn install     [This can also be done from the IDE: Run As Maven Install]

These commands will create “dockerdemo.jar” in the target directory of the working directory.


What is a Docker File?

Docker gives the user the capability to create their own Docker images and deploy them in Docker. To create your own Docker image, you have to create your own Docker file. Basically, a Docker file is a simple text file with all the instructions required to build the image.

Here is our Docker file: create a simple text file named “Dockerfile” in the project folder and add the steps below to it.

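Putting the four instructions together (assuming the dockerdemo.jar built earlier):

```dockerfile
FROM java:8
EXPOSE 8080
ADD /target/dockerdemo.jar dockerdemo.jar
ENTRYPOINT ["java", "-jar", "dockerdemo.jar"]
```

Each instruction is explained below.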

FROM java:8

This line means this is a Java application and will require the Java libraries, so Docker will pull the Java 8 base image and build the container on top of it.

EXPOSE 8080

This means that we would like to expose port 8080 to the outside world to access our application.

ADD /target/dockerdemo.jar dockerdemo.jar

ADD <source file to copy into the image> <destination inside the image>

ENTRYPOINT ["java", "-jar", "dockerdemo.jar"]

This sets the command that runs as the entry point: since this is a JAR, we need to run it with java -jar from within Docker.

These are the four steps that will create an image of our Java application so it can run in Docker.

Okay!! We have two pieces ready…

  1. The Java Spring Boot application
  2. The Dockerfile that will create the image to be run in the Docker container

To load these into the Docker container, we first have to build the image and then run it in the container. We need to run certain commands in the folder that contains the Dockerfile.

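The build command looks something like this (the tag “dockerdemo” is my choice; any image name works):

```shell
docker build -t dockerdemo .
```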
This will build our image in Docker and make it available to run in a container.


Now that we have the image ready to run… let’s do that with the following command…
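For example (mapping container port 8080 to the same host port, matching the EXPOSE instruction; the image tag is assumed from the build step):

```shell
docker run -p 8080:8080 dockerdemo
```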

There you go… the Spring Boot application boots up, and the server is running on port 8080.


Here we go… the Spring Boot application is running from the Docker container 🙂

Hope this helps you get started with Spring Boot applications and Docker container deployment.

Happy Coding!