Friday, December 8, 2017

Writing Reactive Repositories for Spring Data with Mongodb

1. Overview

Reactive Programming has been alive for some time now; frameworks like Akka, Reactive Streams, Reactor and RxJava are good examples. In simple terms, reactive programming is about writing non-blocking software that is asynchronous and event-driven.
Reactive Programming requires only a small number of threads, scaling vertically (up inside a single JVM) instead of horizontally (out to different nodes by means of clustering).
With Spring 5.0 there is out-of-the-box support for Reactive Programming, and the Spring Data project now has reactive support as well. We will be looking at these latest features here in detail.

2. Setup

In order to use Spring Data Reactive Repositories we need to include spring-boot-starter-data-mongodb-reactive, de.flapdoodle.embed.mongo (for testing), rxjava and rxjava-reactive-streams. In addition, the reactive MongoDB driver is needed to make full use of the reactive capabilities. The Maven dependencies will look like below:
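Something along these lines; the artifact list matches the post, while the group IDs are the usual Maven coordinates and the versions are left to your own dependency management or the Spring Boot parent:

<dependencies>
    <!-- Versions omitted here; add them if they are not managed by your parent POM.
         Adjust the RxJava coordinates to the RxJava version you actually use. -->
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-data-mongodb-reactive</artifactId>
    </dependency>
    <dependency>
        <groupId>de.flapdoodle.embed</groupId>
        <artifactId>de.flapdoodle.embed.mongo</artifactId>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>io.reactivex.rxjava2</groupId>
        <artifactId>rxjava</artifactId>
    </dependency>
    <dependency>
        <groupId>io.reactivex</groupId>
        <artifactId>rxjava-reactive-streams</artifactId>
    </dependency>
</dependencies>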

The complete pom.xml file can be found in the GitHub repository linked in the conclusion section.

3. Enabling Spring Data Reactive Repositories for Mongodb

As the title suggests, we will have a look at Spring Data Reactive Repositories with MongoDB. The new @EnableReactiveMongoRepositories annotation enables reactive repository support for MongoDB. The following configuration enables Spring Data Reactive Repositories for MongoDB:
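A minimal sketch; with spring-boot-starter-data-mongodb-reactive on the classpath, Spring Boot auto-configures the reactive MongoClient, so enabling the repositories is essentially all the configuration class needs to do (the class name is illustrative):

import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.repository.config.EnableReactiveMongoRepositories;

// Illustrative configuration class; repository interfaces are scanned from this class's package
@Configuration
@EnableReactiveMongoRepositories
public class MongoReactiveConfiguration {
}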

4. Reactive Repositories

The Spring Data project uses the repository programming model, which is the highest-level abstraction for dealing with data. Repositories are comprised of a set of CRUD methods defined in a Spring Data provided interface plus domain-specific query methods.
The main entry point is the CrudRepository interface, which exposes methods such as findOne, delete and save. With Reactive Programming support, the Spring Data project has now introduced two more interfaces: ReactiveCrudRepository and RxJava2CrudRepository (for RxJava 2 support).
A typical Spring Data Reactive Repository would look like below:
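A minimal sketch, assuming a Taxi document entity with a String id; the findByNumber query method is an assumption based on the "find taxis by number" example later in the post:

import org.springframework.data.repository.reactive.ReactiveCrudRepository;
import reactor.core.publisher.Flux;

// Reactive repository using Project Reactor types
public interface ReactiveTaxiRepository extends ReactiveCrudRepository<Taxi, String> {

    // Derived query method returning a Flux of matching taxis
    Flux<Taxi> findByNumber(String number);
}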

And a RxJava2 version of the same Repository would look like below:
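A sketch along the same lines, again assuming the Taxi entity and a findByNumber query method:

import io.reactivex.Flowable;
import org.springframework.data.repository.reactive.RxJava2CrudRepository;

// Reactive repository using RxJava 2 types
public interface RxJava2TaxiRepository extends RxJava2CrudRepository<Taxi, String> {

    // Derived query method returning a Flowable of matching taxis
    Flowable<Taxi> findByNumber(String number);
}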


Note that the Spring 5.0 / Project Reactor specific Flux is returned in ReactiveTaxiRepository, while the RxJava specific Flowable is returned in RxJava2TaxiRepository.
These repositories are essentially identical to standard Spring Data repositories, except that they can now return and/or accept as parameters reactive types such as Flux, Mono and Flowable. By default, reactive repositories use Project Reactor types, but other reactive libraries such as RxJava 2 can also be used, as shown above.

5. Using Spring Data Reactive Repositories for Mongodb

When using the new Spring Data Reactive Repositories, we can use the full features of Reactive Programming provided by the Flux, Mono and Flowable (RxJava 2) types.
The Reactor version would look like below:
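A sketch, assuming the repository is injected and using the findByNumber method from the repository sketch above:

// Find taxis by number, collect the Flux into a List and block until done
List<Taxi> taxis = reactiveTaxiRepository.findByNumber("CAL-4259")
        .collectList()
        .block();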


And the RxJava 2 version:
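Again a sketch, with the same assumptions:

// Find taxis by number, collect the Flowable into a List and block until done
List<Taxi> taxis = rxJava2TaxiRepository.findByNumber("CAL-4259")
        .toList()
        .blockingGet();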

The above code finds taxis with the number CAL-4259, collects the Flux or Flowable stream into a List, and blocks until the collection is finished.

6. Streaming Data with Tailable Cursor

Spring Data Reactive Repositories provide a way to stream data as it arrives in MongoDB with the @Tailable annotation, somewhat like an event source system. Sticking to our example of taxis, we can subscribe to a tailable stream and, while subscribed, insert Taxi entities into MongoDB.
This lets us see newly added Taxi entities coming into the system in real time, in a streaming manner, until the subscription is disposed of, simulating taxis entering a city in real time:
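A sketch of what this could look like: a @Tailable query method on the repository and a subscription that logs taxis as they are inserted. The method name, the getNumber() getter and the capped taxi collection are assumptions (MongoDB tailable cursors only work on capped collections).

// On ReactiveTaxiRepository (the backing collection must be capped):
@Tailable
Flux<Taxi> findWithTailableCursorBy();

// Subscribing to the stream; new Taxi documents are pushed as they are inserted:
Disposable subscription = reactiveTaxiRepository.findWithTailableCursorBy()
        .doOnNext(taxi -> System.out.println("Taxi entered the city: " + taxi.getNumber()))
        .subscribe();

// ... insert Taxi documents while subscribed; each one appears on the stream ...

subscription.dispose();   // stop streaming when no longer interested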


7. Advantages

Compared to a standard Spring Data repository, a reactive repository brings all the features of Reactive Programming to data retrieval. Just by using reactive repositories, we can declaratively filter, process and aggregate the returned data, and use the asynchronous capabilities that Reactive Programming provides out of the box.

8. Conclusion

Reactive Programming offers a functional, declarative style of coding that is being rapidly adopted by developers and enables writing scalable, easy-to-understand code.
Now, with Spring Data Reactive Repositories, these features can easily be incorporated alongside the existing features of the Spring Data project. The complete source code for the project can be found at GitHub.

Tuesday, October 3, 2017

Java + Spring Boot implementation of Blockchain

Blockchain is the buzzword these days, as it is the technology behind cryptocurrencies, which are taking over the world like crazy. There are many articles written on the theory of blockchain, like this one, but very few implementations. Recently, though, I stumbled upon an article by Daniel van Flymen, who has written a Python-based implementation of a blockchain.

According to him, and I am pretty sure many software engineers would agree, the best way to learn something is to actually implement it; you learn more that way than you do by just reading the theory.

So this inspired me to write a Java + Spring Boot based implementation of Daniel's work so that I could learn how a blockchain works. For everyone who is interested, the following is a basic version of what a blockchain looks like in the wild.
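As a rough sketch of the structure involved, a block in such a chain might carry the fields below (mirroring Daniel's Python example; the Transaction type and field names here are illustrative):

import java.util.List;

public class Block {

    private long index;                      // position of the block in the chain
    private long timestamp;                  // when the block was created
    private List<Transaction> transactions;  // transactions recorded in this block
    private long proof;                      // proof-of-work value for this block
    private String previousHash;             // hash of the previous block, linking the chain

    // getters and setters omitted for brevity
}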


Daniel talks a lot about the details of the blockchain, which you can read in his article, but in short the following are the steps required to write a blockchain implementation:


  1. Building a Blockchain
  2. Building an API to access the Blockchain
  3. Interacting with the Blockchain 
  4. Consensus
You can see all of the above steps implemented in Java + Spring Boot at https://github.com/shazin/block-chain


Monday, September 18, 2017

Server Sent Event Processing in Spring MVC 4.2

In a previous post I spoke about the streaming request processing features in Spring MVC 4.2 and discussed how we can use StreamingResponseBody to send large data asynchronously in a streaming manner.

In this post I am planning to talk about SseEmitter, which is another way to send semi-structured data to clients in an asynchronous, streaming manner.

If you want to learn more about the definition and structure of a Server Sent Event you can read the Mozilla documentation. But in short, a Server Sent Event is an event pushed from the server over a single long-lived TCP/IP socket connection between client and server. Because the event is pushed, it eliminates the need to constantly poll the server for data, which reduces unwanted load on the server, and the client receives the data in real time.

In Spring MVC, the following controller code can be written to send server sent events easily, and on the client side Javascript can be used to read these events and present them in the web page. For demo purposes I am showing a live cricket match score and commentary coming down to a web page client in real time. The server code will look like below:
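A minimal sketch of such a controller; the /score endpoint matches the one shown in the screenshot below, while the commentary loop, executor and event payloads are illustrative stand-ins for the real match feed.

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.servlet.mvc.method.annotation.SseEmitter;

@RestController
public class ScoreController {

    // Events are pushed from a separate thread, off the request thread
    private final ExecutorService executor = Executors.newSingleThreadExecutor();

    @RequestMapping(value = "/score", method = RequestMethod.GET)
    public SseEmitter score() {
        SseEmitter emitter = new SseEmitter(Long.MAX_VALUE);   // effectively no timeout for the demo
        executor.execute(() -> {
            try {
                for (int ball = 1; ball <= 100; ball++) {
                    // Each send() writes one Server Sent Event down the open connection
                    emitter.send(SseEmitter.event()
                            .name("commentary")
                            .data("Ball " + ball + ": commentary goes here"));
                    Thread.sleep(1000);
                }
                emitter.complete();
            } catch (Exception e) {
                emitter.completeWithError(e);
            }
        });
        return emitter;
    }
}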



And the Javascript client code will look like below:


And as you can see from the image below, only one request is initiated to the /score endpoint, but multiple pieces of commentary data are received through that long-lived connection.


Saturday, September 9, 2017

Spring Web Flux - The Non Blocking Asynchronous Functional Reactive Web Framework - 2

The Spring Web Flux framework can be tried out with its latest release candidate, 5.0.0.RC3. You just need to add the Spring Milestone repository to your Gradle or Maven file.


I have written the following service, which emulates a delay of at most 1000 milliseconds, to test how conventional Spring Web MVC and Spring Web Flux compare.
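A sketch of such a service; the Person type, the method name and the way the delay is produced are assumptions, the point being only that each call blocks for up to 1000 ms.

import java.util.Random;
import org.springframework.stereotype.Service;

@Service
public class PersonService {

    private final Random random = new Random();

    public Person find(int id) {
        try {
            // Emulate a slow lookup of at most 1000 milliseconds
            Thread.sleep(random.nextInt(1000));
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return new Person(id);
    }
}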


And the following controller implementations: one a blocking, conventional Spring Web MVC controller method which returns a list of people, and the other a non-blocking Spring Web Flux controller method which returns a Flux, a deferred result that is processed differently.
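A sketch of the two styles side by side; the endpoint paths and the count of 100 lookups per request are assumptions, and the service call is the same in both cases, only the return type changes.

import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.IntStream;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;
import reactor.core.publisher.Flux;

@RestController
public class PersonController {

    private final PersonService personService;

    public PersonController(PersonService personService) {
        this.personService = personService;
    }

    // Blocking Spring Web MVC style: the request thread waits for every lookup
    @GetMapping("/people/blocking")
    public List<Person> blocking() {
        return IntStream.range(0, 100)
                .mapToObj(personService::find)
                .collect(Collectors.toList());
    }

    // Non-blocking Spring Web Flux style: returns a Flux, a deferred result
    // that is written out as elements become available
    @GetMapping("/people/reactive")
    public Flux<Person> reactive() {
        return Flux.range(0, 100)
                .map(personService::find);
    }
}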


I then tested both methods by sending 1000 concurrent requests with a Gatling test. The application was running inside a Tomcat container.

Spring Web MVC results were


And Spring Web Flux results were


If you compare the 99th percentile, max and mean response times of both results, you can see that Spring Web Flux is considerably faster without any code changes to improve performance. This is a promising sign. In my understanding, the numbers can be improved even further by using a reactive Netty server instead of a conventional Tomcat server.



Spring Web Flux - The Non Blocking Asynchronous Functional Reactive Web Framework - 1

Spring has a very diverse ecosystem of projects, from batch jobs to web security. The best part of the Spring project is that they are always ahead of their time and do innovative work that is sometimes adopted by the Java platform itself. For example, the @Autowired annotation introduced by Spring was later standardized in JSR 330 as @Inject.

Spring 5 is something they are working on right now (5.0.0 Release Candidate 3 has been released as of writing this post) and it has a lot of innovative features. The following is a list of those features:
  • Reactive programming: introducing our Spring WebFlux framework built on Reactor 3.1, with support for RxJava 1.3 & 2.1 and running on Tomcat, Jetty, Netty or Undertow.
  • Functional style with Java 8 & Kotlin: several API refinements and Kotlin extensions across the framework, in particular for bean registration and functional web endpoints.
  • Integration with Java EE 8 APIs: support for Servlet 4.0, Bean Validation 2.0, JPA 2.2, as well as the JSON Binding API (as an alternative to Jackson/Gson in Spring MVC).
  • Ready for JDK 9: fully aligned with JDK 9 at runtime, on the classpath as well as the module path (on the latter: as filename-based “automatic modules” for the time being).
One of the most interesting is the introduction of Spring Web Flux, a non-blocking functional reactive web framework. Reactive Programming has been alive for some time in the form of the RxJava project and many other programming frameworks like Akka, NodeJS etc.

In simple terms, reactive programming is about writing non-blocking software that is asynchronous and event-driven, and that requires only a small number of threads, scaling vertically (up inside a single JVM) instead of horizontally (out to different nodes by means of clustering).

Take the Spring Web MVC framework, which was a synchronous, blocking framework based on the Servlet API (prior to Servlet 3.x). Every request reaching a Spring Web MVC controller would use the request thread that came along with the client request to serve that request, and would block it, which meant that scalability was bound to the maximum number of request threads the web container had.

This was partially solved by the introduction of asynchronous servlets in Servlet 3.x, where a request thread can delegate the task of serving the request to a separate thread. The Spring Web MVC framework allowed returning a DeferredResult from a controller method, which meant that the request thread would no longer be blocked while the controller method processed the request.
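A minimal sketch of that style: the controller hands the work to another thread and returns a DeferredResult immediately, freeing the request thread. The executor and endpoint here are illustrative.

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.context.request.async.DeferredResult;

@RestController
public class AsyncController {

    private final ExecutorService executor = Executors.newFixedThreadPool(10);

    @GetMapping("/async")
    public DeferredResult<String> async() {
        DeferredResult<String> result = new DeferredResult<>();
        executor.execute(() -> {
            // Long-running work happens off the request thread
            result.setResult("done");
        });
        return result;   // the request thread is released as soon as this returns
    }
}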



Spring Web Flux takes a completely new approach, using Reactive Streams instead of the Servlet API. A key aspect of reactive is that producers won't overwhelm consumers when they produce at a rate faster than the consumers can consume. This concept is called back pressure.
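A small sketch of back pressure in action with Reactor's BaseSubscriber: the subscriber explicitly requests elements in batches of 10, so the producer is never allowed to run ahead of what the consumer has asked for.

import org.reactivestreams.Subscription;
import reactor.core.publisher.BaseSubscriber;
import reactor.core.publisher.Flux;

public class BackPressureDemo {

    public static void main(String[] args) {
        Flux.range(1, 100).subscribe(new BaseSubscriber<Integer>() {

            @Override
            protected void hookOnSubscribe(Subscription subscription) {
                request(10);                 // ask for the first batch only
            }

            @Override
            protected void hookOnNext(Integer value) {
                System.out.println(value);   // consume at our own pace
                if (value % 10 == 0) {
                    request(10);             // pull the next batch when ready
                }
            }
        });
    }
}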


Reactive Programming will not let this happen (Image Courtesy : iStock)


References

  1. https://docs.spring.io/spring-framework/docs/5.0.0.BUILD-SNAPSHOT/spring-framework-reference/html/web-reactive.html
  2. https://spring.io/blog/2017/05/08/spring-framework-5-0-goes-rc1

Monday, August 28, 2017

Java 9 : The Future 3

Why is a module path better than a class path? 

Class path 

  • The class path can get into trouble when multiple class path elements contain the same package 
  • The class path is searched (linearly) every time a new class is requested 

Module Path 

  • Modules form a partition of packages 
    • no package can be in more than one runtime module
  • Modules perform a directed search 
    • Once the runtime finds a module, it remembers its packages 
    • Never has to search for those packages again 
    • Class loading becomes O(1), not O(n)

So now, instead of a dependency list as with the class path, we have a module dependency graph, which is a directed graph of dependencies. 

The following is the dependency graph for the java.se module, which covers the entire set of Java SE modules available out of the box.


Breaking the single rt.jar into modules, which can be seen in the graph shown above, was the reason the Java 9 release took longer than other Java releases. It must have been a very difficult task for the engineering team, and hats off to them.

So what about Backward Compatibility??

All old code that is not modularized (available on the class path) will be put into one big module called the "Unnamed Module". That module will by default:
  • Require every other module
  • Export / open every package available in that module 
  • Provide maximum compatibility with class path behavior 
This behavior allows Java code to be modularized incrementally, when and if required. 



Tuesday, August 8, 2017

Java 9 : The Future 2

This is a continuation of the Java 9 : The Future 1 post.

Modules enforce strong encapsulation, as we discussed in the previous post. A package is no longer visible to all other modules by default; it must be explicitly exported in the module-info.java module descriptor.

Can Class D from Module A access Class F from Module B?


With the module descriptors mentioned below, yes, it is possible, since Module A requires Module B and Module B exports package Z.
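A sketch of the two descriptors, using illustrative module and package names for the Module A / Module B / package Z example above:

// module-info.java for Module A
module moduleA {
    requires moduleB;    // Module A must require Module B to use its exported packages
}

// module-info.java for Module B
module moduleB {
    exports z;           // package Z (which contains class F) is exported to all modules
}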

Strong Encapsulation

  • A module must require other modules in order to use them.
  • A module exports packages to specific modules or to all modules.
  • A module opens packages explicitly for reflection.
  • Accessibility is enforced by the compiler, the JVM and reflection.

But why Modules again?

Remember the packages starting with sun.* or com.sun.* that were available in rt.jar (the JVM runtime) and were specific to certain JVM vendor implementations, not all of them? Java programmers have constantly been warned not to use these packages as they break platform independence, but up until Java 9 modules there was no way to completely stop developers from using them.

Enter java.base Module

The following is an (abbreviated) module descriptor for the java.base module. As you can see, only certain packages are exported, and the sun.*/com.sun.* packages are only available within the module.

// module-info.java
module java.base { 
      exports java.lang;  
      exports java.io;  
      exports java.net;  
      exports java.util; 
}

Every module in Java 9 implicitly requires java.base, just like every class in Java automatically imports java.lang.*. 

So what does this mean?

This means that there can now be a Java application which is 100% modular, with all JAR files being modules, and in those scenarios we can't use the class path. 

Enter Module Path

  • Path containing directories which contain modules
  • Modules can be exploded directories, or modular JARs (other forms too) 
  • The runtime searches the module path when looking for a module.
In addition, a Java application can be a hybrid of a modular and a backward-compatible (class path based) application; in those cases there will be both a module path and a class path.

To be continued..