Wednesday, December 28, 2016

Docker for Java development - simple example

1. Intro

Docker site: https://www.docker.com/

From Wikipedia:
Docker is an open-source project that automates the deployment of Linux applications inside software containers. A quote of the features from the Docker web pages:
Docker containers wrap up a piece of software in a complete filesystem that contains everything it needs to run: code, runtime, system tools, system libraries – anything you can install on a server. This guarantees that it will always run the same, regardless of the environment it is running in.
Docker provides an additional layer of abstraction and automation of operating-system-level virtualization on Linux. Docker uses the resource isolation features of the Linux kernel such as cgroups and kernel namespaces, and a union-capable file system such as OverlayFS and others to allow independent "containers" to run within a single Linux instance, avoiding the overhead of starting and maintaining virtual machines.

In short, Docker is a way of creating containers, deploying applications into them and running those applications inside them. What is a container in the Docker world? It is a kind of very lightweight, easy-to-use virtual machine.

2. Docker installation

Of course, first of all, Docker has to be installed. The official site has instructions for different operating systems: https://docs.docker.com/engine/installation/

After installation you can run

# docker run hello-world

The result should be something like:

Hello from Docker!
This message shows that your installation appears to be working correctly.
To generate this message, Docker took the following steps:
 1. The Docker client contacted the Docker daemon.
 2. The Docker daemon pulled the "hello-world" image from the Docker Hub.
 3. The Docker daemon created a new container from that image which runs the
    executable that produces the output you are currently reading.
 4. The Docker daemon streamed that output to the Docker client, which sent it
    to your terminal.
To try something more ambitious, you can run an Ubuntu container with:
 $ docker run -it ubuntu bash

As you can see, a lot of magic happens by executing just one command.

3. Main concepts/components

In the previous step, Docker's output gave a lot of information about how Docker works:
 - Docker images are stored in the Docker Hub registry
 - if we don't have the needed image locally, it will be pulled down from Docker Hub
 - based on this image, Docker creates a container which it can then run

To check the list of downloaded images we can execute:
# docker images

The result will contain the just-downloaded hello-world image:

REPOSITORY                TAG                 IMAGE ID            CREATED             SIZE
hello-world               latest              c54a2cc56cbb        6 months ago        1.848 kB


4. Test application to run inside a docker container

Now let's create a simple application (packaged as a JAR file) which we want to put inside a container.
I generated a simple Maven Spring Boot application on http://start.spring.io/ : I selected "Web" in the dependencies and pressed the "Generate Project" button.

In the downloaded project we need just 2 files:

pom.xml
- I removed the test dependency and specified the JAR file name (MyTestApp):


<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
   <modelVersion>4.0.0</modelVersion>

   <groupId>com.demien</groupId>
   <artifactId>demo</artifactId>
   <version>0.0.1-SNAPSHOT</version>
   <packaging>jar</packaging>

   <name>demo</name>
   <description>Demo project for Spring Boot</description>

   <parent>
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-starter-parent</artifactId>
      <version>1.4.2.RELEASE</version>
      <relativePath/> <!-- lookup parent from repository -->
   </parent>

   <properties>
      <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
      <project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
      <java.version>1.8</java.version>
   </properties>

   <dependencies>
      <dependency>
         <groupId>org.springframework.boot</groupId>
         <artifactId>spring-boot-starter-web</artifactId>
      </dependency>

   </dependencies>

   <build>
      <finalName>MyTestApp</finalName>
      <plugins>
         <plugin>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-maven-plugin</artifactId>
         </plugin>
      </plugins>
   </build>


</project>

 
And the main application runner, DemoApplication.java:

package com.demien;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
@SpringBootApplication
public class DemoApplication {

   @RequestMapping("info")
   public String info() {
      return "Hello world!";
   }

   public static void main(String[] args) {
      SpringApplication.run(DemoApplication.class, args);
   }
}

- I just added one endpoint, /info, which returns the "Hello world!" message.
We can now run our application from the command line with: mvn spring-boot:run
and open the URL: http://localhost:8080/info
 - the result should be "Hello world!"

5. Putting the test application into a container

5.1 Dockerfile

Now we have to create a container image. First of all we have to create a file with the container description. The file name should be "Dockerfile" and its content in this example is:

FROM maven
ADD target/MyTestApp.jar /usr/local/app/MyTestApp.jar
EXPOSE 8080
CMD ["java", "-jar", "/usr/local/app/MyTestApp.jar"]  

Let's explain it step by step:

FROM maven
- the base image for our container is maven. The list of available images can be queried with the command "docker search XXX". For example:
# docker search java

NAME                   DESCRIPTION                                     STARS     OFFICIAL   AUTOMATED
java                   Java is a concurrent, class-based, and obj...   1246      [OK]      
anapsix/alpine-java    Oracle Java 8 (and 7) with GLIBC 2.23 over...   168                  [OK]
develar/java                                                           53                   [OK]
isuper/java-oracle     This repository contains all java releases...   47                   [OK]
lwieske/java-8         Oracle Java 8 Container - Full + Slim - Ba...   30                   [OK]
nimmis/java-centos     This is docker images of CentOS 7 with dif...   20                   [OK]

The next line from the Dockerfile:
ADD target/MyTestApp.jar /usr/local/app/MyTestApp.jar
- we add our application JAR file to the image from the previous step: the file is taken from the ./target directory and put into the /usr/local/app directory of the image filesystem.

EXPOSE 8080
- our application listens on port 8080, so we declare that the container exposes this port.

CMD ["java", "-jar", "/usr/local/app/MyTestApp.jar"]
- the command which will be executed when we ask Docker to run our container.

5.2 Building the new container image

After creating the Dockerfile, we can build a new image based on its content:
# docker build -t demien/docker-test1 .

Docker will read the Dockerfile in the current directory and create a new image: demien/docker-test1
During the build phase, Docker will download the maven image and all the images it depends on. Now we can execute again:
# docker images
 - to check. The new image should be present in the result list:

REPOSITORY                TAG                 IMAGE ID            CREATED             SIZE
demien/docker-test1       latest              108d927b5ebb        5 days ago          667.4 MB
<none>                    <none>              ffd0c9386045        5 days ago          667.4 MB
<none>                    <none>              b41d536b6005        5 days ago          667.4 MB
<none>                    <none>              9ce7e4000959        5 days ago          667.4 MB
<none>                    <none>              3a3615d1befb        5 days ago          667.4 MB
<none>                    <none>              f5d4c54045dc        5 days ago          667.4 MB
<none>                    <none>              04dde1d25ea3        6 days ago          707.5 MB
maven                     latest              a90451161cc5        9 days ago          653.2 MB
hello-world               latest              c54a2cc56cbb        6 months ago        1.848 kB


6. The end

We're almost done! Just one thing left: we have to run the created image by executing from the command line:
# docker run -p 8080:8080 demien/docker-test1

The parameter "-p 8080:8080" means that we are mapping our local port 8080 to port 8080 of the container.

After our application starts we can open http://localhost:8080/info - the result should again be "Hello world!". But this time the result comes from the container: our request to the "/info" endpoint on port 8080 was forwarded to port 8080 of the container, which is running MyTestApp.jar from its local directory /usr/local/app (we defined that with CMD ["java", "-jar", "/usr/local/app/MyTestApp.jar"]). And all this magic is done by Docker, invisibly for us!
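
If you prefer checking from code instead of a browser, here is a minimal sketch (my own addition, not part of the generated project) that calls the containerized endpoint with plain JDK classes:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class ContainerCheck {

    public static void main(String[] args) throws Exception {
        // the /info endpoint is reachable on localhost because of -p 8080:8080
        URL url = new URL("http://localhost:8080/info");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");

        try (BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream()))) {
            StringBuilder body = new StringBuilder();
            String line;
            while ((line = in.readLine()) != null) {
                body.append(line);
            }
            // expected: 200 Hello world!
            System.out.println(conn.getResponseCode() + " " + body);
        }
    }
}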

All source files can be downloaded from here

Monday, May 9, 2016

SPARK - rest framework for java


Don't be confused: Apache Spark and SparkJava are two different technologies!
In this post I'm talking about SparkJava - a simple REST framework for Java: http://sparkjava.com/

Related posts :
Spring Boot - simple example
SpringBoot with SpringData and H2 database

Spring Boot is a very good framework, but it pulls in almost the full stack of Spring libraries as dependencies. If you are not planning to use them in your project, you will start thinking about more "compact" REST frameworks. SparkJava is one of them.

1. Goal 

As a lazy developer I want to create a REST application where new entities can be added in a very simple way: by just reusing a generic "base" class:
GenericController<Item> itemController=new GenericController<Item>("/item", Item.class, itemService);
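
For example, adding another entity would just need a small POJO and one more registration line - a hypothetical sketch (the Category class below is not part of the project, it only illustrates the idea):

// hypothetical domain object - implements IPersistable just like Item and Param below
public class Category implements IPersistable {
    private Long id;
    private String title;

    public Long getId() { return id; }
    public void setId(Long id) { this.id = id; }
}

// ...and one line in main() wires up GET/POST/PUT/DELETE under /category
GenericController<Category> categoryController =
        new GenericController<Category>("/category", Category.class, new GenericService<Category>());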



2. Application structure

Standard Maven-based application structure:


3. Maven project (pom.xml) file

Just spark, slf4j, gson and junit dependencies:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.demien</groupId>
    <artifactId>sparktest</artifactId>
    <version>1.0-SNAPSHOT</version>

    <dependencies>
    <dependency>
        <groupId>com.sparkjava</groupId>
        <artifactId>spark-core</artifactId>
        <version>2.0.0</version>
    </dependency>
    <dependency>
        <groupId>org.slf4j</groupId>
        <artifactId>slf4j-simple</artifactId>
        <version>1.7.7</version>
    </dependency>
        <dependency>
            <groupId>com.google.code.gson</groupId>
            <artifactId>gson</artifactId>
            <version>2.2.4</version>
        </dependency>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.12</version>
        </dependency>

    </dependencies>


</project>


4. Domain objects

For my POJO data objects I created the interface IPersistable:
public interface IPersistable {
    Long getId();
    void setId(Long id);

}

It gives controllers and test classes the ability to work with the ID field (via getId/setId). So all my domain objects implement this interface:

public class Item implements IPersistable {

    private Long id;
    private String name;
    private Long parentId;

    // getId()/setId() required by IPersistable, the remaining getters/setters
    // and equals()/hashCode() are omitted here for brevity
}

public class Param implements IPersistable {

    private Long id;
    private String name;
    private String dataType;
    private Item item;

    // accessors, equals() and hashCode() omitted here as well
}

5. Controller

As I mentioned before, I'm lazy, so I want to move common operations like "add", "get", "update" and "delete" into one class:

package com.demien.sparktest.controller;

import com.demien.sparktest.util.JsonUtil;
import com.demien.sparktest.domain.IPersistable;
import com.demien.sparktest.service.GenericService;
import spark.Request;
import spark.Response;
import spark.Spark;

public class GenericController<T extends IPersistable> {
    private GenericService<T> service;
    private Class<T> cl;

    public GenericController(String basePath, Class<T> cl, GenericService<T> service) {
        this.cl=cl;
        this.service=service;
        Spark.get(basePath,this::getAll, JsonUtil::toJson);
        Spark.get(basePath+"/:id",this::getById, JsonUtil::toJson);
        Spark.get(basePath+"/test",this::test, JsonUtil::toJson);
        Spark.post(basePath,this::add, JsonUtil::toJson);
        Spark.put(basePath,this::update, JsonUtil::toJson);
        Spark.delete(basePath,this::delete, JsonUtil::toJson);
    }

    public Object test(Request request, Response response) {
        return "Hello world!";
    }

    public Object getAll(Request request, Response response) {
        return service.getAll();
    }

    public Object getById(Request request, Response response) {
        String id = request.params(":id");
        return service.getById(Long.parseLong(id));
    }

    public T restoreObjectFromRequest(Request request) {
        return (T)JsonUtil.toObject(request.body(),cl);
    }

    public Object add(Request request, Response response) {
        return service.add(restoreObjectFromRequest(request));
    }

    public Object update(Request request, Response response) {
        return service.update(restoreObjectFromRequest(request));
    }

    public Object delete(Request request, Response response) {
        service.delete(restoreObjectFromRequest(request));
        return "";
     }




}

So now, for my entities (Item and Param), I just have to instantiate this class instead of writing separate controllers:
GenericController<Item> itemController=new GenericController<Item>(ITEM_PATH, Item.class, itemService);
GenericController<Param> paramController=new GenericController<Param>(PARAM_PATH, Param.class, paramService);


6. "Dummy" service. 

It's just a simple demo project, so I decided not to use Hibernate and created a simple class that stores objects in a HashMap:

package com.demien.sparktest.service;

import com.demien.sparktest.domain.IPersistable;

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class GenericService<T extends IPersistable> {
    private Long maxId=0L;

    private Map<Long, T> storage=new HashMap<Long, T>();

    public void clearStorage() {
        storage.clear();
    }

    public T getById(Long id) {
        return storage.get(id);
    }

    public T add(T element) {
        Long id=element.getId();
        if (id==null) {
            maxId++;
            element.setId(maxId);
        } else {
            if (maxId.longValue()<id.longValue()) {
                maxId=id+1;
            }
        }
        return update(element);
    }

    public List<T> getAll() {
        List<T> result=new ArrayList<T>();
        for (T element:storage.values()) {
            result.add(element);
        }
        return result;
    }

    public T update(T element) {
        storage.put(element.getId(), element);
        return storage.get(element.getId());
    }

    public void delete(T element) {
        storage.remove(element.getId());
    }

}

7. Main application file. 

Spark runs just like a regular Java application. In the main() method I have to "start" my controllers:
package com.demien.sparktest;

import com.demien.sparktest.controller.GenericController;
import com.demien.sparktest.domain.Param;
import com.demien.sparktest.service.GenericService;
import spark.Spark;
import com.demien.sparktest.domain.Item;

public class App {

    public final static int SPARK_PORT=8080;
    public final static String APP_PATH="http://localhost:"+SPARK_PORT;

    public final static GenericService<Item> itemService=new GenericService<>();
    public final static String ITEM_PATH="/item";

    public final static GenericService<Param> paramService=new GenericService<>();
    public final static String PARAM_PATH="/param";

    public static void main(String[] args) {
        Spark.setPort(8080);
        GenericController<Item> itemController=new GenericController<Item>(ITEM_PATH, Item.class, itemService);
        GenericController<Param> paramController=new GenericController<Param>(PARAM_PATH, Param.class, paramService);
    }

}

8. Utils

Also I had to create a few simple utils:


8.1. JsonUtil - just for json<=>object conversion

package com.demien.sparktest.util;

import com.google.gson.Gson;

public class JsonUtil {
    public static String toJson(Object object) {
        return new Gson().toJson(object);
    }

    public static Object toObject(String json, Class<?> cl) {
        return new Gson().fromJson(json, cl);
    }
}
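
A quick illustration (my own, not from the project) of the round trip the controller relies on:

Item item = new Item();
item.setId(1L);

// object -> json : used as the ResponseTransformer of every route
String json = JsonUtil.toJson(item);

// json -> object : used to restore the body of POST/PUT requests
Item restored = (Item) JsonUtil.toObject(json, Item.class);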


8.2. RestTestUtil - for testing: sending HTTP requests

package com.demien.sparktest.util;

import java.io.*;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class RestTestUtil {

    public static class RequestResult {

        public final String body;
        public final int status;

        private RequestResult(int status, String body) {
            this.body = body;
            this.status = status;
        }
    }

    public static RequestResult sendRequest(String method, String path, String urlParameters) throws IOException {

        URL url = new URL(path);
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setDoOutput(true);
        conn.setInstanceFollowRedirects(false);
        conn.setRequestMethod(method);

        if (urlParameters!=null) {
            byte[] postData = urlParameters.getBytes(StandardCharsets.UTF_8);
            int postDataLength = postData.length;

            conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
            conn.setRequestProperty("charset", "utf-8");
            conn.setRequestProperty("Content-Length", Integer.toString(postDataLength));
            conn.setUseCaches(false);
            conn.getOutputStream().write(postData);
        }

        Reader in = new BufferedReader(new InputStreamReader(conn.getInputStream(), "UTF-8"));
        StringBuilder sb = new StringBuilder();
        for (int c; (c = in.read()) >= 0; )
            sb.append((char) c);
        String responseBody = sb.toString();
        int responseCode=conn.getResponseCode();
        return new RequestResult(responseCode, responseBody);

    }

    public static RequestResult sendRequest(String method, String path) throws IOException {
        return sendRequest(method, path, null);
    }


}
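
A short usage sketch, assuming the application from section 7 is already running on port 8080 (sendRequest throws IOException, so call it from a method that declares or handles it):

// GET all items
RestTestUtil.RequestResult all = RestTestUtil.sendRequest("GET", "http://localhost:8080/item");
System.out.println(all.status + " " + all.body);

// POST a new item serialized as json
RestTestUtil.RequestResult created =
        RestTestUtil.sendRequest("POST", "http://localhost:8080/item", "{\"name\":\"test item\"}");
System.out.println(created.status + " " + created.body);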

8.3. ObjectPopulator - for testing: to "fill" a test object with randomly generated data.

package com.demien.sparktest.util;

import com.demien.sparktest.domain.IPersistable;
import com.demien.sparktest.domain.Item;

import java.lang.reflect.Field;
import java.util.ArrayList;
import java.util.Date;
import java.util.List;

public class ObjectPopulator {

    interface RandomGenerator {
        Object getRandomValue();
    }

    enum DataType {
        Integer(() -> {
            return new Integer((int) (Math.random() * 1000));
        }),
        Long(() -> {
            return new Long((long) (Math.random() * 1000));
        }),
        Date(()-> {
            return new Date(new Date().getTime() - (int) (Math.random() * 1000 * 60 * 60 * 24 * 100));
        }),
        String(() -> {
            StringBuffer result = new StringBuffer();
            String[] letters = new String[]{"A", "B", "C", "D", "E", "F", "G"};
            int length = (int) (Math.random() * 15) + 5;
            for (int i = 0; i < length; i++) {
                int pos = (int) (Math.random() * letters.length);
                result.append(letters[pos]);
            }
            return result.toString();
        }
        );

        private RandomGenerator generator;

        DataType(RandomGenerator generator) {
            this.generator = generator;
        }

        Object getRandomValue() {
            return generator.getRandomValue();
        }
    }

    public static Object populate(IPersistable instance) throws IllegalAccessException {
        List<Field> fields = getAllFields(instance);
        for (Field eachField : fields) {
            eachField.setAccessible(true);
            String typeName=eachField.getType().getSimpleName();
            if (eachField.getType().getTypeName().startsWith("com.demien")) {
                Object obj=null;
                try {
                     obj=eachField.getType().newInstance();
                } catch (InstantiationException e) {
                    e.printStackTrace();
                }
                obj=populate((IPersistable) obj);
                eachField.set(instance, obj);
            } else {
                DataType dataType = DataType.valueOf(typeName);
                eachField.set(instance, dataType.getRandomValue());
            }
        }
        return instance;
    }


    private static List<Field> getAllFields(Object instance) {
        Field[] fields = instance.getClass().getDeclaredFields();
        List<Field> result = new ArrayList<Field>();
        for (int i = 0; i < fields.length; i++) {
            if (!java.lang.reflect.Modifier.isFinal(fields[i].getModifiers())
                    && !java.lang.reflect.Modifier.isStatic(fields[i]
                    .getModifiers())) {
                result.add(fields[i]);
            }
        }
        return result;
    }

 
}

9. Generic integration test class

For the base operations I created a generic controller test class - the concrete controller tests will just extend it:
package com.demien.sparktest;

import com.demien.sparktest.domain.IPersistable;
import com.demien.sparktest.service.GenericService;
import com.demien.sparktest.util.JsonUtil;
import com.demien.sparktest.util.ObjectPopulator;
import com.demien.sparktest.util.RestTestUtil;
import org.junit.*;
import spark.Spark;

import java.util.List;

public abstract class GenericControllerIT<T extends IPersistable> {
    private String fullPath;
    private Class<T> cl;
    private GenericService<T> service;

    public GenericControllerIT(String basePath, Class<T> cl, GenericService<T> service){
        this.fullPath= App.APP_PATH+basePath;
        this.cl=cl;
        this.service=service;
    }

    @BeforeClass
    public static void init() {
        App.main(null);
        try {
            Thread.sleep(5000);
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }

    @AfterClass
    public static void tearDown() {
        Spark.stop();
    }

    @Before
    public void initTest() {
        service.clearStorage();
    }

    T getTestObject() throws Exception {
        T testObject=cl.newInstance();
        ObjectPopulator.populate(testObject);
        return testObject;
    }

    @Test
    public void addTest() throws Exception {
        T testObject=getTestObject();
        RestTestUtil.RequestResult res= RestTestUtil.sendRequest("POST", fullPath, JsonUtil.toJson(testObject));
        Assert.assertEquals(200, res.status);
        Assert.assertEquals(service.getById(testObject.getId()), testObject);
    }

    @Test
    public void getTest() throws Exception {
        T testObject=getTestObject();
        service.add(testObject);
        RestTestUtil.RequestResult res= RestTestUtil.sendRequest("GET", fullPath +"/"+testObject.getId());
        Assert.assertEquals(200, res.status);
        T receivedObject=(T)JsonUtil.toObject(res.body, cl);
        Assert.assertEquals(testObject, receivedObject);
    }

    @Test
    public void getAllTest() throws Exception {
        T testObject1=getTestObject();
        service.add(testObject1);
        T testObject2=getTestObject();
        service.add(testObject2);
        RestTestUtil.RequestResult res= RestTestUtil.sendRequest("GET", fullPath);
        Assert.assertEquals(200, res.status);
        List<T> receivedObjectList=(List<T>)JsonUtil.toObject(res.body, java.util.List.class);
        Assert.assertTrue(receivedObjectList.size()==2);
    }


    @Test
    public void updateTest() throws Exception {
        T testObject=getTestObject();
        service.add(testObject);
        T updatedObject=getTestObject();
        updatedObject.setId(testObject.getId());

        RestTestUtil.RequestResult res= RestTestUtil.sendRequest("PUT", fullPath, JsonUtil.toJson(updatedObject));
        Assert.assertEquals(200, res.status);

        T updatedObjectFromService=service.getById(testObject.getId());
        Assert.assertEquals(updatedObject, updatedObjectFromService);
    }

    @Test
    public void deleteTest() throws Exception {
        T testObject=getTestObject();
        service.add(testObject);

        RestTestUtil.RequestResult res= RestTestUtil.sendRequest("DELETE", fullPath, JsonUtil.toJson(testObject));
        Assert.assertEquals(200, res.status);

        T deletedObject=service.getById(testObject.getId());
        Assert.assertNull(deletedObject);
    }





}


10. Integration test classes for ItemController and ParamController. 

Now we can just extend the GenericControllerIT class and all tests for the base operations will be inherited:

package com.demien.sparktest;

import com.demien.sparktest.domain.Item;

public class ItemControllerIT extends GenericControllerIT<Item> {
    public ItemControllerIT() {
        super(App.ITEM_PATH, Item.class, App.itemService);
    }
}


package com.demien.sparktest;

import com.demien.sparktest.domain.Param;

public class ParamControllerIT extends GenericControllerIT<Param> {
    public ParamControllerIT() {
        super(App.PARAM_PATH, Param.class, App.paramService);
    }
}

11. The end

Complete source code can be downloaded from here.

Sunday, January 24, 2016

Mongo DB getting started - CRUD operations - java

1. Intro

First of all we have to add the mongo-java-driver dependency to the pom.xml file:
<dependency>
  <groupId>org.mongodb</groupId>
  <artifactId>mongo-java-driver</artifactId>
  <version>3.2.0</version>
</dependency>

After that, we can connect to MongoDB in our application:
MongoClient mongoClient = new MongoClient();
MongoDatabase db = mongoClient.getDatabase("course");
MongoCollection<Document> coll=db.getCollection("findTest");
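
Putting those pieces into a minimal runnable class could look like this (a sketch assuming a local mongod running on the default port 27017):

import com.mongodb.MongoClient;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.MongoDatabase;
import org.bson.Document;

public class MongoConnectDemo {

    public static void main(String[] args) {
        // with no arguments MongoClient connects to localhost:27017
        MongoClient mongoClient = new MongoClient();
        try {
            MongoDatabase db = mongoClient.getDatabase("course");
            MongoCollection<Document> coll = db.getCollection("findTest");
            System.out.println("documents in findTest: " + coll.count());
        } finally {
            mongoClient.close();
        }
    }
}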

Also I wrote several simple helper functions which will be useful below:

public int getRandomInt() {
    int result=(int)Math.round(Math.random()*1000);
    return result;
}

public List<Document> getAllDocs() {
    List<Document> docs=coll.find().into(new ArrayList<Document>());
    return docs;
}

public void printList(String comment, List<?> list) {
    System.out.println(comment + " " + Arrays.toString(list.toArray()));

}

2. Insert

Now let's try to insert some documents into our test collection.
Before inserting, we first have to create a Document. To populate the Document structure we can use the "append" method:

coll.drop();
coll.insertOne(new Document().append("name","Fernando").append("age", 41));
coll.insertOne(new Document().append("name",new Document("fist_name","Huan").append("second_name","Sebastyan")).append("age",40));

Document john=new Document().append("name","John").append("age",32);
Document smith=new Document().append("name","Smith").append("age",25);
coll.insertMany(Arrays.asList(john, smith));

printList("inserted documents:",getAllDocs());

Results:
inserted documents: [
Document{{_id=56a4957ac6e31a20f88dbfdc, name=Fernando, age=41}}, Document{{_id=56a4957ac6e31a20f88dbfdd,
                     name=Document{{fist_name=Huan, second_name=Sebastyan}},
                     age=40}},
Document{{_id=56a4957ac6e31a20f88dbfde, name=John, age=32}}, Document{{_id=56a4957ac6e31a20f88dbfdf, name=Smith, age=25}}
]


3. Find - basic operations

Let's populate test collection :
coll.drop();
for (int i=0;i<10;i++) {
    Document doc=new Document("x",i);
    coll.insertOne(doc);
}

To get count of documents in collection : 
long cnt=coll.count();
System.out.println("Count of documents:"+cnt);

To get first element : 
Document firstDoc= coll.find().first();
System.out.println("First document :"+firstDoc.toJson());

To get all documents : 
List<Document> allDoc=coll.find().into(new ArrayList<Document>());
System.out.println("Array with collection documents :"+Arrays.toString(allDoc.toArray()));


Working with cursors : 
MongoCursor<Document> cursor=coll.find().iterator();

try {
    while (cursor.hasNext()) {
        Document doc=cursor.next();
        System.out.println(" document from cursor "+doc.toJson());
    }
} finally {
    cursor.close();
}

4. Find - filtering
Of course, the most frequent operation is filtering "find" results by some criteria. For that, we have to populate a Document with the criteria. Another option is to use the builders from the Filters class.

Let's create a test collection :

coll.drop();
for (int i=0;i<10;i++) {
    for (int j=0;j<5;j++) {
        Document doc=new Document("x",i).append("y",j);
        coll.insertOne(doc);
    }

}

Just a simple filter : 
Document simpleFilter=new Document("x",5);
List<Document> filteredDocs=coll.find(simpleFilter).into(new ArrayList<Document>());
System.out.println("Filtered by [x=5] documents:" + Arrays.toString(filteredDocs.toArray()));


More complex filter : 
Document complexFilter=new Document("x",5).append("y", new Document("$gt", 3));
filteredDocs=coll.find(complexFilter).into(new ArrayList<Document>());
System.out.println("Filtered by [x=5 && y>3] documents:" + Arrays.toString(filteredDocs.toArray()));


Using Filter builder : 
Bson bsonFilter= Filters.and(Filters.eq("x", 5), Filters.lt("y", 3));
filteredDocs=coll.find(bsonFilter).into(new ArrayList<Document>());
System.out.println("Filtered by [x=5 && y<3] documents:" + Arrays.toString(filteredDocs.toArray()));

Results : 
Filtered by [x=5] documents:[Document{{_id=56a49896c6e31a21ddc31eee, x=5, y=0}}, Document{{_id=56a49896c6e31a21ddc31eef, x=5, y=1}}, Document{{_id=56a49896c6e31a21ddc31ef0, x=5, y=2}}, Document{{_id=56a49896c6e31a21ddc31ef1, x=5, y=3}}, Document{{_id=56a49896c6e31a21ddc31ef2, x=5, y=4}}]
Filtered by [x=5 && y>3] documents:[Document{{_id=56a49896c6e31a21ddc31ef2, x=5, y=4}}]
Filtered by [x=5 && y<3] documents:[Document{{_id=56a49896c6e31a21ddc31eee, x=5, y=0}}, Document{{_id=56a49896c6e31a21ddc31eef, x=5, y=1}}, Document{{_id=56a49896c6e31a21ddc31ef0, x=5, y=2}}]

5. Find - projection

Sometimes we don't need all the fields in the result set, only some specific ones. For that purpose we can use "projection" documents, in which we specify which fields we want to show or hide. And, just like with filters, we can use the Projections builders.

Let's create one test object : 
coll.drop();
coll.insertOne(new Document("name", "Joe").append("age", 32));

Filter : 
Bson filter=new Document("name","Joe");

Projection hiding the "age" and "_id" fields:
Bson projection=new Document().append("age",0).append("_id", 0);
List<Document> docs= coll.find(filter).projection(projection).into(new ArrayList<Document>());
System.out.println("filtered with projection[name]:"+Arrays.toString(docs.toArray()));

Hiding the "_id" field using a builder:
projection= Projections.exclude("_id");
docs= coll.find(filter).projection(projection).into(new ArrayList<Document>());
System.out.println("filtered with projection[name,age]:"+Arrays.toString(docs.toArray()));

Hiding the "_id" field and including the "age" field using builders:
projection= Projections.fields(Projections.excludeId(), Projections.include("age"));
docs= coll.find(filter).projection(projection).into(new ArrayList<Document>());
System.out.println("filtered with projection[age]:"+Arrays.toString(docs.toArray()));

Results : 
filtered with projection[name]:[Document{{name=Joe}}]
filtered with projection[name,age]:[Document{{name=Joe, age=32}}]
filtered with projection[age]:[Document{{age=32}}]


6. Find - sorting

To sort by some criteria we have, as usual, 2 options: 1. just create a "sort document" 2. use a builder.

Create test collection: 
coll.drop();
for (int i=0;i<1000;i++) {
    coll.insertOne(new Document("x",getRandomInt()).append("y", getRandomInt()).append("z", getRandomInt()));
}

Filter:
Document filterXlt10=new Document("x", new Document("$lt",10) );

Sort document : 
Document sortByXasc=new Document("x",1);
List<Document> docs=coll.find().filter(filterXlt10).sort(sortByXasc).into(new ArrayList<Document>());
printList("filtered by x<10 and sorted by x asc",docs);
filtered by x<10 and sorted by x asc [
Document{{_id=56a49c2fc6e31a231094c9a1, x=2, y=582, z=852}}, Document{{_id=56a49c2fc6e31a231094c958, x=3, y=688, z=922}}, Document{{_id=56a49c2fc6e31a231094c95e, x=3, y=74, z=396}}, Document{{_id=56a49c2fc6e31a231094c9b2, x=3, y=736, z=131}}, Document{{_id=56a49c2fc6e31a231094c9f4, x=3, y=35, z=856}}, Document{{_id=56a49c2fc6e31a231094cba5, x=4, y=369, z=14}}, Document{{_id=56a49c2fc6e31a231094cb02, x=5, y=990, z=816}}, Document{{_id=56a49c2ec6e31a231094c8d8, x=7, y=535, z=501}}, Document{{_id=56a49c2fc6e31a231094cb75, x=7, y=34, z=933}}, Document{{_id=56a49c2fc6e31a231094cbb4, x=8, y=795, z=878}}, Document{{_id=56a49c30c6e31a231094cbd0, x=8, y=989, z=255}}, Document{{_id=56a49c2fc6e31a231094ca17, x=9, y=486, z=603}}, Document{{_id=56a49c2fc6e31a231094cacc, x=9, y=843, z=457}}]

Sorting using builder:
Bson sortYDesc= Sorts.descending("y");
docs=coll.find().filter(filterXlt10).sort(sortYDesc).into(new ArrayList<Document>());
printList("filtered by x<10 and sorted by y desc",docs);

filtered by x<10 and sorted by y desc [
Document{{_id=56a49c2fc6e31a231094cb02, x=5, y=990, z=816}}, Document{{_id=56a49c30c6e31a231094cbd0, x=8, y=989, z=255}}, Document{{_id=56a49c2fc6e31a231094cacc, x=9, y=843, z=457}}, Document{{_id=56a49c2fc6e31a231094cbb4, x=8, y=795, z=878}}, Document{{_id=56a49c2fc6e31a231094c9b2, x=3, y=736, z=131}}, Document{{_id=56a49c2fc6e31a231094c958, x=3, y=688, z=922}}, Document{{_id=56a49c2fc6e31a231094c9a1, x=2, y=582, z=852}}, Document{{_id=56a49c2ec6e31a231094c8d8, x=7, y=535, z=501}}, Document{{_id=56a49c2fc6e31a231094ca17, x=9, y=486, z=603}}, Document{{_id=56a49c2fc6e31a231094cba5, x=4, y=369, z=14}}, Document{{_id=56a49c2fc6e31a231094c95e, x=3, y=74, z=396}}, Document{{_id=56a49c2fc6e31a231094c9f4, x=3, y=35, z=856}}, Document{{_id=56a49c2fc6e31a231094cb75, x=7, y=34, z=933}}]

Complex sorting: 
Bson sortXascYdesc=Sorts.orderBy(Sorts.ascending("x"), Sorts.descending("y"));
docs=coll.find().filter(filterXlt10).sort(sortXascYdesc).into(new ArrayList<Document>());
printList("filtered by x<10 and sorted by x asc and y desc",docs); 

filtered by x<10 and sorted by x asc and  y desc [
Document{{_id=56a49c2fc6e31a231094c9a1, x=2, y=582, z=852}}, Document{{_id=56a49c2fc6e31a231094c9b2, x=3, y=736, z=131}}, Document{{_id=56a49c2fc6e31a231094c958, x=3, y=688, z=922}}, Document{{_id=56a49c2fc6e31a231094c95e, x=3, y=74, z=396}}, Document{{_id=56a49c2fc6e31a231094c9f4, x=3, y=35, z=856}}, Document{{_id=56a49c2fc6e31a231094cba5, x=4, y=369, z=14}}, Document{{_id=56a49c2fc6e31a231094cb02, x=5, y=990, z=816}}, Document{{_id=56a49c2ec6e31a231094c8d8, x=7, y=535, z=501}}, Document{{_id=56a49c2fc6e31a231094cb75, x=7, y=34, z=933}}, Document{{_id=56a49c30c6e31a231094cbd0, x=8, y=989, z=255}}, Document{{_id=56a49c2fc6e31a231094cbb4, x=8, y=795, z=878}}, Document{{_id=56a49c2fc6e31a231094cacc, x=9, y=843, z=457}}, Document{{_id=56a49c2fc6e31a231094ca17, x=9, y=486, z=603}}]


7. Find - limit and skip 

Everything is very simple here: there are 2 additional methods, "limit" and "skip", for that:

coll.drop();
for (int i=0;i<1000;i++) {
    coll.insertOne(new Document("x",i).append("y", getRandomInt()).append("z", getRandomInt()));
}
List<Document> docs=coll.find().limit(5).skip(20).into(new ArrayList<Document>());
printList("limit=5, skip=20",docs);

limit=5, skip=20 [
Document{{_id=56a49d1ec6e31a2377101042, x=20, y=629, z=576}}, 
Document{{_id=56a49d1ec6e31a2377101043, x=21, y=152, z=835}}, 
Document{{_id=56a49d1ec6e31a2377101044, x=22, y=599, z=833}}, 
Document{{_id=56a49d1ec6e31a2377101045, x=23, y=252, z=136}}, 
Document{{_id=56a49d1ec6e31a2377101046, x=24, y=949, z=25}}]


8. Update

For the update operation we need to define 2 objects: FILTER and VALUE.
The format of the operation is:
UPDATE(FILTER - which documents we want to update, VALUE - what exactly we have to put)
The FILTER document is the same kind of document we use for the FIND operation.

The VALUE document can simply be another new document - in that case the document(s) found by the FILTER will be replaced with it. Another option is to use a "$set" document: in that case we specify exactly which fields we want to change.

Let's create a test collection : 
coll.drop();
for (int i=0;i<5;i++) {
    coll.insertOne(new Document("_id",i).append("x", i));
}
docs=getAllDocs();
printList("original collection",docs);
original collection [
Document{{_id=0, x=0}}, 
Document{{_id=1, x=1}}, 
Document{{_id=2, x=2}}, 
Document{{_id=3, x=3}}, 
Document{{_id=4, x=4}}]


Full replace of old document with the new one: 
coll.replaceOne(Filters.eq("x", 3), new Document("x", 333).append("state", "replaced"));
docs=getAllDocs();
printList("collection with replacing [_id=3] :", docs);

collection with replacing [_id=3] : [
Document{{_id=0, x=0}}, 
Document{{_id=1, x=1}}, 
Document{{_id=2, x=2}}, 
Document{{_id=3, x=333, state=replaced}}, 
Document{{_id=4, x=4}}]


Update of just one field using $set 
coll.updateOne(Filters.eq("_id", 2), new Document("$set", new Document("state", "updated")));
docs=getAllDocs();
printList("collection with updating [_id=2] :", docs);

collection with updating [_id=2] : [
Document{{_id=0, x=0}}, 
Document{{_id=1, x=1}}, 
Document{{_id=2, x=2, state=updated}}, 
Document{{_id=3, x=333, state=replaced}}, 
Document{{_id=4, x=4}}]


Using the "upsert" option: if the record does not exist, it will be created.
coll.updateOne(Filters.eq("_id", 5), new Document("$set", new Document("state", "upserted")), new UpdateOptions().upsert(true));
docs=getAllDocs();
printList("collection with upserted [_id=5] :", docs);

collection with upserted [_id=5] : [
Document{{_id=0, x=0}}, 
Document{{_id=1, x=1}}, 
Document{{_id=2, x=2, state=updated}}, 
Document{{_id=3, x=333, state=replaced}}, 
Document{{_id=4, x=4}}, 
Document{{_id=5, state=upserted}}]


Update many records : 
coll.updateMany(new Document("_id",new Document("$lt",3)), new Document("$inc",new Document("x",10)));
docs=getAllDocs();
printList("collection with updated by [_id<3] set x=x+10",docs);

collection with updated by [_id<3] set x=x+10 [
Document{{_id=0, x=10}}, 
Document{{_id=1, x=11}}, 
Document{{_id=2, x=12, state=updated}}, 
Document{{_id=3, x=333, state=replaced}}, 
Document{{_id=4, x=4}}, 
Document{{_id=5, state=upserted}}]


9. Delete

Executing a delete is similar to "find" - we just need to specify how to find the object(s), which will then be deleted.

Let's create a test collection : 
coll.drop();
for (int i=0;i<5;i++) {
    coll.insertOne(new Document("_id",i).append("x", i));
}

Original collection: 
docs=getAllDocs();
printList("original collection", docs);

original collection [
Document{{_id=0, x=0}}, 
Document{{_id=1, x=1}}, 
Document{{_id=2, x=2}}, 
Document{{_id=3, x=3}}, 
Document{{_id=4, x=4}}]

Delete several documents:
coll.deleteMany(new Document("_id", new Document("$gt", 3)));
docs=getAllDocs();
printList("deleted by DeleteMany by condition[_id>3]", docs);

deleted by DeleteMany by condition[_id>3] [
Document{{_id=0, x=0}}, 
Document{{_id=1, x=1}}, 
Document{{_id=2, x=2}}, 
Document{{_id=3, x=3}}]

Delete just one document: 
coll.deleteOne(new Document("_id", new Document("$gt",0)));
docs=getAllDocs();
printList("deleted by DeleteOne by condition[_id>0]", docs); 

deleted by DeleteOne by condition[_id>0] [
Document{{_id=0, x=0}}, 
Document{{_id=2, x=2}}, 
Document{{_id=3, x=3}}]


10. Aggregation

The aggregation pipeline can be built in Java code from Document objects, or by parsing Mongo-shell-style strings. Both variants below group the documents by the "state" field and sum up the "pop" field.
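
The examples assume the collection holds documents with "state" and "pop" fields (like the well-known zip-codes sample data set). A minimal sketch to put some comparable test data in place - these are my own sample values, not the data behind the results shown at the end:

coll.drop();
coll.insertMany(Arrays.asList(
        new Document("city", "BRIDGEPORT").append("state", "CT").append("pop", 141686),
        new Document("city", "NEW HAVEN").append("state", "CT").append("pop", 123626),
        new Document("city", "NEWARK").append("state", "NJ").append("pop", 275221),
        new Document("city", "ALBANY").append("state", "NY").append("pop", 98469)
));

Building the pipeline from Document objects in Java code: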
List<Document> pipeline=Arrays.asList(
        new Document("$group",
                 new Document("_id","$state").
                 append("totalPop", new Document("$sum", "$pop"))),
        new Document("$match",
                 new Document("totalPop",new Document("$gt",500000)))
);
List<Document> docs =coll.aggregate(pipeline).into(new ArrayList<Document>());
printList("aggregation:population>500000:",docs);




The same approach, but building the pipeline by parsing Mongo-shell-style strings:

pipeline=Arrays.asList(Document.parse("{'$group':{'_id':'$state','totalPop':{'$sum':'$pop'}}}")
                      ,Document.parse("{'$match':{'totalPop': {'$lt':500000}}}")

);
docs =coll.aggregate(pipeline).into(new ArrayList<Document>());
printList("aggregation:population<500000:",docs);


result : 

aggregation:population>500000: [Document{{_id=CT, totalPop=661324}}, Document{{_id=CA, totalPop=953386}}, Document{{_id=NJ, totalPop=847495}}]
aggregation:population<500000: [Document{{_id=NY, totalPop=485267}}]