ExpressJS Session Store using MongoDB, PostgreSQL

September 16, 2015 by Mohamed Sanaulla

This tutorial explains how to use MongoDB and PostgreSQL as session stores in an ExpressJS application.

In one of our previous posts we saw how to work with sessions in an ExpressJS app. In that example we used memory as the session store, and we mentioned that memory can be replaced with other session stores like MongoDB, Redis, MySQL, the file system and so on. In this post we explore that further and show how to use MongoDB and PostgreSQL as the session stores.


Why Use a Database as the ExpressJS Session Store?

Why do we need to store the session in an external DB? In an application deployed on multiple machines/nodes, keeping the session data in memory makes it local to one machine/node. If a subsequent request is served by a different machine/node, the session is not available for that user.

To avoid this issue, we store the session data in external storage such as MySQL, MongoDB or another database. It is very common practice to store session data in a DB shared by the different servers, so that no session data is lost if any one of the servers goes down.

MongoDB Session Store

To use MongoDB as the session store we have to install the connect-mongo package along with the MongoDB driver. connect-mongo is a popular and well-tested ExpressJS middleware for storing sessions.

Let us initialize the app and install the required packages. The commands below install the following packages:

  • mongodb
  • connect-mongo
  • express
  • express-session

[code lang=”shell”]
$> mkdir mongodb-session-store

$> cd mongodb-session-store

$> npm init

Press ^C at any time to quit.
name: (mongodb-session-store)
version: (1.0.0)
description: sample using mongodb as session store
entry point: (index.js)
test command:
git repository:
keywords:
author: mohamed sanaulla
license: (ISC)
About to write to G:\node\mongodb-session-store\package.json:

{
"name": "mongodb-session-store",
"version": "1.0.0",
"description": "sample using mongodb as session store",
"main": "index.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1"
},
"author": "mohamed sanaulla",
"license": "ISC"
}

Is this ok? (yes)

$> npm install --save mongodb
$> npm install --save connect-mongo
$> npm install --save express
$> npm install --save express-session
[/code]

Now let us create a simple Express app which makes use of sessions. This example can be found in the post here; we are copying the same code for this tutorial. Subsequently we will replace the memory session store with MongoDB.

[code lang=”javascript”]
var express = require('express');
var session = require('express-session');

var app = express();

var sessionOptions = {
secret: "secret",
resave : true,
saveUninitialized : false
};

app.use(session(sessionOptions));

app.get("/", function(req, res){
if ( !req.session.views){
req.session.views = 1;
}else{
req.session.views += 1;
}

res.json({
"status" : "ok",
"frequency" : req.session.views
});
});

app.listen(3300, function (){
console.log("Server started at: http://localhost:3300");
});
[/code]

One of the session options is the store. We provide the new store in the store property of the sessionOptions object. To connect to MongoDB we also have to give it either an existing connection object or a connection URL, i.e. the parameters required to establish a connection to MongoDB. Below is the code with MongoDB as the session store:

[code lang=”javascript” highlight=”3,11-14″]
var express = require('express');
var session = require('express-session');
var MongoStore = require('connect-mongo')(session);

var app = express();

var sessionOptions = {
secret: "secret",
resave : true,
saveUninitialized : false,
store: new MongoStore({
url:"mongodb://localhost/test",
//other advanced options
})
};

app.use(session(sessionOptions));

app.get("/", function(req, res){
if ( !req.session.views){
req.session.views = 1;
}else{
req.session.views += 1;
}

res.json({
"status" : "ok",
"frequency" : req.session.views
});
});

app.listen(3300, function (){
console.log("Server started at: http://localhost:3300");
});
[/code]

Before running the application make sure you have installed MongoDB and that the MongoDB daemon, i.e. mongod, is running. The parts of the code that have changed are highlighted above, and it is a minimal change.
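
The "//other advanced options" placeholder above can carry store-level tuning. Below is a minimal sketch, assuming connect-mongo's ttl and autoRemove options (treat the option names as assumptions and check the connect-mongo documentation for your installed version):

[code lang="javascript"]
var session = require('express-session');
var MongoStore = require('connect-mongo')(session);

// Hedged sketch: extra MongoStore settings (option names assumed from connect-mongo docs)
var store = new MongoStore({
    url: "mongodb://localhost/test",
    ttl: 14 * 24 * 60 * 60,   // keep sessions for 14 days (value in seconds) -- assumed option
    autoRemove: "native"      // let MongoDB TTL indexes expire old sessions -- assumed option
});
[/code]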

To confirm that we are indeed writing the session data to MongoDB, load the application http://localhost:3300/ in the browser a few times and then connect to mongodb and run the below commands:

[code lang=”shell” highlight=”3, 8-13″]
> show collections
scores
sessions
students
system.indexes
system.profile
tasks
> db.sessions.find().pretty()
{
"_id" : "dTzBBxq8UTuS-aGuivY1iqgiITaEcJdz",
"session" : "{\"cookie\":{\"originalMaxAge\":null,\"expires\":null,\"httpOnly\":true,\"path\":\"/\"},\"views\":80}",
"expires" : ISODate("2015-09-27T03:30:55.179Z")
}
[/code]

A collection named sessions is created, where each document represents one session's data, as shown in the highlighted output above.
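
The same check can also be done from Node.js with the mongodb driver installed earlier. A small sketch (the connection URL is assumed to match the one configured for the store):

[code lang="javascript"]
// Sketch: read back the stored sessions using the native MongoDB driver
var MongoClient = require('mongodb').MongoClient;

MongoClient.connect("mongodb://localhost/test", function (err, db) {
    if (err) throw err;
    db.collection('sessions').find().toArray(function (err, sessions) {
        if (err) throw err;
        console.log(JSON.stringify(sessions, null, 2)); // each document is one session
        db.close();
    });
});
[/code]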

PostgreSQL Session Store

Now let us see how we can use PostgreSQL as the session store. It involves a bit of setup in PostgreSQL: we have to create the session table in the database. The structure of the table is provided with the connect-pg-simple node package. Let's first install that package and then look at the table structure. Below are the commands we run to create a new app and install the relevant packages.

[code lang=”shell”]
G:\node>mkdir postgres-session-store

G:\node>cd postgres-session-store

G:\node\postgres-session-store>npm init

Press ^C at any time to quit.
name: (postgres-session-store)
version: (1.0.0)
description: Demo to use PostgreSql as session store
entry point: (index.js)
test command:
git repository:
keywords:
author: mohamed sanaulla
license: (ISC)
About to write to G:\node\postgres-session-store\package.json:

{
"name": "postgres-session-store",
"version": "1.0.0",
"description": "Demo to use PostgreSql as session store",
"main": "index.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1"
},
"author": "mohamed sanaulla",
"license": "ISC"
}

Is this ok? (yes)
G:\node\postgres-session-store>npm install --save express
G:\node\postgres-session-store>npm install --save express-session
G:\node\postgres-session-store>npm install --save connect-pg-simple
[/code]

The session table structure can be found at node_modules/connect-pg-simple/table.sql and is given below:

[code lang=”sql”]
CREATE TABLE "session" (
"sid" varchar NOT NULL COLLATE "default",
"sess" json NOT NULL,
"expire" timestamp(6) NOT NULL
)
WITH (OIDS=FALSE);

ALTER TABLE "session" ADD CONSTRAINT "session_pkey"
PRIMARY KEY ("sid") NOT DEFERRABLE INITIALLY IMMEDIATE;
[/code]

Run the above SQL to create the table in your PostgreSQL database. We are going to use the default postgres database.

Let us use the same Express app code which we used at the start of this article. We repeat it here for convenience:

[code lang=”javascript”]
var express = require('express');
var session = require('express-session');

var app = express();

var sessionOptions = {
secret: "secret",
resave : true,
saveUninitialized : false
};

app.use(session(sessionOptions));

app.get("/", function(req, res){
if ( !req.session.views){
req.session.views = 1;
}else{
req.session.views += 1;
}

res.json({
"status" : "ok",
"frequency" : req.session.views
});
});

app.listen(3300, function (){
console.log("Server started at: http://localhost:3300");
});
[/code]

Let us update the above code to declare the PostgreSQL session store as shown below:

[code lang=”javascript” highlight=”3,11-17″]
var express = require('express');
var session = require('express-session');
var PostgreSqlStore = require('connect-pg-simple')(session);

var app = express();

var sessionOptions = {
secret: "secret",
resave : true,
saveUninitialized : false,
store : new PostgreSqlStore({
/*
connection string is built by following the syntax:
postgres://USERNAME:PASSWORD@HOST_NAME:PORT/DB_NAME
*/
conString: "postgres://postgres:postgres@localhost:5433/postgres"
})
};

app.use(session(sessionOptions));

app.get("/", function(req, res){
if ( !req.session.views){
req.session.views = 1;
}else{
req.session.views += 1;
}

res.json({
"status" : "ok",
"frequency" : req.session.views
});
});

app.listen(3300, function (){
console.log("Server started at: http://localhost:3300");
});
[/code]

In the above code the highlighted parts are the ones which set PostgreSQL as the session store. Let us run the application and load the URL http://localhost:3300/ multiple times, then execute the below query in PostgreSQL to check the data in the session table:

[code lang=”sql”]
select * from session;
[/code]

And the output:
express session stores

Conclusion

In this article we saw how to use MongoDB and PostgreSQL as the ExpressJS session store. This is very useful when the application is distributed across multiple nodes. In similar ways we can use Redis, LevelDB and other external stores for session data. If you have any questions, please write them in the comments section.

Filed Under: NodeJS Tagged With: ExpressJS Tutorials, NodeJS Tutorials

NodeJS : ExpressJS Session Management

September 11, 2015 by Mohamed Sanaulla

This tutorial explains the basic concepts of ExpressJS session management. Sessions are an important part of web applications. HTTP is stateless, and among the many approaches to maintaining state across requests, sessions and cookies are one of the most common.

In this article we will explore how to use the Node package express-session to maintain sessions in an ExpressJS based web application.

ExpressJS and NodeJS

Setting up the app

Let us create an empty Node project using npm as shown below:

[code lang=”shell”]
$ mkdir session-demo
$ cd session-demo
$ npm init
name: (session-demo)
version: (1.0.0)
description: Session demo for expressjs app
entry point: (index.js)
test command:
git repository:
keywords:
author: mohamed sanaulla
license: (ISC)
About to write to G:\node\session-demo\package.json:

{
"name": "session-demo",
"version": "1.0.0",
"description": "Session demo for expressjs app",
"main": "index.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1"
},
"author": "mohamed sanaulla",
"license": "ISC"
}

Is this ok? (yes)
[/code]

Install relevant node packages for Express and Express session:

[code lang=”shell”]
$npm install express --save
$npm install express-session --save
[/code]

Let us create a simple Express app as shown below:

[code lang=”javascript”]
// Filename: index.js
var express = require('express');
var session = require('express-session');

var app = express();
app.get("/", function(req, res){
res.json({
"status" : "ok"
});
});

app.listen(3300, function (){
console.log("Server started at: http://localhost:3300");
});
[/code]

The above app returns a JSON object when http://localhost:3300/ is called. It's a very simple example.

ExpressJS Session Management Example

In the below example, I am using a session to record the frequency of API invocation for a user. By default, express-session stores the session data in memory, and it provides support for replacing this default storage with different storage options as listed here.

express-session accepts a few properties in its options object. This object is passed while setting up the session with the Express app as shown below:

[code lang=”javascript”]
var express = require('express');
var session = require('express-session');
var app = express();
var sessionOptions = {};
app.use(session(sessionOptions));
[/code]

The different properties of the session options object are listed below (a combined example follows the list):

  • cookie: Options object for the session ID cookie. The default value is { path: '/', httpOnly: true, secure: false, maxAge: null }.
  • genid: Function to generate the session ID. Default is to use uuid
  • name: The name of the session ID cookie to set in the response (and read from in the request).
  • proxy: Trust the reverse proxy when setting secure cookies.
  • resave: If true forces a session to be saved back to store even if it was not modified in the request.
  • rolling: Forces a cookie to be set on every request.
  • saveUninitialized: If true it forces a newly created session without any modifications to be saved to the session store.
  • secret: It is a required option and is used for signing the session ID cookie.
  • store: Session store instance. Default is to use memory store.
  • unset: Controls the handling of session object in the store after it is unset. Either delete or keep the session object. Default is to keep the session object
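
To see several of these options together, here is a minimal sketch of a more explicit configuration (the cookie values and the name are illustrative only):

[code lang="javascript"]
var express = require('express');
var session = require('express-session');

var app = express();

app.use(session({
    name: 'demo.sid',            // custom name for the session ID cookie
    secret: 'secret',            // required; signs the session ID cookie
    resave: false,
    saveUninitialized: false,
    rolling: true,               // re-set the cookie on every response
    cookie: {
        httpOnly: true,
        secure: false,           // set to true when serving over HTTPS
        maxAge: 60 * 60 * 1000   // one hour, in milliseconds
    }
}));
[/code]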

Let us update the express app code to increment the frequency of the API invocation per user and record it in the session as shown below:

[code lang=”javascript”]
var express = require('express');
var session = require('express-session');

var app = express();

var sessionOptions = {
secret: "secret",
resave : true,
saveUninitialized : false
};

app.use(session(sessionOptions));

app.get("/", function(req, res){
if ( !req.session.views){
req.session.views = 1;
}else{
req.session.views += 1;
}

res.json({
"status" : "ok",
"frequency" : req.session.views
});
});

app.listen(3300, function (){
console.log("Server started at: http://localhost:3300");
});
[/code]

Run the above application by using the command node . from the app root directory. Repeatedly load the URL http://localhost:3300/ and watch the frequency change.
expressjs session management
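
If you want to reset the counter for a user, the session can be destroyed explicitly. A small sketch using express-session's req.session.destroy() (the /reset route path is just an example):

[code lang="javascript"]
// Add alongside the existing route: clears the current user's session
app.get("/reset", function (req, res) {
    req.session.destroy(function (err) {
        if (err) {
            return res.status(500).json({ "status": "error" });
        }
        res.json({ "status": "session cleared" });
    });
});
[/code]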

Conclusion

  • Also Read : HOW TO : SignIn with Twitter using Node.js and Express.js

This was a very simple example of ExpressJS session management. I hope the ExpressJS session example I have provided is helpful. It is prerequisite knowledge for working on authentication related features, since authentication uses the session to record the authenticated user's info. We will also see how to use different session stores, i.e. MySQL, MongoDB and so on.

Filed Under: NodeJS Tagged With: ExpressJS Tutorials

MongoDB : GridFS Tutorial

September 10, 2015 by Mohamed Sanaulla

This GridFS tutorial explains how to use GridFS for storing larger files in MongoDB and when to use it. As an example, it stores an image file of 16.5 MB.

MongoDB limits the size of a document in a collection to 16 MB. Realistically this is good enough for data which is a group of key-value pairs, but when you want to store files along with these key-value pairs, the 16 MB upper limit often becomes a constraint. To overcome this, MongoDB provides an API called GridFS with which we can store files greater than 16 MB along with any metadata related to the file.

In this post we will look at how we can store files to and retrieve files from MongoDB using GridFS and the MongoDB Java driver. Let us first look at the basics of GridFS.

GridFS Tutorial


Basics of GridFS

GridFS stores files as a series of chunks, where by default each chunk can be at most 255 KB. There are two collections, namely files and chunks. The files collection stores the metadata of the file, and the chunks collection stores the chunks of the files, each chunk carrying a files_id and an n, where files_id is the _id of the parent document in the files collection and n is the chunk sequence number.

The collections files and chunks are stored under a namespace. By default the namespace is fs. We can override the namespace and provide our own.
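
To make this layout concrete, below is a rough sketch of the two document shapes as they appear in the mongo shell (field values are illustrative):

[code lang="javascript"]
// fs.files: one document per stored file (metadata)
{
    _id: ObjectId("..."),          // referenced by each chunk's files_id
    filename: "image1",
    length: NumberLong(17315594),  // total file size in bytes
    chunkSize: NumberLong(261120), // roughly 255 KB per chunk
    uploadDate: ISODate("..."),
    md5: "..."
}

// fs.chunks: one document per chunk of the file
{
    _id: ObjectId("..."),
    files_id: ObjectId("..."),     // _id of the parent fs.files document
    n: 0,                          // chunk sequence number
    data: BinData(0, "...")        // the raw bytes of this chunk
}
[/code]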

Saving Files to MongoDB using GridFS

Let us first look at saving a file to MongoDB using GridFS. For this I am going to use an image of size 16.5 MB.

  • Also Read : Node.js + MongoDB – Performing CRUD Operations

Note: You can use any sufficiently large image file.
The below code saves the image at a given location to MongoDB using GridFS:

[code lang=”java”]
import java.io.File;
import java.io.IOException;
import com.mongodb.DB;
import com.mongodb.DBCollection;
import com.mongodb.MongoClient;
import com.mongodb.gridfs.GridFS;
import com.mongodb.gridfs.GridFSInputFile;

public class GridFsSaveDemo {

public static void main(String[] args) throws IOException {
MongoClient mongo = new MongoClient("localhost", 27017);
DB db = mongo.getDB("filesdb");

//Location of file to be saved
String imageLocation = "C:/Users/Mohamed/Pictures/image2.jpg";

//Create instance of GridFS implementation
GridFS gridFs = new GridFS(db);

//Create a file entry for the image file
GridFSInputFile gridFsInputFile = gridFs.createFile(new File(imageLocation));

//Set a name on GridFS entry
gridFsInputFile.setFilename("image1");

//Save the file to MongoDB
gridFsInputFile.save();
}
}
[/code]

Let us open mongo shell to verify if the above file got saved:

[code lang=”shell”]
> use filesdb
switched to db filesdb
> show collections
fs.chunks
fs.files
system.indexes
> db.fs.chunks.find().count()
67
> db.fs.files.find().pretty()
{
"_id" : ObjectId("55e3279f355311259428f3a9"),
"filename" : "image1",
"aliases" : null,
"chunkSize" : NumberLong(261120),
"uploadDate" : ISODate("2015-08-30T15:56:15.017Z"),
"length" : NumberLong(17315594),
"contentType" : null,
"md5" : "a6bc0171f7beff8036715dd1d022c1a0"
}
[/code]

You can see above that the file was divided into 67 chunks stored in the fs.chunks collection, and the file metadata, i.e. the file size, file name and upload date, is stored in the fs.files collection.

Reading files from MongoDB using GridFS

The code for reading the image saved above is given below:

[code lang=”java”]
import java.io.IOException;
import com.mongodb.DB;
import com.mongodb.MongoClient;
import com.mongodb.gridfs.GridFS;
import com.mongodb.gridfs.GridFSDBFile;

public class GridFsReadDemo {
public static void main(String[] args) throws IOException {
MongoClient mongo = new MongoClient("localhost", 27017);
DB db = mongo.getDB("filesdb");

//Create instance of GridFS implementation
GridFS gridFs = new GridFS(db);

//Find the image with the name image1 using GridFS API
GridFSDBFile outputImageFile = gridFs.findOne("image1");

//Get the number of chunks
System.out.println("Total Chunks: " + outputImageFile.numChunks());

//Location of the image read from MongoDB to be written
String imageLocation = "C:/Users/Mohamed/Pictures/mongoImage.jpg";
outputImageFile.writeTo(imageLocation);
mongo.close();

}
}
[/code]

The above code queries MongoDB to find the image chunks by using the image name, and then writes those chunks to the file system.

  • Also Read : Spring Boot : RESTful API using Spring Boot and MongoDB

GridFS makes it very easy to store and retrieve files from MongoDB. We can use it to store not only files greater than 16 MB but also files smaller than 16 MB.

I hope this GridFS tutorial helped you understand how to store files larger than 16 MB, illustrated by storing an image file in a MongoDB database.

Filed Under: MongoDB Tagged With: GridFS Tutorials

Node.js : RESTful APIs using StrongLoop Arc

September 1, 2015 by Mohamed Sanaulla

This tutorial guides you through writing REST APIs with the StrongLoop Arc tool, an alternative to the command line tools for developing REST APIs. When you finish this tutorial, you will be able to develop REST APIs using StrongLoop's Arc UI composer.

In our previous post we saw how to create REST APIs using Loopback and MySQL with the command line tool slc. StrongLoop also provides a UI composer for building the APIs, called StrongLoop Arc. In this post let us see how we can build the APIs using the UI composer.


Creating empty Loopback project

First let us create an empty Loopback project using the command slc loopback:

[code lang=”shell”]
G:\node>slc loopback

_—–_
| | .————————–.
|–(o)–| | Let’s create a LoopBack |
`———´ | application! |
( _´U`_ ) ‘————————–‘
/___A___\
| ~ |
__’.___.’__
´ ` |° ´ Y `

? What’s the name of your application? loopback-rest-uicomposer
? Enter name of the directory to contain the project: loopback-rest-uicomposer
create loopback-rest-uicomposer/
info change the working directory to loopback-rest-uicomposer
[/code]

And enter the name of the application as loopback-rest-uicomposer as shown above.

Once the application is created, change the directory to the app you have created now: cd loopback-rest-uicomposer. The project structure is as shown below:
loopback empty project

Next is to install MySQL connector for loopback by running the following command: npm install --save loopback-connector-mysql

Launching Strongloop Arc tool

Once you are in the app directory, i.e. the loopback-rest-uicomposer directory, run the below command to launch the StrongLoop Arc tool. It gets launched in your default browser.

[code lang=”shell”]
G:\node\loopback-rest-uicomposer>slc arc
Loading workspace G:\node\loopback-rest-uicomposer
StrongLoop Arc is running here: http://localhost:49619/#/
[/code]

When the app launches in the browser you will see a login box. If you are a first-time user you will have to register at https://strongloop.com/register/ and then use that username and password to log in to the application:
strongloop arc login
After login you will be able to see the features supported by StrongLoop Arc as shown below:

strongloop arc tools

Click on the Composer button to open the API composer UI. The composer UI looks like below:

strongloop arc composer

Creating MySQL Datasource

A new data source can be created either by clicking the MySQL button or the Add New Data Source link as shown below:
loopback mysql datasource

You will get a UI to configure your data source. Fill it with the connection settings as shown below:
loopback
Click on the Test Connection button to test the connection. You will see Success printed beside the button as shown below:
test connection
After you click on the Save Datasource button you will see the new data source listed under the data sources as shown below:
datasource

Creating Book Model

A new model can be created by clicking the New button or the Add New Model link as shown below:
models
You will get a UI to configure your model. Fill in the model details and properties, and select mysql as the data source, as shown in the image below:
arc-1

Click on the Save Model button to find the model created and listed under the available models as shown below:
arc-2

Running the application

StrongLoop Arc provides buttons to launch the application and these are available in the top right corner as shown below:
strongloop arc example
Open the URL: http://localhost:3000/explorer/ in the browser to view the APIs available. You can test the APIs as explained in the previous article.

Conclusion

In this article we saw how we could leverage the StrongLoop Arc tool to compose the APIs visually. The Arc tool also supports the following:

  1. Building and Deploying the app
  2. Process Manager
  3. Tracing the app
  4. Profiling the app
  5. Collecting app metrics

We will explore the other features in the coming articles.

Filed Under: NodeJS Tagged With: NodeJS Tutorials, StrongLoop Tutorials

Node.js : Building RESTful APIs using Loopback and MySQL

September 1, 2015 by Mohamed Sanaulla

This tutorial guides you through writing simple REST APIs using Loopback (a Node.js framework) with MySQL as the back end. At the end of this tutorial, you will be able to write your own REST APIs using the Loopback framework.

In one of our previous posts here, we wrote about building RESTful APIs using ExpressJS and MongoDB. Every programming language ecosystem has more than one web framework, and the same is true for Node.js. One such popular framework is Loopback. Both ExpressJS and Loopback are sponsored by StrongLoop.

In this post we are going to implement the same example we used in this post but using Loopback and MySQL DB.

Loopback and Node.js

Table of Contents

List of sections covered in this tutorial are:

  1. What is Loopback?
  2. Installing Loopback
  3. Create Loopback App
  4. Creating MySQL Datasource
  5. Creating Model in Loopback
  6. Testing CRUD APIs
  7. Conclusion

Before jumping into Loopback, let's get a better idea of what this new framework in the Node.js ecosystem is and how it differs from other frameworks like ExpressJS.

What is Loopback?

Loopback is an open source Node.js API framework for building REST APIs for your client applications (which can be anything: browser, mobile, etc.) in the simplest way possible. While it is another runner in the fast-growing crowd of Node.js frameworks, Loopback distinguishes itself from frameworks like ExpressJS by making APIs very simple and easy to develop. It is built on top of Express, the most popular framework for Node.js.

Loopback has many advantages that make it a strong choice as a de facto framework for Node.js. With JavaScript frameworks increasingly dominating the web, Loopback fits well on the server side, and its easy and simple development style attracts more developers.

javascript-stack

Installing Loopback

  • Loopback is available as an NPM package. As with all NPM packages, LoopBack can be installed using npm.
  • First we install the slc command line tool using the command npm install -g strongloop. The slc command line tool can be used to create applications and other app artifacts.
  • Next is to install Loopback, which can be done by running the command: npm install -g loopback.
  • To create a Loopback application using the slc command line tool, we also have to install two more npm packages, namely cookie-parser and errorhandler. These packages can be installed using the command: npm install -g cookie-parser errorhandler.

Note: On windows one would have to run the command prompt as an administrator to be able to install slc.

  • Also Read : Packaging and Deploying Node.js Applications

Create loopback app

The strongloop node package comes with a command line tool called slc which can be used to create a new application, create model classes, add properties to them, and define data sources, to name a few tasks.

Run this command to start creating a new application: slc loopback. You will be prompted for the name of your application and the folder (defaulting to the name of the application) in which the application will reside. Below is what you will see on your screen:

[code lang=”shell”]
G:\node>slc loopback

_—–_
| | .————————–.
|–(o)–| | Let’s create a LoopBack |
`———´ | application! |
( _´U`_ ) ‘————————–‘
/___A___\
| ~ |
__’.___.’__
´ ` |° ´ Y `

? What’s the name of your application? loopback-rest
? Enter name of the directory to contain the project: loopback-rest
create loopback-rest/
info change the working directory to loopback-rest

Generating .yo-rc.json

I’m all done. Running npm install for you to install the required dependencies. If this fails, try running the command yourself.

create .editorconfig
create .jshintignore
create .jshintrc
create README.md
create server\boot\authentication.js
create server\boot\explorer.js
create server\boot\rest-api.js
create server\boot\root.js
create server\middleware.json
create server\server.js
create .gitignore
create client\README.md

…. npm packages installation output ….

Next steps:

Change directory to your app
$ cd loopback-rest

Create a model in your app
$ slc loopback:model

Compose your API, run, deploy, profile, and monitor it with Arc
$ slc arc

Run the app
$ node
[/code]

You will find the next steps also printed. But for now we have a directory structure created which looks like below:
loopback project structure

There is a lot of auto-generated code. Let us not worry about that for now.

We can run the application now to see what has been generated for us. Change to the app directory and run the command node . and you will see the output below:

[code lang=”shell”]
G:\node\loopback-rest>node .
Browse your REST API at http://localhost:3000/explorer
Web server listening at: http://localhost:3000/
[/code]

Opening the URL: http://localhost:3000/explorer in the browser gives us the list of APIs available in the application.
loopback run application
In the above image you can see a list of APIs exposed just for the User, and all of this is shipped by the framework.

Opening the URL: http://localhost:3000/ in the browser gives us:
loopback code
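
The default root route is wired up in server/boot/root.js to loopback.status(), so the response is expected to be a small status document roughly along these lines (the exact values will differ on every run):

[code lang="javascript"]
// Approximate shape of the JSON returned by http://localhost:3000/
{
    "started": "2015-09-01T10:15:30.000Z", // time the server started
    "uptime": 42.5                          // seconds the process has been running
}
[/code]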

Creating MySQL data source

Let us associate a data source with the application. As mentioned, we will be using a MySQL database. First we have to install the Loopback MySQL connector, which can be done by running the command npm install loopback-connector-mysql --save.

Next we create a data source for the application which would use mysql connector using the below command:

[code lang=”shell”]
G:\node\loopback-rest>slc loopback:datasource
? Enter the data-source name: mysql_db
? Select the connector for mysql_db: MySQL (supported by StrongLoop)
[/code]

It requires the data source name and the connector to be used which in this case is MySQL.

The above command updates the server/datasources.json with the following:

[code lang="javascript"]
"mysql_db": {
"name": "mysql_db",
"connector": "mysql"
}
[/code]

We would have to update the above data source to define the database name, hostname, port, username and password. Let us update it with the following data:

[code lang="javascript"]
"mysql_db": {
"name": "mysql_db",
"connector": "mysql",
"database":"test",
"host":"localhost",
"port":3306,
"password":"password",
"username":"root"
}
[/code]

Creating Model in Loopback

Let us use the slc tool to create a new model in Loopback. The new model can be created using the command slc loopback:model. It will ask a series of questions covering the model name, data source, the model's base class, and the properties and their types. Below is how we answered the questions and defined the properties:

[code lang="shell"]
G:\node\loopback-rest>slc loopback:model
? Enter the model name: book
? Select the data-source to attach book to: mysql_db (mysql)
? Select model’s base class: PersistedModel
? Expose book via the REST API? Yes
? Custom plural form (used to build REST URL): books
Let’s add some book properties now.

Enter an empty property name when done.
? Property name: name
(!) generator#invoke() is deprecated. Use generator#composeWith() – see http://yeoman.io/authoring/composability.html
invoke loopback:property
? Property type: string
? Required? Yes

Let’s add another book property.
Enter an empty property name when done.
? Property name: isbn
(!) generator#invoke() is deprecated. Use generator#composeWith() – see http://yeoman.io/authoring/composability.html
invoke loopback:property
? Property type: string
? Required? Yes

Let’s add another book property.
Enter an empty property name when done.
? Property name: author
(!) generator#invoke() is deprecated. Use generator#composeWith() – see http://yeoman.io/authoring/composability.html
invoke loopback:property
? Property type: string
? Required? No

Let’s add another book property.
Enter an empty property name when done.
? Property name: pages
(!) generator#invoke() is deprecated. Use generator#composeWith() – see http://yeoman.io/authoring/composability.html
invoke loopback:property
? Property type: number
? Required? No

Let’s add another book property.
Enter an empty property name when done.
? Property name:
[/code]

Once completed you will find two files generated: book.js and book.json as shown in the image below:
loopback model datasource mysql
Apart from the above two files, the new model gets recorded in model-config.json as shown below:

[code lang=”javascript”]
"book": {
"dataSource": "mysql_db",
"public": true
}
[/code]

When we load the URL http://localhost:3000/explorer/#!/books/ you will find a lot of APIs associated with books. Loading any of the books APIs, for example http://localhost:3000/api/books, will result in an error stating that the table test.books was not found. Let us create the table now:

[code lang=”sql”]
CREATE TABLE `test`.`book` (
`id` VARCHAR(250) NOT NULL,
`name` VARCHAR(250) NOT NULL ,
`isbn` VARCHAR(20) NOT NULL ,
`author` VARCHAR(500) NULL ,
`pages` INT NULL
) ENGINE = InnoDB;
[/code]

Let us now load the same URL http://localhost:3000/api/books and this time we will get an empty JSON array response.

So just creating a model has created a whole set of REST APIs around that model. In the next section let us try out the APIs for CRUD operations.
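
Before that, it is worth noting that the generated model is also available programmatically inside the app. As a hedged sketch, a boot script (for example server/boot/create-sample-book.js, a hypothetical file name) could create and query books through the same model that backs the REST API:

[code lang="javascript"]
// Hypothetical boot script: server/boot/create-sample-book.js
module.exports = function (app) {
    var Book = app.models.book; // model name as defined with slc loopback:model

    // Insert one row through the model (the same table the REST API uses)
    Book.create({
        id: "123",
        name: "Sample Book",
        isbn: "1234567890",
        author: "Some Author",
        pages: 100
    }, function (err, book) {
        if (err) return console.error(err);
        console.log("Created book with id: " + book.id);

        // Read all rows back
        Book.find(function (err, books) {
            if (err) return console.error(err);
            console.log("Books in table: " + books.length);
        });
    });
};
[/code]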

Testing CRUD APIs

The URL: http://localhost:3000/explorer not only exposes the list of APIs for each model, but also provides a tool for testing out the APIs.

Adding a new book

The request to add a book is made using the explorer url as shown in the image below:
loopback-add-data
The response after adding the book is as shown in the image below:
loopback response

Getting a given book details

The URL: http://localhost:3000/api/books/{id} is used for getting the book details where the book is identified by the id parameter. Let us get the details of the book added above by using the URL: http://localhost:3000/api/books/123. This gives us the response as shown in the image below:
book details response

Updating book details

The request to update book details is made using the tool as shown below:
loopback update data
You can see in the above image that I am updating just the name and isbn and keeping the rest the same. The response from the API is as shown in the image below:

loopback

Deleting Book

The request to delete the book is made as shown below:
delete-book
The response is as shown below:
delete-response-loopback
Now trying to get the details of the book with id 123 will result in an error as shown below:
book not found error response

Getting list of books

Let's add a few new books, then load the URL http://localhost:3000/api/books to get the list of books as shown below:
list of books response

Conclusion

So with this we have completed the demonstration of CRUD operations using the APIs generated by Loopback, and we didn't write even a single line of code. All of the functionality above was provided by the Loopback framework. In the coming posts we will explore more of this framework and see how we can build APIs using it. I have also published an article on how to use the StrongLoop Arc tool for developing REST APIs through a GUI.

Filed Under: NodeJS Tagged With: Loopback Tutorials, NodeJS Tutorials

Node.js : Operating System Utilities in Node.js OS Module

August 25, 2015 by Mohamed Sanaulla

This tutorial walks through the operating system related utilities in the Node.js OS module. The OS module in Node.js offers a wide set of methods that are useful for getting relevant details about the native operating system.

If you are a Node.js beginner, please read our introduction article on Node.js.

Node.js provides operating system related utilities in the os module. In this article we will show the different APIs provided by the os module, along with the output of each. Let us first create a JS file named node_os_interface.js and add the following line to it:

[code lang=”javascript”]
var os = require('os');
[/code]

Node.js OS Module API

There are 14 methods and properties defined in the OS module of Node.js covered here. They are used for reading details about the native operating system, such as platform, memory and CPU information. The list is:

  1. os.tmpdir()
  2. os.endianness()
  3. os.hostname()
  4. os.type()
  5. os.platform()
  6. os.arch()
  7. os.release()
  8. os.uptime()
  9. os.loadavg()
  10. os.totalmem()
  11. os.freemem()
  12. os.cpus()
  13. os.networkInterfaces()
  14. os.EOL

The following sections explain each of them with simple examples.

tmpdir()

Now let us start with the first API tmpdir() – this API prints the temporary directory of the OS:

[code lang=”javascript”]
console.log("OS Temp Dir: " + os.tmpdir());
[/code]

The above prints: OS Temp Dir: C:\Users\Mohamed\AppData\Local\Temp on my system. It will be different for different systems.

endianness()

This API prints whether the CPU architecture is Big Endian (BE) or Little Endian (LE).

[code lang=”javascript”]
console.log("CPU is BigEndian(BE) or LittleEndian(LE): " + os.endianness());
[/code]

Output

[code lang=”shell”]
CPU is BigEndian(BE) or LittleEndian(LE): LE
[/code]

hostname()

This API prints the operating system hostname.

[code lang=”javascript”]
console.log("OS Hostname: " + os.hostname());
[/code]

Output

[code lang=”shell”]
OS Hostname: Sana-Laptop
[/code]

type()

This API prints the type of the operating system.

[code lang=”javascript”]
console.log("OS Type: " + os.type());
[/code]

Output

[code lang=”shell”]
OS Type: Windows_NT
[/code]

platform()

This API prints the platform of the OS.

[code lang=”javascript”]
console.log("OS Platform: " + os.platform());
[/code]

Output

[code lang=”shell”]
OS Platform: win32
[/code]

arch()

This API prints CPU architecture – whether it is 32 bit, 64 bit or arm architecture.

[code lang=”javascript”]
console.log("OS CPU Architecture: " + os.arch());
[/code]

Output

[code lang=”shell”]
OS CPU Architecture: x64
[/code]

release()

This API prints OS release number.

[code lang=”javascript”]
console.log("OS Release: " + os.release());
[/code]

Output

[code lang=”shell”]
OS Release: 6.3.9600
[/code]

uptime()

This API returns the uptime of the machine i.e the number of seconds it has been running.

[code lang=”javascript”]
console.log("OS Uptime (seconds): " + os.uptime());
[/code]

Output

[code lang=”shell”]
OS Uptime (seconds): 104535.1365887
[/code]

loadavg()

This API returns the load average of the machine in the last 1, 5 and 15 minutes. This concept is relevant to UNIX systems and will return 0,0,0 for windows systems.

[code lang=”javascript”]
console.log("OS load average (Returns 0,0,0 in windows): " + os.loadavg());
[/code]

Output

[code lang=”shell”]
OS load average (Returns 0,0,0 in windows): 0,0,0
[/code]

totalmem()

This API returns the total memory (RAM) available in the system.

[code lang=”javascript”]
console.log("Total RAM (mb): " + (os.totalmem()/1024)/1024);
[/code]

Output

[code lang=”shell”]
Total RAM (mb): 8084.2734375
[/code]

freemem()

This API returns the free memory (RAM) available in the system.

[code lang=”javascript”]
console.log("Free RAM (mb): " + (os.freemem()/1024)/1024)
[/code]

Output

[code lang=”shell”]
Free RAM (mb): 2169.56640625
[/code]

cpus()

This API returns the CPUs available and information about them. We will use JSON.stringify function to pretty print the JSON string.

[code lang=”javascript”]
var cpus = os.cpus();
console.log("CPU Information: " + JSON.stringify(cpus, null, 2));
[/code]

Output

[code lang=”shell”]
CPU Information:
[
{
"model": "Intel(R) Core(TM) i7-4510U CPU @ 2.00GHz",
"speed": 2594,
"times": {
"user": 2376265,
"nice": 0,
"sys": 2880406,
"idle": 66987203,
"irq": 239203
}
},
{
"model": "Intel(R) Core(TM) i7-4510U CPU @ 2.00GHz",
"speed": 2594,
"times": {
"user": 2378703,
"nice": 0,
"sys": 2435625,
"idle": 67429250,
"irq": 136937
}
},
{
"model": "Intel(R) Core(TM) i7-4510U CPU @ 2.00GHz",
"speed": 2594,
"times": {
"user": 2435640,
"nice": 0,
"sys": 2670859,
"idle": 67137046,
"irq": 25000
}
},
{
"model": "Intel(R) Core(TM) i7-4510U CPU @ 2.00GHz",
"speed": 2594,
"times": {
"user": 2503703,
"nice": 0,
"sys": 1726234,
"idle": 68013625,
"irq": 24640
}
}
]
[/code]

The above information contains, for each CPU, its speed and the time it has spent doing user operations, system operations and being idle.
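
Those counters can be turned into a rough utilisation figure. The small sketch below sums them and prints the idle percentage per CPU (note that the values are cumulative since boot, not an instantaneous load):

[code lang="javascript"]
var os = require('os');

os.cpus().forEach(function (cpu, index) {
    var t = cpu.times;
    var total = t.user + t.nice + t.sys + t.idle + t.irq;
    var idlePercent = (t.idle / total) * 100;
    console.log("CPU " + index + " idle: " + idlePercent.toFixed(2) + "%");
});
[/code]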

networkInterfaces()

This API returns the network interfaces in the system i.e the entities that interface between OS and the network. These can be physical devices and logical entities for connection to localhost.

[code lang=”javascript”]
console.log("Network Interfaces: " + JSON.stringify(os.networkInterfaces(), null, 2));
[/code]

Output

[code lang=”shell”]
Network Interfaces:
{
"Wi-Fi": [
{
"address": "fe80::19c3:55f9:ecd:8a5a",
"netmask": "ffff:ffff:ffff:ffff::",
"family": "IPv6",
"mac": "b0:10:41:68:31:53",
"scopeid": 2,
"internal": false
},
{
"address": "192.168.1.4",
"netmask": "255.255.255.0",
"family": "IPv4",
"mac": "b0:10:41:68:31:53",
"internal": false
}
],
"Loopback Pseudo-Interface 1": [
{
"address": "::1",
"netmask": "ffff:ffff:ffff:ffff:ffff:ffff:ffff:ffff",
"family": "IPv6",
"mac": "00:00:00:00:00:00",
"scopeid": 0,
"internal": true
},
{
"address": "127.0.0.1",
"netmask": "255.0.0.0",
"family": "IPv4",
"mac": "00:00:00:00:00:00",
"internal": true
}
],
"Teredo Tunneling Pseudo-Interface": [
{
"address": "2001:0:9d38:6abd:2444:2b76:8a3f:ec06",
"netmask": "ffff:ffff:ffff:ffff::",
"family": "IPv6",
"mac": "00:00:00:00:00:00",
"scopeid": 0,
"internal": false
},
{
"address": "fe80::2444:2b76:8a3f:ec06",
"netmask": "ffff:ffff:ffff:ffff::",
"family": "IPv6",
"mac": "00:00:00:00:00:00",
"scopeid": 8,
"internal": false
}
]
}
[/code]

Each of the above network interfaces has an entry for IPv4 and one for IPv6.
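
A common practical use of this API is to pick out the machine's non-internal IPv4 addresses; a small sketch:

[code lang="javascript"]
var os = require('os');

var interfaces = os.networkInterfaces();
Object.keys(interfaces).forEach(function (name) {
    interfaces[name].forEach(function (entry) {
        // Skip loopback/internal entries and IPv6 addresses
        if (!entry.internal && entry.family === 'IPv4') {
            console.log(name + ": " + entry.address);
        }
    });
});
[/code]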

EOL

This returns the end-of-line marker for the OS. On Windows it is \r\n, while on Linux and OS X it is \n.

[code lang=”javascript”]
console.log("EOL Marker for OS: " + os.EOL);
[/code]

The complete program is given below:

[code lang=”javascript”]
//file name: node_os_interface.js
var os = require('os');

console.log("OS Temp Dir: " + os.tmpdir());
console.log("CPU is BigEndian(BE) or LittleEndian(LE): " + os.endianness());
console.log("OS Hostname: " + os.hostname());
console.log("OS Type: " + os.type());
console.log("OS Platform: " + os.platform());
console.log("OS CPU Architecture: " + os.arch());
console.log("OS Release: " + os.release());
console.log("OS Uptime (seconds): " + os.uptime());
console.log("OS load average (Returns 0,0,0 in windows): " + os.loadavg());
console.log("Total RAM (mb): " + (os.totalmem()/1024)/1024);
console.log("Free RAM (mb): " + (os.freemem()/1024)/1024)
var cpus = os.cpus();
console.log("CPU Information: " + JSON.stringify(cpus, null, 2));
console.log("Network Interfaces: " + JSON.stringify(os.networkInterfaces(), null, 2));
console.log("EOL Marker for OS: " + os.EOL);
[/code]

The above can be executed by running the command: node node_os_interface.js. The output is given below:

[code lang=”shell”]
OS Temp Dir: C:\Users\Mohamed\AppData\Local\Temp
CPU is BigEndian(BE) or LittleEndian(LE): LE
OS Hostname: Sana-Laptop
OS Type: Windows_NT
OS Platform: win32
OS CPU Architecture: x64
OS Release: 6.3.9600
OS Uptime (seconds): 145774.5905403
OS load average (Returns 0,0,0 in windows): 0,0,0
Total RAM (mb): 8084.2734375
Free RAM (mb): 2064.69921875
CPU Information:
… CPU Information already shown above …
Network Interfaces:
… Network interface information already shown above …
EOL Marker for OS:

[/code]

I hope this tutorial helped you understand the Node.js OS module with simple examples. In my next articles, I will cover a few more Node.js topics with more examples.

Filed Under: NodeJS Tagged With: NodeJS Tutorials

Working with SQL Databases and Spring Boot

August 21, 2015 by Mohamed Sanaulla

In this tutorial I am going to explain how to use SQL databases and Spring Boot together. If you are working on Spring Framework projects, you should know how to use SQL databases with Spring Boot for persisting application data.

SQL databases are an integral part of any application being developed; they persist the application data and provide advanced support for querying it using the Structured Query Language (SQL). Spring Boot provides great support for interacting with SQL databases with minimal or no XML configuration. In this article we will look at the following:

  1. Configuring in memory database and fetching data using JdbcTemplate
  2. Configuring production database and fetching data using JdbcTemplate
  3. Using JPA and Spring Data

In Memory Database Using JdbcTemplate

The fastest way to test something is to use an embedded DB. This helps in getting started with the application without the hassle of installing a database. Let us see how we can use an embedded DB and how we can initialize the schema we need in it. First let us add the dependencies to pom.xml:

  • Read : Apache Maven for Beginners

[code lang=”xml”]
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-jdbc</artifactId>
</dependency>
<dependency>
<groupId>org.hsqldb</groupId>
<artifactId>hsqldb</artifactId>
<scope>runtime</scope>
</dependency>
[/code]

  • In the above dependencies we have included the JDBC starter, which gives us JdbcTemplate and other JDBC libraries, and the org.hsqldb dependency, which adds the embedded HSQLDB.
  • We do not need to add any configuration to connect to this embedded DB; it is all managed by Spring Boot.
  • These embedded DBs are in-memory, so each time the application shuts down the schema and data get erased. One way to keep the schema and data available is to populate them during application startup, and Spring Boot takes care of this.
  • One of the approaches is to create schema.sql and data.sql files in the application classpath.
  • Spring JDBC uses these SQL files to create the schema and populate data into it. There are other techniques, which are listed here.

One can create multiple schema.sql and data.sql files, one for each DB platform, such as schema-hsqldb.sql, data-hsqldb.sql, schema-mysql.sql and so on. The file to be picked is decided by the value assigned to the property spring.datasource.platform. In this post we are going to create a schema-hsqldb.sql file with the following contents:

[code lang=”sql”]
CREATE TABLE person(
first_name VARCHAR(150),
last_name VARCHAR(150),
age INTEGER,
place VARCHAR(100)
);
[/code]

Next we create the application-local.properties file to define values for the application properties. Please read our previous articles about external configuration. Below are the contents of application-local.properties:

[code]
spring.datasource.platform=hsqldb
[/code]

Next is to create the Person model class:

[code lang=”java”]
package net.javabeat;

public class Person {
private String firstName;
private String lastName;
private int age;
private String place;
public String getFirstName() {
return firstName;
}
public void setFirstName(String firstName) {
this.firstName = firstName;
}
public String getLastName() {
return lastName;
}
public void setLastName(String lastName) {
this.lastName = lastName;
}
public int getAge() {
return age;
}
public void setAge(int age) {
this.age = age;
}
public String getPlace() {
return place;
}
public void setPlace(String place) {
this.place = place;
}

public String toString(){
StringBuilder builder = new StringBuilder();
builder.append(this.getFirstName())
.append(", ")
.append(this.getLastName())
.append(", ")
.append(this.getPlace())
.append(", ")
.append(this.getAge());

return builder.toString();
}

}
[/code]

Next we create a service class PersonService which uses JdbcTemplate to insert data into and retrieve data from HSQLDB. There are two methods in the service class: addPerson and getAllPerson. addPerson adds a new row to the person table and getAllPerson fetches all the rows in the person table. Below is the PersonService class definition:

[code lang=”java”]
package net.javabeat;

import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.List;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.core.RowMapper;
import org.springframework.stereotype.Service;

@Service
public class PersonService {

@Autowired
private JdbcTemplate jdbcTemplate;

public int addPerson(Person person){
String sql = "INSERT INTO person(first_name, last_name, age, place) VALUES(?,?,?,?)";
return jdbcTemplate.update(sql, person.getFirstName(),
person.getLastName(), person.getAge(), person.getPlace());
}

public List<Person> getAllPerson(){
return jdbcTemplate.query("SELECT * FROM person", new RowMapper<Person>(){

public Person mapRow(ResultSet rs, int arg1) throws SQLException {
Person p = new Person();
p.setAge(rs.getInt("age"));
p.setFirstName(rs.getString("first_name"));
p.setLastName(rs.getString("last_name"));
p.setPlace(rs.getString("place"));
return p;
}

});
}
}
[/code]

Creation of the DataSource and JdbcTemplate instances is all taken care of by Spring Boot. Next is the creation of the main class which launches the SpringApplication. Below is the definition of the SpringbootSqlDemo class:

[code lang=”java”]
package net.javabeat;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class SpringbootSqlDemo implements CommandLineRunner{

Logger logger = LoggerFactory.getLogger(SpringbootSqlDemo.class);

@Autowired
PersonService personService;

public void run(String... args) {
Person person = new Person();
person.setFirstName("FName");
person.setLastName("LName");
person.setAge(20);
person.setPlace("Place");

if ( personService.addPerson(person) > 0){
logger.info("Person saved successfully");
}

for(Person p : personService.getAllPerson()){
logger.info(p.toString());
}

}

public static void main(String[] args) {
SpringApplication.run(SpringbootSqlDemo.class, args);
}

}
[/code]

The above class implements CommandLineRunner interface so that it can run once the Spring Boot application context has been completely initialized. This allows for creation of instances of DataSource, JdbcTemplate, PersonService and other beans.

The run(String... args) method first inserts one row into the person table and then retrieves all the rows in the person table and logs them using the logger. So once we run the application we will be able to see the rows inserted. With all these new files the project structure looks like:
Spring Boot SQL Databases

Let us run the application using the command: mvn spring-boot:run -Dspring.profiles.active=local. Below is the output snippet obtained after the application executes:

[code]
…
2015-08-20 19:50:00.704 INFO 5572 — [tSqlDemo.main()] net.javabeat.SpringbootSqlDemo : Person saved successfully
2015-08-20 19:50:00.719 INFO 5572 — [tSqlDemo.main()] net.javabeat.SpringbootSqlDemo : FName, LName, Place, 20
…
[/code]

You can see that a new row has been added to the person table and that all the rows available in the table have been printed.

Production Database Configurations Using JdbcTemplate

In-memory databases have a lot of restrictions and are useful in the early stages of an application, mainly in local environments. As application development progresses we need the data to persist even after the application stops.

In such cases we configure an installed database. To illustrate this, we will use PostgreSQL, which can be downloaded and installed from here. Let us create a new profile named stage which will use PostgreSQL. For this we have to create application-stage.properties as shown below:

[code]
spring.datasource.driverClassName=org.postgresql.Driver
spring.datasource.url=jdbc:postgresql://localhost:5433/postgres
spring.datasource.username=postgres
spring.datasource.password=postgres
[/code]

These are the driver, connection URL, username and password used to connect to the PostgreSQL instance. We also have to create the person table in the PostgreSQL instance.

We also update pom.xml to comment out the hsqldb dependency and instead add a dependency on the PostgreSQL driver as shown below:

[code lang=”xml”]
<!– <dependency>
<groupId>org.hsqldb</groupId>
<artifactId>hsqldb</artifactId>
<scope>runtime</scope>
</dependency> –>
<dependency>
<groupId>org.postgresql</groupId>
<artifactId>postgresql</artifactId>
<version>9.4-1201-jdbc41</version>
</dependency>
[/code]

Let us now run the application by using the command: mvn spring-boot:run -Dspring.profiles.active=stage. You can see the same output you saw with the above run:

[code]
2015-08-20 20:06:23.829 INFO 3676 — [tSqlDemo.main()] net.javabeat.SpringbootSqlDemo : Person saved successfully
2015-08-20 20:06:23.845 INFO 3676 — [tSqlDemo.main()] net.javabeat.SpringbootSqlDemo : FName, LName, Place, 20
[/code]

But in this case if you run it multiple times then you will see multiple entries in the person table as shown below:

[code]
2015-08-20 20:54:21.914 INFO 6924 — [tSqlDemo.main()] net.javabeat.SpringbootSqlDemo : Person saved successfully
2015-08-20 20:54:21.945 INFO 6924 — [tSqlDemo.main()] net.javabeat.SpringbootSqlDemo : FName, LName, Place, 20
2015-08-20 20:54:21.945 INFO 6924 — [tSqlDemo.main()] net.javabeat.SpringbootSqlDemo : FName, LName, Place, 20
[/code]

This is because the data is persistent in this case, unlike HSQLDB which is an in-memory DB.

JPA and Spring Data with Spring Boot

In the sections above we interacted with the DB using JdbcTemplate. In this section we will see how the same can be achieved using the Java Persistence API (JPA). Spring Data provides an excellent mechanism for persistence using JPA. The first step is to update pom.xml to add a dependency on JPA as shown below:

[code lang=”xml”]
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-jpa</artifactId>
</dependency>
[/code]

Next we are going to introduce a new column in the person table, create a sequence to auto-increment the value of that column, and bind the sequence to it. The SQL commands below achieve that:

[code lang=”sql”]
CREATE SEQUENCE person_id_seq START WITH 1 INCREMENT BY 1;
ALTER TABLE person ADD COLUMN id numeric DEFAULT nextval('person_id_seq');
ALTER SEQUENCE person_id_seq OWNED BY person.id;
[/code]

Next is to create an entity class that maps to the underlying table. Let us create PersonEntity as shown below:

[code lang=”java”]
package net.javabeat;

import java.io.Serializable;

import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;
import javax.persistence.Table;

@Entity
@Table(name = "person")
public class PersonEntity implements Serializable{

private static final long serialVersionUID = -1801714432822866390L;

@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
private long id;

@Column(name="first_name", nullable = false)
private String firstName;

@Column(name="last_name", nullable = false)
private String lastName;

private int age;

private String place;

protected PersonEntity(){

}

public PersonEntity(String firstName, String lastName, int age, String place){
this.firstName = firstName;
this.lastName = lastName;
this.age = age;
this.place = place;
}

public String toString(){
StringBuilder builder = new StringBuilder();
builder.append(this.getId()).append(", ")
.append(this.getFirstName()).append(", ")
.append(this.getLastName()).append(", ")
.append(this.getPlace()).append(", ")
.append(this.getAge());

return builder.toString();
}

public long getId() {
return id;
}

public void setId(long id) {
this.id = id;
}

public String getFirstName() {
return firstName;
}

public void setFirstName(String firstName) {
this.firstName = firstName;
}

public String getLastName() {
return lastName;
}

public void setLastName(String lastName) {
this.lastName = lastName;
}

public int getAge() {
return age;
}

public void setAge(int age) {
this.age = age;
}

public String getPlace() {
return place;
}

public void setPlace(String place) {
this.place = place;
}
}
[/code]

Next we create a repository interface that provides basic APIs to interact with the DB and also allows us to add new ones. We will be using the CrudRepository provided by Spring Data. It gives us APIs for CRUD operations and some finder operations like findAll, findOne and count. Let us create the PersonRepository interface as shown below:

[code lang=”java”]
package net.javabeat;

import org.springframework.data.repository.CrudRepository;

public interface PersonRepository extends CrudRepository<PersonEntity, Long>{
}
[/code]

Next is to update the SpringbootSqlDemo class with code that accesses the database using PersonRepository and PersonEntity. We will keep the JdbcTemplate code as well. Add the below code to the run method of the SpringbootSqlDemo class:

[code lang=”java”]
logger.info("Using JPA for insert and find");
PersonEntity personEntity = new PersonEntity("fName2", "lName2", 24, "Bangalore");
personEntity = personRepository.save(personEntity);
logger.info("Person with ID: " + personEntity.getId() + " saved successfully");

for ( PersonEntity pEntity : personRepository.findAll()){
logger.info(pEntity.toString());
}
[/code]
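The other CrudRepository operations mentioned earlier, such as count and findOne, can be used in the same way. As a small, hypothetical addition to the run method (not part of the original example; it assumes a row with id 1 exists in the person table):

[code lang=”java”]
// Hypothetical extra lines for the run() method, only to illustrate the other
// CrudRepository operations mentioned above.
logger.info("Total persons: " + personRepository.count());
logger.info("Person with id 1: " + personRepository.findOne(1L));
[/code]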

The application project structure looks something like:
Spring Boot Spring Data JPA

Let us run the application by using the command: mvn spring-boot:run -Dspring.profiles.active=stage. You can notice the data inserted and read using JdbcTemplate as well as the data inserted and read using JPA, as shown below:

[code lang=”java”]
2015-08-21 06:18:59.054 INFO 13544 — [tSqlDemo.main()] net.javabeat.SpringbootSqlDemo : Person saved successfully
2015-08-21 06:18:59.054 INFO 13544 — [tSqlDemo.main()] net.javabeat.SpringbootSqlDemo : FName, LName, Place, 20
2015-08-21 06:18:59.054 INFO 13544 — [tSqlDemo.main()] net.javabeat.SpringbootSqlDemo : FName, LName, Place, 20
2015-08-21 06:18:59.054 INFO 13544 — [tSqlDemo.main()] net.javabeat.SpringbootSqlDemo : FName, LName, Place, 20
2015-08-21 06:18:59.054 INFO 13544 — [tSqlDemo.main()] net.javabeat.SpringbootSqlDemo : FName, LName, Place, 20
2015-08-21 06:18:59.054 INFO 13544 — [tSqlDemo.main()] net.javabeat.SpringbootSqlDemo : FName, LName, Place, 20
2015-08-21 06:18:59.054 INFO 13544 — [tSqlDemo.main()] net.javabeat.SpringbootSqlDemo : FName, LName, Place, 20
2015-08-21 06:18:59.054 INFO 13544 — [tSqlDemo.main()] net.javabeat.SpringbootSqlDemo : fName2, lName2, Bangalore, 24
2015-08-21 06:18:59.054 INFO 13544 — [tSqlDemo.main()] net.javabeat.SpringbootSqlDemo : FName, LName, Place, 20
2015-08-21 06:18:59.054 INFO 13544 — [tSqlDemo.main()] net.javabeat.SpringbootSqlDemo : Using JPA for insert and find
2015-08-21 06:18:59.101 INFO 13544 — [tSqlDemo.main()] net.javabeat.SpringbootSqlDemo : Person with ID: 9 saved successfully
2015-08-21 06:18:59.272 INFO 13544 — [tSqlDemo.main()] net.javabeat.SpringbootSqlDemo : 1, FName, LName, Place, 20
2015-08-21 06:18:59.272 INFO 13544 — [tSqlDemo.main()] net.javabeat.SpringbootSqlDemo : 2, FName, LName, Place, 20
2015-08-21 06:18:59.272 INFO 13544 — [tSqlDemo.main()] net.javabeat.SpringbootSqlDemo : 3, FName, LName, Place, 20
2015-08-21 06:18:59.272 INFO 13544 — [tSqlDemo.main()] net.javabeat.SpringbootSqlDemo : 4, FName, LName, Place, 20
2015-08-21 06:18:59.272 INFO 13544 — [tSqlDemo.main()] net.javabeat.SpringbootSqlDemo : 5, FName, LName, Place, 20
2015-08-21 06:18:59.272 INFO 13544 — [tSqlDemo.main()] net.javabeat.SpringbootSqlDemo : 6, FName, LName, Place, 20
2015-08-21 06:18:59.272 INFO 13544 — [tSqlDemo.main()] net.javabeat.SpringbootSqlDemo : 7, fName2, lName2, Bangalore, 24
2015-08-21 06:18:59.272 INFO 13544 — [tSqlDemo.main()] net.javabeat.SpringbootSqlDemo : 8, FName, LName, Place, 20
2015-08-21 06:18:59.272 INFO 13544 — [tSqlDemo.main()] net.javabeat.SpringbootSqlDemo : 9, fName2, lName2, Bangalore, 24
[/code]

Using JPA reduces a lot of boilerplate code. One can even add methods to the repository interface and annotate them with the SQL we want to run, as sketched below. The code used in this article is available in JavaBeat’s GitHub repository here (also originally written in Sanaulla’s repository here).
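As a rough illustration of such an annotated method (the finder and its query below are hypothetical and not part of the example project):

[code lang=”java”]
package net.javabeat;

import java.util.List;

import org.springframework.data.jpa.repository.Query;
import org.springframework.data.repository.CrudRepository;

public interface PersonRepository extends CrudRepository<PersonEntity, Long> {

    // Hypothetical finder: runs the given native SQL and maps the resulting
    // rows back to PersonEntity instances.
    @Query(value = "SELECT * FROM person WHERE place = ?1", nativeQuery = true)
    List<PersonEntity> findByPlace(String place);
}
[/code]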

In this article we saw how we moved from an in-memory database to installed databases, and also how we could use JdbcTemplate and JPA to interact with the database. We didn’t have to write any XML configuration; everything was managed by the auto-configuration provided by Spring Boot. Overall, you should now have a good idea of how to use SQL databases and Spring Boot together for persisting application data.

How are you handling persistence in your projects? Please share your experience in the comments section.

Filed Under: Spring Framework Tagged With: SpringBoot Tutorials

Logging Configuration in Spring Boot

August 18, 2015 by Mohamed Sanaulla Leave a Comment

Logging is an important part of any application. Checking the logs is the first step towards debugging any issue, so logging the right information is important. At the same time, logging too much can lead to bloated log files that use a lot of disk space. In some applications developers use synchronous logging, which can impact the application’s performance; logging libraries like Logback and Log4j2 provide asynchronous logging.
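As a rough illustration of that last point (a minimal sketch, not used elsewhere in this article), asynchronous logging in Logback can be enabled by wrapping a regular appender in an AsyncAppender:

[code lang=”xml”]
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
<appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
<encoder>
<pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
</encoder>
</appender>
<!-- AsyncAppender hands log events to a background thread instead of blocking the caller -->
<appender name="ASYNC" class="ch.qos.logback.classic.AsyncAppender">
<appender-ref ref="STDOUT" />
</appender>
<root level="info">
<appender-ref ref="ASYNC" />
</root>
</configuration>
[/code]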

  • Also Read : Spring Boot Tutorials

Spring Boot provides great support for logging and a lot of hooks to configure it. In this article we are going to look at the default support for logging in Spring Boot, then use the hooks, i.e. the Spring properties, to configure the logging. At the end of this article you will be familiar with the logging configuration in Spring Boot applications.

Logging Configurations Tutorial using Spring Boot

Create Spring Boot Project using Maven

Let us first create a simple maven project and update the pom.xml with the below code:

[code lang=”xml”]
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>net.javabeat</groupId>
<artifactId>springboot-logging</artifactId>
<version>0.1</version>
<name>SpringBoot Logging</name>
<properties>
<java.version>1.8</java.version>
<start-class>net.javabeat.SpringBootLoggingDemo</start-class>
</properties>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>1.2.3.RELEASE</version>
</parent>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter</artifactId>
</dependency>
</dependencies>

<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
</project>
[/code]

The spring-boot-starter dependency already includes the logging dependency spring-boot-starter-logging, so we need not add a logging dependency again. spring-boot-starter-logging brings in the SLF4J and Logback dependencies along with the appropriate SLF4J bridges for other logging libraries.

Default Logging Support in Spring Boot

Spring Boot reference document says:

By default, If you use the ‘Starter POMs’, Logback will be used for logging. Appropriate Logback routing is also included to ensure that dependent libraries that use Java Util Logging, Commons Logging, Log4J or SLF4J will all work correctly.

Let us create SpringBootLoggingDemo class in net.javabeat package which will be the starting point for our application. By default the logging will be written to the console using a fixed logging format used by logback as shown below:

[code]
09:59:03.086 [net.javabeat.SpringBootLoggingDemo.main()] ERROR net.javabeat.SpringBootLoggingDemo – Message logged at ERROR level
[/code]

The above log consists of the following parts:

  1. Time with millisecond precision
  2. Name of the thread in square brackets ([])
  3. Log level
  4. Logger name, generally class name truncated for brevity
  5. Actual log message

The default log level configured by Logback is DEBUG, i.e. any messages logged at ERROR, WARN, INFO or DEBUG get printed on the console. Let us create the SpringBootLoggingDemo class with a logger and log messages at ERROR, WARN, INFO and DEBUG levels as shown below:

[code lang=”java”]
package net.javabeat;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class SpringBootLoggingDemo {

private static final Logger logger = LoggerFactory.getLogger(SpringBootLoggingDemo.class);

public static void main(String[] args) {
SpringApplication springApplication = new SpringApplication(new Object[]{SpringBootLoggingDemo.class});
springApplication.run(args);

logger.error("Message logged at ERROR level");
logger.warn("Message logged at WARN level");
logger.info("Message logged at INFO level");
logger.debug("Message logged at DEBUG level");

}
}
[/code]

Run the application using the command: mvn spring-boot:run to see the log messages printed on the console as shown:

[code lang=”shell”]
10:01:40.996 [net.javabeat.SpringBootLoggingDemo.main()] ERROR net.javabeat.SpringBootLoggingDemo – Message logged at ERROR level
10:01:40.996 [net.javabeat.SpringBootLoggingDemo.main()] WARN net.javabeat.SpringBootLoggingDemo – Message logged at WARN level
10:01:40.996 [net.javabeat.SpringBootLoggingDemo.main()] INFO net.javabeat.SpringBootLoggingDemo – Message logged at INFO level
10:01:40.996 [net.javabeat.SpringBootLoggingDemo.main()] DEBUG net.javabeat.SpringBootLoggingDemo – Message logged at DEBUG level
[/code]

Modifying the default configuration by providing logback.xml

Let us now modify the default logback configuration by providing a logback.xml configuration file as shown below:

[code lang=”xml”]
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
<appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
<!-- Log message format -->
<encoder>
<pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n
</pattern>
</encoder>
</appender>

<!-- Setting the root level of logging to INFO -->
<root level="info">
<appender-ref ref="STDOUT" />
</root>
</configuration>
[/code]

The above configuration does the following:

  • Set up an appender named STDOUT using ConsoleAppender, which prints to the console
  • Provide a pattern to the appender to build the log message
  • Set up a root logger which logs any message at INFO level and above using the STDOUT appender

We have to place this logback.xml on the classpath of the application for Spring Boot to pick up the configuration. I have placed it in src/main/resources as shown below:
Spring Boot Logging

Let us run the application again using mvn spring-boot:run and see that only the ERROR, WARN and INFO messages are logged while the DEBUG one is not, as shown below:

[code lang=”shell”]
14:04:23.805 [net.javabeat.SpringBootLoggingDemo.main()] ERROR net.javabeat.SpringBootLoggingDemo – Message logged at ERROR level
14:04:23.808 [net.javabeat.SpringBootLoggingDemo.main()] WARN net.javabeat.SpringBootLoggingDemo – Message logged at WARN level
14:04:23.809 [net.javabeat.SpringBootLoggingDemo.main()] INFO net.javabeat.SpringBootLoggingDemo – Message logged at INFO level
[/code]

This was a simple logback.xml configuration. Let us see how we can configure it to set different log levels for different Java packages in the application. For that let us first create two classes: TestModel.java in net.javabeat.model and TestService.java in net.javabeat.service, as shown below:

TestModel.java

[code lang=”java”]
package net.javabeat.model;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class TestModel {
private static final Logger logger = LoggerFactory.getLogger(TestModel.class);

public TestModel(){
logger.debug("Log message at DEBUG level from TestModel constructor");
logger.info("Log message at INFO level from TestModel constructor");
}
}
[/code]

TestService.java

[code lang=”java”]
package net.javabeat.service;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class TestService {

private Logger logger = LoggerFactory.getLogger(TestService.class);

public void service(){
logger.info("Message at INFO level from TestService.service()");
logger.warn("Message at WARN level from TestService.service()");
}
}
[/code]

The project structure with these new files looks like below:
Spring Boot Logback

We are going to set the log level to WARN for the net.javabeat.service package and to INFO for the net.javabeat.model package, as shown below:

[code lang=”xml” highlight=”7,14″]
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
<appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
<!-- Log message format -->
<encoder>
<pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
</encoder>
</appender>

<!-- Setting the logging level to WARN for code in net.javabeat.service -->
<logger name="net.javabeat.service" level="WARN" />

<!-- Setting the logging level to INFO for code in net.javabeat.model -->
<logger name="net.javabeat.model" level="INFO" />

<!-- Setting the root level of logging to INFO -->
<root level="info">
<appender-ref ref="STDOUT" />
</root>
</configuration>
[/code]

We would have to update the SpringBootLoggingDemo class and add the below lines just before creating a new instance of SpringApplication:

[code lang=”java”]
TestModel model = new TestModel();
TestService service = new TestService();
service.service();
[/code]

Now, running the application gives us the below output:

[code lang=”shell”]
17:36:07.616 [net.javabeat.SpringBootLoggingDemo.main()] ERROR net.javabeat.SpringBootLoggingDemo – Message logged at ERROR level
17:36:07.632 [net.javabeat.SpringBootLoggingDemo.main()] WARN net.javabeat.SpringBootLoggingDemo – Message logged at WARN level
17:36:07.632 [net.javabeat.SpringBootLoggingDemo.main()] INFO net.javabeat.SpringBootLoggingDemo – Message logged at INFO level
17:36:07.632 [net.javabeat.SpringBootLoggingDemo.main()] INFO net.javabeat.model.TestModel – Log message at INFO level from TestModel constructor
17:36:07.632 [net.javabeat.SpringBootLoggingDemo.main()] WARN net.javabeat.service.TestService – Message at WARN level from TestService.service()
[/code]

You can observe that for TestModel the message at DEBUG level was not logged and for TestService the message at INFO level was not logged.

Sending the log messages to a file

In this tutorial I am using logback.xml as the logging configuration. Spring Boot by default uses Logback as the logging implementation if you do not specify any other implementation. If you want to write the logs to a file but you don’t want to use a logback.xml file for the configuration, you can use the Spring Boot properties logging.path or logging.file to specify the location of the log files.

Spring Boot will look for the above two properties; if they are not specified then the log messages are sent only to the console appender, otherwise the log messages are also written to the file derived from those properties.
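As a minimal sketch of how either property could be set in application.properties (the file and directory names below are placeholders):

[code]
# Write the logs to the named file (relative paths resolve against the working directory).
logging.file=myapp.log

# Alternatively, write a spring.log file inside the given directory.
# logging.path=/var/log/myapp
[/code]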

Spring Boot Logging Properties

Now let’s look at how to use Logback for printing the log messages in files. Till now we saw the log messages being printed on the console; let us now see how the same can be written to a file. We need to update the logback.xml configuration to add a new appender that writes to a file and then use that appender in the loggers defined in the configuration file, as shown below:

[code lang=”xml” highlight=”10-16,18,21,26″]
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
<appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
<!-- Log message format -->
<encoder>
<pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
</encoder>
</appender>
<!-- New appender to write to file -->
<appender name="FILE" class="ch.qos.logback.core.FileAppender">
<!-- Name of the file where the log messages are written -->
<file>myApp.log</file>
<encoder>
<pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
</encoder>
</appender>
<logger name="net.javabeat.service" level="WARN">
<appender-ref ref="FILE" />
</logger>
<logger name="net.javabeat.model" level="INFO">
<appender-ref ref="FILE" />
</logger>

<!-- Setting the root level of logging to INFO -->
<root level="info">
<appender-ref ref="FILE" />
</root>
</configuration>
[/code]

Now running the application will redirect all the log messages to the file myApp.log present in the current directory.

Making use of default logback configuration base.xml

Till now we haven’t made use of the Spring Boot logging features or the defaults it provides. In this section we are going to show how we can make use of the base.xml Logback configuration that Spring Boot ships with. base.xml sets up the following things:

  • Wiring the Spring properties logging.file and logging.path into the Logback configuration
  • Setting up console and file appenders
  • Setting log levels for some packages

Let us update the logback.xml configuration file with the below configuration:

[code lang=”xml”]
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
<include resource="org/springframework/boot/logging/logback/base.xml"/>
<logger name="net.javabeat.service" level="WARN"></logger>
<logger name="net.javabeat.model" level="INFO"></logger>
</configuration>
[/code]

Then add an application.properties file with a logging.file property holding the name of the log file, as shown: logging.file=demo_logging.log. The logs will now be written both to the console and to the file. Let us verify this by running the application using mvn spring-boot:run to get the below log messages printed on the console and in the file demo_logging.log:

[code lang=”shell”]
…
2015-08-18 19:35:09.725 ERROR 7316 — [net.javabeat.SpringBootLoggingDemo.main()] net.javabeat.SpringBootLoggingDemo : Message logged at ERROR level
2015-08-18 19:35:09.725 WARN 7316 — [net.javabeat.SpringBootLoggingDemo.main()] net.javabeat.SpringBootLoggingDemo : Message logged at WARN level
2015-08-18 19:35:09.725 INFO 7316 — [net.javabeat.SpringBootLoggingDemo.main()] net.javabeat.SpringBootLoggingDemo : Message logged at INFO level
2015-08-18 19:35:09.726 INFO 7316 — [net.javabeat.SpringBootLoggingDemo.main()] net.javabeat.model.TestModel : Log message at INFO level from TestModel constructor
2015-08-18 19:35:09.727 WARN 7316 — [net.javabeat.SpringBootLoggingDemo.main()] net.javabeat.service.TestService : Message at WARN level from TestService.service()
…

C:\Users\Mohamed\workspace\springboot-external-config>more demo_logging.log
…
2015-08-18 19:35:09.725 ERROR 7316 — [net.javabeat.SpringBootLoggingDemo.main()] net.javabeat.SpringBootLoggingDemo : Message logged at ERROR level
2015-08-18 19:35:09.725 WARN 7316 — [net.javabeat.SpringBootLoggingDemo.main()] net.javabeat.SpringBootLoggingDemo : Message logged at WARN level
2015-08-18 19:35:09.725 INFO 7316 — [net.javabeat.SpringBootLoggingDemo.main()] net.javabeat.SpringBootLoggingDemo : Message logged at INFO level
2015-08-18 19:35:09.726 INFO 7316 — [net.javabeat.SpringBootLoggingDemo.main()] net.javabeat.model.TestModel : Log message at INFO level from TestModel constructor
2015-08-18 19:35:09.727 WARN 7316 — [net.javabeat.SpringBootLoggingDemo.main()] net.javabeat.service.TestService : Message at WARN level from TestService.service()
[/code]

You can see that with a minimal logback.xml we have achieved a lot more, thanks to the defaults provided by Spring Boot. The code for this article can be found here.

Conclusion

I hope this tutorial has provided enough insight into how to use the logging mechanism in Spring Boot applications. I have also covered the default logging support in Spring Boot applications and how to override it by modifying the logback.xml file. If you have any questions about logging configuration in Spring Boot, please write them in the comments section.

Filed Under: Spring Framework Tagged With: Spring Boot Tutorials

Spring Boot : External Configurations for Spring Boot

August 14, 2015 by Mohamed Sanaulla Leave a Comment

This tutorial explains the different ways you can provide external configurations for Spring Boot applications. When you work with real environments, external configurations for Spring Boot become important for flexibility. If you have any questions, please write them in the comments section.

Applications have to store the configuration of their database, i.e. the hostname, the database name and the database user; there is also information like the hostnames of other systems the application depends on, some Spring-related configuration overrides and so on. The Spring-related configuration properties can be found here.

Spring Boot External Configurations Example

There are multiple sources from which the configuration can be read, and the order in which the configuration properties override one another is determined by Spring Boot. The order, as described here, is listed below:

  • Command line arguments
  • JNDI attributes from java:comp/env
  • Java System properties (System.getProperties())
  • OS environment variables
  • A RandomValuePropertySource that only has properties in random.*
  • Profile-specific application properties outside of your packaged jar (application-{profile}.properties and YAML variants)
  • Profile-specific application properties packaged inside your jar (application-{profile}.properties and YAML variants)
  • Application properties outside of your packaged jar (application.properties and YAML variants)
  • Application properties packaged inside your jar (application.properties and YAML variants)
  • @PropertySource annotations on your @Configuration classes
  • Default properties (specified using SpringApplication.setDefaultProperties)

In this post I am going to show you how to define properties using the following approaches:

  • Command line arguments
  • Profile-specific application properties outside of your packaged jar (application-{profile}.properties and YAML variants)
  • Profile-specific application properties packaged inside your jar (application-{profile}.properties and YAML variants)
  • Application properties outside of your packaged jar (application.properties and YAML variants)
  • Application properties packaged inside your jar (application.properties and YAML variants)
  • Default properties (specified using SpringApplication.setDefaultProperties)

Let’s create a basic Spring Boot application as explained in this article. In short, you have to create a Maven project and update the pom.xml as shown below:

[code lang=”xml” highlight=”11″]
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.intbittech</groupId>
<version>1.0</version>
<name>DocAppAPI</name>
<description>RESTful API for DocApp</description>
<artifactId>DocAppAPI</artifactId>
<properties>
<java.version>1.8</java.version>
<start-class>net.javabeat.ExternalConfigApplication</start-class>
</properties>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>1.2.3.RELEASE</version>
</parent>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter</artifactId>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
</project>
[/code]

In the above highlighted code we haven’t yet created the class ExternalConfigApplication. It will be the starting point for our Spring Boot application. The ExternalConfigApplication class definition is given below:

[code lang=”java”]
package net.javabeat;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class ExternalConfigApplication {

public static void main(String[] args) throws Exception {
SpringApplication.run(new Object[] { ExternalConfigApplication.class }, args);
}

}
[/code]

I will also create a component class ExternalConfigComponent annotated with @Component whose definition is shown below:

[code lang=”java”]
package net.javabeat;

import javax.annotation.PostConstruct;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;

@Component
public class ExternalConfigComponent {
private static Logger logger = LoggerFactory.getLogger(ExternalConfigComponent.class);

//No properties configured yet.

@PostConstruct
public void postConstruct(){
//This is where we are going to print the values of the properties
}
}
[/code]

Let us compile and run the above application as shown below:

[code lang=”shell”]
C:\Users\Mohamed\workspace\springboot-external-config>mvn clean package
…
…
[INFO] ————————————————————————
[INFO] BUILD SUCCESS
[INFO] ————————————————————————
[INFO] Total time: 3.547 s
[INFO] Finished at: 2015-08-09T21:55:30+05:30
[INFO] Final Memory: 22M/218M
[INFO] ————————————————————————

C:\Users\Mohamed\workspace\springboot-external-config>java -jar target\SpringBootExternalConfig-1.0.jar
…
2015-08-09 21:56:29.592 INFO 10444 — [ main] net.javabeat.ExternalConfigApplication : Started ExternalConfigApplication in 1.195 seconds (JVM running for 1.675)
…
[/code]

You will notice that the application starts and then terminates successfully. As this is like any other Java-based command-line application, it executes and then exits, unlike web applications.

Another way to compile and run the application in one go is to use the spring-boot maven plugin as shown below:

[code lang=”shell”]
C:\Users\Mohamed\workspace\springboot-external-config>mvn spring-boot:run
[/code]

Throughout the post I am going to use the above spring-boot Maven plugin command to run the application.

Now let us start exploring the external configurations for Spring Boot applications.

External Configurations for Spring Boot

Default properties (specified using SpringApplication.setDefaultProperties)

SpringApplication has an API setDefaultProperties which accepts a Map or a Properties object. I am enhancing the ExternalConfigApplication class to set the values of two properties, namely property.one and property.two, as shown below:

[code lang=”java” highlight=”21-27″]
package net.javabeat;

import java.util.HashMap;
import java.util.Map;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class ExternalConfigApplication {

private static final Logger logger = LoggerFactory.getLogger(ExternalConfigApplication.class);

@Autowired
private ExternalConfigComponent externalConfigComponent;

public static void main(String[] args) throws Exception {
SpringApplication springApplication = new SpringApplication(new Object[] { ExternalConfigApplication.class });

Map<String, Object> defaultProperties = new HashMap<String, Object>();
defaultProperties.put("property.one", "Value One");
defaultProperties.put("property.two", "Value Two");

springApplication.setDefaultProperties(defaultProperties);

springApplication.run(args);

}

}
[/code]

Let us access the two properties defined above by binding them to two fields in the ExternalConfigComponent using the @Value annotation as shown below:

[code lang=”java”]
package net.javabeat;

import javax.annotation.PostConstruct;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;

@Component
public class ExternalConfigComponent {
private static Logger logger = LoggerFactory.getLogger(ExternalConfigComponent.class);

@Value("${property.one}")
public String propertyOne;

@Value("${property.two}")
public String propertyTwo;

@PostConstruct
public void postConstruct(){
logger.info("Property One: " + propertyOne);
logger.info("Property Two: " + propertyTwo);
}

}
[/code]

Let’s launch the above application using the spring-boot Maven plugin as shown below:

[code lang=”shell” highlight=”3,4″]
$ mvn spring-boot:run
…
2015-08-10 01:46:59.530 INFO 4964 — [lication.main()] net.javabeat.ExternalConfigComponent : Property One: Value One
2015-08-10 01:46:59.530 INFO 4964 — [lication.main()] net.javabeat.ExternalConfigComponent : Property Two: Value Two
…
[/code]

Among the lot of text printed on the console you can find the above two lines with the values of the properties.

Application properties packaged inside your jar

Another approach is to create a file named application.properties and package it inside your jar. This file contains the key-value pairs for the properties. The property files can be given any name, but the name then has to be provided to Spring via a configuration property so that it can read the file; by default Spring looks for application.properties. Let us create application.properties in the folder src/main/resources as shown in the image below:
Spring Boot External Configurations 1

And the contents of the application.properties is as shown below:

[code]
property.two=Value Two Overridden
property.three=Value Three
[/code]

It contains two properties, namely property.two and property.three. You might wonder about the name property.two being repeated. Yes, it is repeated, and the purpose of this repetition is to show the overriding of properties. The order of precedence of property sources has already been listed in the beginning: the default properties come last, preceded by the application.properties file packaged in the jar. So let’s see this overriding in action. Before that, let’s update the ExternalConfigComponent class with the new property as shown below:

[code lang=”java” highlight=”20,21,27″]
package net.javabeat;

import javax.annotation.PostConstruct;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;

@Component
public class ExternalConfigComponent {
private static Logger logger = LoggerFactory.getLogger(ExternalConfigComponent.class);

@Value("${property.one}")
private String propertyOne;

@Value("${property.two}")
private String propertyTwo;

@Value("${property.three}")
private String propertyThree;

@PostConstruct
public void postConstruct(){
logger.info("Property One: " + propertyOne);
logger.info("Property Two: " + propertyTwo);
logger.info("Property Three: " + propertyThree);
}

}
[/code]

Let’s launch the application and view its output:

[code lang=”shell” highlight=”4,5″]
$ mvn spring-boot:run
…
2015-08-10 02:03:34.572 INFO 920 — [lication.main()] net.javabeat.ExternalConfigComponent : Property One: Value One
2015-08-10 02:03:34.572 INFO 920 — [lication.main()] net.javabeat.ExternalConfigComponent : Property Two: Value Two Overridden
2015-08-10 02:03:34.573 INFO 920 — [lication.main()] net.javabeat.ExternalConfigComponent : Property Three: Value Three
…
[/code]

You can observe above that the value of Property Two has been overridden by the value specified in the application.properties file.

Application properties outside of your packaged jar

Above we saw application.properties packaged inside the jar; now let us see how an application.properties file can be placed in the /config subdirectory of the current directory and have Spring Boot load the properties defined in that file. The content of this application.properties file is given below:

[code]
property.three=Value Three Overridden
property.four=Value Four
[/code]

As usual there are two properties property.three and property.four. Let us see the updated code for ExternalConfigComponent below:

[code lang=”java”]
//Add this below along with the other property declarations
@Value("${property.four}")
private String propertyFour;

//Append the below to the postConstruct() method
logger.info("Property Four: " + propertyFour);
[/code]

Running the application gives us the below output:

[code lang=”shell” highlight=”4″]
….
2015-08-14 17:06:13.469 INFO 12624 — [lication.main()] net.javabeat.ExternalConfigComponent : Property One: Value One
2015-08-14 17:06:13.470 INFO 12624 — [lication.main()] net.javabeat.ExternalConfigComponent : Property Two: Value Two Overridden
2015-08-14 17:06:13.471 INFO 12624 — [lication.main()] net.javabeat.ExternalConfigComponent : Property Three: Value Three Overridden
2015-08-14 17:06:13.472 INFO 12624 — [lication.main()] net.javabeat.ExternalConfigComponent : Property Four: Value Four
…
[/code]

You will notice in the highlighted lines above that the property defined in the application.properties file inside the jar has been overridden by the application.properties placed outside the jar in the /config subdirectory.

Profile-specific application properties packaged inside your jar

We can have multiple values for a property depending on the environment the application is running in, i.e. a dev environment or a production environment. To support this, Spring Boot has a concept called profiles. One can create a profile named local for the local environment, dev for the development environment and so on. The names of the property files will then be application-local.properties and application-dev.properties respectively.

The profile can be selected by setting the profile name in the Spring property spring.profiles.active. We can set this value using any one of the approaches defined here; we will set the value via the command line. But before we show that, let us see the contents of application-local.properties:

[code]
property.four=Value Four Overridden
property.five=Value Five
[/code]

Let us update the ExternalConfigComponent to accommodate the new property property.five with the below code:

[code lang=”java”]
//Add this below along with the other property declarations
@Value("${property.five}")
private String propertyFive;

//Append the below to the postConstruct() method
logger.info("Property Five: " + propertyFive);
[/code]

Let us run the application by using the command: mvn spring-boot:run -Dspring.profiles.active=local. After the application is run successfully you will see the below output:

[code lang=”shell” highlight=”5,6″]
…
2015-08-14 17:37:29.348 INFO 8760 — [lication.main()] net.javabeat.ExternalConfigComponent : Property One: Value One
2015-08-14 17:37:29.348 INFO 8760 — [lication.main()] net.javabeat.ExternalConfigComponent : Property Two: Value Two Overridden
2015-08-14 17:37:29.348 INFO 8760 — [lication.main()] net.javabeat.ExternalConfigComponent : Property Three: Value Three Overridden
2015-08-14 17:37:29.348 INFO 8760 — [lication.main()] net.javabeat.ExternalConfigComponent : Property Four: Value Four Overridden
2015-08-14 17:37:29.348 INFO 8760 — [lication.main()] net.javabeat.ExternalConfigComponent : Property Five: Value Five
…
[/code]

You can notice that the property property.four from /config/application.properties has been overridden by application-local.properties, and also that the new property property.five is provided by the new properties file.
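The examples above activate the profile through the spring-boot Maven plugin. When running the packaged jar directly, the same profile could be selected either as a command-line argument or as a system property; a sketch, reusing the jar built earlier in this article:

[code lang=”shell”]
REM Both forms set spring.profiles.active for the packaged application.
java -jar target\SpringBootExternalConfig-1.0.jar --spring.profiles.active=local
java -Dspring.profiles.active=local -jar target\SpringBootExternalConfig-1.0.jar
[/code]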

Profile-specific application properties outside of your packaged jar

Just as we saw with application.properties placed inside the jar and outside the jar in a subdirectory, we can do the same with profile-specific application properties. Let us create application-local.properties with the below data and store it in the /config subdirectory created earlier:

[code]
property.five=Value Five Overridden
property.six=Value Six
[/code]

And the below are the updates to the ExternalConfigComponent:

[code lang=”java”]
//Add this below along with the other property declarations
@Value("${property.six}")
private String propertySix;

//Append the below to the postConstruct() method
logger.info("Property Six: " + propertySix);
[/code]

Let us run the application by using the command: mvn spring-boot:run -Dspring.profiles.active=local. After the application is run successfully you will see the below output:

[code lang=”shell”]
…
2015-08-14 17:51:59.469 INFO 11564 — [lication.main()] net.javabeat.ExternalConfigComponent : Property Five: Value Five Overridden
2015-08-14 17:51:59.469 INFO 11564 — [lication.main()] net.javabeat.ExternalConfigComponent : Property Six: Value Six
..
[/code]

You can notice the above two lines printed among other lines in the output after executing the program. So the application-local.properties file in the /config sub directory has overridden the property.five defined in the application-local.properties packaged in the jar.

Command line arguments

The values for these properties are passed via the command line while launching the application. I am going to configure two properties, property.six and property.seven, and bind them to our ExternalConfigComponent class as shown below:

[code lang=”java”]
//Add this below along with the other property declarations
@Value("${property.seven}")
private String propertySeven;

//Append the below to the postConstruct() method
logger.info("Property Seven: " + propertySeven);
[/code]

Let’s try to launch the application by passing the values for the above two configuration properties as shown below:

[code lang=”shell” highlight=”3,4″]
C:\Users\Mohamed\workspace\springboot-external-config>mvn spring-boot:run -Dspring.profiles.active=local -Dproperty.six="Value Six Overridden" -Dproperty.seven="Value Seven"
…
2015-08-14 19:24:23.430 INFO 5584 — [lication.main()] net.javabeat.ExternalConfigComponent : Property Six: Value Six Overridden
2015-08-14 19:24:23.430 INFO 5584 — [lication.main()] net.javabeat.ExternalConfigComponent : Property Seven: Value Seven
…
[/code]

You can see the values of the configuration properties (highlighted above) printed on the console among other outputs and also see that the value for property.six has been overridden by the value passed as a command line argument.

With this we come to the end of the article. We have shown different ways to set up properties in Spring Boot and how the precedence of the property sources determines which values override the others. If you have any questions on external configurations for Spring Boot, please write them in the comments section.

The complete code can be accessed from Github here. If you want to run the complete code then you have to make use of the command: mvn spring-boot:run -Dspring.profiles.active=local -Dproperty.six="Value Six Overridden" -Dproperty.seven="Value Seven".

Filed Under: Spring Framework Tagged With: Spring Boot Tutorials

Java 9 : Use Process API to Get Process Information

August 8, 2015 by Mohamed Sanaulla Leave a Comment

With everyone focused on Java 8, let me take you through some of the features in Java 9. As Java 9 is still under active development, the features discussed here might change before the final release of JDK 9.

A few links to start with:

  1. In this post I am using Java 9 Build b75 downloaded from here.
  2. The Javadocs for Java 9 can be found here.

Process API

The details of the Process API updates can be found on the official project page here. In this post I am going to show you how to:

Process API in Java 9

  1. Print details of the current process i.e the JVM process
  2. Print the details of all the processes running in the system
  3. Print the details of the newly created process

A new Java class named ProcessHandle has been introduced, which is described as:

ProcessHandle identifies and provides control of native processes. Each individual process can be monitored for liveness, list its children, get information about the process or destroy it.

The Process class has been enhanced with APIs to get the ProcessHandle for that process, the process info and the process id.

In the sections below I will use the following utility method to print the process information. The ProcessHandle class provides an info() method which returns an instance of ProcessHandle.Info; this instance can be used to get the following details:

  • Executable pathname for the process.
  • Arguments of the process, if any.
  • Start time of the process.
  • Total CPU time of the process.
  • User of the process.

[code lang=”java”]
//The code for utility class
import java.time.Instant;
import java.time.Duration;
import java.time.temporal.ChronoUnit;

public class ProcessAPIDemoUtil{
public static void printProcessDetails(ProcessHandle currentProcess){
//Get the instance of process info
ProcessHandle.Info currentProcessInfo = currentProcess.info();
if ( currentProcessInfo.command().orElse("").equals("")){
return;
}
//Get the process id
System.out.println("Process id: " + currentProcess.getPid());
//Get the command pathname of the process
System.out.println("Command: " + currentProcessInfo.command().orElse(""));
//Get the arguments of the process
String[] arguments = currentProcessInfo.arguments().orElse(new String[]{});
if ( arguments.length != 0){
System.out.print("Arguments: ");
for(String arg : arguments){
System.out.print(arg + " ");
}
System.out.println();
}
//Get the start time of the process
System.out.println("Started at: " + currentProcessInfo.startInstant().orElse(Instant.now()).toString());
//Get the total CPU time used by the process
System.out.println("Ran for: " + currentProcessInfo.totalCpuDuration().orElse(Duration.ofMillis(0)).toMillis() + "ms");
//Get the owner of the process
System.out.println("Owner: " + currentProcessInfo.user().orElse(""));
}
}
[/code]

Print details of the current process i.e the JVM process

The ProcessHandle class provides an API current() to get the current process handle. This current process is nothing but the JVM process which is executing the Java code. The below code retrieves the current process and prints its details using the above utility method:

[code lang=”java”]
public class CurrentProcessDemo{
public static void main(String[] args){
//Get the handle for current process i.e the JVM process
ProcessHandle currentProcess = ProcessHandle.current();
System.out.println("**** Current process info ****");
ProcessAPIDemoUtil.printProcessDetails(currentProcess);
}
}
[/code]

The output of the above code is:

[code lang=”shell”]
G:\java9>javac CurrentProcessDemo.java

G:\java9>java CurrentProcessDemo
**** Current process info ****
Process id: 8896
Command: C:\Program Files\Java\jdk1.9.0\bin\java.exe
Started at: 2015-08-08T14:08:06.756Z
Ran for: 187ms
Owner: SANA-LAPTOP\Mohamed
[/code]

Print the details of all the processes running in the system

The ProcessHandle class provides an API allProcesses() to get all active processes running in the system. As there are many processes returned, I am going to filter out those not having any command associated with them and limit the results to 3. Below is the code for retrieving the processes:

[code lang=”java”]
public class GetAllProcessesDemo{
public static void main(String[] args){
//list at most 3 processes running in the system which have a command
ProcessHandle.allProcesses()
.filter(processHandle -> processHandle.info().command().isPresent())
.limit(3)
.forEach((process) ->{
ProcessAPIDemoUtil.printProcessDetails(process);
});
}
}
[/code]

The output for above code:

[code lang=”shell”]
G:\java9>javac GetAllProcessesDemo.java

G:\java9>java GetAllProcessesDemo
Process id: 4524
Command: C:\Windows\System32\taskhostw.exe
Started at: 2015-08-04T12:19:49.342Z
Ran for: 10609ms
Owner: SANA-LAPTOP\Mohamed
Process id: 4540
Command: C:\Windows\System32\sihost.exe
Started at: 2015-08-04T12:19:49.364Z
Ran for: 67125ms
Owner: SANA-LAPTOP\Mohamed
Process id: 4560
Command: C:\Program Files\Synaptics\SynTP\SynTPEnh.exe
Started at: 2015-08-04T12:19:49.449Z
Ran for: 852609ms
Owner: SANA-LAPTOP\Mohamed

G:\java9>
[/code]

Print the details of the newly created process

The Process class has been enhanced with APIs to get process info, process id and the ProcessHandle for that process. Below is the code to get the details of the newly created process:

[code lang=”java”]
import java.io.IOException;
import java.io.*;
import java.util.*;

public class NewProcessDetailDemo{
public static void main(String[] args) throws IOException{
//Create a new process
Process process = Runtime.getRuntime().exec("cmd /k dir");
System.out.println("**** Process details for created process ****");
//now print this process details
ProcessAPIDemoUtil.printProcessDetails(process.toHandle());
Scanner reader = new Scanner(process.getInputStream());
while(reader.hasNext()){
System.out.println(reader.nextLine());
}

reader.close();
}
}
[/code]

The output of the above code is:

[code lang=”shell”]

G:\java9>javac NewProcessDetailDemo.java

G:\java9>java NewProcessDetailDemo
**** Process details for created process ****
Process id: 7532
Command: C:\Windows\System32\cmd.exe
Started at: 2015-08-08T14:39:06.148Z
Ran for: 0ms
Owner: SANA-LAPTOP\Mohamed
Volume in drive G is Development Base
Volume Serial Number is 9E68-AF66

Directory of G:\java9

08/07/2015 08:13 PM <DIR> .
08/07/2015 08:13 PM <DIR> ..
08/08/2015 07:38 PM 642 CurrentProcessDemo.class
08/07/2015 07:33 PM 321 CurrentProcessDemo.java
08/08/2015 07:42 PM 1,717 GetAllProcessesDemo.class
08/08/2015 07:42 PM 378 GetAllProcessesDemo.java
08/07/2015 02:33 PM 427 HelloWorld.class
08/07/2015 02:33 PM 114 HelloWorld.java
08/08/2015 08:09 PM 1,144 NewProcessDetailDemo.class
08/08/2015 08:09 PM 595 NewProcessDetailDemo.java
08/07/2015 06:47 PM 3,191 ProcessAPIDemo.class
08/07/2015 08:12 PM 1,099 ProcessAPIDemo.java
08/08/2015 07:38 PM 1,845 ProcessAPIDemoUtil.class
08/08/2015 07:32 PM 1,333 ProcessAPIDemoUtil.java
12 File(s) 12,806 bytes
2 Dir(s) 110,687,162,368 bytes free
[/code]

With this I have shown you how we can get process information using the new APIs in Java 9. In the coming posts I will explore more of the features of Java 9, so stay tuned. If you have any feedback, please write it in the comments section.

Filed Under: Java Tagged With: Java 9 Features
