
5 Ways to Streamline Your Node.js Applications

July 28, 2016 by Krishna Srinivasan Leave a Comment


Node.js is already a really efficient way to build quick web applications, but there’s always room for improvement. Think about implementing any of these tips into your Node.js app if you’re looking to speed up or clean up your applications.

 

1. Use Caching

Caching is such a simple way to improve the performance of your Node.js apps. Use it to fetch frequently used data that doesn’t run or change often. Set a variable once to grab the data in question and then re-use it throughout every request.
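
The tip above can be sketched in a few lines. This is a minimal in-memory cache sketch; `fetchConfig`, the cache map and the TTL value are all hypothetical names for illustration, and real apps often reach for a dedicated store such as Redis instead:

```javascript
// Minimal in-memory cache sketch. `fetchConfig` stands in for any expensive,
// rarely-changing lookup (e.g. a database read); names and TTL are illustrative.
const cache = new Map();

function cached(key, ttlMs, compute) {
  const hit = cache.get(key);
  if (hit && Date.now() < hit.expires) {
    return hit.value; // cache hit: skip the expensive call entirely
  }
  const value = compute(); // cache miss: compute once and remember it
  cache.set(key, { value, expires: Date.now() + ttlMs });
  return value;
}

let calls = 0;
const fetchConfig = () => { calls += 1; return { theme: 'dark' }; };

cached('config', 60000, fetchConfig); // first request computes
cached('config', 60000, fetchConfig); // subsequent requests reuse it
console.log(calls); // 1
```

Every request within the TTL window is then served from memory instead of repeating the expensive lookup.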

2. Use NginX


NginX is a lightweight web server that you can use in conjunction with Node.js to serve your application's static content. Because Node.js isn't intended to serve static content, using NginX for this purpose frees Node.js from unnecessary work, which in turn makes your app faster and more efficient.

3. Remove Unnecessary Processes 

If you're using a pre-built application, chances are there are modules and middleware within the app that you don't need or won't use for your site. Go through and remove anything that isn't essential, or see if you can find more lightweight replacements for any modules that you want to keep.

4. Enable gzip Compression

Enabling gzip compression lets the server compress responses before sending them to the browser (provided the browser advertises gzip support). This can significantly cut the time it takes a browser to fetch your files and is definitely worth enabling in your Node.js apps.

5. Minify and Concatenate your CSS/JS files

This one might seem obvious because it really applies to any sort of app, including one built with Node.js. Minifying and combining all of your CSS and JS files into one file each (one for CSS, one for JS) speeds things up considerably because there are significantly fewer HTTP requests to make.
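
As a toy illustration of the size effect, the sketch below concatenates two hypothetical CSS snippets and applies a naive minification (stripping comments and collapsing whitespace). Real builds use dedicated minifiers; this only demonstrates the idea:

```javascript
// Toy illustration: concatenate two hypothetical CSS snippets, then apply a
// naive minification (strip comments, collapse whitespace). Real builds use
// dedicated minifiers; this only demonstrates the size effect.
const files = [
  '/* header */\nbody { color: red; }\n',
  '\n.nav { display: none; }\n'
];

const concatenated = files.join(''); // one file -> one HTTP request
const minified = concatenated
  .replace(/\/\*[\s\S]*?\*\//g, '') // remove comments
  .replace(/\s+/g, ' ')             // collapse runs of whitespace
  .trim();

console.log(minified.length < concatenated.length); // true
```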

Filed Under: Java, NodeJS Tagged With: Caching, CSS/JS files, HTTP requests, NginX, Node.js

ExpressJS Session Store using MongoDB, PostgreSQL

September 16, 2015 by Mohamed Sanaulla Leave a Comment

This tutorial explains how to use ExpressJS session stores backed by MongoDB and PostgreSQL.

In one of our previous posts we saw how to work with sessions in an ExpressJS app. In that example we used memory as the session store. We also mentioned that we can replace memory with other session stores such as MongoDB, Redis, MySQL, the file system and so on. In this post we explore that further and show how to use MongoDB and PostgreSQL as the session stores.


Why Use a Database as the ExpressJS Session Store?

Why do we need to store the session in an external database? When an application is deployed across different machines/nodes, keeping the session data in memory makes it local to one machine/node. If a request is then served from a different machine/node, the session is not available for the user.

To avoid this issue, we store the session data in external storage such as MySQL, MongoDB and other databases. It is common practice to store session data in a database shared among the different servers, so that no session data is lost if any one of the servers goes down.

MongoDB Session Store

To use MongoDB as the session store we have to install the connect-mongo package along with the existing driver used to connect to MongoDB. connect-mongo is a popular and well-tested ExpressJS middleware for storing sessions.

Let us initialize the app and install the required packages. In the below code we are installing the following packages:

  • mongodb
  • connect-mongo
  • express
  • express-session

[code lang="shell"]
$> mkdir mongodb-session-store

$> cd mongodb-session-store

$> npm init

Press ^C at any time to quit.
name: (mongodb-session-store)
version: (1.0.0)
description: sample using mongodb as session store
entry point: (index.js)
test command:
git repository:
keywords:
author: mohamed sanaulla
license: (ISC)
About to write to G:\node\mongodb-session-store\package.json:

{
  "name": "mongodb-session-store",
  "version": "1.0.0",
  "description": "sample using mongodb as session store",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "author": "mohamed sanaulla",
  "license": "ISC"
}

Is this ok? (yes)

$> npm install --save mongodb
$> npm install --save connect-mongo
$> npm install --save express
$> npm install --save express-session
[/code]

Now let us create a simple express app which makes use of sessions. This example can be found in the post here, copying the same for this tutorial. Subsequently we will replace the memory session storage with MongoDB.

[code lang="javascript"]
var express = require('express');
var session = require('express-session');

var app = express();

var sessionOptions = {
  secret: "secret",
  resave: true,
  saveUninitialized: false
};

app.use(session(sessionOptions));

app.get("/", function(req, res) {
  if (!req.session.views) {
    req.session.views = 1;
  } else {
    req.session.views += 1;
  }

  res.json({
    "status": "ok",
    "frequency": req.session.views
  });
});

app.listen(3300, function() {
  console.log("Server started at: http://localhost:3300");
});
[/code]

One of the session options is the store. We can provide the new store in the store property of the sessionOptions object. To connect to MongoDB we either pass an existing connection object or provide the connection URL, i.e. the parameters required to establish a connection to MongoDB. Below is the code with MongoDB as the session store:

[code lang="javascript" highlight="3,11-14"]
var express = require('express');
var session = require('express-session');
var MongoStore = require('connect-mongo')(session);

var app = express();

var sessionOptions = {
  secret: "secret",
  resave: true,
  saveUninitialized: false,
  store: new MongoStore({
    url: "mongodb://localhost/test",
    //other advanced options
  })
};

app.use(session(sessionOptions));

app.get("/", function(req, res) {
  if (!req.session.views) {
    req.session.views = 1;
  } else {
    req.session.views += 1;
  }

  res.json({
    "status": "ok",
    "frequency": req.session.views
  });
});

app.listen(3300, function() {
  console.log("Server started at: http://localhost:3300");
});
[/code]

Before running the application make sure you have installed MongoDB and that the MongoDB daemon, i.e. mongod, is running. The parts of the code that have changed are highlighted above; it is a minimal change.

To confirm that we are indeed writing the session data to MongoDB, load the application at http://localhost:3300/ in the browser a few times, then connect to MongoDB and run the commands below:

[code lang=”shell” highlight=”3, 8-13″]
> show collections
scores
sessions
students
system.indexes
system.profile
tasks
> db.sessions.find().pretty()
{
"_id" : "dTzBBxq8UTuS-aGuivY1iqgiITaEcJdz",
"session" : "{\"cookie\":{\"originalMaxAge\":null,\"expires\":null,\"httpOnly\":true,\"path\":\"/\"},\"views\":80}",
"expires" : ISODate("2015-09-27T03:30:55.179Z")
}
[/code]

A collection named sessions is created, where each document represents one session's data, as shown in the highlighted output above.

PostgreSQL Session Store

Now let us see how we can use PostgreSQL as the session store. It involves a bit of setup in PostgreSQL: we have to create the session table in the database. The structure of the table is provided with the connect-pg-simple node package. Let us first install that package and then look at the table structure. Below are the commands we run to create a new app for this and install the relevant packages.

[code lang="shell"]
G:\node>mkdir postgres-session-store

G:\node>cd postgres-session-store

G:\node\postgres-session-store>npm init

Press ^C at any time to quit.
name: (postgres-session-store)
version: (1.0.0)
description: Demo to use PostgreSql as session store
entry point: (index.js)
test command:
git repository:
keywords:
author: mohamed sanaulla
license: (ISC)
About to write to G:\node\postgres-session-store\package.json:

{
  "name": "postgres-session-store",
  "version": "1.0.0",
  "description": "Demo to use PostgreSql as session store",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "author": "mohamed sanaulla",
  "license": "ISC"
}

Is this ok? (yes)
G:\node\postgres-session-store>npm install --save express
G:\node\postgres-session-store>npm install --save express-session
G:\node\postgres-session-store>npm install --save connect-pg-simple
[/code]

The session table structure can be found at node_modules/connect-pg-simple/table.sql and is as given below:

[code lang=”sql”]
CREATE TABLE "session" (
"sid" varchar NOT NULL COLLATE "default",
"sess" json NOT NULL,
"expire" timestamp(6) NOT NULL
)
WITH (OIDS=FALSE);

ALTER TABLE "session" ADD CONSTRAINT "session_pkey"
PRIMARY KEY ("sid") NOT DEFERRABLE INITIALLY IMMEDIATE;
[/code]

Run the above SQL to create the table in your PostgreSQL database. We are going to use the default postgres database.

Let us use the same express app code which we used at the start of this article; it is repeated here for convenience:

[code lang="javascript"]
var express = require('express');
var session = require('express-session');

var app = express();

var sessionOptions = {
  secret: "secret",
  resave: true,
  saveUninitialized: false
};

app.use(session(sessionOptions));

app.get("/", function(req, res) {
  if (!req.session.views) {
    req.session.views = 1;
  } else {
    req.session.views += 1;
  }

  res.json({
    "status": "ok",
    "frequency": req.session.views
  });
});

app.listen(3300, function() {
  console.log("Server started at: http://localhost:3300");
});
[/code]

Let us update the above code to configure the PostgreSQL session store as shown below:

[code lang="javascript" highlight="3,11-17"]
var express = require('express');
var session = require('express-session');
var PostgreSqlStore = require('connect-pg-simple')(session);

var app = express();

var sessionOptions = {
  secret: "secret",
  resave: true,
  saveUninitialized: false,
  store: new PostgreSqlStore({
    /*
    connection string is built by following the syntax:
    postgres://USERNAME:PASSWORD@HOST_NAME:PORT/DB_NAME
    */
    conString: "postgres://postgres:postgres@localhost:5433/postgres"
  })
};

app.use(session(sessionOptions));

app.get("/", function(req, res) {
  if (!req.session.views) {
    req.session.views = 1;
  } else {
    req.session.views += 1;
  }

  res.json({
    "status": "ok",
    "frequency": req.session.views
  });
});

app.listen(3300, function() {
  console.log("Server started at: http://localhost:3300");
});
[/code]

In the above code, the highlighted parts are the ones that set PostgreSQL as the session store. Let us run the application and load the URL http://localhost:3300/ multiple times, then execute the below query in PostgreSQL to check the data in the session table:

[code lang=”sql”]
select * from session;
[/code]

Running the query shows the stored session rows.

Conclusion

In this article we saw how to use MongoDB and PostgreSQL as ExpressJS session stores. This is very useful when the application is distributed across multiple nodes. In a similar way we can use Redis, LevelDB and other external stores for session data. If you have any questions, please write them in the comments section.

Filed Under: NodeJS Tagged With: ExpressJS Tutorials, NodeJS Tutorials

NodeJS : ExpressJS Session Management

September 11, 2015 by Mohamed Sanaulla Leave a Comment

This tutorial explains the basic concepts of ExpressJS session management. Sessions are an important part of web applications: HTTP being stateless, sessions and cookies are one approach (among many others) to maintaining state across requests.

In this article we will explore how to use the Node package express-session to maintain sessions in an ExpressJS based web application.


Setting up the app

Let us create an empty Node project using npm as shown below:

[code lang=”shell”]
$ mkdir session-demo
$ cd session-demo
$ npm init
name: (session-demo)
version: (1.0.0)
description: Session demo for expressjs app
entry point: (index.js)
test command:
git repository:
keywords:
author: mohamed sanaulla
license: (ISC)
About to write to G:\node\session-demo\package.json:

{
"name": "session-demo",
"version": "1.0.0",
"description": "Session demo for expressjs app",
"main": "index.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1"
},
"author": "mohamed sanaulla",
"license": "ISC"
}

Is this ok? (yes)
[/code]

Install relevant node packages for Express and Express session:

[code lang="shell"]
$ npm install express --save
$ npm install express-session --save
[/code]

Let us create a simple Express app as shown below:

[code lang="javascript"]
// Filename - index.js
var express = require('express');
var session = require('express-session');

var app = express();
app.get("/", function(req, res) {
  res.json({
    "status": "ok"
  });
});

app.listen(3300, function() {
  console.log("Server started at: http://localhost:3300");
});
[/code]

The above app returns a JSON object on calling http://localhost:3300/. It is a very simple example.

ExpressJS Session Management Example

In the below example, I am using a session to record the frequency of API invocations for a user. By default, express-session stores the session data in memory, but it supports replacing this default with the different storage options listed here.

express-session accepts a few properties in the options object. This object is passed while setting up the session with express app as shown below:

[code lang="javascript"]
var express = require('express');
var session = require('express-session');
var app = express();
var sessionOptions = {};
app.use(session(sessionOptions));
[/code]

Different properties of the session options object are:

  • cookie: Options object for the session ID cookie. The default value is { path: '/', httpOnly: true, secure: false, maxAge: null }.
  • genid: Function to generate the session ID. Default is to use uuid.
  • name: The name of the session ID cookie to set in the response (and read from in the request).
  • proxy: Trust the reverse proxy when setting secure cookies.
  • resave: If true, forces the session to be saved back to the store even if it was not modified during the request.
  • rolling: Forces a cookie to be set on every request.
  • saveUninitialized: If true, forces a newly created session without any modifications to be saved to the session store.
  • secret: Required; used for signing the session ID cookie.
  • store: The session store instance. The default is the memory store.
  • unset: Controls the handling of the session object in the store after it is unset: either delete or keep it. The default is to keep the session object.
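
As an illustration, here is a sketch of an options object combining several of the properties above (all values are illustrative choices, not recommendations):

```javascript
// Sketch of an options object combining several session properties.
// All values here are illustrative choices, not recommendations.
const sessionOptions = {
  name: 'sid',                 // custom session ID cookie name
  secret: 'keyboard cat',      // required: signs the session ID cookie
  resave: false,               // don't rewrite unmodified sessions to the store
  saveUninitialized: false,    // don't persist empty, untouched sessions
  rolling: false,              // don't reset the cookie on every request
  unset: 'keep',               // keep the store entry when the session is unset
  cookie: { path: '/', httpOnly: true, secure: false, maxAge: 15 * 60 * 1000 }
};

// app.use(session(sessionOptions)) would wire these into an Express app.
console.log(typeof sessionOptions.secret); // 'string'
```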

Let us update the express app code to increment the frequency of the API invocation per user and record it in the session as shown below:

[code lang="javascript"]
var express = require('express');
var session = require('express-session');

var app = express();

var sessionOptions = {
  secret: "secret",
  resave: true,
  saveUninitialized: false
};

app.use(session(sessionOptions));

app.get("/", function(req, res) {
  if (!req.session.views) {
    req.session.views = 1;
  } else {
    req.session.views += 1;
  }

  res.json({
    "status": "ok",
    "frequency": req.session.views
  });
});

app.listen(3300, function() {
  console.log("Server started at: http://localhost:3300");
});
[/code]

Run the above application using the command node . from the app root directory. Load the URL http://localhost:3300/ repeatedly and watch the frequency change.

Conclusion

  • Also Read : HOW TO : SignIn with Twitter using Node.js and Express.js

This was a very simple example of ExpressJS session management; I hope you find it helpful. It is prerequisite knowledge for authentication-related features, since authentication uses the session to record the authenticated user's info. We will also see how to use different session stores, i.e. MySQL, MongoDB and so on.

Filed Under: NodeJS Tagged With: ExpressJS Tutorials

Node.js : RESTful APIs using StrongLoop Arc

September 1, 2015 by Mohamed Sanaulla Leave a Comment

This tutorial guides you through writing REST APIs using the StrongLoop Arc tool, an alternative to StrongLoop's command line tools for developing REST APIs. When you finish this tutorial, you will be able to develop REST APIs using the StrongLoop Arc UI composer.

In our previous post we saw how to create REST API using Loopback and MySQL by using the command line tool slc. StrongLoop also provides a UI composer for building the APIs and that UI composer is called StrongLoop Arc. In this post let us see how we can build the APIs using the UI composer.


Creating empty Loopback project

First let us create an empty Loopback project using the command slc loopback:

[code lang="shell"]
G:\node>slc loopback

     _-----_
    |       |    .--------------------------.
    |--(o)--|    |  Let's create a LoopBack |
   `---------´   |       application!       |
    ( _´U`_ )    '--------------------------'
    /___A___\
     |  ~  |
   __'.___.'__
 ´   `  |° ´ Y `

? What's the name of your application? loopback-rest-uicomposer
? Enter name of the directory to contain the project: loopback-rest-uicomposer
   create loopback-rest-uicomposer/
     info change the working directory to loopback-rest-uicomposer
[/code]

And enter the name of the application as loopback-rest-uicomposer as shown above.

Once the application is created, change into the app directory you just created: cd loopback-rest-uicomposer.

Next is to install MySQL connector for loopback by running the following command: npm install --save loopback-connector-mysql

Launching Strongloop Arc tool

Once you are in the app directory i.e in the loopback-rest-uicomposer directory, run the below command to launch the Strongloop Arc tool. It gets launched in your default browser.

[code lang=”shell”]
G:\node\loopback-rest-uicomposer>slc arc
Loading workspace G:\node\loopback-rest-uicomposer
StrongLoop Arc is running here: http://localhost:49619/#/
[/code]

Once the app launches in the browser you will see a login box. If you are a first-time user you will have to register at https://strongloop.com/register/ and then use that username and password to log in to the application. After login you will see the features supported by StrongLoop Arc.

Click on the Composer button to open the API composer UI.

Creating MySQL Datasource

A new datasource can be created by clicking the MySQL button or the Add New Data Source link.

You will get a UI to configure your data source. Fill it in with your connection settings, then click the Test Connection button; you will see Success printed beside the button. After you click the Save Datasource button, the new datasource appears in the data source list.

Creating Book Model

A new model can be created by clicking the New button or the Add New Model link. You will get a UI to configure your model: fill in the model details and properties, and select mysql as the data source.

Click the Save Model button; the model is created and listed under the available models.

Running the application

StrongLoop Arc provides buttons to launch the application, available in the top right corner of the UI.
Open the URL: http://localhost:3000/explorer/ in the browser to view the APIs available. You can test the APIs as explained in the previous article.

Conclusion

In this article we saw how we could leverage the StrongLoop Arc tool to compose the APIs visually. The Arc tool also supports the following:

  1. Building and Deploying the app
  2. Process Manager
  3. Tracing the app
  4. Profiling the app
  5. Collecting app metrics

We will explore the other features in the coming articles.

Filed Under: NodeJS Tagged With: NodeJS Tutorials, StrongLoop Tutorials

Node.js : Building RESTful APIs using Loopback and MySQL

September 1, 2015 by Mohamed Sanaulla Leave a Comment

This tutorial guides you through writing simple REST APIs using Loopback (a Node.js framework) with MySQL as the back end. At the end of this tutorial, you will be able to write your own REST APIs using the Loopback framework.

In one of our previous posts here, we wrote about building RESTful APIs using ExpressJS and MongoDB. Every programming language ecosystem has more than one web framework, and the same is true for Node.js. One such popular framework is Loopback. Both ExpressJS and Loopback are sponsored by StrongLoop.

In this post we are going to implement the same example we used in this post but using Loopback and MySQL DB.


Table of Contents

List of sections covered in this tutorial are:

  1. What is Loopback?
  2. Installing Loopback
  3. Create Loopback App
  4. Creating MySQL Datasource
  5. Creating Model in Loopback
  6. Testing CRUD APIs
  7. Conclusion

Before we jump into learning Loopback, let us get a better idea of what this new framework in the Node.js ecosystem is and how it differs from other frameworks like ExpressJS.

What is Loopback?

Loopback is an open-source Node.js API framework for building REST APIs for your client applications (which can be anything: a browser, a mobile app, etc.) in a very simple way. While it is another entrant in the fast-growing crowd of Node.js frameworks, Loopback distinguishes itself from frameworks like ExpressJS by making APIs very simple and easy to develop. It is built on top of Express, the most popular framework for Node.js.

Loopback has many advantages that make it a powerful choice as a de facto framework for Node.js. With JavaScript frameworks increasingly dominating the web, Loopback fits perfectly on the server side, and its easy, simple development model attracts more developers.


Installing Loopback

  • Loopback is available as an NPM package. As with all NPM packages, LoopBack can be installed using npm.
  • First we install the slc command line tool with npm install -g strongloop. The slc tool can be used to create applications and other app artifacts.
  • Next, install Loopback by running the command: npm install -g loopback.
  • To create a Loopback application using the slc command line tool, we also have to install two more npm packages: cookie-parser and errorhandler. They can be installed with the command: npm install -g cookie-parser errorhandler.

Note: On windows one would have to run the command prompt as an administrator to be able to install slc.

  • Also Read : Packaging and Deploying Node.js Applications

Create loopback app

The strongloop node package comes with a command line tool called slc which can be used to create new applications and model classes, add properties to them, and define data sources, to name a few tasks.

Run this command to start creating a new application: slc loopback. You will be prompted for the name of your application and the folder (defaulting to the application name) in which it will reside. Below is what you will see on your screen:

[code lang="shell"]
G:\node>slc loopback

     _-----_
    |       |    .--------------------------.
    |--(o)--|    |  Let's create a LoopBack |
   `---------´   |       application!       |
    ( _´U`_ )    '--------------------------'
    /___A___\
     |  ~  |
   __'.___.'__
 ´   `  |° ´ Y `

? What's the name of your application? loopback-rest
? Enter name of the directory to contain the project: loopback-rest
   create loopback-rest/
     info change the working directory to loopback-rest

Generating .yo-rc.json

I'm all done. Running npm install for you to install the required dependencies. If this fails, try running the command yourself.

   create .editorconfig
   create .jshintignore
   create .jshintrc
   create README.md
   create server\boot\authentication.js
   create server\boot\explorer.js
   create server\boot\rest-api.js
   create server\boot\root.js
   create server\middleware.json
   create server\server.js
   create .gitignore
   create client\README.md

.... npm packages installation output ....

Next steps:

  Change directory to your app
    $ cd loopback-rest

  Create a model in your app
    $ slc loopback:model

  Compose your API, run, deploy, profile, and monitor it with Arc
    $ slc arc

  Run the app
    $ node
[/code]

You will also find the next steps printed. For now we have a generated directory structure.

There is a lot of auto-generated code; let us not worry about that for now.

We can run the application now to see what has been generated for us. Change to the app directory and run the command node . to see the output below:

[code lang=”shell”]
G:\node\loopback-rest>node .
Browse your REST API at http://localhost:3000/explorer
Web server listening at: http://localhost:3000/
[/code]

Opening the URL http://localhost:3000/explorer in the browser gives us the list of APIs available in the application. You will notice a list of APIs exposed just for the User model, and all of this is shipped by the framework.

Opening the URL http://localhost:3000/ in the browser returns the app's default root response.

Creating MySQL data source

Let us associate a data source with the application. As mentioned we will be using MySQL database. First we have to install the loopback mysql connector which can be done by running the command: npm install loopback-connector-mysql --save.

Next we create a data source for the application which would use mysql connector using the below command:

[code lang=”shell”]
G:\node\loopback-rest>slc loopback:datasource
? Enter the data-source name: mysql_db
? Select the connector for mysql_db: MySQL (supported by StrongLoop)
[/code]

It requires the data source name and the connector to be used which in this case is MySQL.

The above command updates the server/datasources.json with the following:

[code lang="javascript"]
"mysql_db": {
  "name": "mysql_db",
  "connector": "mysql"
}
[/code]

We would have to update the above datasource to define the database name, hostname, port, username and password. Let us update it with the following data:

[code lang="javascript"]
"mysql_db": {
  "name": "mysql_db",
  "connector": "mysql",
  "database": "test",
  "host": "localhost",
  "port": 3306,
  "password": "password",
  "username": "root"
}
[/code]

Creating Model in Loopback

Let us use the slc tool to create a new model in Loopback with the command slc loopback:model. It asks a series of questions: the model name, the data source, the model's base class, and the properties and their types. Below is how we answered the questions and provided the properties:

[code lang="shell"]
G:\node\loopback-rest>slc loopback:model
? Enter the model name: book
? Select the data-source to attach book to: mysql_db (mysql)
? Select model's base class: PersistedModel
? Expose book via the REST API? Yes
? Custom plural form (used to build REST URL): books
Let's add some book properties now.

Enter an empty property name when done.
? Property name: name
(!) generator#invoke() is deprecated. Use generator#composeWith() - see http://yeoman.io/authoring/composability.html
invoke loopback:property
? Property type: string
? Required? Yes

Let's add another book property.
Enter an empty property name when done.
? Property name: isbn
(!) generator#invoke() is deprecated. Use generator#composeWith() - see http://yeoman.io/authoring/composability.html
invoke loopback:property
? Property type: string
? Required? Yes

Let's add another book property.
Enter an empty property name when done.
? Property name: author
(!) generator#invoke() is deprecated. Use generator#composeWith() - see http://yeoman.io/authoring/composability.html
invoke loopback:property
? Property type: string
? Required? No

Let's add another book property.
Enter an empty property name when done.
? Property name: pages
(!) generator#invoke() is deprecated. Use generator#composeWith() - see http://yeoman.io/authoring/composability.html
invoke loopback:property
? Property type: number
? Required? No

Let's add another book property.
Enter an empty property name when done.
? Property name:
[/code]

Once completed you will find two files generated: book.js and book.json.
Apart from the above two files, the new model gets recorded in model-config.json as shown below:

[code lang="javascript"]
"book": {
  "dataSource": "mysql_db",
  "public": true
}
[/code]

When we load the URL http://localhost:3000/explorer/#!/books/ you will find a lot of APIs associated with books. Loading any of the books APIs, for example http://localhost:3000/api/books, will result in an error stating table test.books was not found. Let us create the table now:

[code lang="sql"]
CREATE TABLE `test`.`book` (
  `id` VARCHAR(250) NOT NULL,
  `name` VARCHAR(250) NOT NULL,
  `isbn` VARCHAR(20) NOT NULL,
  `author` VARCHAR(500) NULL,
  `pages` INT NULL
) ENGINE = InnoDB;
[/code]

Let us now load the same URL http://localhost:3000/api/books and this time we will get an empty JSON array response.

So just creating a model has created a whole set of REST APIs around that model. In the next section let us try out the APIs for CRUD operations.

Testing CRUD APIs

The URL: http://localhost:3000/explorer not only exposes the list of APIs for each model, but also provides a tool for testing out the APIs.

Adding a new book

The request to add a book is made using the explorer UI; the response echoes the book that was added.

Getting a given book's details

The URL http://localhost:3000/api/books/{id} is used for getting the details of the book identified by the id parameter. Let us get the details of the book added above by using the URL http://localhost:3000/api/books/123, which returns the book's details in the response.

Updating book details

The request to update book details is made using the same tool. I am updating just the name and isbn and keeping the rest the same; the response from the API reflects the updated book.

Deleting Book

The request to delete the book is made through the same explorer UI. Once deleted, trying to get the details of the book with id 123 results in an error.

Getting list of books

Let's add a few new books, then load the URL http://localhost:3000/api/books to get the list of books.

Conclusion

So with this we have completed the demonstration of CRUD operations using Loopback's APIs, and we didn't write even a single line of code; all of the functionality above was provided by the Loopback framework. In the coming posts we will explore more of this framework and see how we can build APIs using it. I have published my next article on how to use the StrongLoop Arc tool for developing REST APIs using a GUI.

Filed Under: NodeJS Tagged With: Loopback Tutorials, NodeJS Tutorials

Node.js : Operating System Utilities in Node.js OS Module

August 25, 2015 by Mohamed Sanaulla Leave a Comment

This tutorial walks through the operating-system-related utilities in the Node.js OS module. The OS module offers a wide set of methods for fetching details about the native operating system.

If you are a Node.js beginner, please read our introductory article on Node.js first.

Node.js provides operating system related utilities in the os module. In this article we will walk through each API provided by the os module, along with its output. First create a JS file named node_os_interface.js and add the following line to it:

[code lang=”javascript”]
var os = require('os');
[/code]

Node.js OS Module API

There are 14 members defined in the OS module of Node.js: 13 methods plus the os.EOL property. They are used for querying the native operating system for details such as CPU, memory, and network information. The members of the OS module are:

  1. os.tmpdir()
  2. os.endianness()
  3. os.hostname()
  4. os.type()
  5. os.platform()
  6. os.arch()
  7. os.release()
  8. os.uptime()
  9. os.loadavg()
  10. os.totalmem()
  11. os.freemem()
  12. os.cpus()
  13. os.networkInterfaces()
  14. os.EOL

The following sections explain each member of the Node.js OS module with simple examples.

tmpdir()

Let us start with the first API, tmpdir(), which returns the operating system's default directory for temporary files:

[code lang=”javascript”]
console.log("OS Temp Dir: " + os.tmpdir());
[/code]

The above prints: OS Temp Dir: C:\Users\Mohamed\AppData\Local\Temp on my system. It will be different for different systems.

endianness()

This API prints whether the CPU architecture is Big Endian (BE) or Little Endian (LE).

[code lang=”javascript”]
console.log("CPU is BigEndian(BE) or LittleEndian(LE): " + os.endianness());
[/code]

Output

[code lang=”shell”]
CPU is BigEndian(BE) or LittleEndian(LE): LE
[/code]

hostname()

This API prints the operating system hostname.

[code lang=”javascript”]
console.log("OS Hostname: " + os.hostname());
[/code]

Output

[code lang=”shell”]
OS Hostname: Sana-Laptop
[/code]

type()

This API prints the type of the operating system.

[code lang=”javascript”]
console.log("OS Type: " + os.type());
[/code]

Output

[code lang=”shell”]
OS Type: Windows_NT
[/code]

platform()

This API prints the platform of the OS.

[code lang=”javascript”]
console.log("OS Platform: " + os.platform());
[/code]

Output

[code lang=”shell”]
OS Platform: win32
[/code]

arch()

This API prints CPU architecture – whether it is 32 bit, 64 bit or arm architecture.

[code lang=”javascript”]
console.log("OS CPU Architecture: " + os.arch());
[/code]

Output

[code lang=”shell”]
OS CPU Architecture: x64
[/code]

release()

This API prints OS release number.

[code lang=”javascript”]
console.log("OS Release: " + os.release());
[/code]

Output

[code lang=”shell”]
OS Release: 6.3.9600
[/code]

uptime()

This API returns the uptime of the machine, i.e. the number of seconds it has been running.

[code lang=”javascript”]
console.log("OS Uptime (seconds): " + os.uptime());
[/code]

Output

[code lang=”shell”]
OS Uptime (seconds): 104535.1365887
[/code]

loadavg()

This API returns the load average of the machine over the last 1, 5, and 15 minutes. Load average is a UNIX concept; on Windows this method always returns 0,0,0.

[code lang=”javascript”]
console.log("OS load average (Returns 0,0,0 in windows): " + os.loadavg());
[/code]

Output

[code lang=”shell”]
OS load average (Returns 0,0,0 in windows): 0,0,0
[/code]

totalmem()

This API returns the total memory (RAM) available in the system.

[code lang=”javascript”]
console.log("Total RAM (mb): " + (os.totalmem()/1024)/1024);
[/code]

Output

[code lang=”shell”]
Total RAM (mb): 8084.2734375
[/code]

freemem()

This API returns the free memory (RAM) available in the system.

[code lang=”javascript”]
console.log("Free RAM (mb): " + (os.freemem()/1024)/1024)
[/code]

Output

[code lang=”shell”]
Free RAM (mb): 2169.56640625
[/code]

cpus()

This API returns information about each available CPU. We will use the JSON.stringify function to pretty-print the result.

[code lang=”javascript”]
var cpus = os.cpus();
console.log("CPU Information: " + JSON.stringify(cpus, null, 2));
[/code]

Output

[code lang=”shell”]
CPU Information:
[
{
"model": "Intel(R) Core(TM) i7-4510U CPU @ 2.00GHz",
"speed": 2594,
"times": {
"user": 2376265,
"nice": 0,
"sys": 2880406,
"idle": 66987203,
"irq": 239203
}
},
{
"model": "Intel(R) Core(TM) i7-4510U CPU @ 2.00GHz",
"speed": 2594,
"times": {
"user": 2378703,
"nice": 0,
"sys": 2435625,
"idle": 67429250,
"irq": 136937
}
},
{
"model": "Intel(R) Core(TM) i7-4510U CPU @ 2.00GHz",
"speed": 2594,
"times": {
"user": 2435640,
"nice": 0,
"sys": 2670859,
"idle": 67137046,
"irq": 25000
}
},
{
"model": "Intel(R) Core(TM) i7-4510U CPU @ 2.00GHz",
"speed": 2594,
"times": {
"user": 2503703,
"nice": 0,
"sys": 1726234,
"idle": 68013625,
"irq": 24640
}
}
]
[/code]

The above information contains the CPU speed and the time the CPU has spent in doing user operations, system operations and being idle for each CPU.

networkInterfaces()

This API returns the network interfaces on the system, i.e. the entities that sit between the OS and the network. These can be physical devices or logical interfaces such as the loopback interface for localhost.

[code lang=”javascript”]
console.log("Network Interfaces: " + JSON.stringify(os.networkInterfaces(), null, 2));
[/code]

Output

[code lang=”shell”]
Network Interfaces:
{
"Wi-Fi": [
{
"address": "fe80::19c3:55f9:ecd:8a5a",
"netmask": "ffff:ffff:ffff:ffff::",
"family": "IPv6",
"mac": "b0:10:41:68:31:53",
"scopeid": 2,
"internal": false
},
{
"address": "192.168.1.4",
"netmask": "255.255.255.0",
"family": "IPv4",
"mac": "b0:10:41:68:31:53",
"internal": false
}
],
"Loopback Pseudo-Interface 1": [
{
"address": "::1",
"netmask": "ffff:ffff:ffff:ffff:ffff:ffff:ffff:ffff",
"family": "IPv6",
"mac": "00:00:00:00:00:00",
"scopeid": 0,
"internal": true
},
{
"address": "127.0.0.1",
"netmask": "255.0.0.0",
"family": "IPv4",
"mac": "00:00:00:00:00:00",
"internal": true
}
],
"Teredo Tunneling Pseudo-Interface": [
{
"address": "2001:0:9d38:6abd:2444:2b76:8a3f:ec06",
"netmask": "ffff:ffff:ffff:ffff::",
"family": "IPv6",
"mac": "00:00:00:00:00:00",
"scopeid": 0,
"internal": false
},
{
"address": "fe80::2444:2b76:8a3f:ec06",
"netmask": "ffff:ffff:ffff:ffff::",
"family": "IPv6",
"mac": "00:00:00:00:00:00",
"scopeid": 8,
"internal": false
}
]
}
[/code]

Each of the above network interfaces has an entry for IPv4 and one for IPv6.

EOL

This property (note: not a method) holds the end-of-line marker for the OS: \r\n on Windows and \n on POSIX systems.

[code lang=”javascript”]
console.log("EOL Marker for OS: " + os.EOL);
[/code]

The complete program is given below:

[code lang=”javascript”]
//file name: node_os_interface.js
var os = require('os');

console.log("OS Temp Dir: " + os.tmpdir());
console.log("CPU is BigEndian(BE) or LittleEndian(LE): " + os.endianness());
console.log("OS Hostname: " + os.hostname());
console.log("OS Type: " + os.type());
console.log("OS Platform: " + os.platform());
console.log("OS CPU Architecture: " + os.arch());
console.log("OS Release: " + os.release());
console.log("OS Uptime (seconds): " + os.uptime());
console.log("OS load average (Returns 0,0,0 in windows): " + os.loadavg());
console.log("Total RAM (mb): " + (os.totalmem()/1024)/1024);
console.log("Free RAM (mb): " + (os.freemem()/1024)/1024)
var cpus = os.cpus();
console.log("CPU Information: " + JSON.stringify(cpus, null, 2));
console.log("Network Interfaces: " + JSON.stringify(os.networkInterfaces(), null, 2));
console.log("EOL Marker for OS: " + os.EOL);
[/code]

The above can be executed by running the command: node node_os_interface.js. The output is given below:

[code lang=”shell”]
OS Temp Dir: C:\Users\Mohamed\AppData\Local\Temp
CPU is BigEndian(BE) or LittleEndian(LE): LE
OS Hostname: Sana-Laptop
OS Type: Windows_NT
OS Platform: win32
OS CPU Architecture: x64
OS Release: 6.3.9600
OS Uptime (seconds): 145774.5905403
OS load average (Returns 0,0,0 in windows): 0,0,0
Total RAM (mb): 8084.2734375
Free RAM (mb): 2064.69921875
CPU Information:
… CPU Information already shown above …
Network Interfaces:
… Network interface information already shown above …
EOL Marker for OS:

[/code]

I hope this tutorial helped you understand the Node.js OS module through simple examples. In my next articles, I will cover a few more Node.js topics with more examples.

Filed Under: NodeJS Tagged With: NodeJS Tutorials

Node.js : Column Chart using FusionCharts with MongoDB

August 4, 2015 by Mohamed Sanaulla Leave a Comment

This tutorial introduces FusionCharts, one of the leading charting libraries, and explains how to create a column chart with it.

Understanding and analyzing data as raw numbers is always difficult, but when the same data is represented visually it becomes much easier to understand. In this post I am going to show you how to represent data in the form of charts.

The sample data I am going to use is that of the top 10 run scorers in ODI Cricket in the year 2015. The same data can be found on cricinfo here.

For building charts there are a lot of options available, on both the client side and the server side, but in this post I am going to use the popular JavaScript library FusionCharts.


What is FusionCharts?

As per their official documentation:

FusionCharts is an open-source FREE flash charting component that can be used to render data-driven animated charts. Made in Macromedia Flash MX, FusionCharts can be used with any web scripting language like PHP, ASP, .NET, JSP, ColdFusion, JavaScript, Ruby on Rails etc., to deliver interactive and powerful charts. Using XML as its data interface, FusionCharts makes full use of fluid beauty of Flash to create compact, interactive and visually-arresting charts.

This post consists of the following parts:

  1. Building Node.js backend for fetching data
  2. Interacting with the Node.js backend from Javascript client
  3. Rendering the chart using the data obtained

Building Node.js backend for fetching data

  • You might like to read : Node.js and MongoDB Performing CRUD Operations

First let us import the below data into mongodb collection:

[code lang="javascript"]
[
  {"player": "AB de Villiers", "runs":162},
  {"player": "CH Gayle", "runs":215},
  {"player": "KJ Coetzer", "runs":156},
  {"player": "L Ronchi", "runs":170},
  {"player": "TM Dilshan", "runs":161},
  {"player": "MJ Guptil", "runs":237},
  {"player": "HM Amla", "runs":153},
  {"player": "AB de Villiers", "runs":149},
  {"player": "HM Amla", "runs":159},
  {"player": "DA Warner", "runs":178}
]

[/code]

Save the above JSON into a file name data.json and then use the following command to import the above data:

[code lang="shell"]
mongoimport -d test -c top_scores --type json --file data.json --jsonArray
[/code]

With this we now have the data in our mongodb and the same can be confirmed by running the following command:

[code lang="javascript"]
> db.top_scores.find();
{ "_id" : ObjectId("55be11f845ed173efcfc34d4"), "player" : "AB de Villiers", "runs" : 162 }
{ "_id" : ObjectId("55be11f845ed173efcfc34d5"), "player" : "CH Gayle", "runs" : 215 }
{ "_id" : ObjectId("55be11f845ed173efcfc34d6"), "player" : "KJ Coetzer", "runs" : 156 }
{ "_id" : ObjectId("55be11f845ed173efcfc34d7"), "player" : "L Ronchi", "runs" : 170 }
{ "_id" : ObjectId("55be11f845ed173efcfc34d8"), "player" : "TM Dilshan", "runs" : 161 }
{ "_id" : ObjectId("55be11f845ed173efcfc34d9"), "player" : "MJ Guptil", "runs" : 237 }
{ "_id" : ObjectId("55be11f845ed173efcfc34da"), "player" : "HM Amla", "runs" : 153 }
{ "_id" : ObjectId("55be11f845ed173efcfc34db"), "player" : "AB de Villiers", "runs" : 149 }
{ "_id" : ObjectId("55be11f845ed173efcfc34dc"), "player" : "HM Amla", "runs" : 159 }
{ "_id" : ObjectId("55be11f845ed173efcfc34dd"), "player" : "DA Warner", "runs" : 178 }
[/code]

Now let us write a Node.js based web server using Express to fetch data from the MongoDB and return it in the form:

[code]
[{"label":"MJ Guptil", "value": 237}, ...]
[/code]

This is the structure of data that the FusionCharts API consumes.
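As a quick sketch of the transformation the server will perform (with inline sample data standing in for the MongoDB documents, using the player/runs fields from this post's dataset):

```javascript
// Sample documents shaped like those stored in the top_scores collection.
var docs = [
  { player: 'MJ Guptil', runs: 237 },
  { player: 'CH Gayle', runs: 215 }
];

// Map each document into the {label, value} object FusionCharts expects.
var chartData = docs.map(function (doc) {
  return { label: doc.player, value: doc.runs };
});

console.log(chartData);
```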

The below is the code for the Node.js server:

[code lang="javascript"]
//Express is required for creating Node.js based web apps
var express = require('express');

//Importing the required mongodb driver
var MongoClient = require('mongodb').MongoClient;

//MongoDB connection URL
var dbHost = 'mongodb://localhost:27017/test';

//Name of the collection
var myCollection = "top_scores";

//DB Object
var dbObject;

//Connecting to the Mongodb instance.
//Make sure your mongodb daemon mongod is running on port 27017 on localhost
MongoClient.connect(dbHost, function(err, db){
  if ( err ) throw err;
  dbObject = db;
});

var app = express();
app.set('port', 3300);

//Starting up the server on the port: 3300
app.listen(app.get('port'), function(){
  console.log('Server up: http://localhost:' + app.get('port'));
});

app.get("/scores", function(req, res){
  //Query Mongodb and iterate through the results
  dbObject.collection(myCollection).find({},{},{}).toArray(
    function(err, docs){
      var result = [];
      for(index in docs){
        var doc = docs[index];
        var resultObject = {};
        resultObject.label = doc.player;
        resultObject.value = doc.runs;
        result.push(resultObject);
      }
      res.json(result);
    }
  );

});
[/code]

Save the above code in the file and name it as: column_chart_sample_server.js. Run the above code by using the following command:

[code]
G:\node\testapp>node column_chart_sample_server.js
Server up: http://localhost:3300
[/code]

This will run the server on port 3300. Open your browser and load the URL: http://localhost:3300/scores. You would see the response of the API.

Interacting with the Node.js backend from Javascript client

  • also read: Introduction to Node.js

Now let us invoke the above API from Javascript and for that I would use jQuery as shown below:

[code lang="javascript"]
$(function(){
  $.ajax({

    url: 'http://localhost:3300/scores',
    type: 'GET',
    crossDomain: true,
    success : function(data) {
      console.log(data);
    }
  });
});
[/code]

Rendering the FusionCharts using the data obtained

As mentioned at the beginning of the post, I will use the FusionCharts JavaScript library for rendering the column chart. First, download the following:

  • Required Javascript code for FusionCharts from here.
  • Bootstrap CSS library from here. I will use it for any CSS related requirements.
  • Latest jQuery code from here.

After downloading all the relevant sources, copy the files from their corresponding locations and build a directory structure as shown in the image below:
In the above directory structure there are two files: column_chart_sample.html and column_chart_sample.js, which I will implement in this post. column_chart_sample.html contains the required HTML markup, and column_chart_sample.js contains the code for invoking the server API and initializing FusionCharts.

The below is the code for column_chart_sample.html:

[code lang="html"]
<!DOCTYPE html>
<html>
<head>
  <title>Sample for Column Chart using FusionChart</title>
  <link rel="stylesheet" href="css/bootstrap.css"/>
</head>
<body>
  <div class="container">
    <div class="row">
      <div class="col-md-12">
        <h3>Top 10 run scorers in ODI cricket in the year 2015</h3>
        <!-- Container for rendering the chart -->
        <div id="chart-container"></div>
      </div>
    </div>
  </div>
  <!-- jQuery Javascript -->
  <script src="js/jquery-2.1.4.js"></script>

  <script src="js/bootstrap.js"></script>
  <!-- FusionCharts related Javascript library -->
  <script src="js/fusioncharts.js"></script>
  <script src="js/fusioncharts.charts.js"></script>
  <script src="js/fusioncharts.theme.zune.js"></script>
  <script src="js/column_chart_sample.js"></script>
</body>
</html>
[/code]

Let us now build the JavaScript code for rendering the chart, step by step.

  1. Create a JSON object of chart properties:
    [code lang="javascript"]
    var chartProperties = {
      "caption": "Top 10 run scorers in ODI Cricket in 2015",
      "xAxisName": "Player",
      "yAxisName": "Runs Scored",
      "rotatevalues": "1",
      "theme": "zune"
    };
    [/code]
  2. Get the data for the chart by invoking the Server side API created at the beginning as shown below:
    [code lang="javascript"]
    var chartData;
    $(function(){
      $.ajax({
    
        url: 'http://localhost:3300/scores',
        type: 'GET',
        crossDomain: true,
        success : function(data) {
          chartData = data;
        }
      });
    });
    [/code]
  3. Create an instance of FusionCharts, passing the id of the HTML element where the chart will be rendered, the dimensions of the chart, the chart type, the data format, and the data source:
    [code lang="javascript"]
    var apiChart = new FusionCharts({
      type: 'column2d',
      renderAt: 'chart-container',
      width: '550',
      height: '350',
      dataFormat: 'json',
      dataSource: {
        "chart": chartProperties,
        "data": chartData
      }
    });
    [/code]

    We have already seen how the chartProperties and chartData objects are built.

  4. Render the chart using the below code:
    [code lang="javascript"]
    apiChart.render();
    [/code]

Putting it all together, column_chart_sample.js looks like this:

[code lang="javascript"]
var chartData;
$(function(){
  $.ajax({

    url: 'http://localhost:3300/scores',
    type: 'GET',
    crossDomain: true,
    success : function(data) {
      chartData = data;
      var chartProperties = {
        "caption": "Top 10 run scorers in ODI Cricket in 2015",
        "xAxisName": "Player",
        "yAxisName": "Runs Scored",
        "rotatevalues": "1",
        "theme": "zune"
      };

      var apiChart = new FusionCharts({
        type: 'column2d',
        renderAt: 'chart-container',
        width: '550',
        height: '350',
        dataFormat: 'json',
        dataSource: {
          "chart": chartProperties,
          "data": chartData
        }
      });
      apiChart.render();
    }
  });
});
[/code]

To run this example on Chrome, follow the steps below:

  1. Install the Allow-Control-Allow-Origin: * plugin from Chrome Webstore.
  2. Load the column_chart_sample.html in the Chrome browser
    OR
    Copy the entire project folder to an Apache Webserver www folder and load the following URL in the browser: http://localhost:8080/fusioncharts/column_chart_sample.html

On other browsers you would see an error message similar to: “CORS header ‘Access-Control-Allow-Origin’ missing“.

Once you have loaded the HTML you would see a chart rendered as shown below:

With this I have shown you how to load the data from a DB and represent it in the form of a column chart to the end user.

Filed Under: NodeJS Tagged With: fusioncharts

Node.js : Reading and Writing File in Node.js

July 1, 2015 by Mohamed Sanaulla Leave a Comment

This tutorial explains how to access the file system and read and write files using the Node.js fs module.

I have written various Node.js articles covering the topics Scheduling Tasks in Node.js using Cron Package, ExpressJS and Bootstrap development and Node.js and MongoDB CRUD Operations. Also learn about how to package and deploy the Node.js application.

Node.js provides a module called fs for interacting with the file system. This includes not just reading and writing files, but also linking files, file and directory permissions, watching files and directories for changes, and more. The fs module offers a lot more than can be covered here; this post explains only the read and write APIs.

  • Book Reference : Web Development with Node and Express

In this post I will show you how to read from and write to files, both synchronously and asynchronously, using this fs module. The file API defines each method in both a synchronous and an asynchronous version; you can use either one to read, write, and remove files.

I assume that you have enough exposure to Node.js to set up a project and import packages, so I will focus only on the file operations in this tutorial.

Node.js File Operations


There are quite a few ways of reading the file. In this post I am going to show two ways:

  • using the readFile API.
  • using the ReadStream API

Reading File Asynchronously – readFile API

This example helps you to read a file asynchronously using the readFile() method.

[code language=”javascript”]
//Filename: node-read-async.js
var fs = require('fs');
var readSource = 'C:/input.txt';

/** Using the readFile API – Asynchronous */
fs.readFile(readSource, "utf8", function(err, data){
  if ( err ) { throw err; }
  console.log("Reading file asynchronously");
  console.log(data);
});
[/code]

Run the above code by executing: node node-read-async.js

The output we get:

[code language=”shell”]
G:\node\node-file-io>node node-read-async.js
Reading file asynchronously
As an asynchronous event driven framework, Node.js is designed to build scalable network applications. In the following "hello world" example, many connections can be handled concurrently. Upon each connection the callback is fired, but if there is no work to be done Node is sleeping.
[/code]

Reading File Synchronously – readFileSync API

[code language=”javascript”]
//Filename: node-read-sync.js
var fs = require('fs');
var readSource = 'C:/input.txt';

/** Using the readFileSync API – Synchronous */
console.log("Reading file synchronously");
var fileData = fs.readFileSync(readSource, "utf8");
console.log(fileData);
[/code]

Executing the above we get:

[code language=”shell”]
G:\node\node-file-io>node node-read-sync.js
Reading file synchronously
As an asynchronous event driven framework, Node.js is designed to build scalable network applications. In the following "hello world" example, many connections can be handled concurrently. Upon each connection the callback is fired, but if there is no work to be done Node is sleeping.
[/code]

Reading File – ReadStream API

[code language=”javascript”]
var fs = require('fs');
var readSource = 'C:/input.txt';

/** Reading file using ReadStream API */
//Creating a stream out of the file
var readStreamObject = fs.createReadStream(readSource, {
  flags: 'r',
  encoding: "utf8",
  fd: null,
  mode: 0666,
  autoClose: true
});

//Setting up event handlers on the stream object
//readable – this event is fired when data can be read from the stream
readStreamObject.on('readable', function(){
  console.log("*** Reading from file using ReadStream");
});

//data – this event is fired when a chunk of data is available to be read
readStreamObject.on('data', function(data){
  console.log(data);
});

//end – this event is fired when there is no more data to be read from the stream
readStreamObject.on('end', function(){
  console.log("*** Completed Reading from file using ReadStream");
});
[/code]

Executing the above code as shown below we get:

[code language=”shell”]
G:\node\node-file-io>node node-read-readstream.js
As an asynchronous event driven framework, Node.js is designed to build scalable network applications. In the following "hello world" example, many connections can be handled concurrently. Upon each connection the callback is fired, but if there is no work to be done Node is sleeping.
*** Reading from file using ReadStream
*** Completed Reading from file using ReadStream
[/code]

You might notice above that the output is not in the expected order: one would expect the message “*** Reading from file using ReadStream” first, then the contents of the file, and finally “*** Completed Reading from file using ReadStream”. Due to the asynchronous nature of the API, the order of the events is not guaranteed.

Write To File Asynchronously – writeFile API

[code language=”javascript”]
var fs = require('fs');
var writeSource = 'C:/Users/Mohamed/Documents/node-write-demo.txt';

fs.writeFile(writeSource, "Writing to a file from node.js", {"encoding": 'utf8'}, function(err){
  if ( err ) { throw err; }
  console.log("*** File written successfully");
  //Now reading the same file to confirm the data was written
  fs.readFile(writeSource, "utf8", function(err, data){
    if ( err ) { throw err; }
    console.log("*** Reading just written file");
    console.log(data);
  });
});

[/code]

Executing the above script we get:

[code language=”shell”]
G:\node\node-file-io>node node-write-async.js
*** File written successfully
*** Reading just written file
Writing to a file from node.js
[/code]

Write To File Synchronously – writeFileSync API

[code language=”javascript”]
var fs = require('fs');
var writeSource = 'C:/Users/Mohamed/Documents/node-write-sync-demo.txt';

fs.writeFileSync(writeSource, "Writing to a file synchronously from node.js", {"encoding": 'utf8'});

console.log("*** File written successfully");

//Now reading the same file to confirm the data was written
fs.readFile(writeSource, "utf8", function(err, data){
  if ( err ) { throw err; }
  console.log("*** Reading just written file");
  console.log(data);
});
[/code]

Executing the above script we get:

[code language=”shell”]
G:\node\node-file-io>node node-write-sync.js
*** File written successfully
*** Reading just written file
Writing to a file synchronously from node.js
[/code]

Write To File – WriteStream API

[code language=”javascript”]
var fs = require('fs');
var writeSource = 'C:/Users/Mohamed/Documents/node-write-stream-demo.txt';
//Create a stream with the required path
var writeStreamObject = fs.createWriteStream(writeSource);

//Write to the stream using the API
writeStreamObject.write("Writing to a file using WriteStream", "utf8");

//Now read the same file to verify that the contents have been successfully written
fs.readFile(writeSource, "utf8", function(err, data){
  if ( err ) { throw err; }
  console.log("*** Reading the recently written file");
  console.log(data);
});
[/code]

Executing the above script we get:

[code language=”shell”]

G:\node\node-file-io>node node-write-writestream.js
*** Reading the recently written file
Writing to a file using WriteStream
[/code]

Summary

This tutorial explained how to read and write files using the fs module in Node.js. Each operation (read, write, remove) comes in two flavors, synchronous and asynchronous. With this I have shown the different ways we can read from and write to files in Node.js. If you have any questions, please write them in the comments section.

Filed Under: NodeJS Tagged With: NodeJS Tutorials

Node.js : Modularizing Express.js and Node.js Applications

June 9, 2015 by Mohamed Sanaulla Leave a Comment

This tutorial explains how to modularize Node.js and Express.js applications. I am reusing the code from my previous post to show how to write a modular Node.js application.

When it comes to writing modular Node.js applications, there is no single best way. It comes with experience: you will encounter different ways to reorganize the code, and each may look like the better approach at the time. In this tutorial, I am going to explain the approach that has worked best in my experience with Express.js and Node.js.

I have shown you how to build RESTful APIs using Express.js and Node.js here. In this post I will show you how to modularize that code, i.e. keep related parts of the code together in a single JavaScript file. For example, code dealing with the DB can be placed in one JavaScript file, and code handling the requests in another.

  • RESTful Example using ExpressJS + Node.js + MongooseJS
  • Mongoose – Node.js + MongoDB with Mongoose Tutorial
  • Book Reference : Web Development with Node and Express

Writing Modular Node.js Applications

I am going to create the following modules from the example in this post:

  1. Application Start point
  2. Database Access Module
  3. Controller module

Basic Introduction to Modularization in Node.js


Before going deeper into modularization of Node.js applications, let me use a very simple example to explain the basic concept behind modularizing code in Node.js. I will write a simple calculator app that supports addition, subtraction, and multiplication, with one module that does the calculation and another that runs the application. So there will be two JavaScript files: calculator.js and app.js.

Let's look at calculator.js:

[code language=”javascript”]
//File name: calculator.js
//Use module.exports to export any function to be available via require()
module.exports.add = function(a,b){return a+b;}
module.exports.subtract = function(a,b){return a-b;}
module.exports.multiply = function(a,b){return a*b;}
[/code]

module is an object in Node.js used to create modules. We make the methods in calculator.js available by assigning them to the exports property of the module object. We now have a module for performing calculations.

Note: Interesting read about difference between module.exports and exports.

Now let's use this module in app.js, which is the entry point of this simple sample demonstrating the basic concept behind modularity.

[code language=”javascript”]
var calculator = require("./calculator");
console.log("Sum of 4,5: "+ calculator.add(4,5));
console.log("Difference between 4,5: "+ calculator.subtract(4,5));
console.log("Product of 4,5: "+ calculator.multiply(4,5));
[/code]

Executing the above we would see something like below:
Basic Modularity in Node.js

With this basic introduction to Modularity in Node.js let me show you how we can refactor the example in the post: RESTful Example using ExpressJS + Node.js + MongooseJS and modularize it.

Modularizing RESTful Example : Node.js Application

As stated at the beginning there would be a module for DB related code, a module for controller operations and then an application starting point. The directory structure of the sample looks like below:
Node Modularity Directory Structure

Install the required packages (express, body-parser, and mongoose) using the command npm install express body-parser mongoose --save.

The code for the book_dao.js is as given below:

[code language=”javascript”]
//Filename: book_dao.js
//mongoose is used for interacting with MongoDB
var mongoose = require(‘mongoose’);

var dbHost = ‘mongodb://localhost:27017/test’;
mongoose.connect(dbHost);
//Create a schema for Book
var bookSchema = mongoose.Schema({
name: String,
//Also creating index on field isbn
isbn: {type: String, index: true},
author: String,
pages: Number
});

//Create a Model by using the schema defined above
//Optionally one can provide the name of collection where the instances
//of this model get stored. In this case it is "mongoose_demo". Skipping
//this value defaults the name of the collection to plural of model name i.e books.
var Book = mongoose.model(‘Book’, bookSchema);

//Connecting to Mongod instance.
mongoose.connection;

module.exports.findOne = function(isbn, callback){
Book.findOne({isbn: isbn}, function(err, result){
if ( err ) throw err;
callback(result);
});
}

module.exports.findAll = function(callback){
Book.find({}, function(err, result){
if ( err ) throw err;
callback(result);
});
}

module.exports.addNewBook = function(body, callback){
var book = new Book({
name:body.name,
isbn: body.isbn,
author: body.author,
pages: body.pages
});

//Saving the model instance to the DB
book.save(function(err, result){
if ( err ) throw err;
callback({
message:"Successfully added book",
book:result
});
});
}

module.exports.editBook = function(body, isbn, callback){
Book.findOne({isbn: isbn}, function(err, result){
if ( err ) throw err;

if(!result){
callback({
message:"Book with ISBN: " + isbn + " not found."
});
//Return here so we do not fall through and dereference a null result
return;
}

result.name = body.name;
result.isbn = body.isbn;
result.author = body.author;
result.pages = body.pages;

result.save(function(err, result){
if ( err ) throw err;
callback({
message:"Successfully updated the book",
book: result
});
});

});
}

module.exports.deleteBook = function(isbn, callback){
Book.findOneAndRemove({isbn: isbn}, function(err, result){
if ( err ) throw err;
callback({
message: "Successfully deleted the book",
book: result
});
});
}
[/code]

From the above code you can see that the book_dao.js depends only on the Mongoose API.

The code for book_controller.js is as given below:

[code language=”javascript”]
var bookDao = require("./book_dao");

module.exports.getBookDetails = function(params, callback){
console.log("Fetching details for book with ISBN: " + params.isbn);
bookDao.findOne(params.isbn, callback);
}

module.exports.getAllBooks = function(callback){
console.log("Fetching all books");
bookDao.findAll(callback);
}

module.exports.addNewBook = function(body, callback){
console.log("Adding new book");
bookDao.addNewBook(body, callback);
}

module.exports.editBook = function(body, isbn, callback){
console.log("Editing Book");
bookDao.editBook(body, isbn, callback);
}
module.exports.deleteBook = function(isbn, callback){
console.log("Deleting book");
bookDao.deleteBook(isbn, callback);
}
[/code]

The code for routes.js is given below:

[code language=”javascript”]
var bookController = require("./book_controller");

module.exports = function(app){
//Get the details of the book with the given isbn
app.get('/book/:isbn', function(req, res){
bookController.getBookDetails(req.params, function(results){res.json(results);});
});

//Get all the books
app.get('/book', function(req, res){
bookController.getAllBooks(function(results){res.json(results);});
});

app.post('/book', function(req, res){
bookController.addNewBook(req.body, function(results){
res.json(results);
});
});

app.put('/book/:isbn', function(req, res){
bookController.editBook(req.body, req.params.isbn, function(results){
res.json(results);
});
});

app.delete('/book/:isbn', function(req, res){
bookController.deleteBook(req.params.isbn, function(results){
res.json(results);
});
});
}
[/code]

routes.js is responsible for mapping the URLs to the methods in the controller.
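As an aside, the :isbn placeholder in a pattern like '/book/:isbn' is what lets Express populate req.params. A simplified sketch of that matching, purely for illustration (Express's real router handles much more, such as regular expressions and optional segments):

```javascript
// Match a URL against a route pattern; named segments (":isbn") become
// entries in the returned params object, literal segments must match exactly.
function matchRoute(pattern, url) {
  var pParts = pattern.split("/");
  var uParts = url.split("/");
  if (pParts.length !== uParts.length) return null;
  var params = {};
  for (var i = 0; i < pParts.length; i++) {
    if (pParts[i].charAt(0) === ":") {
      params[pParts[i].slice(1)] = uParts[i]; // capture named segment
    } else if (pParts[i] !== uParts[i]) {
      return null; // literal segment mismatch
    }
  }
  return params;
}

console.log(matchRoute("/book/:isbn", "/book/12345"));   // { isbn: '12345' }
console.log(matchRoute("/book/:isbn", "/author/12345")); // null
```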

From the above code it is clear that book_dao.js depends only on Mongoose, book_controller.js depends only on book_dao.js, and routes.js depends only on book_controller.js.

Running Modular Node.js Example Application

Below is the code for app.js, the entry point of our application, in other words the place where the server is initiated and launched.

[code language=”javascript”]
//Filename: app.js
//Express is required for creating Node.js based web apps
var express = require('express');

//body-parser is used to parse the Request body and populate the req.
var bodyParser = require('body-parser');

var app = express();
app.set('port', 3300);

//Configuring Express App to make use of BodyParser's JSON parser to parse
//JSON request body
app.use(bodyParser.json());

//Including the routes module
var routes = require("./lib/routes");
routes(app);

//Starting up the server on the port: 3300
app.listen(app.get('port'), function(){
console.log('Server up: http://localhost:' + app.get('port'));
});
[/code]

The above app.js depends on routes.js to setup the routes and controller binding.
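The wiring between app.js and routes.js follows a common Node.js pattern: the module exports a single function that receives the app and registers routes on it. Here is the pattern in miniature, self-contained for illustration (fakeApp and registerRoutes are made-up names, not from the sample):

```javascript
// A stand-in for the Express app, recording registered routes
var fakeApp = {
  routes: {},
  get: function (path, handler) { this.routes["GET " + path] = handler; }
};

// What routes.js does: export one function that receives the app
var registerRoutes = function (app) {
  app.get("/book", function (req, res) { /* ... */ });
  app.get("/book/:isbn", function (req, res) { /* ... */ });
};

// What app.js does: require the module and invoke it with the app
registerRoutes(fakeApp);
console.log(Object.keys(fakeApp.routes)); // [ 'GET /book', 'GET /book/:isbn' ]
```

Because the routes module receives the app rather than creating it, app.js stays in control of server configuration and startup.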

Let's run the above sample as shown below:

[code language=”shell”]
G:\node\testapp\modular_sample_2>node app.js
Server up: http://localhost:3300
[/code]

The code for this sample can be found on GitHub as well.

  • RESTful Example using ExpressJS + Node.js + MongooseJS
  • Mongoose – Node.js + MongoDB with Mongoose Tutorial
  • Book Reference : Web Development with Node and Express

I hope this tutorial helped you understand how to modularize a Node.js application. It reused the RESTful example from our previous tutorials. Note that there is no single rule for modularizing code; the approach can be improvised based on your experience. If you have any questions on writing modular Node.js applications, please write them in the comments section.

Filed Under: ExpressJS, NodeJS Tagged With: NodeJS Tutorials

Node.js : Packaging and Deploying Node.js Applications [COMPLETE GUIDE]

June 4, 2015 by Mohamed Sanaulla

In this tutorial I am going to explain how to use the Node Package Manager (NPM) to package and deploy your own Node.js application. I assume that you have already installed Node.js and NPM on your local system to run this example.

  • Simple Steps to Install Node.js and NPM on Windows

In my previous post I showed you how to authenticate with Twitter and make use of the Twitter API to retrieve tweets. I used several Node packages to achieve what I intended. But what if someone wants to take the source code and run it on their machine? For example, a reader who has copied the source code may simply want to run the application without figuring out all the dependencies in the code and installing them individually. Is this even possible in Node.js? Is there something like Maven, where we just declare the dependencies and the tool automatically downloads them and can also launch the application?

nodejs-logo

The answer to the above questions is yes: the solution is the Node Package Manager (NPM). Just as Maven hosts libraries that are reused across applications, NPM hosts packages that are reused in your Node.js applications. NPM comes bundled with Node.js, so you can download Node.js from here and install it to get NPM as well.

  • RESTful Example using ExpressJS + Node.js + MongooseJS
  • Mongoose – Node.js + MongoDB with Mongoose Tutorial
  • Book Reference : Web Development with Node and Express

Now let us pick the sample app developed in this post and create a deployable package which any user can download and run in their environment.

The following Node packages are used in that sample app:

  1. express
  2. express-handlebars
  3. request
  4. querystring

Creating an Empty Node.js Code Base

I am creating a new directory by the name node_packaging, and in that directory I execute the following command:

[code language=”shell”]
npm init
[/code]

When you run the above command, it asks you a series of questions (just as when you initialize a project using Maven). The answers are used to create and populate a new package.json file (think of it as the pom.xml of your project). If you want to know more about package.json and its attributes, visit here. Below is how it looks after initializing with NPM; input the answers shown in the screenshot. NPM uses these values to create the package.json file.

NPM Init

After that you will see a file named package.json created with the following content:

[code language=”javascript”]
{
"name": "node-packaging-demo",
"version": "1.0.0",
"description": "Demo to show node packaging",
"main": "social_signin.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1"
},
"author": "",
"license": "ISC"
}
[/code]

I can edit/update the package.json, for example add the author details as shown below:

[code language=”javascript”]
{
"name": "node-packaging-demo",
"version": "1.0.0",
"description": "Demo to show node packaging",
"main": "social_signin.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1"
},
"author": "Mohamed Sanaulla",
"license": "ISC"
}
[/code]

Adding Dependencies using NPM

All the dependencies of the application are declared in package.json. One can either edit the file directly or use the npm tool to update the dependencies in package.json. The below command adds the required dependencies to our app:

[code language=”shell”]
npm install --save express express-handlebars request querystring
[/code]

The above command, along with installing the mentioned packages, adds them as dependencies in package.json. Let us now look at the contents of package.json:

[code language=”javascript” highlight=”11-16″]
{
"name": "node-packaging-demo",
"version": "1.0.0",
"description": "Demo to show node packaging",
"main": "social_signin.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1"
},
"author": "Mohamed Sanaulla",
"license": "ISC",
"dependencies": {
"express": "^4.12.4",
"express-handlebars": "^2.0.1",
"querystring": "^0.2.0",
"request": "^2.57.0"
}
}
[/code]

In the above, the highlighted lines are the dependencies of our application, and the number against each is the version of the package we are using. If no version is mentioned while executing npm install --save, the latest version in the NPM repository is used.
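The caret (^) in front of each version is npm's way of saying "any compatible release": roughly, the same major version at or above the stated one. A rough sketch of the rule follows (npm's real semver matching has more edge cases, for example around 0.x versions, which this ignores):

```javascript
// Simplified check of whether a version satisfies a caret range like "^4.12.4":
// the major number must match, and the minor.patch must be >= the stated one.
function satisfiesCaret(range, version) {
  var base = range.replace(/^\^/, "").split(".").map(Number);
  var v = version.split(".").map(Number);
  if (v[0] !== base[0]) return false;           // major must match
  if (v[1] !== base[1]) return v[1] > base[1];  // a newer minor is fine
  return v[2] >= base[2];                       // same minor: patch >= base
}

console.log(satisfiesCaret("^4.12.4", "4.13.0")); // true  - newer minor
console.log(satisfiesCaret("^4.12.4", "4.12.3")); // false - older patch
console.log(satisfiesCaret("^4.12.4", "5.0.0"));  // false - major changed
```

This is why two developers running npm install at different times can end up with slightly different (but compatible) versions in node_modules.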

When you list the files in your project folder, you will see a folder named node_modules. This is where all your modules get downloaded.

nodemodules

nodemodules1

Building the Node.js Application

Below is the directory structure of the application:

[code]
|-- social_signin.js
|-- views
|   |-- home.handlebars
|   |-- my.handlebars
|   |-- layouts
|   |   |-- main.handlebars
[/code]

This directory structure is relative to the directory we created first, i.e. node_packaging.

The following is the source code of our application.

File: social_signin.js

[code language=”javascript”]
//FileName: social_signin.js
var express = require('express');

//NPM module to integrate the Handlebars UI template engine with Express
var exphbs = require('express-handlebars');

//NPM Module to make HTTP Requests
var request = require("request");

//NPM Module To parse the Query String and to build a Query String
var qs = require("querystring");

var app = express();

//Declaring Express to use the Handlebars template engine with main.handlebars
//as the default layout
app.engine('handlebars', exphbs({defaultLayout: 'main'}));
app.set('view engine', 'handlebars');

//URL To obtain Request Token from Twitter
var requestTokenUrl = "https://api.twitter.com/oauth/request_token";

//To be obtained from the app created on Twitter
var CONSUMER_KEY = "GET_IT_FROM_TWITTER";
var CONSUMER_SECRET = "GET_IT_FROM_TWITTER";

//Oauth Object to be used to obtain Request token from Twitter
var oauth = {
callback : "http://localhost:3000/signin-with-twitter",
consumer_key : CONSUMER_KEY,
consumer_secret : CONSUMER_SECRET
}
var oauthToken = "";
var oauthTokenSecret = "";
app.get('/', function (req, res) {
//Step-1 Obtaining a request token
request.post({url : requestTokenUrl, oauth : oauth}, function (e, r, body){

//Parsing the Query String containing the oauth_token and oauth_secret.
var reqData = qs.parse(body);
oauthToken = reqData.oauth_token;
oauthTokenSecret = reqData.oauth_token_secret;

//Step-2 Redirecting the user by creating a link
//and allowing the user to click the link
var uri = 'https://api.twitter.com/oauth/authenticate'
+ '?' + qs.stringify({oauth_token: oauthToken});
res.render('home', {url : uri});
});

});

//Callback to handle post authentication.
app.get("/signin-with-twitter", function(req, res){
var authReqData = req.query;
oauth.token = authReqData.oauth_token;
oauth.token_secret = oauthTokenSecret;
oauth.verifier = authReqData.oauth_verifier;

var accessTokenUrl = "https://api.twitter.com/oauth/access_token";
//Step-3 Converting the request token to an access token
request.post({url : accessTokenUrl , oauth : oauth}, function(e, r, body){
var authenticatedData = qs.parse(body);
console.log(authenticatedData);

//Make a request to get the user's 10 latest tweets
var apiUrl = "https://api.twitter.com/1.1/statuses/user_timeline.json" + "?"
+ qs.stringify({screen_name: authenticatedData.screen_name, count: 10});

var authenticationData = {
consumer_key : CONSUMER_KEY,
consumer_secret : CONSUMER_SECRET,
token: authenticatedData.oauth_token,
token_secret : authenticatedData.oauth_token_secret
};

request.get({url : apiUrl, oauth: authenticationData, json:true}, function(e, r, body){

var tweets = [];
for(var i in body){
var tweetObj = body[i];
tweets.push({text: tweetObj.text});
}

var viewData = {
username: authenticatedData.screen_name,
tweets: tweets
};

res.render("my", viewData);

});

});
});

app.listen(3000, function(){
console.log('Server up: http://localhost:3000');
});
[/code]

File Name: main.handlebars

[code language=”html”]
<!-- FileName: main.handlebars -->
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<title>Example App</title>
</head>
<body>
{{{body}}}
</body>
</html>
[/code]

File Name: home.handlebars

[code language=”html”]
<!-- FileName: home.handlebars -->
<h1>SignIn With Twitter Sample</h1>

<a href="{{url}}">Signin with Twitter</a>
[/code]

File Name: my.handlebars

[code language=”html”]
<!-- FileName: my.handlebars -->
<h1>Welcome {{username}} </h1>
<h3>Your tweets</h3>
<ul>
{{#tweets}}
<li>{{text}}</li>
{{/tweets}}
</ul>
[/code]
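For readers new to Handlebars, the {{#tweets}} ... {{/tweets}} block in my.handlebars iterates over the tweets array, emitting the inner markup once per item with {{text}} substituted. Approximated in plain JavaScript purely for illustration (Handlebars itself does much more):

```javascript
// Hand-rolled equivalent of what my.handlebars renders for a view model
function renderTweetList(viewData) {
  var items = viewData.tweets.map(function (t) {
    return "<li>" + t.text + "</li>"; // the {{#tweets}} body, per item
  });
  return "<h1>Welcome " + viewData.username + "</h1>\n<ul>\n" +
         items.join("\n") + "\n</ul>";
}

var html = renderTweetList({
  username: "sanaulla123",
  tweets: [{ text: "Hello Node" }, { text: "Hello NPM" }]
});
console.log(html);
```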

Running the Node.js Sample Application

[code]
G:\node\node_packaging>node social_signin.js
Server up: http://localhost:3000
[/code]

Navigating to http://localhost:3000 gives us a page with a "Signin with Twitter" link. Clicking on it shows us the latest 10 tweets.

Nodejs Server Up

Twitter Signin Nodejs

Pushing the code to Github

Now I am going to push the code to GitHub and then show you how packaging with NPM and package.json helps in sharing the application with other developers. Before carrying out the steps below, install Git on your machine.

Carry out the following steps to push the code to github:

    1. Create a new repository on github.com
    2. Run git init in your project folder to initialize a new git repository locally.
    3. Run git status in your project folder, you will see the following output:

      [code language=”shell”]
      $ git status
      On branch master

      Initial commit

      Untracked files:
      (use "git add <file>…" to include in what will be committed)

      node_modules/
      package.json
      social_signin.js
      views/

      nothing added to commit but untracked files present (use "git add" to track)
      [/code]

      We need to prevent the node_modules folder from being committed to GitHub. So we create a new .gitignore file in the project root folder, i.e. in the node_packaging folder. The content of the .gitignore file is:

      [code]
      $ cat .gitignore
      node_modules
      [/code]

      Run git status again, this time you will see that the node_modules folder is not listed in the files to be committed:

      [code]
      $ git status
      On branch master

      Initial commit

      Untracked files:
      (use "git add <file>…" to include in what will be committed)

      .gitignore
      package.json
      social_signin.js
      views/

      nothing added to commit but untracked files present (use "git add" to track)
      [/code]

    4. Add the GitHub repository you created in Step 1 as a remote of the local repository you initialized in Step 2. This can be done using the following command: git remote add origin GITHUB_REPO_URL. You can get the GITHUB_REPO_URL from the GitHub repository page created in Step 1.
    5. Now commit and push your changes to the Github repository. It can be done as follows:

      [code]
      $ git add -A
      $ git commit -m "Initial commit"
      $ git push origin master
      [/code]

Now the code should be available in your GitHub repository. Mine is at: https://github.com/sanaulla123/nodejs-package-demo

Advantage of Node.js Packaging

So we have set up everything needed for any developer to just clone the GitHub repository and start running the application locally. Let us also clone the repository into another location and see if this packaging really works!

[code language=”shell”]
Mohamed@SANA-LAPTOP /g/node/node_packaging (master)
$ cd ..

Mohamed@SANA-LAPTOP /g/node
$ mkdir node_packaging_clone

Mohamed@SANA-LAPTOP /g/node
$ cd node_packaging_clone/

#Cloning the Github repository
Mohamed@SANA-LAPTOP /g/node
$ git clone https://github.com/sanaulla123/nodejs-package-demo.git

Mohamed@SANA-LAPTOP /g/node/node_packaging_clone
$ ls
nodejs-package-demo

Mohamed@SANA-LAPTOP /g/node/node_packaging_clone
$ cd nodejs-package-demo/

#There is no node_modules folder!!
Mohamed@SANA-LAPTOP /g/node/node_packaging_clone/nodejs-package-demo (master)
$ ls
package.json social_signin.js views

#This command picks up the dependencies from package.json and installs them to the local project folder.
Mohamed@SANA-LAPTOP /g/node/node_packaging_clone/nodejs-package-demo (master)
$ npm install

#The node_modules folder got created after the above command.
Mohamed@SANA-LAPTOP /g/node/node_packaging_clone/nodejs-package-demo (master)
$ ls
node_modules package.json social_signin.js views
[/code]

Now you should be able to run the application as before. This was a small introduction to packaging your Node.js applications. I hope you found it easy to understand and beneficial. If you have any questions on Node.js application packaging, please write them in the comments section. Enjoy reading more tutorials about Node.js on our blog.

Filed Under: ExpressJS, NodeJS Tagged With: NodeJS Tutorials

