Tutorial: Designing A Basic REST API

Creating The REST API

If you have not yet done a checkout on the YourAPIExpert GitHub repository you should do so now.  Let’s walk through the directory structure of ‘001 Basic’.

[Screenshot: the ‘001 Basic’ directory structure on GitHub]

The picture above shows the directory structure of ‘001 Basic’ (this tutorial) on GitHub.  The ‘config’ directory contains a single file (development.json) which is used to establish parameters and settings in the application.  The ‘controllers’ directory contains individual files to manage each of the endpoints (routes).
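For readers following along without the screenshot, the layout is roughly as follows (individual controller file names omitted):

001 Basic/
  config/
    development.json
  controllers/
    (one file per endpoint)
  package.json
  route.js
  server.js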

package.json

This file, in JSON format, describes the list of packages and dependencies required to run the application.

The listed dependencies should align with those used in the main application file (server.js) and may carry specific repository and versioning information in the event that such specificity is required.

The package.json file is read by NodeJS’ npm tool in order to install the required dependencies.  From inside the ‘001 Basic’ directory it is safe to type ‘npm install’ to install them.
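As an illustration, a minimal package.json for this tutorial might look something like the sketch below.  The exact name, version ranges and metadata in the repository may differ, so treat this as an assumption rather than the actual file.

{
  "name": "yourapiexpert-001-basic",
  "version": "1.0.0",
  "description": "A basic RESTify API template for YourAPIExpert.com tutorials",
  "main": "server.js",
  "dependencies": {
    "restify": "4.x",
    "bunyan": "1.x",
    "bunyan-syslog": "0.x",
    "config": "1.x"
  }
}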

server.js

The server.js file contains the application logic required to initialize the application.  Although all of the code could reside in this file, it is undesirable to do so because it would make the file excessively large and difficult to analyze and troubleshoot.

Instead this file contains just enough code to initialize the various functions and libraries and then passes control on to other code located in libraries and modules.

route.js

This file is referenced by the main server.js file and contains a routine which scans for files in the controllers/ directory and includes them dynamically in the route map.

In essence this routine builds a map of endpoints (eg: /hello and /hello/{name}) which, when called, pass control to the relevant module contained in the controllers/ directory.
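To make that concrete, below is a minimal sketch of what a controller in the controllers/ directory might look like.  The file name, exported function signature and routes are hypothetical and may differ from the actual files in the repository.

// controllers/hello.js (hypothetical example)
// Exports a function which receives the RESTify server instance and
// registers its own routes against it.
module.exports = function(server) {

  // GET /hello - a generic greeting
  server.get('/hello', function(req, res, next) {
    res.send(200, { message: 'Hello, world!' });
    return next();
  });

  // GET /hello/:name - a personalized greeting
  server.get('/hello/:name', function(req, res, next) {
    res.send(200, { message: 'Hello, ' + req.params.name + '!' });
    return next();
  });
};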

Installing Dependencies

The Node Package Manager (NPM) application reads the package.json file and installs the required dependencies.  Make sure that you are in the ‘001 Basic’ directory and then issue the command ‘npm install’ to install them.

The Application – Dependencies And Configuration

The file server.js contains the application initialization code as shown below.

/**
 __   __                _    ____ ___ _____                      _   
 \ \ / /__  _   _ _ __ / \  |  _ \_ _| ____|_  ___ __   ___ _ __| |_ 
  \ V / _ \| | | | '__/ _ \ | |_) | ||  _| \ \/ / '_ \ / _ \ '__| __|
   | | (_) | |_| | | / ___ \|  __/| || |___ >  <| |_) |  __/ |  | |_ 
   |_|\___/ \__,_|_|/_/   \_\_|  |___|_____/_/\_\ .__/ \___|_|   \__|
                                                |_|                  

 @file A basic RESTify API template for use during YourAPIExpert.com tutorials.
 @author YourAPIExpert <yourapiexpert@gmail.com>
 @version 1.0.0
 @module server
*/

// The Essentials
// These are libraries that the program depends upon to
// function.  You will find them listed in package.json and
// are installed by using 'npm'
var restify = require('restify'); // The main restify library for our API
var cluster = require('cluster'); // To take advantage of multi-core systems
var bunyan = require('bunyan'); // Bunyan logging component
var bsyslog = require('bunyan-syslog'); // Syslog add-on for Bunyan
var config = require('config'); // A convenient module to read config files
var os = require('os'); // A convenient module to access os functions
var path = require('path'); // Library to work with file paths

// Configuration
// The variables below are used throughout the program and are therefore
// declared at the top-level.  We will later discuss scopes and how they
// apply to NodeJS.
var numCPUs = os.cpus().length;
var serviceConfig = config.get('general.service');
var httpConfig = config.get('connections.http');
var loggingConfig = config.get('connections.sysLog');

Lines 15-26 are top-level variable definitions which load the individual libraries and provide references to them.  These should align with the dependencies declared in the file package.json.

I have made some special additions and modifications to this API at this early stage as I would like students to begin the discipline of some essential practices, namely:

  • Clustering (line 20), which helps the developer make use of all available cores on the system.  NodeJS typically uses only one core per instance, which in a production environment becomes a limiting factor.
  • Logging (lines 21-22), which is preferable to dumping text to the console.  Logged output is persistent, whereas console data lasts only as long as the buffers allow, which hinders later diagnostics and troubleshooting.
  • Configuration data (line 23), which is removed from the main application.  So that development and production environments can remain isolated, the configuration is supplied by a JSON-formatted file read from the config/ directory (a sketch of such a file follows this list).
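Based on the config.get() calls further down in server.js, a config/development.json along the following lines would satisfy the application.  The values, and the service name, are assumptions chosen for illustration; adjust them to your own environment.

{
  "general": {
    "service": {
      "name": "YourAPIExpert"
    }
  },
  "connections": {
    "http": {
      "port": 8080
    },
    "sysLog": {
      "host": "127.0.0.1",
      "port": 514
    }
  }
}

The config module loads development.json because NODE_ENV defaults to ‘development’ when it is not set, which is what keeps environment-specific settings out of the application code.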

In line 31 we ask NodeJS to count the number of CPUs (or cores) available in the system, whilst lines 32-34 set up top-level variables to reference the individual configuration sections.

Logging
// Logging Subsystem
// At this early stage I am going to bring in logging as I don't find
// console.log to be a practice I would like to teach.  The same result
// can be achieved by selective logging which has the added benefit of 
// being persistent for later diagnostics and troubleshooting.
var log = bunyan.createLogger({
  name: serviceConfig.name,
  streams: [ {
    level: 'debug',
    type: 'raw',
    stream: bsyslog.createBunyanStream({
      type: 'sys',
      facility: bsyslog.local0,
      host: loggingConfig.host,
      port: loggingConfig.port
    })
  },
  {
    stream: process.stdout,
    level: 'debug'
  }]
});

By default the logging subsystem in lines 41-57 is set to log to the console as well as to syslog on Linux and macOS hosts.  Whilst the code will run on a Windows host, there is no local syslog implementation on Windows machines.  For this reason I recommend removing lines 44-51 if you are developing on a Windows machine.
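For reference, with the syslog stream removed the logger reduces to a console-only configuration along these lines:

// Console-only logging (eg: for Windows hosts without a local syslog)
var log = bunyan.createLogger({
  name: serviceConfig.name,
  streams: [ {
    stream: process.stdout,
    level: 'debug'
  } ]
});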

Starting The Cluster
// Fork the cluster.
// By default NodeJS makes use of a single CPU only which is a performance
// limitation in multi-CPU or multi-core systems.  The code below will assist
// NodeJS to make use of all CPUs and cores for maximum performance.
if (cluster.isMaster) {
  // Fork workers.
  for (var i = 0; i < numCPUs; i++) {
    cluster.fork();
  }
  cluster.on('exit', function(deadWorker, code, signal) {
    var worker = cluster.fork();

    log.error('worker ' + deadWorker.process.pid + ' died');
    log.error('worker ' + worker.process.pid + ' spawned');
  });
} else {

Between lines 63-77 we implement application code to fork NodeJS worker processes and to replace any workers which die.  ‘Forking’ is a programming term which means to spawn or create a separate process and return control back to the main application.  In this practical application we create one NodeJS instance of our application for each available CPU (line 69), and the ‘exit’ handler forks a replacement whenever a worker dies.

Creating HTTP Servers
  // Create the server.
  server = restify.createServer({
    name: 'YourAPIExpert',
    log: log,
    version: '1.0.0'
  });
  server.use(restify.CORS({
    origins: [ '*' ],
    methods: ['GET,PUT,POST,DELETE,PATCH,MERGE'],
    headers: ['Content-Type']
  }));
  // RESTify includes a number of bundled plugins (middleware) to make things
  // easier.  Instead of me wasting space in the comments please see
  // http://www.restify.com
  server.use(restify.queryParser());
  server.use(restify.bodyParser({
    maxBodySize: 0,
    mapParams: true,
    mapFiles: false,
    overrideParams: false,
    multipartHandler: function(part) {
      part.on('data', function(data) {
        // TODO - Do something with multipart data
      });
    },
    multipartFileHandler: function(part) {
      part.on('data', function(data) {
        // TODO - Do something with multipart data
      });
    },
    keepExtensions: false,
    uploadDir: os.tmpdir(),
    multiples: true
  }));
  server.use(restify.gzipResponse());

From line 79 through 113 our application code creates the RESTify HTTP servers.  As REST is a data exchange format layered on top of HTTP, the library takes care of creating the underlying transport as well as parsing and interpreting the RESTful data.  The various ‘server.use’ directives between lines 85-113 instruct the RESTify library to initialize plugins which assist with parsing data in query strings (eg: domain.com/action?parameter1=value1&parameter2=value2) and HTTP POST bodies.  Other plugins handle cross-domain (CORS) requests and compress the output data to save on transmission costs.
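To illustrate what the parsers give us, consider a request such as GET /hello?greeting=Howdy.  With queryParser enabled the parsed query-string values appear on req.query inside a handler.  The handler below is a hypothetical example and not part of server.js:

// Hypothetical handler illustrating the effect of the queryParser plugin.
// A request to /hello?greeting=Howdy would respond with 'Howdy, world!'.
server.get('/hello', function(req, res, next) {
  var greeting = req.query.greeting || 'Hello';
  res.send(200, { message: greeting + ', world!' });
  return next();
});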

Initialization Of Routes
  // Next we need to incorporate our routes but these can very quickly get
  // out of hand with application logic and what not.  For this reason we
  // move them out of this main file and in to separate smaller files.  Below
  // we provide reference to these files
  require('./route.js')(__dirname+'/controllers', server);

In line 119 we have a declaration which invokes a function exported by the file route.js, which we will discuss in a few paragraphs.  Alongside the declaration we supply two parameters, the path to the controllers/ directory and the server instance, to assist with the initialization of the routes.
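Although route.js is covered properly later, a rough sketch of how such a loader is commonly written is shown below.  This is an assumption for illustration and not necessarily the exact code in the repository.

// route.js (illustrative sketch)
// Scans the supplied directory for controller files and hands each one
// the server instance so that it can register its own routes.
var fs = require('fs');
var path = require('path');

module.exports = function(controllerPath, server) {
  fs.readdirSync(controllerPath).forEach(function(file) {
    if (path.extname(file) === '.js') {
      // Each controller exports a function which accepts the server
      // and attaches its endpoints (eg: /hello, /hello/{name}) to it.
      require(path.join(controllerPath, file))(server);
    }
  });
};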

Starting The Server
  // Start the server and set it to listen on the port defined in the config.
  server.listen(httpConfig.port, function startServer() {
    log.info('HTTP server listening on port '+httpConfig.port);
  });

Lastly, in line 122 we start the HTTP server and, if successful, send a message to the logging subsystem, which prints the text both to the console and to the system log (/var/log/syslog or /var/log/messages) on Linux hosts.

Up Next:  Managing Routes


Written by YourAPIExpert