How to process POST data in Node.js?

How do you extract form data (form[method="post"]) and file uploads sent via the HTTP POST method in Node.js?

I've read the documentation, googled and found nothing.

function (request, response) {
    //request.post????
}

Is there a library or a hack?


Solution 1:

You can use the querystring module:

var http = require('http');
var qs = require('querystring');

http.createServer(function (request, response) {
    if (request.method == 'POST') {
        var body = '';

        request.on('data', function (data) {
            body += data;

            // Too much POST data, kill the connection!
            // 1e6 === 1 * Math.pow(10, 6) === 1 * 1000000 ~~~ 1MB
            if (body.length > 1e6)
                request.connection.destroy();
        });

        request.on('end', function () {
            var post = qs.parse(body);
            // use post['blah'], etc.
        });
    }
}).listen(8080);

Now, for example, if you have an input field named age, you could access it via the post variable:

console.log(post.age);

Solution 2:

If you use Express (high-performance, high-class web development for Node.js), you can do this:

HTML:

<form method="post" action="/">
    <input type="text" name="user[name]">
    <input type="text" name="user[email]">
    <input type="submit" value="Submit">
</form>

API client:

fetch('/', {
    method: 'POST',
    headers: {
        'Content-Type': 'application/json'
    },
    body: JSON.stringify({
        user: {
            name: "John",
            email: "[email protected]"
        }
    })
});

Node.js: (since Express v4.16.0)

// Parse URL-encoded bodies (as sent by HTML forms)
app.use(express.urlencoded({ extended: true })); // extended: true parses nested fields like user[name]

// Parse JSON bodies (as sent by API clients)
app.use(express.json());

// Access the parse results as request.body
app.post('/', function(request, response){
    console.log(request.body.user.name);
    console.log(request.body.user.email);
});

Node.js: (for Express <4.16.0)

const bodyParser = require("body-parser");

/** bodyParser.urlencoded(options)
 * Parses the text as URL encoded data (which is how browsers tend to send form data from regular forms set to POST)
 * and exposes the resulting object (containing the keys and values) on req.body
 */
app.use(bodyParser.urlencoded({
    extended: true
}));

/** bodyParser.json(options)
 * Parses the text as JSON and exposes the resulting object on req.body.
 */
app.use(bodyParser.json());

app.post("/", function (req, res) {
    console.log(req.body.user.name);
});

Solution 3:

A lot of the answers here are no longer good practice, or don't explain anything, so that's why I'm writing this.

Basics

When the callback of http.createServer is called, the server has received all of the headers for the request, but the body may not have arrived yet, so we have to wait for it. The request object (an http.IncomingMessage instance) is actually a readable stream. On readable streams, a data event is emitted whenever a chunk of data arrives (assuming you have registered a callback for it), and an end event is emitted once all chunks have arrived. Here's an example of how to listen for these events:

const http = require('http');

http.createServer((request, response) => {
  console.log('Now we have an http message with headers but no data yet.');
  request.on('data', chunk => {
    console.log('A chunk of data has arrived: ', chunk);
  });
  request.on('end', () => {
    console.log('No more data');
  });
}).listen(8080);

Converting Buffers to Strings

If you try this, you will notice the chunks are buffers. If you are not dealing with binary data and need to work with strings instead, I suggest using the request.setEncoding method, which causes the stream to emit strings interpreted with the given encoding and handles multi-byte characters properly.
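For example, with the encoding set, the data handler receives decoded strings instead of buffers (a minimal sketch of the same server as above):

const http = require('http');

http.createServer((request, response) => {
  // Emit utf-8 strings instead of Buffers; multi-byte characters
  // split across chunk boundaries are decoded correctly.
  request.setEncoding('utf8');
  request.on('data', chunk => {
    console.log('A string chunk has arrived: ', chunk);
  });
  request.on('end', () => {
    response.end('Done');
  });
}).listen(8080);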

Buffering Chunks

Now, you are probably not interested in each chunk on its own, so in this case you probably want to buffer the data like this:

const http = require('http');

http.createServer((request, response) => {
  const chunks = [];
  request.on('data', chunk => chunks.push(chunk));
  request.on('end', () => {
    const data = Buffer.concat(chunks);
    console.log('Data: ', data);
  });
}).listen(8080);

Here Buffer.concat is used, which simply concatenates all the buffers and returns one big buffer. You can also use the concat-stream module, which does the same:

const http = require('http');
const concat = require('concat-stream');

http.createServer((request, response) => {
  request.pipe(concat(data => {
    console.log('Data: ', data);
  }));
}).listen(8080);

Parsing Content

If you are trying to accept HTML form POST submissions with no files, or handling jQuery ajax calls with the default content type, then the content type is application/x-www-form-urlencoded with utf-8 encoding. You can use the querystring module to deserialize it and access the properties:

const http = require('http');
const concat = require('concat-stream');
const qs = require('querystring');

http.createServer((request, response) => {
  request.pipe(concat(buffer => {
    const data = qs.parse(buffer.toString());
    console.log('Data: ', data);
  }));
}).listen(8080);

If your content type is JSON instead, you can simply use JSON.parse instead of qs.parse.
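For instance, here's a minimal sketch of the JSON variant (note that JSON.parse throws on malformed input, so it's worth wrapping it in a try/catch):

const http = require('http');
const concat = require('concat-stream');

http.createServer((request, response) => {
  request.pipe(concat(buffer => {
    try {
      const data = JSON.parse(buffer.toString());
      console.log('Data: ', data);
      response.end('OK');
    } catch (err) {
      // Malformed JSON from the client.
      response.statusCode = 400;
      response.end('Invalid JSON');
    }
  }));
}).listen(8080);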

If you are dealing with files or handling a multipart content type, then you should use something like formidable, which removes all the pain of dealing with it. Have a look at this other answer of mine, where I have posted helpful links and modules for multipart content.
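As an illustration, here's a minimal sketch using formidable's classic IncomingForm API (newer major versions of formidable expose a formidable() factory instead, so check the version you have installed):

const http = require('http');
const formidable = require('formidable');

http.createServer((request, response) => {
  const form = new formidable.IncomingForm();
  form.parse(request, (err, fields, files) => {
    if (err) {
      response.statusCode = 400;
      return response.end('Could not parse the form');
    }
    // fields holds the text inputs, files describes the uploads.
    console.log('Fields: ', fields, 'Files: ', files);
    response.end('OK');
  });
}).listen(8080);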

Piping

If you don't want to parse the content but rather pass it somewhere else, for example send it to another HTTP request as the data or save it to a file, I suggest piping rather than buffering, as it's less code, it handles back pressure better, it takes less memory, and in some cases it's faster.

So if you want to save the content to a file:

const fs = require('fs');
const http = require('http');

http.createServer((request, response) => {
  request.pipe(fs.createWriteStream('./request'));
}).listen(8080);
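And if you instead want to forward the body to another HTTP request, a minimal sketch along the same lines (the localhost:9090 upstream is just a placeholder):

const http = require('http');

http.createServer((request, response) => {
  // Stream the incoming body straight into an outgoing request
  // without buffering it in memory.
  const upstream = http.request({
    hostname: 'localhost',
    port: 9090,
    path: '/',
    method: 'POST',
    headers: { 'content-type': request.headers['content-type'] }
  }, upstreamResponse => {
    upstreamResponse.pipe(response);
  });
  request.pipe(upstream);
}).listen(8080);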

Limiting the Amount of Data

As other answers have noted, keep in mind that malicious clients might send you a huge amount of data to crash your application or fill your memory, so make sure you drop requests which emit data past a certain limit. If you don't use a library to handle the incoming data, I would suggest using something like stream-meter, which can abort the request if it reaches the specified limit:

const limitedStream = request.pipe(meter(1e7));
limitedStream.on('data', ...);
limitedStream.on('end', ...);

or

request.pipe(meter(1e7)).pipe(fs.createWriteStream(...));

or

request.pipe(meter(1e7)).pipe(concat(...));
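Putting that together, here's a minimal sketch (assuming stream-meter's documented behaviour of emitting an error on the metered stream once the limit is exceeded):

const http = require('http');
const concat = require('concat-stream');
const meter = require('stream-meter');

http.createServer((request, response) => {
  const limited = request.pipe(meter(1e7)); // abort past ~10 MB
  limited.on('error', () => {
    response.statusCode = 413;
    response.end('Payload too large');
    request.destroy();
  });
  limited.pipe(concat(buffer => {
    console.log('Data: ', buffer.toString());
    response.end('OK');
  }));
}).listen(8080);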

NPM Modules

While I described above how you can use the HTTP request body, for simply buffering and parsing the content I suggest using one of these modules rather than implementing it on your own, as they will probably handle the edge cases better. For Express I suggest using body-parser. For Koa, there's a similar module.

If you don't use a framework, body is quite good.
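For example, here's a minimal sketch with the body module (assuming its callback API, where require('body/any') picks a form or JSON parser based on the request's content type):

const http = require('http');
const anyBody = require('body/any');

http.createServer((request, response) => {
  anyBody(request, response, (err, body) => {
    if (err) {
      response.statusCode = 400;
      return response.end('Could not parse the body');
    }
    console.log('Data: ', body);
    response.end('OK');
  });
}).listen(8080);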