Today I have yet another PostgreSQL-related tip. Some time ago I needed to put the contents of a binary file into a bytea column, using Node and Knex.js. I couldn’t find any tutorials, so I decided to try the simplest thing and see if it broke. To my astonishment, it didn’t – it Just Worked™!
So, here is how I did it. First of all, I read the file from disk:
const file_contents = fs.readFileSync(file_name)
(now file_contents contains a Buffer), and then simply insert it into the database:
knex('table').insert({file_name, file_contents})
And that’s it! Our binary file is now in PostgreSQL!
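For completeness, here is a sketch of how such a table could be created with a Knex migration. The table and column names (`table`, `file_name`, `file_contents`) just match the snippets above, and `table.binary()` is the Knex schema method that produces a bytea column on PostgreSQL:

```javascript
// Hypothetical migration creating the table used in this post.
// On PostgreSQL, t.binary() maps to a bytea column.
exports.up = knex =>
  knex.schema.createTable('table', t => {
    t.string('file_name').primary()
    t.binary('file_contents').notNullable()
  })

exports.down = knex => knex.schema.dropTable('table')
```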
Reading it back is similarly easy:
const file = await knex('table').select('file_contents', 'file_name').where({file_name})
Now, what Knex returns here is a Buffer (again). You can convert a buffer to a string (which is sometimes what you need), but my use case was a bit different – the files were images and I wanted to serve them in a web app. It turns out that Express.js can send buffers directly to the client:
res.set('Content-Type', 'image/png').send(file[0].file_contents)
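Putting the pieces together, a minimal Express route might look like this. This is a hedged sketch, not production code: it assumes a Knex instance configured via `DATABASE_URL`, reuses the table and column names from above, and serves every file as image/png regardless of its actual type:

```javascript
const express = require('express')

// Assumed setup: a PostgreSQL connection configured elsewhere.
const knex = require('knex')({ client: 'pg', connection: process.env.DATABASE_URL })

const app = express()

app.get('/images/:file_name', async (req, res) => {
  const rows = await knex('table')
    .select('file_contents')
    .where({ file_name: req.params.file_name })
  if (rows.length === 0) return res.sendStatus(404)
  // rows[0].file_contents is a Buffer; Express sends it to the client as-is.
  res.set('Content-Type', 'image/png').send(rows[0].file_contents)
})

app.listen(3000)
```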
While we’re at it, I never liked the fact that Knex always returns an array of results, even when it is guaranteed that there will be at most one. Instead of file[0], I prefer to attach a .then(_.head) (if I use Lodash) or .then(R.head) (if I use Ramda) to the query, which gives me the (only) element of the array if there is one and undefined otherwise.
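If you’d rather not pull in a library just for this, the same helper is one line of plain JavaScript (a sketch; `head` is my own name here, not a Knex API):

```javascript
// head: the first element of an array, or undefined if the array is empty.
// Behaves like Lodash's _.head / Ramda's R.head for this purpose.
const head = rows => rows[0]

// Usage with the query above:
// const file = await knex('table').select('file_contents').where({file_name}).then(head)
```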
That’s it for today. Note that this can still be improved. For example, my approach reads the whole image into memory and then serves it. A better option – at least for large images – would be to use streams and pipe the data from the database to the web server without holding all of it in RAM – but that is another story, which I might describe another time. Good luck coding!