For the English part of the blog, see Content AND Presentation.

2020-02-24 Some psql tips

I have to admit that I’m more and more enamored of PostgreSQL. Since I often use its default client, psql, I started searching the Internet for some features I’d like it to have – and I wasn’t disappointed.

First of all, I am accustomed to what Bash does with my history: not putting commands beginning with a space into it, and saving identical subsequent history entries only once. It turns out that psql can do exactly the same – the manual even says:
Note: This feature was shamelessly plagiarized from Bash.

It turns out that it is enough to add the line

\set HISTCONTROL ignoreboth

to your ~/.psqlrc file (creating it if necessary).

This, however, is only the beginning. Here are a few things I find especially useful in psql.

First of all, there is the “expanded” mode, usually turned on by the \x metacommand. (“Metacommands” are commands preceded with a backslash and interpreted by psql itself, not the PostgreSQL server.) In expanded mode, instead of writing out each row on a single line of text, every column’s value is written out on its own line. This is very useful for queries returning many columns, but much less useful for queries returning only a handful. If you encounter both cases frequently, consider using \x auto, which automatically enables the expanded mode if the data returned by the query would be wider than the screen.
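To see the difference, here is a sketch of a short psql session (pg_settings is just a convenient built-in view with many columns):

```sql
-- Enable automatic expanded mode:
\x auto
-- A wide result is now printed one column per line:
SELECT * FROM pg_settings LIMIT 1;
-- A narrow one is still printed as a regular table:
SELECT name FROM pg_settings LIMIT 3;
```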

Another nice command is \C (that is, backslash followed by capital “C”). It sets the title of every table printed as a result of a query. This doesn’t seem very useful unless you decide to use the control character sequence clearing the terminal as the title. This way, every query will be presented on its own, starting with an empty screen. Of course, you do not need to remember that sequence – psql’s metacommands accept shell commands in backticks, so you can say \C `clear` and you’re good to go.

There are also a lot of commands which operate on the “current query buffer”. This expression means roughly “whatever precedes the metacommand”. (If there is nothing preceding it, they usually operate on the previous query instead.) For instance, \e opens your favorite text editor with the current query buffer or the previous query and lets you edit it in a nicer way than psql’s readline-based line editing. Another useful one is \gx, which turns the expanded mode on for just this query (or repeats the last query with expanded mode temporarily on). There is also \w, which writes the current query buffer (or the last query) into a file, and so on.
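For instance, you can type a query without the terminating semicolon – so that it stays in the query buffer – and then act on it with a metacommand:

```sql
-- Type the query, but do not terminate it with a semicolon...
SELECT relname, relkind FROM pg_class LIMIT 5
-- ...and run it with expanded mode on, just this once:
\gx
-- \w would instead write the buffer (or the last query) to a file:
\w /tmp/last-query.sql
```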

There are also some commands which act as wrappers around SQL’s filesystem-manipulating features, like COPY ... TO and a few others. The problem with COPY ... TO is that the file it generates is written by the server process and hence owned by the postgres user, which is usually not what we want. Therefore, psql includes a wrapper around COPY, called \copy, which reads or writes files as the user running psql. This can be quite handy both interactively and in scripts. (Metacommands can be used in psql scripts, did you know that? There is even a full set of conditionals!)
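A minimal sketch of the difference (the table and file names are made up for illustration):

```sql
-- Server-side COPY: the file is written by the server process,
-- so it needs an absolute path and ends up owned by its OS user:
COPY my_table TO '/tmp/my_table.csv' WITH (FORMAT csv, HEADER);
-- Client-side \copy: the file is written by the user running psql
-- (note: the whole \copy command must fit on one line):
\copy my_table TO 'my_table.csv' WITH (FORMAT csv, HEADER)
-- \copy also accepts a query instead of a table name:
\copy (SELECT * FROM my_table WHERE id < 100) TO 'subset.csv' WITH (FORMAT csv, HEADER)
```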

The last thing I wanted to mention is the \watch metacommand. It is very similar to the watch command-line utility, which runs a given command periodically and updates the screen with its standard output. The psql \watch has a few drawbacks – the biggest one being that it does not clear the screen before each execution of the given query. (This can be remedied with the \C `clear` trick, though.) Another potential issue is that it cannot do some things watch can, like highlighting differences. Still, it is much more comfortable than running psql under watch (since then quoting quickly becomes nightmarish).
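Combining \watch with the \C trick above gives a reasonable substitute for the watch utility – for example, to monitor current connections, refreshing every two seconds on a cleared screen:

```sql
\C `clear`
SELECT pid, state, query FROM pg_stat_activity
\watch 2
```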

CategoryEnglish, CategoryBlog, CategoryPostgreSQL

Comments on this page

2020-02-15 PostgreSQL and null values in jsonb

Some time ago I was writing some PostgreSQL queries involving columns of type jsonb. For some reason I could not understand, my query didn’t work as expected. I decided to dig a little deeper, and here is what I learned.

As we know, SQL (and PostgreSQL as well, of course) has the null value, which is kind of tricky (for instance, it is not equal to itself). That makes sense, because SQL’s null means “of unknown value”.

As we also know, JSON also has the null value (or type). In JavaScript, it is equal (and even identical) to itself. This also makes sense, because JSON’s null means “empty/no value”.

When we use PostgreSQL to query JSON values which may contain null values, however, hilarity ensues. First of all, if you want to actually type a JSON null in Postgres, you can say e.g. 'null'::jsonb (let’s agree on using jsonb and avoiding json, and calling it just JSON). Of course, this value is different from an SQL null. If you try to say null::jsonb instead, you get an SQL null, since (apparently) casting null to anything results in null (I could not find evidence for this in the PostgreSQL manual, but a few experiments seem to confirm this behavior, and it makes perfect sense).
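A few experiments in psql illustrate this:

```sql
SELECT 'null'::jsonb IS NULL;          -- false: a JSON null is a non-null jsonb value
SELECT null::jsonb IS NULL;            -- true: casting an SQL null yields an SQL null
SELECT 'null'::jsonb = 'null'::jsonb;  -- true: unlike SQL null, JSON null equals itself
```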

Now there are at least two ways one can end up with some kind of null when extracting parts of JSON objects: either an explicit null is present in the JSON, or the requested property is missing altogether. In the former case, we get a JSON null, and in the latter an SQL null, which is slightly weird given the semantics above, but I can accept it in light of the fact that SQL does not have an “empty” type/value, so there is no natural candidate for the job of representing a value not present in the JSON object (well, technically, PostgreSQL could throw an error in such a case, but it doesn’t – go figure).

Then, there are two ways one can extract something from JSON in PostgreSQL – as JSON (e.g. ->) or as text (e.g. ->>).

Combining the above we get the following results.

select '{"a": 1, "b": null}'::jsonb->'b';

yields a JSON null (that is, something which does not satisfy SQL’s is null predicate). However,

select '{"a": 1, "b": null}'::jsonb->'c';

gives a proper SQL null.
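The jsonb_typeof function makes the difference explicit (being strict, it returns an SQL null for an SQL null input):

```sql
SELECT jsonb_typeof('{"a": 1, "b": null}'::jsonb->'b');  -- the string 'null'
SELECT jsonb_typeof('{"a": 1, "b": null}'::jsonb->'c');  -- an SQL null
```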

Using the text extraction operators, however, we get an SQL null both times:

select '{"a": 1, "b": null}'::jsonb->>'b' is null;
select '{"a": 1, "b": null}'::jsonb->>'c' is null;

The latter is not unexpected at all – we get an SQL null, which (after casting to text) remains null. The former, however, came as a surprise to me. I thought it should be the SQL string 'null', not the SQL null, as casting an explicit JSON null to text gives that string.

The programmers behind PostgreSQL didn’t agree, though. I could not understand why, and I asked a question on Stack Overflow about it. After several hours I got a very detailed and interesting response. To cut a long story short, here is the rationale behind this seemingly strange behavior: if PostgreSQL returned 'null' (i.e., a string) in response to select '{"a": 1, "b": null}'::jsonb->>'b';, then the result would be indistinguishable from the result of select '{"a": 1, "b": "null"}'::jsonb->>'b';. One of the design principles of PostgreSQL, however, is that casting anything to text should give something parsable back to the original value (whenever possible).
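In other words, with my expected behavior the following two queries would be indistinguishable; as things stand, they differ:

```sql
SELECT '{"b": null}'::jsonb->>'b';    -- an SQL null
SELECT '{"b": "null"}'::jsonb->>'b';  -- the four-character string 'null'
```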

Well, nice to learn something new.

And while we are at PostgreSQL null and related topics, here is a nice tip I learned some time ago: if you put the line \pset null '∅' in your .psqlrc file, psql will render null values as the Unicode “empty set” symbol. It is extremely useful if you need to distinguish between null and an empty string in psql’s output (unless you have Unicode empty set symbols in your data, of course).

And finally, another tip, this time related to the jsonb type: the jsonb_strip_nulls function. Given a jsonb object, it removes all properties whose values are explicit nulls (and it does so recursively, so deeper levels of the hierarchy are affected, too). Note that it keeps nulls that are elements of arrays, though.
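For instance:

```sql
SELECT jsonb_strip_nulls('{"a": 1, "b": null, "c": {"d": null}, "e": [null, 2]}'::jsonb);
-- yields {"a": 1, "c": {}, "e": [null, 2]}
```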

CategoryEnglish, CategoryBlog, CategoryPostgreSQL


2020-02-10 My first steps with Lua

Well, the title of this post is a misnomer – I am already well past my first steps with Lua, since I’ve written a few (admittedly, very simple) Lua scripts. In the near future, though, I’d like to play around with Lua a tad more seriously. (I still do not have a lot of spare time for that, but we’ll see how it goes.)

One thing I missed when I tried Lua last time (it could have been a year or so ago) was a feature found in Node.js’s npm package manager. In Node, you can describe all the packages you need for your application/script/whatever and run npm install to have them installed in a local node_modules directory. While this is not necessarily optimal with respect to disk usage, it is very convenient – every app can be pretty much self-contained. When I researched Lua, it seemed to me that you cannot do a similar thing with Lua “rocks” (which is a cool name, btw) – you just have to install them globally (per system or per user).

Well, apparently I was wrong. It seems that it is possible to make Lua behave like Node and install modules under the current directory, and it was already possible to do that almost four years ago. The trick to do that (since it is a trick, this is not something Luarocks can do by default) is described here.

Since reading is one thing and learning is another, let us apply the method from that guide to install a Lua module locally. I decided to use Lua-CSV for that. (Spoiler alert: it wasn’t as simple as I wished.)

I started by creating an empty directory and installing the module there using Luarocks:

cd /tmp
rm -rf lua-csv-test
mkdir lua-csv-test
cd lua-csv-test
luarocks install --tree lua_modules lua-csv

Then, following the howto, I created the set-paths.lua file:

local version = _VERSION:match("%d+%.%d+")
print(version)
package.path = 'lua_modules/share/lua/' .. version .. '/?.lua;lua_modules/share/lua/' .. version .. '/?/init.lua;' .. package.path
package.cpath = 'lua_modules/lib/lua/' .. version .. '/?.so;' .. package.cpath

Finally, let us create the script which is going to actually use the Lua-CSV module. (Of course, I am going to utilize the example script from the README for that.)

local csv = require("lua-csv.csv")
local f = csv.open("file.csv")
for fields in f:lines() do
  for i, v in ipairs(fields) do print(i, v) end
end

The tricky part here was the call to require. It turns out that require("lua-csv") wouldn’t work, as it would try to find the file ./lua_modules/share/lua/5.3/lua-csv.lua, while the luarocks install command installed the module as ./lua_modules/share/lua/5.3/lua-csv/csv.lua. Frankly, I have no idea how to predict where the module file will land after installing, other than inspecting the lua_modules directory. (I told you these are my first steps with Lua!)

To top it off, we need the csv file itself:

Letter,Number
A,65
B,66

And now, I can say lua -l set-paths lua-csv-test.lua, and it Just Works™!

All I can say is that this is way more complicated than just npm install module followed by const module = require('module'), but that won’t discourage me. In the next installment (some day!), I am going to use Lua to process my Org agenda in csv format.

CategoryEnglish, CategoryBlog, CategoryLua

