Converting lines to JSON in bash

--raw-input, then --slurp

Just summarizing what the others have said in a hopefully quicker to understand form:

cat /etc/hosts | jq --raw-input . | jq --slurp .

will return:

[
  "fe00::0 ip6-localnet",
  "ff00::0 ip6-mcastprefix",
  "ff02::1 ip6-allnodes",
  "ff02::2 ip6-allrouters"
]

Explanation

 --raw-input/-R:

       Don't parse the input as JSON. Instead, each line of text is passed
       to  the  filter  as  a  string.  If combined with --slurp, then the
       entire input is passed to the filter as a single long string.

 --slurp/-s:

       Instead of running the filter for each JSON object  in  the  input,
       read  the entire input stream into a large array and run the filter
       just once.
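To see what the man page means by "a single long string", combine the two flags in one jq call (using the same `printf` trick as the other answers to generate two input lines):

```shell
# With -R and -s together, the entire input -- newlines included --
# becomes one JSON string:
printf %s\\n aa bb | jq -R -s .
# "aa\nbb\n"
```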

You can also use jq -R . to format each line as a JSON string, and then jq -s (--slurp) to collect the parsed lines into an array:

$ printf %s\\n aa bb|jq -R .|jq -s .
[
  "aa",
  "bb"
]

The method in chbrown's answer adds an empty element at the end if the input ends with a linefeed, but you can use printf %s "$(cat)" to strip trailing linefeeds first:

$ printf %s\\n aa bb|jq -R -s 'split("\n")'
[
  "aa",
  "bb",
  ""
]
$ printf %s\\n aa bb|printf %s "$(cat)"|jq -R -s 'split("\n")'
[
  "aa",
  "bb"
]
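If you'd rather stay in a single jq invocation, you can also drop the trailing empty element inside the filter itself. This is a sketch that assumes a jq new enough to support array slices (-c added here just to keep the output on one line):

```shell
# Split on newlines, then slice off the last (empty) element that the
# trailing linefeed produces:
printf %s\\n aa bb | jq -R -s -c 'split("\n") | .[:-1]'
# ["aa","bb"]
```

Note that unlike the printf %s "$(cat)" approach, this removes exactly one element, so it only does the right thing when the input ends with a single linefeed.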

If the input lines don't contain ASCII control characters (which have to be escaped in strings in valid JSON), you can use sed:

$ printf %s\\n aa bb|sed 's/["\]/\\&/g;s/.*/"&"/;1s/^/[/;$s/$/]/;$!s/$/,/'
["aa",
"bb"]

I was also trying to convert a bunch of lines into a JSON array, and was at a standstill until I realized that -s was the only way to handle more than one line at a time in a jq expression, even if that meant parsing the newlines manually.

jq -R -s -c 'split("\n")' < just_lines.txt
  • -R to read raw input
  • -s to read the entire input as a single string
  • -c to output compact (not pretty-printed) JSON
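For example, with inline input instead of just_lines.txt:

```shell
# Same flags, inline input; note the compact output and the trailing
# empty string produced by the final newline:
printf %s\\n aa bb | jq -R -s -c 'split("\n")'
# ["aa","bb",""]
```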

Easy peasy.

Edit: I'm on jq ≥ 1.4, which is apparently when the split built-in was introduced.
