How to insert (file) data into a PostgreSQL bytea column?

As superuser, create a helper function that imports the file as a server-side large object, copies its contents into a bytea, and then removes the large object:

create or replace function bytea_import(p_path text, p_result out bytea) 
                   language plpgsql as $$
declare
  l_oid oid;
begin
  -- import the file as a large object, read it into bytea, then drop the large object
  select lo_import(p_path) into l_oid;
  select lo_get(l_oid) into p_result;
  perform lo_unlink(l_oid);
end;$$;

lo_get was introduced in PostgreSQL 9.4, so for older versions you would need:

create or replace function bytea_import(p_path text, p_result out bytea) 
                   language plpgsql as $$
declare
  l_oid oid;
  r record;
begin
  p_result := '';
  -- import the file as a large object
  select lo_import(p_path) into l_oid;
  -- read the large object page by page and append each chunk to the result
  for r in ( select data 
             from pg_largeobject 
             where loid = l_oid 
             order by pageno ) loop
    p_result := p_result || r.data;
  end loop;
  -- remove the temporary large object
  perform lo_unlink(l_oid);
end;$$;

Then use it in an insert:

insert into my_table(bytea_data) select bytea_import('/my/file.name');
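
To sanity-check that the whole file made it in, you can compare the stored length with the file size on disk; my_table and bytea_data are simply the names from the insert above:

select octet_length(bytea_data) from my_table;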

Alternatively, use pg_read_file('location_of_file')::bytea.

For example,

create table test(id int, image bytea);
insert into test values (1, pg_read_file('/home/xyz')::bytea);
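
If your PostgreSQL version provides pg_read_binary_file, it returns bytea directly and avoids the text cast; a minimal sketch reusing the same table:

insert into test values (2, pg_read_binary_file('/home/xyz'));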

See the PostgreSQL manual on pg_read_file and the other file access functions for details.

The following approach isn't exactly efficient in terms of runtime, but it's trivially easy compared to writing your own headers for COPY BINARY, and it doesn't require any libraries or scripting languages outside of bash.

First, convert the file into a hex dump, which doubles its size. xxd -p gets us pretty close, but it inserts newlines that we have to strip:

xxd -p /path/file.bin | tr -d '\n' > /path/file.hex

Next, import the data into PostgreSQL as a very large text field. This type holds up to one GB per field value, so we should be okay for most purposes:

CREATE TABLE hexdump (hex text);
COPY hexdump FROM '/path/file.hex';

Now that our data is a gratuitously large hex string, we use PostgreSQL's decode to get it into a bytea type:

CREATE TABLE bindump AS SELECT decode(hex, 'hex') FROM hexdump;
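
If the goal is to load the decoded value into an existing table rather than a new one, a small variant works too; my_table and bytea_data below are only placeholder names:

INSERT INTO my_table (bytea_data) SELECT decode(hex, 'hex') FROM hexdump;
DROP TABLE hexdump;  -- optional: clean up the staging table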