Hernando, my use case for this is when the data are small enough to load in R, but not so small that one can put up with a bunch of INSERT statements (the default when you use dplyr::copy_to()). Sometimes it is more convenient to initially load and wrangle a dataset in R, but when you later need it in Postgres, this is a quick-and-dirty way to get it there. If the data are not just a few rows but millions, a bunch of INSERT statements would take forever.

pg_copy_data(
  con,
  data,
  table_name,
  sep = "|",
  drop_table = TRUE,
  create_table = TRUE,
  if_not_exists = TRUE,
  execute = TRUE
)
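To make the arguments concrete, here is a rough sketch of the kind of SQL a call with the defaults above implies. This is illustrative only: the table name and column definitions are hypothetical, and the actual function derives the schema from the data frame.

# Illustrative only -- roughly the statements implied by
# drop_table = TRUE, create_table = TRUE, if_not_exists = TRUE, sep = "|".
# The table name and column types here are hypothetical.
sql <- paste(
  "DROP TABLE IF EXISTS my_table;",
  "CREATE TABLE IF NOT EXISTS my_table (mpg numeric, cyl integer);",
  "COPY my_table FROM STDIN WITH (FORMAT csv, DELIMITER '|');",
  sep = "\n"
)
cat(sql)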

Arguments

con

A database connection.

data

A data frame with the data to copy to Postgres.

table_name

Name of the destination table in Postgres.

sep

The separator between columns in the text representation of the data sent to Postgres via COPY. Defaults to "|".

drop_table

Logical; whether to drop the table before creating it.

create_table

Logical; whether to create the table. If FALSE, the table is assumed to already exist.

if_not_exists

Logical; whether to add IF NOT EXISTS to the CREATE TABLE statement.

execute

Logical; whether to execute the query using con.

Value

This function is called for its side effect of copying the data into Postgres.
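
Examples

A minimal usage sketch; the connection details (RPostgres driver, dbname) are assumptions for illustration, not part of this function's API.

library(DBI)

# Hypothetical connection; adjust dbname/host/user for your setup
con <- dbConnect(RPostgres::Postgres(), dbname = "mydb")

# Wrangle in R first, then bulk-load into Postgres in a single COPY
pg_copy_data(con, mtcars, table_name = "mtcars")

dbDisconnect(con)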