Hernando, my use case for this is when the data are small enough to load into R, but not so small that you can put up with a bunch of INSERT statements (the default when you use dplyr::copy_to()). Sometimes it is more convenient to load and wrangle a dataset in R first; when you later need it in Postgres, this is a quick-and-dirty way to get it there. If the data are not just a few rows but millions, loading them with a series of INSERT statements would take forever.
pg_copy_data(
  con,
  data,
  table_name,
  sep = "|",
  drop_table = TRUE,
  create_table = TRUE,
  if_not_exists = TRUE,
  execute = TRUE
)
con: A database connection.

data: A data frame to copy to Postgres.

table_name: Name of the table in Postgres.

sep: The separator between columns. Defaults to the character in the set [,\t |;:] that splits a sample of rows into the most lines with the same number of fields. Use NULL or "" to specify no separator; i.e. each line becomes a single character column, as base::readLines does.

drop_table: Logical; whether to drop the table before creating it.

create_table: Logical; TRUE if the table should be created. Otherwise, the table is assumed to already exist.

if_not_exists: Logical; whether to add IF NOT EXISTS to the CREATE TABLE query.

execute: Logical; whether to execute the query using con.
Called for its side effect of loading the data into Postgres.
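A minimal usage sketch of the workflow described above, assuming a reachable Postgres database and that pg_copy_data() is already sourced; the connection details (dbname, host, user) are placeholders you would adjust for your own setup.

```r
library(DBI)
library(RPostgres)

# Connect to Postgres (placeholder credentials -- adjust for your environment)
con <- dbConnect(RPostgres::Postgres(), dbname = "mydb",
                 host = "localhost", user = "me")

# Load and wrangle the dataset in R first
data <- subset(mtcars, mpg > 20)

# Bulk-load it into Postgres via COPY instead of row-by-row INSERTs
pg_copy_data(
  con,
  data,
  table_name = "mtcars_efficient",
  sep = "|",
  drop_table = TRUE,
  create_table = TRUE,
  if_not_exists = TRUE,
  execute = TRUE
)

dbDisconnect(con)
```

For millions of rows this matters because COPY streams the whole table in a single round trip, rather than issuing one INSERT per row as dplyr::copy_to() does by default.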