Monday, 15 July 2013

Saving large data.frame into PostgreSQL with R

I'm saving a big data.frame (30 columns by 1,000,000 rows) to a PostgreSQL database from R, and it kills my PC. The result of the calculations is produced by dplyr, so I wouldn't mind using the built-in functionality of the package, but copy_to doesn't work for such huge tables. Any suggestions?

Can you write the data.frame out to a CSV or tab-delimited text file and then load it into PostgreSQL with the COPY command [1]? COPY implements a bulk-load approach that may perform much faster; see the sketch below.
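A minimal sketch of that route, assuming the target table mytable already exists with matching columns, the database is named mydb, and psql can connect without a password prompt (all names here are placeholders):

# df is the large data.frame to save.
# 1. Dump it as tab-delimited text. "\\N" is COPY's default NULL marker
#    in text format; quoting is off because text-format COPY does not
#    use CSV-style quotes (assumes no tabs or newlines in the fields).
tmp <- tempfile(fileext = ".tsv")
write.table(df, tmp, sep = "\t", quote = FALSE,
            row.names = FALSE, col.names = FALSE, na = "\\N")

# 2. Bulk-load the file. psql's \copy streams it from the client side,
#    so no server-side file access or superuser rights are needed.
cmd <- sprintf("\\copy mytable FROM '%s'", tmp)
system(paste("psql mydb -c", shQuote(cmd)))

unlink(tmp)  # remove the temporary file

If the PostgreSQL server can read the file directly, a server-side COPY mytable FROM '/path/to/file' does the same job without the client round trip.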

In some cases it may even be possible to use Rscript to emit the data as a stream and pipe it straight into psql:

<Rscript emitting tab-delimited rows> | psql -c "COPY <tablename> (columnlist, ...) FROM STDIN (FORMAT text)"

For long-running loads, put "| pv |" in the middle of the pipeline to track progress (http://www.ivarch.com/programs/pv.shtml).
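Putting the pieces together, a hypothetical end-to-end pipeline might look like this (df.rds, mydb, mytable, and the column names are all placeholders):

# Stream the rows from R, watch throughput with pv, and let COPY
# read the data from psql's standard input.
Rscript -e 'write.table(readRDS("df.rds"), stdout(), sep = "\t",
  quote = FALSE, row.names = FALSE, col.names = FALSE, na = "\\N")' \
  | pv \
  | psql mydb -c "COPY mytable (id, value) FROM STDIN (FORMAT text)"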

[1] http://www.postgresql.org/docs/current/interactive/sql-copy.html

Tags: r, postgresql, dplyr
