dump a perl array into a psql DB via 'copy'; help?
Alex Beamish
talexb-Re5JQEeQqe8AvxtiuMwx3w at public.gmane.org
Wed Dec 15 22:48:30 UTC 2004
On Wed, 15 Dec 2004 01:38:56 -0500, Madison Kelly <linux-5ZoueyuiTZhBDgjK7y7TUQ at public.gmane.org> wrote:
> Hi all (again, I know)
>
> Quick update first; a while back I was asking for advice on how to
> speed up database performance, and several people suggested avoiding
> calling and parsing 'ls' and instead using 'readdir'. I avoided that
> at the time because I needed all of each file's information. Well,
> with 'stat' and '$size = -s $file' I can get it now (see the sketch
> below). With that and other improvements my performance has
> increased more than five-fold.
>
> But I want more. :)
>
> What I'm doing currently is opening a file, starting it with the
> 'psql' COPY command, then writing a line of data for each file being
> processed, and finally capping off the file with '\.' (also sketched
> below). With this written (currently taking 11 seconds to process
> 22,000 files on my machine) I then call 'psql' to read in the
> contents. This works, but the read alone takes another 31 seconds. I
> know this sounds somewhat trivial, but I need it to be faster.
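A minimal sketch of the readdir/stat scan described in the quoted
message, assuming a hypothetical directory in $dir; the fields
collected are illustrative only:

    # Scan a directory with readdir and stat instead of parsing 'ls'.
    my @rows;
    opendir my $dh, $dir or die "Can't open $dir: $!";
    while (defined(my $name = readdir $dh)) {
        next if $name eq '.' or $name eq '..';
        my $path  = "$dir/$name";
        my $size  = -s $path;            # size in bytes
        my $mtime = (stat $path)[9];     # last-modified time
        push @rows, [ $name, $size, $mtime ];
    }
    closedir $dh;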
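And a sketch of the intermediate COPY file the poster describes,
again with hypothetical table and column names:

    # Write a psql script: a COPY header, one tab-separated line per
    # file, and the '\.' end-of-data marker.
    open my $out, '>', 'load.sql' or die "Can't write load.sql: $!";
    print $out "COPY file_info (name, size, mtime) FROM stdin;\n";
    print $out join("\t", @$_), "\n" for @rows;
    print $out "\\.\n";
    close $out;
    # Loaded afterwards with: psql -d mydb -f load.sql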
We just discussed something similar on Perl Monks:
http://perlmonks.org/?node_id=414600
My own contribution suggested using LOAD DATA INFILE (MySQL's bulk
loader) or some variety of bcp, the Bulk Copy Program shipped with
Sybase and SQL Server, if available for your installation.
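On PostgreSQL the server-side equivalent of those bulk loaders is
COPY itself, which the poster is already using; the remaining win is
to stream the rows over an open connection instead of writing a file
and shelling out to psql. A sketch assuming a DBD::Pg recent enough
to expose the pg_putcopydata API, with a hypothetical connection
string and table:

    use DBI;

    # Stream rows straight into COPY, skipping the intermediate file
    # and the extra psql process.
    my $dbh = DBI->connect('dbi:Pg:dbname=mydb', 'user', 'pass',
                           { RaiseError => 1, AutoCommit => 0 });
    $dbh->do('COPY file_info (name, size, mtime) FROM STDIN');
    $dbh->pg_putcopydata(join("\t", @$_) . "\n") for @rows;
    $dbh->pg_putcopyend();
    $dbh->commit;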
Alex