Postgres/Perl performance help needed

Fraser Campbell fraser-eicrhRFjby5dCsDujFhwbypxlwaOVQ5f at public.gmane.org
Tue May 25 23:43:43 UTC 2004


On Tuesday 25 May 2004 16:16, Madison Kelly wrote:

>    Before I started messing with things I recorded a directory with 2,490
> files and folders (just the names, obviously!) in 23 seconds, which was
> not reasonable. When I tried to record a filesystem with 175,000 records
> it took 32 minutes... Since I started tweaking, the same number of
> records takes 35 seconds.

How are you connecting?  I see a variable $DB but I'm not sure what type of 
object it is.  Is it possible that you are reconnecting to the database on 
every single select/insert/update?  That will massively slow down the 
operations.  I'm not a DBI (or Perl) expert, but perhaps if you can show how 
you're establishing the database connection and doing the subsequent 
selects/inserts/deletes, someone else can chip in.
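
Just to illustrate what I mean, something along these lines is how I'd do it 
with DBI -- a rough sketch only, the database, table and column names are all 
made up since I haven't seen your schema:

  #!/usr/bin/perl
  use strict;
  use warnings;
  use DBI;

  # Connect ONCE, before the loop, and reuse the same handle everywhere.
  # Database name, user and password here are just placeholders.
  my $DB = DBI->connect("dbi:Pg:dbname=backup", "user", "password",
                        { RaiseError => 1, AutoCommit => 1 })
      or die "Can't connect: $DBI::errstr";

  # Prepare the INSERT once; only the bound values change per file.
  my $sth = $DB->prepare(
      "INSERT INTO files (file_dir, file_name) VALUES (?, ?)");

  # Stand-in for your real directory scan.
  my @file_list = ( { dir => "/home", name => "example.txt" } );

  foreach my $entry (@file_list) {
      $sth->execute($entry->{dir}, $entry->{name});
  }

  $DB->disconnect;

If your code is instead calling DBI->connect inside the loop, that alone 
could explain the slowdown you saw.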

I expect that having indexes on the table could slow you down (at least for 
inserts), since every index has to be updated on each insert.
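
If it's a one-time bulk load you could even drop the index first and rebuild 
it when the load is done.  Roughly like this, using the same $DB handle -- 
the index and table names are invented for illustration:

  # Drop the index before the bulk load, rebuild it once afterwards.
  # "files_name_idx" and "files" are made-up names.
  $DB->do("DROP INDEX files_name_idx");
  # ... run all of the inserts here ...
  $DB->do("CREATE INDEX files_name_idx ON files (file_name)");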

If you can't get PostgreSQL working right, you could always dump it for MySQL. 
MySQL does have some licensing quirks these days that may (legitimately) give 
you second thoughts, but don't let the database guys tell you that it's "just 
an interface to files" (or something like that).  Still, I'd give PostgreSQL 
an honest shot since you started out with it.

MySQL has never been a bottleneck for me, but you can only know for sure about 
your own app by trying it.

-- 
Fraser Campbell <fraser-Txk5XLRqZ6CsTnJN9+BGXg at public.gmane.org>                 http://www.wehave.net/
Georgetown, Ontario, Canada                               Debian GNU/Linux