[Dbix-class] Memory usage, something hanging on?

Adam Sjøgren adsj at novozymes.com
Mon Jan 8 15:51:50 GMT 2007


  Hi.


In a Catalyst application I am working on, which uses DBIx::Class,
the server.pl process grew to over 300 MB of RAM and stayed that size
from then on.

I've tried to replicate what happens, and I hope someone can comment
on whether this is expected behaviour or something else.

To test this, I create a testdb, and make a simple table:

 CREATE TABLE testtable (
   id serial,
   title text,
   fill text,
   primary key (id)
 );

TestDB.pm:

 package TestDB;

 use strict;
 use warnings;
 use base qw/DBIx::Class::Schema/;

 __PACKAGE__->load_classes();

 1;

TestDB/Testtable.pm:

 package TestDB::Testtable;

 use strict;
 use warnings;

 use base 'DBIx::Class';

 __PACKAGE__->load_components('PK::Auto', 'Core');
 __PACKAGE__->table('testtable');
 __PACKAGE__->add_columns(qw(id title fill));
 __PACKAGE__->set_primary_key('id');

 1;

I fill this table with rather large records:

 #!/usr/bin/perl

 use strict;
 use warnings;

 use TestDB;

 my $schema=TestDB->connect("dbi:Pg:dbname=testdb;host=localhost", "user", "pass", { AutoCommit=>0 });

 my $long_string='afahabada' x 20000;

 my $i=0;
 while ($i<1000) {
     $schema->resultset('TestDB::Testtable')->create({
                                                      title=>'title_' . $i++,
                                                      fill=>$long_string,
                                                     });
 }

 $schema->txn_commit;

Then I keep an eye on memory usage (watch 'ps aux | grep [p]erl')
while running this simple test-script:

 #!/usr/bin/perl

 use strict;
 use warnings;

 use TestDB;

 my $schema=TestDB->connect("dbi:Pg:dbname=testdb;host=localhost", "user", "pass", { AutoCommit=>0 });

 do_search();

 print "sleeping\n";
 sleep 10000;

 exit 0;

 sub do_search {
     my $xs=$schema->resultset('TestDB::Testtable')->search();
     print $xs->count . "\n";
     while (my $x=$xs->next) {
         ;
     }
 }

What happens is that memory usage grows to ~250 MB when ->search() is
called and the rows are fetched. This I more or less expected.

But when do_search() returns, the memory is still in use. The only
ways I can free it are to terminate the script or to call
$schema->storage->disconnect. I would have guessed the memory would be
freed when $xs goes out of scope(-ish)?
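
One thing I have been wondering about: as far as I understand, DBD::Pg
(through libpq) pulls the whole result set into client memory when
execute() is called, unless you manage a server-side cursor yourself.
A plain-DBI sketch of what I mean (untested; the cursor name is made
up):

 sub do_search_cursor {
     my $dbh = $schema->storage->dbh;
     # A server-side cursor fetches rows in batches instead of all at
     # once; DECLARE requires a transaction, which AutoCommit => 0
     # already gives us.
     $dbh->do('DECLARE testcursor CURSOR FOR SELECT id, title, fill FROM testtable');
     while (1) {
         my $rows = $dbh->selectall_arrayref('FETCH 100 FROM testcursor');
         last unless @$rows;
     }
     $dbh->do('CLOSE testcursor');
 }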

If I call do_search() multiple times, memory usage falls when a new
do_search() starts and then climbs back to approximately the same
maximum. That suggests it is not really a leak: the old lump is freed
just before a new one is built up. This is just guesswork on my part,
though.

(I know I should use paging, and I will, but I'm still curious whether
this behaviour is intended.)
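
For reference, the paged version I have in mind looks roughly like
this (untested sketch; rows/page are the standard resultset search
attributes, and ->pager comes from the bundled Data::Page support):

 sub do_search_paged {
     my $page = 1;
     while (1) {
         # Fetch at most 50 rows per query instead of the whole table.
         my $rs = $schema->resultset('TestDB::Testtable')
                         ->search(undef, { rows => 50, page => $page });
         while (my $x = $rs->next) {
             ;
         }
         last if $page >= $rs->pager->last_page;
         $page++;
     }
 }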


I am using DBIx::Class 0.07003, DBI 1.51, DBD::Pg 1.49, perl 5.8.8 and
PostgreSQL 8.1 on Ubuntu 6.10/x86 (edgy).


  Best regards,

     Adam

-- 
                                                          Adam Sjøgren
                                                    adsj at novozymes.com
