[Dbix-class] Memory usage, something hanging on?

Matt S Trout dbix-class at trout.me.uk
Mon Jan 8 16:59:20 GMT 2007


On 8 Jan 2007, at 15:51, Adam Sjøgren wrote:

>   Hi.
>
>
> In a Catalyst application I am working on, in which we use
> DBIx::Class, server.pl grew to over 300 MB of RAM and stayed that
> size from then on.
>
> I've tried to replicate what happens, and I hope someone will comment
> on whether this is expected behaviour or something else.
>
> To test this, I create a testdb, and make a simple table:
>
>  CREATE TABLE testtable (
>    id serial,
>    title text,
>    fill text,
>    primary key (id)
>  );
>
> TestDB.pm:
>
>  package TestDB;
>
>  use strict;
>  use warnings;
>  use base qw/DBIx::Class::Schema/;
>
>  __PACKAGE__->load_classes();
>
>  1;
>
> TestDB/Testtable.pm:
>
>  package TestDB::Testtable;
>
>  use strict;
>  use warnings;
>
>  use base 'DBIx::Class';
>
>  __PACKAGE__->load_components('PK::Auto', 'Core');
>  __PACKAGE__->table('testtable');
>  __PACKAGE__->add_columns(qw(id title fill));
>  __PACKAGE__->set_primary_key('id');
>
>  1;
>
> I fill this table with rather large records:
>
>  #!/usr/bin/perl
>
>  use strict;
>  use warnings;
>
>  use TestDB;
>
>  my $schema=TestDB->connect("dbi:Pg:dbname=testdb;host=localhost",
>                             "user", "pass", { AutoCommit=>0 });
>
>  my $long_string='afahabada' x 20000;
>
>  my $i=0;
>  while ($i<1000) {
>      $schema->resultset('TestDB::Testtable')->create({
>          title => 'title_' . $i++,
>          fill  => $long_string,
>      });
>  }
>
>  $schema->txn_commit;
>
> Then I keep an eye on memory usage (watch 'ps aux | grep [p]erl')
> while running this simple test-script:
>
>  #!/usr/bin/perl
>
>  use strict;
>  use warnings;
>
>  use TestDB;
>
>  my $schema=TestDB->connect("dbi:Pg:dbname=testdb;host=localhost",
>                             "user", "pass", { AutoCommit=>0 });
>
>  do_search();
>
>  print "sleeping\n";
>  sleep 10000;
>
>  exit 0;
>
>  sub do_search {
>      my $xs=$schema->resultset('TestDB::Testtable')->search();
>      print $xs->count . "\n";
>      while (my $x=$xs->next) {
>          ;
>      }
>  }
>
> What happens is that the memory usage grows to ~250 MB when
> ->search() is called. This I sort of expected.
>
> But when do_search() returns, the memory is still in use. The only
> way I can get it released is by terminating the script or calling
> $schema->storage->disconnect. I would have guessed the memory would
> be freed when $xs goes out of scope(-ish)?
>
> If I call do_search() multiple times, memory usage falls when a new
> do_search() starts, then climbs back to approximately the same
> maximum, suggesting that it is not really a leak and that the lump
> is freed just before a new one is built up. This is just guessing
> on my part, though.

Sadly, I think I can tell you exactly what this is, and if I'm right
it's nothing we can help you with :(

DBD::Pg doesn't use cursors; it fetches the entire result set into
local memory. DBIx::Class uses prepare_cached to get the $sth.
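
To see why that matters: prepare_cached keys on the SQL text and hands
back the same statement handle every time, so the handle (and whatever
DBD::Pg has buffered inside it) lives as long as the $dbh, not as long
as your resultset. A minimal plain-DBI sketch (not DBIx::Class
internals; connection details taken from your test script):

  #!/usr/bin/perl

  use strict;
  use warnings;

  use DBI;

  my $dbh = DBI->connect("dbi:Pg:dbname=testdb;host=localhost",
                         "user", "pass", { AutoCommit => 0 });

  # Both calls return the very same handle object.
  my $sth1 = $dbh->prepare_cached('SELECT id, title, fill FROM testtable');
  my $sth2 = $dbh->prepare_cached('SELECT id, title, fill FROM testtable');
  print "same handle\n" if $sth1 == $sth2;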

So, I believe what you're seeing is DBD::Pg fetching the full dataset
and not releasing it until the $sth is re-executed by the next search.
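
If you want to keep the resident set small, one workaround is to go
through plain DBI with a server-side cursor and FETCH in batches, so
only one batch at a time ever lands in Perl memory. A sketch (the
cursor name and batch size are arbitrary; cursors only live inside a
transaction, which AutoCommit => 0 already gives you):

  use DBI;

  my $dbh = DBI->connect("dbi:Pg:dbname=testdb;host=localhost",
                         "user", "pass",
                         { AutoCommit => 0, RaiseError => 1 });

  $dbh->do('DECLARE big_csr CURSOR FOR'
           . ' SELECT id, title, fill FROM testtable');

  while (1) {
      # Only these 100 rows are held client-side at any moment.
      my $sth = $dbh->prepare('FETCH 100 FROM big_csr');
      $sth->execute;
      my $got = 0;
      while (my $row = $sth->fetchrow_hashref) {
          $got++;
          # ... process $row here ...
      }
      last unless $got;    # an empty FETCH means the cursor is done
  }

  $dbh->do('CLOSE big_csr');
  $dbh->commit;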

Try forcing the $dbh to disconnect at the end of do_search and see if
that drops the memory.
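
Something like this, i.e. the do_search from your mail with the
disconnect you already found bolted on at the end (a sketch;
DBIx::Class should reconnect transparently on the next query):

  sub do_search {
      my $xs = $schema->resultset('TestDB::Testtable')->search();
      print $xs->count . "\n";
      while (my $x = $xs->next) {
          ;
      }
      # Dropping the connection discards the cached $sth and whatever
      # DBD::Pg buffered inside it.
      $schema->storage->disconnect;
  }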

-- 
Matt S Trout, Technical Director, Shadowcat Systems Ltd.
Offering custom development, consultancy and support contracts for
Catalyst, DBIx::Class and BAST. Contact mst (at) shadowcatsystems.co.uk
for details.
+ Help us build a better perl ORM: http://dbix-class.shadowcatsystems.co.uk/ +
