[Dbix-class] Re: Using DBIx::Class to copy data between different RDBMS

Phillip Moore w.phillip.moore at gmail.com
Wed Dec 2 21:30:43 GMT 2009

Um, let me try that again, after that premature send....

I got this working (I think), by simply extracting the row data using
get_columns.   So, the logic reduced to:

        my $rs_in = $schema_in->resultset( $source );

        my $rs_out = $schema_out->resultset( $source );


        foreach my $row ( $rs_in->all ) {
            $rs_out->create({ $row->get_columns });
        }

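As an aside, if a create() call per row turns out to be slow for large
tables, the same hashrefs can be handed to the resultset's populate()
method in one batch.  A sketch, assuming the default unrestricted column
selection so get_columns returns the full row:

```perl
# Bulk-insert alternative: build one hashref per source row and pass
# the whole batch to populate().  Called in void context, populate()
# takes a fast multi-row INSERT path instead of one create() per row.
my @data = map { { $_->get_columns } } $rs_in->all;
$rs_out->populate( \@data );
```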
Now I recall why I thought get_columns might not work: if I understand
the docs correctly, get_columns() only returns key/value pairs for the
CACHED columns from the database.  If I get a row, and dbic only gets
me a subset of the available columns, is there an easy way to force
dbic to get them ALL?

With Class::DBI, I dealt with this by defining all the columns as part
of the Essential column group, but column groups are a feature that
appears to be missing from DBIx::Class.   I used them very heavily, and
had to roll my own mechanism for grouping them.  Wasn't rocket
science.

Do I need to force a query of every possible column for each row, to
ensure that lazy population doesn't result in missing data in my
SQLite copy of the data?
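For what it's worth, DBIx::Class selects every column defined on the
result source by default; columns only go missing when a resultset is
built with a restricting columns/select attribute.  If you want to be
explicit anyway, the full column list can be requested by name.  A
sketch using the documented result_source() accessor:

```perl
# Ask for every column the source defines, so nothing is omitted and
# get_columns returns the complete row.
my @all_cols = $rs_in->result_source->columns;
my $full_rs  = $rs_in->search( undef, { columns => \@all_cols } );

# Alternatively, a single row object can be refetched in full from
# storage with $row->discard_changes;
```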

On Wed, Dec 2, 2009 at 2:54 PM, Phillip Moore <w.phillip.moore at gmail.com> wrote:
> I have a DBIx::Class application that manages a lot of metadata for a
> global configuration management system.   A lot of this data is
> relatively static, and I want to be able to code some utilities that
> can access this data without granting those applications direct access
> to the RDBMS where the data is managed (MySQL or Oracle).
> Since DBIx::Class works with SQLite, my plan was to define the same
> schema definitions in a SQLite database, distribute that globally so
> that there are local copies in each of my remote data centers, and
> then use the same DBIx::Class API to access the data.   The only
> difference would be that the DBI connection string will specify the
> SQLite file (which will be accessed via a readonly NFS mount, FWIW),
> instead of the master read/write MySQL or Oracle database.
> I've written all the code to implement this, and after wasting my time
> with DBD::AnyData (which can convert FROM a DBI source, but not TO a
> DBI source :-( ), now I've got code that just creates two schema
> objects: one for the source database (MySQL/Oracle) and one for the
> target database (SQLite).
> It seems to me there should be a relatively easy way, for a given
> DBIx::Class::ResultSet, to get all the rows from the source database,
> and insert them into the target.   I can't figure it out.   I've got
> something like this:
> my $schema_in = $class->connect( <<arguments for the source database >> );
> my $schema_out = $class->connect( <<arguments for the target database >> );
> foreach my $source ( $schema_in->sources ) {
>    my $rs_in = $schema_in->resultset( $source );
>    my $rs_out = $schema_out->resultset( $source );
>    my @rows = $rs_in->all;
>    $rs_out->delete_all;
>    # OK..... Now what?
> }
> FWIW, I'm regenerating the contents from scratch, so one of the things
> I do is empty out the target table, and my intention was just to
> insert all the rows from the source, without worrying about (a) if
> they are present or not, and (b) deleting stale records.
> Each element of @rows is a DBIx::Class::Row object, but I can't just
> pass that to $rs_out->create().  From what I can see, I would need to
> enumerate the columns, by name, and create a hashref for the create()
> method.  I can see how to do that, but I am suspecting there's a much
> simpler way to accomplish this.
> Groping the dbic docs hasn't produced any hints.
> Suggestions?
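
Putting the pieces together, the approach from the reply above reduces
to the following sketch.  It assumes, as in the original question, that
the same result classes are registered in both schemas:

```perl
# Copy every source's rows from the input schema to the output schema,
# regenerating the target tables from scratch.
foreach my $source ( $schema_in->sources ) {
    my $rs_in  = $schema_in->resultset($source);
    my $rs_out = $schema_out->resultset($source);

    $rs_out->delete_all;    # empty the target table first

    foreach my $row ( $rs_in->all ) {
        # get_columns returns a name => value list of the row's
        # loaded columns, which create() accepts as a hashref
        $rs_out->create({ $row->get_columns });
    }
}
```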
