[Catalyst] DBIx::Class and managing schemas

Daniel McBrearty danielmcbrearty at gmail.com
Mon Jun 12 21:53:21 CEST 2006


I'm currently migrating engoi from its current MySQL schema to Postgres.
Along the way, the schema is changing shape quite drastically, for
various reasons.

Here's how I'm doing it:

1. writing the new pg schema in a single file (pg.sql)
2. writing a migrate script that gets data out of the old db and into the new
3. testing and updating the Cat app
4. repeating for the next chunk of the original schema.
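Step 2 could look roughly like the sketch below: plain DBI against the old
MySQL db, DBIx::Class against the new Pg one. The Engoi::Schema class, the
connection details, the table and the renamed columns are all made-up
placeholders, not anything from the real app.

```perl
#!/usr/bin/env perl
use strict;
use warnings;
use DBI;
use Engoi::Schema;   # assumed DBIx::Class schema for the new pg layout

# old db via plain DBI, new db via DBIx::Class
my $old = DBI->connect('dbi:mysql:engoi_old', 'user', 'pass',
                       { RaiseError => 1 });
my $new = Engoi::Schema->connect('dbi:Pg:dbname=engoi', 'user', 'pass');

# pull rows out of the old table and reshape them into the new one
my $sth = $old->prepare('SELECT id, word, lang FROM vocab');
$sth->execute;
while ( my $row = $sth->fetchrow_hashref ) {
    $new->resultset('Word')->create({
        id       => $row->{id},
        text     => $row->{word},   # column renamed in the new schema
        language => $row->{lang},
    });
}
```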

Everything uses DBIx::Class now (can't believe I was rolling my own
ORM a year back ...)

My question: given that this new schema will not be set in stone (it
will get added to as we add new functions, though hopefully we will
not have to rehash the whole thing again) - am I making life harder
than it needs to be in the future?

What I can see happening in the future is this:

1. dev a new feature, along with scripts to add to and maybe
populate the schema as needed.
2. when ready, use the scripts to update the prod server.
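One way to keep step 2 repeatable (an assumption on my part, not something
from the post) is to number the per-feature SQL scripts and have a small
deploy script apply any that haven't run yet. The directory name and the
applied_upgrade bookkeeping table here are hypothetical.

```perl
#!/usr/bin/env perl
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect('dbi:Pg:dbname=engoi', 'user', 'pass',
                       { RaiseError => 1, AutoCommit => 0 });

# numbered scripts sort into the order they should run
for my $file ( sort glob 'upgrades/*.sql' ) {

    # skip scripts already recorded in the bookkeeping table
    my ($done) = $dbh->selectrow_array(
        'SELECT count(*) FROM applied_upgrade WHERE filename = ?',
        undef, $file );
    next if $done;

    open my $fh, '<', $file or die "$file: $!";
    my $sql = do { local $/; <$fh> };   # slurp the whole script

    $dbh->do($sql);
    $dbh->do( 'INSERT INTO applied_upgrade (filename) VALUES (?)',
              undef, $file );
    $dbh->commit;
    print "applied $file\n";
}
```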

I just had a speed read of the DBIx::Class Cookbook, and saw some stuff
about managing schema versioning. I'm just wondering if my current way
of working is going to be something I'm going to regret down the line
... to be honest I don't have too much time to read all the docs, and
I'm not sure if I'd get all the implications on a first reading anyhow
...
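For reference, the versioning support the Cookbook points at is
DBIx::Class::Schema::Versioned: you put a version number on the schema class
and it can deploy/upgrade against SQL diffs kept in a directory. Roughly (the
class name, version numbers and paths below are just illustrative):

```perl
package Engoi::Schema;   # placeholder name for the app's schema class
use base 'DBIx::Class::Schema';

# bump this each time the schema changes shape
our $VERSION = '0.002';

__PACKAGE__->load_classes;
__PACKAGE__->load_components('Schema::Versioned');
__PACKAGE__->upgrade_directory('sql/upgrades/');

1;
```

A connected schema can then have ->upgrade called on it to bring an older
database up to the current $VERSION.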

thanks

D


-- 
Daniel McBrearty
email : danielmcbrearty at gmail.com
www.engoi.com : the multi-language vocab trainer
BTW : 0873928131


