[Catalyst-dev] Important Question About Recommended Environment For The Tutorial

hkclark at gmail.com
Mon Dec 8 17:12:17 GMT 2008


Hi Everyone,

After talking to MST, I realized that I am way overdue getting a note
out to the group on an issue that's pretty fundamental to the tutorial (I
had some one-on-one discussions about it, but I don't think I ever raised
it with the whole group).  Sorry for not getting something like this out to the
group sooner and sorry for this email being so long (but it's kind of a big
topic with a somewhat long history and potentially important ramifications
for the future).

Two of the big issues I have been grappling with since I first did the
tutorial almost 3 years ago are:

1) How to get an environment where users can quickly get up and running to
work through the tutorial (IOW, mostly how to avoid problems with CPAN)

2) How to have a "known good" set of modules that the tutorial should work
against.


Another key question is: "What is the purpose of the tutorial?"  My thought
has been that it needs to provide a good, stable environment where newcomers
can learn the basics of Catalyst.  It should try to show relatively current
best practices and features, but IMHO, having it "just work" is more
important than having the latest and greatest of everything -- if people get
frustrated in the early learning phases they will never "stick around" to
learn the finer points.  Any comments on this point of view?

In terms of trying to address 1 & 2 above, my initial plan of attack was to
encourage the use of MST's really cool "cat-install" script (while still
mentioning other options like "cat in a box").  I stuck
Catalyst::Manual::Installation::CentOS4 out there as a way that people could
start from scratch with a relatively popular distro and build a Catalyst
environment.  Unfortunately, using cat-install to always pull the latest
versions of everything seemed to create more problems with #1 and #2 above
than I expected.  More often than not, it ran into at least one module that
failed to install correctly.  And even if people did get past the install,
there was no way to know if the tutorial was now broken once they got into
the details.

I have talked to MST over the years about trying to have a way to
automatically regression test the tutorial on an ongoing basis.  If such a
thing existed, it seems like a great solution -- we could have a nightly
cron job automatically tell us if some new module broke either the install
*or* the tutorial itself.  However, I'm not aware of any frameworks or tools
to automatically test all that.  Does anyone else know of one?  I spent some
time thinking about how to use some =for tags in the POD to automate the
testing, but I didn't get very far with it -- unless I'm missing something,
it seems like a big job to do it right.  So far I have done it manually: I
start with a "minimal" install of Linux in VMWare, I run cat-install
following the exact directions in Catalyst::Manual::Installation::CentOS4
(almost always updating them because of some dependency or module change
requiring a hack), manually walking through every command and cutting
and pasting every chunk of code into the right place.  It's obviously very
slow (even though I have done it so many times I can almost do it in my
sleep), :-)  but it's the only way I know to make sure it works (at least I
know it worked the day I finished my testing... all bets were off as soon as
the underlying modules get updated the next day).
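To give a flavor of what I was toying with, the extraction side could be as simple as the sketch below.  Note this is only a sketch: the "=for test" tag and the Tutorial.pod file name are hypothetical conventions I'm making up for illustration, not anything the Manual actually uses today.

```shell
#!/bin/sh
# Sketch only: pull each paragraph tagged "=for test" out of a POD file
# and syntax-check the collected code with perl -c.  A POD "=for"
# paragraph ends at the first blank line, which is what awk keys on here.
pod=Tutorial.pod
awk '/^=for test/ {grab=1; next} /^$/ {grab=0} grab' "$pod" > snippets.pl
perl -c snippets.pl && echo "tutorial snippets compile"
```

That only gets us syntax checking, of course -- actually exercising the helper commands and the running app is the big job I mentioned.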

Then I happened to try Catalyst on Ubuntu 8.04 back when that first came out
earlier this year.  I couldn't believe how fast and easy it was!  In about 2
minutes I was able to boot the Live CD, uncomment the universe repositories
in /etc/apt/sources.list and run one apt-get command.  Boom, I was done and
it "just worked".  I could do the entire tutorial in that environment.  No
more waiting an hour for cat-install to finish only to realize it failed on
some module 50 minutes earlier (don't get me wrong, I think cat-install is
terrific, but it does have to download, compile and test each and every
module).   Yeah, Ubuntu didn't give me the latest and greatest of every
module, but there was an advantage to that -- it gave me a known environment
for the next 6 months until Ubuntu did their next release.  So while it
wasn't perfect, it seemed like the lesser evil to me: the setup was quick,
painless, and pretty darn bulletproof; Ubuntu is obviously very popular, so
lots of people are familiar with it (not that they need to know anything
about Ubuntu for the tutorial); and, because they release a new version of
Ubuntu every 6 months, it stays fairly current.  All I have to do is
test and update everything every six months as a new version of Ubuntu comes
out (I'm currently working on an update for Ubuntu 8.10).
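For anyone who wants to see it concretely, the whole setup boils down to something like the following.  The sed pattern and the libcatalyst-perl / libcatalyst-modules-perl package names are from memory, so treat them as assumptions -- check "apt-cache search catalyst" on your release first.

```shell
# Enable the universe repositories by uncommenting their "deb" lines,
# then install the packaged Catalyst stack in one shot (package names
# are my recollection; verify with apt-cache search catalyst).
sudo sed -i 's/^# *\(deb .* universe\)/\1/' /etc/apt/sources.list
sudo apt-get update
sudo apt-get install libcatalyst-perl libcatalyst-modules-perl
```

Because apt-get installs prebuilt packages, there's no hour of compiling and testing each module the way a raw CPAN build does.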

Thoughts on this?  Barring an automated testing methodology that really
exercises all parts of the tutorial process (the install, running the helper
commands, copying code from the pod files, etc.), it seems like something
along the lines of Ubuntu is a pretty good compromise.  Especially if we
don't have a volunteer to automate the testing process. :-)

Thoughts? Comments?  Suggestions?

Thanks,
Kennedy

PS -- Note that with the Ubuntu approach, we do have the option of having
them use CPAN for one or more modules if we want to get around a serious bug
and/or pull in a module that's newer than what can be found in Ubuntu universe.
And, because apt-get would do the heavy lifting of getting the 172
modules/packages installed first, the job would still be a lot faster,
simpler, and more tightly controlled than a raw build against CPAN.
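Concretely, that mixed approach would only take a couple of commands.  The package names and the Catalyst::Runtime override below are examples I picked for illustration, not a list of modules we actually need to override.

```shell
# Let apt-get do the heavy lifting for the packaged dependencies first...
sudo apt-get install libcatalyst-perl libcatalyst-modules-perl
# ...then pull just the handful of modules that need to be newer than
# what universe carries straight from CPAN (example module name only).
for mod in Catalyst::Runtime; do
    sudo cpan "$mod"
done
```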

PPS - Another idea I have thought about and seen discussed is having a
VMWare virtual appliance image available.  It sounds like a great way to go,
but we would need to find a way to maintain it.