[Catalyst] Catalyst handling big files?
Tomas Doran
bobtfish at bobtfish.net
Fri Aug 13 11:39:40 GMT 2010
On 13 Aug 2010, at 12:22, Alex Povolotsky wrote:
> I'm working on processing relatively big (10+ MB) XML files, and it
> seems to me that Catalyst is spending an awful lot of time on some
> internal processing before the call to my handler, using a surprising
> 200 MB of RAM (about 40 MB before the request).
>
> Can I somehow improve Catalyst's performance?
Vague question is vague.
I very much doubt that this is a Catalyst issue. You know that
Catalyst log lines are batched until the end of the request by
default, right? That alone can make it look as though Catalyst is
stalling before your handler runs.
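If you want to know where the time actually goes, time the expensive
step directly rather than trusting the batched log output. A rough
sketch with Time::HiRes (parse_xml and $xml_string are stand-ins for
whatever your handler really does):

    use Time::HiRes qw(gettimeofday tv_interval);

    my $t0   = [gettimeofday];
    my $data = parse_xml($xml_string);    # placeholder for your real parse
    $c->log->debug(sprintf 'XML parse took %.3fs', tv_interval($t0));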
I think you'll find that you are slurping 10+ MB of XML into Perl
(which uses _significantly_ more memory than the file size for a
large scalar), and then munging it in some way to produce a huge data
structure (which is also held in RAM).
So this has nothing to do with Catalyst: if you load a massive scalar
into RAM and then parse it into a massive data structure, it's going
to use a lot of RAM, full stop.
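You can measure this for yourself with Devel::Size from CPAN. The
file name and the XML::Simple call below are purely illustrative;
substitute whatever parser you're actually using:

    use Devel::Size qw(total_size);
    use XML::Simple qw(XMLin);

    open my $fh, '<', 'big.xml' or die "Cannot open big.xml: $!";
    my $xml = do { local $/; <$fh> };    # slurp the whole file
    close $fh;

    printf "raw scalar:       %d bytes\n", total_size(\$xml);
    printf "parsed structure: %d bytes\n", total_size(XMLin($xml));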
A streaming (SAX-style) approach would probably serve you better, as
you'd never need to read the entire file into RAM.
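For example, XML::Twig lets you process and discard each element as
it is parsed, so only one record is in memory at a time. A sketch,
assuming (hypothetically) that your file is a series of <record>
elements; adjust for your real structure:

    use XML::Twig;

    my $twig = XML::Twig->new(
        twig_handlers => {
            record => sub {
                my ($twig, $elt) = @_;
                # process one <record> element here ...
                $twig->purge;    # free everything parsed so far
            },
        },
    );
    $twig->parsefile('big.xml');

XML::LibXML::Reader would work just as well (and is faster); the
point is to handle the document in pieces instead of building one
giant tree.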
Cheers
t0m