[Dbix-class] Regarding Oracle's 1000 element limit in IN clause

Jorge Gonzalez jorge.gonzalez at daikon.es
Mon Oct 24 09:55:51 GMT 2011

On 24/10/11 10:48, Matija Grabnar wrote:
> You have another possibility: create the temporary tables in another 
> database. This database can be on your OWN server, so the HUGE company 
> doesn't even have to know about it.

Yes, but I need to cross-reference information from the HUGE company 
database, so I need to join the tables. I want to filter a table with 
millions of rows (HUGE db) against a local list of values for a field 
(SMALL db). If I want to use temp tables for this, I have two options:

A. Create a temp table in the HUGE database with the contents of the 
SMALL database values. I'm not allowed to do this, as I said in my 
previous mails.

B. Create a temp table in the SMALL database with the contents of the 
HUGE database table I want to filter. This would mean transferring a 
copy of a table with _millions_ of rows to local, just to discard it 
afterwards. That doesn't seem very reasonable.

If I want to join the tables they need to be in the same schema (or at 
least the same DB server), so I can do only two things: move the SMALL 
table where the HUGE one is (not allowed), or move the HUGE one where 
the SMALL one is (not reasonable).
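Since the thread's subject is Oracle's 1000-element IN limit, a third route (not discussed in the original mail, just a sketch) is to keep the query against the HUGE database but split the local value list into chunks of at most 1000 and OR the IN clauses together, which DBIx::Class/SQL::Abstract can express with `-or` and `-in`. The column name `my_col` below is a placeholder:

```perl
use strict;
use warnings;

# Split a list of values into chunks of at most $size elements,
# so each chunk stays under Oracle's 1000-element IN limit.
sub chunk_values {
    my ($size, @values) = @_;
    my @chunks;
    push @chunks, [ splice @values, 0, $size ] while @values;
    return @chunks;
}

# Build a DBIx::Class-style search condition: one IN clause per
# chunk, OR'd together. ('my_col' is a hypothetical column name.)
sub in_condition {
    my (@values) = @_;
    my @chunks = chunk_values(1000, @values);
    return { -or => [ map { { my_col => { -in => $_ } } } @chunks ] };
}
```

The resulting hashref would be passed to `$rs->search(...)` as the condition; whether the HUGE database tolerates very long OR lists is a separate question, of course.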

> If you were to download the full resultset and do the searching in 
> Perl you would in effect be doing the
> same thing, except your database would be in your RAM, and you would 
> be using Perl instead of SQL
> to search for it. For large datasets (which this seems to be) a 
> database on the local server is more efficient than doing it all in RAM.

The question is, my server has enough RAM to slurp the resultset and 
then search it in memory (which is what I'm doing now). If enough RAM 
is available, no disk-based SQL server can beat that, provided 
efficient search algorithms are used (I build several RAM-based 
indexes, i.e. perl hashes, from the resultset before doing any 
searching).
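The RAM-based hash index approach can be sketched like this (a minimal illustration, not the poster's actual code; the row shape and the field name `id` are made up):

```perl
use strict;
use warnings;

# Build a hash index keyed on one field of each row, so later
# lookups are O(1) hash accesses instead of scans of the full
# resultset. Rows sharing a key value are grouped in an arrayref.
sub build_index {
    my ($field, @rows) = @_;
    my %index;
    push @{ $index{ $_->{$field} } }, $_ for @rows;
    return \%index;
}
```

With such an index, filtering by the local list of values is one hash lookup per value, rather than one pass over millions of rows per value.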

Thanks for the idea anyway.

More information about the DBIx-Class mailing list