The ->get() method executes fine; the problem appears when I try to walk through the results:
foreach ($query->result() as $row)
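For context, this is roughly the shape of the code; the table name and the per-row work are just placeholders:

$query = $this->db->get('big_table');   // the SELECT itself runs fine, even on ~1M rows

foreach ($query->result() as $row)       // this is where memory runs out:
{
    echo $row->id;                       // result() builds an object for every row before the loop starts
}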
Is the ActiveRecord class (or the MySQL driver of CI) intended only for small tables and result sets?
It seems that 128M is more than enough.
I ran this same process with plain PHP and ADODB and it works perfectly.
Any help would be highly appreciated.
I imagine it has to do with how you are processing your data. You should post more code showing where you get the data from the db and then manipulate it, especially more of the loop you started to post above.
@InsiteFX, I see your point; do you see mine?
Maybe I didn't make myself clear enough.
If so, I will provide a generic example from which this defect can be tested.
Yes, anyone can get this error, even after raising the amount of memory for a single script; sooner or later the number of rows will beat your RAM. That's why result sets are not stored in memory unless you ask for it, or you are using a caching strategy where memcaching is needed.
That's why programming is such an exciting job: one day you're working on a shopping cart, and the next you end up trying to run a process over a million records!
My problem right now is that I'm trying to do it with CI.
I know at least three other ways to do the same thing without using CI, but I made this post hoping for an answer as to why I can't open a table with a million records.
In fact I can hack the library to do what I want.
As InsiteFX has said: why "WOULD" you want to open a table with a million records?
Any developer worth their salt should know that large tables must be paged/paginated into chunks.
I've only ever come across two instances where you would "need" to process large amounts of data in one go, and in those instances I have used a direct connection to the database from PHP, i.e. mysql_connect().
CodeIgniter and ActiveRecord (IMO) have been built for the common developer who, every day, has to keep writing the same code over and over again. They've been built so that accessing single rows or small result sets is extremely easy, where normally it would require a lot more work.
If you need to process large result sets, I would suggest using plain PHP and MySQL calls, so that CI and ActiveRecord do not consume any additional memory during the processing…
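Something along these lines, for example (connection details and table name are placeholders); mysql_unbuffered_query() streams rows instead of buffering the whole result set in PHP:

$link = mysql_connect('localhost', 'user', 'password');
mysql_select_db('mydb', $link);

$result = mysql_unbuffered_query('SELECT * FROM big_table', $link);

while ($row = mysql_fetch_object($result))
{
    // per-row processing here; only one row is held in memory at a time
}

mysql_free_result($result);
mysql_close($link);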
The query is very simple, just a select from a table.
The explanation of my needs is irrelevant to the bug/defect; you can't criticize my needs based on your own.
Maybe it's not the most common scenario, but an abstraction layer should be capable of handling 1M records seamlessly, like most other abstraction layers do.
If the DB layer of CI is intended for small result sets, that should be noted and remarked upon in the docs, and the library should be safe in that respect and handle the exceptions.
Let's say you start with an empty database and for some reason you end up with a big result set; then you will end up with a memory exhaustion error.
If you can provide an actual reason for returning 1M records in one result set, then maybe there is a flaw in the CI DB system, but under normal circumstances even databases with more than 1M records will work fine with the CI DB system, by using pagination or correctly processing the data.
For example, regardless of how many records you have, you should use pagination to break down the results when presenting them, i.e.
select * from Table limit 0, 100
which returns 100 rows.
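In CI's ActiveRecord that would be roughly:

$query = $this->db->get('Table', 100, 0);   // LIMIT 0, 100
// or, equivalently:
// $this->db->limit(100, 0);
// $query = $this->db->get('Table');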
If you have 1M records that you need to process in one go, you can either use the native MySQL calls that PHP provides or, probably better, batch process them in, say, 10,000 rows at a time.
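A rough sketch of that batched approach (chunk size and table name are illustrative only):

$chunk  = 10000;
$offset = 0;

do
{
    $query = $this->db->get('big_table', $chunk, $offset);
    $rows  = $query->result();

    foreach ($rows as $row)
    {
        // per-row processing goes here
    }

    $query->free_result();   // release this chunk before fetching the next one
    $offset += $chunk;
}
while (count($rows) === $chunk);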