
Memory error on large tables



LHammonds
  • Member

Posted 26 April 2017 - 11:52 AM

Greetings,

 

I am trying to throw together a quick app to provide read-only access to a few tables...some of which have over a million rows, and some of which have more than 200 fields. We are finally getting rid of our AS/400.

 

I am using the DataTables theme, which my users like the most.  I tested it with a small database.

 

By default, it tries to load the entire dataset into memory...which will not work.

 

I found the option for server-side processing, but because of my lack of knowledge I don't know what I need to modify, or where, to make this work.

 

Here are the changes I made to the default install of CodeIgniter/GroceryCRUD:

 

application/config/database.php - Added connection info for MySQL and set save_queries to FALSE.

 

application/controllers/Main.php - Set the index to show various table pages and functions for each table.  Each table section uses the following simple settings:

 * Set the theme to datatables

 * Set the table name

 * Unset the add, delete and edit buttons

 * Render the output.

 

Example:

public function table1()
{
    $crud = new grocery_CRUD();
    $crud->set_table('table1');
    $crud->set_subject('Table description');
    $crud->set_theme('datatables');
    // Read-only access: remove the add, edit and delete buttons.
    $crud->unset_add();
    $crud->unset_edit();
    $crud->unset_delete();
    $output = $crud->render();
    $this->_example_output($output);
}

 

Error message (memory_limit in php.ini was already increased, but that is not feasible with this amount of data):

Fatal error: Allowed memory size of 536870912 bytes exhausted (tried to allocate 3203257904 bytes) in /var/www/html/testapp/system/database/drivers/mysqli/mysqli_driver.php on line 305

A PHP Error was encountered

Severity: Error

Message: Allowed memory size of 536870912 bytes exhausted (tried to allocate 3203257904 bytes)

Filename: mysqli/mysqli_driver.php

Line Number: 305

Backtrace:

 

Server information:

Ubuntu Server 16.04.2 LTS
MariaDB 10.1.22
Apache 2.4.18
PHP 7.0.15
CodeIgniter 3.1.2
GroceryCRUD 1.5.8

 


Amit Shah
  • Member

Posted 27 April 2017 - 11:03 AM

Simple solution: don't use the datatables theme. Use a bootstrap theme instead. Datatables loads all the data at once, so everything depends on the machine's capacity to handle it.
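
For reference, switching themes is a one-line change in the controller. A minimal sketch using the stock Flexigrid theme, which paginates via AJAX instead of shipping the whole table to the browser (the purchased Bootstrap theme behaves the same way):

// Unlike 'datatables', which loads the full result set client-side,
// Flexigrid only sends the current page to the browser.
$crud->set_theme('flexigrid');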


LHammonds
  • Member

Posted 27 April 2017 - 18:59

Simple solution: don't use the datatables theme. Use a bootstrap theme instead. Datatables loads all the data at once, so everything depends on the machine's capacity to handle it.

I like simple solutions!  Thanks for the help.

 

I switched to the Flexigrid and it worked somewhat.

 

Of the 6 tables, 5 were able to display.  I still get a memory issue on one that has over 8 million rows.  Two tables have over 2 million rows but are able to avoid crashing.

 

However, when I click the next page or last page buttons, I notice the data never changes even when I click the refresh button next to the pagination controls.

 

It also does not change when I do a global search or search in a specific field.

 

I purchased the bootstrap theme, installed it and I get the exact same results.

 

Why would the app run out of memory on the web server if it is only pulling 10 records at a time, regardless of how large the repository is?

 

As for the no search / refresh issue, is this a typical noob mistake in configuration?

 

EDIT: It is the web server that runs out of RAM.  The dedicated (separate) SQL server happily processes each request.

 

Thanks,

LHammonds


mckaygerhard
  • Member

Posted 28 April 2017 - 19:36

For your information:

 

Using a Linux client and Flexigrid in Firefox 20: no problems.

 

Using a Linux client and datatables in Firefox 20: no problems.

 

Using any OS and datatables in the latest Firefox or Chrome: always problems.

 

Using Microsoft Windows and any browser, you already know what happens: always problems.

 

The problem comes down to how the machine handles the data, and these days wasting resources seems to be the standard.

 

On the client I detect the OS and browser, show an alert, and limit the number of rows. If users want to see other rows, they have to pre-filter the data first. I mean (see the sketch after this list):

* The user sets filters.

* I receive the filters in POST, run a query against the table, and create another table from the result (not a TEMPORARY table).

* Later I render over that working table.

* The table is only dropped before each new request.
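
A minimal sketch of that flow in a CodeIgniter controller. The filter field (status), the working-table name (table1_work) and the controller method are hypothetical; adapt them to your own schema:

public function table1_filtered()
{
    // 1. Receive the user's filter from POST (field name is hypothetical).
    $status = $this->input->post('status');

    // 2. Drop any previous working copy, then build a smaller table
    //    from the filtered query.
    $this->db->query('DROP TABLE IF EXISTS table1_work');
    $this->db->query(
        'CREATE TABLE table1_work AS SELECT * FROM table1 WHERE status = ?',
        array($status)
    );
    // NOTE: Grocery CRUD expects a primary key on the table it renders,
    // so make sure the working table keeps one.

    // 3. Render Grocery CRUD over the much smaller working table.
    $crud = new grocery_CRUD();
    $crud->set_table('table1_work');
    $crud->set_theme('flexigrid');
    $crud->unset_add();
    $crud->unset_edit();
    $crud->unset_delete();
    $this->_example_output($crud->render());
}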

 

Because of the poor resource management of Windows and modern Linux systems, the behaviour you want is impossible without server-side processing with AJAX, and as you said, that is a more complex solution.


Amit Shah
  • Member

Posted 02 May 2017 - 06:24 AM

Well,

I recommend checking the output of the network call. The pagination calls are all AJAX calls. Check the output of the AJAX request in the developer console (Network tab). See whether the response contains any notice or warning. If it does, or if there is any other error, the grid will not proceed, because it expects JSON and the error comes back as HTML mixed into the output.
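
A common cause of that symptom is a stray PHP notice being printed before the JSON. A minimal sketch, assuming the stock CodeIgniter 3 front controller: set ENVIRONMENT to 'production' in index.php so notices stay out of the response (check the logs for them instead):

// index.php (CodeIgniter 3 front controller). In 'production' the stock
// bootstrap turns display_errors off and filters notices out of
// error_reporting, so they no longer corrupt the JSON the grid expects.
define('ENVIRONMENT', isset($_SERVER['CI_ENV']) ? $_SERVER['CI_ENV'] : 'production');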


LHammonds
  • Member

Posted 02 May 2017 - 19:39

I noticed there was a slightly newer version of CodeIgniter available.  Rather than overwrite the current folder, I ended up just creating a new folder with CodeIgniter 3.1.4 and Grocery_CRUD 1.5.8

 

I then installed the sample database and tied it to that just to see the example work. It did what it was supposed to...so...I then tied it to my database and proceeded to butcher the example controller and view to look at my tables. It still croaked on the 8mil table, but the filtering and navigation worked, unlike the first one I set up.

 

Since it worked for the 2mil tables, I figured I could just keep incrementing the memory_limit variable until the 8mil table would finish loading.

 

The server had 1024M of physical memory and 2048M of swap space available.  So I bumped up the "memory_limit" variable in php.ini from 512M to 1024M and that allowed it to display the table with 8 million rows.  I watched the RAM on the server and it did indeed use every bit available to reach the end of its query.
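
For reference, the change above is the memory_limit = 1024M line in php.ini, followed by an Apache restart. A hedged alternative, if you only want the higher limit for this one read-only controller rather than globally, is to raise it per request:

// Raise the limit for this request only instead of editing php.ini.
// 1024M is the value used above; pick what the web server's RAM allows.
ini_set('memory_limit', '1024M');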

 

Thanks for the help and for getting me straightened out about how themes affect more than just the visuals.

 

LHammonds


Amit Shah
  • Member

Posted 03 May 2017 - 09:41 AM

Well,

 

Playing around with that volume of data is definitely asking for trouble if you don't have proper indexing and you are pulling in a heavy data set. That is where pagination is going to help you, with themes like Bootstrap / Flexigrid.

With datatables you may be able to process the data on the server side by increasing the memory, but it will be tough on the client end for machines that don't have enough memory to handle the rendering.
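
On the indexing point, a hedged sketch of a one-off statement that indexes the column the grid orders or filters on, so pagination queries do not scan the whole table (the column and index names here are hypothetical):

// One-off migration sketch, run from a CodeIgniter controller or the
// MariaDB console: index the ordering/filtering column of the big table.
$this->db->query('CREATE INDEX idx_table1_order_date ON table1 (order_date)');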

 

Always happy to help ..

Happy GCing :)