MySQL PHP out of memory

PHP Getting out of memory

I am trying to insert data from a PostgreSQL database into a MySQL database. There are about 100,000 records that I need to import, but I always run into an out of memory error:

Out of memory (allocated 1705508864) (tried to allocate 222764 bytes)

I am using Laravel 5 to do this; here is the code:

// to avoid memory limit or time out issue
ini_set('memory_limit', '-1');
ini_set('max_input_time', '-1');
ini_set('max_execution_time', '0');
set_time_limit(0);

// this speeds up things a bit
DB::disableQueryLog();

$importableModels = [
    // array of table names
];

$failedChunks = 0;

foreach ($importableModels as $postGresModel => $mysqlModel) {
    $total = $postGresModel::count();
    $chunkSize = getChunkSize($total);

    // customize chunk size in case of certain tables to avoid too many placeholders error
    if ($postGresModel === 'ApplicationFormsPostgres') {
        $chunkSize = 300;
    }

    $class = 'App\\Models\\' . $mysqlModel;
    $object = new $class;

    // truncate prev data
    // Eloquent::unguard();
    DB::statement('SET FOREIGN_KEY_CHECKS=0;');
    $object->truncate();
    DB::statement('SET FOREIGN_KEY_CHECKS=1;');
    Eloquent::reguard();

    $postGresModel::chunk($chunkSize, function ($chunk) use ($postGresModel, $mysqlModel, $failedChunks, $object) {
        // make any adjustments
        $fixedChunk = $chunk->map(function ($item, $key) use ($postGresModel) {
            $appendableAttributes = $postGresModel::APPEND_FIELDS;
            $attributes = $item->getAttributes();

            // replace null/no values with empty string
            foreach ($attributes as $key => $attribute) {
                if ($attribute === null) {
                    $attributes[$key] = '';
                }
            }

            // add customized attributes and values
            foreach ($appendableAttributes as $appendField) {
                if ($appendField === 'ssn') {
                    $value = $attributes['number'];
                    $attributes[$appendField] = substr($value, 0, 4);
                } else {
                    $attributes[$appendField] = '';
                }
            }

            return $attributes;
        });

        // insert chunk of data in db now
        if (!$object->insert($fixedChunk->toArray())) {
            $failedChunks++;
        }
    });
}

The memory issue appears only after about 80,000 rows have been inserted, not before. I suspect something is wrong with the collection map function or the loops inside it. I have even tried setting the memory and time limit settings to unlimited, but to no avail. Maybe I need to use reference variables or something, but I am not sure how. Can any optimizations be made to the above code to reduce memory usage? Or how do I efficiently import large amounts of data from a large PostgreSQL database to MySQL through code? Can anyone tell me what I am doing wrong here, or why the whole memory gets consumed? PS: I am doing this on a local development machine with 4GB of RAM (Windows 8). PHP version: 5.6.16.
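
For comparison, here is a minimal sketch of the same chunked copy done through the query builder instead of Eloquent models, so no model object is hydrated per row and nothing accumulates between chunks. The 'pgsql' and 'mysql' connection names and the applications table are assumptions for illustration, not taken from the question:

use Illuminate\Support\Facades\DB;

// Sketch only: assumes a Laravel 5 app with 'pgsql' and 'mysql' connections
// configured; 'applications' is a placeholder table name.
DB::connection('pgsql')->table('applications')->orderBy('id')
    ->chunk(500, function ($rows) {
        $batch = [];
        foreach ($rows as $row) {            // $rows holds plain stdClass rows
            $attributes = (array) $row;
            foreach ($attributes as $key => $value) {
                if ($value === null) {
                    $attributes[$key] = '';  // same null-to-'' fixup as above
                }
            }
            $batch[] = $attributes;
        }
        DB::connection('mysql')->table('applications')->insert($batch);
        unset($batch);                       // drop references before the next chunk
        gc_collect_cycles();                 // sweep cyclic garbage each chunk
    });

The unset() plus gc_collect_cycles() at the end of each chunk matters on long runs, because PHP's cycle collector does not always fire promptly under steady allocation.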


MySQL running out of memory after multiple queries

I’m trying to iterate through two tables and pull records. The tables have hundreds of thousands of rows, so I know pulling them all at once is not going to work; looping through and pulling them in batches seems like the best approach. But the memory is not being released, so after a few pulls it runs out of memory and crashes.

My output:

400000 Pulled, currently on 0 || Out of 1623230
400000 Pulled, currently on 400000 || Out of 1623230
Fatal error: Allowed memory size of 536870912 bytes exhausted (tried to allocate 39 bytes) in C:\xampp\htdocs\misc\manage tickets\test2.php on line 38

Ignore the line number; I have a giant block of commented-out code at the top of the file. The code is:

$iterate = 400000;
$query = "SELECT id FROM $old_db.ticketaudit_archive";
$get = $g_mysqli->query($query);
$count = $get->num_rows;
$i = 0;
while ($i < $count) {
    // the paginated SELECT here was mangled in the original post; presumably
    // something like "SELECT * FROM $old_db.ticketaudit_archive LIMIT $i, $iterate"
    $get = $g_mysqli->query($query);
    echo $get->num_rows . ' Pulled, currently on ' . $i . ' || Out of ' . $count . '<br>';
    while ($z = $get->fetch_assoc()) {
        $old_ticket_id = $z['old_ticket_id'];
        $ticket_id = $z['ticket_id'];
        unset($z['old_ticket_id'], $z['ticket_id'], $z['id']);
        if ($old_ticket_id != '') {
            $ticketaudit['old_ticket_id'][$old_ticket_id][] = $z;
            $xx += 1;
        } elseif ($ticket_id != '') {
            $ticketaudit['ticket_id'][$ticket_id][] = $z;
            $xx += 1;
        }
    }
    $i += $iterate;
    $get->free();
    $get->free_result();
    $get->close();
}

I’ve been at this all day, and can’t find a way to get MySQL to drop the damned memory cache for previous queries.
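
One way to avoid both the batching loop and the buffering is an unbuffered query: with MYSQLI_USE_RESULT the rows are streamed from the server one at a time instead of the whole result set being copied into PHP memory first. A rough sketch, with placeholder connection details; note that the result must be fully read and freed before the same connection can run another query, and that storing every row into a growing array (like $ticketaudit above) will still consume memory regardless:

// Sketch: unbuffered read of the whole table, one row in memory at a time.
$g_mysqli = new mysqli('localhost', 'user', 'pass', 'database');   // placeholders

$get = $g_mysqli->query(
    "SELECT * FROM $old_db.ticketaudit_archive",
    MYSQLI_USE_RESULT              // stream rows; do not buffer the result set
);

while ($z = $get->fetch_assoc()) {
    // process a single row here; only the current row is held in PHP
}

$get->free();                      // required before issuing the next query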


Out of memory error in PHP

I have tried using memory_get_usage() to track down the cause of the memory leak, but since the codebase is huge it is taking a lot of time. Is there any way I can get the details of all the objects in memory so that I can debug the issue better?

select b.Id as Id,b.Lang from groups g left join table1 b on b.Group_Id = g.Id left join table2 bs on bs.Id = b.Id where g.Id = ? and b.Lang = ? 

It would be best if you also included the SQL command, as the problem could be in there (for example, joining multiple tables without a restriction, so that every row is joined with every row).

The SQL is "select b.Id as Id,b.Lang from groups g left join table1 b on b.Group_Id = g.Id left join table2 bs on bs.Id = b.Id where g.Id = ? and b.Lang = ?". Could you please explain in more detail why the memory usage would increase because of a problem in the SQL statement?

For the leak problem itself, I think the answers there are good. From what I can see, the SQL doesn’t cause a general problem (as it restricts which rows are associated with which rows). It seems like either way too many data rows in the tables themselves, or a real memory leak. You mentioned that the code is huge. Do you unset the variables you store the input in after they have become obsolete? (Otherwise they still reserve memory.)

I am assuming the variable you are expecting to be unset is $result in this line: "$result = fetchAll($sql,$data,Zend_Db::FETCH_ASSOC);". I do not unset the variable, since it is in the scope of the method, so I am expecting that it should not be retained in memory. Could there be a possibility that variables in the scope of a method still exist even after the method execution completes? Could you please also share any good links on the memory leak problem.
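
To the scope question: a method-local variable is indeed released when the method returns, unless something else (an object property, a static, a global, or a reference) still points at the same data. The more common problem with fetchAll() is simply that it materializes every row into one array before returning. Here is a sketch of a row-by-row fetch instead, assuming a Zend_Db adapter in $db, as the Zend_Db::FETCH_ASSOC constant in the snippet suggests:

// Sketch: fetch one row at a time instead of fetchAll(), so memory stays flat.
$stmt = $db->query($sql, $data);                 // returns a Zend_Db_Statement

while ($row = $stmt->fetch(Zend_Db::FETCH_ASSOC)) {
    // handle $row; it is overwritten on the next iteration
    // echo memory_get_usage() . "\n";           // should stay roughly constant
}

$stmt->closeCursor();                            // release the server-side result set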


1. create_summary function

The create_summary function fetches activity data from the database, loops through the activities with a foreach loop, and appends lines to a text file like the one below:

zfilename71801404123.txt

A|201309|R|C|2|014000956|014000956|2200|201211|M|3118.72|35215.12|1639.96|40749.29|46183.13|44653.83|1529.3|||423|9999|EVERGREEN IMPLEMENT INC
A|201309|R|C|2|014000956|014000956|2201|201211|M|0|13.86|0|15.22|13.86|15.22|-1.36|||423|9999|EVERGREEN IMPLEMENT INC

2. insertdatafromfile function

The insertdatafromfile function reads the contents of the same text file [zfilename71801404123.txt] and inserts them into the summary table using the LOAD DATA command.

Code

function RebuildSummary() {
    $random = date('dmyhis');
    $zfilename = "zfilename" . $random;
    create_summary($zfilename);
    insertdata($zfilename);
}

function create_summary($zfilename) {
    $activities; // data from DB
    $filepath = $_SERVER['DOCUMENT_ROOT'] . "\z" . $zfilename . ".txt";
    foreach ($activities as $activity) {
        $sql_summary = "SELECT A.AcctDb as AcctDb, '" . $default->DeftReportPeriod . "' as SumReportPer, '"
            . $default->DeftReportBase . "' as SumReportBase, '" . $default->DeftPeriodBasis . "' as SumPeriodBasis, '"
            . $default->DeftBasisAdj . "' as SumBasisAdj, '" . $AcctNo . "' as AcctNo, '" . $AcctTaxId . "' as AcctTaxId, '"
            . $RevLoc . "' as SumRevLoc, '" . $YTDStart . "' as SumYtdStart, '" . $CurrFreq . "' as SumCurrFreq, '"
            . $Curr . "' as SumCurrAmt, '" . $Ytd . "' as SumYtdAmt, '" . $Lastcurr . "' as SumLastCurr, '"
            . $LastYTD . "' as SumLastYtd, '" . $Last12 . "' as SumLast12, '" . $Prior12 . "' as SumPrior12, '"
            . $Last12diff . "' as SumLast12Diff, A.AcctDateOpen as SumDateOpen, A.AcctDateClosed as SumDateClosed,"
            . " A.GroupCode as SumGroupCode, A.AcctHomeLoc as SumHomeLoc, A.AcctBusName as SumBusName,"
            . " A.ClassCode as SumClassCode, '" . $Currdiff . "' as SumCurrDiff, '" . $Ytddiff . "' as SumYtdDiff, '"
            . $Mon['0'] . "' as SumMon01, '" . $Mon['1'] . "' as SumMon02, '" . $Mon['2'] . "' as SumMon03, '"
            . $Mon['3'] . "' as SumMon04, '" . $Mon['4'] . "' as SumMon05, '" . $Mon['5'] . "' as SumMon06, '"
            . $Mon['6'] . "' as SumMon07, '" . $Mon['7'] . "' as SumMon08, '" . $Mon['8'] . "' as SumMon09, '"
            . $Mon['9'] . "' as SumMon10, '" . $Mon['10'] . "' as SumMon11, '" . $Mon['11'] . "' as SumMon12, '"
            . $Amt['0'] . "' as SumAmt01, '" . $Amt['1'] . "' as SumAmt02, '" . $Amt['2'] . "' as SumAmt03, '"
            . $Amt['3'] . "' as SumAmt04, '" . $Amt['4'] . "' as SumAmt05, '" . $Amt['5'] . "' as SumAmt06, '"
            . $Amt['6'] . "' as SumAmt07, '" . $Amt['7'] . "' as SumAmt08, '" . $Amt['8'] . "' as SumAmt09, '"
            . $Amt['9'] . "' as SumAmt10, '" . $Amt['10'] . "' as SumAmt11, '" . $Amt['11'] . "' as SumAmt12"
            . " FROM accounts A WHERE A.AcctDb = '" . $AcctDb . "' and A.AcctTaxId = '" . $AcctTaxId . "';";

        $exist_activity1 = $this->db->query($sql_summary);
        $activities1 = $exist_activity1->result_array();
        $flag_index = 0;
        foreach ($activities1[0] as $key => $value) {
            if ($flag_index == 0) {
            }
            $result .= $value . "|";
            $flag_index = 1;
        }
        $j++;
        $result = rtrim($result, "|");
        $handle = fopen($filepath, 'a') or die('Cannot open file: ' . $filepath);
        fwrite($handle, $result);
        $new_data = "\n";
        fwrite($handle, $new_data);
        $result = "";
    }
}

function insertdatafromfile($zfilename)
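
The question cuts off at insertdatafromfile. For reference, here is a hypothetical shape of that function based on the description above; the summary table name, the column order matching the pipe-delimited file, and the use of LOCAL are all assumptions:

// Sketch only: the real insertdatafromfile() is not shown in the question.
function insertdatafromfile($zfilename)
{
    $filepath = $_SERVER['DOCUMENT_ROOT'] . "\z" . $zfilename . ".txt";

    // LOAD DATA makes MySQL read the file itself, so PHP never holds the rows;
    // LOCAL requires local_infile to be enabled on both client and server.
    $sql = "LOAD DATA LOCAL INFILE " . $this->db->escape($filepath) . "
            INTO TABLE summary
            FIELDS TERMINATED BY '|'
            LINES TERMINATED BY '\\n'";

    $this->db->query($sql);
}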

System Configuration:

Processor: Intel(R) Xeon(TM) CPU 2.80GHz, 2.79 GHz (2 Processors)
Installed memory (RAM): 6.00 GB
System Type: 64-bit Operating System
Server: Windows IIS 7

PHPINFO

max_input_time: 60000
max_file_uploads: 2048M
memory_limit: 20000M
post_max_size: 20000M
upload_max_filesize: 15000M

My Question:

I am getting the below error while calling the RebuildSummary function.

[17-Apr-2014 03:54:42 America/Los_Angeles] PHP Fatal error: Out of memory (allocated 1517289472) (tried to allocate 64 bytes) in C:\HostingSpaces\wwwroot\system\database\drivers\mysql\mysql_result.php on line 162 

I have enough memory on the server side, so how can the system still display this "Out of memory" error?
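
The allocation failing inside mysql_result.php suggests the problem is not the server's physical RAM but the result sets buffered by CodeIgniter: each $this->db->query() call in the loop buffers its full result in PHP, and over thousands of iterations those buffers can pile up faster than they are reclaimed. Here is a sketch of the inner loop with explicit cleanup, reusing the names from create_summary above (opening the file once before the loop instead of per iteration is also an assumed change):

// Sketch: free each buffered result inside the loop so memory stays flat.
$handle = fopen($filepath, 'a') or die('Cannot open file: ' . $filepath);

foreach ($activities as $activity) {
    $query = $this->db->query($sql_summary);
    $row   = $query->row_array();       // only the first row is ever used
    $query->free_result();              // release this result's buffer now

    fwrite($handle, implode('|', $row) . "\n");
    unset($row);
}

fclose($handle);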


How do I prevent running out of memory when inserting a million rows in MySQL with PHP

I have built a script in Laravel that reads a JSON file line by line and imports the contents into my database. However, when running the script, I get an out of memory error after inserting about 80K records.

mmap() failed: [12] Cannot allocate memory
mmap() failed: [12] Cannot allocate memory
PHP Fatal error: Out of memory (allocated 421527552) (tried to allocate 12288 bytes) in /home/vagrant/Code/sandbox/vendor/laravel/framework/src/Illuminate/Database/Query/Builder.php on line 1758
mmap() failed: [12] Cannot allocate memory
PHP Fatal error: Out of memory (allocated 421527552) (tried to allocate 32768 bytes) in /home/vagrant/Code/sandbox/vendor/symfony/debug/Exception/FatalErrorException.php on line 1

I have built a sort of makeshift queue to only commit the collected items every 100 records, but this made no difference. This is what the part of my code that does the inserts looks like:

public function callback($json) {
    if ($json) {
        $this->queue[] = [
            'type'       => serialize($json['type']),
            'properties' => serialize($json['properties']),
            'geometry'   => serialize($json['geometry'])
        ];

        if (count($this->queue) == $this->queueLength) {
            DB::table('features')->insert($this->queue);
            $this->queue = [];
        }
    }
}

It’s the actual inserts ( DB::table('features')->insert($this->queue); ) that are causing the error; if I leave those out, I can iterate over all the lines and echo them without any performance issues. I guess I could allocate more memory, but I doubt that would be a solution: I am trying to insert 3 million records, and it currently already fails after 80K with 512MB of memory allocated. Furthermore, I actually want to run this script on a low-budget server. The time this script takes to run is not a concern, so if I could somehow slow the insertion of records down, that would be a solution I could settle for.
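
Two small changes usually stop exactly this growth pattern in Laravel: making sure the query log is off (some Laravel versions keep every executed statement in memory by default) and nudging the cycle collector after each flush. Here is a sketch of the same method with those changes; everything beyond the DB:: calls and the names already shown in the question is an assumption:

// Run once before the import starts (a no-op if the log is already disabled):
// DB::connection()->disableQueryLog();

public function callback($json)
{
    if ($json) {
        $this->queue[] = [
            'type'       => serialize($json['type']),
            'properties' => serialize($json['properties']),
            'geometry'   => serialize($json['geometry'])
        ];

        if (count($this->queue) == $this->queueLength) {
            DB::table('features')->insert($this->queue);
            $this->queue = [];
            gc_collect_cycles();    // sweep cycles left behind by the query builder
        }
    }
}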
