Asked 2 years ago · Answers: 5 · Viewed 103 times

I am reading a CSV file in PHP and checking against MySQL whether each record exists in my table.

The CSV has about 25,000 records, and when I run my code it displays a "Service Unavailable" error after 2m 10s (onload: 2m 10s).

Here is the code I have:

// set memory limit & execution time
ini_set('memory_limit', '512M');
ini_set('max_execution_time', '180');

// function to read a CSV file into an array of rows
function readCSV($csvFile)
{
    $line_of_text = array();
    $file_handle = fopen($csvFile, 'r');
    while (($line = fgetcsv($file_handle, 1024)) !== false) {
        set_time_limit(60); // reset the timeout on each row if you have a lot of data
        $line_of_text[] = $line;
    }
    fclose($file_handle);
    return $line_of_text;
}

// Set path to CSV file
$csvFile = 'my_records.csv';

$csv = readCSV($csvFile);

for ($i = 1; $i < count($csv); $i++)
{
    $user_email = mysql_real_escape_string($csv[$i][1]);

    $qry = "SELECT u.user_id, u.user_email_id FROM tbl_user AS u WHERE u.user_email_id = '" . $user_email . "'";

    $result = mysql_query($qry) or die("Couldn't execute query: " . mysql_error() . ' ' . mysql_errno());

    $rec = mysql_fetch_row($result);

    if ($rec) {
        echo "Record exists";
    } else {
        echo "Record does not exist";
    }
}

Note: I just want to list the records that do not exist in my table.

Please suggest a solution for this.

 Answers

3

An excellent method for dealing with large files is located at: https://stackoverflow.com/a/5249971/797620

This method is used at http://www.cuddlycactus.com/knownpasswords/ (page has been taken down) to search through 170+ million passwords in just a few milliseconds.
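The linked method aside, the timeout in the original question comes mostly from issuing one query per CSV row. Below is a minimal sketch of batching the lookups into chunked IN() queries instead, reusing the question's readCSV() output and the same deprecated mysql_* extension; the chunk size of 500 is an arbitrary assumption, not from the linked answer:

// Collect all emails from the CSV (skipping the header row).
$emails = array();
for ($i = 1; $i < count($csv); $i++) {
    $emails[] = $csv[$i][1];
}

// One query per 500 emails instead of one query per email.
$found = array();
foreach (array_chunk($emails, 500) as $chunk) {
    $in  = "'" . implode("','", array_map('mysql_real_escape_string', $chunk)) . "'";
    $res = mysql_query("SELECT user_email_id FROM tbl_user WHERE user_email_id IN (" . $in . ")");
    while ($row = mysql_fetch_row($res)) {
        $found[$row[0]] = true;
    }
}

// List the emails from the CSV that are not in the table.
foreach ($emails as $email) {
    if (!isset($found[$email])) {
        echo $email, " does not exist\n";
    }
}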

Sunday, December 25, 2022
1
$howlong = array("0-30days"=>0, "31-60days"=>0, "61-90days"=>0, "91+days"=>0);
foreach($csv as $car){
  ...
  $numberDays = intval($numberDays); //last line in your code
  if($numberDays<=30) $howlong["0-30days"]++;
  elseif($numberDays<=60) $howlong["31-60days"]++;
  elseif($numberDays<=90) $howlong["61-90days"]++;
  elseif($numberDays>90) $howlong["91+days"]++;
}

echo $howlong["0-30days"]." cars 0-30 days old";
echo $howlong["31-60days"]." cars 31-60 days old";
//etc
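The elided lines depend on the original CSV layout. As a purely hypothetical illustration, $numberDays could be derived from a listing date in column 0 (the column index and date format are assumptions):

// Hypothetical: column 0 holds a date string such as "2022-11-03".
$listedOn   = strtotime($car[0]);
$numberDays = (time() - $listedOn) / 86400; // 86400 seconds per day
$numberDays = intval($numberDays); // last line in your code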
Thursday, December 22, 2022
 
heymega
 
3

Say the integers are all 0-15. Then you can store 2 per byte:

<?php
// Build a 500,000-byte string; each random byte packs two 4-bit integers.
$data = '';
for ($i = 0; $i < 500000; ++$i)
  $data .= chr(mt_rand(0, 255));

echo serialize($data);

To run: php ints.php > ints.ser

Now you have a file with a 500,000-byte string containing 1,000,000 random integers from 0 to 15.
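If you are packing real values rather than random bytes, here is a small sketch of the write side (pack_nibbles() is a hypothetical helper, not from the answer; values are assumed to be in the 0-15 range):

<?php
// Hypothetical packing counterpart: two 0-15 values per byte, high nibble first,
// matching the unpacking order used by get_data_at() below.
function pack_nibbles(array $values)
{
  $data = '';
  for ($i = 0; $i < count($values); $i += 2) {
    $hi = $values[$i];
    $lo = isset($values[$i + 1]) ? $values[$i + 1] : 0; // pad an odd count with 0
    $data .= chr(($hi << 4) | $lo);
  }
  return $data;
}

echo serialize(pack_nibbles(array(3, 15, 0, 7)));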

To load:

<?php
$data = unserialize(file_get_contents('ints.ser'));

function get_data_at($data, $i)
{
  $data = ord($data[$i >> 1]); // the byte holding values $i and $i ^ 1

  return ($i & 1) ? $data & 0xf : $data >> 4; // odd index: low nibble, even: high nibble
}

for ($i = 0; $i < 1000; ++$i)
  echo get_data_at($data, $i), "\n";

The loading time on my machine is about .002 seconds.

Of course this might not be directly applicable to your situation, but it will be much faster than a bloated PHP array of a million entries. Quite frankly, having an array that large in PHP is never the proper solution.

I'm not saying this is the proper solution either, but it definitely is workable if it fits your parameters.

Note that if your array had integers in the 0-255 range, you could get rid of the packing and just access the data as ord($data[$i]). In that case, your string would be 1M bytes long.

Finally, according to the documentation of file_get_contents(), PHP will memory-map the file. If so, your best performance would be to dump raw bytes to a file and use it like:

$ints = file_get_contents('ints.raw');
echo ord($ints[25]);

This assumes that ints.raw is exactly one million bytes long.
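Producing that file is just the generation loop from above without the serialization; a sketch (mt_rand stands in for real data, as in the answer's generator):

<?php
// Dump exactly one million raw bytes (values 0-255): no packing, no serialize().
$raw = '';
for ($i = 0; $i < 1000000; ++$i)
  $raw .= chr(mt_rand(0, 255));

file_put_contents('ints.raw', $raw);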

Monday, August 1, 2022
 
4

// get the uploaded CSV file
$file = $_FILES['csv']['tmp_name'];
$handle = fopen($file, "r");

// loop through the csv file and insert each row into the database
while (($data = fgetcsv($handle, 1000, ",", "'")) !== false) {
    if ($data[0]) {
        mysql_query("INSERT INTO contacts_tmp (contact_first, contact_last, contact_email) VALUES
            (
                '".addslashes($data[0])."',
                '".addslashes($data[1])."',
                '".addslashes($data[2])."'
            )
        ");
    }
}
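The loop above makes one INSERT round-trip per line, which gets slow for large files. A hedged variant that batches rows into multi-row INSERT statements (the batch size of 500 is an arbitrary assumption, not from the answer):

$rows = array();
while (($data = fgetcsv($handle, 1000, ",", "'")) !== false) {
    if (!$data[0]) continue;
    $rows[] = "('" . addslashes($data[0]) . "', '" . addslashes($data[1]) . "', '" . addslashes($data[2]) . "')";
    if (count($rows) >= 500) { // flush a full batch in a single statement
        mysql_query("INSERT INTO contacts_tmp (contact_first, contact_last, contact_email) VALUES " . implode(',', $rows));
        $rows = array();
    }
}
if ($rows) { // flush the final partial batch
    mysql_query("INSERT INTO contacts_tmp (contact_first, contact_last, contact_email) VALUES " . implode(',', $rows));
}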
Tuesday, September 6, 2022
5

There are several ways:

  1. Using csvread:
    Assuming you have N rows in the file [1]:

    a = csvread( FILENAME, 0, 1, [0 1 N-1 1 ] );
    
  2. You might also consider xlsread

    a = xlsread( FILENAME, 'B:B' );  
    

    See specific example on the xlsread doc.

  3. Another option is dlmread

    a = dlmread( FILENAME, ',', [0 1 N-1 1] );
    

[1] A nice (and fast) way to count the number of lines in the file in Matlab can be found in this answer by Rody Oldenhuis.

Saturday, September 24, 2022
 
littm
 