I'm trying to import a .sql file through PHP code. However, my code shows this error:

There was an error during import. Please make sure the import file is saved in the same folder as this script and check your values:

MySQL Database Name:    test
MySQL User Name:    root
MySQL Password: NOTSHOWN
MySQL Host Name:    localhost
MySQL Import Filename:  dbbackupmember.sql

And this is my code:

<?php
//ENTER THE RELEVANT INFO BELOW
$mysqlDatabaseName ='test';
$mysqlUserName ='root';
$mysqlPassword ='';
$mysqlHostName ='localhost';
$mysqlImportFilename ='dbbackupmember.sql';
//DONT EDIT BELOW THIS LINE
//Export the database and output the status to the page
$command='mysql -h' .$mysqlHostName .' -u' .$mysqlUserName .' -p' .$mysqlPassword .' ' .$mysqlDatabaseName .' < ' .$mysqlImportFilename;
exec($command,$output=array(),$worked);
switch($worked){
    case 0:
        echo 'Import file <b>' .$mysqlImportFilename .'</b> successfully imported to database <b>' .$mysqlDatabaseName .'</b>';
        break;
    case 1:
        echo 'There was an error during import. Please make sure the import file is saved in the same folder as this script and check your values:<br/><br/><table><tr><td>MySQL Database Name:</td><td><b>' .$mysqlDatabaseName .'</b></td></tr><tr><td>MySQL User Name:</td><td><b>' .$mysqlUserName .'</b></td></tr><tr><td>MySQL Password:</td><td><b>NOTSHOWN</b></td></tr><tr><td>MySQL Host Name:</td><td><b>' .$mysqlHostName .'</b></td></tr><tr><td>MySQL Import Filename:</td><td><b>' .$mysqlImportFilename .'</b></td></tr></table>';
        break;
}
?>

What am I doing wrong? The SQL file is in the same directory.

Answers

1

Warning: mysql_* extension is deprecated as of PHP 5.5.0, and has been removed as of PHP 7.0.0. Instead, either the mysqli or PDO_MySQL extension should be used. See also the MySQL API Overview for further help while choosing a MySQL API.
Whenever possible, importing a file into MySQL should be delegated to the MySQL command-line client.
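
For that route, a minimal sketch of the delegation (assuming the mysql binary is on PATH and the SQL file sits next to the script; credentials are placeholders). Note that with an empty password the -p flag must be omitted entirely, otherwise the client stops to prompt for one:

<?php
// Minimal sketch, not the poster's exact code: shell out to the mysql client.
// Assumes `mysql` is on PATH; host/user/db/file values are placeholders.
$host = 'localhost';
$user = 'root';
$pass = '';   // empty password: skip -p entirely or mysql will prompt
$db   = 'test';
$file = 'dbbackupmember.sql';

$command = 'mysql -h ' . escapeshellarg($host)
         . ' -u ' . escapeshellarg($user)
         . ($pass !== '' ? ' -p' . escapeshellarg($pass) : '')
         . ' ' . escapeshellarg($db)
         . ' < ' . escapeshellarg($file);

exec($command, $output, $exitCode);   // $output must be a plain variable, not $output=array()
echo $exitCode === 0 ? 'Import succeeded' : 'Import failed with exit code ' . $exitCode;
?>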

Here is another way to do this; try the following:

<?php

// Name of the file
$filename = 'churc.sql';
// MySQL host
$mysql_host = 'localhost';
// MySQL username
$mysql_username = 'root';
// MySQL password
$mysql_password = '';
// Database name
$mysql_database = 'dump';

// Connect to MySQL server
mysql_connect($mysql_host, $mysql_username, $mysql_password) or die('Error connecting to MySQL server: ' . mysql_error());
// Select database
mysql_select_db($mysql_database) or die('Error selecting MySQL database: ' . mysql_error());

// Temporary variable, used to store current query
$templine = '';
// Read in entire file
$lines = file($filename);
// Loop through each line
foreach ($lines as $line)
{
    // Skip it if it's a comment or an empty line (file() keeps trailing newlines)
    if (substr($line, 0, 2) == '--' || trim($line) == '')
        continue;

    // Add this line to the current segment
    $templine .= $line;
    // If it has a semicolon at the end, it's the end of the query
    if (substr(trim($line), -1, 1) == ';')
    {
        // Perform the query
        mysql_query($templine) or print('Error performing query \'<strong>' . $templine . '</strong>\': ' . mysql_error() . '<br /><br />');
        // Reset temp variable to empty
        $templine = '';
    }
}
echo "Tables imported successfully";
?>

This works for me.
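
Since the mysql_* extension is removed in PHP 7+, here is the same line-by-line approach as a hedged PDO sketch (DSN, credentials, and filename are the placeholders from above):

<?php
// Hedged PDO sketch of the same approach; connection values are placeholders.
$pdo = new PDO('mysql:host=localhost;dbname=dump;charset=utf8mb4', 'root', '');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$templine = '';
foreach (file('churc.sql') as $line) {
    // Skip comments and empty lines (file() keeps trailing newlines)
    if (substr($line, 0, 2) == '--' || trim($line) == '') {
        continue;
    }
    $templine .= $line;
    // A trailing semicolon ends the current statement
    if (substr(trim($line), -1) == ';') {
        $pdo->exec($templine);   // PDO::exec runs one statement at a time
        $templine = '';
    }
}
echo "Tables imported successfully";
?>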

Monday, September 5, 2022
2

I would say just build it yourself. You can set it up like this:

$query = "INSERT INTO x (a,b,c) VALUES ";
foreach ($arr as $item) {
    $query .= "('" . $item[0] . "','" . $item[1] . "','" . $item[2] . "'),";
}
$query = rtrim($query, ","); // remove the trailing comma
// execute query

Don't forget to escape quotes where necessary (or, better, use a parameterized query).

Also, be careful not to send too much data at once; you may have to execute the inserts in chunks instead of in a single statement.
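
A hedged sketch of that chunking with PDO (the table and columns x, a, b, c follow the snippet above; the chunk size of 500 is an arbitrary choice, and $pdo is assumed to be an existing connection):

<?php
// Hedged sketch: multi-row INSERTs in chunks, with bound parameters.
$chunkSize = 500;   // arbitrary; tune to your max_allowed_packet
foreach (array_chunk($arr, $chunkSize) as $chunk) {
    // One "(?,?,?)" group per row in this chunk
    $placeholders = implode(',', array_fill(0, count($chunk), '(?,?,?)'));
    $stmt = $pdo->prepare("INSERT INTO x (a,b,c) VALUES $placeholders");

    // Flatten the rows into a single parameter list
    $params = [];
    foreach ($chunk as $item) {
        $params[] = $item[0];
        $params[] = $item[1];
        $params[] = $item[2];
    }
    $stmt->execute($params);
}
?>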

Saturday, November 5, 2022
2

The function you're looking for is find_in_set:

 select * from ... where find_in_set($word, pets)

For multi-word queries you'll need to test each word and combine the tests with AND (or OR):

  where find_in_set($word1, pets) AND find_in_set($word2, pets) etc 
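
A hedged PHP sketch of the two-word case with bound parameters ($pdo, the table name profiles, and the pets column are illustrative assumptions):

<?php
// Hedged sketch: rows whose comma-separated `pets` list contains both words.
// FIND_IN_SET() returns the 1-based position, or 0 when absent, so it is truthy on a match.
$stmt = $pdo->prepare(
    'SELECT * FROM profiles
     WHERE FIND_IN_SET(?, pets) AND FIND_IN_SET(?, pets)'
);
$stmt->execute([$word1, $word2]);
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
?>
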
Wednesday, August 17, 2022
2

I hope MySQL adds native functionality for this in the near future.

One option (not a simple query) is a script like the following (adapt as needed); depending on the number of items, it may have performance problems.

File: /path/to/file/loadsetProfile.json:

[
  {
    "executionDateTime":"2017-07-07 15:21:15",
    "A":1,
    "B":1
  },
  {
    "executionDateTime":"2017-07-07 15:21:15",
    "A":2,
    "B":2
  },
  {
    "executionDateTime":"2017-07-07 15:21:15",
    "A":3,
    "B":3
  },
  {
    "executionDateTime":"2017-07-07 15:21:15",
    "A":4,
    "B":4
  }
]

MySQL Command-Line:

mysql> SELECT VERSION();
+-----------+
| VERSION() |
+-----------+
| 5.7.18    |
+-----------+
1 row in set (0.00 sec)

mysql> DROP PROCEDURE IF EXISTS `import_from_json`;
Query OK, 0 rows affected (0.00 sec)

mysql> DROP FUNCTION IF EXISTS `uuid_to_bin`;
Query OK, 0 rows affected (0.00 sec)

mysql> DROP TABLE IF EXISTS `temp_my_table`, `my_table`;
Query OK, 0 rows affected (0.00 sec)

mysql> CREATE TABLE IF NOT EXISTS `temp_my_table` (
    ->   `id` BINARY(16) NOT NULL PRIMARY KEY,
    ->   `content` JSON NOT NULL
    -> );
Query OK, 0 rows affected (0.00 sec)

mysql> CREATE TABLE IF NOT EXISTS `my_table` (
    ->   `executionDateTime` TIMESTAMP,
    ->   `A` BIGINT UNSIGNED,
    ->   `B` BIGINT UNSIGNED
    -> );
Query OK, 0 rows affected (0.00 sec)

mysql> CREATE FUNCTION `uuid_to_bin` (`id` VARCHAR(36))
    -> RETURNS BINARY(16)
    -> DETERMINISTIC
    ->   RETURN UNHEX(REPLACE(`id`, '-', ''));
Query OK, 0 rows affected (0.00 sec)

mysql> DELIMITER //

mysql> CREATE PROCEDURE `import_from_json`(`_id` VARCHAR(36))
    -> BEGIN
    ->   DECLARE `_id_current_json` BINARY(16) DEFAULT `uuid_to_bin`(`_id`);
    ->   DECLARE `_items_length`,
    ->           `_current_item` BIGINT UNSIGNED DEFAULT 0;
    ->   DECLARE `_content` JSON DEFAULT (SELECT `content`
    ->                                    FROM `temp_my_table`
    ->                                    WHERE `id` = `_id_current_json`);
    -> 
    ->   IF JSON_VALID(`_content`) THEN
    ->     SET `_items_length` := JSON_LENGTH(`_content`),
    ->         @`insert_import_from_json` := NULL;
    ->     WHILE `_current_item` < `_items_length` DO
    ->       SET @`insert_import_from_json` := CONCAT('
    '>         INSERT INTO `my_table` (
    '>            `executionDateTime`,
    '>            `A`,
    '>            `B`
    '>         )
    '>         SELECT
    '>           `content` ->> \'$[', `_current_item`, '].executionDateTime\',
    '>           `content` ->> \'$[', `_current_item`, '].A\',
    '>           `content` ->> \'$[', `_current_item`, '].B\'
    '>         FROM `temp_my_table`
    '>         WHERE `id` = \'', `_id_current_json`, '\'
    '>       ');
    ->       PREPARE `stmt` FROM @`insert_import_from_json`;
    ->       EXECUTE `stmt`;
    ->       SET `_current_item` := `_current_item` + 1;
    ->     END WHILE;
    ->
    ->     IF `_current_item` > 0 THEN
    ->       SET @`insert_import_from_json` := NULL;
    ->       DEALLOCATE PREPARE `stmt`;
    ->     END IF;
    ->   END IF;
    -> END//
Query OK, 0 rows affected (0.00 sec)

mysql> DELIMITER ;

mysql> SET @`UUID` := UUID();
Query OK, 0 rows affected (0.00 sec)

mysql> LOAD DATA LOCAL INFILE '/path/to/file/loadsetProfile.json' 
    -> INTO TABLE `temp_my_table`
    -> LINES TERMINATED BY '\r'
    -> (`content`)
    -> SET `id` = `uuid_to_bin`(@`UUID`);
Query OK, 1 row affected (0.00 sec)
Records: 1  Deleted: 0  Skipped: 0  Warnings: 0

mysql> CALL `import_from_json`(@`UUID`);
Query OK, 0 rows affected (0.00 sec)

mysql> SELECT
    ->   `executionDateTime`,
    ->   `A`,
    ->   `B`
    -> FROM
    ->   `my_table`;
+---------------------+------+------+
| executionDateTime   | A    | B    |
+---------------------+------+------+
| 2017-07-07 15:21:15 |    1 |    1 |
| 2017-07-07 15:21:15 |    2 |    2 |
| 2017-07-07 15:21:15 |    3 |    3 |
| 2017-07-07 15:21:15 |    4 |    4 |
+---------------------+------+------+
4 rows in set (0.01 sec)
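
As a footnote to the hope above: MySQL 8.0 later added JSON_TABLE, which expands a JSON array into rows natively, without the stored procedure. A hedged sketch against the same table and column names (wrapped in PDO to match the other snippets; $pdo is an assumed connection):

<?php
// Hedged sketch, MySQL 8.0+ only: JSON_TABLE replaces the import_from_json procedure.
// Table/column names follow the transcript above.
$pdo->exec("
    INSERT INTO `my_table` (`executionDateTime`, `A`, `B`)
    SELECT jt.`executionDateTime`, jt.`A`, jt.`B`
    FROM `temp_my_table` t,
         JSON_TABLE(t.`content`, '$[*]' COLUMNS (
             `executionDateTime` TIMESTAMP       PATH '$.executionDateTime',
             `A`                 BIGINT UNSIGNED PATH '$.A',
             `B`                 BIGINT UNSIGNED PATH '$.B'
         )) AS jt
");
?>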
Friday, September 30, 2022
 
5

I'd suggest using the ETL (extract, transform, load) tool from the Pentaho Business Intelligence package. It has a bit of a learning curve, but it will do exactly what you're looking for. Their ETL tool is called Kettle, and it's extremely powerful once you get the hang of it.

There are two versions of Pentaho: an enterprise version with a free trial, and a free community version. The community version is more than capable, but you might give the enterprise version a test ride too.

Here are some links:

Pentaho Community Edition Site

Kettle Site

Pentaho Enterprise Site

Update: Multiple table outputs

One of the key steps in your transformation will be a Combination lookup/update. This step checks a given table to see whether a record from your data stream already exists and inserts a new record if it does not. Either way, it appends the key field from that record to your data stream, and as you continue you'll use these keys as foreign keys when importing data into related tables.

Wednesday, November 30, 2022
 