Ketan Patel    19-Apr-2013

When you are working on a project with a large database, you might need to back up the current database before applying new modifications. You may then receive an error while importing that large database because the import exceeded the timeout. So, how do you import a large database in phpMyAdmin?


Solution 1: SQL file compression

This is a very basic and simple solution, useful for .sql files that are not too large. You can compress the .sql file using bzip2, gzip, zip, or any other compression method phpMyAdmin accepts. Since a .sql file is plain text, you can achieve noticeable compression. If you still run into the size limit, try the other options below.
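As a quick illustration of how well plain-text SQL compresses, the sketch below generates a stand-in dump ("backup.sql" is a placeholder name, not a real file of yours) and gzips it:

```shell
# Generate a stand-in dump file ("backup.sql" is a placeholder name).
for i in $(seq 1 1000); do
  echo "INSERT INTO users VALUES ($i, 'user$i');"
done > backup.sql

# Compress to backup.sql.gz while keeping the original for comparison.
gzip -c backup.sql > backup.sql.gz

# Compare byte counts -- repetitive SQL text usually compresses very well.
wc -c backup.sql backup.sql.gz
```

You can then upload backup.sql.gz directly in phpMyAdmin's Import tab instead of the uncompressed file.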


Solution 2: Increase PHP file size limit

Open your php.ini file, update the following lines, and save the file.

max_execution_time = 1800    ; 30 minutes
max_input_time = 1800        ; 30 minutes
memory_limit = 2048M         ; 2 GB
upload_max_filesize = 2048M  ; 2 GB
post_max_size = 2048M        ; 2 GB

Now restart your web server and try importing the .sql file again. Don't forget to restore the original values once you are done uploading your .sql file.
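If you prefer to script the change, a sed one-liner per directive works on a typical Linux setup. The sketch below edits a stand-in php.ini created on the spot (locate your real one with `php --ini` and back it up first); the same pattern covers the remaining directives:

```shell
# Create a stand-in php.ini so the sed edits below are demonstrable.
# On a real server, point sed at your actual php.ini instead.
printf 'max_execution_time = 30\nupload_max_filesize = 2M\npost_max_size = 8M\n' > php.ini

# Raise the limits in place (GNU sed; add a backup suffix to -i if you prefer).
sed -i -e 's/^max_execution_time = .*/max_execution_time = 1800/' \
       -e 's/^upload_max_filesize = .*/upload_max_filesize = 2048M/' \
       -e 's/^post_max_size = .*/post_max_size = 2048M/' php.ini

grep upload_max_filesize php.ini   # → upload_max_filesize = 2048M
```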


Solution 3: Using mysql command in terminal

Upload your .sql file to the web server and run the following command in your terminal:

mysql -u <username> -p <database> < /path/your-file.sql

<username> - Your MySQL username. e.g. root
<database> - Database you want to import
/path/your-file.sql - Full path to your .sql file

When prompted, provide your MySQL password and the import will run to completion.
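This combines nicely with Solution 1: mysql reads from standard input, so a gzipped dump can be streamed in without decompressing it on disk first. The sketch below builds a tiny stand-in dump; the commented line shows the real command (the user, database name, and file path there are assumptions to substitute with your own):

```shell
# Build a tiny stand-in dump and compress it (as in Solution 1).
printf 'CREATE TABLE t (id INT);\nINSERT INTO t VALUES (1);\n' > dump.sql
gzip -f dump.sql                     # produces dump.sql.gz

# On a real server you would stream it straight into mysql:
#   gunzip -c dump.sql.gz | mysql -u root -p mydatabase
# Here we only verify the stream decompresses back to valid SQL text:
gunzip -c dump.sql.gz | head -n 1    # → CREATE TABLE t (id INT);
```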


Solution 4: Using BigDump

What is BigDump?

When you need to replicate or move a database, importing a large SQL dump can be difficult if you only have web access to the MySQL server. phpMyAdmin has certain shortcomings when importing files of more than a modest size. In such cases BigDump is a helpful utility: it splits the import into small, staggered sessions so no single request hits the server's limits.

How to use BigDump?

  • Download BigDump from its official page.
  • Extract the file and open the bigdump.php file in a text editor.
  • Configure access details:

 // Database configuration
 $db_server   = "localhost";
 $db_name     = "your_DB_name"; 
 $db_username = "your_DB_user_name";
 $db_password = "your_DB_password"; 
  • Find the line below and change its value to 4500. This value sets the number of lines executed per import session.
$linespersession    = 3000;
  • Find the line below and change its value to 4000. This value sets how many lines may be considered to be one query (except text lines).
$max_query_lines = 300;
  • Upload both your SQL file and bigdump.php to the same directory on your webserver.
  • Now browse to bigdump.php in your browser and click the import link.
  • Wait a few minutes and you will see a confirmation screen similar to the one below.

[Screenshot: BigDump confirmation screen after importing a large file in phpMyAdmin]


About Ketan Patel

A web developer with 7+ years of experience in web development. He has worked with Laravel-Lumen, CakePHP, CodeIgniter, and OpenCart to build sites for startups and small businesses. Apart from blogging, he likes reading novels, listening to music, and surfing the net.