Hello,
I'm trying to import a database dump of roughly 300 MB into the database for one of my virtual domains.
I've tried going to Manage > Run MySQL with an uncompressed file and also with a gzipped file (it's 32 MB when gzipped), but the page keeps loading and never finishes, and the database isn't fully imported.
I also increased the php.ini parameters and tried uploading the SQL file through FTP and then running the command mysql -u USERNAME --password=PASSWORD DATABASE < FILENAME.sql, but it behaves the same way.
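In case it matters, the php.ini parameters I raised were the usual upload and execution limits; they can be checked like this (this shows the CLI PHP's values, the web server's php.ini can differ):

# standard PHP directives that limit large uploads through the web interface
php -i | grep -E 'upload_max_filesize|post_max_size|max_execution_time|memory_limit'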
Please help. Thank you.
Try uploading your MySQL dump to your server via FTP (the exact location doesn't matter).
You can then import the dump with Webmin's MySQL module, here:
Webmin > Servers > MySQL Database Server > yourdatabase > Execute SQL > Run SQL from file
Hi Nico94, I tried that already and it does the same thing: it doesn't completely import the SQL file.
If you tried it with the mysql command on the console, it's not really a Virtualmin issue; for some reason MySQL itself seems to have problems executing the SQL script. I'm not too familiar with debugging MySQL, but I guess the first step would be to check whether it logs any errors to the syslog, or whether it can write a debug/error log where problems show up.
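A quick way to check is to ask MySQL where its error log lives and follow it while re-running the import (the path below is just a common default, yours may differ):

# ask the server where it writes errors
mysql -u root -p -e "SHOW VARIABLES LIKE 'log_error';"
# watch that file while the import runs (example path)
tail -f /var/log/mysqld.log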
If that doesn't help, you could try successively splitting the huge file into multiple smaller ones, e.g. one per table, and importing them one by one until you reach the one that has problems.
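If you can still reach the source server, mysqldump can write one file per table, which makes that kind of bisection easy (USERNAME, DATABASE and the table names are placeholders):

# dump each table into its own file
mysqldump -u USERNAME -p DATABASE table1 > table1.sql
mysqldump -u USERNAME -p DATABASE table2 > table2.sql
# then import them one at a time until one fails
mysql -u USERNAME -p DATABASE < table1.sql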
Thank you Locutus, the mysqld.log isn't reporting any problem (it isn't reporting much, actually), but I tried importing the database one table at a time and found that there's one table which is not being fully imported. It's the biggest table of the database (200 MB).
Okay, you might check the tool "atop" while the problematic import is running, to see if it's not just taking a very long time.
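For example, starting it with a short refresh interval and watching whether mysqld keeps using CPU or disk:

# refresh every 2 seconds and look for mysqld in the process list
atop 2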
Otherwise you could split the biggest table further into smaller files (each data row should have its own INSERT statement) and see if that helps.
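Assuming the dump of that table already has one INSERT per line, a plain line-based split gives you pieces you can import separately (the chunk size is arbitrary, and the SET statements at the top of the dump only end up in the first piece):

# cut the big table's dump into chunks of 10000 lines
split -l 10000 bigtable.sql bigtable_part_
# import the chunks in order
for f in bigtable_part_*; do mysql -u USERNAME --password=PASSWORD DATABASE < "$f"; done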
It might also help to use different parameters when creating the dump (if that's still possible), like "use complete INSERT statements" or "use transactions". There are a number of parameters to tweak there.
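If you can re-create the dump, the mysqldump flags corresponding to those options would be something like this (all are standard mysqldump switches; USERNAME, DATABASE and bigtable are placeholders):

# one INSERT per row, with column names spelled out, and the inserts wrapped in a transaction on import
mysqldump -u USERNAME -p --skip-extended-insert --complete-insert --no-autocommit DATABASE bigtable > bigtable.sql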
I did as you said and split that file into several smaller files, and then the import finally succeeded. Thank you Locutus.