
Django Migrate MySQL to Postgres

Published by h4kor

I've recently migrated my tools project from an old server to a new one. In the process I decided to change the database from MySQL to PostgreSQL. To do so I used this amazing guide from calazan.com.

As the guide was written for Django 1.6, some minor changes were necessary when working with newer versions (I used Django 3.1.4).

Setup for migration

Steps 1 + 2 work as described in the guide above. A new database is created and added to settings.py.

DATABASES = {
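    # existing MySQL database, still the default during the migration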
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'dbname',
        'USER': 'dbuser',
        'PASSWORD': 'dbpass',
        'HOST': 'mysql.example.com',
        'PORT': '',
    },
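    # new PostgreSQL database we are migrating to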
    'postgresql': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'dbname',
        'USER': 'dbuser',
        'PASSWORD': 'dbpass',
        'HOST': 'postgresql.example.com',
        'PORT': '',
    }
}

Apply migrations to new database

python manage.py migrate --database=postgresql

syncdb is no longer available in newer Django versions; it was replaced by migrate. The --no-initial-data flag is also gone, and I didn't find a replacement for it.

Clean up the new database

Some migrations create initial data in your database (for example content types and permissions). This will conflict with the existing data we are about to import. I removed these rows by hand, but the command python manage.py sqlflush --database=postgresql will print the SQL statements needed to empty all tables in the new database.
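If you want to execute those statements directly, one option is to pipe them into dbshell (a sketch, assuming your PostgreSQL client accepts SQL on stdin):

python manage.py sqlflush --database=postgresql | python manage.py dbshell --database=postgresql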

Copy data to new database

In order to copy the data we use dumpdata and loaddata.

When dumping the data it is important to use the flag --natural-foreign. It makes dumpdata serialize foreign keys as natural keys (for example, content types as app label plus model name instead of primary key); without it, the import fails for data referencing rows whose primary keys differ between the two databases.

python manage.py dumpdata --all --natural-foreign > dump.json
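A common variant, not from the original guide, is to additionally exclude the data that migrate recreates anyway, so it cannot clash at load time:

python manage.py dumpdata --all --natural-foreign --exclude contenttypes --exclude auth.permission > dump.json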

The dumped data can then be loaded into the new database. This can take a long time, so run the command as close to the database as possible, ideally on the same machine. I first started it on my local machine; after 30 minutes I decided it took too long, uploaded the dump to the server running the database, and the command finished there in about a minute. My dump file was only 16 MB.

python manage.py loaddata dump.json --database=postgresql

Adjust config

Now all data is available in the new database. Adjust the settings to use PostgreSQL as the default database.
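The final DATABASES setting could look like this (same hypothetical credentials as above; the old MySQL entry can be dropped once the migration is verified):

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'dbname',
        'USER': 'dbuser',
        'PASSWORD': 'dbpass',
        'HOST': 'postgresql.example.com',
        'PORT': '',
    }
}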