A bit of context
Before PostgreSQL 9.4 (released December 2014) introduced jsonb, the only way to store JSON data was as a string: both the json and text types in PostgreSQL store strings.
Also, django.contrib.postgres.fields.JSONField did not exist before Django 1.9 (released December 2015). You had to use third-party libraries to handle JSON in your models (or make your own field).
We chose django-jsonfield.
But now it's 2018, we have PostgreSQL 10, our systems are running with Python 3.6 and Django 1.11… and django-jsonfield isn't compatible with newer versions of Django and Python, so we couldn't upgrade to Django 2.0.
It was time to tackle the issue!
Our constraints
Our systems run 24/7. No disruptions in service when we can prevent it.
This means that we cannot change the code and change the column format all at once. If we did, we would have some servers with the old code running at the same time as servers with the new code. This rarely works with schema migrations.
Our basic migration scheme in similar cases is:
- step 1:
  - create a new_field,
  - edit the code to write to both fields but read from old_field (see the sketch after this list),
  - migrate all existing data after deploying the new code (in an empty Django migration);
- step 2:
  - read from new_field,
  - drop old_field.
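As an illustration, here is a minimal sketch of step 1, with hypothetical model and field names (this is not our actual code):

from django.db import models


class Order(models.Model):
    old_field = models.TextField()
    new_field = models.TextField(null=True)  # will replace old_field in step 2

    def save(self, *args, **kwargs):
        # Dual write: keep new_field in sync while all readers still use old_field.
        self.new_field = self.old_field
        super().save(*args, **kwargs)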
But this takes 2 releases, and you sometimes have to edit many lines of code to swap the uses of old_field for new_field everywhere. And then many of your fields end up named new_*, which is a bit frustrating when you are as perfectionist as we are!
That's a lot of work, and after that we would still have import jsonfield in our migrations, so we would need to squash migrations to clean that up, which is tedious and takes a long time. That would have delayed our migration to Django 2.
Since we did not want to wait too long, we tried another approach.
How we did it
The nice thing is that writing JSON in a PostgreSQL database uses the same raw SQL whether the column type is jsonb, json or text. This means that we don't need the old_field/new_field dance of our basic migration scheme.
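For instance, here is a minimal sketch (table and column names are hypothetical) showing that the same parameterized INSERT works in all three cases: the JSON payload is sent as a string, and PostgreSQL casts it to the column type.

import json

from django.db import connection

payload = json.dumps({"status": "ok", "attempts": 3})
with connection.cursor() as cursor:
    # Works identically whether my_column is text, json or jsonb.
    cursor.execute("INSERT INTO my_table (my_column) VALUES (%s)", [payload])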
The only thing we need is a Django model field that handles both jsonb and text formatted columns when reading.
So our migration scheme becomes:
- deploy code with a MigrationJSONField handling both text and jsonb columns;
- change column type in the database;
- clean the code for the next release to use the default JSONField that does not handle text columns.
MigrationJSONField
We used a class inheriting from django.contrib.postgres.fields.JSONField with a specific decoder:
import json

from django.contrib.postgres import fields as pg_fields


class MigrationJSONDecoder(json.JSONDecoder):
    """Handle JSON stored as text or jsonb in PostgreSQL."""

    def decode(self, obj, *args, **kwargs):
        obj = super().decode(obj, *args, **kwargs)
        return self.post_decode(obj)

    @classmethod
    def post_decode(cls, obj):
        if isinstance(obj, str):
            obj = json.loads(obj)
        return obj


class MigrationJSONField(pg_fields.JSONField):
    """Handle migration from jsonfield.JSONField to pg_fields.JSONField."""

    def __init__(self, **kwargs):
        self.decoder = MigrationJSONDecoder
        super().__init__(**kwargs)

    def from_db_value(self, value, _expression, _connection, _context):
        return self.decoder.post_decode(value)
This way, whenever the decoder encounters a string, it tries to parse it as a JSON value.
Beware: this will not work if you store data that is not JSON, such as plain text or just a text-formatted number. Decoding plain text will raise a json.JSONDecodeError, but decoding a text-formatted number will return a number, which is different from the string you initially stored. The same goes for the "true" string. As Django has TextField, IntegerField and FloatField, I don't see any reason to use a JSONField to store this kind of data.
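A quick illustration of that pitfall (a sketch, not taken from our code base):

>>> MigrationJSONDecoder.post_decode('{"a": 1}')    # JSON stored as text
{'a': 1}
>>> MigrationJSONDecoder.post_decode('42')          # was the string '42'
42
>>> MigrationJSONDecoder.post_decode('plain text')  # not valid JSON
Traceback (most recent call last):
    ...
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)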
If you used the default django-jsonfield encoder, or a custom one, you need to add it in the __init__ method:
def __init__(self, **kwargs):
    self.decoder = MigrationJSONDecoder
    kwargs['encoder'] = myJSONEncoder
    super().__init__(**kwargs)
Migrating the data in PostgreSQL
Migrating from text to jsonb with PostgreSQL is very easy:
ALTER TABLE my_table ALTER COLUMN my_column TYPE JSONB USING my_column::JSONB
Putting everything in our code
Before running the PostgreSQL data migration, the MigrationJSONField behaves exactly like the old JSONField. So we replaced all jsonfield.JSONField uses in migrations and models with MigrationJSONField, and removed django-jsonfield from the requirements.
Then we added a migration for every model that we edited:
from django.db import migrations

sql = "ALTER TABLE t ALTER COLUMN col TYPE JSONB USING col::JSONB"
reverse_sql = "ALTER TABLE t ALTER COLUMN col TYPE TEXT USING col::TEXT"


class Migration(migrations.Migration):

    dependencies = [
        ('app_name', 'previous_migration_name'),
    ]

    operations = [
        migrations.RunSQL(
            sql=sql,
            reverse_sql=reverse_sql,
        ),
    ]
The tricky parts
Big tables
The first issue is that running this data migration on a big table (over 10 million entries) takes a long time, so the migration might exceed the default connection timeout. It might even need to be run during off-peak periods.
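If the limit you hit is PostgreSQL's statement_timeout, one possible mitigation (an assumption on our side, not something the original migration required verbatim) is to lift it for just that migration: SET LOCAL only applies to the current transaction, which is the one Django wraps an atomic migration in.

from django.db import migrations


class Migration(migrations.Migration):

    atomic = True  # the default; guarantees SET LOCAL covers the ALTER TABLE

    dependencies = [
        ('app_name', 'previous_migration_name'),
    ]

    operations = [
        migrations.RunSQL(
            sql=[
                "SET LOCAL statement_timeout = 0",  # no timeout inside this transaction
                "ALTER TABLE t ALTER COLUMN col TYPE JSONB USING col::JSONB",
            ],
            reverse_sql="ALTER TABLE t ALTER COLUMN col TYPE TEXT USING col::TEXT",
        ),
    ]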
Null values
Another issue is that django-jsonfield allows you to store null values even if the field isn't nullable. That's because a null JSON value becomes the empty string '', which can be stored in a non-nullable text column.
We had to fix the code that allowed us to write these empty strings, and we added a migration to convert all empty strings into empty dicts ('{}') before the raw SQL migration.
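Such a cleanup migration can look like this (table, column and migration names are hypothetical):

from django.db import migrations

sql = "UPDATE t SET col = '{}' WHERE col = ''"


class Migration(migrations.Migration):

    dependencies = [
        ('app_name', 'previous_migration_name'),
    ]

    operations = [
        # Replace empty strings with empty JSON objects so the cast to jsonb succeeds.
        migrations.RunSQL(sql=sql, reverse_sql=migrations.RunSQL.noop),
    ]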
Null characters
Last issue: we had some JSON that contained strings with null characters (\x00) in them, and PostgreSQL could not convert the column. Since we only had 10 entries with null characters, and since these were in some logs from a web service that a partner called a long time ago, we just edited the strings in a Django shell and launched the data migration again.
Conclusion
All in all, this migration went relatively well, considering the number of impacted columns across all our systems (46). That was our last showstopper before migrating to Django 2, which will be our next major upgrade.