Gedmo Translations in Symfony 4

Today I had to install Gedmo Translations into a Symfony 4 app and I ran into some trouble, so after fixing the problems I thought I’d write them down in case someone else can benefit from it 🙂

Let’s start from the beginning.

Create a Symfony 4 project

Please use Docker! 🙂 More info here, in the Docker for Symfony 4 post.

Here’s what my docker-compose.yml looks like:

#docker-compose.yml
version: "3.1"

volumes:
    db-data:

services:
    mysql:
      image: mysql:5.6
      container_name: ${PROJECT_NAME}-mysql
      working_dir: /application
      volumes:
        - db-data:/application
      environment:
        - MYSQL_ROOT_PASSWORD=docker_root
        - MYSQL_DATABASE=gedmoapp_db
        - MYSQL_USER=gedmoapp_user
        - MYSQL_PASSWORD=gedmoapp_pw
      ports:
        - "8306:3306"

    webserver:
      image: nginx:alpine
      container_name: ${PROJECT_NAME}-webserver
      working_dir: /application
      volumes:
        - .:/application
        - ./docker/nginx/nginx.conf:/etc/nginx/conf.d/default.conf
      ports:
        - "8000:80"

    php-fpm:
      build: docker/php-fpm
      container_name: ${PROJECT_NAME}-php-fpm
      working_dir: /application
      volumes:
        - .:/application
        - ./docker/php-fpm/php-ini-overrides.ini:/etc/php/7.2/fpm/conf.d/99-overrides.ini

Installation of Gedmo’s bundle

composer require stof/doctrine-extensions-bundle

Configuration

Now we need to update our configuration file, so in the doctrine.yaml, under the doctrine key, we should go from this

#doctrine.yaml
doctrine:
#...
    orm:
        auto_generate_proxy_classes: '%kernel.debug%'
        naming_strategy: doctrine.orm.naming_strategy.underscore
        auto_mapping: true
        mappings:
            App:
                is_bundle: false
                type: annotation
                dir: '%kernel.project_dir%/src/Entity'
                prefix: 'App\Entity'
                alias: App

To this

#doctrine.yaml
doctrine:
#...
    orm:
        auto_generate_proxy_classes: '%kernel.debug%'
#        naming_strategy: doctrine.orm.naming_strategy.underscore
#        auto_mapping: true
        entity_managers:
            default:
#                connection: default
                naming_strategy: doctrine.orm.naming_strategy.underscore
                auto_mapping: true
                mappings:
                    App:
                        is_bundle: false
                        type: annotation
                        dir: '%kernel.project_dir%/src/Entity'
                        prefix: 'App\Entity'
                        alias: App
                    gedmo_translatable:
                        type: annotation
                        prefix: Gedmo\Translatable\Entity
                        dir: "%kernel.root_dir%/../vendor/gedmo/doctrine-extensions/lib/Gedmo/Translatable/Entity"
                        alias: GedmoTranslatable # (optional) it will default to the name set for the mapping
                        is_bundle: false
                    gedmo_translator:
                        type: annotation
                        prefix: Gedmo\Translator\Entity
                        dir: "%kernel.root_dir%/../vendor/gedmo/doctrine-extensions/lib/Gedmo/Translator/Entity"
                        alias: GedmoTranslator # (optional) it will default to the name set for the mapping
                        is_bundle: false
#                    gedmo_loggable:
#                        type: annotation
#                        prefix: Gedmo\Loggable\Entity
#                        dir: "%kernel.root_dir%/../vendor/gedmo/doctrine-extensions/lib/Gedmo/Loggable/Entity"
#                        alias: GedmoLoggable # (optional) it will default to the name set for the mapping
#                        is_bundle: false
                    gedmo_tree:
                        type: annotation
                        prefix: Gedmo\Tree\Entity
                        dir: "%kernel.root_dir%/../vendor/gedmo/doctrine-extensions/lib/Gedmo/Tree/Entity"
                        alias: GedmoTree # (optional) it will default to the name set for the mapping
                        is_bundle: false

Also, in our stof_doctrine_extensions.yaml, let’s add the following configuration for the doctrine extensions.

#stof_doctrine_extensions.yaml
stof_doctrine_extensions:
    default_locale: en_US
    orm:
        default:
            tree: true
            translatable: true
            sluggable: true

Now, if we try to update our schema

php bin/console doc:sch:update --dump-sql
The following SQL statements will be executed:

     CREATE TABLE ext_translations (id INT AUTO_INCREMENT NOT NULL, locale VARCHAR(8) NOT NULL, object_class VARCHAR(255) NOT NULL, field VARCHAR(32) NOT NULL, foreign_key VARCHAR(64) NOT NULL, content LONGTEXT DEFAULT NULL, INDEX translations_lookup_idx (locale, object_class, foreign_key), UNIQUE INDEX lookup_unique_idx (locale, object_class, field, foreign_key), PRIMARY KEY(id)) DEFAULT CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci ENGINE = InnoDB ROW_FORMAT = DYNAMIC;

Looks about right, so let’s hit it!

php bin/console doc:sch:update --force

And we get the following error

Updating database schema...


In AbstractMySQLDriver.php line 126:
                                                                                                                                
  An exception occurred while executing 'ALTER TABLE ext_translations CHANGE object_class object_class VARCHAR(255) NOT NULL':  
                                                                                                                                
  SQLSTATE[42000]: Syntax error or access violation: 1071 Specified key was too long; max key length is 767 bytes               
                                                                                                                                

In PDOConnection.php line 109:
                                                                                                                   
  SQLSTATE[42000]: Syntax error or access violation: 1071 Specified key was too long; max key length is 767 bytes  
                                                                                                                   

In PDOConnection.php line 107:
                                                                                                                   
  SQLSTATE[42000]: Syntax error or access violation: 1071 Specified key was too long; max key length is 767 bytes

Now, after looking for this issue with Symfony and Gedmo Translations on the internet, I found this issue about Doctrine and this other one, and the fix here.

As davidbehler and javiereguiluz explain in the posts,

the cause of this is that Symfony advises you to use utf8mb4_general_ci/utf8mb4 as collation/charset for your database. utf8mb4 takes 4 bytes per char, meaning a 255 char field needs 1020 bytes for an index.

And so it turns out that MySQL 5.6 (with InnoDB’s default settings) has a max index key length of 767 bytes, which leaves us with three options:

1. Decrease the field length to 191 (191 chars × 4 bytes = 764 bytes, which fits under the 767-byte limit). However, we would have to override Gedmo’s translations bundle, as it uses 255-char fields everywhere…

2. We could change the charset encoding to utf8, which uses at most 3 bytes per char (255 × 3 = 765 bytes), like so (thanks to Kuba Florczuk for the configuration):

#doctrine.yaml
doctrine:
    dbal:
        # configure these for your database server
        driver: 'pdo_mysql'
        server_version: '5.6'
        charset: utf8
        default_table_options:
            charset: utf8
            collate: utf8_unicode_ci

        url: '%env(resolve:DATABASE_URL)%'
#...

3. We could upgrade to MySQL 5.7, which allows larger index key prefixes (up to 3072 bytes with the DYNAMIC row format) by default, like so:

#docker-compose.yml
services:
    mysql:
      image: mysql:5.7
#...
#doctrine.yaml
doctrine:
    dbal:
        # configure these for your database server
        driver: 'pdo_mysql'
        server_version: '5.7'
        charset: utf8mb4
        default_table_options:
            charset: utf8mb4
            collate: utf8mb4_unicode_ci

        url: '%env(resolve:DATABASE_URL)%'
#...

All of these solutions work gracefully. Hope this helps, happy coding! 🙂

Gedmo’s Translations Documentation

StofDoctrineExtensionsBundle on Symfony’s official website.
Doctrine’s extensions GitHub doc.
Translatable’s GitHub doc.

How to build a REST API with Django and JWT Auth

It’s been a while since I wrote anything in the blog.

In this short tutorial I’ll explain how to build an API app with Django, using JWT (JSON Web Token) to identify users.

Tools and libraries we’ll use:

Docker
Django 1.11
Python 3
Django REST Framework
Django REST Framework JWT (djangorestframework-jwt)

So let’s start by creating a Django project with Docker. If you still don’t know how to do that, there’s more info on how to create a Django project with Docker here.

Our requirements.txt would look something like this:

django==1.11
mysqlclient
djangorestframework
djangorestframework-jwt
requests

*Note: The requests library is just used to make HTTP requests (GET, POST…) from our Python code.

For this example, I’ve named the project ‘apiskeleton’ and the app ‘apiapp’.

Step1: Django’s REST Framework

Once we have created the project and the app, let’s register our app and the REST framework app in our settings.

#settings.py

INSTALLED_APPS = [
    # ...
    'apiapp',
    'rest_framework',
    # ...
]

Great, now we’re ready to create some logic in our app.

First off, if you haven’t done so already, create a separate User model. This is a best practice for Django.

#models/user.py
from django.db import models
from django.contrib.auth.models import AbstractUser


class User(AbstractUser):
    pass

Update the settings.py

#settings.py

# Custom User model
AUTH_USER_MODEL = 'apiapp.User'

Now, let’s create a serializer for our User model. The serializer will allow us to convert a model instance or queryset to a JSON content type. More info on serializers here.

Our serializer would look something like this:

#serializers/user.py
from rest_framework import serializers
from apiapp.models import User

class UserSerializer(serializers.Serializer):
    id = serializers.IntegerField(read_only=True)
    username = serializers.CharField(required=True, allow_blank=False, max_length=100)
    password = serializers.CharField(required=True, write_only=True)
    email = serializers.CharField(required=True, allow_blank=True, max_length=100)
    is_staff = serializers.BooleanField(required=False, default=False)
    is_superuser = serializers.BooleanField(required=False, default=False)

    def create(self, data):
        """
        Create and return a new `User` instance, given the validated data.
        """
        instance = User.objects.create(
            username=data.get('username'),
            email=data.get('email'),
            is_staff=data.get('is_staff'),
            is_superuser=data.get('is_superuser'),
        )
        instance.set_password(data.get('password'))
        instance.save()
        return instance

    def update(self, instance, data):
        """
        Update and return an existing `User` instance, given the validated data.
        """
        instance.username = data.get('username', instance.username)
        instance.email = data.get('email', instance.email)
        instance.is_staff = data.get('is_staff', instance.is_staff)
        instance.is_superuser = data.get('is_superuser', instance.is_superuser)
        instance.set_password(data.get('password'))
        instance.save()
        return instance 

It’s very simple, I know, but I like the readability here! There are probably better ways to write your serializers, for instance by extending serializers.ModelSerializer (more info on this here), but that’s not the aim of this tutorial; I’m trying to keep things as easy and readable as possible.
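
For reference, here’s a rough, untested sketch of what the ModelSerializer approach could look like. This is just an illustration, not part of the tutorial’s code; it assumes the same User model and would replace the hand-written fields above.

#serializers/user.py (alternative sketch using ModelSerializer, illustrative only)
from rest_framework import serializers
from apiapp.models import User


class UserModelSerializer(serializers.ModelSerializer):
    # write_only so the hashed password is never echoed back in responses
    password = serializers.CharField(write_only=True)

    class Meta:
        model = User
        fields = ('id', 'username', 'password', 'email', 'is_staff', 'is_superuser')

    def create(self, validated_data):
        # set_password hashes the password instead of storing it in plain text
        password = validated_data.pop('password')
        user = User(**validated_data)
        user.set_password(password)
        user.save()
        return user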

Now let’s add a GET API route to list all users, something along the lines of /user/. Since it lists all users, only an admin should be allowed to use it, right? For now, I’ll just call it api_admin_user_index, so I know this endpoint is for admins only. Later on we’ll secure these.

#urls.py
from django.conf.urls import url

from apiapp.views import user  # assuming the views live in apiapp/views/user.py

urlpatterns = [
    url(r'^user/$', user.api_admin_user_index, name='api_admin_user_index'),
]

Now, let’s create the view for this url:

#views/user.py
from rest_framework.decorators import api_view
from rest_framework.response import Response
from apiapp.models import User
from apiapp.serializers.user import UserSerializer

@api_view(['GET'])
def api_admin_user_index(request):
    """
    get:
    List all users.
    """
    serializer = UserSerializer(User.objects.all(), many=True)
    return Response(serializer.data)

The first parameter of the serializer is the queryset, and the second parameter, many=True, tells the serializer it is serializing a list. More info on this in the BaseSerializer class from Django REST Framework.

Now, if we open a browser and hit this route, we should see a list of 0 users, but the route works, right? Success!

Now let’s create some users. We should add the POST verb in our view to do that, so it would look something like this:

#views/user.py
from rest_framework import status
from rest_framework.decorators import api_view
from rest_framework.response import Response
from apiapp.models import User
from apiapp.serializers.user import UserSerializer
from apiapp.utils.jsonreader import JsonReader


@api_view(['GET', 'POST'])
def api_admin_user_index(request):
    """
    get:
    List all users.
    post:
    Create new user.
    """
    if request.method == 'GET':
        serializer = UserSerializer(User.objects.all(), many=True)
        return Response(serializer.data)

    elif request.method == 'POST':
        data = JsonReader.read_body(request)
        serializer = UserSerializer(data=data)
        if serializer.is_valid():
            serializer.save()
            return Response(serializer.data, status=status.HTTP_201_CREATED)
        return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)

The POST will create a user through the serializer.

In the code, you can see that I added a JsonReader class. This is a utility class I use in most of my projects; it just makes reading the request body easier. Here’s the code:

#utils/jsonreader.py
import json


class JsonReader:

    @staticmethod
    def read_body(request):
        '''returns a dict from the body of request'''

        data = dict()

        try:
            body = request.body
            data = JsonReader.bytes_to_dict(body)
        except Exception as exc:
            if isinstance(request.data, dict):
                data = request.data
            elif isinstance(request.data, str):
                data = JsonReader.str_to_dict(request.data)

        return data

    @staticmethod
    def str_to_dict(s: str):
        return json.loads(s)

    @staticmethod
    def dict_to_str(d: dict):
        return json.dumps(d)

    @staticmethod
    def bytes_to_dict(b: bytes):
        return JsonReader.str_to_dict(b.decode("utf-8"))

Now we can create some users through our POST method, just by sending some JSON with the keys username, password and email, which are the required ones in our serializer. Simple, right?
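
For example, assuming the app’s urls.py is mounted at the project root and the dev server runs on 127.0.0.1:8000 (both assumptions that depend on your setup), creating a user from a small script could look like this:

#create_user_example.py (hypothetical client-side script)
import requests

payload = {
    'username': 'user3',
    'password': 's3cret-pw',
    'email': 'user3@slowcode.io',
}
response = requests.post('http://127.0.0.1:8000/user/', json=payload)
print(response.status_code)  # 201 if the serializer validated the data
print(response.json())       # the serialized user (password is write_only, so it's not returned)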

Now it’s time for the rest of the usual verbs, PUT and DELETE. Also, we need a GET for a single user.

The routes would look something like this:

#urls.py
urlpatterns = [
    url(r'^user/$', user.api_admin_user_index, name='api_admin_user_index'),
    url(r'^user/(?P<pk>[0-9]+)/$', user.api_admin_user_detail, name='api_admin_user_detail'),
]

And now let’s update the view

#views/user.py
@api_view(['GET', 'PUT', 'DELETE'])
def api_admin_user_detail(request, pk):
    """
    get:
    Detail one user.
    put:
    Update one user.
    delete:
    Delete one user.
    """

    try:
        user = User.objects.get(pk=pk)
    except User.DoesNotExist:
        return Response({'error': "User " + pk + " does not exist"}, status=status.HTTP_404_NOT_FOUND)

    if request.method == 'GET':
        serializer = UserSerializer(user)
        return Response(serializer.data)

    elif request.method == 'PUT':
        data = JsonReader.read_body(request)
        serializer = UserSerializer(user, data=data)
        if serializer.is_valid():
            serializer.save()
            return Response(serializer.data)
        return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)

    elif request.method == 'DELETE':
        user.delete()
        return Response(status=status.HTTP_204_NO_CONTENT)

All of this is great, but now let’s secure these admin calls, right? Here’s where JWT comes into play.

Step2: Django’s JWT

JWT is a stateless authentication mechanism, as the user state is never saved in server memory. Django will check for a valid JWT in the Authorization header and, if one is present and valid, the request will be authenticated. Since JWTs are self-contained, all the necessary information is in the token itself, reducing the need to go back and forth to the database.

This allows the user to fully rely on data APIs that are stateless. It doesn’t matter which domains are serving your APIs, as Cross-Origin Resource Sharing (CORS) won’t be an issue since it doesn’t use cookies.

With that being said, let’s start by updating our settings.py with the JWT configuration

#settings.py
import datetime

# Configure the authentication in Django Rest Framework to be JWT
# http://www.django-rest-framework.org/
REST_FRAMEWORK = {
    'DEFAULT_AUTHENTICATION_CLASSES': (
        'rest_framework_jwt.authentication.JSONWebTokenAuthentication',
    ),
}

# Configure the JWTs to expire after 1 hour, and allow users to refresh near-expiration tokens
# https://getblimp.github.io/django-rest-framework-jwt/
JWT_AUTH = {
    # If the secret is wrong, it will raise a jwt.DecodeError telling you as such. You can still get at the payload by setting the JWT_VERIFY to False.
    'JWT_VERIFY': True,

    # You can turn off expiration time verification by setting JWT_VERIFY_EXPIRATION to False.
    # If set to False, JWTs will last forever meaning a leaked token could be used by an attacker indefinitely.
    'JWT_VERIFY_EXPIRATION': True,

    # This is an instance of Python's datetime.timedelta. This will be added to datetime.utcnow() to set the expiration time.
    # Default is datetime.timedelta(seconds=300)(5 minutes).
    'JWT_EXPIRATION_DELTA': datetime.timedelta(hours=1),

    'JWT_ALLOW_REFRESH': True,
    'JWT_AUTH_HEADER_PREFIX': 'JWT',
}

The config is pretty self-explanatory, but there’s more documentation on JWT settings here.

Once this is out of the way, let’s add some JWT route patterns so we can get that JWT token.

#urls.py
from rest_framework_jwt.views import obtain_jwt_token, refresh_jwt_token, verify_jwt_token

urlpatterns = [
    # ...
    url(r'^jwt/refresh-token/', refresh_jwt_token, name='refresh_jwt_token'),
    url(r'^jwt/api-token-verify/', verify_jwt_token, name='verify_jwt_token'),
    url(r'^jwt/api-token-auth/', obtain_jwt_token, name='obtain_jwt_token'),
]
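
For reference, with djangorestframework-jwt these endpoints are exercised by POSTing JSON to them. Here’s a rough sketch (the base URL and route prefix are assumptions, and it presumes a matching user already exists):

#jwt_routes_example.py (hypothetical client-side script)
import requests

BASE_URL = 'http://127.0.0.1:8000'

# Obtain a token by POSTing valid credentials
auth = requests.post(BASE_URL + '/jwt/api-token-auth/',
                     json={'username': 'admin', 'password': 'admin'})
token = auth.json()['token']

# Verify that a token is valid
requests.post(BASE_URL + '/jwt/api-token-verify/', json={'token': token})

# Exchange a non-expired token for a fresh one (requires JWT_ALLOW_REFRESH = True)
requests.post(BASE_URL + '/jwt/refresh-token/', json={'token': token})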

Also, this isn’t strictly necessary, but instead of exposing the JWT routes directly, let’s add a login API route that wraps the obtaining of the JWT token.

#urls.py
urlpatterns = [
    # ...
    url(r'^login/', registration.api_login, name='api_login'),
]

And in our registration view

#views/registration.py
import json

import requests
from django.urls import reverse
from rest_framework.decorators import api_view
from rest_framework.response import Response

from apiapp.utils.jsonreader import JsonReader


@api_view(['POST'])
def api_login(request):
    """
    post:
    This view is called through API POST with a json body like so:

    {
        "username": "admin",
        "password": "admin"
    }

    :param request:
    :return:
    """
    data = JsonReader.read_body(request)

    response_login = requests.post(
        request.build_absolute_uri(reverse('obtain_jwt_token')),
        data=data
    )
    response_login_dict = json.loads(response_login.content)
    return Response(response_login_dict, response_login.status_code)

Notice that this time I didn’t use the ‘admin’ prefix. This is because this route will be available to anonymous users, unlike the ones we created previously.

So as you can see, it’s pretty straightforward. This view is a wrapper of the JWT route. This way, we can add information to the response if we need to, for instance, the user id, user profile, etc. For now, let’s leave it this way.
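
If you do want to enrich the response later on, a minimal sketch could look like this (just an illustration, assuming the same imports as above plus `from apiapp.models import User`):

#views/registration.py (possible extension, not the final code)
@api_view(['POST'])
def api_login(request):
    data = JsonReader.read_body(request)

    response_login = requests.post(
        request.build_absolute_uri(reverse('obtain_jwt_token')),
        data=data
    )
    response_login_dict = json.loads(response_login.content)

    # On a successful login, attach some extra user info to the response
    if response_login.status_code == 200:
        try:
            user = User.objects.get(username=data.get('username'))
            response_login_dict['user_id'] = user.pk
            response_login_dict['email'] = user.email
        except User.DoesNotExist:
            pass

    return Response(response_login_dict, response_login.status_code)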

If we now test it with a username and password, we should get a token back. If you haven’t created any users yet, you can use the POST call we created previously. Otherwise, just add some superusers from the Python console or create some fixtures, up to you.

If you’re lazy, here are my fixtures, and the location is

apiapp/fixtures/users.json
[
{
    "model": "apiapp.user",
    "pk": 1,
    "fields": {
        "password": "pbkdf2_sha256$36000$SoF7ueJzOE1I$244qaRI1dReT4DxZKXJi2sRmuKdqPcjeOiUaPdH2UV0=",
        "last_login": null,
        "is_superuser": true,
        "username": "admin",
        "first_name": "",
        "last_name": "",
        "email": "admin@slowcode.io",
        "is_staff": true,
        "is_active": true,
        "date_joined": "2018-08-29T10:09:27.191Z",
        "groups": [],
        "user_permissions": []
    }
},
{
    "model": "apiapp.user",
    "pk": 2,
    "fields": {
        "password": "pbkdf2_sha256$36000$bkA3NXYqXZ4S$zuH97poSj3trZoNRFeROw3PgutbGOHlZunliI8/1jbg=",
        "last_login": null,
        "is_superuser": false,
        "username": "user1",
        "first_name": "",
        "last_name": "",
        "email": "user1@slowcode.io",
        "is_staff": true,
        "is_active": true,
        "date_joined": "2018-08-29T10:10:56.873Z",
        "groups": [],
        "user_permissions": []
    }
},
{
    "model": "apiapp.user",
    "pk": 3,
    "fields": {
        "password": "pbkdf2_sha256$36000$KKujn28LvdGX$OxtfRmIUWNPwbBPsz2iwKwff8klJ5PiPXj3P9N70Hto=",
        "last_login": null,
        "is_superuser": false,
        "username": "user2",
        "first_name": "",
        "last_name": "",
        "email": "user2@slowcode.io",
        "is_staff": false,
        "is_active": true,
        "date_joined": "2018-08-29T10:11:19.576Z",
        "groups": [],
        "user_permissions": []
    }
}
]

To import them, use the command

python manage.py loaddata users

Great, by now you should have some users in your database and you should be able to use the login API we just created. But what do we do with the token we get back?

Well, as mentioned before, this token should be sent as an Authorization header in our calls to identify the user.

With JWT we can authorize or deny certain calls for certain users, and/or return different kinds of information depending on which user is making the call.
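
For example, once the voter checks below are in place, a client call could look roughly like this (the base URL and route prefixes are assumptions):

#jwt_client_example.py (hypothetical client-side script)
import requests

BASE_URL = 'http://127.0.0.1:8000'

# 1. Log in through our wrapper route to obtain a token
login = requests.post(BASE_URL + '/login/',
                      json={'username': 'admin', 'password': 'admin'})
token = login.json()['token']

# 2. Call an endpoint, sending the token in the Authorization header so DRF
#    can authenticate the user. The 'JWT ' prefix matches JWT_AUTH_HEADER_PREFIX.
users = requests.get(BASE_URL + '/user/',
                     headers={'Authorization': 'JWT ' + token})
print(users.status_code, users.json())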

All of this decision making will be done by a different layer, which I call ‘security voters’ (a habit inherited from Symfony).

Here’s an example of what a voter would look like:

#security/voters.py
from apiapp.models import User


class AbstractVoter:

    request = None

    def __init__(self, request):
        self.request = request

    def is_logged_in(self):
        if isinstance(self.request.user, User):
            return True

        return False

    def is_superuser(self):
        if self.is_logged_in():
            return self.request.user.is_superuser

        return False


class UserVoter(AbstractVoter):

    def user_can_manage_me(self, user_inst: User):
        if self.is_logged_in():
            if self.is_superuser():
                return True
            if self.request.user == user_inst:
                return True

        return False

Now we can call this voter in our APIs.

#views/user.py
@api_view(['GET', 'POST'])
def api_admin_user_index(request):
    """
    get:
    List all users.
    post:
    Create new user.
    """
    voter = UserVoter(request)
    if not voter.is_superuser():
        return Response({'error': "User API is not allowed by non admin user"}, status=status.HTTP_403_FORBIDDEN)

    #...


@api_view(['GET', 'PUT', 'DELETE'])
def api_admin_user_detail(request, pk):
    """
    get:
    Detail one user.
    put:
    Update one user.
    delete:
    Delete one user.
    """
    voter = UserVoter(request)
    if not voter.is_superuser():
        return Response({'error': "User API is not allowed by non admin user"}, status=status.HTTP_403_FORBIDDEN)

    #...

Voters can also be useful for API calls that are not restricted to superusers. Let’s create the route for a user detail.

#urls.py
urlpatterns = [
    # ...
    url(r'^(?P<pk>[0-9]+)/$', user.api_user_detail, name='api_user_detail'),
]

And now in the view.
In the GET verb, unless the user is requesting their own record, access should be denied.

#views/user.py
@api_view(['GET', 'PUT'])
def api_user_detail(request, pk):
    """
    get:
    Detail one user.
    put:
    Update one user.
    """
    try:
        user_inst = User.objects.get(pk=pk)
    except User.DoesNotExist:
        return Response(status=status.HTTP_404_NOT_FOUND)

    voter = UserVoter(request)
    if not voter.user_can_manage_me(user_inst):
        return Response({'error': "User API is not allowed"}, status=status.HTTP_403_FORBIDDEN)

    if request.method == 'GET':
        serializer = UserSerializer(user_inst)
        return Response(serializer.data)

    elif request.method == 'PUT':
        data = JsonReader.read_body(request)
        if 'is_staff' in data:
            if not voter.is_superuser():
                return Response({'error': "Non admin cannot update admin attributes"}, status=status.HTTP_403_FORBIDDEN)
        serializer = UserSerializer(user_inst, data=data)
        if serializer.is_valid():
            serializer.save()
            return Response(serializer.data)
        return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)

Conclusion

In this post we’ve seen how to use Django REST Framework together with the JWT library.

We’ve seen that JWT is a powerful and easy way to identify users that use our API.

If you have any questions or something is confusing, let me know and I’ll do my best to clarify.

I’ve also uploaded the code to GitHub so you can clone the project and try it out for yourself.

git clone https://github.com/joeymasip/django-apiskeleton.git

I use it as a base for API projects, as I almost always need some User API to start with.

Happy coding 🙂

Docker for Django

This blog post is a small guide for getting started with your Django environment with Docker. Since I got very positive feedback from the blog post about Docker and Symfony4, I decided to do the same with Docker and Django.

In this example we’re going to work with Django 1.11 (LTS), Python 3 and MySQL 5.6.

Before we start, you’ll need to install Docker on your machine. You can download it from the official website.

Once Docker is installed, I strongly recommend playing with the getting started guide. Here’s the guide for Mac and here’s the guide for Windows. However, if you’re lazy like me, just use this command to make sure it’s installed:

docker --version

Once all of that is out of the way, we can start with the Django project and the Docker environment that will run it.

There are two cases I’m going to cover.

Case 1: I’m creating a Django project from scratch and I want to set up a development environment with Docker.

Step1: Clone the docker-django repository which has the docker configuration files.

git clone https://github.com/joeymasip/docker-django.git

Step2: Create the Django project.

First off, let’s start the Docker containers with the project we just downloaded.

#cd to the location where you cloned the project
cd ~/Development/docker-django
#start the containers
docker-compose up -d 

This command starts the containers. The parameter -d makes them run in the background. If you omit the -d you’ll see the log.

Docker should start building the images (if it’s the first time for these images) and then run the containers in the background.

If you now try to run

docker ps

in your console, you’ll see that the MySQL container is running, but Django’s is not. This is normal, as we haven’t installed Django in our project yet.

Now we’ll create the Django project with the following command (replace project_name with your project name):

docker-compose run django django-admin.py startproject project_name .

Note: Do not forget the . at the end.

Step3: Create the Django application.

Now we’ll create the Django application with the following command (replace app_name with your app’s name):

docker-compose run django python manage.py startapp app_name

Now, run the same command again:

docker-compose up -d

And if you run

docker ps

this time you’ll see that Django’s container has started as well.

Right now, if you open your browser and go to http://127.0.0.1:8000/ you should see it working, so you’re already set up to develop!

First though, let’s update settings.py so we use our MySQL container instead of Django’s default SQLite.

Step4: Update settings.

First, let’s add your app_name to the installed apps:

INSTALLED_APPS = [
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.messages',
    'django.contrib.staticfiles',
    
    'app_name',
]

Now let’s configure your settings.py so the database settings point to the database service from Docker.

Change:

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
    }
}

To:

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'docker_django_db',
        'USER': 'dbuser',
        'PASSWORD': 'dbpw',
        'HOST': 'mysql',
        'PORT': '3306',
        'TEST': {
            'NAME': 'docker_django_db_test',
        },
    }
}

Step5: Create the User model.

Under your app_name/models.py file, just create a User model that extends from Django’s Auth model.

#app_name/models.py
from django.db import models
from django.contrib.auth.models import AbstractUser


class User(AbstractUser):
    pass

Once this is done, just tell your project that we’ll be using our own User model. This is a best practice in case you ever need to make changes to the User model. So in your project_name settings.py, add this line:

AUTH_USER_MODEL = 'app_name.User'

Now our User is plugged in.

Step6: Run migrations.

To run migrations, we first need to enter the Django container’s bash.

docker-compose exec django bash

Once inside, we can make the migrations. Since we’ve just created the app, there won’t be any yet.

python manage.py makemigrations

And also run them

python manage.py migrate

If you want to create an admin user to log in to Django’s admin panel:

python manage.py createsuperuser

That’s it!

Now just open your browser and type

http://127.0.0.1:8000/
http://127.0.0.1:8000/admin/

You’re already set up to develop, so happy coding with docker!

Case 2: I already have a Django project

Step1: Clone the docker-django repository which has the docker configuration files.

git clone https://github.com/joeymasip/docker-django.git

Step2: Move all files to your already created Django project.

1. The docker folder containing python + django and a MySQL container config for it.
2. The docker-compose.yml file
3. The .env file

Step3: Start the docker images inside your Django project folder.

cd into your Django project folder and type the following command

docker-compose up -d

This command starts the containers. The parameter -d makes them run in the background. If you omit the -d you’ll see the log.

Docker should start building the images (if it’s the first time for these images) and then run the containers in the background.

Step4: Update database settings.

If you’re using the docker-compose.yml out of the box, your database name, user and password need to be updated in your project’s settings.

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'docker_django_db',
        'USER': 'dbuser',
        'PASSWORD': 'dbpw',
        'HOST': 'mysql',
        'PORT': '3306',
        'TEST': {
            'NAME': 'docker_django_db_test',
        },
    }
}

Step5: Run migrations.

docker-compose exec django bash

Once in, we can run the migrations.

python manage.py migrate

Now, open a new Chrome tab and try the following URLs. The port is the one we set up in the docker-compose.yml.

http://127.0.0.1:8000/
http://127.0.0.1:8000/admin/
http://127.0.0.1:8000/whatever-slug-you-want-from-your-project

You should see it working.

You’re already set up to develop, so happy coding with docker!

Docker for Symfony 4

This blog post is an introduction to devs who want to start using Docker with Symfony4. It will guide you through creating a Symfony 4 project running on Docker.

Before we start, you’ll need to install Docker in your machine. You can download it from the official website.

Once Docker is installed, I strongly recommend playing with the getting started guide. Here’s the guide for Mac and here’s the guide for Windows. However, if you’re lazy like me, just use this command to make sure it’s installed:

docker --version

Once all of that is out of the way, we can start with the Symfony 4 project and the Docker environment that will run it.

For the sake of the example, I’ve created a local environment running PHP5 to make things a bit trickier, since Symfony4 runs on PHP7.

Let’s try to create an empty SF4 skeleton on my local machine.

composer create-project symfony/skeleton symfony

Right off the bat I get the following error

[InvalidArgumentException]
Could not find package symfony/skeleton with stability stable in a version installable using your PHP version 5.6.29.

So as you can see, we’ve created an environment running PHP5 and we cannot create a SF4 project, which runs on PHP7. Docker should help us solve this problem 🙂

There are two cases I’m going to cover.

Case 1: I’m creating a SF4 project from scratch and I want to set up a development environment with Docker.

Step1: Clone the docker-symfony4 repository which has the docker configuration files.

git clone https://github.com/joeymasip/docker-symfony4.git

Step2: Create the Symfony project skeleton.

First off, let’s start the Docker containers with the project we just downloaded.

#cd to the location where you cloned the project
cd ~/Development/docker-symfony4
#start the containers
docker-compose up -d 

This command starts the containers. The parameter -d makes them run in the background. If you omit the -d you’ll see the log.

Docker should start building the images (if it’s the first time for these images) and then run the containers in the background.

We need to create the Symfony project inside the PHP container’s bash, since that’s the only place we have PHP 7. Remember, we still have PHP 5 on our machine, so the first thing to do is log into the bash of the php-fpm container.

docker-compose exec php-fpm bash

Once in there, we’re inside a PHP 7 container, so we should be able to create the skeleton for the Symfony project.

#inside php-fpm bash
composer create-project symfony/skeleton symfony

Step3: Move the contents of the skeleton into the root of the application.

Unless you want to change the working dir config inside the docker-compose.yml, we need the Symfony project to be in the root folder. Moreover, we cannot create the project directly in the root folder (composer create-project symfony/skeleton .), because the installer would have to delete the contents of the folder you’re installing into; since that’s too risky, this option is not allowed. More info here.

Long story short, I’ve found this is the cleanest way to do it.

#inside php-fpm bash
mv /application/symfony/* /application
mv /application/symfony/.* /application

Now we can delete the empty folder we used for creating the skeleton

#inside php-fpm bash
rm -Rf /application/symfony

Step4: Require the components.

We can require whatever components we need.

#inside php-fpm bash
cd /application

composer require annotations
composer require --dev profiler
composer require twig
composer require orm
composer require form
composer require form validator
composer require maker-bundle

These are just a few, feel free to add the ones you want.

Step5: Create some sample code in the Symfony 4 project.

Now let’s create a controller with a sample route to make sure everything works.

#inside php-fpm bash
cd /application

bin/console make:controller Pizza

Now, open a new Chrome tab and type the following URL. The port is the one we set up in the docker-compose.yml – if you check the nginx service config, you can see that port 8000 maps to port 80, the usual webserver port.

http://localhost:8000/pizza

We should now see our new controller action rendering a response.

Step6: Sync the database.

Finally, to sync the database, you need to update the .env file with the variables we set on the mysql image.

This is the relevant line in the .env file that was generated when requiring the orm package in Symfony 4:

DATABASE_URL=mysql://db_user:db_password@127.0.0.1:3306/db_name

If you check the docker-compose.yml file, you’ll see the credentials under the mysql configuration, so change the line to:

DATABASE_URL=mysql://dbuser:dbpw@mysql:3306/docker_symfony4

Finally, let’s restart the containers

docker-compose down
docker-compose up -d
docker-compose exec php-fpm bash

Now inside the bash, you should be able to

#inside php-fpm bash
bin/console doc:sch:crea

You can also connect an external database client such as Sequel Pro or MySQL Workbench.

For the credentials, use the ones we set in the mysql image again:

Host: 0.0.0.0
Username: dbuser
Password: dbpw
Port: 8002

Also, as a reminder: every time you docker-compose down or kill the mysql container, your schema will disappear! This not only means you’ll have to recreate it next time, it also means the data will be lost. So make sure to dump the data if you need it later, or create some demo data/fixtures so you don’t have to add it manually.

Case 2: I already have a SF4 project

Maybe you cloned it from elsewhere, or maybe you created it in the past with PHP 7 on your local machine.

Step1: Clone the docker-symfony4 repository which has the docker configuration files.

git clone https://github.com/joeymasip/docker-symfony4.git

Step2: Move the files from the docker-symfony4 project folder you just cloned into your Symfony project root.

Move the docker-compose.yml and the folder named phpdocker containing nginx and php-fpm config for it to the root of your Symfony4 project.

Step3: Start the docker images inside your Symfony4 project folder.

cd into your Symfony project folder and type the following command

docker-compose up -d

This command starts the containers. The parameter -d makes them run in the background. If you omit the -d you’ll see the log.

Docker should start building the images (if it’s the first time for these images) and then run the containers in the background.

Now, open a new Chrome tab and try the following URLs. The port is the one we set up in the docker-compose.yml – if you check the nginx service config, you can see that port 8000 maps to port 80, the usual webserver port.

http://localhost:8000
http://localhost:8000/whatever-slug-you-want-from-your-project

You should see it working.

You’re already set up to develop, so happy coding with docker!

Disclaimer: The Docker setup I created was generated on phpdocker.io.