Author: jimmiw

  • Handle Docker container registry paths with build arguments

    In a recent setup I’ve created, we are using Azure Container Registry to hold a base image, which we use when building our main project.

    I use a docker-compose.yml and a Dockerfile in our setup, and GitHub Actions for deployment.

    The base image is deployed to two container registries, because of access rights on the different environments.

    Basically we have a develop and a production environment.

    The develop environment also has a staging setup.

    When using our base image, we need it in both the develop and the production container registry. This means that our Dockerfile needs two different URLs for the base image, since one environment cannot access the other environment’s container registry.

    Usually the develop environment has the following setup:

    # Base path for image
    FROM uniquedevrepo.azurecr.io/projectbase:php7.4
    
    # other docker stuff here...
    
    # expose the webserver
    EXPOSE 8080 443

    And the production environment has the following setup:

    # Base path for image
    FROM uniqueprodrepo.azurecr.io/projectbase:php7.4
    
    # other docker stuff here...
    
    # expose the webserver
    EXPOSE 8080 443

    This meant that when we merged code from development into production, we’d get an access error on the base image if we forgot to change the URL.

    This caused a lot of confusion when deploying, if the developers were not used to our setup.

    To handle this, I changed our workflow file to pass an argument with the base image container registry URL, so the Dockerfile does not have to be changed all the time.

    name: Build and deploy Develop
    
    on:
      push:
        branches: [ develop ]
      pull_request:
        branches: [ develop ]
    
      workflow_dispatch:
    
    
    jobs:
      build:
        name: build and deploy develop
        runs-on: ubuntu-latest
        environment: develop
    
        steps:
          # azure login steps here...
    
          - name: 'Build and push image'
            uses: azure/docker-login@v1
          - run: |
              docker compose build --build-arg CRPATH=uniquedevrepo.azurecr.io
              # other actions to run here... :)

    Notice the “--build-arg CRPATH=uniquedevrepo.azurecr.io”, which is passed on to the Dockerfile (I am skipping docker-compose.yml here – a small sketch of it follows after the Dockerfile below). We can then simply use the ARG instruction to pick up the argument and use it:

    # tell Dockerfile that we are going to use the passed argument CRPATH
    ARG CRPATH
    
    # Base path for image - using the CRPATH argument.
    FROM ${CRPATH}/projectbase:php7.4
    
    # other docker stuff here...
    
    # expose the webserver
    EXPOSE 8080 443
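
    Since I am skipping the docker-compose.yml itself, here is a minimal sketch of what such a file could look like – the service name “app” and the build context are just assumptions. Declaring CRPATH under build.args (with no default value) lets the value from the workflow’s --build-arg flow through to the ARG in the Dockerfile:

    # docker-compose.yml sketch (service name "app" is just an example)
    version: "3.8"

    services:
      app:
        build:
          context: .
          args:
            # the actual value is supplied by the workflow via --build-arg CRPATH=...
            - CRPATH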
    

    This means that I only have to maintain my GitHub Actions workflow for each environment, and I no longer worry about the Dockerfile having the wrong container registry path for our base image.

  • CSRF token package

    Working on older projects (as I tend to do), I often get various requirements for security enhancements.

    A few days ago, somebody apparently ordered a security test of a site I was working on, so yesterday I got the report:
    3 HIGH security errors, one of them being that we had no CSRF tokens on the forms.

    gah…

    I started searching for a nice CSRF token package, but the cool ones from Symfony etc. depend on a lot of framework internals, and the project I was working on doesn’t really use a framework (it’s old – but upgraded through the years it’s been in service… no main framework though).

    So I got my hands dirty and created a new package to handle this 🙂

    The package is called jimmiw/csrf and is available on github and packagist.

    To install it, write:

    composer require jimmiw/csrf

    And you are good to go. Read the README.md file for more info about how to use the package.

    Please note that it’s already at version 2!… I guess I released it a bit too fast yesterday, and then today I changed the API…
    Anyways, the unit test proves that it works 🙂
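
    For anyone who has never added CSRF protection by hand, the general pattern is simple enough. The snippet below is not the API of jimmiw/csrf (see the README for that) – it is just a plain-PHP sketch of the idea: keep a random token in the session, put it in a hidden form field, and compare the two on POST:

    <?php
    // plain-PHP sketch of the general CSRF idea - NOT the jimmiw/csrf API
    session_start();

    // generate a token once per session
    if (empty($_SESSION['csrf_token'])) {
        $_SESSION['csrf_token'] = bin2hex(random_bytes(32));
    }

    // on POST, reject the request if the submitted token does not match
    if ($_SERVER['REQUEST_METHOD'] === 'POST') {
        $sent = $_POST['csrf_token'] ?? '';
        if (!hash_equals($_SESSION['csrf_token'], $sent)) {
            http_response_code(403);
            exit('Invalid CSRF token');
        }
    }

    // when rendering the form, include the token as a hidden field
    echo '<input type="hidden" name="csrf_token" value="'
        . htmlspecialchars($_SESSION['csrf_token']) . '">';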

  • Handle replication errors on Azure Database for MySQL flexible server

    At work we have just recently started moving our data to Azure Cloud. We still have the old on-premise setup running, with databases and web servers.

    To make the move to Azure Cloud, I have spent a lot of time prepping the Terraform scripts to handle our servers, but I have only recently been able to make a copy of the current database (a whopping 1.7TB of data) and move it to the cloud.

    It took a few days to insert the data into Azure MySQL, so after getting a replication error today, I was not ready to drop the replica database and insert a fresh dump.

    Running the following select, I could read the error:

     select * from performance_schema.replication_applier_status_by_worker;

    It was a create table error… The master database had created a new table using MyISAM, which Azure DB does not allow.

    Normally, you would increase the global skip counter using:

    SET GLOBAL sql_slave_skip_counter = N;

    Azure does not support this, as it requires SUPER privileges (whatever that means). Thankfully Azure has implemented a set of stored procedures to call instead, among them this one (remember to stop the replication first):

    CALL mysql.az_replication_skip_counter;

    This increases the global counter by 1. I then took the CREATE TABLE statement, changed it to use InnoDB instead and ran it on the slave. Then I called:

    CALL mysql.az_replication_start;

    And the slave skipped the CREATE TABLE statement from the master and continued with the replication.
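
    For reference, the whole sequence on the replica looked roughly like this – the CREATE TABLE is just a placeholder for whatever statement broke the replication:

    -- stop the replication on the Azure replica
    CALL mysql.az_replication_stop;

    -- skip the failing statement from the master (increments the counter by 1)
    CALL mysql.az_replication_skip_counter;

    -- re-run the skipped statement manually, with InnoDB instead of MyISAM
    -- (placeholder table – use the real statement from the master here)
    CREATE TABLE example_table (
      id INT NOT NULL AUTO_INCREMENT PRIMARY KEY
    ) ENGINE=InnoDB;

    -- start the replication again
    CALL mysql.az_replication_start;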

    Link to Azure docs: https://docs.microsoft.com/en-us/azure/mysql/single-server/reference-stored-procedures

  • Old rails project with gem install errors :/

    At work I am going to work on an older Rails project… Like many other Rails projects, this one can only run on older versions of Ruby. Hence I installed rbenv and tried my luck.

    My luck failed though, and I had some issues installing the “thin” gem, which is used by the “mailcatcher” gem.

    Eventually I found this post, which helped my Mac compile the thin gem file with native extensions:

    https://meta.discourse.org/t/mailcatcher-gem-installation-issue-on-macos-catalina-and-its-solution/168606

    Long story short:
    install the gem separately with custom compiler flags and run bundler install again.

    rbenv exec gem install thin -v 1.5.1 -- --with-cflags="-Wno-error=implicit-function-declaration"

    At last I can run rbenv exec bundler install and the rest of the installation can begin.

  • Laravel AWS SDK credentials using .env and configs

    Every now and then, I need to integrate with an Amazon Web Services (AWS) service. I like the whole IAM user way of doing this, so that is usually my preferred choice.

    However, every time I need to set up a connection in a project (mostly Laravel these days), I forget how I did it last time.

    If you ever struggle with remembering as well, or simply think the documentation for the AWS SDK for PHP is a bit rubbish, then this is my little guide to setting up your clients.

    The steps are as follows:

    • Log in to their AWS console on aws.amazon.com
    • Open the IAM service and create a new user. Follow their guide if necessary. Basically just follow the wizard, click create user, then go back and add permissions under the “Permissions” tab for the service you need (e.g. AmazonSNSFullAccess for SNS)
    • Add their composer library to your project:
      composer require aws/aws-sdk-php

    After composer is done, open your Laravel project in your editor (this guide also “works” for other PHP setups, but I am just using Laravel’s .env and config settings).

    Open up .env located in your application root and add a few settings:

    AWS_SNS_KEY=
    AWS_SNS_SECRET=
    AWS_SNS_REGION=
    AWS_SNS_VERSION=latest

    I like to have different IAM users for different services, which is why I am adding the _SNS_ part to my settings. Remember to fill them out with the credentials from your newly created IAM user.

    The next step is to create a new aws.php file in the config/ folder.
    This will hold your AWS settings, so they can be accessed using the config() helper.

    Add the following contents to config/aws.php:

    <?php
    
    return [
        'sns' => [
            'region' => env('AWS_SNS_REGION'),
            'version' => env('AWS_SNS_VERSION'),
            'credentials' => [
                'key' => env('AWS_SNS_KEY'),
                'secret' => env('AWS_SNS_SECRET'),
            ],
        ],
    ];
    

    By adding the ‘sns’ key, your settings are available under aws.sns.xx

    Let us build a quick unit test to see if we can access our service.
    Create a new file called tests/Integration/Aws/SnsTest.php

    <?php
    
    namespace Tests\Integration\Aws;
    
    use Aws\Sns\SnsClient;
    use Tests\TestCase;
    
    class SnsTest extends TestCase
    {
        public function testAwsConnection()
        {
            $client = new SnsClient([
                'region' => config('aws.sns.region'),
                'version' => config('aws.sns.version'),
                'credentials' => config('aws.sns.credentials'),
            ]);
    
            $topics = $client->listTopics();
            self::assertNotNull($topics);
        }
    }
    

    We create a new SnsClient, passing an array of settings.
    If you have a look at the AWS documentation, their examples often include a 'profile' => 'default' entry.

    Do not add that, as it tells the client to look for the credentials in your local ~/.aws/credentials file.

    Using the credentials key instead reads the credentials from aws.sns.credentials (which points to AWS_SNS_KEY and AWS_SNS_SECRET).

    Run the unit test and profit!

    vendor/bin/phpunit tests/Integration/Aws/SnsTest.php
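
    From here, using the client in your application code follows the same pattern. As a quick sketch, publishing a message to a topic could look like this – the topic ARN below is just a placeholder:

    <?php

    use Aws\Sns\SnsClient;

    // build the client from the config/aws.php settings, just like in the test
    $client = new SnsClient([
        'region' => config('aws.sns.region'),
        'version' => config('aws.sns.version'),
        'credentials' => config('aws.sns.credentials'),
    ]);

    // publish a message to an SNS topic (the ARN is a placeholder)
    $client->publish([
        'TopicArn' => 'arn:aws:sns:eu-west-1:123456789012:my-topic',
        'Message' => 'Hello from Laravel',
    ]);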

  • Hacktoberfest 2020

    This year at Hacktoberfest 2020 my small project php-time-ago got a bit of attention. This was very nice, and I hope that people had a fun time.

    However, most of the attention was spam 🙂

    It feels odd when people take their time to do invalid PRs instead of just doing it right.

    One PR I got was:

    Why was this ever a valid PR? 🙂

    Anyways, it was fun to be part of #hacktoberfest this year, and I hope a lot of open source projects got a lot of attention.

  • Lots of translations on php-time-ago module

    The composer module php-time-ago is now in version 0.4.10 and has included translations for the following languages:
    * Brazilian Portuguese
    * Chinese
    * Danish
    * Dutch
    * English
    * Finnish
    * French
    * Hungarian
    * Japanese
    * Korean
    * Spanish
    * Taiwanese

    Head on over to github for the latest details.

  • Did a small startup and quit again – what I learned

    A little over a year ago I quit my day job as a full-stack web developer at a small company in Copenhagen. I did this in order to work full time on a startup with two friends of mine.

    The startup company (now known as simply “startup”) is doing LIVE events (video recording and streaming) and hosting the videos/audio clips afterwards, for shareholding companies in Scandinavia.

    The project was already running live when I started full time on it, as I had started coding on the system about 6 months earlier in my free time.

    At first we were 2, plus a consultant; then we got the last buddy to join and we were 3. We worked from our homes and spent a lot of time working, brainstorming and coming up with new things and new ways to do the same – only faster, easier and simpler.

    We were 3 different people, with different trades and backgrounds:
    * One sales person
    * One technician
    * One developer (me)

    We have since grown (by a lot) and are now 6 full-time employees and 2 student workers. We also lost one of the original partners a few months ago, so a lot has changed in the company since we started.
    I am still the only developer in the company, and this is where the chain broke for me…

    As many people know, working from 8-16 is not a problem. I had recently become a dad, so that life suited me perfectly. But when doing a startup, you are not expected to work only from 8-16 anymore. Time is consumed heavily, and if something crashes, you are expected to fix it right away.

    What I learned.

    I have spent a lot of time doing Linux server setups. More time than coding, I’d say…
    I had some experience with Linux servers, but that was in simple environments, with low traffic.

    It’s odd that when first starting out, you are “hired” to do coding, but end up fending off hacker attacks and handling server stability and server load issues instead.

    Before doing the startup, I had mostly worked with PHP and Apache.
    I had some experience with Ruby On Rails (rails), which I liked, so I decided to choose this stack for the startup.

    Rails helped speed up the coding – a lot. Of course, Rails is a large framework with a lot of constraints and rules you have to know and follow – or you will die.
    I chose Rails for all the gems out there, the community and the documentation. I also loved what Ruby in general looked like.

    That is also what I generally hear about rails: “lots of gems, just pick a lot of those, and you are done!”

    But let’s face it, that’s not the whole story. A year into the startup, I’m using only 10 or 15 different gems (besides the Rails ones), and 5 of them deal with deployments (Capistrano)… So I’m really using 5-10 gems? Not a whole bunch.

    Rails has also bitten me in the arse a lot of times. I know that if I were properly schooled in Rails, I might not have done a lot of things the way I did. But that’s just me… “learn as you go”.
    But with Rails being as large (and complex) as it is, you cannot just “learn and go”.

    I have spent a lot of time refactoring code, learning new things and handling odd issues, only to later find an odd gem that handles the exact same thing (which I didn’t know about until then). And why did the Rails core team deprecate the ActiveResource lib? It was so easy to use… (I’m using the gem now though)

    Scaling rails was easy enough though.
    I had a load balancer in front of my servers, which made horizontal scaling easy enough.

    However, this wasn’t always enough…

    Our systems should be able to handle at least 1000 requests per second, for 1 hour or more in a row (during LIVE events, to listen for changes in the player).
    This might not seem like a lot, but if a server handles a given request in 100ms, a single thread can only handle 10 requests per second – so 1000 requests per second needs roughly 100 workers running in parallel.
    That meant that at one point we had 6 servers serving the same content, and they still couldn’t keep up with the number of requests coming in.

    I then started using Redis as a page caching mechanism, and we can now handle around 14000 requests per second (in theory), using only:
    * 1 load balancer
    * 2 Redis servers
    * 2.5 web servers

    Would I choose Rails again for a startup? Probably not… The reason being that I really don’t use most of the things Rails offers, and it’s just so hard to do things differently.

    PHP, on the other hand, lets you do a lot more. And with Composer (e.g. using Packagist) being more and more adopted, code snippets are shared more and more.

    I’m not saying that PHP is perfect – far from it! But it suits me a lot better.

    What I’m taking with me from Rails is:
    * Encapsulation
    * Helpers
    * Modules
    * DSLs
    * Automated deployments
    * (and a lot more…)

    It should be said that I’m not a big fan of frameworks though. This goes for both Ruby and PHP. I cannot conform, and I tend to use bits and pieces from different frameworks to handle my stuff.

    I just quit, now what?

    I have just said goodbye to the startup, leaving roughly 18 months of thoughts and code behind.

    I am returning to the PHP world, after having left it behind for a long time. And I am actually returning to the same job I had before starting my startup.
    This (new/old?) job has other developers and designers who know what they are doing, which will be nice to get back to.

    I feel a lot older and wiser though, having spent time doing all the developing, server handling and project planning myself. I feel like I have a lot more to offer now.

    I also think that every developer should try to do a startup. I had often thought about it, but never dared venture into it. It seemed way too risky.

    And it is…

    You cannot make mistakes, especially if you have something to lose. That being a house, a car… or, more importantly, a family to take care of.

    But you somehow grow with the experience… and become better. Perhaps not at coding, but at taking responsibility and at thinking about decisions.

    Technical stuff

    For those interested, this is what I used to develop and design. And the server software.

    Software used (on the Mac):
    * Firefox Developer Edition
    * Google Chrome Canary
    * Pixelmator (Graphics app – it’s dirt cheap, and works very well)
    * Sequel Pro (SQL app)
    * Sublime Text (2+3) (Lots of modules)
    * Terminal

    Server setup:
    * Ubuntu (very easy to use, lots of updated packages etc)
    * MySQL
    * Nginx, with passenger (played around with puma + nginx in the end as well – just not in production)
    * Redis (as a general cache and page cache)
    * RVM (very easy when upgrading servers to new ruby versions)
    * I also used Nginx as a load balancer, and it works very well.
    * The VPS servers are from digitalocean.com (referral link)

    Please note: This is not meant as a PHP vs Ruby blog post, but simply my experiences with it.

  • php-time-ago plugin updated to version 0.4.0

    Just wanted to say that, after a few requests, I’ve updated the php-time-ago plugin to be able to use translations.

    The plugin is available on GitHub and via Composer.

  • IE + iframe + cookies

    After spending lots of hours building a system for a client, using Ruby on Rails, everything was deployed and worked perfectly… until IE came along, that is.

    The customer had iframed the project into their current website, and since the user had to sign in to the new project inside an iframe, IE began to give all kinds of errors.
    The direct link worked like a charm, but IE seemed to want more!

    I stumbled upon an answer on Stack Overflow – where else? (link to stackoverflow answer)
    It simply states that if you want to use cookies in iframes in IE, you need to add a P3P header.
    And it worked!

    Rails howto

    To do this in Rails, simply open up “application_controller.rb” and add a new filter:

    before_filter :set_pthreep

    The code for the filter goes in the private section of the application_controller.rb file:

    def set_pthreep
      response.headers['P3P'] = 'CP="Potato"'
    end

    And that’s all there is to it.