Creating a custom packagist

Have you ever found yourself needing to share custom dependencies across several sites? Maybe even for the same client? There are several ways to approach this workflow, especially if you work in the Drupal ecosystem. Upstreams, distributions, and multisites are all patterns worth considering. However, the fun lies in the challenge of determining how to scale an architecture.

Create a custom packagist

The ingredients for creating a custom packagist, a repository of dependencies used by Composer, are surprisingly easy to come by. Keep in mind that a private packagist can be obtained through a hosted service at packagist.com. In our case, we already had the tooling readily available, so we decided to go the custom packagist route.

The goal of this article is to give you some ideas on how to host a solid packagist for a team, organization, or client while describing how the Four Kitchens team came up with a fun and creative solution to provide this functionality using the tools our client had on hand. I hope to accomplish this by:

  • Sharing our motivation behind choosing this solution
  • Identifying the ingredients needed to cook up the workflow
  • Explaining baseline hosting, and elaborating on what you could do if so inclined
  • Laying out how we set up automation around the workflow to make our lives easier

Let’s begin.

Motivation

On one client project, we were sharing enough private custom dependencies through a private distribution that we needed to scale beyond editing each site’s composer.json repositories listing individually. If we were using an upstream setup, this could be accomplished with Composer Merge Plugin. In this case, however, it made sense to create a custom packagist. Keep in mind, if we hadn’t done this, each of our composer.json files would have had 11 custom packages and 11 VCS entries in its repositories section, and that list would need to grow with each additional dependency we added to our distribution. We currently maintain 20 sites on this distribution, and our policy is to have code review for every change to a site. Making changes to 21 repos (the distribution and all the downstream sites) was a development time suck.

If you are here, you probably know the answer to the question, “Why can’t Composer load repositories recursively?” but if you don’t, check out this great explanation. In short, the repositories section of a composer.json cannot be inherited from a dependency’s composer.json. So it’s up to each individual project to declare the repositories for the custom packages our distribution requires.
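
As a quick illustration (the package names here are hypothetical), a repositories entry declared inside a dependency’s composer.json has no effect on the projects that require it:

{
  "name": "mycompany/distribution",
  "repositories": [
    { "type": "vcs", "url": "https://github.com/mycompany/privaterepo" }
  ],
  "require": {
    "mycompany/privaterepo": "^1.0"
  }
}

A site requiring mycompany/distribution never sees that VCS entry. Composer only reads the repositories section of the root composer.json, so every consuming project has to repeat the entry — or point at a packagist that aggregates these entries for it.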

We might have been able to reduce our custom dependencies by relying on another hosted packagist such as asset-packagist.org, or by working to make other dependencies publicly available. However, providing our own packagist maintained specifically for the client’s needs brought us performance gains over those alternatives and lets us more closely vet our frontend library dependencies. It also allows us to make a single “repositories” entry at the packagist level, which gets pulled down by all of the sites pointing at it. This means less code editing on a per-site basis.

So here we are, using an easily maintained solution, and reaping the benefits of performance, scalability, and increased developer productivity, while keeping our client’s ecosystem private. We didn’t even need that much to get started!

Ingredients

Things you will need to get started:

  • Satis, a rudimentary static packagist generator written in PHP using Composer as a library.
  • A repository to house the custom dependencies you want to put in your packagist. Think: all the items you currently have in your repositories section of your composer.json. This isn’t strictly a “must,” but it makes automation possible.
  • A place to host static HTML and JSON files. Anything web accessible. HTTPS is preferred, but curl can work under other protocols. You can get pretty creative here.
    • Cheap hosting service
    • Spare Droplet, Linode, or AWS EC2 instance
    • S3 bucket
    • GitHub
    • FTP
  • Something to build the packagist when a dependency updates, like:
    • GitHub Actions
    • CircleCI
    • Travis
    • Cron
    • A manual implementation like running a command via SSH

Our implementation looks like this:

  • Repository: GitHub
  • Hosting: S3 bucket
  • Builder: CircleCI

These were all resources we were already using. I’ll go into the specifics on how our build works with some suggestions on alternatives.

Satis

It’s pretty simple to set up Satis. There is some decent documentation on Satis at GetComposer.org and on GitHub. What I say here may be a bit of a repeat, but I’m describing an opinionated setup intended to allow for testing and committing changes. This is a necessity when multiple developers are touching the same packagist and you need accountability.

Before I dive into the specifics of our setup, I want to mention that if you feel you don’t need this level of control, testing, and revision history, Satis can be set up as a living, stand-alone app. You can host it either as a Docker container or on a hosting platform. In both of these options, developers would live-edit and maintain the packagist via the command line by default. You can, however, install a graphical frontend using something like Satisfy.
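
As a rough sketch, a one-off build with the Satis project’s Docker image could look like this (flags follow the Satis README; the output directory is whatever you intend to serve):

# Mount the current directory (containing satis.json) into the container and build
docker run --rm --init -it \
  --user $(id -u):$(id -g) \
  --volume $(pwd):/build \
  --volume "${COMPOSER_HOME:-$HOME/.composer}:/composer" \
  composer/satis --ansi build /build/satis.json /build/packages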

To set up Satis like Four Kitchens has, follow the steps below. Example code follows the list if you need to see how it might look.

  1. Create a new repository.
  2. Initialize a new composer project using composer init.
  3. Require Satis: composer require composer/satis.
  4. Add a script to your composer.json to run the full build command.
  5. Add a packages directory to the project with a .gitkeep file: mkdir packages && touch packages/.gitkeep.
  6. Add a .gitignore to ignore the vendor folder and generated package files.
  7. Consider setting up a Lando instance to serve your packagist for testing.
  8. Create satis.json just like you normally would a standard composer.json with the repositories section containing all your packages, repos, and packagists you want available to the projects consuming it.
  9. Add "require-all": true below the repositories section of satis.json. There’s more about usage of require-all versus require in the Satis setup documentation. Use what fits your needs, but if you are adding individual packages instead of entire packagists to your satis.json, require-all is likely all you need.

Your repo could look something like this:

composer.json

{rn  "name": "mycompany/packages",rn  "require": {rn    "composer/satis": "^1.0" 
  },rn  "scripts": {rn    "build": "./vendor/bin/satis build satis.json packages" 
  }
}

satis.json

{rn  "name": "mycompany/packages",rn  "homepage": "https://packages.mycompany.com",rn  "repositories": [rn    { "type": "vcs", "url": "https://github.com/mycompany/privaterepo" },rn    { "type": "vcs", "url": "http://svn.example.org/private/repo" },rn    { "type": "package", "package": [rn      { "name": dropzone/dropzone", "version": "5.9.2", "dist": { "url": "https://github.com/dropzone/dropzone/releases/download/v5.9.2/dist.zip", "type": "zip" }}
    ]}
  ],rn  "require-all": truern}

.lando.yml

name: mycompany-packages
recipe: lemp
config:
  webroot: packages
  composer_version: 2
  php: '7.4'

.gitignore

vendor
packages/*
!packages/.gitkeep

From here, run lando start && composer install && composer build. Now, go to another project and add your test packagist to that project. Additionally, add "secure-http": false to the config section, since Lando’s HTTPS certificate is untrusted by default. Lastly, require one of the packages you added to satis.json above.

{
  ...
  "repositories": [
    {
      "type": "composer",
      "url": "http://mycompany-packages.lndo.site"
    }
    ...
  ],
  "require": {
    "dropzone/dropzone": "^5.9.2"
  },
  ...
  "config": {
    ...
    "secure-http": false
  }
  ...
}

At this point you should be greeted with a successfully built project and have a local instance of your packagist going. When you are done testing, stop Lando and switch your repository entry in the other project to your packagist’s public URL. Commit all your changes and push up!

Your next step is getting your packagist off your local and out where everyone can use it.

Hosting

Now you can simply copy the files in your packages folder and put them somewhere web accessible. I really want to drive this point home: the entirety of your packagist is simply the contents of that folder and nothing else. The things that make this seem complicated are the processes around automating and updating the packagist.
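
To make that concrete, a built packages folder looks roughly like this (exact file names vary by Satis version and configuration; the hashed include file is illustrative):

packages/
├── index.html                 (human-browsable package list)
├── packages.json              (the entry point Composer fetches)
└── include/
    └── all$<sha1-hash>.json   (package metadata referenced by packages.json)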

You could, for example, now take these files you created and host them anywhere someone can curl to. This means HTTP, FTP, and SFTP are available to you, to name a few. If you aren’t worried about privacy, you can even go so far as placing these in the webroot or even the sites/default/files folder of your company’s Drupal site. This is a good option if you are strapped for domain names or running a small operation. You would then make sure to copy those files any time someone makes a change to any of the packages that are part of your packagist.
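
For instance, a minimal publish step to a plain web server might be nothing more than this (the host, user, and path are hypothetical):

# Build the static packagist files locally
composer build

# Copy the generated files to any web-accessible directory
rsync -av --delete packages/ deploy@www.example.com:/var/www/html/packages/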

The end?

If that’s all you are looking for, you can stop here. You’ve done it! You now have a custom packagist, and the rest of the workflow may not matter to you. However, if you want some more ideas and want to build out a more robust, automated development workflow, keep reading. The ideas get interesting from here.

If you wanted to be creative, you could probably remove the line from .gitignore that excludes the packages folder, commit the generated files, set your packagist URL to something like https://raw.githubusercontent.com/mycompany/packages/main/, and set up Accept and Authorization headers in your repository entry. You can see an example of how to use headers at GetComposer.org and below with our S3 example.
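
A sketch of what that GitHub-hosted repository entry could look like (the media type and token-based Authorization header are assumptions based on GitHub’s raw-content authentication; the token is a placeholder):

{
  "repositories": [
    {
      "type": "composer",
      "url": "https://raw.githubusercontent.com/mycompany/packages/main/",
      "options": {
        "http": {
          "header": [
            "Accept: application/vnd.github.v3.raw",
            "Authorization: token <your-personal-access-token>"
          ]
        }
      }
    }
  ]
}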

In fact, the composer.json setup described for the creative GitHub example is really similar to what we did, except we used a workaround recommended by AWS for restricting access to a specific HTTP referer. Our client wanted the extra security so that not just anybody could poke around at the packages and versions we had available.

In our example, we created a normal bucket and assigned a CNAME to it with a nice domain name. The CNAME is optional but makes the packagist feel more “official” and allows us to move it later without disrupting the developer workflow too much. We then added a policy to only accept requests whose referer is our secret key. A referer doesn’t have to be a website; in our case it’s a lengthy hash that would be difficult to guess. This too is optional, but if you are looking for that extra level of security, it’s a good option to consider. Note that you should not add a space between the colon and the token when using this policy. Our repositories entry in our projects looks like:

{
  ...
  "repositories": [
    {
      "type": "composer",
      "url": "https://packages.mycompany.com",
      "options": {
        "http": {
          "header": [
            "Referer:"
          ]
        }
      }
    }
    ...
  ],
  ...
}
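
For reference, the matching policy on the S3 side follows the AWS-documented aws:Referer condition. A sketch, with the bucket name and secret as placeholders:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowGetWithSecretReferer",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::packages.mycompany.com/*",
      "Condition": {
        "StringLike": { "aws:Referer": "<your-long-secret-hash>" }
      }
    }
  ]
}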

And that’s it. We copy the files up to the bucket using the AWS CLI, and it’s published.

Now we need to automate the workflow and get what’s in our hosting location to update automatically.

Building and automation

I’ve pointed out that, if you are willing, you can put Satis somewhere, generate the packagist files, upload them somewhere web accessible, and be ready to roll. This isn’t so different from static site generators like Jekyll or Hugo. However, we add CI for automation and revision control for accountability, so that we can take the “error” out of human command crunching. It’s worth mentioning again that this is super important when you have entire teams modifying the packagist.

In our example, I’m using CircleCI. You can do the same with GitHub Actions, Jenkins, or even run on a cron job, provided you are okay with a time-based cadence. You might even do more than one of these. Our CircleCI job looks like this:

.circleci/config.yml

version: 2.1
orbs:
  php: circleci/php@1.1.0
  aws-cli: circleci/aws-cli@2.0.3
parameters:
  run_dependency_update:
    default: true
    type: boolean
jobs:
  create_packagist:
    executor:
      name: php/default
      tag: '7.4.24'
    steps:
      - checkout
      - aws-cli/setup
      - php/install-composer
      - php/install-packages
      - run:
          name: Set GitHub authentication
          command: composer config --global github-oauth.github.com "$GITHUB_TOKEN"
      - run:
          name: Link auth for satis
          command: mkdir ~/.composer; ln -s ~/.config/composer/auth.json ~/.composer/auth.json
      - run:
          name: Build packagist json files
          command: composer build
      - store_artifacts:
          path: packages
      - run:
          name: Copy packagist to aws
          command: aws s3 cp --recursive ./packages/ s3://packages.mycompany.com/
workflows:
  version: 2
  packagist:
    when: << pipeline.parameters.run_dependency_update >>
    jobs:
      - create_packagist:
          filters:
            branches:
              only:
                - main
                - master

There’s a lot to unpack here. I’m using pipeline parameters because a requirement for me is to be able to call this job when another project updates. This functionality allows me to trigger this CircleCI job with an API call. I also use CircleCI orbs to make grabbing the AWS CLI and getting a PHP environment easy.

The meat of the job is the same as what you were doing during testing: running the build command we put in our composer.json. Since some of our repositories are private, we also have to make sure Composer has access by creating a GitHub token. Then we copy everything to the bucket using the AWS CLI. In our case, we have some behind-the-scenes environment variables defining our keys: AWS_ACCESS_KEY_ID, AWS_DEFAULT_REGION, and AWS_SECRET_ACCESS_KEY.

From another project’s perspective, I’m still using CircleCI to run the API call. You can do this really easily in other CI environments, too.

version: 2.1
jobs:
  update-packagist:
    docker:
      - image: cimg/base:2021.12
    steps:
      - run: >
          curl --request POST
          --url https://circleci.com/api/v2/project/github/mycompany/packages/pipeline
          --header "Circle-Token: $CIRCLE_TOKEN"
          --header "content-type: application/json"
          --data '{"parameters":{"run_dependency_update":true}}'
workflows:
  build:
    jobs:
      - update-packagist

That’s it. I add this job to every project that’s a VCS entry in our satis.json (provided I have access) and let it go to town. If you find yourself with other dependencies out of your control, consider adding a cron job somewhere or a scheduled pipeline trigger. You are done!
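
If you go the cron route instead, the trigger can be a single crontab entry on any machine that holds the token (the schedule is illustrative, and CIRCLE_TOKEN must be defined in the crontab’s environment):

# Rebuild the packagist nightly at 2am by triggering the CircleCI pipeline
0 2 * * * curl --request POST --url https://circleci.com/api/v2/project/github/mycompany/packages/pipeline --header "Circle-Token: $CIRCLE_TOKEN" --header "content-type: application/json" --data '{"parameters":{"run_dependency_update":true}}'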

Final thoughts

This workflow can be as easy or as difficult as you want to make it given a few factors like:

  • How often it will change
  • How many people touch it
  • How up-to-date it needs to be

There are a lot of ideas here, covering several different application architectures for organizations that have multiple projects or sites. If you don’t want to bother with the home-brewed solution, dish out the cash and get a private packagist. The cost may be worth it.

However, if you are already using all the necessary services and have a team of knowledgeable individuals like ours, consider maintaining your own packagist that you can host anywhere. You may find it a productive, performant, and most of all joyful and exciting experience that will bring value to your upstream, distribution, or multi-site setup.