
Rebuilding Our Company Website

For the new version of our company website we decided to give decoupled Drupal 8 a go. In this article series we'll highlight a lot of the pains, the gains and other aspects to consider when building a completely decoupled Drupal 8 site.

7 min read · by Rainer Friederich


Rebuilding our company website took around seven months. We did not have a dedicated project team; people were onboarded and offboarded as their availability and client projects allowed. Not only did this slow down progress, but the technical design decisions also changed multiple times, driven by changes in the ecosystem or real-world project experience that forced us to rework and rethink some internals.

At the beginning of the project we asked ourselves the following:

  • Is Drupal 8 the correct platform for a decoupled website?
  • Drupal 8 core REST, JSON:API or GraphQL?
  • Core REST or jsonRPC?
  • Mono repo or 2 applications?
  • VueJS, Nuxt, React, Next, Gatsby or a custom frontend?
  • SPA, SSR or static output?
  • How to not lose too many author capabilities on the backend?
  • How to not lose too many feature capabilities because of the decoupling?
  • Decouple everything or only the parts that change most often?
  • How to do technical SEO in a decoupled website while maintaining the SEO capabilities of Drupal 8?

At the beginning of the project we experimented with different technologies but settled on Drupal 8. Our initial thinking was "we already have a website in Drupal 8, we just add the decoupled capabilities to it". In the end this turned out to be plain wrong. Our old website was built using nested paragraphs, complex blocks, views and components that cared more about the rendered layout than about data storage.

If you build a decoupled website with Drupal 8, this does not work out very well, so in the end the data model had to be rethought:

  • Content that can be structured should be structured.
  • Move all logic about what should be displayed into the frontend application (see the sketch after this list).
  • Remain on a "one node per URL" logic so the frontend application can inject the needed metadata.
  • Do not create too many custom node types for different URLs; add the needed data via specific config_pages instead.
  • Everything needs to be a fieldable entity, including menu links.
  • Use paragraphs only for editorial content such as text-with-image elements.
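
To illustrate the point about moving display logic to the frontend: the frontend maps each paragraph type it receives onto a component and simply skips what it does not know. A minimal sketch of such a mapper for a Vue 2 / Nuxt app; the bundle names and imported components are made-up examples, not our actual code:

```js
// components/ParagraphRenderer.js – maps Drupal paragraph bundles to Vue components.
// All bundle keys and imported components below are hypothetical examples.
import TextWithImage from '~/components/paragraphs/TextWithImage.vue'
import PlainText from '~/components/paragraphs/PlainText.vue'

// Lookup table: JSON:API resource type of the paragraph -> frontend component.
const paragraphComponents = {
  'paragraph--text_with_image': TextWithImage,
  'paragraph--text': PlainText,
}

export default {
  name: 'ParagraphRenderer',
  functional: true,
  props: {
    // One paragraph resource exactly as delivered by JSON:API.
    paragraph: { type: Object, required: true },
  },
  render (h, { props }) {
    const component = paragraphComponents[props.paragraph.type]
    // Unknown bundles are skipped, so new backend content cannot break the frontend.
    return component ? h(component, { props: { data: props.paragraph } }) : null
  },
}
```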

For the API layer we decided to go with JSON:API. When we started using it, the 1.x version was stable and 2.x was already in development.

Our decision against core REST was based on the far better out-of-the-box options of JSON:API, and the decision against GraphQL on the fact that JSON:API was slated for inclusion in Drupal 8 core. We also had little GraphQL knowledge seven months ago, while we already had two small projects in production using JSON:API.
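
For context on the "out of the box" argument: with JSON:API a single request can filter, sort, include referenced entities and limit the returned fields without writing any backend code. A minimal sketch of such a request from the frontend; the backend URL, bundle and field names are illustrative:

```js
// Fetch published articles including their teaser image in one JSON:API request.
// BACKEND_URL, the "article" bundle and the field names are illustrative only.
const BACKEND_URL = 'https://backend.example.com'

async function fetchArticles () {
  const query = new URLSearchParams({
    'filter[status]': '1',                                     // only published nodes
    'include': 'field_teaser_image',                           // pull the referenced image along
    'fields[node--article]': 'title,path,field_teaser_image',  // sparse fieldset
    'sort': '-created',
  })
  const response = await fetch(`${BACKEND_URL}/jsonapi/node/article?${query}`)
  if (!response.ok) throw new Error(`JSON:API request failed: ${response.status}`)
  return response.json() // { data: [...], included: [...] }
}
```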

For everything not entity based we chose jsonRPC over core REST because it felt like less of a burden to enable, configure and keep running in the background.
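
From the consumer side a jsonRPC call is just a small JSON-RPC 2.0 POST. A sketch, assuming the jsonrpc module's /jsonrpc endpoint and a hypothetical method name:

```js
// Call a custom jsonRPC method on the Drupal backend.
// The /jsonrpc path and the example method name are assumptions for illustration.
async function callRpc (method, params = {}) {
  const response = await fetch('https://backend.example.com/jsonrpc', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ jsonrpc: '2.0', method, params, id: Date.now() }),
  })
  const { result, error } = await response.json()
  if (error) throw new Error(error.message)
  return result
}

// e.g. const sitemap = await callRpc('sitemap.get')
```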

For the frontend we decided on a Nuxt application. We already had a lot of Vue.js knowledge and wanted to leverage the SSR capabilities of Nuxt.

Our goal was that an author can still just log in to Drupal, write an article or showcase, and preview that content via the SSR of the Nuxt application. All other standard Drupal authoring capabilities should be kept as well.

The author should still be able to:

  • Decide on the public URL of the content.
  • Control which meta tags are rendered.
  • Choose images and image crops.
  • Preview content without publishing.

This could be achieved using decoupled_router for retrieving metadata about the current URL, a long-awaited patch for the Metatag module and very detailed content modeling in the frontend app. In the end we mapped the Drupal data model nearly 1:1 as objects in the Nuxt application.
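
Roughly, a page request in the Nuxt app first resolves the path via decoupled_router and then loads the node behind it via JSON:API. A simplified sketch of a catch-all page, assuming the @nuxtjs/axios module; the response shape is abbreviated and the included field name is made up:

```js
// pages/_.vue (script part) – resolve a Drupal path, then fetch the node behind it.
const BACKEND_URL = 'https://backend.example.com'

export default {
  async asyncData ({ route, $axios, error }) {
    // 1. Ask decoupled_router which entity lives at this path.
    const resolved = await $axios.$get(`${BACKEND_URL}/router/translate-path`, {
      params: { path: route.path, _format: 'json' },
    })
    if (!resolved || !resolved.entity) {
      return error({ statusCode: 404, message: 'Not found' })
    }

    // 2. Fetch the node itself via JSON:API, e.g. /jsonapi/node/article/{uuid}.
    const { type, bundle, uuid } = resolved.entity
    const node = await $axios.$get(`${BACKEND_URL}/jsonapi/${type}/${bundle}/${uuid}`, {
      params: { include: 'field_paragraphs' }, // field name is illustrative
    })
    return { node: node.data, included: node.included || [] }
  },
}
```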

We started simple with a mono repo: the classical /frontend and /backend folders from all the demos and resources available on the net. The application was served as a Platform.sh multi-app, the /frontend folder in a Node.js 10 container and the /backend folder in a PHP container. In the end this approach did not live up to its promises.

Our initial thoughts were:

  • Easier to review pull requests.
  • Feature branch environments with both frontend and backend changes in one PR.
  • Ability to deploy API changes and the frontend application at the same time.
  • Internal routing between frontend and backend, without external network requests.

In the end it was quite an underwhelming experience. The backend developer was confronted with a lot of frontend changes he didn't care about. The frontend developer was faced with huge YAML dumps from Drupal configuration changes. The feature branch environments with combined API and frontend changes did not work out, as the backend URL is not available during the build step on Platform.sh.

Internal routing was not possible because of the same restriction. What we also did not consider is that running a Node.js server in production is completely different from running a PHP website. After some days we got out-of-memory errors from the Node.js server which we could not debug. Some rebuilds on Platform.sh simply changed the port, so the native Nuxt scripts for running the site were not suitable. The SSR approach also turned out to be too slow for real-world usage, as each change to the frontend would also clear the Drupal caches. Our JSON:API queries were heavy on the first load before the Drupal caches kicked in, and there were (and still are) issues with the page_cache module in combination with JSON:API.

We decided to change our approach fundamentally: no SSR at runtime, only at build time, to statically generate the frontend. Client-side navigation afterwards uses a CDN for the JSON:API responses, which is only invalidated on content changes and not on redeploys of the Drupal instance. The frontend application is now built using Netlify and nuxt generate.
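
In practice this means the list of routes to prerender is collected from the backend at build time. A simplified nuxt.config.js excerpt; the query used to collect the paths is only an example:

```js
// nuxt.config.js (excerpt) – static generation with routes coming from Drupal.
// The JSON:API query below is an illustrative way to collect the public paths.
const axios = require('axios')
const BACKEND_URL = process.env.BACKEND_URL || 'https://backend.example.com'

module.exports = {
  generate: {
    async routes () {
      const { data } = await axios.get(
        `${BACKEND_URL}/jsonapi/node/article?fields[node--article]=path`
      )
      // Every node's path alias becomes a statically generated page.
      return data.data.map(node => node.attributes.path.alias)
    },
  },
}
```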

For the author preview we migrated the SSR part from Platform.sh to now.sh and use serverless AWS Lambda functions to serve the preview for the author.

So in the end our simple mono repo turned into two applications running on three vendors (Platform.sh for the Drupal backend, Netlify for the static frontend and now.sh for the SSR preview) and using two CDNs (one for the JSON:API responses and one for the images).

We ended up writing four custom modules to cover use cases currently not possible with the Drupal 8 ecosystem:

  • A jsonRPC endpoint for retrieving the generated and cached simple_xml_sitemap, to expose it to the frontend application.
  • A Netlify redeploy button for the authors, as our use case differed from the one the Drupal netlify module tries to solve. This button also flushes the complete CDN for all JSON:API calls.
  • A content version endpoint which delivers a state variable reflecting how often the "redeploy netlify" button was pressed. It is used in the consumer to purge client caches of API responses from the CDN and is appended to all JSON:API calls, for example as &c_v=3 (see the sketch after this list).
  • A redirects endpoint which delivers all redirects in Drupal 8 in the format used by Netlify, so they can be integrated into the Netlify deploy process and authors are able to add redirects, rename content and change URLs.
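
A minimal sketch of how the content version can be wired into the frontend, assuming the @nuxtjs/axios module; the plugin setup and the method name content_version.get are hypothetical:

```js
// plugins/content-version.js – append the current content version to JSON:API calls.
// The jsonRPC method name and the overall wiring are illustrative assumptions.
export default async function ({ $axios }) {
  // Fetch the version once; it only changes when the "redeploy netlify" button is pressed.
  const { result: version } = await $axios.$post('/jsonrpc', {
    jsonrpc: '2.0',
    method: 'content_version.get',
    id: 1,
  })

  $axios.onRequest((config) => {
    if (config.url.includes('/jsonapi/')) {
      // e.g. /jsonapi/node/article?...&c_v=3 – a new value bypasses the cached CDN response.
      config.params = { ...config.params, c_v: version }
    }
    return config
  })
}
```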

For our contact forms we started with webform_rest, as JSON:API 1.x was not capable of receiving webform submissions. By the time this became possible with 2.x, we had already migrated the frontend to Netlify and switched to the forms feature available there.
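
For reference, submitting to Netlify Forms from the Nuxt app boils down to a plain form-encoded POST against the static site itself. A sketch; the form and field names are examples:

```js
// Submit a contact form to Netlify Forms via AJAX.
// Requires a form named "contact" in the deployed static HTML; field names are examples.
async function submitContactForm ({ name, email, message }) {
  const response = await fetch('/', {
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body: new URLSearchParams({ 'form-name': 'contact', name, email, message }).toString(),
  })
  if (!response.ok) throw new Error(`Form submission failed: ${response.status}`)
}
```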

The question at the end of the journey remains: is Drupal the correct platform for a decoupled brochure site with only read requests?

The answer is, as so often, "it depends". Now, at the end of the journey, the ecosystem is a lot more mature than seven months ago, and with the recent inclusion of JSON:API into Drupal core our approach should remain valid in the long term; the first gains are already visible in terms of time spent on frontend changes. After having dropped SSR and OAuth, because we only use GET requests for public content, it now feels a bit over-engineered.

So, would we use Drupal 8 with JSON:API again for this project, knowing at the beginning what we would build? I would say "maybe", because it mainly depends on your personal preference. If you need 100% control and want an open source project for your backend, go for it. If you are a bit more flexible, there are alternatives that may be better suited for read-only brochure sites, such as Storyblok, with which we already had very good experiences.

We're currently working on releasing the source code of the frontend application, the backend application and the custom modules, as well as more articles on the topic, because we would have loved to have real-world examples and knowledge before starting this project :)

At this point we want to give huge thanks to the complete decoupled Drupal initiative team because this project would not have been possible without all the great inputs and modules out there.

About the author

This is Rainer

Never trust a hippie.
