Maintain Your Pantheon Site

Trust Pantheon’s constantly updated documentation pages like we do. Use them!


Apply 1-click upstream updates

It is your responsibility to keep your site up to date and secure. When a Drupal core or distribution update is released and your site is in Git mode, the update will appear on your Pantheon dashboard, where you can apply it with one click. To ensure that upstream updates apply cleanly, never hack core; always put customizations in /sites/all. Use the Pantheon Workflow (see below) to test your changes before pulling them to Live.
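If an update does not apply cleanly with one click, the same merge can be done from the command line. A minimal sketch, assuming a Drupal 7 site tracking Pantheon's drops-7 upstream; the clone URL shown is a placeholder, so copy the real one from the Connection Info panel on your dashboard:

```shell
# Placeholder clone URL -- copy the real command from your
# Pantheon dashboard's Connection Info panel.
git clone ssh://codeserver.dev.SITE-UUID@codeserver.dev.SITE-UUID.drush.in:2222/~/repository.git my-site
cd my-site

# Merge the upstream update into your Dev code.
# drops-7 is Pantheon's Drupal 7 upstream; substitute your
# distribution's upstream repository if you use one.
git pull https://github.com/pantheon-systems/drops-7.git master

# Push back to Pantheon; the commit appears in the Dev environment.
git push origin master
```

Resolve any merge conflicts locally before pushing; sites that never hack core rarely see conflicts.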

Set up a 301 and HTTPS redirect

It’s a best practice for search engine optimization to set up a 301 redirect that standardizes on one domain. ASU policy also requires that your site redirect HTTP traffic to HTTPS. You can accomplish both in settings.php.
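Both redirects can be handled by a short block near the top of settings.php. A sketch, assuming www.example.edu is your canonical domain (substitute your own) and that the server sets $_SERVER['HTTPS'] for secure requests:

```php
<?php
// Redirect only on the Live environment, and never for CLI/drush runs.
if (isset($_SERVER['PANTHEON_ENVIRONMENT']) &&
    $_SERVER['PANTHEON_ENVIRONMENT'] === 'live' &&
    php_sapi_name() !== 'cli') {

  // 301-redirect any non-canonical host or plain-HTTP request to
  // https://www.example.edu, preserving the requested path.
  if ($_SERVER['HTTP_HOST'] !== 'www.example.edu' ||
      empty($_SERVER['HTTPS']) || $_SERVER['HTTPS'] === 'off') {
    header('HTTP/1.0 301 Moved Permanently');
    header('Location: https://www.example.edu' . $_SERVER['REQUEST_URI']);
    exit();
  }
}
```

Guarding on PANTHEON_ENVIRONMENT keeps Dev and Test reachable at their own Pantheon URLs.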

Enable Caching for Performance

Configure the performance settings in your Live environment so your site is as fast as possible. Caching can occasionally break dynamic functionality, but Web Services recommends enabling all caching first and disabling features only if absolutely necessary.
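On a Drupal 7 site these settings live under Administration → Configuration → Performance, or can be set with Drush. A sketch of the typical variables involved, assuming Drupal 7 and Drush pointed at your Live environment:

```shell
# Enable page caching for anonymous users and block caching.
drush vset cache 1
drush vset block_cache 1

# Compress cached pages and aggregate CSS/JS files.
drush vset page_compression 1
drush vset preprocess_css 1
drush vset preprocess_js 1
```

If a page stops behaving correctly after this, disable the settings one at a time to find the culprit rather than turning caching off wholesale.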

Enable Automated Backups, Download Backups

With a free plan you can create backups manually. Once a paid plan is enabled, you can turn on automated backups. Backups are not retained indefinitely, so Web Services encourages you to download them periodically.
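Backups can also be created and downloaded from the command line with Pantheon's Terminus CLI. A sketch, assuming a site named my-site (a placeholder for your own site name):

```shell
# Create a backup of the Live environment (code, database, and files).
terminus backup:create my-site.live

# Download the most recent database backup for safekeeping.
terminus backup:get my-site.live --element=db --to=./my-site-db.sql.gz
```

Scheduling a command like the download above in cron is a simple way to keep off-platform copies.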

Use the Pantheon Workflow

When changing code or upgrading a module, make the change in the Dev environment first. Then pull your code from Dev to Test, and pull the Live database and files into Test. Once you have confirmed everything works on Test, pull your code to Live. Web Services recommends backing up an environment before pulling code, a database, or files into it.
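The same workflow can be scripted with Terminus. A sketch, assuming a site named my-site (a placeholder) and that your code changes are already committed to Dev:

```shell
# Back up Test before changing it, per the recommendation above.
terminus backup:create my-site.test

# Deploy code from Dev to Test, pulling Live's database and files
# into Test at the same time.
terminus env:deploy my-site.test --sync-content --note="Deploy for testing"

# After verifying on Test, back up Live and deploy the code there.
terminus backup:create my-site.live
terminus env:deploy my-site.live --note="Verified on Test"
```

The --sync-content flag is what brings Live's database and files into Test, so your testing happens against real content.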

Review ASU's Configuration Management document before updating your code. 

Customize robots.txt for Search Engine Optimization

robots.txt is a file in the root of your Drupal install that tells search engine crawlers what they may index on your site. By default, crawling is disabled on Dev and Test and enabled on Live. If you’re doing a soft launch and don’t want the Live environment indexed, customize robots.txt on Dev, pull it to Test, and then pull it to Live.
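For a soft launch, a robots.txt that blocks all crawlers looks like this (restore the permissive version before the real launch):

```
# Block all crawlers from all paths during the soft launch.
User-agent: *
Disallow: /
```

Remember that robots.txt is advisory; well-behaved crawlers honor it, but it is not an access control.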

Getting help