GitHub Pages is pretty damn great.
I’ve been using it for years now but one thing has always bugged me. If you want
to schedule your posts in advance, they just never show up.
Why?
Because for posts to show up you have to trigger a rebuild of the project by
pushing a change.
Sure, you could leverage `future: true` in your `_config.yml`, but then all of
your post-dated posts will show up. Fine for posts going out the next day, but
not ideal if you are trying to schedule posts a week or so in advance.
Even though I am publishing new posts Monday, Wednesday and Friday, I am
actually doing the bulk of my writing on the Sunday before.
To avoid forgetting to make a post live, I did some research and coded up a
simple JavaScript / Node.js script to force a rebuild of my blog.
Said script is deployed to AWS as a Lambda function and I am leveraging
CloudWatch to schedule a rebuild every hour.
Sadly, it’s a bit too early for me to be able to tell how much this is going to
cost me. That said, after a few days of running it, I’m still clocking in at
$0.00 for both services.

Not saying it’s a free solution, but I can’t imagine it’s going to top out at
much more than a few pennies.
Before we begin, you will need to set up a personal access token on GitHub. To
do so, just head over to GitHub’s settings page and set one up.

Also, and this is very important, you will need to create a file in your target
repository named `.build`. It can be empty, but since this code relies on the
file’s existing SHA hash, you will need to create it manually first, else
you’ll run into a `Missing SHA hash` error.
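The quickest way to seed that file is from a local clone of the repository:

```sh
touch .build
git add .build
git commit -m 'Add .build trigger file'
git push
```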
To get started, create a new directory, `cd` into that bad boy and run
`yarn init && yarn add request` to set up the project (creating `package.json`)
and add our only dependency. Feel free to do this with `npm` if you’re into
that.
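The `npm` equivalent would be something like:

```sh
npm init --yes
npm install --save request
```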
Now that we have our dependencies all square, edit the file `index.js` and drop
this code in there. Nothing needs to be changed, as we’ll be passing in
parameters via environment variables on the command line:
```javascript
const request = require('request')

// Credentials and repo info come in via environment variables
const username = process.env['GH_USER']
const project = process.env['GH_REPO']
const key = process.env['GH_KEY']

// GitHub contents API endpoint for our .build trigger file
const file = `https://${username}:${key}@api.github.com/repos/${username}/${project}/contents/.build`

const start = (event, context, callback) => {
  // GET the file first, since updating it requires its current SHA hash
  request({
    headers: { 'User-Agent': username },
    method: 'GET',
    url: file,
  }, (getErr, getRes, getBody) => {
    if (getErr) return console.log('Error GETting File:', getErr)

    const parsedGetBody = JSON.parse(getBody)

    if (!parsedGetBody.sha) return console.log('Missing SHA hash')

    const { sha } = parsedGetBody

    // PUT a fresh timestamp into the file, which creates a commit
    // and triggers a GitHub Pages rebuild
    request({
      headers: { 'User-Agent': username },
      method: 'PUT',
      url: file,
      body: JSON.stringify({
        message: 'Forcing rebuild',
        content: Buffer.from(new Date().toString()).toString('base64'),
        sha,
      }),
    }, (putErr, putRes, putBody) => {
      if (putErr) return console.log('Error PUTting File:', putErr)

      const parsedPutBody = JSON.parse(putBody)

      console.log('Parsed PUT Body:', parsedPutBody)

      // Let Lambda know we're done, otherwise the invocation
      // hangs around until it times out
      callback(null, parsedPutBody)
    })
  })
}

// On Lambda we export a handler, locally we just run immediately
if (process.env['LAMBDA_TASK_ROOT']) {
  exports.handler = (event, context, callback) => {
    process.env['PATH'] += `:${process.env['LAMBDA_TASK_ROOT']}`
    start(event, context, callback)
  }
} else {
  start({}, {}, (error, result) => {
    console.info(result)
  })
}
```
Save the file and let’s try running it!
To run this locally, your command will look like this:

```sh
GH_USER=joshtronic GH_REPO=joshtronic.com GH_KEY=secret node index.js
```
Be sure to swap out each of the parameters for your own information, else it’s
going to shit the bed!
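If the script gripes about the SHA hash or authentication, you can hit the same
endpoint by hand with `curl` to see what GitHub is returning (swap in your own
details, of course):

```sh
curl -s -H 'User-Agent: joshtronic' \
  'https://joshtronic:secret@api.github.com/repos/joshtronic/joshtronic.com/contents/.build'
```

A healthy response is a JSON blob that includes the `sha` field the script
needs.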
Assuming everything worked, you could just configure the script to run
periodically from your local machine, or the server of your choosing.
If you’re like me and don’t want to maintain a server or run this locally, you
can proceed over to Amazon Web Services and set up a Lambda function and
CloudWatch to run it!
First thing to do out on Lambda is to create a new function (from scratch). Call
it what you want and configure the role as you see fit. I tend to create a
custom role for each of my functions and name it according to the name of the
function.
Since this code relies on an external dependency, you will need to zip up the
entire directory and upload that to AWS Lambda. Select *Upload a .ZIP* from the
*Code entry type* drop-down and proceed with uploading the .ZIP file of your
project.
Quick note about the .ZIP file: you’ll want to zip the contents of your
project’s directory, including the installed `node_modules`. Your files should
be in the root of the .ZIP file.
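Something like this, run from inside the project directory so everything lands
at the root of the archive, should do the trick (the archive name is
arbitrary):

```sh
zip -r function.zip index.js package.json node_modules
```

If you’d rather skip the console upload, the AWS CLI can push it for you;
`rebuild-blog` here is just whatever you named your function:

```sh
aws lambda update-function-code \
  --function-name rebuild-blog \
  --zip-file fileb://function.zip
```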
Once uploaded, you can try to run it, but it will error since we haven’t
configured the environment variables!
Expand the section labelled *Environment variables* and proceed to enter the
same parameters we passed to the script earlier: `GH_USER`, `GH_REPO` and
`GH_KEY`.
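If you’re more comfortable in a terminal, the same can be done via the AWS CLI
(again, `rebuild-blog` is a placeholder for your function’s name):

```sh
aws lambda update-function-configuration \
  --function-name rebuild-blog \
  --environment 'Variables={GH_USER=joshtronic,GH_REPO=joshtronic.com,GH_KEY=secret}'
```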
Once that’s all set, you can click *Save and test* in the header. Upon
completion it will hopefully say *succeeded* and show you the data output from
GitHub along with some summary data.
Keep in mind that it can sometimes throw a timeout error; it happened to me
while testing things out. Just retry a moment later and hopefully the gremlins
will have flushed themselves out.
If you hop over to GitHub and browse the `.build` file in the target
repository, it should contain a date string from around the time you tested the
Lambda function!
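Or check without leaving the terminal, from a local clone of the target repo:

```sh
git pull && cat .build
```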
This is all well and good, but we still need to get this running on a schedule.
From there, head over to CloudWatch and click on *Rules* (below *Events*) in
the sidebar. On the next page, click on *Create rule*, which will take you to
the first step of the wizard.
Under *Event Source*, click on *Schedule* and configure things how you’d like.
It’s probably overkill, but I went with hourly just in case something craps out
in the middle of the night. It’s probably safe enough to run every 6-12 hours,
if not daily. Your call.
Once the schedule is set, click on *Add target* under *Targets* on the
right-hand side of the page. This reveals a form that defaults to the *Lambda
function* option.
Below that, click on the *Select function* drop-down and select the function we
just created. Click *Configure details*, give the rule a name on the next step
and click *Create rule*.
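If clicking through wizards isn’t your thing, the same schedule can be wired up
with the AWS CLI. The rule and function names below are placeholders, and the
target ARN needs to be your function’s actual ARN:

```sh
# Create a rule that fires every hour
aws events put-rule \
  --name rebuild-blog-hourly \
  --schedule-expression 'rate(1 hour)'

# Allow CloudWatch Events to invoke the function
aws lambda add-permission \
  --function-name rebuild-blog \
  --statement-id rebuild-blog-hourly \
  --action 'lambda:InvokeFunction' \
  --principal events.amazonaws.com

# Point the rule at the function
aws events put-targets \
  --rule rebuild-blog-hourly \
  --targets 'Id=1,Arn=arn:aws:lambda:us-east-1:123456789012:function:rebuild-blog'
```

The console wizard handles that invoke permission for you, which is one reason
it’s the friendlier path.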
Donezo! Now you can schedule posts in the future and have them show up at the
right time (just like this post was 😉)