You can download the data from the link above, copy it into the data/ folder, and then run ./datagen.sh in your terminal. This script splits the fer2013.csv file into train and validation sets.
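Concretely, the steps above might look like this (the download location ~/Downloads is an assumption; adjust it to wherever you saved the file):

```shell
# Copy the downloaded CSV into the repo's data/ folder
# (the source path is hypothetical), then run the split
# script from the repository root.
cp ~/Downloads/fer2013.csv data/
./datagen.sh   # writes the train and validation .csv files
```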
Training the model on Google Cloud
While it is possible to train this model locally, I trained it on Google's ML Engine using the gcloud command-line tool. Make sure you upload both .csv files to a Google Cloud Storage bucket first. The trainer program and its utility functions are located in the ./train/ folder. An example submission script (run from the root of this repository) would look something like the following:
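A sketch of such a submission script is below. The 'gs://path/to/save/model' output location is the placeholder quoted later in this section; the job name, module name, region, runtime version, and the trainer's own flags (--train-file, --eval-file) are assumptions about this repo's layout, not verified against it:

```shell
# Submit a training job to ML Engine. Everything after the bare `--`
# is passed through to the trainer module itself.
gcloud ml-engine jobs submit training "emotion_$(date +%s)" \
    --job-dir gs://path/to/save/model \
    --package-path ./train \
    --module-name train.task \
    --region us-central1 \
    --runtime-version 1.4 \
    -- \
    --train-file gs://path/to/data/train.csv \
    --eval-file gs://path/to/data/validation.csv
```

You can monitor the job's progress with `gcloud ml-engine jobs stream-logs JOB_NAME` or in the Cloud Console.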
When the job finishes, the checkpoint files are saved to the location you specified ('gs://path/to/save/model' above). After downloading them, we can freeze the weights and optimize the graph for inference using trainer/freeze.py. From the command line:
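A hypothetical invocation is sketched below; the exact flag names accepted by trainer/freeze.py are assumptions, so check the script's argument parser for the real ones:

```shell
# Download the checkpoint files from the bucket, then freeze the
# graph for inference (flag names are assumed, not verified).
gsutil cp -r gs://path/to/save/model ./model
python trainer/freeze.py \
    --checkpoint-dir ./model \
    --output-graph ./frozen_model.pb
```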