Posts

This is another living post that serves more for my personal benefit than anything else. By living, I mean that I will continue to update it over time as I find things I want to keep easily accessible. This post is a place to paste snippets and boilerplate of Markdown/Pandoc/LaTeX material that I find useful but often forget. There’s probably nothing particularly ingenious or awe-inspiring here, but hopefully I won’t have to grep through my bash history as often, trying to remember what I did that one time to do that one thing.


Relating to my previous post today about setting up a blog using AWS, Docker Compose, Caddy Server, and Ghost, I found the need to do some web scraping. A number of years ago, my wife began journaling her thoughts in an online service called LDSJournal.com (at least I believe that was the name). About 2 years ago, this service was acquired by a new site called jrnl.com. It seems to be a fairly neat service, but one thing we were concerned about was preserving the data should the account ever disappear.


*Not including the domain name, of course. Also, this makes use of the portion of the Amazon AWS Free Tier that expires after 12 months, so it is only truly free for that period. It should continue to be very inexpensive afterward, though. My wife previously utilized an online service to host a private journal, which has since been acquired. She wanted to switch to something else, and after scraping her entries from the new service, which lacks a proper API or export functionality, we needed something else for her to use.


The following are some notes that I’ve gathered after reading Deep Work by Cal Newport. I enjoyed the book and felt it has some concepts I want to apply in my life, so I condensed the precepts I wanted to remember into a form that would fit onto a single sheet of US letter paper. A link to that PDF can be found here: Deep Work Notes PDF Deep Work “Professional activities performed in a state of distraction-free concentration that push your cognitive capabilities to their limit.


The following is mostly for my own personal reference. About 5 years ago, I worked as a technical support agent for a fairly large web hosting company that used to be located in Utah, and at the time I kept a small WordPress site with some tips and code snippets that often came in handy. I was cleaning out some old files and came across the SQL dump of that site from when I left that job.


July 4, 2018

TL;DR: CrashPlan died in May 2018, so I switched my family over to using Duplicati, restic, and rclone to sync backups both locally and in the cloud. As a bonus, we’re doing it for free for the next year with Google Cloud Storage due to their 12-month, $300 free trial. Motivation For the past two years or so from the date of this post, I’ve had my family using a piece of software known as CrashPlan to back up all of the computers belonging to my immediate family, my siblings and their families, and my parents.


Another quick post, but something that I hope might be useful to others. Within the past couple weeks, my 83-year-old grandmother passed away and my family held a memorial service in her honor. I was asked if I could help out in creating a program booklet or pamphlet that could be given out to the attendees, something that would describe the service itself as well as share a piece of my grandmother’s life with them as we gathered to remember her.


This is a quick post, but if you want a quick way to track body weight, or any metric you choose, and have an automatically updating dashboard, this is it. I used Google Forms to easily record the information, which is then read by a Google Apps Script that generates a chart with Chart.js. Example Screenshot I roughly followed the instructions found here by Ben Collins: Creating a d3 chart with data from Google Sheets.


Updating matplotlib figures dynamically seems to be a bit of a hassle, but the code below seems to do the trick. This is an example that outputs a figure with multiple subplots, each with multiple plots. Oddly enough, at the time of writing the image will be smaller than the figure until the Jupyter cell stops running, but this can be fixed by generating the figure in one cell, and then updating the image in a subsequent cell 1.
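The update-in-place idea from that post can be sketched roughly as follows: create the figure and its subplots once, keep references to the line artists, and then mutate their data rather than redrawing from scratch. This is a minimal, hedged sketch (the figure layout, `update` helper, and data are placeholders, not the post's actual code), run here on the non-interactive Agg backend; in a notebook you would use an interactive backend instead.

```python
import matplotlib
matplotlib.use("Agg")  # headless backend for this sketch; in Jupyter, use an interactive backend
import matplotlib.pyplot as plt
import numpy as np

# Create the figure once, with multiple subplots.
fig, axes = plt.subplots(2, 1, figsize=(6, 4))
x = np.linspace(0, 2 * np.pi, 100)

# Keep references to the line artists so they can be updated later.
lines = []
for ax in axes:
    (line,) = ax.plot(x, np.sin(x))
    ax.set_ylim(-1.5, 1.5)
    lines.append(line)

def update(phase):
    """Shift each curve by `phase` and redraw the existing figure in place."""
    for i, line in enumerate(lines):
        line.set_ydata(np.sin(x + phase + i))
    fig.canvas.draw_idle()  # schedule a redraw instead of rebuilding the figure

update(0.5)
```

In a notebook, the figure creation would live in one cell and repeated `update(...)` calls in a later cell, matching the two-cell workaround mentioned above.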


A lot of my lab work and course work involves the use of Jupyter notebooks, though the Python dependencies they need often conflict with those of other projects. I’ve been using virtualenvwrapper to isolate these and other project environments from each other. This post goes through the process of installing everything needed to get up and running with a clean Python environment for Jupyter notebooks with separate kernels for each environment, including the installation of jupyter_contrib_nbextensions, which adds community-developed features.
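The isolation idea behind that setup can be sketched with the standard library’s `venv` module, which virtualenvwrapper’s `mkvirtualenv`/`workon` commands wrap with a more convenient interface. The environment name and paths below are placeholders, and the kernel-registration step is shown as comments since it requires installing `ipykernel` first:

```shell
# Sketch of one isolated environment per project (stdlib venv;
# mkvirtualenv/workon from virtualenvwrapper wrap the same mechanism).
python3 -m venv "$HOME/.virtualenvs/demo-env"      # roughly: mkvirtualenv demo-env
. "$HOME/.virtualenvs/demo-env/bin/activate"       # roughly: workon demo-env
python -c 'import sys; print(sys.prefix)'          # interpreter now lives inside the env

# Inside the activated environment, the Jupyter pieces would then be:
#   pip install ipykernel jupyter_contrib_nbextensions
#   python -m ipykernel install --user --name demo-env
deactivate
```

Registering a kernel per environment is what lets one Jupyter installation offer a separate kernel entry for each project.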