<!DOCTYPE html>
<html lang=en>
<head>
<meta charset="utf-8">
<title>workings: a technical notebook</title>
<link rel=stylesheet href="workings.css" />
<link rel="alternate" type="application/atom+xml" title="changes" href="//squiggle.city/~brennen/workings-book/feed.xml" />
<script src="js/jquery.js" type="text/javascript"></script>
</head>
<body>
<h1 class=bigtitle>workings</h1>
<article>
<h1><a name=a-technical-notebook href=#a-technical-notebook>#</a> a technical notebook</h1>
<p>I&rsquo;ve been working on <a href="https://p1k3.com/userland-book/">a short book</a> on the basics of the command
line. This one is something else altogether. It&rsquo;s a long-form technical
logbook or journal, capturing the outlines of problems and their solutions as I
encounter them. It is dry, incomplete, and sometimes wrong. It is unlikely to
be useful for the general reader. I&rsquo;m compiling it because I want a record of
what I learn, and I hope that writing documentation will help me do cleaner,
more reproducible work. If it helps people with similar problems along the
way, so much the better.</p>
<p>&mdash; bpb / <a href="https://p1k3.com">p1k3</a>
/ <a href="https://ello.co/brennen">@brennen</a>
/ <a href="http://squiggle.city/~brennen/">~brennen</a></p>
<h2><a name=a-technical-notebook-copying href=#a-technical-notebook-copying>#</a> copying</h2>
<p><a href="http://creativecommons.org/licenses/by-sa/4.0/">CC BY-SA 4.0</a></p>
<div class=details>
<h2 class=clicker><a name=a-technical-notebook-contents href=#a-technical-notebook-contents>#</a> contents</h2>
<div class=full>
<div class=contents><ol>
<li><a href="#a-technical-notebook">a technical notebook</a>
<ul>
<li><a href="#a-technical-notebook-copying">copying</a></li>
<li><a href="#a-technical-notebook-contents">contents</a></li>
</ul>
</li>
<li><a href="#Wednesday-December-3-2014">Wednesday, December 3, 2014</a>
<ul>
<li><a href="#Wednesday-December-3-2014-makecitizen">makecitizen</a></li>
</ul>
</li>
<li><a href="#Friday-December-5-2014">Friday, December 5, 2014</a>
<ul>
<li><a href="#Friday-December-5-2014-notes-on-vim">notes on vim</a></li>
<li><a href="#Friday-December-5-2014-keybindings">keybindings</a></li>
</ul>
</li>
<li><a href="#Sunday-December-7-2014">Sunday, December 7, 2014</a>
<ul>
<li><a href="#Sunday-December-7-2014-notes-directory">notes directory</a></li>
</ul>
</li>
<li><a href="#Monday-December-8-2014">Monday, December 8, 2014</a>
<ul>
<li><a href="#Monday-December-8-2014-ssh">ssh</a></li>
<li><a href="#Monday-December-8-2014-mosh">mosh</a></li>
<li><a href="#Monday-December-8-2014-time-tracking">time tracking</a></li>
<li><a href="#Monday-December-8-2014-noobs-raspbian">noobs / raspbian</a></li>
<li><a href="#Monday-December-8-2014-beaglebone-black">beaglebone black</a></li>
<li><a href="#Monday-December-8-2014-reading-list">reading list</a></li>
</ul>
</li>
<li><a href="#Wednesday-December-10-2014">Wednesday, December 10, 2014</a>
<ul>
<li><a href="#Wednesday-December-10-2014-listusers-squiggle-city-repo">listusers / squiggle.city repo</a></li>
</ul>
</li>
<li><a href="#Thursday-December-18-2014">Thursday, December 18, 2014</a>
<ul>
<li><a href="#Thursday-December-18-2014-screencast-gifs">screencast gifs</a></li>
</ul>
</li>
<li><a href="#Friday-December-19-2014">Friday, December 19, 2014</a>
<ul>
<li><a href="#Friday-December-19-2014-drawing-tools">drawing tools</a></li>
</ul>
</li>
<li><a href="#Tuesday-December-23-2014">Tuesday, December 23, 2014</a>
<ul>
<li><a href="#Tuesday-December-23-2014-screenshots">screenshots</a></li>
</ul>
</li>
<li><a href="#Sunday-December-28-2014">Sunday, December 28, 2014</a>
<ul>
<li><a href="#Sunday-December-28-2014-candles-amp-candlemaking">candles &amp; candlemaking</a></li>
</ul>
</li>
<li><a href="#Saturday-January-3-2015">Saturday, January 3, 2015</a>
<ul>
<li><a href="#Saturday-January-3-2015-ipv6">ipv6</a></li>
</ul>
</li>
<li><a href="#Wednesday-January-7-2014">Wednesday, January 7, 2014</a>
<ul>
<li><a href="#Wednesday-January-7-2014-local-webservers-and-static-html-generation">local webservers and static html generation</a></li>
</ul>
</li>
<li><a href="#Monday-January-12">Monday, January 12</a>
<ul>
<li><a href="#Monday-January-12-Debian-packaging">Debian packaging</a></li>
<li><a href="#Monday-January-12-MS-DOS-AGT">MS-DOS / AGT</a></li>
</ul>
</li>
<li><a href="#Tuesday-January-13">Tuesday, January 13</a>
<ul>
<li><a href="#Tuesday-January-13-rtd-bus-schedules-transit-data">rtd / bus schedules / transit data</a></li>
</ul>
</li>
<li><a href="#Wednesday-January-14-2015">Wednesday, January 14, 2015</a></li>
<li><a href="#Friday-January-16">Friday, January 16</a></li>
<li><a href="#Tuesday-January-20">Tuesday, January 20</a></li>
<li><a href="#Thursday-January-22">Thursday, January 22</a>
<ul>
<li><a href="#Thursday-January-22-deleting-files-from-git-history">deleting files from git history</a></li>
<li><a href="#Thursday-January-22-postscript-on-finding-bugs">postscript: on finding bugs</a></li>
</ul>
</li>
<li><a href="#Sunday-January-25-2015">Sunday, January 25, 2015</a>
<ul>
<li><a href="#Sunday-January-25-2015-background-colors-for-tmux">background colors for tmux</a></li>
</ul>
</li>
<li><a href="#Tuesday-January-27">Tuesday, January 27</a>
<ul>
<li><a href="#Tuesday-January-27-what-version-of-what-linux-distribution-is-this">what version of what linux distribution is this?</a></li>
<li><a href="#Tuesday-January-27-armhf">armhf</a></li>
</ul>
</li>
<li><a href="#Wednesday-January-28">Wednesday, January 28</a>
<ul>
<li><a href="#Wednesday-January-28-on-replicating-process">on replicating process</a></li>
<li><a href="#Wednesday-January-28-what-makes-programming-hard">what makes programming hard?</a></li>
<li><a href="#Wednesday-January-28-debian-packaging-again">debian packaging again</a></li>
<li><a href="#Wednesday-January-28-vagrant">vagrant</a></li>
</ul>
</li>
<li><a href="#Thursday-January-29">Thursday, January 29</a>
<ul>
<li><a href="#Thursday-January-29-raspberry-pi-kernels">raspberry pi kernels</a></li>
</ul>
</li>
<li><a href="#Monday-February-2">Monday, February 2</a>
<ul>
<li><a href="#Monday-February-2-kernel-o-matic-amp-pi-finder">kernel-o-matic &amp; pi finder</a></li>
<li><a href="#Monday-February-2-raspberry-pi-2">raspberry pi 2</a></li>
<li><a href="#Monday-February-2-telling-composer-to-ignore-php-version-requirements">telling composer to ignore php version requirements</a></li>
</ul>
</li>
<li><a href="#Sunday-February-8">Sunday, February 8</a>
<ul>
<li><a href="#Sunday-February-8-systemd-amp-fsck">systemd &amp; fsck</a></li>
</ul>
</li>
<li><a href="#Monday-March-2">Monday, March 2</a>
<ul>
<li><a href="#Monday-March-2-python">python</a></li>
</ul>
</li>
<li><a href="#Thursday-April-9">Thursday, April 9</a>
<ul>
<li><a href="#Thursday-April-9-CGI-Fast-and-multi-param">CGI::Fast and multi_param()</a></li>
</ul>
</li>
<li><a href="#Monday-April-20">Monday, April 20</a>
<ul>
<li><a href="#Monday-April-20-getting-recent-posts-from-pinboard-machine-readably">getting recent posts from pinboard machine-readably</a></li>
</ul>
</li>
<li><a href="#Monday-January-18">Monday, January 18</a>
<ul>
<li><a href="#Monday-January-18-moved-to-p1k3-com">moved to p1k3.com</a></li>
</ul>
</li>
<li><a href="#tools-amp-toolchains-for-data-munging-amp-analysis">tools &amp; toolchains for data munging &amp; analysis</a>
<ul>
<li><a href="#tools-amp-toolchains-for-data-munging-amp-analysis-csvkit">csvkit</a></li>
<li><a href="#tools-amp-toolchains-for-data-munging-amp-analysis-jq">jq</a></li>
</ul>
</li>
<li><a href="#systemd-notes">systemd notes</a></li>
</ol>
</div>
</div>
</div>
</article>
<article>
<h1><a name=Wednesday-December-3-2014 href=#Wednesday-December-3-2014>#</a> Wednesday, December 3, 2014</h1>
<h2><a name=Wednesday-December-3-2014-makecitizen href=#Wednesday-December-3-2014-makecitizen>#</a> makecitizen</h2>
<p>{sysops, scripting, adduser, chfn}</p>
<p>Paul Ford sent out an e-mail to the tilde.club waitlist pointing at
~pfhawkins&rsquo;s list of other tildes, so I&rsquo;m getting signup requests. There are
enough that I want to write a script for adding a new squiggle.city user. I&rsquo;m
not determined to be very fancy about this right now; I just want to save some
keystrokes.</p>
<p>The first thing I do is google &ldquo;adduser&rdquo;. <code>adduser(1)</code> is basically just a
front end to <code>useradd(1)</code>. (This distinction will never stop being confusing,
and should probably be a lesson to anyone considering that naming pattern.) I
learn via Wikipedia that the metadata (name, room number, phone, etc.) which
adduser prompts for is called the
<a href="http://en.wikipedia.org/wiki/Gecos_field">GECOS field</a>, and is a relic of something
called the General Electric Comprehensive Operating System, which ran on some
machines at Bell Labs.</p>
<p>You can change that info with <code>chfn(1)</code>.</p>
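<p>For example, to set just the full name on an account (a made-up one here):</p>
<pre><code>$ sudo chfn -f 'J. Random User' jrandomuser
</code></pre>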
<p>What my script needs to do is:</p>
<ol>
<li>create a user with a given <code>$USERNAME</code></li>
<li>generate a random password for the user and tell me</li>
<li>do <code>chage -d0 $USERNAME</code></li>
<li>put a given public key in <code>~$USERNAME/.ssh/authorized_keys</code></li>
</ol>
<p>You can&rsquo;t log in to squiggle.city with a password, so why go to the trouble of
setting a random one and forcing users to change it at their first login?
Mostly because users are going to need to know a password for things like
changing their shell or in the case that they get operator privileges one day.</p>
<p>This is what I come up with, after a couple of even dumber iterations:</p>
<pre><code>#!/bin/bash

CITIZEN=$1
KEYSTRING=$2

# Complain and exit if we weren't given a username and a key:
if [[ ! $CITIZEN || ! $KEYSTRING ]]; then
  echo "usage: makecitizen &lt;username&gt; &lt;key&gt;"
  exit 64
fi

# this should actually check if a _user_ exists,
# not just the homedir
if [ -d /home/$CITIZEN ]; then
  echo "$CITIZEN already exists - giving up"
  exit 68
fi

PASSWORD=`apg -d -n2`

adduser --disabled-login $CITIZEN
echo "$CITIZEN:$PASSWORD" | chpasswd
chage -d 0 $CITIZEN

# make sure the user's .ssh directory exists before appending the key:
install -d -m 700 -o $CITIZEN -g $CITIZEN /home/$CITIZEN/.ssh
echo "$KEYSTRING" &gt;&gt; /home/$CITIZEN/.ssh/authorized_keys

echo "passwd: $PASSWORD"

exit 0
</code></pre>
<p>This is used like so:</p>
<pre><code>root@squiggle:~# ./makecitizen jrandomuser "ssh-rsa ..."
</code></pre>
<p>It&rsquo;ll still do <code>adduser</code> interactively, which is fine for my purposes.</p>
<p>I think this would be improved if it took a fullname and e-mail as input,
and then sent that person a message, or at least output the text of one,
telling them their password.</p>
<p>It&rsquo;d probably be improved even more than that if it operated in batch mode, was
totally idempotent, and could be driven off some separate file or output
containing the set of users.</p>
<p>(Thoughts like this are how systems like Puppet and Chef are born.)</p>
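<p>A rough sketch of that batch mode, assuming a hypothetical tab-separated <code>citizens.tsv</code> with one username and public key per line, might be no more than a loop that skips existing accounts:</p>
<pre><code>#!/bin/bash
# hypothetical wrapper: read username&lt;TAB&gt;key pairs and only create
# accounts that don't already exist
while IFS=$'\t' read -r username key; do
  if id "$username" &gt;/dev/null 2&gt;&amp;1; then
    echo "$username already exists - skipping"
    continue
  fi
  ./makecitizen "$username" "$key"
done &lt; citizens.tsv
</code></pre>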
</article>
<article>
<h1><a name=Friday-December-5-2014 href=#Friday-December-5-2014>#</a> Friday, December 5, 2014</h1>
<h2><a name=Friday-December-5-2014-notes-on-vim href=#Friday-December-5-2014-notes-on-vim>#</a> notes on vim</h2>
<p>Vim is a text editor. My slowly-evolving configuration can be found on GitHub,
in <a href="https://github.com/brennen/bpb-kit">bpb-kit</a>.</p>
<p><a href="https://github.com/thcipriani/">Tyler Cipriani</a> is a lot smarter than I am about vim (and, in
fact, most things), but I am particular and don&rsquo;t always share his preferences.</p>
<h2><a name=Friday-December-5-2014-keybindings href=#Friday-December-5-2014-keybindings>#</a> keybindings</h2>
<p>I&rsquo;m starting in on this notebook, which uses a Makefile, and think it might be
nice to have a quick vim keybinding for <code>:make</code>. I would use <code>F5</code>, by analogy
to QBasic, but I&rsquo;ve already bound that to <code>:wall</code>, which writes all the open
buffers with changes.</p>
<p>I think that maybe <code>&lt;leader&gt;m</code>, which in my case means <code>,m</code>, would be ok. Then
I&rsquo;m not sure if something is already mapped starting with that, so I try <code>:map</code>.</p>
<p>I want to search through the list produced by <code>:map</code>, and think it&rsquo;d be nice if
I could just read it into a buffer. The first thing I google is &ldquo;vim read
output of command into file&rdquo;. This could easily enough give hits for reading
the output of a shell command, but the 3rd thing down the page is
<a href="http://vim.wikia.com/wiki/Capture_ex_command_output">Capture ex command output</a>
on the Vim Tips Wiki.</p>
<p>There are a bunch of interesting ideas there, but the first basic idea is this:</p>
<pre><code>:redir @a
:set all
:redir END
</code></pre>
<p>Then you can open a new buffer - <code>:new</code> - and do <code>"ap</code>. This says &ldquo;using the named
register a, paste&rdquo;.</p>
<p>This seems to work with <code>:set all</code>, but not so much with <code>:map</code>. Why not? I skim
<code>:help map</code> and <code>:help redir</code> without getting very far. Updates to come.</p>
<p>With that digression still unanswered, the mapping I settled on is simple:</p>
<pre><code>nmap &lt;leader&gt;m :make&lt;CR&gt;
</code></pre>
<p>I never know if these are going to take with me. The handful of custom
bindings that have actually entered my vocabulary are super-useful, but more
often than not I wind up forgetting about an idea not long after I&rsquo;ve
implemented it.</p>
</article>
<article>
<h1><a name=Sunday-December-7-2014 href=#Sunday-December-7-2014>#</a> Sunday, December 7, 2014</h1>
<h2><a name=Sunday-December-7-2014-notes-directory href=#Sunday-December-7-2014-notes-directory>#</a> notes directory</h2>
<p>On organizing todo lists, see <a href="https://p1k3.com/2014/8/23">the p1k3 entry from August of
2014</a>.</p>
<p>For years now, I&rsquo;ve kept that sort of thing in a <code>notes.txt</code>. At some point
notes.txt got its own directory with a haphazard jumble of auxiliary files. It
looks like I turned that directory into a git repository a couple of years ago.</p>
<p>Unlike a lot of what I keep in git, <code>~/notes/</code> isn&rsquo;t meant for any kind of
publication. In fact, it&rsquo;d be pretty dumb to let it out in the world. So I got
to thinking: I should really encrypt this.</p>
<p>So what&rsquo;s the best way to encrypt a single directory on Linux?</p>
<p>Two search strings:</p>
<ul>
<li>linux encrypted directory</li>
<li>encrypted git repo</li>
</ul>
<p>It looks like maybe <a href="http://ecryptfs.org/">eCryptfs</a> is the thing? This machine&rsquo;s an
Ubuntu, so let&rsquo;s see what we can find:</p>
<pre><code>$ apt-cache search ecryptfs
ecryptfs-utils - ecryptfs cryptographic filesystem (utilities)
ecryptfs-utils-dbg - ecryptfs cryptographic filesystem (utilities; debug)
libecryptfs-dev - ecryptfs cryptographic filesystem (development)
libecryptfs0 - ecryptfs cryptographic filesystem (library)
python-ecryptfs - ecryptfs cryptographic filesystem (python)
zescrow-client - back up eCryptfs Encrypted Home or Encrypted Private Configuration
</code></pre>
<p>Google suggests that ecryptfs-utils might be what I&rsquo;m looking for.</p>
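<p>The basic move, as far as I can tell, is to mount the directory over itself with the eCryptfs filesystem type; <code>mount</code> then prompts interactively for a passphrase and cipher options. A sketch I haven&rsquo;t actually run yet:</p>
<pre><code>$ sudo apt-get install ecryptfs-utils
$ sudo mount -t ecryptfs /home/brennen/notes /home/brennen/notes
</code></pre>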
<p>I become distracted reading about protests and leave this idea for another day.</p>
</article>
<article>
<h1><a name=Monday-December-8-2014 href=#Monday-December-8-2014>#</a> Monday, December 8, 2014</h1>
<h2><a name=Monday-December-8-2014-ssh href=#Monday-December-8-2014-ssh>#</a> ssh</h2>
<p>I use SSH for damn near everything. We need SSH for damn near everything.</p>
<p>I have this thought that SSH is quite possibly the only end-user-exposed
implementation of acceptable crypto in wide use which actually satisfies the
&ldquo;actual human beings can use this&rdquo; constraint at the same time as satisfying
the &ldquo;this makes your shit relatively secure&rdquo; constraint. That&rsquo;s not to say
it&rsquo;s easy for the average mortal to comprehend, but it beats the shit out of
almost everything else I can think of.</p>
<p>In &ldquo;almost everything else&rdquo;, I include SSL/TLS/HTTPS, which sort-of works as
far as the general user population of browsers is concerned, much of the time,
but which is an absolute nightmare to administer and which is a fundamentally
broken design on a political / systems-of-control / economic /
regular-admins-get-this-right level. Arguably, the only thing that has been
worse for the wide adoption of crypto by normal users than SSL/TLS is PGP.</p>
<p>DISCLAIMER: I DON&rsquo;T KNOW SHIT ABOUT CRYPTO. Tell me how I&rsquo;m wrong.</p>
<p style="text-align:center;"></p>
<ul>
<li>&ldquo;<a href="http://harmful.cat-v.org/software/ssh">Sorry Theo, but SSH Sucks</a>&rdquo;</li>
</ul>
<h2><a name=Monday-December-8-2014-mosh href=#Monday-December-8-2014-mosh>#</a> mosh</h2>
<p>I&rsquo;m not exactly sure when mosh started to catch on with people I know, but I&rsquo;d
say it&rsquo;s on the order of a year or two that I&rsquo;ve been aware of it. The basic
thing here is that it&rsquo;s essentially OpenSSH with better characteristics for a
specific cluster of use cases:</p>
<ul>
<li>laggy, high-latency, intermittently-broken network connections</li>
<li>client machines that frequently hop networks and/or suspend operations</li>
<li>unreliable VPNs (which is to say very nearly all VPNs in actual use)</li>
</ul>
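<p>Usage is essentially a drop-in replacement for plain <code>ssh</code>, provided mosh is installed on both ends:</p>
<pre><code>$ sudo apt-get install mosh
$ mosh brennen@squiggle.city
</code></pre>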
<h2><a name=Monday-December-8-2014-time-tracking href=#Monday-December-8-2014-time-tracking>#</a> time tracking</h2>
<p>I&rsquo;m about to start in on some remote contracting stuff, so I go looking for a
time tracking tool. For the moment I settle on this little tray widget called
<a href="http://projecthamster.wordpress.com/">hamster</a>, which looks functional if not
precisely inspiring.</p>
<h2><a name=Monday-December-8-2014-noobs-raspbian href=#Monday-December-8-2014-noobs-raspbian>#</a> noobs / raspbian</h2>
<p>Last year I did a bunch of work on a Raspberry Pi, but it&rsquo;s been a few months
since I booted one up. I got a model B+ (more USB ports, various hardware
tweaks, takes a microSD card instead of the full-size one) in my last employee
order at SparkFun, and I&rsquo;m stepping through what seems to be the stock
recommended installation process.</p>
<ul>
<li><a href="http://www.raspberrypi.org/new-raspbian-and-noobs-releases/">http://www.raspberrypi.org/new-raspbian-and-noobs-releases/</a></li>
<li><a href="http://www.raspberrypi.org/downloads/">http://www.raspberrypi.org/downloads/</a></li>
<li><a href="http://downloads.raspberrypi.org/NOOBS_latest.torrent">http://downloads.raspberrypi.org/NOOBS_latest.torrent</a></li>
</ul>
<p>I torrented <code>NOOBS_v1_3_10.zip</code>. Be careful unzipping this one - everything is at
the top level of the archive (advice to distributors of basically anything: don&rsquo;t
do that).</p>
<p>If I&rsquo;d been smart I probably would have done:</p>
<pre><code>$ mkdir noobs &amp;&amp; unzip NOOBS_v1_3_10.zip -d noobs/
</code></pre>
<p>The basic system here is &ldquo;get an SD card, put the stuff in this zip file on the
SD card, put it in the Pi&rdquo;. Everything about this has always felt kind of
weird (if not actively broken) to me, but it&rsquo;s probably important to remember
that for most users &ldquo;put some files on this media&rdquo; is a lot easier than &ldquo;image
this media with the filesystem contained in this file&rdquo;.</p>
<p style="text-align:center;"></p>
<p>So I plug in all the stuff: microSD card, keyboard, HDMI cable to random spare
monitor, power.</p>
<p>Nothing. Well, almost nothing. Blinkenlights, no video output. Red light is
steady, green light blinks a couple of times periodically.</p>
<p>I am reminded that this is, fundamentally, a terrible piece of hardware.</p>
<p>Power down, remove SD card, mount SD card on Linux machine, google variously,
delete and recreate FAT32 partition using gparted, re-copy NOOBS files, unmount
SD card, replace card in Pi, re-apply power.</p>
<p>Green LED flashes spasmodically for a bit then seems mostly off, but is actually
flickering faintly on closer examination. Red light is solid.</p>
<p><a href="http://elinux.org/R-Pi_Troubleshooting#Red_power_LED_is_on.2C_green_LED_does_not_flash.2C_nothing_on_display">This wiki page</a>
suggests this means that no boot code has been executed at all. It&rsquo;s failing to
read the card, or it&rsquo;s missing some file, or something is corrupt.</p>
<p>Ok, so, mount SD card on Linux machine again; immediately discover that the
card is now a volume called &ldquo;SETTINGS&rdquo;, or seems to be.</p>
<pre><code>$ ls /media/brennen/SETTINGS
lost+found
noobs.conf
$ cat /media/brennen/SETTINGS/noobs.conf
[General]
display_mode=0
keyboard_layout=gb
language=en
brennen@desiderata 15:52:24 /home/brennen ★ sudo parted /dev/mmcblk0
GNU Parted 2.3
Using /dev/mmcblk0
Welcome to GNU Parted! Type 'help' to view a list of commands.
(parted) print
Model: SD SL16G (sd/mmc)
Disk /dev/mmcblk0: 15.9GB
Sector size (logical/physical): 512B/512B
Partition Table: msdos
Number  Start   End     Size    Type      File system  Flags
 1      1049kB  823MB   822MB   primary   fat32        lba
 2      826MB   15.9GB  15.1GB  extended
 3      15.9GB  15.9GB  33.6MB  primary   ext4
(parted)
</code></pre>
<p>Well, obviously something ran, because I definitely didn&rsquo;t arrange anything
that way. And this seems a little telling:</p>
<pre><code>brennen@desiderata 15:55:36 /home/brennen ★ dmesg | tail -12
[51329.226687] mmc0: card aaaa removed
[51776.154562] mmc0: new high speed SDHC card at address aaaa
[51776.154894] mmcblk0: mmc0:aaaa SL16G 14.8 GiB
[51776.169240] mmcblk0: p1 p2 &lt; &gt; p3
[51781.342106] EXT4-fs (mmcblk0p3): mounted filesystem with ordered data mode. Opts: (null)
[51791.757878] mmc0: card aaaa removed
[51791.773880] JBD2: Error -5 detected when updating journal superblock for mmcblk0p3-8.
[51793.651277] mmc0: new high speed SDHC card at address aaaa
[51793.651601] mmcblk0: mmc0:aaaa SL16G 14.8 GiB
[51793.666335] mmcblk0: p1 p2 &lt; &gt; p3
[51799.516813] EXT4-fs (mmcblk0p3): recovery complete
[51799.518183] EXT4-fs (mmcblk0p3): mounted filesystem with ordered data mode. Opts: (null)
</code></pre>
<p>(The &ldquo;Error -5 detected&rdquo; bit.)</p>
<p>Ok, so I bought a new Sandisk-branded card because I didn&rsquo;t have a decently
fast microSD card laying around. What I&rsquo;m going to check before I go any
further is whether I got one the Pi can&rsquo;t deal with. (Or just one that&rsquo;s bunk.
I bought this thing for 15 bucks at Best Buy, so who knows.)</p>
<p>Here&rsquo;s an 8 gig class 4 card, branded Kingston, but I probably got it off the
shelves at SparkFun some time in the last 3 years, so its actual provenance is
anybody&rsquo;s guess. Looking at what&rsquo;s on here, I&rsquo;ve already used it for a
Raspberry Pi of some flavor in the past. Let&rsquo;s see if it&rsquo;ll boot as-is.</p>
<p>Ok, no dice. I&rsquo;m starting to suspect my problem lies elsewhere, but I&rsquo;ll try
one more time on this card with NOOBS.</p>
<p>Again: No dice.</p>
<p>Also checked:</p>
<ul>
<li>the monitor with other inputs, because who knows</li>
<li>tried a couple of different power supplies - USB cable from my laptop, 5V
wall wart purchased from SFE, cell phone charger.</li>
<li>the usual plug-things-in-one-at-a-time routine.</li>
</ul>
<p style="text-align:center;"></p>
<p>Time to try one of these cards with an older RasPi, if I can figure out where I
put any of them.</p>
<p>After much shuffling through stuff on my dining room table / workbench, I find
a model B. It fails in much the same way, which leads me to suspect again that
I&rsquo;m doing something wrong with the card, but then I can&rsquo;t quite remember if
this one still worked the last time I plugged it in. They can be fragile
little critters.</p>
<p>Here&rsquo;s a thought, using a Raspbian image I grabbed much earlier this year:</p>
<pre><code>brennen@desiderata 17:10:03 /home/brennen/isos ★ sudo dd if=/home/brennen/isos/2014-01-07-wheezy-raspbian.img of=/dev/mmcblk0
</code></pre>
<p>No dice on either the model B or model B+, using the new SanDisk.</p>
<p>Trying with the older card, <code>dd</code> spins through 800ish megs before giving me an I/O error.</p>
<p>It may be time to start drinking.</p>
<p style="text-align:center;"></p>
<p>The next day: I swing through a couple of stores in town with the <a href="http://www.raspberrypi.org/forums/viewtopic.php?t=58151">wiki list
of known cards in hand</a> and buy a pile
of cards across a handful of brands, plus a $20 card reader (the Insignia
NS-CR20A1) since there&rsquo;s not one built in to the laptop I&rsquo;m carrying today.
The first card I try boots NOOBS instantly; an installer is running as I type
this.</p>
<p>Suddenly it occurs to me that the card reader on the laptop I was using last
night is likely dying/dead.</p>
<p>This is a really slick install process now, so good work to somebody on that.</p>
<h2><a name=Monday-December-8-2014-beaglebone-black href=#Monday-December-8-2014-beaglebone-black>#</a> beaglebone black</h2>
<p>I&rsquo;ve got a Beaglebone Black sitting here new in the box. It comes with a USB
cable, so I plug it in. Instantly there are bright blue blinky lights, and my
laptop tells me I&rsquo;m connected to an ethernet network and I&rsquo;ve got a new drive
mounted with some README files in it.</p>
<p>This is kind of great.</p>
<p>Browsing to 192.168.7.2 gets a bunch of docs and a link to Cloud9, an
in-browser IDE that happens to include a root terminal.</p>
<p>I don&rsquo;t really know what&rsquo;s going on here. I think it might be a little
scattered and confused as a user experience, in some ways. But it immediately
strikes me as good tech in a bunch of ways.</p>
<p>Josh Datko, who I&rsquo;ve gotten to know a little bit, has a book called <em>Beaglebone
for Secret Agents</em>. It&rsquo;s been on my ever-growing to-read list for a while; I&rsquo;m
going to have to give it a look sooner rather than later.</p>
<h2><a name=Monday-December-8-2014-reading-list href=#Monday-December-8-2014-reading-list>#</a> reading list</h2>
<ul>
<li><a href="http://www.jann.cc/2013/01/15/trying_out_the_adafruit_webide.html">http://www.jann.cc/2013/01/15/trying_out_the_adafruit_webide.html</a></li>
<li><a href="http://www.angstrom-distribution.org/">http://www.angstrom-distribution.org/</a></li>
<li><a href="http://www.raspberrypi.org/documentation/configuration/config-txt.md">http://www.raspberrypi.org/documentation/configuration/config-txt.md</a></li>
</ul>
</article>
<article>
<h1><a name=Wednesday-December-10-2014 href=#Wednesday-December-10-2014>#</a> Wednesday, December 10, 2014</h1>
<h2><a name=Wednesday-December-10-2014-listusers-squiggle-city-repo href=#Wednesday-December-10-2014-listusers-squiggle-city-repo>#</a> listusers / squiggle.city repo</h2>
<p>There&rsquo;s now a <a href="https://github.com/squigglecity/">squigglecity organization on GitHub</a>.
What little is there is a classic duct-tape mess complete with a bunch of
commits made as root, but may contain a few useful bits.</p>
<p>I&rsquo;m planning to clean up <a href="https://github.com/squigglecity/squiggle.city/blob/1a07ccc8415b05ad239116a062d2992ae1537541/listusers.pl">this version of listusers.pl</a> into a
more generic <code>listusers</code> utility that just outputs TSV and pipe to csvkit / <code>jq</code>
for HTML &amp; JSON.</p>
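<p>Something like this, assuming the eventual <code>listusers</code> emits a header row plus one tab-separated record per user (field names and filenames here are invented for the sake of the example):</p>
<pre><code>$ listusers &gt; users.tsv
$ csvjson --tabs users.tsv &gt; users.json
</code></pre>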
<p>Oh, right &mdash; about the JSON. ~ford proposed a standard <code>tilde.json</code> <a href="http://squiggle.city/tilde.json">kind of
like this</a>, which I think is not a terrible
idea at all though that one&rsquo;s a bit rough and the format could still use a
little tweaking as of this writing.</p>
<p>This is the kind of thing it&rsquo;s unbelievably easy to overthink. I&rsquo;m hoping
we&rsquo;ll give it enough thought to do a few smart things but not so much thought
that no one actually uses it.</p>
</article>
<article>
<h1><a name=Thursday-December-18-2014 href=#Thursday-December-18-2014>#</a> Thursday, December 18, 2014</h1>
<h2><a name=Thursday-December-18-2014-screencast-gifs href=#Thursday-December-18-2014-screencast-gifs>#</a> screencast gifs</h2>
<p>Looking to make some GIFs of things that happen on my screen, found <code>byzanz</code>.</p>
<pre><code>$ sudo apt-get install byzanz
byzanz-record -x 1 -y 1 --delay=4 -h 150 -w 700 hello_world.gif
</code></pre>
<p>Options:</p>
<ul>
<li><code>-x</code> and <code>-y</code> set origin of capture on screen</li>
<li><code>-h</code> and <code>-w</code> set height and width to capture</li>
</ul>
<p>I think I need a more clever way to trigger / manage this than just fiddling
with CLI options, but it works really well and produces lightweight image
files.</p>
<p>I think it would be cool if there were a utility that let me use arrow keys /
hjkl / the mouse cursor to visually select a region of the screen. It could
return x, y, height, and width, then I&rsquo;d let byzanz handle the capture.</p>
<p>That can&rsquo;t be the <em>hardest</em> thing in the world to do.</p>
<p style="text-align:center;"></p>
<p><a href="http://www.semicomplete.com/projects/xdotool/">xdotool</a> seems like kind of a
swiss army knife, and has a <code>getmouselocation</code> command. Theoretically, at
least, you can have it respond to events, including a mouse click. I can&rsquo;t
quite wrap my head around how this is supposed to work, and my first few
attempts fall flat.</p>
<p><a href="https://www.gnu.org/software/xnee/">GNU xnee</a> might also be promising, but I
don&rsquo;t really get anywhere with it.</p>
<p>Eventually I find an
<a href="http://askubuntu.com/questions/107726/how-to-create-animated-gif-images-of-a-screencast">Ask Ubuntu</a>
thread on creating screencast gifs, which points to
<a href="https://github.com/lolilolicon/xrectsel">xrectsel</a>, a tool for
returning the coordinates and size of a screen region selected with the mouse:</p>
<pre><code>brennen@desiderata 22:06:28 /var/www/workings-book (master) ★ xrectsel "%x %y %w %h"
432 130 718 575%
</code></pre>
<p>I wind up with <a href="https://github.com/brennen/bpb-kit/blob/master/bin/gif_sel"><code>gif_sel</code></a>:</p>
<pre><code>#!/usr/bin/env bash
# requires:
# https://github.com/lolilolicon/xrectsel.git
eval `xrectsel "BYZANZ_X=%x; BYZANZ_Y=%y; BYZANZ_WIDTH=%w; BYZANZ_HEIGHT=%h"`
byzanz-record -x $BYZANZ_X -y $BYZANZ_Y --delay=4 -h $BYZANZ_HEIGHT -w $BYZANZ_WIDTH ~/screenshots/screencast-`date +"%Y-%m-%d-%T"`.gif
</code></pre>
<p>I&rsquo;ll probably wind up with a couple of wrappers for this for different lengths
of recording (for starting with dmenu), though it would be nice if I could just
have it record until I press some hotkey.</p>
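<p>A sketch of one such wrapper, taking a recording length in seconds as its first argument and leaning on <code>byzanz-record</code>&rsquo;s <code>--duration</code> flag (the default length and output path here are just placeholders):</p>
<pre><code>#!/usr/bin/env bash
# record a selected region for a fixed number of seconds - suitable for
# launching from dmenu; defaults to 10 seconds
DURATION=${1:-10}
eval `xrectsel "BYZANZ_X=%x; BYZANZ_Y=%y; BYZANZ_WIDTH=%w; BYZANZ_HEIGHT=%h"`
byzanz-record -x $BYZANZ_X -y $BYZANZ_Y --delay=4 --duration=$DURATION -h $BYZANZ_HEIGHT -w $BYZANZ_WIDTH ~/screenshots/screencast-`date +"%Y-%m-%d-%T"`.gif
</code></pre>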
</article>
<article>
<h1><a name=Friday-December-19-2014 href=#Friday-December-19-2014>#</a> Friday, December 19, 2014</h1>
<p>{timetracking}</p>
<p>So hamster really doesn&rsquo;t scratch my particular itch all that well. Rather
than devote any serious brain energy to finding or writing a replacement that
does, I&rsquo;ve decided to just use a text file.</p>
<p>It looks like the following:</p>
<pre><code>2014-12-17 21:55 - 2014-12-17 23:40
2014-12-18 10:05 - 2014-12-18 12:50
2014-12-18 13:45 - 2014-12-18 16:00
</code></pre>
<p>This is just two datetimes for each range of time when I&rsquo;m working on a given
thing, delimited by <code>/ - /</code>. I just want a quick script to tally the time
represented. (Later, if I need to track more than one project, I&rsquo;ll expand on
this by adding a project name and/or notes to the end of the line.)</p>
<p>It kind of seems like I should be able to do this with GNU <code>date</code>, but let&rsquo;s
find out. Here&rsquo;re the <a href="https://www.gnu.org/software/coreutils/manual/html_node/Examples-of-date.html">official examples</a>. This sounds about
right:</p>
<blockquote><p>To convert a date string to the number of seconds since the epoch (which is
1970-01-01 00:00:00 UTC), use the --date option with the ‘%s’ format. That
can be useful in sorting and/or graphing and/or comparing data by date. The
following command outputs the number of the seconds since the epoch for the
time two minutes after the epoch:</p>
<pre><code> date --date='1970-01-01 00:02:00 +0000' +%s
120
</code></pre></blockquote>
<p>As a test case, I start here:</p>
<pre><code>$ cat ~/bin/timelog
#!/usr/bin/env bash
date --date="$1" +%s
$ timelog '2014-12-17 21:55'
1418878500
</code></pre>
<p>Ok, groovy.</p>
<p>I was going to do the rest of this in shell or awk or something, but then I
thought &ldquo;I should not spend more than 10 minutes on this&rdquo;, and wrote the following
Perl:</p>
<pre><code>#!/usr/bin/env perl

use warnings;
use strict;
use 5.10.0;

my $total_hours = 0;

# while we've got input from a file/stdin, split it into two datestamps
# and feed that to date(1)
while (my $line = &lt;&gt;) {
  chomp($line);
  my ($start, $end) = map { get_seconds($_) } split / - /, $line;
  my $interval = $end - $start;
  my $hours = $interval / 3600;
  $total_hours += $hours;
  say sprintf("$line - %.3f hours", $hours);
}

say sprintf("%.3f total hours", $total_hours);

sub get_seconds {
  my ($stamp) = @_;
  my $seconds = `date --date="$stamp" +%s`;
  chomp($seconds);
  return $seconds;
}
</code></pre>
<p>Which gives this sort of output:</p>
<pre><code>brennen@desiderata 14:54:38 /home/brennen/bin (master) ★ timelog ~/notes/some_employer.txt
2014-12-15 13:10 - 2014-12-15 14:35 - 1.417 hours
2014-12-16 10:00 - 2014-12-16 12:55 - 2.917 hours
2014-12-16 14:00 - 2014-12-16 17:15 - 3.250 hours
2014-12-17 15:00 - 2014-12-17 16:51 - 1.850 hours
2014-12-17 21:55 - 2014-12-17 23:40 - 1.750 hours
2014-12-18 10:05 - 2014-12-18 12:50 - 2.750 hours
2014-12-18 13:45 - 2014-12-18 16:00 - 2.250 hours
2014-12-18 17:00 - 2014-12-18 17:30 - 0.500 hours
16.683 total hours
</code></pre>
<p>This is me once again being lazy and treating Perl as a way to wrap shell
utilities when I want to easily chop stuff up and do arithmetic. It is <em>many
kinds of wrong</em> to do things this way, but right now I don&rsquo;t care.</p>
<p>If this were going to be used by anyone but me I would do it in pure-Perl and
make it robust against stupid input.</p>
<h2><a name=Friday-December-19-2014-drawing-tools href=#Friday-December-19-2014-drawing-tools>#</a> drawing tools</h2>
<p>Ok, so because I&rsquo;m starting to poke at drawing again for the first time in
quite a while (even to the extent that I&rsquo;ll soon be publishing some stuff that
includes cartoon graphics, despite having <em>no</em> idea what I&rsquo;m doing), I thought
I&rsquo;d take some rough notes on where I&rsquo;m at with toolset.</p>
<p>The first thing is that I&rsquo;m not using any Adobe tools, or indeed any
proprietary software (unless you count the firmware on my cameras and maybe
Flickr) to work with images. I am fully aware that this is a <em>ridiculous</em>
limitation to self-impose, but I want to stick with it as best I can.</p>
<p>For a long time, I&rsquo;ve sort of fumbled my way through GIMP whenever I needed to
do the kind of light image editing stuff that inevitably comes up in the life
of a web developer no matter how many things you foist off on your
Photoshop-skilled, design-happy coworkers. I think GIMP gets kind of an unfair
rap; it&rsquo;s a pretty capable piece of software. That said, I&rsquo;ve still never
really put the time in to get genuinely skilled with it, and it&rsquo;s not the most
accessible thing for just doodling around.</p>
<p>Several years back, I <a href="https://p1k3.com/2011/4/13">bought a cheap Wacom tablet</a>.
I was maybe a little optimistic in that writeup, but I still really enjoy
<a href="http://mypaint.intilinux.com/">MyPaint</a>. The problem is that, while it&rsquo;s really
fun for a sketchy/painty/extemporaneous kind of workflow, and dovetails
beautifully with the tablet interface, it deliberately eschews a lot of features
that you start to want for <em>editing</em> an image. I don&rsquo;t blame its developers for
that &mdash; they&rsquo;re obviously trying to do a certain kind of thing, and constraints
often make for great art &mdash; but I&rsquo;m wondering if I can&rsquo;t get some of the same
vibe with a tool that also lets me easily cut/copy/scale stuff.</p>
<p>I&rsquo;m giving <a href="https://krita.org/">Krita</a> a shot with that in mind. It has a real
KDE vibe to it. Lots of modular GUI widgets, menus, etc. A little
bureaucratic. It doesn&rsquo;t feel as fluid or immediate as MyPaint right out of
the gate, but it&rsquo;s definitely got more in the way of features. Could grow on
me.</p>
</article>
<article>
<h1><a name=Tuesday-December-23-2014 href=#Tuesday-December-23-2014>#</a> Tuesday, December 23, 2014</h1>
<h2><a name=Tuesday-December-23-2014-screenshots href=#Tuesday-December-23-2014-screenshots>#</a> screenshots</h2>
<p>Looking to streamline capture of static screenshots a bit. Options:</p>
<ul>
<li><code>gnome-screenshot</code> - use this already, it&rsquo;s fine, whatever.</li>
<li><code>shutter</code> - weirdness with my xmonad setup? Errors and I don&rsquo;t feel like taking
the time to find out why.</li>
<li><code>scrot</code> - buncha nice command line options</li>
</ul>
<p>I wind up forking Tyler&rsquo;s <a href="https://github.com/thcipriani/dotfiles/blob/master/bin/grab">grab</a>,
a nice wrapper for <code>scrot</code>, which is pretty much what I was going to write anyway.</p>
<p>This is pretty good at defining a region for a static screenshot.</p>
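<p>For reference, the core of a region grab with <code>scrot</code> is just interactive selection plus a strftime-style filename &mdash; not necessarily what <code>grab</code> does internally, but roughly the idea:</p>
<pre><code>$ scrot -s ~/screenshots/%Y-%m-%d-%T_screenshot.png
</code></pre>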
</article>
<article>
<h1><a name=Sunday-December-28-2014 href=#Sunday-December-28-2014>#</a> Sunday, December 28, 2014</h1>
<h2><a name=Sunday-December-28-2014-candles-amp-candlemaking href=#Sunday-December-28-2014-candles-amp-candlemaking>#</a> candles &amp; candlemaking</h2>
<p>A year ago at Christmastime, I decided to see what kind of candlemaking
supplies were still at my parents' house, and wound up digging a couple of big
Rubbermaid tubs worth of molds, dyes, additives, wick, wax, &amp;c out of the
basement.</p>
<p>I used to do this a lot, but I&rsquo;ve mostly forgotten the details of technique.</p>
<p>Rough notes:</p>
<ul>
<li>Wax temperature when pouring is important. I&rsquo;m aiming for 210-220 F
with metal molds, but it&rsquo;s hard to get there with the little hot plate I&rsquo;m
using. I can usually get it just over 200, according to the thermometer
I&rsquo;ve got. This doesn&rsquo;t seem to be doing too much damage, but I do think
the results would be a little better with hotter wax.</li>
<li>You&rsquo;re supposed to use a proper double boiler or a purpose-built wax melter.
I put various sizes of can in some water in a medium size pan.</li>
<li>I remember that I used to melt wax on the woodstove in my dad&rsquo;s shop, but if
so we must have been running the stove hotter in those days or I had a lot
more patience. It does work well for holding wax at a reasonable
temperature until you have to do a second pour.</li>
<li>With metal molds, keeping the wax from streaming out the wick hole at the
bottom is often kind of problematic. I think you&rsquo;re supposed to affix the
wicking with a little screw and put some tacky putty-type stuff over the
screw, but if you&rsquo;re low on the putty or don&rsquo;t have just the right size
screw this doesn&rsquo;t work so great. Things tried this time around: The
remaining putty and then everything kind of smashed down on a wood block
(Ben&rsquo;s idea), pouring a little wax in the bottom and letting it harden first,
the wrong size screw, silicone caulk. The wood block and the silicone caulk
both worked pretty well.</li>
<li>You can dye beeswax, but you have to keep in mind that the stuff is already
pretty yellow and opaque. Shades of green work well. Other colors&hellip; Well,
I wound up with some the color of a strange weird woodland fungus.</li>
<li>Last time I did this, I wound up with a bunch of pillars that burned really
poorly and with a small flame. I think I wasn&rsquo;t using a heavy enough wick.
Tried to go with heavier braided wicking this time. Guess I&rsquo;ll see how that
pans out.</li>
</ul>
</article>
<article>
<h1><a name=Saturday-January-3-2015 href=#Saturday-January-3-2015>#</a> Saturday, January 3, 2015</h1>
<h2><a name=Saturday-January-3-2015-ipv6 href=#Saturday-January-3-2015-ipv6>#</a> ipv6</h2>
<p>I was hanging out on the internet and heard that <a href="&#109;&#97;&#105;&#108;&#116;&#111;&#58;&#105;&#109;&#116;&#64;&#112;&#114;&#111;&#116;&#111;&#99;&#111;&#108;&#46;&#99;&#108;&#117;&#98;">&#105;&#109;&#116;&#64;&#112;&#114;&#111;&#116;&#111;&#99;&#111;&#108;&#46;&#99;&#108;&#117;&#98;</a> had set up
<a href="https://club6.nl">club6.nl</a>, a tildebox reachable only over ipv6. I applied
for an account and <a href="https://club6.nl/~brennen">got one</a> (very speedy turnaround,
<a href="https://club6.nl/~imt/">~imt</a>).</p>
<p>The next problem was how to connect. I am an utter prole when it comes to
networking. The first thing I remembered was that DigitalOcean optionally
supports ipv6 when creating a new droplet, and sure enough they
also <a href="https://www.digitalocean.com/community/tutorials/how-to-enable-ipv6-for-digitalocean-droplets">have a guide for enabling it</a> on existing droplets.</p>
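<p>From a v6-enabled host, checking that it all works is just a matter of the right flags:</p>
<pre><code>$ ping6 -c 3 club6.nl
$ ssh -6 brennen@club6.nl
</code></pre>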
<p>TODO: Get my own sites resolving and reachable via ipv6.</p>
</article>
<article>
<h1><a name=Wednesday-January-7-2014 href=#Wednesday-January-7-2014>#</a> Wednesday, January 7, 2015</h1>
<h2><a name=Wednesday-January-7-2014-local-webservers-and-static-html-generation href=#Wednesday-January-7-2014-local-webservers-and-static-html-generation>#</a> local webservers and static html generation</h2>
<p>I haven&rsquo;t always run an httpd on my main local machine, but I&rsquo;ve been doing it
again for the last year or two now, and it feels like a major help. I started
by setting up a development copy of <a href="https://github.com/brennen/display">display</a> under Apache, then noticed
that it was kind of nice to use it for static files. I&rsquo;m not sure why it&rsquo;s any
better than accessing them via the filesystem, except maybe that
<code>localhost/foo</code> is easier to type than <code>file:///home/brennen/something/foo</code>, but
it has definitely made me better at checking things before I publish them.</p>
<p>(Why Apache? Well, it was easier to re-derive the configuration I needed for
p1k3 things under Apache than write it from scratch under nginx, although one
of these days I may make the leap anyway. I don&rsquo;t see any reason Perl FastCGI
shouldn&rsquo;t work under nginx. I also still think Apache has its merits, though
most of my domain knowledge has evaporated over the last few years of doing
mainly php-fpm under nginx.)</p>
<p>I&rsquo;ve resisted the static blog engine thing for a long time now, but lately my
favorite way to write things is a super-minimal <code>Makefile</code>, some files in
Markdown, and a little bit of Perl wrapping <code>Text::Markdown::Discount</code>. I
haven&rsquo;t yet consolidated all these tools into a single generically reusable
piece of software, but it would probably be easy enough, and I&rsquo;ll probably go
for it when I start a third book using this approach.</p>
<p>I&rsquo;d like to be able to define something like a standard <code>book/</code> dir that would
be to a given text what <code>.git/</code> is to the working copy of a repo. I suppose
you wouldn&rsquo;t need much.</p>
<pre><code>book/
authors
title
description
license
toc
</code></pre>
<p><code>toc</code> would just be an ordered list of files to include as &ldquo;chapters&rdquo; from the
root of the project. You&rsquo;d just organize it however you liked and optionally
use commands like</p>
<pre><code>book add chapter/index.md after other_chapter/index.md
book move chapter/index.md before other_chapter/index.md
</code></pre>
<p>to manage it, though really a text editor should be enough. (Maybe I&rsquo;m
overthinking this. Maybe there should just be a directory full of chapters
sorted numerically on leading digits or something, but I&rsquo;ve liked being able to
reorder things in an explicit list.)</p>
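<p>Given that layout, the core of the build could stay a loop over <code>toc</code> &mdash; a sketch, assuming the <code>markdown(1)</code> binary that the Discount package ships:</p>
<pre><code>#!/usr/bin/env bash
# render each chapter listed in book/toc and concatenate the
# results into a single HTML file
&gt; book.html
while read -r chapter; do
  markdown "$chapter" &gt;&gt; book.html
done &lt; book/toc
</code></pre>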
<p>Before long I might well add handling for some</p>
<p>I should add a feature to Display.pm for outputting all of its content
statically.</p>
</article>
<article>
<h1><a name=Monday-January-12 href=#Monday-January-12>#</a> Monday, January 12</h1>
<h2><a name=Monday-January-12-Debian-packaging href=#Monday-January-12-Debian-packaging>#</a> Debian packaging</h2>
<p>A lot of time today with
the <a href="https://www.debian.org/doc/manuals/maint-guide/">Debian New Maintainer&rsquo;s Guide</a>
and google for a project that needs some simple packages.</p>
<p>This is one of those things where the simple cases are simple and then it&rsquo;s
easy to get lost in a thicket of overlapping mechanisms and terminology.</p>
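<p>For my own future reference, the simple case seems to boil down to roughly this, assuming a source directory already named <code>package-version</code> (a sketch, not a substitute for the guide):</p>
<pre><code>$ cd mypackage-0.1
$ dh_make --createorig   # generate debian/ boilerplate and an orig tarball
$ debuild -us -uc        # build an unsigned package
</code></pre>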
<p>Thought for providers of technical HOWTOs:</p>
<p>If you&rsquo;re describing the cumulative assembly of a file structure, provide a
copy (repository, tarball, whatever) of that file structure.</p>
<p>(I should probably take this notion to heart.)</p>
<p>Things to remember:</p>
<ul>
<li><a href="http://man.he.net/man1/fakeroot">http://man.he.net/man1/fakeroot</a></li>
</ul>
<h2><a name=Monday-January-12-MS-DOS-AGT href=#Monday-January-12-MS-DOS-AGT>#</a> MS-DOS / AGT</h2>
<p>So I was scrolling through archive.org&rsquo;s newly-shiny MS-DOS archive (with the
crazy in-browser DOSBOX emulation), trying to think of what to look for.</p>
<p>I found some old friends:</p>
<ul>
<li><a href="https://archive.org/details/msdos__1CRYSTL_shareware"><em>Crystal Caves</em></a></li>
<li><a href="https://archive.org/details/msdos_Commander_Keen_1_-_Marooned_on_Mars_1990"><em>Commander Keen</em></a></li>
<li><a href="https://archive.org/details/msdos_Heretic_-_Shadow_of_the_Serpent_Riders_1996"><em>Heretic</em></a> &mdash; still a pretty solid game and maybe my favorite iteration of the Doom Engine</li>
<li><a href="https://archive.org/details/msdos_Rise_of_the_Triad_-_The_Hunt_Begins_Deluxe_Edition_1995"><em>Rise of the Triads</em></a> &mdash; there is absolutely <em>no way</em> that ROTT actually
looked as bad as this emulation at the time on baseline hardware, but we&rsquo;ll let
that slide &mdash; the graphics may have been better than they show here, but it
was the Duke Nukem property of its moment, which is to say ultimately a
regressive and not-very-consequential signpost on the way to later
developments</li>
</ul>
<p>And then I got to thinking about the Adventure Game Toolkit, which was this
sort of declarative, not-really-programmable interpreter for simple adventure
games. The way I remember it, you wrote static descriptions of rooms, objects,
and characters. It was a limited system, and the command interpreter was
pretty terrible, but it was also a lot more approachable than things like TADS
for people who didn&rsquo;t really know how to program anyway. (Like me at the time.)</p>
<p>I&rsquo;d like to get AGT running on squiggle.city, just because. It turns out
there&rsquo;s a <a href="http://www.ifarchive.org/indexes/if-archiveXprogrammingXagtXagility.html">portable interpreter called AGiliTY</a>, although maybe not
one that&rsquo;s well packaged. I&rsquo;ll probably explore this more.</p>
</article>
<article>
<h1><a name=Tuesday-January-13 href=#Tuesday-January-13>#</a> Tuesday, January 13</h1>
<h2><a name=Tuesday-January-13-rtd-bus-schedules-transit-data href=#Tuesday-January-13-rtd-bus-schedules-transit-data>#</a> rtd / bus schedules / transit data</h2>
<p>I&rsquo;m taking the bus today, so I got to thinking about bus schedules. I use
Google Calendar a little bit (out of habit and convenience more than any
particular love), and I was thinking &ldquo;why doesn&rsquo;t my calendar just know the
times of transit routes I use?&rdquo;</p>
<p>I thought maybe there&rsquo;d be, say, iCal (CalDAV? What is actually the thing?)
data somewhere for a given RTD schedule, or failing that, maybe JSON or TSV or
something. A cursory search doesn&rsquo;t turn up much, but I did find these:</p>
<ul>
<li><a href="http://www.rtd-denver.com/Developer.shtml">http://www.rtd-denver.com/Developer.shtml</a></li>
<li><a href="https://developers.google.com/transit/gtfs/reference?csw=1">https://developers.google.com/transit/gtfs/reference?csw=1</a></li>
<li><a href="http://www.rtd-denver.com/GoogleFeeder/">http://www.rtd-denver.com/GoogleFeeder/</a></li>
<li><a href="http://www.rtd-denver.com/GoogleFeeder/google_transit_Jan15_Runboard.zip">http://www.rtd-denver.com/GoogleFeeder/google_transit_Jan15_Runboard.zip</a></li>
</ul>
<p>I grabbed that last one.</p>
<pre><code>brennen@desiderata 16:16:43 /home/brennen ★ mkdir rtd &amp;&amp; mv google_transit_Jan15_Runboard.zip rtd
brennen@desiderata 16:16:51 /home/brennen ★ cd rtd
brennen@desiderata 16:16:53 /home/brennen/rtd ★ unzip google_transit_Jan15_Runboard.zip
Archive: google_transit_Jan15_Runboard.zip
inflating: calendar.txt
inflating: calendar_dates.txt
inflating: agency.txt
inflating: shapes.txt
inflating: stop_times.txt
inflating: trips.txt
inflating: stops.txt
inflating: routes.txt
</code></pre>
<p>Ok, so this is pretty minimalist CSV stuff from the look of most of it.</p>
<pre><code>brennen@desiderata 16:22:12 /home/brennen/rtd ★ grep Lyons stops.txt
20921,Lyons PnR,Vehicles Travelling East, 40.223979,-105.270174,,,0
</code></pre>
<p>So it looks like stops have an individual id?</p>
<pre><code>brennen@desiderata 16:24:41 /home/brennen/rtd ★ grep '20921' ./*.txt | wc -l
87
</code></pre>
<p>A lot of this is noise, but:</p>
<pre><code>brennen@desiderata 16:26:23 /home/brennen/rtd ★ grep 20921 ./stop_times.txt
8711507,12:52:00,12:52:00,20921,43,,1,0,
8711508,11:32:00,11:32:00,20921,43,,1,0,
8711509,07:55:00,07:55:00,20921,43,,1,0,
8711512,16:41:00,16:41:00,20921,43,,1,0,
8711519,05:37:00,05:37:00,20921,3,,0,1,
8711517,16:47:00,16:47:00,20921,1,,0,1,
8711511,17:58:00,17:58:00,20921,43,,1,0,
8711514,13:02:00,13:02:00,20921,1,,0,1,
8711516,07:59:00,07:59:00,20921,1,,0,1,
8711515,11:42:00,11:42:00,20921,1,,0,1,
8711510,19:10:00,19:10:00,20921,43,,1,0,
8711513,18:05:00,18:05:00,20921,1,,0,1,
8711518,06:47:00,06:47:00,20921,1,,0,1,
brennen@desiderata 16:26:57 /home/brennen/rtd ★ head -1 stop_times.txt
trip_id,arrival_time,departure_time,stop_id,stop_sequence,stop_headsign,pickup_type,drop_off_type,shape_dist_traveled
</code></pre>
<p>So:</p>
<pre><code>brennen@desiderata 16:41:47 /home/brennen/code/rtd-tools (master) ★ grep ',20921,' ./stop_times.txt | cut -d, -f1,3 | sort -n
8711507,12:52:00
8711508,11:32:00
8711509,07:55:00
8711510,19:10:00
8711511,17:58:00
8711512,16:41:00
8711513,18:05:00
8711514,13:02:00
8711515,11:42:00
8711516,07:59:00
8711517,16:47:00
8711518,06:47:00
8711519,05:37:00
</code></pre>
<p>That first number is a <code>trip_id</code>, the second one departure time. Trips
are provided in <code>trips.txt</code>:</p>
<pre><code>brennen@desiderata 16:54:56 /home/brennen/code/rtd-tools (master) ★ head -2 trips.txt
route_id,service_id,trip_id,trip_headsign,direction_id,block_id,shape_id
0,SA,8690507,Union Station,0, 0 2,793219
</code></pre>
<p>I don&rsquo;t usually use <code>join</code> very much, but this seems like a logical place for
it. It turns out that <code>join</code> wants its input sorted on the join field, so I do
this:</p>
<pre><code>brennen@desiderata 16:54:38 /home/brennen/code/rtd-tools (master) ★ sort -t, -k1 stop_times.txt &gt; stop_times.sorted.txt
brennen@desiderata 16:54:38 /home/brennen/code/rtd-tools (master) ★ sort -t, -k3 trips.txt &gt; trips.sorted.txt
</code></pre>
<p>And then:</p>
<pre><code>brennen@desiderata 16:51:07 /home/brennen/code/rtd-tools (master) ★ join -t, -1 1 -2 3 ./stop_times.sorted.txt ./trips.sorted.txt | grep 20921
,Y,WK,Lyons PnR,0, Y 16,79481043,,1,0,
,Y,WK,Lyons PnR,0, Y 16,79481043,,1,0,
,Y,WK,Lyons PnR,0, Y 15,79481043,,1,0,
,Y,WK,Lyons PnR,0, Y 41,79480943,,1,0,
,Y,WK,Lyons PnR,0, Y 41,79481043,,1,0,
,Y,WK,Lyons PnR,0, Y 41,79481043,,1,0,
,Y,WK,Boulder Transit Center,1, Y 41,794814
,Y,WK,Boulder Transit Center,1, Y 16,794812
,Y,WK,Boulder Transit Center,1, Y 16,794814
,Y,WK,Boulder Transit Center,1, Y 15,794812
,Y,WK,Boulder Transit Center,1, Y 41,794813
,Y,WK,Boulder Transit Center,1, Y 15,794813
,Y,WK,Boulder Transit Center,1, 206 1,794816
</code></pre>
<p>Ok, waitasec. What the fuck is going on here? The string <code>20921</code> appears
nowhere in these lines. It takes me too long to figure out that the
text files have CRLF line-endings and this is messing with something in
the chain (probably just output from <code>grep</code>, since it&rsquo;s obviously
finding the string). So:</p>
<pre><code>brennen@desiderata 16:59:35 /home/brennen/code/rtd-tools (master) ★ dos2unix *.sorted.txt
dos2unix: converting file stop_times.sorted.txt to Unix format ...
dos2unix: converting file trips.sorted.txt to Unix format ...
</code></pre>
<p>Why does <code>dos2unix</code> operate in-place on files instead of printing to STDOUT?
It beats me, but I sure am glad I didn&rsquo;t run it on anything especially
breakable. It <em>does</em> do what you&rsquo;d expect when piped to, anyway, which is
probably what I should have done.</p>
<p>So this seems to work:</p>
<pre><code>brennen@desiderata 17:04:45 /home/brennen/code/rtd-tools (master) ★ join -t, -1 1 -2 3 ./stop_times.sorted.txt ./trips.sorted.txt | grep 20921
8711507,12:52:00,12:52:00,20921,43,,1,0,,Y,WK,Lyons PnR,0, Y 16,794810
8711508,11:32:00,11:32:00,20921,43,,1,0,,Y,WK,Lyons PnR,0, Y 16,794810
8711509,07:55:00,07:55:00,20921,43,,1,0,,Y,WK,Lyons PnR,0, Y 15,794810
8711510,19:10:00,19:10:00,20921,43,,1,0,,Y,WK,Lyons PnR,0, Y 41,794809
8711511,17:58:00,17:58:00,20921,43,,1,0,,Y,WK,Lyons PnR,0, Y 41,794810
8711512,16:41:00,16:41:00,20921,43,,1,0,,Y,WK,Lyons PnR,0, Y 41,794810
8711513,18:05:00,18:05:00,20921,1,,0,1,,Y,WK,Boulder Transit Center,1, Y 41,794814
8711514,13:02:00,13:02:00,20921,1,,0,1,,Y,WK,Boulder Transit Center,1, Y 16,794812
8711515,11:42:00,11:42:00,20921,1,,0,1,,Y,WK,Boulder Transit Center,1, Y 16,794814
8711516,07:59:00,07:59:00,20921,1,,0,1,,Y,WK,Boulder Transit Center,1, Y 15,794812
8711517,16:47:00,16:47:00,20921,1,,0,1,,Y,WK,Boulder Transit Center,1, Y 41,794813
8711518,06:47:00,06:47:00,20921,1,,0,1,,Y,WK,Boulder Transit Center,1, Y 15,794813
8711519,05:37:00,05:37:00,20921,3,,0,1,,Y,WK,Boulder Transit Center,1, 206 1,794816
</code></pre>
<p>Which seems kind of right for the <a href="http://www3.rtd-denver.com/schedules/getSchedule.action?runboardId=151&amp;routeId=Y&amp;routeType=12&amp;direction=S-Bound&amp;serviceType=3">South</a> &amp;
<a href="http://www3.rtd-denver.com/schedules/getSchedule.action?runboardId=151&amp;routeId=Y&amp;routeType=12&amp;direction=N-Bound&amp;serviceType=3">Northbound</a> schedules, but they&rsquo;re weirdly intermingled. I think
this pulls departure time and a <code>direction_id</code> field:</p>
<pre><code>brennen@desiderata 17:15:12 /home/brennen/code/rtd-tools (master) ★ join -t, -1 1 -2 3 ./stop_times.sorted.txt ./trips.sorted.txt | grep 20921 | cut -d, -f3,13 | sort -n
05:37:00,1
06:47:00,1
07:55:00,0
07:59:00,1
11:32:00,0
11:42:00,1
12:52:00,0
13:02:00,1
16:41:00,0
16:47:00,1
17:58:00,0
18:05:00,1
19:10:00,0
</code></pre>
<p>So southbound, I guess:</p>
<pre><code>brennen@desiderata 17:15:59 /home/brennen/code/rtd-tools (master) ★ join -t, -1 1 -2 3 ./stop_times.sorted.txt ./trips.sorted.txt | grep 20921 | cut -d, -f3,13 | grep ',1' | sort -n
05:37:00,1
06:47:00,1
07:59:00,1
11:42:00,1
13:02:00,1
16:47:00,1
18:05:00,1
</code></pre>
<p>This should probably be where I think oh, right, this is a Google spec - maybe
there&rsquo;s <a href="https://github.com/google/transitfeed">already some tooling</a>. Failing
that, slurping them into SQLite or something would be a lot less painful. Or
at least using csvkit.</p>
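<p>If I go the SQLite route, it&rsquo;d look something like this &mdash; assuming a reasonably recent <code>sqlite3</code> CLI that takes column names from the header row when <code>.import</code>ing into a new table:</p>
<pre><code>$ sqlite3 rtd.db
sqlite&gt; .mode csv
sqlite&gt; .import stop_times.txt stop_times
sqlite&gt; .import trips.txt trips
sqlite&gt; SELECT st.departure_time, t.trip_headsign
   ...&gt; FROM stop_times st JOIN trips t ON t.trip_id = st.trip_id
   ...&gt; WHERE st.stop_id = '20921'
   ...&gt; ORDER BY st.departure_time;
</code></pre>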
</article>
<article>
<h1><a name=Wednesday-January-14-2015 href=#Wednesday-January-14-2015>#</a> Wednesday, January 14, 2015</h1>
<p>On making a web page that reminds me of a quality I never fully appreciated in
HyperCard.</p>
<p>So I generally am totally ok with scrolling on web pages. I think in
fact it&rsquo;s a major advantage of the form.</p>
<p>Then again, I just got to indulging a few minutes of thinking about
HyperCard, and I think that this time rather than read the same old
articles about its ultimate doom over and over again, maybe I should do
something by way of recreating part of it that was different from the
web in general.</p>
<p>The web has plenty of stupid carousels and stuff, but despite their example I&rsquo;m
curious whether HyperCard&rsquo;s stack model could still hold up as an idea. I was
never sure whether it was the important thing or not. It was so obviously and
almost clumsily a <em>metaphor</em>. (A skeuomorphism which I have never actually
seen anyone bag on when they are playing that game, perhaps because Designer
Ideologues know there&rsquo;s not much percentage in talking shit about HyperCard.)</p>
<p>Here is some JavaScript to start:</p>
<pre><code>$('article').each(function (i, a) {
  $(a).hide();
});
$('article').first().show();
</code></pre>
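<p>Just to sketch the direction I mean (not the linked solution itself, and the
<code>#next</code> / <code>#previous</code> controls here are hypothetical): treat each
<code>article</code> as a card, with wraparound navigation:</p>
<pre><code>var $cards = $('article');
var current = 0;

// show card i, wrapping around at either end, HyperCard style
var showCard = function (i) {
  current = (i + $cards.length) % $cards.length;
  $cards.hide();
  $cards.eq(current).show();
};

showCard(0);

$('#next').click(function (e) { e.preventDefault(); showCard(current + 1); });
$('#previous').click(function (e) { e.preventDefault(); showCard(current - 1); });
</code></pre>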
<p>I&rsquo;ll spare you the usual slow-composition narrative of where I go from here,
and jump straight to my eventual <a href="https://github.com/brennen/tildebrennen/commit/560826a9884dae47998843dcf4917266b3344fec">first-pass solution</a>.</p>
<p>(Ok, actually I just repurposed a terrible thing I did for some slides a while
back, after recreating about 75% of it without remembering that I had already
written the same code within the last couple of months.  It&rsquo;s amazing how often that
happens, or I guess it would be amazing if my short term memory weren&rsquo;t so
thoroughly scrambled from all the evil living I do.)</p>
</article>
<article>
<h1><a name=Friday-January-16 href=#Friday-January-16>#</a> Friday, January 16</h1>
<p><a href="http://www.raspberrypi.org/documentation/configuration/wireless/wireless-cli.md">Wireless configuration under Raspbian</a>.</p>
</article>
<article>
<h1><a name=Tuesday-January-20 href=#Tuesday-January-20>#</a> Tuesday, January 20</h1>
<p>I wanted to figure out where I used a library in existing code.</p>
<p>This is what I wound up doing in zsh:</p>
<pre><code>brennen@exuberance 11:48:07 /home/brennen/code $ for foo in `ls -f`; do; if [[ -d $foo/.git ]]; then cd $foo; echo '--' $foo '--'; git grep 'IPC::System::Simple'; cd ~/code; fi; done
-- thcipriani-dotfiles --
-- sfe-sysadmin --
-- pi_bootstrap --
-- bpb-kit --
-- batchpcb --
-- according-to-pete --
-- meatbags --
-- sfe-paleo --
-- instruct --
-- sfe-openstack --
-- YouTube_Captions --
-- batchpcb_rails --
-- userland-book --
slides/render.pl:use IPC::System::Simple qw(capturex);
-- sfe-custom-queries --
-- brennen-sparklib-fork --
-- tilde.club --
-- display --
-- sfe-chef --
-- xrectsel --
-- git-feed --
git-feed:use IPC::System::Simple qw(capturex);
sample_feed.xml: use IPC::System::Simple qw(capturex);
sample_feed.xml:+use IPC::System::Simple qw(capturex);
-- reddit --
-- rtd-tools --
-- sparkfun --
-- mru --
</code></pre>
<p>Lame-ish, but I&rsquo;m perpetually forgetting shell loop and conditional syntax, so
it seems worth making a note of.</p>
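<p>A slightly more readable version of the same idea, for the next time I forget
(untested as written; <code>git -C</code> needs a reasonably recent git):</p>
<pre><code>for repo in */.git; do
  repo="${repo%/.git}"
  echo "-- $repo --"
  git -C "$repo" grep 'IPC::System::Simple'
done
</code></pre>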
</article>
<article>
<h1><a name=Thursday-January-22 href=#Thursday-January-22>#</a> Thursday, January 22</h1>
<h2><a name=Thursday-January-22-deleting-files-from-git-history href=#Thursday-January-22-deleting-files-from-git-history>#</a> deleting files from git history</h2>
<p>Working on a project where we included some built files that took up a bunch of
space, and decided we should get rid of those. The git repository isn&rsquo;t public
yet and is only shared by a handful of users, so it seemed worth thinking about
rewriting the history a bit.</p>
<p>There&rsquo;s reasonably good documentation for this in the usual places if you look,
but I ran into some trouble.</p>
<p>First, what seemed to work: David Underhill has a <a href="http://dound.com/2009/04/git-forever-remove-files-or-folders-from-history/">good short script</a> from
back in 2009 for using <code>git filter-branch</code> to eliminate particular files from
history:</p>
<blockquote><p>I recently had a need to rewrite a git repository’s history. This isn’t
generally a very good idea, though it is useful if your repository contains
files it should not (such as unneeded large binary files or copyrighted
material). I also am using it because I had a branch where I only wanted to
merge a subset of files back into master (though there are probably better
ways of doing this). Anyway, it is not very hard to rewrite history thanks to
the excellent git-filter-branch tool which comes with git.</p></blockquote>
<p>I&rsquo;ll reproduce the script here, in the not-unlikely event that his writeup goes
away:</p>
<pre><code>#!/bin/bash
set -o errexit
# Author: David Underhill
# Script to permanently delete files/folders from your git repository. To use
# it, cd to your repository's root and then run the script with a list of paths
# you want to delete, e.g., git-delete-history path1 path2
if [ $# -eq 0 ]; then
    exit 0
fi
# make sure we're at the root of git repo
if [ ! -d .git ]; then
    echo "Error: must run this script from the root of a git repository"
    exit 1
fi
# remove all paths passed as arguments from the history of the repo
files=$@
git filter-branch --index-filter "git rm -rf --cached --ignore-unmatch $files" HEAD
# remove the temporary history git-filter-branch otherwise leaves behind for a long time
rm -rf .git/refs/original/ &amp;&amp; git reflog expire --all &amp;&amp; git gc --aggressive --prune
</code></pre>
<p>A big thank you to Mr. Underhill for documenting this one. <code>filter-branch</code>
seems really powerful, and not as brain-hurting as some things in git land.
The <a href="http://git-scm.com/docs/git-filter-branch">docs</a> are currently pretty good, and worth a read if you&rsquo;re trying to
solve this problem.</p>
<blockquote><p>Lets you rewrite Git revision history by rewriting the branches mentioned in
the &lt;rev-list options&gt;, applying custom filters on each revision. Those
filters can modify each tree (e.g. removing a file or running a perl rewrite
on all files) or information about each commit. Otherwise, all information
(including original commit times or merge information) will be preserved.</p></blockquote>
<p>After this, things got muddier. The script seemed to work fine, and after
running it I was able to see all the history I expected, minus some troublesome
files. (A version with <code>--prune-empty</code> added to the <code>git filter-branch</code>
invocation got rid of some empty commits.) But then:</p>
<pre><code>brennen@exuberance 20:05:00 /home/brennen/code $ du -hs pi_bootstrap
218M pi_bootstrap
brennen@exuberance 20:05:33 /home/brennen/code $ du -hs experiment
199M experiment
</code></pre>
<p>That second repo is a clone of the original with the script run against it.
Why is it only tens of megabytes smaller, when minus the big binaries I zapped,
it should come in somewhere under 10 megs?</p>
<p>I will spare you, dear reader, the contortions I went through arriving at a
solution for this, partially because I don&rsquo;t have the energy left to
reconstruct them from the tattered history of my googling over the last few
hours. What I figured out was that for some reason, a bunch of blobs were
persisting in a pack file, despite not being referenced by any commits, and no
matter what I couldn&rsquo;t get <code>git gc</code> or <code>git repack</code> to zap them.</p>
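<p>(The standard-advice incantations are roughly the following, for reference; in
this case nothing along these lines helped:)</p>
<pre><code>git reflog expire --expire=now --all
git gc --prune=now --aggressive
</code></pre>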
<p>I more or less got this far with commands like:</p>
<pre><code>brennen@exuberance 20:49:10 /home/brennen/code/experiment2/.git (master) $ git count-objects -v
count: 0
size: 0
in-pack: 2886
packs: 1
size-pack: 202102
prune-packable: 0
garbage: 0
size-garbage: 0
</code></pre>
<p>And:</p>
<pre><code>git verify-pack -v ./objects/pack/pack-b79fc6e30a547433df5c6a0c6212672c5e5aec5f &gt; ~/what_the_fuck
</code></pre>
<p>&hellip;which gives a list of all the stuff in a pack file, including
super-not-human-readable sizes that you can sort on, and many permutations of
things like:</p>
<pre><code>brennen@exuberance 20:49:12 /home/brennen/code/experiment2/.git (master) $ git log --pretty=oneline | cut -f1 -d' ' | xargs -L1 git cat-file -s | sort -nr | head
589
364
363
348
341
331
325
325
322
320
</code></pre>
<p>&hellip;where <code>cat-file</code> is a bit of a Swiss army knife for looking at objects, with
<code>-s</code> meaning &ldquo;tell me a size&rdquo;.</p>
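<p>(The <code>verify-pack</code> output can also just be sorted on its size column to
surface the biggest blobs; something like:)</p>
<pre><code>git verify-pack -v ./objects/pack/pack-*.idx | grep blob | sort -k3 -n -r | head
</code></pre>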
<p>(An aside: If you are writing software that outputs a size in bytes, blocks,
etc., and you do not provide a &ldquo;human readable&rdquo; option to display this in
comprehensible units, the innumerate among us quietly hate your guts. This is
perhaps unjust of us, but I&rsquo;m just trying to communicate my experience here.)</p>
<p>And finally, <a href="https://stackoverflow.com/questions/223678/which-commit-has-this-blob/223890#223890">Aristotle Pagaltzis&rsquo;s script</a> for figuring out which commit
has a given blob (the answer is <em>fucking none of them</em>, in my case):</p>
<pre><code>#!/bin/sh
obj_name="$1"
shift
git log "$@" --pretty=format:'%T %h %s' \
| while read tree commit subject ; do
if git ls-tree -r $tree | grep -q "$obj_name" ; then
echo $commit "$subject"
fi
done
</code></pre>
<p>Also somewhere in there I learned how to use <a href="http://git-scm.com/docs/git-bisect"><code>git bisect</code></a> (which is
really cool and likely something I will use again) and went through and made
entirely certain there was nothing in the history with a bunch of big files
in it.</p>
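<p>(Noting the <code>bisect</code> workflow here for my own future reference; the commit
name below is a placeholder:)</p>
<pre><code>git bisect start
git bisect bad                    # the checked-out HEAD has the problem
git bisect good &lt;some-known-good-commit&gt;
# ...build/test whatever git checks out, then mark it:
git bisect good                   # or: git bisect bad
# ...repeat until git names the first bad commit, then clean up:
git bisect reset
</code></pre>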
<p>So eventually I got to thinking ok, there&rsquo;s something here that is keeping
these objects from getting expired or pruned or garbage collected or whatever,
so how about doing a clone that just copies the stuff in the commits that still
exist at this point. Which brings us to:</p>
<pre><code>brennen@exuberance 19:03:08 /home/brennen/code/experiment2 (master) $ git help clone
brennen@exuberance 19:06:52 /home/brennen/code/experiment2 (master) $ cd ..
brennen@exuberance 19:06:55 /home/brennen/code $ git clone --no-local ./experiment2 ./experiment2_no_local
Cloning into './experiment2_no_local'...
remote: Counting objects: 2874, done.
remote: Compressing objects: 100% (1611/1611), done.
remote: Total 2874 (delta 938), reused 2869 (delta 936)
Receiving objects: 100% (2874/2874), 131.21 MiB | 37.48 MiB/s, done.
Resolving deltas: 100% (938/938), done.
Checking connectivity... done.
brennen@exuberance 19:07:15 /home/brennen/code $ du -hs ./experiment2_no_local
133M ./experiment2_no_local
brennen@exuberance 19:07:20 /home/brennen/code $ git help clone
brennen@exuberance 19:08:34 /home/brennen/code $ git clone --no-local --single-branch ./experiment2 ./experiment2_no_local_single_branch
Cloning into './experiment2_no_local_single_branch'...
remote: Counting objects: 1555, done.
remote: Compressing objects: 100% (936/936), done.
remote: Total 1555 (delta 511), reused 1377 (delta 400)
Receiving objects: 100% (1555/1555), 1.63 MiB | 0 bytes/s, done.
Resolving deltas: 100% (511/511), done.
Checking connectivity... done.
brennen@exuberance 19:08:47 /home/brennen/code $ du -hs ./experiment2_no_local_single_branch
3.0M ./experiment2_no_local_single_branch
</code></pre>
<p>What&rsquo;s going on here? <a href="http://git-scm.com/docs/git-clone">Well</a>, <code>git clone --no-local</code>:</p>
<pre><code>--local
-l
When the repository to clone from is on a local machine, this flag
bypasses the normal "Git aware" transport mechanism and clones the
repository by making a copy of HEAD and everything under objects and
refs directories. The files under .git/objects/ directory are
hardlinked to save space when possible.
If the repository is specified as a local path (e.g., /path/to/repo),
this is the default, and --local is essentially a no-op. If the
repository is specified as a URL, then this flag is ignored (and we
never use the local optimizations). Specifying --no-local will override
the default when /path/to/repo is given, using the regular Git
transport instead.
</code></pre>
<p>And <code>--single-branch</code>:</p>
<pre><code>--[no-]single-branch
Clone only the history leading to the tip of a single branch, either
specified by the --branch option or the primary branch remote’s HEAD
points at. When creating a shallow clone with the --depth option, this
is the default, unless --no-single-branch is given to fetch the
histories near the tips of all branches. Further fetches into the
resulting repository will only update the remote-tracking branch for
the branch this option was used for the initial cloning. If the HEAD at
the remote did not point at any branch when --single-branch clone was
made, no remote-tracking branch is created.
</code></pre>
<p>I have no idea why <code>--no-local</code> by itself reduced the size but didn&rsquo;t really do
the job.</p>
<p>It&rsquo;s possible the lingering blobs would have been garbage collected
<em>eventually</em>, and at any rate it seems likely that in pushing them to a remote
repository I would have bypassed whatever lazy local file copy operation was
causing everything to persist on cloning, thus rendering all this
head-scratching entirely pointless, but then who knows. At least I understand
git file structure a little better than I did before.</p>
<p>For good measure, I just remembered how old much of the software on this
machine is, and I feel like kind of an ass:</p>
<pre><code>brennen@exuberance 21:20:50 /home/brennen/code $ git --version
git version 1.9.1
</code></pre>
<p>This is totally an old release. If there&rsquo;s a bug here, maybe it&rsquo;s fixed by
now. I will not venture a strong opinion as to whether there is a bug. Maybe
this is entirely expected behavior. It is time to drink a beer.</p>
<h2><a name=Thursday-January-22-postscript-on-finding-bugs href=#Thursday-January-22-postscript-on-finding-bugs>#</a> postscript: on finding bugs</h2>
<p>The first thing you learn, by way of considerable personal frustration and
embarrassment, goes something like this:</p>
<blockquote><p>Q: My stuff isn&rsquo;t working. I think there is probably a bug in this mature
and widely-used (programming language | library | utility software).</p>
<p>A: Shut up shut up shut up <em>shut up</em> there is not a bug. Now go and figure
out what is wrong with your code.</p></blockquote>
<p>The second thing goes something like this:</p>
<blockquote><p>Oh. I guess that&rsquo;s actually a bug.</p></blockquote>
<p>Which is to say: I have learned that I&rsquo;m probably wrong, but sometimes I&rsquo;m
also wrong about being wrong.</p>
</article>
<article>
<h1><a name=Sunday-January-25-2015 href=#Sunday-January-25-2015>#</a> Sunday, January 25, 2015</h1>
<h2><a name=Sunday-January-25-2015-background-colors-for-tmux href=#Sunday-January-25-2015-background-colors-for-tmux>#</a> background colors for tmux</h2>
<p>I&rsquo;m logged into too many machines. I make an effort to have prompt colors differ
between hosts, but tmux is the same everywhere.</p>
<p>You can do this sort of thing:</p>
<pre><code>brennen@exuberance 11:54:43 /home/brennen/code $ cat ~/.tmux.conf
# Set window notifications
setw -g monitor-activity on
set -g visual-activity on
set -g status-bg blue
set -g status-fg white
</code></pre>
<p>&hellip;where <code>status-bg</code> and <code>status-fg</code> are colors for the status bar.</p>
<p>It seems like there may be ways to conditionalize this, but at this point I&rsquo;m
tempted to just pull some simple templating system into my <a href="https://github.com/brennen/bpb-kit">dotfile
stuff</a> and generate a subset of config files on a per-host basis.</p>
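<p>One likely middle ground is tmux&rsquo;s <code>if-shell</code>, which runs a command only when
a shell test succeeds.  Something like this in <code>~/.tmux.conf</code> (the hostname is
just an example, and I haven&rsquo;t tested this exact line):</p>
<pre><code>set -g status-bg blue
set -g status-fg white
if-shell '[ "$(hostname -s)" = "squiggle" ]' 'set -g status-bg red'
</code></pre>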
</article>
<article>
<h1><a name=Tuesday-January-27 href=#Tuesday-January-27>#</a> Tuesday, January 27</h1>
<h2><a name=Tuesday-January-27-what-version-of-what-linux-distribution-is-this href=#Tuesday-January-27-what-version-of-what-linux-distribution-is-this>#</a> what version of what linux distribution is this?</h2>
<p>Some luck <em>may</em> be had with one or more of:</p>
<pre><code>root@beaglebone:~# uname -a
Linux beaglebone 3.8.13-bone47 #1 SMP Fri Apr 11 01:36:09 UTC 2014 armv7l GNU/Linux
root@beaglebone:~# lsb_release -a
No LSB modules are available.
Distributor ID: Debian
Description: Debian GNU/Linux 7.8 (wheezy)
Release: 7.8
Codename: wheezy
root@beaglebone:~# cat /etc/debian_version
7.8
root@beaglebone:~# cat /etc/dogtag
BeagleBoard.org BeagleBone Debian Image 2014-04-23
root@beaglebone:~# cat /etc/os-release
PRETTY_NAME="Debian GNU/Linux 7 (wheezy)"
NAME="Debian GNU/Linux"
VERSION_ID="7"
VERSION="7 (wheezy)"
ID=debian
ANSI_COLOR="1;31"
HOME_URL="http://www.debian.org/"
SUPPORT_URL="http://www.debian.org/support/"
BUG_REPORT_URL="http://bugs.debian.org/"
</code></pre>
<h2><a name=Tuesday-January-27-armhf href=#Tuesday-January-27-armhf>#</a> armhf</h2>
<p><a href="https://blogs.oracle.com/jtc/entry/is_it_armhf_or_armel">Is it armhf or armel?</a>:</p>
<blockquote><p>During diagnosis, the question becomes, how can I determine whether my Linux
distribution is based on armel or armhf? Turns out this is not as
straightforward as one might think. Aside from experience and anecdotal
evidence, one possible way to ascertain whether you&rsquo;re running on armel or
armhf is to run the following obscure command:</p>
<pre><code>$ readelf -A /proc/self/exe | grep Tag_ABI_VFP_args
</code></pre>
<p>If the Tag_ABI_VFP_args tag is found, then you&rsquo;re running on an armhf system.
If nothing is returned, then it&rsquo;s armel. To show you an example, here&rsquo;s what
happens on a Raspberry Pi running the Raspbian distribution:</p>
<pre><code>pi@raspberrypi:~$ readelf -A /proc/self/exe | grep Tag_ABI_VFP_args
Tag_ABI_VFP_args: VFP registers
</code></pre>
<p>This indicates an armhf distro, which in fact is what Raspbian is. On the
original, soft-float Debian Wheezy distribution, here&rsquo;s what happens:</p>
<pre><code>pi@raspberrypi:~$ readelf -A /proc/self/exe | grep Tag_ABI_VFP_args
</code></pre>
<p>Nothing returned indicates that this is indeed armel.</p></blockquote>
<p>On a recent-ish Beaglebone Black:</p>
<pre><code>root@beaglebone:~# readelf -A /proc/self/exe | grep Tag_ABI_VFP_args
Tag_ABI_VFP_args: VFP registers
</code></pre>
</article>
<article>
<h1><a name=Wednesday-January-28 href=#Wednesday-January-28>#</a> Wednesday, January 28</h1>
<h2><a name=Wednesday-January-28-on-replicating-process href=#Wednesday-January-28-on-replicating-process>#</a> on replicating process</h2>
<p>Ok, so here we are. It&rsquo;s 2015. The gold standard for explaining how you
solved a technical problem to the internet at large is a blog post with things
you can copy and paste or maybe some pictures.</p>
<p>If you&rsquo;re really lucky, someone actually has a reusable public repository of
some kind. If you&rsquo;re <em>really</em> lucky, their code works, and if all the gods
are smiling on you at once, their code is <em>documented</em>.</p>
<p>It seems to me that we can do better than this. We possess a great many of the
right <em>tools</em> to do better than this, at least for a lot of common problems.
What does it take to make a given workflow both repeatable and legible to
people without the context we have for a given thing (including ourselves)?
Writing about it is surely desirable, but how do you approach a problem so
that, instead of being scattered across your short term memory and a dozen
volatile buffers, your work becomes a kind of document unto itself?</p>
<p>This is the (beautiful) root of what version control does, after all: It
renders a normally-invisible process legible, and in its newfound legibility,
at least a little susceptible to transmission and reuse.</p>
<p>What do I know works well for transmitting process and discovery, as far as it
goes?</p>
<ul>
<li>version control (so really git, which is severally horrible but also
brilliant and wins anyway)</li>
<li>Makefiles (except that I don&rsquo;t understand make at <em>all</em>)</li>
<li>shell scripts (except that shell programming is an utter nightmare)</li>
<li>Debian packages (which are more or less compounded of the above, and
moderately torturous to build)</li>
<li>IRC, if you keep logs, because it&rsquo;s amazing how much knowledge is most purely
conveyed in the medium of internet chat</li>
<li>Stackoverflow &amp; friends (I hate this, but there it is, it&rsquo;s a fact, we have to
deal with it no matter how much we hate process jockeys, just like Wikipedia)</li>
<li>screenshots and screencasts (a pain to make, all-too-often free of context, and
yet)</li>
</ul>
<p>Here are some things that I think are often terrible at this stuff despite
their ubiquity:</p>
<ul>
<li>mailing lists (so bad, so <em>routinely pathological</em>, so utterly necessary to
everything)</li>
<li>web forums like phpBB and stuff (so bad, so ubiquitous, so going to show up
in your google results with the hint you desperately needed, but only if you&rsquo;re
smart enough to parse it out of the spew)</li>
</ul>
<p>Here&rsquo;s one problem: There are a lot of tools that are relatively painless once
you know them, like &ldquo;let&rsquo;s just make this a dvcs repo because it&rsquo;s basically
free&rdquo;, the kind you get in the habit of using by default if you know they exist
and you really want to avoid future suffering.  But most people don&rsquo;t know these
tools exist, or that they&rsquo;re <em>generally applicable tools</em> and not just
specialist things you might use for the one important thing at your job because
somebody said you should.</p>
<h2><a name=Wednesday-January-28-what-makes-programming-hard href=#Wednesday-January-28-what-makes-programming-hard>#</a> what makes programming hard?</h2>
<ol>
<li>Most of the existing programs.</li>
<li>Most of the existing programming languages.</li>
<li>Other programmers.</li>
<li>Human thought is brutally constrained in understanding complex systems.</li>
<li>Ok you wrote some programs anyway now GOTO 0.</li>
</ol>
<h2><a name=Wednesday-January-28-debian-packaging-again href=#Wednesday-January-28-debian-packaging-again>#</a> debian packaging again</h2>
<p>I&rsquo;m starting <a href="https://www.debian.org/doc/manuals/maint-guide/">here</a> again.</p>
<ul>
<li><a href="https://wiki.debian.org/IntroDebianPackaging">https://wiki.debian.org/IntroDebianPackaging</a></li>
<li><a href="https://www.debian.org/doc/manuals/packaging-tutorial/packaging-tutorial.en.pdf">https://www.debian.org/doc/manuals/packaging-tutorial/packaging-tutorial.en.pdf</a>
(actually more helpful than anything else I&rsquo;ve found so far)</li>
</ul>
<h2><a name=Wednesday-January-28-vagrant href=#Wednesday-January-28-vagrant>#</a> vagrant</h2>
<p>Vagrant is a thing for quickly provisioning / tearing down / connecting to
virtual machines. It wraps VirtualBox, among other providers. I think the
basic appeal is that you get cheap, more-or-less disposable environments with a
couple of commands, and there&rsquo;s scaffolding for simple scripts to configure a
box when it&rsquo;s brought up, or share directories with the host filesystem. It&rsquo;s
really lightweight to try out.</p>
<p>Go to the <a href="http://www.vagrantup.com/downloads">downloads page</a> and install from
there. I used the 64 bit Ubuntu .deb.</p>
<pre><code>$ sudo apt-get install virtualbox
$ sudo dpkg -i vagrant_1.7.2_x86_64.deb
$ mkdir vagrant_test
$ cd vagrant_test
$ vagrant init hashicorp/precise32
$ vagrant up
$ vagrant ssh
</code></pre>
<p>This stuff takes a while on the first run through, but is generally really
slick. <code>hashicorp/precise32</code> is more or less just a preconfigured image pulled
from a central repository.</p>
<p>Their <a href="http://docs.vagrantup.com/v2/getting-started/index.html">Getting Started</a> is pretty
decent.</p>
<p>People around me have been enthusing about this kind of thing for ages, but I
haven&rsquo;t really gotten around to figuring out why I should care until recently.
I will probably be using this tool for a lot of development tasks.</p>
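<p>For my own notes, a minimal <code>Vagrantfile</code> covering the bits I care about looks
roughly like this (the paths and the provisioning script are hypothetical, and I
haven&rsquo;t run this exact file):</p>
<pre><code>Vagrant.configure(2) do |config|
  config.vm.box = "hashicorp/precise32"

  # run a setup script on first boot
  config.vm.provision "shell", path: "bootstrap.sh"

  # share the project directory with the guest
  config.vm.synced_folder ".", "/home/vagrant/project"

  # bump the VirtualBox CPU count (per the stackoverflow link below)
  config.vm.provider "virtualbox" do |vb|
    vb.customize ["modifyvm", :id, "--cpus", "2"]
  end
end
</code></pre>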
<p>Other notes:</p>
<ul>
<li>stackoverflow: <a href="https://stackoverflow.com/questions/17117063/how-can-i-create-a-vm-in-vagrant-with-virtualbox-with-two-cpus">How can I create a VM in vagrant with virtualbox with two cpus?</a></li>
<li><a href="https://tylercipriani.com/2014/05/25/lightweight-portable-vagrant-docker.html">Development Environments with Vagrant, Docker, and Supervisord</a></li>
</ul>
</article>
<article>
<h1><a name=Thursday-January-29 href=#Thursday-January-29>#</a> Thursday, January 29</h1>
<h2><a name=Thursday-January-29-raspberry-pi-kernels href=#Thursday-January-29-raspberry-pi-kernels>#</a> raspberry pi kernels</h2>
<ul>
<li><a href="http://elinux.org/Raspberry_Pi_Kernel_Compilation">http://elinux.org/Raspberry_Pi_Kernel_Compilation</a></li>
</ul>
</article>
<article>
<h1><a name=Monday-February-2 href=#Monday-February-2>#</a> Monday, February 2</h1>
<h2><a name=Monday-February-2-kernel-o-matic-amp-pi-finder href=#Monday-February-2-kernel-o-matic-amp-pi-finder>#</a> kernel-o-matic &amp; pi finder</h2>
<p>Published Friday:</p>
<ul>
<li><a href="https://github.com/adafruit/Adafruit-Pi-Kernel-o-Matic">https://github.com/adafruit/Adafruit-Pi-Kernel-o-Matic</a></li>
<li><a href="https://learn.adafruit.com/raspberry-pi-kernel-o-matic">https://learn.adafruit.com/raspberry-pi-kernel-o-matic</a></li>
</ul>
<p>Published a week or so before:</p>
<ul>
<li><a href="https://github.com/adafruit/Adafruit-Pi-Finder">https://github.com/adafruit/Adafruit-Pi-Finder</a></li>
<li><a href="https://blog.adafruit.com/2015/01/23/help-us-test-our-new-raspberry-pi-bootstrap-piday-raspberrypi/">https://blog.adafruit.com/2015/01/23/help-us-test-our-new-raspberry-pi-bootstrap-piday-raspberrypi/</a></li>
</ul>
<p>These have been a lot of my working time these last couple weeks, an
overlapping set of projects aimed at making the Pi (and eventually other
single-board computers) more usable and Adafruit&rsquo;s hardware and tutorials both
more accessible. This has been frustrating and rewarding by turns. I&rsquo;m trying
to reduce the complexity of a domain I just barely understand, in a lot of
ways, which may be a good summary of software development in general.</p>
<p><a href="https://www.vagrantup.com/">Vagrant</a> is something I should have paid attention
to sooner. The interfaces to virtualization are finally starting to overcome
my innate laziness on the whole question.</p>
<p>I just booted a Windows XP box for some reason. It made that noise. You know
the one.</p>
<h2><a name=Monday-February-2-raspberry-pi-2 href=#Monday-February-2-raspberry-pi-2>#</a> raspberry pi 2</h2>
<p>Announced today:</p>
<ul>
<li><a href="http://www.raspberrypi.org/products/raspberry-pi-2-model-b/">http://www.raspberrypi.org/products/raspberry-pi-2-model-b/</a></li>
<li><a href="https://blog.adafruit.com/2015/02/02/raspberry-pi-2-model-b-armv7-with-1g-ram-is-here-benchmarks-and-more-raspberry_pi-raspberryp/">https://blog.adafruit.com/2015/02/02/raspberry-pi-2-model-b-armv7-with-1g-ram-is-here-benchmarks-and-more-raspberry_pi-raspberryp/</a></li>
<li><a href="http://hackaday.com/2015/02/02/introducing-the-raspberry-pi-2/">http://hackaday.com/2015/02/02/introducing-the-raspberry-pi-2/</a></li>
</ul>
<p>Expect this to prove interesting. I&rsquo;ve been having a lot of conversations
about the relative merits of small computing systems, and while this can hardly
be said to address all the complaints you might have about the Pi, boosting
processor and RAM will do a lot for practical usability.</p>
<h2><a name=Monday-February-2-telling-composer-to-ignore-php-version-requirements href=#Monday-February-2-telling-composer-to-ignore-php-version-requirements>#</a> telling composer to ignore php version requirements</h2>
<p>Using <a href="https://getcomposer.org/">Composer</a> to set up a little project, I run
into the problem that the locally-installed PHP is a bit behind the times.</p>
<pre><code>brennen@exuberance 0:09:32 /home/brennen/code/project $ ./composer.phar install
Loading composer repositories with package information
Installing dependencies (including require-dev)
Your requirements could not be resolved to an installable set of packages.
Problem 1
- sparkfun/sparklib 1.1.9 requires php &gt;=5.5.17 -&gt; no matching package found.
- sparkfun/sparklib 1.1.8 requires php &gt;=5.5.17 -&gt; no matching package found.
- sparkfun/sparklib 1.1.7 requires php &gt;=5.5.17 -&gt; no matching package found.
- sparkfun/sparklib 1.1.6 requires php &gt;=5.5.17 -&gt; no matching package found.
- sparkfun/sparklib 1.1.5 requires php &gt;=5.5.17 -&gt; no matching package found.
- sparkfun/sparklib 1.1.4 requires php &gt;=5.5.17 -&gt; no matching package found.
- sparkfun/sparklib 1.1.3 requires php &gt;=5.5.17 -&gt; no matching package found.
- sparkfun/sparklib 1.1.2 requires php &gt;=5.5.17 -&gt; no matching package found.
- sparkfun/sparklib 1.1.11 requires php &gt;=5.5.17 -&gt; no matching package found.
- sparkfun/sparklib 1.1.10 requires php &gt;=5.5.17 -&gt; no matching package found.
- sparkfun/sparklib 1.1.1 requires php &gt;=5.5.17 -&gt; no matching package found.
- sparkfun/sparklib 1.1.0 requires php &gt;=5.5.17 -&gt; no matching package found.
- Installation request for sparkfun/sparklib ~1.1 -&gt; satisfiable by sparkfun/sparklib[1.1.0, 1.1.1, 1.1.10, 1.1.11, 1.1.2, 1.1.3, 1.1.4, 1.1.5, 1.1.6, 1.1.7, 1.1.8, 1.1.9].
Potential causes:
- A typo in the package name
- The package is not available in a stable-enough version according to your minimum-stability setting
see &lt;https://groups.google.com/d/topic/composer-dev/_g3ASeIFlrc/discussion&gt; for more details.
Read &lt;http://getcomposer.org/doc/articles/troubleshooting.md&gt; for further common problems.
</code></pre>
<p>Well, ok. I wrote a lot of that code, and I&rsquo;m pretty sure nothing I want out
of it will break under a slightly stale PHP. I check <code>./composer.phar help install</code>,
and sure enough, there&rsquo;s an option to ignore this requirement:</p>
<pre><code>brennen@exuberance 0:13:21 /home/brennen/code/project $ ./composer.phar install --ignore-platform-reqs
Loading composer repositories with package information
Installing dependencies (including require-dev)
- Installing sparkfun/sparklib (1.1.11)
Downloading: 100%
Writing lock file
Generating autoload files
</code></pre>
<p>I never used to quite get the &ldquo;install an executable utility script in the root
directory of your project&rdquo; thing, but the whole paradigm is growing on me a little
as my projects accumulate little Makefiles and shell scripts to render HTML,
publish revisions, or deploy packages.</p>
</article>
<article>
<h1><a name=Sunday-February-8 href=#Sunday-February-8>#</a> Sunday, February 8</h1>
<h2><a name=Sunday-February-8-systemd-amp-fsck href=#Sunday-February-8-systemd-amp-fsck>#</a> systemd &amp; fsck</h2>
<p>I just hit my first real frustration with systemd, which is running on the
Novena. The default storage here is a microSD card, and I&rsquo;ve had to
force-reboot this thing enough times that I&rsquo;d like to run <code>fsck</code> on the
root filesystem.</p>
<p>It used to be that you could call <code>shutdown -F</code> to force an fsck on boot.
The old aliases still exist, but I think the thing I&rsquo;m supposed to do here
is <code>systemctl reboot</code>, and there doesn&rsquo;t seem to be an analogous pattern any
more.</p>
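<p>If I&rsquo;m reading <code>systemd-fsck(8)</code> correctly, the closest equivalents are a kernel
command line option or the old flag file, which it reportedly still honors
(untested on this machine):</p>
<pre><code># either boot once with fsck.mode=force appended to the kernel command line, or:
sudo touch /forcefsck
sudo systemctl reboot
</code></pre>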
<p>On the other hand, some of the choices immediately evident in the design of
<code>systemctl</code> and <code>journalctl</code> seem interesting and not without merit.</p>
</article>
<article>
<h1><a name=Monday-March-2 href=#Monday-March-2>#</a> Monday, March 2</h1>
<h2><a name=Monday-March-2-python href=#Monday-March-2-python>#</a> python</h2>
<p>Significant whitespace isn&rsquo;t exactly a disaster, but on balance still feels
to me like it causes more problems than it solves: Copy &amp; paste headaches,
editor hassles, etc.</p>
</article>
<article>
<h1><a name=Thursday-April-9 href=#Thursday-April-9>#</a> Thursday, April 9</h1>
<h2><a name=Thursday-April-9-CGI-Fast-and-multi-param href=#Thursday-April-9-CGI-Fast-and-multi-param>#</a> CGI::Fast and multi_param()</h2>
<p>A little while ago, changes were made to <a href="http://search.cpan.org/~leejo/CGI-4.14/lib/CGI.pod">Perl&rsquo;s CGI.pm</a> because of a <a href="http://seclists.org/vulnwatch/2006/q4/6">class
of exploits</a> arising from calling <code>param()</code> in list context.</p>
<p>I had code in a wrapper for <a href="https://github.com/brennen/display">Display</a> that called <code>param()</code> in list context
deliberately:</p>
<pre><code># Handle input from FastCGI:
while (my $query = CGI::Fast-&gt;new) {
my @params = $query-&gt;param('keywords');
print $d-&gt;display(@params);
}
</code></pre>
<p>In due course, I started getting warnings about calling <code>param()</code> in list context.
They looked sort of like this:</p>
<pre><code>brennen@exuberance 18:46:13 /home/brennen/www (master) ★ perl display.fcgi 2&gt;&amp;1 | head -1
CGI::param called in list context from package main line 38, this can lead to vulnerabilities. See the warning in "Fetching the value or values of a single named parameter" at /usr/local/share/perl/5.20.1/CGI.pm line 408.
</code></pre>
<p>Problematic, since a variable containing that list is <em>exactly what I want</em>. On
googling, I found that in addition to the warning, CGI.pm had been amended to
include <code>multi_param()</code> for <a href="http://search.cpan.org/~leejo/CGI-4.14/lib/CGI.pod#Fetching_the_value_or_values_of_a_single_named_parameter:">the cases</a> where you explicitly want a list.
Ok, cool, I&rsquo;ll use that.</p>
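<p>With that, the wrapper loop becomes something like (the same code as above, just
with the explicit list method):</p>
<pre><code># Handle input from FastCGI, asking for the list explicitly:
while (my $query = CGI::Fast-&gt;new) {
  my @params = $query-&gt;multi_param('keywords');
  print $d-&gt;display(@params);
}
</code></pre>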
<p>Fast forward to just now. <code>display.fcgi</code> is blowing up on my local machine. Why?</p>
<pre><code>[Thu Apr 09 18:28:29.606663 2015] [fcgid:warn] [pid 13984:tid 140343326992128] [client 127.0.0.1:41335] mod_fcgid: stderr: Undefined subroutine CGI::Fast::multi_param
</code></pre>
<p>Well, ok, I upgraded Ubuntu a while back. Maybe I need to reinstall CGI::Fast
from CPAN because the Ubuntu packages aren&rsquo;t up to date. So:</p>
<pre><code>$ sudo cpan -i CGI::Fast
</code></pre>
<p>No dice. What am I missing here? Oh, right. CGI::Fast inherits from CGI.pm.</p>
<pre><code>$ sudo cpan -i CGI
</code></pre>
<p>Golden.</p>
<p>Granted, I should probably stop using CGI.pm altogether.</p>
</article>
<article>
<h1><a name=Monday-April-20 href=#Monday-April-20>#</a> Monday, April 20</h1>
<h2><a name=Monday-April-20-getting-recent-posts-from-pinboard-machine-readably href=#Monday-April-20-getting-recent-posts-from-pinboard-machine-readably>#</a> getting recent posts from pinboard machine-readably</h2>
<p>I&rsquo;ve been experimenting again with using Pinboard to track links of interest,
and thought that maybe it&rsquo;d be a good idea to use these to add a linkblog back
to p1k3.</p>
<p>First I thought ok, there&rsquo;s probably an API, which, sure enough, <a href="https://pinboard.in/api">is
true</a>. Here&rsquo;s a one-liner that will grab JSON of recent posts by
a user:</p>
<pre><code>curl "https://brennen:[brennen's password goes here]@api.pinboard.in/v1/posts/recent?count=25&amp;format=json"
</code></pre>
<p>&hellip;but then I thought ok, this is dumb. I know there&rsquo;s RSS, so why not just use
a standard format that could pull from other sources as well?</p>
<pre><code>curl https://feeds.pinboard.in/rss/u:brennen/
</code></pre>
<p>Further thoughts: Instead of doing this dynamically on the server, I could
just periodically pull data into the p1k3 archives and render it using some
service. I&rsquo;m getting ever-more leery of running any dynamic code where I don&rsquo;t
have to, and even considering rewriting all of the p1k3 stuff to generate
static files instead of building pages on the fly, so maybe this would be a
good experiment.</p>
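<p>A sketch of the static version of that idea (the target path is made up, and the
cron entry is untested):</p>
<pre><code># crontab entry: snapshot recent pins once an hour for later rendering
0 * * * * curl -s 'https://feeds.pinboard.in/rss/u:brennen/' &gt; /home/brennen/www/feeds/pinboard.rss
</code></pre>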
</article>
<article>
<h1><a name=Monday-January-18 href=#Monday-January-18>#</a> Monday, January 18</h1>
<h2><a name=Monday-January-18-moved-to-p1k3-com href=#Monday-January-18-moved-to-p1k3-com>#</a> moved to p1k3.com</h2>
<p>I&rsquo;ve decided to pick this project back up, but it seems like I&rsquo;ll probably be
better at updating it if I <a href="https://p1k3.com/2016/1/16">integrate it</a> into
<a href="https://p1k3.com/">p1k3.com</a>. I&rsquo;ve copied all of these entries over into
the p1k3 tree, and new ones will appear there, but I&rsquo;ll leave this document
in place since I feel like it&rsquo;s uncool to break links.</p>
</article>
<article>
<h1><a name=tools-amp-toolchains-for-data-munging-amp-analysis href=#tools-amp-toolchains-for-data-munging-amp-analysis>#</a> tools &amp; toolchains for data munging &amp; analysis</h1>
<h2><a name=tools-amp-toolchains-for-data-munging-amp-analysis-csvkit href=#tools-amp-toolchains-for-data-munging-amp-analysis-csvkit>#</a> csvkit</h2>
<p>This is super handy. Wish I&rsquo;d started using it sooner:</p>
<blockquote><p>csvkit is a suite of utilities for converting to and working with CSV, the
king of tabular file formats.</p>
<p>&hellip;</p>
<p>csvkit is to tabular data what the standard Unix text processing suite (grep,
sed, cut, sort) is to text. As such, csvkit adheres to the Unix philosophy.</p>
<ol>
<li>Small is beautiful.</li>
<li>Make each program do one thing well.</li>
<li>Build a prototype as soon as possible.</li>
<li>Choose portability over efficiency.</li>
<li>Store data in flat text files.</li>
<li>Use software leverage to your advantage.</li>
<li>Use shell scripts to increase leverage and portability.</li>
<li>Avoid captive user interfaces.</li>
<li>Make every program a filter.</li>
</ol>
</blockquote>
<p>&ndash; <a href="https://csvkit.readthedocs.org/en/0.9.0">csvkit 0.9.0</a></p>
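<p>Applied to the GTFS stop_times/trips wrangling earlier in these notes, something
like this (untested) would have been a lot less fiddly than raw <code>join</code> and <code>cut</code>:</p>
<pre><code>csvjoin -c trip_id stop_times.txt trips.txt \
  | csvgrep -c stop_id -m 20921 \
  | csvcut -c departure_time,direction_id \
  | csvlook
</code></pre>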
<h2><a name=tools-amp-toolchains-for-data-munging-amp-analysis-jq href=#tools-amp-toolchains-for-data-munging-amp-analysis-jq>#</a> jq</h2>
<p>Also super handy, if a notch less intuitive. Powerful DSL / pretty-printer /
filter for working with JSON records at the command line.</p>
<ul>
<li><a href="http://stedolan.github.io/jq/">http://stedolan.github.io/jq/</a></li>
</ul>
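<p>A tiny self-contained example (the input here just mimics, roughly, the shape of
a Pinboard recent-posts response):</p>
<pre><code>$ echo '{"posts": [{"href": "https://p1k3.com", "tags": "blog"}]}' | jq -r '.posts[].href'
https://p1k3.com
</code></pre>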
</article>
<article>
<h1><a name=systemd-notes href=#systemd-notes>#</a> systemd notes</h1>
<ul>
<li><a href="https://letsgettechnicalblog.wordpress.com/2014/07/07/systemd-on-raspbian/">https://letsgettechnicalblog.wordpress.com/2014/07/07/systemd-on-raspbian/</a></li>
</ul>
</article>
<script>
$(document).ready(function () {
  // ☜ ☝ ☞ ☟ ☆ ✠ ✡ ✢ ✣ ✤ ✥ ✦ ✧ ✩ ✪
  var closed_sigil = 'show';
  var open_sigil = 'hide';
  var togglesigil = function (elem) {
    var sigil = $(elem).html();
    if (sigil === closed_sigil) {
      $(elem).html(open_sigil);
    } else {
      $(elem).html(closed_sigil);
    }
  };
  $(".details").each(function () {
    var $this = $(this);
    var $button = $('<button class=clicker-button>' + open_sigil + '</button>');
    var $details_full = $(this).find('.full');
    $button.click(function (e) {
      e.preventDefault();
      $details_full.toggle({
        duration: 550
      });
      togglesigil(this);
    });
    $(this).find('.clicker').append($button);
    $button.show();
  });
  // $('.details .full').hide();
});
</script>
</body>
</html>