Creating and rotating log files from piped input

Say you have an application that writes logs to STDOUT and you want to capture that stream, write it to a file, and rotate the files based on time and/or size. rotatelogs is a small utility bundled with the Apache HTTP Server that can do this for you.

I wrote a small sample program as a test, one that outputs a bunch of Lorem ipsum text to STDOUT (lorem):

package main

import (
	"fmt"
	"math/rand"
	"time"

	lorem "github.com/drhodes/golorem" // assumed import path; any package exposing Paragraph works
)

func main() {
	for {
		fmt.Println(lorem.Paragraph(3, 8))
		time.Sleep(time.Duration(rand.Intn(1000)) * time.Millisecond)
	}
}

This prints a Lorem ipsum paragraph of between 3 and 8 sentences, then sleeps for a (pseudo)random 0 to 1000 milliseconds before printing the next.

Once built, you can do this:

$ lorem | rotatelogs -l "lorem-%Y%m%d-%H%M%S.log" 86400 5M

This sends the output of lorem to a date/time based log file like lorem-20150406-120400.log and rotates the log every night at midnight and/or after the content reaches 5 megabytes in size.
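The filename template uses the same strftime(3) escapes as date(1), so you can preview what the next rotated file will be named:

```shell
# Preview the filename rotatelogs generates for the template above;
# %Y%m%d gives the date and %H%M%S the time of rotation.
date +"lorem-%Y%m%d-%H%M%S.log"
```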

The rotatelogs man page includes other useful examples I recommend reading through.

On athletic endeavours thus far

A little over five years ago I was at a friend’s Christmas party and heard about a guy there who had just completed a race at the Sears Tower (now Willis Tower), climbing up the stairwell all the way to the top. It was one of the most intriguing things ever. I had to talk to him. I had to learn more.

The reason it was so intriguing may require a bit of an explanation. I had just gotten into running and getting fitter in general. I was entering races seemingly left and right just for the fun of it all. Newness. But a few weeks prior, at the YMCA Turkey Trot in Dallas, I had run too many miles too soon for my newfound running body. I developed a really bad case of tendonitis that took a few months to recover from, because I simply didn’t know how much was too much.

I eventually made my way over to the guy and started talking with him. Very interesting. A triathlete who had been doing stair races in the off-season. He told me a little bit about the race, also mentioning that there was another one coming up in the next month. Luckily climbing stairs didn’t bother the tendonitis injury, and I was up for a challenge. I knew I had to sign up the moment I got home.

I spent the next few weeks alternating between dangling my legs in an ice-cold pool, working out with P90X DVDs and attending intense workout classes at my residence. The lofts where my wife and I were living at the time provided a great place to train, with the workout classes and a six-floor parking garage. I made use of both the stairwells and the garage itself for the next month.

The month passed by quickly and I was as ready as I was going to be for my first stair climb. The day came; my first Big D Climb. 52 stories of a thorough ass-kicking. I finished in 11 minutes, 43 seconds and felt like I was going to throw up while being treated with a beautiful view of Dallas for my effort. (I was able to enjoy the view - a few minutes after finishing I was fine.)

Stair climbing is easily the hardest thing I’ve ever done. Not that I’m super experienced in doing a lot of physically demanding activities, but completing a stair climb as fast as possible is terribly grueling. And to the right type of person, it’s also incredibly fun.

I’ve learned a lot about myself in the past 5-6 years of athletic endeavors. A good physical challenge is something I now live for. Since the day I discovered my love of running clad in a pair of hideous Vibram Five-Fingers, I’ve completed quite a few events. Stair climbs, 5ks, 10ks, road races, trail races, mud races, a few trail ultra-marathons up to 50 miles and one road marathon. And this isn’t a brag - at least that’s not how I intend it.

I grew up with seasonal allergies that always brought on an asthma attack. At least once, usually twice a year I was down, incapable of doing much of anything for one to two weeks. I never got into sports because my asthma always seemed to get in the way - or at least that’s what I thought would happen according to my regular visits with various doctors. But, it was bullshit. The best thing I could have done was to get regular exercise - to push limits safely. It took me 20 or so years to figure that out.

My point is that if you haven’t experienced the awesomeness of being physically active in something you love to do, or haven’t yet discovered what that something is, get out there. Try things. The Earth is a playground. Treat it with respect. Go slow. Have fun. If I can do it, you probably can too.

While at the Big D Climb, I learned about another race coming up in Dallas in a month. Again, I signed up the moment I got home. The ALA Climb - American Lung Association; oh, that’s something personal (remember: asthma).

With the first race under my belt, I knew a bit more about what to expect. I ramped up my training specific to stair climbing. I hit the spin bike more often, did parking garage workouts and hit any stairwell I could find. The next race came and I finished in under 10 minutes. 9 minutes, 44 seconds to be exact. What an improvement a little experience and a month of specific training makes. That placed me 39th in a field of over 500. Not bad!

A year went by. Many more 5ks, 10ks, an ultra-marathon, another 8-mile Turkey Trot and a marathon, and I was ready for round two of the Big D Climb. This time, I had done some race-specific training and was ready for a killer time.

While I thought I might have a chance at finishing around 7:30, it was hopeful at best. The race kicked my ass again. I finished in just over 8 minutes, placing #10 in a field of over 800 participants. What an improvement.

A week later I went on to finish a 50 mile race. Note that I said finish - not compete, or really even “race” - just finish. It was one of the coolest things I’ve ever done, and I hope to get back to long trail runs within the next couple of years.

Wait, get back to it? Yes, because about two months later my wife gave birth to our daughter and as everybody says, kids change things.

In 2013, my time slowed at the Big D Climb. It was never officially recorded because the timing company screwed my timing up (amongst others), but it was a little over 10 minutes.

My time slowed again to 10:46 in 2014. I kept saying that I was going to get back to training to be competitive again. But after having a kid and buying a house, I was focused on doing the minimal amount of training possible. The main problem is that I’ve never had any sort of method to my training. I’ve always just done whatever I’ve felt like because it’s all about having fun anyway. Well, that doesn’t work so well when you’re focusing on a minimal training method, because you absolutely MUST be consistent. I wasn’t. It showed.

Things are still the same this year. I finished in 11:12 last Saturday at my fifth Big D Climb event, this time at a new building, but still close to the same height. I’m getting slower and slower, close to my first climb’s result, when I should be getting faster and faster. Oops. Normally it wouldn’t matter, but I’m not having fun being slow in stair climbing. I wanna be fast!

I’m finally ready to change things up. I’m in a good routine of going to the gym. I have a good place to accomplish effective workouts. I’ve made a real effort over the last week to develop a stair-climbing specific training program for myself. One that is sustainable and will bring results.

The last week went extremely well. I did a mixture of strength training with the barbell (upper and lower body), two sessions of 400m repeats on the erg, hill repeats on the treadmill, a little bit of work on the stepmill, a long slow run at a trail at the lake, and even some kettlebell work at home.

Now for this week. More strength training. More erg. More spin bike. More stepmill and more trail running. I’ve developed my programming using energy system training methods recommended by Joel Jamieson in his book entitled Ultimate MMA Conditioning. I’m interested to see what happens when following a structured program instead of just going out to the trail a few times a week and doing whatever I feel like for that day.

Next week there is another race, the ALA Climb. I’m not sure if I’m doing it or not due to the price. There is another a week later that is more intriguing, simply because I haven’t done it before, it requires some travel and the race is shorter and plays more to my strengths.

I’ll make a decision soon. I’m just happy to be back to harder aerobic training. Makes me feel goooood.

Now there’s a cat climbing on top of me and I must get back to work anyway. Have fun out there!

Wiki on BeagleBone Black

I just picked up a Rev 4 BeagleBone Black from Micro Center for ~$40 with the intent of using it as a wiki server for both my own personal notes and those for my home and family. While the BeagleBone comes with an onboard 4GB eMMC chip pre-installed with Debian, I need more space to store my notes, so I also picked up a 16GB microSD card.

Since the BeagleBone Black (BBB) is more powerful than the Raspberry Pi, I suppose you could use pretty much any wiki software that runs on the ARM architecture with reasonable performance. But, I wrote software specifically for this purpose: a Git-backed Markdown-based wiki in a single executable - Goiki. This is what I’ll be running.

First, I connected the BBB to my network using the supplied RJ45 jack on the board. To power it up, I used the supplied USB cable and connected it to an open USB port on my laptop.

To find its IP address, I logged in to my router to see what had changed in its DHCP leases. Ah, there it is! It shows up with a hostname of “beaglebone”. I’ll just use that.

ssh root@beaglebone

There is no password associated with the root account. This will need to change. I used passwd to update the password.

I like to tackle hardware issues first, so I then set up the microSD card to act as storage for my notes. Debian is set up to auto-mount the microSD card, so I was able to use df -h to see what was mounted.

The filesystem that was set up on the card wasn’t appropriate for my needs (well, I suppose it was technically, but I wanted to change it anyway). So, I used fdisk to remove the current partitions and re-create one big partition. I then used mkfs.ext4 to make an ext4 filesystem on that partition. Then it was ready to go.

To make sure the partition is always available, a mount point for the device needed to be set up. The partition is located at /dev/mmcblk1p1 and I mounted it at /media/microsd (mkdir /media/microsd first). Below is the entry I used in /etc/fstab to set up the mount point:

/dev/mmcblk1p1      /media/microsd  ext4    defaults    0   2
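For reference, the six fields in that fstab entry are, in order (an annotated copy, not a second entry to add):

```
# device          mount point      type   options    dump  fsck pass
/dev/mmcblk1p1    /media/microsd   ext4   defaults   0     2
# dump=0 skips the legacy dump(8) backup; fsck pass 2 checks it after the root fs.
```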

Now the storage device automatically mounts at /media/microsd on boot. My Goiki data directories were then set up on the device.

mkdir -p /media/microsd/goikipersonal/data
mkdir -p /media/microsd/goikifamily/data

Install Goiki

Goiki is a Git-based Markdown-powered wiki written in Golang. I wrote it specifically for the purpose of having a simple wiki for low-powered ARM devices such as the Raspberry Pi and BeagleBone Black. Of course it can also run on many other platforms too.

I don’t want to compile the project on the ARM device itself, to avoid unnecessary thrashing of the microSD card. Instead, I cross-compile the binary from my Mac. Maybe one day I’ll have binaries available for multiple platforms, but since Goiki is in such a beta state (though completely usable), you’ll have to compile it yourself.

A quick primer on cross-compiling Go for ARM on OS X

Read the Go: Getting Started section on the Golang website first. Though, the following might get you there without needing to do that:

brew install go
mkdir ~/go
export GOPATH=~/go
cd ~/go
mkdir -p src
cd src
git clone [goiki repository URL]
cd goiki
go get

For cross compiling Go projects on OS X for ARM, do the following once:

cd `brew --prefix go`
cd libexec/src
GOOS=linux GOARCH=arm CGO_ENABLED=0 ./make.bash --no-clean

And then to compile Goiki:

cd ~/go/src/goiki
GOOS=linux GOARCH=arm GOARM=7 go build -o goiki.arm

Now copy over the binary:

scp goiki.arm root@beaglebone:

And on the BBB:

mv ~/goiki.arm /usr/local/bin/goiki

Now we can run the goiki executable.

Configure Goiki

Goiki packages its default configuration in the binary and provides a command-line flag for dumping it to STDOUT. Capture the output to a configuration file:

mkdir /etc/goiki
goiki -d > /etc/goiki/personal.toml
goiki -d > /etc/goiki/family.toml

The configuration file is self-documenting. Edit it with your favorite editor and come back to set up a daemon.

Setting up a daemon

Since SysV init is available, we’ll just use that. I found a terrific init.d script template to make this easy. You’ll just need to change the first few variables for your own executable:

#!/bin/sh
# myapp daemon
# chkconfig: 345 20 80
# description: myapp daemon
# processname: myapp

DAEMON_PATH="/usr/local/bin"

DAEMON=myapp
DAEMONOPTS="-my opts"

NAME=myapp
DESC="My daemon description"
PIDFILE=/var/run/$NAME.pid
SCRIPTNAME=/etc/init.d/$NAME

case "$1" in
start)
    printf "%-50s" "Starting $NAME..."
    cd $DAEMON_PATH
    PID=`$DAEMON $DAEMONOPTS > /dev/null 2>&1 & echo $!`
    #echo "Saving PID" $PID " to " $PIDFILE
    if [ -z $PID ]; then
        printf "%s\n" "Fail"
    else
        echo $PID > $PIDFILE
        printf "%s\n" "Ok"
    fi
;;
status)
    printf "%-50s" "Checking $NAME..."
    if [ -f $PIDFILE ]; then
        PID=`cat $PIDFILE`
        if [ -z "`ps axf | grep ${PID} | grep -v grep`" ]; then
            printf "%s\n" "Process dead but pidfile exists"
        else
            echo "Running"
        fi
    else
        printf "%s\n" "Service not running"
    fi
;;
stop)
    printf "%-50s" "Stopping $NAME"
    PID=`cat $PIDFILE`
    cd $DAEMON_PATH
    if [ -f $PIDFILE ]; then
        kill -HUP $PID
        printf "%s\n" "Ok"
        rm -f $PIDFILE
    else
        printf "%s\n" "pidfile not found"
    fi
;;
restart)
    $0 stop
    $0 start
;;
*)
    echo "Usage: $0 {status|start|stop|restart}"
    exit 1
esac

I set one up for each of my wikis, copying the modified template to /etc/init.d/goikipersonal and /etc/init.d/goikifamily and making them executable:

chmod +x /etc/init.d/goikipersonal
chmod +x /etc/init.d/goikifamily

An example of how I changed the goikifamily init script:

# goiki daemon
# chkconfig: 345 20 80
# description: goiki family daemon
# processname: goiki

DAEMON_PATH="/media/microsd/goikifamily"

DAEMON=goiki
DAEMONOPTS="[flags pointing goiki at /etc/goiki/family.toml]"

NAME=goikifamily
DESC="Goiki wiki for family notes"
PIDFILE=/var/run/$NAME.pid

Now, start them up!

/etc/init.d/goikipersonal start
/etc/init.d/goikifamily start

You should now be able to hit the box from your favorite web browser: http://beaglebone:[port].

DIY Hair Pomade

Ingredients
  • 1 tsp Pure beeswax
  • 1 tsp Coconut oil
  • 1 drop Essential oils

Use 8 parts of each to fill a small tin.

Equipment
  • Double boiler
  • Storage tin

Instructions
  1. Get about an inch of water boiling in the pot.
  2. Pour the beeswax into the bowl and wait until liquefied.
  3. Mix in the coconut oil and essential oils.
  4. Stir until everything is liquefied.
  5. Carefully pour into the storage tin.
  6. Wait for a few hours until solid.

This recipe and accompanying instructions originally came from the article entitled DIY Hair Pomade for Father’s Day (All-Natural) by Ryan Foy.

On Shoes

I prefer shoes that are built for the anatomy of the foot, not solely for style. However, I also like a stylish shoe. Not very long ago this type of shoe was almost, if not completely, impossible to find. Luckily for us, there are now choices!


Yes, barefoot is best. I prefer to be barefoot as much as possible, even when it may not be as socially acceptable as it probably should be.


I love Luna Sandals. They are the most comfortable shoes I own. Okay, so they aren’t the most stylish; oh well.

Luna Mono with Pittards Footbed

Barefoot Ted and team absolutely nailed the straps (and footbeds, and soles!). I have a few older pairs, but when it’s time to buy new ones I will grab the Luna Mono with the Pittards footbed for an everyday sandal and the Luna Oso for trail running madness.


Lems Shoes makes some nice casual shoes. I own the Lems Mariner which is super comfortable worn without socks. The next shoe I’d like to try from Lems is the Lems Nine2five Coffee&Cream. I figure it’d be good to wear with pants once it gets a bit cooler. The Lems Primal 2 and Lems Boulder Boot are also enticing.

Lems Nine2five Coffee&Cream

Vivobarefoot has some great casual shoes as well. I have a pair of Gobi and a pair of Ra that are both wonderfully comfortable. There are two problems with the Ra (the Gobi is perfect). One is fairly minor and can be fixed with thicker socks: the back digs into the heel a bit. The other is stylistic: the toebox is so wide that it starts looking like a clown shoe at certain angles. Good news though: Vivobarefoot has completely fixed both problems with the original Ra in the Ra II.


While I don’t currently have a need for dress shoes, Vivobarefoot’s Bannister, Porto and Lisbon are new luxury styles that I would purchase in a heartbeat if I had to go into the office every day.

Vivobarefoot Porto in Dark Brown


When hiking in sandals is not appropriate, I currently use a traditional pair of hiking boots. But I’d like to replace them with a barefoot alternative, and the Vivobarefoot Off Road Hi is the one I’ll be going after.

Vivobarefoot Off Road Hi


When talking about barefoot shoes, one can’t forget Soft Star Shoes. Earlier, I partly lied about the Luna Sandals being the most comfortable shoes I own. For summer, that is definitely true, but the Roo Moccasins win that award in the colder months. Sometimes I wear them out, and yes, they look funky (they come in custom color configurations too), but sometimes it’s worth embracing your inner elf for the sheer joy of wearing elfish moccasins out for a day.

Soft Star Shoes Roo Moccasins


Hosting multiple domains with Nginx in Ubuntu 14.04 on Digital Ocean

So I was having a problem with my previous server. Now that I’ve rebuilt everything, I’ve realized the problem could have been easily fixed on the previous install. But it doesn’t matter; it was time to upgrade from Ubuntu 12.04 to 14.04 anyway.

The goal of the server is simple: host two domains using Nginx. The problem I was having with the 12.04 install was that I could never get the second domain to serve up the right files; it would always serve up the first domain. This turned out to be a problem with the Nginx configuration. I suspected as much at the time, but since the install wasn’t doing anything else anyway, it was a good time to rebuild and start from scratch.

There are four articles of importance here:

  1. Initial server setup with Ubuntu 14.04
  2. How to install Nginx on Ubuntu 14.04 LTS
  3. How to set up Nginx server blocks (virtual hosts) on Ubuntu 14.04 LTS
  4. How to install and use Fail2Ban on Ubuntu 14.04

There is also a fifth article that may be useful to you: How to set up a host name with DigitalOcean.

The following is a quick summary of commands and configuration for the four articles listed above.

Initial server setup with Ubuntu 14.04

  1. Login: ssh root@[server_ip_address]
  2. Change the root password: passwd
  3. Add a new user: adduser [username]
  4. Give root privileges to new user: visudo
  5. BACK OUT IMMEDIATELY with CTRL-X and export EDITOR=vim ;-)
  6. Give root privileges to new user: visudo

    [username] ALL=(ALL:ALL) ALL

  7. Alter SSH configuration: vim /etc/ssh/sshd_config

    1. Update the port: Port [port_number]
    2. Refuse Root Login: PermitRootLogin no
    3. Allow only certain users (careful!): AllowUsers [username]
  8. Restart SSH: service ssh restart

  9. Test SSH config from new terminal

    1. ssh -p [port_number] [username]@[server_ip_address]
    2. sudo vim
  10. Exit the root terminal: exit

How to install Nginx on Ubuntu 14.04 LTS

  1. Install Nginx:
    1. sudo apt-get update
    2. sudo apt-get install nginx
  2. Check the installation: curl http://[server_ip_address] or curl http://[domain_uri]

How to set up Nginx server blocks (virtual hosts) on Ubuntu 14.04 LTS

  1. Create a directory for each domain’s files:
    1. sudo mkdir -p /var/www/[domain1]/html
    2. sudo mkdir -p /var/www/[domain2]/html
  2. Change ownership to each directory:
    1. sudo chown -R $USER:$USER /var/www/[domain1]/html
    2. sudo chown -R $USER:$USER /var/www/[domain2]/html
  3. Make sure the permissions of the web roots are correct: sudo chmod -R 755 /var/www
  4. Create sample pages for each site:
    1. vim /var/www/[domain1]/html/index.html -> HTML
    2. cp /var/www/[domain1]/html/index.html /var/www/[domain2]/html/index.html
    3. vim /var/www/[domain2]/html/index.html -> HTML
  5. Create server blocks for each domain:

    1. sudo cp /etc/nginx/sites-available/default /etc/nginx/sites-available/[domain1]
    2. sudo vim /etc/nginx/sites-available/[domain1]
        server {
            listen 80;
            listen [::]:80;
            server_name [domain1] www.[domain1];
            root /var/www/[domain1]/html;
            index index.html index.htm;
            location / {
                try_files $uri $uri/ =404;
            }
        }
    3. sudo cp /etc/nginx/sites-available/[domain1] /etc/nginx/sites-available/[domain2]
    4. sudo vim /etc/nginx/sites-available/[domain2] and adjust the configuration to match the second domain’s name and directory locations.
  6. Enable the server blocks and restart Nginx

    1. sudo ln -s /etc/nginx/sites-available/[domain1] /etc/nginx/sites-enabled/
    2. sudo ln -s /etc/nginx/sites-available/[domain2] /etc/nginx/sites-enabled/
    3. sudo vim /etc/nginx/nginx.conf and uncomment/set: server_names_hash_bucket_size 64;
    4. sudo service nginx restart
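After the adjustments in step 5, the second domain’s server block ends up as a mirror of the first; assuming the same directory layout, it would look like this:

```nginx
server {
    listen 80;
    listen [::]:80;

    server_name [domain2] www.[domain2];
    root /var/www/[domain2]/html;
    index index.html index.htm;

    location / {
        try_files $uri $uri/ =404;
    }
}
```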

How to install and use Fail2Ban on Ubuntu 14.04

  1. sudo apt-get update
  2. sudo apt-get install fail2ban

Einstein on reading

Reading, after a certain age, diverts the mind too much from its creative pursuits. Any man who reads too much and uses his own brain too little falls into lazy habits of thinking.

– Albert Einstein

First post using Hugo

Ahhh, new beginnings. About the only time I ever write something is when I am changing software. I suppose it’s time to change that.

About Me

Hi, I’m Justin.

My interests are in permaculture and ecological design; software development and architecture; running, kettlebells, diet and overall fitness; photography; and many tiny, specific little things.

This is mostly a place to share a few things with my future self. Bonus points if you find something of value too.

Want to get in touch? Email me at: the at justinlanghorst dot com, or tweet in my general direction @justajot.

Octopress on S3

I’m currently working on redoing a company website, moving from WordPress to Nanoc. Since the new website will be static, and mostly a single-pager for a while, we now have the option of moving from a traditional host to the Amazon S3 datastore. Before committing a company website to the process, I decided to go ahead and move this blog over to S3 (it uses Octopress) first. Here are a few notes that I jotted down along the way.

Sign-up to S3 and create buckets

Amazon has excellent documentation for hosting a static website on S3, and I basically followed its walkthrough. To begin, create two buckets for your domain, one with the “www” subdomain and one without. Redirect the “www” bucket to the non-www bucket, and keep all your files in the latter. Or just ignore the “www” version entirely if you don’t care to resolve the www subdomain to your website. Some might say it’s deprecated anyway.

Add S3 deployment to Octopress

Jacob Elder wrote about deploying Octopress to S3 and also included a bit of code for deployments using s3cmd. Check out the blog post for detailed info, or just update your Rakefile with the contents of this patch.

Updates to Jacob’s patch

I ended up changing a few things from Jacob’s code, mostly just adding the option to use reduced redundancy since the contents of my website are most definitely “durably stored elsewhere”.

## -- S3 Deploy Config -- ##
# Requires s3cmd. brew install s3cmd or see http:/
s3_bucket = ""
s3_delete = false
s3_reduced = true

That’s the config, and this is the Rake task:

desc "Deploy website to Amazon S3"
task :s3 do
  puts "## Deploying website via s3cmd"
  exclude = File.exists?("./s3-exclude") ? "--exclude-from '#{File.expand_path('./s3-exclude')}'" : ""
  ok_failed system("s3cmd sync --guess-mime-type --acl-public #{exclude} #{'--delete-removed' unless s3_delete == false} #{'--reduced-redundancy' unless s3_reduced == false} #{public_dir}/ s3://#{s3_bucket}/")
end

Create routes, switch DNS servers

Keep following Amazon’s walkthrough. Once you’ve switched your DNS servers to point to those listed under the Route 53 entry, all you have to do is wait a bit to see if you did everything correctly. If you’re reading this, I’m guessing my attempt was a success. ;-)

Using AirPort Utility 5.6.1 in Lion

So I’m in the beginning stages of planning a whole-house automation system. Since music is pretty damn important to me, the first part of this system I’d like to get right is whole-house audio. I want to be able to play music in pretty much any room I’m in, all controlled from my phone or tablet. Since I’m already running Apple products everywhere, it just makes sense to use an AirPort Express for each zone.

To get things started, I have an older version of the AirPort Express that plugs directly into the wall. It hasn’t been updated in a while, but my OS X systems have. I’m currently running Lion (not Mountain Lion) and the current AirPort Utility application (version 6.1) doesn’t support the old AirPort Express. The older version that does support it only runs on Leopard/Snow Leopard. Or so it says. What to do?

Well, there is an excellent explanation here. I’ll forgive the errors in the title of his/her site because the solution and walk-through are great.

Briefly, here’s what you need to do:

  1. Download the AirPort Utility 5.6.1 disk image.
  2. Mount the image / move AirPortUtility.pkg to your Desktop.
  3. Open Terminal, and cd ~/Desktop ; mkdir tmp ; cd tmp.
  4. Extract the payload: xar -x -f ~/Desktop/AirPortUtility.pkg Payload.
  5. Extract the app: gzcat AirPortUtility.pkg/Payload | tar -xf -.
  6. Open up the app located within the new Applications directory.

Following the instructions above allowed me to finally fix my old AirPort Express to get it showing in my AirPlay list again. Now I just need to buy a few more to get audio in each room. I think it’s time to go listen to some music. Yup.

On pens

What goes well with notebooks? Pens! Since my last post was about notebooks, it only makes sense for this one to be about pens. Right? Right.

Earlier this year I was on an all-out yeti hunt to find the perfect notebook after Moleskine completely screwed up their previously lovely square-ruled notebook with grid ink dark enough to compete with my own writing. The quest was successful after I found my perfect notebook in the Behance Dot Grid Journal. The bright green strap just does it for me.

For the Dot Grid Journal, my favorite pen so far is the Papermate Flair UF. The Fisher Space Pen is great if you’re into ballpoints, as is the Parker Jotter. And while I have the stainless steel version of the Jotter, I far prefer the lime green one (yes, bright green is a favorite color of mine). The weight and feel are a bit different too, and I actually like the colored plastic versions better.

If I were still writing on Moleskines, there is no question what pen I would be using: the Pilot Hi-Tec-C. And to make things more interesting, there have been a few successful Kickstarter projects for better housings.

My choice would be the Render K. Just something about it. I haven’t felt like forking over $40 just yet though. Hmmm, the PHX-Pen looks interesting as well. Decisions, decisions.

A search for the perfect notebook

I’ve been on a mad hunt for the perfect notebook the last few days. Not a laptop. Certainly not one of those spiral notebooks typically used in grade school. No, just a regular black notebook. Oh, but with graph lines. Not blank. Not ruled.

My last Moleskine Squared Notebook is nearing completion with 11 pages left. Why the hunt? Why not just buy another one? There are several Barnes and Noble stores near me that sell them, BUT, there’s one major reason why I won’t do it again – the graph lines are TOO DARK!

It wasn’t long ago that the Moleskine Squared Notebook was the perfect one for me. Except for being a bit pricey, I loved everything about them – the color of the paper, the graph lines, the way the binding allows the notebook to lay flat when you have the notebook open. Oh man! Anyway, while I’m sure you care about the details of what works for me (I know you don’t), that isn’t the point.

After calling a few places and driving around to several different stores in the area, I came up with nothing. Needing a notebook immediately, I gave up and bought a ruled Moleskine from B&N. But, here is what I found.

After a large search around the area, I wasn’t able to get my hands on a physical copy, but apparently Piccadilly makes exact replicas of the Moleskine notebooks on the cheap. Both are made in China; I wouldn’t be surprised if they are made by the same company.

Ecosystem has a really nice line of notebooks. They use 100% post-consumer recycled paper and are 100% USA made. Awesomeness. Unfortunately for me, their Architect line of notebooks just isn’t what I’m looking for. The lines are printed dark and are much too small. If I were looking for a ruled journal, I would definitely consider them – their flexi cover is amazing and I just love their story.

Markings sells a line of products at Staples and Target. I find their cover disgusting and refuse to use it.

Behance sells the Dot Grid Journal, which looks pretty interesting. I know I was looking for a squared ruling, but if the dots aren’t too dark, it could be perfect.

UPDATE! The Behance Dot Grid Journal is the best notebook I’ve ever owned. Plus, their customer service is awesome.

UPDATE #2 It appears as if the lines in the squared Moleskine notebooks are back to normal - yay!

UPDATE #3 Lately I’ve been using nice printer paper, folded up twice to make four squares. I actually like this better than a notebook for daily notes - keeping around five pages with me at a time. I think I’ll punch holes in them and put them in a binder.

pbcopy | pbpaste, and opening a new terminal tab in the current working directory

While I try my best to continuously improve all areas of my life (kaizen), sometimes I fail to perform with exemplary status. Countless times a day I need to open a new Terminal tab in the same working directory. Before today, I issued a pwd command, copied the output using the mouse, hit Command-T, then typed cd and pressed Command-V. Ugh. That’s a convoluted mess.

When I was growing up and mastering conventional memory management in DOS for the sole purpose of viewing various graphical demos from around the world, I knew about, and how to use, every command available within the operating system. Even though that’s probably not realistic these days, it’s something definitely worth shooting for over time (hmmm, Linux From Scratch). If the same were true today, I would have already been using the pbcopy and pbpaste utilities.

Now I can simply pipe the current working directory to the clipboard:

pwd | pbcopy

Open a new terminal tab or window, and use a combination of cd and Command-V or pbpaste to get back to the original directory:

cd `pbpaste`
cd [Command-V]
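The backtick form is plain command substitution, so the pattern works with anything that prints a path. Here is the same round trip sketched portably, with a temp file standing in for the macOS-only clipboard tools:

```shell
# Same pattern as `pwd | pbcopy` ... cd `pbpaste`, using a file as the
# "clipboard" so it runs in any POSIX shell.
pwd > /tmp/clip        # stand-in for: pwd | pbcopy
cd /
cd "`cat /tmp/clip`"   # stand-in for: cd `pbpaste`
pwd                    # back in the original directory
```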

While pbcopy and pbpaste are excellent utilities, they aren’t the best solution for what I originally set out to do: open a new tab in the current working directory. I’m sure there is an even quicker way to get to the same current working directory, probably with an AppleScript, and I suspect that OS X Lion has an option for this in Terminal, but for now, pbcopy will do the trick.