I previously had a script that notified me of updates to Homebrew packages via Growl. However, I recently upgraded to Mac OS 10.8 Mountain Lion, and since Mountain Lion has Notification Center built in, I figured I'd try that method of notification rather than Growl. I found terminal-notifier, which seemed great for this purpose. So I updated my script to use terminal-notifier and published it as a gist. Make sure that you have terminal-notifier installed first (just run gem install terminal-notifier).
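To give a flavor of what the script does (the gist has the actual script), here's a minimal sketch in Python; it assumes brew and terminal-notifier are both on your PATH:

```python
# Minimal sketch of the notifier flow, not the actual script from the
# gist: update Homebrew, collect outdated packages, and post a
# notification via the terminal-notifier CLI.
import subprocess

subprocess.run(['brew', 'update'], check=True, capture_output=True)
out = subprocess.run(['brew', 'outdated'], check=True,
                     capture_output=True, text=True).stdout
outdated = [line.split()[0] for line in out.splitlines() if line.strip()]

if outdated:
    subprocess.run(['terminal-notifier',
                    '-title', 'Homebrew Updates Available',
                    '-message', ', '.join(outdated)])
```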
Assuming the script is at ~/bin/brew-update-notifier, you can install it in a crontab by running sudo crontab -e, then adding the line 0 12 * * * /Users/<username>/bin/brew-update-notifier to the end of the file (substituting your username for <username>, or wherever you've put the script). I've chosen to run the script every day at noon because ...
I frequently find myself wondering whether a bug in a Python package has been fixed and whether there's an upgraded version of that package that might fix it. So I end up running pip freeze and then manually comparing the package versions to those on PyPI. Well, anytime you say "run X manually," you're being a chump.
I just sat down and wrote a script that gets the list of packages installed in the current environment (so it works with virtualenv), checks PyPI for the latest version of each package, and prints out the status. If you work with Python and packages, this is awesomesauce.
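Here's a minimal sketch of the idea (not the script itself); it lists the installed distributions and asks PyPI's JSON API for the newest release of each:

```python
# Sketch: compare each installed package's version against the latest
# release on PyPI and report the status.
import json
import urllib.request
from importlib.metadata import distributions

def latest_version(name):
    """Ask PyPI's JSON API for the newest released version of a package."""
    try:
        url = 'https://pypi.org/pypi/%s/json' % name
        with urllib.request.urlopen(url, timeout=10) as resp:
            return json.load(resp)['info']['version']
    except Exception:
        return None

for dist in sorted(distributions(), key=lambda d: d.metadata['Name'].lower()):
    name, installed = dist.metadata['Name'], dist.version
    latest = latest_version(name)
    if latest is None:
        print('%s: could not reach PyPI' % name)
    elif latest != installed:
        print('%s: %s installed, %s available' % (name, installed, latest))
    else:
        print('%s: up to date (%s)' % (name, installed))
```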
Over the weekend, I finally upgraded my system to Mac OS X Lion. I also took the opportunity to do a completely fresh install of my system, making a final Time Machine backup before erasing the hard drive and then installing Lion off a USB thumb drive.
I have long used MacPorts as my open source package manager, but I've recently had issues with certain ports being out of date or no longer updated. So I was really interested in Homebrew. The fact that it's all on GitHub, open and actively developed, really appealed to me. After getting it up and running, I wanted to port my package update notifier to use Homebrew. Doing so was really quite easy. Here's what I came up with, which is also in my dotfiles on GitHub:
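As a sketch of the flow (the copy in my dotfiles is the canonical version), this is the same update-and-notify loop as the terminal-notifier version above, except that Growl's growlnotify command-line tool posts the notification:

```python
# Sketch of the Homebrew update notifier, with Growl doing the
# notifying. Assumes growlnotify (from Growl Extras) is installed.
import subprocess

subprocess.run(['brew', 'update'], check=True, capture_output=True)
out = subprocess.run(['brew', 'outdated'], check=True,
                     capture_output=True, text=True).stdout
outdated = [line.split()[0] for line in out.splitlines() if line.strip()]

if outdated:
    subprocess.run(['growlnotify', '-t', 'Homebrew Updates Available',
                    '-m', ', '.join(outdated)])
```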
Assuming the script is at ~/bin/brew-update-notifier, you can install the script to a ...
Today was World Backup Day, and, in the footsteps of Ars, I decided I'd detail my current backup strategy. I first started backing up regularly after an incident back in college. One day the hard drive in my laptop began making the telltale sounds of imminent failure. Then the computer stopped booting. I hadn't backed up the drive in several weeks, so I stood to lose a bunch of data. I ordered a new drive, and just before I was about to install it, I was luckily able to boot my laptop one more time and copy the data off of it, losing nothing. Since then, I've been a big advocate of backups.
My primary computer is a MacBook Pro. Since it's also the computer that all my photos and important data are on, I don't want to lose any of it. Every day I make sure to plug in the 1 TB external hard drive that Time Machine mirrors the contents of my drive to. So at all times I've got a backup of my drive and a history of my files. I also have another external drive, filled with Time Machine backups, that I keep in the safe at my parents' house. So if my house blows up, I have something to fall back on. The problem with that disk, though, is that it's several months out of date. And that sucks, because my data changes all the time.
But that's not all for my laptop. I also make use of CrashPlan. At home, I also have a Linux box with 4 hard drives in a RAID 5 setup, so if one drive dies, I don't lose any data; I can just replace the drive. So I ...
I wrote a quick little script this morning to notify myself of updates to MacPorts via the Mac OS X notification app Growl. It's a bash script designed to run as root from a cron job (as the port sync command requires root permissions). You can find the script committed in my dotfiles repository on GitHub. For convenience, I pasted the script below, though the copy on GitHub will always be the most up-to-date:
Assuming the script is at ~/bin/port-update-notifier, you can install it in a crontab by running sudo crontab -e, then adding the line 0 12 * * * /Users/<username>/bin/port-update-notifier to the end of the file (substituting your username for <username>, or wherever you've put the script ...
I've recently wanted to upload some of my pictures to Facebook. However, there are several things I really dislike about Aperture's built-in Facebook syncing.
First, order is not preserved. I want pictures ordered in a predictable way, often by date. I have yet to figure out how Aperture chooses its export order; it seems to be quite random, and it is infuriating.
Second, tagging friends on Facebook from Aperture's built-in Faces is spotty at best. Sometimes it works, and other times I get lots of pictures with people's names tagged, but those names aren't linked to the proper Facebook friends, despite being the exact same names. Again, infuriating!
Third, making any change in Aperture (adding a tag, etc.) causes the photo to be re-uploaded to Facebook, creating an updated photo album feed story. I don't want this behavior. I want to export my photos to Facebook and then choose when they should be updated again. I'll let you guess my emotion.
So, what other options are out there? Well, this guy Sean Farley created a plugin. However, it's broken on my computer. Supposedly it works for some people on Aperture 3, but I can't even get it to show up in the Export menu, and I get no Console messages. I've even emailed Sean twice trying to help figure out why it's broken. No response.
Finally, I've given up on these options. I'm not a prolific Facebook photo uploader, but there are times when I want to put pictures up on Facebook instead of Flickr. And I couldn't do it without exporting the photos from Aperture to disk and then using Facebook's own photo upload.
Google recently added the ability to create a birthday calendar. However, Google didn't provide any way to send reminders for the events on that calendar. This is the same thing Apple has done for years with its Address Book and iCal integration. And I hate missing someone's birthday because I didn't happen to look at my calendar that day.
I saw this as an opportunity for improvement. I also saw it as an opportunity to learn about Google App Engine. So I wrote a Python application that lets you schedule email reminders for the Contacts Birthday Calendar. You can also select the specific hour, in a specific time zone, at which to send the reminders. The application uses Google App Engine's user authentication and the Google Data AuthSub permission API to fetch upcoming events for display and for the email reminders.
This gave me some good experience with Google App Engine's scheduled tasks and Task Queues, as well as a chance to brush up on my Python. Without further ado, I present GCalendar Reminders. Feel free to use it to send yourself emails using the security of Google App Engine.
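For the curious, the fan-out shape of such an app looks roughly like this. It's a hedged sketch, not the actual GCalendar Reminders code, using App Engine's webapp2, mail, and taskqueue APIs, with ReminderPrefs standing in as a hypothetical datastore model:

```python
# Sketch: a cron-triggered handler enqueues one Task Queue task per
# user, and a worker handler sends each user's reminder email.
import webapp2
from google.appengine.api import mail, taskqueue
from google.appengine.ext import ndb

class ReminderPrefs(ndb.Model):
    """Hypothetical model: who gets reminder emails."""
    email = ndb.StringProperty()

class EnqueueReminders(webapp2.RequestHandler):
    """Hit on a schedule via cron.yaml; fans out one task per user."""
    def get(self):
        for prefs in ReminderPrefs.query():
            taskqueue.add(url='/tasks/send_one',
                          params={'email': prefs.email})

class SendOneReminder(webapp2.RequestHandler):
    """Task Queue worker: emails one user's upcoming birthdays."""
    def post(self):
        mail.send_mail(
            sender='reminders@your-app-id.appspotmail.com',
            to=self.request.get('email'),
            subject='Upcoming birthdays',
            body='(upcoming birthdays from the calendar would go here)')

app = webapp2.WSGIApplication([
    ('/tasks/enqueue', EnqueueReminders),
    ('/tasks/send_one', SendOneReminder),
])
```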
At work, I was recently tasked with looking into some NoSQL solutions for upcoming projects. For various reasons, I focused on the open source Redis project. Redis looked to be adding new features quickly and seemed to be a great potential solution.
I then started looking into PHP clients, as our current environment is mostly PHP. We require that the client support consistent hashing, and a quick search turned up a couple. PRedis seemed to offer the most potential and, after some quick tests, also the greatest performance. So I set up a more elaborate benchmark of the client and server package.
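For background, consistent hashing is what lets the client spread keys across servers so that adding or removing a server only remaps a small slice of the keys. Here's a minimal ring sketched in Python; it's illustrative, not PRedis's implementation:

```python
# Illustrative consistent-hash ring; not PRedis's implementation.
import bisect
import hashlib

class HashRing:
    """Maps keys to servers; removing a server only remaps its slice."""
    def __init__(self, nodes, replicas=100):
        self.ring = []  # (hash, node) pairs, one per virtual node
        for node in nodes:
            for i in range(replicas):
                self.ring.append((self._hash('%s:%d' % (node, i)), node))
        self.ring.sort()
        self.keys = [h for h, _ in self.ring]

    @staticmethod
    def _hash(value):
        return int(hashlib.md5(value.encode()).hexdigest(), 16)

    def get_node(self, key):
        # Walk clockwise to the first virtual node at or after the key.
        idx = bisect.bisect(self.keys, self._hash(key)) % len(self.keys)
        return self.ring[idx][1]

ring = HashRing(['redis1:6379', 'redis2:6379', 'redis3:6379'])
print(ring.get_node('user:42'))  # -> one of the three servers
```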
My test setup involved 5 servers, with between 2 and 5 enabled at a time on the clients (i.e., I disabled up to 3 of the servers in the client configurations). For performance, I configured the servers to never write to disk, though periodically syncing to disk should not cause too much of a performance loss; in fact, performance was most affected by forcing an fsync after every write. I then had 9 other client boxes running the same code base, with all 9 enabled for each test.
Each client started a master PHP process that forked 20, 30, or 40 child processes to simulate increasing load. Each forked PHP process then did 10,000 SETs on random keys with 4-byte payloads (early tests showed that payload size didn't drastically affect the results). I was using the PHP 4.2.6 branch of the PRedis client, which I had optimized a bit, based on profiling results, so that it did fewer counts of the consistent hash array. I then had the master PHP process on each box repeat ...
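The shape of the load generator is simple; here's a hedged Python sketch of the same fork-and-hammer pattern (the original was PHP with PRedis; this assumes redis-py and a server on localhost):

```python
# Sketch of the load generator: fork CHILDREN processes, each doing
# SETS_PER_CHILD SETs on random keys with a 4-byte payload, then
# report overall throughput.
import os
import random
import string
import time

import redis

CHILDREN = 20          # vary: 20, 30, 40 to simulate increasing load
SETS_PER_CHILD = 10000

def worker():
    r = redis.Redis(host='localhost', port=6379)
    payload = 'xxxx'   # 4 bytes; payload size barely moved the numbers
    for _ in range(SETS_PER_CHILD):
        key = ''.join(random.choices(string.ascii_lowercase, k=16))
        r.set(key, payload)

if __name__ == '__main__':
    start = time.time()
    pids = []
    for _ in range(CHILDREN):
        pid = os.fork()
        if pid == 0:       # child: run the benchmark, then exit
            worker()
            os._exit(0)
        pids.append(pid)
    for pid in pids:       # parent: wait for every child to finish
        os.waitpid(pid, 0)
    elapsed = time.time() - start
    total = CHILDREN * SETS_PER_CHILD
    print('%d SETs in %.1fs (%.0f ops/sec)' % (total, elapsed, total / elapsed))
```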
Well, it sure has been a while since my last post on here. So I thought I'd kick things off with a discussion of how I went about getting my email backed up.
First, a description of my situation. I run all my email through GMail. I enjoy the interface and the fact that it's a cloud service: I can access my email seamlessly on my phone, my home computer, my work computer, some other computer, etc. However, I don't want to lose all that information. Google is great, but who's to say that something terrible won't happen and some (or all) of my mail won't be lost? So I wanted to set up some sort of backup, and then, once I had it set up, to automate it.
At home, I run an Ubuntu box that I just upgraded to 9.04, Jaunty Jackalope. This machine primarily serves as a media box, hosting video that streams to my TiVo off its 1.5 TB RAID 5 array, and I also use it as a network-mounted Time Machine destination. Since I have extra storage on it, I figured I'd get something to sync my mail over IMAP periodically, and then I'd have a nice little backup.
After some searching, I came across two sites with instructions for using the utility mbsync (formerly isync). Following the instructions worked pretty well, though I had to customize the provided patch to get it to work with the version Ubuntu ships. I then thought I'd detail my steps here for others to see.
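The heart of the setup is a ~/.mbsyncrc along these lines. Take it as a hedged sketch: the account details are placeholders, and directive names vary between mbsync versions (newer releases use SSLType IMAPS and Far/Near in place of UseIMAPS and Master/Slave):

```
IMAPAccount gmail
Host imap.gmail.com
User you@gmail.com
Pass yourpassword
UseIMAPS yes

IMAPStore gmail-remote
Account gmail

MaildirStore gmail-local
Path ~/backup/gmail/
Inbox ~/backup/gmail/INBOX

Channel gmail
Master :gmail-remote:
Slave :gmail-local:
Patterns *
Create Slave
```

With that in place, a cron entry that runs mbsync gmail takes care of the periodic sync.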
I've been hunting for apartments in Santa Barbara recently, and one of my resources is the Santa Barbara News Press newspaper site. However, there isn't any way to tell which postings are new, and since there are so many listings, I couldn't remember them all. So I created a script that scrapes the Santa Barbara News Press apartment listings website and then creates a feed of all the listings. I've found it pretty handy for following all the new listings, and maybe someone else will too.
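The script boils down to scrape-then-serialize. Here's a hedged Python sketch of the shape; the listings URL and the selector are hypothetical placeholders, since the real page structure is specific to the site:

```python
# Sketch: fetch the listings page, pull out each listing's text, and
# emit a minimal RSS feed. Assumes BeautifulSoup (bs4) is installed;
# the URL and the '.listing' selector are placeholders.
import urllib.request
from xml.sax.saxutils import escape
from bs4 import BeautifulSoup

URL = 'https://example.com/classifieds/apartments'  # placeholder

html = urllib.request.urlopen(URL).read()
soup = BeautifulSoup(html, 'html.parser')

items = []
for row in soup.select('.listing'):  # hypothetical selector
    title = escape(row.get_text(strip=True))
    items.append('<item><title>%s</title></item>' % title)

print('<?xml version="1.0"?><rss version="2.0"><channel>'
      '<title>Apartment Listings</title>%s</channel></rss>' % ''.join(items))
```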
Update: I removed the feed, since it hasn't been updated in a very long time.