My default yearly conference, for many years, has been UberConf. I really love UberConf because it's packed with great sessions, and it's conveniently local. However, because I also attend various local user groups, I find that if I go two years in a row there are too many sessions I've seen before, and I wind up disappointed. So for the past few years, I've been alternating between UberConf and something new. Two years ago, it was OSCON, and this year it was QCon New York.
I chose QCon for a few reasons. One, the sessions seemed very focused on architecture and higher-level concepts, with very few language/technology talks. This was right up my alley because, while there are some languages and tools I'd like to go deeper on, I think a more significant area for improvement for me is architecture and scalability. We get tons of traffic at my job - more than any other place I've ever worked - so I've had to learn a lot about scalability, and the nature of the work has forced me to see broad system design differently.
I went to QCon specifically wanting to improve some areas where I was weak, namely containerization, microservices, and reactive programming. I hear a lot of buzz about these things, and they pop up on recent ThoughtWorks Technology Radars, and QCon seemed to have a lot of sessions to offer in these areas. It took a LOT of convincing to get my employer to agree to send me to a conference location as expensive as New York, but eventually they approved it. Here I will detail some of my thoughts about the experience, in case it may be of use to others considering QCon.
This blog was never intended to be popular by any stretch of the imagination. Largely I started it simply to have a place to gather solutions to technical problems I've encountered, so that I could easily look those solutions up if I needed them again. The blog has always run on my own shared hosting server, on a self-installed copy of WordPress.
To my great surprise, a few of my posts have found their way to the front page of reddit. My post about Star Wars has been mentioned on King of the Nerds and The Big Bang Theory, and even landed me an interview on NPR.
Needless to say, the traffic to my blog has been both extremely unexpected and unpredictable. The Star Wars post had been online for months with virtually no traffic before Wired suddenly linked to it, instantly flattening my web server. I've fought and fought with various configurations for WordPress, used as much caching as possible, and even had my web host temporarily upgrade my service, all trying to keep a web site that makes no money online even when traffic increases by a factor of 100 overnight. When my site goes down, it's embarrassing, because even though it's just a personal blog on a shared host, it gives the impression that I, as a software developer, don't know how to make a web site scale.
Switching to Jekyll
So after the most recent pummeling I took due to a Hacker News link, I decided it was time to bite the bullet and convert the entire site to Jekyll. I've messed around with the technology before to build another, smaller, blog, so I was somewhat familiar with the constructs and idioms. A lot of work and ten custom plugins later, the entire site was converted, with very little loss of functionality.
Years ago, I wrote about a particular type of interview question that I despise. Today I'd like to discuss a much more specific question, rather than a type. I've never been asked this question myself, but I have seen it asked in an actual interview, and I officially nominate it as the worst question I've ever heard in an interview.
A co-worker at a previous company used to ask this question, and it was the first time I'd ever heard it in an interview setting. This company did pair interviews, two engineers with one candidate. One day he and I were the two engineers interviewing some poor candidate. The candidate had actually done pretty well as far as I was concerned, and then my co-worker busted this question out. The candidate stumbled over the answer, visibly frustrated with himself. In the post-interview pow-wow, all of the engineers who'd interviewed him gave him the thumbs up, except my interview partner, who refused to hire him on the grounds that he completely flubbed this question, and "any engineer worth his salt should be able to answer it." He actually said that if we hired this individual, he would be unwilling to work on a team with the candidate. For what it's worth, the story has a happy ending, in that we hired the candidate in spite of his protests, fired the co-worker within a few months, and the candidate is still at that company, doing quite well.
Anyway, I think this question perfectly represents everything that can go wrong with an interview question, so I'd like to discuss it here to explain why it's almost hilariously awful as an interview question:
Write a function that can detect a cycle in a linked list.
Seems like your basic algorithm coding question at first, right? Hop up and write the function on the whiteboard; totally reasonable, right? Except it's not, it's brain-meltingly terrible. Let's break it down.
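For reference, the answer a question like this is usually fishing for is Floyd's cycle-detection algorithm (the "tortoise and hare"): walk the list with two pointers at different speeds, and if they ever meet, there's a cycle. Here's a minimal sketch; the `Node` class is a hypothetical stand-in for whatever list type the interviewer has in mind:

```python
class Node:
    """Minimal singly linked list node, for illustration only."""
    def __init__(self, value):
        self.value = value
        self.next = None


def has_cycle(head):
    """Floyd's 'tortoise and hare' cycle detection.

    Advance one pointer a single step at a time and another two
    steps at a time; the fast pointer can only meet the slow one
    again if the list loops back on itself.
    """
    slow = fast = head
    while fast is not None and fast.next is not None:
        slow = slow.next
        fast = fast.next.next
        if slow is fast:
            return True
    return False
```

This runs in O(n) time with O(1) extra space; the other common answer, recording visited nodes in a set, is simpler but costs O(n) memory. Of course, knowing this trick off the top of your head says more about interview prep than engineering skill, which is part of the problem.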
When I graduated with a Computer Science degree ten years ago, I was excited to dive into the world of professional programming. I had done well in school, and I thought I was completely ready to be employed doing my dream job: writing code. What I discovered in my very first interview, however, was that I was massively underprepared to be an actual professional programmer. I knew all about data structures and algorithms, but nothing about how actual professional, "enterprise" software was written. I was lucky to find a job at a place willing to take a chance on me, and proceeded to learn as much as I could as quickly as I could to make up for my deficiencies. This involved reading a LOT of books.
Here I reflect on my 10-year experience programming professionally and all of the books I've read in that time, and offer up the ten that had the most profound impact on my career. Note that these are not the "10 best" programming books. I do feel all of these books are very good, but that's not the only reason I'm selecting them here; I'm mentioning them because I felt that I was a profoundly different person after reading each than I was beforehand. Each of these books forced me to think differently about my profession, and I believe they helped mold me into the programmer I am today.
None of these books are language books. I may feel that learning to program in, say, Scala had a profound impact on how I work professionally, but the enlightening thing was Scala itself, not the book I used to help me learn it. Similarly, I'd say that learning to use Git had a significant impact on how I view version control, but it was Git that had the impact on me, not the book I used to teach myself the tool. The books on this list are here for the content they dumped into my brain, not just a particular technology they taught me, even if that technology had a profound impact on me.
So, without further ado...
TL;DR: After a decade of service, CenturyLink decided I wasn't worth keeping as a customer, so I switched to Comcast Business Internet. Even if CenturyLink tells you that you have no data caps, you do.
When I first moved to Colorado nearly a decade ago, I signed up for Comcast's Residential high-speed internet service, and I hated it. I had to reset my cable modem once every week because it would stop working properly, and my internet noticeably slowed down when people would get home from school/work in my apartment complex.
I did some research and determined that, for the kinds of internet speeds being offered at the time (around 1-5Mbps down), Qwest's DSL had similar prices to Comcast. Additionally, because it was DSL, I could use my own Netgear router that had a DSL modem built in, and I wouldn't have to mess with company equipment. Also, DSL's speed varies based on your distance to a hub, not based on how many people are currently using it. I thought I could avoid both of my major problems by switching to DSL for about the same price, so I did, paying for the maximum possible speed at the time, 5Mbps.
For many years, Qwest was the only monthly bill I never had any trouble with. Sometimes my cable would go out or look pixelated, sometimes my heating bill would be surprisingly massive, and I was always having annoying issues with my cell phone service, but month after month, I never even noticed I had Qwest. It was just there: it worked, it never went down, it never slowed down. It was great.
I've finally written my first real Android app. I dabbled a bit with Android development in the very early days, writing an app that interacted with the web services of the company I was working for at the time. This app was ugly, written for Froyo, and frankly barely worked at all. It was a 20% time project at my then-employer, but I never went back and worked on it after the initial effort, and never even bothered getting it packaged into the Android Market, largely out of embarrassment.
I've been wanting to get into Android development more seriously ever since that effort, because I liked the idea overall. As a big Android user myself, I felt it was essential that I be able to develop applications for my own device; being unable to do so felt a lot like using Linux without the ability to write shell scripts.
I've taken two all-day training sessions on Android before as part of larger development conferences, and while I was able to suss out some basics about the lifecycle and other Android fundamentals from them, neither left me with any sense of real understanding of how I could develop something for Android that people might actually use. But at OSCON 2013, I took an excellent half-day Android class taught by Marko Gargenta. Among other things, he showed me, for the first time, how to write a modern-looking Holo app, how to use asynchronous background tasks, how to transition between multiple activities, and how to handle fragmentation issues. These were never touched on in my all-day sessions, and they're all large barriers to writing real applications.
Once I left the class I had a sense that I actually now knew enough that, with some help from web searches, I could actually write an Android app. I just needed a good idea, so I tried to take notice of various itches in my life to see if I could scratch any with a phone app.
This year, I went to O'Reilly's Open Source Convention, OSCON. Every year for the last four years, I've gone to a big tech conference. For the last three, I went to NoFluffJustStuff, which was later renamed UberConf. UberConf is held in my home state, and I can drive to it from my house every day, so there's no plane or hotel involved, which makes it inexpensive enough that I've been able to get my employers to pay for it. However, having attended UberConf three years in a row, last year I'd already seen about half the sessions, either in previous years or at local Java User Group meetings, so I decided that this year I'd try something different.
OSCON was a radical departure for me. UberConf is a "Java/Agility" conference, and since I work almost exclusively with the JVM in an Agile environment, it's more or less custom-tailored to my interests. OSCON, however, had a huge variety of different tracks and a similarly varied group of attendees. There were Python folks, Ruby folks, hardware hackers, system admins, operations gurus, cloud nerds, data geeks, Perl wonks, and more. I picked OSCON because, while the variety was less tailored to my interests, the sheer number of tracks (18 concurrent sessions per time slot!) made up for it.
Here is my review of the OSCON experience. OSCON was the first conference I've been to outside of my home state, and really the first one not run by the Rocky Mountain Software Symposium. As such, I will unavoidably be comparing it largely to my UberConf experience, since that's my only real frame of reference. I will try to address each element of the conference separately.
First things first: how were the sessions? I don't go to tech conferences to network or hand out business cards, though I hear that's half the point. I treat conferences like an intense week of school: I take notes and try to learn as much as I can in the sessions. For me, a tech conference's quality is 95% the quality of its sessions, so they're the most important thing by far.
I don't post book reviews here very often. Typically I write up a few paragraphs about a book when I finish it and post it to my Goodreads account, which I consider enough of a review for nearly every book I read.
But "Presentation Patterns: Techniques for Crafting Better Presentations" by Neal Ford, Matthew McCullough, and Nathaniel Schutta is a bit more than a book. I'm not joking when I say this book has actually changed my life. As such, I felt it was necessary to devote an entire post to it to draw extra attention to it.
In the interest of full disclosure, I should admit that I know the authors personally, sort of. I've had a conversation or two with Neal and Nathaniel at various developer conferences, though I seriously doubt either of them would remember me or even think my face looks familiar. McCullough I've interacted with quite a bit more, and I once even managed to get him to tell me he wanted to punch me in the face. If you've ever met Matthew, you'd know this is pretty much like getting Gandhi to call you a stupid asshole. The point is, I'm not affiliated with the authors and am not getting anything out of promoting the book. I just found it extremely valuable and wanted to share it.
In any case, Presentation Patterns is excellent. The book is full of tons of real-world, usable tips, ranging from how to speak clearly to how to organize your thoughts to the actual mechanics of doing specific things in Keynote and Powerpoint. It's very detailed in this way, rarely leaving the reader wondering how to do a thing the book describes. Reading this book after seeing many presentations by speakers like Ford, McCullough, and Schutta was an eye-opening experience, something akin to seeing how the sausage is made.
Another semester is over, and like the last one, it was quite the doozy. This time, however, it wasn't the workload I hated (as it was last semester) but the material. Or, more specifically, half of it, but more on that later.
This semester, I finally took the cross-department course from the Business School that I was dreading. My school's program is a joint degree, Computer Science and Information Systems, with the CS stuff coming from the Engineering school and the IS stuff coming from the Business school. The degree has a number of requirements, one of which is that you must take at least one PhD-level class in each school. This means that I am able to take nothing but pure-CS courses, except I have to take one PhD-level course in the Business school. I'd been dreading this since I started, somehow hoping the requirement would be removed before I had to take it, but this was not the case, and I had to take it this semester. With the credits I am able to transfer from my Master's degree, this actually completes all of my course requirements, but because I was interested I also took a course on Graph Theory, a class cross-listed at both the undergrad and master's level. And once again, I somehow managed to get stuck grading.
I started the semester taking three courses on Coursera as well, but quickly decided that doing so contributed to my sense of being overwhelmed last semester, so I dropped them, planning instead to download all of the lectures when the classes ended, and hence won't be mentioning them further.
To catch people up on some blogosphere drama:
Last month, Heather Arthur posted on her blog about an unfortunate incident in which some people on Twitter had found some code she wrote on GitHub, and people started trashing her on Twitter for it. Some of those people are considered leaders in the Software Craftsmanship movement, in particular Corey Haines. Corey immediately apologized for acting like an asshole, and I think his apology was sincere because I've met Corey and frankly the guy is almost annoyingly nice (he went around RubyConf 2012 taking a picture with every damn person there). But Ted Neward saw this turn of events and concluded that Corey's actions were not orthogonal to his involvement in Software Craftsmanship, but actually influenced by it, and he posted as such on his blog.
Neward's basic point is that this is the exact kind of behavior we should expect as a side effect of the Software Craftsmanship movement. By its nature, it attempts to create a segregation between those who are "in the know" and those who are not, and he felt the behavior of Corey and others was a byproduct of this segregation. They saw Heather as an "other" and were overly harsh in their criticisms, emboldened in so doing by their sense of Craftsmanship. Neward took a lot of shit for this view on Twitter, with most people arguing that this was done by Software Craftsmen, but it wasn't done because they were Software Craftsmen, and it makes no sense to criticize the entire movement based on the actions of a few members, regardless of how high their profile is. Neward responded to all of this feedback in a second blog post, where he reinforces his original point.
Uncle Bob Martin, one of the first five signers of the Software Craftsmanship Manifesto and author of Clean Code and The Clean Coder, then responded to all of this drama with his own post. Bob takes particular issue with Neward's promotion of what he sees as the opposite of a Software Craftsman, a Software Laborer. In Neward's words, a Laborer is someone "who cranked out one crappy app after another in (what else?) Visual Basic [...] their apps were sloppy, bloated, and ugly...cut-and-paste cobbled-together duct-tape wonders." At the end of the post, Neward bows "with respect to the 'software laborers' of the world, who churn out quality code without concern for 'craftsmanship', because their lives are more than just their code."