Posted on August 23rd, 2013
[I wrote this post several months ago, shortly after the divorce was finalized. I have held off publishing it until all the final details were completed. Now they are.]
“How long were you married?” was always the first question. “Nine years,” I would reply. It was only a slight exaggeration: our anniversary was only a few months away, and the mandatory waiting period meant we wouldn’t be divorced for six months, at least. “That’s a long time. You must have gotten married young.”
It didn’t feel like it at the time. We were both out of college – she had a master’s degree already! – when I proposed. We had career paths and car payments, student loans and credit cards, rent checks and insurance premiums. We felt like adults, though still struggling to learn who we really were and where we really fit and what was really important. Several of our other friends had already said their vows a year or more before, and we loved each other. She agreed, and the ceremony was a year later. We were twenty-three.
Looking back now, the years seem to have rushed by. Our career paths detoured, and we sold our cars. A new city meant new friends gained, and old friends grown distant. The credit cards and loan payments were still there, but we exchanged the rent check for a mortgage payment, and put down roots. There were no children – a fact for which I am ever so grateful now – and as we discovered more of who we were, our love was changed and redefined, but was still a constant to me.
When I uncovered her affair, my world view shattered. I left her. I told our mutual friends what was happening, moved out of the house, and spent the next four-and-a-half months hurting. I posted to a secret, anonymous blog, dumping to it anything that came into my head: I wrote angry epithets, aimed at her. I wrote of the unexpected sadness I felt at the family we would never now have. I transcribed the dreams from which I awoke crying at three o’clock in the morning.
And then, without warning, I got over her. I moved on.
“That’s too fast,” people would say. “Ten years is a long time. You can’t be over her that quickly. Give yourself time to heal.” Know thyself, said somebody or other. And I guess I do. I had set myself up to succeed the moment I decided the marriage was over. With the help of my friends and family, I had made it through to the other side. For the first time in a decade, I was ready to live only for myself. I could do anything I wanted, and I did. I lost forty-five pounds. I went on trips and to concerts. I ate goat brains and learned to love seafood. I started dating.
I threw a big party this past New Year’s Eve. As I mingled with friends new and old, I found myself telling them that 2012 had been a really good year for me. Sure, a terrible thing had happened, but I came through it happier and healthier and more fulfilled than ever before. Loss and pain have acted like a lens, focusing me on what I want and what is important to me. My future is clearer to me now than ever before.
Posted on December 11th, 2012
Everything old is new again, and that includes Chromatic Coffee in Santa Clara, California. Formerly Barefoot Coffee, the shop surprised longtime visitors with its sudden change in name and signage. But concerns that the hangout would lose its charm or passion for great joe were quickly allayed: the staff tells me that the owner simply decided to open his own roastery to go with the shop – which only means even more interesting micro-roasts and single-origin coffees to experience! And the warm interior, friendly and professional staff, funky playlist, and ever-changing gallery of local art all remain, as welcoming as ever.
Chromatic Coffee runs two primary coffee stations. The first is a three-head La Marzocco rotary espresso machine, providing shots for the usual selection of lattes, macchiatos, cappuccinos, and doppios. Two time-modded grinders provide the day’s pair of espresso selections. The baristas are well-trained and know their product. When I arrived shortly after opening one morning, the barista Christine was still dialing in the machine to provide the optimum extraction for that day’s beans: grind-pull-sip-toss-adjust, grind-pull-sip-toss-adjust, grind-pull-sip-toss-adjust. Her extra few minutes of effort paid off in the end, though, with an excellent cafe latte: The Emperor blend she used held an excellent sweet chocolate, but with enough brightness to kick over the perfectly micro-foamed milk. And all topped with a lovely bit of art that it seems we’ve all come to take for granted. The Guatemala Eucaliptos in the second grinder provided me with a doppio that landed on my tongue with a mild start, but quickly evolved into a swath of fruit and earth (the packaging describes it as “cherry cola”).
The second of Chromatic’s stations is a four-funnel manual pour-over stand, used to highlight the company’s selection of single-origin beans and artisan roasting. Each order is individually ground, filtered, and poured by hand into a single cup – a meticulous process, but one that brings out the flavor characteristics of each coffee. If you’re looking to experience the aroma and flavor of coffee from a particular region, this is how you want to do it. At Kyle’s recommendation, I sampled the El Cerro from El Salvador, and was thoroughly surprised by the different ends of the flavor spectrum it yielded: a tart fruitiness on top with a deep underlying base of chocolate.
Chromatic Coffee’s ample seating and medium-to-large tables make it a great place to meet friends for a chat, or gather with larger groups. Coupled with the free wifi, they also make Chromatic an excellent spot for a laptop-bug to work while staying caffeinated. But please, no mooching! And a typical assortment of pastries is delivered fresh each morning, if you need more than just caffeine to function.
Chromatic Coffee is located at 5237 Stevens Creek Blvd. in Santa Clara, California. Follow @CHROMATICCOFFEE on Twitter, @chromaticcoffee on Instagram, and find them on Facebook. For more photos, be sure to check out my Chromatic Coffee set on Flickr.
Laptop Friendly: Yes
Posted on December 3rd, 2012
During the question and answer section of the panel I recently spoke on at DCWeek 2012, one questioner asked the panel to describe an API that had “disappointed” us at some point. I replied: Twitter. Though he was angling for technical reasons – poor design, bad documentation, or insufficient abstraction – I had different reasons in mind.
Twitter’s Successful API
Twitter’s primary API is without a doubt a hallmark of good design. It is singularly focused on one particular task: broadcasting small messages from one account to all following accounts as close to real-time as possible. That extreme focus led to simplicity, and that simplicity meant it is easy for developers to code and test their applications. Interactions with the API’s abstraction are straightforward and often self-documenting. When coupled with Twitter’s excellent documentation and sense of community, the early years meant that developers were free to explore and experiment, leading to a plethora of interesting – and sometimes terrible – Twitter clients (including my own Java IRC bot JackBot).
Coincidentally, the explosion of smart phones, social networking, and always-on Internet connectivity meant Twitter’s raison d’être was also a means to explosive growth. The Fail Whale was an all-too-familiar sight during those early growing pains, but the same focus and simplicity that made it an easy API for developers to use also made it possible for Twitter to dramatically improve the implementation. Today, Twitter serves over 300 million messages daily – up several orders of magnitude from when I joined – yet our favorite marine mammal surfaces rarely.
Twitter’s early business model is a familiar story. A cool idea formed the basis of a company, funded by venture-capital and outside investment. There was little thought given to how to turn a profit. Seeing themselves in competition with the already-huge Facebook, growing the user-base was the only real concern. For many years, Twitter continued to foster its community: In a symbiotic relationship with developers and users – who were often the same – Twitter expanded and modified the API, improved the implementation, and actively encouraged new developers to explore new and different ways of interacting with the Twitter systems. So important was this relationship that the term “tweet”, the concept of the re-tweet, and even Twitter’s trademarked blue bird logo all originated with third parties.
But the good times can’t roll forever; eventually the investors want a return, and the company began seeking a method to make money. Since Twitter saw itself as a social network, advertising was the obvious choice. But there was a problem: the company’s own policy and openness had made advertising difficult to implement. Here’s what I wrote in December 2009:
More than 70% of users on Twitter post from third-party applications that aren’t controlled by Twitter. Some of those applications are other services – sites like TwitterFeed that syndicate information pulled from other places on the web (this blog, included). Others are robots like JackBot, my Java IRC bot which tweets the topics of conversation for a channel I frequent.
Advertisers purchase users’ attention, and if you can’t guarantee that access, you can’t sell ads. But what third-party client is going to show ads on behalf of Twitter? Users – particularly the developers creating those third-party apps – don’t want to see ads if they can avoid it. You won’t make much money selling ads to only 30% of your users (who are also likely the least savvy 30%). What’s a little blue birdie to do?
The chosen path was to limit – and perhaps eliminate entirely – third-party clients. The recent 100,000 limit on client tokens is an obvious technological step, and they are already completely cutting off access for some developers. Additionally, where technological restrictions are difficult, changes to the terms of service have placed legal restrictions on how clients may interact with the API, display tweets, and even in how they may interact with other similar services. (Twitter clients are not allowed to “intermingle” messages from Twitter’s API with messages from other services.) It seems likely that the screws will continue to tighten.
A Way Forward: Get On The Bus
Twitter has built the first ubiquitous, Internet-wide, scalable pub-sub messaging bus. Today that bus primarily carries human-language messages from person to person, but there are no technical limitations preventing its broader use. The system could be enhanced and expanded to provide additional features – security, reliability, bursty-ness, quantity of messages, quantity of followers, to name just a few – and then Twitter could charge companies for access to those features. Industrial control and just-in-time manufacturing, stock quotes and financial data, and broadcast and media conglomerates would all benefit from a general-purpose, simple message exchange API.
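To make the idea concrete, here’s a toy sketch of the publish-subscribe pattern in Python. This is entirely hypothetical – Twitter exposes no such general-purpose API, and all the names here are mine – but it shows the shape of the thing: publishers and subscribers never know about each other, only about topics.

```python
from collections import defaultdict

class MessageBus:
    """A toy in-memory publish-subscribe bus (hypothetical; purely
    illustrative of the pattern, not any real Twitter API)."""

    def __init__(self):
        # Map each topic to the list of subscriber callbacks following it.
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        """Register a callback to receive every message on a topic."""
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        """Fan the message out to every follower of this topic."""
        for callback in self._subscribers[topic]:
            callback(message)

# Usage: a machine-to-machine "stock quotes" channel, exactly the kind
# of traffic a generalized bus could carry alongside human tweets.
bus = MessageBus()
received = []
bus.subscribe("quotes/AAPL", received.append)
bus.publish("quotes/AAPL", {"price": 512.30})
```

The real engineering challenge, of course, is doing this across the Internet at scale rather than inside one process – which is precisely the part Twitter has already built.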
Such a generalized service would be far more useful to the world at large than just another mechanism for shoving ads in my face, and I would bet that the potential profits from becoming the de facto worldwide messaging bus would dwarf even the wildest projections for ad revenues. It wouldn’t be easy: highly available, super-scalable systems are fraught with difficulty – just ask Amazon – but Twitter is closer to it than anyone else, and their lead and mindshare would give them a huge network-effect advantage in the marketplace.
With this new model replacing the advertising money, third-party clients would no longer be an existential threat. Twitter could remove the pillow from the face of their ecosystem and breathe new life back into their slowly-suffocating community.
Will they take this path? I doubt it. The company’s actions in the past several months clearly telegraph their intentions. Twitter’s API teaches us an important lesson: no matter how well designed, documented, and supported a platform is, there will always be people behind it making business decisions. Those decisions can affect the usability of the API just as deeply as bad design, and often much more suddenly. Caveat programmer!
Posted on November 29th, 2012
Little pockets of downtime pepper our lives: waiting for the bus, waiting at a crosswalk, waiting for that one person in the group to come back from the bathroom again. We make smalltalk, we look at our watches, we check our phones. These moments flit away like a mote of dust passing through a sunbeam, a few seconds at a time.
Win, Lose, Banana is a game for these moments. A typical game lasts less than ten seconds, turning that awkward silence where everyone would normally be feigning sudden extreme interest in the patterns on the tin ceiling into an awkward argument over which player has the banana.
Win, Lose, Banana is a so-called convincing game for three players. It consists of merely three cards: Win, Lose, and Banana. Each player randomly chooses a card, and the player with the Win card simply shows it and announces victory. Congratulations! The winner is then entitled to the banana.
Ah, but who has it? The two remaining players must then convince the winner that they have the banana – and that the other player is the loser. If the winner chooses correctly, both she and the banana may smugly gloat over the loser. But if the loser manages to be more convincing, he gets to mercilessly mock the other two players. The only real rule is that you may not simply show the winner your card; everything else is fair game. The lengths to which players can go to be convincing are otherwise bounded only by decorum and your imagination. And perhaps the length of time it takes that one friend to pee.
It’s hard to describe how much fun I’ve had with this game. Passing moments on the street with friends would suddenly turn rowdy as we argued over possession of the banana. And at only $1, it’s a no-brainer that you ought to buy it. In fact, buy a dozen and give them away to friends. This may be the single best value in gaming – ever – and quite possibly the best $1 you’ve ever spent.
Posted on November 10th, 2012
I had the opportunity to speak on a panel at DCWeek 2012 this past week: “Five Crucial APIs to Know About”. (I am not listed on the speakers page, as I was a rather last-minute addition.) Conversation ranged from what goes into making a good API – dogfooding, documentation, focus – to pitfalls to be aware of when building your business on an external API. It was a fun and informative discussion, and I walked away with plenty to chew on.
An API is all about two things: Abstraction and Interaction. It takes something messy, abstracts away some of the details, and then you, as a programmer, interact with that abstraction. That interaction causes the underlying code to do something (and hopefully makes your life easier). If you interact with it differently, you’ll get different results. Understanding an API, then, requires understanding both the abstraction as well as how you are meant to interact with it.
Now, DCWeek focuses primarily on the startup scene. As such, I expected that most of my fellow panelists would be focusing on web-exposed APIs. Sure enough, there was plenty of talk on Facebook, Twilio, Twitter, and a laundry list of other HTTP-accessible APIs. All of which are great! Note, though, that these APIs share one thing in common: They are all network-reliant APIs. As such, they are built on a whole bunch of other APIs, but at the end of the day, they all route through one specific API (or a clone): Berkeley Sockets.
Why should you care about a 30-year-old API when you care about tweets and friends and phone calls? Stop for a moment and think about what those high-level APIs are built on: a network. Worse – the Internet. A series of tubes. Leaky, lossy, variable-bandwidth tubes. And it’s only getting worse – sometimes you’re on a high-bandwidth wifi connection; other times you’re on a crappy, intermittent cellular connection in a subway tunnel.
The user’s experience with a high-level network API is going to be directly impacted by socket options chosen several layers down – often just by default – but different experiences require different expectations from the network. Do you have a low-latency API that provides immediate user-interactive feedback in super-short bursts? Then you might want to learn about Nagle’s Algorithm and TCP_NODELAY. Does your app require a user to sit and stare at a throbber while you make a network call? You might want to consider adjusting your connection, send, and receive timeouts to provide more prompt feedback when the network fails.
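As a concrete (if simplified) example, both of those knobs are exposed through the standard Python socket module. The values here are arbitrary – the right timeout depends entirely on your application – but the mechanics look like this:

```python
import socket

# Create a TCP socket and disable Nagle's algorithm, so small writes go
# out immediately instead of being coalesced into larger packets --
# useful for low-latency, interactive traffic.
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)

# Set a timeout so connect/send/recv fail fast with an error instead of
# leaving the user staring at a throbber when the network is down.
# Five seconds is just an illustrative value.
sock.settimeout(5.0)

# Read the option back to confirm it took effect (nonzero means enabled).
nodelay = sock.getsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY)
```

Other languages expose the same underlying Berkeley Sockets options under slightly different names; the point is that they are there to be tuned, not just inherited.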
And believe me: the network will fail. But how do you handle it? As programmers, we tend to focus on the so-called “happy path”, relegating failure handling to second-class status. Unfortunately, treating failure as unlikely is simply not acceptable in a world of ubiquitous networking and web services. Not all network failures are the same, and providing the best user experience requires understanding the difference between the various types of failures in the specific context of what you are attempting to accomplish.
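A sketch of what distinguishing those failures might look like in Python – the categories here are illustrative, not a prescription, but each one plausibly deserves a different response in your UI:

```python
import socket

def classify_failure(host, port, timeout=3.0):
    """Attempt a TCP connection and report *which kind* of failure
    occurred, since each deserves a different user experience.
    (A sketch; real apps would act on these, not return strings.)"""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return "ok"           # connected: proceed with the request
    except socket.timeout:
        return "timeout"          # slow or lossy network: offer a retry
    except ConnectionRefusedError:
        return "refused"          # host is up, service is down
    except socket.gaierror:
        return "dns"              # name lookup failed: likely offline
    except OSError:
        return "other"            # unreachable network, reset, etc.
```

Note the ordering: the specific exceptions must be caught before the catch-all `OSError`, since they are all subclasses of it.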
So take a moment and do some research. If you’re using a networked API that exposes network details, learn about them and tweak them for the specific task at hand. If you’re writing an API, consider how users will be accessing it, and provide them guidance with how to achieve the best possible experience over the network. The people using your apps will thank you.
I’d like to thank my fellow panelists – Greg Cypes, Sasha Laundy, and Evgeny Popov – for such an interesting discussion, as well as to thank our moderator Matt Hunckler for keeping us on track. I hope we can do it again in the future.
Posted on March 20th, 2012
I don’t have anything to say publicly about Mike Daisey’s lies. What I will say publicly is that Heather and I saw The Agony and The Ecstasy of Steve Jobs at Woolly Mammoth Theater on April 14, 2011. It was a powerful piece, and – unusually for me – I saved the program. After reading this essay by the former marketing director of the Woolly Mammoth Theatre Company, I have scanned page three and page four and placed them here. I have slightly edited the image of page three in an obvious manner. You may click on the images to view full-size versions.
Posted on February 25th, 2012
I have always had a thing for books. I started reading when I was a kid, and I never stopped. I oscillate regularly between fiction and nonfiction, binging for a while on science fiction and epic fantasy before devouring title after title on politics, economics, science, and philosophy. Packing for every trip included at least one hardcover – preferred over paperbacks for their sturdiness and aesthetics on my bookshelf – and sometimes two or three.
I purchased an iPad in May 2010.
Since then, every single book I have read has been on the iPad. This wasn’t because I had some idealistic desire to switch to eBooks. It just sort of happened; in retrospect, I think it was because they were both cheaper and easier to get. I have bought and read nearly two dozen books on my iPad in the past year-and-three-quarters, but I planned on purchasing the hardcover for books that I really wanted on my shelf, that I really thought were special – the books I really wanted to read, the way you only can with a hardcover.
Ironically, the first book to pass that bar was Walter Isaacson’s biography Steve Jobs.
I had pre-ordered it, and also received a copy as a gift, so I had two copies sitting on my shelf while I churned through the reading list on my iPad. Finally, a couple of weeks back, I located the large, white cover, pulled it off the shelf, and dove in.
It immediately pissed me off.
Not the content of the book, what Walter Isaacson wrote, but the book itself – the actual physical thing. I had had no problems reading from wood pulp and ink for three decades before the iPad, yet suddenly I found myself constantly annoyed by its characteristics. I was shocked at all the annoying things about a paper book that I had never noticed before.
The paper book is bigger than my iPad, and it’s heavier, too. The book takes up four or five times the room in my backpack. Before, I had sometimes carried around two or three of these things at once! How the hell did I ever do that?
It’s more cumbersome and awkward, too – a lot more. My iPad is evenly balanced and always the same shape, making it a simple matter to re-orient it, hand it off to someone, or just lean it against something for support. The physical construction of the book – there’s a hinge in the middle! – makes it lopsided most of the time, which makes it difficult to re-orient. And don’t you dare try laying it flat or propping it up without something holding down the pages – you’ll lose your place in no time as it unhelpfully flips pages around.
What is a book for, if not reading? And yet, though I had never noticed it before, a book on paper is actively hostile to reading. The pages curve inwards towards the binding, distorting and hiding the words. It isn’t a constant thing, either, because the curvature and distortion becomes worse as you get closer to the binding. I regularly find myself physically reorienting the book – made more difficult by its aforementioned cumbersome nature – just to read the words on the page! In contrast, the words on my iPad are always flat and never distorted.
The final nail in the coffin is the paper book’s paltry feature set. There is no built-in dictionary, so I can’t just tap a word and check its definition, nor is it possible to search for all occurrences of a particular word or phrase. As a little bonus, if I have a few moments to spare, I can pull out my iPhone and pick up where I left off; my current spot is synchronized automatically. Though they are not essential, I have become used to such niceties.
Which is not to say that reading on the iPad is perfect. Glare can be an issue, especially if I am outside. A paper book isn’t going to run out of battery any time soon, and it isn’t prone to breaking if I drop it from a height. Nor is it a tempting target for thieves. And as a long-term archival mechanism, or just a pretty thing to show off on a shelf, the paper book wins hands-down.
But for reading, the actual process of reading: I’m a convert. Give me an eBook.
Posted on January 18th, 2012
So if you’ve been to Wikipedia at all today, you have no doubt noticed that instead of your desired web page, you’re instead being shown a big, black page directing you to take action to prevent Congress from passing a really stupid piece of legislation called SOPA. It’s a really bad law which will infringe free speech and basically break the Internet all at the behest of some already-stupidly-wealthy special interests. I definitely encourage you to take action against SOPA if you haven’t already.
Some of us have work to do, though, and besides being a great resource on Justin Bieber’s hair, Wikipedia also has a plethora of important and useful technical information. Fortunately for us, Wikipedia chose about the most brain-dead way possible to implement their blackout: a script. So, if you would like to bypass their blackout, simply block the following URL using your favorite ad blocker in your browser:
Posted on January 15th, 2012
I graduated from college in 2000, with some student loans courtesy of the Department of Education. I consolidated them in 2004, and have been paying diligently through direct debit for the past eight years. I was fortunate and didn’t have much student debt, and the payment amount was low enough that I rarely thought about it. Month after month, year after year, my payment would be withdrawn from my checking account, marching steadily towards a payoff date sometime in 2013.
Several weeks back, I started receiving letters about my loan. First was a notice from the Department of Education that they had given my loan to a company called Mohela for servicing. Then I received similar letters from Mohela itself. (Stupidly, Mohela has also started sending me monthly paper statements.) But whatever – it’s fine. I don’t really care who collects the money as long as my loan is being paid off and the terms haven’t changed.
Today I opened another letter from Mohela, dated January 10, 2012. It explains inside that my loan terms have changed. The interest rate is the same, but my monthly payment has decreased 74%, and my loan term has been extended until 2019! The only explanation for this change is a cryptic statement at the very end of the letter: “** Your terms have been re-calculated to maintain federal/program guidelines **”.
So let me get this straight, Mohela: You want me to pay the same interest rate on a principal balance that – under the new terms – is now decreasing at a significantly slower rate than before for an additional six years?! I don’t think so. Your little “re-calculation” works out to a significant increase in the total amount of interest I’m paying on that loan, without my consent. Sorry – I didn’t agree to that. You’re trying to steal from me.
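For the skeptical, the arithmetic is easy to check with a simple amortization loop. The numbers below are made up – my actual balance and rate are my business – but the mechanics are the whole point: same rate, smaller payment, longer term, strictly more total interest.

```python
def total_interest(principal, annual_rate, monthly_payment):
    """Sum the interest paid over the life of a simple amortized loan.
    (Illustrative only; ignores fees, rounding, and payoff niceties.)"""
    balance, interest_paid, months = principal, 0.0, 0
    monthly_rate = annual_rate / 12.0
    while balance > 0 and months < 1200:  # safety cap at 100 years
        interest = balance * monthly_rate
        interest_paid += interest
        balance = balance + interest - monthly_payment
        months += 1
    return interest_paid

# Hypothetical loan: $5,000 balance at 4.5% APR.
old = total_interest(5000, 0.045, 300.00)         # original payment
new = total_interest(5000, 0.045, 300.00 * 0.26)  # payment cut by 74%
```

Run it with any plausible numbers and `new` comes out well above `old`: the smaller payment leaves the balance sitting there accruing interest for years longer.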
I’ll be calling the Department of Education to complain on Tuesday, and I’ll be paying off the loan immediately. They aren’t going to get a single cent of additional interest out of me. I won’t be calling Mohela, however. They can suck it. It’s just not worth my time to try and get the payment terms restored.
Posted on January 5th, 2012
It’s always cool when music you’ve liked for a really long time hits mainstream. It gets used on a commercial or in a movie trailer or played at a sporting event, and when the person next to you says, “Wow! What is that song?” you can just tell them. That happened several times with E.S. Posthumus, and every time it was awesome introducing somebody to new music. (And if you are a fan of grand, cinematic-style music and haven’t checked out E.S. Posthumus yet – well, you’re missing out.)
I don’t have a whole lot of interest in seeing the new movie Extremely Loud and Incredibly Close, but the commercials running on TV feature a great track off The Cinematic Orchestra’s album Ma Fleur called To Build A Home. You should probably just go ahead and buy it now.