Linux Nostalgia

I woke up this morning thinking to myself, "Self, you know what I miss? I miss the unadulterated agony of installing linux in the 90s. How can I recapture the fear and shame of that whole process in this present age of miracle and wonders?"

At 31, I'm not terribly old, but I've observed a good majority of the evolution of modern computing. I once gave an interview and talked about how things were back in the day. My interviewer frowned at me with a look reserved for when a person tells you they used to ride a triceratops to work when they were younger. But really, 1994 was a long time ago, and computers were hilariously primitive back then: that was the year of Zip disks and the 14.4k modem, the era of America Online and CompuServe. I remember thinking you could never in a million years fill an entire gigabyte of space and that Bill Gates was nuts to declare that we could deliver entire movies over the internet. Chances are you probably didn't even own a computer then. (Actually, chances are you were still in diapers.)

The point is computers were rare, and it wasn't even apparent what you might use them for should you own one. Social networks didn't really exist, and not everyone had email, so it was out as a communication tool. Word processing and desktop publishing were in their infancy and printers could not be bothered to work (some things never change). Typewriters were still more reliable. I was all of 11 years old, so what in the world did I have that needed computing? (It turns out that 3D computation was surprisingly accessible even then, but that's another story.) So, with no obvious purpose and nothing else to do, I devoted a lot of time to exploring the corners of the machine.

Over the course of the years, I had become a fair hand at DOS. I could free base memory in my sleep (important for gaming, you see). It was trivial killing TSRs and moving things into expanded, then later, extended memory. Config.sys, baby. Windows 3.0 held few challenges, I had subjugated progman.exe and bent it to my will. I was unstoppable with a computer.

Then, some time in 1995, my father brought me to something he called a 'computer show'. Think of it as a precursor to the tech conference, a faire where vendors hawk new hardware and software from stalls. It was the bright red cartoon fedora, I think, that compelled me to investigate the display of RedHat CDs. I didn't even understand what my father meant when he nodded thoughtfully, "It's an operating system. Like Xenix." Whatever the case, madness took me and I took a copy. I knew then that this was my next challenge, but at that tender age I could never have dreamed of the realms of pain waiting for me.

Remembering Mother's Day

On a good day, linux installers of the time were... recalcitrant. I was not having a good day. I burned the first half of it fiddling with IDE cables just to get the CD to boot. When it finally did, I was unprepared for the parade of horrors to follow. Gone were my familiar A and C drives, replaced instead with the cryptic sequences /dev/fd0 and /dev/hda0. I was asked to choose a partition scheme by a text-mode installer that yielded no assistance. It gave me a list of partition type codes to choose from; I typed in '83' and prayed for the best. I hope I don't need that 'linux swap' thing.

For days after, my computer was a battleground. Linux fell to my assault in pieces: first the filesystem, next the package manager, then the display drivers and the serial peripherals. Finally, after getting my monitor refresh rate right, I was rewarded with a pair of cartoon eyes that followed my cursor around the screen. Behind those eyes lay the wreckage of a devastated DOS setup and a master boot record that would never be the same again. It was worth it, though. Redhat and X11 were installed; all that was left to do was get on the internet. Just follow the documentation on PPP and Hayes-compatible modems and I'd be golden.

I don't remember exactly when I gave up. Several more days had passed, that much I know. Something in the documentation just didn't jibe. It asked me to run a command that didn't exist, to start a process that, near as I could tell, was little more than a fanciful dream in some fiction writer's eye. The modem would dial, a connection was made, and then nothing. I couldn't tell which of the dozens of config files I had broken. By this point, it had been days since I had a working computer, and it was too much for me. As fascinating as the process had been, I was just a child and I wanted to play my games and get on this new and exciting 'internet' thing. I broke out my DOS 6.22 floppy disks and started rebuilding. Microsoft welcomed me back with warm, inviting arms.

Slow Moving Technology

Windows 95 and 98 came and went on my PC tower with the funny master boot record. (Forever after, it would insist that you hit enter twice after the system diagnostic stage before it would boot.) I eventually came to discover that I had been installing the 'notorious' Mother's Day Release of Redhat. The release had been deemed irretrievably broken by those who knew better; it was a wonder I made it as far as I did. It was 1999 before I saw fit to try again; the scars had not yet faded. Perseverance paid off. This time, when the cartoon eyes came up, so did my TCP/IP connection.

The thing was, the whole process had been as painful as my first go-around. Nothing had really improved in the intervening years. The concept of user friendliness had not yet been invented. We still had text-mode installers that only got you partway through the process then dumped you into an unadorned terminal as root. It's a foregone conclusion now, but in those days you had to decide whether or not you wanted a windowing system. If you did, there were dozens of ways to get there, but the installer would yield no secrets. Heck, it didn't even tell you that there was a windowing system to be installed. I spent months installing and uninstalling every interestingly-named package in the Debian repository just to see what they would do.

And for some perverse reason, this is exactly what I wanted today. I wanted the uncertainty that comes with total control. I wanted to roll around in malformed dotfiles and configurations until my eyes bled. I wanted to fight XF86Config for supremacy and come out on top. But what to do? Ubuntu had brought linux to the desktop with its convenient installer and live CDs. A few clicks and you had a ready-made desktop system waiting for you. Every modern distro had basically followed suit. I wanted to scratch a nostalgic itch, not build a linux distro from scratch. (Yet another adventure for another time.) Surely there was a happy medium.

The Search Goes On

And so it was casual masochism that had me trying out obscure linux distros, looking for the pain that hurt so good. After trying out Mint, it was clear that all Ubuntu-based distros were out. They were built on the idea that linux should be accessible. Definitely not what I wanted. The same went for all Fedora-based distributions. Slackware held promise, but I wasn't sure I was ready to summon those demons.

Crunchbang, Antix, Mepis, Damn Small Linux, Puppy Linux, the list went on. I was just about ready to try out a BSD when I realized that my irrational fear of leaving dpkg behind was limiting my options. In the course of my search, I had dismissed Arch early on because it used a package manager named 'pacman'. On top of that, it had poorly chosen flags. These now seemed like the right direction for my search. In Debian, installing a package called foo would use a command like apt-get install foo. A little cryptic, but the word 'install' at least gave you an idea of what would happen next. In Arch, pacman -S foo was the equivalent. -S for 'So Promising'.

Life Under the Arch

Minutes into the install process, I knew I was home. The install guide is terse, almost aggressive. Partitioning is considered part of the installation process in most distributions. Arch considers this a pre-install step: you are on your own. Even the so-called beginner's guide offers this gem, with no guidance on whether or not you might want to perform this action.

Erase partition table

If you want to start from scratch, and do not intend to keep existing partitions, erase the partition table with the following command. This simplifies creating new partitions and avoids problems with converting disks from MBR to GPT and vice versa.

# sgdisk --zap-all /dev/sda

Perfect.

In the end, I only had to reboot twice to make sure I wasn't destroying any of my existing partitions. I only ruined the UEFI boot process once and it only took 20 minutes to recover. I wrestled with xinit, screwed around with synaptics drivers, and had to build many of my favorite packages from source. I had to use a chroot to bootstrap the system, arcane magic I'd never done before. Breathe in deep. This is the linux of my childhood.
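
For the curious, the bootstrap dance goes roughly like this. Treat it as a sketch from memory rather than a substitute for the wiki; the mount point is the conventional one and the package list has changed over the years.

pacstrap /mnt base                   # install the base system into the freshly mounted root
genfstab -U /mnt >> /mnt/etc/fstab   # record the mounted filesystems for the new install
arch-chroot /mnt                     # step inside the new system to configure it
passwd                               # set a root password before rebooting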

To be completely fair, the vast majority of things worked out of the box. I had a workable system in hours, not weeks. Arch was merely challenging, not a life or death ordeal. Perusing the wiki, I found a page titled The Arch Way. In it, they describe how they emphasize user responsibility over user friendliness. In a way, it's like a codification of the wild west of linux in the 90s, honed to a fine point. Arch goes out of its way to make sure that you have that paralyzing freedom of choice when getting your system up and running, minus the annoying fiddly bits. I'll admit that as I've gotten older, I've lost a lot of interest in tweaking everything so that it worked just so, but it's been very refreshing to feel that kind of control again.

The Verdict

Should you install Arch? It's hard to say. The answer, I think, is a solid 'maybe'. If you're like me, you live in vim and only ever have two windows open: a browser and a terminal. You can get that anywhere, and Arch has nothing special for you in that regard. You may find it a nice afternoon diversion to try to recapture your lost childhood, but unless you're trying to bring a very old machine online, you'll probably have a better time playing with a Skip-it.

If, however, you have no opinion on the IDE -> SATA changeover (or even ISA -> PCI, if you like), then this will be an eye-opening experience for you. It will force you to reconsider your definition of a 'working computer'. There is a certain joy to be had in cherry-picking the pieces of your operating system. The modern twist is that there's very little risk that you'll have nothing to show for your trouble. It is absolutely worth doing once, even if you go back to Ubuntu in the end. For a little extra flavor, try an old-school window manager like Fluxbox rather than a full desktop environment. See for yourself how the older half lived.

In the end, I had a ton of fun. And for that, Arch gets a solid 9/10 squirrels.

Year 1 of the Revolution

I must confess, I am addicted to anonymity. Not the crass commodity that powers John Gabriel's first law of the internet, but the real stuff, anonymity bordering on invisibility. I aspire to leave as little trace on the internet as possible, stopping just shy of creating deliberately misleading identities to conceal my antics. You may notice a few twigs broken here and there, scant castings as digital records become inescapable, but you'll find no proof of me. The sasquatch is my spirit animal.

My dreams of growing up to be Harry Henderson are sas-squashed today because something important has happened, and I'm obligated to announce it.

Almost without fail, the world must move before I'll write. The two are so correlated, an observer may not perceive the causality of the situation. It may seem as if the mere act of putting virtual pen to virtual paper summons apocalyptic tremors. In reality, I'm more of a seismograph for technology, selectively writing about things that move me. What I'm feeling now are simply the foreshocks of a greater shift in the way Silicon Valley hires engineers.

In the past three weeks, some nineteen job offers have been made to Hackbright students. For those of you unfamiliar, Hackbright is a trade school with an atypical proposition: the total time to go from complete novice to skilled developer clocks in at just under three months, or two and a half werewolf cycles. I'll qualify my use of 'skilled developer'. In honesty, the phrase is a bit of propaganda; the reality is that a Hackbright student is competitive with a new graduate armed with a four-year computer science degree...

...

Now it sounds even more preposterous. 

But is it that unbelievable? If you take a typical college student, estimating that they retain only about half of what they learn, some fraction of which is medieval studies, the bar seems much less altitudinous. Mix a healthy dose of computer science, practical system administration, and modern software development practices, and the playing field is level. With a small class size (less than thirty), Hackbright is essentially the small-batch artisanal house blend of IT. You've probably never heard of it.

You have, however, heard of the employers who believe in our process and our students. Eventbrite, Survey Monkey, New Relic, Facebook, Bitcasa, StubHub, Facebook.... Facebook! Let me assure you that none of these organizations are making special dispensation for our students and no punches were pulled during the interview process. If they were lowering their standards at all, I would be employed at one of these places:


The recent Facebook hire serves as final emphasis on a story we've been writing for a year now. With the right person, the right curriculum, and the right instructors, it's possible to train a person in ten weeks to be competitive with a college graduate with a computer science degree, even for a position at a tier-1 engineering organization. Beyond that, it appears to be repeatable.

Let's ignore the obvious conclusion for a moment*, as this raises a few questions about the state of hiring in Silicon Valley. If an entry-level engineer can be vat-grown in a couple of months, what does that say when we need a specialist? Can we repeat the same process? Can thoughtful, immersive training replace years of study and experience? Our past successes suggest that it's possible, and at least worth looking into.

In my experience, the average new hire takes about two months to come up to speed and produce something useful. This is assuming a typical onboarding process, where they're expected to accrete knowledge and osmose it through their semi-permeable membranes without direct intervention. Instead, with careful guidance, we might be able to abbreviate that to a fortnight. Or we could go all-out, and use the two months to train someone up from an average developer to be a great one in almost any niche required.

We're basically replicating the plot of Captain America, but with nerds.

There are caveats of course. Not everyone is suited to having their brain inserted into a particle accelerator and blasted with knowledge protons. It takes certain characteristics which I'm not yet ready to codify here. (I'll save that for another treatise.) Still, hiring processes could be greatly reformed. Imagine if culture fit and potential were the overriding factors, knowing that any gap in skill could be accounted for in the training period. It wouldn't solve all of our staffing problems, but it would be a huge boon.

The thing is, you don't even have to imagine this. One company already does this, sending new hires through a 'bootcamp' experience for their first month and a half: Facebook (I'm sensing a theme, here). By all accounts, this program is a great success. Their only flaw here is that they exclusively accept 'great' programmers to start. Their criteria could be amended to include candidates who are merely 'very promising', and do just as well. It suggests a scenario where the solution to all of Silicon Valley's hiring problems is not just fixing the hiring process, but fixing the training process as well. In my opinion, this shakes up pretty much everything we know about hiring knowledge workers and the role of college in producing them.

Every offer made to every one of my students is the result of their hard work and dedication, and of them I could not be more proud. For myself, I am honored to have played my small part. Indeed, to all my protégés, scattered to the four corners of Silicon Valley as you are, it has been a great pleasure to work with such talented engineers. I could not have asked for a more wonderful experience. Thank you all for making Hackbright's first year a success.

* Go to college. Seriously. HBA is not a replacement for college. It's the obvious conclusion, but it's wrong.

Follow me on Twitter... when I put it that way, it sounds like an order. Just do it though, it probably won't hurt.

Straight From The Horse's Mouth, Or The Unofficial Hackbright Admissions FAQ

Hint -- The final interview will be conducted by a talking horse.

Q: Lolwut?

I figure we're about due for some clarification. I've been head of instruction for Hackbright Academy for just about one year now, and unfortunately, our admissions process remains completely opaque. Part of this is due to the simple fact that admissions is a difficult thing; the remainder is due to my own incompetence and absent-mindedness. This has resulted in more than a few broken hearts, and more than one person's application getting 'lost in the system'. (I would like to comment that it goes both ways: if we email you and you don't respond, that makes us equally sad.) For all that, I apologize. It is a hard thing to do. Consider this: as agonizing as it is for you to receive a rejection letter, amplify that by the hundreds of times that we have to tell someone that we didn't think they were a good fit, no matter how eager and excited they were. We're not so large and faceless yet that each rejection doesn't affect us personally.

Think of this FAQ as both a clarification of our process, so that when we slip up, you might find it in your heart to forgive us our failings, and a pre-emptive public apology.

Q: What exactly is the process?

The general process is three steps:
  • Submit an application
  • First round interview
  • Second round interview

After the second round, we give our final decision. Generally, we do not deviate from this process, and already my heart goes out to everyone who's reading this sentence and feeling disappointment.

Q: What exactly are my chances?

First, if you're a dude, I commend your tenacity in applying despite the fact that we do not allow men in the course. So while you may have an impressive resume and be a compelling applicant, your chances are still zero and that will not change in the near future.

For the rest of you, our actual demographic, statistically, your chances aren't much better than that dude. That's not to say you shouldn't apply, because you absolutely should. The issue at hand is that we run a physical facility, which limits our ability to accept applicants. Currently, we're capped at 26 students per session. We potentially have the ability to accept a few more, with a hard stop at 32, but we haven't yet solved the problem of noise levels, and adding more would just make a bad problem worse. That said, you'd think, hey, 26 is a decent size, and it is. Only, we never anticipated how important our work would be, and how much it would mean to people.

Our first class was 12 students, and we got 40 applications. Our second class grew a little, to 16, and so did our applicant pool, about 90 applied. The third and current class (as of this post) is 26-strong, with roughly 200 applications. At the rate we're receiving applications for the 2014 Summer Session, we should top out between 400 to 500, putting us at the same acceptance rate as Harvard. Dang.

Q: How do I improve my chances?

Our application page is simple, but that does not mean it's easy. We have to filter out the vast majority of applicants just from the contents of the application, so make yourself stand out. There are only a few fields, so fill them out thoroughly. Where we ask you to teach us something new, the key word here is teach, not new. Explain any concept of your choice so completely that we could not possibly have any further questions.

Do the optional coding challenge! It's optional, yes, and that's slightly misleading. We understand that you don't already know how to code, and that's why you're applying. But you should know, before you apply, that this is what you want to do for 8 hours a day for three months, and then, the rest of your life. Spend a few hours, a week, maybe, learning just enough to complete the challenge. There are great resources out there, Codecademy, Ruby Monk, Treehouse, etc., and any one of these can get you just enough coding know-how to do so. Think of it like a puzzle: taking time to solve it shows great initiative that can set you apart from the casually-interested. I'm especially fond of answers that aren't written in code, but instead in mathematical notation or enumerated, logical steps. More than once, that has guaranteed a second interview.

Be personable and entertaining in your answers. We don't have a stodgy admissions department looking to mark checkboxes. David and I read each application personally, and if your writing resonates with either of us, that can be the difference between an interview and being passed over.

Q: I solved your challenge but I really don't know anything about programming at all, should I apply?

Yes, please, absolutely. Rather than make it easy on ourselves, we've chosen to explicitly teach people with little to no experience at all in programming. You are our target demographic, and we've had great success with people whose only experience was the 3 hours on Codecademy needed to solve our coding challenge. We actively turn away people who know too much because they don't need our help.

Q: Why is this interview so important anyway?

The word 'academy' is in our name, but we're not even remotely close to being a normal school. When it comes to instruction, I don't just lecture for an hour and walk away. I spend the entire day interacting with students and answering questions. The interview serves as a filter. You may be a more-than-qualified applicant, but if you can't stand me for an hour, I probably won't end up growing on you. That's problematic if we're spending 8 hours in a room together.

Because our program is so short and so intense, we can't rely on the traditional model of presenting information and waiting for students to absorb knowledge through osmosis. You will be asked to bring all your mental faculties to bear on a rigorous discipline, solving difficult problems, and where your skills fall short, you are buoyed by the instruction staff. Working this closely means it's imperative that we discern early on if we have a constructive rapport. Naturally, this means we will turn away perfectly great people. This is not an indictment of your character, but a failing of our methodology.

Q: I didn't get an interview, and now I'm sadfaced. :(

That's not a question, but I'll speak to it anyway. If you didn't get an interview, do not despair. Again, Hackbright has become so competitive that we simply cannot accept or even interview everyone who applies. We get a lot of false negatives, and more than once have we discovered that an applicant has gone on to greater things than we can provide. We definitely aren't perfect and have gotten it wrong before, so please try not to take it personally. Please do apply again! This time, change your application up. We ask that applicants, after having received notice that they have not been selected for the quarter, wait at least one quarter before applying again. This gives you time to work on improving your application. We love nothing better than to see improvement in an applicant.

There are many ways you can improve in the interim, but materially, you could just work your way through the two textbooks we use for the course:

We ask every incoming student to work through these books before arriving on the first day, so it's as good a start as any.

Q: I read those books and, I think I uhh... accidentally became too smart for Hackbright.

That's okay, too. Please apply anyway. As mentioned before, it honestly gives me great satisfaction to tell someone that they're more than qualified to go out and be awesome without my help. I'm looking at you, @sharonw and @reneighbor. We're not in the business of giving our services to someone who doesn't need them; this is not the foundation of commerce. We trade for currency only when the services rendered will be worthwhile.

Sometimes though, you're in that weird place where you're too advanced to be a Hackbright student but not yet ready to strike it out on your own. For those people, we have the Hackbright Apprentice program, which is a combination of self-paced study through the Hackbright curriculum and TAing for the first half of the quarter, then joining the students for the final projects. Going forward, we're going to formalize the apprenticeship and make it a more structured program to run parallel to the coding fellowship. Trust me, it's super fun and you want to be a part of it.

Epilogue

I really hope that clarified things. Again, we're human, running something that's grown beyond our capabilities to handle, and we're desperately trying to rectify that, but we haven't solved it yet and it's going to be a bumpy process for some time coming. We've been accused of being unprofessional because of it, and to that I have no rebuttal. Only, we're trying to make admissions smoother and fairer, and I hope you'll be forgiving while we hammer that out.

P.S. The horse mentioned above is me: I conduct the final interview.

You can follow me on twitter at @chriszf, but I pretty much never tweet ever so it's not worth it.

Apparently We're Not Ready to Be Adults About Anything

I write rarely, once in a geological age (but it's a geological internet age, so it's just a few times a year instead of a few times an aeon), and that bell tolls again.

This fine piece of internet drama wakes me from my slumber. In short, a woman at a conference overheard a joke in bad taste. She outed the perpetrator on Twitter and he was removed from the conference. (Update: No action was taken by PyCon besides informing him of the issue; nobody was removed from anywhere). He was also fired by his employer.

The discussion that followed should have been along the lines of, "Wow, Twitter is a powerful soapbox, and we must all be aware of our audience, both in person and online." Also acceptable would have been sympathy for the man with three children who was fired for a lapse in judgement. Let he who is free of sin throw the first stone, etc.

Instead, we got some mouth-foaming. And insults. And general rage. Even the last bastion of the erudite, Hacker News, was not free from insanity. Even when the man in question apologized, recognizing his behavior was disrespectful, we got a huge dose of sexism bordering on the violent.

Listen up, tech community. I'm only going to say this once. None of that is acceptable, and you are to grow the hell up right now. Whether you agree with Adria, whether you agree with (I'm assuming, here) Playhaven's decision to fire the guy, whether you think you should be allowed to tell lewd jokes at a family-friendly conference, there is no world in which any of these examples constitute an adult conversation. Save your name-calling for the playground.

There is a massive anti-women sentiment in tech, and it brings out the worst in so many people. Hordes of men, and even some women, seem to relish the idea that tech is this insular field where the only rules that apply come straight from Lord of the Flies. Somehow, if we allow anyone who has human, adult feelings onto our island, we forever lose our capacity to do our jobs. If talking freely about "banging chicks" or ogling women in bikinis is somehow critical to your professional function, you may want to re-evaluate your career.

Technology is propelled forward by creative thinking and motivated individuals, not by our cavalier attitudes towards dick jokes. Curbing that behavior does not make us any worse as engineers and entrepreneurs, so you can stop with that argument right now.

You can also stop saying that she has no right to be offended, and that the jokes weren't offensive in the first place. Just because you cannot fathom something does not mean it has no merit. Being a straight, white male gives you no authority over what women, the genderqueer, or other minorities may take offense at. Being unable to sympathize with a completely foreign experience does not make you a bad person, but being dismissive because you can't does.

As technologists, we like to think that we're the ones driving real progress so we are exempt from the rules that every other industry has developed in their long, storied histories. It's time for a reality check. The rest of the world shakes their collective head at our childish antics while we struggle frantically to build the next 'facebook for dogs' or 'twitter for watermelons.' It's time to grow up.

There's Something About Learning To Code

I don't think I'm qualified to say whether something is a bubble, but you know something's wrong when your line of work is a joke out of a movie from 1998.

"7-Minute Abs. And we guarantee just as good a workout as the 8-minute folk. . . . If you're not happy with the first 7 minutes, we're gonna send you the extra minute free!"
"That's good. Unless, of course, somebody comes up with 6-Minute Abs. Then you're in trouble, huh?"
"No! No, no . . . not 6! I said 7. Nobody's comin' up with 6. Who works out in 6 minutes?! You won't even get your heart goin', not even a mouse on a wheel. . . . "
-There's Something About Mary

Enter codeStreak, a 6 week Ruby on Rails course.

Now, I'm loth to criticize any website especially since I've done such a poor job of keeping mine maintained, but I do want to question their hero graphic. Aside from the fact that it shows that they attempted to launch by November but were unable to get all the pieces in place (and I don't blame them, there are so many), that is just outright poor use of Dan Cook's amazing (and free!) game prototyping tiles. They just deserve a classier treatment than 'lolspeak', is all I'm saying.

Ahem.

Where were we? First we start with Codecademy's Code Year. Next, Hungry Academy takes 5 months and feeds you into LivingSocial's engineering team. Bloc.io schedules 12 weeks. Starter League (formerly Code Academy, thank god one of them changed their name) takes 11 weeks in Chicago. Dev Bootcamp in SF took 10 weeks but has switched to 9. App Academy also takes 9, and now codeStreak takes 6. It's some sort of bizarre race to the bottom. If we extrapolate here, we should expect to soon see a school that promises to make you forget how to program in a single week.

Not knowing any of the fellows at codeStreak, I can't yet condemn their ambitious plan. Perhaps they have discovered a way to inject programming skill directly into the brain, a la the matrix. If that is the case, they are sorely underpriced.

Getting BerkeleyDB working with Python on OSX Lion

Come on Apple. Why do you have to do this to me? Every other unix installation is just fine, but you have to go and be super insane with your default modules. So many things are secretly broken with the default python installation that it's recommended you install a 'real version' before doing anything. So I'll just go ahead and follow these instructions and...

Oh.

Come on.

Okay, it looks like the solution might be to first install Xcode 4.3 and then to OH MY GOD I AM SICK OF THIS, I JUST WANT BDB IN PYTHON. EFFF YOUUUU APPLE.

Alright, I got that out of my system, let's get to solving the problem. The solution turns out to be really easy, but completely undocumented. If you don't have the intestinal fortitude to go reading through install scripts, or if you haven't dealt with good old-fashioned *nix library installations from a decade ago, you probably won't ever find it on google. If you can figure out the magic search terms that do bring up the solution, please let me know.

To start, first wipe your hard drive, repartition it to ext4 volumes, and install ubuntu or any other debian variant, and tadaa, you're done.

No? Fine. If you don't want to use the grandness of apt, then we'll have to settle for homebrew. Homebrew is a package manager in the same vein as apt or yum, but specifically for macs. The recipes included in homebrew have acceptable coverage of most of the popular packages I've tried to install, so it gets my stamp of approval. Install homebrew before moving on.

If you're like me, you'll have learned to really hate IDEs, and Xcode in particular. If you don't have it installed, take a moment to grab the gcc-without-xcode installer from the venerable Kenneth Reitz.

Now that you have brew and gcc installed, time to get BerkeleyDB installed. Or at least, one we can manipulate. For all I know, Apple may have included one somewhere, but I'll be damned if I can find it.

brew install berkeley-db

What's that? I have to install the JDK because Apple chose to not include it? I have to sign up for an Apple developer account to get it?! I give up. I'll just go back to writing my key-value pairs on this cave wall.

Wait, what's this? If we look at the brew formula for BerkeleyDB, you'll notice an interesting "without-java" option listed. Let's try that.

brew install berkeley-db --without-java

Aha, success! Now we're getting somewhere. Homebrew installs all of its packages to /usr/local/Cellar. Take note of that, it'll be important later. The end must be near. It should be a simple matter of running pip and...

Huh, okay, it wants the bdb path. Okay, no problem.

Hmmm, if we do 'pip help install', we find that to pass options to the installation process, we need to use the --install-option option of pip, which will then feed it to the package's setup.py. So we do it like so:
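
# something along these lines -- the exact path is a guess; brew --prefix should point at the install
pip install bsddb3 --install-option="--berkeley-db=$(brew --prefix berkeley-db)"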

Still nothing. If we look at the output, we can see that the install is failing on running 'python setup.py egg_info'. If we go into the build directory and run that manually, it suggests we add the --berkeley-db option. When we do that, it seems to work, but how do we get pip to call it with that option?

It turns out, we can't. At least, not that I can figure out. You can pass parameters to 'setup.py build' through pip by the --install-option flag, but not to 'setup.py egg_info'. It's just not baked in there. Horrible.

At this point, our options are to fix pip or to fix the bsddb3 setup.py process. Well, neither is especially appealing. If I ever need to do this again, or do it on another machine, or whatever, that would be a mess. There's gotta be a better way.

Thinking back to the 90s (I know, right?), when you tried to build a library that had a dependency on another library, you could specify the location of that dependency through a similar flag, like --my-library-path=/blah. This was almost always a convenience shortcut for setting an environment variable, MY_LIBRARY_PATH=/blah. Let's see if that's the case here. If we go into the egg and grep around for 'berkeley-db', we find the following segment:
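
# paraphrasing heavily -- the real setup.py is longer, but the relevant logic
# boils down to honoring an environment variable if it is set
import os

BERKELEYDB_DIR = os.environ.get("BERKELEYDB_DIR")
if BERKELEYDB_DIR:
    # use the headers and libraries under this prefix instead of hunting for them
    incdir = os.path.join(BERKELEYDB_DIR, "include")
    libdir = os.path.join(BERKELEYDB_DIR, "lib")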

It looks like all we have to do is set the environment variable BERKELEYDB_DIR and point it at our brew-installed BerkeleyDB.
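
Concretely, something like this should do the trick (assuming Homebrew's default prefix; adjust the path if yours differs):

export BERKELEYDB_DIR=$(brew --prefix berkeley-db)
pip install bsddb3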

Finally. I'm aggravated that it was as annoying as it was. I figured we had basically solved all of these problems 12 years ago. Every time I'm asked to install java or gcc on a mac, I realize how obnoxious OSX is for doing old-timey non-Apple development. It's not impossible, but some things are just absurd.

To recap (or in the modern parlance, tl;dr):

OSX ships with broken bsddb python bindings. To fix this:

  1. Install gcc
  2. Install homebrew
  3. Install berkeley-db through brew
  4. Set the BERKELEYDB_DIR environment variable to point to the brew installation before running pip install bsddb3.

Apparently Yelp has a no-fly list

Edit: My students are adults, which wasn't really the point of the article (and wasn't clear). Their ridiculous exclusion policy is. As smart as kids are these days, I'm not sure a "Pinterest scaling talk" is something I'd take kids to.

So I took my class on a field trip (hah!) to an event put on by the SF MySQL group at Yelp's offices. The event itself, a talk by the Pinterest engineers on how they scaled their product, was pretty great. Certainly, I learned a lot from it, and the pizza was pretty nice too. However, the venue left a bad taste in my mouth. Let me illustrate.

Quick, tell me, what do you know about the man they call Jon Blakely?

The only sensible response to that demand is, "Which Jon Blakely? There are hundreds!"

Imagine my surprise then, that Yelp's security team seems to be entirely insensible. When we arrived, one of my students was told that she was not allowed into the Yelp offices. I pressed further, and they informed me that she could not be let in because she was an "undisclosed security risk." I was also told I could email 'yelp.com' to clear things up. Thanks.

I was completely stonewalled by security, and I felt terrible. As a teacher, it's very hard not to feel protective of your students. Maybe it's a holdover from childhood, when you look at your teachers as the final authority on everything; from the other side, you feel the pressure to be that authority.

But here I was, helpless. The only thing I could do was direct her to BART and join the rest of the class inside. Once in, I did a little snooping, and got nothing at all from the Yelpers I found. Like as not, they didn't know anything about this mystery security policy. It wasn't until I ran into Erin O., the event organizer, that I got any useful information: when an event is hosted at Yelp, it is policy to provide them with a guest list. This list is composed of the full names of attendees and nothing else.

So, Yelp security team. You banned an attendee from your venue based solely on her name. Knowing nothing else about her, you decided she was an undisclosed security risk. Perhaps you have a policy of having only so many people with the letter 'Y' in their name, so as not to run afoul of fire codes. Maybe she shares a name with an internationally renowned thief, known for stealing restaurant reviews and selling them on the black market. Or maybe you were just taking a page out of the TSA playbook because it worked so well for them and it was not a terrible idea at all. Whatever the reason, bravo.

I don't hold it against the security guy, he was just following instructions. He was even polite (and contrite) about the fiasco. And I get it. Yelp is a grown-up company now, and they have to wear the big-boy security pants. You can't have anyone just walking in off the street and stealing all those reviews. But I'll be damned if that isn't the dumbest security policy I've ever seen. As crummy as Google restaurant reviews are, Yelp's getting uninstalled.

I'm a programmer and so can you!

It's okay to be a programmer -- it really is. No matter how many posts on HN advocate leaving your job and starting a new career as an entrepreneur, it's really okay to stop and think, "I like programming, I don't mind doing it for someone else, and I'm happy to keep doing it."

There's no shame in being a programmer, certainly. Otherwise, you probably wouldn't be doing it. It's become the fad recently to look down on programmers, especially by those of us "doing our own thing," as if entrepreneurs are somehow better for trying something new. Well, I probably don't have to back this up with numbers, but most entrepreneurs are destined to fail. Is that better or worse than having the security of doing something you enjoy?

There are a lot of ways to become wealthy as a programmer. You could, for example:
  • join a startup at an early stage
  • consult for obscene amounts of money
  • be a programmer and be paid a programmer's salary
You might not be fabulously wealthy and written about in magazines, but you could do extremely well. If you play your cards right, you might retire early to a farm and take up animal husbandry. Is that better or worse than creating businesses that never go anywhere? The answer is neither. It's just different, and that's okay.

Of Course You Should Learn to Code, Pixar Even Made A Movie About It

It's May 15th, 2012. I just woke up and apparently the world started taking crazy pills overnight.

My first thought was that someone was pranking my wireless connection because there was no way anyone was having this conversation. But here we are. And I'm biting the linkbait.

Jeff Atwood is entitled to his opinion, ridiculous though it may be. I understand his sentiment: we don't want bad coders, coding isn't the solution, it's the means to an end, you should be problem solving, not writing code. I get it. What I don't get is how he can say that with a straight face. Look at his bio and the offending post.

In short, Jeff is a 41-year-old software developer with thirty years of experience, having started on BASIC in the 80s. He would have us believe that he got into coding at the age of 11 in 1982 to solve big problems.

Right.

My first BASIC program at age 6 was to draw a pretty snowflake onto the screen. My next programs were a series of games out of a bizarre choose-your-own-programming-adventure kids' book. When I was 10 I graduated to C and the only thing I cared about was making dumb video games.

Unless Jeff was an exceptionally mature 11-year-old, I'm pretty sure he wasn't thinking about how he could help build branded media players by learning to code.

When you're that age, there's no way to understand the power you're learning to wield. All you know is that making the computer go bloop in just the way you want is incredibly fun. I'm a professional programmer (hah!) and adult (double-hah!) now and it's still incredibly fun.

As it happens, coding is one of those skills that's also tremendously useful. It helped me build an RFID security system. It gives me a slight edge on puzzle hunts. Heck, it just earned me an extra gig on Dropbox. It's empowering. I love coding and I want to share that with everyone who's interested.

Jeff insists that instead of learning to code, we "research voraciously, and understand how the things around us work at a basic level". I may just be unimaginative, but I really don't see how anyone can do that with computers without learning to code along the way. The fundamentals of computers are hopelessly far away for someone who just browses the web and writes an email here and there. Here's a fun game: in under five minutes, try to explain the concept of cwd to anyone who's never used anything besides a Mac.

Jeff calls out Dev Bootcamp in particular for their short program and impressive results. What he doesn't realize is that the DBC students were incredibly driven and would have been great programmers if they had been pushed in that direction when they were 18. Why should we discourage that just because they're 26 now? They put in 400+ hours and got jobs as entry-level engineers. If we're being honest, I probably put in less time than that in my formal CS education before I got my entry-level position. Any kids reading this, don't be like me.

I can only imagine that his elitist insistence against learning to code comes from having watched Ratatouille and stopping the film immediately after they introduced the villain. He must have missed the moment of redemption, "Anyone can code. But I realize, only now do I truly understand what he meant. Not everyone can become a great coder, but a great coder can come from anywhere." (Totally 100% what he said in the movie, honest).

No, sorry, this is how you teach people how to program.

Coincidentally, it's the same way you teach people to do anything else.

As a teacher, here are the two things you need to do:

First, identify the gaps in their knowledge. Second, fill in those gaps.

Easier said than done.

You have to know your students fairly intimately. You have to find out how much they know, and perhaps more useful, how much they don't, before you can attempt to start. You'll have to know their goals, so you can at least point them the right way, and you'll have to be able to steer them away, when you know that the direction they want to go will end in disaster.

At first, it will seem daunting. The 'gap' in knowledge will be more like a gaping maw. It will be that way for a long time, and the first few nuggets of knowledge will be the hardest to convey. You'll struggle with terminology, using imperfect analogies to relate completely alien ideas. You won't even be speaking the same language at first.

Once they've mastered some basics, it gets a little better in some ways. Your student will have a platform from which they can see their goal and they can start walking, filling in the gaps for themselves as they go. Your job will change. Instead of feeding them knowledge, you'll be showing them where to find it, removing roadblocks, placing signposts, unveiling interesting detours and shortcuts, keeping them motivated. At this point, you're not teaching as much as you're guiding, but for every step they take, you'll already have taken 30, making sure the path was right in the first place. It's a different path and a different goal for every student.

Imagine teaching twenty people. It's exhausting.

How can anyone do this reliably? I don't know how he does it, but you can see it in action here. As much as I dislike Zed's internet persona, it's clear to me that he's an amazing teacher. He cares about his students and answers every single one of their questions at the level they need, pointing them in the right direction. On top of being a good programmer, he's a good teacher and wants to share what he knows. That is how you teach people how to program.

Learn Rails the Hard Way

Apologies to Zed Shaw, of course.

Learn Rails the Hard Way, in 10 steps

  1. Travel back in time to 1998, and build a mysql-backed CMS in php, because that was pretty much all there was.
  2. Take a bunch of CS courses, fixate on the concept of a REPL. Think to yourself, a video game is just a really fast REPL with fancy graphics.
  3. Learn Vim because it's cool, and because M-x tetris is distasteful.
  4. Learn struts for a job and feel the agony of what it's like to drown in a sea of meaningless xml files just to get a web page to display.
  5. Work for Ask.com and know the true horror that using xml for dispatch can bring: cower before the homebrewed struts-alike called Tako. Tako means octopus in Japanese; the naming was obviously a reference to the tentacles of Cthulhu.
  6. Pick up the life-preserver that is Pragmatic Rails and learn Rails 1.2 because it's still 2007. You're time traveling, remember? Build a toy app that scratches an itch, learn to be smug about how fast you built it. Be slightly baffled that some constructs that work in Rails don't work in straight Ruby.
  7. Take a job that uses Python, deploy Pylons because of how pleasantly Rails-y it is. Read the source and find that it's basically a straight port of Rails. Continue to be smug.
  8. On a whim, take netcat and observe simple http traffic. Idly wonder what it would be like to have a web browser feed directly into a REPL... oh.
  9. Teach a class on Rails, even though you haven't touched it in five years.
  10. Advocate learning Sinatra, Flask, and Noir instead.

Learning to use Rails is almost as much about learning what you're not using instead. It's hard to appreciate the utility of Rails without having a broader survey of what your options are. Indeed, if you lack the context to know why Rails was revolutionary in the first place, many of its features may seem esoteric and inconsistent. I've met many developers for whom Rails was a black box, and many of their problems were solved by trial and error. This is a bad sign.

"A fad is created when adoption exceeds education." Someone much smarter said that, and I can't remember who it was for the life of me. Rails was/is/will become a fad (time traveling!), and I worry that the quality of Rails developers (on average, obviously there are plenty of good ones) is declining as we move forward. I see people writing code that reminds me of the cut-and-paste web developers of the late 90s, and that's not good.

Many apps these days are little more than hastily-glued together gems, letting Rails and Devise automate everything else. I appreciate that there is a large body of knowledge involved in making that happen, and you might need to be a Rails expert to do that. I'm not so sure that means you're also a programming expert.

Don't get me wrong. Rails is wonderful. It has done amazing things for the current state of technology and the Rails team continues to innovate constantly. But if you're going to be a Rails developer then really learn it, what it does for you, and more importantly, what it hides from you. Be a developer first, then a web developer, then a Rails developer. When you finally understand Rails, you'll also understand why you don't need it. And by then, you might not even want it anymore.

Announcing the Winner of the 2012 IOJSCC: @fat

A quick back story.

Once upon a time, @fat, a Twitter employee, wrote a javascript library that was a really good idea. Everyone thought so, and pretty soon everyone was using it. Then one day, someone looked through the software and noticed that there was nary a semicolon in sight. They looked high and low and found not one.

Now, this person looked about him and saw that every other piece of software (of note) was laden with semicolons and said to @fat, "Yo dude. What's up with that?" To which @fat replied, "The parser is gracious and lenient. It permits me to leave the semicolons out and so I do."

Immediately, the old gods of Javascript descended to the earth, stroked their mighty beards and said, "@fat, seriously, that is super dumb. It's lenient in the event that you slip up and forget a semicolon, but it wasn't meant for you to omit them all the time. Also, your code looks crazy in some places just because you mislike semicolons. We regret even adding this feature in at all."

@fat, hurt by this rebuke, cried out, "I am merely what you made me! You permitted me to create without semicolons and now you bind my hands and call me a monster!" Looking at his hands with wild eyes he announced, "Then a monster I shall be! And with these hands, I will unmake you."

And so began the great Javascript war of 2012.

This semicolon issue is much more fun written this way.

For myself, I think @fat is being ridiculous. Yes, javascript can be written as if newlines were significant (like ruby or python) but to accomplish this, he pulls tricks out that look like they were lifted from entries in the IOCCC.

Let me reiterate, just in case it wasn't clear: In the service of 'aesthetics', @fat writes code that uses tricks that bear resemblance to entries in a competition whose goal is to demonstrate the importance of coding style through irony.

And because I hate to explain anything without the use of analogy, it's as if Stephenie Meyer wrote the Twilight series using the Washington Post's Worst Analogy Contest as her style guide, and Shakespeare rose from his grave to tell her to cut it out.

My money's on zombie Shakespeare.

I Regret Everything: Episode 1 - Foreign Key Constraints

It's about time I really evaluated my stance on FK constraints on InnoDB tables in MySQL. Here was my position on the matter up till now: judicious use of foreign keys for some people can improve performance for certain queries and help maintain data integrity.

This is great because as a programmer, you probably want those things. Probably the thing you're writing reads much more than it writes. That's just the way the world works these days. Additionally, who's going to say no to data with integrity? The only way this could be better is if it also gave the data honor and humility.

So let's take a really shallow look at how foreign key constraints work before we go about properly criticizing them. First, let's talk about locks. Skip past the following paragraphs about chickens and Will Smith if you already know how they work.

Let's say you have a number that represents the total number of chickens you have wrangled into a children's ball pit. You and your buddy are tasked with maintaining this number at 25. You put chickens in, and when there are too many, you have to dispatch some. Before you start your task for the day, there are 23 chickens flapping about. You take the time to count them, and you go and get two more chickens to put in.

While you're off gathering said chickens, Will Smith, who you have convinced to partner with you for the day, also counts 23 and also decides to put two chickens in. He disappears to summon the birds, and you put yours in, not knowing he's doing the same. He returns some time later, throws his chickens in, and calls it a job well done. When you check your work two hours later, you have 27 chickens, and while you appreciate that you got to spend time with the Man In Black, you're beginning to question your line of work, and honestly, who's paying you to do this.

It's a little like the Heisenberg uncertainty principle, but with chickens. Between observing a value and acting on it, there is a short but distinct amount of time in which someone else can muck it all up for you. That's where locks come in. Imagine you each have a padlock to the ball pit. When there's a lock on the door, no one else may mess with the chickens until you have removed it. As long as you can't remove someone else's lock, everything's dandy.

So... computers.

So imagine your database table to be a thing filled with chickens.

Right, let's just forget that.

Here's the point. To maintain integrity between two tables, B and A, in a scenario where a row in B depends on a row in A, InnoDB will lock both tables. If you are updating the child B, it will lock A to make sure no one deletes the parent row while you're not looking. We expect the B lock, but while it makes sense, we don't usually think that there would be a lock on A.

Now we've reached the crux: all roads from here on lead to deadlock, a scenario where a process has table A locked and is waiting for a lock on table B to free up while another process has B locked and is waiting for the lock on A to go away. Both processes wait forever, and you end up crying.
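
To make that concrete, here is a minimal sketch. The tables are illustrative (not from any real schema), but the interleaving is the classic shared-lock upgrade trap:

-- Setup: a child row must point at an existing parent row.
CREATE TABLE parent (id INT PRIMARY KEY) ENGINE=InnoDB;
CREATE TABLE child (
    id INT PRIMARY KEY,
    parent_id INT NOT NULL,
    FOREIGN KEY (parent_id) REFERENCES parent (id)
) ENGINE=InnoDB;
INSERT INTO parent VALUES (1);

-- Session A:
BEGIN;
INSERT INTO child VALUES (1, 1);  -- the FK check takes a shared (S) lock on parent row 1

-- Session B:
BEGIN;
DELETE FROM parent WHERE id = 1;  -- wants an exclusive (X) lock on that row; waits behind A

-- Session A again:
DELETE FROM parent WHERE id = 1;  -- A needs an X lock too, but must queue behind B:
                                  -- A waits on B, B waits on A, and InnoDB rolls one back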

The real problem is that there are so many ways this can happen, and even as I write this post, I find more. A SELECT on the parent will lock the child; a SELECT on a child will lock the parent. If you have an S lock due to a foreign key constraint, it can't automatically be upgraded to an X lock. To avoid the 'phantom problem', InnoDB uses next-key locking and gap locking when using FK constraints.

Frankly, I don't understand that last paragraph, and the phantom problem sounds downright scary. I want to deal with data storage, not ghosts (although only just barely). This sounds like a fight where everyone's slinging locks instead of bullets, and I want none of that. I am not smart enough to deal with it.

Let's say for a second that you are. That you're very careful about your constraints, and how you lock, and the order in which you acquire your locks. You're diligent about deadlocks and it shows. Now, your entire codebase is migrated over to SqlAlchemy or ActiveRecord or any other fancy ORM. Where is your god-of-locking now? You have next to no control over locking in these environments, and the queries become problematically complex. Difficult enough that analysis may not be worth it.

Where does that leave us? Well, you have data. It has integrity. It might even have conviction, and after going through so many locks, some character. But it's awfully lonely, since the database frequently times out on lock-waits when more than one person is using it. All because you used a foreign key.

The ironic bit, of course, is that your app is already structured to avoid the real scenario that having foreign keys saves you from. Chances are, you will write your code so you never modify existing primary keys, and you won't accidentally delete records and orphan rows. Even if you do, orphaned rows may be acceptable; you can just clean them up later, no harm no foul.

So I amend my position here with finality, for all who come after me, now until eternity, world without end: judicious use of foreign keys for some people can improve performance for certain queries and help maintain data integrity, but you are not one of those people.

tag:chriszf.posthaven.com,2013:Post/188182 2011-08-22T22:39:00Z 2013-10-08T16:01:31Z On Numbers: Part the First

In which we learn to count.

Okay peeps, let's talk about numbers. Specifically, integers and floats. As programmers, these are the kinds of numbers we're interested in. For the record, an integer (int) looks like this:

1

And a float looks like this:

1.0

In the real world, 1 == 1.0, but computers store and handle these two very differently, so let's take a peek at why. But before we can understand that, we need to learn how to count.
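A quick peek in Python makes the difference concrete (a minimal aside; Python happens to coerce the comparison so it reports True, but the two values are stored as completely different bit patterns):

    import struct

    print(type(1), type(1.0))            # <class 'int'> <class 'float'>
    print(1 == 1.0)                      # True -- the comparison coerces...

    # ...but the underlying 64-bit representations are nothing alike:
    print(struct.pack(">q", 1).hex())    # 0000000000000001  (plain integer)
    print(struct.pack(">d", 1.0).hex())  # 3ff0000000000000  (IEEE 754 double)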

Here are some numbers, you may be familiar with them.

1 2 3 4 5 6 7 8 9 10 11

For illustrative purposes, let's rewrite them.

0001 0002 0003 0004 0005 0006 0007 0008 0009 0010 0011

It's still the same; we've just prepended zeros to pad the numbers out to 4 digits. Now let's start a little thought experiment. Imagine, for a second, that we had never discovered the number 9. Just, we as a species, looked at our hands, immediately relegated thumbs to second-class citizens, and decided numbers should go from 1 to 8. Let's see what that would be like.

0001 0002 0003 0004 0005 0006 0007 0008 ...

If you're not bored yet, you may be a little confused as to what comes next. Remember, we've stricken the number 9 from existence. But never fear, we can still _represent_ the concept of 9 things (even though we don't have a numeral for it), in exactly the same way that we can represent 10 things even though we don't have a single numeral for 10. Et voila:

.... 0005 0006 0007 0008 0010 0011

"But you skipped 9! That's just 10. That doesn't work. Cats would have a mysterious extra life that we couldn't quantify, and three squared would be a non-existent number! Nena's seminal work about red balloons would be, at the very least, incredibly awkward." Well, yes and no. 

Think back to the simplest form of counting you know: tally marks. A little line for _one_ item, and then 4 lines and a strike through for 5. For simplicity, we'll represent a collection of 5 with the plus symbol. So, counting to six:

|   ||   |||   ||||   +   +|

And so 13 is ++|||, and 20 is ++++. Now, 25 is +++++, but that's getting kind of messy, so let's pretend 25 is represented by an X, and look, now we have a very simple version of roman numerals.

Let's throw another monkey and the requisite wrench into the works, and say that 0 is represented by a -. And further, to keep things clean, we'll divide everything into neat columns. All the single tallies are grouped together, the pluses go together, the Xs, and so on. So some numbers, say, 4, 12, 36, 44, 51:

4:     -    -    ||||
12:    -    ++   ||
36:    X    ++   |
44:    X    +++  ||||
51:    XX   -    |
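The column-keeping is mechanical enough to automate; a short Python sketch (the function name is mine) reproduces the table above:

    def tally_columns(n):
        """Render 0 <= n < 125 in the X / + / | columns, with - for an empty column."""
        result = []
        for mark, value in (("X", 25), ("+", 5), ("|", 1)):
            count, n = divmod(n, value)
            result.append(mark * count if count else "-")
        return "  ".join(result)

    for n in (4, 12, 36, 44, 51):
        print(n, tally_columns(n))   # matches the table: 51 -> XX  -  |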

We could keep going, but despite our best efforts, things are getting kind of hairy. What if we wanted to represent the number 125? We could do XXXXX, but if we continue in our pattern of simplification, the correct thing to do would be to create a new symbol to represent 125, maybe _. But we're running out of straight lines on our keyboard and this is becoming a mess. Like we said before, it also looks like roman numerals, but those suck! Yeah! Down with romans! Romani eunt domus!

Ahem. Back to the matter at hand. How can we improve our counting system? 

Let's do something tricky. In each column, we can only use 4 tallies; the fifth rolls over into the column to the left, leaving a - behind in the current column. We're going to cheat, and bring back our modern Arabic numerals, 0 through 4. Instead of drawing individual tallies, we'll count the number of marks we made in each column, and just put our numeral down:

4:     0 0 4
12:    0 2 2
36:    1 2 1
44:    1 3 4
51:    2 0 1

KABLAMMO. That's the sound of your mind being blown. This illustrates the relation between places, numerals, and actual numbers. Let's look at the number 5. In our Zebulon Numeral system, it looks like this:  -  +  -. Translated to Arabic numerals: 0 1 0 -> 010 -> 10. WHAT. THE. EFF.

So now counting, with five as our basis (four tallies per column before rolling over), looks like this: 1 2 3 4 10 11 12... and so forth. So the number '10' does not actually mean ten things. It means that, in a system where we have N nonzero numerals and a zero, we are representing exactly N + 1 items. It is essentially a single tally mark in the next column over. And so here, where our basis of counting is four tallies and a zero, 10 really means 'five'.

In our previous example, where we hate our thumbs, 10 means 'nine'.
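The same trick works for any basis you like. Here's a small Python sketch (a helper of my own, nothing built in) that spits out the positional spelling of a number in a given base:

    def to_base(n, base):
        """Write a non-negative integer n in positional notation (bases up to 10)."""
        if n == 0:
            return "0"
        digits = []
        while n:
            n, remainder = divmod(n, base)
            digits.append(str(remainder))
        return "".join(reversed(digits))

    print(to_base(5, 5))    # '10' -- in the Zebulon system, 10 means five
    print(to_base(9, 9))    # '10' -- in the thumbless system, 10 means nine
    print(to_base(51, 5))   # '201', matching the table above: XX  -  |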

FINE, you say, BUT WHAT IF YOU HAD 16 FINGERS? Well, just like we made up symbols to represent numerals bigger than ||||, we can make up more symbols. We could just do, ... 8 9 | + X. But instead of a contrived example, I'll just reveal that modern convention uses the letters A-F to represent ten through fifteen.

1 2 3 4 5 6 7 8 9 A B C D E F whatcomesnextquickyouknowitalreadytoolateI'mjustgonnatellyou 10

Look at that. I've just, in an incredibly long and convoluted manner, taught you hexadecimal. Or, in English, sixatennish. Maybe 'base 16' is better. And we've already seen base 5 and base 9 counting systems.
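Python already speaks this one natively, if you want to check my work (hex() goes one way, int() with a base of 16 goes back):

    print(hex(15))         # '0xf'
    print(hex(16))         # '0x10' -- the answer to whatcomesnext
    print(int("ff", 16))   # 255
    print(int("10", 16))   # 16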

So. Let's take this all the way in the other direction. Let's say we're a little bit slow, and we only ever learned two numerals: '0' and '1'. But somehow we're still smart enough to count to numbers greater than one. What does that look like?

0000
0001
0010
0011
0100

Recognize that? That there is binary, sonny. I remember back in ought six, well, we didn't have the numeral six back then, so it was ought one one ought...
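And again, Python will happily count along with you (bin() for the base 2 spelling, int() with a base of 2 to read it back):

    for n in range(5):
        print(n, bin(n))     # 0 0b0, 1 0b1, 2 0b10, 3 0b11, 4 0b100
    print(int("0110", 2))    # 6 -- ought one one ought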

Tune in next time when we link binary to hexawhatsitall, then look at what that means for integers on computers.

tag:chriszf.posthaven.com,2013:Post/188183 2010-04-29T22:25:00Z 2013-10-08T16:01:31Z Wha? A smokescreen? Must be ninjas about!

So let me summarize Sr. Jobs' latest 'open letter'.

  1. Flash on iPhone sucks.
  2. Flash on iPhone sucks.
  3. Flash on iPhone sucks.
  4. Flash on iPhone sucks.
  5. Flash on iPhone sucks.
  6. Therefore we should not let anyone make the iPhone a compile target for their chosen language.

The real issue that Adobe is worked up about is section 3.3.1. Here is a discussion. In essence, you may not write an iPhone app in a language/framework that isn't a combination of Cocoa and Obj-C. For the non-technical, it's (almost) as if Apple is restricting you to only speaking English when calling someone on an iPhone. Almost.

Points one through five are awfully nitpicky reasons, and technically inaccurate to varying degrees. They're valid enough, though, if that is the position Apple wants to hold. My beef is with people reading along, nodding their heads, "This is well-reasoned" and then missing the curveball that he throws in point 6:

"Besides the fact that Flash is closed and proprietary, has major technical drawbacks, and doesn’t support touch based devices, there is an even more important reason we do not allow Flash on iPhones, iPods and iPads. We have discussed the downsides of using Flash to play video and interactive content from websites, but Adobe also wants developers to adopt Flash to create apps that run on our mobile devices."

One of these things is not like the other. "Adobe wants developers to adopt Flash to create apps that run on our mobile devices." Re-read it carefully. Adobe wants people to use Flash (a development environment) to build apps that run on iPhones. CS5 had the ability to take a program written in ActionScript and compile it down to a native iPhone app. Short of running the app through a debugger, you would not know that it was built in the Flash environment. It's not a matter of running Flash apps compiled to bytecode inside a Flash runtime on an iPhone; it's a matter of running apps that were merely built with Adobe's development tools, which is expressly prohibited by section 3.3.1 of the developer agreement.

This means Unity, Mono, Haxe, and Titanium, which all use the same approach to compile native apps, are forbidden as well. None of them seem especially concerned, though, because the general mood at the moment is that Apple is only interested in blocking Adobe.

There are valid reasons for Apple to not want Adobe on their platform. The other environments are relatively niche. However, by enabling iPhone targets in CS5, Adobe allows a veritable glut of Flash devs to build multiplatform (including Android) apps. It is in Apple's interest to lock in developers and establish platform exclusives, the way Sony and Microsoft do with their gaming consoles. However, hiding that motive under the guise of not wanting to compromise the performance or quality of their product is disingenuous. After all, we know that Apple delivers only the finest in fart apps.

]]>
tag:chriszf.posthaven.com,2013:Post/188184 2010-04-16T19:12:00Z 2013-10-08T16:01:31Z The Sound of a Thousand Birds Taking Wing: a #chirp Postmortem

If you've been following my tweets, you would know that I've been at the Chirp conference for the past two days. You'd also know that I'm whiny and complainy. Some might even say saucy. Still, the troll-esque comments I've been tweeting the whole time belie the fact that Chirp was interesting and successful in its own way, and worthwhile besides.

The morning of the first day set a bad precedent. Given that the event was presented as a conference for developers, I was sorely disappointed to find that the first two sessions consisted largely of Twitter execs laughing at their own failings and attempting to excite the crowd about Twitter. If you are a developer, imagine the circumstances under which you would like to attend a Twitter conference. If you are having trouble, imagine being a C++ developer and spending $1000 to attend JavaOne. The vast majority of the attendees were already enthusiastic about Twitter, and this early morning cheerleading session only served to waste time. Helen Lawrence of daredigital.com later mentioned as much in a conversation with Twitter's Alex Macgillivray, saying the 'rah rah Twitter' sessions were a bit unnecessary.

Hopeful for some unique insight into the Twitter strategy, I floated in and out of the rest of the day's sessions, in between stuffing my face with cupcakes. I attended the @anywhere presentation and the two sessions on monetization (I was present for the panel on investing, but I'll admit I was actually just reading a book). Succinctly: it all fell flat. From a purely technical standpoint, the sessions were boring. @anywhere is a repackaging of existing tools to make Twitter integration easier, but it doesn't let you do anything more Twitter-like than you already could. The nytimes example of integration was banal at best.

In terms of strategy, it was almost worse. Consider the monetization approach: ads in search. If you can find the video, you can see Dick Costolo sweating and tugging uncomfortably at his collar as he attempts to explain how promoted tweets are somehow different from ads (in short, they're not). As of now, they only appear in searches from twitter.com. It's easy to see from various public sources that search is in fact a tiny fraction of their traffic. When Google announced AdWords, search was practically 100% of their traffic. The promotion of @anywhere does nothing to improve those numbers. Minutes prior, they had admitted that 75% of their traffic comes from the APIs and that they want to increase that. Whatever their real plan is for monetization, promoted tweets can't be more than a distraction.

Disheartened, I skipped the Q&A session with @ev, where I missed some interesting tidbits on whether or not Twitter was going to eat other people's lunches by launching official in-house versions of various external services. I'm looking at you, twitpic. A stroke of fortune left me sitting at the same table as Alex Macgillivray for dinner that night. He expounded a little more openly on Twitter's approach to media sharing services. Specifically, they have no plans to anoint any one of them as canonical. If anything, they might integrate with all of them and let the user decide which to share through. That said, he left me with a rather sinister comment: "At the end of the day, all these services do is add a link to the end of your tweet, and we're not going to prevent that." It was meant to sound reassuring, but it makes me wonder if they might not attempt to take a bite out of Posterous and Brizzly by building native 'rich' tweets with embedded media as part of the tweet. A year or two earlier, it would have been unthinkable, given the ties to SMS as a delivery method. The growing ubiquity of smartphones has decreased that reliance, making this possibility much more viable.

I skipped out of the evening's activities shortly after that, unwilling and unable to engage in a '24 hour hackday' forgoing sleep and showering. Unfortunately, I may have been one of the few people to decide that. Comments on day 2 to follow later today.
