Why you don't want co-located branches (bzr)

blog entry posted by lalo (Lalo Martins) on 2012-10-21 15:00:00


Having been recently stuck with git at work, I had a momentary lapse this morning, and convinced myself I needed co-located branches (git-style) to work on my game, which is of course under bzr as any project maintained by sane people should be.

It turns out I didn't, and nobody, ever, does. Here's what to do instead.

Why you think you need it

The most common argument for colo is a source tree for a large compiled project, where you have hundreds of .o files you don't want to recompile every time you change branches.

In our case, we keep the generated player avatars (.png) inside the source tree, which, to be honest, is not a very brilliant design, but it's also not quite a high priority to fix right now.

Now, based on these arguments, some very clever minds have been hard at work adding colo support to bzr for the last few years; it's been a core feature since 2.5.0 (2012-02-24).

... but you don't!

Looking at these arguments as someone who understands the bzr model well, a catch soon becomes apparent: these aren't, in any way, arguments for co-located branches. They are, rather, arguments for co-located trees (one working tree shared across branches), and that is something bzr already does quite well. (Or rather, something it doesn't need special support for at all.)

You see, in bzr a working tree lives alongside its branch only by default; the two are separate things and can be split apart. The second greatest argument for bzr is that it allows you to work your way, and the power that unleashes is often underestimated.

(In case you're curious and haven't yet heard this rant from me, the #1 greatest argument is that the model is sane and based on a good understanding of version control and real-life workflows, as opposed to the snapshotty-hashy-hacky of certain other VCSes. But let's not digress too much.)

The how-to

1: Put your branches somewhere; a shared repository is preferable, as it saves you a lot of disk space and (if it's going to be remote) network traffic.

$ cd /somewhere/bzr-repos
$ mkdir my-project
$ bzr init-repo --no-trees my-project   # branches inside this repository share storage and get no working trees
$ cd /where-your-sources-are/my-project
$ bzr push /somewhere/bzr-repos/my-project/trunk   # publish your existing branch as "trunk"
$ bzr branch /somewhere/bzr-repos/my-project/trunk /somewhere/bzr-repos/my-project/my-branch
$ bzr branch /somewhere/bzr-repos/my-project/trunk /somewhere/bzr-repos/my-project/other-branch
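(If you want to double-check that step, bzr info on the repository directory will describe what you just created; since we passed --no-trees, branches made inside it won't get working trees of their own.)

$ bzr info /somewhere/bzr-repos/my-project   # describes the shared repository; no working trees in here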

2a: Use lightweight checkouts. This is best if the branches are local, and will make your switches a little faster. Back up your data first; while this won't destroy anything in the history, it may hose files in the working tree or make your checkout unusable.

$ cd /where-your-sources-are/my-project
$ bzr bind /somewhere/bzr-repos/my-project/trunk   # turn this standalone branch into a checkout of trunk
$ bzr reconfigure --lightweight-checkout   # drop the local branch storage; only the working tree stays here
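One nice property of the lightweight flavour, worth spelling out: the checkout keeps no history of its own, so a commit goes straight to whichever branch you're currently switched to. Something like this (the message is obviously made up):

$ bzr commit -m "tweak the avatar generator"   # lands directly on the branch this checkout points at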

2b: Use heavyweight (regular) checkouts. This is better if you're working directly with remote branches, as you can then work offline; if you're always online, it essentially trades some disk space for bandwidth, so take your pick.

$ cd /where-your-sources-are/my-project
$ bzr bind /somewhere/bzr-repos/my-project/trunk
$ bzr reconfigure --checkout # probably unnecessary, unless it was lightweight before
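The offline bit is the whole point of 2b. If I remember the bound-branch workflow correctly, it goes roughly like this: --local records a revision without contacting the master branch, and update re-syncs the checkout once you're connected again.

$ bzr commit --local -m "work done on the train"   # recorded in the local branch only; the bound location isn't contacted
$ bzr update                                       # back online: bring the checkout up to date with the master branch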

3: Happy branch-switching:

$ cd /where-your-sources-are/my-project
$ bzr switch my-branch
$ (do stuff)
$ bzr switch other-branch
$ (do stuff)
$ bzr switch trunk
$ bzr merge my-branch

etc. The short branch names work because bzr resolves them relative to the current branch's location (here, the shared repository), rather than relative to the working tree.
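If the short names ever feel too magical, spelling out the full location does exactly the same thing:

$ bzr switch /somewhere/bzr-repos/my-project/other-branch   # same as "bzr switch other-branch" from this checkout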

4: Create new branch:

$ cd /where-your-sources-are/my-project
$ bzr switch -b a-new-branch   # creates the new branch alongside the current one and switches to it

5: Delete old branches:

$ cd /where-your-sources-are/my-project
$ bzr switch trunk
$ bzr merge my-branch
$ bzr rmbranch my-branch   # the branch is gone, but its revisions stay safe in the shared repository

Afterword

Since the Bazaar developers are, in fact, a clever bunch, the 2.5.0+ co-located branches do pretty much exactly this, except that the branch storage is hidden inside the .bzr dir of your tree. So if you still want to do it "the git way", sure, go ahead and do it. In a non-bound branch, bzr switch -b new-branch will set up your branch and tree for co-located work, and create "new-branch" co-located with where you are.

The benefit of doing it explicitly, as I describe here (apart from the fact that it worked before 2.5.0, but I'm 8 months too late for that argument), is that you keep the best of both worlds: you can have co-located checkouts, but you can also easily have more than one checkout. For example (and this would be awesome at work): a "wip" checkout normally bound to the feature branch you're currently working on, and a "bugfix" checkout bound to trunk or to your "maintenance" branch. Or, if you're a gatekeeper, those same two plus a third "master" checkout for merging submissions (including those from your own feature branches, if you're so inclined).
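For concreteness, here's roughly what that two-checkout setup could look like; "feature-x" and "feature-y" are made-up branch names, and the checkouts can live wherever you like:

$ bzr checkout /somewhere/bzr-repos/my-project/feature-x wip    # "wip": a checkout bound to the feature branch of the moment
$ bzr checkout /somewhere/bzr-repos/my-project/trunk bugfix     # "bugfix": a second checkout, bound to trunk
$ cd wip
$ bzr switch feature-y                                          # when you move on, re-point wip at the next feature branch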

WARNING: unit tests and TDD do NOT eliminate defects

blog entry posted by lalo (Lalo Martins) on 2012-01-15 14:08:00


Here's an excellent article about why you should be doing Test-Driven Development.

No, really, it's excellent; go there and read it, then come back here.

A little harsh, isn't it? But very true. It's excellent.

However, something in it made me a little uncomfortable while reading, and it wasn't too hard to figure out what.

There are a lot of people out there under the misconception that unit tests and TDD are a QA method, and that if they do it right, their software will have no defects (or “bugs”). That's a dangerous misconception. It's bad for your software, because it won't work; and it's bad for TDD, because when it blows up in your face, there's a pretty good chance you'll go out there telling other people that TDD doesn't work. It does work; and it probably did work for you. It just didn't do what you were mistakenly expecting it to do.

Now, if you will, go back to the article and search for any instance where Uncle Bob tells you TDD will make your software defect-free. He never claims that. The closest he comes is saying “your software will work better”, which is true; TDD reduces bugs a lot, but most TDD champions (at least the ones who know what they're talking about) consider that a nice side-effect at best. (So if he doesn't make the wrong claim, why am I uncomfortable with the article? Because I can easily see proponents of the “TDD as QA” misconception misusing Uncle Bob's article as proof that they're right.)

TDD is not a QA tool. TDD is a development process, I'll even say a programming process. Its main benefits are, in order of (IMO) importance and relevance:

  1. Clearer and cleaner design. I'm talking about technical, architectural design, not visual. By forcing yourself to write down what you expect the software to do in a formal language (code), you come out with a clearer idea of what you're going to do; and by designing your internal APIs so that they can be easily called by unit tests, you end up with more modular and maintainable structures.
  2. Cleaner code. I've seen people whose unit tests are confusing but production code is crystal-clear. That's obviously not ideal, but it's much better than confusing production code. By focusing most of the effort in writing the test (therefore understanding what you're doing) and then writing the simplest code that makes the test pass, you make it harder to write convoluted code. (Harder, not impossible.)
  3. More confidence. Once you've written the test and you're confident that the test expresses the problem, you'll understand exactly what the solution is, and later after the code is written and deployed, you'll trust your old code a lot more.
  4. More reuse. To be honest, this isn't even about writing the test first, but in fact there's a step that often comes before writing the test: looking at the appropriate test file, reading the other tests, and checking if what you want is already there. (Because, you know, you need to find the right file in the tree and the right place in the file to add your test.) If there's something that does almost exactly what you want, and that you had never seen before, you'll write your new test and modify the existing functionality. If there's something that does exactly what you want, you save time and don't increase the code complexity.
  5. Faster. This is almost always difficult to claim, but it really does stand to reason. Think about the other benefits above; they alone make your coding a lot faster already, enough to offset the time you spend reading and writing tests. You'll end up writing less code, because you know exactly what you need and you won't write fluff. You'll end up rewriting your code less as you iterate, because writing the test made the solution clear to you. Writing code is much like the scientific method; you come up with a working hypothesis, check if it works, adapt as necessary. It might feel like we spend most of our time (in the non-TDD world) writing code, but in reality we spend most of our time figuring out stuff, followed by checking or rewriting code. Clearer code reduces time spent on the former, and writing your verification first as code reduces the latter.
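To make the rhythm behind points 1 and 2 concrete, here's a minimal sketch of one red-green cycle; it assumes a Python project with pytest available, and the slugify example is mine, not Uncle Bob's:

$ cat > test_slug.py <<'EOF'
def test_slugify_lowercases_and_joins_with_dashes():
    from slug import slugify              # doesn't exist yet; that's the point
    assert slugify("Hello World") == "hello-world"
EOF
$ python -m pytest test_slug.py           # red: fails, there is no slug module yet
$ cat > slug.py <<'EOF'
def slugify(text):
    # the simplest code that makes the test pass
    return "-".join(text.lower().split())
EOF
$ python -m pytest test_slug.py           # green

The toy function isn't the point; the point is that the test made me decide on the name, the argument, and the expected behaviour before a single line of production code existed.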

As a nice side-effect, TDD also reduces defects. It does that by (a) making the design and structure cleaner and clearer; (b) making the code cleaner, therefore easier to work with later; (c) encouraging the programmer to think about the problem being solved and write “the right code”. See a pattern? And yes, (d) preventing regressions on the unit level by keeping the unit tests around to run later. But let's be honest: how many regressions are at the unit level? If your answer wasn't “very few”, there might be something else wrong with your process.

Now here are a few reasons why TDD will not take you to the magical no-bug land:

Conclusion: TDD is great for developers and you should use it everywhere. But it's not a QA strategy.

On aggregators “stealing” content

blog entry posted by lalo (Lalo Martins) on 2012-01-06 06:42:00


This is in response to yet another attempt at artificially limiting distribution of information online to protect expired business models, the AP's NewsRight.

I originally wrote it as a rather large rant on Google+, but I guess it's too long for that medium, and probably worth blogging.


Aggregators provide a hugely important service, both to me and to you. These days, information is global.

It used to be the case that I'd be more likely to get my information from a local outlet; a paper published in the town or city where I live, or maybe a local TV station. These would often republish stories written somewhere else, and there was a very well thought-out system for them to pay for this.

Now I have access to information from the whole world. But that means there's way too much of it out there. Attention and “eyeballs” have become a more scarce and precious resource than content. Why would I read your article, rather than someone else's, or even spend my time playing games or writing fiction? I have precious little time, and it's mathematically impossible to read everything written every day that could be interesting to me.

Then there's management and economic theory. The “new wave” of theory today is “consumer delight”. It used to be the case that most businesses defined their goals as “providing what their customer needs”. Then at some point in the 20th century the thinking changed to “making money”. Then in the 70s it changed to “creating shareholder value”. Some very smart people today are saying those goals are destructive, to the economy in general, to the customer, and to your own ability to compete. The idea is that the ultimate goal of a business is to not only provide what the consumer needs, but to do it with as much excellence as you can afford; the money you make is a means, a part of the process, necessary to sustain the business and the people, and not the ultimate goal.

From that angle, your ultimate goal is to write the best story, and your ultimate metrics of success are, second, that it gets read as widely as possible, and, first and foremost, that the people who read it get the most value out of it.

Therefore the concern at the center of your business is how stories get produced; that is where good practices need to be preserved, new things need to be tried, and optimisations made. The concern of how to get compensated is necessary but secondary, and that means it should be an option at any time to rethink the business model, turn it upside down even, if that's the best for the primary goal.

Back to aggregators then: how am I supposed to know about your publication? If once every two or three months (and that's being generous) you publish an article that's the absolute best about a topic I'm interested in, am I supposed to visit your website every day just because that chance exists? That would mean visiting dozens of websites every day to get my news. I'm more likely to go with a smaller number of sites that have inferior articles but a better average.

Aggregators are there to save both of us: if I can find a good aggregator that picks those good articles from you, that's great, because it's probably the only way that article will make its way to me; you get read, and I get better information.

Now, that is currently a problem, because your model for compensation depends on people visiting your site. Can you see my point of view, that in light of all this, the thing that needs to be fixed is your compensation model? That the compensation model is the one weak link here, the one thing that is clearly wrong?

It's like the debate about how much profit is lost because people download music and movies. The reality is almost none, because those people fall into four groups: (a) most of them, who wouldn't have bought the content anyway; (b) those who already bought it and want it in a different format; (c) those who download, taste, and then go ahead and buy; and (d) the very few who would have bought it if they couldn't download. So for the majority, it's not a case of buying or downloading, but rather downloading or ignoring.

In the case of news it's not a choice of aggregators or going to the source, it's aggregators or not hearing about the article at all. So from the point of view of the aggregators, you should be paying them for getting your article to the right eyeballs out there. (Which of course is also preposterous, because before you can pay the aggregators for that service, you need to make money somehow, and it's in their best interest to help you figure out how, and help you implement whatever solution turns out to be practical.)

The future of serialised live-action sci-fi

blog entry posted by lalo (Lalo Martins) on 2011-05-19 10:22:00


So, that happened. V wasn't renewed. No Ordinary Family wasn't renewed. Caprica was cancelled. Stargate Universe was cancelled and now there's no Stargate show or movie in development. Smallville ended. Of those, only Smallville lasted more than two seasons. Syfy, formerly (but no longer) known as The Sci-Fi Channel, has one sci-fi show on air and one in production. I look at the list of sci-fi shows I'm following and I see two titles: Doctor Who, which is alive and well but produced by the BBC, which works based on a very different set of rules, and Pioneer One, a show for which the word “independent” would be an excessively modest description. Oh yeah, and let's not forget Star Trek: Phase II.

Was “the new age of sci-fi” just a fad, and already over?

I don't think so. But still, the times immediately ahead might be grim.

As I see it, V, Caprica and SGU all suffered “Firefly cancellations”. The shows had a fanbase and a following, and (I'm not sure about Caprica, but certainly for the other two) one large enough to sustain the show. The flaw was in the business model.

The thing is, network TV shows are funded by advertisement. And advertisers pay based on Nielsen viewing figures. If I understand it correctly, based on a recent post by a SyFy executive, they specifically buy based on the 18-49 segment of the “L+7” figures, which means the people aged 18 to 49 who have either watched it live or via some sort of tracked DVR in the next 7 days. (I wonder what sorts of DVR are tracked. Tivo?)

There are a number of problems with that, because the prime sci-fi target audience is in some ways ahead of the curve:

  1. Some of us watch live, I'm sure; personally, I know maybe 2 or 3 people who do. The rest of us will DVR, and we'll use a variety of DVR solutions, most of which I'm sure won't be tracked. We'll download, if we have to. We'll use online streaming (legal if there is one, pirate if we must). Many of us will even wait for the DVD so a whole (or half) season can be marathoned in one go.
  2. It's a global world, and geeks, especially sci-fi geeks, are a little more global than average (so say we all). It's insane that the business model depends exclusively on the U.S. audience. Traditional licensing deals have months' worth of gap, by which time most serious geeks will already have downloaded it (the day it aired) and watched it. What BBC America is doing for Who might be the beginning of killing this issue, but it's baby steps, because the important thing is to include the world in the production of American shows, and not to include America in the production of non-American shows.
  3. We're generally more tech-savvy and internet-centric, so again, we'll often stream or download even when we do have access to watching it live or DVRing, because it's more convenient.
  4. Counting downloads isn't a solution either, because downloads, especially pirate ones, cut off the advertisement (and if they didn't, viewers would skip them anyway). So the whole advertisement model may not be viable to begin with. Ads as discrete banners on top of the show are one way out: they help pay for the show and give us extra incentive to buy DVDs/Blu-rays. And product placement, of course, even though it's complicated to get a can of Pepsi on, say, Caprica.
  5. Targeting the wrong audience not only makes it hard to fund the show, it also harms the quality of the show itself, if the writers are writing for the wrong audience and the actors are acting for the wrong audience.

It should also be pointed out that geeks, and again especially sci-fi geeks, have (on average) more disposable income than many other audiences; further, we're more passionate about what we like (that's a core part of the definition of geek), and we're famously willing to spend that income on those passions. If you take too long to sell your show on DVD, by the time we buy it, we'll already have action figures, pins, t-shirts, and a coffee mug to keep the box company.

Why is Doctor Who doing so well? Partially because what decides its success are the UK figures, and the show is hugely popular over there, even with non-geeks. Partially because it's actually not doing that well: based on sheer L+7 percentage versus production cost, it could be facing cancellation if it were a U.S. show. But it's made by the BBC (and more precisely by BBC Wales), and it doesn't hinge on advertisement to continue existing; the majority of the BBC's budget comes from TV licenses, and while spending from that is still to a great extent a function of figures, popularity also counts a lot. And it does quite well with merchandise; in fact, it was a profitable business even when the show was not airing (from '89 to 2005).

Serialised live-action sci-fi wasn't born on TV. The form was born, along with live-action sci-fi as a whole, in the age of film serials, more precisely with the Flash Gordon serial in 1936. Before TV became a common thing, sci-fi film serials were hugely popular, and in fact Star Wars was conceived as a homage to those (just as Indiana Jones was a tribute to the other big film serial genre, the pulp-based adventure). And Star Wars was the beginning of the modern sci-fi blockbuster, so there's definitely a pedigree there.

(And why do I emphasise “live-action”? Because sci-fi proper started as a serial. Jules Verne wrote in the age of serial novels, which would be published in a bi-weekly magazine. H.G. Wells wrote serials too. Then along came comic books, which are serial by nature. And of course let's not forget animation, especially anime. Serialisation and sci-fi have a long history.)

But my point was, the transition from film serial to TV wasn't smooth. Again, Flash Gordon (54) was a big part of it, but most agree the turning point where TV sci-fi found its footing was the “holy triad” of adult shows — Science Fiction Theatre (55-57), The Twilight Zone (59-64) and The Outer Limits (63-65). Then came the popular, all-audience shows, like Lost in Space and, of course, Trek. We tend to forget how rough that transition was because it happened long ago, and not that far after the beginning of the film serial era (compare 36 to 59, against 59 to, optimistically, 2010). But it was rough. And one of the reasons it was rough is that the business model was different; film serials were funded by ticket sales, TV shows by advertisement. The advertisement model wasn't new (radio serials had been doing that for a while), but adapting it at the same time to a new medium and to the very specific characteristics of the sci-fi audience wasn't trivial.

And this is what I think we're looking at. It's time for a change of business model. And I don't think the big studios are likely to lead that, because they're tied to their ways and their existing contracts (just like the film serial studios didn't rush to make TV shows in the 50s).

Maybe it's time for us to start producing our own series. Maybe in 20 years we'll look back and point to Pioneer One and Star Trek: Phase II as the beginning of this third era of serial live-action sci-fi.

Disclaimer: I am in fact producing one. Read that as you like: shameless self-promotion, putting my money where my mouth is, knowing what I'm talking about, having an agenda, maybe even this post being the reasoning behind the project, or a combination of all these.

Steampunk theme for Maemo/N900

blog entry posted by lalo (Lalo Martins) on 2010-09-20 23:23:00


So, this post is here mostly for the benefit of web searchers :-) I googled for a steampunk widget theme for my N900 and found nothing. I tried “steampunk theme n900”, then replaced n900 with maemo, hildon, fremantle... nothing.

Then I tried a few promising-sounding ones and, by trial and error and luck, finally got to IivilSteel Black And Gold, which just rocks.

So if you got here while searching for the same thing, now you know.

(I guess now I need to go make a tag icon for steampunk... maybe one for the n900 too)
