Category Archives: version control

make vim always start git commits on the first line

Vim has an (oftentimes) nice feature of remembering where in a file you were when you open it again. Which is all well and good, but it’s based on filenames, and git commit messages always use the same filename, so it remembers the line number where I finished my last commit message. I always want git commit messages to open on the first line. Here’s some vimrc magic to make it Do The Right Thing™:

" don't remember the cursor position in git commits
au FileType gitcommit au! BufEnter COMMIT_EDITMSG call setpos('.', [0, 1, 1, 0])

Local development for LuCI

So, you’ve got some LuCI modules packaged up; you can build them via the OpenWRT build root, you can install them with opkg, and you can edit the final files in /usr/lib/lua/luci/{controller,views,etc} and have the changes take effect immediately. But you don’t like working on the target router: you don’t get to use the editors you want, and it’s hard to integrate with your version control to see what you’ve been doing. You’ve seen this page about a “local development environment”, but it didn’t help as much as you had hoped?

Here are a few things that worked for me, based on lots of advice and snippets from jow_laptop and some experimentation.

First, I’m assuming you’ve got luci trunk checked out; I’ll refer to that directory as [LUCI_DIR]. I’m also assuming that your working luci package, with its OpenWRT opkg Makefile and its luasrc and htdocs directories, is in [YOUR_PKG].

Now, the opkg Makefile that does a great job of building your package normally isn’t going to cut it for running inside luci. Make a new directory, [LUCI_DIR]/applications/your_name, and add a Makefile with the following contents:

include ../../build/
include ../../build/

That’s it. Now you need the luasrc, htdocs, etc. directories from your project. This is a touch tedious…

  1. [LUCI_DIR]/applications/your_name$ ln -s [YOUR_PKG]/luasrc
  2. [LUCI_DIR]/applications/your_name$ ln -s [YOUR_PKG]/htdocs
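The whole setup can be condensed into a few lines. This is a sketch only: it uses throwaway temp directories as stand-ins for [LUCI_DIR] and [YOUR_PKG], and it elides writing the two-line Makefile from above.

```shell
set -e
LUCI_DIR=$(mktemp -d)    # stand-in for your luci checkout
YOUR_PKG=$(mktemp -d)    # stand-in for your package checkout
mkdir -p "$YOUR_PKG/luasrc" "$YOUR_PKG/htdocs"

# the new application directory inside the luci tree
APP="$LUCI_DIR/applications/your_name"
mkdir -p "$APP"
# (write the two-line Makefile from above into "$APP/Makefile" here)

# symlink your package's source directories into the luci tree
ln -s "$YOUR_PKG/luasrc" "$APP/luasrc"
ln -s "$YOUR_PKG/htdocs" "$APP/htdocs"
ls -l "$APP"
```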

At this point, [LUCI_DIR]$ make runuhttpd will churn away, copying things all over the place, and you should have your package working locally.

I had problems where controllers were simply not detected, with the page’s “entry” links simply returning a 404, “No page is registered at blah/wop”. As best I can tell, this was a problem in the fast indexer; editing [LUCI_DIR]/libs/web/dispatcher.lua, in the createindex function, to use plain indexing instead of fast indexing made all my controllers show up properly. (The controllers all worked perfectly on the target, mind you.)

So, with this, I don’t have to fiddle around with sshfs, and I can easily use things like git diff to see what I’ve actually done recently. However, because of the way make runuhttpd works, copying all the files first into a dist directory and then into the [LUCI_DIR]/host/ tree, you have to stop and restart the server after every change. A project for another day….

Combining two git repositories into one

I went down some dead ends trying this, but it was pretty easy in the end. Here’s how I did it, since my search keywords didn’t turn up what I was looking for at first.

I have two repositories, AAA and BBB, and I want them to end up combined into one, with each original project’s files moved down into a subdirectory of its own:

    AAA/
        aaa-oldAAA/    (everything from the original AAA)
        bbb-oldBBB/    (everything from the original BBB)

Or something along those lines anyway. Also, I don’t want to keep using the old repositories, but I most definitely do want to see the full history of each file.

I chose AAA to be the new final parent, so I started by moving all the original AAA files down a level. This was straightforward:

    cd AAA
    mkdir aaa-oldAAA
    git mv x y z aaa-oldAAA
    git commit

I then did the same thing in the BBB repository.
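The same steps, run against a throwaway stand-in for BBB (the file names and the bbb-oldBBB directory name here are made up), which also shows the history surviving the move:

```shell
set -e
cd "$(mktemp -d)"
git init -q BBB && cd BBB
git config user.email demo@example.com
git config user.name demo

# a couple of placeholder files standing in for BBB's real contents
echo one > x; echo two > y
git add . && git commit -qm "initial BBB files"

# move everything down a level, just like in AAA
mkdir bbb-oldBBB
git mv x y bbb-oldBBB
git commit -qm "move BBB files down into bbb-oldBBB/"

# the pre-move history is still reachable through the rename
git log --follow --oneline -- bbb-oldBBB/x
```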

Now, the fun part: adding the other repository as a remote in this repository.

    cd AAA
    git remote add bbb-upstream /full/path/to/old/BBB
    git fetch bbb-upstream
    git checkout -b bbb-u-branch bbb-upstream/master

This is pretty neat. Right here, you’re only looking at the code from the BBB repository! If you git checkout master again, you’re back looking at your AAA/aaa-oldAAA repository. This makes sense when you think about it: nothing says that branches have to contain the same, or even related, code.

Now, we just merge the bbb-u-branch into our local master!

    git checkout master
    git merge bbb-u-branch
    git remote rm bbb-upstream # no longer needed

Presto! Finito! The only problem I had was a merge conflict in the .gitignore files.
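One wrinkle on newer Git: since version 2.9, git merge refuses to combine two unrelated histories unless you pass --allow-unrelated-histories. A self-contained toy run of the whole procedure (repo and file names are made up):

```shell
set -e
cd "$(mktemp -d)"

# build two tiny unrelated repositories
for r in AAA BBB; do
  git -c init.defaultBranch=master init -q "$r"
  git -C "$r" config user.email demo@example.com
  git -C "$r" config user.name demo
done
(cd AAA && mkdir aaa-oldAAA && echo a > aaa-oldAAA/file-a \
  && git add . && git commit -qm "AAA files")
(cd BBB && mkdir bbb-oldBBB && echo b > bbb-oldBBB/file-b \
  && git add . && git commit -qm "BBB files")

# fetch BBB into AAA and merge the two root histories
cd AAA
git remote add bbb-upstream ../BBB
git fetch -q bbb-upstream
git merge --allow-unrelated-histories -m "merge in BBB" bbb-upstream/master

ls    # aaa-oldAAA and bbb-oldBBB now live side by side
```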

Note: to see the logs of files that have been moved, you need to use git log --follow filename.blah. This has nothing to do with the two-repository merge, however.

Contributing back to Open Source, trials and tribulations

So, back in March 2010, I started using the xbee python project. It turned out not to do much of what I needed, really, but it was a good base. The project seemed pretty dead in the water, but I fixed a bug or two, added some logging, and filed a bug with the diffs attached anyway. It wasn’t much, and it wasn’t pretty, but hey, if the project was alive, something would happen, right?

And besides, it was now all working just fine for my needs.

Fast forward a few months, and the project has been resurrected by a third party. The code has been almost completely rewritten, and seems to have much fuller support for all the esoteric options I hadn’t needed. Oh, and the reply sent to my bug report never reached me, so I didn’t see any of this until just recently.

Of course, the released version _still_ didn’t have some rather important features I’d added (they’re in trunk), and because of the rewrite, my diffs were now completely useless. And of course, the API had changed.

So where do I go now? Do I toss my local fork, resync with the “master” and try and get back on the train with the current open source base? Ignore it, and stay with my functional, working, battle tested, but feature limited version? (To be clear, my work mostly requires working with just escaped API mode, tx/rx only. The current xbee python project also supports transparent uart, remote AT commands, and remote IO transfer and ADC readings)

Seems like a bit of a loss all around really :( Should I have tried harder to get in contact with the original developer to get my patches in earlier? How much time should you really spend trying to patch open source software and how much time should you spend using it?

I’m well aware of the massive cost of maintaining local forks, if you actually do want the new features, but sometimes it just doesn’t seem worth it :(

FWIW, my version is still at github, and still very much in use here at home.

Version control of tools

Over the years, this has come up on and off again. Some people fervently say that tools should be kept in version control as well; some say that’s ridiculous over-management, and that many tools are very unhappy in version control anyway. For instance, try keeping a working version of Visual Studio in your version control system[1]. Or even just try keeping different versions of a perl library from CPAN around.

I’ve normally leant somewhere in the middle. Writing C? You probably want your compiler versioned. Nothing worse than an OS upgrade providing a new version of GCC and finding that either a) it doesn’t compile any more, or b) worse still, some obscure “feature” has changed or been fixed, and your code compiles but doesn’t _work_ any more.

Writing Java? You probably want to always use the newest JDK. Writing perl? Well, who knows….

Anyway, I ran into this problem recently with exiftool, or Image::ExifTool, depending on how you use it. Some changes/corrections in the library broke a lot of my photo gallery generation scripts. I did want some of the things in the newer versions of exiftool, like extended support for the MakerNotes for my new(ish) Canon 500D, and more robust parsing and writing of XMP.

What I didn’t want was changes to the IPTC print format (Urgency was now returning “0 (reserved)” instead of “0”, so I now needed Urgency# to disable print formatting). What I didn’t want was apparent changes to the relative priority of IPTC and XMP. What I wanted was my tools under version control.

Or more accurately, better acceptance testing of new tools before allowing new versions to be installed. But really, this is for home use. Do I really want to have a test suite run before I install any new tools? Hell No. But can I afford to let untested tools near my photos? No again.
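For what it’s worth, one low-ceremony way to keep “tools under version control” at home is a directory per tool version, selected via PATH, so each script can pin the version it was tested against. A toy sketch; the exiftool version numbers, the layout, and the stub scripts are all made up:

```shell
set -e
TOOLS=$(mktemp -d)    # stand-in for a versioned tools tree

# fake two installed versions of exiftool, each in its own directory
for v in 8.60 9.04; do
  mkdir -p "$TOOLS/exiftool-$v/bin"
  printf '#!/bin/sh\necho %s\n' "$v" > "$TOOLS/exiftool-$v/bin/exiftool"
  chmod +x "$TOOLS/exiftool-$v/bin/exiftool"
done

# a script picks the version it was tested against by prefixing PATH
PATH="$TOOLS/exiftool-8.60/bin:$PATH" exiftool    # prints 8.60
```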

Lose-lose for the consumer. Like normal.

[1] Although, really, your version-controlled tools should not require human intervention to build, so if you use Visual Studio to do the actual real builds of your system, you probably have other problems.

Version Control of scriptlets/single file programs

I have my ~/bin directory under version control, which is fine and all well and good. But some of those scriptlets I’d like to make public and share on github, or somewhere similar. Then they could get tickets opened and closed, and better(?) documentation. But version control systems seem to consider a directory to be the smallest manageable unit.

How does one reconcile these? I don’t want all my scriptlets public. (Some of them are both far too ugly and contain far too much hardcoding.) But I don’t want a directory for each scriptlet, and I don’t want to have to manage paths. I DO still want all scriptlets under version control. Essentially, these are _all_ unique software “projects”, but with just a single file per utility it seems heavy-handed to need an install wrapper to put each one into a single bin dir for actual usage. And having to have a separate repository for each one?

I can have a public and a private repository, but how does one “promote” code from private to public? (I’m only really interested in svn/git/hg.) Gists at github seem to offer versioned single files, but how can they easily be integrated into my ~/bin directory? I want some sort of easy way of saying “update all these utilities”.

Really, how does one manage this at all? Or do we all just stick it in a single directory, and live with the cruft?
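One possible arrangement, sketched with throwaway paths (this is not a settled answer to the question above, and every name here is hypothetical): keep the public scriptlets in their own, publishable repo, and symlink them into the one flat ~/bin next to the private ones. “Update all these utilities” then becomes a single git pull in the public clone.

```shell
set -e
H=$(mktemp -d)    # stand-in for $HOME
mkdir -p "$H/bin" "$H/src/public-scriptlets"

# a publishable scriptlet lives in its own repo directory...
printf '#!/bin/sh\necho hello\n' > "$H/src/public-scriptlets/greet"
chmod +x "$H/src/public-scriptlets/greet"

# ...and is symlinked into the single flat bin directory,
# so nothing about day-to-day usage changes
ln -s "$H/src/public-scriptlets/greet" "$H/bin/greet"
"$H/bin/greet"    # prints hello
```

The private scriptlets stay directly in ~/bin under its existing version control; only the symlinked ones ever leave the house.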