Monday, April 14, 2014

Codebits 2014 - 3 days of fun

Wherein I spend three days demo'ing the Oculus Rift, hacking on a portable VR rig with a Raspberry Pi, riding RiftCycles, and mobilizing the entire on-call medical emergency and firefighting staff due to an extremely nuclear chili experience (rumours of my demise were greatly exaggerated).

This year our usual group occupied the usual couple of tables at Codebits and split up into three projects. Pew Pew Pew! was an attempt at building a portable VR first-person-shooter experience with an Oculus Rift, a Kinect and a Raspberry Pi. Wolf of Codebits was a stock exchange built on top of the Meo Wallet infrastructure, using the "money" that was distributed to everyone at Codebits for testing. And Nelo, the winner of the event's top prize, was a knee lock for polio patients to replace the metal harness they traditionally have to use, built with free and open technology like Arduino, Bitalino sensors and 3D printing, and based on the idea of a Chinese finger trap.



It was awesome fun, as it usually is, even though I spent a lot of time cursing at SD cards, and the Pew Pew Pew! project, which I did with Bruno Rodrigues, didn't end up fulfilling all its goals. Portability was the primary one: getting a Raspberry Pi connected to the Oculus Rift, both feeding off a portable USB battery, so that the whole thing could be stuffed in pockets and the user could have freedom of movement without worrying that he might drag a laptop with him if he turned too much or moved too far.
Bruno killing some critters with the Raspberry and the Oculus control module in his pockets
It turns out that the Oculus sucks so little power that the USB batteries we had would turn off because they thought they weren't in use... So instead of using two batteries - one for the Raspi and one for the Oculus - we used one for both, so that the Raspi would ensure that the battery would not turn off.

We managed to get the whole thing portable and Quake compiled on the Raspberry before the SD card troubles started and killed off the remainder of our schedule; we ended up spending most of the time replacing cards, reinstalling Raspbian and trying to get things up and running again. We did manage to do a presentation in the end to show off the concept, Bruno going up on stage, pockets stuffed with cables and boxes, to show off the rig fully portable and running. So now you can guess what I'm going to be working on for the next few days ;)



Congratulations are in order to everyone at the organization for putting together another amazing event, and to everyone that managed to pull together a project while being constantly distracted by all the awesome stuff going on around them! And a special congrats to the Nelo team for pulling off such an amazing idea and stealing the show! Now I wish I were in Portugal more often to play with the Bee 3D printer that they won :-P



Update: A lot of other things happened at Codebits, to wit: RiftCycles (http://fb.me/32NIS6Jfw), Nuclear Chili experience (http://www.youtube.com/watch?v=khXNcgwI0ic), talks and workshops, Presentation Karaoke (where you have no idea what the next slide is going to have), the Amazing Quiz Show (wherein we learn what 2002::/32 is), Retrocomputing (where a bunch of people have fun with old consoles and computers, including my ZX Spectrum), and so much more!

Friday, March 08, 2013

Formatting git patches for partially transplanting a repository

So I wanted to move a subdirectory inside a git repository into its own repo, keeping all the history of the changes in the original repository in the new one. With git, copying partial history around is as easy as using git format-patch --root -o directory/to/put/patches/in -- path/to/subdirectory, which will create a numbered patch file for every commit that touched the subdirectory in question. Applying all the patches in the new repository is just a question of doing git am -3 *.patch.
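As a toy end-to-end run (scratch directory, demo identities, and a made-up sub/ directory, so paste-safe), the round trip looks like this:

```shell
# Build a scratch "orig" repo with one commit outside sub/ and one inside.
set -e
work=$(mktemp -d) && cd "$work"

git init -q orig && cd orig
git config user.name demo && git config user.email demo@example.com
git commit -q --allow-empty -m 'unrelated root commit'
mkdir sub && echo hello >sub/greeting.txt
git add sub && git commit -q -m 'add sub'

# One numbered patch per commit that touched sub/ ...
git format-patch --root -o ../patches -- sub

# ... replayed, history and all, into a brand-new repository.
cd .. && git init -q new && cd new
git config user.name demo && git config user.email demo@example.com
git am -3 ../patches/*.patch
```

Only the 'add sub' commit comes across: the root commit never touched sub/, so no patch is generated for it.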

The problem is, format-patch skips merge commits, which means that there might be missing changes in the patches, which sorta makes things not work.

The alternative way is then to do git log --pretty=email, which outputs a commit in the same format and actually handles merge commits properly. But, of course, I need to do that for every commit that I want to export (and there's a bunch), and I hate doing things by hand.

To that effect, here's a few lines that do the job properly, exporting a list of commit hashes in the proper order and then going through them one by one and exporting each one to a directory, numbered appropriately so they're correctly sorted:


Export the list of interesting commits in the correct order (older to newer)

git log --oneline --reverse -- path/to/subdirectory|cut -d' ' -f1>../patches/list

Create a patch file for each commit on the list

c=`wc -l < ../patches/list`;for j in $(seq 1 $c);do n=`printf "%04d" $j` && a=`head -n $j ../patches/list|tail -1` && git log -p --pretty=email --stat -m --first-parent $a~1..$a -- path/to/subdirectory >../patches/new/$n-$a.patch;done

Apply all the patches in the new git repository

git am -3 -p3 ../patches/new/*.patch

The subdirectory I'm taking things out of is 3 levels deep in the original repository, and I don't want to keep the parent directories, so I'm passing -p3 to have git (and the patching process) remove them when applying.
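If you'd rather not squint at the one-liner, here's the same loop unrolled into a small function (a sketch of my own: the subdirectory, list file and output directory become parameters, and export_patches is just a name I picked):

```shell
# export_patches <subdir> <listfile> <outdir>:
# write one numbered, email-formatted patch per commit hash listed
# (oldest first) in <listfile>, limited to changes under <subdir>.
# Note: like the one-liner, this assumes no listed hash is the root
# commit (otherwise $hash~1 has no parent to diff against).
export_patches() {
    subdir=$1 list=$2 outdir=$3
    n=0
    while read -r hash; do
        n=$((n + 1))
        git log -p --pretty=email --stat -m --first-parent \
            "$hash~1..$hash" -- "$subdir" \
            >"$outdir/$(printf '%04d' "$n")-$hash.patch"
    done <"$list"
}
```

Called as export_patches path/to/subdirectory ../patches/list ../patches/new, it produces the same numbered patch files.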

If git am fails to apply a patch, it's very likely that the patch is a merge commit whose changes are already applied. I can check this by running patch -p3 < .git/rebase-apply/##, where ## is the failed patch number reported by git am. patch will either apply the change or report that it has already been applied (and ask whether I want to revert it; just say no). If any changes needed applying with patch, I can then add the changed files with git add and run git am --resolved, which will create the commit and continue applying the patches. If there are no changes to apply (the most likely case), I can just skip it with git am --skip and continue.
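The whole recovery dance can be scripted; here's a sketch, where resume_am is my own name for the helper (not a git command) and patch's -N/--forward flag stands in for answering "no" at the revert prompt:

```shell
# resume_am NNNN -- continue after "git am -3 -p3" stops on patch NNNN.
resume_am() {
    if patch -p3 -N -s <".git/rebase-apply/$1"; then
        # patch applied real changes: stage them and let am commit.
        git add -A && git am --resolved
    else
        # changes were already applied (typically a merge commit): skip.
        git am --skip
    fi
}
# e.g.: resume_am 0007
```

The failed patch numbers map directly to the files git am leaves behind under .git/rebase-apply/.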

Monday, February 11, 2013

Gnome Developer Experience Hackfest 2013

The Aftermath

After finally getting rid of a really bad cold, here I am reporting about the DevX hackfest that took place right before FOSDEM, at the Betagroup Coworking Space, a very nice coworking place in Brussels with excellent access and great facilities. The hackfest, organized by Alberto Ruiz (thanks!) and sponsored by the Gnome Foundation, had the goal of improving the application developer experience on the desktop, and lasted for three days, with plenty of discussions on a variety of topics, from tooling and IDEs, documentation, languages, libraries to bundling, distribution and sandboxing (and more).

It was a pretty interesting experience; there were a lot of people participating and, due to the nature of the facilities (i.e., we were all in one room together), a lot of discussions bounced around the room and spilled over from group to group. My goal for the hackfest was to work on (and hopefully finish) the tooling required to create Mono bindings for the Gnome desktop in an automated way, so that packagers and developers can make bindings available for any library that supports gobject-introspection. By the end of the hackfest, the bindings tool (called bindinator) was able to bind Webkit with no user intervention, and with the gstreamer bindings 95% done (two bugs still pending), things are looking good for automated C# bindings.

Between hacking and sneezing, we discussed tooling and IDEs, particularly what an IDE should have in terms of features, and what features a language should have to better support an application development environment. For example, in the case of dynamic languages, a built-in AST is a very good thing to have, since you really want good code completion in your IDE, especially when you're starting on a new platform and aren't comfortable with the available libraries and APIs. Other useful features that went on the list for an IDE: syntax highlighting (a must on any good code editor), a responsive UI, good build infrastructure integration (preferably hiding away the specific build tool details, possibly with its own project format that's independent of any specific build tool - looking at you, autotools), debugger support, and modularization (for user extensibility). And, preferably, being built with the same tools and languages that are recommended for the platform (dogfooding++).

The language discussion was really *the* topic that dominated the three days. There was a lot of back and forth over the merits and demerits of Python, Javascript, Vala, C and C# throughout the days and into the evening activities. Which language would be the easiest to integrate? What tools are available for each? Debuggers are important and hard to do, code completion is harder in some languages than others; if one were to code an IDE from scratch, what language would be better for the UI, the logic, systems integration? Would floating point fuzziness affect someone doing an accounting app? What type of developers are the target, and what type of apps? Widgets and applets that just create a quick UI over existing libraries? Bigger apps? How many developers are there for every language, and how many things are missing and/or need to be fixed in Vala, or Javascript, or any other language? Should there be a single language recommendation? Two languages? All these and more were put forth and discussed extensively, and will probably continue to be discussed over time (as there is rarely a right answer for most of them). No matter how people feel about the decisions that came out of this hackfest, they can be assured that they weren't taken lightly, or without a fight.

All in all, it was a great hackfest, three days of very productive discussions and hacking. Kudos to Alberto Ruiz for a great job organizing everyone, and thank you to the Gnome Foundation and Andrea Veri for the sponsorship and assistance.





Friday, June 29, 2012

Boston, a hackfest

The Mono & Gnome Festival of Love 2012 is in full swing here in Boston, thanks to the wonderfully stubborn David Nielsen, who got everyone together, got us a great room to work in at the Microsoft NERD Center, and secured sponsorship from Fluendo, Xamarin, GNOME and PluralSight.

Day 2 of the hackfest has just finished, and it was quite an eventful day. After a slow start yesterday (particularly for me, as I managed to completely kill OSX so thoroughly that it wouldn't boot and required a full restore (all hail up to date Time Machine backups)), today was a pretty interesting day.

Highlights of the day include a loooong conversation with the gobject-introspection people, determining exactly how broken gir is and how that affects our C# binding generation, loud complaining saved for posterity in trello.com (which we're using to track our tasks and found to be a very neat and useful webapp), watching Google IO in style thanks to the wonderful resources provided by the Microsoft NERD Center (really, their facilities are top notch), generally discussing geeky stuff and the road forward for Mono & Gnome, and having a late dinner and lots of margaritas at the Border Café (which is still one of my favourite places in Boston, naysayers be damned).

All in all, a great hackfest, and we're just getting started!

This post brought to you thanks to our generous sponsors:





Wednesday, May 30, 2012

Looking back, going forward

May 11, a sunny day in my little corner of the world, was my last day at Xamarin. I've spent an amazing 9 months working on Mono for Android, but more than that, Xamarin was a continuation of my work in the Mono team that started in 2006 back at Novell. So, in a sense, this is the end of a cycle.

These past 6 years have been life-changing; I dove into professional open source development head first, worked with an amazing team, met a ton of great people, and learned and did so many things that sometimes it's hard to believe it's only been 6 years. Some projects were successful, some not so much, but nothing was ever routine or mundane. Moonlight was a particularly amazing experience, working with C#, C/C++ and JS inside a browser with bridges and refcounting and all sorts of crazy hacks to build a UI toolkit from scratch, and Mono for Android was an inspiring challenge that taught me more about mobile development than I thought possible. Impossible is not a word that the Mono team use much ;-)

A lot of people have been assuming that, since I'm leaving Xamarin, I'm going to leave Mono development altogether. Rest assured, that's not going to happen. :-) There's a lot of projects I want to support in the Mono world, and the Linux/Mono community definitely needs a bit of a pick-me-up, which is why I'll be taking part in the Mono & Gnome Hackfest that's going to happen in Boston June 26 to July 2.

In the meantime, I'll be taking a bit of a break to recharge batteries and get ready for the new challenges ahead. It's going to be an interesting year! :-D

Tuesday, April 03, 2012

Broken by design, I guess

Sometimes I have to jump through so many hoops just to get something working, I just have to write it down. Especially because I just *know* that somehow, somewhere, some*when*, I'm going to have to do it again. Especially in software that I assume should work out of the box, seeing as it's so popular. Or maybe I just don't do things the "normal" way and it's really just me. *shrug*

Working on Mono for Android, sometimes I can't escape looking at Java. It was only today that I've actually had to build things with something other than ant, so I had to install Eclipse. I really favour command line tools over running an IDE, though, so I went on a hunt to find out how to build Eclipse projects from the command line.

I found a few topics called Headless Building and Batch Compiler, which looked promising. Of course, the executable required for the first is completely missing in my installation (Eclipse Classic 3.7.2), so I tried the ant variant next (some StackOverflow posts pointed to that being a decent solution for this). It immediately crashed with

[apt] Warning: NLS missing message: JdtApt_noWorkspace in: org.eclipse.jdt.apt.core.build.messages
[apt] Warning: NLS missing message: JdtApt_noEclipse in: org.eclipse.jdt.apt.core.build.messages
[apt] Warning: NLS missing message: JdtApt_noStartupJar in: org.eclipse.jdt.apt.core.build.messages

The last one was also an error and the build failed. A quick search on JdtApt_noStartupJar revealed this. Note line 54:

startupJar = new File(file, "startup.jar"); //$NON-NLS-1$

Soooo... startup.jar? There's nothing like that in my eclipse folder. Another quick search reveals this short but enlightening post, which basically says that startup.jar hasn't existed in Eclipse since at least 2009 (*looks at the calendar and sighs*) and that the solution is to replace all references to that file to "$ECLIPSE_HOME/plugins/org.eclipse.equinox.launcher_version.jar".

Since I can't just go and change the paths on that apt plugin thingy, I instead went and symlinked the new file to a startup.jar in the Eclipse directory. Et voilà, things work.
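For the record, the workaround amounts to something like this (a sketch: link_startup_jar is just my name for the helper, and the version stamp in the launcher jar's filename varies per installation, hence the glob):

```shell
# link_startup_jar <eclipse-dir>: point a legacy startup.jar name at
# the modern Equinox launcher jar (version-stamped filename, so glob;
# assumes exactly one launcher jar is present).
link_startup_jar() {
    ln -sf "$1"/plugins/org.eclipse.equinox.launcher_*.jar "$1/startup.jar"
}
# e.g.: link_startup_jar /opt/eclipse   (adjust to your install)
```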

Wednesday, October 12, 2011

OSX, the Air and Recovery Mode, or how to make amazing software

This morning I decided I needed a case-sensitive partition on my MacBook Air. It comes with a nice juicy 250GB SSD and I still have about 140GB left, so, having woken up in an adventurous mood, I open up Disk Utility, peer at the partition, note it doesn't complain at me if I shrink it a bit, so I go ahead and resize it. I do this, of course, without killing any of the 30 tabs open on Chrome, or closing down the 3 server connections and about 30 channels on LimeChat, not to mention the 10 terminal sessions running various scripts and remote shells, or any of the ton of widgets and apps happily fidgeting in the background. Life is good.

The resize finishes with no issues, which of course only encourages me, so I go ahead and create a new partition occupying the space Disk Utility says is free (who am I to argue, I'm sure it can do the math better than I can).

"Error: no disk space left to perform the operation"

Or something to that effect, anyways. I wonder if it might be too early for Disk Utility. You know, math this early in the morning, tricky. I reboot, because that always fixes things, right? After a few seconds (yes, SSD is that awesome) of fretting about whether I still have a working Air or whether I'm now Airless, it boots. Disk Utility isn't fooled, though, it continues to complain that the space it has free isn't big enough to create a partition.

It's at this point that my brain kicks in and I run verify on the drive and on the startup partition. Just because the resize finished with no errors, that doesn't mean it didn't actually screw things up, leaving behind a trail of dead bytes all over my drive. It just means it was sneaky about it. A bit like coming home and finding the cat nicely tucked away on her beanbag like a good obedient little kitty, but having the sofa all covered in cat hair. And feeling warm to the touch. As if a certain fur ball had just leaped off of it and onto the beanbag and then pretended to have been there all along. Sneaky.

So, after glaring at the cat (pretending to be fast asleep, snoring loudly, pink tongue jutting out in blissful forgetfulness, the sneak), I repair the partition. Or try to, because this time I get complained at repeatedly with red menacing messages and a popup, indicating I need to run the Installation Disc to start Recovery Mode and run Disk Utility from there. After a careful examination of the Air to make sure it hasn't sprouted a DVD drive while I wasn't looking, I quickly google for the proper procedure to apply to the boot process in order to go in to recovery mode, and reboot again.

Now it seems to me that this thing, not having an optical drive, would come with a recovery partition from which one would boot when needed. Maybe it's just the Air pouting, but when I hit Command-R, instead of offering to boot from the recovery partition, it went online. Online!

I have to confess, I was amazed and, quite frankly, boggled. This little gray metal thing that I'm obviously trying really hard to turn into a paperweight is going online to fetch an image of the recovery partition so it can load it and boot it on the fly. If I was impressed before that OSX allowed me to resize the system partition just like that, now this is some seriously impressive recovery process. I mean, really, I've blown up more partition tables than you could shake a stick at (the latest one was a combination of partitioning a portable drive on osx and then formatting said drive on linux and copying a whole bunch of stuff onto it so I could take it on vacation, and then when I'm on vacation 300km from home trying (haha) to use it on the mac, and then having to realign partition tables by hand on the command line), and although I usually don't lose anything except time, the recovery process is always sooooo annoying. This whole OSX recovery process was obviously done for silly people like me.

While I'm boggling at it, it does its magic thingy and lo-and-behold! Recovery Mode! I run Disk Utility, hit Verify, hit Repair. Things work, apparently, so I try again to create the partition. It creates it. I reboot and I'm back to normal land. And stuff still works.

With the hardware limitations of the Air and the possibility of not having a recovery partition, this whole recovery process is an amazing piece of well-designed software. Instead of having to waste hours trying to recover things manually, everything Just Worked (tm) and I could instead waste my time writing this blog post! It has made my day.

Oh, and poking the cat. That has also made my day. The sneak...