Archive for July, 2011

Certification, Recertification…? Oh, please…

This week I ran into a blog post where the author asks for feedback about recertification. Microsoft certification, that is. That post is filled with “Whaaaat? Are you kidding me?” moments for me. I have some experience acquiring those certifications, and my feelings towards them have changed over time.

Been there, done that

I believe it’s been about 8 years since I took the tests and became first an MCP (Microsoft Certified Professional), then an MCAD (Microsoft Certified Application Developer), and then an MCSD .NET (Microsoft Certified Solution Developer for .NET). It took me 5 different exams to achieve that. That was back in the first release of .NET.

A couple of years later, I even studied for the VS 2005 / .NET 2.0 certifications. At the time I pointed out what a mess it was to figure out which exams I should prepare for and which certifications I’d be working towards, since there were two new ones related to software development: MCTS and MCPD. I also blogged about the shortage of good books for that.

Are they really useful?

Here’s a quote from the blog post that made me write about certification today:

“Microsoft technologies are evolving more quickly than ever. In order to maintain the value of your certifications, we need to ensure that Microsoft Certifications keep pace with changing technologies and remain a meaningful indicator of a candidate’s continued competence.”

Meaningful indicator of a candidate’s continued competence? Seriously?

Here’s another quote:

“Recertification provides assurance to hiring managers and other key stakeholders that the candidate who holds the certification has demonstrated continued competence even as the technology has changed based on service packs, revisions, and new product version releases.”

Oh my… I couldn’t disagree more.

Based on my own experience passing 5 of those exams, both of those statements couldn’t be farther from the truth. When I passed those exams, I had been working with .NET only part-time for about a year or so. I had absolutely no real-world experience with the so-called “XML Web Services” and “Remoting”. I had barely worked on a small sample project that used those technologies. I hadn’t done much ASP.NET. I prepared for those exams mostly by picking up a book and quickly flipping through the pages, without really digging into each word, and without sitting down and typing or trying any of the sample code. And I passed 5 exams just like that!

Nope, I don’t think I’m terribly smart. I’m mostly self-taught, I don’t have a formal degree, and I’ve skipped certain core elements of computer science. Bottom line: in order to pass those tests, all one needs to do is memorize the answer you’re expected to give to each question, regardless of whether you agree with it, or whether real-life experience has taught you that the answer is flat out wrong!

When I was preparing for the very first exam, I organized a study group at the company, and that was the one real benefit of the whole process: it helped raise .NET skills across the company overall (at least as far as making people aware of a few things that are available in the platform).

Come to think of it, sometime in 1998 (I think), when I was mostly doing Visual FoxPro development, I built a website to work as an online study group for people who wanted to pass the certification for Visual FoxPro. I led most of the discussion there; quite a few people succeeded in passing the exam, while others were only interested in improving their skills (and the feedback they gave me is that we succeeded there). I never took the test myself, though.

I remember somebody once asking in an online forum: “What’s the difference between an MCP and an MVP (Most Valuable Professional)?”. Somebody else responded: “MVPs write the exams that MCPs take”. For some time that was really true (as I came to realize when I was invited to write some of those exams myself). There was a time when most MVPs received their awards for showing and sharing real-world experience with their peers. Now *that* was an indicator of “continued competence”. Somebody memorizing the desired answers to canned questions is NOT a good indicator of anything (ok, maybe the person has a good memory… that’s all).

Was there ever a certification that covered the first version of Windows Workflow? If so, what happened when the second version of that technology came out and completely replaced the first one, rendering that first certification useless? What a waste of time.

My experience when hiring…

I have interviewed dozens of developer candidates over the years. I’ve seen over and over cases where their resumes were stamped with certification acronyms all over the place (I’ve known people who’d be so proud of themselves because they were “the 10th person in the whole world with the biggest number of MS certifications!”… like, 50 certifications under their belts or something… hmmm… whatever, dude). We’ve hired one or two of those in the past, and I can tell you that the certifications meant nothing once we saw the work those fellows produced.

On the other hand, the best people I’ve been involved with hiring had something in common: they didn’t have any certification, some of them didn’t go through any “formal interview test”, and for most of them I didn’t even need to see their resume. So how did I know they were good people to hire? Well, they often come out to the Virtual Brown Bag and share tips and tricks with their peers, so I can immediately see a sample of how they work, how they write code, how they think through problem solving, how they interact with other people, how they take criticism of their work, how they react when there are several other pairs of eyes staring at their work, and how they help other people learn and address their problems.

Now, *that*, I think, is the ultimate proof that somebody is committed to working on his or her continued competence.

Do Technologies Matter?

Whether or not somebody has been renewing their certifications because a new version of a technology came out is irrelevant to me. I’d much rather work with somebody who has a good grasp of things like object-oriented programming, design patterns, SOLID, and how to talk with users and clients to understand what’s really important in a project or feature. Those aren’t things people can get certified on. Also, such people can certainly learn technologies without a problem, based on their real-life experience, and not on terrible courseware for biased certifications.

Summing up…

I could go on for at least another hour jotting down my thoughts on this topic, but right now I have to get ready for today’s Virtual Brown Bag.


My first steps into Mac land

After two decades of working with PCs, and being in the mood to try out different things, I decided to get myself a Mac a couple of weeks ago. I’ve been doing Rails development for a few months, and I kept hearing people say that the Mac is a much better platform for that, so I decided to give it a go, since that’s now what I’m doing fulltime.

Which Mac to get?

I have two big monitors hooked up to my PC, so getting a Mac Mini and hooking it up to one of my monitors was one option. But then what do I do when I have to go on trips, to clients, to conferences, etc.? I need a laptop. But then again, which one?

A friend of mine had just purchased a MacBook Air; the 13-inch screen one. At first I thought it’d be too small for me, but playing around with my friend’s, it actually felt alright. The machine seemed really fast despite its size, the battery lasts for several hours, and a thought dawned on me: if I had one, I could hook it up to one of my big monitors when working from home, and I’d have a very portable laptop to take with me on trips. So that’s what I did: 13-inch MacBook Air, with 4GB of RAM.

How to get used to it?

My prior experience using a Mac had been about 20 minutes playing with somebody else’s machine. Because of that, I decided to just set it up on the side initially, so I’d do most of my work on the PC, where I’m 100% comfortable, and just go to the Mac to install programs, try small things out, get used to the keyboard and multi-touch trackpad, etc.

I have to say, several times I’ve felt totally useless, mainly from just not knowing how to do the simplest things: for instance, where’s the context menu (right-click menu)? What’s with delete/backspace? Where’s home/insert/pageup/pagedown…? How do I work with the Terminal (console) window? I can’t tell you how frustrating it is to get stuck on those things when you’re trying to get something done.

In order to speed up getting used to it, I had to quickly set it up to somehow feel like Windows configured to my taste. I guess that takes me to what’s probably the single most important thing when using a computer…

An Application Launcher

A very long time ago I was already very particular about finding ways to quickly launch applications or navigate to places on the computer. On Windows, for a while I used “Start->Run…”, WinKey+R, the Address bar tweaked into the taskbar, etc. Then I found SlickRun, which I used for several years. Then, when showing and talking about SlickRun two years ago at a Virtual Brown Bag, somebody showed me Executor, which I’ve totally embraced ever since. In fact, when I’m building a Windows machine, Executor is the very first application I install right after the Operating System; it is, easily, the application I use the most on Windows (I should really write a post about how I use that tool…).

So, I needed something similar for the Mac… really bad! Alfred was it. I haven’t fully explored this tool yet, but it gives me 80% of the features I use most of the time in Executor. The free version has the “application launching” features, but I’ll be getting the PowerPack version soon, which adds the “folder navigation” features (which I use a lot on Windows), among other things. Ah, it also adds “clipboard history”, which is what I’ve been using ClipX for on Windows.

Another thing Alfred gives me is the ability to lock the screen (sort of like WinKey+L on Windows), which is something I use every time I walk away from my computer.

The Development IDE

Obviously, as I’m going to be primarily using my Mac for development, I needed an IDE. I understand people use TextMate, MacVim, etc., but I needed something that could get me productive as quickly as possible. Like I mentioned a couple of weeks ago, I’m using RubyMine on Windows for Rails development. As it turns out, that tool also runs on the Mac. Sweet. The main thing for me has been to remap the keybindings so they somewhat resemble my configuration on Windows.

The Console

I’ve been using the command prompt (DOS) on the PC ever since I started working with computers, and PowerShell for the last two years or so. I had never used Bash before (well, I did try it for 30 minutes a few months ago, got stuck, and gave up). Now I do need to learn it, since that’s what I get in the Mac’s Terminal window. I need to review Joshua’s presentation on Bash, as well as watch PeepCode’s Meet the Command Line and Advanced Command Line videos.

Source/Version Control

For source control, I’ve been using Mercurial. On Windows, TortoiseHG works really well for me. On the Mac, I’ve started to use MacHG; I can’t say I’m 100% happy with it, but it may just be me still being uncomfortable with the environment as a whole. One thing that got me at first is that I needed to install Mercurial itself in order to have access to the “hg” command in the Terminal window. After I did that, things worked fine.

One more thing: when opening a project with a Mercurial repository in RubyMine, we’re asked to provide the path to the hg executable, and I had no idea where that’d be. Found it here. One of my buddies pointed out I could also run “which hg” in the Terminal window, which prints the path to the given executable.

Also, Mercurial only accepts commits if it can find a username. To set one globally, we need a .hgrc file in the “home” directory containing that setting.
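For reference, a minimal global .hgrc along these lines is enough to get commits accepted (the name and e-mail below are placeholders, of course):

```ini
# ~/.hgrc — minimal global Mercurial configuration
[ui]
username = Your Name <you@example.com>
```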

Diff Tool

BeyondCompare has been my diff tool of choice for several years (for both files and folders). Unfortunately, it doesn’t run on the Mac (from what I’ve read, that’s because it’s written in Delphi, which doesn’t target the Mac…). After asking for recommendations, I got to DeltaWalker, which seems very similar to BeyondCompare. I’m using its trial to see how it feels. It’s supposed to integrate easily with Mercurial and Git, so I’m hopeful, as BeyondCompare integrated really well with my workflow on Windows.


MongoDB

I’ve been using MongoDB on my Rails projects. In order to get it going, I followed the instructions here, which led me to Homebrew. In order to get Homebrew going, I needed Xcode (5 bucks). After that, all was good.

Where’s Bundle Install?

Source code in place, database in place, Ruby (which already comes on the Mac)… so I’m ready to do Rails development, right? Well, no. As soon as I tried running bundle install, things didn’t quite work like I expected. No biggie; I followed the instructions found here, and I was back in business.

Text Editor

I tried to use the text editor that came with the Mac, but didn’t like it. I then tried TextWrangler, which has been nothing but frustration (seriously, I can’t find a way to open a file in the darn thing, for crying out loud!). I’ll be getting TextMate, as that seems to be everybody’s favorite. Eventually I’ll also get into MacVim, which sounds like something I’d like very much.

More Tools

I’m also going down Ben’s Ultimate List of tools for Mac users. Lots of good stuff in there.


I’ve noticed that running RSpec/Cucumber tests on my Mac is a LOT faster than on my super-powerful PC. I don’t know why that is, but the difference is easy to perceive.


I’ve made the decision to use only my Mac for Rails development moving forward. Things run faster and more smoothly. I’ve been using it as my primary development machine for the whole last week, and I’m enjoying it.



RubyKoans: Great way to learn the Ruby language

Several people had mentioned the RubyKoans to me. It took me a while to get started on them, but I’m glad I did, and I finally finished them just a few days ago!


All you need to do is install Ruby (try RubyInstaller if you’re on Windows), download the RubyKoans (it’s just a bunch of text files zipped up), get on the command line, and go for it. You run “ruby path_to_enlightenment.rb”, and it’ll get you started. The Koans are just a bunch of unit tests that walk you through learning the core aspects of the Ruby language.
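To give an idea of what that feels like, here’s an illustrative snippet in the koans’ style (not copied from the actual koans): each test presents an assertion with a blank (__) that you replace with whatever value makes it pass.

```ruby
# Illustrative example in the style of the RubyKoans (not an actual koan).
# The koans present assertions with blanks; you fill them in one by one.

def assert_equal(expected, actual)
  raise "Expected #{expected.inspect}, got #{actual.inspect}" unless expected == actual
end

# Before: assert_equal __, "ruby".length
# After filling in the blank:
assert_equal 4, "ruby".length

# Before: assert_equal __, [1, 2, 3].map { |n| n * 2 }
assert_equal [2, 4, 6], [1, 2, 3].map { |n| n * 2 }

puts "All assertions in this snippet pass."
```

Each time you fill in a blank and re-run the file, the koans move you along to the next failing test.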

As I was going through the Koans, I pushed my progress to a Bitbucket repository.


I hope to revisit this repository as I learn Ruby better and am able to improve some of the code I wrote there. By the way, as you go through the Koans, if you either get stuck on a step or wonder how other people have solved it, just search for it on the web; most often you’ll find blog posts, StackOverflow posts, or GitHub or Bitbucket repositories where other people have shared their solutions.



Joshua’s Bash presentation

A couple of weeks ago I attended Joshua Kessler’s presentation on Bash at the Houston Open Development User Group. I got the first hour on video and made it available here. Quite a few things flew over my head, as I’m just getting started with Bash, but it’s good that I have it recorded so I can come back and review the things I missed this time around.

Josh shows how he uses Bash on a daily basis to help him automate tasks and get stuff done.

Joshua Kessler is an IT manager who has worked as a professional software developer for over a decade. He’s been using Linux and other un*x-like operating systems as his primary OS as far back as 1993. He is also the vice president of the Houston Open Development User Group.



Do you control your source control? Or is it the other way around?

Last week I ran across the following tweet:

“It’s getting to the point that if your organization is using #tfs, you are not going to get the best developers to work there. Good luck.” (twitter.com/#!/adymitruk/status/81593379204497408)

I can relate to that. I have fought TFS source control ever since it came out in 2005. Notice I’m focusing on the “source control” part of TFS; there are a lot more pieces to this product, but they are irrelevant to this post. From here on, whenever I say “TFS”, I mean “TFS source control”. Before TFS, I had used Visual SourceSafe; compared to that, TFS was pretty good.

I’ve always struggled with weird TFS stuff, but it wasn’t until I got into Mercurial that I finally realized I was spending way too much time with my source control system.

But isn’t TFS better than no source control at all?

Well, yes. I’ve heard of developers who, in this day and age, don’t use *any* source control system. The reasons vary:

  • some say that the company doesn’t want to buy any software
    • hmmm… seriously? There are free open source options out there. Oh, no, my company won’t do any open source. Well, *that* sucks.
  • others say they work by themselves and therefore don’t need source control (!)
    • I totally disagree with that; I have worked by myself on several projects, and have always benefited from having source control (come to think of it, I’ll talk about that in another post…).

So, what’s my beef with source control in TFS?

TFS has been the biggest source of frustration among the developers I worked with in the last couple of years. I’ll list out some of the things that come to mind…

Need for a centralized server

Several times I’ve had a need to work offline. It could be because I was working on a plane, or anyplace where I couldn’t get a network connection. Sometimes I did have a network connection, but it was so slow I’d rather work offline. However, TFS does a poor job at that. When I try to open a VS solution that’s bound to TFS, Visual Studio takes a long time to realize it can’t connect to the server. After that, I can tell it that I would like to work offline.

But what does that mean? It means I can make lots of changes to my local files, and whenever I’m back online, I can check in my changes. The part I really dislike here is that I want to check in early and often: write a test, check in, make the test pass, check in, refactor, check in, try out a different implementation, check in, etc. If I don’t have a local repository, that workflow is just not possible.

The need to “check out”

So I’m working remotely, accessing a central server somewhere over HTTPS. I have some source code open in Visual Studio. I hit one key on the keyboard, and there go anywhere between 5 and 30 seconds (I’ve seen even more) while VS communicates with the server, does whatever it does in this case, and “checks out” the file so I can continue editing. Depending on how long that takes and how busy I am, by the time I can edit the file I’ve already forgotten what I was going to do in the first place.

“Hey, just get a faster internet connection…”… “get a faster server”… whatever…

My point is, if I want to make changes to files, there should be no need to communicate with a server. Such an operation has to be as fast as opening the file in any text editor and typing away the changes.

Oh, you want to use “any” text editor…?

Do it through Visual Studio, or else….

Visual Studio often works as a “big brother”: it keeps watching and controlling everything we do. And because it is such an “institution”, VS is slow to come up. If all I want is to fix something real quick in a class, XAML, XML, whatever, I should be able to just open the file in a fast text editor (even Notepad!), make the change, and move on with my life.

Well, not so fast. Unless I check out the file, I can’t make changes to it. I then need to open the project in VS and check out the file through the Solution Explorer (something 100% of developers should know how to do). Or maybe do it through the Source Control Explorer (I think only about 50% of developers would know how to do it that way). Or maybe just open the Command Prompt and check out the file from the command line (maybe 5% of developers would know how to do that? Why is it that most developers dislike the command line so much?).

Again, what is the benefit of having to “check out” a file when we want to change it?

Branches… oh, the branches…

Working with branches in TFS isn’t easy. Doing “branch per feature” is a pain. Take the following workflow: I need to work on a new feature, and I’m not entirely sure yet what the best implementation for it is going to be. I have the code off the main trunk from source control sitting on my machine. In order to implement the feature, I may need to create a few files and change a few others (adding a class will automatically require a change to the VS project).

Instead of working directly in the main trunk, I decide to create a branch off of it, because I want to be able to check in early and often while I’m working on the feature, but at the same time I don’t want to mess with the main trunk (and I don’t want changes from others affecting the work I’m currently doing).

Say this project has 100MB in files. When I create a branch, what happens? I get a new copy of those 100MB dropped onto my local file system. Storage is cheap, I know, but if I’m working over a network, I need to wait for all of those files to travel across the wire and make their way to my hard drive.

I then write my tests encompassing the requirements for the new feature. I check them in. I decide there are two possible implementations, so I want to create two sub-branches off of the one I’m working on. What happens? Oh yeah, another 100MB for each of those sub-branches. That’s 300MB of files, when all I need, again, is to add or change a couple of files.

Creating a branch should simply be the act of signaling to the source control system: “from here on, I want you to track changes so I can get a delta afterwards and easily figure out what’s changed”. There is no need to have copies of every single file for every single branch!
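The difference between the two models can be sketched in a few lines of Ruby. This is a toy illustration (not how any real VCS is implemented): a copy-based branch duplicates every file, while a delta-based branch only stores a parent pointer plus its own changes.

```ruby
# Toy sketch contrasting copy-based and delta-based branching.
# All names here are made up for illustration.

# Copy-based branching: every branch carries a full copy of all files.
copy_branch = lambda { |files| files.dup }  # 100MB project => another 100MB

# Delta-based branching: a branch is just a parent pointer plus its changes.
DeltaBranch = Struct.new(:parent, :changes) do
  def files
    # Resolve the full file set lazily: parent's files, overridden by ours.
    (parent ? parent.files : {}).merge(changes)
  end
end

trunk   = DeltaBranch.new(nil, { "app.rb" => "v1", "readme" => "hello" })
feature = DeltaBranch.new(trunk, { "app.rb" => "v2" })  # stores ONE changed file

feature.files  # => { "app.rb" => "v2", "readme" => "hello" }
```

The feature branch above holds a single entry in `changes`, yet still resolves the full project when asked; that’s the “track the delta, not the copies” idea.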

Now, say that while I’m doing this work, another developer has also created his own branch to work on some other feature. Maybe he has written a class or method that I could really benefit from in my branch. How do I get that change of his over into my branch? Well, in TFS, as far as I know, I can’t merge a change from my buddy’s branch directly into mine; he’d need to merge his changes up to the trunk, and then I’d need to merge the trunk down to my branch. But his changes may have an impact on something that is in the trunk, and we aren’t ready to deal with that at the moment, as we both need to finish off the features we’re working on. Hmpf… frustrating.

File Diff

I’ve never been able to use the file diff tool that comes with TFS; I can’t make sense of it. For several years now I’ve been using BeyondCompare, which is a far superior tool for running comparisons, understanding changes, and merging.

Odd behavior

Countless times I’ve seen odd behaviors where developers think they’ve checked in everything, but not all files make it into source control; another developer then gets latest, and things don’t work right because the VS project thinks it needs file X, but file X hasn’t been checked into source control. Yet when you look at the machine of the developer who made the change, the file *is* there. We then have to do something like exclude the file from the project, include it again, check it in, and hope that now it is going to work.

I’ve seen weird stuff like that ever since the first version of TFS, all the way up to the 2010 version. It could be a problem with TFS, a problem with VS, or a problem of developers not knowing exactly what the workflow with these tools should be. Regardless, it’s a huge source of friction and frustration.

Summing up

Those are the main things that come to mind when somebody asks me why I’m not a big fan of TFS. Both I and the people I’ve worked with have spent a LOT of time dealing with problems somehow related to TFS. Not fun at all.

I’ll write up another post related to what my experiences have been with other systems.
