Saturday, January 13, 2018

The state of programming languages and frameworks

As a professional software delivery person, I like to keep on top of technology trends and "where the market might be going". Over the last decade and a half, quite a few languages and frameworks have come and gone, and very few have had any real staying power. In order to stay marketable and knowledgeable in things that "people want to know", I generally find the TIOBE index and Google Trends to be excellent resources for gauging popularity. In my analysis this year, I've established that, relatively speaking, they are in agreement, so I'm going to use Google Trends (as the charts are easier to embed) to elaborate.

Programming Languages

Before digging into frameworks, there is the question of which language is most popular. In this regard, Java has been dominant and looks to remain so for a long time. While there is a downward trend, every major language has had its mindshare diminished, I can only imagine because of the explosion of alternative languages in recent years. Assessment: learn Java and become an expert, because while the market is crowded, there will always be work and/or people who want to know something about it. To be clear, I disregarded C, though it does roughly correlate to C++ in popularity...it is used more in embedded markets, and that's not one I'm deep into [yet].

Alternate languages

While I would recommend any newcomer pick one of the "big 5", it really helps to have a "specialized" language you are at least passingly familiar with and can be productive in. Here I tend to take the short-term view, as these languages come and go with great regularity. I'd say that Python (technically in the big 5 if you go by many sources) is a solid first choice, but Ruby is still a viable alternative. Outside those two, almost any other modern language would be a good one to pick up and have on hand, as there are always specialty areas that will have a need [even for legacy languages like Ada or Fortran].

Legacy Languages

One area that is often neglected is so-called "legacy languages". These are languages that have fallen out of style and/or been superseded by more modern alternatives. One reason I recommend adding a member of this group to your portfolio is that many experts in these languages are retiring, but the systems running on them will continue to live on. Additionally, when doing a migration from a legacy platform, being able to quickly read and understand what the old platform did is a valuable skill. One metric to look at is the "area under the curve" of a popularity trend, as this represents the amount of code potentially written. In this regard, Perl is a clear winner.
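To make the "area under the curve" idea concrete, here's a minimal sketch that approximates it with the trapezoidal rule over equally spaced trend samples. The interest scores below are hypothetical, not actual Google Trends data:

```java
import java.util.List;

// Sketch: approximate the "area under the curve" of a language's
// search-interest trend. Sample values are invented for illustration.
public class TrendArea {

    // Trapezoidal rule over equally spaced samples (e.g. one per month).
    static double areaUnderCurve(List<Double> samples) {
        double area = 0.0;
        for (int i = 1; i < samples.size(); i++) {
            area += (samples.get(i - 1) + samples.get(i)) / 2.0;
        }
        return area;
    }

    public static void main(String[] args) {
        // Hypothetical interest scores (0-100) over five months.
        List<Double> legacyLang = List.of(90.0, 80.0, 65.0, 50.0, 40.0); // declining
        List<Double> newLang    = List.of(5.0, 10.0, 20.0, 30.0, 35.0);  // rising

        System.out.printf("legacy area: %.1f%n", areaUnderCurve(legacyLang));
        System.out.printf("new area:    %.1f%n", areaUnderCurve(newLang));
        // Even in decline, the legacy language's accumulated area (a proxy
        // for code written over time) dwarfs the newcomer's.
    }
}
```

The point of the sketch is simply that a declining line can still have far more accumulated area than a rising one, which is why the installed base of legacy code keeps growing even as mindshare shrinks.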

Frameworks

Programming languages, however, are only one dimension. Beyond this, the frameworks available to deliver higher-level functionality are a key factor. From that perspective, I grabbed a few notable frameworks and did a comparison (realizing node.js isn't really a framework). In this regard, Ruby on Rails, while declining in popularity (and surpassed by Spring Boot), has a HUGE installed base and would clearly be a good choice. The winner's a little unclear here, but coupled with Java's popularity as a language, I think one would not go wrong with Spring Boot, perhaps having Ruby on Rails as a backup (and it IS the dominant framework in Ruby).

Conclusion

From my perspective, I have a good familiarity with Java and Spring Boot, plus a deep understanding of Ruby on Rails...so I'm still fairly well positioned, and I could easily recommend these as "go to" choices. Beyond those, I think I may spend some time playing around with Perl again, as it strikes me as a market that is set to be underserved at some point in the next 5-10 years...and will be a prime candidate for "need to know to make legacy migrations go smoothly".

Friday, July 14, 2017

The Technical Estimation Paradox

Face it, we've all been there...you're asked for an "estimate" that you KNOW you're going to be held to, but there are a hundred variables you have no way to control. The client is sitting at the other end of the table tapping their fingers and you think to yourself either: #1 "they don't understand, it's unreasonable, I don't have enough information", or #2 "Hmmmm, how much do I think based on my current information it might take?".

At the end of the day, neither of those matters beyond the psychological value they have for you...the real question that matters is "how much is it worth to them for you to deliver this?". Yes, if you're billing time and materials, there are practical problems: if you estimate too low, your client is going to be disappointed that you couldn't deliver at the agreed-to cost/time...if you estimate too high, your client might be happy, but often they also think that you cut some corners (especially if yours was the "middle of the road" estimate). On the flip side, if it's a fixed bid, estimating too low means your margins dwindle and you could possibly lose money, while estimating too high may land you in an ethical dilemma where you are making a 99% margin (which is arguably good or bad, depending on your perspective). But at the end of the day, as a consumer of services, you should be happy if you get the contractually agreed-to qualities you care about (without assumptions) for the agreed-to amount (or less), and as a service provider, you should be happy to deliver at the agreed-upon price (or less) with the agreed-upon qualities (or more).
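To put some numbers on the fixed-bid case, here's a quick back-of-the-envelope sketch. All dollar figures are invented purely for illustration:

```java
// Sketch of how estimation error plays out under a fixed bid.
// The bid and cost figures are hypothetical examples.
public class FixedBidMargin {

    static double marginPercent(double bidPrice, double actualCost) {
        return (bidPrice - actualCost) / bidPrice * 100.0;
    }

    public static void main(String[] args) {
        double bid = 100_000.0; // agreed fixed price

        // Estimate too low: the work costs more than the bid, margin goes negative.
        System.out.printf("actual $120k -> margin %.0f%%%n", marginPercent(bid, 120_000.0));

        // Estimate about right: a sustainable margin.
        System.out.printf("actual  $80k -> margin %.0f%%%n", marginPercent(bid, 80_000.0));

        // Estimate far too high: the "ethical dilemma" zone.
        System.out.printf("actual   $1k -> margin %.0f%%%n", marginPercent(bid, 1_000.0));
    }
}
```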

Wednesday, July 12, 2017

Software Architectural Decision Making

A common question I get asked is "How do I make architectural decisions?" and my standard answer is "it depends". While it's a tongue-in-cheek answer, there is a bit of truth to it. While there are frameworks and methodologies that try to rein this problem in, the reality is that the practice of "software architecture" is inherently a mess and certainly a wicked problem. That having been said, I'll give some insight into "how I do it".

First off, let me say that many "decisions" are often predetermined by your primary objective or have such a strong force behind them that there is little value in contemplating alternative solutions. A good example would be "which programming language should I use to develop an Android application?". You really have one decision and it's pretty binary: "Do I use Java or JavaScript?" Yes, from a technical perspective it's possible to use ANY programming language (either through cross-compiling or using a runtime interpreter), but if your primary goal is to release an application that allows a user to "do something on their phone", agonizing over every possible option is a HUGE waste of time. On the other hand, if your primary goal is to illustrate how to write applications using Ruby to be run on an Android device, the decision is preordained (and frankly not really a decision). Moreover, in the latter case, the decision switches from "which language should I use?" to "what approach should I use for running Ruby applications on Android?".

In the former case above, suppose our primary objective is to write an application that allows users to track their workouts on their phone. In that case, the language you use is relevant, but only as a secondary concern. Some questions you now have to concern yourself with are "how many people know Java versus JavaScript?" or "will I want to ultimately release the application on both iOS and Android?". Additionally, you have to concern yourself with "are JavaScript developers cheaper/faster than Java developers?" and "which approach is easier to test and debug?".

However, in the latter case, some questions are: "do I want to highlight Ruby's dynamic nature?" or "do I want to illustrate how Ruby can lead to higher-quality code?" or something else like "do I want to illustrate how rapidly Ruby applications on Android can be developed relative to Java?". This also opens up another can of worms you need to consider, such as "is the pool of Ruby developers such that developing in that language is even VIABLE?".

As we can see, the number of considerations grows at an exponential rate and is extremely sensitive to initial conditions (i.e. what the primary problem is). If you change the primary problem, entire swaths of decisions become irrelevant (why worry about JavaScript if your objective is to write things in Ruby?). This is what makes architectural decision making particularly pernicious, and I would contend it exhibits the fractal characteristics of a nonlinear system. This is also why no one has yet come up with a comprehensive "system" for making these decisions. While many lay claim to methodologies and approaches (take a look at what IBM has to say about it), the fact is it is an extremely difficult problem to reason about.
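As a toy illustration of that explosion: if you model each open architectural choice as an independent binary decision, the space doubles with every choice you leave open, and fixing one up front (the primary objective) prunes half of it in a single stroke. The decision names below are made up for the example:

```java
// Toy model: n independent binary decisions yield 2^n possible architectures.
public class DecisionSpace {
    public static void main(String[] args) {
        String[] decisions = {"language", "framework", "datastore",
                              "hosting", "build tool", "test strategy"};

        // Treat each decision as binary for simplicity: 2^6 = 64 combinations.
        long combos = 1L << decisions.length;
        System.out.println(decisions.length + " open decisions -> " + combos + " combinations");

        // Predetermining one axis (e.g. "this is a Ruby-on-Android demo,
        // so the language is fixed") removes it entirely.
        long pruned = 1L << (decisions.length - 1);
        System.out.println("With language fixed -> " + pruned + " combinations remain");
    }
}
```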

My best advice is this: architecture is like software meteorology; you can't predict the future, but you CAN come up with general models that work for defined scopes. What does this mean? Simply that trying to define software architecture is like trying to define what the weather is for the entire globe. The fact is, "it doesn't matter" if your primary objective is to determine whether you want to go to the pool or not. All that really matters is the local weather and your preference for the kind of weather you like to go to the pool in. Moreover, you don't necessarily need to explain "why you chose to go to the beach instead of a pool" because you realized your original desire was "to go swimming" and limiting your options to a pool might have been a mistake (for other reasons). Put another way, software architecture is understanding what's important and making important decisions; the art is learning how to figure out what is important, and the science is too complicated to think about.