
Sunday, April 20, 2014

Lessons learned after (nearly) a year of professional programming.

I turned 23 yesterday (I'm old. I know) and it's nearly a year since I started writing code professionally. During this time, I learned a lot of things about programming. Here are some of my observations.

1. Code is for people, the binary executable is for the machine: This is a recurring theme which one reads about and something everyone knows, but it's so hard to remember this principle while coding and not get carried away. In two weeks, no one, including the author of the code, remembers why it was written the way it was. I am not talking about commenting code here. Although it's important to comment your code, a sign of a good programmer is his/her ability to express themselves as clearly as possible with code alone. Comments should be minimal, and the code itself should tell its story most of the time.

2. Respect your data types: Store strings as strings, numbers as numbers, ObjectIds as ObjectIds and dates as dates. Avoid *manual* conversion of types from one form to another as much as possible, because it only leads to pain and unnecessary code in the form of conversion routines. A bug in conversion logic can be catastrophic.
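To make the point concrete, here is a small Python sketch (my own illustration, not from the original post): storing a date as a string forces fragile conversion routines, while storing it as a date keeps comparisons and arithmetic direct.

```python
from datetime import date, timedelta

# Storing a date as a string forces a conversion routine later...
signup_str = "2014-04-20"
year, month, day = map(int, signup_str.split("-"))  # a bug in this conversion logic is catastrophic
signup_parsed = date(year, month, day)

# ...while storing it as a date keeps arithmetic and comparisons direct.
signup = date(2014, 4, 20)
trial_ends = signup + timedelta(days=30)

print(trial_ends)               # 2014-05-20
print(signup == signup_parsed)  # True
```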

3. Interface before Implementation: Write down what interfaces your entities will expose without worrying about how they are going to do it. It's hard to keep these two facets separate when thinking about code, but it's an important skill which you have to develop. Think about the interface first and write it down, because it's volatile: a minor change can impact anyone who relies on that interface. Also, get a sign-off from everyone who develops or relies on that interface.
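A minimal sketch of the idea in Python (the `Storage` entity and its methods are hypothetical names of my own, chosen only for illustration): the interface is written down and agreed upon first, and the implementation arrives separately.

```python
from abc import ABC, abstractmethod

# The interface: written down and signed off before anyone implements it.
class Storage(ABC):
    @abstractmethod
    def put(self, key, value): ...

    @abstractmethod
    def get(self, key): ...

# One possible implementation, written once the interface is fixed.
class InMemoryStorage(Storage):
    def __init__(self):
        self._data = {}

    def put(self, key, value):
        self._data[key] = value

    def get(self, key):
        return self._data[key]

store = InMemoryStorage()
store.put("answer", 42)
print(store.get("answer"))  # 42
```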

4. Beautiful Code when you can, Beautifiable code when you can't: The ideal code contains zero 'if' statements and no loops, and can be read like a novel. It doesn't use globals, doesn't maintain much state anywhere, and exhibits all the characteristics of good code you have ever read about, anywhere. However, due to constraints of time, development that happens elsewhere which you rely on, and the libraries and frameworks you use, it's almost impossible to achieve the 'ideal code' which everyone desperately wants. When you are in complete control of all the environment variables, write beautiful code; when you are not, write code to finish the feature, but ensure that it can be 'beautified'. That is, your code must be easily refactorable even if it's not completely refactored.

5. Decide on suitable defaults beforehand: Variables will be undefined or uninitialized, database columns will hold nulls or be non-existent, and parameters passed in may be null/undefined. In all such cases, it's important that your code function properly and not choke. Specifying default behaviour for every case will almost always be impossible; instead, identify the most common cases where something like that is likely to happen and define the application's behaviour there. It should be documented, preferably with comments. Timezones, internationalization and Unicode support are some of the things you should worry about before writing a single line of your application code. Also, do not mess with usernames! Make no assumptions about them. All bets are off if you do!
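As a sketch of documenting defaults (the `greeting` function and its behaviour are hypothetical, invented purely to illustrate the point): the defaults for missing inputs are decided beforehand and written into the code, and no assumptions are made about the username itself.

```python
def greeting(username=None, locale=None):
    """Return a greeting. Defaults, decided beforehand:
    no username -> a generic greeting; no locale -> English."""
    locale = locale or "en"  # default behaviour for a missing/None locale
    if username is None:     # default behaviour for a missing username
        return {"en": "Hello!", "fr": "Bonjour !"}.get(locale, "Hello!")
    # No assumptions about the username: it is interpolated as-is.
    return {"en": "Hello, %s!", "fr": "Bonjour, %s !"}.get(locale, "Hello, %s!") % username

print(greeting())                    # Hello!
print(greeting("Ada", locale="fr"))  # Bonjour, Ada !
```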

6. Workflow is a habit: One of the most time-consuming aspects of writing code relates to the workflow your team follows. It's not uncommon to get into all sorts of trouble with your development environment and version control (use git. It's awesome.) and spend hours trying to resolve those problems. The time spent doing that cuts into the time you could have spent thinking about how to structure your code better (or playing 2048 while taking a break). Sometimes you have to get your environment back to a sane state; it happens. But if you make your workflow a habit you follow religiously, your "muscle memory" takes over and you find yourself producing code in a way that causes minimal disruption for yourself and for others. A few things I follow obsessively: committing or stashing my work frequently, and rebasing the hell out of my commits before I push code.

7. Integration implications: It's important to clearly specify how a new feature fits into the overall application. What are its side-effects on the features already available in the system? How easy is it to use? Who benefits most from it? What is the risk involved? Will the application exhibit unintended or seemingly inconsistent behaviour because of this addition? These are the questions you should ask and answer even before deciding the color of the button which invokes the new feature. The sooner you get these details straight, the better your resulting code is going to be. It's imperative to answer these questions before starting the design, because any change which needs to be accommodated (with respect to the questions posed above) after the design is finished will be costly to fix.

8. One way: Perl programmers will tell you "there's more than one way to do it". Pythonistas like myself laugh at them. In Python, "there should be one -- and preferably only one -- obvious way to do it". This is a powerful belief which forces you to think about the best way to implement something. It eliminates unnecessary trains of thought that lead you away from the best solution. People disagree with me on this, but I believe it's important to assume there is a single most straightforward way to accomplish something, and that it alone must end up as code.

9. Functional all the way: Functional code is better. I rest my case.

10. Master the tools you use: Learn how to use your editor efficiently, build your "muscle memory", and master the devtools you use. Debugging tools, diff tools, profilers and other measurement commands, code quality tools and similar aids are essential when you write code.

The ideal state we must aim for is to write good code subconsciously. The master programmer writes code without thinking about what he/she is writing. The fingers automatically hit the right keys and the code writes itself. Of course, what I am describing here is the highest possible state (something akin to a kung-fu master who just moves while fighting, without thinking). This happens only with masters, but it's something we must aim for.

These are some of the things I picked up after 11 months of writing production code, by watching how people (who are much better than I am) write code, and observing myself while I code. Coding is a craft and creating beautiful code is a pleasure in itself.

Saturday, March 8, 2014

What happened to User Experience

First of all, I am not a UI/UX designer and I have never been one. So while what I am about to say may sound completely crazy, it's my take on how user interfaces have evolved over the years since I first interacted with a hand-held video game. (Good times, eh?) So, here goes... another blog on the inter-web about user interfaces.

I can, with some difficulty, recall the times when we didn't have smartphones (or any phones, for that matter) in our pockets. When the first cell-phones arrived, the buttons and the incredibly small (by today's standards) screens on those phones looked wondrous. I could not help but marvel at them. How incredible the invention was! (I was in high school at the time.) The user interface was not on anyone's mind back then, as far as I can tell; people cared about the functionality: how much could they do with the device in their hands? They were willing to tolerate glitches, the device taking some time to accomplish a task while it displayed a loading spinner, and a lot more. People were patient back then.

And then, the iPhone happened. I am a fan of Steve Jobs, but I admire the engineers who worked on the iPhone more. As an engineer, I know, I just know, that implementing the ideas Steve Jobs had was non-trivial, and the technical constraints, not only those related to software but also the manufacturing constraints of the day, made their task so much harder. Anyway, the point I was trying to convey is that once the sleek iPhone hit the stores and people had enough money to afford it, something changed.

Users started expecting the same level of sophistication in the other devices they used. The phone worked so well, so quietly and smoothly, that it revolutionized the concepts of 'responsive design' and 'user experience' forever. What Apple did was raise the bar so high in terms of usability that the rest of the technology companies had to follow them to the summit or die slowly.

I don't think Android would ever have had an interface like the one it currently has if it were not competing directly with the iPhone. This had a trickle-down effect on the rest of the software users use on a daily basis. The webapps people use today are much, much better than they were a few years ago in terms of usability. We expect and take for granted killer graphics in webapps today. This is a direct consequence of users' expectations of usability.

The reason this shift is important to keep in mind while designing and developing software is that you are supposed to make whatever new software you create beautiful. The key term here is the word 'beautiful'. No other word can substitute for it. Your new software should look seductive, respond to the slightest touch, anticipate the user's actions beforehand and be so pleasurable to use that users can't help but get addicted to it. I sometimes come across buttons on web-apps which I just love to press! The software should extract loyalty by being everything the user desires and leaving no room for its competition.

Saying so is easy, but nailing down a user interface is incredibly hard. It is almost impossible to achieve the perfect user interface for the software you are building, because when your target audience is the whole world (why settle for anything less?), every individual user has his/her own preferences. Common patterns arise, however, and designers typically concentrate on pleasing the largest fraction of the populace for the largest possible time. Designing something beautiful is tricky and complex because what is beautiful to you may be crap to someone else. (I love command-line interfaces and am a huge, huge fan of Linux, but the people around me who aren't geeks unanimously agree that command-line interfaces suck. Since most of the world, in other words, the people you are serving, doesn't seem to like command-line interfaces either, you have to find a better user interface to seduce them into using your software.)

The first thing I do when I write software nowadays is ensure that its usability is no less than that of GitHub. I am a web developer, so I aim to make my apps as responsive, as beautiful, as elegant and as easy to use as what I consider to be the pinnacle of usability on the Internet: GitHub. It is the most usable website I have seen since I started using the Internet and paying attention to these things. Whoever the user interface designer of GitHub is, hats off to him/her! You rock! Others on my list are Google, Gmail, Twitter, Mozilla's website, WolframAlpha and perhaps a dozen other sites.

Beautiful user interfaces are now the norm rather than the exception. Things change so fast in the world of software, with new stuff arriving every day trying to woo the user, that it is just impossible for software that looks ugly or is hard to use to succeed. Your software may be much more secure than anything out there; it may encompass more functionality than twenty apps on the user's phone combined can provide; but if your user interface is even a little hard to figure out, you've lost the chance to impress the jury. So, how does your app look when the browser is re-sized?

Friday, April 26, 2013

Time In Programming Languages

One of the most remarkable aspects of programming is the way in which time is modeled in our programs. In this blog post, I try to explain how time is handled in functional and imperative languages.

What is a variable?

A variable in an imperative language is a named memory location. 

That is, in a statement like

    int a = 37;      // in C, C++ and Java

'a' denotes a named memory location which can hold a value. That memory location gets updated each time we assign a new value to a. That is, if you say

    a = a + 1;

a now holds the value 38.

(Note that in languages like Python, variables are not typed but values are. So if you were to say a = [1, 2, 4] and later say a = 20, the principle is still valid. That is, 'a' refers to a memory location which can be updated and accessed. The difference is that a is not constrained to hold objects of a single type.)
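A quick demonstration of that note, runnable as written in Python 3:

```python
a = [1, 2, 4]
print(type(a))   # <class 'list'>

a = 20           # the same name now refers to an int: the name is not typed
print(type(a))   # <class 'int'>
```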

What does this have to do with time?

As it turns out, everything. If you do not have assignment in your programming language, your programming language becomes purely functional in nature. That is, it loses the ability to model time.

Let me illustrate that with an example from Lisp where assignment is discouraged.

     ;; computing the factorial of a number in Lisp

     (define (factorial n)
       (define (fact-iter fact n)                ;; inner function for making the process iterative
         (if (or (= n 0) (= n 1))
             fact                                ;; return fact when n becomes zero or one
             (fact-iter (* fact n) (- n 1))))    ;; recursive call
       (fact-iter 1 n))

In this factorial function, we have no variables to which anything is assigned.  No assignment is necessary. If you were to write the factorial function in C or its descendants, it would look something like this:

int factorial(int n)
{
    int fact = 1;
    for (int i = 1; i <= n; i++)
        fact *= i;
    return fact;
}
Notice that there is assignment in almost every statement. 

What is a variable in a functional language?

In functional languages, a "variable" stands for a value. That is, you must stop thinking of a variable as a location in memory that holds a value. In fact, you must think of the variable as a "shorthand".

So instead of typing 3.14159 each time, you alias it by saying

(define Pi 3.14159)

in Lisp. There is no concept of updating the value of Pi to a different value "later on". Why? Because "later on" doesn't even make sense when you have no time.

It still isn't clear what I mean when I say time doesn't exist without assignment, so let me explain further. When you have assignment, you are updating a value in a memory location somewhere. If you call a function with a variable as an argument, it returns a result. If you now update the variable's value and call the same function with the same variable passed in as an argument, you get a different result. That is, there are points in time at which you get different results, and the reason is that you have assigned a different value. So time comes into play: there is a distinct concept of "the result before assigning the new value" and "the result after assigning the new value".
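The before/after distinction can be made concrete with a small Python sketch (my own illustration; the names are invented):

```python
balance = 100

def report():
    # depends on state: the answer varies with *when* you call it
    return "balance is %d" % balance

before = report()   # 'balance is 100'
balance = 150       # assignment: a moment in time
after = report()    # 'balance is 150'

print(before)
print(after)
print(before == after)  # False: "before" and "after" are now distinct
```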

What happens when you get rid of assignment?

If you have no assignment, then variables truly are the values they alias. So, no matter how many times you call a function with a variable, you always get the same answer. If you want a different answer, you call the function with a different value (variable). Note that "before" and "after" don't exist in this scenario.
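Contrast the previous sketch with a pure function, where the same value always yields the same answer:

```python
def square(x):
    return x * x   # no state anywhere: a timeless mapping, like f(x) = x^2

# No matter how many times or in what order we call it, the answer is fixed.
print(square(5))   # 25
print(square(5))   # 25
print(square(6))   # 36: a different answer requires a different value
```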

Why is time or the lack of it important?

Well, if you don't have time in your language, then the programs you write will be mathematical in nature. They will be akin to mathematical functions like f(x) = x^2 or f(x, y) = x + y, which specify a distinct mapping. They exist "timelessly", which means there are no synchronization errors. Also, the order of substitutions doesn't matter. What do I mean by that? Well, consider the sequence of statements:
1. i = 1;
2. j = 2;
3. j = i +1;
4. i  =  j + 1;

If I interchange statements 3 and 4, I get different values for i and j. So, the order of substitution matters. However, in the factorial procedure written in Lisp, in the line

    (fact-iter (* fact n) (- n 1))

the order in which I substitute the value of n doesn't matter, because n is the same throughout. If n is, say, 5, the expression becomes:

    (fact-iter (* fact 5) (- 5 1))

On the other hand, if you have time in your programming language, then the order of statements matters and your programs will have 'state'. As it turns out, having state in a programming language leads to some horrible things, like worrying about synchronization when you have multiple threads or when you are running parallel algorithms. And you get a lot of bugs if you update the wrong variable first.

The advantage, of course, is that you have the power to represent and model the time [which we observe in the real world] in your computation. There are situations where modeling time is of immense importance: how would you model the transactions of a bank account in a purely functional language, for example? The time of a transaction is tied to having the correct balance.
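One common functional answer, sketched here in Python (my own illustration, not from the post): represent the account's history as a sequence of transactions and compute balances by folding over it, so each balance is a new value rather than an updated memory cell, and time re-enters only as the order of the sequence.

```python
from functools import reduce

def apply_txn(balance, txn):
    """Pure function: returns a NEW balance, never updates one in place."""
    kind, amount = txn
    return balance + amount if kind == "deposit" else balance - amount

transactions = [("deposit", 100), ("withdraw", 30), ("deposit", 50)]

# The current balance is a fold over the whole history...
final = reduce(apply_txn, transactions, 0)
print(final)  # 120

# ...and the balance "at a point in time" is a fold over a prefix of it.
after_two = reduce(apply_txn, transactions[:2], 0)
print(after_two)  # 70
```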

There seem to be situations which purely functional languages cannot handle. So even though Lisp is considered a functional language, it provides an assignment operator called set!. And Lispers are careful not to use it too often. The challenge is to retain as much functional nature as possible while admitting state into our programs.

Why did you write this post?

Nobody explained to me the consequences of having an assignment statement and how it relates to time. In fact, I had not even thought about it. Luckily, I read Structure and Interpretation of Computer Programs and watched the Abelson and Sussman videos, which explained the consequences of having an assignment statement in a language. I hope readers of this post see assignment in a new light. I am fascinated that something as simple as an assignment statement in a programming language can raise questions about a concept as deep as time. Perhaps things would be different if we all existed in some timeless, eternal universe...

Saturday, April 13, 2013

Not quite at Home?

As companies try to control and compete for users' attention, value diminishes.

A couple of days ago, David Pogue reviewed Facebook's latest offering: Facebook Home for Android, an app that comes pre-installed on the HTC First and is downloadable for certain other Android phones. He raised a very important and pertinent question:

 What exactly is the point?

Mobile devices are experiencing a meteoric rise in usage; PC sales dropped 14% this quarter. How do you compete when the screen is just 5 inches, or even 10 inches? Google figured this out very nicely. It open-sourced the Android OS so that hardware manufacturers would use it to make smartphones. These smartphones would all come pre-loaded with Google's apps: its search engine, voice search, Street View and tons of others. Google then created an ecosystem similar to Apple's. The genius lies in two facts: 1) the ecosystem is so pervasive now that most people (nearly all) look at what apps are available for a phone before they buy it, and 2) Google made this happen out of thin air: without the hassles of making their own handsets and without the cost and associated risks of entering the hardware market (the Motorola acquisition is relatively recent). But others were not so fortunate and fell behind. (Yahoo! comes to mind.)

Google gained the users' attention through a process that can only be described as 'passive hypnosis'. This does not mean that Google is bad; I am merely appreciating the genius and the nonchalance with which it placed itself at the center of the smartphone market. Other companies have been slow to react. Facebook paid a billion dollars for Instagram for a reason. Yahoo! acquired Summly for the same reason.

Yet, for all the hype created by the companies that ask you to download their apps, the value provided to the user is actually diminishing. Websites were neat and pretty once. Now every webpage is a mammoth that hogs the network and carries a lot of surplus: stuff you don't actually want to look at. Same with smartphone apps: Google removed ad-blockers, which were among the most popular apps, so that ads could be displayed in the apps you use. And every time I access Facebook through my mobile web browser, there is an ad (thinly disguised as a "suggested post") right in the middle of my news-feed. Facebook's new Home for Android is its latest attempt to imitate Google: to become what Google has become on smartphones, the very core.

The early reviews suggest that users are not very happy with this new offering. As Pogue writes, "The ads are coming soon." That means whenever you wake your phone, there may be an ad waiting. Suppose I use my phone to find out what time it is; instead of just showing the clock widget on the home-screen, I am (probably) going to be greeted with a nice offer to buy jewelry or movie tickets. Nobody would ever want to own such a phone. In fact, and I am really sticking my neck out here, it may increase the popularity of G+.

Users were once in control of technology. You chose to call someone. You decided to text your friends. You decided which website was your homepage. That isn't the case anymore. All the stuff that's out there is thrown at you whether you like it or not, and you have to mine through it all to find what you are really looking for.

We should control technology. The user must be free to choose which sites he/she wants to visit, what their wallpaper looks like, what their default search engine should be, and when and where they want to connect with their friends. Most importantly, the user should decide to log in to your website, rather than you trying to log in to his/her life.

Monday, March 11, 2013

Distance from Philosophy

I was playing around with the HTML parsers available for Python. I wrote various little 'toy-scripts' to scrape content from websites. While doing this absolutely pointless thing just for fun, I remembered something I had read in an xkcd mouseover text: if you start from any page on Wikipedia, click on the first non-underlined link, and keep doing this, you will eventually end up on Wikipedia's page for 'Philosophy'.

I didn't believe it at first, of course. It seemed absurd. So I decided to verify the claim and started picking random stuff totally unrelated to philosophy, like, say, cats, or apples, and began to follow the links from those pages. I was amazed when, in every single case, I ended up on the page for Philosophy.

As I was writing my toy-scripts and playing around, I had an idea: measure "how far from Philosophy everything is". It sounds crazy when you hear it. It's even weirder to type. (Having such thoughts is probably the weirdest of all.)

So I wrote a script which accepts a word from the user, goes to the corresponding Wikipedia page, starts following the first non-underlined links, and counts how many links it has to follow before it hits the page for Philosophy. I thought I'd share it, in case you want to take a break, kill time and amuse yourself:

from HTMLParser import HTMLParser
import httplib2

links = []

class MyHTMLParser(HTMLParser):
    def handle_starttag(self, tag, attrs):
        # collect links of the form /wiki/..., skipping files and Wikipedia meta pages
        if tag == "a" and attrs and attrs[0][1].startswith("/wiki/") \
                and not attrs[0][1].startswith("/wiki/File") \
                and attrs[0][1].find("Wikipedia") == -1:
            links.append(attrs[0][1])

def countlinks(keyword):
    '''This function counts the number of pages which are visited when
    you begin with the 'keyword' page on Wikipedia and keep following
    the first link until you reach the page for Philosophy.'''
    h = httplib2.Http()
    site = ''
    parser = MyHTMLParser()
    count = 0
    keywords_seen = []
    global links
    url = site + keyword
    while keyword != "Philosophy":
        response, content = h.request(url)
        content = str(content)
        content = content.decode("utf-8").encode("ascii", "ignore")
        start = content.find("<p>")  # Wikipedia explanations begin at a <p> tag
        parser.feed(content[start:])
        keyword = links[0].replace("/wiki/", '')
        initial = 1
        while keyword in keywords_seen:  # skip pages we have already visited
            keyword = links[initial].replace("/wiki/", '')
            initial += 1
        keywords_seen.append(keyword)

        links = []
        url = site + keyword
        print "visiting page about: ", keyword
        count += 1

    print "\n found Philosophy after ", count, " links\n"

 Save this code as "" and run it using:
python <enter>
<enter the word you want to start with, e.g. dog, cat, Rolling Stones, etc.> <hit enter>

P.S.: The distance from 'dog' to Philosophy is 23 links.