In most interviews, you are asked questions about optimizing your code. For example, a common one is to find a faster way to multiply by 7. The expected answer is to left shift the number by 2, 1, and 0 bits and add the results.
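To make the trick concrete, here is a minimal sketch (since 7 = 4 + 2 + 1, the shifted terms simply add up):

```c
#include <assert.h>

/* Multiply by 7 without a multiply instruction:
   n * 7 = n * (4 + 2 + 1) = (n << 2) + (n << 1) + n */
unsigned times7(unsigned n) {
    return (n << 2) + (n << 1) + n;
}

int main(void) {
    assert(times7(6) == 42);  /* sanity check */
    return 0;
}
```

An even shorter form is `(n << 3) - n`, which is roughly what an optimizing compiler emits on its own anyway -- which brings us to the point below.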

The point, though, is that modern compilers are so efficient that these things are taken care of by the compiler itself. Below is something I found in a LinkedIn forum about optimization, by Alex Oren, Sr. Software Developer at a privately held software development company.

Some things to consider, in arbitrary order:

1. Optimizers have gotten pretty clever. Manual techniques that were considered good code optimizations a decade or two ago have been rendered obsolete and now serve just to obfuscate code.

For example, consider dividing a variable by a constant power of two. If you look at the assembly emitted with optimizations turned on, you'll see that the compiler uses a (faster) right shift instead of a division.
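A minimal sketch of what the optimizer does here (the exact assembly depends on your compiler and target, but gcc and clang at -O2 compile the unsigned case down to a single shift):

```c
unsigned udiv8(unsigned x) {
    return x / 8;   /* compiles to a single shr (shift right) at -O2 */
}

int sdiv8(int x) {
    /* Signed division truncates toward zero, so the compiler emits a
       shift plus a small fix-up for negative values -- still no divide
       instruction, and still nothing you need to write by hand. */
    return x / 8;
}
```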

2. Advances in processors cause some "common sense" techniques to become "pessimizations", as they may hinder code reordering, vectorization, branch prediction, etc., cause cache misses (or worse, page faults) and other performance-killing issues. Remember, you can no longer just count cycles as you could in the '70s.
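As a sketch of what this means in practice, the plain loop below is exactly the shape modern optimizers vectorize well (the names are illustrative, not from the original post). Hand-unrolling it or rewriting it with "clever" pointer tricks can make it harder for the compiler to prove the transformation safe:

```c
/* A straightforward indexed loop is a prime candidate for
   auto-vectorization (SSE/AVX on x86, NEON on ARM); the compiler can
   insert its own runtime aliasing checks where it needs them. */
void scale(float *dst, const float *src, int n) {
    for (int i = 0; i < n; i++)
        dst[i] = 2.0f * src[i];
}
```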

In short, 99% of the time the compiler will do a better job of optimizing your code than you will.

3. Don't optimize unless you have a reason to believe that there is a performance problem. Define what execution speed is "good enough" and don't waste time trying to find ways to make it even faster.

For example, if you can shorten the running time of a process from 8 hours to 5, that is a worthwhile endeavor. Finding a clever way to trim another 5 minutes would be an inefficient use of *YOUR* time.

4. If you do have a performance problem, find exactly where it occurs and concentrate your efforts there.

For example, shaving milliseconds from tight loops is ridiculous if your program spends most of its time in database queries.
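A profiler (gprof, perf, VTune) is the right tool for finding where the time actually goes, but even a crude timer narrows things down. A minimal sketch, where suspected_hotspot() is a hypothetical stand-in for whatever code you want to measure:

```c
#include <stdio.h>
#include <time.h>

/* Hypothetical stand-in for the code you suspect is slow; in a real
   program this might be a query, a parsing pass, a tight loop, etc. */
static void suspected_hotspot(void) {
    volatile double sink = 0.0;
    for (long i = 0; i < 50000000L; i++)
        sink += (double)i * 0.5;
}

int main(void) {
    clock_t start = clock();
    suspected_hotspot();
    double secs = (double)(clock() - start) / CLOCKS_PER_SEC;
    printf("suspected_hotspot: %.3f s\n", secs);
    return 0;
}
```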

5. Algorithmic optimizations are much more effective than micro-optimizations. Choosing a different algorithm or data structure is key.

For example, the most efficient bubble sort will fall hopelessly behind a mediocre implementation of Heapsort or Merge Sort as the data set grows. In some cases, a Radix sort can similarly top Quicksort.
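As a concrete sketch, the fix is usually not to polish the bubble sort but to replace it -- here with the C library's qsort, which is O(n log n) on average versus bubble sort's O(n^2):

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* O(n^2) comparisons: no amount of micro-tuning the inner loop
   changes that growth rate. */
static void bubble_sort(int *a, size_t n) {
    for (size_t i = 0; i + 1 < n; i++)
        for (size_t j = 0; j + 1 < n - i; j++)
            if (a[j] > a[j + 1]) {
                int tmp = a[j];
                a[j] = a[j + 1];
                a[j + 1] = tmp;
            }
}

static int cmp_int(const void *p, const void *q) {
    int a = *(const int *)p, b = *(const int *)q;
    return (a > b) - (a < b);
}

int main(void) {
    int a[] = {5, 2, 9, 1, 7, 3};
    size_t n = sizeof a / sizeof a[0];
    int b[sizeof a / sizeof a[0]];
    memcpy(b, a, sizeof a);

    bubble_sort(a, n);                  /* fine for 6 elements... */
    qsort(b, n, sizeof b[0], cmp_int);  /* ...but this one scales */

    for (size_t i = 0; i < n; i++)
        printf("%d %d\n", a[i], b[i]);
    return 0;
}
```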

To summarize:

“More computing sins are committed in the name of efficiency (without necessarily achieving it) than for any other single reason - including blind stupidity.” -- W.A. Wulf

"Premature optimization is the root of all evil" -- Donald Knuth