Top 10 Concepts That Every Software Engineer Should Know



Written by Alex Iskold / July 22, 2008 8:21 PM / 51 Comments

The future of software development is about good craftsmen. With infrastructure like Amazon Web Services and an abundance of basic libraries, it no longer takes a village to build a good piece of software.

These days, a couple of engineers who know what they are doing can deliver complete systems. In this post, we discuss the top 10 concepts software engineers should know to achieve that.

A successful software engineer knows and uses design patterns, actively refactors code, writes unit tests and religiously seeks simplicity. Beyond the basic methods, there are concepts that good software engineers know about. These transcend programming languages and projects - they are not design patterns, but rather broad areas that you need to be familiar with. The top 10 concepts are:

  1. Interfaces
  2. Conventions and Templates
  3. Layering
  4. Algorithmic Complexity
  5. Hashing
  6. Caching
  7. Concurrency
  8. Cloud Computing
  9. Security
  10. Relational Databases

10. Relational Databases

Relational databases have recently been getting a bad name because they cannot scale well to support massive web services. Yet this is one of the most fundamental achievements in computing, one that has carried us for two decades and will remain with us for a long time. Relational databases are excellent for order management systems, corporate databases and P&L data.

At the core of the relational database is the concept of representing information in records. Each record is added to a table, which defines the type of information it holds. The database offers a query language for searching the records, nowadays almost always SQL, and a way to correlate information from multiple tables.

The technique of data normalization is about correct ways of partitioning the data among tables to minimize data redundancy and maximize the speed of retrieval.
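To make the records, tables and correlation ideas concrete, here is a small sketch using Python's built-in sqlite3 module. The table names and data are invented for illustration; customers are stored once and orders reference them by id, so the customer name is never duplicated:

```python
import sqlite3

# In-memory database with two normalized tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY,
                         customer_id INTEGER REFERENCES customer(id),
                         total INTEGER);
    INSERT INTO customer VALUES (1, 'Ada'), (2, 'Grace');
    INSERT INTO orders VALUES (1, 1, 20), (2, 1, 5), (3, 2, 42);
""")

# A join correlates the two tables at query time.
rows = conn.execute("""
    SELECT c.name, SUM(o.total)
    FROM customer c JOIN orders o ON o.customer_id = c.id
    GROUP BY c.name ORDER BY c.name
""").fetchall()
print(rows)  # [('Ada', 25), ('Grace', 42)]
```

The same query would work against any relational database; sqlite3 is just a convenient zero-setup way to try it.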

9. Security

With the rise of hacking and data sensitivity, security is paramount. Security is a broad topic that includes authentication, authorization, and information transmission.

Authentication is about verifying user identity. A typical website prompts for a password; the authentication typically happens over SSL (Secure Sockets Layer), a way to transmit encrypted information over HTTP. Authorization is about permissions and is important in corporate systems, particularly those that define workflows. The recently developed OAuth protocol lets web services allow users to open up access to their private information. This is how Flickr permits access to individual photos or data sets.
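As an illustration of the password side of authentication, here is a hedged Python sketch using only the standard library. The function names are invented; the point is the pattern: store a salt plus a slow, salted hash, never the password itself, and compare in constant time:

```python
import hashlib, hmac, os

def hash_password(password, salt=None):
    """Derive a slow, salted hash; store the salt and digest, never the password."""
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, stored_digest):
    """Constant-time comparison avoids leaking information via timing."""
    _, digest = hash_password(password, salt)
    return hmac.compare_digest(digest, stored_digest)

salt, stored = hash_password("s3cret")
print(verify_password("s3cret", salt, stored))   # True
print(verify_password("guess", salt, stored))    # False
```

In practice you would reach for a vetted library rather than rolling your own, but the shape of the check is the same.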

Another security area is network protection. This concerns operating systems, configuration and monitoring to thwart hackers. Not only the network is vulnerable; any piece of software is. The Firefox browser, marketed as one of the most secure, has to be patched continuously. Writing secure code for your system requires understanding its specifics and potential problems.

8. Cloud Computing

In our recent post Reaching For The Sky Through Compute Clouds, we talked about how commodity cloud computing is changing the way we deliver large-scale web applications. Massively parallel, cheap cloud computing reduces both costs and time to market.

Cloud computing grew out of parallel computing, the idea that many problems can be solved faster by running the computations in parallel.

After parallel algorithms came grid computing, which ran parallel computations on idle desktops. One of the first examples was the SETI@home project out of Berkeley, which used spare CPU cycles to crunch data coming from space. Grid computing is widely adopted by financial companies, which run massive risk calculations on it. The concept of under-utilized resources, together with the rise of the J2EE platform, gave rise to the precursor of cloud computing: application server virtualization. The idea was to run applications on demand and change what is available depending on the time of day and user activity.

Today's most vivid example of cloud computing is Amazon Web Services, a package of offerings available via an API. It includes a compute service (EC2), storage for large media files (S3), a simple database service (SimpleDB), and a queue service (SQS). These first building blocks already enable an unprecedented way of doing large-scale computing, and surely the best is yet to come.

7. Concurrency

Concurrency is one topic engineers notoriously get wrong, and understandably so: the brain struggles to juggle many things at a time, and schools emphasize linear thinking. Yet concurrency is important in any modern system.

Concurrency is about parallelism inside the application. Most modern languages have a built-in concept of concurrency; in Java, it's implemented using threads.

A classic concurrency example is producer/consumer, where a producer generates data or tasks and places them on a queue for worker threads to consume and execute. The complexity in concurrent programming stems from the fact that threads often need to operate on common data: each thread has its own sequence of execution but accesses shared state. One of the most sophisticated concurrency libraries was developed by Doug Lea and is now part of core Java.
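The article describes the pattern in Java terms; the same producer/consumer shape can be sketched in Python with a thread-safe queue. The squaring "work" is a stand-in for any real task:

```python
import queue, threading

tasks = queue.Queue()
results = []
lock = threading.Lock()

def consumer():
    while True:
        item = tasks.get()          # blocks until a task is available
        if item is None:            # sentinel: no more work for this worker
            break
        with lock:                  # protect the shared results list
            results.append(item * item)

workers = [threading.Thread(target=consumer) for _ in range(4)]
for w in workers:
    w.start()

for n in range(10):                 # the producer places work on the queue
    tasks.put(n)
for _ in workers:
    tasks.put(None)                 # one sentinel per worker
for w in workers:
    w.join()

print(sorted(results))  # [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
```

The queue does the hard part: it serializes access to the shared task list, so the only locking left to the programmer is around the shared results.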

6. Caching

No modern web system runs without a cache, an in-memory store that holds a subset of the information typically kept in the database. The need for a cache comes from the fact that generating results from the database is costly. For example, if you have a website that lists the books that were popular last week, you'd want to compute this information once and place it into the cache; user requests then fetch the data from the cache instead of hitting the database and regenerating the same information.

Caching comes with a cost: only a subset of the information can be stored in memory. The most common data pruning strategy is to evict the items least recently used (LRU). The pruning needs to be efficient so that it does not slow down the application.
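The LRU policy mentioned above fits in a few lines. A minimal sketch using an ordered dictionary, where recency is tracked by position and eviction pops from the old end:

```python
from collections import OrderedDict

class LRUCache:
    """Evicts the least recently used entry once capacity is exceeded."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key):
        if key not in self.data:
            return None
        self.data.move_to_end(key)          # mark as most recently used
        return self.data[key]

    def put(self, key, value):
        self.data[key] = value
        self.data.move_to_end(key)
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)   # evict the oldest entry

cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")          # touch "a", so "b" becomes least recently used
cache.put("c", 3)       # capacity exceeded: evicts "b"
print(cache.get("b"), cache.get("a"))  # None 1
```

Both operations are constant time, which is what "the pruning needs to be efficient" demands in practice.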

A lot of modern web applications, including Facebook, rely on a distributed caching system called Memcached, developed by Brad Fitzpatrick while working on LiveJournal. The idea was to create a caching system that utilizes spare memory capacity on the network. Today, there are Memcached libraries for many popular languages, including Java and PHP.

5. Hashing

The idea behind hashing is fast access to data. If data is stored sequentially, the time to find an item is proportional to the size of the list. A hash function calculates a number for each element, which is used as an index into a table; given a good hash function that uniformly spreads data across the table, look-up time is constant. Perfect hashing is difficult to achieve, so hashtable implementations support collision resolution.
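A toy hashtable makes the index-and-collide mechanics visible. This sketch resolves collisions by chaining: each slot holds a small list, and a lookup scans only one bucket rather than all the data:

```python
class ChainedHashTable:
    """Toy hash table: collisions are resolved by chaining within buckets."""
    def __init__(self, size=8):
        self.buckets = [[] for _ in range(size)]

    def _bucket(self, key):
        # The hash value, reduced modulo the table size, picks the bucket.
        return self.buckets[hash(key) % len(self.buckets)]

    def put(self, key, value):
        bucket = self._bucket(key)
        for i, (k, _) in enumerate(bucket):
            if k == key:
                bucket[i] = (key, value)    # overwrite an existing key
                return
        bucket.append((key, value))

    def get(self, key):
        for k, v in self._bucket(key):      # scan one bucket, not the whole table
            if k == key:
                return v
        return None

t = ChainedHashTable()
t.put("apple", 3)
t.put("pear", 7)
print(t.get("apple"), t.get("missing"))  # 3 None
```

Real implementations add resizing once buckets grow past a load factor, which is exactly the tuning knob the article alludes to.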

Beyond the basic storage of data, hashes are also important in distributed systems. The so-called consistent hash is used to evenly allocate data among the computers in a cloud database. A flavor of this technique is part of Google's indexing service: each URL is hashed to a particular computer. Memcached similarly uses a hash function.

Hash functions can be complex and sophisticated, but modern libraries have good defaults. The important thing is to understand how hashes work and how to tune them for maximum performance benefit.
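The distributed use of hashing can also be sketched briefly. This is a simplified consistent-hash ring, with invented server names; both nodes and keys are hashed onto the same ring, and a key belongs to the first node at or after its position:

```python
import bisect, hashlib

class HashRing:
    """Keys map to the first node clockwise on a ring of hashed positions."""
    def __init__(self, nodes):
        self.ring = sorted((self._h(n), n) for n in nodes)

    @staticmethod
    def _h(value):
        # Any stable hash works; md5 here just gives a deterministic spread.
        return int(hashlib.md5(value.encode()).hexdigest(), 16)

    def node_for(self, key):
        pos = bisect.bisect(self.ring, (self._h(key), ""))
        return self.ring[pos % len(self.ring)][1]  # wrap around the ring

ring = HashRing(["server-a", "server-b", "server-c"])
owner = ring.node_for("user:42")
print(owner in {"server-a", "server-b", "server-c"})  # True
```

The appeal over plain `hash(key) % n`: when a node joins or leaves, only the keys adjacent to it on the ring move, instead of nearly everything being reshuffled. Production versions add virtual nodes for a more even spread.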

4. Algorithmic Complexity

There are just a handful of things engineers must know about algorithmic complexity. First is big O notation: if something takes O(n), it's linear in the size of the data; O(n^2) is quadratic. Using this notation, you should know that searching through an unsorted list is O(n), binary search through a sorted list is O(log n), and sorting n items takes O(n log n) time.
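Binary search is the canonical O(log n) example: each comparison halves the remaining range, so even a million items need only about twenty steps. A short sketch:

```python
def binary_search(sorted_items, target):
    """O(log n): each comparison halves the remaining search range."""
    lo, hi = 0, len(sorted_items)
    while lo < hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        if sorted_items[mid] < target:
            lo = mid + 1        # target is in the upper half
        else:
            hi = mid            # target is in the lower half
    return -1                   # not found

items = [2, 3, 5, 7, 11, 13, 17]
print(binary_search(items, 11))  # 4
print(binary_search(items, 4))   # -1
```

Contrast with a linear scan, which would touch every element in the worst case; the precondition, of course, is that the list is already sorted.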

Your code should (almost) never have deeply nested loops (a loop inside a loop inside a loop). Most of the code written today should use hashtables, simple lists and singly nested loops.

Due to the abundance of excellent libraries, we are not as focused on efficiency these days. That's fine, because tuning can happen later, after you get the design right.

Elegant algorithms and performance are things you shouldn't ignore, though. Writing compact and readable code helps ensure your algorithms are clean and simple.

3. Layering

Layering is probably the simplest way to discuss software architecture. It first got serious attention when John Lakos published his book Large-Scale C++ Software Design. Lakos argued that software consists of layers, and the book introduced a method for measuring them: for each software component, count the number of other components it depends on. That number is a metric of the component's complexity.

Lakos contended that good software follows the shape of a pyramid; i.e., there's a progressive increase in the cumulative complexity of each component, but not in the immediate complexity. Put differently, a good software system consists of small, reusable building blocks, each carrying its own responsibility. In a good system, no cyclic dependencies between components are present, and the whole system is a stack of layers of functionality, forming a pyramid.
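The no-cyclic-dependencies rule can be checked mechanically. A small sketch (component names invented) that detects a cycle in a dependency graph with depth-first search:

```python
def has_cycle(deps):
    """deps maps each component to the components it depends on."""
    WHITE, GRAY, BLACK = 0, 1, 2        # unvisited / on current path / done
    color = {node: WHITE for node in deps}

    def visit(node):
        color[node] = GRAY
        for dep in deps.get(node, []):
            if color.get(dep, WHITE) == GRAY:
                return True             # back edge: a cyclic dependency
            if color.get(dep, WHITE) == WHITE and dep in deps and visit(dep):
                return True
        color[node] = BLACK
        return False

    return any(color[n] == WHITE and visit(n) for n in deps)

layered = {"app": ["lib"], "lib": ["util"], "util": []}
cyclic  = {"app": ["lib"], "lib": ["app"]}
print(has_cycle(layered), has_cycle(cyclic))  # False True
```

Tools like the ones mentioned below do essentially this, plus visualization, against real codebases.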

Lakos's work was a precursor to many later developments in software engineering, most notably refactoring: the idea of continuously sculpting the software to ensure it is structurally sound and flexible. Another major contribution came from Dr. Robert Martin of Object Mentor, who wrote about dependencies and acyclic architectures.

Among the tools that help engineers deal with system architecture are Structure101, developed by Headway Software, and SA4J, developed by my former company, Information Laboratory, and now available from IBM.

2. Conventions and Templates

Naming conventions and basic templates are the most overlooked software patterns, yet probably the most powerful.

Naming conventions enable software automation. For example, the JavaBeans framework is based on a simple naming convention for getters and setters. And on tagging sites, canonical URLs take the user to the page that has all items tagged software.

Much social software uses naming conventions in a similar way: if your user name is johnsmith, then your avatar is likely johnsmith.jpg and your RSS feed johnsmith.xml.

Naming conventions are also used in testing; for example, JUnit automatically recognizes all the methods in a class that start with the prefix test.
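That style of discovery takes only a few lines. A toy runner (class and method names invented), which collects methods purely by the test prefix, much as JUnit 3 did:

```python
class CalculatorTests:
    def test_addition(self):
        assert 1 + 1 == 2

    def test_negation(self):
        assert -(-5) == 5

    def helper(self):           # no "test" prefix, so it is never collected
        raise RuntimeError("should not run")

def run_tests(suite_class):
    """Collect and run every method whose name starts with 'test'."""
    suite = suite_class()
    ran = []
    for name in dir(suite):
        if name.startswith("test"):
            getattr(suite, name)()      # invoke the discovered test
            ran.append(name)
    return ran

print(run_tests(CalculatorTests))  # ['test_addition', 'test_negation']
```

No registration, no configuration file: the convention alone tells the runner what to execute.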

The templates here are not C++ or Java language constructs. We're talking about template files that contain variables, allow binding of objects and their resolution, and render the result for the client.

ColdFusion was one of the first to popularize templates for web applications. Java followed with JSPs, and more recently Apache developed a handy general-purpose templating engine for Java called Velocity. PHP can serve as its own templating engine because it supports an eval function (be careful with security). For XML programming, the standard is to use the XSL language for templates.
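The bind-and-render cycle is the same in every engine. A minimal sketch with Python's standard-library string.Template (the email text and field names are invented):

```python
from string import Template

# A template's contents: placeholders are bound to values at render time.
support_email = Template(
    "Hello $name,\n"
    "Your ticket #$ticket has been $status.\n"
)

# Binding: each variable resolves against the supplied values.
body = support_email.substitute(name="Ada", ticket=1042, status="resolved")
print(body)
```

Velocity, JSP and the rest add loops, conditionals and escaping on top, but the core idea, a file of text with holes plus a dictionary of values, is this simple.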

From generation of HTML pages to sending standardized support emails, templates are an essential helper in any modern software system.

1. Interfaces

The most important concept in software is the interface. Any good software is a model of a real (or imaginary) system. Understanding how to model the problem in terms of correct and simple interfaces is crucial. Many systems suffer from one of two extremes: clumped, lengthy code with few abstractions, or an over-designed system with unnecessary complexity and unused code.
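A minimal sketch of programming to an interface, using Python's abc module. The Storage interface and both class names are invented for illustration; the point is that callers depend on two methods and nothing else, so implementations can be swapped freely:

```python
from abc import ABC, abstractmethod

class Storage(ABC):
    """A small interface: callers depend on these two methods, nothing more."""
    @abstractmethod
    def save(self, key, value): ...

    @abstractmethod
    def load(self, key): ...

class MemoryStorage(Storage):
    """One concrete model; a file- or database-backed one could replace it."""
    def __init__(self):
        self._data = {}

    def save(self, key, value):
        self._data[key] = value

    def load(self, key):
        return self._data.get(key)

store = MemoryStorage()
store.save("greeting", "hello")
print(store.load("greeting"))  # hello
```

Keeping the interface this small is the minimalism the article argues for: no speculative methods, just what callers actually need today.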

Among the many books, Agile Software Development by Dr. Robert Martin stands out for its focus on modeling correct interfaces.

In modeling, there are ways you can iterate toward the right solution. First, never add methods that might be useful in the future; be minimalist and get away with as little as possible. Second, don't be afraid to recognize today that what you did yesterday wasn't right, and be willing to change things. Third, be patient and enjoy the process. Ultimately you will arrive at a system that feels right; until then, keep iterating and don't settle.


Modern software engineering is sophisticated and powerful, with decades of experience, millions of lines of supporting code and unprecedented access to cloud computing. Today, just a couple of smart people can create software that previously required the efforts of dozens. But a good craftsman still needs to know which tools to use, when and why.

In this post we discussed concepts that are indispensable for software engineers. Now tell us: what would you add to this list? Share the concepts you find indispensable in your daily software engineering journeys.



  • I'd like to add:

    1. MVC - Almost every framework uses basic concepts of MVC now; Ruby on Rails, Django, even the iPhone SDK.

    2. Object Orientation - everyone says they know it, but to really understand it is a different story.


    I disagree with a couple of these

    1. Relational Databases - There is a strong movement away from conventional SQL statements; if you look at Ruby on Rails and other similar frameworks, it's being abstracted. And I think this is the proper way to do it because it takes the burden off the average software engineer.

    2. Cloud Computing - Similarly, this should be abstracted as well.

    What do you think?

    Posted by: Charles Ju | July 22, 2008 9:49 PM

  • The first technical question I ask of any programmer is simply "what are the benefits and costs of using a db index". You'd think that would be a waste of time but 2/3 of all applicants cannot provide a reasonable answer to it.

    Having an enterprise background, I'm taken aback by the general lack of understanding most internet programmers have regarding relational databases. It's not that they view them as bad for the scaling reasons cited; they simply have not learned what they can do beyond storing and indexing simple datasets.

    I am in the habit of now quoting every new hire with "Relational dbs have evolved from billions of dollars and millions of hours to find ways to organize and relate information. There are many interesting tools / techniques hidden in there. If your solution to every problem is in code, it's wrong, if your solution to every problem is in the db, it's wrong, the correct answer is knowing how to solve problems leveraging both." Instead of poking the db with simple queries I always run my developers through an exercise of "tell the db everything you know in a very complex sql statement and see what it does with it". They quickly become quite interested in what other things this "magical db" thing can do. The quote and exercise are a bit of an exaggeration, but the intent is to get them excited about something that is as interesting and useful as the tools they are accustomed to. These are smart people; after they understand my general point their curiosity kicks in and we let them roll. Qualifications: dbs have limitations and not every Internet service needs to scale to the size of Facebook or Twitter.

    Troubleshooting is often characterized as a skill but there are learnable concepts that make you more efficient than just running by instinct. For example, when you see your programmer burning a hole through their screen grab a coffee and talk them through an example of the "division by halves" method of solving complex problems. When you're in the heat of battle you often don't think consciously of these simple tricks.

    Posted by: Steve Ireland | July 22, 2008 10:00 PM

  • I would argue that the first point - about database design - is wrong. DE-normalization brings speed (causing duplication on purpose to avoid joins).

    Posted by: Jonathan Beckett Posted on FriendFeed   | July 22, 2008 10:47 PM

  • great post very interesting thanks

    Posted by: dave | July 23, 2008 12:41 AM

  • It's "memcached" not "memecached"... unless you want to cache memes...

    Posted by: Dave | July 23, 2008 12:42 AM

    Good post. Must follow for a Software Engineer.


    Posted by: kiran voleti | July 23, 2008 12:57 AM

  • Good post. However things like ORM, Generics should also count

    Posted by: Varun | July 23, 2008 1:40 AM

  • Missing data mining /machine learning.

    Posted by: Uroš Jurglič | July 23, 2008 2:57 AM

  • Great post, although very specific. The list of what a s/w engineer should know about will never end: regexps, generics,data structures,algorithms (not only algorithm complexity or hashing), OOP (not only interfaces)

    Anyway the point is that to be a good software engineer requires character, not knowledge.

    The 1st makes you write structured and re-usable code.
    The 2nd makes you an active programmer that searches for results and efficiency.
    The 3rd is the egoism and pride you must take in your work.

    This is the Perl culture which I have gladly embraced.

    Posted by: panos | July 23, 2008 3:00 AM

  • I agree with Varun - ORM ( should definitely be in there as well. This is a valuable method for DB design which incorporates Boyce Codd normalization.

    Also I would emphasize good documentation and communication. Documents like the SDD (Software design document) and SRS (Software requirements specs) are vital for project cycles.

    Anyway - thanks for a great post !

    Posted by: Steven Teerlinck | July 23, 2008 3:01 AM

  • Great post, thank you very much.

    Posted by: Gökçer Peynirci | July 23, 2008 4:49 AM

  • Nice list. I would also add these concepts...
    MVC - especially for web apps
    Testing(As in TDD - Test Driven Development)
    Regular Expression

    And some more...

    Posted by: Binny V A | July 23, 2008 5:28 AM

    Really good post. I will try to learn all these things to become a good software engineer.

    Posted by: Ajay | July 23, 2008 5:51 AM

  • OOP is also important.

    Posted by: Author Profile Page | July 23, 2008 6:16 AM

  • Very good list. One item that you have completely missed is testing / knowledge of automated testing tools. One quote that I keep telling developers at Veda is "Bad tester cannot be a good developer."

    Harish Agarwal

    Posted by: Harish Agrawal | July 23, 2008 6:17 AM

  • "The technique of data normalization is about correct ways of partitioning the data among tables to minimize data redundancy and maximize the speed of retrieval."

    Well, no. Normalization is about referential integrity. Minimizing redundancy is only one of the methods of achieving that. And it often works counter to maximizing speed of retrieval. That's why, as Jonathan Beckett points out above, real working databases often have to go through some DEnormalization.

    Posted by: J. Random User | July 23, 2008 6:45 AM

    @Charles I agree that basic CRUD functions have been (and should be) abstracted, which saves a lot of time. However, it is quite normal for enterprise apps to relate data in ways that can only be done working "within the dbms", thus it's best to know your tools.

    @Jonathan "Joins are evil" is the first myth I hear from new programmers. Properly indexed and normalized tables generally do not take any more time to retrieve than results from a single table. I would encourage you to try it for yourself. DBM systems are super-optimized to do just this very thing. Contrast any measured difference with the time and complexity of doing a similar thing all in code. Again, there are exceptions if you are SAP or Twitter the scaling factor influences the overall approach more than the data access and reporting methods. The most important opportunity you are missing by avoiding joins is that a system with an effective "data model" can dramatically reduce the amount of code required in the app i.e. many answers can be provided through simple db configuration instead of "brute force" in code.

    If you are looking into DEnormalization as a strategy to improve performance (there are limited exceptions to my statement above), you should weigh the cost of long-term maintenance and the data integrity factor against the perceived development and performance benefits. Having to maintain a field value in many different places can be frustrating to manage.

    There are interesting alternatives emerging but the heavy lifting is still being done with traditional DBMSs. ORM has yet to break out because of poor performance and limitations for data manipulation and reporting across sets. I think the next "big thing" will not sit on a DBMS but will replace it altogether.

    Posted by: Steve Ireland | July 23, 2008 7:10 AM

    The fear of the join is usually unfounded. In properly designed and indexed tables the cost of a join is usually trivial. Obviously there are exceptions, but I think a lot of people who argue for denormalization are people who don't like to think hard about their queries and want all their answers in one table.

    Also I think its a bit premature to talk about the impending demise of the relational database. I'd imagine the marketability of that skill will be increasing for many years to come.

    Walt Disney World For Grownups

    Posted by: BJohnson | July 23, 2008 7:25 AM

  • I agree that OOP should be added, and maybe PureMVC.

    However, I think it's imperative that people know relational database. Understanding how indexes work (and how they optimize) is an important part of understanding how data is stored. Sorting (indexes), storage (the actual database), organization (foreign keys), creativity (writing difficult SQL statements). Those are important in almost every web application.

    While the trend may be going away from it, I have to agree with BJohnson. It's way too early to say that! People are still using MySQL 3! I don't see it going away anytime soon. A good server side PHP/JSP/ASP programmer can abstract it or what have you, but it's still important to understand the CONCEPT.

    Posted by: Danny Miller | July 23, 2008 8:00 AM

  • Great write-up Alex. We've been hiring some really strong software engineers and have interviewed many more. I am concerned by how many early career engineers underestimate the need to understand how the relational database optimizer works. Deferring all understanding to an ORM and/or the optimizer is dangerous. Even though we all work in modern, high level languages, understanding how things work in C, assembly and the DB engine help you design robust, scalable software. Being able to comprehend the query plan is the prerequisite to being able to tune queries and/or the redesign the data model for optimal efficiency.

    Posted by: Kevin Merritt | July 23, 2008 8:15 AM

  • apologies if people have already mentioned this (i didn't read all the comments), but i think that defensive programming using assertions and class invariants (for OOP) should definitely be on the list. I guess that is related to interfaces (i.e., enforcing interface integrity explicitly via runtime assertions):

    Posted by: pg | July 23, 2008 9:07 AM

    I'm first an "old" mainframe guy and subsequently IV&V (Balto, DC), now real estate (South Jersey Shore). Nice article and I love the techie stuff! However, I keep shrieking and thinking constantly about the client/customer who says he wants a bridge to somewhere and our folks go ahead and give him the ultimate, we do him one better, turns out being "a bridge to nowhere." A little heavy on the pedal on my part, maybe, but, how/when are we going to get/make tools for the client so that we know exactly what he meant and subsequently he knows that we get what he needs? Or is he just too dumb and we know what's best. Sorry, I am troubled by how often I've seen techies take over the business, thinking we are it, when actually the business is something many of us know little about. I'm pretty sure I saw techies drive a company down, once... The screaming unheard was the tree falling in the forest... And WOW, I am just amazed at what you all have been doing - Amazon (I was from Brasil by the way), eBay, Facebook, Craigslist, etc. And me, I'm still learning how to use the mouse - - why did they make those pads so small? OK, I'll try to be more serious, if there is a next time. Thanks for a wonderful article - - got me thinking again. Michael.

    Posted by: m | July 23, 2008 9:34 AM

  • I think you aimed too high with "top concepts" -- these are all useful tools and valuable theories,

    But as someone pointed out, focusing on the actual business that is using your software is really vital. Doesn't matter how elegant the code, if you solve the wrong problems.

    And my contribution is to extend that to user interface, user experience and usability (yes, those are all different things). Again, the quality of the code, even speed, can be less important than interacting with the users and providing them a system to do what they need to do.

    Posted by: Avi Rappoport | July 23, 2008 10:46 AM

  • Thank you very much for this great article, I really enjoyed reading it and thinking about it.

    Posted by: squid | July 23, 2008 12:18 PM

  • My list would be (roughly in chronological order):
    Communicating with people
    Basic Algorithms and Data Structures
    Patterns for OO Design
    Data Modeling
    Architectural patterns
    Unit Testing
    Domain Driven Design
    Requirements analysis
    Basics of agile project management

    Posted by: Oolis | July 23, 2008 12:35 PM

  • Like #4 with the link to the Aho Hopcraft Ullman algorithms book - a forgotten classic - check it out! Yes, the A in AWK, two of the three authors of the Dragon book. Also check out Ullman's Introduction to Automata, Languages, and Computation. (BTW get the original edition of ItALaC and the 2nd edition of the Dragon Book! Recent updates have ruined the classics, but you can still get the classics.)

    Posted by: History Lives! | July 23, 2008 1:01 PM

  • Sadly, software clowns are so far behind the hardware guys its laughable. Light years behind the hardware guys.

    Modern hardware sits largely un-utilized as more and more abstraction makes it nearly impossible to optimize for hardware.

    Knuckle-dragging software clowns that use 64MB on a whim would have been unable to write a hello world program on a PDP-11.

    All this framework-abstraction has led to a brutal increase in software memory/cpu footprint with only marginal gains in functionality.

    Now the compiler guys have to work overtime to try and clean up the mess, but more and more comes out by the day to make the highly capable hardware underneath slow as dogs.

    Software "progress" needs to be halted while the mess is cleaned up. A lot of the 50/60's programming done right is coming back, but the abstraction for retarded programmers is still strong.

    Software people: take a few hardware courses. Learn a little verilog, grab an FPGA dev board and fool around with some hardware, talk to a few verification folks and realize that why most asics work first revision is because unlike 99.9% of the software clowns, everything in hardware is simulated, unit tested, verified, regression tested and simulated again, sometimes on an FPGA before first silicon.

    Software clowns constantly throw broken CRAP over the wall. Regression? Verification? unit tests? Whats that?

    Posted by: Mick Russom | July 23, 2008 1:19 PM

  • As several people pointed out, you didn't mention OOP, IMHO the biggest development in the last 20 years.

    More than anything else, it promotes Encapsulation/Data Hiding/Decoupling. This is the idea that two components should have as few inter-dependencies as possible. OO facilitates this, as does layering, but they don't enforce it.

    Much of this is implied in your Layering item, though I think that Layering is an additional concept.

    An early and very prescient book on this is "Reliable Software Through Composite Design" by Glenford J. Myers.

    Posted by: Phil Mayes | July 23, 2008 1:22 PM

  • @Phil

    I did mention elements of OOP - interfaces, which are the most important one, as well as layering, which is composition. Inheritance has been misused, and I do not think it is of the same importance.

    Posted by: Alex Iskold | July 23, 2008 1:26 PM

    Great post. Well, besides my experience in business and media, I am studying PHP 5 now, still a beginner, but all my developer friends should read this too.

    Posted by: ArabCrunch | July 23, 2008 3:43 PM

  • Know a craftsman by his tools. A programmer (text editor) cannot create a whole system - he requires at least a graphic designer (Photoshop), a business person (Phone), and an admin (bash & vi). They make it pretty, sell it, and keep it running respectively.

    Furthermore, there is a useful distinction between front-end and backend programmers. The front end guy must know HTML CSS JavaScript regex XML JSON HTTP TCP/IP plus a handful of JavaScript libraries, coding conventions, browser quirks, and some human interface guidelines. A good design sense doesn't hurt, either. He'll be using a text editor + firebug + http proxy + a small set of command line tools like curl. (This could also be a Flash or GWT specialist).

    The back-end guy must know Java (or Perl or Ruby or whatever) plus a variety of core libraries (like Collections, JDBC) and a variety of application frameworks (SpringMVC, Rails, etc), and of course be a SQL expert. He knows about REST, caching, and scaling. The more he knows about the OS (process overhead, security, etc) and network architecture, the better. This is the guy you seem to be describing in this article. He'll be using Eclipse and a database client, like SQLyog, and the build system.

    The admin's primary goal is to keep the application running, but he has a lot of programmer-like qualities, mainly for doing ad hoc scripts. He should know everything about the db, the os, and the network. He lives in SSH, the command line and in configuration files. This is the cloud computing specialist (although certainly the backend programmer should know something about it, too).

    Posted by: Josh Rehman | July 23, 2008 3:57 PM

  • @Mick - nice little rant.

    I'm a hardware guy who successfully made the transition through EDA into Web 2.0 and I can empathize with verilog-twiddlers (not VHDL ... tsk tsk) who wouldn't know an abstraction if it bit them in the aspen.

    Your highlight was "...most asics work first revision is because unlike 99.9% of the software clowns, everything in hardware is simulated, unit tested, verified, regression tested and simulated again..." - how exactly do you simulate, test, and verify without all that BROKEN CRAPPY software?

    Posted by: D Ashcart | July 23, 2008 4:43 PM

  • I agree with most things, but actually, not so much with knowing efficient algorithms. Sure, someone needs to know these things, but in the modern world it is not strictly necessary. First of all, libraries which implement efficient algorithms are common and accessible.

    In addition, modern hardware makes inefficient algorithms ok to use. Trying to make an algorithm more and more efficient may actually be hurting the development effort because the extra efficiency is not necessary, and all the expended effort in squeezing out that extra efficiency could have been used elsewhere.

    Instead, the focus should be on an agile development method which focuses on quickly delivering results. If it works, don't fix it. And that saying could not be more true today with modern software libraries and modern hardware.

    Posted by: Zach | July 23, 2008 5:35 PM

    I think knowing efficient algorithms is necessary. Whether you put them to use in the "modern world" will depend on what type of software you are doing. But believe me, it is great to know how things work.

    Posted by: Yasser | July 23, 2008 6:01 PM

  • Thank you for this list and books. I would add reflection to this list. I saw that few others already added ORM (Object-relational mapping). The threesome: convention, reflection and ORM can help in developing code that is not just compressed but also maintainable and scalable.

    Posted by: Keren Dagan | July 23, 2008 7:43 PM

    I would definitely add Design Patterns and Optimization Techniques.

    Posted by: Antonio Sánchez De Tagle | July 23, 2008 8:20 PM

    Where is multicore design, when Intel and AMD are saying that software supporting their multicore CPUs is the bottleneck?

    Posted by: sam6 | July 24, 2008 12:06 AM


  • I think 4, 5 and 8 are not required. Instead I would add

    1 Good Communication
    2 Discipline (writing commented, readable, maintainable code)
    3 Patterns

    Posted by: critic | July 24, 2008 3:14 AM

  • Is this like a shortcut to the 4 year degree of Computer Science? Most of these concepts relate to what any good university should be teaching its students and then a whole lot more (algorithms, automata, operating systems etc etc).

    Posted by: Quli | July 24, 2008 8:47 AM

  • I really like the way you've presented this article. A very systematic and well thought of approach. Thanks for your effort and for sharing this article.

    Posted by: Welcome to Paradise | July 24, 2008 8:50 AM

  • A great post and quite interesting too, thanks.

    Posted by: Shreemani | July 24, 2008 9:19 AM


    I'm more of an "interface" guy, because I believe that in today's world people are more interested in better-packaged products than the rivals'. And it will make an impact in today's cut-throat competition. The other points are equally important too.

    Posted by: United Voices | July 24, 2008 10:24 AM

  • nice article, I think it's useful for newbie like me

    Posted by: coffeco | July 24, 2008 11:17 PM

  • The real #1!

    Stick with Microsoft!

    Posted by: steveballmer | July 25, 2008 4:04 PM

  • thanks! great post man..

    Posted by: Suriya Sripalang | July 26, 2008 3:07 AM