Sure, paradigm is one of those "business-speak" words that was popularized in the last decade, like the Macarena and Beanie Babies. It does have etymological roots that go way back, though. The roots of computing go way back, too. If you look up "computer" in dictionaries from before the turn of the 20th century, it would be defined as a person who does computation. Whether calculating ballistic trajectories for the military, actuarial charts for insurance companies, or tide tables for sailors, computers were people working through a fixed set of formulas to derive their answers. Then, as now, computers were well-suited to perform well-defined, repetitive calculations.
Computers have obviously changed over the last 100+ years. So have the methods used to program them. One of the first electronic computers, ENIAC, used patch cords and wires to direct stored data through the different components, tables, and computing engines. The realization that a computer's program could be treated and stored in the same way as data led to the idea and implementation of the "stored-program" computer. What seems commonplace to us today was a major breakthrough back in 1948.
If programs could be stored in memory, like data, how would you encode the instructions of a program into the machine? At first, programs were written in the computer's machine language and entered directly in binary. Then came assembly language, which is just a set of mnemonics used in place of the binary machine instructions. It is easier for a human programmer to understand the operation performed by an 'ADD' instruction than it is to remember that '01101110' will do the same thing. An assembler is the program that performs the well-defined and repetitive process of translating assembly instructions into machine language.
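That mnemonic-to-binary translation really is just a table lookup repeated over and over. Here's a minimal sketch of the idea for a hypothetical machine; the instruction set and every encoding except the 'ADD' example from the text are invented for illustration:

```python
# A toy assembler for a hypothetical machine, illustrating the
# well-defined, repetitive translation from mnemonics to binary opcodes.
# All encodings besides 'ADD' (taken from the text) are made up.

OPCODES = {
    "ADD":   "01101110",  # the example encoding from the text
    "SUB":   "01101111",  # hypothetical
    "LOAD":  "00010001",  # hypothetical
    "STORE": "00010010",  # hypothetical
}

def assemble(lines):
    """Translate a list of mnemonic lines into binary opcode strings."""
    machine_code = []
    for line in lines:
        mnemonic = line.strip().upper()
        if not mnemonic:
            continue  # skip blank lines
        machine_code.append(OPCODES[mnemonic])
    return machine_code

program = ["LOAD", "ADD", "STORE"]
print(assemble(program))  # ['00010001', '01101110', '00010010']
```

A real assembler also handles operands, labels, and addressing modes, but the core job is exactly this mechanical substitution, which is why it was such a natural task to hand to the computer itself.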
Engineers, physicists, chemists, and other scientists had long realized the advantages that computers brought to their research and work. However, assembly language must have seemed like the secret language of an underground brotherhood from the lost continent of Atlantis. Scientists work with mathematical formulas, and this led to the development of FORTRAN (FORmula TRANslation), the first high-level language. High-level languages require a compiler so that a computer can perform the well-defined and repetitive process of translating the (more) human-readable programs into machine language. Since FORTRAN, we've seen a plethora of high-level programming languages and techniques, most recently object-oriented programming and managed run-time environments.
What does this history lesson have to do with anything? Well, each of these innovations in programming has been a paradigm shift. (There's another one of those '90s buzzwords.) From patch cords to binary machine instructions to assembly language to high-level languages to object-oriented programming. Now, with the advent of multi-core processors, the next paradigm shift in software is concurrent programming through threads.
Our bodies and brains do parallel processing all the time (heart beating, breathing, cogitating, walking, and chewing gum all at the same time). Thinking about things being done in parallel can still be pretty difficult, no matter how much we think we can multi-task. Yet, this is the skill that will be needed to succeed in this new programming paradigm. But, is it really all that new? No. Remember all those human computers that I mentioned? That was an example of parallel processing: each computer was assigned a portion of the whole job, each man worked at the same time as all the others, and the results were compiled together when complete. Sounds simple. Can it really be that easy to thread your applications?
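The human-computer scheme above, assign each worker a portion of the job, let everyone work at the same time, then compile the results, maps directly onto threads. A minimal sketch in Python (the summing task and the four-way split are arbitrary choices for illustration):

```python
# Mirror the human computers: split a job into portions, let each worker
# run at the same time as the others, then compile the partial results.
import threading

def worker(numbers, results, index):
    """Each thread computes the sum of its assigned portion."""
    results[index] = sum(numbers)

def parallel_sum(data, num_workers=4):
    chunk = (len(data) + num_workers - 1) // num_workers
    portions = [data[i:i + chunk] for i in range(0, len(data), chunk)]
    results = [0] * len(portions)
    threads = [
        threading.Thread(target=worker, args=(p, results, i))
        for i, p in enumerate(portions)
    ]
    for t in threads:
        t.start()        # everyone works at the same time
    for t in threads:
        t.join()         # wait for all portions to finish
    return sum(results)  # compile the results together

print(parallel_sum(list(range(100))))  # 4950
```

The structure is simple, but note that in CPython the global interpreter lock serializes CPU-bound work like this, so the sketch shows the shape of the paradigm rather than a real speedup; and as soon as workers share mutable state rather than writing to disjoint slots, the "sounds simple" part stops being true.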
(Parallel computations and high performance computing (HPC) have been used for many years now by engineers and scientists. In support of HPC, computer scientists have been developing their own paradigm shifts with parallel architectures, MPI, plus research into distributed and parallel algorithms. I'm not all that sure how much of this will be of use to programmers using threads, though.)
Multi-core processors are bringing parallel execution to the masses. So, while the paradigm of concurrent programming and parallel processing may not be new, it is going to be much more pervasive from here on out. Will you jump on this bandwagon and take advantage of dual- and quad-cores? A more relevant question might be: do you really need to? Will the benefits outweigh the investment of time and effort to thread your code? Just because everyone else in your office starts doing the Macarena during their lunch hours, that doesn't mean you need to start doing it, too, right?