A Whole New Software Programming Paradigm

Original post · September 18, 2007, 18:06:00

Sure, "paradigm" is one of those "business-speak" words popularized over the last decade, like the Macarena and Beanie Babies. Its etymological roots go way back, though. The roots of computing go way back, too. If you look up "computer" in a dictionary from before the turn of the 20th century, it is defined as a person who does computation. Whether calculating ballistic trajectories for the military, actuarial charts for insurance companies, or tide tables for sailors, "computers" were people working through a fixed set of formulas to derive their answers. Then, as now, computers were well suited to performing well-defined, repetitive calculations.

Computers have obviously changed over the last 100-plus years, and so have the methods used to program them. One of the first electronic computers, ENIAC, used patch cords and wires to direct stored data through its different components, tables, and computing engines. The realization that a computer's program could be treated and stored in the same way as data led to the idea, and eventually the implementation, of the "stored-program" computer. What seems commonplace to us today was a major breakthrough back in 1948.

If programs could be stored in memory like data, how would you encode a program's instructions into the machine? At first, you used the computer's machine language, written and entered directly in binary. Then came assembly language, which is simply a set of mnemonics used in place of the binary machine instructions. It is easier for a human programmer to understand what an "ADD" instruction does than to remember that "01101110" does the same thing. An assembler is the program that performs the well-defined, repetitive process of translating assembly instructions into machine language.

Engineers, physicists, chemists, and other scientists had long recognized the advantages that computers brought to their research and work. To them, however, assembly language must have seemed like a secret language from the lost continent of Atlantis. Scientists work with mathematical formulas, and this led to the development of FORTRAN (FORmula TRANslation), the first high-level computer language. High-level languages require a compiler, so that the computer can perform the well-defined, repetitive process of translating the (more) human-readable program into machine language. Since FORTRAN, high-level programming languages and techniques have flourished, most recently with object-oriented programming and managed run-time environments.

What does this history lesson have to do with anything? Each of these innovations in programming was a paradigm shift (there's another of those '90s buzzwords): from patch cords to binary machine instructions, to assembly language, to high-level languages, to object-oriented programming. Now, with the advent of multi-core processors, the next paradigm shift in software is concurrent programming through threads.

Our bodies and brains do parallel processing all the time (heart beating, breathing, thinking, walking, and chewing gum, all at once). Even so, reasoning about things being done in parallel can be quite difficult, no matter how good we believe we are at multi-tasking. Yet this is exactly the skill needed to succeed with this new programming paradigm. But is it really all that new? No. Remember all those human computers mentioned earlier? That was an example of parallel processing: each computer was assigned a portion of the whole job, everyone worked at the same time, and the results were compiled together when the work was complete. Sounds simple. Can it really be that easy to thread your applications?
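
To make the analogy concrete, here is a minimal sketch (not from the original article) of that divide-the-work-and-combine pattern using C++ threads; the thread count, the data, and all names are illustrative assumptions, not anything the author prescribes.

```cpp
// Split one job across several threads, let them work at the same time,
// then combine the partial results -- the "human computers" pattern.
// Assumes a C++11 compiler; build with: g++ -std=c++11 -pthread sum.cpp
#include <iostream>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    const std::size_t kNumThreads = 4;                 // one worker per core (assumed)
    std::vector<double> data(1000000, 1.0);            // the whole job
    std::vector<double> partial(kNumThreads, 0.0);     // one result slot per worker

    std::vector<std::thread> workers;
    const std::size_t chunk = data.size() / kNumThreads;

    for (std::size_t i = 0; i < kNumThreads; ++i) {
        // Each worker is assigned its own portion of the data.
        const std::size_t begin = i * chunk;
        const std::size_t end   = (i + 1 == kNumThreads) ? data.size() : begin + chunk;
        workers.emplace_back([&, i, begin, end] {
            partial[i] = std::accumulate(data.begin() + begin, data.begin() + end, 0.0);
        });
    }

    for (auto& w : workers) w.join();                  // wait for every worker to finish

    // Compile the partial answers into the final one, just as the human
    // computers' results were collected at the end.
    double total = std::accumulate(partial.begin(), partial.end(), 0.0);
    std::cout << "sum = " << total << "\n";
    return 0;
}
```

Even in this toy case, the questions the article goes on to raise appear immediately: how to split the work evenly, and how to collect the results without the workers interfering with one another.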

(Engineers and scientists have been using parallel computation and high-performance computing (HPC) for many years now. In support of HPC, computer scientists have gone through paradigm shifts of their own, developing parallel architectures, MPI, and research into distributed and parallel algorithms. I'm not at all sure how much of this will be of use to programmers working with threads, though.)

Multi-core processors are bringing parallel execution to the masses. So while the paradigm of concurrent programming and parallel processing may not be new, it is only now becoming pervasive. Will you jump on this bandwagon and take advantage of dual- and quad-core processors? A more relevant question might be: do you really need to? Will the benefits outweigh the time and effort invested in threading your code? Just because everyone around you starts doing something doesn't mean you have to do it too, right?
 

