Moreover, storage could easily be built from available two-state technology such as acoustic delay lines, magnetic cores, flip-flop circuits, or phosphor patches on a cathode-ray screen.
The decision to abandon decimal arithmetic and use binary codes for everything in the computer led to very simple, much more reliable circuits and storage media.
The term "bit" came into standard use as shorthand for "binary digit." Today no one can think about contemporary computers without thinking about binary representations.
Note that, internally, a computer processes no numbers or symbols other than binary.
Translator's note: To learn programming you must understand binary; it is the foundation of how a computer processes data.
Computer circuits deal only with voltages, currents, switches, and magnetizable materials.
The patterns of zeroes and ones are abstractions invented by the designers to describe what their circuits do.
Because not every binary code is a valid description of a circuit, symbol, or number, the designers invented syntax rules that distinguished valid codes from invalid ones.
Although the machine cannot understand what patterns mean, it can distinguish allowable patterns from others by applying the syntax rules.
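As a small illustration of syntax rules that separate valid codes from invalid ones (the coding scheme here, binary-coded decimal, is chosen by me for illustration, not taken from the text), a checker can enforce validity without any notion of what the digits mean:

```python
# A machine need not "understand" a code to enforce its syntax.
# Binary-coded decimal (BCD) uses 4 bits per digit, but only the
# patterns 0000-1001 (digits 0-9) are valid; 1010-1111 are syntax errors.

def is_valid_bcd(bits: str) -> bool:
    """Return True if every 4-bit group encodes a decimal digit 0-9."""
    if len(bits) % 4 != 0 or not set(bits) <= {"0", "1"}:
        return False
    groups = [bits[i:i + 4] for i in range(0, len(bits), 4)]
    return all(int(g, 2) <= 9 for g in groups)

print(is_valid_bcd("01001001"))  # digits 4, 9 -> True
print(is_valid_bcd("01001100"))  # 1100 = 12, not a digit -> False
```

The checker applies a purely formal rule, yet it cleanly separates allowable patterns from the rest, which is all the hardware needs to do.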
We cannot overemphasize the importance of physical forms in computers, such as signals in circuits or magnetic patches on disks, for without these physical effects we could not build a computer.
Translator's note: Small errors inevitably creep in while information is stored and manipulated, so there is an automatic error-correction process; this is an engineering problem.
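A minimal sketch of such automatic correction (the triple-repetition code below is my illustrative choice, not the scheme any particular device uses): each bit is stored three times, and a majority vote silently repairs any single flipped bit per group.

```python
# Triple-repetition code: store each bit three times, decode by
# majority vote, so one flipped bit per group is corrected silently.

def encode(bits):
    return [b for b in bits for _ in range(3)]

def decode(coded):
    out = []
    for i in range(0, len(coded), 3):
        group = coded[i:i + 3]
        out.append(1 if sum(group) >= 2 else 0)
    return out

msg = [1, 0, 1]
sent = encode(msg)           # [1,1,1, 0,0,0, 1,1,1]
sent[4] = 1                  # a storage error flips one bit
print(decode(sent) == msg)   # True: the error is corrected
```

Real storage hardware uses far denser codes (e.g., Hamming or Reed-Solomon), but the engineering idea is the same: spend extra bits to make errors correctable.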
Although computer programs may seem abstract, they could not work without machines that exploit physical phenomena to represent and manipulate binary numbers.
Thus every data set, every program, and every logic-circuit layout can be seen as a kind of "strategic arrangement."
Translator's note:
Scientific research needs to find its fundamental grains.
Physics found the atom, biology found the cell and the gene, and informatics found the bit and the transistor.
Exploring the world's mysteries and changing the world is, at bottom, the process of identifying these basic units and working out the relationships among them.
Consider Babbage's Analytical Engine: it never found the right basic grain, so the complexity of its mechanisms grew along with the complexity of the functions it tried to implement. Shannon's switching networks and Turing's machine, by contrast, first pinned down the essence of the problem and then knocked it out with simple means.
The physical structure of computers consists of registers, which store bit patterns, and logic circuits, which compute functions of the data in the registers.
It takes time for these logic circuits to propagate signals from their input registers to their output registers.
If new inputs are provided before the circuits settle, the outputs are likely to be misinterpreted by subsequent circuits.
Engineers solved this problem by adding a clock to the computer.
At each clock tick the output of a logic circuit is stored in its registers.
The interval between ticks is long enough to guarantee that the circuit is completely settled before its output is stored.
One common deviation from linearity is to branch to another instruction at a different memory location, say X.
The decision to branch is governed by a condition C (such as "is A equal to B?"), and the jump from one part of the program to another is implemented by an instruction that says "if C then set PC to X."
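A toy interpreter makes the "if C then set PC to X" mechanism concrete (the mini instruction set here is hypothetical, invented for illustration): execution is linear except when a conditional jump rewrites the program counter.

```python
# Hypothetical mini-machine: PC advances linearly unless a conditional
# branch (BEQ) overwrites it with a target location X.

def run(program, A, B):
    pc = 0
    while pc < len(program):
        op, *args = program[pc]
        pc += 1                      # default: fall through to next instruction
        if op == "BEQ" and A == B:   # if C ("is A equal to B?") then PC <- X
            pc = args[0]
        elif op == "PRINT":
            return args[0]
        elif op == "HALT":
            return None
    return None

prog = [
    ("BEQ", 3),           # if A == B, jump to location X = 3
    ("PRINT", "differ"),  # fall-through path
    ("HALT",),
    ("PRINT", "equal"),   # branch target
]
print(run(prog, 7, 7))    # equal
print(run(prog, 7, 8))    # differ
```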
Translator's note: This is why crawlers that track stocks or novels and update in real time are usually hosted on a Web server; see the blog post "定时发邮件" (Sending Scheduled Emails).
The service process waits at a homing position for an incoming request; it then executes code to fulfill the request and returns to its homing position.
While this facilitates designing service processes, it does not remove the challenge of proving that the service process always returns to its homing position.
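The homing-position pattern can be sketched as a loop (the queue and request names below are my assumptions): every handler path ends back at the top of the loop, which is what makes the "always returns home" property plausible, though still something to be proved.

```python
# Sketch of a service process: it waits at its homing position
# (blocked on get()), fulfills one request, and falls back to the
# top of the loop, i.e., back to the homing position.

import queue

def service_process(requests, results):
    while True:                          # homing position
        req = requests.get()             # wait for an incoming request
        if req is None:                  # shutdown sentinel
            return
        results.append(f"done: {req}")   # execute code to fulfill it
        # end of loop body: back to the homing position

q, out = queue.Queue(), []
for r in ["read", "write", None]:
    q.put(r)
service_process(q, out)
print(out)  # ['done: read', 'done: write']
```

Proving the return-home property means showing each handler terminates and no path exits the loop except the intended one, which is exactly the challenge the text mentions.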
In the 1950s design engineers began to think about multiple-access computers that would be shared within a user community.
Multi-user systems had to guarantee that no user could access another's data without explicit permission.
This setup would provide the significant benefit of allowing users to share programs and data, and it would reduce the cost per user by spreading costs across many users.
Translator's note: A Windows server that has run for a long time may suddenly crash, but you rarely hear of a Linux server freezing because it has gone too long without a reboot; on Linux this almost never happens. A Linux server can run indefinitely without crashing, because it inherits Unix's outstanding stability and efficiency.
Designers of the first operating systems achieved this by isolating each executing program in a private region of memory, defined by a base address and length.
The base-length numbers were placed in a CPU register so that all memory accesses from the CPU were confined to the defined region of memory.
This idea of partitioning memory and setting up the hardware so that it was impossible for a CPU to access outside its private memory was crucial for data protection.
It not only protected user programs from each other; it could be used to protect users from untrusted software, which could be confined into its own memory region.
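The base-and-length check can be sketched in software (register and class names are my own; real machines do this in hardware on every access): the base is added to each address the program issues, and anything outside the private region traps.

```python
# Sketch of base-and-length memory protection: the CPU relocates every
# address by the base register and traps accesses past the length.

class MemoryFault(Exception):
    pass

class CPU:
    def __init__(self, memory, base, length):
        self.memory = memory
        self.base, self.length = base, length   # loaded by the OS

    def load(self, addr):
        if not 0 <= addr < self.length:         # hardware bounds check
            raise MemoryFault(f"access at {addr} outside private region")
        return self.memory[self.base + addr]

memory = list(range(100))
cpu = CPU(memory, base=40, length=10)   # private region: cells 40-49
print(cpu.load(5))       # 45: virtual address 5 maps to physical 45
try:
    cpu.load(12)         # outside the region: trapped, never reaches
except MemoryFault as e: # another user's memory
    print("trap:", e)
```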
Users of machines and networks today are aware they are sharing their machines and networks with many others.
They assume that the operating systems and networks are enforcing the isolation principle by keeping the executing programs in private memory regions.
When they download new software they do not trust, they expect their operating system to isolate the new software in a memory region called a "sandbox".
Although the assumption that operating systems isolate programs has long been part of our computational thinking, many computer chips designed in the 1980s dropped the memory-bounds checks in order to achieve greater speed.
Many security specialists are now regretting this omission.
New generations of hardware may once again enforce the security checks that computational thinking experience leads users to believe are present.
Beyond the von Neumann Architecture
One of the popular modern definitions of computational thinking is "formulating problems so that their solutions can be expressed as computational steps carried out by a machine."
This definition is closely tied to the framework of the von Neumann architecture.
In effect, the definition is a generalization of the operation of the CPU in a von Neumann machine.
After half a century, the von Neumann architecture has been approaching its limits.
There are two main reasons.
One is that the underlying chip technology, which has been doubling its component count every two years according to Moore's law, can no longer absorb the continuous reductions in component size.
Soon components will be so small they cannot comprise enough atoms to allow them to function properly.
The impending end of Moore's law has motivated extensive research into alternative architectures.
The other reason is that the separation of processor and memory in von Neumann architecture creates massive data traffic between processor and memory.
One technology invented to lessen the processor-memory bottleneck is the cache, which retains data in the CPU rather than returning it to memory.
Another technology intersperses processor and memory in a cellular array to spread the data load among many smaller processor-memory channels.
A third technology is special-purpose chips: ones that do a particular job exceptionally well but are not general-purpose computers themselves. An example is the graphics processing units (GPUs) now permeating every computer with a graphics display.
Special purpose processors are themselves the subject of extensive research.
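Of the technologies above, the cache is the easiest to sketch (sizes and eviction policy below are illustrative assumptions, not any real CPU's design): repeated accesses to the same address are served from fast CPU-side storage instead of crossing the bus to memory.

```python
# Toy model of why a cache lessens the processor-memory bottleneck:
# count trips to main memory with and without a hit in the cache.

class CachedMemory:
    def __init__(self, memory, capacity=4):
        self.memory = memory
        self.capacity = capacity
        self.cache = {}              # address -> value, kept CPU-side
        self.bus_traffic = 0         # trips to main memory

    def read(self, addr):
        if addr not in self.cache:   # miss: one trip across the bus
            self.bus_traffic += 1
            if len(self.cache) >= self.capacity:
                self.cache.pop(next(iter(self.cache)))  # evict oldest
            self.cache[addr] = self.memory[addr]
        return self.cache[addr]

mem = CachedMemory(memory=list(range(1000)))
for _ in range(100):                 # a hot loop touching two addresses
    mem.read(7), mem.read(8)
print(mem.bus_traffic)  # 2: 200 reads, but only 2 memory trips
```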
Two new categories of computer architecture have been getting special attention.
Both are potential disruptors of today's computational thinking.
One is the neural network, which has been the powerhouse behind recent advances in artificial intelligence.
A neural network maps large bit patterns (for example, the bits of a photograph) into other bit patterns (for example, labeled faces in the photograph).
The input signals travel through multiple layers where they are combined according to assigned weights.
An external algorithm trains the network by presenting it with a large number of input-output pairs and assigning the internal weights so that the network properly maps each input to its corresponding output.
Training a network is computationally intensive, taking anywhere from many hours to several days.
A trained network is very fast, giving its output almost instantly after the input is presented.
Graphics-processing chips have been successful in achieving fast response of a trained neural network.
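A drastically simplified sketch of "an external algorithm assigns the internal weights from input-output pairs" (a single neuron and the classic perceptron rule, my stand-ins for a real multilayer network and its training algorithm): the neuron learns the AND function from examples.

```python
# One neuron trained by the perceptron rule: weights are nudged toward
# each target output until the input-output pairs are mapped correctly.

def train(pairs, epochs=20, lr=0.1):
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, target in pairs:
            out = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            err = target - out           # nudge weights toward the target
            w[0] += lr * err * x[0]
            w[1] += lr * err * x[1]
            b += lr * err
    return w, b

pairs = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train(pairs)                      # training: slow, iterative
predict = lambda x: 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
print([predict(x) for x, _ in pairs])    # [0, 0, 0, 1]
```

The contrast in the text shows up even at this scale: training loops over the data many times, while inference is a single weighted sum.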
Although machines capable only of pattern matching and recognition are not general-purpose (universal) computers, they have produced amazing advances in automating some human cognitive tasks, such as recognizing faces.
However, there is no mechanism for verifying that a neural network will give the proper output when presented with an input not in its training set.
It is very jarring to our computational thinking to be unable to "explain" how a neural network generated its conclusion.
The other computer architecture getting special attention uses quantum mechanical effects to process data.
These quantum machines represent bits with electron spins and connections with quantum effects such as entanglement.
Quantum computers can perform some computations much faster than von Neumann computers.
One such computation is factoring a large composite number into its two constituent primes.
The intractability of factoring on von Neumann architectures has been the principle behind the security of the RSA cryptosystem, which is currently the most secure cryptosystem in wide use.
Quantum computers threaten to break its security.
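A toy example makes the threat concrete (the tiny primes below are chosen for readability and are nowhere near secure): RSA's private key can be recomputed by anyone who factors the public modulus, which is exactly what a large quantum computer running Shor's algorithm would make feasible.

```python
# Toy RSA (Python 3.8+ for pow(e, -1, m)): security rests entirely on
# the attacker being unable to factor n back into p and q.

p, q, e = 61, 53, 17                 # secret primes and a public exponent
n = p * q                            # public modulus: 3233
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent: needs p and q

msg = 1234
cipher = pow(msg, e, n)              # encrypt with the public key (e, n)
assert pow(cipher, d, n) == msg      # decrypt with the private key

# An attacker who factors n (trivial here by trial division, infeasible
# for 2048-bit n on von Neumann hardware) recovers the same private key:
p2 = next(f for f in range(2, n) if n % f == 0)
q2 = n // p2
d2 = pow(e, -1, (p2 - 1) * (q2 - 1))
print(pow(cipher, d2, n))            # 1234: the message, recovered
```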
Because their operation is nothing at all like that of the von Neumann computers, most people trained in computer science rather than physics find it very difficult to understand the operation of these machines or how to program them.
These two examples illustrate how each kind of machine has an associated style of computational thinking and is quite good at particular kinds of problems.
A person with advanced knowledge in computational thinking would be familiar with these architectures and, as part of the design process, would select the best architecture for solving the problem.
At the same time, particular machine types can also induce a kind of "blindness": for example, designers schooled in the basic von Neumann architecture think in terms of instructions and have trouble understanding how a quantum computer works.
Until the 1940s, computing was seen largely as an intellectual task of humans and a branch of mathematics and logic.
The invention of the electronic computer transformed the very concept of computing, and it created a fresh world of computational concepts that had few counterparts or precursors.
The concepts, practices, and skills for designing programs and computers quickly diverged from mathematics and logic.
It was a profound change.
Until the 1940s, computational thinking was embedded in the tacit knowledge and state-of-the-art practices of many different fields, including mathematics, logic, engineering, and the natural sciences.
After the 1940s, computational thinking started to become the centerpiece of the new profession that designed information machines to do jobs humans never thought were possible.