I've run into questions about computer time units several times in written exams at the big IT companies recently, so I looked them up:
- 1 second (s) = 1,000 milliseconds (ms); 1 ms = 1/1,000 s
- 1 s = 1,000,000 microseconds (µs); 1 µs = 1/1,000,000 s
- 1 s = 1,000,000,000 nanoseconds (ns); 1 ns = 1/1,000,000,000 s
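As a quick sanity check on these conversions, here is a small sketch (the `ns_to_human` helper is mine, not from any standard library) that renders a nanosecond count in the largest convenient unit:

```python
# Unit relationships from the list above, expressed as constants.
NS_PER_US = 1_000          # 1 microsecond = 1,000 nanoseconds
NS_PER_MS = 1_000_000      # 1 millisecond = 1,000,000 nanoseconds
NS_PER_S  = 1_000_000_000  # 1 second     = 1,000,000,000 nanoseconds

def ns_to_human(ns: float) -> str:
    """Render a nanosecond count in the largest convenient unit."""
    if ns >= NS_PER_S:
        return f"{ns / NS_PER_S:g} s"
    if ns >= NS_PER_MS:
        return f"{ns / NS_PER_MS:g} ms"
    if ns >= NS_PER_US:
        return f"{ns / NS_PER_US:g} µs"
    return f"{ns:g} ns"

print(ns_to_human(250_000))    # → 250 µs
print(ns_to_human(8_000_000))  # → 8 ms
```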
| Operation | Latency |
| --- | --- |
| Execute typical instruction | 1 ns |
| Fetch from L1 cache | 0.5 ns |
| Branch misprediction | 5 ns |
| Fetch from L2 cache | 7 ns |
| Mutex lock/unlock | 25 ns |
| Fetch from main memory | 100 ns (0.1 µs) |
| Send 2 KB over a 1 Gbps network | 20,000 ns (20 µs) |
| Read 1 MB sequentially from memory | 250,000 ns (0.25 ms) |
| Fetch from new disk location (seek) | 8,000,000 ns (8 ms) |
| Read 1 MB sequentially from disk | 20,000,000 ns (20 ms) |
| Send packet US to Europe and back | 150,000,000 ns (150 ms) |
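A good way to internalize the table is to compare each operation against a main-memory fetch. The sketch below (the `LATENCY_NS` dictionary just restates the figures above) shows, for example, that one disk seek costs as much as 80,000 memory fetches:

```python
# Latency figures from the table above, in nanoseconds.
LATENCY_NS = {
    "L1 cache fetch": 0.5,
    "branch misprediction": 5,
    "L2 cache fetch": 7,
    "mutex lock/unlock": 25,
    "main memory fetch": 100,
    "send 2 KB over 1 Gbps": 20_000,
    "read 1 MB from memory": 250_000,
    "disk seek": 8_000_000,
    "read 1 MB from disk": 20_000_000,
    "US -> Europe round trip": 150_000_000,
}

# Express every latency as a multiple of one main-memory fetch.
baseline = LATENCY_NS["main memory fetch"]
for op, ns in LATENCY_NS.items():
    print(f"{op:26s} = {ns / baseline:>12g} x memory fetch")
```

The ratios, not the absolute numbers, are what matter in practice: they explain why batching disk reads and avoiding cross-continent round trips dominate most performance work.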