Shell Programming in 24 Hours (Part 1)


 

 

Preface

1. Shell responsibilities

  reads your commands and asks the Unix kernel to execute them

  creates scripts -------> files containing a list of commands

 

Hour 1  Shell Basics

1. Simple commands -----> a single command name, e.g. $ date

2. Complex commands -----> a command with options and arguments, e.g. $ ls -l /tmp

3. Compound commands -----> several commands joined so they run in sequence

  $ command1; command2; command3 ...
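
A quick illustrative run of a compound command (directory and output are hypothetical):

  $ cd /tmp; ls; pwd
  ch04  hello.sh
  /tmp

The shell runs the three commands one after another, exactly as if each had been typed on its own line.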

 

Hour 2  Script Basics

1. The Unix system

  Utilities -------> programs you can run (commands such as ls and cat)

  Kernel -------> connects the utilities to the hardware

2. Shell initialization

  checks whether /etc/profile exists and reads it

  then reads .profile in your home directory

  after both have been read, the $ prompt is displayed
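
A minimal ~/.profile might look like this (the values are illustrative, not the book's):

  # ~/.profile -- read at login, after /etc/profile
  PATH=$PATH:$HOME/bin    # add a personal bin directory to the search path
  export PATH
  PS1='$ '                # the prompt shown once initialization finishes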

3. Interactive and Noninteractive Shells

  1. Modes ----> interactive: the shell reads commands you type and executes them

         ----> noninteractive: the shell reads commands stored in a file and executes them

  2. Starting the shell

     /bin/sh   ----->   interactive mode

     /bin/sh  filename   ------> noninteractive mode

  3. Making a shell script executable

Add #!/bin/sh as the first line -------> causes a new shell to be started to execute the script
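
A complete round trip, using a hypothetical script name:

  $ cat > hello.sh        # type the script, press Ctrl-D to finish
  #!/bin/sh
  echo "Hello, World"
  $ chmod a+x hello.sh    # mark it executable
  $ ./hello.sh            # the #!/bin/sh line starts a new shell to run it
  Hello, World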

 

Hour 3  Working with Files

1. Files

     ordinary files ; directories ; special files

2. Listing files

  $ ls -F   -----> marks directories, executables, and special files

       -l   -----> long format: one line per file with permissions, owner, size, and date

       -a   -----> includes hidden files (names beginning with .)
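
For example (file names and output are made up for illustration):

  $ ls -F
  docs/  notes.txt  hello.sh*  ch5@
  (/ marks a directory, * an executable, @ a symbolic link)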

3. Viewing the contents of a file

  $ cat file

      file1 file2 file3

      -n     ----------> numbers every line

      -b     ----------> numbers lines, skipping blank lines
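
For example, given a hypothetical file whose second line is blank:

  $ cat -b notes.txt
       1  first line

       2  third line
  (-n would also number the blank line, giving 1, 2, 3)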

4. Counting words (wc)

  $ wc file  ----------> line, word, and character counts

       file1 file2 file3

     -l    lines only

     -w    words only

     -m or -c    characters only
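
A sample run (the counts are illustrative):

  $ wc notes.txt
        3       6      35 notes.txt    # 3 lines, 6 words, 35 characters
  $ wc -l notes.txt
        3 notes.txt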

5. Manipulating files

  1. Copying files (cp)

$ cp source destination  ------> destination is the target path plus the file name; if the

                                     name is to stay the same, the directory path alone is enough

                           source is the name of the file to copy and cannot be a directory

                           ** by default an existing destination file is overwritten

    -i  source  dest         -------> asks before overwriting

    file1 file2 file3 dest   -------> copies several files into the directory dest at once

                                      (a combined cp/mv/rm example follows the rm section below)

  2. Renaming files (mv)

$ mv source dest      --------> moving a file within the same directory renames it

      -i              --------> asks before overwriting

  3. Removing files (rm)

$rm  files

     file1 file2 file3

      -i  file1 file2    ---------> asks for confirmation before each deletion
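
A combined walkthrough of the three commands above (file names hypothetical; the exact wording of the -i prompts varies by system):

  $ cp -i report.txt backup/          # copy into backup/, asking before overwriting
  $ mv report.txt report.old          # rename within the same directory
  $ rm -i report.old                  # delete, asking for confirmation
  rm: remove report.old? y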

 

Hour 4  Working with Directories

1. Pathnames

  /home/range/docs/ch5.doc

  a file name may contain letters, digits, and the characters (.) (-) (_)

  Absolute pathname   --------->  starts from / ; $ pwd prints the current one

  Relative pathname   --------->  (..) refers to the parent directory

                                  (.)  refers to the current directory
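
For example, starting from /home/range and reusing the path above:

  $ pwd
  /home/range
  $ cd docs                 # relative: docs inside the current directory
  $ cd /home/range/docs     # absolute: the same place, spelled out from /
  $ cd ..                   # relative: back up to /home/range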

2. cd

3. ls

4. Manipulating directories

   1. mkdir

     $mkdir  dir

             dir1 dir2 dir3

   2. Creating a chain of directories

     $ mkdir -p /tmp/ch04/test1

     if the intermediate directories do not exist, the -p option creates them along the way

     creates all required parent directories

     ** a directory cannot be given the same name as an existing file
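
Continuing the example above:

  $ mkdir /tmp/ch04/test1       # fails if /tmp/ch04 does not exist yet
  $ mkdir -p /tmp/ch04/test1    # creates /tmp/ch04 first, then test1 inside it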

   3. Copying directories

     $ cp -r source dest   ------> copies source into the directory dest (also works for files)

   4. Moving directories

     $ mv source_dir dest_dir

   5. Removing directories

     $ rmdir   -----> removes empty directories only

     $ rm -r   -----> removes directories along with their contents
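
Putting the directory commands together (directory names hypothetical):

  $ cp -r project backup       # copy project and its contents into backup
  $ mv backup old_backup       # rename the directory
  $ rmdir empty_dir            # succeeds only if empty_dir contains nothing
  $ rm -r old_backup           # removes the directory and everything in it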

 

Hour 5  Manipulating File Attributes

1. File types

  $ ls -l   -----> the first character of each line shows the type (- file, d directory, l link, c/b device)

2. Symbolic links

  a soft link (similar to a shortcut)

  $ ln -s source destination   -----> ls -l shows where the link points
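
For example, reusing the path from Hour 4 (output abbreviated):

  $ ln -s /home/range/docs/ch5.doc ch5
  $ ls -l ch5
  lrwxrwxrwx ... ch5 -> /home/range/docs/ch5.doc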

3. Device files

  1. character special files

crw-------

  2. block special files

brw-rw----

4. Changing permissions (chmod)

  1. symbolic method

chmod [who][op][perm] file

        who:  u user, g group, o other, a all

        op:   + add,  - remove,  = set exactly

        perm: r read, w write, x execute, s setuid/setgid

     chmod ugo+rx *    ------> give everyone read and execute

     the -R option applies the change recursively to all files and directories below

     (an example comparing the symbolic and octal methods follows the table below)

  2. Octal method

Read      4

Write     2

Execute   1

(add the digits per class: 7 = rwx, 6 = rw-, 5 = r-x, 0 = ---)
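
The two methods expressing the same change (file name hypothetical):

  $ chmod u=rwx,go=rx hello.sh    # symbolic: user rwx, group and other r-x
  $ chmod 755 hello.sh            # octal: 7 = 4+2+1, 5 = 4+1 -- same result
  $ ls -l hello.sh
  -rwxr-xr-x ... hello.sh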

5. Changing owners and groups

  chown [options] owner files

  chgrp [options] group files
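
Illustrative usage (user and group names are hypothetical; changing a file's owner normally requires root privileges):

  $ chown range notes.txt           # make range the file's owner
  $ chgrp users notes.txt           # assign the file to the group users
  $ chown -R range project          # -R applies the change recursively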
