Greenplum 6.7.1 Deployment Guide (CentOS 7)

1. Deployment Planning

1.1 Versions

OS version:        CentOS Linux release 7.7.1908
Database version:  Greenplum 6.7.1 (PostgreSQL 9.4.24)
Kernel version:    3.10.0-1062.el7.x86_64
CPU:               2 cores
Memory:            4 GB

1.2 Node Layout

IP               Host   Roles
192.168.188.87   node1  master
192.168.188.88   node2  seg1, seg2, mirror3, mirror4
192.168.188.89   node3  seg3, seg4, mirror1, mirror2

2. System Configuration

System parameters must be changed as root, and a reboot is required for them to take effect; you can either reboot after each change or apply everything and reboot once.
It is recommended to configure the master host first. Once the Greenplum software is installed on the master and passwordless SSH is set up, use gpscp and gpssh to apply the same system settings to the remaining nodes in batch (see 3.5.4).

2.1 Disable SELinux

setenforce 0 && sed -i 's/SELINUX=enforcing/SELINUX=disabled/g' /etc/selinux/config
sestatus

2.2 Disable the Firewall

systemctl stop firewalld.service
systemctl disable firewalld.service
systemctl list-unit-files firewalld.service

2.3 Edit /etc/sysctl.conf

# kernel.shmall = _PHYS_PAGES / 2; compute with: echo $(expr $(getconf _PHYS_PAGES) / 2)
kernel.shmall = 483888
# kernel.shmmax = kernel.shmall * PAGE_SIZE; compute with: echo $(expr $(getconf _PHYS_PAGES) / 2 \* $(getconf PAGE_SIZE))
kernel.shmmax = 1982005248
kernel.shmmni = 4096
vm.overcommit_memory = 2
vm.overcommit_ratio = 95 
net.ipv4.ip_local_port_range = 10000 65535
kernel.sem = 500 2048000 200 40960
kernel.sysrq = 1
kernel.core_uses_pid = 1
kernel.msgmnb = 65536
kernel.msgmax = 65536
kernel.msgmni = 2048
net.ipv4.tcp_syncookies = 1
net.ipv4.conf.default.accept_source_route = 0
net.ipv4.tcp_max_syn_backlog = 4096
net.ipv4.conf.all.arp_filter = 1
net.core.netdev_max_backlog = 10000
net.core.rmem_max = 2097152
net.core.wmem_max = 2097152
vm.swappiness = 10
vm.zone_reclaim_mode = 0
vm.dirty_expire_centisecs = 500
vm.dirty_writeback_centisecs = 100
#vm.dirty_background_ratio = 0
#vm.dirty_ratio = 0
#vm.dirty_background_bytes = 1610612736
#vm.dirty_bytes = 4294967296
vm.dirty_background_ratio = 3 
vm.dirty_ratio = 10
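
The two shared-memory values at the top of this file are host-specific. A minimal sketch of the same arithmetic in shell (nothing here is Greenplum-specific; it only reproduces the formulas in the comments above):

```shell
# Compute suggested kernel.shmall / kernel.shmmax for this host,
# following the formulas in the sysctl.conf comments above.
PAGES=$(getconf _PHYS_PAGES)        # total physical pages
PAGE_SIZE=$(getconf PAGE_SIZE)      # page size in bytes
SHMALL=$(( PAGES / 2 ))             # half of physical pages
SHMMAX=$(( SHMALL * PAGE_SIZE ))    # shmall expressed in bytes
echo "kernel.shmall = $SHMALL"
echo "kernel.shmmax = $SHMMAX"
```

Paste the printed values into /etc/sysctl.conf in place of the example numbers.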

For hosts with more than 64 GB of RAM, the following settings are recommended:

vm.dirty_background_ratio = 0
vm.dirty_ratio = 0
vm.dirty_background_bytes = 1610612736 # 1.5GB
vm.dirty_bytes = 4294967296 # 4GB

For hosts with 64 GB of RAM or less, remove the vm.dirty_background_bytes and vm.dirty_bytes settings and set:

vm.dirty_background_ratio = 3
vm.dirty_ratio = 10

Increase vm.min_free_kbytes so that PF_MEMALLOC allocations from network and storage drivers succeed. This is especially important on systems with large amounts of memory, where the default is usually too low. The recommended value is 3% of physical memory; it can be computed and appended with awk:

awk 'BEGIN {OFMT = "%.0f";} /MemTotal/ {print "vm.min_free_kbytes =", $2 * .03;}' /proc/meminfo >> /etc/sysctl.conf 
sysctl -p

2.4 Resource Limits

/etc/security/limits.conf 
* soft nofile 524288
* hard nofile 524288
* soft nproc 131072
* hard nproc 131072

/etc/security/limits.d/20-nproc.conf
*          soft    nproc     131072
root       soft    nproc     unlimited
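
These limits apply to new login sessions. After editing the files, log out and back in, then verify the effective values; a quick check, assuming the settings above:

```shell
# Verify the per-process limits in the current session.
# After a fresh login these should match limits.conf.
ulimit -n    # open files (nofile)
ulimit -u    # max user processes (nproc)
```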

2.5 Mount Options

XFS has several advantages over ext4:
XFS scales far better; ext4 performance drops noticeably once a single directory holds more than about 2 million files.
ext4 is a very stable traditional filesystem, but it is an increasingly poor fit as storage requirements grow.
For historical reasons ext4 uses 32-bit inode numbers, so it supports at most roughly 4 billion files, and a single file can be at most 16 TB.
XFS uses 64-bit addressing, scales to EB-class filesystems, and manages its metadata with B+ trees.
Greenplum data directories should use XFS. RHEL/CentOS 7 and Oracle Linux already ship XFS as the default filesystem, and SUSE/openSUSE provides long-term support for it.

/dev/sda3 /data xfs nodev,noatime,nobarrier,inode64 0 0

2.6 Set Device Read-Ahead

/sbin/blockdev --setra 16384 /dev/sda3
/sbin/blockdev --getra /dev/sda3
# add the setra command above to /etc/rc.d/rc.local so the setting persists across reboots
chmod +x /etc/rc.d/rc.local

2.7 Set the I/O Scheduler

grubby --update-kernel=ALL --args="elevator=deadline"
grubby --info=ALL

2.8 Disable Transparent Huge Pages

grubby --update-kernel=ALL --args="transparent_hugepage=never"
cat /sys/kernel/mm/*transparent_hugepage/enabled

2.9 Disable IPC Removal

Set RemoveIPC=no so that systemd does not remove the gpadmin user's shared-memory segments and semaphores when the user logs out:

/etc/systemd/logind.conf
RemoveIPC=no
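
A non-interactive way to apply this is sketched below; the sed pattern assumes the stock file ships with a commented #RemoveIPC line and falls back to appending one:

```shell
# Set RemoveIPC=no in logind.conf, then restart systemd-logind.
CONF=/etc/systemd/logind.conf
if grep -q '^#\?RemoveIPC' "$CONF"; then
    # replace an existing (possibly commented-out) RemoveIPC line
    sed -i 's/^#\?RemoveIPC.*/RemoveIPC=no/' "$CONF"
else
    # no RemoveIPC line present: append one
    echo 'RemoveIPC=no' >> "$CONF"
fi
systemctl restart systemd-logind
```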

2.10 Edit /etc/ssh/sshd_config

/etc/ssh/sshd_config
MaxStartups 10:30:100
# start:rate:full - begin randomly dropping unauthenticated connections once 10 are pending (30% probability, rising to 100% at 100); restart sshd to apply

2.11 Install Dependencies

yum install -y apr apr-util bash bzip2 curl krb5 libcurl libevent libxml2 libyaml zlib openldap openssh openssl openssl-libs perl readline rsync R sed tar zip krb5-devel

Reboot the host and verify that the modified parameters have taken effect.

3. Greenplum Installation

3.1 Create the gpadmin User

Run on every node:

groupadd gpadmin
useradd gpadmin -r -m -g gpadmin
passwd gpadmin

3.2 Install the RPM

yum install -y greenplum-db-6.7.1-rhel7-x86_64.rpm

3.3 Create Host Files

In $GPHOME, create two host files (all_host and seg_host) to pass as the host-file argument to gpssh, gpscp, and related utilities:
all_host: the hostnames or IPs of every host in the cluster (master, segments, standby, etc.).
seg_host: the hostnames or IPs of the segment hosts only.
If a host has multiple NICs that are not bonded (bond0), list the IP or hostname of every NIC.

cd /usr/local/greenplum-db
[root@node1 greenplum-db]# cat all_host 
node1
node2
node3
[root@node1 greenplum-db]# cat seg_host 
node2
node3
# fix ownership
chown -R gpadmin:gpadmin /usr/local/greenplum*

3.4 Set Up Passwordless SSH

ssh-keygen
ssh-copy-id node2
ssh-copy-id node3
source /usr/local/greenplum-db/greenplum_path.sh
gpssh-exkeys -f all_host
[STEP 1 of 5] create local ID and authorize on local host
  ... /root/.ssh/id_rsa file exists ... key generation skipped

[STEP 2 of 5] keyscan all hosts and update known_hosts file

[STEP 3 of 5] retrieving credentials from remote hosts
  ... send to node2
  ... send to node3

[STEP 4 of 5] determine common authentication file content

[STEP 5 of 5] copy authentication files to all remote hosts
  ... finished key exchange with node2
  ... finished key exchange with node3

[INFO] completed successfully
# verify gpssh
gpssh -f /usr/local/greenplum-db/all_host -e 'ls /usr/local/'
[node1] ls /usr/local/
[node1] bin  games	   greenplum-db-6.7.1  lib    libexec  share
[node1] etc  greenplum-db  include	       lib64  sbin     src
[node2] ls /usr/local/
[node2] bin  etc  games  include  lib  lib64  libexec  sbin  share  src
[node3] ls /usr/local/
[node3] bin  etc  games  include  lib  lib64  libexec  sbin  share  src

3.5 Sync Master Configuration to the Other Hosts

3.5.1 Create gpadmin on All Segment Hosts

gpssh -f seg_host -e 'groupadd gpadmin;useradd gpadmin -r -m -g gpadmin;echo "gpadmin" | passwd --stdin gpadmin;'
gpssh -f seg_host -e 'ls /home/'

3.5.2 Passwordless SSH for gpadmin

su - gpadmin
source /usr/local/greenplum-db/greenplum_path.sh
ssh-keygen
ssh-copy-id node2
ssh-copy-id node3
gpssh-exkeys -f /usr/local/greenplum-db/all_host
[STEP 1 of 5] create local ID and authorize on local host
  ... /home/gpadmin/.ssh/id_rsa file exists ... key generation skipped

[STEP 2 of 5] keyscan all hosts and update known_hosts file

[STEP 3 of 5] retrieving credentials from remote hosts
  ... send to node2
  ... send to node3

[STEP 4 of 5] determine common authentication file content

[STEP 5 of 5] copy authentication files to all remote hosts
  ... finished key exchange with node2
  ... finished key exchange with node3

[INFO] completed successfully

3.5.3 Configure gpadmin's Environment

cat >> /home/gpadmin/.bash_profile << EOF
source /usr/local/greenplum-db/greenplum_path.sh
EOF
cat >> /home/gpadmin/.bashrc << EOF
source /usr/local/greenplum-db/greenplum_path.sh
EOF

gpscp -f /usr/local/greenplum-db/seg_host /home/gpadmin/.bash_profile gpadmin@=:/home/gpadmin/.bash_profile 
gpscp -f /usr/local/greenplum-db/seg_host /home/gpadmin/.bashrc gpadmin@=:/home/gpadmin/.bashrc

3.5.4 Copy System Settings to the Other Nodes

gpscp -f seg_host /etc/hosts root@=:/etc/hosts
gpscp -f seg_host /etc/security/limits.conf root@=:/etc/security/limits.conf 
gpscp -f seg_host /etc/sysctl.conf  root@=:/etc/sysctl.conf
gpscp -f seg_host /etc/security/limits.d/20-nproc.conf root@=:/etc/security/limits.d/20-nproc.conf
gpssh -f seg_host -e '/sbin/blockdev --setra 16384 /dev/sda3'
gpssh -f seg_host -e 'echo deadline > /sys/block/sda/queue/scheduler'
gpssh -f seg_host -e 'sysctl -p'
gpssh -f seg_host -e 'reboot'

3.6 Segment Deployment

3.6.1 Distribute the Binaries

# variables
link_name='greenplum-db'                    # symlink name
binary_dir_location='/usr/local'            # install location
binary_dir_name='greenplum-db-6.7.1'        # install directory name
binary_path='/usr/local/greenplum-db-6.7.1' # full install path

chown -R gpadmin:gpadmin $binary_path
rm -f ${binary_path}.tar; rm -f ${binary_path}.tar.gz
cd $binary_dir_location; tar cf ${binary_dir_name}.tar ${binary_dir_name}
gzip ${binary_path}.tar
gpssh -f ${binary_path}/seg_host -e "mkdir -p ${binary_dir_location};rm -rf ${binary_path};rm -rf ${binary_path}.tar;rm -rf ${binary_path}.tar.gz"
gpscp -f ${binary_path}/seg_host ${binary_path}.tar.gz root@=:${binary_path}.tar.gz
gpssh -f ${binary_path}/seg_host -e "cd ${binary_dir_location};gzip -f -d ${binary_path}.tar.gz;tar xf ${binary_path}.tar"
gpssh -f ${binary_path}/seg_host -e "rm -rf ${binary_path}.tar;rm -rf ${binary_path}.tar.gz;rm -f ${binary_dir_location}/${link_name}"
gpssh -f ${binary_path}/seg_host -e "ln -fs ${binary_dir_location}/${binary_dir_name} ${binary_dir_location}/${link_name}"
gpssh -f ${binary_path}/seg_host -e "chown -R gpadmin:gpadmin ${binary_dir_location}/${link_name};chown -R gpadmin:gpadmin ${binary_dir_location}/${binary_dir_name}"
gpssh -f ${binary_path}/seg_host -e "source ${binary_path}/greenplum_path.sh"
gpssh -f ${binary_path}/seg_host -e "cd ${binary_dir_location};ls -l"

3.6.2 Create Data Directories

# create the master data directory
mkdir -p /data/greenplum/data/master
chown gpadmin:gpadmin /data/greenplum/data/master

# create the segment data directories
source /usr/local/greenplum-db/greenplum_path.sh
gpssh -f /usr/local/greenplum-db/seg_host -e 'mkdir -p /data/greenplum/data1/primary'
gpssh -f /usr/local/greenplum-db/seg_host -e 'mkdir -p /data/greenplum/data1/mirror'
gpssh -f /usr/local/greenplum-db/seg_host -e 'mkdir -p /data/greenplum/data2/primary'
gpssh -f /usr/local/greenplum-db/seg_host -e 'mkdir -p /data/greenplum/data2/mirror'
gpssh -f /usr/local/greenplum-db/seg_host -e 'chown -R gpadmin /data/greenplum/data*'

3.7 Cluster Performance Tests

Recommended minimums:

Disk throughput: 2000 MB/s
Network: at least 1000 MB/s

3.7.1 Network Test

gpcheckperf -f /usr/local/greenplum-db/seg_host -r N -d /tmp

-------------------
--  NETPERF TEST
-------------------
NOTICE: -t is deprecated, and has no effect
NOTICE: -f is deprecated, and has no effect
NOTICE: -t is deprecated, and has no effect
NOTICE: -f is deprecated, and has no effect

====================
==  RESULT 2020-05-04T13:38:25.039440
====================
Netperf bisection bandwidth test
node2 -> node3 = 708.600000
node3 -> node2 = 758.750000

Summary:
sum = 1467.35 MB/sec
min = 708.60 MB/sec
max = 758.75 MB/sec
avg = 733.67 MB/sec
median = 758.75 MB/sec

3.7.2 Disk Test

gpcheckperf -f /usr/local/greenplum-db/seg_host -r ds -D -d /data/greenplum/data1/primary

--------------------
--  DISK WRITE TEST
--------------------

--------------------
--  DISK READ TEST
--------------------

--------------------
--  STREAM TEST
--------------------

====================
==  RESULT 2020-05-04T13:40:48.565801
====================

 disk write avg time (sec): 39.45
 disk write tot bytes: 15836053504
 disk write tot bandwidth (MB/s): 385.66
 disk write min bandwidth (MB/s): 176.04 [node3]
 disk write max bandwidth (MB/s): 209.61 [node2]
 -- per host bandwidth --
    disk write bandwidth (MB/s): 176.04 [node3]
    disk write bandwidth (MB/s): 209.61 [node2]


 disk read avg time (sec): 9.57
 disk read tot bytes: 15836053504
 disk read tot bandwidth (MB/s): 1581.69
 disk read min bandwidth (MB/s): 752.67 [node3]
 disk read max bandwidth (MB/s): 829.03 [node2]
 -- per host bandwidth --
    disk read bandwidth (MB/s): 752.67 [node3]
    disk read bandwidth (MB/s): 829.03 [node2]


 stream tot bandwidth (MB/s): 40897.60
 stream min bandwidth (MB/s): 18901.20 [node2]
 stream max bandwidth (MB/s): 21996.40 [node3]
 -- per host bandwidth --
    stream bandwidth (MB/s): 21996.40 [node3]
    stream bandwidth (MB/s): 18901.20 [node2]

3.7.3 Clock Check

gpssh -f /usr/local/greenplum-db/all_host -e 'date'
[node1] date
[node1] Mon May  4 13:45:16 CST 2020
[node3] date
[node3] Mon May  4 13:45:16 CST 2020
[node2] date
[node2] Mon May  4 13:45:16 CST 2020

3.8 Edit the Initialization Config

su - gpadmin
mkdir -p /home/gpadmin/gpconfigs
cp $GPHOME/docs/cli_help/gpconfigs/gpinitsystem_config /home/gpadmin/gpconfigs/gpinitsystem_config
# edit the configuration file as follows
ARRAY_NAME="Greenplum Data Platform"
SEG_PREFIX=gpseg
PORT_BASE=6000
declare -a DATA_DIRECTORY=(/data/greenplum/data1/primary /data/greenplum/data2/primary)
MASTER_HOSTNAME=node1
MASTER_DIRECTORY=/data/greenplum/data/master
MASTER_PORT=5432
TRUSTED_SHELL=ssh
CHECK_POINT_SEGMENTS=8
ENCODING=UNICODE
MIRROR_PORT_BASE=7000
declare -a MIRROR_DATA_DIRECTORY=(/data/greenplum/data1/mirror /data/greenplum/data2/mirror)
DATABASE_NAME=gpdw

3.9 Cluster Initialization

3.9.1 Run gpinitsystem

gpinitsystem -c /home/gpadmin/gpconfigs/gpinitsystem_config -h /usr/local/greenplum-db/seg_host -D
20200504:18:28:38:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Main
20200504:18:28:38:015736 gpinitsystem:node1:gpadmin-[INFO]:-Command line options passed to utility = -c /home/gpadmin/gpconfigs/gpinitsystem_config -h /usr/local/greenplum-db/seg_host -D
20200504:18:28:38:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_GPDB_ID
20200504:18:28:38:015736 gpinitsystem:node1:gpadmin-[INFO]:-Current user id of gpadmin, matches initdb id of gpadmin
20200504:18:28:38:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_GPDB_ID
20200504:18:28:38:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_PARAMS
20200504:18:28:38:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking configuration parameters, please wait...
20200504:18:28:38:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_FILE
20200504:18:28:38:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking file /home/gpadmin/gpconfigs/gpinitsystem_config
20200504:18:28:38:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_FILE
20200504:18:28:38:015736 gpinitsystem:node1:gpadmin-[INFO]:-Dumping /home/gpadmin/gpconfigs/gpinitsystem_config to logfile for reference
20200504:18:28:38:015736 gpinitsystem:node1:gpadmin-[INFO]:-Completed /home/gpadmin/gpconfigs/gpinitsystem_config dump to logfile
20200504:18:28:38:015736 gpinitsystem:node1:gpadmin-[INFO]:-Reading Greenplum configuration file /home/gpadmin/gpconfigs/gpinitsystem_config
20200504:18:28:38:015736 gpinitsystem:node1:gpadmin-[INFO]:-Could not find HEAP_CHECKSUM in cluster config, defaulting to on.
20200504:18:28:38:015736 gpinitsystem:node1:gpadmin-[INFO]:-Could not find HBA_HOSTNAMES in cluster config, defaulting to 0.
20200504:18:28:38:015736 gpinitsystem:node1:gpadmin-[INFO]:-Locale has not been set in /home/gpadmin/gpconfigs/gpinitsystem_config, will set to default value
20200504:18:28:38:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_LOCALE_KNOWN
20200504:18:28:38:015736 gpinitsystem:node1:gpadmin-[INFO]:-Locale check passed on this host
20200504:18:28:38:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_LOCALE_KNOWN
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Locale set to en_US.utf8
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Dump current system locale to log file
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-End of system locale dump
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_FILE
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking file /usr/local/greenplum-db/seg_host
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_FILE
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Completed check of file /usr/local/greenplum-db/seg_host
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_DUPLICATES
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-No duplicate segment instance hostnames found, will proceed
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_DUPLICATES
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Setting up segment instance list array
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking write access to /data/greenplum/data/master on master host
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Write test passed /data/greenplum/data/master directory on master host
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Will create database gpdw
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-MASTER_MAX_CONNECT not set, will set to default value 250
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Setting segment instance MAX_CONNECTIONS to 
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Number of segment instance hosts = 2
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed obtain IP address of Master host
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Master IP address array = ::1
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking configuration parameters, Completed
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_PARAMS
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_MULTI_HOME
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Commencing multi-home checks, please wait...
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PING_HOST
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-node2 contact established
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PING_HOST
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PING_HOST
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-node3 contact established
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PING_HOST
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Configuring build for standard array
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Commencing multi-home checks, Completed
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_MULTI_HOME
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CREATE_QE_ARRAY
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Building primary segment instance array, please wait...
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function POSTGRES_PORT_CHK
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function GET_PG_PID_ACTIVE
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PING_HOST
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-node2 contact established
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PING_HOST
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-No socket connection or lock file /tmp/.s.PGSQL.6000.lock found for port=6000
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function GET_PG_PID_ACTIVE
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function POSTGRES_PORT_CHK
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function POSTGRES_PORT_CHK
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function GET_PG_PID_ACTIVE
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PING_HOST
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-node2 contact established
20200504:18:28:39:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PING_HOST
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-No socket connection or lock file /tmp/.s.PGSQL.6001.lock found for port=6001
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function GET_PG_PID_ACTIVE
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function POSTGRES_PORT_CHK
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function POSTGRES_PORT_CHK
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function GET_PG_PID_ACTIVE
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PING_HOST
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-node3 contact established
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PING_HOST
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-No socket connection or lock file /tmp/.s.PGSQL.6000.lock found for port=6000
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function GET_PG_PID_ACTIVE
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function POSTGRES_PORT_CHK
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function POSTGRES_PORT_CHK
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function GET_PG_PID_ACTIVE
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PING_HOST
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-node3 contact established
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PING_HOST
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-No socket connection or lock file /tmp/.s.PGSQL.6001.lock found for port=6001
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function GET_PG_PID_ACTIVE
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function POSTGRES_PORT_CHK
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CREATE_GROUP_MIRROR_ARRAY
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-Building group mirror array type , please wait...
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function POSTGRES_PORT_CHK
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function GET_PG_PID_ACTIVE
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PING_HOST
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-node3 contact established
20200504:18:28:40:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PING_HOST
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-No socket connection or lock file /tmp/.s.PGSQL.7000.lock found for port=7000
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function GET_PG_PID_ACTIVE
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function POSTGRES_PORT_CHK
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function POSTGRES_PORT_CHK
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function GET_PG_PID_ACTIVE
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PING_HOST
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-node3 contact established
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PING_HOST
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-No socket connection or lock file /tmp/.s.PGSQL.7001.lock found for port=7001
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function GET_PG_PID_ACTIVE
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function POSTGRES_PORT_CHK
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function POSTGRES_PORT_CHK
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function GET_PG_PID_ACTIVE
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PING_HOST
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-node2 contact established
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PING_HOST
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-No socket connection or lock file /tmp/.s.PGSQL.7000.lock found for port=7000
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function GET_PG_PID_ACTIVE
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function POSTGRES_PORT_CHK
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function POSTGRES_PORT_CHK
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function GET_PG_PID_ACTIVE
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PING_HOST
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-node2 contact established
20200504:18:28:41:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PING_HOST
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-No socket connection or lock file /tmp/.s.PGSQL.7001.lock found for port=7001
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function GET_PG_PID_ACTIVE
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function POSTGRES_PORT_CHK
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CREATE_GROUP_MIRROR_ARRAY
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CREATE_QE_ARRAY
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_QES
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking Master host
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_DIR
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_DIR
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function GET_PG_PID_ACTIVE
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-No socket connection or lock file in /tmp found for port=5432
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function GET_PG_PID_ACTIVE
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking new segment hosts, please wait...
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_LOCALE_KNOWN
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-Locale check passed on node2
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_LOCALE_KNOWN
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_OPEN_FILES
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking node1
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_OPEN_FILES
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function POSTGRES_VERSION_CHK
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:- Current postgres version = postgres (Greenplum Database) 6.7.1 build commit:a21de286045072d8d1df64fa48752b7dfac8c1b7
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:- postgres version on node1 = postgres (Greenplum Database) 6.7.1 build commit:a21de286045072d8d1df64fa48752b7dfac8c1b7
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function POSTGRES_VERSION_CHK
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_DIR
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_DIR
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-Segment instance node2 /usr/local/greenplum-db/./lib checked
20200504:18:28:42:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_LOCALE_KNOWN
20200504:18:28:43:015736 gpinitsystem:node1:gpadmin-[INFO]:-Locale check passed on node3
20200504:18:28:43:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_LOCALE_KNOWN
20200504:18:28:43:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_OPEN_FILES
20200504:18:28:43:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking node1
20200504:18:28:43:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_OPEN_FILES
20200504:18:28:43:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function POSTGRES_VERSION_CHK
20200504:18:28:43:015736 gpinitsystem:node1:gpadmin-[INFO]:- Current postgres version = postgres (Greenplum Database) 6.7.1 build commit:a21de286045072d8d1df64fa48752b7dfac8c1b7
20200504:18:28:43:015736 gpinitsystem:node1:gpadmin-[INFO]:- postgres version on node1 = postgres (Greenplum Database) 6.7.1 build commit:a21de286045072d8d1df64fa48752b7dfac8c1b7
20200504:18:28:43:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function POSTGRES_VERSION_CHK
20200504:18:28:43:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_DIR
20200504:18:28:43:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_DIR
20200504:18:28:43:015736 gpinitsystem:node1:gpadmin-[INFO]:-Segment instance node3 /usr/local/greenplum-db/./lib checked
20200504:18:28:43:015736 gpinitsystem:node1:gpadmin-[INFO]:-Primary segment instance directory check
20200504:18:28:43:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking node2 for dir /data/greenplum/data1/primary/gpseg0
20200504:18:28:43:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_DIR
20200504:18:28:43:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_DIR
20200504:18:28:43:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking node2 /etc/hosts for localhost set as node2
20200504:18:28:43:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking write access to /data/greenplum/data1/primary on node2
20200504:18:28:44:015736 gpinitsystem:node1:gpadmin-[INFO]:-Write test passed on node2 /data/greenplum/data1/primary directory
20200504:18:28:44:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking node2 for dir /data/greenplum/data2/primary/gpseg1
20200504:18:28:44:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_DIR
20200504:18:28:44:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_DIR
20200504:18:28:44:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking node2 /etc/hosts for localhost set as node2
20200504:18:28:44:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking write access to /data/greenplum/data2/primary on node2
20200504:18:28:44:015736 gpinitsystem:node1:gpadmin-[INFO]:-Write test passed on node2 /data/greenplum/data2/primary directory
20200504:18:28:44:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking node3 for dir /data/greenplum/data1/primary/gpseg2
20200504:18:28:44:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_DIR
20200504:18:28:44:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_DIR
20200504:18:28:44:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking node3 /etc/hosts for localhost set as node3
20200504:18:28:44:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking write access to /data/greenplum/data1/primary on node3
20200504:18:28:45:015736 gpinitsystem:node1:gpadmin-[INFO]:-Write test passed on node3 /data/greenplum/data1/primary directory
20200504:18:28:45:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking node3 for dir /data/greenplum/data2/primary/gpseg3
20200504:18:28:45:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_DIR
20200504:18:28:45:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_DIR
20200504:18:28:45:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking node3 /etc/hosts for localhost set as node3
20200504:18:28:45:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking write access to /data/greenplum/data2/primary on node3
20200504:18:28:45:015736 gpinitsystem:node1:gpadmin-[INFO]:-Write test passed on node3 /data/greenplum/data2/primary directory
20200504:18:28:45:015736 gpinitsystem:node1:gpadmin-[INFO]:-Mirror segment instance directory check
20200504:18:28:45:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking node3 for dir /data/greenplum/data1/mirror/gpseg0
20200504:18:28:45:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_DIR
20200504:18:28:45:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_DIR
20200504:18:28:46:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking write access to /data/greenplum/data1/mirror on node3
20200504:18:28:46:015736 gpinitsystem:node1:gpadmin-[INFO]:-Write test passed on node3 /data/greenplum/data1/mirror directory
20200504:18:28:46:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking node3 for dir /data/greenplum/data2/mirror/gpseg1
20200504:18:28:46:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_DIR
20200504:18:28:46:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_DIR
20200504:18:28:46:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking write access to /data/greenplum/data2/mirror on node3
20200504:18:28:46:015736 gpinitsystem:node1:gpadmin-[INFO]:-Write test passed on node3 /data/greenplum/data2/mirror directory
20200504:18:28:46:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking node2 for dir /data/greenplum/data1/mirror/gpseg2
20200504:18:28:46:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_DIR
20200504:18:28:46:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_DIR
20200504:18:28:46:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking write access to /data/greenplum/data1/mirror on node2
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Write test passed on node2 /data/greenplum/data1/mirror directory
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking node2 for dir /data/greenplum/data2/mirror/gpseg3
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CHK_DIR
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_DIR
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking write access to /data/greenplum/data2/mirror on node2
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Write test passed on node2 /data/greenplum/data2/mirror directory
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checking new segment hosts, Completed
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CHK_QES
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function DISPLAY_CONFIG
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Greenplum Database Creation Parameters
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:---------------------------------------
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Master Configuration
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:---------------------------------------
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Master instance name       = Greenplum Data Platform
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Master hostname            = node1
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Master port                = 5432
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Master instance dir        = /data/greenplum/data/master/gpseg-1
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Master LOCALE              = en_US.utf8
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Greenplum segment prefix   = gpseg
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Master Database            = gpdw
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Master connections         = 250
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Master buffers             = 128000kB
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Segment connections        = 750
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Segment buffers            = 128000kB
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Checkpoint segments        = 8
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Encoding                   = UNICODE
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Postgres param file        = Off
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Initdb to be used          = /usr/local/greenplum-db/./bin/initdb
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-GP_LIBRARY_PATH is         = /usr/local/greenplum-db/./lib
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-HEAP_CHECKSUM is           = on
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-HBA_HOSTNAMES is           = 0
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Ulimit check               = Passed
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Array host connect type    = Single hostname per node
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Master IP address [1]      = ::1
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Master IP address [2]      = 192.168.188.87
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Master IP address [3]      = fe80::3157:3eb8:c80e:8f40
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Standby Master             = Not Configured
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Number of primary segments = 2
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Total Database segments    = 4
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Trusted shell              = ssh
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Number segment hosts       = 2
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Mirror port base           = 7000
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Number of mirror segments  = 2
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Mirroring config           = ON
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Mirroring type             = Group
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:----------------------------------------
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Greenplum Primary Segment Configuration
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:----------------------------------------
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-node2 	/data/greenplum/data1/primary/gpseg0 	6000 	2 	0
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-node2 	/data/greenplum/data2/primary/gpseg1 	6001 	3 	1
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-node3 	/data/greenplum/data1/primary/gpseg2 	6000 	4 	2
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-node3 	/data/greenplum/data2/primary/gpseg3 	6001 	5 	3
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:---------------------------------------
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-Greenplum Mirror Segment Configuration
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:---------------------------------------
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-node3 	/data/greenplum/data1/mirror/gpseg0 	7000 	6 	0
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-node3 	/data/greenplum/data2/mirror/gpseg1 	7001 	7 	1
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-node2 	/data/greenplum/data1/mirror/gpseg2 	7000 	8 	2
20200504:18:28:47:015736 gpinitsystem:node1:gpadmin-[INFO]:-node2 	/data/greenplum/data2/mirror/gpseg3 	7001 	9 	3
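The primary/mirror layout printed above is driven by the `gpinitsystem` cluster configuration file. The file actually used is not shown in this document; the sketch below is a hypothetical reconstruction whose values are read back from the `DISPLAY_CONFIG` output (two primaries per host on port base 6000, group mirroring on port base 7000, database `gpdw`, checkpoint segments 8):

```shell
# Hypothetical gpinitsystem_config -- a sketch reconstructed from the log
# above, not the file actually used. Passed to: gpinitsystem -c <this file>
# (segment hosts node2/node3 are supplied via a separate -h hostfile).
ARRAY_NAME="Greenplum Data Platform"
SEG_PREFIX=gpseg
PORT_BASE=6000
# Two directories per host => two primary segments per host (gpseg0..gpseg3)
declare -a DATA_DIRECTORY=(/data/greenplum/data1/primary /data/greenplum/data2/primary)
MASTER_HOSTNAME=node1
MASTER_DIRECTORY=/data/greenplum/data/master
MASTER_PORT=5432
TRUSTED_SHELL=ssh
CHECK_POINT_SEGMENTS=8
ENCODING=UNICODE
MIRROR_PORT_BASE=7000
declare -a MIRROR_DATA_DIRECTORY=(/data/greenplum/data1/mirror /data/greenplum/data2/mirror)
DATABASE_NAME=gpdw
```

The master/segment connection counts (250/750) and locale `en_US.utf8` shown in the log come from `gpinitsystem` defaults or command-line options rather than this file.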

Continue with Greenplum creation Yy|Nn (default=N):
> y
20200504:18:28:49:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function DISPLAY_CONFIG
20200504:18:28:49:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ARRAY_REORDER
20200504:18:28:49:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ARRAY_REORDER
20200504:18:28:49:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CREATE_QD_DB
20200504:18:28:49:015736 gpinitsystem:node1:gpadmin-[INFO]:-Building the Master instance database, please wait...
20200504:18:28:49:015736 gpinitsystem:node1:gpadmin-[INFO]:-Initializing Master Postgres instance /data/greenplum/data/master/gpseg-1
20200504:18:28:49:015736 gpinitsystem:node1:gpadmin-[INFO]:-Commencing local /usr/local/greenplum-db/./bin/initdb -E UNICODE -D /data/greenplum/data/master/gpseg-1 --locale=en_US.utf8        --max_connections=250 --shared_buffers=128000kB --data-checksums --backend_output=/data/greenplum/data/master/gpseg-1.initdb
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function BACKOUT_COMMAND
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function BACKOUT_COMMAND
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function BACKOUT_COMMAND
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function BACKOUT_COMMAND
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Completed Master instance initialization
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Setting the Master port to 5432
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function SED_PG_CONF
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Replaced line in /data/greenplum/data/master/gpseg-1/postgresql.conf
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Replaced line in /data/greenplum/data/master/gpseg-1/postgresql.conf
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function SED_PG_CONF
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed set Master port=5432 in postgresql.conf
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Completed setting the Master port to 5432
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Setting the Master listen addresses to '*'
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function SED_PG_CONF
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Replaced line in /data/greenplum/data/master/gpseg-1/postgresql.conf
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Replaced line in /data/greenplum/data/master/gpseg-1/postgresql.conf
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function SED_PG_CONF
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed set Master listen addresses to '*' in postgresql.conf
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Completed setting the listen addresses to '*'
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Setting Master logging option
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function SED_PG_CONF
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Appended line log_statement=all to /data/greenplum/data/master/gpseg-1/postgresql.conf
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function SED_PG_CONF
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed set log_statement=all in /data/greenplum/data/master/gpseg-1/postgresql.conf
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Setting Master instance check point segments
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function SED_PG_CONF
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Replaced line in /data/greenplum/data/master/gpseg-1/postgresql.conf
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Replaced line in /data/greenplum/data/master/gpseg-1/postgresql.conf
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function SED_PG_CONF
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed set checkpoint_segments=8 in /data/greenplum/data/master/gpseg-1/postgresql.conf
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Setting Master instance content id
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function SED_PG_CONF
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Appended line gp_contentid=-1 to /data/greenplum/data/master/gpseg-1/postgresql.conf
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function SED_PG_CONF
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed set gp_contentid=-1 in /data/greenplum/data/master/gpseg-1/postgresql.conf
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Setting Master instance db id
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function SED_PG_CONF
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Appended line gp_dbid=1 to /data/greenplum/data/master/gpseg-1/internal.auto.conf
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function SED_PG_CONF
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed set gp_dbid=1 in /data/greenplum/data/master/gpseg-1/internal.auto.conf
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Adding gp_dumpall access to pg_hba.conf for master host
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function BUILD_MASTER_PG_HBA_FILE
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Clearing values in Master pg_hba.conf
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Setting local access
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Setting local host access
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Complete Master pg_hba.conf configuration
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function BUILD_MASTER_PG_HBA_FILE
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Creating gpssh configuration file
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function BUILD_GPSSH_CONF
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function BUILD_GPSSH_CONF
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Creating perfmon directories and configuration file
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function BUILD_PERFMON
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed create perfmon directories and configuration file
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Starting the Master in admin mode
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function BACKOUT_COMMAND
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function BACKOUT_COMMAND
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function BACKOUT_COMMAND
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function BACKOUT_COMMAND
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function BACKOUT_COMMAND
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function BACKOUT_COMMAND
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function BACKOUT_COMMAND
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function BACKOUT_COMMAND
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function BACKOUT_COMMAND
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function BACKOUT_COMMAND
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function BACKOUT_COMMAND
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function BACKOUT_COMMAND
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Completed starting the Master in admin mode
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PING_HOST
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-node1 contact established
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PING_HOST
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function UPDATE_GPCONFIG
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed obtain psql count Master gp_segment_configuration
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Adding -1 on node1, path /data/greenplum/data/master/gpseg-1 to Master gp_segment_configuration
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed add -1 on node1 in dir /data/greenplum/data/master/gpseg-1 to Master gp_segment_configuration
20200504:18:28:53:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function UPDATE_GPCONFIG
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function LOAD_QE_SYSTEM_DATA
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Adding segment node2 to Master system tables
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PING_HOST
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-node2 contact established
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PING_HOST
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function UPDATE_GPCONFIG
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed obtain psql count Master gp_segment_configuration
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Adding 0 on node2, path /data/greenplum/data1/primary/gpseg0 to Master gp_segment_configuration
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed add 0 on node2 in dir /data/greenplum/data1/primary/gpseg0 to Master gp_segment_configuration
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function UPDATE_GPCONFIG
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully added segment node2 to Master system tables
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Adding segment node3 to Master system tables
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PING_HOST
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-node3 contact established
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PING_HOST
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function UPDATE_GPCONFIG
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed obtain psql count Master gp_segment_configuration
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Adding 2 on node3, path /data/greenplum/data1/primary/gpseg2 to Master gp_segment_configuration
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed add 2 on node3 in dir /data/greenplum/data1/primary/gpseg2 to Master gp_segment_configuration
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function UPDATE_GPCONFIG
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully added segment node3 to Master system tables
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Adding segment node2 to Master system tables
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PING_HOST
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-node2 contact established
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PING_HOST
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function UPDATE_GPCONFIG
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed obtain psql count Master gp_segment_configuration
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Adding 1 on node2, path /data/greenplum/data2/primary/gpseg1 to Master gp_segment_configuration
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed add 1 on node2 in dir /data/greenplum/data2/primary/gpseg1 to Master gp_segment_configuration
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function UPDATE_GPCONFIG
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully added segment node2 to Master system tables
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Adding segment node3 to Master system tables
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PING_HOST
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-node3 contact established
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PING_HOST
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function UPDATE_GPCONFIG
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed obtain psql count Master gp_segment_configuration
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Adding 3 on node3, path /data/greenplum/data2/primary/gpseg3 to Master gp_segment_configuration
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed add 3 on node3 in dir /data/greenplum/data2/primary/gpseg3 to Master gp_segment_configuration
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function UPDATE_GPCONFIG
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully added segment node3 to Master system tables
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function LOAD_QE_SYSTEM_DATA
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CREATE_QD_DB
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CREATE_ARRAY_SORTED_ON_CONTENT_ID
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CREATE_ARRAY_SORTED_ON_CONTENT_ID
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CREATE_SEGMENT
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Commencing parallel build of primary segment instances
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PARALLEL_SETUP
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Spawning parallel processes    batch [1], please wait...
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PARALLEL_SETUP
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PARALLEL_COUNT
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PARALLEL_COUNT
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PARALLEL_COUNT
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PARALLEL_COUNT
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PARALLEL_COUNT
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PARALLEL_COUNT
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PARALLEL_COUNT
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PARALLEL_WAIT
20200504:18:28:54:015736 gpinitsystem:node1:gpadmin-[INFO]:-Waiting for parallel processes batch [1], please wait...
20200504:18:29:05:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PARALLEL_WAIT
20200504:18:29:05:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PARALLEL_COUNT
20200504:18:29:05:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PARALLEL_SUMMARY_STATUS_REPORT
20200504:18:29:05:015736 gpinitsystem:node1:gpadmin-[INFO]:------------------------------------------------
20200504:18:29:05:015736 gpinitsystem:node1:gpadmin-[INFO]:-Parallel process exit status
20200504:18:29:05:015736 gpinitsystem:node1:gpadmin-[INFO]:------------------------------------------------
20200504:18:29:05:015736 gpinitsystem:node1:gpadmin-[INFO]:-Total processes marked as completed           = 4
20200504:18:29:05:015736 gpinitsystem:node1:gpadmin-[INFO]:-Total processes marked as killed              = 0
20200504:18:29:05:015736 gpinitsystem:node1:gpadmin-[INFO]:-Total processes marked as failed              = 0
20200504:18:29:05:015736 gpinitsystem:node1:gpadmin-[INFO]:------------------------------------------------
20200504:18:29:05:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PARALLEL_SUMMARY_STATUS_REPORT
20200504:18:29:05:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CREATE_SEGMENT
20200504:18:29:05:015736 gpinitsystem:node1:gpadmin-[INFO]:-Deleting distributed backout files
20200504:18:29:05:015736 gpinitsystem:node1:gpadmin-[INFO]:-Removing back out file
20200504:18:29:05:015736 gpinitsystem:node1:gpadmin-[INFO]:-No errors generated from parallel processes
20200504:18:29:05:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function STOP_QD_PRODUCTION
20200504:18:29:05:015736 gpinitsystem:node1:gpadmin-[INFO]:-Restarting the Greenplum instance in production mode
20200504:18:29:06:019448 gpstop:node1:gpadmin-[INFO]:-Starting gpstop with args: -a -l /home/gpadmin/gpAdminLogs -m -d /data/greenplum/data/master/gpseg-1 -v
20200504:18:29:06:019448 gpstop:node1:gpadmin-[DEBUG]:-Setting level of parallelism to: 64
20200504:18:29:06:019448 gpstop:node1:gpadmin-[INFO]:-Gathering information and validating the environment...
20200504:18:29:06:019448 gpstop:node1:gpadmin-[DEBUG]:---Checking that current user can use GP binaries
20200504:18:29:06:019448 gpstop:node1:gpadmin-[DEBUG]:-Obtaining master's port from master data directory
20200504:18:29:06:019448 gpstop:node1:gpadmin-[DEBUG]:-Read from postgresql.conf port=5432
20200504:18:29:06:019448 gpstop:node1:gpadmin-[INFO]:-Obtaining Greenplum Master catalog information
20200504:18:29:06:019448 gpstop:node1:gpadmin-[INFO]:-Obtaining Segment details from master...
20200504:18:29:06:019448 gpstop:node1:gpadmin-[DEBUG]:-Connecting to dbname='template1'
20200504:18:29:06:019448 gpstop:node1:gpadmin-[DEBUG]:-Running Command: $GPHOME/bin/postgres --gp-version
20200504:18:29:06:019448 gpstop:node1:gpadmin-[INFO]:-Greenplum Version: 'postgres (Greenplum Database) 6.7.1 build commit:a21de286045072d8d1df64fa48752b7dfac8c1b7'
20200504:18:29:06:019448 gpstop:node1:gpadmin-[INFO]:-Commencing Master instance shutdown with mode='smart'
20200504:18:29:06:019448 gpstop:node1:gpadmin-[INFO]:-Master segment instance directory=/data/greenplum/data/master/gpseg-1
20200504:18:29:06:019448 gpstop:node1:gpadmin-[DEBUG]:-/data/greenplum/data/master/gpseg-1/pg_log/gp_era - end_gp_era
20200504:18:29:06:019448 gpstop:node1:gpadmin-[INFO]:-Stopping master segment and waiting for user connections to finish ...
20200504:18:29:06:019448 gpstop:node1:gpadmin-[DEBUG]:-Connecting to dbname='template1'
server shutting down
20200504:18:29:07:019448 gpstop:node1:gpadmin-[INFO]:-Attempting forceful termination of any leftover master process
20200504:18:29:07:019448 gpstop:node1:gpadmin-[DEBUG]:-Running Command: cat /tmp/.s.PGSQL.5432.lock
20200504:18:29:07:019448 gpstop:node1:gpadmin-[INFO]:-Terminating processes for segment /data/greenplum/data/master/gpseg-1
20200504:18:29:07:019448 gpstop:node1:gpadmin-[DEBUG]:-Running Command: ps ux | grep "[p]ostgres:\s*port\s*5432" | awk '{print $2}'
20200504:18:29:07:019448 gpstop:node1:gpadmin-[ERROR]:-Failed to kill processes for segment /data/greenplum/data/master/gpseg-1: ([Errno 3] No such process)
20200504:18:29:07:019448 gpstop:node1:gpadmin-[DEBUG]:-Successfully shutdown the Master instance in admin mode
20200504:18:29:07:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully shutdown the new Greenplum instance
20200504:18:29:07:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function STOP_QD_PRODUCTION
20200504:18:29:07:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function START_QD_PRODUCTION
20200504:18:29:07:019475 gpstart:node1:gpadmin-[INFO]:-Starting gpstart with args: -a -l /home/gpadmin/gpAdminLogs -d /data/greenplum/data/master/gpseg-1 -v
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Setting level of parallelism to: 64
20200504:18:29:07:019475 gpstart:node1:gpadmin-[INFO]:-Gathering information and validating the environment...
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:---Checking that current user can use GP binaries
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Obtaining master's port from master data directory
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Read from postgresql.conf port=5432
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Read from postgresql.conf max_connections=250
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Running Command: $GPHOME/bin/postgres --gp-version
20200504:18:29:07:019475 gpstart:node1:gpadmin-[INFO]:-Greenplum Binary Version: 'postgres (Greenplum Database) 6.7.1 build commit:a21de286045072d8d1df64fa48752b7dfac8c1b7'
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Running Command: $GPHOME/bin/postgres --catalog-version
20200504:18:29:07:019475 gpstart:node1:gpadmin-[INFO]:-Greenplum Catalog Version: '301908232'
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Running Command: $GPHOME/bin/pg_controldata /data/greenplum/data/master/gpseg-1
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Check if Master is already running...
20200504:18:29:07:019475 gpstart:node1:gpadmin-[INFO]:-Starting Master instance in admin mode
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Running Command: env GPSESSID=0000000000 GPERA=None $GPHOME/bin/pg_ctl -D /data/greenplum/data/master/gpseg-1 -l /data/greenplum/data/master/gpseg-1/pg_log/startup.log -w -t 600 -o " -p 5432 -c gp_role=utility " start
20200504:18:29:07:019475 gpstart:node1:gpadmin-[INFO]:-Obtaining Greenplum Master catalog information
20200504:18:29:07:019475 gpstart:node1:gpadmin-[INFO]:-Obtaining Segment details from master...
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Connecting to dbname='template1'
20200504:18:29:07:019475 gpstart:node1:gpadmin-[INFO]:-Setting new master era
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-/data/greenplum/data/master/gpseg-1/pg_log/gp_era - write_gp_era
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-opening new file
20200504:18:29:07:019475 gpstart:node1:gpadmin-[INFO]:-wrote era: 9c0a65a67950ab11_200504182907
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-setting read only
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-verifying file
20200504:18:29:07:019475 gpstart:node1:gpadmin-[INFO]:-Master Started...
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Running Command: $GPHOME/bin/pg_controldata /data/greenplum/data/master/gpseg-1
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Checking if standby has been activated...
20200504:18:29:07:019475 gpstart:node1:gpadmin-[INFO]:-Shutting down master
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Running Command: $GPHOME/bin/gpstop -a -m -f -d /data/greenplum/data/master/gpseg-1 -v -B 64 -l '/home/gpadmin/gpAdminLogs'
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-results of forcing master shutdown: Shutting down master cmdStr='$GPHOME/bin/gpstop -a -m -f -d /data/greenplum/data/master/gpseg-1 -v -B 64 -l '/home/gpadmin/gpAdminLogs''  had result: cmd had rc=0 completed=True halted=False
  stdout='20200504:18:29:07:019496 gpstop:node1:gpadmin-[INFO]:-Starting gpstop with args: -a -m -f -d /data/greenplum/data/master/gpseg-1 -v -B 64 -l /home/gpadmin/gpAdminLogs
20200504:18:29:07:019496 gpstop:node1:gpadmin-[DEBUG]:-Setting level of parallelism to: 64
20200504:18:29:07:019496 gpstop:node1:gpadmin-[INFO]:-Gathering information and validating the environment...
20200504:18:29:07:019496 gpstop:node1:gpadmin-[DEBUG]:---Checking that current user can use GP binaries
20200504:18:29:07:019496 gpstop:node1:gpadmin-[DEBUG]:-Obtaining master's port from master data directory
20200504:18:29:07:019496 gpstop:node1:gpadmin-[DEBUG]:-Read from postgresql.conf port=5432
20200504:18:29:07:019496 gpstop:node1:gpadmin-[INFO]:-Obtaining Greenplum Master catalog information
20200504:18:29:07:019496 gpstop:node1:gpadmin-[INFO]:-Obtaining Segment details from master...
20200504:18:29:07:019496 gpstop:node1:gpadmin-[DEBUG]:-Connecting to dbname='template1'
20200504:18:29:07:019496 gpstop:node1:gpadmin-[DEBUG]:-Running Command: $GPHOME/bin/postgres --gp-version
20200504:18:29:07:019496 gpstop:node1:gpadmin-[INFO]:-Greenplum Version: 'postgres (Greenplum Database) 6.7.1 build commit:a21de286045072d8d1df64fa48752b7dfac8c1b7'
20200504:18:29:07:019496 gpstop:node1:gpadmin-[INFO]:-Commencing Master instance shutdown with mode='fast'
20200504:18:29:07:019496 gpstop:node1:gpadmin-[INFO]:-Master segment instance directory=/data/greenplum/data/master/gpseg-1
20200504:18:29:07:019496 gpstop:node1:gpadmin-[DEBUG]:-/data/greenplum/data/master/gpseg-1/pg_log/gp_era - end_gp_era
20200504:18:29:07:019496 gpstop:node1:gpadmin-[DEBUG]:-found existing file
20200504:18:29:07:019496 gpstop:node1:gpadmin-[DEBUG]:-removed existing file
20200504:18:29:07:019496 gpstop:node1:gpadmin-[DEBUG]:-Running Command: $GPHOME/bin/pg_ctl -D /data/greenplum/data/master/gpseg-1 -m fast -w -t 120 stop
20200504:18:29:07:019496 gpstop:node1:gpadmin-[INFO]:-Attempting forceful termination of any leftover master process
20200504:18:29:07:019496 gpstop:node1:gpadmin-[DEBUG]:-Running Command: cat /tmp/.s.PGSQL.5432.lock
20200504:18:29:07:019496 gpstop:node1:gpadmin-[INFO]:-Terminating processes for segment /data/greenplum/data/master/gpseg-1
20200504:18:29:07:019496 gpstop:node1:gpadmin-[DEBUG]:-Running Command: ps ux | grep "[p]ostgres:\s*port\s*5432" | awk '{print $2}'
20200504:18:29:07:019496 gpstop:node1:gpadmin-[ERROR]:-Failed to kill processes for segment /data/greenplum/data/master/gpseg-1: ([Errno 3] No such process)
20200504:18:29:07:019496 gpstop:node1:gpadmin-[DEBUG]:-Successfully shutdown the Master instance in admin mode
'
  stderr=''
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-gp_segment_configuration indicates following valid segments
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-SegDB: node2:/data/greenplum/data1/primary/gpseg0:content=0:dbid=2:role=p:preferred_role=p:mode=n:status=u
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-SegDB: node2:/data/greenplum/data2/primary/gpseg1:content=1:dbid=3:role=p:preferred_role=p:mode=n:status=u
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-SegDB: node3:/data/greenplum/data1/primary/gpseg2:content=2:dbid=4:role=p:preferred_role=p:mode=n:status=u
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-SegDB: node3:/data/greenplum/data2/primary/gpseg3:content=3:dbid=5:role=p:preferred_role=p:mode=n:status=u
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-dbIdsToNotStart has 0 entries
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-gparray does not have mirrors
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Adding cmd to work_queue: /bin/ping -c 1 node3
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Adding cmd to work_queue: /bin/ping -c 1 node2
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-[worker1] got cmd: /bin/ping -c 1 node3
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-[worker0] got cmd: /bin/ping -c 1 node2
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Running Command: /bin/ping -c 1 node3
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Running Command: /bin/ping -c 1 node2
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-[worker1] finished cmd: ping cmdStr='/bin/ping -c 1 node3'  had result: cmd had rc=0 completed=True halted=False
  stdout='PING node3 (192.168.188.89) 56(84) bytes of data.
64 bytes from node3 (192.168.188.89): icmp_seq=1 ttl=64 time=0.298 ms

--- node3 ping statistics ---
1 packets transmitted, 1 received, 0% packet loss, time 0ms
rtt min/avg/max/mdev = 0.298/0.298/0.298/0.000 ms
'
  stderr=''
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-[worker0] finished cmd: ping cmdStr='/bin/ping -c 1 node2'  had result: cmd had rc=0 completed=True halted=False
  stdout='PING node2 (192.168.188.88) 56(84) bytes of data.
64 bytes from node2 (192.168.188.88): icmp_seq=1 ttl=64 time=0.265 ms

--- node2 ping statistics ---
1 packets transmitted, 1 received, 0% packet loss, time 0ms
rtt min/avg/max/mdev = 0.265/0.265/0.265/0.000 ms
'
  stderr=''
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Construct host-->datadirs mapping:
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Host node3 has Datadirs: [/data/greenplum/data1/primary/gpseg2,/data/greenplum/data2/primary/gpseg3]
Host node2 has Datadirs: [/data/greenplum/data1/primary/gpseg0,/data/greenplum/data2/primary/gpseg1]
20200504:18:29:07:019475 gpstart:node1:gpadmin-[INFO]:-Commencing parallel segment instance startup, please wait...
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Dispatching command to start segments on host: node3, with 4 contents in cluster
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-$GPHOME/sbin/gpsegstart.py -M mirrorless -V 'postgres (Greenplum Database) 6.7.1 build commit:a21de286045072d8d1df64fa48752b7dfac8c1b7' -n 4 --era 9c0a65a67950ab11_200504182907 -t 600 --master-checksum-version 1 -v -D '4|2|p|p|n|u|node3|node3|6000|/data/greenplum/data1/primary/gpseg2' -D '5|3|p|p|n|u|node3|node3|6001|/data/greenplum/data2/primary/gpseg3' -B 64
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Adding cmd to work_queue: $GPHOME/sbin/gpsegstart.py -M mirrorless -V 'postgres (Greenplum Database) 6.7.1 build commit:a21de286045072d8d1df64fa48752b7dfac8c1b7' -n 4 --era 9c0a65a67950ab11_200504182907 -t 600 --master-checksum-version 1 -v -D '4|2|p|p|n|u|node3|node3|6000|/data/greenplum/data1/primary/gpseg2' -D '5|3|p|p|n|u|node3|node3|6001|/data/greenplum/data2/primary/gpseg3' -B 64 -l '/home/gpadmin/gpAdminLogs'
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Dispatching command to start segments on host: node2, with 4 contents in cluster
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-[worker2] got cmd: $GPHOME/sbin/gpsegstart.py -M mirrorless -V 'postgres (Greenplum Database) 6.7.1 build commit:a21de286045072d8d1df64fa48752b7dfac8c1b7' -n 4 --era 9c0a65a67950ab11_200504182907 -t 600 --master-checksum-version 1 -v -D '4|2|p|p|n|u|node3|node3|6000|/data/greenplum/data1/primary/gpseg2' -D '5|3|p|p|n|u|node3|node3|6001|/data/greenplum/data2/primary/gpseg3' -B 64 -l '/home/gpadmin/gpAdminLogs'
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-$GPHOME/sbin/gpsegstart.py -M mirrorless -V 'postgres (Greenplum Database) 6.7.1 build commit:a21de286045072d8d1df64fa48752b7dfac8c1b7' -n 4 --era 9c0a65a67950ab11_200504182907 -t 600 --master-checksum-version 1 -v -D '2|0|p|p|n|u|node2|node2|6000|/data/greenplum/data1/primary/gpseg0' -D '3|1|p|p|n|u|node2|node2|6001|/data/greenplum/data2/primary/gpseg1' -B 64
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Running Command: $GPHOME/sbin/gpsegstart.py -M mirrorless -V 'postgres (Greenplum Database) 6.7.1 build commit:a21de286045072d8d1df64fa48752b7dfac8c1b7' -n 4 --era 9c0a65a67950ab11_200504182907 -t 600 --master-checksum-version 1 -v -D '4|2|p|p|n|u|node3|node3|6000|/data/greenplum/data1/primary/gpseg2' -D '5|3|p|p|n|u|node3|node3|6001|/data/greenplum/data2/primary/gpseg3' -B 64 -l '/home/gpadmin/gpAdminLogs'
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Adding cmd to work_queue: $GPHOME/sbin/gpsegstart.py -M mirrorless -V 'postgres (Greenplum Database) 6.7.1 build commit:a21de286045072d8d1df64fa48752b7dfac8c1b7' -n 4 --era 9c0a65a67950ab11_200504182907 -t 600 --master-checksum-version 1 -v -D '2|0|p|p|n|u|node2|node2|6000|/data/greenplum/data1/primary/gpseg0' -D '3|1|p|p|n|u|node2|node2|6001|/data/greenplum/data2/primary/gpseg1' -B 64 -l '/home/gpadmin/gpAdminLogs'
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-[worker3] got cmd: $GPHOME/sbin/gpsegstart.py -M mirrorless -V 'postgres (Greenplum Database) 6.7.1 build commit:a21de286045072d8d1df64fa48752b7dfac8c1b7' -n 4 --era 9c0a65a67950ab11_200504182907 -t 600 --master-checksum-version 1 -v -D '2|0|p|p|n|u|node2|node2|6000|/data/greenplum/data1/primary/gpseg0' -D '3|1|p|p|n|u|node2|node2|6001|/data/greenplum/data2/primary/gpseg1' -B 64 -l '/home/gpadmin/gpAdminLogs'
20200504:18:29:07:019475 gpstart:node1:gpadmin-[DEBUG]:-Running Command: $GPHOME/sbin/gpsegstart.py -M mirrorless -V 'postgres (Greenplum Database) 6.7.1 build commit:a21de286045072d8d1df64fa48752b7dfac8c1b7' -n 4 --era 9c0a65a67950ab11_200504182907 -t 600 --master-checksum-version 1 -v -D '2|0|p|p|n|u|node2|node2|6000|/data/greenplum/data1/primary/gpseg0' -D '3|1|p|p|n|u|node2|node2|6001|/data/greenplum/data2/primary/gpseg1' -B 64 -l '/home/gpadmin/gpAdminLogs'
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-[worker3] finished cmd: remote segment starts on host 'node2' cmdStr='ssh -o StrictHostKeyChecking=no -o ServerAliveInterval=60 node2 ". /usr/local/greenplum-db/./greenplum_path.sh; $GPHOME/sbin/gpsegstart.py -M mirrorless -V 'postgres (Greenplum Database) 6.7.1 build commit:a21de286045072d8d1df64fa48752b7dfac8c1b7' -n 4 --era 9c0a65a67950ab11_200504182907 -t 600 --master-checksum-version 1 -v -D '2|0|p|p|n|u|node2|node2|6000|/data/greenplum/data1/primary/gpseg0' -D '3|1|p|p|n|u|node2|node2|6001|/data/greenplum/data2/primary/gpseg1' -B 64 -l '/home/gpadmin/gpAdminLogs'"'  had result: cmd had rc=0 completed=True halted=False
  stdout='20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[INFO]:-Starting gpsegstart.py with args: -M mirrorless -V postgres (Greenplum Database) 6.7.1 build commit:a21de286045072d8d1df64fa48752b7dfac8c1b7 -n 4 --era 9c0a65a67950ab11_200504182907 -t 600 --master-checksum-version 1 -v -D 2|0|p|p|n|u|node2|node2|6000|/data/greenplum/data1/primary/gpseg0 -D 3|1|p|p|n|u|node2|node2|6001|/data/greenplum/data2/primary/gpseg1 -B 64 -l /home/gpadmin/gpAdminLogs
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[DEBUG]:-Running Command: $GPHOME/bin/postgres --gp-version
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[INFO]:-Validating directories...
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[INFO]:-Validating directory: /data/greenplum/data1/primary/gpseg0
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[INFO]:-Validating directory: /data/greenplum/data2/primary/gpseg1
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[INFO]:-Starting segments... (mirroringMode mirrorless)
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[DEBUG]:-Running Command: $GPHOME/bin/pg_controldata /data/greenplum/data1/primary/gpseg0
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[DEBUG]:-Adding cmd to work_queue: env GPSESSID=0000000000 GPERA=9c0a65a67950ab11_200504182907 $GPHOME/bin/pg_ctl -D /data/greenplum/data1/primary/gpseg0 -l /data/greenplum/data1/primary/gpseg0/pg_log/startup.log -w -t 600 -o " -p 6000 " start 2>&1
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[DEBUG]:-Running Command: $GPHOME/bin/pg_controldata /data/greenplum/data2/primary/gpseg1
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[DEBUG]:-[worker0] got cmd: env GPSESSID=0000000000 GPERA=9c0a65a67950ab11_200504182907 $GPHOME/bin/pg_ctl -D /data/greenplum/data1/primary/gpseg0 -l /data/greenplum/data1/primary/gpseg0/pg_log/startup.log -w -t 600 -o " -p 6000 " start 2>&1
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[DEBUG]:-Running Command: env GPSESSID=0000000000 GPERA=9c0a65a67950ab11_200504182907 $GPHOME/bin/pg_ctl -D /data/greenplum/data1/primary/gpseg0 -l /data/greenplum/data1/primary/gpseg0/pg_log/startup.log -w -t 600 -o " -p 6000 " start 2>&1
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[DEBUG]:-Adding cmd to work_queue: env GPSESSID=0000000000 GPERA=9c0a65a67950ab11_200504182907 $GPHOME/bin/pg_ctl -D /data/greenplum/data2/primary/gpseg1 -l /data/greenplum/data2/primary/gpseg1/pg_log/startup.log -w -t 600 -o " -p 6001 " start 2>&1
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[DEBUG]:-[worker1] got cmd: env GPSESSID=0000000000 GPERA=9c0a65a67950ab11_200504182907 $GPHOME/bin/pg_ctl -D /data/greenplum/data2/primary/gpseg1 -l /data/greenplum/data2/primary/gpseg1/pg_log/startup.log -w -t 600 -o " -p 6001 " start 2>&1
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[DEBUG]:-Running Command: env GPSESSID=0000000000 GPERA=9c0a65a67950ab11_200504182907 $GPHOME/bin/pg_ctl -D /data/greenplum/data2/primary/gpseg1 -l /data/greenplum/data2/primary/gpseg1/pg_log/startup.log -w -t 600 -o " -p 6001 " start 2>&1
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[DEBUG]:-[worker1] finished cmd: Starting seg at dir /data/greenplum/data2/primary/gpseg1 cmdStr='env GPSESSID=0000000000 GPERA=9c0a65a67950ab11_200504182907 $GPHOME/bin/pg_ctl -D /data/greenplum/data2/primary/gpseg1 -l /data/greenplum/data2/primary/gpseg1/pg_log/startup.log -w -t 600 -o " -p 6001 " start 2>&1'  had result: cmd had rc=0 completed=True halted=False
  stdout='waiting for server to start.... done
server started
'
  stderr=''
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[DEBUG]:-[worker0] finished cmd: Starting seg at dir /data/greenplum/data1/primary/gpseg0 cmdStr='env GPSESSID=0000000000 GPERA=9c0a65a67950ab11_200504182907 $GPHOME/bin/pg_ctl -D /data/greenplum/data1/primary/gpseg0 -l /data/greenplum/data1/primary/gpseg0/pg_log/startup.log -w -t 600 -o " -p 6000 " start 2>&1'  had result: cmd had rc=0 completed=True halted=False
  stdout='waiting for server to start.... done
server started
'
  stderr=''
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[INFO]:-Checking segment postmasters... (must_be_running True)
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[INFO]:-Postmaster /data/greenplum/data1/primary/gpseg0 is running (pid 6437)
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[INFO]:-Postmaster /data/greenplum/data2/primary/gpseg1 is running (pid 6438)
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[INFO]:-
COMMAND RESULTS
STATUS--DIR:/data/greenplum/data1/primary/gpseg0--STARTED:True--REASONCODE:0--REASON:Start Succeeded
STATUS--DIR:/data/greenplum/data2/primary/gpseg1--STARTED:True--REASONCODE:0--REASON:Start Succeeded
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[DEBUG]:-WorkerPool haltWork()
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[DEBUG]:-[worker0] haltWork
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[DEBUG]:-[worker1] haltWork
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[DEBUG]:-[worker0] got a halt cmd
20200504:18:29:08:006421 gpsegstart.py_node2:gpadmin:node2:gpadmin-[DEBUG]:-[worker1] got a halt cmd
'
  stderr=''
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-[worker2] finished cmd: remote segment starts on host 'node3' cmdStr='ssh -o StrictHostKeyChecking=no -o ServerAliveInterval=60 node3 ". /usr/local/greenplum-db/./greenplum_path.sh; $GPHOME/sbin/gpsegstart.py -M mirrorless -V 'postgres (Greenplum Database) 6.7.1 build commit:a21de286045072d8d1df64fa48752b7dfac8c1b7' -n 4 --era 9c0a65a67950ab11_200504182907 -t 600 --master-checksum-version 1 -v -D '4|2|p|p|n|u|node3|node3|6000|/data/greenplum/data1/primary/gpseg2' -D '5|3|p|p|n|u|node3|node3|6001|/data/greenplum/data2/primary/gpseg3' -B 64 -l '/home/gpadmin/gpAdminLogs'"'  had result: cmd had rc=0 completed=True halted=False
  stdout='20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[INFO]:-Starting gpsegstart.py with args: -M mirrorless -V postgres (Greenplum Database) 6.7.1 build commit:a21de286045072d8d1df64fa48752b7dfac8c1b7 -n 4 --era 9c0a65a67950ab11_200504182907 -t 600 --master-checksum-version 1 -v -D 4|2|p|p|n|u|node3|node3|6000|/data/greenplum/data1/primary/gpseg2 -D 5|3|p|p|n|u|node3|node3|6001|/data/greenplum/data2/primary/gpseg3 -B 64 -l /home/gpadmin/gpAdminLogs
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[DEBUG]:-Running Command: $GPHOME/bin/postgres --gp-version
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[INFO]:-Validating directories...
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[INFO]:-Validating directory: /data/greenplum/data2/primary/gpseg3
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[INFO]:-Validating directory: /data/greenplum/data1/primary/gpseg2
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[INFO]:-Starting segments... (mirroringMode mirrorless)
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[DEBUG]:-Running Command: $GPHOME/bin/pg_controldata /data/greenplum/data2/primary/gpseg3
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[DEBUG]:-Adding cmd to work_queue: env GPSESSID=0000000000 GPERA=9c0a65a67950ab11_200504182907 $GPHOME/bin/pg_ctl -D /data/greenplum/data2/primary/gpseg3 -l /data/greenplum/data2/primary/gpseg3/pg_log/startup.log -w -t 600 -o " -p 6001 " start 2>&1
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[DEBUG]:-Running Command: $GPHOME/bin/pg_controldata /data/greenplum/data1/primary/gpseg2
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[DEBUG]:-[worker0] got cmd: env GPSESSID=0000000000 GPERA=9c0a65a67950ab11_200504182907 $GPHOME/bin/pg_ctl -D /data/greenplum/data2/primary/gpseg3 -l /data/greenplum/data2/primary/gpseg3/pg_log/startup.log -w -t 600 -o " -p 6001 " start 2>&1
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[DEBUG]:-Running Command: env GPSESSID=0000000000 GPERA=9c0a65a67950ab11_200504182907 $GPHOME/bin/pg_ctl -D /data/greenplum/data2/primary/gpseg3 -l /data/greenplum/data2/primary/gpseg3/pg_log/startup.log -w -t 600 -o " -p 6001 " start 2>&1
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[DEBUG]:-Adding cmd to work_queue: env GPSESSID=0000000000 GPERA=9c0a65a67950ab11_200504182907 $GPHOME/bin/pg_ctl -D /data/greenplum/data1/primary/gpseg2 -l /data/greenplum/data1/primary/gpseg2/pg_log/startup.log -w -t 600 -o " -p 6000 " start 2>&1
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[DEBUG]:-[worker1] got cmd: env GPSESSID=0000000000 GPERA=9c0a65a67950ab11_200504182907 $GPHOME/bin/pg_ctl -D /data/greenplum/data1/primary/gpseg2 -l /data/greenplum/data1/primary/gpseg2/pg_log/startup.log -w -t 600 -o " -p 6000 " start 2>&1
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[DEBUG]:-Running Command: env GPSESSID=0000000000 GPERA=9c0a65a67950ab11_200504182907 $GPHOME/bin/pg_ctl -D /data/greenplum/data1/primary/gpseg2 -l /data/greenplum/data1/primary/gpseg2/pg_log/startup.log -w -t 600 -o " -p 6000 " start 2>&1
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[DEBUG]:-[worker0] finished cmd: Starting seg at dir /data/greenplum/data2/primary/gpseg3 cmdStr='env GPSESSID=0000000000 GPERA=9c0a65a67950ab11_200504182907 $GPHOME/bin/pg_ctl -D /data/greenplum/data2/primary/gpseg3 -l /data/greenplum/data2/primary/gpseg3/pg_log/startup.log -w -t 600 -o " -p 6001 " start 2>&1'  had result: cmd had rc=0 completed=True halted=False
  stdout='waiting for server to start.... done
server started
'
  stderr=''
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[DEBUG]:-[worker1] finished cmd: Starting seg at dir /data/greenplum/data1/primary/gpseg2 cmdStr='env GPSESSID=0000000000 GPERA=9c0a65a67950ab11_200504182907 $GPHOME/bin/pg_ctl -D /data/greenplum/data1/primary/gpseg2 -l /data/greenplum/data1/primary/gpseg2/pg_log/startup.log -w -t 600 -o " -p 6000 " start 2>&1'  had result: cmd had rc=0 completed=True halted=False
  stdout='waiting for server to start.... done
server started
'
  stderr=''
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[INFO]:-Checking segment postmasters... (must_be_running True)
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[INFO]:-Postmaster /data/greenplum/data2/primary/gpseg3 is running (pid 6421)
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[INFO]:-Postmaster /data/greenplum/data1/primary/gpseg2 is running (pid 6422)
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[INFO]:-
COMMAND RESULTS
STATUS--DIR:/data/greenplum/data2/primary/gpseg3--STARTED:True--REASONCODE:0--REASON:Start Succeeded
STATUS--DIR:/data/greenplum/data1/primary/gpseg2--STARTED:True--REASONCODE:0--REASON:Start Succeeded
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[DEBUG]:-WorkerPool haltWork()
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[DEBUG]:-[worker0] haltWork
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[DEBUG]:-[worker1] haltWork
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[DEBUG]:-[worker1] got a halt cmd
20200504:18:29:08:006405 gpsegstart.py_node3:gpadmin:node3:gpadmin-[DEBUG]:-[worker0] got a halt cmd
'
  stderr=''
20200504:18:29:08:019475 gpstart:node1:gpadmin-[INFO]:-Process results...
20200504:18:29:08:019475 gpstart:node1:gpadmin-[INFO]:-----------------------------------------------------
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-DBID:2  STARTED
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-DBID:3  STARTED
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-DBID:5  STARTED
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-DBID:4  STARTED
20200504:18:29:08:019475 gpstart:node1:gpadmin-[INFO]:-----------------------------------------------------


20200504:18:29:08:019475 gpstart:node1:gpadmin-[INFO]:-----------------------------------------------------
20200504:18:29:08:019475 gpstart:node1:gpadmin-[INFO]:-   Successful segment starts                                            = 4
20200504:18:29:08:019475 gpstart:node1:gpadmin-[INFO]:-   Failed segment starts                                                = 0
20200504:18:29:08:019475 gpstart:node1:gpadmin-[INFO]:-   Skipped segment starts (segments are marked down in configuration)   = 0
20200504:18:29:08:019475 gpstart:node1:gpadmin-[INFO]:-----------------------------------------------------
20200504:18:29:08:019475 gpstart:node1:gpadmin-[INFO]:-Successfully started 4 of 4 segment instances 
20200504:18:29:08:019475 gpstart:node1:gpadmin-[INFO]:-----------------------------------------------------
20200504:18:29:08:019475 gpstart:node1:gpadmin-[INFO]:-Starting Master instance node1 directory /data/greenplum/data/master/gpseg-1 
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-Running Command: env GPSESSID=0000000000 GPERA=9c0a65a67950ab11_200504182907 $GPHOME/bin/pg_ctl -D /data/greenplum/data/master/gpseg-1 -l /data/greenplum/data/master/gpseg-1/pg_log/startup.log -w -t 600 -o " -p 5432 -E " start
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-Running Command: $GPHOME/bin/pg_ctl -D /data/greenplum/data/master/gpseg-1 status
20200504:18:29:08:019475 gpstart:node1:gpadmin-[INFO]:-Command pg_ctl reports Master node1 instance active
20200504:18:29:08:019475 gpstart:node1:gpadmin-[INFO]:-Connecting to dbname='template1' connect_timeout=15
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-/data/greenplum/data/master/gpseg-1/pg_log/gp_era - write_gp_era
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-opening new file
20200504:18:29:08:019475 gpstart:node1:gpadmin-[INFO]:-wrote era: 9c0a65a67950ab11_200504182907
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-setting read only
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-verifying file
20200504:18:29:08:019475 gpstart:node1:gpadmin-[INFO]:-No standby master configured.  skipping...
20200504:18:29:08:019475 gpstart:node1:gpadmin-[INFO]:-Database successfully started
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-WorkerPool haltWork()
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-[worker0] haltWork
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-[worker1] haltWork
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-[worker2] haltWork
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-[worker3] haltWork
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-[worker4] haltWork
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-[worker0] got a halt cmd
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-[worker2] got a halt cmd
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-[worker3] got a halt cmd
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-[worker1] got a halt cmd
20200504:18:29:08:019475 gpstart:node1:gpadmin-[DEBUG]:-[worker4] got a halt cmd
20200504:18:29:08:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully started new Greenplum instance
20200504:18:29:08:015736 gpinitsystem:node1:gpadmin-[INFO]:-Completed restart of Greenplum instance in production mode
20200504:18:29:08:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function START_QD_PRODUCTION
20200504:18:29:08:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CREATE_DATABASE
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed create database gpdw
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CREATE_DATABASE
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function SET_GP_USER_PW
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed update Greenplum superuser password
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function SET_GP_USER_PW
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function REGISTER_MIRRORS
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed failed to register mirror for contentid=0
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed failed to register mirror for contentid=1
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed failed to register mirror for contentid=2
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function ERROR_CHK
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Successfully completed failed to register mirror for contentid=3
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function ERROR_CHK
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function REGISTER_MIRRORS
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function CREATE_SEGMENT
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Commencing parallel build of mirror segment instances
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PARALLEL_SETUP
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Spawning parallel processes    batch [1], please wait...
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PARALLEL_SETUP
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PARALLEL_COUNT
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PARALLEL_COUNT
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PARALLEL_COUNT
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PARALLEL_COUNT
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PARALLEL_COUNT
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PARALLEL_COUNT
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PARALLEL_COUNT
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PARALLEL_WAIT
20200504:18:29:09:015736 gpinitsystem:node1:gpadmin-[INFO]:-Waiting for parallel processes batch [1], please wait...
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PARALLEL_WAIT
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PARALLEL_COUNT
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function PARALLEL_SUMMARY_STATUS_REPORT
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:------------------------------------------------
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-Parallel process exit status
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:------------------------------------------------
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-Total processes marked as completed           = 4
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-Total processes marked as killed              = 0
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-Total processes marked as failed              = 0
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:------------------------------------------------
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function PARALLEL_SUMMARY_STATUS_REPORT
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function CREATE_SEGMENT
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function FORCE_FTS_PROBE
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function FORCE_FTS_PROBE
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-Start Function SCAN_LOG
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-Scanning utility log file for any warning messages
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-Log file scan check passed
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Function SCAN_LOG
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-Greenplum Database instance successfully created
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-------------------------------------------------------
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-To complete the environment configuration, please 
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-update gpadmin .bashrc file with the following
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-1. Ensure that the greenplum_path.sh file is sourced
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-2. Add "export MASTER_DATA_DIRECTORY=/data/greenplum/data/master/gpseg-1"
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-   to access the Greenplum scripts for this instance:
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-   or, use -d /data/greenplum/data/master/gpseg-1 option for the Greenplum scripts
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-   Example gpstate -d /data/greenplum/data/master/gpseg-1
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-Script log file = /home/gpadmin/gpAdminLogs/gpinitsystem_20200504.log
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-To remove instance, run gpdeletesystem utility
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-To initialize a Standby Master Segment for this Greenplum instance
20200504:18:29:15:015736 gpinitsystem:node1:gpadmin-[INFO]:-Review options for gpinitstandby
20200504:18:29:16:015736 gpinitsystem:node1:gpadmin-[INFO]:-------------------------------------------------------
20200504:18:29:16:015736 gpinitsystem:node1:gpadmin-[INFO]:-The Master /data/greenplum/data/master/gpseg-1/pg_hba.conf post gpinitsystem
20200504:18:29:16:015736 gpinitsystem:node1:gpadmin-[INFO]:-has been configured to allow all hosts within this new
20200504:18:29:16:015736 gpinitsystem:node1:gpadmin-[INFO]:-array to intercommunicate. Any hosts external to this
20200504:18:29:16:015736 gpinitsystem:node1:gpadmin-[INFO]:-new array must be explicitly added to this file
20200504:18:29:16:015736 gpinitsystem:node1:gpadmin-[INFO]:-Refer to the Greenplum Admin support guide which is
20200504:18:29:16:015736 gpinitsystem:node1:gpadmin-[INFO]:-located in the /usr/local/greenplum-db/./docs directory
20200504:18:29:16:015736 gpinitsystem:node1:gpadmin-[INFO]:-------------------------------------------------------
20200504:18:29:16:015736 gpinitsystem:node1:gpadmin-[INFO]:-End Main

3.9.2 Add a standby master

gpinitstandby -s node2
20200504:19:23:41:023459 gpinitstandby:node1:gpadmin-[INFO]:-Validating environment and parameters for standby initialization...
20200504:19:23:41:023459 gpinitstandby:node1:gpadmin-[INFO]:-Checking for data directory /data/greenplum/data/master/gpseg-1 on node2
20200504:19:23:42:023459 gpinitstandby:node1:gpadmin-[INFO]:------------------------------------------------------
20200504:19:23:42:023459 gpinitstandby:node1:gpadmin-[INFO]:-Greenplum standby master initialization parameters
20200504:19:23:42:023459 gpinitstandby:node1:gpadmin-[INFO]:------------------------------------------------------
20200504:19:23:42:023459 gpinitstandby:node1:gpadmin-[INFO]:-Greenplum master hostname               = node1
20200504:19:23:42:023459 gpinitstandby:node1:gpadmin-[INFO]:-Greenplum master data directory         = /data/greenplum/data/master/gpseg-1
20200504:19:23:42:023459 gpinitstandby:node1:gpadmin-[INFO]:-Greenplum master port                   = 5432
20200504:19:23:42:023459 gpinitstandby:node1:gpadmin-[INFO]:-Greenplum standby master hostname       = node2
20200504:19:23:42:023459 gpinitstandby:node1:gpadmin-[INFO]:-Greenplum standby master port           = 5432
20200504:19:23:42:023459 gpinitstandby:node1:gpadmin-[INFO]:-Greenplum standby master data directory = /data/greenplum/data/master/gpseg-1
20200504:19:23:42:023459 gpinitstandby:node1:gpadmin-[INFO]:-Greenplum update system catalog         = On
Do you want to continue with standby master initialization? Yy|Nn (default=N):
> y
20200504:19:23:43:023459 gpinitstandby:node1:gpadmin-[INFO]:-Syncing Greenplum Database extensions to standby
20200504:19:23:44:023459 gpinitstandby:node1:gpadmin-[INFO]:-The packages on node2 are consistent.
20200504:19:23:44:023459 gpinitstandby:node1:gpadmin-[INFO]:-Adding standby master to catalog...
20200504:19:23:44:023459 gpinitstandby:node1:gpadmin-[INFO]:-Database catalog updated successfully.
20200504:19:23:44:023459 gpinitstandby:node1:gpadmin-[INFO]:-Updating pg_hba.conf file...
20200504:19:23:45:023459 gpinitstandby:node1:gpadmin-[INFO]:-pg_hba.conf files updated successfully.
20200504:19:23:46:023459 gpinitstandby:node1:gpadmin-[INFO]:-Starting standby master
20200504:19:23:46:023459 gpinitstandby:node1:gpadmin-[INFO]:-Checking if standby master is running on host: node2  in directory: /data/greenplum/data/master/gpseg-1
20200504:19:23:48:023459 gpinitstandby:node1:gpadmin-[INFO]:-Cleaning up pg_hba.conf backup files...
20200504:19:23:49:023459 gpinitstandby:node1:gpadmin-[INFO]:-Backup files of pg_hba.conf cleaned up successfully.
20200504:19:23:49:023459 gpinitstandby:node1:gpadmin-[INFO]:-Successfully created standby master on node2

#Check the install log and adjust according to any warnings or errors
cat /home/gpadmin/gpAdminLogs/gpinitsystem_20200504.log | grep -E -i 'WARN|ERROR'
#Add the environment variables suggested by the log
#PGPORT, PGUSER and PGDATABASE need to be added as well
cat >> /home/gpadmin/.bash_profile << EOF
export MASTER_DATA_DIRECTORY=/data/greenplum/data/master/gpseg-1
export PGPORT=5432
export PGUSER=gpadmin
export PGDATABASE=gpdw
EOF

cat >> /home/gpadmin/.bashrc << EOF
export MASTER_DATA_DIRECTORY=/data/greenplum/data/master/gpseg-1
export PGPORT=5432
export PGUSER=gpadmin
export PGDATABASE=gpdw
EOF
#Sync the environment variables to all hosts
gpscp -f /usr/local/greenplum-db/seg_host /home/gpadmin/.bash_profile gpadmin@=:/home/gpadmin/.bash_profile
gpscp -f /usr/local/greenplum-db/seg_host /home/gpadmin/.bashrc gpadmin@=:/home/gpadmin/.bashrc
gpssh -f /usr/local/greenplum-db/all_host -e 'source /home/gpadmin/.bash_profile;source /home/gpadmin/.bashrc;'
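To confirm the synced variables actually took effect on every host, a quick check over gpssh can help. This is a minimal sketch: the host-file path follows the ones used above, and the block skips itself where the Greenplum utilities are not installed.

```shell
# Sketch: verify the environment variables on all hosts after syncing.
# Assumes the all_host file created earlier; skips when run off-cluster.
GP_HOSTFILE=/usr/local/greenplum-db/all_host
if command -v gpssh >/dev/null 2>&1 && [ -f "$GP_HOSTFILE" ]; then
    # Every host should report the same MASTER_DATA_DIRECTORY and PGPORT.
    gpssh -f "$GP_HOSTFILE" -e 'source /home/gpadmin/.bashrc; echo $MASTER_DATA_DIRECTORY:$PGPORT'
    ENV_CHECK=ran
else
    echo "Greenplum utilities not found; run this on the master host."
    ENV_CHECK=skipped
fi
```

If any host prints an empty value, re-run the gpscp commands above for that host before continuing.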

3.9.3 Delete and reinstall

#Delete the cluster so it can be reinitialized
gpdeletesystem -d /data/greenplum/data/master/gpseg-1 -f
-d takes MASTER_DATA_DIRECTORY (the master data directory); this removes the data directories of the master and all segments.
-f force: terminate all processes and force the deletion.
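A full reinstall therefore amounts to deleting the old instance and re-running gpinitsystem. A hedged sketch follows; the config file name `gpinitsystem_config` is a placeholder for whatever file was used at first install, and the block skips itself off-cluster.

```shell
# Sketch: wipe the cluster and reinitialize it from the original config.
# gpinitsystem_config is a placeholder name, not taken from this document.
if command -v gpdeletesystem >/dev/null 2>&1; then
    # -d: master data directory; removes master and all segment data dirs.
    # -f: terminate all processes and force the deletion.
    gpdeletesystem -d /data/greenplum/data/master/gpseg-1 -f
    # Re-run initialization with the same cluster configuration file.
    gpinitsystem -c /home/gpadmin/gpinitsystem_config
    REINSTALL=ran
else
    echo "gpdeletesystem not found; run this on the master host."
    REINSTALL=skipped
fi
```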

3.10 Post-installation configuration

3.10.1 Configure pg_hba.conf

/data/greenplum/data/master/gpseg-1/pg_hba.conf
local    all         gpadmin         ident
host     all         gpadmin         127.0.0.1/28    trust
host     all         gpadmin         192.168.188.87/32       trust
host     all         gpadmin         ::1/128       trust
host     all         gpadmin         fe80::3157:3eb8:c80e:8f40/128       trust
#Allow password (md5) login from any IP
host     all         gpadmin         0.0.0.0/0   md5
local    replication gpadmin         ident
host     replication gpadmin         samehost       trust
host     replication gpadmin         192.168.188.87/32       trust

3.10.2 Configure postgresql.conf

/data/greenplum/data/master/gpseg-1/postgresql.conf
#Listen on all addresses
listen_addresses = '*'
#Reload the configuration files without restarting the database
gpstop -u
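After the reload, the settings can be confirmed live. A minimal sketch, assuming a password has already been set for gpadmin so that the md5 rule applies; it skips itself where the Greenplum utilities are not installed.

```shell
# Sketch: confirm listen_addresses is active after gpstop -u.
if command -v gpconfig >/dev/null 2>&1; then
    # gpconfig -s reads the live value from the running database.
    gpconfig -s listen_addresses
    # A remote md5 login should now work from any client IP.
    psql -h 192.168.188.87 -p 5432 -U gpadmin -d postgres -c 'select 1'
    NET_CHECK=ran
else
    echo "gpconfig not found; run this on the master host."
    NET_CHECK=skipped
fi
```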

3.10.3 Log in to the database

#Log in to the master node
PGOPTIONS='-c gp_session_role=utility' psql -h node1 -p5432 -d postgres

#Log in to a segment directly; utility mode and the segment port are required
PGOPTIONS='-c gp_session_role=utility' psql -h node2 -p6000 -d postgres

postgres=# select * from gp_segment_configuration;
 dbid | content | role | preferred_role | mode | status | port | hostname | address |               datadir                
------+---------+------+----------------+------+--------+------+----------+---------+--------------------------------------
    1 |      -1 | p    | p              | n    | u      | 5432 | node1    | node1   | /data/greenplum/data/master/gpseg-1
    2 |       0 | p    | p              | s    | u      | 6000 | node2    | node2   | /data/greenplum/data1/primary/gpseg0
    6 |       0 | m    | m              | s    | u      | 7000 | node3    | node3   | /data/greenplum/data1/mirror/gpseg0
    4 |       2 | p    | p              | s    | u      | 6000 | node3    | node3   | /data/greenplum/data1/primary/gpseg2
    8 |       2 | m    | m              | s    | u      | 7000 | node2    | node2   | /data/greenplum/data1/mirror/gpseg2
    3 |       1 | p    | p              | s    | u      | 6001 | node2    | node2   | /data/greenplum/data2/primary/gpseg1
    7 |       1 | m    | m              | s    | u      | 7001 | node3    | node3   | /data/greenplum/data2/mirror/gpseg1
    5 |       3 | p    | p              | s    | u      | 6001 | node3    | node3   | /data/greenplum/data2/primary/gpseg3
    9 |       3 | m    | m              | s    | u      | 7001 | node2    | node2   | /data/greenplum/data2/mirror/gpseg3
   10 |      -1 | m    | m              | s    | u      | 5432 | node2    | node2   | /data/greenplum/data/master/gpseg-1
(10 rows)
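The same catalog doubles as a quick health check, for example to list any segments that are down or not yet synchronized. A sketch, run from the master (it skips itself where the Greenplum utilities are not installed):

```shell
# Sketch: report segments whose status is not up ('u') or mode not synced ('s').
# An empty result means every primary and mirror is healthy.
if command -v gpstate >/dev/null 2>&1; then
    psql -d postgres -c "select dbid, content, role, hostname, port, datadir
                         from gp_segment_configuration
                         where status <> 'u' or mode <> 's';"
    SEG_CHECK=ran
else
    echo "gpstate not found; run this on the master host."
    SEG_CHECK=skipped
fi
```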

四、Appendix

4.1 Common commands

gpstate -b => show a brief status summary
gpstate -c => show the primary-mirror mapping
gpstate -d => specify the data directory (default: $MASTER_DATA_DIRECTORY)
gpstate -e => show segments with mirror status problems
gpstate -f => show standby master details
gpstate -i => show the Greenplum Database version
gpstate -m => show mirror instance synchronization status
gpstate -p => show the ports in use
gpstate -Q => quick check of segment status
gpstate -s => show cluster details
gpstate -v => verbose output
gpconfig -c => --change param_name  Change a configuration parameter by appending a new setting at the bottom of the postgresql.conf files.
gpconfig -v => --value value  The value for the parameter given with -c. By default the value is applied to all segments and their mirrors, the master, and the standby master.
gpconfig -m => --mastervalue master_value  The master value for the parameter given with -c. If specified, the value applies only to the master and standby master. Can only be used together with -v.
gpconfig --masteronly => When specified, gpconfig edits only the master's postgresql.conf file.
gpconfig -r => --remove param_name  Remove a configuration parameter by commenting out its entry in postgresql.conf.
gpconfig -l => --list  List all configuration parameters supported by gpconfig.
gpconfig -s => --show param_name  Show the value of a configuration parameter used on all instances (master and segments). If the values differ between instances, the tool prints an error message. With -s, gpconfig reads the value directly from the database rather than from postgresql.conf, so after setting a parameter with gpconfig you may still see the previous (old) value: you must reload the configuration files (gpstop -u) or restart the system (gpstop -r) for the change to take effect.
gpconfig --file => Show the value from the postgresql.conf file on all instances (master and segments). If the values differ between instances, the tool prints a message. Must be used together with -s.
gpconfig --file-compare => Compare the current in-database Greenplum value with the value in postgresql.conf on all hosts (master and segments).
gpconfig --skipvalidation => Skip gpconfig's system validation checks and allow operating on any server configuration parameter, including hidden parameters and restricted parameters that gpconfig otherwise cannot change. Together with -l it lists the restricted parameters. Warning: be extremely careful when setting parameters with this option.
gpconfig --verbose => Show additional log information while gpconfig runs.
gpconfig --debug => Set the log output level to debug.
gpconfig -? | -h | --help => Show the online help.
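Putting the options above together, a typical parameter change is: set it with -c/-v, reload, then verify with -s. A sketch, using `log_statement` purely as an illustrative (reloadable) parameter; it skips itself where gpconfig is not installed.

```shell
# Sketch: change a parameter cluster-wide and verify the live value.
if command -v gpconfig >/dev/null 2>&1; then
    # Append the setting to every postgresql.conf in the cluster.
    gpconfig -c log_statement -v ddl
    # Reload the config files; some parameters need a full restart (gpstop -r).
    gpstop -u
    # -s reads the live value from the database on every instance.
    gpconfig -s log_statement
    CFG_CHECK=ran
else
    echo "gpconfig not found; run this on the master host."
    CFG_CHECK=skipped
fi
```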

gpstart -a => start without a confirmation prompt
gpstart -d => specify the data directory (default: $MASTER_DATA_DIRECTORY)
gpstart -q => run in quiet mode; output is not shown on screen but is still written to the log file
gpstart -m => start only the master, in maintenance mode, for catalog maintenance. Example: $ PGOPTIONS='-c gp_session_role=utility' psql postgres
gpstart -R => start in restricted mode (administrator connections only)
gpstart -v => verbose startup output

gpstop -a => stop without a confirmation prompt
gpstop -d => specify the data directory (default: $MASTER_DATA_DIRECTORY)
gpstop -m => stop the master only (maintenance mode)
gpstop -q => run in quiet mode; output is not shown on screen but is still written to the log file
gpstop -r => stop all instances, then restart the system
gpstop -u => reload the postgresql.conf and pg_hba.conf configuration files
gpstop -v => verbose shutdown output
gpstop -M fast      => fast shutdown; any transactions in progress are interrupted and rolled back
gpstop -M immediate => immediate shutdown; any transactions in progress are aborted. This mode is not recommended and in some cases can cause database corruption requiring manual recovery.
gpstop -M smart     => smart shutdown; if active connections exist, the command fails with a warning. This is the default shutdown mode.
gpstop --host hostname => stop the segments on a host; cannot be combined with -m, -r, -u, or -y

gprecoverseg -a => recover without a confirmation prompt
gprecoverseg -i => specify the recovery file
gprecoverseg -d => specify the data directory
gprecoverseg -l => specify the log file
gprecoverseg -r => rebalance segments back to their preferred roles
gprecoverseg -s => specify the filespace configuration file
gprecoverseg -o => specify the recovery configuration file to generate
gprecoverseg -p => specify additional spare hosts
gprecoverseg -S => specify the output filespace configuration file

gpactivatestandby -d <dir> => absolute path of the data directory (default: $MASTER_DATA_DIRECTORY)
gpactivatestandby -f => force activation of the standby master
gpactivatestandby -v => show version information

gpinitstandby -s <host> => specify the new standby host
gpinitstandby -D => debug mode
gpinitstandby -r => remove the standby master
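Together these give the manual failover procedure: if the master on node1 fails, activate the standby on node2, and once node1 is repaired, re-add it as the new standby. A sketch of the sequence (the promotion command must be run on node2; the block skips itself where the utilities are not installed):

```shell
# Sketch: promote the standby after a master failure, then re-protect.
if command -v gpactivatestandby >/dev/null 2>&1; then
    # On node2: promote the standby master (its data dir mirrors the master's).
    gpactivatestandby -d /data/greenplum/data/master/gpseg-1
    # Later, once node1 is repaired, make it the new standby master.
    gpinitstandby -s node1
    FAILOVER=ran
else
    echo "gpactivatestandby not found; run this on the standby host."
    FAILOVER=skipped
fi
```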

References:
https://blog.csdn.net/zutsoft/article/details/103645796
https://www.cnblogs.com/pl-boke/p/9852383.html
