Fixing the 512 error (I think it was 512; in any case it started with a 5)
When you run a MapReduce Python program written with mrjob on Hadoop, you may hit a 512 error. The cause is simple: YARN (Hadoop's resource manager and job scheduler) is not running. Configure the relevant XML files (any of the setup tutorials online will do) and start YARN, and the 512 error goes away. (These odd-looking 512/256 codes appear to be raw wait statuses, i.e. the child's exit code multiplied by 256, so 512 would correspond to exit code 2 and 256 to exit code 1.) My own configuration and startup steps follow, for reference:
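For context, the kind of program in question is an mrjob script along these lines — a minimal word-count sketch. My actual test3.py is not reproduced in this post, so treat the class name and logic here as hypothetical stand-ins:

# A hypothetical minimal mrjob job, the general shape of scripts like test3.py below.
from mrjob.job import MRJob

class MRWordCount(MRJob):

    def mapper(self, _, line):
        # Emit (word, 1) for each whitespace-separated word in a line.
        for word in line.split():
            yield word, 1

    def reducer(self, word, counts):
        # Sum the 1s emitted for each word across all mappers.
        yield word, sum(counts)

if __name__ == '__main__':
    MRWordCount.run()

A script like this runs locally with python test3.py input.txt, and on the cluster with the -r hadoop flag, as in the command near the end of this post.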
1. mapred-site.xml
It lives under /usr/local/hadoop/etc/hadoop; configure it as follows:
<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
</configuration>
2. yarn-site.xml
Also under /usr/local/hadoop/etc/hadoop; configure it as follows:
<configuration>
  <!-- Site specific YARN configuration properties -->
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
  <property>
    <description>Whether virtual memory limits will be enforced for containers.</description>
    <name>yarn.nodemanager.vmem-check-enabled</name>
    <value>false</value>
  </property>
</configuration>
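Disabling the virtual-memory check outright, as above, is the blunt instrument. If you would rather keep the check, one alternative (which I have not tested here) is to relax the virtual-to-physical memory ratio instead, by adding a property like this inside the same <configuration> block — yarn.nodemanager.vmem-pmem-ratio defaults to 2.1, and the value 4 below is just an example:

<property>
  <!-- Allow containers up to 4x their physical memory allocation in virtual memory. -->
  <name>yarn.nodemanager.vmem-pmem-ratio</name>
  <value>4</value>
</property>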
3. Start YARN after configuring (Hadoop itself must already be started)
[hadoop@localhost ~]$ /usr/local/hadoop/sbin/start-yarn.sh
Optionally, also start the JobHistory server:
[hadoop@localhost ~]$ /usr/local/hadoop/sbin/mr-jobhistory-daemon.sh start historyserver
You can then check that each Hadoop daemon is up with the jps command:
[hadoop@localhost ~]$ jps
19060 DataNode
18965 NameNode
19654 NodeManager
19240 SecondaryNameNode
19380 ResourceManager
25702 Jps
6308 JobHistoryServer
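Besides jps, you can confirm that the ResourceManager is actually serving requests by querying its REST API on the default web port 8088 (the same port as the job-tracking URL that shows up in the run log later in this post). This should return a JSON description of the cluster, including its state:

[hadoop@localhost ~]$ curl -s http://localhost:8088/ws/v1/cluster/info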
Then re-run the original command; the 512 error should be gone.
The 256 error
For my 256 error, the output was as follows:
Scanning logs for probable cause of failure...
Looking for history log in hdfs:///tmp/hadoop-yarn/staging...
Parsing history log: hdfs:///tmp/hadoop-yarn/staging/history/done_intermediate/hadoop/job_1619235102568_0003-1619236708570-hadoop-streamjob5490351329862187422.jar-1619236845112-0-0-FAILED-default-1619236721763.jhist
Looking for task syslogs in /usr/local/hadoop/logs/userlogs/application_1619235102568_0003...
Parsing task log: /usr/local/hadoop/logs/userlogs/application_1619235102568_0003/container_1619235102568_0003_01_000011/stderr
Parsing task log: /usr/local/hadoop/logs/userlogs/application_1619235102568_0003/container_1619235102568_0003_01_000011/syslog
Probable cause of failure:
PipeMapRed failed!
java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 127
at org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:322)
at org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:535)
at org.apache.hadoop.streaming.PipeMapper.close(PipeMapper.java:130)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:61)
at org.apache.hadoop.streaming.PipeMapRunner.run(PipeMapRunner.java:34)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:450)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:421)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
(from lines 34-47 of /usr/local/hadoop/logs/userlogs/application_1619235102568_0003/container_1619235102568_0003_01_000011/syslog)
caused by:
+ python3 -c 'import fcntl; fcntl.flock(9, fcntl.LOCK_EX)'
setup-wrapper.sh: line 6: python3: command not found
(from lines 3-4 of /usr/local/hadoop/logs/userlogs/application_1619235102568_0003/container_1619235102568_0003_01_000011/stderr)
while reading input from lines 8-14 of hdfs://localhost:9000/user/hadoop/text
Step 1 of 1 failed: Command '['/usr/local/hadoop/bin/hadoop', 'jar', '/usr/local/hadoop/share/hadoop/tools/lib/hadoop-streaming-2.6.0.jar', '-files', 'hdfs:///user/hadoop/tmp/mrjob/test3.hadoop.20210424.035749.843284/files/wd/mrjob.zip#mrjob.zip,hdfs:///user/hadoop/tmp/mrjob/test3.hadoop.20210424.035749.843284/files/wd/setup-wrapper.sh#setup-wrapper.sh,hdfs:///user/hadoop/tmp/mrjob/test3.hadoop.20210424.035749.843284/files/wd/test3.py#test3.py', '-input', 'hdfs:///user/hadoop/text', '-output', 'hdfs:///user/hadoop/output', '-mapper', '/bin/sh -ex setup-wrapper.sh python3 test3.py --step-num=0 --mapper', '-reducer', '/bin/sh -ex setup-wrapper.sh python3 test3.py --step-num=0 --reducer']' returned non-zero exit status 256.
As you can see from mrjob's automatic scan of the logs, the cause of my 256 error was:
caused by:
+ python3 -c 'import fcntl; fcntl.flock(9, fcntl.LOCK_EX)'
setup-wrapper.sh: line 6: python3: command not found
I then ran python3 directly from the shell and got the same "command not found":
[hadoop@localhost workspace]$ python3
-bash: python3: command not found
So the problem was that no python3 executable was on the PATH: I had set up a link for python earlier, but never one for python3. The fix is to add the symlink:
[hadoop@localhost bin]$ sudo ln -s /usr/local/python3/bin/python3.6 /usr/bin/python3
Adjust the link's source to wherever python3 is actually installed on your system. To check that the link works, run the command below; if the Python version banner appears, it is set up correctly. (Note that python3 -v launches the interpreter with verbose import tracing, which is why the output below is so long; python3 --version would print just the version string. An mrjob-side alternative to the symlink is sketched after this output.)
[hadoop@localhost ~]$ python3 -v
import _frozen_importlib # frozen
import _imp # builtin
import sys # builtin
import '_warnings' # <class '_frozen_importlib.BuiltinImporter'>
import '_thread' # <class '_frozen_importlib.BuiltinImporter'>
import '_weakref' # <class '_frozen_importlib.BuiltinImporter'>
import '_frozen_importlib_external' # <class '_frozen_importlib.FrozenImporter'>
import '_io' # <class '_frozen_importlib.BuiltinImporter'>
import 'marshal' # <class '_frozen_importlib.BuiltinImporter'>
import 'posix' # <class '_frozen_importlib.BuiltinImporter'>
import _thread # previously loaded ('_thread')
import '_thread' # <class '_frozen_importlib.BuiltinImporter'>
import _weakref # previously loaded ('_weakref')
import '_weakref' # <class '_frozen_importlib.BuiltinImporter'>
# installing zipimport hook
import 'zipimport' # <class '_frozen_importlib.BuiltinImporter'>
# installed zipimport hook
# /usr/local/python3/lib/python3.6/encodings/__pycache__/__init__.cpython-36.pyc matches /usr/local/python3/lib/python3.6/encodings/__init__.py
# code object from '/usr/local/python3/lib/python3.6/encodings/__pycache__/__init__.cpython-36.pyc'
# /usr/local/python3/lib/python3.6/__pycache__/codecs.cpython-36.pyc matches /usr/local/python3/lib/python3.6/codecs.py
# code object from '/usr/local/python3/lib/python3.6/__pycache__/codecs.cpython-36.pyc'
import '_codecs' # <class '_frozen_importlib.BuiltinImporter'>
import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8bf264d048>
# /usr/local/python3/lib/python3.6/encodings/__pycache__/aliases.cpython-36.pyc matches /usr/local/python3/lib/python3.6/encodings/aliases.py
# code object from '/usr/local/python3/lib/python3.6/encodings/__pycache__/aliases.cpython-36.pyc'
import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8bf265a9e8>
import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8bf26c2b70>
# /usr/local/python3/lib/python3.6/encodings/__pycache__/utf_8.cpython-36.pyc matches /usr/local/python3/lib/python3.6/encodings/utf_8.py
# code object from '/usr/local/python3/lib/python3.6/encodings/__pycache__/utf_8.cpython-36.pyc'
import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8bf26687f0>
import '_signal' # <class '_frozen_importlib.BuiltinImporter'>
# /usr/local/python3/lib/python3.6/encodings/__pycache__/latin_1.cpython-36.pyc matches /usr/local/python3/lib/python3.6/encodings/latin_1.py
# code object from '/usr/local/python3/lib/python3.6/encodings/__pycache__/latin_1.cpython-36.pyc'
import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8bf266e320>
# /usr/local/python3/lib/python3.6/__pycache__/io.cpython-36.pyc matches /usr/local/python3/lib/python3.6/io.py
# code object from '/usr/local/python3/lib/python3.6/__pycache__/io.cpython-36.pyc'
# /usr/local/python3/lib/python3.6/__pycache__/abc.cpython-36.pyc matches /usr/local/python3/lib/python3.6/abc.py
# code object from '/usr/local/python3/lib/python3.6/__pycache__/abc.cpython-36.pyc'
# /usr/local/python3/lib/python3.6/__pycache__/_weakrefset.cpython-36.pyc matches /usr/local/python3/lib/python3.6/_weakrefset.py
# code object from '/usr/local/python3/lib/python3.6/__pycache__/_weakrefset.cpython-36.pyc'
import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8bf26742b0>
import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8bf266e908>
import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8bf266e550>
# /usr/local/python3/lib/python3.6/__pycache__/site.cpython-36.pyc matches /usr/local/python3/lib/python3.6/site.py
# code object from '/usr/local/python3/lib/python3.6/__pycache__/site.cpython-36.pyc'
# /usr/local/python3/lib/python3.6/__pycache__/os.cpython-36.pyc matches /usr/local/python3/lib/python3.6/os.py
# code object from '/usr/local/python3/lib/python3.6/__pycache__/os.cpython-36.pyc'
import 'errno' # <class '_frozen_importlib.BuiltinImporter'>
# /usr/local/python3/lib/python3.6/__pycache__/stat.cpython-36.pyc matches /usr/local/python3/lib/python3.6/stat.py
# code object from '/usr/local/python3/lib/python3.6/__pycache__/stat.cpython-36.pyc'
import '_stat' # <class '_frozen_importlib.BuiltinImporter'>
import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8bf26185f8>
# /usr/local/python3/lib/python3.6/__pycache__/posixpath.cpython-36.pyc matches /usr/local/python3/lib/python3.6/posixpath.py
# code object from '/usr/local/python3/lib/python3.6/__pycache__/posixpath.cpython-36.pyc'
# /usr/local/python3/lib/python3.6/__pycache__/genericpath.cpython-36.pyc matches /usr/local/python3/lib/python3.6/genericpath.py
# code object from '/usr/local/python3/lib/python3.6/__pycache__/genericpath.cpython-36.pyc'
import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8bf261cfd0>
import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8bf2618cc0>
# /usr/local/python3/lib/python3.6/__pycache__/_collections_abc.cpython-36.pyc matches /usr/local/python3/lib/python3.6/_collections_abc.py
# code object from '/usr/local/python3/lib/python3.6/__pycache__/_collections_abc.cpython-36.pyc'
import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8bf2622668>
import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8bf2686ef0>
# /usr/local/python3/lib/python3.6/__pycache__/_sitebuiltins.cpython-36.pyc matches /usr/local/python3/lib/python3.6/_sitebuiltins.py
# code object from '/usr/local/python3/lib/python3.6/__pycache__/_sitebuiltins.cpython-36.pyc'
import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8bf26092e8>
# /usr/local/python3/lib/python3.6/__pycache__/sysconfig.cpython-36.pyc matches /usr/local/python3/lib/python3.6/sysconfig.py
# code object from '/usr/local/python3/lib/python3.6/__pycache__/sysconfig.cpython-36.pyc'
import 'sysconfig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8bf25d8ac8>
# /usr/local/python3/lib/python3.6/__pycache__/_sysconfigdata_m_linux_x86_64-linux-gnu.cpython-36.pyc matches /usr/local/python3/lib/python3.6/_sysconfigdata_m_linux_x86_64-linux-gnu.py
# code object from '/usr/local/python3/lib/python3.6/__pycache__/_sysconfigdata_m_linux_x86_64-linux-gnu.cpython-36.pyc'
import '_sysconfigdata_m_linux_x86_64-linux-gnu' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8bf25ea278>
import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8bf2679be0>
Python 3.6.8 (default, Apr 23 2021, 14:01:07)
[GCC 4.8.5 20150623 (Red Hat 4.8.5-44)] on linux
Type "help", "copyright", "credits" or "license" for more information.
# extension module 'readline' loaded from '/usr/local/python3/lib/python3.6/lib-dynload/readline.cpython-36m-x86_64-linux-gnu.so'
# extension module 'readline' executed from '/usr/local/python3/lib/python3.6/lib-dynload/readline.cpython-36m-x86_64-linux-gnu.so'
import 'readline' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f8bf25f95c0>
import 'atexit' # <class '_frozen_importlib.BuiltinImporter'>
# /usr/local/python3/lib/python3.6/__pycache__/rlcompleter.cpython-36.pyc matches /usr/local/python3/lib/python3.6/rlcompleter.py
# code object from '/usr/local/python3/lib/python3.6/__pycache__/rlcompleter.cpython-36.pyc'
import 'rlcompleter' # <_frozen_importlib_external.SourceFileLoader object at 0x7f8bf25f96d8>
>>>
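Incidentally, the system-wide symlink is not the only possible fix: mrjob can be told explicitly which interpreter to invoke via its python_bin option, e.g. in ~/.mrjob.conf. I did not test this route, but it should remove the dependency on a bare python3 being on the PATH (the path you give must still exist on every task node):

# ~/.mrjob.conf -- untested alternative to the symlink: tell mrjob which
# interpreter to invoke when running jobs with the hadoop runner.
runners:
  hadoop:
    python_bin: /usr/local/python3/bin/python3.6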
Finally, I re-ran the earlier command; this time it completed successfully with no 256 error.
(PS: a 256 error can have many different causes. Analyze your own error log concretely; your failure may not be the same as mine.)
[hadoop@localhost workspace]$ python ./test3.py -r hadoop hdfs:///user/hadoop/text -o hdfs:///user/hadoop/output
No configs found; falling back on auto-configuration
No configs specified for hadoop runner
Looking for hadoop binary in /usr/local/hadoop/bin...
Found hadoop binary: /usr/local/hadoop/bin/hadoop
Using Hadoop version 2.6.0
Looking for Hadoop streaming jar in /usr/local/hadoop...
Found Hadoop streaming jar: /usr/local/hadoop/share/hadoop/tools/lib/hadoop-streaming-2.6.0.jar
Creating temp directory /tmp/test3.hadoop.20210424.041111.606248
uploading working dir files to hdfs:///user/hadoop/tmp/mrjob/test3.hadoop.20210424.041111.606248/files/wd...
Copying other local files to hdfs:///user/hadoop/tmp/mrjob/test3.hadoop.20210424.041111.606248/files/
Running step 1 of 1...
packageJobJar: [/tmp/hadoop-unjar7048697663376485730/] [] /tmp/streamjob3267295581038199778.jar tmpDir=null
Connecting to ResourceManager at /0.0.0.0:8032
Connecting to ResourceManager at /0.0.0.0:8032
Total input paths to process : 1
number of splits:2
Submitting tokens for job: job_1619235102568_0004
Submitted application application_1619235102568_0004
The url to track the job: http://localhost:8088/proxy/application_1619235102568_0004/
Running job: job_1619235102568_0004
Job job_1619235102568_0004 running in uber mode : false
map 0% reduce 0%
map 33% reduce 0%
map 67% reduce 0%
map 83% reduce 0%
map 100% reduce 0%
map 100% reduce 100%
Job job_1619235102568_0004 completed successfully
Output directory: hdfs:///user/hadoop/output
Counters: 49
File Input Format Counters
Bytes Read=21
File Output Format Counters
Bytes Written=24
File System Counters
FILE: Number of bytes read=62
FILE: Number of bytes written=328656
FILE: Number of large read operations=0
FILE: Number of read operations=0
FILE: Number of write operations=0
HDFS: Number of bytes read=201
HDFS: Number of bytes written=24
HDFS: Number of large read operations=0
HDFS: Number of read operations=9
HDFS: Number of write operations=2
Job Counters
Data-local map tasks=2
Launched map tasks=2
Launched reduce tasks=1
Total megabyte-seconds taken by all map tasks=127718400
Total megabyte-seconds taken by all reduce tasks=18784256
Total time spent by all map tasks (ms)=124725
Total time spent by all maps in occupied slots (ms)=124725
Total time spent by all reduce tasks (ms)=18344
Total time spent by all reduces in occupied slots (ms)=18344
Total vcore-seconds taken by all map tasks=124725
Total vcore-seconds taken by all reduce tasks=18344
Map-Reduce Framework
CPU time spent (ms)=6040
Combine input records=0
Combine output records=0
Failed Shuffles=0
GC time elapsed (ms)=11489
Input split bytes=180
Map input records=3
Map output bytes=42
Map output materialized bytes=68
Map output records=7
Merged Map outputs=2
Physical memory (bytes) snapshot=397221888
Reduce input groups=4
Reduce input records=7
Reduce output records=4
Reduce shuffle bytes=68
Shuffled Maps =2
Spilled Records=14
Total committed heap usage (bytes)=256843776
Virtual memory (bytes) snapshot=3073417216
Shuffle Errors
BAD_ID=0
CONNECTION=0
IO_ERROR=0
WRONG_LENGTH=0
WRONG_MAP=0
WRONG_REDUCE=0
job output is in hdfs:///user/hadoop/output
Removing HDFS temp directory hdfs:///user/hadoop/tmp/mrjob/test3.hadoop.20210424.041111.606248...
Removing temp directory /tmp/test3.hadoop.20210424.041111.606248...
[hadoop@localhost workspace]$
If you repost this, please credit the source. Thanks.