Big Data Platform Operations: Mahout

Big data series: operations (on a self-built big data platform)

(7) Mahout operations
  1. Install the Mahout client on the master node, then open a Linux shell and run the mahout command to list the example programs that ship with Mahout.
[root@master ~]# mahout
MAHOUT_LOCAL is not set; adding HADOOP_CONF_DIR to classpath.
Running on hadoop, using /usr/hadoop/hadoop-2.7.3/bin/hadoop and HADOOP_CONF_DIR=/usr/hadoop/hadoop-2.7.3/etc/hadoop/
MAHOUT-JOB: /usr/mahout/apache-mahout-distribution-0.11.0/mahout-examples-0.11.0-job.jar
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hadoop/hadoop-2.7.3/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hive/apache-hive-2.1.1-bin/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
An example program must be given as the first argument.
Valid program names are:
  arff.vector: : Generate Vectors from an ARFF file or directory
  baumwelch: : Baum-Welch algorithm for unsupervised HMM training
  buildforest: : Build the random forest classifier
  canopy: : Canopy clustering
  cat: : Print a file or resource as the logistic regression models would see it
  cleansvd: : Cleanup and verification of SVD output
  clusterdump: : Dump cluster output to text
  clusterpp: : Groups Clustering Output In Clusters
  cmdump: : Dump confusion matrix in HTML or text formats
  concatmatrices: : Concatenates 2 matrices of same cardinality into a single matrix
  cvb: : LDA via Collapsed Variation Bayes (0th deriv. approx)
  cvb0_local: : LDA via Collapsed Variation Bayes, in memory locally.
  describe: : Describe the fields and target variable in a data set
  evaluateFactorization: : compute RMSE and MAE of a rating matrix factorization against probes
  fkmeans: : Fuzzy K-means clustering
  hmmpredict: : Generate random sequence of observations by given HMM
  itemsimilarity: : Compute the item-item-similarities for item-based collaborative filtering
  kmeans: : K-means clustering
  lucene.vector: : Generate Vectors from a Lucene index
  lucene2seq: : Generate Text SequenceFiles from a Lucene index
  matrixdump: : Dump matrix in CSV format
  matrixmult: : Take the product of two matrices
  parallelALS: : ALS-WR factorization of a rating matrix
  qualcluster: : Runs clustering experiments and summarizes results in a CSV
  recommendfactorized: : Compute recommendations using the factorization of a rating matrix
  recommenditembased: : Compute recommendations using item-based collaborative filtering
  regexconverter: : Convert text files on a per line basis based on regular expressions
  resplit: : Splits a set of SequenceFiles into a number of equal splits
  rowid: : Map SequenceFile<Text,VectorWritable> to {SequenceFile<IntWritable,VectorWritable>, SequenceFile<IntWritable,Text>}
  rowsimilarity: : Compute the pairwise similarities of the rows of a matrix
  runAdaptiveLogistic: : Score new production data using a probably trained and validated AdaptivelogisticRegression model
  runlogistic: : Run a logistic regression model against CSV data
  seq2encoded: : Encoded Sparse Vector generation from Text sequence files
  seq2sparse: : Sparse Vector generation from Text sequence files
  seqdirectory: : Generate sequence files (of Text) from a directory
  seqdumper: : Generic Sequence File dumper
  seqmailarchives: : Creates SequenceFile from a directory containing gzipped mail archives
  seqwiki: : Wikipedia xml dump to sequence file
  spectralkmeans: : Spectral k-means clustering
  split: : Split Input data into test and train sets
  splitDataset: : split a rating dataset into training and probe parts
  ssvd: : Stochastic SVD
  streamingkmeans: : Streaming k-means clustering
  svd: : Lanczos Singular Value Decomposition
  testforest: : Test the random forest classifier
  testnb: : Test the Vector-based Bayes classifier
  trainAdaptiveLogistic: : Train an AdaptivelogisticRegression model
  trainlogistic: : Train a logistic regression using stochastic gradient descent
  trainnb: : Train the Vector-based Bayes classifier
  transpose: : Take the transpose of a matrix
  validateAdaptiveLogistic: : Validate an AdaptivelogisticRegression model against hold-out data set
  vecdist: : Compute the distances between a set of Vectors (or Cluster or Canopy, they must fit in memory) and a list of Vectors
  vectordump: : Dump vectors from a sequence file to text
  viterbi: : Viterbi decoding of hidden states from given output states sequence

  2. Use Mahout to convert the extracted contents of 20news-bydate.tar.gz into sequence files, save them to the /data/mahout/20news/output/20news-seq/ directory, and list that directory. Then view the sequence file contents with the -text command (the first 20 lines are enough).
[root@master ~]# mkdir 20news
[root@master ~]# tar -zxvf /root/tiku/Mahout/20news-bydate.tar.gz -C /root/20news/
[root@master ~]# hadoop fs -mkdir -p /data/mahout/20news/20news-all
[root@master ~]# hadoop fs -put /root/20news/* /data/mahout/20news/20news-all

The upload takes quite a while. A warning like the following may also appear:

20/04/02 13:50:15 WARN hdfs.DFSClient: DataStreamer Exception
java.nio.channels.ClosedByInterruptException

The output directory /data/mahout/20news/output/20news-seq does not need to be created on HDFS in advance.

[root@master ~]# mahout seqdirectory -i /data/mahout/20news/20news-all -o /data/mahout/20news/output/20news-seq/
20/04/02 13:58:55 INFO deprecation: mapred.input.dir is deprecated. Instead, use mapreduce.input.fileinputformat.inputdir
20/04/02 13:58:55 INFO deprecation: mapred.compress.map.output is deprecated. Instead, use mapreduce.map.output.compress
20/04/02 13:58:55 INFO deprecation: mapred.output.dir is deprecated. Instead, use mapreduce.output.fileoutputformat.outputdir
20/04/02 13:58:59 INFO RMProxy: Connecting to ResourceManager at master/192.168.100.160:18040
20/04/02 13:59:08 INFO FileInputFormat: Total input paths to process : 18846
20/04/02 13:59:09 INFO CombineFileInputFormat: DEBUG: Terminated node allocation with : CompletedNodes: 2, size left: 35855003
20/04/02 13:59:10 INFO JobSubmitter: number of splits:1
20/04/02 13:59:10 INFO JobSubmitter: Submitting tokens for job: job_1585797567201_0005
20/04/02 13:59:11 INFO YarnClientImpl: Submitted application application_1585797567201_0005
20/04/02 13:59:11 INFO Job: The url to track the job: http://master:18088/proxy/application_1585797567201_0005/
20/04/02 13:59:11 INFO Job: Running job: job_1585797567201_0005
20/04/02 13:59:39 INFO Job: Job job_1585797567201_0005 running in uber mode : false
20/04/02 13:59:39 INFO Job:  map 0% reduce 0%
20/04/02 14:00:03 INFO Job:  map 2% reduce 0%
20/04/02 14:00:06 INFO Job:  map 3% reduce 0%
20/04/02 14:00:09 INFO Job:  map 4% reduce 0%
20/04/02 14:00:12 INFO Job:  map 6% reduce 0%
20/04/02 14:00:15 INFO Job:  map 7% reduce 0%
20/04/02 14:00:18 INFO Job:  map 8% reduce 0%
20/04/02 14:00:21 INFO Job:  map 9% reduce 0%
20/04/02 14:00:24 INFO Job:  map 11% reduce 0%
20/04/02 14:00:27 INFO Job:  map 12% reduce 0%
20/04/02 14:00:30 INFO Job:  map 13% reduce 0%
20/04/02 14:00:34 INFO Job:  map 14% reduce 0%
20/04/02 14:00:37 INFO Job:  map 15% reduce 0%
20/04/02 14:00:40 INFO Job:  map 17% reduce 0%
20/04/02 14:00:43 INFO Job:  map 18% reduce 0%
20/04/02 14:00:46 INFO Job:  map 19% reduce 0%
20/04/02 14:00:49 INFO Job:  map 21% reduce 0%
20/04/02 14:00:52 INFO Job:  map 22% reduce 0%
20/04/02 14:00:55 INFO Job:  map 24% reduce 0%
20/04/02 14:00:58 INFO Job:  map 25% reduce 0%
20/04/02 14:01:01 INFO Job:  map 28% reduce 0%
20/04/02 14:01:04 INFO Job:  map 30% reduce 0%
20/04/02 14:01:07 INFO Job:  map 32% reduce 0%
20/04/02 14:01:10 INFO Job:  map 35% reduce 0%
20/04/02 14:01:13 INFO Job:  map 37% reduce 0%
20/04/02 14:01:16 INFO Job:  map 39% reduce 0%
20/04/02 14:01:19 INFO Job:  map 41% reduce 0%
20/04/02 14:01:22 INFO Job:  map 43% reduce 0%
20/04/02 14:01:25 INFO Job:  map 44% reduce 0%
20/04/02 14:01:28 INFO Job:  map 46% reduce 0%
20/04/02 14:01:31 INFO Job:  map 49% reduce 0%
20/04/02 14:01:35 INFO Job:  map 50% reduce 0%
20/04/02 14:01:38 INFO Job:  map 51% reduce 0%
20/04/02 14:01:41 INFO Job:  map 52% reduce 0%
20/04/02 14:01:44 INFO Job:  map 55% reduce 0%
20/04/02 14:01:47 INFO Job:  map 56% reduce 0%
20/04/02 14:01:50 INFO Job:  map 57% reduce 0%
20/04/02 14:01:53 INFO Job:  map 58% reduce 0%
20/04/02 14:01:56 INFO Job:  map 59% reduce 0%
20/04/02 14:01:59 INFO Job:  map 61% reduce 0%
20/04/02 14:02:05 INFO Job:  map 62% reduce 0%
20/04/02 14:02:08 INFO Job:  map 64% reduce 0%
20/04/02 14:02:10 INFO Job:  map 65% reduce 0%
20/04/02 14:02:13 INFO Job:  map 66% reduce 0%
20/04/02 14:02:16 INFO Job:  map 68% reduce 0%
20/04/02 14:02:19 INFO Job:  map 70% reduce 0%
20/04/02 14:02:22 INFO Job:  map 72% reduce 0%
20/04/02 14:02:25 INFO Job:  map 74% reduce 0%
20/04/02 14:02:28 INFO Job:  map 75% reduce 0%
20/04/02 14:02:31 INFO Job:  map 76% reduce 0%
20/04/02 14:02:34 INFO Job:  map 79% reduce 0%
20/04/02 14:02:37 INFO Job:  map 80% reduce 0%
20/04/02 14:02:40 INFO Job:  map 82% reduce 0%
20/04/02 14:02:43 INFO Job:  map 84% reduce 0%
20/04/02 14:02:47 INFO Job:  map 85% reduce 0%
20/04/02 14:02:50 INFO Job:  map 88% reduce 0%
20/04/02 14:02:53 INFO Job:  map 90% reduce 0%
20/04/02 14:02:56 INFO Job:  map 93% reduce 0%
20/04/02 14:02:59 INFO Job:  map 96% reduce 0%
20/04/02 14:03:02 INFO Job:  map 99% reduce 0%
20/04/02 14:03:04 INFO Job:  map 100% reduce 0%
20/04/02 14:03:05 INFO Job: Job job_1585797567201_0005 completed successfully
20/04/02 14:03:05 INFO Job: Counters: 30
	File System Counters
		FILE: Number of bytes read=0
		FILE: Number of bytes written=122661
		FILE: Number of read operations=0
		FILE: Number of large read operations=0
		FILE: Number of write operations=0
		HDFS: Number of bytes read=37878493
		HDFS: Number of bytes written=19574319
		HDFS: Number of read operations=75388
		HDFS: Number of large read operations=0
		HDFS: Number of write operations=2
	Job Counters 
		Launched map tasks=1
		Other local map tasks=1
		Total time spent by all maps in occupied slots (ms)=202287
		Total time spent by all reduces in occupied slots (ms)=0
		Total time spent by all map tasks (ms)=202287
		Total vcore-milliseconds taken by all map tasks=202287
		Total megabyte-milliseconds taken by all map tasks=207141888
	Map-Reduce Framework
		Map input records=18846
		Map output records=18846
		Input split bytes=2023490
		Spilled Records=0
		Failed Shuffles=0
		Merged Map outputs=0
		GC time elapsed (ms)=2785
		CPU time spent (ms)=129350
		Physical memory (bytes) snapshot=277868544
		Virtual memory (bytes) snapshot=2164076544
		Total committed heap usage (bytes)=122159104
	File Input Format Counters 
		Bytes Read=0
	File Output Format Counters 
		Bytes Written=19574319
20/04/02 14:03:05 INFO MahoutDriver: Program took 253614 ms (Minutes: 4.226916666666667)

[root@master ~]# hadoop fs -ls /data/mahout/20news/output/20news-seq/
-rw-r--r--   2 root supergroup          0 2020-04-02 14:03 /data/mahout/20news/output/20news-seq/_SUCCESS
-rw-r--r--   2 root supergroup   19574319 2020-04-02 14:03 /data/mahout/20news/output/20news-seq/part-m-00000

[root@master ~]# hadoop fs -text /data/mahout/20news/output/20news-seq/part-m-00000 | head -n 20
20/04/02 14:08:06 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
20/04/02 14:08:06 INFO compress.CodecPool: Got brand-new decompressor [.deflate]
/20news-bydate-test/alt.atheism/53068	From: decay@cbnewsj.cb.att.com (dean.kaflowitz)
Subject: Re: about the bible quiz answers
Organization: AT&T
Distribution: na
Lines: 18

In article <healta.153.735242337@saturn.wwc.edu>, healta@saturn.wwc.edu (Tammy R Healy) writes:
> 
> 
> #12) The 2 cheribums are on the Ark of the Covenant.  When God said make no 
> graven image, he was refering to idols, which were created to be worshipped. 
> The Ark of the Covenant wasn't wrodhipped and only the high priest could 
> enter the Holy of Holies where it was kept once a year, on the Day of 
> Atonement.

I am not familiar with, or knowledgeable about the original language,
but I believe there is a word for "idol" and that the translator
would have used the word "idol" instead of "graven image" had
the original said "idol."  So I think you're wrong here, but
then again I could be too.  I just suggesting a way to determine
text: Unable to write to output stream.
(This last message is harmless: head closes the pipe after printing 20 lines, so hadoop fs -text can no longer write to its output stream.)
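As the -text output above shows, seqdirectory turns each input file into one key/value record: the key is the file's path (here /20news-bydate-test/alt.atheism/53068) and the value is the file's full text. The Python sketch below mimics that mapping in memory; it is illustrative only (a real SequenceFile is a binary Hadoop container, and the function name seqdirectory here is just a stand-in):

```python
import os

def seqdirectory(input_dir):
    """Map each file under input_dir to a (key, value) pair, mimicking
    what 'mahout seqdirectory' writes into a SequenceFile:
    key = file path relative to the input root (with a leading '/'),
    value = the file's full text content."""
    pairs = {}
    for root, _dirs, files in os.walk(input_dir):
        for name in files:
            path = os.path.join(root, name)
            key = "/" + os.path.relpath(path, input_dir)
            # The 20news files are plain text; replace undecodable bytes
            # rather than fail on odd encodings.
            with open(path, encoding="utf-8", errors="replace") as f:
                pairs[key] = f.read()
    return pairs

if __name__ == "__main__":
    import tempfile
    with tempfile.TemporaryDirectory() as d:
        os.makedirs(os.path.join(d, "alt.atheism"))
        with open(os.path.join(d, "alt.atheism", "53068"), "w") as f:
            f.write("From: decay@cbnewsj.cb.att.com\n")
        print(seqdirectory(d))
```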

  3. Use Mahout to generate item recommendations from the dataset user-item-score.txt (user, item, score). Use the item-based collaborative filtering algorithm with the Euclidean distance similarity, recommend 3 items per user, treat the data as non-Boolean, set the maximum preferences per user to 4 and the minimum to 1, save the recommendation output to the output directory, and view the contents of part-r-00000 with the -cat command.

The relevant example program is:
recommenditembased: : Compute recommendations using item-based collaborative filtering

[root@master ~]# hadoop fs -mkdir testdata
(Note: the path must be the relative path testdata as shown, not an absolute path such as /testdata; a relative path resolves under the user's HDFS home directory, here /user/root.)
[root@master ~]# hadoop fs -put /root/tiku/Mahout/user-item-score.txt testdata
Note the task wording: save the recommendation output to the "output" directory. This means output under the current user's HDFS home directory, not /output, and it does not need to be created on HDFS in advance.
[root@master ~]# mahout recommenditembased -i testdata/user-item-score.txt -o output/ -n 3 -b false -s SIMILARITY_EUCLIDEAN_DISTANCE --maxPrefsPerUser 4 --minPrefsPerUser 1
The job can take as long as 6 minutes.
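Before viewing the results, it may help to see what the job computes. Item-based CF with a Euclidean similarity works roughly as follows: for every pair of items, a similarity is derived from the ratings of users who rated both, and each candidate item is scored by a similarity-weighted average of the user's own ratings. The in-memory Python sketch below uses sim = 1 / (1 + Euclidean distance), a common simplification; it is not Mahout's exact SIMILARITY_EUCLIDEAN_DISTANCE formula (which also accounts for the number of co-rating users), and the dataset here is made up:

```python
from math import sqrt

def item_similarity(ratings, a, b):
    """Euclidean-distance similarity between items a and b, computed
    over users who rated both: 1 / (1 + distance).
    ratings: {user: {item: score}}"""
    common = [(r[a], r[b]) for r in ratings.values() if a in r and b in r]
    if not common:
        return 0.0
    dist = sqrt(sum((x - y) ** 2 for x, y in common))
    return 1.0 / (1.0 + dist)

def recommend(ratings, user, n=3):
    """Score each item the user has not rated as the similarity-weighted
    average of the user's existing ratings, and return the top n."""
    items = {i for r in ratings.values() for i in r}
    seen = ratings[user]
    scores = {}
    for cand in items - set(seen):
        num = den = 0.0
        for item, score in seen.items():
            sim = item_similarity(ratings, cand, item)
            num += sim * score
            den += sim
        if den > 0:
            scores[cand] = num / den
    return sorted(scores.items(), key=lambda kv: -kv[1])[:n]
```

On the real dataset Mahout runs this as a series of MapReduce jobs, which is why the command above takes minutes rather than seconds.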
[root@master ~]# hadoop fs -cat output/part-r-00000
1	[172:7.244683,155:7.2344236,168:7.222178]
2	[170:8.7787075,167:8.776039,176:8.773119]
3	[154:8.428272,180:8.423885,162:8.417044]
4	[150:9.0,169:9.0,157:9.0]
5	[164:8.813206,176:8.811579,171:8.807009]
6	[172:9.512182,157:9.504695,175:9.503536]
7	[152:6.792448,158:6.785243,170:6.778984]
8	[153:8.548093,173:8.53183,161:8.527883]
9	[152:8.304974,165:8.298011,159:8.285807]
10	[180:9.018954,175:9.0144825,167:9.012351]
11	[169:8.013654,158:8.008579,156:8.002267]
12	[169:9.2497835,176:9.247996,151:9.245016]
13	[150:9.0,169:9.0,157:9.0]
14	[155:8.026558,176:8.017556,162:8.017162]
15	[154:7.5997324,180:7.591807,173:7.577799]
16	[169:8.514891,156:8.510277,159:8.509771]
17	[150:9.0,169:9.0,157:9.0]
18	[158:9.282879,164:9.270502,160:9.268536]
19	[167:8.529223,176:8.523244,155:8.518513]
20	[167:7.335958,170:7.2930007,174:7.2901974]
21	[159:9.010995,158:9.010478,160:9.00933]
22	[151:8.029679,154:8.026195,158:8.022409]
23	[161:9.265939,153:9.262611,164:9.258203]
24	[166:7.805335,163:7.7671957,167:7.762273]
25	[180:8.442727,152:8.439876,179:8.435754]
26	[177:9.514626,167:9.512479,163:9.5106535]
27	[153:9.270099,162:9.264407,161:9.264261]
28	[180:8.529967,155:8.524805,165:8.522438]
29	[167:8.042943,175:8.036189,152:8.031714]
30	[150:10.0,169:10.0,157:10.0]
31	[165:9.026068,179:9.021489,151:9.020451]
32	[164:6.430244,177:6.4238434,165:6.4196024]
33	[153:8.82286,155:8.81615,168:8.814715]
34	[180:8.773254,176:8.768795,167:8.7659645]
35	[155:8.335955,167:8.322873,174:8.313828]
36	[171:9.414659,177:9.412926,152:9.4089775]
37	[178:9.01828,158:9.015906,170:9.014726]
38	[156:6.8829117,164:6.880241,158:6.8654623]
39	[152:9.25851,162:9.258115,166:9.2546]
40	[166:9.256701,160:9.254954,177:9.251129]
41	[170:6.384581,154:6.382525,157:6.3795266]
42	[151:6.804046,176:6.801657,180:6.7903156]
43	[160:9.278294,159:9.270342,151:9.269794]
44	[168:9.516698,171:9.512943,160:9.5122175]
45	[155:8.759889,151:8.754884,164:8.753441]
46	[151:7.904089,177:7.8877735,168:7.878874]
47	[157:7.017697,173:7.014301,175:7.0133314]
48	[155:9.51533,167:9.51485,173:9.51401]
49	[155:9.032307,153:9.019996,169:9.019916]
50	[154:8.519671,160:8.51902,173:8.512761]
51	[173:8.614746,169:8.610904,170:8.610428]
52	[172:8.783082,160:8.770764,158:8.768208]
53	[173:7.568452,167:7.5595875,158:7.543423]
54	[169:9.0,157:9.0,172:9.0]
55	[159:9.211013,161:9.207755,157:9.207198]
56	[160:8.232883,170:8.224221,173:8.217167]
57	[162:7.6218677,167:7.6092143,169:7.609096]
58	[178:8.018567,169:8.0070095,176:7.996411]
59	[154:8.771842,163:8.767056,153:8.766432]
60	[165:9.518959,156:9.514377,163:9.511185]
61	[168:8.413319,162:8.408587,169:8.407725]
62	[172:8.0402,168:8.031107,153:8.0292635]
63	[170:9.03917,177:9.036586,169:9.036017]
64	[161:8.765093,174:8.763984,171:8.759152]
65	[152:9.60988,180:9.602244,166:9.600036]
66	[160:8.016334,158:8.014243,166:8.003733]
67	[167:9.291695,176:9.27745,166:9.273915]
68	[156:9.415215,159:9.411748,174:9.409421]
69	[152:9.258835,154:9.255967,170:9.255255]
70	[170:8.694196,167:8.691997,176:8.685233]
71	[158:9.262515,166:9.259051,167:9.258179]
72	[170:8.275401,151:8.275099,159:8.272958]
73	[167:9.271295,158:9.259629,170:9.259479]
74	[166:9.266875,172:9.266366,155:9.265728]
75	[162:7.58103,152:7.579109,166:7.5698905]
76	[151:8.4544,177:8.448799,161:8.443345]
77	[172:9.620576,151:9.62016,158:9.613923]
78	[162:6.0223804,171:6.0208836,153:6.019246]
79	[173:8.6011095,180:8.595921,167:8.594956]
80	[150:9.0,157:9.0,165:9.0]
81	[162:7.054459,153:7.040506,167:7.0402517]
82	[150:9.0,169:9.0,157:9.0]
83	[150:9.0,157:9.0,172:9.0]
84	[172:8.632161,174:8.621671,158:8.619549]
85	[176:7.0289874,178:7.027532,169:7.0172086]
86	[158:9.030713,179:9.02912,166:9.024003]
87	[172:6.9182425,151:6.90454,176:6.898417]
88	[167:7.769638,166:7.768989,172:7.7682962]
89	[164:9.409762,162:9.40714,161:9.405866]
90	[176:5.4864054,180:5.4733024,151:5.471639]
91	[157:6.760736,154:6.7595925,167:6.758437]
92	[157:8.507141,179:8.500539,156:8.499115]
93	[169:9.0,157:9.0,172:9.0]
94	[166:8.5067835,158:8.504053,179:8.501568]
95	[164:8.007538,171:8.002612,156:7.999423]
96	[175:8.264971,178:8.263639,162:8.263271]
97	[157:7.8494315,179:7.8257837,171:7.816413]
98	[172:9.274935,174:9.2685995,157:9.268519]
99	[167:8.52092,154:8.520652,180:8.512259]
100	[161:9.205923,169:9.204947,153:9.203809]
101	[167:8.828454,169:8.827417,179:8.825509]
102	[171:8.999125,160:8.999075,152:8.997383]
103	[163:8.82209,157:8.820706,179:8.820145]
104	[159:7.2837806,161:7.2700033,175:7.258169]
105	[157:9.028746,160:9.027347,151:9.02409]
106	[154:8.433907,160:8.421308,158:8.414169]
107	[172:9.514924,174:9.514814,153:9.512063]
108	[151:8.030385,160:8.026872,158:8.02542]
109	[156:6.6250725,157:6.622257,165:6.6082363]
110	[158:9.606727,176:9.606193,171:9.60596]
111	[171:8.017084,177:8.016968,152:8.016839]
112	[159:9.511379,157:9.510889,151:9.505083]
113	[175:7.5248113,177:7.522693,152:7.520893]
114	[172:9.26821,156:9.264633,159:9.258878]
115	[151:7.294816,171:7.293991,174:7.290334]
116	[174:9.046924,171:9.041981,164:9.038853]
117	[164:9.50573,171:9.504629,168:9.501898]
118	[160:9.52131,151:9.519595,170:9.515884]
119	[173:8.425644,156:8.4188595,164:8.417022]
120	[160:7.516728,164:7.5157986,178:7.5129976]
121	[153:7.6263037,172:7.61627,169:7.615692]
122	[178:6.762497,160:6.7599945,158:6.7560177]
123	[172:9.178253,174:9.173519,160:9.172696]
124	[156:9.253967,178:9.252134,166:9.245467]
125	[169:7.814966,166:7.807703,167:7.8047295]
126	[159:8.785326,175:8.773898,162:8.767956]
127	[169:7.8215017,170:7.8197494,178:7.816342]
128	[162:8.510248,163:8.507281,156:8.505663]
129	[166:8.026217,172:8.015036,155:8.013249]
130	[171:8.819271,161:8.812314,162:8.811065]
131	[154:7.865669,178:7.863002,157:7.8604527]
132	[179:9.523388,158:9.521263,173:9.516971]
133	[159:8.018762,158:8.016342,174:8.014847]
134	[154:9.021759,172:9.021533,162:9.01653]
135	[151:8.61318,155:8.609317,160:8.607563]
136	[156:9.254534,166:9.249688,174:9.248116]
137	[150:9.0,169:9.0,157:9.0]
138	[171:9.762398,166:9.759721,164:9.757952]
139	[153:9.528044,171:9.523034,173:9.5229645]
140	[155:7.5285306,180:7.5238295,169:7.5093036]
141	[180:7.4153795,167:7.412624,170:7.4123755]
142	[171:8.023187,164:8.009908,172:7.999817]
143	[151:7.7645965,171:7.7615533,173:7.7559156]
144	[177:7.782484,152:7.776077,151:7.7754965]
145	[164:8.619289,168:8.61923,170:8.618492]
146	[176:6.7966204,171:6.7868986,173:6.783899]
147	[160:9.018281,166:9.005128,178:9.004819]
148	[167:8.049955,157:8.041632,154:8.038961]
149	[177:9.517222,155:9.516409,178:9.516256]
150	[155:8.414563,156:8.411247,161:8.405218]
151	[158:8.020138,163:8.014851,154:8.013942]
152	[167:8.518394,152:8.514982,177:8.5132]
153	[179:8.51755,158:8.513094,164:8.512812]
154	[157:8.518561,151:8.508553,174:8.508073]
155	[159:8.001118,164:8.000777,162:7.991791]
156	[150:10.0,169:10.0,157:10.0]
157	[167:7.515985,175:7.5148,151:7.512807]
158	[160:8.514562,152:8.514253,165:8.506844]
159	[162:9.205832,179:9.20297,158:9.199801]
160	[151:8.232582,174:8.229294,166:8.22144]
161	[175:8.514033,152:8.511582,156:8.509891]
162	[156:8.40486,177:8.403272,155:8.402213]
163	[166:9.764738,158:9.761785,167:9.760458]
164	[167:8.271553,173:8.269992,158:8.26935]
165	[174:8.035829,154:8.03065,167:8.02868]
166	[168:7.779158,153:7.7743764,174:7.7730503]
167	[175:9.029314,152:9.025963,177:9.01667]
168	[177:7.565077,151:7.561794,164:7.538589]
169	[155:8.273797,168:8.273199,163:8.268657]
170	[176:8.634429,153:8.6186075,163:8.618071]
171	[159:9.262066,156:9.261877,167:9.2571745]
172	[175:9.523921,154:9.519366,163:9.516635]
173	[175:8.838636,152:8.836262,163:8.813428]
174	[158:9.26631,161:9.263697,170:9.261099]
175	[166:8.775548,178:8.771132,175:8.769558]
176	[165:7.8150263,169:7.8133674,155:7.8131046]
177	[156:8.807701,153:8.807377,157:8.805355]
178	[159:9.261233,164:9.259828,174:9.2585335]
179	[168:9.027936,169:9.025012,159:9.022427]
180	[163:8.785125,171:8.784099,155:8.783716]
181	[178:7.682512,171:7.678499,165:7.6766257]
182	[173:7.25374,167:7.235347,154:7.2335563]
183	[173:8.348462,150:8.344258,171:8.343579]
184	[175:8.343719,171:8.337056,151:8.334391]
185	[177:8.03772,169:8.033685,173:8.030679]
186	[178:8.435311,165:8.430334,156:8.429024]
187	[167:9.524929,180:9.522632,168:9.513746]
188	[159:9.300355,174:9.299761,168:9.297128]
189	[176:7.874146,158:7.8669815,178:7.863203]
190	[165:7.829162,179:7.8232923,157:7.8204217]
191	[161:8.609121,174:8.608223,151:8.606536]
192	[153:8.414844,155:8.412835,179:8.410556]
193	[170:6.7902784,173:6.7844234,168:6.772431]
194	[161:7.7525477,153:7.7513156,178:7.750592]
195	[167:8.822119,163:8.815237,166:8.814932]
196	[169:8.774473,173:8.769399,180:8.765824]
197	[156:7.234841,152:7.2339506,169:7.22824]
198	[166:7.597569,177:7.590905,162:7.5895452]
199	[156:6.8099,162:6.80961,175:6.8059845]
200	[160:6.6166706,152:6.6111975,175:6.5953183]
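Each line of part-r-00000 has the form userID<TAB>[itemID:score,itemID:score,itemID:score]. The small Python helper below (hypothetical, for post-processing the output locally) parses one line into a (user, [(item, score), ...]) pair:

```python
def parse_recommendation_line(line):
    """Parse one line of Mahout recommenditembased output:
    'userID\t[item:score,item:score,...]'
    -> (userID, [(item, score), ...])."""
    user, _, rest = line.strip().partition("\t")
    recs = []
    for pair in rest.strip("[]").split(","):
        item, _, score = pair.partition(":")
        recs.append((int(item), float(score)))
    return int(user), recs

# Example: the first line of the output above.
line = "1\t[172:7.244683,155:7.2344236,168:7.222178]"
print(parse_recommendation_line(line))
```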

Thanks to 先电云 for providing the exercise bank.
Thanks to Apache for its open-source technology and support.
Thanks to the bloggers 抛物线, mn525520, and 菜鸟一枚2019 for their related posts.
