set, the header will be that borrowed from sequence files, e.g. SEQ- followed
by the input and output RCFile formats.
</description>
</property>
<property>
<name>hive.exec.orc.dictionary.key.size.threshold</name>
<value>0.8</value>
<description>
If the number of keys in a dictionary is greater than this fraction of the total number of
non-null rows, turn off dictionary encoding. Use 1 to always use dictionary encoding.
</description>
</property>
<property>
<name>hive.multi.insert.move.tasks.share.dependencies</name>
<value>false</value>
<description>
If this is set all move tasks for tables/partitions (not directories) at the end of a
multi-insert query will only begin once the dependencies for all these move tasks have been
met.
Advantages: If concurrency is enabled, the locks will only be released once the query has
finished, so with this config enabled, the time when the table/partition is
generated will be much closer to when the lock on it is released.
Disadvantages: If concurrency is not enabled, with this disabled, the tables/partitions which
are produced by this query and finish earlier will be available for querying
much earlier. Since the locks are only released once the query finishes, this
does not apply if concurrency is enabled.
</description>
</property>
<property>
<name>hive.fetch.task.conversion</name>
<value>minimal</value>
<description>
Some select queries can be converted to single FETCH task minimizing latency.
Currently, the query must have a single source with no subqueries, and must not have
any aggregations or DISTINCTs (which incur an RS), lateral views, or joins.
1. minimal : SELECT STAR, FILTER on partition columns, LIMIT only
2. more : SELECT, FILTER, LIMIT only (TABLESAMPLE, virtual columns)
</description>
</property>
<property>
<name>hive.cache.expr.evaluation</name>
<value>true</value>
<description>
If true, evaluation result of deterministic expression referenced twice or more will be cached.
For example, in filter condition like ".. where key + 10 > 10 or key + 10 = 0"
"key + 10" will be evaluated/cached once and reused for following expression ("key + 10 = 0").
Currently, this is applied only to expressions in select or filter operator.
</description>
</property>
<property>
<name>hive.hmshandler.retry.attempts</name>
<value>1</value>
<description>The number of times to retry an HMSHandler call if there is a connection error</description>
</property>
<property>
<name>hive.hmshandler.retry.interval</name>
<value>1000</value>
<description>The number of milliseconds between HMSHandler retry attempts</description>
</property>
<property>
<name>hive.server.read.socket.timeout</name>
<value>10</value>
<description>Timeout for the HiveServer to close the connection if no response from the client in N seconds, defaults to 10 seconds.</description>
</property>
<property>
<name>hive.server.tcp.keepalive</name>
<value>true</value>
<description>Whether to enable TCP keepalive for the Hive server. Keepalive will prevent accumulation of half-open connections.</description>
</property>
<property>
<name>hive.server2.in.mem.logging</name>
<value>true</value>
<description>
Whether to turn on hiveserver2 in memory logging
</description>
</property>
<property>
<name>hive.server2.in.mem.log.size</name>
<value>131072</value>
<description>
Maximum size of the hiveserver2 in memory query log. Note that the size is per query.
</description>
</property>
<property>
<name>hive.decode.partition.name</name>
<value>false</value>
<description>Whether to show the unquoted partition names in query results.</description>
</property>
<property>
<name>hive.log4j.file</name>
<value></value>
<description>Hive log4j configuration file.
If the property is not set, then logging will be initialized using hive-log4j.properties found on the classpath.
If the property is set, the value must be a valid URI (java.net.URI, e.g. "file:///tmp/my-logging.properties"), which you can then extract a URL from and pass to PropertyConfigurator.configure(URL).</description>
</property>
<property>
<name>hive.exec.log4j.file</name>
<value></value>
<description>Hive log4j configuration file for execution mode(sub command).
If the property is not set, then logging will be initialized using hive-exec-log4j.properties found on the classpath.
If the property is set, the value must be a valid URI (java.net.URI, e.g. "file:///tmp/my-logging.properties"), which you can then extract a URL from and pass to PropertyConfigurator.configure(URL).</description>
</property>
<property>
<name>hive.exec.infer.bucket.sort</name>
<value>false</value>
<description>
If this is set, when writing partitions, the metadata will include the bucketing/sorting
properties with which the data was written if any (this will not overwrite the metadata
inherited from the table if the table is bucketed/sorted)
</description>
</property>
<property>
<name>hive.exec.infer.bucket.sort.num.buckets.power.two</name>
<value>false</value>
<description>
If this is set, when setting the number of reducers for the map reduce task which writes the
final output files, it will choose a number which is a power of two, unless the user specifies
the number of reducers via mapred.reduce.tasks. The number of reducers may be set to a
power of two only to be followed by a merge task, in which case nothing can be inferred.
With hive.exec.infer.bucket.sort set to true:
Advantages: If this is not set, the number of buckets for partitions will seem arbitrary,
which means that the number of mappers used for optimized joins, for example, will
be very low. With this set, since the number of buckets used for any partition is
a power of two, the number of mappers used for optimized joins will be the least
number of buckets used by any partition being joined.
Disadvantages: This may mean a much larger or much smaller number of reducers being used in the
final map reduce job, e.g. if a job was originally going to take 257 reducers,
it will now take 512 reducers, similarly if the max number of reducers is 511,
and a job was going to use this many, it will now use 256 reducers.
</description>
</property>
<property>
<name>hive.groupby.orderby.position.alias</name>
<value>false</value>
<description>Whether to enable using Column Position Alias in Group By or Order By</description>
</property>
<property>
<name>hive.server2.thrift.min.worker.threads</name>
<value>5</value>
<description>Minimum number of Thrift worker threads</description>
</property>
<property>
<name>hive.server2.thrift.max.worker.threads</name>
<value>500</value>
<description>Maximum number of Thrift worker threads</description>
</property>
<property>
<name>hive.server2.async.exec.threads</name>
<value>100</value>
<description>Number of threads in the async thread pool for HiveServer2</description>
</property>
<property>
<name>hive.server2.async.exec.shutdown.timeout</name>
<value>10</value>
<description>Time (in seconds) for which HiveServer2 shutdown will wait for async
threads to terminate</description>
</property>
<property>
<name>hive.server2.async.exec.keepalive.time</name>
<value>10</value>
<description>Time (in seconds) that an idle HiveServer2 async thread (from the thread pool) will wait
for a new task to arrive before terminating</description>
</property>
<property>
<name>hive.server2.async.exec.wait.queue.size</name>
<value>100</value>
<description>Size of the wait queue for async thread pool in HiveServer2.
After hitting this limit, the async thread pool will reject new requests.</description>
</property>
<property>
<name>hive.server2.thrift.port</name>
<value>10000</value>
<description>Port number of HiveServer2 Thrift interface.
Can be overridden by setting $HIVE_SERVER2_THRIFT_PORT</description>
</property>
<property>
<name>hive.server2.thrift.bind.host</name>
<value>localhost</value>
<description>Bind host on which to run the HiveServer2 Thrift interface.
Can be overridden by setting $HIVE_SERVER2_THRIFT_BIND_HOST</description>
</property>
<property>
<name>hive.server2.authentication</name>
<value>NONE</value>
<description>
Client authentication types.
NONE: no authentication check
LDAP: LDAP/AD based authentication
KERBEROS: Kerberos/GSSAPI authentication
CUSTOM: Custom authentication provider
(Use with property hive.server2.custom.authentication.class)
</description>
</property>
<property>
<name>hive.server2.custom.authentication.class</name>
<value></value>
<description>
Custom authentication class. Used when property
'hive.server2.authentication' is set to 'CUSTOM'. Provided class
must be a proper implementation of the interface
org.apache.hive.service.auth.PasswdAuthenticationProvider. HiveServer2
will call its Authenticate(user, password) method to authenticate requests.
The implementation may optionally extend Hadoop's
org.apache.hadoop.conf.Configured class to grab Hive's Configuration object.
</description>
</property>
<property>
<name>hive.server2.authentication.kerberos.principal</name>
<value></value>
<description>
Kerberos server principal
</description>
</property>
<property>
<name>hive.server2.authentication.kerberos.keytab</name>
<value></value>
<description>
Kerberos keytab file for server principal
</description>
</property>
<property>
<name>hive.server2.authentication.ldap.url</name>
<value></value>
<description>
LDAP connection URL
</description>
</property>
<property>
<name>hive.server2.authentication.ldap.baseDN</name>
<value></value>
<description>
LDAP base DN
</description>
</property>
<property>
<name>hive.server2.enable.doAs</name>
<value>true</value>
<description>
Setting this property to true will have hive server2 execute
hive operations as the user making the calls to it.
</description>
</property>
<property>
<name>hive.server2.table.type.mapping</name>
<value>CLASSIC</value>
<description>
This setting reflects how HiveServer2 will report table types for JDBC and other
client implementations that retrieve the available tables and supported table types.
HIVE : Exposes Hive's native table types like MANAGED_TABLE, EXTERNAL_TABLE, VIRTUAL_VIEW
CLASSIC : More generic types like TABLE and VIEW
</description>
</property>
<property>
<name>hive.server2.thrift.sasl.qop</name>
<value>auth</value>
<description>Sasl QOP value; set it to one of the following values to enable higher levels of
protection for HiveServer2 communication with clients.
"auth" - authentication only (default)
"auth-int" - authentication plus integrity protection
"auth-conf" - authentication plus integrity and confidentiality protection
This is applicable only when HiveServer2 is configured to use Kerberos authentication.
</description>
</property>
<property>
<name>hive.metastore.schema.verification</name>
<value>false</value>
<description>
Enforce metastore schema version consistency.
True: Verify that version information stored in the metastore matches the version from the Hive jars. Also disable automatic
schema migration. Users must manually migrate the schema after a Hive upgrade, which ensures
proper metastore schema migration. (Default)
False: Warn if the version information stored in the metastore doesn't match the version from the Hive jars.
</description>
</property>
<property>
<name>hive.plan.serialization.format</name>
<value>kryo</value>
<description>
Query plan format serialization between client and task nodes.
Two supported values are : kryo and javaXML. Kryo is default.
</description>
</property>
<property>
<name>hive.server2.allow.user.substitution</name>
<value>true</value>
<description>
Allow an alternate user to be specified as part of the HiveServer2 open connection request.
</description>
</property>
</configuration>
[root@cdh1 conf]#
[root@cdh1 conf]# less hive-site.xml
[root@cdh1 conf]# mysql -u root -p
Enter password:
Welcome to the MySQL monitor. Commands end with ; or \g.
Your MySQL connection id is 4
Server version: 5.1.73 Source distribution
Copyright (c) 2000, 2013, Oracle and/or its affiliates. All rights reserved.
Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.
Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.
mysql> show tables;
ERROR 1046 (3D000): No database selected
mysql> show databases;
+--------------------+
| Database |
+--------------------+
| information_schema |
| amon |
| hive |
| mysql |
| test |
+--------------------+
5 rows in set (0.23 sec)
mysql> create database hivedb;
Query OK, 1 row affected (0.00 sec)
mysql> show databases;
+--------------------+
| Database |
+--------------------+
| information_schema |
| amon |
| hive |
| hivedb |
| mysql |
| test |
+--------------------+
6 rows in set (0.00 sec)
mysql> select user();
+----------------+
| user() |
+----------------+
| root@localhost |
+----------------+
1 row in set (0.04 sec)
mysql> create user 'hiveuser' identified by 'hivepwd';
Query OK, 0 rows affected (0.00 sec)
mysql> select user();
+----------------+
| user() |
+----------------+
| root@localhost |
+----------------+
1 row in set (0.00 sec)
mysql> grant all privileges on *.* to hiveuser@"localhost" identified by "hivepwd" with grant option;
Query OK, 0 rows affected (0.00 sec)
mysql> select user();
+----------------+
| user() |
+----------------+
| root@localhost |
+----------------+
1 row in set (0.00 sec)
mysql> q
-> ;
ERROR 1064 (42000): You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'q' at line 1
mysql> Bye
[root@cdh1 conf]# mysql -u hiveuser -p
Enter password:
Welcome to the MySQL monitor. Commands end with ; or \g.
Your MySQL connection id is 6
Server version: 5.1.73 Source distribution
Copyright (c) 2000, 2013, Oracle and/or its affiliates. All rights reserved.
Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.
Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.
mysql> select user();
+--------------------+
| user() |
+--------------------+
| hiveuser@localhost |
+--------------------+
1 row in set (0.00 sec)
mysql> select user();
+--------------------+
| user() |
+--------------------+
| hiveuser@localhost |
+--------------------+
1 row in set (0.00 sec)
mysql> show databases;
+--------------------+
| Database |
+--------------------+
| information_schema |
| amon |
| hive |
| hivedb |
| mysql |
| test |
+--------------------+
6 rows in set (0.00 sec)
mysql> Bye
[root@cdh1 conf]# cd /user/local/test1
[root@cdh1 test1]# ll
total 32
-rw-r--r-- 1 root root 216 Jul 11 19:35 matchTest.scala
-rw-r--r-- 1 root root 697 Jul 11 06:30 prt1.class
-rw-r--r-- 1 root root 1066 Jul 11 06:30 prt1$.class
-rw-r--r-- 1 root root 586 Jul 11 06:29 prt1.scala
-rw-r--r-- 1 root root 594 Jul 11 06:26 prt.class
-rw-r--r-- 1 root root 1052 Jul 11 06:26 prt$.class
-rw-r--r-- 1 root root 120 Jul 11 06:19 prt.scala
-rw-r--r-- 1 root root 21 Jul 11 23:22 test.txt
[root@cdh1 test1]# vim fileList.scala
[root@cdh1 test1]# scalac fileList.scala
fileList.scala:3: error: not found: value filesHere
for (file <- filesHere)
^
one error found
[root@cdh1 test1]# vim fileList.scala
[root@cdh1 test1]# scalac fileList.scala
[root@cdh1 test1]# scala fileList.scala
./prt1.scala
./prt.scala
./fileList.scala
./matchTest.scala
[root@cdh1 test1]# ls
fileList$$anonfun$main$1.class fileList$.class matchTest.scala prt1$.class prt.class prt.scala
fileList.class fileList.scala prt1.class prt1.scala prt$.class test.txt
[root@cdh1 test1]# ls -l
total 48
-rw-r--r-- 1 root root 1130 Jul 12 04:53 fileList$$anonfun$main$1.class
-rw-r--r-- 1 root root 685 Jul 12 04:53 fileList.class
-rw-r--r-- 1 root root 968 Jul 12 04:53 fileList$.class
-rw-r--r-- 1 root root 244 Jul 12 04:53 fileList.scala
-rw-r--r-- 1 root root 216 Jul 11 19:35 matchTest.scala
-rw-r--r-- 1 root root 697 Jul 11 06:30 prt1.class
-rw-r--r-- 1 root root 1066 Jul 11 06:30 prt1$.class
-rw-r--r-- 1 root root 586 Jul 11 06:29 prt1.scala
-rw-r--r-- 1 root root 594 Jul 11 06:26 prt.class
-rw-r--r-- 1 root root 1052 Jul 11 06:26 prt$.class
-rw-r--r-- 1 root root 120 Jul 11 06:19 prt.scala
-rw-r--r-- 1 root root 21 Jul 11 23:22 test.txt
[root@cdh1 test1]# cat fileList.scala
object fileList{
//val filesHere = (new java.io.File(".")).listFiles
def main(args :Array[String]){
val filesHere = (new java.io.File(".")).listFiles
for (file <- filesHere)
if (file.getName.endsWith(".scala"))
println(file)
}
}
[root@cdh1 test1]#
[root@cdh1 test1]# ll
total 48
-rw-r--r-- 1 root root 1130 Jul 12 04:53 fileList$$anonfun$main$1.class
-rw-r--r-- 1 root root 685 Jul 12 04:53 fileList.class
-rw-r--r-- 1 root root 968 Jul 12 04:53 fileList$.class
-rw-r--r-- 1 root root 244 Jul 12 04:53 fileList.scala
-rw-r--r-- 1 root root 216 Jul 11 19:35 matchTest.scala
-rw-r--r-- 1 root root 697 Jul 11 06:30 prt1.class
-rw-r--r-- 1 root root 1066 Jul 11 06:30 prt1$.class
-rw-r--r-- 1 root root 586 Jul 11 06:29 prt1.scala
-rw-r--r-- 1 root root 594 Jul 11 06:26 prt.class
-rw-r--r-- 1 root root 1052 Jul 11 06:26 prt$.class
-rw-r--r-- 1 root root 120 Jul 11 06:19 prt.scala
-rw-r--r-- 1 root root 21 Jul 11 23:22 test.txt
[root@cdh1 test1]# mkdir a.scala
[root@cdh1 test1]# ll
total 52
drwxr-xr-x 2 root root 4096 Jul 12 04:59 a.scala
-rw-r--r-- 1 root root 1130 Jul 12 04:53 fileList$$anonfun$main$1.class
-rw-r--r-- 1 root root 685 Jul 12 04:53 fileList.class
-rw-r--r-- 1 root root 968 Jul 12 04:53 fileList$.class
-rw-r--r-- 1 root root 244 Jul 12 04:53 fileList.scala
-rw-r--r-- 1 root root 216 Jul 11 19:35 matchTest.scala
-rw-r--r-- 1 root root 697 Jul 11 06:30 prt1.class
-rw-r--r-- 1 root root 1066 Jul 11 06:30 prt1$.class
-rw-r--r-- 1 root root 586 Jul 11 06:29 prt1.scala
-rw-r--r-- 1 root root 594 Jul 11 06:26 prt.class
-rw-r--r-- 1 root root 1052 Jul 11 06:26 prt$.class
-rw-r--r-- 1 root root 120 Jul 11 06:19 prt.scala
-rw-r--r-- 1 root root 21 Jul 11 23:22 test.txt
[root@cdh1 test1]# scala fileList.scala
./prt1.scala
./prt.scala
./fileList.scala
./a.scala
./matchTest.scala
[root@cdh1 test1]# vim fileList.scala
[root@cdh1 test1]# scalac fileList.scala
fileList.scala:6: error: illegal start of simple expression
if (file.isFile);
^
one error found
[root@cdh1 test1]# vim fileList.scala
[root@cdh1 test1]# scalac fileList.scala
[root@cdh1 test1]# scala fileList.scala
./prt1.scala
./prt.scala
./fileList.scala
./matchTest.scala
[root@cdh1 test1]# cp fileList.scala fileList2.scala
[root@cdh1 test1]# vim fileList2.scala
[root@cdh1 test1]# scalac fileList2.scala
fileList2.scala:7: error: illegal start of simple pattern
if (file.isFile);
^
fileList2.scala:7: error: '<-' expected but ';' found.
if (file.isFile);
^
fileList2.scala:10: error: '<-' expected but '}' found.
}
^
fileList2.scala:11: error: illegal start of simple expression
}
^
four errors found
[root@cdh1 test1]# vim fileList2.scala
[root@cdh1 test1]# scalac fileList2.scala
[root@cdh1 test1]# scala fileList2.scala
./prt1.scala
./prt.scala
./fileList.scala
./matchTest.scala
./fileList2.scala
[root@cdh1 test1]# cat fileList2.scala
object fileList2{
//val filesHere = (new java.io.File(".")).listFiles
def main(args :Array[String]){
val filesHere = (new java.io.File(".")).listFiles
//for (file <- filesHere)
for(
file <- filesHere
if (file.isFile);
if (file.getName.endsWith(".scala"))
//println(file)
)
println(file)
}
}
[root@cdh1 test1]# cp fileList2.scala fileList3.scala
[root@cdh1 test1]# vim fileList3.scala
[root@cdh1 test1]# scalac fileList3.scala
fileList3.scala:6: error: ';' expected but '<-' found.
file <- filesHere
^
fileList3.scala:9: error: illegal start of simple pattern
if (file.isFile);
^
fileList3.scala:9: error: '<-' expected but ';' found.
if (file.isFile);
^
three errors found
[root@cdh1 test1]# vim fileList3.scala
[root@cdh1 test1]# scalac fileList3.scala
[root@cdh1 test1]# scala fileList3.scala
./prt1.scala
./prt.scala
./fileList.scala
./matchTest.scala
./fileList2.scala
./fileList3.scala
[root@cdh1 test1]# cat fileList3.scala
object fileList3{
//val filesHere = (new java.io.File(".")).listFiles
def main(args :Array[String]){
val filesHere = (new java.io.File(".")).listFiles
//for (file <- filesHere)
//file <- filesHere
for(file <- filesHere if (file.isFile); if (file.getName.endsWith(".scala")) )
println(file)
}
}
[root@cdh1 test1]# vim fileList3.scala
[root@cdh1 test1]# scalac fileList3.scala
[root@cdh1 test1]# scala fileList3.scala
./prt1.scala
./prt.scala
./fileList.scala
./matchTest.scala
./fileList2.scala
./fileList3.scala
[root@cdh1 test1]# cat fileList3.scala
object fileList3{
//val filesHere = (new java.io.File(".")).listFiles
def main(args :Array[String]){
val filesHere = (new java.io.File(".")).listFiles
//for (file <- filesHere)
//file <- filesHere
for(file <- filesHere; if (file.isFile); if (file.getName.endsWith(".scala")) )
println(file)
}
}
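The compile errors above come from mixing the generator and guards inside parentheses, where each clause needs a separating semicolon. A small sketch (using hypothetical in-memory (name, isFile) pairs standing in for filesHere) shows the brace form of the for expression, where newlines separate clauses and no semicolons are needed:

```scala
object GuardDemo {
  def main(args: Array[String]): Unit = {
    // Hypothetical (name, isFile) pairs standing in for filesHere.
    val entries = List(("a.scala", true), ("dir.scala", false), ("b.txt", true))
    // With braces, each generator/guard sits on its own line; no semicolons.
    val picked = for {
      (name, isFile) <- entries
      if isFile
      if name.endsWith(".scala")
    } yield name
    println(picked) // List(a.scala)
  }
}
```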
[root@cdh1 test1]# ll
total 100
drwxr-xr-x 2 root root 4096 Jul 12 04:59 a.scala
-rw-r--r-- 1 root root 921 Jul 12 05:17 fileList2$$anonfun$main$1.class
-rw-r--r-- 1 root root 1023 Jul 12 05:17 fileList2$$anonfun$main$2.class
-rw-r--r-- 1 root root 992 Jul 12 05:17 fileList2$$anonfun$main$3.class
-rw-r--r-- 1 root root 770 Jul 12 05:17 fileList2.class
-rw-r--r-- 1 root root 1214 Jul 12 05:17 fileList2$.class
-rw-r--r-- 1 root root 327 Jul 12 05:17 fileList2.scala
-rw-r--r-- 1 root root 921 Jul 12 05:24 fileList3$$anonfun$main$1.class
-rw-r--r-- 1 root root 1023 Jul 12 05:24 fileList3$$anonfun$main$2.class
-rw-r--r-- 1 root root 992 Jul 12 05:24 fileList3$$anonfun$main$3.class
-rw-r--r-- 1 root root 770 Jul 12 05:24 fileList3.class
-rw-r--r-- 1 root root 1214 Jul 12 05:24 fileList3$.class
-rw-r--r-- 1 root root 312 Jul 12 05:24 fileList3.scala
-rw-r--r-- 1 root root 1166 Jul 12 05:02 fileList$$anonfun$main$1.class
-rw-r--r-- 1 root root 685 Jul 12 05:02 fileList.class
-rw-r--r-- 1 root root 968 Jul 12 05:02 fileList$.class
-rw-r--r-- 1 root root 270 Jul 12 05:02 fileList.scala
-rw-r--r-- 1 root root 216 Jul 11 19:35 matchTest.scala
-rw-r--r-- 1 root root 697 Jul 11 06:30 prt1.class
-rw-r--r-- 1 root root 1066 Jul 11 06:30 prt1$.class
-rw-r--r-- 1 root root 586 Jul 11 06:29 prt1.scala
-rw-r--r-- 1 root root 594 Jul 11 06:26 prt.class
-rw-r--r-- 1 root root 1052 Jul 11 06:26 prt$.class
-rw-r--r-- 1 root root 120 Jul 11 06:19 prt.scala
-rw-r--r-- 1 root root 21 Jul 11 23:22 test.txt
[root@cdh1 test1]# less matchTest.scala
[root@cdh1 test1]# vim matchTest.scala
[root@cdh1 test1]# scalac matchTest.scala
[root@cdh1 test1]# scala matchTest.scala
huh?
[root@cdh1 test1]# scala matchTest.scala chips
salsa
[root@cdh1 test1]# scala matchTest.scala salt
pepper
[root@cdh1 test1]# scala matchTest.scala eggs
bacon
[root@cdh1 test1]# scala matchTest.scala
huh?
[root@cdh1 test1]# cat matchTest.scala
object matchTest{
def main(args:Array[String]){
val firstArg =
if(args.length > 0)
args(0)
else
""
firstArg match {
case "salt" => println("pepper")
case "chips" => println("salsa")
case "eggs" => println("bacon")
case _ => println("huh?") }
}
}
[root@cdh1 test1]# scala matchTest.scala eggs salt
bacon
[root@cdh1 test1]# scala matchTest.scala matchTest.
matchTest.class matchTest.scala
[root@cdh1 test1]# cp matchTest.scala matchTest2.scala
[root@cdh1 test1]# vim matchTest2.scala
[1]+ Stopped vim matchTest2.scala
[root@cdh1 test1]# vim matchTest2.scala
[root@cdh1 test1]# scalac matchTest2.scala
[root@cdh1 test1]# scala matchTest2.scala eggs salt
e_bacon
[root@cdh1 test1]# scala matchTest2.scala salt
s_pepper
[root@cdh1 test1]# scala matchTest2.scala chips
c_salsa
[root@cdh1 test1]# scala matchTest2.scala chipsss
o_huh?
[root@cdh1 test1]# scala matchTest2.scala
o_huh?
[root@cdh1 test1]# cat matchTest2.scala
object matchTest2{
def main(args:Array[String]){
val firstArg =
if(args.length > 0)
args(0)
else
""
var friend=
firstArg match {
case "salt" => "s_pepper"
case "chips" => "c_salsa"
case "eggs" => "e_bacon"
case _ => "o_huh?" }
println(friend)
}
}
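Since match is an expression, the var in matchTest2 is avoidable: the match result can initialize a val or be returned from a function directly. A minimal sketch of that refactoring:

```scala
object MatchExpr {
  // match evaluates to a value, so no intermediate var is needed.
  def friendOf(food: String): String = food match {
    case "salt"  => "s_pepper"
    case "chips" => "c_salsa"
    case "eggs"  => "e_bacon"
    case _       => "o_huh?"
  }

  def main(args: Array[String]): Unit = {
    val firstArg = if (args.length > 0) args(0) else ""
    println(friendOf(firstArg))
  }
}
```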
[root@cdh1 test1]# scala
Welcome to Scala version 2.9.3 (Java HotSpot(TM) Client VM, Java 1.7.0_67).
Type in expressions to have them evaluated.
Type :help for more information.
scala> var increase = (x: Int) => x + 1
increase: Int => Int = <function1>
scala> increase 10
<console>:1: error: ';' expected but integer literal found.
increase 10
^
scala> increase (10)
res0: Int = 11
scala>
scala> increase (10)
res1: Int = 11
scala> var increase = (x: Int) => x + 9999
increase: Int => Int = <function1>
scala> increase (10)
res2: Int = 10009
scala> val increase = (x: Int) => x + 1
increase: Int => Int = <function1>
scala> increase (10)
res3: Int = 11
scala> increase = (x: Int) => x + 9999
<console>:8: error: reassignment to val
increase = (x: Int) => x + 9999
^
scala> var increase = (x: Int) => x + 1
increase: Int => Int = <function1>
scala> increase = (x: Int) => x + 9999
increase: Int => Int = <function1>
scala> increase (10)
res4: Int = 10009
scala> increase = (x: Int) => { println("We") println("are") println("here!") x + 1 }
<console>:8: error: value println is not a member of Unit
increase = (x: Int) => { println("We") println("are") println("here!") x + 1 }
^
scala> increase = (x: Int) => { println("We") println("are") println("here!") x + 1 }
<console>:8: error: value println is not a member of Unit
increase = (x: Int) => { println("We") println("are") println("here!") x + 1 }
^
scala> increase = (x: Int) => { println("We") ;println("are"); println("here!"); x + 1 }
increase: Int => Int = <function1>
scala> increase (10)
We
are
here!
res5: Int = 11
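The two failures above show why the one-liner needed semicolons: inside the braces of a function literal, statements must be separated by semicolons or newlines. Written across several lines, the same function needs no semicolons at all:

```scala
val increase: Int => Int = (x: Int) => {
  println("We")
  println("are")
  println("here!")
  x + 1 // the last expression is the function's result
}
println(increase(10)) // prints the three lines, then 11
```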
scala> val someNumbers = List(-11, -10, -5, 0, 5, 10)
someNumbers: List[Int] = List(-11, -10, -5, 0, 5, 10)
scala> someNumbers.f
filter filterNot find findIndexOf findLastIndexOf first firstOption
flatMap flatten fold foldLeft foldRight forall foreach
scala> someNumbers.foreach((x:Int)->println(x))
<console>:9: error: not found: value x
someNumbers.foreach((x:Int)->println(x))
^
scala> someNumbers.foreach((x:Int)=>println(x))
-11
-10
-5
0
5
10
scala> someNumbers
res8: List[Int] = List(-11, -10, -5, 0, 5, 10)
scala> someNumbers.filter
filter filterNot
scala> someNumbers.filter((x:Int)=>x>0)
res9: List[Int] = List(5, 10)
scala> someNumbers.filter((x:Int)=>x>0).foreach((_:Int)=>println(_))
<console>:9: error: missing parameter type for expanded function ((x$2) => println(x$2))
someNumbers.filter((x:Int)=>x>0).foreach((_:Int)=>println(_))
^
scala> someNumbers.filter((x:Int)=>x>0).foreach((x:Int)=>println(x))
5
10
scala> someNumbers.filter((x:Int)=>x>0).foreach((y:Int)=>println(y))
5
10
scala> someNumbers.filter((x)=>x>0)
res13: List[Int] = List(5, 10)
scala> someNumbers.filter(x=>x>0)
res14: List[Int] = List(5, 10)
scala> someNumbers.filter(x=>x>0).foreach((y:Int)=>println(y))
5
10
scala> someNumbers.filter(_>0).foreach(println(_))
5
10
scala> someNumbers.filter(_>0)
res17: List[Int] = List(5, 10)
scala> someNumbers.filter(_<0)
res18: List[Int] = List(-11, -10, -5)
scala> someNumbers.filter(_<0).foreach(println(_))
-11
-10
-5
scala>
scala> val f =_+_
<console>:7: error: missing parameter type for expanded function ((x$1, x$2) => x$1.$plus(x$2))
val f =_+_
^
<console>:7: error: missing parameter type for expanded function ((x$1: <error>, x$2) => x$1.$plus(x$2))
val f =_+_
^
scala> val f =(_:Int)+(_:InT)
<console>:7: error: not found: type InT
val f =(_:Int)+(_:InT)
^
scala> val f =(_:Int)+(_:Int)
f: (Int, Int) => Int = <function2>
scala> f(2,3)
res20: Int = 5
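A short sketch summarizing the placeholder rules seen above: each underscore stands for one fresh parameter, and type annotations are required whenever the parameter type cannot be inferred from context:

```scala
val nums = List(-11, -10, -5, 0, 5, 10)

val positives = nums.filter(_ > 0) // type inferred from the list's element type
println(positives)                 // List(5, 10)

val add = (_: Int) + (_: Int)      // two placeholders => a two-argument function
println(add(2, 3))                 // 5
```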
scala> def sum(a: Int, b: Int, c: Int) = a + b + c
sum: (a: Int, b: Int, c: Int)Int
scala> sum(1,3,5)
res21: Int = 9
scala> val f=sum _
f: (Int, Int, Int) => Int = <function3>
scala> sum(1,2)
<console>:9: error: not enough arguments for method sum: (a: Int, b: Int, c: Int)Int.
Unspecified value parameter c.
sum(1,2)
^
scala> f(1,2)
<console>:10: error: not enough arguments for method apply: (v1: Int, v2: Int, v3: Int)Int in trait Function3.
Unspecified value parameter v3.
f(1,2)
^
scala> f(1,2,3)
res24: Int = 6
scala> f.a
apply asInstanceOf
scala> f.apply(1,2,4)
res25: Int = 7
scala> f.apply(1,2,3)
res26: Int = 6
scala> val b = sum(1, _: Int, 3)
b: Int => Int = <function1>
scala> b(2)
res27: Int = 6
scala> b(5)
res28: Int = 9
scala> b(4)
res29: Int = 8
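A condensed sketch of the partial-application forms tried above: `sum _` leaves all parameters open and yields a Function3, while `sum(1, _: Int, 3)` fixes the first and last arguments and leaves only the middle one open:

```scala
def sum(a: Int, b: Int, c: Int): Int = a + b + c

val all = sum _                // Function3: all parameters still open
val middle = sum(1, _: Int, 3) // Function1: only b remains open

println(all(1, 2, 3)) // 6
println(middle(2))    // 6
```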
scala> someNumbers
res30: List[Int] = List(-11, -10, -5, 0, 5, 10)
scala> someNumbers.foreach(println)
-11
-10
-5
0
5
10
scala> someNumbers.filter(>0)foreach(println)
<console>:1: error: ')' expected but integer literal found.
someNumbers.filter(>0)foreach(println)
^
scala> someNumbers.filter(>0).foreach(println)
<console>:1: error: ')' expected but integer literal found.
someNumbers.filter(>0).foreach(println)
^
scala> someNumbers.filter(_>0).foreach(println)
5
10
scala> val c = sum
<console>:8: error: missing arguments for method sum in object $iw;
follow this method with `_' if you want to treat it as a partially applied function
val c = sum
^
scala> val c = sum _
c: (Int, Int, Int) => Int = <function3>
scala> c(2,3,4)
res33: Int = 9
scala> (x: Int) => x + more
<console>:8: error: not found: value more
(x: Int) => x + more
^
scala> var more = 1
more: Int = 1
scala> val f(x: Int) => x + more
<console>:1: error: '=' expected but '=>' found.
val f(x: Int) => x + more
^
scala> val f(x: Int) = x + more
<console>:10: error: value f is not a case class constructor, nor does it have an unapply/unapplySeq method
val f(x: Int) = x + more
^
scala> val f=(x: Int) => x + more
f: Int => Int = <function1>
scala> f(1)
res35: Int = 2
scala> f(3)
res36: Int = 4
scala> f(10)
res37: Int = 11
scala> more = 9999
more: Int = 9999
scala> f(10)
res38: Int = 10009
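The session above demonstrates that a closure captures the variable `more` itself, not its value at creation time; a later reassignment is visible through the already-created function. In compact form:

```scala
var more = 1
val addMore = (x: Int) => x + more // closes over the variable more

println(addMore(10)) // 11
more = 9999                        // the change is visible through the closure
println(addMore(10)) // 10009
```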
scala> someNumbers
res39: List[Int] = List(-11, -10, -5, 0, 5, 10)
scala> someNumbers.foreach(sum+=_)
<console>:10: error: missing arguments for method sum in object $iw;
follow this method with `_' if you want to treat it as a partially applied function
someNumbers.foreach(sum+=_)
^
<console>:10: error: reassignment to val
someNumbers.foreach(sum+=_)
^
scala> var sum=0;
sum: Int = 0
scala> someNumbers.foreach(sum+=_)
scala> someNumbers
res42: List[Int] = List(-11, -10, -5, 0, 5, 10)
scala> someNumbers.foreach(sum+=_)
scala> someNumbers.foreach(sum+=_)
scala> sum
res45: Int = -33
scala> someNumbers.foreach(sum+=_).println
<console>:10: error: value println is not a member of Unit
someNumbers.foreach(sum+=_).println
^
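The accumulation works because the closure passed to `foreach` mutates the captured var, but `foreach` returns Unit, so nothing can be chained after it (hence the `.println` error); note also that running the `foreach` three times tripled the total to -33. The same sum can be computed without mutation (`total` is an illustrative name):

```scala
val someNumbers = List(-11, -10, -5, 0, 5, 10)

// Mutating a captured var, as in the session:
var total = 0
someNumbers.foreach(total += _)
println(total)   // -11

// Equivalent, without mutation:
println(someNumbers.foldLeft(0)(_ + _))   // -11
println(someNumbers.sum)                  // -11
```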
scala> def echo(args:String*)
| args.foreach(println)
<console>:10: error: not found: value args
args.foreach(println)
^
<console>:7: error: only classes can have declared but undefined members
def echo(args:String*)
^
scala> val echo(args:String*)
<console>:1: error: ')' expected but identifier found.
val echo(args:String*)
^
scala> val echo(args:String*) args.foreach(println)
<console>:1: error: ')' expected but identifier found.
val echo(args:String*) args.foreach(println)
^
scala> def echo(args:String*)=args.foreach(println)
echo: (args: String*)Unit
scala> echo("hello","world")
hello
world
scala> def echo(args:String*)=for(arg<-args) println
echo: (args: String*)Unit
scala> echo("hello","world")
scala> echo("hello","world")
scala> def echo(args:String*)=for(arg<-args) println(_)
<console>:7: error: missing parameter type for expanded function ((x$1) => println(x$1))
def echo(args:String*)=for(arg<-args) println(_)
^
scala> def echo(args:String*)=for(arg<-args) println(arg)
echo: (args: String*)Unit
scala> echo("hello","world")
hello
world
scala> echo("hello","world","spark")
hello
world
spark
scala> echo()
scala> echo("hello")
hello
scala> val arr = Array("What's", "up", "doc?")
arr: Array[java.lang.String] = Array(What's, up, doc?)
scala> echo(arr)
<console>:10: error: type mismatch;
found : Array[java.lang.String]
required: String
echo(arr)
^
scala> echo(arr.toString)
[Ljava.lang.String;@614169
scala> echo(arr.toString:_*)
<console>:10: error: type mismatch;
found : java.lang.String
required: Seq[String]
echo(arr.toString:_*)
^
scala> echo(arr:_*)
What's
up
doc?
scala> arr
res59: Array[java.lang.String] = Array(What's, up, doc?)
scala> echo(arr:_)
<console>:1: error: `*' expected
echo(arr:_)
^
scala> echo(arr:_*)
What's
up
doc?
scala>
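Recapping the varargs rules exercised above: `String*` accepts zero or more String arguments, an `Array[String]` does not match by itself, and the `: _*` annotation asks the compiler to pass each element of the sequence as its own argument:

```scala
// Repeated parameter: args is a Seq[String] inside the body.
def echo(args: String*): Unit = args.foreach(println)

echo("hello", "world")     // each argument printed on its own line
echo()                     // zero arguments is also fine

val arr = Array("What's", "up", "doc?")
// echo(arr)               // type mismatch: Array[String] is not String
echo(arr: _*)              // _* expands the array into separate arguments
```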
by the input and output RC File formats.
</description>
</property>
<property>
<name>hive.exec.orc.dictionary.key.size.threshold</name>
<value>0.8</value>
<description>
If the number of keys in a dictionary is greater than this fraction of the total number of
non-null rows, turn off dictionary encoding. Use 1 to always use dictionary encoding.
</description>
</property>
<property>
<name>hive.multi.insert.move.tasks.share.dependencies</name>
<value>false</value>
<description>
If this is set, all move tasks for tables/partitions (not directories) at the end of a
multi-insert query will only begin once the dependencies for all these move tasks have been
met.
Advantages: If concurrency is enabled, the locks will only be released once the query has
finished, so with this config enabled, the time when the table/partition is
generated will be much closer to when the lock on it is released.
Disadvantages: If concurrency is not enabled, with this disabled, the tables/partitions which
are produced by this query and finish earlier will be available for querying
much earlier. Since the locks are only released once the query finishes, this
does not apply if concurrency is enabled.
</description>
</property>
<property>
<name>hive.fetch.task.conversion</name>
<value>minimal</value>
<description>
Some select queries can be converted to single FETCH task minimizing latency.
Currently the query should be single sourced, not have any subquery, and should not have
any aggregations or distincts (which incur RS), lateral views or joins.
1. minimal : SELECT STAR, FILTER on partition columns, LIMIT only
2. more : SELECT, FILTER, LIMIT only (TABLESAMPLE, virtual columns)
</description>
</property>
<property>
<name>hive.cache.expr.evaluation</name>
<value>true</value>
<description>
If true, evaluation result of deterministic expression referenced twice or more will be cached.
For example, in filter condition like ".. where key + 10 > 10 or key + 10 = 0"
"key + 10" will be evaluated/cached once and reused for following expression ("key + 10 = 0").
Currently, this is applied only to expressions in select or filter operator.
</description>
</property>
<property>
<name>hive.hmshandler.retry.attempts</name>
<value>1</value>
<description>The number of times to retry a HMSHandler call if there were a connection error</description>
</property>
<property>
<name>hive.hmshandler.retry.interval</name>
<value>1000</value>
<description>The number of milliseconds between HMSHandler retry attempts</description>
</property>
<property>
<name>hive.server.read.socket.timeout</name>
<value>10</value>
<description>Timeout for the HiveServer to close the connection if no response from the client in N seconds, defaults to 10 seconds.</description>
</property>
<property>
<name>hive.server.tcp.keepalive</name>
<value>true</value>
<description>Whether to enable TCP keepalive for the Hive server. Keepalive will prevent accumulation of half-open connections.</description>
</property>
<property>
<name>hive.server2.in.mem.logging</name>
<value>true</value>
<description>
Whether to turn on HiveServer2 in-memory logging
</description>
</property>
<property>
<name>hive.server2.in.mem.log.size</name>
<value>131072</value>
<description>
Maximum size of the hiveserver2 in memory query log. Note that the size is per query.
</description>
</property>
<property>
<name>hive.decode.partition.name</name>
<value>false</value>
<description>Whether to show the unquoted partition names in query results.</description>
</property>
<property>
<name>hive.log4j.file</name>
<value></value>
<description>Hive log4j configuration file.
If the property is not set, then logging will be initialized using hive-log4j.properties found on the classpath.
If the property is set, the value must be a valid URI (java.net.URI, e.g. "file:///tmp/my-logging.properties"), which you can then extract a URL from and pass to PropertyConfigurator.configure(URL).</description>
</property>
<property>
<name>hive.exec.log4j.file</name>
<value></value>
<description>Hive log4j configuration file for execution mode(sub command).
If the property is not set, then logging will be initialized using hive-exec-log4j.properties found on the classpath.
If the property is set, the value must be a valid URI (java.net.URI, e.g. "file:///tmp/my-logging.properties"), which you can then extract a URL from and pass to PropertyConfigurator.configure(URL).</description>
</property>
<property>
<name>hive.exec.infer.bucket.sort</name>
<value>false</value>
<description>
If this is set, when writing partitions, the metadata will include the bucketing/sorting
properties with which the data was written if any (this will not overwrite the metadata
inherited from the table if the table is bucketed/sorted)
</description>
</property>
<property>
<name>hive.exec.infer.bucket.sort.num.buckets.power.two</name>
<value>false</value>
<description>
If this is set, when setting the number of reducers for the map reduce task which writes the
final output files, it will choose a number which is a power of two, unless the user specifies
the number of reducers to use using mapred.reduce.tasks. The number of reducers
may be set to a power of two, only to be followed by a merge task meaning preventing
anything from being inferred.
With hive.exec.infer.bucket.sort set to true:
Advantages: If this is not set, the number of buckets for partitions will seem arbitrary,
which means that the number of mappers used for optimized joins, for example, will
be very low. With this set, since the number of buckets used for any partition is
a power of two, the number of mappers used for optimized joins will be the least
number of buckets used by any partition being joined.
Disadvantages: This may mean a much larger or much smaller number of reducers being used in the
final map reduce job, e.g. if a job was originally going to take 257 reducers,
it will now take 512 reducers, similarly if the max number of reducers is 511,
and a job was going to use this many, it will now use 256 reducers.
</description>
</property>
<property>
<name>hive.groupby.orderby.position.alias</name>
<value>false</value>
<description>Whether to enable using Column Position Alias in Group By or Order By</description>
</property>
<property>
<name>hive.server2.thrift.min.worker.threads</name>
<value>5</value>
<description>Minimum number of Thrift worker threads</description>
</property>
<property>
<name>hive.server2.thrift.max.worker.threads</name>
<value>500</value>
<description>Maximum number of Thrift worker threads</description>
</property>
<property>
<name>hive.server2.async.exec.threads</name>
<value>100</value>
<description>Number of threads in the async thread pool for HiveServer2</description>
</property>
<property>
<name>hive.server2.async.exec.shutdown.timeout</name>
<value>10</value>
<description>Time (in seconds) for which HiveServer2 shutdown will wait for async
threads to terminate</description>
</property>
<property>
<name>hive.server2.async.exec.keepalive.time</name>
<value>10</value>
<description>Time (in seconds) that an idle HiveServer2 async thread (from the thread pool) will wait
for a new task to arrive before terminating</description>
</property>
<property>
<name>hive.server2.async.exec.wait.queue.size</name>
<value>100</value>
<description>Size of the wait queue for async thread pool in HiveServer2.
After hitting this limit, the async thread pool will reject new requests.</description>
</property>
<property>
<name>hive.server2.thrift.port</name>
<value>10000</value>
<description>Port number of HiveServer2 Thrift interface.
Can be overridden by setting $HIVE_SERVER2_THRIFT_PORT</description>
</property>
<property>
<name>hive.server2.thrift.bind.host</name>
<value>localhost</value>
<description>Bind host on which to run the HiveServer2 Thrift interface.
Can be overridden by setting $HIVE_SERVER2_THRIFT_BIND_HOST</description>
</property>
<property>
<name>hive.server2.authentication</name>
<value>NONE</value>
<description>
Client authentication types.
NONE: no authentication check
LDAP: LDAP/AD based authentication
KERBEROS: Kerberos/GSSAPI authentication
CUSTOM: Custom authentication provider
(Use with property hive.server2.custom.authentication.class)
</description>
</property>
<property>
<name>hive.server2.custom.authentication.class</name>
<value></value>
<description>
Custom authentication class. Used when property
'hive.server2.authentication' is set to 'CUSTOM'. Provided class
must be a proper implementation of the interface
org.apache.hive.service.auth.PasswdAuthenticationProvider. HiveServer2
will call its Authenticate(user, passed) method to authenticate requests.
The implementation may optionally extend the Hadoop's
org.apache.hadoop.conf.Configured class to grab Hive's Configuration object.
</description>
</property>
<property>
<name>hive.server2.authentication.kerberos.principal</name>
<value></value>
<description>
Kerberos server principal
</description>
</property>
<property>
<name>hive.server2.authentication.kerberos.keytab</name>
<value></value>
<description>
Kerberos keytab file for server principal
</description>
</property>
<property>
<name>hive.server2.authentication.ldap.url</name>
<value></value>
<description>
LDAP connection URL
</description>
</property>
<property>
<name>hive.server2.authentication.ldap.baseDN</name>
<value></value>
<description>
LDAP base DN
</description>
</property>
<property>
<name>hive.server2.enable.doAs</name>
<value>true</value>
<description>
Setting this property to true will have hive server2 execute
hive operations as the user making the calls to it.
</description>
</property>
<property>
<name>hive.server2.table.type.mapping</name>
<value>CLASSIC</value>
<description>
This setting reflects how HiveServer will report the table types for JDBC and other
client implementations that retrieve the available tables and supported table types.
HIVE : Exposes Hive's native table types like MANAGED_TABLE, EXTERNAL_TABLE, VIRTUAL_VIEW
CLASSIC : More generic types like TABLE and VIEW
</description>
</property>
<property>
<name>hive.server2.thrift.sasl.qop</name>
<value>auth</value>
<description>Sasl QOP value; Set it to one of following values to enable higher levels of
protection for hive server2 communication with clients.
"auth" - authentication only (default)
"auth-int" - authentication plus integrity protection
"auth-conf" - authentication plus integrity and confidentiality protection
This is applicable only when HiveServer2 is configured to use Kerberos authentication.
</description>
</property>
<property>
<name>hive.metastore.schema.verification</name>
<value>false</value>
<description>
Enforce metastore schema version consistency.
True: Verify that version information stored in metastore matches with one from Hive jars. Also disable automatic
schema migration attempt. Users are required to manually migrate schema after Hive upgrade which ensures
proper metastore schema migration. (Default)
False: Warn if the version information stored in metastore doesn't match with one from in Hive jars.
</description>
</property>
<property>
<name>hive.plan.serialization.format</name>
<value>kryo</value>
<description>
Query plan format serialization between client and task nodes.
Two supported values are: kryo and javaXML. Kryo is the default.
</description>
</property>
<property>
<name>hive.server2.allow.user.substitution</name>
<value>true</value>
<description>
Allow an alternate user to be specified as part of the HiveServer2 open connection request
</description>
</property>
</configuration>
[root@cdh1 conf]#
[root@cdh1 conf]# less hive-site.xml
[root@cdh1 conf]# mysql -u root -p
Enter password:
Welcome to the MySQL monitor. Commands end with ; or \g.
Your MySQL connection id is 4
Server version: 5.1.73 Source distribution
Copyright (c) 2000, 2013, Oracle and/or its affiliates. All rights reserved.
Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.
Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.
mysql> show tables;
ERROR 1046 (3D000): No database selected
mysql> show databases;
+--------------------+
| Database |
+--------------------+
| information_schema |
| amon |
| hive |
| mysql |
| test |
+--------------------+
5 rows in set (0.23 sec)
mysql> create database hivedb;
Query OK, 1 row affected (0.00 sec)
mysql> show databases;
+--------------------+
| Database |
+--------------------+
| information_schema |
| amon |
| hive |
| hivedb |
| mysql |
| test |
+--------------------+
6 rows in set (0.00 sec)
mysql> select user();
+----------------+
| user() |
+----------------+
| root@localhost |
+----------------+
1 row in set (0.04 sec)
mysql> create user 'hiveuser' identified by 'hivepwd';
Query OK, 0 rows affected (0.00 sec)
mysql> select user();
+----------------+
| user() |
+----------------+
| root@localhost |
+----------------+
1 row in set (0.00 sec)
mysql> grant all privileges on *.* to hiveuser@"localhost" identified by "hivepwd" with grant option;
Query OK, 0 rows affected (0.00 sec)
mysql> select user();
+----------------+
| user() |
+----------------+
| root@localhost |
+----------------+
1 row in set (0.00 sec)
mysql> q
-> ;
ERROR 1064 (42000): You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'q' at line 1
mysql> Bye
[root@cdh1 conf]# mysql -u hiveuser -p
Enter password:
Welcome to the MySQL monitor. Commands end with ; or \g.
Your MySQL connection id is 6
Server version: 5.1.73 Source distribution
Copyright (c) 2000, 2013, Oracle and/or its affiliates. All rights reserved.
Oracle is a registered trademark of Oracle Corporation and/or its
affiliates. Other names may be trademarks of their respective
owners.
Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.
mysql> select user();
+--------------------+
| user() |
+--------------------+
| hiveuser@localhost |
+--------------------+
1 row in set (0.00 sec)
mysql> select user();
+--------------------+
| user() |
+--------------------+
| hiveuser@localhost |
+--------------------+
1 row in set (0.00 sec)
mysql> show databases;
+--------------------+
| Database |
+--------------------+
| information_schema |
| amon |
| hive |
| hivedb |
| mysql |
| test |
+--------------------+
6 rows in set (0.00 sec)
mysql> Bye
[root@cdh1 conf]# cd /user/local/test1
[root@cdh1 test1]# ll
total 32
-rw-r--r-- 1 root root 216 Jul 11 19:35 matchTest.scala
-rw-r--r-- 1 root root 697 Jul 11 06:30 prt1.class
-rw-r--r-- 1 root root 1066 Jul 11 06:30 prt1$.class
-rw-r--r-- 1 root root 586 Jul 11 06:29 prt1.scala
-rw-r--r-- 1 root root 594 Jul 11 06:26 prt.class
-rw-r--r-- 1 root root 1052 Jul 11 06:26 prt$.class
-rw-r--r-- 1 root root 120 Jul 11 06:19 prt.scala
-rw-r--r-- 1 root root 21 Jul 11 23:22 test.txt
[root@cdh1 test1]# vim fileList.scala
[root@cdh1 test1]# scalac fileList.scala
fileList.scala:3: error: not found: value filesHere
for (file <- filesHere)
^
one error found
[root@cdh1 test1]# vim fileList.scala
[root@cdh1 test1]# scalac fileList.scala
[root@cdh1 test1]# scala fileList.scala
./prt1.scala
./prt.scala
./fileList.scala
./matchTest.scala
[root@cdh1 test1]# ls
fileList$$anonfun$main$1.class fileList$.class matchTest.scala prt1$.class prt.class prt.scala
fileList.class fileList.scala prt1.class prt1.scala prt$.class test.txt
[root@cdh1 test1]# ls -l
total 48
-rw-r--r-- 1 root root 1130 Jul 12 04:53 fileList$$anonfun$main$1.class
-rw-r--r-- 1 root root 685 Jul 12 04:53 fileList.class
-rw-r--r-- 1 root root 968 Jul 12 04:53 fileList$.class
-rw-r--r-- 1 root root 244 Jul 12 04:53 fileList.scala
-rw-r--r-- 1 root root 216 Jul 11 19:35 matchTest.scala
-rw-r--r-- 1 root root 697 Jul 11 06:30 prt1.class
-rw-r--r-- 1 root root 1066 Jul 11 06:30 prt1$.class
-rw-r--r-- 1 root root 586 Jul 11 06:29 prt1.scala
-rw-r--r-- 1 root root 594 Jul 11 06:26 prt.class
-rw-r--r-- 1 root root 1052 Jul 11 06:26 prt$.class
-rw-r--r-- 1 root root 120 Jul 11 06:19 prt.scala
-rw-r--r-- 1 root root 21 Jul 11 23:22 test.txt
[root@cdh1 test1]# cat fileList.scala
object fileList{
//val filesHere = (new java.io.File(".")).listFiles
def main(args :Array[String]){
val filesHere = (new java.io.File(".")).listFiles
for (file <- filesHere)
if (file.getName.endsWith(".scala"))
println(file)
}
}
[root@cdh1 test1]#
[root@cdh1 test1]# ll
total 48
-rw-r--r-- 1 root root 1130 Jul 12 04:53 fileList$$anonfun$main$1.class
-rw-r--r-- 1 root root 685 Jul 12 04:53 fileList.class
-rw-r--r-- 1 root root 968 Jul 12 04:53 fileList$.class
-rw-r--r-- 1 root root 244 Jul 12 04:53 fileList.scala
-rw-r--r-- 1 root root 216 Jul 11 19:35 matchTest.scala
-rw-r--r-- 1 root root 697 Jul 11 06:30 prt1.class
-rw-r--r-- 1 root root 1066 Jul 11 06:30 prt1$.class
-rw-r--r-- 1 root root 586 Jul 11 06:29 prt1.scala
-rw-r--r-- 1 root root 594 Jul 11 06:26 prt.class
-rw-r--r-- 1 root root 1052 Jul 11 06:26 prt$.class
-rw-r--r-- 1 root root 120 Jul 11 06:19 prt.scala
-rw-r--r-- 1 root root 21 Jul 11 23:22 test.txt
[root@cdh1 test1]# mkdir a.scala
[root@cdh1 test1]# ll
total 52
drwxr-xr-x 2 root root 4096 Jul 12 04:59 a.scala
-rw-r--r-- 1 root root 1130 Jul 12 04:53 fileList$$anonfun$main$1.class
-rw-r--r-- 1 root root 685 Jul 12 04:53 fileList.class
-rw-r--r-- 1 root root 968 Jul 12 04:53 fileList$.class
-rw-r--r-- 1 root root 244 Jul 12 04:53 fileList.scala
-rw-r--r-- 1 root root 216 Jul 11 19:35 matchTest.scala
-rw-r--r-- 1 root root 697 Jul 11 06:30 prt1.class
-rw-r--r-- 1 root root 1066 Jul 11 06:30 prt1$.class
-rw-r--r-- 1 root root 586 Jul 11 06:29 prt1.scala
-rw-r--r-- 1 root root 594 Jul 11 06:26 prt.class
-rw-r--r-- 1 root root 1052 Jul 11 06:26 prt$.class
-rw-r--r-- 1 root root 120 Jul 11 06:19 prt.scala
-rw-r--r-- 1 root root 21 Jul 11 23:22 test.txt
[root@cdh1 test1]# scala fileList.scala
./prt1.scala
./prt.scala
./fileList.scala
./a.scala
./matchTest.scala
[root@cdh1 test1]# vim fileList.scala
[root@cdh1 test1]# scalac fileList.scala
fileList.scala:6: error: illegal start of simple expression
if (file.isFile);
^
one error found
[root@cdh1 test1]# vim fileList.scala
[root@cdh1 test1]# scalac fileList.scala
[root@cdh1 test1]# scala fileList.scala
./prt1.scala
./prt.scala
./fileList.scala
./matchTest.scala
[root@cdh1 test1]# cp fileList.scala fileList2.scala
[root@cdh1 test1]# vim fileList2.scala
[root@cdh1 test1]# scalac fileList2.scala
fileList2.scala:7: error: illegal start of simple pattern
if (file.isFile);
^
fileList2.scala:7: error: '<-' expected but ';' found.
if (file.isFile);
^
fileList2.scala:10: error: '<-' expected but '}' found.
}
^
fileList2.scala:11: error: illegal start of simple expression
}
^
four errors found
[root@cdh1 test1]# vim fileList2.scala
[root@cdh1 test1]# scalac fileList2.scala
[root@cdh1 test1]# scala fileList2.scala
./prt1.scala
./prt.scala
./fileList.scala
./matchTest.scala
./fileList2.scala
[root@cdh1 test1]# cat fileList2.scala
object fileList2{
//val filesHere = (new java.io.File(".")).listFiles
def main(args :Array[String]){
val filesHere = (new java.io.File(".")).listFiles
//for (file <- filesHere)
for(
file <- filesHere
if (file.isFile);
if (file.getName.endsWith(".scala"))
//println(file)
)
println(file)
}
}
[root@cdh1 test1]# cp fileList2.scala fileList3.scala
[root@cdh1 test1]# vim fileList3.scala
[root@cdh1 test1]# scalac fileList3.scala
fileList3.scala:6: error: ';' expected but '<-' found.
file <- filesHere
^
fileList3.scala:9: error: illegal start of simple pattern
if (file.isFile);
^
fileList3.scala:9: error: '<-' expected but ';' found.
if (file.isFile);
^
three errors found
[root@cdh1 test1]# vim fileList3.scala
[root@cdh1 test1]# scalac fileList3.scala
[root@cdh1 test1]# scala fileList3.scala
./prt1.scala
./prt.scala
./fileList.scala
./matchTest.scala
./fileList2.scala
./fileList3.scala
[root@cdh1 test1]# cat fileList3.scala
object fileList3{
//val filesHere = (new java.io.File(".")).listFiles
def main(args :Array[String]){
val filesHere = (new java.io.File(".")).listFiles
//for (file <- filesHere)
//file <- filesHere
for(file <- filesHere if (file.isFile); if (file.getName.endsWith(".scala")) )
println(file)
}
}
[root@cdh1 test1]# vim fileList3.scala
[root@cdh1 test1]# scalac fileList3.scala
[root@cdh1 test1]# scala fileList3.scala
./prt1.scala
./prt.scala
./fileList.scala
./matchTest.scala
./fileList2.scala
./fileList3.scala
[root@cdh1 test1]# cat fileList3.scala
object fileList3{
//val filesHere = (new java.io.File(".")).listFiles
def main(args :Array[String]){
val filesHere = (new java.io.File(".")).listFiles
//for (file <- filesHere)
//file <- filesHere
for(file <- filesHere; if (file.isFile); if (file.getName.endsWith(".scala")) )
println(file)
}
}
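The trial and error above converges on valid guard syntax: a guard is an `if cond` clause placed after a generator, and the semicolon before each `if` is optional. A testable sketch on plain strings instead of `java.io.File` (the `names` list is illustrative):

```scala
// Both of these parse: for (x <- xs; if p; if q) and for (x <- xs if p if q).
val names = Seq("prt.scala", "test.txt", "a.scala", "notes.md")

// Generator, then guard; yield collects the survivors.
val scalaFiles = for (n <- names if n.endsWith(".scala")) yield n

println(scalaFiles)   // List(prt.scala, a.scala)
```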
[root@cdh1 test1]# ll
total 100
drwxr-xr-x 2 root root 4096 Jul 12 04:59 a.scala
-rw-r--r-- 1 root root 921 Jul 12 05:17 fileList2$$anonfun$main$1.class
-rw-r--r-- 1 root root 1023 Jul 12 05:17 fileList2$$anonfun$main$2.class
-rw-r--r-- 1 root root 992 Jul 12 05:17 fileList2$$anonfun$main$3.class
-rw-r--r-- 1 root root 770 Jul 12 05:17 fileList2.class
-rw-r--r-- 1 root root 1214 Jul 12 05:17 fileList2$.class
-rw-r--r-- 1 root root 327 Jul 12 05:17 fileList2.scala
-rw-r--r-- 1 root root 921 Jul 12 05:24 fileList3$$anonfun$main$1.class
-rw-r--r-- 1 root root 1023 Jul 12 05:24 fileList3$$anonfun$main$2.class
-rw-r--r-- 1 root root 992 Jul 12 05:24 fileList3$$anonfun$main$3.class
-rw-r--r-- 1 root root 770 Jul 12 05:24 fileList3.class
-rw-r--r-- 1 root root 1214 Jul 12 05:24 fileList3$.class
-rw-r--r-- 1 root root 312 Jul 12 05:24 fileList3.scala
-rw-r--r-- 1 root root 1166 Jul 12 05:02 fileList$$anonfun$main$1.class
-rw-r--r-- 1 root root 685 Jul 12 05:02 fileList.class
-rw-r--r-- 1 root root 968 Jul 12 05:02 fileList$.class
-rw-r--r-- 1 root root 270 Jul 12 05:02 fileList.scala
-rw-r--r-- 1 root root 216 Jul 11 19:35 matchTest.scala
-rw-r--r-- 1 root root 697 Jul 11 06:30 prt1.class
-rw-r--r-- 1 root root 1066 Jul 11 06:30 prt1$.class
-rw-r--r-- 1 root root 586 Jul 11 06:29 prt1.scala
-rw-r--r-- 1 root root 594 Jul 11 06:26 prt.class
-rw-r--r-- 1 root root 1052 Jul 11 06:26 prt$.class
-rw-r--r-- 1 root root 120 Jul 11 06:19 prt.scala
-rw-r--r-- 1 root root 21 Jul 11 23:22 test.txt
[root@cdh1 test1]# less matchTest.scala
[root@cdh1 test1]# vim matchTest.scala
[root@cdh1 test1]# scalac matchTest.scala
[root@cdh1 test1]# scala matchTest.scala
huh?
[root@cdh1 test1]# scala matchTest.scala chips
salsa
[root@cdh1 test1]# scala matchTest.scala salt
pepper
[root@cdh1 test1]# scala matchTest.scala eggs
bacon
[root@cdh1 test1]# scala matchTest.scala
huh?
[root@cdh1 test1]# cat matchTest.scala
object matchTest{
def main(args:Array[String]){
val firstArg =
if(args.length > 0)
args(0)
else
""
firstArg match {
case "salt" => println("pepper")
case "chips" => println("salsa")
case "eggs" => println("bacon")
case _ => println("huh?") }
}
}
[root@cdh1 test1]# scala matchTest.scala eggs salt
bacon
[root@cdh1 test1]# scala matchTest.scala matchTest.
matchTest.class matchTest.scala
[root@cdh1 test1]# cp matchTest.scala matchTest2.scala
[root@cdh1 test1]# vim matchTest2.scala
[1]+ Stopped vim matchTest2.scala
[root@cdh1 test1]# vim matchTest2.scala
[root@cdh1 test1]# scalac matchTest2.scala
[root@cdh1 test1]# scala matchTest2.scala eggs salt
e_bacon
[root@cdh1 test1]# scala matchTest2.scala salt
s_pepper
[root@cdh1 test1]# scala matchTest2.scala chips
c_salsa
[root@cdh1 test1]# scala matchTest2.scala chipsss
o_huh?
[root@cdh1 test1]# scala matchTest2.scala
o_huh?
[root@cdh1 test1]# cat matchTest2.scala
object matchTest2{
def main(args:Array[String]){
val firstArg =
if(args.length > 0)
args(0)
else
""
var friend=
firstArg match {
case "salt" => "s_pepper"
case "chips" => "c_salsa"
case "eggs" => "e_bacon"
case _ => "o_huh?" }
println(friend)
}
}
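Because `match` is an expression, matchTest2's var-plus-println pattern can be folded into a method that returns the matched value directly. A sketch (`friendOf` is an illustrative name):

```scala
def friendOf(food: String): String = food match {
  case "salt"  => "pepper"
  case "chips" => "salsa"
  case "eggs"  => "bacon"
  case _       => "huh?"   // wildcard: matches anything else
}

println(friendOf("eggs"))   // bacon
println(friendOf("tofu"))   // huh?
```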
[root@cdh1 test1]# scala
Welcome to Scala version 2.9.3 (Java HotSpot(TM) Client VM, Java 1.7.0_67).
Type in expressions to have them evaluated.
Type :help for more information.
scala> var increase = (x: Int) => x + 1
increase: Int => Int = <function1>
scala> increase 10
<console>:1: error: ';' expected but integer literal found.
increase 10
^
scala> increase (10)
res0: Int = 11
scala>
scala> increase (10)
res1: Int = 11
scala> var increase = (x: Int) => x + 9999
increase: Int => Int = <function1>
scala> increase (10)
res2: Int = 10009
scala> val increase = (x: Int) => x + 1
increase: Int => Int = <function1>
scala> increase (10)
res3: Int = 11
scala> increase = (x: Int) => x + 9999
<console>:8: error: reassignment to val
increase = (x: Int) => x + 9999
^
scala> var increase = (x: Int) => x + 1
increase: Int => Int = <function1>
scala> increase = (x: Int) => x + 9999
increase: Int => Int = <function1>
scala> increase (10)
res4: Int = 10009
scala> increase = (x: Int) => { println("We") println("are") println("here!") x + 1 }
<console>:8: error: value println is not a member of Unit
increase = (x: Int) => { println("We") println("are") println("here!") x + 1 }
^
scala> increase = (x: Int) => { println("We") println("are") println("here!") x + 1 }
<console>:8: error: value println is not a member of Unit
increase = (x: Int) => { println("We") println("are") println("here!") x + 1 }
^
scala> increase = (x: Int) => { println("We") ;println("are"); println("here!"); x + 1 }
increase: Int => Int = <function1>
scala> increase (10)
We
are
here!
res5: Int = 11
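The parse errors above come from missing statement separators: inside a block-bodied function literal, statements must be separated by semicolons or newlines, and the last expression is the block's result. The multi-line form avoids the semicolons entirely:

```scala
val increase = (x: Int) => {
  println("We")
  println("are")
  println("here!")
  x + 1        // last expression is the function's result
}

println(increase(10))   // prints the three lines, then 11
```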
scala> val someNumbers = List(-11, -10, -5, 0, 5, 10)
someNumbers: List[Int] = List(-11, -10, -5, 0, 5, 10)
scala> someNumbers.f
filter filterNot find findIndexOf findLastIndexOf first firstOption
flatMap flatten fold foldLeft foldRight forall foreach
scala> someNumbers.foreach((x:Int)->println(x))
<console>:9: error: not found: value x
someNumbers.foreach((x:Int)->println(x))
^
scala> someNumbers.foreach((x:Int)=>println(x))
-11
-10
-5
0
5
10
scala> someNumbers
res8: List[Int] = List(-11, -10, -5, 0, 5, 10)
scala> someNumbers.filter
filter filterNot
scala> someNumbers.filter((x:Int)=>x>0)
res9: List[Int] = List(5, 10)
scala> someNumbers.filter((x:Int)=>x>0).foreach((_:Int)=>println(_))
<console>:9: error: missing parameter type for expanded function ((x$2) => println(x$2))
someNumbers.filter((x:Int)=>x>0).foreach((_:Int)=>println(_))
^
scala> someNumbers.filter((x:Int)=>x>0).foreach((x:Int)=>println(x))
5
10
scala> someNumbers.filter((x:Int)=>x>0).foreach((y:Int)=>println(y))
5
10
scala> someNumbers.filter((x)=>x>0)
res13: List[Int] = List(5, 10)
scala> someNumbers.filter(x=>x>0)
res14: List[Int] = List(5, 10)
scala> someNumbers.filter(x=>x>0).foreach((y:Int)=>println(y))
5
10
scala> someNumbers.filter(_>0).foreach(println(_))
5
10
scala> someNumbers.filter(_>0)
res17: List[Int] = List(5, 10)
scala> someNumbers.filter(_<0)
res18: List[Int] = List(-11, -10, -5)
scala> someNumbers.filter(_<0).foreach(println(_))
-11
-10
-5
scala>
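The forms tried above are all equivalent ways of writing the same predicate, from fully explicit to placeholder shorthand; the compiler infers the omitted types from `filter`'s signature:

```scala
val someNumbers = List(-11, -10, -5, 0, 5, 10)

someNumbers.filter((x: Int) => x > 0)   // fully explicit
someNumbers.filter(x => x > 0)          // parameter type inferred
someNumbers.filter(_ > 0)               // placeholder shorthand

someNumbers.filter(_ > 0).foreach(println(_))   // prints 5 then 10
```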
scala> val f =_+_
<console>:7: error: missing parameter type for expanded function ((x$1, x$2) => x$1.$plus(x$2))
val f =_+_
^
<console>:7: error: missing parameter type for expanded function ((x$1: <error>, x$2) => x$1.$plus(x$2))
val f =_+_
^
scala> val f =(_:Int)+(_:InT)
<console>:7: error: not found: type InT
val f =(_:Int)+(_:InT)
^
scala> val f =(_:Int)+(_:Int)
f: (Int, Int) => Int = <function2>
scala> f(2,3)
res20: Int = 5
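The two failed attempts show why: each underscore stands for exactly one parameter, and without an expected type the compiler cannot infer the parameter types. Either annotate each placeholder or give the val an explicit function type (`add`/`add2` are illustrative names):

```scala
// Annotate each placeholder in place:
val add = (_: Int) + (_: Int)     // (Int, Int) => Int
println(add(2, 3))   // 5

// Or supply an expected type and drop the annotations:
val add2: (Int, Int) => Int = _ + _
println(add2(2, 3))  // 5
```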
scala> def sum(a: Int, b: Int, c: Int) = a + b + c
sum: (a: Int, b: Int, c: Int)Int
scala> sum(1,3,5)
res21: Int = 9
scala> val f=sum _
f: (Int, Int, Int) => Int = <function3>
scala> sum(1,2)
<console>:9: error: not enough arguments for method sum: (a: Int, b: Int, c: Int)Int.
Unspecified value parameter c.
sum(1,2)
^
scala> f(1,2)
<console>:10: error: not enough arguments for method apply: (v1: Int, v2: Int, v3: Int)Int in trait Function3.
Unspecified value parameter v3.
f(1,2)
^
scala> f(1,2,3)
res24: Int = 6
scala> f.a
apply asInstanceOf
scala> f.apply(1,2,4)
res25: Int = 7
scala> f.apply(1,2,3)
res26: Int = 6
scala> val b = sum(1, _: Int, 3)
b: Int => Int = <function1>
scala> b(2)
res27: Int = 6
scala> b(5)
res28: Int = 9
scala> b(4)
res29: Int = 8
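The `sum` experiments above show partially applied functions: `sum _` wraps the whole method in a `Function3` value, while supplying some arguments (`sum(1, _: Int, 3)`) yields a `Function1`. A minimal sketch of both forms (object name `PartialDemo` is invented):

```scala
object PartialDemo extends App {
  def sum(a: Int, b: Int, c: Int) = a + b + c

  // `sum _` turns the method into a Function3 value;
  // f(1, 2, 3) is sugar for f.apply(1, 2, 3)
  val f = sum _
  println(f(1, 2, 3))           // 6

  // supplying two of the three arguments yields an Int => Int
  val b = sum(1, _: Int, 3)
  println(b(2))                 // 6: 1 + 2 + 3
}
```

This is why `f(1, 2)` fails with "not enough arguments for method apply ... in trait Function3": the function value still expects all three parameters.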
scala> someNumbers
res30: List[Int] = List(-11, -10, -5, 0, 5, 10)
scala> someNumbers.foreach(println)
-11
-10
-5
0
5
10
scala> someNumbers.filter(>0)foreach(println)
<console>:1: error: ')' expected but integer literal found.
someNumbers.filter(>0)foreach(println)
^
scala> someNumbers.filter(>0).foreach(println)
<console>:1: error: ')' expected but integer literal found.
someNumbers.filter(>0).foreach(println)
^
scala> someNumbers.filter(_>0).foreach(println)
5
10
scala> val c = sum
<console>:8: error: missing arguments for method sum in object $iw;
follow this method with `_' if you want to treat it as a partially applied function
val c = sum
^
scala> val c = sum _
c: (Int, Int, Int) => Int = <function3>
scala> c(2,3,4)
res33: Int = 9
scala> (x: Int) => x + more
<console>:8: error: not found: value more
(x: Int) => x + more
^
scala> var more = 1
more: Int = 1
scala> val f(x: Int) => x + more
<console>:1: error: '=' expected but '=>' found.
val f(x: Int) => x + more
^
scala> val f(x: Int) = x + more
<console>:10: error: value f is not a case class constructor, nor does it have an unapply/unapplySeq method
val f(x: Int) = x + more
^
scala> val f=(x: Int) => x + more
f: Int => Int = <function1>
scala> f(1)
res35: Int = 2
scala> f(3)
res36: Int = 4
scala> f(10)
res37: Int = 11
scala> more = 9999
more: Int = 9999
scala> f(10)
res38: Int = 10009
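The `more` example demonstrates that a Scala closure captures the *variable* itself, not its value at creation time, so a later reassignment is visible through the closure. Recapped as a standalone sketch (object name `ClosureDemo` is invented):

```scala
object ClosureDemo extends App {
  var more = 1
  val f = (x: Int) => x + more  // closes over the variable `more`

  println(f(10))                // 11

  more = 9999                   // the closure sees the reassignment
  println(f(10))                // 10009
}
```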
scala> someNumbers
res39: List[Int] = List(-11, -10, -5, 0, 5, 10)
scala> someNumbers.foreach(sum+=_)
<console>:10: error: missing arguments for method sum in object $iw;
follow this method with `_' if you want to treat it as a partially applied function
someNumbers.foreach(sum+=_)
^
<console>:10: error: reassignment to val
someNumbers.foreach(sum+=_)
^
scala> var sum=0;
sum: Int = 0
scala> someNumbers.foreach(sum+=_)
scala> someNumbers
res42: List[Int] = List(-11, -10, -5, 0, 5, 10)
scala> someNumbers.foreach(sum+=_)
scala> someNumbers.foreach(sum+=_)
scala> sum
res45: Int = -33
scala> someNumbers.foreach(sum+=_).println
<console>:10: error: value println is not a member of Unit
someNumbers.foreach(sum+=_).println
^
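The running total above ends at `-33` because `foreach(sum += _)` was executed three times over a list whose elements total `-11`, and each pass adds another `-11` into the same `var`. Also, `foreach` returns `Unit`, which is why `.println` on its result fails. A sketch of the accumulation and a side-effect-free alternative:

```scala
object SumDemo extends App {
  val someNumbers = List(-11, -10, -5, 0, 5, 10)

  var total = 0
  someNumbers.foreach(total += _) // side-effecting accumulation
  println(total)                  // -11; each extra pass would add -11 again

  // the built-in sum avoids the mutable var entirely
  println(someNumbers.sum)        // -11
}
```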
scala> def echo(args:String*)
| args.foreach(println)
<console>:10: error: not found: value args
args.foreach(println)
^
<console>:7: error: only classes can have declared but undefined members
def echo(args:String*)
^
scala> val echo(args:String*)
<console>:1: error: ')' expected but identifier found.
val echo(args:String*)
^
scala> val echo(args:String*) args.foreach(println)
<console>:1: error: ')' expected but identifier found.
val echo(args:String*) args.foreach(println)
^
scala> def echo(args:String*)=args.foreach(println)
echo: (args: String*)Unit
scala> echo("hello","world")
hello
world
scala> def echo(args:String*)=for(arg<-args) println
echo: (args: String*)Unit
scala> echo("hello","world")
scala> echo("hello","world")
scala> def echo(args:String*)=for(arg<-args) println(_)
<console>:7: error: missing parameter type for expanded function ((x$1) => println(x$1))
def echo(args:String*)=for(arg<-args) println(_)
^
scala> def echo(args:String*)=for(arg<-args) println(arg)
echo: (args: String*)Unit
scala> echo("hello","world")
hello
world
scala> echo("hello","world","spark")
hello
world
spark
scala> echo()
scala> echo("hello")
hello
scala> val arr = Array("What's", "up", "doc?")
arr: Array[java.lang.String] = Array(What's, up, doc?)
scala> echo(arr)
<console>:10: error: type mismatch;
found : Array[java.lang.String]
required: String
echo(arr)
^
scala> echo(arr.toString)
[Ljava.lang.String;@614169
scala> echo(arr.toString:_*)
<console>:10: error: type mismatch;
found : java.lang.String
required: Seq[String]
echo(arr.toString:_*)
^
scala> echo(arr:_*)
What's
up
doc?
scala> arr
res59: Array[java.lang.String] = Array(What's, up, doc?)
scala> echo(arr:_)
<console>:1: error: `*' expected
echo(arr:_)
^
scala> echo(arr:_*)
What's
up
doc?
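The `echo` session covers repeated parameters: inside the method, `args: String*` is a `Seq[String]`, and an `Array` must be expanded element-by-element with the `: _*` ascription; passing `arr` directly is a type mismatch, and `arr.toString` just prints the array's JVM identity string. A standalone recap (object name `EchoDemo` is invented):

```scala
object EchoDemo extends App {
  // a repeated parameter is seen as Seq[String] inside the method body
  def echo(args: String*) = for (arg <- args) println(arg)

  echo("hello", "world")
  echo()                         // zero arguments are allowed

  val arr = Array("What's", "up", "doc?")
  echo(arr: _*)                  // `: _*` passes each element as its own argument
}
```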
scala>