// A small collection of Javadoc notes -- come back and look things up here when you forget.
Java -- Hadoop section
/**
 * This class processes raw DNS logs: it computes the average response
 * latency for the given domain names.
 *
 * @param Input
 * @param Output
 * @param cacheUriListfilePath
 * @param cacheIpNetTypefilePath
 *
 * <p>[The cache files must be uploaded to HDFS; each file is in K-V form,
 * with multiple V values separated by ;]</p>
 *
 * <p><b>NOTE:</b> this class is suitable for latency statistics over a limited
 * set of domain names. It is not suitable for averaging the latency of all
 * domains, because the reducer class aggregates into an in-memory collection,
 * and covering all domains would exhaust memory.</p>
 */
<p> starts a separate paragraph (note the difference from <br>, which only breaks the line without starting a new paragraph)
<b> renders the text in bold
@param marks a parameter
@author yanghl marks the author
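As a working crib for the tags above, here is a small compilable sketch (the class and method names are made up for illustration); note the space between @param and the parameter name, which the garbled original lost:

```java
/**
 * Computes simple latency statistics.
 *
 * <p><b>NOTE:</b> this sketch keeps all samples in memory.</p>
 *
 * @author yanghl
 */
public class LatencyStats {

    /**
     * Returns the average of the given response latencies.
     *
     * <p>An empty input yields <code>0.0</code>.</p>
     *
     * @param latenciesMs response latencies in milliseconds
     * @return the arithmetic mean, or 0.0 if the array is empty
     */
    public static double average(long[] latenciesMs) {
        if (latenciesMs.length == 0) {
            return 0.0;
        }
        long sum = 0;
        for (long v : latenciesMs) {
            sum += v;
        }
        return (double) sum / latenciesMs.length;
    }
}
```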
/**
 * This is an example Aggregated Hadoop Map/Reduce application. Computes the
 * histogram of the words in the input texts.
 *
 * To run: bin/hadoop jar hadoop-*-examples.jar aggregatewordhist <i>in-dir</i>
 * <i>out-dir</i> numOfReducers textinputformat
 */
<i> renders in italics -- used here to mark a path/argument
/**
 * Creates a <code>Statement</code> object for sending
 * SQL statements to the database.
 * SQL statements without parameters are normally
 * executed using <code>Statement</code> objects. If the same SQL statement
 * is executed many times, it may be more efficient to use a
 * <code>PreparedStatement</code> object.
 * <P>
 * Result sets created using the returned <code>Statement</code>
 * object will by default be type <code>TYPE_FORWARD_ONLY</code>
 * and have a concurrency level of <code>CONCUR_READ_ONLY</code>.
 * The holdability of the created result sets can be determined by
 * calling {@link #getHoldability}.
 *
 * @return a new default <code>Statement</code> object
 * @exception SQLException if a database access error occurs
 * or this method is called on a closed connection
 */
@return marks the return value
@exception marks a thrown exception
{@link #getHoldability} adds a link; getHoldability() is a method of the same class
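A minimal sketch of @return, @exception and {@link #...} together (all names here are hypothetical, chosen only to mirror the "closed connection" pattern above):

```java
/** Minimal demo of @return, @exception and {@link} in method docs. */
public class Registry {

    private boolean closed = false;

    /** Closes this registry; subsequent calls to {@link #lookup(String)} fail. */
    public void close() {
        closed = true;
    }

    /**
     * Looks up a value by key.
     *
     * @param key the key to resolve
     * @return the resolved value
     * @exception IllegalStateException if {@link #close()} was already called
     */
    public String lookup(String key) {
        if (closed) {
            throw new IllegalStateException("registry is closed");
        }
        return "value-for-" + key;
    }
}
```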
/** Holds a <url, referrer, time > tuple */
static class AccessRecord implements Writable, DBWritable {.....}
========================
The < > here shows another use of angle brackets in a comment: a literal tuple, not an HTML tag.
========================
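Worth noting: Javadoc output is HTML, so a raw <url, referrer, time> is parsed as a tag and vanishes from the rendered page. A sketch of the two safe spellings, {@literal ...} and &lt;...&gt; (the class here is a made-up stand-in for AccessRecord):

```java
/**
 * Holds a {@literal <url, referrer, time>} tuple.
 *
 * <p>Alternative spelling: &lt;url, referrer, time&gt;.
 * Either form keeps the angle brackets visible in the generated HTML.</p>
 */
public class AccessTuple {
    public final String url;
    public final String referrer;
    public final long time;

    public AccessTuple(String url, String referrer, long time) {
        this.url = url;
        this.referrer = referrer;
        this.time = time;
    }
}
```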
/**
 * <P>The basic service for managing a set of JDBC drivers.<br>
 * <B>NOTE:</B> The {@link DataSource} interface, new in the
 * JDBC 2.0 API, provides another way to connect to a data source.
 * The use of a <code>DataSource</code> object is the preferred means of
 * connecting to a data source.
 *
 * <P>As part of its initialization, the <code>DriverManager</code> class will
 * attempt to load the driver classes referenced in the "jdbc.drivers"
 * system property. This allows a user to customize the JDBC Drivers
 * used by their applications. For example in your
 * ~/.hotjava/properties file you might specify:
 * <pre>
 * <code>jdbc.drivers=foo.bah.Driver:wombat.sql.Driver:bad.taste.ourDriver</code>
 * </pre>
 */
===============================
The [br, P, pre, code, B] tags:
<br> breaks the line
<P> starts a new paragraph
<pre> preserves a preformatted block
<code> renders in code font
<B> renders in bold
===============================
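A compilable sketch that exercises all five tags in one comment, DriverManager-style (the class and its property string are made up for illustration):

```java
/**
 * <P>Demonstrates the common HTML tags inside a doc comment.<br>
 * <B>NOTE:</B> real docs such as <code>DriverManager</code> combine them freely.
 *
 * <P>A preformatted block keeps its line breaks and spacing:
 * <pre>
 * jdbc.drivers=foo.bah.Driver:wombat.sql.Driver
 * </pre>
 */
public class TagDemo {

    /** Splits a colon-separated driver list, as in the <code>jdbc.drivers</code> property. */
    public static String[] splitDrivers(String prop) {
        return prop.split(":");
    }
}
```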
/**
 * A tool interface that supports handling of generic command-line options.
 *
 * <p><code>Tool</code>, is the standard for any Map-Reduce tool/application.
 * The tool/application should delegate the handling of
 * standard command-line options to {@link ToolRunner#run(Tool, String[])}
 * and only handle its custom arguments.</p>
 *
 * <p>Here is how a typical <code>Tool</code> is implemented:</p>
 * <p><blockquote><pre>
 *     public class MyApp extends Configured implements Tool {
 *
 *       public int run(String[] args) throws Exception {
 *         // <code>Configuration</code> processed by <code>ToolRunner</code>
 *         Configuration conf = getConf();
 *
 *         // Create a JobConf using the processed <code>conf</code>
 *         JobConf job = new JobConf(conf, MyApp.class);
 *
 *         // Process custom command-line options
 *         Path in = new Path(args[1]);
 *         Path out = new Path(args[2]);
 *
 *         // Specify various job-specific parameters
 *         job.setJobName("my-app");
 *         job.setInputPath(in);
 *         job.setOutputPath(out);
 *
 *         job.setMapperClass(MyMapper.class);
 *         job.setReducerClass(MyReducer.class);
 *
 *         // Submit the job, then poll for progress until the job is complete
 *         JobClient.runJob(job);
 *         return 0;
 *       }
 *
 *       public static void main(String[] args) throws Exception {
 *         // Let <code>ToolRunner</code> handle generic command-line options
 *         int res = ToolRunner.run(new Configuration(), new MyApp(), args);
 *
 *         System.exit(res);
 *       }
 *     }
 * </pre></blockquote></p>
 *
 * @see GenericOptionsParser
 * @see ToolRunner
 */
@InterfaceAudience.Public
@InterfaceStability.Stable
public interface Tool extends Configurable {
You can also draw tables and lists inside code comments (with the HTML table/list tags)
/**
 * Rounding mode to round away from zero.  Always increments the
 * digit prior to a non-zero discarded fraction.  Note that this
 * rounding mode never decreases the magnitude of the calculated
 * value.
 *
 * <p>Example:
 * <table border>
 * <tr valign=top><th>Input Number</th>
 *     <th>Input rounded to one digit<br> with {@code UP} rounding
 * <tr align=right><td>5.5</td>  <td>6</td>
 * <tr align=right><td>2.5</td>  <td>3</td>
 * <tr align=right><td>1.6</td>  <td>2</td>
 * <tr align=right><td>1.1</td>  <td>2</td>
 * <tr align=right><td>1.0</td>  <td>1</td>
 * <tr align=right><td>-1.0</td> <td>-1</td>
 * <tr align=right><td>-1.1</td> <td>-2</td>
 * <tr align=right><td>-1.6</td> <td>-2</td>
 * <tr align=right><td>-2.5</td> <td>-3</td>
 * <tr align=right><td>-5.5</td> <td>-6</td>
 * </table>
 */
You can also [embed code or configuration-file content] in a code comment with {@code xxxx....}
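A compilable sketch of both {@code} forms. Inside {@code ...} nothing is interpreted as HTML or as a nested Javadoc tag, so generics like Map<String, Integer> need no &lt; escaping (the class and method here are made up for illustration):

```java
import java.util.HashMap;
import java.util.Map;

/**
 * Inline form: pass {@code null} to reset. Generics such as
 * {@code Map<String, Integer>} need no escaping inside the tag.
 *
 * <p>Block form pairs {@code} with <pre>:
 * <pre>{@code
 * Map<String, Integer> counts = CodeTagDemo.count("a", "b", "a");
 * }</pre>
 */
public class CodeTagDemo {

    /** Counts occurrences of each word. */
    public static Map<String, Integer> count(String... words) {
        Map<String, Integer> counts = new HashMap<>();
        for (String w : words) {
            counts.merge(w, 1, Integer::sum);
        }
        return counts;
    }
}
```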
/**
 * This program uses map/reduce to just run a distributed job where there is
 * no interaction between the tasks and each task write a large unsorted
 * random binary sequence file of BytesWritable.
 * In order for this program to generate data for terasort with 10-byte keys
 * and 90-byte values, have the following config:
 * <pre>{@code
 * <?xml version="1.0"?>
 * <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
 * <configuration>
 *   <property>
 *     <name>mapreduce.randomwriter.minkey</name>
 *     <value>10</value>
 *   </property>
 *   <property>
 *     <name>mapreduce.randomwriter.maxkey</name>
 *     <value>10</value>
 *   </property>
 *   <property>
 *     <name>mapreduce.randomwriter.minvalue</name>
 *     <value>90</value>
 *   </property>
 *   <property>
 *     <name>mapreduce.randomwriter.maxvalue</name>
 *     <value>90</value>
 *   </property>
 *   <property>
 *     <name>mapreduce.randomwriter.totalbytes</name>
 *     <value>1099511627776</value>
 *   </property>
 * </configuration>}</pre>
 *
 * Equivalently, {@link RandomWriter} also supports all the above options
 * and ones supported by {@link GenericOptionsParser} via the command-line.
 */