Generated by JDiff

Package org.apache.hadoop.mapred

Removed Classes
JobShell Provides command-line parsing for job submission; a submission looks like: hadoop jar -libjars <jars> -archives <archives> -files <files> inputjar args (see the sketch after this list).
StatusHttpServer A mapred HTTP server.
StatusHttpServer.TaskGraphServlet The servlet that outputs SVG graphics for map/reduce task statuses.
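With JobShell removed, the usual way to get the same -libjars/-files/-archives handling in the old API is to let ToolRunner and GenericOptionsParser absorb the generic options before the driver sees its arguments. The sketch below illustrates that path under that assumption; MyDriver, the job name, and the path arguments are made-up placeholders, not part of this report.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.conf.Configured;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.mapred.FileInputFormat;
    import org.apache.hadoop.mapred.FileOutputFormat;
    import org.apache.hadoop.mapred.JobClient;
    import org.apache.hadoop.mapred.JobConf;
    import org.apache.hadoop.util.Tool;
    import org.apache.hadoop.util.ToolRunner;

    /**
     * Sketch of a driver whose generic options (-libjars, -files, -archives)
     * are parsed by ToolRunner/GenericOptionsParser rather than a dedicated
     * shell class. Invoked roughly as:
     *   hadoop jar myjob.jar MyDriver -libjars extra.jar -files side.txt in out
     */
    public class MyDriver extends Configured implements Tool {
      public int run(String[] args) throws Exception {
        // args now holds only the leftover arguments (input/output paths);
        // the generic options are already folded into getConf().
        JobConf job = new JobConf(getConf(), MyDriver.class);
        job.setJobName("generic-options-sketch");   // illustrative name
        FileInputFormat.setInputPaths(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        JobClient.runJob(job);   // submit and wait for completion
        return 0;
      }

      public static void main(String[] args) throws Exception {
        System.exit(ToolRunner.run(new Configuration(), new MyDriver(), args));
      }
    }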
 

Added Classes and Interfaces
MapReducePolicyProvider PolicyProvider for Map-Reduce protocols.
RawKeyValueIterator RawKeyValueIterator is an iterator used to iterate over the raw keys and values during sort/merge of intermediate data (see the sketch after this list).
TIPStatus The states of a TaskInProgress as seen by the JobTracker.
TaskGraphServlet The servlet that outputs SVG graphics for map/reduce task statuses.
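Of these additions, RawKeyValueIterator is the one most likely to be implemented by user-level code, for example in tests that feed canned intermediate data to reduce-side logic. The sketch below serves pre-serialized records through it; it assumes the 0.20-era shape of the interface (next()/getKey()/getValue() over reusable DataInputBuffers, plus close() and getProgress()), and InMemoryRawIterator/RawRecord are illustrative names, not classes from this release.

    import java.io.IOException;
    import java.util.Iterator;
    import java.util.List;
    import org.apache.hadoop.io.DataInputBuffer;
    import org.apache.hadoop.mapred.RawKeyValueIterator;
    import org.apache.hadoop.util.Progress;

    /**
     * Sketch: serves already-serialized key/value pairs through the
     * RawKeyValueIterator interface, the way merged map output is handed
     * to the reduce side.
     */
    public class InMemoryRawIterator implements RawKeyValueIterator {
      /** A pre-serialized record: raw key bytes and raw value bytes. */
      public static class RawRecord {
        final byte[] key;
        final byte[] value;
        public RawRecord(byte[] key, byte[] value) { this.key = key; this.value = value; }
      }

      private final Iterator<RawRecord> records;
      private final DataInputBuffer keyBuf = new DataInputBuffer();
      private final DataInputBuffer valBuf = new DataInputBuffer();
      private final Progress progress = new Progress();

      public InMemoryRawIterator(List<RawRecord> data) {
        this.records = data.iterator();
      }

      public DataInputBuffer getKey() throws IOException { return keyBuf; }

      public DataInputBuffer getValue() throws IOException { return valBuf; }

      public boolean next() throws IOException {
        if (!records.hasNext()) {
          return false;
        }
        RawRecord r = records.next();
        // Point the reusable buffers at the next record's raw bytes.
        keyBuf.reset(r.key, 0, r.key.length);
        valBuf.reset(r.value, 0, r.value.length);
        return true;
      }

      public void close() throws IOException { /* nothing to release */ }

      public Progress getProgress() { return progress; }
    }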
 

Changed Classes and Interfaces
ClusterStatus Status information on the current state of the Map-Reduce cluster.
Counters A set of named counters.
Counters.Counter A counter record comprising its name and value.
Counters.Group A group of counters, comprising counters from a particular counter Enum class.
FileInputFormat A base class for file-based InputFormats.
FileOutputFormat A base class for OutputFormats.
FileSplit A section of an input file.
ID A general identifier which internally stores the id as an integer.
InputFormat InputFormat describes the input-specification for a Map-Reduce job.
InputSplit InputSplit represents the data to be processed by an individual Mapper.
InvalidInputException This class wraps a list of problems with the input so that the user can get a list of problems together instead of finding and fixing them one by one.
IsolationRunner  
JobClient JobClient is the primary interface for the user job to interact with the JobTracker. JobClient provides facilities to submit jobs, track their progress, access component tasks' reports and logs, get the Map-Reduce cluster status information, etc. (A minimal job driver using these classes is sketched at the end of this list.)
JobClient.TaskStatusFilter  
JobConf A map/reduce job configuration.
JobContext Deprecated; use org.apache.hadoop.mapreduce.JobContext instead.
JobHistory.JobInfo Helper class for logging or reading back events related to job start, finish or failure.
JobHistory.Keys Job history files contain key="value" pairs where keys belong to this enum.
JobHistory.Listener Callback interface for reading back log events from JobHistory.
JobHistory.RecordTypes Record types are identifiers for each line of log in history files.
JobHistory.Task Helper class for logging or reading back events related to a Task's start, finish or failure.
JobHistory.Values This enum contains some of the values commonly used by history log events.
JobID JobID represents the immutable and unique identifier for the job.
JobPriority Used to describe the priority of the running job.
JobProfile A JobProfile is a MapReduce primitive.
JobTracker JobTracker is the central location for submitting and tracking MR jobs in a network environment.
JobTracker.State  
KeyValueLineRecordReader This class treats a line in the input as a key/value pair separated by a separator character.
KeyValueTextInputFormat An InputFormat for plain text files.
LineRecordReader Treats keys as offset in file and value as line.
MapFileOutputFormat An OutputFormat that writes MapFiles.
MapRunnable Expert: Generic interface for Mappers.
MapRunner Default MapRunnable implementation.
Mapper Maps input key/value pairs to a set of intermediate key/value pairs.
MultiFileInputFormat An abstract InputFormat that returns MultiFileSplits in its getSplits(JobConf, int) method.
MultiFileSplit A sub-collection of input files.
OutputCollector Collects the <key, value> pairs output by Mappers and Reducers.
OutputCommitter OutputCommitter describes the commit of task output for a Map-Reduce job.
OutputFormat OutputFormat describes the output-specification for a Map-Reduce job.
Partitioner Partitions the key space.
RecordReader RecordReader reads <key, value> pairs from an InputSplit.
RecordWriter RecordWriter writes the output <key, value> pairs to an output file.
Reducer Reduces a set of intermediate values which share a key to a smaller set of values.
Reporter A facility for Map-Reduce applications to report progress and update counters, status information, etc.
RunningJob RunningJob is the user-interface to query for details on a running Map-Reduce job.
SequenceFileAsBinaryInputFormat InputFormat reading keys, values from SequenceFiles in binary (raw) format.
SequenceFileAsBinaryInputFormat.SequenceFileAsBinaryRecordReader Read records from a SequenceFile as binary (raw) bytes.
SequenceFileAsBinaryOutputFormat An OutputFormat that writes keys, values to SequenceFiles in binary (raw) format.
SequenceFileAsTextInputFormat This class is similar to SequenceFileInputFormat, except it generates SequenceFileAsTextRecordReader, which converts the input keys and values to their String forms by calling the toString() method.
SequenceFileAsTextRecordReader This class converts the input keys and values to their String forms by calling the toString() method.
SequenceFileInputFilter A class that allows a map/red job to work on a sample of sequence files.
SequenceFileInputFormat An InputFormat for SequenceFiles.
SequenceFileOutputFormat An OutputFormat that writes SequenceFiles.
SequenceFileRecordReader A RecordReader for SequenceFiles.
TaskAttemptContext Deprecated; use org.apache.hadoop.mapreduce.TaskAttemptContext instead.
TaskAttemptID TaskAttemptID represents the immutable and unique identifier for a task attempt.
TaskCompletionEvent.Status  
TaskID TaskID represents the immutable and unique identifier for a Map or Reduce Task.
TaskLog A simple logger to handle the task-specific user logs.
TaskLog.LogName The filter for userlogs.
TaskReport A report on the state of a task.
TaskTracker TaskTracker is a process that starts and tracks MR Tasks in a networked environment.
TextInputFormat An InputFormat for plain text files.
TextOutputFormat An OutputFormat that writes plain text files.
TextOutputFormat.LineRecordWriter
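To show how the changed core classes (JobConf, JobClient, Mapper, Reducer, OutputCollector, Reporter, FileInputFormat, FileOutputFormat, TextInputFormat, TextOutputFormat) fit together, here is a minimal word-count driver against the old org.apache.hadoop.mapred API. It is a sketch, not code from this release: WordCount, TokenizeMapper, SumReducer and the job name are illustrative.

    import java.io.IOException;
    import java.util.Iterator;
    import java.util.StringTokenizer;

    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapred.FileInputFormat;
    import org.apache.hadoop.mapred.FileOutputFormat;
    import org.apache.hadoop.mapred.JobClient;
    import org.apache.hadoop.mapred.JobConf;
    import org.apache.hadoop.mapred.MapReduceBase;
    import org.apache.hadoop.mapred.Mapper;
    import org.apache.hadoop.mapred.OutputCollector;
    import org.apache.hadoop.mapred.Reducer;
    import org.apache.hadoop.mapred.Reporter;
    import org.apache.hadoop.mapred.TextInputFormat;
    import org.apache.hadoop.mapred.TextOutputFormat;

    /** Minimal word count against the old mapred API. */
    public class WordCount {

      /** Mapper: emits (word, 1) for every token in the input line. */
      public static class TokenizeMapper extends MapReduceBase
          implements Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        public void map(LongWritable offset, Text line,
                        OutputCollector<Text, IntWritable> out, Reporter reporter)
            throws IOException {
          StringTokenizer tok = new StringTokenizer(line.toString());
          while (tok.hasMoreTokens()) {
            word.set(tok.nextToken());
            out.collect(word, ONE);
          }
        }
      }

      /** Reducer: sums the counts for each word. */
      public static class SumReducer extends MapReduceBase
          implements Reducer<Text, IntWritable, Text, IntWritable> {
        public void reduce(Text word, Iterator<IntWritable> counts,
                           OutputCollector<Text, IntWritable> out, Reporter reporter)
            throws IOException {
          int sum = 0;
          while (counts.hasNext()) {
            sum += counts.next().get();
          }
          out.collect(word, new IntWritable(sum));
        }
      }

      public static void main(String[] args) throws IOException {
        JobConf conf = new JobConf(WordCount.class);
        conf.setJobName("wordcount");
        conf.setInputFormat(TextInputFormat.class);
        conf.setOutputFormat(TextOutputFormat.class);
        conf.setOutputKeyClass(Text.class);
        conf.setOutputValueClass(IntWritable.class);
        conf.setMapperClass(TokenizeMapper.class);
        conf.setReducerClass(SumReducer.class);
        FileInputFormat.setInputPaths(conf, new Path(args[0]));
        FileOutputFormat.setOutputPath(conf, new Path(args[1]));

        // JobClient submits the job to the JobTracker and blocks,
        // polling until the job completes.
        JobClient.runJob(conf);
      }
    }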