source: proiecte/HadoopJUnit/hadoop-0.20.1/src/core/org/apache/hadoop/metrics/package.html @ 120

<html>

<!--
   Licensed to the Apache Software Foundation (ASF) under one or more
   contributor license agreements.  See the NOTICE file distributed with
   this work for additional information regarding copyright ownership.
   The ASF licenses this file to You under the Apache License, Version 2.0
   (the "License"); you may not use this file except in compliance with
   the License.  You may obtain a copy of the License at

       http://www.apache.org/licenses/LICENSE-2.0

   Unless required by applicable law or agreed to in writing, software
   distributed under the License is distributed on an "AS IS" BASIS,
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   See the License for the specific language governing permissions and
   limitations under the License.
-->

  <head>
    <title>org.apache.hadoop.metrics</title>
  </head>
<body>
This package defines an API for reporting performance metric information.
<p/>
The API is abstract so that it can be implemented on top of
a variety of metrics client libraries.  The choice of
client library is a configuration option, and different
modules within the same application can use
different metrics implementation libraries.
<p/>
Sub-packages:
<dl>
    <dt><code>org.apache.hadoop.metrics.spi</code></dt>
    <dd>The abstract Service Provider Interface package. Those wishing to
    integrate the metrics API with a particular metrics client library should
    extend this package.</dd>

    <dt><code>org.apache.hadoop.metrics.file</code></dt>
    <dd>An implementation package which writes the metric data to
    a file, or sends it to the standard output stream.</dd>

    <dt><code>org.apache.hadoop.metrics.ganglia</code></dt>
    <dd>An implementation package which sends metric data to
    <a href="http://ganglia.sourceforge.net/">Ganglia</a>.</dd>
</dl>

<h3>Introduction to the Metrics API</h3>

Here is a simple example of how to use this package to report a single
metric value:
<pre>
    private ContextFactory contextFactory = ContextFactory.getFactory();

    void reportMyMetric(float myMetric) {
        MetricsContext myContext = contextFactory.getContext("myContext");
        MetricsRecord myRecord = myContext.createRecord("myRecord");
        myRecord.setMetric("myMetric", myMetric);
        myRecord.update();
    }
</pre>

In this example there are three names:
<dl>
  <dt><i>myContext</i></dt>
  <dd>The context name will typically identify either the application, or else a
  module within an application or library.</dd>

  <dt><i>myRecord</i></dt>
  <dd>The record name generally identifies some entity for which a set of
  metrics is to be reported.  For example, you could have a record named
  "cacheStats" for reporting a number of statistics relating to the usage of
  some cache in your application.</dd>

  <dt><i>myMetric</i></dt>
  <dd>This identifies a particular metric.  For example, you might have metrics
  named "cache_hits" and "cache_misses".
  </dd>
</dl>

<h3>Tags</h3>

In some cases it is useful to have multiple records with the same name. For
example, suppose that you want to report statistics about each disk on a computer.
In this case, the record name would be something like "diskStats", but you also
need to identify the disk, which is done by adding a <i>tag</i> to the record.
The code could look something like this:
<pre>
    private MetricsRecord diskStats =
            contextFactory.getContext("myContext").createRecord("diskStats");

    void reportDiskMetrics(String diskName, float diskBusy, float diskUsed) {
        diskStats.setTag("diskName", diskName);
        diskStats.setMetric("diskBusy", diskBusy);
        diskStats.setMetric("diskUsed", diskUsed);
        diskStats.update();
    }
</pre>
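
For example, a hypothetical caller (the disk names below are made up) might
report on several disks in turn; because the record name and the tag values
together identify the data being reported, each disk's values are kept
separate:
<pre>
    reportDiskMetrics("sda", 0.75f, 0.40f);
    reportDiskMetrics("sdb", 0.10f, 0.85f);
    reportDiskMetrics("sdc", 0.05f, 0.15f);
</pre>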

<h3>Buffering and Callbacks</h3>

Data is not sent immediately to the metrics system when
<code>MetricsRecord.update()</code> is called. Instead it is stored in an
internal table, and the contents of the table are sent periodically.
This can be important for two reasons:
<ol>
    <li>It means that a programmer is free to put calls to this API in an
    inner loop, since updates can be very frequent without slowing down
    the application significantly.</li>
    <li>Some implementations can gain efficiency by combining many metrics
    into a single UDP message.</li>
</ol>

The API provides a timer-based callback via the
<code>registerUpdater()</code> method.  The benefit of this
versus using <code>java.util.Timer</code> is that the callbacks will be done
immediately before sending the data, making the data as current as possible.

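As an illustration, here is a minimal sketch of registering such a callback.
The class, context, record, and metric names are made up for this example, and
the checked exceptions thrown while obtaining the context are caught broadly
for brevity:
<pre>
    import org.apache.hadoop.metrics.ContextFactory;
    import org.apache.hadoop.metrics.MetricsContext;
    import org.apache.hadoop.metrics.MetricsRecord;
    import org.apache.hadoop.metrics.Updater;

    public class QueueMetrics implements Updater {
        private MetricsRecord record;
        private volatile int queueLength;   // illustrative value being tracked

        public QueueMetrics() {
            try {
                MetricsContext context =
                        ContextFactory.getFactory().getContext("myContext");
                record = context.createRecord("queueStats");
                context.registerUpdater(this);
            } catch (Exception e) {
                throw new RuntimeException("Could not initialize metrics", e);
            }
        }

        public void setQueueLength(int length) {
            queueLength = length;
        }

        // Called by the metrics system just before the buffered data is sent,
        // so the reported value is as current as possible.
        public void doUpdates(MetricsContext unused) {
            record.setMetric("queueLength", queueLength);
            record.update();
        }
    }
</pre>
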
<h3>Configuration</h3>

It is possible to programmatically examine and modify configuration data
before creating a context, like this:
<pre>
    ContextFactory factory = ContextFactory.getFactory();
    ... examine and/or modify factory attributes ...
    MetricsContext context = factory.getContext("myContext");
</pre>
The factory attributes can be examined and modified using the following
<code>ContextFactory</code> methods:
<ul>
    <li><code>Object getAttribute(String attributeName)</code></li>
    <li><code>String[] getAttributeNames()</code></li>
    <li><code>void setAttribute(String name, Object value)</code></li>
    <li><code>void removeAttribute(String attributeName)</code></li>
</ul>
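
For instance, a hypothetical fragment (the attribute name and value shown are
assumptions chosen for illustration; the <i>contextName</i><code>.class</code>
attribute is described below, and exception handling is omitted as in the
fragments above) might choose the implementation class for one context and then
list whatever attributes are currently set, before the context is first
created:
<pre>
    ContextFactory factory = ContextFactory.getFactory();

    // Choose the implementation class for the context named "myContext".
    factory.setAttribute("myContext.class",
                         "org.apache.hadoop.metrics.file.FileContext");

    // Print every attribute currently known to the factory.
    for (String name : factory.getAttributeNames()) {
        System.out.println(name + " = " + factory.getAttribute(name));
    }

    MetricsContext context = factory.getContext("myContext");
</pre>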

<p/>
<code>ContextFactory.getFactory()</code> initializes the factory attributes by
reading the properties file <code>hadoop-metrics.properties</code> if it exists
on the class path.

<p/>
A factory attribute named:
<pre>
<i>contextName</i>.class
</pre>
should have as its value the fully qualified name of the class to be
instantiated by a call to the <code>ContextFactory</code> method
<code>getContext(<i>contextName</i>)</code>.  If this factory attribute is not
specified, the default is to instantiate
<code>org.apache.hadoop.metrics.file.FileContext</code>.

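For example, a <code>hadoop-metrics.properties</code> file along the following
lines (the context name and the attribute values here are purely illustrative,
and the <code>period</code> and <code>fileName</code> attributes are ones
handled by the file implementation) would direct the records of "myContext" to
a file:
<pre>
    myContext.class=org.apache.hadoop.metrics.file.FileContext
    myContext.period=10
    myContext.fileName=/tmp/myContextMetrics.log
</pre>
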
<p/>
Other factory attributes are specific to a particular implementation of this
API and are documented elsewhere.  For example, configuration attributes for
the file and Ganglia implementations can be found in the javadoc for
their respective packages.
</body>
</html>