object --+
         |
        Accumulator
A shared variable that can be accumulated, i.e., has a commutative and associative "add" operation. Worker tasks on a Spark cluster can add values to an Accumulator with the += operator, but only the driver program is allowed to read its value, using the value property. Updates from the workers are propagated automatically to the driver program.

While SparkContext supports accumulators for primitive data types like int and float, users can also define accumulators for custom types by providing a custom AccumulatorParam object. Refer to the doctest of this module for an example.
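A minimal usage sketch, assuming a local SparkContext; the variable names and the local master are illustrative, not part of the API:

```python
from pyspark import SparkContext

sc = SparkContext("local", "accumulator-example")

# The driver creates the accumulator; int and float initial values are
# handled by SparkContext.accumulator without a custom AccumulatorParam.
counter = sc.accumulator(0)

def count(x):
    # Runs in worker tasks: they may only add to the accumulator,
    # either with += or with Accumulator.add().
    global counter
    counter += x

sc.parallelize([1, 2, 3, 4]).foreach(count)

# Only the driver program may read the accumulated value.
print(counter.value)  # 10
```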
| Instance Methods | |
|---|---|
| __init__(self, aid, value, accum_param) | Create a new Accumulator with a given initial value and AccumulatorParam object |
| __reduce__(self) | Custom serialization; saves the zero value from our AccumulatorParam |
| add(self, term) | Adds a term to this accumulator's value |
| __iadd__(self, term) | The += operator; adds a term to this accumulator's value |
| __str__(self) | str(x) |
| __repr__(self) | repr(x) |
| Inherited from object | |

| Properties | |
|---|---|
| value | The accumulator's value; only accessible from the driver program |
| Inherited from object | |
| Method Details | 
__init__(self, aid, value, accum_param)
  Create a new Accumulator with a given initial value and AccumulatorParam object.
__reduce__(self)
  Custom serialization; saves the zero value from our AccumulatorParam.
value (setter)
  Sets the accumulator's value; only usable in the driver program.
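For example, the driver (and only the driver) can reset an accumulator between jobs by assigning through the value property; counter here refers to the sketch above:

```python
counter.value = 0  # driver-side only; worker tasks can neither set nor read value
```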
__str__(self)
  str(x)
__repr__(self)
  repr(x)
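A sketch of a custom accumulator type in the spirit of the module's doctest; the VectorAccumulatorParam name and the list-based vector are illustrative choices, not part of the API:

```python
from pyspark import SparkContext
from pyspark.accumulators import AccumulatorParam

class VectorAccumulatorParam(AccumulatorParam):
    def zero(self, initial_value):
        # "Empty" value with the same shape as the initial value.
        return [0.0] * len(initial_value)

    def addInPlace(self, v1, v2):
        # Merge two partial results; must be commutative and associative.
        for i in range(len(v1)):
            v1[i] += v2[i]
        return v1

sc = SparkContext("local", "custom-accumulator")
vec = sc.accumulator([0.0, 0.0, 0.0], VectorAccumulatorParam())

def add_row(row):
    # Worker tasks add partial results; the driver merges them.
    vec.add(row)

sc.parallelize([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]]).foreach(add_row)
print(vec.value)  # [5.0, 7.0, 9.0], readable on the driver only
```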