
package h2o

Type shortcuts to simplify work in the Sparkling Water REPL
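
In the Sparkling Water shell these shortcuts are usually brought in with a single package import. A minimal sketch, assuming a Sparkling Water 2.x style API where spark is the SparkSession provided by the shell (the asH2OFrame call is only illustrative of the typical workflow):

    import org.apache.spark.h2o._   // brings the RDD, Dataset, Frame and H2OFrame aliases into scope
    import spark.implicits._

    // Start (or connect to) the H2O cluster backing this Spark application.
    val hc = H2OContext.getOrCreate(spark)

    // The aliases let the REPL refer to rdd.RDD and water.fvec.H2OFrame directly.
    val numbers: RDD[Int] = spark.sparkContext.parallelize(1 to 100)
    val table: H2OFrame = hc.asH2OFrame(numbers.toDF())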

Linear Supertypes
Logging, Serializable, Serializable, Logging, AnyRef, Any

Type Members

  1. trait CrossSparkUtils extends AnyRef
  2. type Dataset[X] = sql.Dataset[X]
  3. class DefaultSource extends RelationProvider with SchemaRelationProvider with CreatableRelationProvider with DataSourceRegister

    Provides access to H2OFrame from pure SQL statements (i.e. for users of the JDBC server). See the data source sketch after this list.

  4. type Frame = water.fvec.Frame
  5. type H2O = water.H2O
  6. class H2OConf extends Logging with InternalBackendConf with ExternalBackendConf with Serializable

    Configuration holder representing the properties passed from the user to Sparkling Water.

  7. class H2OContext extends H2OContextExtensions

    Creates a new H2OContext based on the provided H2O configuration; see the configuration sketch after this list.

  8. implicit class H2ODataFrameReader extends AnyRef

    Adds a method, h2o, to DataFrameReader for reading H2O frames through the Spark data source API. It is an alias for sqlContext.read.format("org.apache.spark.h2o").option("key", frame.key.toString).load(). See the read/write sketch after this list.

  9. implicit class H2ODataFrameWriter[T] extends AnyRef

    Adds a method, h2o, to DataFrameWriter for writing H2O frames through the Spark data source API. It is an alias for dataFrame.write.format("org.apache.spark.h2o").option("key", "new_frame_key").save(). See the read/write sketch after this list.

  10. type H2OFrame = water.fvec.H2OFrame
  11. class JavaH2OContext extends AnyRef

    A Java-friendly version of org.apache.spark.h2o.H2OContext.

    Sparkling Water can run in two modes: external cluster mode and internal cluster mode. In external cluster mode it connects to an existing H2O cluster using the provided Spark configuration properties. In internal cluster mode it creates an H2O cluster living inside Spark, which means each Spark executor runs one H2O instance. Internal mode is not recommended for large clusters or for clusters whose Spark executors are not stable.

    The cluster mode can be set with the Spark configuration property spark.ext.h2o.mode, either in the script that starts Sparkling Water or in the H2O configuration class H2OConf.

  12. type RDD[X] = rdd.RDD[X]
  13. class WrongSparkVersion extends Exception with NoStackTrace
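
The H2OConf and H2OContext members above are typically used together. A minimal configuration sketch, assuming a Sparkling Water 2.x style API (exact constructor and getOrCreate signatures vary slightly between releases):

    import org.apache.spark.SparkConf
    import org.apache.spark.sql.SparkSession
    import org.apache.spark.h2o._

    // Cluster mode (see JavaH2OContext above) is selected via spark.ext.h2o.mode.
    val sparkConf = new SparkConf()
      .setAppName("sparkling-water-example")
      .set("spark.ext.h2o.mode", "internal")          // or "external"
    val spark = SparkSession.builder().config(sparkConf).getOrCreate()

    val h2oConf = new H2OConf(spark)                  // picks up spark.ext.h2o.* properties
    val hc = H2OContext.getOrCreate(spark, h2oConf)   // starts or connects to the H2O cluster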

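DefaultSource, H2ODataFrameReader and H2ODataFrameWriter all expose the same data source. A short read/write sketch, assuming a running H2OContext and an existing H2OFrame named frame (the key and table names are illustrative only):

    // Read an existing H2OFrame into a Spark DataFrame.
    val df = spark.read
      .format("org.apache.spark.h2o")
      .option("key", frame.key.toString)
      .load()

    // Write a DataFrame back out as a new H2OFrame.
    df.write
      .format("org.apache.spark.h2o")
      .option("key", "new_frame_key")
      .save()

    // The same source can be reached from pure SQL, e.g. through the JDBC server.
    spark.sql(
      """CREATE TABLE h2o_table
        |USING org.apache.spark.h2o
        |OPTIONS (key 'frame_key')""".stripMargin)
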
Abstract Value Members

  1. abstract def getClass(): Class[_]
    Definition Classes
    Any

Concrete Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    Any
  2. final def ##(): Int
    Definition Classes
    Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    Any
  4. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  5. def equals(arg0: Any): Boolean
    Definition Classes
    Any
  6. def hashCode(): Int
    Definition Classes
    Any
  7. def initializeLogIfNecessary(isInterpreter: Boolean, silent: Boolean): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  8. def initializeLogIfNecessary(isInterpreter: Boolean): Unit
    Attributes
    protected
    Definition Classes
    Logging
  9. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  10. def isTraceEnabled(): Boolean
    Attributes
    protected
    Definition Classes
    Logging
  11. def log: Logger
    Attributes
    protected
    Definition Classes
    Logging
  12. def logDebug(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  13. def logDebug(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  14. def logError(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  15. def logError(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  16. def logInfo(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  17. def logInfo(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  18. def logName: String
    Attributes
    protected
    Definition Classes
    Logging
  19. def logTrace(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  20. def logTrace(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  21. def logWarning(msg: ⇒ String, throwable: Throwable): Unit
    Attributes
    protected
    Definition Classes
    Logging
  22. def logWarning(msg: ⇒ String): Unit
    Attributes
    protected
    Definition Classes
    Logging
  23. def toString(): String
    Definition Classes
    Any
  24. object H2OConf extends Logging with Serializable
  25. object H2OContext extends Logging with Serializable
  26. object SparkSpecificUtils extends CrossSparkUtils
