
DidacticBoost 0.1.1

A Simple Implementation and Demonstration of Gradient Boosting

Released Apr 19, 2016 by David Shaub

This package is available for Renjin and there are no known compatibility issues.

Dependencies

rpart 4.1-13

A basic, clear implementation of tree-based gradient boosting designed to illustrate the core operation of boosting models. Tuning parameters (such as stochastic subsampling, modified learning rate, or regularization) are not implemented. The only adjustable parameter is the number of training rounds. If you are looking for a high-performance boosting implementation with tuning parameters, consider the 'xgboost' package.
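
The core operation the package demonstrates can be sketched in a few lines of plain R: start from a constant prediction, then repeatedly fit a regression tree (here via rpart) to the residuals of the current ensemble and add that tree's predictions back in. The sketch below illustrates the technique only; it is not the package's own code or API, and the helper name boost_sketch is hypothetical.

library(rpart)

# Illustration of the boosting loop (hypothetical helper, not package code):
# each round, fit a tree to the current residuals and add its predictions
# to the running ensemble prediction.
boost_sketch <- function(formula, data, rounds = 10) {
  response <- all.vars(formula)[1]           # name of the response variable
  y <- data[[response]]
  prediction <- rep(mean(y), nrow(data))     # start from the mean response
  trees <- vector("list", rounds)
  for (i in seq_len(rounds)) {
    data$.residual <- y - prediction         # residuals of the current ensemble
    tree <- rpart(update(formula, .residual ~ .), data = data)
    trees[[i]] <- tree
    prediction <- prediction + predict(tree, newdata = data)
  }
  list(trees = trees, fitted = prediction)
}

# Example on a built-in data set: 20 rounds of boosted trees for mpg.
fit <- boost_sketch(mpg ~ wt + hp, data = mtcars, rounds = 20)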

Installation

Maven

This package can be included as a dependency from a Java or Scala project by including the following in your project's pom.xml file. Read more about embedding Renjin in JVM-based projects.

<dependencies>
  <dependency>
    <groupId>org.renjin.cran</groupId>
    <artifactId>DidacticBoost</artifactId>
    <version>0.1.1-b24</version>
  </dependency>
</dependencies>
<repositories>
  <repository>
    <id>bedatadriven</id>
    <name>bedatadriven public repo</name>
    <url>https://nexus.bedatadriven.com/content/groups/public/</url>
  </repository>
</repositories>

View build log

Renjin CLI

If you're using Renjin from the command line, you can load this library by invoking:

library('org.renjin.cran:DidacticBoost')
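
Once loaded, the package is used like any other R library. The snippet below is a minimal, self-contained sketch only: the function name fit_boosted, its arguments (formula, data, rounds), and the predict method are assumptions based on the package description, so check the package's reference manual for the actual interface.

library('org.renjin.cran:DidacticBoost')
library('org.renjin.cran:rpart')   # dependency; provides the kyphosis example data

# Assumed interface (verify against the reference manual): the number of
# training rounds is the only tuning parameter the package exposes.
model <- fit_boosted(Kyphosis ~ Age + Number + Start, data = kyphosis, rounds = 10)
preds <- predict(model, newdata = kyphosis)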

Test Results

This package was last tested against Renjin 0.9.2687 on Aug 24, 2018.

Source

R

View GitHub Mirror

Release History