jupyterlab.sh to bootstrap and install the almond kernel (haifengl/smile#672)
Shouldn't the xh initialization be inside the loop, as a copy of x, like:
default double g(double[] x, double[] gradient) {
    double fx = f(x);
    int n = x.length;
    for (int i = 0; i < n; i++) {
        double[] xh = x.clone();
        double xi = x[i];
        double h = EPSILON * Math.abs(xi);
        if (h == 0.0) {
            h = EPSILON;
        }
        xh[i] = xi + h; // trick to reduce finite-precision error.
        h = xh[i] - xi;
        double fh = f(xh);
        xh[i] = xi;
        gradient[i] = (fh - fx) / h;
    }
    return fx;
}
Otherwise it would compute f(x1 + h1, 0, 0) - f(x1, x2, x3), then f(x1 + h1, x2 + h2, 0) - f(x1, x2, x3), which seems wrong... or am I missing something?
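For what it's worth, here is a minimal standalone sketch of the forward-difference gradient above, checked against the analytic gradient of f(x) = x1^2 + x2^2 (the EPSILON value and the target function f are stand-ins chosen for illustration, not Smile's actual constants):

```java
public class GradientCheck {
    // sqrt of machine epsilon is a common forward-difference step size.
    static final double EPSILON = Math.sqrt(Math.ulp(1.0));

    // f(x) = x1^2 + x2^2, with analytic gradient (2*x1, 2*x2).
    static double f(double[] x) {
        double s = 0.0;
        for (double v : x) s += v * v;
        return s;
    }

    // Forward-difference gradient, cloning x on every iteration
    // so each coordinate is perturbed independently.
    static double g(double[] x, double[] gradient) {
        double fx = f(x);
        int n = x.length;
        for (int i = 0; i < n; i++) {
            double[] xh = x.clone();
            double xi = x[i];
            double h = EPSILON * Math.abs(xi);
            if (h == 0.0) {
                h = EPSILON;
            }
            xh[i] = xi + h; // trick to reduce finite-precision error.
            h = xh[i] - xi;
            gradient[i] = (f(xh) - fx) / h;
        }
        return fx;
    }

    public static void main(String[] args) {
        double[] x = {1.0, 2.0};
        double[] grad = new double[2];
        double fx = g(x, grad);
        // Expect f = 5.0 and grad close to (2.0, 4.0).
        System.out.printf("f = %.6f, grad = (%.4f, %.4f)%n", fx, grad[0], grad[1]);
    }
}
```

Because each iteration starts from a fresh clone of x, only coordinate i is perturbed per difference, which is exactly what the per-iteration clone guarantees.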
java.lang.ArithmeticException: LAPACK GETRS error code: -8
at smile.math.matrix.Matrix$LU.solve(Matrix.java:2219)
at smile.math.matrix.Matrix$LU.solve(Matrix.java:2189)
at smile.math.BFGS.subspaceMinimization(BFGS.java:875)
at smile.math.BFGS.minimize(BFGS.java:647)
Kindly look into this issue regarding "Formula.lhs" of RandomForest. As my dataset goes through several transformations, I end up with this:
var xtrain: Array[Array[Double]] = xtrainx
var ytrain: Array[Int] = bc_ytrainSet.value.map(x => scala.math.floor(x).toInt)
var xtest: Array[Array[Double]] = xtestx
var ytest: Array[Int] = bc_ytestSet.value.map(x => scala.math.floor(x).toInt)
//var nn: KNN[Array[Double]] = KNN.fit(xtrain, ytrain, 5)
var rf = RandomForest.fit(Formula.lhs(?), xtrain)
var pred = rf.predict(xtest)
var accu = Accuracy.of(ytest, pred)
Actually, I want to know what to write inside Formula.lhs(?) in the absence of any header. For KNN it works fine without any header.
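One possible shape of the answer, sketched in Java against Smile 2.x (the column name "y" and the tiny toy data are made up here; the DataFrame.of/merge/IntVector calls are what I believe the 2.x API offers): RandomForest.fit takes a Formula plus a DataFrame, not a raw matrix, so the label has to be attached as a named column and that name passed to Formula.lhs:

```java
import smile.classification.RandomForest;
import smile.data.DataFrame;
import smile.data.formula.Formula;
import smile.data.vector.IntVector;

public class FormulaExample {
    public static void main(String[] args) {
        double[][] xtrain = {{5.1, 3.5}, {4.9, 3.0}, {6.2, 2.9}, {5.9, 3.2}};
        int[] ytrain = {0, 0, 1, 1};

        // Build a DataFrame from the raw matrix; columns get default
        // names when none are supplied, and the label is merged in
        // as a named column "y".
        DataFrame df = DataFrame.of(xtrain).merge(IntVector.of("y", ytrain));

        // Formula.lhs("y") means "y ~ .": predict column y from all others.
        RandomForest rf = RandomForest.fit(Formula.lhs("y"), df);
        int[] pred = rf.predict(df);
        System.out.println(pred.length);
    }
}
```

So the argument to Formula.lhs is whatever name you gave the label column when building the DataFrame, not something derived from a file header.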
2. How do I call a custom function on a specific column of a DataFrame, as we do in Python pandas?
def fun(num):
    some operation

new = df["column"].apply(fun)
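In the absence of a pandas-style apply, one way is to pull the column out as a plain array and map over it with Java streams. A stdlib-only sketch (the squaring function is a stand-in for the custom function, and extracting the column from a Smile DataFrame, e.g. something like df.doubleVector("column").toDoubleArray(), is an assumption, not verified API):

```java
import java.util.Arrays;
import java.util.stream.DoubleStream;

public class ColumnApply {
    // Stand-in for the custom function from the question.
    static double fun(double num) {
        return num * num;
    }

    public static void main(String[] args) {
        // Stand-in for a column pulled out of a DataFrame as a double[].
        double[] column = {1.0, 2.0, 3.0};

        // The equivalent of df["column"].apply(fun) in pandas.
        double[] result = DoubleStream.of(column).map(ColumnApply::fun).toArray();

        System.out.println(Arrays.toString(result)); // prints [1.0, 4.0, 9.0]
    }
}
```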