SPARK-2748 [MLLIB] [GRAPHX] Loss of precision for small arguments to Math.exp, Math.log

In a few places in MLlib, an expression of the form `log(1.0 + p)` is evaluated. When `p` is so small that `1.0 + p == 1.0`, the result is 0.0, but the correct answer is very nearly `p`. This is exactly why `Math.log1p` exists.

Similarly, for one instance of `exp(m) - 1` in GraphX, there is a special `Math.expm1` method.

While the errors occur only for very small arguments, such arguments are entirely possible given these expressions' use in machine learning algorithms.
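
To illustrate the failure mode (the values here are arbitrary, not from the patch), a minimal Scala sketch:

```scala
// For p below about machine epsilon (~2.2e-16), 1.0 + p rounds to exactly 1.0.
val p = 1e-20
math.log(1.0 + p)  // 0.0     -- all information about p is lost
math.log1p(p)      // 1.0E-20 -- correct to full double precision

// The same applies to exp(x) - 1 for tiny x:
math.exp(p) - 1.0  // 0.0
math.expm1(p)      // 1.0E-20
```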

Also note the related PR for Python: #1652

Author: Sean Owen <[email protected]>

Closes #1659 from srowen/SPARK-2748 and squashes the following commits:

c5926d4 [Sean Owen] Use log1p, expm1 for better precision for tiny arguments
srowen authored and mengxr committed Jul 30, 2014
1 parent 7c5fc28 commit ee07541
Showing 2 changed files with 8 additions and 6 deletions.
GraphGenerators.scala
@@ -100,8 +100,10 @@ object GraphGenerators {
    */
   private def sampleLogNormal(mu: Double, sigma: Double, maxVal: Int): Int = {
     val rand = new Random()
-    val m = math.exp(mu + (sigma * sigma) / 2.0)
-    val s = math.sqrt((math.exp(sigma*sigma) - 1) * math.exp(2*mu + sigma*sigma))
+    val sigmaSq = sigma * sigma
+    val m = math.exp(mu + sigmaSq / 2.0)
+    // expm1 is exp(m)-1 with better accuracy for tiny m
+    val s = math.sqrt(math.expm1(sigmaSq) * math.exp(2*mu + sigmaSq))
     // Z ~ N(0, 1)
     var X: Double = maxVal
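
Here `m` and `s` are the mean and standard deviation of a log-normal distribution, and for tiny `sigma` the old factor `math.exp(sigma*sigma) - 1` underflows to 0.0. A quick illustrative check (the `sigma` value is made up for demonstration, not taken from the patch):

```scala
// Compare the old and new expressions for the log-normal standard deviation
// at a hypothetical tiny sigma.
val mu = 0.0
val sigma = 1e-9
val sigmaSq = sigma * sigma  // 1e-18, below machine epsilon

val sOld = math.sqrt((math.exp(sigmaSq) - 1) * math.exp(2 * mu + sigmaSq))
val sNew = math.sqrt(math.expm1(sigmaSq) * math.exp(2 * mu + sigmaSq))
// sOld == 0.0    -- exp(1e-18) rounds to 1.0, so the subtraction cancels
// sNew ~= 1.0e-9 -- the correct standard deviation
```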
Gradient.scala
@@ -68,9 +68,9 @@ class LogisticGradient extends Gradient {
     val gradient = brzData * gradientMultiplier
     val loss =
       if (label > 0) {
-        math.log(1 + math.exp(margin))
+        math.log1p(math.exp(margin)) // log1p is log(1+p) but more accurate for small p
       } else {
-        math.log(1 + math.exp(margin)) - margin
+        math.log1p(math.exp(margin)) - margin
       }

     (Vectors.fromBreeze(gradient), loss)
@@ -89,9 +89,9 @@ class LogisticGradient extends Gradient {
     brzAxpy(gradientMultiplier, brzData, cumGradient.toBreeze)

     if (label > 0) {
-      math.log(1 + math.exp(margin))
+      math.log1p(math.exp(margin))
     } else {
-      math.log(1 + math.exp(margin)) - margin
+      math.log1p(math.exp(margin)) - margin
     }
   }
 }
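
The same effect shows up in the logistic loss above when the margin is strongly negative, so that `exp(margin)` is tiny. An illustrative check (the margin value is chosen arbitrarily):

```scala
// Logistic loss log(1 + exp(margin)) for a hypothetical large negative margin.
val margin = -40.0
val p = math.exp(margin)  // ~4.2e-18, so 1.0 + p == 1.0 in doubles

math.log(1 + p)  // 0.0      -- the tiny but nonzero loss vanishes
math.log1p(p)    // ~4.2e-18 -- preserved
```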
