
Perhaps this example is what you describe? flatMap (used by the for comprehension) lets monads combine in powerful ways. The first part shows uncorrelated X and Y, where Y merely has the distribution of -X; the second part shows the correlated case Y = -X, where every sample satisfies y = -x.

    {
      val X: Podel[Int] = Bernoulli(p = 0.5)
      val Y: Podel[Int] = X.map(-1 * _)
      val Z: Podel[Int] = X + Y
      val d: Podel[Int] = for {
        x <- X
        y <- Y
        z <- Z
      } yield z
      d.hist(sampleSize = 10000, optLabel = Some("Uncorrelated X, Y"))
    }

    {
      val X: Bernoulli = Bernoulli(p = 0.5)
      val d: Podel[Int] = for {
        x <- X
        y = -1 * x
        z = x + y
      } yield z
      d.hist(sampleSize = 10000, optLabel = Some("y = -x"))
    }
    /*
Uncorrelated X, Y

-1 25.52% #########################

0 49.60% #################################################

1 24.88% ########################

y = -x

0 100.00% ####################################################################################################

     */
So far, the weakness I find with the probability monad, and with simulation generally, is lack of precision. Sampling makes it useless when the probability you want is both precise and tiny, such as 5.324e-11; you would need on the order of 1e11 samples just to see one hit. Only a closed-form formula can give you that answer.
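That said, sampling is not the only way to run a probability monad: over a finite outcome space you can interpret the same flatMap structure by exact enumeration, carrying the full outcome-to-probability map instead of drawing samples. Below is a minimal sketch of that idea; `Dist` and `bernoulli` are hypothetical names invented here, not the Podel API from the snippet above.

```scala
// A tiny exact-enumeration probability monad: each Dist holds the full
// map from outcomes to probabilities, so flatMap multiplies and sums
// probabilities exactly (up to floating-point rounding) instead of sampling.
case class Dist[A](probs: Map[A, Double]) {
  def map[B](f: A => B): Dist[B] =
    Dist(probs.toSeq
      .map { case (a, p) => (f(a), p) }
      .groupMapReduce(_._1)(_._2)(_ + _)) // merge outcomes that collide under f

  def flatMap[B](f: A => Dist[B]): Dist[B] =
    Dist(probs.toSeq
      .flatMap { case (a, p) =>
        f(a).probs.toSeq.map { case (b, q) => (b, p * q) }
      }
      .groupMapReduce(_._1)(_._2)(_ + _))
}

def bernoulli(p: Double): Dist[Int] = Dist(Map(1 -> p, 0 -> (1 - p)))

// Correlated case y = -x, computed exactly: all mass lands on z = 0.
val d: Dist[Int] = for {
  x <- bernoulli(0.5)
  y = -1 * x
} yield x + y
// d.probs == Map(0 -> 1.0)
```

Exact enumeration costs time proportional to the size of the joint outcome space, so it only works for small discrete models, but within that limit it reports probabilities like 5.324e-11 directly rather than estimating them.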


