Fix PySpark: actually kill driver on termination
We use the stdin broken pipe as a signal that the parent process that launched the SparkSubmitDriverBootstrapper JVM has exited. This is very similar to what Py4J's JavaGateway does (see the "--die-on-broken-pipe" parameter). This allows both JVMs to actually terminate after the application has finished.

This was especially relevant for the PySpark shell, where Spark submit itself is launched as a Python subprocess and the driver was never actually killed, even after the shell had exited.
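The broken-pipe mechanism described above can be sketched as follows. This is an illustrative example, not the actual Spark code: the class and method names are made up, and the real SparkSubmitDriverBootstrapper reads the JVM's actual `System.in` and destroys the driver subprocess it launched. The key idea is that when the parent process exits, the child's stdin pipe closes and `read()` returns -1 (EOF):

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

// Sketch of the "--die-on-broken-pipe" idea: block on stdin and treat
// EOF (the pipe breaking) as the signal that the parent has exited.
// All names here are hypothetical, not taken from the Spark source.
public class DieOnBrokenPipe {

    // Drain the stream until EOF; returns once the pipe is closed,
    // i.e. once the parent process that held the write end has exited.
    static void waitForParentExit(InputStream in) throws IOException {
        while (in.read() != -1) {
            // discard any input; only EOF matters
        }
    }

    public static void main(String[] args) throws IOException {
        // Simulate the parent closing its end of the pipe by using a
        // finite in-memory stream that hits EOF after three bytes.
        InputStream fakeStdin = new ByteArrayInputStream(new byte[] {1, 2, 3});
        waitForParentExit(fakeStdin);
        // In the real bootstrapper, this is the point where the driver
        // subprocess would be torn down (e.g. driverProcess.destroy())
        // before the bootstrapper JVM itself exits.
        System.out.println("parent exited; killing driver JVM");
    }
}
```

Because the watchdog blocks on a read rather than polling the parent's PID, it works the same way whether the parent exits cleanly or is killed.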