Add Documenter docs #35

Merged · 1 commit · Nov 24, 2019
12 changes: 12 additions & 0 deletions .travis.yml
@@ -10,7 +10,19 @@ julia:
- nightly
notifications:
email: false

# uncomment the following lines to override the default test script
#script:
# - if [[ -a .git/shallow ]]; then git fetch --unshallow; fi
# - julia -e 'Pkg.clone(pwd()); Pkg.build("SnoopCompile"); Pkg.test("SnoopCompile"; coverage=true)'

jobs:
include:
- stage: "Documentation"
julia: 1.2
os: linux
script:
- julia --project=docs/ -e 'using Pkg; Pkg.develop(PackageSpec(path=pwd()));
Pkg.instantiate()'
- julia --project=docs/ docs/make.jl
after_success: skip
222 changes: 6 additions & 216 deletions README.md
@@ -2,221 +2,11 @@

[![Build Status](https://travis-ci.org/timholy/SnoopCompile.jl.svg?branch=master)](https://travis-ci.org/timholy/SnoopCompile.jl)

SnoopCompile "snoops" on the Julia compiler, getting it to log the
functions and argument-types it's compiling. By parsing the log file,
you can learn which functions are being precompiled, and even how long
each one takes to compile. You can use the package to generate
"precompile lists" that reduce the amount of time needed for JIT
compilation in packages.
SnoopCompile "snoops" on the Julia compiler, causing it to record the
functions and argument types it's compiling. From these lists of methods,
you can generate lists of `precompile` directives that may reduce the latency between
loading packages and using them to do "real work."

## Usage
See the documentation:

### Snooping on inference (recommended)

Currently precompilation only saves inferred code; consequently, the time spent on type inference
is probably the most relevant concern. To learn how much time is spent on each method instance
when executing a command `cmd`, do something like the following (note: requires Julia 1.2 or higher):

```julia
julia> using SnoopCompile

julia> a = rand(Float16, 5);

julia> inf_timing = @snoopi tmin=0.01 sum(a)
1-element Array{Tuple{Float64,Core.MethodInstance},1}:
(0.011293888092041016, MethodInstance for sum(::Array{Float16,1}))
```

Here, we filtered out methods that took less than 10ms to compile via `tmin=0.01`.
We can see that only one method took longer than this (and if your machine is faster than
mine, `inf_timing` might even be empty with these settings).
You can see the specific `MethodInstance` that got compiled.
Note that

```julia
julia> @which sum(a)
sum(a::AbstractArray) in Base at reducedim.jl:652
```

indicates that the method is much more general (i.e., defined for `AbstractArray`)
than the instance (defined for `Array{Float16,1}`); precompilation works on the concrete
types of the objects passed as arguments.
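
For instance, a precompile directive for the call above targets the concrete signature rather
than the general `AbstractArray` method (a minimal sketch; `precompile` returns `true` if the
signature could be compiled):

```julia
# Directive for the concrete MethodInstance reported by @snoopi; the method itself
# is defined for AbstractArray, but the directive names Array{Float16,1}.
precompile(Tuple{typeof(sum),Array{Float16,1}})
```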

You can use `@snoopi` to come up with a list of precompile-worthy functions.
A recommended approach is to write a script that "exercises" the functionality
you'd like to precompile.
One option is to use your package's `"runtests.jl"` file, or you can write a custom
script for this purpose.
Here's an example for the
[FixedPointNumbers package](https://github.com/JuliaMath/FixedPointNumbers.jl):

```julia
using FixedPointNumbers

x = N0f8(0.2)
y = x + x
y = x - x
y = x*x
y = x/x
y = Float32(x)
y = Float64(x)
y = 0.3*x
y = x*0.3
y = 2*x
y = x*2
y = x/15
y = x/8.0
```

Save this as a file `"snoopfpn.jl"`, navigate to that directory at the Julia REPL,
and then do

```julia
julia> using SnoopCompile

julia> inf_timing = @snoopi tmin=0.01 include("snoopfpn.jl")
3-element Array{Tuple{Float64,Core.MethodInstance},1}:
(0.016037940979003906, MethodInstance for *(::Float64, ::Normed{UInt8,8}))
(0.028206825256347656, MethodInstance for *(::Normed{UInt8,8}, ::Normed{UInt8,8}))
(0.0369410514831543, MethodInstance for Normed{UInt8,8}(::Float64))
```

SnoopCompile contains utilities for generating precompile files that can be `include`d into
your package(s):

```julia
julia> pc = SnoopCompile.parcel(inf_timing)
Dict{Symbol,Array{String,1}} with 1 entry:
:FixedPointNumbers => ["precompile(Tuple{typeof(*),Float64,Normed{UInt8,8}})", "precompile(Tuple{typeof(*),Normed{UInt8,8},Normed{UInt8,8}})", "precompile(Tuple{Type{Normed{UInt8,8}},Float64})"]

julia> SnoopCompile.write("/tmp/precompile", pc)
```

This splits the calls up into a dictionary, `pc`, indexed by the package which "owns"
each call.
(In this case there is only one, `FixedPointNumbers`, but in more complex cases there may
be several.) If you then look in the `/tmp/precompile` directory, you'll see one or more
files, named by their parent package, that may be suitable for `include`ing into your
module definition(s).
These may or may not reduce the time for your package to execute similar operations.
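
For illustration, here is a sketch of what the generated file for `FixedPointNumbers` might
contain, based on the `pc` entries shown above (the exact filename and formatting produced by
`SnoopCompile.write` may differ):

```julia
# Hypothetical contents of a file such as /tmp/precompile/precompile_FixedPointNumbers.jl
function _precompile_()
    ccall(:jl_generating_output, Cint, ()) == 1 || return nothing
    precompile(Tuple{typeof(*),Float64,Normed{UInt8,8}})
    precompile(Tuple{typeof(*),Normed{UInt8,8},Normed{UInt8,8}})
    precompile(Tuple{Type{Normed{UInt8,8}},Float64})
end
```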

While this "automated" approach is often useful, sometimes it makes more sense to
inspect the results and write your own precompile directives.
For example, for FixedPointNumbers a more elegant precompile file could be

```julia
function _precompile_()
ccall(:jl_generating_output, Cint, ()) == 1 || return nothing
for T in (N0f8, N0f16) # Normed types we want to support
for f in (+, -, *, /) # operations we want to support
precompile(Tuple{typeof(f),T,T})
for S in (Float32, Float64, Int) # other number types we want to support
precompile(Tuple{typeof(f),T,S})
precompile(Tuple{typeof(f),S,T})
end
end
for S in (Float32, Float64)
precompile(Tuple{Type{T},S})
precompile(Tuple{Type{S},T})
end
end
end
```

This covers `+`, `-`, `*`, `/`, and conversion for various combinations of types.
The results from `@snoopi` can suggest method/type combinations that might be useful to
precompile, but often you can generalize its suggestions in useful ways.

If you `include` a precompile file into your package definition and then run the same
`@snoopi` command again, hopefully it will omit many of the MethodInstances
you obtained previously.
This is a sign of success.
Unfortunately, at present Julia has some significant limitations in terms of how
extensively it can cache inference results, so you may not see as many of these
methods "disappear" as you might hope.

**NOTE**: if you later modify your code so that some of the methods no longer
exist, `precompile` will *not* throw an error; it will instead fail silently.
If you want to be certain that your precompile directives don't go stale,
preface each with an `@assert`.
Note that this forces you to update your precompile directives as you modify your package,
which may or may not be desirable.
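
For example (a sketch reusing one of the directives generated above):

```julia
# With @assert, a directive whose signature no longer matches any method becomes an
# error at precompile time instead of failing silently:
@assert precompile(Tuple{typeof(*),Normed{UInt8,8},Normed{UInt8,8}})
```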

### Snooping on the compiler

SnoopCompile can also snoop on native code generation using a macro `@snoopc`,
and thereby also determine what methods are being compiled.
Note that native code is not cached, so this approach may not prioritize the most
useful methods in the same way that `@snoopi` does.

Let's demonstrate `@snoopc` with a snoop script, in this case for the `ColorTypes` package:

```julia
using SnoopCompile

### Log the compiles
# This only needs to be run once (to generate "/tmp/colortypes_compiles.log")

SnoopCompile.@snoopc "/tmp/colortypes_compiles.log" begin
using ColorTypes, Pkg
include(joinpath(dirname(dirname(pathof(ColorTypes))), "test", "runtests.jl"))
end

### Parse the compiles and generate precompilation scripts
# This can be run repeatedly to tweak the scripts

data = SnoopCompile.read("/tmp/colortypes_compiles.log")

pc = SnoopCompile.parcel(reverse!(data[2]))
SnoopCompile.write("/tmp/precompile", pc)
```

After the conclusion of this script, the `"/tmp/precompile"` folder will contain a number of `*.jl` files, organized by package.
For each package, you could copy its corresponding `*.jl` file into the package's `src/` directory
and `include` it into the package:

```jl
module SomeModule

# All the usual commands that define the module go here

# ... followed by:

include("precompile.jl")
_precompile_()

end # module SomeModule
```

There's a more complete example illustrating potential options in the `examples/` directory.

### Additional flags

When calling the `@snoopc` macro, a new julia process is spawned using the function `Base.julia_cmd()`.
Advanced users may want to tweak the flags passed to this process to suit specific needs.
This can be done by passing an array of flags of the form `["--flag1", "--flag2"]` as the first argument to the `@snoopc` macro.
For instance, to pass the `--project=/path/to/dir` flag so that the spawned julia process loads the project at that path, a snoop script might look like:
```julia
using SnoopCompile

SnoopCompile.@snoopc ["--project=/path/to/dir"] "/tmp/compiles.csv" begin
# ... statement to snoop on
end

# ... processing the precompile statements
```

## `userimg.jl`

Currently, precompilation does not cache functions from other modules; as a consequence, your speedup in execution time might be smaller than you'd like. In such cases, one strategy is to generate a script for your `base/userimg.jl` file and build the packages (with precompiles) into julia itself. Simply append/replace the last two lines of the above script with

```jl
# Use these two lines if you want to add to your userimg.jl
pc = SnoopCompile.format_userimg(reverse!(data[2]))
SnoopCompile.write("/tmp/userimg_Images.jl", pc)
```

**Users are warned that there are substantial negatives associated with relying on a `userimg.jl` script**:
- Your julia build times become very long
- `Pkg.update()` will have no effect on packages that you've built into julia until you next recompile julia itself. Consequently, you may not get the benefit of enhancements or bug fixes.
- For a package that you sometimes develop, this strategy is very inefficient, because testing a change means rebuilding Julia as well as your package.
[![](https://img.shields.io/badge/docs-stable-blue.svg)](https://timholy.github.io/SnoopCompile.jl/stable)
2 changes: 2 additions & 0 deletions docs/.gitignore
@@ -0,0 +1,2 @@
build/
site/
5 changes: 5 additions & 0 deletions docs/Project.toml
@@ -0,0 +1,5 @@
[deps]
Documenter = "e30172f5-a6a5-5a46-863b-614d45cd2de4"

[compat]
Documenter = "0.24"
15 changes: 15 additions & 0 deletions docs/make.jl
@@ -0,0 +1,15 @@
using Documenter
using SnoopCompile

makedocs(
sitename = "SnoopCompile",
format = Documenter.HTML(
prettyurls = get(ENV, "CI", nothing) == "true"
),
modules = [SnoopCompile],
pages = ["index.md", "snoopi.md", "snoopc.md", "userimg.md", "reference.md"]
)

deploydocs(
repo = "github.com/timholy/SnoopCompile.jl.git"
)
53 changes: 53 additions & 0 deletions docs/src/index.md
@@ -0,0 +1,53 @@
# SnoopCompile.jl

SnoopCompile "snoops" on the Julia compiler, causing it to record the
functions and argument types it's compiling. From these lists of methods,
you can generate lists of `precompile` directives that may reduce the latency between
loading packages and using them to do "real work."

## Background

Julia uses
[Just-in-time (JIT) compilation](https://en.wikipedia.org/wiki/Just-in-time_compilation) to
generate the code that runs on your CPU.
Broadly speaking, there are two major steps: *inference* and *code generation*.
Inference is the process of determining the type of each object, which in turn
determines which specific methods get called; once type inference is complete,
code generation performs optimizations and ultimately generates the assembly
language (native code) used on CPUs.
Some aspects of this process are documented [here](https://docs.julialang.org/en/latest/devdocs/eval).

Every time you load a package in a fresh Julia session, the methods you use
need to be JIT-compiled, and this contributes to the latency of using the package.
In some circumstances, you can save some of the work to reduce the burden next time.
This is called *precompilation*.
Unfortunately, precompilation is not as comprehensive as one might hope.
Currently, Julia is only capable of saving inference results (not native code) in the
`*.ji` files that are the result of precompilation.
Moreover, some significant constraints sometimes prevent Julia from saving even the
inference results, and what does get saved can be invalidated if later packages
provide more specific methods that supersede some of the calls in the precompiled methods.

Despite these limitations, there are cases where precompilation can substantially reduce
latency.
SnoopCompile is designed to make it easy to try precompilation and see whether
it produces measurable benefits.

## Who should use this package

SnoopCompile is intended primarily for package *developers* who want to improve the
experience for their users.
Because the results of SnoopCompile are typically stored in the `*.ji` precompile files,
anyone can take advantage of the reduced latency.

[PackageCompiler](https://github.com/JuliaLang/PackageCompiler.jl) is an alternative
that *non-developer users* may want to consider for their own workflow.
It performs more thorough precompilation than the "standard" usage of SnoopCompile,
although one can achieve a similar effect by creating [`userimg.jl` files](@ref userimg).
However, the cost is vastly increased build times, which is unlikely to be productive
for package developers.

Finally, another alternative that reduces latency without any modifications
to package files is [Revise](https://github.com/timholy/Revise.jl).
It can be used in conjunction with SnoopCompile.
10 changes: 10 additions & 0 deletions docs/src/reference.md
@@ -0,0 +1,10 @@
# Reference

```@docs
@snoopi
@snoopc
SnoopCompile.parcel
SnoopCompile.write
SnoopCompile.read
SnoopCompile.format_userimg
```
58 changes: 58 additions & 0 deletions docs/src/snoopc.md
@@ -0,0 +1,58 @@
# Snooping on code generation: `@snoopc`

`@snoopc` has the advantage of working on any modern version of Julia.
It "snoops" on the code-generation phase of compilation (the 'c' is a reference to
code-generation).
Note that while the native code itself is not cached, snooping on code generation
nevertheless reveals which methods are being compiled.

Note that unlike `@snoopi`, `@snoopc` will report all compiled methods, not just the top-level
methods that trigger compilation.
(It is redundant to precompile dependent methods, but neither is it harmful.)
It is also worth noting that `@snoopc` requires "spinning up" a new Julia process,
and so it is a bit slower than `@snoopi`.

Let's demonstrate `@snoopc` with a snoop script, in this case for the `ColorTypes` package:

```julia
using SnoopCompile

### Log the compiles
# This only needs to be run once (to generate "/tmp/colortypes_compiles.log")

SnoopCompile.@snoopc "/tmp/colortypes_compiles.log" begin
using ColorTypes, Pkg
include(joinpath(dirname(dirname(pathof(ColorTypes))), "test", "runtests.jl"))
end

### Parse the compiles and generate precompilation scripts
# This can be run repeatedly to tweak the scripts

data = SnoopCompile.read("/tmp/colortypes_compiles.log")

pc = SnoopCompile.parcel(reverse!(data[2]))
SnoopCompile.write("/tmp/precompile", pc)
```

As with `@snoopi`, the `"/tmp/precompile"` folder will now contain a number of `*.jl` files,
organized by package.
For each package, you could copy its corresponding `*.jl` file into the package's `src/` directory
and `include` it into the package as described for [`@snoopi`](@ref auto).
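
A minimal sketch of that inclusion pattern, assuming the generated file was copied into `src/`
as `precompile.jl`:

```julia
module SomeModule

# All the usual code that defines the module goes here

# ... followed by:

include("precompile.jl")
_precompile_()

end # module SomeModule
```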

There is a more complete example illustrating potential options in the `examples/` directory.

## Additional flags

When calling the `@snoopc` macro, a new julia process is spawned using the function `Base.julia_cmd()`.
Advanced users may want to tweak the flags passed to this process to suit specific needs.
This can be done by passing an array of flags of the form `["--flag1", "--flag2"]` as the first argument to the `@snoopc` macro.
For instance, to pass the `--project=/path/to/dir` flag so that the spawned julia process loads the project at that path, a snoop script might look like:
```julia
using SnoopCompile

SnoopCompile.@snoopc ["--project=/path/to/dir"] "/tmp/compiles.csv" begin
# ... statement to snoop on
end

# ... processing the precompile statements
```