Commit

Merge pull request #44 from MilesCranmer/defaults
Nicer default behavior
MilesCranmer committed Jun 18, 2024
2 parents f27b6bc + 5904de7 commit e36d467
Showing 9 changed files with 525 additions and 334 deletions.
6 changes: 5 additions & 1 deletion Project.toml
@@ -1,7 +1,7 @@
name = "AirspeedVelocity"
uuid = "1c8270ee-6884-45cc-9545-60fa71ec23e4"
authors = ["Miles Cranmer <miles.cranmer@gmail.com>"]
version = "0.5.3"
version = "0.6.0"

[deps]
BenchmarkTools = "6e4b80f9-dd63-53aa-95a3-0cdb28fa8baf"
@@ -16,6 +16,7 @@ PlotlyLight = "ca7969ec-10b3-423e-8d99-40f33abb42bf"
Printf = "de0858da-6303-5e67-8744-51eddeeeb8d7"
REPL = "3fa0cd96-eef1-5676-8a61-b3b8758bbffb"
Statistics = "10745b16-79ce-11e8-11f9-7d13ad32a3b2"
TOML = "fa267f1f-6049-4f14-aa54-33bafae1ed76"

[compat]
BenchmarkTools = "1"
@@ -24,7 +25,10 @@ DispatchDoctor = "0.4"
FilePathsBase = "0.9"
JSON3 = "1"
OrderedCollections = "1"
Pkg = "1"
PlotlyKaleido = "2"
PlotlyLight = "0.6"
REPL = "1"
Statistics = "1"
TOML = "1"
julia = "1.8"
80 changes: 55 additions & 25 deletions README.md
@@ -30,21 +30,32 @@ This will install two executables at `~/.julia/bin` - make sure to have it on your `PATH`.

## Examples

You may then use the CLI to generate benchmarks for any package with, e.g.,
You may use the CLI to generate benchmarks for any package with, e.g.,

```bash
benchpkg
```

This will benchmark the package defined in the current directory
at the current dirty state, against the default branch (i.e., `main` or `master`),
over all benchmarks defined in `benchmark/benchmarks.jl`. It will then print
a markdown table of the results while also saving the JSON results to the current directory.
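
The benchmarks themselves are defined in `benchmark/benchmarks.jl` using the BenchmarkTools.jl `SUITE` convention. A minimal sketch (the benchmark names and expressions below are illustrative, not from this repository):

```julia
# benchmark/benchmarks.jl -- a minimal sketch (illustrative names).
# AirspeedVelocity runs whatever top-level SUITE this file defines.
using BenchmarkTools

const SUITE = BenchmarkGroup()
SUITE["sort"] = @benchmarkable sort(x) setup = (x = rand(1_000))
SUITE["sum"] = @benchmarkable sum(x) setup = (x = rand(1_000))
```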


You can configure all options with the CLI flags. For example, to benchmark
the registered package `Transducers.jl` at the revisions `v0.4.20`, `v0.4.70`, and `master`,
you can use:

```bash
benchpkg Transducers \
--rev=v0.4.20,v0.4.70,master \
--bench-on=v0.4.20
```

which will benchmark `Transducers.jl`,
at the revisions `v0.4.20`, `v0.4.70`, and `master`,
using the benchmark script `benchmark/benchmarks.jl` as it was defined at `v0.4.20`,
This will further use the benchmark script `benchmark/benchmarks.jl` as it was defined at `v0.4.20`,
and then save the JSON results in the current directory.

We can view the results of the benchmark as a table
with `benchpkgtable`:
We can explicitly view the results of the benchmark as a table with `benchpkgtable`:

```bash
benchpkgtable Transducers \
@@ -131,31 +142,35 @@ For running benchmarks, you can use the `benchpkg` command, which is
built into the `~/.julia/bin` folder:

```markdown
benchpkg package_name [-r --rev <arg>]
[--url <arg>]
[--path <arg>]
[-o, --output-dir <arg>]
[-e, --exeflags <arg>]
[-a, --add <arg>]
[-s, --script <arg>]
[--bench-on <arg>]
[-f, --filter <arg>]
[--nsamples-load-time <arg>]
[--tune]
benchpkg [package_name] [-r --rev <arg>]
[--url <arg>]
[--path <arg>]
[-o, --output-dir <arg>]
[-e, --exeflags <arg>]
[-a, --add <arg>]
[-s, --script <arg>]
[--bench-on <arg>]
[-f, --filter <arg>]
[--nsamples-load-time <arg>]
[--tune]
[--dont-print]

Benchmark a package over a set of revisions.

# Arguments

- `package_name`: Name of the package.
- `package_name`: Name of the package. If not given, the package is assumed to be
the current directory.

# Options

- `-r, --rev <arg>`: Revisions to test (delimit by comma). Use `dirty` to
benchmark the current state of the package at `path` (and not a git commit).
The default is `{DEFAULT},dirty`, which will attempt to find the default branch
of the package.
- `--url <arg>`: URL of the package.
- `--path <arg>`: Path of the package.
- `-o, --output-dir <arg>`: Where to save the JSON results.
- `--path <arg>`: Path of the package. The default is `.` if other arguments are not given.
- `-o, --output-dir <arg>`: Where to save the JSON results. The default is `.`.
- `-e, --exeflags <arg>`: CLI flags for Julia (default: none).
- `-a, --add <arg>`: Extra packages needed (delimit by comma).
- `-s, --script <arg>`: The benchmark script. Default: `benchmark/benchmarks.jl` downloaded from `stable`.
@@ -164,17 +179,22 @@ Benchmark a package over a set of revisions.
- `-f, --filter <arg>`: Filter the benchmarks to run (delimit by comma).
- `--nsamples-load-time <arg>`: Number of samples to take when measuring load time of
the package (default: 5). (This means starting a Julia process for each sample.)
- `--dont-print`: Don't print the table.

# Flags

- `--tune`: Whether to run benchmarks with tuning (default: false).
```
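
As a concrete sketch of combining these options (the package name, tags, and filter string below are placeholders, not from the repository):

```bash
# Placeholder package name and tags; flags as documented above.
# Benchmark two releases plus the dirty working tree, run only benchmarks
# whose names match "solve", and enable tuning:
benchpkg MyPackage \
    --rev=v1.0.0,v1.1.0,dirty \
    --filter=solve \
    --tune
```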

You can also just generate a table:

```markdown
benchpkgtable package_name [-r --rev <arg>] [-i --input-dir <arg>]
[--ratio]
benchpkgtable [package_name] [-r --rev <arg>]
[-i --input-dir <arg>]
[--ratio]
[--mode <arg>]
[--url <arg>]
[--path <arg>]

Print a table of the benchmarks of a package as created with `benchpkg`.

@@ -185,19 +205,29 @@ Print a table of the benchmarks of a package as created with `benchpkg`.
# Options

- `-r, --rev <arg>`: Revisions to test (delimit by comma).
The default is `{DEFAULT},dirty`, which will attempt to find the default branch
of the package.
- `-i, --input-dir <arg>`: Where the JSON results were saved (default: ".").
- `--url <arg>`: URL of the package. Only used to get the package name.
- `--path <arg>`: Path of the package. The default is `.` if other arguments are not given.
Only used to get the package name.

# Flags

- `--ratio`: Whether to include the ratio (default: false). Only applies when
comparing two revisions.
- `--mode`: Table mode(s). Valid values are "time" (default), to print the
benchmark time, or "memory", to print the allocation and memory usage.
Both options can be passed, if delimited by comma.
```
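
For instance, to print both time and memory tables, plus the ratio column, for two previously benchmarked revisions (a sketch using the flags documented above):

```bash
# Sketch: read the JSON results from the current directory and print
# both a time table and a memory table, with a ratio column:
benchpkgtable Transducers \
    --rev=v0.4.20,master \
    --mode=time,memory \
    --ratio
```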

For plotting, you can use the `benchpkgplot` function:

```markdown
benchpkgplot package_name [-r --rev <arg>] [-i --input-dir <arg>]
[-o --output-dir <arg>] [-n --npart <arg>]
benchpkgplot package_name [-r --rev <arg>]
[-i --input-dir <arg>]
[-o --output-dir <arg>]
[-n --npart <arg>]
[--format <arg>]

Plot the benchmarks of a package as created with `benchpkg`.
69 changes: 50 additions & 19 deletions src/BenchPkg.jl
@@ -1,35 +1,40 @@
module BenchPkg

using ..Utils: benchmark
using ..TableUtils: create_table, format_memory
using ..Utils: benchmark, get_package_name_defaults, parse_rev, load_results
using Comonicon
using Comonicon: @main

"""
benchpkg package_name [-r --rev <arg>]
[--url <arg>]
[--path <arg>]
[-o, --output-dir <arg>]
[-e, --exeflags <arg>]
[-a, --add <arg>]
[-s, --script <arg>]
[--bench-on <arg>]
[-f, --filter <arg>]
[--nsamples-load-time <arg>]
[--tune]
benchpkg [package_name] [-r --rev <arg>]
[--url <arg>]
[--path <arg>]
[-o, --output-dir <arg>]
[-e, --exeflags <arg>]
[-a, --add <arg>]
[-s, --script <arg>]
[--bench-on <arg>]
[-f, --filter <arg>]
[--nsamples-load-time <arg>]
[--tune]
[--dont-print]
Benchmark a package over a set of revisions.
# Arguments
- `package_name`: Name of the package.
- `package_name`: Name of the package. If not given, the package is assumed to be
the current directory.
# Options
- `-r, --rev <arg>`: Revisions to test (delimit by comma). Use `dirty` to
benchmark the current state of the package at `path` (and not a git commit).
The default is `{DEFAULT},dirty`, which will attempt to find the default branch
of the package.
- `--url <arg>`: URL of the package.
- `--path <arg>`: Path of the package.
- `-o, --output-dir <arg>`: Where to save the JSON results.
- `--path <arg>`: Path of the package. The default is `.` if other arguments are not given.
- `-o, --output-dir <arg>`: Where to save the JSON results. The default is `.`.
- `-e, --exeflags <arg>`: CLI flags for Julia (default: none).
- `-a, --add <arg>`: Extra packages needed (delimit by comma).
- `-s, --script <arg>`: The benchmark script. Default: `benchmark/benchmarks.jl` downloaded from `stable`.
@@ -38,15 +43,16 @@ Benchmark a package over a set of revisions.
- `-f, --filter <arg>`: Filter the benchmarks to run (delimit by comma).
- `--nsamples-load-time <arg>`: Number of samples to take when measuring load time of
the package (default: 5). (This means starting a Julia process for each sample.)
- `--dont-print`: Don't print the table.
# Flags
- `--tune`: Whether to run benchmarks with tuning (default: false).
"""
@main function benchpkg(
package_name::String;
rev::String,
package_name::String="";
rev::String="{DEFAULT},dirty",
output_dir::String=".",
script::String="",
exeflags::String="",
@@ -57,21 +63,39 @@ Benchmark a package over a set of revisions.
bench_on::String="",
filter::String="",
nsamples_load_time::Int=5,
dont_print::Bool=false,
)
revs = convert(Vector{String}, split(rev, ","))
Base.filter!(x -> length(x) > 0, revs)
Base.filter!(!isempty, revs)

filtered = convert(Vector{String}, split(filter, ","))
Base.filter!(x -> length(x) > 0, filtered)

@assert length(revs) > 0 "No revisions specified."
@assert nsamples_load_time > 0 "nsamples_load_time must be positive."

package_name, url, path = get_package_name_defaults(package_name, url, path)

if path != ""
revs = map(Base.Fix2(parse_rev, path), revs)
else
if any(==("{DEFAULT}"), revs)
error("You must explicitly set `--revs` for this set of options.")
end
end

_script = if bench_on == "dirty" && path != "" && script == ""
bench_on = nothing
joinpath(path, "benchmark", "benchmarks.jl")
else
script
end

benchmark(
package_name,
revs;
output_dir=output_dir,
script=(length(script) > 0 ? script : nothing),
script=(length(_script) > 0 ? _script : nothing),
tune=tune,
exeflags=(length(exeflags) > 0 ? `$(Cmd(split(exeflags, " ") .|> String))` : ``),
extra_pkgs=convert(Vector{String}, split(add, ",")),
@@ -82,6 +106,13 @@ Benchmark a package over a set of revisions.
nsamples_load_time=nsamples_load_time,
)

if !dont_print
combined_results = load_results(package_name, revs; input_dir=output_dir)
println(
create_table(combined_results; add_ratio_col=length(revs) == 2, key="median")
)
end

return nothing
end

36 changes: 29 additions & 7 deletions src/BenchPkgTable.jl
@@ -1,13 +1,17 @@
module BenchPkgTable

using ..TableUtils: create_table, format_memory
using ..Utils: load_results
using ..Utils: get_package_name_defaults, parse_rev, load_results
using Comonicon
using Comonicon: @main

"""
benchpkgtable package_name [-r --rev <arg>] [-i --input-dir <arg>]
[--ratio] [--mode <arg>]
benchpkgtable [package_name] [-r --rev <arg>]
[-i --input-dir <arg>]
[--ratio]
[--mode <arg>]
[--url <arg>]
[--path <arg>]
Print a table of the benchmarks of a package as created with `benchpkg`.
@@ -18,7 +22,12 @@ Print a table of the benchmarks of a package as created with `benchpkg`.
# Options
- `-r, --rev <arg>`: Revisions to test (delimit by comma).
The default is `{DEFAULT},dirty`, which will attempt to find the default branch
of the package.
- `-i, --input-dir <arg>`: Where the JSON results were saved (default: ".").
- `--url <arg>`: URL of the package. Only used to get the package name.
- `--path <arg>`: Path of the package. The default is `.` if other arguments are not given.
Only used to get the package name.
# Flags
@@ -29,16 +38,29 @@ Print a table of the benchmarks of a package as created with `benchpkg`.
Both options can be passed, if delimited by comma.
"""
@main function benchpkgtable(
package_name::String;
rev::String,
package_name::String="";
rev::String="dirty,{DEFAULT}",
input_dir::String=".",
ratio::Bool=false,
mode::String="time",
url::String="",
path::String="",
)
revs = convert(Vector{String}, split(rev, ","))
# Filter empty strings:
revs = filter(x -> length(x) > 0, revs)
Base.filter!(!isempty, revs)

@assert length(revs) > 0 "No revisions specified."

package_name, url, path = get_package_name_defaults(package_name, url, path)

if path != ""
revs = map(Base.Fix2(parse_rev, path), revs)
else
if any(==("{DEFAULT}"), revs)
error("You must explicitly set `--revs` for this set of options.")
end
end

combined_results = load_results(package_name, revs; input_dir=input_dir)

modes = split(mode, ",")
*(Diffs for the remaining changed files are not shown.)*

2 comments on commit e36d467

@MilesCranmer (Owner, Author)


@JuliaRegistrator

Registration pull request created: JuliaRegistries/General/109245

Tip: Release Notes

Did you know you can add release notes too? Just add markdown-formatted text underneath the comment after the text "Release notes:", and it will be added to the registry PR. If TagBot is installed, it will also be added to the release that TagBot creates. For example:

```markdown
@JuliaRegistrator register

Release notes:

## Breaking changes

- blah
```

To add them here just re-invoke and the PR will be updated.

Tagging

After the above pull request is merged, it is recommended that a tag is created on this repository for the registered package version.

This will be done automatically if the Julia TagBot GitHub Action is installed, or can be done manually through the GitHub interface, or via:

```bash
git tag -a v0.6.0 -m "<description of version>" e36d467e463f4ae90d2340879b1ff3f3011916bd
git push origin v0.6.0
```
