Solutions to Advent of Code in Julia (12⭐):
Day | 2015 | 2019 | 2020 |
---|---|---|---|
1 | ⭐⭐ | ⭐⭐ | |
2 | ⭐⭐ | | |
3 | ⭐⭐ | | |
4 | | | |
5 | | | |
6 | | | |
7 | | | |
8 | | | |
9 | | | |
10 | | | |
11 | | | |
12 | | | |
13 | | | |
14 | ⭐⭐ | | |
15 | | | |
16 | ⭐⭐ | | |
17 | | | |
18 | | | |
19 | | | |
20 | | | |
21 | | | |
22 | | | |
23 | | | |
24 | | | |
25 | | | |
Start the Julia REPL (make sure the environment is activated) and download and compile dependencies:

    julia> ]
    (julia) pkg> instantiate
    (julia) pkg> precompile
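If the environment is not active yet, one way to activate it (assuming the Project.toml sits at the repository root) is to start Julia with the --project flag and use the Pkg API directly:

```
$ julia --project=.
julia> using Pkg
julia> Pkg.instantiate()
julia> Pkg.precompile()
```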
Enter the puzzle directory, and run the Julia solution file:

    $ cd 2015/01_not_quite_lisp/
    $ julia aoc201501.jl input.txt
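The exact layout of the solution scripts in this repository isn't shown here, but as a rough sketch, a file such as aoc201501.jl could read the input path from ARGS and print one answer per line (the 2015 day 1 logic below is only an illustration):

```julia
# Minimal sketch of a solution script; not necessarily how this repo structures them.
function solve(data)
    steps = [c == '(' ? 1 : -1 for c in strip(data)]  # '(' goes up one floor, ')' goes down
    floors = cumsum(steps)
    part1 = floors[end]                # final floor
    part2 = findfirst(==(-1), floors)  # first position that enters the basement
    return part1, part2
end

if abspath(PROGRAM_FILE) == @__FILE__
    part1, part2 = solve(read(ARGS[1], String))
    println(part1)
    println(part2)
end
```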
TODO: Create a Copier task that creates a new Julia puzzle template
You can test all solutions by running:

    $ julia test_all_puzzles.jl
Note that the tests work by running each puzzle solution on the input.txt file in its puzzle directory and comparing the output to an output.jl.txt file in the same directory.
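As a rough illustration of that comparison strategy only (the actual test_all_puzzles.jl may be implemented quite differently), a harness could walk the year and puzzle directories, run each script on its input.txt, and compare the captured output with output.jl.txt:

```julia
# Hedged sketch of the test strategy described above; the directory layout is assumed.
for year in filter(isdir, readdir("."; join=true))
    for puzzle in filter(isdir, readdir(year; join=true))
        scripts = filter(endswith(".jl"), readdir(puzzle; join=true))
        expected_file = joinpath(puzzle, "output.jl.txt")
        (isempty(scripts) || !isfile(expected_file)) && continue
        input = joinpath(puzzle, "input.txt")
        actual = read(`$(Base.julia_cmd()) $(first(scripts)) $input`, String)
        println(actual == read(expected_file, String) ? "PASS " : "FAIL ", puzzle)
    end
end
```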
You can include a benchmarking report by adding the -r option:

    $ julia test_all_puzzles.jl -r
This will create a file named timings.jl.md that contains timing information for each puzzle.
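Exactly how the report is produced isn't described above; purely as a sketch (the real script may use BenchmarkTools or a different layout for timings.jl.md), the timings could be collected with @elapsed and written out as a small markdown table:

```julia
# Hedged sketch: time one run of each puzzle and write a markdown table to timings.jl.md.
open("timings.jl.md", "w") do io
    println(io, "| Puzzle | Time (s) |")
    println(io, "| --- | --- |")
    for year in filter(isdir, readdir("."; join=true))
        for puzzle in filter(isdir, readdir(year; join=true))
            scripts = filter(endswith(".jl"), readdir(puzzle; join=true))
            input = joinpath(puzzle, "input.txt")
            (isempty(scripts) || !isfile(input)) && continue
            cmd = `$(Base.julia_cmd()) $(first(scripts)) $input`
            t = @elapsed run(pipeline(cmd; stdout=devnull))
            println(io, "| $puzzle | $(round(t; digits=3)) |")
        end
    end
end
```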
Follow these steps after solving a puzzle (a hypothetical helper that scripts them is sketched after this list):

- Store the solution to an output file:

      $ cd 2015/01_not_quite_lisp/
      $ julia aoc201501.jl input.txt > output.jl.txt

- Run benchmarks and add them to the README:

      $ cd ../..
      $ julia test_all_puzzles.jl -r

- Update READMEs across all projects:

      $ cd ..
      $ make
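As an optional convenience, the three steps could also be scripted. The helper below is purely hypothetical (it is not part of this repository) and assumes it is run from the repository root:

```julia
# Hypothetical helper, not part of this repo: automate the post-solve steps for one
# puzzle, e.g. update_puzzle("2015/01_not_quite_lisp").
function update_puzzle(puzzle_dir)
    script = first(filter(endswith(".jl"), readdir(puzzle_dir; join=true)))
    input = joinpath(puzzle_dir, "input.txt")
    # 1. Store the solution to the reference output file.
    write(joinpath(puzzle_dir, "output.jl.txt"),
          read(`$(Base.julia_cmd()) $script $input`, String))
    # 2. Re-run the benchmark report.
    run(`$(Base.julia_cmd()) test_all_puzzles.jl -r`)
    # 3. Regenerate the READMEs (assumes the Makefile lives one level up).
    run(Cmd(`make`; dir=".."))
end
```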