cargo-make

Rust task runner and build tool.

Overview

The cargo-make task runner enables you to define and configure sets of tasks and run them as a flow.
A task is a command, script, Rust code or other sub tasks to execute.
Tasks can have dependencies which are also tasks that will be executed before the task itself.
With a simple toml based configuration file, you can define a multi platform build script that can run build, test, generate documentation, run bench tests, run security validations and more, executed by running a single command.

Installation

In order to install, just run the following command:

cargo install --force cargo-make

This will install cargo-make in your ~/.cargo/bin.
Make sure to add ~/.cargo/bin directory to your PATH variable.

You will have two executables available: cargo-make and makers

  • cargo-make - This is a cargo plugin invoked using cargo make ...
  • makers - A standalone executable which provides the same features and cli arguments as cargo-make but is invoked directly and not as a cargo plugin.

See Cli Options section for full CLI instructions.

In order to install with minimal features (for example, no TLS support), run the following:

cargo install --no-default-features --force cargo-make

Binary Release

Binary releases are available on the GitHub releases page.
The following binaries are available for each release:

  • x86_64-unknown-linux-musl
  • x86_64-apple-darwin
  • x86_64-pc-windows-msvc
  • arm-unknown-linux-gnueabihf

Usage

When using cargo-make, all tasks are defined and configured via toml files.
Below are simple instructions to get you started quickly.

Simple Example

In order to run a set of tasks, you first must define them in a toml file.
For example, if we would like to have a script which:

  • Formats the code
  • Cleans old target directory
  • Runs build
  • Runs tests

We will create a toml file as follows:

[tasks.format]
install_crate = "rustfmt"
command = "cargo"
args = ["fmt", "--", "--emit=files"]

[tasks.clean]
command = "cargo"
args = ["clean"]

[tasks.build]
command = "cargo"
args = ["build"]
dependencies = ["clean"]

[tasks.test]
command = "cargo"
args = ["test"]
dependencies = ["clean"]

[tasks.my-flow]
dependencies = [
    "format",
    "build",
    "test"
]

We would execute the flow with the following command:

cargo make --makefile simple-example.toml my-flow

The output would look something like this:

[cargo-make] INFO - cargo make 0.35.7
[cargo-make] INFO - Build File: simple-example.toml
[cargo-make] INFO - Task: my-flow
[cargo-make] INFO - Setting Up Env.
[cargo-make] INFO - Running Task: format
[cargo-make] INFO - Execute Command: "cargo" "fmt" "--" "--emit=files"
[cargo-make] INFO - Running Task: clean
[cargo-make] INFO - Execute Command: "cargo" "clean"
[cargo-make] INFO - Running Task: build
[cargo-make] INFO - Execute Command: "cargo" "build"
   Compiling bitflags v0.9.1
   Compiling unicode-width v0.1.4
   Compiling quote v0.3.15
   Compiling unicode-segmentation v1.1.0
   Compiling strsim v0.6.0
   Compiling libc v0.2.24
   Compiling serde v1.0.8
   Compiling vec_map v0.8.0
   Compiling ansi_term v0.9.0
   Compiling unicode-xid v0.0.4
   Compiling synom v0.11.3
   Compiling rand v0.3.15
   Compiling term_size v0.3.0
   Compiling atty v0.2.2
   Compiling syn v0.11.11
   Compiling textwrap v0.6.0
   Compiling clap v2.25.0
   Compiling serde_derive_internals v0.15.1
   Compiling toml v0.4.2
   Compiling serde_derive v1.0.8
   Compiling cargo-make v0.1.2 (file:///home/ubuntu/workspace)
    Finished dev [unoptimized + debuginfo] target(s) in 79.75 secs
[cargo-make] INFO - Running Task: test
[cargo-make] INFO - Execute Command: "cargo" "test"
   Compiling cargo-make v0.1.2 (file:///home/ubuntu/workspace)
    Finished dev [unoptimized + debuginfo] target(s) in 5.1 secs
     Running target/debug/deps/cargo_make-d5f8d30d73043ede

running 10 tests
test log::tests::create_info ... ok
test log::tests::get_level_error ... ok
test log::tests::create_verbose ... ok
test log::tests::get_level_info ... ok
test log::tests::get_level_other ... ok
test log::tests::get_level_verbose ... ok
test installer::tests::is_crate_installed_false ... ok
test installer::tests::is_crate_installed_true ... ok
test command::tests::validate_exit_code_error ... ok
test log::tests::create_error ... ok

test result: ok. 10 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out

[cargo-make] INFO - Running Task: my-flow
[cargo-make] INFO - Build done in 72 seconds.

We have now created a build script that can run on any platform.

cargo-make can be invoked as a cargo plugin via 'cargo make' command or as a standalone executable via 'makers' command.

Important Note: if you are running this example in a cargo workspace, you will need to add the following to the top of the file:

[env]
CARGO_MAKE_EXTEND_WORKSPACE_MAKEFILE = true

More on workspace support can be found in the relevant sections of this document.

Tasks, Dependencies and Aliases

In many cases, certain tasks depend on other tasks.
For example, you might want to format the code before running the build, and run the build before running the tests.
Such flow can be defined as follows:

[tasks.format]
install_crate = "rustfmt"
command = "cargo"
args = ["fmt", "--", "--emit=files"]

[tasks.build]
command = "cargo"
args = ["build"]
dependencies = ["format"]

[tasks.test]
command = "cargo"
args = ["test"]
dependencies = ["build"]

When you run:

cargo make --makefile ./my_build.toml test

It will try to run test, see that it has dependencies, and that those have other dependencies.
Therefore it will create an execution plan for the tasks based on the tasks and their dependencies.
In our case it will invoke format -> build -> test.

The same task will never be executed twice, so if we have, for example:

[tasks.A]
dependencies = ["B", "C"]

[tasks.B]
dependencies = ["D"]

[tasks.C]
dependencies = ["D"]

[tasks.D]
script = "echo hello"

In this example, A depends on B and C, and both B and C are dependent on D.
Task D however will not be invoked twice.
The output of the execution will look something like this:

[cargo-make] INFO - Task: A
[cargo-make] INFO - Setting Up Env.
[cargo-make] INFO - Running Task: D
[cargo-make] INFO - Execute Command: "sh" "/tmp/cargo-make/CNuU47tIix.sh"
hello
[cargo-make] INFO - Running Task: B
[cargo-make] INFO - Running Task: C
[cargo-make] INFO - Running Task: A

As you can see, 'hello' was printed once by task D as it was only invoked once.
But what if we want to run D twice?
The simple answer would be to duplicate task D and have B depend on D and C depend on D2, which is a copy of D.
But duplicating can lead to bugs and to huge makefiles, so we have aliases for that.
An alias task has its own name and points to another task.
All of the definitions of the alias task are ignored.
So now, if we want to have D execute twice we can do the following:

[tasks.A]
dependencies = ["B", "C"]

[tasks.B]
dependencies = ["D"]

[tasks.C]
dependencies = ["D2"]

[tasks.D]
script = "echo hello"

[tasks.D2]
alias = "D"

Now C depends on D2 and D2 is an alias for D.
The execution output of such a makefile would look as follows:

[cargo-make] INFO - Task: A
[cargo-make] INFO - Setting Up Env.
[cargo-make] INFO - Running Task: D
[cargo-make] INFO - Execute Command: "sh" "/tmp/cargo-make/HP0UD7pgoX.sh"
hello
[cargo-make] INFO - Running Task: B
[cargo-make] INFO - Running Task: D2
[cargo-make] INFO - Execute Command: "sh" "/tmp/cargo-make/TuuZJkqCE2.sh"
hello
[cargo-make] INFO - Running Task: C
[cargo-make] INFO - Running Task: A

Now you can see that 'hello' was printed twice.

Tasks may also depend on tasks in other files. To do this, specify the dependency with the object format, providing the path. cargo-make will use this path as it would any other supplied on the command line: if a filename is supplied, it searches that file; otherwise it searches for the default Makefile.toml in that path.

[tasks.install]
command = "mv"
args = ["src/B/out", "src/C/static"]
dependencies = [
  { name = "compile", path = "src/B" },
  { name = "clean", path = "src/C/tasks.toml" },
]

It is also possible to define platform specific aliases, for example:

[tasks.my_task]
linux_alias = "linux_my_task"
windows_alias = "windows_my_task"
mac_alias = "mac_my_task"

[tasks.linux_my_task]

[tasks.mac_my_task]

[tasks.windows_my_task]

If a platform specific alias is found and matches the current platform, it will take precedence over the non platform alias definition.
For example:

[tasks.my_task]
linux_alias = "run"
alias = "do_nothing"

[tasks.run]
script = "echo hello"

[tasks.do_nothing]

If you run task my_task on windows or mac, it will invoke the do_nothing task.
However, if executed on a linux platform, it will invoke the run task.

As a side note, cargo-make will attempt to invoke the task dependencies in the order that they were defined, unless they are also defined as sub dependencies.

Commands, Scripts and Sub Tasks

The actual operation that a task invokes can be defined in 3 ways.
The below explains each one:

  • run_task - Invokes another task with the name defined in this attribute. Unlike dependencies which are invoked before the current task, the task defined in the run_task is invoked after the current task.
  • command - The command attribute defines what executable to invoke. You can use the args attribute to define what command line arguments to provide as part of the command.
  • script - Invokes the script. You can change the executable used to invoke the script using the script_runner attribute. If not defined, the default platform runner is used (cmd for windows, sh for others).

Only one of the definitions will be used.
If multiple attributes are defined (for example both command and script), the task will fail during invocation.

The script attribute may hold non OS scripts, for example rust code to be compiled and executed.
In order to use non OS script runners, you must define the special script_runner with the @ prefix.
The following runners are currently supported:

  • @duckscript - Executes the defined duckscript code. See example
  • @rust - Compiles and executes the defined rust code. See example
  • @shell - For windows platform, it will try to convert the shell commands to windows batch commands (only basic scripts are supported) and execute the script, for other platforms the script will be executed as is. See example

Below are some basic examples of each action type.

Sub Task

In this example, if we execute the flow task, it will invoke the echo task defined in the run_task attribute.

[tasks.echo]
script = "echo hello world"

[tasks.flow]
run_task = "echo"

A more complex example below demonstrates the ability to define multiple task names and optional conditions attached to each task.
The first task for which the conditions are met (or if no conditions are defined at all), will be invoked.
If no task conditions are met, no sub task will be invoked.
More on conditions can be found in the conditions section.

[tasks.test1]
command = "echo"
args = ["running test1"]

[tasks.test2]
command = "echo"
args = ["running test2"]

[tasks.test3]
command = "echo"
args = ["running test3"]

[tasks.test-default]
command = "echo"
args = ["running test-default"]

[tasks.test-routing]
run_task = [
    { name = "test1", condition = { platforms = ["windows", "linux"], channels = ["beta", "stable"] } },
    { name = "test2", condition = { platforms = ["mac"], rust_version = { min = "1.20.0", max = "1.30.0" } } },
    { name = "test3", condition_script = [ "somecommand" ] },
    { name = "test-default" }
]

It is also possible to run the sub task as a forked sub process using the fork attribute.
This prevents any environment changes done in the sub task from impacting the rest of the flow in the parent process.
Example of invoking the sub task in a forked sub process:

[tasks.echo]
command = "echo"
args = ["hello world"]

[tasks.fork-example]
run_task = { name = "echo", fork = true }

The name attribute can hold either a single task name or a list of tasks.
In case of a list, the tasks would be invoked one after the other in sequence.
For example, below simple-multi and routing-multi both demonstrate different ways to define multi task invocations via run_task:

[tasks.echo1]
command = "echo"
args = ["1"]

[tasks.echo2]
command = "echo"
args = ["2"]

[tasks.simple-multi]
run_task = { name = ["echo1", "echo2"] }

[tasks.routing-multi]
run_task = [
    { name = ["echo1", "echo2"] },
]

You can also set up a cleanup task which will run after the sub task, even if the sub task failed.
This is only supported in combination with the fork = true attribute.
For example:

[tasks.echo1]
command = "echo"
args = ["1"]

[tasks.echo2]
command = "echo"
args = ["2"]

[tasks.fail]
script =  "exit 1"

[tasks.cleanup]
command = "echo"
args = ["cleanup"]

[tasks.cleanup-example]
run_task = { name = ["echo1", "echo2", "fail"], fork = true, cleanup_task = "cleanup" }

In order to run multiple tasks in parallel, add parallel = true to the run_task object.
For example:

[tasks.echo1]
command = "echo"
args = ["1"]

[tasks.echo2]
command = "echo"
args = ["2"]

[tasks.parallel-multi]
run_task = { name = ["echo1", "echo2"], parallel = true }

This allows running independent tasks in parallel and can speed up the overall performance of the flow.
Be aware that parallel invocation of tasks will cause issues if the following features are used:

  • Setting the task current working directory via cwd attribute will result in all parallel tasks being affected.
  • Avoid using CARGO_MAKE_CURRENT_TASK_ type environment variables as those may hold incorrect values.

Command

For running commands, you can also define the command line arguments. For example, the below task invokes cargo with the build command and additional flags as command line arguments:

[tasks.build-with-verbose]
command = "cargo"
args = ["build", "--verbose", "--all-features"]

It is possible to provide environment variables as part of the command and arguments, to be replaced at runtime with the actual values, for example:

[env]
SIMPLE = "SIMPLE VALUE"
ECHO_CMD = "echo"

[tasks.expand]
command = "${ECHO_CMD}"
args = [
    "VALUE: ${SIMPLE}"
]

The cargo-make CLI also supports additional arguments which will be made available to all tasks.
The following example task will print those additional arguments:

[tasks.varargs]
command = "echo"
args = [
    "args are:", "${@}"
]

Invoking cargo-make with additional arguments would result in the following:

> cargo make varargs arg1 arg2 arg3

[cargo-make] INFO - cargo make 0.35.7
[cargo-make] INFO - Build File: Makefile.toml
[cargo-make] INFO - Task: varargs
[cargo-make] INFO - Setting Up Env.
[cargo-make] INFO - Running Task: init
[cargo-make] INFO - Running Task: varargs
[cargo-make] INFO - Execute Command: "echo" "args are:" "arg1" "arg2" "arg3"
args are: arg1 arg2 arg3
[cargo-make] INFO - Running Task: end
[cargo-make] INFO - Build Done  in 0 seconds.

Invoking cargo-make without any additional arguments would result in the following:

> cargo make varargs

[cargo-make] INFO - cargo make 0.35.7
[cargo-make] INFO - Build File: Makefile.toml
[cargo-make] INFO - Task: varargs
[cargo-make] INFO - Setting Up Env.
[cargo-make] INFO - Running Task: init
[cargo-make] INFO - Running Task: varargs
[cargo-make] INFO - Execute Command: "echo" "args are:"
args are:
[cargo-make] INFO - Running Task: end
[cargo-make] INFO - Build Done  in 0 seconds.

This can also be used for templating, for example:

[tasks.varargs]
command = "echo"
args = [
    "args are:", "-o=${@}"
]

Would output:

> cargo make varargs arg1 arg2 arg3

[cargo-make] INFO - cargo make 0.35.7
[cargo-make] INFO - Build File: Makefile.toml
[cargo-make] INFO - Task: varargs
[cargo-make] INFO - Setting Up Env.
[cargo-make] INFO - Running Task: init
[cargo-make] INFO - Running Task: varargs
[cargo-make] INFO - Execute Command: "echo" "args are:" "-o=arg1" "-o=arg2" "-o=arg3"
args are: -o=arg1 -o=arg2 -o=arg3
[cargo-make] INFO - Running Task: end
[cargo-make] INFO - Build Done  in 0 seconds.

Command line arguments can also contain built in functions which are explained later on in this document.

Script

Below is a simple script which prints hello world.

[tasks.hello-world]
script = [
    "echo start...",
    "echo \"Hello World From Script\"",
    "echo end..."
]

You can use a multi line toml string to make the script more readable, as follows:

[tasks.hello-world]
script = '''
echo start...
echo "Hello World From Script"
echo end...
'''

The cargo-make CLI also supports additional arguments which will be made available to all tasks.
The following example task will print those additional arguments:

[tasks.cli-args]
script = "echo args are: ${@}"

Invoking cargo-make with additional arguments would result in the following:

> cargo make cli-args arg1 arg2 arg3

[cargo-make] INFO - cargo make 0.35.7
[cargo-make] INFO - Build File: Makefile.toml
[cargo-make] INFO - Task: cli-args
[cargo-make] INFO - Setting Up Env.
[cargo-make] INFO - Running Task: init
[cargo-make] INFO - Running Task: cli-args
+ cd /projects/rust/cargo-make/examples
+ echo args are: arg1 arg2 arg3
args are: arg1 arg2 arg3
[cargo-make] INFO - Running Task: end

Invoking cargo-make without any additional arguments would result in the following:

> cargo make cli-args

[cargo-make] INFO - cargo make 0.35.7
[cargo-make] INFO - Build File: Makefile.toml
[cargo-make] INFO - Task: cli-args
[cargo-make] INFO - Setting Up Env.
[cargo-make] INFO - Running Task: init
[cargo-make] INFO - Running Task: cli-args
+ cd /projects/rust/cargo-make/examples
+ echo args are:
args are:
[cargo-make] INFO - Running Task: end
[cargo-make] INFO - Build Done  in 0 seconds.

It is also possible to point to an existing script instead of holding the script text inside the makefile by using the file property as follows:

[tasks.hello-world-from-script-file]
script = { file = "script.sh" }

Script file paths are always relative to the current working directory, unless the absolute_path attribute is set, for example:

[tasks.hello-world-from-script-file-absolute-path]
script = { file = "${CARGO_MAKE_WORKING_DIRECTORY}/script.sh", absolute_path = true }

File paths support environment substitution.

Favor commands over scripts, as commands support more features such as automatic dependency installation, argument functions, and more.
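For example, the following two tasks (names are illustrative) run the same cargo build, but only the command form benefits from the features mentioned above:

[tasks.build-as-command]
command = "cargo"
args = ["build"]

[tasks.build-as-script]
script = "cargo build"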

In order to share common script content among multiple tasks, you can use the script pre/main/post form as follows:

[tasks.base-script]
script.pre = "echo start"
script.main = "echo old"
script.post = "echo end"

[tasks.extended-script]
extend = "base-script"
script.main = "echo new"

Running extended-script task would print:

start
new
end

Duckscript

Duckscript is an incredibly simple, shell-like language which provides cross platform shell scripting capabilities.
Duckscript is embedded inside cargo-make, so unlike other scripting solutions or commands, duckscript can change cargo-make environment variables from inside the script.
In addition, you can run cargo-make tasks from within a duckscript script.
This allows a really powerful two way integration with cargo-make.

[tasks.duckscript-example]
script_runner = "@duckscript"
script = '''
task_name = get_env CARGO_MAKE_CURRENT_TASK_NAME
echo The currently running cargo make task is: ${task_name}

# since all env vars are auto loaded as duckscript variables by cargo-make
# you can access them directly
echo The currently running cargo make task is: ${CARGO_MAKE_CURRENT_TASK_NAME}

cd .. # this changes cargo-make current working directory (cargo-make will revert to original directory after script execution)
pwd
set_env CARGO_MAKE_CURRENT_TASK_NAME tricking_cargo_make
'''

The next example shows how to invoke cargo-make tasks from duckscript:

[tasks.run-task-from-duckscript]
script_runner = "@duckscript"
script = '''
echo first invocation of echo1 task:
cm_run_task echo1
echo second invocation of echo1 task:
cm_run_task echo1

echo running task: echo2:
cm_run_task echo2
'''

[tasks.echo1]
command = "echo"
args = ["1"]

[tasks.echo2]
command = "echo"
args = ["2"]

Same as OS scripts, the @duckscript runner also supports the cargo-make CLI arguments access.
In addition, all environment variables are preloaded as duckscript variables and can be directly read from the script (no need to invoke the get_env command).

Rust Code

In this example, when the rust task is invoked, the script content will be compiled and executed. You can see how dependencies are defined in Cargo.toml format inside the code.

[tasks.rust]
script_runner = "@rust"
script = '''
//! ```cargo
//! [dependencies]
//! envmnt = "*"
//! ```
fn main() {
    let value = envmnt::get_or("PATH", "NO PATH VAR DEFINED");
    println!("Path Value: {}", &value);
}
'''

Same as OS scripts, the @rust runner also supports the cargo-make CLI arguments access.
There are several different Rust script runners currently available: rust-script, cargo-script and cargo-play.

By default, rust-script is used; however, this can be changed via the CARGO_MAKE_RUST_SCRIPT_PROVIDER environment variable, which should hold the crate name.
This enables defining a different runner for each task by setting the variable in the env block of the specific task.
For example:

[tasks.rust-script]
env = { "CARGO_MAKE_RUST_SCRIPT_PROVIDER" = "rust-script" }
script_runner = "@rust"
script = '''
fn main() {
    println!("test");
}
'''

[tasks.cargo-script]
env = { "CARGO_MAKE_RUST_SCRIPT_PROVIDER" = "cargo-script" }
script_runner = "@rust"
script = '''
fn main() {
    println!("test");
}
'''

[tasks.cargo-play]
env = { "CARGO_MAKE_RUST_SCRIPT_PROVIDER" = "cargo-play" }
script_runner = "@rust"
script = '''
fn main() {
    println!("test");
}
'''

Keep in mind that dependencies used by the rust script are defined differently for each runner.
Please see the specific crate docs to learn more.

Cross Platform Shell

In this example, when the shell task is invoked, the script content will be automatically converted to windows batch commands (in case we are on windows platform) and invoked.

[tasks.shell]
script_runner = "@shell"
script = '''
rm ./myfile.txt
'''

Same as OS scripts, the @shell runner also supports the cargo-make CLI arguments access.

See shell2batch project for complete set of features.

Other Programming Languages

cargo-make can also run scripts written in various scripting languages such as python, perl, ruby, javascript and more...
Any runner which takes the form of command file (for example python ./program.py) is supported.

Below are few examples:

[tasks.python]
script_runner = "python"
script_extension = "py"
script = '''
print("Hello, World!")
'''

[tasks.perl]
script_runner = "perl"
script_extension = "pl"
script = '''
print "Hello, World!\n";
'''

[tasks.javascript]
script_runner = "node"
script_extension = "js"
script = '''
console.log('Hello, World!');
'''

[tasks.php]
script_runner = "php"
script_extension = "php"
script = '''
<?php
echo "Hello, World!\n";
'''

[tasks.powershell]
script_runner = "powershell"
script_extension = "ps1"
script = '''
Write-Host "Hello, World!"
'''

In case you need to provide the script runner arguments before the script file, you can use the script_runner_args attribute.
For example:

[tasks.php-with-args]
script_runner = "php"
script_runner_args = ["-f"]
script_extension = "php"
script = '''
<?php
echo "Hello, World!\n";
'''

script_runner_args requires script_extension to be defined as well.

Shebang Support

Instead of defining custom runners via script_runner attribute, it's possible to define it in the script shebang line.

On windows, make sure not to use a runner that does not treat the # character as a comment (cmd.exe, for example, does not), as the shebang line would then lead to an error.

Example task using bash:

[tasks.shebang-sh]
script = '''
#!/usr/bin/env bash
echo hello
'''

Output:

> cargo make --cwd ./examples --makefile ./shebang.toml shebang-sh
[cargo-make] INFO - cargo make 0.35.7
[cargo-make] INFO - Build File: ./shebang.toml
[cargo-make] INFO - Task: shebang-sh
[cargo-make] INFO - Profile: development
[cargo-make] INFO - Setting Up Env.
[cargo-make] INFO - Running Task: init
[cargo-make] INFO - Running Task: shebang-sh
[cargo-make] INFO - Execute Command: "/usr/bin/env" "bash" "/tmp/cargo-make/cJf6XEXrL9.sh"
hello
[cargo-make] INFO - Running Task: end
[cargo-make] INFO - Build Done  in 0 seconds.

Example task using python:

[tasks.shebang-python]
script = '''
#!/usr/bin/env python3
print("Hello, World!")
'''

Output:

> cargo make --cwd ./examples --makefile ./shebang.toml shebang-python
[cargo-make] INFO - cargo make 0.35.7
[cargo-make] INFO - Build File: ./shebang.toml
[cargo-make] INFO - Task: shebang-python
[cargo-make] INFO - Profile: development
[cargo-make] INFO - Setting Up Env.
[cargo-make] INFO - Running Task: init
[cargo-make] INFO - Running Task: shebang-python
[cargo-make] INFO - Execute Command: "/usr/bin/env" "python3" "/tmp/cargo-make/Wy3QMJiQaS.sh"
Hello, World!
[cargo-make] INFO - Running Task: end
[cargo-make] INFO - Build Done  in 0 seconds.

Another trick you can do with shebang lines, is to define one of the special runners like @duckscript as follows:

[tasks.duckscript-shebang-example]
script = '''
#!@duckscript
echo Running duckscript without runner attribute.
'''

However that language must support comments starting with the # character.

Default Tasks and Extending

There is no real need to define some of the basic build, test, ... tasks that were shown in the previous examples.
cargo-make comes with a built in toml file that will serve as a base for every execution.
The optional external toml file that is provided while running cargo-make will only extend and add or overwrite tasks that are defined in the default makefiles.
Let's take the build task definition which already comes in the default toml:

[tasks.build]
description = "Runs the rust compiler."
category = "Build"
command = "cargo"
args = ["build", "--all-features"]

If, for example, you would like to add verbose output and remove the --all-features flag, you would just need to change the args and add --verbose as follows:

[tasks.build]
args = ["build", "--verbose"]

If you want to disable some existing task (which will also disable its dependencies), you can do it as follows:

[tasks.build]
disabled = true

There is no need to redefine existing properties of the task, only what needs to be added or overwritten.
The default toml file comes with many steps and flows already built in, so it is worth checking it first.

In case you do want to delete all of the original task attributes in your extended task, you can use the clear attribute as follows:

[tasks.sometask]
clear = true
command = "echo"
args = [
    "extended task"
]

You can also extend additional external files from your external makefile by using the extend attribute, for example:

extend = "my_common_makefile.toml"

The file path in the extend attribute is always relative to the current toml file you are in and not to the process working directory.

The extend attribute can be very useful when you have a workspace with a Makefile.toml that contains all of the common custom tasks and in each project you can have a simple Makefile.toml which just has the extend attribute pointing to the workspace makefile.

Extending External Makefiles

A makefile can extend additional external files by using the extend attribute, for example:

extend = "my_common_makefile.toml"

The file path in the extend attribute is always relative to the current toml file you are in and not to the process working directory.
The makefile pointed to in the extend attribute must exist or the build will fail.

In order to define optional extending makefiles, you will need to pass the optional flag in addition to the path as follows:

extend = { path = "does_not_exist_makefile.toml", optional = true }

You can also define a list of makefiles to extend from.
All will be loaded in the order you define.
For example:

extend = [ { path = "must_have_makefile.toml" }, { path = "optional_makefile.toml", optional = true }, { path = "another_must_have_makefile.toml" } ]

Automatically Extend Workspace Makefile

When running cargo make for modules which are part of a workspace, you can automatically have the member crates' makefile (even if it doesn't exist) extend the workspace level makefile.

The workspace level makefile env section must contain the following environment variable (it can also be set via the cli):

[env]
CARGO_MAKE_EXTEND_WORKSPACE_MAKEFILE = true

This allows you to maintain a single makefile for the entire workspace while having access to those custom tasks in every member crate.
This is only relevant for workspace builds which are triggered in the workspace root.
Flows that start directly in the member crate must manually extend the workspace level makefile using the extend keyword.
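A minimal sketch of such a member crate Makefile.toml (the relative path and task name are assumptions for illustration):

extend = "../Makefile.toml"

[tasks.member-only-task]
command = "echo"
args = ["running a member crate specific task"]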

Load Scripts

In more complex scenarios, you may want multiple unrelated projects to share some common custom tasks, for example if you wish to notify some internal company server of the build status.
Instead of redefining those tasks in each project you can create a single toml file with those definitions and have all projects extend that file.
The extend attribute, however, only knows how to find the extended files on the local file system, so in order to pull a common toml file from a remote server (using http, git clone and so on), you can use load scripts.

Load scripts are defined in the config section using the load_script attribute and are invoked before the extend attribute is evaluated.
This allows you to first pull the toml file from the remote server and put it in a location defined by the extend attribute.

Here is an example of a load script which downloads the common toml from a remote server using HTTP:

[config]
load_script = "wget -O /home/myuser/common.toml companyserver.com/common.toml"

Here is an example of pulling the common toml file from some git repo:

[config]
load_script = "git clone git@mygitserver:user/project.git /home/myuser/common"

You can run any command or set of commands you want, therefore you can build a more complex flow of how and from where to fetch the common toml file and where to put it.
If needed, you can override the load_script per platform using the linux_load_script, windows_load_script and mac_load_script attributes.
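As a rough sketch (the windows command and paths here are assumptions, not official examples), a platform specific override could look like this:

[config]
load_script = "wget -O ./common.toml companyserver.com/common.toml"
windows_load_script = "powershell -Command Invoke-WebRequest -Uri https://companyserver.com/common.toml -OutFile common.toml"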

Predefined Makefiles

While cargo-make comes with many built in tasks, defined in the default makefiles, they are not always relevant for every project.
The cargo-make-tasks repository holds a collection of additional makefiles that can be loaded and provide replacement tasks for the built in cargo-make tasks.
For example the cmake.toml provides cmake related tasks for projects using cmake.

See the cargo-make-tasks repository for more information and usage examples.

Extending Tasks

There are multiple ways of extending tasks in the same or from extended makefiles.

Task Override

cargo-make comes with many predefined tasks and flows that can be used without redefining them in your project.
However, in some cases you would like to change them a bit to fit your needs without rewriting the entire task.
Let's take, for example, the build task which is predefined internally inside cargo-make as follows:

[tasks.build]
description = "Runs the rust compiler."
category = "Build"
command = "cargo"
args = ["build", "--all-features"]

If for example you do not want to use the --all-features mode, you can just change the args of the task in your external Makefile.toml as follows:

[tasks.build]
args = ["build"]

When cargo-make starts up, it will load the external Makefile.toml and the internal makefile definitions and will merge them.
Since the external file overrides the internal definitions, only the args attribute for the build task which was redefined, will override the args attribute which was defined internally, and the actual result would be:

[tasks.build]
description = "Runs the rust compiler."
category = "Build"
command = "cargo"
args = ["build"]

The same process can be used to override tasks from other makefiles loaded using the extend keyword from Extending External Makefiles section.

Platform Override

In case you want to override a task or specific attributes in a task for specific platforms, you can define an override task with the platform name (currently linux, windows and mac) under the specific task.
For example:

[tasks.hello-world]
script = '''
echo "Hello World From Unknown"
'''

[tasks.hello-world.linux]
script = '''
echo "Hello World From Linux"
'''

If you run cargo make with the task hello-world on linux, it will redirect to hello-world.linux, while on other platforms it will execute the original hello-world.
On linux the output would be:

[cargo-make] INFO - Task: hello-world
[cargo-make] INFO - Setting Up Env.
[cargo-make] INFO - Running Task: hello-world
[cargo-make] INFO - Execute Command: "sh" "/tmp/cargo-make/kOUJfw8Vfc.sh"
Hello World From Linux
[cargo-make] INFO - Build done in 0 seconds.

While on other platforms

[cargo-make] INFO - Task: hello-world
[cargo-make] INFO - Setting Up Env.
[cargo-make] INFO - Running Task: hello-world
[cargo-make] INFO - Execute Command: "sh" "/tmp/cargo-make/2gYnulOJLP.sh"
Hello World From Unknown
[cargo-make] INFO - Build done in 0 seconds.

In the override task you can define any attribute that will override the attribute of the parent task, while undefined attributes will use the value from the parent task and will not be modified.
In case you need to delete attributes from the parent (for example you have a command defined in the parent task but you want to have a script defined in the override task), then you will have to clear the parent task in the override task using the clear attribute as follows:

[tasks.hello-world.linux]
clear = true
script = '''
echo "Hello World From Linux"
'''

This means, however, that you will have to redefine all attributes in the override task that you want to carry with you from the parent task.
Important - the alias attribute is evaluated before the override task is checked, so if the parent task has an alias defined, it will be redirected to that task instead of to the override.
To have an alias redirect per platform, use the linux_alias, windows_alias, mac_alias attributes.
In addition, aliases cannot be defined in platform override tasks, only in parent tasks.

Extend Attribute

Until now, the override capability enabled overriding a task with the same name from a different makefile or for different platforms.
However, the extend keyword is also available on the task level and enables you to override any task by name.
Let's look at the following example:

[tasks.1]
category = "1"
description = "1"
command = "echo"
args = ["1"]

[tasks.2]
extend = "1"
category = "2"
args = ["2"]

[tasks.3]
extend = "2"
args = ["3"]

When task 3 is loaded, it loads task 2 which loads task 1.
The final task 3 definition would be:

[tasks.3]
extend = "2"
category = "2"
description = "1"
command = "echo"
args = ["3"]

If we run task 3, the output would be:

[cargo-make] INFO - cargo make 0.35.7
[cargo-make] INFO - Build File: task_extend.toml
[cargo-make] INFO - Task: 3
[cargo-make] INFO - Profile: development
[cargo-make] INFO - Running Task: init
[cargo-make] INFO - Running Task: 3
[cargo-make] INFO - Execute Command: "echo" "3"
3
[cargo-make] INFO - Running Task: end
[cargo-make] INFO - Build Done  in 0 seconds.

Environment Variables

cargo-make enables you to define environment variables in several ways.
Environment variables can later be used in commands, scripts, conditions, functions and more, so it is important to have a powerful way to define them for your build.

Global Configuration

You can define env vars to be set as part of the execution of the flow in the global env block for your makefile, for example:

[env]
RUST_BACKTRACE = 1
EVALUATED_VAR = { script = ["echo SOME VALUE"] }
TEST1 = "value1"
TEST2 = "value2"
BOOL_VALUE = true
DEV = false
PROD = false
COMPOSITE = "${TEST1} ${TEST2}"
MULTI_LINE_SCRIPT = { script = ["echo 1\necho 2"], multi_line = true }
LIBRARY_EXTENSION = { source = "${CARGO_MAKE_RUST_TARGET_OS}", default_value = "unknown", mapping = {"linux" = "so", "macos" = "dylib", "windows" = "dll", "openbsd" = "so" } }
TO_UNSET = { unset = true }
PREFER_EXISTING = { value = "new", condition = { env_not_set = ["PREFER_EXISTING"] } }
OVERWRITE_EXISTING = { value = "new", condition = { env_set = ["OVERWRITE_EXISTING"] } }
ENV_FROM_LIST = ["ARG1", "${SIMPLE}", "simple value: ${SIMPLE} script value: ${SCRIPT}"]
PATH_GLOB = { glob = "./src/**/mod.rs", include_files = true, include_dirs = false, ignore_type = "git" }

# profile based environment override
[env.development]
DEV = true

[env.production]
PROD = true

Environment variables can be defined as:

  • Simple key/value pair, where the value can be either string, boolean or a number.
RUST_BACKTRACE = 1
BOOL_VALUE = true
  • Key and an array which will be joined with the ';' separator
LIST_VALUE = [ "VALUE1", "VALUE2", "VALUE3" ]
  • Key and output of a script (only simple native shell scripts are supported, special runners such as duckscript, rust and so on, are not supported)
EVALUATED_VAR = { script = ["echo SOME VALUE"] }
  • Key and a decode map (if default_value not provided, it will default to the source value)
LIBRARY_EXTENSION = { source = "${CARGO_MAKE_RUST_TARGET_OS}", default_value = "unknown", mapping = {"linux" = "so", "macos" = "dylib", "windows" = "dll", "openbsd" = "so" } }
  • Key and a value expression built from strings and other env variables using the ${} syntax
COMPOSITE = "${TEST1} and ${TEST2}"
  • Key and a path glob which will populate the env variable with all relevant paths separated by a ';' character
PATH_GLOB = { glob = "./src/**/mod.rs", include_files = true, include_dirs = false, ignore_type = "git" }
  • Key and a structure holding the value (can be an expression) and optional condition which must be valid in order for the environment variable to be set

All environment variables defined in the env block and in the default Makefile.toml will be set before running the tasks.
To unset an environment variable, use the MY_VAR = { unset = true } syntax.
See more on profile based environment setup in the profile environment section

Task

Environment variables can be defined inside tasks using the env attribute, so when a task is invoked (after its dependencies), the environment variables will be set, for example:

[tasks.test-flow]
env = { "SOME_ENV_VAR" = "value" }
run_task = "actual-task"

[tasks.actual-task]
condition = { env_set = [ "SOME_ENV_VAR" ] }
script = '''
echo var: ${SOME_ENV_VAR}
'''

At the task level, environment variable capabilities are the same as at the global level.
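For instance, a sketch with hypothetical names showing that script based values also work inside a task env block:

[tasks.task-env-example]
env = { "EVALUATED_VAR" = { script = ["echo SOME VALUE"] }, "SIMPLE_VAR" = "simple value" }
command = "echo"
args = ["${EVALUATED_VAR} / ${SIMPLE_VAR}"]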

Command Line

Environment variables can be defined in the command line using the --env/-e argument as follows:

cargo make --env ENV1=VALUE1 --env ENV2=VALUE2 -e ENV3=VALUE3

Env File

It is also possible to provide an env file path as part of the cli args as follows:

cargo make --env-file=./env/production.env

This allows using the same Makefile.toml with different environment variables loaded from different env files.

The env file is a simple key=value file.

In addition, you can define environment variables values based on other environment variables using the ${} syntax.
For example:

#just a comment...
ENV1_TEST=TEST1
ENV2_TEST=TEST2
ENV3_TEST=VALUE OF ENV2 IS: ${ENV2_TEST}

Env files can also be defined globally in the Makefile.toml via env_files attribute as follows:

env_files = [
    "./env1.env",
    "./env2.env"
]

In this example, the env files will be loaded in the order in which they were defined.
To enable profile based filtering, you can use the object form as follows:

env_files = [
    { path = "./profile.env", profile = "development" },
    { path = "./env.env" }
]

In this example, profile.env is only loaded in case the runtime profile is development.
More on profiles in the profiles section.

Relative paths are relative to the toml file that declared them and not to the current working directory.

The same env_files attribute can be defined at the task level; however, relative paths at the task level are relative to the current working directory.
If the task defines a different working directory, it will change only after the env files are loaded.
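For example, a sketch of a task level env file (the file and variable names are hypothetical):

[tasks.load-task-env-file]
env_files = ["./task_vars.env"]
command = "echo"
args = ["VALUE: ${SOME_VAR_FROM_FILE}"]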

Env Setup Scripts

Environment setup scripts are scripts that are invoked after the environment files and the env block.
They are defined globally by the env_scripts attribute.
These scripts can be used to run anything needed before starting up the flow.
In case of duckscript scripts which are invoked by the embedded runtime, it is also possible to modify the cargo-make runtime environment variables directly.
For Example:

env_scripts = [
'''
#!@duckscript
echo first env script...

composite_env_value = get_env COMPOSITE
echo COMPOSITE = ${composite_env_value}

set_env COMPOSITE_2 ${composite_env_value}
''',
'''
#!@duckscript
echo second env script...

composite_env_value = get_env COMPOSITE_2
echo COMPOSITE_2 = ${composite_env_value}
'''
]

[env]
SIMPLE = "SIMPLE VALUE"
SCRIPT = { script = ["echo SCRIPT VALUE"] }
COMPOSITE = "simple value: ${SIMPLE} script value: ${SCRIPT}"

In this example, since the env block is invoked before the env scripts, the duckscripts have access to the COMPOSITE environment variable.
These scripts use that value to create a new environment variable COMPOSITE_2 and in the second script we just print it.

Loading Order

cargo-make will load the environment variables in the following order:

  • Load environment file provided on the command line
  • Setup internal environment variables (see Global section). Not including per task variables.
  • Load global environment files defined in the env_files attribute.
  • Load global environment variables provided on the command line.
  • Load global environment variables defined in the env block and relevant sub env blocks based on profile/additional profiles.
  • Load global environment variables defined in the env.[current profile] block.
  • Load global environment setup scripts defined in the env_scripts attribute.
  • Per Task
    • Load environment files defined in the env_files attribute (relative paths are treated differently than global env_files).
    • Setup per task internal environment variables (see Global section).
    • Load environment variables defined in the env block (same behaviour as global env block).

Global

In addition to manually setting environment variables, cargo-make will also automatically add a few environment variables of its own, which can be helpful when running task scripts, commands, conditions, etc.

  • CARGO_MAKE - Set to "true" to help sub processes identify they are running from cargo make.
  • CARGO_MAKE_TASK - Holds the name of the main task being executed.
  • CARGO_MAKE_TASK_ARGS - A list of arguments provided to cargo-make after the task name, separated with a ';' character.
  • CARGO_MAKE_CURRENT_TASK_NAME - Holds the currently executed task name.
  • CARGO_MAKE_CURRENT_TASK_INITIAL_MAKEFILE - Holds the full path to the makefile which initially defined the currently executed task (not available for internal core tasks).
  • CARGO_MAKE_CURRENT_TASK_INITIAL_MAKEFILE_DIRECTORY - Holds the full path to the directory containing the makefile which initially defined the currently executed task (not available for internal core tasks).
  • CARGO_MAKE_COMMAND - The command used to invoke cargo-make (for example: cargo make and makers)
  • CARGO_MAKE_WORKING_DIRECTORY - The current working directory (can be defined by setting the --cwd cli option)
  • CARGO_MAKE_WORKSPACE_WORKING_DIRECTORY - The original working directory of the workspace. Enables workspace members access to the workspace level CARGO_MAKE_WORKING_DIRECTORY.
  • CARGO_MAKE_PROFILE - The current profile name in lower case (should not be manually modified by global/task env blocks)
  • CARGO_MAKE_ADDITIONAL_PROFILES - The additional profile names in lower case, separated with a ';' character (should not be manually modified by global/task env blocks)
  • CARGO_MAKE_PROJECT_NAME - For standalone crates, this will be the same as CARGO_MAKE_CRATE_NAME and for workspace it will default to the working directory basename.
  • CARGO_MAKE_PROJECT_VERSION - For standalone crates, this will be the same as CARGO_MAKE_CRATE_VERSION and for workspaces it will be the main crate version (the main crate is defined by the optional main_project_member attribute in the config section).
  • CARGO_MAKE_CARGO_HOME - The path to CARGO_HOME as described in the cargo documentation
  • CARGO_MAKE_CARGO_PROFILE - The cargo profile name mapped from the CARGO_MAKE_PROFILE (unmapped value will default to CARGO_MAKE_PROFILE value)
  • CARGO_MAKE_RUST_VERSION - The rust version (for example 1.20.0)
  • CARGO_MAKE_RUST_CHANNEL - Rust channel (stable, beta, nightly)
  • CARGO_MAKE_RUST_TARGET_ARCH - x86, x86_64, arm, etc ... (see rust cfg feature)
  • CARGO_MAKE_RUST_TARGET_ENV - gnu, msvc, etc ... (see rust cfg feature)
  • CARGO_MAKE_RUST_TARGET_OS - windows, macos, ios, linux, android, etc ... (see rust cfg feature)
  • CARGO_MAKE_RUST_TARGET_POINTER_WIDTH - 32, 64
  • CARGO_MAKE_RUST_TARGET_VENDOR - apple, pc, unknown
  • CARGO_MAKE_RUST_TARGET_TRIPLE - x86_64-unknown-linux-gnu, x86_64-apple-darwin, x86_64-pc-windows-msvc, etc ...
  • CARGO_MAKE_CRATE_TARGET_DIRECTORY - Gets target directory where cargo stores the output of a build, respects ${CARGO_TARGET_DIR}, .cargo/config.toml's and ${CARGO_HOME}/config.toml, but not --target-dir command-line flag.
  • CARGO_MAKE_CRATE_CUSTOM_TRIPLE_TARGET_DIRECTORY - Like CARGO_MAKE_CRATE_TARGET_DIRECTORY but respects build.target in .cargo/config.toml.
  • CARGO_MAKE_CRATE_HAS_DEPENDENCIES - Holds true/false based on whether there are dependencies defined in the Cargo.toml or not (defined as false if no Cargo.toml is found)
  • CARGO_MAKE_CRATE_IS_WORKSPACE - Holds true/false based on whether this is a workspace crate or not (defined even if no Cargo.toml is found)
  • CARGO_MAKE_CRATE_WORKSPACE_MEMBERS - Holds the list of member paths (defined as an empty value if no Cargo.toml is found)
  • CARGO_MAKE_CRATE_CURRENT_WORKSPACE_MEMBER - Holds the name of the current workspace member being built (only if the flow started as a workspace level flow)
  • CARGO_MAKE_CRATE_LOCK_FILE_EXISTS - Holds true/false based on whether a Cargo.lock file exists in the current working directory (in workspace projects, each member has a different working directory).
  • CARGO_MAKE_CRATE_TARGET_TRIPLE - Gets the target triple that will be used for the build by default, respects .cargo/config.toml and ${CARGO_HOME}/config.toml.
  • CARGO_MAKE_CI - Holds true/false based on whether the task is running in a continuous integration system (such as Travis CI).
  • CARGO_MAKE_PR - Holds true/false based on whether the task is running in a continuous integration system (such as Travis CI) as part of a pull request build (unknown is set as false).
  • CARGO_MAKE_CI_BRANCH_NAME - Holds the continuous integration branch name (if available).
  • CARGO_MAKE_CI_VENDOR - Holds the continuous integration vendor name (if available).
  • CARGO_MAKE_DUCKSCRIPT_VERSION - The embedded duckscript runtime version.
  • CARGO_MAKE_DUCKSCRIPT_SDK_VERSION - The embedded duckscript SDK version.

The following environment variables will be set by cargo-make if Cargo.toml file exists and the relevant value is defined:

  • CARGO_MAKE_CRATE_NAME - Holds the crate name from the Cargo.toml file found in the cwd.
  • CARGO_MAKE_CRATE_FS_NAME - Same as CARGO_MAKE_CRATE_NAME however some characters are replaced (for example '-' to '_').
  • CARGO_MAKE_CRATE_VERSION - Holds the crate version from the Cargo.toml file found in the cwd.
  • CARGO_MAKE_CRATE_DESCRIPTION - Holds the crate description from the Cargo.toml file found in the cwd.
  • CARGO_MAKE_CRATE_LICENSE - Holds the crate license from the Cargo.toml file found in the cwd.
  • CARGO_MAKE_CRATE_DOCUMENTATION - Holds the crate documentation link from the Cargo.toml file found in the cwd.
  • CARGO_MAKE_CRATE_HOMEPAGE - Holds the crate homepage link from the Cargo.toml file found in the cwd.
  • CARGO_MAKE_CRATE_REPOSITORY - Holds the crate repository link from the Cargo.toml file found in the cwd.

The following environment variables will be set by cargo-make if the project is part of a git repo:

  • CARGO_MAKE_GIT_BRANCH - The current branch name.
  • CARGO_MAKE_GIT_USER_NAME - The user name pulled from the git config user.name key.
  • CARGO_MAKE_GIT_USER_EMAIL - The user email pulled from the git config user.email key.
  • CARGO_MAKE_GIT_HEAD_LAST_COMMIT_HASH - The last HEAD commit hash.
  • CARGO_MAKE_GIT_HEAD_LAST_COMMIT_HASH_PREFIX - The last HEAD commit hash prefix.

Ignoring Errors

In some cases you want to run optional tasks as part of a bigger flow, but do not want to break your entire build in case of any error in those optional tasks.
For those tasks, you can add the ignore_errors = true attribute, for example:

[tasks.unstable_task]
ignore_errors = true
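For example, in the following sketch (task names are hypothetical), the flow continues to the report task even though the flaky task fails:

[tasks.flaky]
ignore_errors = true
script = "exit 1"

[tasks.report]
command = "echo"
args = ["flow continues even after a failure"]

[tasks.optional-flow]
dependencies = ["flaky", "report"]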

Conditions

Conditions allow you to evaluate at runtime whether to run a specific task or not.
These conditions are evaluated before the task runs its installation and/or commands, and if the condition is not fulfilled, the task will not be invoked.
The task dependencies, however, are not affected by the parent task condition outcome.
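As an illustration (the task names and environment variable are hypothetical), the dependency below is invoked even when the parent task condition is not fulfilled:

[tasks.always-runs]
command = "echo"
args = ["dependency invoked"]

[tasks.conditional-parent]
condition = { env_set = ["SOME_UNDEFINED_VAR"] }
dependencies = ["always-runs"]
command = "echo"
args = ["only printed when the condition is met"]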

There are two types of conditions:

  • Criteria
  • Scripts

The task runner will evaluate any condition defined, and a task definition may contain both types at the same time.

Criteria

The condition attribute may define multiple parameters to validate.
All defined parameters must be valid for the condition as a whole to be true and enable the task to run.

Below is an example of a condition definition that checks that we are running on windows or linux (but not mac) and that we are running on beta or nightly (but not stable):

[tasks.test-condition]
condition = { platforms = ["windows", "linux"], channels = ["beta", "nightly"] }
script = '''
echo "condition was met"
'''

The following condition types are available:

  • profiles - See profiles for more info
  • platforms - List of platform names (windows, linux, mac)
  • channels - List of rust channels (stable, beta, nightly)
  • env_set - List of environment variables that must be defined
  • env_not_set - List of environment variables that must not be defined
  • env_true - List of environment variables that must be defined and must not be set to any of the following (case insensitive): false, no, 0 or empty
  • env_false - List of environment variables that must be defined and set to any of the following (case insensitive): false, no, 0 or empty
  • env - Map of environment variables that must be defined and equal to the provided values
  • env_contains - Map of environment variables that must be defined and contain (case insensitive) the provided values
  • rust_version - Optional definition of min, max and/or specific rust version
  • files_exist - List of absolute path files to check they exist. Environment substitution is supported so you can define relative paths such as ${CARGO_MAKE_WORKING_DIRECTORY}/Cargo.toml
  • files_not_exist - List of absolute path files to check they do not exist. Environment substitution is supported so you can define relative paths such as ${CARGO_MAKE_WORKING_DIRECTORY}/Cargo.toml

A few examples:

[tasks.test-condition]
condition = { profiles = ["development", "production"], platforms = ["windows", "linux"], channels = ["beta", "nightly"], env_set = [ "CARGO_MAKE_KCOV_VERSION" ], env_not_set = [ "CARGO_MAKE_SKIP_CODECOV" ], env = { "CARGO_MAKE_CI" = true, "CARGO_MAKE_RUN_CODECOV" = true }, rust_version = { min = "1.20.0", max = "1.30.0" }, files_exist = ["${CARGO_MAKE_WORKING_DIRECTORY}/Cargo.toml"], files_not_exist = ["${CARGO_MAKE_WORKING_DIRECTORY}/Cargo2.toml"] }

To setup a custom failure message, use the fail_message inside the condition object, for example:

[tasks.test-condition-with-message]
condition = { platforms = ["windows"], fail_message = "Condition Failed." }
command = "echo"
args = ["condition was met"]

Fail messages are only printed if the log level is verbose or if the reduce output flag is set to false in the config, as follows:

[config]
reduce_output = false

Scripts

These scripts are invoked before the task runs its installation and/or commands, and if the exit code of the condition script is non zero, the task will not be invoked.

Below is an example of a condition script that always returns a non zero value, in which case the command is never executed:

[tasks.never]
condition_script = [
    "exit 1"
]
command = "cargo"
args = ["build"]

Condition scripts can be used to ensure that the task is only invoked if a specific condition is met, for example if a specific third party tool is installed.
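For example, a minimal sketch (assuming a unix-like shell where the which command is available) that only runs the task if git is installed:

[tasks.requires-git]
condition_script = [
    "which git"
]
command = "echo"
args = ["git is available"]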

To setup a custom failure message, use the fail_message inside the condition object, for example:

[tasks.test-condition-script-with-message]
condition = { fail_message = "Condition Script Failed." }
condition_script = [
    "exit 1"
]
command = "echo"
args = ["condition was met"]

Combining Conditions and Sub Tasks

Conditions and run_task combined can enable you to define a conditional sub flow.
For example, if you have a coverage flow that should only be invoked on linux in a CI build, and only if the CARGO_MAKE_RUN_CODECOV environment variable is defined as "true":

[tasks.ci-coverage-flow]
description = "Runs the coverage flow and uploads the results to codecov."
condition = { platforms = ["linux"], env = { "CARGO_MAKE_CI" = true, "CARGO_MAKE_RUN_CODECOV" = true } }
run_task = "codecov-flow"

[tasks.codecov-flow]
description = "Runs the full coverage flow and uploads the results to codecov."
windows_alias = "empty"
dependencies = [
    "coverage-flow",
    "codecov"
]

The first task ci-coverage-flow defines the condition that checks we are on linux, running as part of a CI build and the CARGO_MAKE_RUN_CODECOV environment variable is set to "true".
Only if all conditions are met, it will run the codecov-flow task.
We can't define the condition directly on the codecov-flow task, as it will invoke the task dependencies before checking the condition.

Installing Dependencies

Some tasks will require third party crates, rustup components or other native tools.
cargo-make provides multiple ways to setup those dependencies before running the task.

Cargo Plugins

When a task invokes a cargo plugin using the command attribute, for example:

[tasks.audit]
command = "cargo"
args = ["audit"]

cargo-make will first check that the command is available.
Only if the command is not available, it will attempt to install it by running cargo install cargo-<plugin name>.
In case the cargo plugin has a different name, you can specify it manually via the install_crate attribute.
You can specify additional installation arguments using the install_crate_args attribute (for example: version).
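As a sketch (the explicit crate name here is an assumption for illustration), a task can name the backing crate and pass extra installation arguments:

[tasks.audit-with-install-info]
command = "cargo"
args = ["audit"]
install_crate = "cargo-audit"
install_crate_args = ["--locked"]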

To disable the automatic crate installation, set the install_crate attribute to false, for example:

[tasks.test]
command = "cargo"
args = ["test"]
install_crate = false

Crates

cargo-make can verify that third party crates are installed, if the relevant installation info is provided.
First it will check that the crate is installed, and only if it is not available, it will attempt to install it.
Installation of third party crates is first done via rustup, if the component name is provided.
If rustup fails or a component name is not provided, it will fall back to the cargo install command.
For example:

[tasks.rustfmt]
install_crate = { crate_name = "rustfmt-nightly", rustup_component_name = "rustfmt-preview", binary = "rustfmt", test_arg = "--help" }
command = "rustfmt"

In this example, cargo-make will first test that the command rustfmt --help works. Only if it fails, it will first attempt to install the rustfmt-preview component via rustup, and if that fails as well, it will try to run cargo install for the crate name rustfmt-nightly.

If passing multiple arguments is necessary, test_arg may contain an array of arguments. For example:

[tasks.doc-upload]
install_crate = { crate_name = "cargo-travis", binary = "cargo", test_arg = ["doc-upload", "--help"] }
command = "cargo"
args = ["doc-upload"]

In this example, cargo-make will test the presence of cargo-travis by running the command cargo doc-upload --help, and install the crate only if this command fails.

Rustup Components

Rustup components that are not deployed as crates, or components which are pure sources (no executable binary), can also be installed via cargo-make.
The following example shows how to install a rustup component with binaries:

[tasks.install-rls]
install_crate = { rustup_component_name = "rls-preview", binary = "rls", test_arg = "--help" }

In this example, cargo-make will first check if the rls binary is available, and only if it fails to execute it, it will install the rls component using rustup.

Some rustup components are pure sources; in those cases cargo-make cannot verify that they are already installed and will attempt to install them every time.
Example:

[tasks.install-rust-src]
install_crate = { rustup_component_name = "rust-src" }

Native Dependencies

Native dependencies can also be installed, however it is up to the Makefile author to write the script which checks that the dependency exists and, if not, installs it correctly.
This is done by setting up an installation script in the install_script attribute of the task.
It is possible to use platform overrides to specify different installation scripts for linux/mac/windows platforms.
For example:

[tasks.coverage-kcov]
windows_alias = "empty"
install_script = '''
KCOV_INSTALLATION_DIRECTORY=""
KCOV_BINARY_DIRECTORY=""
if [ -n "CARGO_MAKE_KCOV_INSTALLATION_DIRECTORY" ]; then
    mkdir -p ${CARGO_MAKE_KCOV_INSTALLATION_DIRECTORY}
    cd ${CARGO_MAKE_KCOV_INSTALLATION_DIRECTORY}
    KCOV_INSTALLATION_DIRECTORY="$(pwd)/"
    cd -
    echo "Kcov Installation Directory: ${KCOV_INSTALLATION_DIRECTORY}"
    KCOV_BINARY_DIRECTORY="${KCOV_INSTALLATION_DIRECTORY}/build/src/"
    echo "Kcov Binary Directory: ${KCOV_BINARY_DIRECTORY}"
fi

# get help info to fetch all supported command line arguments
KCOV_HELP_INFO=`${KCOV_BINARY_DIRECTORY}kcov --help` || true

# check needed arguments are supported, else install
if [[ $KCOV_HELP_INFO != *"--include-pattern"* ]] || [[ $KCOV_HELP_INFO != *"--exclude-line"* ]] || [[ $KCOV_HELP_INFO != *"--exclude-region"* ]]; then
    # check we are on a supported platform
    if [ "$(grep -Ei 'debian|buntu|mint' /etc/*release)" ]; then
        echo "Installing/Upgrading kcov..."
        sudo apt-get update || true
        sudo apt-get install -y libcurl4-openssl-dev libelf-dev libdw-dev cmake gcc binutils-dev

        mkdir -p ${CARGO_MAKE_KCOV_DOWNLOAD_DIRECTORY}
        cd ${CARGO_MAKE_KCOV_DOWNLOAD_DIRECTORY}
        KCOV_DOWNLOAD_DIRECTORY=$(pwd)

        wget https://github.com/SimonKagstrom/kcov/archive/v${CARGO_MAKE_KCOV_VERSION}.zip
        unzip v${CARGO_MAKE_KCOV_VERSION}.zip
        cd kcov-${CARGO_MAKE_KCOV_VERSION}
        mkdir -p build
        cd ./build
        cmake ..
        make

        # if custom installation directory, leave kcov as local
        if [ -n "CARGO_MAKE_KCOV_INSTALLATION_DIRECTORY" ]; then
            cd ${KCOV_DOWNLOAD_DIRECTORY}/kcov-${CARGO_MAKE_KCOV_VERSION}
            mv ./* ${KCOV_INSTALLATION_DIRECTORY}
        else
            sudo make install
            cd ../..
            rm -rf kcov-${CARGO_MAKE_KCOV_VERSION}
        fi
    fi
fi
'''

This task checks if kcov is installed and, if not, installs it along with any other dependencies it requires.

Defining Version

It is possible to define a minimal version for dependent crates, for example:

[tasks.simple-example]
install_crate = { min_version = "0.0.1" }
command = "cargo"
args = ["make", "--version"]

[tasks.complex-example]
install_crate = { crate_name = "cargo-make", binary = "cargo", test_arg = ["make", "--version"], min_version = "0.0.1" }
command = "cargo"
args = ["make", "--version"]

This ensures we are using a crate version that supports the feature we require for the build.
Currently there are a few limitations when defining min_version:

  • Specifying toolchain in the task or rustup_component_name in the install_crate structure will make cargo-make ignore the min_version value.
  • In case cargo-make is unable to detect the currently installed version due to any error, it will assume the version is valid and print a warning.

If you want to ensure a specific version is used, you can define the version attribute instead, for example:

[tasks.complex-example]
install_crate = { crate_name = "cargo-make", binary = "cargo", test_arg = ["make", "--version"], version = "0.0.1" }
command = "cargo"
args = ["make", "--version"]

Global Lock Of Versions

In case min_version is defined, you can have the --locked flag automatically added to the crate installation command by defining the CARGO_MAKE_CRATE_INSTALLATION_LOCKED=true environment variable. If version is defined instead of min_version, this behaviour is enabled automatically.
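A minimal sketch, assuming the variable is set in the makefile env section (the task name and version values are illustrative):

[env]
# adds --locked to crate installation commands when min_version is used
CARGO_MAKE_CRATE_INSTALLATION_LOCKED = true

[tasks.locked-install-example]
install_crate = { crate_name = "cargo-make", binary = "cargo", test_arg = ["make", "--version"], min_version = "0.35.0" }
command = "cargo"
args = ["make", "--version"]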

Alternate Cargo Install Commands

You can specify a different cargo install command so that the crate installation uses a custom cargo installer plugin. For example, if you want to use a plugin such as local-install instead of install, simply add the install_command attribute with the relevant value.
For example:

[tasks.alt-command-example1]
install_crate = { install_command = "custom-install" }
command = "cargo"
args = ["somecrate"]

[tasks.alt-command-example2]
install_crate = { crate_name = "somecrate", install_command = "custom-install" }

Installation Priorities

Only one type of installation will be invoked per task.
The following lists the installation types, sorted by the priority cargo-make uses to decide which installation flow to invoke:

  • install_crate - Enables installation of crates and rustup components.
  • install_script - Custom script which can be used to install or run anything that is needed by the task command.
  • automatic cargo plugin - In case the command is cargo, cargo-make will check which cargo plugin to automatically install (if needed).

In case multiple installation types are defined (for example both install_crate and install_script), only one installation type will be invoked, based on the above priority list.
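For example, in the following sketch both install_crate and install_script are defined; based on the priority list above, only install_crate would be invoked and the script would be skipped (the task content is illustrative):

[tasks.priority-example]
# install_crate has a higher priority, so install_script is not invoked
install_crate = { rustup_component_name = "rust-src" }
install_script = "echo this script is not invoked"
command = "cargo"
args = ["build"]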

Multiple Installations

In some cases, tasks require multiple items installed in order to run properly.
For example, you might need the rustup components rls and rust-src and the cargo plugin cargo-xbuild in the same task.
In order to achieve this, you can split the task into an invocation task and installation tasks, and set the installation tasks as dependencies.
The following example defines a flow of two similar tasks that have the same dependencies: the cargo-xbuild crate, the rls rustup binary component and the rust-src rustup sources-only component.
You can define both rustup dependencies as installation-only tasks which are set as dependencies for the xbuild tasks.
Since dependencies are only invoked once, this also ensures that those rustup components are not installed twice.

[tasks.install-rls]
# install rls-preview only if needed
install_crate = { rustup_component_name = "rls-preview", binary = "rls", test_arg = "--help" }

[tasks.install-rust-src]
# always install rust-src via rustup component add
install_crate = { rustup_component_name = "rust-src" }

[tasks.xbuild1]
# run cargo xbuild, if xbuild is not installed, it will be automatically installed for you
command = "cargo"
args = [ "xbuild", "some arg" ]
dependencies = [ "install-rls", "install-rust-src" ]

[tasks.xbuild2]
# run cargo xbuild, if xbuild is not installed, it will be automatically installed for you
command = "cargo"
args = [ "xbuild", "another arg" ]
dependencies = [ "install-rls", "install-rust-src" ]

[tasks.myflow]
dependencies = [ "xbuild1", "xbuild2" ]

Workspace Support

In case cargo-make detects that the current working directory is a workspace root (a directory with a Cargo.toml which defines a workspace and its members), it will not invoke the requested tasks in that directory.
Instead, it will generate a task definition at runtime which will go to each member directory and invoke the requested task on that member.
For example if we have the following directory structure:

workspace
├── Cargo.toml
├── member1
│   └── Cargo.toml
└── member2
    └── Cargo.toml

If we run cargo make mytask, it will go to each workspace member directory and execute cargo make mytask in that directory, where mytask is the original task that was requested on the workspace level.
The order of the members is defined by the members attribute in the workspace Cargo.toml.
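For reference, the workspace Cargo.toml for the structure above would declare the members roughly as follows:

[workspace]
members = [
    "member1",
    "member2"
]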

This flow is called a workspace flow, as it identifies the workspace and handles the request for each workspace member, while the root directory which defines the workspace structure is ignored.

We can use this capability to run the same functionality on all workspace member crates, for example if we want to format all crates, we can run cargo make format in the workspace directory.

Member crate makefiles can also automatically extend the workspace directory makefile.
See more info at the relevant section.
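For reference, this is typically enabled by setting the following in the workspace level Makefile.toml (a minimal sketch; the full details are described in that section):

[env]
# signals member makefiles to extend the workspace level makefile
CARGO_MAKE_EXTEND_WORKSPACE_MAKEFILE = true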

Disabling Workspace Support

In case you wish to run the tasks on the workspace root directory and not on the members (for example generating a workspace level README file), use the --no-workspace cli flag when running cargo make.
For example:

cargo make --no-workspace mytask

This makes cargo-make ignore that this directory is a workspace root and just run a simple flow, as if this were a simple directory with a makefile.

Another way to call a task on the workspace level and not for each member, is to define that task in the workspace Makefile.toml with workspace set to false as follows:

[tasks.ignore-members]
workspace = false

Setting workspace=false for the task requested on the cargo-make command line is equivalent to calling it with the --no-workspace flag.
This flag is only checked for the task on the cargo-make command line and is completely ignored for all other tasks which are executed as part of the flow.
By default the workspace flag for all tasks is set to true, but that can be configured differently in the config section as follows:

[config]
default_to_workspace = false

In which case, workspace level support is always disabled unless a task defines workspace=true.
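For example, a sketch of a task that opts back into the workspace flow when default_to_workspace is false (the task content is illustrative):

[tasks.members-build]
workspace = true
command = "cargo"
args = ["build"]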

Composite Flow

You can define a composite flow that runs tasks on both the workspace root directory and member directories.
This is an example of a workspace level Makefile.toml which enables to run such a flow:

[tasks.composite]
dependencies = ["member_flow", "workspace_flow"]

[tasks.member_flow]
# by forking, a new cargo-make process starts which by default detects the workspace and runs member_task for each member
run_task = { name = "member_task", fork = true }

[tasks.workspace_flow]
#run some workspace level command or flow

You can start this composite flow as follows:

cargo make --no-workspace composite

Profiles

You can prevent profiles from being passed down to workspace members by setting CARGO_MAKE_USE_WORKSPACE_PROFILE to false:

[env]
CARGO_MAKE_USE_WORKSPACE_PROFILE = false

See more on profiles in the profile section.

Skipping/Including Specific Members

In most cases you will want to run a specific flow on all members, but in rare cases you will want to skip specific members.

By setting the CARGO_MAKE_WORKSPACE_SKIP_MEMBERS environment variable to hold the member names to skip (as an array), you can define if you want those members not to participate in the flow.

In the below example we will skip member3 and member4 (should be defined in the workspace level Makefile.toml):

[env]
CARGO_MAKE_WORKSPACE_SKIP_MEMBERS = ["member3", "member4"]

You can also define glob paths, for example:

[env]
CARGO_MAKE_WORKSPACE_SKIP_MEMBERS = "tools/*"

However, in some cases you will want to skip specific members only if a specific condition is met.
For example, you may want to build a member module only if we are running on a rust nightly compiler.
This is a simple example of a conditional skip for member3 and member4 (should be defined in the workspace level Makefile.toml):

[tasks.workspace-task]
condition = { channels = ["beta", "stable"] }
env = { "CARGO_MAKE_WORKSPACE_SKIP_MEMBERS" = ["member3", "member4"] }
run_task = { name = "member-task", fork = true }

You will have to invoke this as a composite flow:

cargo make workspace-task --no-workspace

In addition you can also state the opposite, meaning which members to include via CARGO_MAKE_WORKSPACE_INCLUDE_MEMBERS environment variable.
It follows the same rules as the CARGO_MAKE_WORKSPACE_SKIP_MEMBERS environment variable.
If you define both, the included members will be a subset of the non excluded members, meaning both filters will apply.
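A minimal sketch of the include variant (member names are illustrative):

[env]
CARGO_MAKE_WORKSPACE_INCLUDE_MEMBERS = ["member1", "member2"]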

Workspace Emulation

Workspace emulation enables you to create a workspace-like structure for your project without actually defining a rust workspace.
This means you can have a project directory without a Cargo.toml and have many child crates.
This enables running cargo make on all member crates from the root project folder, without needing an actual cargo workspace, which has some side effects (such as a shared target folder and dependencies).

In order to setup the workspace emulation, you will need to define the following in your workspace level Makefile.toml:

[env]
# this tells cargo-make that this directory acts as a workspace root
CARGO_MAKE_WORKSPACE_EMULATION = true

# a list of crate members. since we do not have a Cargo.toml, we will need to specify this in here.
CARGO_MAKE_CRATE_WORKSPACE_MEMBERS = [
    "member1",
    "member2"
]

Toolchain

cargo-make supports setting the toolchain to be used when invoking commands and installing rust dependencies by setting the toolchain attribute as part of the task definition.
The following example shows how to print both stable and nightly rustc versions currently installed:

[tasks.rustc-version-stable]
toolchain = "stable"
command = "rustc"
args = [ "--version" ]

[tasks.rustc-version-nightly]
toolchain = "nightly"
command = "rustc"
args = [ "--version" ]

[tasks.rustc-version-flow]
dependencies = [
    "rustc-version-stable",
    "rustc-version-nightly"
]

An example output of the above rustc-version-flow is:

[cargo-make] INFO - Task: rustc-version-flow
[cargo-make] INFO - Setting Up Env.
[cargo-make] INFO - Running Task: init
[cargo-make] INFO - Running Task: rustc-version-stable
[cargo-make] INFO - Execute Command: "rustup" "run" "stable" "rustc" "--version"
rustc 1.30.1 (1433507eb 2018-11-07)
[cargo-make] INFO - Running Task: rustc-version-nightly
[cargo-make] INFO - Execute Command: "rustup" "run" "nightly" "rustc" "--version"
rustc 1.32.0-nightly (451987d86 2018-11-01)
[cargo-make] INFO - Running Task: rustc-version-flow
[cargo-make] INFO - Running Task: end
[cargo-make] INFO - Build Done  in 2 seconds.

It's also possible to assert a minimum required version of rustc with a channel. This can help to document required compiler features and to remind developers to upgrade their installation.

[tasks.requires-stable-edition-2021]
toolchain = { channel = "stable", min_version = "1.56" }
command = "rustc"
args = ["--version"]

The task will fail when the toolchain is either not installed or the existing version is smaller than the specified min_version.

Init and End tasks

Every task or flow that is executed by cargo-make has 2 additional tasks:
an init task that is invoked at the start of all flows, and an end task that is invoked at the end of all flows.
The names of the init and end tasks are defined in the config section in the toml file; the below shows the default settings:

[config]
init_task = "init"
end_task = "end"

[tasks.init]

[tasks.end]

By default the init and end tasks are empty and can be modified by external toml files or you can simply change the names of the init and end tasks in the external toml files to point to different tasks.
These tasks allow common actions to be invoked no matter what flow you are running.
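For example, a sketch that points the init task to a custom task (the task name and command are illustrative):

[config]
init_task = "my-init"

[tasks.my-init]
command = "echo"
args = ["starting flow"]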

It is important to mention that the init and end tasks are invoked differently than other tasks:

  • Aliases and dependencies are ignored
  • If the same task is defined in the executed flow, those tasks will be invoked multiple times

Therefore it is not recommended to also use the init/end tasks inside your flows.

Catching Errors

By default, any error in a task that does not have ignore_errors = true set will cause the entire flow to fail.
However, there are scenarios in which you would like to run some sort of cleanups before the failed flow finishes.
cargo make enables you to define an on error task which will only be invoked in case the flow failed.
In order to define this special task you must add the on_error_task attribute in the config section in your Makefile and point it to your task, for example:

[config]
on_error_task = "catch"

[tasks.catch]
script = '''
echo "Doing cleanups in catch"
'''

Cargo Alias Tasks

Cargo alias commands can be automatically loaded as cargo-make tasks.
To automatically load them, the following must be defined in the Makefile.toml config section:

[config]
load_cargo_aliases = true

Each alias defined in the config.toml will be loaded as a task with the same name as the alias.
In case a task with that name already exists, it will be ignored.
The task definition will simply call cargo and the alias value, therefore no automatic cargo plugin installation will be invoked.
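For illustration, assuming the cargo config.toml defines an alias such as the following (the alias name and value are hypothetical):

[alias]
lint = "clippy --all-targets"

With load_cargo_aliases = true, cargo-make would expose a task named lint which simply invokes cargo with that alias, and no automatic cargo plugin installation would be attempted for it.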

Profiles

Profiles are a useful tool used to define custom behaviour.
In order to set the execution profile, use the --profile or -p cli argument and provide the profile name.
Profile names are automatically converted to underscores and are trimmed.
If no profile name is provided, the profile will be defaulted to development.

Example Setting Profile:

cargo make --profile production mytask

Profiles provide multiple capabilities:

  • A new environment variable CARGO_MAKE_PROFILE which holds the profile name and can be used by conditions, scripts and commands.
  • Profile based environment variable blocks which are only loaded if the current profile matches (see the environment variables section below).
  • Task conditions based on the profile name (see the conditions section below), for example:

condition = { profiles = ["development", "production"] }

Additional profiles can be set in the config section but have limited support.

[config]
additional_profiles = ["second_profile", "another_profile"]

Additional profiles can be used to define additional environment blocks, and they will be listed in a new environment variable CARGO_MAKE_ADDITIONAL_PROFILES.

Environment Variables

Profiles enable you to define a new subset of environment variables that will only be set at runtime if the current profile matches the env profile.

[env]
RUST_BACKTRACE = "1"
EVALUATED_VAR = { script = ["echo SOME VALUE"] }
TEST1 = "value1"
TEST2 = "value2"
COMPOSITE = "${TEST1} ${TEST2}"

# profile based environment override
[env.development]
DEV = true

[env.production]
PROD = true

Example:

We have the following makefile with 2 profile-based env maps:

[env]
COMMON = "COMMON"
PROFILE_NAME = "${CARGO_MAKE_PROFILE}"

[env.development]
IS_DEV = true
IS_PROD = false

[env.production]
IS_DEV = false
IS_PROD = true

[tasks.echo]
script = [
'''
echo COMMON: ${COMMON}
echo PROFILE_NAME: ${PROFILE_NAME}
echo IS_DEV: ${IS_DEV}
echo IS_PROD: ${IS_PROD}
'''
]

We run the echo task with production profile as follows:

cargo make --cwd ./examples --makefile profile.toml --profile production echo

Output:

[cargo-make] INFO - cargo make 0.35.7
[cargo-make] INFO - Build File: profile.toml
[cargo-make] INFO - Task: echo
[cargo-make] INFO - Profile: production
[cargo-make] INFO - Setting Up Env.
[cargo-make] INFO - Running Task: init
[cargo-make] INFO - Running Task: echo
+ cd /media/devhdd/projects/rust/cargo-make/examples
+ echo COMMON: COMMON
COMMON: COMMON
+ echo PROFILE_NAME: production
PROFILE_NAME: production
+ echo IS_DEV: FALSE
IS_DEV: FALSE
+ echo IS_PROD: TRUE
IS_PROD: TRUE
[cargo-make] INFO - Running Task: end
[cargo-make] INFO - Build Done  in 0 seconds.

Env files can also be filtered based on profile, using the profile attribute as follows:

env_files = [
    { path = "./development.env", profile = "development" },
    { path = "./production.env", profile = "production" },
    { path = "./env.env" }
]

Additional profiles defined in the config section will also result in additional env blocks/files to be loaded, for example:

env_files = [
    { path = "./second.env", profile = "second_profile" },
    { path = "./another.env", profile = "another_profile" }
]

[config]
additional_profiles = ["second_profile", "another_profile"]

[env.second_profile]
IS_SECOND_AVAILABLE = true

[env.another_profile]
IS_OTHER_AVAILABLE = true

This could be quite handy for having environment variable blocks which enable/disable specific tasks.
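For example, a sketch of a task that only runs when the second_profile env block above was loaded, using the env_true condition attribute (see the conditions documentation; the task itself is illustrative):

[tasks.second-profile-task]
condition = { env_true = ["IS_SECOND_AVAILABLE"] }
command = "echo"
args = ["second profile is enabled"]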

Conditions

Conditions enable you to trigger/skip tasks.
Conditions have built in support for profiles, so you can trigger/skip tasks based on the profile name.

Example:

[tasks.echo-development]
condition = { profiles = [ "development" ] }
command = "echo"
args = [ "running in development profile" ]

[tasks.echo-production]
condition = { profiles = [ "production" ] }
command = "echo"
args = [ "running in production profile" ]

Built In Profiles

cargo-make comes with a few built in profiles to quickly enable additional conditional tasks.

  • ci-coverage-tasks - Will enable all code coverage tasks and setup rust compilation to remove dead code.
  • none-thread-safe-tests - Sets up the rust test runner to use a single thread
  • multi-phase-tests - Enables splitting the tests into multiple phases (thread safe, multi threaded, custom)
  • ci-static-code-analysis-tasks - Will enable all static code analysis tasks such as format checking and clippy as part of the CI flow (see special note about backward compatibility below).
  • ci-all-build-tasks - Will enable all extra compilation tasks (i.e. bench and example code) as part of the CI flow (see special note about backward compatibility below).
  • all-default-tasks - Will enable extra tasks invoked while running the default task (such as toml formatting).

Some of these profiles may change in the future to enable more tasks which may break your build and by definition will never be backward compatible.
Use them with care.

Private Tasks

Private tasks are tasks that should only be invoked by other tasks and not directly from the cli.

In order to define a task as private, add the private attribute with value true as follows:

[tasks.internal-task]
private = true

Deprecated Tasks

It is possible to mark tasks as deprecated in order to warn users that they should no longer use the task and should switch to a newer/different task instead.
Once invoked, a warning message will be displayed with the deprecation information.
You can define a task as deprecated by setting the deprecated attribute to true or by providing a relevant message.
For example:

[tasks.legacy]
deprecated = "Please use task OTHER instead"

[tasks.legacy-extended]
extend = "legacy"
deprecated = false

[tasks.legacy2]
deprecated = true

When invoking legacy task for example, the output is:

[cargo-make] INFO - cargo make 0.35.7
[cargo-make] INFO - Build File: deprecated.toml
[cargo-make] INFO - Task: legacy
[cargo-make] INFO - Profile: development
[cargo-make] INFO - Running Task: empty
[cargo-make] INFO - Running Task: legacy
[cargo-make] WARN - Task: legacy is deprecated - Please use task OTHER instead
[cargo-make] INFO - Running Task: empty
[cargo-make] INFO - Build Done in 0 seconds.

When listing tasks, deprecated tasks will contain this information as well:

No Category
----------
default - Empty Task
empty - Empty Task
legacy - No Description. (deprecated - Please use task OTHER instead)
legacy-extended - No Description.
legacy2 - No Description. (deprecated)

Watch

Watching for changes in your project and firing a task via cargo-make is very easy.
Simply add the watch attribute to the task and set it to true; once the task is triggered, it will run again every time a file changes in the project.
The process needs to be killed in order to stop the watch.

Example:

[tasks.watch-example]
command = "echo"
args = [ "Triggered by watch" ]
watch = true

Below is a sample output of invoking the task:

[cargo-make] INFO - cargo make 0.35.7
[cargo-make] INFO - Build File: ./examples/watch.toml
[cargo-make] INFO - Task: watch-example
[cargo-make] INFO - Setting Up Env.
[cargo-make] INFO - Running Task: init
[cargo-make] INFO - Running Task: watch-example
[cargo-make] INFO - Running Task: watch-example-watch
[cargo-make] INFO - Execute Command: "cargo" "watch" "-q" "-x" "make --disable-check-for-updates --no-on-error --loglevel=info --makefile=/projects/rust/cargo-make/examples/watch.toml watch-example"
[cargo-make] INFO - cargo make 0.35.7
[cargo-make] INFO - Build File: /projects/rust/cargo-make/examples/watch.toml
[cargo-make] INFO - Task: watch-example
[cargo-make] INFO - Setting Up Env.
[cargo-make] INFO - Running Task: init
[cargo-make] INFO - Running Task: watch-example
[cargo-make] INFO - Execute Command: "echo" "Triggered by watch"
Triggered by watch
[cargo-make] INFO - Running Task: end
[cargo-make] INFO - Build Done  in 0 seconds.
^C

You can also fine tune the watch setup (which is based on cargo-watch) by providing an object to the watch attribute as follows:

[tasks.watch-args-example]
command = "echo"
args = [ "Triggered by watch" ]
watch = { postpone = true, no_git_ignore = true, ignore_pattern = "examples/files/*", watch = ["./docs/"] }

Functions

cargo-make comes with built in functions which help extend capabilities missing from environment variables.
Functions are not supported everywhere in the makefile and are currently only supported in the command arguments array structure.
A function call is defined using the following format: @@FUNCTION_NAME(ARG1,ARG2,ARG3,...)
For example:

[tasks.split-example]
command = "echo"
args = ["@@split(ENV_VAR,|)"]

Currently Supported Functions:

Split

The split function accepts two arguments:

  • environment variable name
  • split by character

And returns an array of substrings.
This enables splitting an environment variable into multiple command arguments, for example:

[env]
MULTIPLE_VALUES="1 2 3 4"

[tasks.split]
command = "echo"
args = ["@@split(MULTIPLE_VALUES, )"]

[tasks.no-split]
command = "echo"
args = ["${MULTIPLE_VALUES}"]
> cargo make --cwd ./examples --makefile functions.toml split
[cargo-make] INFO - cargo make 0.35.7
[cargo-make] INFO - Build File: functions.toml
[cargo-make] INFO - Task: split
[cargo-make] INFO - Profile: development
[cargo-make] INFO - Running Task: init
[cargo-make] INFO - Running Task: split
[cargo-make] INFO - Execute Command: "echo" "1" "2" "3" "4"
1 2 3 4
[cargo-make] INFO - Running Task: end
[cargo-make] INFO - Build Done  in 0 seconds.

> cargo make --cwd ./examples --makefile functions.toml no-split
[cargo-make] INFO - cargo make 0.35.7
[cargo-make] INFO - Build File: functions.toml
[cargo-make] INFO - Task: no-split
[cargo-make] INFO - Profile: development
[cargo-make] INFO - Running Task: init
[cargo-make] INFO - Running Task: no-split
[cargo-make] INFO - Execute Command: "echo" "1 2 3 4"
1 2 3 4
[cargo-make] INFO - Running Task: end
[cargo-make] INFO - Build Done  in 0 seconds.

GetAt

The getat function accepts three arguments:

  • environment variable name
  • split by character
  • index of the item to return

And returns an array with a single value based on the given index.
This enables splitting an environment variable and extracting only the needed parameter, for example:

[env]
MULTIPLE_VALUES="1 2 3 4"

[tasks.getat]
command = "echo"
args = ["@@getat(MULTIPLE_VALUES,|,3)"]
> cargo make --cwd ./examples --makefile functions.toml getat
[cargo-make] INFO - cargo make 0.35.7
[cargo-make] INFO - Build File: functions.toml
[cargo-make] INFO - Task: getat
[cargo-make] INFO - Profile: development
[cargo-make] INFO - Running Task: getat
[cargo-make] INFO - Execute Command: "echo" "4"
4
[cargo-make] INFO - Build Done in 0 seconds.

Remove Empty

The remove empty function accepts a single argument:

  • environment variable name

In case the environment variable is not defined or is empty, that command line argument will be completely removed; otherwise the actual environment variable value is returned.

[tasks.remove-empty]
command = "echo"
args = ["1", "@@remove-empty(DOES_NOT_EXIST)", "2"]
> cargo make --cwd ./examples --makefile functions.toml remove-empty
[cargo-make] INFO - cargo make 0.35.7
[cargo-make] INFO - Build File: functions.toml
[cargo-make] INFO - Task: remove-empty
[cargo-make] INFO - Profile: development
[cargo-make] INFO - Running Task: init
[cargo-make] INFO - Running Task: remove-empty
[cargo-make] INFO - Execute Command: "echo" "1" "2"
1 2
[cargo-make] INFO - Running Task: end
[cargo-make] INFO - Build Done  in 0 seconds.

Trim

The trim function accepts the following arguments:

  • environment variable name
  • optionally a trim type: start/end (if not provided, it will trim both start and end)

In case the environment variable is not defined, or is empty after trimming, that command line argument will be completely removed; otherwise the trimmed environment variable value is returned.

[env]
TRIM_VALUE="   123    "

[tasks.trim]
command = "echo"
args = ["@@trim(TRIM_VALUE)"]
> cargo make --cwd ./examples --makefile functions.toml remove-empty
[cargo-make] INFO - cargo make 0.35.7
[cargo-make] INFO - Build File: functions.toml
[cargo-make] INFO - Task: trim
[cargo-make] INFO - Profile: development
[cargo-make] INFO - Running Task: init
[cargo-make] INFO - Running Task: trim
[cargo-make] INFO - Execute Command: "echo" "123"
123
[cargo-make] INFO - Running Task: end
[cargo-make] INFO - Build Done  in 0 seconds.

Below are examples when using the start/end attributes:

[env]
TRIM_VALUE="   123    "

[tasks.trim-start]
command = "echo"
args = ["@@trim(TRIM_VALUE,start)"]

[tasks.trim-end]
command = "echo"
args = ["@@trim(TRIM_VALUE,end)"]
> cargo make --cwd ./examples --makefile functions.toml trim-start
[cargo-make] INFO - cargo make 0.35.7
[cargo-make] INFO - Build File: functions.toml
[cargo-make] INFO - Task: trim-start
[cargo-make] INFO - Profile: development
[cargo-make] INFO - Running Task: init
[cargo-make] INFO - Running Task: trim-start
[cargo-make] INFO - Execute Command: "echo" "123    "
123
[cargo-make] INFO - Running Task: end
[cargo-make] INFO - Build Done  in 0 seconds.

> cargo make --cwd ./examples --makefile functions.toml trim-end
[cargo-make] INFO - cargo make 0.35.7
[cargo-make] INFO - Build File: functions.toml
[cargo-make] INFO - Task: trim-end
[cargo-make] INFO - Profile: development
[cargo-make] INFO - Running Task: init
[cargo-make] INFO - Running Task: trim-end
[cargo-make] INFO - Execute Command: "echo" "   123"
   123
[cargo-make] INFO - Running Task: end
[cargo-make] INFO - Build Done  in 0 seconds.

Decode

The decode function accepts the following arguments:

  • environment variable name
  • optionally a list of mapping values (source/target pairs)
  • optionally a default value

In case the output is empty, that command line argument will be completely removed.

For example:

[tasks.decode]
command = "echo"
args = ["Env:", "${CARGO_MAKE_PROFILE}", "Decoded:", "@@decode(CARGO_MAKE_PROFILE,development,dev,ci,test)"]

We check the DECODE_ENV_VAR environment variable value and look for it in the mappings.
If the value is development it will be mapped to dev, while ci is mapped to test.
In case no mapping is found, the original value is returned.
Sample run for a mapping that was found:

cargo make --cwd ./examples --makefile functions.toml -e DECODE_ENV_VAR=development decode
[cargo-make] INFO - cargo make 0.35.7
[cargo-make] INFO - Build File: functions.toml
[cargo-make] INFO - Task: decode
[cargo-make] INFO - Profile: development
[cargo-make] INFO - Running Task: empty
[cargo-make] INFO - Running Task: decode
[cargo-make] INFO - Execute Command: "echo" "Env:" "development" "Decoded:" "dev"
Env: development Decoded: dev
[cargo-make] INFO - Running Task: empty
[cargo-make] INFO - Build Done in 0 seconds.

Another sample run for a mapping that was not found:

cargo make --cwd ./examples --makefile functions.toml -e DECODE_ENV_VAR=unmapped decode
[cargo-make] INFO - cargo make 0.35.7
[cargo-make] INFO - Build File: functions.toml
[cargo-make] INFO - Task: decode
[cargo-make] INFO - Profile: development
[cargo-make] INFO - Running Task: empty
[cargo-make] INFO - Running Task: decode
[cargo-make] INFO - Execute Command: "echo" "Env:" "unmapped" "Decoded:" "unmapped"
Env: unmapped Decoded: unmapped
[cargo-make] INFO - Running Task: empty
[cargo-make] INFO - Build Done in 0 seconds.

Another example:

[tasks.decode-with-default]
command = "echo"
args = ["Env:", "${DECODE_ENV_VAR}", "Decoded:", "@@decode(DECODE_ENV_VAR,development,dev,ci,test,unknown)"]

Same as the previous example, but the difference here is that if no mapping is found, the default value (last argument) is returned.
Sample run:

cargo make --cwd ./examples --makefile functions.toml -e DECODE_ENV_VAR=unmapped decode-with-default
[cargo-make] INFO - cargo make 0.35.7
[cargo-make] INFO - Build File: functions.toml
[cargo-make] INFO - Task: decode-with-default
[cargo-make] INFO - Profile: development
[cargo-make] INFO - Running Task: empty
[cargo-make] INFO - Running Task: decode-with-default
[cargo-make] INFO - Execute Command: "echo" "Env:" "unmapped" "Decoded:" "unknown"
Env: unmapped Decoded: unknown
[cargo-make] INFO - Running Task: empty
[cargo-make] INFO - Build Done in 0 seconds.

Mapped values can hold environment expressions, for example:

[tasks.decode-with-eval]
command = "echo"
args = ["Env:", "${DECODE_ENV_VAR}", "Decoded:", "@@decode(DECODE_ENV_VAR,test,The current profile is: ${CARGO_MAKE_PROFILE})"]

Sample run:

cargo make --cwd ./examples --makefile functions.toml -e DECODE_ENV_VAR=test decode-with-eval
[cargo-make] INFO - cargo make 0.35.7
[cargo-make] INFO - Build File: functions.toml
[cargo-make] INFO - Task: decode-with-eval
[cargo-make] INFO - Profile: development
[cargo-make] INFO - Running Task: empty
[cargo-make] INFO - Running Task: decode-with-eval
[cargo-make] INFO - Execute Command: "echo" "Env:" "test" "Decoded:" "The current profile is: development"
Env: test Decoded: The current profile is: development
[cargo-make] INFO - Running Task: empty
[cargo-make] INFO - Build Done in 0 seconds.

Continuous Integration

cargo-make comes with a predefined flow for continuous integration builds executed by internal or online services such as travis-ci and appveyor.
It is recommended to install cargo-make with the debug flag for faster installation.

Github Actions

Add the following to your workflow yml file:

- name: Install cargo-make
  uses: actions-rs/cargo@v1
  with:
    command: install
    args: --debug cargo-make
- name: Run CI
  uses: actions-rs/cargo@v1
  with:
    command: make
    args: ci-flow

This will use the latest cargo-make with all latest features.

You can see full yaml file at: ci.yml

If you want to run code coverage and upload it to codecov, also define the following environment variable:

CARGO_MAKE_RUN_CODECOV=true

When working with workspaces, in order to run the ci-flow for each member and package all coverage data, use the following command:

- name: Install cargo-make
  uses: actions-rs/cargo@v1
  with:
    command: install
    args: --debug cargo-make
- name: Run CI
  uses: actions-rs/cargo@v1
  with:
    command: make
    args: --no-workspace workspace-ci-flow

To speed up cargo-make installation during the build, you can use the rust-cargo-make github action to download the prebuilt binary.

Travis

Add the following to .travis.yml file:

script:
  - cargo install --debug cargo-make
  - cargo make ci-flow

This will use the latest cargo-make with all latest features.
When caching cargo:

cache: cargo
script:
  - which cargo-make || cargo install cargo-make
  - cargo make ci-flow

NOTE: While using cache, in order to update cargo-make, you will need to manually clear the travis cache

If you want to run code coverage and upload it to codecov, also define the following environment variable:

env:
  global:
    - CARGO_MAKE_RUN_CODECOV="true"

NOTE: If you are using kcov coverage, you can cache the kcov installation by setting the CARGO_MAKE_KCOV_INSTALLATION_DIRECTORY environment variable to a location which is cached by travis.

When working with workspaces, in order to run the ci-flow for each member and package all coverage data, use the following command:

script:
  - cargo install --debug cargo-make
  - cargo make --no-workspace workspace-ci-flow

AppVeyor

Add the following to appveyor.yml file:

build: false

test_script:
  - cargo install --debug cargo-make
  - cargo make ci-flow

When working with workspaces, in order to run the ci-flow for each member and package all coverage data, use the following command:

build: false

test_script:
  - cargo install --debug cargo-make
  - cargo make --no-workspace workspace-ci-flow

GitLab CI

Add the following to your gitlab-ci.yml file:

test:cargo:
  script:
  - cargo install --debug cargo-make
  - cargo make ci-flow

When working with workspaces, in order to run the ci-flow for each member and package all coverage data, use the following command:

test:cargo:
  script:
  - cargo install --debug cargo-make
  - cargo make --no-workspace workspace-ci-flow

To upload your coverage information to codecov, you'll need to go to repo settings for your GitLab repo, and add a secret variable with your codecov token for that repository.

Then you can add the following in your gitlab-ci.yml to enable coverage support:

variables:
  CARGO_MAKE_RUN_CODECOV: "true"

CircleCI

Add the following to your .circleci/config.yml file:

- run:
    name: install cargo-make
    command: cargo install --debug cargo-make
- run:
    name: ci flow
    command: cargo make ci-flow

This will use the latest cargo-make with all latest features.
When caching cargo:

  - restore_cache:
      key: project-cache
  # ....
  - run:
      name: install cargo-make
      command: which cargo-make || cargo install cargo-make
  - run:
      name: ci flow
      command: cargo make ci-flow
  # ....
  - save_cache:
      key: project-cache
      paths:
        - "~/.cargo"

NOTE: While using cache, in order to update cargo-make, you will need to manually clear the CircleCI cache

NOTE: If you are using kcov coverage, you can cache the kcov installation by setting the CARGO_MAKE_KCOV_INSTALLATION_DIRECTORY environment variable to a location which is cached by CircleCI.

When working with workspaces, in order to run the ci-flow for each member and package all coverage data, use the following command:

- run:
    name: install cargo-make
    command: cargo install --debug cargo-make
- run:
    name: ci flow
    command: cargo make --no-workspace workspace-ci-flow

Azure Pipelines

Add the following to your azure-pipelines.yml file:

- script: cargo install --debug cargo-make
  displayName: install cargo-make
- script: cargo make ci-flow
  displayName: ci flow

When working with workspaces, in order to run the ci-flow for each member and package all coverage data, use the following setup:

- script: cargo install --debug cargo-make
  displayName: install cargo-make
- script: cargo make --no-workspace workspace-ci-flow
  displayName: ci flow

drone.io

This is a minimal .drone.yml example for running the ci-flow task with the docker runner:

pipeline:
  ci-flow:
    image: rust:1.38-slim
    commands:
    - cargo install --debug cargo-make
    - cargo make ci-flow

Cirrus CI

This is a minimal .cirrus.yml example for running the ci-flow task:

container:
  image: rust:latest

task:
  name: ci-flow
  install_script: cargo install --debug cargo-make
  flow_script: cargo make ci-flow

Predefined Flows

The default makefiles file comes with many predefined tasks and flows.
The following are some of the main flows that can be used without any need of an external Makefile.toml definition.

  • default - Can be executed without adding the task name, simply run 'cargo make'. This task is an alias for dev-test-flow.
  • dev-test-flow - Also the default flow so it can be invoked without writing any task name (simply run cargo make).
    This task runs formatting, cargo build and cargo test and will most likely be the set of tasks that you will run while developing and testing a rust project.
  • watch-flow - Watches for any file change and if any change is detected, it will invoke the test flow.
  • ci-flow - Should be used in CI builds (such as travis/appveyor) and it runs build and test with verbose level.
  • workspace-ci-flow - Should be used in CI builds (such as travis/appveyor) for workspace projects.
  • publish-flow - Cleans old target directory and publishes the project.
  • build-flow - Runs full cycle of build, tests, security checks, dependencies up to date validations and documentation generation.
    This flow can be used to make sure your project is fully tested and up to date.
  • coverage-flow - Creates coverage report from all unit and integration tests (not supported on windows). By default cargo-make uses kcov for code coverage, however additional unsupported implementations are defined.
  • codecov-flow - Runs the coverage-flow and uploads the coverage results to codecov (not supported on windows).

Coverage

cargo-make has built in support for multiple coverage tasks.
Switching between them without modifying the flows is done by setting the coverage provider name in the CARGO_MAKE_COVERAGE_PROVIDER environment variable as follows:

[env]
# can be defined as kcov, tarpaulin, ...
CARGO_MAKE_COVERAGE_PROVIDER = "kcov"

In case you have a custom coverage task, it can be plugged into the coverage flow by changing the main coverage task alias, for example:

[tasks.coverage]
alias = "coverage-some-custom-provider"

To view all currently supported providers, run:

cargo make --list-all-steps | grep "coverage-"

Example output:

ci-coverage-flow: No Description.
coverage-tarpaulin: Runs coverage using tarpaulin rust crate (linux only)
coverage-flow: Runs the full coverage flow.
coverage-kcov: Installs (if missing) and runs coverage using kcov (not supported on windows)

All built in coverage providers are supported by their authors and not by cargo-make.

Based on the above explanation, to generate a coverage report for a simple project, run the following command:

cargo make coverage

In order to run coverage in a workspace project and package all member coverage reports in the workspace level, run the following command:

cargo make --no-workspace workspace-coverage

If you are using kcov, you may declare the following environment variables in your Makefile.toml to customize the coverage task:

Specify lines or regions of code to ignore:

[env]
CARGO_MAKE_KCOV_EXCLUDE_LINE = "unreachable,kcov-ignore"             # your choice of pattern(s)
CARGO_MAKE_KCOV_EXCLUDE_REGION = "kcov-ignore-start:kcov-ignore-end" # your choice of markers

By default, the binaries executed to collect coverage are filtered by a regular expression. You may override the following in case it does not match the binaries generated on your system:

[env]
# for example: cargo make filter regex would be cargo_make-[a-z0-9]*$
CARGO_MAKE_TEST_COVERAGE_BINARY_FILTER = "${CARGO_MAKE_CRATE_FS_NAME}-[a-z0-9]*$"

Full List

See full list of all predefined tasks (generated via cargo make --list-all-steps)

Disabling Predefined Tasks/Flows

In order to prevent loading of internal core tasks and flows, simply add the following configuration property in your external Makefile.toml:

[config]
skip_core_tasks = true

Modifying Predefined Tasks/Flows

It is possible to modify the internal core tasks.
All modifications are defined in the config.modify_core_tasks section.

[config.modify_core_tasks]
# if true, all core tasks are set to private (default false)
private = true

# if set to some value, all core tasks are modified to: <namespace>::<name> for example default::build
namespace = "default"

Minimal Version

In case you are using cargo-make features that are only available from a specific version, you can ensure the build will fail if it is invoked by an older cargo-make version.
In order to specify the minimal version, use the min_version in the config section as follows:

[config]
min_version = "0.35.7"

Performance Tuning

Some features of cargo-make can be disabled which can improve the startup time.
Below is a list of all current features:

[config]
# Skip loading of all core tasks which saves a bit on toml parsing and task creation
skip_core_tasks = true
# Skips loading Git related environment variables
skip_git_env_info = true
# Skips loading rust related environment variables
skip_rust_env_info = true
# Skips loading the current crate related environment variables
skip_crate_env_info = true

When running in a rust workspace, you can disable some of the features in the member makefiles.
For example, if the members are in the same git repo as the entire project, you can add skip_git_env_info in the members makefiles and they will still have the environment variables setup from the parent process.
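A member level Makefile.toml in that case might contain just the following (a minimal sketch):

[config]
# git info is inherited from the environment set up by the parent process
skip_git_env_info = true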

Diff Changes

Using the --diff-steps cli flag, you can diff your current overrides compared to the prebuilt internal makefile flow.

Example Usage:

cargo make --diff-steps --makefile ./examples/override_core.toml post-build
[cargo-make] INFO - cargo make 0.35.7
[cargo-make] INFO - Build File: ./examples/override_core.toml
[cargo-make] INFO - Task: post-build
[cargo-make] INFO - Setting Up Env.
[cargo-make] INFO - Printing diff...
[cargo-make] INFO - Execute Command: "git" "diff" "--no-index" "/tmp/cargo-make/Lz7lFgjj0x.toml" "/tmp/cargo-make/uBpOa9THwD.toml"
diff --git a/tmp/cargo-make/Lz7lFgjj0x.toml b/tmp/cargo-make/uBpOa9THwD.toml
index 5152290..ba0ef1d 100644
--- a/tmp/cargo-make/Lz7lFgjj0x.toml
+++ b/tmp/cargo-make/uBpOa9THwD.toml
@@ -42,7 +42,9 @@
         name: "post-build",
         config: Task {
             clear: None,
-            description: None,
+            description: Some(
+                "Overide description"
+            ),
             category: Some(
                 "Build"
             ),
[cargo-make] INFO - Done

Git is required to be available as it is used to diff the structures and output it to the console using standard git coloring scheme.

Cli Options

These are the following options available while running cargo-make:

USAGE:
    cargo make [FLAGS] [OPTIONS] [--] [TASK_CMD]...
    or
    makers  [FLAGS] [OPTIONS] [--] [TASK_CMD]...

FLAGS:
        --allow-private                Allow invocation of private tasks
        --diff-steps                   Runs diff between custom flow and prebuilt flow (requires git)
        --disable-check-for-updates    Disables the update check during startup
        --experimental                 Allows access unsupported experimental predefined tasks.
    -h, --help                         Prints help information
        --list-all-steps               Lists all known steps
        --no-color                     Disables colorful output
        --no-on-error                  Disable on error flow even if defined in config sections
        --no-workspace                 Disable workspace support (tasks are triggered on workspace and not on members)
        --print-steps                  Only prints the steps of the build in the order they will be invoked but without
                                       invoking them
        --skip-init-end-tasks          If set, init and end tasks are skipped
        --time-summary                 Print task level time summary at end of flow
    -v, --verbose                      Sets the log level to verbose (shorthand for --loglevel verbose)
    -V, --version                      Prints version information

OPTIONS:
        --cwd <DIRECTORY>                    Will set the current working directory. The search for the makefile will be
                                             from this directory if defined.
    -e, --env <ENV>...                       Set environment variables
        --env-file <FILE>                    Set environment variables from provided file
        --list-category-steps <CATEGORY>     List steps for a given category
    -l, --loglevel <LOG LEVEL>               The log level [default: info]  [possible values: verbose, info, error]
        --makefile <FILE>                    The optional toml file containing the tasks definitions [default:
                                             Makefile.toml]
        --output-format <OUTPUT FORMAT>      The print/list steps format (some operations do not support all formats)
                                             [default: default]  [possible values: default, short-description, markdown,
                                             markdown-single-page, markdown-sub-section, autocomplete]
        --output-file <OUTPUT_FILE>          The list steps output file name
    -p, --profile <PROFILE>                  The profile name (will be converted to lower case) [default: development]
        --skip-tasks <SKIP_TASK_PATTERNS>    Skip all tasks that match the provided regex (example: pre.*|post.*)
    -t, --task <TASK>                        The task name to execute (can omit the flag if the task name is the last
                                             argument) [default: default]

ARGS:
    <TASK_CMD>...    The task to execute, potentially including arguments which can be accessed in the task itself.

Shell Completion

cargo-make comes with shell auto completion support, however in order to provide the exact task names that are available in the current directory, it will run the --list-all-steps command which might take a bit to finish.

Bash

Source the makers-completion.bash file found in extra/shell folder at the start of your shell session. It will enable auto completion for the makers executable.

zsh

zsh supports bash auto completion, therefore the existing bash autocomplete can be used by running the following script:

autoload -U +X compinit && compinit
autoload -U +X bashcompinit && bashcompinit

# make sure to update the path based on your file system location
source ./extra/shell/makers-completion.bash

It will enable auto completion for the makers executable.

Global Configuration

Some of the default CLI values and cargo-make behaviour can be configured via an optional global configuration file config.toml located in the cargo-make directory.

The cargo-make directory location can be defined via CARGO_MAKE_HOME environment variable value.
If CARGO_MAKE_HOME has not been defined, the cargo-make default location is:

  • Linux - $XDG_CONFIG_HOME or $HOME/.config
  • Windows - RoamingAppData
  • Mac - $HOME/Library/Preferences

If for any reason, the above paths are not valid for the given platform, it will default to $HOME/.cargo-make

The following example config.toml shows all possible options with their default values:

# The default log level if not defined by the --loglevel cli argument
log_level = "info"

# The default configuration whether output coloring is disabled
disable_color = false

# The default task name if no task was provided as part of the cargo-make invocation
default_task_name = "default"

# cargo-make checks for updates during invocation.
# This configuration defines the minimum amount of time which must pass before cargo-make invocations will try to check for updates.
# If the minimum amount of time did not pass, cargo-make will not check for updates (same as --disable-check-for-updates)
# Valid values are: always, daily, weekly, monthly
# If any other value is provided, it will be treated as weekly.
update_check_minimum_interval = "weekly"

# If set to true and cwd was not provided in the command line arguments and the current cwd is not the project root (Cargo.toml not present),
# cargo make will attempt to find the project root by searching the parent directories, until a directory with a Cargo.toml is found.
# cargo make will set the cwd to that directory and will use any Makefile.toml found at that location.
search_project_root = false

Makefile Definition

Config Section

Task

Platform Override

Condition

More info can be found in the types section of the API documentation.

Task Naming Conventions

This section explains the logic behind the default task names.
While the default names logic can be used as a convention for any new task defined in some project Makefile.toml, it is not required.

The default makefiles file comes with several types of tasks:

  • Single command or script task (for example cargo build)
  • Tasks that come before or after the single command tasks (hooks)
  • Tasks that define flows using dependencies
  • Tasks which only install some dependency

Single command tasks are named based on their command (in most cases), for example the task that runs cargo build is named build.

[tasks.build]
command = "cargo"
args = ["build"]

This makes it easy to understand what the task does.

Tasks that are invoked before/after those tasks are named the same way as the original task but with the pre/post prefix.
For example, for the task build, the default toml also defines pre-build and post-build tasks.

[tasks.pre-build]

[tasks.post-build]

In the default makefiles, all pre/post tasks are empty and are there as placeholders for external Makefile.toml to override so custom functionality can be defined easily before/after running a specific task.

Flows are named with the flow suffix, for example: ci-flow

[tasks.ci-flow]
# CI task will run cargo build and cargo test with verbose output
dependencies = [
    "pre-build",
    "build-verbose",
    "post-build",
    "pre-test",
    "test-verbose",
    "post-test"
]

This prevents flow task names from conflicting with single command task names and quickly allows users to understand that the task is a flow definition.

Tasks which only install some dependency but do not invoke any command start with the install- prefix, for example:

[tasks.install-rust-src]
install_crate = { rustup_component_name = "rust-src" }

Articles

Below is a list of articles which explain most of the cargo-make features.

The articles are missing some of the newer features which have been added after they were published.

Badge

If you are using cargo-make in your project and want to display it in your project README or website, you can embed the "Built with cargo-make" badge.

Built with cargo-make

Here are a few snippets:

Markdown

[![Built with cargo-make](https://sagiegurari.github.io/cargo-make/assets/badges/cargo-make.svg)](https://sagiegurari.github.io/cargo-make)

HTML

<a href="https://sagiegurari.github.io/cargo-make">
  <img src="https://sagiegurari.github.io/cargo-make/assets/badges/cargo-make.svg" alt="Built with cargo-make">
</a>

Roadmap

While already feature rich, cargo-make is still under heavy development.
You can view the future development items list in the github project issues

Editor Support

Vim

VSCode

For debugging purposes there are some example .vscode files located within the docs/vscode-example directory

You may also need:

  • A local install of LLVM (For the LLDB Debugger) installed and reachable on the path
  • VSCode Extension - CodeLLDB
  • VSCode Extension - "rust-analyser" (not the "rust" one)
  • VSCode Extension - "Task Explorer"
  • VSCode Extension - "crates"

Contributing

See contributing guide

Release History

See Changelog

License

Developed by Sagie Gur-Ari and licensed under the Apache 2 open source license.

Comments
  • "Task not found" with workspace AND "root package"

    In crate with workspace AND "root package":

    • Calling cargo make build/test/etc. ignores "root package"
    • Calling cargo make 'task_name' (without CARGO_MAKE_EXTEND_WORKSPACE_MAKEFILE) results in "Task not found". Adding CARGO_MAKE_EXTEND_WORKSPACE_MAKEFILE ignores "root package".

    Basic cargo build/test works fine.

    See attachment.

    bug 
    opened by tower120 32
  • Support variable substitution with toml lists in the same way

    Support variable substitution with toml lists in the same way "${@}" is supported

    Features Description One Makefile-ism that I like to use is variables which contain the common options shared between multiple different invocations of a command, to ensure they remain consistent.

    (eg. So I can retrofit a CARGOFLAGS onto Cargo to complement RUSTFLAGS.)

    However, cargo-make lacking the shell's loose relationship with list-like behaviour rules out the old standby of a space-delimited string and cargo-make panics if you attempt to put something like options = ["--foo", "--bar"] in the [env] section.

    (And the @shell runner seems to be attempting some kind of weird, broken quoting on my experiments in feeding it shell-quoted strings.)

    It'd be helpful to have an alternative to crafting tasks like this, which bring in a whole separate language just to split a shell-quoted string for want of list-type variables:

    [env]
    splittable_arguments = "'%s | %s | %s' foo bar baz"
    unsplit_argument = "hello world!"
    
    [tasks.example]
    command = "sh"
    args = ["-c", "printf ${splittable_arguments} \"${unsplit_argument}\""]
    

    In fact, there's actually a case in the default Makefile.toml which doesn't need this, but which would have benefitted from the mindset that produced it. There are quite a few repetitions of --all-features that would be more useful if replaced with something like "${features_opt}" so it'd be easy to replace with nothing or --features="whatever I want".

    (That's actually one of the things contributing to my decision that it's easier to skip_core_tasks. I don't want --all-features and I want users of my template to be able to change the set of enabled features for a single run by invoking with --env.)

    Describe the solution you'd like

    Support some kind of list variable which can be used the same way ${@} is.

    (It'd also be nice to support temporarily overriding them using --env but I haven't had time to think about the pros and cons of various approaches to implementing that yet.)

    enhancement 
    opened by ssokolow 31
  • script hangs terminal

    Describe The Bug

    Running a task that launches npx webpack-dev-server causes a hang on Windows when exiting

    To Reproduce

    1. Install node, npm, etc.
    2. Create a minimal webpack project (and install webpack-dev-server)
    3. Start the webpack-dev-server via npx directly:
    npx webpack-dev-server --config webpack.config.js

    4. Control-C to break out
    5. No problem so far.. can type on the console, task manager looks clean - all good
    6. Now try via cargo make...
    7. Create the following in Makefile.toml:
    [tasks.webpack-development-server]
    script = ["npx webpack-dev-server --config webpack.config.js"]

    8. Launch it via cargo make webpack-development-server
    9. Starts fine
    10. Control-C to exit
    11. Appears to exit but then terminal is messed up - can't enter text
    12. Also task manager shows additional consoles (this is after hitting control-c in the terminal but before closing the window entirely); screenshot omitted.

    Error Stack

    None - it's just that the terminal becomes unusable

    bug enhancement 
    opened by dakom 29
  • [FR] A way to use `pwsh.exe` (PowershellCore) on Windows by default instead of `cmd.exe`

    I'm using Windows and the new Powershell Core (pwsh.exe and not powershell.exe).

    It would be amazing to be able to use pwsh.exe instead of cmd.exe by default:

    [tasks.windows_start]
    script_runner = "@shell-pwsh" # or differently
    script = "cargo watch -x run"
    
    [tasks.start]
    windows_alias = "windows_start"
    
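    Until a dedicated @shell-pwsh runner exists, one workaround is to point the generic script runner at pwsh directly; a minimal sketch, assuming pwsh is available on the PATH (flags and task names are illustrative):

    [tasks.windows_start]
    script_runner = "pwsh"
    # the script content is written to a temporary .ps1 file and passed to pwsh,
    # whose default positional parameter is -File
    script_runner_args = ["-NoProfile"]
    script_extension = "ps1"
    script = "cargo watch -x run"

    [tasks.start]
    windows_alias = "windows_start"
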
    enhancement 
    opened by frederikhors 26
  • Override env var

    Hey, I did expect env_files to behave like typical dotenv files, which surprisingly isn't the case. Is that intended? My env_file has a variable "CONNECTION_STRING", which should default to the value set in the env_file but be overwritten if it's defined as an environment variable. It's not being overwritten though, is it a bug?

    Use case: Having default values defined in the env_file that will be overwritten in the docker machine by global environment variables.
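
    A minimal sketch of the setup being described (file and task names are illustrative); the expectation is that a CONNECTION_STRING already present in the environment should win over the default value loaded from the file:

    # .env - default values
    CONNECTION_STRING=Server=localhost;Database=dev

    # Makefile.toml
    env_files = ["./.env"]

    [tasks.print-connection]
    command = "echo"
    args = ["${CONNECTION_STRING}"]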

    enhancement 
    opened by John0x 23
  • Support accepting arguments and passing them to commands

    Problem Description

    I typically write command-line utilities in Rust using clap which have required positional arguments. (eg. command_name [options] <path> ...)

    Currently, I use just's +args="" to implement commands like run-memstats +args="" (MALLOC_CONF=stats_print:true cargo run -- {{args}}) or kcachegrind +args="" (run the command inside valgrind, then open up the resulting log in kcachegrind).

    While it's not perfect (just doesn't provide an equivalent to "$@" so the choices are to only support one argument or not support paths with spaces), cargo-make doesn't seem to provide any mechanism at all for this... which greatly limits its utility to me.

    If I use it, I'll probably wind up writing a shell wrapper which handles certain commands itself and then passes the rest through to cargo-make.
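
    As the list-variables request above notes, "${@}" substitution is supported for forwarding extra CLI arguments to a task; a minimal sketch of the kind of command described here (task name and env value are illustrative):

    [tasks.run-memstats]
    env = { MALLOC_CONF = "stats_print:true" }
    command = "cargo"
    # ${@} expands to any arguments passed on the command line after the task name,
    # e.g. `cargo make run-memstats input.txt`
    args = ["run", "--", "${@}"]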

    enhancement 
    opened by ssokolow 22
  • Conditional env vars shouldn't unset when the predicate fails

    Feature Description

    The docs describe it as possible to set environment variables through conditions, but this doesn't actually work for either/or variables.

    For example, the following:

    MAKEFILE_NIGHTLY_OVERRIDE = { script = "echo MAKEFILE_NIGHTLY_OVERRIDE" } # Get it from the bash/cmd env
    MAKEFILE_NIGHTLY = { condition = { env_set = ["MAKEFILE_NIGHTLY_OVERRIDE"]}, value = "1"}
    MAKEFILE_NIGHTLY = { condition = { env_not_set = ["MAKEFILE_NIGHTLY_OVERRIDE"]}, value = "2"}
    

    will not conditionally set MAKEFILE_NIGHTLY based on the value of MAKEFILE_NIGHTLY_OVERRIDE. Rather, it will set MAKEFILE_NIGHTLY to 2 if the override variable is set and 1 if it isn't.

    This is because the condition evaluating to false ends up unsetting the variable.

    Describe The Solution You'd Like

    It would be nice if conditions would only touch (set or unset) the variable if the condition is true.

    (It would also work if you could use duckscript in env vars since then you can just use the if there)

    rejected 
    opened by Manishearth 21
  • Support for including only parts of the default makefile without risk of naming collisions

    Features Description

    I'm working on modernizing my rust-cli-boilerplate and I care very much about providing a clean, coherent development experience... but the default makefile tasks follow a philosophy diametrically opposed to how the justfile I'm interested in replacing is set up.

    (Most notably, I consider it borderline insane to make the intuitive and concise build be a plumbing command (to use the git terminology) which doesn't execute a pre- and post- command, while the stuff which seems to be what an "experienced cargo-make user" should be using habitually, like the *-flow tasks, is verbose in a way I wouldn't expect from a porcelain command.)

    I also find that having all those default tasks I never use and never test for compatibility with my codebase makes --list-all-steps useless when it should be a more convenient alternative to pulling up my project template's README again. (Heck. Ideally, I'd like to generate that part of my README from --list-all-steps.)

    That aside, I find myself redefining a lot of commands to get rid of default flags I don't want, like --all-features on build commands, which could cause build errors if flags are meant to be mutually exclusive as in rust-cpython, and missing commands which one would think would be obvious to provide a coherent interface, such as a wrapper around cargo run so that makers or a shorter shell alias of it can become muscle memory and I can focus on the subcommands.

    ~~At present, since I can't find a way to omit the Makefile.toml in the README, my plan is to write a quick little script which takes an installed instance of cargo-make and my Makefile.toml and amends it with all the lines needed to manually cut out all the predefined bits I don't want.~~

    (I already build this stuff from scratch. I'm only interested in cargo-make because it's starting to get more painful to hack around Just's myriad GNU Make-esque shortcomings than it would be to write said "explicitly disable all defaults" script.)

    EDIT: I had an idiot moment. skip_core_tasks = true exists.

    Describe the solution you'd like

    Ideally, a Makefile.toml option which lets me whitelist and rename which tasks I receive from Makefile.toml (eg. I'm perfectly happy to let you maintain the kcov-related tasks), so Makefile.toml is less use defaults::* and more use defaults::coverage-kcov as kcov;

    ~~Failing that, I'll take a simple option in Makefile.toml which prevents the default Makefile.toml from being loaded so I can take full responsibility for writing and maintaining all of my tasks.~~

    enhancement 
    opened by ssokolow 21
  • Implementation of @quicklisp runner

    DISCLAIMER: I tried to make this as short as possible, but there is lots of info to be processed x.x

    DISCLAIMER: I am not a lisp backend developer (I just use the language on my preferred implementation atm), so the information is provided to the best of my ability and may be inaccurate; peer reviews were made and addressed. Any relevant information/criticism is welcomed.

    This is a feature request to implement support for quicklisp (https://www.quicklisp.org/beta/), which is deployed as a loadable lisp library and is expected to allow implementation-independent code in cargo-make (https://github.com/sagiegurari/cargo-make), arguably a better alternative to the make command reading a Makefile.

    Expectation

    The ability to use lisp and/or common-lisp (programming language) called from cargo-make on all devices supported by rustlang and/or *lisp (https://doc.rust-lang.org/nightly/rustc/platform-support.html), with a readable, fault-tolerant and implementation-independent implementation, while not preventing the implementation of other Turing-complete systems.

    Issue

    Currently cargo-make version =0.32.6 requires the following entry to run common lisp through Embedded Common Lisp (ecl) using an implementation-independent code style:

    [env]
    MESSAGE = "something"
    
    # NOTICE(Krey): You will need quicklisp installed
    
    [tasks.kreyren]
    script_runner = "ecl"
    script_runner_args = [ "--norc", "--quiet", "--shell" ]
    script_extension = "cl"
    script = [
    '''
    (setf *load-verbose* nil)
    (load "/home/kreyren/quicklisp/setup.lisp" :verbose nil)
    (ql:quickload :uiop :silent t)
    
    (write-line (uiop:getenv "MESSAGE"))
    '''
    ]
    

    to return something which is hard to read and maintain as it requires duplicate code (namely script_runner_args and lisp lines above write-line)

    Where the expected is:

    [env]
    MESSAGE = "something"
    
    [tasks.kreyren]
    script_runner = "@quicklisp"
    script = [
    '''
    (write-line (uiop:getenv "MESSAGE"))
    '''
    ]
    

    Implementation compatibility

    Common lisp has many implementations, such as:

    • CLISP (clisp) - Implementation of Common lisp written in C
    • CCL (Clozure CL)
    • Clasp - Effort to write C++/LLVM common lisp implementation
    • ABCL (Armed Bear CL)
    • ACL (Allegro CL)
    • LW (LispWorks)
    • CMUCL (Carnegie Mellon University)
    • MKCL (ManKai)
    • Scieneer CL
    • SICL - Work in progress implementation
    • Embedded Common Lisp (ECL)
    • Steel Bank Common Lisp (SBCL)
    • possibly more..

    The following are dialects of lisp that are not supported by quicklisp, as quicklisp depends on ASDF, which uses CLOS, which is close to impossible to port to these (see statement below):

    • Emacs Lisp (elisp)
    • Scheme
    • Racket - https://docs.racket-lang.org/
    • etc..

    Hard-coded logic is mentioned in the specification (http://www.lispworks.com/documentation/HyperSpec/Body/03_ababa.htm) and is implemented in quicklisp, which allows writing implementation-independent common lisp (meaning that the common lisp code written will work on all other implementations).

    Example of implementation-dependent code printing the value of environment variable MESSAGE:

    (write-line (ext:getenv "MESSAGE"))
    

    as ext is specific to ecl and sbcl (possibly others..).

    Whereas the following implementation works on all implementations:

    (load #p"~/quicklisp/setup.lisp")
    (write-line (uiop:getenv "MESSAGE"))
    

    Thus script_runner = "@quicklisp" should look for executables capable of processing the runtime instead of depending only on hard-coded ones.

    Silencing unwanted output

    By default quicklisp outputs a lot of unwanted information:

    kreyren@leonid:~$ export MESSAGE=kreyren
    kreyren@leonid:~$ ecl
    ECL (Embeddable Common-Lisp) 20.4.24 (git:UNKNOWN)
    Copyright (C) 1984 Taiichi Yuasa and Masami Hagiya
    Copyright (C) 1993 Giuseppe Attardi
    Copyright (C) 2013 Juan J. Garcia-Ripoll
    Copyright (C) 2018 Daniel Kochmanski
    Copyright (C) 2020 Daniel Kochmanski and Marius Gerbershagen
    ECL is free software, and you are welcome to redistribute it
    under certain conditions; see file 'Copyright' for details.
    Type :h for Help.  
    Top level in: #<process TOP-LEVEL 0x7fb1d054af80>.
    > (load "/home/kreyren/quicklisp/setup.lisp")
    
    ;;; Loading "/home/kreyren/quicklisp/setup.lisp"
    ;;; Loading #P"/usr/lib/x86_64-linux-gnu/ecl-20.4.24/asdf.fas"
    #P"/home/kreyren/quicklisp/setup.lisp"
    > (ql:quickload :uiop)
    To load "uiop":
      Load 1 ASDF system:
        uiop
    ; Loading "uiop"
    
    (:UIOP)
    > (write-line (uiop:getenv "MESSAGE"))
    kreyren
    "kreyren"
    > 
    

    To silence these on ecl it's expected to use:

    • --shell to silence the header including copyright
    • (setf *load-verbose* nil) to silence the ;; Loading .. messages from ecl
    • (load "/home/kreyren/quicklisp/setup.lisp" :verbose nil) to silence the ;;; Loading #P"/usr/lib/x86_64-linux-gnu/ecl-20.4.24/asdf.fas" #P"/home/kreyren/quicklisp/setup.lisp" from quicklisp itself
    • (ql:quickload :uiop :silent t) to silence the loading of quicklisp in the said implementation

    Additionally, we need the argument --norc to avoid sourcing ~/.eclrc, which could interfere with the logic in cargo-make's script.

    On clisp this outputs:

    kreyren@leonid:~$ MESSAGE=kreyren clisp test.lisp 
    WARNING: DEFGENERIC: redefining function DIST in
             /home/kreyren/.cache/common-lisp/clisp-2.49.92-unix-x64/home/kreyren/quicklisp/quicklisp/dist.fas,
             was defined in top-level
    WARNING: DEFGENERIC: redefining function SYSTEM-INDEX-URL in
             /home/kreyren/.cache/common-lisp/clisp-2.49.92-unix-x64/home/kreyren/quicklisp/quicklisp/dist.fas,
             was defined in top-level
    WARNING: DEFGENERIC: redefining function RELEASE-INDEX-URL in
             /home/kreyren/.cache/common-lisp/clisp-2.49.92-unix-x64/home/kreyren/quicklisp/quicklisp/dist.fas,
             was defined in top-level
    WARNING: DEFGENERIC: redefining function AVAILABLE-VERSIONS-URL in
             /home/kreyren/.cache/common-lisp/clisp-2.49.92-unix-x64/home/kreyren/quicklisp/quicklisp/dist.fas,
             was defined in top-level
    WARNING: DEFGENERIC: redefining function RELEASE in
             /home/kreyren/.cache/common-lisp/clisp-2.49.92-unix-x64/home/kreyren/quicklisp/quicklisp/dist.fas,
             was defined in top-level
    WARNING: DEFGENERIC: redefining function NAME in
             /home/kreyren/.cache/common-lisp/clisp-2.49.92-unix-x64/home/kreyren/quicklisp/quicklisp/dist.fas,
             was defined in top-level
    WARNING: DEFGENERIC: redefining function BASE-DIRECTORY in
             /home/kreyren/.cache/common-lisp/clisp-2.49.92-unix-x64/home/kreyren/quicklisp/quicklisp/dist.fas,
             was defined in top-level
    WARNING: DEFGENERIC: redefining function METADATA-NAME in
             /home/kreyren/.cache/common-lisp/clisp-2.49.92-unix-x64/home/kreyren/quicklisp/quicklisp/dist.fas,
             was defined in top-level
    WARNING: DEFGENERIC: redefining function PREFERENCE-PARENT in
             /home/kreyren/.cache/common-lisp/clisp-2.49.92-unix-x64/home/kreyren/quicklisp/quicklisp/dist.fas,
             was defined in top-level
    WARNING: DEFGENERIC: redefining function SHORT-DESCRIPTION in
             /home/kreyren/.cache/common-lisp/clisp-2.49.92-unix-x64/home/kreyren/quicklisp/quicklisp/dist.fas,
             was defined in top-level
    WARNING: DEFGENERIC: redefining function PROVIDED-RELEASES in
             /home/kreyren/.cache/common-lisp/clisp-2.49.92-unix-x64/home/kreyren/quicklisp/quicklisp/dist.fas,
             was defined in top-level
    WARNING: DEFGENERIC: redefining function PROVIDED-SYSTEMS in
             /home/kreyren/.cache/common-lisp/clisp-2.49.92-unix-x64/home/kreyren/quicklisp/quicklisp/dist.fas,
             was defined in top-level
    WARNING: DEFGENERIC: redefining function ARCHIVE-URL in
             /home/kreyren/.cache/common-lisp/clisp-2.49.92-unix-x64/home/kreyren/quicklisp/quicklisp/dist.fas,
             was defined in top-level
    kreyren
    

    To silence these we need --quiet

    Handling of arguments per implementation

    I recommend implementing script_runner_args in the background depending on the executable found; namely, it should do the following if the end user has not overridden script_runner_args already:

    # ECL
    script_runner_args = [ "--norc", "--quiet", "--shell" ]
    script_extension = "cl"
    # clisp
    script_runner_args = [ "--norc", "--quiet" ]
    script_extension = "cl"
    # elisp - Doesn't work atm
    script_runner_args = [ "--quick" ,"--script" ]
    script_extension = "cl"
    # sbcl
    script_runner_args = [ "--no-userinit", "--script" ]
    script_extension = "cl"
    

    rlisp

    rlisp was concluded to be not usable https://github.com/Kreyren/rust-lisp/actions/runs/295136199

    Created https://github.com/swgillespie/rust-lisp/pull/7 to track the code usability

    Filed https://github.com/swgillespie/rust-lisp/issues/6 to get more info

    elisp

    elisp is able to process the file using script_runner_args = [ "--quick" ,"--script" ], but does not work:

    Loading /home/kreyren/quicklisp/setup.lisp... Symbol’s function definition is void: defpackage

    Filed https://github.com/quicklisp/quicklisp-bootstrap/issues/21 for the elisp compatibility of quicklisp

    EDIT: Is not supported

    Prepending lisp code to the created scripts

    For the implementation to be able to use quicklisp, we need to load the quicklisp library, which can be done by prepending:

    (load #p"~/quicklisp/setup.lisp")
    

    assuming that it has been installed on the system.

    Deployment of quicklisp

    To be able to run quicklisp we need to run https://beta.quicklisp.org/quicklisp.lisp to get the backend to be used in the implementation.

    FIXME: How to implement this in cargo-make?

    Quicklisp goal

    Allegedly the goal is to make it easy to distribute and update Lisp code over the Internet, which may interfere with the presented use case.

    Caches

    Worth mentioning that common-lisp caches its functions in ~/.cache/common-lisp, which might influence the runtime as changes might not be picked up in real time.

    Quicklisp compatibility with non-common lisp

    This is a quote of Zach Beane (@xach) from irc.freenode.net/#lisp (was allowed to quote):

    <Xach> quicklisp is a common lisp program. other lisps are not common lisp and are not compatible.
    <Xach> other lisps could be made compatible. it's a lot of work. nobody has done it. common lisp has a lot of features and quicklisp uses a lot of 
    them.
    <Xach> also, quicklisp uses extra-standard functionality that would also need implementation - networking and filesystem work mostly.
    <kreyren> Is quicklisp implemented by design to allow possible implementation of non-CL interpretations or would that require lots of rewritting?
    <Xach> kreyren: it is implemented by design to take full advantage of Common Lisp and I think trying to make it work elsewhere would be difficult and not very rewarding - since it is meant to allow you to run other common lisp programs, which also are not portable to other lisps.
    <Xach> "Lisp" isn't generic - there are only specifics
    <Xach> I think the ideas of quicklisp are pretty portable, even if the code itself is not especially
    

    My usecase

    I want to use (C)lisp in my repositories, which aim to be cross-platform compatible with the ideology of supporting as many devices as possible, where it doesn't limit me in terms of technology for things that are not practical to implement in rustlang (scripts).

    I prefer rustlang since it allows me to outsource my code into libraries (crates), allowing for passive maintenance of my codebase, which is much less practical in C, and I am fed up with hotfixing C standard issues and wasting time looking for memory leaks; rustlang seems to be no less efficient than C assuming optimizations are made, e.g. comparing fibonacci in rustlang (https://github.com/Kreyren/rustlang-fibonacci/tree/kreyren/case-study-performance-2) to C and Lisp.

    Rustlang currently works on fewer devices compared to lisp, which in the worst-case scenario I want to handle through a lisp wrapper that reads the Makefile.toml.

    Clisp maintenance

    Based on the activity of https://gitlab.com/gnu-clisp/clisp it was advised not to rely on clisp as it seems somewhat unmaintained, which might be subjective given that there don't seem to be any actionable issues.

    EDIT: Merging the https://gitlab.com/gnu-clisp/clisp/-/merge_requests/3 seems actionable enough assuming unmaintained.

    EDIT2: requested an official statement from GNU about maintenance.

    References

    1. https://courses.cs.washington.edu/courses/cse341/04wi/lectures/14-scheme-quote.html
    2. emacs scripting https://www.emacswiki.org/emacs/EmacsScripts
    3. Quicklisp on github https://github.com/quicklisp
    4. http://clhs.lisp.se/
    5. https://web.archive.org/web/20200426054415/http://home.pipeline.com/~hbaker1/TInference.html
    6. Rust your own lisp https://dev.to/deciduously/rust-your-own-lisp-50an
    7. Risp https://stopa.io/post/222
    8. rust_lisp https://crates.io/crates/rust_lisp
    9. Ketos https://crates.io/crates/ketos
    enhancement 
    opened by Kreyren 20
  • @shell script `rm -rf` in windows can not delete file

    Describe The Bug

    script_runner = "@shell"
    script="rm -rf folder file"
    

    On Windows this becomes rmdir, which cannot delete files.
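
    A cross-platform workaround is the embedded duckscript runner, which bypasses the cmd translation; a minimal sketch, assuming duckscript's rm command and its -r flag behave as documented (paths and task name are illustrative):

    [tasks.clean-artifacts]
    script_runner = "@duckscript"
    script = '''
    # duckscript runs the same logic on windows and unix
    rm -r ./folder
    rm ./file
    '''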

    investigation 
    opened by umaYnit 19
  • Support: Figure out why script's "current directory" fails to be read

    Problem Description

    Hiya, I need help with figuring out why a script task works as my local user, but does not work when I run it through a CI runner.

    I have a conformance task in a workspace repository root's Makefile.toml, which contains a script.

    [tasks.conformance]
    script = [
    '''
    echo hi
    '''
    ]
    

    The actual script content doesn't matter — when I run this as myself, I get the hi. My CI runner is my laptop, installed as its own gitlab-runner user.

    When I run cargo make --no-workspace conformance as gitlab-runner, it fails with:

    [cargo-make] INFO - cargo-make 0.10.5
    [cargo-make] INFO - Using Build File: Makefile.toml
    [cargo-make] INFO - Task: conformance
    [cargo-make] INFO - Setting Up Env.
    [cargo-make] INFO - Running Task: init
    [cargo-make] INFO - Running Task: conformance
    [cargo-make] ERROR - Error while executing command, unable to extract exit code.
    [cargo-make] WARN - Build Failed.
    

    Modifying cargo-make a little, I get this bit of information:

    # ... elided (same as above)
    [cargo-make] INFO - Running Task: conformance
    [cargo-make] ERROR - Err(ScriptError { info: IOError(Error { repr: Os { code: 13, message: "Permission denied" } }) })
    [cargo-make] WARN - Build Failed.
    

    So the runner apparently hits "Permission denied" before executing the script. I traced this to cargo-make 0.10.5, runner.rs#L92: current_dir() is returning Err (EACCES).

    Based on the Rust docs, source code, googling, and checking file permissions (I checked every parent directory along the working dir to see that it's ugo r+x), the more likely reason for the error is the working directory "does not exist", or is incorrect.

    So, I tried figuring out what the working directory is.

    $ cargo make --no-workspace -v conformance 2>&1 | grep -F 'orking direc'
    [cargo-make] DEBUG - Changing working directory to: .
    [cargo-make] DEBUG - Working directory changed to: .
    

    Cargo make successfully changes it to ., and my mini experiment, compiled and run from the same repository successfully gets the current_dir():

    use std::env;
    use std::env::current_dir;
    use std::path::Path;
    
    fn main() {
        // let here = Path::new(".").canonicalize().unwrap();
        let here = Path::new(".");
    
        assert!(env::set_current_dir(&here).is_ok());
        println!("Successfully changed working directory to {}", here.display());
        match current_dir() {
            Ok(path) => println!("{}", path.display()),
            e @ Err(..) => println!("{:?}", e),
        };
    }
    

    Output:

    Successfully changed working directory to .
    /home/gitlab-runner/builds/81b47ec8/0/azriel91/autexousious
    

    This is where I ran out of ideas :sob:

    investigation 
    opened by azriel91 19
  • Darwin arm64 binaries

    Feature Description

    It would be cool if pre-built binaries could be enabled for macOS with M1 chips, as they are growing more and more popular.

    Describe The Solution You'd Like

    Ship macOS arm64 binaries.

    enhancement 
    opened by RDIL 1
  • Cargo make running slow on workspaces

    Describe The Bug

    cargo-make runs slow for a hello world script in a workspace project.

    To Reproduce

    Same as in: https://github.com/sagiegurari/cargo-make/issues/584

    Use

    [config]
    # Skip loading of all core tasks which saves up a bit on toml parsing and task creation
    skip_core_tasks = true
    # Skips loading Git related environment variables
    skip_git_env_info = true
    # Skips loading rust related environment variables
    skip_rust_env_info = true
    # Skips loading the current crate related environment variables
    skip_crate_env_info = true
    
    [tasks.D]
    script = "echo hello"
    

    Result

    ❯ time makers --time-summary D
    [cargo-make] INFO - makers 0.36.3
    [cargo-make] INFO - Build File: Makefile.toml
    [cargo-make] INFO - Task: D
    [cargo-make] INFO - Profile: development
    [cargo-make] INFO - Running Task: D
    hello
    [cargo-make] INFO - ==================Time Summary==================
    [cargo-make] INFO - D:                  100.00%    0.03 seconds
    [cargo-make] INFO - [Load Makefiles]:   0.00%      0.00 seconds
    [cargo-make] INFO - [Setup Env]:        0.00%      0.00 seconds
    [cargo-make] INFO - ================================================
    [cargo-make] INFO - Build Done in 0.50 seconds.
    
    ________________________________________________________
    Executed in  509.72 millis    fish           external
       usr time  282.26 millis    0.14 millis  282.12 millis
       sys time  144.28 millis    1.21 millis  143.07 millis
    
    investigation 
    opened by Swoorup 1
  • Question about packaging Makefile.toml with a crate

    I'm using cargo-make to generate example plugins with a library that I've been developing, and I hope to allow others to use the library to create their own plugins with the cargo-make setup I've written, but I'm not sure if that is possible.

    Is it possible to add a Makefile.toml file to a crate on crates.io and somehow let a user extend it? Do you have some other recommended mechanism for sharing Makefile.toml files?
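
    One existing mechanism for sharing task definitions is the extend attribute, which loads another makefile by path; a minimal sketch, assuming the shared file is copied or vendored into the consuming project (file and task names are illustrative):

    # Makefile.toml in the consuming project
    extend = "vendored/plugin-tasks.toml"

    # tasks defined here are added to (or override) the extended ones
    [tasks.build-plugin]
    command = "cargo"
    args = ["build", "--release"]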

    question 
    opened by x37v 3
  • Do not load base when `skip_core_tasks=true`

    Describe The Bug

    When skip_core_tasks=true is set, there are still some empty tasks from base.toml listed

    Hooks
    ----------
    end - By default this task is invoked at the end of every cargo-make run.
    init - By default this task is invoked at the start of every cargo-make run.
    
    No Category
    ----------
    default - Empty Task
    
    Tools
    ----------
    empty - Empty Task
    

    This was a little bit confusing when I defined my own tasks and categories and didn't know where these came from. Is it possible to completely remove them?
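
    A possible workaround, assuming tasks marked as private are hidden from the task listing (this is a sketch, not a confirmed way to fully remove the entries):

    # override the leftover core tasks and mark them private
    [tasks.init]
    private = true

    [tasks.end]
    private = true

    [tasks.empty]
    private = true

    [tasks.default]
    private = true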

    documentation 
    opened by xxchan 6
  • `parallel` tasks should have the option to kill children on exit

    Feature Description

    Currently, when a child of a parallel task exits …

    | … with an exit code of zero | … with a non-zero exit code |
    | --- | --- |
    | the runner continues as if nothing happened (without any logging, unless fork is enabled) | the runner exits (with an error message + exit code of 1), without killing any still-living child tasks |

    Makefile.toml
    [tasks."parent"]
    [tasks."parent".run_task]
    # fork = true
    name = ["child:1", "child:2"]
    parallel = true
    
    [tasks."child:1"]
    script = '''
    #!/usr/bin/env bash
    
    sleep 1
    # toggle below to trigger a non-zero exit code
    # false
    '''
    
    [tasks."child:2"]
    script = '''
    #!/usr/bin/env bash
    
    sleep 2
    '''
    

    This is mostly an issue for long-lived tasks, like filesystem watchers. My use case is two long-lived tasks; if one fails, the other continues in the background, and I need to remember to ps aux | grep and kill manually.


    I've used concurrently extensively, and I really like its implementation. It provides --kill-others and --kill-others-on-fail flags (and a corresponding killOthers: ("failure" | "success")[] API option).

    • Without --kill-others[-on-fail], when a child exits …

      | … with an exit code of zero | … with a non-zero exit code |
      | --- | --- |
      | the runner continues as if nothing happened (without any logging), which is the same behavior as cargo-make | the runner continues as if nothing happened (without any logging), then exits with a code of 1 |

    • With --kill-others, when a child exits …

      | … with an exit code of zero | … with a non-zero exit code |
      | --- | --- |
      | the runner exits with a code of 1, killing other children with SIGTERM | the runner exits with a code of 1, killing other children with SIGTERM |

    • With --kill-others-on-fail, when a child exits …

      | … with an exit code of zero | … with a non-zero exit code |
      | --- | --- |
      | the runner continues as if nothing happened (without any logging), which is the same behavior as cargo-make | the runner exits with a code of 1, killing other children with SIGTERM |

    Describe The Solution You'd Like

    I think the current implementation is mostly fine, with the exception of not killing children on exit. In my opinion, there are two potentially simpler solutions that wouldn't involve adding configuration options:

    When the child of a parallel task exits with a non-zero exit code, …

    • … do nothing; then exit with the existing error message and exit code of 1 when all other children exit
    • … exit immediately and kill all living children

    This wouldn't solve my use-case, but it at least would resolve the existing bug.


    A higher-effort approach would be to implement an option similar to concurrently's, maybe nesting it under the existing parallel option:

    parallel = { kill_others = ["failure", "success"] }
    

    or, if we want to be able to specify the signal sent to living children (though this might be difficult to implement cross-platform), maybe:

    parallel = { on_error = "SIGKILL", on_exit = "SIGTERM" }
    

    or even specifying a task (that would be provided with the info of the process to be killed) to enable arbitrary actions:

    [tasks."name"]
    parallel = { on_error_task = "name:on_error" }
    
    [tasks."name:on_error"]
    script = '''
    #!/usr/bin/env bash
    
    notify-send -u critical "killing task ${CARGO_MAKE_TASK_NAME}"
    kill "${CARGO_MAKE_TASK_PID}"
    '''
    

    Happy to draft up a PR, just curious which direction (if any) the maintainers would like to see this feature go!

    opened by ezracelli 7