Merge branch 'master' into trywithout

Daniel Wagner-Hall 2020-05-02 11:50:04 +01:00
commit 624ce68126
76 changed files with 1456 additions and 661 deletions


@ -1,12 +1,64 @@
# Changelog
## Cargo 1.45 (2020-07-16)
[ebda5065e...HEAD](https://github.com/rust-lang/cargo/compare/ebda5065e...HEAD)
### Added
### Changed
- Changed official documentation to recommend `.cargo/config.toml` filenames
(with the `.toml` extension). `.toml` extension support was added in 1.39.
[#8121](https://github.com/rust-lang/cargo/pull/8121)
- The `registry.index` config value is no longer allowed (it has been
deprecated for 4 years).
[#7973](https://github.com/rust-lang/cargo/pull/7973)
- An error is generated if both `--index` and `--registry` are passed
(previously `--index` was silently ignored).
[#7973](https://github.com/rust-lang/cargo/pull/7973)
- The `registry.token` config value is no longer used with the `--index` flag.
This is intended to avoid potentially leaking the crates.io token to another
registry.
[#7973](https://github.com/rust-lang/cargo/pull/7973)
- Added a warning if `registry.token` is used with source replacement. This is
intended to become an error in future versions.
[#7973](https://github.com/rust-lang/cargo/pull/7973)
- Windows GNU targets now copy `.dll.a` import library files for DLL crate
types to the output directory.
[#8141](https://github.com/rust-lang/cargo/pull/8141)
- Dylibs for all dependencies are now unconditionally copied to the output
directory. Some obscure scenarios can cause an old dylib to be referenced
between builds, and this ensures that all the latest copies are used.
[#8139](https://github.com/rust-lang/cargo/pull/8139)
### Fixed
- Fixed copying Windows `.pdb` files to the output directory when the filename
contained dashes.
[#8123](https://github.com/rust-lang/cargo/pull/8123)
### Nightly only
- Fixed passing the full path for `--target` to `rustdoc` when using JSON spec
targets.
[#8094](https://github.com/rust-lang/cargo/pull/8094)
- The `-Cembed-bitcode=no` flag was renamed to `-Cbitcode-in-rlib=no`.
[#8134](https://github.com/rust-lang/cargo/pull/8134)
- Added a new `resolver` field to `Cargo.toml` to opt in to the new feature
resolver.
[#8129](https://github.com/rust-lang/cargo/pull/8129)
## Cargo 1.44 (2020-06-04)
[bda50510...ebda5065e](https://github.com/rust-lang/cargo/compare/bda50510...ebda5065e)
### Added
- 🔥 Added the `cargo tree` command.
[docs](https://doc.rust-lang.org/nightly/cargo/commands/cargo-tree.html)
[#8062](https://github.com/rust-lang/cargo/pull/8062)
- Added warnings if a package has Windows-restricted filenames (like `nul`,
`con`, `aux`, `prn`, etc.).
[#7959](https://github.com/rust-lang/cargo/pull/7959)
- Added a `"build-finished"` JSON message when compilation is complete so that
tools can detect when they can stop listening for JSON messages with
commands like `cargo run` or `cargo test`.
[#8069](https://github.com/rust-lang/cargo/pull/8069)
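The bullet above describes how a consumer decides when Cargo's JSON stream ends; a minimal stdlib-only sketch of that stopping condition (the substring check is a stand-in for real JSON parsing, and the message shapes are illustrative):

```rust
// Sketch: scan Cargo's line-delimited JSON output and stop at the
// "build-finished" message. A real tool would parse each line with a JSON
// library; this stdlib-only version only illustrates the stopping rule.
fn is_build_finished(line: &str) -> bool {
    line.contains("\"reason\":\"build-finished\"")
}

fn compiler_messages(output: &str) -> Vec<String> {
    output
        .lines()
        .take_while(|line| !is_build_finished(line))
        .map(|line| line.to_string())
        .collect()
}

fn main() {
    let output = "{\"reason\":\"compiler-message\"}\n{\"reason\":\"build-finished\",\"success\":true}\nbinary runtime output";
    // Only the message before "build-finished" is treated as a JSON message;
    // everything after it belongs to the executed binary.
    assert_eq!(compiler_messages(output).len(), 1);
}
```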
### Changed
- Valid package names are now restricted to Unicode XID identifiers. This is
@ -19,22 +71,79 @@
[#7959](https://github.com/rust-lang/cargo/pull/7959)
- Tests are no longer hard-linked into the output directory (`target/debug/`).
This ensures tools will have access to debug symbols and execute tests in
the same way as Cargo. Tools should use JSON messages to discover the path
to the executable.
[#7965](https://github.com/rust-lang/cargo/pull/7965)
- Updating git submodules now displays an "Updating" message for each
submodule.
[#7989](https://github.com/rust-lang/cargo/pull/7989)
- File modification times are now preserved when extracting a `.crate` file.
This reverses the change made in 1.40 where the mtime was not preserved.
[#7935](https://github.com/rust-lang/cargo/pull/7935)
- Build script warnings are now displayed separately when the build script
fails.
[#8017](https://github.com/rust-lang/cargo/pull/8017)
- Removed the `git-checkout` subcommand.
[#8040](https://github.com/rust-lang/cargo/pull/8040)
- The progress bar is now enabled on all Unix platforms. Previously it was
only enabled on Linux, macOS, and FreeBSD.
[#8054](https://github.com/rust-lang/cargo/pull/8054)
- Artifacts generated by pre-release versions of `rustc` now share the same
filenames. This means that changing nightly versions will not leave stale
files in the build directory.
[#8073](https://github.com/rust-lang/cargo/pull/8073)
- Invalid package names are rejected when using renamed dependencies.
[#8090](https://github.com/rust-lang/cargo/pull/8090)
- A certain class of HTTP/2 errors is now treated as "spurious" and will be
retried.
[#8102](https://github.com/rust-lang/cargo/pull/8102)
### Fixed
- Cargo no longer buffers excessive amounts of compiler output in memory.
[#7838](https://github.com/rust-lang/cargo/pull/7838)
- Symbolic links in git repositories now work on Windows.
[#7996](https://github.com/rust-lang/cargo/pull/7996)
- Fixed an issue where `profile.dev` was not loaded from a config file with
`cargo test` when the `dev` profile was not defined in `Cargo.toml`.
[#8012](https://github.com/rust-lang/cargo/pull/8012)
- When a binary is built as an implicit dependency of an integration test,
it now checks `dep_name/feature_name` syntax in `required-features` correctly.
[#8020](https://github.com/rust-lang/cargo/pull/8020)
- Fixed an issue where Cargo would not detect that an executable (such as an
integration test) needs to be rebuilt when the previous build was
interrupted with Ctrl-C.
[#8087](https://github.com/rust-lang/cargo/pull/8087)
- Protected against some (unknown) situations where Cargo could panic when the
system monotonic clock doesn't appear to be monotonic.
[#8114](https://github.com/rust-lang/cargo/pull/8114)
### Nightly only
- Fixed panic with new feature resolver and required-features.
[#7962](https://github.com/rust-lang/cargo/pull/7962)
- Added `RUSTC_WORKSPACE_WRAPPER` environment variable, which provides a way
to wrap `rustc` for workspace members only, and affects the filename hash so
that artifacts produced by the wrapper are cached separately. This usage can
be seen on nightly clippy with `cargo clippy -Zunstable-options`.
[#7533](https://github.com/rust-lang/cargo/pull/7533)
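The filename-hash interaction described above can be pictured as folding the wrapper value into the hash that distinguishes artifact filenames; this is a simplified sketch, not Cargo's actual fingerprinting code:

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

// Simplified sketch: fold an optional workspace wrapper into the hash that
// forms artifact filenames, so wrapped and unwrapped builds are cached
// separately. Cargo's real fingerprint hashes far more inputs than this.
fn artifact_suffix(crate_name: &str, workspace_wrapper: Option<&str>) -> u64 {
    let mut hasher = DefaultHasher::new();
    crate_name.hash(&mut hasher);
    workspace_wrapper.hash(&mut hasher);
    hasher.finish()
}

fn main() {
    let plain = artifact_suffix("foo", None);
    let wrapped = artifact_suffix("foo", Some("clippy-driver"));
    // Different hash input => different filename, so the caches don't collide.
    assert_ne!(plain, wrapped);
}
```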
- Added `--unit-graph` CLI option to display Cargo's internal dependency graph
as JSON.
[#7977](https://github.com/rust-lang/cargo/pull/7977)
- Changed `-Zbuild_dep` to `-Zhost_dep`, and added proc-macros to the feature
decoupling logic.
[#8003](https://github.com/rust-lang/cargo/pull/8003)
[#8028](https://github.com/rust-lang/cargo/pull/8028)
- `--crate-version` is no longer automatically passed when the flag is
already present in `RUSTDOCFLAGS`.
[#8014](https://github.com/rust-lang/cargo/pull/8014)
- Fixed panic with `-Zfeatures=dev_dep` and `check --profile=test`.
[#8027](https://github.com/rust-lang/cargo/pull/8027)
- Fixed panic with `-Zfeatures=itarget` with certain host dependencies.
[#8048](https://github.com/rust-lang/cargo/pull/8048)
- Added support for `-Cembed-bitcode=no`, which provides a performance boost
and disk-space usage reduction for non-LTO builds.
[#8066](https://github.com/rust-lang/cargo/pull/8066)
- `-Zpackage-features` has been extended with several changes intended to make
it easier to select features on the command-line in a workspace.
[#8074](https://github.com/rust-lang/cargo/pull/8074)
## Cargo 1.43 (2020-04-23)
[9d32b7b0...rust-1.43.0](https://github.com/rust-lang/cargo/compare/9d32b7b0...rust-1.43.0)


@ -1,6 +1,6 @@
[package]
name = "cargo"
version = "0.45.0"
version = "0.46.0"
edition = "2018"
authors = ["Yehuda Katz <wycats@gmail.com>",
"Carl Lerche <me@carllerche.com>",
@ -32,7 +32,7 @@ pretty_env_logger = { version = "0.4", optional = true }
anyhow = "1.0"
filetime = "0.2.9"
flate2 = { version = "1.0.3", default-features = false, features = ["zlib"] }
git2 = "0.13.1"
git2 = "0.13.5"
git2-curl = "0.14.0"
glob = "0.3.0"
hex = "0.4"
@ -44,7 +44,7 @@ jobserver = "0.1.21"
lazycell = "1.2.0"
libc = "0.2"
log = "0.4.6"
libgit2-sys = "0.12.1"
libgit2-sys = "0.12.5"
memchr = "2.1.3"
num_cpus = "1.0"
opener = "0.4"


@ -41,6 +41,9 @@ jobs:
x86_64-msvc:
TOOLCHAIN: stable-x86_64-pc-windows-msvc
OTHER_TARGET: i686-pc-windows-msvc
x86_64-gnu:
TOOLCHAIN: nightly-x86_64-pc-windows-gnu
OTHER_TARGET: i686-pc-windows-gnu
- job: rustfmt
pool:


@ -4,7 +4,7 @@ steps:
rustup set profile minimal
rustup component remove --toolchain=$TOOLCHAIN rust-docs || echo "already removed"
rustup update --no-self-update $TOOLCHAIN
if [ "$TOOLCHAIN" = "nightly" ]; then
if [[ "$TOOLCHAIN" == "nightly"* ]]; then
rustup component add --toolchain=$TOOLCHAIN rustc-dev
fi
rustup default $TOOLCHAIN


@ -190,6 +190,8 @@ pub fn alternate() -> &'static str {
"i686-unknown-linux-gnu"
} else if cfg!(all(target_os = "windows", target_env = "msvc")) {
"i686-pc-windows-msvc"
} else if cfg!(all(target_os = "windows", target_env = "gnu")) {
"i686-pc-windows-gnu"
} else {
panic!("This test should be gated on cross_compile::disabled.");
}


@ -117,7 +117,6 @@ use std::path::{Path, PathBuf};
use std::process::{Command, Output};
use std::str;
use std::time::{self, Duration};
use std::usize;
use cargo::util::{is_ci, CargoResult, ProcessBuilder, ProcessError, Rustc};
use serde_json::{self, Value};
@ -1723,6 +1722,7 @@ fn _process(t: &OsStr) -> cargo::util::ProcessBuilder {
.env_remove("RUSTDOC")
.env_remove("RUSTC_WRAPPER")
.env_remove("RUSTFLAGS")
.env_remove("RUSTDOCFLAGS")
.env_remove("XDG_CONFIG_HOME") // see #2345
.env("GIT_CONFIG_NOSYSTEM", "1") // keep trying to sandbox ourselves
.env_remove("EMAIL")


@ -25,7 +25,7 @@ proptest! {
0
} else {
// but that local builds will give a small clear test case.
std::u32::MAX
u32::MAX
},
result_cache: prop::test_runner::basic_result_cache,
.. ProptestConfig::default()


@ -28,7 +28,7 @@ pub fn exec(config: &mut Config, args: &ArgMatches<'_>) -> CliResult {
let opts = CleanOptions {
config,
spec: values(args, "package"),
target: args.target(),
targets: args.targets(),
requested_profile: args.get_profile_name(config, "dev", ProfileChecking::Checked)?,
profile_specified: args.is_present("profile") || args.is_present("release"),
doc: args.is_present("doc"),


@ -28,7 +28,7 @@ pub fn exec(config: &mut Config, args: &ArgMatches<'_>) -> CliResult {
let opts = FetchOptions {
config,
target: args.target(),
targets: args.targets(),
};
let _ = ops::fetch(&ws, &opts)?;
Ok(())


@ -12,13 +12,11 @@ pub fn cli() -> App {
)
.arg(opt("quiet", "No output printed to stdout").short("q"))
.arg_features()
.arg(
opt(
.arg(multi_opt(
"filter-platform",
"TRIPLE",
"Only include resolve dependencies matching the given target-triple",
)
.value_name("TRIPLE"),
)
))
.arg(opt(
"no-deps",
"Output information only about the workspace members \
@ -51,7 +49,7 @@ pub fn exec(config: &mut Config, args: &ArgMatches<'_>) -> CliResult {
all_features: args.is_present("all-features"),
no_default_features: args.is_present("no-default-features"),
no_deps: args.is_present("no-deps"),
filter_platform: args.value_of("filter-platform").map(|s| s.to_string()),
filter_platforms: args._values_of("filter-platform"),
version,
};


@ -42,7 +42,7 @@ pub fn exec(config: &mut Config, args: &ArgMatches<'_>) -> CliResult {
list: args.is_present("list"),
check_metadata: !args.is_present("no-metadata"),
allow_dirty: args.is_present("allow-dirty"),
target: args.target(),
targets: args.targets(),
jobs: args.jobs()?,
features: args._values_of("features"),
all_features: args.is_present("all-features"),


@ -40,7 +40,7 @@ pub fn exec(config: &mut Config, args: &ArgMatches<'_>) -> CliResult {
index,
verify: !args.is_present("no-verify"),
allow_dirty: args.is_present("allow-dirty"),
target: args.target(),
targets: args.targets(),
jobs: args.jobs()?,
dry_run: args.is_present("dry-run"),
registry,


@ -129,15 +129,15 @@ pub fn exec(config: &mut Config, args: &ArgMatches<'_>) -> CliResult {
)?;
}
let target = if args.is_present("all-targets") {
let targets = if args.is_present("all-targets") {
config
.shell()
.warn("the --all-targets flag has been changed to --target=all")?;
Some("all")
vec!["all".to_string()]
} else {
args.value_of("target")
args._values_of("target")
};
let target = tree::Target::from_cli(target);
let target = tree::Target::from_cli(targets);
let edge_kinds = parse_edge_kinds(config, args)?;
let graph_features = edge_kinds.contains(&EdgeKind::Feature);
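The `--all-targets` shim above reduces to a small selection function; a simplified sketch with plain strings standing in for Cargo's `tree::Target` type:

```rust
// Mirrors the shim above: the deprecated `--all-targets` flag maps to the
// single pseudo-target "all", while explicit `--target` values pass through.
fn tree_targets(all_targets: bool, cli_targets: Vec<String>) -> Vec<String> {
    if all_targets {
        vec!["all".to_string()]
    } else {
        cli_targets
    }
}

fn main() {
    assert_eq!(tree_targets(true, vec![]), vec!["all".to_string()]);
    let explicit = vec!["wasm32-unknown-unknown".to_string()];
    assert_eq!(tree_targets(false, explicit.clone()), explicit);
}
```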


@ -2,6 +2,7 @@ use crate::core::compiler::CompileKind;
use crate::core::interning::InternedString;
use crate::util::ProcessBuilder;
use crate::util::{CargoResult, Config, RustfixDiagnosticServer};
use anyhow::bail;
use serde::ser;
use std::cell::RefCell;
use std::path::PathBuf;
@ -10,7 +11,7 @@ use std::path::PathBuf;
#[derive(Debug)]
pub struct BuildConfig {
/// The requested kind of compilation for this session
pub requested_kind: CompileKind,
pub requested_kinds: Vec<CompileKind>,
/// Number of rustc jobs to run in parallel.
pub jobs: u32,
/// Build profile
@ -50,12 +51,11 @@ impl BuildConfig {
pub fn new(
config: &Config,
jobs: Option<u32>,
requested_target: &Option<String>,
requested_targets: &[String],
mode: CompileMode,
) -> CargoResult<BuildConfig> {
let cfg = config.build_config()?;
let requested_kind =
CompileKind::from_requested_target(config, requested_target.as_deref())?;
let requested_kinds = CompileKind::from_requested_targets(config, requested_targets)?;
if jobs == Some(0) {
anyhow::bail!("jobs must be at least 1")
}
@ -69,7 +69,7 @@ impl BuildConfig {
let jobs = jobs.or(cfg.jobs).unwrap_or(::num_cpus::get() as u32);
Ok(BuildConfig {
requested_kind,
requested_kinds,
jobs,
requested_profile: InternedString::new("dev"),
mode,
@ -95,6 +95,13 @@ impl BuildConfig {
pub fn test(&self) -> bool {
self.mode == CompileMode::Test || self.mode == CompileMode::Bench
}
pub fn single_requested_kind(&self) -> CargoResult<CompileKind> {
match self.requested_kinds.len() {
1 => Ok(self.requested_kinds[0]),
_ => bail!("only one `--target` argument is supported"),
}
}
}
#[derive(Clone, Copy, Debug, PartialEq, Eq)]
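The new `single_requested_kind` guard can be sketched in isolation; the enum here is a simplified stand-in for Cargo's `CompileKind` (the real target variant carries an interned triple):

```rust
// Simplified stand-in for Cargo's CompileKind, showing the "exactly one
// --target" guard that commands without multi-target support rely on.
#[derive(Clone, Copy, Debug, PartialEq, Eq)]
enum CompileKind {
    Host,
    Target(&'static str),
}

fn single_requested_kind(requested: &[CompileKind]) -> Result<CompileKind, String> {
    match requested {
        [kind] => Ok(*kind),
        _ => Err("only one `--target` argument is supported".to_string()),
    }
}

fn main() {
    assert_eq!(
        single_requested_kind(&[CompileKind::Host]),
        Ok(CompileKind::Host)
    );
    // Two requested kinds is an error for commands that support only one.
    assert!(single_requested_kind(&[
        CompileKind::Host,
        CompileKind::Target("wasm32-unknown-unknown")
    ])
    .is_err());
}
```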


@ -1,8 +1,8 @@
use crate::core::compiler::unit_graph::UnitGraph;
use crate::core::compiler::{BuildConfig, CompileKind, Unit};
use crate::core::profiles::Profiles;
use crate::core::PackageSet;
use crate::core::{InternedString, Workspace};
use crate::core::{PackageId, PackageSet};
use crate::util::config::Config;
use crate::util::errors::CargoResult;
use crate::util::Rustc;
@ -99,10 +99,6 @@ impl<'a, 'cfg> BuildContext<'a, 'cfg> {
&self.target_data.info(unit.kind).rustdocflags
}
pub fn show_warnings(&self, pkg: PackageId) -> bool {
pkg.source_id().is_path() || self.config.extra_verbose()
}
pub fn extra_args_for(&self, unit: &Unit) -> Option<&Vec<String>> {
self.extra_compiler_args.get(unit)
}


@ -90,11 +90,18 @@ impl FileType {
impl TargetInfo {
pub fn new(
config: &Config,
requested_kind: CompileKind,
requested_kinds: &[CompileKind],
rustc: &Rustc,
kind: CompileKind,
) -> CargoResult<TargetInfo> {
let rustflags = env_args(config, requested_kind, &rustc.host, None, kind, "RUSTFLAGS")?;
let rustflags = env_args(
config,
requested_kinds,
&rustc.host,
None,
kind,
"RUSTFLAGS",
)?;
let mut process = rustc.process();
process
.arg("-")
@ -180,7 +187,7 @@ impl TargetInfo {
// information
rustflags: env_args(
config,
requested_kind,
requested_kinds,
&rustc.host,
Some(&cfg),
kind,
@ -188,7 +195,7 @@ impl TargetInfo {
)?,
rustdocflags: env_args(
config,
requested_kind,
requested_kinds,
&rustc.host,
Some(&cfg),
kind,
@ -257,6 +264,17 @@ impl TargetInfo {
flavor: FileFlavor::Normal,
should_replace_hyphens: false,
})
} else if target_triple.ends_with("windows-gnu")
&& crate_type.ends_with("dylib")
&& suffix == ".dll"
{
// LD can link DLL directly, but LLD requires the import library.
ret.push(FileType {
suffix: ".dll.a".to_string(),
prefix: "lib".to_string(),
flavor: FileFlavor::Normal,
should_replace_hyphens: false,
})
}
// See rust-lang/cargo#4535.
@ -405,7 +423,7 @@ fn output_err_info(cmd: &ProcessBuilder, stdout: &str, stderr: &str) -> String {
/// scripts, ...), even if it is the same as the target.
fn env_args(
config: &Config,
requested_kind: CompileKind,
requested_kinds: &[CompileKind],
host_triple: &str,
target_cfg: Option<&[Cfg]>,
kind: CompileKind,
@ -430,7 +448,7 @@ fn env_args(
// This means that, e.g., even if the specified --target is the
// same as the host, build scripts in plugins won't get
// RUSTFLAGS.
if !requested_kind.is_host() && kind.is_host() {
if requested_kinds != &[CompileKind::Host] && kind.is_host() {
// This is probably a build script or plugin and we're
// compiling with --target. In this scenario there are
// no rustflags we can apply.
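The guard above can be modeled as a small predicate; a simplified sketch (Cargo's real `CompileKind::Target` carries a target triple, omitted here):

```rust
// Models the `if requested_kinds != &[CompileKind::Host] && kind.is_host()`
// check above: once any --target was requested, host units (build scripts,
// plugins) get no RUSTFLAGS, so target flags never leak into host builds.
#[derive(Clone, Copy, PartialEq, Eq)]
enum CompileKind {
    Host,
    Target, // the real type carries the requested triple
}

fn rustflags_apply(requested_kinds: &[CompileKind], unit_kind: CompileKind) -> bool {
    !(requested_kinds != [CompileKind::Host] && unit_kind == CompileKind::Host)
}

fn main() {
    // Plain `cargo build`: host units still receive RUSTFLAGS.
    assert!(rustflags_apply(&[CompileKind::Host], CompileKind::Host));
    // `cargo build --target <triple>`: a build script (host unit) does not...
    assert!(!rustflags_apply(&[CompileKind::Target], CompileKind::Host));
    // ...but the target units themselves do.
    assert!(rustflags_apply(&[CompileKind::Target], CompileKind::Target));
}
```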
@ -515,21 +533,26 @@ pub struct RustcTargetData {
}
impl RustcTargetData {
pub fn new(ws: &Workspace<'_>, requested_kind: CompileKind) -> CargoResult<RustcTargetData> {
pub fn new(
ws: &Workspace<'_>,
requested_kinds: &[CompileKind],
) -> CargoResult<RustcTargetData> {
let config = ws.config();
let rustc = config.load_global_rustc(Some(ws))?;
let host_config = config.target_cfg_triple(&rustc.host)?;
let host_info = TargetInfo::new(config, requested_kind, &rustc, CompileKind::Host)?;
let host_info = TargetInfo::new(config, requested_kinds, &rustc, CompileKind::Host)?;
let mut target_config = HashMap::new();
let mut target_info = HashMap::new();
if let CompileKind::Target(target) = requested_kind {
for kind in requested_kinds {
if let CompileKind::Target(target) = *kind {
let tcfg = config.target_cfg_triple(target.short_name())?;
target_config.insert(target, tcfg);
target_info.insert(
target,
TargetInfo::new(config, requested_kind, &rustc, CompileKind::Target(target))?,
TargetInfo::new(config, requested_kinds, &rustc, *kind)?,
);
}
}
Ok(RustcTargetData {
rustc,


@ -8,15 +8,14 @@ use semver::Version;
use super::BuildContext;
use crate::core::compiler::CompileKind;
use crate::core::{Edition, Package, PackageId, Target};
use crate::core::compiler::Unit;
use crate::core::{Edition, Package, PackageId};
use crate::util::{self, config, join_paths, process, CargoResult, Config, ProcessBuilder};
/// Structure with enough information to run `rustdoc --test`.
pub struct Doctest {
/// The package being doc-tested.
pub package: Package,
/// The target being tested (currently always the package's lib).
pub target: Target,
/// What's being doctested
pub unit: Unit,
/// Arguments needed to pass to rustdoc to run this test.
pub args: Vec<OsString>,
/// Whether or not -Zunstable-options is needed.
@ -26,11 +25,12 @@ pub struct Doctest {
/// A structure returning the result of a compilation.
pub struct Compilation<'cfg> {
/// An array of all tests created during this compilation.
/// `(package, target, path_to_test_exe)`
pub tests: Vec<(Package, Target, PathBuf)>,
/// `(unit, path_to_test_exe)` where `unit` contains information such as the
/// package, compile target, etc.
pub tests: Vec<(Unit, PathBuf)>,
/// An array of all binaries created.
pub binaries: Vec<PathBuf>,
pub binaries: Vec<(Unit, PathBuf)>,
/// All directories for the output of native build commands.
///
@ -41,20 +41,17 @@ pub struct Compilation<'cfg> {
pub native_dirs: BTreeSet<PathBuf>,
/// Root output directory (for the local package's artifacts)
pub root_output: PathBuf,
pub root_output: HashMap<CompileKind, PathBuf>,
/// Output directory for rust dependencies.
/// May be for the host or for a specific target.
pub deps_output: PathBuf,
pub deps_output: HashMap<CompileKind, PathBuf>,
/// Output directory for the rust host dependencies.
pub host_deps_output: PathBuf,
/// The path to the host libdir for the compiler used
sysroot_host_libdir: PathBuf,
/// The path to rustc's own libstd
pub host_dylib_path: PathBuf,
/// The path to libstd for the target
pub target_dylib_path: PathBuf,
/// The path to libstd for each target
sysroot_target_libdir: HashMap<CompileKind, PathBuf>,
/// Extra environment variables that were passed to compilations and should
/// be passed to future invocations of programs.
@ -69,9 +66,6 @@ pub struct Compilation<'cfg> {
/// Flags to pass to rustdoc when invoked from cargo test, per package.
pub rustdocflags: HashMap<PackageId, Vec<String>>,
pub host: String,
pub target: String,
config: &'cfg Config,
/// Rustc process to be used by default
@ -82,7 +76,7 @@ pub struct Compilation<'cfg> {
/// rustc_workspace_wrapper_process
primary_rustc_process: Option<ProcessBuilder>,
target_runner: Option<(PathBuf, Vec<String>)>,
target_runners: HashMap<CompileKind, Option<(PathBuf, Vec<String>)>>,
}
impl<'cfg> Compilation<'cfg> {
@ -100,23 +94,28 @@ impl<'cfg> Compilation<'cfg> {
}
}
let default_kind = bcx.build_config.requested_kind;
Ok(Compilation {
// TODO: deprecated; remove.
native_dirs: BTreeSet::new(),
root_output: PathBuf::from("/"),
deps_output: PathBuf::from("/"),
host_deps_output: PathBuf::from("/"),
host_dylib_path: bcx
root_output: HashMap::new(),
deps_output: HashMap::new(),
sysroot_host_libdir: bcx
.target_data
.info(CompileKind::Host)
.sysroot_host_libdir
.clone(),
target_dylib_path: bcx
.target_data
.info(default_kind)
.sysroot_target_libdir
.clone(),
sysroot_target_libdir: bcx
.build_config
.requested_kinds
.iter()
.chain(Some(&CompileKind::Host))
.map(|kind| {
(
*kind,
bcx.target_data.info(*kind).sysroot_target_libdir.clone(),
)
})
.collect(),
tests: Vec::new(),
binaries: Vec::new(),
extra_env: HashMap::new(),
@ -127,16 +126,20 @@ impl<'cfg> Compilation<'cfg> {
rustc_process: rustc,
rustc_workspace_wrapper_process,
primary_rustc_process,
host: bcx.host_triple().to_string(),
target: bcx.target_data.short_name(&default_kind).to_string(),
target_runner: target_runner(bcx, default_kind)?,
target_runners: bcx
.build_config
.requested_kinds
.iter()
.chain(Some(&CompileKind::Host))
.map(|kind| Ok((*kind, target_runner(bcx, *kind)?)))
.collect::<CargoResult<HashMap<_, _>>>()?,
})
}
/// See `process`.
pub fn rustc_process(
&self,
pkg: &Package,
unit: &Unit,
is_primary: bool,
is_workspace: bool,
) -> CargoResult<ProcessBuilder> {
@ -148,17 +151,22 @@ impl<'cfg> Compilation<'cfg> {
self.rustc_process.clone()
};
self.fill_env(rustc, pkg, true)
self.fill_env(rustc, &unit.pkg, unit.kind, true)
}
/// See `process`.
pub fn rustdoc_process(&self, pkg: &Package, target: &Target) -> CargoResult<ProcessBuilder> {
let mut p = self.fill_env(process(&*self.config.rustdoc()?), pkg, false)?;
if target.edition() != Edition::Edition2015 {
p.arg(format!("--edition={}", target.edition()));
pub fn rustdoc_process(&self, unit: &Unit) -> CargoResult<ProcessBuilder> {
let mut p = self.fill_env(
process(&*self.config.rustdoc()?),
&unit.pkg,
unit.kind,
true,
)?;
if unit.target.edition() != Edition::Edition2015 {
p.arg(format!("--edition={}", unit.target.edition()));
}
for crate_type in target.rustc_crate_types() {
for crate_type in unit.target.rustc_crate_types() {
p.arg("--crate-type").arg(crate_type);
}
@ -171,20 +179,21 @@ impl<'cfg> Compilation<'cfg> {
cmd: T,
pkg: &Package,
) -> CargoResult<ProcessBuilder> {
self.fill_env(process(cmd), pkg, true)
self.fill_env(process(cmd), pkg, CompileKind::Host, false)
}
pub fn target_runner(&self) -> &Option<(PathBuf, Vec<String>)> {
&self.target_runner
pub fn target_runner(&self, kind: CompileKind) -> Option<&(PathBuf, Vec<String>)> {
self.target_runners.get(&kind).and_then(|x| x.as_ref())
}
/// See `process`.
pub fn target_process<T: AsRef<OsStr>>(
&self,
cmd: T,
kind: CompileKind,
pkg: &Package,
) -> CargoResult<ProcessBuilder> {
let builder = if let Some((ref runner, ref args)) = *self.target_runner() {
let builder = if let Some((runner, args)) = self.target_runner(kind) {
let mut builder = process(runner);
builder.args(args);
builder.arg(cmd);
@ -192,7 +201,7 @@ impl<'cfg> Compilation<'cfg> {
} else {
process(cmd)
};
self.fill_env(builder, pkg, false)
self.fill_env(builder, pkg, kind, false)
}
/// Prepares a new process with an appropriate environment to run against
@ -204,26 +213,28 @@ impl<'cfg> Compilation<'cfg> {
&self,
mut cmd: ProcessBuilder,
pkg: &Package,
is_host: bool,
kind: CompileKind,
is_rustc_tool: bool,
) -> CargoResult<ProcessBuilder> {
let mut search_path = if is_host {
let mut search_path = vec![self.host_deps_output.clone()];
search_path.push(self.host_dylib_path.clone());
search_path
let mut search_path = Vec::new();
if is_rustc_tool {
search_path.push(self.deps_output[&CompileKind::Host].clone());
search_path.push(self.sysroot_host_libdir.clone());
} else {
let mut search_path =
super::filter_dynamic_search_path(self.native_dirs.iter(), &self.root_output);
search_path.push(self.deps_output.clone());
search_path.push(self.root_output.clone());
search_path.extend(super::filter_dynamic_search_path(
self.native_dirs.iter(),
&self.root_output[&kind],
));
search_path.push(self.deps_output[&kind].clone());
search_path.push(self.root_output[&kind].clone());
// For build-std, we don't want to accidentally pull in any shared
// libs from the sysroot that ships with rustc. This may not be
// required (at least I cannot craft a situation where it
// matters), but is here to be safe.
if self.config.cli_unstable().build_std.is_none() {
search_path.push(self.target_dylib_path.clone());
search_path.push(self.sysroot_target_libdir[&kind].clone());
}
}
search_path
};
let dylib_path = util::dylib_path();
let dylib_path_is_empty = dylib_path.is_empty();
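The recurring pattern in this file (iterate the requested kinds, chain in `Host`, collect into a per-kind map) can be sketched on its own; the directory layout and enum here are illustrative, not Cargo's actual types:

```rust
use std::collections::HashMap;
use std::path::PathBuf;

#[derive(Clone, Copy, PartialEq, Eq, Hash, Debug)]
enum CompileKind {
    Host,
    Target(&'static str),
}

// Build one output path per requested kind, always including Host, mirroring
// the `requested_kinds.iter().chain(Some(&CompileKind::Host))` pattern above.
fn outputs_per_kind(requested: &[CompileKind]) -> HashMap<CompileKind, PathBuf> {
    requested
        .iter()
        .chain(Some(&CompileKind::Host))
        .map(|kind| {
            let dir = match kind {
                CompileKind::Host => PathBuf::from("target/debug"),
                CompileKind::Target(triple) => {
                    PathBuf::from("target").join(*triple).join("debug")
                }
            };
            (*kind, dir)
        })
        .collect()
}

fn main() {
    let map = outputs_per_kind(&[CompileKind::Target("aarch64-linux-android")]);
    // One entry per requested target plus the implicit Host entry.
    assert_eq!(map.len(), 2);
    assert!(map.contains_key(&CompileKind::Host));
}
```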


@ -1,7 +1,9 @@
use crate::core::{InternedString, Target};
use crate::util::errors::{CargoResult, CargoResultExt};
use crate::util::Config;
use anyhow::bail;
use serde::Serialize;
use std::collections::BTreeSet;
use std::path::Path;
/// Indicator for how a unit is being compiled.
@ -41,17 +43,30 @@ impl CompileKind {
}
}
/// Creates a new `CompileKind` based on the requested target.
/// Creates a new list of `CompileKind` based on the requested list of
/// targets.
///
/// If no target is given, this consults the config if the default is set.
/// Otherwise returns `CompileKind::Host`.
pub fn from_requested_target(
/// If no targets are given then this returns a single-element vector with
/// `CompileKind::Host`.
pub fn from_requested_targets(
config: &Config,
target: Option<&str>,
) -> CargoResult<CompileKind> {
let kind = match target {
Some(s) => CompileKind::Target(CompileTarget::new(s)?),
None => match &config.build_config()?.target {
targets: &[String],
) -> CargoResult<Vec<CompileKind>> {
if targets.len() > 1 && !config.cli_unstable().multitarget {
bail!("specifying multiple `--target` flags requires `-Zmultitarget`")
}
if targets.len() != 0 {
return Ok(targets
.iter()
.map(|value| Ok(CompileKind::Target(CompileTarget::new(value)?)))
// First collect into a set to deduplicate any `--target` passed
// more than once...
.collect::<CargoResult<BTreeSet<_>>>()?
// ... then generate a flat list for everything else to use.
.into_iter()
.collect());
}
let kind = match &config.build_config()?.target {
Some(val) => {
let value = if val.raw_value().ends_with(".json") {
let path = val.clone().resolve_path(config);
@ -62,9 +77,8 @@ impl CompileKind {
CompileKind::Target(CompileTarget::new(&value)?)
}
None => CompileKind::Host,
},
};
Ok(kind)
Ok(vec![kind])
}
}
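The deduplication step above can be sketched with plain strings in place of `CompileTarget`:

```rust
use std::collections::BTreeSet;

// Simplified version of the dedup in `from_requested_targets`: collect the
// `--target` values into a BTreeSet so a triple passed more than once is
// requested only once, then flatten back into a Vec for the rest of the build.
fn dedup_targets(targets: &[String]) -> Vec<String> {
    targets
        .iter()
        .cloned()
        .collect::<BTreeSet<_>>()
        .into_iter()
        .collect()
}

fn main() {
    let raw = [
        "x86_64-unknown-linux-gnu".to_string(),
        "wasm32-unknown-unknown".to_string(),
        "x86_64-unknown-linux-gnu".to_string(),
    ];
    // The duplicate triple collapses; BTreeSet also sorts the result.
    assert_eq!(dedup_targets(&raw).len(), 2);
}
```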


@ -133,7 +133,7 @@ impl<'a, 'cfg: 'a> CompilationFiles<'a, 'cfg> {
.map(|unit| (unit, LazyCell::new()))
.collect();
CompilationFiles {
ws: &cx.bcx.ws,
ws: cx.bcx.ws,
host,
target,
export_dir: cx.bcx.build_config.export_dir.clone(),
@ -332,7 +332,7 @@ impl<'a, 'cfg: 'a> CompilationFiles<'a, 'cfg> {
// we don't want to link it up.
if out_dir.ends_with("deps") {
// Don't lift up library dependencies.
if unit.target.is_bin() || self.roots.contains(unit) {
if unit.target.is_bin() || self.roots.contains(unit) || unit.target.is_dylib() {
Some((
out_dir.parent().unwrap().to_owned(),
if unit.mode.is_any_test() {


@ -165,13 +165,13 @@ impl<'a, 'cfg> Context<'a, 'cfg> {
let bindst = output.bin_dst();
if unit.mode == CompileMode::Test {
self.compilation.tests.push((
unit.pkg.clone(),
unit.target.clone(),
output.path.clone(),
));
self.compilation
.tests
.push((unit.clone(), output.path.clone()));
} else if unit.target.is_executable() {
self.compilation.binaries.push(bindst.clone());
self.compilation
.binaries
.push((unit.clone(), bindst.clone()));
}
}
@ -199,8 +199,7 @@ impl<'a, 'cfg> Context<'a, 'cfg> {
let mut unstable_opts = false;
let args = compiler::extern_args(&self, unit, &mut unstable_opts)?;
self.compilation.to_doc_test.push(compilation::Doctest {
package: unit.pkg.clone(),
target: unit.target.clone(),
unit: unit.clone(),
args,
unstable_opts,
});
@ -273,10 +272,12 @@ impl<'a, 'cfg> Context<'a, 'cfg> {
let dest = self.bcx.profiles.get_dir_name();
let host_layout = Layout::new(self.bcx.ws, None, &dest)?;
let mut targets = HashMap::new();
if let CompileKind::Target(target) = self.bcx.build_config.requested_kind {
for kind in self.bcx.build_config.requested_kinds.iter() {
if let CompileKind::Target(target) = *kind {
let layout = Layout::new(self.bcx.ws, Some(target), &dest)?;
targets.insert(target, layout);
}
}
self.primary_packages
.extend(self.bcx.roots.iter().map(|u| u.pkg.package_id()));
@ -302,12 +303,22 @@ impl<'a, 'cfg> Context<'a, 'cfg> {
.chain_err(|| "couldn't prepare build directories")?;
}
self.compilation.host_deps_output = self.files_mut().host.deps().to_path_buf();
let files = self.files.as_ref().unwrap();
let layout = files.layout(self.bcx.build_config.requested_kind);
self.compilation.root_output = layout.dest().to_path_buf();
self.compilation.deps_output = layout.deps().to_path_buf();
for &kind in self
.bcx
.build_config
.requested_kinds
.iter()
.chain(Some(&CompileKind::Host))
{
let layout = files.layout(kind);
self.compilation
.root_output
.insert(kind, layout.dest().to_path_buf());
self.compilation
.deps_output
.insert(kind, layout.deps().to_path_buf());
}
Ok(())
}


@ -387,12 +387,9 @@ fn build_work(cx: &mut Context<'_, '_>, unit: &Unit) -> CargoResult<Job> {
// state informing what variables were discovered via our script as
// well.
paths::write(&output_file, &output.stdout)?;
log::debug!(
"rewinding custom script output mtime {:?} to {}",
output_file,
timestamp
);
filetime::set_file_times(output_file, timestamp, timestamp)?;
// This mtime shift allows Cargo to detect if a source file was
// modified in the middle of the build.
paths::set_file_time_no_err(output_file, timestamp);
paths::write(&err_file, &output.stderr)?;
paths::write(&root_output_file, util::path2bytes(&script_out_dir)?)?;
let parsed_output =


@ -1209,7 +1209,12 @@ fn calculate_normal(cx: &mut Context<'_, '_>, unit: &Unit) -> CargoResult<Finger
let target_root = target_root(cx);
let local = if unit.mode.is_doc() {
// rustdoc does not have dep-info files.
let fingerprint = pkg_fingerprint(cx.bcx, &unit.pkg)?;
let fingerprint = pkg_fingerprint(cx.bcx, &unit.pkg).chain_err(|| {
format!(
"failed to determine package fingerprint for documenting {}",
unit.pkg
)
})?;
vec![LocalFingerprint::Precalculated(fingerprint)]
} else {
let dep_info = dep_info_loc(cx, unit);
@ -1270,7 +1275,18 @@ fn calculate_run_custom_build(cx: &mut Context<'_, '_>, unit: &Unit) -> CargoRes
// the whole crate.
let (gen_local, overridden) = build_script_local_fingerprints(cx, unit);
let deps = &cx.build_explicit_deps[unit];
let local = (gen_local)(deps, Some(&|| pkg_fingerprint(cx.bcx, &unit.pkg)))?.unwrap();
let local = (gen_local)(
deps,
Some(&|| {
pkg_fingerprint(cx.bcx, &unit.pkg).chain_err(|| {
format!(
"failed to determine package fingerprint for build script for {}",
unit.pkg
)
})
}),
)?
.unwrap();
let output = deps.build_script_output.clone();
// Include any dependencies of our execution, which is typically just the
@ -1521,7 +1537,7 @@ fn compare_old_fingerprint(
// update the mtime so other cleaners know we used it
let t = FileTime::from_system_time(SystemTime::now());
debug!("mtime-on-use forcing {:?} to {}", loc, t);
filetime::set_file_times(loc, t, t)?;
paths::set_file_time_no_err(loc, t);
}
let new_hash = new_fingerprint.hash();

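The `chain_err` calls above attach the package being fingerprinted to the underlying error, so a failure names *which* package broke rather than only the low-level cause. A minimal stand-in for the same pattern using `map_err` over string errors (`pkg_fingerprint` here is a toy, not Cargo's):

```rust
/// Toy fingerprint computation that can fail.
fn pkg_fingerprint(pkg: &str) -> Result<String, String> {
    if pkg == "broken" {
        Err("failed to read source files".to_string())
    } else {
        Ok(format!("fp-of-{}", pkg))
    }
}

/// Wrap the low-level error with the package context, as the diff does for
/// the rustdoc fingerprint path.
fn doc_fingerprint(pkg: &str) -> Result<String, String> {
    pkg_fingerprint(pkg).map_err(|e| {
        format!(
            "failed to determine package fingerprint for documenting {}: {}",
            pkg, e
        )
    })
}

fn main() {
    assert_eq!(doc_fingerprint("serde").unwrap(), "fp-of-serde");
    let err = doc_fingerprint("broken").unwrap_err();
    assert!(err.contains("documenting broken"));
    assert!(err.contains("failed to read source files"));
}
```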

@ -889,7 +889,7 @@ impl<'cfg> DrainState<'cfg> {
artifact: Artifact,
cx: &mut Context<'_, '_>,
) -> CargoResult<()> {
if unit.mode.is_run_custom_build() && cx.bcx.show_warnings(unit.pkg.package_id()) {
if unit.mode.is_run_custom_build() && unit.show_warnings(cx.bcx.config) {
self.emit_warnings(None, unit, cx)?;
}
let unlocked = self.queue.finish(unit, &artifact);


@ -136,7 +136,7 @@ fn compile<'cfg>(
};
work.then(link_targets(cx, unit, false)?)
} else {
let work = if cx.bcx.show_warnings(unit.pkg.package_id()) {
let work = if unit.show_warnings(bcx.config) {
replay_output_cache(
unit.pkg.package_id(),
&unit.target,
@ -223,6 +223,7 @@ fn rustc(cx: &mut Context<'_, '_>, unit: &Unit, exec: &Arc<dyn Executor>) -> Car
.to_path_buf();
let fingerprint_dir = cx.files().fingerprint_dir(unit);
let script_metadata = cx.find_build_script_metadata(unit.clone());
let is_local = unit.is_local();
return Ok(Work::new(move |state| {
// Only at runtime have we discovered what the extra -L and -l
@ -312,7 +313,7 @@ fn rustc(cx: &mut Context<'_, '_>, unit: &Unit, exec: &Arc<dyn Executor>) -> Car
&pkg_root,
&target_dir,
// Do not track source files in the fingerprint for registry dependencies.
current_id.source_id().is_path(),
is_local,
)
.chain_err(|| {
internal(format!(
@ -320,8 +321,9 @@ fn rustc(cx: &mut Context<'_, '_>, unit: &Unit, exec: &Arc<dyn Executor>) -> Car
rustc_dep_info_loc.display()
))
})?;
debug!("rewinding mtime of {:?} to {}", dep_info_loc, timestamp);
filetime::set_file_times(dep_info_loc, timestamp, timestamp)?;
// This mtime shift allows Cargo to detect if a source file was
// modified in the middle of the build.
paths::set_file_time_no_err(dep_info_loc, timestamp);
}
Ok(())
@ -537,7 +539,7 @@ fn prepare_rustc(
let mut base = cx
.compilation
.rustc_process(&unit.pkg, is_primary, is_workspace)?;
.rustc_process(unit, is_primary, is_workspace)?;
if cx.bcx.config.cli_unstable().jobserver_per_rustc {
let client = cx.new_jobserver()?;
base.inherit_jobserver(&client);
@ -553,7 +555,7 @@ fn prepare_rustc(
fn rustdoc(cx: &mut Context<'_, '_>, unit: &Unit) -> CargoResult<Work> {
let bcx = cx.bcx;
let mut rustdoc = cx.compilation.rustdoc_process(&unit.pkg, &unit.target)?;
let mut rustdoc = cx.compilation.rustdoc_process(unit)?;
rustdoc.inherit_jobserver(&cx.jobserver);
rustdoc.arg("--crate-name").arg(&unit.target.crate_name());
add_path_args(bcx, unit, &mut rustdoc);
@ -687,12 +689,12 @@ fn add_path_args(bcx: &BuildContext<'_, '_>, unit: &Unit, cmd: &mut ProcessBuild
fn add_cap_lints(bcx: &BuildContext<'_, '_>, unit: &Unit, cmd: &mut ProcessBuilder) {
// If this is an upstream dep we don't want warnings from, turn off all
// lints.
if !bcx.show_warnings(unit.pkg.package_id()) {
if !unit.show_warnings(bcx.config) {
cmd.arg("--cap-lints").arg("allow");
// If this is an upstream dep but we *do* want warnings, make sure that they
// don't fail compilation.
} else if !unit.pkg.package_id().source_id().is_path() {
} else if !unit.is_local() {
cmd.arg("--cap-lints").arg("warn");
}
}


@ -96,8 +96,7 @@ fn add_deps_for_unit(
// Recursively traverse all transitive dependencies
let unit_deps = Vec::from(cx.unit_deps(unit)); // Create vec due to mutable borrow.
for dep in unit_deps {
let source_id = dep.unit.pkg.package_id().source_id();
if source_id.is_path() {
if unit.is_local() {
add_deps_for_unit(deps, cx, &dep.unit, visited)?;
}
}


@ -34,7 +34,7 @@ pub fn parse_unstable_flag(value: Option<&str>) -> Vec<String> {
pub fn resolve_std<'cfg>(
ws: &Workspace<'cfg>,
target_data: &RustcTargetData,
requested_target: CompileKind,
requested_targets: &[CompileKind],
crates: &[String],
) -> CargoResult<(PackageSet<'cfg>, Resolve, ResolvedFeatures)> {
let src_path = detect_sysroot_src_path(target_data)?;
@ -107,7 +107,7 @@ pub fn resolve_std<'cfg>(
let resolve = ops::resolve_ws_with_opts(
&std_ws,
target_data,
requested_target,
requested_targets,
&opts,
&specs,
HasDevUnits::No,
@ -126,11 +126,11 @@ pub fn generate_std_roots(
crates: &[String],
std_resolve: &Resolve,
std_features: &ResolvedFeatures,
kind: CompileKind,
kinds: &[CompileKind],
package_set: &PackageSet<'_>,
interner: &UnitInterner,
profiles: &Profiles,
) -> CargoResult<Vec<Unit>> {
) -> CargoResult<HashMap<CompileKind, Vec<Unit>>> {
// Generate the root Units for the standard library.
let std_ids = crates
.iter()
@ -138,10 +138,9 @@ pub fn generate_std_roots(
.collect::<CargoResult<Vec<PackageId>>>()?;
// Convert PackageId to Package.
let std_pkgs = package_set.get_many(std_ids)?;
// Generate a list of Units.
std_pkgs
.into_iter()
.map(|pkg| {
// Generate a map of Units for each kind requested.
let mut ret = HashMap::new();
for pkg in std_pkgs {
let lib = pkg
.targets()
.iter()
@ -152,15 +151,29 @@ pub fn generate_std_roots(
// in time is minimal, and the difference in caching is
// significant.
let mode = CompileMode::Build;
let profile =
profiles.get_profile(pkg.package_id(), /*is_member*/ false, unit_for, mode);
let features =
std_features.activated_features(pkg.package_id(), FeaturesFor::NormalOrDev);
Ok(interner.intern(
pkg, lib, profile, kind, mode, features, /*is_std*/ true,
))
})
.collect::<CargoResult<Vec<_>>>()
let profile = profiles.get_profile(
pkg.package_id(),
/*is_member*/ false,
/*is_local*/ false,
unit_for,
mode,
);
let features = std_features.activated_features(pkg.package_id(), FeaturesFor::NormalOrDev);
for kind in kinds {
let list = ret.entry(*kind).or_insert(Vec::new());
list.push(interner.intern(
pkg,
lib,
profile,
*kind,
mode,
features.clone(),
/*is_std*/ true,
));
}
}
return Ok(ret);
}
fn detect_sysroot_src_path(target_data: &RustcTargetData) -> CargoResult<PathBuf> {

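With `-Zmultitarget`, `generate_std_roots` above now returns a `HashMap<CompileKind, Vec<Unit>>` so the std crates get one root unit per requested kind. The grouping uses the same `entry(...).or_insert(Vec::new())` pattern as the diff; units are reduced to strings here for illustration:

```rust
use std::collections::HashMap;

#[derive(Clone, Copy, PartialEq, Eq, Hash, Debug)]
#[allow(dead_code)]
enum CompileKind {
    Host,
    Target(&'static str),
}

/// Build one entry per requested kind, each holding a unit per std package.
fn roots_per_kind(
    std_pkgs: &[&str],
    kinds: &[CompileKind],
) -> HashMap<CompileKind, Vec<String>> {
    let mut ret: HashMap<CompileKind, Vec<String>> = HashMap::new();
    for pkg in std_pkgs {
        for kind in kinds {
            // Same entry-API pattern as the hunk above.
            ret.entry(*kind)
                .or_insert_with(Vec::new)
                .push(format!("{} for {:?}", pkg, kind));
        }
    }
    ret
}

fn main() {
    let kinds = [
        CompileKind::Target("x86_64-unknown-linux-gnu"),
        CompileKind::Target("wasm32-unknown-unknown"),
    ];
    let roots = roots_per_kind(&["core", "std"], &kinds);
    assert_eq!(roots.len(), 2);
    assert_eq!(roots[&kinds[0]].len(), 2);
}
```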

@ -614,7 +614,13 @@ fn render_rustc_info(bcx: &BuildContext<'_, '_>) -> String {
.lines()
.next()
.expect("rustc version");
let requested_target = bcx.target_data.short_name(&bcx.build_config.requested_kind);
let requested_target = bcx
.build_config
.requested_kinds
.iter()
.map(|kind| bcx.target_data.short_name(kind))
.collect::<Vec<_>>()
.join(", ");
format!(
"{}<br>Host: {}<br>Target: {}",
version,


@ -2,6 +2,7 @@ use crate::core::compiler::{CompileKind, CompileMode};
use crate::core::manifest::{LibKind, Target, TargetKind};
use crate::core::{profiles::Profile, InternedString, Package};
use crate::util::hex::short_hash;
use crate::util::Config;
use std::cell::RefCell;
use std::collections::HashSet;
use std::fmt;
@ -67,6 +68,19 @@ impl UnitInner {
pub fn requires_upstream_objects(&self) -> bool {
self.mode.is_any_test() || self.target.kind().requires_upstream_objects()
}
/// Returns whether or not this is a "local" package.
///
/// A "local" package is one that the user can likely edit, or otherwise
/// wants warnings, etc.
pub fn is_local(&self) -> bool {
self.pkg.package_id().source_id().is_path() && !self.is_std
}
/// Returns whether or not warnings should be displayed for this unit.
pub fn show_warnings(&self, config: &Config) -> bool {
self.is_local() || config.extra_verbose()
}
}
impl Unit {

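The new `Unit::is_local` / `Unit::show_warnings` helpers above centralize a check that was previously scattered as `source_id().is_path()` calls. A simplified model of the logic (field names here are illustrative, not Cargo's):

```rust
struct Unit {
    from_path_source: bool, // checked out locally rather than from a registry
    is_std: bool,           // part of a `-Zbuild-std` sysroot build
}

struct Config {
    extra_verbose: bool, // `-vv`
}

impl Unit {
    /// A "local" package is one the user can likely edit; std units build
    /// from a path source but are never treated as local.
    fn is_local(&self) -> bool {
        self.from_path_source && !self.is_std
    }

    /// Warnings are shown for local units, or for everything under `-vv`.
    fn show_warnings(&self, config: &Config) -> bool {
        self.is_local() || config.extra_verbose
    }
}

fn main() {
    let quiet = Config { extra_verbose: false };
    let member = Unit { from_path_source: true, is_std: false };
    let std_unit = Unit { from_path_source: true, is_std: true };
    assert!(member.show_warnings(&quiet));
    assert!(!std_unit.show_warnings(&quiet));
    assert!(std_unit.show_warnings(&Config { extra_verbose: true }));
}
```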

@ -55,7 +55,7 @@ pub fn build_unit_dependencies<'a, 'cfg>(
features: &'a ResolvedFeatures,
std_resolve: Option<&'a (Resolve, ResolvedFeatures)>,
roots: &[Unit],
std_roots: &[Unit],
std_roots: &HashMap<CompileKind, Vec<Unit>>,
global_mode: CompileMode,
target_data: &'a RustcTargetData,
profiles: &'a Profiles,
@ -108,14 +108,16 @@ pub fn build_unit_dependencies<'a, 'cfg>(
/// Compute all the dependencies for the standard library.
fn calc_deps_of_std(
mut state: &mut State<'_, '_>,
std_roots: &[Unit],
std_roots: &HashMap<CompileKind, Vec<Unit>>,
) -> CargoResult<Option<UnitGraph>> {
if std_roots.is_empty() {
return Ok(None);
}
// Compute dependencies for the standard library.
state.is_std = true;
deps_of_roots(std_roots, &mut state)?;
for roots in std_roots.values() {
deps_of_roots(roots, &mut state)?;
}
state.is_std = false;
Ok(Some(std::mem::replace(
&mut state.unit_dependencies,
@ -124,11 +126,15 @@ fn calc_deps_of_std(
}
/// Add the standard library units to the `unit_dependencies`.
fn attach_std_deps(state: &mut State<'_, '_>, std_roots: &[Unit], std_unit_deps: UnitGraph) {
fn attach_std_deps(
state: &mut State<'_, '_>,
std_roots: &HashMap<CompileKind, Vec<Unit>>,
std_unit_deps: UnitGraph,
) {
// Attach the standard library as a dependency of every target unit.
for (unit, deps) in state.unit_dependencies.iter_mut() {
if !unit.kind.is_host() && !unit.mode.is_run_custom_build() {
deps.extend(std_roots.iter().map(|unit| UnitDep {
deps.extend(std_roots[&unit.kind].iter().map(|unit| UnitDep {
unit: unit.clone(),
unit_for: UnitFor::new_normal(),
extern_crate_name: unit.pkg.name(),
@ -574,10 +580,14 @@ fn new_unit_dep(
kind: CompileKind,
mode: CompileMode,
) -> CargoResult<UnitDep> {
let profile =
state
.profiles
.get_profile(pkg.package_id(), state.ws.is_member(pkg), unit_for, mode);
let is_local = pkg.package_id().source_id().is_path() && !state.is_std;
let profile = state.profiles.get_profile(
pkg.package_id(),
state.ws.is_member(pkg),
is_local,
unit_for,
mode,
);
new_unit_dep_with_profile(state, parent, pkg, target, unit_for, kind, mode, profile)
}


@ -352,6 +352,7 @@ pub struct CliUnstable {
pub features: Option<Vec<String>>,
pub crate_versions: bool,
pub separate_nightlies: bool,
pub multitarget: bool,
}
impl CliUnstable {
@ -430,6 +431,7 @@ impl CliUnstable {
"features" => self.features = Some(parse_features(v)),
"crate-versions" => self.crate_versions = parse_empty(k, v)?,
"separate-nightlies" => self.separate_nightlies = parse_empty(k, v)?,
"multitarget" => self.multitarget = parse_empty(k, v)?,
_ => bail!("unknown `-Z` flag specified: {}", k),
}


@ -353,7 +353,7 @@ compact_debug! {
kinds.clone(),
self.src_path.path().unwrap().to_path_buf(),
self.edition,
).inner.clone(),
).inner,
format!("lib_target({:?}, {:?}, {:?}, {:?})",
self.name, kinds, self.src_path, self.edition),
)
@ -366,21 +366,21 @@ compact_debug! {
&self.name,
path.to_path_buf(),
self.edition,
).inner.clone(),
).inner,
format!("custom_build_target({:?}, {:?}, {:?})",
self.name, path, self.edition),
)
}
TargetSourcePath::Metabuild => {
(
Target::metabuild_target(&self.name).inner.clone(),
Target::metabuild_target(&self.name).inner,
format!("metabuild_target({:?})", self.name),
)
}
}
}
_ => (
Target::new(self.src_path.clone(), self.edition).inner.clone(),
Target::new(self.src_path.clone(), self.edition).inner,
format!("with_path({:?}, {:?})", self.src_path, self.edition),
),
}


@ -472,7 +472,7 @@ impl<'cfg> PackageSet<'cfg> {
resolve: &Resolve,
root_ids: &[PackageId],
has_dev_units: HasDevUnits,
requested_kind: CompileKind,
requested_kinds: &[CompileKind],
target_data: &RustcTargetData,
) -> CargoResult<()> {
fn collect_used_deps(
@ -480,7 +480,7 @@ impl<'cfg> PackageSet<'cfg> {
resolve: &Resolve,
pkg_id: PackageId,
has_dev_units: HasDevUnits,
requested_kind: CompileKind,
requested_kinds: &[CompileKind],
target_data: &RustcTargetData,
) -> CargoResult<()> {
if !used.insert(pkg_id) {
@ -495,9 +495,11 @@ impl<'cfg> PackageSet<'cfg> {
// dependencies are used both for target and host. To tighten this
// up, this function would need to track "for_host" similar to how
// unit dependencies handles it.
if !target_data.dep_platform_activated(dep, requested_kind)
&& !target_data.dep_platform_activated(dep, CompileKind::Host)
{
let activated = requested_kinds
.iter()
.chain(Some(&CompileKind::Host))
.any(|kind| target_data.dep_platform_activated(dep, *kind));
if !activated {
return false;
}
true
@ -509,7 +511,7 @@ impl<'cfg> PackageSet<'cfg> {
resolve,
dep_id,
has_dev_units,
requested_kind,
requested_kinds,
target_data,
)?;
}
@ -527,7 +529,7 @@ impl<'cfg> PackageSet<'cfg> {
resolve,
*id,
has_dev_units,
requested_kind,
requested_kinds,
target_data,
)?;
}

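The `chain(Some(&CompileKind::Host)).any(..)` rewrite above keeps a dependency when it activates for any requested kind *or* the host, since build scripts and proc-macros always compile for the host. A sketch of that selection, with `platform_activated` standing in for `RustcTargetData::dep_platform_activated` (the host triple is a placeholder):

```rust
#[derive(Clone, Copy)]
enum CompileKind {
    Host,
    Target(&'static str),
}

const HOST_TRIPLE: &str = "x86_64-unknown-linux-gnu";

/// A dependency with no platform restriction is active everywhere.
fn platform_activated(dep_platform: Option<&str>, kind: CompileKind) -> bool {
    match (dep_platform, kind) {
        (None, _) => true,
        (Some(p), CompileKind::Target(t)) => p == t,
        (Some(p), CompileKind::Host) => p == HOST_TRIPLE,
    }
}

/// Keep the dep if any requested kind, or the always-chained host, matches.
fn dep_used(dep_platform: Option<&str>, requested_kinds: &[CompileKind]) -> bool {
    requested_kinds
        .iter()
        .chain(Some(&CompileKind::Host))
        .any(|kind| platform_activated(dep_platform, *kind))
}

fn main() {
    let kinds = [CompileKind::Target("wasm32-unknown-unknown")];
    assert!(dep_used(None, &kinds));
    assert!(dep_used(Some("wasm32-unknown-unknown"), &kinds));
    // Host-only deps stay activated because Host is always chained in.
    assert!(dep_used(Some(HOST_TRIPLE), &kinds));
    assert!(!dep_used(Some("aarch64-apple-darwin"), &kinds));
}
```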

@ -287,6 +287,7 @@ impl Profiles {
&self,
pkg_id: PackageId,
is_member: bool,
is_local: bool,
unit_for: UnitFor,
mode: CompileMode,
) -> Profile {
@ -360,7 +361,7 @@ impl Profiles {
// itself (aka crates.io / git dependencies)
//
// (see also https://github.com/rust-lang/cargo/issues/3972)
if !pkg_id.source_id().is_path() {
if !is_local {
profile.incremental = false;
}
profile.name = profile_name;


@ -175,7 +175,7 @@ impl ConflictCache {
dep: &Dependency,
must_contain: Option<PackageId>,
) -> Option<&ConflictMap> {
let out = self.find(dep, &|id| cx.is_active(id), must_contain, std::usize::MAX);
let out = self.find(dep, &|id| cx.is_active(id), must_contain, usize::MAX);
if cfg!(debug_assertions) {
if let Some(c) = &out {
assert!(cx.is_conflicting(None, c).is_some());


@ -246,8 +246,8 @@ impl ResolvedFeatures {
pub struct FeatureResolver<'a, 'cfg> {
ws: &'a Workspace<'cfg>,
target_data: &'a RustcTargetData,
/// The platform to build for, requested by the user.
requested_target: CompileKind,
/// The platforms to build for, requested by the user.
requested_targets: &'a [CompileKind],
resolve: &'a Resolve,
package_set: &'a PackageSet<'cfg>,
/// Options that change how the feature resolver operates.
@ -269,7 +269,7 @@ impl<'a, 'cfg> FeatureResolver<'a, 'cfg> {
package_set: &'a PackageSet<'cfg>,
requested_features: &RequestedFeatures,
specs: &[PackageIdSpec],
requested_target: CompileKind,
requested_targets: &[CompileKind],
has_dev_units: HasDevUnits,
) -> CargoResult<ResolvedFeatures> {
use crate::util::profile;
@ -287,7 +287,7 @@ impl<'a, 'cfg> FeatureResolver<'a, 'cfg> {
let mut r = FeatureResolver {
ws,
target_data,
requested_target,
requested_targets,
resolve,
package_set,
opts,
@ -536,8 +536,9 @@ impl<'a, 'cfg> FeatureResolver<'a, 'cfg> {
.dep_platform_activated(dep, CompileKind::Host);
}
// Not a build dependency, and not for a build script, so must be Target.
self.target_data
.dep_platform_activated(dep, self.requested_target)
self.requested_targets
.iter()
.any(|kind| self.target_data.dep_platform_activated(dep, *kind))
};
self.resolve
.deps(pkg_id)


@ -792,7 +792,9 @@ impl<'cfg> Workspace<'cfg> {
if !manifest.patch().is_empty() {
emit_warning("patch")?;
}
if manifest.resolve_behavior() != self.resolve_behavior {
if manifest.resolve_behavior().is_some()
&& manifest.resolve_behavior() != self.resolve_behavior
{
// Only warn if they don't match.
emit_warning("resolver")?;
}

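The workspace fix above hinges on `Option` comparison semantics: `None != Some(x)` is true in Rust, so the old check warned for member manifests that merely omitted `resolver`; the new one requires an explicit, conflicting value. Sketch with strings standing in for `ResolveBehavior`:

```rust
/// Warn only when the member sets `resolver` explicitly and it differs from
/// the workspace's setting.
fn should_warn(member_resolver: Option<&str>, ws_resolver: Option<&str>) -> bool {
    member_resolver.is_some() && member_resolver != ws_resolver
}

fn main() {
    // Member omits `resolver`: no warning, even if the workspace sets one.
    assert!(!should_warn(None, Some("2")));
    // Explicit mismatch still warns.
    assert!(should_warn(Some("1"), Some("2")));
    assert!(!should_warn(Some("2"), Some("2")));
}
```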

@ -23,7 +23,7 @@ pub struct CleanOptions<'a> {
/// A list of packages to clean. If empty, everything is cleaned.
pub spec: Vec<String>,
/// The target arch triple to clean, or None for the host arch
pub target: Option<String>,
pub targets: Vec<String>,
/// Whether to clean the release directory
pub profile_specified: bool,
/// Whether to clean the directory of a certain build profile
@ -61,9 +61,9 @@ pub fn clean(ws: &Workspace<'_>, opts: &CleanOptions<'_>) -> CargoResult<()> {
if opts.spec.is_empty() {
return rm_rf(&target_dir.into_path_unlocked(), config);
}
let mut build_config = BuildConfig::new(config, Some(1), &opts.target, CompileMode::Build)?;
let mut build_config = BuildConfig::new(config, Some(1), &opts.targets, CompileMode::Build)?;
build_config.requested_profile = opts.requested_profile;
let target_data = RustcTargetData::new(ws, build_config.requested_kind)?;
let target_data = RustcTargetData::new(ws, &build_config.requested_kinds)?;
// Resolve for default features. In the future, `cargo clean` should be rewritten
// so that it doesn't need to guess filename hashes.
let resolve_opts = ResolveOpts::new(
@ -80,7 +80,7 @@ pub fn clean(ws: &Workspace<'_>, opts: &CleanOptions<'_>) -> CargoResult<()> {
let ws_resolve = ops::resolve_ws_with_opts(
ws,
&target_data,
build_config.requested_kind,
&build_config.requested_kinds,
&resolve_opts,
&specs,
HasDevUnits::Yes,
@ -102,13 +102,18 @@ pub fn clean(ws: &Workspace<'_>, opts: &CleanOptions<'_>) -> CargoResult<()> {
// Generate all relevant `Unit` targets for this package
for target in pkg.targets() {
for kind in [CompileKind::Host, build_config.requested_kind].iter() {
for kind in build_config
.requested_kinds
.iter()
.chain(Some(&CompileKind::Host))
{
for mode in CompileMode::all_modes() {
for unit_for in UnitFor::all_values() {
let profile = if mode.is_run_custom_build() {
profiles.get_profile_run_custom_build(&profiles.get_profile(
pkg.package_id(),
ws.is_member(pkg),
/*is_local*/ true,
*unit_for,
CompileMode::Build,
))
@ -116,6 +121,7 @@ pub fn clean(ws: &Workspace<'_>, opts: &CleanOptions<'_>) -> CargoResult<()> {
profiles.get_profile(
pkg.package_id(),
ws.is_member(pkg),
/*is_local*/ true,
*unit_for,
*mode,
)
@ -141,7 +147,7 @@ pub fn clean(ws: &Workspace<'_>, opts: &CleanOptions<'_>) -> CargoResult<()> {
&features,
None,
&units,
&[],
&Default::default(),
build_config.mode,
&target_data,
&profiles,


@ -79,7 +79,7 @@ pub struct CompileOptions {
impl<'a> CompileOptions {
pub fn new(config: &Config, mode: CompileMode) -> CargoResult<CompileOptions> {
Ok(CompileOptions {
build_config: BuildConfig::new(config, None, &None, mode)?,
build_config: BuildConfig::new(config, None, &[], mode)?,
features: Vec::new(),
all_features: false,
no_default_features: false,
@ -310,7 +310,7 @@ pub fn create_bcx<'a, 'cfg>(
}
}
let target_data = RustcTargetData::new(ws, build_config.requested_kind)?;
let target_data = RustcTargetData::new(ws, &build_config.requested_kinds)?;
let specs = spec.to_package_id_specs(ws)?;
let dev_deps = ws.require_optional_deps() || filter.need_dev_deps(build_config.mode);
@ -323,7 +323,7 @@ pub fn create_bcx<'a, 'cfg>(
let resolve = ops::resolve_ws_with_opts(
ws,
&target_data,
build_config.requested_kind,
&build_config.requested_kinds,
&opts,
&specs,
has_dev_units,
@ -341,14 +341,14 @@ pub fn create_bcx<'a, 'cfg>(
.shell()
.warn("-Zbuild-std does not currently fully support --build-plan")?;
}
if build_config.requested_kind.is_host() {
if build_config.requested_kinds[0].is_host() {
// TODO: This should eventually be fixed. Unfortunately it is not
// easy to get the host triple in BuildConfig. Consider changing
// requested_target to an enum, or some other approach.
anyhow::bail!("-Zbuild-std requires --target");
}
let (std_package_set, std_resolve, std_features) =
standard_lib::resolve_std(ws, &target_data, build_config.requested_kind, crates)?;
standard_lib::resolve_std(ws, &target_data, &build_config.requested_kinds, crates)?;
pkg_set.add_set(std_package_set);
Some((std_resolve, std_features))
} else {
@ -413,13 +413,13 @@ pub fn create_bcx<'a, 'cfg>(
ws,
&to_builds,
filter,
build_config.requested_kind,
&build_config.requested_kinds,
build_config.mode,
&resolve,
&resolved_features,
&pkg_set,
&profiles,
&interner,
interner,
)?;
let std_roots = if let Some(crates) = &config.cli_unstable().build_std {
@ -440,13 +440,13 @@ pub fn create_bcx<'a, 'cfg>(
&crates,
std_resolve,
std_features,
build_config.requested_kind,
&build_config.requested_kinds,
&pkg_set,
&interner,
interner,
&profiles,
)?
} else {
Vec::new()
Default::default()
};
let mut extra_compiler_args = HashMap::new();
@ -491,7 +491,7 @@ pub fn create_bcx<'a, 'cfg>(
build_config.mode,
&target_data,
&profiles,
&interner,
interner,
)?;
let bcx = BuildContext::new(
@ -694,7 +694,7 @@ fn generate_targets(
ws: &Workspace<'_>,
packages: &[&Package],
filter: &CompileFilter,
default_arch_kind: CompileKind,
requested_kinds: &[CompileKind],
mode: CompileMode,
resolve: &Resolve,
resolved_features: &features::ResolvedFeatures,
@ -703,8 +703,9 @@ fn generate_targets(
interner: &UnitInterner,
) -> CargoResult<Vec<Unit>> {
let config = ws.config();
// Helper for creating a `Unit` struct.
let new_unit = |pkg: &Package, target: &Target, target_mode: CompileMode| {
// Helper for creating a list of `Unit` structures
let new_unit =
|units: &mut HashSet<Unit>, pkg: &Package, target: &Target, target_mode: CompileMode| {
let unit_for = if target_mode.is_any_test() {
// NOTE: the `UnitFor` here is subtle. If you have a profile
// with `panic` set, the `panic` flag is cleared for
@ -759,23 +760,32 @@ fn generate_targets(
CompileMode::Bench => CompileMode::Test,
_ => target_mode,
};
let kind = default_arch_kind.for_target(target);
let profile =
profiles.get_profile(pkg.package_id(), ws.is_member(pkg), unit_for, target_mode);
let is_local = pkg.package_id().source_id().is_path();
let profile = profiles.get_profile(
pkg.package_id(),
ws.is_member(pkg),
is_local,
unit_for,
target_mode,
);
// No need to worry about build-dependencies, roots are never build dependencies.
let features_for = FeaturesFor::from_for_host(target.proc_macro());
let features =
Vec::from(resolved_features.activated_features(pkg.package_id(), features_for));
interner.intern(
let features = resolved_features.activated_features(pkg.package_id(), features_for);
for kind in requested_kinds {
let unit = interner.intern(
pkg,
target,
profile,
kind,
kind.for_target(target),
target_mode,
features,
features.clone(),
/*is_std*/ false,
)
);
units.insert(unit);
}
};
// Create a list of proposed targets.
@ -921,8 +931,7 @@ fn generate_targets(
None => Vec::new(),
};
if target.is_lib() || unavailable_features.is_empty() {
let unit = new_unit(pkg, target, mode);
units.insert(unit);
new_unit(&mut units, pkg, target, mode);
} else if requires_features {
let required_features = target.required_features().unwrap();
let quoted_required_features: Vec<String> = required_features


@ -25,12 +25,11 @@ pub fn doc(ws: &Workspace<'_>, options: &DocOptions) -> CargoResult<()> {
options.compile_opts.all_features,
!options.compile_opts.no_default_features,
);
let requested_kind = options.compile_opts.build_config.requested_kind;
let target_data = RustcTargetData::new(ws, requested_kind)?;
let target_data = RustcTargetData::new(ws, &options.compile_opts.build_config.requested_kinds)?;
let ws_resolve = ops::resolve_ws_with_opts(
ws,
&target_data,
requested_kind,
&options.compile_opts.build_config.requested_kinds,
&opts,
&specs,
HasDevUnits::No,
@ -69,15 +68,20 @@ pub fn doc(ws: &Workspace<'_>, options: &DocOptions) -> CargoResult<()> {
}
}
let open_kind = if options.open_result {
Some(options.compile_opts.build_config.single_requested_kind()?)
} else {
None
};
let compilation = ops::compile(ws, &options.compile_opts)?;
if options.open_result {
if let Some(kind) = open_kind {
let name = match names.first() {
Some(s) => s.to_string(),
None => return Ok(()),
};
let path = compilation
.root_output
let path = compilation.root_output[&kind]
.with_file_name("doc")
.join(&name)
.join("index.html");


@ -1,4 +1,4 @@
use crate::core::compiler::{BuildConfig, CompileMode, TargetInfo};
use crate::core::compiler::{BuildConfig, CompileMode, RustcTargetData};
use crate::core::{PackageSet, Resolve, Workspace};
use crate::ops;
use crate::util::CargoResult;
@ -8,7 +8,7 @@ use std::collections::HashSet;
pub struct FetchOptions<'a> {
pub config: &'a Config,
/// The target arch triple to fetch dependencies for
pub target: Option<String>,
pub targets: Vec<String>,
}
/// Executes `cargo fetch`.
@ -21,14 +21,8 @@ pub fn fetch<'a>(
let jobs = Some(1);
let config = ws.config();
let build_config = BuildConfig::new(config, jobs, &options.target, CompileMode::Build)?;
let rustc = config.load_global_rustc(Some(ws))?;
let target_info = TargetInfo::new(
config,
build_config.requested_kind,
&rustc,
build_config.requested_kind,
)?;
let build_config = BuildConfig::new(config, jobs, &options.targets, CompileMode::Build)?;
let data = RustcTargetData::new(ws, &build_config.requested_kinds)?;
let mut fetched_packages = HashSet::new();
let mut deps_to_fetch = ws.members().map(|p| p.package_id()).collect::<Vec<_>>();
let mut to_download = Vec::new();
@ -43,20 +37,21 @@ pub fn fetch<'a>(
.deps(id)
.filter(|&(_id, deps)| {
deps.iter().any(|d| {
// If no target was specified then all dependencies can
// be fetched.
let target = match options.target {
Some(ref t) => t,
None => return true,
};
// If this dependency is only available for certain
// platforms, make sure we're only fetching it for that
// platform.
let platform = match d.platform() {
Some(p) => p,
None => return true,
};
platform.matches(target, target_info.cfg())
// If no target was specified then all dependencies are
// fetched.
if options.targets.len() == 0 {
return true;
}
// Otherwise we only download this dependency if any of the
// requested platforms would match this dependency. Note
// that this is a bit lossy because not all dependencies are
// always compiled for all platforms, but it should be
// "close enough" for now.
build_config
.requested_kinds
.iter()
.any(|kind| data.dep_platform_activated(d, *kind))
})
})
.map(|(id, _deps)| id);


@ -376,7 +376,7 @@ fn install_one(
let mut binaries: Vec<(&str, &Path)> = compile
.binaries
.iter()
.map(|bin| {
.map(|(_, bin)| {
let name = bin.file_name().unwrap();
if let Some(s) = name.to_str() {
Ok((s, bin.as_ref()))
@ -612,7 +612,7 @@ fn make_ws_rustc_target<'cfg>(
ws.set_require_optional_deps(false);
let rustc = config.load_global_rustc(Some(&ws))?;
let target = match &opts.build_config.requested_kind {
let target = match &opts.build_config.single_requested_kind()? {
CompileKind::Host => rustc.host.as_str().to_owned(),
CompileKind::Target(target) => target.short_name().to_owned(),
};


@ -1,4 +1,4 @@
use crate::core::compiler::{CompileKind, CompileTarget, RustcTargetData};
use crate::core::compiler::{CompileKind, RustcTargetData};
use crate::core::dependency::DepKind;
use crate::core::resolver::{HasDevUnits, Resolve, ResolveOpts};
use crate::core::{Dependency, InternedString, Package, PackageId, Workspace};
@ -17,7 +17,7 @@ pub struct OutputMetadataOptions {
pub all_features: bool,
pub no_deps: bool,
pub version: u32,
pub filter_platform: Option<String>,
pub filter_platforms: Vec<String>,
}
/// Loads the manifest, resolves the dependencies of the package to the concrete
@ -105,11 +105,9 @@ fn build_resolve_graph(
) -> CargoResult<(Vec<Package>, MetadataResolve)> {
// TODO: Without --filter-platform, features are being resolved for `host` only.
// How should this work?
let requested_kind = match &metadata_opts.filter_platform {
Some(t) => CompileKind::Target(CompileTarget::new(t)?),
None => CompileKind::Host,
};
let target_data = RustcTargetData::new(ws, requested_kind)?;
let requested_kinds =
CompileKind::from_requested_targets(ws.config(), &metadata_opts.filter_platforms)?;
let target_data = RustcTargetData::new(ws, &requested_kinds)?;
// Resolve entire workspace.
let specs = Packages::All.to_package_id_specs(ws)?;
let resolve_opts = ResolveOpts::new(
@ -121,7 +119,7 @@ fn build_resolve_graph(
let ws_resolve = ops::resolve_ws_with_opts(
ws,
&target_data,
requested_kind,
&requested_kinds,
&resolve_opts,
&specs,
HasDevUnits::Yes,
@ -147,7 +145,7 @@ fn build_resolve_graph(
&ws_resolve.targeted_resolve,
&package_map,
&target_data,
requested_kind,
&requested_kinds,
);
}
// Get a Vec of Packages.
@ -168,7 +166,7 @@ fn build_resolve_graph_r(
resolve: &Resolve,
package_map: &HashMap<PackageId, Package>,
target_data: &RustcTargetData,
requested_kind: CompileKind,
requested_kinds: &[CompileKind],
) {
if node_map.contains_key(&pkg_id) {
return;
@ -177,12 +175,15 @@ fn build_resolve_graph_r(
let deps: Vec<Dep> = resolve
.deps(pkg_id)
.filter(|(_dep_id, deps)| match requested_kind {
CompileKind::Target(_) => deps
.iter()
.any(|dep| target_data.dep_platform_activated(dep, requested_kind)),
// No --filter-platform is interpreted as "all platforms".
CompileKind::Host => true,
.filter(|(_dep_id, deps)| {
if requested_kinds == &[CompileKind::Host] {
true
} else {
requested_kinds.iter().any(|kind| {
deps.iter()
.any(|dep| target_data.dep_platform_activated(dep, *kind))
})
}
})
.filter_map(|(dep_id, deps)| {
let dep_kinds: Vec<_> = deps.iter().map(DepKindInfo::from).collect();
@ -213,7 +214,7 @@ fn build_resolve_graph_r(
resolve,
package_map,
target_data,
requested_kind,
requested_kinds,
);
}
}


@ -29,7 +29,7 @@ pub struct PackageOpts<'cfg> {
pub allow_dirty: bool,
pub verify: bool,
pub jobs: Option<u32>,
pub target: Option<String>,
pub targets: Vec<String>,
pub features: Vec<String>,
pub all_features: bool,
pub no_default_features: bool,
@ -50,8 +50,17 @@ struct ArchiveFile {
enum FileContents {
/// Absolute path to the file on disk to add to the archive.
OnDisk(PathBuf),
/// Contents of a file generated in memory.
Generated(String),
/// Generates a file.
Generated(GeneratedFile),
}
enum GeneratedFile {
/// Generates `Cargo.toml` by rewriting the original.
Manifest,
/// Generates `Cargo.lock` in some cases (like if there is a binary).
Lockfile,
/// Adds a `.cargo-vcs_info.json` file if in a (clean) git repo.
VcsInfo(String),
}
pub fn package(ws: &Workspace<'_>, opts: &PackageOpts<'_>) -> CargoResult<Option<FileLock>> {
@ -71,8 +80,6 @@ pub fn package(ws: &Workspace<'_>, opts: &PackageOpts<'_>) -> CargoResult<Option
check_metadata(pkg, config)?;
}
verify_dependencies(pkg)?;
if !pkg.manifest().exclude().is_empty() && !pkg.manifest().include().is_empty() {
config.shell().warn(
"both package.include and package.exclude are specified; \
@ -100,6 +107,8 @@ pub fn package(ws: &Workspace<'_>, opts: &PackageOpts<'_>) -> CargoResult<Option
return Ok(None);
}
verify_dependencies(pkg)?;
let filename = format!("{}-{}.crate", pkg.name(), pkg.version());
let dir = ws.target_dir().join("package");
let mut dst = {
@ -156,11 +165,10 @@ fn build_ar_list(
rel_str: "Cargo.toml.orig".to_string(),
contents: FileContents::OnDisk(src_file),
});
let generated = pkg.to_registry_toml(ws)?;
result.push(ArchiveFile {
rel_path,
rel_str,
contents: FileContents::Generated(generated),
contents: FileContents::Generated(GeneratedFile::Manifest),
});
}
"Cargo.lock" => continue,
@ -179,18 +187,17 @@ fn build_ar_list(
}
}
if pkg.include_lockfile() {
let new_lock = build_lock(ws)?;
result.push(ArchiveFile {
rel_path: PathBuf::from("Cargo.lock"),
rel_str: "Cargo.lock".to_string(),
contents: FileContents::Generated(new_lock),
contents: FileContents::Generated(GeneratedFile::Lockfile),
});
}
if let Some(vcs_info) = vcs_info {
result.push(ArchiveFile {
rel_path: PathBuf::from(VCS_INFO_FILE),
rel_str: VCS_INFO_FILE.to_string(),
contents: FileContents::Generated(vcs_info),
contents: FileContents::Generated(GeneratedFile::VcsInfo(vcs_info)),
});
}
if let Some(license_file) = &pkg.manifest().metadata().license_file {
@ -530,7 +537,12 @@ fn tar(
format!("could not archive source file `{}`", disk_path.display())
})?;
}
FileContents::Generated(contents) => {
FileContents::Generated(generated_kind) => {
let contents = match generated_kind {
GeneratedFile::Manifest => pkg.to_registry_toml(ws)?,
GeneratedFile::Lockfile => build_lock(ws)?,
GeneratedFile::VcsInfo(s) => s,
};
header.set_entry_type(EntryType::file());
header.set_mode(0o644);
header.set_mtime(
@ -704,7 +716,7 @@ fn run_verify(ws: &Workspace<'_>, tar: &FileLock, opts: &PackageOpts<'_>) -> Car
ops::compile_with_exec(
&ws,
&ops::CompileOptions {
build_config: BuildConfig::new(config, opts.jobs, &opts.target, CompileMode::Build)?,
build_config: BuildConfig::new(config, opts.jobs, &opts.targets, CompileMode::Build)?,
features: opts.features.clone(),
no_default_features: opts.no_default_features,
all_features: opts.all_features,
@ -823,8 +835,7 @@ fn check_filename(file: &Path, shell: &mut Shell) -> CargoResult<()> {
file.display()
)
}
let mut check_windows = |name| -> CargoResult<()> {
if restricted_names::is_windows_reserved(name) {
if restricted_names::is_windows_reserved_path(file) {
shell.warn(format!(
"file {} is a reserved Windows filename, \
it will not work on Windows platforms",
@ -832,16 +843,4 @@ fn check_filename(file: &Path, shell: &mut Shell) -> CargoResult<()> {
))?;
}
Ok(())
};
for component in file.iter() {
if let Some(component) = component.to_str() {
check_windows(component)?;
}
}
if file.extension().is_some() {
if let Some(stem) = file.file_stem().and_then(|s| s.to_str()) {
check_windows(stem)?;
}
}
Ok(())
}

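The `GeneratedFile` refactor above defers producing archive contents until the tarball is actually written, instead of rendering strings while the file list is being built. A minimal model of that deferral (the generation arms are faked; in Cargo they call `to_registry_toml` and `build_lock`):

```rust
use std::path::PathBuf;

enum FileContents {
    OnDisk(PathBuf),
    Generated(GeneratedFile),
}

enum GeneratedFile {
    /// `Cargo.toml`, rewritten for the registry.
    Manifest,
    /// `Cargo.lock`, regenerated when needed.
    Lockfile,
    /// VCS info snapshot captured from a clean git repo.
    VcsInfo(String),
}

/// Resolve contents only at archive-write time.
fn realize(contents: &FileContents) -> String {
    match contents {
        FileContents::OnDisk(path) => format!("<copied from {}>", path.display()),
        FileContents::Generated(GeneratedFile::Manifest) => "normalized Cargo.toml".to_string(),
        FileContents::Generated(GeneratedFile::Lockfile) => "regenerated Cargo.lock".to_string(),
        FileContents::Generated(GeneratedFile::VcsInfo(json)) => json.clone(),
    }
}

fn main() {
    let files = vec![
        FileContents::OnDisk(PathBuf::from("src/lib.rs")),
        FileContents::Generated(GeneratedFile::Manifest),
        FileContents::Generated(GeneratedFile::Lockfile),
        FileContents::Generated(GeneratedFile::VcsInfo("{\"sha1\":\"abc\"}".into())),
    ];
    let rendered: Vec<String> = files.iter().map(realize).collect();
    assert_eq!(rendered[1], "normalized Cargo.toml");
    assert_eq!(rendered[2], "regenerated Cargo.lock");
    assert!(rendered[3].contains("sha1"));
}
```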

@ -70,16 +70,19 @@ pub fn run(
}
}
// `cargo run` is only compatible with one `--target` flag at most
options.build_config.single_requested_kind()?;
let compile = ops::compile(ws, options)?;
assert_eq!(compile.binaries.len(), 1);
let exe = &compile.binaries[0];
let (unit, exe) = &compile.binaries[0];
let exe = match exe.strip_prefix(config.cwd()) {
Ok(path) if path.file_name() == Some(path.as_os_str()) => Path::new(".").join(path),
Ok(path) => path.to_path_buf(),
Err(_) => exe.to_path_buf(),
};
let pkg = bins[0].0;
let mut process = compile.target_process(exe, pkg)?;
let mut process = compile.target_process(exe, unit.kind, pkg)?;
process.args(args).cwd(config.cwd());
config.shell().status("Running", process.to_string())?;

View File

@ -64,9 +64,7 @@ pub fn run_benches(
fn compile_tests<'a>(ws: &Workspace<'a>, options: &TestOptions) -> CargoResult<Compilation<'a>> {
let mut compilation = ops::compile(ws, &options.compile_opts)?;
compilation
.tests
.sort_by(|a, b| (a.0.package_id(), &a.1, &a.2).cmp(&(b.0.package_id(), &b.1, &b.2)));
compilation.tests.sort();
Ok(compilation)
}
@ -78,16 +76,14 @@ fn run_unit_tests(
compilation: &Compilation<'_>,
) -> CargoResult<(Test, Vec<ProcessError>)> {
let cwd = config.cwd();
let mut errors = Vec::new();
for &(ref pkg, ref target, ref exe) in &compilation.tests {
let kind = target.kind();
let test = target.name().to_string();
for (unit, exe) in compilation.tests.iter() {
let test = unit.target.name().to_string();
let exe_display = exe.strip_prefix(cwd).unwrap_or(exe).display();
let mut cmd = compilation.target_process(exe, pkg)?;
let mut cmd = compilation.target_process(exe, unit.kind, &unit.pkg)?;
cmd.args(test_args);
if target.harness() && config.shell().verbosity() == Verbosity::Quiet {
if unit.target.harness() && config.shell().verbosity() == Verbosity::Quiet {
cmd.arg("--quiet");
}
config
@ -102,7 +98,12 @@ fn run_unit_tests(
match result {
Err(e) => {
let e = e.downcast::<ProcessError>()?;
errors.push((kind.clone(), test.clone(), pkg.name().to_string(), e));
errors.push((
unit.target.kind().clone(),
test.clone(),
unit.pkg.name().to_string(),
e,
));
if !options.no_fail_fast {
break;
}
@ -137,48 +138,44 @@ fn run_doc_tests(
) -> CargoResult<(Test, Vec<ProcessError>)> {
let mut errors = Vec::new();
// The unstable doctest-xcompile feature enables both per-target-ignores and
// cross-compiling doctests. As a side effect, this feature also gates running
// doctests with runtools when target == host.
let doctest_xcompile = config.cli_unstable().doctest_xcompile;
let mut runtool: &Option<(std::path::PathBuf, Vec<String>)> = &None;
if doctest_xcompile {
runtool = compilation.target_runner();
} else if compilation.host != compilation.target {
return Ok((Test::Doc, errors));
}
for doctest_info in &compilation.to_doc_test {
let Doctest {
package,
target,
args,
unstable_opts,
unit,
} = doctest_info;
config.shell().status("Doc-tests", target.name())?;
let mut p = compilation.rustdoc_process(package, target)?;
p.arg("--test")
.arg(target.src_path().path().unwrap())
.arg("--crate-name")
.arg(&target.crate_name());
if doctest_xcompile {
if let CompileKind::Target(target) = options.compile_opts.build_config.requested_kind {
// Skip any `--target` tests unless `doctest-xcompile` is specified.
if !config.cli_unstable().doctest_xcompile && !unit.kind.is_host() {
continue;
}
config.shell().status("Doc-tests", unit.target.name())?;
let mut p = compilation.rustdoc_process(unit)?;
p.arg("--test")
.arg(unit.target.src_path().path().unwrap())
.arg("--crate-name")
.arg(&unit.target.crate_name());
if config.cli_unstable().doctest_xcompile {
if let CompileKind::Target(target) = unit.kind {
// use `rustc_target()` to properly handle JSON target paths
p.arg("--target").arg(target.rustc_target());
}
p.arg("-Zunstable-options");
p.arg("--enable-per-target-ignores");
}
if let Some((runtool, runtool_args)) = runtool {
if let Some((runtool, runtool_args)) = compilation.target_runner(unit.kind) {
p.arg("--runtool").arg(runtool);
for arg in runtool_args {
p.arg("--runtool-arg").arg(arg);
}
}
}
for &rust_dep in &[&compilation.deps_output] {
for &rust_dep in &[
&compilation.deps_output[&unit.kind],
&compilation.deps_output[&CompileKind::Host],
] {
let mut arg = OsString::from("dependency=");
arg.push(rust_dep);
p.arg("-L").arg(arg);
@ -188,17 +185,11 @@ fn run_doc_tests(
p.arg("-L").arg(native_dep);
}
for &host_rust_dep in &[&compilation.host_deps_output] {
let mut arg = OsString::from("dependency=");
arg.push(host_rust_dep);
p.arg("-L").arg(arg);
}
for arg in test_args {
p.arg("--test-args").arg(arg);
}
if let Some(cfgs) = compilation.cfgs.get(&package.package_id()) {
if let Some(cfgs) = compilation.cfgs.get(&unit.pkg.package_id()) {
for cfg in cfgs.iter() {
p.arg("--cfg").arg(cfg);
}
@ -212,7 +203,7 @@ fn run_doc_tests(
p.arg("-Zunstable-options");
}
if let Some(flags) = compilation.rustdocflags.get(&package.package_id()) {
if let Some(flags) = compilation.rustdocflags.get(&unit.pkg.package_id()) {
p.args(flags);
}
config

View File

@ -42,7 +42,7 @@ pub struct PublishOpts<'cfg> {
pub verify: bool,
pub allow_dirty: bool,
pub jobs: Option<u32>,
pub target: Option<String>,
pub targets: Vec<String>,
pub dry_run: bool,
pub registry: Option<String>,
pub features: Vec<String>,
@ -88,7 +88,7 @@ pub fn publish(ws: &Workspace<'_>, opts: &PublishOpts<'_>) -> CargoResult<()> {
list: false,
check_metadata: true,
allow_dirty: opts.allow_dirty,
target: opts.target.clone(),
targets: opts.targets.clone(),
jobs: opts.jobs,
features: opts.features.clone(),
all_features: opts.all_features,
@ -811,7 +811,7 @@ fn get_source_id(
reg: Option<&String>,
) -> CargoResult<SourceId> {
match (reg, index) {
(Some(r), _) => SourceId::alt_registry(config, &r),
(Some(r), _) => SourceId::alt_registry(config, r),
(_, Some(i)) => SourceId::for_registry(&i.into_url()?),
_ => {
let map = SourceConfigMap::new(config)?;

View File

@ -75,7 +75,7 @@ pub fn resolve_ws<'a>(ws: &Workspace<'a>) -> CargoResult<(PackageSet<'a>, Resolv
pub fn resolve_ws_with_opts<'cfg>(
ws: &Workspace<'cfg>,
target_data: &RustcTargetData,
requested_target: CompileKind,
requested_targets: &[CompileKind],
opts: &ResolveOpts,
specs: &[PackageIdSpec],
has_dev_units: HasDevUnits,
@ -127,7 +127,7 @@ pub fn resolve_ws_with_opts<'cfg>(
let pkg_set = get_resolved_packages(&resolved_with_overrides, registry)?;
let member_ids = ws
.members_with_features(&specs, &opts.features)?
.members_with_features(specs, &opts.features)?
.into_iter()
.map(|(p, _fts)| p.package_id())
.collect::<Vec<_>>();
@ -135,8 +135,8 @@ pub fn resolve_ws_with_opts<'cfg>(
&resolved_with_overrides,
&member_ids,
has_dev_units,
requested_target,
&target_data,
requested_targets,
target_data,
)?;
let resolved_features = FeatureResolver::resolve(
@ -146,7 +146,7 @@ pub fn resolve_ws_with_opts<'cfg>(
&pkg_set,
&opts.features,
specs,
requested_target,
requested_targets,
has_dev_units,
)?;

View File

@ -251,7 +251,7 @@ pub fn build<'a>(
specs: &[PackageIdSpec],
requested_features: &RequestedFeatures,
target_data: &RustcTargetData,
requested_kind: CompileKind,
requested_kinds: &[CompileKind],
package_map: HashMap<PackageId, &'a Package>,
opts: &TreeOptions,
) -> CargoResult<Graph<'a>> {
@ -261,6 +261,7 @@ pub fn build<'a>(
for (member, requested_features) in members_with_features {
let member_id = member.package_id();
let features_for = FeaturesFor::from_for_host(member.proc_macro());
for kind in requested_kinds {
let member_index = add_pkg(
&mut graph,
resolve,
@ -268,7 +269,7 @@ pub fn build<'a>(
member_id,
features_for,
target_data,
requested_kind,
*kind,
opts,
);
if opts.graph_features {
@ -276,6 +277,7 @@ pub fn build<'a>(
add_cli_features(&mut graph, member_index, &requested_features, fmap);
}
}
}
if opts.graph_features {
add_internal_features(&mut graph, resolve);
}

View File

@ -49,16 +49,16 @@ pub struct TreeOptions {
#[derive(PartialEq)]
pub enum Target {
Host,
Specific(String),
Specific(Vec<String>),
All,
}
impl Target {
pub fn from_cli(target: Option<&str>) -> Target {
match target {
None => Target::Host,
Some("all") => Target::All,
Some(target) => Target::Specific(target.to_string()),
pub fn from_cli(targets: Vec<String>) -> Target {
match targets.len() {
0 => Target::Host,
1 if targets[0] == "all" => Target::All,
_ => Target::Specific(targets),
}
}
}
@ -126,14 +126,14 @@ pub fn build_and_print(ws: &Workspace<'_>, opts: &TreeOptions) -> CargoResult<()
if opts.graph_features && opts.duplicates {
bail!("the `-e features` flag does not support `--duplicates`");
}
let requested_target = match &opts.target {
Target::All | Target::Host => None,
Target::Specific(t) => Some(t.as_ref()),
let requested_targets = match &opts.target {
Target::All | Target::Host => Vec::new(),
Target::Specific(t) => t.clone(),
};
// TODO: Target::All is broken with -Zfeatures=itarget. To handle that properly,
// `FeatureResolver` will need to be taught what "all" means.
let requested_kind = CompileKind::from_requested_target(ws.config(), requested_target)?;
let target_data = RustcTargetData::new(ws, requested_kind)?;
let requested_kinds = CompileKind::from_requested_targets(ws.config(), &requested_targets)?;
let target_data = RustcTargetData::new(ws, &requested_kinds)?;
let specs = opts.packages.to_package_id_specs(ws)?;
let resolve_opts = ResolveOpts::new(
/*dev_deps*/ true,
@ -152,7 +152,7 @@ pub fn build_and_print(ws: &Workspace<'_>, opts: &TreeOptions) -> CargoResult<()
let ws_resolve = ops::resolve_ws_with_opts(
ws,
&target_data,
requested_kind,
&requested_kinds,
&resolve_opts,
&specs,
has_dev,
@ -172,7 +172,7 @@ pub fn build_and_print(ws: &Workspace<'_>, opts: &TreeOptions) -> CargoResult<()
&specs,
&resolve_opts.features,
&target_data,
requested_kind,
&requested_kinds,
package_map,
opts,
)?;

View File

@ -96,6 +96,15 @@ impl<'cfg> PathSource<'cfg> {
/// are relevant for building this package, but it also contains logic to
/// use other methods like .gitignore to filter the list of files.
pub fn list_files(&self, pkg: &Package) -> CargoResult<Vec<PathBuf>> {
self._list_files(pkg).chain_err(|| {
format!(
"failed to determine list of files in {}",
pkg.root().display()
)
})
}
fn _list_files(&self, pkg: &Package) -> CargoResult<Vec<PathBuf>> {
let root = pkg.root();
let no_include_option = pkg.manifest().include().is_empty();
@ -111,17 +120,21 @@ impl<'cfg> PathSource<'cfg> {
}
let ignore_include = include_builder.build()?;
let ignore_should_package = |relative_path: &Path| -> CargoResult<bool> {
let ignore_should_package = |relative_path: &Path, is_dir: bool| -> CargoResult<bool> {
// "Include" and "exclude" options are mutually exclusive.
if no_include_option {
match ignore_exclude
.matched_path_or_any_parents(relative_path, /* is_dir */ false)
{
match ignore_exclude.matched_path_or_any_parents(relative_path, is_dir) {
Match::None => Ok(true),
Match::Ignore(_) => Ok(false),
Match::Whitelist(_) => Ok(true),
}
} else {
if is_dir {
// Generally, include directives don't list every
// directory (nor should they!). Just skip all directory
// checks, and only check files.
return Ok(true);
}
match ignore_include
.matched_path_or_any_parents(relative_path, /* is_dir */ false)
{
@ -132,7 +145,7 @@ impl<'cfg> PathSource<'cfg> {
}
};
let mut filter = |path: &Path| -> CargoResult<bool> {
let mut filter = |path: &Path, is_dir: bool| -> CargoResult<bool> {
let relative_path = path.strip_prefix(root)?;
let rel = relative_path.as_os_str();
@ -142,13 +155,13 @@ impl<'cfg> PathSource<'cfg> {
return Ok(true);
}
ignore_should_package(relative_path)
ignore_should_package(relative_path, is_dir)
};
// Attempt Git-prepopulate only if no `include` (see rust-lang/cargo#4135).
if no_include_option {
if let Some(result) = self.discover_git_and_list_files(pkg, root, &mut filter) {
return result;
if let Some(result) = self.discover_git_and_list_files(pkg, root, &mut filter)? {
return Ok(result);
}
// no include option and not git repo discovered (see rust-lang/cargo#7183).
return self.list_files_walk_except_dot_files_and_dirs(pkg, &mut filter);
@ -162,50 +175,53 @@ impl<'cfg> PathSource<'cfg> {
&self,
pkg: &Package,
root: &Path,
filter: &mut dyn FnMut(&Path) -> CargoResult<bool>,
) -> Option<CargoResult<Vec<PathBuf>>> {
// If this package is in a Git repository, then we really do want to
// query the Git repository as it takes into account items such as
// `.gitignore`. We're not quite sure where the Git repository is,
// however, so we do a bit of a probe.
//
// We walk this package's path upwards and look for a sibling
// `Cargo.toml` and `.git` directory. If we find one then we assume that
// we're part of that repository.
let mut cur = root;
loop {
if cur.join("Cargo.toml").is_file() {
// If we find a Git repository next to this `Cargo.toml`, we still
// check to see if we are indeed part of the index. If not, then
// this is likely an unrelated Git repo, so keep going.
if let Ok(repo) = git2::Repository::open(cur) {
let index = match repo.index() {
Ok(index) => index,
Err(err) => return Some(Err(err.into())),
filter: &mut dyn FnMut(&Path, bool) -> CargoResult<bool>,
) -> CargoResult<Option<Vec<PathBuf>>> {
let repo = match git2::Repository::discover(root) {
Ok(repo) => repo,
Err(e) => {
log::debug!(
"could not discover git repo at or above {}: {}",
root.display(),
e
);
return Ok(None);
}
};
let path = root.strip_prefix(cur).unwrap().join("Cargo.toml");
if index.get_path(&path, 0).is_some() {
return Some(self.list_files_git(pkg, &repo, filter));
let index = repo
.index()
.chain_err(|| format!("failed to open git index at {}", repo.path().display()))?;
let repo_root = repo.workdir().ok_or_else(|| {
anyhow::format_err!(
"did not expect repo at {} to be bare",
repo.path().display()
)
})?;
let repo_relative_path = match paths::strip_prefix_canonical(root, repo_root) {
Ok(p) => p,
Err(e) => {
log::warn!(
"cannot determine if path `{:?}` is in git repo `{:?}`: {:?}",
root,
repo_root,
e
);
return Ok(None);
}
};
let manifest_path = repo_relative_path.join("Cargo.toml");
if index.get_path(&manifest_path, 0).is_some() {
return Ok(Some(self.list_files_git(pkg, &repo, filter)?));
}
}
// Don't cross submodule boundaries.
if cur.join(".git").is_dir() {
break;
}
match cur.parent() {
Some(parent) => cur = parent,
None => break,
}
}
None
// Package Cargo.toml is not in git, don't use git to guide our selection.
Ok(None)
}
fn list_files_git(
&self,
pkg: &Package,
repo: &git2::Repository,
filter: &mut dyn FnMut(&Path) -> CargoResult<bool>,
filter: &mut dyn FnMut(&Path, bool) -> CargoResult<bool>,
) -> CargoResult<Vec<PathBuf>> {
warn!("list_files_git {}", pkg.package_id());
let index = repo.index()?;
@ -289,7 +305,10 @@ impl<'cfg> PathSource<'cfg> {
continue;
}
if is_dir.unwrap_or_else(|| file_path.is_dir()) {
// `is_dir` is None for symlinks. The `unwrap` checks if the
// symlink points to a directory.
let is_dir = is_dir.unwrap_or_else(|| file_path.is_dir());
if is_dir {
warn!(" found submodule {}", file_path.display());
let rel = file_path.strip_prefix(root)?;
let rel = rel.to_str().ok_or_else(|| {
@ -307,7 +326,8 @@ impl<'cfg> PathSource<'cfg> {
PathSource::walk(&file_path, &mut ret, false, filter)?;
}
}
} else if (*filter)(&file_path)? {
} else if (*filter)(&file_path, is_dir)? {
assert!(!is_dir);
// We found a file!
warn!(" found {}", file_path.display());
ret.push(file_path);
@ -338,20 +358,19 @@ impl<'cfg> PathSource<'cfg> {
fn list_files_walk_except_dot_files_and_dirs(
&self,
pkg: &Package,
filter: &mut dyn FnMut(&Path) -> CargoResult<bool>,
filter: &mut dyn FnMut(&Path, bool) -> CargoResult<bool>,
) -> CargoResult<Vec<PathBuf>> {
let root = pkg.root();
let mut exclude_dot_files_dir_builder = GitignoreBuilder::new(root);
exclude_dot_files_dir_builder.add_line(None, ".*")?;
let ignore_dot_files_and_dirs = exclude_dot_files_dir_builder.build()?;
let mut filter_ignore_dot_files_and_dirs = |path: &Path| -> CargoResult<bool> {
let mut filter_ignore_dot_files_and_dirs =
|path: &Path, is_dir: bool| -> CargoResult<bool> {
let relative_path = path.strip_prefix(root)?;
match ignore_dot_files_and_dirs
.matched_path_or_any_parents(relative_path, /* is_dir */ false)
{
match ignore_dot_files_and_dirs.matched_path_or_any_parents(relative_path, is_dir) {
Match::Ignore(_) => Ok(false),
_ => filter(path),
_ => filter(path, is_dir),
}
};
self.list_files_walk(pkg, &mut filter_ignore_dot_files_and_dirs)
@ -360,7 +379,7 @@ impl<'cfg> PathSource<'cfg> {
fn list_files_walk(
&self,
pkg: &Package,
filter: &mut dyn FnMut(&Path) -> CargoResult<bool>,
filter: &mut dyn FnMut(&Path, bool) -> CargoResult<bool>,
) -> CargoResult<Vec<PathBuf>> {
let mut ret = Vec::new();
PathSource::walk(pkg.root(), &mut ret, true, filter)?;
@ -371,12 +390,14 @@ impl<'cfg> PathSource<'cfg> {
path: &Path,
ret: &mut Vec<PathBuf>,
is_root: bool,
filter: &mut dyn FnMut(&Path) -> CargoResult<bool>,
filter: &mut dyn FnMut(&Path, bool) -> CargoResult<bool>,
) -> CargoResult<()> {
if !path.is_dir() {
if (*filter)(path)? {
ret.push(path.to_path_buf());
let is_dir = path.is_dir();
if !is_root && !(*filter)(path, is_dir)? {
return Ok(());
}
if !is_dir {
ret.push(path.to_path_buf());
return Ok(());
}
// Don't recurse into any sub-packages that we have.
@ -415,7 +436,12 @@ impl<'cfg> PathSource<'cfg> {
let mut max = FileTime::zero();
let mut max_path = PathBuf::new();
for file in self.list_files(pkg)? {
for file in self.list_files(pkg).chain_err(|| {
format!(
"failed to determine the most recently modified file in {}",
pkg.root().display()
)
})? {
// An `fs::stat` error here is either because path is a
// broken symlink, a permissions error, or a race
// condition where this path was `rm`-ed -- either way,

View File

@ -178,7 +178,7 @@ use crate::sources::PathSource;
use crate::util::errors::CargoResultExt;
use crate::util::hex;
use crate::util::into_url::IntoUrl;
use crate::util::{CargoResult, Config, Filesystem};
use crate::util::{restricted_names, CargoResult, Config, Filesystem};
const PACKAGE_SOURCE_LOCK: &str = ".cargo-ok";
pub const CRATES_IO_INDEX: &str = "https://github.com/rust-lang/crates.io-index";
@ -495,11 +495,18 @@ impl<'cfg> RegistrySource<'cfg> {
prefix
)
}
// Once that's verified, unpack the entry as usual.
entry
.unpack_in(parent)
.chain_err(|| format!("failed to unpack entry at `{}`", entry_path.display()))?;
// Unpacking failed
let mut result = entry.unpack_in(parent).map_err(anyhow::Error::from);
if cfg!(windows) && restricted_names::is_windows_reserved_path(&entry_path) {
result = result.chain_err(|| {
format!(
"`{}` appears to contain a reserved Windows path, \
it cannot be extracted on Windows",
entry_path.display()
)
});
}
result.chain_err(|| format!("failed to unpack entry at `{}`", entry_path.display()))?;
}
// Write to the lock file to indicate that unpacking was successful.

View File

@ -139,7 +139,7 @@ pub trait AppExt: Sized {
}
fn arg_target_triple(self, target: &'static str) -> Self {
self._arg(opt("target", target).value_name("TRIPLE"))
self._arg(multi_opt("target", target, "TRIPLE"))
}
fn arg_target_dir(self) -> Self {
@ -321,8 +321,8 @@ pub trait ArgMatchesExt {
self.value_of_u32("jobs")
}
fn target(&self) -> Option<String> {
self._value_of("target").map(|s| s.to_string())
fn targets(&self) -> Vec<String> {
self._values_of("target")
}
fn get_profile_name(
@ -454,7 +454,7 @@ pub trait ArgMatchesExt {
}
}
let mut build_config = BuildConfig::new(config, self.jobs()?, &self.target(), mode)?;
let mut build_config = BuildConfig::new(config, self.jobs()?, &self.targets(), mode)?;
build_config.message_format = message_format.unwrap_or(MessageFormat::Human);
build_config.requested_profile = self.get_profile_name(config, "dev", profile_checking)?;
build_config.build_plan = self._is_present("build-plan");

View File

@ -34,14 +34,13 @@ impl<'a> Retry<'a> {
}
fn maybe_spurious(err: &Error) -> bool {
for e in err.chain() {
if let Some(git_err) = e.downcast_ref::<git2::Error>() {
if let Some(git_err) = err.downcast_ref::<git2::Error>() {
match git_err.class() {
git2::ErrorClass::Net | git2::ErrorClass::Os => return true,
_ => (),
}
}
if let Some(curl_err) = e.downcast_ref::<curl::Error>() {
if let Some(curl_err) = err.downcast_ref::<curl::Error>() {
if curl_err.is_couldnt_connect()
|| curl_err.is_couldnt_resolve_proxy()
|| curl_err.is_couldnt_resolve_host()
@ -55,12 +54,11 @@ fn maybe_spurious(err: &Error) -> bool {
return true;
}
}
if let Some(not_200) = e.downcast_ref::<HttpNot200>() {
if let Some(not_200) = err.downcast_ref::<HttpNot200>() {
if 500 <= not_200.code && not_200.code < 600 {
return true;
}
}
}
false
}

View File

@ -393,3 +393,43 @@ fn _link_or_copy(src: &Path, dst: &Path) -> CargoResult<()> {
})?;
Ok(())
}
/// Changes the filesystem mtime (and atime if possible) for the given file.
///
/// This intentionally does not return an error, as this is sometimes not
/// supported on network filesystems. For the current uses in Cargo, this is a
/// "best effort" approach, and errors shouldn't be propagated.
pub fn set_file_time_no_err<P: AsRef<Path>>(path: P, time: FileTime) {
let path = path.as_ref();
match filetime::set_file_times(path, time, time) {
Ok(()) => log::debug!("set file mtime {} to {}", path.display(), time),
Err(e) => log::warn!(
"could not set mtime of {} to {}: {:?}",
path.display(),
time,
e
),
}
}
/// Strips `base` from `path`.
///
/// This canonicalizes both paths before stripping. This is useful if the
/// paths are obtained in different ways, and one or the other may or may not
/// have been normalized in some way.
pub fn strip_prefix_canonical<P: AsRef<Path>>(
path: P,
base: P,
) -> Result<PathBuf, std::path::StripPrefixError> {
// Not all filesystems support canonicalize. Just ignore if it doesn't work.
let safe_canonicalize = |path: &Path| match path.canonicalize() {
Ok(p) => p,
Err(e) => {
log::warn!("cannot canonicalize {:?}: {:?}", path, e);
path.to_path_buf()
}
};
let canon_path = safe_canonicalize(path.as_ref());
let canon_base = safe_canonicalize(base.as_ref());
canon_path.strip_prefix(canon_base).map(|p| p.to_path_buf())
}

View File

@ -2,6 +2,7 @@
use crate::util::CargoResult;
use anyhow::bail;
use std::path::Path;
/// Returns `true` if the name contains non-ASCII characters.
pub fn is_non_ascii_name(name: &str) -> bool {
@ -81,3 +82,13 @@ pub fn validate_package_name(name: &str, what: &str, help: &str) -> CargoResult<
}
Ok(())
}
// Check the entire path for names reserved in Windows.
pub fn is_windows_reserved_path(path: &Path) -> bool {
path.iter()
.filter_map(|component| component.to_str())
.any(|component| {
let stem = component.split('.').next().unwrap();
is_windows_reserved(stem)
})
}

View File

@ -48,7 +48,11 @@ alter the way the JSON messages are computed and rendered. See the description
of the `--message-format` option in the [build command documentation] for more
details.
If you are using Rust, the [cargo_metadata] crate can be used to parse these
messages.
[build command documentation]: ../commands/cargo-build.md
[cargo_metadata]: https://crates.io/crates/cargo_metadata
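The JSON stream is one message object per line on stdout. As a minimal, dependency-free sketch (hand-rolling the field extraction for illustration only; real tools should parse the JSON properly, which the `cargo_metadata` crate does), the top-level `reason` field of each message can be pulled out like this:

```rust
/// Extracts the top-level `"reason"` value from one line of
/// `cargo build --message-format=json` output. Illustrative only:
/// production tools should use a real JSON parser such as `cargo_metadata`.
fn message_reason(line: &str) -> Option<&str> {
    let key = "\"reason\":\"";
    // Find the key, then take everything up to the closing quote.
    let start = line.find(key)? + key.len();
    let end = start + line[start..].find('"')?;
    Some(&line[start..end])
}

fn main() {
    let line = r#"{"reason":"compiler-artifact","package_id":"foo 0.1.0"}"#;
    // Prints: compiler-artifact
    println!("{}", message_reason(line).unwrap());
}
```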
#### Compiler messages

View File

@ -99,6 +99,25 @@ information from `.cargo/config.toml`. See the rustc issue for more information.
cargo test --target foo -Zdoctest-xcompile
```
### multitarget
* Tracking Issue: [#8176](https://github.com/rust-lang/cargo/issues/8176)
This flag allows passing multiple `--target` flags to the selected `cargo`
subcommand. When multiple `--target` flags are passed, the selected build
targets will be built for each of the requested architectures.
For example, to compile a library for both 32-bit and 64-bit:
```
cargo build --target x86_64-unknown-linux-gnu --target i686-unknown-linux-gnu
```
or running tests for both targets:
```
cargo test --target x86_64-unknown-linux-gnu --target i686-unknown-linux-gnu
```
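Each requested target gets its own subdirectory under the target directory, so a build like the above is expected to leave artifacts laid out roughly as follows (illustrative, assuming the two triples shown):

```
target/x86_64-unknown-linux-gnu/debug/libfoo.rlib
target/i686-unknown-linux-gnu/debug/libfoo.rlib
```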
### Custom named profiles
* Tracking Issue: [rust-lang/cargo#6988](https://github.com/rust-lang/cargo/issues/6988)

View File

@ -3769,7 +3769,11 @@ fn cdylib_not_lifted() {
p.cargo("build").run();
let files = if cfg!(windows) {
if cfg!(target_env = "msvc") {
vec!["foo.dll.lib", "foo.dll.exp", "foo.dll"]
} else {
vec!["libfoo.dll.a", "foo.dll"]
}
} else if cfg!(target_os = "macos") {
vec!["libfoo.dylib"]
} else {
@ -3803,7 +3807,11 @@ fn cdylib_final_outputs() {
p.cargo("build").run();
let files = if cfg!(windows) {
if cfg!(target_env = "msvc") {
vec!["foo_bar.dll.lib", "foo_bar.dll"]
} else {
vec!["foo_bar.dll", "libfoo_bar.dll.a"]
}
} else if cfg!(target_os = "macos") {
vec!["libfoo_bar.dylib"]
} else {

View File

@ -1663,7 +1663,7 @@ fn build_script_with_dynamic_native_dependency() {
let src = root.join(&file);
let dst = out_dir.join(&file);
fs::copy(src, dst).unwrap();
if cfg!(windows) {
if cfg!(target_env = "msvc") {
fs::copy(root.join("builder.dll.lib"),
out_dir.join("builder.dll.lib")).unwrap();
}
@ -3977,7 +3977,9 @@ fn links_interrupted_can_restart() {
fn build_script_scan_eacces() {
// build.rs causes a scan of the whole project, which can be a problem if
// a directory is not accessible.
use cargo_test_support::git;
use std::os::unix::fs::PermissionsExt;
let p = project()
.file("src/lib.rs", "")
.file("build.rs", "fn main() {}")
@ -3985,12 +3987,21 @@ fn build_script_scan_eacces() {
.build();
let path = p.root().join("secrets");
fs::set_permissions(&path, fs::Permissions::from_mode(0)).unwrap();
// "Caused by" is a string from libc such as the following:
// The last "Caused by" is a string from libc such as the following:
// Permission denied (os error 13)
p.cargo("build")
.with_stderr(
"\
[ERROR] cannot read \"[..]/foo/secrets\"
[ERROR] failed to determine package fingerprint for build script for foo v0.0.1 ([..]/foo)
Caused by:
failed to determine the most recently modified file in [..]/foo
Caused by:
failed to determine list of files in [..]/foo
Caused by:
cannot read \"[..]/foo/secrets\"
Caused by:
[..]
@ -3998,5 +4009,28 @@ Caused by:
)
.with_status(101)
.run();
// Try `package.exclude` to skip a directory.
p.change_file(
"Cargo.toml",
r#"
[package]
name = "foo"
version = "0.0.1"
exclude = ["secrets"]
"#,
);
p.cargo("build").run();
// Try with git. This succeeds because the git status walker ignores
// directories it can't access.
p.change_file("Cargo.toml", &basic_manifest("foo", "0.0.1"));
p.build_dir().rm_rf();
let repo = git::init(&p.root());
git::add(&repo);
git::commit(&repo);
p.cargo("build").run();
// Restore permissions so that the directory can be deleted.
fs::set_permissions(&path, fs::Permissions::from_mode(0o755)).unwrap();
}

View File

@ -396,7 +396,6 @@ fn no_cross_doctests() {
[COMPILING] foo v0.0.1 ([CWD])
[FINISHED] test [unoptimized + debuginfo] target(s) in [..]
[RUNNING] target/{triple}/debug/deps/foo-[..][EXE]
[DOCTEST] foo
",
triple = target
))

View File

@ -8,52 +8,8 @@ use std::thread;
use cargo_test_support::{project, slow_cpu_multiplier};
#[cfg(unix)]
fn enabled() -> bool {
true
}
// On Windows support for these tests is only enabled through the usage of job
// objects. Support for nested job objects, however, was added in recent-ish
// versions of Windows, so this test may not always be able to succeed.
//
As a result, we try to add ourselves to a job object here to see whether this
can succeed or not.
#[cfg(windows)]
fn enabled() -> bool {
use winapi::um::{handleapi, jobapi, jobapi2, processthreadsapi};
unsafe {
// If we're not currently in a job, then we can definitely run these
// tests.
let me = processthreadsapi::GetCurrentProcess();
let mut ret = 0;
let r = jobapi::IsProcessInJob(me, 0 as *mut _, &mut ret);
assert_ne!(r, 0);
if ret == ::winapi::shared::minwindef::FALSE {
return true;
}
// If we are in a job, then we can run these tests if we can be added to
a nested job (as we're going to create a nested job no matter what as
part of these tests).
//
// If we can't be added to a nested job, then these tests will
// definitely fail, and there's not much we can do about that.
let job = jobapi2::CreateJobObjectW(0 as *mut _, 0 as *const _);
assert!(!job.is_null());
let r = jobapi2::AssignProcessToJobObject(job, me);
handleapi::CloseHandle(job);
r != 0
}
}
#[cargo_test]
fn ctrl_c_kills_everyone() {
if !enabled() {
return;
}
let listener = TcpListener::bind("127.0.0.1:0").unwrap();
let addr = listener.local_addr().unwrap();
@ -132,7 +88,7 @@ fn ctrl_c_kills_everyone() {
}
#[cfg(unix)]
fn ctrl_c(child: &mut Child) {
pub fn ctrl_c(child: &mut Child) {
let r = unsafe { libc::kill(-(child.id() as i32), libc::SIGINT) };
if r < 0 {
panic!("failed to kill: {}", io::Error::last_os_error());
@ -140,6 +96,6 @@ fn ctrl_c(child: &mut Child) {
}
#[cfg(windows)]
fn ctrl_c(child: &mut Child) {
pub fn ctrl_c(child: &mut Child) {
child.kill().unwrap();
}

View File

@ -1592,6 +1592,17 @@ fn resolver_enables_new_features() {
p.cargo("run --bin a")
.masquerade_as_nightly_cargo()
.env("EXPECTED_FEATS", "1")
.with_stderr(
"\
[UPDATING] [..]
[DOWNLOADING] crates ...
[DOWNLOADED] common [..]
[COMPILING] common v1.0.0
[COMPILING] a v0.1.0 [..]
[FINISHED] [..]
[RUNNING] `target/debug/a[EXE]`
",
)
.run();
// only normal+dev

View File

@ -10,6 +10,7 @@ use std::process::Stdio;
use std::thread;
use std::time::SystemTime;
use super::death;
use cargo_test_support::paths::{self, CargoPathExt};
use cargo_test_support::registry::Package;
use cargo_test_support::{basic_manifest, is_coarse_mtime, project, rustc_host, sleep_ms};
@ -2316,8 +2317,14 @@ LLVM version: 9.0
fn linking_interrupted() {
// Interrupt during the linking phase shouldn't leave test executable as "fresh".
let listener = TcpListener::bind("127.0.0.1:0").unwrap();
let addr = listener.local_addr().unwrap();
// This is used to detect when linking starts, then to pause the linker so
// that the test can kill cargo.
let link_listener = TcpListener::bind("127.0.0.1:0").unwrap();
let link_addr = link_listener.local_addr().unwrap();
// This is used to detect when rustc exits.
let rustc_listener = TcpListener::bind("127.0.0.1:0").unwrap();
let rustc_addr = rustc_listener.local_addr().unwrap();
// Create a linker that we can interrupt.
let linker = project()
@ -2326,8 +2333,6 @@ fn linking_interrupted() {
.file(
"src/main.rs",
&r#"
use std::io::Read;
fn main() {
// Figure out the output filename.
let output = match std::env::args().find(|a| a.starts_with("/OUT:")) {
@ -2346,43 +2351,79 @@ fn linking_interrupted() {
std::fs::write(&output, "").unwrap();
// Tell the test that we are ready to be interrupted.
let mut socket = std::net::TcpStream::connect("__ADDR__").unwrap();
// Wait for the test to tell us to exit.
let _ = socket.read(&mut [0; 1]);
// Wait for the test to kill us.
std::thread::sleep(std::time::Duration::new(60, 0));
}
"#
.replace("__ADDR__", &addr.to_string()),
.replace("__ADDR__", &link_addr.to_string()),
)
.build();
linker.cargo("build").run();
// Create a wrapper around rustc that will tell us when rustc is finished.
let rustc = project()
.at("rustc-waiter")
.file("Cargo.toml", &basic_manifest("rustc-waiter", "1.0.0"))
.file(
"src/main.rs",
&r#"
fn main() {
let mut conn = None;
// Check for a normal build (not -vV or --print).
if std::env::args().any(|arg| arg == "t1") {
// Tell the test that rustc has started.
conn = Some(std::net::TcpStream::connect("__ADDR__").unwrap());
}
let status = std::process::Command::new("rustc")
.args(std::env::args().skip(1))
.status()
.expect("rustc to run");
std::process::exit(status.code().unwrap_or(1));
}
"#
.replace("__ADDR__", &rustc_addr.to_string()),
)
.build();
rustc.cargo("build").run();
// Build it once so that the fingerprint gets saved to disk.
let p = project()
.file("src/lib.rs", "")
.file("tests/t1.rs", "")
.build();
p.cargo("test --test t1 --no-run").run();
// Make a change, start a build, then interrupt it.
p.change_file("src/lib.rs", "// modified");
let linker_env = format!(
"CARGO_TARGET_{}_LINKER",
rustc_host().to_uppercase().replace('-', "_")
);
// NOTE: This assumes that the paths to the linker or rustc are not in the
// fingerprint. But maybe they should be?
let mut cmd = p
.cargo("test --test t1 --no-run")
.env(&linker_env, linker.bin("linker"))
.env("RUSTC", rustc.bin("rustc-waiter"))
.build_command();
let mut child = cmd
.stdout(Stdio::null())
.stderr(Stdio::null())
.env("__CARGO_TEST_SETSID_PLEASE_DONT_USE_ELSEWHERE", "1")
.spawn()
.unwrap();
// Wait for rustc to start.
let mut rustc_conn = rustc_listener.accept().unwrap().0;
// Wait for linking to start.
let mut conn = listener.accept().unwrap().0;
drop(link_listener.accept().unwrap());
// Interrupt the child.
child.kill().unwrap();
// Note: rustc and the linker are still running, let them exit here.
conn.write(b"X").unwrap();
death::ctrl_c(&mut child);
assert!(!child.wait().unwrap().success());
// Wait for rustc to exit. If we don't wait, then the command below could
// start while rustc is still being torn down.
let mut buf = [0];
drop(rustc_conn.read_exact(&mut buf));
// Build again, shouldn't be fresh.
p.cargo("test --test t1")


@ -10,7 +10,9 @@ use cargo_test_support::install::{
};
use cargo_test_support::paths;
use cargo_test_support::registry::Package;
use cargo_test_support::{basic_manifest, cargo_process, project, NO_SUCH_FILE_ERR_MSG};
use cargo_test_support::{
basic_manifest, cargo_process, project, symlink_supported, t, NO_SUCH_FILE_ERR_MSG,
};
fn pkg(name: &str, vers: &str) {
Package::new(name, vers)
@ -1458,3 +1460,40 @@ fn git_install_reads_workspace_manifest() {
.with_stderr_contains(" invalid type: integer `3`[..]")
.run();
}
#[cargo_test]
fn install_git_with_symlink_home() {
// Ensure that `cargo install` with a git repo is OK when CARGO_HOME is a
    // symlink and uses a build script.
if !symlink_supported() {
return;
}
let p = git::new("foo", |p| {
p.file("Cargo.toml", &basic_manifest("foo", "1.0.0"))
.file("src/main.rs", "fn main() {}")
// This triggers discover_git_and_list_files for detecting changed files.
.file("build.rs", "fn main() {}")
});
#[cfg(unix)]
use std::os::unix::fs::symlink;
#[cfg(windows)]
use std::os::windows::fs::symlink_dir as symlink;
let actual = paths::root().join("actual-home");
t!(std::fs::create_dir(&actual));
t!(symlink(&actual, paths::home().join(".cargo")));
cargo_process("install --git")
.arg(p.url().to_string())
.with_stderr(
"\
[UPDATING] git repository [..]
[INSTALLING] foo v1.0.0 [..]
[COMPILING] foo v1.0.0 [..]
[FINISHED] [..]
[INSTALLING] [..]home/.cargo/bin/foo[..]
[INSTALLED] package `foo [..]
[WARNING] be sure to add [..]
",
)
.run();
}


@ -67,6 +67,7 @@ mod message_format;
mod metabuild;
mod metadata;
mod minimal_versions;
mod multitarget;
mod net_config;
mod new;
mod offline;


@ -0,0 +1,144 @@
//! Tests for multiple `--target` flags to subcommands
use cargo_test_support::{basic_manifest, cross_compile, project, rustc_host};
#[cargo_test]
fn double_target_rejected() {
let p = project()
.file("Cargo.toml", &basic_manifest("foo", "1.0.0"))
.file("src/main.rs", "fn main() {}")
.build();
p.cargo("build --target a --target b")
.with_stderr("error: specifying multiple `--target` flags requires `-Zmultitarget`")
.with_status(101)
.run();
}
#[cargo_test]
fn simple_build() {
if cross_compile::disabled() {
return;
}
let t1 = cross_compile::alternate();
let t2 = rustc_host();
let p = project()
.file("Cargo.toml", &basic_manifest("foo", "1.0.0"))
.file("src/main.rs", "fn main() {}")
.build();
p.cargo("build -Z multitarget")
.arg("--target")
.arg(&t1)
.arg("--target")
.arg(&t2)
.masquerade_as_nightly_cargo()
.run();
assert!(p.target_bin(&t1, "foo").is_file());
assert!(p.target_bin(&t2, "foo").is_file());
}
#[cargo_test]
fn simple_test() {
if !cross_compile::can_run_on_host() {
return;
}
let t1 = cross_compile::alternate();
let t2 = rustc_host();
let p = project()
.file("Cargo.toml", &basic_manifest("foo", "1.0.0"))
.file("src/lib.rs", "fn main() {}")
.build();
p.cargo("test -Z multitarget")
.arg("--target")
.arg(&t1)
.arg("--target")
.arg(&t2)
.masquerade_as_nightly_cargo()
.with_stderr_contains(&format!("[RUNNING] [..]{}[..]", t1))
.with_stderr_contains(&format!("[RUNNING] [..]{}[..]", t2))
.run();
}
#[cargo_test]
fn simple_run() {
let p = project()
.file("Cargo.toml", &basic_manifest("foo", "1.0.0"))
.file("src/main.rs", "fn main() {}")
.build();
p.cargo("run -Z multitarget --target a --target b")
.with_stderr("error: only one `--target` argument is supported")
.with_status(101)
.masquerade_as_nightly_cargo()
.run();
}
#[cargo_test]
fn simple_doc() {
if cross_compile::disabled() {
return;
}
let t1 = cross_compile::alternate();
let t2 = rustc_host();
let p = project()
.file("Cargo.toml", &basic_manifest("foo", "1.0.0"))
.file("src/lib.rs", "//! empty lib")
.build();
p.cargo("doc -Z multitarget")
.arg("--target")
.arg(&t1)
.arg("--target")
.arg(&t2)
.masquerade_as_nightly_cargo()
.run();
assert!(p.build_dir().join(&t1).join("doc/foo/index.html").is_file());
assert!(p.build_dir().join(&t2).join("doc/foo/index.html").is_file());
}
#[cargo_test]
fn simple_check() {
if cross_compile::disabled() {
return;
}
let t1 = cross_compile::alternate();
let t2 = rustc_host();
let p = project()
.file("Cargo.toml", &basic_manifest("foo", "1.0.0"))
.file("src/main.rs", "fn main() {}")
.build();
p.cargo("check -Z multitarget")
.arg("--target")
.arg(&t1)
.arg("--target")
.arg(&t2)
.masquerade_as_nightly_cargo()
.run();
}
#[cargo_test]
fn same_value_twice() {
if cross_compile::disabled() {
return;
}
let t = rustc_host();
let p = project()
.file("Cargo.toml", &basic_manifest("foo", "1.0.0"))
.file("src/main.rs", "fn main() {}")
.build();
p.cargo("build -Z multitarget")
.arg("--target")
.arg(&t)
.arg("--target")
.arg(&t)
.masquerade_as_nightly_cargo()
.run();
assert!(p.target_bin(&t, "foo").is_file());
}


@ -20,6 +20,7 @@ fn binary_with_debug() {
&["foo"],
&["foo", "foo.dSYM"],
&["foo.exe", "foo.pdb"],
&["foo.exe"],
);
}
@ -55,6 +56,7 @@ fn static_library_with_debug() {
&["libfoo.a"],
&["libfoo.a"],
&["foo.lib"],
&["libfoo.a"],
);
}
@ -90,6 +92,7 @@ fn dynamic_library_with_debug() {
&["libfoo.so"],
&["libfoo.dylib"],
&["foo.dll", "foo.dll.lib"],
&["foo.dll", "libfoo.dll.a"],
);
}
@ -124,6 +127,7 @@ fn rlib_with_debug() {
&["libfoo.rlib"],
&["libfoo.rlib"],
&["libfoo.rlib"],
&["libfoo.rlib"],
);
}
@ -167,6 +171,7 @@ fn include_only_the_binary_from_the_current_package() {
&["foo"],
&["foo", "foo.dSYM"],
&["foo.exe", "foo.pdb"],
&["foo.exe"],
);
}
@ -242,6 +247,7 @@ fn avoid_build_scripts() {
&["a", "b"],
&["a", "a.dSYM", "b", "b.dSYM"],
&["a.exe", "a.pdb", "b.exe", "b.pdb"],
&["a.exe", "b.exe"],
);
}
@ -266,6 +272,7 @@ fn cargo_build_out_dir() {
&["foo"],
&["foo", "foo.dSYM"],
&["foo.exe", "foo.pdb"],
&["foo.exe"],
);
}
@ -273,10 +280,15 @@ fn check_dir_contents(
out_dir: &Path,
expected_linux: &[&str],
expected_mac: &[&str],
expected_win: &[&str],
expected_win_msvc: &[&str],
expected_win_gnu: &[&str],
) {
let expected = if cfg!(target_os = "windows") {
expected_win
if cfg!(target_env = "msvc") {
expected_win_msvc
} else {
expected_win_gnu
}
} else if cfg!(target_os = "macos") {
expected_mac
} else {


@ -1,12 +1,12 @@
//! Tests for the `cargo package` command.
use cargo_test_support::paths::CargoPathExt;
use cargo_test_support::registry::Package;
use cargo_test_support::publish::validate_crate_contents;
use cargo_test_support::registry::{self, Package};
use cargo_test_support::{
basic_manifest, cargo_process, git, path2url, paths, project, publish::validate_crate_contents,
registry, symlink_supported, t,
basic_manifest, cargo_process, git, path2url, paths, project, symlink_supported, t,
};
use std::fs::{read_to_string, File};
use std::fs::{self, read_to_string, File};
use std::path::Path;
#[cargo_test]
@ -1691,3 +1691,157 @@ fn package_restricted_windows() {
)
.run();
}
#[cargo_test]
fn finds_git_in_parent() {
// Test where `Cargo.toml` is not in the root of the git repo.
let repo_path = paths::root().join("repo");
fs::create_dir(&repo_path).unwrap();
let p = project()
.at("repo/foo")
.file("Cargo.toml", &basic_manifest("foo", "0.1.0"))
.file("src/lib.rs", "")
.build();
let repo = git::init(&repo_path);
git::add(&repo);
git::commit(&repo);
p.change_file("ignoreme", "");
p.change_file("ignoreme2", "");
p.cargo("package --list --allow-dirty")
.with_stdout(
"\
Cargo.toml
Cargo.toml.orig
ignoreme
ignoreme2
src/lib.rs
",
)
.run();
p.change_file(".gitignore", "ignoreme");
p.cargo("package --list --allow-dirty")
.with_stdout(
"\
.gitignore
Cargo.toml
Cargo.toml.orig
ignoreme2
src/lib.rs
",
)
.run();
fs::write(repo_path.join(".gitignore"), "ignoreme2").unwrap();
p.cargo("package --list --allow-dirty")
.with_stdout(
"\
.gitignore
Cargo.toml
Cargo.toml.orig
src/lib.rs
",
)
.run();
}
#[cargo_test]
#[cfg(windows)]
fn reserved_windows_name() {
Package::new("bar", "1.0.0")
.file("src/lib.rs", "pub mod aux;")
.file("src/aux.rs", "")
.publish();
let p = project()
.file(
"Cargo.toml",
r#"
[project]
name = "foo"
version = "0.0.1"
authors = []
license = "MIT"
description = "foo"
[dependencies]
bar = "1.0.0"
"#,
)
.file("src/main.rs", "extern crate bar;\nfn main() { }")
.build();
p.cargo("package")
.with_status(101)
.with_stderr_contains(
"\
error: failed to verify package tarball
Caused by:
failed to download replaced source registry `[..]`
Caused by:
failed to unpack package `[..] `[..]`)`
Caused by:
failed to unpack entry at `[..]aux.rs`
Caused by:
`[..]aux.rs` appears to contain a reserved Windows path, it cannot be extracted on Windows
Caused by:
failed to unpack `[..]aux.rs`
Caused by:
failed to unpack `[..]aux.rs` into `[..]aux.rs`",
)
.run();
}
#[cargo_test]
fn list_with_path_and_lock() {
// Allow --list even for something that isn't packageable.
// Init an empty registry because a versionless path dep will search for
// the package on crates.io.
registry::init();
let p = project()
.file(
"Cargo.toml",
r#"
[package]
name = "foo"
version = "0.1.0"
license = "MIT"
description = "foo"
homepage = "foo"
[dependencies]
bar = {path="bar"}
"#,
)
.file("src/main.rs", "fn main() {}")
.file("bar/Cargo.toml", &basic_manifest("bar", "0.1.0"))
.file("bar/src/lib.rs", "")
.build();
p.cargo("package --list")
.with_stdout(
"\
Cargo.lock
Cargo.toml
Cargo.toml.orig
src/main.rs
",
)
.run();
p.cargo("package")
.with_status(101)
.with_stderr(
"\
error: all path dependencies must have a version specified when packaging.
dependency `bar` does not specify a version.
",
)
.run();
}


@ -180,7 +180,7 @@ fn plugin_with_dynamic_native_dependency() {
let src = root.join(&file);
let dst = out_dir.join(&file);
fs::copy(src, dst).unwrap();
if cfg!(windows) {
if cfg!(target_env = "msvc") {
fs::copy(root.join("builder.dll.lib"),
out_dir.join("builder.dll.lib")).unwrap();
}
@ -435,5 +435,5 @@ fn shared_panic_abort_plugins() {
.file("baz/src/lib.rs", "")
.build();
p.cargo("build").run();
p.cargo("build -v").run();
}


@ -405,7 +405,8 @@ fn named_config_profile() {
let dep_pkg = PackageId::new("dep", "0.1.0", crates_io).unwrap();
// normal package
let p = profiles.get_profile(a_pkg, true, UnitFor::new_normal(), CompileMode::Build);
let mode = CompileMode::Build;
let p = profiles.get_profile(a_pkg, true, true, UnitFor::new_normal(), mode);
assert_eq!(p.name, "foo");
assert_eq!(p.codegen_units, Some(2)); // "foo" from config
assert_eq!(p.opt_level, "1"); // "middle" from manifest
@ -414,7 +415,7 @@ fn named_config_profile() {
assert_eq!(p.overflow_checks, true); // "dev" built-in (ignore package override)
// build-override
let bo = profiles.get_profile(a_pkg, true, UnitFor::new_host(false), CompileMode::Build);
let bo = profiles.get_profile(a_pkg, true, true, UnitFor::new_host(false), mode);
assert_eq!(bo.name, "foo");
assert_eq!(bo.codegen_units, Some(6)); // "foo" build override from config
assert_eq!(bo.opt_level, "1"); // SAME as normal
@ -423,7 +424,7 @@ fn named_config_profile() {
assert_eq!(bo.overflow_checks, true); // SAME as normal
// package overrides
let po = profiles.get_profile(dep_pkg, false, UnitFor::new_normal(), CompileMode::Build);
let po = profiles.get_profile(dep_pkg, false, true, UnitFor::new_normal(), mode);
assert_eq!(po.name, "foo");
assert_eq!(po.codegen_units, Some(7)); // "foo" package override from config
assert_eq!(po.opt_level, "1"); // SAME as normal


@ -297,8 +297,8 @@ fn no_warn_workspace_extras() {
.cwd("a")
.with_stderr(
"\
[UPDATING] `[..]` index
[PACKAGING] a v0.1.0 ([..])
[UPDATING] `[..]` index
",
)
.run();
@ -328,10 +328,10 @@ fn warn_package_with_yanked() {
p.cargo("package --no-verify")
.with_stderr(
"\
[PACKAGING] foo v0.0.1 ([..])
[UPDATING] `[..]` index
[WARNING] package `bar v0.1.0` in Cargo.lock is yanked in registry \
`crates.io`, consider updating to a version that is not yanked
[PACKAGING] foo v0.0.1 ([..])
",
)
.run();
@ -469,6 +469,7 @@ fn ignore_lockfile_inner() {
"\
[PACKAGING] bar v0.0.1 ([..])
[ARCHIVING] .cargo_vcs_info.json
[ARCHIVING] .gitignore
[ARCHIVING] Cargo.lock
[ARCHIVING] Cargo.toml
[ARCHIVING] Cargo.toml.orig


@ -359,13 +359,18 @@ fn package_with_path_deps() {
.file("notyet/src/lib.rs", "")
.build();
p.cargo("package -v")
p.cargo("package")
.with_status(101)
.with_stderr_contains(
"\
[ERROR] no matching package named `notyet` found
location searched: registry [..]
required by package `foo v0.0.1 ([..])`
[PACKAGING] foo [..]
[UPDATING] [..]
[ERROR] failed to prepare local package for uploading
Caused by:
no matching package named `notyet` found
location searched: registry `https://github.com/rust-lang/crates.io-index`
required by package `foo v0.0.1 [..]`
",
)
.run();
@ -375,8 +380,8 @@ required by package `foo v0.0.1 ([..])`
p.cargo("package")
.with_stderr(
"\
[UPDATING] `[..]` index
[PACKAGING] foo v0.0.1 ([CWD])
[UPDATING] `[..]` index
[VERIFYING] foo v0.0.1 ([CWD])
[DOWNLOADING] crates ...
[DOWNLOADED] notyet v0.0.1 (registry `[ROOT][..]`)


@ -849,6 +849,7 @@ fn run_with_library_paths() {
fn main() {{
let search_path = std::env::var_os("{}").unwrap();
let paths = std::env::split_paths(&search_path).collect::<Vec<_>>();
println!("{{:#?}}", paths);
assert!(paths.contains(&r#"{}"#.into()));
assert!(paths.contains(&r#"{}"#.into()));
}}


@ -6,9 +6,9 @@ use std::env;
#[cargo_test]
fn rustc_info_cache() {
// TODO: need to gate this on nightly as soon as -Cbitcode-in-rlib lands in
// nightly
if true {
// Needs `-Cbitcode-in-rlib` to ride to stable before this can be enabled
// everywhere.
if !cargo_test_support::is_nightly() {
return;
}


@ -21,6 +21,12 @@ fn setup() -> Option<Setup> {
return None;
}
if cfg!(all(target_os = "windows", target_env = "gnu")) {
// FIXME: contains object files that we don't handle yet:
// https://github.com/rust-lang/wg-cargo-std-aware/issues/46
return None;
}
// Our mock sysroot requires a few packages from crates.io, so make sure
// they're "published" to crates.io. Also edit their code a bit to make sure
// that they have access to our custom crates with custom apis.
@ -297,7 +303,7 @@ fn lib_nostd() {
r#"
#![no_std]
pub fn foo() {
assert_eq!(core::u8::MIN, 0);
assert_eq!(u8::MIN, 0);
}
"#,
)
@ -512,7 +518,7 @@ fn doctest() {
)
.build();
p.cargo("test --doc -v")
p.cargo("test --doc -v -Zdoctest-xcompile")
.build_std(&setup)
.with_stdout_contains("test src/lib.rs - f [..] ... ok")
.target_host()
@ -570,3 +576,31 @@ fn macro_expanded_shadow() {
p.cargo("build -v").build_std(&setup).target_host().run();
}
#[cargo_test]
fn ignores_incremental() {
    // Incremental is not really needed for std, so make sure it is disabled.
// Incremental also tends to have bugs that affect std libraries more than
// any other crate.
let setup = match setup() {
Some(s) => s,
None => return,
};
let p = project().file("src/lib.rs", "").build();
p.cargo("build")
.env("CARGO_INCREMENTAL", "1")
.build_std(&setup)
.target_host()
.run();
let incremental: Vec<_> = p
.glob(format!("target/{}/debug/incremental/*", rustc_host()))
.map(|e| e.unwrap())
.collect();
assert_eq!(incremental.len(), 1);
assert!(incremental[0]
.file_name()
.unwrap()
.to_str()
.unwrap()
.starts_with("foo-"));
}


@ -129,7 +129,7 @@ fn cargo_test_overflow_checks() {
use std::panic;
pub fn main() {
let r = panic::catch_unwind(|| {
[1, i32::max_value()].iter().sum::<i32>();
[1, i32::MAX].iter().sum::<i32>();
});
assert!(r.is_err());
}"#,