ostype += 'eabihf'
elif cputype == 'aarch64':
cputype = 'aarch64'
+ elif cputype == 'mips':
+ if sys.byteorder == 'big':
+ cputype = 'mips'
+ elif sys.byteorder == 'little':
+ cputype = 'mipsel'
+ else:
+ raise ValueError('unknown byteorder: ' + sys.byteorder)
+ elif cputype == 'mips64':
+ if sys.byteorder == 'big':
+ cputype = 'mips64'
+ elif sys.byteorder == 'little':
+ cputype = 'mips64el'
+ else:
+ raise ValueError('unknown byteorder: ' + sys.byteorder)
+ # only the n64 ABI is supported, indicate it
+ ostype += 'abi64'
elif cputype in {'powerpc', 'ppc', 'ppc64'}:
cputype = 'powerpc'
elif cputype in {'amd64', 'x86_64', 'x86-64', 'x64'}:
pub fn error_index(build: &Build, compiler: &Compiler) {
println!("Testing error-index stage{}", compiler.stage);
- let output = testdir(build, compiler.host).join("error-index.md");
+ let dir = testdir(build, compiler.host);
+ t!(fs::create_dir_all(&dir));
+ let output = dir.join("error-index.md");
build.run(build.tool_cmd(compiler, "error_index_generator")
.arg("markdown")
.arg(&output)
First, we’ll install Rust. Then, the classic ‘Hello World’ program. Finally,
we’ll talk about Cargo, Rust’s build system and package manager.
+We’ll be showing off a number of commands using a terminal, and those lines all
+start with `$`. You don’t need to type in the `$`s; they are there to indicate
+the start of each command. You’ll see many tutorials and examples around the web
+that follow this convention: `$` for commands run as our regular user, and `#`
+for commands we should be running as an administrator.
+
# Installing Rust
The first step to using Rust is to install it. Generally speaking, you’ll need
an Internet connection to run the commands in this section, as we’ll be
downloading Rust from the Internet.
-We’ll be showing off a number of commands using a terminal, and those lines all
-start with `$`. You don't need to type in the `$`s, they are there to indicate
-the start of each command. We’ll see many tutorials and examples around the web
-that follow this convention: `$` for commands run as our regular user, and `#`
-for commands we should be running as an administrator.
+The Rust compiler runs on, and compiles to, a great number of platforms, but is
+best supported on Linux, Mac, and Windows, on the x86 and x86-64 CPU
+architectures. There are official builds of the Rust compiler and standard
+library for these platforms and more. [For full details on Rust platform support
+see the website][platform-support].
-## Platform support
-
-The Rust compiler runs on, and compiles to, a great number of platforms, though
-not all platforms are equally supported. Rust's support levels are organized
-into three tiers, each with a different set of guarantees.
-
-Platforms are identified by their "target triple" which is the string to inform
-the compiler what kind of output should be produced. The columns below indicate
-whether the corresponding component works on the specified platform.
-
-### Tier 1
-
-Tier 1 platforms can be thought of as "guaranteed to build and work".
-Specifically they will each satisfy the following requirements:
-
-* Automated testing is set up to run tests for the platform.
-* Landing changes to the `rust-lang/rust` repository's master branch is gated on
- tests passing.
-* Official release artifacts are provided for the platform.
-* Documentation for how to use and how to build the platform is available.
-
-| Target | std |rustc|cargo| notes |
-|-------------------------------|-----|-----|-----|----------------------------|
-| `i686-apple-darwin` | ✓ | ✓ | ✓ | 32-bit OSX (10.7+, Lion+) |
-| `i686-pc-windows-gnu` | ✓ | ✓ | ✓ | 32-bit MinGW (Windows 7+) |
-| `i686-pc-windows-msvc` | ✓ | ✓ | ✓ | 32-bit MSVC (Windows 7+) |
-| `i686-unknown-linux-gnu` | ✓ | ✓ | ✓ | 32-bit Linux (2.6.18+) |
-| `x86_64-apple-darwin` | ✓ | ✓ | ✓ | 64-bit OSX (10.7+, Lion+) |
-| `x86_64-pc-windows-gnu` | ✓ | ✓ | ✓ | 64-bit MinGW (Windows 7+) |
-| `x86_64-pc-windows-msvc` | ✓ | ✓ | ✓ | 64-bit MSVC (Windows 7+) |
-| `x86_64-unknown-linux-gnu` | ✓ | ✓ | ✓ | 64-bit Linux (2.6.18+) |
-
-### Tier 2
-
-Tier 2 platforms can be thought of as "guaranteed to build". Automated tests
-are not run so it's not guaranteed to produce a working build, but platforms
-often work to quite a good degree and patches are always welcome! Specifically,
-these platforms are required to have each of the following:
-
-* Automated building is set up, but may not be running tests.
-* Landing changes to the `rust-lang/rust` repository's master branch is gated on
- platforms **building**. Note that this means for some platforms only the
- standard library is compiled, but for others the full bootstrap is run.
-* Official release artifacts are provided for the platform.
-
-| Target | std |rustc|cargo| notes |
-|-------------------------------|-----|-----|-----|----------------------------|
-| `aarch64-apple-ios` | ✓ | | | ARM64 iOS |
-| `aarch64-unknown-linux-gnu` | ✓ | ✓ | ✓ | ARM64 Linux (2.6.18+) |
-| `arm-linux-androideabi` | ✓ | | | ARM Android |
-| `arm-unknown-linux-gnueabi` | ✓ | ✓ | ✓ | ARM Linux (2.6.18+) |
-| `arm-unknown-linux-gnueabihf` | ✓ | ✓ | ✓ | ARM Linux (2.6.18+) |
-| `armv7-apple-ios` | ✓ | | | ARM iOS |
-|`armv7-unknown-linux-gnueabihf`| ✓ | ✓ | ✓ | ARMv7 Linux (2.6.18+) |
-| `armv7s-apple-ios` | ✓ | | | ARM iOS |
-| `i386-apple-ios` | ✓ | | | 32-bit x86 iOS |
-| `i586-pc-windows-msvc` | ✓ | | | 32-bit Windows w/o SSE |
-| `mips-unknown-linux-gnu` | ✓ | | | MIPS Linux (2.6.18+) |
-| `mips-unknown-linux-musl` | ✓ | | | MIPS Linux with MUSL |
-| `mipsel-unknown-linux-gnu` | ✓ | | | MIPS (LE) Linux (2.6.18+) |
-| `mipsel-unknown-linux-musl` | ✓ | | | MIPS (LE) Linux with MUSL |
-| `powerpc-unknown-linux-gnu` | ✓ | | | PowerPC Linux (2.6.18+) |
-| `powerpc64-unknown-linux-gnu` | ✓ | | | PPC64 Linux (2.6.18+) |
-|`powerpc64le-unknown-linux-gnu`| ✓ | | | PPC64LE Linux (2.6.18+) |
-| `x86_64-apple-ios` | ✓ | | | 64-bit x86 iOS |
-| `x86_64-rumprun-netbsd` | ✓ | | | 64-bit NetBSD Rump Kernel |
-| `x86_64-unknown-freebsd` | ✓ | ✓ | ✓ | 64-bit FreeBSD |
-| `x86_64-unknown-linux-musl` | ✓ | | | 64-bit Linux with MUSL |
-| `x86_64-unknown-netbsd` | ✓ | ✓ | ✓ | 64-bit NetBSD |
-
-### Tier 3
-
-Tier 3 platforms are those which Rust has support for, but landing changes is
-not gated on the platform either building or passing tests. Working builds for
-these platforms may be spotty as their reliability is often defined in terms of
-community contributions. Additionally, release artifacts and installers are not
-provided, but there may be community infrastructure producing these in
-unofficial locations.
-
-| Target | std |rustc|cargo| notes |
-|-------------------------------|-----|-----|-----|----------------------------|
-| `aarch64-linux-android` | ✓ | | | ARM64 Android |
-| `armv7-linux-androideabi` | ✓ | | | ARM-v7a Android |
-| `i686-linux-android` | ✓ | | | 32-bit x86 Android |
-| `i686-pc-windows-msvc` (XP) | ✓ | | | Windows XP support |
-| `i686-unknown-freebsd` | ✓ | ✓ | ✓ | 32-bit FreeBSD |
-| `x86_64-pc-windows-msvc` (XP) | ✓ | | | Windows XP support |
-| `x86_64-sun-solaris` | ✓ | ✓ | | 64-bit Solaris/SunOS |
-| `x86_64-unknown-bitrig` | ✓ | ✓ | | 64-bit Bitrig |
-| `x86_64-unknown-dragonfly` | ✓ | ✓ | | 64-bit DragonFlyBSD |
-| `x86_64-unknown-openbsd` | ✓ | ✓ | | 64-bit OpenBSD |
-
-Note that this table can be expanded over time, this isn't the exhaustive set of
-tier 3 platforms that will ever be!
+[platform-support]: https://forge.rust-lang.org/platform-support.html
## Installing on Linux or Mac
```bash
$ cd ~/projects
$ cargo new guessing_game --bin
+ Created binary (application) `guessing_game` project
$ cd guessing_game
```
```bash
$ cargo build
Compiling guessing_game v0.1.0 (file:///home/you/projects/guessing_game)
+ Finished debug [unoptimized + debuginfo] target(s) in 0.53 secs
```
Excellent! Open up your `src/main.rs` again. We’ll be writing all of
```bash
$ cargo run
Compiling guessing_game v0.1.0 (file:///home/you/projects/guessing_game)
+ Finished debug [unoptimized + debuginfo] target(s) in 0.0 secs
Running `target/debug/guessing_game`
Hello, world!
```
```bash
$ cargo build
Compiling guessing_game v0.1.0 (file:///home/you/projects/guessing_game)
-src/main.rs:10:5: 10:39 warning: unused result which must be used,
-#[warn(unused_must_use)] on by default
-src/main.rs:10 io::stdin().read_line(&mut guess);
- ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+warning: unused result which must be used, #[warn(unused_must_use)] on by default
+ --> src/main.rs:10:5
+ |
+10 | io::stdin().read_line(&mut guess);
+ | ^
+
+ Finished debug [unoptimized + debuginfo] target(s) in 0.42 secs
```
Rust warns us that we haven’t used the `Result` value. This warning comes from
```bash
$ cargo run
Compiling guessing_game v0.1.0 (file:///home/you/projects/guessing_game)
+ Finished debug [unoptimized + debuginfo] target(s) in 0.44 secs
Running `target/debug/guessing_game`
Guess the number!
Please input your guess.
```bash
$ cargo build
Updating registry `https://github.com/rust-lang/crates.io-index`
- Downloading rand v0.3.8
- Downloading libc v0.1.6
- Compiling libc v0.1.6
- Compiling rand v0.3.8
+ Downloading rand v0.3.14
+ Downloading libc v0.2.17
+ Compiling libc v0.2.17
+ Compiling rand v0.3.14
Compiling guessing_game v0.1.0 (file:///home/you/projects/guessing_game)
+ Finished debug [unoptimized + debuginfo] target(s) in 5.88 secs
```
(You may see different versions, of course.)
```bash
$ cargo build
+ Finished debug [unoptimized + debuginfo] target(s) in 0.0 secs
```
-That’s right, no output! Cargo knows that our project has been built, and that
+That’s right, nothing was done! Cargo knows that our project has been built, and that
all of its dependencies are built, and so there’s no reason to do all that
stuff. With nothing to do, it simply exits. If we open up `src/main.rs` again,
-make a trivial change, and then save it again, we’ll only see one line:
+make a trivial change, and then save it again, we’ll only see two lines:
```bash
$ cargo build
Compiling guessing_game v0.1.0 (file:///home/you/projects/guessing_game)
+ Finished debug [unoptimized + debuginfo] target(s) in 0.45 secs
```
So, we told Cargo we wanted any `0.3.x` version of `rand`, and so it fetched the latest
-version at the time this was written, `v0.3.8`. But what happens when next
-week, version `v0.3.9` comes out, with an important bugfix? While getting
-bugfixes is important, what if `0.3.9` contains a regression that breaks our
+version at the time this was written, `v0.3.14`. But what happens when next
+week, version `v0.3.15` comes out, with an important bugfix? While getting
+bugfixes is important, what if `0.3.15` contains a regression that breaks our
code?
The answer to this problem is the `Cargo.lock` file you’ll now find in your
to the `Cargo.lock` file. When you build your project in the future, Cargo
will see that the `Cargo.lock` file exists, and then use that specific version
rather than do all the work of figuring out versions again. This lets you
-have a repeatable build automatically. In other words, we’ll stay at `0.3.8`
+have a repeatable build automatically. In other words, we’ll stay at `0.3.14`
until we explicitly upgrade, and so will anyone who we share our code with,
thanks to the lock file.
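The pinning behavior above hinges on how the dependency was declared. As a hypothetical `Cargo.toml` fragment (the crate name matches the book's example; the requirement syntax is standard Cargo semver):

```toml
[dependencies]
# A "0.3"-style requirement accepts any 0.3.x release; Cargo.lock then
# pins the one actually resolved (0.3.14 at the time this was written).
rand = "0.3"
```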
-What about when we _do_ want to use `v0.3.9`? Cargo has another command,
+What about when we _do_ want to use `v0.3.15`? Cargo has another command,
`update`, which says ‘ignore the lock, figure out all the latest versions that
fit what we’ve specified. If that works, write those versions out to the lock
file’. But, by default, Cargo will only look for versions larger than `0.3.0`
```bash
$ cargo run
Compiling guessing_game v0.1.0 (file:///home/you/projects/guessing_game)
+ Finished debug [unoptimized + debuginfo] target(s) in 0.55 secs
Running `target/debug/guessing_game`
Guess the number!
The secret number is: 7
4
You guessed: 4
$ cargo run
+ Finished debug [unoptimized + debuginfo] target(s) in 0.0 secs
Running `target/debug/guessing_game`
Guess the number!
The secret number is: 83
```bash
$ cargo build
Compiling guessing_game v0.1.0 (file:///home/you/projects/guessing_game)
-src/main.rs:28:21: 28:35 error: mismatched types:
- expected `&collections::string::String`,
- found `&_`
-(expected struct `collections::string::String`,
- found integral variable) [E0308]
-src/main.rs:28 match guess.cmp(&secret_number) {
- ^~~~~~~~~~~~~~
+error[E0308]: mismatched types
+ --> src/main.rs:23:21
+ |
+23 | match guess.cmp(&secret_number) {
+ | ^^^^^^^^^^^^^^ expected struct `std::string::String`, found integral variable
+ |
+ = note: expected type `&std::string::String`
+ = note: found type `&{integer}`
+
error: aborting due to previous error
-Could not compile `guessing_game`.
+
+error: Could not compile `guessing_game`.
+
+To learn more, run the command again with --verbose.
```
Whew! This is a big error. The core of it is that we have ‘mismatched types’.
```bash
$ cargo run
Compiling guessing_game v0.1.0 (file:///home/you/projects/guessing_game)
+ Finished debug [unoptimized + debuginfo] target(s) in 0.57 secs
     Running `target/debug/guessing_game`
Guess the number!
The secret number is: 58
```bash
$ cargo run
Compiling guessing_game v0.1.0 (file:///home/you/projects/guessing_game)
+ Finished debug [unoptimized + debuginfo] target(s) in 0.58 secs
     Running `target/debug/guessing_game`
Guess the number!
The secret number is: 59
```bash
$ cargo run
Compiling guessing_game v0.1.0 (file:///home/you/projects/guessing_game)
+ Finished debug [unoptimized + debuginfo] target(s) in 0.57 secs
     Running `target/debug/guessing_game`
Guess the number!
The secret number is: 61
```rust,ignore
struct Point {
- mut x: i32,
+ mut x: i32, // This causes an error.
y: i32,
}
```
of the features that one might expect to be language features are library
features in Rust, so what you're looking for may be there, not here.
+Finally, this document is not normative. It may include details that are
+specific to `rustc` itself, and should not be taken as a specification for
+the Rust language. We intend to produce such a document someday, but this
+is what we have for now.
+
You may also be interested in the [grammar].
[book]: book/index.html
* - `abi_vectorcall` - Allows the usage of the vectorcall calling convention
(e.g. `extern "vectorcall" func fn_();`)
-* - `dotdot_in_tuple_patterns` - Allows `..` in tuple (struct) patterns.
-
* - `abi_sysv64` - Allows the usage of the system V AMD64 calling convention
(e.g. `extern "sysv64" func fn_();`)
use core::convert::From;
use heap::deallocate;
+/// A soft limit on the amount of references that may be made to an `Arc`.
+///
+/// Going above this limit will abort your program (although not
+/// necessarily) at _exactly_ `MAX_REFCOUNT + 1` references.
const MAX_REFCOUNT: usize = (isize::MAX) as usize;
/// A thread-safe reference-counting pointer.
Ok(elem)
}
}
+
+ /// Consumes the `Arc`, returning the wrapped pointer.
+ ///
+ /// To avoid a memory leak, the pointer must be converted back to an `Arc` using
+ /// [`Arc::from_raw`][from_raw].
+ ///
+ /// [from_raw]: struct.Arc.html#method.from_raw
+ ///
+ /// # Examples
+ ///
+ /// ```
+ /// #![feature(rc_raw)]
+ ///
+ /// use std::sync::Arc;
+ ///
+ /// let x = Arc::new(10);
+ /// let x_ptr = Arc::into_raw(x);
+ /// assert_eq!(unsafe { *x_ptr }, 10);
+ /// ```
+ #[unstable(feature = "rc_raw", issue = "37197")]
+ pub fn into_raw(this: Self) -> *mut T {
+ let ptr = unsafe { &mut (**this.ptr).data as *mut _ };
+ mem::forget(this);
+ ptr
+ }
+
+ /// Constructs an `Arc` from a raw pointer.
+ ///
+ /// The raw pointer must have been previously returned by a call to
+ /// [`Arc::into_raw`][into_raw].
+ ///
+ /// This function is unsafe because improper use may lead to memory problems. For example, a
+ /// double-free may occur if the function is called twice on the same raw pointer.
+ ///
+ /// [into_raw]: struct.Arc.html#method.into_raw
+ ///
+ /// # Examples
+ ///
+ /// ```
+ /// #![feature(rc_raw)]
+ ///
+ /// use std::sync::Arc;
+ ///
+ /// let x = Arc::new(10);
+ /// let x_ptr = Arc::into_raw(x);
+ ///
+ /// unsafe {
+ /// // Convert back to an `Arc` to prevent leak.
+ /// let x = Arc::from_raw(x_ptr);
+ /// assert_eq!(*x, 10);
+ ///
+ /// // Further calls to `Arc::from_raw(x_ptr)` would be memory unsafe.
+ /// }
+ ///
+ /// // The memory was freed when `x` went out of scope above, so `x_ptr` is now dangling!
+ /// ```
+ #[unstable(feature = "rc_raw", issue = "37197")]
+ pub unsafe fn from_raw(ptr: *mut T) -> Self {
+ // To find the corresponding pointer to the `ArcInner` we need to subtract the offset of the
+ // `data` field from the pointer.
+ Arc { ptr: Shared::new((ptr as *mut u8).offset(-offset_of!(ArcInner<T>, data)) as *mut _) }
+ }
}
impl<T: ?Sized> Arc<T> {
/// Gets the number of [`Weak`][weak] pointers to this value.
///
- /// Be careful how you use this information, because another thread
- /// may change the weak count at any time.
- ///
/// [weak]: struct.Weak.html
///
+ /// # Safety
+ ///
+ /// This method by itself is safe, but using it correctly requires extra care.
+ /// Another thread can change the weak count at any time,
+ /// including potentially between calling this method and acting on the result.
+ ///
/// # Examples
///
/// ```
/// Gets the number of strong (`Arc`) pointers to this value.
///
- /// Be careful how you use this information, because another thread
- /// may change the strong count at any time.
+ /// # Safety
+ ///
+ /// This method by itself is safe, but using it correctly requires extra care.
+ /// Another thread can change the strong count at any time,
+ /// including potentially between calling this method and acting on the result.
///
/// # Examples
///
assert_eq!(Arc::try_unwrap(x), Ok(5));
}
+ #[test]
+ fn into_from_raw() {
+ let x = Arc::new(box "hello");
+ let y = x.clone();
+
+ let x_ptr = Arc::into_raw(x);
+ drop(y);
+ unsafe {
+ assert_eq!(**x_ptr, "hello");
+
+ let x = Arc::from_raw(x_ptr);
+ assert_eq!(**x, "hello");
+
+ assert_eq!(Arc::try_unwrap(x).map(|x| *x), Ok("hello"));
+ }
+ }
+
#[test]
fn test_cowarc_clone_make_mut() {
let mut cow0 = Arc::new(75);
#[macro_use]
extern crate std;
+// Module with internal macros used by other modules (needs to be included before other modules).
+#[macro_use]
+mod macros;
+
// Heaps provided for low-level allocation strategies
pub mod heap;
--- /dev/null
+// Copyright 2013-2016 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+// Private macro to get the offset of a struct field in bytes from the address of the struct.
+macro_rules! offset_of {
+ ($container:path, $field:ident) => {{
+ // Make sure the field actually exists. This line ensures that a compile-time error is
+ // generated if $field is accessed through a Deref impl.
+ let $container { $field : _, .. };
+
+ // Create an (invalid) instance of the container and calculate the offset to its
+ // field. Using a null pointer might be UB if `&(*(0 as *const T)).field` is interpreted to
+ // be nullptr deref.
+ let invalid: $container = ::core::mem::uninitialized();
+ let offset = &invalid.$field as *const _ as usize - &invalid as *const _ as usize;
+
+ // Do not run destructors on the made up invalid instance.
+ ::core::mem::forget(invalid);
+ offset as isize
+ }};
+}
pub fn would_unwrap(this: &Self) -> bool {
Rc::strong_count(&this) == 1
}
+
+ /// Consumes the `Rc`, returning the wrapped pointer.
+ ///
+ /// To avoid a memory leak, the pointer must be converted back to an `Rc` using
+ /// [`Rc::from_raw`][from_raw].
+ ///
+ /// [from_raw]: struct.Rc.html#method.from_raw
+ ///
+ /// # Examples
+ ///
+ /// ```
+ /// #![feature(rc_raw)]
+ ///
+ /// use std::rc::Rc;
+ ///
+ /// let x = Rc::new(10);
+ /// let x_ptr = Rc::into_raw(x);
+ /// assert_eq!(unsafe { *x_ptr }, 10);
+ /// ```
+ #[unstable(feature = "rc_raw", issue = "37197")]
+ pub fn into_raw(this: Self) -> *mut T {
+ let ptr = unsafe { &mut (**this.ptr).value as *mut _ };
+ mem::forget(this);
+ ptr
+ }
+
+ /// Constructs an `Rc` from a raw pointer.
+ ///
+ /// The raw pointer must have been previously returned by a call to
+ /// [`Rc::into_raw`][into_raw].
+ ///
+ /// This function is unsafe because improper use may lead to memory problems. For example, a
+ /// double-free may occur if the function is called twice on the same raw pointer.
+ ///
+ /// [into_raw]: struct.Rc.html#method.into_raw
+ ///
+ /// # Examples
+ ///
+ /// ```
+ /// #![feature(rc_raw)]
+ ///
+ /// use std::rc::Rc;
+ ///
+ /// let x = Rc::new(10);
+ /// let x_ptr = Rc::into_raw(x);
+ ///
+ /// unsafe {
+ /// // Convert back to an `Rc` to prevent leak.
+ /// let x = Rc::from_raw(x_ptr);
+ /// assert_eq!(*x, 10);
+ ///
+ /// // Further calls to `Rc::from_raw(x_ptr)` would be memory unsafe.
+ /// }
+ ///
+ /// // The memory was freed when `x` went out of scope above, so `x_ptr` is now dangling!
+ /// ```
+ #[unstable(feature = "rc_raw", issue = "37197")]
+ pub unsafe fn from_raw(ptr: *mut T) -> Self {
+ // To find the corresponding pointer to the `RcBox` we need to subtract the offset of the
+ // `value` field from the pointer.
+ Rc { ptr: Shared::new((ptr as *mut u8).offset(-offset_of!(RcBox<T>, value)) as *mut _) }
+ }
}
impl Rc<str> {
assert_eq!(Rc::try_unwrap(x), Ok(5));
}
+ #[test]
+ fn into_from_raw() {
+ let x = Rc::new(box "hello");
+ let y = x.clone();
+
+ let x_ptr = Rc::into_raw(x);
+ drop(y);
+ unsafe {
+ assert_eq!(**x_ptr, "hello");
+
+ let x = Rc::from_raw(x_ptr);
+ assert_eq!(**x, "hello");
+
+ assert_eq!(Rc::try_unwrap(x).map(|x| *x), Ok("hello"));
+ }
+ }
+
#[test]
fn get_mut() {
let mut x = Rc::new(3);
#[inline]
fn add_one(&self) -> Self {
- *self + 1
+ Add::add(*self, 1)
}
#[inline]
fn sub_one(&self) -> Self {
- *self - 1
+ Sub::sub(*self, 1)
}
#[inline]
#[inline]
fn add_one(&self) -> Self {
- *self + 1
+ Add::add(*self, 1)
}
#[inline]
fn sub_one(&self) -> Self {
- *self - 1
+ Sub::sub(*self, 1)
}
#[inline]
#[inline]
fn add_one(&self) -> Self {
- *self + 1
+ Add::add(*self, 1)
}
#[inline]
fn sub_one(&self) -> Self {
- *self - 1
+ Sub::sub(*self, 1)
}
#[inline]
/// Trait to represent types that can be created by summing up an iterator.
///
-/// This trait is used to implement the `sum` method on iterators. Types which
-/// implement the trait can be generated by the `sum` method. Like
-/// `FromIterator` this trait should rarely be called directly and instead
-/// interacted with through `Iterator::sum`.
+/// This trait is used to implement the [`sum()`] method on iterators. Types which
+/// implement the trait can be generated by the [`sum()`] method. Like
+/// [`FromIterator`] this trait should rarely be called directly and instead
+/// interacted with through [`Iterator::sum()`].
+///
+/// [`sum()`]: ../../std/iter/trait.Sum.html#tymethod.sum
+/// [`FromIterator`]: ../../std/iter/trait.FromIterator.html
+/// [`Iterator::sum()`]: ../../std/iter/trait.Iterator.html#method.sum
#[stable(feature = "iter_arith_traits", since = "1.12.0")]
pub trait Sum<A = Self>: Sized {
/// Method which takes an iterator and generates `Self` from the elements by
/// Trait to represent types that can be created by multiplying elements of an
/// iterator.
///
-/// This trait is used to implement the `product` method on iterators. Types
-/// which implement the trait can be generated by the `product` method. Like
-/// `FromIterator` this trait should rarely be called directly and instead
-/// interacted with through `Iterator::product`.
+/// This trait is used to implement the [`product()`] method on iterators. Types
+/// which implement the trait can be generated by the [`product()`] method. Like
+/// [`FromIterator`] this trait should rarely be called directly and instead
+/// interacted with through [`Iterator::product()`].
+///
+/// [`product()`]: ../../std/iter/trait.Product.html#tymethod.product
+/// [`FromIterator`]: ../../std/iter/trait.FromIterator.html
+/// [`Iterator::product()`]: ../../std/iter/trait.Iterator.html#method.product
#[stable(feature = "iter_arith_traits", since = "1.12.0")]
pub trait Product<A = Self>: Sized {
/// Method which takes an iterator and generates `Self` from the elements by
///
/// assert_eq!(w, b"testformatted arguments");
/// ```
+///
+/// A module can import both `std::fmt::Write` and `std::io::Write` and call `write!` on objects
+/// implementing either, as objects do not typically implement both. However, the module must
+/// import the traits qualified so their names do not conflict:
+///
+/// ```
+/// use std::fmt::Write as FmtWrite;
+/// use std::io::Write as IoWrite;
+///
+/// let mut s = String::new();
+/// let mut v = Vec::new();
+/// write!(&mut s, "{} {}", "abc", 123).unwrap(); // uses fmt::Write::write_fmt
+/// write!(&mut v, "s = {:?}", s).unwrap(); // uses io::Write::write_fmt
+/// assert_eq!(v, b"s = \"abc 123\"");
+/// ```
#[macro_export]
#[stable(feature = "core", since = "1.6.0")]
macro_rules! write {
///
/// assert_eq!(&w[..], "test\nformatted arguments\n".as_bytes());
/// ```
+///
+/// A module can import both `std::fmt::Write` and `std::io::Write` and call `write!` on objects
+/// implementing either, as objects do not typically implement both. However, the module must
+/// import the traits qualified so their names do not conflict:
+///
+/// ```
+/// use std::fmt::Write as FmtWrite;
+/// use std::io::Write as IoWrite;
+///
+/// let mut s = String::new();
+/// let mut v = Vec::new();
+/// writeln!(&mut s, "{} {}", "abc", 123).unwrap(); // uses fmt::Write::write_fmt
+/// writeln!(&mut v, "s = {:?}", s).unwrap(); // uses io::Write::write_fmt
+/// assert_eq!(v, b"s = \"abc 123\\n\"\n");
+/// ```
#[macro_export]
#[stable(feature = "rust1", since = "1.0.0")]
macro_rules! writeln {
#[cfg(any(target_arch = "arm", target_arch = "aarch64"))]
const UNWIND_DATA_REG: (i32, i32) = (0, 1); // R0, R1 / X0, X1
-#[cfg(any(target_arch = "mips", target_arch = "mipsel", target_arch = "mips64"))]
+#[cfg(any(target_arch = "mips", target_arch = "mips64"))]
const UNWIND_DATA_REG: (i32, i32) = (4, 5); // A0, A1
#[cfg(any(target_arch = "powerpc", target_arch = "powerpc64"))]
// except according to those terms.
use hir::def_id::DefId;
-use rustc_data_structures::fnv::FnvHashMap;
+use rustc_data_structures::fx::FxHashMap;
use std::cell::RefCell;
use std::ops::Index;
use std::hash::Hash;
pub struct DepTrackingMap<M: DepTrackingMapConfig> {
phantom: PhantomData<M>,
graph: DepGraph,
- map: FnvHashMap<M::Key, M::Value>,
+ map: FxHashMap<M::Key, M::Value>,
}
pub trait DepTrackingMapConfig {
DepTrackingMap {
phantom: PhantomData,
graph: graph,
- map: FnvHashMap()
+ map: FxHashMap()
}
}
// option. This file may not be copied, modified, or distributed
// except according to those terms.
-use rustc_data_structures::fnv::{FnvHashMap, FnvHashSet};
+use rustc_data_structures::fx::{FxHashMap, FxHashSet};
use std::fmt::Debug;
use std::hash::Hash;
use super::{DepGraphQuery, DepNode};
pub struct DepGraphEdges<D: Clone + Debug + Eq + Hash> {
nodes: Vec<DepNode<D>>,
- indices: FnvHashMap<DepNode<D>, IdIndex>,
- edges: FnvHashSet<(IdIndex, IdIndex)>,
+ indices: FxHashMap<DepNode<D>, IdIndex>,
+ edges: FxHashSet<(IdIndex, IdIndex)>,
open_nodes: Vec<OpenNode>,
}
pub fn new() -> DepGraphEdges<D> {
DepGraphEdges {
nodes: vec![],
- indices: FnvHashMap(),
- edges: FnvHashSet(),
+ indices: FxHashMap(),
+ edges: FxHashSet(),
open_nodes: Vec::new()
}
}
// except according to those terms.
use hir::def_id::DefId;
-use rustc_data_structures::fnv::FnvHashMap;
+use rustc_data_structures::fx::FxHashMap;
use session::config::OutputType;
use std::cell::{Ref, RefCell};
use std::rc::Rc;
/// things available to us. If we find that they are not dirty, we
/// load the path to the file storing those work-products here into
/// this map. We can later look for and extract that data.
- previous_work_products: RefCell<FnvHashMap<Arc<WorkProductId>, WorkProduct>>,
+ previous_work_products: RefCell<FxHashMap<Arc<WorkProductId>, WorkProduct>>,
/// Work-products that we generate in this run.
- work_products: RefCell<FnvHashMap<Arc<WorkProductId>, WorkProduct>>,
+ work_products: RefCell<FxHashMap<Arc<WorkProductId>, WorkProduct>>,
}
impl DepGraph {
DepGraph {
data: Rc::new(DepGraphData {
thread: DepGraphThreadData::new(enabled),
- previous_work_products: RefCell::new(FnvHashMap()),
- work_products: RefCell::new(FnvHashMap()),
+ previous_work_products: RefCell::new(FxHashMap()),
+ work_products: RefCell::new(FxHashMap()),
})
}
}
/// Access the map of work-products created during this run. Only
/// used during saving of the dep-graph.
- pub fn work_products(&self) -> Ref<FnvHashMap<Arc<WorkProductId>, WorkProduct>> {
+ pub fn work_products(&self) -> Ref<FxHashMap<Arc<WorkProductId>, WorkProduct>> {
self.data.work_products.borrow()
}
}
// option. This file may not be copied, modified, or distributed
// except according to those terms.
-use rustc_data_structures::fnv::FnvHashMap;
+use rustc_data_structures::fx::FxHashMap;
use rustc_data_structures::graph::{Direction, INCOMING, Graph, NodeIndex, OUTGOING};
use std::fmt::Debug;
use std::hash::Hash;
pub struct DepGraphQuery<D: Clone + Debug + Hash + Eq> {
pub graph: Graph<DepNode<D>, ()>,
- pub indices: FnvHashMap<DepNode<D>, NodeIndex>,
+ pub indices: FxHashMap<DepNode<D>, NodeIndex>,
}
impl<D: Clone + Debug + Hash + Eq> DepGraphQuery<D> {
edges: &[(DepNode<D>, DepNode<D>)])
-> DepGraphQuery<D> {
let mut graph = Graph::new();
- let mut indices = FnvHashMap();
+ let mut indices = FxHashMap();
for node in nodes {
indices.insert(node.clone(), graph.next_node_index());
graph.add_node(node.clone());
// except according to those terms.
use hir::def_id::{CrateNum, DefId, DefIndex, LOCAL_CRATE};
-use rustc_data_structures::fnv::FnvHashMap;
+use rustc_data_structures::fx::FxHashMap;
use std::fmt::Write;
use std::hash::{Hash, Hasher};
use std::collections::hash_map::DefaultHasher;
#[derive(Clone)]
pub struct Definitions {
data: Vec<DefData>,
- key_map: FnvHashMap<DefKey, DefIndex>,
+ key_map: FxHashMap<DefKey, DefIndex>,
node_map: NodeMap<DefIndex>,
}
pub fn new() -> Definitions {
Definitions {
data: vec![],
- key_map: FnvHashMap(),
+ key_map: FxHashMap(),
node_map: NodeMap(),
}
}
use hir::def::Def;
use hir::def_id::DefId;
-use util::nodemap::{NodeMap, FnvHashSet};
+use util::nodemap::{NodeMap, FxHashSet};
use syntax_pos::{mk_sp, Span, ExpnId, DUMMY_SP};
use syntax::codemap::{self, respan, Spanned};
// Map from the NodeId of a glob import to a list of items which are actually
// imported.
-pub type GlobMap = NodeMap<FnvHashSet<Name>>;
+pub type GlobMap = NodeMap<FxHashSet<Name>>;
use ty::{self, Ty, TyCtxt, TypeFoldable};
use ty::fold::TypeFolder;
-use util::nodemap::FnvHashMap;
+use util::nodemap::FxHashMap;
use std::collections::hash_map::Entry;
use super::InferCtxt;
pub struct TypeFreshener<'a, 'gcx: 'a+'tcx, 'tcx: 'a> {
infcx: &'a InferCtxt<'a, 'gcx, 'tcx>,
freshen_count: u32,
- freshen_map: FnvHashMap<ty::InferTy, Ty<'tcx>>,
+ freshen_map: FxHashMap<ty::InferTy, Ty<'tcx>>,
}
impl<'a, 'gcx, 'tcx> TypeFreshener<'a, 'gcx, 'tcx> {
TypeFreshener {
infcx: infcx,
freshen_count: 0,
- freshen_map: FnvHashMap(),
+ freshen_map: FxHashMap(),
}
}
use ty::error::TypeError;
use ty::relate::{Relate, RelateResult, TypeRelation};
use syntax_pos::Span;
-use util::nodemap::{FnvHashMap, FnvHashSet};
+use util::nodemap::{FxHashMap, FxHashSet};
pub struct HrMatchResult<U> {
pub value: U,
// Map each skolemized region to a vector of other regions that it
// must be equated with. (Note that this vector may include other
// skolemized regions from `skol_map`.)
- let skol_resolution_map: FnvHashMap<_, _> =
+ let skol_resolution_map: FxHashMap<_, _> =
skol_map
.iter()
.map(|(&br, &skol)| {
// `skol_map`. There should always be a representative if things
// are properly well-formed.
let mut unconstrained_regions = vec![];
- let skol_representatives: FnvHashMap<_, _> =
+ let skol_representatives: FxHashMap<_, _> =
skol_resolution_map
.iter()
.map(|(&skol, &(br, ref regions))| {
snapshot: &CombinedSnapshot,
debruijn: ty::DebruijnIndex,
new_vars: &[ty::RegionVid],
- a_map: &FnvHashMap<ty::BoundRegion, &'tcx ty::Region>,
+ a_map: &FxHashMap<ty::BoundRegion, &'tcx ty::Region>,
r0: &'tcx ty::Region)
-> &'tcx ty::Region {
// Regions that pre-dated the LUB computation stay as they are.
snapshot: &CombinedSnapshot,
debruijn: ty::DebruijnIndex,
new_vars: &[ty::RegionVid],
- a_map: &FnvHashMap<ty::BoundRegion,
- &'tcx ty::Region>,
+ a_map: &FxHashMap<ty::BoundRegion, &'tcx ty::Region>,
a_vars: &[ty::RegionVid],
b_vars: &[ty::RegionVid],
r0: &'tcx ty::Region)
fn rev_lookup<'a, 'gcx, 'tcx>(infcx: &InferCtxt<'a, 'gcx, 'tcx>,
span: Span,
- a_map: &FnvHashMap<ty::BoundRegion, &'tcx ty::Region>,
+ a_map: &FxHashMap<ty::BoundRegion, &'tcx ty::Region>,
r: &'tcx ty::Region) -> &'tcx ty::Region
{
for (a_br, a_r) in a_map {
}
fn var_ids<'a, 'gcx, 'tcx>(fields: &CombineFields<'a, 'gcx, 'tcx>,
- map: &FnvHashMap<ty::BoundRegion, &'tcx ty::Region>)
+ map: &FxHashMap<ty::BoundRegion, &'tcx ty::Region>)
-> Vec<ty::RegionVid> {
map.iter()
.map(|(_, &r)| match *r {
snapshot: &CombinedSnapshot,
r: &'tcx ty::Region,
directions: TaintDirections)
- -> FnvHashSet<&'tcx ty::Region> {
+ -> FxHashSet<&'tcx ty::Region> {
self.region_vars.tainted(&snapshot.region_vars_snapshot, r, directions)
}
let escaping_types =
self.type_variables.borrow_mut().types_escaping_snapshot(&snapshot.type_snapshot);
- let mut escaping_region_vars = FnvHashSet();
+ let mut escaping_region_vars = FxHashSet();
for ty in &escaping_types {
self.tcx.collect_regions(ty, &mut escaping_region_vars);
}
// region back to the `ty::BoundRegion` that it originally
// represented. Because `leak_check` passed, we know that
// these taint sets are mutually disjoint.
- let inv_skol_map: FnvHashMap<&'tcx ty::Region, ty::BoundRegion> =
+ let inv_skol_map: FxHashMap<&'tcx ty::Region, ty::BoundRegion> =
skol_map
.iter()
.flat_map(|(&skol_br, &skol)| {
snapshot: &CombinedSnapshot)
{
debug!("pop_skolemized({:?})", skol_map);
- let skol_regions: FnvHashSet<_> = skol_map.values().cloned().collect();
+ let skol_regions: FxHashSet<_> = skol_map.values().cloned().collect();
self.region_vars.pop_skolemized(&skol_regions, &snapshot.region_vars_snapshot);
if !skol_map.is_empty() {
self.projection_cache.borrow_mut().rollback_skolemized(
use syntax::ast;
use errors::DiagnosticBuilder;
use syntax_pos::{self, Span, DUMMY_SP};
-use util::nodemap::{FnvHashMap, FnvHashSet, NodeMap};
+use util::nodemap::{FxHashMap, FxHashSet, NodeMap};
use self::combine::CombineFields;
use self::higher_ranked::HrMatchResult;
// the set of predicates on which errors have been reported, to
// avoid reporting the same error twice.
- pub reported_trait_errors: RefCell<FnvHashSet<traits::TraitErrorKey<'tcx>>>,
+ pub reported_trait_errors: RefCell<FxHashSet<traits::TraitErrorKey<'tcx>>>,
// Sadly, the behavior of projection varies a bit depending on the
// stage of compilation. The specifics are given in the
/// A map returned by `skolemize_late_bound_regions()` indicating the skolemized
/// region that each late-bound region was replaced with.
-pub type SkolemizationMap<'tcx> = FnvHashMap<ty::BoundRegion, &'tcx ty::Region>;
+pub type SkolemizationMap<'tcx> = FxHashMap<ty::BoundRegion, &'tcx ty::Region>;
/// Why did we require that the two types be related?
///
selection_cache: traits::SelectionCache::new(),
evaluation_cache: traits::EvaluationCache::new(),
projection_cache: RefCell::new(traits::ProjectionCache::new()),
- reported_trait_errors: RefCell::new(FnvHashSet()),
+ reported_trait_errors: RefCell::new(FxHashSet()),
projection_mode: Reveal::NotSpecializable,
tainted_by_errors_flag: Cell::new(false),
err_count_on_creation: self.sess.err_count(),
parameter_environment: param_env,
selection_cache: traits::SelectionCache::new(),
evaluation_cache: traits::EvaluationCache::new(),
- reported_trait_errors: RefCell::new(FnvHashSet()),
+ reported_trait_errors: RefCell::new(FxHashSet()),
projection_mode: projection_mode,
tainted_by_errors_flag: Cell::new(false),
err_count_on_creation: tcx.sess.err_count(),
self.probe(|_| {
let origin = TypeOrigin::Misc(syntax_pos::DUMMY_SP);
let trace = TypeTrace::types(origin, true, a, b);
- self.sub(true, trace, &a, &b).map(|_| ())
+ self.sub(true, trace, &a, &b).map(|InferOk { obligations, .. }| {
+ // FIXME(#32730) propagate obligations
+ assert!(obligations.is_empty());
+ })
})
}
span: Span,
lbrct: LateBoundRegionConversionTime,
value: &ty::Binder<T>)
- -> (T, FnvHashMap<ty::BoundRegion, &'tcx ty::Region>)
+ -> (T, FxHashMap<ty::BoundRegion, &'tcx ty::Region>)
where T : TypeFoldable<'tcx>
{
self.tcx.replace_late_bound_regions(
// anyhow. We should make this typetrace stuff more
// generic so we don't have to do anything quite this
// terrible.
- self.equate(true, TypeTrace::dummy(self.tcx), a, b)
- }).map(|_| ())
+ let trace = TypeTrace::dummy(self.tcx);
+ self.equate(true, trace, a, b).map(|InferOk { obligations, .. }| {
+ // FIXME(#32730) propagate obligations
+ assert!(obligations.is_empty());
+ })
+ })
}
pub fn node_ty(&self, id: ast::NodeId) -> McResult<Ty<'tcx>> {
use super::Constraint;
use infer::SubregionOrigin;
use infer::region_inference::RegionVarBindings;
-use util::nodemap::{FnvHashMap, FnvHashSet};
+use util::nodemap::{FxHashMap, FxHashSet};
use std::borrow::Cow;
use std::collections::hash_map::Entry::Vacant;
struct ConstraintGraph<'a, 'gcx: 'a+'tcx, 'tcx: 'a> {
tcx: TyCtxt<'a, 'gcx, 'tcx>,
graph_name: String,
- map: &'a FnvHashMap<Constraint<'tcx>, SubregionOrigin<'tcx>>,
- node_ids: FnvHashMap<Node, usize>,
+ map: &'a FxHashMap<Constraint<'tcx>, SubregionOrigin<'tcx>>,
+ node_ids: FxHashMap<Node, usize>,
}
#[derive(Clone, Hash, PartialEq, Eq, Debug, Copy)]
map: &'a ConstraintMap<'tcx>)
-> ConstraintGraph<'a, 'gcx, 'tcx> {
let mut i = 0;
- let mut node_ids = FnvHashMap();
+ let mut node_ids = FxHashMap();
{
let mut add_node = |node| {
if let Vacant(e) = node_ids.entry(node) {
type Node = Node;
type Edge = Edge<'tcx>;
fn nodes(&self) -> dot::Nodes<Node> {
- let mut set = FnvHashSet();
+ let mut set = FxHashSet();
for node in self.node_ids.keys() {
set.insert(*node);
}
}
}
-pub type ConstraintMap<'tcx> = FnvHashMap<Constraint<'tcx>, SubregionOrigin<'tcx>>;
+pub type ConstraintMap<'tcx> = FxHashMap<Constraint<'tcx>, SubregionOrigin<'tcx>>;
fn dump_region_constraints_to<'a, 'gcx, 'tcx>(tcx: TyCtxt<'a, 'gcx, 'tcx>,
map: &ConstraintMap<'tcx>,
use super::{RegionVariableOrigin, SubregionOrigin, MiscVariable};
use super::unify_key;
-use rustc_data_structures::fnv::{FnvHashMap, FnvHashSet};
+use rustc_data_structures::fx::{FxHashMap, FxHashSet};
use rustc_data_structures::graph::{self, Direction, NodeIndex, OUTGOING};
use rustc_data_structures::unify::{self, UnificationTable};
use middle::free_region::FreeRegionMap;
}
}
-pub type CombineMap<'tcx> = FnvHashMap<TwoRegions<'tcx>, RegionVid>;
+pub type CombineMap<'tcx> = FxHashMap<TwoRegions<'tcx>, RegionVid>;
pub struct RegionVarBindings<'a, 'gcx: 'a+'tcx, 'tcx: 'a> {
tcx: TyCtxt<'a, 'gcx, 'tcx>,
// Constraints of the form `A <= B` introduced by the region
// checker. Here at least one of `A` and `B` must be a region
// variable.
- constraints: RefCell<FnvHashMap<Constraint<'tcx>, SubregionOrigin<'tcx>>>,
+ constraints: RefCell<FxHashMap<Constraint<'tcx>, SubregionOrigin<'tcx>>>,
// A "verify" is something that we need to verify after inference is
// done, but which does not directly affect inference in any way.
// record the fact that `'a <= 'b` is implied by the fn signature,
// and then ignore the constraint when solving equations. This is
// a bit of a hack but seems to work.
- givens: RefCell<FnvHashSet<(ty::FreeRegion, ty::RegionVid)>>,
+ givens: RefCell<FxHashSet<(ty::FreeRegion, ty::RegionVid)>>,
lubs: RefCell<CombineMap<'tcx>>,
glbs: RefCell<CombineMap<'tcx>>,
struct TaintSet<'tcx> {
directions: TaintDirections,
- regions: FnvHashSet<&'tcx ty::Region>
+ regions: FxHashSet<&'tcx ty::Region>
}
impl<'a, 'gcx, 'tcx> TaintSet<'tcx> {
fn new(directions: TaintDirections,
initial_region: &'tcx ty::Region)
-> Self {
- let mut regions = FnvHashSet();
+ let mut regions = FxHashSet();
regions.insert(initial_region);
TaintSet { directions: directions, regions: regions }
}
}
}
- fn into_set(self) -> FnvHashSet<&'tcx ty::Region> {
+ fn into_set(self) -> FxHashSet<&'tcx ty::Region> {
self.regions
}
tcx: tcx,
var_origins: RefCell::new(Vec::new()),
values: RefCell::new(None),
- constraints: RefCell::new(FnvHashMap()),
+ constraints: RefCell::new(FxHashMap()),
verifys: RefCell::new(Vec::new()),
- givens: RefCell::new(FnvHashSet()),
- lubs: RefCell::new(FnvHashMap()),
- glbs: RefCell::new(FnvHashMap()),
+ givens: RefCell::new(FxHashSet()),
+ lubs: RefCell::new(FxHashMap()),
+ glbs: RefCell::new(FxHashMap()),
skolemization_count: Cell::new(0),
bound_count: Cell::new(0),
undo_log: RefCell::new(Vec::new()),
/// completes to remove all trace of the skolemized regions
/// created in that time.
pub fn pop_skolemized(&self,
- skols: &FnvHashSet<&'tcx ty::Region>,
+ skols: &FxHashSet<&'tcx ty::Region>,
snapshot: &RegionSnapshot) {
debug!("pop_skolemized_regions(skols={:?})", skols);
self.skolemization_count.set(snapshot.skolemization_count);
return;
- fn kill_constraint<'tcx>(skols: &FnvHashSet<&'tcx ty::Region>,
+ fn kill_constraint<'tcx>(skols: &FxHashSet<&'tcx ty::Region>,
undo_entry: &UndoLogEntry<'tcx>)
-> bool {
match undo_entry {
mark: &RegionSnapshot,
r0: &'tcx Region,
directions: TaintDirections)
- -> FnvHashSet<&'tcx ty::Region> {
+ -> FxHashSet<&'tcx ty::Region> {
debug!("tainted(mark={:?}, r0={:?}, directions={:?})",
mark, r0, directions);
dup_vec: &mut [u32])
-> (Vec<RegionAndOrigin<'tcx>>, bool) {
struct WalkState<'tcx> {
- set: FnvHashSet<RegionVid>,
+ set: FxHashSet<RegionVid>,
stack: Vec<RegionVid>,
result: Vec<RegionAndOrigin<'tcx>>,
dup_found: bool,
}
let mut state = WalkState {
- set: FnvHashSet(),
+ set: FxHashSet(),
stack: vec![orig_node_idx],
result: Vec::new(),
dup_found: false,
#![feature(conservative_impl_trait)]
#![feature(const_fn)]
#![feature(core_intrinsics)]
-#![feature(dotdot_in_tuple_patterns)]
+#![cfg_attr(stage0, feature(dotdot_in_tuple_patterns))]
#![feature(enumset)]
#![feature(libc)]
#![feature(nonzero)]
use lint::{EarlyLintPassObject, LateLintPassObject};
use lint::{Default, CommandLine, Node, Allow, Warn, Deny, Forbid};
use lint::builtin;
-use util::nodemap::FnvHashMap;
+use util::nodemap::FxHashMap;
use std::cmp;
use std::default::Default as StdDefault;
late_passes: Option<Vec<LateLintPassObject>>,
/// Lints indexed by name.
- by_name: FnvHashMap<String, TargetLint>,
+ by_name: FxHashMap<String, TargetLint>,
/// Current levels of each lint, and where they were set.
- levels: FnvHashMap<LintId, LevelSource>,
+ levels: FxHashMap<LintId, LevelSource>,
/// Map of registered lint groups to what lints they expand to. The bool
/// is true if the lint group was added by a plugin.
- lint_groups: FnvHashMap<&'static str, (Vec<LintId>, bool)>,
+ lint_groups: FxHashMap<&'static str, (Vec<LintId>, bool)>,
/// Extra info for future incompatibility lints, describing the
/// issue or RFC that caused the incompatibility.
- future_incompatible: FnvHashMap<LintId, FutureIncompatibleInfo>,
+ future_incompatible: FxHashMap<LintId, FutureIncompatibleInfo>,
/// Maximum level a lint can be
lint_cap: Option<Level>,
lints: vec![],
early_passes: Some(vec![]),
late_passes: Some(vec![]),
- by_name: FnvHashMap(),
- levels: FnvHashMap(),
- future_incompatible: FnvHashMap(),
- lint_groups: FnvHashMap(),
+ by_name: FxHashMap(),
+ levels: FxHashMap(),
+ future_incompatible: FxHashMap(),
+ lint_groups: FxHashMap(),
lint_cap: None,
}
}
Err(FindLintError::Removed) => { }
Err(_) => {
match self.lint_groups.iter().map(|(&x, pair)| (x, pair.0.clone()))
- .collect::<FnvHashMap<&'static str,
- Vec<LintId>>>()
+ .collect::<FxHashMap<&'static str,
+ Vec<LintId>>>()
.get(&lint_name[..]) {
Some(v) => {
v.iter()
use hir::def::Def;
use hir::def_id::{DefId};
use lint;
-use util::nodemap::FnvHashSet;
+use util::nodemap::FxHashSet;
use syntax::{ast, codemap};
use syntax::attr;
struct MarkSymbolVisitor<'a, 'tcx: 'a> {
worklist: Vec<ast::NodeId>,
tcx: TyCtxt<'a, 'tcx, 'tcx>,
- live_symbols: Box<FnvHashSet<ast::NodeId>>,
+ live_symbols: Box<FxHashSet<ast::NodeId>>,
struct_has_extern_repr: bool,
ignore_non_const_paths: bool,
inherited_pub_visibility: bool,
MarkSymbolVisitor {
worklist: worklist,
tcx: tcx,
- live_symbols: box FnvHashSet(),
+ live_symbols: box FxHashSet(),
struct_has_extern_repr: false,
ignore_non_const_paths: false,
inherited_pub_visibility: false,
}
fn mark_live_symbols(&mut self) {
- let mut scanned = FnvHashSet();
+ let mut scanned = FxHashSet();
while !self.worklist.is_empty() {
let id = self.worklist.pop().unwrap();
if scanned.contains(&id) {
fn find_live<'a, 'tcx>(tcx: TyCtxt<'a, 'tcx, 'tcx>,
access_levels: &privacy::AccessLevels,
krate: &hir::Crate)
- -> Box<FnvHashSet<ast::NodeId>> {
+ -> Box<FxHashSet<ast::NodeId>> {
let worklist = create_and_seed_worklist(tcx, access_levels, krate);
let mut symbol_visitor = MarkSymbolVisitor::new(tcx, worklist);
symbol_visitor.mark_live_symbols();
struct DeadVisitor<'a, 'tcx: 'a> {
tcx: TyCtxt<'a, 'tcx, 'tcx>,
- live_symbols: Box<FnvHashSet<ast::NodeId>>,
+ live_symbols: Box<FxHashSet<ast::NodeId>>,
}
impl<'a, 'tcx> DeadVisitor<'a, 'tcx> {
use session;
use session::config;
use middle::cstore::LinkagePreference::{self, RequireStatic, RequireDynamic};
-use util::nodemap::FnvHashMap;
+use util::nodemap::FxHashMap;
use rustc_back::PanicStrategy;
/// A list of dependencies for a certain crate type.
/// A mapping of all required dependencies for a particular flavor of output.
///
/// This is local to the tcx, and is generally relevant to one session.
-pub type Dependencies = FnvHashMap<config::CrateType, DependencyList>;
+pub type Dependencies = FxHashMap<config::CrateType, DependencyList>;
#[derive(Copy, Clone, PartialEq, Debug)]
pub enum Linkage {
config::CrateTypeProcMacro => {},
}
- let mut formats = FnvHashMap();
+ let mut formats = FxHashMap();
// Sweep all crates for found dylibs. Add all dylibs, as well as their
// dependencies, ensuring there are no conflicts. The only valid case for a
fn add_library(sess: &session::Session,
cnum: CrateNum,
link: LinkagePreference,
- m: &mut FnvHashMap<CrateNum, LinkagePreference>) {
+ m: &mut FxHashMap<CrateNum, LinkagePreference>) {
match m.get(&cnum) {
Some(&link2) => {
// If the linkages differ, then we'd have two copies of the library
use hir::def_id::DefId;
use ty;
use middle::weak_lang_items;
-use util::nodemap::FnvHashMap;
+use util::nodemap::FxHashMap;
use syntax::ast;
use syntax::parse::token::InternedString;
session: &'a Session,
- item_refs: FnvHashMap<&'static str, usize>,
+ item_refs: FxHashMap<&'static str, usize>,
}
impl<'a, 'v, 'tcx> Visitor<'v> for LanguageItemCollector<'a, 'tcx> {
impl<'a, 'tcx> LanguageItemCollector<'a, 'tcx> {
pub fn new(session: &'a Session, ast_map: &'a hir_map::Map<'tcx>)
-> LanguageItemCollector<'a, 'tcx> {
- let mut item_refs = FnvHashMap();
+ let mut item_refs = FxHashMap();
$( item_refs.insert($name, $variant as usize); )*
//! outside their scopes. This pass will also generate a set of exported items
//! which are available for use externally when compiled as a library.
-use util::nodemap::{DefIdSet, FnvHashMap};
+use util::nodemap::{DefIdSet, FxHashMap};
use std::hash::Hash;
use std::fmt;
// Accessibility levels for reachable HIR nodes
#[derive(Clone)]
pub struct AccessLevels<Id = NodeId> {
- pub map: FnvHashMap<Id, AccessLevel>
+ pub map: FxHashMap<Id, AccessLevel>
}
impl<Id: Hash + Eq> AccessLevels<Id> {
use ty::{self, TyCtxt};
use middle::privacy;
use session::config;
-use util::nodemap::{NodeSet, FnvHashSet};
+use util::nodemap::{NodeSet, FxHashSet};
use syntax::abi::Abi;
use syntax::ast;
// Step 2: Mark all symbols that the symbols on the worklist touch.
fn propagate(&mut self) {
- let mut scanned = FnvHashSet();
+ let mut scanned = FxHashSet();
loop {
let search_item = match self.worklist.pop() {
Some(item) => item,
use dep_graph::DepNode;
use hir::map as ast_map;
use session::Session;
-use util::nodemap::{FnvHashMap, NodeMap, NodeSet};
+use util::nodemap::{FxHashMap, NodeMap, NodeSet};
use ty;
use std::cell::RefCell;
/// The region maps encode information about region relationships.
pub struct RegionMaps {
code_extents: RefCell<Vec<CodeExtentData>>,
- code_extent_interner: RefCell<FnvHashMap<CodeExtentData, CodeExtent>>,
+ code_extent_interner: RefCell<FxHashMap<CodeExtentData, CodeExtent>>,
/// `scope_map` maps from a scope id to the enclosing scope id;
/// this usually corresponds to the lexical nesting, though
/// in the case of closures the parent scope is the innermost
let maps = RegionMaps {
code_extents: RefCell::new(vec![]),
- code_extent_interner: RefCell::new(FnvHashMap()),
+ code_extent_interner: RefCell::new(FxHashMap()),
scope_map: RefCell::new(vec![]),
var_map: RefCell::new(NodeMap()),
rvalue_scopes: RefCell::new(NodeMap()),
use syntax_pos::Span;
use util::nodemap::NodeMap;
-use rustc_data_structures::fnv::FnvHashSet;
+use rustc_data_structures::fx::FxHashSet;
use hir;
use hir::print::lifetime_to_string;
use hir::intravisit::{self, Visitor, FnKind};
generics: &hir::Generics) {
debug!("insert_late_bound_lifetimes(decl={:?}, generics={:?})", decl, generics);
- let mut constrained_by_input = ConstrainedCollector { regions: FnvHashSet() };
+ let mut constrained_by_input = ConstrainedCollector { regions: FxHashSet() };
for arg in &decl.inputs {
constrained_by_input.visit_ty(&arg.ty);
}
let mut appears_in_output = AllCollector {
- regions: FnvHashSet(),
+ regions: FxHashSet(),
impl_trait: false
};
intravisit::walk_fn_ret_ty(&mut appears_in_output, &decl.output);
// Subtle point: because we disallow nested bindings, we can just
// ignore binders here and scrape up all names we see.
let mut appears_in_where_clause = AllCollector {
- regions: FnvHashSet(),
+ regions: FxHashSet(),
impl_trait: false
};
for ty_param in generics.ty_params.iter() {
return;
struct ConstrainedCollector {
- regions: FnvHashSet<ast::Name>,
+ regions: FxHashSet<ast::Name>,
}
impl<'v> Visitor<'v> for ConstrainedCollector {
}
struct AllCollector {
- regions: FnvHashSet<ast::Name>,
+ regions: FxHashSet<ast::Name>,
impl_trait: bool
}
use syntax::ast::{NodeId, Attribute};
use syntax::feature_gate::{GateIssue, emit_feature_err, find_lang_feature_accepted_version};
use syntax::attr::{self, Stability, Deprecation};
-use util::nodemap::{DefIdMap, FnvHashSet, FnvHashMap};
+use util::nodemap::{DefIdMap, FxHashSet, FxHashMap};
use hir;
use hir::{Item, Generics, StructField, Variant, PatKind};
depr_map: DefIdMap<Option<DeprecationEntry>>,
/// Maps for each crate whether it is part of the staged API.
- staged_api: FnvHashMap<CrateNum, bool>
+ staged_api: FxHashMap<CrateNum, bool>
}
// A private tree-walker for producing an Index.
}
}
- let mut staged_api = FnvHashMap();
+ let mut staged_api = FxHashMap();
staged_api.insert(LOCAL_CRATE, is_staged_api);
Index {
staged_api: staged_api,
/// features and possibly prints errors. Returns a list of all
/// features used.
pub fn check_unstable_api_usage<'a, 'tcx>(tcx: TyCtxt<'a, 'tcx, 'tcx>)
- -> FnvHashMap<InternedString, attr::StabilityLevel> {
+ -> FxHashMap<InternedString, attr::StabilityLevel> {
let _task = tcx.dep_graph.in_task(DepNode::StabilityCheck);
let ref active_lib_features = tcx.sess.features.borrow().declared_lib_features;
let mut checker = Checker {
tcx: tcx,
active_features: active_features,
- used_features: FnvHashMap(),
+ used_features: FxHashMap(),
in_skip_block: 0,
};
intravisit::walk_crate(&mut checker, tcx.map.krate());
struct Checker<'a, 'tcx: 'a> {
tcx: TyCtxt<'a, 'tcx, 'tcx>,
- active_features: FnvHashSet<InternedString>,
- used_features: FnvHashMap<InternedString, attr::StabilityLevel>,
+ active_features: FxHashSet<InternedString>,
+ used_features: FxHashMap<InternedString, attr::StabilityLevel>,
// Within a block where feature gate checking can be skipped.
in_skip_block: u32,
}
/// were expected to be library features), and the list of features used from
/// libraries, identify activated features that don't exist and error about them.
pub fn check_unused_or_stable_features(sess: &Session,
- lib_features_used: &FnvHashMap<InternedString,
- attr::StabilityLevel>) {
+ lib_features_used: &FxHashMap<InternedString,
+ attr::StabilityLevel>) {
let ref declared_lib_features = sess.features.borrow().declared_lib_features;
- let mut remaining_lib_features: FnvHashMap<InternedString, Span>
+ let mut remaining_lib_features: FxHashMap<InternedString, Span>
= declared_lib_features.clone().into_iter().collect();
fn format_stable_since_msg(version: &str) -> String {
use session::search_paths::PathKind;
use session::config::DebugInfoLevel;
use ty::tls;
-use util::nodemap::{NodeMap, FnvHashMap, FnvHashSet};
+use util::nodemap::{NodeMap, FxHashMap, FxHashSet};
use util::common::duration_to_secs_str;
use mir::transform as mir_pass;
/// Set of (LintId, span, message) tuples tracking lint (sub)diagnostics
/// that have been set once, but should not be set again, in order to avoid
/// redundantly verbose output (Issue #24690).
- pub one_time_diagnostics: RefCell<FnvHashSet<(lint::LintId, Span, String)>>,
+ pub one_time_diagnostics: RefCell<FxHashSet<(lint::LintId, Span, String)>>,
pub plugin_llvm_passes: RefCell<Vec<String>>,
pub mir_passes: RefCell<mir_pass::Passes>,
pub plugin_attributes: RefCell<Vec<(String, AttributeType)>>,
working_dir: env::current_dir().unwrap(),
lint_store: RefCell::new(lint::LintStore::new()),
lints: RefCell::new(NodeMap()),
- one_time_diagnostics: RefCell::new(FnvHashSet()),
+ one_time_diagnostics: RefCell::new(FxHashSet()),
plugin_llvm_passes: RefCell::new(Vec::new()),
mir_passes: RefCell::new(mir_pass::Passes::new()),
plugin_attributes: RefCell::new(Vec::new()),
crate_types: RefCell::new(Vec::new()),
- dependency_formats: RefCell::new(FnvHashMap()),
+ dependency_formats: RefCell::new(FxHashMap()),
crate_disambiguator: RefCell::new(token::intern("").as_str()),
features: RefCell::new(feature_gate::Features::new()),
recursion_limit: Cell::new(64),
use hir::def_id::{DefId, LOCAL_CRATE};
use ty::{self, Ty, TyCtxt};
-use infer::{InferCtxt, TypeOrigin};
+use infer::{InferCtxt, InferOk, TypeOrigin};
use syntax_pos::DUMMY_SP;
#[derive(Copy, Clone)]
debug!("overlap: b_impl_header={:?}", b_impl_header);
// Do `a` and `b` unify? If not, no overlap.
- if let Err(_) = selcx.infcx().eq_impl_headers(true,
- TypeOrigin::Misc(DUMMY_SP),
- &a_impl_header,
- &b_impl_header) {
- return None;
+ match selcx.infcx().eq_impl_headers(true, TypeOrigin::Misc(DUMMY_SP), &a_impl_header,
+ &b_impl_header) {
+ Ok(InferOk { obligations, .. }) => {
+ // FIXME(#32730) propagate obligations
+ assert!(obligations.is_empty());
+ }
+ Err(_) => return None
}
debug!("overlap: unification check succeeded");
use ty::fast_reject;
use ty::fold::TypeFolder;
use ty::subst::Subst;
-use util::nodemap::{FnvHashMap, FnvHashSet};
+use util::nodemap::{FxHashMap, FxHashSet};
use std::cmp;
use std::fmt;
let generic_map = def.generics.types.iter().map(|param| {
(param.name.as_str().to_string(),
trait_ref.substs.type_for_def(param).to_string())
- }).collect::<FnvHashMap<String, String>>();
+ }).collect::<FxHashMap<String, String>>();
let parser = Parser::new(&istring);
let mut errored = false;
let err: String = parser.filter_map(|p| {
"the trait `{}` cannot be made into an object", trait_str
));
- let mut reported_violations = FnvHashSet();
+ let mut reported_violations = FxHashSet();
for violation in violations {
if !reported_violations.insert(violation.clone()) {
continue;
fn predicate_can_apply(&self, pred: ty::PolyTraitRef<'tcx>) -> bool {
struct ParamToVarFolder<'a, 'gcx: 'a+'tcx, 'tcx: 'a> {
infcx: &'a InferCtxt<'a, 'gcx, 'tcx>,
- var_map: FnvHashMap<Ty<'tcx>, Ty<'tcx>>
+ var_map: FxHashMap<Ty<'tcx>, Ty<'tcx>>
}
impl<'a, 'gcx, 'tcx> TypeFolder<'gcx, 'tcx> for ParamToVarFolder<'a, 'gcx, 'tcx> {
let cleaned_pred = pred.fold_with(&mut ParamToVarFolder {
infcx: self,
- var_map: FnvHashMap()
+ var_map: FxHashMap()
});
let cleaned_pred = super::project::normalize(
use std::mem;
use syntax::ast;
use util::common::ErrorReported;
-use util::nodemap::{FnvHashSet, NodeMap};
+use util::nodemap::{FxHashSet, NodeMap};
use super::CodeAmbiguity;
use super::CodeProjectionError;
}
pub struct GlobalFulfilledPredicates<'tcx> {
- set: FnvHashSet<ty::PolyTraitPredicate<'tcx>>,
+ set: FxHashSet<ty::PolyTraitPredicate<'tcx>>,
dep_graph: DepGraph,
}
impl<'a, 'gcx, 'tcx> GlobalFulfilledPredicates<'gcx> {
pub fn new(dep_graph: DepGraph) -> GlobalFulfilledPredicates<'gcx> {
GlobalFulfilledPredicates {
- set: FnvHashSet(),
+ set: FxHashSet(),
dep_graph: dep_graph,
}
}
use std::rc::Rc;
use syntax::abi::Abi;
use hir;
-use util::nodemap::FnvHashMap;
+use util::nodemap::FxHashMap;
struct InferredObligationsSnapshotVecDelegate<'tcx> {
phantom: PhantomData<&'tcx i32>,
#[derive(Clone)]
pub struct SelectionCache<'tcx> {
- hashmap: RefCell<FnvHashMap<ty::TraitRef<'tcx>,
- SelectionResult<'tcx, SelectionCandidate<'tcx>>>>,
+ hashmap: RefCell<FxHashMap<ty::TraitRef<'tcx>,
+ SelectionResult<'tcx, SelectionCandidate<'tcx>>>>,
}
pub enum MethodMatchResult {
#[derive(Clone)]
pub struct EvaluationCache<'tcx> {
- hashmap: RefCell<FnvHashMap<ty::PolyTraitRef<'tcx>, EvaluationResult>>
+ hashmap: RefCell<FxHashMap<ty::PolyTraitRef<'tcx>, EvaluationResult>>
}
impl<'cx, 'gcx, 'tcx> SelectionContext<'cx, 'gcx, 'tcx> {
impl<'tcx> SelectionCache<'tcx> {
pub fn new() -> SelectionCache<'tcx> {
SelectionCache {
- hashmap: RefCell::new(FnvHashMap())
+ hashmap: RefCell::new(FxHashMap())
}
}
}
impl<'tcx> EvaluationCache<'tcx> {
pub fn new() -> EvaluationCache<'tcx> {
EvaluationCache {
- hashmap: RefCell::new(FnvHashMap())
+ hashmap: RefCell::new(FxHashMap())
}
}
}
use super::{SelectionContext, FulfillmentContext};
use super::util::impl_trait_ref_and_oblig;
-use rustc_data_structures::fnv::FnvHashMap;
+use rustc_data_structures::fx::FxHashMap;
use hir::def_id::DefId;
-use infer::{InferCtxt, TypeOrigin};
+use infer::{InferCtxt, InferOk, TypeOrigin};
use middle::region;
use ty::subst::{Subst, Substs};
use traits::{self, Reveal, ObligationCause};
target_substs);
// do the impls unify? If not, no specialization.
- if let Err(_) = infcx.eq_trait_refs(true,
- TypeOrigin::Misc(DUMMY_SP),
- source_trait_ref,
- target_trait_ref) {
- debug!("fulfill_implication: {:?} does not unify with {:?}",
- source_trait_ref,
- target_trait_ref);
- return Err(());
+ match infcx.eq_trait_refs(true, TypeOrigin::Misc(DUMMY_SP), source_trait_ref,
+ target_trait_ref) {
+ Ok(InferOk { obligations, .. }) => {
+ // FIXME(#32730) propagate obligations
+ assert!(obligations.is_empty())
+ }
+ Err(_) => {
+ debug!("fulfill_implication: {:?} does not unify with {:?}",
+ source_trait_ref,
+ target_trait_ref);
+ return Err(());
+ }
}
// attempt to prove all of the predicates for impl2 given those for impl1
}
pub struct SpecializesCache {
- map: FnvHashMap<(DefId, DefId), bool>
+ map: FxHashMap<(DefId, DefId), bool>
}
impl SpecializesCache {
pub fn new() -> Self {
SpecializesCache {
- map: FnvHashMap()
+ map: FxHashMap()
}
}
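The repeated `map(|InferOk { obligations, .. }| ...)` edits in the hunks above all follow one pattern: unification now returns an `InferOk` carrying side obligations, and until #32730 lands each caller destructures the result and asserts the obligation list is empty instead of silently discarding it with `.map(|_| ())`. A minimal sketch of that pattern — the `InferOk` name matches the diff, but the surrounding types here are hypothetical stand-ins, not the rustc definitions:

```rust
// Hypothetical stand-ins for the rustc types; only the shape matters.
#[allow(dead_code)]
struct InferOk<T> {
    value: T,
    obligations: Vec<String>,
}
type InferResult<T> = Result<InferOk<T>, ()>;

// Toy "unification" that succeeds with no pending obligations.
fn unify(a: u32, b: u32) -> InferResult<u32> {
    if a == b {
        Ok(InferOk { value: a, obligations: vec![] })
    } else {
        Err(())
    }
}

fn eq_check(a: u32, b: u32) -> Result<(), ()> {
    // Destructure the success value and check the obligations explicitly,
    // rather than throwing the whole InferOk away.
    unify(a, b).map(|InferOk { obligations, .. }| {
        // FIXME(#32730)-style placeholder: propagate obligations instead.
        assert!(obligations.is_empty());
    })
}

fn main() {
    assert_eq!(eq_check(1, 1), Ok(()));
    assert!(eq_check(1, 2).is_err());
}
```

The asserts make the previously-implicit assumption ("no obligations are produced here") loud, so it fails fast if that ever stops being true.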
use ty::{self, TyCtxt, ImplOrTraitItem, TraitDef, TypeFoldable};
use ty::fast_reject::{self, SimplifiedType};
use syntax::ast::Name;
-use util::nodemap::{DefIdMap, FnvHashMap};
+use util::nodemap::{DefIdMap, FxHashMap};
/// A per-trait graph of impls in specialization order. At the moment, this
/// graph forms a tree rooted with the trait itself, with all other nodes
// the specialization graph.
/// Impls of the trait.
- nonblanket_impls: FnvHashMap<fast_reject::SimplifiedType, Vec<DefId>>,
+ nonblanket_impls: FxHashMap<fast_reject::SimplifiedType, Vec<DefId>>,
/// Blanket impls associated with the trait.
blanket_impls: Vec<DefId>,
impl<'a, 'gcx, 'tcx> Children {
fn new() -> Children {
Children {
- nonblanket_impls: FnvHashMap(),
+ nonblanket_impls: FxHashMap(),
blanket_impls: vec![],
}
}
use ty::{self, Ty, TyCtxt, ToPredicate, ToPolyTraitRef};
use ty::outlives::Component;
use util::common::ErrorReported;
-use util::nodemap::FnvHashSet;
+use util::nodemap::FxHashSet;
use super::{Obligation, ObligationCause, PredicateObligation, SelectionContext, Normalized};
struct PredicateSet<'a, 'gcx: 'a+'tcx, 'tcx: 'a> {
tcx: TyCtxt<'a, 'gcx, 'tcx>,
- set: FnvHashSet<ty::Predicate<'tcx>>,
+ set: FxHashSet<ty::Predicate<'tcx>>,
}
impl<'a, 'gcx, 'tcx> PredicateSet<'a, 'gcx, 'tcx> {
fn new(tcx: TyCtxt<'a, 'gcx, 'tcx>) -> PredicateSet<'a, 'gcx, 'tcx> {
- PredicateSet { tcx: tcx, set: FnvHashSet() }
+ PredicateSet { tcx: tcx, set: FxHashSet() }
}
fn insert(&mut self, pred: &ty::Predicate<'tcx>) -> bool {
pub struct SupertraitDefIds<'a, 'gcx: 'a+'tcx, 'tcx: 'a> {
tcx: TyCtxt<'a, 'gcx, 'tcx>,
stack: Vec<DefId>,
- visited: FnvHashSet<DefId>,
+ visited: FxHashSet<DefId>,
}
pub fn supertrait_def_ids<'cx, 'gcx, 'tcx>(tcx: TyCtxt<'cx, 'gcx, 'tcx>,
use hir::def_id::{DefId};
use ty::{self, Ty, TyCtxt};
use util::common::MemoizationMap;
-use util::nodemap::FnvHashMap;
+use util::nodemap::FxHashMap;
use std::fmt;
use std::ops;
impl<'a, 'tcx> ty::TyS<'tcx> {
pub fn type_contents(&'tcx self, tcx: TyCtxt<'a, 'tcx, 'tcx>) -> TypeContents {
- return tcx.tc_cache.memoize(self, || tc_ty(tcx, self, &mut FnvHashMap()));
+ return tcx.tc_cache.memoize(self, || tc_ty(tcx, self, &mut FxHashMap()));
fn tc_ty<'a, 'tcx>(tcx: TyCtxt<'a, 'tcx, 'tcx>,
ty: Ty<'tcx>,
- cache: &mut FnvHashMap<Ty<'tcx>, TypeContents>) -> TypeContents
+ cache: &mut FxHashMap<Ty<'tcx>, TypeContents>) -> TypeContents
{
// Subtle: Note that we are *not* using tcx.tc_cache here but rather a
// private cache for this walk. This is needed in the case of cyclic
use ty::maps;
use util::common::MemoizationMap;
use util::nodemap::{NodeMap, NodeSet, DefIdMap, DefIdSet};
-use util::nodemap::{FnvHashMap, FnvHashSet};
+use util::nodemap::{FxHashMap, FxHashSet};
use rustc_data_structures::accumulate_vec::AccumulateVec;
use arena::TypedArena;
/// Specifically use a speedy hash algorithm for these hash sets,
/// they're accessed quite often.
- type_: RefCell<FnvHashSet<Interned<'tcx, TyS<'tcx>>>>,
- type_list: RefCell<FnvHashSet<Interned<'tcx, Slice<Ty<'tcx>>>>>,
- substs: RefCell<FnvHashSet<Interned<'tcx, Substs<'tcx>>>>,
- bare_fn: RefCell<FnvHashSet<Interned<'tcx, BareFnTy<'tcx>>>>,
- region: RefCell<FnvHashSet<Interned<'tcx, Region>>>,
- stability: RefCell<FnvHashSet<&'tcx attr::Stability>>,
- layout: RefCell<FnvHashSet<&'tcx Layout>>,
+ type_: RefCell<FxHashSet<Interned<'tcx, TyS<'tcx>>>>,
+ type_list: RefCell<FxHashSet<Interned<'tcx, Slice<Ty<'tcx>>>>>,
+ substs: RefCell<FxHashSet<Interned<'tcx, Substs<'tcx>>>>,
+ bare_fn: RefCell<FxHashSet<Interned<'tcx, BareFnTy<'tcx>>>>,
+ region: RefCell<FxHashSet<Interned<'tcx, Region>>>,
+ stability: RefCell<FxHashSet<&'tcx attr::Stability>>,
+ layout: RefCell<FxHashSet<&'tcx Layout>>,
}
impl<'gcx: 'tcx, 'tcx> CtxtInterners<'tcx> {
fn new(arenas: &'tcx CtxtArenas<'tcx>) -> CtxtInterners<'tcx> {
CtxtInterners {
arenas: arenas,
- type_: RefCell::new(FnvHashSet()),
- type_list: RefCell::new(FnvHashSet()),
- substs: RefCell::new(FnvHashSet()),
- bare_fn: RefCell::new(FnvHashSet()),
- region: RefCell::new(FnvHashSet()),
- stability: RefCell::new(FnvHashSet()),
- layout: RefCell::new(FnvHashSet())
+ type_: RefCell::new(FxHashSet()),
+ type_list: RefCell::new(FxHashSet()),
+ substs: RefCell::new(FxHashSet()),
+ bare_fn: RefCell::new(FxHashSet()),
+ region: RefCell::new(FxHashSet()),
+ stability: RefCell::new(FxHashSet()),
+ layout: RefCell::new(FxHashSet())
}
}
impl<'a, 'gcx, 'tcx> Tables<'tcx> {
pub fn empty() -> Tables<'tcx> {
Tables {
- node_types: FnvHashMap(),
+ node_types: FxHashMap(),
item_substs: NodeMap(),
adjustments: NodeMap(),
- method_map: FnvHashMap(),
- upvar_capture_map: FnvHashMap(),
+ method_map: FxHashMap(),
+ upvar_capture_map: FxHashMap(),
closure_tys: DefIdMap(),
closure_kinds: DefIdMap(),
liberated_fn_sigs: NodeMap(),
pub tcache: RefCell<DepTrackingMap<maps::Tcache<'tcx>>>,
// Internal cache for metadata decoding. No need to track deps on this.
- pub rcache: RefCell<FnvHashMap<ty::CReaderCacheKey, Ty<'tcx>>>,
+ pub rcache: RefCell<FxHashMap<ty::CReaderCacheKey, Ty<'tcx>>>,
// Cache for the type-contents routine. FIXME -- track deps?
- pub tc_cache: RefCell<FnvHashMap<Ty<'tcx>, ty::contents::TypeContents>>,
+ pub tc_cache: RefCell<FxHashMap<Ty<'tcx>, ty::contents::TypeContents>>,
// FIXME no dep tracking, but we should be able to remove this
pub ty_param_defs: RefCell<NodeMap<ty::TypeParameterDef<'tcx>>>,
// FIXME dep tracking -- should be harmless enough
- pub normalized_cache: RefCell<FnvHashMap<Ty<'tcx>, Ty<'tcx>>>,
+ pub normalized_cache: RefCell<FxHashMap<Ty<'tcx>, Ty<'tcx>>>,
pub lang_items: middle::lang_items::LanguageItems,
pub data_layout: TargetDataLayout,
/// Cache for layouts computed from types.
- pub layout_cache: RefCell<FnvHashMap<Ty<'tcx>, &'tcx Layout>>,
+ pub layout_cache: RefCell<FxHashMap<Ty<'tcx>, &'tcx Layout>>,
/// Used to prevent layout from recursing too deeply.
pub layout_depth: Cell<usize>,
types: common_types,
named_region_map: named_region_map,
region_maps: region_maps,
- free_region_maps: RefCell::new(FnvHashMap()),
+ free_region_maps: RefCell::new(FxHashMap()),
item_variance_map: RefCell::new(DepTrackingMap::new(dep_graph.clone())),
variance_computed: Cell::new(false),
sess: s,
freevars: RefCell::new(freevars),
maybe_unused_trait_imports: maybe_unused_trait_imports,
tcache: RefCell::new(DepTrackingMap::new(dep_graph.clone())),
- rcache: RefCell::new(FnvHashMap()),
- tc_cache: RefCell::new(FnvHashMap()),
+ rcache: RefCell::new(FxHashMap()),
+ tc_cache: RefCell::new(FxHashMap()),
impl_or_trait_items: RefCell::new(DepTrackingMap::new(dep_graph.clone())),
impl_or_trait_item_def_ids: RefCell::new(DepTrackingMap::new(dep_graph.clone())),
trait_items_cache: RefCell::new(DepTrackingMap::new(dep_graph.clone())),
ty_param_defs: RefCell::new(NodeMap()),
- normalized_cache: RefCell::new(FnvHashMap()),
+ normalized_cache: RefCell::new(FxHashMap()),
lang_items: lang_items,
inherent_impls: RefCell::new(DepTrackingMap::new(dep_graph.clone())),
used_unsafe: RefCell::new(NodeSet()),
fragment_infos: RefCell::new(DefIdMap()),
crate_name: token::intern_and_get_ident(crate_name),
data_layout: data_layout,
- layout_cache: RefCell::new(FnvHashMap()),
+ layout_cache: RefCell::new(FxHashMap()),
layout_depth: Cell::new(0),
derive_macros: RefCell::new(NodeMap()),
}, f)
use ty::{self, Binder, Ty, TyCtxt, TypeFlags};
use std::fmt;
-use util::nodemap::{FnvHashMap, FnvHashSet};
+use util::nodemap::{FxHashMap, FxHashSet};
/// The TypeFoldable trait is implemented for every type that can be folded.
/// Basically, every type that has a corresponding method in TypeFolder.
/// whether any late-bound regions were skipped
pub fn collect_regions<T>(self,
value: &T,
- region_set: &mut FnvHashSet<&'tcx ty::Region>)
+ region_set: &mut FxHashSet<&'tcx ty::Region>)
-> bool
where T : TypeFoldable<'tcx>
{
tcx: TyCtxt<'a, 'gcx, 'tcx>,
current_depth: u32,
fld_r: &'a mut (FnMut(ty::BoundRegion) -> &'tcx ty::Region + 'a),
- map: FnvHashMap<ty::BoundRegion, &'tcx ty::Region>
+ map: FxHashMap<ty::BoundRegion, &'tcx ty::Region>
}
impl<'a, 'gcx, 'tcx> TyCtxt<'a, 'gcx, 'tcx> {
pub fn replace_late_bound_regions<T,F>(self,
value: &Binder<T>,
mut f: F)
- -> (T, FnvHashMap<ty::BoundRegion, &'tcx ty::Region>)
+ -> (T, FxHashMap<ty::BoundRegion, &'tcx ty::Region>)
where F : FnMut(ty::BoundRegion) -> &'tcx ty::Region,
T : TypeFoldable<'tcx>,
{
/// variables and equate `value` with something else, those
/// variables will also be equated.
pub fn collect_constrained_late_bound_regions<T>(&self, value: &Binder<T>)
- -> FnvHashSet<ty::BoundRegion>
+ -> FxHashSet<ty::BoundRegion>
where T : TypeFoldable<'tcx>
{
self.collect_late_bound_regions(value, true)
/// Returns a set of all late-bound regions that appear in `value` anywhere.
pub fn collect_referenced_late_bound_regions<T>(&self, value: &Binder<T>)
- -> FnvHashSet<ty::BoundRegion>
+ -> FxHashSet<ty::BoundRegion>
where T : TypeFoldable<'tcx>
{
self.collect_late_bound_regions(value, false)
}
fn collect_late_bound_regions<T>(&self, value: &Binder<T>, just_constraint: bool)
- -> FnvHashSet<ty::BoundRegion>
+ -> FxHashSet<ty::BoundRegion>
where T : TypeFoldable<'tcx>
{
let mut collector = LateBoundRegionsCollector::new(just_constraint);
tcx: tcx,
current_depth: 1,
fld_r: fld_r,
- map: FnvHashMap()
+ map: FxHashMap()
}
}
}
/// Collects all the late-bound regions it finds into a hash set.
struct LateBoundRegionsCollector {
current_depth: u32,
- regions: FnvHashSet<ty::BoundRegion>,
+ regions: FxHashSet<ty::BoundRegion>,
just_constrained: bool,
}
fn new(just_constrained: bool) -> Self {
LateBoundRegionsCollector {
current_depth: 1,
- regions: FnvHashSet(),
+ regions: FxHashSet(),
just_constrained: just_constrained,
}
}
use ty::walk::TypeWalker;
use util::common::MemoizationMap;
use util::nodemap::NodeSet;
-use util::nodemap::FnvHashMap;
+use util::nodemap::FxHashMap;
use serialize::{self, Encodable, Encoder};
use std::borrow::Cow;
// maps from an expression id that corresponds to a method call to the details
// of the method to be invoked
-pub type MethodMap<'tcx> = FnvHashMap<MethodCall, MethodCallee<'tcx>>;
+pub type MethodMap<'tcx> = FxHashMap<MethodCall, MethodCallee<'tcx>>;
// Contains information needed to resolve types and (in the future) look up
// the types of AST nodes.
pub region: &'tcx ty::Region,
}
-pub type UpvarCaptureMap<'tcx> = FnvHashMap<UpvarId, UpvarCapture<'tcx>>;
+pub type UpvarCaptureMap<'tcx> = FxHashMap<UpvarId, UpvarCapture<'tcx>>;
#[derive(Copy, Clone)]
pub struct ClosureUpvar<'tcx> {
pub free_id_outlive: CodeExtent,
/// A cache for `moves_by_default`.
- pub is_copy_cache: RefCell<FnvHashMap<Ty<'tcx>, bool>>,
+ pub is_copy_cache: RefCell<FxHashMap<Ty<'tcx>, bool>>,
/// A cache for `type_is_sized`
- pub is_sized_cache: RefCell<FnvHashMap<Ty<'tcx>, bool>>,
+ pub is_sized_cache: RefCell<FxHashMap<Ty<'tcx>, bool>>,
}
impl<'a, 'tcx> ParameterEnvironment<'tcx> {
implicit_region_bound: self.implicit_region_bound,
caller_bounds: caller_bounds,
free_id_outlive: self.free_id_outlive,
- is_copy_cache: RefCell::new(FnvHashMap()),
- is_sized_cache: RefCell::new(FnvHashMap()),
+ is_copy_cache: RefCell::new(FxHashMap()),
+ is_sized_cache: RefCell::new(FxHashMap()),
}
}
caller_bounds: Vec::new(),
implicit_region_bound: self.mk_region(ty::ReEmpty),
free_id_outlive: free_id_outlive,
- is_copy_cache: RefCell::new(FnvHashMap()),
- is_sized_cache: RefCell::new(FnvHashMap()),
+ is_copy_cache: RefCell::new(FxHashMap()),
+ is_sized_cache: RefCell::new(FxHashMap()),
}
}
implicit_region_bound: tcx.mk_region(ty::ReScope(free_id_outlive)),
caller_bounds: predicates,
free_id_outlive: free_id_outlive,
- is_copy_cache: RefCell::new(FnvHashMap()),
- is_sized_cache: RefCell::new(FnvHashMap()),
+ is_copy_cache: RefCell::new(FxHashMap()),
+ is_sized_cache: RefCell::new(FxHashMap()),
};
let cause = traits::ObligationCause::misc(span, free_id_outlive.node_id(&self.region_maps));
use ty::{Ty, TyCtxt, TraitRef};
use std::cell::{Cell, RefCell};
use hir;
-use util::nodemap::FnvHashMap;
+use util::nodemap::FxHashMap;
/// As `TypeScheme` but for a trait ref.
pub struct TraitDef<'tcx> {
/// Impls of the trait.
nonblanket_impls: RefCell<
- FnvHashMap<fast_reject::SimplifiedType, Vec<DefId>>
+ FxHashMap<fast_reject::SimplifiedType, Vec<DefId>>
>,
/// Blanket impls associated with the trait.
unsafety: unsafety,
generics: generics,
trait_ref: trait_ref,
- nonblanket_impls: RefCell::new(FnvHashMap()),
+ nonblanket_impls: RefCell::new(FxHashMap()),
blanket_impls: RefCell::new(vec![]),
flags: Cell::new(ty::TraitFlags::NO_TRAIT_FLAGS),
specialization_graph: RefCell::new(traits::specialization_graph::Graph::new()),
use ty::fold::TypeVisitor;
use ty::layout::{Layout, LayoutError};
use ty::TypeVariants::*;
-use util::nodemap::FnvHashMap;
+use util::nodemap::FxHashMap;
use rustc_const_math::{ConstInt, ConstIsize, ConstUsize};
fn impls_bound(&'tcx self, tcx: TyCtxt<'a, 'tcx, 'tcx>,
param_env: &ParameterEnvironment<'tcx>,
bound: ty::BuiltinBound,
- cache: &RefCell<FnvHashMap<Ty<'tcx>, bool>>,
+ cache: &RefCell<FxHashMap<Ty<'tcx>, bool>>,
span: Span) -> bool
{
if self.has_param_types() || self.has_self_ty() {
use hir::def_id::DefId;
use syntax::ast;
-pub use rustc_data_structures::fnv::FnvHashMap;
-pub use rustc_data_structures::fnv::FnvHashSet;
+pub use rustc_data_structures::fx::FxHashMap;
+pub use rustc_data_structures::fx::FxHashSet;
-pub type NodeMap<T> = FnvHashMap<ast::NodeId, T>;
-pub type DefIdMap<T> = FnvHashMap<DefId, T>;
+pub type NodeMap<T> = FxHashMap<ast::NodeId, T>;
+pub type DefIdMap<T> = FxHashMap<DefId, T>;
-pub type NodeSet = FnvHashSet<ast::NodeId>;
-pub type DefIdSet = FnvHashSet<DefId>;
+pub type NodeSet = FxHashSet<ast::NodeId>;
+pub type DefIdSet = FxHashSet<DefId>;
-pub fn NodeMap<T>() -> NodeMap<T> { FnvHashMap() }
-pub fn DefIdMap<T>() -> DefIdMap<T> { FnvHashMap() }
-pub fn NodeSet() -> NodeSet { FnvHashSet() }
-pub fn DefIdSet() -> DefIdSet { FnvHashSet() }
+pub fn NodeMap<T>() -> NodeMap<T> { FxHashMap() }
+pub fn DefIdMap<T>() -> DefIdMap<T> { FxHashMap() }
+pub fn NodeSet() -> NodeSet { FxHashSet() }
+pub fn DefIdSet() -> DefIdSet { FxHashSet() }
// option. This file may not be copied, modified, or distributed
// except according to those terms.
-use target::{Target, TargetResult};
+use target::{Target, TargetOptions, TargetResult};
pub fn target() -> TargetResult {
let mut base = super::fuchsia_base::opts();
target_os: "fuchsia".to_string(),
target_env: "".to_string(),
target_vendor: "unknown".to_string(),
- options: base,
+ options: TargetOptions {
+ abi_blacklist: super::arm_base::abi_blacklist(),
+ .. base
+ },
})
}
use rustc::mir::transform::{Pass, MirPass, MirSource};
use rustc::middle::const_val::ConstVal;
use rustc::middle::lang_items;
-use rustc::util::nodemap::FnvHashMap;
+use rustc::util::nodemap::FxHashMap;
use rustc_data_structures::indexed_set::IdxSetBuf;
use rustc_data_structures::indexed_vec::Idx;
use syntax_pos::Span;
env: &env,
flow_inits: flow_inits,
flow_uninits: flow_uninits,
- drop_flags: FnvHashMap(),
+ drop_flags: FxHashMap(),
patch: MirPatch::new(mir),
}.elaborate()
};
env: &'a MoveDataParamEnv<'tcx>,
flow_inits: DataflowResults<MaybeInitializedLvals<'a, 'tcx>>,
flow_uninits: DataflowResults<MaybeUninitializedLvals<'a, 'tcx>>,
- drop_flags: FnvHashMap<MovePathIndex, Local>,
+ drop_flags: FxHashMap<MovePathIndex, Local>,
patch: MirPatch<'tcx>,
}
use rustc::ty::{self, TyCtxt, ParameterEnvironment};
use rustc::mir::*;
-use rustc::util::nodemap::FnvHashMap;
+use rustc::util::nodemap::FxHashMap;
use rustc_data_structures::indexed_vec::{IndexVec};
use syntax::codemap::DUMMY_SP;
/// subsequent search so that it is solely relative to that
/// base-lvalue). For the remaining lookup, we map the projection
/// elem to the associated MovePathIndex.
- projections: FnvHashMap<(MovePathIndex, AbstractElem<'tcx>), MovePathIndex>
+ projections: FxHashMap<(MovePathIndex, AbstractElem<'tcx>), MovePathIndex>
}
struct MoveDataBuilder<'a, 'tcx: 'a> {
locals: mir.local_decls.indices().map(Lvalue::Local).map(|v| {
Self::new_move_path(&mut move_paths, &mut path_map, None, v)
}).collect(),
- projections: FnvHashMap(),
+ projections: FxHashMap(),
},
move_paths: move_paths,
path_map: path_map,
use rustc::middle::expr_use_visitor::MutateMode;
use rustc::middle::mem_categorization as mc;
use rustc::ty::{self, TyCtxt};
-use rustc::util::nodemap::{FnvHashMap, NodeSet};
+use rustc::util::nodemap::{FxHashMap, NodeSet};
use std::cell::RefCell;
use std::rc::Rc;
pub paths: RefCell<Vec<MovePath<'tcx>>>,
/// Cache of loan path to move path index, for easy lookup.
- pub path_map: RefCell<FnvHashMap<Rc<LoanPath<'tcx>>, MovePathIndex>>,
+ pub path_map: RefCell<FxHashMap<Rc<LoanPath<'tcx>>, MovePathIndex>>,
/// Each move or uninitialized variable gets an entry here.
pub moves: RefCell<Vec<Move>>,
pub fn new() -> MoveData<'tcx> {
MoveData {
paths: RefCell::new(Vec::new()),
- path_map: RefCell::new(FnvHashMap()),
+ path_map: RefCell::new(FxHashMap()),
moves: RefCell::new(Vec::new()),
path_assignments: RefCell::new(Vec::new()),
var_assignments: RefCell::new(Vec::new()),
#![allow(non_camel_case_types)]
-#![feature(dotdot_in_tuple_patterns)]
+#![cfg_attr(stage0, feature(dotdot_in_tuple_patterns))]
#![feature(quote)]
#![feature(rustc_diagnostic_macros)]
#![feature(rustc_private)]
use rustc_const_math::ConstInt;
-use rustc_data_structures::fnv::FnvHashMap;
+use rustc_data_structures::fx::FxHashMap;
use rustc_data_structures::indexed_vec::Idx;
use pattern::{FieldPattern, Pattern, PatternKind};
/// associated types to get field types.
pub wild_pattern: &'a Pattern<'tcx>,
pub pattern_arena: &'a TypedArena<Pattern<'tcx>>,
- pub byte_array_map: FnvHashMap<*const Pattern<'tcx>, Vec<&'a Pattern<'tcx>>>,
+ pub byte_array_map: FxHashMap<*const Pattern<'tcx>, Vec<&'a Pattern<'tcx>>>,
}
impl<'a, 'tcx> MatchCheckCtxt<'a, 'tcx> {
tcx: tcx,
wild_pattern: &wild_pattern,
pattern_arena: &pattern_arena,
- byte_array_map: FnvHashMap(),
+ byte_array_map: FxHashMap(),
})
}
html_favicon_url = "https://doc.rust-lang.org/favicon.ico",
html_root_url = "https://doc.rust-lang.org/nightly/")]
-#![feature(dotdot_in_tuple_patterns)]
+#![cfg_attr(stage0, feature(dotdot_in_tuple_patterns))]
#![feature(rustc_private)]
#![feature(staged_api)]
#![feature(rustc_diagnostic_macros)]
--- /dev/null
+// Copyright 2015 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+use std::collections::{HashMap, HashSet};
+use std::default::Default;
+use std::hash::{Hasher, Hash, BuildHasherDefault};
+use std::ops::BitXor;
+
+pub type FxHashMap<K, V> = HashMap<K, V, BuildHasherDefault<FxHasher>>;
+pub type FxHashSet<V> = HashSet<V, BuildHasherDefault<FxHasher>>;
+
+#[allow(non_snake_case)]
+pub fn FxHashMap<K: Hash + Eq, V>() -> FxHashMap<K, V> {
+ HashMap::default()
+}
+
+#[allow(non_snake_case)]
+pub fn FxHashSet<V: Hash + Eq>() -> FxHashSet<V> {
+ HashSet::default()
+}
+
+/// A speedy hash algorithm for use within rustc. The hashmap in libcollections
+/// by default uses SipHash which isn't quite as speedy as we want. In the
+/// compiler we're not really worried about DOS attempts, so we use a fast
+/// non-cryptographic hash.
+///
+/// This is the same as the algorithm used by Firefox -- which is a homespun
+/// one not based on any widely-known algorithm -- though modified to produce
+/// 64-bit hash values instead of 32-bit hash values. It consistently
+/// out-performs an FNV-based hash within rustc itself -- the collision rate is
+/// similar or slightly worse than FNV, but the speed of the hash function
+/// itself is much higher because it works on up to 8 bytes at a time.
+pub struct FxHasher {
+ hash: usize
+}
+
+#[cfg(target_pointer_width = "32")]
+const K: usize = 0x9e3779b9;
+#[cfg(target_pointer_width = "64")]
+const K: usize = 0x517cc1b727220a95;
+
+impl Default for FxHasher {
+ #[inline]
+ fn default() -> FxHasher {
+ FxHasher { hash: 0 }
+ }
+}
+
+impl FxHasher {
+ #[inline]
+ fn add_to_hash(&mut self, i: usize) {
+ self.hash = self.hash.rotate_left(5).bitxor(i).wrapping_mul(K);
+ }
+}
+
+impl Hasher for FxHasher {
+ #[inline]
+ fn write(&mut self, bytes: &[u8]) {
+ for byte in bytes {
+ let i = *byte;
+ self.add_to_hash(i as usize);
+ }
+ }
+
+ #[inline]
+ fn write_u8(&mut self, i: u8) {
+ self.add_to_hash(i as usize);
+ }
+
+ #[inline]
+ fn write_u16(&mut self, i: u16) {
+ self.add_to_hash(i as usize);
+ }
+
+ #[inline]
+ fn write_u32(&mut self, i: u32) {
+ self.add_to_hash(i as usize);
+ }
+
+ #[cfg(target_pointer_width = "32")]
+ #[inline]
+ fn write_u64(&mut self, i: u64) {
+ self.add_to_hash(i as usize);
+ self.add_to_hash((i >> 32) as usize);
+ }
+
+ #[cfg(target_pointer_width = "64")]
+ #[inline]
+ fn write_u64(&mut self, i: u64) {
+ self.add_to_hash(i as usize);
+ }
+
+ #[inline]
+ fn write_usize(&mut self, i: usize) {
+ self.add_to_hash(i);
+ }
+
+ #[inline]
+ fn finish(&self) -> u64 {
+ self.hash as u64
+ }
+}
+
+pub fn hash<T: Hash>(v: &T) -> u64 {
+ let mut state = FxHasher::default();
+ v.hash(&mut state);
+ state.finish()
+}
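The `FxHasher` added above can be exercised outside the compiler. The sketch below is a hypothetical standalone copy of the 64-bit mixing step (rotate left by 5, xor in the new word, multiply by the odd constant `K`), assuming a 64-bit target; `DemoFxHasher` and `DemoFxHashMap` are illustrative names, not part of the patch.

```rust
use std::collections::HashMap;
use std::hash::{BuildHasherDefault, Hash, Hasher};
use std::ops::BitXor;

// 64-bit constant from the patch above; this demo assumes a 64-bit target.
const K: usize = 0x517cc1b727220a95;

/// Standalone copy of the Fx mixing step, for illustration only.
#[derive(Default)]
struct DemoFxHasher {
    hash: usize,
}

impl DemoFxHasher {
    #[inline]
    fn add_to_hash(&mut self, i: usize) {
        // Rotate the running state, fold in the new word, then multiply
        // by an odd constant so every input bit affects the whole word.
        self.hash = self.hash.rotate_left(5).bitxor(i).wrapping_mul(K);
    }
}

impl Hasher for DemoFxHasher {
    fn write(&mut self, bytes: &[u8]) {
        for &byte in bytes {
            self.add_to_hash(byte as usize);
        }
    }

    fn finish(&self) -> u64 {
        self.hash as u64
    }
}

// Mirrors the `FxHashMap` alias: an ordinary HashMap with the Fx hasher.
type DemoFxHashMap<K2, V> = HashMap<K2, V, BuildHasherDefault<DemoFxHasher>>;

fn main() {
    // The hash is deterministic: the same key always mixes to the same value.
    let hash = |s: &str| {
        let mut h = DemoFxHasher::default();
        s.hash(&mut h);
        h.finish()
    };
    assert_eq!(hash("NodeId(42)"), hash("NodeId(42)"));

    // And the map behaves like any other HashMap, just with a faster,
    // non-cryptographic hasher.
    let mut map: DemoFxHashMap<&str, u32> = DemoFxHashMap::default();
    map.insert("layout_depth", 0);
    assert_eq!(map["layout_depth"], 0);
    println!("fx demo ok");
}
```

Unlike SipHash, this mixer is not resistant to adversarial inputs; the trade-off only makes sense because, as the doc comment says, the compiler is not worried about hash-flooding attacks on its own internal tables.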
pub mod transitive_relation;
pub mod unify;
pub mod fnv;
+pub mod fx;
pub mod tuple_slice;
pub mod veccell;
pub mod control_flow_graph;
//! in the first place). See README.md for a general overview of how
//! to use this class.
-use fnv::{FnvHashMap, FnvHashSet};
+use fx::{FxHashMap, FxHashSet};
use std::cell::Cell;
use std::collections::hash_map::Entry;
/// backtrace iterator (which uses `split_at`).
nodes: Vec<Node<O>>,
/// A cache of predicates that have been successfully completed.
- done_cache: FnvHashSet<O::Predicate>,
+ done_cache: FxHashSet<O::Predicate>,
    /// A cache of the nodes in `nodes`, indexed by predicate.
- waiting_cache: FnvHashMap<O::Predicate, NodeIndex>,
+ waiting_cache: FxHashMap<O::Predicate, NodeIndex>,
/// A list of the obligations added in snapshots, to allow
/// for their removal.
cache_list: Vec<O::Predicate>,
ObligationForest {
nodes: vec![],
snapshots: vec![],
- done_cache: FnvHashSet(),
- waiting_cache: FnvHashMap(),
+ done_cache: FxHashSet(),
+ waiting_cache: FxHashMap(),
cache_list: vec![],
scratch: Some(vec![]),
}
// option. This file may not be copied, modified, or distributed
// except according to those terms.
-use fnv::FnvHashMap;
+use fx::FxHashMap;
use std::hash::Hash;
use std::ops;
use std::mem;
pub struct SnapshotMap<K, V>
where K: Hash + Clone + Eq
{
- map: FnvHashMap<K, V>,
+ map: FxHashMap<K, V>,
undo_log: Vec<UndoLog<K, V>>,
}
{
pub fn new() -> Self {
SnapshotMap {
- map: FnvHashMap(),
+ map: FxHashMap(),
undo_log: vec![],
}
}
#![cfg_attr(not(stage0), deny(warnings))]
#![feature(box_syntax)]
-#![feature(dotdot_in_tuple_patterns)]
+#![cfg_attr(stage0, feature(dotdot_in_tuple_patterns))]
#![feature(libc)]
#![feature(quote)]
#![feature(rustc_diagnostic_macros)]
use rustc::dep_graph::debug::{DepNodeFilter, EdgeFilter};
use rustc::hir::def_id::DefId;
use rustc::ty::TyCtxt;
-use rustc_data_structures::fnv::FnvHashSet;
+use rustc_data_structures::fx::FxHashSet;
use rustc_data_structures::graph::{Direction, INCOMING, OUTGOING, NodeIndex};
use rustc::hir;
use rustc::hir::intravisit::Visitor;
}
}
-pub struct GraphvizDepGraph<'q>(FnvHashSet<&'q DepNode<DefId>>,
+pub struct GraphvizDepGraph<'q>(FxHashSet<&'q DepNode<DefId>>,
Vec<(&'q DepNode<DefId>, &'q DepNode<DefId>)>);
impl<'a, 'tcx, 'q> dot::GraphWalk<'a> for GraphvizDepGraph<'q> {
// filter) or the set of nodes whose labels contain all of those
// substrings.
fn node_set<'q>(query: &'q DepGraphQuery<DefId>, filter: &DepNodeFilter)
- -> Option<FnvHashSet<&'q DepNode<DefId>>>
+ -> Option<FxHashSet<&'q DepNode<DefId>>>
{
debug!("node_set(filter={:?})", filter);
}
fn filter_nodes<'q>(query: &'q DepGraphQuery<DefId>,
- sources: &Option<FnvHashSet<&'q DepNode<DefId>>>,
- targets: &Option<FnvHashSet<&'q DepNode<DefId>>>)
- -> FnvHashSet<&'q DepNode<DefId>>
+ sources: &Option<FxHashSet<&'q DepNode<DefId>>>,
+ targets: &Option<FxHashSet<&'q DepNode<DefId>>>)
+ -> FxHashSet<&'q DepNode<DefId>>
{
if let &Some(ref sources) = sources {
if let &Some(ref targets) = targets {
}
fn walk_nodes<'q>(query: &'q DepGraphQuery<DefId>,
- starts: &FnvHashSet<&'q DepNode<DefId>>,
+ starts: &FxHashSet<&'q DepNode<DefId>>,
direction: Direction)
- -> FnvHashSet<&'q DepNode<DefId>>
+ -> FxHashSet<&'q DepNode<DefId>>
{
- let mut set = FnvHashSet();
+ let mut set = FxHashSet();
for &start in starts {
debug!("walk_nodes: start={:?} outgoing?={:?}", start, direction == OUTGOING);
if set.insert(start) {
}
fn walk_between<'q>(query: &'q DepGraphQuery<DefId>,
- sources: &FnvHashSet<&'q DepNode<DefId>>,
- targets: &FnvHashSet<&'q DepNode<DefId>>)
- -> FnvHashSet<&'q DepNode<DefId>>
+ sources: &FxHashSet<&'q DepNode<DefId>>,
+ targets: &FxHashSet<&'q DepNode<DefId>>)
+ -> FxHashSet<&'q DepNode<DefId>>
{
// This is a bit tricky. We want to include a node only if it is:
// (a) reachable from a source and (b) will reach a target. And we
}
fn filter_edges<'q>(query: &'q DepGraphQuery<DefId>,
- nodes: &FnvHashSet<&'q DepNode<DefId>>)
+ nodes: &FxHashSet<&'q DepNode<DefId>>)
-> Vec<(&'q DepNode<DefId>, &'q DepNode<DefId>)>
{
query.edges()
use rustc::hir::def_id::{CRATE_DEF_INDEX, DefId};
use rustc::hir::intravisit as visit;
use rustc::ty::TyCtxt;
-use rustc_data_structures::fnv::FnvHashMap;
+use rustc_data_structures::fx::FxHashMap;
use rustc::util::common::record_time;
use rustc::session::config::DebugInfoLevel::NoDebugInfo;
pub mod hasher;
pub struct IncrementalHashesMap {
- hashes: FnvHashMap<DepNode<DefId>, Fingerprint>,
+ hashes: FxHashMap<DepNode<DefId>, Fingerprint>,
// These are the metadata hashes for the current crate as they were stored
// during the last compilation session. They are only loaded if
// -Z query-dep-graph was specified and are needed for auto-tests using
// the #[rustc_metadata_dirty] and #[rustc_metadata_clean] attributes to
// check whether some metadata hash has changed in between two revisions.
- pub prev_metadata_hashes: RefCell<FnvHashMap<DefId, Fingerprint>>,
+ pub prev_metadata_hashes: RefCell<FxHashMap<DefId, Fingerprint>>,
}
impl IncrementalHashesMap {
pub fn new() -> IncrementalHashesMap {
IncrementalHashesMap {
- hashes: FnvHashMap(),
- prev_metadata_hashes: RefCell::new(FnvHashMap()),
+ hashes: FxHashMap(),
+ prev_metadata_hashes: RefCell::new(FxHashMap()),
}
}
html_root_url = "https://doc.rust-lang.org/nightly/")]
#![cfg_attr(not(stage0), deny(warnings))]
-#![feature(dotdot_in_tuple_patterns)]
+#![cfg_attr(stage0, feature(dotdot_in_tuple_patterns))]
#![cfg_attr(stage0, feature(question_mark))]
#![feature(rustc_private)]
#![feature(staged_api)]
use rustc::dep_graph::{DepNode, WorkProduct, WorkProductId};
use rustc::hir::def_id::DefIndex;
use std::sync::Arc;
-use rustc_data_structures::fnv::FnvHashMap;
+use rustc_data_structures::fx::FxHashMap;
use ich::Fingerprint;
use super::directory::DefPathIndex;
/// is only populated if -Z query-dep-graph is specified. It will be
/// empty otherwise. Importing crates are perfectly happy with just having
/// the DefIndex.
- pub index_map: FnvHashMap<DefIndex, DefPathIndex>
+ pub index_map: FxHashMap<DefIndex, DefPathIndex>
}
/// The hash for some metadata that (when saving) will be exported
use rustc::hir::def_id::DefId;
use rustc::hir::intravisit::Visitor;
use syntax::ast::{self, Attribute, NestedMetaItem};
-use rustc_data_structures::fnv::{FnvHashSet, FnvHashMap};
+use rustc_data_structures::fx::{FxHashSet, FxHashMap};
use syntax::parse::token::InternedString;
use syntax_pos::Span;
use rustc::ty::TyCtxt;
}
let _ignore = tcx.dep_graph.in_ignore();
- let dirty_inputs: FnvHashSet<DepNode<DefId>> =
+ let dirty_inputs: FxHashSet<DepNode<DefId>> =
dirty_inputs.iter()
.filter_map(|d| retraced.map(d))
.collect();
pub struct DirtyCleanVisitor<'a, 'tcx:'a> {
tcx: TyCtxt<'a, 'tcx, 'tcx>,
query: &'a DepGraphQuery<DefId>,
- dirty_inputs: FnvHashSet<DepNode<DefId>>,
+ dirty_inputs: FxHashSet<DepNode<DefId>>,
}
impl<'a, 'tcx> DirtyCleanVisitor<'a, 'tcx> {
}
pub fn check_dirty_clean_metadata<'a, 'tcx>(tcx: TyCtxt<'a, 'tcx, 'tcx>,
- prev_metadata_hashes: &FnvHashMap<DefId, Fingerprint>,
- current_metadata_hashes: &FnvHashMap<DefId, Fingerprint>) {
+ prev_metadata_hashes: &FxHashMap<DefId, Fingerprint>,
+ current_metadata_hashes: &FxHashMap<DefId, Fingerprint>) {
if !tcx.sess.opts.debugging_opts.query_dep_graph {
return;
}
pub struct DirtyCleanMetadataVisitor<'a, 'tcx:'a, 'm> {
tcx: TyCtxt<'a, 'tcx, 'tcx>,
- prev_metadata_hashes: &'m FnvHashMap<DefId, Fingerprint>,
- current_metadata_hashes: &'m FnvHashMap<DefId, Fingerprint>,
+ prev_metadata_hashes: &'m FxHashMap<DefId, Fingerprint>,
+ current_metadata_hashes: &'m FxHashMap<DefId, Fingerprint>,
}
impl<'a, 'tcx, 'm> Visitor<'tcx> for DirtyCleanMetadataVisitor<'a, 'tcx, 'm> {
use rustc::ty::TyCtxt;
use rustc::util::fs as fs_util;
use rustc_data_structures::flock;
-use rustc_data_structures::fnv::{FnvHashSet, FnvHashMap};
+use rustc_data_structures::fx::{FxHashSet, FxHashMap};
use std::ffi::OsString;
use std::fs as std_fs;
debug!("crate-dir: {}", crate_dir.display());
try!(create_dir(tcx.sess, &crate_dir, "crate"));
- let mut source_directories_already_tried = FnvHashSet();
+ let mut source_directories_already_tried = FxHashSet();
loop {
// Generate a session directory of the form:
/// Find the most recent published session directory that is not in the
/// ignore-list.
fn find_source_directory(crate_dir: &Path,
- source_directories_already_tried: &FnvHashSet<PathBuf>)
+ source_directories_already_tried: &FxHashSet<PathBuf>)
-> Option<PathBuf> {
let iter = crate_dir.read_dir()
.unwrap() // FIXME
}
fn find_source_directory_in_iter<I>(iter: I,
- source_directories_already_tried: &FnvHashSet<PathBuf>)
+ source_directories_already_tried: &FxHashSet<PathBuf>)
-> Option<PathBuf>
where I: Iterator<Item=PathBuf>
{
// First do a pass over the crate directory, collecting lock files and
// session directories
- let mut session_directories = FnvHashSet();
- let mut lock_files = FnvHashSet();
+ let mut session_directories = FxHashSet();
+ let mut lock_files = FxHashSet();
for dir_entry in try!(crate_directory.read_dir()) {
let dir_entry = match dir_entry {
}
// Now map from lock files to session directories
- let lock_file_to_session_dir: FnvHashMap<String, Option<String>> =
+ let lock_file_to_session_dir: FxHashMap<String, Option<String>> =
lock_files.into_iter()
.map(|lock_file_name| {
assert!(lock_file_name.ends_with(LOCK_FILE_EXT));
}
// Filter out `None` directories
- let lock_file_to_session_dir: FnvHashMap<String, String> =
+ let lock_file_to_session_dir: FxHashMap<String, String> =
lock_file_to_session_dir.into_iter()
.filter_map(|(lock_file_name, directory_name)| {
directory_name.map(|n| (lock_file_name, n))
}
fn all_except_most_recent(deletion_candidates: Vec<(SystemTime, PathBuf, Option<flock::Lock>)>)
- -> FnvHashMap<PathBuf, Option<flock::Lock>> {
+ -> FxHashMap<PathBuf, Option<flock::Lock>> {
let most_recent = deletion_candidates.iter()
.map(|&(timestamp, ..)| timestamp)
.max();
.map(|(_, path, lock)| (path, lock))
.collect()
} else {
- FnvHashMap()
+ FxHashMap()
}
}
(UNIX_EPOCH + Duration::new(5, 0), PathBuf::from("5"), None),
(UNIX_EPOCH + Duration::new(3, 0), PathBuf::from("3"), None),
(UNIX_EPOCH + Duration::new(2, 0), PathBuf::from("2"), None),
- ]).keys().cloned().collect::<FnvHashSet<PathBuf>>(),
+ ]).keys().cloned().collect::<FxHashSet<PathBuf>>(),
vec![
PathBuf::from("1"),
PathBuf::from("2"),
PathBuf::from("3"),
PathBuf::from("4"),
- ].into_iter().collect::<FnvHashSet<PathBuf>>()
+ ].into_iter().collect::<FxHashSet<PathBuf>>()
);
assert_eq!(all_except_most_recent(
vec![
- ]).keys().cloned().collect::<FnvHashSet<PathBuf>>(),
- FnvHashSet()
+ ]).keys().cloned().collect::<FxHashSet<PathBuf>>(),
+ FxHashSet()
);
}
#[test]
fn test_find_source_directory_in_iter() {
- let already_visited = FnvHashSet();
+ let already_visited = FxHashSet();
// Find newest
assert_eq!(find_source_directory_in_iter(
use rustc::hir::def_id::{CrateNum, DefId};
use rustc::hir::svh::Svh;
use rustc::ty::TyCtxt;
-use rustc_data_structures::fnv::FnvHashMap;
+use rustc_data_structures::fx::FxHashMap;
use rustc_data_structures::flock;
use rustc_serialize::Decodable;
use rustc_serialize::opaque::Decoder;
pub struct HashContext<'a, 'tcx: 'a> {
pub tcx: TyCtxt<'a, 'tcx, 'tcx>,
incremental_hashes_map: &'a IncrementalHashesMap,
- item_metadata_hashes: FnvHashMap<DefId, Fingerprint>,
- crate_hashes: FnvHashMap<CrateNum, Svh>,
+ item_metadata_hashes: FxHashMap<DefId, Fingerprint>,
+ crate_hashes: FxHashMap<CrateNum, Svh>,
}
impl<'a, 'tcx> HashContext<'a, 'tcx> {
HashContext {
tcx: tcx,
incremental_hashes_map: incremental_hashes_map,
- item_metadata_hashes: FnvHashMap(),
- crate_hashes: FnvHashMap(),
+ item_metadata_hashes: FxHashMap(),
+ crate_hashes: FxHashMap(),
}
}
use rustc::hir::svh::Svh;
use rustc::session::Session;
use rustc::ty::TyCtxt;
-use rustc_data_structures::fnv::{FnvHashSet, FnvHashMap};
+use rustc_data_structures::fx::{FxHashSet, FxHashMap};
use rustc_serialize::Decodable as RustcDecodable;
use rustc_serialize::opaque::Decoder;
use std::fs;
use super::fs::*;
use super::file_format;
-pub type DirtyNodes = FnvHashSet<DepNode<DefPathIndex>>;
+pub type DirtyNodes = FxHashSet<DepNode<DefPathIndex>>;
/// If we are in incremental mode, and a previous dep-graph exists,
/// then load up those nodes/edges that are still valid into the
// Compute which work-products have an input that has changed or
// been removed. Put the dirty ones into a set.
- let mut dirty_target_nodes = FnvHashSet();
+ let mut dirty_target_nodes = FxHashSet();
for &(raw_source_node, ref target_node) in &retraced_edges {
if dirty_raw_source_nodes.contains(raw_source_node) {
if !dirty_target_nodes.contains(target_node) {
retraced: &RetracedDefIdDirectory)
-> DirtyNodes {
let mut hcx = HashContext::new(tcx, incremental_hashes_map);
- let mut dirty_nodes = FnvHashSet();
+ let mut dirty_nodes = FxHashSet();
for hash in serialized_hashes {
if let Some(dep_node) = retraced.map(&hash.dep_node) {
/// otherwise no longer applicable.
fn reconcile_work_products<'a, 'tcx>(tcx: TyCtxt<'a, 'tcx, 'tcx>,
work_products: Vec<SerializedWorkProduct>,
- dirty_target_nodes: &FnvHashSet<DepNode<DefId>>) {
+ dirty_target_nodes: &FxHashSet<DepNode<DefId>>) {
debug!("reconcile_work_products({:?})", work_products);
for swp in work_products {
if dirty_target_nodes.contains(&DepNode::WorkProduct(swp.id.clone())) {
fn load_prev_metadata_hashes(tcx: TyCtxt,
retraced: &RetracedDefIdDirectory,
- output: &mut FnvHashMap<DefId, Fingerprint>) {
+ output: &mut FxHashMap<DefId, Fingerprint>) {
if !tcx.sess.opts.debugging_opts.query_dep_graph {
return
}
use rustc::dep_graph::{DepGraphQuery, DepNode};
use rustc::hir::def_id::DefId;
-use rustc_data_structures::fnv::FnvHashMap;
+use rustc_data_structures::fx::FxHashMap;
use rustc_data_structures::graph::{DepthFirstTraversal, INCOMING, NodeIndex};
use super::hash::*;
// nodes.
// - Values: transitive predecessors of the key that are hashable
// (e.g., HIR nodes, input meta-data nodes)
- pub inputs: FnvHashMap<&'query DepNode<DefId>, Vec<&'query DepNode<DefId>>>,
+ pub inputs: FxHashMap<&'query DepNode<DefId>, Vec<&'query DepNode<DefId>>>,
// - Keys: some hashable node
// - Values: the hash thereof
- pub hashes: FnvHashMap<&'query DepNode<DefId>, Fingerprint>,
+ pub hashes: FxHashMap<&'query DepNode<DefId>, Fingerprint>,
}
impl<'q> Predecessors<'q> {
let all_nodes = query.graph.all_nodes();
let tcx = hcx.tcx;
- let inputs: FnvHashMap<_, _> = all_nodes.iter()
+ let inputs: FxHashMap<_, _> = all_nodes.iter()
.enumerate()
.filter(|&(_, node)| match node.data {
DepNode::WorkProduct(_) => true,
})
.collect();
- let mut hashes = FnvHashMap();
+ let mut hashes = FxHashMap();
for input in inputs.values().flat_map(|v| v.iter().cloned()) {
hashes.entry(input)
.or_insert_with(|| hcx.hash(input).unwrap());
use rustc::hir::svh::Svh;
use rustc::session::Session;
use rustc::ty::TyCtxt;
-use rustc_data_structures::fnv::FnvHashMap;
+use rustc_data_structures::fx::FxHashMap;
use rustc_serialize::Encodable as RustcEncodable;
use rustc_serialize::opaque::Encoder;
use std::hash::Hash;
let query = tcx.dep_graph.query();
let mut hcx = HashContext::new(tcx, incremental_hashes_map);
let preds = Predecessors::new(&query, &mut hcx);
- let mut current_metadata_hashes = FnvHashMap();
+ let mut current_metadata_hashes = FxHashMap();
// IMPORTANT: We are saving the metadata hashes *before* the dep-graph,
// since metadata-encoding might add new entries to the
svh: Svh,
preds: &Predecessors,
builder: &mut DefIdDirectoryBuilder,
- current_metadata_hashes: &mut FnvHashMap<DefId, Fingerprint>,
+ current_metadata_hashes: &mut FxHashMap<DefId, Fingerprint>,
encoder: &mut Encoder)
-> io::Result<()> {
// For each `MetaData(X)` node where `X` is local, accumulate a
// (I initially wrote this with an iterator, but it seemed harder to read.)
let mut serialized_hashes = SerializedMetadataHashes {
hashes: vec![],
- index_map: FnvHashMap()
+ index_map: FxHashMap()
};
- let mut def_id_hashes = FnvHashMap();
+ let mut def_id_hashes = FxHashMap();
for (&target, sources) in &preds.inputs {
let def_id = match *target {
#![cfg_attr(test, feature(test))]
#![feature(box_patterns)]
#![feature(box_syntax)]
-#![feature(dotdot_in_tuple_patterns)]
+#![cfg_attr(stage0, feature(dotdot_in_tuple_patterns))]
#![feature(quote)]
#![feature(rustc_diagnostic_macros)]
#![feature(rustc_private)]
#![feature(slice_patterns)]
#![feature(staged_api)]
-#![feature(dotdot_in_tuple_patterns)]
#[macro_use]
extern crate syntax;
use middle::const_val::ConstVal;
use rustc_const_eval::eval_const_expr_partial;
use rustc_const_eval::EvalHint::ExprTypeChecked;
-use util::nodemap::FnvHashSet;
+use util::nodemap::FxHashSet;
use lint::{LateContext, LintContext, LintArray};
use lint::{LintPass, LateLintPass};
impl<'a, 'tcx> ImproperCTypesVisitor<'a, 'tcx> {
/// Check if the given type is "ffi-safe" (has a stable, well-defined
/// representation which can be exported to C code).
- fn check_type_for_ffi(&self, cache: &mut FnvHashSet<Ty<'tcx>>, ty: Ty<'tcx>) -> FfiResult {
+ fn check_type_for_ffi(&self, cache: &mut FxHashSet<Ty<'tcx>>, ty: Ty<'tcx>) -> FfiResult {
use self::FfiResult::*;
let cx = self.cx.tcx;
// any generic types right now:
let ty = self.cx.tcx.normalize_associated_type(&ty);
- match self.check_type_for_ffi(&mut FnvHashSet(), ty) {
+ match self.check_type_for_ffi(&mut FxHashSet(), ty) {
FfiResult::FfiSafe => {}
FfiResult::FfiUnsafe(s) => {
self.cx.span_lint(IMPROPER_CTYPES, sp, s);
use rustc::hir::pat_util;
use rustc::ty;
use rustc::ty::adjustment;
-use util::nodemap::FnvHashMap;
+use util::nodemap::FxHashMap;
use lint::{LateContext, EarlyContext, LintContext, LintArray};
use lint::{LintPass, EarlyLintPass, LateLintPass};
// collect all mutable pattern and group their NodeIDs by their Identifier to
// avoid false warnings in match arms with multiple patterns
- let mut mutables = FnvHashMap();
+ let mut mutables = FxHashMap();
for p in pats {
pat_util::pat_bindings(p, |mode, id, _, path1| {
let name = path1.node;
// of llvm-config, not the target that we're attempting to link.
let mut cmd = Command::new(&llvm_config);
cmd.arg("--libs");
+
+ // Force static linking with "--link-static" if available.
+ let mut version_cmd = Command::new(&llvm_config);
+ version_cmd.arg("--version");
+ let version_output = output(&mut version_cmd);
+ let mut parts = version_output.split('.');
+ if let (Some(major), Some(minor)) = (parts.next().and_then(|s| s.parse::<u32>().ok()),
+ parts.next().and_then(|s| s.parse::<u32>().ok())) {
+ if major > 3 || (major == 3 && minor >= 8) {
+ cmd.arg("--link-static");
+ }
+ }
+
if !is_crossed {
cmd.arg("--system-libs");
}
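The hunk above gates `--link-static` on the `llvm-config --version` output, passing the flag only for LLVM 3.8 and newer. The same check can be factored into a standalone predicate — a sketch under the assumption that the version string starts with `major.minor`, as the diff itself assumes:

```rust
// Decide whether `llvm-config` is new enough (>= 3.8) to accept
// `--link-static`, given its `--version` output.
fn supports_link_static(version: &str) -> bool {
    let mut parts = version.trim().split('.');
    let major = parts.next().and_then(|s| s.parse::<u32>().ok());
    let minor = parts.next().and_then(|s| s.parse::<u32>().ok());
    match (major, minor) {
        (Some(major), Some(minor)) => major > 3 || (major == 3 && minor >= 8),
        // Unparseable version: be conservative and skip the flag,
        // matching the diff's behavior of only adding it when both
        // components parse.
        _ => false,
    }
}

fn main() {
    assert!(supports_link_static("3.8.1"));
    assert!(supports_link_static("4.0.0"));
    assert!(!supports_link_static("3.7.1"));
    println!("ok");
}
```

Note that only the first two dot-separated components are parsed, so suffixed versions like `3.9.0svn` still gate correctly.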
use rustc::session::search_paths::PathKind;
use rustc::middle;
use rustc::middle::cstore::{CrateStore, validate_crate_name, ExternCrate};
-use rustc::util::nodemap::{FnvHashMap, FnvHashSet};
+use rustc::util::nodemap::{FxHashMap, FxHashSet};
use rustc::hir::map::Definitions;
use std::cell::{RefCell, Cell};
pub sess: &'a Session,
cstore: &'a CStore,
next_crate_num: CrateNum,
- foreign_item_map: FnvHashMap<String, Vec<ast::NodeId>>,
+ foreign_item_map: FxHashMap<String, Vec<ast::NodeId>>,
local_crate_name: String,
}
sess: sess,
cstore: cstore,
next_crate_num: cstore.next_crate_num(),
- foreign_item_map: FnvHashMap(),
+ foreign_item_map: FxHashMap(),
local_crate_name: local_crate_name.to_owned(),
}
}
fn update_extern_crate(&mut self,
cnum: CrateNum,
mut extern_crate: ExternCrate,
- visited: &mut FnvHashSet<(CrateNum, bool)>)
+ visited: &mut FxHashSet<(CrateNum, bool)>)
{
if !visited.insert((cnum, extern_crate.direct)) { return }
// The map from crate numbers in the crate we're resolving to local crate
// numbers
let deps = crate_root.crate_deps.decode(metadata);
- let map: FnvHashMap<_, _> = deps.enumerate().map(|(crate_num, dep)| {
+ let map: FxHashMap<_, _> = deps.enumerate().map(|(crate_num, dep)| {
debug!("resolving dep crate {} hash: `{}`", dep.name, dep.hash);
let (local_cnum, ..) = self.resolve_crate(root,
&dep.name.as_str(),
let extern_crate =
ExternCrate { def_id: def_id, span: item.span, direct: true, path_len: len };
- self.update_extern_crate(cnum, extern_crate, &mut FnvHashSet());
+ self.update_extern_crate(cnum, extern_crate, &mut FxHashSet());
self.cstore.add_extern_mod_stmt_cnum(info.id, cnum);
loaded_macros
use rustc::middle::cstore::ExternCrate;
use rustc_back::PanicStrategy;
use rustc_data_structures::indexed_vec::IndexVec;
-use rustc::util::nodemap::{FnvHashMap, NodeMap, NodeSet, DefIdMap};
+use rustc::util::nodemap::{FxHashMap, NodeMap, NodeSet, DefIdMap};
use std::cell::{RefCell, Cell};
use std::rc::Rc;
/// hashmap, which gives the reverse mapping. This allows us to
/// quickly retrace a `DefPath`, which is needed for incremental
/// compilation support.
- pub key_map: FnvHashMap<DefKey, DefIndex>,
+ pub key_map: FxHashMap<DefKey, DefIndex>,
/// Flag if this crate is required by an rlib version of this crate, or in
/// other words whether it was explicitly linked to. An example of a crate
pub struct CStore {
pub dep_graph: DepGraph,
- metas: RefCell<FnvHashMap<CrateNum, Rc<CrateMetadata>>>,
+ metas: RefCell<FxHashMap<CrateNum, Rc<CrateMetadata>>>,
/// Map from NodeId's of local extern crate statements to crate numbers
extern_mod_crate_map: RefCell<NodeMap<CrateNum>>,
used_crate_sources: RefCell<Vec<CrateSource>>,
pub fn new(dep_graph: &DepGraph) -> CStore {
CStore {
dep_graph: dep_graph.clone(),
- metas: RefCell::new(FnvHashMap()),
- extern_mod_crate_map: RefCell::new(FnvHashMap()),
+ metas: RefCell::new(FxHashMap()),
+ extern_mod_crate_map: RefCell::new(FxHashMap()),
used_crate_sources: RefCell::new(Vec::new()),
used_libraries: RefCell::new(Vec::new()),
used_link_args: RefCell::new(Vec::new()),
statically_included_foreign_items: RefCell::new(NodeSet()),
- visible_parent_map: RefCell::new(FnvHashMap()),
- inlined_item_cache: RefCell::new(FnvHashMap()),
- defid_for_inlined_node: RefCell::new(FnvHashMap()),
+ visible_parent_map: RefCell::new(FxHashMap()),
+ inlined_item_cache: RefCell::new(FxHashMap()),
+ defid_for_inlined_node: RefCell::new(FxHashMap()),
}
}
use rustc::hir::map as hir_map;
use rustc::hir::map::{DefKey, DefPathData};
-use rustc::util::nodemap::FnvHashMap;
+use rustc::util::nodemap::FxHashMap;
use rustc::hir;
use rustc::hir::intravisit::IdRange;
/// Go through each item in the metadata and create a map from that
/// item's def-key to the item's DefIndex.
- pub fn load_key_map(&self, index: LazySeq<Index>) -> FnvHashMap<DefKey, DefIndex> {
+ pub fn load_key_map(&self, index: LazySeq<Index>) -> FxHashMap<DefKey, DefIndex> {
index.iter_enumerated(self.raw_bytes())
.map(|(index, item)| (item.decode(self).def_key.decode(self), index))
.collect()
use rustc::ty::{self, Ty, TyCtxt};
use rustc::session::config::{self, CrateTypeProcMacro};
-use rustc::util::nodemap::{FnvHashMap, NodeSet};
+use rustc::util::nodemap::{FxHashMap, NodeSet};
use rustc_serialize::{Encodable, Encoder, SpecializedEncoder, opaque};
use std::hash::Hash;
reachable: &'a NodeSet,
lazy_state: LazyState,
- type_shorthands: FnvHashMap<Ty<'tcx>, usize>,
- predicate_shorthands: FnvHashMap<ty::Predicate<'tcx>, usize>,
+ type_shorthands: FxHashMap<Ty<'tcx>, usize>,
+ predicate_shorthands: FxHashMap<ty::Predicate<'tcx>, usize>,
}
macro_rules! encoder_methods {
variant: &U,
map: M)
-> Result<(), <Self as Encoder>::Error>
- where M: for<'b> Fn(&'b mut Self) -> &'b mut FnvHashMap<T, usize>,
+ where M: for<'b> Fn(&'b mut Self) -> &'b mut FxHashMap<T, usize>,
T: Clone + Eq + Hash,
U: Encodable
{
struct ImplVisitor<'a, 'tcx: 'a> {
tcx: TyCtxt<'a, 'tcx, 'tcx>,
- impls: FnvHashMap<DefId, Vec<DefIndex>>,
+ impls: FxHashMap<DefId, Vec<DefIndex>>,
}
impl<'a, 'tcx, 'v> Visitor<'v> for ImplVisitor<'a, 'tcx> {
fn encode_impls(&mut self) -> LazySeq<TraitImpls> {
let mut visitor = ImplVisitor {
tcx: self.tcx,
- impls: FnvHashMap(),
+ impls: FxHashMap(),
};
self.tcx.map.krate().visit_all_items(&mut visitor);
#![feature(box_patterns)]
#![feature(conservative_impl_trait)]
#![feature(core_intrinsics)]
-#![feature(dotdot_in_tuple_patterns)]
+#![cfg_attr(stage0, feature(dotdot_in_tuple_patterns))]
#![feature(proc_macro_internals)]
#![feature(proc_macro_lib)]
#![cfg_attr(stage0, feature(question_mark))]
use rustc::session::filesearch::{FileSearch, FileMatches, FileDoesntMatch};
use rustc::session::search_paths::PathKind;
use rustc::util::common;
-use rustc::util::nodemap::FnvHashMap;
+use rustc::util::nodemap::FxHashMap;
use rustc_llvm as llvm;
use rustc_llvm::{False, ObjectFile, mk_section_iter};
let rlib_prefix = format!("lib{}", self.crate_name);
let staticlib_prefix = format!("{}{}", staticpair.0, self.crate_name);
- let mut candidates = FnvHashMap();
+ let mut candidates = FxHashMap();
let mut staticlibs = vec![];
// First, find all possible candidate rlibs and dylibs purely based on
let hash_str = hash.to_string();
let slot = candidates.entry(hash_str)
- .or_insert_with(|| (FnvHashMap(), FnvHashMap()));
+ .or_insert_with(|| (FxHashMap(), FxHashMap()));
let (ref mut rlibs, ref mut dylibs) = *slot;
fs::canonicalize(path)
.map(|p| {
// A Library candidate is created if the metadata for the set of
// libraries corresponds to the crate id and hash criteria that this
// search is being performed for.
- let mut libraries = FnvHashMap();
+ let mut libraries = FxHashMap();
for (_hash, (rlibs, dylibs)) in candidates {
let mut slot = None;
let rlib = self.extract_one(rlibs, CrateFlavor::Rlib, &mut slot);
// be read, it is assumed that the file isn't a valid rust library (no
// errors are emitted).
fn extract_one(&mut self,
- m: FnvHashMap<PathBuf, PathKind>,
+ m: FxHashMap<PathBuf, PathKind>,
flavor: CrateFlavor,
slot: &mut Option<(Svh, MetadataBlob)>)
-> Option<(PathBuf, PathKind)> {
// rlibs/dylibs.
let sess = self.sess;
let dylibname = self.dylibname();
- let mut rlibs = FnvHashMap();
- let mut dylibs = FnvHashMap();
+ let mut rlibs = FxHashMap();
+ let mut dylibs = FxHashMap();
{
let locs = locs.map(|l| PathBuf::from(l)).filter(|loc| {
if !loc.exists() {
use std;
use rustc_const_math::{ConstMathErr, Op};
-use rustc_data_structures::fnv::FnvHashMap;
+use rustc_data_structures::fx::FxHashMap;
use rustc_data_structures::indexed_vec::Idx;
use build::{BlockAnd, BlockAndExtension, Builder};
// first process the set of fields that were provided
// (evaluating them in order given by user)
- let fields_map: FnvHashMap<_, _> =
+ let fields_map: FxHashMap<_, _> =
fields.into_iter()
.map(|f| (f.name, unpack!(block = this.as_operand(block, f.expr))))
.collect();
//! details.
use build::{BlockAnd, BlockAndExtension, Builder};
-use rustc_data_structures::fnv::FnvHashMap;
+use rustc_data_structures::fx::FxHashMap;
use rustc_data_structures::bitvec::BitVector;
use rustc::middle::const_val::ConstVal;
use rustc::ty::{AdtDef, Ty};
SwitchInt {
switch_ty: Ty<'tcx>,
options: Vec<ConstVal>,
- indices: FnvHashMap<ConstVal, usize>,
+ indices: FxHashMap<ConstVal, usize>,
},
// test for equality
use build::Builder;
use build::matches::{Candidate, MatchPair, Test, TestKind};
use hair::*;
-use rustc_data_structures::fnv::FnvHashMap;
+use rustc_data_structures::fx::FxHashMap;
use rustc_data_structures::bitvec::BitVector;
use rustc::middle::const_val::ConstVal;
use rustc::ty::{self, Ty};
// these maps are empty to start; cases are
// added below in add_cases_to_switch
options: vec![],
- indices: FnvHashMap(),
+ indices: FxHashMap(),
}
}
}
candidate: &Candidate<'pat, 'tcx>,
switch_ty: Ty<'tcx>,
options: &mut Vec<ConstVal>,
- indices: &mut FnvHashMap<ConstVal, usize>)
+ indices: &mut FxHashMap<ConstVal, usize>)
-> bool
{
let match_pair = match candidate.match_pairs.iter().find(|mp| mp.lvalue == *test_lvalue) {
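The `SwitchInt` test above keeps `options` (the distinct constants, in discovery order) alongside `indices` (constant → position), so a value seen in several match arms maps to one switch target. A self-contained sketch of that dedup pattern, with `HashMap` and `i64` standing in for `FxHashMap` and `ConstVal`:

```rust
use std::collections::HashMap;

// Return the switch-arm index for `val`, inserting it into `options`
// the first time it is seen and reusing the stored index afterwards.
fn index_of(options: &mut Vec<i64>,
            indices: &mut HashMap<i64, usize>,
            val: i64)
            -> usize {
    *indices.entry(val).or_insert_with(|| {
        options.push(val);
        options.len() - 1
    })
}

fn main() {
    let mut options = Vec::new();
    let mut indices = HashMap::new();
    assert_eq!(index_of(&mut options, &mut indices, 7), 0);
    assert_eq!(index_of(&mut options, &mut indices, 3), 1);
    // A repeated constant reuses its original index.
    assert_eq!(index_of(&mut options, &mut indices, 7), 0);
    assert_eq!(options, vec![7, 3]);
    println!("ok");
}
```

The `entry(...).or_insert_with(...)` form does a single hash lookup per call, which is the idiom `add_cases_to_switch` relies on.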
use rustc::mir::*;
use syntax_pos::Span;
use rustc_data_structures::indexed_vec::Idx;
-use rustc_data_structures::fnv::FnvHashMap;
+use rustc_data_structures::fx::FxHashMap;
pub struct Scope<'tcx> {
/// the scope-id within the scope_auxiliary
free: Option<FreeData<'tcx>>,
/// The cache for drop chain on “normal” exit into a particular BasicBlock.
- cached_exits: FnvHashMap<(BasicBlock, CodeExtent), BasicBlock>,
+ cached_exits: FxHashMap<(BasicBlock, CodeExtent), BasicBlock>,
}
struct DropData<'tcx> {
needs_cleanup: false,
drops: vec![],
free: None,
- cached_exits: FnvHashMap()
+ cached_exits: FxHashMap()
});
self.scope_auxiliary.push(ScopeAuxiliary {
extent: extent,
#![feature(associated_consts)]
#![feature(box_patterns)]
-#![feature(dotdot_in_tuple_patterns)]
+#![cfg_attr(stage0, feature(dotdot_in_tuple_patterns))]
#![feature(rustc_diagnostic_macros)]
#![feature(rustc_private)]
#![feature(staged_api)]
use rustc::mir::*;
use rustc::mir::transform::MirSource;
use rustc::ty::TyCtxt;
-use rustc_data_structures::fnv::FnvHashMap;
+use rustc_data_structures::fx::FxHashMap;
use rustc_data_structures::indexed_vec::{Idx};
use std::fmt::Display;
use std::fs;
}
fn scope_entry_exit_annotations(auxiliary: Option<&ScopeAuxiliaryVec>)
- -> FnvHashMap<Location, Vec<Annotation>>
+ -> FxHashMap<Location, Vec<Annotation>>
{
// compute scope/entry exit annotations
- let mut annotations = FnvHashMap();
+ let mut annotations = FxHashMap();
if let Some(auxiliary) = auxiliary {
for (scope_id, auxiliary) in auxiliary.iter_enumerated() {
annotations.entry(auxiliary.dom)
block: BasicBlock,
mir: &Mir,
w: &mut Write,
- annotations: &FnvHashMap<Location, Vec<Annotation>>)
+ annotations: &FxHashMap<Location, Vec<Annotation>>)
-> io::Result<()> {
let data = &mir[block];
/// Returns the total number of variables printed.
fn write_scope_tree(tcx: TyCtxt,
mir: &Mir,
- scope_tree: &FnvHashMap<VisibilityScope, Vec<VisibilityScope>>,
+ scope_tree: &FxHashMap<VisibilityScope, Vec<VisibilityScope>>,
w: &mut Write,
parent: VisibilityScope,
depth: usize)
writeln!(w, " {{")?;
// construct a scope tree and write it out
- let mut scope_tree: FnvHashMap<VisibilityScope, Vec<VisibilityScope>> = FnvHashMap();
+ let mut scope_tree: FxHashMap<VisibilityScope, Vec<VisibilityScope>> = FxHashMap();
for (index, scope_data) in mir.visibility_scopes.iter().enumerate() {
if let Some(parent) = scope_data.parent_scope {
scope_tree.entry(parent)
use rustc::mir::transform::{MirPass, MirSource, Pass};
use rustc::mir::visit::{MutVisitor, Visitor};
use rustc::ty::TyCtxt;
-use rustc::util::nodemap::FnvHashSet;
+use rustc::util::nodemap::FxHashSet;
use rustc_data_structures::indexed_vec::Idx;
use std::mem;
#[derive(Default)]
struct OptimizationList {
- and_stars: FnvHashSet<Location>,
+ and_stars: FxHashSet<Location>,
}
use rustc::hir;
use rustc::hir::intravisit as hir_visit;
use rustc::util::common::to_readable_str;
-use rustc::util::nodemap::{FnvHashMap, FnvHashSet};
+use rustc::util::nodemap::{FxHashMap, FxHashSet};
use syntax::ast::{self, NodeId, AttrId};
use syntax::visit as ast_visit;
use syntax_pos::Span;
struct StatCollector<'k> {
krate: Option<&'k hir::Crate>,
- data: FnvHashMap<&'static str, NodeData>,
- seen: FnvHashSet<Id>,
+ data: FxHashMap<&'static str, NodeData>,
+ seen: FxHashSet<Id>,
}
pub fn print_hir_stats(krate: &hir::Crate) {
let mut collector = StatCollector {
krate: Some(krate),
- data: FnvHashMap(),
- seen: FnvHashSet(),
+ data: FxHashMap(),
+ seen: FxHashSet(),
};
hir_visit::walk_crate(&mut collector, krate);
collector.print("HIR STATS");
pub fn print_ast_stats(krate: &ast::Crate, title: &str) {
let mut collector = StatCollector {
krate: None,
- data: FnvHashMap(),
- seen: FnvHashSet(),
+ data: FxHashMap(),
+ seen: FxHashSet(),
};
ast_visit::walk_crate(&mut collector, krate);
collector.print(title);
html_root_url = "https://doc.rust-lang.org/nightly/")]
#![cfg_attr(not(stage0), deny(warnings))]
-#![feature(dotdot_in_tuple_patterns)]
+#![cfg_attr(stage0, feature(dotdot_in_tuple_patterns))]
#![feature(rustc_diagnostic_macros)]
#![feature(staged_api)]
#![feature(rustc_private)]
html_root_url = "https://doc.rust-lang.org/nightly/")]
#![cfg_attr(not(stage0), deny(warnings))]
-#![feature(dotdot_in_tuple_patterns)]
+#![cfg_attr(stage0, feature(dotdot_in_tuple_patterns))]
#![feature(rustc_diagnostic_macros)]
#![feature(rustc_private)]
#![feature(staged_api)]
use rustc::hir::def::*;
use rustc::hir::def_id::{CRATE_DEF_INDEX, DefId};
use rustc::ty;
-use rustc::util::nodemap::FnvHashMap;
+use rustc::util::nodemap::FxHashMap;
use std::cell::Cell;
use std::rc::Rc;
self.invocations.insert(mark, invocation);
}
- let mut macros: FnvHashMap<_, _> = macros.into_iter().map(|mut def| {
+ let mut macros: FxHashMap<_, _> = macros.into_iter().map(|mut def| {
def.body = mark_tts(&def.body, mark);
let ext = macro_rules::compile(&self.session.parse_sess, &def);
(def.ident.name, (def, Rc::new(ext)))
#![feature(associated_consts)]
#![feature(borrow_state)]
-#![feature(dotdot_in_tuple_patterns)]
+#![cfg_attr(stage0, feature(dotdot_in_tuple_patterns))]
#![feature(rustc_diagnostic_macros)]
#![feature(rustc_private)]
#![feature(staged_api)]
use rustc::hir::def_id::{CrateNum, CRATE_DEF_INDEX, DefId};
use rustc::ty;
use rustc::hir::{Freevar, FreevarMap, TraitCandidate, TraitMap, GlobMap};
-use rustc::util::nodemap::{NodeMap, NodeSet, FnvHashMap, FnvHashSet};
+use rustc::util::nodemap::{NodeMap, NodeSet, FxHashMap, FxHashSet};
use syntax::ext::hygiene::{Mark, SyntaxContext};
use syntax::ast::{self, FloatTy};
/// error E0403: the name is already used for a type parameter in this type parameter list
NameAlreadyUsedInTypeParameterList(Name, &'a Span),
/// error E0404: is not a trait
- IsNotATrait(&'a str),
+ IsNotATrait(&'a str, &'a str),
/// error E0405: use of undeclared trait name
UndeclaredTraitName(&'a str, SuggestedCandidates),
/// error E0407: method is not a member of trait
err
}
- ResolutionError::IsNotATrait(name) => {
+ ResolutionError::IsNotATrait(name, kind_name) => {
let mut err = struct_span_err!(resolver.session,
span,
E0404,
"`{}` is not a trait",
name);
- err.span_label(span, &format!("not a trait"));
+ err.span_label(span, &format!("expected trait, found {}", kind_name));
err
}
ResolutionError::UndeclaredTraitName(name, candidates) => {
}
// Map from the name in a pattern to its binding mode.
-type BindingMap = FnvHashMap<Ident, BindingInfo>;
+type BindingMap = FxHashMap<Ident, BindingInfo>;
#[derive(Copy, Clone, PartialEq, Eq, Debug)]
enum PatternSource {
self.resolve_type(ty);
}
fn visit_poly_trait_ref(&mut self, tref: &ast::PolyTraitRef, m: &ast::TraitBoundModifier) {
- match self.resolve_trait_reference(tref.trait_ref.ref_id, &tref.trait_ref.path, 0) {
+ match self.resolve_trait_reference(tref.trait_ref.ref_id, &tref.trait_ref.path, 0, None) {
Ok(def) => self.record_def(tref.trait_ref.ref_id, def),
Err(_) => {
// error already reported
/// One local scope.
#[derive(Debug)]
struct Rib<'a> {
- bindings: FnvHashMap<Ident, Def>,
+ bindings: FxHashMap<Ident, Def>,
kind: RibKind<'a>,
}
impl<'a> Rib<'a> {
fn new(kind: RibKind<'a>) -> Rib<'a> {
Rib {
- bindings: FnvHashMap(),
+ bindings: FxHashMap(),
kind: kind,
}
}
// is the NodeId of the local `extern crate` item (otherwise, `extern_crate_id` is None).
extern_crate_id: Option<NodeId>,
- resolutions: RefCell<FnvHashMap<(Name, Namespace), &'a RefCell<NameResolution<'a>>>>,
+ resolutions: RefCell<FxHashMap<(Name, Namespace), &'a RefCell<NameResolution<'a>>>>,
no_implicit_prelude: bool,
kind: kind,
normal_ancestor_id: None,
extern_crate_id: None,
- resolutions: RefCell::new(FnvHashMap()),
+ resolutions: RefCell::new(FxHashMap()),
no_implicit_prelude: false,
glob_importers: RefCell::new(Vec::new()),
globs: RefCell::new((Vec::new())),
/// Interns the names of the primitive types.
struct PrimitiveTypeTable {
- primitive_types: FnvHashMap<Name, PrimTy>,
+ primitive_types: FxHashMap<Name, PrimTy>,
}
impl PrimitiveTypeTable {
fn new() -> PrimitiveTypeTable {
- let mut table = PrimitiveTypeTable { primitive_types: FnvHashMap() };
+ let mut table = PrimitiveTypeTable { primitive_types: FxHashMap() };
table.intern("bool", TyBool);
table.intern("char", TyChar);
// Maps the node id of a statement to the expansions of the `macro_rules!`s
// immediately above the statement (if appropriate).
- macros_at_scope: FnvHashMap<NodeId, Vec<Mark>>,
+ macros_at_scope: FxHashMap<NodeId, Vec<Mark>>,
graph_root: Module<'a>,
prelude: Option<Module<'a>>,
- trait_item_map: FnvHashMap<(Name, DefId), bool /* is static method? */>,
+ trait_item_map: FxHashMap<(Name, DefId), bool /* is static method? */>,
// Names of fields of an item `DefId` accessible with dot syntax.
// Used for hints during error reporting.
- field_names: FnvHashMap<DefId, Vec<Name>>,
+ field_names: FxHashMap<DefId, Vec<Name>>,
// All imports known to succeed or fail.
determined_imports: Vec<&'a ImportDirective<'a>>,
// all imports, but only glob imports are actually interesting).
pub glob_map: GlobMap,
- used_imports: FnvHashSet<(NodeId, Namespace)>,
- used_crates: FnvHashSet<CrateNum>,
+ used_imports: FxHashSet<(NodeId, Namespace)>,
+ used_crates: FxHashSet<CrateNum>,
pub maybe_unused_trait_imports: NodeSet,
privacy_errors: Vec<PrivacyError<'a>>,
ambiguity_errors: Vec<AmbiguityError<'a>>,
- disallowed_shadowing: Vec<(Name, Span, LegacyScope<'a>)>,
+ disallowed_shadowing: Vec<&'a LegacyBinding<'a>>,
arenas: &'a ResolverArenas<'a>,
dummy_binding: &'a NameBinding<'a>,
pub exported_macros: Vec<ast::MacroDef>,
crate_loader: &'a mut CrateLoader,
- macro_names: FnvHashSet<Name>,
- builtin_macros: FnvHashMap<Name, Rc<SyntaxExtension>>,
+ macro_names: FxHashSet<Name>,
+ builtin_macros: FxHashMap<Name, Rc<SyntaxExtension>>,
+ lexical_macro_resolutions: Vec<(Name, LegacyScope<'a>)>,
// Maps the `Mark` of an expansion to its containing module or block.
- invocations: FnvHashMap<Mark, &'a InvocationData<'a>>,
+ invocations: FxHashMap<Mark, &'a InvocationData<'a>>,
}
pub struct ResolverArenas<'a> {
let mut definitions = Definitions::new();
DefCollector::new(&mut definitions).collect_root();
- let mut invocations = FnvHashMap();
+ let mut invocations = FxHashMap();
invocations.insert(Mark::root(),
arenas.alloc_invocation_data(InvocationData::root(graph_root)));
session: session,
definitions: definitions,
- macros_at_scope: FnvHashMap(),
+ macros_at_scope: FxHashMap(),
// The outermost module has def ID 0; this is not reflected in the
// AST.
graph_root: graph_root,
prelude: None,
- trait_item_map: FnvHashMap(),
- field_names: FnvHashMap(),
+ trait_item_map: FxHashMap(),
+ field_names: FxHashMap(),
determined_imports: Vec::new(),
indeterminate_imports: Vec::new(),
make_glob_map: make_glob_map == MakeGlobMap::Yes,
glob_map: NodeMap(),
- used_imports: FnvHashSet(),
- used_crates: FnvHashSet(),
+ used_imports: FxHashSet(),
+ used_crates: FxHashSet(),
maybe_unused_trait_imports: NodeSet(),
privacy_errors: Vec::new(),
exported_macros: Vec::new(),
crate_loader: crate_loader,
- macro_names: FnvHashSet(),
- builtin_macros: FnvHashMap(),
+ macro_names: FxHashSet(),
+ builtin_macros: FxHashMap(),
+ lexical_macro_resolutions: Vec::new(),
invocations: invocations,
}
}
fn add_to_glob_map(&mut self, id: NodeId, name: Name) {
if self.make_glob_map {
- self.glob_map.entry(id).or_insert_with(FnvHashSet).insert(name);
+ self.glob_map.entry(id).or_insert_with(FxHashSet).insert(name);
}
}
}
ItemKind::DefaultImpl(_, ref trait_ref) => {
- self.with_optional_trait_ref(Some(trait_ref), |_, _| {});
+ self.with_optional_trait_ref(Some(trait_ref), |_, _| {}, None);
}
ItemKind::Impl(.., ref generics, ref opt_trait_ref, ref self_type, ref impl_items) =>
self.resolve_implementation(generics,
match type_parameters {
HasTypeParameters(generics, rib_kind) => {
let mut function_type_rib = Rib::new(rib_kind);
- let mut seen_bindings = FnvHashMap();
+ let mut seen_bindings = FxHashMap();
for type_parameter in &generics.ty_params {
let name = type_parameter.ident.name;
debug!("with_type_parameter_rib: {}", type_parameter.id);
self.label_ribs.push(Rib::new(rib_kind));
// Add each argument to the rib.
- let mut bindings_list = FnvHashMap();
+ let mut bindings_list = FxHashMap();
for argument in &declaration.inputs {
self.resolve_pattern(&argument.pat, PatternSource::FnParam, &mut bindings_list);
fn resolve_trait_reference(&mut self,
id: NodeId,
trait_path: &Path,
- path_depth: usize)
+ path_depth: usize,
+ generics: Option<&Generics>)
-> Result<PathResolution, ()> {
self.resolve_path(id, trait_path, path_depth, TypeNS).and_then(|path_res| {
match path_res.base_def {
}
let mut err = resolve_struct_error(self, trait_path.span, {
- ResolutionError::IsNotATrait(&path_names_to_string(trait_path, path_depth))
+ ResolutionError::IsNotATrait(&path_names_to_string(trait_path, path_depth),
+ path_res.base_def.kind_name())
});
+ if let Some(generics) = generics {
+ if let Some(span) = generics.span_for_name(
+ &path_names_to_string(trait_path, path_depth)) {
+
+ err.span_label(span, &"type parameter defined here");
+ }
+ }
// If it's a typedef, give a note
if let Def::TyAlias(..) = path_res.base_def {
result
}
- fn with_optional_trait_ref<T, F>(&mut self, opt_trait_ref: Option<&TraitRef>, f: F) -> T
+ fn with_optional_trait_ref<T, F>(&mut self,
+ opt_trait_ref: Option<&TraitRef>,
+ f: F,
+ generics: Option<&Generics>)
+ -> T
where F: FnOnce(&mut Resolver, Option<DefId>) -> T
{
let mut new_val = None;
if let Some(trait_ref) = opt_trait_ref {
if let Ok(path_res) = self.resolve_trait_reference(trait_ref.ref_id,
&trait_ref.path,
- 0) {
+ 0,
+ generics) {
assert!(path_res.depth == 0);
self.record_def(trait_ref.ref_id, path_res);
new_val = Some((path_res.base_def.def_id(), trait_ref.clone()));
}
});
});
- });
+ }, Some(&generics));
});
}
walk_list!(self, visit_expr, &local.init);
// Resolve the pattern.
- self.resolve_pattern(&local.pat, PatternSource::Let, &mut FnvHashMap());
+ self.resolve_pattern(&local.pat, PatternSource::Let, &mut FxHashMap());
}
// build a map from pattern identifiers to binding-info's.
// that expands into an or-pattern where one 'x' was from the
// user and one 'x' came from the macro.
fn binding_mode_map(&mut self, pat: &Pat) -> BindingMap {
- let mut binding_map = FnvHashMap();
+ let mut binding_map = FxHashMap();
pat.walk(&mut |pat| {
if let PatKind::Ident(binding_mode, ident, ref sub_pat) = pat.node {
fn resolve_arm(&mut self, arm: &Arm) {
self.value_ribs.push(Rib::new(NormalRibKind));
- let mut bindings_list = FnvHashMap();
+ let mut bindings_list = FxHashMap();
for pattern in &arm.pats {
self.resolve_pattern(&pattern, PatternSource::Match, &mut bindings_list);
}
pat_id: NodeId,
outer_pat_id: NodeId,
pat_src: PatternSource,
- bindings: &mut FnvHashMap<Ident, NodeId>)
+ bindings: &mut FxHashMap<Ident, NodeId>)
-> PathResolution {
// Add the binding to the local ribs, if it
// doesn't already exist in the bindings map. (We
pat_src: PatternSource,
// Maps idents to the node ID for the
// outermost pattern that binds them.
- bindings: &mut FnvHashMap<Ident, NodeId>) {
+ bindings: &mut FxHashMap<Ident, NodeId>) {
// Visit all direct subpatterns of this pattern.
let outer_pat_id = pat.id;
pat.walk(&mut |pat| {
}
max_assoc_types = path.segments.len() - qself.position;
// Make sure the trait is valid.
- let _ = self.resolve_trait_reference(id, path, max_assoc_types);
+ let _ = self.resolve_trait_reference(id, path, max_assoc_types, None);
}
None => {
max_assoc_types = path.segments.len();
self.visit_expr(subexpression);
self.value_ribs.push(Rib::new(NormalRibKind));
- self.resolve_pattern(pattern, PatternSource::IfLet, &mut FnvHashMap());
+ self.resolve_pattern(pattern, PatternSource::IfLet, &mut FxHashMap());
self.visit_block(if_block);
self.value_ribs.pop();
ExprKind::WhileLet(ref pattern, ref subexpression, ref block, label) => {
self.visit_expr(subexpression);
self.value_ribs.push(Rib::new(NormalRibKind));
- self.resolve_pattern(pattern, PatternSource::WhileLet, &mut FnvHashMap());
+ self.resolve_pattern(pattern, PatternSource::WhileLet, &mut FxHashMap());
self.resolve_labeled_block(label, expr.id, block);
ExprKind::ForLoop(ref pattern, ref subexpression, ref block, label) => {
self.visit_expr(subexpression);
self.value_ribs.push(Rib::new(NormalRibKind));
- self.resolve_pattern(pattern, PatternSource::For, &mut FnvHashMap());
+ self.resolve_pattern(pattern, PatternSource::For, &mut FxHashMap());
self.resolve_labeled_block(label, expr.id, block);
fn report_errors(&mut self) {
self.report_shadowing_errors();
- let mut reported_spans = FnvHashSet();
+ let mut reported_spans = FxHashSet();
for &AmbiguityError { span, name, b1, b2 } in &self.ambiguity_errors {
if !reported_spans.insert(span) { continue }
}
fn report_shadowing_errors(&mut self) {
- let mut reported_errors = FnvHashSet();
- for (name, span, scope) in replace(&mut self.disallowed_shadowing, Vec::new()) {
- if self.resolve_macro_name(scope, name, false).is_some() &&
- reported_errors.insert((name, span)) {
- let msg = format!("`{}` is already in scope", name);
- self.session.struct_span_err(span, &msg)
+ for (name, scope) in replace(&mut self.lexical_macro_resolutions, Vec::new()) {
+ self.resolve_macro_name(scope, name);
+ }
+
+ let mut reported_errors = FxHashSet();
+ for binding in replace(&mut self.disallowed_shadowing, Vec::new()) {
+ if self.resolve_macro_name(binding.parent, binding.name).is_some() &&
+ reported_errors.insert((binding.name, binding.span)) {
+ let msg = format!("`{}` is already in scope", binding.name);
+ self.session.struct_span_err(binding.span, &msg)
.note("macro-expanded `macro_rules!`s may not shadow \
existing macros (see RFC 1560)")
.emit();
}
pub struct LegacyBinding<'a> {
- parent: LegacyScope<'a>,
- name: ast::Name,
+ pub parent: LegacyScope<'a>,
+ pub name: ast::Name,
ext: Rc<SyntaxExtension>,
- span: Span,
+ pub span: Span,
}
impl<'a> base::Resolver for Resolver<'a> {
if let LegacyScope::Expansion(parent) = invocation.legacy_scope.get() {
invocation.legacy_scope.set(LegacyScope::simplify_expansion(parent));
}
- self.resolve_macro_name(invocation.legacy_scope.get(), name, true).ok_or_else(|| {
+ self.resolve_macro_name(invocation.legacy_scope.get(), name).ok_or_else(|| {
if force {
let msg = format!("macro undefined: '{}!'", name);
let mut err = self.session.struct_span_err(path.span, &msg);
}
impl<'a> Resolver<'a> {
- pub fn resolve_macro_name(&mut self,
- mut scope: LegacyScope<'a>,
- name: ast::Name,
- record_used: bool)
+ pub fn resolve_macro_name(&mut self, mut scope: LegacyScope<'a>, name: ast::Name)
-> Option<Rc<SyntaxExtension>> {
+ let mut possible_time_travel = None;
let mut relative_depth: u32 = 0;
loop {
scope = match scope {
LegacyScope::Empty => break,
LegacyScope::Expansion(invocation) => {
if let LegacyScope::Empty = invocation.expansion.get() {
+ if possible_time_travel.is_none() {
+ possible_time_travel = Some(scope);
+ }
invocation.legacy_scope.get()
} else {
relative_depth += 1;
}
LegacyScope::Binding(binding) => {
if binding.name == name {
- if record_used && relative_depth > 0 {
- self.disallowed_shadowing.push((name, binding.span, binding.parent));
+ if let Some(scope) = possible_time_travel {
+ // Check for disallowed shadowing later
+ self.lexical_macro_resolutions.push((name, scope));
+ } else if relative_depth > 0 {
+ self.disallowed_shadowing.push(binding);
}
return Some(binding.ext.clone());
}
};
}
+ if let Some(scope) = possible_time_travel {
+ self.lexical_macro_resolutions.push((name, scope));
+ }
self.builtin_macros.get(&name).cloned()
}
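The `possible_time_travel` bookkeeping above defers shadowing checks for bindings reached by walking past an invocation whose expansion is not yet known. A minimal sketch of that idea, with the resolver's `LegacyScope` replaced by simplified stand-ins (all names here are hypothetical):

```rust
// Simplified stand-ins for the resolver's legacy scope chain.
enum Scope {
    Empty,
    // An invocation whose expansion is not yet known: resolution may
    // "time travel", so shadowing checks must be deferred.
    Unexpanded(Box<Scope>),
    Binding { name: String, parent: Box<Scope> },
}

// Returns whether `name` resolved; records a deferred shadowing check when
// the walk passed an unexpanded invocation before finding the binding.
fn resolve(scope: &Scope, name: &str, deferred: &mut Vec<String>) -> bool {
    let mut scope = scope;
    let mut passed_unexpanded = false;
    loop {
        match scope {
            Scope::Empty => return false,
            Scope::Unexpanded(parent) => {
                passed_unexpanded = true;
                scope = &**parent;
            }
            Scope::Binding { name: n, parent } => {
                if n.as_str() == name {
                    if passed_unexpanded {
                        // Cannot validate shadowing yet; re-check later.
                        deferred.push(name.to_string());
                    }
                    return true;
                }
                scope = &**parent;
            }
        }
    }
}

fn main() {
    let inner = Scope::Binding { name: "m".into(), parent: Box::new(Scope::Empty) };
    let chain = Scope::Unexpanded(Box::new(inner));
    let mut deferred = Vec::new();
    assert!(resolve(&chain, "m", &mut deferred));
    assert_eq!(deferred, ["m".to_string()]);
    println!("deferred checks: {:?}", deferred);
}
```

The real change replaces an eager `record_used && relative_depth > 0` push with this deferred list (`lexical_macro_resolutions`), so shadowing can be validated once all expansions are known.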
#![cfg_attr(not(stage0), deny(warnings))]
#![feature(custom_attribute)]
-#![feature(dotdot_in_tuple_patterns)]
+#![cfg_attr(stage0, feature(dotdot_in_tuple_patterns))]
#![allow(unused_attributes)]
#![feature(rustc_private)]
#![feature(staged_api)]
use type_of;
use value::Value;
use Disr;
-use util::nodemap::{NodeSet, FnvHashMap, FnvHashSet};
+use util::nodemap::{NodeSet, FxHashMap, FxHashSet};
use arena::TypedArena;
use libc::c_uint;
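The `FnvHashMap`/`FnvHashSet` to `FxHashMap`/`FxHashSet` renames that dominate the following hunks swap the FNV hasher for the faster Fx hasher; call sites are otherwise untouched because both are type aliases over `std::collections::HashMap` with a custom `BuildHasher`. A hedged sketch of the alias pattern (the hasher body below is illustrative, not rustc's actual `FxHasher`):

```rust
use std::collections::HashMap;
use std::hash::{BuildHasherDefault, Hasher};

// Illustrative fast multiplicative hasher; rustc's real FxHasher differs
// in detail but follows the same non-cryptographic shape.
#[derive(Default)]
struct DemoFxHasher {
    hash: u64,
}

impl Hasher for DemoFxHasher {
    fn write(&mut self, bytes: &[u8]) {
        for &b in bytes {
            self.hash = (self.hash.rotate_left(5) ^ u64::from(b))
                .wrapping_mul(0x517c_c1b7_2722_0a95);
        }
    }
    fn finish(&self) -> u64 {
        self.hash
    }
}

// The alias pattern: renaming Fnv* to Fx* only changes this definition;
// every map/set call site keeps the same HashMap API.
type FxHashMap<K, V> = HashMap<K, V, BuildHasherDefault<DemoFxHasher>>;

fn demo() -> i32 {
    let mut m: FxHashMap<&str, i32> = FxHashMap::default();
    m.insert("llvm_insns", 1);
    m["llvm_insns"]
}

fn main() {
    assert_eq!(demo(), 1);
    println!("ok");
}
```

This is why the diff can be almost purely mechanical: only the alias (and the constructor-style `FxHashMap()` calls in rustc) change names.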
fn internalize_symbols<'a, 'tcx>(sess: &Session,
ccxs: &CrateContextList<'a, 'tcx>,
symbol_map: &SymbolMap<'tcx>,
- reachable: &FnvHashSet<&str>) {
+ reachable: &FxHashSet<&str>) {
let scx = ccxs.shared();
let tcx = scx.tcx();
// 'unsafe' because we are holding on to CStr's from the LLVM module within
// this block.
unsafe {
- let mut referenced_somewhere = FnvHashSet();
+ let mut referenced_somewhere = FxHashSet();
// Collect all symbols that need to stay externally visible because they
// are referenced via a declaration in some other codegen unit.
// Also collect all symbols for which we cannot adjust linkage, because
// it is fixed by some directive in the source code (e.g. #[no_mangle]).
- let linkage_fixed_explicitly: FnvHashSet<_> = scx
+ let linkage_fixed_explicitly: FxHashSet<_> = scx
.translation_items()
.borrow()
.iter()
}
if scx.sess().opts.debugging_opts.print_trans_items.is_some() {
- let mut item_to_cgus = FnvHashMap();
+ let mut item_to_cgus = FxHashMap();
for cgu in &codegen_units {
for (&trans_item, &linkage) in cgu.items() {
use machine::llalign_of_pref;
use type_::Type;
use value::Value;
-use util::nodemap::FnvHashMap;
+use util::nodemap::FxHashMap;
use libc::{c_uint, c_char};
use std::borrow::Cow;
// Build version of path with cycles removed.
// Pass 1: scan table mapping str -> rightmost pos.
- let mut mm = FnvHashMap();
+ let mut mm = FxHashMap();
let len = v.len();
let mut i = 0;
while i < len {
use common::{fulfill_obligation, type_is_sized};
use glue::{self, DropGlueKind};
use monomorphize::{self, Instance};
-use util::nodemap::{FnvHashSet, FnvHashMap, DefIdMap};
+use util::nodemap::{FxHashSet, FxHashMap, DefIdMap};
use trans_item::{TransItem, type_to_string, def_id_to_string};
// that are potentially inlined by LLVM into the source.
// The two numbers in the tuple are the start (inclusive) and
// end index (exclusive) within the `targets` vecs.
- index: FnvHashMap<TransItem<'tcx>, (usize, usize)>,
+ index: FxHashMap<TransItem<'tcx>, (usize, usize)>,
targets: Vec<TransItem<'tcx>>,
}
fn new() -> InliningMap<'tcx> {
InliningMap {
- index: FnvHashMap(),
+ index: FxHashMap(),
targets: Vec::new(),
}
}
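`InliningMap` above stores each item's target list as a `(start, end)` range into a single shared `targets` vec rather than one allocation per item (the `SymbolMap` later in this patch uses the same trick with a `String` arena). A small standalone sketch of that flattened one-to-many pattern, with plain `u32` keys standing in for `TransItem` (names are illustrative):

```rust
use std::collections::HashMap;

// Flattened one-to-many map: each key owns a contiguous (start, end)
// slice of `targets`, avoiding a separate Vec allocation per key.
struct RangeMap {
    index: HashMap<u32, (usize, usize)>,
    targets: Vec<u32>,
}

impl RangeMap {
    fn new() -> Self {
        RangeMap { index: HashMap::new(), targets: Vec::new() }
    }

    fn record(&mut self, key: u32, values: &[u32]) {
        let start = self.targets.len();
        self.targets.extend_from_slice(values);
        // Start inclusive, end exclusive, as in InliningMap's index.
        self.index.insert(key, (start, self.targets.len()));
    }

    fn get(&self, key: u32) -> &[u32] {
        match self.index.get(&key) {
            Some(&(start, end)) => &self.targets[start..end],
            None => &[],
        }
    }
}

fn main() {
    let mut m = RangeMap::new();
    m.record(1, &[10, 11]);
    m.record(2, &[20]);
    assert_eq!(m.get(1), &[10, 11]);
    assert_eq!(m.get(2), &[20]);
    assert!(m.get(3).is_empty());
    println!("ok");
}
```

The trade-off is that ranges must be recorded contiguously (one key at a time), which fits the collector's record-then-query usage.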
pub fn collect_crate_translation_items<'a, 'tcx>(scx: &SharedCrateContext<'a, 'tcx>,
mode: TransItemCollectionMode)
- -> (FnvHashSet<TransItem<'tcx>>,
+ -> (FxHashSet<TransItem<'tcx>>,
InliningMap<'tcx>) {
// We are not tracking dependencies of this pass as it has to be re-executed
// every time no matter what.
let roots = collect_roots(scx, mode);
debug!("Building translation item graph, beginning at roots");
- let mut visited = FnvHashSet();
+ let mut visited = FxHashSet();
let mut recursion_depths = DefIdMap();
let mut inlining_map = InliningMap::new();
// Collect all monomorphized translation items reachable from `starting_point`
fn collect_items_rec<'a, 'tcx: 'a>(scx: &SharedCrateContext<'a, 'tcx>,
starting_point: TransItem<'tcx>,
- visited: &mut FnvHashSet<TransItem<'tcx>>,
+ visited: &mut FxHashSet<TransItem<'tcx>>,
recursion_depths: &mut DefIdMap<usize>,
inlining_map: &mut InliningMap<'tcx>) {
if !visited.insert(starting_point.clone()) {
if let Some(trait_ref) = tcx.impl_trait_ref(impl_def_id) {
let callee_substs = tcx.erase_regions(&trait_ref.substs);
- let overridden_methods: FnvHashSet<_> = items.iter()
- .map(|item| item.name)
- .collect();
+ let overridden_methods: FxHashSet<_> = items.iter()
+ .map(|item| item.name)
+ .collect();
for method in tcx.provided_trait_methods(trait_ref.def_id) {
if overridden_methods.contains(&method.name) {
continue;
use session::Session;
use session::config;
use symbol_map::SymbolMap;
-use util::nodemap::{NodeSet, DefIdMap, FnvHashMap, FnvHashSet};
+use util::nodemap::{NodeSet, DefIdMap, FxHashMap, FxHashSet};
use std::ffi::{CStr, CString};
use std::cell::{Cell, RefCell};
pub n_inlines: Cell<usize>,
pub n_closures: Cell<usize>,
pub n_llvm_insns: Cell<usize>,
- pub llvm_insns: RefCell<FnvHashMap<String, usize>>,
+ pub llvm_insns: RefCell<FxHashMap<String, usize>>,
// (ident, llvm-instructions)
pub fn_stats: RefCell<Vec<(String, usize)> >,
}
use_dll_storage_attrs: bool,
- translation_items: RefCell<FnvHashSet<TransItem<'tcx>>>,
+ translation_items: RefCell<FxHashSet<TransItem<'tcx>>>,
trait_cache: RefCell<DepTrackingMap<TraitSelectionCache<'tcx>>>,
project_cache: RefCell<DepTrackingMap<ProjectionCache<'tcx>>>,
}
previous_work_product: Option<WorkProduct>,
tn: TypeNames, // FIXME: This seems to be largely unused.
codegen_unit: CodegenUnit<'tcx>,
- needs_unwind_cleanup_cache: RefCell<FnvHashMap<Ty<'tcx>, bool>>,
- fn_pointer_shims: RefCell<FnvHashMap<Ty<'tcx>, ValueRef>>,
- drop_glues: RefCell<FnvHashMap<DropGlueKind<'tcx>, (ValueRef, FnType)>>,
+ needs_unwind_cleanup_cache: RefCell<FxHashMap<Ty<'tcx>, bool>>,
+ fn_pointer_shims: RefCell<FxHashMap<Ty<'tcx>, ValueRef>>,
+ drop_glues: RefCell<FxHashMap<DropGlueKind<'tcx>, (ValueRef, FnType)>>,
/// Cache instances of monomorphic and polymorphic items
- instances: RefCell<FnvHashMap<Instance<'tcx>, ValueRef>>,
+ instances: RefCell<FxHashMap<Instance<'tcx>, ValueRef>>,
/// Cache generated vtables
- vtables: RefCell<FnvHashMap<ty::PolyTraitRef<'tcx>, ValueRef>>,
+ vtables: RefCell<FxHashMap<ty::PolyTraitRef<'tcx>, ValueRef>>,
/// Cache of constant strings,
- const_cstr_cache: RefCell<FnvHashMap<InternedString, ValueRef>>,
+ const_cstr_cache: RefCell<FxHashMap<InternedString, ValueRef>>,
/// Reverse-direction for const ptrs cast from globals.
/// Key is a ValueRef holding a *T,
/// when we ptrcast, and we have to ptrcast during translation
/// of a [T] const because we form a slice, a (*T,usize) pair, not
/// a pointer to an LLVM array type. Similar for trait objects.
- const_unsized: RefCell<FnvHashMap<ValueRef, ValueRef>>,
+ const_unsized: RefCell<FxHashMap<ValueRef, ValueRef>>,
/// Cache of emitted const globals (value -> global)
- const_globals: RefCell<FnvHashMap<ValueRef, ValueRef>>,
+ const_globals: RefCell<FxHashMap<ValueRef, ValueRef>>,
/// Cache of emitted const values
- const_values: RefCell<FnvHashMap<(ast::NodeId, &'tcx Substs<'tcx>), ValueRef>>,
+ const_values: RefCell<FxHashMap<(ast::NodeId, &'tcx Substs<'tcx>), ValueRef>>,
/// Cache of external const values
extern_const_values: RefCell<DefIdMap<ValueRef>>,
/// Mapping from static definitions to their DefId's.
- statics: RefCell<FnvHashMap<ValueRef, DefId>>,
+ statics: RefCell<FxHashMap<ValueRef, DefId>>,
- impl_method_cache: RefCell<FnvHashMap<(DefId, ast::Name), DefId>>,
+ impl_method_cache: RefCell<FxHashMap<(DefId, ast::Name), DefId>>,
/// Cache of closure wrappers for bare fn's.
- closure_bare_wrapper_cache: RefCell<FnvHashMap<ValueRef, ValueRef>>,
+ closure_bare_wrapper_cache: RefCell<FxHashMap<ValueRef, ValueRef>>,
/// List of globals for static variables which need to be passed to the
/// LLVM function ReplaceAllUsesWith (RAUW) when translation is complete.
/// to constants.)
statics_to_rauw: RefCell<Vec<(ValueRef, ValueRef)>>,
- lltypes: RefCell<FnvHashMap<Ty<'tcx>, Type>>,
- llsizingtypes: RefCell<FnvHashMap<Ty<'tcx>, Type>>,
- type_hashcodes: RefCell<FnvHashMap<Ty<'tcx>, String>>,
+ lltypes: RefCell<FxHashMap<Ty<'tcx>, Type>>,
+ llsizingtypes: RefCell<FxHashMap<Ty<'tcx>, Type>>,
+ type_hashcodes: RefCell<FxHashMap<Ty<'tcx>, String>>,
int_type: Type,
opaque_vec_type: Type,
builder: BuilderRef_res,
/// Holds the LLVM values for closure IDs.
- closure_vals: RefCell<FnvHashMap<Instance<'tcx>, ValueRef>>,
+ closure_vals: RefCell<FxHashMap<Instance<'tcx>, ValueRef>>,
dbg_cx: Option<debuginfo::CrateDebugContext<'tcx>>,
eh_unwind_resume: Cell<Option<ValueRef>>,
rust_try_fn: Cell<Option<ValueRef>>,
- intrinsics: RefCell<FnvHashMap<&'static str, ValueRef>>,
+ intrinsics: RefCell<FxHashMap<&'static str, ValueRef>>,
/// Number of LLVM instructions translated into this `LocalCrateContext`.
/// This is used to perform some basic load-balancing to keep all LLVM
n_inlines: Cell::new(0),
n_closures: Cell::new(0),
n_llvm_insns: Cell::new(0),
- llvm_insns: RefCell::new(FnvHashMap()),
+ llvm_insns: RefCell::new(FxHashMap()),
fn_stats: RefCell::new(Vec::new()),
},
check_overflow: check_overflow,
use_dll_storage_attrs: use_dll_storage_attrs,
- translation_items: RefCell::new(FnvHashSet()),
+ translation_items: RefCell::new(FxHashSet()),
trait_cache: RefCell::new(DepTrackingMap::new(tcx.dep_graph.clone())),
project_cache: RefCell::new(DepTrackingMap::new(tcx.dep_graph.clone())),
}
self.use_dll_storage_attrs
}
- pub fn translation_items(&self) -> &RefCell<FnvHashSet<TransItem<'tcx>>> {
+ pub fn translation_items(&self) -> &RefCell<FxHashSet<TransItem<'tcx>>> {
&self.translation_items
}
previous_work_product: previous_work_product,
codegen_unit: codegen_unit,
tn: TypeNames::new(),
- needs_unwind_cleanup_cache: RefCell::new(FnvHashMap()),
- fn_pointer_shims: RefCell::new(FnvHashMap()),
- drop_glues: RefCell::new(FnvHashMap()),
- instances: RefCell::new(FnvHashMap()),
- vtables: RefCell::new(FnvHashMap()),
- const_cstr_cache: RefCell::new(FnvHashMap()),
- const_unsized: RefCell::new(FnvHashMap()),
- const_globals: RefCell::new(FnvHashMap()),
- const_values: RefCell::new(FnvHashMap()),
+ needs_unwind_cleanup_cache: RefCell::new(FxHashMap()),
+ fn_pointer_shims: RefCell::new(FxHashMap()),
+ drop_glues: RefCell::new(FxHashMap()),
+ instances: RefCell::new(FxHashMap()),
+ vtables: RefCell::new(FxHashMap()),
+ const_cstr_cache: RefCell::new(FxHashMap()),
+ const_unsized: RefCell::new(FxHashMap()),
+ const_globals: RefCell::new(FxHashMap()),
+ const_values: RefCell::new(FxHashMap()),
extern_const_values: RefCell::new(DefIdMap()),
- statics: RefCell::new(FnvHashMap()),
- impl_method_cache: RefCell::new(FnvHashMap()),
- closure_bare_wrapper_cache: RefCell::new(FnvHashMap()),
+ statics: RefCell::new(FxHashMap()),
+ impl_method_cache: RefCell::new(FxHashMap()),
+ closure_bare_wrapper_cache: RefCell::new(FxHashMap()),
statics_to_rauw: RefCell::new(Vec::new()),
- lltypes: RefCell::new(FnvHashMap()),
- llsizingtypes: RefCell::new(FnvHashMap()),
- type_hashcodes: RefCell::new(FnvHashMap()),
+ lltypes: RefCell::new(FxHashMap()),
+ llsizingtypes: RefCell::new(FxHashMap()),
+ type_hashcodes: RefCell::new(FxHashMap()),
int_type: Type::from_ref(ptr::null_mut()),
opaque_vec_type: Type::from_ref(ptr::null_mut()),
builder: BuilderRef_res(llvm::LLVMCreateBuilderInContext(llcx)),
- closure_vals: RefCell::new(FnvHashMap()),
+ closure_vals: RefCell::new(FxHashMap()),
dbg_cx: dbg_cx,
eh_personality: Cell::new(None),
eh_unwind_resume: Cell::new(None),
rust_try_fn: Cell::new(None),
- intrinsics: RefCell::new(FnvHashMap()),
+ intrinsics: RefCell::new(FxHashMap()),
n_llvm_insns: Cell::new(0),
type_of_depth: Cell::new(0),
symbol_map: symbol_map,
&self.shared.link_meta
}
- pub fn needs_unwind_cleanup_cache(&self) -> &RefCell<FnvHashMap<Ty<'tcx>, bool>> {
+ pub fn needs_unwind_cleanup_cache(&self) -> &RefCell<FxHashMap<Ty<'tcx>, bool>> {
&self.local().needs_unwind_cleanup_cache
}
- pub fn fn_pointer_shims(&self) -> &RefCell<FnvHashMap<Ty<'tcx>, ValueRef>> {
+ pub fn fn_pointer_shims(&self) -> &RefCell<FxHashMap<Ty<'tcx>, ValueRef>> {
&self.local().fn_pointer_shims
}
pub fn drop_glues<'a>(&'a self)
- -> &'a RefCell<FnvHashMap<DropGlueKind<'tcx>, (ValueRef, FnType)>> {
+ -> &'a RefCell<FxHashMap<DropGlueKind<'tcx>, (ValueRef, FnType)>> {
&self.local().drop_glues
}
self.sess().cstore.defid_for_inlined_node(node_id)
}
- pub fn instances<'a>(&'a self) -> &'a RefCell<FnvHashMap<Instance<'tcx>, ValueRef>> {
+ pub fn instances<'a>(&'a self) -> &'a RefCell<FxHashMap<Instance<'tcx>, ValueRef>> {
&self.local().instances
}
- pub fn vtables<'a>(&'a self) -> &'a RefCell<FnvHashMap<ty::PolyTraitRef<'tcx>, ValueRef>> {
+ pub fn vtables<'a>(&'a self) -> &'a RefCell<FxHashMap<ty::PolyTraitRef<'tcx>, ValueRef>> {
&self.local().vtables
}
- pub fn const_cstr_cache<'a>(&'a self) -> &'a RefCell<FnvHashMap<InternedString, ValueRef>> {
+ pub fn const_cstr_cache<'a>(&'a self) -> &'a RefCell<FxHashMap<InternedString, ValueRef>> {
&self.local().const_cstr_cache
}
- pub fn const_unsized<'a>(&'a self) -> &'a RefCell<FnvHashMap<ValueRef, ValueRef>> {
+ pub fn const_unsized<'a>(&'a self) -> &'a RefCell<FxHashMap<ValueRef, ValueRef>> {
&self.local().const_unsized
}
- pub fn const_globals<'a>(&'a self) -> &'a RefCell<FnvHashMap<ValueRef, ValueRef>> {
+ pub fn const_globals<'a>(&'a self) -> &'a RefCell<FxHashMap<ValueRef, ValueRef>> {
&self.local().const_globals
}
- pub fn const_values<'a>(&'a self) -> &'a RefCell<FnvHashMap<(ast::NodeId, &'tcx Substs<'tcx>),
- ValueRef>> {
+ pub fn const_values<'a>(&'a self) -> &'a RefCell<FxHashMap<(ast::NodeId, &'tcx Substs<'tcx>),
+ ValueRef>> {
&self.local().const_values
}
&self.local().extern_const_values
}
- pub fn statics<'a>(&'a self) -> &'a RefCell<FnvHashMap<ValueRef, DefId>> {
+ pub fn statics<'a>(&'a self) -> &'a RefCell<FxHashMap<ValueRef, DefId>> {
&self.local().statics
}
pub fn impl_method_cache<'a>(&'a self)
- -> &'a RefCell<FnvHashMap<(DefId, ast::Name), DefId>> {
+ -> &'a RefCell<FxHashMap<(DefId, ast::Name), DefId>> {
&self.local().impl_method_cache
}
- pub fn closure_bare_wrapper_cache<'a>(&'a self) -> &'a RefCell<FnvHashMap<ValueRef, ValueRef>> {
+ pub fn closure_bare_wrapper_cache<'a>(&'a self) -> &'a RefCell<FxHashMap<ValueRef, ValueRef>> {
&self.local().closure_bare_wrapper_cache
}
&self.local().statics_to_rauw
}
- pub fn lltypes<'a>(&'a self) -> &'a RefCell<FnvHashMap<Ty<'tcx>, Type>> {
+ pub fn lltypes<'a>(&'a self) -> &'a RefCell<FxHashMap<Ty<'tcx>, Type>> {
&self.local().lltypes
}
- pub fn llsizingtypes<'a>(&'a self) -> &'a RefCell<FnvHashMap<Ty<'tcx>, Type>> {
+ pub fn llsizingtypes<'a>(&'a self) -> &'a RefCell<FxHashMap<Ty<'tcx>, Type>> {
&self.local().llsizingtypes
}
- pub fn type_hashcodes<'a>(&'a self) -> &'a RefCell<FnvHashMap<Ty<'tcx>, String>> {
+ pub fn type_hashcodes<'a>(&'a self) -> &'a RefCell<FxHashMap<Ty<'tcx>, String>> {
&self.local().type_hashcodes
}
self.local().opaque_vec_type
}
- pub fn closure_vals<'a>(&'a self) -> &'a RefCell<FnvHashMap<Instance<'tcx>, ValueRef>> {
+ pub fn closure_vals<'a>(&'a self) -> &'a RefCell<FxHashMap<Instance<'tcx>, ValueRef>> {
&self.local().closure_vals
}
&self.local().rust_try_fn
}
- fn intrinsics<'a>(&'a self) -> &'a RefCell<FnvHashMap<&'static str, ValueRef>> {
+ fn intrinsics<'a>(&'a self) -> &'a RefCell<FxHashMap<&'static str, ValueRef>> {
&self.local().intrinsics
}
&*self.local().symbol_map
}
- pub fn translation_items(&self) -> &RefCell<FnvHashSet<TransItem<'tcx>>> {
+ pub fn translation_items(&self) -> &RefCell<FxHashSet<TransItem<'tcx>>> {
&self.shared.translation_items
}
use type_::Type;
use rustc::ty::{self, AdtKind, Ty, layout};
use session::config;
-use util::nodemap::FnvHashMap;
+use util::nodemap::FxHashMap;
use util::common::path2cstr;
use libc::{c_uint, c_longlong};
// The UniqueTypeIds created so far
unique_id_interner: Interner,
// A map from UniqueTypeId to debuginfo metadata for that type. This is a 1:1 mapping.
- unique_id_to_metadata: FnvHashMap<UniqueTypeId, DIType>,
+ unique_id_to_metadata: FxHashMap<UniqueTypeId, DIType>,
// A map from types to debuginfo metadata. This is a N:1 mapping.
- type_to_metadata: FnvHashMap<Ty<'tcx>, DIType>,
+ type_to_metadata: FxHashMap<Ty<'tcx>, DIType>,
// A map from types to UniqueTypeId. This is a N:1 mapping.
- type_to_unique_id: FnvHashMap<Ty<'tcx>, UniqueTypeId>
+ type_to_unique_id: FxHashMap<Ty<'tcx>, UniqueTypeId>
}
impl<'tcx> TypeMap<'tcx> {
pub fn new() -> TypeMap<'tcx> {
TypeMap {
unique_id_interner: Interner::new(),
- type_to_metadata: FnvHashMap(),
- unique_id_to_metadata: FnvHashMap(),
- type_to_unique_id: FnvHashMap(),
+ type_to_metadata: FxHashMap(),
+ unique_id_to_metadata: FxHashMap(),
+ type_to_unique_id: FxHashMap(),
}
}
use rustc::ty::{self, Ty};
use rustc::mir;
use session::config::{self, FullDebugInfo, LimitedDebugInfo, NoDebugInfo};
-use util::nodemap::{DefIdMap, FnvHashMap, FnvHashSet};
+use util::nodemap::{DefIdMap, FxHashMap, FxHashSet};
use libc::c_uint;
use std::cell::{Cell, RefCell};
llcontext: ContextRef,
builder: DIBuilderRef,
current_debug_location: Cell<InternalDebugLocation>,
- created_files: RefCell<FnvHashMap<String, DIFile>>,
- created_enum_disr_types: RefCell<FnvHashMap<(DefId, layout::Integer), DIType>>,
+ created_files: RefCell<FxHashMap<String, DIFile>>,
+ created_enum_disr_types: RefCell<FxHashMap<(DefId, layout::Integer), DIType>>,
type_map: RefCell<TypeMap<'tcx>>,
namespace_map: RefCell<DefIdMap<DIScope>>,
// This collection is used to assert that composite types (structs, enums,
// ...) have their members only set once:
- composite_types_completed: RefCell<FnvHashSet<DIType>>,
+ composite_types_completed: RefCell<FxHashSet<DIType>>,
}
impl<'tcx> CrateDebugContext<'tcx> {
llcontext: llcontext,
builder: builder,
current_debug_location: Cell::new(InternalDebugLocation::UnknownLocation),
- created_files: RefCell::new(FnvHashMap()),
- created_enum_disr_types: RefCell::new(FnvHashMap()),
+ created_files: RefCell::new(FxHashMap()),
+ created_enum_disr_types: RefCell::new(FxHashMap()),
type_map: RefCell::new(TypeMap::new()),
namespace_map: RefCell::new(DefIdMap()),
- composite_types_completed: RefCell::new(FnvHashSet()),
+ composite_types_completed: RefCell::new(FxHashSet()),
};
}
}
#![feature(cell_extras)]
#![feature(const_fn)]
#![feature(custom_attribute)]
-#![feature(dotdot_in_tuple_patterns)]
+#![cfg_attr(stage0, feature(dotdot_in_tuple_patterns))]
#![allow(unused_attributes)]
#![feature(libc)]
#![feature(quote)]
use glue;
use type_::Type;
-use rustc_data_structures::fnv::FnvHashMap;
+use rustc_data_structures::fx::FxHashMap;
use syntax::parse::token;
use super::{MirContext, LocalRef};
adt::trans_get_discr(bcx, ty, discr_lvalue.llval, None, true)
);
- let mut bb_hist = FnvHashMap();
+ let mut bb_hist = FxHashMap();
for target in targets {
*bb_hist.entry(target).or_insert(0) += 1;
}
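The `bb_hist` loop above is the standard entry-API histogram: count how often each switch target appears so the hottest target can be chosen as the default branch. In isolation, the counting and selection steps look like this (a sketch; the surrounding MIR lowering is elided):

```rust
use std::collections::HashMap;

// Count occurrences of each branch target, as bb_hist does.
fn histogram(targets: &[u32]) -> HashMap<u32, usize> {
    let mut bb_hist = HashMap::new();
    for &target in targets {
        *bb_hist.entry(target).or_insert(0) += 1;
    }
    bb_hist
}

fn main() {
    let h = histogram(&[1, 2, 2, 3, 2]);
    assert_eq!(h[&2], 3);
    assert_eq!(h[&1], 1);
    // Pick the most common target, e.g. to use as an `otherwise` arm.
    let hottest = h.iter().max_by_key(|&(_, &count)| count).map(|(&t, _)| t);
    assert_eq!(hottest, Some(2));
    println!("ok");
}
```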
use syntax::ast::NodeId;
use syntax::parse::token::{self, InternedString};
use trans_item::TransItem;
-use util::nodemap::{FnvHashMap, FnvHashSet};
+use util::nodemap::{FxHashMap, FxHashSet};
pub enum PartitioningStrategy {
/// Generate one codegen unit per source-level module.
/// as well as the crate name and disambiguator.
name: InternedString,
- items: FnvHashMap<TransItem<'tcx>, llvm::Linkage>,
+ items: FxHashMap<TransItem<'tcx>, llvm::Linkage>,
}
impl<'tcx> CodegenUnit<'tcx> {
pub fn new(name: InternedString,
- items: FnvHashMap<TransItem<'tcx>, llvm::Linkage>)
+ items: FxHashMap<TransItem<'tcx>, llvm::Linkage>)
-> Self {
CodegenUnit {
name: name,
}
pub fn empty(name: InternedString) -> Self {
- Self::new(name, FnvHashMap())
+ Self::new(name, FxHashMap())
}
pub fn contains_item(&self, item: &TransItem<'tcx>) -> bool {
&self.name
}
- pub fn items(&self) -> &FnvHashMap<TransItem<'tcx>, llvm::Linkage> {
+ pub fn items(&self) -> &FxHashMap<TransItem<'tcx>, llvm::Linkage> {
&self.items
}
struct PreInliningPartitioning<'tcx> {
codegen_units: Vec<CodegenUnit<'tcx>>,
- roots: FnvHashSet<TransItem<'tcx>>,
+ roots: FxHashSet<TransItem<'tcx>>,
}
struct PostInliningPartitioning<'tcx>(Vec<CodegenUnit<'tcx>>);
where I: Iterator<Item = TransItem<'tcx>>
{
let tcx = scx.tcx();
- let mut roots = FnvHashSet();
- let mut codegen_units = FnvHashMap();
+ let mut roots = FxHashSet();
+ let mut codegen_units = FxHashMap();
for trans_item in trans_items {
let is_root = !trans_item.is_instantiated_only_on_demand(tcx);
for codegen_unit in &initial_partitioning.codegen_units[..] {
// Collect all items that need to be available in this codegen unit
- let mut reachable = FnvHashSet();
+ let mut reachable = FxHashSet();
for root in codegen_unit.items.keys() {
follow_inlining(*root, inlining_map, &mut reachable);
}
fn follow_inlining<'tcx>(trans_item: TransItem<'tcx>,
inlining_map: &InliningMap<'tcx>,
- visited: &mut FnvHashSet<TransItem<'tcx>>) {
+ visited: &mut FxHashSet<TransItem<'tcx>>) {
if !visited.insert(trans_item) {
return;
}
use std::borrow::Cow;
use syntax::codemap::Span;
use trans_item::TransItem;
-use util::nodemap::FnvHashMap;
+use util::nodemap::FxHashMap;
// In the SymbolMap we collect the symbol names of all translation items of
// the current crate. This map exists as a performance optimization. Symbol
// Thus they could also always be recomputed if needed.
pub struct SymbolMap<'tcx> {
- index: FnvHashMap<TransItem<'tcx>, (usize, usize)>,
+ index: FxHashMap<TransItem<'tcx>, (usize, usize)>,
arena: String,
}
}
let mut symbol_map = SymbolMap {
- index: FnvHashMap(),
+ index: FxHashMap(),
arena: String::with_capacity(1024),
};
use llvm::{Float, Double, X86_FP80, PPC_FP128, FP128};
use context::CrateContext;
-use util::nodemap::FnvHashMap;
+use util::nodemap::FxHashMap;
use syntax::ast;
use rustc::ty::layout;
/* Memory-managed object interface to type handles. */
pub struct TypeNames {
- named_types: RefCell<FnvHashMap<String, TypeRef>>,
+ named_types: RefCell<FxHashMap<String, TypeRef>>,
}
impl TypeNames {
pub fn new() -> TypeNames {
TypeNames {
- named_types: RefCell::new(FnvHashMap())
+ named_types: RefCell::new(FxHashMap())
}
}
ElisionFailureInfo, ElidedLifetime};
use rscope::{AnonTypeScope, MaybeWithAnonTypes};
use util::common::{ErrorReported, FN_OUTPUT_NAME};
-use util::nodemap::{NodeMap, FnvHashSet};
+use util::nodemap::{NodeMap, FxHashSet};
use std::cell::RefCell;
use syntax::{abi, ast};
let mut possible_implied_output_region = None;
for input_type in input_tys.iter() {
- let mut regions = FnvHashSet();
+ let mut regions = FxHashSet();
let have_bound_regions = tcx.collect_regions(input_type, &mut regions);
debug!("find_implied_output_regions: collected {:?} from {:?} \
return tcx.types.err;
}
- let mut associated_types = FnvHashSet::default();
+ let mut associated_types = FxHashSet::default();
for tr in traits::supertraits(tcx, principal) {
if let Some(trait_id) = tcx.map.as_local_node_id(tr.def_id()) {
use collect::trait_associated_type_names;
use rustc::infer::{self, InferOk, TypeOrigin};
use rustc::ty::{self, Ty, TypeFoldable, LvaluePreference};
use check::{FnCtxt, Expectation};
-use util::nodemap::FnvHashMap;
+use util::nodemap::FxHashMap;
use std::collections::hash_map::Entry::{Occupied, Vacant};
use std::cmp;
let field_map = variant.fields
.iter()
.map(|field| (field.name, field))
- .collect::<FnvHashMap<_, _>>();
+ .collect::<FxHashMap<_, _>>();
// Keep track of which fields have already appeared in the pattern.
- let mut used_fields = FnvHashMap();
+ let mut used_fields = FxHashMap();
// Typecheck each field.
for &Spanned { node: ref field, span } in fields {
debug!("compare_impl_method: trait_fty={:?}", trait_fty);
- if let Err(terr) = infcx.sub_types(false, origin, impl_fty, trait_fty) {
+ let sub_result = infcx.sub_types(false, origin, impl_fty, trait_fty)
+ .map(|InferOk { obligations, .. }| {
+ // FIXME(#32730) propagate obligations
+ assert!(obligations.is_empty());
+ });
+
+ if let Err(terr) = sub_result {
debug!("sub_types failed: impl ty {:?}, trait ty {:?}",
impl_fty,
trait_fty);
use hir::def_id::DefId;
use middle::free_region::FreeRegionMap;
-use rustc::infer;
+use rustc::infer::{self, InferOk};
use middle::region;
use rustc::ty::subst::{Subst, Substs};
use rustc::ty::{self, AdtKind, Ty, TyCtxt};
use rustc::traits::{self, Reveal};
-use util::nodemap::FnvHashSet;
+use util::nodemap::FxHashSet;
use syntax::ast;
use syntax_pos::{self, Span};
infcx.fresh_substs_for_item(drop_impl_span, drop_impl_did);
let fresh_impl_self_ty = drop_impl_ty.subst(tcx, fresh_impl_substs);
- if let Err(_) = infcx.eq_types(true, infer::TypeOrigin::Misc(drop_impl_span),
- named_type, fresh_impl_self_ty) {
- let item_span = tcx.map.span(self_type_node_id);
- struct_span_err!(tcx.sess, drop_impl_span, E0366,
- "Implementations of Drop cannot be specialized")
- .span_note(item_span,
- "Use same sequence of generic type and region \
- parameters that is on the struct/enum definition")
- .emit();
- return Err(());
+ match infcx.eq_types(true, infer::TypeOrigin::Misc(drop_impl_span),
+ named_type, fresh_impl_self_ty) {
+ Ok(InferOk { obligations, .. }) => {
+ // FIXME(#32730) propagate obligations
+ assert!(obligations.is_empty());
+ }
+ Err(_) => {
+ let item_span = tcx.map.span(self_type_node_id);
+ struct_span_err!(tcx.sess, drop_impl_span, E0366,
+ "Implementations of Drop cannot be specialized")
+ .span_note(item_span,
+ "Use same sequence of generic type and region \
+ parameters that is on the struct/enum definition")
+ .emit();
+ return Err(());
+ }
}
if let Err(ref errors) = fulfillment_cx.select_all_or_error(&infcx) {
rcx: rcx,
span: span,
parent_scope: parent_scope,
- breadcrumbs: FnvHashSet()
+ breadcrumbs: FxHashSet()
},
TypeContext::Root,
typ,
struct DropckContext<'a, 'b: 'a, 'gcx: 'b+'tcx, 'tcx: 'b> {
rcx: &'a mut RegionCtxt<'b, 'gcx, 'tcx>,
/// types that have already been traversed
- breadcrumbs: FnvHashSet<Ty<'tcx>>,
+ breadcrumbs: FxHashSet<Ty<'tcx>>,
/// span for error reporting
span: Span,
/// the scope reachable dtorck types must outlive
use rustc::ty::subst::Substs;
use rustc::ty::FnSig;
use rustc::ty::{self, Ty};
-use rustc::util::nodemap::FnvHashMap;
+use rustc::util::nodemap::FxHashMap;
use {CrateCtxt, require_same_types};
use syntax::abi::Abi;
return
}
- let mut structural_to_nomimal = FnvHashMap();
+ let mut structural_to_nomimal = FxHashMap();
let sig = tcx.no_late_bound_regions(i_ty.ty.fn_sig()).unwrap();
if intr.inputs.len() != sig.inputs.len() {
ccx: &CrateCtxt<'a, 'tcx>,
position: &str,
span: Span,
- structural_to_nominal: &mut FnvHashMap<&'a intrinsics::Type, ty::Ty<'tcx>>,
+ structural_to_nominal: &mut FxHashMap<&'a intrinsics::Type, ty::Ty<'tcx>>,
expected: &'a intrinsics::Type, t: ty::Ty<'tcx>)
{
use intrinsics::Type::*;
use rustc::traits;
use rustc::ty::{self, Ty, ToPolyTraitRef, TraitRef, TypeFoldable};
use rustc::infer::{InferOk, TypeOrigin};
-use rustc::util::nodemap::FnvHashSet;
+use rustc::util::nodemap::FxHashSet;
use syntax::ast;
use syntax_pos::{Span, DUMMY_SP};
use rustc::hir;
opt_simplified_steps: Option<Vec<ty::fast_reject::SimplifiedType>>,
inherent_candidates: Vec<Candidate<'tcx>>,
extension_candidates: Vec<Candidate<'tcx>>,
- impl_dups: FnvHashSet<DefId>,
+ impl_dups: FxHashSet<DefId>,
import_id: Option<ast::NodeId>,
/// Collects near misses when the candidate functions are missing a `self` keyword and is only
item_name: item_name,
inherent_candidates: Vec::new(),
extension_candidates: Vec::new(),
- impl_dups: FnvHashSet(),
+ impl_dups: FxHashSet(),
import_id: None,
steps: Rc::new(steps),
opt_simplified_steps: opt_simplified_steps,
fn assemble_extension_candidates_for_traits_in_scope(&mut self,
expr_id: ast::NodeId)
-> Result<(), MethodError<'tcx>> {
- let mut duplicates = FnvHashSet();
+ let mut duplicates = FxHashSet();
let opt_applicable_traits = self.tcx.trait_map.get(&expr_id);
if let Some(applicable_traits) = opt_applicable_traits {
for trait_candidate in applicable_traits {
}
fn assemble_extension_candidates_for_all_traits(&mut self) -> Result<(), MethodError<'tcx>> {
- let mut duplicates = FnvHashSet();
+ let mut duplicates = FxHashSet();
for trait_info in suggest::all_traits(self.ccx) {
if duplicates.insert(trait_info.def_id) {
self.assemble_extension_candidates_for_trait(trait_info.def_id)?;
use hir::def_id::{CRATE_DEF_INDEX, DefId};
use middle::lang_items::FnOnceTraitLangItem;
use rustc::traits::{Obligation, SelectionContext};
-use util::nodemap::FnvHashSet;
+use util::nodemap::FxHashSet;
use syntax::ast;
use errors::DiagnosticBuilder;
});
// Cross-crate:
- let mut external_mods = FnvHashSet();
+ let mut external_mods = FxHashSet();
fn handle_external_def(ccx: &CrateCtxt,
traits: &mut AllTraitsVec,
- external_mods: &mut FnvHashSet<DefId>,
+ external_mods: &mut FxHashSet<DefId>,
def: Def) {
let def_id = def.def_id();
match def {
use TypeAndSubsts;
use lint;
use util::common::{block_query, ErrorReported, indenter, loop_query};
-use util::nodemap::{DefIdMap, FnvHashMap, FnvHashSet, NodeMap};
+use util::nodemap::{DefIdMap, FxHashMap, FxHashSet, NodeMap};
use std::cell::{Cell, Ref, RefCell};
use std::mem::replace;
if !is_implemented {
if !is_provided {
- missing_items.push(trait_item.name());
+ missing_items.push(trait_item);
} else if associated_type_overridden {
invalidated_items.push(trait_item.name());
}
}
if !missing_items.is_empty() {
- struct_span_err!(tcx.sess, impl_span, E0046,
+ let mut err = struct_span_err!(tcx.sess, impl_span, E0046,
"not all trait items implemented, missing: `{}`",
missing_items.iter()
- .map(|name| name.to_string())
- .collect::<Vec<_>>().join("`, `"))
- .span_label(impl_span, &format!("missing `{}` in implementation",
+ .map(|trait_item| trait_item.name().to_string())
+ .collect::<Vec<_>>().join("`, `"));
+ err.span_label(impl_span, &format!("missing `{}` in implementation",
missing_items.iter()
- .map(|name| name.to_string())
- .collect::<Vec<_>>().join("`, `"))
- ).emit();
+ .map(|trait_item| trait_item.name().to_string())
+ .collect::<Vec<_>>().join("`, `")));
+ for trait_item in missing_items {
+ if let Some(span) = tcx.map.span_if_local(trait_item.def_id()) {
+ err.span_label(span, &format!("`{}` from trait", trait_item.name()));
+ } else {
+ err.note(&format!("`{}` from trait: `{}`",
+ trait_item.name(),
+ signature(trait_item)));
+ }
+ }
+ err.emit();
}
if !invalidated_items.is_empty() {
}
}
+fn signature<'a, 'tcx>(item: &ty::ImplOrTraitItem) -> String {
+ match *item {
+ ty::MethodTraitItem(ref item) => format!("{}", item.fty.sig.0),
+ ty::TypeTraitItem(ref item) => format!("type {};", item.name.to_string()),
+ ty::ConstTraitItem(ref item) => format!("const {}: {:?};", item.name.to_string(), item.ty),
+ }
+}
+
/// Checks a constant with a given type.
fn check_const_with_type<'a, 'tcx>(ccx: &'a CrateCtxt<'a, 'tcx>,
expr: &'tcx hir::Expr,
// We must collect the defaults *before* we do any unification. Because we have
// directly attached defaults to the type variables any unification that occurs
// will erase defaults causing conflicting defaults to be completely ignored.
- let default_map: FnvHashMap<_, _> =
+ let default_map: FxHashMap<_, _> =
unsolved_variables
.iter()
.filter_map(|t| self.default(t).map(|d| (t, d)))
.collect();
- let mut unbound_tyvars = FnvHashSet();
+ let mut unbound_tyvars = FxHashSet();
debug!("select_all_obligations_and_apply_defaults: defaults={:?}", default_map);
// table then apply defaults until we find a conflict. That default must be the one
// that caused conflict earlier.
fn find_conflicting_default(&self,
- unbound_vars: &FnvHashSet<Ty<'tcx>>,
- default_map: &FnvHashMap<&Ty<'tcx>, type_variable::Default<'tcx>>,
+ unbound_vars: &FxHashSet<Ty<'tcx>>,
+ default_map: &FxHashMap<&Ty<'tcx>, type_variable::Default<'tcx>>,
conflict: Ty<'tcx>)
-> Option<type_variable::Default<'tcx>> {
use rustc::ty::error::UnconstrainedNumeric::Neither;
_ => span_bug!(span, "non-ADT passed to check_expr_struct_fields")
};
- let mut remaining_fields = FnvHashMap();
+ let mut remaining_fields = FxHashMap();
for field in &variant.fields {
remaining_fields.insert(field.name, field);
}
- let mut seen_fields = FnvHashMap();
+ let mut seen_fields = FxHashMap();
let mut error_happened = false;
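The `remaining_fields`/`seen_fields` pair above implements the usual two-map check for struct literals and patterns: each supplied field must exist (removed from `remaining_fields`) and must not repeat (recorded in `seen_fields`). Sketched standalone, with strings standing in for interned names and spans dropped (all names hypothetical):

```rust
use std::collections::HashMap;

// Returns (missing, duplicates): declared fields never supplied, and
// supplied fields that appeared more than once. Mirrors the
// remaining_fields / seen_fields bookkeeping.
fn check_fields(declared: &[&str], supplied: &[&str]) -> (Vec<String>, Vec<String>) {
    let mut remaining: HashMap<&str, ()> =
        declared.iter().map(|&f| (f, ())).collect();
    let mut seen: HashMap<&str, ()> = HashMap::new();
    let mut duplicates = Vec::new();

    for &field in supplied {
        if seen.insert(field, ()).is_some() {
            // Field already appeared earlier in this literal/pattern.
            duplicates.push(field.to_string());
        }
        remaining.remove(field);
    }

    let mut missing: Vec<String> = remaining.keys().map(|f| f.to_string()).collect();
    missing.sort();
    (missing, duplicates)
}

fn main() {
    let (missing, dups) = check_fields(&["x", "y", "z"], &["x", "x", "y"]);
    assert_eq!(missing, vec!["z".to_string()]);
    assert_eq!(dups, vec!["x".to_string()]);
    println!("ok");
}
```

In the real code `seen_fields` maps the name to its span so the duplicate diagnostic can point at the first occurrence; `()` stands in for that here.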
};
if let Some((variant, did, substs)) = variant {
- if variant.ctor_kind == CtorKind::Fn &&
- !self.tcx.sess.features.borrow().relaxed_adts {
- emit_feature_err(&self.tcx.sess.parse_sess,
- "relaxed_adts", path.span, GateIssue::Language,
- "tuple structs and variants in struct patterns are unstable");
- }
-
// Check bounds on type arguments used in the path.
let type_predicates = self.tcx.lookup_predicates(did);
let bounds = self.instantiate_bounds(path.span, substs, &type_predicates);
use rustc::infer::TypeOrigin;
use rustc::traits;
use rustc::ty::{self, Ty, TyCtxt};
-use rustc::util::nodemap::{FnvHashSet, FnvHashMap};
+use rustc::util::nodemap::{FxHashSet, FxHashMap};
use syntax::ast;
use syntax_pos::Span;
assert_eq!(ty_predicates.parent, None);
let variances = self.tcx().item_variances(item_def_id);
- let mut constrained_parameters: FnvHashSet<_> =
+ let mut constrained_parameters: FxHashSet<_> =
variances.iter().enumerate()
.filter(|&(_, &variance)| variance != ty::Bivariant)
.map(|(index, _)| Parameter(index as u32))
fn reject_shadowing_type_parameters(tcx: TyCtxt, span: Span, generics: &ty::Generics) {
let parent = tcx.lookup_generics(generics.parent.unwrap());
- let impl_params: FnvHashMap<_, _> = parent.types
- .iter()
- .map(|tp| (tp.name, tp.def_id))
- .collect();
+ let impl_params: FxHashMap<_, _> = parent.types
+ .iter()
+ .map(|tp| (tp.name, tp.def_id))
+ .collect();
for method_param in &generics.types {
if impl_params.contains_key(&method_param.name) {
if f.unsubst_ty().is_phantom_data() {
// Ignore PhantomData fields
- None
- } else if infcx.sub_types(false, origin, b, a).is_ok() {
- // Ignore fields that aren't significantly changed
- None
- } else {
- // Collect up all fields that were significantly changed
- // i.e. those that contain T in coerce_unsized T -> U
- Some((i, a, b))
+ return None;
}
+
+ // Ignore fields that aren't significantly changed
+ if let Ok(ok) = infcx.sub_types(false, origin, b, a) {
+ if ok.obligations.is_empty() {
+ return None;
+ }
+ }
+
+ // Collect up all fields that were significantly changed
+ // i.e. those that contain T in coerce_unsized T -> U
+ Some((i, a, b))
})
.collect::<Vec<_>>();
use rscope::*;
use rustc::dep_graph::DepNode;
use util::common::{ErrorReported, MemoizationMap};
-use util::nodemap::{NodeMap, FnvHashMap, FnvHashSet};
+use util::nodemap::{NodeMap, FxHashMap, FxHashSet};
use {CrateCtxt, write_ty_to_tcx};
use rustc_const_math::ConstInt;
// Convert all the associated consts.
// Also, check if there are any duplicate associated items
- let mut seen_type_items = FnvHashMap();
- let mut seen_value_items = FnvHashMap();
+ let mut seen_type_items = FxHashMap();
+ let mut seen_value_items = FxHashMap();
for impl_item in impl_items {
let seen_items = match impl_item.node {
disr_val: ty::Disr,
def: &hir::VariantData)
-> ty::VariantDefData<'tcx, 'tcx> {
- let mut seen_fields: FnvHashMap<ast::Name, Span> = FnvHashMap();
+ let mut seen_fields: FxHashMap<ast::Name, Span> = FxHashMap();
let node_id = ccx.tcx.map.as_local_node_id(did).unwrap();
let fields = def.fields().iter().map(|f| {
let fid = ccx.tcx.map.local_def_id(f.id);
{
let inline_bounds = from_bounds(ccx, param_bounds);
let where_bounds = from_predicates(ccx, param_id, &where_clause.predicates);
- let all_bounds: FnvHashSet<_> = inline_bounds.into_iter()
- .chain(where_bounds)
- .collect();
+ let all_bounds: FxHashSet<_> = inline_bounds.into_iter()
+ .chain(where_bounds)
+ .collect();
return if all_bounds.len() > 1 {
ty::ObjectLifetimeDefault::Ambiguous
} else if all_bounds.len() == 0 {
// The trait reference is an input, so find all type parameters
// reachable from there, to start (if this is an inherent impl,
// then just examine the self type).
- let mut input_parameters: FnvHashSet<_> =
+ let mut input_parameters: FxHashSet<_> =
ctp::parameters_for(&impl_scheme.ty, false).into_iter().collect();
if let Some(ref trait_ref) = impl_trait_ref {
input_parameters.extend(ctp::parameters_for(trait_ref, false));
let impl_predicates = ccx.tcx.lookup_predicates(impl_def_id);
let impl_trait_ref = ccx.tcx.impl_trait_ref(impl_def_id);
- let mut input_parameters: FnvHashSet<_> =
+ let mut input_parameters: FxHashSet<_> =
ctp::parameters_for(&impl_scheme.ty, false).into_iter().collect();
if let Some(ref trait_ref) = impl_trait_ref {
input_parameters.extend(ctp::parameters_for(trait_ref, false));
ctp::identify_constrained_type_params(
&impl_predicates.predicates.as_slice(), impl_trait_ref, &mut input_parameters);
- let lifetimes_in_associated_types: FnvHashSet<_> = impl_items.iter()
+ let lifetimes_in_associated_types: FxHashSet<_> = impl_items.iter()
.map(|item| ccx.tcx.impl_or_trait_item(ccx.tcx.map.local_def_id(item.id)))
.filter_map(|item| match item {
ty::TypeTraitItem(ref assoc_ty) => assoc_ty.ty,
use rustc::ty::{self, Ty};
use rustc::ty::fold::{TypeFoldable, TypeVisitor};
-use rustc::util::nodemap::FnvHashSet;
+use rustc::util::nodemap::FxHashSet;
#[derive(Clone, PartialEq, Eq, Hash, Debug)]
pub struct Parameter(pub u32);
pub fn identify_constrained_type_params<'tcx>(predicates: &[ty::Predicate<'tcx>],
impl_trait_ref: Option<ty::TraitRef<'tcx>>,
- input_parameters: &mut FnvHashSet<Parameter>)
+ input_parameters: &mut FxHashSet<Parameter>)
{
let mut predicates = predicates.to_owned();
setup_constraining_predicates(&mut predicates, impl_trait_ref, input_parameters);
/// think of any.
pub fn setup_constraining_predicates<'tcx>(predicates: &mut [ty::Predicate<'tcx>],
impl_trait_ref: Option<ty::TraitRef<'tcx>>,
- input_parameters: &mut FnvHashSet<Parameter>)
+ input_parameters: &mut FxHashSet<Parameter>)
{
// The canonical way of doing the needed topological sort
// would be a DFS, but getting the graph and its ownership
#![feature(box_patterns)]
#![feature(box_syntax)]
#![feature(conservative_impl_trait)]
-#![feature(dotdot_in_tuple_patterns)]
+#![cfg_attr(stage0, feature(dotdot_in_tuple_patterns))]
#![feature(quote)]
#![feature(rustc_diagnostic_macros)]
#![feature(rustc_private)]
use dep_graph::DepNode;
use hir::map as hir_map;
-use rustc::infer::TypeOrigin;
+use rustc::infer::{InferOk, TypeOrigin};
use rustc::ty::subst::Substs;
use rustc::ty::{self, Ty, TyCtxt, TypeFoldable};
use rustc::traits::{self, Reveal};
actual: Ty<'tcx>)
-> bool {
ccx.tcx.infer_ctxt(None, None, Reveal::NotSpecializable).enter(|infcx| {
- if let Err(err) = infcx.eq_types(false, origin.clone(), expected, actual) {
- infcx.report_mismatched_types(origin, expected, actual, err);
- false
- } else {
- true
+ match infcx.eq_types(false, origin.clone(), expected, actual) {
+ Ok(InferOk { obligations, .. }) => {
+ // FIXME(#32730) propagate obligations
+ assert!(obligations.is_empty());
+ true
+ }
+ Err(err) => {
+ infcx.report_mismatched_types(origin, expected, actual, err);
+ false
+ }
}
})
}
use rustc::hir::def_id::DefId;
use rustc::hir::print as pprust;
use rustc::ty::{self, TyCtxt};
-use rustc::util::nodemap::FnvHashSet;
+use rustc::util::nodemap::FxHashSet;
use rustc_const_eval::lookup_const_by_id;
.into_iter()
.map(|meth| meth.name.to_string())
.collect()
- }).unwrap_or(FnvHashSet());
+ }).unwrap_or(FxHashSet());
ret.push(clean::Item {
inner: clean::ImplItem(clean::Impl {
// If we're reexporting a reexport it may actually reexport something in
// two namespaces, so the target may be listed twice. Make sure we only
// visit each node at most once.
- let mut visited = FnvHashSet();
+ let mut visited = FxHashSet();
for item in tcx.sess.cstore.item_children(did) {
let def_id = item.def.def_id();
if tcx.sess.cstore.visibility(def_id) == ty::Visibility::Public {
use rustc::ty::subst::Substs;
use rustc::ty::{self, AdtKind};
use rustc::middle::stability;
-use rustc::util::nodemap::{FnvHashMap, FnvHashSet};
+use rustc::util::nodemap::{FxHashMap, FxHashSet};
use rustc::hir;
pub access_levels: Arc<AccessLevels<DefId>>,
// These are later on moved into `CACHE_KEY`, leaving the map empty.
// Only here so that they can be filtered through the rustdoc passes.
- pub external_traits: FnvHashMap<DefId, Trait>,
+ pub external_traits: FxHashMap<DefId, Trait>,
}
struct CrateNum(def_id::CrateNum);
// Note that associated types also have a sized bound by default, but we
// don't actually know the set of associated types right here so that's
// handled in cleaning associated types
- let mut sized_params = FnvHashSet();
+ let mut sized_params = FxHashSet();
where_predicates.retain(|pred| {
match *pred {
WP::BoundPredicate { ty: Generic(ref g), ref bounds } => {
});
if let Some((tcx, &hir::ItemTy(ref ty, ref generics))) = tcx_and_alias {
let provided_params = &path.segments.last().unwrap().parameters;
- let mut ty_substs = FnvHashMap();
- let mut lt_substs = FnvHashMap();
+ let mut ty_substs = FxHashMap();
+ let mut lt_substs = FxHashMap();
for (i, ty_param) in generics.ty_params.iter().enumerate() {
let ty_param_def = tcx.expect_def(ty_param.id);
if let Some(ty) = provided_params.types().get(i).cloned()
pub struct Impl {
pub unsafety: hir::Unsafety,
pub generics: Generics,
- pub provided_trait_methods: FnvHashSet<String>,
+ pub provided_trait_methods: FxHashSet<String>,
pub trait_: Option<Type>,
pub for_: Type,
pub items: Vec<Item>,
.map(|meth| meth.name.to_string())
.collect()
})
- }).unwrap_or(FnvHashSet());
+ }).unwrap_or(FxHashSet());
ret.push(Item {
name: None,
use rustc::ty::{self, TyCtxt};
use rustc::hir::map as hir_map;
use rustc::lint;
-use rustc::util::nodemap::FnvHashMap;
+use rustc::util::nodemap::FxHashMap;
use rustc_trans::back::link;
use rustc_resolve as resolve;
use rustc_metadata::cstore::CStore;
NotTyped(&'a session::Session)
}
-pub type ExternalPaths = FnvHashMap<DefId, (Vec<String>, clean::TypeKind)>;
+pub type ExternalPaths = FxHashMap<DefId, (Vec<String>, clean::TypeKind)>;
pub struct DocContext<'a, 'tcx: 'a> {
pub map: &'a hir_map::Map<'tcx>,
/// Later on moved into `html::render::CACHE_KEY`
pub renderinfo: RefCell<RenderInfo>,
/// Later on moved through `clean::Crate` into `html::render::CACHE_KEY`
- pub external_traits: RefCell<FnvHashMap<DefId, clean::Trait>>,
+ pub external_traits: RefCell<FxHashMap<DefId, clean::Trait>>,
// The current set of type and lifetime substitutions,
// for expanding type aliases at the HIR level:
/// Table type parameter definition -> substituted type
- pub ty_substs: RefCell<FnvHashMap<Def, clean::Type>>,
+ pub ty_substs: RefCell<FxHashMap<Def, clean::Type>>,
/// Table node id of lifetime parameter definition -> substituted lifetime
- pub lt_substs: RefCell<FnvHashMap<ast::NodeId, clean::Lifetime>>,
+ pub lt_substs: RefCell<FxHashMap<ast::NodeId, clean::Lifetime>>,
}
impl<'b, 'tcx> DocContext<'b, 'tcx> {
/// Call the closure with the given parameters set as
/// the substitutions for a type alias' RHS.
pub fn enter_alias<F, R>(&self,
- ty_substs: FnvHashMap<Def, clean::Type>,
- lt_substs: FnvHashMap<ast::NodeId, clean::Lifetime>,
+ ty_substs: FxHashMap<Def, clean::Type>,
+ lt_substs: FxHashMap<ast::NodeId, clean::Lifetime>,
f: F) -> R
where F: FnOnce() -> R {
let (old_tys, old_lts) =
use rustc::middle::privacy::AccessLevels;
use rustc::middle::stability;
use rustc::hir;
-use rustc::util::nodemap::{FnvHashMap, FnvHashSet};
+use rustc::util::nodemap::{FxHashMap, FxHashSet};
use rustc_data_structures::flock;
use clean::{self, Attributes, GetDefId, SelfTy, Mutability};
/// `true`.
pub include_sources: bool,
/// The local file sources we've emitted and their respective url-paths.
- pub local_sources: FnvHashMap<PathBuf, String>,
+ pub local_sources: FxHashMap<PathBuf, String>,
/// All the passes that were run on this crate.
- pub passes: FnvHashSet<String>,
+ pub passes: FxHashSet<String>,
/// The base-URL of the issue tracker for when an item has been tagged with
/// an issue number.
pub issue_tracker_base_url: Option<String>,
/// Mapping of typaram ids to the name of the type parameter. This is used
/// when pretty-printing a type (so pretty printing doesn't have to
/// painfully maintain a context like this)
- pub typarams: FnvHashMap<DefId, String>,
+ pub typarams: FxHashMap<DefId, String>,
/// Maps a type id to all known implementations for that type. This is only
/// recognized for intra-crate `ResolvedPath` types, and is used to print
///
/// The values of the map are a list of implementations and documentation
/// found on that implementation.
- pub impls: FnvHashMap<DefId, Vec<Impl>>,
+ pub impls: FxHashMap<DefId, Vec<Impl>>,
/// Maintains a mapping of local crate node ids to the fully qualified name
/// and "short type description" of that node. This is used when generating
/// URLs when a type is being linked to. External paths are not located in
/// this map because the `External` type itself has all the information
/// necessary.
- pub paths: FnvHashMap<DefId, (Vec<String>, ItemType)>,
+ pub paths: FxHashMap<DefId, (Vec<String>, ItemType)>,
/// Similar to `paths`, but only holds external paths. This is only used for
/// generating explicit hyperlinks to other crates.
- pub external_paths: FnvHashMap<DefId, (Vec<String>, ItemType)>,
+ pub external_paths: FxHashMap<DefId, (Vec<String>, ItemType)>,
/// This map contains information about all known traits of this crate.
/// Implementations of a crate should inherit the documentation of the
/// parent trait if no extra documentation is specified, and default methods
/// should show up in documentation about trait implementations.
- pub traits: FnvHashMap<DefId, clean::Trait>,
+ pub traits: FxHashMap<DefId, clean::Trait>,
/// When rendering traits, it's often useful to be able to list all
/// implementors of the trait, and this mapping is exactly that: a mapping
/// of trait ids to the list of known implementors of the trait
- pub implementors: FnvHashMap<DefId, Vec<Implementor>>,
+ pub implementors: FxHashMap<DefId, Vec<Implementor>>,
/// Cache of where external crate documentation can be found.
- pub extern_locations: FnvHashMap<CrateNum, (String, ExternalLocation)>,
+ pub extern_locations: FxHashMap<CrateNum, (String, ExternalLocation)>,
/// Cache of where documentation for primitives can be found.
- pub primitive_locations: FnvHashMap<clean::PrimitiveType, CrateNum>,
+ pub primitive_locations: FxHashMap<clean::PrimitiveType, CrateNum>,
// Note that external items to which `doc(hidden)` applies are shown as
// non-reachable while local items aren't. This is because we're reusing
parent_stack: Vec<DefId>,
parent_is_trait_impl: bool,
search_index: Vec<IndexItem>,
- seen_modules: FnvHashSet<DefId>,
+ seen_modules: FxHashSet<DefId>,
seen_mod: bool,
stripped_mod: bool,
deref_trait_did: Option<DefId>,
/// Later on moved into `CACHE_KEY`.
#[derive(Default)]
pub struct RenderInfo {
- pub inlined: FnvHashSet<DefId>,
+ pub inlined: FxHashSet<DefId>,
pub external_paths: ::core::ExternalPaths,
- pub external_typarams: FnvHashMap<DefId, String>,
+ pub external_typarams: FxHashMap<DefId, String>,
pub deref_trait_did: Option<DefId>,
pub deref_mut_trait_did: Option<DefId>,
}
thread_local!(static CACHE_KEY: RefCell<Arc<Cache>> = Default::default());
thread_local!(pub static CURRENT_LOCATION_KEY: RefCell<Vec<String>> =
RefCell::new(Vec::new()));
-thread_local!(static USED_ID_MAP: RefCell<FnvHashMap<String, usize>> =
+thread_local!(static USED_ID_MAP: RefCell<FxHashMap<String, usize>> =
RefCell::new(init_ids()));
-fn init_ids() -> FnvHashMap<String, usize> {
+fn init_ids() -> FxHashMap<String, usize> {
[
"main",
"search",
*s.borrow_mut() = if embedded {
init_ids()
} else {
- FnvHashMap()
+ FxHashMap()
};
});
}
pub fn run(mut krate: clean::Crate,
external_html: &ExternalHtml,
dst: PathBuf,
- passes: FnvHashSet<String>,
+ passes: FxHashSet<String>,
css_file_extension: Option<PathBuf>,
renderinfo: RenderInfo) -> Result<(), Error> {
let src_root = match krate.src.parent() {
src_root: src_root,
passes: passes,
include_sources: true,
- local_sources: FnvHashMap(),
+ local_sources: FxHashMap(),
issue_tracker_base_url: None,
layout: layout::Layout {
logo: "".to_string(),
.collect();
let mut cache = Cache {
- impls: FnvHashMap(),
+ impls: FxHashMap(),
external_paths: external_paths,
- paths: FnvHashMap(),
- implementors: FnvHashMap(),
+ paths: FxHashMap(),
+ implementors: FxHashMap(),
stack: Vec::new(),
parent_stack: Vec::new(),
search_index: Vec::new(),
parent_is_trait_impl: false,
- extern_locations: FnvHashMap(),
- primitive_locations: FnvHashMap(),
- seen_modules: FnvHashSet(),
+ extern_locations: FxHashMap(),
+ primitive_locations: FxHashMap(),
+ seen_modules: FxHashSet(),
seen_mod: false,
stripped_mod: false,
access_levels: krate.access_levels.clone(),
orphan_impl_items: Vec::new(),
- traits: mem::replace(&mut krate.external_traits, FnvHashMap()),
+ traits: mem::replace(&mut krate.external_traits, FxHashMap()),
deref_trait_did: deref_trait_did,
deref_mut_trait_did: deref_mut_trait_did,
typarams: external_typarams,
/// Build the search index from the collected metadata
fn build_index(krate: &clean::Crate, cache: &mut Cache) -> String {
- let mut nodeid_to_pathid = FnvHashMap();
+ let mut nodeid_to_pathid = FxHashMap();
let mut crate_items = Vec::with_capacity(cache.search_index.len());
let mut crate_paths = Vec::<Json>::new();
} else {
String::new()
};
+
+ let mut unsafety_flag = "";
+ if let clean::FunctionItem(ref func) = myitem.inner {
+ if func.unsafety == hir::Unsafety::Unsafe {
+ unsafety_flag = "<a title='unsafe function' href='#'><sup>⚠</sup></a>";
+ }
+ }
+
let doc_value = myitem.doc_value().unwrap_or("");
write!(w, "
<tr class='{stab} module-item'>
<td><a class='{class}' href='{href}'
- title='{title}'>{name}</a></td>
+ title='{title}'>{name}</a>{unsafety_flag}</td>
<td class='docblock-short'>
{stab_docs} {docs}
</td>
docs = shorter(Some(&Markdown(doc_value).to_string())),
class = myitem.type_(),
stab = myitem.stability_class(),
+ unsafety_flag = unsafety_flag,
href = item_path(myitem.type_(), myitem.name.as_ref().unwrap()),
title = full_path(cx, myitem))?;
}
Ok(())
}
+fn attribute_without_value(s: &str) -> bool {
+ ["must_use", "no_mangle", "unsafe_destructor_blind_to_params"].iter().any(|x| x == &s)
+}
+
+fn attribute_with_value(s: &str) -> bool {
+ ["export_name", "lang", "link_section", "must_use"].iter().any(|x| x == &s)
+}
+
+fn attribute_with_values(s: &str) -> bool {
+ ["repr"].iter().any(|x| x == &s)
+}
+
+fn render_attribute(attr: &clean::Attribute, recurse: bool) -> Option<String> {
+ match *attr {
+ clean::Word(ref s) if attribute_without_value(&*s) || recurse => {
+ Some(format!("{}", s))
+ }
+ clean::NameValue(ref k, ref v) if attribute_with_value(&*k) => {
+ Some(format!("{} = \"{}\"", k, v))
+ }
+ clean::List(ref k, ref values) if attribute_with_values(&*k) => {
+ let display: Vec<_> = values.iter()
+ .filter_map(|value| render_attribute(value, true))
+ .map(|entry| format!("{}", entry))
+ .collect();
+
+ if display.len() > 0 {
+ Some(format!("{}({})", k, display.join(", ")))
+ } else {
+ None
+ }
+ }
+ _ => {
+ None
+ }
+ }
+}
+
fn render_attributes(w: &mut fmt::Formatter, it: &clean::Item) -> fmt::Result {
+ let mut attrs = String::new();
+
for attr in &it.attrs {
- match *attr {
- clean::Word(ref s) if *s == "must_use" => {
- write!(w, "#[{}]\n", s)?;
- }
- clean::NameValue(ref k, ref v) if *k == "must_use" => {
- write!(w, "#[{} = \"{}\"]\n", k, v)?;
- }
- _ => ()
+ if let Some(s) = render_attribute(attr, false) {
+ attrs.push_str(&format!("#[{}]\n", s));
}
}
+ if attrs.len() > 0 {
+ write!(w, "<div class=\"docblock attributes\">{}</div>", &attrs)?;
+ }
Ok(())
}
#[derive(Copy, Clone)]
enum AssocItemLink<'a> {
Anchor(Option<&'a str>),
- GotoSource(DefId, &'a FnvHashSet<String>),
+ GotoSource(DefId, &'a FxHashSet<String>),
}
impl<'a> AssocItemLink<'a> {
}
}
- $("#toggle-all-docs").on("click", toggleAllDocs);
-
- $(document).on("click", ".collapse-toggle", function() {
- var toggle = $(this);
+ function collapseDocs(toggle, animate) {
var relatedDoc = toggle.parent().next();
if (relatedDoc.is(".stability")) {
relatedDoc = relatedDoc.next();
}
if (relatedDoc.is(".docblock")) {
if (relatedDoc.is(":visible")) {
- relatedDoc.slideUp({duration: 'fast', easing: 'linear'});
+ if (animate === true) {
+ relatedDoc.slideUp({duration: 'fast', easing: 'linear'});
+ toggle.children(".toggle-label").fadeIn();
+ } else {
+ relatedDoc.hide();
+ toggle.children(".toggle-label").show();
+ }
toggle.parent(".toggle-wrapper").addClass("collapsed");
toggle.children(".inner").text(labelForToggleButton(true));
- toggle.children(".toggle-label").fadeIn();
} else {
relatedDoc.slideDown({duration: 'fast', easing: 'linear'});
toggle.parent(".toggle-wrapper").removeClass("collapsed");
toggle.children(".toggle-label").hide();
}
}
+ }
+
+ $("#toggle-all-docs").on("click", toggleAllDocs);
+
+ $(document).on("click", ".collapse-toggle", function() {
+ collapseDocs($(this), true)
});
$(function() {
});
var mainToggle =
- $(toggle).append(
+ $(toggle.clone()).append(
$('<span/>', {'class': 'toggle-label'})
.css('display', 'none')
.html(' Expand description'));
var wrapper = $("<div class='toggle-wrapper'>").append(mainToggle);
$("#main > .docblock").before(wrapper);
+ var mainToggle =
+ $(toggle).append(
+ $('<span/>', {'class': 'toggle-label'})
+ .css('display', 'none')
+ .html(' Expand attributes'));
+ var wrapper = $("<div class='toggle-wrapper toggle-attributes'>").append(mainToggle);
+ $("#main > pre > .attributes").each(function() {
+ $(this).before(wrapper);
+ collapseDocs($($(this).prev().children()[0]), false);
+ });
});
$('pre.line-numbers').on('click', 'span', function() {
/* See FiraSans-LICENSE.txt for the Fira Sans license. */
@font-face {
- font-family: 'Fira Sans';
- font-style: normal;
- font-weight: 400;
- src: local('Fira Sans'), url("FiraSans-Regular.woff") format('woff');
+ font-family: 'Fira Sans';
+ font-style: normal;
+ font-weight: 400;
+ src: local('Fira Sans'), url("FiraSans-Regular.woff") format('woff');
}
@font-face {
- font-family: 'Fira Sans';
- font-style: normal;
- font-weight: 500;
- src: local('Fira Sans Medium'), url("FiraSans-Medium.woff") format('woff');
+ font-family: 'Fira Sans';
+ font-style: normal;
+ font-weight: 500;
+ src: local('Fira Sans Medium'), url("FiraSans-Medium.woff") format('woff');
}
/* See SourceSerifPro-LICENSE.txt for the Source Serif Pro license and
* Heuristica-LICENSE.txt for the Heuristica license. */
@font-face {
- font-family: 'Source Serif Pro';
- font-style: normal;
- font-weight: 400;
- src: local('Source Serif Pro'), url("SourceSerifPro-Regular.woff") format('woff');
+ font-family: 'Source Serif Pro';
+ font-style: normal;
+ font-weight: 400;
+ src: local('Source Serif Pro'), url("SourceSerifPro-Regular.woff") format('woff');
}
@font-face {
- font-family: 'Source Serif Pro';
- font-style: italic;
- font-weight: 400;
- src: url("Heuristica-Italic.woff") format('woff');
+ font-family: 'Source Serif Pro';
+ font-style: italic;
+ font-weight: 400;
+ src: url("Heuristica-Italic.woff") format('woff');
}
@font-face {
- font-family: 'Source Serif Pro';
- font-style: normal;
- font-weight: 700;
- src: local('Source Serif Pro Bold'), url("SourceSerifPro-Bold.woff") format('woff');
+ font-family: 'Source Serif Pro';
+ font-style: normal;
+ font-weight: 700;
+ src: local('Source Serif Pro Bold'), url("SourceSerifPro-Bold.woff") format('woff');
}
/* See SourceCodePro-LICENSE.txt for the Source Code Pro license. */
@font-face {
- font-family: 'Source Code Pro';
- font-style: normal;
- font-weight: 400;
- src: local('Source Code Pro'), url("SourceCodePro-Regular.woff") format('woff');
+ font-family: 'Source Code Pro';
+ font-style: normal;
+ font-weight: 400;
+ src: local('Source Code Pro'), url("SourceCodePro-Regular.woff") format('woff');
}
@font-face {
- font-family: 'Source Code Pro';
- font-style: normal;
- font-weight: 600;
- src: local('Source Code Pro Semibold'), url("SourceCodePro-Semibold.woff") format('woff');
+ font-family: 'Source Code Pro';
+ font-style: normal;
+ font-weight: 600;
+ src: local('Source Code Pro Semibold'), url("SourceCodePro-Semibold.woff") format('woff');
}
* {
-webkit-box-sizing: border-box;
- -moz-box-sizing: border-box;
- box-sizing: border-box;
+ -moz-box-sizing: border-box;
+ box-sizing: border-box;
}
/* General structure and fonts */
body {
- font: 16px/1.4 "Source Serif Pro", Georgia, Times, "Times New Roman", serif;
- margin: 0;
- position: relative;
- padding: 10px 15px 20px 15px;
+ font: 16px/1.4 "Source Serif Pro", Georgia, Times, "Times New Roman", serif;
+ margin: 0;
+ position: relative;
+ padding: 10px 15px 20px 15px;
- -webkit-font-feature-settings: "kern", "liga";
- -moz-font-feature-settings: "kern", "liga";
- font-feature-settings: "kern", "liga";
+ -webkit-font-feature-settings: "kern", "liga";
+ -moz-font-feature-settings: "kern", "liga";
+ font-feature-settings: "kern", "liga";
}
h1 {
- font-size: 1.5em;
+ font-size: 1.5em;
}
h2 {
- font-size: 1.4em;
+ font-size: 1.4em;
}
h3 {
- font-size: 1.3em;
+ font-size: 1.3em;
}
h1, h2, h3:not(.impl):not(.method):not(.type):not(.tymethod), h4:not(.method):not(.type):not(.tymethod) {
- font-weight: 500;
- margin: 20px 0 15px 0;
- padding-bottom: 6px;
+ font-weight: 500;
+ margin: 20px 0 15px 0;
+ padding-bottom: 6px;
}
h1.fqn {
- border-bottom: 1px dashed;
- margin-top: 0;
- position: relative;
+ border-bottom: 1px dashed;
+ margin-top: 0;
+ position: relative;
}
h2, h3:not(.impl):not(.method):not(.type):not(.tymethod), h4:not(.method):not(.type):not(.tymethod) {
- border-bottom: 1px solid;
+ border-bottom: 1px solid;
}
h3.impl, h3.method, h4.method, h3.type, h4.type {
- font-weight: 600;
- margin-top: 10px;
- margin-bottom: 10px;
- position: relative;
+ font-weight: 600;
+ margin-top: 10px;
+ margin-bottom: 10px;
+ position: relative;
}
h3.impl, h3.method, h3.type {
- margin-top: 15px;
+ margin-top: 15px;
}
h1, h2, h3, h4, .sidebar, a.source, .search-input, .content table :not(code)>a, .collapse-toggle {
- font-family: "Fira Sans", "Helvetica Neue", Helvetica, Arial, sans-serif;
+ font-family: "Fira Sans", "Helvetica Neue", Helvetica, Arial, sans-serif;
}
ol, ul {
- padding-left: 25px;
+ padding-left: 25px;
}
ul ul, ol ul, ul ol, ol ol {
- margin-bottom: 0;
+ margin-bottom: 0;
}
p {
- margin: 0 0 .6em 0;
+ margin: 0 0 .6em 0;
}
code, pre {
- font-family: "Source Code Pro", Menlo, Monaco, Consolas, "DejaVu Sans Mono", Inconsolata, monospace;
- white-space: pre-wrap;
+ font-family: "Source Code Pro", Menlo, Monaco, Consolas, "DejaVu Sans Mono", Inconsolata, monospace;
+ white-space: pre-wrap;
}
.docblock code, .docblock-short code {
- border-radius: 3px;
- padding: 0 0.2em;
+ border-radius: 3px;
+ padding: 0 0.2em;
}
.docblock pre code, .docblock-short pre code {
- padding: 0;
+ padding: 0;
}
pre {
- padding: 14px;
+ padding: 14px;
}
.source pre {
- padding: 20px;
+ padding: 20px;
}
img {
- max-width: 100%;
+ max-width: 100%;
}
.content.source {
- margin-top: 50px;
- max-width: none;
- overflow: visible;
- margin-left: 0px;
- min-width: 70em;
+ margin-top: 50px;
+ max-width: none;
+ overflow: visible;
+ margin-left: 0px;
+ min-width: 70em;
}
nav.sub {
- font-size: 16px;
- text-transform: uppercase;
+ font-size: 16px;
+ text-transform: uppercase;
}
.sidebar {
- width: 200px;
- position: absolute;
- left: 0;
- top: 0;
- min-height: 100%;
+ width: 200px;
+ position: absolute;
+ left: 0;
+ top: 0;
+ min-height: 100%;
}
.content, nav { max-width: 960px; }
.js-only, .hidden { display: none !important; }
.sidebar {
- padding: 10px;
+ padding: 10px;
}
.sidebar img {
- margin: 20px auto;
- display: block;
+ margin: 20px auto;
+ display: block;
}
.sidebar .location {
- font-size: 17px;
- margin: 30px 0 20px 0;
- text-align: center;
+ font-size: 17px;
+ margin: 30px 0 20px 0;
+ text-align: center;
}
.location a:first-child { font-weight: 500; }
.block {
- padding: 0 10px;
- margin-bottom: 14px;
+ padding: 0 10px;
+ margin-bottom: 14px;
}
.block h2, .block h3 {
- margin-top: 0;
- margin-bottom: 8px;
- text-align: center;
+ margin-top: 0;
+ margin-bottom: 8px;
+ text-align: center;
}
.block ul, .block li {
- margin: 0;
- padding: 0;
- list-style: none;
+ margin: 0;
+ padding: 0;
+ list-style: none;
}
.block a {
- display: block;
- text-overflow: ellipsis;
- overflow: hidden;
- line-height: 15px;
- padding: 7px 5px;
- font-size: 14px;
- font-weight: 300;
- transition: border 500ms ease-out;
+ display: block;
+ text-overflow: ellipsis;
+ overflow: hidden;
+ line-height: 15px;
+ padding: 7px 5px;
+ font-size: 14px;
+ font-weight: 300;
+ transition: border 500ms ease-out;
}
.content {
- padding: 15px 0;
+ padding: 15px 0;
}
.content.source pre.rust {
- white-space: pre;
- overflow: auto;
- padding-left: 0;
+ white-space: pre;
+ overflow: auto;
+ padding-left: 0;
}
.content pre.line-numbers {
- float: left;
- border: none;
- position: relative;
+ float: left;
+ border: none;
+ position: relative;
- -webkit-user-select: none;
- -moz-user-select: none;
- -ms-user-select: none;
- user-select: none;
+ -webkit-user-select: none;
+ -moz-user-select: none;
+ -ms-user-select: none;
+ user-select: none;
}
.line-numbers span { cursor: pointer; }
.docblock-short p {
- display: inline;
+ display: inline;
}
.docblock-short.nowrap {
- display: block;
- overflow: hidden;
- white-space: nowrap;
- text-overflow: ellipsis;
+ display: block;
+ overflow: hidden;
+ white-space: nowrap;
+ text-overflow: ellipsis;
}
.docblock-short p {
- overflow: hidden;
- text-overflow: ellipsis;
- margin: 0;
+ overflow: hidden;
+ text-overflow: ellipsis;
+ margin: 0;
}
.docblock-short code { white-space: nowrap; }
.docblock h1, .docblock h2, .docblock h3, .docblock h4, .docblock h5 {
- border-bottom: 1px solid;
+ border-bottom: 1px solid;
}
.docblock h1 { font-size: 1.3em; }
.docblock h3, .docblock h4, .docblock h5 { font-size: 1em; }
.docblock {
- margin-left: 24px;
+ margin-left: 24px;
}
.content .out-of-band {
- font-size: 23px;
- margin: 0px;
- padding: 0px;
- text-align: right;
- display: inline-block;
- font-weight: normal;
- position: absolute;
- right: 0;
+ font-size: 23px;
+ margin: 0px;
+ padding: 0px;
+ text-align: right;
+ display: inline-block;
+ font-weight: normal;
+ position: absolute;
+ right: 0;
}
h3.impl > .out-of-band {
- font-size: 21px;
+ font-size: 21px;
}
h4 > code, h3 > code, .invisible > code {
- position: inherit;
+ position: inherit;
}
.in-band, code {
- z-index: 5;
+ z-index: 5;
}
.invisible {
- background: rgba(0, 0, 0, 0);
- width: 100%;
- display: inline-block;
+ background: rgba(0, 0, 0, 0);
+ width: 100%;
+ display: inline-block;
}
.content .in-band {
- margin: 0px;
- padding: 0px;
- display: inline-block;
+ margin: 0px;
+ padding: 0px;
+ display: inline-block;
}
#main { position: relative; }
#main > .since {
- top: inherit;
- font-family: "Fira Sans", "Helvetica Neue", Helvetica, Arial, sans-serif;
+ top: inherit;
+ font-family: "Fira Sans", "Helvetica Neue", Helvetica, Arial, sans-serif;
}
.content table {
- border-spacing: 0 5px;
- border-collapse: separate;
+ border-spacing: 0 5px;
+ border-collapse: separate;
}
.content td { vertical-align: top; }
.content td:first-child { padding-right: 20px; }
.content td h1, .content td h2 { margin-left: 0; font-size: 1.1em; }
.docblock table {
- border: 1px solid;
- margin: .5em 0;
- border-collapse: collapse;
- width: 100%;
+ border: 1px solid;
+ margin: .5em 0;
+ border-collapse: collapse;
+ width: 100%;
}
.docblock table td {
- padding: .5em;
- border-top: 1px dashed;
- border-bottom: 1px dashed;
+ padding: .5em;
+ border-top: 1px dashed;
+ border-bottom: 1px dashed;
}
.docblock table th {
- padding: .5em;
- text-align: left;
- border-top: 1px solid;
- border-bottom: 1px solid;
+ padding: .5em;
+ text-align: left;
+ border-top: 1px solid;
+ border-bottom: 1px solid;
}
.content .item-list {
- list-style-type: none;
- padding: 0;
+ list-style-type: none;
+ padding: 0;
}
.content .item-list li { margin-bottom: 3px; }
.content .multi-column {
- -moz-column-count: 5;
- -moz-column-gap: 2.5em;
- -webkit-column-count: 5;
- -webkit-column-gap: 2.5em;
- column-count: 5;
- column-gap: 2.5em;
+ -moz-column-count: 5;
+ -moz-column-gap: 2.5em;
+ -webkit-column-count: 5;
+ -webkit-column-gap: 2.5em;
+ column-count: 5;
+ column-gap: 2.5em;
}
.content .multi-column li { width: 100%; display: inline-block; }
.content .method {
- font-size: 1em;
- position: relative;
+ font-size: 1em;
+ position: relative;
}
/* Shift "where ..." part of method or fn definition down a line */
.content .method .where, .content .fn .where { display: block; }
.content .methods > div { margin-left: 40px; }
.content .impl-items .docblock, .content .impl-items .stability {
- margin-left: 40px;
+ margin-left: 40px;
}
.content .impl-items .method, .content .impl-items > .type {
- margin-left: 20px;
+ margin-left: 20px;
}
.content .stability code {
- font-size: 90%;
+ font-size: 90%;
}
/* Shift where in trait listing down a line */
pre.trait .where::before {
- content: '\a ';
+ content: '\a ';
}
nav {
- border-bottom: 1px solid;
- padding-bottom: 10px;
- margin-bottom: 10px;
+ border-bottom: 1px solid;
+ padding-bottom: 10px;
+ margin-bottom: 10px;
}
nav.main {
- padding: 20px 0;
- text-align: center;
+ padding: 20px 0;
+ text-align: center;
}
nav.main .current {
- border-top: 1px solid;
- border-bottom: 1px solid;
+ border-top: 1px solid;
+ border-bottom: 1px solid;
}
nav.main .separator {
- border: 1px solid;
- display: inline-block;
- height: 23px;
- margin: 0 20px;
+ border: 1px solid;
+ display: inline-block;
+ height: 23px;
+ margin: 0 20px;
}
nav.sum { text-align: right; }
nav.sub form { display: inline; }
nav.sub, .content {
- margin-left: 230px;
+ margin-left: 230px;
}
a {
- text-decoration: none;
- background: transparent;
+ text-decoration: none;
+ background: transparent;
}
.docblock a:hover, .docblock-short a:hover, .stability a {
- text-decoration: underline;
+ text-decoration: underline;
}
.content span.enum, .content a.enum, .block a.current.enum { color: #5e9766; }
.block a.current.crate { font-weight: 500; }
.search-input {
- width: 100%;
- /* Override Normalize.css: we have margins and do
- not want to overflow - the `moz` attribute is necessary
- until Firefox 29, too early to drop at this point */
- -moz-box-sizing: border-box !important;
- box-sizing: border-box !important;
- outline: none;
- border: none;
- border-radius: 1px;
- margin-top: 5px;
- padding: 10px 16px;
- font-size: 17px;
- transition: border-color 300ms ease;
- transition: border-radius 300ms ease-in-out;
- transition: box-shadow 300ms ease-in-out;
+ width: 100%;
+ /* Override Normalize.css: we have margins and do
+ not want to overflow - the `moz` attribute is necessary
+ until Firefox 29, too early to drop at this point */
+ -moz-box-sizing: border-box !important;
+ box-sizing: border-box !important;
+ outline: none;
+ border: none;
+ border-radius: 1px;
+ margin-top: 5px;
+ padding: 10px 16px;
+ font-size: 17px;
+ transition: border-color 300ms ease;
+ transition: border-radius 300ms ease-in-out;
+ transition: box-shadow 300ms ease-in-out;
}
.search-input:focus {
- border-color: #66afe9;
- border-radius: 2px;
- border: 0;
- outline: 0;
- box-shadow: 0 0 8px #078dd8;
+ border-color: #66afe9;
+ border-radius: 2px;
+ border: 0;
+ outline: 0;
+ box-shadow: 0 0 8px #078dd8;
}
.search-results .desc {
- white-space: nowrap;
- text-overflow: ellipsis;
- overflow: hidden;
- display: block;
+ white-space: nowrap;
+ text-overflow: ellipsis;
+ overflow: hidden;
+ display: block;
}
.search-results a {
- display: block;
+ display: block;
}
.content .search-results td:first-child { padding-right: 0; }
}
body.blur > :not(#help) {
- filter: blur(8px);
- -webkit-filter: blur(8px);
- opacity: .7;
+ filter: blur(8px);
+ -webkit-filter: blur(8px);
+ opacity: .7;
}
#help {
- width: 100%;
- height: 100vh;
- position: fixed;
- top: 0;
- left: 0;
- display: flex;
- justify-content: center;
- align-items: center;
+ width: 100%;
+ height: 100vh;
+ position: fixed;
+ top: 0;
+ left: 0;
+ display: flex;
+ justify-content: center;
+ align-items: center;
}
#help > div {
- flex: 0 0 auto;
- background: #e9e9e9;
- box-shadow: 0 0 6px rgba(0,0,0,.2);
- width: 550px;
- height: 330px;
- border: 1px solid #bfbfbf;
+ flex: 0 0 auto;
+ background: #e9e9e9;
+ box-shadow: 0 0 6px rgba(0,0,0,.2);
+ width: 550px;
+ height: 330px;
+ border: 1px solid #bfbfbf;
}
#help dt {
- float: left;
- border-radius: 4px;
- border: 1px solid #bfbfbf;
- background: #fff;
- width: 23px;
- text-align: center;
- clear: left;
- display: block;
- margin-top: -1px;
+ float: left;
+ border-radius: 4px;
+ border: 1px solid #bfbfbf;
+ background: #fff;
+ width: 23px;
+ text-align: center;
+ clear: left;
+ display: block;
+ margin-top: -1px;
}
#help dd { margin: 5px 33px; }
#help .infos { padding-left: 0; }
#help h1, #help h2 { margin-top: 0; }
#help > div div {
- width: 50%;
- float: left;
- padding: 20px;
+ width: 50%;
+ float: left;
+ padding: 20px;
}
em.stab {
- display: inline-block;
- border-width: 1px;
- border-style: solid;
- padding: 3px;
- margin-bottom: 5px;
- font-size: 90%;
- font-style: normal;
+ display: inline-block;
+ border-width: 1px;
+ border-style: solid;
+ padding: 3px;
+ margin-bottom: 5px;
+ font-size: 90%;
+ font-style: normal;
}
em.stab p {
- display: inline;
+ display: inline;
}
.module-item .stab {
- border-width: 0;
- padding: 0;
- margin: 0;
- background: inherit !important;
+ border-width: 0;
+ padding: 0;
+ margin: 0;
+ background: inherit !important;
}
.module-item.unstable {
- opacity: 0.65;
+ opacity: 0.65;
}
.since {
- font-weight: normal;
- font-size: initial;
- color: grey;
- position: absolute;
- right: 0;
- top: 0;
+ font-weight: normal;
+ font-size: initial;
+ color: grey;
+ position: absolute;
+ right: 0;
+ top: 0;
}
.variants_table {
- width: 100%;
+ width: 100%;
}
.variants_table tbody tr td:first-child {
- width: 1%; /* make the variant name as small as possible */
+ width: 1%; /* make the variant name as small as possible */
}
td.summary-column {
- width: 100%;
+ width: 100%;
}
.summary {
- padding-right: 0px;
+ padding-right: 0px;
}
.line-numbers :target { background-color: transparent; }
pre.rust .macro, pre.rust .macro-nonterminal { color: #3E999F; }
pre.rust .lifetime { color: #B76514; }
pre.rust .question-mark {
- color: #ff9011;
- font-weight: bold;
+ color: #ff9011;
+ font-weight: bold;
}
pre.rust { position: relative; }
a.test-arrow {
- background-color: rgba(78, 139, 202, 0.2);
- display: inline-block;
- position: absolute;
- padding: 5px 10px 5px 10px;
- border-radius: 5px;
- font-size: 130%;
- top: 5px;
- right: 5px;
+ background-color: rgba(78, 139, 202, 0.2);
+ display: inline-block;
+ position: absolute;
+ padding: 5px 10px 5px 10px;
+ border-radius: 5px;
+ font-size: 130%;
+ top: 5px;
+ right: 5px;
}
a.test-arrow:hover{
- background-color: #4e8bca;
- text-decoration: none;
+ background-color: #4e8bca;
+ text-decoration: none;
}
.section-header:hover a:after {
- content: '\2002\00a7\2002';
+ content: '\2002\00a7\2002';
}
.section-header:hover a {
- text-decoration: none;
+ text-decoration: none;
}
.section-header a {
- color: inherit;
+ color: inherit;
}
.collapse-toggle {
- font-weight: 300;
- position: absolute;
- left: -23px;
- color: #999;
- top: 0;
+ font-weight: 300;
+ position: absolute;
+ left: -23px;
+ color: #999;
+ top: 0;
}
.toggle-wrapper > .collapse-toggle {
- left: -24px;
- margin-top: 0px;
+ left: -24px;
+ margin-top: 0px;
}
.toggle-wrapper {
- position: relative;
+ position: relative;
}
.toggle-wrapper.collapsed {
- height: 1em;
- transition: height .2s;
+ height: 1em;
+ transition: height .2s;
}
.collapse-toggle > .inner {
- display: inline-block;
- width: 1.2ch;
- text-align: center;
+ display: inline-block;
+ width: 1.2ch;
+ text-align: center;
}
.toggle-label {
- color: #999;
+ color: #999;
}
.ghost {
- display: none;
+ display: none;
}
.ghost + .since {
- position: initial;
- display: table-cell;
+ position: initial;
+ display: table-cell;
}
.since + .srclink {
- display: table-cell;
- padding-left: 10px;
+ display: table-cell;
+ padding-left: 10px;
}
span.since {
- position: initial;
- font-size: 20px;
- margin-right: 5px;
+ position: initial;
+ font-size: 20px;
+ margin-right: 5px;
}
.toggle-wrapper > .collapse-toggle {
- left: 0;
+ left: 0;
}
.variant + .toggle-wrapper > a {
- margin-top: 5px;
+ margin-top: 5px;
}
.enum > .toggle-wrapper + .docblock, .struct > .toggle-wrapper + .docblock {
- margin-left: 30px;
- margin-bottom: 20px;
- margin-top: 5px;
+ margin-left: 30px;
+ margin-bottom: 20px;
+ margin-top: 5px;
}
.enum > .collapsed, .struct > .collapsed {
- margin-bottom: 25px;
+ margin-bottom: 25px;
}
.enum .variant, .struct .structfield {
- display: block;
+ display: block;
+}
+
+.attributes {
+ display: block;
+ margin: 0px 0px 0px 30px !important;
+}
+.toggle-attributes.collapsed {
+ margin-bottom: 5px;
}
:target > code {
/* Media Queries */
@media (max-width: 700px) {
- body {
- padding-top: 0px;
- }
-
- .sidebar {
- height: 40px;
- min-height: 40px;
- width: 100%;
- margin: 0px;
- padding: 0px;
- position: static;
- }
-
- .sidebar .location {
- float: right;
- margin: 0px;
- padding: 3px 10px 1px 10px;
- min-height: 39px;
- background: inherit;
- text-align: left;
- font-size: 24px;
- }
-
- .sidebar .location:empty {
- padding: 0;
- }
-
- .sidebar img {
- width: 35px;
- margin-top: 5px;
- margin-bottom: 0px;
- float: left;
- }
-
- nav.sub {
- margin: 0 auto;
- }
-
- .sidebar .block {
- display: none;
- }
-
- .content {
- margin-left: 0px;
- }
-
- .content .in-band {
- width: 100%;
- }
-
- .content .out-of-band {
- display: none;
- }
-
- .toggle-wrapper > .collapse-toggle {
- left: 0px;
- }
-
- .toggle-wrapper {
- height: 1.5em;
- }
+ body {
+ padding-top: 0px;
+ }
+
+ .sidebar {
+ height: 40px;
+ min-height: 40px;
+ width: 100%;
+ margin: 0px;
+ padding: 0px;
+ position: static;
+ }
+
+ .sidebar .location {
+ float: right;
+ margin: 0px;
+ padding: 3px 10px 1px 10px;
+ min-height: 39px;
+ background: inherit;
+ text-align: left;
+ font-size: 24px;
+ }
+
+ .sidebar .location:empty {
+ padding: 0;
+ }
+
+ .sidebar img {
+ width: 35px;
+ margin-top: 5px;
+ margin-bottom: 0px;
+ float: left;
+ }
+
+ nav.sub {
+ margin: 0 auto;
+ }
+
+ .sidebar .block {
+ display: none;
+ }
+
+ .content {
+ margin-left: 0px;
+ }
+
+ .content .in-band {
+ width: 100%;
+ }
+
+ .content .out-of-band {
+ display: none;
+ }
+
+ .toggle-wrapper > .collapse-toggle {
+ left: 0px;
+ }
+
+ .toggle-wrapper {
+ height: 1.5em;
+ }
}
@media print {
- nav.sub, .content .out-of-band, .collapse-toggle {
- display: none;
- }
+ nav.sub, .content .out-of-band, .collapse-toggle {
+ display: none;
+ }
}
#![feature(box_patterns)]
#![feature(box_syntax)]
-#![feature(dotdot_in_tuple_patterns)]
+#![cfg_attr(stage0, feature(dotdot_in_tuple_patterns))]
#![feature(libc)]
#![feature(rustc_private)]
#![feature(set_stdio)]
use rustc::hir::def::Def;
use rustc::hir::def_id::LOCAL_CRATE;
use rustc::middle::privacy::AccessLevel;
-use rustc::util::nodemap::FnvHashSet;
+use rustc::util::nodemap::FxHashSet;
use rustc::hir;
pub module: Module,
pub attrs: hir::HirVec<ast::Attribute>,
pub cx: &'a core::DocContext<'a, 'tcx>,
- view_item_stack: FnvHashSet<ast::NodeId>,
+ view_item_stack: FxHashSet<ast::NodeId>,
inlining_from_glob: bool,
}
impl<'a, 'tcx> RustdocVisitor<'a, 'tcx> {
pub fn new(cx: &'a core::DocContext<'a, 'tcx>) -> RustdocVisitor<'a, 'tcx> {
// If the root is reexported, terminate all recursion.
- let mut stack = FnvHashSet();
+ let mut stack = FxHashSet();
stack.insert(ast::CRATE_NODE_ID);
RustdocVisitor {
module: Module::new(None),
//! of other types, and you can implement them for your types too. As such,
//! you'll see a few different types of I/O throughout the documentation in
//! this module: [`File`]s, [`TcpStream`]s, and sometimes even [`Vec<T>`]s. For
-//! example, `Read` adds a `read()` method, which we can use on `File`s:
+//! example, [`Read`] adds a [`read()`] method, which we can use on `File`s:
//!
//! ```
//! use std::io;
//! [`Lines`]: struct.Lines.html
//! [`io::Result`]: type.Result.html
//! [`try!`]: ../macro.try.html
+//! [`read()`]: trait.Read.html#tymethod.read
#![stable(feature = "rust1", since = "1.0.0")]
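The hunk above turns the plain `read()` mention into a link; as a sketch of the pattern the prose describes, a byte slice can stand in for a `File`, so the example runs without filesystem access:

```rust
use std::io::Read;

fn main() {
    // Any `Read` implementor exposes `read()`; `&[u8]` is one, so it can
    // substitute for a `File` here.
    let mut reader: &[u8] = b"hello";
    let mut buf = [0u8; 8];
    let n = reader.read(&mut buf).unwrap();
    assert_eq!(n, 5);
    assert_eq!(&buf[..n], b"hello");
}
```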
///
/// Implementors of the `Write` trait are sometimes called 'writers'.
///
-/// Writers are defined by two required methods, `write()` and `flush()`:
+/// Writers are defined by two required methods, [`write()`] and [`flush()`]:
///
-/// * The `write()` method will attempt to write some data into the object,
+/// * The [`write()`] method will attempt to write some data into the object,
/// returning how many bytes were successfully written.
///
-/// * The `flush()` method is useful for adaptors and explicit buffers
+/// * The [`flush()`] method is useful for adaptors and explicit buffers
/// themselves for ensuring that all buffered data has been pushed out to the
/// 'true sink'.
///
/// Writers are intended to be composable with one another. Many implementors
-/// throughout `std::io` take and provide types which implement the `Write`
+/// throughout [`std::io`] take and provide types which implement the `Write`
/// trait.
///
+/// [`write()`]: #tymethod.write
+/// [`flush()`]: #tymethod.flush
+/// [`std::io`]: index.html
+///
/// # Examples
///
/// ```
/// Reader adaptor which limits the bytes read from an underlying reader.
///
-/// This struct is generally created by calling [`take()`][take] on a reader.
-/// Please see the documentation of `take()` for more details.
+/// This struct is generally created by calling [`take()`] on a reader.
+/// Please see the documentation of [`take()`] for more details.
///
-/// [take]: trait.Read.html#method.take
+/// [`take()`]: trait.Read.html#method.take
#[stable(feature = "rust1", since = "1.0.0")]
pub struct Take<T> {
inner: T,
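As a quick illustration of the adaptor this struct backs (a minimal sketch, again using an in-memory reader rather than a file):

```rust
use std::io::Read;

fn main() {
    // `take(5)` wraps the reader and stops after at most 5 bytes.
    let data = b"hello world";
    let mut limited = (&data[..]).take(5);
    let mut buf = String::new();
    limited.read_to_string(&mut buf).unwrap();
    assert_eq!(buf, "hello");
}
```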
///
/// # Note
///
- /// This instance may reach EOF after reading fewer bytes than indicated by
- /// this method if the underlying `Read` instance reaches EOF.
+ /// This instance may reach EOF after reading fewer bytes than indicated by
+ /// this method if the underlying [`Read`] instance reaches EOF.
+ ///
+ /// [`Read`]: ../../std/io/trait.Read.html
///
/// # Examples
///
///
/// Each handle shares a global buffer of data to be written to the standard
/// output stream. Access is also synchronized via a lock and explicit control
-/// over locking is available via the `lock` method.
+/// over locking is available via the [`lock()`] method.
///
/// Created by the [`io::stdout`] method.
///
+/// [`lock()`]: #method.lock
/// [`io::stdout`]: fn.stdout.html
#[stable(feature = "rust1", since = "1.0.0")]
pub struct Stdout {
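The `lock()` method the doc now links can be sketched like this; holding the returned handle keeps a sequence of writes from interleaving with output from other threads:

```rust
use std::io::Write;

fn main() {
    let stdout = std::io::stdout();
    // The handle holds the global lock until it is dropped.
    let mut handle = stdout.lock();
    handle.write_all(b"hello\n").unwrap();
    handle.write_all(b"world\n").unwrap();
    handle.flush().unwrap();
}
```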
#![feature(const_fn)]
#![feature(core_float)]
#![feature(core_intrinsics)]
-#![feature(dotdot_in_tuple_patterns)]
+#![cfg_attr(stage0, feature(dotdot_in_tuple_patterns))]
#![feature(dropck_parametricity)]
#![feature(float_extras)]
#![feature(float_from_str_radix)]
///
/// Address type can be any implementor of `ToSocketAddrs` trait. See its
/// documentation for concrete examples.
+ /// This will return an error when the IP version of the local socket
+ /// does not match the IP version of the address returned from
+ /// `ToSocketAddrs`.
+ /// See https://github.com/rust-lang/rust/issues/34202 for more details.
#[stable(feature = "rust1", since = "1.0.0")]
pub fn send_to<A: ToSocketAddrs>(&self, buf: &[u8], addr: A)
-> io::Result<usize> {
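A minimal sketch of the behavior the new note documents, using two ephemeral loopback sockets (the OS picks the port numbers, none are assumed):

```rust
use std::net::UdpSocket;

fn main() {
    // Two IPv4 sockets bound to OS-chosen loopback ports.
    let sender = UdpSocket::bind("127.0.0.1:0").unwrap();
    let receiver = UdpSocket::bind("127.0.0.1:0").unwrap();
    // `send_to` accepts any `ToSocketAddrs` implementor; an IPv6 address
    // here would hit the IP-version mismatch error the note describes.
    let sent = sender.send_to(b"ping", receiver.local_addr().unwrap()).unwrap();
    assert_eq!(sent, 4);
}
```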
pub fn is_parameterized(&self) -> bool {
self.is_lt_parameterized() || self.is_type_parameterized()
}
+ pub fn span_for_name(&self, name: &str) -> Option<Span> {
+ for t in &self.ty_params {
+ if t.ident.name.as_str() == name {
+ return Some(t.span);
+ }
+ }
+ None
+ }
}
impl Default for Generics {
// Allows cfg(target_has_atomic = "...").
(active, cfg_target_has_atomic, "1.9.0", Some(32976)),
- // Allows `..` in tuple (struct) patterns
- (active, dotdot_in_tuple_patterns, "1.10.0", Some(33627)),
-
// Allows `impl Trait` in function return types.
(active, conservative_impl_trait, "1.12.0", Some(34511)),
- // Allows tuple structs and variants in more contexts,
// Permits numeric fields in struct expressions and patterns.
(active, relaxed_adts, "1.12.0", Some(35626)),
(accepted, deprecated, "1.9.0", Some(29935)),
// `expr?`
(accepted, question_mark, "1.14.0", Some(31436)),
+ // Allows `..` in tuple (struct) patterns
+ (accepted, dotdot_in_tuple_patterns, "1.14.0", Some(33627)),
);
// (changing above list without updating src/doc/reference.md makes @cmr sad)
}
}
+fn starts_with_digit(s: &str) -> bool {
+ s.as_bytes().first().cloned().map_or(false, |b| b >= b'0' && b <= b'9')
+}
+
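The helper added above can be exercised on its own; a standalone copy with the same body:

```rust
fn starts_with_digit(s: &str) -> bool {
    // True iff the first byte is an ASCII digit; empty strings yield false.
    s.as_bytes().first().cloned().map_or(false, |b| b >= b'0' && b <= b'9')
}

fn main() {
    assert!(starts_with_digit("0"));
    assert!(starts_with_digit("42abc"));
    assert!(!starts_with_digit(""));
    assert!(!starts_with_digit("abc"));
}
```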
impl<'a> Visitor for PostExpansionVisitor<'a> {
fn visit_attribute(&mut self, attr: &ast::Attribute) {
if !self.context.cm.span_allows_unstable(attr.span) {
gate_feature_post!(&self, field_init_shorthand, field.span,
"struct field shorthands are unstable");
}
+ if starts_with_digit(&field.ident.node.name.as_str()) {
+ gate_feature_post!(&self, relaxed_adts,
+ field.span,
+ "numeric fields in struct expressions are unstable");
+ }
}
}
_ => {}
pattern.span,
"box pattern syntax is experimental");
}
- PatKind::Tuple(_, ddpos)
- if ddpos.is_some() => {
- gate_feature_post!(&self, dotdot_in_tuple_patterns,
- pattern.span,
- "`..` in tuple patterns is experimental");
- }
- PatKind::TupleStruct(_, ref fields, ddpos)
- if ddpos.is_some() && !fields.is_empty() => {
- gate_feature_post!(&self, dotdot_in_tuple_patterns,
- pattern.span,
- "`..` in tuple struct patterns is experimental");
- }
- PatKind::TupleStruct(_, ref fields, ddpos)
- if ddpos.is_none() && fields.is_empty() => {
- gate_feature_post!(&self, relaxed_adts, pattern.span,
- "empty tuple structs patterns are unstable");
+ PatKind::Struct(_, ref fields, _) => {
+ for field in fields {
+ if starts_with_digit(&field.node.ident.name.as_str()) {
+ gate_feature_post!(&self, relaxed_adts,
+ field.span,
+ "numeric fields in struct patterns are unstable");
+ }
+ }
}
_ => {}
}
visit::walk_impl_item(self, ii);
}
- fn visit_variant_data(&mut self, vdata: &ast::VariantData, _: ast::Ident,
- _: &ast::Generics, _: NodeId, span: Span) {
- if vdata.fields().is_empty() {
- if vdata.is_tuple() {
- gate_feature_post!(&self, relaxed_adts, span,
- "empty tuple structs and enum variants are unstable, \
- use unit structs and enum variants instead");
- }
- }
-
- visit::walk_struct_def(self, vdata)
- }
-
fn visit_vis(&mut self, vis: &ast::Visibility) {
let span = match *vis {
ast::Visibility::Crate(span) => span,
#![cfg_attr(stage0, feature(question_mark))]
#![feature(rustc_diagnostic_macros)]
#![feature(specialization)]
-#![feature(dotdot_in_tuple_patterns)]
+#![cfg_attr(stage0, feature(dotdot_in_tuple_patterns))]
extern crate core;
extern crate serialize;
html_root_url = "https://doc.rust-lang.org/nightly/")]
#![cfg_attr(not(stage0), deny(warnings))]
-#![feature(dotdot_in_tuple_patterns)]
+#![cfg_attr(stage0, feature(dotdot_in_tuple_patterns))]
#![feature(proc_macro_lib)]
#![feature(proc_macro_internals)]
#![feature(rustc_private)]
// force-host
-#![feature(dotdot_in_tuple_patterns)]
#![feature(plugin_registrar, quote, rustc_private)]
extern crate syntax;
+++ /dev/null
-// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
-// file at the top-level directory of this distribution and at
-// http://rust-lang.org/COPYRIGHT.
-//
-// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
-// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
-// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
-// option. This file may not be copied, modified, or distributed
-// except according to those terms.
-
-trait Foo {
- fn foo();
-}
-
-struct Bar;
-
-impl Foo for Bar {}
-//~^ ERROR E0046
-//~| NOTE missing `foo` in implementation
-
-fn main() {
-}
--- /dev/null
+// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+#[macro_export]
+macro_rules! define_macro {
+ ($i:ident) => {
+ macro_rules! $i { () => {} }
+ }
+}
// option. This file may not be copied, modified, or distributed
// except according to those terms.
-#![feature(relaxed_adts)]
-
pub struct XEmpty1 {}
pub struct XEmpty2;
pub struct XEmpty6();
// option. This file may not be copied, modified, or distributed
// except according to those terms.
-#![feature(item_like_imports, relaxed_adts)]
+#![feature(item_like_imports)]
pub mod c {
pub struct S {}
// FIXME: Remove when `item_like_imports` is stabilized.
-#![feature(relaxed_adts)]
-
pub mod c {
pub struct S {}
pub struct TS();
// aux-build:empty-struct.rs
-#![feature(relaxed_adts)]
-
extern crate empty_struct;
use empty_struct::*;
// aux-build:empty-struct.rs
-#![feature(relaxed_adts)]
-
extern crate empty_struct;
use empty_struct::*;
// aux-build:empty-struct.rs
-#![feature(relaxed_adts)]
-
extern crate empty_struct;
use empty_struct::*;
// aux-build:empty-struct.rs
-#![feature(relaxed_adts)]
-
extern crate empty_struct;
use empty_struct::*;
+++ /dev/null
-// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
-// file at the top-level directory of this distribution and at
-// http://rust-lang.org/COPYRIGHT.
-//
-// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
-// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
-// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
-// option. This file may not be copied, modified, or distributed
-// except according to those terms.
-
-struct Z(u8, u8);
-
-enum E {
- U(u8, u8),
-}
-
-fn main() {
- match Z(0, 1) {
- Z{..} => {} //~ ERROR tuple structs and variants in struct patterns are unstable
- }
- match E::U(0, 1) {
- E::U{..} => {} //~ ERROR tuple structs and variants in struct patterns are unstable
- }
-
- let z1 = Z(0, 1);
- let z2 = Z { ..z1 }; //~ ERROR tuple structs and variants in struct patterns are unstable
-}
+++ /dev/null
-// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
-// file at the top-level directory of this distribution and at
-// http://rust-lang.org/COPYRIGHT.
-//
-// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
-// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
-// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
-// option. This file may not be copied, modified, or distributed
-// except according to those terms.
-
-struct S(); //~ ERROR empty tuple structs and enum variants are unstable
-struct Z(u8, u8);
-
-enum E {
- V(), //~ ERROR empty tuple structs and enum variants are unstable
- U(u8, u8),
-}
-
-fn main() {
- match S() {
- S() => {} //~ ERROR empty tuple structs patterns are unstable
- }
- match E::V() {
- E::V() => {} //~ ERROR empty tuple structs patterns are unstable
- }
-}
+++ /dev/null
-// Copyright 2015 The Rust Project Developers. See the COPYRIGHT
-// file at the top-level directory of this distribution and at
-// http://rust-lang.org/COPYRIGHT.
-//
-// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
-// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
-// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
-// option. This file may not be copied, modified, or distributed
-// except according to those terms.
-
-#![feature(associated_consts)]
-
-trait Foo {
- fn bar(&self);
- //~^ NOTE item in trait
- //~| NOTE item in trait
- const MY_CONST: u32; //~ NOTE item in trait
-}
-
-pub struct FooConstForMethod;
-
-impl Foo for FooConstForMethod {
- //~^ ERROR E0046
- //~| NOTE missing `bar` in implementation
- const bar: u64 = 1;
- //~^ ERROR E0323
- //~| NOTE does not match trait
- const MY_CONST: u32 = 1;
-}
-
-pub struct FooMethodForConst;
-
-impl Foo for FooMethodForConst {
- //~^ ERROR E0046
- //~| NOTE missing `MY_CONST` in implementation
- fn bar(&self) {}
- fn MY_CONST() {}
- //~^ ERROR E0324
- //~| NOTE does not match trait
-}
-
-pub struct FooTypeForMethod;
-
-impl Foo for FooTypeForMethod {
- //~^ ERROR E0046
- //~| NOTE missing `bar` in implementation
- type bar = u64;
- //~^ ERROR E0325
- //~| NOTE does not match trait
- const MY_CONST: u32 = 1;
-}
-
-fn main () {}
+++ /dev/null
-// Copyright 2015 The Rust Project Developers. See the COPYRIGHT
-// file at the top-level directory of this distribution and at
-// http://rust-lang.org/COPYRIGHT.
-//
-// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
-// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
-// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
-// option. This file may not be copied, modified, or distributed
-// except according to those terms.
-
-struct TS ( //~ ERROR empty tuple structs and enum variants are unstable
- #[cfg(untrue)]
- i32,
-);
-
-enum E {
- TV ( //~ ERROR empty tuple structs and enum variants are unstable
- #[cfg(untrue)]
- i32,
- )
-}
-
-fn main() {
- let s = TS;
- let tv = E::TV;
-}
// option. This file may not be copied, modified, or distributed
// except according to those terms.
-#![feature(relaxed_adts)]
-
enum MyOption<T> {
MySome(T),
MyNone,
+++ /dev/null
-// Copyright 2015 The Rust Project Developers. See the COPYRIGHT
-// file at the top-level directory of this distribution and at
-// http://rust-lang.org/COPYRIGHT.
-//
-// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
-// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
-// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
-// option. This file may not be copied, modified, or distributed
-// except according to those terms.
-
-// Regression test for #23729
-
-fn main() {
- let fib = {
- struct Recurrence {
- mem: [u64; 2],
- pos: usize,
- }
-
- impl Iterator for Recurrence {
- //~^ ERROR E0046
- //~| NOTE missing `Item` in implementation
- #[inline]
- fn next(&mut self) -> Option<u64> {
- if self.pos < 2 {
- let next_val = self.mem[self.pos];
- self.pos += 1;
- Some(next_val)
- } else {
- let next_val = self.mem[0] + self.mem[1];
- self.mem[0] = self.mem[1];
- self.mem[1] = next_val;
- Some(next_val)
- }
- }
- }
-
- Recurrence { mem: [0, 1], pos: 0 }
- };
-
- for e in fib.take(10) {
- println!("{}", e)
- }
-}
+++ /dev/null
-// Copyright 2015 The Rust Project Developers. See the COPYRIGHT
-// file at the top-level directory of this distribution and at
-// http://rust-lang.org/COPYRIGHT.
-//
-// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
-// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
-// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
-// option. This file may not be copied, modified, or distributed
-// except according to those terms.
-
-// Regression test for #23827
-
-#![feature(core, unboxed_closures)]
-
-pub struct Prototype {
- pub target: u32
-}
-
-trait Component {
- fn apply(self, e: u32);
-}
-
-impl<C: Component> Fn<(C,)> for Prototype {
- extern "rust-call" fn call(&self, (comp,): (C,)) -> Prototype {
- comp.apply(self.target);
- *self
- }
-}
-
-impl<C: Component> FnMut<(C,)> for Prototype {
- extern "rust-call" fn call_mut(&mut self, (comp,): (C,)) -> Prototype {
- Fn::call(*&self, (comp,))
- }
-}
-
-impl<C: Component> FnOnce<(C,)> for Prototype {
- //~^ ERROR E0046
- //~| NOTE missing `Output` in implementation
- extern "rust-call" fn call_once(self, (comp,): (C,)) -> Prototype {
- Fn::call(&self, (comp,))
- }
-}
-
-fn main() {}
+++ /dev/null
-// Copyright 2015 The Rust Project Developers. See the COPYRIGHT
-// file at the top-level directory of this distribution and at
-// http://rust-lang.org/COPYRIGHT.
-//
-// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
-// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
-// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
-// option. This file may not be copied, modified, or distributed
-// except according to those terms.
-
-// Regression test for #24356
-
-// ignore-tidy-linelength
-
-fn main() {
- {
- use std::ops::Deref;
-
- struct Thing(i8);
-
- /*
- // Correct impl
- impl Deref for Thing {
- type Target = i8;
- fn deref(&self) -> &i8 { &self.0 }
- }
- */
-
- // Causes ICE
- impl Deref for Thing {
- //~^ ERROR E0046
- //~| NOTE missing `Target` in implementation
- fn deref(&self) -> i8 { self.0 }
- }
-
- let thing = Thing(72);
-
- *thing
- };
-}
}
impl Foo for S { //~ ERROR: `Foo` is not a trait
- //~| NOTE: not a trait
+ //~| NOTE: expected trait, found type alias
//~| NOTE: type aliases cannot be used for traits
fn bar() { }
}
// option. This file may not be copied, modified, or distributed
// except according to those terms.
-#![feature(relaxed_adts)]
-
struct NonCopyable(());
fn main() {
trait I {}
type K = I;
impl K for isize {} //~ ERROR: `K` is not a trait
- //~| NOTE: not a trait
+ //~| NOTE: expected trait, found type alias
//~| NOTE: aliases cannot be used for traits
use ImportError; //~ ERROR unresolved import `ImportError` [E0432]
// aux-build:namespace-mix-new.rs
-#![feature(item_like_imports, relaxed_adts)]
+#![feature(item_like_imports)]
extern crate namespace_mix_new;
use namespace_mix_new::*;
// aux-build:namespace-mix-old.rs
-#![feature(relaxed_adts)]
-
extern crate namespace_mix_old;
use namespace_mix_old::{xm1, xm2, xm3, xm4, xm5, xm6, xm7, xm8, xm9, xmA, xmB, xmC};
--- /dev/null
+// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+struct S(u8);
+
+fn main() {
+ let s = S{0: 10}; //~ ERROR numeric fields in struct expressions are unstable
+ match s {
+ S{0: a, ..} => {} //~ ERROR numeric fields in struct patterns are unstable
+ }
+}
--- /dev/null
+// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+// aux-build:define_macro.rs
+// error-pattern: `bar` is already in scope
+
+macro_rules! bar { () => {} }
+define_macro!(bar);
+bar!();
+
+macro_rules! m { () => { #[macro_use] extern crate define_macro; } }
+m!();
+
+fn main() {}
// option. This file may not be copied, modified, or distributed
// except according to those terms.
-#![feature(dotdot_in_tuple_patterns)]
-
fn main() {
let x;
+++ /dev/null
-// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
-// file at the top-level directory of this distribution and at
-// http://rust-lang.org/COPYRIGHT.
-//
-// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
-// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
-// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
-// option. This file may not be copied, modified, or distributed
-// except according to those terms.
-
-fn main() {
- match 0 {
- (..) => {} //~ ERROR `..` in tuple patterns is experimental
- (pat, ..) => {} //~ ERROR `..` in tuple patterns is experimental
- S(pat, ..) => {} //~ ERROR `..` in tuple struct patterns is experimental
- }
-}
// option. This file may not be copied, modified, or distributed
// except according to those terms.
-#![feature(dotdot_in_tuple_patterns)]
-
struct S(u8, u8, u8);
fn main() {
--- /dev/null
+// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+
+// This test case tests the incremental compilation hash (ICH) implementation
+// for unary and binary expressions.
+
+// The general pattern followed here is: Change one thing between rev1 and rev2
+// and make sure that the hash has changed, then change nothing between rev2 and
+// rev3 and make sure that the hash has not changed.
+
+// must-compile-successfully
+// revisions: cfail1 cfail2 cfail3
+// compile-flags: -Z query-dep-graph -Z force-overflow-checks=off
+
+#![allow(warnings)]
+#![feature(rustc_attrs)]
+#![crate_type="rlib"]
+
+
+// Change constant operand of negation -----------------------------------------
+#[cfg(cfail1)]
+pub fn const_negation() -> i32 {
+ -10
+}
+
+#[cfg(not(cfail1))]
+#[rustc_dirty(label="Hir", cfg="cfail2")]
+#[rustc_clean(label="Hir", cfg="cfail3")]
+#[rustc_metadata_dirty(cfg="cfail2")]
+#[rustc_metadata_clean(cfg="cfail3")]
+pub fn const_negation() -> i32 {
+ -1
+}
+
+
+
+// Change constant operand of bitwise not --------------------------------------
+#[cfg(cfail1)]
+pub fn const_bitwise_not() -> i32 {
+ !100
+}
+
+#[cfg(not(cfail1))]
+#[rustc_dirty(label="Hir", cfg="cfail2")]
+#[rustc_clean(label="Hir", cfg="cfail3")]
+#[rustc_metadata_dirty(cfg="cfail2")]
+#[rustc_metadata_clean(cfg="cfail3")]
+pub fn const_bitwise_not() -> i32 {
+ !99
+}
+
+
+
+// Change variable operand of negation -----------------------------------------
+#[cfg(cfail1)]
+pub fn var_negation(x: i32, y: i32) -> i32 {
+ -x
+}
+
+#[cfg(not(cfail1))]
+#[rustc_dirty(label="Hir", cfg="cfail2")]
+#[rustc_clean(label="Hir", cfg="cfail3")]
+#[rustc_metadata_dirty(cfg="cfail2")]
+#[rustc_metadata_clean(cfg="cfail3")]
+pub fn var_negation(x: i32, y: i32) -> i32 {
+ -y
+}
+
+
+
+// Change variable operand of bitwise not --------------------------------------
+#[cfg(cfail1)]
+pub fn var_bitwise_not(x: i32, y: i32) -> i32 {
+ !x
+}
+
+#[cfg(not(cfail1))]
+#[rustc_dirty(label="Hir", cfg="cfail2")]
+#[rustc_clean(label="Hir", cfg="cfail3")]
+#[rustc_metadata_dirty(cfg="cfail2")]
+#[rustc_metadata_clean(cfg="cfail3")]
+pub fn var_bitwise_not(x: i32, y: i32) -> i32 {
+ !y
+}
+
+
+
+// Change variable operand of deref --------------------------------------------
+#[cfg(cfail1)]
+pub fn var_deref(x: &i32, y: &i32) -> i32 {
+ *x
+}
+
+#[cfg(not(cfail1))]
+#[rustc_dirty(label="Hir", cfg="cfail2")]
+#[rustc_clean(label="Hir", cfg="cfail3")]
+#[rustc_metadata_dirty(cfg="cfail2")]
+#[rustc_metadata_clean(cfg="cfail3")]
+pub fn var_deref(x: &i32, y: &i32) -> i32 {
+ *y
+}
+
+
+
+// Change first constant operand of addition -----------------------------------
+#[cfg(cfail1)]
+pub fn first_const_add() -> i32 {
+ 1 + 3
+}
+
+#[cfg(not(cfail1))]
+#[rustc_dirty(label="Hir", cfg="cfail2")]
+#[rustc_clean(label="Hir", cfg="cfail3")]
+#[rustc_metadata_dirty(cfg="cfail2")]
+#[rustc_metadata_clean(cfg="cfail3")]
+pub fn first_const_add() -> i32 {
+ 2 + 3
+}
+
+
+
+// Change second constant operand of addition -----------------------------------
+#[cfg(cfail1)]
+pub fn second_const_add() -> i32 {
+ 1 + 2
+}
+
+#[cfg(not(cfail1))]
+#[rustc_dirty(label="Hir", cfg="cfail2")]
+#[rustc_clean(label="Hir", cfg="cfail3")]
+#[rustc_metadata_dirty(cfg="cfail2")]
+#[rustc_metadata_clean(cfg="cfail3")]
+pub fn second_const_add() -> i32 {
+ 1 + 3
+}
+
+
+
+// Change first variable operand of addition -----------------------------------
+#[cfg(cfail1)]
+pub fn first_var_add(a: i32, b: i32) -> i32 {
+ a + 2
+}
+
+#[cfg(not(cfail1))]
+#[rustc_dirty(label="Hir", cfg="cfail2")]
+#[rustc_clean(label="Hir", cfg="cfail3")]
+#[rustc_metadata_dirty(cfg="cfail2")]
+#[rustc_metadata_clean(cfg="cfail3")]
+pub fn first_var_add(a: i32, b: i32) -> i32 {
+ b + 2
+}
+
+
+
+// Change second variable operand of addition ----------------------------------
+#[cfg(cfail1)]
+pub fn second_var_add(a: i32, b: i32) -> i32 {
+ 1 + a
+}
+
+#[cfg(not(cfail1))]
+#[rustc_dirty(label="Hir", cfg="cfail2")]
+#[rustc_clean(label="Hir", cfg="cfail3")]
+#[rustc_metadata_dirty(cfg="cfail2")]
+#[rustc_metadata_clean(cfg="cfail3")]
+pub fn second_var_add(a: i32, b: i32) -> i32 {
+ 1 + b
+}
+
+
+
+// Change operator from + to - -------------------------------------------------
+#[cfg(cfail1)]
+pub fn plus_to_minus(a: i32) -> i32 {
+ 1 + a
+}
+
+#[cfg(not(cfail1))]
+#[rustc_dirty(label="Hir", cfg="cfail2")]
+#[rustc_clean(label="Hir", cfg="cfail3")]
+#[rustc_metadata_dirty(cfg="cfail2")]
+#[rustc_metadata_clean(cfg="cfail3")]
+pub fn plus_to_minus(a: i32) -> i32 {
+ 1 - a
+}
+
+
+
+// Change operator from + to * -------------------------------------------------
+#[cfg(cfail1)]
+pub fn plus_to_mult(a: i32) -> i32 {
+ 1 + a
+}
+
+#[cfg(not(cfail1))]
+#[rustc_dirty(label="Hir", cfg="cfail2")]
+#[rustc_clean(label="Hir", cfg="cfail3")]
+#[rustc_metadata_dirty(cfg="cfail2")]
+#[rustc_metadata_clean(cfg="cfail3")]
+pub fn plus_to_mult(a: i32) -> i32 {
+ 1 * a
+}
+
+
+
+// Change operator from + to / -------------------------------------------------
+#[cfg(cfail1)]
+pub fn plus_to_div(a: i32) -> i32 {
+ 1 + a
+}
+
+#[cfg(not(cfail1))]
+#[rustc_dirty(label="Hir", cfg="cfail2")]
+#[rustc_clean(label="Hir", cfg="cfail3")]
+#[rustc_metadata_dirty(cfg="cfail2")]
+#[rustc_metadata_clean(cfg="cfail3")]
+pub fn plus_to_div(a: i32) -> i32 {
+ 1 / a
+}
+
+
+
+// Change operator from + to % -------------------------------------------------
+#[cfg(cfail1)]
+pub fn plus_to_mod(a: i32) -> i32 {
+ 1 + a
+}
+
+#[cfg(not(cfail1))]
+#[rustc_dirty(label="Hir", cfg="cfail2")]
+#[rustc_clean(label="Hir", cfg="cfail3")]
+#[rustc_metadata_dirty(cfg="cfail2")]
+#[rustc_metadata_clean(cfg="cfail3")]
+pub fn plus_to_mod(a: i32) -> i32 {
+ 1 % a
+}
+
+
+
+// Change operator from && to || -----------------------------------------------
+#[cfg(cfail1)]
+pub fn and_to_or(a: bool, b: bool) -> bool {
+ a && b
+}
+
+#[cfg(not(cfail1))]
+#[rustc_dirty(label="Hir", cfg="cfail2")]
+#[rustc_clean(label="Hir", cfg="cfail3")]
+#[rustc_metadata_dirty(cfg="cfail2")]
+#[rustc_metadata_clean(cfg="cfail3")]
+pub fn and_to_or(a: bool, b: bool) -> bool {
+ a || b
+}
+
+
+
+// Change operator from & to | -------------------------------------------------
+#[cfg(cfail1)]
+pub fn bitwise_and_to_bitwise_or(a: i32) -> i32 {
+ 1 & a
+}
+
+#[cfg(not(cfail1))]
+#[rustc_dirty(label="Hir", cfg="cfail2")]
+#[rustc_clean(label="Hir", cfg="cfail3")]
+#[rustc_metadata_dirty(cfg="cfail2")]
+#[rustc_metadata_clean(cfg="cfail3")]
+pub fn bitwise_and_to_bitwise_or(a: i32) -> i32 {
+ 1 | a
+}
+
+
+
+// Change operator from & to ^ -------------------------------------------------
+#[cfg(cfail1)]
+pub fn bitwise_and_to_bitwise_xor(a: i32) -> i32 {
+ 1 & a
+}
+
+#[cfg(not(cfail1))]
+#[rustc_dirty(label="Hir", cfg="cfail2")]
+#[rustc_clean(label="Hir", cfg="cfail3")]
+#[rustc_metadata_dirty(cfg="cfail2")]
+#[rustc_metadata_clean(cfg="cfail3")]
+pub fn bitwise_and_to_bitwise_xor(a: i32) -> i32 {
+ 1 ^ a
+}
+
+
+
+// Change operator from & to << ------------------------------------------------
+#[cfg(cfail1)]
+pub fn bitwise_and_to_lshift(a: i32) -> i32 {
+ a & 1
+}
+
+#[cfg(not(cfail1))]
+#[rustc_dirty(label="Hir", cfg="cfail2")]
+#[rustc_clean(label="Hir", cfg="cfail3")]
+#[rustc_metadata_dirty(cfg="cfail2")]
+#[rustc_metadata_clean(cfg="cfail3")]
+pub fn bitwise_and_to_lshift(a: i32) -> i32 {
+ a << 1
+}
+
+
+
+// Change operator from & to >> ------------------------------------------------
+#[cfg(cfail1)]
+pub fn bitwise_and_to_rshift(a: i32) -> i32 {
+ a & 1
+}
+
+#[cfg(not(cfail1))]
+#[rustc_dirty(label="Hir", cfg="cfail2")]
+#[rustc_clean(label="Hir", cfg="cfail3")]
+#[rustc_metadata_dirty(cfg="cfail2")]
+#[rustc_metadata_clean(cfg="cfail3")]
+pub fn bitwise_and_to_rshift(a: i32) -> i32 {
+ a >> 1
+}
+
+
+
+// Change operator from == to != -----------------------------------------------
+#[cfg(cfail1)]
+pub fn eq_to_uneq(a: i32) -> bool {
+ a == 1
+}
+
+#[cfg(not(cfail1))]
+#[rustc_dirty(label="Hir", cfg="cfail2")]
+#[rustc_clean(label="Hir", cfg="cfail3")]
+#[rustc_metadata_dirty(cfg="cfail2")]
+#[rustc_metadata_clean(cfg="cfail3")]
+pub fn eq_to_uneq(a: i32) -> bool {
+ a != 1
+}
+
+
+
+// Change operator from == to < ------------------------------------------------
+#[cfg(cfail1)]
+pub fn eq_to_lt(a: i32) -> bool {
+ a == 1
+}
+
+#[cfg(not(cfail1))]
+#[rustc_dirty(label="Hir", cfg="cfail2")]
+#[rustc_clean(label="Hir", cfg="cfail3")]
+#[rustc_metadata_dirty(cfg="cfail2")]
+#[rustc_metadata_clean(cfg="cfail3")]
+pub fn eq_to_lt(a: i32) -> bool {
+ a < 1
+}
+
+
+
+// Change operator from == to > ------------------------------------------------
+#[cfg(cfail1)]
+pub fn eq_to_gt(a: i32) -> bool {
+ a == 1
+}
+
+#[cfg(not(cfail1))]
+#[rustc_dirty(label="Hir", cfg="cfail2")]
+#[rustc_clean(label="Hir", cfg="cfail3")]
+#[rustc_metadata_dirty(cfg="cfail2")]
+#[rustc_metadata_clean(cfg="cfail3")]
+pub fn eq_to_gt(a: i32) -> bool {
+ a > 1
+}
+
+
+
+// Change operator from == to <= -----------------------------------------------
+#[cfg(cfail1)]
+pub fn eq_to_le(a: i32) -> bool {
+ a == 1
+}
+
+#[cfg(not(cfail1))]
+#[rustc_dirty(label="Hir", cfg="cfail2")]
+#[rustc_clean(label="Hir", cfg="cfail3")]
+#[rustc_metadata_dirty(cfg="cfail2")]
+#[rustc_metadata_clean(cfg="cfail3")]
+pub fn eq_to_le(a: i32) -> bool {
+ a <= 1
+}
+
+
+
+// Change operator from == to >= -----------------------------------------------
+#[cfg(cfail1)]
+pub fn eq_to_ge(a: i32) -> bool {
+ a == 1
+}
+
+#[cfg(not(cfail1))]
+#[rustc_dirty(label="Hir", cfg="cfail2")]
+#[rustc_clean(label="Hir", cfg="cfail3")]
+#[rustc_metadata_dirty(cfg="cfail2")]
+#[rustc_metadata_clean(cfg="cfail3")]
+pub fn eq_to_ge(a: i32) -> bool {
+ a >= 1
+}
+
+
+
+// Change type in cast expression ----------------------------------------------
+#[cfg(cfail1)]
+pub fn type_cast(a: u8) -> u64 {
+ let b = a as i32;
+ let c = b as u64;
+ c
+}
+
+#[cfg(not(cfail1))]
+#[rustc_dirty(label="Hir", cfg="cfail2")]
+#[rustc_clean(label="Hir", cfg="cfail3")]
+#[rustc_metadata_dirty(cfg="cfail2")]
+#[rustc_metadata_clean(cfg="cfail3")]
+pub fn type_cast(a: u8) -> u64 {
+ let b = a as u32;
+ let c = b as u64;
+ c
+}
+
+
+
+// Change value in cast expression ---------------------------------------------
+#[cfg(cfail1)]
+pub fn value_cast(a: u32) -> i32 {
+ 1 as i32
+}
+
+#[cfg(not(cfail1))]
+#[rustc_dirty(label="Hir", cfg="cfail2")]
+#[rustc_clean(label="Hir", cfg="cfail3")]
+#[rustc_metadata_dirty(cfg="cfail2")]
+#[rustc_metadata_clean(cfg="cfail3")]
+pub fn value_cast(a: u32) -> i32 {
+ 2 as i32
+}
+
+
+
+// Change l-value in assignment ------------------------------------------------
+#[cfg(cfail1)]
+pub fn lvalue() -> i32 {
+ let mut x = 10;
+ let mut y = 11;
+ x = 9;
+ x
+}
+
+#[cfg(not(cfail1))]
+#[rustc_dirty(label="Hir", cfg="cfail2")]
+#[rustc_clean(label="Hir", cfg="cfail3")]
+#[rustc_metadata_dirty(cfg="cfail2")]
+#[rustc_metadata_clean(cfg="cfail3")]
+pub fn lvalue() -> i32 {
+ let mut x = 10;
+ let mut y = 11;
+ y = 9;
+ x
+}
+
+
+
+// Change r-value in assignment ------------------------------------------------
+#[cfg(cfail1)]
+pub fn rvalue() -> i32 {
+ let mut x = 10;
+ x = 9;
+ x
+}
+
+#[cfg(not(cfail1))]
+#[rustc_dirty(label="Hir", cfg="cfail2")]
+#[rustc_clean(label="Hir", cfg="cfail3")]
+#[rustc_metadata_dirty(cfg="cfail2")]
+#[rustc_metadata_clean(cfg="cfail3")]
+pub fn rvalue() -> i32 {
+ let mut x = 10;
+ x = 8;
+ x
+}
+
+
+
+// Change index into slice -----------------------------------------------------
+#[cfg(cfail1)]
+pub fn index_to_slice(s: &[u8], i: usize, j: usize) -> u8 {
+ s[i]
+}
+
+#[cfg(not(cfail1))]
+#[rustc_dirty(label="Hir", cfg="cfail2")]
+#[rustc_clean(label="Hir", cfg="cfail3")]
+#[rustc_metadata_dirty(cfg="cfail2")]
+#[rustc_metadata_clean(cfg="cfail3")]
+pub fn index_to_slice(s: &[u8], i: usize, j: usize) -> u8 {
+ s[j]
+}
--- /dev/null
+-include ../tools.mk
+
+all:
+ $(RUSTC) m1.rs -C prefer-dynamic
+ $(RUSTC) m2.rs 2>&1 | grep "error\[E0046\]: not all trait items implemented, missing: .*"
+ $(RUSTC) m2.rs 2>&1 | grep " --> m2.rs:18:1"
+ $(RUSTC) m2.rs 2>&1 | grep " | ^ missing .CONSTANT., .Type., .method. in implementation"
+ $(RUSTC) m2.rs 2>&1 | grep " = note: .CONSTANT. from trait: .const CONSTANT: u32;."
+ $(RUSTC) m2.rs 2>&1 | grep " = note: .Type. from trait: .type Type;."
+ $(RUSTC) m2.rs 2>&1 | grep " = note: .method. from trait: .fn(&Self, std::string::String) -> <Self as m1::X>::Type."
--- /dev/null
+// Copyright 2014 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+#![feature(associated_consts)]
+#![crate_type = "dylib"]
+pub trait X {
+ const CONSTANT: u32;
+ type Type;
+ fn method(&self, s: String) -> Self::Type;
+}
--- /dev/null
+// Copyright 2014 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+#![feature(associated_consts)]
+#![crate_type = "dylib"]
+extern crate m1;
+
+struct X {
+}
+
+impl m1::X for X {
+}
#![ crate_name = "test" ]
#![feature(box_syntax)]
-#![feature(dotdot_in_tuple_patterns)]
#![feature(rustc_private)]
extern crate graphviz;
#![feature(plugin_registrar)]
#![feature(box_syntax)]
-#![feature(dotdot_in_tuple_patterns)]
#![feature(rustc_private)]
extern crate syntax;
// force-host
-#![feature(dotdot_in_tuple_patterns)]
#![feature(plugin_registrar, quote, rustc_private)]
extern crate syntax;
// `#[derive(Trait)]` works for empty structs/variants with braces or parens.
-#![feature(relaxed_adts)]
#![feature(rustc_private)]
extern crate serialize as rustc_serialize;
// option. This file may not be copied, modified, or distributed
// except according to those terms.
-#![feature(relaxed_adts)]
-
pub struct XEmpty1 {}
pub struct XEmpty2;
pub struct XEmpty7();
// aux-build:empty-struct.rs
-#![feature(relaxed_adts)]
-
extern crate empty_struct;
use empty_struct::*;
--- /dev/null
+// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+// Regression test for #18060: match arms were matching in the wrong order.
+
+fn main() {
+ assert_eq!(2, match (1, 3) { (0, 2...5) => 1, (1, 3) => 2, (_, 2...5) => 3, (_, _) => 4 });
+ assert_eq!(2, match (1, 3) { (1, 3) => 2, (_, 2...5) => 3, (_, _) => 4 });
+ assert_eq!(2, match (1, 7) { (0, 2...5) => 1, (1, 7) => 2, (_, 2...5) => 3, (_, _) => 4 });
+}
--- /dev/null
+// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+// compile-flags: -C debug_assertions=yes
+
+use std::panic;
+
+fn main() {
+ let r = panic::catch_unwind(|| {
+ let mut it = u8::max_value()..;
+ it.next().unwrap(); // 255
+ it.next().unwrap();
+ });
+ assert!(r.is_err());
+
+ let r = panic::catch_unwind(|| {
+ let mut it = i8::max_value()..;
+ it.next().unwrap(); // 127
+ it.next().unwrap();
+ });
+ assert!(r.is_err());
+}
--- /dev/null
+// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+// compile-flags: -C debug_assertions=no
+
+fn main() {
+ let mut it = u8::max_value()..;
+ assert_eq!(it.next().unwrap(), 255);
+ assert_eq!(it.next().unwrap(), u8::min_value());
+
+ let mut it = i8::max_value()..;
+ assert_eq!(it.next().unwrap(), 127);
+ assert_eq!(it.next().unwrap(), i8::min_value());
+}
// option. This file may not be copied, modified, or distributed
// except according to those terms.
-#![feature(dotdot_in_tuple_patterns)]
-
fn tuple() {
let x = (1, 2, 3);
match x {
// option. This file may not be copied, modified, or distributed
// except according to those terms.
-#![feature(dotdot_in_tuple_patterns)]
-
fn tuple() {
let x = (1,);
match x {
// option. This file may not be copied, modified, or distributed
// except according to those terms.
-#![feature(dotdot_in_tuple_patterns)]
-
fn tuple() {
let x = (1, 2, 3);
let branch = match x {
// option. This file may not be copied, modified, or distributed
// except according to those terms.
-#![feature(dotdot_in_tuple_patterns)]
-
fn tuple() {
let x = (1, 2, 3);
match x {
// option. This file may not be copied, modified, or distributed
// except according to those terms.
-#![feature(dotdot_in_tuple_patterns)]
-
fn tuple() {
struct S;
struct Z;
// option. This file may not be copied, modified, or distributed
// except according to those terms.
-#![feature(dotdot_in_tuple_patterns)]
-
fn tuple() {
let x = (1, 2, 3, 4, 5);
match x {
--> $DIR/two_files.rs:15:6
|
15 | impl Bar for Baz { }
- | ^^^ not a trait
+ | ^^^ expected trait, found type alias
|
= note: type aliases cannot be used for traits
--- /dev/null
+// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+trait Foo {
+ fn foo();
+ //~^ NOTE `foo` from trait
+}
+
+struct Bar;
+
+impl Foo for Bar {}
+//~^ ERROR E0046
+//~| NOTE missing `foo` in implementation
+
+fn main() {
+}
--- /dev/null
+error[E0046]: not all trait items implemented, missing: `foo`
+ --> $DIR/E0046.rs:18:1
+ |
+12 | fn foo();
+ | --------- `foo` from trait
+...
+18 | impl Foo for Bar {}
+ | ^^^^^^^^^^^^^^^^^^^ missing `foo` in implementation
+
+error: aborting due to previous error
+
--- /dev/null
+// Copyright 2015 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+#![feature(associated_consts)]
+
+use std::fmt::Debug;
+
+trait Foo {
+ fn bar(&self);
+ const MY_CONST: u32;
+}
+
+pub struct FooConstForMethod;
+
+impl Foo for FooConstForMethod {
+ //~^ ERROR E0046
+ //~| NOTE missing `bar` in implementation
+ const bar: u64 = 1;
+ //~^ ERROR E0323
+ //~| NOTE does not match trait
+ const MY_CONST: u32 = 1;
+}
+
+pub struct FooMethodForConst;
+
+impl Foo for FooMethodForConst {
+ //~^ ERROR E0046
+ //~| NOTE missing `MY_CONST` in implementation
+ fn bar(&self) {}
+ fn MY_CONST() {}
+ //~^ ERROR E0324
+ //~| NOTE does not match trait
+}
+
+pub struct FooTypeForMethod;
+
+impl Foo for FooTypeForMethod {
+ //~^ ERROR E0046
+ //~| NOTE missing `bar` in implementation
+ type bar = u64;
+ //~^ ERROR E0325
+ //~| NOTE does not match trait
+ const MY_CONST: u32 = 1;
+}
+
+impl Debug for FooTypeForMethod {
+}
+
+fn main () {}
--- /dev/null
+error[E0323]: item `bar` is an associated const, which doesn't match its trait `<FooConstForMethod as Foo>`
+ --> $DIR/impl-wrong-item-for-trait.rs:25:5
+ |
+16 | fn bar(&self);
+ | -------------- item in trait
+...
+25 | const bar: u64 = 1;
+ | ^^^^^^^^^^^^^^^^^^^ does not match trait
+
+error[E0046]: not all trait items implemented, missing: `bar`
+ --> $DIR/impl-wrong-item-for-trait.rs:22:1
+ |
+16 | fn bar(&self);
+ | -------------- `bar` from trait
+...
+22 | impl Foo for FooConstForMethod {
+ | ^ missing `bar` in implementation
+
+error[E0324]: item `MY_CONST` is an associated method, which doesn't match its trait `<FooMethodForConst as Foo>`
+ --> $DIR/impl-wrong-item-for-trait.rs:37:5
+ |
+17 | const MY_CONST: u32;
+ | -------------------- item in trait
+...
+37 | fn MY_CONST() {}
+ | ^^^^^^^^^^^^^^^^ does not match trait
+
+error[E0046]: not all trait items implemented, missing: `MY_CONST`
+ --> $DIR/impl-wrong-item-for-trait.rs:33:1
+ |
+17 | const MY_CONST: u32;
+ | -------------------- `MY_CONST` from trait
+...
+33 | impl Foo for FooMethodForConst {
+ | ^ missing `MY_CONST` in implementation
+
+error[E0325]: item `bar` is an associated type, which doesn't match its trait `<FooTypeForMethod as Foo>`
+ --> $DIR/impl-wrong-item-for-trait.rs:47:5
+ |
+16 | fn bar(&self);
+ | -------------- item in trait
+...
+47 | type bar = u64;
+ | ^^^^^^^^^^^^^^^ does not match trait
+
+error[E0046]: not all trait items implemented, missing: `bar`
+ --> $DIR/impl-wrong-item-for-trait.rs:44:1
+ |
+16 | fn bar(&self);
+ | -------------- `bar` from trait
+...
+44 | impl Foo for FooTypeForMethod {
+ | ^ missing `bar` in implementation
+
+error[E0046]: not all trait items implemented, missing: `fmt`
+ --> $DIR/impl-wrong-item-for-trait.rs:53:1
+ |
+53 | impl Debug for FooTypeForMethod {
+ | ^ missing `fmt` in implementation
+ |
+ = note: `fmt` from trait: `fn(&Self, &mut std::fmt::Formatter<'_>) -> std::result::Result<(), std::fmt::Error>`
+
+error: aborting due to 7 previous errors
+
--- /dev/null
+// Copyright 2015 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+// Regression test for #23729
+
+fn main() {
+ let fib = {
+ struct Recurrence {
+ mem: [u64; 2],
+ pos: usize,
+ }
+
+ impl Iterator for Recurrence {
+ //~^ ERROR E0046
+ //~| NOTE missing `Item` in implementation
+ //~| NOTE `Item` from trait: `type Item;`
+ #[inline]
+ fn next(&mut self) -> Option<u64> {
+ if self.pos < 2 {
+ let next_val = self.mem[self.pos];
+ self.pos += 1;
+ Some(next_val)
+ } else {
+ let next_val = self.mem[0] + self.mem[1];
+ self.mem[0] = self.mem[1];
+ self.mem[1] = next_val;
+ Some(next_val)
+ }
+ }
+ }
+
+ Recurrence { mem: [0, 1], pos: 0 }
+ };
+
+ for e in fib.take(10) {
+ println!("{}", e)
+ }
+}
--- /dev/null
+error[E0046]: not all trait items implemented, missing: `Item`
+ --> $DIR/issue-23729.rs:20:9
+ |
+20 | impl Iterator for Recurrence {
+ | ^ missing `Item` in implementation
+ |
+ = note: `Item` from trait: `type Item;`
+
+error: aborting due to previous error
+
--- /dev/null
+// Copyright 2015 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+// Regression test for #23827
+
+#![feature(core, unboxed_closures)]
+
+pub struct Prototype {
+ pub target: u32
+}
+
+trait Component {
+ fn apply(self, e: u32);
+}
+
+impl<C: Component> Fn<(C,)> for Prototype {
+ extern "rust-call" fn call(&self, (comp,): (C,)) -> Prototype {
+ comp.apply(self.target);
+ *self
+ }
+}
+
+impl<C: Component> FnMut<(C,)> for Prototype {
+ extern "rust-call" fn call_mut(&mut self, (comp,): (C,)) -> Prototype {
+ Fn::call(*&self, (comp,))
+ }
+}
+
+impl<C: Component> FnOnce<(C,)> for Prototype {
+ //~^ ERROR E0046
+ //~| NOTE missing `Output` in implementation
+ //~| NOTE `Output` from trait: `type Output;`
+ extern "rust-call" fn call_once(self, (comp,): (C,)) -> Prototype {
+ Fn::call(&self, (comp,))
+ }
+}
+
+fn main() {}
--- /dev/null
+error[E0046]: not all trait items implemented, missing: `Output`
+ --> $DIR/issue-23827.rs:36:1
+ |
+36 | impl<C: Component> FnOnce<(C,)> for Prototype {
+ | ^ missing `Output` in implementation
+ |
+ = note: `Output` from trait: `type Output;`
+
+error: aborting due to previous error
+
--- /dev/null
+// Copyright 2015 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+// Regression test for #24356
+
+// ignore-tidy-linelength
+
+fn main() {
+ {
+ use std::ops::Deref;
+
+ struct Thing(i8);
+
+ /*
+ // Correct impl
+ impl Deref for Thing {
+ type Target = i8;
+ fn deref(&self) -> &i8 { &self.0 }
+ }
+ */
+
+ // Causes ICE
+ impl Deref for Thing {
+ //~^ ERROR E0046
+ //~| NOTE missing `Target` in implementation
+ //~| NOTE `Target` from trait: `type Target;`
+ fn deref(&self) -> i8 { self.0 }
+ }
+
+ let thing = Thing(72);
+
+ *thing
+ };
+}
--- /dev/null
+error[E0046]: not all trait items implemented, missing: `Target`
+ --> $DIR/issue-24356.rs:30:9
+ |
+30 | impl Deref for Thing {
+ | ^ missing `Target` in implementation
+ |
+ = note: `Target` from trait: `type Target;`
+
+error: aborting due to previous error
+
--- /dev/null
+// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+struct Foo<T: Clone>(T);
+
+use std::ops::Add;
+
+impl<T: Clone, Add> Add for Foo<T> {
+ type Output = usize;
+
+ fn add(self, rhs: Self) -> Self::Output {
+ unimplemented!();
+ }
+}
--- /dev/null
+error[E0404]: `Add` is not a trait
+ --> $DIR/issue-35987.rs:15:21
+ |
+15 | impl<T: Clone, Add> Add for Foo<T> {
+ | --- ^^^ expected trait, found type parameter
+ | |
+ | type parameter defined here
+
+error: main function not found
+
+error: cannot continue compilation due to previous error
+