# The Rust Programming Language
-This is the main source code repository for [Rust]. It contains the compiler, standard library,
-and documentation.
+This is the main source code repository for [Rust]. It contains the compiler,
+standard library, and documentation.
[Rust]: https://www.rust-lang.org
#### MinGW
-[MSYS2](http://msys2.github.io/) can be used to easily build Rust on Windows:
+[MSYS2][msys2] can be used to easily build Rust on Windows:
-1. Grab the latest MSYS2 installer and go through the installer.
+[msys2]: https://msys2.github.io/
-2. From the MSYS2 terminal, install the `mingw64` toolchain and other required
- tools.
+1. Grab the latest [MSYS2 installer][msys2] and go through the installer.
- ```sh
- # Update package mirrors (may be needed if you have a fresh install of MSYS2)
- $ pacman -Sy pacman-mirrors
- ```
+2. Run `mingw32_shell.bat` or `mingw64_shell.bat` from wherever you installed
+ MSYS2 (e.g. `C:\msys64`), depending on whether you want 32-bit or 64-bit
+ Rust. (As of recent versions of MSYS2, run `msys2_shell.cmd -mingw32` or
+ `msys2_shell.cmd -mingw64` from the command line instead.)
- Download [MinGW from
- here](http://mingw-w64.org/doku.php/download/mingw-builds), and choose the
- `version=4.9.x,threads=win32,exceptions=dwarf/seh` flavor when installing. Also, make sure to install to a path without spaces in it. After installing,
- add its `bin` directory to your `PATH`. This is due to [#28260](https://github.com/rust-lang/rust/issues/28260), in the future,
- installing from pacman should be just fine.
+3. From this terminal, install the required tools:
```sh
- # Make git available in MSYS2 (if not already available on path)
- $ pacman -S git
+ # Update package mirrors (may be needed if you have a fresh install of MSYS2)
+ $ pacman -Sy pacman-mirrors
- $ pacman -S base-devel
+ # Install build tools needed for Rust. If you're building a 32-bit compiler,
+ # then replace "x86_64" below with "i686". If you've already got git, python,
+ # or CMake installed and in PATH you can remove them from this list. Note
+ # that it is important that the `python2` and `cmake` packages from the
+ # MSYS subsystem **not** be used; the build has historically been known
+ # to fail with those packages.
+ $ pacman -S git \
+ make \
+ diffutils \
+ mingw-w64-x86_64-python2 \
+ mingw-w64-x86_64-cmake \
+ mingw-w64-x86_64-gcc
```
-3. Run `mingw32_shell.bat` or `mingw64_shell.bat` from wherever you installed
- MSYS2 (i.e. `C:\msys`), depending on whether you want 32-bit or 64-bit Rust.
- (As of the latest version of MSYS2 you have to run `msys2_shell.cmd -mingw32`
- or `msys2_shell.cmd -mingw64` from the command line instead)
-
-4. Navigate to Rust's source code, configure and build it:
+4. Navigate to Rust's source code (or clone it), then configure and build it:
```sh
$ ./configure
$ make && make install
```
+#### MSVC with rustbuild
+
+If you don't want the hassle of MSYS or MinGW, you can invoke rustbuild
+directly. All you need are Python 2, CMake, and Git in your PATH (make sure you
+do __not__ use the ones from MSYS!). You'll also need Visual Studio 2013 or
+newer with the C++ tools. Then all you need to do is invoke the appropriate
+vcvars bat file and kick off rustbuild.
+
+```bat
+CALL "C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\bin\amd64\vcvars64.bat"
+python .\src\bootstrap\bootstrap.py
+```
+
## Building Documentation
If you’d like to build the documentation, it’s almost the same:
```sh
-./configure
+$ ./configure
$ make docs
```
You may find that other platforms work, but these are our officially
supported build environments that are most likely to work.
-Rust currently needs between 600MiB and 1.5GiB to build, depending on platform. If it hits
-swap, it will take a very long time to build.
+Rust currently needs between 600MiB and 1.5GiB to build, depending on platform.
+If it hits swap, it will take a very long time to build.
There is more advice about hacking on Rust in [CONTRIBUTING.md].
and the Apache License (Version 2.0), with portions covered by various
BSD-like licenses.
-See [LICENSE-APACHE](LICENSE-APACHE), [LICENSE-MIT](LICENSE-MIT), and [COPYRIGHT](COPYRIGHT) for details.
+See [LICENSE-APACHE](LICENSE-APACHE), [LICENSE-MIT](LICENSE-MIT), and
+[COPYRIGHT](COPYRIGHT) for details.
+Version 1.10.0 (2016-07-07)
+===========================
+
+Language
+--------
+
+* [Allow `concat_idents!` in type positions as well as in expression
+ positions]
+ (https://github.com/rust-lang/rust/pull/33735).
+* [`Copy` types are required to have a trivial implementation of `Clone`]
+ (https://github.com/rust-lang/rust/pull/33420).
+ [RFC 1521](https://github.com/rust-lang/rfcs/blob/master/text/1521-copy-clone-semantics.md).
+* [Single-variant enums support the `#[repr(..)]` attribute]
+ (https://github.com/rust-lang/rust/pull/33355).
+* [Fix `#[derive(RustcEncodable)]` in the presence of other `encode` methods]
+ (https://github.com/rust-lang/rust/pull/32908).
+* [`panic!` can be converted to a runtime abort with the
+ `-C panic=abort` flag]
+ (https://github.com/rust-lang/rust/pull/32900).
+ [RFC 1513](https://github.com/rust-lang/rfcs/blob/master/text/1513-less-unwinding.md).
+* [Add a new crate type, 'cdylib']
+ (https://github.com/rust-lang/rust/pull/33553).
+ cdylibs are dynamic libraries suitable for loading by non-Rust hosts.
+ [RFC 1510](https://github.com/rust-lang/rfcs/blob/master/text/1510-rdylib.md).
+ Note that Cargo does not yet directly support cdylibs.
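As a quick illustration of the single-variant `#[repr(..)]` change above, here is a minimal sketch (the enum and function names are hypothetical, chosen only for the example):

```rust
// Single-variant enums may now carry a #[repr(..)] attribute,
// pinning down the discriminant's representation.
#[repr(u8)]
enum Flag {
    On = 1,
}

// With #[repr(u8)], the cast to u8 yields the declared discriminant.
fn flag_byte(f: Flag) -> u8 {
    f as u8
}

fn main() {
    assert_eq!(flag_byte(Flag::On), 1);
}
```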
+
+Stabilized APIs
+---------------
+
+* `os::windows::fs::OpenOptionsExt::access_mode`
+* `os::windows::fs::OpenOptionsExt::share_mode`
+* `os::windows::fs::OpenOptionsExt::custom_flags`
+* `os::windows::fs::OpenOptionsExt::attributes`
+* `os::windows::fs::OpenOptionsExt::security_qos_flags`
+* `os::unix::fs::OpenOptionsExt::custom_flags`
+* [`sync::Weak::new`]
+ (http://doc.rust-lang.org/alloc/arc/struct.Weak.html#method.new)
+* `Default for sync::Weak`
+* [`panic::set_hook`]
+ (http://doc.rust-lang.org/std/panic/fn.set_hook.html)
+* [`panic::take_hook`]
+ (http://doc.rust-lang.org/std/panic/fn.take_hook.html)
+* [`panic::PanicInfo`]
+ (http://doc.rust-lang.org/std/panic/struct.PanicInfo.html)
+* [`panic::PanicInfo::payload`]
+ (http://doc.rust-lang.org/std/panic/struct.PanicInfo.html#method.payload)
+* [`panic::PanicInfo::location`]
+ (http://doc.rust-lang.org/std/panic/struct.PanicInfo.html#method.location)
+* [`panic::Location`]
+ (http://doc.rust-lang.org/std/panic/struct.Location.html)
+* [`panic::Location::file`]
+ (http://doc.rust-lang.org/std/panic/struct.Location.html#method.file)
+* [`panic::Location::line`]
+ (http://doc.rust-lang.org/std/panic/struct.Location.html#method.line)
+* [`ffi::CStr::from_bytes_with_nul`]
+ (http://doc.rust-lang.org/std/ffi/struct.CStr.html#method.from_bytes_with_nul)
+* [`ffi::CStr::from_bytes_with_nul_unchecked`]
+ (http://doc.rust-lang.org/std/ffi/struct.CStr.html#method.from_bytes_with_nul_unchecked)
+* [`ffi::FromBytesWithNulError`]
+ (http://doc.rust-lang.org/std/ffi/struct.FromBytesWithNulError.html)
+* [`fs::Metadata::modified`]
+ (http://doc.rust-lang.org/std/fs/struct.Metadata.html#method.modified)
+* [`fs::Metadata::accessed`]
+ (http://doc.rust-lang.org/std/fs/struct.Metadata.html#method.accessed)
+* [`fs::Metadata::created`]
+ (http://doc.rust-lang.org/std/fs/struct.Metadata.html#method.created)
+* `sync::atomic::Atomic{Usize,Isize,Bool,Ptr}::compare_exchange`
+* `sync::atomic::Atomic{Usize,Isize,Bool,Ptr}::compare_exchange_weak`
+* `collections::{btree,hash}_map::{Occupied,Vacant,}Entry::key`
+* `os::unix::net::{UnixStream, UnixListener, UnixDatagram, SocketAddr}`
+* [`SocketAddr::is_unnamed`]
+ (http://doc.rust-lang.org/std/os/unix/net/struct.SocketAddr.html#method.is_unnamed)
+* [`SocketAddr::as_pathname`]
+ (http://doc.rust-lang.org/std/os/unix/net/struct.SocketAddr.html#method.as_pathname)
+* [`UnixStream::connect`]
+ (http://doc.rust-lang.org/std/os/unix/net/struct.UnixStream.html#method.connect)
+* [`UnixStream::pair`]
+ (http://doc.rust-lang.org/std/os/unix/net/struct.UnixStream.html#method.pair)
+* [`UnixStream::try_clone`]
+ (http://doc.rust-lang.org/std/os/unix/net/struct.UnixStream.html#method.try_clone)
+* [`UnixStream::local_addr`]
+ (http://doc.rust-lang.org/std/os/unix/net/struct.UnixStream.html#method.local_addr)
+* [`UnixStream::peer_addr`]
+ (http://doc.rust-lang.org/std/os/unix/net/struct.UnixStream.html#method.peer_addr)
+* [`UnixStream::set_read_timeout`]
+ (http://doc.rust-lang.org/std/os/unix/net/struct.UnixStream.html#method.read_timeout)
+* [`UnixStream::set_write_timeout`]
+ (http://doc.rust-lang.org/std/os/unix/net/struct.UnixStream.html#method.write_timeout)
+* [`UnixStream::read_timeout`]
+ (http://doc.rust-lang.org/std/os/unix/net/struct.UnixStream.html#method.read_timeout)
+* [`UnixStream::write_timeout`]
+ (http://doc.rust-lang.org/std/os/unix/net/struct.UnixStream.html#method.write_timeout)
+* [`UnixStream::set_nonblocking`]
+ (http://doc.rust-lang.org/std/os/unix/net/struct.UnixStream.html#method.set_nonblocking)
+* [`UnixStream::take_error`]
+ (http://doc.rust-lang.org/std/os/unix/net/struct.UnixStream.html#method.take_error)
+* [`UnixStream::shutdown`]
+ (http://doc.rust-lang.org/std/os/unix/net/struct.UnixStream.html#method.shutdown)
+* Read/Write/RawFd impls for `UnixStream`
+* [`UnixListener::bind`]
+ (http://doc.rust-lang.org/std/os/unix/net/struct.UnixListener.html#method.bind)
+* [`UnixListener::accept`]
+ (http://doc.rust-lang.org/std/os/unix/net/struct.UnixListener.html#method.accept)
+* [`UnixListener::try_clone`]
+ (http://doc.rust-lang.org/std/os/unix/net/struct.UnixListener.html#method.try_clone)
+* [`UnixListener::local_addr`]
+ (http://doc.rust-lang.org/std/os/unix/net/struct.UnixListener.html#method.local_addr)
+* [`UnixListener::set_nonblocking`]
+ (http://doc.rust-lang.org/std/os/unix/net/struct.UnixListener.html#method.set_nonblocking)
+* [`UnixListener::take_error`]
+ (http://doc.rust-lang.org/std/os/unix/net/struct.UnixListener.html#method.take_error)
+* [`UnixListener::incoming`]
+ (http://doc.rust-lang.org/std/os/unix/net/struct.UnixListener.html#method.incoming)
+* RawFd impls for `UnixListener`
+* [`UnixDatagram::bind`]
+ (http://doc.rust-lang.org/std/os/unix/net/struct.UnixDatagram.html#method.bind)
+* [`UnixDatagram::unbound`]
+ (http://doc.rust-lang.org/std/os/unix/net/struct.UnixDatagram.html#method.unbound)
+* [`UnixDatagram::pair`]
+ (http://doc.rust-lang.org/std/os/unix/net/struct.UnixDatagram.html#method.pair)
+* [`UnixDatagram::connect`]
+ (http://doc.rust-lang.org/std/os/unix/net/struct.UnixDatagram.html#method.connect)
+* [`UnixDatagram::try_clone`]
+ (http://doc.rust-lang.org/std/os/unix/net/struct.UnixDatagram.html#method.try_clone)
+* [`UnixDatagram::local_addr`]
+ (http://doc.rust-lang.org/std/os/unix/net/struct.UnixDatagram.html#method.local_addr)
+* [`UnixDatagram::peer_addr`]
+ (http://doc.rust-lang.org/std/os/unix/net/struct.UnixDatagram.html#method.peer_addr)
+* [`UnixDatagram::recv_from`]
+ (http://doc.rust-lang.org/std/os/unix/net/struct.UnixDatagram.html#method.recv_from)
+* [`UnixDatagram::recv`]
+ (http://doc.rust-lang.org/std/os/unix/net/struct.UnixDatagram.html#method.recv)
+* [`UnixDatagram::send_to`]
+ (http://doc.rust-lang.org/std/os/unix/net/struct.UnixDatagram.html#method.send_to)
+* [`UnixDatagram::send`]
+ (http://doc.rust-lang.org/std/os/unix/net/struct.UnixDatagram.html#method.send)
+* [`UnixDatagram::set_read_timeout`]
+ (http://doc.rust-lang.org/std/os/unix/net/struct.UnixDatagram.html#method.set_read_timeout)
+* [`UnixDatagram::set_write_timeout`]
+ (http://doc.rust-lang.org/std/os/unix/net/struct.UnixDatagram.html#method.set_write_timeout)
+* [`UnixDatagram::read_timeout`]
+ (http://doc.rust-lang.org/std/os/unix/net/struct.UnixDatagram.html#method.read_timeout)
+* [`UnixDatagram::write_timeout`]
+ (http://doc.rust-lang.org/std/os/unix/net/struct.UnixDatagram.html#method.write_timeout)
+* [`UnixDatagram::set_nonblocking`]
+ (http://doc.rust-lang.org/std/os/unix/net/struct.UnixDatagram.html#method.set_nonblocking)
+* [`UnixDatagram::take_error`]
+ (http://doc.rust-lang.org/std/os/unix/net/struct.UnixDatagram.html#method.take_error)
+* [`UnixDatagram::shutdown`]
+ (http://doc.rust-lang.org/std/os/unix/net/struct.UnixDatagram.html#method.shutdown)
+* RawFd impls for `UnixDatagram`
+* `{BTree,Hash}Map::values_mut`
+* [`<[_]>::binary_search_by_key`]
+ (http://doc.rust-lang.org/beta/std/primitive.slice.html#method.binary_search_by_key)
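A short sketch exercising two of the APIs stabilized above, `CStr::from_bytes_with_nul` and `<[_]>::binary_search_by_key` (the `c_text` helper is hypothetical, introduced only for the example):

```rust
use std::ffi::CStr;

// Validate a NUL-terminated byte string and return its text portion,
// or None if the terminator is missing or the bytes are not UTF-8.
fn c_text(bytes: &[u8]) -> Option<&str> {
    CStr::from_bytes_with_nul(bytes).ok()?.to_str().ok()
}

fn main() {
    assert_eq!(c_text(b"hello\0"), Some("hello"));
    assert_eq!(c_text(b"no nul"), None);

    // binary_search_by_key searches a slice sorted by a derived key.
    let pairs = [(0, "a"), (2, "b"), (4, "c")];
    assert_eq!(pairs.binary_search_by_key(&2, |&(k, _)| k), Ok(1));
}
```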
+
+Libraries
+---------
+
+* [The `abs_sub` method of floats is deprecated]
+ (https://github.com/rust-lang/rust/pull/33664).
+ The semantics of this minor method are subtle and probably not what
+ most people want.
+* [Add implementation of `Ord` for `Cell<T>` and `RefCell<T>` where `T: Ord`]
+ (https://github.com/rust-lang/rust/pull/33306).
+* [On Linux, if `HashMap`s can't be initialized with `getrandom` they
+ will fall back to `/dev/urandom` temporarily to avoid blocking
+ during early boot]
+ (https://github.com/rust-lang/rust/pull/33086).
+* [Implemented negation for wrapping numerals]
+ (https://github.com/rust-lang/rust/pull/33067).
+* [Implement `Clone` for `binary_heap::IntoIter`]
+ (https://github.com/rust-lang/rust/pull/33050).
+* [Implement `Display` and `Hash` for `std::num::Wrapping`]
+ (https://github.com/rust-lang/rust/pull/33023).
+* [Add `Default` implementation for `&CStr`, `CString`, `Path`]
+ (https://github.com/rust-lang/rust/pull/32990).
+* [Implement `From<Vec<T>>` and `Into<Vec<T>>` for `VecDeque<T>`]
+ (https://github.com/rust-lang/rust/pull/32866).
+* [Implement `Default` for `UnsafeCell`, `fmt::Error`, `Condvar`,
+ `Mutex`, `RwLock`]
+ (https://github.com/rust-lang/rust/pull/32785).
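The new `From<Vec<T>>`/`Into<Vec<T>>` conversions for `VecDeque<T>` can be sketched as follows (the `roundtrip` helper is hypothetical, used only to make the example testable):

```rust
use std::collections::VecDeque;

// Convert a Vec into a VecDeque, mutate it, and convert back,
// preserving element order.
fn roundtrip(v: Vec<i32>) -> Vec<i32> {
    let mut dq: VecDeque<i32> = VecDeque::from(v);
    dq.push_front(0);
    dq.into()
}

fn main() {
    assert_eq!(roundtrip(vec![1, 2, 3]), [0, 1, 2, 3]);
}
```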
+
+Cargo
+-----
+* [Cargo.toml supports the `profile.*.panic` option]
+ (https://github.com/rust-lang/cargo/pull/2687).
+ This controls the runtime behavior of the `panic!` macro
+ and can be either "unwind" (the default), or "abort".
+ [RFC 1513](https://github.com/rust-lang/rfcs/blob/master/text/1513-less-unwinding.md).
+* [Don't throw away errors with `-p` arguments]
+ (https://github.com/rust-lang/cargo/pull/2723).
+* [Report status to stderr instead of stdout]
+ (https://github.com/rust-lang/cargo/pull/2693).
+* [Build scripts are passed a `CARGO_MANIFEST_LINKS` environment
+ variable that corresponds to the `links` field of the manifest]
+ (https://github.com/rust-lang/cargo/pull/2710).
+* [Ban keywords from crate names]
+ (https://github.com/rust-lang/cargo/pull/2707).
+* [Canonicalize `CARGO_HOME` on Windows]
+ (https://github.com/rust-lang/cargo/pull/2604).
+* [Retry network requests]
+ (https://github.com/rust-lang/cargo/pull/2396).
+ By default they are retried twice, which can be customized with the
+ `net.retry` value in `.cargo/config`.
+* [Don't print extra error info for failing subcommands]
+ (https://github.com/rust-lang/cargo/pull/2674).
+* [Add `--force` flag to `cargo install`]
+ (https://github.com/rust-lang/cargo/pull/2405).
+* [Don't use `flock` on NFS mounts]
+ (https://github.com/rust-lang/cargo/pull/2623).
+* [Prefer building `cargo install` artifacts in temporary directories]
+ (https://github.com/rust-lang/cargo/pull/2610).
+ Makes it possible to install multiple crates in parallel.
+* [Add `cargo test --doc`]
+ (https://github.com/rust-lang/cargo/pull/2578).
+* [Add `cargo --explain`]
+ (https://github.com/rust-lang/cargo/pull/2551).
+* [Don't print warnings when `-q` is passed]
+ (https://github.com/rust-lang/cargo/pull/2576).
+* [Add `cargo doc --lib` and `--bin`]
+ (https://github.com/rust-lang/cargo/pull/2577).
+* [Don't require build script output to be UTF-8]
+ (https://github.com/rust-lang/cargo/pull/2560).
+* [Correctly attempt multiple git usernames]
+ (https://github.com/rust-lang/cargo/pull/2584).
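The `profile.*.panic` option described above looks like this in a manifest (a minimal sketch; the profile chosen is only an example):

```toml
# Abort on panic in release builds instead of unwinding.
[profile.release]
panic = "abort"
```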
+
+Performance
+-----------
+
+* [rustc memory usage was reduced by refactoring the context used for
+ type checking]
+ (https://github.com/rust-lang/rust/pull/33425).
+* [Speed up creation of `HashMap`s by caching the random keys used
+ to initialize the hash state]
+ (https://github.com/rust-lang/rust/pull/33318).
+* [The `find` implementation for `Chain` iterators is 2x faster]
+ (https://github.com/rust-lang/rust/pull/33289).
+* [Trait selection optimizations speed up type checking by 15%]
+ (https://github.com/rust-lang/rust/pull/33138).
+* [Efficient trie lookup for boolean Unicode properties]
+ (https://github.com/rust-lang/rust/pull/33098).
+ 10x faster than the previous lookup tables.
+* [Special case `#[derive(Copy, Clone)]` to avoid bloat]
+ (https://github.com/rust-lang/rust/pull/31414).
+
+Usability
+---------
+
+* Many incremental improvements to documentation and rustdoc.
+* [rustdoc: List blanket trait impls]
+ (https://github.com/rust-lang/rust/pull/33514).
+* [rustdoc: Clean up ABI rendering]
+ (https://github.com/rust-lang/rust/pull/33151).
+* [Indexing with the wrong type produces a more informative error]
+ (https://github.com/rust-lang/rust/pull/33401).
+* [Improve diagnostics for constants being used in irrefutable patterns]
+ (https://github.com/rust-lang/rust/pull/33406).
+* [When many method candidates are in scope limit the suggestions to 10]
+ (https://github.com/rust-lang/rust/pull/33338).
+* [Remove confusing suggestion when calling a `fn` type]
+ (https://github.com/rust-lang/rust/pull/33325).
+* [Do not suggest changing `&mut self` to `&mut mut self`]
+ (https://github.com/rust-lang/rust/pull/33319).
+
+Misc
+----
+
+* [Update i686-linux-android features to match Android ABI]
+ (https://github.com/rust-lang/rust/pull/33651).
+* [Update aarch64-linux-android features to match Android ABI]
+ (https://github.com/rust-lang/rust/pull/33500).
+* [`std` no longer prints backtraces on platforms where the running
+ module must be loaded with `env::current_exe`, which can't be relied
+ on](https://github.com/rust-lang/rust/pull/33554).
+* This release includes std binaries for the i586-unknown-linux-gnu,
+ i686-unknown-linux-musl, and armv7-linux-androideabi targets. The
+ i586 target is for old x86 hardware without SSE2, and the armv7
+ target is for Android running on modern ARM architectures.
+* [The `rust-gdb` and `rust-lldb` scripts are distributed on all
+ Unix platforms](https://github.com/rust-lang/rust/pull/32835).
+* [On Unix the runtime aborts by calling `libc::abort` instead of
+ generating an illegal instruction]
+ (https://github.com/rust-lang/rust/pull/31457).
+* [Rust is now bootstrapped from the previous release of Rust,
+ instead of a snapshot from an arbitrary commit]
+ (https://github.com/rust-lang/rust/pull/32942).
+
+Compatibility Notes
+-------------------
+
+* [`AtomicBool` is now bool-sized, not word-sized]
+ (https://github.com/rust-lang/rust/pull/33579).
+* [`target_env` for Linux ARM targets is just `gnu`, not
+ `gnueabihf`, `gnueabi`, etc]
+ (https://github.com/rust-lang/rust/pull/33403).
+* [Consistently panic on overflow in `Duration::new`]
+ (https://github.com/rust-lang/rust/pull/33072).
+* [Change `String::truncate` to panic less]
+ (https://github.com/rust-lang/rust/pull/32977).
+* [Add `:block` to the follow set for `:ty` and `:path`]
+ (https://github.com/rust-lang/rust/pull/32945).
+ Affects how macros are parsed.
+* [Fix macro hygiene bug]
+ (https://github.com/rust-lang/rust/pull/32923).
+* [Feature-gated attributes on macro-generated macro invocations are
+ now rejected]
+ (https://github.com/rust-lang/rust/pull/32791).
+* [Suppress fallback and ambiguity errors during type inference]
+ (https://github.com/rust-lang/rust/pull/32258).
+ This caused some minor changes to type inference.
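The `AtomicBool` size change above can be checked directly (the `atomic_bool_size` helper is hypothetical, used only for the example):

```rust
use std::mem;
use std::sync::atomic::AtomicBool;

// AtomicBool is now one byte, matching bool, rather than word-sized.
fn atomic_bool_size() -> usize {
    mem::size_of::<AtomicBool>()
}

fn main() {
    assert_eq!(atomic_bool_size(), mem::size_of::<bool>());
    assert_eq!(atomic_bool_size(), 1);
}
```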
+
+
Version 1.9.0 (2016-05-26)
==========================
;;
*-msvc)
- # There are some MSYS python builds which will auto-translate
- # windows-style paths to MSYS-style paths in Python itself.
- # Unfortunately this breaks LLVM's build system as somewhere along
- # the line LLVM prints a path into a file from Python and then CMake
- # later tries to interpret that path. If Python prints a MSYS path
- # and CMake tries to use it as a Windows path, you're gonna have a
- # Bad Time.
- #
- # Consequently here we try to detect when that happens and print an
- # error if it does.
- if $CFG_PYTHON -c 'import sys; print sys.argv[1]' `pwd` | grep '^/' > /dev/null
- then
- err "
-
-python is silently translating windows paths to MSYS paths \
-and the build will fail if this python is used.
-
-Either an official python install must be used or an \
-alternative python package in MinGW must be used.
-
-If you are building under msys2 try installing the mingw-w64-x86_64-python2 \
-package instead of python2:
-
-$ pacman -R python2 && pacman -S mingw-w64-x86_64-python2
-"
- fi
-
# There are three builds of cmake on windows: MSVC, MinGW and Cygwin
# The Cygwin build does not have generators for Visual Studio, so
# detect that here and error.
esac
done
+if [ "$CFG_OSTYPE" = "pc-windows-gnu" ] || [ "$CFG_OSTYPE" = "pc-windows-msvc" ]
+then
+ # There are some MSYS python builds which will auto-translate
+ # windows-style paths to MSYS-style paths in Python itself.
+ # Unfortunately this breaks LLVM's build system as somewhere along
+ # the line LLVM prints a path into a file from Python and then CMake
+ # later tries to interpret that path. If Python prints a MSYS path
+ # and CMake tries to use it as a Windows path, you're gonna have a
+ # Bad Time.
+ #
+ # Consequently here we try to detect when that happens and print an
+ # error if it does.
+ if $CFG_PYTHON -c 'import sys; print sys.argv[1]' `pwd` | grep '^/' > /dev/null
+ then
+ err "
+
+python is silently translating windows paths to MSYS paths \
+and the build will fail if this python is used.
+
+Either an official python install must be used or an \
+alternative python package in MinGW must be used.
+
+If you are building under msys2 try installing the mingw-w64-x86_64-python2 \
+package instead of python2:
+
+$ pacman -S mingw-w64-x86_64-python2
+"
+ fi
+fi
+
if [ -n "$CFG_PERF" ]
then
HAVE_PERF_LOGFD=`$CFG_PERF stat --log-fd 2>&1 | grep 'unknown option'`
CFG_LIB_GLOB_i686-unknown-linux-gnu=lib$(1)-*.so
CFG_LIB_DSYM_GLOB_i686-unknown-linux-gnu=lib$(1)-*.dylib.dSYM
CFG_JEMALLOC_CFLAGS_i686-unknown-linux-gnu := -m32 $(CFLAGS)
-CFG_GCCISH_CFLAGS_i686-unknown-linux-gnu := -Wall -Werror -g -fPIC -m32 $(CFLAGS)
+CFG_GCCISH_CFLAGS_i686-unknown-linux-gnu := -Wall -Werror -g -fPIC -m32 $(CFLAGS) -march=i686
CFG_GCCISH_CXXFLAGS_i686-unknown-linux-gnu := -fno-rtti $(CXXFLAGS)
CFG_GCCISH_LINK_FLAGS_i686-unknown-linux-gnu := -shared -fPIC -ldl -pthread -lrt -g -m32
CFG_GCCISH_DEF_FLAG_i686-unknown-linux-gnu := -Wl,--export-dynamic,--dynamic-list=
ifeq ($(CFG_LLVM_ROOT),)
LLVM_STAMP_$(1) = $$(CFG_LLVM_BUILD_DIR_$(1))/llvm-auto-clean-stamp
+LLVM_DONE_$(1) = $$(CFG_LLVM_BUILD_DIR_$(1))/llvm-finished-building
-$$(LLVM_CONFIG_$(1)): $$(LLVM_DEPS_TARGET_$(1)) $$(LLVM_STAMP_$(1))
+$$(LLVM_CONFIG_$(1)): $$(LLVM_DONE_$(1))
+
+$$(LLVM_DONE_$(1)): $$(LLVM_DEPS_TARGET_$(1)) $$(LLVM_STAMP_$(1))
@$$(call E, cmake: llvm)
ifeq ($$(findstring msvc,$(1)),msvc)
$$(Q)$$(CFG_CMAKE) --build $$(CFG_LLVM_BUILD_DIR_$(1)) \
else
$$(Q)$$(MAKE) -C $$(CFG_LLVM_BUILD_DIR_$(1))
endif
- $$(Q)touch $$(LLVM_CONFIG_$(1))
+ $$(Q)touch $$@
ifeq ($$(findstring msvc,$(1)),msvc)
clean-llvm$(1):
import argparse
import contextlib
+import datetime
import hashlib
import os
import shutil
import tarfile
import tempfile
+from time import time
+
def get(url, path, verbose=False):
sha_url = url + ".sha256"
data[a] = b
return data
+def format_build_time(duration):
+ return str(datetime.timedelta(seconds=int(duration)))
+
class RustBuild:
def download_stage0(self):
cache_dst = os.path.join(self.build_dir, "cache")
try:
ostype = subprocess.check_output(['uname', '-s']).strip()
cputype = subprocess.check_output(['uname', '-m']).strip()
- except subprocess.CalledProcessError:
+ except (subprocess.CalledProcessError, WindowsError):
if sys.platform == 'win32':
return 'x86_64-pc-windows-msvc'
err = "uname not found"
rb._rustc_channel, rb._rustc_date = data['rustc'].split('-', 1)
rb._cargo_channel, rb._cargo_date = data['cargo'].split('-', 1)
+ start_time = time()
+
# Fetch/build the bootstrap
rb.build = rb.build_triple()
rb.download_stage0()
env["BOOTSTRAP_PARENT_ID"] = str(os.getpid())
rb.run(args, env)
+ end_time = time()
+
+ print("Build completed in %s" % format_build_time(end_time - start_time))
+
if __name__ == '__main__':
main()
// compiler already takes into account the triple in question.
t if t.contains("android") => {
if let Some(ndk) = config.and_then(|c| c.ndk.as_ref()) {
+ let target = target.replace("armv7", "arm");
let compiler = format!("{}-{}", target, gnu_compiler);
cfg.compiler(ndk.join("bin").join(compiler));
}
use bootstrap::{dylib_path, dylib_path_var};
use build::{Build, Compiler, Mode};
+use build::util;
+
+const ADB_TEST_DIR: &'static str = "/data/tmp";
/// Runs the `linkchecker` tool as compiled in `stage` by the `host` compiler.
///
target: &str,
mode: &str,
suite: &str) {
+ println!("Check compiletest {} ({} -> {})", suite, compiler.host, target);
let mut cmd = build.tool_cmd(compiler, "compiletest");
// compiletest currently has... a lot of arguments, so let's just pass all
cmd.arg("--host").arg(compiler.host);
cmd.arg("--llvm-filecheck").arg(build.llvm_filecheck(&build.config.build));
- let mut flags = format!("-Crpath");
+ let mut flags = vec!["-Crpath".to_string()];
if build.config.rust_optimize_tests {
- flags.push_str(" -O");
+ flags.push("-O".to_string());
}
if build.config.rust_debuginfo_tests {
- flags.push_str(" -g");
+ flags.push("-g".to_string());
}
- cmd.arg("--host-rustcflags").arg(&flags);
-
- let linkflag = format!("-Lnative={}", build.test_helpers_out(target).display());
- cmd.arg("--target-rustcflags").arg(format!("{} {}", flags, linkflag));
+ let mut hostflags = build.rustc_flags(&compiler.host);
+ hostflags.extend(flags.clone());
+ cmd.arg("--host-rustcflags").arg(hostflags.join(" "));
- // FIXME: needs android support
- cmd.arg("--android-cross-path").arg("");
+ let mut targetflags = build.rustc_flags(&target);
+ targetflags.extend(flags);
+ targetflags.push(format!("-Lnative={}",
+ build.test_helpers_out(target).display()));
+ cmd.arg("--target-rustcflags").arg(targetflags.join(" "));
// FIXME: CFG_PYTHON should probably be detected more robustly elsewhere
let python_default = "python";
}
build.add_bootstrap_key(compiler, &mut cmd);
+ cmd.arg("--adb-path").arg("adb");
+ cmd.arg("--adb-test-dir").arg(ADB_TEST_DIR);
+ if target.contains("android") {
+ // Assume that cc for this target comes from the android sysroot
+ cmd.arg("--android-cross-path")
+ .arg(build.cc(target).parent().unwrap().parent().unwrap());
+ } else {
+ cmd.arg("--android-cross-path").arg("");
+ }
+
build.run(&mut cmd);
}
let mut dylib_path = dylib_path();
dylib_path.insert(0, build.sysroot_libdir(compiler, target));
cargo.env(dylib_path_var(), env::join_paths(&dylib_path).unwrap());
- cargo.args(&build.flags.args);
- build.run(&mut cargo);
+ if target.contains("android") {
+ build.run(cargo.arg("--no-run"));
+ krate_android(build, compiler, target, mode);
+ } else {
+ cargo.args(&build.flags.args);
+ build.run(&mut cargo);
+ }
+}
+
+fn krate_android(build: &Build,
+ compiler: &Compiler,
+ target: &str,
+ mode: Mode) {
+ let mut tests = Vec::new();
+ let out_dir = build.cargo_out(compiler, mode, target);
+ find_tests(&out_dir, target, &mut tests);
+ find_tests(&out_dir.join("deps"), target, &mut tests);
+
+ for test in tests {
+ build.run(Command::new("adb").arg("push").arg(&test).arg(ADB_TEST_DIR));
+
+ let test_file_name = test.file_name().unwrap().to_string_lossy();
+ let log = format!("{}/check-stage{}-T-{}-H-{}-{}.log",
+ ADB_TEST_DIR,
+ compiler.stage,
+ target,
+ compiler.host,
+ test_file_name);
+ let program = format!("(cd {dir}; \
+ LD_LIBRARY_PATH=./{target} ./{test} \
+ --logfile {log} \
+ {args})",
+ dir = ADB_TEST_DIR,
+ target = target,
+ test = test_file_name,
+ log = log,
+ args = build.flags.args.join(" "));
+
+ let output = output(Command::new("adb").arg("shell").arg(&program));
+ println!("{}", output);
+ build.run(Command::new("adb")
+ .arg("pull")
+ .arg(&log)
+ .arg(build.out.join("tmp")));
+ build.run(Command::new("adb").arg("shell").arg("rm").arg(&log));
+ if !output.contains("result: ok") {
+ panic!("some tests failed");
+ }
+ }
+}
+
+fn find_tests(dir: &Path,
+ target: &str,
+ dst: &mut Vec<PathBuf>) {
+ for e in t!(dir.read_dir()).map(|e| t!(e)) {
+ let file_type = t!(e.file_type());
+ if !file_type.is_file() {
+ continue
+ }
+ let filename = e.file_name().into_string().unwrap();
+ if (target.contains("windows") && filename.ends_with(".exe")) ||
+ (!target.contains("windows") && !filename.contains(".")) {
+ dst.push(e.path());
+ }
+ }
+}
+
+pub fn android_copy_libs(build: &Build,
+ compiler: &Compiler,
+ target: &str) {
+ println!("Android copy libs to emulator ({})", target);
+ build.run(Command::new("adb").arg("remount"));
+ build.run(Command::new("adb").args(&["shell", "rm", "-r", ADB_TEST_DIR]));
+ build.run(Command::new("adb").args(&["shell", "mkdir", ADB_TEST_DIR]));
+ build.run(Command::new("adb")
+ .arg("push")
+ .arg(build.src.join("src/etc/adb_run_wrapper.sh"))
+ .arg(ADB_TEST_DIR));
+
+ let target_dir = format!("{}/{}", ADB_TEST_DIR, target);
+ build.run(Command::new("adb").args(&["shell", "mkdir", &target_dir[..]]));
+
+ for f in t!(build.sysroot_libdir(compiler, target).read_dir()) {
+ let f = t!(f);
+ let name = f.file_name().into_string().unwrap();
+ if util::is_dylib(&name) {
+ build.run(Command::new("adb")
+ .arg("push")
+ .arg(f.path())
+ .arg(&target_dir));
+ }
+ }
}
target.ndk = Some(PathBuf::from(value));
}
"CFG_I686_LINUX_ANDROID_NDK" if value.len() > 0 => {
- let target = "i686-linux-androideabi".to_string();
+ let target = "i686-linux-android".to_string();
let target = self.target_config.entry(target)
.or_insert(Target::default());
target.ndk = Some(PathBuf::from(value));
}
"CFG_AARCH64_LINUX_ANDROID_NDK" if value.len() > 0 => {
- let target = "aarch64-linux-androideabi".to_string();
+ let target = "aarch64-linux-android".to_string();
let target = self.target_config.entry(target)
.or_insert(Target::default());
target.ndk = Some(PathBuf::from(value));
// Prepare the overlay which is part of the tarball but won't actually be
// installed
- t!(fs::create_dir_all(&overlay));
let cp = |file: &str| {
install(&build.src.join(file), &overlay, 0o644);
};
// Copy runtime DLLs needed by the compiler
if libdir != "bin" {
- t!(fs::create_dir_all(image.join(libdir)));
for entry in t!(src.join(libdir).read_dir()).map(|e| t!(e)) {
let name = entry.file_name();
if let Some(s) = name.to_str() {
let cp = |file: &str| {
install(&build.src.join(file), &image.join("share/doc/rust"), 0o644);
};
- t!(fs::create_dir_all(&image.join("share/doc/rust")));
cp("COPYRIGHT");
cp("LICENSE-APACHE");
cp("LICENSE-MIT");
fn install(src: &Path, dstdir: &Path, perms: u32) {
let dst = dstdir.join(src.file_name().unwrap());
+ t!(fs::create_dir_all(dstdir));
t!(fs::copy(src, &dst));
chmod(&dst, perms);
}
///
/// These entries currently correspond to the various output directories of the
/// build system, with each mod generating output in a different directory.
+#[derive(Clone, Copy)]
pub enum Mode {
/// This cargo is going to build the standard library, placing output in the
/// "stageN-std" directory.
"ui", "ui");
}
CheckDebuginfo { compiler } => {
- if target.target.contains("msvc") ||
- target.target.contains("android") {
+ if target.target.contains("msvc") {
// nothing to do
} else if target.target.contains("apple") {
check::compiletest(self, &compiler, target.target,
target.target);
}
+ AndroidCopyLibs { compiler } => {
+ check::android_copy_libs(self, &compiler, target.target);
+ }
+
+ // pseudo-steps
Dist { .. } |
- Doc { .. } | // pseudo-steps
+ Doc { .. } |
+ CheckTarget { .. } |
Check { .. } => {}
}
}
use std::path::Path;
use std::process::Command;
-use std::fs;
+use std::fs::{self, File};
use build_helper::output;
use cmake;
use gcc;
use build::Build;
-use build::util::{exe, staticlib, up_to_date};
+use build::util::{staticlib, up_to_date};
/// Compile LLVM for `target`.
pub fn llvm(build: &Build, target: &str) {
// artifacts are missing) then we keep going, otherwise we bail out.
let dst = build.llvm_out(target);
let stamp = build.src.join("src/rustllvm/llvm-auto-clean-trigger");
- let llvm_config = dst.join("bin").join(exe("llvm-config", target));
+ let done_stamp = dst.join("llvm-finished-building");
build.clear_if_dirty(&dst, &stamp);
- if fs::metadata(llvm_config).is_ok() {
+ if fs::metadata(&done_stamp).is_ok() {
return
}
+ println!("Building LLVM for {}", target);
+
let _ = fs::remove_dir_all(&dst.join("build"));
t!(fs::create_dir_all(&dst.join("build")));
let assertions = if build.config.llvm_assertions {"ON"} else {"OFF"};
// tools. Figure out how to filter them down and only build the right
// tools and libs on all platforms.
cfg.build();
+
+ t!(File::create(&done_stamp));
}
fn check_llvm_version(build: &Build, llvm_config: &Path) {
"arm" if target.contains("eabihf") => "armhf",
_ => arch,
};
- let target = format!("clang_rt.builtins-{}{}", builtins_arch, os_extra);
- ("linux".to_string(), target.clone(), target)
+ let target = format!("clang_rt.builtins-{}", builtins_arch);
+ ("linux".to_string(),
+ target.clone(),
+ format!("{}{}", target, os_extra))
} else if target.contains("apple-darwin") {
let builtins_arch = match arch {
"i686" => "i386",
");
}
}
+
+ if target.contains("arm-linux-android") {
+ need_cmd("adb".as_ref());
+ }
}
for host in build.flags.host.iter() {
// Steps for running tests. The 'check' target is just a pseudo
// target to depend on a bunch of others.
(check, Check { stage: u32, compiler: Compiler<'a> }),
+ (check_target, CheckTarget { stage: u32, compiler: Compiler<'a> }),
(check_linkcheck, CheckLinkcheck { stage: u32 }),
(check_cargotest, CheckCargoTest { stage: u32 }),
(check_tidy, CheckTidy { stage: u32 }),
(dist_mingw, DistMingw { _dummy: () }),
(dist_rustc, DistRustc { stage: u32 }),
(dist_std, DistStd { compiler: Compiler<'a> }),
+
+ // Misc targets
+ (android_copy_libs, AndroidCopyLibs { compiler: Compiler<'a> }),
}
}
}
self.doc_error_index(stage)]
}
Source::Check { stage, compiler } => {
- vec![
+ // Check is just a pseudo step which means check all targets,
+ // so just depend on checking all targets.
+ build.config.target.iter().map(|t| {
+ self.target(t).check_target(stage, compiler)
+ }).collect()
+ }
+ Source::CheckTarget { stage, compiler } => {
+ // CheckTarget here means run all possible test suites for this
+ // target. Most of the time, however, we can't actually run
+ // anything if we're not the build triple as we could be cross
+ // compiling.
+ //
+ // As a result, the base set of targets here is quite stripped
+ // down from the standard set of targets. These suites have
+ // their own internal logic to run in cross-compiled situations
+ // if they'll run at all. For example compiletest knows that
+ // when testing Android targets we ship artifacts to the
+ // emulator.
+ //
+ // When in doubt the rule of thumb for adding to this list is
+ // "should this test suite run on the android bot?"
+ let mut base = vec![
self.check_rpass(compiler),
- self.check_rpass_full(compiler),
self.check_rfail(compiler),
- self.check_rfail_full(compiler),
- self.check_cfail(compiler),
- self.check_cfail_full(compiler),
- self.check_pfail(compiler),
- self.check_incremental(compiler),
- self.check_ui(compiler),
self.check_crate_std(compiler),
self.check_crate_test(compiler),
- self.check_crate_rustc(compiler),
- self.check_codegen(compiler),
- self.check_codegen_units(compiler),
self.check_debuginfo(compiler),
- self.check_rustdoc(compiler),
- self.check_pretty(compiler),
- self.check_pretty_rpass(compiler),
- self.check_pretty_rpass_full(compiler),
- self.check_pretty_rfail(compiler),
- self.check_pretty_rfail_full(compiler),
- self.check_pretty_rpass_valgrind(compiler),
- self.check_rpass_valgrind(compiler),
- self.check_error_index(compiler),
- self.check_docs(compiler),
- self.check_rmake(compiler),
- self.check_linkcheck(stage),
- self.check_tidy(stage),
self.dist(stage),
- ]
+ ];
+
+ // If we're testing the build triple, then we know we can
+ // actually run binaries and such, so we run all possible tests
+ // that we know about.
+ if self.target == build.config.build {
+ base.extend(vec![
+ // docs-related
+ self.check_docs(compiler),
+ self.check_error_index(compiler),
+ self.check_rustdoc(compiler),
+
+ // UI-related
+ self.check_cfail(compiler),
+ self.check_pfail(compiler),
+ self.check_ui(compiler),
+
+ // codegen-related
+ self.check_incremental(compiler),
+ self.check_codegen(compiler),
+ self.check_codegen_units(compiler),
+
+ // misc compiletest-test suites
+ self.check_rpass_full(compiler),
+ self.check_rfail_full(compiler),
+ self.check_cfail_full(compiler),
+ self.check_pretty_rpass_full(compiler),
+ self.check_pretty_rfail_full(compiler),
+ self.check_rpass_valgrind(compiler),
+ self.check_rmake(compiler),
+
+ // crates
+ self.check_crate_rustc(compiler),
+
+ // pretty
+ self.check_pretty(compiler),
+ self.check_pretty_rpass(compiler),
+ self.check_pretty_rfail(compiler),
+ self.check_pretty_rpass_valgrind(compiler),
+
+ // misc
+ self.check_linkcheck(stage),
+ self.check_tidy(stage),
+ ]);
+ }
+ return base
}
Source::CheckLinkcheck { stage } => {
vec![self.tool_linkchecker(stage), self.doc(stage)]
Source::CheckCFail { compiler } |
Source::CheckRPassValgrind { compiler } |
Source::CheckRPass { compiler } => {
- vec![
+ let mut base = vec![
self.libtest(compiler),
- self.tool_compiletest(compiler.stage),
+ self.target(compiler.host).tool_compiletest(compiler.stage),
self.test_helpers(()),
- ]
+ ];
+ if self.target.contains("android") {
+ base.push(self.android_copy_libs(compiler));
+ }
+ base
}
Source::CheckDebuginfo { compiler } => {
vec![
self.libtest(compiler),
- self.tool_compiletest(compiler.stage),
+ self.target(compiler.host).tool_compiletest(compiler.stage),
self.test_helpers(()),
self.debugger_scripts(compiler.stage),
]
Source::CheckPrettyRPassValgrind { compiler } |
Source::CheckRMake { compiler } => {
vec![self.librustc(compiler),
- self.tool_compiletest(compiler.stage)]
+ self.target(compiler.host).tool_compiletest(compiler.stage)]
}
Source::CheckDocs { compiler } => {
vec![self.libstd(compiler)]
}
Source::CheckErrorIndex { compiler } => {
- vec![self.libstd(compiler), self.tool_error_index(compiler.stage)]
+ vec![self.libstd(compiler),
+ self.target(compiler.host).tool_error_index(compiler.stage)]
}
Source::CheckCrateStd { compiler } => {
vec![self.libtest(compiler)]
}
return base
}
+
+ Source::AndroidCopyLibs { compiler } => {
+ vec![self.libtest(compiler)]
+ }
}
}
}
clean:
$(Q)$(BOOTSTRAP) --clean
+rustc-stage1:
+ $(Q)$(BOOTSTRAP) --step libtest --stage 1
+rustc-stage2:
+ $(Q)$(BOOTSTRAP) --step libtest --stage 2
+
docs: doc
doc:
$(Q)$(BOOTSTRAP) --step doc
acts as a shorthand for writing an item signature, while not hiding
away the actual types involved as full local inference would if applied to it.
-When talking about lifetime elision, we use the term *input lifetime* and
+When talking about lifetime elision, we use the terms *input lifetime* and
*output lifetime*. An *input lifetime* is a lifetime associated with a parameter
of a function, and an *output lifetime* is a lifetime associated with the return
value of a function. For example, this function has an input lifetime:
fn debug(lvl: u32, s: &str); // elided
fn debug<'a>(lvl: u32, s: &'a str); // expanded
+```
-// In the preceding example, `lvl` doesn’t need a lifetime because it’s not a
-// reference (`&`). Only things relating to references (such as a `struct`
-// which contains a reference) need lifetimes.
+In the preceding example, `lvl` doesn’t need a lifetime because it’s not a
+reference (`&`). Only things relating to references (such as a `struct`
+which contains a reference) need lifetimes.
+```rust,ignore
fn substr(s: &str, until: u32) -> &str; // elided
fn substr<'a>(s: &'a str, until: u32) -> &'a str; // expanded
has no pointers to data somewhere else, copying it is a full copy.
All primitive types implement the `Copy` trait and their ownership is
-therefore not moved like one would assume, following the ´ownership rules´.
+therefore not moved like one would assume, following the ‘ownership rules’.
To give an example, the two following snippets of code only compile because the
`i32` and `bool` types implement the `Copy` trait.
Ugh! The return type, return line, and calling the function gets way more
complicated.
-Luckily, Rust offers a feature, borrowing, which helps us solve this problem.
-It’s the topic of the next section!
+Luckily, Rust offers a feature which helps us solve this problem.
+It’s called borrowing and is the topic of the next section!
foo(&v);
```
-errors with:
+will give us this error:
```text
error: cannot borrow immutable borrowed content `*v` as mutable
If it wasn’t, we couldn’t take a mutable borrow to an immutable value.
You'll also notice we added an asterisk (`*`) in front of `y`, making it `*y`,
-this is because `y` is a `&mut` reference. You'll also need to use them for
-accessing the contents of a reference as well.
+this is because `y` is a `&mut` reference. You'll need to use asterisks to
+access the contents of a reference as well.
Otherwise, `&mut` references are like references. There _is_ a large
difference between the two, and how they interact, though. You can tell
# The Rules
-Here’s the rules about borrowing in Rust:
+Here are the rules for borrowing in Rust:
First, any borrow must last for a scope no greater than that of the owner.
Second, you may have one or the other of these two kinds of borrows, but not
Here’s the code:
```rust,ignore
-let mut x = 5;
-let y = &mut x;
+fn main() {
+ let mut x = 5;
+ let y = &mut x;
-*y += 1;
+ *y += 1;
-println!("{}", x);
+ println!("{}", x);
+}
```
This code gives us this error:
```
This is because we’ve violated the rules: we have a `&mut T` pointing to `x`,
-and so we aren’t allowed to create any `&T`s. One or the other. The note
+and so we aren’t allowed to create any `&T`s. It's one or the other. The note
hints at how to think about this problem:
```text
scopes look like this:
```rust,ignore
-let mut x = 5;
-
-let y = &mut x; // -+ &mut borrow of x starts here
- // |
-*y += 1; // |
- // |
-println!("{}", x); // -+ - try to borrow x here
- // -+ &mut borrow of x ends here
+fn main() {
+ let mut x = 5;
+
+ let y = &mut x; // -+ &mut borrow of x starts here
+ // |
+ *y += 1; // |
+ // |
+ println!("{}", x); // -+ - try to borrow x here
+} // -+ &mut borrow of x ends here
+
```
The scopes conflict: we can’t make an `&x` while `y` is in scope.
```
There’s no problem. Our mutable borrow goes out of scope before we create an
-immutable one. But scope is the key to seeing how long a borrow lasts for.
+immutable one. So scope is the key to seeing how long a borrow lasts for.
## Issues borrowing prevents
Why have these restrictive rules? Well, as we noted, these rules prevent data
-races. What kinds of issues do data races cause? Here’s a few.
+races. What kinds of issues do data races cause? Here are a few.
### Iterator invalidation
We can’t modify `v` because it’s borrowed by the loop.
-### use after free
+### Use after free
References must not live longer than the resource they refer to. Rust will
check the scopes of your references to ensure that this is true.
#[stable(feature = "rust1", since = "1.0.0")]
pub use self::sip::SipHasher;
+#[unstable(feature = "sip_hash_13", issue = "29754")]
+pub use self::sip::{SipHasher13, SipHasher24};
+
mod sip;
/// A hashable type.
// option. This file may not be copied, modified, or distributed
// except according to those terms.
-//! An implementation of SipHash 2-4.
+//! An implementation of SipHash.
use prelude::v1::*;
+use marker::PhantomData;
use ptr;
-use super::Hasher;
+
+/// An implementation of SipHash 1-3.
+///
+/// See: https://131002.net/siphash/
+#[unstable(feature = "sip_hash_13", issue = "29754")]
+#[derive(Debug, Clone, Default)]
+pub struct SipHasher13 {
+ hasher: Hasher<Sip13Rounds>,
+}
+
+/// An implementation of SipHash 2-4.
+///
+/// See: https://131002.net/siphash/
+#[unstable(feature = "sip_hash_13", issue = "29754")]
+#[derive(Debug, Clone, Default)]
+pub struct SipHasher24 {
+ hasher: Hasher<Sip24Rounds>,
+}
/// An implementation of SipHash 2-4.
///
/// Although the SipHash algorithm is considered to be generally strong,
/// it is not intended for cryptographic purposes. As such, all
/// cryptographic uses of this implementation are _strongly discouraged_.
-#[derive(Debug)]
#[stable(feature = "rust1", since = "1.0.0")]
-pub struct SipHasher {
+#[derive(Debug, Clone, Default)]
+pub struct SipHasher(SipHasher24);
+
+#[derive(Debug)]
+struct Hasher<S: Sip> {
k0: u64,
k1: u64,
length: usize, // how many bytes we've processed
+ state: State, // hash State
+ tail: u64, // unprocessed bytes le
+ ntail: usize, // how many bytes in tail are valid
+ _marker: PhantomData<S>,
+}
+
+#[derive(Debug, Clone, Copy)]
+struct State {
// v0, v2 and v1, v3 show up in pairs in the algorithm,
// and simd implementations of SipHash will use vectors
// of v02 and v13. By placing them in this order in the struct,
// the compiler can pick up on just a few simd optimizations by itself.
- v0: u64, // hash state
+ v0: u64,
v2: u64,
v1: u64,
v3: u64,
- tail: u64, // unprocessed bytes le
- ntail: usize, // how many bytes in tail are valid
}
// sadly, these macro definitions can't appear later,
}
macro_rules! compress {
+ ($state:expr) => ({
+ compress!($state.v0, $state.v1, $state.v2, $state.v3)
+ });
($v0:expr, $v1:expr, $v2:expr, $v3:expr) =>
({
$v0 = $v0.wrapping_add($v1); $v1 = rotl!($v1, 13); $v1 ^= $v0;
$v0 = $v0.wrapping_add($v3); $v3 = rotl!($v3, 21); $v3 ^= $v0;
$v2 = $v2.wrapping_add($v1); $v1 = rotl!($v1, 17); $v1 ^= $v2;
$v2 = rotl!($v2, 32);
- })
+ });
}
impl SipHasher {
#[inline]
#[stable(feature = "rust1", since = "1.0.0")]
pub fn new_with_keys(key0: u64, key1: u64) -> SipHasher {
- let mut state = SipHasher {
+ SipHasher(SipHasher24::new_with_keys(key0, key1))
+ }
+}
+
+
+impl SipHasher13 {
+ /// Creates a new `SipHasher13` with the two initial keys set to 0.
+ #[inline]
+ #[unstable(feature = "sip_hash_13", issue = "29754")]
+ pub fn new() -> SipHasher13 {
+ SipHasher13::new_with_keys(0, 0)
+ }
+
+ /// Creates a `SipHasher13` that is keyed off the provided keys.
+ #[inline]
+ #[unstable(feature = "sip_hash_13", issue = "29754")]
+ pub fn new_with_keys(key0: u64, key1: u64) -> SipHasher13 {
+ SipHasher13 {
+ hasher: Hasher::new_with_keys(key0, key1)
+ }
+ }
+}
+
+impl SipHasher24 {
+ /// Creates a new `SipHasher24` with the two initial keys set to 0.
+ #[inline]
+ #[unstable(feature = "sip_hash_13", issue = "29754")]
+ pub fn new() -> SipHasher24 {
+ SipHasher24::new_with_keys(0, 0)
+ }
+
+ /// Creates a `SipHasher24` that is keyed off the provided keys.
+ #[inline]
+ #[unstable(feature = "sip_hash_13", issue = "29754")]
+ pub fn new_with_keys(key0: u64, key1: u64) -> SipHasher24 {
+ SipHasher24 {
+ hasher: Hasher::new_with_keys(key0, key1)
+ }
+ }
+}
+
+impl<S: Sip> Hasher<S> {
+ #[inline]
+ fn new_with_keys(key0: u64, key1: u64) -> Hasher<S> {
+ let mut state = Hasher {
k0: key0,
k1: key1,
length: 0,
- v0: 0,
- v1: 0,
- v2: 0,
- v3: 0,
+ state: State {
+ v0: 0,
+ v1: 0,
+ v2: 0,
+ v3: 0,
+ },
tail: 0,
ntail: 0,
+ _marker: PhantomData,
};
state.reset();
state
#[inline]
fn reset(&mut self) {
self.length = 0;
- self.v0 = self.k0 ^ 0x736f6d6570736575;
- self.v1 = self.k1 ^ 0x646f72616e646f6d;
- self.v2 = self.k0 ^ 0x6c7967656e657261;
- self.v3 = self.k1 ^ 0x7465646279746573;
+ self.state.v0 = self.k0 ^ 0x736f6d6570736575;
+ self.state.v1 = self.k1 ^ 0x646f72616e646f6d;
+ self.state.v2 = self.k0 ^ 0x6c7967656e657261;
+ self.state.v3 = self.k1 ^ 0x7465646279746573;
self.ntail = 0;
}
}
#[stable(feature = "rust1", since = "1.0.0")]
-impl Hasher for SipHasher {
+impl super::Hasher for SipHasher {
+ #[inline]
+ fn write(&mut self, msg: &[u8]) {
+ self.0.write(msg)
+ }
+
+ #[inline]
+ fn finish(&self) -> u64 {
+ self.0.finish()
+ }
+}
+
+#[unstable(feature = "sip_hash_13", issue = "29754")]
+impl super::Hasher for SipHasher13 {
+ #[inline]
+ fn write(&mut self, msg: &[u8]) {
+ self.hasher.write(msg)
+ }
+
+ #[inline]
+ fn finish(&self) -> u64 {
+ self.hasher.finish()
+ }
+}
+
+#[unstable(feature = "sip_hash_13", issue = "29754")]
+impl super::Hasher for SipHasher24 {
+ #[inline]
+ fn write(&mut self, msg: &[u8]) {
+ self.hasher.write(msg)
+ }
+
+ #[inline]
+ fn finish(&self) -> u64 {
+ self.hasher.finish()
+ }
+}
+
+impl<S: Sip> super::Hasher for Hasher<S> {
#[inline]
fn write(&mut self, msg: &[u8]) {
let length = msg.len();
let m = self.tail | u8to64_le!(msg, 0, needed) << 8 * self.ntail;
- self.v3 ^= m;
- compress!(self.v0, self.v1, self.v2, self.v3);
- compress!(self.v0, self.v1, self.v2, self.v3);
- self.v0 ^= m;
+ self.state.v3 ^= m;
+ S::c_rounds(&mut self.state);
+ self.state.v0 ^= m;
self.ntail = 0;
}
while i < len - left {
let mi = unsafe { load_u64_le(msg, i) };
- self.v3 ^= mi;
- compress!(self.v0, self.v1, self.v2, self.v3);
- compress!(self.v0, self.v1, self.v2, self.v3);
- self.v0 ^= mi;
+ self.state.v3 ^= mi;
+ S::c_rounds(&mut self.state);
+ self.state.v0 ^= mi;
i += 8;
}
#[inline]
fn finish(&self) -> u64 {
- let mut v0 = self.v0;
- let mut v1 = self.v1;
- let mut v2 = self.v2;
- let mut v3 = self.v3;
+ let mut state = self.state;
let b: u64 = ((self.length as u64 & 0xff) << 56) | self.tail;
- v3 ^= b;
- compress!(v0, v1, v2, v3);
- compress!(v0, v1, v2, v3);
- v0 ^= b;
+ state.v3 ^= b;
+ S::c_rounds(&mut state);
+ state.v0 ^= b;
- v2 ^= 0xff;
- compress!(v0, v1, v2, v3);
- compress!(v0, v1, v2, v3);
- compress!(v0, v1, v2, v3);
- compress!(v0, v1, v2, v3);
+ state.v2 ^= 0xff;
+ S::d_rounds(&mut state);
- v0 ^ v1 ^ v2 ^ v3
+ state.v0 ^ state.v1 ^ state.v2 ^ state.v3
}
}
-#[stable(feature = "rust1", since = "1.0.0")]
-impl Clone for SipHasher {
+impl<S: Sip> Clone for Hasher<S> {
#[inline]
- fn clone(&self) -> SipHasher {
- SipHasher {
+ fn clone(&self) -> Hasher<S> {
+ Hasher {
k0: self.k0,
k1: self.k1,
length: self.length,
- v0: self.v0,
- v1: self.v1,
- v2: self.v2,
- v3: self.v3,
+ state: self.state,
tail: self.tail,
ntail: self.ntail,
+ _marker: self._marker,
}
}
}
-#[stable(feature = "rust1", since = "1.0.0")]
-impl Default for SipHasher {
- fn default() -> SipHasher {
- SipHasher::new()
+impl<S: Sip> Default for Hasher<S> {
+ #[inline]
+ fn default() -> Hasher<S> {
+ Hasher::new_with_keys(0, 0)
+ }
+}
+
+#[doc(hidden)]
+trait Sip {
+ fn c_rounds(&mut State);
+ fn d_rounds(&mut State);
+}
+
+#[derive(Debug, Clone, Default)]
+struct Sip13Rounds;
+
+impl Sip for Sip13Rounds {
+ #[inline]
+ fn c_rounds(state: &mut State) {
+ compress!(state);
+ }
+
+ #[inline]
+ fn d_rounds(state: &mut State) {
+ compress!(state);
+ compress!(state);
+ compress!(state);
+ }
+}
+
+#[derive(Debug, Clone, Default)]
+struct Sip24Rounds;
+
+impl Sip for Sip24Rounds {
+ #[inline]
+ fn c_rounds(state: &mut State) {
+ compress!(state);
+ compress!(state);
+ }
+
+ #[inline]
+ fn d_rounds(state: &mut State) {
+ compress!(state);
+ compress!(state);
+ compress!(state);
+ compress!(state);
}
}
#[stable(feature = "rust1", since = "1.0.0")]
pub fn transmute<T, U>(e: T) -> U;
+ /// Gives the address for the return value of the enclosing function.
+ ///
+ /// Using this intrinsic in a function that does not use an out pointer
+ /// will trigger a compiler error.
+ pub fn return_address() -> *const u8;
+
/// Returns `true` if the actual type given as `T` requires drop
/// glue; returns `false` if the actual type provided for `T`
/// implements `Copy`.
//! }
//!
//! fn write_info(info: &Info) -> io::Result<()> {
-//! let mut file = try!(File::create("my_best_friends.txt"));
//! // Early return on error
+//! let mut file = match File::create("my_best_friends.txt") {
+//! Err(e) => return Err(e),
+//! Ok(f) => f,
+//! };
//! if let Err(e) = file.write_all(format!("name: {}\n", info.name).as_bytes()) {
//! return Err(e)
//! }
use test::{Bencher, black_box};
use core::hash::{Hash, Hasher};
-use core::hash::SipHasher;
+use core::hash::{SipHasher, SipHasher13, SipHasher24};
// Hash just the bytes of the slice, without length prefix
struct Bytes<'a>(&'a [u8]);
});
}
-fn hash<T: Hash>(x: &T) -> u64 {
- let mut st = SipHasher::new();
+fn hash_with<H: Hasher, T: Hash>(mut st: H, x: &T) -> u64 {
x.hash(&mut st);
st.finish()
}
-fn hash_with_keys<T: Hash>(k1: u64, k2: u64, x: &T) -> u64 {
- let mut st = SipHasher::new_with_keys(k1, k2);
- x.hash(&mut st);
- st.finish()
+fn hash<T: Hash>(x: &T) -> u64 {
+ hash_with(SipHasher::new(), x)
}
-fn hash_bytes(x: &[u8]) -> u64 {
- let mut s = SipHasher::default();
+fn hash_bytes<H: Hasher>(mut s: H, x: &[u8]) -> u64 {
Hasher::write(&mut s, x);
s.finish()
}
#[test]
#[allow(unused_must_use)]
-fn test_siphash() {
+fn test_siphash_1_3() {
+ let vecs : [[u8; 8]; 64] = [
+ [ 0xdc, 0xc4, 0x0f, 0x05, 0x58, 0x01, 0xac, 0xab ],
+ [ 0x93, 0xca, 0x57, 0x7d, 0xf3, 0x9b, 0xf4, 0xc9 ],
+ [ 0x4d, 0xd4, 0xc7, 0x4d, 0x02, 0x9b, 0xcb, 0x82 ],
+ [ 0xfb, 0xf7, 0xdd, 0xe7, 0xb8, 0x0a, 0xf8, 0x8b ],
+ [ 0x28, 0x83, 0xd3, 0x88, 0x60, 0x57, 0x75, 0xcf ],
+ [ 0x67, 0x3b, 0x53, 0x49, 0x2f, 0xd5, 0xf9, 0xde ],
+ [ 0xa7, 0x22, 0x9f, 0xc5, 0x50, 0x2b, 0x0d, 0xc5 ],
+ [ 0x40, 0x11, 0xb1, 0x9b, 0x98, 0x7d, 0x92, 0xd3 ],
+ [ 0x8e, 0x9a, 0x29, 0x8d, 0x11, 0x95, 0x90, 0x36 ],
+ [ 0xe4, 0x3d, 0x06, 0x6c, 0xb3, 0x8e, 0xa4, 0x25 ],
+ [ 0x7f, 0x09, 0xff, 0x92, 0xee, 0x85, 0xde, 0x79 ],
+ [ 0x52, 0xc3, 0x4d, 0xf9, 0xc1, 0x18, 0xc1, 0x70 ],
+ [ 0xa2, 0xd9, 0xb4, 0x57, 0xb1, 0x84, 0xa3, 0x78 ],
+ [ 0xa7, 0xff, 0x29, 0x12, 0x0c, 0x76, 0x6f, 0x30 ],
+ [ 0x34, 0x5d, 0xf9, 0xc0, 0x11, 0xa1, 0x5a, 0x60 ],
+ [ 0x56, 0x99, 0x51, 0x2a, 0x6d, 0xd8, 0x20, 0xd3 ],
+ [ 0x66, 0x8b, 0x90, 0x7d, 0x1a, 0xdd, 0x4f, 0xcc ],
+ [ 0x0c, 0xd8, 0xdb, 0x63, 0x90, 0x68, 0xf2, 0x9c ],
+ [ 0x3e, 0xe6, 0x73, 0xb4, 0x9c, 0x38, 0xfc, 0x8f ],
+ [ 0x1c, 0x7d, 0x29, 0x8d, 0xe5, 0x9d, 0x1f, 0xf2 ],
+ [ 0x40, 0xe0, 0xcc, 0xa6, 0x46, 0x2f, 0xdc, 0xc0 ],
+ [ 0x44, 0xf8, 0x45, 0x2b, 0xfe, 0xab, 0x92, 0xb9 ],
+ [ 0x2e, 0x87, 0x20, 0xa3, 0x9b, 0x7b, 0xfe, 0x7f ],
+ [ 0x23, 0xc1, 0xe6, 0xda, 0x7f, 0x0e, 0x5a, 0x52 ],
+ [ 0x8c, 0x9c, 0x34, 0x67, 0xb2, 0xae, 0x64, 0xf4 ],
+ [ 0x79, 0x09, 0x5b, 0x70, 0x28, 0x59, 0xcd, 0x45 ],
+ [ 0xa5, 0x13, 0x99, 0xca, 0xe3, 0x35, 0x3e, 0x3a ],
+ [ 0x35, 0x3b, 0xde, 0x4a, 0x4e, 0xc7, 0x1d, 0xa9 ],
+ [ 0x0d, 0xd0, 0x6c, 0xef, 0x02, 0xed, 0x0b, 0xfb ],
+ [ 0xf4, 0xe1, 0xb1, 0x4a, 0xb4, 0x3c, 0xd9, 0x88 ],
+ [ 0x63, 0xe6, 0xc5, 0x43, 0xd6, 0x11, 0x0f, 0x54 ],
+ [ 0xbc, 0xd1, 0x21, 0x8c, 0x1f, 0xdd, 0x70, 0x23 ],
+ [ 0x0d, 0xb6, 0xa7, 0x16, 0x6c, 0x7b, 0x15, 0x81 ],
+ [ 0xbf, 0xf9, 0x8f, 0x7a, 0xe5, 0xb9, 0x54, 0x4d ],
+ [ 0x3e, 0x75, 0x2a, 0x1f, 0x78, 0x12, 0x9f, 0x75 ],
+ [ 0x91, 0x6b, 0x18, 0xbf, 0xbe, 0xa3, 0xa1, 0xce ],
+ [ 0x06, 0x62, 0xa2, 0xad, 0xd3, 0x08, 0xf5, 0x2c ],
+ [ 0x57, 0x30, 0xc3, 0xa3, 0x2d, 0x1c, 0x10, 0xb6 ],
+ [ 0xa1, 0x36, 0x3a, 0xae, 0x96, 0x74, 0xf4, 0xb3 ],
+ [ 0x92, 0x83, 0x10, 0x7b, 0x54, 0x57, 0x6b, 0x62 ],
+ [ 0x31, 0x15, 0xe4, 0x99, 0x32, 0x36, 0xd2, 0xc1 ],
+ [ 0x44, 0xd9, 0x1a, 0x3f, 0x92, 0xc1, 0x7c, 0x66 ],
+ [ 0x25, 0x88, 0x13, 0xc8, 0xfe, 0x4f, 0x70, 0x65 ],
+ [ 0xa6, 0x49, 0x89, 0xc2, 0xd1, 0x80, 0xf2, 0x24 ],
+ [ 0x6b, 0x87, 0xf8, 0xfa, 0xed, 0x1c, 0xca, 0xc2 ],
+ [ 0x96, 0x21, 0x04, 0x9f, 0xfc, 0x4b, 0x16, 0xc2 ],
+ [ 0x23, 0xd6, 0xb1, 0x68, 0x93, 0x9c, 0x6e, 0xa1 ],
+ [ 0xfd, 0x14, 0x51, 0x8b, 0x9c, 0x16, 0xfb, 0x49 ],
+ [ 0x46, 0x4c, 0x07, 0xdf, 0xf8, 0x43, 0x31, 0x9f ],
+ [ 0xb3, 0x86, 0xcc, 0x12, 0x24, 0xaf, 0xfd, 0xc6 ],
+ [ 0x8f, 0x09, 0x52, 0x0a, 0xd1, 0x49, 0xaf, 0x7e ],
+ [ 0x9a, 0x2f, 0x29, 0x9d, 0x55, 0x13, 0xf3, 0x1c ],
+ [ 0x12, 0x1f, 0xf4, 0xa2, 0xdd, 0x30, 0x4a, 0xc4 ],
+ [ 0xd0, 0x1e, 0xa7, 0x43, 0x89, 0xe9, 0xfa, 0x36 ],
+ [ 0xe6, 0xbc, 0xf0, 0x73, 0x4c, 0xb3, 0x8f, 0x31 ],
+ [ 0x80, 0xe9, 0xa7, 0x70, 0x36, 0xbf, 0x7a, 0xa2 ],
+ [ 0x75, 0x6d, 0x3c, 0x24, 0xdb, 0xc0, 0xbc, 0xb4 ],
+ [ 0x13, 0x15, 0xb7, 0xfd, 0x52, 0xd8, 0xf8, 0x23 ],
+ [ 0x08, 0x8a, 0x7d, 0xa6, 0x4d, 0x5f, 0x03, 0x8f ],
+ [ 0x48, 0xf1, 0xe8, 0xb7, 0xe5, 0xd0, 0x9c, 0xd8 ],
+ [ 0xee, 0x44, 0xa6, 0xf7, 0xbc, 0xe6, 0xf4, 0xf6 ],
+ [ 0xf2, 0x37, 0x18, 0x0f, 0xd8, 0x9a, 0xc5, 0xae ],
+ [ 0xe0, 0x94, 0x66, 0x4b, 0x15, 0xf6, 0xb2, 0xc3 ],
+ [ 0xa8, 0xb3, 0xbb, 0xb7, 0x62, 0x90, 0x19, 0x9d ]
+ ];
+
+ let k0 = 0x_07_06_05_04_03_02_01_00;
+ let k1 = 0x_0f_0e_0d_0c_0b_0a_09_08;
+ let mut buf = Vec::new();
+ let mut t = 0;
+ let mut state_inc = SipHasher13::new_with_keys(k0, k1);
+
+ while t < 64 {
+ let vec = u8to64_le!(vecs[t], 0);
+ let out = hash_with(SipHasher13::new_with_keys(k0, k1), &Bytes(&buf));
+ assert_eq!(vec, out);
+
+ let full = hash_with(SipHasher13::new_with_keys(k0, k1), &Bytes(&buf));
+ let i = state_inc.finish();
+
+ assert_eq!(full, i);
+ assert_eq!(full, vec);
+
+ buf.push(t as u8);
+ Hasher::write(&mut state_inc, &[t as u8]);
+
+ t += 1;
+ }
+}
+
+#[test]
+#[allow(unused_must_use)]
+fn test_siphash_2_4() {
let vecs : [[u8; 8]; 64] = [
[ 0x31, 0x0e, 0x0e, 0xdd, 0x47, 0xdb, 0x6f, 0x72, ],
[ 0xfd, 0x67, 0xdc, 0x93, 0xc5, 0x39, 0xf8, 0x74, ],
let k1 = 0x_0f_0e_0d_0c_0b_0a_09_08;
let mut buf = Vec::new();
let mut t = 0;
- let mut state_inc = SipHasher::new_with_keys(k0, k1);
+ let mut state_inc = SipHasher24::new_with_keys(k0, k1);
while t < 64 {
let vec = u8to64_le!(vecs[t], 0);
- let out = hash_with_keys(k0, k1, &Bytes(&buf));
+ let out = hash_with(SipHasher24::new_with_keys(k0, k1), &Bytes(&buf));
assert_eq!(vec, out);
- let full = hash_with_keys(k0, k1, &Bytes(&buf));
+ let full = hash_with(SipHasher24::new_with_keys(k0, k1), &Bytes(&buf));
let i = state_inc.finish();
assert_eq!(full, i);
t += 1;
}
}
-
#[test] #[cfg(target_arch = "arm")]
fn test_hash_usize() {
let val = 0xdeadbeef_deadbeef_u64;
let k1 = black_box(0x1);
let k2 = black_box(0x2);
b.iter(|| {
- hash_with_keys(k1, k2, &u)
+ hash_with(SipHasher::new_with_keys(k1, k2), &u)
});
b.bytes = 8;
}
fn bench_bytes_4(b: &mut Bencher) {
let data = black_box([b' '; 4]);
b.iter(|| {
- hash_bytes(&data)
+ hash_bytes(SipHasher::default(), &data)
});
b.bytes = 4;
}
fn bench_bytes_7(b: &mut Bencher) {
let data = black_box([b' '; 7]);
b.iter(|| {
- hash_bytes(&data)
+ hash_bytes(SipHasher::default(), &data)
});
b.bytes = 7;
}
fn bench_bytes_8(b: &mut Bencher) {
let data = black_box([b' '; 8]);
b.iter(|| {
- hash_bytes(&data)
+ hash_bytes(SipHasher::default(), &data)
});
b.bytes = 8;
}
fn bench_bytes_a_16(b: &mut Bencher) {
let data = black_box([b' '; 16]);
b.iter(|| {
- hash_bytes(&data)
+ hash_bytes(SipHasher::default(), &data)
});
b.bytes = 16;
}
fn bench_bytes_b_32(b: &mut Bencher) {
let data = black_box([b' '; 32]);
b.iter(|| {
- hash_bytes(&data)
+ hash_bytes(SipHasher::default(), &data)
});
b.bytes = 32;
}
fn bench_bytes_c_128(b: &mut Bencher) {
let data = black_box([b' '; 128]);
b.iter(|| {
- hash_bytes(&data)
+ hash_bytes(SipHasher::default(), &data)
});
b.bytes = 128;
}
#![feature(nonzero)]
#![feature(rand)]
#![feature(raw)]
+#![feature(sip_hash_13)]
#![feature(slice_patterns)]
#![feature(step_by)]
#![feature(test)]
/// the buffer is full, this may swap.)
#[inline]
pub fn enqueue(&self, message: DepMessage) {
- debug!("enqueue: {:?} tasks_pushed={}", message, self.tasks_pushed.get());
-
// Regardless of whether dep graph construction is enabled, we
// still want to check that we always have a valid task on the
// stack when a read/write/etc event occurs.
err.note("only the last field of a struct or enum variant \
may have a dynamically sized type");
}
+ ObligationCauseCode::ConstSized => {
+ err.note("constant expressions must have a statically known size");
+ }
ObligationCauseCode::SharedStatic => {
err.note("shared static variables must have a type that implements `Sync`");
}
// Types of fields (other than the last) in a struct must be sized.
FieldSized,
+ // Constant expressions must be sized.
+ ConstSized,
+
// static items must have `Sync` type
SharedStatic,
use std::ops::{Index, IndexMut, Range};
use std::fmt;
use std::vec;
+use std::u32;
use rustc_serialize as serialize;
fn index(self) -> usize { self }
}
+impl Idx for u32 {
+ fn new(idx: usize) -> Self { assert!(idx <= u32::MAX as usize); idx as u32 }
+ fn index(self) -> usize { self as usize }
+}
+
#[derive(Clone)]
pub struct IndexVec<I: Idx, T> {
pub raw: Vec<T>,
///
/// This CAN be done in a snapshot
pub fn register_obligation(&mut self, obligation: O) {
- self.register_obligation_at(obligation, None)
+ // Ignore errors here - there is no guarantee of success.
+ let _ = self.register_obligation_at(obligation, None);
}
- fn register_obligation_at(&mut self, obligation: O, parent: Option<NodeIndex>) {
- if self.done_cache.contains(obligation.as_predicate()) { return }
+ // returns Err(()) if we already know this obligation failed.
+ fn register_obligation_at(&mut self, obligation: O, parent: Option<NodeIndex>)
+ -> Result<(), ()>
+ {
+ if self.done_cache.contains(obligation.as_predicate()) {
+ return Ok(())
+ }
match self.waiting_cache.entry(obligation.as_predicate().clone()) {
Entry::Occupied(o) => {
self.nodes[o.get().get()].dependents.push(parent);
}
}
+ if let NodeState::Error = self.nodes[o.get().get()].state.get() {
+ Err(())
+ } else {
+ Ok(())
+ }
}
Entry::Vacant(v) => {
debug!("register_obligation_at({:?}, {:?}) - ok",
v.insert(NodeIndex::new(self.nodes.len()));
self.cache_list.push(obligation.as_predicate().clone());
self.nodes.push(Node::new(parent, obligation));
+ Ok(())
}
- };
+ }
}
/// Convert all remaining obligations to the given error.
Ok(Some(children)) => {
// if we saw a Some(_) result, we are not (yet) stalled
stalled = false;
+ self.nodes[index].state.set(NodeState::Success);
+
for child in children {
- self.register_obligation_at(child,
- Some(NodeIndex::new(index)));
+ let st = self.register_obligation_at(
+ child,
+ Some(NodeIndex::new(index))
+ );
+ if let Err(()) = st {
+ // error already reported - propagate it
+ // to our node.
+ self.error_at(index);
+ }
}
-
- self.nodes[index].state.set(NodeState::Success);
}
Err(err) => {
let backtrace = self.error_at(index);
let errors = forest.to_errors(());
assert_eq!(errors.len(), 0);
}
+
+#[test]
+fn simultaneous_register_and_error() {
+ // check that registering a failed obligation works correctly
+ let mut forest = ObligationForest::new();
+ forest.register_obligation("A");
+ forest.register_obligation("B");
+
+ let Outcome { completed: ok, errors: err, .. } =
+ forest.process_obligations(&mut C(|obligation| {
+ match *obligation {
+ "A" => Err("An error"),
+ "B" => Ok(Some(vec!["A"])),
+ _ => unreachable!(),
+ }
+ }, |_|{}));
+ assert_eq!(ok.len(), 0);
+ assert_eq!(err, vec![super::Error {
+ error: "An error",
+ backtrace: vec!["A"]
+ }]);
+
+ let mut forest = ObligationForest::new();
+ forest.register_obligation("B");
+ forest.register_obligation("A");
+
+ let Outcome { completed: ok, errors: err, .. } =
+ forest.process_obligations(&mut C(|obligation| {
+ match *obligation {
+ "A" => Err("An error"),
+ "B" => Ok(Some(vec!["A"])),
+ _ => unreachable!(),
+ }
+ }, |_|{}));
+ assert_eq!(ok.len(), 0);
+ assert_eq!(err, vec![super::Error {
+ error: "An error",
+ backtrace: vec!["A"]
+ }]);
+}
rustc_back = { path = "../librustc_back" }
rustc_bitflags = { path = "../librustc_bitflags" }
rustc_const_math = { path = "../librustc_const_math" }
+rustc_data_structures = { path = "../librustc_data_structures" }
rustc_errors = { path = "../librustc_errors" }
rustc_llvm = { path = "../librustc_llvm" }
serialize = { path = "../libserialize" }
struct DecodeContext<'a, 'b, 'tcx: 'a> {
tcx: TyCtxt<'a, 'tcx, 'tcx>,
- cdata: &'b cstore::crate_metadata,
+ cdata: &'b cstore::CrateMetadata,
from_id_range: IdRange,
to_id_range: IdRange,
// Cache the last used filemap for translating spans as an optimization.
/// Decodes an item from its AST in the cdata's metadata and adds it to the
/// ast-map.
-pub fn decode_inlined_item<'a, 'tcx>(cdata: &cstore::crate_metadata,
+pub fn decode_inlined_item<'a, 'tcx>(cdata: &cstore::CrateMetadata,
tcx: TyCtxt<'a, 'tcx, 'tcx>,
parent_def_path: ast_map::DefPath,
parent_did: DefId,
trait def_id_decoder_helpers {
fn read_def_id(&mut self, dcx: &DecodeContext) -> DefId;
fn read_def_id_nodcx(&mut self,
- cdata: &cstore::crate_metadata) -> DefId;
+ cdata: &cstore::CrateMetadata) -> DefId;
}
impl<D:serialize::Decoder> def_id_decoder_helpers for D
}
fn read_def_id_nodcx(&mut self,
- cdata: &cstore::crate_metadata)
+ cdata: &cstore::CrateMetadata)
-> DefId {
let did: DefId = Decodable::decode(self).unwrap();
decoder::translate_def_id(cdata, did)
// Versions of the type reading functions that don't need the full
// DecodeContext.
fn read_ty_nodcx<'a>(&mut self, tcx: TyCtxt<'a, 'tcx, 'tcx>,
- cdata: &cstore::crate_metadata) -> Ty<'tcx>;
+ cdata: &cstore::CrateMetadata) -> Ty<'tcx>;
fn read_tys_nodcx<'a>(&mut self, tcx: TyCtxt<'a, 'tcx, 'tcx>,
- cdata: &cstore::crate_metadata) -> Vec<Ty<'tcx>>;
+ cdata: &cstore::CrateMetadata) -> Vec<Ty<'tcx>>;
fn read_substs_nodcx<'a>(&mut self, tcx: TyCtxt<'a, 'tcx, 'tcx>,
- cdata: &cstore::crate_metadata)
+ cdata: &cstore::CrateMetadata)
-> subst::Substs<'tcx>;
}
impl<'a, 'tcx> rbml_decoder_decoder_helpers<'tcx> for reader::Decoder<'a> {
fn read_ty_nodcx<'b>(&mut self, tcx: TyCtxt<'b, 'tcx, 'tcx>,
- cdata: &cstore::crate_metadata)
+ cdata: &cstore::CrateMetadata)
-> Ty<'tcx> {
self.read_opaque(|_, doc| {
Ok(
}
fn read_tys_nodcx<'b>(&mut self, tcx: TyCtxt<'b, 'tcx, 'tcx>,
- cdata: &cstore::crate_metadata) -> Vec<Ty<'tcx>> {
+ cdata: &cstore::CrateMetadata) -> Vec<Ty<'tcx>> {
self.read_to_vec(|this| Ok(this.read_ty_nodcx(tcx, cdata)) )
.unwrap()
.into_iter()
}
fn read_substs_nodcx<'b>(&mut self, tcx: TyCtxt<'b, 'tcx, 'tcx>,
- cdata: &cstore::crate_metadata)
+ cdata: &cstore::CrateMetadata)
-> subst::Substs<'tcx>
{
self.read_opaque(|_, doc| {
}
pub const tag_panic_strategy: usize = 0x114;
+
+// NB: increment this if you change the format of metadata such that
+// the rustc version can't be found.
+pub const metadata_encoding_version: &'static [u8] = &[b'r', b'u', b's', b't', 0, 0, 0, 2];
//! Validates all used crates and extern libraries and loads their metadata
-use common::rustc_version;
use cstore::{self, CStore, CrateSource, MetadataBlob};
use decoder;
use loader::{self, CratePaths};
use rustc::session::config::PanicStrategy;
use rustc::session::search_paths::PathKind;
use rustc::middle::cstore::{CrateStore, validate_crate_name, ExternCrate};
-use rustc::util::nodemap::FnvHashMap;
+use rustc::util::nodemap::{FnvHashMap, FnvHashSet};
use rustc::hir::map as hir_map;
use std::cell::{RefCell, Cell};
}
enum PMDSource {
- Registered(Rc<cstore::crate_metadata>),
+ Registered(Rc<cstore::CrateMetadata>),
Owned(MetadataBlob),
}
return ret;
}
- fn verify_rustc_version(&self,
- name: &str,
- span: Span,
- metadata: &MetadataBlob) {
- let crate_rustc_version = decoder::crate_rustc_version(metadata.as_slice());
- if crate_rustc_version != Some(rustc_version()) {
- let mut err = struct_span_fatal!(self.sess, span, E0514,
- "the crate `{}` has been compiled with {}, which is \
- incompatible with this version of rustc",
- name,
- crate_rustc_version
- .as_ref().map(|s| &**s)
- .unwrap_or("an old version of rustc"));
- err.help("consider removing the compiled binaries and recompiling \
- with your current version of rustc");
- err.emit();
- }
- }
-
fn verify_no_symbol_conflicts(&self,
span: Span,
metadata: &MetadataBlob) {
span: Span,
lib: loader::Library,
explicitly_linked: bool)
- -> (ast::CrateNum, Rc<cstore::crate_metadata>,
+ -> (ast::CrateNum, Rc<cstore::CrateMetadata>,
cstore::CrateSource) {
- self.verify_rustc_version(name, span, &lib.metadata);
self.verify_no_symbol_conflicts(span, &lib.metadata);
// Claim this crate number and cache it
let loader::Library { dylib, rlib, metadata } = lib;
- let cnum_map = self.resolve_crate_deps(root, metadata.as_slice(), span);
+ let cnum_map = self.resolve_crate_deps(root, metadata.as_slice(), cnum, span);
let staged_api = self.is_staged_api(metadata.as_slice());
- let cmeta = Rc::new(cstore::crate_metadata {
+ let cmeta = Rc::new(cstore::CrateMetadata {
name: name.to_string(),
extern_crate: Cell::new(None),
index: decoder::load_index(metadata.as_slice()),
span: Span,
kind: PathKind,
explicitly_linked: bool)
- -> (ast::CrateNum, Rc<cstore::crate_metadata>, cstore::CrateSource) {
+ -> (ast::CrateNum, Rc<cstore::CrateMetadata>, cstore::CrateSource) {
let result = match self.existing_match(name, hash, kind) {
Some(cnum) => LoadResult::Previous(cnum),
None => {
rejected_via_hash: vec!(),
rejected_via_triple: vec!(),
rejected_via_kind: vec!(),
+ rejected_via_version: vec!(),
should_match_name: true,
};
match self.load(&mut load_ctxt) {
fn update_extern_crate(&mut self,
cnum: ast::CrateNum,
- mut extern_crate: ExternCrate)
+ mut extern_crate: ExternCrate,
+ visited: &mut FnvHashSet<(ast::CrateNum, bool)>)
{
+ if !visited.insert((cnum, extern_crate.direct)) { return }
+
let cmeta = self.cstore.get_crate_data(cnum);
let old_extern_crate = cmeta.extern_crate.get();
}
cmeta.extern_crate.set(Some(extern_crate));
-
// Propagate the extern crate info to dependencies.
extern_crate.direct = false;
- for &dep_cnum in cmeta.cnum_map.borrow().values() {
- self.update_extern_crate(dep_cnum, extern_crate);
+ for &dep_cnum in cmeta.cnum_map.borrow().iter() {
+ self.update_extern_crate(dep_cnum, extern_crate, visited);
}
}
fn resolve_crate_deps(&mut self,
root: &Option<CratePaths>,
cdata: &[u8],
- span : Span)
- -> cstore::cnum_map {
+ krate: ast::CrateNum,
+ span: Span)
+ -> cstore::CrateNumMap {
debug!("resolving deps of external crate");
// The map from crate numbers in the crate we're resolving to local crate
// numbers
- decoder::get_crate_deps(cdata).iter().map(|dep| {
+ let map: FnvHashMap<_, _> = decoder::get_crate_deps(cdata).iter().map(|dep| {
debug!("resolving dep crate {} hash: `{}`", dep.name, dep.hash);
let (local_cnum, _, _) = self.resolve_crate(root,
&dep.name,
PathKind::Dependency,
dep.explicitly_linked);
(dep.cnum, local_cnum)
- }).collect()
+ }).collect();
+
+ let max_cnum = map.values().cloned().max().unwrap_or(0);
+
+ // We map 0 and all other holes in the map to our parent crate. The "additional"
+ // self-dependencies should be harmless.
+ (0..max_cnum+1).map(|cnum| map.get(&cnum).cloned().unwrap_or(krate)).collect()
}
fn read_extension_crate(&mut self, span: Span, info: &CrateInfo) -> ExtensionCrate {
rejected_via_hash: vec!(),
rejected_via_triple: vec!(),
rejected_via_kind: vec!(),
+ rejected_via_version: vec!(),
should_match_name: true,
};
let library = self.load(&mut load_ctxt).or_else(|| {
fn inject_dependency_if(&self,
krate: ast::CrateNum,
what: &str,
- needs_dep: &Fn(&cstore::crate_metadata) -> bool) {
+ needs_dep: &Fn(&cstore::CrateMetadata) -> bool) {
// don't perform this validation if the session has errors, as one of
// those errors may indicate a circular dependency which could cause
// this to stack overflow.
// Before we inject any dependencies, make sure we don't inject a
// circular dependency by validating that this crate doesn't
// transitively depend on any crates satisfying `needs_dep`.
- validate(self, krate, krate, what, needs_dep);
+ for dep in self.cstore.crate_dependencies_in_rpo(krate) {
+ let data = self.cstore.get_crate_data(dep);
+ if needs_dep(&data) {
+ self.sess.err(&format!("the crate `{}` cannot depend \
+ on a crate that needs {}, but \
+ it depends on `{}`",
+ self.cstore.get_crate_data(krate).name(),
+ what,
+ data.name()));
+ }
+ }
// All crates satisfying `needs_dep` do not explicitly depend on the
// crate provided for this compile, but in order for this compilation to
}
info!("injecting a dep from {} to {}", cnum, krate);
- let mut cnum_map = data.cnum_map.borrow_mut();
- let remote_cnum = cnum_map.len() + 1;
- let prev = cnum_map.insert(remote_cnum as ast::CrateNum, krate);
- assert!(prev.is_none());
+ data.cnum_map.borrow_mut().push(krate);
});
-
- fn validate(me: &CrateReader,
- krate: ast::CrateNum,
- root: ast::CrateNum,
- what: &str,
- needs_dep: &Fn(&cstore::crate_metadata) -> bool) {
- let data = me.cstore.get_crate_data(krate);
- if needs_dep(&data) {
- let krate_name = data.name();
- let data = me.cstore.get_crate_data(root);
- let root_name = data.name();
- me.sess.err(&format!("the crate `{}` cannot depend \
- on a crate that needs {}, but \
- it depends on `{}`", root_name, what,
- krate_name));
- }
-
- for (_, &dep) in data.cnum_map.borrow().iter() {
- validate(me, dep, root, what, needs_dep);
- }
- }
}
}
span: i.span,
direct: true,
path_len: len,
- });
+ },
+ &mut FnvHashSet());
self.cstore.add_extern_mod_stmt_cnum(info.id, cnum);
}
}
// except according to those terms.
use cstore;
+use common;
use decoder;
use encoder;
use loader;
fn metadata_encoding_version(&self) -> &[u8]
{
- encoder::metadata_encoding_version
+ common::metadata_encoding_version
}
/// Returns a map from a sufficiently visible external item (i.e. an external item that is
pub use self::MetadataBlob::*;
+use common;
use creader;
use decoder;
use index;
use rustc::hir::svh::Svh;
use rustc::middle::cstore::{ExternCrate};
use rustc::session::config::PanicStrategy;
+use rustc_data_structures::indexed_vec::IndexVec;
use rustc::util::nodemap::{FnvHashMap, NodeMap, NodeSet, DefIdMap};
use std::cell::{RefCell, Ref, Cell};
// local crate numbers (as generated during this session). Each external
// crate may refer to types in other external crates, and each has their
// own crate numbers.
-pub type cnum_map = FnvHashMap<ast::CrateNum, ast::CrateNum>;
+pub type CrateNumMap = IndexVec<ast::CrateNum, ast::CrateNum>;
pub enum MetadataBlob {
MetadataVec(Bytes),
pub translated_filemap: Rc<syntax_pos::FileMap>
}
-pub struct crate_metadata {
+pub struct CrateMetadata {
pub name: String,
/// Information about the extern crate that caused this crate to
pub extern_crate: Cell<Option<ExternCrate>>,
pub data: MetadataBlob,
- pub cnum_map: RefCell<cnum_map>,
+ pub cnum_map: RefCell<CrateNumMap>,
pub cnum: ast::CrateNum,
pub codemap_import_info: RefCell<Vec<ImportedFileMap>>,
pub staged_api: bool,
pub struct CStore {
pub dep_graph: DepGraph,
- metas: RefCell<FnvHashMap<ast::CrateNum, Rc<crate_metadata>>>,
+ metas: RefCell<FnvHashMap<ast::CrateNum, Rc<CrateMetadata>>>,
/// Map from NodeId's of local extern crate statements to crate numbers
extern_mod_crate_map: RefCell<NodeMap<ast::CrateNum>>,
used_crate_sources: RefCell<Vec<CrateSource>>,
self.metas.borrow().len() as ast::CrateNum + 1
}
- pub fn get_crate_data(&self, cnum: ast::CrateNum) -> Rc<crate_metadata> {
+ pub fn get_crate_data(&self, cnum: ast::CrateNum) -> Rc<CrateMetadata> {
self.metas.borrow().get(&cnum).unwrap().clone()
}
decoder::get_crate_hash(cdata.data())
}
- pub fn set_crate_data(&self, cnum: ast::CrateNum, data: Rc<crate_metadata>) {
+ pub fn set_crate_data(&self, cnum: ast::CrateNum, data: Rc<CrateMetadata>) {
self.metas.borrow_mut().insert(cnum, data);
}
pub fn iter_crate_data<I>(&self, mut i: I) where
- I: FnMut(ast::CrateNum, &Rc<crate_metadata>),
+ I: FnMut(ast::CrateNum, &Rc<CrateMetadata>),
{
for (&k, v) in self.metas.borrow().iter() {
i(k, v);
/// Like `iter_crate_data`, but passes source paths (if available) as well.
pub fn iter_crate_data_origins<I>(&self, mut i: I) where
- I: FnMut(ast::CrateNum, &crate_metadata, Option<CrateSource>),
+ I: FnMut(ast::CrateNum, &CrateMetadata, Option<CrateSource>),
{
for (&k, v) in self.metas.borrow().iter() {
let origin = self.opt_used_crate_source(k);
self.statically_included_foreign_items.borrow_mut().clear();
}
+ pub fn crate_dependencies_in_rpo(&self, krate: ast::CrateNum) -> Vec<ast::CrateNum>
+ {
+ let mut ordering = Vec::new();
+ self.push_dependencies_in_postorder(&mut ordering, krate);
+ ordering.reverse();
+ ordering
+ }
+
+ pub fn push_dependencies_in_postorder(&self,
+ ordering: &mut Vec<ast::CrateNum>,
+ krate: ast::CrateNum)
+ {
+ if ordering.contains(&krate) { return }
+
+ let data = self.get_crate_data(krate);
+ for &dep in data.cnum_map.borrow().iter() {
+ if dep != krate {
+ self.push_dependencies_in_postorder(ordering, dep);
+ }
+ }
+
+ ordering.push(krate);
+ }
+
// This method is used when generating the command line to pass through to
// system linker. The linker expects undefined symbols on the left of the
// command line to be defined in libraries on the right, not the other way
pub fn do_get_used_crates(&self, prefer: LinkagePreference)
-> Vec<(ast::CrateNum, Option<PathBuf>)> {
let mut ordering = Vec::new();
- fn visit(cstore: &CStore, cnum: ast::CrateNum,
- ordering: &mut Vec<ast::CrateNum>) {
- if ordering.contains(&cnum) { return }
- let meta = cstore.get_crate_data(cnum);
- for (_, &dep) in meta.cnum_map.borrow().iter() {
- visit(cstore, dep, ordering);
- }
- ordering.push(cnum);
- }
for (&num, _) in self.metas.borrow().iter() {
- visit(self, num, &mut ordering);
+ self.push_dependencies_in_postorder(&mut ordering, num);
}
info!("topological ordering: {:?}", ordering);
ordering.reverse();
}
}
-impl crate_metadata {
+impl CrateMetadata {
pub fn data<'a>(&'a self) -> &'a [u8] { self.data.as_slice() }
pub fn name(&self) -> &str { decoder::get_crate_name(self.data()) }
pub fn hash(&self) -> Svh { decoder::get_crate_hash(self.data()) }
}
impl MetadataBlob {
- pub fn as_slice<'a>(&'a self) -> &'a [u8] {
- let slice = match *self {
+ pub fn as_slice_raw<'a>(&'a self) -> &'a [u8] {
+ match *self {
MetadataVec(ref vec) => &vec[..],
MetadataArchive(ref ar) => ar.as_slice(),
- };
- if slice.len() < 4 {
+ }
+ }
+
+ pub fn as_slice<'a>(&'a self) -> &'a [u8] {
+ let slice = self.as_slice_raw();
+ let len_offset = 4 + common::metadata_encoding_version.len();
+ if slice.len() < len_offset+4 {
&[] // corrupt metadata
} else {
- let len = (((slice[0] as u32) << 24) |
- ((slice[1] as u32) << 16) |
- ((slice[2] as u32) << 8) |
- ((slice[3] as u32) << 0)) as usize;
- if len + 4 <= slice.len() {
- &slice[4.. len + 4]
+ let len = (((slice[len_offset+0] as u32) << 24) |
+ ((slice[len_offset+1] as u32) << 16) |
+ ((slice[len_offset+2] as u32) << 8) |
+ ((slice[len_offset+3] as u32) << 0)) as usize;
+ if len <= slice.len() - 4 - len_offset {
+ &slice[len_offset + 4..len_offset + len + 4]
} else {
&[] // corrupt or old metadata
}
use self::Family::*;
use astencode::decode_inlined_item;
-use cstore::{self, crate_metadata};
+use cstore::{self, CrateMetadata};
use common::*;
use def_key;
use encoder::def_to_u64;
use rustc::hir;
use rustc::session::config::PanicStrategy;
-use middle::cstore::{LOCAL_CRATE, FoundAst, InlinedItem, LinkagePreference};
+use middle::cstore::{FoundAst, InlinedItem, LinkagePreference};
use middle::cstore::{DefLike, DlDef, DlField, DlImpl, tls};
use rustc::hir::def::Def;
use rustc::hir::def_id::{DefId, DefIndex};
use syntax::ptr::P;
use syntax_pos::{self, Span, BytePos, NO_EXPANSION};
-pub type Cmd<'a> = &'a crate_metadata;
+pub type Cmd<'a> = &'a CrateMetadata;
-impl crate_metadata {
+impl CrateMetadata {
fn get_item(&self, item_id: DefIndex) -> Option<rbml::Doc> {
self.index.lookup_item(self.data(), item_id).map(|pos| {
reader::doc_at(self.data(), pos as usize).unwrap().doc
mut get_crate_data: G,
mut callback: F) where
F: FnMut(DefLike, ast::Name, ty::Visibility),
- G: FnMut(ast::CrateNum) -> Rc<crate_metadata>,
+ G: FnMut(ast::CrateNum) -> Rc<CrateMetadata>,
{
// Iterate over all children.
for child_info_doc in reader::tagged_docs(item_doc, tag_mod_child) {
get_crate_data: G,
callback: F) where
F: FnMut(DefLike, ast::Name, ty::Visibility),
- G: FnMut(ast::CrateNum) -> Rc<crate_metadata>,
+ G: FnMut(ast::CrateNum) -> Rc<CrateMetadata>,
{
// Find the item.
let item_doc = match cdata.get_item(id) {
get_crate_data: G,
callback: F) where
F: FnMut(DefLike, ast::Name, ty::Visibility),
- G: FnMut(ast::CrateNum) -> Rc<crate_metadata>,
+ G: FnMut(ast::CrateNum) -> Rc<CrateMetadata>,
{
let root_doc = rbml::Doc::new(cdata.data());
let misc_info_doc = reader::get_doc(root_doc, tag_misc_info);
return DefId { krate: cdata.cnum, index: did.index };
}
- match cdata.cnum_map.borrow().get(&did.krate) {
- Some(&n) => {
- DefId {
- krate: n,
- index: did.index,
- }
- }
- None => bug!("didn't find a crate in the cnum_map")
+ DefId {
+ krate: cdata.cnum_map.borrow()[did.krate],
+ index: did.index
}
}
// Translate a DefId from the current compilation environment to a DefId
// for an external crate.
fn reverse_translate_def_id(cdata: Cmd, did: DefId) -> Option<DefId> {
- if did.krate == cdata.cnum {
- return Some(DefId { krate: LOCAL_CRATE, index: did.index });
- }
-
- for (&local, &global) in cdata.cnum_map.borrow().iter() {
+ for (local, &global) in cdata.cnum_map.borrow().iter_enumerated() {
if global == did.krate {
return Some(DefId { krate: local, index: did.index });
}
let cnum = spec.split(':').nth(0).unwrap();
let link = spec.split(':').nth(1).unwrap();
let cnum: ast::CrateNum = cnum.parse().unwrap();
- let cnum = match cdata.cnum_map.borrow().get(&cnum) {
- Some(&n) => n,
- None => bug!("didn't find a crate in the cnum_map")
- };
+ let cnum = cdata.cnum_map.borrow()[cnum];
result.push((cnum, if link == "d" {
LinkagePreference::RequireDynamic
} else {
rbml_w.start_tag(tag_items_data_item);
encode_def_id_and_key(ecx, rbml_w, def_id);
+ encode_name(rbml_w, syntax::parse::token::intern("<closure>"));
rbml_w.start_tag(tag_items_closure_ty);
write_closure_type(ecx, rbml_w, &ecx.tcx.tables.borrow().closure_tys[&def_id]);
fn encode_crate_deps(rbml_w: &mut Encoder, cstore: &cstore::CStore) {
fn get_ordered_deps(cstore: &cstore::CStore)
- -> Vec<(CrateNum, Rc<cstore::crate_metadata>)> {
+ -> Vec<(CrateNum, Rc<cstore::CrateMetadata>)> {
// Pull the cnums and name,vers,hash out of cstore
let mut deps = Vec::new();
cstore.iter_crate_data(|cnum, val| {
}
fn encode_crate_dep(rbml_w: &mut Encoder,
- dep: &cstore::crate_metadata) {
+ dep: &cstore::CrateMetadata) {
rbml_w.start_tag(tag_crate_dep);
rbml_w.wr_tagged_str(tag_crate_dep_crate_name, &dep.name());
let hash = decoder::get_crate_hash(dep.data());
}
}
-// NB: Increment this as you change the metadata encoding version.
-#[allow(non_upper_case_globals)]
-pub const metadata_encoding_version : &'static [u8] = &[b'r', b'u', b's', b't', 0, 0, 0, 2 ];
-
pub fn encode_metadata(ecx: EncodeContext, krate: &hir::Crate) -> Vec<u8> {
let mut wr = Cursor::new(Vec::new());
// the length of the metadata to the start of the metadata. Later on this
// will allow us to slice the metadata to the precise length that we just
// generated regardless of trailing bytes that end up in it.
- let len = v.len() as u32;
- v.insert(0, (len >> 0) as u8);
- v.insert(0, (len >> 8) as u8);
- v.insert(0, (len >> 16) as u8);
- v.insert(0, (len >> 24) as u8);
- return v;
+ //
+ // We also need to store the metadata encoding version here, because
+ // rlibs don't have it. To get older versions of rustc to ignore
+ // this metadata, there are 4 zero bytes at the start, which are
+ // treated as a length of 0 by old compilers.
+
+ let len = v.len();
+ let mut result = vec![];
+ result.push(0);
+ result.push(0);
+ result.push(0);
+ result.push(0);
+ result.extend(metadata_encoding_version.iter().cloned());
+ result.push((len >> 24) as u8);
+ result.push((len >> 16) as u8);
+ result.push((len >> 8) as u8);
+ result.push((len >> 0) as u8);
+ result.extend(v);
+ result
}
fn encode_metadata_inner(rbml_w: &mut Encoder,
#[macro_use]
extern crate rustc;
+extern crate rustc_data_structures;
extern crate rustc_back;
extern crate rustc_llvm;
extern crate rustc_const_math;
//! metadata::loader or metadata::creader for all the juicy details!
use cstore::{MetadataBlob, MetadataVec, MetadataArchive};
+use common::{metadata_encoding_version, rustc_version};
use decoder;
-use encoder;
use rustc::hir::svh::Svh;
use rustc::session::Session;
pub rejected_via_hash: Vec<CrateMismatch>,
pub rejected_via_triple: Vec<CrateMismatch>,
pub rejected_via_kind: Vec<CrateMismatch>,
+ pub rejected_via_version: Vec<CrateMismatch>,
pub should_match_name: bool,
}
struct_span_err!(self.sess, self.span, E0462,
"found staticlib `{}` instead of rlib or dylib{}",
self.ident, add)
+ } else if !self.rejected_via_version.is_empty() {
+ struct_span_err!(self.sess, self.span, E0514,
+ "found crate `{}` compiled by an incompatible version of rustc{}",
+ self.ident, add)
} else {
struct_span_err!(self.sess, self.span, E0463,
"can't find crate for `{}`{}",
}
}
if !self.rejected_via_hash.is_empty() {
- err.note("perhaps this crate needs to be recompiled?");
+ err.note("perhaps that crate needs to be recompiled?");
let mismatches = self.rejected_via_hash.iter();
for (i, &CrateMismatch{ ref path, .. }) in mismatches.enumerate() {
err.note(&format!("crate `{}` path #{}: {}",
}
}
if !self.rejected_via_kind.is_empty() {
- err.help("please recompile this crate using --crate-type lib");
+ err.help("please recompile that crate using --crate-type lib");
let mismatches = self.rejected_via_kind.iter();
for (i, &CrateMismatch { ref path, .. }) in mismatches.enumerate() {
err.note(&format!("crate `{}` path #{}: {}",
self.ident, i+1, path.display()));
}
}
+ if !self.rejected_via_version.is_empty() {
+ err.help(&format!("please recompile that crate using this compiler ({})",
+ rustc_version()));
+ let mismatches = self.rejected_via_version.iter();
+ for (i, &CrateMismatch { ref path, ref got }) in mismatches.enumerate() {
+ err.note(&format!("crate `{}` path #{}: {} compiled by {:?}",
+ self.ident, i+1, path.display(), got));
+ }
+ }
err.emit();
self.sess.abort_if_errors();
}
fn crate_matches(&mut self, crate_data: &[u8], libpath: &Path) -> Option<Svh> {
+ let crate_rustc_version = decoder::crate_rustc_version(crate_data);
+ if crate_rustc_version != Some(rustc_version()) {
+ let message = crate_rustc_version.unwrap_or_else(|| "an unknown compiler".to_string());
+ info!("Rejecting via version: expected {}, got {}", rustc_version(), message);
+ self.rejected_via_version.push(CrateMismatch {
+ path: libpath.to_path_buf(),
+ got: message
+ });
+ return None;
+ }
+
if self.should_match_name {
match decoder::maybe_get_crate_name(crate_data) {
Some(ref name) if self.crate_name == *name => {}
pub fn as_slice<'a>(&'a self) -> &'a [u8] { unsafe { &*self.data } }
}
+fn verify_decompressed_encoding_version(blob: &MetadataBlob, filename: &Path)
+ -> Result<(), String>
+{
+ let data = blob.as_slice_raw();
+ if data.len() < 4+metadata_encoding_version.len() ||
+ !<[u8]>::eq(&data[..4], &[0, 0, 0, 0]) ||
+ &data[4..4+metadata_encoding_version.len()] != metadata_encoding_version
+ {
+ Err(format!("incompatible metadata version found: '{}'",
+ filename.display()))
+ } else {
+ Ok(())
+ }
+}
+
// Just a small wrapper to time how long reading metadata takes.
fn get_metadata_section(target: &Target, flavor: CrateFlavor, filename: &Path)
-> Result<MetadataBlob, String> {
return match ArchiveMetadata::new(archive).map(|ar| MetadataArchive(ar)) {
None => Err(format!("failed to read rlib metadata: '{}'",
filename.display())),
- Some(blob) => Ok(blob)
+ Some(blob) => {
+ try!(verify_decompressed_encoding_version(&blob, filename));
+ Ok(blob)
+ }
};
}
unsafe {
let cbuf = llvm::LLVMGetSectionContents(si.llsi);
let csz = llvm::LLVMGetSectionSize(si.llsi) as usize;
let cvbuf: *const u8 = cbuf as *const u8;
- let vlen = encoder::metadata_encoding_version.len();
+ let vlen = metadata_encoding_version.len();
debug!("checking {} bytes of metadata-version stamp",
vlen);
let minsz = cmp::min(vlen, csz);
let buf0 = slice::from_raw_parts(cvbuf, minsz);
- let version_ok = buf0 == encoder::metadata_encoding_version;
+ let version_ok = buf0 == metadata_encoding_version;
if !version_ok {
return Err((format!("incompatible metadata version found: '{}'",
filename.display())));
csz - vlen);
let bytes = slice::from_raw_parts(cvbuf1, csz - vlen);
match flate::inflate_bytes(bytes) {
- Ok(inflated) => return Ok(MetadataVec(inflated)),
+ Ok(inflated) => {
+ let blob = MetadataVec(inflated);
+ try!(verify_decompressed_encoding_version(&blob, filename));
+ return Ok(blob);
+ }
Err(_) => {}
}
}
return r;
}
-pub fn get_linker(sess: &Session) -> (String, Command) {
+// The third parameter is an extra path to add to PATH for MSVC cross
+// linkers, so that they can find their host toolchain DLL dependencies.
+pub fn get_linker(sess: &Session) -> (String, Command, Option<PathBuf>) {
if let Some(ref linker) = sess.opts.cg.linker {
- (linker.clone(), Command::new(linker))
+ (linker.clone(), Command::new(linker), None)
} else if sess.target.target.options.is_like_msvc {
- ("link.exe".to_string(), msvc::link_exe_cmd(sess))
+ let (cmd, host) = msvc::link_exe_cmd(sess);
+ ("link.exe".to_string(), cmd, host)
} else {
(sess.target.target.options.linker.clone(),
- Command::new(&sess.target.target.options.linker))
+ Command::new(&sess.target.target.options.linker), None)
}
}
})
}
-fn command_path(sess: &Session) -> OsString {
+fn command_path(sess: &Session, extra: Option<PathBuf>) -> OsString {
// The compiler's sysroot often has some bundled tools, so add it to the
// PATH for the child.
let mut new_path = sess.host_filesearch(PathKind::All)
if let Some(path) = env::var_os("PATH") {
new_path.extend(env::split_paths(&path));
}
- if sess.target.target.options.is_like_msvc {
- new_path.extend(msvc::host_dll_path());
- }
+ new_path.extend(extra);
env::join_paths(new_path).unwrap()
}
src: input.map(|p| p.to_path_buf()),
lib_search_paths: archive_search_paths(sess),
ar_prog: get_ar_prog(sess),
- command_path: command_path(sess),
+ command_path: command_path(sess, None),
}
}
info!("preparing {:?} from {:?} to {:?}", crate_type, objects, out_filename);
// The invocations of cc share some flags across platforms
- let (pname, mut cmd) = get_linker(sess);
- cmd.env("PATH", command_path(sess));
+ let (pname, mut cmd, extra) = get_linker(sess);
+ cmd.env("PATH", command_path(sess, extra));
let root = sess.target_filesearch(PathKind::Native).get_lib_path();
cmd.args(&sess.target.target.options.pre_link_args);
info!("linker stdout:\n{}", escape_string(&prog.stdout[..]));
},
Err(e) => {
- // Trying to diagnose https://github.com/rust-lang/rust/issues/33844
sess.struct_err(&format!("could not exec the linker `{}`: {}", pname, e))
.note(&format!("{:?}", &cmd))
.emit();
+ if sess.target.target.options.is_like_msvc && e.kind() == io::ErrorKind::NotFound {
+ sess.note_without_error("the msvc targets depend on the msvc linker \
+ but `link.exe` was not found");
+ sess.note_without_error("please ensure that VS 2013 or VS 2015 was installed \
+ with the Visual C++ option");
+ }
sess.abort_if_errors();
}
}
--- /dev/null
+// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+#![allow(non_camel_case_types, non_snake_case)]
+
+use libc::c_void;
+use std::mem;
+
+type DWORD = u32;
+type WORD = u16;
+type LPVOID = *mut c_void;
+type DWORD_PTR = usize;
+
+const PROCESSOR_ARCHITECTURE_INTEL: WORD = 0;
+const PROCESSOR_ARCHITECTURE_AMD64: WORD = 9;
+
+#[repr(C)]
+struct SYSTEM_INFO {
+ wProcessorArchitecture: WORD,
+ _wReserved: WORD,
+ _dwPageSize: DWORD,
+ _lpMinimumApplicationAddress: LPVOID,
+ _lpMaximumApplicationAddress: LPVOID,
+ _dwActiveProcessorMask: DWORD_PTR,
+ _dwNumberOfProcessors: DWORD,
+ _dwProcessorType: DWORD,
+ _dwAllocationGranularity: DWORD,
+ _wProcessorLevel: WORD,
+ _wProcessorRevision: WORD,
+}
+
+extern "system" {
+ fn GetNativeSystemInfo(lpSystemInfo: *mut SYSTEM_INFO);
+}
+
+pub enum Arch {
+ X86,
+ Amd64,
+}
+
+pub fn host_arch() -> Option<Arch> {
+ let mut info = unsafe { mem::zeroed() };
+ unsafe { GetNativeSystemInfo(&mut info) };
+ match info.wProcessorArchitecture {
+ PROCESSOR_ARCHITECTURE_INTEL => Some(Arch::X86),
+ PROCESSOR_ARCHITECTURE_AMD64 => Some(Arch::Amd64),
+ _ => None,
+ }
+}
//! paths/files is based on Microsoft's logic in their vcvars bat files, but
//! comments can also be found below leading through the various code paths.
+// A simple macro to make this option mess easier to read
+macro_rules! otry {
+ ($expr:expr) => (match $expr {
+ Some(val) => val,
+ None => return None,
+ })
+}
+
#[cfg(windows)]
mod registry;
+#[cfg(windows)]
+mod arch;
#[cfg(windows)]
mod platform {
use std::path::{Path, PathBuf};
use std::process::Command;
use session::Session;
- use super::registry::{LOCAL_MACHINE};
+ use super::arch::{host_arch, Arch};
+ use super::registry::LOCAL_MACHINE;
- // Cross toolchains depend on dlls from the host toolchain
- // We can't just add it to the Command's PATH in `link_exe_cmd` because it
- // is later overridden so we publicly expose it here instead
- pub fn host_dll_path() -> Option<PathBuf> {
- get_vc_dir().and_then(|(_, vcdir)| {
- host_dll_subdir().map(|sub| {
- vcdir.join("bin").join(sub)
+ // First we need to figure out whether the environment is already correctly
+ // configured by vcvars. We do this by looking at the environment variable
+ // `VCINSTALLDIR` which is always set by vcvars, and unlikely to be set
+ // otherwise. If it is defined, then we find `link.exe` in `PATH` and trust
+ // that everything else is configured correctly.
+ //
+ // If `VCINSTALLDIR` wasn't defined (or we couldn't find the linker where
+ // it claimed it should be), then we resort to finding everything
+ // ourselves. First we find where the latest version of MSVC is installed
+ // and what version it is. Then based on the version we find the
+ // appropriate SDKs.
+ //
+ // If despite our best efforts we are still unable to find MSVC then we
+ // just blindly call `link.exe` and hope for the best.
+ //
+ // This code only supports VC 11 through 15. For versions older than that
+ // the user will need to manually execute the appropriate vcvars bat file
+ // and it should hopefully work.
+ //
+ // The second member of the tuple we return is the directory for the host
+ // linker toolchain, which is necessary when using the cross linkers.
+ pub fn link_exe_cmd(sess: &Session) -> (Command, Option<PathBuf>) {
+ let arch = &sess.target.target.arch;
+ env::var_os("VCINSTALLDIR").and_then(|_| {
+ debug!("Detected that vcvars was already run.");
+ let path = otry!(env::var_os("PATH"));
+ // MinGW has its own `link` which is not the linker we want, so we
+ // look for `cl.exe` too as a precaution.
+ env::split_paths(&path).find(|path| {
+ path.join("cl.exe").is_file()
+ && path.join("link.exe").is_file()
+ }).map(|path| {
+ (Command::new(path.join("link.exe")), None)
})
+ }).or_else(|| {
+ None.or_else(|| {
+ find_msvc_latest(arch, "15.0")
+ }).or_else(|| {
+ find_msvc_latest(arch, "14.0")
+ }).or_else(|| {
+ find_msvc_12(arch)
+ }).or_else(|| {
+ find_msvc_11(arch)
+ }).map(|(cmd, path)| (cmd, Some(path)))
+ }).unwrap_or_else(|| {
+ debug!("Failed to locate linker.");
+ (Command::new("link.exe"), None)
})
}
- pub fn link_exe_cmd(sess: &Session) -> Command {
- let arch = &sess.target.target.arch;
- let (binsub, libsub, vclibsub) =
- match (bin_subdir(arch), lib_subdir(arch), vc_lib_subdir(arch)) {
- (Some(x), Some(y), Some(z)) => (x, y, z),
- _ => return Command::new("link.exe"),
- };
+ // For MSVC 14 or newer we need to find the Universal CRT as well as either
+ // the Windows 10 SDK or Windows 8.1 SDK.
+ fn find_msvc_latest(arch: &str, ver: &str) -> Option<(Command, PathBuf)> {
+ let vcdir = otry!(get_vc_dir(ver));
+ let (mut cmd, host) = otry!(get_linker(&vcdir, arch));
+ let sub = otry!(lib_subdir(arch));
+ let ucrt = otry!(get_ucrt_dir());
+ debug!("Found Universal CRT {:?}", ucrt);
+ add_lib(&mut cmd, &ucrt.join("ucrt").join(sub));
+ if let Some(dir) = get_sdk10_dir() {
+ debug!("Found Win10 SDK {:?}", dir);
+ add_lib(&mut cmd, &dir.join("um").join(sub));
+ } else if let Some(dir) = get_sdk81_dir() {
+ debug!("Found Win8.1 SDK {:?}", dir);
+ add_lib(&mut cmd, &dir.join("um").join(sub));
+ } else {
+ return None
+ }
+ Some((cmd, host))
+ }
- // First we need to figure out whether the environment is already correctly
- // configured by vcvars. We do this by looking at the environment variable
- // `VCINSTALLDIR` which is always set by vcvars, and unlikely to be set
- // otherwise. If it is defined, then we derive the path to `link.exe` from
- // that and trust that everything else is configured correctly.
- //
- // If `VCINSTALLDIR` wasn't defined (or we couldn't find the linker where it
- // claimed it should be), then we resort to finding everything ourselves.
- // First we find where the latest version of MSVC is installed and what
- // version it is. Then based on the version we find the appropriate SDKs.
- //
- // For MSVC 14 (VS 2015) we look for the Win10 SDK and failing that we look
- // for the Win8.1 SDK. We also look for the Universal CRT.
- //
- // For MSVC 12 (VS 2013) we look for the Win8.1 SDK.
- //
- // For MSVC 11 (VS 2012) we look for the Win8 SDK.
- //
- // For all other versions the user has to execute the appropriate vcvars bat
- // file themselves to configure the environment.
- //
- // If despite our best efforts we are still unable to find MSVC then we just
- // blindly call `link.exe` and hope for the best.
- return env::var_os("VCINSTALLDIR").and_then(|dir| {
- debug!("Environment already configured by user. Assuming it works.");
- let mut p = PathBuf::from(dir);
- p.push("bin");
- p.push(binsub);
- p.push("link.exe");
- if !p.is_file() { return None }
- Some(Command::new(p))
- }).or_else(|| {
- get_vc_dir().and_then(|(ver, vcdir)| {
- debug!("Found VC installation directory {:?}", vcdir);
- let linker = vcdir.clone().join("bin").join(binsub).join("link.exe");
- if !linker.is_file() { return None }
- let mut cmd = Command::new(linker);
- add_lib(&mut cmd, &vcdir.join("lib").join(vclibsub));
- if ver == "14.0" {
- if let Some(dir) = get_ucrt_dir() {
- debug!("Found Universal CRT {:?}", dir);
- add_lib(&mut cmd, &dir.join("ucrt").join(libsub));
- }
- if let Some(dir) = get_sdk10_dir() {
- debug!("Found Win10 SDK {:?}", dir);
- add_lib(&mut cmd, &dir.join("um").join(libsub));
- } else if let Some(dir) = get_sdk81_dir() {
- debug!("Found Win8.1 SDK {:?}", dir);
- add_lib(&mut cmd, &dir.join("um").join(libsub));
- }
- } else if ver == "12.0" {
- if let Some(dir) = get_sdk81_dir() {
- debug!("Found Win8.1 SDK {:?}", dir);
- add_lib(&mut cmd, &dir.join("um").join(libsub));
- }
- } else { // ver == "11.0"
- if let Some(dir) = get_sdk8_dir() {
- debug!("Found Win8 SDK {:?}", dir);
- add_lib(&mut cmd, &dir.join("um").join(libsub));
- }
- }
- Some(cmd)
- })
- }).unwrap_or_else(|| {
- debug!("Failed to locate linker.");
- Command::new("link.exe")
- });
+ // For MSVC 12 we need to find the Windows 8.1 SDK.
+ fn find_msvc_12(arch: &str) -> Option<(Command, PathBuf)> {
+ let vcdir = otry!(get_vc_dir("12.0"));
+ let (mut cmd, host) = otry!(get_linker(&vcdir, arch));
+ let sub = otry!(lib_subdir(arch));
+ let sdk81 = otry!(get_sdk81_dir());
+ debug!("Found Win8.1 SDK {:?}", sdk81);
+ add_lib(&mut cmd, &sdk81.join("um").join(sub));
+ Some((cmd, host))
+ }
+
+ // For MSVC 11 we need to find the Windows 8 SDK.
+ fn find_msvc_11(arch: &str) -> Option<(Command, PathBuf)> {
+ let vcdir = otry!(get_vc_dir("11.0"));
+ let (mut cmd, host) = otry!(get_linker(&vcdir, arch));
+ let sub = otry!(lib_subdir(arch));
+ let sdk8 = otry!(get_sdk8_dir());
+ debug!("Found Win8 SDK {:?}", sdk8);
+ add_lib(&mut cmd, &sdk8.join("um").join(sub));
+ Some((cmd, host))
}
- // A convenience function to make the above code simpler
+
+ // A convenience function to append library paths.
fn add_lib(cmd: &mut Command, lib: &Path) {
let mut arg: OsString = "/LIBPATH:".into();
arg.push(lib);
cmd.arg(arg);
}
- // To find MSVC we look in a specific registry key for the newest of the
- // three versions that we support.
- fn get_vc_dir() -> Option<(&'static str, PathBuf)> {
- LOCAL_MACHINE.open(r"SOFTWARE\Microsoft\VisualStudio\SxS\VC7".as_ref())
- .ok().and_then(|key| {
- ["14.0", "12.0", "11.0"].iter().filter_map(|ver| {
- key.query_str(ver).ok().map(|p| (*ver, p.into()))
- }).next()
- })
+ // Given a possible MSVC installation directory, we look for the linker and
+ // then add the MSVC library path.
+ fn get_linker(path: &Path, arch: &str) -> Option<(Command, PathBuf)> {
+ debug!("Looking for linker in {:?}", path);
+ bin_subdir(arch).into_iter().map(|(sub, host)| {
+ (path.join("bin").join(sub).join("link.exe"),
+ path.join("bin").join(host))
+ }).filter(|&(ref path, _)| {
+ path.is_file()
+ }).map(|(path, host)| {
+ (Command::new(path), host)
+ }).filter_map(|(mut cmd, host)| {
+ let sub = otry!(vc_lib_subdir(arch));
+ add_lib(&mut cmd, &path.join("lib").join(sub));
+ Some((cmd, host))
+ }).next()
+ }
+
+ // To find MSVC we look in a specific registry key for the version we are
+ // trying to find.
+ fn get_vc_dir(ver: &str) -> Option<PathBuf> {
+ let key = otry!(LOCAL_MACHINE
+ .open(r"SOFTWARE\Microsoft\VisualStudio\SxS\VC7".as_ref()).ok());
+ let path = otry!(key.query_str(ver).ok());
+ Some(path.into())
}
// To find the Universal CRT we look in a specific registry key for where
// all the Universal CRTs are located and then sort them asciibetically to
// find the newest version. While this sort of sorting isn't ideal, it is
// what vcvars does so that's good enough for us.
fn get_ucrt_dir() -> Option<PathBuf> {
- LOCAL_MACHINE.open(r"SOFTWARE\Microsoft\Windows Kits\Installed Roots".as_ref())
- .ok().and_then(|key| {
- key.query_str("KitsRoot10").ok()
- }).and_then(|root| {
- fs::read_dir(Path::new(&root).join("Lib")).ok()
- }).and_then(|readdir| {
- let mut dirs: Vec<_> = readdir.filter_map(|dir| {
- dir.ok()
- }).map(|dir| {
- dir.path()
- }).filter(|dir| {
- dir.components().last().and_then(|c| {
- c.as_os_str().to_str()
- }).map(|c| c.starts_with("10.")).unwrap_or(false)
- }).collect();
- dirs.sort();
- dirs.pop()
- })
+ let key = otry!(LOCAL_MACHINE
+ .open(r"SOFTWARE\Microsoft\Windows Kits\Installed Roots".as_ref()).ok());
+ let root = otry!(key.query_str("KitsRoot10").ok());
+ let readdir = otry!(fs::read_dir(Path::new(&root).join("lib")).ok());
+ readdir.filter_map(|dir| {
+ dir.ok()
+ }).map(|dir| {
+ dir.path()
+ }).filter(|dir| {
+ dir.components().last().and_then(|c| {
+ c.as_os_str().to_str()
+ }).map(|c| {
+ c.starts_with("10.") && dir.join("ucrt").is_dir()
+ }).unwrap_or(false)
+ }).max()
}
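
The "asciibetical" pick above can be sketched on its own. This is an illustrative standalone function, not rustc's; the directory names are made up. The final assertion shows why the comment calls this sorting "not ideal": lexicographic order is not version order, so a hypothetical `10.0.9999.0` sorts above `10.0.10240.0`.

```rust
// Standalone sketch of the "asciibetical" version selection described above.
// `newest_ucrt` and the directory names are illustrative, not rustc's API.
fn newest_ucrt(dirs: &[&str]) -> Option<String> {
    dirs.iter()
        .filter(|d| d.starts_with("10."))
        .max() // plain lexicographic ordering on &str, as vcvars does
        .map(|d| d.to_string())
}

fn main() {
    let dirs = ["10.0.10150.0", "10.0.10240.0", "8.1"];
    assert_eq!(newest_ucrt(&dirs), Some("10.0.10240.0".to_string()));
    // The caveat: lexicographic order is not numeric version order.
    assert_eq!(newest_ucrt(&["10.0.10240.0", "10.0.9999.0"]),
               Some("10.0.9999.0".to_string()));
}
```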
// Vcvars finds the correct version of the Windows 10 SDK by looking
- // for the include um/Windows.h because sometimes a given version will
+ // for the include `um\Windows.h` because sometimes a given version will
// only have UCRT bits without the rest of the SDK. Since we only care about
- // libraries and not includes, we just look for the folder `um` in the lib
- // section. Like we do for the Universal CRT, we sort the possibilities
+ // libraries and not includes, we instead look for `um\x64\kernel32.lib`.
+ // Since the 32-bit and 64-bit libraries are always installed together we
+ // only need to bother checking x64, making this code a tiny bit simpler.
+ // Like we do for the Universal CRT, we sort the possibilities
// asciibetically to find the newest one as that is what vcvars does.
fn get_sdk10_dir() -> Option<PathBuf> {
- LOCAL_MACHINE.open(r"SOFTWARE\Microsoft\Microsoft SDKs\Windows\v10.0".as_ref())
- .ok().and_then(|key| {
- key.query_str("InstallationFolder").ok()
- }).and_then(|root| {
- fs::read_dir(Path::new(&root).join("lib")).ok()
- }).and_then(|readdir| {
- let mut dirs: Vec<_> = readdir.filter_map(|dir| dir.ok())
- .map(|dir| dir.path()).collect();
- dirs.sort();
- dirs.into_iter().rev().filter(|dir| {
- dir.join("um").is_dir()
- }).next()
- })
+ let key = otry!(LOCAL_MACHINE
+ .open(r"SOFTWARE\Microsoft\Microsoft SDKs\Windows\v10.0".as_ref()).ok());
+ let root = otry!(key.query_str("InstallationFolder").ok());
+ let readdir = otry!(fs::read_dir(Path::new(&root).join("lib")).ok());
+ let mut dirs: Vec<_> = readdir.filter_map(|dir| dir.ok())
+ .map(|dir| dir.path()).collect();
+ dirs.sort();
+ dirs.into_iter().rev().filter(|dir| {
+ dir.join("um").join("x64").join("kernel32.lib").is_file()
+ }).next()
}
// Interestingly there are several subdirectories, `win7` `win8` and
// `winv6.3`. Vcvars seems to only care about `winv6.3` though, so the same
// applies to us. Note that if we were targeting kernel mode drivers
// instead of user mode applications, we would care.
fn get_sdk81_dir() -> Option<PathBuf> {
- LOCAL_MACHINE.open(r"SOFTWARE\Microsoft\Microsoft SDKs\Windows\v8.1".as_ref())
- .ok().and_then(|key| {
- key.query_str("InstallationFolder").ok()
- }).map(|root| {
- Path::new(&root).join("lib").join("winv6.3")
- })
+ let key = otry!(LOCAL_MACHINE
+ .open(r"SOFTWARE\Microsoft\Microsoft SDKs\Windows\v8.1".as_ref()).ok());
+ let root = otry!(key.query_str("InstallationFolder").ok());
+ Some(Path::new(&root).join("lib").join("winv6.3"))
}
fn get_sdk8_dir() -> Option<PathBuf> {
- LOCAL_MACHINE.open(r"SOFTWARE\Microsoft\Microsoft SDKs\Windows\v8.0".as_ref())
- .ok().and_then(|key| {
- key.query_str("InstallationFolder").ok()
- }).map(|root| {
- Path::new(&root).join("lib").join("win8")
- })
+ let key = otry!(LOCAL_MACHINE
+ .open(r"SOFTWARE\Microsoft\Microsoft SDKs\Windows\v8.0".as_ref()).ok());
+ let root = otry!(key.query_str("InstallationFolder").ok());
+ Some(Path::new(&root).join("lib").join("win8"))
}
// When choosing the linker toolchain to use, we have to choose one that can
// run on the host itself. This matters in the case where someone on 32-bit
// Windows is trying to cross compile to 64-bit and would otherwise invoke
// the native 64-bit linker, which won't work.
//
- // FIXME - This currently functions based on the host architecture of rustc
- // itself but it should instead detect the bitness of the OS itself.
+ // For the return value of this function, the first member of the tuple is
+ // the folder of the linker we will be invoking, while the second member
+ // is the folder of the host toolchain for that linker which is essential
+ // when using a cross linker. We return a Vec since on x64 there are often
+ // two linkers that can target the architecture we desire. The 64-bit host
+ // linker is preferred, and hence first, due to 64-bit allowing it more
+ // address space to work with and potentially being faster.
//
// FIXME - Figure out what happens when the host architecture is arm.
- //
- // FIXME - Some versions of MSVC may not come with all these toolchains.
- // Consider returning an array of toolchains and trying them one at a time
- // until the linker is found.
- fn bin_subdir(arch: &str) -> Option<&'static str> {
- if cfg!(target_arch = "x86_64") {
- match arch {
- "x86" => Some("amd64_x86"),
- "x86_64" => Some("amd64"),
- "arm" => Some("amd64_arm"),
- _ => None,
- }
- } else if cfg!(target_arch = "x86") {
- match arch {
- "x86" => Some(""),
- "x86_64" => Some("x86_amd64"),
- "arm" => Some("x86_arm"),
- _ => None,
- }
- } else { None }
+ fn bin_subdir(arch: &str) -> Vec<(&'static str, &'static str)> {
+ match (arch, host_arch()) {
+ ("x86", Some(Arch::X86)) => vec![("", "")],
+ ("x86", Some(Arch::Amd64)) => vec![("amd64_x86", "amd64"), ("", "")],
+ ("x86_64", Some(Arch::X86)) => vec![("x86_amd64", "")],
+ ("x86_64", Some(Arch::Amd64)) => vec![("amd64", "amd64"), ("x86_amd64", "")],
+ ("arm", Some(Arch::X86)) => vec![("x86_arm", "")],
+ ("arm", Some(Arch::Amd64)) => vec![("amd64_arm", "amd64"), ("x86_arm", "")],
+ _ => vec![],
+ }
}
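
As a rough standalone illustration of the candidate ordering above, here is the same table with the host architecture passed in explicitly. In the real code the host is detected via a `host_arch()` helper; `Arch` here is a local stand-in for that enum.

```rust
// Sketch of bin_subdir's (linker subdir, host toolchain subdir) candidates.
// `Arch` mirrors the enum assumed by the surrounding code; this is an
// illustration, not the compiler's actual helper.
#[derive(Clone, Copy)]
enum Arch { X86, Amd64 }

fn bin_subdir(arch: &str, host: Arch) -> Vec<(&'static str, &'static str)> {
    match (arch, host) {
        ("x86", Arch::X86) => vec![("", "")],
        ("x86", Arch::Amd64) => vec![("amd64_x86", "amd64"), ("", "")],
        ("x86_64", Arch::X86) => vec![("x86_amd64", "")],
        ("x86_64", Arch::Amd64) => vec![("amd64", "amd64"), ("x86_amd64", "")],
        ("arm", Arch::X86) => vec![("x86_arm", "")],
        ("arm", Arch::Amd64) => vec![("amd64_arm", "amd64"), ("x86_arm", "")],
        _ => vec![],
    }
}

fn main() {
    // On a 64-bit host targeting x86_64, the native amd64 linker is tried
    // first, then the x86-hosted cross linker as a fallback.
    assert_eq!(bin_subdir("x86_64", Arch::Amd64),
               vec![("amd64", "amd64"), ("x86_amd64", "")]);
    assert_eq!(bin_subdir("x86", Arch::X86), vec![("", "")]);
    assert!(bin_subdir("mips", Arch::Amd64).is_empty());
}
```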
+
fn lib_subdir(arch: &str) -> Option<&'static str> {
match arch {
"x86" => Some("x86"),
_ => None,
}
}
+
// MSVC's x86 libraries are not in a subfolder
fn vc_lib_subdir(arch: &str) -> Option<&'static str> {
match arch {
        "x86" => Some(""),
        "x86_64" => Some("amd64"),
        "arm" => Some("arm"),
        _ => None,
}
}
- fn host_dll_subdir() -> Option<&'static str> {
- if cfg!(target_arch = "x86_64") { Some("amd64") }
- else if cfg!(target_arch = "x86") { Some("") }
- else { None }
- }
}
// If we're not on Windows, then there's no registry to search through and MSVC
// wouldn't be able to run, so we just call `link.exe` and hope for the best.
use std::path::PathBuf;
use std::process::Command;
use session::Session;
- pub fn link_exe_cmd(_sess: &Session) -> Command {
- Command::new("link.exe")
+ pub fn link_exe_cmd(_sess: &Session) -> (Command, Option<PathBuf>) {
+ (Command::new("link.exe"), None)
}
- pub fn host_dll_path() -> Option<PathBuf> { None }
}
+
pub use self::platform::*;
}
pub fn run_assembler(sess: &Session, outputs: &OutputFilenames) {
- let (pname, mut cmd) = get_linker(sess);
+ let (pname, mut cmd, _) = get_linker(sess);
cmd.arg("-c").arg("-o").arg(&outputs.path(OutputType::Object))
.arg(&outputs.temp_path(OutputType::Assembly));
register_long_diagnostics! {
+E0510: r##"
+`return_address` was used in an invalid context. Erroneous code example:
+
+```ignore
+#![feature(intrinsics)]
+
+extern "rust-intrinsic" {
+ fn return_address() -> *const u8;
+}
+
+unsafe fn by_value() -> i32 {
+ let _ = return_address();
+ // error: invalid use of `return_address` intrinsic: function does
+ // not use out pointer
+ 0
+}
+```
+
+Return values may be stored in return registers or written into a so-called
+out pointer. If the returned value is too big to fit into the return
+registers (the threshold is target-ABI-dependent and generally neither
+portable nor future proof), the compiler will return the value by writing it
+into space allocated in the caller's stack frame. Example:
+
+```
+#![feature(intrinsics)]
+
+extern "rust-intrinsic" {
+ fn return_address() -> *const u8;
+}
+
+unsafe fn by_pointer() -> String {
+ let _ = return_address();
+ String::new() // ok!
+}
+```
+"##,
+
E0511: r##"
Invalid monomorphization of an intrinsic function was used. Erroneous code
example:
},
+
+ (_, "return_address") => {
+ if !fcx.fn_ty.ret.is_indirect() {
+ span_err!(tcx.sess, span, E0510,
+ "invalid use of `return_address` intrinsic: function \
+ does not use out pointer");
+ C_null(Type::i8p(ccx))
+ } else {
+ PointerCast(bcx, llvm::get_param(fcx.llfn, 0), Type::i8p(ccx))
+ }
+ }
+
(_, "discriminant_value") => {
let val_ty = substs.types.get(FnSpace, 0);
match val_ty.sty {
impl<'tcx> fmt::Display for Instance<'tcx> {
fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
- ppaux::parameterized(f, &self.substs, self.def, ppaux::Ns::Value, &[],
- |tcx| Some(tcx.lookup_item_type(self.def).generics))
+ ppaux::parameterized(f, &self.substs, self.def, ppaux::Ns::Value, &[], |_| None)
}
}
"fadd_fast" | "fsub_fast" | "fmul_fast" | "fdiv_fast" | "frem_fast" =>
(1, vec![param(ccx, 0), param(ccx, 0)], param(ccx, 0)),
+ "return_address" => (0, vec![], tcx.mk_imm_ptr(tcx.types.u8)),
+
"assume" => (0, vec![tcx.types.bool], tcx.mk_nil()),
"discriminant_value" => (1, vec![
let rty = ccx.tcx.node_id_to_type(id);
let fcx = FnCtxt::new(&inh, ty::FnConverging(rty), e.id);
let declty = fcx.tcx.lookup_item_type(ccx.tcx.map.local_def_id(id)).ty;
+ fcx.require_type_is_sized(declty, e.span, traits::ConstSized);
fcx.check_const_with_ty(sp, e, declty);
});
}
use borrow::Borrow;
use cmp::max;
use fmt::{self, Debug};
-use hash::{Hash, SipHasher, BuildHasher};
+use hash::{Hash, Hasher, BuildHasher, SipHasher13};
use iter::FromIterator;
use mem::{self, replace};
use ops::{Deref, Index};
#[stable(feature = "hashmap_build_hasher", since = "1.7.0")]
impl BuildHasher for RandomState {
- type Hasher = SipHasher;
+ type Hasher = DefaultHasher;
#[inline]
- fn build_hasher(&self) -> SipHasher {
- SipHasher::new_with_keys(self.k0, self.k1)
+ fn build_hasher(&self) -> DefaultHasher {
+ DefaultHasher(SipHasher13::new_with_keys(self.k0, self.k1))
+ }
+}
+
+/// The default `Hasher` used by `RandomState`.
+///
+/// The internal algorithm is not specified, and so it and its hashes should
+/// not be relied upon over releases.
+#[unstable(feature = "hashmap_default_hasher", issue = "0")]
+pub struct DefaultHasher(SipHasher13);
+
+#[unstable(feature = "hashmap_default_hasher", issue = "0")]
+impl Hasher for DefaultHasher {
+ #[inline]
+ fn write(&mut self, msg: &[u8]) {
+ self.0.write(msg)
+ }
+
+ #[inline]
+ fn finish(&self) -> u64 {
+ self.0.finish()
}
}
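
The `DefaultHasher` newtype above simply forwards `write` and `finish` to an inner hasher. Since `SipHasher13` is unstable, this sketch wires up the same `Hasher`/`BuildHasher` plumbing with a toy FNV-1a in its place; the hash algorithm is a stand-in, only the forwarding pattern mirrors the patch.

```rust
use std::collections::HashMap;
use std::hash::{BuildHasherDefault, Hasher};

// Toy FNV-1a standing in for SipHasher13. Only the shape matters here:
// a newtype implementing Hasher, usable as a HashMap's hash state.
struct Fnv1a(u64);

impl Default for Fnv1a {
    fn default() -> Fnv1a { Fnv1a(0xcbf29ce484222325) }
}

impl Hasher for Fnv1a {
    fn write(&mut self, bytes: &[u8]) {
        for &b in bytes {
            self.0 ^= b as u64;
            self.0 = self.0.wrapping_mul(0x100000001b3);
        }
    }
    fn finish(&self) -> u64 { self.0 }
}

fn main() {
    // BuildHasherDefault gives us a BuildHasher for any Default + Hasher type,
    // playing the role RandomState plays for DefaultHasher in the patch.
    let mut map: HashMap<&str, i32, BuildHasherDefault<Fnv1a>> =
        HashMap::default();
    map.insert("one", 1);
    map.insert("two", 2);
    assert_eq!(map.get("one"), Some(&1));
    assert_eq!(map.len(), 2);
}
```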
#![feature(reflect_marker)]
#![feature(rustc_attrs)]
#![feature(shared)]
+#![feature(sip_hash_13)]
#![feature(slice_bytes)]
#![feature(slice_concat_ext)]
#![feature(slice_patterns)]
/// some other type (e.g. a string) just for it to be converted back to
/// `SocketAddr` in constructor methods is pointless.
///
+/// Addresses returned by the operating system that are not IP addresses are
+/// silently ignored.
+///
/// Some examples:
///
/// ```no_run
fn resolve_socket_addr(s: &str, p: u16) -> io::Result<vec::IntoIter<SocketAddr>> {
let ips = lookup_host(s)?;
- let v: Vec<_> = ips.map(|a| {
- a.map(|mut a| {
- a.set_port(p);
- a
- })
- }).collect()?;
+ let v: Vec<_> = ips.map(|mut a| { a.set_port(p); a }).collect();
Ok(v.into_iter())
}
addresses",
issue = "27705")]
impl Iterator for LookupHost {
- type Item = io::Result<SocketAddr>;
- fn next(&mut self) -> Option<io::Result<SocketAddr>> { self.0.next() }
+ type Item = SocketAddr;
+ fn next(&mut self) -> Option<SocketAddr> { self.0.next() }
}
/// Resolve the host specified by `host` as a number of `SocketAddr` instances.
/// This method may perform a DNS query to resolve `host` and may also inspect
/// system configuration to resolve the specified hostname.
///
+/// The returned iterator will skip over any unknown addresses returned by the
+/// operating system.
+///
/// # Examples
///
/// ```no_run
///
/// # fn foo() -> std::io::Result<()> {
/// for host in try!(net::lookup_host("rust-lang.org")) {
-/// println!("found address: {}", try!(host));
+/// println!("found address: {}", host);
/// }
/// # Ok(())
/// # }
self._push(path.as_ref())
}
- #[allow(deprecated)]
fn _push(&mut self, path: &Path) {
// in general, a separator is needed if the rightmost byte is not a separator
let mut need_sep = self.as_mut_vec().last().map(|c| !is_sep_byte(*c)).unwrap_or(false);
}
impl Iterator for LookupHost {
- type Item = io::Result<SocketAddr>;
- fn next(&mut self) -> Option<io::Result<SocketAddr>> {
- unsafe {
- if self.cur.is_null() { return None }
- let ret = sockaddr_to_addr(mem::transmute((*self.cur).ai_addr),
- (*self.cur).ai_addrlen as usize);
- self.cur = (*self.cur).ai_next as *mut c::addrinfo;
- Some(ret)
+ type Item = SocketAddr;
+ fn next(&mut self) -> Option<SocketAddr> {
+ loop {
+ unsafe {
+ let cur = match self.cur.as_ref() {
+ None => return None,
+ Some(c) => c,
+ };
+ self.cur = cur.ai_next;
+ match sockaddr_to_addr(mem::transmute(cur.ai_addr),
+ cur.ai_addrlen as usize)
+ {
+ Ok(addr) => return Some(addr),
+ Err(_) => continue,
+ }
+ }
}
}
}
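
The rewritten `next` is the standard skip-on-error shape: advance the cursor first, then yield only entries that convert cleanly. Outside of `getaddrinfo`, the same contract can be shown with `filter_map`; the input data here is made up for illustration.

```rust
use std::net::SocketAddr;

// Illustration of the new iterator contract: conversion errors on
// individual entries are silently skipped, so callers see plain
// SocketAddr values rather than io::Result<SocketAddr>.
fn usable(results: Vec<Result<SocketAddr, String>>) -> Vec<SocketAddr> {
    results.into_iter().filter_map(|r| r.ok()).collect()
}

fn main() {
    let parsed: Vec<Result<SocketAddr, String>> = vec![
        "127.0.0.1:80".parse().map_err(|_| "bad".to_string()),
        Err("not an IP address".to_string()),
        "10.0.0.1:443".parse().map_err(|_| "bad".to_string()),
    ];
    let addrs = usable(parsed);
    assert_eq!(addrs.len(), 2);
    assert_eq!(addrs[0].port(), 80);
}
```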
static USED_ATTRS: RefCell<Vec<u64>> = RefCell::new(Vec::new())
}
+enum AttrError {
+ MultipleItem(InternedString),
+ UnknownMetaItem(InternedString),
+ MissingSince,
+ MissingFeature,
+ MultipleStabilityLevels,
+}
+
+fn handle_errors(diag: &Handler, span: Span, error: AttrError) {
+ match error {
+ AttrError::MultipleItem(item) => span_err!(diag, span, E0538,
+ "multiple '{}' items", item),
+ AttrError::UnknownMetaItem(item) => span_err!(diag, span, E0541,
+ "unknown meta item '{}'", item),
+ AttrError::MissingSince => span_err!(diag, span, E0542, "missing 'since'"),
+ AttrError::MissingFeature => span_err!(diag, span, E0546, "missing 'feature'"),
+ AttrError::MultipleStabilityLevels => span_err!(diag, span, E0544,
+ "multiple stability levels"),
+ }
+}
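
The refactor above funnels repeated ad-hoc `span_err` calls through a single enum with one formatting function. A miniature of the pattern, with `String` standing in for `InternedString` and a returned message standing in for the diagnostic `Handler`:

```rust
// Miniature of the AttrError pattern: each error condition becomes an enum
// variant carrying its data, and one function owns the message formatting.
// Types here are stand-ins for illustration, not rustc's.
enum AttrError {
    MultipleItem(String),
    UnknownMetaItem(String),
    MissingSince,
}

fn render(error: &AttrError) -> String {
    match error {
        AttrError::MultipleItem(item) =>
            format!("E0538: multiple '{}' items", item),
        AttrError::UnknownMetaItem(item) =>
            format!("E0541: unknown meta item '{}'", item),
        AttrError::MissingSince =>
            "E0542: missing 'since'".to_string(),
    }
}

fn main() {
    assert_eq!(render(&AttrError::MissingSince), "E0542: missing 'since'");
    assert!(render(&AttrError::UnknownMetaItem("reason".into()))
        .contains("unknown meta item 'reason'"));
}
```

Centralizing the variants this way keeps error-code assignment (E0538, E0541, ...) in one place instead of scattered across call sites.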
+
pub fn mark_used(attr: &Attribute) {
let AttrId(id) = attr.node.id;
USED_ATTRS.with(|slot| {
if let s@Some(_) = attr.value_str() {
s
} else {
- diag.struct_span_err(attr.span,
- "export_name attribute has invalid format")
- .help("use #[export_name=\"*\"]")
- .emit();
+ struct_span_err!(diag, attr.span, E0533,
+ "export_name attribute has invalid format")
+ .help("use #[export_name=\"*\"]")
+ .emit();
None
}
} else {
MetaItemKind::List(ref n, ref items) if n == "inline" => {
mark_used(attr);
if items.len() != 1 {
- diagnostic.map(|d|{ d.span_err(attr.span, "expected one argument"); });
+ diagnostic.map(|d|{ span_err!(d, attr.span, E0534, "expected one argument"); });
InlineAttr::None
} else if contains_name(&items[..], "always") {
InlineAttr::Always
} else if contains_name(&items[..], "never") {
InlineAttr::Never
} else {
- diagnostic.map(|d|{ d.span_err((*items[0]).span, "invalid argument"); });
+ diagnostic.map(|d| {
+ span_err!(d, (*items[0]).span, E0535, "invalid argument");
+ });
InlineAttr::None
}
}
mis.iter().all(|mi| cfg_matches(cfgs, &mi, sess, features)),
ast::MetaItemKind::List(ref pred, ref mis) if &pred[..] == "not" => {
if mis.len() != 1 {
- sess.span_diagnostic.span_err(cfg.span, "expected 1 cfg-pattern");
+ span_err!(sess.span_diagnostic, cfg.span, E0536, "expected 1 cfg-pattern");
return false;
}
!cfg_matches(cfgs, &mis[0], sess, features)
}
ast::MetaItemKind::List(ref pred, _) => {
- sess.span_diagnostic.span_err(cfg.span, &format!("invalid predicate `{}`", pred));
+ span_err!(sess.span_diagnostic, cfg.span, E0537, "invalid predicate `{}`", pred);
false
},
ast::MetaItemKind::Word(_) | ast::MetaItemKind::NameValue(..) => {
if let Some(metas) = attr.meta_item_list() {
let get = |meta: &MetaItem, item: &mut Option<InternedString>| {
if item.is_some() {
- diagnostic.span_err(meta.span, &format!("multiple '{}' items",
- meta.name()));
+ handle_errors(diagnostic, meta.span, AttrError::MultipleItem(meta.name()));
return false
}
if let Some(v) = meta.value_str() {
*item = Some(v);
true
} else {
- diagnostic.span_err(meta.span, "incorrect meta item");
+ span_err!(diagnostic, meta.span, E0539, "incorrect meta item");
false
}
};
match tag {
"rustc_deprecated" => {
if rustc_depr.is_some() {
- diagnostic.span_err(item_sp, "multiple rustc_deprecated attributes");
+ span_err!(diagnostic, item_sp, E0540,
+ "multiple rustc_deprecated attributes");
break
}
"since" => if !get(meta, &mut since) { continue 'outer },
"reason" => if !get(meta, &mut reason) { continue 'outer },
_ => {
- diagnostic.span_err(meta.span, &format!("unknown meta item '{}'",
- meta.name()));
+ handle_errors(diagnostic, meta.span,
+ AttrError::UnknownMetaItem(meta.name()));
continue 'outer
}
}
})
}
(None, _) => {
- diagnostic.span_err(attr.span(), "missing 'since'");
+ handle_errors(diagnostic, attr.span(), AttrError::MissingSince);
continue
}
_ => {
- diagnostic.span_err(attr.span(), "missing 'reason'");
+ span_err!(diagnostic, attr.span(), E0543, "missing 'reason'");
continue
}
}
}
"unstable" => {
if stab.is_some() {
- diagnostic.span_err(item_sp, "multiple stability levels");
+ handle_errors(diagnostic, attr.span(), AttrError::MultipleStabilityLevels);
break
}
"reason" => if !get(meta, &mut reason) { continue 'outer },
"issue" => if !get(meta, &mut issue) { continue 'outer },
_ => {
- diagnostic.span_err(meta.span, &format!("unknown meta item '{}'",
- meta.name()));
+ handle_errors(diagnostic, meta.span,
+ AttrError::UnknownMetaItem(meta.name()));
continue 'outer
}
}
if let Ok(issue) = issue.parse() {
issue
} else {
- diagnostic.span_err(attr.span(), "incorrect 'issue'");
+ span_err!(diagnostic, attr.span(), E0545,
+ "incorrect 'issue'");
continue
}
}
})
}
(None, _, _) => {
- diagnostic.span_err(attr.span(), "missing 'feature'");
+ handle_errors(diagnostic, attr.span(), AttrError::MissingFeature);
continue
}
_ => {
- diagnostic.span_err(attr.span(), "missing 'issue'");
+ span_err!(diagnostic, attr.span(), E0547, "missing 'issue'");
continue
}
}
}
"stable" => {
if stab.is_some() {
- diagnostic.span_err(item_sp, "multiple stability levels");
+ handle_errors(diagnostic, attr.span(), AttrError::MultipleStabilityLevels);
break
}
"feature" => if !get(meta, &mut feature) { continue 'outer },
"since" => if !get(meta, &mut since) { continue 'outer },
_ => {
- diagnostic.span_err(meta.span, &format!("unknown meta item '{}'",
- meta.name()));
+ handle_errors(diagnostic, meta.span,
+ AttrError::UnknownMetaItem(meta.name()));
continue 'outer
}
}
})
}
(None, _) => {
- diagnostic.span_err(attr.span(), "missing 'feature'");
+ handle_errors(diagnostic, attr.span(), AttrError::MissingFeature);
continue
}
_ => {
- diagnostic.span_err(attr.span(), "missing 'since'");
+ handle_errors(diagnostic, attr.span(), AttrError::MissingSince);
continue
}
}
_ => unreachable!()
}
} else {
- diagnostic.span_err(attr.span(), "incorrect stability attribute type");
+ span_err!(diagnostic, attr.span(), E0548, "incorrect stability attribute type");
continue
}
}
}
stab.rustc_depr = Some(rustc_depr);
} else {
- diagnostic.span_err(item_sp, "rustc_deprecated attribute must be paired with \
- either stable or unstable attribute");
+ span_err!(diagnostic, item_sp, E0549,
+ "rustc_deprecated attribute must be paired with \
+ either stable or unstable attribute");
}
}
mark_used(attr);
if depr.is_some() {
- diagnostic.span_err(item_sp, "multiple deprecated attributes");
+ span_err!(diagnostic, item_sp, E0550, "multiple deprecated attributes");
break
}
depr = if let Some(metas) = attr.meta_item_list() {
let get = |meta: &MetaItem, item: &mut Option<InternedString>| {
if item.is_some() {
- diagnostic.span_err(meta.span, &format!("multiple '{}' items",
- meta.name()));
+ handle_errors(diagnostic, meta.span, AttrError::MultipleItem(meta.name()));
return false
}
if let Some(v) = meta.value_str() {
*item = Some(v);
true
} else {
- diagnostic.span_err(meta.span, "incorrect meta item");
+ span_err!(diagnostic, meta.span, E0551, "incorrect meta item");
false
}
};
"since" => if !get(meta, &mut since) { continue 'outer },
"note" => if !get(meta, &mut note) { continue 'outer },
_ => {
- diagnostic.span_err(meta.span, &format!("unknown meta item '{}'",
- meta.name()));
+ handle_errors(diagnostic, meta.span,
+ AttrError::UnknownMetaItem(meta.name()));
continue 'outer
}
}
if !set.insert(name.clone()) {
panic!(diagnostic.span_fatal(meta.span,
- &format!("duplicate meta item `{}`", name)));
+ &format!("duplicate meta item `{}`", name)));
}
}
}
Some(ity) => Some(ReprInt(item.span, ity)),
None => {
// Not a word we recognize
- diagnostic.span_err(item.span,
- "unrecognized representation hint");
+ span_err!(diagnostic, item.span, E0552,
+ "unrecognized representation hint");
None
}
}
}
}
// Not a word:
- _ => diagnostic.span_err(item.span, "unrecognized enum representation hint")
+ _ => span_err!(diagnostic, item.span, E0553,
+ "unrecognized enum representation hint"),
}
}
}
--- /dev/null
+// Copyright 2014 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+#![allow(non_snake_case)]
+
+// Error messages for EXXXX errors.
+// Each message should start and end with a new line, and be wrapped to 80 characters.
+// In vim you can `:set tw=80` and use `gq` to wrap paragraphs. Use `:set tw=0` to disable.
+register_long_diagnostics! {
+
+E0533: r##"
+```compile_fail,E0533
+#[export_name]
+pub fn something() {}
+
+fn main() {}
+```
+"##,
+
+}
+
+register_diagnostics! {
+ E0534, // expected one argument
+ E0535, // invalid argument
+ E0536, // expected 1 cfg-pattern
+ E0537, // invalid predicate
+ E0538, // multiple [same] items
+ E0539, // incorrect meta item
+ E0540, // multiple rustc_deprecated attributes
+ E0541, // unknown meta item
+ E0542, // missing 'since'
+ E0543, // missing 'reason'
+ E0544, // multiple stability levels
+ E0545, // incorrect 'issue'
+ E0546, // missing 'feature'
+ E0547, // missing 'issue'
+ E0548, // incorrect stability attribute type
+ E0549, // rustc_deprecated attribute must be paired with either stable or unstable attribute
+ E0550, // multiple deprecated attributes
+ E0551, // incorrect meta item
+ E0552, // unrecognized representation hint
+ E0553, // unrecognized enum representation hint
+ E0554, // #[feature] may not be used on the [] release channel
+ E0555, // malformed feature attribute, expected #![feature(...)]
+ E0556, // malformed feature, expected just one word
+ E0557, // feature has been removed
+}
match attr.meta_item_list() {
None => {
- span_handler.span_err(attr.span, "malformed feature attribute, \
- expected #![feature(...)]");
+ span_err!(span_handler, attr.span, E0555,
+ "malformed feature attribute, expected #![feature(...)]");
}
Some(list) => {
for mi in list {
let name = match mi.node {
ast::MetaItemKind::Word(ref word) => (*word).clone(),
_ => {
- span_handler.span_err(mi.span,
- "malformed feature, expected just \
- one word");
+ span_err!(span_handler, mi.span, E0556,
+ "malformed feature, expected just one word");
continue
}
};
}
else if let Some(&(_, _, _)) = REMOVED_FEATURES.iter()
.find(|& &(n, _, _)| name == n) {
- span_handler.span_err(mi.span, "feature has been removed");
+ span_err!(span_handler, mi.span, E0557, "feature has been removed");
}
else if let Some(&(_, _, _)) = ACCEPTED_FEATURES.iter()
.find(|& &(n, _, _)| name == n) {
for attr in &krate.attrs {
if attr.check_name("feature") {
let release_channel = option_env!("CFG_RELEASE_CHANNEL").unwrap_or("(unknown)");
- let ref msg = format!("#[feature] may not be used on the {} release channel",
- release_channel);
- span_handler.span_err(attr.span, msg);
+ span_err!(span_handler, attr.span, E0554,
+ "#[feature] may not be used on the {} release channel",
+ release_channel);
}
}
}
#![feature(str_escape)]
#![feature(unicode)]
#![feature(question_mark)]
+#![feature(rustc_diagnostic_macros)]
extern crate serialize;
extern crate term;
})
}
+#[macro_use]
+pub mod diagnostics {
+ #[macro_use]
+ pub mod macros;
+ pub mod plugin;
+ pub mod metadata;
+}
+
+// NB: This module needs to be declared first so diagnostics are
+// registered before they are used.
+pub mod diagnostic_list;
+
pub mod util {
pub mod interner;
pub mod lev_distance;
pub use self::thin_vec::ThinVec;
}
-pub mod diagnostics {
- pub mod macros;
- pub mod plugin;
- pub mod metadata;
-}
-
pub mod json;
pub mod syntax {
pub mod macro_rules;
}
}
+
+// __build_diagnostic_array! { libsyntax, DIAGNOSTICS }
"rustc_back 0.0.0",
"rustc_bitflags 0.0.0",
"rustc_const_math 0.0.0",
+ "rustc_data_structures 0.0.0",
"rustc_errors 0.0.0",
"rustc_llvm 0.0.0",
"serialize 0.0.0",
extern crate a;
extern crate b; //~ ERROR: found possibly newer version of crate `a` which `b` depends on
-//~| NOTE: perhaps this crate needs to be recompiled
+//~| NOTE: perhaps that crate needs to be recompiled
//~| NOTE: crate `a` path #1:
//~| NOTE: crate `b` path #1:
--- /dev/null
+// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+use std::fmt::Debug;
+
+const CONST_0: Debug+Sync = *(&0 as &(Debug+Sync));
+//~^ ERROR `std::fmt::Debug + Sync + 'static: std::marker::Sized` is not satisfied
+//~| NOTE does not have a constant size known at compile-time
+//~| NOTE constant expressions must have a statically known size
+
+const CONST_FOO: str = *"foo";
+//~^ ERROR `str: std::marker::Sized` is not satisfied
+//~| NOTE does not have a constant size known at compile-time
+//~| NOTE constant expressions must have a statically known size
+
+static STATIC_1: Debug+Sync = *(&1 as &(Debug+Sync));
+//~^ ERROR `std::fmt::Debug + Sync + 'static: std::marker::Sized` is not satisfied
+//~| NOTE does not have a constant size known at compile-time
+//~| NOTE constant expressions must have a statically known size
+
+static STATIC_BAR: str = *"bar";
+//~^ ERROR `str: std::marker::Sized` is not satisfied
+//~| NOTE does not have a constant size known at compile-time
+//~| NOTE constant expressions must have a statically known size
+
+fn main() {
+ println!("{:?} {:?} {:?} {:?}", &CONST_0, &CONST_FOO, &STATIC_1, &STATIC_BAR);
+}
--- /dev/null
+// Copyright 2014 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+#![allow(warnings)]
+#![feature(intrinsics)]
+
+extern "rust-intrinsic" {
+ fn return_address() -> *const u8;
+}
+
+unsafe fn f() { let _ = return_address(); }
+//~^ ERROR invalid use of `return_address` intrinsic: function does not use out pointer
+
+unsafe fn g() -> isize { let _ = return_address(); 0 }
+//~^ ERROR invalid use of `return_address` intrinsic: function does not use out pointer
+
+fn main() {}
fn main() {
static foo: Fn() -> u32 = || -> u32 {
//~^ ERROR: mismatched types
+ //~| ERROR: `std::ops::Fn() -> u32 + 'static: std::marker::Sized` is not satisfied
+
0
};
}
#![stable(feature = "rust1", since = "1.0.0")]
mod bogus_attribute_types_1 {
- #[stable(feature = "a", since = "a", reason)] //~ ERROR unknown meta item 'reason'
+ #[stable(feature = "a", since = "a", reason)] //~ ERROR unknown meta item 'reason' [E0541]
fn f1() { }
- #[stable(feature = "a", since)] //~ ERROR incorrect meta item
+ #[stable(feature = "a", since)] //~ ERROR incorrect meta item [E0539]
fn f2() { }
- #[stable(feature, since = "a")] //~ ERROR incorrect meta item
+ #[stable(feature, since = "a")] //~ ERROR incorrect meta item [E0539]
fn f3() { }
- #[stable(feature = "a", since(b))] //~ ERROR incorrect meta item
+ #[stable(feature = "a", since(b))] //~ ERROR incorrect meta item [E0539]
fn f5() { }
- #[stable(feature(b), since = "a")] //~ ERROR incorrect meta item
+ #[stable(feature(b), since = "a")] //~ ERROR incorrect meta item [E0539]
fn f6() { }
}
mod bogus_attribute_types_2 {
- #[unstable] //~ ERROR incorrect stability attribute type
+ #[unstable] //~ ERROR incorrect stability attribute type [E0548]
fn f1() { }
- #[unstable = "a"] //~ ERROR incorrect stability attribute type
+ #[unstable = "a"] //~ ERROR incorrect stability attribute type [E0548]
fn f2() { }
- #[stable] //~ ERROR incorrect stability attribute type
+ #[stable] //~ ERROR incorrect stability attribute type [E0548]
fn f3() { }
- #[stable = "a"] //~ ERROR incorrect stability attribute type
+ #[stable = "a"] //~ ERROR incorrect stability attribute type [E0548]
fn f4() { }
#[stable(feature = "a", since = "b")]
- #[rustc_deprecated] //~ ERROR incorrect stability attribute type
+ #[rustc_deprecated] //~ ERROR incorrect stability attribute type [E0548]
fn f5() { }
#[stable(feature = "a", since = "b")]
- #[rustc_deprecated = "a"] //~ ERROR incorrect stability attribute type
+ #[rustc_deprecated = "a"] //~ ERROR incorrect stability attribute type [E0548]
fn f6() { }
}
mod missing_feature_names {
- #[unstable(issue = "0")] //~ ERROR missing 'feature'
+ #[unstable(issue = "0")] //~ ERROR missing 'feature' [E0546]
fn f1() { }
- #[unstable(feature = "a")] //~ ERROR missing 'issue'
+ #[unstable(feature = "a")] //~ ERROR missing 'issue' [E0547]
fn f2() { }
- #[stable(since = "a")] //~ ERROR missing 'feature'
+ #[stable(since = "a")] //~ ERROR missing 'feature' [E0546]
fn f3() { }
}
mod missing_version {
- #[stable(feature = "a")] //~ ERROR missing 'since'
+ #[stable(feature = "a")] //~ ERROR missing 'since' [E0542]
fn f1() { }
#[stable(feature = "a", since = "b")]
- #[rustc_deprecated(reason = "a")] //~ ERROR missing 'since'
+ #[rustc_deprecated(reason = "a")] //~ ERROR missing 'since' [E0542]
fn f2() { }
}
#[unstable(feature = "a", issue = "0")]
-#[stable(feature = "a", since = "b")]
-fn multiple1() { } //~ ERROR multiple stability levels
+#[stable(feature = "a", since = "b")] //~ ERROR multiple stability levels [E0544]
+fn multiple1() { }
#[unstable(feature = "a", issue = "0")]
-#[unstable(feature = "a", issue = "0")]
-fn multiple2() { } //~ ERROR multiple stability levels
+#[unstable(feature = "a", issue = "0")] //~ ERROR multiple stability levels [E0544]
+fn multiple2() { }
#[stable(feature = "a", since = "b")]
-#[stable(feature = "a", since = "b")]
-fn multiple3() { } //~ ERROR multiple stability levels
+#[stable(feature = "a", since = "b")] //~ ERROR multiple stability levels [E0544]
+fn multiple3() { }
#[stable(feature = "a", since = "b")]
#[rustc_deprecated(since = "b", reason = "text")]
#[rustc_deprecated(since = "b", reason = "text")]
-fn multiple4() { } //~ ERROR multiple rustc_deprecated attributes
+fn multiple4() { } //~ ERROR multiple rustc_deprecated attributes [E0540]
//~^ ERROR Invalid stability or deprecation version found
#[rustc_deprecated(since = "a", reason = "text")]
-fn deprecated_without_unstable_or_stable() { } //~ ERROR rustc_deprecated attribute must be paired
+fn deprecated_without_unstable_or_stable() { }
+//~^ ERROR rustc_deprecated attribute must be paired with either stable or unstable attribute
fn main() { }
extern crate a;
extern crate b; //~ ERROR: found possibly newer version of crate `a` which `b` depends on
-//~| NOTE: perhaps this crate needs to be recompiled
+//~| NOTE: perhaps that crate needs to be recompiled
//~| NOTE: crate `a` path #1:
//~| NOTE: crate `b` path #1:
extern crate a;
extern crate b; //~ ERROR: found possibly newer version of crate `a` which `b` depends on
-//~| NOTE: perhaps this crate needs to be recompiled
+//~| NOTE: perhaps that crate needs to be recompiled
//~| NOTE: crate `a` path #1:
//~| NOTE: crate `b` path #1:
extern crate a;
extern crate b; //~ ERROR: found possibly newer version of crate `a` which `b` depends on
-//~| NOTE: perhaps this crate needs to be recompiled
+//~| NOTE: perhaps that crate needs to be recompiled
//~| NOTE: crate `a` path #1:
//~| NOTE: crate `b` path #1:
extern crate a;
extern crate b; //~ ERROR: found possibly newer version of crate `a` which `b` depends on
-//~| NOTE: perhaps this crate needs to be recompiled
+//~| NOTE: perhaps that crate needs to be recompiled
//~| NOTE: crate `a` path #1:
//~| NOTE: crate `b` path #1:
extern crate a;
extern crate b; //~ ERROR: found possibly newer version of crate `a` which `b` depends on
-//~| NOTE: perhaps this crate needs to be recompiled
+//~| NOTE: perhaps that crate needs to be recompiled
//~| NOTE: crate `a` path #1:
//~| NOTE: crate `b` path #1:
extern crate a;
extern crate b; //~ ERROR: found possibly newer version of crate `a` which `b` depends on
-//~| NOTE: perhaps this crate needs to be recompiled
+//~| NOTE: perhaps that crate needs to be recompiled
//~| NOTE: crate `a` path #1:
//~| NOTE: crate `b` path #1:
extern crate uta;
extern crate utb; //~ ERROR: found possibly newer version of crate `uta` which `utb` depends
-//~| NOTE: perhaps this crate needs to be recompiled?
+//~| NOTE: perhaps that crate needs to be recompiled?
//~| NOTE: crate `uta` path #1:
//~| NOTE: crate `utb` path #1:
# Ensure crateC fails to compile since A1 is "missing" and A2/A3 hashes do not match
$(RUSTC) -L $(A2) -L $(A3) crateC.rs >$(LOG) 2>&1 || true
grep "error: found possibly newer version of crate \`crateA\` which \`crateB\` depends on" $(LOG)
- grep "note: perhaps this crate needs to be recompiled?" $(LOG)
+ grep "note: perhaps that crate needs to be recompiled?" $(LOG)
grep "note: crate \`crateA\` path #1:" $(LOG)
grep "note: crate \`crateA\` path #2:" $(LOG)
grep "note: crate \`crateB\` path #1:" $(LOG)
--- /dev/null
+// Copyright 2012-2014 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+
+#![feature(intrinsics)]
+
+use std::ptr;
+
+struct Point {
+ x: f32,
+ y: f32,
+ z: f32,
+}
+
+extern "rust-intrinsic" {
+ fn return_address() -> *const u8;
+}
+
+fn f(result: &mut usize) -> Point {
+ unsafe {
+ *result = return_address() as usize;
+ Point {
+ x: 1.0,
+ y: 2.0,
+ z: 3.0,
+ }
+ }
+
+}
+
+fn main() {
+ let mut intrinsic_reported_address = 0;
+ let pt = f(&mut intrinsic_reported_address);
+ let actual_address = &pt as *const Point as usize;
+ assert_eq!(intrinsic_reported_address, actual_address);
+}
--- /dev/null
+// Copyright 2016 The Rust Project Developers. See the COPYRIGHT
+// file at the top-level directory of this distribution and at
+// http://rust-lang.org/COPYRIGHT.
+//
+// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
+// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
+// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
+// option. This file may not be copied, modified, or distributed
+// except according to those terms.
+
+fn main() {
+ struct X;
+ trait Foo<T> {
+ fn foo(&self) where (T, Option<T>): Ord {}
+ fn bar(&self, x: &Option<T>) -> bool
+ where Option<T>: Ord { *x < *x }
+ }
+ impl Foo<X> for () {}
+ let _ = &() as &Foo<X>;
+}
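
The run-pass test above checks that a trait object can still be built when a method's `where` clause is unsatisfiable for the chosen type parameter. The same behavior in current syntax, reduced to a sketch (identifier names are illustrative):

```rust
struct X; // deliberately does not implement Ord

trait Foo<T> {
    // Only callable when the bound holds; its presence does not stop
    // trait-object creation for types where it doesn't.
    fn foo(&self) where T: Ord {}
    fn bar(&self) -> u32 { 7 }
}

impl Foo<X> for () {}

fn main() {
    // Building the object is fine even though `X: Ord` is false;
    // only a call to `foo` through the object would be rejected.
    let obj = &() as &dyn Foo<X>;
    assert_eq!(obj.bar(), 7);
}
```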
// Parse the JSON output from the compiler and extract out the messages.
let actual_errors = json::parse_output(&file_name, &proc_res.stderr, &proc_res);
- let mut unexpected = 0;
- let mut not_found = 0;
+ let mut unexpected = Vec::new();
let mut found = vec![false; expected_errors.len()];
for actual_error in &actual_errors {
let opt_index =
.map_or(String::from("message"),
|k| k.to_string()),
actual_error.msg));
- unexpected += 1;
+ unexpected.push(actual_error.clone());
}
}
}
}
+ let mut not_found = Vec::new();
// anything not yet found is a problem
for (index, expected_error) in expected_errors.iter().enumerate() {
if !found[index] {
.map_or("message".into(),
|k| k.to_string()),
expected_error.msg));
- not_found += 1;
+ not_found.push(expected_error.clone());
}
}
- if unexpected > 0 || not_found > 0 {
+ if !unexpected.is_empty() || !not_found.is_empty() {
self.error(
&format!("{} unexpected errors found, {} expected errors not found",
- unexpected, not_found));
+ unexpected.len(), not_found.len()));
print!("status: {}\ncommand: {}\n",
proc_res.status, proc_res.cmdline);
- println!("actual errors (from JSON output): {:#?}\n", actual_errors);
- println!("expected errors (from test file): {:#?}\n", expected_errors);
+ if !unexpected.is_empty() {
+ println!("unexpected errors (from JSON output): {:#?}\n", unexpected);
+ }
+ if !not_found.is_empty() {
+ println!("not found errors (from test file): {:#?}\n", not_found);
+ }
panic!();
}
}
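
The compiletest change above swaps bare counters for vectors of mismatch records, so the failure report can list exactly which diagnostics were unexpected or missing instead of just how many. The matching pattern, reduced to a standalone sketch (names are illustrative):

```rust
// Match actual items against expected ones, returning full records of
// the mismatches rather than counts, so a report can print them.
fn diff<'a>(expected: &[&'a str], actual: &[&'a str])
            -> (Vec<&'a str>, Vec<&'a str>) {
    let mut found = vec![false; expected.len()];
    let mut unexpected = Vec::new();
    for a in actual {
        match expected.iter().position(|e| e == a) {
            Some(i) => found[i] = true,
            None => unexpected.push(*a), // actual but not expected
        }
    }
    // Anything expected but never matched is a problem too.
    let not_found: Vec<_> = expected.iter().cloned()
        .zip(found)
        .filter(|&(_, f)| !f)
        .map(|(e, _)| e)
        .collect();
    (unexpected, not_found)
}

fn main() {
    let (unex, nf) = diff(&["E0541", "E0539"], &["E0539", "E0999"]);
    assert_eq!(unex, ["E0999"]);
    assert_eq!(nf, ["E0541"]);
}
```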
return None;
}
- if file.ends_with("std/sys/ext/index.html") {
- return None;
- }
-
- if let Some(file) = file.to_str() {
- // FIXME(#31948)
- if file.contains("ParseFloatError") {
- return None;
- }
- // weird reexports, but this module is on its way out, so chalk it up to
- // "rustdoc weirdness" and move on from there
- if file.contains("scoped_tls") {
- return None;
- }
- }
-
let mut parser = UrlParser::new();
parser.base_url(base);
// Search for anything that's the regex 'href[ ]*=[ ]*".*?"'
with_attrs_in_source(&contents, " href", |url, i| {
+ // Ignore external URLs
+ if url.starts_with("http:") || url.starts_with("https:") ||
+ url.starts_with("javascript:") || url.starts_with("ftp:") ||
+ url.starts_with("irc:") || url.starts_with("data:") {
+ return;
+ }
// Once we've plucked out the URL, parse it using our base url and
- // then try to extract a file path. If either of these fail then we
- // just keep going.
+ // then try to extract a file path.
let (parsed_url, path) = match url_to_file_path(&parser, url) {
Some((url, path)) => (url, PathBuf::from(path)),
- None => return,
+ None => {
+ *errors = true;
+ println!("{}:{}: invalid link - {}",
+ pretty_file.display(),
+ i + 1,
+ url);
+ return;
+ }
};
// Alright, if we've found a file name then this file had better
Ok(res) => res,
Err(LoadError::IOError(err)) => panic!(format!("{}", err)),
Err(LoadError::BrokenRedirect(target, _)) => {
- print!("{}:{}: broken redirect to {}",
- pretty_file.display(),
- i + 1,
- target.display());
+ *errors = true;
+ println!("{}:{}: broken redirect to {}",
+ pretty_file.display(),
+ i + 1,
+ target.display());
return;
}
Err(LoadError::IsRedirect) => unreachable!(),
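
The linkchecker change above hard-codes the skipped URL schemes inline. The same check factored into a helper, as a standalone sketch (the helper name is hypothetical):

```rust
// Hypothetical helper mirroring the scheme check above: a link is
// external, and skipped by the checker, if it starts with one of
// these URL schemes.
fn is_external_url(url: &str) -> bool {
    const SCHEMES: [&str; 6] =
        ["http:", "https:", "javascript:", "ftp:", "irc:", "data:"];
    SCHEMES.iter().any(|scheme| url.starts_with(*scheme))
}

fn main() {
    assert!(is_external_url("https://doc.rust-lang.org/"));
    assert!(is_external_url("data:image/png;base64,AAAA"));
    assert!(!is_external_url("../std/index.html"));
}
```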