//! Single-threaded reference-counting pointers. 'Rc' stands for 'Reference
//! Counted'.
//!
//! The type [`Rc<T>`][`Rc`] provides shared ownership of a value of type `T`,
//! allocated in the heap. Invoking [`clone`][clone] on [`Rc`] produces a new
//! pointer to the same allocation in the heap. When the last [`Rc`] pointer to a
//! given allocation is destroyed, the value stored in that allocation (often
//! referred to as "inner value") is also dropped.
//!
//! Shared references in Rust disallow mutation by default, and [`Rc`]
//! is no exception: you cannot generally obtain a mutable reference to
//! something inside an [`Rc`]. If you need mutability, put a [`Cell`]
//! or [`RefCell`] inside the [`Rc`]; see [an example of mutability
//! inside an Rc][mutability].
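//!
//! As a minimal sketch of that pattern, a [`RefCell`] inside an [`Rc`] lets
//! every handle mutate the shared value:
//!
//! ```
//! use std::cell::RefCell;
//! use std::rc::Rc;
//!
//! let shared = Rc::new(RefCell::new(vec![1, 2]));
//! let alias = Rc::clone(&shared);
//!
//! // `borrow_mut` provides interior mutability through a shared handle.
//! alias.borrow_mut().push(3);
//! assert_eq!(*shared.borrow(), [1, 2, 3]);
//! ```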
//!
//! [`Rc`] uses non-atomic reference counting. This means that overhead is very
//! low, but an [`Rc`] cannot be sent between threads, and consequently [`Rc`]
//! does not implement [`Send`][send]. As a result, the Rust compiler
//! will check *at compile time* that you are not sending [`Rc`]s between
//! threads. If you need multi-threaded, atomic reference counting, use
//! [`sync::Arc`][arc].
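//!
//! As a quick sketch, the same sharing pattern written with [`sync::Arc`][arc]
//! does cross threads:
//!
//! ```
//! use std::sync::Arc;
//! use std::thread;
//!
//! let value = Arc::new(5);
//! let child = Arc::clone(&value);
//!
//! // `Arc` is `Send`, so the clone may move into another thread.
//! let handle = thread::spawn(move || assert_eq!(*child, 5));
//! handle.join().unwrap();
//! ```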
//!
//! The [`downgrade`][downgrade] method can be used to create a non-owning
//! [`Weak`] pointer. A [`Weak`] pointer can be [`upgrade`][upgrade]d
//! to an [`Rc`], but this will return [`None`] if the value stored in the allocation has
//! already been dropped. In other words, `Weak` pointers do not keep the value
//! inside the allocation alive; however, they *do* keep the allocation
//! (the backing store for the inner value) alive.
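//!
//! A minimal sketch of that lifecycle:
//!
//! ```
//! use std::rc::Rc;
//!
//! let strong = Rc::new("shared".to_string());
//! let weak = Rc::downgrade(&strong);
//!
//! // While a strong pointer exists, `upgrade` succeeds.
//! assert!(weak.upgrade().is_some());
//!
//! drop(strong);
//! // The inner value has been dropped, so `upgrade` now returns `None`.
//! assert!(weak.upgrade().is_none());
//! ```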
//!
//! A cycle between [`Rc`] pointers will never be deallocated. For this reason,
//! [`Weak`] is used to break cycles. For example, a tree could have strong
//! [`Rc`] pointers from parent nodes to children, and [`Weak`] pointers from
//! children back to their parents.
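//!
//! A small sketch of that parent/child shape (the `Node` type here is
//! illustrative, not part of this module; a fuller example follows below):
//!
//! ```
//! use std::cell::RefCell;
//! use std::rc::{Rc, Weak};
//!
//! struct Node {
//!     parent: RefCell<Weak<Node>>,
//!     children: RefCell<Vec<Rc<Node>>>,
//! }
//!
//! let leaf = Rc::new(Node {
//!     parent: RefCell::new(Weak::new()),
//!     children: RefCell::new(vec![]),
//! });
//! let branch = Rc::new(Node {
//!     parent: RefCell::new(Weak::new()),
//!     children: RefCell::new(vec![Rc::clone(&leaf)]),
//! });
//! *leaf.parent.borrow_mut() = Rc::downgrade(&branch);
//!
//! // Dropping `branch` frees it: the child's `Weak` link back to the
//! // parent does not keep it alive, so there is no leaking cycle.
//! drop(branch);
//! assert!(leaf.parent.borrow().upgrade().is_none());
//! ```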
//!
//! `Rc<T>` automatically dereferences to `T` (via the [`Deref`] trait),
//! so you can call `T`'s methods on a value of type [`Rc<T>`][`Rc`]. To avoid name
//! clashes with `T`'s methods, the methods of [`Rc<T>`][`Rc`] itself are associated
//! functions, called using function-like syntax:
//!
//! ```
//! use std::rc::Rc;
//!
//! let my_rc = Rc::new(());
//!
//! Rc::downgrade(&my_rc);
//! ```
//!
//! [`Weak<T>`][`Weak`] does not auto-dereference to `T`, because the inner value may have
//! already been dropped.
//!
//! # Cloning references
//!
//! Creating a new reference to the same allocation as an existing reference counted pointer
//! is done using the `Clone` trait implemented for [`Rc<T>`][`Rc`] and [`Weak<T>`][`Weak`].
//!
//! ```
//! use std::rc::Rc;
//!
//! let foo = Rc::new(vec![1.0, 2.0, 3.0]);
//! // The two syntaxes below are equivalent.
//! let a = foo.clone();
//! let b = Rc::clone(&foo);
//! // a and b both point to the same memory location as foo.
//! ```
//!
//! The `Rc::clone(&from)` syntax is the most idiomatic because it conveys more explicitly
//! the meaning of the code. In the example above, this syntax makes it easier to see that
//! this code is creating a new reference rather than copying the whole content of foo.
//!
//! # Examples
//!
//! Consider a scenario where a set of `Gadget`s are owned by a given `Owner`.
//! We want to have our `Gadget`s point to their `Owner`. We can't do this with
//! unique ownership, because more than one gadget may belong to the same
//! `Owner`. [`Rc`] allows us to share an `Owner` between multiple `Gadget`s,
//! and have the `Owner` remain allocated as long as any `Gadget` points at it.
//!
//! ```
//! use std::rc::Rc;
//!
//! struct Owner {
//!     name: String,
//!     // ...other fields
//! }
//!
//! struct Gadget {
//!     id: i32,
//!     owner: Rc<Owner>,
//!     // ...other fields
//! }
//!
//! // Create a reference-counted `Owner`.
//! let gadget_owner: Rc<Owner> = Rc::new(
//!     Owner {
//!         name: "Gadget Man".to_string(),
//!     }
//! );
//!
//! // Create `Gadget`s belonging to `gadget_owner`. Cloning the `Rc<Owner>`
//! // gives us a new pointer to the same `Owner` allocation, incrementing
//! // the reference count in the process.
//! let gadget1 = Gadget {
//!     id: 1,
//!     owner: Rc::clone(&gadget_owner),
//! };
//! let gadget2 = Gadget {
//!     id: 2,
//!     owner: Rc::clone(&gadget_owner),
//! };
//!
//! // Dispose of our local variable `gadget_owner`.
//! drop(gadget_owner);
//!
//! // Despite dropping `gadget_owner`, we're still able to print out the name
//! // of the `Owner` of the `Gadget`s. This is because we've only dropped a
//! // single `Rc<Owner>`, not the `Owner` it points to. As long as there are
//! // other `Rc<Owner>` pointing at the same `Owner` allocation, it will remain
//! // live. The field projection `gadget1.owner.name` works because
//! // `Rc<Owner>` automatically dereferences to `Owner`.
//! println!("Gadget {} owned by {}", gadget1.id, gadget1.owner.name);
//! println!("Gadget {} owned by {}", gadget2.id, gadget2.owner.name);
//!
//! // At the end of this scope, `gadget1` and `gadget2` are destroyed, and
//! // with them the last counted references to our `Owner`. Gadget Man now
//! // gets destroyed as well.
//! ```
//!
//! If our requirements change, and we also need to be able to traverse from
//! `Owner` to `Gadget`, we will run into problems. An [`Rc`] pointer from `Owner`
//! to `Gadget` introduces a cycle. This means that their
//! reference counts can never reach 0, and the allocation will never be destroyed:
//! a memory leak. In order to get around this, we can use [`Weak`]
//! pointers.
//!
//! Rust actually makes it somewhat difficult to produce this loop in the first
//! place. In order to end up with two values that point at each other, one of
//! them needs to be mutable. This is difficult because [`Rc`] enforces
//! memory safety by only giving out shared references to the value it wraps,
//! and these don't allow direct mutation. We need to wrap the part of the
//! value we wish to mutate in a [`RefCell`], which provides *interior
//! mutability*: a method to achieve mutability through a shared reference.
//! [`RefCell`] enforces Rust's borrowing rules at runtime.
//!
//! ```
//! use std::rc::Rc;
//! use std::rc::Weak;
//! use std::cell::RefCell;
//!
//! struct Owner {
//!     name: String,
//!     gadgets: RefCell<Vec<Weak<Gadget>>>,
//!     // ...other fields
//! }
//!
//! struct Gadget {
//!     id: i32,
//!     owner: Rc<Owner>,
//!     // ...other fields
//! }
//!
//! // Create a reference-counted `Owner`. Note that we've put the `Owner`'s
//! // vector of `Gadget`s inside a `RefCell` so that we can mutate it through
//! // a shared reference.
//! let gadget_owner: Rc<Owner> = Rc::new(
//!     Owner {
//!         name: "Gadget Man".to_string(),
//!         gadgets: RefCell::new(vec![]),
//!     }
//! );
//!
//! // Create `Gadget`s belonging to `gadget_owner`, as before.
//! let gadget1 = Rc::new(
//!     Gadget {
//!         id: 1,
//!         owner: Rc::clone(&gadget_owner),
//!     }
//! );
//! let gadget2 = Rc::new(
//!     Gadget {
//!         id: 2,
//!         owner: Rc::clone(&gadget_owner),
//!     }
//! );
//!
//! // Add the `Gadget`s to their `Owner`.
//! {
//!     let mut gadgets = gadget_owner.gadgets.borrow_mut();
//!     gadgets.push(Rc::downgrade(&gadget1));
//!     gadgets.push(Rc::downgrade(&gadget2));
//!
//!     // `RefCell` dynamic borrow ends here.
//! }
//!
//! // Iterate over our `Gadget`s, printing their details out.
//! for gadget_weak in gadget_owner.gadgets.borrow().iter() {
//!     // `gadget_weak` is a `Weak<Gadget>`. Since `Weak` pointers can't
//!     // guarantee the allocation still exists, we need to call
//!     // `upgrade`, which returns an `Option<Rc<Gadget>>`.
//!     //
//!     // In this case we know the allocation still exists, so we simply
//!     // `unwrap` the `Option`. In a more complicated program, you might
//!     // need graceful error handling for a `None` result.
//!     let gadget = gadget_weak.upgrade().unwrap();
//!     println!("Gadget {} owned by {}", gadget.id, gadget.owner.name);
//! }
//!
//! // At the end of this scope, `gadget_owner`, `gadget1`, and `gadget2`
//! // are destroyed. There are now no strong (`Rc`) pointers to the
//! // gadgets, so they are destroyed. This zeroes the reference count on
//! // Gadget Man, so he gets destroyed as well.
//! ```
//!
//! [`Rc`]: struct.Rc.html
//! [`Weak`]: struct.Weak.html
//! [clone]: ../../std/clone/trait.Clone.html#tymethod.clone
//! [`Cell`]: ../../std/cell/struct.Cell.html
//! [`RefCell`]: ../../std/cell/struct.RefCell.html
//! [send]: ../../std/marker/trait.Send.html
//! [arc]: ../../std/sync/struct.Arc.html
//! [`Deref`]: ../../std/ops/trait.Deref.html
//! [downgrade]: struct.Rc.html#method.downgrade
//! [upgrade]: struct.Weak.html#method.upgrade
//! [`None`]: ../../std/option/enum.Option.html#variant.None
//! [mutability]: ../../std/cell/index.html#introducing-mutability-inside-of-something-immutable
#![stable(feature = "rust1", since = "1.0.0")]

#[cfg(not(test))]
use crate::boxed::Box;
#[cfg(test)]
use std::boxed::Box;

use core::any::Any;
use core::array::LengthAtMost32;
use core::borrow;
use core::cell::Cell;
use core::cmp::Ordering;
use core::convert::{From, TryFrom};
use core::fmt;
use core::hash::{Hash, Hasher};
use core::intrinsics::abort;
use core::iter;
use core::marker::{self, PhantomData, Unpin, Unsize};
use core::mem::{self, align_of_val_raw, forget, size_of_val};
use core::ops::{CoerceUnsized, Deref, DispatchFromDyn, Receiver};
use core::pin::Pin;
use core::ptr::{self, NonNull};
use core::slice::from_raw_parts_mut;

use crate::alloc::{box_free, handle_alloc_error, AllocInit, AllocRef, Global, Layout};
use crate::borrow::{Cow, ToOwned};
use crate::string::String;
use crate::vec::Vec;
// This is repr(C) to future-proof against possible field-reordering, which
// would interfere with otherwise safe [into|from]_raw() of transmutable
// inner types.
#[repr(C)]
struct RcBox<T: ?Sized> {
    strong: Cell<usize>,
    weak: Cell<usize>,
    value: T,
}

/// A single-threaded reference-counting pointer. 'Rc' stands for 'Reference
/// Counted'.
///
/// See the [module-level documentation](./index.html) for more details.
///
/// The inherent methods of `Rc` are all associated functions, which means
/// that you have to call them as e.g., [`Rc::get_mut(&mut value)`][get_mut] instead of
/// `value.get_mut()`. This avoids conflicts with methods of the inner
/// type `T`.
///
/// [get_mut]: #method.get_mut
#[cfg_attr(not(test), rustc_diagnostic_item = "Rc")]
#[stable(feature = "rust1", since = "1.0.0")]
pub struct Rc<T: ?Sized> {
    ptr: NonNull<RcBox<T>>,
    phantom: PhantomData<RcBox<T>>,
}
#[stable(feature = "rust1", since = "1.0.0")]
impl<T: ?Sized> !marker::Send for Rc<T> {}
#[stable(feature = "rust1", since = "1.0.0")]
impl<T: ?Sized> !marker::Sync for Rc<T> {}

#[unstable(feature = "coerce_unsized", issue = "27732")]
impl<T: ?Sized + Unsize<U>, U: ?Sized> CoerceUnsized<Rc<U>> for Rc<T> {}

#[unstable(feature = "dispatch_from_dyn", issue = "none")]
impl<T: ?Sized + Unsize<U>, U: ?Sized> DispatchFromDyn<Rc<U>> for Rc<T> {}
impl<T: ?Sized> Rc<T> {
    fn from_inner(ptr: NonNull<RcBox<T>>) -> Self {
        Self { ptr, phantom: PhantomData }
    }

    unsafe fn from_ptr(ptr: *mut RcBox<T>) -> Self {
        Self::from_inner(unsafe { NonNull::new_unchecked(ptr) })
    }
}

impl<T> Rc<T> {
    /// Constructs a new `Rc<T>`.
    ///
    /// # Examples
    ///
    /// ```
    /// use std::rc::Rc;
    ///
    /// let five = Rc::new(5);
    /// ```
    #[stable(feature = "rust1", since = "1.0.0")]
    pub fn new(value: T) -> Rc<T> {
        // There is an implicit weak pointer owned by all the strong
        // pointers, which ensures that the weak destructor never frees
        // the allocation while the strong destructor is running, even
        // if the weak pointer is stored inside the strong one.
        Self::from_inner(
            Box::leak(box RcBox { strong: Cell::new(1), weak: Cell::new(1), value }).into(),
        )
    }
    /// Constructs a new `Rc` with uninitialized contents.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(new_uninit)]
    /// #![feature(get_mut_unchecked)]
    ///
    /// use std::rc::Rc;
    ///
    /// let mut five = Rc::<u32>::new_uninit();
    ///
    /// let five = unsafe {
    ///     // Deferred initialization:
    ///     Rc::get_mut_unchecked(&mut five).as_mut_ptr().write(5);
    ///
    ///     five.assume_init()
    /// };
    ///
    /// assert_eq!(*five, 5)
    /// ```
    #[unstable(feature = "new_uninit", issue = "63291")]
    pub fn new_uninit() -> Rc<mem::MaybeUninit<T>> {
        unsafe {
            Rc::from_ptr(Rc::allocate_for_layout(Layout::new::<T>(), |mem| {
                mem as *mut RcBox<mem::MaybeUninit<T>>
            }))
        }
    }
    /// Constructs a new `Rc` with uninitialized contents, with the memory
    /// being filled with `0` bytes.
    ///
    /// See [`MaybeUninit::zeroed`][zeroed] for examples of correct and
    /// incorrect usage of this method.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(new_uninit)]
    ///
    /// use std::rc::Rc;
    ///
    /// let zero = Rc::<u32>::new_zeroed();
    /// let zero = unsafe { zero.assume_init() };
    ///
    /// assert_eq!(*zero, 0)
    /// ```
    ///
    /// [zeroed]: ../../std/mem/union.MaybeUninit.html#method.zeroed
    #[unstable(feature = "new_uninit", issue = "63291")]
    pub fn new_zeroed() -> Rc<mem::MaybeUninit<T>> {
        unsafe {
            let mut uninit = Self::new_uninit();
            ptr::write_bytes::<T>(Rc::get_mut_unchecked(&mut uninit).as_mut_ptr(), 0, 1);
            uninit
        }
    }
    /// Constructs a new `Pin<Rc<T>>`. If `T` does not implement `Unpin`, then
    /// `value` will be pinned in memory and unable to be moved.
    #[stable(feature = "pin", since = "1.33.0")]
    pub fn pin(value: T) -> Pin<Rc<T>> {
        unsafe { Pin::new_unchecked(Rc::new(value)) }
    }
    /// Returns the inner value, if the `Rc` has exactly one strong reference.
    ///
    /// Otherwise, an [`Err`][result] is returned with the same `Rc` that was
    /// passed in.
    ///
    /// This will succeed even if there are outstanding weak references.
    ///
    /// [result]: ../../std/result/enum.Result.html
    ///
    /// # Examples
    ///
    /// ```
    /// use std::rc::Rc;
    ///
    /// let x = Rc::new(3);
    /// assert_eq!(Rc::try_unwrap(x), Ok(3));
    ///
    /// let x = Rc::new(4);
    /// let _y = Rc::clone(&x);
    /// assert_eq!(*Rc::try_unwrap(x).unwrap_err(), 4);
    /// ```
    #[inline]
    #[stable(feature = "rc_unique", since = "1.4.0")]
    pub fn try_unwrap(this: Self) -> Result<T, Self> {
        if Rc::strong_count(&this) == 1 {
            unsafe {
                let val = ptr::read(&*this); // copy the contained object

                // Indicate to Weaks that they can't be promoted by decrementing
                // the strong count, and then remove the implicit "strong weak"
                // pointer while also handling drop logic by just crafting a
                // fake Weak.
                this.dec_strong();
                let _weak = Weak { ptr: this.ptr };
                forget(this);
                Ok(val)
            }
        } else {
            Err(this)
        }
    }
}
impl<T> Rc<[T]> {
    /// Constructs a new reference-counted slice with uninitialized contents.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(new_uninit)]
    /// #![feature(get_mut_unchecked)]
    ///
    /// use std::rc::Rc;
    ///
    /// let mut values = Rc::<[u32]>::new_uninit_slice(3);
    ///
    /// let values = unsafe {
    ///     // Deferred initialization:
    ///     Rc::get_mut_unchecked(&mut values)[0].as_mut_ptr().write(1);
    ///     Rc::get_mut_unchecked(&mut values)[1].as_mut_ptr().write(2);
    ///     Rc::get_mut_unchecked(&mut values)[2].as_mut_ptr().write(3);
    ///
    ///     values.assume_init()
    /// };
    ///
    /// assert_eq!(*values, [1, 2, 3])
    /// ```
    #[unstable(feature = "new_uninit", issue = "63291")]
    pub fn new_uninit_slice(len: usize) -> Rc<[mem::MaybeUninit<T>]> {
        unsafe { Rc::from_ptr(Rc::allocate_for_slice(len)) }
    }
}
impl<T> Rc<mem::MaybeUninit<T>> {
    /// Converts to `Rc<T>`.
    ///
    /// # Safety
    ///
    /// As with [`MaybeUninit::assume_init`],
    /// it is up to the caller to guarantee that the inner value
    /// really is in an initialized state.
    /// Calling this when the content is not yet fully initialized
    /// causes immediate undefined behavior.
    ///
    /// [`MaybeUninit::assume_init`]: ../../std/mem/union.MaybeUninit.html#method.assume_init
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(new_uninit)]
    /// #![feature(get_mut_unchecked)]
    ///
    /// use std::rc::Rc;
    ///
    /// let mut five = Rc::<u32>::new_uninit();
    ///
    /// let five = unsafe {
    ///     // Deferred initialization:
    ///     Rc::get_mut_unchecked(&mut five).as_mut_ptr().write(5);
    ///
    ///     five.assume_init()
    /// };
    ///
    /// assert_eq!(*five, 5)
    /// ```
    #[unstable(feature = "new_uninit", issue = "63291")]
    #[inline]
    pub unsafe fn assume_init(self) -> Rc<T> {
        Rc::from_inner(mem::ManuallyDrop::new(self).ptr.cast())
    }
}
impl<T> Rc<[mem::MaybeUninit<T>]> {
    /// Converts to `Rc<[T]>`.
    ///
    /// # Safety
    ///
    /// As with [`MaybeUninit::assume_init`],
    /// it is up to the caller to guarantee that the inner value
    /// really is in an initialized state.
    /// Calling this when the content is not yet fully initialized
    /// causes immediate undefined behavior.
    ///
    /// [`MaybeUninit::assume_init`]: ../../std/mem/union.MaybeUninit.html#method.assume_init
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(new_uninit)]
    /// #![feature(get_mut_unchecked)]
    ///
    /// use std::rc::Rc;
    ///
    /// let mut values = Rc::<[u32]>::new_uninit_slice(3);
    ///
    /// let values = unsafe {
    ///     // Deferred initialization:
    ///     Rc::get_mut_unchecked(&mut values)[0].as_mut_ptr().write(1);
    ///     Rc::get_mut_unchecked(&mut values)[1].as_mut_ptr().write(2);
    ///     Rc::get_mut_unchecked(&mut values)[2].as_mut_ptr().write(3);
    ///
    ///     values.assume_init()
    /// };
    ///
    /// assert_eq!(*values, [1, 2, 3])
    /// ```
    #[unstable(feature = "new_uninit", issue = "63291")]
    #[inline]
    pub unsafe fn assume_init(self) -> Rc<[T]> {
        unsafe { Rc::from_ptr(mem::ManuallyDrop::new(self).ptr.as_ptr() as _) }
    }
}
impl<T: ?Sized> Rc<T> {
    /// Consumes the `Rc`, returning the wrapped pointer.
    ///
    /// To avoid a memory leak the pointer must be converted back to an `Rc` using
    /// [`Rc::from_raw`][from_raw].
    ///
    /// [from_raw]: struct.Rc.html#method.from_raw
    ///
    /// # Examples
    ///
    /// ```
    /// use std::rc::Rc;
    ///
    /// let x = Rc::new("hello".to_owned());
    /// let x_ptr = Rc::into_raw(x);
    /// assert_eq!(unsafe { &*x_ptr }, "hello");
    /// ```
    #[stable(feature = "rc_raw", since = "1.17.0")]
    pub fn into_raw(this: Self) -> *const T {
        let ptr = Self::as_ptr(&this);
        mem::forget(this);
        ptr
    }
    /// Provides a raw pointer to the data.
    ///
    /// The counts are not affected in any way and the `Rc` is not consumed. The pointer is valid
    /// for as long as there are strong counts in the `Rc`.
    ///
    /// # Examples
    ///
    /// ```
    /// use std::rc::Rc;
    ///
    /// let x = Rc::new("hello".to_owned());
    /// let y = Rc::clone(&x);
    /// let x_ptr = Rc::as_ptr(&x);
    /// assert_eq!(x_ptr, Rc::as_ptr(&y));
    /// assert_eq!(unsafe { &*x_ptr }, "hello");
    /// ```
    #[stable(feature = "weak_into_raw", since = "1.45.0")]
    pub fn as_ptr(this: &Self) -> *const T {
        let ptr: *mut RcBox<T> = NonNull::as_ptr(this.ptr);

        // SAFETY: This cannot go through Deref::deref or Rc::inner because
        // this is required to retain raw/mut provenance such that e.g. `get_mut` can
        // write through the pointer after the Rc is recovered through `from_raw`.
        unsafe { &raw const (*ptr).value }
    }
    /// Constructs an `Rc<T>` from a raw pointer.
    ///
    /// The raw pointer must have been previously returned by a call to
    /// [`Rc<U>::into_raw`][into_raw] where `U` must have the same size
    /// and alignment as `T`. This is trivially true if `U` is `T`.
    /// Note that if `U` is not `T` but has the same size and alignment, this is
    /// basically like transmuting references of different types. See
    /// [`mem::transmute`][transmute] for more information on what
    /// restrictions apply in this case.
    ///
    /// The user of `from_raw` has to make sure a specific value of `T` is only
    /// dropped once.
    ///
    /// This function is unsafe because improper use may lead to memory unsafety,
    /// even if the returned `Rc<T>` is never accessed.
    ///
    /// [into_raw]: struct.Rc.html#method.into_raw
    /// [transmute]: ../../std/mem/fn.transmute.html
    ///
    /// # Examples
    ///
    /// ```
    /// use std::rc::Rc;
    ///
    /// let x = Rc::new("hello".to_owned());
    /// let x_ptr = Rc::into_raw(x);
    ///
    /// unsafe {
    ///     // Convert back to an `Rc` to prevent leak.
    ///     let x = Rc::from_raw(x_ptr);
    ///     assert_eq!(&*x, "hello");
    ///
    ///     // Further calls to `Rc::from_raw(x_ptr)` would be memory-unsafe.
    /// }
    ///
    /// // The memory was freed when `x` went out of scope above, so `x_ptr` is now dangling!
    /// ```
    #[stable(feature = "rc_raw", since = "1.17.0")]
    pub unsafe fn from_raw(ptr: *const T) -> Self {
        let offset = unsafe { data_offset(ptr) };

        // Reverse the offset to find the original RcBox.
        let fake_ptr = ptr as *mut RcBox<T>;
        let rc_ptr = unsafe { set_data_ptr(fake_ptr, (ptr as *mut u8).offset(-offset)) };

        unsafe { Self::from_ptr(rc_ptr) }
    }
    /// Consumes the `Rc`, returning the wrapped pointer as `NonNull<T>`.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(rc_into_raw_non_null)]
    /// #![allow(deprecated)]
    ///
    /// use std::rc::Rc;
    ///
    /// let x = Rc::new("hello".to_owned());
    /// let ptr = Rc::into_raw_non_null(x);
    /// let deref = unsafe { ptr.as_ref() };
    /// assert_eq!(deref, "hello");
    /// ```
    #[unstable(feature = "rc_into_raw_non_null", issue = "47336")]
    #[rustc_deprecated(since = "1.44.0", reason = "use `Rc::into_raw` instead")]
    pub fn into_raw_non_null(this: Self) -> NonNull<T> {
        // safe because Rc guarantees its pointer is non-null
        unsafe { NonNull::new_unchecked(Rc::into_raw(this) as *mut _) }
    }
    /// Creates a new [`Weak`][weak] pointer to this allocation.
    ///
    /// [weak]: struct.Weak.html
    ///
    /// # Examples
    ///
    /// ```
    /// use std::rc::Rc;
    ///
    /// let five = Rc::new(5);
    ///
    /// let weak_five = Rc::downgrade(&five);
    /// ```
    #[stable(feature = "rc_weak", since = "1.4.0")]
    pub fn downgrade(this: &Self) -> Weak<T> {
        this.inc_weak();
        // Make sure we do not create a dangling Weak
        debug_assert!(!is_dangling(this.ptr));
        Weak { ptr: this.ptr }
    }
    /// Gets the number of [`Weak`][weak] pointers to this allocation.
    ///
    /// [weak]: struct.Weak.html
    ///
    /// # Examples
    ///
    /// ```
    /// use std::rc::Rc;
    ///
    /// let five = Rc::new(5);
    /// let _weak_five = Rc::downgrade(&five);
    ///
    /// assert_eq!(1, Rc::weak_count(&five));
    /// ```
    #[inline]
    #[stable(feature = "rc_counts", since = "1.15.0")]
    pub fn weak_count(this: &Self) -> usize {
        this.weak() - 1
    }
    /// Gets the number of strong (`Rc`) pointers to this allocation.
    ///
    /// # Examples
    ///
    /// ```
    /// use std::rc::Rc;
    ///
    /// let five = Rc::new(5);
    /// let _also_five = Rc::clone(&five);
    ///
    /// assert_eq!(2, Rc::strong_count(&five));
    /// ```
    #[inline]
    #[stable(feature = "rc_counts", since = "1.15.0")]
    pub fn strong_count(this: &Self) -> usize {
        this.strong()
    }
    /// Returns `true` if there are no other `Rc` or [`Weak`][weak] pointers to
    /// this allocation.
    ///
    /// [weak]: struct.Weak.html
    #[inline]
    fn is_unique(this: &Self) -> bool {
        Rc::weak_count(this) == 0 && Rc::strong_count(this) == 1
    }
    /// Returns a mutable reference into the given `Rc`, if there are
    /// no other `Rc` or [`Weak`][weak] pointers to the same allocation.
    ///
    /// Returns [`None`] otherwise, because it is not safe to
    /// mutate a shared value.
    ///
    /// See also [`make_mut`][make_mut], which will [`clone`][clone]
    /// the inner value when there are other pointers.
    ///
    /// [weak]: struct.Weak.html
    /// [`None`]: ../../std/option/enum.Option.html#variant.None
    /// [make_mut]: struct.Rc.html#method.make_mut
    /// [clone]: ../../std/clone/trait.Clone.html#tymethod.clone
    ///
    /// # Examples
    ///
    /// ```
    /// use std::rc::Rc;
    ///
    /// let mut x = Rc::new(3);
    /// *Rc::get_mut(&mut x).unwrap() = 4;
    /// assert_eq!(*x, 4);
    ///
    /// let _y = Rc::clone(&x);
    /// assert!(Rc::get_mut(&mut x).is_none());
    /// ```
    #[inline]
    #[stable(feature = "rc_unique", since = "1.4.0")]
    pub fn get_mut(this: &mut Self) -> Option<&mut T> {
        if Rc::is_unique(this) { unsafe { Some(Rc::get_mut_unchecked(this)) } } else { None }
    }
    /// Returns a mutable reference into the given `Rc`,
    /// without any check.
    ///
    /// See also [`get_mut`], which is safe and does appropriate checks.
    ///
    /// [`get_mut`]: struct.Rc.html#method.get_mut
    ///
    /// # Safety
    ///
    /// Any other `Rc` or [`Weak`] pointers to the same allocation must not be dereferenced
    /// for the duration of the returned borrow.
    /// This is trivially the case if no such pointers exist,
    /// for example immediately after `Rc::new`.
    ///
    /// # Examples
    ///
    /// ```
    /// #![feature(get_mut_unchecked)]
    ///
    /// use std::rc::Rc;
    ///
    /// let mut x = Rc::new(String::new());
    /// unsafe {
    ///     Rc::get_mut_unchecked(&mut x).push_str("foo")
    /// }
    /// assert_eq!(*x, "foo");
    /// ```
    #[inline]
    #[unstable(feature = "get_mut_unchecked", issue = "63292")]
    pub unsafe fn get_mut_unchecked(this: &mut Self) -> &mut T {
        unsafe { &mut this.ptr.as_mut().value }
    }
    #[inline]
    #[stable(feature = "ptr_eq", since = "1.17.0")]
    /// Returns `true` if the two `Rc`s point to the same allocation
    /// (in a vein similar to [`ptr::eq`]).
    ///
    /// # Examples
    ///
    /// ```
    /// use std::rc::Rc;
    ///
    /// let five = Rc::new(5);
    /// let same_five = Rc::clone(&five);
    /// let other_five = Rc::new(5);
    ///
    /// assert!(Rc::ptr_eq(&five, &same_five));
    /// assert!(!Rc::ptr_eq(&five, &other_five));
    /// ```
    ///
    /// [`ptr::eq`]: ../../std/ptr/fn.eq.html
    pub fn ptr_eq(this: &Self, other: &Self) -> bool {
        this.ptr.as_ptr() == other.ptr.as_ptr()
    }
}
impl<T: Clone> Rc<T> {
    /// Makes a mutable reference into the given `Rc`.
    ///
    /// If there are other `Rc` pointers to the same allocation, then `make_mut` will
    /// [`clone`] the inner value to a new allocation to ensure unique ownership. This is also
    /// referred to as clone-on-write.
    ///
    /// If there are no other `Rc` pointers to this allocation, then [`Weak`]
    /// pointers to this allocation will be disassociated.
    ///
    /// See also [`get_mut`], which will fail rather than cloning.
    ///
    /// [`Weak`]: struct.Weak.html
    /// [`clone`]: ../../std/clone/trait.Clone.html#tymethod.clone
    /// [`get_mut`]: struct.Rc.html#method.get_mut
    ///
    /// # Examples
    ///
    /// ```
    /// use std::rc::Rc;
    ///
    /// let mut data = Rc::new(5);
    ///
    /// *Rc::make_mut(&mut data) += 1;          // Won't clone anything
    /// let mut other_data = Rc::clone(&data);  // Won't clone inner data
    /// *Rc::make_mut(&mut data) += 1;          // Clones inner data
    /// *Rc::make_mut(&mut data) += 1;          // Won't clone anything
    /// *Rc::make_mut(&mut other_data) *= 2;    // Won't clone anything
    ///
    /// // Now `data` and `other_data` point to different allocations.
    /// assert_eq!(*data, 8);
    /// assert_eq!(*other_data, 12);
    /// ```
    ///
    /// [`Weak`] pointers will be disassociated:
    ///
    /// ```
    /// use std::rc::Rc;
    ///
    /// let mut data = Rc::new(75);
    /// let weak = Rc::downgrade(&data);
    ///
    /// assert!(75 == *data);
    /// assert!(75 == *weak.upgrade().unwrap());
    ///
    /// *Rc::make_mut(&mut data) += 1;
    ///
    /// assert!(76 == *data);
    /// assert!(weak.upgrade().is_none());
    /// ```
    #[inline]
    #[stable(feature = "rc_unique", since = "1.4.0")]
    pub fn make_mut(this: &mut Self) -> &mut T {
        if Rc::strong_count(this) != 1 {
            // Gotta clone the data, there are other Rcs
            *this = Rc::new((**this).clone())
        } else if Rc::weak_count(this) != 0 {
            // Can just steal the data, all that's left is Weaks
            unsafe {
                let mut swap = Rc::new(ptr::read(&this.ptr.as_ref().value));
                mem::swap(this, &mut swap);
                swap.dec_strong();
                // Remove implicit strong-weak ref (no need to craft a fake
                // Weak here -- we know other Weaks can clean up for us)
                swap.dec_weak();
                forget(swap);
            }
        }
        // This unsafety is ok because we're guaranteed that the pointer
        // returned is the *only* pointer that will ever be returned to T. Our
        // reference count is guaranteed to be 1 at this point, and we required
        // the `Rc<T>` itself to be `mut`, so we're returning the only possible
        // reference to the allocation.
        unsafe { &mut this.ptr.as_mut().value }
    }
}
impl Rc<dyn Any> {
    #[inline]
    #[stable(feature = "rc_downcast", since = "1.29.0")]
    /// Attempt to downcast the `Rc<dyn Any>` to a concrete type.
    ///
    /// # Examples
    ///
    /// ```
    /// use std::any::Any;
    /// use std::rc::Rc;
    ///
    /// fn print_if_string(value: Rc<dyn Any>) {
    ///     if let Ok(string) = value.downcast::<String>() {
    ///         println!("String ({}): {}", string.len(), string);
    ///     }
    /// }
    ///
    /// let my_string = "Hello World".to_string();
    /// print_if_string(Rc::new(my_string));
    /// print_if_string(Rc::new(0i8));
    /// ```
    pub fn downcast<T: Any>(self) -> Result<Rc<T>, Rc<dyn Any>> {
        if (*self).is::<T>() {
            let ptr = self.ptr.cast::<RcBox<T>>();
            forget(self);
            Ok(Rc::from_inner(ptr))
        } else {
            Err(self)
        }
    }
}
impl<T: ?Sized> Rc<T> {
    /// Allocates an `RcBox<T>` with sufficient space for
    /// a possibly-unsized inner value where the value has the layout provided.
    ///
    /// The function `mem_to_rcbox` is called with the data pointer
    /// and must return back a (potentially fat)-pointer for the `RcBox<T>`.
    unsafe fn allocate_for_layout(
        value_layout: Layout,
        mem_to_rcbox: impl FnOnce(*mut u8) -> *mut RcBox<T>,
    ) -> *mut RcBox<T> {
        // Calculate layout using the given value layout.
        // Previously, layout was calculated on the expression
        // `&*(ptr as *const RcBox<T>)`, but this created a misaligned
        // reference (see #54908).
        let layout = Layout::new::<RcBox<()>>().extend(value_layout).unwrap().0.pad_to_align();

        // Allocate for the layout.
        let mem = Global
            .alloc(layout, AllocInit::Uninitialized)
            .unwrap_or_else(|_| handle_alloc_error(layout));

        // Initialize the RcBox
        let inner = mem_to_rcbox(mem.ptr.as_ptr());
        debug_assert_eq!(Layout::for_value(&*inner), layout);

        ptr::write(&mut (*inner).strong, Cell::new(1));
        ptr::write(&mut (*inner).weak, Cell::new(1));

        inner
    }
    /// Allocates an `RcBox<T>` with sufficient space for an unsized inner value
    unsafe fn allocate_for_ptr(ptr: *const T) -> *mut RcBox<T> {
        // Allocate for the `RcBox<T>` using the given value.
        Self::allocate_for_layout(Layout::for_value(&*ptr), |mem| {
            set_data_ptr(ptr as *mut T, mem) as *mut RcBox<T>
        })
    }
    fn from_box(v: Box<T>) -> Rc<T> {
        unsafe {
            let box_unique = Box::into_unique(v);
            let bptr = box_unique.as_ptr();

            let value_size = size_of_val(&*bptr);
            let ptr = Self::allocate_for_ptr(bptr);

            // Copy value as bytes
            ptr::copy_nonoverlapping(
                bptr as *const T as *const u8,
                &mut (*ptr).value as *mut _ as *mut u8,
                value_size,
            );

            // Free the allocation without dropping its contents
            box_free(box_unique);

            Self::from_ptr(ptr)
        }
    }
}
impl<T> Rc<[T]> {
    /// Allocates an `RcBox<[T]>` with the given length.
    unsafe fn allocate_for_slice(len: usize) -> *mut RcBox<[T]> {
        Self::allocate_for_layout(Layout::array::<T>(len).unwrap(), |mem| {
            ptr::slice_from_raw_parts_mut(mem as *mut T, len) as *mut RcBox<[T]>
        })
    }
}
/// Sets the data pointer of a `?Sized` raw pointer.
///
/// For a slice/trait object, this sets the `data` field and leaves the rest
/// unchanged. For a sized raw pointer, this simply sets the pointer.
unsafe fn set_data_ptr<T: ?Sized, U>(mut ptr: *mut T, data: *mut U) -> *mut T {
    ptr::write(&mut ptr as *mut _ as *mut *mut u8, data as *mut u8);
    ptr
}
impl<T> Rc<[T]> {
    /// Copy elements from slice into newly allocated Rc<\[T\]>
    ///
    /// Unsafe because the caller must either take ownership or bind `T: Copy`
    unsafe fn copy_from_slice(v: &[T]) -> Rc<[T]> {
        let ptr = Self::allocate_for_slice(v.len());
        ptr::copy_nonoverlapping(v.as_ptr(), &mut (*ptr).value as *mut [T] as *mut T, v.len());
        Self::from_ptr(ptr)
    }
    /// Constructs an `Rc<[T]>` from an iterator known to be of a certain size.
    ///
    /// Behavior is undefined should the size be wrong.
    unsafe fn from_iter_exact(iter: impl iter::Iterator<Item = T>, len: usize) -> Rc<[T]> {
        // Panic guard while cloning T elements.
        // In the event of a panic, elements that have been written
        // into the new RcBox will be dropped, then the memory freed.
        struct Guard<T> {
            mem: NonNull<u8>,
            elems: *mut T,
            layout: Layout,
            n_elems: usize,
        }

        impl<T> Drop for Guard<T> {
            fn drop(&mut self) {
                unsafe {
                    let slice = from_raw_parts_mut(self.elems, self.n_elems);
                    ptr::drop_in_place(slice);

                    Global.dealloc(self.mem, self.layout);
                }
            }
        }

        let ptr = Self::allocate_for_slice(len);

        let mem = ptr as *mut _ as *mut u8;
        let layout = Layout::for_value(&*ptr);

        // Pointer to first element
        let elems = &mut (*ptr).value as *mut [T] as *mut T;

        let mut guard = Guard { mem: NonNull::new_unchecked(mem), elems, layout, n_elems: 0 };

        for (i, item) in iter.enumerate() {
            ptr::write(elems.add(i), item);
            guard.n_elems += 1;
        }

        // All clear. Forget the guard so it doesn't free the new RcBox.
        forget(guard);

        Self::from_ptr(ptr)
    }
}
/// Specialization trait used for `From<&[T]>`.
trait RcFromSlice<T> {
    fn from_slice(slice: &[T]) -> Self;
}

impl<T: Clone> RcFromSlice<T> for Rc<[T]> {
    #[inline]
    default fn from_slice(v: &[T]) -> Self {
        unsafe { Self::from_iter_exact(v.iter().cloned(), v.len()) }
    }
}

impl<T: Copy> RcFromSlice<T> for Rc<[T]> {
    #[inline]
    fn from_slice(v: &[T]) -> Self {
        unsafe { Rc::copy_from_slice(v) }
    }
}
#[stable(feature = "rust1", since = "1.0.0")]
impl<T: ?Sized> Deref for Rc<T> {
    type Target = T;

    #[inline(always)]
    fn deref(&self) -> &T {
        &self.inner().value
    }
}

#[unstable(feature = "receiver_trait", issue = "none")]
impl<T: ?Sized> Receiver for Rc<T> {}
#[stable(feature = "rust1", since = "1.0.0")]
unsafe impl<#[may_dangle] T: ?Sized> Drop for Rc<T> {
    /// Drops the `Rc`.
    ///
    /// This will decrement the strong reference count. If the strong reference
    /// count reaches zero then the only other references (if any) are
    /// [`Weak`], so we `drop` the inner value.
    ///
    /// # Examples
    ///
    /// ```
    /// use std::rc::Rc;
    ///
    /// struct Foo;
    ///
    /// impl Drop for Foo {
    ///     fn drop(&mut self) {
    ///         println!("dropped!");
    ///     }
    /// }
    ///
    /// let foo = Rc::new(Foo);
    /// let foo2 = Rc::clone(&foo);
    ///
    /// drop(foo);  // Doesn't print anything
    /// drop(foo2); // Prints "dropped!"
    /// ```
    ///
    /// [`Weak`]: ../../std/rc/struct.Weak.html
    fn drop(&mut self) {
        unsafe {
            self.dec_strong();
            if self.strong() == 0 {
                // destroy the contained object
                ptr::drop_in_place(self.ptr.as_mut());

                // remove the implicit "strong weak" pointer now that we've
                // destroyed the contents.
                self.dec_weak();

                if self.weak() == 0 {
                    Global.dealloc(self.ptr.cast(), Layout::for_value(self.ptr.as_ref()));
                }
            }
        }
    }
}
#[stable(feature = "rust1", since = "1.0.0")]
impl<T: ?Sized> Clone for Rc<T> {
    /// Makes a clone of the `Rc` pointer.
    ///
    /// This creates another pointer to the same allocation, increasing the
    /// strong reference count.
    ///
    /// # Examples
    ///
    /// ```
    /// use std::rc::Rc;
    ///
    /// let five = Rc::new(5);
    ///
    /// let _ = Rc::clone(&five);
    /// ```
    #[inline]
    fn clone(&self) -> Rc<T> {
        self.inc_strong();
        Self::from_inner(self.ptr)
    }
}
#[stable(feature = "rust1", since = "1.0.0")]
impl<T: Default> Default for Rc<T> {
    /// Creates a new `Rc<T>`, with the `Default` value for `T`.
    ///
    /// # Examples
    ///
    /// ```
    /// use std::rc::Rc;
    ///
    /// let x: Rc<i32> = Default::default();
    /// assert_eq!(*x, 0);
    /// ```
    #[inline]
    fn default() -> Rc<T> {
        Rc::new(Default::default())
    }
}
#[stable(feature = "rust1", since = "1.0.0")]
trait RcEqIdent<T: ?Sized + PartialEq> {
    fn eq(&self, other: &Rc<T>) -> bool;
    fn ne(&self, other: &Rc<T>) -> bool;
}

#[stable(feature = "rust1", since = "1.0.0")]
impl<T: ?Sized + PartialEq> RcEqIdent<T> for Rc<T> {
    #[inline]
    default fn eq(&self, other: &Rc<T>) -> bool {
        **self == **other
    }

    #[inline]
    default fn ne(&self, other: &Rc<T>) -> bool {
        **self != **other
    }
}

// Hack to allow specializing on `Eq` even though `Eq` has a method.
#[rustc_unsafe_specialization_marker]
pub(crate) trait MarkerEq: PartialEq<Self> {}

impl<T: Eq> MarkerEq for T {}

/// We're doing this specialization here, and not as a more general optimization on `&T`, because it
/// would otherwise add a cost to all equality checks on refs. We assume that `Rc`s are used to
/// store large values, that are slow to clone, but also heavy to check for equality, causing this
/// cost to pay off more easily. It's also more likely to have two `Rc` clones, that point to
/// the same value, than two `&T`s.
///
/// We can only do this when `T: Eq` as a `PartialEq` might be deliberately irreflexive.
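///
/// For example (illustrative only, not additional API surface): pointer-equal
/// clones compare equal via the identity check alone, without reading the
/// inner values.
///
/// ```
/// use std::rc::Rc;
///
/// let a = Rc::new(vec![1, 2, 3]);
/// let b = Rc::clone(&a);
/// // Pointer-equal `Rc`s short-circuit the element-wise comparison.
/// assert!(a == b);
/// ```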
#[stable(feature = "rust1", since = "1.0.0")]
impl<T: ?Sized + MarkerEq> RcEqIdent<T> for Rc<T> {
    #[inline]
    fn eq(&self, other: &Rc<T>) -> bool {
        Rc::ptr_eq(self, other) || **self == **other
    }

    #[inline]
    fn ne(&self, other: &Rc<T>) -> bool {
        !Rc::ptr_eq(self, other) && **self != **other
    }
}
#[stable(feature = "rust1", since = "1.0.0")]
impl<T: ?Sized + PartialEq> PartialEq for Rc<T> {
    /// Equality for two `Rc`s.
    ///
    /// Two `Rc`s are equal if their inner values are equal, even if they are
    /// stored in different allocations.
    ///
    /// If `T` also implements `Eq` (implying reflexivity of equality),
    /// two `Rc`s that point to the same allocation are
    /// always equal.
    ///
    /// # Examples
    ///
    /// ```
    /// use std::rc::Rc;
    ///
    /// let five = Rc::new(5);
    ///
    /// assert!(five == Rc::new(5));
    /// ```
    #[inline]
    fn eq(&self, other: &Rc<T>) -> bool {
        RcEqIdent::eq(self, other)
    }

    /// Inequality for two `Rc`s.
    ///
    /// Two `Rc`s are unequal if their inner values are unequal.
    ///
    /// If `T` also implements `Eq` (implying reflexivity of equality),
    /// two `Rc`s that point to the same allocation are
    /// never unequal.
    ///
    /// # Examples
    ///
    /// ```
    /// use std::rc::Rc;
    ///
    /// let five = Rc::new(5);
    ///
    /// assert!(five != Rc::new(6));
    /// ```
    #[inline]
    fn ne(&self, other: &Rc<T>) -> bool {
        RcEqIdent::ne(self, other)
    }
}

#[stable(feature = "rust1", since = "1.0.0")]
impl<T: ?Sized + Eq> Eq for Rc<T> {}
#[stable(feature = "rust1", since = "1.0.0")]
impl<T: ?Sized + PartialOrd> PartialOrd for Rc<T> {
    /// Partial comparison for two `Rc`s.
    ///
    /// The two are compared by calling `partial_cmp()` on their inner values.
    ///
    /// # Examples
    ///
    /// ```
    /// use std::rc::Rc;
    /// use std::cmp::Ordering;
    ///
    /// let five = Rc::new(5);
    ///
    /// assert_eq!(Some(Ordering::Less), five.partial_cmp(&Rc::new(6)));
    /// ```
    #[inline(always)]
    fn partial_cmp(&self, other: &Rc<T>) -> Option<Ordering> {
        (**self).partial_cmp(&**other)
    }

    /// Less-than comparison for two `Rc`s.
    ///
    /// The two are compared by calling `<` on their inner values.
    ///
    /// # Examples
    ///
    /// ```
    /// use std::rc::Rc;
    ///
    /// let five = Rc::new(5);
    ///
    /// assert!(five < Rc::new(6));
    /// ```
    #[inline(always)]
    fn lt(&self, other: &Rc<T>) -> bool {
        **self < **other
    }

    /// 'Less than or equal to' comparison for two `Rc`s.
    ///
    /// The two are compared by calling `<=` on their inner values.
    ///
    /// # Examples
    ///
    /// ```
    /// use std::rc::Rc;
    ///
    /// let five = Rc::new(5);
    ///
    /// assert!(five <= Rc::new(5));
    /// ```
    #[inline(always)]
    fn le(&self, other: &Rc<T>) -> bool {
        **self <= **other
    }

    /// Greater-than comparison for two `Rc`s.
    ///
    /// The two are compared by calling `>` on their inner values.
    ///
    /// # Examples
    ///
    /// ```
    /// use std::rc::Rc;
    ///
    /// let five = Rc::new(5);
    ///
    /// assert!(five > Rc::new(4));
    /// ```
    #[inline(always)]
    fn gt(&self, other: &Rc<T>) -> bool {
        **self > **other
    }

    /// 'Greater than or equal to' comparison for two `Rc`s.
    ///
    /// The two are compared by calling `>=` on their inner values.
    ///
    /// # Examples
    ///
    /// ```
    /// use std::rc::Rc;
    ///
    /// let five = Rc::new(5);
    ///
    /// assert!(five >= Rc::new(5));
    /// ```
    #[inline(always)]
    fn ge(&self, other: &Rc<T>) -> bool {
        **self >= **other
    }
}
#[stable(feature = "rust1", since = "1.0.0")]
impl<T: ?Sized + Ord> Ord for Rc<T> {
    /// Comparison for two `Rc`s.
    ///
    /// The two are compared by calling `cmp()` on their inner values.
    ///
    /// # Examples
    ///
    /// ```
    /// use std::rc::Rc;
    /// use std::cmp::Ordering;
    ///
    /// let five = Rc::new(5);
    ///
    /// assert_eq!(Ordering::Less, five.cmp(&Rc::new(6)));
    /// ```
    #[inline]
    fn cmp(&self, other: &Rc<T>) -> Ordering {
        (**self).cmp(&**other)
    }
}

#[stable(feature = "rust1", since = "1.0.0")]
impl<T: ?Sized + Hash> Hash for Rc<T> {
    fn hash<H: Hasher>(&self, state: &mut H) {
        (**self).hash(state);
    }
}
#[stable(feature = "rust1", since = "1.0.0")]
impl<T: ?Sized + fmt::Display> fmt::Display for Rc<T> {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        fmt::Display::fmt(&**self, f)
    }
}

#[stable(feature = "rust1", since = "1.0.0")]
impl<T: ?Sized + fmt::Debug> fmt::Debug for Rc<T> {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        fmt::Debug::fmt(&**self, f)
    }
}

#[stable(feature = "rust1", since = "1.0.0")]
impl<T: ?Sized> fmt::Pointer for Rc<T> {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        fmt::Pointer::fmt(&(&**self as *const T), f)
    }
}
#[stable(feature = "from_for_ptrs", since = "1.6.0")]
impl<T> From<T> for Rc<T> {
    fn from(t: T) -> Self {
        Rc::new(t)
    }
}

#[stable(feature = "shared_from_slice", since = "1.21.0")]
impl<T: Clone> From<&[T]> for Rc<[T]> {
    #[inline]
    fn from(v: &[T]) -> Rc<[T]> {
        <Self as RcFromSlice<T>>::from_slice(v)
    }
}

#[stable(feature = "shared_from_slice", since = "1.21.0")]
impl From<&str> for Rc<str> {
    #[inline]
    fn from(v: &str) -> Rc<str> {
        let rc = Rc::<[u8]>::from(v.as_bytes());
        unsafe { Rc::from_raw(Rc::into_raw(rc) as *const str) }
    }
}

#[stable(feature = "shared_from_slice", since = "1.21.0")]
impl From<String> for Rc<str> {
    #[inline]
    fn from(v: String) -> Rc<str> {
        Rc::from(&v[..])
    }
}

#[stable(feature = "shared_from_slice", since = "1.21.0")]
impl<T: ?Sized> From<Box<T>> for Rc<T> {
    #[inline]
    fn from(v: Box<T>) -> Rc<T> {
        Rc::from_box(v)
    }
}

#[stable(feature = "shared_from_slice", since = "1.21.0")]
impl<T> From<Vec<T>> for Rc<[T]> {
    #[inline]
    fn from(mut v: Vec<T>) -> Rc<[T]> {
        unsafe {
            let rc = Rc::copy_from_slice(&v);

            // Allow the Vec to free its memory, but not destroy its contents
            v.set_len(0);

            rc
        }
    }
}

#[stable(feature = "shared_from_cow", since = "1.45.0")]
impl<'a, B> From<Cow<'a, B>> for Rc<B>
where
    B: ToOwned + ?Sized,
    Rc<B>: From<&'a B> + From<B::Owned>,
{
    fn from(cow: Cow<'a, B>) -> Rc<B> {
        match cow {
            Cow::Borrowed(s) => Rc::from(s),
            Cow::Owned(s) => Rc::from(s),
        }
    }
}

#[stable(feature = "boxed_slice_try_from", since = "1.43.0")]
impl<T, const N: usize> TryFrom<Rc<[T]>> for Rc<[T; N]>
where
    [T; N]: LengthAtMost32,
{
    type Error = Rc<[T]>;

    fn try_from(boxed_slice: Rc<[T]>) -> Result<Self, Self::Error> {
        if boxed_slice.len() == N {
            Ok(unsafe { Rc::from_raw(Rc::into_raw(boxed_slice) as *mut [T; N]) })
        } else {
            Err(boxed_slice)
        }
    }
}
#[stable(feature = "shared_from_iter", since = "1.37.0")]
impl<T> iter::FromIterator<T> for Rc<[T]> {
    /// Takes each element in the `Iterator` and collects it into an `Rc<[T]>`.
    ///
    /// # Performance characteristics
    ///
    /// ## The general case
    ///
    /// In the general case, collecting into `Rc<[T]>` is done by first
    /// collecting into a `Vec<T>`. That is, when writing the following:
    ///
    /// ```rust
    /// # use std::rc::Rc;
    /// let evens: Rc<[u8]> = (0..10).filter(|&x| x % 2 == 0).collect();
    /// # assert_eq!(&*evens, &[0, 2, 4, 6, 8]);
    /// ```
    ///
    /// this behaves as if we wrote:
    ///
    /// ```rust
    /// # use std::rc::Rc;
    /// let evens: Rc<[u8]> = (0..10).filter(|&x| x % 2 == 0)
    ///     .collect::<Vec<_>>() // The first set of allocations happens here.
    ///     .into(); // A second allocation for `Rc<[T]>` happens here.
    /// # assert_eq!(&*evens, &[0, 2, 4, 6, 8]);
    /// ```
    ///
    /// This will allocate as many times as needed for constructing the `Vec<T>`
    /// and then it will allocate once for turning the `Vec<T>` into the `Rc<[T]>`.
    ///
    /// ## Iterators of known length
    ///
    /// When your `Iterator` implements `TrustedLen` and is of an exact size,
    /// a single allocation will be made for the `Rc<[T]>`. For example:
    ///
    /// ```rust
    /// # use std::rc::Rc;
    /// let evens: Rc<[u8]> = (0..10).collect(); // Just a single allocation happens here.
    /// # assert_eq!(&*evens, &*(0..10).collect::<Vec<_>>());
    /// ```
    fn from_iter<I: iter::IntoIterator<Item = T>>(iter: I) -> Self {
        ToRcSlice::to_rc_slice(iter.into_iter())
    }
}
/// Specialization trait used for collecting into `Rc<[T]>`.
trait ToRcSlice<T>: Iterator<Item = T> + Sized {
    fn to_rc_slice(self) -> Rc<[T]>;
}

impl<T, I: Iterator<Item = T>> ToRcSlice<T> for I {
    default fn to_rc_slice(self) -> Rc<[T]> {
        self.collect::<Vec<T>>().into()
    }
}

impl<T, I: iter::TrustedLen<Item = T>> ToRcSlice<T> for I {
    fn to_rc_slice(self) -> Rc<[T]> {
        // This is the case for a `TrustedLen` iterator.
        let (low, high) = self.size_hint();
        if let Some(high) = high {
            debug_assert_eq!(
                low,
                high,
                "TrustedLen iterator's size hint is not exact: {:?}",
                (low, high)
            );

            unsafe {
                // SAFETY: We need to ensure that the iterator has an exact length and we have.
                Rc::from_iter_exact(self, low)
            }
        } else {
            // Fall back to normal implementation.
            self.collect::<Vec<T>>().into()
        }
    }
}
/// `Weak` is a version of [`Rc`] that holds a non-owning reference to the
/// managed allocation. The allocation is accessed by calling [`upgrade`] on the `Weak`
/// pointer, which returns an [`Option`]`<`[`Rc`]`<T>>`.
///
/// Since a `Weak` reference does not count towards ownership, it will not
/// prevent the value stored in the allocation from being dropped, and `Weak` itself makes no
/// guarantees about the value still being present. Thus it may return [`None`]
/// when [`upgrade`]d. Note however that a `Weak` reference *does* prevent the allocation
/// itself (the backing store) from being deallocated.
///
/// A `Weak` pointer is useful for keeping a temporary reference to the allocation
/// managed by [`Rc`] without preventing its inner value from being dropped. It is also used to
/// prevent circular references between [`Rc`] pointers, since mutual owning references
/// would never allow either [`Rc`] to be dropped. For example, a tree could
/// have strong [`Rc`] pointers from parent nodes to children, and `Weak`
/// pointers from children back to their parents.
///
/// The typical way to obtain a `Weak` pointer is to call [`Rc::downgrade`].
///
/// [`Rc`]: struct.Rc.html
/// [`Rc::downgrade`]: struct.Rc.html#method.downgrade
/// [`upgrade`]: struct.Weak.html#method.upgrade
/// [`Option`]: ../../std/option/enum.Option.html
/// [`None`]: ../../std/option/enum.Option.html#variant.None
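///
/// A brief sketch of the typical pattern:
///
/// ```
/// use std::rc::Rc;
///
/// let strong = Rc::new(4);
/// let weak = Rc::downgrade(&strong);
/// // The `Weak` observes the value without owning it.
/// assert_eq!(weak.upgrade().map(|rc| *rc), Some(4));
/// ```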
#[stable(feature = "rc_weak", since = "1.4.0")]
pub struct Weak<T: ?Sized> {
    // This is a `NonNull` to allow optimizing the size of this type in enums,
    // but it is not necessarily a valid pointer.
    // `Weak::new` sets this to `usize::MAX` so that it doesn’t need
    // to allocate space on the heap. That's not a value a real pointer
    // will ever have because RcBox has alignment at least 2.
    // This is only possible when `T: Sized`; unsized `T` never dangle.
    ptr: NonNull<RcBox<T>>,
}
#[stable(feature = "rc_weak", since = "1.4.0")]
impl<T: ?Sized> !marker::Send for Weak<T> {}
#[stable(feature = "rc_weak", since = "1.4.0")]
impl<T: ?Sized> !marker::Sync for Weak<T> {}

#[unstable(feature = "coerce_unsized", issue = "27732")]
impl<T: ?Sized + Unsize<U>, U: ?Sized> CoerceUnsized<Weak<U>> for Weak<T> {}

#[unstable(feature = "dispatch_from_dyn", issue = "none")]
impl<T: ?Sized + Unsize<U>, U: ?Sized> DispatchFromDyn<Weak<U>> for Weak<T> {}
impl<T> Weak<T> {
    /// Constructs a new `Weak<T>`, without allocating any memory.
    /// Calling [`upgrade`] on the return value always gives [`None`].
    ///
    /// [`upgrade`]: #method.upgrade
    /// [`None`]: ../../std/option/enum.Option.html
    ///
    /// # Examples
    ///
    /// ```
    /// use std::rc::Weak;
    ///
    /// let empty: Weak<i64> = Weak::new();
    /// assert!(empty.upgrade().is_none());
    /// ```
    #[stable(feature = "downgraded_weak", since = "1.10.0")]
    pub fn new() -> Weak<T> {
        Weak { ptr: NonNull::new(usize::MAX as *mut RcBox<T>).expect("MAX is not 0") }
    }
    /// Returns a raw pointer to the object `T` pointed to by this `Weak<T>`.
    ///
    /// The pointer is valid only if there are some strong references. The pointer may be dangling,
    /// unaligned or even [`null`] otherwise.
    ///
    /// # Examples
    ///
    /// ```
    /// use std::rc::Rc;
    /// use std::ptr;
    ///
    /// let strong = Rc::new("hello".to_owned());
    /// let weak = Rc::downgrade(&strong);
    /// // Both point to the same object
    /// assert!(ptr::eq(&*strong, weak.as_ptr()));
    /// // The strong here keeps it alive, so we can still access the object.
    /// assert_eq!("hello", unsafe { &*weak.as_ptr() });
    ///
    /// drop(strong);
    /// // But not any more. We can do weak.as_ptr(), but accessing the pointer would lead to
    /// // undefined behaviour.
    /// // assert_eq!("hello", unsafe { &*weak.as_ptr() });
    /// ```
    ///
    /// [`null`]: ../../std/ptr/fn.null.html
    #[stable(feature = "weak_into_raw", since = "1.45.0")]
    pub fn as_ptr(&self) -> *const T {
        let ptr: *mut RcBox<T> = NonNull::as_ptr(self.ptr);

        // SAFETY: we must offset the pointer manually, and said pointer may be
        // a dangling weak (usize::MAX) if T is sized. data_offset is safe to call,
        // because we know that a pointer to unsized T was derived from a real
        // unsized T, as dangling weaks are only created for sized T. wrapping_offset
        // is used so that we can use the same code path for the non-dangling
        // unsized case and the potentially dangling sized case.
        unsafe {
            let offset = data_offset(ptr as *mut T);
            set_data_ptr(ptr as *mut T, (ptr as *mut u8).wrapping_offset(offset))
        }
    }
    /// Consumes the `Weak<T>` and turns it into a raw pointer.
    ///
    /// This converts the weak pointer into a raw pointer, preserving the original weak count. It
    /// can be turned back into the `Weak<T>` with [`from_raw`].
    ///
    /// The same restrictions of accessing the target of the pointer as with
    /// [`as_ptr`] apply.
    ///
    /// # Examples
    ///
    /// ```
    /// use std::rc::{Rc, Weak};
    ///
    /// let strong = Rc::new("hello".to_owned());
    /// let weak = Rc::downgrade(&strong);
    /// let raw = weak.into_raw();
    ///
    /// assert_eq!(1, Rc::weak_count(&strong));
    /// assert_eq!("hello", unsafe { &*raw });
    ///
    /// drop(unsafe { Weak::from_raw(raw) });
    /// assert_eq!(0, Rc::weak_count(&strong));
    /// ```
    ///
    /// [`from_raw`]: struct.Weak.html#method.from_raw
    /// [`as_ptr`]: struct.Weak.html#method.as_ptr
    #[stable(feature = "weak_into_raw", since = "1.45.0")]
    pub fn into_raw(self) -> *const T {
        let result = self.as_ptr();
        mem::forget(self);
        result
    }
    /// Converts a raw pointer previously created by [`into_raw`] back into `Weak<T>`.
    ///
    /// This can be used to safely get a strong reference (by calling [`upgrade`]
    /// later) or to deallocate the weak count by dropping the `Weak<T>`.
    ///
    /// It takes ownership of one weak count (with the exception of pointers created by [`new`],
    /// as these don't have any corresponding weak count).
    ///
    /// # Safety
    ///
    /// The pointer must have originated from the [`into_raw`] and must still own its potential
    /// weak reference count.
    ///
    /// It is allowed for the strong count to be 0 at the time of calling this, but the weak count
    /// must be non-zero or the pointer must have originated from a dangling `Weak<T>` (one created
    /// by [`new`]).
    ///
    /// # Examples
    ///
    /// ```
    /// use std::rc::{Rc, Weak};
    ///
    /// let strong = Rc::new("hello".to_owned());
    ///
    /// let raw_1 = Rc::downgrade(&strong).into_raw();
    /// let raw_2 = Rc::downgrade(&strong).into_raw();
    ///
    /// assert_eq!(2, Rc::weak_count(&strong));
    ///
    /// assert_eq!("hello", &*unsafe { Weak::from_raw(raw_1) }.upgrade().unwrap());
    /// assert_eq!(1, Rc::weak_count(&strong));
    ///
    /// drop(strong);
    ///
    /// // Decrement the last weak count.
    /// assert!(unsafe { Weak::from_raw(raw_2) }.upgrade().is_none());
    /// ```
    ///
    /// [`into_raw`]: struct.Weak.html#method.into_raw
    /// [`upgrade`]: struct.Weak.html#method.upgrade
    /// [`Rc`]: struct.Rc.html
    /// [`Weak`]: struct.Weak.html
    /// [`new`]: struct.Weak.html#method.new
    /// [`forget`]: ../../std/mem/fn.forget.html
    #[stable(feature = "weak_into_raw", since = "1.45.0")]
    pub unsafe fn from_raw(ptr: *const T) -> Self {
        if ptr.is_null() {
            Self::new()
        } else {
            // See Rc::from_raw for details
            let offset = data_offset(ptr);
            let fake_ptr = ptr as *mut RcBox<T>;
            let ptr = set_data_ptr(fake_ptr, (ptr as *mut u8).offset(-offset));
            Weak { ptr: NonNull::new(ptr).expect("Invalid pointer passed to from_raw") }
        }
    }
}
pub(crate) fn is_dangling<T: ?Sized>(ptr: NonNull<T>) -> bool {
    let address = ptr.as_ptr() as *mut () as usize;
    address == usize::MAX
}
impl<T: ?Sized> Weak<T> {
    /// Attempts to upgrade the `Weak` pointer to an [`Rc`], delaying
    /// dropping of the inner value if successful.
    ///
    /// Returns [`None`] if the inner value has since been dropped.
    ///
    /// [`Rc`]: struct.Rc.html
    /// [`None`]: ../../std/option/enum.Option.html
    ///
    /// # Examples
    ///
    /// ```
    /// use std::rc::Rc;
    ///
    /// let five = Rc::new(5);
    ///
    /// let weak_five = Rc::downgrade(&five);
    ///
    /// let strong_five: Option<Rc<_>> = weak_five.upgrade();
    /// assert!(strong_five.is_some());
    ///
    /// // Destroy all strong pointers.
    /// drop(strong_five);
    /// drop(five);
    ///
    /// assert!(weak_five.upgrade().is_none());
    /// ```
    #[stable(feature = "rc_weak", since = "1.4.0")]
    pub fn upgrade(&self) -> Option<Rc<T>> {
        let inner = self.inner()?;
        if inner.strong() == 0 {
            None
        } else {
            inner.inc_strong();
            Some(Rc::from_inner(self.ptr))
        }
    }
    /// Gets the number of strong (`Rc`) pointers pointing to this allocation.
    ///
    /// If `self` was created using [`Weak::new`], this will return 0.
    ///
    /// [`Weak::new`]: #method.new
    #[stable(feature = "weak_counts", since = "1.41.0")]
    pub fn strong_count(&self) -> usize {
        if let Some(inner) = self.inner() { inner.strong() } else { 0 }
    }

    /// Gets the number of `Weak` pointers pointing to this allocation.
    ///
    /// If no strong pointers remain, this will return zero.
    #[stable(feature = "weak_counts", since = "1.41.0")]
    pub fn weak_count(&self) -> usize {
        self.inner()
            .map(|inner| {
                if inner.strong() > 0 {
                    inner.weak() - 1 // subtract the implicit weak ptr
                } else {
                    0
                }
            })
            .unwrap_or(0)
    }
    /// Returns `None` when the pointer is dangling and there is no allocated `RcBox`
    /// (i.e., when this `Weak` was created by `Weak::new`).
    #[inline]
    fn inner(&self) -> Option<&RcBox<T>> {
        if is_dangling(self.ptr) { None } else { Some(unsafe { self.ptr.as_ref() }) }
    }
    /// Returns `true` if the two `Weak`s point to the same allocation (similar to
    /// [`ptr::eq`]), or if both don't point to any allocation
    /// (because they were created with `Weak::new()`).
    ///
    /// # Notes
    ///
    /// Since this compares pointers it means that two `Weak`s created with
    /// `Weak::new()` will equal each other, even though they don't point
    /// to any allocation.
    ///
    /// # Examples
    ///
    /// ```
    /// use std::rc::Rc;
    ///
    /// let first_rc = Rc::new(5);
    /// let first = Rc::downgrade(&first_rc);
    /// let second = Rc::downgrade(&first_rc);
    ///
    /// assert!(first.ptr_eq(&second));
    ///
    /// let third_rc = Rc::new(5);
    /// let third = Rc::downgrade(&third_rc);
    ///
    /// assert!(!first.ptr_eq(&third));
    /// ```
    ///
    /// Comparing `Weak::new`.
    ///
    /// ```
    /// use std::rc::{Rc, Weak};
    ///
    /// let first = Weak::new();
    /// let second = Weak::new();
    /// assert!(first.ptr_eq(&second));
    ///
    /// let third_rc = Rc::new(());
    /// let third = Rc::downgrade(&third_rc);
    /// assert!(!first.ptr_eq(&third));
    /// ```
    ///
    /// [`ptr::eq`]: ../../std/ptr/fn.eq.html
    #[inline]
    #[stable(feature = "weak_ptr_eq", since = "1.39.0")]
    pub fn ptr_eq(&self, other: &Self) -> bool {
        self.ptr.as_ptr() == other.ptr.as_ptr()
    }
}
#[stable(feature = "rc_weak", since = "1.4.0")]
impl<T: ?Sized> Drop for Weak<T> {
    /// Drops the `Weak` pointer.
    ///
    /// # Examples
    ///
    /// ```
    /// use std::rc::{Rc, Weak};
    ///
    /// struct Foo;
    ///
    /// impl Drop for Foo {
    ///     fn drop(&mut self) {
    ///         println!("dropped!");
    ///     }
    /// }
    ///
    /// let foo = Rc::new(Foo);
    /// let weak_foo = Rc::downgrade(&foo);
    /// let other_weak_foo = Weak::clone(&weak_foo);
    ///
    /// drop(weak_foo);   // Doesn't print anything
    /// drop(foo);        // Prints "dropped!"
    ///
    /// assert!(other_weak_foo.upgrade().is_none());
    /// ```
    fn drop(&mut self) {
        if let Some(inner) = self.inner() {
            inner.dec_weak();
            // the weak count starts at 1, and will only go to zero if all
            // the strong pointers have disappeared.
            if inner.weak() == 0 {
                unsafe {
                    Global.dealloc(self.ptr.cast(), Layout::for_value(self.ptr.as_ref()));
                }
            }
        }
    }
}
#[stable(feature = "rc_weak", since = "1.4.0")]
impl<T: ?Sized> Clone for Weak<T> {
    /// Makes a clone of the `Weak` pointer that points to the same allocation.
    ///
    /// # Examples
    ///
    /// ```
    /// use std::rc::{Rc, Weak};
    ///
    /// let weak_five = Rc::downgrade(&Rc::new(5));
    ///
    /// let _ = Weak::clone(&weak_five);
    /// ```
    #[inline]
    fn clone(&self) -> Weak<T> {
        if let Some(inner) = self.inner() {
            inner.inc_weak()
        }
        Weak { ptr: self.ptr }
    }
}
#[stable(feature = "rc_weak", since = "1.4.0")]
impl<T: ?Sized + fmt::Debug> fmt::Debug for Weak<T> {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        write!(f, "(Weak)")
    }
}
2007 impl<T> Default for Weak<T> {
2008 /// Constructs a new `Weak<T>`, allocating memory for `T` without initializing
2009 /// it. Calling [`upgrade`] on the return value always gives [`None`].
2011 /// [`None`]: ../../std/option/enum.Option.html
2012 /// [`upgrade`]: ../../std/rc/struct.Weak.html#method.upgrade
2017 /// use std::rc::Weak;
2019 /// let empty: Weak<i64> = Default::default();
2020 /// assert!(empty.upgrade().is_none());
2022 fn default() -> Weak<T> {
// NOTE: We checked_add here to deal with mem::forget safely. In particular
// if you mem::forget Rcs (or Weaks), the ref-count can overflow, and then
// you can free the allocation while outstanding Rcs (or Weaks) exist.
// We abort because this is such a degenerate scenario that we don't care about
// what happens -- no real program should ever experience this.
//
// This should have negligible overhead since you don't actually need to
// clone these much in Rust thanks to ownership and move-semantics.
trait RcBoxPtr<T: ?Sized> {
    fn inner(&self) -> &RcBox<T>;

    #[inline]
    fn strong(&self) -> usize {
        self.inner().strong.get()
    }

    #[inline]
    fn inc_strong(&self) {
        let strong = self.strong();

        // We want to abort on overflow instead of dropping the value.
        // The reference count will never be zero when this is called;
        // nevertheless, we insert an abort here to hint LLVM at
        // an otherwise missed optimization.
        if strong == 0 || strong == usize::MAX {
            abort();
        }
        self.inner().strong.set(strong + 1);
    }

    #[inline]
    fn dec_strong(&self) {
        self.inner().strong.set(self.strong() - 1);
    }

    #[inline]
    fn weak(&self) -> usize {
        self.inner().weak.get()
    }

    #[inline]
    fn inc_weak(&self) {
        let weak = self.weak();

        // We want to abort on overflow instead of dropping the value.
        // The reference count will never be zero when this is called;
        // nevertheless, we insert an abort here to hint LLVM at
        // an otherwise missed optimization.
        if weak == 0 || weak == usize::MAX {
            abort();
        }
        self.inner().weak.set(weak + 1);
    }

    #[inline]
    fn dec_weak(&self) {
        self.inner().weak.set(self.weak() - 1);
    }
}
impl<T: ?Sized> RcBoxPtr<T> for Rc<T> {
    #[inline(always)]
    fn inner(&self) -> &RcBox<T> {
        unsafe { self.ptr.as_ref() }
    }
}

impl<T: ?Sized> RcBoxPtr<T> for RcBox<T> {
    #[inline(always)]
    fn inner(&self) -> &RcBox<T> {
        self
    }
}
#[stable(feature = "rust1", since = "1.0.0")]
impl<T: ?Sized> borrow::Borrow<T> for Rc<T> {
    fn borrow(&self) -> &T {
        &**self
    }
}

#[stable(since = "1.5.0", feature = "smart_ptr_as_ref")]
impl<T: ?Sized> AsRef<T> for Rc<T> {
    fn as_ref(&self) -> &T {
        &**self
    }
}

#[stable(feature = "pin", since = "1.33.0")]
impl<T: ?Sized> Unpin for Rc<T> {}
/// Get the offset within an `RcBox` for
/// a payload of type described by a pointer.
///
/// # Safety
///
/// This has the same safety requirements as `align_of_val_raw`. In effect:
///
/// - This function is safe for any argument if `T` is sized, and
/// - if `T` is unsized, the pointer must have appropriate pointer metadata
///   acquired from the real instance that you are getting this offset for.
unsafe fn data_offset<T: ?Sized>(ptr: *const T) -> isize {
    // Align the unsized value to the end of the `RcBox`.
    // Because it is ?Sized, it will always be the last field in memory.
    // Note: This is a detail of the current implementation of the compiler,
    // and is not a guaranteed language detail. Do not rely on it outside of std.
    unsafe { data_offset_align(align_of_val_raw(ptr)) }
}

#[inline]
fn data_offset_align(align: usize) -> isize {
    let layout = Layout::new::<RcBox<()>>();
    (layout.size() + layout.padding_needed_for(align)) as isize
}