Fix misspellings in comments. #13653

Merged
merged 1 commit on Apr 22, 2014

4 changes: 2 additions & 2 deletions src/libcollections/btree.rs
@@ -659,13 +659,13 @@ impl<K: fmt::Show + TotalOrd, V: fmt::Show> fmt::Show for Branch<K, V> {
}
}

-//A LeafElt containts no left child, but a key-value pair.
+//A LeafElt contains no left child, but a key-value pair.
struct LeafElt<K, V> {
key: K,
value: V
}

-//A BranchElt has a left child in insertition to a key-value pair.
+//A BranchElt has a left child in insertion to a key-value pair.
struct BranchElt<K, V> {
left: ~Node<K, V>,
key: K,
8 changes: 4 additions & 4 deletions src/libcollections/hashmap.rs
@@ -587,7 +587,7 @@ static INITIAL_LOAD_FACTOR: Fraction = (9, 10);
//
// > Why a load factor of 90%?
//
-// In general, all the distances to inital buckets will converge on the mean.
+// In general, all the distances to initial buckets will converge on the mean.
// At a load factor of α, the odds of finding the target bucket after k
// probes is approximately 1-α^k. If we set this equal to 50% (since we converge
// on the mean) and set k=8 (64-byte cache line / 8-byte hash), α=0.92. I round
@@ -600,7 +600,7 @@ static INITIAL_LOAD_FACTOR: Fraction = (9, 10);
// > Wait, what? Where did you get 1-α^k from?
//
// On the first probe, your odds of a collision with an existing element is α.
-// The odds of doing this twice in a row is approximatelly α^2. For three times,
+// The odds of doing this twice in a row is approximately α^2. For three times,
// α^3, etc. Therefore, the odds of colliding k times is α^k. The odds of NOT
// colliding after k tries is 1-α^k.
//
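As a side note (not part of the patch), the arithmetic in the comment above is easy to spot-check: with α = 0.90 and k = 8, 1-α^8 ≈ 0.57, and 1-α^8 = 0.5 is reached at α = 0.5^(1/8) ≈ 0.92, which is where the chosen load factor of 90% comes from. A quick sketch that just evaluates those numbers:

```rust
// Spot-check of "1-α^k" from the comment above (illustrative only).
fn main() {
    let k = 8;
    for &alpha in &[0.90_f64, 0.92] {
        println!("α = {alpha}: P(bucket found within {k} probes) ≈ {:.3}", 1.0 - alpha.powi(k));
    }
    // The load factor at which that probability is exactly 50%:
    println!("α at the 50% point ≈ {:.3}", 0.5_f64.powf(1.0 / k as f64));
}
```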
@@ -681,7 +681,7 @@ static INITIAL_LOAD_FACTOR: Fraction = (9, 10);
/// let mut book_reviews = HashMap::new();
///
/// // review some books.
-/// book_reviews.insert("Adventures of Hucklebury Fin", "My favorite book.");
+/// book_reviews.insert("Adventures of Huckleberry Finn", "My favorite book.");
/// book_reviews.insert("Grimms' Fairy Tales", "Masterpiece.");
/// book_reviews.insert("Pride and Prejudice", "Very enjoyable.");
/// book_reviews.insert("The Adventures of Sherlock Holmes", "Eye lyked it alot.");
@@ -771,7 +771,7 @@ impl<K: TotalEq + Hash<S>, V, S, H: Hasher<S>> HashMap<K, V, H> {
/// from its 'ideal' location.
///
/// In the cited blog posts above, this is called the "distance to
-/// inital bucket", or DIB.
+/// initial bucket", or DIB.
fn bucket_distance(&self, index_of_elem: &table::FullIndex) -> uint {
// where the hash of the element that happens to reside at
// `index_of_elem` tried to place itself first.
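For context (again, not part of the patch): the "distance to initial bucket" is just how far an entry ended up from the bucket its hash first pointed at, wrapping around the table. A minimal sketch of that computation, with assumed names and assuming a power-of-two capacity rather than the actual hashmap.rs internals:

```rust
// DIB: how far `actual_index` sits from `ideal_index`, wrapping around a
// power-of-two table so entries pushed past the end still count correctly.
fn bucket_distance(ideal_index: usize, actual_index: usize, capacity: usize) -> usize {
    debug_assert!(capacity.is_power_of_two());
    actual_index.wrapping_sub(ideal_index) & (capacity - 1)
}

fn main() {
    assert_eq!(bucket_distance(2, 5, 8), 3); // displaced three slots forward
    assert_eq!(bucket_distance(6, 1, 8), 3); // wrapped past the end of the table
}
```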
2 changes: 1 addition & 1 deletion src/libcollections/treemap.rs
@@ -308,7 +308,7 @@ pub struct RevMutEntries<'a, K, V> {
// (with many different `x`) below, so we need to optionally pass mut
// as a tt, but the only thing we can do with a `tt` is pass them to
// other macros, so this takes the `& <mutability> <operand>` token
-// sequence and forces their evalutation as an expression.
+// sequence and forces their evaluation as an expression.
macro_rules! addr { ($e:expr) => { $e }}
// putting an optional mut into type signatures
macro_rules! item { ($i:item) => { $i }}
2 changes: 1 addition & 1 deletion src/libcollections/trie.rs
@@ -141,7 +141,7 @@ impl<T> TrieMap<T> {
// (with many different `x`) below, so we need to optionally pass mut
// as a tt, but the only thing we can do with a `tt` is pass them to
// other macros, so this takes the `& <mutability> <operand>` token
-// sequence and forces their evalutation as an expression. (see also
+// sequence and forces their evaluation as an expression. (see also
// `item!` below.)
macro_rules! addr { ($e:expr) => { $e } }

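The comment fixed here (and its twin in treemap.rs above) describes a token-tree trick: an optional `mut` is captured as `tt`s so it can be forwarded to other macros, and a pass-through macro like `addr!` forces the assembled `& <mutability> <operand>` tokens to be evaluated as a single expression. A much-simplified sketch of the idea, not the real iterator-defining macros in those files:

```rust
// `addr!` forces its argument to be treated as one expression once the
// optional `mut` tokens have been spliced in.
macro_rules! addr { ($e:expr) => { $e } }

// The caller wraps the (possibly empty) mutability tokens in parentheses so
// they can be captured as `tt`s and forwarded.
macro_rules! borrow {
    (($($mutability:tt)*) $value:expr) => {
        addr!(& $($mutability)* $value)
    };
}

fn main() {
    let mut x = 5;
    *borrow!((mut) x) += 1;        // expands to `&mut x`
    assert_eq!(*borrow!(() x), 6); // expands to `&x`
}
```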
2 changes: 1 addition & 1 deletion src/libnum/lib.rs
@@ -171,7 +171,7 @@ macro_rules! impl_integer_for_int {
/// `other`.
#[inline]
fn lcm(&self, other: &$T) -> $T {
-// should not have to recaluculate abs
+// should not have to recalculate abs
((*self * *other) / self.gcd(other)).abs()
}

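Aside: the `lcm` body shown above is the usual identity lcm(a, b) = |a*b / gcd(a, b)|, with `abs` taken once at the end so it does not have to be recalculated. A standalone sketch of the same formula (hypothetical helpers, not the libnum trait method):

```rust
// lcm via gcd, mirroring the shape of the patched code above.
fn gcd(a: i64, b: i64) -> i64 {
    if b == 0 { a.abs() } else { gcd(b, a % b) }
}

fn lcm(a: i64, b: i64) -> i64 {
    // Dividing by the gcd before multiplying (a / gcd * b) is a common
    // variation that reduces the risk of intermediate overflow.
    ((a * b) / gcd(a, b)).abs()
}

fn main() {
    assert_eq!(lcm(4, 6), 12);
    assert_eq!(lcm(-3, 5), 15);
    assert_eq!(lcm(21, 6), 42);
}
```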
6 changes: 3 additions & 3 deletions src/librustc/back/link.rs
@@ -1171,7 +1171,7 @@ fn link_args(sess: &Session,
// actually creates "invalid" objects [1] [2], but only for some
// introspection tools, not in terms of whether it can be loaded.
//
-// Long story shory, passing this flag forces the linker to *not*
+// Long story short, passing this flag forces the linker to *not*
// truncate section names (so we can find the metadata section after
// it's compiled). The real kicker is that rust compiled just fine on
// windows for quite a long time *without* this flag, so I have no idea
@@ -1491,7 +1491,7 @@ fn add_upstream_rust_crates(args: &mut Vec<~str>, sess: &Session,
}

// Link in all of our upstream crates' native dependencies. Remember that
-// all of these upstream native depenencies are all non-static
+// all of these upstream native dependencies are all non-static
// dependencies. We've got two cases then:
//
// 1. The upstream crate is an rlib. In this case we *must* link in the
@@ -1509,7 +1509,7 @@ fn add_upstream_rust_crates(args: &mut Vec<~str>, sess: &Session,
// be instantiated in the target crate, meaning that the native symbol must
// also be resolved in the target crate.
fn add_upstream_native_libraries(args: &mut Vec<~str>, sess: &Session) {
-// Be sure to use a topological sorting of crates becuase there may be
+// Be sure to use a topological sorting of crates because there may be
// interdependencies between native libraries. When passing -nodefaultlibs,
// for example, almost all native libraries depend on libc, so we have to
// make sure that's all the way at the right (liblibc is near the base of
2 changes: 1 addition & 1 deletion src/librustc/back/svh.rs
@@ -20,7 +20,7 @@
//! such.
//!
//! The core of this problem is when an upstream dependency changes and
-//! downstream dependants are not recompiled. This causes compile errors because
+//! downstream dependents are not recompiled. This causes compile errors because
//! the upstream crate's metadata has changed but the downstream crates are
//! still referencing the older crate's metadata.
//!
4 changes: 2 additions & 2 deletions src/librustc/metadata/loader.rs
@@ -222,7 +222,7 @@ impl<'a> Context<'a> {
//
// A Library candidate is created if the metadata for the set of
// libraries corresponds to the crate id and hash criteria that this
-// serach is being performed for.
+// search is being performed for.
let mut libraries = Vec::new();
for (_hash, (rlibs, dylibs)) in candidates.move_iter() {
let mut metadata = None;
@@ -278,7 +278,7 @@ impl<'a> Context<'a> {
// rlib/dylib).
//
// The return value is `None` if `file` doesn't look like a rust-generated
-// library, or if a specific version was requested and it doens't match the
+// library, or if a specific version was requested and it doesn't match the
// apparent file's version.
//
// If everything checks out, then `Some(hash)` is returned where `hash` is
2 changes: 1 addition & 1 deletion src/librustc/middle/dead.rs
@@ -275,7 +275,7 @@ fn create_and_seed_worklist(tcx: &ty::ctxt,
None => ()
}

-// Seed implemeneted trait methods
+// Seed implemented trait methods
let mut life_seeder = LifeSeeder {
worklist: worklist
};
4 changes: 2 additions & 2 deletions src/librustc/middle/liveness.rs
@@ -480,7 +480,7 @@ fn visit_expr(ir: &mut IrMaps, expr: &Expr) {
// var must be dead afterwards
moves::CapMove => true,

-// var can stil be used
+// var can still be used
moves::CapCopy | moves::CapRef => false
};
call_caps.push(CaptureInfo {ln: cv_ln,
@@ -613,7 +613,7 @@ impl<'a> Liveness<'a> {
f: |&mut Liveness<'a>, LiveNode, Variable, Span, NodeId|) {
// only consider the first pattern; any later patterns must have
// the same bindings, and we also consider the first pattern to be
-// the "authoratative" set of ids
+// the "authoritative" set of ids
if !pats.is_empty() {
self.pat_bindings(pats[0], f)
}
6 changes: 3 additions & 3 deletions src/librustc/middle/privacy.rs
@@ -63,7 +63,7 @@ impl Visitor<()> for ParentVisitor {
let prev = self.curparent;
match item.node {
ast::ItemMod(..) => { self.curparent = item.id; }
-// Enum variants are parented to the enum definition itself beacuse
+// Enum variants are parented to the enum definition itself because
// they inherit privacy
ast::ItemEnum(ref def, _) => {
for variant in def.variants.iter() {
@@ -1034,7 +1034,7 @@ impl<'a> Visitor<()> for SanePrivacyVisitor<'a> {
}

impl<'a> SanePrivacyVisitor<'a> {
-/// Validates all of the visibility qualifers placed on the item given. This
+/// Validates all of the visibility qualifiers placed on the item given. This
/// ensures that there are no extraneous qualifiers that don't actually do
/// anything. In theory these qualifiers wouldn't parse, but that may happen
/// later on down the road...
@@ -1262,7 +1262,7 @@ impl<'a> Visitor<()> for VisiblePrivateTypesVisitor<'a> {
self_is_public_path = visitor.outer_type_is_public_path;
}

-// miscellanous info about the impl
+// miscellaneous info about the impl

// `true` iff this is `impl Private for ...`.
let not_private_trait =
6 changes: 3 additions & 3 deletions src/librustc/middle/region.rs
@@ -182,7 +182,7 @@ impl RegionMaps {

// else, locate the innermost terminating scope
// if there's one. Static items, for instance, won't
-// have an enclusing scope, hence no scope will be
+// have an enclosing scope, hence no scope will be
// returned.
let mut id = match self.opt_encl_scope(expr_id) {
Some(i) => i,
@@ -533,7 +533,7 @@ fn resolve_expr(visitor: &mut RegionResolutionVisitor,
// the invoked function is actually running* and call.id
// represents *the time to prepare the arguments and make the
// call*. See the section "Borrows in Calls" borrowck/doc.rs
-// for an extended explanantion of why this distinction is
+// for an extended explanation of why this distinction is
// important.
//
// record_superlifetime(new_cx, expr.callee_id);
@@ -604,7 +604,7 @@ fn resolve_local(visitor: &mut RegionResolutionVisitor,
// (covers cases `expr` borrows an rvalue that is then assigned
// to memory (at least partially) owned by the binding)
//
-// Here are some examples hopefully giving an intution where each
+// Here are some examples hopefully giving an intuition where each
// rule comes into play and why:
//
// Rule A. `let (ref x, ref y) = (foo().x, 44)`. The rvalue `(22, 44)`
4 changes: 2 additions & 2 deletions src/librustc/middle/resolve.rs
@@ -72,7 +72,7 @@ pub enum LastPrivate {
// `use` directives (imports) can refer to two separate definitions in the
// type and value namespaces. We record here the last private node for each
// and whether the import is in fact used for each.
-// If the Option<PrivateDep> fields are None, it means there is no defintion
+// If the Option<PrivateDep> fields are None, it means there is no definition
// in that namespace.
LastImport{pub value_priv: Option<PrivateDep>,
pub value_used: ImportUse,
@@ -3610,7 +3610,7 @@ impl<'a> Resolver<'a> {
}
}

-// n.b. the discr expr gets visted twice.
+// n.b. the discr expr gets visited twice.
// but maybe it's okay since the first time will signal an
// error if there is one? -- tjc
self.with_type_parameter_rib(HasTypeParameters(generics,
2 changes: 1 addition & 1 deletion src/librustc/middle/subst.rs
@@ -272,7 +272,7 @@ impl Subst for ty::Region {
substs: &ty::substs,
_: Option<Span>) -> ty::Region {
// Note: This routine only handles regions that are bound on
-// type declarationss and other outer declarations, not those
+// type declarations and other outer declarations, not those
// bound in *fn types*. Region substitution of the bound
// regions that appear in a function signature is done using
// the specialized routine
2 changes: 1 addition & 1 deletion src/librustc/middle/trans/_match.rs
@@ -963,7 +963,7 @@ fn get_options(bcx: &Block, m: &[Match], col: uint) -> Vec<Opt> {
if set.iter().any(|l| opt_eq(tcx, l, &val)) {return;}
set.push(val);
}
-// Vector comparisions are special in that since the actual
+// Vector comparisons are special in that since the actual
// conditions over-match, we need to be careful about them. This
// means that in order to properly handle things in order, we need
// to not always merge conditions.
2 changes: 1 addition & 1 deletion src/librustc/middle/trans/callee.rs
@@ -370,7 +370,7 @@ pub fn trans_fn_ref_with_vtables(
false
};

-// Create a monomorphic verison of generic functions
+// Create a monomorphic version of generic functions
if must_monomorphise {
// Should be either intra-crate or inlined.
assert_eq!(def_id.krate, ast::LOCAL_CRATE);
2 changes: 1 addition & 1 deletion src/librustc/middle/trans/closure.rs
@@ -84,7 +84,7 @@ use syntax::ast_util;
// because the alignment requirements of the bound data affects the
// alignment requires of the closure_data struct as a whole. However,
// right now this is a non-issue in any case, because the size of the
-// rust_opaque_box header is always a mutiple of 16-bytes, which is
+// rust_opaque_box header is always a multiple of 16-bytes, which is
// the maximum alignment requirement we ever have to worry about.
//
// The only reason alignment matters is that, in order to learn what data
2 changes: 1 addition & 1 deletion src/librustc/middle/trans/debuginfo.rs
@@ -2503,7 +2503,7 @@ fn populate_scope_map(cx: &CrateContext,
ast::PatIdent(_, ref path_ref, ref sub_pat_opt) => {

// Check if this is a binding. If so we need to put it on the scope stack and maybe
-// introduce an articial scope
+// introduce an artificial scope
if pat_util::pat_is_binding(def_map, pat) {

let ident = ast_util::path_to_ident(path_ref);
2 changes: 1 addition & 1 deletion src/librustc/middle/trans/monomorphize.rs
@@ -128,7 +128,7 @@ pub fn monomorphic_fn(ccx: &CrateContext,
// Static default methods are a little unfortunate, in
// that the "internal" and "external" type of them differ.
// Internally, the method body can refer to Self, but the
-// externally visable type of the method has a type param
+// externally visible type of the method has a type param
// inserted in between the trait type params and the
// method type params. The substs that we are given are
// the proper substs *internally* to the method body, so
2 changes: 1 addition & 1 deletion src/librustc/middle/ty.rs
@@ -2338,7 +2338,7 @@ pub fn is_instantiable(cx: &ctxt, r_ty: t) -> bool {
let r = match get(ty).sty {
// fixed length vectors need special treatment compared to
// normal vectors, since they don't necessarily have the
-// possibilty to have length zero.
+// possibility to have length zero.
ty_vec(_, Some(0)) => false, // don't need no contents
ty_vec(mt, Some(_)) => type_requires(cx, seen, r_ty, mt.ty),

4 changes: 2 additions & 2 deletions src/librustc/middle/typeck/check/mod.rs
@@ -238,7 +238,7 @@ pub struct FnCtxt<'a> {
//
// What we do in such cases is to generate a region variable with
// `region_lb` as a lower bound. The regionck pass then adds
-// other constriants based on how the variable is used and region
+// other constraints based on how the variable is used and region
// inference selects the ultimate value. Finally, borrowck is
// charged with guaranteeing that the value whose address was taken
// can actually be made to live as long as it needs to live.
@@ -2548,7 +2548,7 @@ fn check_expr_with_unifier(fcx: &FnCtxt,
ty::mt {ty: t, mutbl: mutability},
None)), // Sadly, we know the length
// - Some(args.len()) - but
-// must thow it away or cause
+// must throw it away or cause
// confusion further down the
// pipeline. Hopefully we can
// remedy this later.
2 changes: 1 addition & 1 deletion src/librustc/middle/typeck/infer/resolve.rs
@@ -36,7 +36,7 @@
// therefore cannot sensibly be mapped to any particular result. By
// default, we will leave such variables as is (so you will get back a
// variable in your result). The options force_* will cause the
-// resolution to fail in this case intead, except for the case of
+// resolution to fail in this case instead, except for the case of
// integral variables, which resolve to `int` if forced.
//
// # resolve_all and force_all
4 changes: 2 additions & 2 deletions src/librustc/util/sha2.rs
@@ -146,14 +146,14 @@ impl FixedBuffer for FixedBuffer64 {
}
}

-// While we have at least a full buffer size chunks's worth of data, process that data
+// While we have at least a full buffer size chunk's worth of data, process that data
// without copying it into the buffer
while input.len() - i >= size {
func(input.slice(i, i + size));
i += size;
}

-// Copy any input data into the buffer. At this point in the method, the ammount of
+// Copy any input data into the buffer. At this point in the method, the amount of
// data left in the input vector will be less than the buffer size and the buffer will
// be empty.
let input_remaining = input.len() - i;
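The two comments fixed above describe the buffering strategy in `FixedBuffer`: whole buffer-sized chunks are handed to the compression function straight from the input, and only the short tail is copied into the internal buffer. A simplified sketch of that flow, with assumed names and ignoring the partially-filled-buffer case the real method also handles:

```rust
// Feed whole `size`-byte chunks to `func` directly, then buffer the tail.
fn input_chunks<F: FnMut(&[u8])>(input: &[u8], buffer: &mut Vec<u8>, size: usize, mut func: F) {
    let mut i = 0;
    // While at least a full chunk's worth of data remains, process it
    // without copying it into the buffer.
    while input.len() - i >= size {
        func(&input[i..i + size]);
        i += size;
    }
    // The amount left here is less than `size`; stash it for the next call.
    buffer.extend_from_slice(&input[i..]);
}

fn main() {
    let mut buffer = Vec::new();
    let mut blocks = 0;
    input_chunks(&[0u8; 150], &mut buffer, 64, |_block| blocks += 1);
    assert_eq!(blocks, 2);        // two full 64-byte blocks processed in place
    assert_eq!(buffer.len(), 22); // 150 - 128 bytes held back for the next call
}
```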
4 changes: 2 additions & 2 deletions src/libstd/cmp.rs
@@ -26,7 +26,7 @@
//!
//! // Our implementation of `Eq` to support `==` and `!=`.
//! impl Eq for SketchyNum {
-//! // Our custom eq allows numbers which are near eachother to be equal! :D
+//! // Our custom eq allows numbers which are near each other to be equal! :D
//! fn eq(&self, other: &SketchyNum) -> bool {
//! (self.num - other.num).abs() < 5
//! }
@@ -283,7 +283,7 @@ mod test {

// Our implementation of `Eq` to support `==` and `!=`.
impl Eq for SketchyNum {
-// Our custom eq allows numbers which are near eachother to be equal! :D
+// Our custom eq allows numbers which are near each other to be equal! :D
fn eq(&self, other: &SketchyNum) -> bool {
(self.num - other.num).abs() < 5
}
2 changes: 1 addition & 1 deletion src/libstd/iter.rs
@@ -937,7 +937,7 @@ impl<A: TotalOrd, T: Iterator<A>> OrdIterator<A> for T {
loop {
// `first` and `second` are the two next elements we want to look at.
// We first compare `first` and `second` (#1). The smaller one is then compared to
-// current mininum (#2). The larger one is compared to current maximum (#3). This
+// current minimum (#2). The larger one is compared to current maximum (#3). This
// way we do 3 comparisons for 2 elements.
let first = match self.next() {
None => break,
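The comment corrected above refers to the pairwise min/max trick: take elements two at a time, compare them to each other first, then the smaller against the running minimum and the larger against the running maximum, for roughly 3 comparisons per 2 elements instead of 4. A self-contained sketch of that idea (not the std iterator implementation):

```rust
// Pairwise min/max: about 1.5 comparisons per element instead of 2.
fn min_max(xs: &[i32]) -> Option<(i32, i32)> {
    let mut it = xs.iter().copied();
    let first = it.next()?;
    let (mut min, mut max) = (first, first);
    while let Some(a) = it.next() {
        match it.next() {
            // Odd element left over: compare it against both bounds.
            None => {
                if a < min { min = a }
                if a > max { max = a }
            }
            Some(b) => {
                let (lo, hi) = if a <= b { (a, b) } else { (b, a) }; // comparison #1
                if lo < min { min = lo }                             // comparison #2
                if hi > max { max = hi }                             // comparison #3
            }
        }
    }
    Some((min, max))
}

fn main() {
    assert_eq!(min_max(&[3, 1, 4, 1, 5, 9, 2, 6]), Some((1, 9)));
    assert_eq!(min_max(&[]), None);
}
```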
2 changes: 1 addition & 1 deletion src/libstd/macros.rs
@@ -24,7 +24,7 @@
/// which is transmitted.
///
/// The multi-argument form of this macro fails with a string and has the
-/// `format!` sytnax for building a string.
+/// `format!` syntax for building a string.
///
/// # Example
///
2 changes: 1 addition & 1 deletion src/libstd/ptr.rs
@@ -15,7 +15,7 @@
//! an unsafe pointer when safe pointers are unsuitable;
//! checking for null; and converting back to safe pointers.
//! As a result, there is not yet an abundance of library code
-//! for working with unsafe poniters, and in particular,
+//! for working with unsafe pointers, and in particular,
//! since pointer math is fairly uncommon in Rust, it is not
//! all that convenient.
//!
2 changes: 1 addition & 1 deletion src/libstd/raw.rs
@@ -15,7 +15,7 @@
//! They can be used as targets of transmutes in unsafe code for manipulating
//! the raw representations directly.
//!
-//! Their definitition should always match the ABI defined in `rustc::back::abi`.
+//! Their definition should always match the ABI defined in `rustc::back::abi`.

use cast;
