docs: "associated to" -> "associated with" #1557

Merged
merged 1 commit on Sep 26, 2022
2 changes: 1 addition & 1 deletion common/src/bitset.rs
@@ -277,7 +277,7 @@ impl BitSet {
self.tinyset(el / 64u32).contains(el % 64)
}

/// Returns the first non-empty `TinySet` associated to a bucket lower
/// Returns the first non-empty `TinySet` associated with a bucket lower
/// or greater than bucket.
///
/// Reminder: the tiny set with the bucket `bucket`, represents the
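The hunk above documents `BitSet`'s bucketing of elements into 64-bit `TinySet`s (`el / 64` selects the bucket, `el % 64` the bit inside it). A minimal standalone sketch of that layout, using illustrative names rather than tantivy's actual types:

```rust
/// Toy bit set mirroring the bucketing documented above: element `el` lives in
/// bucket `el / 64`, at bit offset `el % 64` within that bucket's 64-bit word.
struct ToyBitSet {
    buckets: Vec<u64>, // each u64 plays the role of a `TinySet`
}

impl ToyBitSet {
    fn with_max_value(max_value: u32) -> Self {
        ToyBitSet { buckets: vec![0u64; (max_value as usize + 63) / 64] }
    }

    fn insert(&mut self, el: u32) {
        self.buckets[(el / 64) as usize] |= 1u64 << (el % 64);
    }

    fn contains(&self, el: u32) -> bool {
        (self.buckets[(el / 64) as usize] & (1u64 << (el % 64))) != 0
    }

    /// Counterpart of the documented lookup: first non-empty bucket at or after `bucket`.
    fn first_non_empty_bucket(&self, bucket: u32) -> Option<u32> {
        (bucket as usize..self.buckets.len())
            .find(|&b| self.buckets[b] != 0)
            .map(|b| b as u32)
    }
}
```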
2 changes: 1 addition & 1 deletion doc/src/basis.md
@@ -50,7 +50,7 @@ to get tantivy to fit your use case:

*Example 1* You could for instance use hadoop to build a very large search index in a timely manner, copy all of the resulting segment files in the same directory and edit the `meta.json` to get a functional index.[^2]

*Example 2* You could also disable your merge policy and enforce daily segments. Removing data after one week can then be done very efficiently by just editing the `meta.json` and deleting the files associated to segment `D-7`.
*Example 2* You could also disable your merge policy and enforce daily segments. Removing data after one week can then be done very efficiently by just editing the `meta.json` and deleting the files associated with segment `D-7`.

## Merging

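To make Example 2 concrete, here is a hedged sketch of the `meta.json` edit it describes, using `serde_json` purely for illustration. The `"segments"` array and its `"segment_id"` key are assumptions inferred from the `IndexMeta`/`SegmentMeta` structs touched later in this diff; deleting the corresponding segment files is left out.

```rust
use serde_json::Value;

/// Sketch only: drop expired daily segments (e.g. `D-7`) from a `meta.json` string.
/// The `"segments"` / `"segment_id"` keys are assumptions about the on-disk format.
fn prune_segments(meta_json: &str, expired_segment_ids: &[&str]) -> serde_json::Result<String> {
    let mut meta: Value = serde_json::from_str(meta_json)?;
    if let Some(segments) = meta.get_mut("segments").and_then(Value::as_array_mut) {
        // Keep only segments whose id is not in the expired list.
        segments.retain(|seg| {
            seg.get("segment_id")
                .and_then(Value::as_str)
                .map_or(true, |id| !expired_segment_ids.contains(&id))
        });
    }
    serde_json::to_string_pretty(&meta)
}
```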
2 changes: 1 addition & 1 deletion examples/deleting_updating_documents.rs
@@ -113,7 +113,7 @@ fn main() -> tantivy::Result<()> {
// on its id.
//
// Note that `tantivy` does nothing to enforce the idea that
// there is only one document associated to this id.
// there is only one document associated with this id.
//
// Also you might have noticed that we apply the delete before
// having committed. This does not matter really...
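For context, a minimal sketch of the delete-by-id pattern this comment describes. It assumes an indexed u64 field acting as a primary key; since tantivy does not enforce id uniqueness, `delete_term` removes every document carrying that term.

```rust
use tantivy::schema::{Field, Term};
use tantivy::IndexWriter;

/// Sketch: `id_field` is assumed to be an indexed u64 field used as a primary key.
fn delete_by_id(index_writer: &mut IndexWriter, id_field: Field, old_id: u64) -> tantivy::Result<()> {
    // Removes *all* documents containing this term -- there may be more than one.
    index_writer.delete_term(Term::from_field_u64(id_field, old_id));
    // Issuing the delete before `commit` is fine; it only takes effect once committed.
    index_writer.commit()?;
    Ok(())
}
```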
4 changes: 2 additions & 2 deletions examples/iterating_docs_and_positions.rs
@@ -44,7 +44,7 @@ fn main() -> tantivy::Result<()> {
// A segment contains different data structure.
// Inverted index stands for the combination of
// - the term dictionary
// - the inverted lists associated to each terms and their positions
// - the inverted lists associated with each terms and their positions
let inverted_index = segment_reader.inverted_index(title)?;

// A `Term` is a text token associated with a field.
@@ -105,7 +105,7 @@ fn main() -> tantivy::Result<()> {
// A segment contains different data structure.
// Inverted index stands for the combination of
// - the term dictionary
// - the inverted lists associated to each terms and their positions
// - the inverted lists associated with each terms and their positions
let inverted_index = segment_reader.inverted_index(title)?;

// This segment posting object is like a cursor over the documents matching the term.
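A condensed, hedged sketch of what the example around these hunks does with the inverted lists: open the field's inverted index, look up a term in the term dictionary, and walk its postings with frequencies and positions. Method names follow the `Postings`/`DocSet` traits as I understand them; double-check against your tantivy version.

```rust
use tantivy::postings::Postings;
use tantivy::schema::{Field, IndexRecordOption, Term};
use tantivy::{DocSet, SegmentReader, TERMINATED};

/// Sketch: `title` is assumed to be an indexed text field whose positions were recorded.
fn dump_positions(segment_reader: &SegmentReader, title: Field, token: &str) -> tantivy::Result<()> {
    // Inverted index = term dictionary + the inverted list of each term (and positions).
    let inverted_index = segment_reader.inverted_index(title)?;
    let term = Term::from_field_text(title, token);
    if let Some(mut postings) =
        inverted_index.read_postings(&term, IndexRecordOption::WithFreqsAndPositions)?
    {
        let mut positions: Vec<u32> = Vec::new();
        let mut doc = postings.doc();
        while doc != TERMINATED {
            postings.positions(&mut positions);
            println!("doc {}: freq={}, positions={:?}", doc, postings.term_freq(), positions);
            doc = postings.advance();
        }
    }
    Ok(())
}
```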
2 changes: 1 addition & 1 deletion fastfield_codecs/src/column.rs
@@ -4,7 +4,7 @@ use std::ops::RangeInclusive;
use tantivy_bitpacker::minmax;

pub trait Column<T: PartialOrd = u64>: Send + Sync {
/// Return the value associated to the given idx.
/// Return the value associated with the given idx.
///
/// This accessor should return as fast as possible.
///
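To illustrate the contract documented above, a toy column over an in-memory `Vec`: positional lookup plus value bounds. This mirrors the shape of `Column` but deliberately is not the real trait, whose remaining methods the hunk cuts off.

```rust
use std::ops::RangeInclusive;

/// Toy stand-in for the documented trait: cheap value lookup by index,
/// plus the min/max bounds that can be used for range pruning.
trait ToyColumn<T: PartialOrd = u64>: Send + Sync {
    /// Return the value stored at `idx`; expected to be cheap.
    fn get_val(&self, idx: u64) -> T;
    fn num_vals(&self) -> u64;
    fn value_range(&self) -> RangeInclusive<T>;
}

struct VecColumn(Vec<u64>);

impl ToyColumn<u64> for VecColumn {
    fn get_val(&self, idx: u64) -> u64 {
        self.0[idx as usize]
    }
    fn num_vals(&self) -> u64 {
        self.0.len() as u64
    }
    fn value_range(&self) -> RangeInclusive<u64> {
        let min = self.0.iter().copied().min().unwrap_or(0);
        let max = self.0.iter().copied().max().unwrap_or(0);
        min..=max
    }
}
```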
2 changes: 1 addition & 1 deletion src/collector/custom_score_top_collector.rs
@@ -38,7 +38,7 @@ pub trait CustomSegmentScorer<TScore>: 'static {
pub trait CustomScorer<TScore>: Sync {
/// Type of the associated [`CustomSegmentScorer`].
type Child: CustomSegmentScorer<TScore>;
/// Builds a child scorer for a specific segment. The child scorer is associated to
/// Builds a child scorer for a specific segment. The child scorer is associated with
/// a specific segment.
fn segment_scorer(&self, segment_reader: &SegmentReader) -> crate::Result<Self::Child>;
}
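In the tantivy versions I have used, plain closures satisfy these traits through blanket impls, so the per-segment builder and the per-document scorer can both be closures. A hedged usage sketch (the doc-id "score" is meaningless and only keeps the example self-contained; real code would open a fast field reader in the outer closure):

```rust
use tantivy::collector::TopDocs;
use tantivy::query::Query;
use tantivy::{DocAddress, DocId, Searcher, SegmentReader};

/// Sketch: outer closure plays the role of `CustomScorer` (built once per segment),
/// inner closure that of `CustomSegmentScorer` (called once per matching doc).
fn top_by_custom_score(
    searcher: &Searcher,
    query: &dyn Query,
) -> tantivy::Result<Vec<(u64, DocAddress)>> {
    let collector = TopDocs::with_limit(10).custom_score(|_segment_reader: &SegmentReader| {
        // Per-segment setup (e.g. opening a fast field reader) would go here.
        move |doc: DocId| doc as u64
    });
    searcher.search(query, &collector)
}
```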
2 changes: 1 addition & 1 deletion src/collector/facet_collector.rs
@@ -91,7 +91,7 @@ fn facet_depth(facet_bytes: &[u8]) -> usize {
/// let index = Index::create_in_ram(schema);
/// {
/// let mut index_writer = index.writer(3_000_000)?;
/// // a document can be associated to any number of facets
/// // a document can be associated with any number of facets
/// index_writer.add_document(doc!(
/// title => "The Name of the Wind",
/// facet => Facet::from("/lang/en"),
2 changes: 1 addition & 1 deletion src/collector/histogram_collector.rs
@@ -37,7 +37,7 @@ impl HistogramCollector {
/// The scale/range of the histogram is not dynamic. It is required to
/// define it by supplying following parameter:
/// - `min_value`: the minimum value that can be recorded in the histogram.
/// - `bucket_width`: the length of the interval that is associated to each buckets.
/// - `bucket_width`: the length of the interval that is associated with each buckets.
/// - `num_buckets`: The overall number of buckets.
///
/// Together, this parameters define a partition of `[min_value, min_value + num_buckets *
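The bucketing these three parameters define is plain arithmetic; a standalone sketch (not the collector's actual code) of how a recorded value maps to its bucket:

```rust
/// Maps `value` to its bucket index, or `None` if it falls outside the partition
/// `[min_value, min_value + num_buckets * bucket_width)` defined by the parameters above.
fn bucket_for(value: u64, min_value: u64, bucket_width: u64, num_buckets: usize) -> Option<usize> {
    if value < min_value {
        return None;
    }
    let bucket = ((value - min_value) / bucket_width) as usize;
    if bucket < num_buckets {
        Some(bucket)
    } else {
        None
    }
}
```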
4 changes: 2 additions & 2 deletions src/collector/mod.rs
@@ -142,7 +142,7 @@ pub trait Collector: Sync + Send {
/// e.g. `usize` for the `Count` collector.
type Fruit: Fruit;

/// Type of the `SegmentCollector` associated to this collector.
/// Type of the `SegmentCollector` associated with this collector.
type Child: SegmentCollector;

/// `set_segment` is called before beginning to enumerate
@@ -156,7 +156,7 @@
/// Returns true iff the collector requires to compute scores for documents.
fn requires_scoring(&self) -> bool;

/// Combines the fruit associated to the collection of each segments
/// Combines the fruit associated with the collection of each segments
/// into one fruit.
fn merge_fruits(
&self,
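As an illustration of the `merge_fruits` contract described above, here is what the merge amounts to for a simple counting collector: each segment contributes a per-segment count and the merge sums them. This mirrors the shape of the trait method, not tantivy's actual `Count` implementation.

```rust
/// Sketch of merge_fruits for a counting collector: the per-segment fruits are
/// plain `usize` counts and merging them is just a sum.
fn merge_count_fruits(segment_fruits: Vec<usize>) -> tantivy::Result<usize> {
    Ok(segment_fruits.into_iter().sum())
}
```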
2 changes: 1 addition & 1 deletion src/collector/top_score_collector.rs
@@ -693,7 +693,7 @@ impl Collector for TopDocs {
}
}

/// Segment Collector associated to `TopDocs`.
/// Segment Collector associated with `TopDocs`.
pub struct TopScoreSegmentCollector(TopSegmentCollector<Score>);

impl SegmentCollector for TopScoreSegmentCollector {
2 changes: 1 addition & 1 deletion src/collector/tweak_score_top_collector.rs
@@ -40,7 +40,7 @@ pub trait ScoreTweaker<TScore>: Sync {
/// Type of the associated [`ScoreSegmentTweaker`].
type Child: ScoreSegmentTweaker<TScore>;

/// Builds a child tweaker for a specific segment. The child scorer is associated to
/// Builds a child tweaker for a specific segment. The child scorer is associated with
/// a specific segment.
fn segment_tweaker(&self, segment_reader: &SegmentReader) -> Result<Self::Child>;
}
8 changes: 4 additions & 4 deletions src/core/index_meta.rs
@@ -130,7 +130,7 @@ impl SegmentMeta {
/// Returns the relative path of a component of our segment.
///
/// It just joins the segment id with the extension
/// associated to a segment component.
/// associated with a segment component.
pub fn relative_path(&self, component: SegmentComponent) -> PathBuf {
let mut path = self.id().uuid_string();
path.push_str(&*match component {
@@ -326,13 +326,13 @@ pub struct IndexMeta {
/// `IndexSettings` to configure index options.
#[serde(default)]
pub index_settings: IndexSettings,
/// List of `SegmentMeta` information associated to each finalized segment of the index.
/// List of `SegmentMeta` information associated with each finalized segment of the index.
pub segments: Vec<SegmentMeta>,
/// Index `Schema`
pub schema: Schema,
/// Opstamp associated to the last `commit` operation.
/// Opstamp associated with the last `commit` operation.
pub opstamp: Opstamp,
/// Payload associated to the last commit.
/// Payload associated with the last commit.
///
/// Upon commit, clients can optionally add a small `String` payload to their commit
/// to help identify this commit.
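The `relative_path` doc above boils down to joining the segment uuid with a per-component extension; a trivial sketch of that naming scheme (the concrete extensions are tantivy internals and only assumed here):

```rust
use std::path::PathBuf;

/// Sketch of the `<segment-uuid>.<component-extension>` naming described above,
/// e.g. relative_path("9c0921e1...", "idx") -> "9c0921e1....idx".
/// Extension names such as "idx" are illustrative assumptions.
fn relative_path(segment_uuid: &str, component_extension: &str) -> PathBuf {
    PathBuf::from(format!("{}.{}", segment_uuid, component_extension))
}
```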
4 changes: 2 additions & 2 deletions src/core/inverted_index_reader.rs
@@ -9,11 +9,11 @@ use crate::schema::{IndexRecordOption, Term};
use crate::termdict::TermDictionary;

/// The inverted index reader is in charge of accessing
/// the inverted index associated to a specific field.
/// the inverted index associated with a specific field.
///
/// # Note
///
/// It is safe to delete the segment associated to
/// It is safe to delete the segment associated with
/// an `InvertedIndexReader`. As long as it is open,
/// the `FileSlice` it is relying on should
/// stay available.
8 changes: 4 additions & 4 deletions src/core/searcher.rs
@@ -69,7 +69,7 @@ pub struct Searcher {
}

impl Searcher {
/// Returns the `Index` associated to the `Searcher`
/// Returns the `Index` associated with the `Searcher`
pub fn index(&self) -> &Index {
&self.inner.index
}
@@ -108,7 +108,7 @@ impl Searcher {
store_reader.get_async(doc_address.doc_id).await
}

/// Access the schema associated to the index of this searcher.
/// Access the schema associated with the index of this searcher.
pub fn schema(&self) -> &Schema {
&self.inner.schema
}
@@ -161,11 +161,11 @@ impl Searcher {
///
/// Search works as follows :
///
/// First the weight object associated to the query is created.
/// First the weight object associated with the query is created.
///
/// Then, the query loops over the segments and for each segment :
/// - setup the collector and informs it that the segment being processed has changed.
/// - creates a SegmentCollector for collecting documents associated to the segment
/// - creates a SegmentCollector for collecting documents associated with the segment
/// - creates a `Scorer` object associated for this segment
/// - iterate through the matched documents and push them to the segment collector.
///
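The search flow documented in the last hunk — build the query's weight, then per segment create a scorer and a segment collector and push every matching document into it — can be sketched with toy types. This models the control flow only; the real `Weight`, `Scorer` and `SegmentCollector` traits differ in detail.

```rust
// Toy model of the per-segment loop described above (not tantivy's real types).
type DocId = u32;

trait ToyScorer {
    /// Next matching doc in this segment, or None when exhausted.
    fn next_doc(&mut self) -> Option<DocId>;
    fn score(&self) -> f32;
}

trait ToySegmentCollector {
    type Fruit;
    fn collect(&mut self, doc: DocId, score: f32);
    fn harvest(self) -> Self::Fruit;
}

/// "Iterate through the matched documents and push them to the segment collector."
fn collect_segment<S: ToyScorer, C: ToySegmentCollector>(mut scorer: S, mut collector: C) -> C::Fruit {
    while let Some(doc) = scorer.next_doc() {
        collector.collect(doc, scorer.score());
    }
    collector.harvest()
}
```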
2 changes: 1 addition & 1 deletion src/core/segment.rs
@@ -70,7 +70,7 @@ impl Segment {
/// Returns the relative path of a component of our segment.
///
/// It just joins the segment id with the extension
/// associated to a segment component.
/// associated with a segment component.
pub fn relative_path(&self, component: SegmentComponent) -> PathBuf {
self.meta.relative_path(component)
}
2 changes: 1 addition & 1 deletion src/core/segment_component.rs
@@ -6,7 +6,7 @@ use std::slice;
/// except the delete component that takes an `segment_uuid`.`delete_opstamp`.`component_extension`
#[derive(Copy, Clone, Eq, PartialEq)]
pub enum SegmentComponent {
/// Postings (or inverted list). Sorted lists of document ids, associated to terms
/// Postings (or inverted list). Sorted lists of document ids, associated with terms
Postings,
/// Positions of terms in each document.
Positions,
10 changes: 5 additions & 5 deletions src/core/segment_reader.rs
@@ -89,7 +89,7 @@ impl SegmentReader {
&self.fast_fields_readers
}

/// Accessor to the `FacetReader` associated to a given `Field`.
/// Accessor to the `FacetReader` associated with a given `Field`.
pub fn facet_reader(&self, field: Field) -> crate::Result<FacetReader> {
let field_entry = self.schema.get_field_entry(field);

@@ -208,13 +208,13 @@ impl SegmentReader {
})
}

/// Returns a field reader associated to the field given in argument.
/// Returns a field reader associated with the field given in argument.
/// If the field was not present in the index during indexing time,
/// the InvertedIndexReader is empty.
///
/// The field reader is in charge of iterating through the
/// term dictionary associated to a specific field,
/// and opening the posting list associated to any term.
/// term dictionary associated with a specific field,
/// and opening the posting list associated with any term.
///
/// If the field is not marked as index, a warn is logged and an empty `InvertedIndexReader`
/// is returned.
@@ -241,7 +241,7 @@

if postings_file_opt.is_none() || record_option_opt.is_none() {
// no documents in the segment contained this field.
// As a result, no data is associated to the inverted index.
// As a result, no data is associated with the inverted index.
//
// Returns an empty inverted index.
let record_option = record_option_opt.unwrap_or(IndexRecordOption::Basic);
8 changes: 4 additions & 4 deletions src/directory/composite_file.rs
@@ -154,14 +154,14 @@ impl CompositeFile {
}
}

/// Returns the `FileSlice` associated
/// to a given `Field` and stored in a `CompositeFile`.
/// Returns the `FileSlice` associated with
/// a given `Field` and stored in a `CompositeFile`.
pub fn open_read(&self, field: Field) -> Option<FileSlice> {
self.open_read_with_idx(field, 0)
}

/// Returns the `FileSlice` associated
/// to a given `Field` and stored in a `CompositeFile`.
/// Returns the `FileSlice` associated with
/// a given `Field` and stored in a `CompositeFile`.
pub fn open_read_with_idx(&self, field: Field, idx: usize) -> Option<FileSlice> {
self.offsets_index
.get(&FileAddr { field, idx })
2 changes: 1 addition & 1 deletion src/directory/directory.rs
@@ -39,7 +39,7 @@ impl RetryPolicy {

/// The `DirectoryLock` is an object that represents a file lock.
///
/// It is associated to a lock file, that gets deleted on `Drop.`
/// It is associated with a lock file, that gets deleted on `Drop.`
pub struct DirectoryLock(Box<dyn Send + Sync + 'static>);

struct DirectoryLockGuard {
2 changes: 1 addition & 1 deletion src/directory/mmap_directory.rs
@@ -334,7 +334,7 @@ impl Directory for MmapDirectory {
Ok(Arc::new(owned_bytes))
}

/// Any entry associated to the path in the mmap will be
/// Any entry associated with the path in the mmap will be
/// removed before the file is deleted.
fn delete(&self, path: &Path) -> result::Result<(), DeleteError> {
let full_path = self.resolve_path(path);
4 changes: 2 additions & 2 deletions src/fastfield/bytes/reader.rs
@@ -39,13 +39,13 @@ impl BytesFastFieldReader {
start..end
}

/// Returns the bytes associated to the given `doc`
/// Returns the bytes associated with the given `doc`
pub fn get_bytes(&self, doc: DocId) -> &[u8] {
let range = self.range(doc);
&self.values.as_slice()[range.start as usize..range.end as usize]
}

/// Returns the length of the bytes associated to the given `doc`
/// Returns the length of the bytes associated with the given `doc`
pub fn num_bytes(&self, doc: DocId) -> u64 {
let range = self.range(doc);
range.end - range.start
6 changes: 3 additions & 3 deletions src/fastfield/bytes/writer.rs
@@ -24,7 +24,7 @@ use crate::DocId;
///
/// Once acquired, writing is done by calling
/// [`.add_document_val(&[u8])`](BytesFastFieldWriter::add_document_val)
/// once per document, even if there are no bytes associated to it.
/// once per document, even if there are no bytes associated with it.
pub struct BytesFastFieldWriter {
field: Field,
vals: Vec<u8>,
@@ -45,7 +45,7 @@ impl BytesFastFieldWriter {
pub fn mem_usage(&self) -> usize {
self.vals.capacity() + self.doc_index.capacity() * std::mem::size_of::<u64>()
}
/// Access the field associated to the `BytesFastFieldWriter`
/// Access the field associated with the `BytesFastFieldWriter`
pub fn field(&self) -> Field {
self.field
}
@@ -67,7 +67,7 @@
}
}

/// Register the bytes associated to a document.
/// Register the bytes associated with a document.
///
/// The method returns the `DocId` of the document that was
/// just written.
6 changes: 3 additions & 3 deletions src/fastfield/facet_reader.rs
@@ -7,7 +7,7 @@ use crate::termdict::{TermDictionary, TermOrdinal};
use crate::DocId;

/// The facet reader makes it possible to access the list of
/// facets associated to a given document in a specific
/// facets associated with a given document in a specific
/// segment.
///
/// Rather than manipulating `Facet` object directly, the API
@@ -58,7 +58,7 @@ impl FacetReader {
&self.term_dict
}

/// Given a term ordinal returns the term associated to it.
/// Given a term ordinal returns the term associated with it.
pub fn facet_from_ord(
&mut self,
facet_ord: TermOrdinal,
@@ -74,7 +74,7 @@
Ok(())
}

/// Return the list of facet ordinals associated to a document.
/// Return the list of facet ordinals associated with a document.
pub fn facet_ords(&self, doc: DocId, output: &mut Vec<u64>) {
self.term_ords.get_vals(doc, output);
}
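Putting the two accessors above together, a hedged sketch of resolving a document's facets: collect its facet ordinals, then turn each ordinal back into a `Facet`. The output-parameter form of `facet_from_ord` is an assumption, since the hunk truncates its signature.

```rust
use tantivy::fastfield::FacetReader;
use tantivy::schema::Facet;
use tantivy::DocId;

/// Sketch: list the facets of `doc` via the ordinal-based API documented above.
fn facets_of_doc(facet_reader: &mut FacetReader, doc: DocId) -> tantivy::Result<Vec<Facet>> {
    let mut ords: Vec<u64> = Vec::new();
    facet_reader.facet_ords(doc, &mut ords);
    let mut facets = Vec::with_capacity(ords.len());
    for ord in ords {
        let mut facet = Facet::root();
        facet_reader.facet_from_ord(ord, &mut facet)?; // output-param signature assumed
        facets.push(facet);
    }
    Ok(facets)
}
```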
2 changes: 1 addition & 1 deletion src/fastfield/mod.rs
@@ -49,7 +49,7 @@ mod writer;
pub trait MultiValueLength {
/// returns the positions for a docid
fn get_range(&self, doc_id: DocId) -> std::ops::Range<u64>;
/// returns the num of values associated to a doc_id
/// returns the num of values associated with a doc_id
fn get_len(&self, doc_id: DocId) -> u64;
/// returns the sum of num values for all doc_ids
fn get_total_len(&self) -> u64;
8 changes: 4 additions & 4 deletions src/fastfield/multivalued/reader.rs
@@ -30,8 +30,8 @@ impl<Item: FastValue> MultiValuedFastFieldReader<Item> {
}
}

/// Returns `[start, end)`, such that the values associated
/// to the given document are `start..end`.
/// Returns `[start, end)`, such that the values associated with
/// the given document are `start..end`.
#[inline]
fn range(&self, doc: DocId) -> Range<u64> {
let idx = doc as u64;
@@ -40,15 +40,15 @@
start..end
}

/// Returns the array of values associated to the given `doc`.
/// Returns the array of values associated with the given `doc`.
#[inline]
fn get_vals_for_range(&self, range: Range<u64>, vals: &mut Vec<Item>) {
let len = (range.end - range.start) as usize;
vals.resize(len, Item::make_zero());
self.vals_reader.get_range(range.start, &mut vals[..]);
}

/// Returns the array of values associated to the given `doc`.
/// Returns the array of values associated with the given `doc`.
#[inline]
pub fn get_vals(&self, doc: DocId, vals: &mut Vec<Item>) {
let range = self.range(doc);
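The `range`/`get_vals` pair above (and the bytes fast field reader earlier in this diff) share one layout: an offsets column into a flat value buffer. A toy model of that layout, independent of tantivy's types:

```rust
/// Toy model of the multivalued layout documented above: `idx[d]..idx[d + 1]`
/// delimits doc `d`'s slice of the flat `vals` buffer.
struct ToyMultiValued {
    idx: Vec<u64>,  // len = num_docs + 1, monotonically increasing offsets
    vals: Vec<u64>, // every document's values, concatenated
}

impl ToyMultiValued {
    fn range(&self, doc: u32) -> std::ops::Range<usize> {
        self.idx[doc as usize] as usize..self.idx[doc as usize + 1] as usize
    }

    fn get_vals(&self, doc: u32, output: &mut Vec<u64>) {
        output.clear();
        output.extend_from_slice(&self.vals[self.range(doc)]);
    }

    fn num_vals(&self, doc: u32) -> usize {
        self.range(doc).len()
    }
}
```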
2 changes: 1 addition & 1 deletion src/fastfield/multivalued/writer.rs
@@ -62,7 +62,7 @@ impl MultiValuedFastFieldWriter {
+ self.doc_index.capacity() * std::mem::size_of::<u64>()
}

/// Access the field associated to the `MultiValuedFastFieldWriter`
/// Access the field associated with the `MultiValuedFastFieldWriter`
pub fn field(&self) -> Field {
self.field
}