Feat(vector store): lancedb #33

Merged · 45 commits · Oct 7, 2024
9d7cbb7
feat: start implementing VectorStore trait for lancedb
marieaurore123 Sep 16, 2024
1d9fd64
refactor: create wrapper for vec<DocumentEmbeddings> for from/tryfrom…
marieaurore123 Sep 16, 2024
e9d18c5
feat: implement add_documents on VectorStore trait
marieaurore123 Sep 17, 2024
d3c7f9a
feat: implement search by id for VectorStore trait
marieaurore123 Sep 18, 2024
7be1f85
feat: implement get_document method of VectorStore trait
marieaurore123 Sep 19, 2024
168f53e
feat: start implementing top_n_from_query for trait VectorStoreIndex
marieaurore123 Sep 19, 2024
27c6bda
Merge branch 'main' into feat(vector-store)/lancedb
marieaurore123 Sep 19, 2024
1e52e6a
docs: add doc string to mongodb search params struct
marieaurore123 Sep 19, 2024
bb5c767
docs: Add doc strings to utility methods
marieaurore123 Sep 19, 2024
b788cd5
feat: implement ANN search example
marieaurore123 Sep 20, 2024
d62bbbf
refactor: conversions from arrow types to primitive types
marieaurore123 Sep 23, 2024
22b43ba
feat: add vector_search_s3_ann example
marieaurore123 Sep 23, 2024
ad45690
feat: create enum for embedding models
marieaurore123 Sep 23, 2024
9cba7a1
Merge branch 'main' into feat(vector-store)/lancedb
marieaurore123 Sep 23, 2024
e22d778
ci: makes the protoc compiler available on github workflows
marieaurore123 Sep 23, 2024
4debc0e
fix: reduce opanai generated content in ANN examples
marieaurore123 Sep 23, 2024
7b71aa1
feat: add indexes and tables for simple search
marieaurore123 Sep 23, 2024
e63d5a1
style: cargo fmt
marieaurore123 Sep 23, 2024
4d28f61
refactor: remove associated type on VectorStoreIndex trait
marieaurore123 Sep 24, 2024
0bcf5a3
refactor: use constants instead of enum for model names
marieaurore123 Sep 24, 2024
2f5844d
fix: make PR requested changes
marieaurore123 Sep 24, 2024
2436ca3
Merge branch 'main' into feat(vector-store)/lancedb
marieaurore123 Sep 24, 2024
dfe32e2
fix: make PR requested changes
marieaurore123 Sep 24, 2024
5644a1d
style: cargo clippy
marieaurore123 Sep 24, 2024
6fede36
feat: implement deserialization for any recordbatch returned from lan…
marieaurore123 Sep 25, 2024
4a22b15
feat: finish implementing deserialiser for record batch
marieaurore123 Sep 26, 2024
ff85fa5
refactor: remove print statement
marieaurore123 Sep 26, 2024
57d8287
Merge branch 'main' into feat(vector-store)/lancedb
marieaurore123 Oct 1, 2024
70802b3
refactor: update rig core version on lancedb crate, remove implementa…
marieaurore123 Oct 1, 2024
313cf14
Merge branch 'main' into feat(vector-store)/lancedb
marieaurore123 Oct 1, 2024
205f0c7
feat: merge all arrow columns into JSON document in deserializer
marieaurore123 Oct 1, 2024
921b313
feat: replace document embeddings with serde json value
marieaurore123 Oct 2, 2024
0050925
feat: update examples to use new version of VectorStoreIndex trait
marieaurore123 Oct 2, 2024
4a6a87d
docs: add doc strings
marieaurore123 Oct 2, 2024
ec44d4a
fix: fix bug in deserializing type run end
marieaurore123 Oct 2, 2024
edae694
docs: add example docstring
marieaurore123 Oct 2, 2024
9c3eb0e
fix: mongodb vector search - use num_candidates from search params
marieaurore123 Oct 2, 2024
3eef745
fix(lancedb): replace VectorStoreIndexDyn with VectorStoreIndex in ex…
marieaurore123 Oct 2, 2024
f0840fb
Merge branch 'main' into feat(vector-store)/lancedb
marieaurore123 Oct 3, 2024
27435e4
fix: make PR changes pt I
marieaurore123 Oct 3, 2024
b55e86e
fix: make PR changes Pt II
marieaurore123 Oct 4, 2024
cc5a328
fix(ci): install protobuf-compiler in test job
marieaurore123 Oct 4, 2024
9a310cb
fix: update lancedb examples test data
marieaurore123 Oct 7, 2024
9b09639
refactor: lance db examples
marieaurore123 Oct 7, 2024
d5dc56a
style: cargo fmt
marieaurore123 Oct 7, 2024
67 changes: 67 additions & 0 deletions rig-lancedb/examples/fixtures/lib.rs
@@ -0,0 +1,67 @@
use std::sync::Arc;

use arrow_array::{types::Float64Type, ArrayRef, FixedSizeListArray, RecordBatch, StringArray};
use lancedb::arrow::arrow_schema::{DataType, Field, Fields, Schema};
use rig::embeddings::DocumentEmbeddings;

// Schema of table in LanceDB.
pub fn schema(dims: usize) -> Schema {
    Schema::new(Fields::from(vec![
        Field::new("id", DataType::Utf8, false),
        Field::new("content", DataType::Utf8, false),
        Field::new(
            "embedding",
            DataType::FixedSizeList(
                Arc::new(Field::new("item", DataType::Float64, true)),
                dims as i32,
            ),
            false,
        ),
    ]))
}

// Convert DocumentEmbeddings objects to a RecordBatch.
pub fn as_record_batch(
    records: Vec<DocumentEmbeddings>,
    dims: usize,
) -> Result<RecordBatch, lancedb::arrow::arrow_schema::ArrowError> {
    let id = StringArray::from_iter_values(
        records
            .iter()
            .flat_map(|record| (0..record.embeddings.len()).map(|i| format!("{}-{i}", record.id)))
            .collect::<Vec<_>>(),
    );

    let content = StringArray::from_iter_values(
        records
            .iter()
            .flat_map(|record| {
                record
                    .embeddings
                    .iter()
                    .map(|embedding| embedding.document.clone())
            })
            .collect::<Vec<_>>(),
    );

    let embedding = FixedSizeListArray::from_iter_primitive::<Float64Type, _, _>(
        records
            .into_iter()
            .flat_map(|record| {
                record
                    .embeddings
                    .into_iter()
                    .map(|embedding| embedding.vec.into_iter().map(Some).collect::<Vec<_>>())
                    .map(Some)
                    .collect::<Vec<_>>()
            })
            .collect::<Vec<_>>(),
        dims as i32,
    );

    RecordBatch::try_from_iter(vec![
        ("id", Arc::new(id) as ArrayRef),
        ("content", Arc::new(content) as ArrayRef),
        ("embedding", Arc::new(embedding) as ArrayRef),
    ])
}
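The flattening logic in `as_record_batch` can be sketched without the arrow dependency. The sketch below is a simplified model, not rig's actual API: `Doc` is a hypothetical stand-in for `DocumentEmbeddings`, but the row layout mirrors the code above — one row per embedded chunk, with ids of the form `"{doc_id}-{chunk_index}"`.

```rust
// Simplified stand-in for rig's DocumentEmbeddings (illustrative only).
struct Doc {
    id: String,
    embeddings: Vec<(String, Vec<f64>)>, // (chunk text, embedding vector)
}

// Mirrors as_record_batch: flatten documents into three parallel columns.
fn flatten(docs: Vec<Doc>) -> (Vec<String>, Vec<String>, Vec<Vec<f64>>) {
    let mut ids = Vec::new();
    let mut contents = Vec::new();
    let mut vectors = Vec::new();
    for doc in docs {
        for (i, (text, vec)) in doc.embeddings.into_iter().enumerate() {
            // Each chunk gets its own row; ids are "{doc_id}-{chunk_index}".
            ids.push(format!("{}-{i}", doc.id));
            contents.push(text);
            vectors.push(vec);
        }
    }
    (ids, contents, vectors)
}

fn main() {
    let docs = vec![Doc {
        id: "doc0".into(),
        embeddings: vec![("chunk a".into(), vec![0.1, 0.2]), ("chunk b".into(), vec![0.3, 0.4])],
    }];
    let (ids, contents, vectors) = flatten(docs);
    println!("{ids:?} {contents:?} {}", vectors.len());
}
```

Note that the real function additionally wraps each column in an arrow array (`StringArray`, `FixedSizeListArray`) so LanceDB can store it as a `RecordBatch`.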
71 changes: 51 additions & 20 deletions rig-lancedb/examples/vector_search_local_ann.rs
Contributor:
The examples sometimes fail if GPT-4o doesn't generate valid JSON synthetic data. To make the examples more reliable (and simpler), I would just create a dummy definition and copy it 256 times.

Contributor:
Remove AI generated data in favor of repeated hardcoded data
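The reviewer's suggestion — skip the LLM call and repeat one fixed definition — could be sketched like this (a hypothetical snippet, not code from the PR):

```rust
// Build test data by repeating one hardcoded definition instead of asking
// GPT-4o to invent synthetic data (which sometimes returns invalid JSON).
fn repeated_definitions(n: usize) -> Vec<String> {
    let dummy = "Definition of *flumbrel (noun)*: a small item you constantly misplace.";
    std::iter::repeat(dummy.to_string()).take(n).collect()
}

fn main() {
    let definitions = repeated_definitions(256);
    println!("{}", definitions.len());
}
```

This removes both the OpenAI dependency and the JSON-parsing failure mode from the examples' setup phase.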

@@ -1,29 +1,35 @@
-use std::env;
+use std::{env, sync::Arc};
 
+use arrow_array::RecordBatchIterator;
+use fixture::{as_record_batch, schema};
 use lancedb::{index::vector::IvfPqIndexBuilder, DistanceType};
 use rig::{
     completion::Prompt,
-    embeddings::EmbeddingsBuilder,
+    embeddings::{EmbeddingModel, EmbeddingsBuilder},
     providers::openai::{Client, TEXT_EMBEDDING_ADA_002},
-    vector_store::{VectorStore, VectorStoreIndexDyn},
+    vector_store::VectorStoreIndexDyn,
 };
 use rig_lancedb::{LanceDbVectorStore, SearchParams};
+use serde::Deserialize;
 
+#[path = "./fixtures/lib.rs"]
+mod fixture;
+
+#[derive(Deserialize, Debug)]
+pub struct VectorSearchResult {
+    pub id: String,
+    pub content: String,
+}
 
 #[tokio::main]
 async fn main() -> Result<(), anyhow::Error> {
     // Initialize OpenAI client. Use this to generate embeddings (and generate test data for RAG demo).
     let openai_api_key = env::var("OPENAI_API_KEY").expect("OPENAI_API_KEY not set");
     let openai_client = Client::new(&openai_api_key);
 
-    // Select the embedding model and generate our embeddings
+    // Select an embedding model.
     let model = openai_client.embedding_model(TEXT_EMBEDDING_ADA_002);
 
-    let search_params = SearchParams::default().distance_type(DistanceType::Cosine);
-
-    // Initialize LanceDB locally.
-    let db = lancedb::connect("data/lancedb-store").execute().await?;
-    let mut vector_store = LanceDbVectorStore::new(&db, &model, &search_params).await?;
-
     // Generate test data for RAG demo
     let agent = openai_client
         .agent("gpt-4o")
@@ -39,6 +45,7 @@ async fn main() -> Result<(), anyhow::Error> {
     definitions.extend(definitions.clone());
     definitions.extend(definitions.clone());
 
+    // Generate embeddings for the test data.
     let embeddings = EmbeddingsBuilder::new(model.clone())
         .simple_document("doc0", "Definition of *flumbrel (noun)*: a small, seemingly insignificant item that you constantly lose or misplace, such as a pen, hair tie, or remote control.")
         .simple_document("doc1", "Definition of *zindle (verb)*: to pretend to be working on something important while actually doing something completely unrelated or unproductive")
@@ -47,26 +54,50 @@ async fn main() -> Result<(), anyhow::Error> {
         .build()
         .await?;
 
-    // Add embeddings to vector store
-    // vector_store.add_documents(embeddings).await?;
+    // Define search_params params that will be used by the vector store to perform the vector search.
+    let search_params = SearchParams::default().distance_type(DistanceType::Cosine);
+
+    // Initialize LanceDB locally.
+    let db = lancedb::connect("data/lancedb-store").execute().await?;
+
+    // Create table with embeddings.
+    let record_batch = as_record_batch(embeddings, model.ndims());
+    let table = db
+        .create_table(
+            "definitions",
+            RecordBatchIterator::new(vec![record_batch], Arc::new(schema(model.ndims()))),
+        )
+        .execute()
+        .await?;
+
+    let vector_store = LanceDbVectorStore::new(table, model, "id", search_params).await?;
+
     // See [LanceDB indexing](https://lancedb.github.io/lancedb/concepts/index_ivfpq/#product-quantization) for more information
     vector_store
-        .create_index(lancedb::index::Index::IvfPq(
-            IvfPqIndexBuilder::default()
-                // This overrides the default distance type of L2.
-                // Needs to be the same distance type as the one used in search params.
-                .distance_type(DistanceType::Cosine),
-        ))
+        .create_index(
+            lancedb::index::Index::IvfPq(
+                IvfPqIndexBuilder::default()
+                    // This overrides the default distance type of L2.
+                    // Needs to be the same distance type as the one used in search params.
+                    .distance_type(DistanceType::Cosine),
+            ),
+            &["embedding"],
+        )
         .await?;
 
     // Query the index
     let results = vector_store
         .top_n("My boss says I zindle too much, what does that mean?", 1)
         .await?
         .into_iter()
-        .map(|(score, id, doc)| (score, id, doc))
-        .collect::<Vec<_>>();
+        .map(|(score, id, doc)| {
+            anyhow::Ok((
+                score,
+                id,
+                serde_json::from_value::<VectorSearchResult>(doc)?,
+            ))
+        })
+        .collect::<Result<Vec<_>, _>>()?;
 
     println!("Results: {:?}", results);
 
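The example above bulks up its test corpus with the repeated `definitions.extend(definitions.clone())` calls; each call doubles the vector, so two calls quadruple the data. A standalone sketch of that pattern:

```rust
// Each self-extend appends a copy of the vector to itself, doubling its
// length; two calls therefore yield 4x the original data.
fn double_twice(mut defs: Vec<String>) -> Vec<String> {
    defs.extend(defs.clone());
    defs.extend(defs.clone());
    defs
}

fn main() {
    let defs = double_twice(vec!["a".into(), "b".into()]);
    println!("{}", defs.len());
}
```

This is a cheap way to reach the row counts an IVF-PQ index needs for training without generating more unique data.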
42 changes: 28 additions & 14 deletions rig-lancedb/examples/vector_search_local_enn.rs
@@ -1,11 +1,17 @@
-use std::env;
+use std::{env, sync::Arc};
 
+use arrow_array::RecordBatchIterator;
+use fixture::{as_record_batch, schema};
 use rig::{
-    embeddings::EmbeddingsBuilder,
+    embeddings::{EmbeddingModel, EmbeddingsBuilder},
     providers::openai::{Client, TEXT_EMBEDDING_ADA_002},
-    vector_store::{VectorStore, VectorStoreIndexDyn},
+    vector_store::VectorStoreIndexDyn,
 };
 use rig_lancedb::{LanceDbVectorStore, SearchParams};
+use serde::Deserialize;
+
+#[path = "./fixtures/lib.rs"]
+mod fixture;
 
 #[tokio::main]
 async fn main() -> Result<(), anyhow::Error> {
@@ -16,27 +22,35 @@ async fn main() -> Result<(), anyhow::Error> {
     // Select the embedding model and generate our embeddings
     let model = openai_client.embedding_model(TEXT_EMBEDDING_ADA_002);
 
-    // Initialize LanceDB locally.
-    let db = lancedb::connect("data/lancedb-store").execute().await?;
-    let mut vector_store = LanceDbVectorStore::new(&db, &model, &SearchParams::default()).await?;
-
     let embeddings = EmbeddingsBuilder::new(model.clone())
         .simple_document("doc0", "Definition of *flumbrel (noun)*: a small, seemingly insignificant item that you constantly lose or misplace, such as a pen, hair tie, or remote control.")
         .simple_document("doc1", "Definition of *zindle (verb)*: to pretend to be working on something important while actually doing something completely unrelated or unproductive")
         .simple_document("doc2", "Definition of *glimber (adjective)*: describing a state of excitement mixed with nervousness, often experienced before an important event or decision.")
         .build()
         .await?;
 
-    // Add embeddings to vector store
-    // vector_store.add_documents(embeddings).await?;
+    // Define search_params params that will be used by the vector store to perform the vector search.
+    let search_params = SearchParams::default();
+
+    // Initialize LanceDB locally.
+    let db = lancedb::connect("data/lancedb-store").execute().await?;
+
+    // Create table with embeddings.
+    let record_batch = as_record_batch(embeddings, model.ndims());
+    let table = db
+        .create_table(
+            "definitions",
+            RecordBatchIterator::new(vec![record_batch], Arc::new(schema(model.ndims()))),
+        )
+        .execute()
+        .await?;
+
+    let vector_store = LanceDbVectorStore::new(table, model, "id", search_params).await?;
 
     // Query the index
     let results = vector_store
-        .top_n("My boss says I zindle too much, what does that mean?", 1)
-        .await?
-        .into_iter()
-        .map(|(score, id, doc)| (score, id, doc))
-        .collect::<Vec<_>>();
+        .top_n_ids("My boss says I zindle too much, what does that mean?", 1)
+        .await?;
 
     println!("Results: {:?}", results);
 
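Where the ANN example deserializes full documents from `top_n`, this ENN example switches to `top_n_ids`, which returns only scores and ids. A rough illustration of the difference between the two result shapes (hypothetical types, not rig's actual signatures):

```rust
// (score, id, serialized document) — the shape a top_n-style query yields.
type Hit = (f64, String, String);

// A top_n_ids-style view drops the document payload, keeping (score, id).
fn ids_only(hits: Vec<Hit>) -> Vec<(f64, String)> {
    hits.into_iter().map(|(score, id, _doc)| (score, id)).collect()
}

fn main() {
    let hits = vec![(0.93, "doc1-0".to_string(), "{\"content\":\"...\"}".to_string())];
    println!("{:?}", ids_only(hits));
}
```

Skipping the payload means no deserialization step (and no `VectorSearchResult` struct) is needed in this example.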
83 changes: 56 additions & 27 deletions rig-lancedb/examples/vector_search_s3_ann.rs
Contributor:
Remove AI generated data in favor of repeated hardcoded data

@@ -1,17 +1,28 @@
-use std::env;
+use std::{env, sync::Arc};
 
+use arrow_array::RecordBatchIterator;
+use fixture::{as_record_batch, schema};
 use lancedb::{index::vector::IvfPqIndexBuilder, DistanceType};
 use rig::{
     completion::Prompt,
-    embeddings::EmbeddingsBuilder,
+    embeddings::{EmbeddingModel, EmbeddingsBuilder},
     providers::openai::{Client, TEXT_EMBEDDING_ADA_002},
-    vector_store::{VectorStore, VectorStoreIndexDyn},
+    vector_store::VectorStoreIndexDyn,
 };
 use rig_lancedb::{LanceDbVectorStore, SearchParams};
+use serde::Deserialize;
 
+#[path = "./fixtures/lib.rs"]
+mod fixture;
+
+#[derive(Deserialize, Debug)]
+pub struct VectorSearchResult {
+    pub id: String,
+    pub content: String,
+}
+
+// Note: see docs to deploy LanceDB on other cloud providers such as google and azure.
+// https://lancedb.github.io/lancedb/guides/storage/
 
 #[tokio::main]
 async fn main() -> Result<(), anyhow::Error> {
     // Initialize OpenAI client. Use this to generate embeddings (and generate test data for RAG demo).
@@ -21,23 +32,13 @@ async fn main() -> Result<(), anyhow::Error> {
     // Select the embedding model and generate our embeddings
     let model = openai_client.embedding_model(TEXT_EMBEDDING_ADA_002);
 
-    let search_params = SearchParams::default().distance_type(DistanceType::Cosine);
-
-    // Initialize LanceDB on S3.
-    // Note: see below docs for more options and IAM permission required to read/write to S3.
-    // https://lancedb.github.io/lancedb/guides/storage/#aws-s3
-    let db = lancedb::connect("s3://lancedb-test-829666124233")
-        .execute()
-        .await?;
-    let mut vector_store = LanceDbVectorStore::new(&db, &model, &search_params).await?;
-
     // Generate test data for RAG demo
     let agent = openai_client
         .agent("gpt-4o")
         .preamble("Return the answer as JSON containing a list of strings in the form: `Definition of {generated_word}: {generated definition}`. Return ONLY the JSON string generated, nothing else.")
         .build();
     let response = agent
-        .prompt("Invent at least 100 words and their definitions")
+        .prompt("Invent 100 words and their definitions")
         .await?;
     let mut definitions: Vec<String> = serde_json::from_str(&response)?;
 
@@ -46,34 +47,62 @@ async fn main() -> Result<(), anyhow::Error> {
     definitions.extend(definitions.clone());
     definitions.extend(definitions.clone());
 
-    let embeddings: Vec<rig::embeddings::DocumentEmbeddings> = EmbeddingsBuilder::new(model.clone())
+    // Generate embeddings for the test data.
+    let embeddings = EmbeddingsBuilder::new(model.clone())
         .simple_document("doc0", "Definition of *flumbrel (noun)*: a small, seemingly insignificant item that you constantly lose or misplace, such as a pen, hair tie, or remote control.")
         .simple_document("doc1", "Definition of *zindle (verb)*: to pretend to be working on something important while actually doing something completely unrelated or unproductive")
         .simple_document("doc2", "Definition of *glimber (adjective)*: describing a state of excitement mixed with nervousness, often experienced before an important event or decision.")
         .simple_documents(definitions.clone().into_iter().enumerate().map(|(i, def)| (format!("doc{}", i+3), def)).collect())
         .build()
         .await?;
 
-    // Add embeddings to vector store
-    // vector_store.add_documents(embeddings).await?;
+    // Define search_params params that will be used by the vector store to perform the vector search.
+    let search_params = SearchParams::default().distance_type(DistanceType::Cosine);
+
+    // Initialize LanceDB on S3.
+    // Note: see below docs for more options and IAM permission required to read/write to S3.
+    // https://lancedb.github.io/lancedb/guides/storage/#aws-s3
+    let db = lancedb::connect("s3://lancedb-test-829666124233")
+        .execute()
+        .await?;
+    // Create table with embeddings.
+    let record_batch = as_record_batch(embeddings, model.ndims());
+    let table = db
+        .create_table(
+            "definitions",
+            RecordBatchIterator::new(vec![record_batch], Arc::new(schema(model.ndims()))),
+        )
+        .execute()
+        .await?;
+
+    let vector_store = LanceDbVectorStore::new(table, model, "id", search_params).await?;
 
     // See [LanceDB indexing](https://lancedb.github.io/lancedb/concepts/index_ivfpq/#product-quantization) for more information
     vector_store
-        .create_index(lancedb::index::Index::IvfPq(
-            IvfPqIndexBuilder::default()
-                // This overrides the default distance type of L2.
-                // Needs to be the same distance type as the one used in search params.
-                .distance_type(DistanceType::Cosine),
-        ))
+        .create_index(
+            lancedb::index::Index::IvfPq(
+                IvfPqIndexBuilder::default()
+                    // This overrides the default distance type of L2.
+                    // Needs to be the same distance type as the one used in search params.
+                    .distance_type(DistanceType::Cosine),
+            ),
+            &["embedding"],
+        )
         .await?;
 
     // Query the index
     let results = vector_store
-        .top_n("My boss says I zindle too much, what does that mean?", 1)
+        .top_n("I'm always looking for my phone, I always seem to forget it in the most counterintuitive places. What's the word for this feeling?", 1)
         .await?
         .into_iter()
-        .map(|(score, id, doc)| (score, id, doc))
-        .collect::<Vec<_>>();
+        .map(|(score, id, doc)| {
+            anyhow::Ok((
+                score,
+                id,
+                serde_json::from_value::<VectorSearchResult>(doc)?,
+            ))
+        })
+        .collect::<Result<Vec<_>, _>>()?;
 
     println!("Results: {:?}", results);

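In the S3 example, the LLM-generated definitions are appended after the three hardcoded documents, so their ids are offset to start at `doc3`. The id-assignment step used in `.simple_documents(...)` can be isolated as:

```rust
// Assign ids "doc3", "doc4", ... to generated definitions so they follow
// the hardcoded doc0..doc2 entries.
fn assign_ids(definitions: Vec<String>) -> Vec<(String, String)> {
    definitions
        .into_iter()
        .enumerate()
        .map(|(i, def)| (format!("doc{}", i + 3), def))
        .collect()
}

fn main() {
    let docs = assign_ids(vec!["def a".to_string(), "def b".to_string()]);
    println!("{:?}", docs);
}
```

Keeping the ids disjoint matters because the fixture's `as_record_batch` derives row ids from document ids, and duplicate ids would make search results ambiguous.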