
IMDB Metadata Filtering


Read CSV data into a pandas dataframe

The data used in this notebook is the Kaggle IMDB dataset of ~48K movies, which contains a lot of metadata in addition to the raw review text.

Usually there is a data cleaning step, such as replacing empty strings or filling unusual and empty fields with median values. Below, I'll just drop rows with null values.
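A minimal sketch of what the read-and-clean cells below do; the CSV filename is a hypothetical placeholder:

import pandas as pd

# Read the Kaggle movies CSV into a dataframe (filename is an assumption).
df = pd.read_csv("movies.csv")
print(len(df))    # 48513 rows before cleaning

# Drop rows with any null values rather than imputing them.
df = df.dropna().reset_index(drop=True)
print(len(df))    # 45036 rows after cleaning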

[1]
[2]
[3]
48513
45036
Example text length: 6556
('Example text: Sallie Gardner at a Gallop Sallie Gardner at a Gallop is a '
 'short starring Gilbert Domm and Sallie Gardner. The clip shows a jockey, '
 'Domm, riding a horse, Sally Gardner. The clip is not filmed but instead '
 'consists of 24 individual photographs shot in rapid... Sometimes ascribed as '
 '"The Father of the Motion Picture", Eadweard Muybridge undeniably '
 'accomplished exploiting and sometimes introducing a means of instantaneous '
 'and serial images to analyze and synthesize animal locomotion. In part, the '
 "reasons for and the claims made of his work support Virgilio Tosi's thesis "
 'that cinema was invented out of the needs of scientific research. '
 "Conversely, they're informed by Muybridge's background as an artistic "
 'location photographer and, as Phillip Prodger suggests, in book sales and '
 'more useful to art than to science, as Marta Braun has demonstrated (see '
 'sources at bottom). Additionally, Muybridge quickly exploited their '
 'entertainment value via projection to audiences across the U.S. and Europe. '
 'Muybridge pursued both of these paths of invention: the path taken by Jules '
 'Janssen, Étienne-Jules Marey and others for science and the path taken by '
 'Ottomar Anschütz, Thomas Edison, the Lumiére brothers and others for fame '
 'and profit.\n'
 '\n'
 'Muybridge began taking instantaneous single photographs of multi-millionaire '
 "railroad magnate Leland Stanford's horses in motion in 1872. It was disputed "
 "at the time whether all four of a horse's legs were off the ground "
 'simultaneously at any time while running. Although no surviving photographs '
 'prove it, contemporary lithographs and paintings likely based on the '
 'photographs, indeed, show the moment of "unsupported transit". In between '
 'and interrupting these experiments, Muybridge was found not guilty of the '
 "admittedly premeditated fatal shooting of his wife's lover and possibly her "
 "son's father.\n"
 '\n'
 "Publication of Marey's graphic measurements of a horse's movements reignited "
 "Stanford's interest in the gait of horses. In turn, Marey was convinced to "
 "switch to photography in his motion studies after witnessing Muybridge's "
 'work (see "Falling Cat" (1894)). This work in "automatic '
 'electro-photographs" began in 1878 at Stanford\'s Palo Alto Stock Farm. '
 'Multiple cameras were stored in a shed parallel to a track. A series of '
 'closing boards serving as shutters were triggered by tripped threads and '
 'electrical means. The wet collodion process of the time, reportedly, could '
 'need up to half a minute for an exposure. For the split-second shutter '
 'speeds required here, a white canvas background and powdered lime on the '
 'track provided more contrast to compensate for less light getting to the '
 "glass plates. Employees of Stanford's Central Pacific Railroad and others "
 'helped in constructing this "set" and camera equipment.\n'
 '\n'
 'Contrary to unattributed claims on the web, this so-called "Sallie Gardner '
 'at a Gallop" wasn\'t the first series photographed by Muybridge. Six series '
 'of Muybridge\'s first subjects were published on cards entitled "The Horse '
 'in Motion". The first is of the horse Abe Edgington trotting on 11 June '
 '1878. Reporters were invited for the next two series on June 15th, and, as '
 'they reported, again, Abe went first—trotting and pulling the driver behind '
 'in a sulky, which is what tripped the threads. The second subject that day '
 'was Sallie Gardner running and, thus, the mare had to trip the threads. '
 'Reporters noted how this spooked her and how that was reflected in the '
 'negatives developed on the spot. As one article said, she "gave a wild bound '
 'in the air, breaking the saddle girth as she left the ground." Based on such '
 "descriptions, it doesn't seem that this series exists anymore. The "
 'animations on the web that are actually of Sallie are dated June 19th on '
 '"The Horse in Motion" card. Many animations claimed to be Sallie on YouTube, '
 'Wikipedia and elsewhere, as of this date, are actually of a mare named Annie '
 "G. and were part of Muybridge's University of Pennsylvania work published in "
 '1887, as the Library of Congress and other reliable sources have made clear. '
 "The early Palo Alto photographs aren't as detailed and are closer to "
 "silhouettes. The 12 images of Gardner also include one where she's "
 "stationary. The Morse's Gallery pictures are entirely in silhouette, while "
 'the La Nature engravings of these same images show the rider in a white '
 'shirt.\n'
 '\n'
 'The shot of the horse stationary, as Braun points out, was added later and '
 'is indicative of the artistic and un-scientific assemblages Muybridge made '
 'of his images—with the intent of publication, including in his own books. '
 'This was especially prominent in his Pennsylvania work, which included many '
 'nude models that were surely useful for art. Muybridge influenced artists '
 'from Realists like Thomas Eakins and Meissonier, Impressionists like Edgar '
 'Degas and Frederick Remington, to the more abstract works of Francis Bacon. '
 'His precedence has also been cited in the photography of Steven Pippin and '
 'Hollis Frampton, as well as the bullet-time effects in "The Matrix" (1999).\n'
 '\n'
 'Muybridge lectured on this relationship with art when touring with his '
 'Zoöpraxiscope, which was a combination of the magic lantern and '
 'phenakistoscope. With it, he projected, from glass disks, facsimiles of his '
 'photographs hand-painted by Erwin Faber. Without intermittent movement, the '
 'Zoöpraxiscope compressed the images, so elongated drawings were used instead '
 'of photographs. Muybridge and others also used his images for '
 'phenakistoscopes and zoetropes. The first demonstration of the Zoöpraxiscope '
 'was to Stanford and friends in the autumn of 1879. A public demonstration '
 'was given on 4 May 1880 for the San Francisco art association, and Muybridge '
 'continued these lectures for years—personally touring the U.S. and Europe. '
 'Although there were predecessors in animated projections as far back as 1847 '
 'by Leopold Ludwig Döbler, in 1853 by Franz von Uchatius, and with posed '
 'photographs by Henry Heyl in 1870, the chronophotographic and artistic basis '
 "offered some novelty for Muybridge's presentations. They also led him to "
 'meet Edison and Marey and inspire the likes of Anschütz and others—those who '
 'took the next steps in the invention of movies.\n'
 '\n'
 '(Main Sources: "The Inventor and the Tycoon" by Edward Ball. "Eadweard '
 'Muybridge" and "Picturing Time" by Marta Braun. "The Man Who Stopped Time" '
 'by Brian Clegg. "Man in Motion" by Robert Bartlett Haas. "The Father of the '
 'Motion Picture" by Gordon Hendricks. "The Stanford Years, 1872-1882" edited '
 'by Anita Ventura Mozley. "Time Stands Still" by Phillip Prodger. "Cinema '
 'Before Cinema" by Virgilio Tosi.)')
id               int64
url             object
Name            object
PosterLink      object
Genres          object
Actors          object
Director        object
Keywords        object
RatingValue    float32
text            object
MovieYear        int64
dtype: object

Connect using Milvus Lite

Milvus Lite is a lightweight version of Milvus that runs locally inside your Python process. It's ideal for getting started with Milvus, running on a laptop, in a Jupyter notebook, or on Colab.

⛔️ Please note Milvus Lite is only meant for demos, not for production workloads.
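A minimal sketch of the connection cell below, assuming pymilvus >= 2.4.2 (which bundles Milvus Lite); the local database filename is a hypothetical placeholder:

import pymilvus
from pymilvus import MilvusClient

print(f"pymilvus:{pymilvus.__version__}")

# Passing a local file path starts an embedded Milvus Lite server.
mc = MilvusClient("milvus_demo.db")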

[4]
pymilvus:2.4.3

Optional - Connect using Milvus standalone Docker

This section uses Milvus standalone on Docker.

⛔️ Make sure the pymilvus version you pip install matches the server version in your yml file. Major and minor versions should all match.

  1. Install Docker
  2. Start your Docker Desktop
  3. Download the latest docker-compose.yml (or run the wget command below, replacing the version with the one you are using)

wget https://github.com/milvus-io/milvus/releases/download/v2.4.0-rc.1/milvus-standalone-docker-compose.yml -O docker-compose.yml

  4. From your terminal:
    • cd into the directory where you saved the .yml file (usually the same directory as this notebook)
    • docker compose up -d
    • verify (either in the terminal or on Docker Desktop) that the containers are running
  5. From your code (see notebook code below):
    • Import pymilvus
    • Connect to the local Milvus server
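A minimal sketch of the Docker connection, assuming the default gRPC port from the standard docker-compose.yml:

from pymilvus import connections

# Connect to the standalone Milvus server started by docker compose.
connections.connect("default", host="localhost", port="19530")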
[5]

Load the Embedding Model checkpoint and use it to create vector embeddings

Embedding model: We will use an open-source sentence transformer available on HuggingFace to encode the movie text. We will download the model from HuggingFace and run it locally.

💡Tip: A good way to choose a sentence transformer model is to check the MTEB Leaderboard. Sort descending by column "Retrieval Average" and choose the best-performing small model.

Two model parameters of note below:

  1. EMBEDDING_DIM refers to the dimensionality or length of the embedding vector. In this case, the embeddings generated for EACH token in the input text will have the SAME length = 1024. This size of embedding is often associated with BERT-based models, where the embeddings are used for downstream tasks such as classification, question answering, or text generation.

  2. MAX_SEQ_LENGTH is the maximum context length the encoder model can handle for input sequences. In this case, if a sequence longer than 512 tokens is given to the model, everything beyond 512 tokens will be (silently!) truncated. This is why a chunking strategy is needed to segment input texts into chunks whose lengths fit in the model's input.
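A sketch of the model-loading cells below, using the sentence-transformers library; the model name and printed values match the output that follows:

from sentence_transformers import SentenceTransformer

model_name = "BAAI/bge-large-en-v1.5"
encoder = SentenceTransformer(model_name)

EMBEDDING_DIM = encoder.get_sentence_embedding_dimension()   # 1024
MAX_SEQ_LENGTH = encoder.get_max_seq_length()                # 512
print(f"model_name: {model_name}")
print(f"EMBEDDING_DIM: {EMBEDDING_DIM}")
print(f"MAX_SEQ_LENGTH: {MAX_SEQ_LENGTH}")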
[6]
[7]
model_name: BAAI/bge-large-en-v1.5
EMBEDDING_DIM: 1024
MAX_SEQ_LENGTH: 512

Create a Milvus collection

You can think of a collection in Milvus like a "table" in SQL databases. The collection will contain the following:

  • Schema (or no-schema Milvus client).
    💡 You'll need the vector EMBEDDING_DIM parameter from your embedding model. Typical values are:
    • 1024 for sbert embedding models
    • 1536 for ada-002 OpenAI embedding models
  • Vector index for efficient vector search
  • Vector distance metric for measuring nearest neighbor vectors
  • Consistency level. In Milvus, strong transactional consistency is possible; however, per the CAP theorem, some latency must be sacrificed for it. 💡 Searching movie reviews is not mission-critical, so eventual consistency is fine here.

Add a Vector Index

The vector index determines the vector search algorithm used to find the closest vectors in your data to the query a user submits.

Most vector indexes use different sets of parameters depending on whether the database is:

  • inserting vectors (creation mode) - vs -
  • searching vectors (search mode)

Scroll down the docs page to see a table listing different vector indexes available on Milvus. For example:

  • FLAT - deterministic exhaustive search
  • IVF_FLAT or IVF_SQ8 - Inverted-file, cluster-based index (stochastic approximate search)
  • HNSW - Graph index (stochastic approximate search)
  • AUTOINDEX - Automatically determined based on OSS vs Zilliz cloud, type of GPU, size of data.

Besides a search algorithm, we also need to specify a distance metric, that is, a definition of what is considered "close" in vector space. In the cell below, the HNSW search index is chosen. Its possible distance metrics are one of:

  • L2 - L2-norm
  • IP - Dot-product
  • COSINE - Angular distance

💡 Most use cases work better with normalized embeddings, in which case L2 adds nothing (every vector has length 1, so L2 distance ranks results the same as cosine) and IP and COSINE give identical rankings. Only choose L2 if you plan to keep your embeddings unnormalized.
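For reference, a hedged sketch of what an explicit HNSW index definition looks like (the cell below uses AUTOINDEX instead; the parameter values are illustrative assumptions):

index_params = {
    "index_type": "HNSW",
    "metric_type": "COSINE",
    # M = max graph connections per node; efConstruction = build-time beam width.
    "params": {"M": 16, "efConstruction": 64},
}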

[8]
Successfully dropped collection: `IMDB_metadata`
The original cell then failed with `NameError: name 'connection' is not defined`, because the client was constructed from an undefined variable. A corrected version of the cell body, where the URI is an assumption (a local Milvus Lite file, or http://localhost:19530 for the Docker standalone server):

# Use the no-schema Milvus client, which takes a flexible json key:value format.
mc = MilvusClient(uri="milvus_demo.db")

# Create a collection with flexible schema and AUTOINDEX.
mc.create_collection(COLLECTION_NAME,
                     EMBEDDING_DIM,
                     consistency_level="Eventually",
                     auto_id=True,
                     overwrite=True)

Simple Chunking

Before embedding, it is necessary to decide your chunk strategy, chunk size, and chunk overlap. This section uses:

  • Strategy = Simple fixed chunk lengths.
  • Chunk size = Use the embedding model's parameter MAX_SEQ_LENGTH
  • Overlap = Rule-of-thumb 10-15%
  • Function =
    • Langchain's RecursiveCharacterTextSplitter to split up long reviews recursively.
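A hedged sketch of the chunking cells, assuming the langchain import path below; note that RecursiveCharacterTextSplitter counts characters by default while MAX_SEQ_LENGTH is in tokens, so reusing one as the other is a simplification:

from langchain.text_splitter import RecursiveCharacterTextSplitter

chunk_size = 512                          # reuse the model's MAX_SEQ_LENGTH as the budget
chunk_overlap = int(0.10 * chunk_size)    # ~10% overlap, per the rule of thumb above

text_splitter = RecursiveCharacterTextSplitter(
    chunk_size=chunk_size,
    chunk_overlap=chunk_overlap,
)
chunks = text_splitter.split_text(df.text.iloc[0])
print(f"Chunks from the first review: {len(chunks)}")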
[ ]
[ ]

Insert data into Milvus

For each original text chunk, we'll write a row into the database containing the chunk's embedding vector and text, plus the movie's metadata fields (Name, Genres, Actors, Director, RatingValue, MovieYear).

The Milvus Client wrapper can only handle loading data from a list of dictionaries.

Otherwise, in general, Milvus supports loading data from:

  • pandas dataframes
  • list of dictionaries

Below, we use the HuggingFace embedding model loaded earlier, running locally as the encoder.
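A sketch of the encode-and-insert loop, reusing the client, encoder, and splitter defined in the sketches above; the field names mirror the dataframe columns but are otherwise assumptions:

rows = []
for _, movie in df.iterrows():
    for chunk in text_splitter.split_text(movie.text):
        # Normalized embeddings pair with the COSINE/IP metrics discussed above.
        vector = encoder.encode(chunk, normalize_embeddings=True)
        rows.append({
            "vector": vector.tolist(),
            "chunk": chunk,
            "Name": movie.Name,
            "Genres": movie.Genres,
            "RatingValue": float(movie.RatingValue),
            "MovieYear": int(movie.MovieYear),
        })

mc.insert(COLLECTION_NAME, data=rows)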

[ ]
[ ]
[ ]
[ ]
[ ]

Ask a question about your data

So far in this demo notebook:

  1. Your custom data has been mapped into a vector embedding space
  2. Those vector embeddings have been saved into a vector database

Next, you can ask a question about your custom data!

💡 In LLM vocabulary:

Query is the generic term for user input. A query can be a list of many individual questions, up to maybe 1000 at a time!

Question usually refers to a single user question. In our example below, the user question is "What is AUTOINDEX in Milvus Client?"

Semantic search = very fast search of the entire knowledge base to find the TOP_K text chunks with the embeddings closest to the user's query.

💡 For consistency, the same model should always be used to embed both the data and the query.
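A sketch of embedding the query, reusing the encoder from above (the sample question comes from the text above):

SAMPLE_QUESTION = "What is AUTOINDEX in Milvus Client?"

# Embed the question with the SAME encoder used for the document chunks.
query_embeddings = encoder.encode([SAMPLE_QUESTION], normalize_embeddings=True)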

[ ]

Execute a vector search

Search Milvus using PyMilvus API.

💡 By their nature, vector searches are "semantic" searches. For example, if you were to search for "leaky faucet":

Traditional keyword search - one or both of the words "leaky" and "faucet" would have to literally appear in a document's text for it to be returned.

Semantic search - results containing words like "drippy" and "taps" would be returned as well, because these words mean the same thing even though they are different words.
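A sketch of the search call, reusing the client and query embeddings from the sketches above; the output fields and the metadata filter expression are illustrative assumptions:

TOP_K = 3

results = mc.search(
    COLLECTION_NAME,
    data=query_embeddings.tolist(),
    limit=TOP_K,
    output_fields=["chunk", "Name", "MovieYear"],
    filter="MovieYear >= 1990",   # metadata filtering on a scalar field
)
for hit in results[0]:
    print(hit["distance"], hit["entity"]["Name"])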

[ ]
[ ]
[ ]
[ ]
[ ]