Semantic Search

Semantic Search retrieves the expected results from the vector database and generates human-readable, conversational responses with the help of an LLM (Large Language Model).

First, we need to create the LLM and vector DB objects.

LLM:

import os
os.environ['OPENAI_API_KEY'] = "<openai_api_key>"

from semantic_ai.llm import Openai

# Create the LLM object backed by OpenAI
llm_model = await Openai().llm_model()

Vector DB:

from semantic_ai.indexer import ElasticsearchIndexer

# `embeddings` is the embedding object created earlier (see the Embeddings section)
elastic_search = await ElasticsearchIndexer(
        url="http://localhost:9200",
        index_name="test_index",
        embedding=embeddings
).create()
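
The embedding argument expects the embeddings object created earlier (see the Embeddings section, e.g. Hugging Face or OpenAI). For illustration only, and assuming the indexer accepts a LangChain-compatible embeddings object (an assumption, not confirmed by this page), it could be created like this:

# Illustrative assumption: a LangChain-compatible embeddings object.
# See the project's Embeddings section for the supported way to create it.
from langchain_community.embeddings import HuggingFaceEmbeddings

embeddings = HuggingFaceEmbeddings(
    model_name="sentence-transformers/all-MiniLM-L6-v2"
)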

Search:

from semantic_ai.search.semantic_search import Search

search_obj = Search(
            model=llm_model,
            load_vector_db=elastic_search
)

# Retrieve relevant documents and generate a conversational answer
query = "What is an AI"
search = await search_obj.generate(query)

We can change the number of retrieved documents and the prompt using the top_k and prompt parameters, respectively. A custom prompt can be supplied as sketched below.
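
Here, prompt is a custom prompt provided by the user. As an illustrative sketch only (the expected format and placeholder names are assumptions, not confirmed by this page), it could be a template string such as:

# Hypothetical prompt template; placeholder names are an assumption
prompt = """Answer the question using only the given context.

Context: {context}

Question: {question}
"""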

search_obj = Search(
            model=llm_model,
            load_vector_db=elastic_search,
            top_k=5,
            prompt=prompt
)
query = "What is an AI"
search = await search_obj.generate(query)