Tuesday, 21 January 2025

Hour 12++ Local LLM for iMMSS AI

Let us run our own free local LLM, customized with the name Candy, in just 5 minutes!!!

Ollama is a free and open-source project that lets you run various open source LLMs locally on your system.

OLLAMA - Omni-Layer Learning Language Acquisition Model (a popular unofficial backronym; the project itself does not expand the name)


  • Please download Ollama and run llama3.2 (ollama run llama3.2). The first run will take some time while the model downloads. Then say /bye at the >>> prompt to quit.
  • Create a Modelfile with the following contents:


FROM llama3.2
SYSTEM """Your name is Candy! You are a very clever assistant who knows everything.
          You are very succinct and informative."""
PARAMETER temperature 0.1
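The PARAMETER lines baked into a Modelfile can also be overridden per request through the "options" field of Ollama's /api/generate endpoint. A minimal sketch of building such a request payload; the helper name is ours, only the field names come from the Ollama REST API:

```python
# Sketch: build a payload for Ollama's /api/generate endpoint.
# "options" overrides Modelfile parameters (e.g. temperature) per request.
# make_generate_payload is an illustrative helper, not part of the Ollama API.
def make_generate_payload(model, prompt, temperature=0.1):
    return {
        "model": model,          # e.g. "candy"
        "prompt": prompt,
        "stream": False,         # return one JSON object, not a token stream
        "options": {"temperature": temperature},
    }

payload = make_generate_payload("candy", "Who are you?")
```

POSTing this payload to http://localhost:11434/api/generate (with a running Ollama server) behaves like the Modelfile's PARAMETER temperature 0.1 line.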


  • Create the custom model: #ollama create candy -f Modelfile

Verify with:

#ollama list

(My typical list is shown below; please check that candy is there.)

ollama list
NAME                       ID              SIZE      MODIFIED      
candy:latest               2ea6c7bb34ec    2.0 GB    58 minutes ago    
nomic-embed-text:latest    0a109f422b47    274 MB    3 hours ago      
llama3.2:latest            a80c4f17acd5    2.0 GB    3 hours ago      
mistral:latest             f974a74358d6    4.1 GB    20 hours ago      
llama3.1:latest            46e0c10c039e    4.9 GB    31 hours ago      
phi3:latest                4f2222927938    2.2 GB    38 hours ago      
phi:latest                 e2fd6321a5fe    1.6 GB    2 days ago        
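Besides the CLI, the same list of installed models is available over HTTP from Ollama's /api/tags endpoint, which returns JSON of the shape {"models": [{"name": ...}, ...]}. A hedged sketch; the extract_names helper and the sample data are ours, only the endpoint and JSON shape come from the Ollama REST API:

```python
# Sketch: pull model names out of an /api/tags-style response.
# With a running server you would fetch it via:
#   requests.get("http://localhost:11434/api/tags").json()
# extract_names is an illustrative helper, not part of the Ollama API.
def extract_names(tags_json):
    return [m["name"] for m in tags_json.get("models", [])]

# Sample data in the shape /api/tags returns:
sample = {"models": [{"name": "candy:latest"}, {"name": "llama3.2:latest"}]}
print(extract_names(sample))  # → ['candy:latest', 'llama3.2:latest']
```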

 Local LLM using OLLAMA with Gradio

 main.py 

# llm-api-demo: Gradio front-end for a local Ollama model
# pip install requests gradio
import json

import gradio as gr
import requests

# Ollama's local REST endpoint for completions
url = "http://localhost:11434/api/generate"
headers = {"Content-Type": "application/json"}

# Running history of user prompts, joined into each request
history = []

# The custom model was created with: ollama create candy -f Modelfile
def generate_response(prompt):
    history.append(prompt)
    final_prompt = "\n".join(history)
    data = {
        "model": "candy",
        "prompt": final_prompt,
        "stream": False,  # return a single JSON object instead of a stream
    }

    response = requests.post(url, headers=headers, data=json.dumps(data))

    if response.status_code == 200:
        data = json.loads(response.text)
        return data["response"]
    else:
        return "Error generating response"

interface = gr.Interface(
    fn=generate_response,
    inputs=gr.Textbox(
        lines=3,
        placeholder="Enter your prompt",
        label="I am iMMSS AI Candy. How can I help you?",
    ),
    outputs="text",
    title="iMMSS AI Candy",
    description=(
        "I am a helpful AI assistant that uses an Ollama language model "
        "to generate responses based on your inputs. I also keep a "
        "history of all your previous prompts."
    ),
)

interface.launch()
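One caveat in main.py: history stores only the user's prompts, so the model never sees its own earlier replies when the joined prompt is sent. A sketch of keeping both sides of the conversation instead; the function names here are ours, not part of Ollama or Gradio:

```python
# Sketch: keep (user, assistant) turns and flatten them into one prompt
# string, so the model sees its own earlier answers too.
# add_turn and build_prompt are illustrative names.
def add_turn(history, user, assistant):
    history.append((user, assistant))
    return history

def build_prompt(history, new_prompt):
    lines = []
    for user, assistant in history:
        lines.append(f"User: {user}")
        lines.append(f"Assistant: {assistant}")
    lines.append(f"User: {new_prompt}")
    return "\n".join(lines)

history = []
add_turn(history, "Who are you?", "I am Candy.")
print(build_prompt(history, "What can you do?"))
```

In generate_response you would call build_prompt before the POST, then add_turn with the model's reply after a successful response.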


Response :

Run the script (f:/ollama/ollama-gradio.py). Gradio reports "Running on local URL: http://127.0.0.1:7860". Open this URL and you will get a window with an input box. Type a prompt such as "What does the Ministry of Corporate Affairs do in India?" and click the Submit button. You will get a response as shown below:


Eureka! So easy to build a customized local LLM with Ollama, with Gradio for the UI.


Here are some examples of prompts about the MCA in India that can be tried with our local iMMSS Candy LLM:

1. **Drafting a complaint letter to the MCA**: "Write a sample complaint letter to the MCA regarding a company's non-compliance with regulatory requirements."

2. **Creating a template for a company's compliance report**: "Design a template for a company's compliance report, as required by the MCA under the Companies Act, 2013."

3. **Researching on recent developments in corporate governance**: "Research and write about recent developments in corporate governance in India, highlighting the role of the MCA in promoting good governance practices."

4. **Drafting a response to an investor's query**: "Write a sample response to an investor's query regarding a company's compliance with regulatory requirements, as per the guidance provided by the MCA."


After adding the title and description to the Gradio interface, the UI will look as shown below:



Hopefully, you have just reached one endpoint of the Ollama universe!!!

-------------------------------------------------------------------------------------------------------


