Let Us Begin with the Schedule
Here is a 12-hour lesson schedule for the "Ollama LLM Basics" course, designed for undergraduate students. Each hour includes a topic, an objective, and practical code examples to keep the learning interactive.
Lesson Plan: Ollama LLM Basics in 12 Hours
Hour 1: Introduction & Installation
- Objective: Understand what Ollama is, its purpose, and how to install it.
- Topics:
- What is Ollama?
- Installing Ollama on Windows/Mac/Linux.
- Download the installer from https://ollama.com/ (the download may take some time).
- Run the Ollama setup .exe file. Installation also takes a while; be patient.
- Verifying installation.
- Code Example:
```shell
# Install the Ollama CLI (Linux/macOS)
curl -fsSL https://ollama.com/install.sh | sh

# Verify the installation
ollama --version
```
- My system's response:
```
ollama version is 0.5.7
```
Hour 2: Exploring File Structure
- Objective: Learn the organization of files and directories used by Ollama.
- Topics:
- Default installation paths.
- Config files and logs.
- Modelfile structure.
- Code Example:
```shell
# View installed models
ollama list
```
```
C:\Users\AURMC>ollama list
NAME               ID              SIZE    MODIFIED
llama3.1:latest    46e0c10c039e    4.9 GB  21 hours ago
mistral:latest     f974a74358d6    4.1 GB  39 hours ago
```
```
# Sample Modelfile
FROM llama3.1
SYSTEM """You are a helpful assistant."""
```
Hour 3: Understanding Chunks
- Objective: Learn how large text data is broken into manageable pieces (chunks).
- Topics:
- What are chunks?
- Chunking strategies for text processing.
- Code Example:
```python
# Python example for chunking text
text = "This is a long text that needs to be chunked for processing."
chunk_size = 10
chunks = [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]
print(chunks)
```
- Response:
```
['This is a ', 'long text ', 'that needs', ' to be chu', 'nked for p', 'rocessing.']
```
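The fixed-size split above can cut words in half, as the output shows. A common refinement (a plain-Python sketch, not an Ollama feature) is word-based chunking with overlap, so neighboring chunks share some context:

```python
def chunk_words(text, chunk_size=5, overlap=2):
    """Split text into chunks of `chunk_size` words; consecutive chunks share `overlap` words."""
    words = text.split()
    step = chunk_size - overlap
    return [" ".join(words[i:i + chunk_size]) for i in range(0, len(words), step)]

text = "This is a long text that needs to be chunked for processing."
for chunk in chunk_words(text):
    print(chunk)
```

Overlap helps later retrieval: a sentence that straddles a chunk boundary still appears whole in at least one chunk.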
Hour 4: Introduction to Embeddings
- Objective: Understand embeddings and their role in text representation.
- Topics:
- What are embeddings?
- Generating embeddings with Ollama.
- Code Example:
```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer('all-MiniLM-L6-v2')
sentences = ["Hello, world!", "Ollama LLM is amazing!"]
embeddings = model.encode(sentences)
print("-----------------EMBEDDINGS--------------------------")
print(embeddings)
```
- Result (truncated for readability; all-MiniLM-L6-v2 maps each sentence to a 384-dimensional vector):
```
-----------------EMBEDDINGS--------------------------
[[-3.81771401e-02  3.29110846e-02 -5.45941014e-03 ...  1.81631427e-02]
 [-1.34197408e-02 -2.53689438e-02 -2.26690806e-02 ...  1.99287571e-02]]
```
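Once you have embeddings, the natural next step is comparing them. A minimal sketch of cosine similarity using NumPy (the short vectors here are toy stand-ins for real 384-dimensional sentence embeddings):

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity: dot product of the vectors divided by the product of their norms."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy vectors standing in for real sentence embeddings
v1 = [0.1, 0.3, -0.2]
v2 = [0.1, 0.25, -0.15]

print(cosine_similarity(v1, v1))  # identical vectors -> 1.0
print(cosine_similarity(v1, v2))  # similar vectors -> close to 1.0
```

Semantically similar sentences produce vectors with similarity near 1; unrelated sentences score much lower.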
Hour 5: Working with Vector Databases
- Objective: Learn to store and query embeddings using vector databases.
- Topics:
- Introduction to vector databases (e.g., Pinecone, Weaviate).
- Storing embeddings and querying.
- Code Example:
- Install the client: `pip install pinecone`
- Get an API key from the Pinecone console.
```python
# Example with the legacy pinecone-client API
# (newer client versions use `from pinecone import Pinecone` instead of `pinecone.init`)
import pinecone

pinecone.init(api_key='your-api-key', environment='us-west1-gcp')
index = pinecone.Index("example-index")

# Store the two sentence embeddings from Hour 4
# (NumPy arrays may need .tolist() before upserting)
index.upsert([("id1", embeddings[0]), ("id2", embeddings[1])])

# Find the stored vector nearest to the first embedding
results = index.query([embeddings[0]], top_k=1)
print(results)
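If you do not have a Pinecone account handy, the same idea can be demonstrated with a tiny in-memory index. This is a sketch using NumPy, not a real vector database; production systems add persistence and approximate-nearest-neighbor search:

```python
import numpy as np

class TinyVectorIndex:
    """A minimal in-memory stand-in for a vector database: store vectors, query by cosine similarity."""

    def __init__(self):
        self.ids = []
        self.vectors = []

    def upsert(self, items):
        # items is a list of (id, vector) pairs, mirroring Pinecone's upsert shape
        for item_id, vector in items:
            self.ids.append(item_id)
            self.vectors.append(np.asarray(vector, dtype=float))

    def query(self, vector, top_k=1):
        # Rank all stored vectors by cosine similarity to the query vector
        q = np.asarray(vector, dtype=float)
        sims = [float(np.dot(q, v) / (np.linalg.norm(q) * np.linalg.norm(v)))
                for v in self.vectors]
        order = sorted(range(len(sims)), key=lambda i: sims[i], reverse=True)[:top_k]
        return [(self.ids[i], sims[i]) for i in order]

index = TinyVectorIndex()
index.upsert([("id1", [0.1, 0.9]), ("id2", [0.9, 0.1])])
print(index.query([0.2, 0.8], top_k=1))  # "id1" is the closest match
```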
Hour 6: Overview of Models
- Objective: Understand the types of models supported by Ollama.
- Topics:
- Pre-trained vs fine-tuned models.
- Supported architectures (Llama, Mistral, Gemma).
- Code Example:
```shell
# List available models
ollama list
```
```
NAME               ID              SIZE    MODIFIED
llama3.1:latest    46e0c10c039e    4.9 GB  21 hours ago
mistral:latest     f974a74358d6    4.1 GB  40 hours ago
```
```shell
# Run a specific model
ollama run llama3.1
```
>>> What is AI?
- Response:

AI, or Artificial Intelligence, refers to the development of computer systems that can perform tasks that typically require human intelligence, such as:

1. **Learning**: AI systems can learn from data and improve their performance over time.
2. **Problem-solving**: AI systems can solve complex problems, make decisions, and take actions based on those decisions.
3. **Reasoning**: AI systems can reason about the world, draw conclusions, and apply knowledge to specific situations.
4. **Perception**: AI systems can perceive their environment through sensors, such as cameras, microphones, or other devices.

AI involves a range of techniques, including:

1. **Machine learning** (ML): A subset of AI that enables computers to learn from data without being explicitly programmed.
2. **Deep learning** (DL): A type of ML that uses neural networks with multiple layers to analyze and interpret complex data.
3. **Natural language processing** (NLP): A field of study focused on enabling computers to understand, generate, and process human language.
4. **Computer vision**: A field of study focused on enabling computers to interpret and understand visual information from images and videos.

AI can be applied in many areas, including:

1. **Robotics**: AI-powered robots that can perform tasks autonomously or with minimal human intervention.
2. **Virtual assistants**: AI-powered virtual assistants, such as Siri, Alexa, or Google Assistant, that can perform tasks like answering questions, setting reminders, and controlling other devices.
3. **Image recognition**: AI-powered systems that can recognize objects, people, and scenes in images and videos.
4. **Predictive maintenance**: AI-powered systems that can predict when equipment is likely to fail or require maintenance.
5. **Healthcare**: AI-powered systems that can analyze medical data, diagnose diseases, and develop personalized treatment plans.

The potential benefits of AI include:

1. **Increased efficiency**: AI can automate tasks, freeing up human time for more strategic and creative work.
2. **Improved accuracy**: AI can perform tasks with greater accuracy than humans, reducing errors and improving outcomes.
3. **Enhanced decision-making**: AI can analyze large amounts of data and provide insights that inform business or organizational decisions.

However, AI also raises concerns about:

1. **Job displacement**: AI may automate certain jobs, potentially displacing human workers.
2. **Bias and fairness**: AI systems may perpetuate biases and unfairness if they are trained on biased data or designed with a particular worldview.
3. **Security**: AI systems can be vulnerable to cyber threats, compromising sensitive information.

Overall, AI has the potential to transform many aspects of our lives, but it also requires careful consideration of its benefits and limitations.
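Models served by Ollama can also be called programmatically. Below is a minimal sketch against Ollama's local REST API using only the standard library; it assumes Ollama is running on its default port 11434 and the model has already been pulled:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint

def build_request(model, prompt):
    """Build the JSON payload for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model, prompt):
    """Send a prompt to a locally running Ollama server and return the response text."""
    payload = json.dumps(build_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(OLLAMA_URL, data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Requires a running Ollama server; uncomment to try:
# print(generate("llama3.1", "What is AI?"))
```

With `"stream": False`, the server returns one JSON object whose `response` field holds the full answer; streaming mode instead returns one JSON object per token.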
Hour 7: Creating Custom Models
- Objective: Learn how to create a custom model with Ollama.
- Topics:
- Creating a modelfile.
- Custom prompts and templates.
- Code Example:
```shell
# Create a custom model from a Modelfile
ollama create mymodel -f ./mymodel.modelfile
```
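A slightly fuller Modelfile than the Hour 2 sample might look like this. `FROM`, `PARAMETER`, and `SYSTEM` are standard Modelfile instructions; the parameter values shown are illustrative, not recommendations:

```
# mymodel.modelfile
FROM llama3.1

# Sampling parameters (illustrative values)
PARAMETER temperature 0.7
PARAMETER num_ctx 4096

SYSTEM """You are a concise tutor for undergraduate students."""
```

Build and run it with `ollama create mymodel -f ./mymodel.modelfile` followed by `ollama run mymodel`.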
Hour 8: Introduction to Fine-Tuning
- Objective: Understand the basics of fine-tuning models.
- Topics:
- When and why to fine-tune.
- Steps in the fine-tuning process.
- Code Example (note: the Ollama CLI does not actually ship a `fine-tune` command; fine-tuning is done with external tooling and the resulting weights are then imported into Ollama via a Modelfile — treat this as a conceptual workflow):
```shell
# Conceptual fine-tuning step (not a real Ollama command)
ollama fine-tune --model llama3.1 --data fine_tune_data.json
```
Hour 9: Metrics & Evaluation
- Objective: Learn how to evaluate model performance.
- Topics:
- Common metrics (accuracy, loss, BLEU, etc.).
- Using metrics to improve models.
- Code Example:
```python
# Calculate a BLEU score with NLTK
from nltk.translate.bleu_score import sentence_bleu

reference = [['this', 'is', 'a', 'test']]
candidate = ['this', 'is', 'a', 'test']
score = sentence_bleu(reference, candidate)
print("BLEU Score:", score)  # identical sentences score 1.0
```
Hour 10: Advanced Fine-Tuning Techniques
- Objective: Dive deeper into advanced fine-tuning methods.
- Topics:
- Adjusting hyperparameters.
- Using new datasets.
- Code Example (as in Hour 8, `ollama fine-tune` is conceptual; in Ollama itself, quantization is applied when building a model, via the `--quantize` flag of `ollama create`):
```shell
# Conceptual fine-tuning with quantization (not a real Ollama command)
ollama fine-tune --model llama3.1 --data new_data.json --quantization q4_0
```
Hour 11: Practical Applications
- Objective: Apply Ollama to real-world scenarios.
- Topics:
- Using Ollama for chatbots.
- Generating summaries or translations.
- Code Example:
```shell
# ollama run takes the prompt as a positional argument
ollama run llama3.1 "Summarize this article: ..."
```
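For a chatbot, the main extra ingredient is conversation history: each new prompt should carry the previous turns. A minimal, model-agnostic sketch of that bookkeeping in plain Python (no Ollama call included; the role labels are an illustrative convention):

```python
def format_history(history, user_message):
    """Render prior (role, text) turns plus the new user message into one prompt string."""
    lines = []
    for role, text in history:
        lines.append(f"{role}: {text}")
    lines.append(f"User: {user_message}")
    lines.append("Assistant:")  # cue the model to answer as the assistant
    return "\n".join(lines)

history = [("User", "Hi!"), ("Assistant", "Hello! How can I help?")]
prompt = format_history(history, "What is Ollama?")
print(prompt)
```

The resulting string can be passed to `ollama run` or the REST API; after each reply, append the new (User, Assistant) pair to `history` so the next turn has context.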
Hour 12: Wrap-Up and Q&A
- Objective: Review all topics and address student questions.
- Activities:
- Quick recap of all lessons.
- Hands-on exercise: Create and run a custom model with embeddings and vector storage.
- Code Example:
```shell
# Final task: integrate everything
ollama create finalmodel -f ./finalmodel.modelfile
ollama run finalmodel "Explain quantum computing."
```
This schedule ensures a blend of theory and practice, making the course both engaging and effective for undergraduate students.