Ollama 101: Windows 11 Quickstart Checklist

Everything you need to install, run, and experiment with local LLMs in minutes

🚧 This post is under construction 🚧

Under Windows 11, use Ollama to install and run LLMs locally

Install Ollama

Visit ollama.com to download the installer, or install directly from PowerShell:
irm https://ollama.com/install.ps1 | iex

Install and Run QWEN

This is a “small” model, perfect for checking our setup.

ollama run qwen2.5:3b
Install and Run QWEN in Ollama

First prompt

Once you see the >>> prompt, you can try your first prompt.

# prompt:
Write a Rust function that reverses a string safely

# Explain what Rust ownership is in simple terms
# ...

# /bye or CTRL+D to exit

The answer is shown below:

First prompt within Ollama
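For reference, here is a minimal sketch of the kind of function the prompt asks for (not the model's actual output). Reversing by Unicode scalar values rather than raw bytes keeps the result valid UTF-8:

```rust
/// Reverse a string by Unicode scalar values (chars).
/// A raw byte reversal could split multi-byte characters
/// and produce invalid UTF-8; iterating over chars avoids that.
fn reverse_string(s: &str) -> String {
    s.chars().rev().collect()
}

fn main() {
    assert_eq!(reverse_string("hello"), "olleh");
    // Accented characters survive because we reverse chars, not bytes.
    assert_eq!(reverse_string("héllo"), "olléh");
    println!("{}", reverse_string("Rust"));
}
```

Note this reverses Unicode scalar values, not grapheme clusters; for emoji or combining characters a crate like `unicode-segmentation` would be needed.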

Getting Help and Information

  • Exit Ollama (/bye or CTRL+D)
  • In the terminal, type:
ollama --help
Getting Help from Ollama
  • Let’s try ollama list to see the models available locally:
ollama list
ollama show qwen2.5:3b
Getting information about QWEN

Check where the models are stored

Ollama stores its models under $env:USERPROFILE\.ollama\models

Get-ChildItem "$env:USERPROFILE\.ollama\models" -Recurse
Ollama: checking where the models are stored locally
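If you prefer a program over the shell, the same recursive listing can be sketched in Rust (assuming the default %USERPROFILE%\.ollama\models location shown above; adjust the path if your install differs):

```rust
use std::fs;
use std::path::Path;

// Recursively collect every file under `dir`,
// roughly what `Get-ChildItem -Recurse` prints.
fn walk(dir: &Path, found: &mut Vec<String>) -> std::io::Result<()> {
    for entry in fs::read_dir(dir)? {
        let path = entry?.path();
        if path.is_dir() {
            walk(&path, found)?;
        } else {
            found.push(path.display().to_string());
        }
    }
    Ok(())
}

fn main() -> std::io::Result<()> {
    // Default Ollama model location on Windows (assumption based on
    // the command above); falls back to HOME on other systems.
    let home = std::env::var("USERPROFILE")
        .or_else(|_| std::env::var("HOME"))
        .unwrap_or_default();
    let models = Path::new(&home).join(".ollama").join("models");
    let mut found = Vec::new();
    if models.exists() {
        walk(&models, &mut found)?;
    }
    for f in &found {
        println!("{f}");
    }
    Ok(())
}
```

The blob files you will see there are the model layers; their names are content hashes, which is why the same model pulled twice is not stored twice.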


Published on: Apr 10 2026 at 03:00 PM | Last updated: Apr 13 2026 at 11:53 AM

Copyright © 1964-2026 - 40tude