Guide to Deleting Ollama Models Using CLI and WebUI

How to Delete Ollama Models

You can delete Ollama models in several ways, most commonly through the command line interface (CLI) or the Open WebUI. For a complete uninstall, you can also stop the Ollama service and remove its files directly.

1. Deleting Models via Command Line Interface (CLI)

Use the CLI to efficiently manage installed models. First, check which models are present:

ollama list

This command lists all models currently installed. To delete a specific model, run:

ollama rm model-name

For example:

ollama rm deepseek-r1:32b

deleted 'deepseek-r1:32b'

After deletion, verify removal by listing the models again:

ollama list

2. Deleting Models via the Open WebUI

Access the Open WebUI through your browser. Navigate to the Models section where all installed models appear.

  • Select the model you want to delete and use its delete option; the WebUI forwards the deletion to the Ollama server for you.
  • Alternatively, send a DELETE request to the Ollama API yourself to remove the model.

Example API call (the Ollama server listens on port 11434 by default, and the model name goes in the request body):

DELETE http://localhost:11434/api/delete
{"model": "deepseek-r1:32b"}

The WebUI should confirm the deletion. Refresh the Models page to ensure the model no longer appears.

3. Managing Ollama Service and Files for Complete Removal

To fully remove models and Ollama components, stop the Ollama service first. This prevents conflicts during deletion:

  • Stop the Ollama service.
  • Disable it from starting on system reboot.

Next, delete the service file, Ollama binary, downloaded models, and any user data related to Ollama.
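
For a rough idea of what that looks like in practice, here is a minimal sketch for a Linux machine set up with the official install script, run after the service has been stopped and disabled. The paths are the usual defaults and may differ on your system; on macOS the desktop app keeps everything under ~/.ollama instead.

# Remove the systemd unit, the binary, and the models and user data
# (paths assume the official Linux install script; check yours before deleting)
sudo rm /etc/systemd/system/ollama.service
sudo rm $(which ollama)            # the ollama binary
sudo rm -r /usr/share/ollama       # models pulled by the ollama service user
sudo userdel ollama                # the service account, if the installer created one
sudo groupdel ollama
rm -rf ~/.ollama                   # per-user models, keys, and history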

On macOS, model files typically reside in ~/.ollama/models. On Linux, user-level installs use the same ~/.ollama/models path, while installs done with the official script store models under /usr/share/ollama/.ollama/models for the ollama service user.

Manual deletion of these files is risky: Ollama tracks models through manifests and shared blobs, so removing files by hand can leave the model store in an inconsistent state. Use official commands or the WebUI whenever possible, and treat manual removal as a last resort.

Key Takeaways

  • Delete models by using ollama rm <model-name> in the CLI.
  • Use the Open WebUI for point-and-click removal, or send a DELETE request to the Ollama API.
  • Stop and disable Ollama services before removing binaries and data files.
  • Avoid manual deletion of cache files to prevent corruption.

How to Delete Ollama Models: The Definitive Guide for Humans (and Robots Alike)

Wondering how to delete those pesky Ollama models you no longer need? Relax. It’s easier than you think. Whether you’re clearing up space, tidying your AI workspace, or just testing your command line skills, this guide walks you through every step — with clear, no-nonsense instructions and just a splash of humor.

Why Bother Deleting Ollama Models?

Models take up disk space — sometimes lots of it. If you keep installing new ones for experiments or demos, your computer might start to groan. Plus, a cluttered AI environment is a developer’s nightmare. Keeping only what you use streamlines workflows and saves resources. If that sounds like good housekeeping to you (because who likes a messy garage?), read on!


The Two Main Ways to Delete Ollama Models: CLI vs. WebUI

You can delete Ollama models either through the Command Line Interface (CLI) or via the WebUI. Both work great, but each has its perks. CLI is perfect for those who love typing commands with swagger, while the WebUI suits folks who prefer clicking and pointing.

1. Deleting Ollama Models Using the Command Line Interface (CLI)

CLI is the classic method, powerful and straightforward. Even if you’re not a terminal wizard, these commands are simple to follow.

a. Check Which Models Are Installed

First, list all your current Ollama models by typing:

ollama list

This command spits out the models you have. Think of it like checking your pantry before tossing expired snacks.

b. Remove the Unwanted Model (Seriously, Get Rid of That One!)

Say you want to delete a model named deepseek-r1:32b. Type this:

ollama rm deepseek-r1:32b

You’ll see a sweet little confirmation:

deleted 'deepseek-r1:32b'

That’s your sign the model hit the digital dustbin!

c. Double-Check the Deletion

Just to be sure your model took a hike, list the models again:

ollama list

You should no longer see that model on the list. Promise, we’re not ghosting you.

2. Deleting Ollama Models Through the Open WebUI

Maybe you prefer a GUI approach or want to manage models without remembering commands. The WebUI lets you do just that. Here’s how:

a. Access the Open WebUI

Simply open your web browser and head to your Open WebUI instance, the web front end that sits in front of Ollama. If you’re running everything locally… localhost is your friend.

b. Navigate to the Models Section

Find the “Models” tab or section. It lists all installed models, as neat as books on a shelf.

c. Select and DELETE (Yes, Literally Delete) Your Model

Click on the model you’re ready to part ways with and use its delete option. Now, here’s a pro tip: under the hood, the deletion is just a request to the Ollama server’s API. If you want to skip the clicking entirely, you can send that DELETE request yourself with your favorite HTTP client or a command line tool like curl, as sketched below.
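
A minimal curl sketch, assuming the Ollama server is on its default port 11434 (swap in your own host, port, and model name):

# Ask the Ollama server to delete a model; the model name goes in the JSON body
# (older Ollama releases expect the field "name" instead of "model")
curl -X DELETE http://localhost:11434/api/delete -d '{"model": "deepseek-r1:32b"}'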

Upon success, you’ll get a response confirming the model’s farewell.

d. Refresh to Confirm

Hit refresh on the Models section. If that model disappeared like magic, congratulations! You just mastered model management with a few clicks and moves.

3. When to Stop Ollama Service and Why It Matters

Thinking about uninstalling Ollama completely? Before you start nuking files, it’s smart to stop the Ollama service first. Why? Because active services might resist deletion or cause errors.

  • Stop the Ollama service so you don’t run into “file in use” problems.
  • Disable it from restarting on system boot for a clean slate.
  • Then proceed with deleting models, files, and binaries safely.

Consider this the “pause before the cleanup party.”
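
On a systemd-based Linux install, that pause looks roughly like this (a sketch; the service name can differ if you set things up by hand):

# Stop the service and keep it from coming back at the next boot
sudo systemctl stop ollama
sudo systemctl disable ollama

# Confirm nothing Ollama-related is still running before you delete anything
systemctl status ollama    # should report inactive (dead)
pgrep -fl ollama           # should print nothing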

4. Manual Deletion and Why It Is Generally Unsavory

You could technically navigate to Ollama’s model folders (on macOS, usually ~/.ollama/models) and delete files manually. But before you do:

WARNING: This is not recommended. Messing directly with cache or model files can corrupt your environment irreversibly.

Stick to the CLI or WebUI methods for safety.

5. Putting It All Together: Your Step-by-Step Cheat Sheet

  1. Stop Ollama service if you plan a full uninstall or want smooth deletions.
  2. Use the CLI with ollama rm <model-name> to delete individual models, or delete them via the WebUI or the Ollama API.
  3. Verify model removal by listing models via CLI or refreshing the WebUI.
  4. For total cleanup, remove Ollama binaries, service files, and user data.

Pro Tips for Model Management

  • Regularly check installed models with ollama list to avoid clutter.
  • Before deleting, back up any models you might use later (model files can be quite hefty).
  • Use the API or a small script for automation if you manage many models programmatically (see the sketch after this list).
  • Relax and remember: deleting models won’t delete your AI skills. You can always redownload or train new ones.
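
As an example, here is a minimal shell sketch for bulk cleanup. It assumes the ollama CLI is on your PATH, that ollama list prints a header row with the model name in the first column, and it uses a made-up "test-" prefix you would replace with whatever pattern fits your own models:

# Remove every local model whose name starts with "test-"
ollama list | awk 'NR > 1 && $1 ~ /^test-/ {print $1}' | while read -r model; do
  echo "Removing $model"
  ollama rm "$model"
done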

So, What’s the Takeaway?

Deleting Ollama models is a piece of cake: CLI or WebUI, you choose your adventure. Both methods are straightforward and safe, so don’t panic whether you’re at the terminal or staring at a browser tab. Follow the steps, verify your cleanup, and enjoy a leaner AI setup.

Got a favorite method? Tried deleting multiple models without breaking a sweat? Feel free to share your experience! Or ask a question. After all, tech is better when shared.


Frequently Asked Questions

How can I delete an Ollama model using the command line?

Run ollama list to see installed models. Use ollama rm <model-name> to delete a model. For example, ollama rm deepseek-r1:32b. Confirm deletion with ollama list again.

Is there a way to remove Ollama models via a web interface?

Yes. Open the Open WebUI in your browser, go to the Models section, select the model, and delete it there. Behind the scenes this issues a DELETE request to the Ollama API (DELETE http://localhost:11434/api/delete with the model name in the JSON body). Refresh the page to confirm removal.

Should I manually delete Ollama model files from the system?

It is not recommended to delete model files manually from the cache folder (e.g., ~/.ollama/models on macOS). This can cause corruption. Use the CLI or API methods instead.

What steps should I take to fully remove Ollama and its models?

Stop and disable the Ollama service first. Then remove the service file, the Ollama binary, and all downloaded models and user data. This ensures complete removal without leftovers.

Where are Ollama model files stored on my system?

On macOS, models are located in ~/.ollama/models. On Linux, user-level installs use the same path, while the official install script stores them under /usr/share/ollama/.ollama/models. Either way, avoid manual deletion to prevent issues.
