
Ollama Enum #21271

Open
h00die wants to merge 5 commits into rapid7:master from h00die:ollama_enum

Conversation

Contributor

@h00die h00die commented Apr 10, 2026

This PR adds an Ollama LLM scanner. It connects to Ollama instances and enumerates which LLMs are installed, along with details about them.

Verification

List the steps needed to make sure this thing works

  1. Start the ollama docker
  2. Start msfconsole
  3. Do: use auxiliary/scanner/http/ollama_info
  4. Do: set rhosts [IPs]
  5. Do: run
  6. You should get information about the models in the ollama instance
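Outside msfconsole, the data the module reports can also be inspected by hand. A minimal Ruby sketch of parsing an /api/tags-style response; the sample JSON below is a hypothetical response constructed for illustration, not captured from a live instance:

```ruby
require 'json'

# Hypothetical sample of an Ollama /api/tags-style response
# (field names follow the public Ollama API).
sample = '{"models":[{"name":"llama3.1:latest","size":4920753328,"details":{"parameter_size":"8.0B"}}]}'

summaries = JSON.parse(sample)['models'].map do |m|
  size_gb = (m['size'] / 1024.0**3).round(2) # bytes -> GB
  "#{m['name']} (#{m.dig('details', 'parameter_size')}, #{size_gb} GB)"
end
puts summaries
```

Against a real instance the same JSON shape would come back from `GET /api/tags` on the Ollama port.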

Contributor

@dwelch-r7 dwelch-r7 left a comment


If you rebase this on master the tests should pass now

)
running = []
local_models = list_running_models
local_models['models'].each do |model|
Contributor


local_models can be nil; we either need to add nil checks for all the results from the helper functions, or (I think I'd prefer this) raise early so we know something hasn't gone quite right.
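A minimal sketch of the raise-early approach suggested here; the helper name, stub behavior, and error message are illustrative, not the module's actual code:

```ruby
# Stub standing in for the module's helper, simulating a failed
# request or unparseable response.
def list_running_models
  nil
end

def running_model_names
  local_models = list_running_models
  # Raise early so a bad response surfaces immediately, instead of
  # nil-checking every downstream use of local_models.
  if local_models.nil? || local_models['models'].nil?
    raise 'Unable to retrieve model list from the Ollama API'
  end

  local_models['models'].map { |m| m['name'] }
end
```

Calling `running_model_names` against the stub raises at the point of failure rather than crashing later on `local_models['models']`.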

Contributor

Copilot AI left a comment


Pull request overview

Note

Copilot was unable to run its full agentic suite in this review.

Adds a new Metasploit auxiliary HTTP scanner module to detect Ollama instances and enumerate installed/running LLM models (including basic model details), along with usage documentation.

Changes:

  • Introduced auxiliary/scanner/http/ollama_info to query Ollama APIs (/api/ps, /api/tags, /api/show) and print a model table.
  • Added helper parsing/formatting for sizes, temperature, and system prompt.
  • Added module documentation with a Docker-based repro scenario and sample output.
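The size formatting mentioned above might look roughly like this; a sketch only, as the module's actual helper name and rounding behavior may differ:

```ruby
# Illustrative byte-size formatter, not the module's actual implementation.
def humanize_size(bytes)
  units = %w[B KB MB GB TB]
  return '0 B' if bytes.zero?

  # Pick the largest unit the value fills, capped at the last unit.
  exp = [(Math.log(bytes) / Math.log(1024)).floor, units.length - 1].min
  format('%.2f %s', bytes / 1024.0**exp, units[exp])
end

puts humanize_size(4_920_753_328)
```

With an input of 4,920,753,328 bytes this yields "4.58 GB", matching the size column in the sample output below.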

Reviewed changes

Copilot reviewed 2 out of 2 changed files in this pull request and generated 7 comments.

File Description
modules/auxiliary/scanner/http/ollama_info.rb Implements the Ollama scanner and model enumeration/output formatting.
documentation/modules/auxiliary/scanner/http/ollama_info.md Documents how to build/run a sample Ollama container and validate the module output.

Comment threads (all outdated): 6 on modules/auxiliary/scanner/http/ollama_info.rb, 1 on documentation/modules/auxiliary/scanner/http/ollama_info.md
@bwatters-r7
Contributor

msf > use auxiliary/scanner/http/ollama_info 
msf auxiliary(scanner/http/ollama_info) > set rhosts 10.5.134.171
rhosts => 10.5.134.171
msf auxiliary(scanner/http/ollama_info) > set verbose true
verbose => true
msf auxiliary(scanner/http/ollama_info) > run
[*] Checking 10.5.134.171
[*]   Found model: llama3.1:latest
[*] 10.5.134.171 Ollama Models
==========================

  Name      Release  Status     Size     Parameter Size  Temperature  System Prompt
  ----      -------  ------     ----     --------------  -----------  -------------
  llama3.1  latest   Installed  4.58 GB  8.0B            N/A          N/A

[*] Scanned 1 of 1 hosts (100% complete)
[*] Auxiliary module execution completed

@bwatters-r7
Contributor

@h00die the failing rspec tests are not your fault, and have been fixed. Unfortunately, to get the new spec tests, you need to rebase or push another commit. Locally everything passes when I rebase, so hopefully the tests will start passing when you address the copilot suggestions.

My results when rebasing and testing locally:

Finished in 6 minutes 26 seconds (files took 9.3 seconds to load)
18847 examples, 0 failures, 569 pending

@bwatters-r7 bwatters-r7 moved this from Ready to Waiting on Contributor in Metasploit Kanban Apr 28, 2026
@bwatters-r7 bwatters-r7 moved this from Waiting on Contributor to Ready in Metasploit Kanban May 5, 2026

Projects

Status: Ready


5 participants