GitHub MCP Example¶
This notebook demonstrates how to connect an LLMAgent to the official
GitHub MCP server using the
streamable HTTP transport introduced in Chapter 5.
Requirements¶
- A GitHub Personal Access Token (PAT) with read access to public repositories, stored in the environment variable GITHUB_PAT
Setup Instructions¶
To run this notebook, you'll need our llm-agents-from-scratch framework installed on the running Jupyter kernel. While in the project's root directory, you can launch the notebook with the following commands:
# we need the notebook-utils and openai extras for this notebook
uv sync --extra notebook-utils --extra openai
# to launch the notebook
uv run --with jupyter jupyter lab
Alternatively, if you just want to use the published version of llm-agents-from-scratch without local development, you can install it from PyPI by uncommenting the cell below.
# Uncomment the line below to install `llm-agents-from-scratch` from PyPI
# !pip install 'llm-agents-from-scratch[notebook-utils,openai]'
Running an Ollama service¶
To execute the code provided in this notebook, you’ll need to have Ollama installed on your local machine and have its LLM hosting service running. To download Ollama, follow the instructions found on this page: https://ollama.com/download. After downloading and installing Ollama, you can start a service by opening a terminal and running the command ollama serve.
If you're running on Runpod using the Runpod templates for this Capstone project, an Ollama service will already be running for you.
import logging
from llm_agents_from_scratch.logger import enable_console_logging
enable_console_logging(logging.INFO)
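For reference, `enable_console_logging` is roughly equivalent to attaching a `StreamHandler` to the framework's namespace logger. Here's a minimal sketch using only the standard library — the logger name `llm_agents_fs` and the message format are inferred from the log output later in this notebook, and the framework's actual implementation may differ:

```python
import logging
import sys


def enable_console_logging_sketch(level: int = logging.INFO) -> logging.Logger:
    """Attach a console handler to the framework's namespace logger."""
    logger = logging.getLogger("llm_agents_fs")
    logger.setLevel(level)
    handler = logging.StreamHandler(sys.stdout)
    # mirror the "INFO (llm_agents_fs.<name>) : <message>" style seen below
    handler.setFormatter(logging.Formatter("%(levelname)s (%(name)s) : %(message)s"))
    logger.addHandler(handler)
    return logger


logger = enable_console_logging_sketch(logging.INFO)
```

Because child loggers like `llm_agents_fs.LLMAgent` propagate to this parent by default, a single handler here captures logs from every component in the framework.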
Creating an MCPToolProvider for GitHub's MCP Server¶
Unlike the stdio-based MCP servers used earlier in this chapter, GitHub's MCP
server is a remote server accessed over HTTP. We connect to it using the
streamable_http_url parameter on MCPToolProvider, along with
streamable_http_headers to pass authentication credentials.
To authenticate, you'll need a GitHub Personal Access Token (PAT) with read access to public repositories. If you don't have one, follow the instructions here to create one. The next cell will prompt you to enter it securely; your input won't be echoed to the screen or stored in the notebook.
import getpass
import os
os.environ["GITHUB_PAT"] = getpass.getpass("GitHub PAT: ")
from llm_agents_from_scratch.tools import MCPToolProvider
github_mcp_provider = MCPToolProvider(
    name="github_mcp",
    streamable_http_url="https://api.githubcopilot.com/mcp/",
    streamable_http_headers={
        "Authorization": f"Bearer {os.environ['GITHUB_PAT']}",
    },
)
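Under the streamable HTTP transport, each MCP exchange is a JSON-RPC 2.0 message POSTed to the server's endpoint, with the authorization header attached to every request. The sketch below builds (but does not send) the kind of `initialize` request the provider issues during its handshake — the payload shape follows the MCP specification, though the exact fields `MCPToolProvider` sends are an internal detail:

```python
import json

GITHUB_MCP_URL = "https://api.githubcopilot.com/mcp/"


def build_initialize_request(pat: str) -> tuple[dict, str]:
    """Build the headers and JSON-RPC body for an MCP `initialize` call."""
    headers = {
        "Authorization": f"Bearer {pat}",
        "Content-Type": "application/json",
        # streamable HTTP servers may reply with plain JSON or an SSE stream
        "Accept": "application/json, text/event-stream",
    }
    body = {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "initialize",
        "params": {
            "protocolVersion": "2025-03-26",
            "capabilities": {},
            "clientInfo": {"name": "llm-agents-from-scratch", "version": "0.0"},
        },
    }
    return headers, json.dumps(body)


headers, payload = build_initialize_request("<your-github-pat>")
```

After a successful handshake, the same endpoint and headers are reused for `tools/list` and `tools/call` messages — which is exactly what the builder relies on in the next step.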
Create our LLM agent¶
We'll now use the LLMAgentBuilder class to create an LLM agent that is equipped
with the tools from the GitHub MCP server. Recall from Chapter 5 that the builder
takes care of tool discovery for each of its attached MCP tool providers.
from llm_agents_from_scratch import LLMAgentBuilder
from llm_agents_from_scratch.llms import OllamaLLM
llm = OllamaLLM(
    model="qwen3:14b",
)

agent = (
    await LLMAgentBuilder()
    .with_llm(llm)
    .with_mcp_provider(github_mcp_provider)
    .build()
)
# Print all tools
for tool in agent.tools:
    desc = (tool.description or "")[:50]
    print(f"{tool.name}:\n\t{desc}...")
mcp__github_mcp__add_comment_to_pending_review:
	Add review comment to the requester's latest pendi...
mcp__github_mcp__add_issue_comment:
	Add a comment to a specific issue in a GitHub repo...
mcp__github_mcp__add_reply_to_pull_request_comment:
	Add a reply to an existing pull request comment. T...
mcp__github_mcp__assign_copilot_to_issue:
	Assign Copilot to a specific issue in a GitHub rep...
mcp__github_mcp__create_branch:
	Create a new branch in a GitHub repository...
mcp__github_mcp__create_or_update_file:
	Create or update a single file in a GitHub reposit...
mcp__github_mcp__create_pull_request:
	Create a new pull request in a GitHub repository....
mcp__github_mcp__create_repository:
	Create a new GitHub repository in your account or ...
mcp__github_mcp__delete_file:
	Delete a file from a GitHub repository...
mcp__github_mcp__fork_repository:
	Fork a GitHub repository to your account or specif...
mcp__github_mcp__get_commit:
	Get details for a commit from a GitHub repository...
mcp__github_mcp__get_file_contents:
	Get the contents of a file or directory from a Git...
mcp__github_mcp__get_label:
	Get a specific label from a repository....
mcp__github_mcp__get_latest_release:
	Get the latest release in a GitHub repository...
mcp__github_mcp__get_me:
	Get details of the authenticated GitHub user. Use ...
mcp__github_mcp__get_release_by_tag:
	Get a specific release by its tag name in a GitHub...
mcp__github_mcp__get_tag:
	Get details about a specific git tag in a GitHub r...
mcp__github_mcp__get_team_members:
	Get member usernames of a specific team in an orga...
mcp__github_mcp__get_teams:
	Get details of the teams the user is a member of. ...
mcp__github_mcp__issue_read:
	Get information about a specific issue in a GitHub...
mcp__github_mcp__issue_write:
	Create a new or update an existing issue in a GitH...
mcp__github_mcp__list_branches:
	List branches in a GitHub repository...
mcp__github_mcp__list_commits:
	Get list of commits of a branch in a GitHub reposi...
mcp__github_mcp__list_issue_types:
	List supported issue types for repository owner (o...
mcp__github_mcp__list_issues:
	List issues in a GitHub repository. For pagination...
mcp__github_mcp__list_pull_requests:
	List pull requests in a GitHub repository. If the ...
mcp__github_mcp__list_releases:
	List releases in a GitHub repository...
mcp__github_mcp__list_tags:
	List git tags in a GitHub repository...
mcp__github_mcp__merge_pull_request:
	Merge a pull request in a GitHub repository....
mcp__github_mcp__pull_request_read:
	Get information on a specific pull request in GitH...
mcp__github_mcp__pull_request_review_write:
	Create and/or submit, delete review of a pull requ...
mcp__github_mcp__push_files:
	Push multiple files to a GitHub repository in a si...
mcp__github_mcp__request_copilot_review:
	Request a GitHub Copilot code review for a pull re...
mcp__github_mcp__search_code:
	Fast and precise code search across ALL GitHub rep...
mcp__github_mcp__search_issues:
	Search for issues in GitHub repositories using iss...
mcp__github_mcp__search_pull_requests:
	Search for pull requests in GitHub repositories us...
mcp__github_mcp__search_repositories:
	Find GitHub repositories by name, description, rea...
mcp__github_mcp__search_users:
	Find GitHub users by username, real name, or other...
mcp__github_mcp__sub_issue_write:
	Add a sub-issue to a parent issue in a GitHub repo...
mcp__github_mcp__update_pull_request:
	Update an existing pull request in a GitHub reposi...
mcp__github_mcp__update_pull_request_branch:
	Update the branch of a pull request with the lates...
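Notice the naming convention in the listing above: each tool is registered as `mcp__<provider-name>__<tool-name>`, which prevents name collisions when an agent is attached to multiple MCP providers. A small sketch of splitting a qualified name back into its parts (the `split_tool_name` helper is ours for illustration, not part of the framework):

```python
def split_tool_name(qualified: str) -> tuple[str, str]:
    """Split an 'mcp__<provider>__<tool>' name into (provider, tool)."""
    prefix, provider, tool = qualified.split("__", 2)
    if prefix != "mcp":
        raise ValueError(f"not an MCP tool name: {qualified!r}")
    return provider, tool


provider, tool = split_tool_name("mcp__github_mcp__get_latest_release")
# provider == "github_mcp", tool == "get_latest_release"
```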
A simple task for our agent¶
Now that we have our agent, let's give it a simple task to perform. We'll ask our LLM agent to retrieve the latest release for the book's GitHub repo: nerdai/llm-agents-from-scratch.
Note: Smaller language models like the qwen3:14b model used in this notebook often need more explicit guidance in the task instruction — for example, specifying which tool to use or reminding the model not to guess. They also tend to struggle with tasks that require multiple sequential tool calls or complex reasoning over tool outputs. For more demanding tasks, consider using a larger model.
# task
from llm_agents_from_scratch.data_structures import Task
instruction = (
    "Use the get_latest_release tool to get the latest release for the "
    "GitHub repo with owner 'nerdai' and repo 'llm-agents-from-scratch'."
)
task = Task(
    instruction=instruction,
)
handler = agent.run(task, max_steps=10)
INFO (llm_agents_fs.LLMAgent) : 🚀 Starting task: Use the get_latest_release tool to get the latest release for the GitHub repo with owner 'nerdai' and repo 'llm-agents-from-scratch'.
INFO (llm_agents_fs.TaskHandler) : ⚙️ Processing Step: Use the get_latest_release tool to get the latest release for the GitHub repo with owner 'nerdai' and repo 'llm-agents-from-scratc...[TRUNCATED]
INFO (llm_agents_fs.TaskHandler) : 🛠️ Executing Tool Call: mcp__github_mcp__get_latest_release
INFO (llm_agents_fs.TaskHandler) : ✅ Successful Tool Call: [{'type': 'text', 'text': '{"tag_name":"v0.0.13","target_commitish":"main","name":"v0.0.13","body":"## What\'s Changed\\n* bui...[TRUNCATED]
INFO (llm_agents_fs.TaskHandler) : ✅ Step Result: The latest release for the GitHub repository 'llm-agents-from-scratch' owned by 'nerdai' is version **v0.0.13**. This release includes ...[TRUNCATED]
INFO (llm_agents_fs.TaskHandler) : No new step required.
INFO (llm_agents_fs.LLMAgent) : 🏁 Task completed: The latest release for the GitHub repository 'llm-agents-from-scratch' owned by 'nerdai' is version **v0.0.13**. This release includ...[TRUNCATED]
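As the "Successful Tool Call" log line shows, the MCP tool returns its result as a list of content blocks whose `text` field holds a JSON string that the LLM then reasons over. Here's a sketch of pulling the tag name out of such a payload — the `tool_result` below is a trimmed, illustrative stand-in for the real (truncated) response:

```python
import json

# Illustrative content block, shaped like the truncated log output above
tool_result = [
    {
        "type": "text",
        "text": json.dumps(
            {"tag_name": "v0.0.13", "target_commitish": "main", "name": "v0.0.13"}
        ),
    }
]

# The inner payload is JSON-encoded text, so it needs a second parse
release = json.loads(tool_result[0]["text"])
print(release["tag_name"])  # v0.0.13
```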
result = handler.exception() or handler.result()
print(result)
The latest release for the GitHub repository 'llm-agents-from-scratch' owned by 'nerdai' is version **v0.0.13**. This release includes several updates and improvements, such as:

- Bumping dependencies in the all-python-packages group.
- Installing OpenAI extra in Docker for RunPod.
- Fixing the `OpenAILLM.continue_chat_with_tool_results()` method.
- Adding `MCPToolProvider` and `MCPTool` features.
- Implementing `LLMAgentBuilder`.
- Improving convergence in capstone examples.
- Other bug fixes and enhancements.

You can view the full changelog for this release [here](https://github.com/nerdai/llm-agents-from-scratch/compare/v0.0.12...v0.0.13).