How to Use Ollama Modelfile: Custom Models, System Prompts, and Parameters

A practical guide to Ollama Modelfiles. It covers:

- creating custom named models with persistent system prompts;
- setting temperature, context window, stop sequences, and other inference parameters;
- four ready-to-use Modelfile templates: code review, JSON output, document summarisation, and low-RAM setups;
- calling custom models through the Ollama REST API;
- seeding few-shot examples with MESSAGE;
- exporting and sharing Modelfiles with teammates;
- the most common gotchas around context-window memory, stop tokens, and reproducibility.
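As a concrete starting point, here is a minimal Modelfile sketch. The base model `llama3.1`, the system prompt text, and the parameter values are illustrative choices, not requirements:

```
FROM llama3.1
SYSTEM You are a senior code reviewer. Point out bugs, risky patterns, and missing tests, and suggest concrete fixes.
PARAMETER temperature 0.2
PARAMETER num_ctx 8192
```

Saved as `Modelfile`, it can be built into a named model with `ollama create code-reviewer -f Modelfile` and then run with `ollama run code-reviewer`; the SYSTEM prompt and PARAMETER settings persist across every session of that model.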
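A custom model is addressed over the REST API by its name, just like a stock model. The sketch below builds a request body for Ollama's `POST /api/generate` endpoint (default port 11434); the model name `code-reviewer` is a hypothetical custom model created from a Modelfile, and the `options` values shown are illustrative per-request overrides:

```python
import json

# Request body for POST http://localhost:11434/api/generate.
# "code-reviewer" is assumed to be a custom model built via `ollama create`.
payload = {
    "model": "code-reviewer",
    "prompt": "Review this function: def add(a, b): return a - b",
    "stream": False,          # return one JSON object instead of a token stream
    "options": {              # per-request overrides of Modelfile PARAMETERs
        "temperature": 0.2,
        "num_ctx": 8192,
    },
}

body = json.dumps(payload)
# To send it, e.g.:
# urllib.request.urlopen(
#     urllib.request.Request(
#         "http://localhost:11434/api/generate",
#         data=body.encode(),
#         headers={"Content-Type": "application/json"},
#     )
# )
print(body)
```

Setting `"stream": False` is convenient for scripts that want a single JSON response; omit it to receive the default streamed output.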