Stata-MCP Configuration

Configure your Stata installation and output settings

{% if saved %}
Configuration saved successfully!
{% endif %}

{% if errors %}
Configuration errors:
  {% for field, error in errors.items() %}
    {% if field != 'general' %}
  • {{ field.replace('_', ' ').title() }}: {{ error }}
    {% else %}
  • {{ error }}
    {% endif %}
  {% endfor %}
{% endif %}

Stata Configuration

Path to your Stata executable. Common locations are listed below, followed by a quick way to verify the path:

  • macOS: /Applications/Stata/StataMP.app/Contents/MacOS/stata-mp
  • Linux: /usr/local/stata17/stata-mp
  • Windows: C:\Program Files\Stata17\StataMP-64.exe
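
If you are not sure the path you entered is valid, a quick check along these lines confirms that the file exists and is executable (a minimal Python sketch; the path shown is just the macOS example from the list above):

    import os

    # Example path only; substitute the location of your own Stata executable.
    stata_path = "/Applications/Stata/StataMP.app/Contents/MacOS/stata-mp"

    # The file must exist and carry the executable bit for Stata-MCP to launch it.
    if os.path.isfile(stata_path) and os.access(stata_path, os.X_OK):
        print("Stata executable found:", stata_path)
    else:
        print("Not a valid Stata executable:", stata_path)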

Output Configuration

Directory where all Stata outputs, datasets, and logs will be stored.

The directory will be created if it doesn't exist.
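
In Python terms the creation step amounts to something like the sketch below; the directory name is an assumed example, not the app's actual default:

    from pathlib import Path

    # Assumed example location; use whatever directory you enter in the form above.
    output_dir = Path("~/stata-mcp-output").expanduser()

    # parents=True creates any missing intermediate folders;
    # exist_ok=True makes the call safe to repeat.
    output_dir.mkdir(parents=True, exist_ok=True)
    print("Outputs, datasets, and logs will be written to", output_dir)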

LLM Configuration

Choose between local Ollama models and the OpenAI cloud service.
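
How that choice typically plays out can be pictured with a small sketch; the key names (llm_provider, ollama_url, openai_base_url) are illustrative assumptions, not Stata-MCP's actual settings schema:

    # Illustrative only: the key names here are assumptions, not the real schema.
    settings = {"llm_provider": "ollama"}  # or "openai"

    if settings["llm_provider"] == "ollama":
        base_url = settings.get("ollama_url", "http://localhost:11434")
    else:
        base_url = settings.get("openai_base_url", "https://api.openai.com/v1")

    print("LLM requests will be sent to", base_url)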

Ollama Configuration

Available models: qwen2.5-coder:7b, llama2, codellama, etc.

URL where your Ollama server is running (usually http://localhost:11434).
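
To confirm the server is reachable and see which models it has pulled, you can query Ollama's standard /api/tags endpoint; the URL below assumes the default local install:

    import json
    import urllib.request

    # Default Ollama address; change this if your server runs elsewhere.
    ollama_url = "http://localhost:11434"

    # /api/tags returns the models available on the server.
    with urllib.request.urlopen(f"{ollama_url}/api/tags", timeout=5) as resp:
        models = [m["name"] for m in json.load(resp).get("models", [])]

    print("Ollama is up; available models:", models or "none pulled yet")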

OpenAI Configuration

Available models: gpt-3.5-turbo, gpt-4, gpt-4-turbo, etc.

OpenAI API endpoint (usually https://api.openai.com/v1).

Your OpenAI API key (starts with 'sk-'). Keep this secret!
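
A minimal connectivity check with the official openai Python package looks roughly like this; the endpoint and model are the usual defaults, and the key is read from an environment variable so it never appears in your code:

    import os
    from openai import OpenAI  # pip install openai

    client = OpenAI(
        api_key=os.environ["OPENAI_API_KEY"],  # your sk-... key, kept out of source control
        base_url="https://api.openai.com/v1",  # the default endpoint
    )

    reply = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": "Say hello from Stata-MCP."}],
    )
    print(reply.choices[0].message.content)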