An MCP server that executes SQL via ConnectorX and streams the result to a CSV or Parquet file in PyArrow RecordBatch chunks. Supports PostgreSQL, MariaDB, BigQuery, Redshift, MS SQL Server, and more.
- Output formats: `csv` or `parquet`
- Returns `"OK"` on success, or `"Error: <message>"` on failure
- Optional CSV token counting via tiktoken (`o200k_base`) with a warning threshold
- Streams results in RecordBatch chunks

ConnectorX supports many databases beyond the examples above. For the complete and up-to-date list of supported databases and connection-token (`conn`) formats, see the official ConnectorX documentation.
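As a hedged illustration, a PostgreSQL connection token typically takes a URI form like the one below; the user, password, host, port, and database name are placeholders, and exact formats vary by database, so consult the ConnectorX documentation for your driver:

```
postgresql://username:password@host:5432/dbname
```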
```shell
uvx run-sql-connectorx \
  --conn "<connection_token>" \
  --csv-token-threshold 500000
```
- `--conn <connection_token>` (required): ConnectorX connection token (`conn`)
- `--csv-token-threshold <int>` (default 0): when > 0, enable CSV per-line token counting using tiktoken (`o200k_base`); the value is a warning threshold

To launch the server from an MCP-aware client such as Cursor, add the following snippet to `.cursor/mcp.json` at the project root:
```json
{
  "mcpServers": {
    "run-sql-connectorx": {
      "command": "uvx",
      "args": [
        "--from", "git+https://github.com/gigamori/mcp-run-sql-connectorx",
        "run-sql-connectorx",
        "--conn", "<connection_token>"
      ]
    }
  }
}
```
The default `batch_size` is 100,000 rows.

When CSV token counting is enabled (`--csv-token-threshold` > 0):

- Tokens are counted over exactly what `csv.writer` writes (including the header row when present, delimiters, quotes, and newlines), encoded as UTF-8
- Counting uses tiktoken (`o200k_base`) per written CSV line

The tool returns a single text message.
- With `--csv-token-threshold` = 0: `OK`
- With `--csv-token-threshold` > 0: `OK N tokens` (or `OK N tokens. Too many tokens may impair processing. Handle appropriately` when N >= threshold); N may be 0, i.e. `OK 0 tokens`
- On failure: `Error: <message>` (any partial output file is deleted)

The server exposes a single MCP tool, `run_sql`.
| Argument | Type | Required | Description |
|---|---|---|---|
| `sql_file` | string | yes | Path to a file that contains the SQL text to execute |
| `output_path` | string | yes | Destination file for the query result |
| `output_format` | enum | yes | One of `"csv"` or `"parquet"` |
| `batch_size` | int | no | RecordBatch size (default 100000) |
```json
{
  "tool": "run_sql",
  "arguments": {
    "sql_file": "sql/queries/sales.sql",
    "output_path": "output/sales.parquet",
    "output_format": "parquet",
    "batch_size": 200000
  }
}
```
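Since the tool returns a single text message, a client can interpret it with a small parser. This is a sketch of the contract stated above, not part of the server; the function name `parse_result` is hypothetical:

```python
import re


def parse_result(message: str) -> dict:
    """Interpret run_sql's single text message per the documented contract."""
    if message.startswith("Error:"):
        # Failure: the server reports "Error: <message>" and deletes partial output.
        return {"ok": False, "error": message[len("Error:"):].strip()}
    # Success: "OK" or "OK N tokens", possibly followed by a warning sentence.
    m = re.match(r"OK(?:\s+(\d+)\s+tokens)?", message)
    tokens = int(m.group(1)) if m and m.group(1) else None
    return {"ok": True, "tokens": tokens, "warned": "Too many tokens" in message}


print(parse_result("OK 1234 tokens"))
# → {'ok': True, 'tokens': 1234, 'warned': False}
print(parse_result("Error: connection refused"))
# → {'ok': False, 'error': 'connection refused'}
```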
Distributed under the MIT License. See LICENSE for details.
Add this to `claude_desktop_config.json` and restart Claude Desktop. The invocation mirrors the Cursor snippet above:

```json
{
  "mcpServers": {
    "gigamori-mcp-run-sql-connectorx": {
      "command": "uvx",
      "args": [
        "--from", "git+https://github.com/gigamori/mcp-run-sql-connectorx",
        "run-sql-connectorx",
        "--conn", "<connection_token>"
      ]
    }
  }
}
```