Connect V-Lab to your assistant.
V-Lab’s research-grade risk data now responds to the AI assistant you already use. That includes volatility, SRISK, liquidity, climate benchmarks, and more. No pipelines, no scraping. Just ask your assistant in plain English.
“What was Citigroup’s SRISK at the end of January, and how does it rank globally?”
$108B
SRISK
#8
Global rank
+14%
MoM
V-Lab reading · 30 January 2026
Early access · Active development
V-Lab MCP is pre-1.0. New tools and fields are added regularly. Within a minor version, existing field names and response shapes will not change; a major-version bump signals possible breaking changes. Reconnect your client occasionally to pick up the latest tools and capabilities. Use submit_feedback to shape what ships next. Listings on Anthropic’s MCP Directory and the community registries (mcp.so, Smithery, Glama) are pending as of launch; for now, add V-Lab manually using the instructions below.
What you can ask
The measures you already cite, now callable.
Every tool corresponds to a published V-Lab measure: the same GARCH, SRISK, CRISK, and COVOL series cited across the finance literature. Each response names the underlying V-Lab analysis, so you can trace any number back to its source on vlab.stern.nyu.edu. Every measure is built on peer-reviewed research from the Volatility and Risk Institute at NYU Stern.
Volatility & long-run risk
GARCH-family volatility, long-run VaR, and stress scenarios across 39,000+ firms and 90 markets. Queryable by ticker, country, or sector.
Systemic & climate risk
SRISK, CRISK, country-level rankings, and climate-benchmark correlations. Ask “Which G-SIBs moved most this week?” and get a ranked answer with citations.
Liquidity & common factors
Illiquidity composites, COVOL events, and common volatility factors across markets delivered to your assistant as structured, citable data.
How it works
How to connect.
- 01
Create a V-Lab account
Sign up at vlab.stern.nyu.edu. It’s free.
- 02
Connect your assistant
Pick your client below (Claude, ChatGPT, Cursor, VS Code, and more) and copy the config. OAuth means no API key for most clients.
- 03
Ask away
Query V-Lab in plain English. Use submit_feedback from inside your client to send us bugs and feature requests.
Try it
Three questions to start with.
Three prompts that exercise the core of the server. Paste any of them into your assistant once you have V-Lab connected.
“What’s Citigroup’s SRISK as of last month, and how does it rank globally?”
Single-firm systemic-risk lookup with global context.
“Compare US and Japanese systemic risk over the last 12 months.”
Cross-country SRISK time series — demonstrates the currency-aware response and per-country breakdown.
“Show me which sectors have the most deteriorating liquidity in the US right now.”
Sector-level ILLIQ change ranking — surfaces the worst-trending corners of the market in one call.
Setup
Pick your client.
Server URL: https://vlab.stern.nyu.edu/mcp
Help wanted
These client instructions are early, and we’d rather hear from you than leave a trap for the next reader. If yours doesn’t work, or if your client supports a cleaner path than the one we describe, let us know and we’ll update the page.
AI Assistants
Claude Desktop
Claude Desktop offers two paths. The Connectors UI is the quickest and requires no file editing. The claude_desktop_config.json file only accepts stdio (local command) entries, so users on Free or older versions should use the mcp-remote stdio bridge below.
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json
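For the config-file path, a minimal entry looks like the sketch below. The server name “vlab” is just a label of our choosing, and mcp-remote is the community stdio-to-HTTP bridge, fetched on demand via npx (requires Node.js):

```json
{
  "mcpServers": {
    "vlab": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://vlab.stern.nyu.edu/mcp"]
    }
  }
}
```

Restart Claude Desktop after saving; the first tool call should open a browser window for the OAuth sign-in.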
Claude Code
Create or edit .mcp.json in your project directory:
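A minimal .mcp.json sketch, assuming a Claude Code version with native remote-HTTP support (the server name “vlab” is arbitrary):

```json
{
  "mcpServers": {
    "vlab": {
      "type": "http",
      "url": "https://vlab.stern.nyu.edu/mcp"
    }
  }
}
```

Recent versions can also write this entry for you via `claude mcp add --transport http vlab https://vlab.stern.nyu.edu/mcp`.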
ChatGPT
In ChatGPT settings, go to Connectors > Add connector > Model Context Protocol and enter the server URL:
Gemini CLI
Edit ~/.gemini/settings.json (or project-level .gemini/settings.json). Gemini CLI doesn’t natively handle remote-OAuth MCP yet, so wrap V-Lab with the mcp-remote stdio bridge:
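A sketch of the bridged entry, assuming the standard mcpServers shape in Gemini CLI’s settings.json (“vlab” is our label; mcp-remote is fetched via npx and requires Node.js):

```json
{
  "mcpServers": {
    "vlab": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://vlab.stern.nyu.edu/mcp"]
    }
  }
}
```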
Code Editors
Cursor
Create .cursor/mcp.json in your project root, or go to Settings > Tools & MCP > New MCP Server:
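A minimal .cursor/mcp.json sketch for a Cursor version that supports remote servers via a direct URL (the server name “vlab” is arbitrary):

```json
{
  "mcpServers": {
    "vlab": {
      "url": "https://vlab.stern.nyu.edu/mcp"
    }
  }
}
```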
VS Code + GitHub Copilot
Create .vscode/mcp.json in your workspace:
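A minimal .vscode/mcp.json sketch. Note that VS Code uses a top-level "servers" key rather than "mcpServers" (the name “vlab” is our choice):

```json
{
  "servers": {
    "vlab": {
      "type": "http",
      "url": "https://vlab.stern.nyu.edu/mcp"
    }
  }
}
```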
Windsurf
Edit ~/.codeium/windsurf/mcp_config.json. Windsurf doesn’t natively handle remote-OAuth MCP yet, so wrap V-Lab with the mcp-remote stdio bridge:
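A sketch of the bridged entry for mcp_config.json, assuming Windsurf’s standard mcpServers shape (“vlab” is our label; mcp-remote runs via npx and requires Node.js):

```json
{
  "mcpServers": {
    "vlab": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://vlab.stern.nyu.edu/mcp"]
    }
  }
}
```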
Zed
Open Zed > Settings > Open Settings. Zed expects an mcp-remote stdio bridge under the context_servers key:
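A sketch of the context_servers entry. The exact shape varies between Zed releases (newer builds may expect a flat command string plus a "source" field), so treat this as a starting point:

```json
{
  "context_servers": {
    "vlab": {
      "command": {
        "path": "npx",
        "args": ["-y", "mcp-remote", "https://vlab.stern.nyu.edu/mcp"]
      }
    }
  }
}
```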
Open Source & Self-Hosted
Continue.dev
Edit ~/.continue/config.yaml (or workspace-level .continue/config.yaml). Continue expects a stdio command, so bridge V-Lab through mcp-remote:
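A sketch of the YAML entry, assuming Continue’s list-style mcpServers block (the name “vlab” is arbitrary; mcp-remote runs via npx and requires Node.js):

```yaml
mcpServers:
  - name: vlab
    command: npx
    args:
      - "-y"
      - mcp-remote
      - https://vlab.stern.nyu.edu/mcp
```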
Cline
Open the Cline sidebar > MCP Servers > Configure > “Configure MCP Servers”:
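A sketch of the entry for Cline’s MCP settings file, using the mcp-remote stdio bridge, which works regardless of whether your Cline version supports remote servers natively (“vlab” is our label):

```json
{
  "mcpServers": {
    "vlab": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://vlab.stern.nyu.edu/mcp"]
    }
  }
}
```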
Ollama
Ollama is an LLM runtime, not an MCP client. Use it as the model backend in any of the clients above (Continue.dev, Cline, etc.). Configure the MCP server in your chosen client, and point the model configuration at your Ollama instance.
Troubleshooting
401 Unauthorized: Try disconnecting and reconnecting the server to trigger a fresh OAuth sign-in.
OAuth sign-in loop: Ensure your client supports OAuth 2.1 with PKCE. If your client doesn’t handle remote OAuth natively, use the mcp-remote stdio bridge shown in its setup section above.
429 Too Many Requests: Rate limit exceeded. Wait a moment and try again.
Connection refused: Ensure the URL uses https (not http).
Client doesn’t connect: Verify your client supports the Streamable HTTP transport and that you’re using the correct field name for the URL. Older clients that only speak the SSE transport may need to upgrade. V-Lab does not serve a legacy SSE endpoint; if a guide you’re reading references /mcp/sse, it’s out of date.
Still stuck? If your client won’t connect and the sections above haven’t helped, ask your assistant what’s wrong: paste the error, the client name, and the config you’re using. Modern LLMs know MCP well and are surprisingly good at diagnosing configuration issues. If you’re still stuck after that, drop us a note with the client, OS, and full error text.
Reporting security issues
Found a security vulnerability? Please report it privately to vlab@stern.nyu.edu. We investigate every report and will acknowledge yours within two business days.
Working with MCP
A few habits that pay for themselves.
Model Context Protocol is still young. The MCP ecosystem rewards a bit of discipline. Here’s how to get the most out of V-Lab without blowing up your context window.
01
Fewer servers, clearer context
Every active MCP loads its tool definitions into your context window. If you’re focused on V-Lab work, disable MCPs you aren’t using. Leave only what your assistant actually needs to reach for.
02
Prefer project-scoped configs
Claude Code’s .mcp.json, Cursor’s .cursor/mcp.json, and VS Code’s .vscode/mcp.json attach V-Lab to a single workspace rather than your whole machine. This keeps your global context clean and makes it obvious which projects have V-Lab available.
03
Be specific in your prompts
Tool calls burn tokens. “Rank these five G-SIBs by current SRISK” beats “tell me about systemic risk.” Narrower asks produce sharper answers and fewer unnecessary calls.
04
Reconnect to pick up new tools
V-Lab MCP ships new tools and refinements on a regular cadence. Restart or reconnect your client every week or two so your assistant can see the latest capabilities.
05
Use submit_feedback in-flow
If a tool returns something odd or you want a capability that doesn’t exist, ask your assistant to call submit_feedback right there. It’s faster than a web form and lands directly with the V-Lab research team.
Feedback, built in
See something wrong? Report it from inside your client.
V-Lab MCP exposes a submit_feedback tool. Use it from inside your client to report data issues, request features, or flag confusing behavior — no context switch, no web form, no support ticket. It goes straight to the V-Lab research team.
Disclaimer & citation
V-Lab data is provided on an “as is” basis for informational and academic research purposes only. It does not constitute investment advice or professional financial consultation, and users assume all risk associated with use of V-Lab data. When using V-Lab data in published work, please cite V-Lab accordingly. See full legal provisions and citation guidance.