
Overview

LangChain provides a framework for building AI agents that can access GDELT data through the MCP server. This guide shows you how to integrate GDELT MCP tools and prompts into LangChain agents using the official langchain-mcp-adapters package, which bridges MCP servers and LangChain's agent and tool abstractions.

Installation

Install the required packages:
pip install langchain-mcp-adapters langchain-openai langchain

Complete Example

Here’s how to build a LangChain agent with GDELT MCP tools and the core system prompt:
import os
import asyncio
from langchain_mcp_adapters.client import MultiServerMCPClient
from langchain_openai import ChatOpenAI
from langchain.agents import create_agent

async def main():
    # Get API key from environment
    api_key = os.getenv("GDELT_API_KEY")
    if not api_key:
        raise ValueError("GDELT_API_KEY not set")
    
    # Initialize MCP client with GDELT Cloud server
    mcp_client = MultiServerMCPClient({
        "gdelt-cloud": {
            "transport": "streamable_http",
            "url": "https://gdelt-cloud-mcp.fastmcp.app/mcp",
            "headers": {
                "Authorization": f"Bearer {api_key}"
            }
        }
    })
    
    # Fetch GDELT tools
    gdelt_tools = await mcp_client.get_tools(server_name="gdelt-cloud")
    print(f"Loaded {len(gdelt_tools)} GDELT tools")
    
    # Fetch GDELT core system prompt
    gdelt_core_prompt = ""
    try:
        messages = await mcp_client.get_prompt("gdelt-cloud", "gdelt_system_prompt")
        if messages:
            gdelt_core_prompt = "\n\n".join(msg.content for msg in messages)
            print("Fetched GDELT core prompt from MCP server")
    except Exception as e:
        print(f"Warning: Could not fetch GDELT core prompt: {e}")
    
    # Build system prompt with GDELT guidance
    system_prompt = f"""You are a GDELT data analyst assistant.

Help users query and analyze global events using GDELT data.

{gdelt_core_prompt}
"""
    
    # Create LLM
    llm = ChatOpenAI(model="gpt-4", temperature=0)
    
    # Create agent with GDELT tools and prompt
    agent = create_agent(
        model=llm,
        tools=gdelt_tools,
        system_prompt=system_prompt
    )
    
    # Run query
    result = await agent.ainvoke({
        "messages": [{
            "role": "user",
            "content": "What are the most significant protests in Europe this week?"
        }]
    })
    
    # Messages in the result are LangChain message objects, so read .content
    print(result["messages"][-1].content)

if __name__ == "__main__":
    asyncio.run(main())

Key Components

1. MCP Client Configuration

from langchain_mcp_adapters.client import MultiServerMCPClient

mcp_client = MultiServerMCPClient({
    "gdelt-cloud": {
        "transport": "streamable_http",
        "url": "https://gdelt-cloud-mcp.fastmcp.app/mcp",
        "headers": {
            "Authorization": f"Bearer {api_key}"
        }
    }
})
Store your API key in environment variables:
import os
api_key = os.getenv("GDELT_API_KEY")

2. Fetch Tools and Prompt

# Get GDELT tools
gdelt_tools = await mcp_client.get_tools(server_name="gdelt-cloud")

# Get GDELT core system prompt
messages = await mcp_client.get_prompt("gdelt-cloud", "gdelt_system_prompt")
gdelt_core_prompt = "\n\n".join(msg.content for msg in messages)
The GDELT MCP server provides a core system prompt that includes important guidance for querying GDELT data effectively. Always fetch and include this prompt in your agent.

3. Create Agent

from langchain_openai import ChatOpenAI
from langchain.agents import create_agent

llm = ChatOpenAI(model="gpt-4", temperature=0)

agent = create_agent(
    model=llm,
    tools=gdelt_tools,
    system_prompt=system_prompt
)

4. Run Agent

result = await agent.ainvoke({
    "messages": [{
        "role": "user",
        "content": "Find protests in France in the last week"
    }]
})

# Messages in the result are LangChain message objects, so read .content
print(result["messages"][-1].content)
Never hardcode API keys. Always use environment variables or secure secret management.

Available Tools

The GDELT MCP server provides the following tools:
prepare_gdelt_query: Validate SQL queries before execution
execute_query: Execute validated SQL queries against GDELT data
present_sql: Present SQL for user review (alert creation)
get_resource: Fetch CAMEO codes and schema documentation

Reference Documentation

For more information on using MCP with LangChain:

Brief Example

Here’s a simpler example using the official LangChain MCP adapter. Note that the top-level await statements assume an async context (such as a Jupyter notebook); in a plain script, wrap the calls in an async function and run it with asyncio.run:
pip install langchain-mcp-adapters
from langchain_mcp_adapters.client import MultiServerMCPClient
from langchain.agents import create_agent

client = MultiServerMCPClient(
    {
        "gdelt": {
            "transport": "streamable_http",
            "url": "https://gdelt-cloud-mcp.fastmcp.app/mcp",
            "headers": {
                "Authorization": "Bearer YOUR_API_KEY"
            }
        }
    }
)

tools = await client.get_tools()
agent = create_agent(
    "claude-sonnet-4-5-20250929",
    tools
)

response = await agent.ainvoke(
    {"messages": [{"role": "user", "content": "What protests happened in France this week?"}]}
)
print(response["messages"][-1].content)

Next Steps