Neo N3 MCP Server Integration Guide

This guide will help you integrate the Neo N3 MCP Server with your AI systems, enabling them to interact with the Neo N3 blockchain and its famous contracts through the Model Context Protocol (MCP).

By integrating the Neo N3 MCP Server, your AI systems will gain the ability to:

  • Query the Neo N3 blockchain for information
  • Manage Neo N3 wallets and accounts
  • Transfer assets between addresses
  • Interact with famous Neo N3 contracts like NeoFS, NeoBurger, Flamingo, and more

Understanding MCP

The Model Context Protocol (MCP) is a standardized protocol for AI systems to interact with external tools and resources. It provides a consistent interface for AI models to access capabilities outside of their training data.

Key concepts of MCP:

  • Tools: Functions that an AI system can call to perform actions
  • Resources: Data sources that an AI system can access
  • Server: A provider of tools and resources
  • Client: A consumer of tools and resources

The Neo N3 MCP Server implements the MCP protocol to provide tools and resources for interacting with the Neo N3 blockchain. AI systems that support MCP can use these tools and resources to perform blockchain operations without requiring direct knowledge of blockchain APIs.
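Under the hood, every tool invocation is a JSON-RPC 2.0 message. The sketch below shows the shape of a tools/call request as defined by the MCP specification, using one of the Neo N3 tools covered later in this guide; the exact envelope your client emits may vary with SDK and transport version, so treat this as illustrative:

```javascript
// Build the JSON-RPC 2.0 envelope an MCP client sends to invoke a tool.
// "tools/call" is the method name defined by the MCP specification; the
// tool name and arguments match the Neo N3 tools used later in this guide.
function buildToolCallRequest(id, name, args) {
  return {
    jsonrpc: '2.0',
    id,
    method: 'tools/call',
    params: { name, arguments: args }
  };
}

const request = buildToolCallRequest(1, 'get_blockchain_info', { network: 'mainnet' });
console.log(JSON.stringify(request, null, 2));
```

The MCP SDK builds and sends these envelopes for you; seeing the wire format helps when debugging traffic between client and server.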

MCP Architecture

[Architecture diagram: AI System ↔ MCP Client ↔ Neo N3 MCP Server ↔ Neo N3 Blockchain]

In this architecture:

  • The AI System is your application or model that needs blockchain functionality
  • The MCP Client is responsible for communication with the MCP Server
  • The Neo N3 MCP Server provides tools and resources for interacting with the Neo N3 blockchain
  • The Neo N3 Blockchain is the underlying blockchain platform

Setting up Neo N3 MCP Server

This section covers how to set up the Neo N3 MCP Server.

Installation

You can install the Neo N3 MCP Server using npm:

npm install @r3e/neo-n3-mcp

For global installation:

npm install -g @r3e/neo-n3-mcp

Requirements

  • Node.js v14 or later
  • npm v6 or later

Configuration

The Neo N3 MCP Server can be configured through environment variables or a configuration file.

Environment Variables

Variable             Description                                Default
NEO_MAINNET_RPC_URL  URL for Neo N3 mainnet                     https://mainnet1.neo.coz.io:443
NEO_TESTNET_RPC_URL  URL for Neo N3 testnet                     https://testnet1.neo.coz.io:443
NEO_MCP_PORT         Port number for the MCP server             8080
NEO_MCP_LOG_LEVEL    Logging level (debug, info, warn, error)   info

Configuration File

You can also provide a configuration file at ~/.neo-n3-mcp/config.json:

{
  "port": 3000,
  "network": "mainnet",
  "mainnetRpcUrl": "https://mainnet1.neo.coz.io:443",
  "testnetRpcUrl": "https://testnet1.neo.coz.io:443",
  "walletPath": "./wallets"
}

Running the Server

Start the server using the following command:

npx neo-n3-mcp

By default, the server will run on port 8080. You can customize this by setting the NEO_MCP_PORT environment variable:

NEO_MCP_PORT=3000 npx neo-n3-mcp

You should see output similar to:

Neo N3 MCP Server started on port 8080
Mainnet RPC: https://mainnet1.neo.coz.io:443
Testnet RPC: https://testnet1.neo.coz.io:443
Log level: info

Integrating with AI Systems

This section covers how to integrate the Neo N3 MCP Server with various AI systems.

LangChain Integration

LangChain is a framework for developing applications powered by language models. You can integrate the Neo N3 MCP Server with LangChain to create AI applications that can interact with the Neo N3 blockchain.

Installation

npm install langchain @modelcontextprotocol/sdk @r3e/neo-n3-mcp

Basic Integration

import { ChatOpenAI } from 'langchain/chat_models/openai';
import { Client } from '@modelcontextprotocol/sdk/client';
import { createMcpAgent } from '@langchain/mcp';

// Initialize the LLM
const llm = new ChatOpenAI({
  modelName: 'gpt-4',
  temperature: 0
});

// Initialize the MCP client
const mcpClient = new Client({
  url: 'http://localhost:8080'
});

// Create an agent with MCP tools
const agent = createMcpAgent({
  llm,
  mcpClient,
  tools: ['get_blockchain_info', 'get_balance', 'list_famous_contracts']
});

// Run the agent
const result = await agent.invoke({
  input: 'What is the current block height of the Neo N3 mainnet?'
});

console.log(result.output);

Advanced Integration

For more advanced integration, you can create custom tools that combine multiple MCP tools:

import { ChatOpenAI } from 'langchain/chat_models/openai';
import { Client } from '@modelcontextprotocol/sdk/client';
import { createMcpAgent } from '@langchain/mcp';
import { DynamicTool } from 'langchain/tools';

// Initialize the LLM
const llm = new ChatOpenAI({
  modelName: 'gpt-4',
  temperature: 0
});

// Initialize the MCP client
const mcpClient = new Client({
  url: 'http://localhost:8080'
});

// Create a custom tool that combines multiple MCP tools
const getAccountInfoTool = new DynamicTool({
  name: 'get_account_info',
  description: 'Get detailed information about a Neo N3 account, including balances for NEO, GAS, and bNEO',
  func: async (address) => {
    // Get NEO balance
    const neoBalance = await mcpClient.callTool('get_balance', {
      address,
      asset: 'NEO',
      network: 'mainnet'
    });
    
    // Get GAS balance
    const gasBalance = await mcpClient.callTool('get_balance', {
      address,
      asset: 'GAS',
      network: 'mainnet'
    });
    
    // Get bNEO balance
    const bNeoBalance = await mcpClient.callTool('neoburger_get_balance', {
      address,
      network: 'mainnet'
    });
    
    return JSON.stringify({
      address,
      balances: {
        neo: neoBalance.amount,
        gas: gasBalance.amount,
        bNeo: bNeoBalance.amount
      }
    });
  }
});

// Create an agent with MCP tools and custom tools
const agent = createMcpAgent({
  llm,
  mcpClient,
  tools: ['get_blockchain_info', 'list_famous_contracts'],
  additionalTools: [getAccountInfoTool]
});

// Run the agent
const result = await agent.invoke({
  input: 'What are the balances for the account NXV7ZhHaLY2GNjp6R1AYBV9FqrVnGLfQcz?'
});

console.log(result.output);

LlamaIndex Integration

LlamaIndex is a data framework for LLM applications. You can integrate the Neo N3 MCP Server with LlamaIndex to create AI applications that can retrieve and process blockchain data.

Installation

npm install llamaindex openai @modelcontextprotocol/sdk @r3e/neo-n3-mcp

Basic Integration

import { VectorStoreIndex, SimpleDirectoryReader } from 'llamaindex';
import { OpenAI } from 'llamaindex/llms/openai';
import { Client } from '@modelcontextprotocol/sdk/client';

// Initialize the LLM
const llm = new OpenAI({
  model: 'gpt-4',
  temperature: 0
});

// Initialize the MCP client
const mcpClient = new Client({
  url: 'http://localhost:8080'
});

// Create custom Neo N3 tools
const neoTools = {
  getBlockchainInfo: async () => {
    return await mcpClient.callTool('get_blockchain_info', {
      network: 'mainnet'
    });
  },
  
  getBalance: async (address, asset = 'NEO') => {
    return await mcpClient.callTool('get_balance', {
      address,
      asset,
      network: 'mainnet'
    });
  }
};

// Build a vector index over local documents
const index = await VectorStoreIndex.fromDocuments(
  await new SimpleDirectoryReader().loadData('./docs'),
  { llm }
);

// Create a query engine augmented with Neo N3 tools
const augmentedQueryEngine = index.asQueryEngine({
  tools: [
    {
      name: 'neo_get_blockchain_info',
      description: 'Get blockchain information from Neo N3',
      func: neoTools.getBlockchainInfo
    },
    {
      name: 'neo_get_balance',
      description: 'Get asset balance for a Neo N3 address',
      func: neoTools.getBalance
    }
  ]
});

// Query the engine
const response = await augmentedQueryEngine.query(
  'What is the current block height of Neo N3 and the NEO balance of address NXV7ZhHaLY2GNjp6R1AYBV9FqrVnGLfQcz?'
);

console.log(response.toString());

Custom Integration

You can also integrate the Neo N3 MCP Server directly into your custom AI systems using the MCP SDK.

Installation

npm install @modelcontextprotocol/sdk @r3e/neo-n3-mcp

Basic Integration

import { Client } from '@modelcontextprotocol/sdk/client';

// Initialize the MCP client
const mcpClient = new Client({
  url: 'http://localhost:8080'
});

// List available tools
const tools = await mcpClient.listTools();
console.log('Available tools:', tools);

// Call a tool
async function callNeoTool(toolName, params) {
  try {
    const result = await mcpClient.callTool(toolName, params);
    return result;
  } catch (error) {
    console.error(`Error calling tool ${toolName}:`, error);
    throw error;
  }
}

// Example: Get blockchain info
const blockchainInfo = await callNeoTool('get_blockchain_info', {
  network: 'mainnet'
});
console.log('Blockchain info:', blockchainInfo);

// Example: Get balance
const balance = await callNeoTool('get_balance', {
  address: 'NXV7ZhHaLY2GNjp6R1AYBV9FqrVnGLfQcz',
  asset: 'NEO',
  network: 'mainnet'
});
console.log('Balance:', balance);

Integrating into a Custom AI System

Here's an example of integrating the Neo N3 MCP Server into a custom AI system:

import { Client } from '@modelcontextprotocol/sdk/client';
import { OpenAI } from 'openai';

// Initialize the OpenAI API
const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY
});

// Initialize the MCP client
const mcpClient = new Client({
  url: 'http://localhost:8080'
});

// Function to handle blockchain queries
async function handleBlockchainQuery(query) {
  // Use the LLM to identify the intent
  const response = await openai.chat.completions.create({
    model: 'gpt-4',
    messages: [
      {
        role: 'system',
        content: 'You are a helpful assistant that answers questions about the Neo N3 blockchain.'
      },
      {
        role: 'user',
        content: `Identify the intent and parameters from this blockchain query: "${query}"`
      }
    ],
    response_format: { type: 'json_object' }
  });
  
  const intent = JSON.parse(response.choices[0].message.content);
  
  // Based on the intent, call the appropriate MCP tool
  switch (intent.action) {
    case 'get_blockchain_info':
      return await mcpClient.callTool('get_blockchain_info', {
        network: intent.network || 'mainnet'
      });
      
    case 'get_balance':
      return await mcpClient.callTool('get_balance', {
        address: intent.address,
        asset: intent.asset || 'NEO',
        network: intent.network || 'mainnet'
      });
      
    case 'get_contract_info':
      return await mcpClient.callTool('get_contract_info', {
        contractName: intent.contractName,
        network: intent.network || 'mainnet'
      });
      
    default:
      throw new Error(`Unknown intent: ${intent.action}`);
  }
}

// Example usage
const query = 'What is the current block height of Neo N3 mainnet?';
const result = await handleBlockchainQuery(query);
console.log(result);

Integration Examples

This section provides examples of how to integrate the Neo N3 MCP Server into various AI applications.

Blockchain Chatbot

Create a chatbot that can answer questions about the Neo N3 blockchain.

import { ChatOpenAI } from 'langchain/chat_models/openai';
import { Client } from '@modelcontextprotocol/sdk/client';
import { createMcpAgent } from '@langchain/mcp';

// Initialize the LLM
const llm = new ChatOpenAI({
  modelName: 'gpt-4',
  temperature: 0
});

// Initialize the MCP client
const mcpClient = new Client({
  url: 'http://localhost:8080'
});

// Create an agent with MCP tools
const agent = createMcpAgent({
  llm,
  mcpClient,
  tools: [
    'get_blockchain_info',
    'get_block',
    'get_transaction',
    'get_balance',
    'list_famous_contracts',
    'get_contract_info'
  ]
});

// Create a chatbot interface
async function handleChatMessage(message) {
  try {
    const result = await agent.invoke({
      input: message
    });
    
    return result.output;
  } catch (error) {
    console.error('Error handling chat message:', error);
    return 'Sorry, I encountered an error while processing your request.';
  }
}

// Example usage
const response = await handleChatMessage(
  'What is the current block height and when was the last block produced?'
);
console.log(response);

NFT Creation Assistant

Create an assistant that helps users create and list NFTs on GhostMarket.

import { ChatOpenAI } from 'langchain/chat_models/openai';
import { Client } from '@modelcontextprotocol/sdk/client';
import { createMcpAgent } from '@langchain/mcp';

// Initialize the LLM
const llm = new ChatOpenAI({
  modelName: 'gpt-4',
  temperature: 0
});

// Initialize the MCP client
const mcpClient = new Client({
  url: 'http://localhost:8080'
});

// Create an agent with MCP tools
const agent = createMcpAgent({
  llm,
  mcpClient,
  tools: [
    'ghostmarket_create_nft',
    'ghostmarket_list_nft',
    'ghostmarket_get_token_info'
  ]
});

// Create a function to handle NFT creation
async function createAndListNFT(walletPath, walletPassword, tokenURI, properties, price, paymentToken) {
  try {
    // Create the NFT
    const nftResult = await mcpClient.callTool('ghostmarket_create_nft', {
      walletPath,
      walletPassword,
      tokenURI,
      properties,
      network: 'mainnet'
    });
    
    const tokenId = nftResult.tokenId;
    
    // List the NFT for sale
    const listingResult = await mcpClient.callTool('ghostmarket_list_nft', {
      walletPath,
      walletPassword,
      tokenId,
      price,
      paymentToken,
      network: 'mainnet'
    });
    
    return {
      nft: nftResult,
      listing: listingResult
    };
  } catch (error) {
    console.error('Error creating and listing NFT:', error);
    throw error;
  }
}

// Example usage
const result = await createAndListNFT(
  '/path/to/wallet.json',
  'your-password',
  'https://example.com/nft/metadata.json',
  [
    { key: 'artist', value: 'ExampleArtist' },
    { key: 'edition', value: '1/1' }
  ],
  '100',
  '0xd2a4cff31913016155e38e474a2c06d08be276cf' // GAS token script hash
);
console.log(result);

DeFi Advisor

Create an advisor that helps users optimize their DeFi investments on Neo N3.

import { ChatOpenAI } from 'langchain/chat_models/openai';
import { Client } from '@modelcontextprotocol/sdk/client';
import { createMcpAgent } from '@langchain/mcp';

// Initialize the LLM
const llm = new ChatOpenAI({
  modelName: 'gpt-4',
  temperature: 0
});

// Initialize the MCP client
const mcpClient = new Client({
  url: 'http://localhost:8080'
});

// Create an agent with MCP tools
const agent = createMcpAgent({
  llm,
  mcpClient,
  tools: [
    'get_balance',
    'neoburger_get_balance',
    'flamingo_get_balance',
    'neocompound_get_balance'
  ]
});

// Create a function to get a user's portfolio
async function getUserPortfolio(address) {
  try {
    // Get NEO balance
    const neoBalance = await mcpClient.callTool('get_balance', {
      address,
      asset: 'NEO',
      network: 'mainnet'
    });
    
    // Get GAS balance
    const gasBalance = await mcpClient.callTool('get_balance', {
      address,
      asset: 'GAS',
      network: 'mainnet'
    });
    
    // Get bNEO balance
    const bNeoBalance = await mcpClient.callTool('neoburger_get_balance', {
      address,
      network: 'mainnet'
    });
    
    // Get FLM balance
    const flmBalance = await mcpClient.callTool('flamingo_get_balance', {
      address,
      network: 'mainnet'
    });
    
    return {
      address,
      balances: {
        neo: neoBalance.amount,
        gas: gasBalance.amount,
        bNeo: bNeoBalance.amount,
        flm: flmBalance.amount
      }
    };
  } catch (error) {
    console.error('Error getting user portfolio:', error);
    throw error;
  }
}

// Create a function to recommend DeFi strategies
async function recommendDeFiStrategies(address) {
  const portfolio = await getUserPortfolio(address);
  
  // Use the LLM to generate recommendations
  const response = await llm.invoke([
    {
      role: 'system',
      content: 'You are a DeFi advisor for Neo N3. Generate investment recommendations based on the user\'s portfolio.'
    },
    {
      role: 'user',
      content: `Here is the user's portfolio: ${JSON.stringify(portfolio, null, 2)}\n\nWhat DeFi strategies would you recommend?`
    }
  ]);
  
  return response.content;
}

// Example usage
const recommendations = await recommendDeFiStrategies('NXV7ZhHaLY2GNjp6R1AYBV9FqrVnGLfQcz');
console.log(recommendations);

Deployment Strategies

This section covers various strategies for deploying the Neo N3 MCP Server.

Local Deployment

The simplest deployment option is to run the Neo N3 MCP Server locally.

Running as a Background Process

You can use tools like PM2 to run the server as a background process:

# Install PM2
npm install -g pm2

# Start the server
pm2 start "npx neo-n3-mcp" --name neo-n3-mcp

# Check status
pm2 status

# View logs
pm2 logs neo-n3-mcp

# Stop the server
pm2 stop neo-n3-mcp

Cloud Deployment

You can deploy the Neo N3 MCP Server to cloud platforms like AWS, Azure, or GCP.

AWS Deployment

Here's an example of deploying to AWS EC2:

  1. Launch an EC2 instance with Node.js installed
  2. Clone your Neo N3 MCP Server repository
  3. Install dependencies: npm install
  4. Start the server: npm start

Heroku Deployment

Here's an example of deploying to Heroku:

  1. Create a Procfile with the content: web: npm start
  2. Create a Heroku app: heroku create
  3. Set environment variables: heroku config:set NEO_MAINNET_RPC_URL=https://mainnet1.neo.coz.io:443
  4. Deploy to Heroku: git push heroku main

Containerization

You can containerize the Neo N3 MCP Server using Docker.

Dockerfile

FROM node:16-alpine

WORKDIR /app

# Copy package.json and package-lock.json
COPY package*.json ./

# Install dependencies
RUN npm install

# Copy source code
COPY . .

# Set environment variables
ENV NEO_MAINNET_RPC_URL=https://mainnet1.neo.coz.io:443
ENV NEO_TESTNET_RPC_URL=https://testnet1.neo.coz.io:443
ENV NEO_MCP_PORT=8080

# Expose port
EXPOSE 8080

# Start the server
CMD ["npm", "start"]

Docker Compose

version: '3'

services:
  neo-n3-mcp:
    build: .
    ports:
      - "8080:8080"
    environment:
      - NEO_MAINNET_RPC_URL=https://mainnet1.neo.coz.io:443
      - NEO_TESTNET_RPC_URL=https://testnet1.neo.coz.io:443
      - NEO_MCP_PORT=8080
    volumes:
      - ./wallets:/app/wallets

Security Considerations

When integrating the Neo N3 MCP Server with AI systems, it's important to consider security implications.

Wallet Security

  • Never hardcode wallet passwords in your code
  • Use environment variables or a secure configuration manager for sensitive information
  • Implement proper access controls for wallet operations
  • Regularly back up wallet files
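For example, rather than hardcoding a password, you might read it from the environment and fail fast when it is missing. The WALLET_PASSWORD variable name below is an example choice, not something the server itself defines:

```javascript
// Read a wallet password from the environment instead of hardcoding it.
// Throws immediately if the variable is unset, so a misconfigured process
// never reaches the point of signing with an empty credential.
function getWalletPassword(env) {
  const password = env.WALLET_PASSWORD;
  if (!password) {
    throw new Error('WALLET_PASSWORD is not set; refusing to continue');
  }
  return password;
}

// Usage (assumes the variable was exported before launching the process):
// const walletPassword = getWalletPassword(process.env);
```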

Network Security

  • Use HTTPS when exposing the MCP server publicly
  • Implement rate limiting to prevent abuse
  • Consider implementing authentication for MCP server access
  • Restrict MCP server access to trusted networks when possible
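As a sketch of the rate-limiting suggestion, here is a minimal fixed-window limiter you could place in a reverse proxy in front of the MCP server. The window size and request limit are illustrative; production deployments typically rely on existing middleware or proxy features:

```javascript
// Minimal fixed-window rate limiter: each client id gets at most
// maxRequests calls per windowMs. The clock is injectable for testing.
function createRateLimiter(maxRequests, windowMs, now = Date.now) {
  const counts = new Map(); // client id -> { windowStart, count }
  return function allow(clientId) {
    const t = now();
    const entry = counts.get(clientId);
    if (!entry || t - entry.windowStart >= windowMs) {
      counts.set(clientId, { windowStart: t, count: 1 });
      return true;
    }
    entry.count += 1;
    return entry.count <= maxRequests;
  };
}

const allow = createRateLimiter(2, 60000);
console.log(allow('10.0.0.1'), allow('10.0.0.1'), allow('10.0.0.1')); // → true true false
```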

Operational Security

  • Implement logging and monitoring for suspicious activities
  • Regularly update the Neo N3 MCP Server and its dependencies
  • Test on testnet before deploying to mainnet
  • Implement safeguards for high-value transactions
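One way to implement a safeguard for high-value transactions is a pre-flight check that requires explicit confirmation above a threshold. The wrapper below is a sketch; the threshold, the confirm callback, and the transfer function are all application-level choices, not part of the MCP server:

```javascript
// Wrap a transfer call so that amounts at or above a threshold require an
// explicit confirmation step (e.g. a human approval or a second factor)
// before the underlying transfer tool is invoked.
async function guardedTransfer(transferFn, params, { threshold, confirm }) {
  if (Number(params.amount) >= threshold) {
    const approved = await confirm(params);
    if (!approved) {
      throw new Error(`Transfer of ${params.amount} ${params.asset} rejected by safeguard`);
    }
  }
  return transferFn(params);
}
```

In practice `transferFn` would call a transfer tool via the MCP client, and `confirm` would surface the request to a human operator.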

Troubleshooting

This section covers common issues and their solutions.

Connection Issues

MCP server is running, but client can't connect

Possible solutions:

  • Verify that the server URL is correct
  • Check if there's a firewall blocking the connection
  • Verify that the server is listening on the expected port

Server can't connect to Neo N3 RPC

Possible solutions:

  • Verify that the RPC URL is correct
  • Check if the RPC server is running
  • Check your network connection
  • Try a different RPC provider
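A common way to combine the last two suggestions is to fall back to alternate RPC providers when the primary one is unreachable. The sketch below takes the RPC-calling function as a parameter so it works with whatever HTTP client you use; the helper name is illustrative:

```javascript
// Try each RPC provider URL in order until one succeeds; throw only when
// every provider has failed. callRpc is any async function (url, request).
async function callWithFallback(callRpc, urls, request) {
  let lastError;
  for (const url of urls) {
    try {
      return await callRpc(url, request);
    } catch (error) {
      lastError = error; // remember the failure and try the next provider
    }
  }
  throw new Error(`All RPC providers failed: ${lastError}`);
}
```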

Tool Execution Issues

Tool returns an error

Possible solutions:

  • Check the error message for specific information
  • Verify that the tool parameters are correct
  • Check if the operation requires signing and if you've provided the correct wallet information

Transaction fails

Possible solutions:

  • Verify that you have sufficient balance
  • Check if the transaction requires GAS and if you have enough
  • Verify that the contract is available on the selected network

AI Integration Issues

AI system doesn't recognize MCP tools

Possible solutions:

  • Verify that you've registered the tools with the AI system
  • Check if the tool descriptions are clear and informative
  • Ensure that the AI system understands how to use the tools

AI system uses tools incorrectly

Possible solutions:

  • Improve the tool descriptions
  • Provide examples of correct tool usage
  • Implement parameter validation to catch incorrect usage
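As a sketch of the parameter-validation suggestion, the function below checks get_balance parameters before they reach the tool. The address check is a basic shape test (Neo N3 addresses are 34-character base58 strings starting with N), not a full checksum validation:

```javascript
// Validate get_balance parameters before forwarding an AI-generated tool
// call, returning a list of human-readable problems (empty when valid).
function validateGetBalanceParams(params) {
  const errors = [];
  // Shape check only: N + 33 base58 chars (no 0, O, I, or l).
  if (!/^N[1-9A-HJ-NP-Za-km-z]{33}$/.test(params.address || '')) {
    errors.push('address must be a 34-character Neo N3 address starting with N');
  }
  if (params.asset && !['NEO', 'GAS'].includes(params.asset)
      && !/^0x[0-9a-fA-F]{40}$/.test(params.asset)) {
    errors.push('asset must be NEO, GAS, or a 0x-prefixed contract script hash');
  }
  if (params.network && !['mainnet', 'testnet'].includes(params.network)) {
    errors.push('network must be mainnet or testnet');
  }
  return errors;
}
```

Returning the error list to the AI system (rather than throwing) gives the model a chance to correct its own tool call.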