
Scira MCP Chat is an open-source AI chat tool built on the Model Context Protocol (MCP). It supports multiple AI models through the Vercel AI SDK and lets users connect to different MCP servers to extend the AI's functionality. Developed by Zaid Mukaddam, the project uses Next.js and Tailwind CSS to provide a clean, modern interface and supports multiple transports, including HTTP, SSE, and stdio. Users can connect to an MCP server through the settings interface and get smooth streaming text responses and tool integration. The project is free and open source on GitHub, aimed at developers, AI enthusiasts, and users who need customized AI tools.

Function List

  • Support for multiple AI models: With the Vercel AI SDK, users can seamlessly switch between OpenAI, xAI Grok, and many other AI models.
  • MCP Server Integration: Connect to any MCP-compatible server to extend tool functionality such as search, code interpreters, etc.
  • Multi-Transport: Support HTTP, SSE and stdio transport protocols to adapt to different tool providers.
  • Tool extensions: Built-in tool integrations to enhance AI functionality such as code debugging, task management, and data analysis.
  • Modern interface: Based on shadcn/ui and Tailwind CSS, the interface is responsive, beautiful and intuitive.
  • Streaming text responses: AI replies are displayed in real time, improving the interactive experience.
  • Open source and free: the code is publicly available on GitHub, so users can modify and deploy it freely.
  • Settings Management: Easily add and activate MCP servers via the settings icon in the chat interface.

Using Help

Installation process

Scira MCP Chat is a Next.js-based web application that requires a basic development environment to deploy. The detailed installation steps are below:

  1. Environment preparation
    • Make sure you have Node.js (recommended version 16 or higher) and npm installed.
    • Install Git for cloning projects from GitHub.
    • Optional: Install Docker and Docker Compose for containerized deployments.
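Before cloning, you can sanity-check the toolchain with a short script like the following (the minimum Node.js version of 16 is taken from the step above; the helper names are illustrative):

```shell
# Extract the major version number from a Node.js version string such as "v18.17.0".
major_of() {
  printf '%s\n' "$1" | sed 's/^v\{0,1\}\([0-9]*\).*/\1/'
}

# Report whether the installed Node.js meets the recommended minimum (16).
check_node() {
  if ! command -v node >/dev/null 2>&1; then
    echo "Node.js not found; please install it first"
    return 0
  fi
  if [ "$(major_of "$(node --version)")" -ge 16 ]; then
    echo "Node.js version OK: $(node --version)"
  else
    echo "Node.js is older than v16; please upgrade"
  fi
}

check_node
```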
  2. Clone the project
    Open a terminal and run the following commands to clone the Scira MCP Chat repository:

    git clone https://github.com/zaidmukaddam/scira-mcp-chat.git
    cd scira-mcp-chat
    
  3. Install dependencies
    Run the following command in the project directory to install the required dependencies:

    npm install
    
  4. Configure environment variables
    Create a .env.local file and add the necessary environment variables. Example:

    NEXT_PUBLIC_AI_SDK_PROVIDER=openai
    AI_SDK_API_KEY=your_api_key
    
    • NEXT_PUBLIC_AI_SDK_PROVIDER: Specify the AI provider (e.g., OpenAI or other supported models).
    • AI_SDK_API_KEY: API key obtained from the AI provider.
    • If you are using an MCP server, you may need to additionally configure the server address and authentication information.
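If your MCP server must be configured up front rather than through the settings UI at runtime, the extra entries might look like the following. Note that these variable names are purely illustrative assumptions, not taken from the project; check the project's README for the names it actually reads:

```
# Hypothetical example only — variable names are illustrative, not from the project
MCP_SERVER_URL=https://example.com/mcp
MCP_SERVER_AUTH_TOKEN=your_server_token
```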
  5. Run the project
    After the installation is complete, run the following command to start the development server:

    npm run dev
    

    Open your browser and visit http://localhost:3000 to see the Scira MCP Chat interface.

  6. Docker deployment (optional)
    If using Docker, run the following command:

    docker-compose up --build
    

    Make sure the docker-compose.yml file is properly configured; the project will run on the specified port (3000 by default).
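For reference, a minimal docker-compose.yml for a Next.js app of this kind might look as follows. This is a sketch under the assumption that the image builds from a Dockerfile in the repository root; the compose file actually shipped with the project may differ:

```yaml
# Minimal sketch — the project's own docker-compose.yml may differ
services:
  scira-mcp-chat:
    build: .
    ports:
      - "3000:3000"   # host:container, Next.js default port
    env_file:
      - .env.local    # pass API keys and provider settings into the container
```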

Usage

  1. Accessing the Chat Interface
    After launching the project, open a browser and go to the Scira MCP Chat home page. The interface contains the chat window, a model selector, and a settings icon.
  2. Connecting to the MCP Server
    • Click on the Settings icon (⚙️) in the upper right corner of the chat screen.
    • In the Settings pop-up window, enter the name and address of the MCP server (e.g., Composio's or Zapier's server).
    • Select the transport type (HTTP, SSE or stdio) and click "Use" to activate the server.
    • When activated, the server's tools will be integrated into the chat, such as search, code interpreter or task management.
  3. Selecting an AI model
    • In the Model Selector, select a supported AI model (e.g., xAI's Grok 3 or OpenAI's model).
    • If you need to switch models, simply select a different one; the Vercel AI SDK handles the switch automatically.
  4. Using the Tool Functions
    • Enter a question or task and AI will call the relevant tool based on the connected MCP server. For example, if you enter "Search for the latest AI news," the system will call the search tool through the MCP server and return the results.
    • For code debugging, enter a code snippet and AI will provide optimization suggestions or error analysis.
    • Tool results are displayed as text or as UI components (if using an MCP-UI server).
  5. Manage settings
    • In the settings screen, you can add multiple MCP servers and switch between them at any time.
    • Configurations can be saved for future sessions.

Feature Highlights

  • Streaming Text Response: When you enter a question, the AI response is displayed in real time, so you don't have to wait for a full response.
  • MCP-UI Integration: If the connected server supports MCP-UI (e.g. idosal/scira-mcp-ui-chat), the result of a tool call is displayed as an interactive UI component. For example, the show_task_status tool displays a graphical view of the task status.
  • Multi-transport: Choose the appropriate transport for the tool provider. For example, SSE is well suited to real-time data streaming, while stdio suits local tool calls.
  • Open Source Customization: Developers can modify the code to add custom tools or interface components to fit specific needs.

Caveats

  • Ensure that the API key is valid, otherwise you cannot connect to the AI model.
  • The MCP server address must be correct; refer to the provider's official documentation (e.g., Composio or Zapier).
  • The project relies on the Vercel AI SDK and needs to maintain a network connection to invoke external AI services.
  • If deployed to a production environment, it is recommended to use HTTPS for security.
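As an illustration of the HTTPS recommendation, a common pattern is to place a TLS-terminating reverse proxy in front of the app. A minimal nginx server block might look like the following; the hostname and certificate paths are placeholders, not values from the project:

```nginx
# Illustrative sketch — replace the hostname and certificate paths with your own
server {
    listen 443 ssl;
    server_name chat.example.com;

    ssl_certificate     /etc/ssl/certs/chat.example.com.pem;
    ssl_certificate_key /etc/ssl/private/chat.example.com.key;

    location / {
        proxy_pass http://127.0.0.1:3000;   # the Next.js server started above
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Proto https;
    }
}
```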

Application Scenarios

  1. Developer Debugging Code
    Developers can use Scira MCP Chat to connect to code interpreter tools, enter code snippets, and the AI will analyze errors, optimize the code, or provide debugging suggestions. Ideal for quickly verifying code logic.
  2. AI Tool Extension
    Users can connect to Composio's or Zapier's MCP servers to invoke search, task management, or data analysis tools, which is useful for automating workflows or obtaining real-time information.
  3. Education and learning
    Students and researchers can use the AI models to answer academic questions, look up papers, and analyze data through tool integrations, making it well suited to academic research and learning scenarios.
  4. Customer Service Support in Production Environments
    Organizations can integrate Scira MCP Chat into their customer service system by customizing the MCP server to automatically answer frequently asked questions or invoke external tools to handle customer requests.

FAQ

  1. Is Scira MCP Chat free?
    Yes, Scira MCP Chat is an open source project and the code is freely available on GitHub. Users only need to pay for possible AI model API fees or MCP server fees.
  2. How do I add a new MCP server?
    Click on the Settings icon in the Chat screen, enter the server name and address, select the transport type (HTTP, SSE or stdio) and click "Use" to activate it. Composio, Zapier and other compatible servers are supported.
  3. What AI models are supported?
    With the Vercel AI SDK, a wide range of models are supported, such as xAI's Grok 3, OpenAI's models, and so on. Specific support depends on the configured API provider.
  4. How are the results of tool calls handled?
    Regular tools return text results, while MCP-UI servers return interactive UI components (e.g., task status graphs). Users can interact with the UI directly, for example by clicking to view details.
  5. Programming experience required?
    No programming experience is required to use the chat feature. Basic knowledge of Node.js and Git is required for deployment or customization.