Getting Started — React + ICP (Rust) Project

This project combines a React frontend with a Rust-based Internet Computer (ICP) backend using the DFX SDK. Below are the steps to set up, build, and run the project locally.


1️⃣ Prerequisites

Make sure you have the following installed:

  • Node.js (v16+ recommended), with npm or Yarn

  • Rust and Cargo

  • DFX SDK (Internet Computer SDK)

  • Git

  • Ollama (for AI integration, if applicable)


2️⃣ Installation & Setup

2.1 Clone the repository

```shell
git clone https://github.com/CoFi-Xyntra/copilot-finance.git
cd copilot-finance
```

2.2 Install DFX

If you don’t have DFX installed yet:
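The DFINITY install script is the usual route (check the current URL in the official docs before piping it to a shell):

```shell
# Downloads and runs the DFX installer
sh -ci "$(curl -fsSL https://internetcomputer.org/install.sh)"
```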

Verify the installation:
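```shell
dfx --version
```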


2.3 Install Rust & Cargo

If you don’t have Rust installed:
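Use the official rustup installer:

```shell
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
```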

Load Rust environment variables:
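```shell
source "$HOME/.cargo/env"
```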

Add the WebAssembly target:
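```shell
rustup target add wasm32-unknown-unknown
```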

Check Cargo version:
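```shell
cargo --version
```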


3️⃣ Project Structure Overview

After cloning, check the project structure:
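For example:

```shell
# List the top-level files and folders
ls -la
```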

Make sure the following files/folders exist:

  • dfx.json → DFX configuration file

  • package.json → Frontend dependencies

  • src/ → Rust backend code

  • frontend/ or similar → React frontend source code


4️⃣ Install Dependencies

4.1 Rust dependencies (optional, if required by backend)
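`dfx deploy` builds the Rust canister for you, but you can pre-fetch and type-check the crates yourself:

```shell
cargo check
```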

4.2 Frontend dependencies

Using npm:
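```shell
npm install
```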

Or using yarn:
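```shell
yarn install
```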


5️⃣ Running the Project

5.1 Start the local Internet Computer replica
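```shell
# --background keeps the replica running while freeing the terminal
dfx start --background
```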

5.2 Deploy backend canisters
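```shell
dfx deploy
```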

(Optional) To deploy to the playground for testing:
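```shell
dfx deploy --playground
```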

5.3 Start the frontend development server
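```shell
npm start
```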

Or:
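```shell
yarn start
```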

(depending on your package.json setup)


6️⃣ Additional Commands

Stop the local DFX replica:
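```shell
dfx stop
```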

Manually create canisters:
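```shell
dfx canister create --all
```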

Build the project:
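```shell
dfx build
```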


7️⃣ Installing Candid Extractor

The Candid Extractor tool can be useful for extracting Candid interface definitions from Rust canisters:
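`candid-extractor` is distributed on crates.io, so Cargo can install it:

```shell
cargo install candid-extractor
```

After a release build, point it at the canister's `.wasm` to emit the `.did` interface file.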


8️⃣ Run Everything Together

Terminal 1 — Build & Start replica:
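```shell
# --clean wipes previous local state; drop the flag to keep existing canisters
dfx start --clean
```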

Terminal 2 — Deploy canisters:
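```shell
dfx deploy
```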

Or for playground:
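```shell
dfx deploy --playground
```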

Terminal 3 — Start frontend:
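```shell
npm start
```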


9️⃣ Ollama + DeepSeek Integration

This project also integrates Ollama with DeepSeek for AI-powered features.

9.1 Install Ollama

Follow installation instructions from https://ollama.ai.

9.2 Pull the DeepSeek model
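The exact model tag depends on which DeepSeek variant the project expects; `deepseek-r1` is one commonly available tag on Ollama:

```shell
ollama pull deepseek-r1
```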

9.3 Running DeepSeek locally
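Assuming the `deepseek-r1` tag (substitute whichever variant your setup uses):

```shell
ollama run deepseek-r1
```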
