InkCop’s AI features require at least one LLM provider to be configured.

Supported Providers (16+)

Chinese Providers

| Provider | Model | Features |
| --- | --- | --- |
| Alibaba Bailian | qwen3-235b-a22b | Strong Chinese, multimodal |
| DeepSeek | deepseek-reasoner | Excellent reasoning, cost-effective |
| Moonshot (Kimi) | kimi-latest | Long context, Chinese-friendly |
| Zhipu GLM | glm-4.7 | Stable tool calling |

International Providers

| Provider | Model | Features |
| --- | --- | --- |
| OpenAI | gpt-4o | Strongest overall |
| Anthropic | claude-4-sonnet | Great for academic writing |
| Google Gemini | gemini-2.5-pro | Strong multimodal |
| Ollama | Local models | Fully offline |

Configuration

  1. Open Settings → AI → LLM Providers
  2. Select a provider, then enter the API Key and Model
  3. Click Test Connection
  4. Toggle Enable
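InkCop runs the connection test for you; if you want to verify a key by hand first, most of the cloud providers above expose OpenAI-compatible endpoints. A minimal sketch using only the standard library (the base URL and key below are illustrative assumptions, not InkCop internals):

```python
import urllib.request

def build_models_request(base_url: str, api_key: str) -> urllib.request.Request:
    """Build a GET <base_url>/models request -- the cheapest call for
    checking that an API key is accepted by an OpenAI-compatible endpoint."""
    return urllib.request.Request(
        f"{base_url.rstrip('/')}/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )

# Example (hypothetical base URL and key; substitute your provider's values):
#   req = build_models_request("https://api.deepseek.com/v1", "sk-...")
#   with urllib.request.urlopen(req, timeout=10) as resp:
#       print(resp.status)  # 200 means the key was accepted
```

A 401 or 403 response usually means the key is wrong; a timeout usually means the base URL is.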
Recommended combinations:

| Plan | Main Model | Embedding | Best For |
| --- | --- | --- | --- |
| Cost-effective | DeepSeek | Bailian text-embedding-v4 | Beginners |
| Best experience | Claude 4 / GPT-4o | OpenAI text-embedding-3-large | English papers |
| Fully offline | Ollama + Qwen3-32B | Ollama + nomic-embed-text | Privacy-focused |
⚠️ Local models require 16GB+ of GPU VRAM. Use cloud services if your hardware is limited.
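For the fully offline plan, the Ollama models can be pulled ahead of time. A sketch, assuming Ollama is installed (the `qwen3:32b` tag is an assumption; pick a smaller tag if your VRAM is tight):

```shell
# Pull the chat and embedding models used by the "Fully offline" plan
ollama pull qwen3:32b
ollama pull nomic-embed-text

# Ollama serves its API on localhost:11434 by default;
# skip this if Ollama already runs as a background service
ollama serve
```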