Nuxt Ollama

Simple integration of the official Ollama JavaScript Library for your Nuxt application.

You need to install Ollama to use this module. See the official website to download it for your system.

Quick Setup

Install the module to your Nuxt application with one command:

  • using npm
npx nuxi module add nuxt-ollama
  • using pnpm
pnpm dlx nuxi module add nuxt-ollama

That's it! You can now use Nuxt Ollama in your Nuxt app ✨
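
The command installs the package and registers it in your nuxt.config.ts; the result should look roughly like this:

// nuxt.config.ts
export default defineNuxtConfig({
  modules: ['nuxt-ollama'],
})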

Features

  • Vue 3 composable
  • Server utils

Usage

Use it in pages or on the server side:

// Get the Ollama client provided by the module
const ollama = useOllama()

// Ask the model a question and log the reply
const response = await ollama.chat({
  model: 'gpt-oss:120b',
  messages: [{ role: 'user', content: 'Why is the sky blue?' }],
})
console.log(response.message.content)
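
The official Ollama JavaScript Library also supports streaming responses: pass stream: true to chat() and iterate over the returned chunks. A minimal sketch:

const ollama = useOllama()

// Stream the reply chunk by chunk instead of waiting for the full message
const stream = await ollama.chat({
  model: 'gpt-oss:120b',
  messages: [{ role: 'user', content: 'Why is the sky blue?' }],
  stream: true,
})

let answer = ''
for await (const part of stream) {
  answer += part.message.content
}
console.log(answer)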

See the documentation for more information and examples.
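
For the server utils listed under Features, here is a minimal sketch of a Nitro API route, assuming useOllama() is also exposed as a server util (check the documentation for the exact util name):

// server/api/chat.post.ts
export default defineEventHandler(async (event) => {
  // Read the prompt sent by the client
  const { prompt } = await readBody(event)
  const ollama = useOllama()

  const response = await ollama.chat({
    model: 'gpt-oss:120b',
    messages: [{ role: 'user', content: prompt }],
  })
  return { answer: response.message.content }
})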

Settings

Local models

// nuxt.config.ts
export default defineNuxtConfig({
  //...
  ollama: {
    protocol: 'http', // or 'https'
    host: 'localhost', // domain or IP address
    port: 11434, // default Ollama port
    proxy: false, // use a proxy
  }
})
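
With a local instance configured, you can check the connection by listing the models you have already pulled; a minimal sketch using the library's list() call:

const ollama = useOllama()

// Returns the models available on the configured Ollama instance
const { models } = await ollama.list()
console.log(models.map(m => m.name))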

Cloud models with an API key from https://ollama.com

// nuxt.config.ts
export default defineNuxtConfig({
  //...
  ollama: {
    protocol: 'https', // the cloud API uses HTTPS
    host: 'ollama.com', // domain or IP address
    api_key: 'your_api_key_here', // your Ollama API key
  }
})
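
To keep the key out of your repository, you can read it from an environment variable in nuxt.config.ts; a sketch, assuming a variable named OLLAMA_API_KEY defined in your .env file or environment:

// nuxt.config.ts
export default defineNuxtConfig({
  //...
  ollama: {
    protocol: 'https',
    host: 'ollama.com',
    // OLLAMA_API_KEY is an assumed variable name; set it in .env or your environment
    api_key: process.env.OLLAMA_API_KEY,
  }
})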

Contribution

Contributions are welcome! Feel free to open an issue or submit a pull request.

Please see the contributing guide for details.

Local development

# Install dependencies
pnpm install

# Generate type stubs
pnpm run dev:prepare

# Develop with the playground
pnpm run dev