Nuxt Ollama

Simple integration of the official Ollama JavaScript Library for your Nuxt application.
You need to install Ollama to use this module. See the official website to download it for your system.
Install the module to your Nuxt application with one command:
npx nuxi module add nuxt-ollama
pnpm dlx nuxi module add nuxt-ollama
That's it! You can now use Nuxt Ollama in your Nuxt app ✨
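The `nuxi module add` command installs the package and registers it for you. If you prefer to wire it up manually, the result should look like this (a sketch of the standard Nuxt module registration, not module-specific behavior):

```typescript
// nuxt.config.ts — manual registration, equivalent to `nuxi module add`
export default defineNuxtConfig({
  modules: ['nuxt-ollama'],
})
```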
Usage in pages or on the server side:
const ollama = useOllama()
const response = await ollama.chat({
  model: 'gpt-oss:120b',
  messages: [{ role: 'user', content: 'Why is the sky blue?' }],
})
console.log(response.message.content)
See the documentation for more information and examples.
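Note that `chat()` is stateless: for a multi-turn conversation you append the model's reply to the `messages` array yourself and resend the full history on the next call. This follows the plain ollama-js chat API; the `nextTurn` helper below is hypothetical, not part of the module:

```typescript
// Message shape accepted by ollama.chat() in ollama-js
interface Message {
  role: 'system' | 'user' | 'assistant'
  content: string
}

// Hypothetical helper: extend the history with the model's last reply
// and the next user turn, ready to pass as `messages` to ollama.chat().
function nextTurn(history: Message[], reply: string, userInput: string): Message[] {
  return [
    ...history,
    { role: 'assistant', content: reply },
    { role: 'user', content: userInput },
  ]
}

const history: Message[] = [{ role: 'user', content: 'Why is the sky blue?' }]
const updated = nextTurn(history, 'Because of Rayleigh scattering.', 'And why are sunsets red?')
// `updated` now holds all three messages for the follow-up chat() call
```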
// nuxt.config.ts
export default defineNuxtConfig({
  // ...
  ollama: {
    protocol: 'http', // or 'https'
    host: 'localhost', // domain or IP address
    port: 11434, // port
    proxy: false, // use proxy
  },
})
// nuxt.config.ts
export default defineNuxtConfig({
  // ...
  ollama: {
    protocol: 'https', // or 'http'
    host: 'ollama.com', // domain or IP address
    api_key: 'your_api_key_here', // your Ollama API key
  },
})
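Rather than committing the key to source control, you can read it from an environment variable. This is a common Nuxt configuration pattern, not a module-specific feature; the `OLLAMA_API_KEY` variable name is an assumption:

```typescript
// nuxt.config.ts — sketch: keep the key out of the repository
export default defineNuxtConfig({
  ollama: {
    protocol: 'https',
    host: 'ollama.com',
    api_key: process.env.OLLAMA_API_KEY, // e.g. set in .env (gitignored)
  },
})
```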
Contributions are welcome, feel free to open an issue or submit a pull request!
Please see the contributing guide for details.
# Install dependencies
pnpm install
# Generate type stubs
pnpm run dev:prepare
# Develop with the playground
pnpm run dev