Adding Ollama + models + ZED
@@ -891,6 +891,7 @@ This section describes a way of installing packages, either through nixpkgs or
This module enables and configures the Ollama system service on NixOS, including optional GPU acceleration (CUDA or ROCm).

It ensures the Ollama CLI is available system-wide for interacting with local models.

It automatically pulls and prepares selected coding models (e.g., Qwen2.5-Coder and StarCoder2) at system activation.

#+begin_src nix :tangle configuration/apps/ai.nix :noweb tangle :mkdirp yes
{ config, lib, pkgs, ... }:
{
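  # (The diff truncates the block here; the remainder below is a hedged
  #  sketch of how the described module could be filled in, not the commit's
  #  actual body. The option names follow nixpkgs' services.ollama module
  #  and may differ between channels.)
  services.ollama = {
    enable = true;
    acceleration = "cuda";  # or "rocm"; omit for CPU-only inference
    # Models pulled and prepared at service start; tags are illustrative.
    loadModels = [ "qwen2.5-coder:7b" "starcoder2:7b" ];
  };

  # Make the ollama CLI available system-wide.
  environment.systemPackages = [ pkgs.ollama ];
}
#+end_src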
@@ -2029,7 +2030,7 @@ You'll notice the color values in multiple places outside this as well.
This Home-Manager module installs and configures the Zed editor in a user environment.

It integrates Ollama as a local LLM provider within Zed’s AI settings for code assistance.

It also generates a Continue configuration file pointing to the local Ollama instance for compatible editors.
#+begin_src nix :tangle home/apps/theme.nix :noweb tangle :mkdirp yes
#+begin_src nix :tangle home/apps/ai.nix :noweb tangle :mkdirp yes
{ config, lib, pkgs, ... }:
let
  # Continue nowadays prefers config.yaml; config.json still exists
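  # (Sketch continuation; the diff truncates the block here. The attribute
  #  names below are assumptions: Home-Manager's programs.zed-editor module
  #  and Continue's YAML schema may differ from what the commit actually uses.)
  ollamaUrl = "http://localhost:11434";  # Ollama's default local endpoint
in
{
  # Zed with Ollama wired in as a local language-model provider.
  programs.zed-editor = {
    enable = true;
    userSettings = {
      language_models.ollama.api_url = ollamaUrl;
    };
  };

  # Continue configuration pointing at the same local Ollama instance.
  home.file.".continue/config.yaml".text = ''
    models:
      - name: Qwen2.5-Coder (local)
        provider: ollama
        model: qwen2.5-coder:7b
        apiBase: ${ollamaUrl}
  '';
}
#+end_src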