Optional Configurations

Installing Bun and configuring the package registry are the only required onboarding steps to get you up and running. Nothing below is required, but we recommend skimming through to see if anything speaks to you.

Code Editor Setup

note

We understand code editors can be a deeply personal thing, and we would rather you use an editor you are comfortable with than one we recommend.

Linting and formatting are handled through our CI/CD process regardless of which editor you use.

Our code editor of choice is VSCodium for its decoupling from Microsoft, offering a clean, open-source build of VS Code. While we strongly disapprove of Microsoft, we understand that their development ecosystem reaches far and wide and offers a rich extension marketplace. We have explored other editors (and have high hopes for Zed in the future), but VSCodium is our current best solution.

Marketplace Configuration

VSCodium doesn't ship with the VS Code Marketplace by default, so we're going to have to hack it a bit.

Let's create a product.json file in VSCodium's application support folder (the path contains a space, so keep it quoted):

touch "$HOME/Library/Application Support/VSCodium/product.json"

Open up product.json and insert the following:

product.json
{
  "nameShort": "Visual Studio Code",
  "nameLong": "Visual Studio Code",
  "extensionsGallery": {
    "serviceUrl": "https://marketplace.visualstudio.com/_apis/public/gallery",
    "cacheUrl": "https://vscode.blob.core.windows.net/gallery/index",
    "itemUrl": "https://marketplace.visualstudio.com/items"
  }
}

The extension marketplace should update on VSCodium's next clean restart.
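If you'd like to sanity-check the file before restarting, any JSON parser will do. As a quick sketch, assuming you have jq installed (brew install jq), the following prints the file back if it parses and fails loudly on a syntax error:

jq . "$HOME/Library/Application Support/VSCodium/product.json"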

Here are some extensions we recommend for our development experience:

  1. Biome: For its language-independent, performant formatting and linting.
  2. Code Spell Checker: Does what it says on the tin and absolutely rules.
  3. DevDb: Lightweight database viewer.
  4. indent-rainbow: Indentation visual aid; we recommend light mode.
  5. IntelliSense for CSS: Autocompletes CSS class names.
  6. YAML: YAML support from the Red Hat crew.
  7. markdownlint: Integrated Markdown linting.

You can find all of these in the extension marketplace, or by running the following command after installing VSCodium:

codium --install-extension biomejs.biome \
  --install-extension streetsidesoftware.code-spell-checker \
  --install-extension damms005.devdb \
  --install-extension zignd.html-css-class-completion \
  --install-extension redhat.vscode-yaml \
  --install-extension davidanson.vscode-markdownlint
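VSCodium shares the VS Code CLI, so you can confirm everything landed by listing the installed extension IDs:

codium --list-extensions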

Code Assistant

While we do not support and, quite frankly, detest AI/LLMs as an emerging unethical product, LLMs are an invaluable tool for personal use and can be run locally, provided you have a mid-range graphics card or an M2-or-newer Apple chip.

Ollama

Ollama allows us to run open-source Large Language Models (LLMs) locally on our machine. You can install it via the following:

brew install ollama
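Ollama's server needs to be running before you can pull or chat with models. Assuming the Homebrew install above, you can run it as a background service:

brew services start ollama

(Alternatively, run ollama serve in a spare terminal window.)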

Download Models

Once Ollama is installed, we can run the following to pull the three LLMs we're going to use.

warning

While these models are well optimized and light, they will still take up around 10 GB of disk space.

ollama pull mistral
ollama pull qwen2.5-coder:1.5b-base
ollama pull nomic-embed-text:latest
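Once the pulls finish, you can confirm all three models are available locally:

ollama list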

We'll be using:

  • Mistral for chatting
  • Qwen-coder for...coding
  • Nomic as our embedding model (how we turn files, documents, etc. into referenceable vector data)
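If you want a quick smoke test of the chat model, you can talk to Mistral straight from the terminal (type /bye or press Ctrl+D to exit):

ollama run mistral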

Install Continue

Continue is an open-source code assistant that can be run locally and has built-in functionality and configurations for context and RAG (retrieval-augmented generation). If you don't know what any of that means: it can index our codebase for reference and is great at scanning what you're working on currently.

Continue can be found in the extension marketplace or via CLI:

codium --install-extension continue.continue

Continue can be accessed in the side menu of Codium by clicking on its icon. While Continue can initially be pushy about signing in, you do not have to make an account on their service to use their product.

Continue Configuration

In your global .continue folder (~/.continue on Mac, %USERPROFILE%\.continue on Windows), copy the following into the config.yaml file:
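If the folder or file doesn't exist yet, create them first (a macOS/Linux sketch; Windows users should create the equivalent under %USERPROFILE%\.continue):

mkdir -p ~/.continue
touch ~/.continue/config.yaml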

name: Local Assistant
version: 0.0.0
schema: v1
models:
  - name: mistral
    provider: ollama
    apiBase: http://localhost:11434
    model: mistral
    roles:
      - chat
      - edit
      - apply
  - name: Qwen2.5-Coder 1.5B
    provider: ollama
    apiBase: http://localhost:11434
    model: qwen2.5-coder:1.5b-base
    roles:
      - autocomplete
  - name: Nomic Embed
    provider: ollama
    apiBase: http://localhost:11434
    model: nomic-embed-text:latest
    roles:
      - embed
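Each model block above points Continue at Ollama's local HTTP API via apiBase. If Continue can't find your models, check that Ollama is actually listening on that port; its /api/tags endpoint returns the locally available models:

curl http://localhost:11434/api/tags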

Chat Usage

We recommend taking a look at Continue's chat quick start guide to learn how to add your codebase, docs, files, etc. into your assistant's context (memory).