My Vibe Coding Setup — Jaydip Bhanderi


tools · workflow

Using AI for coding is pretty much inevitable at this point. If you haven’t fully leaned into it yet, you will. Personally, I started back in 2023 with GitHub Copilot — back when it felt almost magical just to get a function autocompleted. From there to today, the delta is genuinely wild. The tools have gotten smarter, faster, and honestly more opinionated about how you should code.

This post is my honest take on what I’ve tried, what stuck, and how my current setup actually works.

The Tools I’ve Used (In Rough Chronological Order)

  • GitHub Copilot in VS Code
  • Copilot Chat and Agent in IntelliJ
  • Cursor
  • OpenCode with a Copilot subscription

My Ranking

I’ve run all of these on real projects, not toy examples. Here’s how they compare across the things I actually care about: ease of use, context awareness, accuracy, hallucination rate, speed, and token efficiency.

1. OpenCode 🏆

This is where I live now. It’s a CLI-first coding agent and it shows — it’s fast, thoughtful about how it compacts context, and the plugin ecosystem is genuinely impressive. Things like Superpowers skills and oh-my-opencode can completely transform how you interact with it. If you’re comfortable in the terminal, you’ll feel right at home.

The context compaction algorithm is decent out of the box, and with the right plugins you can push it pretty far. More on my exact setup below.

2. Cursor

Honestly? Cursor is excellent. If cost weren’t a factor, this would be a tie. The agent mode is probably the best I’ve used across any tool — it reasons well about multi-file changes and doesn’t lose the plot mid-task. The main reason it’s not #1 for me is that with a Copilot subscription, OpenCode ends up being cheaper for the same (or better) workflow. Hard to argue with that math.

3. VS Code with Copilot Plugin

Good starting point, nothing more. If you’ve been using Cursor or OpenCode for a while and you drop back into VS Code with Copilot, you’ll immediately feel the drag. It reads files slowly, the context feels shallower, and the whole thing just moves at a different pace. That said, there’s a real argument for recommending this to people just getting started — it requires zero extra setup, it’s right there in your favourite IDE, and it gets the job done.

4. Copilot in IntelliJ

I genuinely struggle to understand why this exists in its current state. It’s slow, it hallucinates more than anything else on this list, and it’s more frustrating than helpful on any complex task. If you’re on IntelliJ and want AI assistance, you’re honestly better off just using the gh copilot CLI separately. Skip this one.

Personal preference: I strongly prefer CLI over IDE for AI coding. The feedback loop is tighter, it feels faster, and you have way more control over context. If you’ve never tried it, give it an honest week.

My OpenCode Setup

Here’s exactly how I have things configured. This took some iteration to get right.

Installation (macOS)

The recommended way is through Homebrew using the OpenCode tap, which stays most up to date:

brew install anomalyco/tap/opencode

Alternatively, use the one-line install script:

curl -fsSL https://opencode.ai/install | bash

Or if you prefer npm:

npm i -g opencode-ai

Verify it’s working:

opencode --version

Then just cd into any project and run opencode to launch the TUI.
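If you like a sanity check first, something like this is safe to run anywhere (the project path below is just a placeholder):

```shell
# The day-to-day flow is simply:
#   cd ~/code/my-project    # any project root (placeholder path)
#   opencode                # launches the TUI scoped to that project
# Quick sanity check that the binary is actually on PATH:
opencode --version 2>/dev/null || echo "opencode not on PATH yet"
```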

Enable LSP

OpenCode integrates with Language Server Protocol servers to give the AI real code intelligence — diagnostics, definitions, references, the works. For most popular languages (TypeScript, Python, Go, Rust, etc.), LSP servers are auto-enabled when the relevant file extensions are detected.

To enable all built-in LSP servers explicitly, add this to your ~/.config/opencode/opencode.json:

{
  "$schema": "https://opencode.ai/config.json",
  "lsp": true
}

For finer control (enabling built-ins while customizing specific servers):

{
  "$schema": "https://opencode.ai/config.json",
  "lsp": {
    "typescript": {},
    "python": {}
  }
}

To also enable the experimental LSP tool (lets the AI actively query definitions, references, and call hierarchies):

OPENCODE_EXPERIMENTAL_LSP_TOOL=true opencode

Or add it to your shell profile (~/.zshrc or ~/.bashrc) to make it permanent:

export OPENCODE_EXPERIMENTAL_LSP_TOOL=true
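Either way, a quick check confirms the flag is actually exported before you rely on it:

```shell
# Export the flag and confirm child processes (like opencode) will see it
export OPENCODE_EXPERIMENTAL_LSP_TOOL=true
env | grep '^OPENCODE_EXPERIMENTAL_LSP_TOOL='
```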

Superpowers Skills

Superpowers is a skills framework that makes the AI stop and think before it starts writing code. Instead of immediately jumping into implementation, it runs a brainstorming phase, produces a design, then writes a proper plan before touching a single file. Once you’ve used it, vanilla agent mode feels reckless.

To install, just open OpenCode and tell it:

Fetch and follow instructions from https://raw.githubusercontent.com/obra/superpowers/refs/heads/main/.opencode/INSTALL.md

OpenCode will handle the rest automatically. To verify it’s working, ask:

Tell me about your superpowers

If it describes a structured workflow (brainstorming → plan → implementation), you’re good. To manually check available skills:

use skill tool to list skills

oh-my-opencode

oh-my-opencode is a full multi-agent harness built on top of OpenCode. It adds specialist agents (orchestrator, architect, researcher, etc.), parallel task execution, and a whole lot more. The magic keyword is ultrawork (or ulw) — add it to any prompt and you activate the full pipeline.
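The keyword just rides along in a normal prompt. For example (the task wording here is purely illustrative):

```
ultrawork: audit the checkout flow for race conditions, propose a fix, and implement it with tests
```

The orchestrator picks it up and fans the work out across the specialist agents.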

Install it interactively:

bunx oh-my-opencode install

Or non-interactively (replace flags based on your subscriptions):

bunx oh-my-opencode install --no-tui --copilot=yes --claude=no --gemini=no

To verify everything’s wired up correctly:

bunx oh-my-opencode doctor

Important caveat: oh-my-opencode is a token guzzler. It’s powerful, but if you’re working on quick, focused tasks it’s overkill — you’ll burn through tokens fast. Reserve it for deep, multi-step work where the orchestration overhead actually pays off.

To disable it temporarily without deleting the config, remove it from the plugin array in ~/.config/opencode/opencode.json:

{
  "plugin": []
}

Or use jq to remove it non-destructively:

jq '.plugin = [.plugin[] | select(. != "oh-my-opencode")]' \
  ~/.config/opencode/opencode.json > /tmp/oc.json && \
  mv /tmp/oc.json ~/.config/opencode/opencode.json
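If you want to sanity-check that filter before pointing it at your real config, run it against a throwaway copy first (requires jq; the plugin names here are just examples):

```shell
# Dry run of the same jq filter on a temp file, leaving the real config alone
cfg=$(mktemp)
printf '%s\n' '{"plugin": ["oh-my-opencode", "some-other-plugin"]}' > "$cfg"
jq -c '.plugin = [.plugin[] | select(. != "oh-my-opencode")]' "$cfg"
# prints {"plugin":["some-other-plugin"]}
rm -f "$cfg"
```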

Wrapping Up

That’s the setup. It’s taken a while to land here but it genuinely feels like the right balance of speed, control, and capability. If you’re still living in an IDE plugin, I’d really encourage you to try the CLI side — even for a week. The difference is hard to explain until you’ve felt it.
