April 6, 2026

We Built Codeusage Because Nobody Knows How Their Team Uses AI Coding Tools

[Dashboard preview: live team activity — sessions, tasks, active developers, and tokens, broken down by developer (Sarah K, Alex L, Ravi P, Maya N) and project (payments-api, auth-service, dashboard-ui, mobile-app)]

Here's a question most engineering leads can't answer: which of your developers are actually using AI coding tools — and on which projects?

Teams are rolling out AI coding tools fast. Licences are bought, seats are assigned. But after that? Silence. There's no dashboard that tells you who's using them, how often, on what projects, or whether the investment is actually changing how your team works.

We thought that was a problem worth solving.


What Codeusage Does

Codeusage is a lightweight platform that gives engineering teams visibility into AI coding tool usage across their organisation. Think of it as your AI tool intelligence layer — one dashboard that shows you sessions, developers, projects, and activity patterns.

No code changes. No complex setup. Just install, init, and it works.

[Dashboard preview: session list — per-session task counts, developer, project, duration, and token usage, with team totals for sessions, tasks, active devs, and tokens]

Setup Takes 30 Seconds

$ npm i -g codeusage-cli
$ codeusage init

That's it. The CLI hooks into your AI coding tool's native event system. Every time a developer completes a task, Codeusage captures the metadata — model used, duration, files changed, tools used — and sends it to your team dashboard. The developer doesn't need to change how they work. It runs silently in the background.
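The post doesn't publish the event schema, but as a rough sketch of what "metadata only" means in practice, a task-completion event might look like the following. Field names and values here are illustrative, not Codeusage's actual schema:

```python
import json

# Hypothetical task-completion event. Field names are illustrative,
# not Codeusage's actual schema. Note what is absent: no prompt text,
# no model response, no code, no file contents -- only metadata.
event = {
    "session_id": "a3f82c1e",
    "project": "payments-api",
    "model": "claude-sonnet-4",          # model used
    "duration_seconds": 312,             # how long the task took
    "files_changed": 4,                  # a count, never paths or contents
    "tools_used": ["edit", "bash", "read"],
    "tokens": {"input": 18200, "output": 5900},
}

payload = json.dumps(event)
print(payload)
```

The point of a payload like this is that it is small, human-readable, and auditable: anyone on the team can inspect it and confirm nothing sensitive leaves the machine.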


See Everything in One Place

Once your team is connected, the dashboard shows you:

  • Sessions and tasks — what's happening, grouped by developer and project
  • Developer activity — who's active, how often, and adoption patterns across the team
  • Project breakdowns — which codebases are getting the most AI-assisted work
  • Usage patterns — session durations, models used, tools invoked

The question shifts from "are people using AI tools?" to "how is AI changing the way our team builds software?"


Prompt Guard: Built-In Credential Protection

This one came from a real concern. Developers paste things into AI prompts — connection strings, API keys, tokens. Sometimes by accident, sometimes out of habit.

Prompt Guard scans every prompt locally, on the developer's machine, before it reaches the AI model. If it detects something that looks like a credential — an AWS key, a database URL, a private key — it blocks the prompt and tells the developer what it found.

No prompts are sent to our servers. No code is captured. The check happens entirely on the local machine in milliseconds.
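The post doesn't describe Prompt Guard's internals, but the core idea — pattern-matching a prompt locally before it ever leaves the machine — can be sketched in a few lines. The patterns and function names below are illustrative, not the actual implementation:

```python
import re

# Illustrative credential patterns -- a real scanner would carry a far
# larger, maintained rule set. These are not Codeusage's actual rules.
CREDENTIAL_PATTERNS = {
    "AWS access key": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "database URL": re.compile(r"\b(?:postgres|mysql|mongodb)://\S+:\S+@\S+"),
    "private key": re.compile(r"-----BEGIN (?:RSA |EC )?PRIVATE KEY-----"),
}

def scan_prompt(prompt: str) -> list[str]:
    """Return the names of any credential-like patterns found.

    Runs entirely on the local machine: the prompt is never sent anywhere.
    """
    return [name for name, pattern in CREDENTIAL_PATTERNS.items()
            if pattern.search(prompt)]

findings = scan_prompt("connect with postgres://admin:s3cret@db.internal/prod")
if findings:
    print(f"Blocked: prompt appears to contain {', '.join(findings)}")
```

Because the check is just local pattern matching, it adds milliseconds, works offline, and never needs to ship the prompt anywhere to make a decision.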


The Line We Don't Cross

Codeusage captures metadata only.

No prompts. No AI responses. No code. No diffs. No file contents. Ever.

We track models, durations, file counts, and tool usage. That's it. The telemetry payload is transparent — you can see exactly what gets sent at any time.


Who It's For

Engineering managers and team leads who want to understand AI tool adoption across their team. Are people actually using these tools? On which projects? Is the investment translating into changed workflows?

CTOs and VPs of Engineering who need to report on AI tool adoption and impact without relying on anecdotes.

Developers who want to track their own sessions, see their activity patterns over time, and use Prompt Guard to scan prompts for credentials before they reach the AI.


Get Started

Codeusage is live and ready to use.

Questions or feedback? Reach us at hashim@codeusage.dev