You’re debugging an issue and ask your AI agent to write a quick script that checks your database state. It hands you Python or JavaScript. You can read it, sure, but you can’t review it at a glance the way you can with C#.
With .NET 10’s dotnet run file.cs, there’s no reason to leave your main coding language anymore for the utility scripts your agent writes during development.
Why your agent doesn’t write C#
You ask your agent for a quick helper script. Maybe it’s something to inspect your local API response, dump some data to a file, or check which config values differ between environments. Thirty seconds later you’re staring at Python. Or JavaScript, if your project happens to have a package.json nearby. Agents pick up on project context and default to whatever scripting language fits the repo.
Either way, it’s not C#. LLMs were trained on massive amounts of Python and JavaScript scripting code. Both dominate Stack Overflow answers, GitHub snippets, and tutorial content for these kinds of throwaway tasks. Your agent isn’t choosing them because they’re better for the job or collaboration with you. It’s choosing them because that’s what its training data looks like.
The problem isn’t that Python or JavaScript are bad. It’s that you’re a C# developer. You write C# all day. When your agent hands you a Python script, you’re mentally translating requests.get to HttpClient, mapping dict comprehensions to LINQ, and hoping the edge cases are handled correctly. For a script you’ll run once during a debugging session and maybe keep around for next time, that translation cost isn’t worth it.
A script you can’t read at full speed is a script you can’t trust.
.NET 10 closed this gap
.NET 10 shipped file-based apps: single .cs files that run without a project file. dotnet run file.cs and you’re done. Top-level statements, implicit usings, nullable reference types, all enabled out of the box.
Three ways to run:

```shell
dotnet run app.cs
dotnet app.cs
echo 'Console.WriteLine("hello");' | dotnet run -
```
The SDK generates a virtual project behind the scenes. You never see it, never manage it, never think about it. The experience is as smooth as python script.py, though the first run is slower (~4.1s cold start vs ~23ms for Python). The .NET team is actively working on that, and subsequent runs drop to ~310ms. More on the numbers below.
The #: directives
C# 14 introduced `#:` directives: lines the compiler ignores but the SDK tooling processes. Think of them as inline project configuration. They go at the top of your file:
```csharp
#:package Humanizer@2.14.1

using Humanizer;

var release = DateTimeOffset.Parse("2025-11-12");
var since = DateTimeOffset.Now - release;
Console.WriteLine($".NET 10 released {since.Humanize()} ago.");
```
That #:package line pulls the NuGet package. No dotnet add package, no separate install step. It’s right there in the file, versioned and explicit.
Here’s the full set:
| Directive | Purpose | Example |
|---|---|---|
| `#:package` | NuGet reference | `#:package Newtonsoft.Json@13.0.3` |
| `#:sdk` | SDK selection | `#:sdk Microsoft.NET.Sdk.Web` |
| `#:property` | MSBuild property | `#:property LangVersion=preview` |
| `#:project` | Project reference | `#:project ../Shared/Shared.csproj` |
The #:sdk directive is the wild one. Want a web server in a single file? Swap the SDK:
```csharp
#:sdk Microsoft.NET.Sdk.Web
#:package Microsoft.AspNetCore.OpenApi@10.*-*

var builder = WebApplication.CreateBuilder();
builder.Services.AddOpenApi();
var app = builder.Build();

app.MapGet("/", () => "Hello from a single file");
app.MapGet("/time", () => DateTime.UtcNow);

app.Run();
```
That’s a full ASP.NET Core app with OpenAPI. One file.
And it’s just C#
File-based apps aren’t a new language or a special scripting dialect. They’re the same C# you write in your production codebase. Same BCL, same NuGet packages, same async/await. The only difference is the missing .csproj. You don’t need to teach your agent a new tool; just tell it to use regular C# in a single file.
You can even reference your own projects. Say you have an Ordering.Domain library with your domain entities and you want a quick script to deserialize a JSON file into those objects:
```csharp
#:project ../Ordering.Domain/Ordering.Domain.csproj

using System.Text.Json;
using Ordering.Domain;

var json = File.ReadAllText(args[0]);
var orders = JsonSerializer.Deserialize<List<Order>>(json);
foreach (var order in orders!)
    Console.WriteLine($"{order.Id}: {order.Status} — {order.Total:C}");
```
No copying classes, no redefining types, no Python dataclass that’s “close enough” to your C# model. The script uses your actual domain types. If Order changes in the project, the script picks it up on the next run.
Hit your local API and validate the response
```csharp
using System.Net.Http.Json;
using System.Text.Json;

var client = new HttpClient();
var response = await client.GetAsync("http://localhost:5000/api/orders?status=pending");
Console.WriteLine($"Status: {response.StatusCode}");

var orders = await response.Content.ReadFromJsonAsync<JsonElement>();
foreach (var order in orders.EnumerateArray())
{
    var id = order.GetProperty("id").GetString();
    var total = order.GetProperty("total").GetDecimal();
    var items = order.GetProperty("items").GetArrayLength();
    Console.WriteLine($"  {id}: {total:C} ({items} items)");
}
```
HttpClient is available through implicit usings. System.Net.Http.Json needs an explicit using as shown above, but System.Text.Json ships with the runtime. No packages needed.
Compare config files between environments
```csharp
using System.Text.Json;

var devConfig = JsonSerializer.Deserialize<Dictionary<string, JsonElement>>(
    File.ReadAllText(args.Length > 0 ? args[0] : "appsettings.Development.json"));
var prodConfig = JsonSerializer.Deserialize<Dictionary<string, JsonElement>>(
    File.ReadAllText(args.Length > 1 ? args[1] : "appsettings.Production.json"));

var allKeys = devConfig!.Keys.Union(prodConfig!.Keys).OrderBy(k => k);

Console.WriteLine($"{"Key",-40} {"Dev",-25} {"Prod",-25}");
Console.WriteLine(new string('-', 90));

foreach (var key in allKeys)
{
    var dev = devConfig.TryGetValue(key, out var d) ? d.ToString() : "(missing)";
    var prod = prodConfig.TryGetValue(key, out var p) ? p.ToString() : "(missing)";
    if (dev != prod)
        Console.WriteLine($" ▸ {key,-38} {dev,-25} {prod,-25}");
}
```
Zero packages. The kind of thing you’d ask for when deploying and you want to double-check what’s different. Run it with `dotnet run config-diff.cs -- appsettings.Development.json appsettings.Production.json`.
Gotchas and limitations
The project file trap
When a .csproj exists in the current directory and you run dotnet run file.cs, the CLI treats file.cs as an argument to the project instead of running it as a file-based app. Silent, confusing, and your agent will hit this constantly.
Use the explicit flag:

```shell
dotnet run --file file.cs
```

Or keep scripts in a separate directory. You could use a scripts/ folder outside your solution directory for exactly this reason.
Shebang support
On Linux and macOS, you can make scripts directly executable with #!/usr/bin/env dotnet at the top. Two things to watch: LF line endings (not CRLF) and no BOM. On Linux this is the default.
```shell
chmod +x info.cs
./info.cs
```
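A minimal shebang script might look like this (the file name `info.cs` and the printed fields are illustrative; save it with LF line endings and no BOM):

```csharp
#!/usr/bin/env dotnet
// info.cs — prints basic environment info using only implicit usings
Console.WriteLine($"OS:      {Environment.OSVersion}");
Console.WriteLine($".NET:    {Environment.Version}");
Console.WriteLine($"Machine: {Environment.MachineName}");
```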
Startup time
On my machine (Linux, .NET 10.0.103), a hello world:
| | First run | Subsequent runs |
|---|---|---|
| Python | ~23ms | ~23ms |
| PowerShell | ~365ms | ~365ms |
| C# (`dotnet run`) | ~4.1s | ~310ms |
The first run is slow. No way around it. This isn’t a Linux quirk either — the .NET team’s own measurements show similar numbers. But look at the second column: ~310ms is in the same ballpark as PowerShell. The SDK uses a three-tier cache that skips the build entirely when nothing changed, and for the scripts this article is about, local dev scripts your agent writes that you run and re-run, you pay the cold start once.
The cold start matters for CI pipelines and tight loops, not for the agent-assisted workflow we’re talking about here. And once your script does real work, the math shifts: .NET’s compiled runtime chews through computation roughly twice as fast as Python’s interpreter. Take a script that processes 500 JSON records, does some math, and formats output: Python finishes the compute in ~424ms, C# in ~210ms. The heavier the actual work, the smaller the startup gap becomes as a share of total time.
Knowing the .NET team is actively working on this, I expect the cold start to improve in the next version(s). Simple scripts already bypass MSBuild entirely via the compiler server. For .NET 11, there’s an effort to AOT-compile the dotnet CLI itself, targeting 80%+ reduction in CLI startup overhead.
Growing into a console app
When a script outgrows a single file, one command turns it into a full console app:

```shell
dotnet project convert app.cs
```
This generates a .csproj and Program.cs, translating all #: directives into MSBuild equivalents. From there it’s a regular .NET project — add classes, split into files, wire up DI, whatever you need. Your original script stays untouched.
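As a rough sketch of the translation (the exact generated project varies by SDK version, so treat this as illustrative), a file starting with `#:package Humanizer@2.14.1` would convert to something like:

```xml
<!-- Illustrative sketch of the generated .csproj, not exact SDK output -->
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>net10.0</TargetFramework>
    <ImplicitUsings>enable</ImplicitUsings>
    <Nullable>enable</Nullable>
  </PropertyGroup>
  <ItemGroup>
    <!-- The #:package directive becomes a regular PackageReference -->
    <PackageReference Include="Humanizer" Version="2.14.1" />
  </ItemGroup>
</Project>
```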
Multi-file support for file-based apps is planned for .NET 11.
What about PowerShell?
PowerShell is the obvious counter-argument. It runs on .NET, calls .NET APIs directly, and your agent probably already knows it well.
Closer to C#, yes. But still its own language. Get-Content instead of File.ReadAllLines. Invoke-RestMethod instead of HttpClient. Pipelines instead of LINQ. $_ instead of lambda parameters. You’re still translating.
```powershell
# PowerShell
$repos = Invoke-RestMethod "https://api.github.com/users/eriklieben/repos"
$repos | Sort-Object stargazers_count -Descending |
    Select-Object -First 10 name, stargazers_count, language |
    Format-Table
```
```csharp
// C# file-based app
using System.Net.Http.Json;
using System.Text.Json;

var client = new HttpClient();
var repos = await client.GetFromJsonAsync<JsonElement[]>(
    "https://api.github.com/users/eriklieben/repos");
repos!.OrderByDescending(r => r.GetProperty("stargazers_count").GetInt32())
    .Take(10)
    .ToList()
    .ForEach(r => Console.WriteLine(
        $"{r.GetProperty("name"),-30} ⭐ {r.GetProperty("stargazers_count"),4}"));
```
The PowerShell version is shorter. But which one can you debug in your sleep?
PowerShell also has no inline package management, no dotnet project convert equivalent, and quirky error handling ($ErrorActionPreference, try/catch that sometimes doesn’t catch). For agent work: a PowerShell script that needs to become production code means a rewrite. A C# script becomes a C# project with one command.
The full comparison
Python isn’t standing still either. With PEP 723 and tools like uv, Python 3.12+ supports inline script dependencies via `# /// script` metadata blocks, the direct equivalent of `#:package`. And Python’s type checking has matured, with Mypy and Pyright now mainstream.
| | Python | PowerShell | C# (.NET 10) |
|---|---|---|---|
| Run a file | `python script.py` | `pwsh script.ps1` | `dotnet run script.cs` |
| No project file | Yes | Yes | Yes |
| Inline packages | Yes (PEP 723 + uv) | No (`Install-Module`) | `#:package Name@version` |
| Type safety | Optional (Mypy/Pyright) | Optional | Yes (compiler-enforced) |
| Shebang | Yes | Yes | Yes |
| Publish to binary | PyInstaller / Nuitka | No | `dotnet publish` (native AOT) |
| Growth path | Manual | Rewrite to C# | `dotnet project convert` |
| Multi-file | Yes | Yes (dot-sourcing) | Single file only |
| Cold start | ~23ms | ~365ms | ~4.1s first run |
| Cached start | ~23ms | ~365ms | ~310ms (build skipped) |
| Same language as production code | No | No | Yes |
Python wins on cold startup, multi-file scripts, and ecosystem breadth for data/scripting tasks. PowerShell wins on familiarity for ops work. C# wins on type safety, binary publishing, the growth path, and zero context-switch from your production code.
Teaching your agent to use C# instead
The feature exists, but your agent doesn’t know about it. .NET 10 file-based apps are new enough that there’s far less training data for dotnet run file.cs than for Python or JavaScript scripting. You need to be explicit.
Even when an agent does reach for C#, it often picks a different tool. In my testing, agents without specific instructions defaulted to dotnet-script. It’s an open-source project that uses .csx files, #r "nuget:..." directives, and requires a separate dotnet tool install -g dotnet-script.
It’s a great project that filled this gap for years and still has features file-based apps don’t (REPL, multi-file scripts, remote URL execution).
But .csx is a dialect of C#, not standard C#. With .NET 10, the built-in dotnet run file.cs uses regular .cs files, the new #:package syntax, and needs nothing beyond the SDK.
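To make the difference concrete, here is the same dependency declared both ways (the Humanizer version is illustrative; the two lines belong in separate files, a `.csx` and a `.cs`):

```csharp
// dotnet-script (.csx dialect) — requires `dotnet tool install -g dotnet-script`
#r "nuget: Humanizer, 2.14.1"

// .NET 10 file-based app (.cs, standard C#) — needs nothing beyond the SDK
#:package Humanizer@2.14.1
```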
Put the knowledge in a skill, not just a rule
Adding “use C# for scripts” to your CLAUDE.md isn’t enough. The agent needs the #: directive syntax, the --file flag gotcha, when to use #:sdk vs the default SDK. That’s too much for a project instruction file.
Create a skill with the full reference, then point your CLAUDE.md at it. In Claude Code, a skill is a markdown file in .claude/skills/ that the agent loads when it matches:
```markdown
# .claude/skills/dotnet-scripting.md

Use this skill when asked to write scripts, utilities, or one-off automation tasks.

## Rules

- Write all scripts as C# file-based apps (.NET 10)
- Use `dotnet run script.cs` to execute
- When a .csproj exists in the working directory, use `dotnet run --file script.cs`
- Prefer BCL APIs over adding packages when possible

## Directives

- `#:package Name@version` — NuGet package (use `@*` for latest)
- `#:sdk Microsoft.NET.Sdk.Web` — switches SDK
- `#:property Key=Value` — MSBuild property
- `#:project ../Path/Project.csproj` — project reference
```
Then in your CLAUDE.md:
- For scripting tasks, use C# file-based apps. See the `dotnet-scripting` skill.
For other AI tools
Most tools have converged on AGENTS.md as the standard for project-level instructions:
- GitHub Copilot: `.github/copilot-instructions.md` (also reads `AGENTS.md`)
- OpenAI Codex CLI: `AGENTS.md` in any directory from repo root to cwd
- JetBrains Junie: `.junie/AGENTS.md` or `AGENTS.md` in the project root
- OpenCode: `AGENTS.md` in the project root (also reads `CLAUDE.md` as fallback)
These tools don’t have a skill system, so inline the directive syntax and the --file flag warning directly.
What changes in practice
Before I added this to my config, a typical exchange:
Me: “Write a script that checks which of my NuGet packages have updates available”
Agent: writes Python with `subprocess` calling `dotnet list package --outdated`, then parses the text output with regex
After:
Agent: writes C# using `NuGet.Protocol` to query package feeds directly, with `#:package NuGet.Protocol@6.12.1` at the top
I can read it instantly and spot bugs because I know the API. The agent got the directive syntax right because the skill had an example.
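For a sense of what such a script looks like, here is a minimal sketch against the NuGet v3 API (the package name and feed URL are illustrative, and this only lists published versions rather than diffing against your installed ones):

```csharp
#:package NuGet.Protocol@6.12.1

using NuGet.Common;
using NuGet.Protocol;
using NuGet.Protocol.Core.Types;

// Query the public NuGet v3 feed for all published versions of a package
var repo = Repository.Factory.GetCoreV3("https://api.nuget.org/v3/index.json");
var byId = await repo.GetResourceAsync<FindPackageByIdResource>();
var versions = await byId.GetAllVersionsAsync(
    "Humanizer", new SourceCacheContext(), NullLogger.Instance, CancellationToken.None);

Console.WriteLine($"Latest Humanizer: {versions.Max()}");
```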
When to use this (and when not to)
This is for: the utility scripts your agent writes while you’re developing. Debugging helpers, data inspection tools, API response validators, config comparisons, quick transformers you run from your terminal during a coding session. Scripts you might run once and throw away, or keep in a scripts/ folder for next time. The kind of work where you want to read the output, not maintain the script.
Stick with Python or Bash when:
- You’re writing CI pipeline glue where cold start matters and nothing is cached
- You need minimal Docker containers (the .NET SDK image is ~800MB vs Python slim at ~60MB)
- The task is data science or ML work where Python’s ecosystem is stronger
- You’re publishing scripts as native binaries: `dotnet publish app.cs` uses AOT by default, and libraries that rely on runtime reflection will crash silently. Disable it with `#:property PublishAot=false` if you’re unsure.
Stop paying the tax
The cold start is real. The single-file limitation is real. CI scenarios still favor lighter alternatives.
But for the most common case, your agent writing a quick script that you need to read and trust, it should be in the language you already think in.
One skill file, one line in your project config, and you stop translating.