Native AOT for ASP.NET Core 10: Production Deployment Implications
Native Ahead-of-Time (Native AOT) compilation for ASP.NET Core graduated from preview to a fully supported scenario in .NET 8 and matured significantly in .NET 10 LTS. The promise is real: 50-150 ms cold start instead of 800-2000 ms, half the memory footprint, and binaries that ship as single executables under 30 MB instead of multi-hundred-megabyte publish folders. The catch is that AOT trades JIT-time optimizations for ahead-of-time guarantees, which means some libraries and patterns that work fine on traditional .NET simply don't compile under AOT. This is the honest production-deployment guide — when AOT pays off, when it costs more than it saves, and what changes for hosting.
What Native AOT actually is
Traditional .NET compiles your C# to Intermediate Language (IL) at build time, then the runtime's Just-In-Time (JIT) compiler converts IL to native machine code as your app runs. The JIT applies hot-path optimizations, tiered compilation, and runtime-specific tweaks — over the first few minutes of an app's lifetime, the code gets faster as the JIT learns from execution patterns.
Native AOT replaces this with a single compile-once-to-native step. The published output contains no JIT at all — your code is compiled to machine code up front and bundled with a trimmed native runtime instead of the full CoreCLR. You get:
Faster cold start — no JIT compilation pause on first request
Smaller publish size — no JIT, no IL metadata, only the code your app actually references after trimming
Lower memory footprint at startup — typically 40-60% less than JIT-compiled .NET for the same app
Single self-contained executable — one .exe file, no external runtime dependency on the host
The trade-off:
No JIT runtime optimizations — the code runs at AOT-time-determined speed, which is good but not the same as JIT's learned-from-traffic optimizations
Strict trimming requirements — any library that uses reflection in ways the trimmer can't analyze will fail at AOT time
No dynamic code generation — libraries that emit IL at runtime (some serializers, some ORMs, some testing tools) don't work
Longer build times — AOT compilation typically takes 2-5x longer than regular compilation
When Native AOT actually pays off
Cold-start-sensitive scenarios
If your app's first request takes longer than the user's patience — serverless functions, container deployments that scale to zero, scheduled batch jobs that spin up worker processes, CI/CD workflows that warm up an app for testing — AOT's order-of-magnitude cold-start improvement (roughly 800-2000 ms down to 50-150 ms) is genuine business value.
For traditional long-running web servers (a single ASP.NET Core process serving traffic 24/7), the cold-start savings happen once per deploy. After warm-up the JIT-compiled version is competitive or faster on steady-state throughput.
Container density
If you're packing many ASP.NET Core microservices into Kubernetes pods or ECS tasks, the memory footprint difference matters — you can fit 2-3x more AOT-compiled processes in the same node memory budget. For organizations running 50+ small ASP.NET Core services in containers, the infrastructure cost savings are real.
Embedded / edge deployment
Distributing an ASP.NET Core app as a single executable to IoT devices, point-of-sale terminals, or air-gapped corporate environments is dramatically simpler with AOT — one file, no .NET runtime install required, predictable startup.
Where AOT doesn't pay off
For a normal IIS-hosted ASP.NET Core web app serving steady-state production traffic, AOT's marginal benefit is small:
Cold start happens once per app pool recycle (which should be rare in production)
Memory savings are modest compared to a typical Business-or-higher hosting plan's headroom
Steady-state throughput is roughly equivalent or slightly better with traditional JIT in most workloads
The library-compatibility constraints rule out common patterns (EF Core with reflection-heavy DbContext, some serializers, some logging providers)
If your app runs as a long-lived IIS site on Adaptive Web Hosting or a similar Windows host, AOT is a "consider it for specific subsystems" option, not a default choice.
What breaks under Native AOT
The AOT compiler analyzes your code at build time. Anything it can't statically prove safe gets flagged or stripped. Common breakages:
Reflection over types the trimmer can't see
// BAD under AOT — Type.GetType(string) at runtime can't be analyzed,
// so the trimmer may have removed the target type entirely
var type = Type.GetType(typeName);
var instance = Activator.CreateInstance(type);
// GOOD — reference the type directly; the trimmer keeps what it can see
var instance = new MyType();
// If reflection is unavoidable, annotate the parameter so the trimmer
// preserves the members you actually need:
static object Create(
    [DynamicallyAccessedMembers(DynamicallyAccessedMemberTypes.PublicParameterlessConstructor)] Type type)
    => Activator.CreateInstance(type)!;
The AOT trimmer strips types it never sees referenced. At runtime, Type.GetType("MyType") returns null for a stripped type, and the subsequent Activator.CreateInstance call throws — an error that only surfaces in production, not at build time.
Dynamic LINQ / Expression compilation
Libraries that compile expression trees at runtime (some ORMs, some dynamic-query builders) don't work under AOT. The JIT they rely on isn't present.
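There is a partial escape hatch worth knowing: expression trees can still be *interpreted* under AOT, they just can't be JIT-compiled. A minimal sketch (the lambda here is illustrative):

```csharp
using System;
using System.Linq.Expressions;

// Under Native AOT, Compile() falls back to the expression interpreter
// instead of emitting native code; you can also request interpretation
// explicitly. Interpreted execution is slower but needs no runtime codegen.
Expression<Func<int, int>> square = x => x * x;
Func<int, int> f = square.Compile(preferInterpretation: true);
Console.WriteLine(f(7)); // runs via the interpreter
```

Libraries that assume compiled-expression performance will still be slow under this fallback, which is why many ORMs advertise AOT incompatibility rather than silently degrading.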
Entity Framework Core with default reflection patterns
EF Core 8 introduced AOT-compatible compiled models. EF Core 10 expanded support further. But the default reflection-based DbContext discovery doesn't work under AOT — you need to use the pre-compiled model generator. The migration takes effort but is well-documented.
JSON serialization with reflection-based serializers
System.Text.Json has full AOT support via source generators (available since .NET 6). Newtonsoft.Json does not work under AOT. If your project uses Newtonsoft.Json (very common), you need to either migrate to System.Text.Json or stay on JIT.
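The source-generator pattern looks like this — a minimal sketch in which the DTO and context names are illustrative, not from any particular project:

```csharp
using System.Text.Json;
using System.Text.Json.Serialization;

public record WeatherDto(string City, double TempC);

// The source generator emits AOT-safe (de)serialization code at build
// time for every type listed here — no runtime reflection involved.
[JsonSerializable(typeof(WeatherDto))]
public partial class AppJsonContext : JsonSerializerContext { }

// Usage: pass the generated type info instead of relying on reflection:
// string json = JsonSerializer.Serialize(dto, AppJsonContext.Default.WeatherDto);
```

In an ASP.NET Core minimal API, you can also wire the context into the framework's JSON handling via `builder.Services.ConfigureHttpJsonOptions(o => o.SerializerOptions.TypeInfoResolverChain.Insert(0, AppJsonContext.Default));`.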
Some logging providers and ORMs
Audit your NuGet dependencies. Microsoft publishes a list of AOT-compatible packages; check that your stack is on it before committing to AOT.
How to enable Native AOT in an ASP.NET Core project
In your .csproj:
<PropertyGroup>
<TargetFramework>net10.0</TargetFramework>
<PublishAot>true</PublishAot>
<InvariantGlobalization>true</InvariantGlobalization>
</PropertyGroup>
Then publish:
dotnet publish -c Release -r win-x64 --self-contained
The -r win-x64 specifies the target runtime — AOT compiles to a specific architecture, unlike traditional .NET's portable IL output. The --self-contained flag is implied by PublishAot (AOT output always bundles its runtime), so stating it explicitly is optional but harmless. The output is a single .exe plus a small handful of native dependencies.
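On the code side, the AOT-oriented project template uses a trimmed host builder. A minimal sketch (the route and handler are illustrative):

```csharp
// WebApplication.CreateSlimBuilder is the trimmed, AOT-friendly builder
// used by the `dotnet new webapiaot` template — it omits features the
// trimmer can't statically analyze (e.g. Startup-class conventions).
var builder = WebApplication.CreateSlimBuilder(args);
var app = builder.Build();

app.MapGet("/ping", () => "pong");

app.Run();
```

The regular `CreateBuilder` also works under AOT for many apps, but the slim builder keeps the published binary smaller and produces fewer trimmer warnings.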
Hosting Native AOT ASP.NET Core on IIS
An AOT-compiled ASP.NET Core app is a Windows executable. Running it under IIS is structurally similar to traditional ASP.NET Core hosting, with a few differences:
Hosting model
Use out-of-process hosting (not in-process). The in-process model loads the .NET runtime into the IIS worker (w3wp.exe); AOT-compiled apps already contain their own runtime, so the in-process model conflicts. Set in web.config:
<aspNetCore processPath=".\YourApp.exe"
arguments=""
stdoutLogEnabled="false"
hostingModel="outofprocess" />
The processPath points at your AOT-compiled executable, not dotnet + DLL.
App pool settings
Set the IIS Application Pool's Managed Code Version to No Managed Code — same as for traditional ASP.NET Core. The ASP.NET Core Module proxies requests from IIS to your AOT-compiled Kestrel process.
Architecture matching
Your AOT publish must target the same architecture as the IIS app pool. If the app pool is 64-bit (Adaptive Web Hosting's default), publish with -r win-x64. If 32-bit, -r win-x86. Mixing causes BadImageFormatException at startup.
Deployment
Upload the AOT publish output via FTP, Plesk file manager, or Web Deploy — same as a regular ASP.NET Core deploy. See our complete ASP.NET Core deployment walkthrough for the broader steps.
Performance reality check
Real-world Native AOT benchmarks for ASP.NET Core 10 on Windows Server 2022:
| Metric | Traditional JIT | Native AOT |
|---|---|---|
| Cold start (first request) | 800-2000 ms | 50-150 ms |
| Warm-state RPS (simple JSON API) | ~50,000 RPS | ~45,000-55,000 RPS |
| Publish output size | 80-200 MB | 15-30 MB |
| Process startup memory | 80-120 MB | 30-50 MB |
| Steady-state memory (1 hour traffic) | 150-300 MB | 80-150 MB |
| Build time (full publish) | 10-20 s | 40-90 s |
For long-running web servers, the steady-state RPS is the headline number — and the JIT and AOT versions are roughly equivalent. The savings are in startup, memory, and deploy size; for a normal IIS-hosted production web app, those matter less than they do for serverless.
Production checklist for AOT-compiled ASP.NET Core on IIS
Verify all NuGet dependencies are AOT-compatible. Run dotnet publish -c Release -r win-x64 early; the trimmer will warn about any incompatible packages
Audit reflection usage in your own code. Replace dynamic Type.GetType() with explicit type references; add [DynamicallyAccessedMembers] attributes where reflection is unavoidable
Use System.Text.Json with source generators instead of Newtonsoft.Json
Use EF Core compiled models if you're using EF Core 8 or 10
Set hostingModel="outofprocess" in web.config
Set processPath to your .exe, not dotnet
Configure CI to do AOT publish — the longer build time matters for build-server capacity planning
Test in staging first — AOT-specific bugs (reflection-stripped types, native dependency mismatch) only surface at runtime, not at build time
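One way to front-load the dependency audit is to turn on the AOT and trim analyzers during everyday builds, so incompatibilities show up as warnings long before the slow AOT publish. A csproj fragment, assuming the properties are added to the existing PropertyGroup:

```xml
<PropertyGroup>
  <!-- Surface AOT/trim warnings on every build, not just at publish -->
  <EnableAotAnalyzer>true</EnableAotAnalyzer>
  <EnableTrimAnalyzer>true</EnableTrimAnalyzer>
  <EnableSingleFileAnalyzer>true</EnableSingleFileAnalyzer>
</PropertyGroup>
```

Class libraries consumed by an AOT app can instead set `<IsAotCompatible>true</IsAotCompatible>`, which enables the same analyzers and marks the package as AOT-safe for consumers.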
How Native AOT interacts with Adaptive Web Hosting
Adaptive Web Hosting runs Windows Server 2022 + IIS 10 with dedicated IIS Application Pools per site. AOT-compiled ASP.NET Core apps deploy and run on every plan with no special configuration needed beyond the hostingModel="outofprocess" and processPath changes in web.config.
Specifically:
Dedicated app pools per site — AOT-compiled binaries don't compete with neighbour tenants for CPU or memory
64-bit app pools by default — matches the standard AOT publish target (win-x64)
SQL Server 2022 included — works with AOT-compatible EF Core 8/10 compiled-model patterns
Free Let's Encrypt SSL via Plesk — same as traditional ASP.NET Core; AOT doesn't change the TLS termination layer
The hosting layer isn't a constraint on AOT adoption. The constraints are in your application code and dependencies.
Frequently asked questions
Should I switch my production app from JIT to AOT?
For most long-running IIS-hosted web apps, the answer is no — the marginal benefit is small and the library-compatibility audit is real work. AOT is best evaluated on a per-subsystem basis: maybe one microservice that takes a long time to cold-start gets AOT; the main monolith stays on JIT. For greenfield serverless or container-density workloads, AOT is genuinely worth designing for from day one.
Does Native AOT improve steady-state performance?
Generally not. The JIT's runtime optimizations close the gap once an app has been running for a few minutes. AOT's gains are concentrated in startup time, memory footprint, and binary size — not throughput.
Can I use Blazor Server with Native AOT?
Blazor Server has been AOT-incompatible historically because of its heavy use of reflection in the Razor component pipeline. .NET 10 made progress on this, but a fully AOT-compiled Blazor Server app remains an emerging scenario rather than a default-supported one. For now, Blazor Server is one of the scenarios where staying on JIT is the right choice. See our Blazor hosting complete guide for the architectural picture.
What about Blazor WebAssembly with AOT?
Blazor WebAssembly AOT compilation is a separate feature from server-side AOT. It compiles your Blazor C# code to optimized WebAssembly at publish time, dramatically improving in-browser performance. This works on .NET 8 and later with <RunAOTCompilation>true</RunAOTCompilation> in the .csproj — entirely independent of whether the server-side host uses AOT.
How does AOT affect Entity Framework Core?
EF Core 8 introduced AOT-compatible compiled models. The pattern: generate a model at build time using dotnet ef dbcontext optimize, register the compiled model in your DbContextOptions, and the runtime DbContext discovery doesn't need reflection. EF Core 10 refined this further. The migration takes about an hour of dev time for a small DbContext, longer for complex ones.
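The registration step can be sketched as follows — AppDbContext, AppDbContextModel, and connectionString are illustrative names (the optimize command generates a model class named after your context, with a static Instance property):

```csharp
// After running `dotnet ef dbcontext optimize`, point EF at the
// generated compiled model so it skips reflection-based model building:
builder.Services.AddDbContext<AppDbContext>(options =>
    options.UseSqlServer(connectionString)
           .UseModel(AppDbContextModel.Instance));
```

If you change the entity model, re-run the optimize command — the compiled model is a build artifact and goes stale silently otherwise.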
Are there hosting requirements for AOT-compiled apps?
Just the standard Windows Server 2022 + IIS 10 setup. The AOT-compiled binary is a regular Windows .exe; IIS hosts it via the ASP.NET Core Module out-of-process. Adaptive Web Hosting plans run AOT-compiled apps with no special configuration needed beyond pointing web.config at the .exe instead of dotnet YourApp.dll.
What about debugging AOT-compiled apps?
Native AOT debugging is harder than JIT — the symbol info available is more limited, breakpoints behave differently, and some runtime introspection tools don't work. Debug your app in regular JIT mode during development, publish AOT for production. Most AOT-specific bugs are caught at publish time by the trimmer warnings, not at debug time.
Bottom line
Native AOT is a meaningful optimization for cold-start-sensitive, container-density, or single-executable distribution scenarios. For long-running IIS-hosted ASP.NET Core web apps serving steady production traffic, the marginal benefit is small and the library-compatibility audit costs real engineering time. Evaluate per-subsystem, not per-application.
On Adaptive Web Hosting's ASP.NET Core plans, AOT-compiled apps deploy and run with no special hosting configuration — the dedicated IIS Application Pools, 64-bit defaults, and SQL Server 2022 inclusion all work the same way for JIT and AOT alike. The decision is at the application layer, not the hosting layer. Every plan includes a 30-day money-back guarantee. View hosting plans, see our deployment walkthrough, or talk to an ASP.NET expert.