If you’ve been building with Flutter for a while, you’ve definitely heard terms like AOT and JIT thrown around.
Maybe you know debug is slower than release.
Maybe you’ve wondered what actually ends up inside your APK.
But like… what’s really going on under the hood?
Let’s open it up properly — no buzzwords, no hand-wavy stuff. Just what actually happens to your code, when it happens, and why Flutter lowkey has a massive edge.
The Simplest Way to Think About It
Imagine you invited someone over for dinner.
AOT (Ahead of Time) is you cooking everything beforehand.
Full meal ready. Table set. Chef mode ON.
They walk in → boom, food’s already there.
But if they suddenly go:
“Actually I feel like pasta instead of rice…”
…yeah, you’re screwed XD
You’re starting from scratch.
JIT (Just in Time) is the opposite.
You’re like:
“Tell me what you want, I’ll make it right now.”
Flexible? Yes.
Fast? …depends how hungry they are ;)
Flutter basically looked at both and said:
“Why not use both where they make sense?”
And that’s exactly what it does.
Debug Mode — JIT and the Dart VM
When you run:
flutter run
Your code is NOT compiled to native machine code.
Yeah — surprise.
Instead, Dart takes a completely different route.
The Compilation Pipeline
Your Dart code first goes through something called the CFE (Common Front End).
Sounds fancy, but it’s basically:
- parsing your code
- checking types
- resolving imports
And then it produces something called a Kernel AST.
Think of it like:
not source code, not machine code… but a structured “meaning” of your program
This gets saved as .dill files (Dart Intermediate Language).
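You can actually produce one of these kernel files yourself with the Dart SDK (file paths here are just examples):

```shell
# Compile a Dart file to a kernel (.dill) file -- no machine code yet,
# just the serialized Kernel AST that the VM (or AOT compiler) consumes.
dart compile kernel bin/main.dart -o main.dill

# It's a binary format -- 'file' confirms it's not plain source.
file main.dill
```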
From here, the Dart VM takes over.
Tiered JIT — This Is Where It Gets Spicy
The Dart VM doesn’t just compile everything upfront.
Because that would be dumb.
Why compile everything when half your code might never even run?
So it uses tiers:
Tier 0 — “Just run it bro”
- Runs in an interpreter
- No compilation
- Starts instantly
Downside? Slow af.
But meanwhile, the VM is silently watching:
- what functions are called
- what types are used
- which paths are hot
Tier 1 — “Okay this seems important”
Once a function gets called enough times:
- it gets compiled to native code
- but not heavily optimized
Faster than interpreter, but still warming up.
Tier 2 — “Alright I know your patterns now”
This is where the VM gets confident.
Now the VM has enough data to go:
“Cool, I trust this pattern.”
So it starts making bold assumptions.
Example:
If add(a, b) has only ever seen integers…
It goes:
“I’m skipping type checks. We going full speed.”
But what if suddenly a double shows up?
Boom.
Deoptimization.
- throws away optimized code
- falls back
- re-learns
No drama. Just adapts.
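Here's a minimal sketch of that scenario from the code's side (the exact call-count thresholds are internal VM details, so treat the loop count as illustrative):

```dart
num add(num a, num b) => a + b;

void main() {
  // Warm-up: the JIT only ever sees ints flowing through add(),
  // so the optimizing compiler can specialize it to integer math
  // and skip the generic type checks.
  var sum = 0;
  for (var i = 0; i < 100000; i++) {
    sum += add(i, 1) as int;
  }
  print(sum);

  // First double to arrive: the "ints only" assumption breaks.
  // The VM deoptimizes add() back to generic code, keeps running
  // correctly, and may re-optimize later with the new type info.
  print(add(1, 2.5)); // 3.5 -- still correct, just momentarily slower
}
```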
The Real Trick
Cold code → stays cheap
Hot code → gets optimized like crazy
You only pay cost where it actually matters.
Hot Reload — The Real MVP
This is where JIT absolutely flexes.
You change a widget color → hit save.
Here’s what happens behind the scenes:
- Only changed code gets recompiled
- Sent to the Dart VM over a WebSocket (the VM Service protocol)
- VM patches functions in-place
- Flutter re-runs build()
And here’s the crazy part:
Your app state stays exactly the same
- scroll position? safe
- form input? safe
- counters? safe
Because state lives in memory (heap), and you’re only swapping instructions.
That’s why hot reload feels like magic.
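A counter is the classic way to see this for yourself (a hypothetical minimal page; bump the count a few times, change the color, save, and watch the count survive):

```dart
import 'package:flutter/material.dart';

class CounterPage extends StatefulWidget {
  const CounterPage({super.key});

  @override
  State<CounterPage> createState() => _CounterPageState();
}

class _CounterPageState extends State<CounterPage> {
  // Lives on the heap inside the State object -- hot reload swaps
  // code, not objects, so this value survives a reload.
  int _count = 0;

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      body: Center(
        // Change this color and hit save: build() re-runs with the
        // patched code, but _count keeps whatever value it had.
        child: Text('$_count', style: const TextStyle(color: Colors.black)),
      ),
      floatingActionButton: FloatingActionButton(
        onPressed: () => setState(() => _count++),
        child: const Icon(Icons.add),
      ),
    );
  }
}
```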
Why Debug Mode Feels Slow
It’s not just JIT.
Debug mode has extra baggage:
- No tree shaking → everything included
- Assertions ON → extra checks
- DevTools hooks running
- No aggressive optimizations
- Constant profiling
Basically:
Debug mode = comfort + visibility
Release mode = speed + violence
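You can even observe one of those differences directly from Dart: `assert` bodies only execute when assertions are enabled (debug mode), and release AOT builds strip them out entirely:

```dart
void main() {
  var isDebug = false;
  // This closure runs only when asserts are enabled (debug mode).
  // In a release AOT build, the whole assert is compiled away.
  assert(() {
    isDebug = true;
    return true;
  }());
  print(isDebug ? 'asserts ON -> debug build' : 'asserts OFF -> release build');
}
```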
Release Mode — AOT and Native Code
Now things change completely.
flutter build apk --release
No VM games anymore.
This is full AOT mode.
The Compilation Pipeline (Again, But Different Ending)
Same start: CFE → Kernel AST (.dill).
But now, instead of handing the kernel to the Dart VM, Flutter hands it to:
gen_snapshot (the AOT compiler)
Type Flow Analysis — The Big Brain Move
Unlike JIT, AOT cannot guess.
No safety net. No “fix later”.
So it does something called Type Flow Analysis (TFA):
- traces every possible path in your app
- figures out what types can exist where
- removes anything unreachable
Now the compiler can actually be bold:
- inline functions
- remove dead code
- skip dynamic checks
- devirtualize calls
Result?
Extremely optimized native machine code
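A sketch of what Type Flow Analysis makes possible (the class names here are invented for illustration):

```dart
abstract class Shape {
  double area();
}

class Circle implements Shape {
  final double r;
  Circle(this.r);

  @override
  double area() => 3.14159 * r * r;
}

// Never instantiated anywhere -> TFA marks it unreachable,
// and tree shaking drops it from the AOT snapshot entirely.
class Square implements Shape {
  final double side;
  Square(this.side);

  @override
  double area() => side * side;
}

void main() {
  // TFA can prove the only Shape that ever flows here is a Circle,
  // so the compiler can devirtualize s.area() into a direct call
  // (and likely inline it) -- no dynamic dispatch, no dynamic checks.
  final Shape s = Circle(2);
  print(s.area());
}
```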
What’s Inside Your APK?
If you unzip it, you’ll see something like:
your_app.apk
├── lib/
│ ├── arm64-v8a/
│ │ ├── libapp.so ← YOUR code (compiled)
│ │ └── libflutter.so ← Flutter engine
├── assets/
├── AndroidManifest.xml
└── classes.dex ← tiny Android entry point
That classes.dex?
Not your app logic.
It’s just Android saying:
“Okay Flutter, you take over now.”
After that:
→ Flutter engine loads libapp.so
→ CPU runs your code directly
No VM. No interpreter. No waiting.
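You can verify all of this on any release build (the APK path below is the default Flutter output location; adjust if yours differs):

```shell
# Unzip the release APK and look at the native libraries inside.
unzip -o build/app/outputs/flutter-apk/app-release.apk -d apk_contents
ls apk_contents/lib/arm64-v8a/
# Expect libapp.so (your compiled Dart) and libflutter.so (the engine)
```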
Why Hot Reload Is Impossible Here
In release mode:
- no Dart VM
- no runtime patching
- no WebSocket updates
It’s just static native code.
So yeah…
→ Want changes? Rebuild. Reinstall. Done.
Profile Mode — Quick Side Note
flutter run --profile
This is the middle ground:
- AOT performance
- but still profiling tools available
Use it when you want real performance numbers without debug slowdown.
Why Flutter Hits Different
This is where things get really interesting.
React Native (with the classic bridge architecture):
- JS runs in a JavaScript engine
- talks to native through a bridge
- data gets serialized/deserialized across it
- async overhead on every hop
(The newer JSI-based architecture cuts much of this down, but the contrast still holds.)
Flutter:
- Dart → native machine code
- talks directly to rendering engine
- draws everything itself
No bridge. No middleman.
Flutter literally went:
“We’ll just render everything ourselves.”
And honestly?
Yeah… kinda insane.
The Mental Model
If you take one thing away, let it be this:
Debug mode → Dart VM is alive
- dynamic
- patchable
- flexible
- hot reload
Release mode → Dart VM is gone
- static
- precompiled
- fast af
- no flexibility
Flutter gives you both.
- JIT for development speed
- AOT for runtime performance
Compilation never disappears.
It just moves to where it makes the most sense.