I Built a Batch Asset Validator for Blender in Python — Here's What I Learned


Game developers and 3D artists deal with a problem that doesn't get talked about enough: you find out your assets are broken after you've already submitted them.

Wrong scale. Tri count way over budget. Missing textures. Unapplied transforms. Negative scale that makes your mesh flip inside-out in Unity. These issues are obvious in hindsight, but catching them manually across a folder of 50 FBX files before every export is the kind of tedious work that either doesn't happen, or happens inconsistently.

So I built AssetSentinel — a Blender addon that batch validates FBX and glTF assets against a target platform profile before they leave your machine. Here's the technical breakdown of how it works, what I ran into building it, and a full demo video at the end.


The Core Problem: Validation Belongs Before Export, Not After

Most 3D pipelines look like this:

Model → Export FBX/glTF → Import to Engine → Discover problem → Go back to Blender

The discovery step is the expensive one. If you're submitting to a marketplace like Fab or CGTrader, a rejection costs you review time. If you're a studio delivering to a client, it costs credibility. The fix is moving validation to before the export step:

Model → Validate → Fix → Export FBX/glTF → Import to Engine

That's what AssetSentinel does.


How It Works: Temp Scene Import Loop

The core mechanism is a modal operator that imports each file into a temporary Blender scene, runs a series of checks, then destroys the scene and moves to the next file. This keeps the user's working scene completely untouched.

import traceback

import bpy

def _evaluate_in_temp_scene(context, path: str, profile: dict) -> dict:
    result = {"path": path, "ok": True}
    win = context.window
    orig_scene = win.scene
    temp_scene = bpy.data.scenes.new(name="ASSETSENTINEL_TEMP")
    win.scene = temp_scene
    try:
        _import_file(path)
        # run all checks here
        result["tris"], result["verts"] = _tri_vert_counts()
        result["ngons"] = _ngon_count()
        # ... more checks
    except Exception:
        # classify the import failure rather than just reporting it
        detail = _tb_last_meaningful_line(traceback.format_exc())
        result["ok"] = False
        result["failure_type"], result["suggestion"] = _classify_import_failure(detail)
    finally:
        # always restore the user's scene and free the temp data,
        # even when the import blew up
        win.scene = orig_scene
        bpy.data.scenes.remove(temp_scene)
        bpy.ops.outliner.orphans_purge(do_recursive=True)
    return result

The finally block is non-negotiable. You have to restore the original scene and purge orphan data whether the import succeeded or failed — otherwise you accumulate stale mesh data across every file in the batch.


The Modal Operator Pattern

Blender's UI locks up if you run long operations on the main thread. The solution is a modal operator driven by a timer event. Each tick of the timer processes one file, then yields back to Blender so the UI stays responsive and the user can cancel with ESC.

def modal(self, context, event):
    wm = context.window_manager
    if event.type == "ESC":
        wm.assetsentinel_cancel_requested = True
    if event.type != "TIMER":
        return {"PASS_THROUGH"}
    if wm.assetsentinel_cancel_requested:
        wm.assetsentinel_status = "Cancelled"
        return self._finish(context, cancelled=True)
    if self._idx >= len(self._files):
        wm.assetsentinel_status = "Done"
        return self._finish(context, cancelled=False)
    # process one file per tick, then yield back to the UI
    path = self._files[self._idx]
    res = _evaluate_in_temp_scene(context, path, self._profile)
    self._results.append(res)
    self._idx += 1
    wm.assetsentinel_progress = int((self._idx / len(self._files)) * 100)
    return {"RUNNING_MODAL"}

The timer interval is 0.1 seconds. Fast enough to feel responsive, slow enough not to hammer the CPU.


Profile-Based Validation

Rather than hardcoding limits, AssetSentinel uses JSON profiles — one per target platform. Each profile defines the limits that matter for that destination:

{
  "UNITY_MOBILE": {
    "label": "Unity - Mobile",
    "max_tris": 20000,
    "max_texture_res": 2048,
    "max_material_slots": 4,
    "lod_required": true,
    "no_spaces": true,
    "min_bbox_m": 0.05,
    "max_bbox_m": 10.0,
    "tri_over_is_error": true,
    "lod_missing_is_error": false
  }
}

The per-rule _is_error flags let you configure severity independently: a tri count violation might be a hard error on mobile but only a warning on desktop. The profile system also auto-merges on load — if a user has an older profiles.json, new keys from the defaults get added without losing their custom values.
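That auto-merge can be sketched as a recursive dict merge — this is my illustration of the behavior described above, not the addon's actual code:

```python
def merge_profile(defaults: dict, user: dict) -> dict:
    """Add missing default keys into a user profile without
    overwriting any value the user has customised."""
    merged = dict(user)
    for key, value in defaults.items():
        if key not in merged:
            merged[key] = value
        elif isinstance(value, dict) and isinstance(merged[key], dict):
            merged[key] = merge_profile(value, merged[key])
    return merged
```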


The Checks

Here's what runs on every imported asset:

Geometry:

  • Triangle and vertex count via me.calc_loop_triangles() — falls back to bmesh triangulate if that fails
  • Ngon detection by checking p.loop_total > 4 on every polygon
  • Negative scale on any axis
  • Unapplied transforms (scale/rotation not at identity)
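The transform checks reduce to comparing object-level values against identity. A sketch of that logic as pure functions (names and tolerance are mine; in the addon the inputs would come from Object.scale and Object.rotation_euler):

```python
def has_unapplied_transforms(scale, rotation_euler, tol: float = 1e-6) -> bool:
    """True if object-level scale or rotation is not at identity."""
    scale_ok = all(abs(s - 1.0) <= tol for s in scale)
    rot_ok = all(abs(r) <= tol for r in rotation_euler)
    return not (scale_ok and rot_ok)

def has_negative_scale(scale) -> bool:
    """True if any axis scale is negative (flips the mesh inside-out
    on import into engines like Unity)."""
    return any(s < 0 for s in scale)
```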

Textures:

  • Total image count from bpy.data.images
  • External file resolution — whether the filepath exists on disk
  • Max texture resolution from img.size[0] and img.size[1]
  • Missing external textures list
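The resolution check itself is simple once the image data is in hand — a sketch, assuming per-image (name, width, height) tuples as read from bpy.data.images:

```python
def texture_violations(images, max_res: int):
    """Return names of images whose longest side exceeds the
    profile's max_texture_res cap."""
    return [name for name, w, h in images if max(w, h) > max_res]
```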

Naming:

  • Regex match against allowed_name_chars from the profile
  • Space detection in object and material names
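Both naming checks can be combined into one pass — a sketch of the idea, with the function name and example pattern being my own, not the addon's:

```python
import re

def name_violations(names, allowed_pattern: str, no_spaces: bool):
    """Return names containing spaces (when forbidden by the profile)
    or characters outside the profile's allowed_name_chars pattern."""
    rx = re.compile(allowed_pattern)
    bad = []
    for name in names:
        if no_spaces and " " in name:
            bad.append(name)
        elif not rx.fullmatch(name):
            bad.append(name)
    return bad
```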

Scale:

  • World-space bounding box via evaluated_get(depsgraph).bound_box
  • Compared against min_bbox_m and max_bbox_m in the profile
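Given the eight world-space corners of the bounding box, the comparison looks roughly like this (my sketch; in the addon the corners come from the evaluated object's bound_box transformed into world space):

```python
def bbox_violation(corners, min_m: float, max_m: float):
    """Check world-space bounding-box extents against profile limits.

    `corners` is the 8 corner points of the box as (x, y, z) tuples.
    Returns None if in range, else a short reason string.
    """
    xs, ys, zs = zip(*corners)
    extents = (max(xs) - min(xs), max(ys) - min(ys), max(zs) - min(zs))
    largest = max(extents)
    if largest < min_m:
        return f"too small ({largest:.3f} m < {min_m} m)"
    if largest > max_m:
        return f"too large ({largest:.3f} m > {max_m} m)"
    return None
```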

LOD:

  • Name-based detection — checks for _LOD0, _LOD1, _LOD2, _LOD3 substrings
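Name-based LOD detection boils down to collecting which suffixes appear across the imported objects — a sketch of that idea (function name is mine):

```python
def lod_levels_present(object_names):
    """Return the set of LOD levels found by _LOD0.._LOD3 suffix,
    e.g. {0, 1} for a mesh shipping two LODs."""
    levels = set()
    for name in object_names:
        for level in range(4):
            if f"_LOD{level}" in name:
                levels.add(level)
    return levels
```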

Import Failure Classification

One of the more useful features: when an import fails, it doesn't just say "import error." It classifies the failure and suggests a fix:

def _classify_import_failure(detail_line: str) -> tuple[str, str]:
    s = (detail_line or "").lower()
    if "khr_texture_basisu" in s:
        return (
            "UNSUPPORTED_EXTENSION",
            "Re-export without BasisU/KHR_texture_basisu and try again."
        )
    if "bad gltf" in s and "json error" in s:
        return (
            "INVALID_GLTF",
            "The glTF/GLB appears corrupted. Re-export from your DCC."
        )
    return ("UNKNOWN", "Try re-exporting and verify the file opens in Blender.")

This is genuinely useful in practice. KHR_texture_basisu failures come up frequently with AI-generated GLB files and photogrammetry exports. The error message from Blender's importer isn't helpful on its own — this surfaces the actual fix.


The HTML Report

After the scan, results are exported as an HTML report with a CSV alongside it. The report uses a dark theme, shows per-asset pass/warning/fail status with a triangle count bar against the profile limit, and includes filter controls to show only failed or warning assets.

The report also surfaces "top violations" — which checks failed most frequently across the batch. On a folder of 26 game assets, the most common failures were tri count overages and suspicious bounding box scales. Both are things that would have caused problems in the engine.
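The "top violations" aggregation is a straightforward frequency count over the batch results. A sketch, assuming each per-asset result dict carries a "violations" list of check IDs (my assumed shape, not necessarily the addon's):

```python
from collections import Counter

def top_violations(results, n: int = 5):
    """Rank check IDs by how often they failed across the batch."""
    counts = Counter()
    for res in results:
        counts.update(res.get("violations", []))
    return counts.most_common(n)
```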


Watch the Full Demo

[embed: https://www.youtube.com/watch?v=64qQiMLCSD0]

The video walks through a real batch scan of 26 assets against the Unreal Desktop profile — showing the scan running, reading the HTML report, and interpreting the results.


Get the Addon

AssetSentinel is available on Superhive (Blender Market): https://superhivemarket.com/products/assetsentinel

Supports Blender 3.6 and above. No external dependencies — pure Python, no PyPI packages.


What I'd Do Differently

A few things I'd approach differently if I started over:

Results storage. I'm serialising the results to JSON and storing them in a WindowManager property string. It works but it's awkward — a proper addon-level data store would be cleaner.

The modal pattern. The timer-based modal is the right approach for Blender, but the 0.1s interval is a guess. A smarter approach would adapt the interval based on file size or import time.
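One way to adapt the interval would be a simple linear scale on file size, clamped to a ceiling — the constants below are illustrative guesses, not tuned values:

```python
def adaptive_interval(file_size_bytes: int,
                      base: float = 0.05,
                      per_mb: float = 0.01,
                      cap: float = 0.5) -> float:
    """Scale the modal timer interval with file size: small files
    tick fast, a 100 MB FBX gets more breathing room."""
    mb = file_size_bytes / (1024 * 1024)
    return min(cap, base + per_mb * mb)
```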

Profile UI. Right now profiles are JSON files edited externally. An in-panel profile editor would make this accessible to artists who don't want to touch JSON.

Happy to answer questions about any part of the implementation — drop them in the comments.

