3D Texture Model

YEAR

2022-23

ABOUT

Poly started when we built the first state-of-the-art model for 3D texture generation, allowing designers to create production-ready AI-generated textures: PBR materials at 8K resolution with 32-bit EXR maps, generated from just a photo or text prompt and customizable in a full editor.


With over 200K monthly active users and 100K+ daily generations, we also ran a large-scale creator program that produced human-generated training data as inputs to our reward-model pipeline.


KEY INITIATIVES

Hand-curated dataset; human data generation program

Pre-Training Dataset

[HAND-CURATED DATASET]

I started by assembling an initial dataset of 30,000 high-quality, manually constructed samples sourced from:


  • Procedural textures created in tools like Adobe Substance and Blender

  • Photogrammetry and PBR-scanned real-world materials

  • Community and licensed libraries such as Poliigon, Quixel Megascans, and Poly Haven
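As a sketch of how a curated dataset like this might be organized (the field names and `Source` categories below are illustrative, not Poly's actual schema):

```python
from dataclasses import dataclass, field
from enum import Enum

class Source(Enum):
    PROCEDURAL = "procedural"     # made in Substance / Blender
    SCAN = "photogrammetry"       # PBR-scanned real-world material
    LIBRARY = "licensed_library"  # Poliigon, Quixel Megascans, Poly Haven

@dataclass
class TextureSample:
    """One hand-curated training sample: a texture plus its PBR maps."""
    sample_id: str
    source: Source
    resolution: int  # pixels per side, e.g. 8192 for 8K
    maps: dict[str, str] = field(default_factory=dict)  # map name -> EXR path

sample = TextureSample(
    sample_id="tex-000123",
    source=Source.SCAN,
    resolution=8192,
    maps={"albedo": "tex-000123_albedo.exr",
          "normal": "tex-000123_normal.exr"},
)
```

Keeping the source category on every sample makes it easy to balance the mix of procedural, scanned, and library material during training.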


Human Data

[POST-TRAINING DATA GENERATION]

After post-training, I launched a large-scale creator program to generate high-quality, human-made texture outputs using our tool. These outputs served two purposes:


  1. Feeding our reward model pipeline with diverse inputs

  2. Populating a new Gallery section that let users quickly browse textures and directly edit pre-generated materials
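One common way human outputs feed a reward-model pipeline is as preference pairs; here is a minimal sketch, assuming each submission carries a prompt, an output ID, and a reviewer rating (all hypothetical field names, not Poly's actual format):

```python
from itertools import combinations

def to_preference_pairs(submissions):
    """Turn rated submissions into (chosen, rejected) output-ID pairs.

    Submissions sharing a prompt are compared; whenever two differ in
    rating, the higher-rated one becomes the "chosen" example.
    """
    by_prompt = {}
    for s in submissions:
        by_prompt.setdefault(s["prompt"], []).append(s)
    pairs = []
    for group in by_prompt.values():
        for a, b in combinations(group, 2):
            if a["rating"] == b["rating"]:
                continue  # ties carry no preference signal
            chosen, rejected = (a, b) if a["rating"] > b["rating"] else (b, a)
            pairs.append((chosen["output_id"], rejected["output_id"]))
    return pairs
```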


[TASK DESIGN]

With no specialized 3D vendors available at the time, we built our own network of 3D artists and design studios to handle the workflows. We provided step-by-step guides and ran hands-on workshops to help creatives get the most out of our model and consistently produce high-quality results.


[QUALITY CONTROL]

We set up a custom database and submission process that allowed creatives to manage their outputs and track progress. Each submission was hand-reviewed by me using our evaluation frameworks.


To ensure both variety and quality in the dataset, we established weekly category-specific targets and continuously raised quality benchmarks over time.
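Tracking those weekly category targets amounts to a simple shortfall check; a sketch, assuming accepted submissions are counted per category (function and field names hypothetical):

```python
def weekly_shortfalls(accepted, targets):
    """Return the categories still under this week's target, and by how much."""
    return {cat: goal - accepted.get(cat, 0)
            for cat, goal in targets.items()
            if accepted.get(cat, 0) < goal}
```

A check like this flags which material categories need more submissions before the week closes, keeping the dataset's coverage from skewing toward the easiest categories.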


[PROMPT GUIDES]

Writing prompt guides was key to helping creators achieve high-quality results. We taught prompt-engineering techniques such as using specific adjectives, adding environmental and detail descriptors, and defining clear styles and material types.


These were paired with instructions for the rest of the texture generation workflow, including patch selection, upscaling, and choosing PBR material types.
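The structure those guides taught can be sketched as a small prompt builder (the argument names and composition order are illustrative, not the actual guide's recipe):

```python
def build_prompt(material, adjectives=(), detail=None, environment=None, style=None):
    """Compose a texture prompt: adjectives + material, then optional descriptors."""
    head = " ".join(list(adjectives) + [material])
    tail = [part for part in (detail, environment, style) if part]
    return ", ".join([head] + tail)

prompt = build_prompt(
    "oak planks",
    adjectives=("weathered", "moss-covered"),
    detail="fine grain",
    style="photorealistic",
)
# -> "weathered moss-covered oak planks, fine grain, photorealistic"
```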


[RESULTS AND ANNOTATION]

In total, we produced over 6,000 unique outputs, each meticulously annotated and tagged. Importantly, the dataset was created in the context of the product itself, capturing not just the final textures but also the associated maps and the generation settings behind them.
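A record capturing that generation context might look like the following sketch (field names and values hypothetical):

```python
import json

def annotation_record(output_id, prompt, settings, maps, tags):
    """Serialize one annotated output together with how it was generated."""
    return json.dumps({
        "output_id": output_id,
        "prompt": prompt,
        "settings": settings,  # e.g. selected patch, upscaling factor, PBR type
        "maps": maps,          # map name -> file path
        "tags": sorted(tags),
    }, indent=2)

record = annotation_record(
    output_id="out-004217",
    prompt="weathered oak planks, fine grain",
    settings={"upscale": "4x", "material_type": "wood"},
    maps={"albedo": "out-004217_albedo.exr"},
    tags=["wood", "planks", "weathered"],
)
```

Storing settings alongside the texture is what makes the dataset reproducible within the product, rather than a loose collection of images.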


Texture Generation Tool

[RESULTS]

We became the leading 3D texture generation tool and model, reaching over 200K monthly active users and 100K+ daily generations. Our system outperformed competing startups and research previews from Nvidia and Adobe, and was featured in all major 3D design publications.


[OUTPUTS IN USE]

More importantly, we became an integral part of many 3D artists' workflows: our textures were used in professional design projects and shared millions of times across platforms like YouTube, showing real-world creative adoption at scale.
