Methodology

How we test mockups.

TheLayer scores mockups as working files, not as thumbnails. A scored review requires evidence: test data, screenshots, a documented test date, a methodology version, and a human editorial review.

Scoring rubric (100 points)

Each criterion carries a fixed weight; the weights sum to 100 points.

Realism (18 pts): Light/shadow plausibility and surface fidelity at 100% zoom.
Smart-object workflow (12 pts): Replace-contents reliability, transform fidelity, layer order.
Displacement quality (13 pts): Texture warp accuracy on dimensional materials (fabric, paper, etc.).
Shadow & highlight handling (12 pts): Tonal range under different brand artwork; banding control.
Layer organization (8 pts): Named groups, disabled layers, hidden helpers, masking discipline.
Resolution & PPI (8 pts): Effective output size at 100% scale for typical print and screen use.
Color management (8 pts): Profile embedded, working space, soft-proof viability.
License clarity (10 pts): Commercial-use scope, POD limits, redistribution terms documented.
Price & value (6 pts): Value relative to comparable files of the same category.
Use-case fit (5 pts): Match between marketing claims and realistic production use.
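The rubric's arithmetic can be sketched as a weighted sum. The criterion names come from the table above; the `score_review` helper and the 0.0-1.0 per-criterion marks are assumptions for illustration, not TheLayer's actual tooling.

```python
# Hypothetical sketch of the 100-point rubric; weights are taken from the table.
RUBRIC_WEIGHTS = {
    "Realism": 18,
    "Smart-object workflow": 12,
    "Displacement quality": 13,
    "Shadow & highlight handling": 12,
    "Layer organization": 8,
    "Resolution & PPI": 8,
    "Color management": 8,
    "License clarity": 10,
    "Price & value": 6,
    "Use-case fit": 5,
}

assert sum(RUBRIC_WEIGHTS.values()) == 100  # the weights sum to 100 points


def score_review(marks: dict[str, float]) -> float:
    """Combine per-criterion marks (0.0 to 1.0) into a 100-point score."""
    return sum(weight * marks.get(name, 0.0) for name, weight in RUBRIC_WEIGHTS.items())
```

A file that earns full marks on every criterion scores exactly 100; a missing criterion contributes nothing.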

Publishing gate

No review may claim testing or display a score without (1) supporting evidence, (2) a named human reviewer, and (3) a recorded methodology version. No AggregateRating structured data is emitted unless real, independently sourced user ratings exist and are visible on the page.

Frequently asked questions

Do you publish unscored opinions?

Yes — explicitly labelled as opinion or note. Scored reviews require evidence (test data, screenshots, license check, named human reviewer). Opinion pieces never display a numeric score or AggregateRating.

Do you accept mockups for review from publishers?

Yes. We test publisher-supplied files using the same rubric. The relationship is disclosed at the top of the article in a DisclosureBox component, and any commercial or affiliate link uses rel="sponsored".
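The rel="sponsored" rule can be enforced at render time. A minimal sketch, assuming a hypothetical `affiliate_link` helper rather than the site's actual templating:

```python
# Sketch: every commercial/affiliate anchor carries rel="sponsored",
# so the link attribute can't be forgotten on individual pages.
def affiliate_link(url: str, text: str) -> str:
    """Render an affiliate anchor with the required rel attribute."""
    return f'<a href="{url}" rel="sponsored">{text}</a>'
```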

How do you handle AggregateRating schema?

We do not emit AggregateRating without real, independently sourced ratings visible on the page. Per Google structured data policy, AggregateRating cannot be invented from the editor's own opinion.
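That rule amounts to a conditional when building the page's JSON-LD: attach AggregateRating only when genuine user ratings exist. The `build_review_jsonld` helper and its field choices are assumptions for illustration; the schema.org property names are real.

```python
# Sketch: AggregateRating is emitted only from real user ratings,
# never invented from the editor's own score.
import json


def build_review_jsonld(name: str, user_ratings: list[int]) -> str:
    """Build Product JSON-LD; omit aggregateRating when no real ratings exist."""
    data = {"@context": "https://schema.org", "@type": "Product", "name": name}
    if user_ratings:  # real, independently sourced ratings visible on the page
        data["aggregateRating"] = {
            "@type": "AggregateRating",
            "ratingValue": round(sum(user_ratings) / len(user_ratings), 1),
            "ratingCount": len(user_ratings),
        }
    return json.dumps(data)
```

With an empty ratings list the structured data contains no AggregateRating at all, which is the safe default.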

How often is the methodology revised?

The methodology version is recorded on every scored review. We update it when measurement methods change — typically once or twice a year — and the prior version remains accessible.

Versioning

This is methodology version 2026.05. The version recorded on each scored review pins the rubric used at publication time, so historical scores remain interpretable when the rubric is later revised.