Platform Audit | 11 platforms · 55 sites · April 2026
SEO & Performance Audit

Photography Platforms:
What the Data Actually Shows

We audited the portfolio sites each platform showcases as its own best work, measuring Core Web Vitals, structured data, security headers, and AI-readability. Here is every number, unfiltered.

55 sites audited
11 platforms tested
5 scoring dimensions
1 platform with full security headers

01 / Methodology

How we selected sites and what we measured.

Every site in this audit was sourced directly from each platform's public "Case Studies" or "Featured Photographers" pages. These are not random samples — they are the sites each platform chooses to put in front of prospective customers. If a platform's best example underperforms, that's structural, not incidental.

Each site was scored across four weighted dimensions:

⚡ Performance & Load

Google Lighthouse performance score, plus raw load metrics: total requests, JavaScript file count, render-blocking scripts, Time to First Byte (TTFB), page weight (MB), and fully-loaded time.

🏗 Structural Integrity

Presence of a single H1, logical heading hierarchy (H1→H2→H3), semantic HTML elements (<main>, <nav>, <footer>), and alt-text coverage across all images.

🤖 AI Readability & Schema

Count and type of Schema.org structured data objects, Open Graph tags, and Twitter Card metadata — the signals AI crawlers (Gemini, ChatGPT, Perplexity) use to understand and surface content.

🔒 Security Headers

Presence of six server-level checks: HTTPS itself plus five HTTP security headers — Content-Security-Policy (CSP), HSTS, X-Frame-Options, X-Content-Type-Options, and Referrer-Policy. Each check is worth roughly 16.7 points of the security score.

Why this matters for SEO

Google's ranking signals include Core Web Vitals (speed), accessibility (alt text, heading structure), HTTPS, and structured data. A platform that consistently fails these in its own showcase sites is one that makes organic search harder for every photographer on it.
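The Lighthouse numbers reported throughout this audit can be reproduced for any public URL via Google's PageSpeed Insights API. A minimal Python sketch follows; the v5 endpoint and response path are taken from the public API but should be treated as assumptions to verify, and the function names are my own:

```python
import json
import urllib.parse
import urllib.request

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def build_psi_url(site, strategy="mobile", api_key=""):
    """Build a PageSpeed Insights v5 request URL for a site."""
    params = {"url": site, "strategy": strategy}
    if api_key:  # optional for light, unauthenticated use
        params["key"] = api_key
    return PSI_ENDPOINT + "?" + urllib.parse.urlencode(params)

def performance_score(site):
    """Fetch the mobile Lighthouse performance score (0-100). Needs network access."""
    with urllib.request.urlopen(build_psi_url(site)) as resp:
        data = json.load(resp)
    # Lighthouse reports category scores as 0-1 fractions.
    return data["lighthouseResult"]["categories"]["performance"]["score"] * 100
```

Running `performance_score` against a platform's own showcase site gives the same mobile score this audit reports.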

02 / Master Comparison Table

All scores are averages across three audited sites per platform. Performance and AI scores are out of 100.

Platform     | Perf | AI/Schema | Security | Structure | Page Weight | Avg JS Files | Avg TTFB | Total
Bablab       | 96.7 | 84.3      | 100.0    | 86.7      | 0.6 MB      | 0.7          | 126 ms   | 92.0
Photofolio   | 71.7 | 69.0      | 83.3     | 71.7      | 12.8 MB     | 2.3          | 378 ms   | 75.4
Wix          | 59.7 | 68.3      | 50.0     | 59.8      | 4.8 MB      | 90.3         | 86 ms    | 56.2
Squarespace  | 25.0 | 68.0      | 27.8     | 68.5      | 12.3 MB     | 27.3         | 162 ms   | 48.0
Zenfolio     | 53.3 | 57.8      | 38.9     | 61.7      | 1.7 MB      | 22.3         | 670 ms   | 51.7
Pixieset     | 41.3 | 57.5      | 66.7     | 59.1      | 0.89 MB     | 6.7          | 11 ms    | 63.5
Pixpa        | 47.7 | 46.3      | 83.3     | 55.2      | 6.3 MB      | 29.3         | 159 ms   | 51.7
Format       | 48.7 | 37.6      | 66.7     | 38.4      | 1.1 MB      | 15.0         | 119 ms   | 49.5
Smugmug      | 34.3 | 34.2      | 33.3     | 18.3      | 6.1 MB      | 41.7         | 276 ms   | 32.5
Cargo        | 54.3 | 30.4      | 16.7     | 21.7      | 1.9 MB      | 4.3          | 310 ms   | 42.6
Photoshelter | 46.0 | 25.0      | 27.8     | 5.0       | 3.6 MB      | 23.7         | 209 ms   | 27.7

03 / Platform Deep-Dives

Key findings, per-site breakdowns, and what each result means for discoverability.

Bablab Total Score: 92.0 / 100
Performance: 96.7
Schema Types: 8
Avg Requests: 10
Page Weight: 0.6 MB

Bablab is the clear outlier - in the right direction. All three featured sites averaged 10 total HTTP requests and loaded in under 900ms despite displaying 50+ images. Every site embeds the same 8 Schema types: CreativeWork, ImageGallery, ImageObject, ItemList, Person, SiteNavigationElement, WebPage, and WebSite - the most complete structured data set of any platform tested. Zero missing alt tags across all 152 images audited. The security score of 100 is achieved through platform-level headers, not individual site configuration. Open Graph and Twitter Card tags are present on all three sites, meaning social share previews work correctly across LinkedIn, Slack, and X.

  • sivanaskayo.com · Perf: 98 · 0.52 MB · 11 req · TTFB: 146ms
  • portraittoronto.com · Perf: 93 · 0.66 MB · 8 req · TTFB: 93ms
  • luisgarciafoto.com · Perf: 99 · 0.51 MB · 11 req · TTFB: 137ms

Squarespace Total Score: 48.0 / 100
Performance: 25.0
Schema Types: 3
Avg Requests: 87.3
Page Weight: 12.3 MB

Squarespace has the worst performance score of any platform tested, despite its sites often looking the most polished. The structural issue: an average of 3.3 render-blocking scripts per page (one site had 9), forcing the browser to pause rendering on every load. Page weight ranged from 2.3 MB to 31 MB - the heaviest single site in the entire audit. All three sites share the same Schema types (LocalBusiness, Organization, WebSite), which are generic and provide no image-specific context to AI crawlers. Security scores are weak: only one site scored above 16.7, meaning most security headers are absent. One site (ivanimages.com) had 169 images with zero missing alt tags - a rare bright spot.

  • crop45.com · Perf: 40 · 3.47 MB · 89 req · 9 render-blocking
  • ivanimages.com · Perf: 32 · 31.05 MB · 112 req · TTFB: 163ms
  • juliettecharvet.com · Perf: 3 · 2.34 MB · 61 req · 1 render-blocking

Wix Total Score: 56.2 / 100
Performance: 59.7
Schema Types: 3.3
Avg JS Files: 90.3
Avg TTFB: 86 ms

Wix presents a split personality. Its CDN is fast — an average TTFB of just 86ms, the second-best in the audit — but it offsets that with an extraordinary JavaScript payload. The three tested sites loaded 77, 100, and 94 JS files respectively, alongside 131–189 total HTTP requests per page. One site (decmichal.com) weighed 10.9 MB. The SEO score is a strong 100 (matching Bablab), and Wix does generate some Schema, though the types are generic (WebSite, LocalBusiness, SearchAction), not photography-specific. Accessibility scored highest of any platform at 91.7 avg. Security is uniform but middling: all three sites scored 50/100, with HTTPS present and half of the checks failing.

  • frebermedia.com · Perf: 71 · 1.09 MB · 131 req · 77 JS files
  • decmichal.com · Perf: 36 · 10.87 MB · 188 req · 100 JS files
  • yukaidu.com · Perf: 72 · 2.56 MB · 189 req · 94 JS files

Photoshelter Total Score: 27.7 / 100
Performance: 46.0
Schema Types: 0
Avg Requests: 85
Avg Fully Loaded: 3,547 ms

Photoshelter is the weakest performer in the audit overall. All three sites returned zero Schema types and zero Twitter Cards, leaving almost no structured data signal for any crawler, AI or otherwise (only baseline Open Graph tags were detected; see section 04). Average fully-loaded time was 3.5 seconds, driven by an average of 85 requests and 3 render-blocking scripts per page. The lowest-scoring individual site (mike-pickles.com) scored 22/100 total. Structure scores were effectively zero: no semantic HTML elements detected across any of the three sites. One site (portfolio.joemcnally.com) made 99 HTTP requests.

  • helenewiesenhaan.photoshelter.com · Perf: 49 · 3.37 MB · 71 req · 2,966ms load
  • portfolio.joemcnally.com · Perf: 60 · 3.53 MB · 99 req · 3,858ms load
  • mike-pickles.com · Perf: 29 · 3.89 MB · 85 req · 3,818ms load

Smugmug Total Score: 32.5 / 100
Performance: 34.3
Schema Types: 0
Avg JS Files: 41.7
Avg Fully Loaded: 3,111 ms

Smugmug consistently scores near the bottom across every dimension. Zero Schema across all three sites. Average of 41.7 JavaScript files per page (one site, vonwong.com, loaded 42 JS files and weighed 15.5 MB). All three sites average over 3 seconds to fully load. Structure score of 18.3 reflects absent semantic HTML. On the positive side, Smugmug sites tend to have clean alt-text records, though this is partly because image counts in the audited pages were low to zero, making it a hollow metric here.

  • gilmoregang.com · Perf: 28 · 1.85 MB · 67 req · 42 JS files
  • portfolio.shoottokyo.com · Perf: 37 · 0.89 MB · 64 req · 41 JS files
  • vonwong.com · Perf: 38 · 15.5 MB · 116 req · 42 JS files

Cargo Total Score: 42.6 / 100
Performance: 54.3
Schema Types: 0
Max Missing Alt: 334
Security: 16.7

Cargo's load numbers are acceptable relative to the field (1.9 MB average, only 4.3 JS files), but the SEO fundamentals are broken: zero Schema across all three sites, zero semantic HTML, the lowest security score of any platform (16.7), and a catastrophic alt-text record — one site (danwilton.co.uk) had 334 images with zero alt tags, leaving every single image invisible to screen readers and image search. OG and Twitter Card tags are the lone bright spot (present on all three sites, per section 04). Cargo appears optimised for visual presentation with no infrastructure for search discoverability.

  • danwilton.co.uk · 334 images · 0 alt tags · Perf: 43
  • estudioblende.com · Perf: 61 · 0.65 MB · 18 req · TTFB: 310ms
  • hugomapelli.com · Perf: 59 · 3.23 MB · 70 req · 2,146ms load

04 / AI Readability & Structured Data

Schema.org implementation across all platforms - the data layer that AI-powered search relies on.

Generative search engines (Google SGE, ChatGPT search, Perplexity) don't read your site the way a human does. They consume structured signals: Schema.org objects that explicitly describe who you are, what your images depict, and where you work. A portfolio without these signals is effectively anonymous to AI-driven discovery.

⚠ Critical Finding

4 of 11 platforms returned zero Schema types across all their featured sites: Photoshelter, Smugmug, Cargo, and Photofolio. Format and Pixpa came close, with a lone WebSite object on one of three sites each. These platforms provide essentially no structured data signal to AI crawlers. In an era where AI increasingly mediates search discovery, this is a foundational gap.

Schema Types Found Per Platform

Platform     | Avg Schema Count  | Types Detected
Bablab       | 8.0               | CreativeWork, ImageGallery, ImageObject, ItemList, Person, SiteNavigationElement, WebPage, WebSite
Zenfolio     | 3.0 (1 site only) | BreadcrumbList, ImageObject, WebPage, WebSite (+ more on carolinetran.net only)
Squarespace  | 3.0               | LocalBusiness, Organization, WebSite (all 3 sites identical)
Wix          | 3.3               | WebSite, LocalBusiness, SearchAction, PostalAddress (varies by site)
Pixieset     | 1.0               | WebSite only (all 3 sites)
Pixpa        | 0.3               | WebSite (1 of 3 sites only)
Format       | 0.3               | WebSite (1 of 3 sites only)
Photofolio   | 0                 | None detected
Photoshelter | 0                 | None detected
Smugmug      | 0                 | None detected
Cargo        | 0                 | None detected

What ImageObject and Person Schema actually do

The ImageObject Schema type lets you attach metadata (caption, subject, location, copyright) directly to each image in a machine-readable format. Person Schema ties your name, location, and specialty to the domain. Together they give AI crawlers a rich, authoritative signal for surfaces like Google's AI Overviews, ChatGPT Browse, and Perplexity answers. A WebSite-only Schema (like Pixieset) tells crawlers almost nothing beyond the site's existence.
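To make this concrete, a combined Person + ImageObject block can be emitted as JSON-LD. The sketch below is illustrative only — the names, URLs, and captions are hypothetical placeholders, not data from the audit; the property names (`contentUrl`, `caption`, `@graph`) follow Schema.org's published vocabulary:

```python
import json

def portfolio_jsonld(name, site, images):
    """Render a Schema.org Person + ImageGallery JSON-LD block.

    `images` is a list of (url, caption) pairs. All values passed in
    are illustrative placeholders, not audit data.
    """
    doc = {
        "@context": "https://schema.org",
        "@graph": [
            {"@type": "Person", "name": name, "url": site},
            {
                "@type": "ImageGallery",
                "url": site,
                "image": [
                    {"@type": "ImageObject", "contentUrl": url, "caption": cap}
                    for url, cap in images
                ],
            },
        ],
    }
    # The result is embedded in the page inside
    # <script type="application/ld+json"> ... </script>
    return json.dumps(doc, indent=2)

snippet = portfolio_jsonld(
    "Jane Doe", "https://example.com",  # hypothetical photographer and domain
    [("https://example.com/img/01.jpg", "Harbour at dusk")],
)
```

Platforms that score well here inject this kind of block automatically for every gallery page; a WebSite-only implementation stops at the first object in the graph.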

Open Graph & Twitter Cards

OG tags control how your site appears when shared on LinkedIn, Slack, Facebook, and iMessage — custom image preview, title, description. Twitter Cards do the same for X/Twitter. These are platform-level defaults: either the platform injects them automatically or it doesn't.

Platform     | Open Graph | Twitter Cards | Notes
Bablab       | ✓ 3/3      | ✓ 3/3         | Consistent across all sites
Cargo        | ✓ 3/3      | ✓ 3/3         | Consistent across all sites
Format       | ✓ 3/3      | ✓ 3/3         | Consistent across all sites
Pixieset     | ✓ 3/3      | ✓ 3/3         | Consistent across all sites
Pixpa        | ✓ 3/3      | ✓ 3/3         | Consistent across all sites
Smugmug      | ✓ 3/3      | ✓ 3/3         | Consistent across all sites
Squarespace  | ✓ 3/3      | ✓ 3/3         | Consistent across all sites
Wix          | ✓ 3/3      | ✓ 3/3         | Consistent across all sites
Photoshelter | ✓ 3/3      | ✗ 0/3         | No Twitter Card support
Zenfolio     | ✓ 3/3      | ✗ 0/3         | No Twitter Card support
Photofolio   | ½ 2/3      | ✗ 0/3         | OG inconsistent; no Twitter Cards

What this means in practice

OG tags are now a baseline expectation — most platforms get them right. The real differentiator is Twitter Cards: Photoshelter, Zenfolio, and Photofolio don't implement them, so every share of your work on X renders as a plain link with no image preview. Photofolio's inconsistent OG implementation (2/3 sites) suggests the tags are left to the photographer rather than injected by the platform.
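Checking a site for these tags takes a few lines of standard-library Python. The sketch below (class and function names are my own) collects every `og:*` and `twitter:*` meta tag from a page's HTML source:

```python
from html.parser import HTMLParser

class SocialMetaParser(HTMLParser):
    """Collect og:* and twitter:* <meta> tags from an HTML document."""

    def __init__(self):
        super().__init__()
        self.tags = {}

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        # OG tags use property="og:…"; Twitter Cards use name="twitter:…".
        key = attrs.get("property") or attrs.get("name") or ""
        if key.startswith(("og:", "twitter:")):
            self.tags[key] = attrs.get("content", "")

def social_meta(html):
    """Return a dict of social share tags found in an HTML string."""
    parser = SocialMetaParser()
    parser.feed(html)
    return parser.tags
```

An empty result for `twitter:card` on a platform's own showcase site is exactly the failure mode the table above records for Photoshelter, Zenfolio, and Photofolio.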

05 / Security Headers

HTTP security headers across all 11 platforms. Each header is a Google trust signal and a user-safety mechanism.

Security headers are set at the server level — meaning the platform controls them, not the photographer. A missing Content-Security-Policy (CSP) leaves a site vulnerable to cross-site scripting; missing HSTS means browsers may not enforce HTTPS. These aren't obscure hardening details — they're baseline trust signals that affect both user safety and, increasingly, search ranking.

Key Finding

Bablab is the only platform to pass all six security checks consistently (HTTPS plus the CSP, HSTS, X-Frame-Options, X-Content-Type-Options, and Referrer-Policy headers) across all three tested sites, scoring a perfect 100. Every other platform has at least one critical gap, and CSP is missing everywhere except on Bablab, Pixpa, and Pixieset. Two Squarespace sites don't even serve over HTTPS.

The matrix below shows per-platform averages. A ✓ means the header was present on all three tested sites; ½ means present on some; ✗ means absent on all.

Platform     | HTTPS | CSP | HSTS | X-Frame | X-Content | Referrer
[The per-cell ✓/½/✗ marks in this matrix did not survive extraction. From the surrounding text: Bablab passes all six checks on every site; Format's row carried four ½ marks, Zenfolio's and Squarespace's two each, and Photoshelter's two. Platforms listed, in order: Bablab, Photofolio, Pixpa, Pixieset, Format, Wix, Zenfolio, Squarespace, Smugmug, Photoshelter, Cargo.]

✓ = present on all 3 sites · ½ = present on some sites · ✗ = absent on all sites. Two Squarespace showcase sites (ivanimages.com, juliettecharvet.com) were served over plain HTTP - flagged as ½ on HTTPS.
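The six-check scoring can be sketched in a few lines of Python. This is a minimal illustration, not the audit's actual tooling: the equal ~16.7-point weighting is inferred from the score increments in the data, the header names are those listed in this section, and the function names are my own:

```python
import urllib.request

# The five headers checked in this audit; HTTPS itself is the sixth check.
SECURITY_HEADERS = [
    "content-security-policy",
    "strict-transport-security",   # HSTS
    "x-frame-options",
    "x-content-type-options",
    "referrer-policy",
]

def security_score(url, response_headers):
    """Score 0-100 with equal weight per check (~16.7 points each)."""
    passed = 1 if url.startswith("https://") else 0
    present = {k.lower() for k in response_headers}
    passed += sum(1 for h in SECURITY_HEADERS if h in present)
    return round(passed * 100 / 6, 1)

def audit_headers(url):
    """Fetch a URL and score its live response headers (needs network access)."""
    with urllib.request.urlopen(url) as resp:
        return security_score(url, dict(resp.headers))
```

Under this scheme a site serving HTTPS but no headers lands at 16.7 (Cargo's average), and three of six checks yields exactly the 50.0 the Wix sites recorded.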

06 / How to Evaluate a Platform Before Committing

A concrete checklist based on the gaps this audit found.

The most reliable test: find the sites a platform publicly showcases and run them through PageSpeed Insights and Google's Rich Results Test. Their best sites define the ceiling of what the platform's architecture can deliver.

  • [1] Run a featured site through PageSpeed Insights. Look for a Performance score above 70 on mobile. Below 50 means structural problems with the platform itself, not just the individual site. Check "Render Blocking Resources" — more than 2 is a red flag.
  • [2] Check for photography-specific Schema types. Use Google's Rich Results Test, or view-source and search for schema.org. A platform generating only WebSite Schema is doing the bare minimum. Look for ImageObject, ImageGallery, and Person — these are the types that give AI crawlers meaningful context about your work.
  • [3] Count the JavaScript files. Open the browser DevTools Network tab on a featured site, filter by JS, and count. Under 20 is healthy. Over 50 means the platform is imposing significant overhead on every page view — regardless of how fast the initial TTFB feels.
  • [4] Check alt-text coverage. In PageSpeed Insights, look for "Image elements do not have [alt] attributes". If a platform's own showcase site fails this, the platform either doesn't prompt for alt text or actively strips it. This directly affects image search indexing and accessibility compliance.
  • [5] Test security headers. Run the URL through securityheaders.com. A grade of D or F means the platform hasn't implemented basic HTTP security headers. This is entirely platform-controlled — you can't fix it yourself.
  • [6] Check for Open Graph tags. View-source any featured site and search for og:image. If it's absent, every share of your work on LinkedIn, Facebook, or Slack will render as a blank preview — no image, no formatted title.

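Several of the checklist steps above can be approximated offline against a page's saved HTML source. The standard-library Python sketch below covers items 3, 4, and 6 in one pass; the helper names are my own, and DevTools plus PageSpeed Insights remain the authoritative tools:

```python
from html.parser import HTMLParser

class ChecklistParser(HTMLParser):
    """Tally external JS files, images, missing alt text, and og:image."""

    def __init__(self):
        super().__init__()
        self.js_files = 0
        self.images = 0
        self.missing_alt = 0
        self.has_og_image = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "script" and attrs.get("src"):
            self.js_files += 1
        elif tag == "img":
            self.images += 1
            if not attrs.get("alt"):  # missing or empty alt attribute
                self.missing_alt += 1
        elif tag == "meta" and attrs.get("property") == "og:image":
            self.has_og_image = True

def quick_audit(html):
    """Run the offline checklist checks against an HTML string."""
    parser = ChecklistParser()
    parser.feed(html)
    return {
        "js_files": parser.js_files,        # item 3: under 20 is healthy
        "missing_alt": parser.missing_alt,  # item 4: should be zero
        "og_image": parser.has_og_image,    # item 6: must be present
    }
```

Save a featured site's HTML (View Source, or `curl`) and feed it to `quick_audit`; a platform whose showcase page fails items 3, 4, and 6 here will almost certainly fail them for your portfolio too.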
All data collected April 2026. Sites audited are those publicly listed as case studies or featured examples by each platform at time of testing. Scores are averages across three sites per platform. PageSpeed Insights data reflects mobile Lighthouse scores.