Protected Presentation
COMMB × Vertical Impression · March 2026

Asking what people did is one methodology. Calculating it live is another.

The Office Study asks. AVA measures attention, presence, and behaviour every day, in real time. We're here to bring live data into COMMB's framework — to let the data inform the metrics for dwell, traffic, and trip counts.

Date
Monday, May 4
Duration
60 minutes
Lead
Nicolette Leonardis
Outcome
Peer alignment + joint advocacy
01 · Why we're here · 2 minutes · Nikki

For behavioural metrics: calculating beats asking. Survey stays for sentiment, intent, opinion.

Focus Area 1
Dwell Time
Measured to the millisecond
Focus Area 2
Ground Truth Counts
Consistent-bias proxy for traffic
Focus Area 3
Trip Counts
Elevator + lobby, validated
Focus Area 4
Supplementary Insights
For Office Network Study releases
02 · The methodology today · 3 minutes · Nikki

The Office Study is the shared foundation.

Survey-based, audited, used by every office network operator in Canada. The starting point — not the problem.

17 questions. 99.6% attention. 93% segment overlap. The numbers are right — they stay.

Question on the table: can some of those answers be made measurably sharper by a deterministic layer underneath?

2025 Office Study
17
Questions across demographics, behaviour, attitude
Attention Finding
99.6%
Of professionals pay full or some attention
Audience Segments
2
Professionals + Decision Makers (93% overlap)
Operators Aligned
All
Shared methodology across the industry
🏛️
The frame for today: the current methodology is agreed upon, audited, and out in the market — and it should be. We're here to propose the next layer that sits inside it: continuous, deterministic measurement of the metrics where calculation produces a better answer than asking.
03 · Why calculating live matters · 4 minutes · Nikki

The 2025 survey was a moment. We've been calculating every day since.

Here's what continuous calculation has already told us about Canada's office presence — that the static survey was structurally unable to catch.

📡
The setup: the 2025 Office Study captured a single window. AVA has been measuring continuously since the day it closed — every screen, every weekday, every event. The result: a year of behavioural data the survey methodology cannot produce.
Office Presence YoY
+2.4%
2025 → 2026 weekday avg · calculated live
Tuesday Surge
+7.3%
Strongest mid-week growth
Wednesday Growth
+5.5%
Mid-week peak activity
Daily Lift Avg
+410
Additional events / weekday in 2026

None of these numbers existed when the 2025 study closed. Survey captures a moment. Calculation captures the movement.

Tuesday + Wednesday are the new office days. Mid-week is consolidating. Ready for the next Office Network data refresh — no survey cycle required.

Survey methodology vs AVA Vision Events — side by side
Dimension | Survey Methodology | AVA Vision Events
Approach | Asks people what they did | Calculates what people did
Data type | Self-reported recall | Live measurement
Sample | Panel-based extrapolation | Census · every building
Cadence | Periodic study cycle | Continuous · every event
Granularity | Aggregate | Sub-hourly · per location
Best for | Sentiment · intent · opinion | Dwell · presence · traffic · trips
Bias profile | Recall bias · social desirability | Systematic undercount (consistent)
🎯
The frame: we're not asking the survey to be something it's not. We're asking the methodology to use the right tool for each metric. Asking for sentiment. Calculating for behaviour.
04 · AVA Vision Events · 5 minutes · Nikki

A vision event is the discrete unit of attention we measure.

On-device. Anonymous. Five conditions — all of which must be met simultaneously.

On-device · edge-processed · anonymous. No images, no video, no PII ever leaves the screen — only counts.

Five conditions gate every vision event. All five required. Conservative by design.

Vertical Impression · AVA System · Anonymous Video Analytics · Live Screen Data

[Visual: VI digital elevator screen, live AVA capture. Ad creative on screen with an AVA CAM overlay; one tracked viewer at 6.8s ▲ of attention, another at 4.2s. A viewer looking away registers NO EVENT.]

Live Screen Data · sample location · last 30 min: Avg Attention 7.1s (+31%) · 👥 Vision Events 284 today · 🎯 Engagement Rate 78% (>3s)
Demographic breakdown: Female 52% · Male 48% · 18–34 36% · 35–54 46% · 55+ 18%

How a Vision Event is Triggered
1
Face Detection
A human face is identified within the camera's field of view
2
Gaze Confirmed
Face angle and eye alignment must be oriented toward the screen
3
Duration Recorded
Attention time measured to the millisecond while gaze and quality hold
4
Anonymous Output
No images stored — only counts, durations, timestamps
🔒 On-device · Zero PII · Privacy by Design certified (KPMG-audited) · 📜 Patented · 🍁 Funded by the Government of Canada
The 5 conditions of a valid vision event
👤
Factor 1
Face Detection
A human face must be detected in frame
🧭
Factor 2
Face Angle
Oriented toward the screen — profiles excluded
👁️
Factor 3
Eye Alignment
Gaze direction confirmed toward screen plane
⏱️
Factor 4
Duration
Held for minimum threshold, measured in ms
🔍
Factor 5
Image Quality
Detection clarity above quality floor
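The five-factor gate can be sketched as a single predicate. This is a minimal illustration, not AVA's actual code: the field names are invented for the sketch, and the 1-second minimum duration and 0.6 quality floor are assumptions, since the real thresholds are not public.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """One per-frame face detection. Field names are illustrative,
    not AVA's actual schema."""
    face_found: bool       # Factor 1: a human face in frame
    face_angle_ok: bool    # Factor 2: oriented toward the screen
    gaze_on_screen: bool   # Factor 3: eye alignment toward the screen plane
    duration_ms: int       # Factor 4: how long the gaze has held
    quality_score: float   # Factor 5: detection clarity, 0.0 to 1.0

# Assumed thresholds; the real minimums are not public.
MIN_DURATION_MS = 1000
QUALITY_FLOOR = 0.6

def is_vision_event(d: Detection) -> bool:
    """All five conditions must hold simultaneously (conservative by design)."""
    return (d.face_found
            and d.face_angle_ok
            and d.gaze_on_screen
            and d.duration_ms >= MIN_DURATION_MS
            and d.quality_score >= QUALITY_FLOOR)
```

Any single failed condition (a profile view, a glance below the duration floor, a blurry frame) yields no event, which is where the systematic undercount discussed in section 06b comes from.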
🎬 05 · Live Tech Demo · 18 minutes · Han · screensharing
Live screenshare · Han

From model to methodology — at the code level.

Han will walk through the actual AVA code panel. Specifically: how opportunity-to-see is distinguished from attention. How attention is calculated. How sentiment is captured. How pitch and yaw of the jaw confirm gaze. These are the methodology questions, answered at the code level.

Opportunity-to-See vs Attention · Attention calculation · Sentiment capture · Jaw pitch & yaw · Code-panel walkthrough
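The first distinction on that list can be sketched in a few lines. Illustrative only, not the actual panel code: a face in the camera's field of view is merely an opportunity-to-see; it only counts as attention once gaze toward the screen is confirmed.

```python
def classify(face_in_fov: bool, gaze_on_screen: bool) -> str:
    """Sketch of the OTS vs attention distinction (not AVA's code):
    presence in frame alone is only an opportunity-to-see."""
    if not face_in_fov:
        return "no event"
    return "attention" if gaze_on_screen else "opportunity-to-see"

print(classify(True, False))  # opportunity-to-see
print(classify(True, True))   # attention
```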

Han takes the screen. Agenda minimizes. Resume here for section 6.

06 · The four COMMB focus areas · 12 minutes · Gio + Derek

How vision events answer the four questions COMMB put on the table.

Dwell time. Ground truth counts. Trip counts. Supplementary insights. One at a time.

06a · Dwell Time

Dwell time is the literal time gaze held — measured to the millisecond.

Dwell time = the literal duration for which gaze and the four other conditions held simultaneously. Milliseconds. Not self-reported. Maps to COMMB rows 7 & 8.

# | COMMB Metric | VI Counterpart | Data Source | Goal
7 | Avg. Time in Elevator | Avg. Duration in Elevator | Elevator vision event duration | Validate survey trends
8 | Avg. Time in Lobby | Avg. Duration in Lobby | Lobby vision event duration | Validate survey trends
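The dwell metric in rows 7 and 8 amounts to accumulating gaze time while the event gate holds. A hedged sketch, assuming a fixed frame interval (a simplification; real capture timing varies):

```python
def dwell_durations_ms(gate_held, frame_interval_ms=100):
    """Split a per-frame gate signal into discrete dwell spells and
    return each spell's duration in milliseconds.

    gate_held: booleans, one per frame; True while all five event
    conditions hold simultaneously. The fixed frame interval is an
    assumption for illustration."""
    durations, current = [], 0
    for held in gate_held:
        if held:
            current += frame_interval_ms
        elif current:
            durations.append(current)
            current = 0
    if current:
        durations.append(current)
    return durations

# Three frames of held gaze, a break, then two more frames:
spells = dwell_durations_ms([True, True, True, False, True, True])
print(spells)                     # [300, 200]
print(sum(spells) / len(spells))  # 250.0, the avg duration for this screen
```

Averaging those spell durations per location gives the "Avg. Duration in Elevator / Lobby" counterparts in the table.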
06b · Ground Truth Visitor Counts

A consistent-bias proxy — and we want to be explicit about that.

⚠️
Saying it before anyone else does: AVA undercounts. A person looking down at their phone, or facing the elevator door, or at the back of a crowded lobby — does not register a vision event. By construction, AVA counts are lower than true headcount. We're not hiding that; it's the foundation of the methodology.

Not a people counter (yet?) — AVA only counts the people who look at the screen. But the fraction it captures stays stable across every location, every day. So while totals undercount the room, relative comparisons stay reliable: Building A vs B · Tuesday vs Wednesday · 8am vs 5pm.

What's reliable: relative volume · time-of-day · day-of-week · YoY trends.

Complement: survey owns absolute headcount calibration. AVA tells you when those totals shift — in real time, between cycles.

What AVA does NOT claim
An absolute headcount
We do not claim every person who passes a screen generates a vision event. Anyone facing away, on their phone, or out of camera FOV is missed — AVA captures attention, not headcount.
What AVA DOES claim
Reliable relative patterns
If Building A generates 2× the vision events of Building B during morning rush hour, A reliably has 2× more elevator traffic. The absolute number may differ from a foot counter — the relative comparison is valid.
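The relative-comparison claim reduces to a two-line invariant: if every location captures a stable fraction of true traffic, ratios between observed counts equal ratios between true counts. A toy illustration with made-up numbers (the capture fraction and traffic figures are not real data):

```python
def observed(true_count, capture_fraction):
    """AVA-style undercount: only the fraction of people who look
    at the screen ever registers a vision event."""
    return true_count * capture_fraction

# Made-up numbers, purely illustrative:
f = 0.25                    # assumed stable capture fraction
true_a, true_b = 800, 400   # true morning-rush traffic, Buildings A and B

obs_a = observed(true_a, f)  # 200 events: undercounts the room
obs_b = observed(true_b, f)  # 100 events

# Absolute totals are low by construction...
assert obs_a < true_a and obs_b < true_b
# ...but as long as f is the same at both sites, 2x stays 2x:
assert obs_a / obs_b == true_a / true_b == 2.0
```

The invariant breaks exactly when the capture fraction drifts between sites, which is the kind of drift Phase 2's parallel validation is designed to check for.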
Phase 2 Note: We can turn on absolute counts.

We've built the capability and we know it works. We're holding the public claim until reliability is rigorously tested.

Phase 2's Parallel Measurement & Validation compares vision events directly against survey data, so we can confirm reliability before we commit. That's the next work, and it's exactly what this approval unlocks.

Reference: Methodology document · Phase 2 — Parallel Measurement & Validation (Derek)
06c · Trip Counts (Elevator + Lobby)

Vision events independently validated the 2025 survey's building class spread.

COMMB rows 5 & 6 — elevator trips, lobby visits. Vision events independently validate the 2025 survey's class spread.

Survey: A 4.7 · B 4.1 · C 3.3 trips. AVA: A 1.2× · B 1.0× · C 0.8× of B baseline. The spread matches.

Building Class | Avg. Elevator Trips (Survey) | Elevator Vision Events (B = baseline) | Avg. Lobby Trips (Survey) | Lobby Vision Events (B = baseline)
Class A | 4.7 | 1.2× | 3.8 | 0.4×
Class B | 4.1 | 1.0× (baseline) | 3.8 | 1.0× (baseline)
Class C | 3.3 | 0.8× | 3.6 | 1.0×
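The ×-of-baseline columns are a simple normalization of raw event counts against Class B. A sketch with hypothetical raw counts, chosen only to reproduce the published multiples; not real data:

```python
def vs_baseline(counts, baseline="Class B"):
    """Express raw vision-event counts as multiples of a baseline
    class, the normalization the table uses."""
    base = counts[baseline]
    return {cls: round(n / base, 1) for cls, n in counts.items()}

# Hypothetical raw elevator vision-event counts (illustrative only):
elevator = {"Class A": 60_000, "Class B": 50_000, "Class C": 40_000}
print(vs_baseline(elevator))  # {'Class A': 1.2, 'Class B': 1.0, 'Class C': 0.8}
```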
⚠️ Saying this before anyone asks
Class A lobby figure (0.4×) — yes, we see it.
Survey reports identical lobby trips for Class A and B (3.8 each). Vision events show A at 0.4× B. That's a real discrepancy — and the reason is structural, not measurement error. Class A buildings have multiple access points — underground parking, transit / subway connections, pedways and PATH-style skywalks. A large share of visitors enter directly from below or above and go straight to the elevators, bypassing the lobby entirely. The lobby camera isn't missing them — they were never in the lobby. Phase 2's expanded sample lets us quantify that share. Flagging it now so it doesn't read as a contradiction we ignored.

Sample: 75 elevator screens + 8 lobby screens. Phase 2 expansion to 105 elevator + 11 lobby is in flight.

06d · Supplementary Insights for the Office Network Study

Insights the survey can't produce — every week, no field cycle required.

The data shape no other system in Canadian OOH can produce. Continuous · network-wide · no survey cycle required.

Mid-week consolidation, weekend decoupling — release-ready insights every week.

Day of Week | 2025 Events | 2026 Events | Change | % Change
Monday | 15,412 | 15,540 | +128 | +0.8%
Tuesday | 18,428 | 19,779 | +1,351 | +7.3%
Wednesday | 19,255 | 20,320 | +1,065 | +5.5%
Thursday | 18,625 | 17,755 | −870 | −4.7%
Friday | 14,417 | 14,798 | +381 | +2.6%
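The change columns are straight arithmetic on the event counts, which is what makes the refresh reproducible without a field cycle. A sketch using the table's own figures:

```python
events_2025 = {"Monday": 15_412, "Tuesday": 18_428, "Wednesday": 19_255,
               "Thursday": 18_625, "Friday": 14_417}
events_2026 = {"Monday": 15_540, "Tuesday": 19_779, "Wednesday": 20_320,
               "Thursday": 17_755, "Friday": 14_798}

def yoy(day):
    """Absolute and percentage change for one weekday."""
    delta = events_2026[day] - events_2025[day]
    return delta, round(100 * delta / events_2025[day], 1)

print(yoy("Tuesday"))  # (1351, 7.3)

# Network-wide weekday YoY:
total_2025 = sum(events_2025.values())
total_2026 = sum(events_2026.values())
print(round(100 * (total_2026 - total_2025) / total_2025, 1))  # 2.4
```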
Weekday YoY
+2.4%
+410 events/weekday avg
Tuesday Peak
+7.3%
Strongest mid-week growth
Wednesday Peak Activity
+5.5%
Highest absolute volume day
Friday Growth
+2.6%
Modest but positive
📊
The release-ready insight: Canadian office presence is consolidating around the mid-week core. Tuesday and Wednesday are now the strongest office days — and the year-over-year growth is calculated, not estimated. Grounded in continuous, deterministic data, not a survey snapshot.
07 · Privacy & compliance · 3 minutes · Nikki

Anonymous by design. Validated by independents.

On-device. Zero PII. Privacy by Design certified (3rd-party audited by KPMG) · Patented · Funded by the Government of Canada.

Architecture: on-device · zero PII captured at any layer.

Three independent validators: Privacy by Design certified (3rd-party audited by KPMG) · Patented · Funded by the Government of Canada.

Privacy by Design Certified
3rd-party audited by KPMG · PECB MS
Patented
Issued patent on the AVA technology
Funded by Government of Canada
Accelerated Growth Service
08 · Phased rollout · 3 minutes · Nikki

Complement first. Replace selectively. Where deterministic measurement objectively wins.

Four phases. We're in Phase 1 today — and Phase 2 is what this conversation is about.

Phase 1 active today. The ask in this room: approve the move into Phase 2.

P2 is where we run vision events in parallel with the survey, validate against it, and earn the right to call this primary measurement in the metrics where calculation beats recall.

Phase 01
Active
Starting Point
COMMB's Office Study runs on its existing survey methodology. The current operational standard — in market, audited, and trusted.
Phase 02
Today's ask
Parallel Measurement & Validation
VI runs vision events in parallel with the survey, comparing outputs to validate reliability against the current standard. This is what today's approval unlocks.
Phase 03
Roadmap
Methodology Transition
Once P2 validates: vision events become primary only for metrics where calculation beats recall. Survey remains source of truth for sentiment, intent, opinion.
Phase 04
Roadmap
Residential Expansion
Validated methodology extended to residential. On the horizon — not on today's table.
09 · The ask · 2 minutes · Nikki

Two specific things from this room today.

Two asks. One room. Today.

The ask
01
Peer alignment
Acknowledgement that adding a deterministic vision-events layer to the methodology — alongside the survey — is a directionally good idea for the industry. Not a sign-off on every metric. A directional yes.
02
Joint advocacy to the research committee
Bring this proposal to the COMMB research committee jointly — as the office network operators who all benefit from a stronger methodology, rather than as a single vendor's pitch. A more credible posture for the committee. A faster path to a yes.

Next step if we align: short joint memo to the research committee. Happy to draft.

10 · Q&A · Parking lot · 8 minutes · All

Questions, objections, and items to park.

We'll capture anything we want to follow up on in the parking lot panel — bottom-left of the agenda. Nothing gets dropped.

Questions for any of us. Items to come back on get parked — captured live, visible.

📋
Click "Parking lot" in the sidebar to log a question or follow-up live during Q&A.
End of agenda
COMMB × Vertical Impression · AVA Vision Events Methodology · Monday, May 4, 2026
🎬 Live Demo · Han is screensharing

Walking through the AVA code panel

Opportunity-to-see vs attention · attention calculation · sentiment · jaw pitch & yaw. Agenda is paused. Resume when ready.

Opportunity-to-See vs Attention · Attention calc · Sentiment capture · Pitch & yaw of jaw · Privacy-preserving inference
📋 Parking lot
Nothing parked yet. Items go here during Q&A.