Every study conducted through the Angus Reid Forum follows
consistent, proven methods to ensure data clarity and integrity—whether for public release or exclusive
client use. All data is validated using Survey Sentinel, our proprietary fraud detection system built to
eliminate bots, bad actors, and AI-enabled manipulation. The result: reliable, human-sourced insights and
trusted data.
Reliable Research, Every Time
We field a diverse range of surveys for both
public and private sector organizations. Whether results are shared publicly or kept confidential, we apply the same
level of methodological rigor. Each study is custom-built based on project needs and reflects our high standards for
design, fieldwork, and data processing.
Our internal editorial team and the Angus
Reid Institute also run research initiatives aimed at helping Canadians and Americans better understand current
issues. These studies are designed for transparency, and we openly publish methodological details including field
dates, sample sizes, and weighting protocols.
Digital-First Survey Design
Since 1979, Angus Reid has led the evolution
of opinion research. Today, our teams leverage the scalability and adaptability of online data collection and
mobile-first, conversational survey tools. All surveys are conducted online and can be completed via mobile device,
tablet, or desktop—whenever is most convenient for the respondent.
Membership in the Angus Reid Forum is open to
all, though for certain projects we invite participants based on specific criteria. Rigorous quality controls ensure
that only reliable data is included in the final results. Our surveys often incorporate multimedia elements such as
images, videos, audio clips, or additional context—offering more textured insights than traditional
approaches.
Because we manage our own panel, we control
the sampling process end to end. Members provide detailed demographic information when they join, helping us reduce
redundant questions and monitor shifts in attitudes and behaviors over time.
We use non-probability sampling, which
means participants are drawn from our panel rather than at random from the general population. To improve
representativeness, we apply statistical weights to the data after fielding is complete.
Panel Recruitment and Participation
The Angus Reid Forum is open to any Canadian
or American adult with internet access, representing more than 95% of the population. We recruit primarily through
digital advertising. To promote inclusivity, we offer surveys in English, French, and Spanish, and support additional
languages on the Rival platform, including those that read right-to-left, such as Arabic.
Panelists earn points for participating in
surveys, redeemable for modest rewards. While incentives play a role, many members join to share their views and
contribute to meaningful research.
Sampling and Selection
Each project starts with a clear definition
of the target group—whether it’s a national population, a subgroup, or specific voters. Using data from
reliable sources like the census or electoral rolls, we identify relevant traits and select a representative mix of
panelists accordingly.
If more people qualify than needed, we use
secondary filters such as recency of participation, topic-specific history, and past response quality to refine the
sample. Most of our surveys involve 1,000 to 2,000 respondents and are designed to take under 20 minutes. Fielding
typically runs between 1 and 5 days.
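As a rough sketch of how such secondary filters might combine, the snippet below picks invitees per quota cell, preferring panelists with strong past response quality and a longer rest since their last survey. The field names, cells, and scoring are illustrative assumptions, not Angus Reid's actual system:

```python
import random

def select_invitees(candidates, quotas, seed=0):
    """Pick the requested number of invitees for each quota cell,
    preferring high past response quality and longer time since the
    last survey. All field names here are illustrative assumptions."""
    rng = random.Random(seed)
    chosen = []
    for cell, needed in quotas.items():
        pool = [c for c in candidates if c["cell"] == cell]
        rng.shuffle(pool)  # break ties randomly
        # Stable sort: quality first, then days rested, both descending
        pool.sort(key=lambda c: (c["quality"], c["days_since_last"]),
                  reverse=True)
        chosen.extend(pool[:needed])
    return chosen

candidates = [
    {"id": 1, "cell": "ON/18-34", "quality": 0.9, "days_since_last": 14},
    {"id": 2, "cell": "ON/18-34", "quality": 0.6, "days_since_last": 30},
    {"id": 3, "cell": "ON/18-34", "quality": 0.8, "days_since_last": 7},
    {"id": 4, "cell": "BC/35+", "quality": 0.7, "days_since_last": 21},
]
sample = select_invitees(candidates, {"ON/18-34": 2, "BC/35+": 1})
print([c["id"] for c in sample])  # [1, 3, 4]
```

In a real system the quality score would itself be derived from the timing and consistency checks described later in this document.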
Weighting and Representation
After fielding, we apply statistical
weighting to ensure the sample accurately reflects the target population. This includes adjustments for age, gender,
race, education, and voting behavior, using reliable national benchmarks. In some cases, we use multivariate
techniques (e.g., adjusting for race and education simultaneously) to better capture the complexity of real-world
identities.
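One common way to implement this kind of weighting is raking (iterative proportional fitting), which repeatedly nudges respondent weights until the weighted sample matches the population share on each dimension. The records and target shares below are made-up illustrations, not real benchmarks:

```python
def rake(rows, targets, iters=50):
    """Iterative proportional fitting: adjust per-respondent weights until
    the weighted sample matches each marginal target distribution."""
    w = [1.0] * len(rows)
    for _ in range(iters):
        for dim, shares in targets.items():
            # Current weighted total for each level of this dimension
            totals = {}
            for r, wi in zip(rows, w):
                totals[r[dim]] = totals.get(r[dim], 0.0) + wi
            grand = sum(w)
            # Scale each weight by (target share) / (current share)
            w = [wi * shares[r[dim]] / (totals[r[dim]] / grand)
                 for r, wi in zip(rows, w)]
    return w

# Hypothetical sample: over-represents under-35s and degree-holders
rows = ([{"age": "18-34", "edu": "degree"}] * 25
        + [{"age": "18-34", "edu": "no_degree"}] * 15
        + [{"age": "35+", "edu": "degree"}] * 25
        + [{"age": "35+", "edu": "no_degree"}] * 35)
targets = {"age": {"18-34": 0.30, "35+": 0.70},
           "edu": {"degree": 0.35, "no_degree": 0.65}}
weights = rake(rows, targets)
young = sum(w for r, w in zip(rows, weights) if r["age"] == "18-34") / sum(weights)
print(round(young, 3))  # ≈ 0.300 after convergence
```

Raking matches each margin separately; the multivariate adjustments mentioned above would instead target joint cells (e.g., race-by-education combinations) directly.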
Safeguarding Quality and Data Integrity
From registration through final analysis,
we uphold strict data quality standards. New panelists undergo email and IP verification. We screen for authenticity
and use cross-validation and response analysis to flag low-quality data.
All surveys include built-in quality
controls such as timing metrics, consistency checks, and pattern detection. Poor-quality responses are removed, and
repeat offenders are excluded from the panel.
To reduce bias, we randomize question and
answer order where applicable and use neutral, easy-to-understand language throughout.
Privacy and Participant Choice
Our panelists maintain control over their
data. They choose which surveys to join and which questions to answer. We never sell personal data and honor
requests for access, correction, or deletion of personal information.
Identifiable information is never
published. For follow-up interviews or deeper dives, we request explicit permission from panelists. Sensitive topics
include “Prefer not to say” options, and skipping questions is usually allowed.
Understanding Margin of Error
All surveys have a margin of error, which reflects how far the reported result may stray from the true population
value due to random sampling. A margin of ±3%, for example, means the
true result likely lies within 3 percentage points of the reported figure.
Larger sample sizes reduce this
error—though with diminishing returns. The margin of error applies to the full sample; subgroup margins (e.g.,
young women) will be higher.
Margins only account for random sampling
error. They do not reflect other possible biases such as question wording or nonresponse. Weighting helps, but
doesn’t eliminate all sources of bias.
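For a single proportion, the standard 95% margin-of-error formula is z·√(p(1−p)/n) with z ≈ 1.96, and p = 0.5 giving the widest (most conservative) margin. A quick sketch showing why subgroups carry larger margins:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Half-width of a 95% confidence interval for a proportion,
    using the conservative p = 0.5 by default."""
    return z * math.sqrt(p * (1 - p) / n)

full = margin_of_error(1000)  # full sample of 1,000 respondents
sub = margin_of_error(250)    # a subgroup one quarter that size
print(round(full * 100, 1))   # ≈ 3.1 percentage points
print(round(sub * 100, 1))    # ≈ 6.2 percentage points
```

Note the diminishing returns: quartering the sample only doubles the margin, so quadrupling a sample halves it.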
Election Modeling
To project election outcomes, we use
Multilevel Regression with Post-Stratification (MRP)—a validated technique in both Canadian and U.S. contexts.
Panel data trains a model that estimates likely turnout and vote choice, even among non-panelists. The model
proceeds in three steps: modeling turnout, estimating vote preference, and applying post-stratification to generate
national and regional forecasts.
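The final post-stratification step can be illustrated in miniature: given model-predicted turnout and vote share for each demographic cell, weight the cell predictions by census counts. The numbers below are invented, and this is only the last step; the multilevel regression that produces the cell-level predictions is the substantive part of MRP.

```python
# Invented cell-level model outputs; a real MRP model would produce
# these from a multilevel regression fit on survey responses, with one
# cell per demographic/geographic combination.
cells = [
    {"pred_vote_A": 0.42, "pred_turnout": 0.55, "census_n": 120_000},
    {"pred_vote_A": 0.58, "pred_turnout": 0.70, "census_n": 200_000},
    {"pred_vote_A": 0.50, "pred_turnout": 0.60, "census_n": 150_000},
]

# Expected voters per cell = predicted turnout x census population
expected_voters = sum(c["pred_turnout"] * c["census_n"] for c in cells)
votes_for_A = sum(c["pred_vote_A"] * c["pred_turnout"] * c["census_n"]
                  for c in cells)
share_A = votes_for_A / expected_voters
print(round(share_A, 2))  # 0.52
```

Regional forecasts fall out of the same arithmetic by summing over only the cells in a given region.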
See our track record in electoral forecasting
[here].
Our Approach to Polling
All polling by the Angus Reid Institute is
conducted online using the Angus Reid Forum panel. This shift away from telephone polling reflects broader trends:
phone response rates now often fall below 10%, limiting representativeness. Online research provides a more
inclusive, engaging, and reliable alternative.
Our surveys use multimedia content to better
mirror real-world experiences and yield more thoughtful answers—even on complex subjects. Sensitive topics are
easier to address online, improving honesty and response quality.
Unlike phone polls, which rely on one-time
respondents, our online panel is built on long-term engagement. Panelists go through a double opt-in process and
complete detailed demographic profiles at registration—enabling more precise targeting and minimizing survey
fatigue.
Managing Our Panel
Despite lingering myths, online panels can
produce highly representative data. With internet usage above 95% in North America, our panel spans all regions,
ages, and income levels. Samples are balanced using current census and electoral data, and weighted to match the
population of interest.
We monitor and recruit continuously to
maintain quality. Sampling regions align with electoral districts and historical voting patterns, helping ensure
geographic and demographic accuracy.
Survey Process and Participation
We begin each project with a representative
sample matrix, then randomly invite participants via secure email links. We limit the frequency of surveys to reduce
fatigue and maintain fresh perspectives.
Incentives—usually small cash rewards
or prize entries—help keep panelists engaged and verified. Because rewards are delivered by mail, they also confirm
each panelist's mailing address, adding another layer of data integrity. We never share personal data with clients
and strictly follow all privacy guidelines.
A Word About Web Polls
It’s important to distinguish our
scientific research from online “quick polls.” These informal polls, often seen on websites, are
unrepresentative and unscientific. While we may use them to invite new panelists, their results are not published or
reported.
Dispelling Online Research Myths
Let’s clarify a few common
misconceptions:
- “Online surveys don’t reach everyone.” Our panel includes all key demographic groups. Combined with robust sampling and weighting, our results are representative.
- “Panel surveys aren’t random.” While panelists join voluntarily, survey invitations are randomized and based on verified benchmarks.
- “Online panels are too biased.” Every method has limits. What matters is how those limits are addressed. With balanced sample frames, verification, and rigorous weighting, our data is trustworthy and high-quality.