Creating an Age-friendly Community Survey
We have developed guidance on how to produce your own local survey to understand older residents’ experiences and priorities around ageing.

Community surveys use a structured set of questions to gather insights from people in a specific geographic area, and can be a great tool for getting feedback from older people living in your area.
This approach can be particularly important in an age-friendly community. The first stage of the WHO Age-friendly Programme Cycle, ‘Engage and understand’, is about understanding how ‘age-friendly’ your community currently is. A community survey can be used at this stage to get an idea of older people’s priorities locally and to help prioritise areas for improvement. Alongside other data, it can also help establish a baseline at the beginning of your Age-friendly programme; running the survey again after a period of time will show what has changed.
Age-friendly Community Survey: Template
This is a downloadable template community survey that can be used as a starting point for your own survey. You may remove or add questions as needed to reflect local priorities, but we would encourage you to reflect the diversity of topics captured by the World Health Organisation’s Eight Domains, as this survey does. Please see our Good Practice points below for more guidance on the community survey process, from how to write your own questions to how to distribute the survey and analyse the data collected.
Limits of a community survey
Community surveys are a useful tool for understanding the needs of your community. However, they do have some limitations, may not be appropriate for all communities and should always be used in combination with other engagement methods. Some potential limitations include:
Response bias - those who respond to surveys are often more affluent and less diverse than the surrounding community, meaning survey responses may not be representative. This is why it is important to gather demographic information from respondents and take action to increase response rates amongst underrepresented groups (see ‘Distribute’ below).
Consultation fatigue - residents can often feel they are asked to give their time to surveys or consultation processes with little transparency about the outcomes. Before embarking on a new Age-friendly community survey, places should review existing local surveys and consultations, such as local housing or transport surveys, where Age-friendly themes could be identified and the results disaggregated by age. If you still choose to run a community survey, keep in contact with respondents about the ways in which their responses are used (see ‘Analyse and Report’ below).
Consider Your Purpose
- Make sure each question is necessary and suits your purpose - this includes any demographic questions. You should have an intended purpose for all the data you’re gathering. Think about the influence you have in each age-friendly domain and ask questions about the things you have the power to change - keeping your survey to things you have control over will help you more clearly demonstrate impact to survey respondents.
- Don’t make it too long - even if your survey covers a lot, keep it short! If the survey is too long, it creates an additional barrier to responding, which will bias your results. Allow people to answer some questions and still submit the survey, rather than requiring a response to every question. Stick to surveys that take no more than 20 minutes to complete. You may want to create a full survey and a shorter version for those who don’t have the time or capacity to complete the full survey. Even with short surveys, put the questions you most want answered at the front in case respondents don’t complete every question.
- Work with older people to develop your questions – consider co-writing your survey with older people and ensure you test your draft survey with groups of older residents to get feedback on what works and what doesn’t before distributing more widely.
- Think about how often you intend on repeating the survey - will it be a one-off to establish a baseline, or will you repeat the survey over time to track progress?
- Think about how you will communicate results to survey participants - building feedback into the process, like collecting email addresses or offering a check-box for those who wish to be contacted by mail about the results, helps show participants that their contribution was meaningful.
Best Practices for Survey Writing
Closed questions (like multiple choice) are easier to compare across respondents and leave less room for misinterpretation, but they make assumptions about what topics should be covered. Open-ended questions allow for a wider range of topics and can collect richer data, but they are more time-consuming to interpret. Both methods have strengths and weaknesses, so using a combination of the two, where possible, is ideal.
Example:
A closed question looks like:
Which of these options do you think would improve your town centre most?
- A. More parks and green spaces
- B. More libraries
- C. More high street shops
An open-ended question looks like:
In your opinion, how can your town centre be improved?
(No answers, just room for respondent to write)
Where possible, offer a scale of response options. This allows people to express their feelings more accurately than a simple agree/disagree binary, and better shows changing opinions year-over-year.
Example:
- Question: The streets in my community are easy for me to navigate
- Answer: Agree / Somewhat agree / Neutral / Somewhat disagree / Disagree
Leading language can bias the results of your survey, and complex language, particularly the use of subject-specific jargon, can confuse or discourage respondents.
Example:
A question that uses difficult or leading language may say:
Don’t you think that more public toilets would make the community better?
Yes/No
A less leading question may look like:
Do you think there are sufficient public toilets in your community?
Yes/No
An even less leading question may look like:
How satisfied are you with the provision of public toilets in this area?
Not at all satisfied/Not satisfied/Neutral/Satisfied/Very Satisfied OR Write-In
Questions that involve multiple issues, or ask about two different things at once, can be confusing for the respondent. Keep your questions as precise as possible.
Example:
A combined question may say:
How easy do you find it to access benches, parks, community centres, and shops in your community? (Very Difficult/Difficult/Neutral/Easy/Very Easy)
A more specific question may say:
How easy do you find it to access the following in your community? (Very Difficult/Difficult/Neutral/Easy/Very Easy)
- A. Benches
- B. Parks
- C. Community Centres
- D. Shops
Language that asks about personal experience engages participants and allows them to respond from the expertise they have about their own needs. In some cases you might want to ask from the perspective of friends and family, but not about “older people” in general.
Example:
A question that uses non-personalised language may look like:
It’s convenient for older people to access the local community centre
A question that uses personalised language may look like:
It’s convenient for me to access my local community centre
Material experience focuses more on ‘things’ - what is or isn’t present in the environment - whereas subjective experience covers ‘feelings’ - how people might feel about their interactions with the environment around them. Both of these influence resident satisfaction and positive outcomes, so aim to include a range of questions that covers both.
Example:
A question about material experience of safety may look like:
How would you rate the street lighting in your area?
A question about subjective experience of safety may look like:
How safe do you feel in your neighbourhood?
You should ask about respondents’ demographics, but place these questions at the end and make them optional. Asking about demographics like age group, ethnicity, gender, sexual orientation, etc. can be useful for understanding the different intersectional needs of different subgroups within your area, rather than relying on a one-size-fits-all approach. However, this information is personal and may be a barrier to people responding - keeping demographic questions optional (e.g. offering a ‘prefer not to say’ option) and putting them at the end rather than the start of your survey can help reduce that barrier.
Distribute
- You will be collecting and storing personal data, so you must make sure that you comply with data protection regulations - refer to your organisation’s UK GDPR and other related policies.
- While response rates will vary based on the size of your community, a typical local authority should aim for about 500-1,000 responses from a diverse range of community members (analysis can be done with fewer responses, but it will be less robust).
- Distribute your survey using a variety of methods - to make your approach age-friendly, consider following accessibility standards for online and print materials and producing paper versions of the survey. Make sure your survey is translated into any other major languages spoken by groups in your community.
- Distribution through community service providers and trusted community networks is key to outreach, particularly for underrepresented groups. Doctors’ surgeries, libraries, and town halls can be great places to leave copies of the survey.
- Shopping centres, community events, social groups, and other places where people gather are also great opportunities to reach more people.
- Going in person to help people fill out surveys will help increase your response rate and make sure that people complete the full survey.
- Look at who you’re getting responses back from - if the respondents aren’t representative of the mix of age ranges, ethnicities, incomes, genders, etc. in your community, do additional targeted outreach. One way to check this is sketched below.
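To illustrate the last point, here is a minimal sketch in Python (pandas) of one way to check how representative your responses are. It assumes responses have been exported to a CSV file with an 'age_group' column and that you have local population shares to compare against (for example from census data); the file name, column name, groups and figures are all hypothetical.

```python
import pandas as pd

# Load survey responses (hypothetical file and column names)
responses = pd.read_csv("survey_responses.csv")

# Share of respondents in each age group
respondent_share = responses["age_group"].value_counts(normalize=True)

# Hypothetical local population shares for the same groups (e.g. from census data)
population_share = pd.Series({"50-64": 0.45, "65-79": 0.40, "80+": 0.15})

# Put the two side by side to spot under-represented groups
comparison = pd.DataFrame({
    "respondents": respondent_share,
    "population": population_share,
}).fillna(0)
comparison["gap"] = comparison["respondents"] - comparison["population"]

# The most under-represented groups appear first
print(comparison.sort_values("gap"))
```

Comparing shares rather than raw counts means the check works however many responses you have collected so far; the same pattern can be repeated for ethnicity, gender or any other characteristic you record.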
Analyse and Report
- Break down the data - this can be as simple as turning your responses into percentages (e.g. 30% of respondents agree or strongly agree that their streets are easy to navigate); one way of doing this is sketched at the end of this section.
- For questions that ask participants to rank or choose between options, you can compare the share choosing each option (e.g. 40% of respondents thought that libraries would improve their communities, more than chose shops (28%) or green spaces (32%)).
- If you have a large enough response rate, you can break down the data by demographic or other characteristics (e.g. what services did 50-64 year olds rate as most important, compared to those aged 65 and older?).
- For open-ended questions, you may want to group responses into themes during analysis, to see which issues are coming up across the community.
- Collate and present the results in an accessible way- this most likely will form part of a baseline assessment for your Age-friendly Community, or may provide context for any Age-friendly strategy or action plan.
- Keep respondents in the loop about how their insights are informing action in the community.
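To make the percentage and breakdown steps above concrete, here is a minimal sketch in Python (pandas), assuming responses have been exported to a CSV file; the column names ('streets_easy', 'age_group') and answer labels are hypothetical and should be adjusted to match your own survey.

```python
import pandas as pd

# Load survey responses (hypothetical file and column names)
responses = pd.read_csv("survey_responses.csv")

# Percentage breakdown for a single closed question
streets = responses["streets_easy"].value_counts(normalize=True) * 100
print(streets.round(1))  # % choosing Agree, Somewhat agree, etc.

# Combine 'Agree' and 'Somewhat agree' into one headline figure
agree_pct = responses["streets_easy"].isin(["Agree", "Somewhat agree"]).mean() * 100
print(f"{agree_pct:.0f}% of respondents agree their streets are easy to navigate")

# Break the same question down by a demographic characteristic
# (only meaningful with enough responses in each group)
by_age = pd.crosstab(responses["age_group"],
                     responses["streets_easy"],
                     normalize="index") * 100
print(by_age.round(1))  # row percentages within each age group
```

Using percentages within each group, rather than raw counts, makes it easier to compare groups of different sizes.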