Changes to the GP Patient Survey

2017 survey

1. Questionnaire changes

No changes were made to the GP Patient Survey questionnaire prior to the January-March 2017 fieldwork.

2. Methodology changes

The GP Patient Survey (GPPS) has, for the past 10 years, provided patients, GP practices and other organisations with information about patient experiences of local GP and other health services. Over this time the frequency with which the survey has been administered has varied between annual, quarterly and twice-yearly iterations. For the first three years of the survey (January 2007 – March 2009), it was conducted on an annual basis. In April 2009 (Year 4) the GPPS moved to four times a year, and in July 2011 (Year 6) to twice a year. From 2011 (Year 6) survey results were published every six months, comprising the two most recent waves of data (Wave 1: July-September and Wave 2: January-March).

In 2017 (Year 11) the survey reverted to an annual format (a single publication of one wave of fieldwork each year), in order to reduce survey costs and rationalise the data collection process. The fieldwork took place in January-March 2017, which aligns with the fieldwork dates for Wave 2 in earlier years.

Measuring changes in survey data

In this context, NHS England and Ipsos MORI carried out a detailed analysis of Years 6 to 11 (2011-2017) of the survey to assess whether there were any systematic differences in the data collected between the Wave 1 (July-September) and Wave 2 (January-March) fieldwork periods, as such differences could affect trend comparisons of survey estimates. These systematic differences are referred to here as a ‘fieldwork timing effect’; in simple terms, this is where evidence indicates that differences between data collected in Wave 1 and Wave 2 may be a result of the different times of year at which fieldwork was conducted. If the analysis found consistent differences between results from the Wave 1 and Wave 2 fieldwork periods, then it would not be appropriate to compare trend data from Year 11 with full-year trend data from previous years (that is, data comprising two waves of fieldwork, July to September and January to March). Instead, Year 11 data would need to be compared with the corresponding Wave 2 fieldwork period in earlier years (that is, data from January to March fieldwork only).
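The full statistical approach is set out in the analysis report referenced below; as a rough illustration of the underlying idea, the sketch that follows compares Wave 1 and Wave 2 results for a single survey question using an unweighted two-proportion test. The function name and the figures are illustrative assumptions only, and the published analysis would also need to allow for survey weighting and design effects.

from statistics import NormalDist

def wave_difference_test(p1: float, n1: int, p2: float, n2: int):
    """Two-proportion z-test for a difference between Wave 1 and Wave 2
    estimates of the same survey question (simple normal approximation).

    p1, n1: proportion answering positively and respondent count in Wave 1
    p2, n2: the same for Wave 2
    Returns the difference (Wave 2 minus Wave 1), the z statistic and a
    two-sided p-value.
    """
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)                 # pooled proportion
    se = (pooled * (1 - pooled) * (1 / n1 + 1 / n2)) ** 0.5  # standard error
    z = (p2 - p1) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p2 - p1, z, p_value

# Illustrative (made-up) figures: 84.0% positive in Wave 1 from 400,000
# responses versus 84.6% positive in Wave 2 from 420,000 responses.
diff, z, p = wave_difference_test(0.840, 400_000, 0.846, 420_000)
print(f"difference = {diff:+.3f}, z = {z:.2f}, p = {p:.4f}")

With national-scale sample sizes like these, even a gap of less than one percentage point is highly statistically significant, which is part of the reasoning behind the conservative national-level approach described under the key findings below.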

Key findings

The analyses conducted suggest that there is evidence of a small difference in the data collected between waves (with Wave 2 slightly more positive), which is more strongly observed at the national level. However, it is not possible to know what is causing this, and these differences are unlikely to be due solely to fieldwork timing; other factors, such as sampling variance (that is, statistical differences due to chance) and genuine local change, are also likely to contribute.

Based on the analyses, there is insufficient evidence that switching from two waves of fieldwork to a single period will make any substantial difference to the survey estimates. However, because the sample sizes for GPPS are so large at national level, we suggest taking a conservative approach to any future trend analysis, comparing Year 11 data against Wave 2 only data from previous years of the survey. This ensures that any observed differences cannot be the result of an underlying ‘fieldwork timing effect’. Where national-level trends are reported on the website, they have been updated to reflect this approach.

For categories with smaller sample sizes, such as CCGs and GP practices, this approach is not considered necessary. This is based on the caveats around the evidence of a ‘fieldwork timing effect’ and the fact that the observed effect is inconsistent across CCGs, in both degree and direction. CCGs can vary notably in size; although some CCGs comprise a large number of cases, others are relatively small. Despite these differences, a consistent approach must be used when reporting results at this level in order to facilitate comparisons across CCGs. Our recommended approach therefore takes into account best practice for those CCGs with a smaller sample size: comparing Year 11 data to a full year of data from previous years of the survey (at both CCG and practice level).

For full details of this analysis, please see ‘Assessing the impact of change to an annual GP Patient Survey’.

 

Summary guidance for time series data at national, CCG and practice level

Data level and approach for analysis on trend:

National: Compare Year 11 estimates to historical estimates from Wave 2 only (January-March data)

CCG: Compare Year 11 estimates to historical estimates from both waves (a full year of data)

Practice: Compare Year 11 estimates to historical estimates from both waves (a full year of data)
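As a rough illustration of how this guidance might be applied when assembling trend data, the sketch below encodes the table as a simple selection rule. The record layout (the 'year', 'wave' and level labels) is hypothetical and not part of any published GPPS data specification.

def trend_comparator(records, level):
    """Pick the historical estimates to compare Year 11 (2017) results
    against, following the summary guidance above.

    records: iterable of dicts with hypothetical keys 'year' (survey year
             number), 'wave' ('W1' = July-September, 'W2' = January-March)
             and 'estimate'.
    level:   'national', 'ccg' or 'practice'.
    """
    historical = [r for r in records if r["year"] < 11]  # years before 2017
    if level == "national":
        # National trends: compare against Wave 2 (January-March) data only.
        return [r for r in historical if r["wave"] == "W2"]
    # CCG and practice trends: compare against both waves (a full year of data).
    return historical

# Example: Year 10 estimates for one question, both waves (made-up values).
history = [
    {"year": 10, "wave": "W1", "estimate": 0.843},
    {"year": 10, "wave": "W2", "estimate": 0.848},
]
print(trend_comparator(history, "national"))  # Wave 2 record only
print(trend_comparator(history, "ccg"))       # both records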

 

Further details on the 2017 methodology can be found here.

For more information

3 Thomas More Square London E1W 1YW

t: +44 (0)20 3059 5000
www.ipsos.com/en-uk
https://x.com/IpsosUK

About Ipsos Public Affairs

Ipsos Public Affairs works closely with national governments, local public services and the not-for-profit sector. Its c.200 research staff focus on public service and policy issues. Each has expertise in a particular part of the public sector, ensuring we have a detailed understanding of specific sectors and policy challenges. Combined with our methods and communications expertise, this helps ensure that our research makes a difference for decision makers and communities.