Narrow the gap in school readiness between vulnerable children and their better-off peers through a Parent-Child Home Programme.
Target population
Families with a child aged 2-3 in Kensington & Chelsea and Westminster who do not meet age-related goals for speech and language and social-personal skills, as assessed by the Ages and Stages Questionnaire (ASQ) during the national 2-year-old checks.
Location
Country
United Kingdom
Service delivery locations
Royal Borough of Kensington and Chelsea (London)
Westminster City Council (London)
Involved organisations
Configuration of contracting parties:
Direct contract between outcome payer and provider
Metric 1: Beneficiaries initially engage with the programme.
Metric 2: Beneficiaries complete the first cycle of the programme.
Metric 3: Speech and language skills at ASQ level for children within 16 months of starting the project.
Metric 4: Social-personal skills at ASQ level for children within 16 months of starting the project.
Metric 5: Parental self-efficacy increased by 5% over 15 months of the project compared to the baseline assessment.
Metric 6: Beneficiaries complete the second cycle of the programme.
Metric 7: Speech and language skills at EYFS level for children within 39 months of starting the project.
Metric 8: Social-personal-emotional skills at EYFS level for children within 39 months of starting the project.
Results
ParentChild+ started delivering services in June 2019 and will finish in September 2025. Data was last updated in June 2024. These are interim results.
Outcome achievements
The overall target is based on the high-case scenario defined in the Life Chances Fund Final Award Offer or Variation Agreements.
[Interactive chart: outcome achievements by metric]
The graph above shows interim results for the project’s outcome achievements. Each bar represents a key participant outcome or metric; each metric is detailed above the graph (under the ‘Outcome metrics’ section of this page). Users can hover over the bars to access data on the expectations and achievements for that particular metric. The label at the top of each bar represents the overall expectation for that metric over the entire life of the project, and the coloured section of the bar represents the project’s achievements so far.
Each bar uses the metric’s unit of analysis: if the metric is measured in number of individuals, the bar represents individuals achieving that metric; if the metric is measured in weeks, the bar represents weeks.
A note on targets (or expectations): the graph above shows the latest targets for the project. These targets are based on each project’s best-case scenario and may differ from the targets set at the start, as projects adapt to unexpected challenges or changes in circumstances. These targets can also act as a ‘cap’ on payments. We offer them as a reference for outcome-achievement projections; projects still under implementation are not expected to have achieved any of these targets yet.
The project commissioner’s interpretation of this graph (July 2023):
'Metric 1: Beneficiaries initially engage with the programme. 214 starters out of the 198 target (we had an additional cohort of 48 added to the initial target of 150 as a result of the LCF extension funding). We also had additional starters on top of the 198 target to factor in expected attrition rates.
Metric 2: Beneficiaries complete the first cycle of the programme. 150 completed half of the programme, also known as first cycle/mid-point.
Metric 3: Speech and language skills at ASQ level for children within 16 months of starting the project. 116 achieved, though as the project evolved we used a different measurement tool for children with special educational needs: a Child Behavioural Traits tool administered at the beginning, mid-point and end of the programme to demonstrate the outcomes for these children.
Metric 4: Social-personal skills at ASQ level for children within 16 months of starting the project. 115 achieved, though as the project evolved we used a different measurement tool for children with special educational needs: a Child Behavioural Traits tool administered at the beginning, mid-point and end of the programme to demonstrate the outcomes for these children.
Metric 5: Parental self-efficacy increased by 5% over 15 months of the project compared to baseline assessment. 71 achieved. This metric faced some challenges that required resolution during the programme, primarily Covid-related disruption and missing responses; the other challenge was parental understanding of the set of questions used for this metric.
Metric 6: Beneficiaries complete the second cycle of the programme. 146 completed the programme. Around 64 disengaged at various points of the programme. Disengagement was an expected risk factor and was factored into our projections.
Metric 7: Speech and language skills at EYFS level for children within 39 months of starting the project. We have yet to receive sufficient responses for this metric; all results are due in 2025.
Metric 8: Social-personal-emotional skills at EYFS level for children within 39 months of starting the project. We have yet to receive sufficient responses for this metric; all results are due in 2025.
Speech and language and school readiness are key priority areas within the Bi-Borough. Evidence shows that language outcomes in the early years are among the best predictors of a child being able to ‘buck the trend’ of the poorer long-term life outcomes associated with social deprivation and of escaping poverty in adult life. Research suggests that early intervention can prevent the attainment gap from widening later in life. Accordingly, the metrics were designed around indicators of improvement in communication, social-personal-emotional development and parental self-efficacy.
However, some metrics faced significant challenges during Covid, when baseline questionnaires collected from starter parents were lost, most likely because of disruption during this period. An average baseline was used to measure against the end-of-programme returns for these families. There were also challenges in measuring “parental self-efficacy” for parents with ESL (English as a second language), and provider staff needed to assist with completion. Covid and the cost-of-living crisis may also have affected improvement levels. When evaluating results from the ASQ (Ages and Stages Questionnaire), children with SEN (special educational needs) were assessed using a different tool, CBT (Child Behavioural Traits), with assessments carried out by the provider.
It is also acknowledged that completion of ASQs is an area of continuous improvement within the Bi-Borough, and adding an additional cohort for repeat ASQs therefore increased demand. This can be avoided by training other staff members/partners to undertake this activity. There have been delays in ASQ assessment due to Health Visitor workloads and children moving out of borough. An alternative measurement, CBT, has been used to assess both SEN children and children who moved out of borough during/after the programme, because the latter fell outside the catchment area of our health visiting service.
The broader metrics are still key priorities within the Bi-Borough (and nationally), but the assessments/measures used to determine success need reconsideration. Early years is a challenging area in which to measure outcomes and progress, and it therefore demands either bespoke assessments or the use of existing assessments that are tried and tested but not undertaken routinely or widely (particularly for children with additional needs).'
Outcome payments
[Interactive chart: outcome payments over time]
The graph above shows interim outcome payment results. The x-axis displays the years from the project’s start date to its anticipated completion date. The y-axis represents the value of the payments for outcomes realised by participants in the programme. The aim of this graph is to enable users to compare the project’s initial expectations against the actual value of the outcomes achieved.
The dotted lines represent the plans the project had at different moments, labelled as ‘Plans’ in the key. The data for these dotted lines (or single dotted line) come from the outcome payment profiles that projects shared with commissioners, and their values represent expectations under 'best-case scenarios' (if projects achieved as many outcomes as possible). There can be several dotted lines because projects may renegotiate their payment plans as they face changes that affect delivery (such as the COVID-19 pandemic) or adjust their expectations during the life of the project. Each dotted line is made up of a set of points, each representing a quarter. Users can hover over those points to access data on the expectations for that quarter.
The solid line shows the outcome payments that the project has already claimed and received, labelled as ‘Actual’ in the key. Square points on the 'Actual' line indicate that the payment for that quarter was a COVID-19 medium-scenario grant. This was one of the temporary funding options offered to projects during the COVID-19 pandemic (it included activity payments based on projected medium-case performance scenarios). In the top-right corner, the ‘Plans’ and ‘Actual’ lines can be selected and deselected to change which lines appear in the graph.
A note on the representation of different payment profiles (or plans): when Life Chances Fund projects reprofile their payment plans, they use a template provided by The National Lottery Community Fund. When completing data for past quarters, some projects preferred to leave those cells blank, others repeated the previous expectations, and others completed those cells with data from actual payments. To avoid confusion around these different approaches, we represent each plan only from the moment when that plan became valid.
The project commissioner’s interpretation of this graph:
Payments between April 2020 and March 2021 were made based on a Medium Scenario payment profile because of the impact of Covid on the delivery model (a shift from face-to-face to virtual delivery) and the risk of impacts on outcomes outside the service provider’s control. This was to ensure the service could be sustained when the cohort needed this programme most. The ongoing impact of the initial and subsequent Covid lockdowns and restrictions created a knock-on effect on the payment timeline.
Factors contributing to outcome delays are covered in the “Outcome Achievements” narrative above and include:
Delays in ASQ assessment completion.
Higher numbers of SEN children than anticipated.
Resolving the issue of missing parental self-efficacy returns.
INDIGO data are shared for research and policy analysis purposes. INDIGO data can be used to support a range of insights, for example, to understand the social outcomes that projects aim to improve, the network of organisations across projects, trends, scales, timelines and summary information. The collaborative system by which we collect, process, and share data is designed to advance data-sharing norms, harmonise data definitions and improve data use.
These data are NOT shared for auditing, investment, or legal purposes. Please independently verify any data that you might use in decision making. We provide no guarantees or assurances as to the quality of these data.
Data may be inaccurate, incomplete, inconsistent, and/or not current for various reasons:
INDIGO is a collaborative and iterative initiative that mostly relies on projects all over the world volunteering to share their data. We have a system for processing information and try to attribute data to named sources, but we do not audit, cross-check, or verify all information provided to us.
It takes time and resources to share data, which may not have been included in a project’s budget.
Many of the projects are ongoing and timely updates may not be available.
Different people may have different interpretations of data items and definitions.
Even when data are high quality, interpretation or generalisation to different contexts may not be possible and/or requires additional information and/or expertise.
Help us improve our data quality: email us at indigo@bsg.ox.ac.uk if you have data on new projects, changes or performance updates on current projects, clarifications or corrections on our data, and/or confidentiality or sensitivity notices. Please also give input via the INDIGO Data Definitions Improvement Tool and INDIGO Feedback Questionnaire.