
Fidelity and moderating factors in complex interventions: a case study of a continuum of care program for frail elderly people in health and social care

Abstract

Background

Prior studies measuring fidelity of complex interventions have mainly evaluated adherence, and not taken factors affecting adherence into consideration. A need for studies that clarify the concept of fidelity and the function of factors moderating fidelity has been emphasized. The aim of the study was to systematically evaluate implementation fidelity and possible factors influencing fidelity of a complex care continuum intervention for frail elderly people.

Methods

The intervention was a systematization of the collaboration between a nurse with geriatric expertise situated at the emergency department, the hospital ward staff, and a multi-professional team with a case manager in the municipal care services for older people. Implementation was evaluated between September 2008 and May 2010 with observations of work practices, stakeholder interviews, and document analysis according to a modified version of The Conceptual Framework for Implementation Fidelity.

Results

A total of 16 of the 18 intervention components were to a great extent delivered as planned, while some new components were added to the model. No changes in the frequency or duration of the 18 components were observed, but the dose of the added components varied over time. Changes in fidelity were caused in a complex, interrelated fashion by all the moderating factors in the framework, i.e., context, staff and participant responsiveness, facilitation, recruitment, and complexity.

Discussion

The Conceptual Framework for Implementation Fidelity was empirically useful and included comprehensive measures of factors affecting fidelity. Future studies should focus on developing the framework with regard to how to investigate relationships between the moderating factors and fidelity over time.

Trial registration

ClinicalTrials.gov, NCT01260493.


Background

Intervention research has seldom systematically documented how different intervention components have been implemented in practice [1]. Analysis of the implementation process and its fidelity is important in order to understand the specific reasons why an intervention succeeded or failed [2–6]. This is especially relevant for complex interventions that consist of several active ingredients [7, 8]. Otherwise, there is a risk of evaluating the effects of a program that has been described but not fully implemented [8].

Implementation fidelity is often defined as the degree to which a particular program follows the original program model, i.e., the model that the program developers intended to be used [9]. Fidelity can act as a potential mediator of the relationship between interventions and their intended outcomes [6]. Several prior studies have demonstrated that implementation fidelity affects how well a program succeeds; programs with high fidelity have had better outcomes than programs with lower fidelity [10–16]. Some programs showed significant effects only in high-fidelity subsamples, not in the entire intervention group [17, 18]. However, an intervention cannot always be implemented fully according to the program model, because local conditions may require some program adaptation [6]. Some authors argue that local adaptations improve the fit of the intervention to the local context and that successful interventions depend on such adaptations [11]. Others argue that program implementation can be flexible as long as the essential elements of an intervention are implemented with high fidelity. According to the Replicating Effective Programs (REP) framework, the core elements of an intervention should be standardized, but the mechanisms by which these are operationalized can be changed to allow flexibility in implementation [19]. Two basic forms of program adaptation involve modifications of program content and of the form of program delivery [20]. The content can be changed by omitting, modifying, or adding components. Changes concerning delivery can involve the manner or intensity with which the intervention components are delivered [20]. Fixsen et al. [5] suggested that selection of staff with the right competence and high motivation, adequate training, coaching and support, and continuous program evaluation, together with enabling financial, organizational, and human resources policies, are components related to implementation with high fidelity. Other authors have suggested that reasons for program changes include a staff desire to increase a sense of ownership and to create a better fit between a program and local needs [11], as well as a desire to improve program results [21]. Others have suggested that poor staff training and uncommitted staff are reasons for non-adherence in implementation [21].

The Conceptual Framework for Implementation Fidelity [6] suggested that fidelity is influenced by four moderating factors: participant responsiveness to a program, complexity of an intervention, facilitation strategies, and quality of delivery. The framework was later modified [22] to include two additional moderating factors: context and participant recruitment. Context refers to surrounding social systems, such as structures and cultures of organizations and groups, and historical and concurrent events [1]. Participant recruitment covers aspects such as reasons for nonparticipation among potential participants, subgroups that were less likely to participate, and consistency of recruitment procedures among potential participants [23, 24]. Participant responsiveness refers to how well participants respond to, or are engaged by, an intervention. It involves judgments by participants about the outcomes and relevance of an intervention [6]. Responsiveness refers both to individuals receiving the intervention and to individuals responsible for delivering it. Complex interventions and vaguely described interventions are assumed to be more difficult to implement with high fidelity than simple interventions [25]. Adequate facilitation strategies increase the opportunities for higher and more standardized fidelity. Quality of delivery concerns 'the extent to which a provider approaches a theoretical ideal in terms of delivering program content' [9]. The authors [6] also suggested that there are complex relationships at work between the moderators, which may further affect the relationship between an intervention and its implementation fidelity. Fixsen et al. [5] also suggested that the implementation components are compensatory in nature. For example, less training may be supplemented with greater amounts of coaching or careful staff selection, and very well-designed staff performance evaluations may compensate for less training and little coaching. In summary, The Conceptual Framework for Implementation Fidelity suggests that different moderating factors may affect, positively or negatively, the implementation process and its fidelity. These factors interact with each other, and the effect of one factor on fidelity may be influenced by another moderating factor. Currently, there is little empirical research on factors that influence fidelity [6, 9, 26–28], as most prior studies on fidelity have focused solely on adherence [17]. A need for studies that make sense of the fidelity concept and clarify the function of factors affecting fidelity and their relationships to one another has been emphasized [6].

This study concerns a care continuum intervention for frail elderly people living in their own homes. Community-dwelling frail older people often receive care from many providers, and they are frequently admitted to hospitals [29]. The transition from hospital to home is a vulnerable period of discontinuity and potential adverse events [30, 31]. Integrated care programs have been used to reduce fragmentation and to improve continuity and coordination of care. Such programs are often complex in nature, involving several care providers and professions, which can challenge the evaluation of the programs [32]. Prior studies on integrated care programs for elderly people have found positive effects on older people's medication consumption, satisfaction with care, activities of daily living, and quality of life [33–35]. However, to our knowledge, none of these prior studies has conducted a thorough analysis of implementation fidelity. The present study was conducted as a randomized controlled trial [36]. The first results of the study showed that the older people receiving the continuum of care intervention perceived higher quality of care than those receiving regular care. Program effects on participants' healthcare utilization, functional ability, activities of daily living, health-related quality of life, and life satisfaction will be presented in upcoming articles. A prior analysis of the intervention implementation highlighted hindering and facilitating factors during the initial implementation [32]. The present study is a first attempt to analyze the implementation fidelity of the intervention. It aims to systematically evaluate implementation fidelity and possible factors influencing fidelity of a complex care continuum intervention for frail elderly people. The modified version [22] of The Conceptual Framework for Implementation Fidelity [6] is used.

The study has the following objectives:

  1. To empirically test The Conceptual Framework for Implementation Fidelity.

  2. To evaluate the level of implementation fidelity of the intervention.

  3. To evaluate how different moderating factors in the framework affect the implementation fidelity.

Methods

Study design

The intervention study was a randomized controlled trial with a total of 161 elderly participants divided into an intervention group and a control group. The intervention took place in one middle-sized municipality in Sweden. The possible effects of the intervention on the participants' ability to perform activities, their health-related quality of life, their satisfaction with care, and their emergency care consumption were evaluated at 3, 6, and 12 months after the baseline measurement. The present study had a longitudinal design and used multiple qualitative methods to investigate the implementation process and its fidelity during the study period.

Description of the intervention

The intervention consisted of developing a continuum of care model for frail older persons. The ambition of the program was to include all essential care providers, i.e., municipal health and social care, a university hospital, and primary care (PC). The intervention was a systematization of collaboration between a nurse with geriatric expertise situated at the emergency department (ED), the hospital ward staff, and a multi-professional team for the care of the elderly with a case manager (CM) in the community. The multi-professional team included a nurse (the CM), a qualified social worker, an occupational therapist, and a physiotherapist. The intervention was based on prior research on integrated care programs for elderly people and on a diagnosis of local needs [36]. It was developed by researchers at the Vårdal Institute, in close collaboration with the local practices [36]. Table 1 presents the logic model of the intervention.

Table 1 The logic model of the intervention

The intervention is briefly described here and can be seen in more detail in Table 2. The inclusion and randomization of participants and the intervention program started at the ED. For the intervention group, the nurse with geriatric expertise made an assessment of the elderly patients' needs for rehabilitation, nursing, and geriatric care. The assessment was transferred to the next care provider (ward nurses and the municipal multi-professional team) to be used as a basis for further care planning. The municipality's CM contacted elderly persons at the hospital ward, the ward staff, and relatives of the elderly person if this was approved by the elderly person and if any relatives were available. Relatives were offered support and help if needed and desired. The multi-professional team made a care plan at the participants' home (instead of at the hospital) after their discharge from the hospital or after visiting the ED. The results of the geriatric assessment made at the ED were used as a basis for this assessment and care planning. The planning was also done in consultation with the participants. All care providers, such as home help services and home nursing care, were informed about the plan made. The CM followed up the care plan within a week and had telephone contact with participants at least once a month. The CM was available to the participants and their relatives for questions and consultation if needed. The control group received conventional care and outcome evaluations. Access to a case manager or a multi-professional team was not available in the existing organization. Care planning was conducted at the hospital, and no information transfer was made from the hospital to the municipality for patients discharged from the ED directly to their homes. A more thorough presentation of the intervention and the conventional care has been given previously [22, 36].

Table 2 Implementation fidelity for each intervention component and moderating factors affecting fidelity

Data collection

Data for the present study were collected from the start of the intervention (September 2008) until May 2010, by which time all of the participants had been included in the study. In accordance with the modified version [22] of The Conceptual Framework for Implementation Fidelity [6], data concerning implementation fidelity and the moderating factors were collected. The measurement of implementation fidelity is a measurement of adherence, with its subcategories: content, frequency, duration, and coverage (dose). Thus, adherence relates to the content and dose of the intervention, i.e., whether the active ingredients of the intervention have been received by the participants as often and for as long as was planned [6]. Fidelity assessment should focus on all intervention activities if no analyses have been made of the active ingredients of an intervention [5]. As such analyses had not been conducted in the present intervention, we evaluated adherence for all intervention components. The modified version of the framework identifies six moderating factors: participant responsiveness, comprehensiveness of policy description, strategies to facilitate implementation, quality of delivery, recruitment, and context (Figure 1). The framework suggests that all these factors should be evaluated systematically when conducting a process evaluation. However, it has been suggested [9] that quality of delivery can only be measured if an external benchmark can be established. As the present study was the first evaluation of the care continuum model, no such benchmark was available. Thus, we evaluated the other five potential moderators.

Figure 1. Assessment of fidelity and moderating factors in the present study, in accordance with the modified version of The Conceptual Framework for Implementation Fidelity (originally from Carroll et al. [6]).
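To make the structure of these fidelity data concrete, the sketch below shows one possible way to record adherence and moderator notes for a single intervention component under the modified framework. It is a minimal sketch only: the field names, rating labels, and example values are illustrative assumptions, not the study's actual instrument.

```python
from dataclasses import dataclass, field
from typing import Optional

# Moderating factors evaluated in this study (quality of delivery was excluded
# because no external benchmark was available).
MODERATORS = [
    "participant_responsiveness",
    "comprehensiveness_of_policy_description",
    "facilitation_strategies",
    "recruitment",
    "context",
]

@dataclass
class ComponentAdherence:
    """Hypothetical record of adherence for one intervention component."""
    component_id: int                  # e.g., 1..18 as in Table 2
    content_delivered: str             # e.g., "always", "most often", "seldom"
    frequency_as_planned: bool         # delivered as often as planned?
    duration_as_planned: bool          # delivered for as long as planned?
    coverage: Optional[float] = None   # share of intended recipients reached
    moderator_notes: dict = field(default_factory=dict)

# Example with illustrative values: component 10 (monthly CM contact) delivered
# "most often" because some participants preferred to initiate contact themselves.
c10 = ComponentAdherence(
    component_id=10,
    content_delivered="most often",
    frequency_as_planned=False,
    duration_as_planned=True,
    moderator_notes={"participant_responsiveness":
                     "some participants preferred to contact the CM themselves"},
)
```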

Adherence and the moderating factors were evaluated with observations, interviews, and document analysis. The second author of this paper conducted non-participant observations of the case managers' work practices. The CM was selected as the object of the observations because she had the most central role in the intervention. While observing her, the other actors, such as the multi-professional team, participating older people, hospital ward staff, geriatric nurses at the ED, and PC actors, could also be observed. The observations were conducted once every six months during randomly selected three-day periods. A total of four observation visits were made during the study period. An observation protocol was developed based on the description of the planned intervention (Additional File 1). The observer reported how frequently (never, seldom, sometimes, often, always) the different intervention activities were conducted according to the plan. The observations were followed by questions to clarify the observed work practices and the reasons for possible non-adherence. These clarifications were noted in a comments column in the observation protocol.
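As a rough illustration of how such repeated frequency ratings could be condensed into an overall adherence level per component, the following sketch applies a simple decision rule across the four observation visits. The cut-offs are assumptions made for illustration, not the rule the authors used.

```python
# Five-point frequency scale from the observation protocol, ordered worst to best.
RATING_ORDER = ["never", "seldom", "sometimes", "often", "always"]

def summarize_adherence(visit_ratings):
    """Map the lowest rating observed across visits to a coarse adherence label.
    The thresholds below are illustrative, not the study's actual decision rule."""
    worst = min(visit_ratings, key=RATING_ORDER.index)
    if worst in ("often", "always"):
        return "delivered as planned (always/most often)"
    if worst == "sometimes":
        return "partially delivered"
    return "not delivered as planned"

# Example with invented ratings for one component over the four observation visits.
print(summarize_adherence(["always", "often", "always", "often"]))
# -> delivered as planned (always/most often)
```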

Repeated interviews (once every six months, on a total of four occasions) were conducted with key staff members at the operating level, i.e., nurses with geriatric expertise (two persons), case managers (two persons), the multi-professional team (four persons, of whom one was the municipal project leader), and a hospital project leader (one person). These individuals were the ones mainly involved in the delivery of the intervention. Interviews were also conducted with staff members who were less actively involved in the intervention: staff delivering support for family members (three persons, on two occasions) and representatives from hospital wards (four persons, on one occasion). Most of the interviews were conducted with two respondents from the same profession simultaneously. Thus, a total of 27 interviews were conducted with 16 actors at the operating level. Workplace managers were interviewed but not included in the present analysis, because the focus is on the practical implementation of the intervention activities. The researchers contacted potential respondents with an e-mail containing information about the project and the planned interview. It was emphasized that participation was voluntary. Before each interview started, respondents were given more information about the study, and consent forms were signed. All interviews were conducted by one or two of the authors of this paper; they were semi-structured and lasted between 45 and 75 minutes. A general interview guide was developed focusing on: respondents' current work and role in the project, including possible changes in these; experiences of facilitation, feedback, and support; perceptions of organizational or other factors facilitating or hindering their work or the project in general; perceptions of the intervention content and its relevance and benefits for the participants; and expectations concerning the project and its effects (Additional File 2). This guide was used as a basis for all interviews. Additional questions were asked in some of the interviews in order to follow up on earlier observations. The interviews were recorded and later transcribed.

The researchers also gathered all project documentation from the participating organizations. The hospital and municipality project leaders were also asked to keep an ongoing work diary concerning the project activities, factors affecting the implementation of the intervention, and possible changes in the content or delivery of the intervention. A total of 119 documents were gathered; these included the project leaders' work diaries and notes (15 documents), minutes of meetings (101 documents), and project information letters for collaborators (three documents).

Data analyses

The notes in the observation protocols were discussed by the first and second authors after each observation visit. The delivery of each intervention component was discussed, and the level of adherence was determined. The interview and document data were analyzed independently by the two authors using content analysis [37]. The authors compared the interview and document data with the observational findings on adherence. All instances of non-adherence or ambiguity in the data concerning the actual delivery of the intervention components were investigated further with specific interview questions in the next interview with the key stakeholders. The interview and document data were also analyzed to identify factors affecting fidelity. The Conceptual Framework for Implementation Fidelity was used as the coding scheme for this analysis. The interviews were categorized according to the moderating factors: participant responsiveness, comprehensiveness of policy description, strategies to facilitate implementation, recruitment, and context. The analysis started with each transcript being read independently by the first and second authors. The two authors coded the texts according to the moderating factors, and the two sets of codes were compared. For instance, the interview respondents described the intervention as highly relevant for the target group and believed it to have a great impact on older people's health; this was coded as participant responsiveness. The few coding differences that occurred (less than 10%) were discussed by the authors, and after these discussions full agreement was reached on the coding of the moderating factors. At the end of the study period, all of the observation protocols were compared over time by the two authors, and a general level of fidelity was determined for each component.
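The coding comparison described above amounts to a simple percent-agreement check between the two coders before disagreements are resolved by discussion. The sketch below shows that calculation on invented data; the segment codes are hypothetical, and only the procedure reflects the text.

```python
# Two coders independently assign one moderating factor to each text segment
# (segment codes invented for illustration; one disagreement out of ten).
coder_1 = ["context", "recruitment", "participant_responsiveness", "context", "facilitation",
           "context", "recruitment", "participant_responsiveness", "context", "facilitation"]
coder_2 = ["context", "recruitment", "participant_responsiveness", "recruitment", "facilitation",
           "context", "recruitment", "participant_responsiveness", "context", "facilitation"]

# Simple percent agreement before the consensus discussion.
agreements = sum(a == b for a, b in zip(coder_1, coder_2))
percent_agreement = 100 * agreements / len(coder_1)
print(f"Initial agreement: {percent_agreement:.0f}%")  # -> 90%; disagreements then discussed to consensus
```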

Ethical approval

Ethical approval was granted by Gothenburg University (Dossier number 413-08).

Results

Adherence

Content

A total of 16 of the 18 intervention components were always or most often delivered as described in the program protocol (Table 2). The two components not delivered according to the program plan concerned the geriatric assessment at the ED. Recruitment and the geriatric assessment were most often conducted in the wards instead of at the ED (component 1, Table 2). Consequently, the wards did not use the geriatric assessment as the basis for their care planning, as it was transferred to them when patients were already in the wards (component 2, Table 2). Three intervention components were delivered 'most often' rather than 'always' according to plan: the CM did not have contact with all of the participants once a month (component 10, Table 2), support was not initiated for all participants' relatives (component 17, Table 2), and not all team members always participated in care-planning meetings (component 11, Table 2).

Non-adherence also involved components that were added to the model. In addition to planning rehabilitation, team members also started to conduct the rehabilitation themselves, because the regular rehabilitation staff had long waiting lists. The team also started to conduct six-month follow-up meetings with all participants, which was not planned in the original model. According to municipal work practice, every elderly person receiving home help services or home nursing care had a six-month follow-up, and the team decided to conduct similar check-ups also for participants not receiving any such services. The CM also had telephone contact with relatives; she was helpful and friendly, and it is possible that these telephone calls functioned as support for the relatives. In addition, the participants were allowed to contact the CM even after the 12-month period that was originally planned as the intervention time for each participant.

Frequency and duration (dose)

No changes in the frequency or duration of the 18 components were observed. However, the dose of the added components varied over time. For instance, in the middle stages of the project, the rehabilitation staff in the team received more resources and started to offer participants rehabilitation. At the end of the study, the CM and the team faced a heavier workload as the total number of participants increased. The six-month follow-up evaluations were at first conducted via telephone instead of at a home visit, and after a while no six-month follow-ups were offered to participants without any municipal elderly care services. To close the official 12-month study period, the team initially organized a home visit for each participant; during periods of heavier workload, however, this was conducted via telephone.

Coverage

The project leaders' notes showed that a total of 340 persons who met the inclusion criteria were asked to participate. Of these, 159 (47%) individuals declined to participate.
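The decline rate reported above follows directly from the project leaders' counts; a minimal check of the arithmetic:

```python
# Coverage arithmetic from the project leaders' notes.
approached = 340   # eligible persons asked to participate
declined = 159     # persons who declined

print(f"declined: {declined / approached:.0%}")                 # ~47%
print(f"agreed:   {(approached - declined) / approached:.0%}")  # ~53%
```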

Moderating factors

Recruitment

All staff respondents and the project documentation indicated that participant recruitment was problematic. It was difficult to find individuals who fulfilled the inclusion criteria. The recruitment procedure took a long time to conduct, with the result that many elderly people did not have sufficient energy for it. There was seldom time at the ED for conducting the recruitment procedure and the geriatric assessment. This meant that the participants were recruited from the hospital wards instead of the ED, and the geriatric assessment was also made in the wards instead of at the ED. It was not always possible to recruit patients who left the ED without being hospitalized. The staff respondents perceived that the frailest people refused to participate, and assumed that the comprehensive recruitment procedure was the reason why these individuals did not have sufficient energy to get involved in the project. The project documentation also showed that the older individuals who declined to participate most often gave 'the project seems too demanding' (n = 76) and 'I'm too ill to participate' (n = 12) as reasons for not participating.

Participant responsiveness

The older persons' preferences and wishes were reasons for not always delivering the components concerning the CM's contact with the participants and their relatives. The CM was supposed to contact all participants at least once a month and contact participants' relatives to offer them support. A total of 5% of the participants wanted to contact the CM by themselves. Thus, the frequency of the contact was determined by the older participant and was not always once a month as was planned in the program description. Most of the older participants did not want the CM to contact their family members because they were concerned that this would burden the families. On the other hand, one fundamental component of the intervention was that all planning should be conducted in collaboration with the participant. Thus, according to this component, the intervention had high adherence.

High staff responsiveness seemed to be one of the main reasons for adding components to the intervention. All staff respondents expressed high enthusiasm about the project and about their own roles in it. Some of the respondents also had previous positive experiences of working with care continuum models for the elderly. The project staff were also proud of the project and voluntarily presented it at local and national conferences. The above-mentioned components were added because staff members were highly engaged in their work; they believed that the intervention was relevant and that it had the potential to achieve good outcomes for older people. The staff wished to further improve the benefits for the participants by adding components such as rehabilitation and follow-up meetings. On several occasions, the project staff received positive feedback from the participants and their families, which gave them further assurance that their work was valuable.

Context

Some contextual factors had a direct impact on fidelity. Financial resources for the employment of the rehabilitation staff in the team fluctuated during the project. This meant that during a period of fewer resources, not all rehabilitation staff members could attend all care-planning meetings (component 11, Table 2). Another impact of contextual factors concerned support for participants' relatives. The formal support available in the municipality focused on relatives of people with dementia and therefore did not suit the relatives in the project. Little formal support for relatives was available during most of the project period. Later on in the project, the municipality widened the target group, and some support for project relatives became available. Some contextual factors affected adherence by enabling the addition of intervention components. PC had a concurrent project in which physicians and nurses carried out home visits to elderly individuals. The team in the present project established collaboration with that project and could offer those services to the participants during some months of the project.

Some of the contextual factors had a more indirect impact on the implementation. Positive experiences from similar projects were an important driving force at the initial stage of the project. All staff respondents frequently referred to their prior experiences and used them to market the current project to collaborators. Prior work practices also affected co-workers' attitudes towards the project. Hospital physicians initially expressed concerns about patient safety, because the care planning was conducted at patients' homes and not at the hospital ward as usual. These concerns were met with adequate information from the project team, and no consequences for fidelity were observed. Other contextual factors affected the work of the project staff but did not have an impact on fidelity. Ongoing changes, such as a new IT program, remodeling at the ED, reorganization of the hospital, and new workplace leaders, created uncertainty among project staff and made it more difficult for them to accomplish their everyday work, but did not seem to affect fidelity.

Complexity and facilitation strategies

Initially, all staff respondents said that they had received little information from the program designers concerning the project and that the work descriptions were experienced as unclear. Facilitation at the initial stage was reported to be limited. Some of the respondents, especially at the hospital, thought that this was problematic, while others, mostly at the municipal site, seemed satisfied with the freedom to act according to their own judgment. These interviewees reported that they had taken more active roles in the project because no detailed information or facilitation was available. During the later phases of the study, the respondents experienced more information and facilitation. Some respondents also perceived the feedback from the project steering groups and their work leaders as limited, which bothered some of the staff, while others stated that the limited feedback did not disturb them.

Discussion

Comprehensive, longitudinal data showed that the level of fidelity of this complex intervention was generally high. A total of 16 of the 18 intervention components were always or most often delivered according to the original plan. However, some non-adherence was also observed, including components that were not delivered, components that were modified, and components that were added to the original model. The different moderating factors in the Conceptual Framework for Implementation Fidelity all affected fidelity in a complex, interrelated way. The effects of the moderating factors on fidelity also changed over time, which further illustrates the challenges of evaluating the impact of factors influencing fidelity. The Conceptual Framework for Implementation Fidelity [6, 22] was in general found to be empirically useful. The strengths, limitations, and future use of the framework are discussed below.

Measurement of the four dimensions of adherence (content, frequency, duration, and coverage) included in the Conceptual Framework for Implementation Fidelity was found to be extensive and challenging, but also useful. First, some flexibility existed in the interpretation of the intervention components and the delivery descriptions, which complicated the evaluation of adherence. Continuous discussions were needed in order to clarify each component. Standardization of core components and their delivery has also been emphasized by other authors [19]. It is challenging to describe the content and delivery of several components so that no ambiguity remains. Future studies could take the four adherence dimensions into consideration when formulating descriptions of intervention components and their delivery; this could help to specify content, frequency, and duration for each component.

The last adherence dimension, coverage, was especially useful in the present study because almost half of the potential participants declined to participate. Many prior studies have not evaluated coverage [24], which makes it difficult to determine to what population the findings are generalizable. The moderating factor recruitment was found useful because it provided information on factors affecting coverage. Another challenge concerning the evaluation of adherence was the fact that no standards exist for the optimal degree of adherence. We considered adherence high only when the components were always or most often delivered as planned concerning content, frequency, and duration. There is also no agreement on whether and how to weight the fidelity of the different intervention components, i.e., whether high fidelity for core components compensates for low fidelity for less important components. It is recommended that further studies discuss and define acceptable levels of adherence for the four adherence dimensions.
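To make the weighting question concrete, the sketch below shows one hypothetical way a weighted fidelity index could be computed if core components were given more weight. The component weights and adherence scores are invented for illustration; no such weighting was applied in this study.

```python
# Invented adherence scores (0-1) for a few components mentioned in the Results,
# and hypothetical weights giving core components twice the influence.
adherence_scores = {1: 0.4, 2: 0.4, 10: 0.8, 11: 0.8, 17: 0.8}
weights = {1: 2.0, 2: 2.0, 10: 1.0, 11: 1.0, 17: 1.0}

weighted_fidelity = (
    sum(adherence_scores[c] * weights[c] for c in adherence_scores)
    / sum(weights[c] for c in adherence_scores)
)
print(f"Weighted fidelity index: {weighted_fidelity:.2f}")  # 0.57 with these invented values
```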

The findings showed that non-adherence also involved components that were added to the model. Therefore, all measurements of adherence, such as fidelity protocols, should include categories for additional components. Our analysis showed that staff did not always reflect on or report the components they had added, which could make it difficult to capture these in a protocol or interview. Therefore, it is strongly recommended that observations be used repeatedly to measure adherence and added components.

We found that staff enthusiasm for the project (responsiveness) was high, and this seemed to be a reason for adding components to the intervention. These additional components were in line with the theoretical ideas of the intervention, and no contradictory components were added. It seems that a desire to give the best possible care to the participants was a driving force for adding components. This is in line with Fraser et al. [21], who suggested that a desire to improve program results can be a reason for local intervention adaptations. Fixsen et al. [5] highlighted that understanding the principles of the intervention core components may allow for flexibility in form without sacrificing the function associated with the components. We also found that some contextual factors, such as merging services with concurrent projects and additional resources, enabled the staff to add components to the present intervention. Thus, contextual factors enabled the additional components, but high staff responsiveness determined that the components were actually added. Staff with lower enthusiasm would perhaps not have added the components, even though contextual factors made that possible. Some authors have suggested that local additions to an original model tend to enhance effectiveness [11]. The effectiveness of additional components was not the focus of the present analysis, but we suggest that future studies investigate the possible positive (and/or negative) impact of staff responsiveness and added components on program outcomes.

Contextual factors such as organizational routines were often reasons for not delivering components or for modifying them. For instance, the formal support for relatives in the municipality did not target the relatives in the present project, and therefore no formal support for relatives could be offered. In addition, staff enthusiasm about the project made them add components, but contextual factors such as increased workloads made them remove these in order to focus on the original components. This is a classic situation in organizational intervention research, where interventions are not conducted in a vacuum. The longitudinal analysis revealed how the staff strived to strike a balance between resources and workloads on the one hand and their willingness to deliver high-quality care on the other. Fixsen et al. [5] suggested that high-fidelity practice is best achieved when implementation is well supported by strong organizational structures and cultures. This is most certainly valid, but difficult to achieve in practice when dealing with complex organizational interventions over a longer time period. In our case, the project had strong leadership support, and the content of the intervention was developed in collaboration with the participating practices in order to develop a program that would suit the local context [36]. Participating organizations change leaders, reorganize their units, and get involved in new projects, and these actions make it difficult to plan in advance. This further emphasizes the importance of longitudinal, systematic analysis of implementation fidelity in connection with an intervention study.

We also found that participant responsiveness, i.e., elderly people's preferences, was a reason for not delivering components. The CM was supposed to contact all participants at least once a month to check their status. However, some participants wanted to contact the CM themselves, and as often as they wished. Prior studies have shown that intervention components that are not in line with recipients' wishes are most often not delivered [38].

Most of the respondents experienced the intervention as a complex program, the description of the intervention as vague, and the initial facilitation as limited. This is in line with prior studies reporting initial confusion in project work [39]. While some described the lack of clear descriptions in the initial intervention phase as frustrating and hindering, others experienced it as positive, because it gave them room for individual interpretations of the intervention. The municipal staff in particular, who had long experience of working in similar projects, reported that they took a more active role and enjoyed the freedom to act according to their own judgment. It seems that the experiences of complexity and limited initial facilitation did not affect fidelity, which contradicts prior studies suggesting that simple interventions and interventions with detailed descriptions are more likely to be implemented with high fidelity [6]. In this study, the staff were highly responsive to the intervention, which may have functioned as a driving force for them to solve complicated practical issues and take a more active role in the implementation. It is possible that unmotivated staff would not have made the same efforts if they had experienced limited facilitation. Some prior studies have reported staff to be more engaged, motivated, and effective when they feel they are exercising their judgment and expertise [40, 41]. With this approach, the staff are not expected to follow process protocols exactly, but rather to work according to their own judgments of what fits the client characteristics, the context, and the program theory [42]. Implementation components, such as training, need to be standardized, but also flexibly adapted to providers' different levels of experience [43]. In line with that approach, our findings suggest that individual and organizational differences in prior experiences and in responsiveness to the intervention among those delivering it are important to consider when developing work descriptions and planning facilitation activities.

Our findings emphasize the interrelationships that the moderating factors can have with each other and with fidelity. For instance, staff experiences of prior similar projects strongly affected their responsiveness to the present project, which in turn influenced their preferred level of detail in work descriptions and facilitation strategies. Many contextual factors also hindered or facilitated the work of the project staff, while the impact of these factors on fidelity seemed to be modified by other moderating factors, such as staff responsiveness. Fixsen et al. [5] suggested that the interactive implementation drivers compensate for one another, so that a weakness in one component can be overcome by strengths in other components. Based on the results of the present study, it seems that staff willingness to deliver the program with high fidelity and participants' willingness to receive the components were the fundamental conditions for the implementation of the program. Factors of particular importance for fidelity were staff and participant responsiveness to the intervention on the one hand, and enabling and hindering contextual factors on the other. Implementation fidelity was shaped by the staff's commitment to the intervention program, as well as their ability to perform its content within the resources at hand. Highly responsive staff were also willing to overcome potential obstacles, such as hindering contextual factors. These factors are recommended as a first step in evaluating factors affecting fidelity. More research is needed to investigate the relationships between the moderating factors and fidelity.

The previously proposed [6, 22] Conceptual Framework for Implementation Fidelity was a useful tool for organizing the data collection on adherence and the moderating factors. It covered the factors causing non-adherence in this study, suggesting that it offers comprehensive measures of the factors affecting fidelity. However, the framework does not provide any guidance on how to investigate the interrelations between the moderating factors. It is suggested that the framework be further developed, or used together with other models, to examine the relative impact of the moderating factors on each other and on fidelity longitudinally.

Methodological discussion

The main strengths of the study were the use of three different data collection methods and the longitudinal design. In line with suggestions from other authors [44, 45], the different data sources complemented each other and offered reliable results. The direct observations were especially valuable. A longitudinal analysis allows the researcher to track the development of a program over time, providing a more thorough understanding [44]. Some authors [45] have suggested that fidelity also needs to be measured in control groups. In the present study, organizational routines made it impossible for the intervention components to be delivered to controls: the control group received care planning at the hospital and did not have any CM or multi-professional team to contact in the municipality. Thus, after careful evaluation, a decision was made not to put research resources into evaluation of the control group. A further limitation is that the elderly participants were not interviewed, because their respondent burden was considered too high. Finally, the intervention was conducted in local practice, but in a research context. Thus, it is possible that the factors affecting fidelity in this project are not fully comparable to real-life situations, because support from researchers was offered. Nonetheless, as Dane and Schneider [10] point out, understanding fidelity under research conditions is a first step to understanding program fidelity. The next step would be to study the implementation of the intervention after the research program has ended.

Conclusions

The Conceptual Framework for Implementation Fidelity was an empirically useful tool for collecting and analyzing data concerning adherence. It also included comprehensive measures of factors affecting fidelity and provided guidance for analyzing the moderating factors. However, a complex interrelationship existed between the moderating factors, and the framework provided limited guidance on how to investigate the relations between the moderating factors over time. It is suggested that the framework be further developed, or used together with other models, to examine the relative impact of the moderating factors on each other and on fidelity longitudinally.

References

  1. Lipsey MW, Cordray DS: Evaluation methods for social intervention. Annu Rev Psychol. 2000, 51: 345-375. 10.1146/annurev.psych.51.1.345.


  2. Dobson D, Cook TJ: Avoiding type III error in program evaluation: Results from a field experiment. Eval Program Planning. 1980, 3: 269-276. 10.1016/0149-7189(80)90042-7.


  3. Borrelli B, Sepinwall D, Ernst D, Bellg AJ, Czajkowski S, Breger R, DeFrancesco C, Levesque C, Sharp DL, Ogedegbe G: A new tool to assess treatment fidelity and evaluation of treatment fidelity across 10 years of health behavior research. J Consult Clin Psychol. 2005, 73: 852-


  4. Bellg AJ, Borrelli B, Resnick B, Hecht J, Minicucci DS, Ory M, Ogedegbe G, Orwig D, Ernst D, Czajkowski S: Enhancing treatment fidelity in health behavior change studies: Best practices and recommendations from the NIH Behavior Change Consortium. Heal Psychol. 2004, 23: 443-451.


  5. Fixsen DL, Naoom SF, Blase KA, Friedman RM, Wallace F: Implementation research: A synthesis of the literature. 2005, Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network (FMHI Publication 231)


  6. Carroll C, Patterson M, Wood S, Booth A, Rick J, Balain S: A conceptual framework for implementation fidelity. Implement Sci. 2007, 2: 40-10.1186/1748-5908-2-40.


  7. Bradley F, Wiles R, Kinmonth AL, Mant D, Gantley M: Development and evaluation of complex interventions in health services research: case study of the Southampton heart integrated care project (SHIP). The SHIP Collaborative Group. BMJ. 1999, 318: 711-715. 10.1136/bmj.318.7185.711.


  8. Craig P, Dieppe P, Macintyre S, Michie S, Nazareth I, Petticrew M: Developing and evaluating complex interventions: the new Medical Research Council guidance. BMJ. 2008, 337: a1655-10.1136/bmj.a1655.


  9. Dusenbury L, Brannigan R, Falco M, Hansen WB: A review of research on fidelity of implementation: implications for drug abuse prevention in school settings. Heal Educ Res. 2003, 18: 237-256. 10.1093/her/18.2.237.


  10. Dane AV, Schneider BH: Program integrity in primary and early secondary prevention: are implementation effects out of control?. Clin Psychol Rev. 1998, 18: 23-45. 10.1016/S0272-7358(97)00043-3.


  11. Blakely CH, Mayer JP, Gottschalk RG, Schmitt N, Davidson WS, Roitman DB, Emshoff JG: The fidelity-adaptation debate: Implications for the implementation of public sector social programs. Am J Commun Psychology. 1987, 15: 253-268.


  12. Becker , Smith J, Tanzman B, Drake RE, Tremblay T: Fidelity of supported employment programs and employment outcomes. Psychiatric Services. 2001, 52: 834-10.1176/appi.ps.52.6.834.


  13. Keith RE, Hopp FP, Subramanian U, Wiitala W, Lowery JC: Fidelity of implementation: development and testing of a measure. Implement Sci. 2010, 5: 99-10.1186/1748-5908-5-99.


  14. Abbott RD, O'Donnell J, Hawkins JD, Hill KG, Kosterman R, Catalano RF: Changing teaching practices to promote achievement and bonding to school. Am J Orthopsychiatry. 1998, 68: 542-552.


  15. Hansen WB, Graham JW, Wolkenstein BH, Rohrbach LA: Program integrity as a moderator of prevention program effectiveness: Results for fifth-grade students in the adolescent alcohol prevention trial. Journal of Studies on Alcohol. 1991, 52 (6): 568-579.


  16. Rohrbach LA, Graham JW, Hansen WB: Diffusion of a school-based substance abuse prevention program: Predictors of program implementation. Preventive Med. 1993, 22: 237-260. 10.1006/pmed.1993.1020.


  17. Mihalic S: The importance of implementation fidelity. Emotional & Behav Disorders Youth. 2004, 4: 83-86.


  18. McGrew JH, Griss ME: Concurrent and predictive validity of two scales to assess the fidelity of implementation of supported employment. Psychiatric Rehabilitation J. 2005, 29: 41-47. 10.2975/29.2005.41.47.


  19. Kilbourne AM, Neumann MS, Pincus HA, Bauer MS, Stall R: Implementing evidence-based interventions in health care: application of the replicating effective programs framework. Implement Sci. 2007, 2: 42-10.1186/1748-5908-2-42.


  20. Castro FG, Barrera M, Martinez CR: The cultural adaptation of prevention interventions: Resolving tensions between fidelity and fit. Prev Sci. 2004, 5: 41-45.


  21. Fraser MW, Richman JM, Galinsky MJ: Intervention research: Developing social programs. 2009, USA: Oxford University Press


  22. Hasson H: Study protocol: Systematic evaluation of implementation fidelity of complex interventions in health and social care. Implement Sci. 2010, 5: 67-10.1186/1748-5908-5-67.


  23. Baranowski T, Stables G: Process evaluations of the 5-a-day projects. Health Edu & Behavior. 2000, 27: 157-166. 10.1177/109019810002700202.


  24. Steckler AB, Linnan L, Israel BA: Process evaluation for public health interventions and research. 2002, Jossey-Bass


  25. Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O: Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Q. 2004, 82: 581-629. 10.1111/j.0887-378X.2004.00325.x.


  26. Naleppa MJ, Cagle JG: Treatment Fidelity in Social Work Intervention Research: A Review of Published Studies. Res Soc Work Pract. 2010, 20: 674-10.1177/1049731509352088.


  27. Glasgow RE, Lichtenstein E, Marcus AC: Why don't we see more translation of health promotion research to practice? Rethinking the efficacy-to-effectiveness transition. Am J Publ Health. 2003, 93: 1261-10.2105/AJPH.93.8.1261.


  28. Noonan RK, Emshoff JG, Mooss A, Armstrong M, Weinberg J, Ball B: Adoption, adaptation, and fidelity of implementation of sexual violence prevention programs. Heal Promot Pract. 2009, 10: 59S-10.1177/1524839908329374.


  29. Condelius A, Edberg AK, Jakobsson U, Hallberg IR: Hospital admissions among people 65+ related to multimorbidity, municipal and outpatient care. Arch Gerontol Geriatr. 2008, 46: 41-55. 10.1016/j.archger.2007.02.005.


  30. Kripalani S, Jackson AT, Schnipper JL, Coleman EA: Promoting effective transitions of care at hospital discharge: a review of key issues for hospitalists. J Hosp Med. 2007, 2: 314-323. 10.1002/jhm.228.


  31. Coleman EA, Berenson RA: Lost in transition: challenges and opportunities for improving the quality of transitional care. Ann Intern Med. 2004, 141: 533-536.


  32. Dunér A, Blomberg S, Hasson H: Implementing a continuum of care model for older people - results from a Swedish case study. International Journal of Integrated Care. 2011, 11: e136-


  33. Ouwens M, Wollersheim H, Hermens R, Hulscher M, Grol R: Integrated care programmes for chronically ill patients: a review of systematic reviews. Int J Qual Health Care. 2005, 17: 141-146. 10.1093/intqhc/mzi016.


  34. Eklund K, Wilhelmson K: Outcomes of coordinated and integrated interventions targeting frail elderly people: A systematic review of randomised controlled trials. Health & Social Care Commun. 2009, 17: 447-458. 10.1111/j.1365-2524.2009.00844.x.


  35. Hallberg IR, Kristensson J: Preventive home care of frail older people: a review of recent case management studies. J Clin Nurs. 2004, 13: 112-120. 10.1111/j.1365-2702.2004.01054.x.


  36. Wilhelmson K, Duner A, Eklund K, Gosman-Hedstrom G, Blomberg S, Hasson H, Gustafsson H, Landahl S, Dahlin-Ivanoff S: Continuum of care for frail elderly people: Design of a randomized controlled study of a multi-professional and multidimensional intervention targeting frail elderly people. BMC Geriatr. 2011, 11: 24-10.1186/1471-2318-11-24.


  37. Stemler S: An overview of content analysis. Pract Assess, Res & Eval. 2001, 7: 137-146.


  38. Martens M, Van Assema P, Paulussen T, Schaalma H, Brug J: Krachtvoer: process evaluation of a Dutch programme for lower vocational schools to promote healthful diet. Heal Educ Res. 2006, 21: 695-10.1093/her/cyl082.


  39. Gällstedt M: Working conditions in projects: perceptions of stress and motivation among project team members and project managers. Int J Proj Manag. 2003, 21: 449-455. 10.1016/S0263-7863(02)00098-4.


  40. Glisson C, Hemmelgarn A: The effects of organizational climate and interorganizational coordination on the quality and outcomes of children's service systems. Child Abuse & Neglect. 1998, 22: 401-421. 10.1016/S0145-2134(98)00005-2.


  41. Henry WP, Strupp HH, Butler SF, Schacht TE, Binder JL: Effects of training in time-limited dynamic psychotherapy: Changes in therapist behavior. J Consult Clin Psychol. 1993, 61: 434-


  42. Mowbray CT, Holter MC, Teague GB, Bybee D: Fidelity criteria: Development, measurement, and validation. Am J Eval. 2003, 24: 315-


  43. Borrelli B: The assessment, monitoring, and enhancement of treatment fidelity in public health clinical trials. J Publ Health Dentistry. 2011, 71: S52-S63.


  44. Bouffard JA, Taxman FS, Silverman R: Improving process evaluations of correctional programs by using a comprehensive evaluation methodology. Eval Program Planning. 2003, 26: 149-161. 10.1016/S0149-7189(03)00010-7.


  45. Bond GR, Evans L, Salyers MP, Williams J, Kim HW: Measurement of Fidelity in Psychiatric Rehabilitation. Ment Heal Serv Res. 2000, 2: 75-87. 10.1023/A:1010153020697.



Acknowledgements

The authors would like to thank all the interview respondents. The Vårdal Institute financed this implementation study and the development and evaluation of the intervention 'Continuum of care for frail elderly persons, from the emergency ward to living at home.' In addition, the project received funding from the Vinnvård research program. The first author also received postdoctoral funding from ERA-AGE2, Future Leaders of Ageing Research in Europe (FLARE)/Swedish Council for Working Life and Social Research.

Author information


Corresponding author

Correspondence to Henna Hasson.

Additional information

Competing interests

The authors declare that they have no competing interests.

Authors' contributions

HH, SB, and AD designed the study and collected the data. HH and SB did the main analyses of the data with help from AD. HH drafted the manuscript with help from SB and AD. All authors read and approved the final manuscript.


Rights and permissions

This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


About this article

Cite this article

Hasson, H., Blomberg, S. & Dunér, A. Fidelity and moderating factors in complex interventions: a case study of a continuum of care program for frail elderly people in health and social care. Implementation Sci 7, 23 (2012). https://0-doi-org.brum.beds.ac.uk/10.1186/1748-5908-7-23


  • DOI: https://0-doi-org.brum.beds.ac.uk/10.1186/1748-5908-7-23
