Section 3: Overview of Methods
The overall study employed an exploratory mixed-methods approach that included interviews, observations, document analysis, and surveys. Studies of communities of practice (CoPs) typically use both quantitative and qualitative methods to identify trends and examine underlying mechanisms within the communities (Fontaine & Millen, 2004). In line with exploratory mixed-methods designs (Creswell & Plano Clark, 2011), we began with qualitative data collection: interviews with leaders and staff from each community, observations of signature community events, and analyses of key documents from the four communities studied. This first phase allowed us to better understand the design principles of these communities and the nature of involvement in them. Findings from this phase informed the survey design for the second phase.
Observations and Document Analysis
The study began with a review of documents to develop a context for the four STEM reform communities. The items we collected included notes from meetings, planning documents, advisory board correspondence, descriptions of missions, philosophies, and values, key correspondence between leaders, grant applications and grant reports, reports for advisory boards and other key groups, and ongoing correspondence with the community via newsletters. Later, as part of the interviews, we collected key documents that participants identified as helpful for understanding what they found engaging, such as publications, blogs, or newsletters.
We observed a signature event for each community, visited each community’s main office (where we also went through their archives), joined their listservs, visited their websites on an ongoing basis, and attended other key events about which they informed us. Observation took place over two and a half years. During observations, our researchers took fieldnotes about the activities, using the literature on communities of practice and professional learning communities to develop observation protocols. In each case, observation notes from key events were taken by more than one researcher and then compared for validity. Fieldnotes from events were typically quite long, 30 to 35 single-spaced pages per event. As Rogers (2003) has noted, a tremendous amount can be learned from real-time studies that follow networks and communities of practice and observe their activities.
After an initial review of documents and of data from site visits, we interviewed 112 people (between 26 and 30 within each community), including both organization staff and faculty leaders. Each community studied is supported by an organization whose leaders and staff have worked extensively with the communities, both in their current forms and in the past. The communities each have longstanding members and leaders who have helped sustain them; for this study, we drew on interviews with faculty leaders in particular. We also asked to speak with faculty who were less involved in these communities in order to capture their experiences as well. Interviews lasted between one and two hours and followed a common protocol that asked about impacts or outcomes of participating in the community, level of involvement, what participants found most engaging in the community, what they perceived shaped the outcomes they noted, and other areas related to their engagement and involvement. The communities of practice literature informed the interview protocol. All interviews were digitally recorded and transcribed. Interviews were used to inform items for the survey and to build on the literature we brought to the study.
For the interview portion of the study, our sample (n=112) consisted of 75% current faculty members (n=84), 60.7% of whom were professors (n=51) and 29.8% associate professors (n=25). The remaining 25% of participants (n=28) were either former faculty members, current administrators, or staff members of the four reform communities. When asked to indicate their primary job responsibilities, 42% of the sample indicated teaching (n=47), 33.9% indicated administration (n=38), 3.6% indicated research (n=4), and 20.5% indicated other responsibilities (n=23). As for personal demographics, 57% of participants identified as female (n=64) and 92% identified as White (n=103).
The survey was conducted last, and its design was informed by the interviews, documents, and observations. The survey invitation was sent to 17,868 e-mail addresses.5 The survey was custom designed for each community’s particular structures (e.g., activities, communication vehicles), but it followed a common design to allow for comparison across the four communities. It addressed the following areas: participants’ involvement in the community over time; perceptions of community activities; perceived outcomes of community involvement for individuals, their departments, and their institutions; perceptions of the importance of community design elements for participants’ practice; and individual and professional characteristics. The survey design was informed by the information gathered in the first phase of data collection, as well as by the literature on the design and outcomes of networks and communities of practice. This allowed us to identify the design aspects and involvement opportunities that characterized these communities.
A total of 3,927 participants responded to the survey invitation, indicating a 22% initial response rate. This response rate is similar to the response rates of other surveys administered to national samples of STEM faculty (e.g., Hurtado, Eagan, Pryor, Whang, & Tran, 2012). The final sample for this study consists of 2,503 participants who completed the entire survey; these participants were distributed among 997 institutions (ranging from 1 to 28 observations per institution) and four communities (ranging from 235 to 1,102 observations per community). The survey sample consisted of 36.7% professors (n=919), 27.9% associate professors (n=699), 9.2% assistant professors (n=231), 20.2% non-tenure-track faculty or faculty working in institutions without tenure (n=506), and 5.9% individuals with no academic rank (n=148). The mean length of time spent teaching undergraduate students was 16.8 years (SD = 8.67). More than half of the participants worked in public institutions (n=1320, 52.7%); 21.2% worked in doctoral institutions (n=530), 32.6% in master’s institutions (n=816), 27.8% in baccalaureate institutions (n=695), and 13.7% in associate’s institutions (n=342), with the remaining portion working in other organizations or types of higher education institutions (n=120, 4.8%). As for personal demographics of the survey sample, 54.3% identified as female (n=1359), 82.4% as White (n=2062), and the average age of participants was 49.9 years (SD = 10.5).
The qualitative data were coded and analyzed using Boyatzis’ (1998) thematic approach. We first went through the data for new or emerging inductive codes. Second, deductive codes derived from the literature on communities of practice and learning communities were applied. Deductive codes included items from the literature on stages of CoP development and design principles, as well as items from the literature on learning communities. The qualitative data were analyzed using HyperRESEARCH, a qualitative software program that helps manage and analyze large amounts of qualitative data and eases the coding process. All forms of qualitative data, including interviews, observation fieldnotes, and documents, were entered into the software.
We used several analytical procedures to analyze the quantitative data. Scale scores for our outcome variables were calculated by averaging the individual items in each scale, rather than by summing them, to ease interpretation and comparison with other outcome items (Furr, 2011). We used descriptive statistics of our outcome and design variables to identify trends in the data.
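As a minimal illustration of this scoring choice (the function and variable names are ours, not the authors'; their actual scoring code is not published), averaging keeps a scale score on the original response metric, whereas summing does not:

```python
from statistics import mean

def scale_score(item_responses):
    """Average the items in a scale rather than summing them, so the score
    stays on the original response metric (e.g., a 1-5 Likert range)."""
    return mean(item_responses)

# Hypothetical three-item scale answered 3, 4, and 5 on a 1-5 range:
# the average (4) is directly interpretable on that range; a sum (12) is not.
print(scale_score([3, 4, 5]))
```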
We then used ordinary least squares (OLS) regression to examine the extent to which participants’ perceptions of CoP design characteristics and engagement are associated with the dependent variables in our study. Before fitting the OLS regressions, we examined the unconditional intraclass correlations (ICCs) for the three individual outcome variables (learning and improving practice, skills for leadership and change, and networking) and the two organizational outcome variables (departmental change and institutional change), because our participants were clustered by institution and reform community. We opted for OLS regression rather than multilevel modeling for two reasons. First, the majority of the variance in our dependent variables was within institutions rather than between institutions or communities. Second, our sample contained a large proportion of singleton institutions, as well as institutions with only two participants (35.2%), threatening the estimates and validity of multilevel modeling with these data (Rabe-Hesketh & Skrondal, 2012).
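The unconditional ICC referenced here can be estimated from a one-way ANOVA variance decomposition. The sketch below is illustrative only, assumes balanced groups, and is not the authors' code (which likely used multilevel-modeling software); it shows the ICC(1) estimator for, say, survey scores grouped by institution:

```python
from statistics import mean, variance

def icc1(groups):
    """ICC(1) via one-way ANOVA for balanced groups.
    groups: list of equal-sized lists of scores (e.g., one list per institution).
    Values near 0 mean most variance lies within, not between, groups."""
    k = len(groups[0])                        # observations per group
    group_means = [mean(g) for g in groups]
    msb = k * variance(group_means)           # between-group mean square
    msw = mean(variance(g) for g in groups)   # within-group mean square
    return (msb - msw) / (msb + (k - 1) * msw)

# All variance between groups -> ICC = 1; overlapping groups -> ICC closer to 0.
print(icc1([[1, 1, 1], [5, 5, 5], [9, 9, 9]]))  # 1.0
print(icc1([[1, 2, 3], [2, 3, 4], [3, 4, 5]]))  # 0.4
```

An ICC near zero is the pattern described in the text: variance mostly within institutions, which supports OLS over multilevel modeling.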
Prior to running the regression models, we calculated descriptive statistics and examined histograms for each continuous variable in the study to ensure approximately normal distributions. We also calculated multicollinearity statistics for all variables. Variance inflation factor (VIF) values were low (ranging from 1.04 to 3.42) and well within the acceptable range, indicating no issues with multicollinearity in the analyses (Meyers, Gamst, & Guarino, 2006). We ran regression models that included focal variables (design characteristics and engagement behavior) and control variables (personal demographics, professional characteristics and motivations, and institutional characteristics). All continuous variables (including the dependent variables) were standardized (i.e., grand-mean centered and scaled to unit variance) prior to their inclusion in the models.
Trustworthiness and Validity
We used multiple strategies to ensure trustworthiness, including outside experts and auditors, member checks, triangulation, piloting, and multiple coders. We had two advisory boards that informed the study design and reviewed results: an external board composed of national STEM experts and an internal board composed of members from each of the four initiatives studied. We presented data collection protocols and instruments, as well as findings, to each board for input. The internal board served as a member check, registering whether the findings reflected their insights and experience. We piloted the interview and observation protocols. We triangulated data from multiple sources: documents, observations, and interviews. For the focus on sustainability, the key step was examining alignment, and any discrepancies, between interviews and archival data about development and sustainability. Lastly, three different coders compared their interpretations of the emerging trends and their coding of deductive codes within HyperRESEARCH. Coding was conducted separately and then compared.
The exploratory mixed-methods nature of this study addressed a set of research questions that spanned outcomes, engagement, and the lifecycle of the CoPs. The qualitative work allowed us to understand how the communities operate and engage faculty, how they formed, and how they have been sustained over time. This information enabled us to design a survey for community members that could best capture the outcomes of participating in these communities and identify how engagement (often in terms of design principles) contributes to individual and broader outcomes in members’ departments and campuses. We now turn to the key findings from our study, beginning with the finding that these communities can be identified and understood as a variant of communities of practice. This model, which we call a “community of transformation,” encapsulates how we think these communities work to scale STEM reform.
5. The administrative staff of the four STEM reform communities provided us with contact information for each individual on their e-mail lists so that we could send personalized invitations and track responses. All four organizations acknowledged that their contact lists contained out-of-date information and included individuals who do not identify as faculty (i.e., members of other organizations). Additionally, one community has a high-school arm of its initiative and was unable to separate those addresses from the larger list. So, while the population in the study was approximately 18,000, there is no way for us to know the true population size.