This study did not require Institutional Review Board approval. As a research team, we paid considerable attention to ethics throughout. During our research, we maintained the confidentiality of all participants and any of their personal or identifiable information. Interviews were recorded on cell phones, then transferred to password-protected computers. At the end of our research, the audio recordings were deleted. Before beginning each interview, we ensured voluntary participation and obtained verbal informed consent. We were mindful of the participants, asking each question in a conversational style, and we offered them the option to end the interview at any time.
We used purposive non-probability sampling to identify survey participants who attended ConnectiKids between 2002 and 2009. The organization provided us with a data set consisting of 161 names of past ConnectiKids participants across 5 cohorts: 2002-2003, 2003-2004, 2006-2007, 2007-2008, and 2008-2009. Participants were organized into these cohorts based on their last year in the program, with the average last year of participation being sixth grade.
We conducted a mixed methods study because it enabled us to capture a more holistic, multi-dimensional picture of ConnectiKids' impacts. Our qualitative findings add considerable breadth and depth to our quantitative findings, and vice versa. The approach also afforded us greater flexibility in our procedural decision making.
We utilized the National Student Clearinghouse (NSC), a leading provider of educational reporting and data exchange, to verify college attendance and completion rates of past ConnectiKids participants. The NSC's StudentTracker service aggregates data from the Department of Education's National Student Loan Data System. NSC provided data on whether individuals in our sample had enrolled in and/or completed an education program at a post-secondary institution, what type of college they had enrolled in, how many semesters they were enrolled, and the highest credential or degree they earned.
We also utilized a survey, constructed and disseminated through Google Forms and developed using past PYD program analyses as references. We sent out two waves of messages on social media that included the link to our survey. In the first wave, we sent messages via Facebook to 53 individuals from our data set. In the second wave, we sent another 23 messages via Facebook, LinkedIn, and Instagram. The survey consisted of 14 total questions (see Appendix B for survey questions). The first group of questions assessed measures such as high school completion and current individual and household incomes to establish a sense of past participants' financial stability. The second group of questions addressed social and emotional contentedness to develop an understanding of past participants' mental wellbeing. The remaining questions sought to determine whether respondents felt any strong positive connections to ConnectiKids, such that they might recommend the program to someone else or wish to be reconnected.
We conducted interviews using a standardized guide (see Appendix C for the interview guide). The guide had seven questions, four directly related to ConnectiKids and three constructed to establish an understanding of the individual's experiences. Our approach was semi-structured, allowing us to ask probing questions and cultivate a calm, conversational atmosphere with the intention of earning our interviewees' trust and honesty. We used Temi, an online transcription service, for the initial transcription, then edited manually to ensure complete transcripts. We coded the transcripts using a research-based deductive codebook along with an additional inductive coding process (see Appendix D for the codebook); ultimately, we did not include our inductive codes in the analysis. Finally, before conducting our analysis, we collected all of the coded content and input it into a self-constructed anecdote matrix with a column for each individual code term. This process enabled better visualization of the qualitative data and the themes they represent.
Our sample came from ConnectiKids' intake data and included contact information for participants' parents and emergency contacts from nearly two decades ago. This limitation resulted in zero initial contacts. In response, we created messaging for social media and employed a questionnaire to recruit participants for semi-structured interviews. This limited participants to those we could find on Facebook and LinkedIn, and it resulted in 7 survey completions and 3 semi-structured interviews. Additionally, the secondary quantitative data we have is not directly comparable: the demographics of our sample are not representative of New England and national data, reducing our ability to draw significant statistical comparisons. Lastly, the semester-based time constraint of 13 weeks prevented us from delving deeper into ConnectiKids' outcomes by restricting us to those survey and interview respondents who were easiest to find online.