Table of Contents:

Methods of Gathering Information
The Data
Evaluation of Research Methods
Methodological Suggestions for Future Researchers
Full Online Survey PDF





Methods of Gathering Information:

     To accomplish this goal, we chose to use an online survey as well as shorter, informal in-person interviews.

Online Survey:

     We developed the survey to produce mainly quantitative data along with some qualitative data. This mix allowed us to draw on the rich detail of open-ended responses while still leaving room for statistical analysis.

     Our survey was anonymous and was built with an online program called Qualtrics. It contained a maximum of 39 questions, though some respondents answered fewer because the question flow was dynamic, adapting to each respondent’s answers. We used multiple-choice or text-entry formats for nominal questions and Likert scales for ordinal questions. Some questions combined a multiple-choice format with a text-entry option so respondents could elaborate or be more specific.
     Many of these questions had pre-determined responses, which helped respondents consider options they might not have thought of on their own and sometimes prompted new responses we hadn’t thought to include. Pre-categorized responses gave us consistent, measurable answers and allowed us to compare across categories such as gender, race, or education.
     Many questions did not force a response, permitting respondents to answer only when they had something definitive they wanted to share. Some questions also offered a neutral option such as “I don’t know.”

The topics of questions included:
1. Demographic information
2. Questions about CHS
3. Questions about other nearby, better-known public cultural institutions

     The survey was distributed on our team’s social media pages. We also gave it to the Connecticut Historical Society, which distributed it on its social media pages and mailing lists, and to partner organizations such as Hartford Public Library, West End Civic Association, and Leadership Greater Hartford. The survey was available for ten days, from November 9th, 2022 until November 19th, 2022.


Informal Interviews:

     Our informal in-person interviews consisted of a maximum of 21 questions similar to those on the online survey, but fewer in number. These interviews mixed qualitative and quantitative questions. We went out to three Hartford community events: Domingo on Main Street, Halloween on Vernon Street, and the Capital Community College Halloween Costume Party, and asked people to participate in a short interview.

     Each interview took about 5-8 minutes while we guided respondents through the questions and noted their responses. We followed a script of questions worded carefully for clarity. We approached people who appeared likely to be within the target age range. The interview questions consisted of yes-or-no questions, open-ended questions, one multiple-choice question, and one 0-10 rating question. Since respondents were not filling out the sheet themselves or looking at it, their responses tended to be more qualitative.


The Data:

Cleaning the Data:

     To clean the data, we exported it into an Excel sheet, where we could sort the responses and view them side by side. In Excel, we looked for responses that stood out in ways that merited deleting them or other responses like them. The criteria we used for excluding data were:

    • Any zip code outside of the pre-determined ones (within about 15 minutes of CHS), including a large number of identical entries with more digits than a zip code requires.
    • Any response that took three minutes or less to complete
    • Any response that answered a question with “disremember” (there were a large number of these, and we had not heard this word used in any context).
    • Instances where open-ended responses were very specific, long, and identical to one another. These responses often did not make sense.
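
     We applied these criteria by hand in Excel, but the same filtering could be sketched programmatically. The snippet below is only an illustration: the column names, zip codes, and sample rows are hypothetical, not our actual Qualtrics export schema or the real list of valid zip codes.

```python
import pandas as pd

# Hypothetical subset of the exported survey data (illustrative columns only).
responses = pd.DataFrame({
    "zip_code": ["06103", "06105", "061050000", "10001"],
    "duration_min": [7.5, 2.0, 12.0, 9.0],
    "q_open": ["I visited last year", "disremember", "Not sure", "disremember"],
})

# Zip codes within roughly 15 minutes of CHS (hypothetical list).
valid_zips = {"06103", "06105", "06106"}

keep = (
    responses["zip_code"].isin(valid_zips)              # drop out-of-area or malformed zip codes
    & (responses["duration_min"] > 3)                   # drop responses taking three minutes or less
    & ~responses["q_open"].str.contains("disremember", case=False)  # drop the suspect "disremember" answers
    & ~responses["q_open"].duplicated(keep=False)       # drop identical open-ended responses
)
cleaned = responses[keep]
```

Of the four sample rows, only the first survives every rule; the others fail the duration, zip-code, or “disremember” checks.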

Analyzing the Data:

     First, we each examined the cleaned data and made graphs from responses that were interesting or important to pair with one another.

     Then, we looked at the survey and interview questions most relevant to our research questions and made multivariable tables and graphs from the responses.
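
     A multivariable table of the kind we built can be sketched with pandas’ `crosstab`. The column names and values below are hypothetical stand-ins, not our actual survey fields.

```python
import pandas as pd

# Hypothetical responses pairing two survey variables.
df = pd.DataFrame({
    "heard_of_chs": ["Yes", "No", "Yes", "No", "Yes"],
    "education": ["Bachelor's", "High school", "Bachelor's", "Bachelor's", "High school"],
})

# Cross-tabulate: counts of CHS awareness by education level.
table = pd.crosstab(df["education"], df["heard_of_chs"])
print(table)
```

Each cell counts how many respondents fall into that combination of categories, which is the shape of table we used to compare groups before graphing.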

     Using these graphs, we examined how the results answered our research questions, distilled the main findings, and turned them into recommendations.


Evaluation of Research Methods:

     There was a possibility, and perhaps even a likelihood, of researcher bias when developing the research questions and the survey and interview questions. This may have come from the researchers’ ties to the Hartford area community or from the researchers’ preconceived notions about which direction to focus the research.
     Another possible source of error was the short time frame in which the research project needed to be completed, which did not allow for thorough exploration of each step.
     The reliability of some results was poor. For example, in a few instances on the online survey, questions meant different things to different respondents: one group of responses answered the question as intended, while another group misinterpreted it.

Informal Interviews:

   The informal interviews were a good supplement to the online survey, but they relied on the interviewer recording everything in writing, which may have introduced error. There was also the possibility of leading the respondent with body language or audible responses. Some examples:
     1. If the interviewer responded to an answer by saying “good” or “that’s great.”
     2. Any other verbal response delivered in anything other than a neutral tone.
     3. A sigh may have misled respondents into thinking they were taking too long, answering in an unfavorable way, or any other thoughts about their own performance in answering the questions.
     4. Respondents may have felt embarrassed if they did not know about CHS and felt they should have. They also might have felt pressured into saying they wanted to visit when they actually did not.

     The informal interview was kept short and direct, which kept respondents from growing impatient and eager to be done with the interview.

     One of the positive aspects of the informal interviews was being able to talk to people who spend their leisure time in Downtown Hartford, near CHS. These people are potential CHS visitors because they are local residents who already spend their time at Hartford cultural events.

     The online survey was distributed through the researchers’ and community partners’ networks, which did not capture the same diversity of perspectives as the participants of the random interviews in Hartford. The informal interviews drew a more natural and random selection of people in Hartford who are not tied to learning institutions or universities and might be just as interested in Connecticut history because it is where they live or grew up.

     The informal interviews were also worthwhile because they gave us an opportunity to compare the interview responses with the online survey data and see how they differed.


Online Survey:    

     The online survey was a good choice because we got many more participants than we could have reached through in-person interviews in our time frame. It may, however, have reached a more academic audience given the institutions that helped promote it.

     Respondents were also able to take the survey at their own pace and in private, without feeling pressure to answer in any specific way. Since the survey was online and everything was recorded digitally, we have the exact answers respondents gave, unaffected by an interviewer’s ability to keep up with writing answers by hand. This gives us more accurate data for analysis and for drawing conclusions.

     The types of questions used on the survey worked well because they balanced wide-ranging pre-determined options with room for respondents to voice their more personal perspectives.

Methodological Suggestions for Future Researchers:

    1. Try more open-ended questions.
    2. Find a method of distribution that is more representative of Hartford demographics.

View our full online survey below:

