Evaluation


Final Evaluation:

Advice_Exhibit_Evaluation_Report.doc

 

Planning

There are three parts to the evaluation plan for the Advice exhibit:

 

*Interaction data collection (how many times each element was used)

*Interviews with exhibit visitors

*Survey for class members

 

Our main research question is: did strangers interact? 

Interaction data collection

Data will be collected at the end of the exhibit. We will count the comments and activities in each method of interaction; counting at the end of the exhibit ensures that we do not double count any interactions. The lists below are intended to be exhaustive, so please let us know if we missed anything. Because we are counting at the end of the exhibit, please DO NOT THROW ANYTHING AWAY; even if you have taken a photo, please retain everything. (A rough tally sketch follows the lists below.)

 

Online

Flickr (Julie)

Twitter (Erin)

Tumblr (Alex)

Voicemail (Alex)

Emails (Alex)

Facebook (Kylie)

Delicious (Kylie)

Blogs (Kylie)

 

Physical (all Kylie and Jaisa)

# of Post-its

# of advice booth participants

# of bathroom comments

# of buttons made (based on the amount of materials used)
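
If it helps to keep the final numbers in one place, here is a minimal tally sketch in Python. The channel names simply mirror the lists above, and every count is a placeholder to be filled in at the end of the exhibit; the script itself is just an illustrative aid, not part of the assigned plan.

```python
# Hypothetical tally sheet for the final counts. Channel names mirror the
# lists above; every count below is a placeholder to be replaced with the
# real numbers gathered at the end of the exhibit.
from collections import OrderedDict

online_counts = OrderedDict([
    ("Flickr", 0),
    ("Twitter", 0),
    ("Tumblr", 0),
    ("Voicemail", 0),
    ("Emails", 0),
    ("Facebook", 0),
    ("Delicious", 0),
    ("Blogs", 0),
])

physical_counts = OrderedDict([
    ("Post-its", 0),
    ("Advice booth participants", 0),
    ("Bathroom comments", 0),
    ("Buttons made", 0),
])

def report(label, counts):
    # Print one line per channel plus a subtotal for the category.
    print(label)
    for channel, n in counts.items():
        print(f"  {channel}: {n}")
    print(f"  Subtotal: {sum(counts.values())}")

report("Online", online_counts)
report("Physical", physical_counts)
print("Total interactions:", sum(online_counts.values()) + sum(physical_counts.values()))
```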

 

Interviews 

We are interested in gathering qualitative data about visitor experiences, so we will conduct interviews. We will approach every third visitor to the exhibit (where possible) to approximate a random sample. If subjects have visited the exhibit more than once, ask only about their most recent trip through the exhibit.

 

1) How did you hear about the exhibit?

2) Did you talk with a stranger while in the exhibit? Is there anything you'd like to share about your interaction?

3) Did you solicit, contribute, or comment on any advice in the exhibit?

4) What, if anything, about this exhibit are you likely to share with someone else?

5) Is there anything else you would like to share?

 

 

Survey for class members

We will send out a Catalyst survey to class members to gather self-reflective feedback on the successes and weaknesses of this exhibit. The survey is meant to focus solely on the exhibit, not on the class as a whole. It will be sent out via Catalyst on Sunday and will close Tuesday at noon.

 

1) What was the best part of the exhibit and why?

2) In what ways do you think the exhibit was successful in getting strangers to talk to each other? In what ways was it not successful?

3) If you were to mount this exhibit again, what would you change to improve visitor experience?

4) What parts of this exhibit could be incorporated into a museum setting? What parts could not be incorporated?

5) Did you observe anything noteworthy regarding visitor interaction?  

6) Do you have any other reflections on this experience?

 

 

Team Members:

Alex, Kylie, Jaisa, Erin, Julie

 

Hello team! It seems like we are in a position to sit and observe for a short time, until the exhibit and its goals are more clearly defined. While the main goal is to get strangers to interact with one another, there will undoubtedly be additional goals to evaluate. As the exhibit comes together, think about what we want to evaluate and which tools would be most valuable: observation (timing and/or tracking), surveys, interviews, etc. It seems unlikely that a focus group will be needed for this evaluation. Time will be an issue for our evaluation, so let's keep it focused and manageable.

 

Stay tuned and feel free to comment as wanted/needed.  We have a good group and I'm excited to see what we come up with. (julie)

 

...

We're quickly approaching the time when it makes sense to begin planning our evaluation. Here's a tentative schedule, just to throw out there; make any changes/suggestions you see fit. It's going to be tight, but the only part I'm really worried about is being able to collect data, evaluate, and prepare a report in one week. Tight fit.

 

May 17-23:

Determine question(s) to be answered by evaluation

Determine method of evaluation

 

May 24-30:

Determine sampling method

Draft of evaluation tools

Determine length of data collection, any coding, analyzing, etc.

 

May 31-June 6:

Finalize and prepare evaluation tools for use

Assign roles for evaluation

 

June 7-12:

Collect data during first part of week (June 6-8)

Code/analyze during middle of week

Prepare report 

 

Three cheers for designing, executing and analyzing an evaluation in four weeks!

 

...

Here are the visitor goals; keep these in mind when thinking about evaluation questions. (julie)

 

Visitor Goals 

 

 

 

Blog Posts about us

 

http://thinkingshift.wordpress.com/2009/04/02/talk-to-a-stranger/

http://scienceblogs.com/clock/2009/03/try_to_get_strangers_to_talk_u.php

http://hankblog.wordpress.com/2009/06/04/can-museums-learn-from-games/ 

http://dailyuw.com/2009/6/4/give-and-let-give-student-exhibit-encourages-inter/

http://www.artsjournal.com/anotherbb/ 

http://burkemuseum.blogspot.com/2009/05/burke-behind-scenes-training-future.html 

http://slog.thestranger.com/slog/archives/visual-art/ 

http://hankblog.wordpress.com/2009/05/29/uw-museology-exhibition-on-advice/ 

http://www.artsci.washington.edu/features/adviceexhibit.asp 

http://www.digitalcity.com/2009/05/22/advice-to-graduates-on-exhibition/