VE Class of 2020

Feedback on your experiences within the program is crucial for us to understand which elements are most appreciated and which need improvement or modification. We take your comments very seriously. Indeed, the successful aspects of the program today are very much a result of the feedback from previous VE cohorts. We are proud of our program, but we are not ready to rest on our laurels. The larger visual ethnography team will be meeting next week to reflect on the year and plan for the coming one, so please complete this survey by 17 June so we can take your feedback into account. All responses are anonymous.

Thanks in advance.

Part 1: General Questions

When you compare where you are today with where you were when you joined the master's, how do you evaluate your progress? Have you learned more than you hoped or expected? Or less? Explain any differences.

Do you think the balance between academic research training and practical audiovisual training is about right? Or is one emphasized at the expense of the other?

The program privileges ethnographic filmmaking as a specific output, but attempts to complement this approach with a general foundation in multimodal methods. In your opinion, does the combination of these specific and general elements work? Would you modify this balance in some way?

Part 2: The Program Phases

The program is presented in three phases: 1) research design and proposal development (September-November); 2) field research and data generation (January-March); 3) data analysis and thesis development (April-June).

Between these three phases are transition workshops on a) field preparations (first week of December) and b) data organizing, editing, and analysis (end of March).

Please respond to the following questions about each of these phases and transition workshops. The next section will ask more specific questions about the learning elements.

Phase 1: Research Proposal

Phase 1 includes LISP, Research Design, and MiP, which are collectively meant to help you conceptualize, develop, and design an individualized research project. Did these three courses/training programs function well together to achieve this goal? Did they complement or compete with each other? What would help to give them greater synergy?

These courses were coordinated with a series of supervisory group meetings and occasional one-on-one meetings. This structure is a response to the very limited hours of supervision allocated for your thesis work. Did you find the group supervision sessions a good solution to this problem? Were they effective in helping you develop your proposal? What more could be done to help you in this process?

Workshop A: Field Preparations

This workshop combines intensive training and feedback to provide students with a final preparatory push to bring all their skills together and anticipate the issues they’ll face in the field. Was this week effective in preparing you for the field? What could be done in this week to help you be better prepared?

Phase 2: Field Research

While in the field, students are required to submit a series of three Field Reports. Did the reports help you organize your data and thoughts as you were progressing? Did they provide an adequate mechanism to communicate with your supervisor? Did this correspondence work to provide you with sufficient feedback? What could be improved in this interface?

These Field Reports also provided a context for sharing AV materials on Pitch2Peer, where students could get feedback from classmates. Was this an effective mechanism? Would you prefer more open-ended or directed assignments for these AV materials? Share any specific ideas you have.

Workshop B: Organizing, Editing, & Analysis

This workshop provides an intense series of modules on selecting, sequencing, editing, theory, and output in order to help students move rapidly from data generation to data analysis and be ready for the final phase of developing their theses. Was this two-week workshop effective in helping you transition out of the field? What could be done to help you make a better transition?

The foundation of this workshop is organizing your materials so they can be systematically accessed and assessed. The key mechanism for this is a process of logging your footage. As this is a massive task, we asked students to initiate it while in the field. How effective was the process of logging your materials for getting an organizational handle on your project? Do you feel the hard work of logging and organizing resulted in a more efficient and effective interaction with your data? What are the potential trade-offs of the heavy time investment, and do you see any alternatives?

Phase 3: Thesis Development

This phase is student-centered, combining, on the one hand, a series of structured assignments designed to provide a scaffold on which to build the thesis project and, on the other hand, a multifaceted effort to provide intensive feedback: input from both your individual supervisor and the other instructors, peer feedback in both written and verbal forms, and live sessions in both small and large groups. Ideally, this phase affords students the opportunity to experiment while also making systematic progress. As a whole, was this phase effective in helping you prepare your thesis outputs?

With its focus on assignments and feedback, this phase does not emphasize instruction; however, there was an effort to provide structured examples, and students were given small written exercises in the workgroups. Was the balance between individualized feedback and general instruction adequate? If not, what sort of additional instruction would you need?

Unexpectedly, we had to shift this phase to an entirely online experience due to the COVID-19 pandemic and the policy response. While not ideal, were there elements introduced that could be effectively adapted to more regular contexts, e.g., submitting and getting feedback on Google Docs, the construction of the CRIT sessions with simultaneous chat feedback, etc.? In other words, what from the online teaching experience would be worth retaining, if not further developing?

Part 3: Learning Elements

The program combines a number of pedagogical elements in a blended-learning structure, which includes online content, practical exercises, methodologically driven assignments, peer feedback, etc. The questions below ask you to assess these specific elements.

How would you evaluate the multimodal modules? Would you have liked more online lectures? Could they have been improved in any way?

Did you find the class time was used well? Which of the in-class elements did you find most useful? Which did you find least useful?

Did you find the AV Field Study assignments useful in learning the skills you needed to produce your film? How could they be improved?

Did you find the AV editing assignments useful in developing your film? How could they be improved?

Did you find the written assignments useful in advancing your research? How could they be improved?

Did you think the peer-review elements of the course – online and in class – were valuable? Would you have liked more or less of this kind of feedback?

Did you find the assigned readings and viewings helpful in developing and contextualizing your own research?

Part 4: Metacommentary

Prospective students have a variety of choices when it comes to selecting an audiovisual ethnography training program, so it is important for us to distinguish the “Leiden School” from others. What in your opinion makes this program unique and valuable? What attributes make it stand out? What among these qualities should be cultivated?

The world is changing rapidly, and our training should equip students with the skills to engage with future possibilities. What should we do to remain competitive, perhaps even ground-breaking and trend-setting, in this domain?

Do you have anything else to mention about this program?

Thank you for your feedback and all the hard work and dedication you’ve put into the program this year.

Good luck and much perseverance during the last weeks.