Webinar 2 – Assessing the student experience: Student affairs learning outcomes

>>Alexandra MacFarlane: Good morning, everyone. Thank you for joining us for today’s
webinar, Assessing the Student Experience: Student Affairs Learning Outcomes. In this webinar, we will
explore why it is important to measure the skills students are
developing in co-curricular activities and how different institutions are developing
and assessing their learning outcomes. My name is Alexandra MacFarlane. I am a researcher at the Higher Education
Quality Council of Ontario otherwise known as HEQCO and I will be the
moderator for today’s event. As I go over some housekeeping items
and introduce today’s panellists, please take a moment to fill out the form
on the right-hand side of the screen. This will help our presenters
better understand who you are. So a few things before we begin. If you have any questions for the panellists
or are experiencing any technical difficulties, you can send us a note in the Q&A on the
bottom right-hand side of the screen. When you do this, please keep the
send defaulted to all panellists so that everyone can see your question. So you may be wondering who
is in this webinar with you, and it may appear that you are the only one. But I assure you this is not the case. We have over a hundred people with us today. So thanks to everyone for joining us. So the plan for today is that we are going
to hear from three different panellists for about 10 minutes each
and then we’ll have some time for some questions at the end of this webinar. So if you have any questions,
your chance to send them to the panellists is through the Q&A section. We’d like to answer everyone’s questions but,
given the large size of our group today, we would like to apologize in advance
if we do not get to your question. For those of you who cannot stay with us
for the full hour or for your colleagues that were unable to join us, you will be able
to find this webinar posted on our website at www.heqco.ca shortly following
today’s session. So without further delay, it is my
pleasure to welcome today’s presenters. With us, we have Dr. Cara Wehkamp who
served as the Manager of the Office of Intercultural Affairs and Student
Life at the University of Guelph. We also have Mr. Adam Kuhn who is the Director
of Student and Campus Community Development at the University of Toronto and Dr. Sonia
DeLuca Fernandez who is the Director of Research and Assessment for the Division of
Student Affairs at New York University. So if you have not already done so,
please answer the two poll questions on the right-hand side of the screen. The poll will be closing in approximately
10 seconds and Cara will go over this poll. So at this time, I’d like to welcome Cara.>>Cara Wehkamp: [Inaudible],
good morning everyone. I’m honoured to be able to
share with you all today. To place my contribution in context, I am
the Chair of my Departmental Assessment Committee.
At Guelph, the Department of Student Life resides within the Division of Student
Affairs and is composed of a few units: the Centre for New Students, Community
Engagement and Global Citizenship, Off-Campus Living and the
Office of Intercultural Affairs. And for our context, the Assessment
Committee is made up of individuals from across all of those units. So in preparation for today, the
facilitators all reflected a bit on what student affairs as a profession
is, and some words that came to mind for me are included on the slide. I think that oftentimes, when we are talking
across our institutions or sometimes outside of our institutions, there’s a view of
student affairs professionals as the people that are helping students navigate their way through our institutions and
supporting them personally. People often think of housing, athletics, different units that are helping
students succeed in the classroom. And while parts of that are true, I believe
we’re all educators ourselves, working to foster environments and opportunities where
students can learn outside of the classroom and integrate their classroom learning
with volunteer and paraprofessional roles. I think it’s through those roles and experiences
that student affairs professionals help students demonstrate and articulate the
knowledge, behaviours and skills that are needed to be successful in
postgraduate work and the labour market. For us at Guelph, we’re guided by our student
affairs mission: developing the person, the scholar and the citizen. I want to take a second
to look at the poll results. It looks like many of the people
on the call today are responsible for assessment within their role, 79% of us. And there’s a range of personal
experience also noted there. So in student life, our vision is to be partners
in learning, as I was indicating, and our department in its current
configuration is somewhat young. And while there were a number of
assessment activities going on at the individual unit level, including
things like personalized learning and development plans, some pre- and post-
assessment activities and, of course, a number of surveys, there was no harmonization
in the process or results of these initiatives. However, there was an understanding that a
coordinated assessment approach would be helpful to reinforce what it is that we offer, what’s
the value added in what we offer students, and to help us measure the impact
on student learning and development that we are having outside of the classroom. But we weren’t starting from scratch. In 2010, our divisional outcomes were
released with cross-divisional input, and that resulted in five learning outcomes. In 2012, institutional
outcomes were released. They had a very strong academic focus and
comprised another five learning outcomes. We started this work around 2013, with our outcomes being established
in 2014, so they are quite new. And it was cross-departmental work. Each unit had the freedom to map
their own outcomes and any outcomes that they were using previously
from other sources. All key members were included in the process
including directors, managers, coordinators, [inaudible] and then at the
unit level, students. Each unit established inventories
of their core programs. Key to all of these processes, and I think what is
really important in assessment, is that we have the support of our
director, who has prioritized assessment. She helps provide us with a vision. She provided the resources needed to
do this, including people, time and the platforms to help us make it happen. Naturally, there are a number of
promising practices in the literature that we revisited and drew from. Some of those include AAC&U,
the Association of American Colleges and Universities, Learning Reconsidered,
Learning Reconsidered Two, the CAS Standards, the University of Pittsburgh, whose
outside-the-classroom curriculum helped guide us, and support from Campus Labs, whose Baseline and CollegiateLink platforms are utilized
across our campus. Just to give an example of what that looks like
on the ground and what our colleagues were asked to do with the inventory process, I’m going to
start sort of in the nine-to-twelve-o’clock area. We asked them to link their core
programs to the divisional outcomes. And we also asked them to include any other
outcomes that they’re using from another source. We then asked them to also describe
the purpose of the assessment, since some people’s assessment was
being linked to reporting needs or to collaborative initiatives
they were doing. We asked if there were specific populations that
they were working with; in our case, we work extensively with diverse students, first-generation,
[inaudible] first-year, international and off-campus students, just to name a few. And finally, we asked them what their
current assessment methods were. And as some might expect, the most popular
assessment method was surveys. There was also a lot of one-on-one observational
work happening, as well as personal reflection. There were also tons of evaluative processes,
so satisfaction surveys were especially abundant. The process allowed individual
units the freedom and autonomy to self-identify their core programs, and it
was important that there was collaboration and learning throughout the process rather than a prescriptive strategy
coming from the committee. That resulted in eight learning domains,
which I have included on this slide. Each has a set of learning outcomes. We actually have 44 outcomes in total,
so we have quite a few. Each domain also comes with a
preamble that provides the context, giving the overall goals of the
domain and its learning outcomes. And because students have always
been at the forefront of the work, the language has been really important
in how we are promoting this to them. The images for the badges
are also very Guelph-focussed. I know a few people are surprised
when they see our communications, because the badges are a cannon, and
that may seem strange. But if you’ve ever experienced our campus
and know that our cannon is painted nightly by the students, the relevance is clear:
it is a message board on our campus. Some of the mapping
was an interesting process. You have 44 learning outcomes and you set
people free to decide which outcomes they feel are linked to their programs. And we did see a little bit of
over-excitement in that first round. Particularly for units where people weren’t yet
using learning outcomes, there was a little loss of focus on being
intentional, which is really easy to do. We all know that anecdotally our programs
might do this or that, so we had witnessed some of the power and potential of
indirect learning in our programs, but we weren’t necessarily measuring it. So we went back and did a second round
where we really guided staff to look at what we are intentionally
teaching and training, and to select the outcomes
that we can measure and assess. The next step for the assessment team was to
develop rubrics for the learning outcomes. This is just one example from those. The rubrics are a direct measure for
assessment, but they require the student to have a true opportunity to
display their knowledge, behaviours or skills so that these can be measured. Overall, rubrics can help communicate
the expectations for students, but I also think they help communicate
to staff the different levels of learning that their programming might
provide for students. They allow for consistency
across our units. They also provide feedback on student
learning as well as programmatic feedback. And so through all of that, they’re
helping us refine our practice and enhance student learning. It was also a really time-consuming process. We were fortunate to have
the opportunity: we met probably weekly for months in
a row as we worked on those rubrics. So what did that really look like
on the ground, as an overview? It meant checking out the literature and the promising
practices and talking to a lot of different people. I found a lot of people who were already
ahead of us and really willing to share, which was important to the
way that we did our work. It helped us look for efficiencies
in how we were doing things. We really needed to engage staff and
keep them engaged in the process. So we continued to make sure that assessment
was a priority within the year, and made sure that there was a section in all of our departmental meetings
where we talked about assessment. It wasn’t always work. It could be someone providing a highlight from
assessment results that they had completed. We then started to put together
some toolkits for staff beyond the rubrics, and we continued to further
develop those that were working. We intend to share
our promising practices with the wider community, both
institutionally and with colleagues at other institutions and with our funders. What does it look like for students? We use, like I mentioned, CollegiateLink, so
[inaudible] called GryphLife, and we’re fortunate to have a platform that allows us to both
promote events and track student learning, and that allows students to access the
results of some of that learning. We put together a
team of staff and peer helpers who help students navigate
the different programs and opportunities that are available on campus. We’re finding students have fatigue about
all of the different offerings that are there, so this helps them navigate the
different learning that they want to explore and spend their time a little
more wisely to reach their goals. So we’ve really just completed a
year of launching some of this work, and we’ve had a great response from
students, but we’ve also had a great response from other departments within student affairs. And more recently, even some of the academic
programs that are looking at the co-curricular within their programs are engaging
with some of the work we’re doing. So it’s really exciting to
see people on campus engaging in the work beyond just within
our own department. The assessment committee is working
hard on an assessment handbook. Whether that will actually be book-based or
more of a web tool that our community can continue to use to support themselves
between learning opportunities is something we have yet to finalize. And then the thing that we’ve recently started, which is exciting, is that we’re looking
at core assessment questions. So for this first stage, we’re
looking at core assessment questions for each learning domain,
not each learning outcome. And it should be an exciting year
as we start to really work on those and re-engage some students
in the work that we’re doing. So I’m going to leave it there. Like I said, I think it was
really valuable to be able to reach out to people, so I would
be really happy to have people reach out to me. If there’s anything that I shared today that
you’re interested in learning more about, I’m happy to share and can be contacted anytime. Thank you.>>Adam Kuhn: All right, thanks, Cara. Thank you so much for sharing your
information about the amazing things that are taking place at
the University of Guelph. My name is Adam Kuhn and I
am the Director of Student and Campus Community Development
at the University of Toronto. My area consists of three main teams, club
and leadership development, mentorship and peer programs, and lastly
orientation, transition, and engagement. I also have responsibility for providing
divisional leadership in the area of assessment along with our newly-hired
Manager of Assessment and Analysis, Jeff Burrow. So to understand a little bit about our context,
in terms of student life and student affairs, U of T is quite large and decentralized. So while today I’ll be speaking about the
work we’re doing in the central division of student life, there is also a ton
of great work being done with regards to learning outcomes and assessment
in the various faculties, residences, campuses and undergraduate colleges. So in our discussion today, I thought it
would be useful to start big and broad and then slowly zoom into some specifics about
how we are thinking about learning outcomes and assessment here in the
Division of Student Life. So I thought I would start
with a quote. I was digging through some old CACUSS monographs
and I found this one from 1988, written by William Stewart, and I thought
it would be a great place to start: “In order for us to become more influential
contributors to the development of students on our campuses, we need to
work toward the development of an appropriate vision of our work. This vision, and the way through which we
implement it, must maximize our support of the [inaudible] process,
clarify our co-curricular role relative to the whole student experience, bring
the student support services together and explain our role to the campus community.” I thought this would be interesting to
highlight because what Stewart was saying in 1988 is still very much true today. In student affairs, we are consistently
reflecting on how we can meaningfully contribute to the whole student experience and this
is just as true today as it was then. Assessment efforts are a great way to
bring clarity to our role as educators, both to ourselves as well as to
students and other campus stakeholders. And true as it was in 1988, it is
vitally important to showcase the value of student life units to the entire campus as
well as improve programs and program offerings in order to maximize the level of support
and engagement for students on our campuses. Whenever I speak about assessment, I
draw upon a story that I find useful in explaining the importance of
learning outcomes and assessment. It involves a target which is why I chose
this beautiful Jasper Johns painting. A man is walking by a barn and notices
a 10-year-old kid throwing darts at targets painted on the side of the barn. He looks more closely and notices that out
of the 20 or so darts the kid has thrown, the kid has hit the target every time. Not only is he hitting the target but he
has gotten a bull’s eye with every throw. The man approaches the kid and
says, “That is so incredible. You are so talented. How is it possible that you’re able to
hit the bull’s eye every single time?” The kid says, “It’s easy, sir. First I throw the dart and then I walk up right
after and draw and paint a circle around it.” I like the story because it speaks
to how we often look at assessment. We run a program, an initiative, an intervention,
a service or an activity, and then afterwards we conduct an assessment,
draw the circle around what we’ve accomplished
and say that it was successful. The story is helpful because it helps us
realize that another way to do it would be to draw the target first. This can look like a learning outcome,
a participation target or any kind of goal that you have related to
your service or activity. When we deliver the service or
program, we are then throwing the dart. The act of assessing the program then lets
us know how close we got to the bull’s eye and if we are slightly off, the data can
help us course-correct for future programs and understand how we can improve. Here in our division, we have lots of
strategies to assess our learning outcomes, and Cara mentioned several that they’re
also using at the University of Guelph. We have an abundance of surveys that we’re
conducting. One of the things that seems to be
also quite popular is the idea of pre- and post-tests: administering the
same instrument in advance of an activity and then again at the end,
then being able to track the difference between the pre-activity responses
and the post-activity responses. I had an interesting experience
with this at a previous institution, where we were using a socially just leadership
inventory to do a pre- and post-test for a leadership certificate. What we observed is that students in
the post-test were rating themselves at scores that were much lower than where they
were at the start of the activity. So we had a bit of a freak
out because we were nervous that our leadership certificate
was having a negative effect. So in order to assess this further, we conducted
some informal focus groups as a way of kind of member checking and bringing the data
back to the students who had participated. What they showed us was that, in fact,
prior to participating in the certificate, they were very confident that they were
socially just, that they were collaborative, that they understood equity,
diversity and inclusion. And through the process of going
through the leadership certificate, they had a more in-depth knowledge
and awareness, and so at the post-test they had a little bit more humility, to use
their word, around these topics and therefore rated themselves
a little bit harder. So in fact, when we were able to do a bit of
a mixed-methods approach, with the pre and the post and the member-checking focus
group, we were able to see that our certificate was having the
results that we were hoping for. A strategy that we have been using to assess
a multicomponent program is our approach to mapping learning outcomes. If you have a workshop or a
one-off event, sometimes you can map out several learning outcomes and then design
your assessment strategy to look at whether you are or aren’t
meeting those outcomes, or to what degree you are
meeting them. But then it can become quite complicated when
you’re looking at multicomponent programs so I’m thinking of perhaps those student leader
training programs, alternative reading week or service learning, anything
where students are going to be having multiple different touchpoints or
exposures to different learning experiences. So I’ve got this slide up here, which is
just a crude example of mapping outcomes that we have used in some
of our training programs here in our office. On the left, you’ll see that we’ve got
some of the learning outcomes outlined, and on the top row you’ve got
the various learning experiences. It could be a workshop. It could be a keynote speaker. It could be an experiential-type opportunity, and this is just a crude example
because it’s a simple yes or no. So you might have that in workshop one,
a person is learning a little bit about learning outcome one; in learning
experience B, yes, yes; and then you can track it all the way through,
mapping out the degree to which students are being exposed
to the various learning outcomes. You might see after mapping it
that students are getting one of the learning outcomes a great
deal and others not so much. So this is a great way to do an
assessment of a multicomponent program, to see where students are getting
exposed to various ideas and thoughts. You can take this an additional step
further by looking at the degree to which students are getting into a particular
learning outcome in a learning experience. A very basic example
could be around active listening. In learning outcome
one and learning experience A, students might just be exposed to
the idea or to the value of it. Learning experience B could be
a workshop really developing and practising that skill, and in
learning experience C perhaps they’re really getting
a chance to practise and pick apart the skills of active listening. So you can take this to another level by
getting into the depth as well as the breadth of the various learning experiences. So I wanted to just give a few examples
of how some of the tools and strategies that we’re using for learning outcomes
assessment, but I also wanted to speak a little bit about
the work that we’re doing with our learning outcomes
and assessment committee. This committee has representation from all the units within
the Division of Student Life here at St. George. And we have been focussing on three main
areas and the first one is around capacity. So as a committee, we’ve been working
to try to identify and organize staff PD to help us build our confidence and
competence with regards to assessment. The second is the utility piece: discussing
the results of divisional and institutional projects, as
well as attempting to disseminate, share and apply relevant
data across the division. I think we actually are quite good at
collecting information, and informally we’re great at taking that,
processing it and applying it. But we’re looking at trying to create a
strategic way of doing that across the division. And the third is strategies: looking at
divisional key performance indicators and overall assessment strategies for
the division, and guiding learning outcomes and assessment initiatives
across the division as well. So this is what we’ve been working on
in terms of our committee, but today I just wanted to focus a little bit
on that top one there, around capacity. So we came up with a bit of a five-point plan
for how to build capacity across the division. And I’ll just speak about these briefly and then
I’ll talk a little more specifically about our blog. So the first one is around a competency
framework. Some already exist, and we borrowed very liberally
from ACPA’s ASK standards, the Assessment
Skills and Knowledge standards. It’s a competency framework that outlines
all the different ways staff should be able to engage in assessment efforts. So we took that and, through our committee,
reviewed and adapted it so that it could be applicable for
our particular context here. And that, I think, has been useful
because with the [inaudible] assessment, some folks don’t necessarily see themselves
as experts and want to know how they should be able to
manoeuvre these activities so that they can feel confident doing so. So with the competency framework in mind,
we developed a workshop series of about 10 workshops that we’re just
starting to roll out this semester, and each of the learning outcomes from the competency
framework is mapped onto these workshops. So if you attend all of them,
you would at one point be exposed to each of the competencies
in the framework. Alternatively, you could cherry-pick
which workshops you want to attend. If you were using the
assessment competency framework as a reflection tool, you
might see that one of the competencies is the ability to design and
utilize a rubric, and so you might want to find the workshop that is most
relevant to that and attend that one. We have such a diversity of
levels of experience across the division. The next one, item
number three, is our blog, which has been a kind of
central point of contact. We wanted to build up this idea of
having a discussion around assessment, sharing the different processes
that people are engaged with and what they’re trying out
in our different units. The blog has been useful, and I’ll speak
a little bit more about that in the next slide. The fourth item is peer review, which is just a
really great habit for any practice with regards to assessment. A lot of people are at their desks
either developing their own learning outcomes or drawing from other kinds of instruments
to guide that, but we really wanted to encourage the practice of sharing
that work and getting a second set of eyes on it. So we’re really trying to
promote this idea of peer review. Whether you’re trying to design your
assessment plan, or you’ve got a survey instrument or a focus group protocol that you want
to work through, we’re really trying to promote this idea, and we booked
space and time on our campus where people can gather to do this. They can meet with members of the
committee as well as other peers to engage in that practice of peer review. And we’re finding that people who have
participated have enjoyed it, have enjoyed the opportunity to articulate
their goals and their strategies to someone else and to receive honest feedback
in order to, again, enhance their practice. And the last one is around resource
creation. I think this is similar
to what Cara was speaking about with regard to toolkits, and the one that has been the most popular is our
toolkit around asking demographic questions. What we noticed is that many people across
the division are asking lots and lots
of demographic questions but aren’t necessarily using them in a meaningful way, or might be asking them in a way that might
not necessarily be appropriate or relevant. So we created this whole toolkit
on asking demographic questions, with a huge caveat around
thinking about and really reflecting on why you might need to ask these questions. And once you’ve come to the decision that
you need to, here are the best ways to ask them and navigate that particular
part of the assessment process. So the blog is called Measuring
UP, and this was an existing blog from the old Division of Student Life. I just wanted to show you what
it looks like; the posts are actually a little bit fun. We’ve got lots of different articles,
as you can see there. There’s one on applying classroom assessment
techniques to Student Life programming. We have one around ensuring accurate
representation of student data, which tries to promote the
whole idea of member checking. But then we’ve also been engaging
some students in the discussion, so we had a student write a blog post
called “To Complete or Not to Complete, A Student’s Perspective on Taking Surveys.” Great! And then we had one called
“On a Scale of One to Five – How Much Do You Love Likert Scales?” and then one around using infographics
to share assessment results. So it’s both a space for us to reflect on our
practice as well as to share some of the practices
that we’re engaging in that might be useful to other folks on the campus. So lastly, I would be remiss
if I didn’t mention some of the resources that have been really useful. And Cara did mention them so I just
got the visuals here to support that. We’ve got Learning Reconsidered
and Learning Reconsidered Two, Assessment Reconsidered, as well as the
Professional Standards for Higher Education, which comes from CAS, short
for the Council for the Advancement of Standards in Higher Education. So there are a lot of really great resources out
there, both online and through webinars and lots of different sources,
but these are some of the, I would say, formative documents that would be useful
if you are exploring learning outcomes assessment on your campus.>>Sonia DeLuca Fernandez: Oh, hi! This is Sonia. I’m trying to advance my slide, if
you’ll bear with me for a second. Here we go. So I am so excited not only to participate
today but I just want to get in a car, a train, or whatever to visit Cara and Adam. They’re doing such fabulous, exciting things. I’m going to run through some of
my slides that I think are less interesting, because Cara and Adam did them better. I want to let you know that my
contact information not only is in this slide presentation but it’s also
at the end and I want to encourage you, if you would like to ask questions today and
we don’t get to them or there is a conversation that we can have, please feel
free to contact me afterwards. I also want to call your attention to the
poll questions on the right-hand side. Yes, those are for you. They didn’t just randomly appear
in our webinar today, though they may look random. We’re going to get to some of those
answers, and to why I want to talk about those questions and answers, later on. So at NYU, we make a distinction between
student services and student engagement based on what co-curricular learning looks like here. So for this context where I’m working, student
services has a tendency to be more transactional than student learning or
co-curricular engagement or learning. With the co-curricular, we try to maintain an intentional, thoughtful focus on student
development and learning over time. And that’s with the caveat, as you can see,
that these categories aren’t mutually exclusive. But hopefully, you get an idea
of that transactional piece and that student learning piece
and the distinctions that we make. So I want to talk about three things here today. One, assessment in the co-curricular: what is it? Two, how do we do it at NYU? But also three, why do we do it? And I recognize that it’s probably very
insulting to all of you that I’m going to try to answer those questions in any
sort of way in the next nine minutes. So again, if you would like to say,
“Hey, I think that was messed up,” or “Could we talk more about this,”
please get in touch afterwards. So the bottom line for me and for
how we look at assessment at NYU, is that it is a process of
making explicit what we do. We make explicit what the purpose is and
the impact we hope to have and why we do it. We want to support a culture of
continuous improvement and quite frankly, we need a lot of data and a lot of
attention to be able to do that well. So at the core, it’s about
communicating why we’re here and whether, and why, what we do is important. This cycle of assessment emphasizes
that assessment is a process and not necessarily an activity or an event
in isolation, standing alone, maybe crying. But assessment as an activity, obviously
is part of the cycle of assessment. I think I have a little arrow on the screen
and you can see down here where assessment as an activity fits into the process. So something else I’d like to call your
attention to is that how we organize looking at assessment in the co-curricular in student affairs involves starting with the goals. The goals being the big chunks of
what we want to accomplish and why, how students should be impacted by engaging
with us through our programs, services and advising, and how we operationalize those
goals are what we call outcomes. Those are the smaller bits that put into
action specifically what it would look like to accomplish a goal. So a couple of notes on that. These are also two of my biggest, I was going to say challenges, but they’re really also opportunities and discussions. One is being able to focus on what students experience and learn, not what staff do.
phrased as increase programming by 25% for this particular population. Well, that’s something that
we might call a service goal and what the staff are expected to do. But it doesn’t really capture or
communicate how students are supposed to be impacted by an event. The other focus or challenge I have
is discussing, and enacting, the difference between direct and indirect evidence. So the direct evidence being
preferable but more likely than not, we’re collecting indirect evidence. And the difference there would be if
I asked you what does HEQCO stand for? And you answered me, that would be a data
point when we’re collecting direct evidence. It’s measuring what you know. Indirect, I would ask you, “Do you
think you know what HEQCO stands for?” So we want to keep an eye out for that balance
of direct and indirect but we also want to err on the side of direct evidence. Again, my portfolio looks similar
to that of my peers here today. This is something that I work with, the Division of Student Affairs has — how
many units do I work with? Like 15 units in the Division
of Student Affairs. I also work with a couple of units outside
the Division of Student Affairs at NYU. And these are some of the activities. This is how I’m doing the work here at NYU. And I’ll say — you’ll notice that
I sometimes say the office or we, and I realized that that’s probably not very healthy or accurate. I’m an office of one, so I guess
I’m also then speaking about myself in the third person which is kind of weird so. But it’s also very [inaudible] to
say, “I do this and I do everything,” because obviously there are terms of
collaboration just like my other colleagues and we really can’t do good assessment or we
can’t do assessment well without collaborating. But my office at NYU, the Office of
Research and Assessment is just me. These are some of the things I do. The assessment plans and audits
mirror some of the inventory that Cara was talking about earlier. I’m also tasked with addressing individual
unit, as well as division projects, and what I mean by unit are
some of those functional areas. We have a Centre of Multicultural Education
and Programs, LGBTQ Student Centre. We have an Office of Interactive
Media, and Residential Life and Housing.
to talk with you about this context offline, if you want to determine how helpful what
we do here might be for your context. So now we get into some of
the why is this important. And it’s important because we are
in a unique position in working with students and co-curricular learning. We’re in a unique position not only to be
partners with the students in their learning but also faculty and staff around
the college and university. We’re in this position to provide
information that rounds out that picture of how are students experiencing
your institution. And in order to be able to do
that, we need a complement of data. Students spend a majority of their time outside the classroom, and in order to develop accurate models of students’ experiences, as well as predictive metrics, we have to do a better job of collecting that co-curricular data. I’m going to provide a couple of
examples really quickly again, of the how we’re doing assessment work at NYU. This first one is what I
was calling a quick cut. I’m a big fan of short and small. So this is an actual index
card that has two sides. On one side, a student writes what religious literacy means to them and designs it. On the other side, they’re providing
some demographic information. And we do this to gather data that is not
program-specific but rather it’s still aligned with those learning outcomes that are
operationalizing the unit’s goals; in this case, the unit is Global Spiritual Life. In the Division of Student Affairs at
NYU, we do not have division-wide goals, so some of this is very unit-specific. Another example is a short pre-post, which Adam was alluding to earlier, and it also illustrates the challenge of direct versus indirect evidence. If you ask, “Do you think the campus is inclusive?” before and after, the intervention in between might create an awareness that makes students more critical than they were at the beginning. And a different approach to addressing that is
that we focus on collecting direct evidence. So in a pre and post [inaudible] quiz,
we ask students to define terminology. They have a question to write on
and they also identify resources. And we can do that before and
after in order to assess some of what happened as a result
of the intervention. Part of how we’re doing assessment also is
I — maybe it’s because I’m an office of one but I try to maximize impact that
any assessment activity can have. So this is one way of making assessment in
the co-curricular organizationally valuable. So I’m suggesting that we
centre any organizational — or excuse me, any assessment focus here, where pressing issues in the co-curricular intersect with learning outcomes and high-impact educational practices. And that is part of how we spend time on some of the important skills for graduates that employers have talked about. So, to the extent that these show up in life and in the co-curricular, we are very much concerned with them as outcomes and learning opportunities. I can’t see if you all have answered the poll. If you haven’t, could you do that now? We’re going to close it in a second. Now that’s all to say that some
of my criticism of my field is that we are unduly attached
to procedures and methods. And we don’t really talk about some
of the characteristics of assessment as a political and social action and activity. We need to be more mindful, in my opinion, of
some of the influences, some of the context and much more critical about
what has meaning and value. So towards that end, I wanted to
chat briefly about our poll results. Which of these shows did you watch with the most frequency? None of the above. Now, I wanted to use this as an example to consider how we might apply a
critical framework to what we do? We have to look at describing versus
interpreting, versus evaluating, and that’s part of what I wanted to do with these two TV show questions. So here’s what these results tell me. We have 59% of the respondents saying, “I didn’t watch any of those
shows most frequently.” So first of all, what we can say about that is that
the question was probably written poorly. We also need to consider the
assumptions that were going into not only constructing the
question but also the options. So part of it, for example,
with these TV questions, we might be assuming people have access to a TV. If they have access to a TV, they
have access to these networks, is that an assumption we’re making? Are we making assumptions of
people’s age, as well as location, given the offerings of these particular TV shows? And what we can say about this first item, on frequency, is very little. We would just be able to say the frequency with which someone watched a show. We wouldn’t be able to comment on what the second item considers: which show was your favourite? It looks like I have some
Flashpoint fans out there. That’s exciting. So when we go to the next step of
being able to interpret responses to — in this case, quantitatively approached items, we can still say, of the respondents, which show was the favourite. What we can’t say is, we can’t say
anything about the respondents. We can’t say anything about the time period. This does not account for the
difference in runtime of any shows. It also doesn’t account for, “Hey, Sonia. Why did you ask us this? How is this information going to be used?” So to be able to do work that
gets to the target, if you will, if I could use Adam’s metaphor there,
and if we want to maximize the chance of hitting our target, we might
want to consider all sorts of things about the story that Adam told us. We want to consider the size of the target,
the distance we are from the target, the skill of the archer, the type of
material or equipment the archer is using. The same is true for how we’re conducting
assessment, what we’re prioritizing and why, because ultimately we want to get to
what type of meaning can be inferred. So what type of meaning can
we infer from question two, about which show was a favourite? One, someone who is not from Canada has probably done a pretty poor job of picking Canadian TV shows. That’s one of the things I might
consider a high inference interpretation. Or some of my favourite TV shows, Sonia’s, are not the same as anyone
else’s favourite TV shows. What it can say about an individual has greater reliability, because I get to tell you my truth and talk about how it relates to me as an individual. As a group, what can we say about this? Well, unless we’ve considered those assumptions
and framing points, what are the assumptions that go into how we focus assessment? What do we ask? How do we ask it? What do we gather ultimately to be
able to say, “What does it mean?” We want to be able to improve
how we evaluate or explain. That’s the basis of a critical approach. Because there is a tipping point where
a lack of specificity or high levels of inference render our data useless. So — — putting that altogether, I’m suggesting
today that engaging assessment as a social and political practice requires
us to pay a lot of attention to who is represented, why, how and when. Who is deciding what is worthy
of assessment attention? Because assessment in an indirect
way is prioritizing some learning over other learning, so who benefits and why? And so then I suggest for best practices
in assessment, that we [inaudible] and prioritize experiences of the
underrepresented and underserved, but I would also encourage
you not only to contribute actively to your professional spheres,
but also to cross some of those boundaries in order to collaborate. I have a couple of suggestions
of how you could do that. You will have access to the slides at the end
of the presentation so if these links don’t work or you have a hard time finding the information that I’m talking about, please
feel free to reach out. I’m just going to say a word about the
Student Affairs Assessment Leaders. I’m a member and I’m on the
board of this organization. Part of our professional development activities
include monthly structured conversations. They are webinars where you can
probably get involved just simply because of the number of people. So check out a structured conversation.
>>Alexandra MacFarlane: Thank you, Cara, Adam and Sonia for your engaging
and informative presentations. So we now have a few minutes left
for some questions from the audience. So if you have any questions right now, you
can also send them to us via the Q&A panel. So our first question is from Bob. Have you done any collaborative
work with academic units on their learning outcomes
including how activities associated with your unit support the
academic unit’s learning outcomes? So we’re going to go to Cara
and then Adam and Sonia. So Cara, do you have any comments on this?
>>Cara Wehkamp: Just that that’s
work that we are just beginning to do so we’ve had a few academic units who are also
looking at their co-curricular work in relation to the students and their programs and so they
are engaging us in some of those conversations. But they’re early conversations at
this point so I don’t have a lot to share on what that exactly looks like. But maybe somebody else does
so I’ll pass it along.
>>Alexandra MacFarlane: Adam?
>>Adam Kuhn: Alexandra, can
you repeat the question please?
>>Alexandra MacFarlane: Yeah, sure. Have you done any collaborative
work with academic units on their learning outcomes
including how activities associated with your unit support the
academic unit’s learning outcomes?
>>Adam Kuhn: Great, so that
is a great question. I would also echo Cara’s
observation that this is something that we’re kind of in the early stages of. One of the things that has proven quite useful in this regard is our co-curricular
record competency framework which we have developed based
on the CAS standards. And so, our co-curricular record is
essentially an activities document that records validated participation in campus-related activities linked to co-curricular engagement. And with that, we have a competency
framework that has kind of broad, broadly named competencies like teamwork,
communication, systems thinking and what that has allowed is for us to
look at all the activities that go on to the co-curricular record and
map those on to the competencies. So with academic departments, when they’re
looking at some of their activities that do support their academic mission,
they were able to speak a shared language and were able to kind of map who’s doing what. And so the overall process of developing the
co-curricular record has really advanced our ability to kind of look at that and
relate to one another in that regard.
>>Alexandra MacFarlane: Sonia, have you done any collaborative work with academic units on their learning outcomes?
>>Sonia DeLuca Fernandez: No, we have not. Some of that is because
of the size and complexity of New York University as an institution. Part of it is because academic assessment
activities are very separated organizationally from student affairs, and part of it is because
our academic assessment peers at NYU have come to the assessment party a lot later
than my co-curricular colleagues. And so we have not collaborated
particularly on identifying learning outcomes.
>>Alexandra MacFarlane: Thank you. So our next question: if somebody came to
you looking for advice on how to get started on an assessment project, what
advice would you give them? Maybe we’ll hear from Adam first
then go to Cara and then Sonia.
>>Adam Kuhn: Great. This is one of the questions that comes up quite a bit, especially with our peer-to-peer consultations. People are at all different steps and places
with regards to their assessment activities. One of the first things we do is just ask overall, what are your goals? What difference do you want to
make in the lives of students? What impact do you want to have? What skills do you want them to have as a
result of participating in your activity or what knowledge or awareness or increased
appreciation do you want them to have? So just kind of thinking
broadly and then kind of zooming in from there around specific outcomes. We think about looking at — there’s lots of
different ways to achieve an outcome, so in order to even start thinking about your particular program, your “how,” you have to look at the “what” as well, and what specific goals you have. And that’s where the conversations
typically start.
>>Alexandra MacFarlane: Cara, do you have any advice on how to get started on an assessment project?
>>Cara Wehkamp: I think there are small
groups of people doing the work at many of our institutions, and I think it’s
important to look around your campus, and also your networks beyond your campus, for colleagues who might be willing to act as a resource if you’re just starting out and you don’t know what’s out there. Because I think there are a
number of knowledgeable people and really exciting initiatives
but it’s not necessarily something that is very visible to everyone. And I just — I found that it’s been really
helpful to reach out to some of those people and people have been really
generous in sharing their knowledge. I think there is a strong interest in this
area right now so people seem very excited to share their knowledge and their experience. And I think it’s really helpful so that
you don’t feel sort of isolated and alone. And you can get a few different perspectives
because sometimes the things that work on one campus don’t always
work on other campuses. So you can also get that sort of multicampus
perspective when you reach out both within your campus and outside of your campus.
>>Alexandra MacFarlane: Sonia, do you have any advice for people starting out on assessment projects?
>>Sonia DeLuca Fernandez: I underscore
both what Adam and Cara have offered, and I don’t think I can contribute anything better.
>>Alexandra MacFarlane: All right, thank you. So unfortunately, we’ve run out of
time but I would like to thank Cara, Adam and Sonia for joining us today. If you have any questions for our presenters, feel free to contact them using
the information on the screen. If you are interested in learning more about
HEQCO and would like to join our mailing list, you can go to our website at www.heqco.ca
and enter your contact information in the bottom right corner of the page. This webinar will be posted on our website
probably in a day or two with the slides so you can go and find it there and for
any of your colleagues that weren’t able to join us, they can also find it there. So when you leave the webinar, we ask that
you take a moment to complete the survey. We would like to hear about your
experiences so we can improve our webinars. We also would like to welcome
ideas for future webinars. So thank you again to our presenters
and I hope you all have a wonderful day.
