Monday, July 2, 2007

GMIT 660--Week Eight--Community Colleges

Jessie La Cross and I led the GMIT 660 Seminar this week--lots of work participating in two seminars and getting material ready to lead one!

Let's see what happened!

Jessie--Question 1:

Because there are so many shorter readings, please do not feel you have to respond to each article. But if there was a journal article (or two) from this week that particularly caught your interest, please share your thoughts on it:

Which article interested you the most?

What did you find most interesting about the ideas presented?

Did you agree or disagree with the strategy used? Why?

What do you feel are the specific strengths or weaknesses of the strategy used?

Do you feel this strategy could be relevant to SCC?

For reaction to Jessie's first question I'm going to choose Gina DeFreece's main reply posting and reflect on it:

I know everyone is busy and no one has extra time, but the article from Butler County was really interesting! If you want a different take on assessment, this would be a good article choice!

Which article interested you the most? I chose the Butler Community College article on their PACT Individualized Assessment program.

What did you find most interesting about the ideas presented? I liked the idea of creating assessments that truly are individualized and indicate individual growth and development over time. Butler County developed a program by which they evaluated students on four areas: P--Personal Development Skills (everything from self-concept to teamwork); A--Analytical Skills (critical thinking, analyzing); C--Communication (listening, speaking, etc.); and T--Technological Skills (computers/technology).

Did you agree or disagree with the strategy used? I did agree, because I thought the use of rubrics to determine whether or not the competencies were met, followed by another rubric based on the PACT info, was a true indicator of the students' personal development within their respective fields and a more useful assessment of their learning. I thought this was a much more comprehensive approach to assessment than the traditional letter grade.

Do you feel this strategy could be relevant to SCC? I LOVED this idea! Of course, I think we should try it in our program. Butler County had an actual assessment team that converted the findings from the rubrics into the information they would use for assessment (we don't have that at SCC, so this would create more work for us), but it was really a good tool, so I want to talk with my team about how we could integrate this into our Graduation Seminar class. I think it could be used as a pilot program in maybe ECED or Human Services, and the article indicated how different programs were interpreting how to use the PACT tool.

My reflection:

Gina is genuinely excited about what this article had to offer in the PACT method of assessment!


P--Personal Development Skills: self-concept/teamwork. Don't we all need more personal development skills?! I am so impressed by where Gina is in her career and her skills as an instructor, not to mention what a wonderful cohort she is! I've gotten to know her so much better in the past year or so; I never would have had that opportunity if not for this program! I feel so far behind everyone else in education since I have been around academics on a large scale for only three years now, but I have grown leaps and bounds in the past year or so! My self-concept has improved immensely, as well as my ability to work on a team! I hope for years to come that I can continue to share and learn in such a wonderful community. We owe it to our students to continue to improve our own skills so we can pass them on and help them be the best they can be. Teamwork and community are so important in this day and age of self-aggrandizement. We have become such an individualistic society, and that does not bode well in the long run. We were not meant to be that way!

A--Analytical Skills: from critical thinking to analyzing. I am hearing more and more each day about the importance of critical thinking! Problem-solving is such a crucial skill, in high demand with today's rapid growth in technology, and critical thinking is its main element! We take others' thoughts and ideas about a topic, really think hard about them and what they mean to us, and share all we can about how they can affect each of us. And when we do it as a team, it can be that much better!

C--Communication: listening and speaking, etc. Being a good listener is vitally important. There has to be a good blend of listening and speaking for a conversation to be fruitful. If it is too one-sided, the power of the message shared by both parties declines. You have to give and take equally. If you are too busy thinking about what you are going to say next, you may miss something that could make the conversation truly productive.

T--Technological Skills: computers/technology. It wasn't that long ago that I was very afraid I was too far behind in this area, so I set a mental challenge for myself: catch up or pay dearly for it! I caught up to a good degree, and I am very thankful that I did. I do, however, have a way to go before I can comfortably say that I have great technological skills!

Jessie--Question 2:

Please feel free to choose to respond to only one of the following:

a) Both the “Glimpse of Community and Technical College Students’ Perceptions” and the Butler County Community College articles seemed to address the role of students in community college assessments.

The first article, “A Glimpse of Community and Technical College Students’ Perceptions of Student Engagement,” explores how 48 community colleges participated in an assessment strategy that examined educational practices related to student success from the perspective of their students.

At the end of the article, based on information from the survey used, several suggestions are made to help increase student engagement, two of which are:

How to optimize the use of classroom time to help students actively engage more with one another, with faculty, and with the subject matter.

How to find ways students can communicate substantively with their peers and faculty outside the classroom setting.

I was curious how important our cohorts feel out-of-class collaborative activities are in community college courses. To what extent do you feel it is important to have students working together outside of class on class-related activities? The practical logistics of having students find common times to communicate and/or work together, in addition to juggling work and family responsibilities, seem almost overwhelming. If a student doesn’t have a home computer or reliable transportation, or does have heavy work and family commitments, is it reasonable to expect collaborative out-of-class activities of them? Are there outside collaborative activities that teachers are currently using, or could use in the future, that work in spite of these complications? What else did you think was of value (or concern) in this article?

In the second article, Butler County Community College discussed a strategy in which students’ personal development skills, broken down into 10 different areas, were included in their assessment processes. Faculty were directly involved in devising the assessment strategies, created rubrics for measuring performance, informed students that these assessments would be integrated with their coursework, and went so far as to give participating students individualized results from the assessment process.

If we are going to engage in collecting assessment data from our students, does it seem reasonable to you that students should also get individual feedback on their performance? Would this strategy require that much more work to administer? Would it be worth the extra effort? What else did you think was of value (or concern) in this article?

b) Both the Wisconsin Technical College System and the Colorado Community College and Occupational Education System articles seemed to look at some state wide issues and Community College assessment strategies.

What impressed me about the Wisconsin Technical College System strategy was their insistence on consensus as critical to decisions being adopted. Being familiar with Quakers, who move only by consensus, I know full well how long even the simplest decisions can take using consensus building. But the article states “buy-in and consensus building take time and were recognized to be essential for the model to be credible and viable.” Do you agree with this statement? Why or why not?

If one of the primary missions of community colleges involves meeting the needs of the local community, community colleges could conceivably have some very different priorities and programs. Just thinking about Nebraska, with large urban areas like Omaha and smaller rural areas further out west, it seems like state-wide decisions would be incredibly hard to agree on. But then I looked at the thirteen core indicators chosen for Colorado’s 12 community colleges, and I was totally impressed with the list. The list seemed both broad and fair in reflecting ways that many community colleges could be accountable at a state-wide level. What do you think of this list? What are the advantages and disadvantages of applying state-wide assessment strategies in community colleges? What else did you think was of value (or concern) in this article?

Please remember, you only have to respond to a) or b) if you wish.

For this question I am going to choose Kelly Findley's response:

How important are out-of-class collaborative activities in community college courses? To what extent do you feel it is important to have students working together outside of class on class-related activities? Is it reasonable to expect collaborative out-of-class activities of them? Are there outside collaborative activities that teachers are currently using, or that they could use in the future, that work in spite of these complications?

Students have to learn to interact as a team and be functional, responsible members of that team. Therefore, it is very important that there be some out-of-class collaborative activity to promote this ideal. What I do is divide them according to clinical sites (students spend approximately 21 hours per week performing hands-on exams at a hospital site, so they see each other a lot!) so they have time to discuss, plan, and interact, or I divide them randomly and give them portions of class time to strategize. At the end of the activity, I often have students do a peer review of each other’s performance, and in that I include an area pertaining to contributions and functioning as a member of the team.

My reflection:

I think Kelly has a great plan for her students! She divides them into groups that head to hospital sites for hands-on experiences and testing. She makes sure they do plenty of collaborating in their teams before and after the experience, which is very important. That way they can give each other feedback on both ends to create the best possible learning experience.

Imagine where technology would be today without group collaboration. Ideas spark ideas; problem solving creates new questions; new questions lead to more problem solving; the cycle then repeats itself. If we don't have critical thinking and problem solving, technology stands still.

Students have to learn to be partners in education; they have to know that each of them has a small ownership in the forward movement of all education. Getting them to buy into this can be challenging, but letting them know that you are learning right along with them will definitely help!

Jessie and I had very good resources provided by our instructor that were very relevant to our coursework theme. We presented them well and we had good participation by our cohorts. We have learned a lot from each other, yet we will never stop learning. We have to have the mindset that we are lifelong learners!


Monday, June 25, 2007

Week Seven--General Education

I'm just going to reflect on Gina and Kelly's seminar questions here for GMIT 660 like I did in GMIT 650:

Kelly's first question/s:

The article on “Assessing General Education Core Objectives” was based on the curriculum at Southeast Missouri State University being assessed for three core objectives: 1) the ability to locate and gather information; 2) the ability to think, reason, and analyze critically; and 3) the ability to communicate effectively. The Assessment Committee evaluated samples from freshman and senior seminars and, upon completion, was able to itemize the major findings (page 5) drawn from the analysis.

1) Although all of these findings are important, which two should be given top priority and why?

2) Does SCC face some of these main issues and if so, which ones and why?

Remember, you can answer either question.


Kathy Zabel's main reply to Kelly:

One of the most critical findings is part of the first bullet in Major Findings, and that is the statement, “Some artifacts produced in the freshman seminar were evaluated as stronger performances than some pieces produced in the senior capstone courses.” Students ready to graduate should by all means be producing superior work to the lower-level students. It implies that those seniors didn’t “learn how to learn,” didn’t learn from their experiences, or didn’t have the experiences in earlier courses. Whatever the case, it is a detriment to those students getting ready to graduate.

The second top priority is the lack of critical thinking observed. That is the talk everywhere, that students can think critically and solve problems. They must be able to think critically in real life to survive on their jobs. That’s not just instructor talk, but that’s what the supervisors in the physician offices are saying to us. “Your students MUST be able to think critically”.

My main reply to Kelly:

1) Although all of these findings are important, which two should be given top priority and why?

Kelly, I would say that "number one" would be the fact that "all who were involved with the assessment project agreed that opportunities are needed for dialogue among faculty to discuss possible program modifications based on the results of the assessment project."

Agreeing is the easy part; the hard part is to follow through on this agreement! At least two or three faculty should form a committee to plan agendas and schedule times and places for this to happen!

"Number two" would be, for me, the fact that there was such a wide range in performance in the areas of formulating a thesis, producing an edited writing sample, citing source materials accurately, locating relevant source material, evaluating others' and constructing their own arguments, and producing polished pieces of writing.

Some students could barely get started, while some performed masterfully. As an instructor, I find that unacceptable! That gap has to be closed significantly!

2) Does SCC face some of these main issues and if so, which ones and why?

From what I have seen, yes, SCC faces these same problems/issues! I don't think there is a lot of agreement or discussion that takes place in regard to program modifications based on assessment. I don't know if there even has been any cross-disciplinary faculty group assessments done. Have there been any that you know of?

In regard to the wide range in performance, I have seen it firsthand, and it is very disappointing. I've seen it in database research training and in the classroom. How do we work to close that gap? Will it take more time working with students one-on-one? Would more milestones working toward better performance help? Do we need classes designed specifically for this?

My conclusion and reflection:

Kathy told Kelly that her top concern was that the rookies were outscoring the veterans! That is not too surprising to me. Some students are way ahead of their classmates, and sometimes ahead of people two or three years their senior, because they are just extra-gifted students. It would be more alarming if it were across the board.

I think it is much more important to try to close the gap between very poor performances and outstanding performances amongst the students. To me there is no good reason for that gap. Students in higher education have to bring more to the table than just getting by. I kind of take it personally as an instructor. If I work hard at being a good instructor, I want all students to give a good effort. When they don't, I want to find out why. I ask them why they seem to be struggling so much. Is it time constraints? Is something bothering them personally? I want to find that out!

Kelly's second question/s:

The article “Imposed Upon: Using Authentic Assessment of Critical Thinking at a Community College” dealt with the issue of the Board of Trustees of the State University of New York changing the core requirements. This impacted all the transferring graduates of the local community colleges and resulted in major academic changes. Then came the question of assessment to ensure students were obtaining a secure knowledge base and valued skills such as information management and critical thinking. A plan was devised by three instructors who taught on different campuses: each wrote an essay at a different performance level. These essays and an instruction sheet were given to students, who assessed them in a written report, arguing how they would rank each paper. A rubric was developed to assess the students’ essays, and overall observations were made in regard to their critical thinking abilities.

1) Did this method fulfill the requirement of assessing critical thinking? Why or why not?

2) Is one method enough or should multiple tools be used? Suggestions?

You can answer either question.

Doug Brtek's main reply to Kelly:

First of all, I am satisfied that the method was sufficient to assess critical thinking in this case. However, it wouldn't hurt to explore alternative ways to assess critical thinking such as verbal or written expression, reflection, observation, experience and reasoning. I know assessment is never the proper place to explore new ideas, but they should be considered at different stages throughout program assessment. After all, shouldn't we be assessing our assessment methods?

My main reply to Kelly:

Kelly, according to the article, it was sufficient to fulfill SUNY's mandated assessment requirements. I think it was a great way to assess, as well as teach, because showing the wrong, almost right, and right way to do an assignment is a tremendous tool in getting students to learn, let alone to think and write critically. It helps remove some of the gray areas and misunderstandings associated with learning.

It's kind of like teaching a youngster not to play with fire, or a boiling pot of water; sometimes you have to let them get burned just a tiny bit to get them to know what NOT to do, as well as what to do and how to do it. The tricky part is letting them get burned without really hurting them.

Showing students the right and wrong way to do an assignment (by using the three different levels of quality in the essays), while getting some assessment practice at the same time, was a great idea the authors came up with. It leaves the students with a clearer mental picture of the assignment, one that hopefully they will retain.

As far as other ways to assess critical thinking:

One might take the essay/assessment thing a step further and have a panel or roundtable discussion amongst the students to critique each other's essays (hopefully without too much tension) to help each other learn more through sharing thoughts or ideas. The main focus would be to discuss how each could have done better to think more critically about the essays.

My conclusion and reflection:

Doug agreed with the assessment in general, but he suggested verbal and written expression, reflection, observation, experience, and reasoning as viable alternatives to consider. I do like Doug's suggestions, but I feel they wanted to keep it simpler and more standardized to start. If they were to do this on an ongoing basis, these would be good guidelines.

Kelly's third question/s:

In the article “Community College Strategies – Assessing the Achievement of General Education Objectives: Five Years of Assessment,” Oakton Community College did a locally developed assessment of general education objectives. From 1999 through 2002, the approach was to use “prompts” for assessment; in 2003, the approach was changed to evaluating actual classroom work. The article continues by comparing the five years of assessment and stating observations about the entire process and results.

In the article “Community College Strategies – Assessing General Education Using a Standardized Test: Challenges and Successful Solutions,” College of DuPage took an entirely different approach than Oakton Community College. Instead of using “prompts,” they developed a strategy in which 6 American College Test/Collegiate Assessment of Academic Proficiency (ACT/CAAP) area tests were given to a select number of introductory courses (beginning students) and advanced courses (graduating students). The article continues with the results of their assessment method.

1) Of these two methods of assessment, was one method superior over the other in obtaining assessment data that can be used constructively? Explain your reasoning.

2) Pick one of these methods and explain the advantages and disadvantages of using it as a tool for assessment.

Remember, answer the question that appeals to you the most!

Jessie La Cross's main reply:

I keep going back and forth on this question. On one hand, I was not comfortable with the prompting methods Oakton CC used.

On the other hand, the purpose of the assessments is to measure how well students can meet the general education objectives, some of which were listed as:

1) define problems
2) construct hypotheses
3) interpret data using a variety of sources
4) explain how information fits within a historical context
5) communicate findings effectively in writing and in speech
6) work and communicate effectively with people
7) apply ethical principles to issues

I don't know how some of these, especially the highlighted ones, could adequately be measured with a standardized test. I liked the approach used in 2003, where actual classroom work in real time was used to evaluate the speech and teamwork objectives. Even though not as many students were able to be assessed this way, with practice, this seems like it could be a good strategy to use.

Each tool--standardized objective tests and performance based tests--has different strengths and weaknesses, but when used together I think they can actually complement each other.

My main reply:

Kelly, I think both articles display good examples of assessment. What I like most about the "Five Years of Assessment" example is the fact that they tried different formats over a significant time period. That allows for a wider range of experimentation which should translate to finding a good assessment vehicle depending on student background and coursework implemented.

Regarding the "Standardized Tests" article: the thing I liked best about their format is that they used response sheets that faculty used to voice what should be done with the results of the assessment data, and then made public (anonymously) in a document. Now, if they used that document to further improve coursework, they made great strides in improving learning and assessment.

My conclusion and reflection:

Jessie gave some very good reasons--pro and con--on standardized versus performance-based tests. Her feeling, in the end, was that using the two together somehow would be a good way to go about testing.

My reply was that I liked assessment with variety over a period of years to get a better picture, and a more realistic view. Try different things--get a more extensive sampling of assessment so it is more reliable.

Feedback both ways is a huge factor here! Making results public for peer review does a lot of good, too. When others can see your results, everyone tries harder to do better and work toward improved learning!

My final reflection on the seminar:

I think Kelly and Gina did a great job again! Kelly is so good at keeping discussions going by replying thoughtfully to posts and asking for more input! I feel that I did a good job of adding to the seminar with my thoughts and feelings on the different provided questions. It was another great learning experience! See you next week in GMIT 660!