Monday, March 31, 2014

A few weeks ago, I read something that made me think of this topic. I'm sorry I didn't save the link, but someone stated, "If we were lawyers, we would get paid for billable hours." This got me thinking: what if we as education professionals started billing for our work the way accountants, lawyers, or even "trainers" (who ARE educators, by the way, even though some continue to make the division) do? What if our contracts were to deliver a service, but we could charge more for: larger classes (training contracts put a cap on maximum class size); tailoring classes or preparing classes we have not taught before; office hours; administrative work such as inputting grades, writing recommendations, calculating midterm grades or verifying progress for students who need them, and making copies of course material; grading student work (which means faculty using innovative new teaching methods that require more advanced assessment, or faculty teaching writing courses, would get paid more for the required extra hours); or research in which the school's name appears in a journal or at a conference (this is part of the school's image, after all)?
So, the problem as I see it is twofold: 1) adjuncts/contingents don't get paid for all that they do; and 2) most people (except for adjuncts and some department heads and faculty) are unaware of all the work an average adjunct does without pay.
Over the last week or so, I've collected some blog posts and reports about the adjunct work situation, which is finally coming to light. However, there is still little information and actual data on adjuncts. So my adjunctchat (Tuesday, April 1 at 4 PM Eastern Time) will look at the information gap about adjuncts and how we can address it.
Some of the questions I'd like to discuss include:
1) How are adjuncts/contingents/teaching assistants identified and used in higher education? Are there differences between unionized/non-unionized, public/private institutions, community colleges/colleges/research universities, or regions/countries?
2) What are the requirements and hiring practices for adjuncts? How is this different from tenure track? How is this different from other part-time employees? How is this different from contract professionals?
3) How can we make hiring and the work adjuncts do more transparent? How can we begin to publicize the real work of adjuncts (rather than the public image of an adjunct as someone with experience in the field, but not necessarily an academic degree, popping into class to lecture 1-3 times a week and giving 2 pre-developed standard tests, created by the publisher or a tenure-track professor and graded by machine)?
4) How much time per class per semester do you contribute to teaching, service, and research? How can we gather this information so it is not just a self-reported guesstimate but is methodically collected (I'm thinking there's got to be an app out there; see the sketch below)?
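For question 4, here is a minimal sketch of what such an app might log (purely hypothetical; written in Python for illustration, not an existing tool):

```python
# A minimal sketch, assuming hypothetical categories and a plain CSV log;
# this is what methodical collection might look like, not an existing app.
import csv
from datetime import date

# Work categories drawn from the questions above.
CATEGORIES = {"teaching", "prep", "grading", "office_hours", "admin", "service", "research"}

def log_entry(path, course, category, hours, day=None):
    """Append one dated work entry to a CSV log file."""
    if category not in CATEGORIES:
        raise ValueError(f"unknown category: {category}")
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([day or date.today().isoformat(), course, category, hours])

def totals_by_category(path):
    """Sum all logged hours by category -- collected data, not a guesstimate."""
    totals = {}
    with open(path) as f:
        for _day, _course, category, hours in csv.reader(f):
            totals[category] = totals.get(category, 0.0) + float(hours)
    return totals

# Example: log_entry("workload.csv", "COM 201", "grading", 2.5)
```

If enough of us logged hours in some structured form like this, we would finally have comparable data across courses and institutions rather than guesses.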
I hope you can make this. It would also help if we could begin to get some additional participants from outside the US and Canada, as I think other countries are beginning to be pressured to accept this adjunct/contingent model, which has crept into the US and is beginning to move into the Canadian system. If we have data, academics around the world can begin to push back so there is a more equitable system of pay and work.
References:
Con Job (video)
Congressional report: The Just-in-Time Professor
Portrait of part-time faculty (CAW)
CUPA-HR Professionals in Higher Education Salary Survey
Just Visiting
Adjunct Action town meeting
Chronicle's Vitae adjunct stats
Columbia professors fired after years as contingents
UK policy makers getting more data from the private sector than from universities
In case people think this is a new problem, read this article from 1995
Please tweet me or add in the comment section any other resources you might have.
Wednesday, January 30, 2013
Flipped Classroom: How many hours does an instructor REALLY put into teaching?
I'm writing my blog instead of preparing for my class today. Part of the reason is that I find blogging a way to ease back into academic writing that will be published (rather than dissertation writing, which I have now completed). However, this blog post has weighed on my mind for the last 24 hours.
I have previously written about my participation in a "flipped classroom" project through our school. The last class was less than stellar! First, we woke to some bad weather in the area, and since I teach first thing in the morning, I sent out email instructions for students who might not be able to come to class because of the weather. Unfortunately, I teach two levels of Group Communication, and I mixed the two up, sending the email to the wrong (later) class. On top of that, I am using two new technologies this semester (I usually introduce only one new technology each semester), and the combination made set-up and transitions slow.
So, coming home exhausted from 3 straight hours of teaching, I came across this article about how the IRS has warned colleges that they may need to recalculate the hours adjuncts work. This is the first time I have seen someone recognize that adjuncts do more than classroom contact hours. It got me thinking about how much time I put into my classes compared to a full-time faculty member. And while on paper the "flipped classroom" may look like less work for the instructor because "the students are teaching themselves", in fact, as with online learning, the instructor's role requires a much greater time commitment, often outside of anyone else's view.
Prep Time
There are a number of factors that go into prep time. As mentioned before, I am using two new technologies: clickers and an iPad. I usually use only one new technology at a time because of the learning curve: learning the technology itself and figuring out timing (set-up, transitions from activity to activity, and student interaction with the technology before they feel comfortable). However, I was willing to use the two technologies because I had some prep time and support for both (something I know many adjuncts or part-time instructors don't have).
Many of the activities I use in class had to be modified for the flipped classroom. Halfway through my second class, the class chosen for the training, I realized I was "directing" the conversation too much, taking it away from the students. This is something that will be difficult to change. At the same time, I don't want to lose some of the concepts I want them to walk away with. This balance is something I will need to work on in the next few weeks, perhaps by coming up with some additional discussion questions before class (I have always been good at reacting to student comments, but now I need the students to also participate in directing the conversation).
Another difficulty (and this is just the nature of the demand for our courses, the lack of faculty, and students' ability to drop/add) is putting together groups and getting to know students and their strengths and weaknesses. As a result, I spend the first two weeks frantically putting together groups, coordinating supporting information, answering emails, making sure students have access to the technology we will be using, and collecting information about the students.
This semester, I put in about 50 hours of prep work before classes even began.
Class prep and assignment management
I usually have a class of 35-45 students. This semester, because one of my courses was added at the last minute, I have a class of 42 and a class of 39. Of course, this still may change over the next week or so, but these are the numbers I'm starting with.
I have always taught using a style in which I take the content (which I am familiar with) and modify it to meet the needs of a particular class. Sometimes students' written skills are strong, but they lack interpersonal skills; sometimes their knowledge of communication theory is strong, but they lack practical experience. It is important for me, therefore, to always prepare before my class. However, with the flipped classroom, I feel I have to be even more prepared, understanding ALL the reading concepts, as students may bring up concepts in our discussion that I had not thought of. It is not that I don't know the concepts, but rather the vocabulary used by the author. This means my usual 45-minute prep for class will take twice that amount of time. I need to be prepared in case students bring up concepts I did not necessarily feel were important.
Because I will be evaluating students in class more, I will need to be more "present" in the class (so I can evaluate them). This is very fatiguing, especially when teaching 3 hours straight. In addition, there is more assessment and follow-up after class (I tape a review of the key points I wanted them to take away, based on what we did in class). I have cut down on some of my written assessments, but I still need to figure out how to access the statistics from the clickers that I will use for my in-class assessment.
I estimate that I will be spending about 10 hours a week per class on class prep and assignment management. Adding the 3 contact hours per class, that means 26 hours a week of work on my two 3-hour classes alone (a rough tally follows).
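To make the tally explicit, here is the arithmetic (a rough sketch; the 15-week semester length is my assumption for illustration):

```python
# A rough tally of the weekly and semester workload described above.
# The 15-week semester length is an assumption for illustration.
prep_per_class = 10     # hours/week of prep and assignment management, per class
contact_per_class = 3   # hours/week in the classroom, per class
classes = 2

weekly_hours = classes * (prep_per_class + contact_per_class)
print(weekly_hours)     # 26 hours a week on the two classes alone

pre_semester_prep = 50  # hours put in before classes even began
semester_hours = weekly_hours * 15 + pre_semester_prep
print(semester_hours)   # 440 hours, before office hours, letters, or research
```

And that 440-hour figure still excludes everything in the next section.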
Other school-related responsibilities
In addition to my classroom requirements, our department expects us to hold office hours. I have about 3 hours a week scheduled, although I don't usually have students during that time; at a large university, most students don't take advantage of office hours. However, I often meet with students when they can't make my office hours, and I put in as many hours with students outside of class as most full-time undergraduate instructors/professors (graduate student interaction is different).
Because of my style of teaching, I get to know my students as individuals. Because of this, I receive requests for 4-8 letters of recommendation per semester. This is not overly time consuming, but it does take about 15 minutes per student (an additional 1-2 hours a semester). Those students who do come to speak with me usually do not discuss the course but rather graduate school and career advice. Our department is lucky in that instructors are part of the faculty, and a number of faculty are term (they have worked for more than 3 years and therefore are offered full-year contracts). As a result, students view these instructors as valuable resources when they have professional and academic questions.
Finally, if I want a tenure-track position, I will need to continue to participate in professional activities such as blogging, publishing, attending conventions (if I can get funding), interacting with the community, networking, and reviewing journal/conference papers. In this area, expectations for part-time faculty are the same as for full-time, but full-time/tenure-track faculty get paid for it.
The only area where expectations differ between full-time and part-time faculty is college service. When I was asked to be a part-time faculty representative on the faculty senate, without pay, I could refuse without it hurting my career.
Thursday, July 12, 2012
The Ethics of Higher Ed Marketing in the US
As I start looking at colleges/universities in the US for my daughter this year, I'm much more savvy about how US colleges/universities market themselves. My son, a second-year student at Penn State, went through this last year. Having been a student and professor at US colleges for 20 years, I went into his higher ed search a bit cocky. After all, I knew what I looked for in a student, I knew the student profiles in multiple departments, and I understood the systems for college credit transfer, financial aid, and housing. But I was shocked when I got sucked into the higher ed marketing that goes on in the US.
First, there is the sheer number of marketing brochures, emails, and personal phone calls a student receives. My son did very well on his SATs (standardized college entrance exams), even better on his AP exams (Advanced Placement standardized tests), and outstandingly on his SAT IIs (subject matter tests). The PSATs he took in his second year of high school were also very good.
We knew he did well on these tests, which placed him on schools' radar. However, what we didn't know then (but know now) is that most of the marketing targeted those who placed above the 50th percentile on the SATs. Schools also looked at student profiles (curriculum, ethnicity, etc.). We were very surprised when he received recruiting information from Ivy League schools, as we had never expected he could get into them (and in fact he could not). He was bombarded with information from Dartmouth, Vanderbilt, and the University of Chicago in particular, all very competitive schools. Naive as I was, I now realize this barrage of information was targeted at my husband and me. We looked at Dartmouth and convinced my son to apply there and to Vanderbilt, as it appeared he had a chance to go to school there (even if my husband and I had to mortgage our house!). We paid the application fees to all of the schools that required them (often $60-100) and then waited. My son did not get into any of these prestigious schools.
Looking at their sites, many of these schools really did not need to advertise (which is why we were suckered in; why would they send information to a student they did not want?). So why did they? I have two theories: 1) to make money off the application process. If you are charging $50-100 per application and you have 20,000 applications, that's a nice chunk of change you just earned, not to mention that your name is now out to a broad group through marketing funded by the application fees. 2) In searches, the more competitive your school is, the more apt it is to be at the top of the search engines. How do you determine competitiveness? By the ratio of the number of applicants to the number accepted. As a result, it is in the best interest of these prestigious schools to increase their number of applicants (some rough numbers below).
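To put rough numbers on both theories (every figure below is hypothetical, for illustration only):

```python
# Rough numbers for both theories; all figures are assumed, not actual data.
applications = 20_000
fee = 75                     # midpoint of the $50-100 fees cited above
seats = 2_000                # assumed size of the incoming class

fee_revenue = applications * fee
print(f"${fee_revenue:,}")   # $1,500,000 from application fees alone

# "Competitiveness" measured as applicants per acceptance: with a fixed
# class size, more applicants means a lower acceptance rate.
print(applications / seats)          # 10.0 applicants per seat
print(f"{seats / applications:.0%}") # 10% acceptance rate
```

Either way, the school wins by soliciting applications it never intends to accept.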
Now I look at websites in a new way. While some schools claim they do not use SATs or ACTs to determine acceptance, if they publish the range of scores for the incoming class, and most are at the very top, then I say they do use those scores. Why publish the range if it is not going to be used? Why not show the full range of standardized test scores from the students?
Do not get me wrong. My son is a perfect match with the school he currently attends. Academically, it is very strong, especially in liberal arts, which is what he is studying. This, ultimately, is what parents and prospective students should be looking for. The cultures of those schools he applied to did not fit his style. However, I felt like someone who bought a bogus product from a late-night TV show when his rejection letters began to come in. These prestigious schools lured us into applying to schools that would never fit my son's style, where he did not have the profile they were looking for. This, I feel, is very unethical.
Tuesday, December 7, 2010
Three models for a new higher ed economic model (part 2)
This has been some time in coming. I wanted to think through some of the options based on my earlier post on this issue.
In reviewing all the factors, I wanted to develop an economic or funding model that could be implemented sooner rather than later. However, I realized that I have seen at least three models in use that have been effective over the last decade. So why reinvent the wheel? Two of the models rely on a large component of distance learning; the third is more traditional.
Pure distance learning and assessment-based higher education
The first model comes from a local university that developed out of the New York State civil service training department. Civil service workers, who often did not have higher ed degrees, received training that was equivalent to a university course. Recognizing that those who received this training should be able to receive college credit, a system of testing and granting college credit for the training was established.
Approximately 10 years ago, a Swiss university bought out this service, adding its own model onto the established service. Currently, the university identifies online courses, creates its own courses, and creates a standard curriculum, all of which are based on an examination process. Students actually pay for exams rather than for the courses themselves. This means that a student need not take any courses at all, as long as they can demonstrate knowledge through the exams. Students who opt to take instruction through the university pay for those courses. Students can also submit outside courses, training, and instruction to receive credit; however, they will still need to pass the exams for the criteria laid out for their degree.
This allows a standard learning outcome to be used, with several options for the student (depending on their circumstances, including access to courses, resources, location, and learning needs) to fulfill the curriculum requirements. It also means there is a minimal instructional staff, with most of the staff working on assessment and curriculum development. There are some area specialists who help with curriculum development, but they are used only on an as-needed basis. An instructor does not have to hold a Ph.D. in the area in which they are teaching, but rather needs to be an effective instructor, because the subject matter is already developed by specialists in the form of assessment tools and curriculum. This also allows the university to be more flexible based on students' learning needs and goals.
One disadvantage of this model is that most of the students learn in isolation. It also requires a great deal of motivation on the part of the learner to arrange for the courses/learning that will best help them pass the assessments. In addition, a great deal of resources goes into the monitoring and revision of curriculum and assessment tools.
Individualized learning plans
Another local university at which I have worked uses an individualized learning plan. The first course a student takes is a three-credit course in which students sit down with a "mentor" and outline their learning objectives. They then plan how they will achieve these objectives academically. There are usually three options: testing out, small group tutorials (face to face at learning centers), or online courses. A fourth option, independent study, is rarely used. Unlike the model above, there are set courses which students must complete; only a certain percentage of those courses can be assessments (either CLEP exams or assessment of real-life experience).
Unlike a traditional university, there are no "departments". Rather, there are designated "area specialists" who are in charge of a group of faculty (tenured and part-time). These specialists are often part of programs such as labor relations, healthcare, nursing, teaching, and humanities; in other words, they are more profession-oriented and broader than a traditional university department. Because this is a state university, there are general education core courses that students must take to be granted a degree. However, if enough students are interested in a specific area, the mentors can ask the area specialists to develop a tutorial on that topic.
In this model, new specialties can be developed within an "area" that a traditional department might have difficulty accommodating. In addition, students can flow in and out of the university as needed (an open university model); most of the students work full time. One disadvantage of this model, like the one above, is that there is not a single "campus". Unlike the model above, however, students can develop a sense of "college" at their learning centers, having tutorials with other students and establishing a close relationship with their mentors.
Another disadvantage of this model is that it is very labor intensive. For the model to work, the mentors need to be knowledgeable about course options and adult learning, and to have constant contact with their mentees. In fact, one reason I don't teach there anymore is that the pay scale was ridiculously low, with faculty paid by the size of their student load (i.e., a class of 5 had a pay scale much lower than a class of 25; someone who taught five classes of 5 students would make the same as a person who taught 25 students in one class, even though the five classes demanded far more time).
Traditional model
The fact is that many who go to school full time do so as much for the social aspects of being part of a campus as for the academics. To change the traditional model of education, there would have to be a cultural change, which could be difficult at universities that have been steeped in their culture for many years.
My current university used a very successful model to change this culture and cross disciplines and departments in order to integrate technology into its instruction. Basically, a pool of funding was created to hire faculty who were adept in instructional technology. Each department was required to either train a current faculty member, identify a current faculty member with a technology specialty, or hire a new faculty member with expertise in educational technology. Once this core group of faculty was established, they received tenure within the technology group (not their department, per se). This meant that if one of these faculty members was needed in a certain department, they might be reassigned to that department or to courses within it. For example, one of the faculty members in the department of communication also had expertise in information technology. As there were two people within the communication department who were part of the technology group, one of them went over to information technology when one of its designated faculty members left the university. This same person also taught some courses in the school of business when the designated technology person in business took his sabbatical.
Imagine, for example, if this same model were used for communication skills, creativity, critical thinking, scientific inquiry, or writing (often the areas of core courses). This would allow a university to have tenure-track faculty who could teach interdisciplinary courses without fear of cannibalizing a department. Smaller departments could go to the "centers" to find faculty who could teach courses in their department. General education courses could be offered through the "centers", guaranteeing a pool of faculty to draw on for courses that may not be money producers but are vital to the degrees. Outside the "centers", faculty expertise (specialties) could still be offered within the departments. In addition, new areas of study, which might not fall neatly into a department, could be developed within the centers.
Unfortunately, with a new administration, the traditional culture and departmental structures proved to have too strong an influence, and we have now moved back to the traditional departmental structures in which departments fight for resources and/or are pitted against each other to keep "tenure track lines" for their department. For any of these models to work, faculty, administrators, students, and stakeholders (including employees and alumni) need to be open to a new way of funding higher education. Using a "business" model will never work, as "knowledge" is becoming less and less a commodity that is possessed and more and more a necessity that everyone works with.
Tuesday, November 16, 2010
A new economic model for Higher Education: Part 1, History
More and more people have recently been writing about a new economic model for higher education (Andy Coverdale, Clark Quinn, and Tom Haskins, just to name a few). However, as we grapple around the world with how higher education should be structured and funded, we aren't willing to reexamine the underlying beliefs on which the funding and academic structures were created. Now is the time to look at the basis of the traditional structures, how they have changed, and what is needed in a new structure that will fit higher education's needs.
The History of Higher Education in the West
Higher education came out of a belief that only those in power should have access to knowledge. That knowledge included philosophy, history, stories (literature), music, etc.: in other words, the humanities. This made sense, as only those who were rich and powerful would have the time to study subjects that did not necessarily contribute to the everyday economics of that time: agriculture, warfare, trade skills. Often, those who were educated were the spare sons. This allowed powerful families to control what knowledge was passed down and how that knowledge would be perpetuated.
The advent of the printing press allowed knowledge to be transferred from location to location in greater amounts. Still, higher education was only for those who were "scholars". The economic reason for this was that the serf system allowed powerful families to maintain their power, and knowledge was perceived as a commodity to be controlled by those families who had power, land (and thus resources), and a means to control their serfs.
Adam Smith developed the principles of capitalism as the economy, in the form of the industrial revolution and the age of mercantilism, changed the need for knowledge within the economy. No longer was a person's wealth tied only to family (birth), but also to know-how, skills, and the ability to understand complex systems outside of the local environs. People became "human capital" and became mobile, something that was not possible under serfdom. More importantly, a person could go to a university, if they were clever enough, and "gain" the knowledge that was originally set aside for the children of wealthy and powerful landowners.
There was also a shift towards science and the creation (think industrial revolution) of products, tools, and technology as people moved away from their sources of subsistence (food, water). Soon, in places like the US, there was a recognition that knowledge was a commodity that, when invested, could lead to power, riches, and opportunity. In other words, the university was one means to "acquire" the knowledge that could be used to participate in the economy. However, up until the end of the 20th century, higher education was still perceived as something that could be withheld or distributed, thus allowing some to "possess" knowledge and then use it to be successful in the economy.
During this period, knowledge was also perceived as individual: an individual could pass knowledge on to other individuals. If an individual did not do well in a class, it was because that individual, even though they had access to knowledge, was not able to use it because he or she was lacking in some way (not smart enough, not motivated enough, looking for the wrong type of knowledge). The university was a way to train future leaders in the economy, and as a result, universities decided who had the most potential, which subjects to study, and what would be most useful for the economy. This is one reason why so many universities eventually became government run: the university was a means to implement public policy.
However, also during this time period, the economy changed into one in which corporations, not individuals, became the structure within which economic decisions were made. While much has been written and criticized about corporations, they have had an impact on how business is done and who controls resources. Adam Smith's theories included an explanation of motivations based on the serfdom model, in which the individual landowner would have a self-interest in making sure that those within his or her community were taken care of. However, as communities became mobile, and companies were no longer run by individuals but rather by a collective making decisions, his theories no longer held true.
New Basis for Economic Model for Higher Education
Much work has been done in the last two decades on the knowledge economy. In addition, during the 20th century, there was a realization that the economic principles of the past were not fitting the economic realities of the present. With this in mind, any new economic model for higher education will have to take the following premises into account:
1) Knowledge is no longer just an individual "thing" possessed internally. Knowledge can be collective (within an organization, for example), located externally (via the web, for example), and time-dated (it can become irrelevant the moment it is created).
2) Humans are no longer "capital" that can be, or are expected to be, moved around to take advantage of opportunities. When they do move, it is often based on many factors, most of which may not be quantifiable. Humans don't always make "rational" decisions. And societies in the 21st century have (for the most part) recognized that individuals have the right to make decisions about their education, their work, where they live, and what they do with their free time.
3) Every individual has the right to education and literacy. It should no longer (for the most part) be limited to just those born into power and privilege.
4) Knowledge and services are major contributors to the economy. The basis of many of our jobs is the ability to learn new skills and apply both individual and collective knowledge to a situation.
As a result, it is clear to me that the capitalistic model currently used to decide what we should be doing with higher education is no longer relevant. In my next post, I will try to present some of my ideas on what a new economic model should include.
Tuesday, November 2, 2010
A new model for Higher Education
I recently read two good blog posts about higher education: one by Clark Quinn and the other by Andy Coverdale. In both posts they point out the need to change the way education is provided at the university and the way instructors/professors are trained to teach in the university.
This, and the extreme budget cuts to our university in the face of rising enrollments, got me thinking about the call for "reform" in how our universities are run in the US today.
Current system in the US
To understand what we are up against in the US, it is important to understand the model of education as it currently stands here. Our current system is based on a belief that the ultimate goal of education is to become an expert (redefined as "specialist" in the 1980s) in a specific field of study. In other words, the Ph.D. holds all knowledge about a content area, thus making them an "expert".
Through a broad basic education in high school (secondary school), a person is expected to learn the basic requirements of functioning in our society: understanding our culture through the study of history, literature, and social studies; basic written communication skills through the study of language arts; basic calculation skills through the study of math; and an understanding of our environment, health, and work processes through the study of science. This is the ideal.
What used to be called junior college, now called community college, has developed into two tracks. The first is vocational and advanced technical training to meet the need for an educated workforce (but not management), especially in manufacturing and the service industry; the second is preparation for those underprepared for, or not able to afford, a university or college education. In the second case, students are expected to take a broad range of courses across disciplines; in the first, students are expected to become proficient in a given skill or discipline. However, in our current model of community college education, those who finish community college (usually with an associate's degree) do not hold expertise even if they have specialized in an area. Rather, they are able to work with the experts and/or gain expertise as they work within the discipline.
The current model for undergraduate education is two years of general education courses (also known as gen ed or core courses) drawn from categories of disciplines (e.g., quantitative studies, language and arts, culture, social sciences, man and environment). Then a student will specialize, or "major" or "minor," in a field. The traditional majors and minors normally fall into the humanities, social sciences, applied sciences, natural sciences, liberal arts, or professional schools (pre-law, pre-med, education, accounting, etc.). Each major normally has a dedicated faculty consisting of tenured and/or full-time professors and adjunct, part-time, or student instructors. In the last two decades, "interdisciplinary" majors have emerged, drawing their faculty from different majors. Tuition flows into the traditional majors to sustain faculty positions and support staff; the interdisciplinary major ends up being "gravy" (extra money), as there are no support staff or dedicated instructors for these majors.
One problem with interdisciplinary majors (which I suffered at both the undergraduate and graduate level, since both of my degrees were interdisciplinary) is that many of their required courses are cut during budget crises because they are perceived as "electives" within the traditional majors. The result is that required courses for interdisciplinary majors are cut and students in these majors are unable to complete their coursework in a timely manner. This has just happened with a course I have taught in our major; it is now part of Public Policy, an interdisciplinary major. Normally the course is offered every 2 or 3 semesters, depending on faculty interest. But now that it is part of another major, demand for the course has increased. It is possible that I will need to teach it more often, or, if I leave, that it won't be offered at all (we are short staffed within the Communication Dept. for our own required courses as it is).
When students leave with a bachelor's degree at the end of their college experience, they are expected to have a certain cache of skills and abilities that will make them employable. As a result, more and more colleges are basing their curriculum on employer needs (e.g., training in specific computer programs or specific accounting law, or the ability to be licensed or certified in a field). The college graduate, in other words, is expected to take away from college the content they will need in the workplace.
At the master's and Ph.D. level, students are expected to drill down to one area of expertise specific to the field of study they are pursuing. Graduate studies are based on the expertise of the faculty in a program/field of study. In our department (Communication), for example, our programs focus on healthcare communication, political communication, and interpersonal communication. Other schools of communication might focus on mass communication, written communication, speech communication and disorders, intercultural communication, communication strategy, organizational communication, communication technology, and so on. Many graduate schools try to build a reputation in a marketable area: they will hire new faculty to reflect trends in specialties and encourage tenured faculty to change their expertise through grant-writing support and research funding. A department that does not bring in funding (through research, grants, or student tuition) will usually have its programs cut, or be cut entirely, from the university.
Impact of this model on the Current Higher Ed System
This business model of higher education does not connect with the educational needs of the 21st century. As our economy and society move into the knowledge economy, CONTENT is not as important as understanding how to find, interpret, analyze, and update content/expertise. Companies may be looking for specific content from graduates, but what they need are employees who have critical thinking and reading, communication, analytic, information literacy, technology literacy, creativity, and collaboration skills. These skills might manifest themselves in different ways within different disciplines, but for the most part they can be found in all fields. As a result, it is important that those at the upper end of higher education (master's, Ph.D.) be prepared to cross the traditional disciplines and understand how each skill functions within a given field of study.
Likewise, the internet has made content available on a mass basis, whereas before social networking it was limited to universities, publishing houses, repositories (such as libraries), and management. Access to information is not as important as knowing how to find that information and what to do with it once it is found. "Expertise" can now be found outside of those trained and educated in a discipline, making the professor-as-expert obsolete. The result is a need for professors who can teach, mentor, and develop lifelong learning skills, something that was limited to graduate students in the past.
With the focus shifting to new skills over content, and with expertise and content accessible outside the university, the current system of testing for content and expertise is lacking. There needs to be a deeper level of assessment that objective tests cannot capture.
Finally, the current process of appropriating funding based on a major or program limits education to those areas dictated by market needs and tradition. New ideas will not be funded, nor will more imaginative, groundbreaking approaches to learning and its application. As education becomes more costly, as students and stakeholders expect more with fewer resources, and as education is in greater demand from populations that would not have considered higher education a generation ago, the current system is not meeting the needs (economic or educational) of US society.
A new model
With this in mind, I'd like to propose a new model for higher education in the US.
1) The curriculum of higher ed should shift its focus from the general-to-specific progression to having students work on a specific area they are interested in, in order to learn lifelong learning skills such as critical reading, self-direction, information literacy, technology literacy, communication, and collaboration. What if freshmen were to start their education with a research project, rather than waiting until the end of their four years to bring everything together? They would learn the basic skills needed to learn in any profession. This would allow them to work in smaller groups, to be mentored by an educational specialist, and to work on those areas where they might be lacking. At the master's and Ph.D. level, students would be expected to move in and out of various disciplines, learning in a complex system rather than limiting their learning to just one area. There would not be Ph.D. departments, but rather one Ph.D. program in which students worked with faculty in multiple settings, doing research in multiple disciplines. This would require a much higher level of thinking and abstraction, creating Ph.D.s who could solve society's problems outside of the unnatural boundaries of academic departments. Many are already doing this.
2) Funding would go to a combination of educational professionals (with Ph.D.s in a variety of disciplines, but also training in adult learning theory), learning centers, research centers, and learning support services (e.g., collaboration, written and spoken communication, critical reading and writing skills, quantitative research methodology and analysis, project-based learning and scientific problem solving).
3) Learning and degree granting would be based on a portfolio of work and oral examinations rather than testing of "content". In fact, using computers to look up content would be encouraged during assessment rather than excluded from the process. My Ph.D. program does this now: we are given some articles to analyze and then an oral exam based on our analysis. The topic can be anything related to education, whether we are interested in it or not, or have learned about it or not. We are given three weeks to prepare a paper and then defend it to a committee. Not only are they testing our understanding of the field, they are testing our ability to learn something new in a short time, to find resources to support this learning, to collaborate with colleagues when we don't understand something, and then to present a viewpoint and support it appropriately.
These are just some ideas I have been kicking around. I am sure there are others who have better and more creative ideas. But one thing is for sure: the system will need to change if we are going to keep up with the changes and needs of society.