About Me

Education, the knowledge society, and the global market, all connected through technology and cross-cultural communication skills: that is what I am all about. Through this blog I hope both to guide others and to travel myself across disciplines, borders, theories, languages, and cultures in order to create connections to knowledge around the world. I teach at the university level in the areas of Business, Language, Communication, and Technology.

Tuesday, November 23, 2010

We just want a chance to try and to be heard

Next week I'll post about my new economic model for Higher Ed. That post is going to require some deep thinking, which can't be accomplished this week as it is Thanksgiving on Thursday. As I posted last year, Thanksgiving is a very important holiday in the US. This week we have relatives visiting from across the country (Seattle), down south (Georgia), and nieces and nephews coming home from college.

I have also been very busy with work, volunteering at my daughter's school, and just life in general as the mother of two teens, one of whom is in the middle of his search for universities to attend next year.

Over the last week, I had a few insights into my own kids, education, and young adults in general.

1) My daughter tried out for two plays this past month. She went in with a positive outlook, confident in her abilities (she has a spectacular voice, if I do say so myself). Her resume is very good, and she is willing to take any part, including chorus, when required.

However, it has been very difficult for her to go through the two auditions she had. In both cases, it was obvious that most of the parts were already pre-cast. As she said, she would have felt better had she been told, "We are only casting part X and part Y, looking for this certain look." Instead, she went through one casting call, then waited for 3 weeks, during which they announced 3 additional casting calls. Then they changed the play and made everyone try out again. It became very obvious during the next audition that the play had already been cast, as some people were told by the director what to sing. She was given a 30-second audition and then told nothing. Others waiting for the audition, however, already knew when callbacks would be, and the implication was that they had been told BEFORE the audition when they should return.

The second audition she had was not so obviously fixed. However, before going to the audition, she was told who would be getting the leads. It is demoralizing for those trying out to know that they have no chance. My daughter can especially relate to one girl whose older sister is a very talented singer (and the other lead). My daughter, like this girl, has always felt that she has lived in the shadow of her sibling.

This leads me to the conclusion that most students just want a chance to show their abilities in a fair and equitable process. This is especially true when they might not have been heard within our system of education. A student who does not test well wants to be able to show that they KNOW things in ways that don't fit into the process. Students feel powerless when they walk into a class preceded by certain expectations, based on siblings or on records that, as often as not, reflect politics or a system in which those who know how to work it come out on top. I can hear students' silent screams when they come into my class with an attitude that says, "I don't care if I do well or not. I'm not going to try, so you can prove I can't." This is why I try to give them as much choice as possible to prove to me (and themselves) that they can.

2) Related to this was work that I did with my daughter's school. My daughter took on an unbelievable task in putting together a musical revue, the first for her "Science/Math/Technology" based school (I put this in quotes because the fact is, the majority of the students are incredibly creative and far more artistic than STEM-minded). She was allowed to do so as long as she accepted all of the students who auditioned. She took up the challenge, put together a series of Broadway songs, worked with students who had never performed before, put together and taught group numbers and harmonies, and taught acting skills she had learned over the past two years doing community theatre. I did mostly supervisory tasks, although I did identify those areas in which her colleagues might not have understood her.

I was very impressed with how she handled the performers in the revue. She made each of them feel as if they were vitally important to the show. She also knew when to get on their case when they did not focus in rehearsal, gave up, or did not practice. At one point, when she had to come to rehearsal late, they invoked her name, afraid that she would be angry if they didn't buckle down and do what she had directed them to do. A theatre professional who attended the revue told me that she had done a wonderful job putting the show together, highlighting the students' strengths and ensuring that the weaker performers did not follow very strong performers.

I did work with three of the performers who had little to no experience. One of my strengths, I have found, is creating the confidence in my students to try new things, and to continue on when they perceive they have failed (or to reset their standards).

Students often just want a chance to try things and to feel as good when they try and fail as when they try and do a spectacular job. So when my son and two of his classmates, in their last year of high school, played on the football team for the first time in their lives, they felt great about it, even though they did not get a lot of playing time. Why? Because their coach made them feel that he respected them just for trying something new. At the end of the season, he recognized each one of them at a football banquet attended by over a hundred players and their families, pointing out how each had worked hard to learn the new skill and contributed to the team. It was amazing to see the pride each of them had, even though most played only about 1-2 minutes each game (out of a possible hour).

My son and his classmate just asked if they could sit in on an advanced French class in December. Their college classes don't start again until January, and both are interested in French (they completed their requirements in Spanish). Their teacher was more than happy to have them come to the class, although he warned them that they probably would not understand very much, as neither has studied French. My daughter is taking Art in her free time as an independent study. In both cases, the teachers could have turned them down, but instead they encouraged them to try something new.

Good teachers want their students to try, and they take pride in the journey and development more than in the final accomplishment (the test grade). Unfortunately, in the current educational climate in the US, this is not recognized. Low test scores equate to ineffective teaching, and the lifelong learning skills these teachers often develop in their students are lost in the process.

3) Young adults are works in progress until their early 20s. My son, a fairly intelligent, responsible teenager, still has his moments of total stupidity. Yesterday, while horsing around, he ended up with a face full of glass when one of his friends (also usually responsible) put his hand through a window (he thought it was plexiglass). When his mother, the school nurse, the dean of academics, and I asked the same question, "What were you thinking?", their answer was the same: "We didn't know it was glass."

I think we expect too much of young adults: that they know what they are doing is bullying, inappropriate, or dangerous; that they know what they want to do with the rest of their lives; that they are not going to make mistakes. Of course, they also like to assert their independence. What is important is that we allow them to make mistakes that won't derail their lives, that we allow them to crawl out of the messes they have made, that we are there not to "save" them or take on their problems but to support them as they work through the problems everyone must face as part of life, and that we teach them the skills to deal with a life that can sometimes be overwhelming for them.

I see many of my students who have no one to say, "You know, you're doing a good job coping; life sucks sometimes, but you have to keep going; there IS a light at the end of the tunnel; keep a positive outlook." I also remind my students of those who have it worse than they do (although sometimes it is hard when I hear some of their stories). One way to help students cope is to have them help others.

4) Finally, I think we in the US have to recognize that, ultimately, most teachers get into teaching because they really care about their students. While we may not always agree with their styles, and not every teacher's style will be effective with every student, teachers DO NOT get into education because they will have their summers off (the fact is, most states require teachers to complete additional training during their "time off"). They truly believe they can teach. Most education programs weed out those who don't like students or are unable to connect with them. I always have to catch myself when my daughter or son has a problem with a teacher. They may not be "good teachers" as I would define them, but for the most part, they do care about their students.

Tuesday, November 16, 2010

A new economic model for Higher Education: Part 1 history

More and more people have recently been writing about a new economic model for higher education (Andy Coverdale, Clark Quinn, and Tom Haskins, just to name a few). However, as we grapple around the world with how higher education should be structured and funded, we aren't willing to reexamine the underlying beliefs upon which the funding and academic structures were created. Now is the time to look at the basis of the traditional structures, how they have changed, and what a new structure would need in order to meet higher education's current needs.

The History of Higher Education in the West


Higher education came out of a belief that only those in power should have access to knowledge. That knowledge included philosophy, history, stories (literature), music, etc.; in other words, the humanities. This made sense, as only those who were rich and powerful had the time to study subjects that did not necessarily contribute to the everyday economics of the time: agriculture, warfare, trade skills. Often, those who were educated were the spare sons. This allowed powerful families to control what knowledge was passed down and how that knowledge would be perpetuated.

The advent of the printing press allowed knowledge to be transferred from location to location in greater amounts. Still, higher education was only for those who were "scholars". The economic reason for this was that the serf system allowed powerful families to maintain their power, and knowledge was perceived as a commodity to be controlled by those families who had power, land (and thus resources), and a means to control their serfs.

Adam Smith developed the principles of capitalism as the economy, in the form of the industrial revolution and the age of mercantilism, changed the need for knowledge. No longer was a person's wealth tied only to family (birth); it was also tied to know-how, skills, and the ability to understand complex systems outside of the local environs. People became "human capital" and became mobile, something that was not possible under serfdom. More importantly, a person could go to a university, if they were clever enough, and "gain" the knowledge that had originally been set aside for the children of wealthy and powerful landowners.

There was also a shift towards science and the creation (think industrial revolution) of products, tools, and technology as people moved away from their sources of subsistence (food, water). Soon, in places like the US, there was a recognition that knowledge was a commodity that, when invested, could lead to power, riches, and opportunities. In other words, the university was one means to "acquire" the knowledge that could be used to participate in the economy. However, up until the end of the 20th century, higher education was still perceived as something that could be withheld or distributed, thus allowing some to "possess" knowledge and then use it to be successful in the economy.

During this period, knowledge was also perceived as being individual. An individual could pass knowledge on to other individuals. If an individual did not do well in a class, it was because that individual, even though they had access to knowledge, was not able to use it because he or she was lacking in some way (not smart enough, not motivated enough, looking for the wrong type of knowledge). The university was a way to train future leaders in the economy, and as a result, universities decided who had the most potential, which subjects to study, and what would be most useful for the economy. This is one reason why so many universities eventually became government run. The university was a means to implement public policy.

However, also during this time period, the economy changed into one in which corporations, not individuals, became the structure within which economic decisions were made. While much has been written and criticized about corporations, they have had an impact on how business is done and who controls resources. Adam Smith's theories included an explanation of motivation based on the serfdom model, in which the individual landowner had a self-interest in making sure that those within his or her community were taken care of. However, as communities became mobile, and as companies came to make decisions collectively rather than as individuals, his theories no longer hold.

New Basis for Economic Model for Higher Education


Much work has been done in the last two decades on the knowledge economy. In addition, during the 20th century, there was a realization that the economic principles of the past were not fitting the economic realities of the present. With this in mind, any new economic model for higher education will have to take the following premises into account:

1) Knowledge is no longer just an individual "thing" possessed internally. Knowledge can be collective (within an organization, for example), located externally (via the web, for example), and time-dated (it can be irrelevant the moment it is created).

2) Humans are no longer "capital" that can be, or are expected to be, moved around to take advantage of opportunities. When they do move, it is often based on many factors, most of which may not be quantifiable. Humans don't always make "rational" decisions. And societies in the 21st century have (for the most part) recognized that individuals have the right to make decisions about their education, work, where they live, and what they do with their free time.

3) Every individual has the right to education and literacy. It no longer (for the most part) should be limited to just those born into power and privilege.

4) Knowledge and services are major contributors to the economy. The basis of many of our jobs is the ability to learn new skills and apply both individual and collective knowledge to a situation.


As a result, it is clear to me that the capitalistic model currently used to decide what we should be doing with higher education is no longer relevant. In my next post, I will try to present some of my ideas on what a new economic model should include.

Tuesday, November 9, 2010

Culture and Technology

This post is very much a work in progress, meant to help me understand some of what I am seeing in my dissertation. So I apologize for the lack of specific references at this point. I am hoping to find some to support the ideas I have uncovered.

Defining Culture and Technology

There has been a lot written over the last 2 decades on the impact of culture on technology and the impact of technology on culture. Betty Collis and Catherine McLoughlin have written extensively on this issue. Rather than reiterate what they have written, I would like to look at a framework for further research in culture and technology.

To begin with, work on the interaction of culture and technology often looks at the influence of one on the other. However, I feel that culture is the unseen basis of technology. Technology can be a process, a tool, and/or the use of a tool or process. As a result, knowledge is at the basis of what technology is. Epistemology (our beliefs about what knowledge is) is grounded in our cultures. This becomes evident when someone changes cultures or is introduced to a culture other than the one in which they grew up.

For example, when children first go to school, suddenly they are aware that there are differences between what the school believes is knowledge, what their classmates believe is knowledge, and what their families believe is knowledge (thus, I was told I "didn't know how to write my name" when I began school because it was not my given name I had learned--Virginia--but rather my nickname--Gin).

So our understanding of what a technology is and how it can be used may change if there is a cultural challenge to our understanding of that technology. At that point, we can either adapt the technology, change the technology we are using, or require that others use our technology.

Considerations for culture and technology research

In my current research, I have seen how organizational, departmental, personal, or professional cultures influence the understanding, use, and acceptance of technology for a given situation. In this section I will identify some of the factors that influence the impact of culture on and by technology.

1) Affordances: An affordance is the use of a technology for a given situation. It is the ability of a process, tool, or use of the process or tool to allow us to accomplish something. Many times, what we look for in an affordance for a specific situation is based on how that technology has been used in the past and what we understand its capabilities to be. If the technology does not allow us to accomplish what we used it for, then we conclude either that we did not use it correctly or that the technology does not work. Rarely do we look at whether our expectations of the technology differed from others' expectations.

For example, my sister currently lives in the Midwest and has embraced a midwestern, Protestant, rural culture. However, the New York, small-town, Catholic culture in which she was raised comes out when she uses technology. Unlike her husband (who was raised in the culture where she now lives), she wants to be able to individualize the technology she uses and expects to work with ITS personnel to modify the technology, or to be given new tools, when she finds the technology lacking. A case in point was her use of an LMS that she did not feel met the needs of her class. Her colleagues simply adapted what they were given to their own teaching, while my sister demanded that ITS look for modifications so that she could accomplish the learning and communication goals she had set for her class. She expected better affordances for monitoring student progress, for students to interact with content, and for teacher-student communication outside of class.

2) Design: The spatial relationship with processes, tools, and the uses of those tools and processes differs depending on the cultural epistemology and context. In high-context cultures, I feel there will be less variety in the understanding of, and expectations for, a given technology (within that culture), whereas in a low-context culture there will be more variety. In addition, many Western cultures will use a linear relationship within the technology, while Eastern cultures may be more apt to use a spatial relationship with the technology. There will also be differences in the human/technology interaction and the human/technology/human interaction. This makes sense given the differences between cultures in the way they organize information, communicate ideas, and validate knowledge.

3) Visual and language differences: How a tool looks, how a process is communicated, and the terms and symbols that are integrated into a technology will differ between cultures, because these are all at the heart of cultural differences. For example, some languages are read from right to left, and some writing systems are based on symbols for ideas rather than phonetic symbols. Many cultures value oral traditions over written, written over visual, or value oral, visual, and written traditions equally. As a result, different technologies might be valued differently within one culture than within another.


Future research

I feel it is important that we begin to look at the culture that is embedded in technology in order to understand how people decide which technologies to use and how to use them. This would also help us identify the factors we need to consider when choosing appropriate technology for use with or in other cultures, and the impact that embedded culture would have on the technology's implementation and use within the culture.

Tuesday, November 2, 2010

A new model for Higher Education

I recently read two good blog posts about higher education: one by Clark Quinn and the other by Andy Coverdale. In both posts they point out the need to change the way education is provided at the university and the way instructors/professors are trained to teach in the university.

This and the extreme budget cuts to our university in the face of rising enrollments got me thinking about the call for "reform" in how our universities are run in the US today.

Current system in the US

To understand what we are up against in the US, it is important to understand the model of education as it currently stands here. Our current system is based on a belief that the ultimate goal of education is to become an expert (redefined as "specialist" in the 1980s) in a specific field of study. In other words, the Ph.D. holds all knowledge about a content area, thus making them an "expert".

Through a broad basic education in high school (secondary school), a person is expected to learn the basic requirements of functioning in our society: understanding our culture through the study of history, literature, and social studies; basic written communication skills through language arts; basic calculation skills through math; and an understanding of our environment, health, and work processes through science. This is the ideal.

What used to be called Junior College but is now called Community College has developed into two tracks: the first is vocational and advanced technical training to meet the need for an educated workforce (but not management), especially in manufacturing and the service industry, whereas the second is preparation for those underprepared for, or unable to afford, a university or college education. In the first case, students are expected to become proficient in a given skill or discipline; in the second case, students are expected to take a broad range of courses across disciplines. However, in our current model of community college education, those who finish community college (usually with an associate's degree) do not hold expertise even if they have specialized in an area. Rather, they are able to work with the experts and/or gain expertise as they work within the discipline.

The current model for undergraduate education is 2 years of general education courses (also known as gen ed or core courses) drawn from categories of disciplines (e.g., quantitative studies, language and arts, culture, social sciences, man and environment, etc.). Then a student will specialize, or "major" or "minor", in a field. The traditional majors and minors normally fall into the humanities, social sciences, applied sciences, natural sciences, liberal arts, or professional schools (pre-law, pre-med, education, accounting, etc.). Each major normally has a dedicated faculty consisting of tenured and/or full-time professors and adjunct, part-time, or student instructors. In the last two decades, "interdisciplinary" majors have emerged, with faculty drawn from different majors. Tuition flows into the traditional majors to sustain faculty positions and support staff. The interdisciplinary major ends up being "gravy" (extra money), as there are no support staff or dedicated instructors for these majors.

One problem with interdisciplinary majors (which I suffered through at both the undergraduate and graduate levels, since both of my degrees were interdisciplinary) is that many of their required courses are cut during budget crises because they are perceived as "electives" within the traditional majors. The result is that required courses for interdisciplinary majors are cut and students in these majors are unable to complete their course work in a timely manner. This has just happened with a course I have taught in our major. It is now part of Public Policy, an interdisciplinary major. Normally the course is offered every 2 or 3 semesters, depending on faculty interest. But now that it is part of another major, demand for the course has increased. It is possible that I will need to teach it more often, or, if I leave, it won't be offered at all (we are short-staffed within the Communication Dept. for our own department's required courses as it is).

Once students leave with a Bachelor's degree, at the end of their college experience, they are expected to have a certain set of skills and abilities that will make them employable. As a result, more and more colleges are basing their curriculum on employer needs (e.g., specific computer programs, specific accounting law, the ability to be licensed or certified in a field). The college graduate, in other words, is expected to take away from college the content they will need in the workplace.

At the master's and Ph.D. levels, students are expected to drill down to one area of expertise, that area being specific to the field of study they are pursuing. Graduate studies are based on the expertise of the faculty in a program or field of study. In our department (Communication), for example, our programs focus on healthcare communication, political communication, and interpersonal communication. Other schools of communication might focus on mass communication, written communication, speech communication and disorders, intercultural communication, communication strategy, organizational communication, communication technology, etc. Many graduate schools try to build a reputation in a marketable area. They will hire new faculty to reflect trends in specialties and encourage tenured faculty to change their expertise through grant-writing support and research funding. A department that does not bring in funding (through research, grants, or student tuition) will usually see its programs cut, or be cut entirely from the university.

Impact of this model on the Current Higher Ed System

This business model of Higher Education does not connect with the educational needs of the 21st century. As our economy and society move into the knowledge economy, CONTENT is not as important as understanding how to find, interpret, analyze, and update content and expertise. Companies may be looking for specific content from their graduates, but what they need are employees who have critical thinking and reading, communication, analytic, information literacy, technology literacy, creativity, and collaboration skills. These skills might manifest themselves in different ways within different disciplines, but for the most part they can be found in all fields. As a result, it is important that those at the upper end of higher education (master's, Ph.D.) be prepared to cross the traditional disciplines to understand how each functions within a certain field of study.

Likewise, the internet has made content available on a mass basis, whereas before social networking it was limited to universities, publishing houses, repositories (such as libraries), and management. Access to information is not as important as knowing how to find that information and what to do with it once it is found. "Expertise" can be found outside of those trained and educated in the discipline, thus making the expert professor obsolete. The result is a need for professors who can teach, mentor, and develop lifelong learning skills, something that was limited to graduate students in the past.

With the focus on new skills over content, and with access to expertise and content outside of the university, the current system of testing for content and expertise is lacking. There needs to be a deeper level of assessment that objective tests cannot capture.

Finally, the current process of appropriating funding based on a major or program will limit education to those areas dictated by market needs and tradition. New ideas will not be funded, nor will more imaginative, groundbreaking approaches to learning and the application of student learning. As education becomes more costly, as students and stakeholders expect more with fewer resources, and as education is in greater demand from populations that would not have thought of higher education a generation ago, the current system is not meeting the needs (economic or educational) of US society.

A new model

With this in mind, I'd like to propose a new model for higher education in the US.

1) The curriculum of higher ed should shift its focus from moving students from the general to the specific to having students work on a specific area they are interested in, in order to learn lifelong learning skills such as critical reading, self-direction, information literacy, technology literacy, communication skills, and collaboration skills. What if freshmen were to start their education with a research project, rather than waiting until the end of their 4 years to bring everything together? They would learn the basic skills needed to learn in any profession. This would allow them to work in smaller groups, to be mentored by an educational specialist, and to work on those areas where they might be lacking. At the master's and Ph.D. levels, students would be expected to move in and out of various disciplines, learning in a complex system rather than limiting their learning to just one area. There would not be Ph.D. departments, but rather one Ph.D. program in which students worked with faculty in multiple settings, doing research in multiple disciplines. This would require a much higher level of thinking and abstraction, creating Ph.D.s who could work on solving society's problems outside of the unnatural boundaries of academic departments. Many are already doing this.

2) Funding would go to a combination of educational professionals (with Ph.D.s in a variety of disciplines, but training in learning theory for adults), learning centers, research centers, and learning support services (e.g., collaboration, written and spoken communication, critical reading and writing skills, quantitative research methodology and analysis, project-based learning and scientific problem solving, etc.).

3) Learning and degree granting would be based on a portfolio of work and oral examinations rather than a test of "content". In fact, the use of computers to find content would be encouraged during assessment rather than excluded from the process. My Ph.D. program does this now. We are given some articles to analyze and then given an oral exam based on our analysis. The topic can be anything related to education, whether we are interested in it or not, whether we have learned about it or not. We are given 3 weeks to prepare a paper and then defend it to a committee. Not only are they testing our understanding of the field, they are testing our ability to learn something new in a short time, to find resources to support this learning, to collaborate with colleagues when we don't understand something, and then to present a viewpoint and support it appropriately.

These are just some ideas I have been kicking around. I am sure there are others who have better and more creative ideas. But one thing is for sure: the system will need to change if we are going to keep up with the changes and needs of society.