This has me wondering if there is a transitional time for any new technology during which the "directions" are developed and the understanding of how the tool can be used is expanded. I tried to think of this in terms of the new tool that was introduced when I was growing up: the calculator (yes, I know I am dating myself). When it was first introduced, many looked at it as a substitute for the slide rule. How it was integrated into society, what it was used for, and how instructors/teachers brought it into the classroom all changed over time. I look now at how complex mathematical concepts are introduced into the curriculum for my son and daughter BECAUSE of the calculator. They are learning these concepts differently than I did, and with much more depth, since students do not have to be slowed down by looking up numbers on a chart or slide rule (I never used a slide rule since "we had calculators"). I was still learning the computation, with the calculator used as a checking mechanism. How did the use of the calculator change so drastically? It was not overnight.
Many of us do not see the change in the use of a technology as it becomes "mature". Using marketing (technology adoption) concepts, majority adopters tend not to use a new product until it is in the mature stage. So from here I have three questions:
- Do late adopters need to have "directions" or at least a protocol for use (e.g. through standardized training, later versions with built-in support for learning how to use them, or tips and instructions available in various formats)?
- Are new technologies hitting the mature stage earlier (changing the product maturity curve), or gaining a greater number of early and late adopters sooner (changing the adoption curve), or do we just expect that because of the pace of our society?
- What was the process in the past that helped to change curriculum, training, and user awareness as a result of the introduction of a new technology? In other words, if we were to look at the integration of film into education, radio into education, or even mimeographs, what led up to the point where it was acceptable to use these technologies in education? What groundwork needed to be laid?
3 comments:
Tena koe V
My two young daughters 'play' with mobile phones, iPods and games consoles. I watch them find things out. They use the suck-it-and-see approach, often with lightning speed.
But it's the suck-it-and-see approach, nothing else. Their instruction booklets and leaflets are crisp as the day they tumbled onto the floor when the devices were unpacked.
I think the so-called intuitiveness of the applications (not the users) goes a long way to helping with this. But without the suck-it-and-see approach the oldies just have to read the manual - slow as.
Ka kite
from Middle-earth
But why do you think younger users use this approach while older ones tend to want a manual (notice I said want and not use--my husband wants the manual but rarely uses it until he really gets into trouble)? Do you think it is age, how they have been enculturated, education, or personality?
Kia Ora V
I firmly believe that the reason only 10% of the features are used in most software (and I embrace the functionality of a range of electronic devices here) is that the rest is never found.
It is only when I read the manual that I find features that my children didn't know existed. Frankly, I don't think it is the fault of the children, me, or the manual. It is a fault of the software that such features appear to be (and are) so obscure.
I could name a few software manufacturers that are guilty of this sort of design, but it would be unethical here.
I'm sure you will have your own list of company names :-)
Ka kite
from Middle-earth