Sunday, July 30, 2006

Dissertation

Data Collection
Data were collected over a two-year period, divided into two parts: Cycle 1, the first twenty-two months, and Cycle 2, the final two months. Cycle 1 was an extended period of analysis and design. Cycle 2 was a rapid period of evaluation and design using classroom experiments.
Participants in Cycle 1 included six members of the design team, four subject matter experts, and six teacher-designers. The design team consisted of one professor of Instructional Design, four graduate students in Education, and one computer programmer. The design team met with the subject matter experts and teacher-designers each week during the first eight months of the cycle, after which the team members met only with each other for the remainder. During those first eight months, the teacher-designers participated in an intensive graduate program of studies in learning design. Two subject matter experts, a professor of geomorphology and a geoscience curriculum developer, were brought in as special guests to the class; two undergraduate students in geology assisted the teachers in class. During the next fourteen months, the design team met both face-to-face and virtually to discuss design ideas and review prototypes, work that led to the development of the GO Inquire software.
Participants in Cycle 2 included the # students, three teachers, two instructional aides, and the computer programmer. Our initial purpose during Cycle 2 was to test the GO Inquire software with students and teachers. Two initial tests were conducted with fourth- and fifth-grade classrooms, which used the software after standardized testing toward the end of the year. In the final two full weeks of school, we revisited the classes over two to three days with a new set of classroom activities developed from observations of the initial tests of GO Inquire. With the fourth-grade class we were able to test GO Inquire again on the last day of Cycle 2, after the new activities.
The design prototypes documented during Cycles 1 and 2 serve as the primary data source. From these objects we examine the reasoning behind their creation and abstract the principles that guided their making. Prototypes and supporting documentation were stored and accessed online.
Observational data on the design process were not always captured in the design documents. Reporting on design is a narrative act (Barab et al., in press) because design decisions are historical in nature. Evidence of design decisions appears in the prototypes, where those decisions are instantiated as products, but also in the form of memos and discussions. The memos are stored as design documents, and the discussions during Cycle 1 design meetings were recorded and stored online. Sometimes design memos, or simply design ideas that needed to be recorded, preceded the prototypes.
Formative tests of each of the design materials were conducted in the classroom during Cycle 2. Video recordings of each session were transferred and stored online for review.
Data Analysis
To answer our research question, how do we design science curricular materials when existing teaching tools and practices are not congruent with the nature of the science, we first outlined a model of the geological sciences, in which reasoning and observation differ from the experimental sciences. The analytical question is how to filter through the data in a systematic way. We begin with the aspects or features of the design, listing and selecting the important ones from our experiences as designers. We then work to construct a narrative from the observational data that conveys the reasoning that went into the design of each feature. We also introduce into these narratives the analysis that occurs within the design process itself; as in qualitative research, analysis begins as soon as data are collected (Maxwell, 2005). Looking across the reasoning, we begin to see similarities that are not apparent in the features themselves. We then group these similar reasons into principles, or themes, and search for observations and evidence that support the reasoning behind each principle.
Since we cannot rely on experimental design logic, such as counterfactuals or correlations, analytical tests were constructed to improve the warrant for each principle as a claim (Toulmin), drawing on the Cycle 2 evaluation data as evidence. For Principle 1, a test of similarity is the best test given the evidence available: we seek to determine whether students' cognitions are in fact similar to a geologist's, looking for behaviors that indicate scientific reasoning rather than scholastic reasoning. If students are simply performing academic tasks rather than scientific ones, perhaps we have designed tasks that do not address geological observation. For Principle 2, we draw on diffusion of innovations research (Rogers, 2003) to create a test of compatibility. If we have designed a solution that works within scholastic constraints, then we expect that teachers would not find it too different from their normal way of teaching. We acknowledge one primary difference, nature-of-science beliefs, but we seek evidence supporting the claim that a compatible solution leads to positive attitudes toward the innovation. For Principle 3, we draw on strategic experiments in organizations (Vos..). Measures of performance for an innovation should become more reliable over time; we expect things to change progressively, not catastrophically. With the infrastructure in place, the cost of change is reduced because change becomes faster. We seek evidence that the open technologies we chose led to low-cost development and rapid change.
Results
Principle 1. Start with a scientifically literate event. The problem this principle addresses is that the eventual task students are to perform must be congruent with the nature of geological science. The task must scaffold the student in the kinds of basic reasoning processes performed by scientists, and the work of the designer is to simplify the task so that a novice can perform it. How did we come to identify and select this task? Our experience with a professor of geomorphology gave us unique insight into the geological reasoning process. We asked him to take us on a nature walk in a park next to an elementary school, to present a basic lesson on fluvial processes and his own current research, and to sit for an interview about his scientific processes. We learned that imagery, and the layering of that imagery, played a large role in structuring his knowledge and his ability to identify evidence. Using this task analysis, we constructed an interface in which students' interpretations of simple landform features, such as high and low points, could be layered onto a photograph. Drawing from the nature walk, one designer walked around the elementary school grounds observing for visual evidence of erosion, deposition, and transportation. In testing we saw evidence of behavioral similarity. After the first test of GO Inquire, one student said she could now see the location very differently than before. Also, in the field guide activity, we noticed students scanning the ground for evidence of erosion, such as bare patches without grass. The lesson here comes from Latour (1986): by capturing the scientific event in a sharable, flat object of human scale, others can learn from it through efficient interpretation.
Principle 2. Design within scholastic constraints. The problem this principle addresses is that teachers are resistant to changes in their instruction. A lesson from inquiry science reform is that the complex nature of this science is very different from how teachers view science education. We learned this lesson speaking with elementary teachers who were learning design. A teacher's design planning is procedural and programmatic, starting with standards and working with the resources available. Constraints on time from the curriculum, school periods, and content coverage across multiple subjects must be respected. Our design solution fit only the fifth-grade standard, not the fourth-grade standard. An important recognition was that going outside the classroom was disruptive, caused loss of time, and demanded more teacher energy. A curriculum or activity requiring outdoor work, an obvious solution for content about learning about the earth, was not likely to be compatible with teachers' plans and therefore not likely to be adopted. In working with our primary teacher, PM, we found we needed to design a series of panels so that the teacher could construct her own content. The cutting task emerged as an activity through the addition of writing components, meaningful ordering, and diagramming. As the teacher put it, "We have to make it instructional." The primary teacher mentioned wanting to use GO Inquire again, but there was little evidence of a diffusion effect among the other teachers, and even less enthusiasm for the task structures as innovations. The early lesson here might be that an innovation must be fully developed before adoption. The teacher is the gatekeeper, and what the teacher assesses is student engagement. We found that student groups could work for long periods, up to an hour without a break, which surprised the teacher. The field guide activity was conducted while Field Day activities were going on, yet students remained engaged despite many possible distractions. Such engagement can be a marketable feature that leads to adoption decisions.
Principle 3. Use open technologies. The problem this principle addresses is the scarcity of expertise. We were not experts in content writing. Selecting an open technology, a database-driven web application, provided us a platform to distribute expertise at scale, in large quantities and across distances. This time shifting, recording scientific events and then sharing them with students, gives students access to authentic activities. The open platform allowed us to build interactive scaffolds to shape and develop cognitions; it also allowed students to generate content. Students in one class enjoyed this feature the most, finding it a very different experience in which there were many answers, not just one. Selecting the open technology of photographs not only interconnected with the database but also allowed us to distribute these scientifically literate events through other media, namely paper. Paper prototypes allowed us to incorporate other activities not amenable to computer-based instruction. The prototypes that developed into the cutting and field guide activities were designed in the weeks after the GO Inquire tests. Instead of bringing students into a computer lab for three straight days, we used the same photographs across three very different activities. We were also able to incorporate the teacher's design of a summarization page into GO Inquire for a final activity with the fourth graders; the addition required only minor modification of the previous design. Changes and new designs occurred rapidly during Cycle 2 at no additional cost.
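To make the architectural idea concrete, the sketch below shows, in Python with SQLite, the kind of minimal data model a database-driven web application of this sort might use. The table names, fields, and sample rows are hypothetical illustrations, not the actual GO Inquire schema; the point is only that when photographs, scaffold prompts, and student-generated annotations are stored as separate linked records, layering interpretations onto imagery, reusing the same photographs across activities, and adding new components such as the summarization page become matters of adding rows and small queries rather than rebuilding the application.

# Hypothetical sketch (not the actual GO Inquire schema): a minimal
# database-driven model in which photographs, scaffold prompts, and
# student-generated annotations are stored as separate, linked records.
import sqlite3

conn = sqlite3.connect(":memory:")  # in-memory database for illustration
conn.executescript("""
CREATE TABLE photographs (
    id      INTEGER PRIMARY KEY,
    caption TEXT                 -- e.g., a site on the school grounds
);
CREATE TABLE scaffolds (
    id       INTEGER PRIMARY KEY,
    photo_id INTEGER REFERENCES photographs(id),
    prompt   TEXT                -- e.g., an interpretive task for the photo
);
CREATE TABLE annotations (
    id          INTEGER PRIMARY KEY,
    scaffold_id INTEGER REFERENCES scaffolds(id),
    student     TEXT,
    layer       TEXT             -- the student's interpretation, layered on the photo
);
""")

# A new activity reuses existing photographs by adding rows,
# not by rebuilding the application.
conn.execute("INSERT INTO photographs (id, caption) VALUES (1, 'Bare patch near the playground')")
conn.execute("INSERT INTO scaffolds (id, photo_id, prompt) VALUES (1, 1, 'Mark evidence of erosion')")
conn.execute("INSERT INTO annotations (scaffold_id, student, layer) VALUES (1, 'Student A', 'bare soil, no grass')")

# Retrieve each student layer together with its photograph and prompt.
for row in conn.execute("""
    SELECT p.caption, s.prompt, a.student, a.layer
    FROM annotations a
    JOIN scaffolds s ON a.scaffold_id = s.id
    JOIN photographs p ON s.photo_id = p.id
"""):
    print(row)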

Friday, July 21, 2006

Bowe, 2000

Bowe, F. (2000). Universal design in education: Teaching nontraditional students. Westport, CT: Bergin & Garvey.

Bichelmeyer, 2004

Bichelmeyer, B. A. (2004). "The ADDIE model" – A metaphor for the lack of clarity in the field of IDT. Paper presented at the annual meeting of the Association for Educational Communications and Technology, Chicago, IL.

Kenny, Zhang, Schwier, & Campbell, 2005

Kenny, R. F., Zhang, Z., Schwier, R. A., & Campbell, K. (2005). A review of what instructional designers do: Questions answered and questions not asked. Canadian Journal of Learning and Technology, 31. Retrieved on July 21, 2006 from http://www.cjlt.ca/content/vol31.1/kenny.html

Kirschner, Carr, van Merriënboer & Sloep, 2002

Kirschner, P., Carr, C., van Merriënboer, J., & Sloep, P. (2002). How expert designers design. Performance Improvement Quarterly, 15, 86-104.

Crawford, 2004

Crawford, C. (2004). Non-linear instructional design model: Eternal, synergistic design and development. British Journal of Educational Technology, 35, 413-420.

Rowland, 1993

Rowland, G. (1993). Designing and instructional design. Educational Technology Research and Development, 41, 79–91.

Shambaugh, & Magliaro, 2001

Shambaugh, N., & Magliaro, S. (2001). A reflexive model for teaching instructional design. Educational Technology Research and Development, 49, 69-92.

Wednesday, July 19, 2006

Barnett & Pratt, 2000

Barnett, C. K., & Pratt, M. G. (2000). From threat-rigidity to flexibility - Toward a learning model of autogenic crisis in organizations. Journal of Organizational Change Management, 13, 74-88.

Staw, Sandelands, & Dutton, 1981

Staw, B. M., Sandelands, L. E., & Dutton, J. E. (1981). Threat rigidity effects in organizational behavior: A multilevel analysis. Administrative Science Quarterly, 26, 501-524.

Monday, July 17, 2006

Kelley, 2005

Kelley, T. (2005). The ten faces of innovation: IDEO's strategies for defeating the devil's advocate and driving creativity throughout your organization. New York: Doubleday.

Smith & Ragan, 2005

Smith, P. L., & Ragan, T. J. (2005). Instructional design (3rd ed.). Hoboken, NJ: John Wiley & Sons.

Sunday, July 16, 2006

(Goldin-Meadow, Wein, & Chang, 1992).

Goldin-Meadow, S., Wein, D., & Chang, C. (1992). Assessing knowledge through gesture: Using children's hands to read their minds. Cognition and Instruction, 9, 201-219.

(Baker & Dwyer, 2000)

Baker, R., & Dwyer, F. (2000). A meta-analytic assessment of the effect of visualized instruction. International Journal of Instructional Media, 27, 417-426.

(Reynolds & Peacock, 1998)

Reynolds, S. J., & Peacock, S. M. (1998). Slide observations – Promoting active learning, landscape appreciation, and critical thinking in introductory geology courses. Journal of Geoscience Education, 46, 421-426.

(Posner & Gertzog, 1982)

Posner, G. J., & Gertzog, W. A. (1982). The clinical interview and the measurement of conceptual change. Science Education, 66, 195-209.

(Posner, Strike, Hewson, & Gertzog, 1982).

Posner, G. J., Strike, K. A., Hewson, P. W., & Gertzog, W. A. (1982). Accommodation of a scientific conception: Toward a theory of conceptual change. Science Education, 66, 211-227.

(Minstrell, 1992)

Minstrell, J. (1992). Facets of students' knowledge and relevant instruction. In R. Duit, F. Goldberg, & H. Niedderer (Eds.), Research in physics learning: Theoretical issues and empirical studies (pp. 110-128). Kiel: IPN.

(Blumenfeld, Mergendoller, & Swarthout, 1987)

Blumenfeld, P. C., Mergendoller, J. R., & Swarthout, D. W. (1987). Task as a heuristic for understanding student learning and motivation. Journal of Curriculum Studies, 19, 135-148.

(Doyle & Carter, 1984).

Doyle, W., & Carter, K. (1984). Academic tasks in classrooms. Curriculum Inquiry, 14, 129-149.

(McGee, Howard, & Hong, 1998)

McGee, S., Howard, B. C., & Hong, N. (1998). Evolution of academic tasks in a design experiment of scientific inquiry. Paper presented at the annual meeting of the American Educational Research Association in San Diego, CA.

(McLoughlin & Krakowski, 2001).

McLoughlin, C. & Krakowski, K. (2001, September). Technological tools for visual thinking: What does the research tell us? Paper presented at Apple University Consortium Academic and Developer's Conference, Townsville, Queensland, Australia.

(Crowley & Jacobs, 2002)

Crowley, K., & Jacobs, M. (2002). Building islands of expertise in everyday family activity. In G. Leinhardt, K. Crowley, & K. Knutson (Eds.), Learning conversations in museums (pp. 333–356). Mahwah, NJ: Lawrence Erlbaum Associates.

(Jennings, Swindler, & Koliba, 2005).

Jennings, N., Swindler, S., & Koliba, C. (2005). Place-based education in the standards-based reform era – Conflict or complement? American Journal of Education, 112, 44-65.

(Gass, Mackey, & Ross-Feldman, 2005).

Gass, S., Mackey, A., & Ross-Feldman, L. (2005). Task-based interactions in classroom and laboratory settings. Language Learning, 55, 575-611.

(Edelson, 2002).

Edelson, D. C. (2002). Design research: What we learn when we engage in design. The Journal of the Learning Sciences, 11, 105-121.

(Gruenewald, 2003).

Gruenewald, D. A. (2003). The best of both worlds: A critical pedagogy of place. Educational Researcher, 32(4), 3-12.

(Smith & Reiser, 2005)

Smith, B. K., & Reiser, B. J. (2005). Explaining behavior through observational investigation and theory articulation. The Journal of the Learning Sciences, 14, 315-360.

(Qutub, 2005)

Qutub, J. (2005, October). Are realistic or are abstract visual representations more effective tools in technology-based geosciences education? Poster presented at the annual meeting of the International Visual Literacy Association in Orlando, FL.

Saturday, July 08, 2006

(Latour, 1986).

Latour, B. (1986). Visualization and cognition: Thinking with eyes and hands. In H. Kuklick & E. Long (Eds.), Knowledge and society: Studies in the sociology of culture past and present, Vol. 6 (pp. 1-40). Greenwich, CT: JAI Press.