
An NSDL Retrospective: The Case of the Instructional Architect by Mimi Recker

The IA as an NSDL Service

In terms of fostering collaboration, a critical venue over the years has been the NSDL Annual meetings. The project particularly benefited from participating in and attending sessions, as well as from spontaneous ‘hallway’ conversations.

During this period, project PIs were integrally involved in the NSDL’s nascent Evaluation and Education Impact Standing Committee (EEISC). In particular, the exchanges at NSDL Annual meetings helped clarify our thinking. Most recently, the long-term sustainability and utility of these NSDL committees have been questioned, as their volunteer nature competes with more urgent project demands.

Evaluation Strategies

Participation in this committee spurred two important evaluation activities. The first was the development of a logic model for our evaluation program, which made explicit the current conditions and assumptions that justify the grant, the program objectives, the program activities intended to achieve those objectives, and the desired outcomes of the program. Also included in the logic model are research and evaluation milestones and objectives that measure our progress and results (see Figure X for an early example).

The second development was the implementation of user tracking and analysis (called ‘webmetrics’) with greater sophistication than when the IA began operation. The past year (August 2007 through July 2008) has seen a remarkable increase in our ability to track user behavior using web-usage data. These data are the foundation for at least one doctoral dissertation study, by Bart Palmer. The data described next were collected on 17 July 2008.

User Registrations. The IA now has over 3,350 registered users. For the year August 2007 through July 2008, the IA gained 1,027 new registrants (+32%). Broken down into Aug–Dec and Jan–May, we enjoyed 519 (+40%) and 508 (+26%) new registrants, respectively.

Project Creation. Over 5,850 projects have been created by teachers, with 1,994 (+58%) created this last grant-year. Fall project creations totaled 936 (+23%), and Winter/Spring saw a whopping 1,058 (+113%) creations. This is especially important because, as described below, project and resource use are also increasing dramatically.

Resource Utilization. A project-resource is one that has been collected by a user and then utilized within the context of a project. The number of project-resources added to new projects was 6,620 (+53%), split almost evenly between Fall and Winter/Spring with 3,519 (+40%) and 3,101 (+71%), respectively. Taken as an average, the number of resources per project has historically been around 2; these new numbers show that the ratio has increased to 3.3 resources per project for projects created this year.
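As a quick sanity check, the 3.3 figure follows directly from the counts above; a minimal sketch in Python:

```python
# Resources-per-project ratio for the 2007-2008 grant-year,
# computed from the counts reported above.
new_project_resources = 6620  # project-resources added this grant-year
new_projects = 1994           # projects created this grant-year

ratio = new_project_resources / new_projects
print(f"Resources per new project: {ratio:.1f}")  # -> 3.3, up from ~2
```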

Project Usage. Project views are counted each time a page is refreshed. This is a departure from standard webmetric practice, made because of our particular user base. For example, during in-class mini-lab use of an IA project, several students can reload the same project on the same computer within the industry-standard 30-minute session. This grant-year, 235,560 (+110%) successful project requests were served. Again, this usage was nearly evenly split between the two halves of the US academic year, with 114,428 (+213%) and 121,132 (+61%) requests, respectively. While not all projects are accessed equally, the average number of views per project (across all IA projects) this grant-year was 40. New advances in data collection should enable more detailed analyses that discriminate among different usage patterns and measure project views with greater precision in the future.
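To make the counting difference concrete, here is a minimal sketch (not the IA’s actual analysis code) contrasting the refresh-based count with the industry-standard 30-minute session count, assuming log records have been parsed into (timestamp, client, project) tuples:

```python
from collections import defaultdict
from datetime import timedelta

SESSION_WINDOW = timedelta(minutes=30)  # industry-standard session length

def raw_views(records):
    """IA-style count: every successful project request is a view."""
    counts = defaultdict(int)
    for ts, client, project_id in records:
        counts[project_id] += 1
    return counts

def sessionized_views(records):
    """Standard count: repeat requests from the same client for the
    same project within 30 minutes collapse into a single view."""
    counts = defaultdict(int)
    last_seen = {}  # (client, project_id) -> time of last request
    for ts, client, project_id in sorted(records):
        key = (client, project_id)
        if key not in last_seen or ts - last_seen[key] > SESSION_WINDOW:
            counts[project_id] += 1
        last_seen[key] = ts
    return counts
```

In a mini-lab setting, several students reloading one project from the same machine inflate raw_views but count only once under sessionized_views, which is why the two methods diverge for this user base.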

Resource Usage. This metric tracked the number of times each resource was clicked within IA projects. Unfortunately, spammers learned they could abuse this data-gathering code, and it was removed from production until a fix could be implemented to reduce the probability of spam usage. Because the fix went into production in January, resource clicks can roughly be compared between the last two Winter/Spring periods: 92,126 (-2.54%) for Jan–May 2008. This slight reduction has several possible explanations, which are being investigated.
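The post does not detail the fix itself, so the following is only a plausible sketch of a click-tracking redirect with basic abuse filtering; the referrer check, route, and data structures are illustrative assumptions, not the IA’s actual implementation:

```python
from flask import Flask, abort, redirect, request

app = Flask(__name__)

RESOURCE_URLS = {42: "https://example.org/lesson"}  # stand-in resource table
CLICK_LOG = []  # stand-in for the real click-logging store

@app.route("/go/<int:resource_id>")
def track_click(resource_id):
    url = RESOURCE_URLS.get(resource_id)
    if url is None:
        abort(404)
    # Count only clicks arriving from a project page; bots or spammers
    # hitting the tracker URL directly typically send no such referrer.
    if "instructionalarchitect.org" in (request.referrer or ""):
        CLICK_LOG.append(resource_id)
    return redirect(url)
```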

Note that there are far more project views than resource clicks. This discrepancy also has valid explanations: our bounce rate (the proportion of one-hit visits) is about 40% across all our pages, but higher (60%) for projects, meaning that for many visitors a project view is the only hit they make to our site. This can happen, for example, when a visitor’s search leads them to an IA project that does not contain the specific information they are looking for. It can also be understood in light of the project-view counting method described above, which overestimates true project views. As a result of these webmetric analyses, the quality and design of projects are now discussed more explicitly in our workshops.
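For readers unfamiliar with the term, the bounce rate is simply the share of visits consisting of a single page view; a minimal sketch, assuming each visit is represented as the list of pages viewed:

```python
def bounce_rate(visits):
    """visits: iterable of per-visit page lists, e.g. [["/project/12"], ...]."""
    visits = list(visits)
    if not visits:
        return 0.0
    one_hit = sum(1 for pages in visits if len(pages) == 1)
    return one_hit / len(visits)

# With figures like those above: ~0.40 site-wide, ~0.60 for project pages.
```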

Visits. The additional insight from tools like Google Analytics (http://google.com/analytics) has brought us a wealth of knowledge. For example, we understood that our usage was probably related to weekly and academic cycles in the United States, but we did not know how closely. Figure 5 shows the weekly visits for the grant-year now ending. Notice how the summer, Thanksgiving, Christmas, and Presidents’ Day vacations dramatically lower IA usage. The spike in late March 2008 coincided with several workshops and outreach efforts at major educational conferences where the IA was featured.

Figure 5: Graph of weekly IA visits from 1 Aug 2007 to 17 July 2008—note the correlation of visits with the US academic calendar.
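The weekly aggregation behind Figure 5 is straightforward; a minimal sketch, assuming visit timestamps have already been extracted as datetime.date values (Google Analytics also provides this rollup directly):

```python
from collections import Counter

def weekly_visits(visit_dates):
    """Group visit dates into (ISO year, ISO week) buckets, as in Figure 5."""
    counts = Counter(d.isocalendar()[:2] for d in visit_dates)
    return dict(sorted(counts.items()))

# Dips should appear around late November (Thanksgiving), late December
# (Christmas), and mid-February (Presidents' Day) in US academic traffic.
```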

We have found that analysis of user tracking data can help us better understand our user base and inform our outreach efforts and professional development workshops. This last grant-year has seen a great advance in the use of the IA by teachers and students alike, and the addition of webmetrics observation and analysis has enhanced our ability to perceive these changes.

The work of this grant in reaching out to pre- and in-service teachers has directly contributed to the increase in IA usage, both in the creation and the use of IA projects. In particular, the increase in the number of resources per project indicates that our registered users are beginning to “mash up” content more than ever before as they collect and reuse online educational resources. Continuing to improve the collection and analysis of these and other metrics is an integral part of our research and outreach efforts.


  • Lois McLean March 8, 2009, 4:19 pm

    Mimi describes the problems of the “wheel reinvention” and “tool silos” that are created when projects with similar functionality and audiences develop in parallel rather than in a complementary fashion. Part of this may be due to the fact, which she points out, that technical standards for the NSDL were not in place when earlier projects such as Instructional Architect began creating tools. Are there some specific approaches that the NSDL should consider to avoid similar problems in the future? For example, could the annual RFP be more explicit in referencing what’s already been developed and suggesting directions for combining related efforts?

  • Lois McLean March 8, 2009, 4:20 pm

    Despite the often-expressed value of the NSDL Annual Meeting, Whiteboard Report, and other community activities, projects can still be unaware of very specific resources, tools, or evaluation approaches that could be adapted for their own purposes. Matchmaking efforts seemed more prominent several years ago, with events such as Tool Time. Should the NSDL again take a more active role in such activities, e.g. by sponsoring workshops for non-pathways groups, such as service providers or tool builders?

  • Lois McLean March 8, 2009, 4:20 pm

    Mimi and others have pointed out the critical nature of the NSDL Annual meeting and the value of committees (such as the Evaluation and Impact Standing Committee) in fostering collaboration. In light of the planned shift from committees to work groups, how can collaboration be encouraged and sustained in practical and effective ways?

  • Lois McLean March 8, 2009, 4:21 pm

    Mimi comments that the NSDL continues to be primarily driven by technical concerns, leaving out the voices of the users, especially in the K-12 world. Do you agree? If so, how can the NSDL foster development that acknowledges and acts on the needs of that user base?

  • Kuko Ako March 24, 2009, 11:01 am

    I agree 100% with Mimi’s point (as highlighted by Lois) about the NSDL being driven by technical concerns. While technical elements are an essential component of the NSDL, attention has to shift to the needs of the educators who are accessing the online resources, and to quality versus quantity. In these times of increased accountability and high-stakes testing, we should try not to lead teachers astray by presenting them with tons of content that is only tangentially or topically related to the big ideas they have to teach; otherwise they will promptly leave. So an important question is: what objective measures (preferably based on human rather than machine methods) can we present to users to help them quickly decide whether any given resource is even worth checking out further, that is, likely to be useful to their teaching?