Friday, 8 November 2013

ALRI - 2013 Literacy and Learning Symposium Update

The Adult Literacy Research Institute (ALRI) team attended the annual provincial Literacy and Learning Symposium held at the Deerfoot Inn on October 16, 17, and 18.
ALRI presented current research projects to attendees from Alberta’s literacy communities. Audrey Gardner and Brendan Baines from Literacy Alberta hosted a session on a collaborative project focused on Literacy and Essential Skills (LES).
Sandi Loschnig shared highlights from the Stories from the Field project, followed by small group discussions exploring practitioner experiences in the field. Berniece Gowan and Patricia Pryce presented findings from the WriteForward project.
ALRI participated in the Resource Fair Reception, where our representatives discussed our research projects, including the Alberta Reading Benchmarks, the Alberta Adult Literacy Assessment Framework for Aboriginal Peoples, Read Forward, the Learner Progression Measures, and Nations Learning Together: An Art and Adult Literacy Project.
We would like to thank Literacy Alberta, the Community Learning Network, and the Centre for Family Literacy for giving us the opportunity to share our research and learn from fellow literacy organizations in the community.
See photos of the ALRI team at the Resource Fair Reception below.

Candace Witkowskyj, Project Research Officer, and Audrey Gardner, Coordinator
ALRI project resources and information on display.
Candace Witkowskyj, Project Research Officer, and Samra Admasu, Communication Officer
Berniece Gowan, Project Coordinator

Wednesday, 6 November 2013

Decoding the Diploma Exams: Part I

Part I
I am writing new Physics 30 unit exams based on an analysis of the diploma exam school reports, available from Alberta Education’s Extranet.  This analysis primarily explicates the underlying logic of the diploma exams in terms of how they emphasize elements of a course’s program of studies: some outcomes are frequently tested, others are almost never tested, and most are tested about once per exam.  In this first part of a two-part article, I explore what can be learned from parsing the diploma exam school reports: how to estimate the degree of emphasis of each specific outcome on the diploma exams, and how to estimate the best length of a unit exam.  Although I focus on the Physics 30 school reports, these methods apply equally well to the school reports for any math or science course.

What are the Diploma Exam School Reports and Why Analyze Them?
The diploma exam school reports are published on Extranet a few weeks after each diploma exam is delivered.  Within one year, they are removed from Extranet with no way to retrieve them, so I recommend downloading them every February and July.  School reports compare the performance of “Bow Valley College-Main Campus” with that of the province, providing statistics such as “Percentage of Students Who Achieved Standards on Their Final Course Mark” and “Percentage Distribution of A, B, C, and F.”  The school reports also provide valuable item-level data, the main focus of this article.  By contrast, the diploma exam information bulletins, published on the Alberta Education web site, provide information at the unit level, but not at the general outcome or specific outcome levels.  The item-level data from the school reports thus provides a more granular view of the diploma exams than the information bulletins, but achieving that granularity requires some work on our part.  See below for an annotated sample of the June 2011 school report for Physics 30:

[Image: annotated sample of the June 2011 Physics 30 school report]


Item-level data from the school reports is useful when designing unit and equivalency exams.  The information bulletins indicate that Unit B, “Forces and Fields,” comprises 25-35% of the Physics 30 diploma exam.  However, Unit B is composed of three general outcomes (GOs): B1, B2 and B3.  Each general outcome is itself composed of roughly 15 specific outcomes (SOs), so Unit B comprises about 45 specific outcomes.  What level of emphasis should be given to specific outcome B2.2k (“compare forces and fields”) on a unit or equivalency final exam?  An analysis of six Physics 30 school reports reveals that this outcome was never tested on those six exams.  This is an important design consideration when developing unit exams.  Perhaps Alberta Education rarely, if ever, tests outcome B2.2k because it is considered much less important than other outcomes.  Or perhaps items that validly evaluate B2.2k are very difficult to write.

Both of these potential reasons give me pause, since actions speak louder than words: latent values can be inferred from the school reports (actions) that aren’t apparent in the information bulletins (words).  If, by its choice of what to leave untested, Alberta Education deems B2.2k of low importance, then why would I test it?  If B2.2k is difficult to evaluate validly, then why would I try to evaluate it, given my very limited resources?  If we know which outcomes Alberta Education considers most important, or most validly testable, then we know which outcomes our unit exams, assignments, labs, course outlines, and course blueprints should typically address most deeply.  We also know which aspects of a course to focus on in diploma prep sessions and in practice diploma exams.  We want to give our students the best chance of success by helping them allocate their very limited study time to the outcomes most likely to appear on the diploma exam they must write.  Our time is finite.  Their time is finite.  Economy recommends we focus on what is tested most often.

That being said, it is also good to keep in mind that the school reports, the information bulletins, the performance standards (such as the standard of excellence or the acceptable standard), and even the diploma exams themselves are NOT the program of studies.  Only the program of studies is the program of studies.  We need to address every outcome in some manner, even if Alberta Education is unable to test many of them on its machine-scored exams.  Furthermore, just because a certain outcome wasn’t tested on six Physics 30 exams doesn’t mean it won’t be tested on the seventh.  A deconstruction of the school reports provides general guidelines: we still need to use our individual professional judgment when developing instructional and assessment programs.

How to Estimate the Degree of Emphasis of Each Specific Outcome on Diploma Exams to Aid in the Design of Unit Exams and Equivalency Final Exams
The degree of emphasis of each specific outcome (SO) is determined by first tallying the number of times each SO was tested on each of a set of six Physics 30 diploma exams.  Each tally is then divided by the number of items devoted to the unit on that particular diploma exam, not by the total number of items in the entire diploma exam, because the number of items devoted to a unit varies significantly from exam to exam.  If we divided by the number of items on the entire diploma exam, a misrepresentation could result: for example, the January 2011 exam devoted 20 items to Unit B, whereas the June 2012 exam devoted just 12.  We want to know the degree of emphasis of an SO per unit, not per diploma exam, to aid in the development of unit exam blueprints.  Finally, the percentage of items devoted to a particular SO is averaged across the six Physics 30 diploma exams to produce a more stable statistic.  See the diagram below for how the percentage emphasis of SO B2.8k, “describe, quantitatively, the motion of an electric charge in a uniform electric field,” is estimated.

[Diagram: estimating the percentage emphasis of SO B2.8k across six diploma exams]


You can view this Google Spreadsheet at Part I: SCN3797_u2_schoolreportanalysis.  Although you cannot edit this spreadsheet, you can choose File > Make a copy to create your own editable copy, which you can use to analyze another unit of Physics 30, or another course.
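If you prefer a script to a spreadsheet, the same calculation can be sketched in a few lines of Python.  This is a minimal illustration, not the spreadsheet itself: only the January 2011 (20 items) and June 2012 (12 items) Unit B counts come from the school reports discussed above; the B2.8k tallies and the remaining counts are hypothetical placeholders.

```python
# Minimal sketch of the per-unit emphasis calculation described above.
# Only the Jan 2011 (20) and Jun 2012 (12) Unit B item counts come from the
# article; the B2.8k tallies and the other counts are hypothetical.

# (items testing SO B2.8k on this exam, items devoted to Unit B on this exam)
exams = {
    "Jan 2011": (2, 20),
    "Jun 2011": (2, 17),
    "Jan 2012": (1, 14),
    "Jun 2012": (1, 12),
    "Jan 2013": (2, 16),
    "Jun 2013": (2, 15),
}

# Divide each tally by the unit's item count on that exam, not by the 50 items
# on the whole diploma exam, because the unit's share varies from exam to exam.
per_exam = {name: tested / unit_items for name, (tested, unit_items) in exams.items()}

# Average across the six exams to produce a more stable estimate.
average_emphasis = sum(per_exam.values()) / len(per_exam)
print(f"Estimated emphasis of B2.8k: {average_emphasis:.1%}")
```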

Physics 30 Unit Exams should be composed of about 25 selected-response items (see the next section for a method of estimating the best length of a unit exam).  Based on the spreadsheet’s analysis, a Physics 30 Unit B Exam should devote approximately (10.3%)(25) = 2.6 items to specific outcome B2.8k.  Of course, some forms of the Unit B Exam would have 2 items and others 3, but most often they would have 3.

Physics 30 Equivalency Final Exams are composed of 50 selected-response items, and 30% of these, or 15 items, are allocated to Unit B.  Based on the spreadsheet’s analysis, a Physics 30 Equivalency Final Exam should devote approximately (10.3%)(15) = 1.5 items to specific outcome B2.8k.
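As a quick check of the arithmetic in the last two paragraphs, here is a short sketch; the 10.3% emphasis comes from the spreadsheet analysis, and the item counts are those given above:

```python
# Applying the estimated emphasis of B2.8k to the two exam formats above.
emphasis = 0.103  # average emphasis of B2.8k from the school report analysis

print(emphasis * 25)  # 2.575 -> about 2.6 items on a 25-item Unit B Exam
print(emphasis * 15)  # 1.545 -> about 1.5 items on the 15 Unit B items of a
                      #          50-item Equivalency Final Exam (30% of 50)
```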

The spreadsheet provides guidelines for the degrees of emphasis of the specific outcomes as they appear on diploma exams.  This is just an observation of what actually happened, not a value judgment (yet).  Essentially, the degree of emphasis of each SO is derived by parsing a set of diploma exams, and it could then become the “Relative Importance” column on our exam blueprints; this is where a value judgment of sorts occurs.  As mentioned earlier, we must still employ our professional judgment when interpreting and applying the program of studies, and we should not adhere to this statistical analysis too strictly.  However, it is helpful to have a general outline of how the numerous SOs are emphasized on diploma exams when developing unit exams, equivalency final exams, and course blueprints.

How to Estimate the Length of a Unit Exam
Physics 30 diploma exams are composed of 50 items, and students have 150 minutes to complete them.  Therefore, students, on average, need to answer at least one item every three minutes if they are to finish the entire diploma exam.  If students get 75 minutes to complete a unit exam in class, then the unit exam should be no longer than approximately (75 minutes) x (1 item / 3 minutes) = 25 items.  From this specific example, a general formula can be developed:

$$\text{unit exam items} = \text{unit exam minutes} \times \frac{\text{diploma exam items}}{\text{diploma exam minutes}}$$
Next, the information bulletins specify that each diploma exam contains 14 numerical response (NR) items and 36 multiple choice (MC) items.  The number of NR items on a Physics 30 unit exam is approximately: (25 items) x (14 NR / 50 total) = 7 NR items.  The number of MC items on a Physics 30 unit exam is approximately: (25 items) x (36 MC / 50 total) = 18 MC items.  From this specific example, general formulas can be developed:

$$\text{NR items on unit exam} = \text{unit exam items} \times \frac{\text{NR items on diploma exam}}{\text{total diploma exam items}}$$

and

$$\text{MC items on unit exam} = \text{unit exam items} \times \frac{\text{MC items on diploma exam}}{\text{total diploma exam items}}$$
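Putting the three formulas together, here is a minimal Python sketch; the function and variable names are mine, and the numbers are the ones worked through above:

```python
# Minimal sketch combining the three formulas above.

def unit_exam_items(unit_minutes, diploma_items=50, diploma_minutes=150):
    """Scale the diploma exam's pace (items per minute) to the unit exam's time."""
    return unit_minutes * diploma_items / diploma_minutes

total = unit_exam_items(75)      # a 75-minute class period -> 25.0 items
nr_items = total * 14 / 50       # 14 NR items per 50-item diploma exam -> 7.0
mc_items = total * 36 / 50       # 36 MC items per 50-item diploma exam -> 18.0
print(total, nr_items, mc_items) # 25.0 7.0 18.0
```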
Part II
Stay tuned for Part II!  Please email, call, or stop by if you have any questions or comments, or if you would like to analyze the Physics 30 school reports for another unit, and then write a unit exam based on that analysis.

Regards,
Michael Gaschnitz
My previous blog postings

Science 10 Pilot Update

You may already know about the department’s new direction towards common, official exams and the use of new answer sheets processed with FormReturn. SCN1270 was chosen to pilot the new answer sheets this fall.


What does the Science 10 Pilot hope to achieve?

The pilot hopes to demonstrate the benefits of common exams and the feasibility of using FormReturn answer sheets. Another goal is to demonstrate how item and exam analysis can improve the quality of exams. We also hope to gather feedback from instructors and students, via surveys, about the new exam format and the answer sheets.

How do the new answer sheets work?

The FormReturn software allows us to quickly create custom answer sheets. These sheets are printed on 8.5 × 11 inch paper and copied. Once students have filled them in, the answer sheets can be batch scanned, and the scanned images are then processed electronically by FormReturn. Each exam has an associated answer key that is used to score student responses.

Why use the electronic answer sheets?

Multiple answer sheets can be marked at once, accurately and very quickly. In addition to total scores, all the responses are captured in a database. In the graphs below, the totals for each multiple choice response are shown for each of four questions. Capturing all the responses allows for detailed item analysis, such as examining the validity of distractors.

[Graphs: totals for each multiple choice response on four sample questions]
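For the curious, the kind of item analysis this makes possible can be sketched in a few lines of Python. This is a generic illustration, not FormReturn’s own interface; the responses and the answer key below are hypothetical.

```python
from collections import Counter

# Hypothetical scanned responses: one dict per student, item number -> choice.
responses = [
    {1: "A", 2: "C", 3: "B", 4: "D"},
    {1: "A", 2: "B", 3: "B", 4: "D"},
    {1: "C", 2: "C", 3: "B", 4: "A"},
]
answer_key = {1: "A", 2: "C", 3: "B", 4: "D"}  # hypothetical key

# Score each sheet against the key.
for sheet in responses:
    score = sum(sheet[item] == correct for item, correct in answer_key.items())
    print(f"score: {score}/{len(answer_key)}")

# Tally the responses per item: this shows which distractors actually attract
# students and which go unchosen, informing distractor validity.
for item, correct in answer_key.items():
    counts = Counter(sheet[item] for sheet in responses)
    print(f"item {item}: {dict(counts)} (key = {correct})")
```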
How is the pilot going so far?

To date, we have processed a total of 70 answer sheets. While most students are filling out the answer sheets correctly, some need additional help. From the data collected so far, some potential improvements to the exams have already been identified.

Thank you to the instructors and students who are participating in the pilot. 


Want to learn how FormReturn works?


FormReturn demos are planned for November 13 and November 20. Location TBD.