Technical Paper


The Landmark College Institute for Research and Training (LCIRT) received a grant from the National Science Foundation (NSF) Advanced Technological Education (ATE) program in July 2009 to partner with three community colleges to improve instruction and support for students with learning disabilities and attention disorders (LD) in technological education programs. The project entails conducting a series of needs assessments of community college technology programs and regional technology employers in order to develop a hybrid (in-person and online) professional development program, to be pilot tested with instructors who teach required courses for technology programs at community colleges. The primary goal of our project is to increase the number of students with LD who pursue and graduate from technological education programs and who then either continue their academic pursuit of four-year technological degrees or gain employment as successful members of the technology workforce. We embrace a universal design philosophy: we believe that our professional development program will enhance access to education not only for students with LD, but also for other students who face barriers to education, such as under-prepared students, English language learners, and others considered at-risk. Our two-year project is intended as a demonstration and proof of concept for increasing the number and diversity of struggling students who succeed in technological programs at community colleges.

As part of our commitment to the project goals, we elected to write a technical paper that documents the process of developing the hybrid course. The purpose of this paper is to create a record of activities for our own use and to share our experiences with others who engage in similar projects. It enhances our ability to reflect on what worked well and on “lessons learned” from difficulties and unexpected obstacles.

From the start, our course development team committed to two principles:

  1. Work collaboratively as a design and review team
  2. Make use of resources from other ATE projects to the fullest extent possible

Effective collaboration was key to our process as well as a desirable goal. It allowed us to accomplish multiple tasks on a short timeline, capitalize on the diverse backgrounds and expertise of the project team and our community college collaborators, and spread the workload equitably in light of competing responsibilities and projects. Moreover, it provided an opportunity for us to practice the “soft skill” of teamwork that was a recurring theme in our course content.

We also wanted to capitalize on resources from other NSF ATE projects as a way to broaden their impact and build on the content and processes thoughtfully developed through other initiatives. To begin with, we adapted the WCIT Curriculum Development Toolkit developed by the Washington Center for Information Technology. This served as the framework for our own development process, which we called the ATE Project Materials Development Process (ATE-MDP), and also provided many useful resources. Several team members participated in webinars on outcomes and assessment delivered by the Evalu-Ate Evaluation Resource Center for Advanced Technological Education. One of our team members participated in the NSF ATE Scenario Based Learning (SBL) in Technical Education project, a hybrid course offered by the Experiential Learning Center of the Foothill-De Anza Community College District. We enthusiastically incorporated SBL design and content into our course and took advantage of many resources they identified.

We also incorporated content developed by LCIRT in earlier NSF-funded projects, notably resources from our NSF RDE “UD Algebra” (formerly “Universal Design of College Algebra”) project and our NSF RDE “Biology Success!” project, both of which promoted universal redesign of courses. Finally, we used online learning materials developed under a U.S. Department of Education (ED) grant from the Demonstration Projects to Ensure Students with Disabilities Receive a Quality Higher Education program, known as “Demo Disabilities” (formerly “A Needs-Based Best Practices Professional Development Program for Teaching Students with Learning Disabilities in the Community College Setting”).


We appointed a cross-disciplinary team to collaborate on course development. A leader took responsibility for calling meetings, taking meeting notes, distributing the notes via email and posting the notes to an intranet work site (using the Moodle learning environment), while content development was shared equitably among LCIRT’s three Lead Education Specialists. The college’s Director of Web Strategy collaborated with the team in designing the course website. The team set a schedule of bi-weekly meetings from February to June 2010. At each meeting we reviewed completed action items and assigned new ones, using our project’s Table of NSF ATE Activities, Timelines, Deliverables as a checklist, and the ATE-MDP to guide our work.

Content Development

We used the ATE-MDP to guide development of course formats and content. The original WCIT Curriculum Development Toolkit was created to help community college instructors align Information Technology courses with workforce expectations and industry standards. We modified the WCIT process for our content, which is a professional development course for community college technology education (tech ed) instructors. The aim of our course is to improve instructors’ ability to successfully support students with LD. Like the original WCIT Toolkit, our ATE-MDP process consists of the following 8 steps:

  1. Understand characteristics of standards-based curriculum
  2. Perform a market analysis of industry expectations
  3. Map existing curriculum
  4. Perform a gap analysis that links mapping and market analysis
  5. Develop modules, assignments, and assessments to address gaps
  6. Plan online environment
  7. Assess impact
  8. Perform ongoing review

The activities we performed for each step of the ATE-MDP process are described in more detail in the sections below.

1. Understand characteristics of standards-based curriculum

Although our course is designed for instructors of standards-based courses, it is not in itself standards-based. We were not able to identify standards for professional development courses at the post-secondary level. We did identify national standards for K-12 professional development promoted by the National Staff Development Council, which guided our course design. These standards include:

  • Organize adults into learning communities
  • Provide educators with the knowledge and skills to collaborate
  • Provide resources to support adult learning and collaboration
  • Use student data to determine adult learning priorities, monitor progress, and help sustain improvement
  • Use multiple sources of information to guide improvement and demonstrate its impact
  • Prepare educators to apply research to decision making
  • Use learning strategies appropriate to the intended goal
  • Apply knowledge about human learning and change

More germane were the principles of Universal Design which infuse all of the work done by the LCIRT team. Universal Design is a framework which anticipates diversity among learners in designing and delivering educational content. It is especially useful when working with under-served and under-prepared students because it guides instructors to make course content more usable and accessible for a wide range of learners, without watering down the curriculum or lowering academic standards. We used the 9 Principles of Universal Design as articulated by the University of Connecticut to guide our course design and directed our course participants to apply Universal Design principles to their course re-design tasks.

Principles of Universal Design, adapted from University of Connecticut FacultyWare website

Principle 1: Equitable Use – Instruction is designed to be useful to and accessible by people with diverse abilities. Provide the same means of use for all students; identical whenever possible, equivalent when not.

Principle 2: Flexibility in Use – Instruction is designed to accommodate a wide range of individual abilities. Provide choice in methods of use.

Principle 3: Simple and Intuitive – Instruction is designed in a straightforward and predictable manner, regardless of the student’s experience, knowledge, language skills, or current concentration level. Eliminate unnecessary complexity.

Principle 4: Perceptible Information – Instruction is designed so that necessary information is communicated effectively to the student, regardless of ambient conditions or the student’s sensory abilities.

Principle 5: Tolerance for Error – Instruction anticipates variation in individual student learning pace and prerequisite skills.

Principle 6: Low Physical Effort – Instruction is designed to minimize nonessential physical effort in order to allow maximum attention to learning.
Note: This principle does not apply when physical effort is integral to essential requirements of a course.

Principle 7: Size and Space for Approach and Use – Instruction is designed with consideration for appropriate size and space for approach, reach, manipulations, and use regardless of a student’s body size, posture, mobility, and communication needs.

Principle 8: A Community of Learners – The instructional environment promotes interaction and communication among students and between students and faculty.

Principle 9: Instructional Climate – Instruction is designed to be welcoming and inclusive. High expectations are espoused for all students.

2. Perform a market analysis of industry expectations

We conducted a two-pronged Needs Assessment with each of our community college partners by interviewing college and industry personnel on-site or by telephone. Between February and March 2010 we sent two-person teams to each location to interview relevant faculty, staff, students, and local employers about tech ed courses, student outcomes, and industry expectations. Because it was difficult to schedule interviews with local industry members while on-site, some interviews had to be conducted at a later date by telephone. The Needs Assessment teams conducted extensive structured interviews, analyzed the data, and identified common themes across the three community college settings: a small, rural college; a large, urban, multi-campus system; and a large, suburban, multi-campus system with a significant minority student population.

This process identified six themes that underlie the struggles experienced by students with LD in tech ed courses:

  • College advising programs don’t address tech ed programs and their students
  • Students and faculty lack awareness of disability laws, rights and responsibilities
  • Faculty lack an approach for supporting students with diverse learning profiles
  • Faculty lack awareness of specific resources available to support students with diverse learning profiles
  • Students struggle with the “hard” skills of reading, writing, math
  • Students struggle with “soft” skills demanded by industry, including teamwork, critical thinking/problem-solving, oral and written communication, professionalism, and work ethic

3. Map existing curriculum

Our existing curriculum primarily incorporated previously created professional development materials. We analyzed our existing resources for content that could address the themes identified. Earlier projects funded by NSF included our NSF RDE “UD Algebra” website and our NSF RDE “Biology Success!” manual. We also analyzed our online ED-funded “Demo Disabilities” modules, and other internally-developed professional development material for relevance. We created a list of potentially re-usable resources, keyed to the themes.

4. Perform a gap analysis that links mapping and market analysis

Next we compared our content to the themes, considering the specific community college programs we targeted, to identify matches and gaps. We determined that our existing content provided rich resources for addressing most of the themes, but we noted some gaps, particularly in content to address soft skills. We decided to incorporate SBL both to address aspects of this gap and because it is being widely adopted by tech ed programs to improve the workforce skills of graduating students. We created a Matrix of Hard and Soft Skills to model for our instructors how to integrate both kinds of skills as part of their course re-design.

5. Develop modules, assignments, and assessments to address gaps

Our team met bi-weekly from February through June 2010 to assign tasks and monitor progress. The team leader took notes, which were distributed by email and posted on our Moodle work space. Each team member kept an Activity Log to document work flow and a Time/Effort sheet to document time on task. The latter was submitted monthly to the project administrator.

Once we decided on an SBL approach, we had a framework for our hybrid course based on the specific tasks instructors would perform to carry out their course redesign.

In-Person Course Introduction

  • Course Overview
  • Introduction to learning disabilities
  • Universal Design
  • Hands-on introduction to online content
  • Task One: Identify primary and secondary learning outcomes for your course

Online Modules

  • Task Two: Evaluate accessibility/usability of course content for students with different learning profiles
  • Task Three: Re-design assessments using Universal Design
  • Task Four: Design grading rubric using Universal Design
  • Task Five: Re-design assignment using Universal Design and collaborative learning
  • Task Six: Re-design syllabus using Universal Design

Each module provides an overview of its purpose; a list, instructions, and timeline for deliverables; access to supportive resources; and reflection questions to post to a discussion board. Each task will be reviewed by a peer as well as shared with the online community of learners.

In July and August the team members worked individually to develop the modules, revising them through a process of peer review. Based on feedback from partners regarding our Instructor Expectations, we scaled back the online content, limiting it to four modules of two weeks’ duration each. The final choice of modules was:

  • Task One: Identify barriers to success faced by students with learning disabilities
  • Task Two: Design a rubric for a course assignment or assessment
  • Task Three: Design a collaborative-learning activity
  • Task Four: Re-design a lesson plan, using Universal Design and collaborative learning

The in-person course introduction was delivered to our partner schools in September and October 2010, near the start of each school’s semester. Each school enrolled a minimum of two technology instructors in the course.

6. Plan online environment

The team worked with Landmark’s web manager from the start in order to ensure a universally designed website that exemplifies accessibility and usability. Our web manager contributed her extensive expertise and experience in conducting usability reviews on websites and software.

Early in our process we explored the Moodle teacher documentation for features of Moodle, our course platform, that offer interactive options such as discussion boards and branching-logic lessons. We also looked at other course websites and opted for the clean, recursive design of the SBL course as a model that incorporated many desirable features and was well suited to the format of our Task-Based Modules.

As our team created the content for each online module we submitted it to our web manager to format in the Moodle environment. The web manager also evaluated materials for accessibility and usability in our Landmark Universal Design and Usability Lab. She kept a running log of results and revised content accordingly.

The team also coordinated delivery of the online modules, ensuring prompt responses to participant queries, monitoring the discussion boards, and establishing protocols for responding to and assessing participant deliverables.

7. Assess impact

Our project included a significant evaluation component designed to assess its impact on participating faculty and on the students in their courses. Faculty completed pre- and post-implementation surveys of self-efficacy and a knowledge inventory pertaining to course content. They participated in phone interviews early, mid-, and late semester, and their courses were observed twice using a behavioral checklist.

Students of participating faculty completed an initial demographic survey and pre- and post-course self-efficacy surveys. Additionally, two students of each faculty member were selected for phone interviews conducted early, mid-, and late semester.

Survey, interview, and observation protocols were developed by the project team in consultation with our external evaluator. NSF encouraged us to use items from previously validated instruments, so we selected the most relevant items to compose our own instruments, keeping in mind constraints on the time and effort required to complete them.

8. Perform ongoing review

A couple of the initial faculty participants dropped out early in the course due to other work commitments. The remaining faculty were slow to participate in the online course. Faculty experienced some difficulties locating and accessing the online Discussion Board, and most of them also missed the deadlines for submitting tasks. Because the Discussion Board questions were contingent upon completing the deliverable tasks, no discussion got underway until a number of faculty had completed their assignments.

By mid-semester only 2 of the 7 remaining faculty had submitted assignments and only 1 had contributed to the Discussion Board. LCIRT staff contacted site coordinators to alert them to the low participation rate and to brainstorm strategies for increasing faculty participation. Site coordinators stepped up their efforts to contact instructors by email, by phone, and in person.

By the final week of the semester 2 of the faculty still had not accessed the online course; 1 had participated in the Discussion Board but not submitted assignments; 1 had submitted assignments but not posted to the Discussion Board; 1 had submitted assignments that deviated from the required tasks; and the remaining 2 had submitted assignments and posted discussion questions according to course protocol. LCIRT staff directly contacted the non-responders by phone and in person.

Some of the course assignments were submitted after the semester ended, which meant learning from the course would not be implemented during the semester for which we collected data. As one of our instructors noted, “Most of what I have learned from this study will be implemented in the Spring of 2011.  I will need a class enrollment of at least 8 people to get a really good feel for Task #4.”

[Step 8 will be completed after the course concludes in Fall 2010.]

Lessons Learned

  • It was difficult to schedule interviews with relevant local employers, so follow-up interviews had to be conducted long after the site visits.
  • Course development was very labor intensive: grant funding covered only a small fraction of actual work time.
  • Grant funding covered only a portion of the travel costs incurred for the Needs Assessment.
  • Moodle has certain limitations, which were trade-offs for its generally high levels of accessibility and low cost. For instance, we initially had trouble with our discussion board because we used the default News Forum, which doesn’t allow participants to post and reply to postings; only designated faculty can reply to items. Our tech expert was able to solve these problems quickly.
  • As part of the evaluation process we were encouraged to select items from previously validated surveys, but no validated instruments exactly matched our participants, their teaching contexts, and our research needs. This proved challenging for survey development, and the use of previously-validated questions led to confusion during interviews and surveys.
  • Since Discussion Board questions depended on completion of the deliverable tasks, no discussion took place until a number of faculty had completed their tasks. In the future, discussion questions should be independent of task completion, to support livelier discussion earlier in the course.
  • Faculty participation in the online course was low for the first few weeks. This may be because the face-to-face segment of the course didn’t start right at the beginning of the semester; in one instance, it was delayed for almost a month. It may also be because the tasks were too time consuming. It may be better to minimize the tasks and focus more on Discussion Board participation.
  • Our final Task (Task 5) wasn’t posted by our web manager until late in the semester. We decided to jettison this Task because of its lateness and the low completion rate of earlier tasks.
  • Participation in the Discussion Board has been very spotty. We may have had too many questions associated with each task. We should consider re-writing the questions so that some can be answered both before and after completing the tasks, rather than only after. We should also encourage dialogue among the different schools by asking questions that explicitly compare programs and courses across the schools.
  • Most of what instructors learned probably won’t be implemented until next semester, Spring 2011. We don’t have the resources to collect data (especially to observe classes and interview students and faculty) at that time.