Nina L. Roofe
University of Central Arkansas


Contact: nroofe@uca.edu

 

Abstract

Summer enrichment programs for children are increasingly present on college and university campuses. Formal evaluations of these programs identify areas for program improvement, document outcomes, and justify program continuance. The University of Central Arkansas Super Kids program is for students entering first, second, and third grades. It utilizes a science-based curriculum and incorporates confidence- and character-building activities. This program evaluation plan, based on causative theory, may serve as a guide for similar programs (Fitzpatrick, Sanders, & Worthen, 2004).

Program Description

Super Kids is a one-week, science-based summer program offered at the University of Central Arkansas (UCA) to children entering first, second, and third grades. This program addresses both the cognitive and affective domains of learning. Appendix A provides an overview of the science content offered each summer (Super Kids Brochure, 2006, 2007, 2008). Students take three courses, one each summer, during their three years of program eligibility. All classes are held at the UCA Child Study Center.

This program is similar to other summer youth enrichment programs on college or university campuses. Information for these programs was retrieved from selected websites during July 2008. An evaluation component was not available for any of these programs. Appendix B provides a summary of these programs, illustrating their similarities and differences.

Super Kids provides a service for children and families in this community during the summer months. The Department of Early Childhood and Special Education in the College of Education is the coordinating department. The program provides a model for exemplary and prospective teachers involved in UCA’s teacher education program. Each of the UCA Super Kids faculty members is experienced with children and with his or her assigned subject area. Dr. Mark Cooper is the founder of Super Kids. He directed this program for ten years in Texas and has now directed it for another ten years at the University of Central Arkansas. “UCA Super Kids is one thing I can do to enhance the lives of young children within our community” (M. Cooper, personal communication, June 8, 2008). A program evaluation will identify ways to improve the services currently provided.

Healthy friendships are one domain of happiness contributing to success in life (Lyubomirsky, King, & Diener, 2005). During the lunch hour, 20-minute discussions focus on developing confidence regardless of present performance and on developing self-success skills. Children also learn about compassion, self-control, responsibility, friendliness, and problem solving.

A motto of UCA Super Kids is, “Once a Super Kid, Always a Super Kid!” Children learn that they do not become super kids, but that they are born with individual super abilities. The term, Super Kid, represents the belief that all children are super and thus should have an opportunity to attend the program. Scholarships are available for students who cannot afford the registration fee.

The UCA Super Kids program has spun off two additional programs for elementary and secondary students on the UCA campus. The UCA Challenge Program is for children entering 4th and 5th grades and focuses on biology and chemistry. The Math and Science Investigator Program is for students entering 9th, 10th, and 11th grades and focuses on math, physics, and biology.

Program goals for Super Kids include (a) teaching science, (b) facilitating behavior development that builds confidence and character, and (c) recruiting future UCA college students. The stakeholders include the Super Kids students, their parents, the Super Kids teachers, Conway Public Schools, St. Joseph School, Conway Christian School, the University of Central Arkansas, the College of Education, and the Early Childhood Education Department. Currently there is no formal evaluation component for Super Kids.

A logic model delineates the inputs, activities, outputs, and outcomes of the program (McLaughlin & Jordan, 1998; Kettner, Moroney, & Martin, 1999). The program description (Table 1) was written with input from Dr. Cooper. It will be included in the interim report for review by stakeholders to ensure accuracy of program identification.

Table 1. Logic Model: UCA Super Kids Program

Inputs

Activities

Outputs

Outcomes

Initial

Intermediate

Longer-term

  • Money (initial seed and program fees)
  • Staff (Dr. Cooper, teachers, staff)
  • Staff time
  • Facility—Child Study Center
  • Office supplies and equipment (paper, copier, etc.)
  • Students
  • Parents
  • Teaching materials
  • Guest speakers
  • Science lessons
  • Confidence and character building lessons
  • Singing
  • Parent survey
  • Student Pre-test
  • Student Post-test
  • Teacher and staff interviews
  • Picture-taking
  • # science lessons taught
  • # confidence and character lessons taught
  • # education materials distributed
  • # hours of service delivered
  • # students served
  • # parents served
  • # newspaper articles printed
  • # information brochures distributed
  • Closing ceremony
  • Pictures
  • Participants’ understanding of science improves.
  • Participants master age-appropriate actions.
  • Quality summer activity for children.
  • Teacher preparation candidates’ opportunity to observe and participate.
  • MSE candidates’ opportunity to conduct practicum.
  • Service to the community by UCA.
  • Participants develop a love of learning.
  • Participants develop in the affective domain regarding friendship and self-confidence
  • Pre-teachers apply teaching skills
  • MSE candidates develop increased teaching proficiency
  • Participants choose to attend UCA Challenge (4th-5th grade) and the Math-Science Investigator Program (9th-11th grade).
  • Participants choose UCA for college.


Purpose and Scope of Evaluation

The purpose of this evaluation is to examine the value of the UCA Super Kids Program as evidenced by the knowledge and behavior skill development of the students in the program. The reasons for this evaluation are threefold: (a) to assist in future program planning and provide information to stakeholders, (b) to judge the overall value, worth, and merit of the program for participants, and (c) to determine whether, and to what extent, program goals and objectives are being met for the stakeholders. This evaluation is formative, as program improvement is its primary goal (Fitzpatrick et al., 2004; McLaughlin, 2003).

The scope of the evaluation is the UCA Super Kids Program. The evaluation will focus on three desired outcomes, which relate to the three program goals: (a) assessment of change in science knowledge gained during program participation, (b) assessment of positive behavior development and change due to program participation, and (c) examination of participation in the Super Kids Program and subsequent enrollment at the University of Central Arkansas. An objectives-oriented approach is used to determine if and to what degree program goals and objectives are achieved (Fitzpatrick et al., 2004).

The primary target audience includes the Super Kids graduates and their parents, the local public and private elementary schools, UCA, the Program Director (Dr. Mark Cooper), and the Super Kids staff and teachers. The secondary target audience includes the Conway community and Faulkner County as the individuals and families involved in Super Kids live and function in this area.

Efficient program-level evaluation incorporates course-embedded assessments that link learning activities to program outcomes (Huba & Freed, 2000). For example, the student pre- and post-test evaluation is a course-embedded assessment that links the science content taught in the program to the program outcome of improvement in science knowledge. The parent survey is another program assessment tool, linking the character and friendship building components of the program to the program outcome of mastery of age-appropriate actions. The following guiding questions emerged from the identified program goals: (a) Does program participation affect the science knowledge of the participants and how? (b) Does program participation affect the behavior development of the participants and how? (c) Do Super Kids graduates attend UCA?

The evaluator is familiar with, and intends to uphold, the guiding principles for evaluators: systematic inquiry, competence, integrity/honesty, respect for people, and responsibilities for general and public welfare (American Evaluation Association, 2004). This evaluation plan outlines the technical standards and data-based inquiry the evaluator will use. The evaluator will seek assistance from others to review statistics or other areas outside her scope of practice or area of comfort.

Integrity and honesty are crucial to the reputation of any evaluator, and the evaluator possesses these characteristics along with basic knowledge of program evaluation. Any potential conflict of interest must be identified prior to the start of the program evaluation along with the strategy the program evaluator will use to address the conflict of interest.

Another potential problem is resistance from the Super Kids staff to having the program evaluated. This program has never been formally evaluated, and that process may be uncomfortable for those directly involved with the program. To maintain political viability, it is vital to keep communication open with Dr. Cooper and the Super Kids staff. Initial planning meetings will be conducted to allow stakeholders the opportunity to discuss their needs. The identified needs will be ranked to determine the value of each in light of the budget, time, and personnel available for program evaluation. Identifying and agreeing on the value of the information obtained in the evaluation, and on how that information will be used or analyzed to meet the needs of the stakeholders, will add value to the evaluation results. Stakeholders will be informed that this evaluation is intended to provide the evaluator with experience in program evaluation and to provide them with formative information for program improvement, not summative information for program continuance.

Institutional Review Board (IRB) approval ensures protection of the rights of the program participants, and this approval will be obtained before performing the program evaluation. The management plan addresses dissemination of information to all interested parties. The identified boundary of the program evaluation is the UCA Super Kids Program objectives relative to program outcomes. The evaluator has some resources to assist in this evaluation, including graduate assistants, photocopier access, office supplies, and a computer. Limitations include time constraints, as the program is in session only during the month of June each year, and financial support, as no part of the FACS budget is allocated for this project. Finding respondents in a timely manner and locating and training support staff are other limitations.

Design

The data collection combines qualitative and quantitative methods, including teacher and staff interviews, student pre-tests/post-tests, and parent surveys. A review of the Mental Measurements Yearbook (2001) did not reveal instruments usable with this age group to assess science knowledge of the topics taught in Super Kids. Attention was given to item writing and format development in preparing the instruments (Cox, 1996). Appendices C, D, and E contain the teacher and staff interview guide, student pre- and post-tests, and parent survey, respectively.

The teacher and staff interviews will be conducted at the end of the fourth and final week of Super Kids and hand-coded for trends. The student pre-test/post-test is on the second-grade reading level according to the Flesch-Kincaid scale. Five questions are included on this instrument to assess a representative sample of the content taught on each day of the program (Linn, Miller, & Gronlund, 2005). Descriptive statistics and paired sample t-tests for pre- and post-test variables will indicate the level of change in students’ science knowledge due to program participation.
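The planned descriptive statistics and paired-sample t-test can be sketched briefly. The following is a minimal illustration with hypothetical scores (the actual analysis will be run in SPSS); it computes the paired t statistic directly from the pre/post differences:

```python
from math import sqrt
from statistics import mean, stdev

# Hypothetical pre-/post-test scores (items correct out of 5) for one class.
pre  = [2, 3, 1, 4, 2, 3, 2, 1]
post = [4, 4, 3, 5, 3, 4, 4, 2]

# Descriptive statistics for each administration.
print(f"pre:  mean={mean(pre):.2f}, sd={stdev(pre):.2f}")
print(f"post: mean={mean(post):.2f}, sd={stdev(post):.2f}")

# Paired-sample t statistic: each student serves as his or her own control.
diffs = [b - a for a, b in zip(pre, post)]
t = mean(diffs) / (stdev(diffs) / sqrt(len(diffs)))
print(f"paired t = {t:.2f} (df = {len(diffs) - 1})")
```

The paired design matters here: the t-test is computed on each student's own gain score, not on the two group means independently.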

The parent surveys will carry coded identifiers rather than names (e.g., “P1” for Parent Number One) and will be hand-tallied. The percentage of the total will be calculated for each of the four possible responses to each survey question: “definitely yes”, “generally yes”, “generally no”, and “definitely no”.
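The tally-and-percentage step is straightforward to express in code. A minimal sketch with hypothetical responses to one survey question (the actual tally will be done by hand):

```python
from collections import Counter

RESPONSES = ["definitely yes", "generally yes", "generally no", "definitely no"]

# Hypothetical coded parent responses (P1, P2, ...) to a single question.
answers = ["definitely yes", "generally yes", "definitely yes",
           "generally no", "definitely yes", "generally yes"]

counts = Counter(answers)
for response in RESPONSES:
    pct = 100 * counts[response] / len(answers)
    print(f"{response}: {counts[response]} ({pct:.1f}%)")
```

Reporting all four categories, including any with zero responses, keeps the percentages summing to 100 and makes year-to-year comparison simpler.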

The University of Central Arkansas uses a program called Banner for academic advising, student registration, and housing demographic data. Line items can be added to the system to identify students who attended Super Kids and the years in which they attended. The system can then identify and tally the raw number of students who attended the Super Kids Program, and that number can be compared to the Super Kids graduation numbers for the appropriate year. Trends can be tracked starting with the 2008-2009 school year because Super Kids is in its tenth year. Retention data are computed by UCA each year; using the line items for Super Kids participation, we can determine whether these students differ from non-participants in retention.
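Once Super Kids attendance is flagged in the admissions data, comparing participation to enrollment reduces to a set intersection. A minimal sketch with hypothetical student IDs (the Banner line items and any export format are assumptions; Admissions would configure the actual report):

```python
# Hypothetical student-ID sets exported from the admissions system.
super_kids_grads = {"1001", "1002", "1003", "1004"}  # one Super Kids cohort
enrolled_at_uca  = {"1002", "1004", "9001", "9002"}  # fall enrollment records

attended = super_kids_grads & enrolled_at_uca
rate = len(attended) / len(super_kids_grads)
print(f"{len(attended)} of {len(super_kids_grads)} graduates enrolled at UCA ({rate:.0%})")
# → 2 of 4 graduates enrolled at UCA (50%)
```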

The evaluation agreement in Appendix F outlines the responsibilities of the evaluator and program director, timeline, and evaluator’s fee. The information sources are the student participants, the parents of the student participants, and the university’s admissions computer system. The students and parents will be selected based on their participation in the Super Kids Program and their willingness to participate by completing the consent form, surveys, pre-tests, and post-tests. The management plan outlines the procedure for administering these instruments. All data obtained will be coded with no names on the forms and stored in a locked file cabinet in Dr. Cooper’s office.

Content validity of the instruments will be reviewed by subject-matter experts, Writing Center employees, and graduate students, and appropriate recommended changes will be made. Reliability of the evaluation results will increase each time the evaluation is performed (Patten, 2007). The following table outlines the design of the evaluation.

Table 2. Evaluation Design

Evaluation Question

Information Required

Information Source

Method for Collecting Data

Analysis Procedures

Interpretation Procedures & Criteria

1. Does program participation affect science knowledge of the participants and how?

Pre/post test scores from 1st-3rd grade participants

Student pre/post tests

Pre/Post Tests

Descriptive statistics (means, standard deviation), paired sample t-tests using SPSS

Determine if students’ science knowledge levels changed to a practical degree after attending Super Kids.

2. Does program participation affect the behavior development of the participants and how?

Parent survey responses from parents of participants

Parent surveys

Survey tally form

Content analysis, coding by hand

Determine if students’ behavior development in the areas of confidence and character improved by attending Super Kids.

3. Do Super Kids graduates attend UCA?

Admissions data from UCA Admissions Department

Banner Admissions database

Access report in Banner system

Content analysis, tally raw numbers

Determine if students who participated in Super Kids attend UCA.

Plan for Reporting Results

Three groups of audience members will receive the evaluation results—students and parents, teachers and staff, and the UCA/Conway community. Each audience group is interested in different aspects of the evaluation (Fitzpatrick et al., 2004). Students and parents are interested in science content learned, confidence and friendship building skills acquired, access to quality summer activity, and continuance of similar programs for their children after third grade. Teachers and staff are interested in enrollment data, science content learned, and confidence and friendship building skills acquired. The UCA and Conway community are interested in continued enrollment in Super Kids and other programs at UCA and the effects this program has on high school students choosing UCA for college.

The evaluation report should include the program description, context, purposes, procedures, and findings. The Program Director is involved in the process from the beginning in order to increase the likelihood that the evaluation plan will be used. Dr. Stephanie Vanderslice, Assistant Professor of Writing at UCA, will serve as an impartial reviewer of the evaluation reports. Graduate students in the Statistics Department will review the data analysis for errors. The admissions personnel and the program evaluator will both access the Banner data to ensure accuracy. Table 3 outlines the plan for reporting results related to the identified evaluation questions.


Table 3. Reporting Plan

Evaluation Question

Audience

Content

Format

Schedule

Context

1. Does program participation affect science knowledge of the participants and how?

Super Kids students, parents, elementary schools, Program Director, teachers, staff

Student pre-and post-test scores and data analysis

Colorful bar graphs in evaluation report, slide show

At end of each summer session

Presented at fall staff meeting (Oct), interim & executive summary / full report, article in local and UCA newspapers

2. Does program participation affect the behavior development of the participants and how?

Super Kids students, parents, elementary schools, Program Director, teachers, staff

Parent survey

Results reported in presentation of evaluation results, use of color, simple format, easy to read

At end of each summer session

Mailed to participants, presented at staff meeting, included in executive summary and full report

3. Do Super Kids graduates attend UCA?

UCA Depts. (Admissions, Retention, Early Childhood Educ.), Conway, Faulkner Co., Program Dir.

Admissions reports and applications

Bar graph from Admission Department

Each fall, after day 10 of session

Presented at Super Kids, Admissions, & Recruitment staff meetings, interim, executive summary / full report, article in local and UCA newspapers


Management Plan/Schedule

Positive student outcomes are the focus of everyone involved in Super Kids. Periodic communication and monitoring are part of the process, so no one feels uninformed. Examples of how the evaluation results might be useful will be provided at the beginning of the evaluation, and suggestions for implementation will be provided in the full report. Examination of the results will help identify strengths as well as problem areas so they can be addressed, thus strengthening the impact of the evaluation.

Full disclosure of all findings will be provided in writing with oral explanation to stakeholders. Dr. Cooper, the Super Kids staff, Admissions and Recruitment staff, and parents are encouraged to provide feedback on the interim reports. The following management plan outlines the tasks, time lines, personnel responsible, resources required, and the cost of each task. Appendix G contains the proposed timeline in the form of a Gantt chart.

Table 4. Management Plan / Schedule

Evaluation Question

Tasks

Estimated Task Beginning and Ending Dates

Personnel Involved and Estimated Costs

Other Resources Needed and Costs

Total Task Cost

1. Does program participation affect science knowledge of the participants and how?

1.a. Obtain IRB approval (Appendix I)

1.b. Administer student pre-tests on first day of Super Kids X 4 weeks.

1.c. Administer student post-tests to students on last day of Super Kids X 4 weeks; conduct teacher/staff interviews.

1.d. Collect and code student pre-test responses.

1.e. Enter data into SPSS and run data analysis.

1.f. Prepare report of results.

1.a. March

1.b. Each Monday in June during Super Kids program.

1.c. Each Friday in June during Super Kids program.

1.d. First week of July.

1.e. First and second weeks of July.

1.f. ~August-September

1.a. Evaluator, 2 hours paperwork time @ $100 per 8-hour day = $25.00

1.b. Student teachers and MSE candidates, 30 minutes @ no pay = $0.00

1.c. Student teachers and MSE candidates, 30 minutes @ no pay = $0.00; Evaluator, 4 hours @ $100 per day = $50.00

1.d. Evaluator, 1 day @ $100 per day = $100.00

1.e. Evaluator, 1 day @ $100 per day = $100.00; graduate students @ no pay = $0.00

1.f. Evaluator, 2 days @ $100 per day = $200.00

1.a. none

1.b. photocopier—part of Early Childhood Ed. Department at UCA

1.c. photocopier

1.d. none

1.e. SPSS software, evaluator already has

1.f. computer, Word, PowerPoint software, evaluator already has

1.a. $25.00

1.b. $0.00 for teacher/MSE candidates

1.c. $0.00 for teacher/MSE candidates; $50.00 for Evaluator

1.d. $100.00

1.e. $100.00

1.f. $200.00

2. Does program participation affect the behavior development of the participants and how?

2.a. Send parent survey to prospective parents with enrollment packet

2.b. Collect and code parent responses

2.c. Prepare report of results (teacher/staff interviews address in #1)

2.a. ~ Spring Break

2.b. First week of July

2.c. August-September

2.a. Super Kids Program secretary, 4 hours @ $10.50 an hour = $42.00 (included in previous mail-out)

2.b. Evaluator, 4 hours @ $100 per day = $50.00

2.c. Evaluator, 1 day @ $100 per day = $100.00

2. a. photocopier—part of Early Childhood Ed. Department at UCA

2.b. none

2.c. computer, Word, PowerPoint software

2. a. none additional

2.b.$50.00

2.c. $100.00

3. Do Super Kids graduates attend UCA?

3.a. Add line items to admissions information to designate Super Kids, UCA Challenge, and MSI Program attendance as a child

3.b. Access admissions information through Banner system

3.c. Prepare report of results

3.a. July (in anticipation of fall semester)

3.b. August, after 10th day of class to ensure actual and accurate enrollment data

3.c. ~August-September

3.a. Admissions personnel and Evaluator, ½ day @ $10.50 per hour for admissions personnel; $100.00 per day for evaluator = $42.00 for admissions personnel; $50.00 for evaluator

3.b. Admissions personnel, 1 hour @ $10.50 an hour = $10.50

3.c. Evaluator, 1 day @ $100 per day = $100.00

3.a. Access to Banner System at UCA—allowed to Admissions Personnel and Evaluator

3.b. none

3.c. computer, Word, PowerPoint software

3.a. $92.00

3.b. $10.50

3.c. $100.00

4. Program Evaluation Reports

4.a. Prepare interim reports, executive summary, full report, and PowerPoint report

4.b. Present reports

4.c. Write articles

4.a. July-October

4.b. July & September interim reports; October—full report (fall planning meeting)

4.c. October

4.a. Evaluator, 2 days @ $100 per day = $200.00

4.b. Evaluator, 3 hours @ $100 per day = $37.50

4.c. Evaluator, 2 hours @ $100 per day = $25.00

4.a. computer, Word, PowerPoint software

4.b. laptop, projector (UCA has)

4.c. computer, Word, PowerPoint software

4.a. $200.00

4.b. $37.50

4.c. $25.00


Proposed Evaluation Budget

The following budget (Table 5) will be reviewed and approved by Dr. Cooper before the program evaluation begins. Appendix H contains the narrative explanation of each budget item.

Table 5. Proposed Evaluation Budget

Category

Year 1 Budget

PERSONNEL/STAFFING

 

Salaries:

 

Evaluator

$ 1037.50

Super Kids Secretary

$ 42.00

Admissions Personnel

$ 52.50

Student teachers

$ 0.00

MSE candidates

$ 0.00

CONSULTANTS

 

Graduate Students

$ 0.00

Writing Center

$ 0.00

TRAVEL/PER DIEM

$ 0.00

COMMUNICATIONS

$ 50.00

OPERATIONS

 

Printing / Duplication

$ 50.00

Data Processing

$ 0.00

Printed Materials

$ 100.00

Supplies / Equipment

$ 20.00

OVERHEAD

$ 486.76

TOTAL

$ 1838.76

(Horn, 2001).
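The arithmetic in Table 5 can be cross-checked by re-adding the line items. A minimal sketch (all figures taken from the table; the overhead figure is given, not derived here):

```python
# Line items from Table 5 (Year 1 budget).
salaries = {
    "Evaluator": 1037.50,
    "Super Kids Secretary": 42.00,
    "Admissions Personnel": 52.50,
    "Student teachers": 0.00,
    "MSE candidates": 0.00,
}
operations = {
    "Consultants": 0.00,
    "Travel/Per diem": 0.00,
    "Communications": 50.00,
    "Printing/Duplication": 50.00,
    "Data Processing": 0.00,
    "Printed Materials": 100.00,
    "Supplies/Equipment": 20.00,
}
overhead = 486.76

direct = sum(salaries.values()) + sum(operations.values())
total = direct + overhead
print(f"direct = ${direct:.2f}, total = ${total:.2f}")
# → direct = $1352.00, total = $1838.76
```

The computed total matches the $1,838.76 shown in the table.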

Adherence to Program Evaluation Standards

The Joint Committee on Standards for Educational Evaluation identifies 30 standards for program evaluation divided into four categories: utility, feasibility, propriety, and accuracy (The Joint Committee on Standards for Educational Evaluation, 1994). Appendix I provides an outline of each standard addressed in the program evaluation. Using these standards clarifies the purpose of the evaluation by maintaining context and focus on the stakeholders’ needs.

Meta-evaluation is important to increase the understanding of the evaluation plan and the credibility of the evaluation to stakeholders. It provides reassurance that the decisions made based on the program evaluation are credible (The Joint Committee on Standards for Educational Evaluation, 1994). The evaluator, program director, staff, and teachers should be involved in the meta-evaluation. Appendix J provides a checklist (Shepard, 1977) for conducting a meta-evaluation of the Super Kids Program evaluation plan.

Summary, Conclusion, and Reflection

The purpose of program evaluation is to determine the worth, value, or merit of a program. This evaluation of the UCA Super Kids Program utilized an objectives-based approach to determine achievement of program objectives. This approach was used to answer specific guiding questions about how well the program is working to achieve desired outcomes, and it is organized and systematic.

The objectives of the Super Kids Program serve as the standards or criteria for the program evaluation. Guiding questions emerge from the program objectives. Relevant information to be collected includes the student pre-tests and post-tests, teacher and staff interviews, parent surveys, and existing information. The evaluation plan facilitates application of the Joint Committee’s thirty Standards for Educational Evaluation and data analysis for documentation of achievement of program objectives.

For the conclusions of the program evaluation to be justified they must relate to the guiding questions and be fully reported to the stakeholders. Care must be taken for all conclusions to accurately, logically, and adequately reflect the results of the evaluation, including limitations.

One last thought to keep in mind is that practical change may differ from statistically significant change. When data analysis is complete, some areas may not show statistically significant change but may still indicate practical change, which is appropriate for a formative evaluation whose purpose is program improvement. The important question here is whether stakeholders can use the program evaluation results, including the data analysis, to make program improvement decisions.
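A standardized effect size is one common way to put a number on practical change independently of the p-value. A minimal sketch using Cohen's d for paired scores (hypothetical data; this statistic is an illustration, not part of the stated analysis plan):

```python
from statistics import mean, stdev

# Hypothetical paired pre-/post-test scores.
pre  = [2, 3, 2, 3, 2, 3]
post = [3, 3, 3, 4, 3, 3]

diffs = [b - a for a, b in zip(pre, post)]
d = mean(diffs) / stdev(diffs)  # Cohen's d for paired samples (d_z)
print(f"effect size d = {d:.2f}")
```

With cohorts as small as a summer program's, a t-test can fail to reach statistical significance even when d indicates a large practical change, which is exactly the distinction drawn above.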


References

American Evaluation Association. (2004). Guiding principles for evaluators. Retrieved June 2, 2008, from http://www.eval.org/Publications/GuidingPrinciples.asp

 

Appendix A

Description of Super Kids Content

Young Ornithologists: Birds of a Feather

Come join our flock! You will get more than a bird’s eye view of our feathered friends. With binoculars and guidebooks in hand, we will observe beaks and bills, legs and eggs, and feathers and feet. Our birds eat, move, communicate, and care for themselves. Our creativity will soar as we build feeders, design nests for nibbling, and experiment with flying. We are sure to hatch several feathered young ornithologists.

Young Medical Scientists

It’s never too early to discover the next Dr. Lee Salk. Scrub-up and get prepared for an exciting experience in a medical laboratory. Come and find out what it is like to be the doctor rather than the patient. Action packed, hands-on exploration in a wide variety of medical fields is a promise. Students will definitely feel like medical scientists!

Young Paleontologists: Dinosaur Galore!

Dinosaurs may be out-of-sight, but they are not out-of-mind! The exploration and investigation of dinosaurs is alive, well, and growing. New dinosaur discoveries are constantly changing the way we think about their appearance, movement, and behavior. Here is an opportunity for your young paleontologist to learn more about these “terrible-lizard friends” and have a “dino-mite” time!

Young Botanists and the Plant World

Come join our garden paradise while sprouting new ideas about the plant world. Budding botanists will get down and dirty in a classroom crawling with vines. As the sun shines on each new day, our thumbs will grow greener with every investigation about the friends and foes of the plant world. Our creativity will blossom as students plant their feet in a greenhouse built for Super Kids!

Young Ecologists and the Rainforest

Welcome to the sounds and sights of the jungle! Come explore the rainforests around the world where the temperature hardly changes from day to day, season to season, and year to year. Learn how rainforests affect our air, weather, and lives. Grab your cameras and gear to come join the adventure. The rainforest will come alive with exotic creatures that beg to meet new friends.

Young Oceanographers and the Ocean

Super Kids will enjoy the sparkling water, exotic fish, colorful coral, sandy beach, and warm sun. Get ready to explore the underwater kingdom. The young oceanographers will investigate what makes the ocean salty, where the water comes from, ocean creatures, and the survival characteristics and habits of ocean creatures. Pack your scuba gear and prepare for the dive into an ocean of new ideas and excitement.

Young Astronauts and Outer Space

Blast off to an exciting study of the planets, space travels, space stations and many other space related topics. Students will step into a space laboratory once they enter the classroom. This futuristic investigation of space will spark the imagination of every student. It should come as no surprise when a future astronaut is discovered from this experience!

Young Biologists and Planet Earth

Come join our animal friends and have egg-ceptional fun discovering and witnessing the metamorphosis of all kinds of creatures that hatch from eggs. Our young biologists will experience firsthand the sounds and sights of toads, frogs, birds, baby chicks, and other animal surprises. Students will get their feet wet and hands dirty exploring the egg-friendly habitats of our animal buddies.

Young Explorers and the Arctic

Cool off from the summer heat and join the young explorers to the vast regions of the North and South Poles. Students will meet new friends—polar bears, penguins, seals, and walruses. Our young explorers will learn how these animals, and even humans, survive the coldest parts of the world. Earmuffs and gloves are needed for this “chilly” experience!

Appendix B

Summary of Similar Programs

Name of Program

College or University

Content Taught

Age of Attendees

Cost

Children’s Creative Learning Center

University of Alabama at Birmingham

All curricular areas

Ages 7-9

$125 per week

Kaleidoscope

Century College

White Bear Lake, MN

Art, science, math

Ages 3-16

$55.00 per class

Kids U

Southern Oregon University, Ashland, OR

Art, science, tennis, fencing, swimming

Ages 8-12

$99 per class

Super Kids

University of Central Arkansas, Conway, AR

Science, character, confidence

Grades 1-3

(ages 6-10)

$145 per week

Young Artists Summer Program

Silver Lake College, Manitowoc, WI

Art

Grades 1-9

(ages 6-14)

$40 per class

Young Child Program

University of Northern Colorado

Visual and performing arts, math, science, language, literature

Age 4 to 4th grade

$185 for a two-week session

Appendix C

Teacher and Staff Interview Guide and Tally Form

Below is a list of guiding questions dealing with the preparation of teachers and staff for the UCA Super Kids Program. These questions deal with issues that are important to the organization and delivery of that program. Other questions may be pursued as the interview progresses.

  1. What is your responsibility or area of focus with the Super Kids Program?
  2. Describe your preparation for teaching or working with the Super Kids Program.
  3. Describe the science curriculum used in the program. How is it adapted to make it grade/age-appropriate?
  4. Describe the behavior development portion of the program. What is the theory base? Describe some of the behavior development activities.
  5. What else would you like to express concerning your involvement with the Super Kids program?

Interview Question | Response Code | Tally
1. Responsibility | Teacher—room | _____
 | Staff—duties | _____
 | Other | _____
2. Preparation | Lesson plans | _____
 | Activity preparation | _____
 | Mail-outs | _____
 | Room decorating/preparation | _____
 | T-shirt orders | _____
 | Copying | _____
 | Reserve room for Closing Ceremony | _____
 | Other | _____
3. Science | Curriculum | _____
 | Grade/age-appropriate | _____
 | Other | _____
4. Behavior Development | Theory base | _____
 | Friendship Activities | _____
 | Confidence Activities | _____
 | Other | _____
5. Other issues or topics identified | Other | _____
 | Other | _____

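Once interviews are complete, the coded responses can be tallied automatically rather than by hand. The sketch below is a minimal illustration using Python's standard library; the response strings are hypothetical codes modeled on the tally form above, not actual interview data.

```python
from collections import Counter

# Hypothetical coded responses for question 2 ("Preparation");
# the codes mirror the tally form above. Illustrative data only.
responses = [
    "lesson plans", "activity preparation", "lesson plans",
    "room decorating", "copying", "lesson plans", "mail-outs",
]

# Count how often each response code was mentioned across interviews.
tally = Counter(responses)

for code, count in tally.most_common():
    print(f"{code}: {count}")
```

Sorting by frequency with `most_common()` surfaces the preparation tasks interviewees mention most often, which is the same information the handwritten tally column is meant to capture.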

Appendix D

Example Student Pre- and Post-Test

Name: ________________________________________

Classroom: Young Ecologists—Biology Room, Oviparous Animals

Circle the choice that best answers each question:

  • Which animal comes from an egg?
    • Human
    • Squirrel
    • Chicken
    • Bear
  • Which of these choices describes a reptile?
    • Warm-blooded
    • Lay eggs on land
    • Has fur
    • Has long ears
  • Which of these birds can fly for long distances?
    • Penguin
    • Ostrich
    • Flamingo
    • Cardinal
  • How many weeks does it take a tadpole to turn into a frog?
    • 2 weeks
    • 8 weeks
    • 12 weeks
    • 16 weeks
  • A trout is a
    • Reptile
    • Bird
    • Fish
    • Human

Student Pre-Test / Post-Test Data Analysis Procedure:

Create a data file in Notepad or Excel and import it into SPSS with the File → Read Text Data command. Follow these steps:

  • Indicate that the text file does not match a predefined format.
  • Indicate that variables are delimited by a specific character, that variable names are included at the top of the file, that there is no text qualifier, and that the data should be cached locally.
  • Code each possible answer as correct (“1”) or incorrect (“0”) and enter the codes into the SPSS data file using Variable View.
  • Add value labels as follows:
    • Subject: 1-20 for each student in the class
    • Grade: entering first grade = 1, entering second grade = 2, entering third grade = 3
    • Egg: human = 0, squirrel = 0, chicken = 1, bear = 0
    • Reptile: warm = 0, eggsland = 1, fur = 0, ears = 0
    • Fly: penguin = 0, ostrich = 0, flamingo = 0, cardinal = 1
    • Frogweeks: 2 = 0, 8 = 0, 12 = 0, 16 = 1
    • Trout: reptile = 0, bird = 0, fish = 1, human = 0
  • Create pretest and posttest sum scores for items 5.c-5.g.
  • Click Save.
  • To analyze the data in SPSS version 15.0, click Analyze → Compare Means → Paired-Samples T Test (Stockburger, 2008).
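For readers without access to SPSS, the same paired-samples t-test can be computed with a short script. The sketch below uses only Python's standard library and hypothetical pre/post sum scores (not actual Super Kids data) to illustrate the calculation.

```python
import math
import statistics

# Hypothetical pre- and post-test sum scores (items correct, 0-5)
# for the same ten students; illustrative values, not program data.
pretest = [2, 3, 1, 4, 2, 3, 2, 1, 3, 2]
posttest = [4, 5, 3, 5, 4, 4, 3, 3, 5, 4]

# Paired-samples t-test: test whether the mean per-student gain
# (posttest minus pretest) differs from zero. This mirrors SPSS
# Analyze > Compare Means > Paired-Samples T Test.
gains = [post - pre for pre, post in zip(pretest, posttest)]
n = len(gains)
mean_gain = statistics.mean(gains)
sd_gain = statistics.stdev(gains)  # sample standard deviation of the gains
t_stat = mean_gain / (sd_gain / math.sqrt(n))

print(f"mean gain = {mean_gain:.2f} items, t({n - 1}) = {t_stat:.2f}")
```

The resulting t statistic is compared against the t distribution with n − 1 degrees of freedom, exactly as SPSS does when it reports the two-tailed significance for the paired test.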

Appendix E

Parent Survey

Parents: It is very important to the teachers and staff of UCA Super Kids that parents have a chance to tell how they feel about what is being taught to their children. Below are some statements to help us gather needed information concerning the Super Kids Program. Please answer exactly the way you feel. You are not asked to put your name on this paper or on the return envelope that is provided. Your answers will be grouped with those of other parents to give an overall picture of how parents feel. Thank you for taking the time to provide the needed information and for helping to improve the Super Kids Program. Please return the survey by the end of the week that your child attends Super Kids.

Read each statement listed below and check the box that comes closest to your feelings, using this scale: Definitely Yes / Generally Yes / Generally No / Definitely No.

  • The program blends well with science content taught in my child’s elementary school.
  • The character-building lessons support what I teach my child at home.
  • I observed my child exhibiting good character during or after attending Super Kids.
  • The confidence-building lessons support what I teach my child at home.
  • I observed my child exhibiting confidence in his/her abilities during or after attending Super Kids.
  • My child has developed friendship-building skills (e.g., self-control, problem-solving) at Super Kids.
  • My child feels comfortable at the UCA Child Study Center facility.
  • My child has expressed a desire to attend UCA Challenge.
  • My overall feeling about the Super Kids program is positive.

Please write any additional comments you would like the teachers and staff to know:
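Returned surveys can be summarized by scoring the four response options numerically and averaging per item. The sketch below is a minimal illustration; the responses shown are made up, not actual parent data.

```python
# Score the four survey response options numerically.
SCALE = {"Definitely Yes": 4, "Generally Yes": 3,
         "Generally No": 2, "Definitely No": 1}

# Hypothetical responses to one survey item; illustrative only.
item_responses = ["Definitely Yes", "Generally Yes", "Definitely Yes",
                  "Generally No", "Definitely Yes"]

scores = [SCALE[r] for r in item_responses]
mean_score = sum(scores) / len(scores)

# On this 1-4 scale, means above 2.5 indicate overall agreement.
print(f"mean = {mean_score:.2f} (n = {len(scores)})")
```

Computing one such mean per statement gives the “overall picture of how parents feel” that the survey instructions promise, while keeping individual responses anonymous.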

Appendix F

Evaluation Agreement

TO: Dr. Mark Cooper, Program Director of UCA Super Kids

FROM: Nina Roofe, Program Evaluator

RE: Evaluation of UCA Super Kids Program

DATE: _____________

I am happy to confirm acceptance of our agreement to conduct a program evaluation of the UCA Super Kids Program. As per our previous discussion, you (Program Director) will provide the previously requested documents and access to personnel during June and July, and I (Evaluator) will provide the evaluation as outlined in the management plan for the agreed-upon price of $1838.76, distributed as outlined in the budget.

This price includes presentation of the executive summary, slide show, and full report at the fall planning meeting in October 20__. Please indicate your formal acceptance of this plan by signing below. Retain one copy for yourself and return the original to me.

I look forward to working with you.

__________________________________ ________________

Dr. Mark Cooper, Program Director, UCA Super Kids Date

_________________________________ ________________

Nina Roofe, Program Evaluator Date

Appendix G

Gantt Timeline




Appendix H

Budget Narrative

Personnel/Staffing

  • $1037.50 is requested for Nina Roofe to serve as the Program Evaluator; this is based on the going consulting rate of $100.00 per 8-hour workday ($12.50 per hour × an estimated 83 hours)
  • $42.00 is requested for the Super Kids Program secretary; this is based on an estimated rate of $10.50 per hour multiplied by the hours she is expected to contribute to the evaluation component of this program
  • $52.50 is requested for Admissions personnel; this is based on an estimated rate of $10.50 per hour multiplied by the hours he/she is expected to contribute to the evaluation component of this program
  • $0.00 is requested for student teachers/MSE candidates, as participation is a learning opportunity and/or a required part of their educational program

Consultants

  • $0.00 is requested for the graduate students in statistics, as they are paid by the University
  • $0.00 is requested for the Writing Center Director, as she provides review as a service to the University

Travel/Per Diem

  • $0.00 is requested for travel or per diem as all parties currently work at UCA and would not incur additional travel or meal costs as a result of participating in the evaluation of Super Kids

Communications

  • $50.00 is requested for postage and phone use; this is based on an estimated mail-out of 100 enrollment packets at the corporate rate of $0.40 per packet ($40.00), plus an additional $10.00 for estimated phone use

Operations

  • $50 is requested for copying of parent surveys, student pre- and post-tests, and interview guides
  • $0 is requested for data processing as this is already present with the evaluator
  • $100 is requested for printing of the executive summary, full report, and PowerPoint presentation handouts
  • $20 is requested for miscellaneous office supplies and equipment usage

Overhead

  • $486.76 is requested for overhead, calculated as 43% of salaries only. This percentage was set by the UCA Budget Office; Terrie Camino explained that indirect cost at UCA is calculated at 43% of salaries, excluding fringe benefits and other costs.

Total

  • $1838.76 is requested for the evaluation of the UCA Super Kids Program
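The line items above can be cross-checked with simple arithmetic. The sketch below reproduces the budget figures from the narrative and confirms that the overhead and total are internally consistent.

```python
# Budget figures taken from the narrative above.
salaries = {
    "evaluator": 1037.50,
    "secretary": 42.00,       # $10.50/hr, estimated hours
    "admissions": 52.50,      # $10.50/hr, estimated hours
}
other = {
    "communications": 50.00,  # 100 packets x $0.40 + $10 phone
    "copying": 50.00,
    "printing": 100.00,
    "supplies": 20.00,
}

# Overhead is 43% of salaries only, per the UCA Budget Office.
overhead = round(0.43 * sum(salaries.values()), 2)
total = round(sum(salaries.values()) + sum(other.values()) + overhead, 2)

print(f"overhead = ${overhead}, total = ${total}")
```

Running the check reproduces the stated figures: overhead of $486.76 on $1132.00 of salaries, for a total request of $1838.76.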

Appendix I

Evaluation Standards Checklist

 

For each standard, the checklist records whether it was addressed, partially addressed, not addressed, or not applicable, along with the relevant page(s) of the evaluation plan. All thirty standards were marked “addressed”:

Standard | Page(s)
U1 Stakeholder Identification | 5
U2 Evaluator Credibility | 8
U3 Information Scope and Selection | 10-12
U4 Values Identification | 9
U5 Report Clarity | 13-14
U6 Report Timeliness and Dissemination | 13-14
U7 Evaluation Impact | 15-18
F1 Practical Procedures | 15-18
F2 Political Viability | 9
F3 Cost Effectiveness | 19, 46-47
P1 Service Orientation | 4
P2 Formal Agreements | 44
P3 Rights of Human Subjects | 27-35
P4 Human Interactions | 27-35
P5 Complete and Fair Assessment | 10-12, 15-18
P6 Disclosure of Findings | 15-18
P7 Conflict of Interest | 8
P8 Fiscal Responsibility | 19, 46-47
A1 Program Documentation | 4-6
A2 Context Analysis | 10-12
A3 Described Purposes and Procedures | 7-12
A4 Defensible Information Sources | 11, 13-14
A5 Valid Information | 10-14
A6 Reliable Information | 10-14
A7 Systematic Information | 13-14
A8 Analysis of Quantitative Information | 10-12
A9 Analysis of Qualitative Information | 10-12
A10 Justified Conclusions | 21
A11 Impartial Reporting | 13-14
A12 Meta-evaluation | 20, 50-51

(The Joint Committee on Standards for Educational Evaluation, 1994)

Appendix J

Meta-Evaluation Checklist

Evaluator’s Name: ____________________________________

Date: _____________

Questions to Ask (record an Answer and Comments for each):

  1. Is the purpose of the evaluation clearly defined and accurate?
  2. Is the scope of the evaluation clearly defined and accurate?
  3. Do the design and data analysis answer the question(s) they were intended to answer?
  4. Do the assessment results have the desired generalizability?
  5. Are the evaluation data accurate and consistent?
  6. Would other competent assessors agree with the conclusions of the evaluation?
  7. Are the stakeholders accurately identified?
  8. Are the target audiences accurately identified?
  9. Are the findings relevant to the audiences of the program?
  10. Do the reports (executive summary, full report, newspaper articles) accurately convey the results of the program evaluation?
  11. Have the most important and significant data been included in the assessment?
  12. Do the audiences view the assessment as valid and unbiased?
  13. Are the results provided to audiences in a timely manner?
  14. Are the results disseminated to all intended audiences?
  15. Is the program cost-effective in achieving the assessment results?
  16. Are the management plan and schedule realistic and inclusive of all needed elements?
  17. Is the budget accurate and appropriate?
  18. Do the Joint Committee Standards apply to this evaluation?

(adapted from Shepard, 1977)





Human Sciences Working Papers Archive: A Project of the Kappa Omicron Nu Leadership Academy