National Post School Outcome Data Community of Practice
------------------------------------------------------

I believe the use of Facebook was previously discussed; what was the
inherent problem with it?

Phil

 

 

 

 

Philip J. Vitkus, M.S.Ed.

Director of Instructional Services

Wizdom Systems, Inc.

www.wizdom.com

www.wizdomeducation.com

630.357.3000 x3004

pvit...@wizdom.com

Wizdom “the process people”

Business Process Improvement

Employee Performance Improvement

 

-----Original Message-----
From: owner-npsos...@lists.uoregon.edu
[mailto:owner-npsos...@lists.uoregon.edu] On Behalf Of Dawn Rowe
Sent: Wednesday, January 19, 2011 1:58 PM
To: npsoserv@lists.uoregon.edu
Subject: Re: [NPSO]: [NPSO] Notes from January 13th COP call


Thanks for responding. I will update the notes that will be posted on the
website. Thanks again for participating. Have a great week.

Dawn


On 1/19/11 8:47 AM, "Eudora Watson" <p...@potsdam.edu> wrote:

> 
> Greetings,
> 
> Thanks very much for your work and the chance to hear about the 
> experience with SPP#14 data collection in other states. One thing I 
> came away with was support for the idea of a mid-year communication 
> with exiters as a means to keep in touch. This is an idea that Robert 
> Shepherd and I have discussed - a report from the field that it was 
> effective in at least one instance is encouraging.
> 
> And - a clarification on the NY comments:
> 
> */What was the outcome that stood out most in your SPP/APR analysis?
> Share one positive outcome and maybe one that was somewhat shocking./*
> 
> NY: We had a decrease in response rate this year. The numbers we could 
> reach were down this year. Our male to female ratio was what we 
> expected. We had more females than males.
> 
> We had more females than males reporting engagement in postsecondary
> education, particularly in 2- and 4-year colleges/universities (that's
> the ratio I was referring to).
> 
> *What was your response rate?*
> 
> NY: We get many more phone numbers now. Many are disconnected. We
> offer a web version of the survey but very few use this option. We
> just try to collect as many numbers as possible in order to reach them.
> 
> Did I say "just"? - ouch. We try to obtain additional names and
> contact information for adults who do not reside in the home, and we
> work on our relationships with the schools (they supply the contact
> information) to help them gather good information. We report to the
> schools on the students for whom we have "no good numbers" - with
> mixed results: some schools work hard to find these students, and of
> these some are more successful than others.
> 
> */How are states currently reaching those hard-to-reach populations?/*
> NY: Our minorities were not represented.
> 
> Some minority groups are under-represented, a problem that is tied to 
> the difficulty of keeping in touch with exiters who went to city schools.
> 
> That's it. Thanks again,
> 
> Eudora
> 
> Dawn Rowe wrote:
>> The following are notes from last week's community of practice call.
>> The notes will also be posted on the website. Thank you to all who
>> participated. The topic for next month's call will be tools for
>> marketing the post-school outcomes survey and examining the adequacy
>> of your current survey. We are looking forward to hearing from states
>> on next month's call.
>> 
>> *Community of Practice*, *January 13, 2011*
>> *Participants*
>> Dan Boomer [California], Judy Johns [Kentucky], Patti Johnson
>> [Oregon], Jackie Burr [Oregon], Jennifer Kane [Nevada], Bobby Grammar
>> [North Carolina], Eudora Watson [New York], Deborah Donovan
>> [Mississippi], Susan Loving [Utah], Amy Jinks [New Hampshire]
>> 
>> [NPSO] Ryan Kellems, Deanne Unruh, Dawn Rowe, Jim Leinen
>> 
>> *Please let us know if we misspelled your name or didn't include you
>> on the list!*
>> 
>> /Notes are not verbatim but rather an attempt to capture the essence 
>> of what is shared. Please alert us if there are glaring errors!
>> /
>> *Announcements & Reminders*:
>> 
>> NPSO I-14 Data Use Toolkit Training: March 1-2, 2011
>> 
>> Secondary Transition State Planning Institute: May 17-21, 2011
>> 
>> *Topic*: *Response rate: Why are students not engaged, and how have
>> you reached those hard-to-reach populations?*
>> Welcome. My name is Dawn Rowe. I am the project coordinator for NPSO.
>> I started this position just this past November, so I am relatively
>> new to this project. I am not new to the field of transition, however.
>> I am finishing my doctoral work at the University of North Carolina
>> at Charlotte and have worked with NSTTAC for the past two and a half
>> years. Prior to that I was a transition specialist for a local school
>> district in South Carolina.
>> 
>> Thank you all for calling in. Today I will be facilitating a
>> conversation about your SPP/APR analysis and what you have learned
>> about students who are leaving high school, particularly students who
>> fall into the non-engaged group. I encourage you to share information
>> you have obtained about who falls into the non-engaged group and the
>> improvement activities being developed to reduce the number of
>> individuals who fall into this category. We will also discuss your
>> response rates for I-14 data collection and strategies for reaching
>> those hard-to-reach populations. So let's get started.
>> 
>> Many of you have either completed the SPP/APRs or are putting the
>> final touches on them prior to sending them to OSEP. You should now
>> have an idea of who is employed, who is enrolled in higher education,
>> who falls into the "some other employment" category, and who falls
>> into the "some other postsecondary education" group. Let's begin by
>> hearing about some of your outcomes.
>> 
>> */What was the outcome that stood out most in your SPP/APR analysis?
>> Share one positive outcome and maybe one that was somewhat shocking.
>> /*
>> NH: The response rates from students with LD and ED were better than
>> anticipated. We had a slightly lower response rate due to
>> undeliverable surveys. We'll need to work on that.
>> 
>> UT: We had more students in the postsecondary education category than
>> in employment. We have found it challenging to explain what is going
>> on with students using the ABC definitions required in the SPP/APR.
>> When we are sharing data with the public, we have to explain what
>> "other" means.
>> 
>> CA: We had a high response rate (94%).
>> 
>> MS: Our response rate was 87%. The number reported enrolled in some 
>> other education or some other employment was small. These were the 
>> lowest percentages we had.
>> 
>> NY: We had a decrease in response rate this year. The numbers we 
>> could reach were down this year. Our male to female ratio was what we 
>> expected. We had more females than males.
>> 
>> NC: Our response rate was down. Measurement C was our highest area.
>> Our students are engaged in something. We provide intensive job
>> training prior to leaving high school, which helps with competitive
>> employment. The majority of our students going on to postsecondary
>> education are going to community colleges.
>> 
>> NV: We have changed systems for collecting I-14 data. We are still 
>> working on our data poll.
>> 
>> KY: We had a 61% response rate. 39% of those were not engaged and 28% 
>> were employed.
>> 
>> OR: Our response rate was low, about 72%. We had more individuals
>> working than in higher education. We had a smaller "other" category.
>> Our biggest districts are not as engaged in the process as the
>> smaller districts.
>> 
>> CA: 2,400 students in community college. 5,000 in some type of other
>> postsecondary education. 3,700 employed. We were not able to contact
>> about 14,000.
>> 
>> */Have you drilled into your data to determine what is happening with
>> the non-engaged group? What will you do with this information? How
>> will you use this information to develop improvement activities? What
>> types of improvement activities do you have planned to reduce the
>> non-engaged group?
>> /*
>> UT: We got a lot of responses like "he has a disability, so he can't
>> work" or "he can't go to college." Parent and student expectations
>> are low. We need to look into increasing these expectations prior to
>> leaving high school, looking at the impact of disability on
>> employment and postsecondary education, and providing more support.
>> 
>> CA: Saying "I can't" was a big no-no in front of my dad.
>> 
>> NY: We have One-Stop centers and other service providers that
>> provide a multitude of services; however, awareness of the programs
>> and supports available after an individual leaves high school was low.
>> 
>> NC: We had lots of "I don't know" responses. SSI was the most
>> well-known service provided. We have a huge partnership with VR, but
>> this was fairly low on the list. Getting accurate information from
>> the larger districts is a challenge.
>> 
>> OR: We have taken a case study approach to reporting information back
>> to the LEAs. We do a pre-exit survey. Districts are able to look at
>> individual student outcomes and the transition services and supports
>> provided.
>> 
>> NH: We need to look at our questions to be able to look further into
>> the non-engaged group. We had lots of people who did not complete the
>> item, and we did not ask any other questions that would allow us to
>> drill further. As far as improvement activities, we are focusing on
>> I-13 and providing better transition services in school to improve
>> post-school outcomes. We also had an issue with the 90-day question.
>> Lots of students were employed, but not for 90 days, and we did not
>> have questions to learn why.
>> 
>> UT: Bad numbers.
>> 
>> CA: I am looking at breaking the question down into "not able to
>> contact the first time" and "not able to contact the second time" and
>> looking at the response rate that way. If you take out the "other"
>> category, the response rate goes up.
>> 
>> *What was your response rate?
>> *
>> See the responses above as well.
>> 
>> UT: We had a 20% response rate. 80% could not be contacted or did not 
>> answer the survey. We had lots of bad numbers and disconnected numbers.
>> 
>> OR: We had lots more disconnected numbers or people who did not want 
>> to participate. We think it is due to the economy. We do a phone 
>> survey and people are not answering the phone. Many people have debt 
>> collectors calling and are just not answering.
>> 
>> NV: When we did only a hard-copy survey, we had a higher response
>> rate. We now give people options to participate. They can
>> participate by phone, snail mail, or online. We thought that the use
>> of technology would improve our response rate because it made the
>> survey easier to complete, but that did not happen. We think it is
>> because previously parents were responding to the surveys on behalf
>> of their students, and students are not responding at the same rate
>> as their parents did. Plus, we have had lots of kinks in our new
>> system to work out.
>> 
>> NY: We get many more phone numbers now. Many are disconnected. We
>> offer a web version of the survey but very few use this option. We
>> just try to collect as many numbers as possible in order to reach them.
>> 
>> NH: We offer a hard copy of the survey and one online. Few students
>> take the online survey. We met with our stakeholders, and they
>> suggested getting parent emails. Students get so many emails that
>> they just won't click, especially if it hasn't worked before. We also
>> have lots of rural areas that do not have email access.
>> 
>> */How are states currently reaching those hard-to-reach populations?
>> /*
>> UT: We continue to do telephone surveys.
>> 
>> OR: Our strategy is to get as much information as possible the year
>> before a student leaves. We have suggested they gather this
>> information as an activity in the classroom. Those who have gotten
>> the information a year ahead of time have a better response rate
>> than those who do not.
>> 
>> NV: We have a similar process.
>> 
>> KY: We do a senior survey and are talking about linking the exit
>> survey, the senior survey, and the individual learning plans so that
>> we are collecting this information once rather than from multiple
>> sources.
>> 
>> NV: We also conduct a senior survey. I also do a district poll and 
>> pull information from the statewide database to ensure I have the 
>> most up-to-date information.
>> 
>> UT: We do not do any other type of survey.
>> 
>> CA: We had 29,500 leavers this past year. We rely on the SELPAs to
>> collect data.
>> 
>> NY: We are using our transition coordinators and trying to provide 
>> them with concrete information and a means to work with districts.
>> 
>> UT: We have a very mobile population. Many of these students are also
>> low-income and have the highest drop-out rate. How do we deal with this?
>> 
>> NY: Our minorities were not represented.
>> 
>> UT: Minorities were represented; however, we have a growing refugee
>> population we are going to have to consider. There are lots of
>> barriers to reaching that group. With the huge variation in languages
>> and dialects, it is impossible to translate all of your materials.
>> 
>> Our time has come to an end. I appreciate your participation in
>> today's call. I will post the notes from today's call on the website
>> shortly. Just a reminder, we are available to review your SPP/APRs
>> if needed. Just send them our way. Our next call will be February 10th.
>> We will be talking about tools for marketing the post-school outcomes
>> survey and examining the adequacy of your current survey. We look
>> forward to your participation. Have a wonderful day.
>> 
>> 
>> 
>> --
>> Dawn A. Rowe
>> Project Coordinator
>> National Post-School Outcome Center
>> University of Oregon
>> 541-346-8412
>> dro...@uoregon.edu
>> www.psocenter.org
>> 

-- 
Dawn A. Rowe
Project Coordinator
National Post-School Outcome Center
University of Oregon
541-346-8412
dro...@uoregon.edu
www.psocenter.org




To unsubscribe from this mailing list, please send an email to
majord...@lists.uoregon.edu with "unsubscribe npsoserv" in the body
(without the quotes).

