Monday, December 11, 2006

PLS 595 JOURNAL ENTRY 17

Lessons Learned

Under constraints and the influence of the evaluation team, the original vision for an impact evaluation had devolved into a lower level of investigation: program monitoring. Furthermore, a much longer time frame would have been required than was available even at the start of the project to show the type of program results that matter most. Nevertheless, the kind of practical, performance-oriented data that program monitoring offered The Center became the focus of evaluation efforts. That is what is called “a good start.” Simply being able to measure performance based on client perceptions is a huge leap toward more rational forms of accountability. Plus, the habit of data collection is a good one to have. It not only sets a higher professional standard, it sends the message that programs that engage in it are to be taken seriously. Later on, that sets the stage for more elaborate forms of analysis, the logical conclusion of which could be a goal-oriented outcomes evaluation, or even an impact evaluation if other agencies adopt the same progressive orientation.

On the whole, the experience of initiating and following through with the evaluation process has been a positive one, making the destination all the more enjoyable and confirming that the journey does matter. It has been difficult for me to single out the lessons that matter to me most. Certainly, some stand out more than others.

  1. Expect the Unexpected: Orchestrating a team can be rewarding and highly productive. However, the flip side is that it is frequently frustrating and involves many unattractive trade-offs, especially in terms of processing time. Teams offer many advantages, including breadth of knowledge, wisdom and due caution. However, they can also squelch creativity and impede progress. In this case, the team experience was a mixture of positives and negatives. Overall, the team came through in the end and helped set my project agenda for the semester just in time. What I did not anticipate were the turf issues associated with interagency comparisons. That, perhaps, was the most frustrating aspect of the experience.
  2. Be Proactive: When it comes to project management, I have a tendency (like many of my generation) to procrastinate. In my own defense, however, it might be partly explained as a kind of self-defense mechanism I have developed given the crushing weight of my responsibilities. Arguably, I always get the job done in my own particular idiom. Nevertheless, I would never advise anyone to do things the way I do. I will say that when faced with such trade-offs, technology can be a real lifesaver. My use of Basecamp to keep myself and the project on track is a good example. Even low-tech solutions, however, can make a big difference, such as making to-do lists and keeping in touch.
  3. Set the Bar Higher: When I started this evaluation, I purposely set the bar as high as it would go. I knew that compromises were inevitable and that disappointment was the rule and not the exception, especially where ambition exceeds resources. Regardless, I would not change a thing. As naïve as it might seem to some, having a vision gives one a reason to go on when things go wrong. Without the moment of force that accompanies hopeless idealism, one might peter out before reaching high enough to achieve that personal best.
  4. Become an Advocate for Your Mission: Having a just cause is everything. It sets the tenor for everything. If the cause is just, resources will not matter. As long as someone is a true advocate, it stands to reason he or she will encounter like-minded individuals. Resources will follow. In this case, anyone could see the needs were real and that the stakes were high. Though I may not be there to witness my success, I know it will come as long as I can get someone else to care as much as I do. After all, so much more depends upon this evaluation’s fruition than better performance data.

Ultimately, I keep in mind the most important stakeholders of all: child victims. They are the reason why we do what we do. They are the legacy, and mine is but one small part in righting one of the worst kinds of wrongs. That bears repeating. Whatever challenge I may have to face, I will never have to bear the kinds of wounds that they always will. However difficult my role becomes, I have no better motivator than knowing that my hard work and that of the team will eventually help alleviate suffering and take a stand for those who cannot stand for themselves. To me, that is an encouraging thought.

Sunday, December 03, 2006

PLS 595 JOURNAL ENTRY 16


Protocols for Toddlers, Developmentally Challenged (DC), and Hearing/Speech-Impaired Children

The staff at CC brought up an important point that needs some clarification. Occasionally, the surveyor will be faced with a child victim who is too young to respond accurately or is otherwise unable to respond. Examples of inability to respond include developmentally challenged children and children with hearing and/or speech impairments. This could become an issue, since excluding these children would affect the accuracy of the sample. Therefore, I should add several protocols to address this problem:

  1. Toddlers: Children whose answers are suspect should be excluded from the sample. Obviously, children develop at different rates; some 2- and 3-year-olds are more responsive and aware than others. This will be a judgment call on the part of the surveyor, in consultation with the staff most familiar with the child. If the child seems reasonably responsive and focused in his or her answers, include that child.
  2. Developmentally-Challenged Children: The same may be said for these children. In these cases, the surveyor should always confer with the service delivery staff to determine whether the child CAN respond appropriately and accurately. If yes, survey the child and then decide whether his or her answers can be trusted. Otherwise, exclude the child from the survey. The surveyor’s judgment is also key here.
  3. Hearing/Speech-Impaired Children: Obviously, every effort should be made to include these children in the survey. In these cases, the usual protocol of interviewing the child alone will have to be forgone unless the surveyor can sign or otherwise communicate with the child. Allowing literate older children (7 y.o. and up) with such disabilities to fill in the survey themselves is permissible. Otherwise, a translating adult will need to be present. Allow this.
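The three protocols above amount to a small decision procedure, which can be sketched as follows. This is only an illustration of the logic; the category names, parameters, and return values are my own shorthand, not part of any formal protocol, and the surveyor's judgment still governs each case.

```python
# Hypothetical sketch of the inclusion protocols; categories and
# return strings are illustrative shorthand, not official terms.

def screening_decision(age, category, staff_says_responsive, can_sign_or_write):
    """Return how (or whether) to survey a child, per the three protocols.

    category: "typical", "developmental", or "hearing_speech"
    staff_says_responsive: judgment of the staff most familiar with the child
    can_sign_or_write: surveyor can sign, or a literate child (7+) can
        fill in the form himself or herself
    """
    if category == "hearing_speech":
        # Every effort is made to include these children.
        if can_sign_or_write:
            return "survey directly or via written form"
        return "survey with a translating adult present"
    if category == "developmental":
        # Always confer with service delivery staff first.
        return "survey, then judge answers" if staff_says_responsive else "exclude"
    # Toddlers: a judgment call by the surveyor in consultation with staff.
    if age <= 3 and not staff_says_responsive:
        return "exclude"
    return "survey"
```

Writing it out this way makes the asymmetry explicit: hearing/speech-impaired children are accommodated rather than screened out, while toddlers and developmentally challenged children are screened on responsiveness.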

PLS 595 JOURNAL ENTRY 15

Survey Training and Protocols

I met with Kevin and April today. I gave them a student copy of SPSS 14 and helped them load it onto Kevin’s desktop computer. Then I sat down and trained them on how best to enter data into their new datasets. They both seemed very engaged and asked lots of good questions. I did not get to know April at all until recently. She has a keen eye and an excellent memory for details. She will make an outstanding project manager.

I offered my suggestions to help make data entry a little easier. I opted not to build a separate Access database: that would mean duplicated effort at data entry time and more opportunities for entry errors, and the added convenience did not seem worth it. Ultimately, an electronic form would be the easiest means of data entry, and probably more accurate, too, but that would not work with the student version of SPSS. I believe only the full version supports the SPSS form generator. It is out there, though.

I made a number of recommendations and went over the data collection protocols with them, as I had discussed them with Aly and Allison. In order to ensure that I have covered this thoroughly and we are all on the same page, I am supplying some explicit protocols to guide the survey:

  1. Only ONE surveyor should talk with each subject.
  2. This surveyor should ideally have NO core service delivery contact with the subject.
  3. NO identifying numbers or names should be put onto the forms.
  4. NO parent, guardian or family member should be in the room during the child’s survey. They can be filling out their survey at the same time.
  5. For the child survey, the surveyor fills out the form based on the child’s responses.
  6. For the family and team surveys, each subject fills out his or her own form and hands it back to the surveyor when complete. The surveyor ensures all questions are answered.
  7. The surveyor should fill out a cover sheet with accurate information taken from the case file, then attach one copy of the completed cover sheet to both client surveys. Then the surveys are inserted into the box.
  8. After that, NO one except the person assigned to data entry should handle or look at the surveys until time for entry. To do so could harm objectivity and cast doubt on the results.
  9. At data entry time, each case is assigned a unique ID number in the dataset, starting at 1. Each combination of child and parent/caretaker (or other family member) constitutes a case. Write the case number at the top of the form.
  10. After data entry, the forms should be kept together in numerical order for data checking later.
  11. The forms should be sealed into manila envelopes and dated at the end of each week. The envelopes should be kept together in a safe, locked place.
  12. Later on, someone other than the data enterer will check the data entry during the data “cleaning” process. This will probably be an MPA student or other volunteer.
  13. The forms are to be shredded after data cleaning.
  14. There will be no attempt to connect the results of the first two client surveys with the three-month client survey. Anonymity must be maintained.
  15. Any deviations from these instructions should be addressed with me or another MPA student ahead of time.
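The case-numbering and completeness checks in protocols 6 and 9 can be sketched in code. This is a minimal illustration only; the question counts, field names, and the idea of batch-entering from paired paper forms are my assumptions, not part of the actual SPSS workflow described above.

```python
# Hypothetical sketch of protocols 6 and 9: sequential case IDs starting
# at 1, plus a completeness check before a form is accepted. Question
# counts and field names are illustrative assumptions.

CHILD_QUESTIONS = 10   # assumed number of items on the child survey
FAMILY_QUESTIONS = 12  # assumed number of items on the family survey

def enter_cases(paper_forms):
    """Assign each child/family pair a sequential case ID starting at 1,
    and flag any form with missing or unanswered questions for follow-up.

    paper_forms: list of (child_answers, family_answers) tuples, where an
    unanswered question is recorded as None.
    """
    dataset, problems = [], []
    for case_id, (child, family) in enumerate(paper_forms, start=1):
        if len(child) != CHILD_QUESTIONS or None in child:
            problems.append((case_id, "child survey incomplete"))
        if len(family) != FAMILY_QUESTIONS or None in family:
            problems.append((case_id, "family survey incomplete"))
        dataset.append({"case": case_id, "child": child, "family": family})
    return dataset, problems
```

A second person re-running the same check during cleaning (protocol 12) would catch entry errors before the paper forms are shredded.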