My-Peer Toolkit

Data are usually collected through qualitative and quantitative methods. 1 Qualitative approaches aim to address the ‘how’ and ‘why’ of a program and tend to use unstructured methods of data collection to fully explore the topic. Qualitative questions are open-ended, such as ‘why do participants enjoy the program?’ and ‘how does the program help increase self-esteem for participants?’. Qualitative methods include focus groups, group discussions and interviews. Quantitative approaches, on the other hand, address the ‘what’ of the program. They use a systematic, standardised approach, employ methods such as surveys, 1 and ask questions such as ‘what activities did the program run?’ and ‘what skills do staff need to implement the program effectively?’

Both methods have their strengths and weaknesses. Qualitative approaches are good for further exploring the effects and unintended consequences of a program. They are, however, expensive and time-consuming to implement. Additionally, the findings cannot be generalised to participants outside the program and are only indicative of the group involved. 1

Quantitative approaches have the advantage that they are cheaper to implement, are standardised so comparisons can easily be made, and the size of the effect can usually be measured. Quantitative approaches, however, are limited in their capacity to investigate and explain similarities and unexpected differences. 1 It is important to note that for peer-based programs, quantitative data collection often proves difficult for agencies to implement: they may lack the resources needed to administer surveys rigorously, and low participation and high loss-to-follow-up rates are common.

Mixed Methods

Is there a way to achieve both the depth and breadth that qualitative and quantitative methods may achieve individually? One answer is to consider a mixed methods approach as your design, combining both qualitative and quantitative research data, techniques and methods within a single research framework. 2

A mixed methods approach may mean a number of things: using several different types of methods in a study, or at different points within a study, or using a mixture of qualitative and quantitative methods. 3,4

Mixed methods encompass multifaceted approaches that combine to capitalise on strengths and reduce weaknesses that stem from using a single research design. 4 Using this approach to gather and evaluate data may help increase the validity and reliability of the research.

Some of the common areas in which mixed-method approaches may be used include:

Some of the challenges of using a mixed methods approach include:

These challenges call for training and multidisciplinary collaboration, and may therefore require greater resources (both financial and personnel) and a higher workload than using a single method. 4 However, this may be mitigated by identifying key issues early and ensuring the participation of experts in both qualitative and quantitative research. 2

Mixed methods are useful in highlighting complex research problems such as disparities in health and can also be transformative in addressing issues for vulnerable or marginalised populations or research which involves community participation. 3 Using a mixed-methods approach is one way to develop creative options to traditional or single design approaches to research and evaluation. 5

Surveys

Surveys are a good way of gathering a large amount of data, providing a broad perspective. Surveys can be administered electronically, by telephone, by mail or face to face. Mail and electronically administered surveys have a wide reach, are relatively cheap to administer, produce standardised information, and allow privacy to be maintained. 1 They do, however, tend to have low response rates, cannot investigate issues in any great depth, require that the target group is literate, and do not allow for any observation. 1

As surveys are self-reported by participants, there is a possibility that responses may be biased, particularly if the issues involved are sensitive or require some measure of disclosure and trust from the participant. It is therefore vital that surveys are designed and tested for validity and reliability with the target groups who will be completing them.

Careful attention must be given to the design of the survey. Where possible, using an existing, validated survey instrument will help ensure that the data being collected are accurate. If you design your own survey, it is necessary to pilot test it on a sample of your target group to ensure that the instrument measures what it intends to measure and is appropriate for the target group. 1

Questions within a survey can be asked in several ways, including closed questions, open-ended questions, scaled questions, and multiple choice questions. Closed questions are usually in the format of yes/no or true/false options. Open-ended questions, on the other hand, leave the answer entirely up to the respondent and therefore provide a greater range of responses. 1 Scales are useful when assessing participants’ attitudes. A multiple choice question may ask respondents to indicate their favourite topic covered in the program, or their most preferred activity. Other considerations when developing a survey instrument include question sequence, layout and appearance, length, language, and an introduction and cover letter. 1 Sensitive questions should be placed near the end of a survey rather than at the beginning.
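Once scaled-question responses have been collected, they can be summarised very simply. The sketch below is purely illustrative — the ratings and the agreement threshold are invented, not drawn from any real program — and shows how a five-point scaled question might be tallied:

```python
# Hypothetical example: summarising responses to a five-point scaled
# (Likert-style) survey question. The data are invented for illustration.
from collections import Counter
from statistics import mean

# 1 = strongly disagree ... 5 = strongly agree
responses = [4, 5, 3, 4, 2, 5, 4, 3, 5, 4]

counts = Counter(responses)          # how many respondents gave each rating
average = mean(responses)            # overall mean rating
# Proportion who agreed or strongly agreed (rated 4 or 5) -- the cut-off
# of 4 is an arbitrary choice for this example
agree_rate = sum(1 for r in responses if r >= 4) / len(responses)

print(f"Mean rating: {average:.1f}")        # Mean rating: 3.9
print(f"Agreement rate: {agree_rate:.0%}")  # Agreement rate: 70%
```

Reporting both the mean and the distribution of ratings (rather than the mean alone) avoids hiding a split between strong agreement and strong disagreement.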

Offering young people an incentive for completing the survey or embedding the survey as a compulsory item within the program schedule or curriculum may be useful to maximise the response rate.

Interviews

Interviews can be conducted face-to-face or by telephone. They can range from in-depth and semi-structured to unstructured, depending on the information being sought. 6

Face-to-face interviews are advantageous since:

Disadvantages of face-to-face interviews include:

Telephone interviews, according to Bowling, 6 yield data just as accurate as those from face-to-face interviews.

Telephone interviews are advantageous as they:

Disadvantages of telephone interviews include:

Focus groups

Focus groups or group discussions are useful to further explore a topic, providing a broader understanding of why the target group may behave or think in a particular way, and assisting in determining the reasons behind attitudes and beliefs. 1 They are conducted with a small sample of the target group and are used to stimulate discussion and gain greater insights. 6

Focus groups and group discussions are advantageous as they:

Disadvantages of focus groups include:

Documentation

Substantial description and documentation, often referred to as “thick description”, can be used to further explore a subject. 7 This process provides a thorough description of the “study participants, context and procedures, the purpose of the intervention and its transferability”. 7 Thick description also includes the complexities experienced in addition to the commonalities found, which assists in maintaining data integrity.

The use of documentation provides an ongoing record of activities. These can include records of informal feedback and reflections through journals, diaries or progress reports. The challenge of documentation is that it requires an ongoing commitment to regularly document thoughts and activities throughout the evaluation process.

Creative strategies

Drama, exhibition, and video are imaginative and attractive alternatives to the written word. 8 These approaches can be used to demystify the evaluation process. Using creative arts in evaluation offers opportunities for imaginative ways of understanding programs and creating evaluation knowledge. The creative arts may be used in designing, interpreting, and communicating evaluations. 9 The direct perception and understanding that a creative arts approach brings helps the evaluator gain a deep understanding of the program. In addition, this approach is a useful means of connecting with participants’ experience in an evaluation. 9

Creative strategies are advantageous as they:

Challenges arising from creative strategies include:

There are many forms of creative strategies that can be explored.

Triangulation

Triangulation is used to address the validity of the data. 10 Triangulation methods use multiple forms of data collection, such as focus groups, observation and in-depth interviews, to investigate the evaluation objectives. Using multiple data collection methods supports the reliability and validity of findings when the data from the various sources are comparable and consistent. 11,12 Using more than one person to collect the data can also increase its reliability; this, however, will significantly increase the cost of the evaluation. Additionally, theory triangulation provides new insights by drawing on multiple theoretical perspectives. 13
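When more than one person collects or codes the same data, their agreement can be quantified before the data are treated as reliable. One standard measure is Cohen’s kappa, which corrects raw agreement for chance. The sketch below implements the textbook formula; the two evaluators’ codings and the ‘pos’/‘neg’ labels are invented purely for illustration:

```python
# Cohen's kappa: chance-corrected agreement between two raters.
# The example codings below are hypothetical, not real evaluation data.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Return Cohen's kappa for two equal-length lists of category labels."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: proportion of items coded identically
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected (chance) agreement: sum over categories of the product
    # of each rater's marginal proportions
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    categories = set(rater_a) | set(rater_b)
    expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)
    return (observed - expected) / (1 - expected)

# Hypothetical codings of ten interview excerpts by two evaluators
a = ["pos", "pos", "neg", "pos", "neg", "neg", "pos", "pos", "neg", "pos"]
b = ["pos", "neg", "neg", "pos", "neg", "pos", "pos", "pos", "neg", "pos"]
print(f"kappa = {cohens_kappa(a, b):.2f}")  # kappa = 0.58
```

A kappa near 1 indicates near-perfect agreement, while a value near 0 means the raters agree no more often than chance would predict; the example above shows moderate agreement.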

References

  1. Hawe, P., Degeling, D., & Hall, J. (1990). Evaluating Health Promotion: A Health Worker’s Guide. Sydney: MacLennan & Petty.
  2. Taket, A. (2010). In Liamputtong, L. (Ed.), Research Methods in Health: Foundations for Evidence-Based Practice. South Melbourne: Oxford University Press.
  3. Hanson, W. E., Creswell, J. W., Plano Clark, V. L., Petska, K. S., & Creswell, J. D. (2005). Mixed methods research designs in counseling psychology. Journal of Counseling Psychology, 52(2), 224–235.
  4. Leech, N. L., & Onwuegbuzie, A. J. (2009). A typology of mixed methods research designs. Quality & Quantity, 43, 265–275.
  5. Greene, J. C., & Caracelli, V. J. (2003). Making paradigmatic sense of mixed methods practice. In Tashakkori, A., & Teddlie, C. (Eds.), Handbook of Mixed Methods in Social and Behavioral Research (pp. 91–110). Thousand Oaks, CA: Sage.
  6. Bowling, A. (1997). Research Methods in Health: Investigating Health and Health Services. Open University Press.
  7. Nastasi, B., & Schensul, S. (2005). Contributions of qualitative research to the validity of intervention research. Journal of School Psychology, 43(3), 177–195.
  8. Curtis, L., Springett, J., & Kennedy, A. (2001). Evaluation in urban settings: the challenge of Healthy Cities. In Rootman, I., & Goodstadt, M. (Eds.), Evaluation in Health Promotion: Principles and Perspectives. World Health Organization Regional Office for Europe.
  9. Simmins, H., & McCormack, B. (2007). Integrating arts-based inquiry in evaluation methodology: opportunities and challenges. Qualitative Inquiry, 13(2), 292–311.
  10. Barbour, R. (2001). Education and debate. British Medical Journal, 322(7294), 1115–1117.
  11. Golafshani, N. (2003). Understanding reliability and validity in qualitative research. The Qualitative Report, 8(4), 597–607.
  12. Ovretveit, J. (1998). Evaluating Health Interventions. Berkshire: Open University Press.
  13. Nutbeam, D., & Bauman, A. (2006). Evaluation in a Nutshell. North Ryde: McGraw-Hill.
