Beginning Evaluation with a Data Justice Lens 

by Ellen Mueller

Three young people sitting around a table with laptops in front of them.
Photo Credit: Brooke Cagle via Unsplash

Plan your evaluation approach using data justice principles, which promote truth and storytelling based on consent.


The best time to rethink an evaluation process is at the very beginning.

This is the ideal moment to identify specifically why you are collecting information, to tweak your line of questioning to avoid bias, and to trim out unnecessary data collection by following data justice principles.

It is also an ideal moment to double-check the security of your data collection and storage, and to formulate plans for sharing findings back with the community and partners.

These steps are some of the fundamentals of data justice.

“Data justice values the dignity, privacy, and humanity of all people. Data justice is active resistance to all oppressive systems within the arena of data as a part of building a better world that promotes dignity, consent, truth, accountability, and learning.”

Metropolitan Alliance of Connected Communities
A group of people sitting on chairs in a circle as one person gestures and talks. Some have sheets of paper on their lap, while others take notes on their laptops.
Photo Credit: Levi Shand / ACRE

What is data justice? 

The Metropolitan Alliance of Connected Communities defines data justice as follows: “Data justice is the collection, analysis, sharing, and use of data entirely in service of and with accountability to participants and their communities. Data justice is a lens to evaluate the intended or unintended consequences of researching or collecting data on individuals and communities. Data justice values the dignity, privacy, and humanity of all people. Data justice is active resistance to all oppressive systems within the arena of data as a part of building a better world that promotes dignity, consent, truth, accountability, and learning. This work will be necessary as long as oppressive systems exist.”

Readers can find more detailed information at the MACC website.

How does data justice apply to the beginning, middle, and end of a project? 

At the beginning of a project, it is important to pause and reflect on your processes.

Start by identifying how the data you plan to collect will be useful to individuals, communities, and the organization, and how it will fit into a specific utilization plan. Once you have articulated this, it becomes your ‘why’ statement, which answers, “Why collect this data?” Plan to share this ‘why’ statement with all partners and participants.

Additionally, before starting the project, it is best to limit the size and scope of your evaluation process as much as possible. Recognize that data has financial costs (wages for staff and technical infrastructure) and human costs (surveys can act as barriers to services and can pose retraumatizing questions, which should be avoided).

At this point in the process, cut any data collection that does not serve participants and their communities, such as data collected ‘just in case’ or just because it would be ‘nice to have’. Ask yourself, “What would happen if we didn’t collect ___?”

Once you’ve determined what you will collect, have sensitivity in evaluation design, questions, and purpose. Review materials for biases and potential to cause harm. Elevate diverse voices and perspectives, as well as accessibility and inclusion, at all stages. Prioritize evaluation that contributes to program improvement, equity, access, and organizational learning.

This might also be a moment to revisit your ‘why’ statement in case your reflections have changed any of your answers.

People sit in a group reading books. Behind them is a set of bookshelves filled with books, sculptures, and plants.
Photo Credit: Indigenous Roots

Next, double-check the security and privacy plan for storing the data you plan to collect. Once that is confirmed, you can share the evaluation engagement expectations with partners and participants as you start your time together. Be open to feedback and to iteration based on that feedback. Allow people to opt out without threat of losing funding or other support/activities.

Towards the middle of a project, it’s a good idea to check in with your partners and participants to briefly ensure that needs are being met. A simple three-question check-in can help people provide concise feedback.

  1. What’s working?
  2. What could be improved?
  3. Is there anything else we should know?

These three questions provide an opportunity to iterate and improve before the conclusion of the project, whenever possible.

At the end of a project, there is often a final evaluation. At this time, any narratives and information about communities and individuals should be true and not fabricated or assumed. Cross-check narratives and information with the partners and participants. Additionally, consider the balance of power in situations in which data are shared.

Finally, make a plan to give the data back to the community formatted as a usable tool (potentially for advocacy). This might take the form of a presentation, a 1-sheet summary, or other format appropriate to the context. 

Try it for yourself

Are you feeling inspired? It’s time to try it out using our worksheet.

The worksheet is available as a PDF or Word download.

Steps to start off your evaluation journey: 

  1. Identify your “why” statement.
     Identify how the data you plan to collect will be useful to individuals, communities, and the organization, and how it will fit into a specific utilization plan (this is your ‘why’ statement: why collect this data?). Plan to share this ‘why’ statement with all partners and participants.

  2. Limit the size and scope of your evaluation process.
     Recognize that data has financial costs (wages for staff; technical infrastructure) and human costs (surveys can act as barriers to services and can pose retraumatizing questions, which should be avoided). Do not collect data that do not serve participants and their communities (for example, collecting info ‘just in case’ or just because it would be ‘nice to have’). Ask yourself, “What would happen if we didn’t collect ___?”

  3. Have sensitivity in evaluation design, questions, and purpose.
     Review materials for biases and potential to cause harm. Elevate diverse voices and perspectives. Prioritize accessibility and inclusion. Prioritize evaluation that contributes to program improvement, equity, access, and organizational learning.

  4. Have a plan to treat all collected data with care.
     Have a plan to securely and privately store all data.

  5. Share the evaluation engagement expectations with participants.
     At the start of your time together, give potential participants the option to opt out without threat of losing funding or other support/activities.

  6. Cross-check collected data with partners and participants.
     After collection, cross-check narratives and information about communities and individuals; these should not be fabricated or assumed. What is your plan to cross-check collected data?

  7. Consider the balance of power in situations in which data is shared.
     Consider the position of the participant as you are requesting and collecting data. Where might the participant feel pressure to answer or submit particular information due to systemic power relationships and hierarchies?

  8. Make a plan to give data back to the community.
     Make a plan to give data back to the community formatted as a usable tool (potentially for advocacy).

Resources used in developing this tool: 

Metropolitan Alliance of Connected Communities, “Data Justice.” 

Metropolitan Alliance of Connected Communities, “Data Justice for Nonprofits” panel presented at the MCN 2022 conference by Alicia Ranney, Adam Cowing, Emily Barter, Joylenna Garcia, Rebecca Mino, and Lucy Geach.