Isn't it so gratifying to learn a new skill and get to apply it?
One thing I've been learning lately is how to use ArcGIS, a super fancy mapping tool that allows you to collect, analyze, and visualize all sorts of data.
I've talked about mapping with clients and at conference presentations for a while, and I've loved using public (read: FREE) mapping tools to learn more about the communities I was serving or studying.
I’ve used maps in many ways – describing the community for grants or needs assessments, determining which students need home visits, or figuring out which resources are near students’ homes.
Most recently, I've gotten to use maps through my part-time work as a researcher at Ohio State.
We were trying to figure out if the students in our college had practicum placements within federally designated "medically underserved communities."
Using a free public map file from a government agency and uploading a list of addresses where our students were placed, I was able to instantly visualize and (through ArcGIS's fancy tools) analyze the percentage of our students working within underserved communities.
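If you're curious what that analysis looks like under the hood, here's a minimal sketch of the same idea in plain Python: given the boundary polygons of underserved areas (which in a real project would come from the agency's map file, with addresses geocoded to coordinates) it computes the share of placements that fall inside. All the coordinates and names below are made-up toy data, not the real workflow.

```python
# Toy sketch of the underserved-area check: given polygon boundaries and
# geocoded placement points, compute the share of placements inside any area.

def point_in_polygon(x, y, polygon):
    """Ray-casting test: is (x, y) inside the polygon (list of (x, y) vertices)?"""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Toggle each time a ray to the right of the point crosses an edge.
        if (y1 > y) != (y2 > y):
            if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                inside = not inside
    return inside

def percent_underserved(placements, areas):
    """Percentage of placement points that fall inside any underserved area."""
    hits = sum(
        any(point_in_polygon(x, y, area) for area in areas)
        for x, y in placements
    )
    return 100 * hits / len(placements)

# Hypothetical data: one square "underserved" area and four placements.
areas = [[(0, 0), (10, 0), (10, 10), (0, 10)]]
placements = [(5, 5), (1, 9), (15, 5), (12, 12)]
print(percent_underserved(placements, areas))  # → 50.0
```

A GIS tool like ArcGIS does this with a spatial join on real map layers, but the core question it answers is exactly this point-in-polygon count.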
Seeing it all come together was magical.
It painted such a clear picture of the impact of our college and the difference that our students are able to make.
Given the disproportionate impact of COVID-19 on disadvantaged communities and the rising awareness of how systemic racism shapes, well, every aspect of community life, knowing what children and families need outside of school – and acting on it – is critical.
Using data to pinpoint which students are at the greatest risk of disengaging from online school or whose families struggle to meet basic needs is essential for targeting interventions and outreach.
Here's a list of free mapping tools to get you started (from my May post on the AEA365 blog):
City and county agencies also have amazing resources. See if your health or police departments, school districts, or universities have online tools for exploring your area.
Here’s a map I created of the schools, hospitals, and other services where I live, from the city’s mapping tools:
To get started mapping your own data, I always recommend starting with Google Maps!
Beyond being a lifesaver for those with a poor sense of direction like me, Google Maps offers a free tool for creating your own maps. You can map multiple data sources and use colors, symbols, and labels to make sense of your data.
Then go play! See what maps you can create of your community.
Now think about each of your students and families as dots on that map. Imagine what they might see every day when they walk down the street. What resources are available to them in their community? In what ways is their community potentially putting them at risk?
You can use what you learn from your mapping explorations to influence survey questions and interview protocols for students and families. For example, if there has been a recent rise in crime rates in a neighborhood, ask families and students if they feel safe and what the school or district could do to make them feel safer.
Certainly, your findings can also help you figure out what services to offer within your program or school.
It's been so enjoyable to build my mapping skillset and explore a whole new way of looking at data. I hope you take some time to play around with these mapping tools and see what you can learn!
This week I've got a co-author to help me continue my qualitative data series!
Sarah Dunifon is Founder and Principal Evaluator of Improved Insights LLC, an educational evaluation firm focused on STEM and youth-based programming. She is based in Cleveland, Ohio and is a fellow board member of the Ohio Program Evaluators Group.
We hope you enjoy our post below.
Qualitative data can be a bit elusive.
It’s not usually too hard to find data for things that are measurable. We know we can do surveys, or count the number of attendees, or track patterns over time.
Qualitative data though - the context for those numbers - often takes a little more work to track down. Of course, we can always do interviews and focus groups with stakeholders to learn about their experiences, our usual go-tos.
However, if you think of qualitative data for what it is - simply put, another information source - you’ll find that so many other forms of it are hiding in plain sight.
Think about the chatbox in your last Zoom session - you may not have realized it, but that’s a source of qualitative data! Other sources you may have readily available include the phone call logs that your teachers keep when they call families or even the observations you did of an event (online, drive-up, or fully in-person).
If you need more, there are lots of ways of collecting qualitative data, and many of them are even more prevalent now in our almost fully virtual world.
This makes our lives a lot easier, as we prepare to write our annual reports, apply for grants, or share the impact that our program had during this unusual year.
Like I mentioned in my last post, sharing the context for our quantitative findings can make those reports tell a much richer story.
Yet it’s not always intuitive to know how to turn a whole bunch of text into these powerful programmatic insights.
So when you find these sources of qualitative data, what do you do with them?
We can actually find patterns in our data by assigning thematic codes to different words, phrases, or even images. Sometimes, you start with a set of codes that have to do with your program goals, or the research concepts underlying your program.
Other times, you just code as you go. If you start to see a lot of mentions of a particular topic, that topic can become a code.
Coding can take many forms, and there is fancy software that can help you do it, but sometimes all you need is a notebook and some markers or a color-coded spreadsheet.
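To make the coding idea concrete, here's a minimal sketch of the "code as you go" approach in Python: a small keyword-to-code lookup (a codebook) is applied to each open-ended response, and a tally shows which themes come up most. The codebook and responses below are made-up examples, not data from a real program.

```python
from collections import Counter

# Illustrative codebook: keywords mapped to thematic codes.
CODEBOOK = {
    "game": "games & activities",
    "activity": "games & activities",
    "snack": "food",
    "pizza": "food",
    "friend": "social connection",
}

def code_responses(responses, codebook=CODEBOOK):
    """Tally thematic codes across open-ended responses."""
    tally = Counter()
    for response in responses:
        text = response.lower()
        # A response can receive more than one code, but each code
        # is counted at most once per response.
        codes = {code for keyword, code in codebook.items() if keyword in text}
        for code in codes:
            tally[code] += 1
    return tally

responses = [
    "I loved the animal games!",
    "The pizza snack was my favorite",
    "Doing the activity with my friend",
]
print(code_responses(responses))
```

This is exactly what the notebook-and-markers version does by hand; the spreadsheet or software just makes the counting automatic once your codebook stabilizes.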
Below you can see some sample data about an after-school program focused on science and animals that we’ve color-coded according to the themes we saw.
In one glance, you can see that our participants liked a lot of aspects of the program, but games and activities (in blue) and the food (in pink) got the most mentions.
Coding allows us to see what’s happening across the dataset and pull out themes or key insights that we need to highlight.
Sharing your qualitative data analysis can be an important addition to your data story when demonstrating the impact of your work. It can add relevance, personality, and context to quantitative data by illustrating individual effects.
By reviewing our datasets systematically, we can also find some incredible quotes - the kind you would never attempt to paraphrase if you were writing a paper because they were so perfectly worded -- and let our stakeholders’ words shine.
You can feature key quotations by offsetting them or putting them in a different color in your report to highlight individual experiences and catch readers’ attention.
Another popular way to display qualitative data is in a word cloud.
Word clouds are visual representations of keywords that come up frequently in a set of qualitative data. Typically, the bigger the word, the more frequently it appeared in a data set.
There are plenty of critiques of word clouds in the data visualization space and rightly so - word clouds can often obscure meaning rather than clarify it. So if you are going to use them, here are three things you should know:
1. Give the data a good cleaning to remove anything that you don’t want represented in the visual.
Here, we’d recommend removing any responses that do not give value (e.g., “idk,” “I’m not sure,” “Nothing,” etc.) as well as any text surrounding the main themes (e.g., “I like the [...],” “I love [...],” “my favorite thing is [...],” etc.).
2. Consider the messages or key points you see in the data that you wish to convey visually. If it is possible to condense themes further or pull out important words, now is the time to do so.
This might mean collapsing phrases, as best as possible, into a single word or a few meaningful words.
3. Make sure to keep the essence of the data - meanings can be misconstrued when collapsing phrases into single words or shorter phrases.
If you’re finding this is happening, perhaps a word cloud is not the best way to display your data.
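The cleaning step above can be sketched in a few lines of Python: drop the no-value responses, strip out the filler phrases, and count what remains, which is the frequency list a word cloud is built from. The stopword and filler lists here are illustrative, not exhaustive, and you'd tailor them to your own data.

```python
import re
from collections import Counter

# Illustrative cleaning lists for a word-cloud frequency count.
NO_VALUE = {"idk", "i'm not sure", "nothing"}
FILLERS = {"i", "like", "love", "the", "my", "favorite", "thing", "is", "a", "and", "was"}

def word_frequencies(responses):
    """Count theme words after removing no-value responses and filler words."""
    counts = Counter()
    for response in responses:
        if response.strip().lower() in NO_VALUE:
            continue  # skip responses with no analytic value
        words = re.findall(r"[a-z']+", response.lower())
        counts.update(w for w in words if w not in FILLERS)
    return counts

responses = ["I love the animal games", "idk", "My favorite thing is the games"]
print(word_frequencies(responses).most_common(2))
# → [('games', 2), ('animal', 1)]
```

Feeding these cleaned frequencies into a word cloud generator is what turns a "raw" cloud dominated by "like" and "the" into one where the actual themes pop.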
However, with data cleaning and basic analysis, the word cloud can change drastically.
Take a look here at three versions of the same word cloud we generated on WordItOut using the data we shared earlier. The first was created with original - or “raw” - data, the second with cleaned data, and the third with some basic analysis and condensing.
Notice how the prominent words change with each version, and how the meaning and key messages can shift.
As you can see, while word clouds are one of the most accessible forms of qualitative data displays, they take some work to be most effective.
However, word clouds aren’t your only option. Data visualization experts like Stephanie Evergreen, Storytelling with Data, and Depict Data Studio all have great resources on different qualitative data displays.
The case is clear - with some simple analysis and visualization, qualitative data can be a powerful addition to your data story.
You should know by now that I'm a bit of a data nerd.
I love spreadsheets. I love organizing data and using it to illuminate patterns. I love the "ah-ha" moments when clients realize how much their own data can tell them about the kids and families they're serving.
So it may surprise you that I'm here to say that numbers and spreadsheets don't tell us everything.
That doesn't mean that numbers (or quantitative data) are irrelevant.
It just means that they are even more informative when paired with stories, quotations, or anecdotes (qualitative data).
(See the box for a quick refresher on the difference between the two).
Here's an example. Yesterday, I was re-reading an article from The Columbus Dispatch, my local paper, about the spike in youth violence that has occurred during the pandemic.
It's been horrible to hear about how many children and teens (well, really anyone, for that matter) have been victims of gun violence since the spring.
The article cites a number of statistics -- that the number of children treated in Columbus for gunshot wounds this spring and summer was double the 2019 figure (from 16 to 32), and that children from racial or ethnic minorities are twice as likely to be shot as white children.
Those are AWFUL statistics - and they certainly help me see that there is a dire situation here.
But then, the article talks to a teacher whose student -- an eight year-old boy -- was killed. Here's what the article shares about (and from) the teacher:
Thalgott has lost a handful of former students during her 20 years of teaching on the South Side. She's seen even more students who have lost a parent to gun violence.
Having myself lost some former students or their family members to gun violence -- either as victims or perpetrators -- I find this quote really gets to me.
It conjures up such raw emotions that it suddenly puts the cited statistics into context.
Those 32 children are somebody's child, somebody's sibling, somebody's student, somebody's mentee. Hearing from a person who actually experienced that loss made a big difference in how I processed this article. I imagine it did for you too.
Quantitative data can be so powerful, but its impact is amplified when we lift up the voices of those we are serving or studying.
Qualitative data -- gathered through interviews, focus groups, open-ended survey questions, or observations -- can sometimes more effectively communicate the experience of what is happening in your school or community.
I'll be doing a series of posts on qualitative data over the next few weeks -- how to collect it, how to use it, and how using a combination of data can truly help you tell your story.
Sometimes you launch a survey, and you're blown away by the number of responses you get.
And sometimes, you're not.
I had one of these moments last week.
I was SUPER excited to try something new with my blog and launch a survey to hear what readers wanted to see in future posts.
I sent out my blog to my email list, waiting with bated breath (okay, maybe I'm being dramatic) for all of the responses to pour in.
... And then I realized that the survey I embedded didn't even show up in the email.
INSERT FACE PALM EMOJI.
Let's just say that I didn't get the response rate I was hoping for.
But here's the thing - it's okay to have a survey fail. All hope is not lost.
If you don't get the response rate you were hoping for, take a step back and consider:
For me, technical issues definitely got the best of my survey attempt, but I also think a reminder wouldn't hurt.
So here's my plug:
I'd love to know more about what you want to learn! I'd appreciate it if you could take a minute to share your thoughts and preferences with me. I'll report the results and use them to make this blog even more beneficial for you.
The survey is embedded below, but if it's not loading for you, click the button to go right to it.
We have all taken TONS of surveys in our lifetime.
We get surveys when we make an online purchase, when we speak with a customer service agent, when we want to get a free gift card, and even when we go to the hospital.
We're all pros at taking surveys... and we all know when we're taking one that's TERRIBLY designed.
For me, if I don't feel like I can answer the questions, or if it gets too long or overly annoying, I'm out.
And that organization just lost a respondent.
I don't want that to happen to you -- because in education, surveying our stakeholders is SO important. It shows that we value our stakeholders' opinions, feedback, and experiences.
We can't afford to lose respondents because of iffy survey design.
Here are a few of my tips for upping your survey game:
1. Ask only what's really important.
Make a list of what your team is wondering about or what the impact of your proposed projects/plans might be before you draft your survey questions.
Keep it short and sweet ... if it's not related to those things, don't include it.
2. Reach respondents where they are.
Think of all of your touch points with your key stakeholders. Students may log in for online class, families may check social media for updates, and all of your stakeholders may access meal sites.
At all of these venues, you can easily ask about needs, satisfaction with the school's efforts, or other questions you may have.
You can also get feedback through polls in Google Classroom, Zoom, via text message, or even on social media.
3. This may seem obvious, but ... make it easy for respondents to actually answer your questions.
Keep the language clear and simple so a person of any reading level can understand it.
Never ask about more than one topic in a single question, and try to avoid giving a neutral middle answer option when you can.
(In both of these cases, it's very hard for you to actually learn anything from the data.)
And of course, if you work with communities who speak languages other than English, find a way to translate your survey into their language.
Translation is a much tougher process than it should be, but it is essential for making all of your families feel valued and for hearing from your entire community, not just one subset.
All that being said, I feel the same way about you - my colleagues, clients, and readers.
I want to know what's important to you and what would be helpful for me to cover on the blog.
I hope that you've been inspired by this post and will take my brief survey below.
I appreciate your feedback and will use it to generate future content for you!
When I started my doctoral program at Vanderbilt, I certainly didn't expect to get into a ... heated discussion, shall we say? ... with the professor of my first course.
We were discussing characteristics of effective leaders, and our professor mentioned that the Myers-Briggs Type Indicator, one of the most well-known personality tests, was essentially worthless.
You see, despite its incredible popularity, there is actually no data to show that Myers-Briggs is a valid and reliable assessment -- that it measures what it intends to, and that you'd consistently get the same outcomes if you took it again and again.
Now, I've always been a pretty introspective person, and I (still) love personality tests as a fun way to reflect on how I think, feel, and interact with others. I'd never taken them as a scientific assessment of my psyche, but Myers-Briggs especially had stood out to me as a somewhat revelatory framework for why people interact and act the way they do.
I had always gotten the exact same result when I'd taken the Myers-Briggs (ENFJ, if you're curious), so when my professor started talking about how most people get quite different results each time they take it, and that there was no research to support its utility, part of me was bummed, and part of me was fired up.
I argued (civilly, of course) that I didn't use it as a formal diagnostic tool, but instead as a helpful resource or an interesting way of looking at things. So why should it matter? (Newsflash: It does matter.)
For fun, I recently read The Personality Brokers: The Strange History of Myers-Briggs and the Birth of Personality Testing by Merve Emre. Of course, she confirmed what my professor had said many years ago. However, it reminded me of something I see often in education.
People who are passionate about helping children and families often feel that they KNOW that what they're doing is helping the communities they serve, even without any real data to back it up.
We KNOW that our Family Science Night was a success because there were lots of families there, and everyone enjoyed themselves. We BELIEVE that a teacher is effective because the children love them. We FEEL the impact of an after-school program because, well, it's been in the community forever.
Unfortunately, we can't rely on gut instincts, feelings, and beliefs alone to tell us if something is effective... just like I couldn't make decisions based on only an affinity for Myers-Briggs.
Let me be clear: education, and family engagement in particular, tends to get kind of fuzzy. While we can't rely on intuition, it's also true that we can't rigorously test everything that happens in schools. We need to find a middle ground.
But this isn't just my random interest in personality theory.
When it comes to children and families, we need to make sure that what we're doing to try to help them actually works.
Luckily, it's not that hard to get started. We can begin tracking data, analyzing trends, and ultimately, measuring our impact so that we know we aren't just THINKING that we're changing lives. We actually are.
Ready to start your evidence journey?
Sign up below to get the Evidence for Engagement mini-course sent to your inbox.
I love a good spreadsheet. I mean, I really get excited about it. You may have read on the About page how my business evolved from the development of a really fancy spreadsheet. True story. Now I get to help others learn to use Excel to improve their work and watch them get excited about it too.
One positive thing to come out of the pandemic is an increased appetite for online professional development. Recently, I've gotten to connect with old friends and colleagues by providing a three-part Excel workshop series for the Family League of Baltimore. So on top of hanging out with my old network, I've gotten to teach them about all the fun stuff Excel can do. Win-win!
Here's an overview of the series:
Part 1: Excel Basics
A lot of educators just haven't been trained in how to use data. They may be consumers of it, using someone else's spreadsheet to glean information, but often, they just don't know how to utilize Excel's features for themselves. The Excel Basics workshop starts from the top and discusses formatting, functions, and formulas that beginners can use to build their Excel capacity.
Part 2: Creating and Using Templates in Excel
In the engagement world, there is so much to track! This session built on what was covered in the Basics session and walked participants through the process of designing their own customized tracking sheets. We used breakout rooms to discuss how to track different topics, and we walked through some more advanced features and functions to make these tools as automated as possible.
Part 3: Reporting and Visualizing Data in Excel
Data visualization is a hot topic in evaluation right now, and I get why. When you're able to effectively show your data graphically, you can make your results accessible for a much wider audience. In this session, we talked about so many fun parts of Excel - PivotTables, creating charts and tables for reporting, and ... drumroll, please ... creating interactive dashboards! Did you know you can create dashboards like the one below to share with your team?
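If you've never touched a PivotTable, here's a tiny stdlib Python sketch of what one does under the hood: group records by a row field and a column field, then aggregate a value into each cell. The attendance records below are made-up sample data, not from an actual workshop.

```python
from collections import defaultdict

# Made-up attendance records, like rows on an Excel tracking sheet.
records = [
    {"school": "North", "month": "Sep", "attended": 42},
    {"school": "North", "month": "Oct", "attended": 38},
    {"school": "South", "month": "Sep", "attended": 51},
    {"school": "South", "month": "Oct", "attended": 47},
]

def pivot(rows, row_key, col_key, value):
    """Group rows by (row_key, col_key) and sum the value field, PivotTable-style."""
    table = defaultdict(dict)
    for r in rows:
        cell = table[r[row_key]]
        cell[r[col_key]] = cell.get(r[col_key], 0) + r[value]
    return dict(table)

print(pivot(records, "school", "month", "attended"))
# → {'North': {'Sep': 42, 'Oct': 38}, 'South': {'Sep': 51, 'Oct': 47}}
```

In Excel, you get this same grouping by dragging fields into the Rows, Columns, and Values boxes -- no code required, which is exactly why PivotTables are such a workhorse for tracking sheets.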
Here's what some of my past workshop participants had to say about their experience:
Besides my obvious bias towards using Excel for ... well, everything ... I think it is even more important now for schools and districts to be effectively tracking their work. As we navigate through so many unknowns with school reopening, it will be critical to keep an eye on students who are at risk of falling through the cracks.
Good news - Excel can help (and so can I!).
I'd love to bring this workshop series to more places, so if your team could use a bit of an Excel boost, let's talk!
It's funny how things work out sometimes.
Tamara Hamai and I have been sowing the seeds for our new program, Evidence for Engagement, for months. Our partnership happened so organically - a meeting of the minds for two evaluators who have experience with and a passion for organizations that serve youth and families. We'd been toying with the best way to support the organizations that we serve and help them use evaluation to improve their access to funding and the children and families they serve.
Then COVID hit.
The pandemic has caused all of us to pause and re-evaluate how our work fits into a very new, very different reality. Tamara and I know that small organizations, especially those who work in schools, are struggling right now. Their access to the people they serve has been essentially cut off. We realized that organizations may need our help even more than before.
Our solution: We're running a totally free, three-week email series that will help small youth- and family-serving organizations build their evidence base (which is required under the Every Student Succeeds Act for any organization receiving federal education funds). Through videos, worksheets, frameworks, and success stories, Tamara and I will walk participants through the process of becoming evidence-based organizations and help them see this as an opportunity, not a burden.
The goal: We want to help vital, community-based organizations plan for the future, open themselves up to new opportunities, and become more sustainably funded. We're hoping that this opportunity will help them better serve youth and families, not only during this difficult period of time, but also for a long time afterward.
For us, this is also about equity. We know that for many community-based, minority-owned organizations, budgeting for evaluation is out of the question. We also know that these grass-roots organizations are having a profound impact on their communities -- and that their communities need all the support we can give. We're hoping that we can get more small, local organizations approved as evidence-based programs in their districts and begin to level the playing field.
If you think this program will benefit you and your organization, sign up below! If you know of someone else who could use this support, encourage them to join.
Ready to start your evidence journey?
Sign up below to get the Evidence for Engagement mini-course sent to your inbox.
I've been training in Muay Thai (kickboxing) for a few years. I am always learning something new and being pushed outside of my comfort zone ... and I love it.

However, I am and have always been a perfectionist. It's something I've struggled with my whole life: sometimes, I'm really proud that I've been lenient with myself, and other times, the perfectionism rears its ugly head.

Lately, I've noticed it manifesting at the gym. As I'm trying to apply a new skill in work with a partner or coach, I've been getting frustrated and self-critical. My self-protective instincts (ironically not working to appropriately block a punch or kick) have made me think, "I don't like this aspect or that skill," instead of allowing me to see that this is a process of growth and that there is no place for perfection in that process.
I think organizations (and the people within them) can be the same way when it comes to evaluation. We get used to our routines, we think we've perfected them, and then one of a few scenarios happen that push us out of our comfort zones. Maybe we are required to learn a new system or skill, or -- even worse! -- we get feedback that doesn't match our own perceptions. Now at the gym, my feedback can be a simulated sparring round that doesn't end so well for me. But in our workplaces, while we are focused on serving the people we care about, feedback that we're not doing so well is upsetting to hear and painful to accept. That upset and pain is followed by questions -- "What could we do differently?" Why is so-and-so doing well at this when we're struggling?" or even "Is this feedback accurate or reliable?" Our self-protective instincts kick in.
The anticipation of negative feedback -- in whatever form -- is a huge barrier for people (including myself!) to try new things, reflect on their own performance, or seek help and other perspectives. Certainly, the accountability culture in education has only made these innate fears and insecurities worse.
Today at the gym was different though. The past few days, I've been more reflective about why I'm getting so frustrated and how that is keeping me from truly learning and growing. So today, I tried to pay attention to the moments when I got frustrated (i.e., I collected some data on myself!). I worked with my coach to talk through those negative feelings and develop some strategies I could try in those situations. Then, I practiced and stayed open to more feedback... and by the end of the session, I felt more resilient and confident in my skills than I had in a while.
Terms and methods like "continuous improvement" and "improvement science" get used a lot in both education and evaluation, and they are proven methods for making institutional (or personal) changes on all levels. I'm sure that what I did at the gym today was just a tiny Plan-Do-Study-Act cycle. Yet for me, these formal frameworks for self-assessment and reflection can sometimes be hard to grasp - and they can feel like another thing we're accountable for doing. However, we can look at them more simply: sometimes, all we need to do is recognize that we're passionate but not perfect, allow ourselves to be open to feedback, and develop an authentic plan for how we can improve. This is true for individuals and organizations.
As an evaluator, I love the moments when conversations about data lead to a-ha moments instead of feelings of defeat. (Data visualization is especially helpful here.) Sometimes, when we take a step back and think about why we're assessing or evaluating, we can see that it's not all about accountability and funding requirements (and not about our individual or collective insecurities either). Sometimes, it is just about putting our guard down (or up, if you're at the gym), remembering that we can always do better, and learning to see our imperfections as a sign of growth in the making.
I had some great conversations this week with colleagues about establishing a culture of data in organizations and training organizations who are new to evaluation and data. These conversations reminded me about one of my favorite old blog posts, that I originally wrote for the National Association for Family, School, and Community Engagement (NAFSCE) in 2017. Given this week's discussions, it felt like a good time to bring it back into the rotation (with a few updates!).
Don't Be Scared of Data - How it Can Guide Family Engagement and Attendance Interventions
When I was a teacher, conversations around instructional data were baffling to me. Fresh out of policy school, I was eager to use what I had learned about data analysis to monitor how my students were performing, but as a social studies teacher, this task was more difficult than I had anticipated. I was required to keep a data binder, and administrators would periodically check to confirm that, well, it existed. However, I struggled to figure out what to put inside of it. My administrators did not help me understand how – absent standardized test data – I could track progress on specific standards outside of my grade book. It often felt like the conversation ended after the word “data” was uttered.
As I have focused my career on family engagement efforts, I have seen how conversations about using data to improve engagement are often greeted by the same blank stares I encountered as a teacher of a non-tested subject. On other days, talking about data elicits looks of panic or skepticism. At one particularly memorable training, community school coordinators were led in a debate about the utility of data. Sitting from my seat on the pro-data side of the room, I was amazed by arguments from the anti-data group. What resonated most is that these capable and talented colleagues understood data to simply be numbers on which their performance review was based, not as a tool to discover context and unlock insights about the families being served.
I think this belief system exists for a number of reasons. First, many educators are tired of increasing demands for data without sufficient training. Professionals need to understand how data can be collected, ways in which it should be analyzed, and how it can actually make their work easier. I have found that on-the-ground staff are often the last to receive the proper supports and professional development around understanding and using data. It becomes a symbol for all of the things we don’t like about accountability instead of the asset that it truly can be.
Perhaps more importantly, the work of engaging families – understanding needs, forming trusting relationships, and helping people when they are vulnerable – is incredibly difficult to quantify. Often, we know we have made progress or achieved results – not because of a spreadsheet or heat map – but because a family had enough food for the weekend or because a child stopped acting out as much in class. How do we tell those stories? How do we show our value as professionals when these important markers seem impossible to put into a spreadsheet? These are the critical questions we need to answer.
For these reasons, it is my mission to help educators and professionals realize that data does not have to be scary or intimidating. It does not require complex coding skills or mathematical know-how to track how clients are being served. If you had been sitting across from me in the data debate, here are some tips to get you started:
Using your organization's qualitative and quantitative data can give you amazing insight into both the ongoing needs and continuing growth of the students and families you serve. With a little less reluctance towards this approach, we can make a lot more progress in engaging families to help their children succeed.
Of course, if your organization is unsure of how to get started in this area, I'd love to be of assistance. Learn about the new Build Your Evaluation Capacity training package!
The goal of this blog is to highlight relevant issues that impact students, families, and communities and spark engaging discussions about how to address those issues through evaluation.