In my last post, I talked about how logic models -- although they can be a chore -- can actually be a great visual roadmap of your program's components, benchmarks, and goals.
Working collaboratively to develop a logic model can be a unifying experience for teams, a way to get everyone on the same page about the work.
But what keeps your finished logic model from disappearing into the dark void of your computer's file system, never to be found or opened again?
If you've been reading my blog for a while, you may know where I'm going...
You can use your logic model to help design your team's data tracking system!
Although people design them differently, logic models always have a column for either the measures that you'll use to assess progress (as I have in my template below) or the immediate outputs that would occur as a result of your activities.
If you're measuring your family engagement efforts, some examples could include:
Your logic model is basically a cheat sheet to the data points you'll want to track for your program!
So if you follow the steps in my free guide to tracking your engagement data, you'll see that you've already answered some of the questions in Step 1 -- why you're tracking the data.
I don't know about you, but I think that identifying the purpose behind each part of our work is often the hardest part. But since you've already connected your measures to your short- and long-term goals, you're ahead of the game!
With your logic model in hand, you'll be ready to tackle the rest of the questions with ease and start tracking.
If you haven't gotten your hands on my free guide, use the form below to get your copy!
"Ugh, they're making us submit a logic model? What is the point of a logic model anyway?"
"I don't understand all that technical jargon. What am I supposed to put in a logic model?"
These are the thoughts I imagine my clients having when a funder or state department of education requires them to create a logic model ... and rightfully so.
Logic models are often overly complicated, far too technical, and not connected to ACTUAL practice - so it's no wonder that they are not intuitive for people doing "the work" in education.
It doesn't have to be that way though. Let's talk about what a logic model actually is and how it can help your organization.
I found the following definition of a logic model on the Community Tool Box (emphasis added):
“A logic model presents a picture of how your effort or initiative is supposed to work… Effective logic models make an explicit, often visual, statement of the activities that will bring about change and the results you expect to see for the community and its people. A logic model keeps participants in the effort moving in the same direction by providing a common language and point of reference.”
Here's what I love about this definition:
1) It clearly conveys that a logic model is a visualization of how your program operates, and
2) It helps teams see that a logic model can actually streamline their work and make sure everyone is on the same page.
The image below from the CDC is another great way to think about what a logic model can do.
So instead of an annoying task you have to complete for grant funding, think of a logic model as a dynamic map of your program and how you will collaboratively work towards achieving your goal.
Recently, I introduced this new way of thinking to a school district team I'm working with.
I'm designing a data dashboard (learn more here!) for them so that all of their family-serving teams can share data and serve their district's families more efficiently. (I'm really excited about this.)
When we met with each team individually, they were all saying the same things about how they engage with families and what data they already track or want to track.
Yet, collectively, they couldn't see the forest for the trees. They didn't know how similarly each team was operating!
Let me be clear: this wasn't because they were not communicating or working together. It's because they didn't have a framework to guide their collective work and show where there was overlap across project teams.
So we got to work. Using Google Slides, we did an interactive work session where the teams brainstormed what they would put in each part of a logic model. Below is a screenshot of their "Activities" brainstorm.
Then we did a virtual "gallery walk" so they could see how much overlap there was. See how many "I do this too" stars there are in the image?
After this, it took no time to put together their ideas into a more traditional logic model format.
If your team is currently struggling with making a logic model, don't be afraid!
Reframing how we think about logic models can go a long way towards making them purposeful, usable tools to make our family engagement work more effective.
Isn't it so gratifying to learn a new skill and get to apply it?
One thing I've been learning lately is how to use ArcGIS, a super fancy mapping tool that allows you to collect, analyze, and visualize all sorts of data.
I've talked about mapping with clients and at conference presentations for a while, and I've loved using public (read: FREE) mapping tools to learn more about the communities I was serving or studying.
I’ve used maps in many ways – describing the community for grants or needs assessments, determining which students need home visits, or figuring out which resources are near students’ homes.
Most recently, I've gotten to use maps through my part-time work as a researcher at Ohio State.
We were trying to figure out if the students in our college had practicum placements within federally designated "medically underserved communities."
Using a free public map file from a government agency and uploading a list of addresses where our students were placed, I was able to instantly visualize and (through ArcGIS's fancy tools) analyze the percentage of our students working within underserved communities.
Seeing it all come together was magical.
It painted such a clear picture of the impact of our college and the difference that our students are able to make.
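Under the hood, that kind of spatial analysis boils down to a point-in-polygon test: for each geocoded address, check whether it falls inside a designated area, then compute the percentage that do. Here's a minimal pure-Python sketch of the idea -- the square "area" and placement coordinates below are made up for illustration, while real analysis (like ArcGIS's) uses actual geographic shapefiles and geocoded addresses:

```python
def point_in_polygon(x, y, polygon):
    """Ray-casting test: is the point (x, y) inside the polygon?"""
    inside = False
    j = len(polygon) - 1
    for i in range(len(polygon)):
        xi, yi = polygon[i]
        xj, yj = polygon[j]
        # Does a horizontal ray from (x, y) cross this polygon edge?
        if (yi > y) != (yj > y) and x < (xj - xi) * (y - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside

# Hypothetical "underserved area" boundary (a simple square for illustration)
underserved_area = [(0.0, 0.0), (4.0, 0.0), (4.0, 4.0), (0.0, 4.0)]

# Hypothetical geocoded placement sites
placements = [(1.0, 1.0), (2.5, 3.0), (5.0, 5.0), (3.9, 0.5)]

inside_count = sum(point_in_polygon(x, y, underserved_area) for x, y in placements)
pct = 100 * inside_count / len(placements)
print(f"{pct:.0f}% of placements fall within the underserved area")
```

Tools like ArcGIS do exactly this kind of containment test (plus geocoding, projections, and real boundary files) so you never have to write it yourself -- but seeing the logic helps demystify what the "fancy tools" are doing.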
Given the disproportionate impact of COVID-19 on disadvantaged communities and the rising awareness of systemic racism's effects on, well, every aspect of community life, knowing what children and families need outside of school – and acting on it — is critical.
Using data to pinpoint which students are at the greatest risk of disengaging from online school or whose families struggle to meet basic needs is essential for targeting interventions and outreach.
Here's a list of free mapping tools to get you started (from my May post on the AEA365 blog):
City and county agencies also have amazing resources. See if your health or police departments, school districts, or universities have online tools for exploring your area.
Here’s a map I created of the schools, hospitals, and other services where I live, from the city’s mapping tools:
To get started mapping your own data, I always recommend starting with Google Maps!
Beyond being a lifesaver for those with a poor sense of direction like me, Google Maps offers a free tool for creating your own maps. You can map multiple data sources and use colors, symbols, and labels to make sense of your data.
Then go play! See what maps you can create of your community.
Now think about each of your students and families as dots on that map. Imagine what they might see every day when they walk down the street. What resources are available to them in their community? In what ways is their community potentially putting them at risk?
You can use what you learn from your mapping explorations to influence survey questions and interview protocols for students and families. For example, if there has been a recent rise in crime rates in a neighborhood, ask families and students if they feel safe and what the school or district could do to make them feel safer.
Certainly, your findings can also help you figure out what services to offer within your program or school.
It's been so enjoyable to build my mapping skillset and explore a whole new way of looking at data. I hope you take some time to play around with these mapping tools and see what you can learn!
It's the end of the year, when we reflect on the past year and look forward to new beginnings.
I haven't done any rigorous data collection about this, but I think it's fair to assume that most people would rate 2020 as a giant dumpster fire.
So let's take it back to 2019.
Last year was challenging for me in a different way. I had moved to Ohio from Maryland the previous year, and I naively thought that I could transition my business to my new home with relative ease since most of my work was remote.
As it took time for new connections to materialize into new contracts, I knew that things weren't headed in the right direction.
Now, I had a few metrics I used to measure my business:
At the time, those weren't pretty.
However, what was most telling for me was how I FELT.
I was discouraged, uncertain, and anxious. And in terms of my business, I didn't know what to do to make it better.
Certainly, the quantitative data was informing how I was feeling, but the numbers alone did not compel me to act.
I just wanted to feel better! I knew it was time to make a change.
Around that time, I met a new neighbor who specialized in website design and SEO. I heard about a marketing consultant whose approach resonated with me. I saw that the American Evaluation Association's (AEA) conference was featuring a lot of workshops and resources for independent consultants.
I sprang into action. I met with these new specialists, went to the conference, and got to work. And ultimately, I turned my business (and outlook) around.
While there is always room to grow, the numbers I mentioned earlier do reflect the changes I've made to my business. Yet I don't think they show HOW or WHY those changes happened.
From my neighbor, I learned something new and gained hope that some simple strategies could make my website work better for me.
From the marketing consultant, I felt understood and discovered a new way to communicate what I do and why I love it.
From AEA, I gained a large group of new colleagues, friends, and referral partners - but most of all, I felt accepted, validated, and supported.
To me, those feelings and networks are what helped me turn the page in my business - the fuzzy, not easily measurable, qualitative stuff.
Had I not reflected on those things, I might have stayed in my rut.
Maybe you already track your data, and maybe you don't, but if you're getting the feeling that something's not right, think about the qualitative data you can explore to see what's up.
How do your families feel when they interact with you? How do the staff feel? What is the tone of your interactions? How engaged are students in relationships with peers and staff and with their learning?
These things matter.
One thing I'm grateful for from 2020 is that a lot of educators are seeing just how vital family engagement is for student achievement.
So while we're reflecting on this crazy, crazy year, let's take a second to examine how our students, families, and staff FELT and how we helped them feel better.
If that's what we take with us into 2021, then I think we're off to a good start.
This week I've got a co-author to help me continue my qualitative data series!
Sarah Dunifon is Founder and Principal Evaluator of Improved Insights LLC, an educational evaluation firm focused on STEM and youth-based programming. She is based in Cleveland, Ohio and is a fellow board member of the Ohio Program Evaluators Group.
We hope you enjoy our post below.
Qualitative data can be a bit elusive.
It’s not usually too hard to find data for things that are measurable. We know we can do surveys, or count the number of attendees, or track patterns over time.
Qualitative data though - the context for those numbers - often takes a little more work to track down. Of course, we can always do interviews and focus groups with stakeholders to learn about their experiences - our usual go-tos.
However, if you think of qualitative data for what it is - simply put, another information source - you’ll find that so many other forms of it are hiding in plain sight.
Think about the chatbox in your last Zoom session - you may not have realized it, but that’s a source of qualitative data! Other sources you may have readily available include the phone call logs that your teachers keep when they call families or even the observations you did of an event (online, drive-up, or fully in-person).
If you need more, there are lots of ways of collecting qualitative data, and many of them are even more prevalent now in our almost fully virtual world.
This makes our lives a lot easier, as we prepare to write our annual reports, apply for grants, or share the impact that our program had during this unusual year.
Like I mentioned in my last post, sharing the context for our quantitative findings can make those reports tell a much richer story.
Yet it’s not always intuitive to know how to turn a whole bunch of text into these powerful programmatic insights.
So when you find these sources of qualitative data, what do you do with them?
We can actually find patterns in our data by assigning thematic codes to different words, phrases, or even images. Sometimes, you start with a set of codes that have to do with your program goals, or the research concepts underlying your program.
Other times, you just code as you go. If you start to see a lot of mentions of a particular topic, that topic can become a code.
Coding can take many forms, and there is fancy software that can help you do it, but sometimes all you need is a notebook and some markers or a color-coded spreadsheet.
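If you'd rather code in a spreadsheet-like way, the basic mechanics can even be sketched in a few lines. This is a simplified illustration, assuming a hypothetical codebook and made-up responses -- in practice you'd build your codes from your program goals or let them emerge as you read:

```python
from collections import Counter

# Hypothetical codebook: each thematic code maps to keywords that signal it.
codebook = {
    "games_activities": ["game", "games", "activity", "activities"],
    "food":             ["food", "snack", "snacks", "pizza"],
    "animals":          ["animal", "animals", "dog", "turtle"],
}

# Made-up participant responses
responses = [
    "I liked the games we played",
    "The snacks were my favorite",
    "Meeting the animals and the games",
    "pizza day!",
]

# Tag each response with every code whose keywords appear in it
tally = Counter()
for response in responses:
    words = response.lower().split()
    for code, keywords in codebook.items():
        if any(word.strip("!.,") in keywords for word in words):
            tally[code] += 1

print(tally.most_common())
```

Keyword matching is only a first pass -- a human coder still needs to read for context -- but it mirrors what the color-coded spreadsheet approach does: count how often each theme shows up.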
Below you can see some sample data about an after-school program focused on science and animals that we’ve color-coded according to the themes we saw.
In one glance, you can see that our participants liked a lot of aspects of the program, but games and activities (in blue) and the food (in pink) got the most mentions.
Coding allows us to see what’s happening across the dataset and pull out themes or key insights that we need to highlight.
Sharing your qualitative data analysis can be an important addition to your data story when demonstrating the impact of your work. It can add relevance, personality, and context to quantitative data by illustrating individual effects.
By reviewing our datasets systematically, we can also find some incredible quotes - the kind you would never attempt to paraphrase if you were writing a paper because they were so perfectly worded -- and let our stakeholders’ words shine.
You can feature key quotations by offsetting them or putting them in a different color in your report to highlight individual experiences and catch readers’ attention.
Another popular way to display qualitative data is in a word cloud.
Word clouds are visual representations of keywords that come up frequently in a set of qualitative data. Typically, the bigger the word, the more frequently it appeared in a data set.
There are plenty of critiques of word clouds in the data visualization space and rightly so - word clouds can often obscure meaning rather than clarify it. So if you are going to use them, here are three things you should know:
1. Give the data a good cleaning to remove anything that you don’t want represented in the visual.
Here, we’d recommend removing any responses that do not give value (e.g., “idk,” “I’m not sure,” “Nothing,” etc.) as well as any text surrounding the main themes (e.g., “I like the [...],” “I love [...],” “my favorite thing is [...],” etc.).
2. Consider the messages or key points you see in the data that you wish to convey visually. If it is possible to condense themes further or pull out important words, now is the time to do so.
This might mean collapsing phrases as best as possible to a single word, or perhaps a few words of important meaning.
3. Make sure to keep the essence of the data - meanings can be misconstrued when collapsing phrases into single words or shorter phrases.
If you’re finding this is happening, perhaps a word cloud is not the best way to display your data.
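The cleaning and condensing steps above can be sketched in a few lines of code. The responses, filler phrases, and stopwords below are made-up examples -- what counts as "no value" or "filler" depends on your own data:

```python
import re
from collections import Counter

# Made-up raw responses, including some that give no value
raw_responses = [
    "I like the games",
    "idk",
    "my favorite thing is the food",
    "Nothing",
    "I love the games and the food",
]

NO_VALUE = {"idk", "i'm not sure", "nothing"}
FILLER = re.compile(r"^(i like|i love|my favorite thing is)\s+", re.IGNORECASE)
STOPWORDS = {"the", "and", "a", "an"}

cleaned = []
for response in raw_responses:
    text = response.strip().lower()
    if text in NO_VALUE:          # step 1: drop responses that give no value
        continue
    text = FILLER.sub("", text)   # step 1: strip the text around the theme
    cleaned.append(text)

# steps 2-3: condense to the important words -- a word cloud sizes by frequency
frequencies = Counter(
    word for text in cleaned for word in text.split() if word not in STOPWORDS
)
print(frequencies.most_common())
```

Feeding the cleaned frequencies (rather than the raw text) into a word cloud generator is what keeps filler words like "like" and "favorite" from dominating the visual.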
However, with data cleaning and basic analysis, the word cloud can change drastically.
Take a look here at three versions of the same word cloud we generated on WordItOut using the data we shared earlier. The first was created with original - or “raw” - data, the second with cleaned data, and the third with some basic analysis and condensing.
Notice how the prominent words change with each version, and how the meaning and key messages can shift.
As you can see, while word clouds are one of the most accessible forms of qualitative data displays, they take some work to be most effective.
However, word clouds aren’t your only option. Data visualization experts like Stephanie Evergreen, Storytelling with Data, and Depict Data Studio all have great resources on different qualitative data displays.
The case is clear - with some simple analysis and visualization, qualitative data can be a powerful addition to your data story.
You should know by now that I'm a bit of a data nerd.
I love spreadsheets. I love organizing data and using it to illuminate patterns. I love the "ah-ha" moments when clients realize how much their own data can tell them about the kids and families they're serving.
So it may surprise you that I'm here to say that numbers and spreadsheets don't tell us everything.
That doesn't mean that numbers (or quantitative data) are irrelevant.
It just means that they are even more informative when paired with stories, quotations, or anecdotes (qualitative data).
(See the box for a quick refresher on the difference between the two).
Here's an example. Yesterday, I was re-reading an article from The Columbus Dispatch, my local paper, about the spike in youth violence that has occurred during the pandemic.
It's been horrible to hear about how many children and teens (well, really anyone, for that matter) have been victims of gun violence since the spring.
The article cites a number of statistics -- that the number of children treated in Columbus for gunshot wounds this spring and summer was double the 2019 figure (from 16 to 32), and that children from racial or ethnic minorities are twice as likely to be shot as white children.
Those are AWFUL statistics - and they certainly help me see that there is a dire situation here.
But then, the article talks to a teacher whose student -- an eight year-old boy -- was killed. Here's what the article shares about (and from) the teacher:
Thalgott has lost a handful of former students during her 20 years of teaching on the South Side. She's seen even more students who have lost a parent to gun violence.
Having lost some former students or their family members to gun violence -- either as victims or perpetrators -- this quote really gets to me. It conjures up such raw emotions that it suddenly puts the cited statistics into context.
Those 32 children are somebody's child, somebody's sibling, somebody's student, somebody's mentee. Hearing from a person who actually experienced that loss made a big difference in how I processed this article. I imagine it did for you too.
Quantitative data can be so powerful, but its impact is amplified when we lift up the voices of those we are serving or studying.
Qualitative data -- gathered through interviews, focus groups, open-ended survey questions, or observations -- can sometimes more effectively communicate the experience of what is happening in your school or community.
I'll be doing a series of posts on qualitative data over the next few weeks -- how to collect it, how to use it, and how using a combination of data can truly help you tell your story.
I started writing a completely different blog post for this week, but when I read the news this morning, I knew I had to shift gears.
CNN featured a story yesterday called "Teachers and social workers search for students who are 'missing' in the pandemic."
The word "missing" made my heart drop.
It immediately reminded me of this story, of a young girl from Washington, DC named Relisha Rudd. I heard about her story in 2014, and it broke my heart -- at the time, I was a community school coordinator and led our school's charge for attendance and engagement tracking and interventions. Staff from Relisha's school tried to track her down, only to find that a man working at the homeless shelter where she was staying had been impersonating her doctor to the school.
She is still missing to this day.
As a community school coordinator, her story lit a fire under me to do everything I could to make sure that we knew, to the best of our ability, that our kids were safe and able to come to school.
I have thought of Relisha over the years and find the tragedy of her story to be a call to action for schools, districts, and other youth- and family-serving organizations.
How can we make sure that no other students fall through the cracks?
When I read the story this morning about the Robla School District in California doing home visits and trying everything they can to find their students "missing" from online school, I had so many thoughts:
Labeling students as "missing" drives home the gravity of the situation our country is in. Families are truly struggling because of the virus and the economy, but honestly, lacking access to the internet, to stable housing, and to consistent work have been challenges for so many families for so long. The fact that things are only getting worse is upsetting and shows us that we have so much work to do.
Literally going into neighborhoods searching for children is heroic, but also emotionally grueling. I remember the disappointment and worry of having a string of unsuccessful home visits -- you gear yourself up for making a difference, only to find that addresses were incorrect or have changed, or worse -- you just don't know where students and families are living. That is scary, and it is emotionally taxing for educators.
While data tracking can't help us physically locate a family, it can help us focus our efforts where they are needed most. You may have seen that last week, I released a guide for how to Track Your Engagement Data in 4 Simple Steps. I believe strongly that using simple functions in Excel can help educators pinpoint exactly which students and families need additional support -- whether that's with attendance, engagement, or academics.
(I know it can work because I've seen the impact it has had on my own work in schools!)
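To give a flavor of what that pinpointing looks like, here's a sketch of the kind of flagging logic you might set up with IF-style spreadsheet formulas, written out in code. The students, thresholds, and column names are all hypothetical:

```python
# Hypothetical tracking data -- in Excel, each dict would be a row.
students = [
    {"name": "A.", "days_attended": 14, "days_possible": 15, "logins_this_week": 5},
    {"name": "B.", "days_attended": 9,  "days_possible": 15, "logins_this_week": 1},
    {"name": "C.", "days_attended": 12, "days_possible": 15, "logins_this_week": 0},
]

ATTENDANCE_THRESHOLD = 0.90   # flag anyone below 90% attendance
LOGIN_THRESHOLD = 2           # flag anyone with fewer than 2 logins a week

# Flag students who fall below either threshold -- the equivalent of an
# IF(OR(...)) formula in a spreadsheet column.
flagged = [
    s["name"]
    for s in students
    if s["days_attended"] / s["days_possible"] < ATTENDANCE_THRESHOLD
    or s["logins_this_week"] < LOGIN_THRESHOLD
]
print("Needs outreach:", flagged)
```

The point isn't the tool -- it's that a couple of simple rules, applied consistently to the data you already collect, surface the students who need outreach before they go "missing."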
So in honor of Relisha and in commitment to the well-being and success of students who may be "missing" from online school today, let's get tracking.
To learn more about data tracking, visit my Engage with Excel page or sign up below.
Sometimes you launch a survey, and you're blown away by the number of responses you get.
And sometimes, you're not.
I had one of these moments last week.
I was SUPER excited to try something new with my blog and launch a survey to hear what readers wanted to see in future posts.
I sent out my blog to my email list, waiting with bated breath (okay, maybe I'm being dramatic) for all of the responses to pour in.
... And then I realized that the survey I embedded didn't even show up in the email.
INSERT FACE PALM EMOJI.
Let's just say that I didn't get the response rate I was hoping for.
But here's the thing - it's okay to have a survey fail. All hope is not lost.
If you don't get the response rate you were hoping for, take a step back and consider:
For me, technical issues definitely got the best of my survey attempt, but I also think a reminder wouldn't hurt.
So here's my plug:
I'd love to know more about what you want to learn! I'd appreciate if you could take a minute to share your thoughts and preferences with me. I'll report the results and use them to make this blog even more beneficial for you.
The survey is embedded below, but if it's not loading for you, click the button to go right to it.
We have all taken TONS of surveys in our lifetime.
We get surveys when we make an online purchase, when we speak with a customer service agent, when we want to get a free gift card, and even when we go to the hospital.
We're all pros at taking surveys... and we all know when we're taking one that's TERRIBLY designed.
For me, if I don't feel like I can answer the questions, or if it gets too long or overly annoying, I'm out.
And that organization just lost a respondent.
I don't want that to happen to you -- because in education, surveying our stakeholders is SO important. It shows that we value our stakeholders' opinions, feedback, and experiences.
We can't afford to lose respondents because of iffy survey design.
Here are a few of my tips for upping your survey game:
1. Ask only what's really important.
Make a list of what your team is wondering about or what the impact of your proposed projects/plans might be before you draft your survey questions.
Keep it short and sweet ... if it's not related to those things, don't include it.
2. Reach respondents where they are.
Think of all of your touch points with your key stakeholders. Students may log in for online class, families may check social media for updates, and all of your stakeholders may access meal sites.
At all of these venues, you can easily ask about needs, satisfaction with the school's efforts, or other questions you may have.
You can also get feedback through polls in Google Classroom, Zoom, via text message, or even on social media.
3. This may seem obvious, but ... make it easy for respondents to actually answer your questions.
Keep the language clear and simple so a person of any reading level can understand it.
Never ask about more than one topic in a single question, and try to avoid giving a neutral middle answer option when you can.
(In both of these cases, it's very hard for you to actually learn anything from the data.)
And of course, if you work with communities who speak languages other than English, find a way to translate your survey into their language.
Translation is a much tougher process than it should be, but it is essential for making all of your families feel valued and for hearing from your entire community, not just one subset.
All that being said, I feel the same way about you - my colleagues, clients, and readers.
I want to know what's important to you and what would be helpful for me to cover on the blog.
I hope that you've been inspired by this post and will take my brief survey below.
I appreciate your feedback and will use it to generate future content for you!
Growing up in New Jersey, the day after Labor Day always marked the start of a new school year ... and the day I finally got to wear the new outfit I had carefully planned and crack open my new, pristine notebooks.
If you couldn't tell, I have always loved the excitement of returning to school.
Unfortunately, for many children, families, and educators, this year felt different.
Some of the usual excitement and jitters have been replaced by trepidation about what to expect from a year like no other.
Concerns about health and safety, academic progress, and schedule juggling have been abundant in my conversations with teachers and the staff and family surveys I have analyzed.
So how will schools and districts know if they are adequately addressing their stakeholders' fears?
Well... they've got to ask them.
In a number of recent conversations, colleagues have been discussing the use of continuous improvement cycles. If you're not familiar with continuous improvement, its hallmark is the Plan-Do-Study-Act (PDSA) Cycle.
Alicia Grunow of the Carnegie Foundation for the Advancement of Teaching explains the PDSA cycle:
More simply put, schools and districts need to:
Then, the cycle starts all over again ... quickly.
We're not talking about huge, multi-year studies here ... This is a relatively quick and simple process!
Make a plan, implement the plan, figure out if the plan worked, and if not, adjust and try again!
With school kicking off, schools and districts have already put a short-term plan in place and are putting it into action.
And this year, short cycles of trial and error are going to be key, as even our modes of schooling could change multiple times throughout the year.
So how can schools and districts get feedback from their stakeholders NOW to see if their plan worked?
Instead of a lengthy formal survey, think of creative ways to ask for feedback:
Asking one or two questions at a time in interactive ways will make it easy for stakeholders of all groups to participate, prevent them from getting tired of surveys, and give you real-time data about how people are feeling.*
*Just make sure the platforms you choose will allow for translation.
Now here's the kicker: once you collect data, you have to complete the cycle ... ACT!
Make it clear for students, families, and staff that you valued their feedback and are going to put it to use ... and tell them how!
Start this crazy school year off right by lifting up the voices of your stakeholders in fun and easy ways and demonstrating that their feedback will guide your next round of planning and action.
What are your creative ideas for hearing from stakeholders? Share them in the comments!
The goal of this blog is to highlight relevant issues that impact students, families, and communities and spark engaging discussions about how to address those issues through evaluation.