Veg variety expands acceptance with kids

Australia: Increased acceptance of multiple vegetables was noted during the five weeks of one study and was sustained at the three-month follow-up. Following the study, parents reported that offering the vegetables was “very easy” or “quite easy,” with the majority following the instructions provided by the study.

The study recruited 32 families with children between the ages of four and six whose vegetable consumption was reported to be low. Parents completed an online survey and attended an information meeting prior to participating.

Study data was collected in several ways: two dinner meals served at the research facility, during which children could eat as much of the broccoli, cauliflower, and green beans as they wished; changes in actual vegetables consumed at home, childcare, or school, recorded through food diaries; and parent reports on usual vegetable consumption. Some families introduced a single vegetable (broccoli), while other families tried multiple vegetables. Parents were provided with a voucher to purchase the vegetables and given instructions on portion size and cooking, along with tips on how to offer the vegetables. Children were served a small piece of vegetable three times a week for five weeks, and a sticker was given as a reward to children trying a vegetable.

Families that offered multiple vegetables recorded an increase in consumption from 0.6 to 1.2 servings, while no change was observed in families serving a single vegetable or in families that did not change their eating habits.

 

https://www.sciencedaily.com/releases/2019/09/190909123713.htm

2019 data collection strategies – South Champlain Islands and Capital City Farmers Markets, Part 2

from Part 1

For the last few years, I have worked on an FMPP-funded project under the supervision of NOFA-VT’s Direct Marketing Coordinator, Erin Buckwalter. This project will aid in building a culture of data collection at Vermont’s farmers markets and has included resource development, evaluation strategies for all market types, and direct technical assistance and training. Because of this, I added a second annual trip besides my usual winter conference attendance. And lucky for me, it was scheduled for the summer rather than the usual winter trip, which, although very lovely, is somewhat limiting for this Southerner and means I see few markets.

Erin suggested that we create a team of market managers, agency leaders, and market volunteers to gather data for markets in August. The goals were multiple:
1. model good data collection habits
2. network markets interested in data collection
3. test out some methods for different types of markets
4. look for opportunities for needed resource development on evaluation
5. see more markets and make a direct connection with market leaders
6. collect some data!

She sent out an email inviting a few markets to nominate themselves. We needed to be able to visit the markets within a short span of days, the successful applicants needed to have a use for the data, and they had to have some capacity to assist the team.

We ended up with two excellent choices: Champlain Islands Farmers Market – South Hero, held Wednesday afternoons 3-6 pm, and Capital City Farmers Market (Montpelier), held Saturdays 9 am-1 pm.

They were wonderful choices because they were so very different, and they have enthusiastic leaders who are very interested in the data.

Capital City Farmers Market-Montpelier

The team:
Jennie Porter, NOFA-VT’s Food Security Coordinator
me
Dave Kaczynski, Montpelier FM board member, VTFMA board member
Sherry Maher, Brattleboro Winter Market leader and NOFA-VT’s lead for in-state data collection strategies on this project
Alissa Matthews, VT Agency of Agriculture, Food and Markets (VAAFM)
NOFA-VT alumni Jean Hamilton and Libby MacDonald
Elizabeth Parker from the Sustainable Montpelier Coalition, who offered to stay and help when we approached her as a shopper that morning.

Dave and his fellow board member Hannah Blackmer were our leads for this farmers market collection. It required a very different plan than South Hero, as the Montpelier market is much larger and is situated on a busy shopping-district street. As most Vermonters know, this beloved market has been around for 40 years, but it has already had to move locations more than once and will have to do so again after this year. Questions about location therefore had to be added to this survey, which meant a flurry of emails and even some refinements to the survey on Saturday morning; thankfully, there is a copy/print shop right down the street that was open.
And because this market was on a Saturday morning, market leaders who were interested in doing team data collection could not help, as most were either running their own market or working another job.
Because we had a smaller team than we needed, and the survey would take longer, we decided on a different and relatively new method for collecting the visitor count, one that works better for small teams and for less busy markets: the Sticker Count.
The idea is to give each adult who enters the market a sticker to wear, telling them that we are counting attendance that day, and then tally how many stickers were given out. Because attendees wear the sticker, we won’t double-count them.
This method can be fun and less taxing to counters than clicking entries, but it has its own issues, such as:
1. The community has to be aware of this activity beforehand and know to take a sticker but only one.
2. Since counts estimate potential shoppers, kids are not usually counted. That can be difficult, because kids are often the only ones who want to wear a sticker. (Our solution was to stick stickers to the back of a form someone had declined and hand kids stickers from that sheet, so the child’s sticker did not add to our count. Another way to solve this would be to have kids-only stickers to hand out.)

3. Complex layouts can also make this hard (although complex layouts make ALL counting hard!), and CCFM has ONE fascinating and complicated layout.

In terms of the survey, we decided to offer more ways to complete it, as we had a goal of over 260 completed surveys:
• “intercept” surveys, in which a surveyor asks the questions and writes the answers on their form;
• self-reported surveys under a tent, where people could fill out paper forms on their own or use one of our laptops set up for that;
• signs with a QR code that smartphone users could scan to open the form online.
The tent was ably staffed by Alissa Matthews, whom we chose because she has been involved with the relocation process, could better answer questions about the possible locations, and is always calm and cheerful. Dave set the tent up beautifully, adding eye-catching signs and table coverings. He also knows how to make tables comfortable for those reading or writing by adding leg extensions, which helped as well. His survey work is also stellar; he is a natural at it.
The tent was constantly bustling, with Alissa aided by me or by nearby Sticker Queen, Libby MacDonald.

One issue at the tent was that the online form was designed to require an email address, which is helpful to ensure only one response per email, but it seemed to freak out those at the computer. The online survey was included partly to gather more responses in the week after market day, because the location issue is significant enough that the entire market community should be able to weigh in. Oddly, those doing the self-reported paper surveys at the same tent were less concerned about the email request on their form; even when we told people they didn’t have to fill in their email, they often did, saying they would be happy to learn more about the market or the relocation process. (Those doing intercept surveys don’t ask for emails at all.)

Another issue was that the printed self-survey had a few areas that confused people: the frequency-of-visit choices were too close together, so many people circled more than one, and lots of folks missed the other side! One last issue I noted a few times was that both members of a couple were filling out surveys, which means their economic contribution that day would be double-counted. I don’t think any of these damaged the day’s data in a major way, but these are the issues that can arise with allowing self-reported survey completion.

 


The sticker counting started off extremely well, with aforementioned volunteer Libby taking the entrance near our tent as her stickering responsibility. We worked out language around that, as brandishing a sticker at someone entering a market can seem off-putting, and the market had had less time to let folks know beforehand that we’d be doing this.

Instead of “Can I offer you a sticker? The market is counting everyone attending…,” which offers an easy chance for a NO,

we settled on:

“Here’s a sticker for you (putting it gently on a shoulder or handing it to the person); the market is counting everyone attending by giving each adult a sticker. The good news is, if you wear it, we’ll not bother you again!”

Three to five of us were constantly handing out stickers, all strategically placed near entrances or busy areas (the surveyors also had stickers in case someone they stopped had gotten past us). We also had signs at all of the vendor booths, and we explained to vendors what was happening and asked them to steer anyone without a sticker to one of us.

Our estimate was that maybe 20% of attendees were not stickered, especially later on when larger groups started to show up and we couldn’t get to them all. That part is still a very rough guess, but with more trials, we may get better at it.
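To turn a sticker count into an attendance estimate, you scale the raw count up by the share you think you missed. Here is a minimal sketch of that arithmetic; the sticker total and the 20% miss rate are hypothetical numbers, not figures from this market day:

```python
def estimate_attendance(stickers_handed_out, estimated_miss_rate):
    """Scale a raw sticker count up to cover adults who slipped past
    the sticker crew (we guessed roughly 20% at a busy market)."""
    if not 0 <= estimated_miss_rate < 1:
        raise ValueError("miss rate must be a fraction in [0, 1)")
    # Stickers captured (1 - miss_rate) of adults, so divide to get the total.
    return round(stickers_handed_out / (1 - estimated_miss_rate))

# e.g. 800 stickers handed out, with roughly 20% of adults missed:
print(estimate_attendance(800, 0.20))  # -> 1000
```

The weak link, of course, is the miss rate itself, which is why repeated trials at the same market matter more than the formula.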

Still, it was a cheerful, participatory way to do counting, and many people were intrigued by the idea. One person, when asked by a team member whether she had taken a sticker, even said enthusiastically: “Yes, I was counted today!” Honestly, that made my day.

Overall, the number of surveys far exceeded our goal (we even had to go print more for people to fill out!), the count felt like a good test, and the team felt relatively confident about the numbers.

 

The Cap City Team: me, Dave, Sherry, Jennie, Jean, Alissa, and Libby (sorry to miss Elizabeth, who had left).

Part 1

2019 data collection strategies – South Champlain Islands and Capital City Farmers Markets – Part 1

Checking out different ways that markets collect and use data has been one of my chief duties in developing evaluation tools over the past 20 years. And since going part-time at FMC, I have also contracted directly with some markets and networks, mostly on data collection strategies, which also informs my FMC duties.
One of those delightful synergies can be illustrated through my long-time relationship with Northeast Organic Farming Association – Vermont (NOFA-VT). For the last few years, I have worked on an FMPP-funded project under the supervision of NOFA-VT’s Direct Marketing Coordinator, Erin Buckwalter. This project will aid in building a culture of data collection at Vermont’s farmers markets and has included resource development, evaluation strategies for all market types, and direct technical assistance and training. Because of this, I added a second annual trip besides my usual winter conference attendance. And luckily for me, it was scheduled for mid-summer rather than the usual winter trip, which, although very lovely, is somewhat limiting for this Southerner and has meant few market visits.

Erin suggested that we create a team of market managers, agency leaders, and market volunteers to gather data for markets in August. The goals were multiple:
1. model good data collection habits
2. network markets interested in data collection
3. test out some methods for different types of markets
4. look for opportunities for needed resource development on evaluation
5. see more markets and make a direct connection with market leaders
6. collect some data!

She sent out an email inviting a few markets to nominate themselves. We needed to be able to visit the markets within a short span of days, the successful applicants needed to have a use for the data, and they had to have some capacity to assist the team.

We ended up with two excellent choices: Champlain Islands Farmers Market – South Hero, held Wednesday afternoons 3-6 pm, and Capital City Farmers Market (Montpelier), held Saturdays 9 am-1 pm.

They were wonderful choices because they were so very different, and because they have enthusiastic leaders who are very interested in the data and in learning more about collecting it.

Champlain Islands Farmers Market – South Hero
is one of those organizations that operate markets two days a week in two different locations. As a result, the two markets are actually quite different in terms of vendors, products, programs, and visitors.
The Wednesday market is held behind a church, and its location was partly chosen to take advantage of visitors who are on that part of the island before they turn toward the ferry. It has around 16 vendors, offering a wide variety of what seasonal visitors need for their vacation kitchens and what permanent residents need for their tables. Because the site is offered by a third party, sharing data on the positive impacts of this location is always helpful, as is analyzing the functionality of the site. Cindy Walcott (market chair/treasurer) and Julia Small (market manager) were gracious hosts, giving a lot of assistance to our team.

The team:
Erin
me
Dave Kaczynski, Montpelier FM board member, VTFMA board member
Sherry Maher, Brattleboro Winter Market leader and NOFA-VT’s lead for in-state data collection strategies on this project
Janice Baldwin, also from the Brattleboro Winter Market
Alissa Matthews, VT Agency of Agriculture, Food and Markets (VAAFM)
Anisa Balagam, the new market manager of the Winooski Farmers Market.

This market organization has collected data previously and has devised an almost foolproof way to count visitors. Since parking is routed from the main road via a narrow drive to a graveled area, they can position someone at the beginning of the drive to count every car and the number of adults inside. This also allows the market to record each license plate’s state, which is extremely important: Cindy says attendance at this weekday market is usually about 50% Vermonters.

Using their counting sheet, two of us went to the vantage point to gather the count. Cindy has also downloaded a counting app on her smartphone, set up previously to capture the same detailed data, but we decided to go with paper.

Cindy gives us an overview of the counting method the market uses.

The rest of us would gather surveys from the visitors, and since we had a good crew size, we could team folks up and also allow them to take breaks to shop and eat.
The market had a tent and tables for our use, and we decided to put them where we could best capture folks on their way out. Deciding whether the team will survey folks coming in or going out is one of the decisions the collection supervisor needs to make before or on the day of collection, with input from the team.

Whether you survey on the way in or out has a lot to do with shopping behavior:

- Are people frantic about missing items that quickly sell out? They will be less interested in doing the survey on the way in.

- Are people loaded down with bags with a long way to go to their parking? They may be less interested in offering data on the way out, although having tables and a tent where they can set their items down does help!

- If you are intent on learning about their purchases that day, it may be better to wait until the end of the shopping trip. However, if you have a small market with a lot of regular weekly shoppers, it may be okay to survey them as they come in, since the amount spent may not vary much week to week for those shoppers.

We began the day with a group logistical meeting: introductions, and discussing who would be where and how to get breaks when needed. Depending on the group, a quick round of role plays with the survey sheet may also be helpful. Cindy gave us the likely attendance number (which determines how many surveys to collect) and described the type of shoppers this market usually sees. My responsibility as the Data Collection Coordinator was simple for this market (and was a very different job for our Saturday market visit at Capital City; more on that in Part 2), but even when it is simple, the Coordinator should be constantly rotating, collecting completed sheets to make sure things look right, re-assigning folks when necessary, and generally seeing what else can be done (and, if possible, doing data collection too).
The crew was eager, and because it was a group of market leaders, it was great at problem-solving, very willing to engage with shoppers, and able to gracefully steer “I don’t know” answers toward a specific amount or answer.

market map

The market had originally placed us next to the Land Trust info booth, but after a short discussion, the team decided to move our tent to a spot closer to where we estimated the path out of the market would be. Dave also suggested that we move the picnic table into our tent for folks to sit at or to place their bags on, and since the day began rainy, Julia thought it fine to do just that.
The survey collection went great, as everyone was very willing to stop and answer questions. I find that the majority of people (90-95%) are always very open to this, especially if the opening line is something like “Can you spare a minute to help the market?” It has almost always been true on the farmers market data collection teams I have worked with that surveyors constantly exceed the collection goals set for them, because they find it easier and more fun than they originally expected. Sometimes it is harder to get them to slow down, which can be necessary to make sure that a comparable number of surveys is collected in each hour. Making it fun for the surveyor and not taxing for the respondent are other reasons that the survey should be well designed and as short as possible!

I must say, having every person we asked say yes, AND people making a beeline for us to take the survey before we approached them, was delightful. It is a credit to the excellent pre-market communication the market had with this community, and it makes clear that the community understands this is a data-driven market.
Well done Champlain Island Farmers Markets!

More later on the data that was collected, once it has been cleaned and organized by the market organization and NOFA-VT. We did exceed the survey goal that the team and the market had agreed to collect. For most markets, collecting surveys from 10-15% of the usual attendees will be a good number, but there are ways to refine that calculation.
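That 10-15% rule of thumb is easy to turn into a concrete survey goal before market day. A quick sketch; the attendance figure is invented for illustration, not this market’s actual count:

```python
def survey_goal(usual_attendance, low=0.10, high=0.15):
    """Rule-of-thumb survey target: 10-15% of a market's usual attendees."""
    return round(usual_attendance * low), round(usual_attendance * high)

# For a market that usually sees about 400 adults on a market day:
low, high = survey_goal(400)
print(f"aim for {low}-{high} completed surveys")  # -> aim for 40-60 completed surveys
```

Dividing the target evenly across market hours, as mentioned above, then tells each surveyor what pace to keep.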

Anisa uses the tent

Erin does the first survey

Data collection and time for sharing and general conversations too

 

Part 2

Indicators (sick of them yet?)

With the announcement of the 2018 FMPP/LFPP RFA this week (tucked into the Specialty Crop Block Grant announcement), I wanted to alert you to this 2017 post below about the indicators that are included in the proposal.

There is also a shorter version on FMC’s website.

Congratulations to everyone who got their FMPP/LFPP grants in by the deadline yesterday. I talked or emailed with a few of you throughout that process and was impressed by the well-crafted strategies that I read and heard about.

As you can imagine, a lot of the calls I was on focused on the new prescribed indicators (performance/outcome measures) that were included with the RFP for the first time. Those were the same for FMPP as for LFPP projects and were:


OUTCOME 1: TO INCREASE CONSUMPTION OF AND ACCESS TO LOCALLY AND REGIONALLY PRODUCED AGRICULTURAL PRODUCTS.

Indicators
1. Of the [insert total number of] consumers, farm and ranch operations, or wholesale buyers reached,
a. The number that gained knowledge on how to buy or sell local/regional food OR aggregate, store, produce, and/or distribute local/regional food
b. The number that reported an intention to buy or sell local/regional food OR aggregate, store, produce, and/or distribute local/regional food
c. The number that reported buying, selling, consuming more or supporting the consumption of local/regional food that they aggregate, store, produce, and/or distribute

2. Of the [insert total number of] individuals (culinary professionals, institutional kitchens, entrepreneurs such as kitchen incubators/shared-use kitchens, etc.) reached,
a. The number that gained knowledge on how to access, produce, prepare, and/or preserve locally and regionally produced agricultural products
b. The number that reported an intention to access, produce, prepare, and/or preserve locally and regionally produced agricultural products
c. The number that reported supplementing their diets with locally and regionally produced agricultural products that they produced, prepared, preserved, and/or obtained

OUTCOME 2: INCREASE SALES AND CUSTOMERS OF LOCAL AND REGIONAL AGRICULTURAL PRODUCTS.

Indicators
1. Sales increased from $________ to $________ and by ______ percent ((n final – n initial)/n initial × 100 = % change), as a result of marketing and/or promotion activities during the project performance period.
2. Customer counts increased from [insert total number of] to [insert total number of] customers and by ______ percent ((n final – n initial)/n initial × 100 = % change) during the project performance period.

OUTCOME 3: DEVELOP NEW MARKET OPPORTUNITIES FOR FARM AND RANCH OPERATIONS SERVING LOCAL MARKETS.

Indicators
1. Number of new and/or existing delivery systems/access points of those reached that expanded and/or improved offerings of:
a. ______ farmers markets.
b. ______ roadside stands.
c. ______ community supported agriculture programs.
d. ______ agritourism activities.
e. ______ other direct producer-to-consumer market opportunities.
f. ______ local and regional Food Business Enterprises that process, aggregate, distribute, or store locally and regionally produced agricultural products.
2. Number of local and regional farmers and ranchers, processors, aggregators, and/or distributors that reported:
a. an increase in revenue expressed in dollars: ______
b. gained knowledge about new market opportunities through technical assistance and education programs: ______

3. Number of:
a. new rural/urban careers created (difference between “jobs” and “careers”: jobs are net gain of paid employment; new businesses created or adopted can indicate new careers): ______
b. jobs maintained/created: ______
c. new beginning farmers who went into local/regional food production: ______
d. socially disadvantaged farmers who went into local/regional food production: ______
e. business plans developed: ______

OUTCOME 4: IMPROVE THE FOOD SAFETY OF LOCALLY AND REGIONALLY PRODUCED AGRICULTURAL PRODUCTS.

Indicators (only applicable to projects focused on food safety)
1. Number of individuals who learned about prevention, detection, control, and intervention through food safety practices: ______
2. Number of those individuals who reported increasing their food safety skills and knowledge: ______
3. Number of growers or producers who obtained on-farm food safety certifications (such as Good Agricultural Practices or Good Handling Practices): ______

The applicant is also required to develop at least one project-specific outcome(s) and indicator(s) in the Project Narrative and must explain how data will be collected to report on each applicable outcome and indicator.



These confounded many, while others knew exactly how to use them to define their grant’s outcomes. I hope that USDA calls in some of those who do a bang-up job in setting and achieving their numbers to talk with the newbies in future years.

Because of the previous work on the trans•act tools (which include the SEED tool) while at Market Umbrella, and the more recent and engrossing Farmers Market Metrics (FMM) work I have been doing with FMC and their partners these last few years, I have become very familiar with this language and these indicators. Most are included in the metrics chosen by FMC to be collected starting in 2016 with FMM, through their own projects and through offering support to networks that are ready to embed evaluation systems in their projects.

Since I spent some time working with various project leaders on this, I thought I’d give my two cents here on how I’d approach these if I were the lead. In this post, I’m going to talk about my general theory of data at the grassroots level and the first two outcomes; I’ll tackle #3 and #4 and unique indicators in upcoming posts.

Some may disagree with my assessment of how to handle these indicators, which to me is actually a good thing: by tackling this in varying ways, we are likely to hit on the best methods of establishing these baseline numbers and collecting the data.

The first thing that confounded some proposal writers is how every indicator could be met by their varied projects: of course, they cannot and are not expected to be. Since some projects focus only on increasing sales at a market and not on increasing the number of outlets, some indicators are more relevant than others and should be addressed in more detail. Remember, these indicators are for both FMPP and LFPP projects, which cover a wide spectrum, and so they are meant to support the general outcomes for all. In my opinion, the unique indicators asked for at the end are likely to be the most useful for reviewers to read closely in order to match to the narrative or budget. I’d expect, though, that proposals that could not reasonably answer a majority of the indicators with numbers will suffer in the review process; USDA seems to think so too, as they recommended in their webinar that everyone explain the indicators they couldn’t answer or, if possible, add a piece to their project to address them. And I think you can assume that USDA was being firm in saying that this pot of money should result in changes of these kinds, so if your project cannot reasonably do any of them, maybe look elsewhere for support.

I think the best way to make these outcomes accurate is for the project lead to write them with the vision of using them as a banner to fly throughout the term of the project: targets for the team to hit, surpass, or discuss why they cannot be met and what that means. The numbers should be slightly lofty; it is better to extend the reach at the outset and urge the team to do their best work to reach or even surpass it. However, don’t just throw some outrageous numbers in there, or you will be telling the reviewers and your team that you have no intention of achieving them. So even though I used the word lofty, there is also something to be said for keeping your project efficient by establishing very precise numbers.

Efficiency is a good plan for our tiny organizations, in order to conserve our energy and our vendors’ energy for the long haul and to be there for another day. How well we plan, and how we address our assumptions about those we hope to reach, has a lot to do with setting numbers and meeting them.

Okay, let’s look at the first two outcomes now:

Outcome 1: Increase consumption and access.

The indicators clustered with this outcome are related: once you have established (a) the number of buyers and/or producers that gained knowledge, you can then estimate (b) the number of those that report an intention, and finally (c) the number that report actually buying, selling, aggregating, etc. The second part of this outcome relates to professionals such as chefs or incubator users: if the project expects to reach that audience, they are also measured for knowledge, intention, and actual activity.

I think this one was written particularly well, as it takes a project step by step through the process of establishing its reach. This should have been relatively easy for most projects, as knowing how many people you plan on reaching is sort of 101 for FMPP or any USDA grant!

USDA’s suggestion was to write these out as a mathematical formula: a beginning number, then the number you want to hit, then the calculated percentage of increase. It may be helpful to do that in two columns and consider both the direct and indirect ways that your project will reach people. Certainly, if you are doing training or workshops, you can estimate your attendance, but how about those who just read about your training or workshop and track down the info that way? How about the media your project uses to attract attendees? Is it reasonable to think that others will hear about the market or outlet and begin to attend because of it? And never forget the vendors: include them in any project outcome, even if it is a straight-up new-shopper project; vendors can also learn about the marketing and use it in their own sales reach if it is shared properly. And of course, how about the project partners and their reach?

Once you set the number who will gain knowledge (and I think your project should plan that just about everyone who gets your materials or attends your workshop will gain knowledge), you then think about who will change their behavior because of it. I wonder what differences we’d see if I had a group of market managers and a group of vendors in one room and asked them to gauge, if 1,000 people are reached through materials or training, how many they think will actually intend to use that knowledge, and then how many will actually use it to buy, sell, aggregate, etc. That estimate can vary based on the perspective and experience of those setting the number.

My feeling is that the vendors would assume that more people will intend to come but that fewer will actually buy. I say that because they deal with everyone directly and know painfully well how many pass by their table without eye contact or a deep perusal of what is for sale; they know firsthand how hard it is to get people to actually do something. I’d say that managers would be more likely to think more people will be reached but that fewer would report an intention to come to a market, and that once people are there, a higher percentage will purchase. My assumption may be entirely wrong, and maybe someday I can test it and readjust. The most important thing is to test your project assumptions by asking everyone for numbers and adjusting them according to their bias and experience and according to your plan.
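One way to make that thought experiment concrete is to write the knowledge → intention → action step-down as a simple funnel. The conversion rates below are invented purely to illustrate how a manager’s and a vendor’s guesses might diverge; they are not data from any project:

```python
def funnel(reached, intent_rate, action_rate):
    """Step down from people reached, to those who report an intention,
    to those who actually change behavior (buy, sell, aggregate, etc.)."""
    intend = round(reached * intent_rate)
    act = round(intend * action_rate)
    return intend, act

# Hypothetical guesses for 1,000 people reached:
# manager-style: fewer report intention, but more of those follow through
print(funnel(1000, 0.30, 0.60))  # -> (300, 180)
# vendor-style: more intend to come, but fewer actually buy
print(funnel(1000, 0.50, 0.25))  # -> (500, 125)
```

Putting both sets of guesses through the same funnel makes the disagreement visible as two concrete numbers rather than a vague difference of opinion.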

I also think percentages without numbers can be difficult to be realistic about, so I often suggest that people start on the wrong end: if the project is meant to increase shoppers at a single market, how many more shoppers could that market actually handle per week? 100? 200? 1,000? Think about the vendors, your space, and your Welcome Booth, and visualize adding that number every week. Would it overwhelm the market? Do you have enough parking or access to transportation to make it happen? How many added shoppers per hour would that mean for your anchor vendors? Is that worth it?

Remember that the average shopper at most markets spends between $10 and $30, so using the numbers above, the market would add another $1,000-$30,000 per week in sales. Pretty cool, huh? Or if you hope to add another market day: maybe your Saturday market has 45 vendors on average; you might estimate that, since your new market is smaller and has less parking, you hope 25 or so can use the new outlet. In both cases, your initial outreach has to be wider than the final number, as some will not get to your market or be able to add market days even when told of the opportunity.
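That back-of-the-envelope sales math is worth scripting once you start trying different shopper counts. A sketch using the $10-30 average-spend range from the text; the shopper counts are the hypothetical ones above:

```python
def added_weekly_sales(new_shoppers, avg_low=10, avg_high=30):
    """Range of extra weekly sales from added shoppers, given the typical
    $10-30 average spend per market shopper."""
    return new_shoppers * avg_low, new_shoppers * avg_high

for added in (100, 200, 1000):
    low, high = added_weekly_sales(added)
    print(f"{added} added shoppers: ${low:,}-${high:,} more per week")
```

Multiplying the weekly range by the number of market weeks in a season gives the annual figure a grant narrative usually wants.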

Outcome 2: To increase sales

This one couldn’t be simpler, as in most cases FMPP projects are still chiefly attempting to increase sales. It may be true that at some later date sales increases will not be the primary indicator of the success of our work, but given the small reach that alternative food outlets currently have with food shoppers, I agree that this should still be the main goal. Even so, this indicator stymied many people (and, I would imagine, contributed to some not writing a grant at all), and since it is a common metric for FMM, I’m going to attempt to explain why it is necessary and how we can capture it.

Measuring an increase in sales for a project that does marketing or outreach for a single sales outlet is pretty standard. The issue is that you need a baseline number (a starting point), and that is something many markets do not have yet. So how do you find the baseline?

Most markets charge standard stall fees that are not based on a percentage of vendors’ sales, and because of that, many markets have never asked their vendors for sales data*. What USDA, FMM, Wholesome Wave, and others are now saying is that we need to know the impact of our work whether or not you collect this data to set the market’s fee rates. So for those who already collect it: you are ahead of the curve and probably have a lot to teach the rest of us about how to do it well.

So how do the rest of us do it? The simplest way is to ask vendors directly, either every market day, every month, or every season. As you can imagine, the longer you wait to ask, the harder it becomes for a vendor to separate your market’s numbers from those of the other outlets they sell at. On the other hand, it is also difficult for multi-tasking vendors to stop at the end of the day to count their money and get that number to you. So what works best? My answer is one that some people hate hearing: whatever works best for your community and your management capacity is what works best, as long as it gives you accurate data in increments acceptable to those using it.

I’ll talk your ear off about accurate data whenever discussing market evaluation, because in my experience markets rely too much on anecdotal information and on estimates better described as guesstimates, since they have almost no basis in real numbers. I can hear many of you yelling at me through your computer that you are not evaluators and cannot be expected to gather data. My answer: as soon as you create projects that use partners’ resources and promise your community some change in behavior because of those efforts, you are both. Meaning, as soon as you decided to run a market. (You like how I run the entire argument on my own and get the last word?)

However, I agree with many market leaders and vendors that markets and vendors are often asked for data that is never used or never shared back with those who offered it. And of course, the work of collecting the data and its associated costs are almost never added to the budget of any project; usually, partners just assume that overworked market communities will throw that added work onto their long list and get it to them toot sweet.

Yeah, don’t get me started on data collection challenges here.

Additionally, sales data sits at the top of the list of sensitive information currently being asked for, and I often ask managers or market partners to tell me how much is in their bank account right now as an example of how asking for information without context or reason is alarming, to say the least. That is, if you even know a precise number! So first, be the change you want to see by sharing market data with vendors regularly: token sales for debit are going up but SNAP is steady? Share what you think that means, and then ask them what they think it means.

Anonymous sales slips are the way FMM suggests collecting it, but I assume there are other good methods to test. Whatever the method, it helps when the raw data is shared back with vendors and used to advocate for their needs. It must be said that to use data in aggregate, it has to be collected in the same way for the same time period, and a lot more data is needed to get to any collective contribution, so we do need to hit upon some common methods sooner rather than later. Here are two more possibilities:

As many of you know, the SEED tool asks shoppers to estimate their purchases and then calculates overall sales from those numbers. Many feel this method of getting sales is better, but it does require surveying more shoppers more often, which means added staff and volunteers.

Another way may emerge as some markets grow their token systems. Depending on your market, it might be possible to estimate how many of your shoppers use that system, decide whether those users are representative of your overall shoppers, and use the data to estimate sales.
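The token-system idea boils down to simple scaling. A sketch, under two assumptions that are entirely mine and would need checking at your market: that you can estimate what share of shoppers use tokens, and that token users spend roughly like everyone else:

```python
def estimate_total_sales(token_sales, token_user_share):
    """Scale observed token-system sales up to the whole shopper population.

    token_user_share is the estimated fraction of shoppers who use tokens,
    expressed as a number between 0 and 1.
    """
    if not 0 < token_user_share <= 1:
        raise ValueError("token_user_share must be between 0 and 1")
    return token_sales / token_user_share

# e.g. $1,500 in weekly token sales, with tokens used by roughly a
# quarter of shoppers (a made-up figure for illustration):
print(estimate_total_sales(1500, 0.25))  # -> 6000.0
```

The weak link, of course, is the representativeness assumption; if token users skew toward bigger (or smaller) baskets, the estimate skews with them.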

The main point is that we have to agree we need some data, and that it should be as precise as possible without violating privacy or exposing the weaknesses of one business over another; after all, a market is a competitive place. The data you need for internal analysis of the market’s impact on its vendors and shoppers can be far less extensive and far less specific than the data your research partners will need when they start to calculate economic numbers. And until you have actual data, how you calculated the starting point for these indicators says a lot about your circle of advisors, your experience, and your knowledge of the target population.

Whew; enough for now. I’d love to hear how some of you calculated both of these outcomes, especially sales, both in markets where you had baselines and in ones where you did not. I expect that some of you will disagree with much of my unscientific approach to measurement, but I hope you know that I welcome your opinions.

Counting public gatherings in 2017-Washington Post article

The point of this post is to show how complex, grassroots public gatherings can be counted and measured. The two main researchers quoted in these Washington Post articles are Erica Chenoweth and Jeremy Pressman, both respected analysts of the details of large-scale civil movements and gatherings. As a data junkie, I have followed this effort with a great deal of interest (I have even counted some of these gatherings in my own town to check others’ counts) and look forward to more analysis of both the methodology and the actual count data. The analysis included not just the number of people who gathered, but who and what was being protested or supported, where these events were held, what symbols were used, and how many arrests were made.

For March 2017, we tallied 585 protests, demonstrations, marches, sit-ins and rallies in the United States, with at least one in every state and the District. Our conservative guess is that 79,389 to 89,585 people showed up at these political gatherings, although it is likely that there were far more participants.

Certainly, those of us in food and farming systems should note some of the systems used for collection and analysis. For example, the Crowd-Counting Consortium may be a model that national entities involved in grassroots data collection, including food systems, should discuss creating for their own use.

Here is their counting method:

We arrived at these figures by relying on publicly reported estimates of march locations and the number of participants involved in each. We started a spreadsheet and called for crowdsourced information about the location and number of participants in marches. Before long, we had received thousands of reports, allowing us to derive low and high estimates for each event. We carefully validated each estimate by consulting local news sources, law enforcement statements, event pages on social media, and, in some cases, photos of the marchers. When reports were imprecise, we aimed for conservative counts; for example, if observers reported “hundreds” of participants, we reported a value of 200 (“thousands” was 2,000, “tens of thousands” was 20,000, etc).
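Their conservative rule for turning vague crowd words into numbers is easy to sketch. The mapping below comes straight from the quoted passage (“hundreds” becomes 200, “thousands” 2,000, “tens of thousands” 20,000); the lookup table structure and fallback behavior are my own illustration, not the consortium’s actual code:

```python
# Conservative low estimates for imprecise crowd reports, as described
# in the quoted methodology.
CONSERVATIVE_COUNTS = {
    "hundreds": 200,
    "thousands": 2_000,
    "tens of thousands": 20_000,
}

def conservative_count(report):
    """Return the conservative low estimate for a vague crowd report,
    or None if the wording is not in the lookup table."""
    return CONSERVATIVE_COUNTS.get(report.strip().lower())

print(conservative_count("Hundreds"))           # -> 200
print(conservative_count("tens of thousands"))  # -> 20000
```

It is the same discipline I argue for with market counts: when the report is vague, write down the defensible low number, not the flattering one.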

An example of their public data set.

https://www.washingtonpost.com/news/monkey-cage/wp/2017/04/24/in-trumps-america-whos-protesting-and-why-heres-our-march-report/?utm_term=.ce99baecf0b6

Communicating community

Hopefully, all of you who read this blog are okay with my use of Vermont as one of its recurring examples of food system work. I will caution readers against assuming that Vermonters think they have it all figured out just because they are rightly proud of the state’s glorious revival of small-acreage farming and its reputation as an organic stronghold. I’d say the state’s food and farming leaders are very honest about the issues they continue to face and about what remains to be done. For example, there are still no full-time market managers at all, and the average market manager makes less than $10,000 per year (really, it’s likely much less, but I am trying not to overstate it here). In a state with only 625,000 residents (the most rural state in the union, with 82.6 percent of its population living in rural areas or small cities, many of them poor*), the 80 or so markets are always struggling to maintain attendance and sales amid strong competition from co-ops and other well-regarded outlets. And of course, like everywhere else, the state’s farmers are pulled in so many directions, trying to serve every outlet at once while dealing with weather and regulatory woes and the typical small-business challenges, that many are not profitable.

What is exciting is that they all try to work collaboratively at the network level to find appealing ways to showcase producers’ and organizers’ hard work. The pictures below are an example of that. My home team there (NOFA-VT) works with an artist whose lovely pieces hang on their walls and whose art NOFA uses in many other ways. During my last visit, Erin Buckwalter showed me a bowl in their office of farming “affirmations” from that artist, some of which also include actual data. She encouraged me to take a handful, and so I have been asking people at markets to reach into my bag of cutouts and take one. What a simple way to display the difference between our system and the one that reduces everything in a store to a place for purchased advertising. So if you see me, ask to dip in and see what you get…

[Images: IMG_4714.JPG, IMG_4855.jpg, IMG_4854.jpg]

This one, from NOFA-VT’s imaginative and thoughtful Executive Director, will remain on my board to inspire me.


• Income and Poverty in Vermont

Median household income (in 2015 dollars), 2011-2015: $55,176 (US: $53,889)
Per capita income in past 12 months (in 2015 dollars), 2011-2015: $29,894 (US: $28,930)
Persons in poverty: 10.2% (US: 13.5%)
Fifteen states have more than half their populations living in rural areas or in towns under 50,000 population.