Distilling Our Craft
This weekend I learned a lot about how alcohol is made.
Chris and I and our spouses went on a tour and tasting at St. George’s Spirits in Alameda. The only thing I knew going in was that I really liked St. George’s “Terroir” gin (which is why we were there).
In just an hour, we heard a lot about the history of craft distilling in America, the distinctions between different types of alcohol, and how each is made. It was a lot to absorb, and much of it I don’t remember accurately, but one comment stood out to me.
After explaining many of the complex processes St. George’s employs to make its brandy, vodka, whiskey, and gin, our tour guide asked us to look around the distillery floor and notice what we *didn’t* see there: namely, any digital devices collecting “data” on what was being produced.
He said that the quality of all of the distillery’s liquors is determined by taste and smell – by the judgment of experienced craftsmen, not by instruments or measurements.
The result is a high-quality, uniquely tasty product, but one that is difficult to replicate. That reminded me of many of the discussions we have about Braid.
Measuring outcomes is important (especially to potential funders of this work), but as part of our strategic planning we have also had a great deal of discussion about the aspects of our work that can’t be measured and quantified. These are much more difficult to describe, and they take a lot of effort to produce.
Most granting organizations want to invest in programs that get the most bang for their buck – programs that serve the highest number of youth for the least amount of money.
These organizations usually go by the numbers alone, without regard for the more qualitative aspects of a program – whether volunteers are trained to work with youth who have experienced trauma, whether they stick around for more than a year, how much support they get along the way, and the special “mentor moments” that we love to hear about from you.
Braid is considered a “high touch” organization because we use more people to reach a smaller number of youth than most mentoring programs.
We know that our youth require more support than average, and that our mentors and facilitators need extra support to work with them long-term. So our resources go toward recruiting, training, and ongoing support.
And just like those master craftsmen tasting spirits as they emerge from the still, we are constantly evaluating and honing this program as we go. We know that every youth and every team is different, so there is not one standard definition of what is best for each of them. This is not easy to replicate on a large scale.