This is the first of a two part article on butchering in America.
Several years ago, I made my first attempt at Julia Child’s famous Boeuf Bourguignon. I wanted to do it right and be as true to the recipe as I possibly could, with the best ingredients I could find. The recipe calls for slab bacon, that glorious un-sliced hunk of deliciousness that provides incredible flavor to stews and soups.
Living in Minneapolis, heart of the Midwest and only miles from one of the largest concentrations of pork production in the country (thanks to Hormel in Austin, MN), one would think that pork products would be an easy find in the city’s many grocery stores. I must have gone to five places – each with its own meat counter and ‘butchers’ – and inquired about the availability of slab bacon. Two of the ‘butchers’ said they didn’t carry it, which was disappointing. But three of them said they’d never even heard of it. “Slab bacon,” I replied. “It’s bacon before you slice it.” They simply stared at me with vacant expressions. One suggested I try the deli counter, whose staff only seemed more confused than the others.
One of the greatest challenges in cooking what I call “heritage recipes” for this blog is the lack of availability of many ingredients that used to be commonplace. Brisket, for example, used to be a cheap cut of meat – hence its historic popularity in the American Irish and Jewish tenement communities. Now, because few people cook with it or know what to do with it, it’s rarely carried in stores, and when it is, it’s usually frightfully expensive – in order to cover the cost of having it fresh and on hand. This is to say nothing of offal – those mysterious and largely unfamiliar bits of animals so regrettably rare to encounter anymore. Even less exotic cuts like ham are increasingly hard to find outside the holidays, unless you count that pressed stuff that’s more water and filler than meat.
What is a real butcher?
There is an important distinction to be made between the good folks who work behind a meat counter at your local supermarket and a real, honest-to-goodness butcher. Broadly, a butcher is a skilled craftsperson who dresses (cuts) animal carcasses into their distinctive parts and sells them. A good butcher will be knowledgeable about meats and will be able to tell you where their products came from. They will be able to dress meat, to order, for you while you wait. They can offer you recipes and suggestions for how to store, prepare, and serve any cut of meat. A good butcher will also follow a code of ethics that supports local, sustainable meat production, and a commitment to whole-animal butchering – wasting as little as possible.
And despite recently being lumped together in one category by the Bureau of Labor Statistics, there is a big difference between being a butcher and being a meat cutter. A meat cutter works in a large processing plant, often on an assembly line, where they are responsible for cutting only one part of the animal before the rest of it is sent down the line to someone else. Meat cutting is mechanical, where butchering is an art.
What happened to butchering in America?
Two major shifts in American foodways have brought the neighborhood butcher to near extinction. The first was the birth and growth of the supermarket. During the 19th century, the grocery store was a large room with high shelves and a counter where customers would bring their shopping list, and an attendant would gather all the items for them.
In 1916, Clarence Saunders developed a shopping model that would allow customers to select their own grocery items. He called the stores Piggly Wiggly and began operating them as a franchise across the country. Other chains soon adopted the self-service model, including Kroger, the A&P, and Safeway.
By the 1950s, these grocery stores became supermarkets – one-stop shopping experiences where foods could be offered in bulk at low prices. Rather than staffing a butcher counter in each store with a trained professional, it was cheaper to run a single offsite meat processing plant where meats could be pre-cut, preserved if needed, placed on a Styrofoam tray, and shrink-wrapped. This way, customers could quickly be offered a variety of meats while still feeling they were picking their own from the selection.
By operating in bulk like this, supermarkets were able to drive down the cost of meat. The drop in quality of meat was balanced out at the time by convenience – a customer no longer had to stop at multiple places to buy meat, produce, dry goods, etc. The competition put the vast majority of neighborhood butchers out of business.
Meanwhile, the rise of supermarkets forced a dramatic change in meat production and, subsequently, American eating habits. With meat offered in greater quantities at lower prices, Americans began eating more of it, creating still greater demand.
To meet that demand, new methods of raising livestock were developed that would shorten the time from birth to slaughter. Cattle, for example, required a great deal of pastured land to supply the grass they needed. Then it was discovered that cows could be fed grains like corn, which allowed them to be kept in smaller, cheaper confinement and also caused them to fatten more quickly. National meat-grading standards were actually changed so that the grading of meats like beef would be based on marbling and fat content, thereby encouraging farmers to switch to this method of production.
Then the meat processing industry was centralized largely in the Midwest. Whole animals were no longer sent to cities to be slaughtered and butchered, but rather to large processing plants.
I’ll spare you the details of the conditions in these animal production and processing plants – they are readily available all over the internet. Suffice it to say, it’s gruesome. For those aware of these conditions, continuing to eat these foods requires an enormous act of willful disregard.
For part two in this series on the American Butcher, click here. For the related recipe – Beef Tongue Sandwiches – click here.