Monday, January 23, 2012

What we talk about when we talk about bread

Bread. As astonishing a testament to human ingenuity and resourcefulness as the Roman aqueducts, the Great Wall of China, or the Taj Mahal—and a whole lot tastier too. Just imagine what's necessary to transform a stalk of wild grass into even the simplest unleavened flatbread: reaping, threshing, parching, hulling, grinding, combining with liquid, and baking, whether on stone slabs or in an oven.

When you think about it, bread's got something else in common with those other, more architectural, testaments to human ingenuity. Because it's produced (rather than found in nature), it too can be made in a dizzying variety of ways and from as many raw materials (grains, in the case of bread) as you can shake a stick at.

But it’s not simply a matter of wheat bread and barley bread, is it? In what area of life is difference free of judgment? Who doesn't know that wheat bread was traditionally regarded not just as different from but as better than barley or millet bread, both of which were seen as having something vaguely primitive about them, as though they were eaten only by people who hadn’t yet been exposed to the good stuff? By the same token, oat and rye bread were generally presumed to be the fallbacks of northern peoples whose colder climates didn’t allow wheat to grow. Why else did the Scots eat their dry little oatcakes and the Finns the flat barley bread called ohrarieska? And why, according to more developed countries, did less developed ones eat such flatbreads as Afghan naan, Indian paratha, Bedouin fatir, Ethiopian injera, Mexican tortillas, Armenian lavash, and Mediterranean pita? Because they hadn’t mastered the art of leavening, were too nomadic to have standing brick ovens, or were too poor to afford the fuel required for baking towering loaves.

Deliberately leavened white bread was traditionally seen as the sophisticated, civilized, and better product, and although we’ve come a long way, it's telling that the words we use today still express our assumptions about what bread’s supposed to be. When it’s the leavened wheat stuff we have in mind, we can refer simply and succinctly to bread, but when we want to clarify that what we’re eating is unleavened, we have to specify flatbread.

Despite the assumptions embedded in the words we use, today we recognize that Ethiopians, Mexicans, and Arabs might not just be used to eating their injera, tortillas, and pitas, but might actually prefer them to risen wheat loaves. Which goes to show that traditional assumptions of better and worse may be no more than acquired cultural tastes and values, in culinary matters as in other walks of life. The renowned 2 kg miche of Parisian boulanger Lionel Poilâne is not inherently better than a corn tortilla slapped between a Mexican woman’s palms and quickly browned on a cast-iron comal; it’s just that many people have been brought up to believe it’s better—and “better,” in this context, means more refined, sophisticated, and civilized.

As is so often the case, food preferences and values turn out to be acquired prejudices, held all the more stubbornly because we’re not even aware we're holding them. Until very recently, food was considered best when it had been most transformed from its natural state by deliberate, careful, and painstakingly acquired human control—but was that verdict the result of taste or of culture (if the two can even be distinguished)?

When it comes to bread, there’s no such thing as a natural state (it doesn’t grow on trees, after all), but still, some loaves are more transformed than others—and it’s those loaves that were traditionally the most highly esteemed: the baguettes, the miches, and the pains au levain.

Isn’t it finally possible, though, to think of tortillas, injeras, parathas, lavashes, and all the other wonderful flatbreads of the world not as inferior to, but simply as different from, the loaves of risen wheat bread that still, almost invariably, come to mind when you think—and say the word—bread?

Thursday, January 12, 2012

The Meat Market

It’s a curious fact that although so many English words trace back to the Latin word for meat, carnis—carnage, carnal, carnation (named after its fleshy pinkness), carnival (a portmanteau word combining carne and vale; literally, “good-bye to meat”), carnivore, incarnation, to name only a few—when it comes to the substance itself, we insist on using an entirely different word: meat. What’s even more curious is that until the past few centuries, the word didn’t mean what we understand it to mean today. If it did, how could the King James Bible (1611) have God tell Adam and Eve that “I have given you every green herb for meat”? Herbs as meat? Strange as it may sound to us, time was when meat meant food in general. If you wanted to refer to the edible tissue of animals, you would have called it flesh. Open the pages of one of our earliest collections of recipes and you’ll find dishes with names like “Tartes of Flessh.” Just imagine a menu today featuring items such as “Fleshloaf and mashed potatoes” or “Spaghetti and fleshballs.” Obviously we react to the word flesh very viscerally. But it was not always so.

In German, meat is to this day Fleisch and a carnivore is, logically, a Fleischfresser—literally, an eater of flesh. But as the centuries have rolled by, we English speakers have become more squeamish than our Teutonic relatives. Even the most staunchly carnivorous carnivore wouldn’t relish being referred to as an eater of flesh—a designation we reserve for the likes of Hannibal Lecter. We prefer to distinguish between our flesh (which we don’t eat) and our meat (which we do). Which is a further reason the June 1978 cover of Hustler magazine created such an uproar. It featured a woman’s naked body descending into a meat grinder, with only her lower torso and legs still unmangled; emerging from the other end of the grinder was a mass of ground meat. Her living, breathing flesh had turned into hamburger, thus literalizing the pornographic metaphor of the “meat market.”

When it comes to meat, we English speakers seem to have a hard time confronting the fleshy reality of what we eat and so we resort to many different strategies to avoid stating the obvious. Euphemisms come in handy when you don’t want to acknowledge biting into a pancreas or a thymus gland. Sweetbreads sound so much nicer, and although honeycomb tripe may not particularly whet your appetite, it nonetheless does a better job than reticulum, or the lining of a cow’s second stomach.   

Other languages are also called upon for a bit of elegant obfuscation. There’s a world of difference between a pâté de foie gras and fatty liver paste, or between osso buco and hollow bones. French and Italian, in particular, have an almost magical ability to transform the humble into the elegant and to convert the less-than-appealing into the more-than-tantalizing.

But you don’t have to turn to gourmet fare to see such linguistic double-talk at work when it comes to the matter of edible animal tissue. Why is it that we have different names for the cuts of meat that appear on our dinner plates and for the animals from which those cuts were, well, cut? We eat beef, but it’s a cow our sirloins and porterhouse steaks came from; and I’ve never heard of an animal named pork or veal. Lamb, of course, is an exception, but I can remember being taken aback when I first moved to New England and saw “lamb legs” featured in the supermarket’s weekly circular. Somehow the more familiar “leg of lamb” had never evoked the mental image of a little lamb prancing about in a meadow as “lamb legs” did so immediately and so powerfully.

Obviously we feel the urge to rename the animals we eat in a way we don’t when it comes to fruits and vegetables—or even when it comes to fish, which somehow seem to occupy a middle ground between the animal and vegetable kingdoms. I know plenty of vegetarians who are fine eating fish, and even observant Jews who strictly separate milk from meat find it permissible to eat creamed herring or bagels with lox and cream cheese.

Why we feel differently about eating meat isn’t difficult to figure out. In order to enjoy a steak or a chop, a life had to be taken and blood had to be shed. Even those of us who are OK with this can’t deny that many people have a hard time justifying such killing. It was a vegetarian diet, after all, that God intended for Adam and Eve in Eden, which is why he instructed them to eat “every green herb for meat.” It was only after he was forced to acknowledge that people were inherently aggressive that he allowed them to kill and eat meat. After angrily flooding the earth and destroying all living beings except for Noah, his family, and the animals brought into the ark, God acknowledged that the time had come to revise his original dietary allowances to include meat: “Every moving thing that liveth shall be meat for you.” The conclusion is easy to draw: had people turned out to be what God had hoped for, they’d have remained vegetarian.

Even those of us who dismiss the Bible as a collection of primitive myths nevertheless maintain something of its ambivalence about meat. We want it, but we don’t want to think too closely about where it came from or what it took to get it to our plates. And so, perverse creatures that we are, we rename it, as though to call it by another name were somehow to make it less suspect, less visceral, less fleshy.

Don’t believe me? Just try serving your family roast flesh, calf cutlets, or pig chops for dinner and see how quickly they ask for seconds.

Friday, January 6, 2012

The Great Brownie Debate: Cakey or Squidgy?

Six days into the new year, the sky is a whitish-gray, and I’m home with a cold. Tomorrow, though, is my sweetie’s birthday, which means that, runny nose notwithstanding, I’m going to bake something that he loves almost as much as he loves me: something very, very chocolate. Hmmmmm. Let’s see. On Christmas Day, I baked a dark chocolate tart studded with fresh raspberries, and on New Year’s Eve, I made individual molten chocolate cakes, but that was when I had far more energy than I do at this moment.

I’m thinking brownies. Who doesn’t love brownies? They’ve been one of our favorite American desserts since they were first mentioned in the 1897 Sears, Roebuck Catalog. Their virtues are innumerable. They’re quick. They’re easy. They’re made from what you’ve almost always got hanging around the kitchen. You stir them up in a single saucepan, which means that cleanup is a snap. You can add white chocolate chips to make them look adultly sophisticated, or you can add brightly colored M&M’s to make them look childishly happy. You can adorn each one with a single candle and illuminate the whole platter for a festive birthday presentation. What’s not to love?

Simple as they are to produce, however, there is one tricky matter when it comes to baking brownies. Timing is of the essence. Take them out a minute too soon and they’re so damp you can hardly cut them; leave them in a minute too long and they’re all dried out. The window of perfection lies somewhere in the middle, but even then it’s a tough call. Some people like their brownies on the cakey side, but I find that more people prefer them the slightest bit undercooked, resulting in a consistency somewhere in between sticky, moist, and fudgy—or what the English call squidgy.

Squidgy? Judging from my circle of friends and acquaintances, Americans tend to describe fudgy brownies as just that: fudgy. Or maybe gooey, undercooked, squishy, or slumped. The English, on the other hand, refer to them as squidgy, and lest you think it a made-up word, I’ll refer you to the Oxford English Dictionary: “Squidgy: Short and plump. Podgy.” (Podgy, by the way, is our pudgy, and squidge is defined as “the sound made by soft mud yielding to sudden pressure.”) It’s not a new word either. Back in 1892, Rudyard Kipling referred to Gunga Din as “You squidgy-nosed old idol.” A hundred years later, the English tabloid The Sun featured an article entitled “Squidgygate” about secretly taped phone conversations in which a clearly besotted James Gilbey called Princess Diana by the pet names “Squidgy” and “Squidge,” sparking all sorts of rumors and innuendos.

The other sense of squidgy is “moist and pliant; squashy, soggy. Especially of food.” Squashy and soggy might not sound especially appetizing if it’s cakes and cookies you’ve got in mind, but if it’s a brownie, suddenly squashy and soggy might be just what you want. By the same token, squodge—defined by Urban Dictionary as “post-partum belly fat, the kind you can’t get rid of even after the kid has hit kindergarten. See also podge”—doesn’t sound at all appealing until the moment it’s applied to a brownie. To wit, consider the following description of White-Chocolate and Macadamia Brownies by the reigning queen of English cuisine, Nigella Lawson: “The important thing is to maximize squodge.” Leave it to the land of flummeries, fools, trifles, and spotted dicks to have some fun with dessert terminology.

Herewith, the recipe I’ll be using for my sweetie’s birthday brownies:

Seriously Squidgy Saucepan Brownies (slightly adapted from Marilyn M. Moore’s The Wooden Spoon Dessert Book)

Preheat the oven to 350°F. Butter an 8 x 8 x 2” baking dish. In a large heavy saucepan, stir over low heat until melted and smooth:

4 squares (1 ounce each) semisweet chocolate, coarsely chopped
8 tbsp unsalted butter, cut into pieces

Remove from the heat. Stir in, in this order, adding the eggs one at a time (blend well, but do not overmix):

1 cup sugar
2 large eggs
¼ tsp salt
1 tsp pure vanilla extract
¾ cup unbleached all-purpose flour
(Optional: ¾ cup white chocolate chips, M&M’s, or chopped walnuts)

Spoon and scrape into the buttered dish, spreading the thick batter into the corners with the back of the spoon. Bake at 350°F for about 25 minutes. The edges should be just about dry; the center should still be soft to the touch. (Bake them for 30 minutes if you like your brownies less squidgy.) Cool in the pan on a wire rack. Cut into squares to serve. Makes 16 brownies.