I grew up in a home where there were only two rules as far as mealtimes were concerned: you had to finish all your vegetables and you had to sit at the table. Everything else was up for grabs. When I went to Europe during college and learned that a fork was not meant to be passed back and forth between two hands, it occurred to me that there is a whole world of rules and regulations embedded within the meal in any given culture. So table manners and etiquette have always interested me: partly because I find them arbitrary and preposterous, and partly because I recognize that they provide the meal with a certain level of decorum. So, while researching a story about communal eating, I was excited to stumble upon Margaret Visser’s book The Rituals of Dinner: The Origins, Evolution, Eccentricities, and Meaning of Table Manners.
Table manners are universally upheld and are as old as human society itself. Visser writes that without eating laws, humans would not have developed kinship systems which separated us from animals and allowed us to flourish as a species. These laws and rituals evolved along with human society and eventually became manners which dictated a “right” and “wrong” way to eat. But at their heart, manners ensure a certain level of culturally-specific cleanliness. One delightful story of evolving etiquette describes how the early eaters of modern spaghetti in nineteenth century Naples ate this informal dish with their hands. The tradition involved grabbing a fistful of noodles, throwing your head back and lowering the noodles into your gaping mouth without (rudely) slurping.
We get the word polite from the French word poli, meaning polished. Unsurprisingly, the French gave us most of our etiquette-related words (and etiquette itself). And it’s amazing to me how guided by dining etiquette the French still are. Meals in France are still largely eaten three times a day, no snacking! And when you eat in a restaurant, there are very specific things that you do with knives and forks and their position at the table that seem to convey both your knowledge of etiquette and your commitment to cleanliness (for instance, forks go face-down on the table between courses, which makes sense). But as Visser notes, while etiquette and politeness are meant to make social interaction a bit smoother, they can also prevent us from truly interacting:
“Politeness forces us to pause, to take extra time, to behave in pre-set, pre-structured ways, to ‘fall-in’ with society’s expectations. But nothing about being polite is simple: the ‘polish’ intended to help people interact with one another can be used to prevent real contact from occurring at all. It can also become itself a barrier, keeping the ‘unpolished’ beyond the pale.”
The vilification of food is something Americans do all too well. Many food items have swung back and forth between villain and savior (Harvey Levenstein wrote a great book about this: Fear of Food: A History of Why We Worry About What We Eat). Caffeine has long been one of the most popular targets, and perhaps no one did more to try to destroy the reputation of coffee than the man who brought us Grape Nuts: C.W. Post. If you’ve ever wondered why Grape Nuts has that strange, indiscernible but slightly gravelly taste, it’s because it originally wasn’t meant to be a cereal at all, but rather a coffee substitute–one of Post’s many efforts to rip off Kellogg’s creations.
Riding on the wave of the success of Grape Nuts and new health concerns over coffee, which he called a “drug drink,” Post developed another coffee alternative made from roasted grains called Postum. In his book, Uncommon Grounds, Mark Pendergrast argues that Post’s ability to use bogus health jargon while appealing to people’s fears and snobbery paved the way for modern consumer advertising practices. Post himself was likely influenced by Coke’s ads in the 1880s promoting the drink as a “brain tonic.” Post promised that drinking Postum would put consumers on the “road to Wellville,” and as this ad contends, would also rebuild one’s nervous system from the effects of “the old poison–caffeine,” and win over one’s husband to boot! In the end, commerce won out over caffeine ideology. C.W. Post’s daughter took over the company following his death and bought Maxwell House in 1928.
Beer gardens are experiencing a small but notable renaissance in New York City these days. Sadly, they don’t even touch the kinds of massive, throbbing beer gardens that used to be commonplace. Sure, you could grab a beer at any of the German-style beer halls that dotted 19th century New York, but they were so much more than watering holes. The largest and most famous of these was the Atlantic Garden, which opened in 1858 on the Bowery at number 50, extending to Elizabeth Street: a cavernous, multi-storied social space where patrons would spend the better part of the day playing pool, listening to live music and, of course, drinking beer. At a time when male and female social spheres rarely overlapped, beer halls were frequented by both men and women. The Atlantic Garden was particularly popular with German families who came to enjoy some evening entertainment together, with an array of diversions such as a shooting gallery, pool tables, bowling alleys and live entertainment. Despite the flowing beer, the crowd was relatively tame. On the lower lefthand corner of the postcard you can see a young girl sitting patiently at the end of the table as her parents chat away, and another couple walking hand in hand between the tables on the right side.
As the Bowery increasingly became home to some of the most desolate populations in the city, Atlantic Garden managed to maintain its clean reputation and remained popular with locals and tourists. That is, until the locals finally left for less seedy pastures. Atlantic Garden was closed in 1902 because its main clientele, Germans and Irish, had moved away from the Bowery. This particularly vivid passage from the New York Times article announcing the closure describes the ending of an era:
“Dwellers of the Bowery paused and rubbed their eyes yesterday when they passed Atlantic Garden, for the front of the famous old resort, which had stood almost unchanged on its site just below Canal Street since before the Civil War, was plastered over with billboards in Yiddish announcing a Yiddish variety programme.”
(Yes, we still clung to that extra “me” in 1902.)
The article goes on to describe how the hall had hosted vaudeville acts, a new form of entertainment when it first opened, and specialized in novelty acts such as “‘teams’ of negro performers,” and later a “ladies orchestra.”
This past fall, it was discovered that the basement of the tavern that had previously occupied 50 Bowery, and which had supposedly been George Washington’s headquarters, was still intact. Almost as soon as it was found, it was demolished to make way for a 22-story, 160-room hotel. Hopefully plans will also include an enormous beer hall.
January 17, 1920: the day the manufacture and sale of alcohol became illegal in America for the next 13 years—the day the Eighteenth Amendment went into effect. But teetotalism had been popular in America for nearly a century before the government got involved.
Alcohol had long been the target of American reformers who aimed to restore order to society by publishing diatribes on the harmful effects of excessive drinking, such as this illuminating pamphlet by University of Pennsylvania professor Dr. Benjamin Rush, An Inquiry Into the Effects of Ardent Spirits Upon the Human Body and Mind (in its 8th edition by 1823).
A note to those considering imbibing tonight: Dr. Rush warns that while small amounts of alcohol “have a friendly influence upon health and life,” overindulgence in anything as innocent as punch leads to idleness, which leads to sickness and eventually to debt. Cordials lead to swindling, while the stronger stuff like gin and brandy will ultimately lead to murder and…the gallows. He suggests sticking to water which brings “health and wealth”—a radical concept at a time when the quality of most water was questionable.
American teetotalism actually has its origins in Ireland’s Catholic temperance movement, when the priest Theobald Mathew established the Cork Total Abstinence Society in 1838. His followers each signed a pledge of total abstinence from alcohol and met on Friday and Saturday nights and on Sundays after Mass. In just the first five months, Mathew had enrolled 130,000 members. He began taking his cause on the road and eventually came to the U.S. in 1849, where he enlisted another 500,000. Catherine Cauty was the 4,281,797th to sign the pledge, promising to “abstain from all intoxicating drinks, except used medicinally and by order of a medical man, and to discountenance the cause and practice of intemperance.” The above card illustrates the consequences of alcohol consumption (wife-beating) and the benefits of temperance (a happy family gathering in front of a hearth).
Slimy, briny and amorphous, oysters are an implausible aphrodisiac and an unlikely delicacy. And yet, oysters are part of a cadre of much-valued foods with the fabled powers to generate sexual stimulation, along with chocolate and apparently striped bass. While oysters have long been shrouded in a myth of sensuality, they were not always so rarefied. In the nineteenth century, oysters were one of few foods that both rich and poor could agree were delicious, and they ate them with equal enthusiasm. Mark Kurlansky writes in The Big Oyster: History on the Half Shell that mid-nineteenth century New York oystermania “was one of the few moments in culinary history when a single food, served in more or less the same preparations, was commonplace for all socioeconomic levels.”
New York was synonymous with oysters long before it was known for its hot dogs and bagels. The nineteenth century waterways were so bountiful that oysters were sold as street food for pennies. New Yorkers were so mad for oysters that shuckers would work 10-hour days opening up 1,000 oysters per hour just to keep up with the demand.
Oysters could be found all over town and prepared in all sorts of ways: boiled, stuffed into pies and stewed. In The House of Mirth, Lily Bart seeks refuge at a white table-clothed restaurant on 59th Street and Park Avenue where she has a rejuvenating snack of tea and stewed oysters. While just a few blocks downtown in Edith Wharton’s New York, oysters were casually consumed for a penny each off carts by the harbor.
When it came to oysters, elites were essentially indulgent Gilded Age locavores. Delmonico’s, New York’s original fine dining establishment, put oysters, or huîtres, as was their francophile predilection, at the very top of their menu and set the trend for serving them raw on the half-shell. Oysters also attracted the attention of visitors to New York, including Charles Dickens, who found most of America to be horrendous and did not hesitate to call parts of New York “loathsome, drooping, and decayed.” And yet, he found the New York oyster cellars “pleasant retreats” and seemed to enjoy their “wonderful cookery.”
I certainly cannot think of a single food item, even with the relative democratization of food over the past century and the increased access and exposure to a wider variety of cuisines, that would be equally celebrated at, let’s say, Daniel and IHOP. I would love to be wrong about this (Cronuts at Dunkin’ Donuts don’t count), and maybe someone will point out that I am. But from what I can tell, I’m not.
A combination of pollution and bitter turf wars between competing fishermen caused the decline of New York’s oyster population. Eventually typhoid fever and cholera outbreaks forced the City Health Commission to close all the oyster beds in 1927, ending New York’s oyster bonanza.
New York is more socioeconomically divided today than it has been in over a century, but it has been nearly as long since oysters were consumed with equal gusto by both the haves and the have-nots. But could it happen again? There are projects to revitalize New York’s oyster population, like efforts by the New York Harbor School and The River Project. But it’s doubtful that their numbers will ever get as high as they once were, when oysters were so plentiful that their shells literally paved the streets.
The very first thing I do each morning is not shower, not check my email, not even caffeinate. I open up the fridge, reach for a plastic bottle and pour myself a small glass of orange juice. The citrus and sweetness seem to go through my tastebuds and right to that “on” switch in my brain. You know how courteous hosts ask if there’s anything you’d like them to stock up on when you’re visiting, and this is meant as a gesture rather than an invitation? Well, I actually answer: orange juice.
So when I started researching how orange juice became a staple of the American breakfast, it did not surprise me that its reputation for restoring energy and vitality was a central marketing theme. What I didn’t expect was that its meteoric rise to breakfast classic also involved a scare over a rare blood condition, an obsession with vitamin C and nearly a decade of government research. Oh, and then there’s this: If you value your morning glass of orange juice with its happy bits of pulp and consider it to be as close to the real thing as can be readily available at your local supermarket, do not read on. If, however, you are prepared to be nauseated by your once innocent glass of store-bought orange juice, this one’s for you.
“Most commercial orange juice is so heavily processed that it would be undrinkable if not for the addition of something called flavor packs. This is the latest technological innovation in the industry’s perpetual quest to mimic the simplicity of fresh juice. Oils and essences are extracted from the oranges and then sold to a flavor manufacturer who concocts a carefully composed flavor pack customized to the company’s flavor specifications. The juice, which has been patiently sitting in storage sometimes for more than a year, is then pumped with these packs to restore its aroma and taste, which by this point have been thoroughly annihilated. You’re welcome.”
For more on flavor packs, juice processing and the entire orange juice industry, read Alissa Hamilton’s illuminating book, Squeezed: What You Don’t Know About Orange Juice. I’m not sure what my mornings are going to look like from now on, but thanks to her I’ve drunk my last glass of year-old orange juice. For more on how we got here in the first place, read the rest of my piece on TheAtlantic.com: The Myth of Orange Juice as a Health Drink.
Here are some delightful orange juice commercials that didn’t make it into the piece:
1950s canned Florida Citrus: “Because I like to get my vitamin C the way nature intended.”
1950s Florida Citrus Fresh-Frozen: “Quench your thirst with health.”
1954 animated Bing Crosby for Minute Maid: “Healthier teeth, sturdier bones, better growth, rich red blood, and more vitality.”
1980 Florida Orange Growers: “It isn’t just for breakfast anymore.”
1993 Tropicana Pure Premium Grovestand: “The newest orange juice sensation…a taste so fresh, so pure, so real, every sip is like biting into an orange.”
In searching for the origins of the American breakfast I wandered down a health reform wormhole that took me from corn flakes to graham crackers (read the previous post for the connection). Now I’m meandering back to corn flakes for a moment because of a curious offhand remark in Stephen Nissenbaum’s book on Sylvester Graham. Apparently, Dr. John Harvey Kellogg, the inventor of corn flakes and basically the entire cereal industry, also predicted that the soybean would one day be one of the most important foods in America.
Dr. Kellogg, who ran a popular sanitarium in Battle Creek, Michigan, was an influential figure in the American health reform movement during the last quarter of the nineteenth century and first quarter of the twentieth. Apart from inventing breakfast foods, he was also an ardent soy-lover. Kellogg first mentions soy in his 1917 book The New Method in Diabetes, in which he states: “The soy bean is a remarkable legume which is two-fifths fat and one-third protein, giving a food content closely resembling fat meat…The soy bean is a highly valuable food for diabetics.”
Of the recipes in this book, only one includes soy, and even then it is merely suggested as a peculiar replacement for savora (I didn’t know what it was either) in a simple (and not terribly appetizing) recipe for okra soup:
Kellogg’s Okra Soup:
2 cups cooked okra
2 teaspoons grated onion
2 tablespoons savora
1 1/2 cups strained tomato
Rub the okra through the colander. Heat with the other ingredients, and serve. Three tablespoons of Japanese soy may be used instead of the savora.
Where an early 20th century Michigander was to acquire Japanese soy is one question. What Savora, a French condiment that includes cinnamon, mustard and nine other herbs and spices, has to do with Japanese soy is another. But that’s what the doctor ordered.
By 1920, Kellogg devoted several chapters in The Health Question Box or A Thousand and One Questions Answered to the health benefits of the soybean and its prevalence in Chinese and Japanese cuisine and included a method for making tofu. In the following years, Kellogg became more convinced of the positive attributes of soy, not just for diabetics but also as a healthy alternative to meat and milk, both of which he believed were unsuited for human consumption. Before he discovered soy milk, Kellogg was recommending a product called Kumyzoon (produced by his Sanitarium Health Food Co.) which contained lactic acid instead of milk lactose, and which appears on this 1900 menu from the Battle Creek Sanitarium (along with Graham crackers, lots of other Graham products, “passover bread” and something called “bromose” which was another Kellogg original–a milk substitute in powder form to be dissolved in water).
Kellogg wrote quite a lot about the soybean in his books and published soy-promoting articles in his magazine Good Health. One such article, from 1928, ran with the title “Chinese Babies Thrive on Milk from Beans.” Many others followed, including an 8-page speech written by Kellogg himself and published in the magazine in 1930 under the title “Soybeans as Human Food.” Kellogg’s first mission was to convince the public that soybeans could be food for humans rather than feed for cattle, which had been their primary use in America up until that point.
In 1942, Kellogg’s Battle Creek Food Company began selling its first commercial soy milk: Soygal. A year later Kellogg developed Soy Protose, his first soy-based meat substitute. This was also the year of his death at the robust age of 91, nine years short of the centennial he had hoped for. Though neither Soygal nor Soy Protose achieved quite the status of his corn flakes, Kellogg did accurately predict the rise of the soybean and its future use as a popular meat substitute. Perhaps not coincidentally, corn and soybeans are America’s two largest agricultural crops today. Makes you wonder if, with a more tantalizing name and better branding, we might all be eating our corn flakes with Soygal today.
Poor Reverend Sylvester Graham only wanted to save Americans from themselves and their harmful sexual urges, gluttonous habits and materialism. His solution: crackers.
This most humble and innocent of American snacks has a strange and unlikely history that spans health and dress reforms, temperance movements, early vegetarianism, a mob of bakers, and the birth of the entire cereal industry. Read my full story on TheAtlantic.com and listen to me talk all about Grahamism on New Hampshire Public Radio’s Word of Mouth.
There is an ever-narrowing window, it seems like mere hours now, between the moment the clock of commerce strikes the end of Thanksgiving and the start of the “holiday season.” Barely have our over-burdened stomachs digested our preternaturally large turkeys than the elves of CVS, Target and Best Buy come crawling into their respective stores to exchange the autumnal colors for whites, reds and greens. I imagine these brave defenders of mercantilism slinking in under the cloak of darkness, staving off tryptophan as the rest of us lie splayed on couches clutching our stomachs while eyeing the last piece of pie, to furtively haul out the pumpkin- and apple-flavored everything, heave in the bulk bins of Santa hats and stock endless shelves with gift “suggestions.”
“Christmas has been ruined!” we bark. “Its spirit has been sullied by our lust for commercial grade juicers!”
We do like to harp on. But what if there never was a noncommercial American Christmas to ruin? What if the commercialization we love to hate on is the very thing that brought us the “spirit of Christmas” that we lament has been destroyed? So argues Penne Restad in a Bloomberg article from last year. The American tradition of Christmas first took off within the confines of the home, where upper class ladies of leisure took to decorating and indulging in cobbling together “the scraps and slivers of various folk traditions blended to serve a religion of domesticity.” Rituals such as tree decorating and stocking stuffing were then picked up by the press, which published Christmas recipes along with etiquette tips, gift ideas and morality tales of the “true meaning of Christmas.” Publications like Harper’s Weekly also began to develop Santa Claus from a marginal character into the jolly, gift-giving supernatural being we recognize today.
As early as the 1830s poinsettias were being grown in greenhouses, and by the 1870s Christmas trees were living room fixtures. Woolworth’s played a role by importing cheap ornaments from Germany, while department stores around the country stuffed their displays full of glittering merchandise, pushing gifts on American consumers with unbridled zeal. This Richmond Times Dispatch article from 1916 reported on the “Christmas spirit” on display in the “brilliantly decorated windows…within, the stores are temples of plenty, thronged with devotees. Even the shops which have to do only with humdrum articles of constant necessity and ugly drudgery have been infected with the desire for cheer and beauty.”
As the spirit of commerce and Christmas were created in tandem, so have they continued to grow concurrently. As Restad writes:
“These days, it is a commonplace to say that the economy depends on Christmas sales and that marketing strategies, such as Black Friday, threaten the holiday of yore. True enough. But less often noted is that the market revolution of the 19th century, and the consumer economy it created, made possible and continues to sustain what we mean when we talk about the ‘spirit of Christmas.’”
“Except for the medieval codpiece and the bra, clothing has never had a gender unless it clings tightly to the body.”
In the early 2000s I attended my cousin’s wedding, which took place at an event hall on a kibbutz in Israel. The space had likely been a storage facility not one year prior, but had since been transformed into a minimalist hall resembling a clinical Euro night club, in which the guests sat along terraces that cascaded down to a pit where the happy couple was married by an Orthodox rabbi under dramatic lighting. There were trapeze artists by way of entertainment, and what little food there was on offer was passed around by fit waiters wearing floor-length orange skirts. I never figured out if this was what the Israeli nouveau riche considered counter-culture or if this was in fact the vogue (the Israeli propensity for trend-following makes me suspect it was the latter). This is a long way of saying that Jean Paul Gaultier’s influence seemed to have reached the suburbs of Tel-Aviv.
A retrospective of Gaultier’s work is currently on view at the Brooklyn Museum, and if you’re like me and were really only familiar with Gaultier’s work through Madonna’s cone bra, it’s well worth a visit. It is exceptionally well-curated, and the opportunity to see such playful, inventive and well-crafted pieces up-close is a treat alone. But more than anything it’s an opportunity to experience the work of a man who has been thinking about gender roles over a lifetime. In the nearly forty years that the enfant terrible of high fashion has been producing couture that upends traditional gender roles and turns undergarments into outerwear, two items of clothing have captured Gaultier’s imagination more than any others: the skirt and the corset.
The first room of the exhibit introduces the visitor to Gaultier’s relationship with the corset. Gaultier’s corsets were designed not as undergarments but as exposed outerwear. Part of the reason this is successful is the simple fact that a corset can be quite beautiful. Its curved form supported by laces, ribbons and boning gives it contrasting and architectural characteristics. In one of the first pieces, Gaultier plays off these elements with a corset made of ballet-slipper pink satin complete with prominent bows at the shoulder, hyper-feminizing an already historically feminine garment. Only, this is a man’s corset. The mannequin modeling the corset boasts a sizable chest and biceps. The exaggerated boning protrudes beyond the natural waistline, crossing at the pelvis and leading our sightline south. The text for this piece informs us that corsets were worn by men for centuries. Some British soldiers would wear them under their uniforms as part of their training, and they were especially popular with nineteenth century dandies.
As Gaultier attempts to overturn notions of gender, body-shaping and power, he underlines the idea that our modern concepts of gendered clothing are merely culturally constructed. The Scots have their kilts and samurai warriors had their armor skirts, so men wearing corsets should not be so far-fetched. It’s somewhat satisfying to me to know that historically some men have inflicted this kind of discomfort upon themselves in the service of fashion and arbitrary notions of beauty. Only, it’s not exactly true, at least not in any broadly accepted way. As Valerie Steele writes in The Corset: A Cultural History, some eighteenth century military men did don corsets, as did Regency-era dandies, but not without being subjected to criticism for “feminine” attention to their appearance.
Dandyism first appeared as a men’s lifestyle and fashion trend during the late eighteenth century in England and later spread to France. These bourgeois and aristocratic men were deeply interested in their appearance, spending thousands of pounds on clothing and hundreds of hours on their careful toilette. Their style was not highly creative or ostentatious, however, but rather defined by form-fitting, sober and refined attire. Corsets were sometimes used to slim the figure, and for a brief period some dandies wore corsets in order to create a wasp-waisted silhouette—nipping the waist in order to contrast it with the broadness of the shoulders. As evidenced by the many caricatures in popular print, the idea of a man wearing a corset was considered by many to be ridiculous and feminine. As Steele writes: “The idea that ‘effeminate’ men endangered national strength had been a theme in British popular culture at least from the middle of eighteenth century, when attacks on fops and macaronies were common in the iconography of political caricature. Regency dandy caricatures continued this theme, along with its corollary, the spread of effeminacy throughout society.”
From a Western perspective, it seems hard to argue that corsets have ever been anything but a feminine garment that at worst promoted female repression (Regency-era female corsets provided more lift than the suffocating Victorian-era corsets). As a male garment, corsets largely garnered ridicule for their mimicry of the female form and associations with feminine preoccupations with appearance. And no matter the gender of the wearer, corsets are still inherently uncomfortable and restrictive contraptions. But in Gaultier’s hands the corset is rendered as armor—a garment of empowerment for both sexes.