
A lot of this was common in the US and Europe a hundred years ago, but laws and changing culture have stopped it. Just look at snake oil, or the many other “remedies”. See, e.g., “Down and Out in Paris and London” by Orwell, where among many other things he talks about how the restaurant kitchens in early-1900s Paris were extremely gross and dangerous, but because the fronts of the restaurants were clean, the customers seemed happy.

“A customer orders, for example, a piece of toast. Somebody, pressed with work in a cellar deep underground, has to prepare it. How can he stop and say to himself, 'This toast is to be eaten—I must make it eatable'? All he knows is that it must look right and must be ready in three minutes. Some large drops of sweat fall from his forehead on to the toast. Why should he worry? Presently the toast falls among the filthy sawdust on the floor. Why trouble to make a new piece? It is much quicker to wipe the sawdust off. On the way upstairs the toast falls again, butter side down. Another wipe is all it needs. And so with everything.”

China is at an interesting point because they’ve developed economically so much in the last fifty years. Their culture and law enforcement haven’t caught up yet, but the internet has, so incidents like this get shared with the rest of us.



> how the restaurant kitchens in early-1900s Paris were extremely gross and dangerous, but because the fronts of the restaurants were clean, the customers seemed happy.

I heard that a lot of restaurants in China these days, including takeout outlets, make live video feeds of their kitchens available online so that customers can be assured that the kitchens are hygienic. Of course, that doesn’t tell you anything about the quality of the ingredients.


Well, a video feed of _a_ kitchen, at least.


I've ordered from such a place in Munich. The novelty factor made it entertaining to watch, and at some point I could identify my order being prepared. Not sure how many people you could deceive before you started to get called out, if the feed were of a different kitchen.


I’m not sure much has changed in regard to the anecdote you shared. Kitchens are under a lot of pressure to get food out quickly and finish all of an order at the same time.

I worked at a restaurant in the 90s as my very first job, and one day someone dropped a steak on the floor after it was done. I said, oh dear, I’ll get another from the fridge. The cooks told me no, just wash it off and serve it, reasoning, “If we bring out everyone else’s meal and the guy who ordered the steak has to wait 10 minutes, he’ll be mad.” My guess was that he would be angrier to know that he was served a steak that had been dropped on our filthy floor and, given the choice, would probably prefer to wait for a different one.


Your anecdote sparked an old thought for me.

In your story, the customer would perceive the service as inferior if their food was late. A cook, knowing that the service isn't really inferior (there was just one little snag), justifies a little action to put appearances back closer to reality.

That last sentence is a study for me. I see it in everybody, including myself. Somehow we convince ourselves that our ideal is the norm and the actual reality is an aberration. Then we rationalize little efforts to hide the aberration.

It can be hard to acknowledge the truth about ourselves and accept the consequences. But it is very hard to improve otherwise.

(At the same time we need to be careful of imposing overly severe consequences for mistakes.)


My last boss worked at a sandwich shop in college (Pasadena, CA). One shift he knocked over a 5 gallon bucket of olives, which spilled all over the floor.

He was mortified, but the owner seemed unperturbed. He handed him a broom and a dustpan and told him to get them all back in the bucket.


That reminds me of a story I heard from a friend in the Midwest who worked at a factory assembling frozen Italian dinners. He said that part of his job was to sweep up spilled grated cheese from the floor so they could use it in dinners.


And yet none of those examples compare to consciously using known poisons as a replacement for or additive to real food. That’s cynical brutality on another level.


How about preserving milk with formaldehyde?

https://www.pbs.org/wgbh/americanexperience/films/poison-squ... https://allthatsinteresting.com/poison-squad-harvey-wiley

One of the tricks here is that "known" poisons are the same as "agreed-upon" poisons -- or that the definition of "poison" isn't as clear as you might assume.

If one compares the US/Canada and EU restrictions on additives with the assumption that the latter are well founded, you'd come to the conclusion that the only thing North America must have more of than cynical brutality is cancer. However, while I'm not going to say they're right, there are typically plenty of folks in HN comment sections happy to say that many of the EU's food restrictions are unscientific pandering.

Everyone has some intuitive threshold for what they'd consider reasonable and what they wouldn't, but those thresholds don't always align. Acrylamide from roasting coffee? https://www.nbcnews.com/news/us-news/california-judge-rules-... Nitrates and nitrites are indisputably poisonous as hell, so are cured meats out of bounds? https://www.bbc.com/news/health-34615621


> formaldehyde

As a metabolic intermediate, formaldehyde is present at low levels in most living organisms. Formaldehyde can be found naturally in food at levels of up to 300 to 400 mg/kg, including in fruits and vegetables (e.g. pear, apple, green onion), meats, fish, crustaceans, and dried mushrooms.

Ingestion of a small amount of formaldehyde is unlikely to cause any acute effect. The main health concern of formaldehyde is its cancer causing potential. The International Agency for Research on Cancer (IARC) considered that there was sufficient evidence for carcinogenicity in humans upon occupational exposure via inhalation. On the other hand, WHO in 2005 when setting its Drinking Water Guidelines considered that there was no definitive evidence for carcinogenicity upon ingestion.


Well, sure. Here's the thing: with anything, the dose makes the poison -- and "low levels" and "small amount" are pretty important phrases in those claims, right? (https://wwwn.cdc.gov/TSP/MMG/MMGDetails.aspx?mmgid=216&toxid...) People aren't talking about the amounts in apples when they're referring to e.g. https://en.wikipedia.org/wiki/2005_Indonesia_food_scare


Agree.

But selecting formaldehyde as your scary example mostly just sounds scary because of its association with embalming fluid. As you say, anything can be poisonous: the LD50 of many substances is surprisingly high. https://en.wikipedia.org/wiki/Median_lethal_dose#Examples

From https://www.smithsonianmag.com/science-nature/19th-century-f...

  In 1896, desperately concerned about diseases linked to pathogens in milk, he even endorsed formaldehyde as a good preservative. The recommended dose of two drops of formalin (a mix of 40 percent formaldehyde and 60 percent water) could preserve a pint of milk for several days. It was a tiny amount, Hurty said, and he thought it might make the product safer.

  But the amounts were often far from tiny. Thanks to Hurty, Indiana passed the Pure Food Law in 1899 but the state provided no money for enforcement or testing. So dairymen began increasing the dose of formaldehyde, seeking to keep their product “fresh” for as long as possible. Chemical companies came up with new formaldehyde mixtures with innocuous names such as Iceline or Preservaline. (The latter was said to keep a pint of milk fresh for up to 10 days.) And as the dairy industry increased the amount of preservatives, the milk became more and more toxic.

  Hurty was alarmed enough that by 1899, he was urging that formaldehyde use be stopped, citing “increasing knowledge” that the compound could be dangerous even in small doses, especially to children. But the industry did not heed the warning.

  In the summer of 1900, The Indianapolis News reported on the deaths of three infants in the city’s orphanage due to formaldehyde poisoning. A further investigation indicated that at least 30 children had died two years prior due to use of the preservative, and in 1901, Hurty himself referenced the deaths of more than 400 children due to a combination of formaldehyde, dirt, and bacteria in milk.

  Following that outbreak, the state began prosecuting dairymen for using formaldehyde and, at least briefly, reduced the practice. But it wasn’t until Harvey Wiley and his allies helped secure the federal Pure Food and Drug Act in 1906 that the compound was at last banned from the food supply.
The article also mentions that pasteurization was discovered in the 1850s but would not become standard procedure in the United States until the 1930s.
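That “tiny amount” claim can be sanity-checked with some rough arithmetic. This is a back-of-the-envelope sketch, and the specific figures (drop volume, formalin and milk densities) are my assumptions, not from the article:

```python
# Rough estimate of the formaldehyde concentration produced by
# "two drops of formalin per pint of milk".
# Assumed figures (not from the article):
DROP_ML = 0.05               # a typical drop is ~0.05 mL
FORMALIN_DENSITY = 1.08      # g/mL, approximate
FORMALDEHYDE_FRACTION = 0.40 # formalin is ~40% formaldehyde by mass
PINT_ML = 473.0              # US pint
MILK_DENSITY = 1.03          # g/mL, approximate

# Mass of formaldehyde delivered by two drops, in milligrams
formaldehyde_mg = 2 * DROP_ML * FORMALIN_DENSITY * FORMALDEHYDE_FRACTION * 1000

# Mass of a pint of milk, in kilograms
milk_kg = PINT_ML * MILK_DENSITY / 1000

concentration_mg_per_kg = formaldehyde_mg / milk_kg
print(f"{concentration_mg_per_kg:.0f} mg/kg")
```

With these assumed numbers it works out to roughly 90 mg/kg, below the 300 to 400 mg/kg naturally occurring levels quoted upthread; the problem, as the article describes, is that dairymen kept escalating the dose well past the recommendation.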


Can one really say that it's "mostly" its "relation with embalming fluid" that makes it scary... in a comment containing the line "at least 30 children had died two years prior due to use of the preservative"?

Also I want to be careful not to allow lethality to be seen as coextensive with poisonousness. Non-acute effect is kind of the name of the game with additive-type doses. The bulk of the misery caused by "lead poisoning" has nothing to do with people who hit an LD50-type threshold.


But we are missing the counterfactual: how many lives were saved by the antibiotic effects of safe amounts of formaldehyde (the previously mentioned two drops of formalin)? Pasteurisation dropped infant deaths by two-thirds, so it is quite possible that, overall, the addition of formaldehyde saved many lives (hard to tell without more study of whatever facts are available).

  Nonetheless, the pasteurization movement was gaining steam. In 1909 Chicago became the first American city to enforce a compulsory milk pasteurization law, despite strong opposition at the state level. After vehement back-and-forth editorials, prolonged political maneuvering, and a typhoid epidemic blamed on raw milk, New York’s commissioner of health followed suit in 1914 with the enforcement of a previously adopted ordinance. Seven years later the city’s infant mortality rate dropped to 71 deaths per every 1,000 births—less than one-third of the rate in 1891. 
Disclaimer: I am an armchair scientist, so I know nothing about the subject beyond my few searches. I am just trying to follow up on a “fact” that seems to be very biased. Even in modern times, with strict standards and far, far better systems of protection and antibiotic interventions, raw milk causes problems:

  Raw milk and raw milk products can be contaminated with bacteria that cause serious illness, hospitalization or even death. From 1998 through 2011, 148 outbreaks due to consumption of raw milk or raw milk products were reported to CDC. These resulted in 2,384 illnesses, 284 hospitalizations and 2 deaths. Most of the illnesses were caused by E. coli, Campylobacter, Salmonella or Listeria. A substantial proportion of the raw milk-associated disease burden falls on children; among the 104 outbreaks from 1998-2011 with information on the patients' ages available, 82% involved at least one person younger than 20 years old.
I would like to read this paywalled paper: https://pubmed.ncbi.nlm.nih.gov/30234385/



